Sample records for coding model linking

  1. Coding for Parallel Links to Maximize the Expected Value of Decodable Messages

    NASA Technical Reports Server (NTRS)

    Klimesh, Matthew A.; Chang, Christopher S.

    2011-01-01

    When multiple parallel communication links are available, it is useful to consider link-utilization strategies that provide tradeoffs between reliability and throughput. Interesting cases arise when there are three or more available links. Under the model considered, the links have known probabilities of being in working order, and each link has a known capacity. The sender has a number of messages to send to the receiver. Each message has a size and a value (i.e., a worth or priority). Messages may be divided into pieces arbitrarily, and the value of each piece is proportional to its size. The goal is to choose combinations of messages to send on the links so that the expected value of the messages decodable by the receiver is maximized. There are three parts to the innovation: (1) Applying coding to parallel links under the model; (2) Linear programming formulation for finding the optimal combinations of messages to send on the links; and (3) Algorithms for assisting in finding feasible combinations of messages, as support for the linear programming formulation. There are similarities between this innovation and methods developed in the field of network coding. However, network coding has generally been concerned with either maximizing throughput in a fixed network, or robust communication of a fixed volume of data. In contrast, under this model, the throughput is expected to vary depending on the state of the network. Examples of error-correcting codes that are useful under this model but which are not needed under previous models have been found. This model can represent either a one-shot communication attempt, or a stream of communications. Under the one-shot model, message sizes and link capacities are quantities of information (e.g., measured in bits), while under the communications stream model, message sizes and link capacities are information rates (e.g., measured in bits/second). This work has the potential to increase the value of data returned from spacecraft under certain conditions.
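    As a toy illustration of the linear-programming part of this record (a minimal sketch of the general idea, not the authors' actual formulation: coding across links, the innovation's key ingredient, is omitted, and all numbers are invented), one can maximize the expected value of delivered message pieces when each message may be split across links:

    ```python
    # Minimal LP sketch: x[i, j] = amount of message i sent on link j.
    # A link delivers its contents only if it is working (probability p[j]),
    # and the value of a piece is proportional to its size (v_i / s_i).
    import numpy as np
    from scipy.optimize import linprog

    p = np.array([0.9, 0.7, 0.5])        # link working probabilities (assumed)
    c = np.array([4.0, 6.0, 3.0])        # link capacities
    size = np.array([5.0, 4.0, 2.0])     # message sizes
    value = np.array([10.0, 6.0, 5.0])   # message values

    n_msg, n_link = len(size), len(p)
    # Expected value coefficient of x[i, j] is p[j] * value[i] / size[i];
    # linprog minimizes, so negate. Row-major flattening matches x[i, j].
    obj = -(np.outer(value / size, p)).ravel()

    # Capacity constraints: sum_i x[i, j] <= c[j]
    A_cap = np.zeros((n_link, n_msg * n_link))
    for j in range(n_link):
        A_cap[j, j::n_link] = 1.0
    # Size constraints: sum_j x[i, j] <= size[i]
    A_size = np.zeros((n_msg, n_msg * n_link))
    for i in range(n_msg):
        A_size[i, i * n_link:(i + 1) * n_link] = 1.0

    res = linprog(obj, A_ub=np.vstack([A_cap, A_size]),
                  b_ub=np.concatenate([c, size]), bounds=(0, None))
    print(res.x.reshape(n_msg, n_link))  # optimal allocation
    print(-res.fun)                      # maximized expected value
    ```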

  2. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, to link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now attainable. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks that are not generally met by existing frameworks, so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, an FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open-source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and a global model of mantle convection, and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  3. A TDM link with channel coding and digital voice.

    NASA Technical Reports Server (NTRS)

    Jones, M. W.; Tu, K.; Harton, P. L.

    1972-01-01

    The features of a TDM (time-division multiplexed) link model are described. A PCM telemetry sequence was coded for error correction and multiplexed with a digitized voice channel. An all-digital implementation of a variable-slope delta modulation algorithm was used to digitize the voice channel. The results of extensive testing are reported. The measured coding gain and the system performance over a Gaussian channel are compared with theoretical predictions and computer simulations. Word intelligibility scores are reported as a measure of voice channel performance.
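    For readers unfamiliar with the voice digitization mentioned above, the following is a minimal sketch of a variable-slope delta modulator in the spirit of an all-digital implementation; the adaptation rule and constants are assumptions, not the paper's design:

    ```python
    # Sketch of variable-slope delta modulation (CVSD-style): one bit per
    # sample, with the step size grown on runs of identical bits (slope
    # overload) and shrunk otherwise (granular noise). Constants are invented.
    import numpy as np

    def cvsd_encode(x, step_min=0.01, step_max=1.0, grow=1.5, shrink=0.9):
        bits, est, step = [], 0.0, step_min
        history = []                        # last 3 output bits
        for sample in x:
            bit = 1 if sample >= est else 0
            bits.append(bit)
            history = (history + [bit])[-3:]
            if len(history) == 3 and len(set(history)) == 1:
                step = min(step * grow, step_max)   # run detected: grow step
            else:
                step = max(step * shrink, step_min) # otherwise: shrink step
            est += step if bit else -step           # track the input
        return np.array(bits)

    t = np.linspace(0, 1, 8000)
    bits = cvsd_encode(np.sin(2 * np.pi * 5 * t))   # toy 5 Hz tone at 8 kHz
    print(bits[:16])
    ```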

  4. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
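    Palm's annotation language operates on the application's own source code; the following rough Python analogue (not Palm's actual syntax or API) illustrates the core idea of annotating a block of code so that it becomes a measured node in a hierarchical model:

    ```python
    # Rough analogue of an "annotation models a block of code": a decorator
    # marks a block as a model node and records its runtime so measurements
    # can later focus attention and validate the model. Names are invented.
    import time, functools

    MODEL = {}   # node name -> list of observed runtimes (stand-in model)

    def model_block(name):
        """Annotate a code block as a model node and collect its runtime."""
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                t0 = time.perf_counter()
                out = fn(*args, **kwargs)
                MODEL.setdefault(name, []).append(time.perf_counter() - t0)
                return out
            return inner
        return wrap

    @model_block("solve")            # hierarchy: "solve" contains "step"
    def solve(n):
        return sum(step(i) for i in range(n))

    @model_block("step")
    def step(i):
        return i * i

    solve(1000)
    print({k: sum(v) for k, v in MODEL.items()})   # measured model terms
    ```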

  5. PRELIM: Predictive Relevance Estimation from Linked Models

    DTIC Science & Technology

    2014-10-14

    Final Report, 11-07-2014 to 14-10-2014. Contract N00014-14-P-1185. H. Van Dyke Parunak, Ph.D., Soar Technology, Inc. Executive Summary: PRELIM (Predictive Relevance Estimation from Linked Models) draws on semantic models... The central challenge in proactive decision support is to anticipate the decision and information needs of decision-makers, in the light of likely...

  6. Analysis and calculation of macrosegregation in a casting ingot. MPS solidification model. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1980-01-01

    The software developed for the solidification model is presented. A link between the calculations and the FORTRAN code is provided, primarily in the form of global flow diagrams and data structures. A complete listing of the solidification code is given.

  7. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon-only coding scheme has been investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.

  8. Building a model for disease classification integration in oncology, an approach based on the national cancer institute thesaurus.

    PubMed

    Jouhet, Vianney; Mougin, Fleur; Bréchat, Bérénice; Thiessard, Frantz

    2017-02-07

    Identifying incident cancer cases within a population remains essential for scientific research in oncology. Data produced within electronic health records can be useful for this purpose. Due to the multiplicity of providers, heterogeneous terminologies such as ICD-10 and ICD-O-3 are used for oncology diagnosis recording purposes. To enable disease identification based on these diagnoses, there is a need for integrating disease classifications in oncology. Our aim was to build a model integrating concepts involved in two disease classifications, namely ICD-10 (diagnosis) and ICD-O-3 (topography and morphology), despite their structural heterogeneity. Based on the NCIt, a "derivative" model for linking diagnosis and topography-morphology combinations was defined and built. ICD-O-3 and ICD-10 codes were then used to instantiate classes of the "derivative" model. Links between terminologies obtained through the model were then compared to mappings provided by the Surveillance, Epidemiology, and End Results (SEER) program. The model integrated 42% of neoplasm ICD-10 codes (excluding metastasis), 98% of ICD-O-3 morphology codes (excluding metastasis) and 68% of ICD-O-3 topography codes. For every code instantiating at least one class in the "derivative" model, comparison with SEER mappings revealed that all mappings were available in the model as links between the corresponding codes. We have proposed a method to automatically build a model for integrating ICD-10 and ICD-O-3 based on the NCIt. The resulting "derivative" model is a machine-understandable resource that enables an integrated view of these heterogeneous terminologies. The NCIt structure and the available relationships can help to bridge disease classifications taking into account their structural and granular heterogeneities. However, (i) inconsistencies exist within the NCIt, leading to misclassifications in the "derivative" model, and (ii) the "derivative" model integrates only part of ICD-10 and ICD-O-3. The NCIt is not sufficient for integration purposes, and further work based on other termino-ontological resources is needed in order to enrich the model and avoid the identified inconsistencies.
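    A toy sketch of the "derivative" model idea follows (class and field names are invented; the melanoma codes are standard ICD-10/ICD-O-3 examples): NCIt-like concepts link an ICD-10 diagnosis to ICD-O-3 (topography, morphology) pairs, and SEER-style mappings can then be checked against the derived links:

    ```python
    # Invented miniature of the "derivative" model: one concept linking an
    # ICD-10 diagnosis code to an ICD-O-3 (topography, morphology) pair.
    derivative_model = {
        "Malignant_Melanoma_of_Skin": {
            "icd10": {"C43.9"},                   # melanoma of skin, NOS
            "icdo3": {("C44.9", "8720/3")},       # skin NOS, melanoma NOS
        },
    }

    def links_from_model(model):
        """Enumerate ICD-10 <-> ICD-O-3 links implied by the model."""
        for concept in model.values():
            for dx in concept["icd10"]:
                for topo_morph in concept["icdo3"]:
                    yield dx, topo_morph

    # Invented SEER-style row: (topography, morphology) -> ICD-10 diagnosis.
    seer_mapping = {("C44.9", "8720/3"): "C43.9"}

    derived = set(links_from_model(derivative_model))
    for topo_morph, dx in seer_mapping.items():
        assert (dx, topo_morph) in derived   # every mapping is recovered
    print(derived)
    ```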

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonelli, Perry Edward

    A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and comparing it with our own molecular dynamics code.
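    A minimal sketch of such a low-level interface, under the assumption that each model declares export/import schemas and a standard set of functions (all names below are invented, not the code described in this record):

    ```python
    # Sketch: models expose standard export/import functions with declared
    # schemas, so a coupler can link them without modifying either model.
    from abc import ABC, abstractmethod

    class ModelInterface(ABC):
        export_schema: dict = {}   # field name -> units/type it provides
        import_schema: dict = {}   # field name -> units/type it consumes

        @abstractmethod
        def export_state(self) -> dict: ...
        @abstractmethod
        def import_state(self, fields: dict) -> None: ...
        @abstractmethod
        def advance(self, dt: float) -> None: ...

    def couple(src: ModelInterface, dst: ModelInterface,
               dt: float, steps: int) -> None:
        """Drive two linked models, passing only schema-matched fields."""
        shared = src.export_schema.keys() & dst.import_schema.keys()
        for _ in range(steps):
            src.advance(dt)
            state = src.export_state()
            dst.import_state({k: state[k] for k in shared})
            dst.advance(dt)
    ```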

  10. Examples of Linking Codes Within GeoFramework

    NASA Astrophysics Data System (ADS)

    Tan, E.; Choi, E.; Thoutireddy, P.; Aivazis, M.; Lavier, L.; Quenette, S.; Gurnis, M.

    2003-12-01

    Geological processes usually encompass a broad spectrum of length and time scales. Traditionally, a modeling code (solver) is written to solve a problem with specific length and time scales in mind. The utility of the solver beyond the designated purpose is usually limited. Furthermore, two distinct solvers, even if each can solve complementary parts of a new problem, are difficult to link together to solve the problem as a whole. For example, a Lagrangian deformation model with a visco-elastoplastic crust is used to study deformation near a plate boundary. Ideally, the driving force of the deformation should be derived from the underlying mantle convection, which requires linking the Lagrangian deformation model with a Eulerian mantle convection model. As our understanding of geological processes evolves, the need for integrated modeling codes, which should reuse existing codes as much as possible, begins to surface. The GeoFramework project addresses this need by developing a suite of reusable and re-combinable tools for the Earth science community. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. Under the framework, a solver is aware of the existence of other solvers and can interact with them by exchanging information across adjacent boundaries. A solver needs to conform to a standard interface and provide its own implementation for exchanging boundary information. The framework also provides facilities to control the coordination between interacting solvers. We will show an example of linking two solvers within GeoFramework. CitcomS is a finite element code which solves for thermal convection within a 3D spherical shell. CitcomS can solve problems either within a full spherical (global) domain or within a restricted (regional) domain of a full sphere by using different meshers. We can embed a regional CitcomS solver within a global CitcomS solver. We note that linking two instances of the same solver is conceptually equivalent to linking two different solvers. The global solver has a coarser grid and a longer stable time step than the regional solver. Therefore, a global-solver time step consists of several regional-solver time steps. The time-marching scheme is described below. First, the global solver is advanced one global-solver time step. Then, the regional solver is advanced for several regional-solver time steps until it catches up with the global solver. Within each regional-solver time step, the velocity field of the global solver is interpolated in time and imposed on the regional solver as boundary conditions. Finally, the temperature field of the regional solver is extrapolated in space and fed back to the global solver. These two solvers are linked and synchronized by the time-marching scheme, as sketched below. An effort to embed a visco-elastoplastic representation of the crust within viscous mantle flow is underway.
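    The time-marching scheme just described can be summarized in pseudocode; the solver objects and method names below are placeholders, not CitcomS's actual API:

    ```python
    # Schematic of the coupled time-marching scheme (placeholder API).
    def advance_coupled(global_solver, regional_solver, n_sub):
        v0 = global_solver.boundary_velocity()      # before the global step
        global_solver.step()                        # one global time step
        v1 = global_solver.boundary_velocity()      # after the global step
        for k in range(1, n_sub + 1):
            alpha = k / n_sub                       # interpolate in time
            v_bc = (1 - alpha) * v0 + alpha * v1
            regional_solver.set_boundary_velocity(v_bc)
            regional_solver.step()                  # one regional time step
        # feed the finer regional temperature back to the coarse global grid
        global_solver.assimilate_temperature(regional_solver.temperature())
    ```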

  11. Earth Observing System (EOS) Communication (Ecom) Modeling, Analysis, and Testbed (EMAT) activity

    NASA Technical Reports Server (NTRS)

    Desai, Vishal

    1994-01-01

    This paper describes the Earth Observing System (EOS) Communication (Ecom) Modeling, Analysis, and Testbed (EMAT) activity performed by Code 540 in support of the Ecom project. Ecom is the ground-to-ground data transport system for operational EOS traffic. The National Aeronautics and Space Administration (NASA) Communications (Nascom) Division, Code 540, is responsible for implementing Ecom. Ecom interfaces with various systems to transport EOS forward link commands, return link telemetry, and science payload data. To understand the complexities surrounding the design and implementation of Ecom, it is necessary that sufficient testbedding, modeling, and analysis be conducted prior to the design phase. These activities, when grouped, are referred to as the EMAT activity. This paper describes work accomplished to date in each of the three major EMAT activities: modeling, analysis, and testbedding.

  12. Workflow for Integrating Mesoscale Heterogeneities in Materials Structure with Process Simulation of Titanium Alloys (Postprint)

    DTIC Science & Technology

    2014-10-01

    ...offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models... process/performance simulation codes (e.g., crystal plasticity finite element method). Subject terms: ICME; microstructure informatics; higher... (iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., crystal plasticity...

  13. Variational learning and bits-back coding: an information-theoretic view to Bayesian learning.

    PubMed

    Honkela, Antti; Valpola, Harri

    2004-07-01

    Bits-back coding, first introduced by Wallace in 1990 and later by Hinton and van Camp in 1993, provides an interesting link between Bayesian learning and information-theoretic minimum-description-length (MDL) learning approaches. Bits-back coding allows interpreting the cost function used in the variational Bayesian method called ensemble learning as a code length, in addition to the Bayesian view of it as the misfit of the posterior approximation and a lower bound on model evidence. Combining these two viewpoints provides interesting insights into the learning process and the functions of different parts of the model. In this paper, the problem of variational Bayesian learning of hierarchical latent variable models is used to demonstrate the benefits of the two views. The code-length interpretation provides new views of many parts of the problem, such as model comparison and pruning, and helps explain many phenomena occurring in learning.
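    The identity behind this dual view can be written compactly. The following is the standard form of the ensemble-learning cost from the general variational Bayes literature (a sketch, not an equation quoted from the paper), with q(theta) the posterior approximation and X the data:

    ```latex
    % Ensemble-learning cost = expected bits-back code length (in nats):
    % a KL-measured posterior misfit minus the log model evidence.
    \[
    \mathcal{C}(q)
    = \mathbb{E}_{q(\theta)}\!\left[\ln \frac{q(\theta)}{p(X,\theta)}\right]
    = D_{\mathrm{KL}}\!\big(q(\theta)\,\|\,p(\theta \mid X)\big) - \ln p(X)
    \]
    ```

    Minimizing C simultaneously tightens the evidence bound -C <= ln p(X) and, via bits-back coding, shortens the expected description length.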

  14. Coded spread spectrum digital transmission system design study

    NASA Technical Reports Server (NTRS)

    Heller, J. A.; Odenwalder, J. P.; Viterbi, A. J.

    1974-01-01

    Results are presented of a comprehensive study of the performance of Viterbi-decoded convolutional codes in the presence of nonideal carrier tracking and bit synchronization. A constraint length 7, rate 1/3 convolutional code and parameters suitable for the space shuttle coded communications links are used. Mathematical models are developed and theoretical and simulation results are obtained to determine the tracking and acquisition performance of the system. Pseudorandom sequence spread spectrum techniques are also considered to minimize potential degradation caused by multipath.

  15. Comparison of FDMA and CDMA for second generation land-mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Yongacoglu, A.; Lyons, R. G.; Mazur, B. A.

    1990-01-01

    The capacities of Code Division Multiple Access (CDMA) and Frequency Division Multiple Access (FDMA) systems (both analog and digital) are compared on the basis of identical link availabilities and physical propagation models. Parameters are optimized for a bandwidth-limited, multibeam environment. For CDMA, the benefits of voice-activated carriers, antenna discrimination, polarization reuse, return link power control, and multipath suppression are included in the analysis. For FDMA, the advantages of bandwidth-efficient modulation/coding combinations, voice-activated carriers, polarization reuse, beam placement, and frequency staggering were taken into account.

  16. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  17. Time Resolved Phonon Spectroscopy, Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goett, Johnny; Zhu, Brian

    TRPS code was developed for the project "Time Resolved Phonon Spectroscopy". Routines contained in this piece of software were specially created to model phonon generation and tracking within materials that interact with ionizing radiation, particularly applicable to the modeling of cryogenic radiation detectors for dark matter and neutrino research. These routines were created to link seamlessly with the open source Geant4 framework for the modeling of radiation transport in matter, with the explicit intent of open sourcing them for eventual integration into that code base.

  18. Revised catalog of types of CODES applications implemented using linked state data : crash outcome data evaluation system (CODES)

    DOT National Transportation Integrated Search

    2000-06-01

    The purpose of the Revised Catalog of Types of CODES Applications Implemented Using Linked : State Data (CODES) is to inspire the development of new applications for linked data that support : efforts to reduce death, disability, severity, and health...

  19. Development of structured ICD-10 and its application to computer-assisted ICD coding.

    PubMed

    Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko

    2010-01-01

    This paper presents: (1) a framework for the formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using the formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model used to describe the formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD-11 revision.

  20. Improving the diagnosis related grouping model's ability to explain length of stay of elderly medical inpatients by incorporating function-linked variables.

    PubMed

    Sahadevan, S; Earnest, A; Koh, Y L; Lee, K M; Soh, C H; Ding, Y Y

    2004-09-01

    This study first aimed to determine the adequacy of the Diagnosis Related Grouping (DRG) model's ability to explain (1) the variance in the actual length of stay (LOS) of elderly medical inpatients and (2) the LOS difference in the same cohort between the departments of Geriatric Medicine (GRM) and General Medicine (GM). We then looked at how these explanatory abilities of the DRG changed when patients' function-linked variables (ignored by the DRG) were incorporated into the model. Basic demographic data of a consecutively hospitalised cohort of elderly medical inpatients from GRM and GM, as well as their actual LOS, discharge DRG codes [with their corresponding trimmed average length of stay (ALOS)] and selected function-linked variables (including premorbid functional status, change in functional profile during hospitalisation and number of therapists seen) were recorded. Beginning with ALOS, function-linked variables that were significantly associated with LOS were then added into two multiple linear regression models so as to quantify how the functional dimension improved the DRG's ability to explain LOS variances and interdepartmental LOS differences. A forward selection procedure was employed to determine the final models. For the interdepartmental analysis, the study sample was restricted to patients who shared common DRG codes. 114 GRM and 118 GM patients were studied. Trimmed ALOS alone explained 8% of the actual LOS variance. With the addition of function-linked variables, the adjusted R2 of the final model increased to 28%. Due to common code restrictions, the data of 79 GRM and 78 GM patients were available for the analysis of interdepartmental LOS differences. At the unadjusted stage, the median stay of GRM patients was 4.3 days longer than GM's, and with adjustments made for the DRGs, this difference was reduced to 3.9 days. Additionally adjusting for the patients' functional features diminished the interdepartmental LOS discrepancy even further, to 2.1 days. This study demonstrates that for elderly medical inpatients, the incorporation of patients' functional status significantly improves the DRG model's ability to predict the patients' actual LOS as well as to explain interdepartmental LOS differences between GRM and GM.
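    The modeling step can be mimicked on synthetic data (all effect sizes and distributions below are invented) to show how adjusted R2 is compared between the ALOS-only model and the model with function-linked covariates:

    ```python
    # Toy illustration: regress actual LOS on the DRG trimmed ALOS alone,
    # then add function-linked covariates and compare adjusted R^2.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 232
    alos = rng.gamma(4, 2, n)              # DRG trimmed average LOS (invented)
    premorbid = rng.normal(0, 1, n)        # premorbid functional status
    decline = rng.normal(0, 1, n)          # functional change in hospital
    los = 0.6 * alos + 2.0 * decline - 1.0 * premorbid + rng.normal(0, 4, n)

    def adj_r2(X, y):
        """Adjusted R^2 of an ordinary least-squares fit with intercept."""
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        r2 = 1 - resid.var() / y.var()
        p = X.shape[1] - 1                 # number of predictors
        return 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)

    print(adj_r2(alos[:, None], los))      # ALOS alone
    print(adj_r2(np.column_stack([alos, premorbid, decline]), los))
    ```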

  1. Fast, Statistical Model of Surface Roughness for Ion-Solid Interaction Simulations and Efficient Code Coupling

    NASA Astrophysics Data System (ADS)

    Drobny, Jon; Curreli, Davide; Ruzic, David; Lasa, Ane; Green, David; Canik, John; Younkin, Tim; Blondel, Sophie; Wirth, Brian

    2017-10-01

    Surface roughness greatly impacts material erosion, and thus plays an important role in Plasma-Surface Interactions. Developing strategies for efficiently introducing rough surfaces into ion-solid interaction codes will be an important step towards whole-device modeling of plasma devices and future fusion reactors such as ITER. Fractal TRIDYN (F-TRIDYN) is an upgraded version of the Monte Carlo, BCA program TRIDYN developed for this purpose that includes an explicit fractal model of surface roughness and extended input and output options for file-based code coupling. Code coupling with both plasma and material codes has been achieved and allows for multi-scale, whole-device modeling of plasma experiments. These code coupling results will be presented. F-TRIDYN has been further upgraded with an alternative, statistical model of surface roughness. The statistical model is significantly faster than and compares favorably to the fractal model. Additionally, the statistical model compares well to alternative computational surface roughness models and experiments. Theoretical links between the fractal and statistical models are made, and further connections to experimental measurements of surface roughness are explored. This work was supported by the PSI-SciDAC Project funded by the U.S. Department of Energy through contract DOE-DE-SC0008658.

  2. An atomistic model for cross-linked HNBR elastomers used in seals

    NASA Astrophysics Data System (ADS)

    Molinari, Nicola; Sutton, Adrian; Stevens, John; Mostofi, Arash

    2015-03-01

    Hydrogenated nitrile butadiene rubber (HNBR) is one of the most common elastomeric materials used for seals in the oil and gas industry. These seals sometimes suffer "explosive decompression," a costly problem in which gases permeate a seal at the elevated temperatures and pressures pertaining in oil and gas wells, leading to rupture when the seal is brought back to the surface. The experimental evidence that HNBR and its unsaturated parent NBR have markedly different swelling properties suggests that cross-linking may occur during hydrogenation of NBR to produce HNBR. We have developed a code compatible with the LAMMPS molecular dynamics package to generate fully atomistic HNBR configurations by hydrogenating initial NBR structures. This can be done with any desired degree of cross-linking. The code uses a model of atomic interactions based on the OPLS-AA force field. We present calculations of the dependence of a number of bulk properties on the degree of cross-linking. Using our atomistic representations of HNBR and NBR, we hope to develop a better molecular understanding of the mechanisms that result in explosive decompression.

  3. Evolved Design, Integration, and Test of a Modular, Multi-Link, Spacecraft-Based Robotic Manipulator

    DTIC Science & Technology

    2016-06-01

    ...of the MATLAB code, the SPART model [24]. The portions of the SPART model relevant to this thesis are contained in Appendices E-P. While the SPART... the kinematics and the dynamics of the system must be modeled and simulated numerically to understand how the system will behave for a given number... simulators with multiple-link robotic arms has been ongoing. State of the art, an overarching context: space-based manipulators and the experimental...

  4. Effect of digital scrambling on satellite communication links

    NASA Technical Reports Server (NTRS)

    Dessouky, K.

    1985-01-01

    Digital data scrambling has been considered for communication systems using NRZ symbol formats. The purpose is to increase the number of transitions in the data to improve the performance of the symbol synchronizer. This is accomplished without expanding the bandwidth but at the expense of increasing the data bit error rate (BER). Models for the scramblers/descramblers of practical interest are presented together with the appropriate link model. The effects of scrambling on the performance of coded and uncoded links are studied. The results are illustrated by application to the Tracking and Data Relay Satellite System (TDRSS) links. Conclusions regarding the usefulness of scrambling are also given.

  5. A combinatorial model for dentate gyrus sparse coding

    DOE PAGES

    Severa, William; Parekh, Ojas; James, Conrad D.; ...

    2016-12-29

    The dentate gyrus forms a critical link between the entorhinal cortex and CA3 by providing a sparse version of the signal. Concurrent with this increase in sparsity, a widely accepted theory suggests the dentate gyrus performs pattern separation—similar inputs yield decorrelated outputs. Although an active region of study and theory, few logically rigorous arguments detail the dentate gyrus's (DG) coding. We suggest a theoretically tractable, combinatorial model for this action. The model provides formal methods for a highly redundant, arbitrarily sparse, and decorrelated output signal. To explore the value of this model framework, we assess how suitable it is for two notable aspects of DG coding: how it can handle the highly structured grid cell representation in the input entorhinal cortex region and the presence of adult neurogenesis, which has been proposed to produce a heterogeneous code in the DG. We find tailoring the model to grid cell input yields expansion parameters consistent with the literature. In addition, the heterogeneous coding reflects activity gradation observed experimentally. Lastly, we connect this approach with more conventional binary threshold neural circuit models via a formal embedding.

  6. The Role of Margin in Link Design and Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, K.

    2015-01-01

    Link analysis is a system engineering process used in the design, development, and operation of communication systems and networks. Link models are mathematical abstractions representing the useful signal power and the undesirable noise and attenuation effects (including weather effects if the signal path traverses the atmosphere); these are integrated into the link budget calculation, which provides estimates of the signal power and noise power at the receiver. A link margin is then applied in an attempt to counteract the fluctuations of the signal and noise power and to ensure reliable data delivery from transmitter to receiver. (The link margin is dictated by link margin policy or requirements.) A simple link budgeting approach that assumes link parameters to be deterministic values typically adopts a rule-of-thumb policy of 3 dB link margin. This policy works for most S- and X-band links due to their insensitivity to weather effects. But for higher-frequency links like Ka-band, Ku-band, and optical communication links, it is unclear whether a 3 dB link margin would guarantee link closure. Statistical link analysis adopting a 2-sigma or 3-sigma link margin incorporates link uncertainties into the sigma calculation. (The Deep Space Network (DSN) link margin policies are 2-sigma for downlink and 3-sigma for uplink.) The link reliability can therefore be quantified statistically even for higher-frequency links. However, in the current statistical link analysis approach, link reliability is expressed only as the likelihood of exceeding the signal-to-noise ratio (SNR) threshold that corresponds to a given bit-error-rate (BER) or frame-error-rate (FER) requirement. The method does not provide the true BER or FER estimate of the link with margin, or the required SNR that would meet the BER or FER requirement in the statistical sense. In this paper, we perform an in-depth analysis of the relationship between the BER/FER requirement, the operating SNR, and the coding performance curve in the case where the channel coherence time of the link fluctuation is comparable to or larger than the time duration of a codeword. We compute the "true" SNR design point that would meet the BER/FER requirement by taking into account the fluctuation of signal power and noise power at the receiver and the shape of the coding performance curve. This analysis yields a number of valuable insights on the design choices of coding scheme and link margin for the reliable data delivery of a communication system - space and ground. We illustrate the aforementioned analysis using a number of standard NASA error-correcting codes.
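    The core statistical point can be demonstrated numerically. In the sketch below, the logistic "coding performance curve" and the 1 dB fluctuation sigma are invented stand-ins; the point is that the delivered FER is the coding curve averaged over the SNR distribution, not the FER evaluated at the mean SNR:

    ```python
    # Sketch: delivered FER under Gaussian (in dB) SNR fluctuations, for a
    # few candidate link margins above an assumed 3 dB waterfall region.
    import numpy as np

    def fer(snr_db):
        """Placeholder waterfall curve, steep near 3 dB (invented)."""
        return 1.0 / (1.0 + np.exp(4.0 * (snr_db - 3.0)))

    sigma_db = 1.0                           # link fluctuation (assumed)
    rng = np.random.default_rng(1)

    def delivered_fer(design_snr_db, n=200_000):
        """Monte Carlo average of the coding curve over SNR fluctuations."""
        samples = design_snr_db + sigma_db * rng.standard_normal(n)
        return fer(samples).mean()

    for margin_db in [0.0, 2.0, 3.0]:        # candidate margins
        print(margin_db, delivered_fer(3.0 + margin_db))
    ```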

  7. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software, which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  8. Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)

    DOT National Transportation Integrated Search

    2001-04-01

    This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...

  9. Refined mapping of autoimmune disease associated genetic variants with gene expression suggests an important role for non-coding RNAs.

    PubMed

    Ricaño-Ponce, Isis; Zhernakova, Daria V; Deelen, Patrick; Luo, Oscar; Li, Xingwang; Isaacs, Aaron; Karjalainen, Juha; Di Tommaso, Jennifer; Borek, Zuzanna Agnieszka; Zorro, Maria M; Gutierrez-Achury, Javier; Uitterlinden, Andre G; Hofman, Albert; van Meurs, Joyce; Netea, Mihai G; Jonkers, Iris H; Withoff, Sebo; van Duijn, Cornelia M; Li, Yang; Ruan, Yijun; Franke, Lude; Wijmenga, Cisca; Kumar, Vinod

    2016-04-01

    Genome-wide association and fine-mapping studies in 14 autoimmune diseases (AID) have implicated more than 250 loci in one or more of these diseases. As more than 90% of AID-associated SNPs are intergenic or intronic, pinpointing the causal genes is challenging. We performed a systematic analysis to link 460 SNPs that are associated with 14 AID to causal genes using transcriptomic data from 629 blood samples. We were able to link 71 (39%) of the AID-SNPs to two or more nearby genes, providing evidence that for part of the AID loci multiple causal genes exist. While 54 of the AID loci are shared by one or more AID, 17% of them do not share candidate causal genes. In addition to finding novel genes such as ULK3, we also implicate novel disease mechanisms and pathways such as autophagy in celiac disease pathogenesis. Furthermore, 42 of the AID SNPs specifically affected the expression of 53 non-coding RNA genes. To further understand how the non-coding genome contributes to AID, the SNPs were linked to functional regulatory elements, which suggest a model where AID genes are regulated by a network of chromatin looping/non-coding RNA interactions. The looping model also explains how a causal candidate gene is not necessarily the gene closest to the AID SNP, as was the case for nearly 50% of the loci.

  10. Digital scrambling for shuttle communication links: Do drawbacks outweigh advantages?

    NASA Technical Reports Server (NTRS)

    Dessouky, K.

    1985-01-01

    Digital data scrambling has been considered for communication systems using NRZ (non-return to zero) symbol formats. The purpose is to increase the number of transitions in the data to improve the performance of the symbol synchronizer. This is accomplished without expanding the bandwidth but at the expense of increasing the data bit error rate (BER). Models for the scramblers/descramblers of practical interest are presented together with the appropriate link model. The effects of scrambling on the performance of coded and uncoded links are studied. The results are illustrated by application to the Tracking and Data Relay Satellite System links. Conclusions regarding the usefulness of scrambling are also given.

  11. Adding kinetics and hydrodynamics to the CHEETAH thermochemical code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, L.E.; Howard, W.M.; Souers, P.C.

    1997-01-15

    In FY96 we released CHEETAH 1.40, which made extensive improvements in the stability and user friendliness of the code. CHEETAH now has over 175 users in government, academia, and industry. Efforts have also been focused on adding new advanced features to CHEETAH 2.0, which is scheduled for release in FY97. We have added a new chemical kinetics capability to CHEETAH. In the past, CHEETAH assumed complete thermodynamic equilibrium and independence of time. The addition of a chemical kinetic framework will allow for modeling of time-dependent phenomena, such as partial combustion and detonation in composite explosives with large reaction zones. We have implemented a Wood-Kirkwood detonation framework in CHEETAH, which allows for the treatment of nonideal detonations and explosive failure. A second major effort in the project this year has been linking CHEETAH to hydrodynamic codes to yield an improved HE product equation of state. We have linked CHEETAH to 1- and 2-D hydrodynamic codes, and have compared the code to experimental data. 15 refs., 13 figs., 1 tab.

  12. RTE: A computer code for Rocket Thermal Evaluation

    NASA Technical Reports Server (NTRS)

    Naraghi, Mohammad H. N.

    1995-01-01

    The numerical model for a rocket thermal analysis code (RTE) is discussed. RTE is a comprehensive code for the thermal analysis of regeneratively cooled rocket engines. The input to the code consists of the composition of the fuel/oxidant mixture and flow rates, chamber pressure, coolant temperature and pressure, dimensions of the engine, materials, and the number of nodes in different parts of the engine. The code allows for temperature variation in the axial, radial, and circumferential directions. By implementing an iterative scheme, it provides nodal temperature distribution, rates of heat transfer, and hot-gas and coolant thermal and transport properties. The fuel/oxidant mixture ratio can be varied along the thrust chamber. This feature allows the user to incorporate a non-equilibrium model or an energy release model for the hot-gas side. The user has the option of bypassing the hot-gas-side calculations and directly inputting the gas-side fluxes. This feature is used to link RTE to a boundary layer module for the hot-gas-side heat flux calculations.

  13. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  14. Coding performance of the Probe-Orbiter-Earth communication link

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Dolinar, S.; Pollara, F.

    1993-01-01

    The coding performance of the Probe-Orbiter-Earth communication link is analyzed and compared for several cases. It is assumed that the coding system consists of a convolutional code at the Probe, a quantizer and another convolutional code at the Orbiter, and two cascaded Viterbi decoders or a combined decoder on the ground.

  15. R Tricks for Kids

    ERIC Educational Resources Information Center

    Braun, W. John; White, Bethany J. G.; Craig, Gavin

    2014-01-01

    Real-world phenomena simulation models, which can be used to engage middle-school students with probability, are described. Links to R instructional material and easy-to-use code are provided to facilitate implementation in the classroom.

  16. Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; White, Todd; Mangini, Nancy

    2009-01-01

    The Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamics (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.

  17. Individual differences in adaptive coding of face identity are linked to individual differences in face recognition ability.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Hayward, William G; Ewing, Louise

    2014-06-01

    Despite their similarity as visual patterns, we can discriminate and recognize many thousands of faces. This expertise has been linked to 2 coding mechanisms: holistic integration of information across the face and adaptive coding of face identity using norms tuned by experience. Recently, individual differences in face recognition ability have been discovered and linked to differences in holistic coding. Here we show that they are also linked to individual differences in adaptive coding of face identity, measured using face identity aftereffects. Identity aftereffects correlated significantly with several measures of face-selective recognition ability. They also correlated marginally with own-race face recognition ability, suggesting a role for adaptive coding in the well-known other-race effect. More generally, these results highlight the important functional role of adaptive face-coding mechanisms in face expertise, taking us beyond the traditional focus on holistic coding mechanisms.

  18. CHARRON: Code for High Angular Resolution of Rotating Objects in Nature

    NASA Astrophysics Data System (ADS)

    Domiciano de Souza, A.; Zorec, J.; Vakili, F.

    2012-12-01

    Rotation is one of the fundamental physical parameters governing stellar physics and evolution. At the same time, spectrally resolved optical/IR long-baseline interferometry has proven to be an important observing tool for measuring many physical effects linked to rotation, in particular stellar flattening, gravity darkening, and differential rotation. In order to interpret the high angular resolution observations from modern spectro-interferometers, such as VLTI/AMBER and VEGA/CHARA, we have developed an interferometry-oriented numerical model: CHARRON (Code for High Angular Resolution of Rotating Objects in Nature). We present here the characteristics of CHARRON, which is faster (approximately 10-30 s per model) and thus better adapted to model-fitting than the first version of the code presented by Domiciano de Souza et al. (2002).

  19. Approximate minimum-time trajectories for 2-link flexible manipulators

    NASA Technical Reports Server (NTRS)

    Eisler, G. R.; Segalman, D. J.; Robinett, R. D.

    1989-01-01

    Powell's nonlinear programming code, VF02AD, was used to generate approximate minimum-time tip trajectories for 2-link semi-rigid and flexible manipulator movements in the horizontal plane. The manipulator is modeled with an efficient finite-element scheme for an n-link, m-joint system with horizontal-plane bending only. Constraints on the trajectory include boundary conditions on position and energy for a rest-to-rest maneuver, straight-line tracking between boundary positions, and motor torque limits. Trajectory comparisons utilize a change in the link stiffness, EI, to transition from the semi-rigid to flexible case. Results show the level of compliance necessary to excite significant modal behavior. Quiescence of the final configuration is examined with the finite-element model.

  20. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
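    A schematic of the scoring step (weights, feature names, and similarity values below are invented; only the SOC codes are real): per-classifier scores for each candidate SOC code are combined in a logistic model, and the highest-scoring code is assigned:

    ```python
    # Sketch of a SOCcer-style assignment: combine job-title, task, and
    # industry classifier scores in a logistic model and keep the argmax.
    import math

    WEIGHTS = {"bias": -3.0, "title": 4.0, "task": 2.5, "industry": 1.0}

    def soc_score(features):
        """Logistic combination of per-classifier similarity scores."""
        z = WEIGHTS["bias"] + sum(WEIGHTS[k] * features[k]
                                  for k in ("title", "task", "industry"))
        return 1.0 / (1.0 + math.exp(-z))

    def assign_soc(candidates):
        """candidates: SOC code -> per-classifier scores in [0, 1]."""
        scored = {soc: soc_score(f) for soc, f in candidates.items()}
        best = max(scored, key=scored.get)
        return best, scored[best]   # a high score needs less manual review

    cands = {"47-2111": {"title": 0.9, "task": 0.7, "industry": 0.8},
             "47-2031": {"title": 0.4, "task": 0.5, "industry": 0.8}}
    print(assign_soc(cands))        # 47-2111 (Electricians) wins here
    ```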

  1. Modelling dwarf mistletoe at three scales: life history, ballistics and contagion

    Treesearch

    Donald C. E. Robinson; Brian W. Geils

    2006-01-01

    The epidemiology of dwarf mistletoe (Arceuthobium) is simulated for the reproduction, dispersal, and spatial patterns of these plant pathogens on conifer trees. A conceptual model for mistletoe spread and intensification is coded as sets of related subprograms that link to either of two individual-tree growth models (FVS and TASS) used by managers to develop...

  2. Study of information transfer optimization for communication satellites

    NASA Technical Reports Server (NTRS)

    Odenwalder, J. P.; Viterbi, A. J.; Jacobs, I. M.; Heller, J. A.

    1973-01-01

    The results are presented of a study of source coding, modulation/channel coding, and systems techniques for application to teleconferencing over high data rate digital communication satellite links. Simultaneous transmission of video, voice, data, and/or graphics is possible in various teleconferencing modes and one-way, two-way, and broadcast modes are considered. A satellite channel model including filters, limiter, a TWT, detectors, and an optimized equalizer is treated in detail. A complete analysis is presented for one set of system assumptions which exclude nonlinear gain and phase distortion in the TWT. Modulation, demodulation, and channel coding are considered, based on an additive white Gaussian noise channel model which is an idealization of an equalized channel. Source coding with emphasis on video data compression is reviewed, and the experimental facility utilized to test promising techniques is fully described.

  3. Link performance model for filter bank based multicarrier systems

    NASA Astrophysics Data System (ADS)

    Petrov, Dmitry; Oborina, Alexandra; Giupponi, Lorenza; Stitz, Tobias Hidalgo

    2014-12-01

    This paper presents a complete link-level abstraction model for link quality estimation at the system level of filter bank multicarrier (FBMC)-based networks. The application of the mean mutual information per coded bit (MMIB) approach is validated for FBMC systems. The considered quality measure of the resource element for FBMC transmission is the received signal-to-noise-plus-distortion ratio (SNDR). Simulation results of the proposed link abstraction model show that the proposed approach is capable of estimating the block error rate (BLER) accurately, even when the signal is propagated through channels with deep and frequent fades, as is the case for the 3GPP Hilly Terrain (3GPP-HT) and Enhanced Typical Urban (ETU) models. The FBMC-related results of link-level simulations are compared with cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) analogs. Simulation results are also validated through comparison to publicly available reference results. Finally, the steps of the link-level abstraction algorithm for FBMC are formulated, and its application to system-level simulation of a professional mobile radio (PMR) network is discussed.
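    The MMIB chain described above can be sketched as follows; the Gaussian-capacity stand-in replaces the fitted per-modulation MI functions used in practice, and the BLER mapping constants are invented rather than calibrated:

    ```python
    # Sketch of link abstraction: per-resource-element SNDR -> per-bit
    # mutual information -> mean MI (MMIB) -> BLER from a calibrated map.
    import numpy as np

    def mi_per_bit(sndr_linear, bits_per_symbol=2):
        """Stand-in MI function: Gaussian capacity capped at QPSK order."""
        cap = np.minimum(np.log2(1 + sndr_linear), bits_per_symbol)
        return cap / bits_per_symbol

    def bler_from_mmib(mmib, mi_ref=0.62, slope=25.0):
        """Invented logistic mapping; real curves come from AWGN runs."""
        return 1.0 / (1.0 + np.exp(slope * (mmib - mi_ref)))

    sndr_db = np.array([8.0, 2.0, -5.0, 6.0, 1.0])  # faded resource elements
    mmib = mi_per_bit(10 ** (sndr_db / 10)).mean()
    print(mmib, bler_from_mmib(mmib))
    ```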

  4. Estimation of climate change impact on dead fuel moisture at local scale by using weather generators

    NASA Astrophysics Data System (ADS)

    Pellizzaro, Grazia; Bortolu, Sara; Dubrovsky, Martin; Arca, Bachisio; Ventura, Andrea; Duce, Pierpaolo

    2015-04-01

    The moisture content of dead fuel is an important variable in fire ignition and fire propagation. Moisture exchange in dead materials is controlled by physical processes and is clearly dependent on atmospheric changes. According to projections of future climate in Southern Europe, changes in temperature, precipitation, and extreme events are expected. More prolonged drought seasons could influence fuel moisture content and, consequently, the number of days characterized by high ignition danger in Mediterranean ecosystems. The low resolution of the climate data provided by the general circulation models (GCMs) represents a limitation for evaluating climate change impacts at the local scale. For this reason, the climate research community has been called on to develop appropriate downscaling techniques. One of the downscaling approaches, which transforms the raw outputs from the climate models (GCMs or RCMs) into data with more realistic structure, is based on linking a stochastic weather generator with the climate model outputs. Weather generators linked to climate change scenarios can therefore be used to create synthetic weather series (air temperature and relative humidity, wind speed, and precipitation) representing present and future climates at the local scale. The main aims of this work are to identify useful tools to determine potential impacts of expected climate change on dead fuel status in Mediterranean shrubland and, in particular, to estimate the effect of climate changes on the number of days characterized by critical values of dead fuel moisture. Measurements of dead fuel moisture content (FMC) in Mediterranean shrubland were performed by using humidity sensors in North Western Sardinia (Italy) for six years. Meteorological variables were also recorded. Data were used to determine the accuracy of the Canadian Fine Fuels Moisture Code (FFM code) in modelling moisture dynamics of dead fuel in Mediterranean vegetation. Critical threshold values of the FFM code for the Mediterranean climate were identified by percentile analysis, and new fuel moisture code classes were also defined. A stochastic weather generator (M&Rfi), linked to climate change scenarios derived from 17 available General Circulation Models (GCMs), was used to produce synthetic weather series, representing present and future climates, for four selected sites located in North Western Sardinia, Italy. The number of days with critical FFM code values for present and future climate was calculated, and the potential impact of future climate change was analysed.

  5. SPECabq v. 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambers, Robert S.; Neidigk, Matthew A.

    Sandia SPECabq is a FORTRAN code that defines the user-supplied subroutines needed to perform nonlinear viscoelastic analyses in the ABAQUS commercial finite element code based on the Simplified Potential Energy Clock (SPEC) model. The SPEC model was published in the open literature in 2009. The code must be compiled and linked with the ABAQUS libraries under the user-supplied subroutine option of the ABAQUS executable script. The subroutine is used to analyze the thermomechanical behavior of isotropic polymers, predicting, for example, how a polymer undergoes stress or volume relaxation under different temperature and loading environments. It thereby enables the ABAQUS finite element code to be used for analyzing the thermo-mechanical behavior of samples and parts made from glassy polymers.

  6. An NPARC Turbulence Module with Wall Functions

    NASA Technical Reports Server (NTRS)

    Zhu, J.; Shih, T.-H.

    1997-01-01

    The turbulence module recently developed for the NPARC code has been extended to include wall functions. The Van Driest transformation is used so that the wall functions can be applied to both incompressible and compressible flows. The module is equipped with three two-equation K-epsilon turbulence models: the Chien, Shih-Lumley, and CMOTT models. Details of the wall functions as well as their numerical implementation are reported. It is shown that inappropriate artificial viscosity in the near-wall region strongly influences the solution of the wall function approach. A simple way to eliminate this influence is proposed, which gives satisfactory results in the code validation. The module can be easily linked to the NPARC code for practical applications.
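
    A toy sketch of the wall-function closure step in the incompressible limit (for compressible flow the module would first apply the Van Driest transformation to the velocity); the log-law constants and flow values are illustrative:

      import numpy as np

      KAPPA, B = 0.41, 5.2  # standard log-law constants

      def u_plus_log_law(y_plus):
          # Log law of the wall for the (transformed) mean velocity.
          return np.log(y_plus) / KAPPA + B

      def friction_velocity(u1, y1, nu, iters=50):
          # Solve u1/u_tau = log(y1*u_tau/nu)/kappa + B for u_tau by
          # fixed-point iteration, the usual wall-function closure step.
          u_tau = np.sqrt(nu * u1 / y1)  # crude initial guess
          for _ in range(iters):
              u_tau = u1 / u_plus_log_law(y1 * u_tau / nu)
          return u_tau

      u_tau = friction_velocity(u1=10.0, y1=1e-3, nu=1.5e-5)
      print(f"u_tau = {u_tau:.3f} m/s, y+ = {1e-3 * u_tau / 1.5e-5:.0f}")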

  7. Utility of linking primary care electronic medical records with Canadian census data to study the determinants of chronic disease: an example based on socioeconomic status and obesity.

    PubMed

    Biro, Suzanne; Williamson, Tyler; Leggett, Jannet Ann; Barber, David; Morkem, Rachael; Moore, Kieran; Belanger, Paul; Mosley, Brian; Janssen, Ian

    2016-03-11

    Electronic medical records (EMRs) used in primary care contain a breadth of data that can be used in public health research. Patient data from EMRs can be linked with other data sources, such as a postal code linkage with Census data, to obtain additional information on environmental determinants of health. While promising, successful linkage of primary care EMRs with geographic measures has been limited owing to ethics review board concerns. This study tested the feasibility of extracting the full postal code from primary care EMRs and linking it with area-level measures of the environment to demonstrate how such a linkage could be used to examine the determinants of disease. The association between obesity and area-level deprivation was used as an example to illustrate inequalities in adult obesity. The analysis included EMRs of 7153 patients aged 20 years and older who visited a single primary care site in 2011. Extracted patient information included demographics (date of birth, sex, postal code) and weight status (height, weight). Information extraction and management procedures were designed to mitigate the risk of individual re-identification when extracting the full postal code from source EMRs. Based on patients' postal codes, area-based deprivation indexes were created using the smallest area unit used in Canadian censuses. Descriptive statistics and socioeconomic disparity summary measures of the linked census and adult patient data were calculated. The extraction of the full postal code met the technological requirements for rendering health information extracted from local EMRs into anonymized data. The prevalence of obesity was 31.6%. Obesity varied between deprivation quintiles; adults in the most deprived areas were 35% more likely to be obese than adults in the least deprived areas (Chi-Square = 20.24(1), p < 0.0001). Maps depicting the spatial distribution of regional deprivation and obesity were created to highlight high-risk areas. An area-based socio-economic measure was linked with EMR-derived objective measures of height and weight to show a positive association between area-level deprivation and obesity. The linked dataset demonstrates a promising model for assessing health disparities and ecological factors associated with the development of chronic diseases, with far-reaching implications for informing public health and primary health care interventions and services.
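
    A minimal sketch of the postal-code linkage and quintile analysis, with hypothetical column names and made-up records; the study itself linked full six-character postal codes to deprivation indexes built on the smallest census area unit:

      import pandas as pd

      # Hypothetical extracts: patients with postal code and obesity flag,
      # and a lookup from postal code to an area deprivation quintile
      # (1 = least deprived, 5 = most deprived).
      patients = pd.DataFrame({
          "postal_code": ["K7L3N6", "K7L3N6", "K7M1A1", "K7K2B2"],
          "obese": [1, 0, 1, 0],
      })
      census = pd.DataFrame({
          "postal_code": ["K7L3N6", "K7M1A1", "K7K2B2"],
          "deprivation_quintile": [1, 3, 5],
      })

      linked = patients.merge(census, on="postal_code", how="left")
      prevalence = linked.groupby("deprivation_quintile")["obese"].mean()
      print(prevalence)  # obesity prevalence by area-level deprivation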

  8. Effects of linking a soil-water-balance model with a groundwater-flow model

    USGS Publications Warehouse

    Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.

    2013-01-01

    A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze the effects on groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improved simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and the estimated or measured target values were similar for the previously published model and the linked models, and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than that of the previously published model. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.
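
    A small sketch of the calibration-target comparison described above; the base-flow values are placeholders:

      import numpy as np

      def rms_difference(simulated, target):
          # Root-mean-squared difference between simulated values and
          # calibration targets (groundwater levels or base flows).
          simulated, target = np.asarray(simulated), np.asarray(target)
          return float(np.sqrt(np.mean((simulated - target) ** 2)))

      target = [3.2, 4.1, 2.8, 5.0]                        # estimated base flows
      print(rms_difference([3.5, 3.9, 2.2, 5.4], target))  # previous model
      print(rms_difference([3.3, 4.0, 2.6, 5.2], target))  # linked models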

  9. The CCSDS Next Generation Space Data Link Protocol (NGSLP)

    NASA Technical Reports Server (NTRS)

    Kazz, Greg J.; Greenberg, Edward

    2014-01-01

    The CCSDS space link protocols, i.e., Telemetry (TM), Telecommand (TC), and Advanced Orbiting Systems (AOS), were developed in the early growth period of the space program. They were designed to meet the needs of the early missions, to be compatible with the available technology, and to focus on their specific link environments. Digital technology was in its infancy, and spacecraft power and mass issues imposed severe constraints on flight implementations. The Telecommand protocol was therefore designed around a simple Bose-Chaudhuri-Hocquenghem (BCH) code that provided little coding gain and limited error detection but was relatively simple to decode on board. The infusion of the concatenated Convolutional and Reed-Solomon codes for telemetry was a major milestone: it transformed telemetry applications by allowing them to utilize the telemetry link far more efficiently in delivering user data. The ability to significantly lower the error rates on telemetry links enabled the use of packet telemetry and data compression. The infusion of high performance codes for telemetry was enabled by the advent of digital processing, but it was limited to earth based systems supporting telemetry. The latest CCSDS space link protocol, Proximity-1, was developed in early 2000 to meet the needs of short-range, bi-directional, fixed or mobile radio links characterized by short time delays, moderate but not weak signals, and short independent sessions. Proximity-1 has been successfully deployed on both NASA and ESA missions at Mars and is planned to be utilized by all Mars missions in development. A new age has arisen, one that provides the means to perform advanced digital processing in spacecraft systems, enabling the use of improved transponders, digital correlators, and high performance forward error correcting codes for all communications links. Flight transponders utilizing digital technology have emerged and can efficiently provide the means to make the next leap in performance for space link communications. Field Programmable Gate Arrays (FPGAs) make it possible to incorporate high performance forward error correcting codes within software transponders, improving data transfer, ranging, link security, and time correlation. Given these synergistic technological breakthroughs, the time has come to apply them to both ongoing (e.g., command, telemetry) and emerging (e.g., space link security, optical communication) space link applications. However, one of the constraining factors within the Data Link Layer in realizing these performance gains is the lack of a generic transfer frame format and common supporting services among the existing CCSDS link layer protocols. Currently, each of the four CCSDS link layer protocols (TM, TC, AOS, and Proximity-1) has unique formats and services, which prohibits their reuse across the totality of space link applications of CCSDS member space agencies. Consider, for example, Mars missions: they implement their proximity data link layer using the Proximity-1 frame format and the services it supports, but are still required to support the direct-from-Earth (TC) and Direct-To-Earth (AOS/TM) protocols.
The prime purpose of this paper is to describe a new general purpose CCSDS Data Link layer protocol, the NGSLP, that will provide the required services along with a common transfer frame format for all CCSDS space links (ground to/from space and space to space) targeted for emerging missions after a CCSDS agency-wide coordinated date. The paper also describes related options that can be included in the Coding and Synchronization sub-layer of the Data Link layer to extend the capacities of the link and to make the transfer frame sub-layer independent of the coding sub-layer. This feature gives missions the option of running either the currently used synchronous coding and transfer frame data link or an asynchronous coding/frame data link, in which the transfer frame length is independent of the block size of the code. Eliminating this constraint (the frame synchronized to the code block) simplifies the interface between the transponder and the data handling equipment and reduces implementation costs and complexities. The benefits include the inclusion of encoders/decoders in transmitters and receivers without regard to data link protocols, and the ability to insert latency-sensitive messages into the link to support launch, landing/docking, telerobotics, and Variable Coded Modulation (VCM). In addition, the ability to transfer different-sized frames can provide a backup for delivering stored anomaly engineering data simultaneously with real time data, or for relaying frames from various sources onto a trunk line for delivery to Earth.

  10. A simple stochastic weather generator for ecological modeling

    Treesearch

    A.G. Birt; M.R. Valdez-Vivas; R.M. Feldman; C.W. Lafon; D. Cairns; R.N. Coulson; M. Tchakerian; W. Xi; Jim Guldin

    2010-01-01

    Stochastic weather generators are useful tools for exploring the relationship between organisms and their environment. This paper describes a simple weather generator that can be used in ecological modeling projects. We provide a detailed description of the methodology, and links to the full C++ source code (http://weathergen.sourceforge.net) required to implement or modify...
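
    A minimal sketch of a generator of this kind, assuming a first-order two-state Markov chain for precipitation occurrence, gamma-distributed wet-day amounts, and a sinusoidal temperature cycle; all parameter values are illustrative, not those of the paper's C++ implementation:

      import numpy as np

      rng = np.random.default_rng(42)

      P_WET_GIVEN_DRY, P_WET_GIVEN_WET = 0.25, 0.60  # assumed transitions
      GAMMA_SHAPE, GAMMA_SCALE = 0.8, 8.0            # wet-day rainfall (mm)

      def generate_daily_weather(n_days):
          wet, series = False, []
          for day in range(n_days):
              p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
              wet = rng.random() < p
              rain = rng.gamma(GAMMA_SHAPE, GAMMA_SCALE) if wet else 0.0
              # Temperature: seasonal cycle plus noise, cooler on wet days.
              temp = (15.0 + 10.0 * np.sin(2 * np.pi * day / 365)
                      + rng.normal(0.0, 2.0) - (2.0 if wet else 0.0))
              series.append((rain, temp))
          return series

      sample = generate_daily_weather(365)
      print(f"wet days: {sum(r > 0 for r, _ in sample)}")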

  11. Palm: Easing the Burden of Analytical Performance Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  12. Modeling of Radiowave Propagation in a Forested Environment

    DTIC Science & Technology

    2014-09-01

    Propagation models used in wireless communication system design play an important role in overall link performance. Propagation models in a forested environment, in particular, ... Applications in both domains require communication devices and sensors to be operated in forested environments. Various methods have been ...

  13. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow, and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code, TALINK (Transient Analysis code LINKage program), used to provide a flexible interface linking the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics code and the SIBDYM whole plant dynamic modelling code used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. The paper also discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' loss of offsite power fault transient.

  14. CBP PHASE I CODE INTEGRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Brown, K.; Flach, G.

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems, including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current version 10.5) is a Windows based, graphical, object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The Player version makes the software readily available to the wider community of users who wish to use the CBP application but do not have a GoldSim license.
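
    A minimal sketch of the coupling pattern the DLL implements (write the inputs GoldSim passes in, run the external application, read its outputs back), shown as a Python stand-in rather than the actual Windows DLL; the file names and executable name are hypothetical placeholders:

      import subprocess
      from pathlib import Path

      def run_external_code(inputs, exe="partner_code", workdir="run"):
          # Write the input list, run the external application in the work
          # directory, then read the outputs it leaves behind.
          work = Path(workdir)
          work.mkdir(exist_ok=True)
          (work / "input.dat").write_text("\n".join(str(v) for v in inputs))
          subprocess.run([exe], cwd=work, check=True)
          return [float(v) for v in (work / "output.dat").read_text().split()]

      # outputs = run_external_code([1.0e-9, 0.35, 25.0])
      # e.g. diffusivity, porosity, temperature in; degradation state out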

  15. ALARA: The next link in a chain of activation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, P.P.H.; Henderson, D.L.

    1996-12-31

    The Analytic and Laplacian Adaptive Radioactivity Analysis (ALARA) code has been developed as the next link in the chain of DKR radioactivity codes. Its methods address the criticisms of DKR while retaining its best features. Whereas DKR ignored loops in the transmutation/decay scheme to preserve the exactness of the mathematical solution, ALARA incorporates new computational approaches without jeopardizing the most important features of DKR's physical modelling and mathematical methods. The physical model uses 'straightened-loop, linear chains' to achieve the same accuracy in the loop solutions as is demanded in the rest of the scheme. In cases where a chain has no loops, the exact DKR solution is used. Otherwise, ALARA adaptively chooses between a direct Laplace inversion technique and a Laplace expansion inversion technique to optimize the accuracy and speed of the solution. All of these methods result in matrix solutions which allow the fastest and most accurate treatment of exact pulsing histories. Since the entire history is solved for each chain as it is created, ALARA achieves an optimal combination of high accuracy, high speed, and low memory usage. 8 refs., 2 figs.
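
    For a loop-free linear chain, the matrix solution referred to above is the standard Bateman-type result N(t) = exp(At) N(0); a minimal sketch with illustrative decay constants (not ALARA's adaptive Laplace machinery) follows:

      import numpy as np
      from scipy.linalg import expm

      # Transmutation/decay matrix for a linear chain A -> B -> C with
      # decay constants lam (1/s); values are illustrative.
      lam_a, lam_b = 1e-3, 5e-4
      A = np.array([[-lam_a,  0.0,    0.0],
                    [ lam_a, -lam_b,  0.0],
                    [ 0.0,    lam_b,  0.0]])

      N0 = np.array([1.0e20, 0.0, 0.0])   # initial inventories (atoms)

      # N(t) = expm(A t) N0; pulsed irradiation histories compose such
      # factors interval by interval.
      for t in (0.0, 1e3, 1e4):
          print(t, expm(A * t) @ N0)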

  16. Can Mismatch Negativity Be Linked to Synaptic Processes? A Glutamatergic Approach to Deviance Detection

    ERIC Educational Resources Information Center

    Strelnikov, Kuzma

    2007-01-01

    This article aims to provide a theoretical framework to elucidate the neurophysiological underpinnings of deviance detection as reflected by mismatch negativity. A six-step model of the information processing necessary for deviance detection is proposed. In this model, predictive coding of learned regularities is realized by means of long-term…

  17. Linking Physical and Numerical Modelling in Hydrogeology Using Sand Tank Experiments and Comsol Multiphysics

    ERIC Educational Resources Information Center

    Singha, Kamini; Loheide, Steven P., II

    2011-01-01

    Visualising subsurface processes in hydrogeology and building intuition for how these processes are controlled by changes in forcing is hard for many undergraduate students. While numerical modelling is one way to help undergraduate students explore outcomes of multiple scenarios, many codes are not user-friendly with respect to defining domains,…

  18. Bilayer Protograph Codes for Half-Duplex Relay Channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; VanNguyen, Thuy; Nosratinia, Aria

    2013-01-01

    Direct to Earth return links are limited by the size and power of lander devices. A standard alternative is provided by a two-hop return link: a proximity link (from lander to orbiter relay) and a deep-space link (from orbiter relay to Earth). Using this additional link and a proposed coding for relay channels, one can obtain a more reliable signal. Although significant progress has been made on the relay coding problem, existing codes must be painstakingly optimized to match a single set of channel conditions, many of them do not offer easy encoding, and most of them lack a structured design. A high-performing LDPC (low-density parity-check) code for the relay channel addresses two important issues simultaneously: a code structure that allows low encoding complexity, and a flexible rate-compatible code that can be matched to various channel conditions. Most previous high-performance LDPC codes for the relay channel are tightly optimized for a given channel quality and are not easily adapted to other channel conditions without extensive re-optimization. The code presented here combines structured design and easy encoding with rate compatibility, allowing adaptation to the three links involved in the relay channel, and furthermore offers very good performance. The proposed code is constructed by synthesizing a bilayer structure with a protograph. In addition to the contribution to relay encoding, an improved family of protograph codes was produced for the point-to-point AWGN (additive white Gaussian noise) channel whose high-rate members enjoy thresholds within 0.07 dB of capacity. These LDPC relay codes address three important issues in an integrative manner: low encoding complexity, a modular structure allowing easy design, and rate compatibility so that the code can be matched to a variety of channel conditions without extensive re-optimization. The main problem of half-duplex relay coding reduces to the simultaneous design of two codes at two rates and two SNRs (signal-to-noise ratios), such that one is a subset of the other. This problem can be addressed by brute-force optimization, but a cleverer approach is the bilayer lengthened (BL) LDPC structure. This method uses a bilayer Tanner graph to construct the two codes while using the concept of 'parity forwarding' with subsequent successive decoding, which removes the need to directly address uneven SNRs among the symbols of a given codeword. It addresses some of the main issues in the design of relay codes, but by itself it yields neither highly structured codes with simple encoding nor rate-compatible codes. The main contribution of this work is to construct a class of codes that simultaneously possess a bilayer parity-forwarding mechanism while also benefiting from the properties of protograph codes: easy encoding, modular design, and rate compatibility.
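
    The sketch below shows only the basic parity-check relation that parity forwarding builds on (the extra parity bits generated for the relay layer are checked the same way); the toy matrix is illustrative and far smaller than any protograph code:

      import numpy as np

      # Toy parity-check matrix H over GF(2).
      H = np.array([[1, 1, 0, 1, 0, 0],
                    [0, 1, 1, 0, 1, 0],
                    [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

      def syndrome(word):
          # A word is a codeword iff H @ word = 0 (mod 2).
          return H @ np.asarray(word, dtype=np.uint8) % 2

      print(syndrome([1, 0, 1, 1, 1, 0]))  # all zeros -> valid codeword
      print(syndrome([1, 1, 1, 1, 1, 0]))  # nonzero  -> detected error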

  19. Use of simulated satellite radiances from a mesoscale numerical model to understand kinematic and dynamic processes

    NASA Technical Reports Server (NTRS)

    Kalb, Michael; Robertson, Franklin; Jedlovec, Gary; Perkey, Donald

    1987-01-01

    Techniques by which mesoscale numerical weather prediction model output and radiative transfer codes are combined to simulate the radiance fields that a given passive temperature/moisture satellite sensor would see if viewing the evolving model atmosphere are introduced. The goals are to diagnose the dynamical atmospheric processes responsible for recurring patterns in observed satellite radiance fields, and to develop techniques to anticipate the ability of satellite sensor systems to depict atmospheric structures and provide information useful for numerical weather prediction (NWP). The concept of linking radiative transfer and dynamical NWP codes is demonstrated with time sequences of simulated radiance imagery in the 24 TIROS vertical sounder channels derived from model integrations for March 6, 1982.
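
    A toy sketch of the forward step that links model output to radiance: integrating the model temperature profile against a channel weighting function; the profile and Gaussian weighting function are schematic, not actual TIROS sounder characteristics:

      import numpy as np

      p = np.linspace(100.0, 1000.0, 50)        # pressure levels (hPa)
      T = 288.0 - 0.065 * (1000.0 - p)          # model temperature (K)

      # Channel weighting function peaking near 500 hPa (schematic).
      w = np.exp(-((p - 500.0) ** 2) / (2 * 150.0 ** 2))
      w /= np.trapz(w, p)                       # normalize over pressure

      Tb = np.trapz(w * T, p)                   # simulated radiance proxy
      print(f"brightness temperature ~ {Tb:.1f} K")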

  20. Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipiti, Benjamin B.; Shoman, Nathan

    The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and on specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.

  1. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C

    2016-06-01

    Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has only moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model, trained on 14,983 jobs with expert-assigned SOC codes, to obtain empirical weights for an algorithm that scores each SOC/job description pair. We assigned the highest-scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOCcer-assigned codes to expert-assigned SOC codes and to lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
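
    A minimal sketch of the scoring-and-assignment step, assuming a logistic combination of classifier outputs; the weights, feature values, and candidate SOC codes are illustrative placeholders, not SOCcer's trained parameters:

      import numpy as np

      WEIGHTS = np.array([2.1, 1.3, 0.7])   # title, task, industry
      BIAS = -3.0

      def score(features):
          # Logistic score for one candidate SOC code given a job.
          z = BIAS + WEIGHTS @ np.asarray(features, dtype=float)
          return 1.0 / (1.0 + np.exp(-z))

      # Candidate codes with (title, task, industry) classifier outputs.
      candidates = {
          "47-2061": [0.9, 0.6, 0.8],
          "51-4121": [0.4, 0.7, 0.2],
      }
      best = max(candidates, key=lambda soc: score(candidates[soc]))
      print(best, {s: round(score(f), 3) for s, f in candidates.items()})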

  2. Shuttle Global Positioning System (GPS) system design study

    NASA Technical Reports Server (NTRS)

    Nilsen, P. W.

    1979-01-01

    The various integration problems in the Shuttle GPS system were investigated, and the Shuttle GPS link was analyzed. A preamplifier was designed, since the Shuttle GPS antennas must be located remotely from the receiver. Several GPS receiver architecture trade-offs were discussed. The Shuttle RF harmonics and intermodulation products that fall within the GPS receiver bandwidth were analyzed, and GPS PN code acquisition was examined. Since the receiver clock strongly affects both GPS carrier and code acquisition performance, a clock model was developed.

  3. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entrains the challenge of creating a link between two or more representations of the same trace. To be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. Adherence to authenticity by technical means proves to be a particular challenge at the boundary between a physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration into the conventional documentation of the collection of items of evidence (the bagging and tagging process). Using the QR-code as an exemplary bar code implementation and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al. We use the example of digital dactyloscopy as a forensic discipline where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, extending the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability and verification of its contents. We can read the bar code with various devices despite its limited size of 42 x 42 mm and the rather large amount of embedded data. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Finally, our appended digital signature allows malicious manipulations of the embedded data to be detected.

  4. Statistical and temporal irradiance fluctuations modeling for a ground-to-geostationary satellite optical link.

    PubMed

    Camboulives, A-R; Velluet, M-T; Poulenard, S; Saint-Antonin, L; Michau, V

    2018-02-01

    The performance of an optical communication link between the ground and a geostationary satellite can be impaired by scintillation, beam wandering, and beam spreading caused by propagation through atmospheric turbulence. These effects can be mitigated by tracking and by error correction codes coupled with interleaving. Characterizing and optimizing these techniques requires precise numerical tools capable of describing the irradiance fluctuations statistically and of creating irradiance time series. Wave optics propagation methods have proven capable of modeling the effects of atmospheric turbulence on a beam, but they are known to be computationally intensive. We present an analytical-numerical model which reproduces the probability density functions of the irradiance fluctuations well and generates time series, with substantial savings in time and computational resources.
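
    A minimal sketch of one common shortcut of this kind, assuming weak turbulence (log-normal irradiance) and an AR(1) process standing in for the true temporal spectrum; the scintillation index and time constant are placeholders, and the paper's model is more elaborate:

      import numpy as np

      rng = np.random.default_rng(7)

      si, tau, dt, n = 0.2, 5e-3, 1e-4, 20000   # assumed link parameters
      sigma2 = np.log(1.0 + si)     # variance of log-irradiance
      rho = np.exp(-dt / tau)       # AR(1) correlation per sample

      x = np.empty(n)
      x[0] = rng.normal()
      for k in range(1, n):
          x[k] = rho * x[k - 1] + np.sqrt(1.0 - rho ** 2) * rng.normal()

      I = np.exp(np.sqrt(sigma2) * x - sigma2 / 2.0)  # mean-one irradiance
      print(f"mean {I.mean():.3f}, scintillation index {I.var():.3f}")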

  5. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough verification and validation effort, which requires computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. To guide future improvements to the code's efficiency, a better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC system, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this resulted in a communication model and a parallel portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
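
    A schematic sketch of a PPM of this general shape: per-cell/angle/group compute work divided across ranks plus a latency/bandwidth communication term; the coefficients and the square-root message scaling are invented placeholders, not THOR's measured behavior:

      def predicted_runtime(cells, angles, groups, ranks,
                            t_unit=2e-7, alpha=5e-6, beta=1e-9,
                            msg_bytes=8192):
          # Compute term: total work spread over the ranks.
          compute = cells * angles * groups * t_unit / ranks
          # Communication term: exchanges grow with rank count (assumed).
          comm = ranks ** 0.5 * (alpha + beta * msg_bytes)
          return compute + comm

      for p in (16, 64, 256, 1024):
          print(p, f"{predicted_runtime(1e6, 80, 47, p):.2f} s")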

  6. Unaligned instruction relocation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.

  7. Unaligned instruction relocation

    DOEpatents

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.

    2018-01-23

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.

  8. The 'Brick Wall' radio loss approximation and the performance of strong channel codes for deep space applications at high data rates

    NASA Technical Reports Server (NTRS)

    Shambayati, Shervin

    2001-01-01

    To evaluate the performance of strong channel codes in the presence of imperfect carrier phase tracking for residual-carrier BPSK modulation, this paper develops an approximate 'brick wall' model of the radio loss which, at high data rates, is independent of the channel code type. It is shown that this approximation is reasonably accurate (less than 0.7 dB error at low FERs for the (1784,1/6) code and less than 0.35 dB at low FERs for the (5920,1/6) code). Based on the approximation's accuracy, it is concluded that the effects of imperfect carrier tracking are largely independent of the channel code type for strong channel codes. Therefore, the advantage that one strong channel code has over another under perfect carrier tracking translates to nearly the same advantage under imperfect carrier tracking conditions. This allows link designers to incorporate the projected performance of strong channel codes into their design tables without worrying about their behavior in the face of imperfect carrier phase tracking.

  9. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper promotes research toward a closed-loop control system for actively suppressing thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB based simulation is similar to a precursor one-dimensional combustor simulation written as FORTRAN 77 source code. The previous simulation process required modifying the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file was created. The MATLAB based simulation requires no changes to the source code, recompiling, or linking. Furthermore, it can be run from script files within the MATLAB environment or as a compiled copy of the executable file in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview and details on how to set up and initiate a simulation. Finally, the post-processing section describes the two types of files created while running the simulation and includes results for a default simulation included with the source code.

  10. Coding/modulation trade-offs for Shuttle wideband data links

    NASA Technical Reports Server (NTRS)

    Batson, B. H.; Huth, G. K.; Trumpis, B. D.

    1974-01-01

    This paper describes various modulation and coding schemes which are potentially applicable to the Shuttle wideband data relay communications link. This link will be capable of accommodating up to 50 Mbps of scientific data and will be subject to a power constraint which forces the use of channel coding. Although convolutionally encoded coherent binary PSK is the tentative signal design choice for the wideband data relay link, FM techniques are of interest because of the associated hardware simplicity and because an FM system is already planned to be available for transmission of television via relay satellite to the ground. Binary and M-ary FSK are considered as candidate modulation techniques, and both coherent and noncoherent ground station detection schemes are examined. The potential use of convolutional coding is considered in conjunction with each of the candidate modulation techniques.

  11. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    PubMed

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) codes are standard in supply management and appear with increasing frequency in advertisements. They are now regularly present in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by Denso Wave, a Toyota subsidiary, are license-free and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by smartphones and tablets on either the iOS or Android platform, linking the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS], and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based 'cloud' repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, the differences between QR codes and Web links, and how QR codes facilitate the distribution of educational content.
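
    Generating such a code takes a few lines; a minimal sketch using the third-party qrcode package, with a placeholder URL:

      import qrcode  # pip install qrcode[pil]

      # Encode a link to supplementary content for a tumor board slide.
      img = qrcode.make("https://example.org/tumor-board/case-042/refs")
      img.save("case-042-qr.png")  # drop the image into the presentation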

  12. [The DRG responsible physician in trauma and orthopedic surgery. Surgeon, encoder, and link to medical controlling].

    PubMed

    Ruffing, T; Huchzermeier, P; Muhm, M; Winkler, H

    2014-05-01

    Precise coding is an essential requirement for generating a valid DRG. The aim of our study was to evaluate the quality of the initial coding of surgical procedures and to introduce our 'hybrid model' of a surgical specialist supervising medical coding and a nonphysician for case auditing. The department's DRG responsible physician, as a surgical specialist, has profound knowledge of both surgery and DRG coding. At a Level 1 hospital, 1000 coded cases of surgical procedures were checked. The initial surgical DRG coding had to be corrected by the DRG responsible physician in 42.2% of cases; on average, one hour per working day was necessary. In our department, the model of a DRG responsible physician who is both surgeon and encoder has proven effective for many years. The implementation of a DRG responsible physician is a simple, effective way to connect medical and business expertise without interface problems. Permanent feedback promotes both the medical and the economic sensitivity needed to improve coding quality.

  13. Statistical core design methodology using the VIPRE thermal-hydraulics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, M.W.; Feltus, M.A.

    1994-12-31

    The Penn State Statistical Core Design Methodology (PSSCDM) is unique because it includes not only the EPRI correlation/test data standard deviation but also the computational uncertainty of the VIPRE code model and the new composite box design correlation. The resulting PSSCDM equation mimics the EPRI DNBR correlation results well, with an uncertainty of 0.0389. The combined uncertainty yields a new DNBR limit of 1.18, which will provide more plant operational flexibility. This methodology and its associated correlation and unique coefficients apply to one particular VIPRE model; thus, the correlation is specifically linked to the lumped channel and subchannel layout. The results of this research and methodology, however, can be applied to plant-specific VIPRE models.

  14. Accessible, almost ab initio multi-scale modeling of entangled polymers via slip-links

    NASA Astrophysics Data System (ADS)

    Andreev, Marat

    It is widely accepted that the dynamics of entangled polymers can be described by the tube model. Here we advocate an alternative approach to entanglement modeling known as slip-links. Recently, slip-links were shown to possess important advantages over tube models: they have strong connections to atomistic, multichain levels of description, agree with non-equilibrium thermodynamics, are applicable to any chain architecture, and can be used in linear or non-linear rheology. We present a hierarchy of slip-link models that are connected to each other through successive coarse graining. Models in the hierarchy are consistent in their overlapping domains of applicability, allowing a straightforward mapping of parameters. In particular, the most detailed level of description has four parameters, three of which can be determined directly from atomistic simulations. On the other hand, the least detailed member of the hierarchy is numerically accessible and allows non-equilibrium flow predictions for complex chain architectures. Using a GPU implementation, these predictions can be obtained in minutes of computational time on a single desktop equipped with a mainstream gaming GPU. The GPU code is available online for free download.

  15. Coherence of Personal Narratives across the Lifespan: A Multidimensional Model and Coding Method

    PubMed Central

    Reese, Elaine; Haden, Catherine A.; Baker-Ward, Lynne; Bauer, Patricia; Fivush, Robyn; Ornstein, Peter A.

    2012-01-01

    Personal narratives are integral to autobiographical memory and to identity, with coherent personal narratives being linked to positive developmental outcomes across the lifespan. In this article, we review the theoretical and empirical literature that sets the stage for a new lifespan model of personal narrative coherence. This new model integrates context, chronology, and theme as essential dimensions of personal narrative coherence, each of which relies upon different developmental achievements and has a different developmental trajectory across the lifespan. A multidimensional method of coding narrative coherence (the Narrative Coherence Coding Scheme or NaCCS) was derived from the model and is described here. The utility of this approach is demonstrated by its application to 498 narratives that were collected in six laboratories from participants ranging in age from 3 years to adulthood. The value of the model is illustrated further by a discussion of its potential to guide future research on the developmental foundations of narrative coherence and on the benefits of personal narrative coherence for different aspects of psychological functioning. PMID:22754399

  16. Participation as an outcome measure in psychosocial oncology: content of cancer-specific health-related quality of life instruments.

    PubMed

    van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F

    2011-12-01

    To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.

  17. OpenCMISS: a multi-physics & multi-scale computational infrastructure for the VPH/Physiome project.

    PubMed

    Bradley, Chris; Bowery, Andy; Britten, Randall; Budelmann, Vincent; Camara, Oscar; Christie, Richard; Cookson, Andrew; Frangi, Alejandro F; Gamage, Thiranja Babarenda; Heidlauf, Thomas; Krittian, Sebastian; Ladd, David; Little, Caton; Mithraratne, Kumar; Nash, Martyn; Nickerson, David; Nielsen, Poul; Nordbø, Oyvind; Omholt, Stig; Pashaei, Ali; Paterson, David; Rajagopal, Vijayaraghavan; Reeve, Adam; Röhrle, Oliver; Safaei, Soroush; Sebastián, Rafael; Steghöfer, Martin; Wu, Tim; Yu, Ting; Zhang, Heye; Hunter, Peter

    2011-10-01

    The VPH/Physiome Project is developing the model encoding standards CellML (cellml.org) and FieldML (fieldml.org) as well as web-accessible model repositories based on these standards (models.physiome.org). Freely available open source computational modelling software is also being developed to solve the partial differential equations described by the models and to visualise results. The OpenCMISS code (opencmiss.org), described here, has been developed by the authors over the last six years to replace the CMISS code that has supported a number of organ system Physiome projects. OpenCMISS is designed to encompass multiple sets of physical equations and to link subcellular and tissue-level biophysical processes into organ-level processes. In the Heart Physiome project, for example, the large deformation mechanics of the myocardial wall need to be coupled to both ventricular flow and embedded coronary flow, and the reaction-diffusion equations that govern the propagation of electrical waves through myocardial tissue need to be coupled with equations that describe the ion channel currents that flow through the cardiac cell membranes. In this paper we discuss the design principles and distributed memory architecture behind the OpenCMISS code. We also discuss the design of the interfaces that link the sets of physical equations across common boundaries (such as fluid-structure coupling), or between spatial fields over the same domain (such as coupled electromechanics), and the concepts behind CellML and FieldML that are embodied in the OpenCMISS data structures. We show how all of these provide a flexible infrastructure for combining models developed across the VPH/Physiome community. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Linking sediment-charcoal records and ecological modeling to understand causes of fire-regime change in boreal forests

    Treesearch

    Linda B. Brubaker; Philip E. Higuera; T. Scott Rupp; Mark A. Olson; Patricia M. Anderson; Feng Sheng. Hu

    2009-01-01

    Interactions between vegetation and fire have the potential to overshadow direct effects of climate change on fire regimes in boreal forests of North America. We develop methods to compare sediment-charcoal records with fire regimes simulated by an ecological model, ALFRESCO (Alaskan Frame-based Ecosystem Code), and apply these methods to evaluate potential causes of a...

  19. Why Data Linkage? The Importance of CODES (Crash Outcome Data Evaluation System)

    DOT National Transportation Integrated Search

    1996-06-01

    This report briefly explains the computerized linked data system, the Crash Outcome Data Evaluation System (CODES), which provides more in-depth accident data analysis. The linking of data helps researchers to understand the nature of traffic acciden...

  20. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed are a study on Coding, Modulation, and Link Protocol (CMLP) and the recommended codes. A design for the pseudo-randomizer with LDPC decoder and CRC is also reviewed, along with a chart summarizing the three proposed coding systems.

  1. Topology-selective jamming of fully-connected, code-division random-access networks

    NASA Technical Reports Server (NTRS)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  2. Meshing of a Spiral Bevel Gearset with 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, George D.; Handschuh, Robert

    1996-01-01

    Recent advances in spiral bevel gear geometry and finite element technology make it practical to conduct a structural analysis and analytically roll the gearset through mesh. With the advent of user-specific programming linked to 3D solid modelers and mesh generators, model generation has become greatly automated. Contact algorithms available in general purpose finite element codes eliminate the need for the use and alignment of gap elements. Once the gearset is placed in mesh, user subroutines attached to the FE code easily roll it through mesh. The method is described in detail. Preliminary results for a gearset segment, showing the progression of the contact line load as the gears roll through mesh, are given.

  3. A feed-forward spiking model of shape-coding by IT cells

    PubMed Central

    Romeo, August; Supèr, Hans

    2014-01-01

    The ability to recognize a shape is linked to figure-ground (FG) organization. Cell preferences appear to be correlated across contrast-polarity reversals and mirror reversals of polygon displays, but not so much across FG reversals. Here we present a network structure which explains both shape-coding by simulated IT cells and suppression of responses to FG reversed stimuli. In our model FG segregation is achieved before shape discrimination, which is itself evidenced by the difference in spiking onsets of a pair of output cells. The studied example also includes feature extraction and illustrates a classification of binary images depending on the dominance of vertical or horizontal borders. PMID:24904494

  4. Utility of QR codes in biological collections

    PubMed Central

    Diazgranados, Mauricio; Funk, Vicki A.

    2013-01-01

    The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709

  5. Utility of QR codes in biological collections.

    PubMed

    Diazgranados, Mauricio; Funk, Vicki A

    2013-01-01

    The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.

  6. Both channel coding and wavefront correction on the turbulence mitigation of optical communications using orbital angular momentum multiplexing

    NASA Astrophysics Data System (ADS)

    Zhao, Shengmei; Wang, Le; Zou, Li; Gong, Longyan; Cheng, Weiwen; Zheng, Baoyu; Chen, Hanwu

    2016-10-01

    A free-space optical (FSO) communication link with multiplexed orbital angular momentum (OAM) modes has been demonstrated to largely enhance the system capacity without a corresponding increase in spectral bandwidth, but the performance of the link is unavoidably degraded by atmospheric turbulence (AT). In this paper, we propose a turbulence mitigation scheme to improve AT tolerance of the OAM-multiplexed FSO communication link using both channel coding and wavefront correction. In the scheme, we utilize a wavefront correction method to mitigate the phase distortion first, and then we use a channel code to further correct the errors in each OAM mode. The improvement of AT tolerance is discussed over the performance of the link with or without channel coding/wavefront correction. The results show that the bit error rate performance has been improved greatly. The detrimental effect of AT on the OAM-multiplexed FSO communication link could be removed by the proposed scheme even in the relatively strong turbulence regime, such as Cn^2 = 3.6 × 10^-14 m^(-2/3).
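    The division of labor in such a scheme (wavefront correction lowers the raw error rate, channel coding then removes most residual errors) can be illustrated with a toy simulation. The sketch below uses a rate-1/3 repetition code over a binary symmetric channel with an assumed post-correction error rate; it is not the paper's actual OAM link model or code.

        import random

        def bsc(bits, p):
            """Binary symmetric channel: flip each bit with probability p."""
            return [b ^ (random.random() < p) for b in bits]

        def encode_rep3(bits):
            """Rate-1/3 repetition code: send each bit three times."""
            return [b for b in bits for _ in range(3)]

        def decode_rep3(bits):
            """Majority-vote decoding of each group of three received bits."""
            return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

        random.seed(1)
        data = [random.randint(0, 1) for _ in range(100_000)]
        p_residual = 0.02  # assumed residual error rate after wavefront correction
        received = decode_rep3(bsc(encode_rep3(data), p_residual))
        ber = sum(a != b for a, b in zip(data, received)) / len(data)
        print(f"uncoded BER ~ {p_residual}, coded BER ~ {ber:.2e}")  # roughly 3p^2 ~ 1.2e-3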

  7. Examining the relationship between comprehension and production processes in code-switched language

    PubMed Central

    Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.

    2016-01-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049

  8. Examining the relationship between comprehension and production processes in code-switched language.

    PubMed

    Guzzardo Tamargo, Rosa E; Valdés Kroff, Jorge R; Dussias, Paola E

    2016-08-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish-English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants' comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension.

  9. Chemistry resolved kinetic flow modeling of TATB based explosives

    NASA Astrophysics Data System (ADS)

    Vitello, Peter; Fried, Laurence E.; Howard, William; Levesque, George; Souers, P. Clark

    2012-03-01

    Detonation waves in insensitive, TATB-based explosives are believed to have multiple time scale regimes. The initial burn rate of such explosives has a sub-microsecond time scale. However, significant late-time slow release in energy is believed to occur due to diffusion limited growth of carbon. In the intermediate time scale concentrations of product species likely change from being in equilibrium to being kinetic rate controlled. We use the thermo-chemical code CHEETAH linked to an ALE hydrodynamics code to model detonations. We term our model chemistry resolved kinetic flow, since CHEETAH tracks the time dependent concentrations of individual species in the detonation wave and calculates EOS values based on the concentrations. We present here two variants of our new rate model and comparison with hot, ambient, and cold experimental data for PBX 9502.

  10. Developmental assessment of the Fort St. Vrain version of the Composite HTGR Analysis Program (CHAP-2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stroh, K.R.

    1980-01-01

    The Composite HTGR Analysis Program (CHAP) consists of a model-independent systems analysis mainframe named LASAN and model-dependent linked code modules, each representing a component, subsystem, or phenomenon of an HTGR plant. The Fort St. Vrain (FSV) version (CHAP-2) includes 21 coded modules that model the neutron kinetics and thermal response of the core; the thermal-hydraulics of the reactor primary coolant system, secondary steam supply system, and balance-of-plant; the actions of the control system and plant protection system; the response of the reactor building; and the relative hazard resulting from fuel particle failure. FSV steady-state and transient plant data are being used to partially verify the component modeling and dynamic simulation techniques used to predict plant response to postulated accident sequences.

  11. Porting Initiation and Failure to Linked Cheetah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vitello, P; Souers, P C

    2007-07-18

    Linked CHEETAH is a thermo-chemical code coupled to a 2-D hydrocode. Initially, a quadratic pressure-dependent kinetic rate was used, which worked well in modeling prompt detonation of explosives of large size, but does not work on other aspects of explosive behavior. The variable-pressure Tarantula reactive flow rate model was developed with JWL++ in order to also describe failure and initiation, and we have moved this model into Linked CHEETAH. The model works by turning on only above a pressure threshold, where a slow turn-on creates initiation. At a higher pressure, the rate suddenly leaps to a large value over a small pressure range. A slowly failing cylinder will see a rapidly declining rate, which pushes it quickly into failure. At a high pressure, the detonation rate is constant. A sequential validation procedure is used, which includes metal-confined cylinders, rate-sticks, corner-turning, initiation and threshold, gap tests and air gaps. The size (diameter) effect is central to the calibration.
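    A minimal sketch of such a variable-pressure rate law is given below. The thresholds, exponents, and magnitudes are invented placeholders, not the calibrated Tarantula parameters, but the piecewise shape follows the behavior described above: off below a threshold, slow turn-on, a sharp rise over a small pressure range, then a constant detonation rate.

        def reaction_rate(p_gpa):
            """Illustrative pressure-dependent burn rate (1/us); not the calibrated model.
            Assumed regimes: off -> slow turn-on -> rapid rise -> constant."""
            P_ON, P_JUMP, P_DET = 2.0, 12.0, 16.0   # hypothetical thresholds (GPa)
            if p_gpa < P_ON:
                return 0.0                          # below threshold: no reaction (failure)
            if p_gpa < P_JUMP:
                return 0.01 * (p_gpa - P_ON) ** 2   # slow turn-on: initiation regime
            if p_gpa < P_DET:
                # rapid, near-discontinuous rise over a small pressure range
                frac = (p_gpa - P_JUMP) / (P_DET - P_JUMP)
                return 1.0 + 99.0 * frac
            return 100.0                            # prompt detonation: constant rate

        for p in (1.0, 6.0, 13.0, 20.0):
            print(p, reaction_rate(p))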

  12. A tactile paging system for deaf-blind people, phase 1. [human factors engineering of bioinstrumentation

    NASA Technical Reports Server (NTRS)

    Baer, J. A.

    1976-01-01

    A tactile paging system for deaf-blind people has been brought from the concept stage to the development of a first model. The model consists of a central station that transmits coded information via radio link to an on-body (i.e., worn on the wrist) receiving unit, the output from which is a coded vibrotactile signal. The model is a combination of commercially available equipment, customized electronic circuits, and electromechanical transducers. The paging system facilitates communication to deaf-blind clients in an institutional environment as an aid in their training and other activities. Several subunits of the system were individually developed, tested, and integrated into an operating system ready for experimentation and evaluation. The operation and characteristics of the system are described and photographs are shown.

  13. Defining upper gastrointestinal bleeding from linked primary and secondary care data and the effect on occurrence and 28 day mortality.

    PubMed

    Crooks, Colin John; Card, Timothy Richard; West, Joe

    2012-11-13

    Primary care records from the UK have frequently been used to identify episodes of upper gastrointestinal bleeding in studies of drug toxicity because of their comprehensive population coverage and longitudinal recording of prescriptions and diagnoses. Recent linkage within England of primary and secondary care data has augmented these data, but the timing and coding of concurrent events, and how the definition of events in linked data affects occurrence and 28 day mortality, is not known. We used the recently linked English Hospital Episodes Statistics and General Practice Research Database, 1997-2010, to define events by: a specific upper gastrointestinal bleed code in either dataset, a specific bleed code in both datasets, or a less specific but plausible code from the linked dataset. This approach resulted in 81% of secondary care defined bleeds having a corresponding plausible code within 2 months in primary care. However, only 62% of primary care defined bleeds had a corresponding plausible HES admission within 2 months. The more restrictive and specific case definitions excluded severe events and almost halved the 28 day case fatality when compared to broader and more sensitive definitions. Restrictive definitions of gastrointestinal bleeding in linked datasets fail to capture the full heterogeneity in coding possible following complex clinical events. Conversely too broad a definition in primary care introduces events not severe enough to warrant hospital admission. Ignoring these issues may unwittingly introduce selection bias into a study's results.
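    The matching step described above (a bleed coded in one dataset with a plausible code within 2 months in the other) can be sketched with pandas. The frame layouts and column names below are hypothetical stand-ins for the GPRD and HES extracts; the 2-month window follows the abstract.

        import pandas as pd

        # Hypothetical extracts: one row per coded bleed in each dataset.
        gprd = pd.DataFrame({"patient_id": [1, 2, 3],
                             "event_date": pd.to_datetime(["2005-03-01", "2006-07-10", "2008-01-05"])})
        hes = pd.DataFrame({"patient_id": [1, 3],
                            "event_date": pd.to_datetime(["2005-03-20", "2009-06-01"])})

        # For each primary-care bleed, look for a plausible admission within ~2 months.
        matched = pd.merge_asof(gprd.sort_values("event_date"),
                                hes.sort_values("event_date").rename(columns={"event_date": "hes_date"}),
                                left_on="event_date", right_on="hes_date",
                                by="patient_id", direction="nearest",
                                tolerance=pd.Timedelta(days=61))
        matched["has_hes_match"] = matched["hes_date"].notna()
        print(matched[["patient_id", "event_date", "has_hes_match"]])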

  14. Chemistry Resolved Kinetic Flow Modeling of TATB Based Explosives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vitello, P A; Fried, L E; Howard, W M

    2011-07-21

    Detonation waves in insensitive, TATB based explosives are believed to have multi-time scale regimes. The initial burn rate of such explosives has a sub-microsecond time scale. However, significant late-time slow release in energy is believed to occur due to diffusion limited growth of carbon. In the intermediate time scale concentrations of product species likely change from being in equilibrium to being kinetic rate controlled. They use the thermo-chemical code CHEETAH linked to an ALE hydrodynamics code to model detonations. They term their model chemistry resolved kinetic flow as CHEETAH tracks the time dependent concentrations of individual species in the detonation wave and calculates EOS values based on the concentrations. A HE-validation suite of model simulations compared to experiments at ambient, hot, and cold temperatures has been developed. They present here a new rate model and comparison with experimental data.

  15. Coronal Magnetism and Forward Solarsoft Idl Package

    NASA Astrophysics Data System (ADS)

    Gibson, S. E.

    2014-12-01

    The FORWARD suite of Solar Soft IDL codes is a community resource for model-data comparison, with a particular emphasis on analyzing coronal magnetic fields. FORWARD may be used both to synthesize a broad range of coronal observables, and to access and compare to existing data. FORWARD works with numerical model datacubes, interfaces with the web-served Predictive Science Inc MAS simulation datacubes and the Solar Soft IDL Potential Field Source Surface (PFSS) package, and also includes several analytic models (more can be added). It connects to the Virtual Solar Observatory and other web-served observations to download data in a format directly comparable to model predictions. It utilizes the CHIANTI database in modeling UV/EUV lines, and links to the CLE polarimetry synthesis code for forbidden coronal lines. FORWARD enables "forward-fitting" of specific observations, and helps to build intuition into how the physical properties of coronal magnetic structures translate to observable properties.

  16. Further Analysis of Motorcycle Helmet Effectiveness Using CODES Linked Data

    DOT National Transportation Integrated Search

    1998-01-01

    Linked data from the Crash Outcome Data Evaluation System (CODES) in seven : states was used by the National Highway Traffic Safety Administration as the : basis of a 1996 Report to Congress on the Benefits of Safety Belts and : Motorcycle Helmets (D...

  17. Red River Waterway Thermal Studies. Report 2. Thermal Stress Analyses

    DTIC Science & Technology

    1991-12-01

    stress relaxation, shrinkage of the concrete, and thermal properties of the concrete including coefficient of thermal expansion, specific heat... Finite-Element Code: The thermal stress analyses in this investigation were performed using ABAQUS, a general-purpose, heat-transfer and structural... model (the UMAT subroutine discussed below) may be incorporated as an external subroutine linked to the ABAQUS library... In order to model the

  18. MMAB Developmental Products

    Science.gov Websites

  19. RISK OF UNSATURATED/SATURATED TRANSPORT AND TRANSFORMATION OF CHEMICAL CONCENTRATIONS (RUSTIC): VOLUME 1. THEORY AND CODE VERIFICATION

    EPA Science Inventory

    The RUSTIC program links three subordinate models--PRZM, VADOFT, and SAFTMOD--in order to predict pesticide transport and transformation through the crop root zone, the unsaturated zone, and the saturated zone to drinking water wells. PRZM is a one-dimensional finite-difference m...

  20. Adaptive and reliably acknowledged FSO communications

    NASA Astrophysics Data System (ADS)

    Fitz, Michael P.; Halford, Thomas R.; Kose, Cenk; Cromwell, Jonathan; Gordon, Steven

    2015-05-01

    Atmospheric turbulence causes the receive signal intensity on free space optical (FSO) communication links to vary over time. Scintillation fades can stymie connectivity for milliseconds at a time. To approach the information-theoretic limits of communication in such time-varying channels, it is necessary either to code across extremely long blocks of data - thereby inducing unacceptable delays - or to vary the code rate according to the instantaneous channel conditions. We describe the design, laboratory testing, and over-the-air testing of an FSO modem that employs a protocol with adaptive coded modulation (ACM) and hybrid automatic repeat request. For links with fixed throughput, this protocol provides a 10dB reduction in the required received signal-to-noise ratio (SNR); for links with fixed range, this protocol provides a greater than 3x increase in throughput. Independent U.S. Government tests demonstrate that our protocol effectively adapts the code rate to match the instantaneous channel conditions. The modem is able to provide throughputs in excess of 850 Mbps on links with ranges greater than 15 kilometers.

  1. Development of a dynamic coupled hydro-geomechanical code and its application to induced seismicity

    NASA Astrophysics Data System (ADS)

    Miah, Md Mamun

    This research describes the importance of hydro-geomechanical coupling in the geologic sub-surface environment arising from fluid injection at geothermal plants, large-scale geological CO2 sequestration for climate mitigation, enhanced oil recovery, and hydraulic fracturing during well construction in the oil and gas industries. A sequential computational code is developed to capture the multiphysics interaction behavior by linking the flow simulation code TOUGH2 with the geomechanics modeling code PyLith. The numerical formulation of each code is discussed to demonstrate its modeling capabilities. The computational framework involves sequential coupling and the solution of two sub-problems: fluid flow through fractured and porous media, and reservoir geomechanics. For each time step of the flow calculation, the pressure field is passed to the geomechanics code to compute the effective stress field and fault slips. A simplified permeability model is implemented in the code that accounts for the permeability of porous and saturated rocks subject to confining stresses. The accuracy of the TOUGH-PyLith coupled simulator is tested by simulating Terzaghi's 1D consolidation problem. The modeling capability of coupled poroelasticity is validated by benchmarking it against Mandel's problem. The code is used to simulate both quasi-static and dynamic earthquake nucleation and slip distribution on a fault from the combined effect of far field tectonic loading and fluid injection by using an appropriate fault constitutive friction model. Results from the quasi-static induced earthquake simulations show a delayed response in earthquake nucleation. This is attributed to the increased total stress in the domain and not accounting for pressure on the fault. However, this issue is resolved in the final chapter in simulating a single-event earthquake dynamic rupture. Simulation results show that fluid pressure has a positive effect on slip nucleation and subsequent crack propagation. This is confirmed by running a sensitivity analysis that shows an increase in injection well distance results in delayed slip nucleation and rupture propagation on the fault.
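    A schematic of this sequential coupling is sketched below. The solver functions are placeholders standing in for the TOUGH2 and PyLith calls, and the Biot coefficient, permeability closure, and one-dimensional grid are invented for illustration; only the loop structure (flow step, pressure hand-off, effective-stress update, permeability feedback) reflects the scheme described above.

        import numpy as np

        def flow_step(pressure, dt):
            """Placeholder for a TOUGH2-like flow solve: explicit pressure diffusion."""
            return pressure + dt * 0.1 * np.gradient(np.gradient(pressure))

        def mechanics_step(pressure, sigma_total, biot=0.8):
            """Placeholder for a PyLith-like solve: effective stress via Biot coupling."""
            return sigma_total - biot * pressure            # sigma' = sigma - alpha * p

        def update_permeability(sigma_eff, k0=1e-15, gamma=5e-9):
            """Simplified stress-dependent permeability (illustrative closure)."""
            return k0 * np.exp(-gamma * np.clip(sigma_eff, 0.0, None))

        pressure = np.full(50, 10e6)       # Pa, initial reservoir pressure
        sigma_total = np.full(50, 30e6)    # Pa, total confining stress
        for step in range(100):            # sequential (loosely coupled) time stepping
            pressure[25] += 5e4            # fluid injection at the well node
            pressure = flow_step(pressure, dt=1.0)
            sigma_eff = mechanics_step(pressure, sigma_total)
            k = update_permeability(sigma_eff)   # fed back to the next flow step
        print(f"final effective stress range: {sigma_eff.min():.3g}..{sigma_eff.max():.3g} Pa")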

  2. Optical deep space communication via relay satellite

    NASA Technical Reports Server (NTRS)

    Gagliardi, R. M.; Vilnrotter, V. A.; Dolinar, S. J., Jr.

    1981-01-01

    The possible use of an optical link for high rate data transmission from a deep space vehicle to an Earth-orbiting relay satellite was studied; RF links are envisioned for the relay-to-Earth link. A preliminary link analysis is presented for initial sizing of optical components and power levels, in terms of achievable data rates and feasible range distances. Modulation formats are restricted to pulsed laser operation, involving both coded and uncoded schemes. The advantage of an optical link over present RF deep space link capabilities is shown. The problems of acquisition, pointing and tracking with narrow optical beams are presented and discussed. Mathematical models of beam trackers are derived, aiding in the design of such systems for minimizing beam pointing errors. The expected orbital geometry between spacecraft and relay satellite, and its impact on beam pointing dynamics are discussed.

  3. Performance Analysis of the Link-16/JTIDS Waveform With Concatenated Coding

    DTIC Science & Technology

    2009-09-01

    ... than the existing Link-16/JTIDS waveform in both AWGN and PNI, for both coherent and noncoherent demodulation, in terms of both required signal power and throughput. Subject terms: Link-16/JTIDS, Reed-Solomon, Pulsed-Noise Interference (PNI), Additive White Gaussian Noise (AWGN), coherent detection, noncoherent detection.

  4. Design of orbital debris shields for oblique hypervelocity impact

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    1994-01-01

    A new impact debris propagation code was written to link CTH simulations of space debris shield perforation to the Lagrangian finite element code DYNA3D, for space structure wall impact simulations. This software (DC3D) simulates debris cloud evolution using a nonlinear elastic-plastic deformable particle dynamics model, and renders computationally tractable the supercomputer simulation of oblique impacts on Whipple shield protected structures. Comparison of three dimensional, oblique impact simulations with experimental data shows good agreement over a range of velocities of interest in the design of orbital debris shielding. Source code developed during this research is provided on the enclosed floppy disk. An abstract based on the work described was submitted to the 1994 Hypervelocity Impact Symposium.

  5. How to Link to Official Documents from the Government Publishing Office (GPO)

    EPA Pesticide Factsheets

    The most consistent way to present up-to-date content from the Federal Register, US Code, Code of Federal Regulations (CFR), and so on is to link to the official version of the document on the GPO's Federal Digital System (FDSys) website.

  6. Ultra Safe And Secure Blasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M M

    2009-07-27

    The Ultra is a blasting system that is designed for special applications where the risk and consequences of unauthorized demolition or blasting are so great that the use of an extraordinarily safe and secure blasting system is justified. Such a blasting system would be connected and logically welded together through digital code-linking as part of the blasting system set-up and initialization process. The Ultra's security is so robust that it will defeat the people who designed and built the components in any attempt at unauthorized detonation. Anyone attempting to gain unauthorized control of the system by substituting components or tapping into communications lines will be thwarted in their inability to provide encrypted authentication. Authentication occurs through the use of codes that are generated by the system during initialization code-linking and the codes remain unknown to anyone, including the authorized operator. Once code-linked, a closed system has been created. The system requires all components connected as they were during initialization as well as a unique code entered by the operator for function and blasting.
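    The code-linking concept resembles a shared-secret challenge-response protocol in which the secret is generated by the system at initialization and never disclosed to an operator. The sketch below illustrates that analogy with Python's standard hmac and secrets modules; it is only an analogy, not the Ultra design, and all names are hypothetical.

        import hmac, hashlib, secrets

        class Component:
            """Toy blasting-system component that stores a link key from initialization."""
            def __init__(self):
                self.link_key = None
            def respond(self, challenge: bytes) -> bytes:
                return hmac.new(self.link_key, challenge, hashlib.sha256).digest()

        # Initialization ("code-linking"): a fresh random key is generated by the
        # system itself and pushed to every connected component; no human sees it.
        link_key = secrets.token_bytes(32)
        comp = Component()
        comp.link_key = link_key

        # Later, the controller authenticates the component before allowing function.
        challenge = secrets.token_bytes(16)
        expected = hmac.new(link_key, challenge, hashlib.sha256).digest()
        assert hmac.compare_digest(expected, comp.respond(challenge))      # authentic

        # A substituted component without the link key cannot answer the challenge.
        impostor = Component()
        impostor.link_key = secrets.token_bytes(32)
        assert not hmac.compare_digest(expected, impostor.respond(challenge))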

  7. MARS15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, Nikolai

    MARS is a Monte Carlo code for inclusive and exclusive simulation of three-dimensional hadronic and electromagnetic cascades, muon, heavy-ion and low-energy neutron transport in accelerator, detector, spacecraft and shielding components in the energy range from a fraction of an electronvolt up to 100 TeV. Recent developments in the MARS15 physical models of hadron, heavy-ion and lepton interactions with nuclei and atoms include a new nuclear cross section library, a model for soft pion production, the cascade-exciton model, the quark gluon string models, deuteron-nucleus and neutrino-nucleus interaction models, detailed description of negative hadron and muon absorption and a unified treatment of muon, charged hadron and heavy-ion electromagnetic interactions with matter. New algorithms are implemented into the code and thoroughly benchmarked against experimental data. The code capabilities to simulate cascades and generate a variety of results in complex media have been also enhanced. Other changes in the current version concern the improved photo- and electro-production of hadrons and muons, improved algorithms for the 3-body decays, particle tracking in magnetic fields, synchrotron radiation by electrons and muons, significantly extended histograming capabilities and material description, and improved computational performance. In addition to direct energy deposition calculations, a new set of fluence-to-dose conversion factors for all particles including neutrino are built into the code. The code includes new modules for calculation of Displacement-per-Atom and nuclide inventory. The powerful ROOT geometry and visualization model implemented in MARS15 provides a large set of geometrical elements with a possibility of producing composite shapes and assemblies and their 3D visualization along with a possible import/export of geometry descriptions created by other codes (via the GDML format) and CAD systems (via the STEP format). The built-in MARS-MAD Beamline Builder (MMBLB) was redesigned for use with the ROOT geometry package that allows a very efficient and highly-accurate description, modeling and visualization of beam loss induced effects in arbitrary beamlines and accelerator lattices. The MARS15 code includes links to the MCNP-family codes for neutron and photon production and transport below 20 MeV, to the ANSYS code for thermal and stress analyses and to the STRUCT code for multi-turn particle tracking in large synchrotrons and collider rings.

  8. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    NASA Astrophysics Data System (ADS)

    Fensin, Michael Lorne

    Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and better track the evolution of temporal nuclide inventory by simulating the actual physical process utilizing continuous energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity completely self-contained Monte-Carlo-linked depletion capability in a well established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permits in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results of the OECD/NEA Phase IB benchmark, H. B. Robinson benchmark and OECD/NEA Phase IVB are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.
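    The inventory half of such a coupling reduces to solving the Bateman equations over each burn step, with reaction rates frozen from the preceding transport solve. The sketch below shows that step for a hypothetical three-nuclide chain with made-up one-group rates, solved with a matrix exponential via scipy; it is an illustration of the concept, not the CINDER90 solver.

        import numpy as np
        from scipy.linalg import expm

        # Hypothetical 3-nuclide chain: fuel --(capture)--> product --(decay)--> stable.
        # Off-diagonal entries are production rates; diagonal entries are total losses.
        capture = 1e-9    # 1/s, one-group reaction rate from the transport step (made up)
        decay = 2e-8      # 1/s, decay constant of the intermediate product (made up)
        A = np.array([[-capture, 0.0,    0.0],
                      [ capture, -decay, 0.0],
                      [ 0.0,      decay, 0.0]])

        N0 = np.array([1.0e24, 0.0, 0.0])   # initial atom densities
        dt = 30 * 24 * 3600.0               # one 30-day depletion step
        N1 = expm(A * dt) @ N0              # Bateman solution N(t) = expm(A t) N0
        print(N1)
        # In a Monte Carlo-linked scheme, the new densities N1 would be written back
        # into the transport model and the flux recomputed for the next burn step.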

  9. Defining upper gastrointestinal bleeding from linked primary and secondary care data and the effect on occurrence and 28 day mortality

    PubMed Central

    2012-01-01

    Background Primary care records from the UK have frequently been used to identify episodes of upper gastrointestinal bleeding in studies of drug toxicity because of their comprehensive population coverage and longitudinal recording of prescriptions and diagnoses. Recent linkage within England of primary and secondary care data has augmented these data, but the timing and coding of concurrent events, and how the definition of events in linked data affects occurrence and 28 day mortality, is not known. Methods We used the recently linked English Hospital Episodes Statistics and General Practice Research Database, 1997–2010, to define events by: a specific upper gastrointestinal bleed code in either dataset, a specific bleed code in both datasets, or a less specific but plausible code from the linked dataset. Results This approach resulted in 81% of secondary care defined bleeds having a corresponding plausible code within 2 months in primary care. However, only 62% of primary care defined bleeds had a corresponding plausible HES admission within 2 months. The more restrictive and specific case definitions excluded severe events and almost halved the 28 day case fatality when compared to broader and more sensitive definitions. Conclusions Restrictive definitions of gastrointestinal bleeding in linked datasets fail to capture the full heterogeneity in coding possible following complex clinical events. Conversely too broad a definition in primary care introduces events not severe enough to warrant hospital admission. Ignoring these issues may unwittingly introduce selection bias into a study’s results. PMID:23148590

  10. SARAH 4: A tool for (not only SUSY) model builders

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2014-06-01

    We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version, and by linking Susyno the handling of Lie groups has been improved and extended.

  11. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by imperfect channel which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independent of the channel rate and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal overheads on the power can be used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of threshold and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We prove through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
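    A minimal LT encode/decode sketch is shown below. It uses a crude degree distribution in place of the robust soliton distribution used in practice, and toy 16-byte blocks, so it illustrates only the rateless XOR-and-peel mechanics, not an optimized FSO implementation.

        import random
        from functools import reduce

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        def lt_encode(blocks, n_out, seed=0):
            """Each coded packet = (set of source indices, XOR of those source blocks)."""
            rng = random.Random(seed)
            packets = []
            for _ in range(n_out):
                d = rng.choice([1, 1, 2, 2, 2, 3, 4])   # crude stand-in for robust soliton
                idxs = rng.sample(range(len(blocks)), d)
                packets.append((set(idxs), reduce(xor, (blocks[i] for i in idxs))))
            return packets

        def lt_decode(packets, k):
            """Peeling (belief-propagation) decoder: strip known blocks, release degree-1 packets."""
            work = [[set(idxs), payload] for idxs, payload in packets]
            decoded = {}
            changed = True
            while changed and len(decoded) < k:
                changed = False
                for item in work:
                    idxs, payload = item
                    for i in idxs & decoded.keys():     # XOR out already-recovered blocks
                        payload = xor(payload, decoded[i])
                        idxs.discard(i)
                    item[1] = payload
                    if len(idxs) == 1:                  # a degree-1 packet reveals a block
                        j = next(iter(idxs))
                        if j not in decoded:
                            decoded[j] = payload
                            changed = True
            return [decoded.get(i) for i in range(k)]

        k = 8
        blocks = [bytes([i]) * 16 for i in range(k)]    # 8 source blocks of 16 bytes each
        recovered = lt_decode(lt_encode(blocks, n_out=14, seed=42), k)
        print(sum(r == b for r, b in zip(recovered, blocks)), "of", k, "blocks recovered")

    Because the code is rateless, a receiver that has not yet recovered all blocks simply waits for more coded packets; no retransmission request is needed, which is the advantage over ARQ argued above.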

  12. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
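    At its core, variable coding and modulation is a lookup from estimated link quality to the most efficient modulation/code-rate pair whose threshold is met. The sketch below illustrates that selection logic; the MODCOD table and threshold values are rough placeholders, not the DVB-S2 specification figures.

        # (modulation, code rate, required Es/No in dB, spectral efficiency bits/symbol)
        # Threshold numbers are illustrative placeholders, not DVB-S2 spec values.
        MODCODS = [
            ("QPSK",   "1/2",  1.0, 1.0),
            ("QPSK",   "3/4",  4.0, 1.5),
            ("8PSK",   "2/3",  6.6, 2.0),
            ("8PSK",   "5/6",  9.4, 2.5),
            ("16APSK", "3/4", 10.2, 3.0),
        ]

        def select_modcod(esno_db, margin_db=1.0):
            """Pick the most efficient MODCOD that closes the link with some margin."""
            feasible = [m for m in MODCODS if esno_db - margin_db >= m[2]]
            return max(feasible, key=lambda m: m[3]) if feasible else MODCODS[0]

        for esno in (2.0, 7.0, 12.0):   # e.g. deep shadowing vs. clear line of sight
            mod, rate, req, eff = select_modcod(esno)
            print(f"Es/No {esno:5.1f} dB -> {mod} {rate} ({eff} bits/symbol)")

    In the flight experiment, shadowing by the station structure makes the estimated Es/No swing rapidly, so the selected MODCOD, and hence the throughput, varies over each pass.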

  13. Initial Kernel Timing Using a Simple PIM Performance Model

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Block, Gary L.; Springer, Paul L.; Sterling, Thomas; Brockman, Jay B.; Callahan, David

    2005-01-01

    This presentation will describe some initial results of paper-and-pencil studies of 4 or 5 application kernels applied to a processor-in-memory (PIM) system roughly similar to the Cascade Lightweight Processor (LWP). The application kernels are: * Linked list traversal * Sum of leaf nodes on a tree * Bitonic sort * Vector sum * Gaussian elimination. The intent of this work is to guide and validate work on the Cascade project in the areas of compilers, simulators, and languages. We will first discuss the generic PIM structure. Then, we will explain the concepts needed to program a parallel PIM system (locality, threads, parcels). Next, we will present a simple PIM performance model that will be used in the remainder of the presentation. For each kernel, we will then present a set of codes, including codes for a single PIM node, and codes for multiple PIM nodes that move data to threads and move threads to data. These codes are written at a fairly low level, between assembly and C, but much closer to C than to assembly. For each code, we will present some hand-drafted timing forecasts, based on the simple PIM performance model. Finally, we will conclude by discussing what we have learned from this work, including what programming styles seem to work best, from the point-of-view of both expressiveness and performance.

  14. iTOUGH2 Universal Optimization Using the PEST Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) Input is provided on one or more ASCII text input files; (2) Output is returned to one or more ASCII text output files; (3) The model is run using a system command (executable or script/batch file); and (4) The model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that are borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
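    The protocol's four requirements can be mimicked in a few lines: substitute trial parameter values into an ASCII input file built from a template, run the application model as a system command, and scrape observation values from its ASCII output. The file names, marker syntax, and executable below are hypothetical, not the actual PEST or iTOUGH2 file formats.

        import re, subprocess

        def run_model(params):
            """One forward run following a PEST-style file protocol (names hypothetical)."""
            # 1. Substitute parameter values into an ASCII input file from a template,
            #    where markers like @permeability@ stand in for PEST's delimited fields.
            text = open("model.tpl").read()
            for name, value in params.items():
                text = text.replace(f"@{name}@", f"{value:.6e}")
            open("model.inp", "w").write(text)

            # 2. Run the application model to completion without user intervention.
            subprocess.run(["./model", "model.inp"], check=True)

            # 3. Extract observation values from the ASCII output file.
            out = open("model.out").read()
            return [float(m) for m in re.findall(r"OBS\s*=\s*([-\d.eE+]+)", out)]

        # The optimizer (iTOUGH2's role) repeatedly calls run_model with trial values:
        # residuals = observed - run_model({"permeability": 1.3e-14, "porosity": 0.12})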

  15. VisTrails SAHM: visualization and workflow management for species habitat modeling

    USGS Publications Warehouse

    Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew A.; Talbert, Marian; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.

    2013-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.

  16. Validation of suicide and self-harm records in the Clinical Practice Research Datalink

    PubMed Central

    Thomas, Kyla H; Davies, Neil; Metcalfe, Chris; Windmeijer, Frank; Martin, Richard M; Gunnell, David

    2013-01-01

    Aims The UK Clinical Practice Research Datalink (CPRD) is increasingly being used to investigate suicide-related adverse drug reactions. No studies have comprehensively validated the recording of suicide and nonfatal self-harm in the CPRD. We validated general practitioners' recording of these outcomes using linked Office for National Statistics (ONS) mortality and Hospital Episode Statistics (HES) admission data. Methods We identified cases of suicide and self-harm recorded using appropriate Read codes in the CPRD between 1998 and 2010 in patients aged ≥15 years. Suicides were defined as patients with Read codes for suicide recorded within 95 days of their death. International Classification of Diseases codes were used to identify suicides/hospital admissions for self-harm in the linked ONS and HES data sets. We compared CPRD-derived cases/incidence of suicide and self-harm with those identified from linked ONS mortality and HES data, national suicide incidence rates and published self-harm incidence data. Results Only 26.1% (n = 590) of the ‘true’ (ONS-confirmed) suicides were identified using Read codes. Furthermore, only 55.5% of Read code-identified suicides were confirmed as suicide by the ONS data. Of the HES-identified cases of self-harm, 68.4% were identified in the CPRD using Read codes. The CPRD self-harm rates based on Read codes had similar age and sex distributions to rates observed in self-harm hospital registers, although rates were underestimated in all age groups. Conclusions The CPRD recording of suicide using Read codes is unreliable, with significant inaccuracy (over- and under-reporting). Future CPRD suicide studies should use linked ONS mortality data. The under-reporting of self-harm appears to be less marked. PMID:23216533

  17. Use of FEC coding to improve statistical multiplexing performance for video transport over ATM networks

    NASA Astrophysics Data System (ADS)

    Kurceren, Ragip; Modestino, James W.

    1998-12-01

    The use of forward error-control (FEC) coding, possibly in conjunction with ARQ techniques, has emerged as a promising approach for video transport over ATM networks for cell-loss recovery and/or bit error correction, such as might be required for wireless links. Although FEC provides cell-loss recovery capabilities it also introduces transmission overhead which can possibly cause additional cell losses. A methodology is described to maximize the number of video sources multiplexed at a given quality of service (QoS), measured in terms of decoded cell loss probability, using interlaced FEC codes. The transport channel is modelled as a block interference channel (BIC) and the multiplexer as single server, deterministic service, finite buffer supporting N users. Based upon an information-theoretic characterization of the BIC and large deviation bounds on the buffer overflow probability, the described methodology provides theoretically achievable upper limits on the number of sources multiplexed. Performance of specific coding techniques using interlaced nonbinary Reed-Solomon (RS) codes and binary rate-compatible punctured convolutional (RCPC) codes is illustrated.

  18. Cognitive Jointly Optimal Code-Division Channelization and Routing Over Cooperative Links

    DTIC Science & Technology

    2014-04-01

    [Report front matter; list of figures:] Fig. 1: Comparison between code-division channelization and FDM. Fig. 2: Secondary receiver SINR as a function of the iteration step... transmission percentage as a function of the number of active links under the cases rank(X′′) = 1 and > 1 (the study also includes the random code assignment scheme); (b) instantaneous output SINR of a primary signal against the primary SINR-QoS threshold SINRthPU (thick line) and instantaneous output SINR of

  19. High data rate coding for the space station telemetry links.

    NASA Technical Reports Server (NTRS)

    Lumb, D. R.; Viterbi, A. J.

    1971-01-01

    Coding systems for high data rates were examined from the standpoint of potential application in space-station telemetry links. Approaches considered included convolutional codes with sequential, Viterbi, and cascaded-Viterbi decoding. It was concluded that a high-speed (40 Mbps) sequential decoding system best satisfies the requirements for the assumed growth potential and specified constraints. Trade-off studies leading to this conclusion are reviewed, and some sequential (Fano) algorithm improvements are discussed, together with real-time simulation results.

  20. Intra- and Inter-Individual Variation in Self-Reported Code-Switching Patterns of Adult Multilinguals

    ERIC Educational Resources Information Center

    Dewaele, Jean-Marc; Li, Wei

    2014-01-01

    The present study is a large-scale quantitative analysis of intra-individual variation (linked to type of interlocutor) and inter-individual variation (linked to multilingualism, sociobiographical variables and three personality traits) in self-reported frequency of code-switching (CS) among 2116 multilinguals. We found a significant effect of…

  1. An alpha-numeric code for representing N-linked glycan structures in secreted glycoproteins.

    PubMed

    Yusufi, Faraaz Noor Khan; Park, Wonjun; Lee, May May; Lee, Dong-Yup

    2009-01-01

    Advances in high-throughput techniques have led to the creation of increasing amounts of glycome data. The storage and analysis of this data would benefit greatly from a compact notation for describing glycan structures that can be easily stored and interpreted by computers. Towards this end, we propose a fixed-length alpha-numeric code for representing N-linked glycan structures commonly found in secreted glycoproteins from mammalian cell cultures. This code, GlycoDigit, employs a pre-assigned alpha-numeric index to represent the monosaccharides attached in different branches to the core glycan structure. The present branch-centric representation allows us to visualize the structure while the numerical nature of the code makes it machine readable. In addition, a difference operator can be defined to quantitatively differentiate between glycan structures for further analysis. The usefulness and applicability of GlycoDigit were demonstrated by constructing and visualizing an N-linked glycosylation network.
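    The branch-centric, fixed-length idea can be illustrated with a toy code: one digit per (branch, slot) position on the core glycan, a pre-assigned index per monosaccharide, and a difference operator that counts differing positions. The slot layout and index assignment below are invented for illustration and are not the published GlycoDigit table.

        # Toy fixed-length branch code: positions = (branch, slot) on the core;
        # digits index the monosaccharide occupying each slot (0 = absent).
        # This assignment is invented for illustration, not the GlycoDigit table.
        SLOTS = [(b, s) for b in range(4) for s in range(3)]   # 4 branches x 3 slots
        SUGARS = {0: "-", 1: "GlcNAc", 2: "Gal", 3: "NeuAc"}

        def encode(structure):
            """structure: dict {(branch, slot): sugar_index} -> fixed-length digit string."""
            return "".join(str(structure.get(pos, 0)) for pos in SLOTS)

        def diff(code_a, code_b):
            """Difference operator: number of slots whose occupancy differs."""
            return sum(a != b for a, b in zip(code_a, code_b))

        biantennary = encode({(0, 0): 1, (0, 1): 2, (1, 0): 1, (1, 1): 2})
        sialylated  = encode({(0, 0): 1, (0, 1): 2, (0, 2): 3, (1, 0): 1, (1, 1): 2})
        print(biantennary, sialylated, diff(biantennary, sialylated))   # differ in 1 slot

    The fixed length is what makes such codes machine readable and directly comparable, while the per-branch digit layout preserves a human-interpretable picture of the structure.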

  2. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  3. Neural representation of objects in space: a dual coding account.

    PubMed Central

    Humphreys, G W

    1998-01-01

    I present evidence on the nature of object coding in the brain and discuss the implications of this coding for models of visual selective attention. Neuropsychological studies of task-based constraints on: (i) visual neglect; and (ii) reading and counting, reveal the existence of parallel forms of spatial representation for objects: within-object representations, where elements are coded as parts of objects, and between-object representations, where elements are coded as independent objects. Aside from these spatial codes for objects, however, the coding of visual space is limited. We are extremely poor at remembering small spatial displacements across eye movements, indicating (at best) impoverished coding of spatial position per se. Also, effects of element separation on spatial extinction can be eliminated by filling the space with an occluding object, indicating that spatial effects on visual selection are moderated by object coding. Overall, there are separate limits on visual processing reflecting: (i) the competition to code parts within objects; (ii) the small number of independent objects that can be coded in parallel; and (iii) task-based selection of whether within- or between-object codes determine behaviour. Between-object coding may be linked to the dorsal visual system while parallel coding of parts within objects takes place in the ventral system, although there may additionally be some dorsal involvement either when attention must be shifted within objects or when explicit spatial coding of parts is necessary for object identification. PMID:9770227

  4. Novel promoters and coding first exons in DLG2 linked to developmental disorders and intellectual disability.

    PubMed

    Reggiani, Claudio; Coppens, Sandra; Sekhara, Tayeb; Dimov, Ivan; Pichon, Bruno; Lufin, Nicolas; Addor, Marie-Claude; Belligni, Elga Fabia; Digilio, Maria Cristina; Faletra, Flavio; Ferrero, Giovanni Battista; Gerard, Marion; Isidor, Bertrand; Joss, Shelagh; Niel-Bütschi, Florence; Perrone, Maria Dolores; Petit, Florence; Renieri, Alessandra; Romana, Serge; Topa, Alexandra; Vermeesch, Joris Robert; Lenaerts, Tom; Casimir, Georges; Abramowicz, Marc; Bontempi, Gianluca; Vilain, Catheline; Deconinck, Nicolas; Smits, Guillaume

    2017-07-19

    Tissue-specific integrative omics has the potential to reveal new genic elements important for developmental disorders. Two pediatric patients with global developmental delay and intellectual disability phenotype underwent array-CGH genetic testing, both showing a partial deletion of the DLG2 gene. From independent human and murine omics datasets, we combined copy number variations, histone modifications, developmental tissue-specific regulation, and protein data to explore the molecular mechanism at play. Integrating genomics, transcriptomics, and epigenomics data, we describe two novel DLG2 promoters and coding first exons expressed in human fetal brain. Their murine conservation and protein-level evidence allowed us to produce new DLG2 gene models for human and mouse. These new genic elements are deleted in 90% of 29 patients (public and in-house) showing partial deletion of the DLG2 gene. The patients' clinical characteristics expand the neurodevelopmental phenotypic spectrum linked to DLG2 gene disruption to cognitive and behavioral categories. While protein-coding genes are regarded as well known, our work shows that integration of multiple omics datasets can unveil novel coding elements. From a clinical perspective, our work demonstrates that two new DLG2 promoters and exons are crucial for the neurodevelopmental phenotypes associated with this gene. In addition, our work brings evidence for the lack of cross-annotation in human versus mouse reference genomes and nucleotide versus protein databases.

  5. Coded throughput performance simulations for the time-varying satellite channel. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Han, Li

    1995-01-01

    The design of a reliable satellite communication link involving the data transfer from a small, low-orbit satellite to a ground station, but through a geostationary satellite, was examined. In such a scenario, the received signal power to noise density ratio increases as the transmitting low-orbit satellite comes into view, and then decreases as it then departs, resulting in a short-duration, time-varying communication link. The optimal values of the small satellite antenna beamwidth, signaling rate, modulation scheme and the theoretical link throughput (in bits per day) have been determined. The goal of this thesis is to choose a practical coding scheme which maximizes the daily link throughput while satisfying a prescribed probability of error requirement. We examine the throughput of both fixed rate and variable rate concatenated forward error correction (FEC) coding schemes for the additive white Gaussian noise (AWGN) channel, and then examine the effect of radio frequency interference (RFI) on the best coding scheme among them. Interleaving is used to mitigate degradation due to RFI. It was found that the variable rate concatenated coding scheme could achieve 74 percent of the theoretical throughput, equivalent to 1.11 Gbits/day based on the cutoff rate R(sub 0). For comparison, 87 percent is achievable for AWGN-only case.

  6. Remote coding scheme based on waveguide Bragg grating in PLC splitter chip for PON monitoring.

    PubMed

    Zhang, Xuan; Lu, Fengjun; Chen, Si; Zhao, Xingqun; Zhu, Min; Sun, Xiaohan

    2016-03-07

    A remote coding scheme based on waveguide Bragg gratings (WBGs) distributed in a PLC splitter chip is proposed and analyzed for passive optical network (PON) monitoring, by which the management system can identify each drop fiber link through the same reflector in the terminal of each optical network unit, even when several users are equidistant. The corresponding coding and capacity models are respectively established and investigated so that we can obtain the minimum number of WBGs needed under the distributed structure. A signal-to-noise ratio (SNR) model related to the number of equidistant users is also developed to extend the analysis to the overall performance of the system. Simulation results show the proposed scheme is feasible and allows the monitoring of a 64-user PON with an SNR range of 7.5-10.6 dB. The scheme can solve some difficulties of the construction site at lower user cost for PON systems.

  7. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1992-01-01

    The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink are studied. The EOS transmits picture frame data to the ground via the Telemetry Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that those errors are bursty. The research proceeded by developing a computer based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN was written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.

  8. Using QR Codes to Differentiate Learning for Gifted and Talented Students

    ERIC Educational Resources Information Center

    Siegle, Del

    2015-01-01

    QR codes are two-dimensional square patterns that are capable of coding information that ranges from web addresses to links to YouTube videos. The codes save time typing and eliminate errors in entering addresses incorrectly. These codes make learning with technology easier for students and motivationally engage them in new ways.

  9. Spatial panel analyses of alcohol outlets and motor vehicle crashes in California: 1999–2008

    PubMed Central

    Ponicki, William R.; Gruenewald, Paul J.; Remer, Lillian G.

    2014-01-01

    Although past research has linked alcohol outlet density to higher rates of drinking and many related social problems, there is conflicting evidence of density’s association with traffic crashes. An abundance of local alcohol outlets simultaneously encourages drinking and reduces driving distances required to obtain alcohol, leading to an indeterminate expected impact on alcohol-involved crash risk. This study separately investigates the effects of outlet density on (1) the risk of injury crashes relative to population and (2) the likelihood that any given crash is alcohol-involved, as indicated by police reports and single-vehicle nighttime status of crashes. Alcohol outlet density effects are estimated using Bayesian misalignment Poisson analyses of all California ZIP codes over the years 1999–2008. These misalignment models allow panel analysis of ZIP-code data despite frequent redefinition of postal-code boundaries, while also controlling for overdispersion and the effects of spatial autocorrelation. Because models control for overall retail density, estimated alcohol-outlet associations represent the extra effect of retail establishments selling alcohol. The results indicate a number of statistically well-supported associations between retail density and crash behavior, but the implied effects on crash risks are relatively small. Alcohol-serving restaurants have a greater impact on overall crash risks than on the likelihood that those crashes involve alcohol, whereas bars primarily affect the odds that crashes are alcohol-involved. Off-premise outlet density is negatively associated with risks of both crashes and alcohol involvement, while the presence of a tribal casino in a ZIP code is linked to higher odds of police-reported drinking involvement. Alcohol outlets in a given area are found to influence crash risks both locally and in adjacent ZIP codes, and significant spatial autocorrelation also suggests important relationships across geographical units. These results suggest that each type of alcohol outlet can have differing impacts on risks of crashing as well as the alcohol involvement of those crashes. PMID:23537623

  10. Silencing of X-Linked MicroRNAs by Meiotic Sex Chromosome Inactivation

    PubMed Central

    Royo, Hélène; Seitz, Hervé; ElInati, Elias; Peters, Antoine H. F. M.; Stadler, Michael B.; Turner, James M. A.

    2015-01-01

    During the pachytene stage of meiosis in male mammals, the X and Y chromosomes are transcriptionally silenced by Meiotic Sex Chromosome Inactivation (MSCI). MSCI is conserved in therian mammals and is essential for normal male fertility. Transcriptomics approaches have demonstrated that in mice, most or all protein-coding genes on the X chromosome are subject to MSCI. However, it is unclear whether X-linked non-coding RNAs behave in a similar manner. The X chromosome is enriched in microRNA (miRNA) genes, with many exhibiting testis-biased expression. Importantly, high expression levels of X-linked miRNAs (X-miRNAs) have been reported in pachytene spermatocytes, indicating that these genes may escape MSCI, and perhaps play a role in the XY-silencing process. Here we use RNA FISH to examine X-miRNA expression in the male germ line. We find that, like protein-coding X-genes, X-miRNAs are expressed prior to prophase I and are thereafter silenced during pachynema. X-miRNA silencing does not occur in mouse models with defective MSCI. Furthermore, X-miRNAs are expressed at pachynema when present as autosomally integrated transgenes. Thus, we conclude that silencing of X-miRNAs during pachynema in wild type males is MSCI-dependent. Importantly, misexpression of X-miRNAs during pachynema causes spermatogenic defects. We propose that MSCI represents a chromosomal mechanism by which X-miRNAs, and other potential X-encoded repressors, can be silenced, thereby regulating genes with critical late spermatogenic functions. PMID:26509798

  11. Predictive coding links perception, action, and learning to emotions in music. Comment on "The quartet theory of human emotions: An integrative and neurofunctional model" by S. Koelsch et al.

    NASA Astrophysics Data System (ADS)

    Gebauer, L.; Kringelbach, M. L.; Vuust, P.

    2015-06-01

    The review by Koelsch and colleagues [1] offers a timely, comprehensive, and anatomically detailed framework for understanding the neural correlates of human emotions. The authors describe emotion in a framework of four affect systems, which are linked to effector systems and higher order cognitive functions. This is elegantly demonstrated through the example of music: a realm for exploring emotions in a domain that can be independent of language but is still highly relevant for understanding human emotions [2].

  12. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links.

    PubMed

    Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen

    2018-05-25

    Inter-satellite links are an important component of the new generation of satellite navigation systems. They are characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference, and the short time slot of each satellite, all of which complicate the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt a long-code spread-spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct averaging are largely restricted because of code Doppler and the additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and the dual-channel method have been proposed to achieve long code acquisition in low-SNR and high-dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named the dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased by finding the two largest values. The verification takes all full and partial peaks into account. Numerical results reveal that the DC-XFAST method improves acquisition performance while maintaining acquisition speed. The method has a significantly higher acquisition probability than the folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection probability and lower false alarm probability, it has a lower mean acquisition time than traditional XFAST, DF-XFAST and zero-padding.
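
    The abstract does not spell out the full algorithm; as a rough Python sketch of the folding-plus-circular-FFT step that XFAST-type methods share (not the complete DC-XFAST with dual-channel verification), consider the following, where the code length, block length, true delay, and noise level are all invented:

        import numpy as np

        def fold_and_correlate(incoming, local_code, block_len):
            """XFAST-style folding: sum the long local code in blocks of
            block_len, zero-pad to the incoming block length, and run a
            circular correlation with one FFT pass."""
            n_blocks = len(local_code) // block_len
            folded = local_code[:n_blocks * block_len].reshape(n_blocks, block_len).sum(axis=0)
            padded = np.zeros(len(incoming))
            padded[:block_len] = folded
            spec = np.fft.fft(incoming) * np.conj(np.fft.fft(padded))
            return np.abs(np.fft.ifft(spec))

        rng = np.random.default_rng(0)
        long_code = rng.choice([-1.0, 1.0], size=8192)    # toy long PN code
        delay = 1234                                      # true code phase, unknown to the receiver
        rx = np.roll(long_code, -delay)[:2048] + 0.5 * rng.standard_normal(2048)
        corr = fold_and_correlate(rx, long_code, block_len=2048)
        # folding locates the phase only modulo block_len; the verification
        # stage (dual channels in DC-XFAST) resolves which fold segment it was
        print("peak at lag", int(np.argmax(corr)), "- expected", (-delay) % 2048)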

  13. A New Realistic Evaluation Analysis Method: Linked Coding of Context, Mechanism, and Outcome Relationships

    ERIC Educational Resources Information Center

    Jackson, Suzanne F.; Kolla, Gillian

    2012-01-01

    In attempting to use a realistic evaluation approach to explore the role of Community Parents in early parenting programs in Toronto, a novel technique was developed to analyze the links between contexts (C), mechanisms (M) and outcomes (O) directly from experienced practitioner interviews. Rather than coding the interviews into themes in terms of…

  14. Solving Mendelian Mysteries: The Non-coding Genome May Hold the Key.

    PubMed

    Valente, Enza Maria; Bhatia, Kailash P

    2018-02-22

    Despite revolutionary advances in sequencing approaches, many Mendelian disorders have remained unexplained. In this issue of Cell, Aneichyk et al. combine genomic and cell-type-specific transcriptomic data to causally link a non-coding mutation in the ubiquitous TAF1 gene to X-linked dystonia-parkinsonism. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software is a plug-in that interfaces between Phoenix Integration's Model Center and Base SAS 9.2 applications. The end use of the plug-in is to link input and output data that reside in SAS tables or MS SQL to and from "legacy" software programs without recoding. The potential end users are those who need to run legacy code and want data stored in a SQL database.

  16. Conflict Negotiation and Autonomy Processes in Adolescent Romantic Relationships: An Observational Study of Interdependency in Boyfriend and Girlfriend Effects

    ERIC Educational Resources Information Center

    McIsaac, Caroline; Connolly, Jennifer; McKenney, Katherine S.; Pepler, Debra; Craig, Wendy

    2008-01-01

    This study examined the association between conflict negotiation and the expression of autonomy in adolescent romantic partners. Thirty-seven couples participated in a globally coded conflict interaction task. Actor-partner interdependence models (APIM) were used to quantify the extent to which boys' and girls' autonomy was linked solely to their…

  17. Dynamic analysis of space structures including elastic, multibody, and control behavior

    NASA Technical Reports Server (NTRS)

    Pinson, Larry; Soosaar, Keto

    1989-01-01

    The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.

  18. A Sequential Fluid-mechanic Chemical-kinetic Model of Propane HCCI Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aceves, S M; Flowers, D L; Martinez-Frias, J

    2000-11-29

    We have developed a methodology for predicting combustion and emissions in a Homogeneous Charge Compression Ignition (HCCI) Engine. This methodology combines a detailed fluid mechanics code with a detailed chemical kinetics code. Instead of directly linking the two codes, which would require an extremely long computational time, the methodology consists of first running the fluid mechanics code to obtain temperature profiles as a function of time. These temperature profiles are then used as input to a multi-zone chemical kinetics code. The advantage of this procedure is that a small number of zones (10) is enough to obtain accurate results. This procedure achieves the benefits of linking the fluid mechanics and the chemical kinetics codes with a great reduction in the computational effort, to a level that can be handled with current computers. The success of this procedure is in large part a consequence of the fact that for much of the compression stroke the chemistry is inactive and thus has little influence on fluid mechanics and heat transfer. Then, when chemistry is active, combustion is rather sudden, leaving little time for interaction between chemistry and fluid mixing and heat transfer. This sequential methodology has been capable of explaining the main characteristics of HCCI combustion that have been observed in experiments. In this paper, we use our model to explore an HCCI engine running on propane. The paper compares experimental and numerical pressure traces, heat release rates, and hydrocarbon and carbon monoxide emissions. The results show an excellent agreement, even in parameters that are difficult to predict, such as chemical heat release rates. Carbon monoxide emissions are reasonably well predicted, even though it is intrinsically difficult to make good predictions of CO emissions in HCCI engines. The paper includes a sensitivity study on the effect of the heat transfer correlation on the results of the analysis. Importantly, the paper also shows a numerical study on how parameters such as swirl rate, crevices and ceramic walls could help in reducing HC and CO emissions from HCCI engines.
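
    As a toy of this sequential (one-way) coupling, the sketch below imposes prescribed temperature histories on ten zones and integrates a one-step Arrhenius fuel-consumption law in each; the profiles and rate constants are invented stand-ins for the fluid mechanics and detailed kinetics codes.

        import numpy as np

        # Step 1 (stand-in): zone temperature histories that would come from
        # the fluid mechanics code; ten zones with assumed compression heating.
        t = np.linspace(0.0, 0.02, 2000)                           # time, s
        zones = [900.0 + 60.0 * z + 4.0e4 * t for z in range(10)]  # K, invented

        # Step 2: integrate a one-step Arrhenius fuel consumption law in each
        # zone under its imposed temperature history (stand-in for the
        # detailed chemical kinetics code).
        A, Ea_R = 2.0e8, 1.5e4        # pre-exponential (1/s), Ea/R (K); assumed
        dt = t[1] - t[0]
        for z, T in enumerate(zones):
            Y, ignition = 1.0, None   # fuel mass fraction, ignition time
            for i in range(len(t)):
                Y = max(Y - dt * A * np.exp(-Ea_R / T[i]) * Y, 0.0)
                if ignition is None and Y < 0.5:
                    ignition = t[i]
            stamp = "none" if ignition is None else "%.2f ms" % (1e3 * ignition)
            print("zone %d: half the fuel consumed at %s" % (z, stamp))

    Hotter zones ignite earlier, which is the staggered, zone-by-zone heat release the multi-zone idea captures without a fully coupled calculation.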

  19. Simulator of Space Communication Networks

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Jennings, Esther; Gao, Jay; Segui, John; Kwong, Winston

    2005-01-01

    Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) is a suite of software tools that simulates the behaviors of communication networks to be used in space exploration and predicts the performance of established and emerging space communication protocols and services. MACHETE consists of four general software systems: (1) a system for kinematic modeling of planetary and spacecraft motions; (2) a system for characterizing the engineering impact on the bandwidth and reliability of deep-space and in-situ communication links; (3) a system for generating traffic loads and modeling of protocol behaviors and state machines; and (4) a user-interface system for performance-metric visualization. The kinematic-modeling system makes it possible to characterize space link connectivity effects, including occultations and signal losses arising from dynamic slant-range changes and antenna radiation patterns. The link-engineering system also accounts for antenna radiation patterns and other phenomena, including modulations, data rates, coding, noise, and multipath fading. The protocol system utilizes information from the kinematic-modeling and link-engineering systems to simulate operational scenarios of space missions and evaluate overall network performance. In addition, a Communications Effect Server (CES) interface for MACHETE has been developed to facilitate hybrid simulation of space communication networks with actual flight/ground software/hardware embedded in the overall system.

  20. Wireless Visual Sensor Network Resource Allocation using Cross-Layer Optimization

    DTIC Science & Technology

    2009-01-01

    Rate Compatible Punctured Convolutional (RCPC) codes for channel... vol. 44, pp. 2943–2959, November 1998. [22] J. Hagenauer, "Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE... coding rate for H.264/AVC video compression is determined. At the data link layer, the Rate-Compatible Punctured Convolutional (RCPC) channel coding

  1. Reservoir simulation with MUFITS code: Extension for double porosity reservoirs and flows in horizontal wells

    NASA Astrophysics Data System (ADS)

    Afanasyev, Andrey

    2017-04-01

    Numerical modelling of multiphase flows in porous media is necessary in many applications concerning subsurface utilization. An incomplete list of those applications includes oil and gas field exploration, underground carbon dioxide storage and geothermal energy production. The numerical simulations are conducted using complicated computer programs called reservoir simulators. A robust simulator should include a wide range of modelling options covering various exploration techniques, rock and fluid properties, and geological settings. In this work we present recently developed options in the MUFITS code [1]. The first option concerns modelling of multiphase flows in double-porosity double-permeability reservoirs. We describe the internal representation of reservoir models in MUFITS, which are constructed as a 3D graph of grid blocks, pipe segments, interfaces, etc. In the case of a double-porosity reservoir, two linked nodes of the graph correspond to each grid cell. We simulate the 6th SPE comparative problem [2] and a five-spot geothermal production problem to validate the option. The second option concerns modelling of flows in porous media coupled with flows in horizontal wells, which are represented in the 3D graph as a sequence of pipe segments linked with pipe junctions. The well completions link the pipe segments with the reservoir. The hydraulics in the wellbore, i.e. the frictional pressure drop, is calculated in accordance with Haaland's formula. We validate the option against the 7th SPE comparative problem [3]. We acknowledge financial support by the Russian Foundation for Basic Research (project No RFBR-15-31-20585). References [1] Afanasyev, A. MUFITS Reservoir Simulation Software (www.mufits.imec.msu.ru). [2] Firoozabadi A. et al. Sixth SPE Comparative Solution Project: Dual-Porosity Simulators // J. Petrol. Tech. 1990. V.42. N.6. P.710-715. [3] Nghiem L., et al. Seventh SPE Comparative Solution Project: Modelling of Horizontal Wells in Reservoir Simulation // SPE Symp. Res. Sim., 1991. DOI: 10.2118/21221-MS.
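
    Haaland's formula is an explicit approximation to the Colebrook friction relation; a minimal sketch of the wellbore pressure-drop calculation it supports might look as follows (pipe geometry, roughness and fluid properties are assumed example values):

        import math

        def haaland_friction_factor(reynolds, rel_roughness):
            """Darcy friction factor from Haaland's explicit approximation
            to the Colebrook equation (turbulent pipe flow)."""
            inv_sqrt_f = -1.8 * math.log10((rel_roughness / 3.7) ** 1.11
                                           + 6.9 / reynolds)
            return 1.0 / inv_sqrt_f ** 2

        def frictional_pressure_drop(q, d, length, rho=1000.0, mu=1.0e-3, eps=4.5e-5):
            """Darcy-Weisbach pressure drop (Pa) for flow rate q (m^3/s) in a
            pipe of diameter d (m); fluid defaults are water (assumed values)."""
            area = math.pi * d ** 2 / 4.0
            v = q / area
            re = rho * v * d / mu
            f = haaland_friction_factor(re, eps / d)
            return f * (length / d) * 0.5 * rho * v ** 2

        # 100 m of 0.1 m diameter pipe carrying 0.01 m^3/s of water
        print("dp = %.0f Pa" % frictional_pressure_drop(0.01, 0.1, 100.0))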

  2. Dual coding: a cognitive model for psychoanalytic research.

    PubMed

    Bucci, W

    1985-01-01

    Four theories of mental representation derived from current experimental work in cognitive psychology have been discussed in relation to psychoanalytic theory. These are: verbal mediation theory, in which language determines or mediates thought; perceptual dominance theory, in which imagistic structures are dominant; common code or propositional models, in which all information, perceptual or linguistic, is represented in an abstract, amodal code; and dual coding, in which nonverbal and verbal information are each encoded, in symbolic form, in separate systems specialized for such representation, and connected by a complex system of referential relations. The weight of current empirical evidence supports the dual code theory. However, psychoanalysis has implicitly accepted a mixed model: perceptual dominance theory applying to unconscious representation, and verbal mediation characterizing mature conscious waking thought. The characterization of psychoanalysis, by Schafer, Spence, and others, as a domain in which reality is constructed rather than discovered, reflects the application of this incomplete mixed model. The representations of experience in the patient's mind are seen as without structure of their own, needing to be organized by words, thus vulnerable to distortion or dissolution by the language of the analyst or the patient himself. In these terms, hypothesis testing becomes a meaningless pursuit; the propositions of the theory are no longer falsifiable; the analyst is always more or less "right." This paper suggests that the integrated dual code formulation provides a more coherent theoretical framework for psychoanalysis than the mixed model, with important implications for theory and technique. In terms of dual coding, the problem is not that the nonverbal representations are vulnerable to distortion by words, but that the words that pass back and forth between analyst and patient will not affect the nonverbal schemata at all. Using the dual code formulation, and applying an investigative methodology derived from experimental cognitive psychology, a new approach to the verification of interpretations is possible. Some constructions of a patient's story may be seen as more accurate than others, by virtue of their linkage to stored perceptual representations in long-term memory. We can demonstrate that such linking has occurred in functional or operational terms, through evaluating the representation of imagistic content in the patient's speech.

  3. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand

    2016-04-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes including aqueous speciation, pH-dependent sorption and colloid-facilitated transport is not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g., organic matter pools, rate equations, parameter dependencies on environmental conditions) and thereby facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (bio-diffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic matter models with generic and flexible reactive transport codes offers a valuable tool to enhance insights into coupled physico-chemical processes at different scales within the scope of C-biogeochemical cycles, possibly linked with other chemical elements such as plant nutrients and pollutants.
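
    As a minimal sketch of such an extensible first-order reaction network (the pool names follow the benchmark description; all rate constants and yield fractions are invented for illustration):

        from scipy.integrate import solve_ivp

        # Pools: Org-C, Bio-C, Hum-C, CO2. Rates and yields are invented.
        k_org, k_bio = 0.05, 0.02        # first-order rates (1/day)
        f_bio, f_hum = 0.3, 0.2          # yield fractions; the rest becomes CO2

        def rhs(t, y):
            org, bio, hum, co2 = y
            return [-k_org * org + k_bio * bio,         # biomass recycled to the pool
                    f_bio * k_org * org - k_bio * bio,
                    f_hum * k_org * org,
                    (1.0 - f_bio - f_hum) * k_org * org]

        sol = solve_ivp(rhs, (0.0, 365.0), [100.0, 1.0, 0.0, 0.0])
        org, bio, hum, co2 = sol.y[:, -1]
        print("after 1 year: Org-C %.1f, Bio-C %.1f, Hum-C %.1f, CO2 %.1f (sum %.1f)"
              % (org, bio, hum, co2, org + bio + hum + co2))

    Extending such a network means adding pools and rate terms to the right-hand side, which is exactly the flexibility argued for above.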

  4. Development of the FHR advanced natural circulation analysis code and application to FHR safety analysis

    DOE PAGES

    Guo, Z.; Zweibaum, N.; Shao, M.; ...

    2016-04-19

    The University of California, Berkeley (UCB) is performing thermal hydraulics safety analysis to develop the technical basis for design and licensing of fluoride-salt-cooled, high-temperature reactors (FHRs). FHR designs investigated by UCB use natural circulation for emergency, passive decay heat removal when normal decay heat removal systems fail. The FHR advanced natural circulation analysis (FANCY) code has been developed for assessment of passive decay heat removal capability and safety analysis of these innovative system designs. The FANCY code uses a one-dimensional, semi-implicit scheme to solve for pressure-linked mass, momentum and energy conservation equations. Graph theory is used to automatically generate a staggered mesh for complicated pipe network systems. Heat structure models have been implemented for three types of boundary conditions (Dirichlet, Neumann and Robin boundary conditions). Heat structures can be composed of several layers of different materials, and are used for simulation of heat structure temperature distribution and heat transfer rate. Control models are used to simulate sequences of events or trips of safety systems. A proportional-integral controller is also used to automatically make thermal hydraulic systems reach desired steady state conditions. A point kinetics model is used to model reactor kinetics behavior with temperature reactivity feedback. The underlying large sparse linear systems in these models are efficiently solved by using direct and iterative solvers provided by the SuperLU code on high performance machines. Input interfaces are designed to increase the flexibility of simulation for complicated thermal hydraulic systems. In conclusion, this paper mainly focuses on the methodology used to develop the FANCY code, and safety analysis of the Mark 1 pebble-bed FHR under development at UCB is performed.
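
    The abstract gives no equations; a hedged, one-delayed-group sketch of a point kinetics model with temperature reactivity feedback, of the general kind FANCY couples to its thermal hydraulics, is shown below with invented parameters.

        from scipy.integrate import solve_ivp

        # One delayed neutron group, linear temperature feedback; values assumed.
        beta, lam, Lam = 0.0065, 0.08, 1.0e-3  # delayed fraction, decay const (1/s), generation time (s)
        alpha_T = -2.0e-5                      # reactivity per kelvin (negative feedback)
        h = 0.05                               # crude heat removal coefficient (1/s)
        rho_ext = 0.002                        # step reactivity insertion

        def rhs(t, y):
            n, c, dT = y                       # relative power, precursors, temperature rise
            rho = rho_ext + alpha_T * dT
            return [(rho - beta) / Lam * n + lam * c,
                    beta / Lam * n - lam * c,
                    n - 1.0 - h * dT]          # toy energy balance (n = 1 is nominal)

        y0 = [1.0, beta / (lam * Lam), 0.0]    # critical steady state at nominal power
        sol = solve_ivp(rhs, (0.0, 200.0), y0, max_step=0.05)
        print("power at t = 200 s: %.2f x nominal (feedback-limited)" % sol.y[0, -1])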

  5. Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis

    NASA Technical Reports Server (NTRS)

    Kopp, H.; Trettau, R.; Zolotar, B.

    1984-01-01

    The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.

  6. Transient flow analysis linked to fast pressure disturbance monitored in pipe systems

    NASA Astrophysics Data System (ADS)

    Kueny, J. L.; Lourenco, M.; Ballester, J. L.

    2012-11-01

    EDF Hydro Division has launched the RENOUVEAU program in order to increase performance and improve plant availability through anticipation. Under this program, a large fleet of penstocks is equipped with pressure transducers linked to a special monitoring system. Any significant pressure disturbance is captured in a snapshot, and the waveform of the signal is stored and analyzed. During these transient states, the flow variations are unknown. In order to determine the structural impact of such overpressures occurring during complex transient conditions over the entire circuit, EDF DTG asked ENSE3 GRENOBLE to develop a code called ACHYL CF*. The input data of ACHYL CF are the circuit topology and the pressure boundary conditions. This article provides a description of the computer code developed for modeling transient flow in a pipe network using the signals from pressure transducers as boundary conditions. Different test cases are presented, simulating real hydro power plants for which measured pressure signals are available.

  7. Fate and Transport of Nitrogen and Carbon with Decomposition of Organic Matter in a Reduced Paddy Field Based on a Coupled Nitrogen-Carbon Cycling Model Using the HP1 Code

    NASA Astrophysics Data System (ADS)

    Toride, N.; Matsuoka, K.

    2017-12-01

    In order to predict the fate and transport of nitrogen in a reduced paddy field as a result of decomposition of organic matter, we implemented within the PHREEQC program a modified coupled carbon and nitrogen cycling model based on the LEACHM code. Soil organic matter (SOM) decay processes from organic carbon (Org-C) to biomass carbon (Bio-C), humus carbon (Hum-C), and carbon dioxide (CO2) were described using first-order kinetics. Bio-C was recycled into the organic pool. When oxygen was available in an aerobic condition, O2 was used to produce CO2 as an electron acceptor. When O2 availability was low, other electron acceptors, such as NO3-, Mn4+, Fe3+, and SO42-, were used depending on the redox potential. Decomposition of Org-N was related to the carbon cycle using the C/N ratio. Mineralization and immobilization were determined based on available NH4-N and the nitrogen demand for the formation of biomass and humus. Although nitrification was independently described with a first-order decay process, denitrification was linked with the SOM decay since NO3- was an electron acceptor for the CO2 production. Proton reactions were coupled with the nitrification from NH4+ to NO3-, and the ammonium generation from NH3 to NH4+. Furthermore, cation and anion exchange reactions were included with the permanent negative charges and the pH-dependent variable charges. The carbon and nitrogen cycling model described with PHREEQC was linked with HYDRUS-1D using the HP1 code. Various nitrogen and carbon transport scenarios were demonstrated for the application of organic matter to a saturated paddy soil.

  8. An Integrated Approach Linking Process to Structural Modeling With Microstructural Characterization for Injection-Molded Long-Fiber Thermoplastics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.

    2008-09-01

    The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain necessary microstructural data to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked-structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.

  9. Received response based heuristic LDPC code for short-range non-line-of-sight ultraviolet communication.

    PubMed

    Qin, Heng; Zuo, Yong; Zhang, Dong; Li, Yinghui; Wu, Jian

    2017-03-06

    Through a slight modification of typical photomultiplier tube (PMT) receiver output statistics, a generalized received-response model considering both scattered propagation and random detection is presented to investigate the impact of inter-symbol interference (ISI) on the link data rate of short-range non-line-of-sight (NLOS) ultraviolet communication. Numerical simulation shows good agreement with the experimental results. Based on the received-response characteristics, a heuristic check-matrix construction algorithm for low-density parity-check (LDPC) codes is further proposed to approach the data rate bound derived for a delayed-sampling (DS) binary pulse position modulation (PPM) system. Compared to conventional LDPC coding methods, a bit error ratio (BER) below 1E-05 is achieved for short-range NLOS UVC systems operating at a data rate of 2 Mbps.

  10. Multiple Access Schemes for Lunar Missions

    NASA Technical Reports Server (NTRS)

    Deutsch, Leslie; Hamkins, Jon; Stocklin, Frank J.

    2010-01-01

    Two years ago, the NASA Coding, Modulation, and Link Protocol (CMLP) study was completed. The study, led by the authors of this paper, recommended codes, modulation schemes, and desired attributes of link protocols for all space communication links in NASA's future space architecture. Portions of the NASA CMLP team were reassembled to resolve one open issue: the use of multiple access (MA) communication from the lunar surface. The CMLP-MA team analyzed and simulated two candidate multiple access schemes that were identified in the original CMLP study: Code Division MA (CDMA) and Frequency Division MA (FDMA) based on a bandwidth-efficient Continuous Phase Modulation (CPM) with a superimposed Pseudo-Noise (PN) ranging signal (CPM/PN). This paper summarizes the results of the analysis and simulation of the CMLP-MA study and describes the final recommendations.

  11. Chemistry Resolved Kinetic Flow Modeling of TATB Based Explosives

    NASA Astrophysics Data System (ADS)

    Vitello, Peter; Fried, Lawrence; Howard, Mike; Levesque, George; Souers, Clark

    2011-06-01

    Detonation waves in insensitive, TATB-based explosives are believed to have multi-time-scale regimes. The initial burn rate of such explosives has a sub-microsecond time scale. However, significant late-time slow energy release is believed to occur due to diffusion-limited growth of carbon. On intermediate time scales, concentrations of product species likely change from being in equilibrium to being kinetic-rate controlled. We use the thermo-chemical code CHEETAH linked to ALE hydrodynamics codes to model detonations. We term our model chemistry-resolved kinetic flow, as CHEETAH tracks the time-dependent concentrations of individual species in the detonation wave and calculates EOS values based on those concentrations. A validation suite of model simulations compared to recent high-fidelity metal push experiments at ambient and cold temperatures has been developed. We present here a study of multi-time-scale kinetic rate effects for these experiments. Prepared by LLNL under Contract DE-AC52-07NA27344.

  12. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best 90% were accessible over our testing period.
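
    A link-persistence check of this sort is easy to script; the sketch below, using the Python requests library with placeholder URLs, records an HTTP status code or the exception type for each link.

        import requests

        def check_links(urls, timeout=10):
            """Report which URLs are still reachable, in the spirit of the
            link-persistence test described above."""
            results = {}
            for url in urls:
                try:
                    r = requests.head(url, allow_redirects=True, timeout=timeout)
                    if r.status_code == 405:          # some servers reject HEAD
                        r = requests.get(url, timeout=timeout, stream=True)
                    results[url] = r.status_code
                except requests.RequestException as exc:
                    results[url] = type(exc).__name__
            return results

        for url, status in check_links(["https://ascl.net",
                                        "http://example.invalid/gone"]).items():
            print(status, url)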

  13. Dual routes for verbal repetition: articulation-based and acoustic-phonetic codes for pseudoword and word repetition, respectively.

    PubMed

    Yoo, Sejin; Chung, Jun-Young; Jeon, Hyeon-Ae; Lee, Kyoung-Min; Kim, Young-Bo; Cho, Zang-Hee

    2012-07-01

    Speech production is inextricably linked to speech perception, yet the two are usually investigated in isolation. In this study, we employed a verbal-repetition task to identify the neural substrates of speech processing with both ends active simultaneously, using functional MRI. Subjects verbally repeated auditory stimuli containing an ambiguous vowel sound that could be perceived as either a word or a pseudoword depending on the interpretation of the vowel. We found verbal repetition commonly activated the audition-articulation interface bilaterally at the Sylvian fissures and superior temporal sulci. Contrasting word-versus-pseudoword trials revealed neural activities unique to word repetition in the left posterior middle temporal areas and activities unique to pseudoword repetition in the left inferior frontal gyrus. These findings imply that the tasks are carried out using different speech codes: an articulation-based code for pseudowords and an acoustic-phonetic code for words. This also supports the dual-stream model and imitative learning of vocabulary. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    NASA Astrophysics Data System (ADS)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high frequency (HF) signals under disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray-tracing code of Jones and Stephenson, based on Hamilton's equations and commonly used to study radio propagation through the ionosphere, is employed. An electron density perturbation model is implemented in this code, based on atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma that affect HF signal propagation. To obtain a realistic model of GW propagation and dispersion characteristics, a GW ray-tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were incorporated to provide the electron density, wind velocities, neutral temperature and total mass density needed by the ray-tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low-to-mid-latitude ionosphere.
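
    The Jones and Stephenson code is three-dimensional and far more complete; purely as a toy illustration of ray tracing with Hamilton's equations, the 2-D sketch below bends a ray through a single parabolic ionospheric layer (the layer parameters, launch elevation and units are simplifying assumptions):

        import numpy as np
        from scipy.integrate import solve_ivp

        def n_squared(z):
            """Toy ionosphere: n^2 = 1 - X with a parabolic layer in height z (km)."""
            zm, half, x_peak = 250.0, 100.0, 0.4      # assumed layer parameters
            bump = max(0.0, 1.0 - ((z - zm) / half) ** 2)
            return 1.0 - x_peak * bump

        def d_n_squared(z, dz=1e-3):
            return (n_squared(z + dz) - n_squared(z - dz)) / (2.0 * dz)

        def rhs(t, y):
            # Hamilton's equations for H = (|k|^2 - n^2(z)) / 2 with omega/c = 1:
            # dr/dt = dH/dk = k,  dk/dt = -dH/dr = grad(n^2)/2
            x, z, kx, kz = y
            return [kx, kz, 0.0, 0.5 * d_n_squared(z)]

        def hit_ground(t, y):
            return y[1]
        hit_ground.terminal = True
        hit_ground.direction = -1                     # stop on the downward crossing

        elev = np.radians(20.0)                       # launch elevation
        sol = solve_ivp(rhs, (0.0, 3000.0),
                        [0.0, 0.0, np.cos(elev), np.sin(elev)],
                        max_step=1.0, events=hit_ground)
        print("ground range ~ %.0f km, reflection height ~ %.0f km"
              % (sol.y[0, -1], sol.y[1].max()))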

  15. Momentary Patterns of Covariation between Specific Affects and Interpersonal Behavior: Linking Relationship Science and Personality Assessment

    PubMed Central

    Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.

    2016-01-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786

  16. Momentary patterns of covariation between specific affects and interpersonal behavior: Linking relationship science and personality assessment.

    PubMed

    Ross, Jaclyn M; Girard, Jeffrey M; Wright, Aidan G C; Beeney, Joseph E; Scott, Lori N; Hallquist, Michael N; Lazarus, Sophie A; Stepp, Stephanie D; Pilkonis, Paul A

    2017-02-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the 2 systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed to generate synthetic cyclotron lines, making the investigation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).

  18. Performance Analysis of a JTIDS/Link-16-type Waveform Transmitted over Slow, Flat Nakagami Fading Channels in the Presence of Narrowband Interference

    DTIC Science & Technology

    2008-12-01

    The effective two-way tactical data rate is 3,060 bits per second. Note that there is no parity check or forward error correction (FEC) coding used in... of 1800 bits per second. With the use of FEC coding, the channel data rate is 2250 bits per second; however, the information data rate is still the... Link-11. If the parity bits are included, the channel data rate is 28,800 bps. If FEC coding is considered, the channel data rate is 59,520 bps

  19. Administrative data measured surgical site infection probability within 30 days of surgery in elderly patients.

    PubMed

    van Walraven, Carl; Jackson, Timothy D; Daneman, Nick

    2016-09-01

    Elderly patients are inordinately affected by surgical site infections (SSIs). This study derived and internally validated a model that used routinely collected health administrative data to measure the probability of SSI in elderly patients within 30 days of surgery. All patients older than 65 years undergoing surgery at two hospitals, with known SSI status, were linked to population-based administrative data sets in Ontario, Canada. We used bootstrap methods to create a multivariate model that used health administrative data to predict the probability of SSI. Of 3,436 patients, 177 (5.1%) had an SSI. The Elderly SSI Risk Model included six covariates: number of distinct physician fee codes within 30 days of surgery; presence or absence of a postdischarge prescription for an antibiotic; presence or absence of three diagnostic codes; and a previously derived score that gauged SSI risk based on procedure codes. The model was highly explanatory (Nagelkerke's R2, 0.458), strongly discriminative (C statistic, 0.918), and well calibrated (calibration slope, 1). Health administrative data can effectively determine the 30-day risk of SSI in elderly patients undergoing a broad assortment of surgeries. External validation is necessary before this can be routinely used to monitor SSIs in the elderly. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Porting Initiation and Failure into Linked CHEETAH

    NASA Astrophysics Data System (ADS)

    Souers, Clark; Vitello, Peter

    2007-06-01

    Linked CHEETAH is a thermo-chemical code coupled to a 2-D hydrocode. Initially, a quadratic pressure-dependent kinetic rate was used, which worked well in modeling prompt detonation of large explosives but does not capture other aspects of explosive behavior. The variable-pressure Tarantula reactive flow rate model was developed with JWL++ in order to also describe failure and initiation, and we have moved this model into Linked CHEETAH. The model turns on only above a pressure threshold, where a slow turn-on produces initiation. At a higher pressure, the rate suddenly leaps to a large value over a small pressure range. A slowly failing cylinder will see a rapidly declining rate, which pushes it quickly into failure. At high pressure, the detonation rate is constant. A sequential validation procedure is used, which includes metal-confined cylinders, rate-sticks, corner-turning, initiation and threshold, gap tests and air gaps. The size (diameter) effect is central to the calibration. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
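
    The published Tarantula calibration is not given in the abstract; the fragment below is only a schematic of the qualitative rate shape it describes (zero below threshold, slow initiation ramp, steep jump over a narrow pressure band, constant detonation-regime rate), with invented breakpoints and magnitudes:

        def tarantula_like_rate(p_gpa):
            """Schematic pressure-dependent burn rate in the spirit of the
            Tarantula model; all breakpoints and magnitudes are invented."""
            p_on, p_jump, p_det = 2.0, 8.0, 10.0      # GPa, assumed breakpoints
            if p_gpa < p_on:
                return 0.0                            # no reaction: failure regime
            if p_gpa < p_jump:
                return 0.05 * (p_gpa - p_on) ** 2     # slow growth: initiation
            if p_gpa < p_det:
                # steep rise over a narrow band drives the go/no-go behavior
                lo = 0.05 * (p_jump - p_on) ** 2
                return lo + (50.0 - lo) * (p_gpa - p_jump) / (p_det - p_jump)
            return 50.0                               # constant detonation-regime rate

        for p in (1.0, 4.0, 9.0, 12.0):
            print("P = %4.1f GPa -> rate = %7.3f (arb. units)"
                  % (p, tarantula_like_rate(p)))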

  1. Autistic traits are linked to reduced adaptive coding of face identity and selectively poorer face recognition in men but not women.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Ewing, Louise

    2013-11-01

    Our ability to discriminate and recognize thousands of faces despite their similarity as visual patterns relies on adaptive, norm-based, coding mechanisms that are continuously updated by experience. Reduced adaptive coding of face identity has been proposed as a neurocognitive endophenotype for autism, because it is found in autism and in relatives of individuals with autism. Autistic traits can also extend continuously into the general population, raising the possibility that reduced adaptive coding of face identity may be more generally associated with autistic traits. In the present study, we investigated whether adaptive coding of face identity decreases as autistic traits increase in an undergraduate population. Adaptive coding was measured using face identity aftereffects, and autistic traits were measured using the Autism-Spectrum Quotient (AQ) and its subscales. We also measured face and car recognition ability to determine whether autistic traits are selectively related to face recognition difficulties. We found that men who scored higher on levels of autistic traits related to social interaction had reduced adaptive coding of face identity. This result is consistent with the idea that atypical adaptive face-coding mechanisms are an endophenotype for autism. Autistic traits were also linked with face-selective recognition difficulties in men. However, there were some unexpected sex differences. In women, autistic traits were linked positively, rather than negatively, with adaptive coding of identity, and were unrelated to face-selective recognition difficulties. These sex differences indicate that autistic traits can have different neurocognitive correlates in men and women and raise the intriguing possibility that endophenotypes of autism can differ in males and females. © 2013 Elsevier Ltd. All rights reserved.

  2. Supporting Situated Learning Based on QR Codes with Etiquetar App: A Pilot Study

    ERIC Educational Resources Information Center

    Camacho, Miguel Olmedo; Pérez-Sanagustín, Mar; Alario-Hoyos, Carlos; Soldani, Xavier; Kloos, Carlos Delgado; Sayago, Sergio

    2014-01-01

    EtiquetAR is an authoring tool for supporting the design and enactment of situated learning experiences based on QR tags. Practitioners use etiquetAR for creating, managing and personalizing collections of QR codes with special properties: (1) codes can have more than one link pointing at different multimedia resources, (2) codes can be updated…

  3. Reactive transport modeling in fractured rock: A state-of-the-science review

    NASA Astrophysics Data System (ADS)

    MacQuarrie, Kerry T. B.; Mayer, K. Ulrich

    2005-10-01

    The field of reactive transport modeling has expanded significantly in the past two decades and has assisted in resolving many issues in Earth Sciences. Numerical models allow for detailed examination of coupled transport and reactions, or more general investigation of controlling processes over geologic time scales. Reactive transport models serve to provide guidance in field data collection and, in particular, enable researchers to link modeling and hydrogeochemical studies. In this state-of-science review, the key objectives were to examine the applicability of reactive transport codes for exploring issues of redox stability to depths of several hundreds of meters in sparsely fractured crystalline rock, with a focus on the Canadian Shield setting. A conceptual model of oxygen ingress and redox buffering, within a Shield environment at time and space scales relevant to nuclear waste repository performance, is developed through a review of previous research. This conceptual model describes geochemical and biological processes and mechanisms materially important to understanding redox buffering capacity and radionuclide mobility in the far-field. Consistent with this model, reactive transport codes should ideally be capable of simulating the effects of changing recharge water compositions as a result of long-term climate change, and fracture-matrix interactions that may govern water-rock interaction. Other aspects influencing the suitability of reactive transport codes include the treatment of various reaction and transport time scales, the ability to apply equilibrium or kinetic formulations simultaneously, the need to capture feedback between water-rock interactions and porosity-permeability changes, and the representation of fractured crystalline rock environments as discrete fracture or dual continuum media. A review of modern multicomponent reactive transport codes indicates a relatively high-level of maturity. Within the Yucca Mountain nuclear waste disposal program, reactive transport codes of varying complexity have been applied to investigate the migration of radionuclides and the geochemical evolution of host rock around the planned disposal facility. Through appropriate near- and far-field application of dual continuum codes, this example demonstrates how reactive transport models have been applied to assist in constraining historic water infiltration rates, interpreting the sealing of flow paths due to mineral precipitation, and investigating post-closure geochemical monitoring strategies. Natural analogue modeling studies, although few in number, are also of key importance as they allow the comparison of model results with hydrogeochemical and paleohydrogeological data over geologic time scales.

  4. Network coding multiuser scheme for indoor visible light communications

    NASA Astrophysics Data System (ADS)

    Zhang, Jiankun; Dang, Anhong

    2017-12-01

    Visible light communication (VLC) is a unique alternative for indoor data transfer and is developing beyond point-to-point links. However, in realizing high-capacity networks, VLC faces challenges including the constrained bandwidth of the optical access point and random occlusion. A network coding scheme for VLC (NC-VLC) is proposed, with increased throughput and system robustness. Based on the Lambertian illumination model, the theoretical decoding-failure probability of the multiuser NC-VLC system is derived, and the impact of the system parameters on performance is analyzed. Experiments demonstrate the proposed scheme successfully in an indoor multiuser scenario. These results indicate that the NC-VLC system performs well under link loss and random occlusion.
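
    The paper's scheme is not detailed in the abstract; the classic two-user XOR relaying example below illustrates the basic gain network coding offers in such a topology: the access point broadcasts one coded packet instead of relaying two.

        import os

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        # Two users exchange packets through one VLC access point. Instead of
        # relaying pkt_a and pkt_b separately, the AP broadcasts their XOR;
        # each user recovers the other's packet from the one it already holds,
        # so one downlink slot replaces two.
        pkt_a = os.urandom(16)          # user A's packet (toy payload)
        pkt_b = os.urandom(16)          # user B's packet
        coded = xor(pkt_a, pkt_b)       # single broadcast from the access point

        assert xor(coded, pkt_a) == pkt_b   # user A decodes B's packet
        assert xor(coded, pkt_b) == pkt_a   # user B decodes A's packet
        print("both users decoded from one coded broadcast")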

  5. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.

  6. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  7. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  8. Modeling of Adaptive Optics-Based Free-Space Communications Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilks, S C; Morris, J R; Brase, J M

    2002-08-06

    We introduce a wave-optics-based simulation code written for air-optic laser communications links that includes a detailed model of an adaptive optics compensation system. We present results obtained with this model, in which the phase of a communications laser beam is corrected after it propagates through a turbulent atmosphere. The phase of the received laser beam is measured using a Shack-Hartmann wavefront sensor, and the correction is applied with a MEMS mirror. Strehl improvement and the amount of power coupled into the receiving fiber for both 1 km horizontal and 28 km slant paths are presented.
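
    One standard way to relate the reported Strehl improvement to residual wavefront error is the extended Maréchal approximation; the RMS figures and wavelength below are assumed for illustration, not taken from the paper.

        import math

        def strehl_marechal(sigma_rad):
            """Extended Marechal approximation: Strehl ratio from the residual
            RMS wavefront error sigma (radians) after AO correction."""
            return math.exp(-sigma_rad ** 2)

        # Assumed residual wavefront errors before and after closing the
        # adaptive optics loop, at a 1.55 um link wavelength.
        for label, rms_nm in (("open loop", 400.0), ("closed loop", 90.0)):
            sigma = 2 * math.pi * rms_nm / 1550.0     # nm RMS -> radians
            print("%s: sigma = %.2f rad, Strehl ~ %.2f"
                  % (label, sigma, strehl_marechal(sigma)))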

  9. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    PubMed

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.

  10. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    PubMed Central

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616

  11. Ontological Modeling for Integrated Spacecraft Analysis

    NASA Technical Reports Server (NTRS)

    Wicks, Erica

    2011-01-01

    Current spacecraft operate as a cooperative group of subsystems, each of which requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past and failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but is also web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.

  12. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink.

    PubMed

    Olier, Ivan; Springate, David A; Ashcroft, Darren M; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness (SMI) as an example. We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists.
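
    A minimal sketch of the stub-matching idea follows (Python; pcdsearch itself is a Stata/R command, and the codes and descriptions below are invented for illustration, not real Read codes).

        import re

        # Invented miniature code dictionary of (code, description) pairs.
        CODE_DICTIONARY = [
            ("X001.", "Schizophrenic disorders"),
            ("X002.", "Bipolar affective disorder"),
            ("X0021", "Bipolar disorder, manic episode"),
            ("Y100.", "Asthma review"),
        ]

        def search_codes(word_stubs, code_stubs):
            # Return rows whose description matches any word-stub or whose code
            # starts with any code-stub; clinicians then review the hits.
            word_re = re.compile("|".join(map(re.escape, word_stubs)), re.IGNORECASE)
            return [(code, desc) for code, desc in CODE_DICTIONARY
                    if word_re.search(desc) or any(code.startswith(s) for s in code_stubs)]

        print(search_codes(["schizo", "bipolar"], ["X002"]))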

  13. Testing Quick Response (QR) Codes as an Innovation to Improve Feedback Among Geographically-Separated Clerkship Sites.

    PubMed

    Snyder, Matthew J; Nguyen, Dana R; Womack, Jasmyne J; Bunt, Christopher W; Westerfield, Katie L; Bell, Adriane E; Ledford, Christy J W

    2018-03-01

    Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback. In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director. The feedback method was significantly associated with usability, with QR code scoring the highest, and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters. The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a high usability platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norm for providing feedback in this manner need to be considered.

  14. System Simulation of Nuclear Power Plant by Coupling RELAP5 and Matlab/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng Lin; Dong Hou; Zhihong Xu

    2006-07-01

    Since the RELAP5 code has general and advanced thermal-hydraulic computation features, it has been widely used in transient and accident safety analysis, experiment planning analysis, and system simulation. We therefore wish to design, analyze, and verify a new Instrumentation and Control (I and C) system for a Nuclear Power Plant (NPP) based on this best-estimate code, and eventually develop our own engineering simulator. However, because RELAP5's built-in capability for simulating control and protection systems is limited, that function must be expanded for efficient, accurate, and flexible design and simulation of I and C systems. Matlab/Simulink, a scientific computation package and a powerful tool for research and simulation of plant process control, can compensate for this limitation. It was therefore selected as the I and C part and coupled with the RELAP5 code to realize system simulation of NPPs. Two key problems had to be solved. The first is dynamic data exchange, by which Matlab/Simulink receives plant parameters and returns control results. A database is used for communication between the two codes: a Dynamic Link Library (DLL) links RELAP5 to the database, while a DLL and an S-Function do the same in Matlab/Simulink. The second problem is synchronization between the two codes, to ensure consistency in global simulation time. Because Matlab/Simulink always computes faster than RELAP5, the simulation time is sent by RELAP5 and received by Matlab/Simulink, and a time-control subroutine added to the Matlab/Simulink simulation procedure limits its advancement. Through these mechanisms, Matlab/Simulink is dynamically coupled with RELAP5. Thus, in Matlab/Simulink, we can freely design control and protection logic for NPPs and test it against best-estimate plant model feedback. A test is shown to demonstrate that the results of the coupled calculation are nearly identical to those of a standalone RELAP5 run with built-in control logic. In practice, a real Pressurized Water Reactor (PWR) was modeled with the RELAP5 code, and its main control and protection system was duplicated in Matlab/Simulink. Steady states and transients were calculated under the control of these I and C systems, and the results were compared with plant test curves. The application showed that exact system simulation of NPPs is possible by coupling RELAP5 and Matlab/Simulink. This paper mainly focuses on the coupling method, the plant thermal-hydraulic model, the main control logics, and test and application results. (authors)
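
    The synchronization rule is easy to illustrate. In the sketch below (Python, with both codes reduced to stand-in functions and the shared database to a dict), the thermal-hydraulic side owns the clock and the control side may only catch up to it, never lead, mirroring the time-control subroutine described in the abstract.

        shared_db = {"t": 0.0, "T_hot": 580.0, "valve": 0.5}

        def relap_step(db, dt):
            # Stand-in plant model: hot-leg temperature relaxes toward a
            # setpoint shifted by the current valve position.
            target = 560.0 + 40.0 * db["valve"]
            db["T_hot"] += (target - db["T_hot"]) * dt / 10.0
            db["t"] += dt

        def control_step(db):
            # Stand-in controller: open the valve when the loop runs cold.
            db["valve"] = min(1.0, max(0.0, 0.5 + (575.0 - db["T_hot"]) / 20.0))

        t_end, dt = 50.0, 0.5
        control_time = 0.0
        while shared_db["t"] < t_end:
            relap_step(shared_db, dt)             # the master code advances first
            while control_time < shared_db["t"]:  # controller catches up, never leads
                control_step(shared_db)
                control_time += dt
        print(round(shared_db["T_hot"], 2), round(shared_db["valve"], 2))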

  15. Developing and upgrading of solar system thermal energy storage simulation models. Technical progress report, March 1, 1979-February 29, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, J K; von Fuchs, G F; Zob, A P

    1980-05-01

    Two water tank component simulation models have been selected and upgraded. These models are called the CSU Model and the Extended SOLSYS Model. The models have been standardized and links have been provided for operation in the TRNSYS simulation program. The models are described in analytical terms as well as in computer code. Specific water tank tests were performed for the purpose of model validation. Agreement between model data and test data is excellent. A description of the limitations has also been included. Streamlining results and criteria for the reduction of computer time have also been shown for both water tank computer models. Computer codes for the models and instructions for operating these models in TRNSYS have also been included, making the models readily available for DOE and industry use. Rock bed component simulation models have been reviewed and a model selected and upgraded. This model is a logical extension of the Mumma-Marvin model. Specific rock bed tests have been performed for the purpose of validation. Data have been reviewed for consistency. Details of the test results concerned with rock characteristics and pressure drop through the bed have been explored and are reported.

  16. Farm Mapping to Assist, Protect, and Prepare Emergency Responders: Farm MAPPER.

    PubMed

    Reyes, Iris; Rollins, Tami; Mahnke, Andrea; Kadolph, Christopher; Minor, Gerald; Keifer, Matthew

    2014-01-01

    Responders such as firefighters and emergency medical technicians who respond to farm emergencies often face complex and unknown environments. They may encounter hazards such as fuels, solvents, pesticides, caustics, and exploding gas storage cylinders. Responders may be unaware of dirt roads within the farm that can expedite their arrival at critical sites or snow-covered manure pits that act as hidden hazards. A response to a farm, unless guided by someone familiar with the operation, may present a risk to responders and pose a challenge in locating the victim. This project explored the use of a Web-based farm-mapping application optimized for tablets and reachable via on-site matrix barcodes, or quick response (QR) codes, to provide emergency responders with hazard and resource information about agricultural operations. Secured portals were developed for both farmers and responders, allowing both parties to populate and customize farm maps with icons. Data were stored online and linked to QR codes attached to mailbox posts where emergency responders may read them with a mobile device. Mock responses were conducted on dairy farms to test QR code linking efficacy, Web site security, and field usability. Findings from farmer usability tests showed willingness to enter data as well as ease of Web site navigation and data entry even with farmers who had limited computer knowledge. Usability tests with emergency responders showed ease of QR code connectivity to the farm maps and ease of Web site navigation. Further research is needed to improve data security as well as assess the program's applicability to nonfarm environments and integration with existing emergency response systems. The next phases of this project will expand the program for regional and national use, develop QR code-linked, Web-based extrication guidance for farm machinery for victim entrapment rescue, and create QR code-linked online training videos and materials for limited English proficient immigrant farm workers.

  17. QR Codes as Finding Aides: Linking Electronic and Print Library Resources

    ERIC Educational Resources Information Center

    Kane, Danielle; Schneidewind, Jeff

    2011-01-01

    As part of a focused, methodical, and evaluative approach to emerging technologies, QR codes are one of many new technologies being used by the UC Irvine Libraries. QR codes provide simple connections between print and virtual resources. In summer 2010, a small task force began to investigate how QR codes could be used to provide information and…

  18. Use Them ... or Lose Them? The Case for and against Using QR Codes

    ERIC Educational Resources Information Center

    Cunningham, Chuck; Dull, Cassie

    2011-01-01

    A quick-response (QR) code is a two-dimensional, black-and-white square barcode and links directly to a URL of one's choice. When the code is scanned with a smartphone, it will automatically redirect the user to the designated URL. QR codes are popping up everywhere--billboards, magazines, posters, shop windows, TVs, computer screens, and more.…

  19. Breaking the Code: The Creative Use of QR Codes to Market Extension Events

    ERIC Educational Resources Information Center

    Hill, Paul; Mills, Rebecca; Peterson, GaeLynn; Smith, Janet

    2013-01-01

    The use of smartphones has drastically increased in recent years, heralding an explosion in the use of QR codes. The black and white square barcodes that link the physical and digital world are everywhere. These simple codes can provide many opportunities to connect people in the physical world with many of Extension online resources. The…

  20. FPGA implementation of advanced FEC schemes for intelligent aggregation networks

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2016-02-01

    In state-of-the-art fiber-optics communication systems, a fixed forward error correction (FEC) scheme and constellation size are employed. While it is important to closely approach the Shannon limit by using turbo product codes (TPC) and low-density parity-check (LDPC) codes with soft-decision decoding (SDD), rate-adaptive techniques, which enable increased information rates over short links and reliable transmission over long links, are likely to become more important with ever-increasing network traffic demands. In this invited paper, we describe a rate-adaptive non-binary LDPC coding technique and demonstrate, by FPGA-based emulation, its flexibility and good performance, exhibiting no error floor at BERs down to 10^-15 over the entire code rate range, making it a viable solution for next-generation high-speed intelligent aggregation networks.
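
    The paper's code is a non-binary LDPC design; purely as generic background, the sketch below shows the arithmetic by which puncturing a fixed mother code yields a family of rates, the simplest form of rate adaptation (Python; the bit stream and patterns are arbitrary).

        import numpy as np

        def puncture(coded_bits, pattern):
            # Keep only the coded bits where the (repeating) pattern is 1. With a
            # rate-1/2 mother code, pattern [1,1,1,0] keeps 3/4 of the bits,
            # raising the rate to (1/2)/(3/4) = 2/3.
            mask = np.resize(np.array(pattern, dtype=bool), coded_bits.size)
            return coded_bits[mask]

        coded = np.random.randint(0, 2, size=200)  # pretend rate-1/2 encoder output
        for pattern, rate in [([1, 1, 1, 0], "2/3"), ([1, 1, 0, 1, 1, 0], "3/4")]:
            sent = puncture(coded, pattern)
            print(f"rate {rate}: {coded.size} coded bits -> {sent.size} transmitted")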

  1. A satellite mobile communication system based on Band-Limited Quasi-Synchronous Code Division Multiple Access (BLQS-CDMA)

    NASA Technical Reports Server (NTRS)

    Degaudenzi, R.; Elia, C.; Viola, R.

    1990-01-01

    Discussed here is a new approach to code division multiple access applied to a mobile system for voice (and data) services, based on Band-Limited Quasi-Synchronous Code Division Multiple Access (BLQS-CDMA). The system requires users to be chip synchronized to reduce the contribution of self-interference and to make use of voice activation in order to increase the satellite power efficiency. In order to achieve spectral efficiency, Nyquist chip pulse shaping is used with no detection performance impairment. The synchronization problems are solved in the forward link by distributing a master code, whereas carrier forced activation and closed loop control techniques have been adopted in the return link. System performance sensitivity to nonlinear amplification and timing/frequency synchronization errors is analyzed.

  2. Convolutional coding at 50 Mbps for the Shuttle Ku-band return link

    NASA Technical Reports Server (NTRS)

    Batson, B. H.; Huth, G. K.

    1976-01-01

    Error-correcting coding is required for the 50 Mbps data link from the Shuttle Orbiter through the Tracking and Data Relay Satellite System (TDRSS) to the ground because of severe power limitations. Convolutional coding has been chosen because the decoding algorithms (sequential and Viterbi) provide significant coding gains at the required bit error probability of 10^-6 and can be implemented at 50 Mbps with moderate hardware. While a 50 Mbps sequential decoder has been built, the highest data rate achieved for a Viterbi decoder is 10 Mbps. Thus, five multiplexed 10 Mbps Viterbi decoders must be used to provide a 50 Mbps data rate. This paper discusses the tradeoffs which were considered when selecting the multiplexed Viterbi decoder approach for this application.

  3. High velocity impact on composite link of aircraft wing flap mechanism

    NASA Astrophysics Data System (ADS)

    Heimbs, Sebastian; Lang, Holger; Havar, Tamas

    2012-12-01

    This paper describes the numerical investigation of the mechanical behaviour of a structural component of an aircraft wing flap support impacted by a wheel rim fragment. The support link made of composite materials was modelled in the commercial finite element code Abaqus/Explicit, incorporating intralaminar and interlaminar failure modes by adequate material models and cohesive interfaces. Validation studies were performed step by step using quasi-static tensile test data and low velocity impact test data. Finally, high velocity impact simulations with a metallic rim fragment were performed for several load cases involving different impact angles, impactor rotation and pre-stress. The numerical rim release analysis turned out to be an efficient approach in the development process of such composite structures and for the identification of structural damage and worst case impact loading scenarios.

  4. The Communication Link and Error ANalysis (CLEAN) simulator

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.; Crowe, Shane

    1993-01-01

    During the period July 1, 1993 through December 30, 1993, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed and include: (1) Soft decision Viterbi decoding; (2) node synchronization for the Soft decision Viterbi decoder; (3) insertion/deletion error programs; (4) convolutional encoder; (5) programs to investigate new convolutional codes; (6) pseudo-noise sequence generator; (7) soft decision data generator; (8) RICE compression/decompression (integration of RICE code generated by Pen-Shu Yeh at Goddard Space Flight Center); (9) Markov Chain channel modeling; (10) percent complete indicator when a program is executed; (11) header documentation; and (12) help utility. The CLEAN simulation tool is now capable of simulating a very wide variety of satellite communication links including the TDRSS downlink with RFI. The RICE compression/decompression schemes allow studies to be performed on error effects on RICE decompressed data. The Markov Chain modeling programs allow channels with memory to be simulated. Memory results from filtering, forward error correction encoding/decoding, differential encoding/decoding, channel RFI, nonlinear transponders and from many other satellite system processes. Besides the development of the simulation, a study was performed to determine whether the PCI provides a performance improvement for the TDRSS downlink. RFI with several duty cycles exists on the TDRSS downlink. We conclude that the PCI does not improve performance for any of these interferers except possibly one which occurs for the TDRS East. Therefore, the usefulness of the PCI is a function of the time spent transmitting data to the WSGT through the TDRS East transponder.
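
    As an illustration of the kind of channel memory mentioned above, the sketch below implements the classic two-state Gilbert-Elliott Markov channel (Python; all parameter values are arbitrary).

        import random

        def gilbert_elliott(n_bits, p_g2b=0.01, p_b2g=0.1,
                            ber_good=1e-6, ber_bad=1e-2, seed=1):
            # Two-state Markov channel: a "good" and a "bad" state with different
            # bit-error rates; the state transitions give the channel the memory
            # that an i.i.d. error model cannot capture.
            rng = random.Random(seed)
            state, errors = "G", []
            for _ in range(n_bits):
                errors.append(rng.random() < (ber_good if state == "G" else ber_bad))
                if rng.random() < (p_g2b if state == "G" else p_b2g):
                    state = "B" if state == "G" else "G"
            return errors

        errs = gilbert_elliott(100_000)
        print("overall BER:", sum(errs) / len(errs))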

  5. Development and Implementation of Dynamic Scripts to Execute Cycled GSI/WRF Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Xuanli; Watson, Leela

    2014-01-01

    The Weather Research and Forecasting (WRF) numerical weather prediction (NWP) model and Gridpoint Statistical Interpolation (GSI) data assimilation (DA) are the operational systems that make up the North American Mesoscale (NAM) model and the NAM Data Assimilation System (NDAS) analysis used by National Weather Service forecasters. The Developmental Testbed Center (DTC) manages and distributes the code for the WRF and GSI, but it is up to individual researchers to link the systems together and write scripts to run the systems, which can take considerable time for those not familiar with the code. The objective of this project is to develop and disseminate a set of dynamic scripts that mimic the unique cycling configuration of the operational NAM to enable researchers to develop new modeling and data assimilation techniques that can be easily transferred to operations. The current version of the SPoRT GSI/WRF Scripts (v3.0.1) is compatible with WRF v3.3 and GSI v3.0.
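
    Conceptually, a cycled configuration is a loop in which each forecast becomes the next cycle's analysis background. The sketch below shows the shape of such a driver (Python; the executable names and flags are placeholders, not the real GSI/WRF command lines).

        import subprocess
        from datetime import datetime, timedelta

        def run(cmd):
            print("running:", " ".join(cmd))
            # subprocess.run(cmd, check=True)  # enable on a system with the executables

        cycle = datetime(2014, 1, 1, 0)
        background = "wrfinput_d01"              # cold-start background
        for _ in range(4):                       # four 6-hourly cycles
            stamp = cycle.strftime("%Y%m%d%H")
            run(["gsi.exe", "--background", background, "--obs", f"obs_{stamp}.bufr"])
            run(["wrf.exe", "--start", stamp, "--hours", "6"])
            background = f"wrfout_d01_{stamp}"   # forecast becomes the next background
            cycle += timedelta(hours=6)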

  6. A Monte Carlo Simulation of Prompt Gamma Emission from Fission Fragments

    NASA Astrophysics Data System (ADS)

    Regnier, D.; Litaize, O.; Serot, O.

    2013-03-01

    The prompt fission gamma spectra and multiplicities are investigated with the Monte Carlo code FIFRELIN, developed at the Cadarache CEA research center. Knowing the fully accelerated fragment properties, their de-excitation is simulated through a cascade of neutron, gamma and/or electron emissions. This paper presents the recent developments in the FIFRELIN code and the results obtained on the spontaneous fission of 252Cf. Concerning the simulation of decay cascades, a full Hauser-Feshbach model is compared with a previous one using a Weisskopf spectrum for neutron emission. Particular attention is paid to the treatment of the neutron/gamma competition. Calculations performed using different level density and gamma strength function models show significant discrepancies in the slope of the gamma spectra at high energy. The underestimation of the prompt gamma spectra obtained regardless of our de-excitation cascade modeling choice is discussed. This discrepancy is probably linked to an underestimation of the post-neutron fragments' spin in our calculation.
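
    As a small aside on the Weisskopf option: energies with density f(E) proportional to E*exp(-E/T) are Gamma(2, T)-distributed, so they can be sampled as the sum of two exponential deviates (Python sketch; the temperature value is arbitrary).

        import math
        import random

        def sample_weisskopf(temperature_mev, rng):
            # f(E) ~ E * exp(-E/T) is the Gamma(2, T) density, and the sum of two
            # exponential deviates -T*ln(u1) - T*ln(u2) is Gamma(2, T)-distributed.
            return -temperature_mev * math.log(rng.random() * rng.random())

        rng = random.Random(0)
        T = 1.0  # fragment temperature in MeV, illustrative only
        samples = [sample_weisskopf(T, rng) for _ in range(100_000)]
        print("mean energy:", sum(samples) / len(samples), "(theory: 2T)")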

  7. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK

    PubMed Central

    2014-01-01

    Background: Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system’s set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This “code-based” approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. Results: As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. Conclusions: The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts. PMID:24725437

  8. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK.

    PubMed

    Wang, Kaier; Steyn-Ross, Moira L; Steyn-Ross, D Alistair; Wilson, Marcus T; Sleigh, Jamie W; Shiraishi, Yoichi

    2014-04-11

    Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system's set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This "code-based" approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts.
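
    For readers without Matlab or Simulink, the same tutorial starting point can be reproduced in a few lines of Python (a stand-in for the paper's code-based path, not a reproduction of it).

        import numpy as np
        from scipy.integrate import solve_ivp

        def van_der_pol(t, y, mu=1.0):
            # Classical van der Pol oscillator x'' - mu*(1 - x^2)*x' + x = 0,
            # written as the first-order system y = [x, v].
            x, v = y
            return [v, mu * (1.0 - x ** 2) * v - x]

        sol = solve_ivp(van_der_pol, (0.0, 20.0), [2.0, 0.0], max_step=0.05)
        print("final state:", sol.y[:, -1])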

  9. Planning U.S. General Purpose Forces: The Theater Nuclear Forces

    DTIC Science & Technology

    1977-01-01

    usefulness in combat. All U.S. nuclear weapons deployed in Europe are fitted with Permissive Action Links (PAL), coded devices designed to impede...may be proposed. The Standard Missile 2, the Harpoon missile, the Mk48 torpedo, and the SUBROC anti-submarine rocket are all being considered for...Permissive Action Link. A coded device attached to nuclear weapons deployed abroad that impedes the unauthorized arming or firing of the weapon. Pershing

  10. In-orbit verification of small optical transponder (SOTA): evaluation of satellite-to-ground laser communication links

    NASA Astrophysics Data System (ADS)

    Takenaka, Hideki; Koyama, Yoshisada; Akioka, Maki; Kolev, Dimitar; Iwakiri, Naohiko; Kunimori, Hiroo; Carrasco-Casado, Alberto; Munemasa, Yasushi; Okamoto, Eiji; Toyoshima, Morio

    2016-03-01

    Research and development of space optical communications is conducted in the National Institute of Information and Communications Technology (NICT). The NICT developed the Small Optical TrAnsponder (SOTA), which was embarked on a 50kg-class satellite and launched into a low earth orbit (LEO). The space-to-ground laser communication experiments have been conducted with the SOTA. Atmospheric turbulence causes signal fadings and becomes an issue to be solved in satellite-to-ground laser communication links. Therefore, as error-correcting functions, a Reed-Solomon (RS) code and a Low-Density Generator Matrix (LDGM) code are implemented in the communication system onboard the SOTA. In this paper, we present the in-orbit verification results of SOTA including the characteristic of the functions, the communication performance with the LDGM code via satellite-to-ground atmospheric paths, and the link budget analysis and the comparison between theoretical and experimental results.

  11. Computer algorithm for coding gain

    NASA Technical Reports Server (NTRS)

    Dodd, E. E.

    1974-01-01

    Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
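
    Coding gain, the reduction in required Eb/N0 at a fixed bit error probability, can be made concrete with a short numerical sketch (Python; the flat 5.2 dB decoder curve is an invented stand-in for the empirical performance data the algorithm uses).

        import math

        def ber_bpsk(ebno_db):
            # Uncoded coherent BPSK bit-error probability, Q(sqrt(2*Eb/N0)).
            return 0.5 * math.erfc(math.sqrt(10 ** (ebno_db / 10)))

        def required_ebno(ber_target, ber_fn, lo=-2.0, hi=20.0):
            # Bisect for the Eb/N0 (in dB) at which ber_fn falls to ber_target.
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if ber_fn(mid) > ber_target else (lo, mid)
            return 0.5 * (lo + hi)

        uncoded = required_ebno(1e-5, ber_bpsk)
        coded = required_ebno(1e-5, lambda db: ber_bpsk(db + 5.2))  # invented code
        print(f"coding gain at BER 1e-5: {uncoded - coded:.2f} dB")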

  12. A Strategy for Reusing the Data of Electronic Medical Record Systems for Clinical Research.

    PubMed

    Matsumura, Yasushi; Hattori, Atsushi; Manabe, Shiro; Tsuda, Tsutomu; Takeda, Toshihiro; Okada, Katsuki; Murata, Taizo; Mihara, Naoki

    2016-01-01

    There is a great need to reuse data stored in electronic medical record (EMR) databases for clinical research. We previously reported the development of a system in which progress notes and case report forms (CRFs) were simultaneously recorded using a template in the EMR in order to exclude redundant data entry. To make the data collection process more efficient, we are developing a system in which data originally stored in the EMR database can be populated within a frame in a template. We developed interface plugin modules that retrieve data from the databases of other EMR applications. A universal keyword written in a template master is converted to a local code using a data conversion table, and the objective data are then retrieved from the corresponding database. The template element data, which are entered by a template, are stored in the template element database. To retrieve data entered by other templates, the objective data are designated by the template element code with the template code, or by the concept code if one is written for the element. When the application systems in the EMR generate documents, they also generate a PDF file and a corresponding document profile XML, which includes important data, and send them to the document archive server and the data sharing server, respectively. In the data sharing server, each datum is represented as an item with an item code, a document class code, and its value. By linking a concept code to an item identifier, the objective data can be retrieved by designating a concept code. We employed a flexible strategy in which a unique identifier for a hospital is initially attached to all of the data that the hospital generates. The identifier is secondarily linked with concept codes. Data that are not linked with a concept code can also be retrieved using the hospital's unique identifier. This strategy makes it possible to reuse any of a hospital's data.
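
    A toy illustration of the conversion-table step follows (Python; every keyword, code, and value is invented).

        # A universal keyword from the template master is mapped to each
        # hospital's local code, so one template works against different EMRs.
        CONVERSION_TABLE = {
            ("hemoglobin", "hospital_A"): "LAB0042",
            ("hemoglobin", "hospital_B"): "HGB-01",
        }

        LOCAL_DB = {  # stand-in for each hospital's EMR database
            "hospital_A": {"LAB0042": 13.5},
            "hospital_B": {"HGB-01": 12.8},
        }

        def fetch(universal_keyword, hospital):
            local_code = CONVERSION_TABLE[(universal_keyword, hospital)]
            return LOCAL_DB[hospital][local_code]

        print(fetch("hemoglobin", "hospital_A"))  # -> 13.5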

  13. Conceptual Underpinnings of the Quality of Life in Neurological Disorders (Neuro-QoL): Comparisons of Core Sets for Stroke, Multiple Sclerosis, Spinal Cord Injury, and Traumatic Brain Injury.

    PubMed

    Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W

    2018-04-03

    To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI), using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Setting: three academic centers. Participants: none. Interventions: none. Four summary linkage indicators proposed by MacDermid et al. were estimated to compare the content coverage between the Neuro-QoL and the ICF codes of the Core Sets for MS, stroke, SCI, and TBI. The Neuro-QoL represented 20% to 30% of Core Set codes for the different conditions; more codes in the Core Sets for MS (29%), stroke (28%), and TBI (28%) were covered than for SCI in the long-term (20%) and early postacute (19%) contexts. The Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators found that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers the activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
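
    The mechanics of such summary linkage indicators reduce to set arithmetic. A toy version follows (Python; the codes are invented and the exact MacDermid et al. definitions are not reproduced here).

        # Fraction of Core Set codes covered by the instrument, and fraction of
        # instrument items that link to any Core Set code.
        core_set = {"b130", "b152", "d450", "d550", "e310"}
        item_links = {            # instrument item -> ICF codes it maps to
            "item1": {"b152"},
            "item2": {"b152", "d450"},
            "item3": set(),       # an item with no Core Set linkage
        }

        covered = set().union(*item_links.values()) & core_set
        coverage = 100 * len(covered) / len(core_set)
        linked = sum(1 for codes in item_links.values() if codes & core_set)
        absolute = 100 * linked / len(item_links)
        print(f"Core Set coverage: {coverage:.0f}%, items linked: {absolute:.0f}%")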

  14. The availability of web sites offering to sell opioid medications without prescriptions.

    PubMed

    Forman, Robert F; Woody, George E; McLellan, Thomas; Lynch, Kevin G

    2006-07-01

    This study was designed to determine the availability of web sites offering to sell opioid medications without prescriptions. Forty-seven Internet searches were conducted with a variety of opioid medication terms, including "codeine," "no prescription Vicodin," and "OxyContin." Two independent raters examined the links generated in each search and resolved any coding disagreements. The resulting links were coded as "no prescription web sites" (NPWs) if they offered to sell opioid medications without prescriptions. In searches with terms such as "no prescription codeine" and "Vicodin," over 50% of the links obtained were coded as "NPWs." The proportion of links yielding NPWs was greater when the phrase "no prescription" was added to the opioid term. More than 300 opioid NPWs were identified and entered into a database. Three national drug-use monitoring studies have cited significant increases in prescription opioid use over the past 5 years, particularly among young people. The emergence of NPWs introduces a new vector for unregulated access to opioids. Research is needed to determine the effect of NPWs on prescription opioid use initiation, misuse, and dependence.

  15. MatProps: Material Properties Database and Associated Access Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durrenberger, J K; Becker, R C; Goto, D M

    2007-08-13

    Coefficients for analytic constitutive and equation of state models (EOS), which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, and include Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Gruniesen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes today at Livermore. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutive models have been developed and implemented into ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available. Therefore incorporating these coefficients with those of the legacy models into a portable database that could be shared amongst codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library and our DOE/ASC code ALE3D (Nichols et al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix), and includes material parameters in the following categories--flow stress, shear modulus, strength, damage, and equation of state. Future versions of the Matprop database and access library will include the ability to read and write material descriptions that can be exchanged between codes. It will also include an ability to do unit changes, i.e. have the library return parameters in user-specified unit systems. In addition to these, additional material categories can be added (e.g., phase change kinetics, etc.). The Matprop database and access library is part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models to available experimental data. Together with Matprop, these tools create a suite of capabilities that provide state-of-the-art models and parameters for those models to integrated simulation codes. This document is broken into several appendices. Appendix A contains a code example to retrieve several material coefficients. Appendix B contains the API for the Matprop data access library. Appendix C contains a list of the material names and model types currently available in the Matprop database. Appendix D contains a list of the parameter names for the currently recognized model types. Appendix E contains a full xml description of the material Tantalum.
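
    The report's appendices define the real API; purely as a shape illustration, a MatProp-style lookup might reduce to the following (Python; the function name and JSON layout are hypothetical, and the tantalum numbers are approximate textbook values included for illustration only).

        import json

        DATABASE = json.loads("""
        {"Tantalum": {"steinberg_guinan": {"G0_GPa": 69.0, "Y0_GPa": 0.77},
                      "eos_mie_gruneisen": {"rho0_g_cc": 16.69, "gamma0": 1.6}}}
        """)

        def get_coefficients(material, model):
            # Return the coefficient dictionary for one material/model pair.
            return DATABASE[material][model]

        print(get_coefficients("Tantalum", "steinberg_guinan")["G0_GPa"])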

  16. Higgs mass prediction in the MSSM at three-loop level in a pure DR-bar context

    NASA Astrophysics Data System (ADS)

    Harlander, Robert V.; Klappert, Jonas; Voigt, Alexander

    2017-12-01

    The impact of the three-loop effects of order α_t α_s^2 on the mass of the light CP-even Higgs boson in the MSSM is studied in a pure DR-bar context. For this purpose, we implement the results of Kant et al. (JHEP 08:104, 2010) into the C++ module Himalaya and link it to FlexibleSUSY, a Mathematica and C++ package to create spectrum generators for BSM models. The three-loop result is compared to the fixed-order two-loop calculations of the original FlexibleSUSY and of FeynHiggs, as well as to the result based on an EFT approach. Aside from the expected reduction of the renormalization scale dependence with respect to the lower-order results, we find that the three-loop contributions significantly reduce the difference from the EFT prediction in the TeV-region of the SUSY scale M_S. Himalaya can be linked also to other two-loop DR-bar codes, thus allowing for the elevation of these codes to the three-loop level.

  17. Evidence for Phex haploinsufficiency in murine X-linked hypophosphatemia.

    PubMed

    Wang, L; Du, L; Ecarot, B

    1999-04-01

    Mutations in the PHEX gene (phosphate-regulating gene with homology to endopeptidases on the X-chromosome) are responsible for X-linked hypophosphatemia (HYP). We previously reported the full-length coding sequence of murine Phex cDNA and provided evidence of Phex expression in bone and tooth. Here, we report the cloning of the entire 3.5-kb 3'UTR of the Phex gene, yielding a total of 6248 bp for the Phex transcript. Southern blot and RT-PCR analyses revealed that the 3' end of the coding sequence and the 3'UTR of the Phex gene, spanning exons 16 to 22, are deleted in Hyp, the mouse model for HYP. Northern blot analysis of bone revealed lack of expression of stable Phex mRNA from the mutant allele and expression of Phex transcripts from the wild-type allele in Hyp heterozygous females. Expression of the Phex protein in heterozygotes was confirmed by Western analysis with antibodies raised against a COOH-terminal peptide of the mouse Phex protein. Taken together, these results indicate that the dominant pattern of Hyp inheritance in mice is due to Phex haploinsufficiency.

  18. 3D Progressive Damage Modeling for Laminated Composite Based on Crack Band Theory and Continuum Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Pineda, Evan J.; Ranatunga, Vipul; Smeltzer, Stanley S.

    2015-01-01

    A simple continuum damage mechanics (CDM) based 3D progressive damage analysis (PDA) tool for laminated composites was developed and implemented as a user-defined material subroutine to link with a commercially available explicit finite element code. This PDA tool uses linear lamina properties from standard tests, predicts damage initiation with an easy-to-implement Hashin-Rotem failure criterion, and, in the damage evolution phase, evaluates the degradation of material properties based on the crack band theory and traction-separation cohesive laws. It follows Matzenmiller et al.'s formulation to incorporate the degrading material properties into the damaged stiffness matrix. Since nonlinear shear and matrix stress-strain relations are not implemented, correction factors are used to slow the reduction of the damaged shear stiffness terms, reflecting the effect of these nonlinearities on the laminate strength predictions. This CDM based PDA tool is implemented as a user-defined material (VUMAT) to link with the Abaqus/Explicit code. Strength predictions obtained using this VUMAT are correlated with test data for a set of notched specimens under tension and compression loads.

  19. Origins of tmRNA: the missing link in the birth of protein synthesis?

    PubMed

    Macé, Kevin; Gillet, Reynald

    2016-09-30

    The RNA world hypothesis refers to the early period on earth in which RNA was central in assuring both genetic continuity and catalysis. The end of this era coincided with the development of the genetic code and protein synthesis, symbolized by the appearance of the first non-random messenger RNA (mRNA). Modern transfer-messenger RNA (tmRNA) is a unique hybrid molecule which has the properties of both mRNA and transfer RNA (tRNA). It acts as a key molecule during trans-translation, a major quality control pathway of modern bacterial protein synthesis. tmRNA shares many common characteristics with ancestral RNA. Here, we present a model in which proto-tmRNAs were the first molecules on earth to support non-random protein synthesis, explaining the emergence of the early genetic code. In this way, proto-tmRNA could be the missing link between the first mRNA and tRNA molecules and modern ribosome-mediated protein synthesis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
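
    The core metamodeling loop, run a designed set of expensive analyses and then fit a cheap surrogate, fits in a few lines (Python; a polynomial response surface stands in for the kriging and neural-network alternatives the paper reviews).

        import numpy as np

        def expensive_code(x):
            # Stand-in for a detailed analysis code with non-trivial run time.
            return np.sin(3 * x) + 0.5 * x ** 2

        # Design of experiments: a handful of runs of the "expensive" code...
        x_doe = np.linspace(-1.0, 1.0, 7)
        y_doe = expensive_code(x_doe)

        # ...then a cheap polynomial response surface fitted to those runs.
        metamodel = np.poly1d(np.polyfit(x_doe, y_doe, deg=4))

        x_test = 0.3
        print("code:", expensive_code(x_test), "metamodel:", metamodel(x_test))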

  1. Are V1 Simple Cells Optimized for Visual Occlusions? A Comparative Study

    PubMed Central

    Bornschein, Jörg; Henniges, Marc; Lücke, Jörg

    2013-01-01

    Simple cells in primary visual cortex were famously found to respond to low-level image components such as edges. Sparse coding and independent component analysis (ICA) emerged as the standard computational models for simple cell coding because they linked their receptive fields to the statistics of visual stimuli. However, a salient feature of image statistics, occlusions of image components, is not considered by these models. Here we ask if occlusions have an effect on the predicted shapes of simple cell receptive fields. We use a comparative approach to answer this question and investigate two models for simple cells: a standard linear model and an occlusive model. For both models we simultaneously estimate optimal receptive fields, sparsity and stimulus noise. The two models are identical except for their component superposition assumption. We find the image encoding and receptive fields predicted by the models to differ significantly. While both models predict many Gabor-like fields, the occlusive model predicts a much sparser encoding and high percentages of ‘globular’ receptive fields. This relatively new center-surround type of simple cell response has been observed since reverse correlation came into use in experimental studies. While high percentages of ‘globular’ fields can be obtained using specific choices of sparsity and overcompleteness in linear sparse coding, no or only low proportions are reported in the vast majority of studies on linear models (including all ICA models). Likewise, for the here investigated linear model and optimal sparsity, only low proportions of ‘globular’ fields are observed. In comparison, the occlusive model robustly infers high proportions and can match the experimentally observed high proportions of ‘globular’ fields well. Our computational study, therefore, suggests that ‘globular’ fields may be evidence for an optimal encoding of visual occlusions in primary visual cortex. PMID:23754938

  2. Final report on LDRD project : coupling strategies for multi-physics applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Matthew Morgan; Moffat, Harry K.; Carnes, Brian

Many current and future modeling applications at Sandia, including ASC milestones, will critically depend on the simultaneous solution of vastly different physical phenomena. Issues due to code coupling are often not addressed, understood, or even recognized. The objectives of this LDRD have been both theoretical and code development. We show that we have provided a fundamental analysis of coupling, i.e., when strong coupling vs. a successive substitution strategy is needed. We have enabled the implementation of tighter coupling strategies through additions to the NOX and Sierra code suites to make coupling strategies available now, leveraging existing functionality to do this. Specifically, we have built into NOX the capability to handle fully coupled simulations from multiple codes, as well as Jacobian-free Newton-Krylov simulations that link multiple applications. We show how this capability may be accessed from within the Sierra Framework as well as from outside of Sierra. The critical impact from this LDRD is that we have shown how, and have delivered strategies for, enabling strong Newton-based coupling while respecting the modularity of existing codes. This will facilitate the use of these codes in a coupled manner to solve multi-physics applications.
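
    The Jacobian-free Newton-Krylov pattern can be sketched independently of NOX or Sierra (Python, with SciPy standing in for the Krylov machinery; the two residuals are toy stand-ins for separate physics codes).

        import numpy as np
        from scipy.optimize import newton_krylov

        def coupled_residual(x):
            # Two "codes" coupled through their unknowns: a temperature T and a
            # displacement u that each depend on the other. Solving the
            # concatenated residual treats the pair as one strongly coupled system.
            T, u = x
            r1 = T - 300.0 - 5.0 * u           # "thermal code" residual
            r2 = u - 0.001 * (T - 290.0) ** 2  # "mechanical code" residual
            return np.array([r1, r2])

        solution = newton_krylov(coupled_residual, np.array([310.0, 0.1]))
        print("T, u =", solution)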

  3. Paying a price: culture, trust, and negotiation consequences.

    PubMed

    Gunia, Brian C; Brett, Jeanne M; Nandkeolyar, Amit K; Kamdar, Dishan

    2011-07-01

    Three studies contrasting Indian and American negotiators tested hypotheses derived from theory proposing why there are cultural differences in trust and how cultural differences in trust influence negotiation strategy. Study 1 (a survey) documented that Indian negotiators trust their counterparts less than American negotiators. Study 2 (a negotiation simulation) linked American and Indian negotiators' self-reported trust and strategy to their insight and joint gains. Study 3 replicated and extended Study 2 using independently coded negotiation strategy data, allowing for stronger causal inference. Overall, the strategy associated with Indian negotiators' reluctance to extend interpersonal (as opposed to institutional) trust produced relatively poor outcomes. Our data support an expanded theoretical model of negotiation, linking culture to trust, strategies, and outcomes.

  4. Soldier communication net for the 21st century digitized battlespace

    NASA Astrophysics Data System (ADS)

    Mu, Libo; Zhang, Yutian

    1999-07-01

    This paper presents a soldier communication net scheme that can survive and operate in the 21st century battlefield environment. First, it analyzes the features, needs, and functions of the soldier communication net in the 21st century battlefield environment. Second, it presents a layered model of the soldier communication net, derived from the OSI theory, and discusses the design of the three layers: the link layer, the link controller layer, and the input/output applications layer. Third, it presents key technical discussions concerning direct-sequence spread-spectrum communication, coding/decoding, and low power consumption. Finally, it concludes that a spread-spectrum time-division system is the best scheme for the soldier communication net.

  5. QTCM Software Documentation. Volume 1. Programmers’s Manual

    DTIC Science & Technology

    1990-11-01

    technologies are implemented. Since the PSNs are comprised of private companies, the OMNCS has little direct control over how they operate and the... control equipment at each node to prevent any congestion at the switches. Therefore, all call blocking is modeled to occur at the trunks rather than at the...main program is linked with separately compiled subroutines that can be grouped into seven distinct functional areas as follows: Control code File

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Z.; Zweibaum, N.; Shao, M.

    The University of California, Berkeley (UCB) is performing thermal hydraulics safety analysis to develop the technical basis for design and licensing of fluoride-salt-cooled, high-temperature reactors (FHRs). FHR designs investigated by UCB use natural circulation for emergency, passive decay heat removal when normal decay heat removal systems fail. The FHR advanced natural circulation analysis (FANCY) code has been developed for assessment of passive decay heat removal capability and safety analysis of these innovative system designs. The FANCY code uses a one-dimensional, semi-implicit scheme to solve for pressure-linked mass, momentum and energy conservation equations. Graph theory is used to automatically generate a staggered mesh for complicated pipe network systems. Heat structure models have been implemented for three types of boundary conditions (Dirichlet, Neumann and Robin boundary conditions). Heat structures can be composed of several layers of different materials, and are used for simulation of heat structure temperature distribution and heat transfer rate. Control models are used to simulate sequences of events or trips of safety systems. A proportional-integral controller is also used to automatically make thermal hydraulic systems reach desired steady state conditions. A point kinetics model is used to model reactor kinetics behavior with temperature reactivity feedback. The underlying large sparse linear systems in these models are efficiently solved by using direct and iterative solvers provided by the SuperLU code on high performance machines. Input interfaces are designed to increase the flexibility of simulation for complicated thermal hydraulic systems. In conclusion, this paper mainly focuses on the methodology used to develop the FANCY code, and safety analysis of the Mark 1 pebble-bed FHR under development at UCB is performed.

  7. Design and System Implications of a Family of Wideband HF Data Waveforms

    DTIC Science & Technology

    2010-09-01

    code rates (i.e., 8/9, 9/10) will be used to attain the highest data rates for surface wave links. Very high puncturing of convolutional codes can...Communication Links", Edition 1, North Atlantic Treaty Organization, 2009. [14] Yasuda, Y., Kashiki, K., Hirata, Y. "High-Rate Punctured Convolutional Codes ...length 7 convolutional code that has been used for over two decades in 110A. In addition, repetition coding and puncturing was

  8. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01

    The PEST protocol has been implemented in the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
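
    In miniature, the protocol is text substitution on the way in and pattern extraction on the way out. A sketch follows (Python; the markers, parameter names, and file contents are illustrative and do not follow actual PEST syntax).

        import re

        template = "PERMEABILITY  @k@\nPOROSITY      @phi@\n"

        def write_input(template, params):
            # Push updated parameter values into the model input file.
            for name, value in params.items():
                template = template.replace(f"@{name}@", f"{value:.6e}")
            return template

        def read_output(output, marker):
            # Pull the first number following the marker out of the model output.
            return float(re.search(marker + r"\s*([-+0-9.eE]+)", output).group(1))

        print(write_input(template, {"k": 1.2e-14, "phi": 0.35}))
        print(read_output("... FLOW RATE = 3.4e-3 kg/s ...", "FLOW RATE ="))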

  9. The introspective may achieve more: Enhancing existing Geoscientific models with native-language emulated structural reflection

    NASA Astrophysics Data System (ADS)

    Ji, Xinye; Shen, Chaopeng

    2018-01-01

    Geoscientific models manage myriad and increasingly complex data structures as trans-disciplinary models are integrated. They often incur significant redundancy with cross-cutting tasks. Reflection, the ability of a program to inspect and modify its structure and behavior at runtime, is known as a powerful tool to improve code reusability, abstraction, and separation of concerns. Reflection is rarely adopted in high-performance Geoscientific models, especially with Fortran, where it was previously deemed implausible. Practical constraints of language and legacy often limit us to feather-weight, native-language solutions. We demonstrate the usefulness of structural-reflection-emulating, dynamically linked metaObjects, gd. We show real-world examples including data structure self-assembly, effortless input/output (IO) and upgrade to parallel I/O, recursive actions and batch operations. We share gd and a derived module that reproduces MATLAB-like structure in Fortran and C++. We suggest that both a gd representation and a Fortran-native representation are maintained to access the data, each for separate purposes. Embracing emulated reflection allows generically-written codes that are highly re-usable across projects.

  10. Modulation/demodulation techniques for satellite communications. Part 1: Background

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1981-01-01

    Basic characteristics of digital data transmission systems described include the physical communication links, the notion of bandwidth, FCC regulations, and performance measurements such as bit rates, bit error probabilities, throughputs, and delays. The error probability performance and spectral characteristics of various modulation/demodulation techniques commonly used or proposed for use in radio and satellite communication links are summarized. Forward error correction with block or convolutional codes is also discussed along with the important coding parameter, channel cutoff rate.

  11. Cross-Layer Design for Video Transmission over Wireless Rician Slow-Fading Channels Using an Adaptive Multiresolution Modulation and Coding Scheme

    NASA Astrophysics Data System (ADS)

    Pei, Yong; Modestino, James W.

    2007-12-01

    We describe a multilayered video transport scheme for wireless channels capable of adapting to channel conditions in order to maximize end-to-end quality of service (QoS). This scheme combines a scalable H.263+ video source coder with unequal error protection (UEP) across layers. The UEP is achieved by employing different channel codes together with a multiresolution modulation approach to transport the different priority layers. Adaptivity to channel conditions is provided through a joint source-channel coding (JSCC) approach which attempts to jointly optimize the source and channel coding rates together with the modulation parameters to obtain the maximum achievable end-to-end QoS for the prevailing channel conditions. In this work, we model the wireless links as a slow-fading Rician channel where the channel conditions can be described in terms of the channel signal-to-noise ratio (SNR) and the ratio of specular-to-diffuse energy. The multiresolution modulation/coding scheme consists of binary rate-compatible punctured convolutional (RCPC) codes used together with nonuniform phase-shift keyed (PSK) signaling constellations. Results indicate that this adaptive JSCC scheme employing scalable video encoding together with a multiresolution modulation/coding approach leads to significant improvements in delivered video quality for specified channel conditions. In particular, the approach results in considerably improved graceful degradation properties for decreasing channel SNR.

  12. Representational geometry: integrating cognition, computation, and the brain

    PubMed Central

    Kriegeskorte, Nikolaus; Kievit, Rogier A.

    2013-01-01

    The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. PMID:23876494
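
    A minimal sketch of the workflow the abstract describes: compute representational distance matrices (RDMs) from two sets of response patterns and compare them by rank correlation. The data here are random stand-ins.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        brain_patterns = rng.normal(size=(12, 100))  # 12 stimuli x 100 channels
        model_patterns = rng.normal(size=(12, 50))   # model features, same stimuli

        brain_rdm = pdist(brain_patterns, metric="correlation")  # condensed RDM
        model_rdm = pdist(model_patterns, metric="correlation")

        rho, p = spearmanr(brain_rdm, model_rdm)  # representational similarity
        print(f"model-brain RDM correlation: rho={rho:.3f}, p={p:.3f}")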

  13. Optimal Near-Hitless Network Failure Recovery Using Diversity Coding

    ERIC Educational Resources Information Center

    Avci, Serhat Nazim

    2013-01-01

    Link failures in wide area networks are common and cause significant data losses. Mesh-based protection schemes offer high capacity efficiency but they are slow, require complex signaling, and are unstable. Diversity coding is a proactive coding-based recovery technique which offers near-hitless (sub-ms) restoration with a competitive spare capacity…
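
    A hedged sketch of the core mechanism behind diversity coding: a bitwise XOR parity stream sent on a spare link lets the receiver rebuild any single failed data link immediately, with no rerouting or signaling. The stream contents are toy values.

        from functools import reduce

        def parity(streams):
            """XOR of all data streams; transmitted on the protection link."""
            return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*streams))

        def recover(failed_index, streams, parity_stream):
            """Rebuild a failed stream from the survivors plus the parity."""
            survivors = [s for i, s in enumerate(streams) if i != failed_index]
            return parity(survivors + [parity_stream])

        links = [b"\x01\x02", b"\x10\x20", b"\x0a\x0b"]
        p = parity(links)
        assert recover(1, links, p) == links[1]   # link 1 lost, fully recovered
        print("recovered:", recover(1, links, p))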

  14. Quick Response (QR) Codes for Audio Support in Foreign Language Learning

    ERIC Educational Resources Information Center

    Vigil, Kathleen Murray

    2017-01-01

    This study explored the potential benefits and barriers of using quick response (QR) codes as a means by which to provide audio materials to middle-school students learning Spanish as a foreign language. Eleven teachers of Spanish to middle-school students created transmedia materials containing QR codes linking to audio resources. Students…

  15. A discriminative test among the different theories proposed to explain the origin of the genetic code: the coevolution theory finds additional support.

    PubMed

    Giulio, Massimo Di

    2018-05-19

    A discriminative statistical test among the different theories proposed to explain the origin of the genetic code is presented. The amino acids are gathered into polarity classes, the first expression of the physicochemical theory of the origin of the genetic code, and into biosynthetic classes, the corresponding expression of the coevolution theory; Fisher's exact test is then used to establish the significance of these classes within the genetic code table. By linking probabilities that express the statistical significance of these classes to the rows and columns of the genetic code, I was finally able to calculate a χ value for both the physicochemical theory and the coevolution theory, expressing the level of corroboration of each. The comparison between these two χ values showed that, in this strictly empirical analysis, the coevolution theory explains the origin of the genetic code better than the physicochemical theory does. Copyright © 2018 Elsevier B.V. All rights reserved.
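
    A worked sketch of the statistical machinery named above: Fisher's exact test on a 2x2 table asks whether members of an amino acid class cluster in a given row or column of the code table more often than chance allows. The counts are hypothetical, not the paper's data.

        from scipy.stats import fisher_exact

        #                in column   not in column
        table = [[ 9,  7],   # amino acids inside the biosynthetic class
                 [ 7, 41]]   # amino acids outside the class
        odds_ratio, p_value = fisher_exact(table, alternative="greater")
        print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.4f}")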

  16. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink

    PubMed Central

    Olier, Ivan; Springate, David A.; Ashcroft, Darren M.; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    Background: The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness (SMI) as an example. Methods: We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. Results: We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. Conclusion: We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists. PMID:26918439

  17. Surface reconstruction, figure-ground modulation, and border-ownership.

    PubMed

    Jeurissen, Danique; Self, Matthew W; Roelfsema, Pieter R

    2013-01-01

    The Differentiation-Integration for Surface Completion (DISC) model aims to explain the reconstruction of visual surfaces. We find the model a valuable contribution to our understanding of figure-ground organization. We point out that, next to border-ownership, neurons in visual cortex code whether surface elements belong to a figure or the background and that this is influenced by attention. We furthermore suggest that there must be strong links between object recognition and figure-ground assignment in order to resolve the status of interior contours. Incorporation of these factors in neurocomputational models will further improve our understanding of surface reconstruction, figure-ground organization, and border-ownership.

  18. Integrated Modelling - the next steps (Invited)

    NASA Astrophysics Data System (ADS)

    Moore, R. V.

    2010-12-01

    Integrated modelling (IM) has made considerable advances over the past decade but it has not yet been taken up as an operational tool in the way that its proponents had hoped. The reasons why will be discussed in Session U17. This talk will propose topics for a research and development programme and suggest an institutional structure which, together, could overcome the present obstacles. Their combined aim would be first to make IM into an operational tool useable by competent public authorities and commercial companies and, in time, to see it evolve into the modelling equivalent of Google Maps, something accessible and useable by anyone with a PC or an iPhone and an internet connection. In a recent study, a number of government agencies, water authorities and utilities applied integrated modelling to operational problems. While the project demonstrated that IM could be used in an operational setting and had benefit, it also highlighted the advances that would be required for its widespread uptake. These were: greatly improving the ease with which models could be a) made linkable, b) linked and c) run; developing a methodology for applying integrated modelling; developing practical options for calibrating and validating linked models; addressing the science issues that arise when models are linked; extending the range of modelling concepts that can be linked; enabling interface standards to pass uncertainty information; making the interface standards platform independent; extending the range of platforms to include those for high performance computing; developing the concept of modelling components as web services; separating simulation code from the model’s GUI, so that all the results from the linked models can be viewed through a single GUI; developing scenario management systems so that there is an audit trail of the version of each model and dataset used in each linked model run. In addition to the above, there is a need to build a set of integrated modelling demonstrations and to achieve a critical mass of accessible linkable models. These models will be spread across the world and there will need to be a search mechanism to find them. If the benefits of integrated modelling are to become widely available, ways will have to be found to automate and hide the linking process. All the above will require considerable resources. It is proposed to build these resources by creating an international community of practice based upon the open source model. The talk will expand upon the research and development required.

  19. Use of a Respondent-Generated Personal Code for Matching Anonymous Adolescent Surveys in Longitudinal Studies.

    PubMed

    Ripper, Lisa; Ciaravino, Samantha; Jones, Kelley; Jaime, Maria Catrina D; Miller, Elizabeth

    2017-06-01

    Research on sensitive and private topics relies heavily on self-reported responses. Social desirability bias may reduce the accuracy and reliability of self-reported responses. Anonymous surveys appear to improve the likelihood of honest responses. A challenge with prospective research is maintaining anonymity while linking individual surveys over time. We have tested a secret code method in which participants create their own code based on eight questions that are not expected to change. In an ongoing middle school trial, 95.7% of follow-up surveys are matched to a baseline survey after changing up to two code variables. The percentage matched improves by allowing up to four changes (99.7%). The use of a secret code as an anonymous identifier for linking baseline and follow-up surveys is feasible for use with adolescents. While developed for violence prevention research, this method may be useful with other sensitive health behavior research. Copyright © 2017 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
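
    A minimal sketch of the matching step, assuming each code is a tuple of eight answers: a follow-up record matches the nearest baseline code when at most the allowed number of elements differ. The example codes are fabricated.

        def distance(code_a, code_b):
            """Count how many of the eight code elements disagree."""
            return sum(a != b for a, b in zip(code_a, code_b))

        def match(followup, baselines, max_changes=2):
            """Return the closest baseline within the allowed changes, else None."""
            best = min(baselines, key=lambda b: distance(followup, b))
            return best if distance(followup, best) <= max_changes else None

        baselines = [("J", "S", "3", "B", "M", "R", "0", "7"),
                     ("A", "K", "9", "G", "T", "L", "2", "4")]
        followup = ("J", "S", "3", "B", "M", "R", "1", "7")   # one answer changed
        print(match(followup, baselines))   # matches the first baseline code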

  20. An introduction to QR Codes: linking libraries and mobile patrons.

    PubMed

    Hoy, Matthew B

    2011-01-01

    QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided.
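
    For illustration, generating such a code takes only a few lines with the third-party Python "qrcode" package (pip install qrcode[pil]); the catalog URL below is a placeholder.

        import qrcode

        # encode a catalog-record URL as a QR image for a shelf label or handout
        img = qrcode.make("https://example.org/library/catalog/record/12345")
        img.save("catalog_record_qr.png")
        print("QR code written to catalog_record_qr.png")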

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carothers, Christopher D.; Meredith, Jeremy S.; Blanco, Marc

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when executing both torus and dragonfly network models on up to 4K Blue Gene/Q nodes using 32K MPI ranks, and Durango also avoids the overheads and complexities associated with extreme-scale trace files.

  2. UFO - The Universal FEYNRULES Output

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Duhr, Claude; Fuks, Benjamin; Grellscheid, David; Mattelaer, Olivier; Reiter, Thomas

    2012-06-01

    We present a new model format for automatized matrix-element generators, the so-called Universal FEYNRULES Output (UFO). The format is universal in the sense that it features compatibility with more than one single generator and is designed to be flexible, modular and agnostic of any assumption such as the number of particles or the color and Lorentz structures appearing in the interaction vertices. Unlike other model formats where text files need to be parsed, the information on the model is encoded into a PYTHON module that can easily be linked to other computer codes. We then describe an interface for the MATHEMATICA package FEYNRULES that allows for an automatic output of models in the UFO format.
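
    A hedged mini-analog of that idea: the model lives as importable Python objects a generator can traverse directly, with no text parsing. The class and attribute names below are illustrative, not the actual UFO object library.

        class Particle:
            def __init__(self, name, pdg_code, spin, mass, charge):
                self.name, self.pdg_code = name, pdg_code
                self.spin, self.mass, self.charge = spin, mass, charge

        class Vertex:
            def __init__(self, particles, coupling, lorentz):
                self.particles = particles   # interacting fields
                self.coupling = coupling     # coupling-constant label
                self.lorentz = lorentz       # Lorentz structure label

        electron = Particle("e-", 11, spin=2, mass=0.000511, charge=-1)
        photon = Particle("a", 22, spin=3, mass=0.0, charge=0)
        qed_vertex = Vertex([electron, electron, photon], "ee", "FFV")

        # a generator linked against this module simply introspects the objects
        print([p.name for p in qed_vertex.particles], qed_vertex.coupling)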

  3. Connecting mirror neurons and forward models.

    PubMed

    Miall, R C

    2003-12-02

    Two recent developments in motor neuroscience are promising the extension of theoretical concepts from motor control towards cognitive processes, including human social interactions and understanding the intentions of others. The first of these is the discovery of what are now called mirror neurons, which code for both observed and executed actions. The second is the concept of internal models, and in particular recent proposals that forward and inverse models operate in paired modules. These two ideas will be briefly introduced, and a recent suggestion linking the two processes of mirroring and modelling will be described which may underlie our abilities for imitating actions, for cooperation between two actors, and possibly for communication via gesture and language.

  4. Modulation and coding for throughput-efficient optical free-space links

    NASA Technical Reports Server (NTRS)

    Georghiades, Costas N.

    1993-01-01

    Optical direct-detection systems are currently being considered for some high-speed inter-satellite links, where data-rates of a few hundred megabits per second are envisioned under power and pulsewidth constraints. In this paper we investigate the capacity, cutoff-rate and error-probability performance of uncoded and trellis-coded systems for various modulation schemes and under various throughput and power constraints. Modulation schemes considered are on-off keying (OOK), pulse-position modulation (PPM), overlapping PPM (OPPM) and multi-pulse (combinatorial) PPM (MPPM).
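
    As a worked illustration of one of the schemes above, M-ary PPM sends log2(M) bits per symbol as a single pulse in one of M time slots; this sketch is generic, not tied to the paper's coded systems.

        M = 8   # 8-PPM: 3 bits per symbol, one pulse per 8-slot frame

        def ppm_encode(bits):
            """Map each log2(M)-bit group to a one-hot frame of M slots."""
            k = M.bit_length() - 1   # bits per symbol (3 for M = 8)
            frames = []
            for i in range(0, len(bits), k):
                slot = int("".join(map(str, bits[i:i + k])), 2)
                frames.append([1 if s == slot else 0 for s in range(M)])
            return frames

        print(ppm_encode([1, 0, 1, 0, 0, 1]))   # pulses in slots 5 and 1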

  5. Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements

    NASA Astrophysics Data System (ADS)

    Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.

    2017-02-01

    Adopting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. We report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. We demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.

  6. Multi-Body Analysis of a Tiltrotor Configuration

    NASA Technical Reports Server (NTRS)

    Ghiringhelli, G. L.; Masarati, P.; Mantegazza, P.; Nixon, M. W.

    1997-01-01

    The paper describes the aeroelastic analysis of a tiltrotor configuration. The 1/5 scale wind tunnel semispan model of the V-22 tiltrotor aircraft is considered. The analysis is performed by means of a multi-body code, based on an original formulation. The differential equilibrium problem is stated in terms of first order differential equations. The equilibrium equations of every rigid body are written, together with the definitions of the momenta. The bodies are connected by kinematic constraints, applied in form of Lagrangian multipliers. Deformable components are mainly modelled by means of beam elements, based on an original finite volume formulation. Multi-disciplinary problems can be solved by adding user-defined differential equations. In the presented analysis the equations related to the control of the swash-plate of the model are considered. Advantages of a multi-body aeroelastic code over existing comprehensive rotorcraft codes include the exact modelling of the kinematics of the hub, the detailed modelling of the flexibility of critical hub components, and the possibility to simulate steady flight conditions as well as wind-up and maneuvers. The simulations described in the paper include: 1) the analysis of the aeroelastic stability, with particular regard to the proprotor/pylon instability that is peculiar to tiltrotors, 2) the determination of the dynamic behavior of the system and of the loads due to typical maneuvers, with particular regard to the conversion from helicopter to airplane mode, and 3) the stress evaluation in critical components, such as the pitch links and the conversion downstop spring.

  7. Estimating recharge rates with analytic element models and parameter estimation

    USGS Publications Warehouse

    Dripps, W.R.; Hunt, R.J.; Anderson, M.P.

    2006-01-01

    Quantifying the spatial and temporal distribution of recharge is usually a prerequisite for effective ground water flow modeling. In this study, an analytic element (AE) code (GFLOW) was used with a nonlinear parameter estimation code (UCODE) to quantify the spatial and temporal distribution of recharge using measured base flows as calibration targets. The ease and flexibility of AE model construction and evaluation make this approach well suited for recharge estimation. An AE flow model of an undeveloped watershed in northern Wisconsin was optimized to match median annual base flows at four stream gages for 1996 to 2000 to demonstrate the approach. Initial optimizations that assumed a constant distributed recharge rate provided good matches (within 5%) to most of the annual base flow estimates, but discrepancies of >12% at certain gages suggested that a single value of recharge for the entire watershed is inappropriate. Subsequent optimizations that allowed for spatially distributed recharge zones based on the distribution of vegetation types improved the fit and confirmed that vegetation can influence spatial recharge variability in this watershed. Temporally, the annual recharge values varied >2.5-fold between 1996 and 2000 during which there was an observed 1.7-fold difference in annual precipitation, underscoring the influence of nonclimatic factors on interannual recharge variability for regional flow modeling. The final recharge values compared favorably with more labor-intensive field measurements of recharge and results from other studies, supporting the utility of using linked AE-parameter estimation codes for recharge estimation. Copyright © 2005 The Author(s).
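
    A hedged sketch of that linked-code loop: a least-squares optimizer (standing in for UCODE) adjusts zonal recharge until simulated base flows match the gauged values, with a hypothetical linear response matrix standing in for a GFLOW run.

        import numpy as np
        from scipy.optimize import least_squares

        # hypothetical contribution of each recharge zone to each gage's base flow
        response = np.array([[12.0, 3.0],
                             [4.0, 9.0],
                             [6.0, 6.0],
                             [2.0, 11.0]])
        observed_baseflow = np.array([105.0, 92.0, 84.0, 96.0])

        def residuals(recharge_by_zone):
            simulated = response @ recharge_by_zone   # stand-in for a model run
            return simulated - observed_baseflow

        fit = least_squares(residuals, x0=[5.0, 5.0], bounds=(0.0, np.inf))
        print("estimated recharge by zone:", fit.x.round(2))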

  8. Featured Image: Tests of an MHD Code

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-09-01

    Creating the codes that are used to numerically model astrophysical systems takes a lot of work and a lot of testing! A new, publicly available moving-mesh magnetohydrodynamics (MHD) code, DISCO, is designed to model 2D and 3D orbital fluid motion, such as that of astrophysical disks. In a recent article, DISCO creator Paul Duffell (University of California, Berkeley) presents the code and the outcomes from a series of standard tests of DISCO's stability, accuracy, and scalability. From left to right and top to bottom, the test outputs shown above are: a cylindrical Kelvin-Helmholtz flow (showing off DISCO's numerical grid in 2D), a passive scalar in a smooth vortex (can DISCO maintain contact discontinuities?), a global look at the cylindrical Kelvin-Helmholtz flow, a Jupiter-mass planet opening a gap in a viscous disk, an MHD flywheel (a test of DISCO's stability), an MHD explosion revealing shock structures, an MHD rotor (a more challenging version of the explosion), a Flock 3D MRI test (can DISCO study linear growth of the magnetorotational instability in disks?), and a nonlinear 3D MRI test. Check out the gif below for a closer look at each of these images, or follow the link to the original article to see even more! Citation: Paul C. Duffell 2016 ApJS 226 2. doi:10.3847/0067-0049/226/1/2

  9. Status Report On Preparation of a Revised Longwave Communication Link Vulnerability Code.

    DTIC Science & Technology

    1981-10-01

    [Garbled figure residue; the recoverable labels mention antenna configurations (gallery or earth-detached; buried ground antenna) and a Figure 3 illustrating the possible single-…] …shows simplified flow diagrams for the ionization models required for LOS and ionospheric dependent propagation calculations. The time loop could be placed outside the path loop for ionospheric dependent propagation but this requires considerable storage in order to perform nonequilibrium ionization

  10. Mass balance computation in SAGUARO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, B.L.; Eaton, R.R.

    1986-12-01

    This report describes the development of the mass balance subroutines used with the finite-element code, SAGUARO, which models fluid flow in partially saturated porous media. Derivation of the basic mass storage and mass flux equations is included. The results of the SAGUARO mass-balance subroutine, MASS, are shown to compare favorably with the linked results of FEMTRAN. Implementation of the MASS option in SAGUARO is described. Instructions for use of the MASS option are demonstrated with the three sample cases.

  11. Oscillatory phase dynamics in neural entrainment underpin illusory percepts of time.

    PubMed

    Herrmann, Björn; Henry, Molly J; Grigutsch, Maren; Obleser, Jonas

    2013-10-02

    Neural oscillatory dynamics are a candidate mechanism to steer perception of time and temporal rate change. While oscillator models of time perception are strongly supported by behavioral evidence, a direct link to neural oscillations and oscillatory entrainment has not yet been provided. In addition, it has thus far remained unaddressed how context-induced illusory percepts of time are coded for in oscillator models of time perception. To investigate these questions, we used magnetoencephalography and examined the neural oscillatory dynamics that underpin pitch-induced illusory percepts of temporal rate change. Human participants listened to frequency-modulated sounds that varied over time in both modulation rate and pitch, and judged the direction of rate change (decrease vs increase). Our results demonstrate distinct neural mechanisms of rate perception: Modulation rate changes directly affected listeners' rate percept as well as the exact frequency of the neural oscillation. However, pitch-induced illusory rate changes were unrelated to the exact frequency of the neural responses. The rate change illusion was instead linked to changes in neural phase patterns, which allowed for single-trial decoding of percepts. That is, illusory underestimations or overestimations of perceived rate change were tightly coupled to increased intertrial phase coherence and changes in cerebro-acoustic phase lag. The results provide insight on how illusory percepts of time are coded for by neural oscillatory dynamics.

  12. How and why do infants imitate? An ideomotor approach to social and imitative learning in infancy (and beyond).

    PubMed

    Paulus, Markus

    2014-10-01

    It has been proposed that already in infancy, imitative learning plays a pivotal role in the acquisition of knowledge and abilities. Yet the cognitive mechanisms underlying the acquisition of novel action knowledge through social learning have remained unclear. The present contribution presents an ideomotor approach to imitative learning (IMAIL) in infancy (and beyond) that draws on the ideomotor theory of action control and on recent findings of perception-action matching. According to IMAIL, the central mechanism of imitative and social learning is the acquisition of cascading bidirectional action-effect associations through observation of own and others' actions. First, the observation of the visual effect of own actions leads to the acquisition of first-order action-effect associations, linking motor codes to the action's typical visual effects. Second, observing another person's action leads to motor activation (i.e., motor resonance) due to the first-order associations. This activated motor code then becomes linked to the other salient effects produced by the observed action, leading to the acquisition of (second-order) action-effect associations. These novel action-effect associations enable later imitation of the observed actions. The article reviews recent behavioral and neurophysiological studies with infants and adults that provide empirical support for the model. Furthermore, it is discussed how the model relates to other approaches on social-cognitive development and how developmental changes in imitative abilities can be conceptualized.

  13. Using linked data to evaluate collisions with fixed objects in Pennsylvania : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1998-10-01

    This report uses police-reported motor vehicle crash data linked to Emergency Medical Services data and hospital discharge data to evaluate the relative risk of injury posed by specific roadside objects in Pennsylvania. The report focuses primarily o...

  14. Using linked data to evaluate motor vehicle crashes involving elderly drivers in Connecticut : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1999-09-01

    A deterministic algorithm was developed which allowed data from Department of Transportation motor vehicle crash records, state mortality registry records, and hospital admission and emergency department records to be linked for analysis of the impac...

  15. Fix Broken and Redirected Links Using Group Dashboard Link Report

    EPA Pesticide Factsheets

    Learn how to use the tools provided by Drupal in the Group Dashboard to find and correct links on your site that are broken (404 and 403 error codes), or that redirect (301 and 302), for example to a homepage rather than the originally intended content.
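
    A hedged sketch of what such a report does under the hood: request each URL without following redirects and bucket it by HTTP status code. This generic checker uses the third-party "requests" package; the URL list is a placeholder, not the Drupal tooling itself.

        import requests

        urls = ["https://example.org/ok", "https://example.org/moved",
                "https://example.org/gone"]

        for url in urls:
            try:
                resp = requests.head(url, allow_redirects=False, timeout=10)
            except requests.RequestException as exc:
                print(f"{url}: unreachable ({exc})")
                continue
            if resp.status_code in (404, 403):
                print(f"{url}: broken ({resp.status_code}) -- fix or remove")
            elif resp.status_code in (301, 302):
                print(f"{url}: redirects ({resp.status_code}) -- update target")
            else:
                print(f"{url}: ok ({resp.status_code})")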

  16. NSR&D FY15 Final Report. Modeling Mechanical, Thermal, and Chemical Effects of Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Christopher Curtis; Ma, Xia; Zhang, Duan Zhong

    2015-11-02

    The main goal of this project is to develop a computer model that explains and predicts coupled mechanical, thermal and chemical responses of HE under impact and friction insults. The modeling effort is based on the LANL-developed CartaBlanca code, which is implemented with the dual domain material point (DDMP) method to calculate complex and coupled thermal, chemical and mechanical effects among fluids, solids and the transitions between the states. In FY 15, we have implemented the TEPLA material model for metal and performed preliminary can penetration simulation and begun to link with experiment. Currently, we are working on implementing a shock to detonation transition (SDT) model (SURF) and JWL equation of state.

  17. Imprinted and X-linked non-coding RNAs as potential regulators of human placental function

    PubMed Central

    Buckberry, Sam; Bianco-Miotto, Tina; Roberts, Claire T

    2014-01-01

    Pregnancy outcome is inextricably linked to placental development, which is strictly controlled temporally and spatially through mechanisms that are only partially understood. However, increasing evidence suggests non-coding RNAs (ncRNAs) direct and regulate a considerable number of biological processes and therefore may constitute a previously hidden layer of regulatory information in the placenta. Many ncRNAs, including both microRNAs and long non-coding transcripts, show almost exclusive or predominant expression in the placenta compared with other somatic tissues and display altered expression patterns in placentas from complicated pregnancies. In this review, we explore the results of recent genome-scale and single gene expression studies using human placental tissue, but include studies in the mouse where human data are lacking. Our review focuses on the ncRNAs epigenetically regulated through genomic imprinting or X-chromosome inactivation and includes recent evidence surrounding the H19 lincRNA, the imprinted C19MC cluster microRNAs, and X-linked miRNAs associated with pregnancy complications. PMID:24081302

  18. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.

  20. Moving Controlled Vocabularies into the Semantic Web

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Lowry, R. K.; Kokkinaki, A.

    2015-12-01

    One of the issues with legacy oceanographic data formats is that the only tool available for describing what a measurement is and how it was made is a single metadata tag known as the parameter code. The British Oceanographic Data Centre (BODC) has been helping the international oceanographic community gain maximum benefit from this through a controlled vocabulary known as the BODC Parameter Usage Vocabulary (PUV). Over time this has grown to over 34,000 entries, some of which have preferred labels with over 400 bytes of descriptive information detailing what was measured and how. A decade ago the BODC pioneered making this information available in a more useful form with the implementation of a prototype vocabulary server (NVS) that referenced each 'parameter code' as a URL. This developed into the current server (NVS V2) in which the parameter URL resolves into an RDF document based on the SKOS data model which includes a list of resource URLs mapped to the 'parameter'. For example the parameter code for a contaminant in biota, such as 'cadmium in Mytilus edulis', carries RDF triples leading to the entry for Mytilus edulis in the WoRMS and for cadmium in the ChEBI ontologies. By providing links into these external ontologies the information captured in a 1980s parameter code now conforms to the Linked Data paradigm of the Semantic Web, vastly increasing the descriptive information accessible to a user. This presentation will describe the next steps along the road to the Semantic Web with the development of a SPARQL end point [1] to expose the PUV plus the 190 other controlled vocabularies held in NVS. Whilst this is ideal for those fluent in SPARQL, most users require something a little more user-friendly and so the NVS browser [2] was developed over the end point to allow less technical users to query the vocabularies and navigate the NVS ontology. This tool integrates into an editor that allows vocabulary content to be manipulated by authorised users outside BODC. Having placed Linked Data tooling over a single SPARQL end point the obvious future development for this system is to support semantic interoperability outside NVS by the incorporation of federated SPARQL end points in the USA and Australia during the ODIP II project.
    [1] https://vocab.nerc.ac.uk/sparql
    [2] https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/
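
    A minimal sketch of querying the end point listed at [1] with the third-party SPARQLWrapper package; the SKOS query shape is illustrative and the filter term is an arbitrary example.

        from SPARQLWrapper import SPARQLWrapper, JSON

        sparql = SPARQLWrapper("https://vocab.nerc.ac.uk/sparql")
        sparql.setQuery("""
            PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
            SELECT ?concept ?label WHERE {
                ?concept skos:prefLabel ?label .
                FILTER(CONTAINS(LCASE(STR(?label)), "cadmium"))
            } LIMIT 5
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["concept"]["value"], "->", row["label"]["value"])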

  1. Challenges in converting an interviewer-administered food probe database to self-administration in the National Cancer Institute Automated Self-administered 24-Hour Recall (ASA24).

    PubMed

    Zimmerman, Thea Palmer; Hull, Stephen G; McNutt, Suzanne; Mittl, Beth; Islam, Noemi; Guenther, Patricia M; Thompson, Frances E; Potischman, Nancy A; Subar, Amy F

    2009-12-01

    The National Cancer Institute (NCI) is developing an automated, self-administered 24-hour dietary recall (ASA24) application to collect and code dietary intake data. The goal of the ASA24 development is to create a web-based dietary interview based on the US Department of Agriculture (USDA) Automated Multiple Pass Method (AMPM) instrument currently used in the National Health and Nutrition Examination Survey (NHANES). The ASA24 food list, detail probes, and portion probes were drawn from the AMPM instrument; portion-size pictures from Baylor College of Medicine's Food Intake Recording Software System (FIRSSt) were added; and the food code/portion code assignments were linked to the USDA Food and Nutrient Database for Dietary Studies (FNDDS). The requirements that the interview be self-administered and fully auto-coded presented several challenges as the AMPM probes and responses were linked with the FNDDS food codes and portion pictures. This linking was accomplished through a "food pathway," or the sequence of steps that leads from a respondent's initial food selection, through the AMPM probes and portion pictures, to the point at which a food code and gram weight portion size are assigned. The ASA24 interview database that accomplishes this contains more than 1,100 food probes and more than 2 million food pathways and will include about 10,000 pictures of individual foods depicting up to 8 portion sizes per food. The ASA24 will make the administration of multiple days of recalls in large-scale studies economical and feasible.
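
    A hedged sketch of a "food pathway" as a data structure: the chain from an initial food selection through probe answers down to a food code and gram weight. The codes and portion weights are placeholders, not actual FNDDS values.

        food_pathways = {
            # (food, preparation probe, portion picture) -> (food code, grams)
            ("rice", "white, cooked", "1 cup"):   ("56205001", 158.0),
            ("rice", "white, cooked", "1/2 cup"): ("56205001", 79.0),
            ("rice", "brown, cooked", "1 cup"):   ("56205201", 195.0),
        }

        def resolve(selection, prep_answer, portion_answer):
            """Walk the pathway to an auto-coded food code and gram weight."""
            return food_pathways[(selection, prep_answer, portion_answer)]

        code, grams = resolve("rice", "brown, cooked", "1 cup")
        print(f"assigned food code {code}, portion {grams} g")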

  2. Patterns of Father Self Evaluations among Mexican and European American Men and Links to Adolescent Adjustment

    PubMed Central

    Perez-Brena, Norma J.; Cookston, Jeffrey T.; Fabricius, William V.; Saenz, Delia

    2013-01-01

    A mixed-method study identified profiles of fathers who mentioned key dimensions of their parenting and linked profile membership to adolescents’ adjustment using data from 337 European American, Mexican American and Mexican immigrant fathers and their early adolescent children. Father narratives about what fathers do well as parents were thematically coded for the presence of five fathering dimensions: emotional quality (how well father and child get along), involvement (amount of time spent together), provisioning (the amount of resources provided), discipline (the amount and success in parental control), and role modeling (teaching life lessons through example). Next, latent class analysis was used to identify three patterns of the likelihood of mentioning certain fathering dimensions: an emotionally-involved group mentioned emotional quality and involvement; an affective-control group mentioned emotional quality, involvement, discipline and role modeling; and an affective-model group mentioned emotional quality and role modeling. Profiles were significantly associated with subsequent adolescents’ reports of adjustment such that adolescents of affective-control fathers reported significantly more externalizing behaviors than adolescents of emotionally-involved fathers. PMID:24883049

  3. Controlled grafting of vinylic monomers on polyolefins: a robust mathematical modeling approach

    PubMed Central

    Saeb, Mohammad Reza; Rezaee, Babak; Shadman, Alireza; Formela, Krzysztof; Ahmadi, Zahed; Hemmati, Farkhondeh; Kermaniyan, Tayebeh Sadat; Mohammadi, Yousef

    2017-01-01

    Experimental and mathematical modeling analyses were used for controlling melt free-radical grafting of vinylic monomers on polyolefins and, thereby, reducing the disturbance of undesired cross-linking of polyolefins. Response surface, desirability function, and artificial intelligence methodologies were blended to modeling/optimization of grafting reaction in terms of vinylic monomer content, peroxide initiator concentration, and melt-processing time. An in-house code was developed based on artificial neural network that learns and mimics processing torque and grafting of glycidyl methacrylate (GMA) typical vinylic monomer on high-density polyethylene (HDPE). Application of response surface and desirability function enabled concurrent optimization of processing torque and GMA grafting on HDPE, through which we quantified for the first time competition between parallel reactions taking place during melt processing: (i) desirable grafting of GMA on HDPE; (ii) undesirable cross-linking of HDPE. The proposed robust mathematical modeling approach can precisely learn the behavior of grafting reaction of vinylic monomers on polyolefins and be placed into practice in finding exact operating condition needed for efficient grafting of reactive monomers on polyolefins. PMID:29491797

  5. Does CTCF mediate between nuclear organization and gene expression?

    PubMed

    Ohlsson, Rolf; Lobanenkov, Victor; Klenova, Elena

    2010-01-01

    The multifunctional zinc-finger protein CCCTC-binding factor (CTCF) is a very strong candidate for the role of coordinating the expression level of coding sequences with their three-dimensional position in the nucleus, apparently responding to a "code" in the DNA itself. Dynamic interactions between chromatin fibers in the context of nuclear architecture have been implicated in various aspects of genome functions. However, the molecular basis of these interactions still remains elusive and is a subject of intense debate. Here we discuss the nature of CTCF-DNA interactions, the CTCF-binding specificity to its binding sites and the relationship between CTCF and chromatin, and we examine data linking CTCF with gene regulation in the three-dimensional nuclear space. We discuss why these features render CTCF a very strong candidate for the role and propose a unifying model, the "CTCF code," explaining the mechanistic basis of how the information encrypted in DNA may be interpreted by CTCF into diverse nuclear functions.

  6. A Non-Degenerate Code of Deleterious Variants in Mendelian Loci Contributes to Complex Disease Risk

    PubMed Central

    Blair, David R.; Lyttle, Christopher S.; Mortensen, Jonathan M.; Bearden, Charles F.; Jensen, Anders Boeck; Khiabanian, Hossein; Melamed, Rachel; Rabadan, Raul; Bernstam, Elmer V.; Brunak, Søren; Jensen, Lars Juhl; Nicolae, Dan; Shah, Nigam H.; Grossman, Robert L.; Cox, Nancy J.; White, Kevin P.; Rzhetsky, Andrey

    2013-01-01

    Whereas countless highly penetrant variants have been associated with Mendelian disorders, the genetic etiologies underlying complex diseases remain largely unresolved. Here, we examine the extent to which Mendelian variation contributes to complex disease risk by mining the medical records of over 110 million patients. We detect thousands of associations between Mendelian and complex diseases, revealing a non-degenerate, phenotypic code that links each complex disorder to a unique collection of Mendelian loci. Using genome-wide association results, we demonstrate that common variants associated with complex diseases are enriched in the genes indicated by this “Mendelian code.” Finally, we detect hundreds of comorbidity associations among Mendelian disorders, and we use probabilistic genetic modeling to demonstrate that Mendelian variants likely contribute non-additively to the risk for a subset of complex diseases. Overall, this study illustrates a complementary approach for mapping complex disease loci and provides unique predictions concerning the etiologies of specific diseases. PMID:24074861

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilkey, Lindsay

    This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code, STAR-CCM+, were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model and results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β and compared to results obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling was documented in this milestone, which is closely linked to the L3:VVI.H2LP15.01 milestone report.

  8. A Knowledge-Based Representation Scheme for Environmental Science Models

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    One of the primary methods available for studying environmental phenomena is the construction and analysis of computational models. We have been studying how artificial intelligence techniques can be applied to assist in the development and use of environmental science models within the context of NASA-sponsored activities. We have identified several high-utility areas as potential targets for research and development: model development; data visualization, analysis, and interpretation; model publishing and reuse, training and education; and framing, posing, and answering questions. Central to progress on any of the above areas is a representation for environmental models that contains a great deal more information than is present in a traditional software implementation. In particular, a traditional software implementation is devoid of any semantic information that connects the code with the environmental context that forms the background for the modeling activity. Before we can build AI systems to assist in model development and usage, we must develop a representation for environmental models that adequately describes a model's semantics and explicitly represents the relationship between the code and the modeling task at hand. We have developed one such representation in conjunction with our work on the SIGMA (Scientists' Intelligent Graphical Modeling Assistant) environment. The key feature of the representation is that it provides a semantic grounding for the symbols in a set of modeling equations by linking those symbols to an explicit representation of the underlying environmental scenario.

  9. Digital optical feeder links system for broadband geostationary satellite

    NASA Astrophysics Data System (ADS)

    Poulenard, Sylvain; Mège, Alexandre; Fuchs, Christian; Perlot, Nicolas; Riedi, Jerome; Perdigues, Josep

    2017-02-01

    An optical link based on a multiplex of wavelengths at 1.55 μm is foreseen to be a valuable solution for the feeder link of the next generation of high-throughput geostationary satellite. The main satellite operator specifications for such a link are an availability of 99.9% over the year, a capacity around 500 Gbit/s, and a bent-pipe architecture. Optical ground station networks connected to Terabit/s terrestrial fibers are proposed. The availability of the optical feeder link is simulated over 5 years based on a state-of-the-art cloud mask data bank and an atmospheric turbulence strength model. Yearly and seasonal optical feeder link availabilities are derived and discussed. On-ground and on-board terminals are designed to be compliant with a 10 Gbit/s per optical channel data rate, taking into account adaptive optic systems to mitigate the impact of atmospheric turbulence on single-mode optical fiber receivers. The forward and return transmission chains, concept and implementation, are described. These are based on a digital transparent on-off keying optical link with digitalization of the DVB-S2 and DVB-RCS signals prior to transmission, and a forward error correcting code. In addition, the satellite architecture is described taking into account optical and radiofrequency payloads as well as their interfaces.

  11. Transnational organizational considerations for sociocultural differences in ethics and virtual team functioning in laboratory animal science.

    PubMed

    Pritt, Stacy L; Mackta, Jayne

    2010-05-01

    Business models for transnational organizations include linking different geographies through common codes of conduct, policies, and virtual teams. Global companies with laboratory animal science activities (whether outsourced or performed inhouse) often see the need for these business activities in relation to animal-based research and benefit from them. Global biomedical research organizations can learn how to better foster worldwide cooperation and teamwork by understanding and working with sociocultural differences in ethics and by knowing how to facilitate appropriate virtual team actions. Associated practices include implementing codes and policies that transcend cultural, ethnic, or other boundaries and equipping virtual teams with the needed technology, support, and rewards to ensure timely and productive work that ultimately promotes good science and patient safety in drug development.

  12. Drugs, Guns, and Disadvantaged Youths: Co-Occurring Behavior and the Code of the Street

    ERIC Educational Resources Information Center

    Allen, Andrea N.; Lo, Celia C.

    2012-01-01

    Guided by Anderson's theory of the code of the street, this study explored social mechanisms linking individual-level disadvantage factors with the adoption of beliefs grounded in the code of the street and with drug trafficking and gun carrying--the co-occurring behavior shaping violence among young men in urban areas. Secondary data were…

  13. Admiralty Inlet Advanced Turbulence Measurements: final data and code archive

    DOE Data Explorer

    Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel

    2011-02-01

    Data and code, not already available in a public location, that are used in Kilcher, Thomson, Harding, and Nylund (2017) "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction" doi: 10.1175/JTECH-D-16-0213.1. The links point to Python source code used in the publication. All other files are source data used in the publication.

  14. Investigation of CSRZ code in FSO communication

    NASA Astrophysics Data System (ADS)

    Zhang, Zhike; Chang, Mingchao; Zhu, Ninghua; Liu, Yu

    2018-02-01

    A cost-effective carrier-suppressed return-to-zero (CSRZ) code generation scheme is proposed by employing a directly modulated laser (DML) module operated at 1.5 μm wavelength. Furthermore, the performance of the CSRZ code signal in free-space optical (FSO) link transmission is studied by simulation. It is found from the results that atmospheric turbulence can degrade the transmission performance. However, owing to its lower average transmit power and higher spectral efficiency, the CSRZ code signal can achieve a better amplitude suppression ratio than the non-return-to-zero (NRZ) code.

  15. The Use of a Bayesian Hierarchy to Develop and Validate a Co-Morbidity Score to Predict Mortality for Linked Primary and Secondary Care Data from the NHS in England

    PubMed Central

    Card, Tim R.; West, Joe

    2016-01-01

    Background: We have assessed whether the linkage between routine primary and secondary care records provided an opportunity to develop an improved population based co-morbidity score with the combined information on co-morbidities from both health care settings. Methods: We extracted all people older than 20 years at the start of 2005 within the linkage between the Hospital Episodes Statistics, Clinical Practice Research Datalink, and Office for National Statistics death register in England. A random 50% sample was used to identify relevant diagnostic codes using a Bayesian hierarchy to share information between similar Read and ICD 10 code groupings. Internal validation of the score was performed in the remaining 50% and discrimination was assessed using Harrell’s C statistic. Comparisons were made over time, age, and consultation rate with the Charlson and Elixhauser indexes. Results: 657,264 people were followed up from the 1st January 2005. 98 groupings of codes were derived from the Bayesian hierarchy, and 37 had an adjusted weighting of greater than zero in the Cox proportional hazards model. 11 of these groupings had a different weighting dependent on whether they were coded from hospital or primary care. The C statistic reduced from 0.88 (95% confidence interval 0.88–0.88) in the first year of follow up, to 0.85 (0.85–0.85) including all 5 years. When we stratified the linked score by consultation rate the association with mortality remained consistent, but there was a significant interaction with age, with improved discrimination and fit in those under 50 years old (C = 0.85, 0.83–0.87) compared to the Charlson (C = 0.79, 0.77–0.82) or Elixhauser index (C = 0.81, 0.79–0.83). Conclusions: The use of linked population based primary and secondary care data produced a co-morbidity score with improved discrimination, particularly in younger age groups, and a greater effect when adjusting for co-morbidity than existing scores. PMID:27788230
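
    A hedged sketch of the validation metric named above: Harrell's C measures how well a risk score ranks patients by survival, here computed with the third-party "lifelines" package on fabricated toy data.

        from lifelines.utils import concordance_index

        followup_years = [5.0, 2.1, 4.3, 0.9, 5.0]
        died = [0, 1, 1, 1, 0]          # 0 = censored at end of follow-up
        risk_score = [1, 6, 3, 8, 0]    # higher score = higher predicted risk

        # concordance_index expects predictions that rise with survival time,
        # so pass the negated risk score
        c = concordance_index(followup_years, [-s for s in risk_score], died)
        print(f"Harrell's C = {c:.2f}")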

  16. Watershed and Economic Data InterOperability (WEDO) ...

    EPA Pesticide Factsheets

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interoperability goes beyond the current practice of publishing modeling studies as reports or journal articles. Rather than summarized results, modeling studies can be published with their full complement of input data, calibration parameters and output with associated metadata for easy duplication by others. Reproducible science is possible only if researchers can find, evaluate and use complete modeling studies performed by other modelers. WEDO greatly increases transparency by making detailed data available to the scientific community. WEDO is a next-generation technology, a Web Service linked to the EPA’s EnviroAtlas for discovery of modeling studies nationwide. Streams and rivers are identified using the National Hydrography Dataset network and stream IDs. Streams with modeling studies available are color coded in the EnviroAtlas. One can select streams within a watershed of interest to readily find data available via WEDO. The WEDO website is linked from the EnviroAtlas to provide a thorough review of each modeling study. WEDO currently provides modeled flow and water quality time series, designed for a broad range of watershed and economic models for nutrient trading market analysis.

  17. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  18. The Coding of Biological Information: From Nucleotide Sequence to Protein Recognition

    NASA Astrophysics Data System (ADS)

    Štambuk, Nikola

    The paper reviews the classic results of Swanson, Dayhoff, Grantham, Blalock and Root-Bernstein, which link genetic code nucleotide patterns to the protein structure, evolution and molecular recognition. Symbolic representation of the binary addresses defining particular nucleotide and amino acid properties is discussed, with consideration of: structure and metric of the code, direct correspondence between amino acid and nucleotide information, and molecular recognition of the interacting protein motifs coded by the complementary DNA and RNA strands.

  19. The accuracy of burn diagnosis codes in health administrative data: A validation study.

    PubMed

    Mason, Stephanie A; Nathens, Avery B; Byrne, James P; Fowler, Rob; Gonzalez, Alejandro; Karanicolas, Paul J; Moineddin, Rahim; Jeschke, Marc G

    2017-03-01

    Health administrative databases may provide rich sources of data for the study of outcomes following burn. We aimed to determine the accuracy of International Classification of Diseases diagnoses codes for burn in a population-based administrative database. Data from a regional burn center's clinical registry of patients admitted between 2006-2013 were linked to administrative databases. Burn total body surface area (TBSA), depth, mechanism, and inhalation injury were compared between the registry and administrative records. The sensitivity, specificity, and positive and negative predictive values were determined, and coding agreement was assessed with the kappa statistic. 1215 burn center patients were linked to administrative records. TBSA codes were highly sensitive and specific for ≥10 and ≥20% TBSA (89/93% sensitive and 95/97% specific), with excellent agreement (κ, 0.85/κ, 0.88). Codes were weakly sensitive (68%) in identifying ≥10% TBSA full-thickness burn, though highly specific (86%) with moderate agreement (κ, 0.46). Codes for inhalation injury had limited sensitivity (43%) but high specificity (99%) with moderate agreement (κ, 0.54). Burn mechanism had excellent coding agreement (κ, 0.84). Administrative data diagnosis codes accurately identify burn by burn size and mechanism, while identification of inhalation injury or full-thickness burns is less sensitive but highly specific. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
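
    The validation arithmetic behind these figures is compact: sensitivity, specificity, and Cohen's kappa all follow from a 2x2 table of administrative codes against the registry reference standard. A sketch with hypothetical counts:

        def validate(tp, fp, fn, tn):
            """Sensitivity, specificity, and Cohen's kappa from a 2x2 table."""
            n = tp + fp + fn + tn
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            po = (tp + tn) / n                            # observed agreement
            pe = ((tp + fp) * (tp + fn)
                  + (fn + tn) * (fp + tn)) / n**2         # chance agreement
            kappa = (po - pe) / (1 - pe)
            return sens, spec, kappa

        print(validate(tp=89, fp=12, fn=11, tn=288))  # illustrative counts only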

  20. Linking Chaotic Advection with Subsurface Biogeochemical Processes

    NASA Astrophysics Data System (ADS)

    Mays, D. C.; Freedman, V. L.; White, S. K.; Fang, Y.; Neupauer, R.

    2017-12-01

    This work investigates the extent to which groundwater flow kinematics drive subsurface biogeochemical processes. In terms of groundwater flow kinematics, we consider chaotic advection, whose essential ingredient is stretching and folding of plumes. Chaotic advection is appealing within the context of groundwater remediation because it has been shown to optimize plume spreading in the laminar flows characteristic of aquifers. In terms of subsurface biogeochemical processes, we consider an existing model for microbially-mediated reduction of relatively mobile uranium(VI) to relatively immobile uranium(IV) following injection of acetate into a floodplain aquifer beneath a former uranium mill in Rifle, Colorado. This model has been implemented in the reactive transport code eSTOMP, the massively parallel version of STOMP (Subsurface Transport Over Multiple Phases). This presentation will report preliminary numerical simulations in which the hydraulic boundary conditions in the eSTOMP model are manipulated to simulate chaotic advection resulting from engineered injection and extraction of water through a manifold of wells surrounding the plume of injected acetate. This approach provides an avenue to simulate the impact of chaotic advection within the existing framework of the eSTOMP code.

  1. Apply network coding for H.264/SVC multicasting

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Kuo, C.-C. Jay

    2008-08-01

    In a packet erasure network environment, video streaming benefits from error control in two ways to achieve graceful degradation. The first approach is application-level (or link-level) forward error correction (FEC) to provide erasure protection. The second is error concealment at the decoder end to compensate for lost packets. A large amount of research has been done in these two areas. More recently, network coding (NC) techniques have been proposed for efficient data multicast over networks, and our previous work showed that multicast video streaming benefits from NC through improved throughput. In this work, an algebraic model is given to analyze that performance. By exploiting the linear combination of video packets at nodes in a network together with the SVC video format, the system achieves path diversity automatically and enables efficient video delivery to heterogeneous receivers over packet erasure channels. The application of network coding can protect video packets against the erasure network environment. However, the rank-deficiency problem of random linear network coding makes error concealment inefficient. Computer simulation shows that the proposed NC video multicast scheme enables heterogeneous receiving according to receivers' capacity constraints, but special design is needed to improve video transmission performance when applying network coding.
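
    The rank-deficiency problem noted above is easy to demonstrate: a receiver can decode a generation of k packets only once the coding-coefficient matrix it has collected reaches rank k. The toy sketch below works over GF(2) for brevity; practical random linear network coding typically uses GF(2^8).

        import numpy as np

        def gf2_rank(m):
            """Rank of a binary matrix over GF(2) by Gaussian elimination."""
            m = m.copy() % 2
            rank = 0
            for col in range(m.shape[1]):
                pivot = np.nonzero(m[rank:, col])[0]
                if pivot.size == 0:
                    continue
                r = rank + pivot[0]
                m[[rank, r]] = m[[r, rank]]          # swap pivot row into place
                mask = (m[:, col] == 1) & (np.arange(len(m)) != rank)
                m[mask] ^= m[rank]                   # eliminate the column
                rank += 1
                if rank == m.shape[0]:
                    break
            return rank

        k = 4                                        # generation size (packets)
        coeffs = np.random.randint(0, 2, (k, k))     # received coding vectors
        print("decodable" if gf2_rank(coeffs) == k else "rank-deficient")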

  2. Origin and evolution of the long non-coding genes in the X-inactivation center.

    PubMed

    Romito, Antonio; Rougeulle, Claire

    2011-11-01

    Random X chromosome inactivation (XCI), the eutherian mechanism of X-linked gene dosage compensation, is controlled by a cis-acting locus termed the X-inactivation center (Xic). One of the striking features that characterize the Xic landscape is the abundance of loci transcribing non-coding RNAs (ncRNAs), including Xist, the master regulator of the inactivation process. Recent comparative genomic analyses have depicted the evolutionary scenario behind the origin of the X-inactivation center, revealing that this locus evolved from a region harboring protein-coding genes. During mammalian radiation, this ancestral protein-coding region was disrupted in the marsupial group, whilst in the eutherian lineage it provided the starting material for the non-translated RNAs of the X-inactivation center. The emergence of non-coding genes occurred by a dual mechanism involving loss of protein-coding function of the pre-existing genes and integration of different classes of mobile elements, some of which modeled the structure and sequence of the non-coding genes in a species-specific manner. The emerging genes started to produce transcripts that acquired function in regulating the epigenetic status of the X chromosome, as shown for Xist, its antisense Tsix, Jpx, and recently suggested for Ftx. Thus, the appearance of the Xic, which occurred after the divergence between eutherians and marsupials, was the basis for the evolution of random X inactivation as a strategy to achieve dosage compensation. Copyright © 2011. Published by Elsevier Masson SAS.

  3. Using linked data to evaluate medical and financial outcomes of motor vehicle crashes in Connecticut : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1999-09-01

    A deterministic algorithm was developed which allowed data from Department of Transportation motor vehicle crash records, state mortality registry records, and hospital admission and emergency department records to be linked for analysis of the finan...

  4. Using linked data to evaluate hospital charges for motor vehicle crash victims in Pennsylvania : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1998-10-01

    The report uses police-reported crash data that have been linked to hospital discharge data to evaluate charges for hospital care provided to motor vehicle crash victims in Pennsylvania. Approximately 17,000 crash victims were hospitalized in Pennsyl...

  5. Standardized reporting using CODES (Crash Outcome Data Evaluation System)

    DOT National Transportation Integrated Search

    1999-12-01

    While CODES projects have expanded to 25 states, there is no standardized reporting of the outcome measures that are available with linked data. This paper describes our efforts to build a standard format for reporting these outcomes. This format is ...

  6. Compiling standardized information from clinical practice: using content analysis and ICF Linking Rules in a goal-oriented youth rehabilitation program.

    PubMed

    Lustenberger, Nadia A; Prodinger, Birgit; Dorjbal, Delgerjargal; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-09-23

    To illustrate how routinely written narrative admission and discharge reports of a rehabilitation program for eight youths with chronic neurological health conditions can be transformed to the International Classification of Functioning, Disability and Health. First, a qualitative content analysis was conducted by building meaningful units from text segments of the reports assigned to the five elements of the Rehab-Cycle®: goal; assessment; assignment; intervention; evaluation. Second, the meaningful units were then linked to the ICF using the refined ICF Linking Rules. With the first step of transformation, the emphasis of the narrative reports changed to a process-oriented interdisciplinary layout, revealing three thematic blocks of goals: mobility, self-care, and mental and social functions. The 95 unique linked ICF codes could be grouped into clinically meaningful goal-centered categories. Between the two independent linkers, the agreement rate improved after complementing the rules with additional agreements. The ICF Linking Rules can be used to compile standardized health information from narrative reports if these are structured beforehand; the process requires time and expertise. To implement the ICF into common practice, the findings provide a starting point for reporting rehabilitation that builds upon existing practice and adheres to international standards. Implications for Rehabilitation This study provides evidence that routinely collected health information from rehabilitation practice can be transformed to the International Classification of Functioning, Disability and Health by using the "ICF Linking Rules"; however, this requires time and expertise. The Rehab-Cycle®, including assessments, assignments, goal setting, interventions and goal evaluation, serves as a feasible framework for structuring this rehabilitation program and ensures that the complexity of local practice is appropriately reflected. The refined "ICF Linking Rules" lead to a standardized transformation process for narrative text and thus higher quality with increased transparency. As a next step, the resulting format of goal codes supplemented by goal-clarifying codes could be validated to strengthen the implementation of the International Classification of Functioning, Disability and Health in rehabilitation routine while respecting the variety of clinical practice.

  7. A power-efficient communication system between brain-implantable devices and external computers.

    PubMed

    Yao, Ning; Lee, Heung-No; Chang, Cheng-Chun; Sclabassi, Robert J; Sun, Mingui

    2007-01-01

    In this paper, we propose a power-efficient communication system for linking a brain-implantable device to an external system. For battery-powered implantable devices, the processor and transmitter power should be reduced in order both to conserve battery power and to reduce the health risks associated with transmission. To accomplish this, a joint source-channel coding/decoding system is devised. Low-density generator matrix (LDGM) codes are used in our system due to their low encoding complexity. The power cost of signal processing within the implantable device is greatly reduced by avoiding explicit source encoding: the raw data, which is highly correlated, is transmitted directly. At the receiver, a Markov chain source model is utilized to approximate and capture the correlation of the raw data. A receiver algorithm is designed which connects the Markov chain source model to the LDGM decoder in a turbo-iterative fashion. Simulation results show that the proposed system can save 1 to 2.5 dB of transmission power.
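
    The appeal of LDGM codes on the implant side is that encoding is a sparse mod-2 matrix multiply, with no source-coding pass beforehand. A toy sketch (the generator matrix is illustrative, not taken from the paper):

        import numpy as np

        # Systematic part (identity) plus two sparse parity columns.
        G = np.array([[1, 0, 0, 1, 0],
                      [0, 1, 0, 1, 1],
                      [0, 0, 1, 0, 1]])
        data = np.array([1, 0, 1])       # raw, correlated sensor bits (toy)
        codeword = data @ G % 2          # cheap encoding: sparse XORs only
        print(codeword)                  # -> [1 0 1 1 1]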

  8. Spatial Clustering of Occupational Injuries in Communities

    PubMed Central

    Friedman, Lee; Chin, Brian; Madigan, Dana

    2015-01-01

    Objectives. Using the social-ecological model, we hypothesized that the home residences of injured workers would be clustered predictably and geographically. Methods. We linked health care and publicly available datasets by home zip code for traumatically injured workers in Illinois from 2000 to 2009. We calculated numbers and rates of injuries, determined the spatial relationships, and developed 3 models. Results. Among the 23 200 occupational injuries, 80% of cases were located in 20% of zip codes and clustered in 10 locations. After component analysis, numbers and clusters of injuries correlated directly with immigrants; injury rates inversely correlated with urban poverty. Conclusions. Traumatic occupational injuries were clustered spatially by home location of the affected workers and in a predictable way. This put an inequitable burden on communities and provided evidence for the possible value of community-based interventions for prevention of occupational injuries. Work should be included in health disparities research. Stakeholders should determine whether and how to intervene at the community level to prevent occupational injuries. PMID:25905838

  9. Expression-Linked Patterns of Codon Usage, Amino Acid Frequency, and Protein Length in the Basally Branching Arthropod Parasteatoda tepidariorum

    PubMed Central

    Whittle, Carrie A.; Extavour, Cassandra G.

    2016-01-01

    Spiders belong to the Chelicerata, the most basally branching arthropod subphylum. The common house spider, Parasteatoda tepidariorum, is an emerging model and provides a valuable system to address key questions in molecular evolution in an arthropod system that is distinct from traditionally studied insects. Here, we provide evidence suggesting that codon usage, amino acid frequency, and protein lengths are each influenced by expression-mediated selection in P. tepidariorum. First, highly expressed genes exhibited preferential usage of T3 codons in this spider, suggestive of selection. Second, genes with elevated transcription favored amino acids with low or intermediate size/complexity (S/C) scores (glycine and alanine) and disfavored those with large S/C scores (such as cysteine), consistent with the minimization of biosynthesis costs of abundant proteins. Third, we observed a negative correlation between expression level and coding sequence length. Together, we conclude that protein-coding genes exhibit signals of expression-related selection in this emerging, noninsect, arthropod model. PMID:27017527
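
    The T3 statistic mentioned above is simply the fraction of codons with T in the third (wobble) position. A minimal sketch over a toy coding sequence:

        def t3_fraction(cds):
            """Fraction of codons whose third position is T."""
            codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
            return sum(c[2] == "T" for c in codons) / len(codons)

        print(t3_fraction("ATGGCTGGTTTAGCATAA"))  # 2 of 6 codons -> 0.333...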

  10. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for interactive displaying and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are the graphical editing of nodes and fast adjusting of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X and provides a version controlled source code repository for community development (https://sourceforge.net/projects/pray-plot-rayinvr/).

  11. Factors associated with delay in trauma team activation and impact on patient outcomes.

    PubMed

    Connolly, Rory; Woo, Michael Y; Lampron, Jacinthe; Perry, Jeffrey J

    2017-09-05

    Trauma code activation is initiated by emergency physicians using physiological and anatomical criteria, mechanism of injury, and patient demographic factors. Our objective was to identify factors associated with delayed trauma team activation. We assessed consecutive cases from a regional trauma database from January 2008 to March 2014. We defined a delay in trauma code activation as a time greater than 30 minutes from the time of arrival. We conducted univariate analysis for factors potentially influencing trauma team activation, and we subsequently used multiple logistic regression analysis models for delayed activation in relation to mortality, length of stay, and time to operative management. A total of 846 patients were included in our analysis; 4.1% (35/846) of trauma codes were activated after 30 minutes. Mean age was 40.8 years in the early group versus 49.2 years in the delayed group (p=0.01). Patients over age 70 years made up 7.6% of the early activation group versus 17.1% of the delayed group (p=0.04). There was no significant difference in sex, type of injury, injury severity, or time from injury between the two groups. There was no significant difference in mortality, median length of stay, or median time to operative management. Delayed activation is associated with increasing age, with no clear link to increased mortality. Given the severe injuries in the delayed cohort that required activation of the trauma team, further emphasis should be placed on the older trauma patient, along with interventions to recognize this vulnerable population.

  12. Superdense Coding over Optical Fiber Links with Complete Bell-State Measurements

    DOE PAGES

    Williams, Brian P.; Sadlier, Ronald J.; Humble, Travis S.

    2017-02-01

    Adopting quantum communication to modern networking requires transmitting quantum information through a fiber-based infrastructure. In this paper, we report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors. Finally, we demonstrate the highest single-qubit channel capacity to date utilizing linear optics, 1.665 ± 0.018, and we provide a full experimental implementation of a hybrid, quantum-classical communication protocol for image transfer.

  13. Optimizing the Galileo space communication link

    NASA Technical Reports Server (NTRS)

    Statman, J. I.

    1994-01-01

    The Galileo mission was originally designed to investigate Jupiter and its moons utilizing a high-rate, X-band (8415 MHz) communication downlink with a maximum rate of 134.4 kb/sec. However, following the failure of the high-gain antenna (HGA) to fully deploy, a completely new communication link design was established that is based on Galileo's S-band (2295 MHz), low-gain antenna (LGA). The new link relies on data compression, local and intercontinental arraying of antennas, a (14,1/4) convolutional code, a (255,M) variable-redundancy Reed-Solomon code, decoding feedback, and techniques to reprocess recorded data to greatly reduce data losses during signal acquisition. The combination of these techniques will enable return of significant science data from the mission.
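
    The concatenated structure makes the link bookkeeping straightforward: the overall code rate is the product of the convolutional and Reed-Solomon rates. A back-of-the-envelope sketch with illustrative RS dimensions (the mission's variable-redundancy values differed across the frame):

        # Effective code rate of a (14,1/4) convolutional inner code
        # concatenated with a (255,M) Reed-Solomon outer code.
        conv_rate = 1 / 4
        for M in (161, 193, 225, 245):            # illustrative RS dimensions
            overall = conv_rate * (M / 255)
            print(f"RS(255,{M}): overall code rate = {overall:.3f}")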

  14. From Majorana fermions to topological order.

    PubMed

    Terhal, Barbara M; Hassler, Fabian; DiVincenzo, David P

    2012-06-29

    We consider a system consisting of a 2D network of links between Majorana fermions on superconducting islands. We show that the fermionic Hamiltonian modeling this system is topologically ordered in a region of parameter space: we show that Kitaev's toric code emerges in fourth-order perturbation theory. By using a Jordan-Wigner transformation we can map the model onto a family of signed 2D Ising models in a transverse field where the signs, ferromagnetic or antiferromagnetic, are determined by additional gauge bits. Our mapping allows an understanding of the nonperturbative regime and the phase transition to a nontopological phase. We discuss the physics behind a possible implementation of this model and argue how it can be used for topological quantum computation by adiabatic changes in the Hamiltonian.

  15. PROTEUS-SN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shemon, Emily R.; Smith, Micheal A.; Lee, Changho

    2016-02-16

    PROTEUS-SN is a three-dimensional, highly scalable, high-fidelity neutron transport code developed at Argonne National Laboratory. The code is applicable to all spectrum reactor transport calculations, particularly those in which a high degree of fidelity is needed either to represent spatial detail or to resolve solution gradients. PROTEUS-SN solves the second order formulation of the transport equation using the continuous Galerkin finite element method in space, the discrete ordinates approximation in angle, and the multigroup approximation in energy. PROTEUS-SN’s parallel methodology permits the efficient decomposition of the problem by both space and angle, permitting large problems to run efficiently on hundreds of thousands of cores. PROTEUS-SN can also be used in serial or on smaller compute clusters (tens to hundreds of cores) for smaller homogenized problems, although it is generally more computationally expensive than traditional homogenized-methodology codes. PROTEUS-SN has been used to model partially homogenized systems, where regions of interest are represented explicitly and other regions are homogenized to reduce the problem size and required computational resources. PROTEUS-SN solves forward and adjoint eigenvalue problems and permits both neutron upscattering and downscattering. An adiabatic kinetics option has recently been included for performing simple time-dependent calculations in addition to standard steady state calculations. PROTEUS-SN handles void and reflective boundary conditions. Multigroup cross sections can be generated externally using the MC2-3 fast reactor multigroup cross section generation code or internally using the cross section application programming interface (API) which can treat the subgroup or resonance table libraries. PROTEUS-SN is written in Fortran 90 and also includes C preprocessor definitions. The code links against the PETSc, METIS, HDF5, and MPICH libraries. It optionally links against the MOAB library and is a part of the SHARP multi-physics suite for coupled multi-physics analysis of nuclear reactors. This user manual describes how to set up a neutron transport simulation with the PROTEUS-SN code. A companion methodology manual describes the theory and algorithms within PROTEUS-SN.

  16. A burst-mode photon counting receiver with automatic channel estimation and bit rate detection

    NASA Astrophysics Data System (ADS)

    Rao, Hemonth G.; DeVoe, Catherine E.; Fletcher, Andrew S.; Gaschits, Igor D.; Hakimi, Farhad; Hamilton, Scott A.; Hardy, Nicholas D.; Ingwersen, John G.; Kaminsky, Richard D.; Moores, John D.; Scheinbart, Marvin S.; Yarnall, Timothy M.

    2016-04-01

    We demonstrate a multi-rate burst-mode photon-counting receiver for undersea communication at data rates up to 10.416 Mb/s over a 30-foot water channel. To the best of our knowledge, this is the first demonstration of burst-mode photon-counting communication. With added attenuation, the maximum link loss is 97.1 dB at λ=517 nm. In clear ocean water, this equates to link distances up to 148 meters. For λ=470 nm, the achievable link distance in clear ocean water is 450 meters. The receiver incorporates soft-decision forward error correction (FEC) based on a product code of an inner LDPC code and an outer BCH code. The FEC supports multiple code rates to achieve error-free performance. We have selected a burst-mode receiver architecture to provide robust performance with respect to unpredictable channel obstructions. The receiver is capable of on-the-fly data rate detection and adapts to changing levels of signal and background light. The receiver updates its phase alignment and channel estimates every 1.6 ms, allowing for rapid changes in water quality as well as motion between transmitter and receiver. We demonstrate on-the-fly rate detection, channel BER within 0.2 dB of theory across all data rates, and error-free performance within 1.82 dB of soft-decision capacity across all tested code rates. All signal processing is done in FPGAs and runs continuously in real time.

  17. Large Eddy simulation of turbulence: A subgrid scale model including shear, vorticity, rotation, and buoyancy

    NASA Technical Reports Server (NTRS)

    Canuto, V. M.

    1994-01-01

    The Reynolds numbers that characterize geophysical and astrophysical turbulence (Re approximately equals 10(exp 8) for the planetary boundary layer and Re approximately equals 10(exp 14) for the Sun's interior) are too large to allow a direct numerical simulation (DNS) of the fundamental Navier-Stokes and temperature equations. In fact, the spatial number of grid points N approximately Re(exp 9/4) exceeds the computational capability of today's supercomputers. Alternative treatments are the ensemble-time average approach, and/or the volume average approach. Since the first method (Reynolds stress approach) is largely analytical, the resulting turbulence equations entail manageable computational requirements and can thus be linked to a stellar evolutionary code or, in the geophysical case, to general circulation models. In the volume average approach, one carries out a large eddy simulation (LES) which resolves numerically the largest scales, while the unresolved scales must be treated theoretically with a subgrid scale model (SGS). Contrary to the ensemble average approach, the LES+SGS approach has considerable computational requirements. Even if this prevents (for the time being) a LES+SGS model from being linked to stellar or geophysical codes, it is still of the greatest relevance as an 'experimental tool' to be used, inter alia, to improve the parameterizations needed in the ensemble average approach. Such a methodology has been successfully adopted in studies of the convective planetary boundary layer. Experience with the LES+SGS approach from different fields has shown that its reliability depends on the healthiness of the SGS model for numerical stability as well as for physical completeness. At present, the most widely used SGS model, the Smagorinsky model, accounts for the effect of the shear induced by the large resolved scales on the unresolved scales but does not account for the effects of buoyancy, anisotropy, rotation, and stable stratification. The latter phenomenon, which affects both geophysical and astrophysical turbulence (e.g., oceanic structure and convective overshooting in stars), has been singularly difficult to account for in turbulence modeling. For example, the widely used model of Deardorff has not been confirmed by recent LES results. As of today, there is no SGS model capable of incorporating buoyancy, rotation, shear, anisotropy, and stable stratification (gravity waves). In this paper, we construct such a model which we call CM (complete model). We also present a hierarchy of simpler algebraic models (called AM) of varying complexity. Finally, we present a set of models which are simplified even further (called SM), the simplest of which is the Smagorinsky-Lilly model. The incorporation of these models into the presently available LES codes should begin with the SM, to be followed by the AM and finally by the CM.
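
    For reference, the baseline Smagorinsky closure that the paper generalizes computes an eddy viscosity nu_t = (Cs*Delta)^2 |S| from the resolved strain rate, with |S| = sqrt(2 S_ij S_ij). A toy 2D sketch (standard form with the usual Cs value; the velocity field and axis conventions are illustrative):

        import numpy as np

        Cs, delta, n = 0.17, 0.1, 32                # constant, grid spacing, size
        u = np.random.randn(n, n)                   # resolved velocity components
        v = np.random.randn(n, n)
        dudy, dudx = np.gradient(u, delta)          # axis 0 = y, axis 1 = x
        dvdy, dvdx = np.gradient(v, delta)
        S11, S22 = dudx, dvdy                       # resolved strain-rate tensor
        S12 = 0.5 * (dudy + dvdx)
        S_mag = np.sqrt(2 * (S11**2 + S22**2 + 2 * S12**2))
        nu_t = (Cs * delta) ** 2 * S_mag            # Smagorinsky eddy viscosity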

  18. Large Eddy simulation of turbulence: A subgrid scale model including shear, vorticity, rotation, and buoyancy

    NASA Astrophysics Data System (ADS)

    Canuto, V. M.

    1994-06-01

    The Reynolds numbers that characterize geophysical and astrophysical turbulence (Re approximately equals 10^8 for the planetary boundary layer and Re approximately equals 10^14 for the Sun's interior) are too large to allow a direct numerical simulation (DNS) of the fundamental Navier-Stokes and temperature equations. In fact, the spatial number of grid points N ~ Re^(9/4) exceeds the computational capability of today's supercomputers. Alternative treatments are the ensemble-time average approach, and/or the volume average approach. Since the first method (Reynolds stress approach) is largely analytical, the resulting turbulence equations entail manageable computational requirements and can thus be linked to a stellar evolutionary code or, in the geophysical case, to general circulation models. In the volume average approach, one carries out a large eddy simulation (LES) which resolves numerically the largest scales, while the unresolved scales must be treated theoretically with a subgrid scale model (SGS). Contrary to the ensemble average approach, the LES+SGS approach has considerable computational requirements. Even if this prevents (for the time being) a LES+SGS model from being linked to stellar or geophysical codes, it is still of the greatest relevance as an 'experimental tool' to be used, inter alia, to improve the parameterizations needed in the ensemble average approach. Such a methodology has been successfully adopted in studies of the convective planetary boundary layer. Experience with the LES+SGS approach from different fields has shown that its reliability depends on the healthiness of the SGS model for numerical stability as well as for physical completeness. At present, the most widely used SGS model, the Smagorinsky model, accounts for the effect of the shear induced by the large resolved scales on the unresolved scales but does not account for the effects of buoyancy, anisotropy, rotation, and stable stratification. The latter phenomenon, which affects both geophysical and astrophysical turbulence (e.g., oceanic structure and convective overshooting in stars), has been singularly difficult to account for in turbulence modeling. For example, the widely used model of Deardorff has not been confirmed by recent LES results. As of today, there is no SGS model capable of incorporating buoyancy, rotation, shear, anisotropy, and stable stratification (gravity waves). In this paper, we construct such a model which we call CM (complete model). We also present a hierarchy of simpler algebraic models (called AM) of varying complexity. Finally, we present a set of models which are simplified even further (called SM), the simplest of which is the Smagorinsky-Lilly model. The incorporation of these models into the presently available LES codes should begin with the SM, to be followed by the AM and finally by the CM.

  19. The introspective may achieve more: Enhancing existing Geoscientific models with native-language emulated structural reflection

    DOE PAGES

    Ji, Xinye; Shen, Chaopeng

    2017-09-28

    Geoscientific models manage myriad and increasingly complex data structures as trans-disciplinary models are integrated. They often incur significant redundancy with cross-cutting tasks. Reflection, the ability of a program to inspect and modify its structure and behavior at runtime, is known as a powerful tool to improve code reusability, abstraction, and separation of concerns. Reflection is rarely adopted in high-performance Geoscientific models, especially with Fortran, where it was previously deemed implausible. Practical constraints of language and legacy often limit us to feather-weight, native-language solutions. We demonstrate the usefulness of a structural-reflection-emulating, dynamically-linked metaObjects, gd. We show real-world examples including data structure self-assembly, effortless save/restart and upgrade to parallel I/O, recursive actions and batch operations. We share gd and a derived module that reproduces MATLAB-like structure in Fortran and C++. We suggest that both a gd representation and a Fortran-native representation are maintained to access the data, each for separate purposes. In conclusion, embracing emulated reflection allows generically-written codes that are highly re-usable across projects.
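
    In a language with native reflection, the gd pattern collapses to a few lines. The Python analogue below (names invented) shows the idea the Fortran metaObjects emulate: code that iterates over whatever fields a container happens to hold, so save/restart and batch operations need no per-model boilerplate.

        class MetaObject:
            """Generic container whose fields are discovered at runtime."""
            def __init__(self, **fields):
                self.__dict__.update(fields)

            def save(self):
                # Reflection: act on whatever fields exist, without hard-coding.
                return {name: value for name, value in vars(self).items()}

        state = MetaObject(h=[1.0, 2.0], t=0.5, nx=100)
        print(state.save())   # {'h': [1.0, 2.0], 't': 0.5, 'nx': 100}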

  20. The introspective may achieve more: Enhancing existing Geoscientific models with native-language emulated structural reflection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji, Xinye; Shen, Chaopeng

    Geoscientific models manage myriad and increasingly complex data structures as trans-disciplinary models are integrated. They often incur significant redundancy with cross-cutting tasks. Reflection, the ability of a program to inspect and modify its structure and behavior at runtime, is known as a powerful tool to improve code reusability, abstraction, and separation of concerns. Reflection is rarely adopted in high-performance Geoscientific models, especially with Fortran, where it was previously deemed implausible. Practical constraints of language and legacy often limit us to feather-weight, native-language solutions. We demonstrate the usefulness of a structural-reflection-emulating, dynamically-linked metaObjects, gd. We show real-world examples including data structure self-assembly, effortless save/restart and upgrade to parallel I/O, recursive actions and batch operations. We share gd and a derived module that reproduces MATLAB-like structure in Fortran and C++. We suggest that both a gd representation and a Fortran-native representation are maintained to access the data, each for separate purposes. In conclusion, embracing emulated reflection allows generically-written codes that are highly re-usable across projects.

  1. Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.

    1999-01-01

    As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.

  2. CSlib, a library to couple codes via Client/Server messaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve

    The CSlib is a small, portable library which enables two (or more) independent simulation codes to be coupled by exchanging messages with each other. Both codes link to the library when they are built, and can then communicate with each other as they run. The messages contain data or instructions that the two codes send back and forth to each other. The messaging can take place via files, sockets, or MPI. The latter is a standard distributed-memory message-passing library.
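
    A minimal sketch of the client/server coupling pattern, using plain sockets and JSON rather than the actual CSlib API (message names are invented): run serve_once in one process and request in another, standing in for the two coupled simulation codes.

        import socket, json

        def serve_once(port=5555):
            """One code waits for a request and replies with data."""
            srv = socket.create_server(("localhost", port))
            conn, _ = srv.accept()
            msg = json.loads(conn.recv(4096).decode())  # e.g. {"cmd": "get_forces"}
            conn.sendall(json.dumps({"forces": [0.0, 0.1]}).encode())
            conn.close(); srv.close()

        def request(port=5555):
            """The other code sends an instruction and reads the reply."""
            c = socket.create_connection(("localhost", port))
            c.sendall(json.dumps({"cmd": "get_forces"}).encode())
            reply = json.loads(c.recv(4096).decode())
            c.close()
            return reply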

  3. VizieR Online Data Catalog: Mercury-T code (Bolmont+, 2015)

    NASA Astrophysics Data System (ADS)

    Bolmont, E.; Raymond, S. N.; Leconte, J.; Hersant, F.; Correia, A. C. M.

    2015-11-01

    The major addition to Mercury provided in Mercury-T is the inclusion of tidal forces and torques, but we have also added the effects of general relativity and rotation-induced deformation. We explain in the following sections how these effects were incorporated into the code. We also give the planet and star/BD/Jupiter parameters which are implemented in the code. The link to this code and the manual can also be found here: http://www.emelinebolmont.com/research-interests (2 data files).

  4. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Shu, L.; Kasami, T.

    1985-01-01

    A cascade coding scheme for error control is investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are evaluated. They seem to be quite suitable for satellite down-link error control.

  5. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Lin, S.

    1985-01-01

    A cascaded coding scheme for error control was investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are studied which seem to be quite suitable for satellite down-link error control.

  6. Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.

    PubMed

    Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan

    2016-09-01

    Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Air Quality Modeling and Forecasting over the United States Using WRF-Chem

    NASA Astrophysics Data System (ADS)

    Boxe, C.; Hafsa, U.; Blue, S.; Emmanuel, S.; Griffith, E.; Moore, J.; Tam, J.; Khan, I.; Cai, Z.; Bocolod, B.; Zhao, J.; Ahsan, S.; Gurung, D.; Tang, N.; Bartholomew, J.; Rafi, R.; Caltenco, K.; Rivas, M.; Ditta, H.; Alawlaqi, H.; Rowley, N.; Khatim, F.; Ketema, N.; Strothers, J.; Diallo, I.; Owens, C.; Radosavljevic, J.; Austin, S. A.; Johnson, L. P.; Zavala-Gutierrez, R.; Breary, N.; Saint-Hilaire, D.; Skeete, D.; Stock, J.; Salako, O.

    2016-12-01

    WRF-Chem is the Weather Research and Forecasting (WRF) model coupled with Chemistry. The model simulates the emission, transport, mixing, and chemical transformation of trace gases and aerosols simultaneously with the meteorology. The model is used for investigation of regional-scale air quality, field program analysis, and cloud-scale interactions between clouds and chemistry. The development of WRF-Chem is a collaborative effort among the community led by NOAA/ESRL scientists. The official WRF-Chem web page is located at the NOAA web site. Our model development is closely linked with both NOAA/ESRL and DOE/PNNL efforts. A description of PNNL WRF-Chem model development is located at the PNNL web site as well as the PNNL Aerosol Modeling Testbed. High school and undergraduate students, representative of academic institutions throughout the USA's Tri-State Area (New York, New Jersey, Connecticut), set up WRF-Chem on CUNY CSI's High Performance Computing Center. Students learned the back-end coding that governs WRF-Chem's structure and the front-end coding that displays specified weather simulations and forecasts. Students also investigated the impact on selected baseline simulations/forecasts of the reaction NO2 + OH + M → HOONO + M (k = 9.2 × 10^-12 cm^3 molecule^-1 s^-1; Mollner et al. 2010). The reaction of OH and NO2 to form gaseous nitric acid (HONO2) is among the most influential reactions in atmospheric chemistry. Until a few years ago, its rate coefficient remained poorly determined under tropospheric conditions because of difficulties in making laboratory measurements at 760 torr. These activities foster student coding competencies and deep insight into weather forecasting and air quality.

  8. Modeling variably saturated subsurface solute transport with MODFLOW-UZF and MT3DMS

    USGS Publications Warehouse

    Morway, Eric D.; Niswonger, Richard G.; Langevin, Christian D.; Bailey, Ryan T.; Healy, Richard W.

    2013-01-01

    The MT3DMS groundwater solute transport model was modified to simulate solute transport in the unsaturated zone by incorporating the unsaturated-zone flow (UZF1) package developed for MODFLOW. The modified MT3DMS code uses a volume-averaged approach in which Lagrangian-based UZF1 fluid fluxes and storage changes are mapped onto a fixed grid. Referred to as UZF-MT3DMS, the linked model was tested against published benchmarks solved analytically as well as against other published codes, most frequently the U.S. Geological Survey's Variably-Saturated Two-Dimensional Flow and Transport Model. Results from a suite of test cases demonstrate that the modified code accurately simulates solute advection, dispersion, and reaction in the unsaturated zone. Two- and three-dimensional simulations also were investigated to ensure unsaturated-saturated zone interaction was simulated correctly. Because the UZF1 solution is analytical, large-scale flow and transport investigations can be performed free from the computational and data burdens required by numerical solutions to Richards' equation. Results demonstrate that significant simulation runtime savings can be achieved with UZF-MT3DMS, an important development when hundreds or thousands of model runs are required during parameter estimation and uncertainty analysis. Three-dimensional variably saturated flow and transport simulations revealed UZF-MT3DMS to have runtimes that are less than one tenth of the time required by models that rely on Richards' equation. Given its accuracy and efficiency, and the wide-spread use of both MODFLOW and MT3DMS, the added capability of unsaturated-zone transport in this familiar modeling framework stands to benefit a broad user-ship.

  9. Modeling variably saturated subsurface solute transport with MODFLOW-UZF and MT3DMS.

    PubMed

    Morway, Eric D; Niswonger, Richard G; Langevin, Christian D; Bailey, Ryan T; Healy, Richard W

    2013-03-01

    The MT3DMS groundwater solute transport model was modified to simulate solute transport in the unsaturated zone by incorporating the unsaturated-zone flow (UZF1) package developed for MODFLOW. The modified MT3DMS code uses a volume-averaged approach in which Lagrangian-based UZF1 fluid fluxes and storage changes are mapped onto a fixed grid. Referred to as UZF-MT3DMS, the linked model was tested against published benchmarks solved analytically as well as against other published codes, most frequently the U.S. Geological Survey's Variably-Saturated Two-Dimensional Flow and Transport Model. Results from a suite of test cases demonstrate that the modified code accurately simulates solute advection, dispersion, and reaction in the unsaturated zone. Two- and three-dimensional simulations also were investigated to ensure unsaturated-saturated zone interaction was simulated correctly. Because the UZF1 solution is analytical, large-scale flow and transport investigations can be performed free from the computational and data burdens required by numerical solutions to Richards' equation. Results demonstrate that significant simulation runtime savings can be achieved with UZF-MT3DMS, an important development when hundreds or thousands of model runs are required during parameter estimation and uncertainty analysis. Three-dimensional variably saturated flow and transport simulations revealed UZF-MT3DMS to have runtimes that are less than one tenth of the time required by models that rely on Richards' equation. Given its accuracy and efficiency, and the wide-spread use of both MODFLOW and MT3DMS, the added capability of unsaturated-zone transport in this familiar modeling framework stands to benefit a broad user-ship. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.

  10. Modeling and inversion Matlab algorithms for resistivity, induced polarization and seismic data

    NASA Astrophysics Data System (ADS)

    Karaoulis, M.; Revil, A.; Minsley, B. J.; Werkema, D. D.

    2011-12-01

    We propose a 2D and 3D forward modeling and inversion package for DC resistivity, time-domain induced polarization (IP), frequency-domain IP, and seismic refraction data. For the resistivity and IP case, discretization is based on rectangular cells, where each cell has an unknown resistivity in the case of DC modeling, resistivity and chargeability in time-domain IP modeling, and complex resistivity in spectral IP modeling. The governing partial-differential equations are solved with the finite element method, which can be applied to both the real and complex variables that are solved for. For the seismic case, forward modeling is based on solving the eikonal equation using a second-order fast marching method. The wavepaths are materialized by Fresnel volumes rather than by conventional rays. This approach accounts for complicated velocity models and is advantageous because it considers frequency effects on the velocity resolution. The inversion can accommodate data at a single time step, or as a time-lapse dataset if the geophysical data are gathered for monitoring purposes. The aim of time-lapse inversion is to find the change in the velocities or resistivities of each model cell as a function of time. Different time-lapse algorithms can be applied, such as independent inversion, difference inversion, 4D inversion, and 4D active time constraint inversion. The forward algorithms are benchmarked against analytical solutions and inversion results are compared with existing ones. The algorithms are packaged as Matlab codes with a simple graphical user interface. Although the code is parallelized for multi-core CPUs, it is not as fast as machine code; in the case of large datasets, one should consider transferring parts of the code to C or Fortran through mex files. This code is available through EPA's website at the following link: http://www.epa.gov/esd/cmb/GeophysicsWebsite/index.html. Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy.

  11. Bar Coding and Tracking in Pathology.

    PubMed

    Hanna, Matthew G; Pantanowitz, Liron

    2016-03-01

    Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Protograph LDPC Codes Over Burst Erasure Channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Jones, Christopher

    2006-01-01

    In this paper we design high-rate protograph-based LDPC codes suitable for binary erasure channels. To simplify the encoder and decoder implementation for high-data-rate transmission, the structure of the codes is based on protographs and circulants. These LDPC codes can improve data link and network layer protocols in support of communication networks. Two classes of codes were designed. One class is designed for large block sizes with an iterative decoding threshold that approaches the capacity of binary erasure channels. The other class is designed for short block sizes based on maximizing the minimum stopping set size. For high code rates and short blocks the second class outperforms the first.
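
    On the binary erasure channel, LDPC decoding reduces to iterative peeling: any parity check with exactly one erased bit determines that bit. A toy sketch with a hand-made parity-check matrix, far smaller than any protograph code:

        import numpy as np

        H = np.array([[1, 1, 0, 1, 0, 0],       # toy parity-check matrix
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])
        rx = [0, 1, None, 1, None, 1]           # None marks an erased bit

        changed = True
        while changed:                          # peel until no check helps
            changed = False
            for check in H:
                idx = [i for i in np.nonzero(check)[0] if rx[i] is None]
                if len(idx) == 1:               # exactly one erasure: solvable
                    known = sum(rx[i] for i in np.nonzero(check)[0]
                                if rx[i] is not None)
                    rx[idx[0]] = known % 2      # parity must be even
                    changed = True
        print(rx)                               # -> [0, 1, 1, 1, 0, 1]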

  13. Bar Coding and Tracking in Pathology.

    PubMed

    Hanna, Matthew G; Pantanowitz, Liron

    2015-06-01

    Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Nonlinear transient analysis by energy minimization: A theoretical basis for the ACTION computer code. [predicting the response of a lightweight aircraft during a crash

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.

    1980-01-01

    The formulation basis for establishing the static or dynamic equilibrium configurations of finite element models of structures which may behave in the nonlinear range is provided. With both geometric and time-independent material nonlinearities included, the development is restricted to simple one- and two-dimensional finite elements which are regarded as the basic elements for modeling full aircraft-like structures under crash conditions. Representations of a rigid link and an impenetrable contact plane are added to the deformation model so that any number of nodes of the finite element model may be connected by a rigid link or may contact the plane. Equilibrium configurations are derived as the stationary conditions of a potential function of the generalized nodal variables of the model. Minimization of the nonlinear potential function is achieved by using the best current variable metric update formula for unconstrained minimization; Powell's conjugate gradient algorithm, which offers very low storage requirements at some slight increase in the total number of calculations, is an alternative algorithm for extremely large scale problems.
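
    The paper's central idea, equilibrium as a stationary point of the potential, carries over directly to modern tooling. A sketch for a toy system of two springs in series under an end load, minimized with a conjugate-gradient method (all parameters invented):

        import numpy as np
        from scipy.optimize import minimize

        k1, k2, P = 100.0, 80.0, 5.0        # spring stiffnesses, applied load

        def potential(u):
            """Total potential = strain energy minus work of the external load."""
            u1, u2 = u
            strain = 0.5 * k1 * u1**2 + 0.5 * k2 * (u2 - u1) ** 2
            return strain - P * u2

        res = minimize(potential, x0=np.zeros(2), method="CG")
        print(res.x)                        # equilibrium: [0.05, 0.1125]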

  15. Study in Mice Links Key Signaling Molecule to Underlying Cause of Osteogenesis Imperfecta

    MedlinePlus

    ... by mutations in a gene that codes for collagen, an abundant structural component of bone. This type ... linked to defects in enzymes that help process collagen to its mature form. These types of OI ...

  16. Strategies for comparing gene expression profiles from different microarray platforms: application to a case-control experiment.

    PubMed

    Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina

    2006-06-01

    Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.

  17. Thermo-Mechanical and Electrochemistry Modeling of Planar SOFC Stacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Recknagle, Kurtis P.; Lin, Zijing

    2002-12-01

    Modeling activities at PNNL support the design and development of modular SOFC systems. The SOFC stack modeling capability at PNNL has developed to a level at which planar stack designs can be compared and optimized for startup performance. Thermal-fluids and stress modeling is being performed to predict the transient temperature distribution and to determine the thermal stresses based on that distribution. Current efforts also include the development of a model for calculating current density, cell voltage, and heat production in SOFC stacks with hydrogen or other fuels. The model includes the heat generation from both Joule heating and chemical reactions. It also accounts for species production and destruction via mass balance. The model is being linked to the finite element code MARC to allow for the evaluation of temperatures and stresses during steady-state operation.

  18. Occupational self-coding and automatic recording (OSCAR): a novel web-based tool to collect and code lifetime job histories in large population-based studies.

    PubMed

    De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul

    2017-03-01

    Objectives: The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods: We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results: OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions: OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
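
    A minimal sketch of the decision-tree mechanism described above: the respondent walks down a job-category hierarchy until reaching a leaf that carries the hidden 4-digit SOC code. The two-branch tree here is an illustrative toy, far smaller than the real UK SOC 2000 classification, and the walk function is hypothetical rather than OSCAR's actual implementation.

        # Tiny stand-in for the SOC 2000 hierarchy: categories nest until
        # a leaf holding the hidden 4-digit SOC code is reached.
        soc_tree = {
            "Health professionals": {
                "Medical practitioners": "2211",
                "Pharmacists": "2213",
            },
            "Transport operatives": {
                "Bus and coach drivers": "8213",
                "Taxi drivers": "8214",
            },
        }

        def walk(tree):
            node = tree
            while isinstance(node, dict):      # descend until a leaf
                options = list(node)
                for i, name in enumerate(options):
                    print(f"[{i}] {name}")
                node = node[options[int(input("choose: "))]]
            return node                        # the hidden 4-digit SOC code

        # print(walk(soc_tree))  # interactive; returns e.g. "2211"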

  19. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model, which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
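
    The mixture-model segmentation step is a good example of the kind of code such a system can synthesize from a few lines of specification. The sketch below is a hand-written analogue for a 1-D two-component Gaussian mixture over synthetic pixel intensities; the closed-form M-step updates are exactly the sort of solution AutoBayes derives symbolically.

        # Hand-written analogue of synthesized segmentation code: EM for a
        # two-component 1-D Gaussian mixture over synthetic "pixel" data.
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.concatenate([rng.normal(0.2, 0.05, 500),   # background
                            rng.normal(0.8, 0.10, 200)])  # object pixels

        w = np.array([0.5, 0.5])          # mixture weights
        mu = np.array([0.3, 0.7])         # initial means
        sig = np.array([0.2, 0.2])        # initial standard deviations
        for _ in range(50):
            # E-step: responsibility of each component for each pixel
            pdf = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
            r = w * pdf
            r /= r.sum(axis=1, keepdims=True)
            # M-step: closed-form updates of the kind derived symbolically
            w = r.mean(axis=0)
            mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
            sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))

        print(mu)    # converges near the true component means 0.2 and 0.8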

  20. Hierarchical surface code for network quantum computing with modules of arbitrary size

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2016-10-01

    The network paradigm for quantum computing involves interconnecting many modules to form a scalable machine. Typically it is assumed that the links between modules are prone to noise while operations within modules have a significantly higher fidelity. To optimize fault tolerance in such architectures we introduce a hierarchical generalization of the surface code: a small "patch" of the code exists within each module and constitutes a single effective qubit of the logic-level surface code. Errors primarily occur in a two-dimensional subspace, i.e., patch perimeters extruded over time, and the resulting noise threshold for intermodule links can exceed ~10% even in the absence of purification. Increasing the number of qubits within each module decreases the number of qubits necessary for encoding a logical qubit. But this advantage is relatively modest, and broadly speaking, a "fine-grained" network of small modules containing only about eight qubits is competitive in total qubit count versus a "coarse" network with modules containing many hundreds of qubits.

  1. Dopamine Modulates Adaptive Prediction Error Coding in the Human Midbrain and Striatum.

    PubMed

    Diederen, Kelly M J; Ziauddeen, Hisham; Vestergaard, Martin D; Spencer, Tom; Schultz, Wolfram; Fletcher, Paul C

    2017-02-15

    Learning to optimally predict rewards requires agents to account for fluctuations in reward value. Recent work suggests that individuals can efficiently learn about variable rewards through adaptation of the learning rate, and coding of prediction errors relative to reward variability. Such adaptive coding has been linked to midbrain dopamine neurons in nonhuman primates, and evidence in support of a similar role of the dopaminergic system in humans is emerging from fMRI data. Here, we sought to investigate the effect of dopaminergic perturbations on adaptive prediction error coding in humans, using a between-subject, placebo-controlled pharmacological fMRI study with a dopaminergic agonist (bromocriptine) and antagonist (sulpiride). Participants performed a previously validated task in which they predicted the magnitude of upcoming rewards drawn from distributions with varying SDs. After each prediction, participants received a reward, yielding trial-by-trial prediction errors. Under placebo, we replicated previous observations of adaptive coding in the midbrain and ventral striatum. Treatment with sulpiride attenuated adaptive coding in both midbrain and ventral striatum, and was associated with a decrease in performance, whereas bromocriptine did not have a significant impact. Although we observed no differential effect of SD on performance between the groups, computational modeling suggested decreased behavioral adaptation in the sulpiride group. These results suggest that normal dopaminergic function is critical for adaptive prediction error coding, a key property of the brain thought to facilitate efficient learning in variable environments. Crucially, these results also offer potential insights for understanding the impact of disrupted dopamine function in mental illness. SIGNIFICANCE STATEMENT: To choose optimally, we have to learn what to expect. Humans dampen learning when there is a great deal of variability in reward outcome, and two brain regions that are modulated by the brain chemical dopamine are sensitive to reward variability. Here, we aimed to directly relate dopamine to learning about variable rewards, and the neural encoding of associated teaching signals. We perturbed dopamine in healthy individuals using dopaminergic medication and asked them to predict variable rewards while we acquired brain scans. Dopamine perturbations impaired learning and the neural encoding of reward variability, thus establishing a direct link between dopamine and adaptation to reward variability. These results aid our understanding of clinical conditions associated with dopaminergic dysfunction, such as psychosis.
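
    The behavioral principle at stake can be caricatured in a few lines: a delta-rule learner that normalizes its prediction errors by a running estimate of reward variability learns steadily whether rewards are stable or noisy. This is a schematic reading of adaptive coding for illustration only, not the authors' fitted computational model; all parameter values are assumptions.

        # Delta-rule learner with SD-normalized ("adaptively coded")
        # prediction errors. Parameters are illustrative assumptions.
        import random

        random.seed(1)
        value, sd_est, alpha = 0.0, 1.0, 0.3
        for trial in range(200):
            reward = random.gauss(10.0, 2.0)   # rewards drawn with SD = 2
            error = reward - value             # raw prediction error
            value += alpha * (error / sd_est)  # update with scaled error
            sd_est += 0.05 * (abs(error) - sd_est)   # track variability

        # value converges near the true mean (10); sd_est settles near the
        # mean absolute error, which scales with the reward SD
        print(round(value, 2), round(sd_est, 2))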

  2. raaSAFT: A framework enabling coarse-grained molecular dynamics simulations based on the SAFT- γ Mie force field

    NASA Astrophysics Data System (ADS)

    Ervik, Åsmund; Serratos, Guadalupe Jiménez; Müller, Erich A.

    2017-03-01

    We describe here raaSAFT, a Python code that enables the setup and running of coarse-grained molecular dynamics simulations in a systematic and efficient manner. The code is built on top of the popular HOOMD-blue code, and as such harnesses the computational power of GPUs. The methodology makes use of the SAFT-γ Mie force field, so the resulting coarse-grained pair potentials are both closely linked to and consistent with the macroscopic thermodynamic properties of the simulated fluid. In raaSAFT both homonuclear and heteronuclear models are implemented for a wide range of compounds, spanning from linear alkanes to more complicated fluids such as water and alcohols, all the way up to nonionic surfactants and models of asphaltenes and resins. Adding new compounds as well as new features is made straightforward by the modularity of the code. To demonstrate the ease of use of raaSAFT, we give a detailed walkthrough of how to simulate liquid-liquid equilibrium of a hydrocarbon with water, and describe in detail how both homonuclear and heteronuclear compounds are implemented. To demonstrate the performance and versatility of raaSAFT, we simulate a large polymer-solvent mixture with 300 polystyrene molecules dissolved in 42,700 molecules of heptane, reproducing the experimentally observed temperature-dependent solubility of polystyrene. For this case we obtain a speedup of more than three orders of magnitude compared to atomistically detailed simulations.
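
    At the heart of the force field is the Mie pair potential with SAFT-fitted parameters. The sketch below evaluates it directly; the numerical parameter values are placeholders rather than a published bead parameterization, and in raaSAFT itself such potentials are handed to HOOMD-blue rather than evaluated by hand.

        # Mie (generalized Lennard-Jones) pair potential used by SAFT-gamma:
        # U(r) = C*eps*[(sigma/r)^lr - (sigma/r)^la], with prefactor C
        # chosen so the well depth equals eps.
        import numpy as np

        def mie(r, eps, sigma, lam_r, lam_a):
            c = (lam_r / (lam_r - lam_a)) * (lam_r / lam_a) ** (lam_a / (lam_r - lam_a))
            return c * eps * ((sigma / r) ** lam_r - (sigma / r) ** lam_a)

        r = np.linspace(0.9, 3.0, 5)           # reduced separations
        print(mie(r, eps=0.4, sigma=1.0, lam_r=15.0, lam_a=6.0))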

  3. Alternatively Constrained Dictionary Learning For Image Superresolution.

    PubMed

    Lu, Xiaoqiang; Yuan, Yuan; Yan, Pingkun

    2014-03-01

    Dictionaries are crucial in sparse coding-based algorithms for image superresolution. Sparse coding is a typical unsupervised learning method used to study the relationship between the patches of high- and low-resolution images. However, most sparse coding methods for image superresolution fail to simultaneously consider the geometrical structure of the dictionary and the corresponding coefficients, which may result in noticeable superresolution reconstruction artifacts. In other words, when a low-resolution image and its corresponding high-resolution image are represented in their feature spaces, the two sets of dictionaries and the obtained coefficients have intrinsic links, which have not yet been well studied. Motivated by developments in nonlocal self-similarity and manifold learning, a novel sparse coding method is reported that preserves the geometrical structure of the dictionary and the sparse coefficients of the data. Moreover, the proposed method can preserve the incoherence of dictionary entries and provides the sparse coefficients and learned dictionary from a new perspective, with both reconstruction and discrimination properties to enhance the learning performance. Furthermore, to utilize the proposed model more effectively for single-image superresolution, this paper also proposes a novel dictionary-pair learning method, named two-stage dictionary training. Extensive experiments are carried out on a large set of images in comparison with other popular algorithms for the same purpose, and the results clearly demonstrate the effectiveness of the proposed sparse representation model and the corresponding dictionary learning algorithm.
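
    The sparse-coding step these dictionary-learning methods build on can be written compactly. Below is a bare-bones orthogonal matching pursuit over a random unit-norm dictionary; it is a generic baseline for context, not the structure-preserving model proposed in the paper.

        # Bare-bones orthogonal matching pursuit (OMP): greedily select
        # dictionary atoms and refit coefficients by least squares.
        import numpy as np

        def omp(D, y, k):
            """Sparse-code y over dictionary D (columns = atoms), k atoms."""
            residual, support = y.copy(), []
            for _ in range(k):
                j = int(np.argmax(np.abs(D.T @ residual)))   # best match
                support.append(j)
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            x = np.zeros(D.shape[1])
            x[support] = coef
            return x

        rng = np.random.default_rng(0)
        D = rng.normal(size=(32, 64))
        D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
        y = 2.0 * D[:, 3] - 1.5 * D[:, 40]        # exactly 2-sparse signal
        print(np.nonzero(omp(D, y, k=2))[0])      # expected: [ 3 40 ]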

  4. Landsat Data Continuity Mission (LDCM) - Optimizing X-Band Usage

    NASA Technical Reports Server (NTRS)

    Garon, H. M.; Gal-Edd, J. S.; Dearth, K. W.; Sank, V. I.

    2010-01-01

    The NASA version of the low-density parity-check (LDPC) 7/8-rate code, shortened to the dimensions (8160, 7136), has been implemented as the forward error correction (FEC) scheme for the Landsat Data Continuity Mission (LDCM). This is the first flight application of this code. In order to place a 440 Msps link within the 375 MHz wide X band, we found it necessary to heavily bandpass filter the satellite transmitter output. Despite the significant amplitude and phase distortions that accompanied the spectral truncation, the mission-required BER is maintained at < 10^-12 with less than 2 dB of implementation loss. We utilized a band-pass filter designed to replicate the link distortions in order to demonstrate link design viability. The same filter was then used to optimize the adaptive equalizer in the receiver employed at the terminus of the downlink. The excellent results we obtained could be directly attributed to the implementation of the LDPC code and the amplitude and phase compensation provided in the receiver. Similar results were obtained with receivers from several vendors.

  5. SAADA: Astronomical Databases Made Easier

    NASA Astrophysics Data System (ADS)

    Michel, L.; Nguyen, H. N.; Motch, C.

    2005-12-01

    Many astronomers wish to share datasets with their community but lack the manpower to develop databases having the functionalities required for high-level scientific applications. The SAADA project aims at automating the creation and deployment of such databases. A generic but scientifically relevant data model has been designed which allows one to build databases by providing only a limited number of product mapping rules. Databases created by SAADA rely on a relational database supporting JDBC, covered by a Java layer that includes a large amount of generated code. Such databases can simultaneously host spectra, images, source lists and plots. Data are grouped in user-defined collections whose content can be seen as one unique set per data type even if their formats differ. Datasets can be correlated with one another using qualified links. These links help, for example, to handle the nature of a cross-identification (e.g., a distance or a likelihood) or to describe their scientific content (e.g., by associating a spectrum with a catalog entry). The SAADA query engine is based on a language well suited to the data model which can handle constraints on linked data, in addition to classical astronomical queries. These constraints can be applied on the linked objects (number, class and attributes) and/or on the link qualifier values. Databases created by SAADA are accessed through a rich Web interface or a Java API. We are currently developing an interoperability module implementing VO protocols.

  6. Human factors in cockpit input and display for data link.

    DOT National Transportation Integrated Search

    1971-01-01

    Problems associated with the entry of air-ground-air : messages via keyboard for transmission by Data Link : are discussed. The ARINC proposal for a keyboard is : presented, and an alternative method for coding keys : is proposed for comparative eval...

  7. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.

  8. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.
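
    The statistical core of LeRC-SLAM is compact: with attenuation modeled as log-normal, the probability of exceeding a given fade depth is a Gaussian tail integral. The sketch below shows that calculation; the mu and sigma values are invented site statistics for illustration, not output of the actual program or its internal rain database.

        # P(A > a) for log-normal rain attenuation A (in dB):
        # ln(A) ~ Normal(mu, sigma), so exceedance is a Gaussian tail.
        import math

        def p_exceed(fade_db, mu, sigma):
            z = (math.log(fade_db) - mu) / sigma
            return 0.5 * math.erfc(z / math.sqrt(2))

        for fade in (1.0, 3.0, 6.0, 10.0):     # candidate fade depths, dB
            print(fade, round(p_exceed(fade, mu=0.0, sigma=1.2), 4))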

  9. A Computational Model Predicting Disruption of Blood Vessel Development

    PubMed Central

    Kleinstreuer, Nicole; Dix, David; Rountree, Michael; Baker, Nancy; Sipes, Nisha; Reif, David; Spencer, Richard; Knudsen, Thomas

    2013-01-01

    Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a variety of biological pathways linked to endothelial cell (EC) behavior, extracellular matrix (ECM) remodeling and the local generation of chemokines and growth factors. Simulating these interactions at a systems level requires sufficient biological detail about the relevant molecular pathways and associated cellular behaviors, and tractable computational models that offset mathematical and biological complexity. Here, we describe a novel multicellular agent-based model of vasculogenesis using the CompuCell3D (http://www.compucell3d.org/) modeling environment supplemented with semi-automatic knowledgebase creation. The model incorporates vascular endothelial growth factor signals, pro- and anti-angiogenic inflammatory chemokine signals, and the plasminogen activating system of enzymes and proteases linked to ECM interactions, to simulate nascent EC organization, growth and remodeling. The model was shown to recapitulate stereotypical capillary plexus formation and structural emergence of non-coded cellular behaviors, such as a heterologous bridging phenomenon linking endothelial tip cells together during formation of polygonal endothelial cords. Molecular targets in the computational model were mapped to signatures of vascular disruption derived from in vitro chemical profiling using the EPA's ToxCast high-throughput screening (HTS) dataset. Simulating the HTS data with the cell-agent based model of vascular development predicted adverse effects of a reference anti-angiogenic thalidomide analog, 5HPP-33, on in vitro angiogenesis with respect to both concentration-response and morphological consequences. These findings support the utility of cell agent-based models for simulating a morphogenetic series of events and for the first time demonstrate the applicability of these models for predictive toxicology. PMID:23592958

  10. Design and Implementation of Secure and Reliable Communication using Optical Wireless Communication

    NASA Astrophysics Data System (ADS)

    Saadi, Muhammad; Bajpai, Ambar; Zhao, Yan; Sangwongngam, Paramin; Wuttisittikulkij, Lunchakorn

    2014-11-01

    Wireless networking increases flexibility in the home and office environment by connecting to the internet without wires, but at the cost of risks associated with data theft or the loading of malicious code with the intention of harming the network. In this paper, we propose a novel method of establishing a secure and reliable communication link using optical wireless communication (OWC). For security, spatial-diversity-based transmission using two optical transmitters is used, and reliability in the link is achieved by a newly proposed method for the construction of a structured parity-check matrix for binary Low Density Parity Check (LDPC) codes. Experimental results show that a successful secure and reliable link between the transmitter and the receiver can be achieved by using the proposed technique.

  11. An Arrhenius-type viscosity function to model sintering using the Skorohod Olevsky viscous sintering model within a finite element code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewsuk, Kevin Gregory; Arguello, Jose Guadalupe, Jr.; Reiterer, Markus W.

    2006-02-01

    The ease and ability to predict sintering shrinkage and densification with the Skorohod-Olevsky viscous sintering (SOVS) model within a finite-element (FE) code have been improved with the use of an Arrhenius-type viscosity function. The need for a better viscosity function was identified by evaluating SOVS model predictions made using a previously published polynomial viscosity function. Predictions made using the original, polynomial viscosity function do not accurately reflect experimentally observed sintering behavior. To more easily and better predict sintering behavior using FE simulations, a thermally activated viscosity function based on creep theory was used with the SOVS model. In comparison with the polynomial viscosity function, SOVS model predictions made using the Arrhenius-type viscosity function are more representative of experimentally observed viscosity and sintering behavior. Additionally, the effects of changes in heating rate on densification can easily be predicted with the Arrhenius-type viscosity function. Another attribute of the Arrhenius-type viscosity function is that it provides the potential to link different sintering models. For example, the apparent activation energy, Q, for densification used in the construction of the master sintering curve for a low-temperature cofire ceramic dielectric has been used as the apparent activation energy for material flow in the Arrhenius-type viscosity function to predict heating-rate-dependent sintering behavior using the SOVS model.
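
    The viscosity function itself is a one-liner, which is much of its appeal inside a finite element loop. A minimal sketch follows, with eta0 and Q as placeholder values rather than the paper's fitted parameters.

        # Arrhenius-type viscosity for the SOVS model: thermally activated
        # flow, eta(T) = eta0 * exp(Q / (R*T)).
        import math

        R = 8.314                       # gas constant, J/(mol K)

        def eta(T, eta0=1.0e-4, Q=350e3):
            return eta0 * math.exp(Q / (R * T))

        # Viscosity falls steeply on heating, which is what lets the model
        # capture heating-rate-dependent shrinkage and densification.
        for T in (900.0, 1000.0, 1100.0):      # temperatures in kelvin
            print(T, f"{eta(T):.3e}")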

  12. Evidence-Based Reading and Writing Assessment for Dyslexia in Adolescents and Young Adults

    PubMed Central

    Nielsen, Kathleen; Abbott, Robert; Griffin, Whitney; Lott, Joe; Raskind, Wendy; Berninger, Virginia W.

    2016-01-01

    The same working memory and reading and writing achievement phenotypes (behavioral markers of genetic variants) validated in prior research with younger children and older adults in a multi-generational family genetics study of dyslexia were used to study 81 adolescents and young adults (ages 16 to 25) from that study. Dyslexia is characterized by word reading and spelling skills below the population mean, despite the ability to use oral language to express thinking. These working memory predictor measures were given and used to predict reading and writing achievement: Coding (storing and processing) of heard and spoken words (phonological coding), read and written words (orthographic coding), base words and affixes (morphological coding), and words accumulating over time (syntax coding); Cross-Code Integration (a phonological loop for linking phonological name and orthographic letter codes, and an orthographic loop for linking orthographic letter codes and finger sequencing codes); and Supervisory Attention (focused and switching attention and self-monitoring during written word finding). Multiple regressions showed that most predictors explained individual differences in at least one reading or writing outcome, but which predictors explained unique variance beyond shared variance depended on the outcome. ANOVAs confirmed that research-supported criteria for dyslexia validated for younger children and their parents could be used to diagnose which adolescents and young adults did (n=31) or did not (n=50) meet research criteria for dyslexia. Findings are discussed in reference to the heterogeneity of phenotypes (behavioral markers of genetic variables) and their application to assessment for accommodations and ongoing instruction for adolescents and young adults with dyslexia. PMID:26855554

  13. Speech coding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding; it is more generic in the sense that the coding techniques are equally applicable to any voice signal, whether or not it carries intelligible information, as the term speech implies. Other terms commonly used are speech compression and voice compression, since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or equivalently the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice are used interchangeably.

  14. Behavioral change theories can inform the prediction of young adults' adoption of a plant-based diet.

    PubMed

    Wyker, Brett A; Davison, Kirsten K

    2010-01-01

    Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated salient beliefs. Cross-sectional. Large public university in the northeastern United States. 204 college students. TPB and TTM constructs were assessed using validated scales. Outcome, normative, and control beliefs were measured using open-ended questions. The overlap between stages of change for FV consumption and adopting a PBD was assessed using Spearman rank correlation analysis and cross-tab comparisons. The proposed model predicting adoption of a PBD was tested using structural equation modeling (SEM). Salient beliefs were coded using automatic response coding software. No association was found between stages of change for FV consumption and following a PBD. Results from SEM analyses provided support for the proposed model predicting intention to follow a PBD. Gender differences in salient beliefs for following a PBD were found. Results demonstrate the potential for effective theory-driven and stage-tailored public health interventions to promote PBDs.

  15. EEE Links. Volume 5

    NASA Technical Reports Server (NTRS)

    Humphrey, Robert (Editor)

    1999-01-01

    The EEE Links Newsletter is a quarterly publication produced by Code 562 in support of the NASA HQ funded NASA Electronic Parts and Packaging (NEPP) Program. The newsletter is produced as an electronic-format deliverable made available via the referenced www site administered by Code 562. The newsletter publishes brief articles on topics of interest to NASA programs and projects in the area of electronic parts and packaging. The newsletter does not provide information pertaining to patented or proprietary information. The information provided is at the level of that produced by industry and university researchers and is published at national and international conferences.

  16. Standards for detailed clinical models as the basis for medical data exchange and decision support.

    PubMed

    Coyle, Joseph F; Mori, Angelo Rossi; Huff, Stanley M

    2003-03-01

    Detailed clinical models are necessary to exchange medical data between heterogeneous computer systems and to maintain consistency in a longitudinal electronic medical record system. At Intermountain Health Care (IHC), we have a history of designing detailed clinical models. The purpose of this paper is to share our experience and the lessons we have learned over the last 5 years. IHC's newest model is implemented using eXtensible Markup Language (XML) Schema as the formalism, and conforms to the Health Level Seven (HL7) version 3 data types. The centerpiece of the new strategy is the Clinical Event Model, which is a flexible name-value pair data structure that is tightly linked to a coded terminology. We describe IHC's third-generation strategy for representing and implementing detailed clinical models, and discuss the reasons for this design.

  17. Using linked data to evaluate severity and outcome of injury by type of object struck (first object struck only) for motor vehicle crashes in Connecticut : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1999-09-01

    A deterministic algorithm was developed which allowed data from Department of Transportation motor vehicle crash records, state mortality registry records, and hospital admission and emergency department records to be linked for analysis of the types...

  18. X-linked hypophosphatemia attributable to pseudoexons of the PHEX gene.

    PubMed

    Christie, P T; Harding, B; Nesbit, M A; Whyte, M P; Thakker, R V

    2001-08-01

    X-linked hypophosphatemia is commonly caused by mutations of the coding region of PHEX (phosphate-regulating gene with homologies to endopeptidases on the X chromosome). However, such PHEX mutations are not detected in approximately one third of X-linked hypophosphatemia patients who may harbor defects in the noncoding or intronic regions. We have therefore investigated 11 unrelated X-linked hypophosphatemia patients in whom coding region mutations had been excluded, for intronic mutations that may lead to mRNA splicing abnormalities, by the use of lymphoblastoid RNA and RT-PCRs. One X-linked hypophosphatemia patient was found to have 3 abnormally large transcripts, resulting from 51-bp, 100-bp, and 170-bp insertions, all of which would lead to missense peptides and premature termination codons. The origin of these transcripts was a mutation (g to t) at position +1268 of intron 7, which resulted in the occurrence of a high quality novel donor splice site (ggaagg to gtaagg). Splicing between this novel donor splice site and 3 preexisting, but normally silent, acceptor splice sites within intron 7 resulted in the occurrences of the 3 pseudoexons. This represents the first report of PHEX pseudoexons and reveals further the diversity of genetic abnormalities causing X-linked hypophosphatemia.

  19. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    NASA Astrophysics Data System (ADS)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links, and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter employs a low-rate convolutional encoder. In this method, the bit interval is divided into Nc chips, and the output of the encoder along with a PN sequence assigned to the user determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where the multiple access interference has the dominant effect, the performance improves with the coding gain. But at low transmit power, where an increase of coding gain leads to a decrease of the chip time, and consequently to more corruption due to channel dispersion, there exists an optimum value for the coding gain. For the matched filter, however, the performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. Our results also show that, for the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, such as conventional CDMA and time hopping schemes.
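
    The transmission rule is easy to state in code: each bit interval is divided into Nc chips, and the coded bits together with the user's PN sequence pick the chip in which the optical pulse is sent. The rate-1/2 encoder and the additive combining rule below are assumptions for illustration; the paper's exact mapping may differ.

        # Toy version of the time-hopping rule: Nc chips per bit; the
        # coded bits and the user's PN offsets jointly pick the pulse
        # position. Encoder and combining rule are assumed for the sketch.
        Nc = 16
        pn = [5, 11, 2, 14, 7, 9]          # user's PN chip offsets

        def encode_bit(bit, state):
            """Rate-1/2 convolutional encoder, generators (1+D, 1)."""
            return (bit ^ state, bit), bit   # (coded pair, new state)

        state, positions = 0, []
        for i, bit in enumerate([1, 0, 1]):
            coded, state = encode_bit(bit, state)
            sym = 2 * coded[0] + coded[1]    # map coded pair to 0..3
            positions.append((sym * (Nc // 4) + pn[i]) % Nc)

        print(positions)   # chip index of the optical pulse per bit slot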

  20. Technical note: The Linked Paleo Data framework - a common tongue for paleoclimatology

    NASA Astrophysics Data System (ADS)

    McKay, Nicholas P.; Emile-Geay, Julien

    2016-04-01

    Paleoclimatology is a highly collaborative scientific endeavor, increasingly reliant on online databases for data sharing. Yet there is currently no universal way to describe, store and share paleoclimate data: in other words, no standard. Data standards are often regarded by scientists as mere technicalities, though they underlie much scientific and technological innovation, as well as facilitating collaborations between research groups. In this article, we propose a preliminary data standard for paleoclimate data, general enough to accommodate all the archive and measurement types encountered in a large international collaboration (PAGES 2k). We also introduce a vehicle for such structured data (Linked Paleo Data, or LiPD), leveraging recent advances in knowledge representation (Linked Open Data). The LiPD framework enables quick querying and extraction, and we expect that it will facilitate the writing of open-source community codes to access, analyze, model and visualize paleoclimate observations. We welcome community feedback on this standard, and encourage paleoclimatologists to experiment with the format for their own purposes.

  1. Adaptive Wavelet Coding Applied in a Wireless Control System.

    PubMed

    Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O

    2017-12-13

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications because of its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose code rate and signal constellation can vary according to the fading level, and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show the performance gains achieved by inserting adaptive wavelet coding into a control loop with nodes interconnected by a wireless link. These results enable the use of this technique in wireless link control loops.
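
    The adaptation itself amounts to a lookup from the estimated fading level to a (code rate, constellation) pair, trading spectral efficiency for robustness as the channel worsens. The thresholds and the mode table below are illustrative assumptions, not the measured switching points from the paper.

        # Pick a wavelet code rate and constellation from the fading level.
        def select_mode(fading_db):
            if fading_db < 5:            # mild fading: favor throughput
                return ("rate 3/4", "16-QAM")
            elif fading_db < 15:
                return ("rate 1/2", "QPSK")
            else:                        # deep fades: maximum robustness
                return ("rate 1/4", "BPSK")

        for level in (2, 10, 20):        # estimated fading levels, dB
            print(level, "dB ->", select_mode(level))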

  2. Towards defining the role of glycans as hardware in information storage and transfer: basic principles, experimental approaches and recent progress.

    PubMed

    Solís, D; Jiménez-Barbero, J; Kaltner, H; Romero, A; Siebert, H C; von der Lieth, C W; Gabius, H J

    2001-01-01

    The term 'code' in biological information transfer appears to be tightly and hitherto exclusively connected with the genetic code, based on nucleotides and translated into functional activities via proteins. However, the recent appreciation of the enormous coding capacity of the oligosaccharide chains of natural glycoconjugates has drawn attention to a new concept: versatile glycan assembly by the genetically encoded glycosyltransferases endows cells with a probably not yet fully catalogued array of meaningful messages. Deciphered by sugar receptors such as endogenous lectins, the information in code words built from series of covalently linked monosaccharides (the letters) guides, for example, correct intra- and intercellular routing of glycoproteins, modulates cell proliferation or migration, and mediates cell adhesion. Evidently, the elucidation of the structural frameworks and the recognition strategies within the operation of the sugar code poses a fascinating conundrum. The far-reaching impact of this recognition mode at the level of cells, tissues and organs has fueled vigorous investigations to probe the subtleties of protein-carbohydrate interactions. This review presents information on the necessarily concerted approach using X-ray crystallography, molecular modeling, nuclear magnetic resonance spectroscopy, thermodynamic analysis and engineered ligands and receptors. This part of the treatise is flanked by exemplarily chosen insights made possible by these techniques.

  3. Multichannel error correction code decoder

    NASA Technical Reports Server (NTRS)

    Wagner, Paul K.; Ivancic, William D.

    1993-01-01

    A brief overview of a processing satellite for a mesh very-small-aperture (VSAT) communications network is provided. The multichannel error correction code (ECC) decoder system, the uplink signal generation and link simulation equipment, and the time-shared decoder are described. The testing is discussed. Applications of the time-shared decoder are recommended.

  4. Deep Drawing Simulations With Different Polycrystalline Models

    NASA Astrophysics Data System (ADS)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. A first part of this paper describes the main concepts of the `Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full constraints Taylor's model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This `local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically-based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, that affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution thanks to deep drawing simulations.

  5. Present state of HDTV coding in Japan and future prospect

    NASA Astrophysics Data System (ADS)

    Murakami, Hitomi

    The development status of HDTV digital codecs in Japan is evaluated; several bit-rate-reduction codecs have been developed for 1125-line/60-field HDTV, and performance trials have been conducted over satellite and optical fiber links. Prospective development efforts will attempt to achieve more efficient coding schemes able to reduce the bit rate to as little as 45 Mbps, as well as to apply these coding schemes to asynchronous transfer mode (ATM) networks.

  6. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies and inter-network communication, and its separation of node, link and institutional logic from higher-level processes (engines), suit the JWP's requirements. The use of Hydra Platform and Pynsim helps make complex customised models such as the JWP model easier to run and manage with international groups of researchers.
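
    The simulation pattern described here, autonomous per-step logic on nodes and links plus institutions that group and act over them, has a simple generic shape. The sketch below shows that shape only; the class and method names are illustrative, not the actual Pynsim API.

        # Generic shape of a node/institution time-stepping loop.
        # Names are illustrative, not the Pynsim API.
        class Reservoir:
            def __init__(self, name, storage):
                self.name, self.storage = name, storage
            def setup(self, t):                  # autonomous per-step logic
                self.storage += 5 - 3            # toy inflow minus demand

        class Institution:
            """A grouping of nodes that can take decisions over them."""
            def __init__(self, name, members):
                self.name, self.members = name, members
            def setup(self, t):
                total = sum(m.storage for m in self.members)
                print(f"t={t} {self.name}: total storage = {total}")

        nodes = [Reservoir("A", 100), Reservoir("B", 80)]
        ministry = Institution("water ministry", nodes)
        for t in range(3):                       # the simulation clock
            for agent in nodes + [ministry]:
                agent.setup(t)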

  7. Software for Collaborative Engineering of Launch Rockets

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas Troy

    2003-01-01

    The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.

  8. The blues of adolescent romance: observed affective interactions in adolescent romantic relationships associated with depressive symptoms.

    PubMed

    Ha, Thao; Dishion, Thomas J; Overbeek, Geertjan; Burk, William J; Engels, Rutger C M E

    2014-05-01

    We examined the associations between observed expressions of positive and negative emotions during conflict discussions and depressive symptoms during a 2-year period in a sample of 160 adolescents in 80 romantic relationships (M age = 15.48, SD = 1.16). Conflict discussions were coded using the 10-code Specific Affect Coding System. Depressive symptoms were assessed at the time of the observed conflict discussions (Time 1) and 2 years later (Time 2). Data were analyzed using actor-partner interdependence models. Girls' expression of both positive and negative emotions at T1 was related to their own depressive symptoms at T2 (actor effect). Boys' positive emotions and negative emotions (actor effect) and girls' negative emotions (partner effect) were related to boys' depressive symptoms at T2. Contrary to expectation, relationship break-up and relationship satisfaction were unrelated to changes in depressive symptoms or expression of negative or positive emotion during conflict discussion. These findings underscore the unique quality of adolescent romantic relationships and suggest new directions in the study of the link between mental health and romantic involvement in adolescence.

  9. Information trade-offs for optical quantum communication.

    PubMed

    Wilde, Mark M; Hayden, Patrick; Guha, Saikat

    2012-04-06

    Recent work has precisely characterized the achievable trade-offs between three key information processing tasks: classical communication (generation or consumption), quantum communication (generation or consumption), and shared entanglement (distribution or consumption), measured in bits, qubits, and ebits per channel use, respectively. Slices and corner points of this three-dimensional region reduce to well-known protocols for quantum channels. A trade-off coding technique can attain any point in the region and can outperform time sharing between the best-known protocols for accomplishing each information processing task by itself. Previously, the benefits of trade-off coding that had been found were too small to be of practical value (viz., for the dephasing and the universal cloning machine channels). In this Letter, we demonstrate that the associated performance gains are in fact remarkably high for several physically relevant bosonic channels that model free-space or fiber-optic links, thermal-noise channels, and amplifiers. We show that significant performance gains from trade-off coding also apply when trading photon-number resources between transmitting public and private classical information simultaneously over secret-key-assisted bosonic channels.

  10. Lineage-Specific Genome Architecture Links Enhancers and Non-coding Disease Variants to Target Gene Promoters.

    PubMed

    Javierre, Biola M; Burren, Oliver S; Wilder, Steven P; Kreuzhuber, Roman; Hill, Steven M; Sewitz, Sven; Cairns, Jonathan; Wingett, Steven W; Várnai, Csilla; Thiecke, Michiel J; Burden, Frances; Farrow, Samantha; Cutler, Antony J; Rehnström, Karola; Downes, Kate; Grassi, Luigi; Kostadima, Myrto; Freire-Pritchett, Paula; Wang, Fan; Stunnenberg, Hendrik G; Todd, John A; Zerbino, Daniel R; Stegle, Oliver; Ouwehand, Willem H; Frontini, Mattia; Wallace, Chris; Spivakov, Mikhail; Fraser, Peter

    2016-11-17

    Long-range interactions between regulatory elements and gene promoters play key roles in transcriptional regulation. The vast majority of interactions are uncharted, constituting a major missing link in understanding genome control. Here, we use promoter capture Hi-C to identify interacting regions of 31,253 promoters in 17 human primary hematopoietic cell types. We show that promoter interactions are highly cell type specific and enriched for links between active promoters and epigenetically marked enhancers. Promoter interactomes reflect lineage relationships of the hematopoietic tree, consistent with dynamic remodeling of nuclear architecture during differentiation. Interacting regions are enriched in genetic variants linked with altered expression of genes they contact, highlighting their functional role. We exploit this rich resource to connect non-coding disease variants to putative target promoters, prioritizing thousands of disease-candidate genes and implicating disease pathways. Our results demonstrate the power of primary cell promoter interactomes to reveal insights into genomic regulatory mechanisms underlying common diseases.

  11. Spatial Learning and Action Planning in a Prefrontal Cortical Network Model

    PubMed Central

    Martinet, Louis-Emmanuel; Sheynikhovich, Denis; Benchenane, Karim; Arleo, Angelo

    2011-01-01

    The interplay between hippocampus and prefrontal cortex (PFC) is fundamental to spatial cognition. Complementing hippocampal place coding, prefrontal representations provide more abstract and hierarchically organized memories suitable for decision making. We model a prefrontal network mediating distributed information processing for spatial learning and action planning. Specific connectivity and synaptic adaptation principles shape the recurrent dynamics of the network arranged in cortical minicolumns. We show how the PFC columnar organization is suitable for learning sparse topological-metrical representations from redundant hippocampal inputs. The recurrent nature of the network supports multilevel spatial processing, allowing structural features of the environment to be encoded. An activation diffusion mechanism spreads the neural activity through the column population leading to trajectory planning. The model provides a functional framework for interpreting the activity of PFC neurons recorded during navigation tasks. We illustrate the link from single unit activity to behavioral responses. The results suggest plausible neural mechanisms subserving the cognitive “insight” capability originally attributed to rodents by Tolman & Honzik. Our time course analysis of neural responses shows how the interaction between hippocampus and PFC can yield the encoding of manifold information pertinent to spatial planning, including prospective coding and distance-to-goal correlates. PMID:21625569
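
    The activation-diffusion planning step lends itself to a compact illustration: activity injected at a goal location spreads through a population of connected units, and following the activity gradient yields a trajectory. The sketch below is a minimal grid-based Python rendering of that general idea, not the paper's columnar PFC model; all parameters are illustrative.

      import numpy as np

      def plan_by_activation_diffusion(grid, goal, start, iters=200, decay=0.9):
          """Minimal activation-diffusion planner on a 4-connected grid.
          grid: 2-D array, 1 = free cell, 0 = obstacle."""
          act = np.zeros_like(grid, dtype=float)
          for _ in range(iters):
              # Each free cell takes the decayed maximum of its neighbours.
              up    = np.roll(act,  1, axis=0); up[0, :]     = 0
              down  = np.roll(act, -1, axis=0); down[-1, :]  = 0
              left  = np.roll(act,  1, axis=1); left[:, 0]   = 0
              right = np.roll(act, -1, axis=1); right[:, -1] = 0
              act = decay * np.maximum.reduce([up, down, left, right]) * grid
              act[goal] = 1.0          # the goal is a constant activity source
          # Greedy ascent of the activity gradient yields the trajectory.
          path, pos = [start], start
          while pos != goal and len(path) < act.size:
              r, c = pos
              nbrs = [(r + dr, c + dc) for dr, dc in [(1,0), (-1,0), (0,1), (0,-1)]
                      if 0 <= r + dr < act.shape[0] and 0 <= c + dc < act.shape[1]]
              pos = max(nbrs, key=lambda p: act[p])
              path.append(pos)
          return path

      grid = np.ones((8, 8)); grid[3, 1:7] = 0   # a wall with gaps at the edges
      print(plan_by_activation_diffusion(grid, goal=(7, 7), start=(0, 0)))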

  12. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Fujiwara, Tohru; Takata, Toyoo

    1986-01-01

    A coding scheme is investigated for error control in data communication systems. The scheme is obtained by cascading two error-correcting codes, called the inner and outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate ε < 1/2. It is shown that if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes with inner codes ranging from high rates to very low rates and Reed-Solomon codes as outer codes are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
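
    The reliability claim can be made concrete with the standard bounded-distance formula: a code correcting t errors out of n symbols fails with probability P = Σ_{i=t+1..n} C(n,i) p^i (1-p)^(n-i). The sketch below chains that formula through two stages, treating each failed inner block as one bad outer symbol (a simplification); the particular (15,7) and (255,223) codes and the channel error rate are illustrative choices, not necessarily those in the paper.

      from math import comb

      def block_error_prob(n, t, p):
          """Probability that more than t of n symbols are in error
          (undecodable block), for independent symbol error rate p."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                     for i in range(t + 1, n + 1))

      eps = 0.02                                    # raw channel bit error rate
      p_inner = block_error_prob(15, 2, eps)        # (15,7) inner code, t = 2
      p_outer = block_error_prob(255, 16, p_inner)  # (255,223) RS outer, t = 16
      # Cascading drives the block failure probability down by many orders
      # of magnitude relative to the raw channel error rate.
      print(p_inner, p_outer)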

  13. Coding and decoding in a point-to-point communication using the polarization of the light beam.

    PubMed

    Kavehvash, Z; Massoumian, F

    2008-05-10

    A new technique for coding and decoding optical signals through the use of polarization is described. In this technique, the concept of coding is translated to polarization: coding is done in such a way that each code represents a unique polarization. This is achieved by implementing a binary pattern on a spatial light modulator such that the reflected light has the required polarization. Decoding is done by detecting the polarization of the received beam. Linking coding to polarization allows each concept to be used in measuring the other, with attendant benefits. The paper discusses the construction of a simple point-to-point communication link in which coding and decoding are performed through polarization.
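
    As an illustration of the principle (not the authors' SLM-based implementation), linear-polarization symbols can be decoded from the intensities behind three analyzers via the linear Stokes parameters, with the angle estimate θ = ½·atan2(S2, S1). The Python sketch below assumes ideal, noise-free optics and an invented four-symbol alphabet.

      import numpy as np

      ANGLES = {0b00: 0.0, 0b01: 45.0, 0b10: 90.0, 0b11: 135.0}  # degrees

      def measure(theta_deg, analyzer_deg):
          """Malus's law: transmitted intensity through an ideal analyzer."""
          return np.cos(np.radians(theta_deg - analyzer_deg)) ** 2

      def decode(theta_deg):
          # Linear Stokes parameters from analyzers at 0, 45 and 90 degrees.
          i0, i45, i90 = (measure(theta_deg, a) for a in (0.0, 45.0, 90.0))
          s1, s2 = i0 - i90, 2.0 * i45 - (i0 + i90)
          est = 0.5 * np.degrees(np.arctan2(s2, s1)) % 180.0
          # The nearest coded polarization (with 180-degree wraparound) wins.
          return min(ANGLES, key=lambda sym: min(abs(ANGLES[sym] - est),
                                                 180.0 - abs(ANGLES[sym] - est)))

      assert all(decode(a) == sym for sym, a in ANGLES.items())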

  14. Review Article: Mapping of children's health and development data on population level using the classification system ICF-CY.

    PubMed

    Ståhl, Ylva; Granlund, Mats; Gäre-Andersson, Boel; Enskär, Karin

    2011-02-01

    The aim of this study was to investigate whether essential health and development data on all children in Sweden in the Child Health Service (CHS) and School Health Service (SHS) can be linked to the classification system International Classification of Functioning, Disability and Health--Children and Youth (ICF-CY). Lists of essential health terms, compiled by professionals from the CHS and SHS and expected to be used in the national standardised records, form the basis for the analysis in this study. The essential health terms were linked to the codes of the ICF-CY using linking rules and a verification procedure. After exclusion of terms not directly describing children's health, a majority of the health terms could be linked to the ICF-CY, with a high proportion of terms in body functions and lower proportions in activity/participation and environment, respectively. Some health terms had broad descriptions and were linked to several ICF-CY codes. The precision of the health terms was at a medium level of detail. The ICF-CY can be a useful tool for documenting child health. It provides not only a code useful for statistical purposes but also a language useful for the CHS and SHS in their work at individual as well as population levels. It was noted that the health terms used by the services mainly focused on health related to body function. This indicates that more focus is needed on health data related to a child's functioning in everyday life situations.

  15. HydroDesktop as a Community Designed and Developed Resource for Hydrologic Data Discovery and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.

    2013-12-01

    As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology, resulting in the development of many useful community software components, including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface that allow 3rd-party developers to create extensions and add new functionality without recompiling the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities, and an OpenMI plugin provides the ability to link models. Current revisions and updates to HydroDesktop include migration of core business logic to cross-platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.

  16. RNA G-quadruplexes: emerging mechanisms in disease

    PubMed Central

    Cammas, Anne

    2017-01-01

    RNA G-quadruplexes (G4s) are formed by G-rich RNA sequences in protein-coding (mRNA) and non-coding (ncRNA) transcripts that fold into a four-stranded conformation. Experimental studies and bioinformatic predictions support the view that these structures are involved in different cellular functions associated with both DNA processes (telomere elongation, recombination, and transcription) and RNA post-transcriptional mechanisms (including pre-mRNA processing, mRNA turnover, targeting, and translation). An increasing number of diseases have been associated with the inappropriate regulation of RNA G4s, exemplifying the potential importance of these structures for human health. Here, we review the molecular mechanisms underlying the link between RNA G4s and human diseases by proposing several overlapping models of deregulation emerging from recent research, including (i) sequestration of RNA-binding proteins, (ii) aberrant expression or localization of RNA G4-binding proteins, (iii) repeat-associated non-AUG (RAN) translation, (iv) mRNA translational blockade, and (v) disabling of protein–RNA G4 complexes. This review also provides a comprehensive survey of functional RNA G4s and their mechanisms of action. Finally, we highlight future directions for research aimed at improving our understanding of RNA G4-mediated regulatory mechanisms linked to diseases. PMID:28013268

  17. Catalog of types of applications implemented using linked state data : crash outcome data evaluation system (CODES)

    DOT National Transportation Integrated Search

    1997-04-01

    The purpose of this catalog is to inspire the development of new applications for linked data that support efforts to reduce death, disability, severity, and health care costs related to motor vehicle crashes. The document is divided into three...

  18. A Hebbian learning rule gives rise to mirror neurons and links them to control theoretic inverse models.

    PubMed

    Hanuschkin, A; Ganguli, S; Hahnloser, R H R

    2013-01-01

    Mirror neurons are neurons whose responses to the observation of a motor act resemble responses measured during production of that act. Computationally, mirror neurons have been viewed as evidence for the existence of internal inverse models. Such models, rooted within control theory, map desired sensory targets onto the motor commands required to generate those targets. To jointly explore both the formation of mirrored responses and their functional contribution to inverse models, we develop a correlation-based theory of interactions between a sensory and a motor area. We show that a simple eligibility-weighted Hebbian learning rule, operating within a sensorimotor loop during motor explorations and stabilized by heterosynaptic competition, naturally gives rise to mirror neurons as well as control theoretic inverse models encoded in the synaptic weights from sensory to motor neurons. Crucially, we find that the correlational structure or stereotypy of the neural code underlying motor explorations determines the nature of the learned inverse model: random motor codes lead to causal inverses that map sensory activity patterns to their motor causes; such inverses are maximally useful, by allowing the imitation of arbitrary sensory target sequences. By contrast, stereotyped motor codes lead to less useful predictive inverses that map sensory activity to future motor actions. Our theory generalizes previous work on inverse models by showing that such models can be learned in a simple Hebbian framework without the need for error signals or backpropagation, and it makes new conceptual connections between the causal nature of inverse models, the statistical structure of motor variability, and the time lag between sensory and motor responses of mirror neurons. Applied to bird song learning, our theory can account for puzzling aspects of the song system, including the necessity of sensorimotor gating and the selectivity of auditory responses to bird's own song (BOS) stimuli.
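
    The core learning step can be sketched in a few lines: correlate delayed sensory consequences with the motor activity that produced them, and keep the weights bounded with a heterosynaptic (row-wise) normalization. The Python sketch below is a stripped-down caricature under stated assumptions (a linear motor-to-sensory mapping, no explicit eligibility trace or delay), not the paper's model.

      import numpy as np

      rng = np.random.default_rng(0)
      n_sensory = n_motor = 50
      eta = 0.05

      # Fixed forward mapping: the sensory area "hears" each motor pattern.
      F = rng.normal(size=(n_sensory, n_motor)) / np.sqrt(n_motor)
      W = np.zeros((n_motor, n_sensory))   # learned sensory-to-motor weights

      for _ in range(2000):
          m = rng.normal(size=n_motor)     # random motor exploration
          s = F @ m                        # its sensory consequence
          W += eta * np.outer(m, s)        # Hebb: motor (post) x sensory (pre)
          # Heterosynaptic competition: bound total weight onto each motor cell.
          W /= np.maximum(np.linalg.norm(W, axis=1, keepdims=True), 1.0)

      # W now approximates a causal inverse of F: replaying a sensory pattern
      # drives the motor pattern that caused it (a mirror-like response).
      m_true = rng.normal(size=n_motor)
      m_hat = W @ (F @ m_true)
      print(np.corrcoef(m_true, m_hat)[0, 1])   # strongly positive correlation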

  19. A Hebbian learning rule gives rise to mirror neurons and links them to control theoretic inverse models

    PubMed Central

    Hanuschkin, A.; Ganguli, S.; Hahnloser, R. H. R.

    2013-01-01

    Mirror neurons are neurons whose responses to the observation of a motor act resemble responses measured during production of that act. Computationally, mirror neurons have been viewed as evidence for the existence of internal inverse models. Such models, rooted within control theory, map desired sensory targets onto the motor commands required to generate those targets. To jointly explore both the formation of mirrored responses and their functional contribution to inverse models, we develop a correlation-based theory of interactions between a sensory and a motor area. We show that a simple eligibility-weighted Hebbian learning rule, operating within a sensorimotor loop during motor explorations and stabilized by heterosynaptic competition, naturally gives rise to mirror neurons as well as control theoretic inverse models encoded in the synaptic weights from sensory to motor neurons. Crucially, we find that the correlational structure or stereotypy of the neural code underlying motor explorations determines the nature of the learned inverse model: random motor codes lead to causal inverses that map sensory activity patterns to their motor causes; such inverses are maximally useful, by allowing the imitation of arbitrary sensory target sequences. By contrast, stereotyped motor codes lead to less useful predictive inverses that map sensory activity to future motor actions. Our theory generalizes previous work on inverse models by showing that such models can be learned in a simple Hebbian framework without the need for error signals or backpropagation, and it makes new conceptual connections between the causal nature of inverse models, the statistical structure of motor variability, and the time lag between sensory and motor responses of mirror neurons. Applied to bird song learning, our theory can account for puzzling aspects of the song system, including the necessity of sensorimotor gating and the selectivity of auditory responses to bird's own song (BOS) stimuli. PMID:23801941

  20. Issues and opportunities: beam simulations for heavy ion fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A

    1999-07-15

    UCRL-JC-134975 (preprint). ... code offering 3-D, axisymmetric, and "transverse slice" (steady flow) geometries, with a hierarchy of models for the "lattice" of focusing, bending, and accelerating elements. Interactive and script-driven code steering is afforded through an interpreter interface. The code runs with good parallel scaling on the T3E. Detailed simulations of machine segments and of complete small experiments, as well as simplified full-system runs, have been carried out, partially benchmarking the code. A magnetoinductive model, with module impedance and multi-beam effects, is under study. ... experiments, including an injector scalable to multi-beam arrays, a high-current beam transport and acceleration experiment, and a scaled final-focusing experiment. These "phase I" projects are laying the groundwork for the next major step in HIF development, the Integrated Research Experiment (IRE). Simulations aimed directly at the IRE must enable us to: design a facility with maximum power on target at minimal cost; set requirements for hardware tolerances, beam steering, etc.; and evaluate proposed chamber propagation modes. Finally, simulations must enable us to study all issues which arise in the context of a fusion driver, and must facilitate the assessment of driver options. In all of this, maximum advantage must be taken of emerging terascale computer architectures, requiring an aggressive code development effort. An organizing principle should be pursuit of the goal of integrated and detailed source-to-target simulation. ... methods for analysis of the beam dynamics in the various machine concepts, using moment-based methods for purposes of design, waveform synthesis, steering algorithm synthesis, etc. Three classes of discrete-particle models should be coupled: (1) electrostatic/magnetoinductive PIC simulations should track the beams from the source through the final-focusing optics, passing details of the time-dependent distribution function to (2) electromagnetic or magnetoinductive PIC or hybrid PIC/fluid simulations in the fusion chamber (which would finally pass their particle trajectory information to the radiation-hydrodynamics codes used for target design); in parallel, (3) detailed PIC, delta-f, core/test-particle, and perhaps continuum Vlasov codes should be used to study individual sections of the driver and chamber very carefully; consistency may be assured by linking data from the PIC sequence, and knowledge gained may feed back into that sequence.

  1. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code, Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. The characteristic of Atlantide is to be simple enough, but at the same time adequate, to cope with consequence analysis as required by Italian legislation in fulfilling the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transferring installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion, and other typical phenomena such as BLEVE/fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous, and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done with reference to simplified event trees that describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations, and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from public literature or in-house developed software and tailored with the aim of being easy to use and fast to run but, nevertheless, able to provide realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG installations. A summary of the theoretical basis of each model implemented in Atlantide and an example of application are included in the paper.
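
    As an illustration of the kind of empirical correlation such a code chains together, one widely cited set of fireball relations gives peak diameter D = 5.8·M^(1/3) and duration t = 0.45·M^(1/3) (for M < 30,000 kg), with M the released mass in kg. The sketch below uses those literature correlations, not Atlantide's own models.

      def bleve_fireball(mass_kg):
          """Peak fireball diameter (m) and duration (s) for an LPG mass in kg,
          from classic empirical correlations (illustrative, not Atlantide's)."""
          d = 5.8 * mass_kg ** (1.0 / 3.0)
          t = (0.45 * mass_kg ** (1.0 / 3.0) if mass_kg < 3.0e4
               else 2.6 * mass_kg ** (1.0 / 6.0))
          return d, t

      print(bleve_fireball(10_000.0))  # ~125 m and ~9.7 s for a 10-tonne release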

  2. Molecular Regulatory Pathways Link Sepsis With Metabolic Syndrome: Non-coding RNA Elements Underlying the Sepsis/Metabolic Cross-Talk.

    PubMed

    Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona

    2018-01-01

    Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact for human health and the consequences of concussions. Both represent imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation that indicates shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. Those include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.

  3. Correction of mouse ornithine transcarbamylase deficiency by gene transfer into the germ line.

    PubMed Central

    Cavard, C; Grimber, G; Dubois, N; Chasse, J F; Bennoun, M; Minet-Thuriaux, M; Kamoun, P; Briand, P

    1988-01-01

    The sparse fur with abnormal skin and hair (Spf-ash) mouse is a model for the human X-linked hereditary disorder, ornithine transcarbamylase (OTC) deficiency. In Spf-ash mice, both OTC mRNA and enzyme activity are 5% of control values, resulting in hyperammonemia, pronounced orotic aciduria, and an abnormal phenotype characterized by growth retardation and sparse fur. Using microinjection, we introduced a construct containing rat OTC cDNA linked to the SV40 early promoter into fertilized eggs of Spf-ash mice. The expression of the transgene resulted in the development of a transgenic mouse whose phenotype and orotic acid excretion are fully normalized. Thus, the possibility of correcting a hereditary enzymatic defect by gene transfer of a heterologous cDNA coding for the normal enzyme has been demonstrated. PMID:3162766

  4. Forthon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D. P.

    Forthon generates links between Fortran and Python. Python is a high-level, object-oriented, interactive scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code that allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development environment in which the computationally intensive parts of a code can be written in efficient Fortran, while the high-level controlling code is written in the much more versatile Python language.
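
    The same wrapping idea can be illustrated with NumPy's f2py, a widely available analogous tool; the snippet below is not Forthon's own syntax, and the module and subroutine names are invented for the example.

      # Given a Fortran source file saxpy.f90:
      #
      #   subroutine saxpy(n, a, x, y)
      #     integer, intent(in) :: n
      #     real, intent(in)    :: a, x(n)
      #     real, intent(inout) :: y(n)
      #     y = a * x + y
      #   end subroutine
      #
      # build an importable extension module with:
      #   python -m numpy.f2py -c saxpy.f90 -m fastmath
      import numpy as np
      import fastmath                    # the generated wrapper module

      x = np.arange(5, dtype=np.float32)
      y = np.ones(5, dtype=np.float32)
      fastmath.saxpy(2.0, x, y)          # the dimension n is supplied by the
      print(y)                           # wrapper; y is updated in place
                                         # -> [1. 3. 5. 7. 9.]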

  5. Further Developments in the Communication Link and Error Analysis (CLEAN) Simulator

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1995-01-01

    During the period 1 July 1993 - 30 June 1994, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed. Many of these were reported in the Semi-Annual report dated December 1993, which is included in this report as Appendix A. Since December 1993, a number of additional modules have been added involving unit-memory convolutional (UMC) codes. These are: (1) a Unit-Memory Convolutional Encoder module (UMCEncd); (2) a hard-decision Unit-Memory Convolutional Decoder using the Viterbi decoding algorithm (VitUMC); and (3) a number of utility modules designed to investigate the performance of UMCs, such as the UMC column distance function (UMCdc), the UMC free distance function (UMCdfree), the UMC row distance function (UMCdr), and UMC transformation (UMCTrans). The study of UMCs was driven, in part, by the desire to investigate high-rate convolutional codes, which are better suited as inner codes for a concatenated coding scheme. A number of high-rate UMCs were found which are good candidates for inner codes. Besides the further development of the simulator, a study was performed to construct a table of the best known unit-memory convolutional codes. Finally, a preliminary study of the usefulness of the Periodic Convolutional Interleaver (PCI) was completed and documented in a technical note dated March 17, 1994. This technical note is also included in this final report.
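
    The decoding step performed by a module like VitUMC can be illustrated with an ordinary (non-unit-memory) convolutional code. The sketch below implements hard-decision Viterbi decoding for the textbook rate-1/2, constraint-length-3 code with generators 7 and 5 (octal); it is illustrative only and not one of the report's UMCs.

      G = (0b111, 0b101)  # generator polynomials (7, 5 octal)

      def conv_encode(bits):
          state, out = 0, []
          for b in bits:
              state = ((state << 1) | b) & 0b111   # 3-bit register of inputs
              out += [bin(state & g).count("1") & 1 for g in G]
          return out

      def viterbi_decode(received):
          INF = float("inf")
          metric = [0.0, 1e9, 1e9, 1e9]            # force start in state 0
          paths = [[], [], [], []]
          for i in range(0, len(received), 2):
              r = received[i:i + 2]
              new_metric = [INF] * 4
              new_paths = [None] * 4
              for s in range(4):                   # s = last two input bits
                  for b in (0, 1):
                      full = ((s << 1) | b) & 0b111
                      expect = [bin(full & g).count("1") & 1 for g in G]
                      nxt = full & 0b11
                      m = metric[s] + sum(x != y for x, y in zip(expect, r))
                      if m < new_metric[nxt]:      # keep the survivor path
                          new_metric[nxt] = m
                          new_paths[nxt] = paths[s] + [b]
              metric, paths = new_metric, new_paths
          return paths[min(range(4), key=metric.__getitem__)]

      msg = [1, 0, 1, 1, 0, 0, 1]
      rx = conv_encode(msg)
      rx[3] ^= 1                                   # inject one channel bit error
      assert viterbi_decode(rx) == msg             # the error is corrected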

  6. What Are Those Checkerboard Things?: How QR Codes Can Enrich Student Projects

    ERIC Educational Resources Information Center

    Tucker, Al

    2011-01-01

    Students enrolled in commercial arts program design and publish their school's yearbook. For the 2010-2011 school year, the students applied Quick Response (QR) code technology to include links to events that occurred after the yearbook's print deadline, including graduation. The technology has many applications in the school setting, and the…

  7. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  8. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  9. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  10. Physical-layer network coding in coherent optical OFDM systems.

    PubMed

    Guan, Xun; Chan, Chun-Kit

    2015-04-20

    We present the first experimental demonstration and characterization of the application of optical physical-layer network coding in coherent optical OFDM systems. It combines two optical OFDM frames to share the same link so as to enhance system throughput, while individual OFDM frames can be recovered with digital signal processing at the destined node.
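
    At the bit level, the network-coding operation is simply an XOR: the relay broadcasts the XOR of the two frames sharing the link, and each end node cancels its own contribution. The sketch below abstracts away the optical OFDM modulation and shows only that logical layer.

      import numpy as np

      rng = np.random.default_rng(1)
      frame_a = rng.integers(0, 2, 1024, dtype=np.uint8)  # node A's frame bits
      frame_b = rng.integers(0, 2, 1024, dtype=np.uint8)  # node B's frame bits

      relay_broadcast = frame_a ^ frame_b  # network-coded frame on the shared link

      recovered_b = relay_broadcast ^ frame_a  # node A cancels its own frame
      recovered_a = relay_broadcast ^ frame_b  # node B cancels its own frame
      assert (recovered_a == frame_a).all() and (recovered_b == frame_b).all()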

  11. On Delay and Security in Network Coding

    ERIC Educational Resources Information Center

    Dikaliotis, Theodoros K.

    2013-01-01

    In this thesis, delay and security issues in network coding are considered. First, we study the delay incurred in the transmission of a fixed number of packets through acyclic networks comprised of erasure links. The two transmission schemes studied are routing with hop-by-hop retransmissions, where every node in the network simply stores and…

  12. Study to investigate and evaluate means of optimizing the Ku-band communication function for the space shuttle

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Udalov, S.; Huth, G. K.

    1976-01-01

    The forward link of the overall Ku-band communication system consists of the ground-TDRS-orbiter communication path. Because the last segment of the link is directed towards a relatively low-orbiting shuttle, a PN code is used to reduce the spectral density. A method is presented for incorporating code acquisition and tracking functions into the orbiter's Ku-band receiver. Optimization of a three-channel multiplexing technique is described. The importance of Costas loop parameters in providing false-lock immunity for the receiver, and the advantage of using a sinusoidal subcarrier waveform rather than a square wave, are discussed.
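
    The property that makes PN-code acquisition and tracking work is the sharp circular autocorrelation of a maximal-length sequence: N at zero lag and -1 everywhere else (in ±1 mapping). The sketch below generates a 127-chip m-sequence with a Fibonacci LFSR and verifies that property; the tap set is a standard maximal-length choice, unrelated to the actual shuttle code.

      import numpy as np

      def msequence(taps, nbits, seed=1):
          """Maximal-length PN sequence from a Fibonacci LFSR.
          taps: 1-indexed tap positions, e.g. (7, 6), a standard
          maximal-length choice for a 7-bit register."""
          state, out = seed, []
          for _ in range(2 ** nbits - 1):
              out.append(state & 1)
              fb = 0
              for t in taps:
                  fb ^= (state >> (nbits - t)) & 1
              state = (state >> 1) | (fb << (nbits - 1))
          return np.array(out)

      # In +/-1 mapping the circular autocorrelation is 127 at zero lag and
      # -1 at every other lag -- the peak the acquisition loop searches for.
      chips = 1 - 2 * msequence((7, 6), 7)
      corr = [int(np.dot(chips, np.roll(chips, k))) for k in range(127)]
      assert corr[0] == 127 and all(c == -1 for c in corr[1:])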

  13. A study of high density bit transition requirements versus the effects on BCH error correcting coding

    NASA Technical Reports Server (NTRS)

    Ingels, F.; Schoggen, W. O.

    1981-01-01

    Several methods for increasing bit transition densities in a data stream are summarized, discussed in detail, and compared against constraints imposed by the 2 MHz data link of the space shuttle high rate multiplexer unit. These methods include the use of alternate pulse code modulation waveforms, data stream modification by insertion, alternate bit inversion, differential encoding, error encoding, and the use of bit scramblers. The pseudo-random cover sequence generator was chosen for application to the 2 MHz data link of the space shuttle high rate multiplexer unit. This method is fully analyzed and a design implementation is proposed.
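
    The cover-sequence idea is easy to demonstrate: XORing the data with a pseudo-random sequence known to both ends raises the bit transition density to roughly one half without affecting the recovered payload. The Python sketch below uses a seeded stand-in sequence rather than the report's actual generator.

      import random

      def transitions(bits):
          """Fraction of adjacent bit pairs that differ (transition density)."""
          return sum(a != b for a, b in zip(bits, bits[1:])) / (len(bits) - 1)

      # A pathological downlink pattern: long constant runs give the bit
      # synchronizer almost no transitions to lock onto.
      data = [0] * 500 + [1] * 500

      rng = random.Random(42)                 # stand-in cover sequence
      cover = [rng.randint(0, 1) for _ in data]
      scrambled = [d ^ c for d, c in zip(data, cover)]
      descrambled = [s ^ c for s, c in zip(scrambled, cover)]

      print(f"raw: {transitions(data):.3f}  scrambled: {transitions(scrambled):.3f}")
      assert descrambled == data              # scrambling is payload-transparent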

  14. Predictive Feedback Can Account for Biphasic Responses in the Lateral Geniculate Nucleus

    PubMed Central

    Jehee, Janneke F. M.; Ballard, Dana H.

    2009-01-01

    Biphasic neural response properties, where the optimal stimulus for driving a neural response changes from one stimulus pattern to the opposite stimulus pattern over short periods of time, have been described in several visual areas, including lateral geniculate nucleus (LGN), primary visual cortex (V1), and middle temporal area (MT). We describe a hierarchical model of predictive coding and simulations that capture these temporal variations in neuronal response properties. We focus on the LGN-V1 circuit and find that after training on natural images the model exhibits the brain's LGN-V1 connectivity structure, in which the structure of V1 receptive fields is linked to the spatial alignment and properties of center-surround cells in the LGN. In addition, the spatio-temporal response profile of LGN model neurons is biphasic in structure, resembling the biphasic response structure of neurons in cat LGN. Moreover, the model displays a specific pattern of influence of feedback, where LGN receptive fields that are aligned over a simple cell receptive field zone of the same polarity decrease their responses while neurons of opposite polarity increase their responses with feedback. This phase-reversed pattern of influence was recently observed in neurophysiology. These results corroborate the idea that predictive feedback is a general coding strategy in the brain. PMID:19412529
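
    The general predictive-coding loop behind such models can be sketched compactly: a higher area predicts lower-level activity through generative weights, only the residual error is fed forward, and the higher-level representation is updated to reduce that error. The sketch below follows a Rao-and-Ballard-style update with illustrative parameters; it is not the paper's trained LGN-V1 model.

      import numpy as np

      rng = np.random.default_rng(0)
      n_input, n_hidden, eta = 64, 16, 0.1

      W = rng.normal(scale=0.1, size=(n_input, n_hidden))  # generative weights
      x = rng.normal(size=n_input)                         # lower-level activity
      r = np.zeros(n_hidden)                               # higher-level estimate

      for step in range(100):
          prediction = W @ r
          error = x - prediction               # feedforward signal = residual
          r += eta * (W.T @ error - 0.01 * r)  # gradient step with weight decay
          if step % 25 == 0:
              print(step, float(np.mean(error ** 2)))  # error shrinks over time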

  15. Literature-based concept profiles for gene annotation: the issue of weighting.

    PubMed

    Jelier, Rob; Schuemie, Martijn J; Roes, Peter-Jan; van Mulligen, Erik M; Kors, Jan A

    2008-05-01

    Text-mining has been used to link biomedical concepts, such as genes or biological processes, to each other for annotation purposes or the generation of new hypotheses. To relate two concepts to each other several authors have used the vector space model, as vectors can be compared efficiently and transparently. Using this model, a concept is characterized by a list of associated concepts, together with weights that indicate the strength of the association. The associated concepts in the vectors and their weights are derived from a set of documents linked to the concept of interest. An important issue with this approach is the determination of the weights of the associated concepts. Various schemes have been proposed to determine these weights, but no comparative studies of the different approaches are available. Here we compare several weighting approaches in a large scale classification experiment. Three different techniques were evaluated: (1) weighting based on averaging, an empirical approach; (2) the log likelihood ratio, a test-based measure; (3) the uncertainty coefficient, an information-theory based measure. The weighting schemes were applied in a system that annotates genes with Gene Ontology codes. As the gold standard for our study we used the annotations provided by the Gene Ontology Annotation project. Classification performance was evaluated by means of the receiver operating characteristics (ROC) curve using the area under the curve (AUC) as the measure of performance. All methods performed well with median AUC scores greater than 0.84, and scored considerably higher than a binary approach without any weighting. Especially for the more specific Gene Ontology codes excellent performance was observed. The differences between the methods were small when considering the whole experiment. However, the number of documents that were linked to a concept proved to be an important variable. When larger amounts of texts were available for the generation of the concepts' vectors, the performance of the methods diverged considerably, with the uncertainty coefficient then outperforming the two other methods.
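
    The vector space machinery itself is compact: build a weighted concept vector from the documents linked to each gene, then compare genes by cosine similarity. The sketch below uses an averaging-style weight (the fraction of linked documents mentioning a concept) as a stand-in for the three schemes compared in the paper; the gene and concept names are invented.

      import math
      from collections import Counter

      def concept_profile(docs_concepts, weight):
          """Weighted concept vector from the concept sets of linked documents."""
          counts = Counter(c for doc in docs_concepts for c in doc)
          return {c: weight(n, len(docs_concepts)) for c, n in counts.items()}

      def cosine(u, v):
          dot = sum(u[c] * v[c] for c in u.keys() & v.keys())
          norm = (math.sqrt(sum(x * x for x in u.values()))
                  * math.sqrt(sum(x * x for x in v.values())))
          return dot / norm if norm else 0.0

      # Averaging-style weighting: fraction of linked documents with the concept.
      average = lambda n, total: n / total

      gene_a = concept_profile([{"apoptosis", "kinase"}, {"apoptosis"}], average)
      gene_b = concept_profile([{"apoptosis", "membrane"}], average)
      print(cosine(gene_a, gene_b))  # association strength between the two genes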

  16. MAPA: an interactive accelerator design code with GUI

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Cary, John R.; Shasharina, Svetlana G.

    1999-06-01

    The MAPA code is an interactive accelerator modeling and design tool with an X/Motif GUI. MAPA has been developed in C++ and makes full use of object-oriented features. We present an overview of its features and describe how users can independently extend the capabilities of the entire application, including the GUI. For example, a user can define a new model for a focusing or accelerating element. If the appropriate form is followed, and the new element is "registered" with a single line in the specified file, then the GUI will fully support this user-defined element type after it has been compiled and then linked to the existing application. In particular, the GUI will bring up windows for modifying any relevant parameters of the new element type. At present, one can use the GUI for phase space tracking, finding fixed points and generating line plots for the Twiss parameters, the dispersion and the accelerator geometry. The user can define new types of simulations which the GUI will automatically support by providing a menu option to execute the simulation and subsequently rendering line plots of the resulting data.
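
    The register-once pattern described above can be sketched generically. The Python registry below is a hypothetical illustration of the concept only (MAPA itself is C++ with an X/Motif GUI); all names are invented.

      ELEMENT_TYPES = {}

      def register(name):
          def deco(cls):
              ELEMENT_TYPES[name] = cls  # the single line that exposes the type
              return cls
          return deco

      @register("quadrupole")
      class Quadrupole:
          parameters = {"length": 1.0, "gradient": 0.0}  # editable in a dialog

          def __init__(self, **kw):
              self.params = {**self.parameters, **kw}

      # A generic GUI can enumerate types and their parameters without knowing
      # about Quadrupole specifically:
      for name, cls in ELEMENT_TYPES.items():
          print(name, cls.parameters)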

  17. Multidisciplinary design optimization of aircraft wing structures with aeroelastic and aeroservoelastic constraints

    NASA Astrophysics Data System (ADS)

    Jung, Sang-Young

    Design procedures for aircraft wing structures with control surfaces are presented using multidisciplinary design optimization. Several disciplines, such as stress analysis, structural vibration, aerodynamics, and controls, are considered simultaneously and combined for design optimization. Vibration data and aerodynamic data, including those in the transonic regime, are calculated by existing codes. Flutter analyses are performed using those data. A flutter suppression method is studied using control laws in the closed-loop flutter equation. For the design optimization, techniques such as approximation, design variable linking, temporary constraint deletion, and optimality criteria are used. Sensitivity derivatives of stresses and displacements for static loads, natural frequencies, flutter characteristics, and control characteristics with respect to design variables are calculated for an approximate optimization. The objective function is the structural weight. The design variables are the section properties of the structural elements and the control gain factors. Existing multidisciplinary optimization codes (ASTROS* and MSC/NASTRAN) are used to perform single- and multiple-constraint optimizations of fully built-up finite element wing structures. Three benchmark wing models are developed and/or modified for this purpose. The models are tested extensively.
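
    Two of the ingredients above, design variable linking and constrained minimization of structural weight, can be shown in a toy problem: four bar elements share two linked cross-section variables, and stress constraints govern the optimum. The sketch below uses made-up numbers and SciPy, not the ASTROS*/NASTRAN models.

      import numpy as np
      from scipy.optimize import minimize

      lengths = np.array([1.0, 1.0, 2.0, 2.0])  # four bar elements (m)
      loads = np.array([1e4, 1e4, 2e4, 2e4])    # axial force per element (N)
      link = np.array([0, 0, 1, 1])  # elements 0,1 share x[0]; 2,3 share x[1]
      rho, sigma_allow = 2700.0, 150e6           # aluminium density, stress limit

      def weight(x):
          areas = x[link]                        # expand linked design variables
          return rho * np.sum(areas * lengths)

      def stress_margin(x):
          areas = x[link]
          return sigma_allow - loads / areas     # must remain non-negative

      res = minimize(weight, x0=np.array([1e-3, 1e-3]),
                     constraints=[{"type": "ineq", "fun": stress_margin}],
                     bounds=[(1e-6, None)] * 2)
      print(res.x)  # shared cross-sections; stress-critical elements govern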

  18. Coupled basin-scale water resource models for arid and semiarid regions

    NASA Astrophysics Data System (ADS)

    Winter, C.; Springer, E.; Costigan, K.; Fasel, P.; Mniewski, S.; Zyvoloski, G.

    2003-04-01

    Managers of semi-arid and arid water resources must allocate increasingly variable surface sources and limited groundwater resources to growing demands. This challenge is leading to a new generation of detailed computational models that link multiple interacting sources and demands. We will discuss a new computational model of arid region hydrology that we are parameterizing for the upper Rio Grande Basin of the United States. The model consists of linked components for the atmosphere (the Regional Atmospheric Modeling System, RAMS), surface hydrology (the Los Alamos Distributed Hydrologic System, LADHS), and groundwater (the Finite Element Heat and Mass code, FEHM), and the couplings between them. The model runs under the Parallel Application WorkSpace software developed at Los Alamos for applications running on large distributed-memory computers. RAMS simulates regional meteorology coupled to global climate data on the one hand and land surface hydrology on the other. LADHS generates runoff by infiltration or saturation excess mechanisms, as well as interception, evapotranspiration, and snow accumulation and melt. FEHM simulates variably saturated flow and heat transport in three dimensions. A key issue is to increase the components’ spatial and temporal resolution to account for changes in topography and other rapidly changing variables that affect results such as soil moisture distribution or groundwater recharge. Thus, RAMS’ smallest grid is 5 km on a side, LADHS uses 100 m spacing, while FEHM concentrates processing on key volumes by means of an unstructured grid. Couplings within our model are based on new scaling methods that link groundwater systems to one another and streams to aquifers, and we are developing evapotranspiration methods based on detailed calculations of latent heat and vegetative cover. Simulations of precipitation and soil moisture for the 1992-93 El Niño year will be used to demonstrate the approach and suggest further needs.
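
    Schematically, the linked-component structure reduces to a time loop in which each model advances a step at its own resolution and hands fluxes to the next. The Python sketch below is illustrative structure only; the method names are invented and do not correspond to the actual RAMS/LADHS/FEHM interfaces, and no concrete component classes are provided.

      def run_coupled(atmosphere, surface, groundwater, n_days):
          for day in range(n_days):
              # Atmosphere (coarse grid) supplies precipitation and energy forcing.
              precip, energy = atmosphere.step(day)
              # Surface hydrology (fine grid) partitions it into runoff, ET, and
              # recharge; downscaling from ~5 km to ~100 m happens at this seam.
              runoff, et, recharge = surface.step(precip, energy)
              # Groundwater takes recharge on its unstructured grid and returns
              # state that feeds back upward through the stream-aquifer coupling.
              baseflow, water_table = groundwater.step(recharge)
              surface.update_lower_boundary(water_table)
              atmosphere.update_land_state(surface.soil_moisture())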

  19. Aeronautical audio broadcasting via satellite

    NASA Technical Reports Server (NTRS)

    Tzeng, Forrest F.

    1993-01-01

    A system design for aeronautical audio broadcasting, with C-band uplink and L-band downlink, via Inmarsat space segments is presented. Near-transparent-quality compression of 5-kHz-bandwidth audio at 20.5 kbit/s is achieved with a hybrid technique employing linear predictive modeling and transform-domain residual quantization. Concatenated Reed-Solomon/convolutional codes with quadrature phase shift keying are selected for bandwidth and power efficiency. An RF bandwidth of 25 kHz per channel and a decoded bit error rate of 10^-6 at Eb/N0 = 3.75 dB are obtained. An interleaver, scrambler, modem synchronization, and frame format were designed, and frequency-division multiple access was selected over code-division multiple access. A link budget computation based on a worst-case scenario indicates sufficient system power margins. Transponder occupancy analysis for 72 audio channels demonstrates ample remaining capacity to accommodate emerging aeronautical services.
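
    The quoted figures can be sanity-checked with standard link arithmetic: the required carrier-to-noise-density ratio is C/N0 = Eb/N0 + 10·log10(R). The sketch below assumes a rate-1/2 inner code and ignores the small Reed-Solomon overhead; it is a back-of-the-envelope check, not the paper's worst-case budget.

      import math

      rate_bps = 20500.0   # source bit rate from the abstract
      ebno_db = 3.75       # operating Eb/N0 with the concatenated code

      # Required C/N0 (dB-Hz): C/N0 = Eb/N0 + 10*log10(bit rate).
      cn0_db = ebno_db + 10.0 * math.log10(rate_bps)
      print(f"required C/N0 = {cn0_db:.1f} dB-Hz")    # ~46.9 dB-Hz

      # Bandwidth sanity check: QPSK carries 2 coded bits per symbol, so a
      # rate-1/2 inner code needs roughly rate_bps symbols per second.
      coded_rate = 2 * rate_bps          # assumed rate-1/2 convolutional code
      symbol_rate = coded_rate / 2.0     # QPSK: 2 coded bits per symbol
      print(f"symbol rate = {symbol_rate/1e3:.1f} ksym/s in a 25 kHz channel")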

  20. Normalization is a general neural mechanism for context-dependent decision making

    PubMed Central

    Louie, Kenway; Khaw, Mel W.; Glimcher, Paul W.

    2013-01-01

    Understanding the neural code is critical to linking brain and behavior. In sensory systems, divisive normalization seems to be a canonical neural computation, observed in areas ranging from retina to cortex and mediating processes including contrast adaptation, surround suppression, visual attention, and multisensory integration. Recent electrophysiological studies have extended these insights beyond the sensory domain, demonstrating an analogous algorithm for the value signals that guide decision making, but the effects of normalization on choice behavior are unknown. Here, we show that choice models using normalization generate significant (and classically irrational) choice phenomena driven by either the value or number of alternative options. In value-guided choice experiments, both monkey and human choosers show novel context-dependent behavior consistent with normalization. These findings suggest that the neural mechanism of value coding critically influences stochastic choice behavior and provide a generalizable quantitative framework for examining context effects in decision making. PMID:23530203
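
    The canonical computation is divisive: each option's coded value is R_i = V_i / (σ + Σ_j V_j), so adding a third, low-value alternative compresses the representation of the original pair. A minimal worked example with illustrative numbers:

      import numpy as np

      def normalized_value(values, sigma=1.0):
          """Divisive normalization: each option's firing rate is its value
          divided by (sigma + the summed value of all options)."""
          values = np.asarray(values, dtype=float)
          return values / (sigma + values.sum())

      print(normalized_value([10.0, 8.0]))        # two options
      print(normalized_value([10.0, 8.0, 4.0]))   # same two plus a distractor
      # The gap between the top two coded values shrinks when the distractor
      # is added, which can make choices between them noisier -- the
      # classically "irrational" context effect described above.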

  1. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    PubMed

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.

  2. Developing small-area predictions for smoking and obesity prevalence in the United States for use in Environmental Public Health Tracking.

    PubMed

    Ortega Hinojosa, Alberto M; Davies, Molly M; Jarjour, Sarah; Burnett, Richard T; Mann, Jennifer K; Hughes, Edward; Balmes, John R; Turner, Michelle C; Jerrett, Michael

    2014-10-01

    Globally and in the United States, smoking and obesity are leading causes of death and disability. Reliable estimates of prevalence for these risk factors are often missing variables in public health surveillance programs. This may limit the capacity of public health surveillance to target interventions or to assess associations between other environmental risk factors (e.g., air pollution) and health, because smoking and obesity are often important confounders. Our objective was to generate prevalence estimates of smoking and obesity over small areas of the United States (i.e., at the ZIP code and census tract levels). We predicted smoking and obesity prevalence using a combined approach: a lasso-based variable selection procedure followed by a two-level random effects regression with a Poisson link, clustered on state and county. We used data from the Behavioral Risk Factor Surveillance System (BRFSS) from 1991 to 2010 to estimate the model, and tested it using 10-fold cross-validated mean squared errors and the variance of the residuals. To downscale the estimates, we combined the prediction equations with 1990 and 2000 U.S. Census data for each of the four five-year time periods in this range at the ZIP code and census tract levels. Several sensitivity analyses were conducted using models that included only basic terms, models that accounted for spatial autocorrelation, and generalized linear models without random effects. The two-level random effects model produced improved estimates compared to the fixed-effects-only models. Estimates were particularly improved for the two-thirds of the conterminous U.S. where BRFSS data were available to estimate the county-level random effects. We downscaled the smoking and obesity rate predictions to derive ZIP code and census tract estimates. To our knowledge these smoking and obesity predictions are the first to be developed for the entire conterminous U.S. for census tracts and ZIP codes. Our estimates could have significant utility for public health surveillance.
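
    The two-stage estimation strategy can be sketched on synthetic data: a lasso screen over candidate predictors followed by a Poisson-link regression on the survivors. The sketch below omits the paper's state and county random effects and uses log1p(y) as a crude screening target, so it is a simplified caricature, not the published model.

      import numpy as np
      import statsmodels.api as sm
      from sklearn.linear_model import LassoCV

      rng = np.random.default_rng(0)
      n, p = 2000, 20
      X = rng.normal(size=(n, p))                       # census-style predictors
      beta = np.zeros(p); beta[:3] = [0.4, -0.3, 0.2]   # only 3 truly matter
      y = rng.poisson(np.exp(-1.0 + X @ beta))          # counts per small area

      # Stage 1: lasso screens the predictors (10-fold cross-validation).
      lasso = LassoCV(cv=10).fit(X, np.log1p(y))
      keep = np.flatnonzero(lasso.coef_ != 0)

      # Stage 2: Poisson GLM on the selected predictors (random effects omitted).
      glm = sm.GLM(y, sm.add_constant(X[:, keep]),
                   family=sm.families.Poisson()).fit()
      print(keep, glm.params.round(2))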

  3. 28 CFR 36.608 - Guidance concerning model codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Codes § 36.608 Guidance concerning model codes. Upon application by an authorized representative of a... relevant model code and issue guidance concerning whether and in what respects the model code is consistent...

  4. Programmable Pulse-Position-Modulation Encoder

    NASA Technical Reports Server (NTRS)

    Zhu, David; Farr, William

    2006-01-01

    A programmable pulse-position-modulation (PPM) encoder has been designed for use in testing an optical communication link. The encoder includes a programmable state machine and an electronic code book that can be updated to accommodate different PPM coding schemes. The encoder includes a field-programmable gate array (FPGA) that is programmed to step through the stored state machine and code book and that drives a custom high-speed serializer circuit board that is capable of generating subnanosecond pulses. The stored state machine and code book can be updated by means of a simple text interface through the serial port of a personal computer.
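
    M-ary PPM itself is a small amount of logic: each log2(M)-bit symbol selects which of M slots in a frame carries the single pulse, and a programmable code book amounts to swapping the symbol-to-slot table. A minimal Python sketch (16-PPM with a fixed natural mapping, not the encoder's actual code book):

      M = 16  # 16-PPM: 4 bits per symbol

      def ppm_encode(bits):
          assert len(bits) % 4 == 0
          frames = []
          for i in range(0, len(bits), 4):
              symbol = int("".join(map(str, bits[i:i + 4])), 2)
              frame = [0] * M
              frame[symbol] = 1        # pulse position encodes the symbol
              frames.extend(frame)
          return frames

      def ppm_decode(slots):
          bits = []
          for i in range(0, len(slots), M):
              symbol = slots[i:i + M].index(1)
              bits += [int(b) for b in format(symbol, "04b")]
          return bits

      data = [1, 0, 1, 1, 0, 0, 1, 0]
      assert ppm_decode(ppm_encode(data)) == data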

  5. Optical PAyload for Lasercomm Science (OPALS) link validation

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Oaida, Bogdan V.; Andrews, Kenneth S.; Kovalik, Joseph M.; Abrahamson, Matthew J.; Wright, Malcolm W.

    2015-01-01

    Recently, several daytime and nighttime links under diverse atmospheric conditions were completed using the Optical Payload for Lasercomm Science (OPALS) flight system on board the International Space Station (ISS). In this paper we compare the measured optical power and its variance at either end of the link with predictions that include atmospheric propagation models. For the 976 nm laser beacon transmitted from the ground to the ISS, the measured mean irradiance of tens of microwatts per square meter close to zenith, and its decrease with range and increasing air mass, show good agreement with predictions. The irradiance fluctuations sampled at 100 Hz also follow the expected increase in scintillation with air mass, consistent with atmospheric coherence lengths at zenith (at 500 nm) in the 3-8 cm range. The predicted downlink power of hundreds of nanowatts was likewise reconciled within the uncertainty of the atmospheric losses. Expected link performance was verified, with uncoded bit error rates below 1E-4 as required for the Reed-Solomon code to correct errors in video, text, and file transmission. The comparison of predicted and measured powers and fluctuations suggests the need for further study and refinement.

  6. Critical role for cochlear hair cell BK channels for coding the temporal structure and dynamic range of auditory information for central auditory processing

    PubMed Central

    Kurt, Simone; Sausbier, Matthias; Rüttiger, Lukas; Brandt, Niels; Moeller, Christoph K.; Kindler, Jennifer; Sausbier, Ulrike; Zimmermann, Ulrike; van Straaten, Harald; Neuhuber, Winfried; Engel, Jutta; Knipper, Marlies; Ruth, Peter; Schulze, Holger

    2012-01-01

    Large conductance, voltage- and Ca2+-activated K+ (BK) channels in inner hair cells (IHCs) of the cochlea are essential for hearing. However, germline deletion of BKα, the pore-forming subunit KCNMA1 of the BK channel, surprisingly did not affect hearing thresholds in the first postnatal weeks, even though altered IHC membrane time constants, decreased IHC receptor potential alternating current/direct current ratio, and impaired spike timing of auditory fibers were reported in these mice. To investigate the role of IHC BK channels for central auditory processing, we generated a conditional mouse model with hair cell-specific deletion of BKα from postnatal day 10 onward. This had an unexpected effect on temporal coding in the central auditory system: neuronal single and multiunit responses in the inferior colliculus showed higher excitability and greater precision of temporal coding that may be linked to the improved discrimination of temporally modulated sounds observed in behavioral training. The higher precision of temporal coding, however, was restricted to slower modulations of sound and reduced stimulus-driven activity. This suggests a diminished dynamic range of stimulus coding that is expected to impair signal detection in noise. Thus, BK channels in IHCs are crucial for central coding of the temporal fine structure of sound and for detection of signals in a noisy environment. PMID:22691916

  7. The disclosure of diagnosis codes can breach research participants' privacy.

    PubMed

    Loukides, Grigorios; Denny, Joshua C; Malin, Bradley

    2010-01-01

    De-identified clinical data in standardized form (e.g., diagnosis codes), derived from electronic medical records, are increasingly combined with research data (e.g., DNA sequences) and disseminated to enable scientific investigations. This study examines whether released data can be linked with identified clinical records that are accessible via various resources to jeopardize patients' anonymity, and the ability of popular privacy protection methodologies to prevent such an attack. The study experimentally evaluates the re-identification risk of a de-identified sample of Vanderbilt's patient records involved in a genome-wide association study. It also measures the level of protection from re-identification, and the data utility, provided by suppression and generalization. Privacy protection is quantified using the probability of re-identifying a patient in a larger population through diagnosis codes. Data utility is measured at the dataset level, using the percentage of retained information as well as its description, and at the patient level, using two metrics based on the difference between the distribution of International Classification of Diseases (ICD) version 9 codes before and after applying privacy protection. More than 96% of 2800 patients' records are shown to be uniquely identified by their diagnosis codes with respect to a population of 1.2 million patients. Generalization is shown to reduce the percentage of re-identifiable records by less than 2%, and over 99% of the three-digit ICD-9 codes need to be suppressed to prevent re-identification. Popular privacy protection methods are inadequate to deliver a sufficiently protected and useful result when sharing data derived from complex clinical systems. The development of alternative privacy protection models is thus required.
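
    The distinguishability measure behind the attack is simple to state: the fraction of records whose combination of diagnosis codes is unique in the reference population. The sketch below computes it for four toy records with illustrative ICD-9 codes, not the study's data.

      from collections import Counter

      population = [
          frozenset({"250.00", "401.9"}),          # diabetes + hypertension
          frozenset({"250.00", "401.9"}),
          frozenset({"250.00", "401.9", "272.4"}),
          frozenset({"714.0"}),                    # rheumatoid arthritis
      ]

      counts = Counter(population)
      unique = sum(1 for rec in population if counts[rec] == 1)
      print(f"{unique}/{len(population)} records re-identifiable by code set alone")
      # Generalizing codes to three digits ("250.00" -> "250") merges records
      # and lowers uniqueness -- the utility/privacy trade-off the study evaluates.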

  8. Natural selection reduced diversity on human Y chromosomes.

    PubMed

    Wilson Sayres, Melissa A; Lohmueller, Kirk E; Nielsen, Rasmus

    2014-01-01

    The human Y chromosome exhibits surprisingly low levels of genetic diversity. This could result from neutral processes if the effective population size of males is reduced relative to females due to a higher variance in the number of offspring from males than from females. Alternatively, selection acting on new mutations, and affecting linked neutral sites, could reduce variability on the Y chromosome. Here, using genome-wide analyses of X, Y, autosomal and mitochondrial DNA, in combination with extensive population genetic simulations, we show that low observed Y chromosome variability is not consistent with a purely neutral model. Instead, we show that models of purifying selection are consistent with observed Y diversity. Further, the number of sites estimated to be under purifying selection greatly exceeds the number of Y-linked coding sites, suggesting the importance of the highly repetitive ampliconic regions. While we show that purifying selection removing deleterious mutations can explain the low diversity on the Y chromosome, we cannot exclude the possibility that positive selection acting on beneficial mutations could have also reduced diversity in linked neutral regions, and may have contributed to lowering human Y chromosome diversity. Because the functional significance of the ampliconic regions is poorly understood, our findings should motivate future research in this area.

  9. Natural Selection Reduced Diversity on Human Y Chromosomes

    PubMed Central

    Wilson Sayres, Melissa A.; Lohmueller, Kirk E.; Nielsen, Rasmus

    2014-01-01

    The human Y chromosome exhibits surprisingly low levels of genetic diversity. This could result from neutral processes if the effective population size of males is reduced relative to females due to a higher variance in the number of offspring from males than from females. Alternatively, selection acting on new mutations, and affecting linked neutral sites, could reduce variability on the Y chromosome. Here, using genome-wide analyses of X, Y, autosomal and mitochondrial DNA, in combination with extensive population genetic simulations, we show that low observed Y chromosome variability is not consistent with a purely neutral model. Instead, we show that models of purifying selection are consistent with observed Y diversity. Further, the number of sites estimated to be under purifying selection greatly exceeds the number of Y-linked coding sites, suggesting the importance of the highly repetitive ampliconic regions. While we show that purifying selection removing deleterious mutations can explain the low diversity on the Y chromosome, we cannot exclude the possibility that positive selection acting on beneficial mutations could have also reduced diversity in linked neutral regions, and may have contributed to lowering human Y chromosome diversity. Because the functional significance of the ampliconic regions is poorly understood, our findings should motivate future research in this area. PMID:24415951

  10. The structure of human 4F2hc ectodomain provides a model for homodimerization and electrostatic interaction with plasma membrane.

    PubMed

    Fort, Joana; de la Ballina, Laura R; Burghardt, Hans E; Ferrer-Costa, Carles; Turnay, Javier; Ferrer-Orta, Cristina; Usón, Isabel; Zorzano, Antonio; Fernández-Recio, Juan; Orozco, Modesto; Lizarbe, María Antonia; Fita, Ignacio; Palacín, Manuel

    2007-10-26

    4F2hc (CD98hc) is a multifunctional type II membrane glycoprotein involved in amino acid transport and cell fusion, adhesion, and transformation. The structure of the ectodomain of human 4F2hc has been solved using monoclinic (Protein Data Bank code 2DH2) and orthorhombic (Protein Data Bank code 2DH3) crystal forms at 2.1 and 2.8 Å, respectively. It is composed of a (βα)₈ barrel and an antiparallel β₈ sandwich related to bacterial α-glycosidases, although lacking key catalytic residues and consequently catalytic activity. 2DH3 is a dimer with Zn²⁺ coordination at the interface. Human 4F2hc expressed in several cell types resulted in cell surface and Cys109 disulfide bridge-linked homodimers with major architectural features of the crystal dimer, as demonstrated by cross-linking experiments. 4F2hc has no significant hydrophobic patches at the surface. Monomer and homodimer have a polarized charged surface. The N terminus of the solved structure, including the position of the Cys109 residue located four residues apart from the transmembrane domain, is adjacent to the positive face of the ectodomain. This location of the N terminus and the Cys109-intervening disulfide bridge imposes space restrictions sufficient to support a model for electrostatic interaction of the 4F2hc ectodomain with membrane phospholipids. These results provide the first crystal structure of heteromeric amino acid transporters and suggest a dynamic interaction of the 4F2hc ectodomain with the plasma membrane.

  11. Multi-Body Analysis of the 1/5 Scale Wind Tunnel Model of the V-22 Tiltrotor

    NASA Technical Reports Server (NTRS)

    Ghiringhelli, G. L.; Masarati, P.; Mantegazza, P.; Nixon, M. W.

    1999-01-01

    The paper presents a multi-body analysis of the 1/5 scale wind tunnel model of the V-22 tiltrotor, the Wing and Rotor Aeroelastic Testing System (WRATS), currently tested at NASA Langley Research Center. An original multi-body formulation has been developed at the Dipartimento di Ingegneria Aerospaziale of the Politecnico di Milano, Italy. It is based on the direct writing of the equilibrium equations of independent rigid bodies, connected by kinematic constraints, which result in the addition of algebraic constraint equations, and by dynamic constraints, which contribute directly to the equilibrium equations. The formulation has been extended to the simultaneous solution of interdisciplinary problems by modeling electric and hydraulic networks, for aeroservoelastic problems. The code has been tailored to the modeling of rotorcraft while preserving complete generality. A family of aerodynamic elements has been introduced to model high-aspect-ratio aerodynamic surfaces, based on strip theory, with quasi-steady aerodynamic coefficients, compressibility, post-stall interpolation of experimental data, dynamic stall modeling, and radial flow drag. Different models for the induced velocity of the rotor can be used, from uniform velocity to dynamic inflow. A complete dynamic and aeroelastic analysis of the model of the V-22 tiltrotor has been performed, to assess the validity of the formulation and to exploit the unique features of multi-body analysis with respect to conventional comprehensive rotorcraft codes: the ability to model the exact kinematics of mechanical systems, and the possibility of simulating unusual maneuvers and flight conditions that are particular to the tiltrotor, e.g. the conversion maneuver. A complete modal validation of the analytical model has been performed, to assess the ability to reproduce the correct dynamics of the system with a relatively coarse beam model of the semispan wing, pylon, and rotor. Particular care has been taken to model the kinematics of the gimbal joint that characterizes the rotor hub, and of the control system, consisting of the entire swashplate mechanism. The kinematics of the fixed and rotating plates have been modeled, with variable-length control links used to input the controls, the rotating flexible links, the pitch horns, and the pitch bearings. The investigations took advantage of concurrent wind tunnel test runs, performed in August 1998, which allowed the acquisition of data specific to the multi-body analysis.
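
    The core of such a formulation, the equilibrium equations of free bodies augmented by algebraic constraint equations, can be illustrated on the smallest possible example. The sketch below is a generic textbook construction, not the Politecnico di Milano code: a planar pendulum is modeled as a free point mass, the pin joint is imposed through a Lagrange multiplier, and accelerations and the constraint reaction are solved for together.

      import numpy as np

      m, L, grav, dt = 1.0, 1.0, 9.81, 1e-3
      q = np.array([L, 0.0])    # position; starts horizontal
      v = np.array([0.0, 0.0])  # velocity

      for step in range(5000):
          G = 2.0 * q                  # Jacobian of g(q) = x^2 + y^2 - L^2
          gamma = -2.0 * (v @ v)       # enforces d2g/dt2 = 0, i.e. G @ a = gamma
          # KKT system: [M  G^T; G  0] [a; lam] = [f_ext; gamma]
          kkt = np.array([[m,    0.0,  G[0]],
                          [0.0,  m,    G[1]],
                          [G[0], G[1], 0.0]])
          rhs = np.array([0.0, -m * grav, gamma])
          a = np.linalg.solve(kkt, rhs)[:2]
          v += dt * a                  # semi-implicit Euler step
          q += dt * v

      print("constraint drift:", q @ q - L**2)  # small; grows without stabilization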

  12. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
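
    The quoted repeat period is consistent with simple arithmetic on a 32-bit seconds counter; the check below makes that explicit (the abstract does not describe the internal Geo-TimeCode format, so the 32-bit assumption is ours).

      seconds = 2**32
      years = seconds / (365.25 * 24 * 3600)
      print(f"{years:.1f} years before a 32-bit seconds code repeats")  # ~136.1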

  13. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1986-01-01

    A coding scheme for error control in data communication systems is investigated. The scheme is obtained by cascading two error-correcting codes, called the inner and the outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate ε < 1/2. It is shown that, if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes, with inner codes ranging from high rates to very low rates combined with Reed-Solomon outer codes, are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates, say 0.1 to 0.01. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
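
    The reliability claim can be reproduced with binomial tail arithmetic. The sketch below uses hypothetical parameters in the spirit of the schemes described (a t-error-correcting inner block code whose decoding failures appear as symbol errors to a Reed-Solomon outer code); it ignores decoder miscorrections and is not one of the specific schemes evaluated in the paper.

      from math import comb

      def tail(n, t, p):
          """P(more than t errors among n independent events of probability p)."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))

      eps = 0.05                        # channel bit error rate, inside the quoted range
      p_inner = tail(63, 7, eps)        # e.g. a length-63 inner code correcting 7 errors
      p_block = tail(255, 16, p_inner)  # e.g. RS(255,223) correcting 16 symbol errors
      print(f"inner-block failure {p_inner:.3e}, decoded-block failure {p_block:.3e}")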

  14. Ascertaining severe perineal trauma and associated risk factors by comparing birth data with multiple sources.

    PubMed

    Ampt, Amanda J; Ford, Jane B

    2015-09-30

    Population data are often used to monitor severe perineal trauma trends and investigate risk factors. Within New South Wales (NSW), two different datasets can be used, the Perinatal Data Collection ('birth' data) or a linked dataset combining birth data with the Admitted Patient Data Collection ('hospital' data). Severe perineal trauma can be ascertained by birth data alone, or by hospital International Classification of Diseases Australian Modification (ICD-10-AM) diagnosis and procedure coding in the linked dataset. The aim of this study was to compare rates and risk factors for severe perineal trauma using birth data alone versus using linked data. The study population consisted of all vaginal births in NSW between 2001 and 2011. Perineal injury coding in birth data was revised in 2006, so data were analysed separately for 2001-06 and 2006-11. Rates of severe perineal injury over time were compared in birth data alone versus linked data. Kappa and agreement statistics were calculated. Risk factor distributions (maternal age, primiparity, instrumental birth, birthweight ≥4 kg, Asian country of birth and episiotomy) were compared between women with severe perineal trauma identified by birth data alone, and those identified by linked data. Multivariable logistic regression was used to calculate the adjusted odds ratios (aORs) of severe perineal trauma. Among 697 202 women with vaginal births, 2.1% were identified with severe perineal trauma by birth data alone, and 2.6% by linked data. The rate discrepancy was higher among earlier data (1.7% for birth data, 2.4% for linked data). Kappa for earlier data was 0.78 (95% CI 0.78, 0.79), and 0.89 (95% CI 0.89, 0.89) for more recent data. With the exception of episiotomy, differences in risk factor distributions were small, with similar aORs. The aOR of severe perineal trauma for episiotomy was higher using linked data (1.33, 95% CI 1.27, 1.40) compared with birth data (1.02, 95% CI 0.97, 1.08). Although discrepancies in ascertainment of severe perineal trauma improved after revision of birth data coding in 2006, higher ascertainment by linked data was still evident for recent data. There were also higher risk estimates of severe perineal trauma with episiotomy by linked data than by birth data.

  15. Proceedings of the U.S. Army Symposium on Gun Dynamics (5th) Held in Rensselaerville, New York on 23-25 September 1987

    DTIC Science & Technology

    1987-09-01

    have shown that gun barrel heating, and hence thermal expansion, is both axially and circumferentially asymmetric. Circumferential, or cross-barrel...element code, which ended in the selection of ABAQUS. The code will perform static, dynamic, and thermal analysis on a broad range of structures...analysis may be performed by a user-supplied FORTRAN subroutine which is automatically linked to the code and supplements the standard ABAQUS

  16. Simulating the dynamics of complex plasmas.

    PubMed

    Schwabe, M; Graves, D B

    2013-08-01

    Complex plasmas are low-temperature plasmas that contain micrometer-size particles in addition to the neutral gas particles and the ions and electrons that make up the plasma. The microparticles interact strongly and display a wealth of collective effects. Here we report on linked numerical simulations that reproduce many of the experimental results of complex plasmas. We model a capacitively coupled plasma with a fluid code written for the commercial package COMSOL. The output of this model is used to calculate forces on microparticles. The microparticles are modeled using the molecular dynamics package LAMMPS, which we extended to include the forces from the plasma. Using this method, we are able to reproduce void formation, the separation of particles of different sizes into layers, lane formation, vortex formation, and other effects.
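
    The hand-off between the two codes can be pictured with a toy version of the coupling loop: a force profile of the kind the fluid model supplies is interpolated onto particle positions inside a simple dynamics integrator. Everything below (profile shape, particle properties, drag coefficient) is invented for illustration and stands in for the COMSOL and LAMMPS stages.

      import numpy as np

      z_grid = np.linspace(0.0, 0.03, 301)        # height above electrode (m)
      F_plasma = 1e-11 * np.exp(-z_grid / 0.005)  # e.g. sheath force on a particle (N)

      m, g, drag, dt = 6e-13, 9.81, 1e-11, 1e-5   # particle mass (kg), gas drag (kg/s)
      rng = np.random.default_rng(0)
      z = 0.004 + 1e-4 * rng.standard_normal(50)  # 50 microparticles near 4 mm
      vz = np.zeros_like(z)

      for step in range(50000):
          f = np.interp(z, z_grid, F_plasma) - m * g - drag * vz
          vz += dt * f / m                        # semi-implicit Euler step
          z += dt * vz

      print("mean levitation height (m):", z.mean())  # settles where forces balance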

  17. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud, and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can be tackled. These new infrastructures are highly parallelised, and to utilise them fully and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and, in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund its development, to gain credit for the effort, IP, time and dollars spent, and will facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected, components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should record licensing, the hardware environments the code can run on, appropriate validation (testing) procedures, and the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases, achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g. Geoscientific Model Development) to help users know which codes to trust. 3) Referencing will be accomplished by linking the software framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on the information supplied at registration, the benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines to automatically capture information relating to a particular run of the software, including identification of all input and output artefacts and of all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
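
    As a sketch of what a Register entry might need to capture (every field name and value below is invented for illustration):

      register_entry = {
          "name": "example-model",
          "repository": "https://example.org/example-model.git",
          "license": "Apache-2.0",
          "environments": ["linux-x86_64", "cray-xc40"],
          "validation": "run the bundled benchmark suite against reference output",
          "dependencies": {"python": ">=3.8", "mpi": "openmpi>=4.0"},
      }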

  18. What Are Those Checkerboard Things? How Quick Response Codes Can Enrich Student Projects

    ERIC Educational Resources Information Center

    Tucker, Al

    2011-01-01

    Students enrolled in the author's commercial arts program design and publish the school's yearbook. For the 2010-2011 school year, the students applied Quick Response (QR) code technology to include links to events that occurred after the yearbook's print deadline, including graduation. The technology has many applications in the school setting,…

  19. The Chern-Simons current in time series of knots and links in proteins

    NASA Astrophysics Data System (ADS)

    Capozziello, Salvatore; Pincak, Richard

    2018-06-01

    A superspace model of knots and links for DNA time-series data is proposed to take into account the feedback loop from the docking to the undocking state of protein-protein interactions. In particular, the direction of interactions between the 8 hidden states of DNA is considered. It is an E8×E8 unified spin model where the genotype, from the active and inactive sides of DNA time-series data, can be considered for any living organism. The mathematical model is borrowed from loop quantum gravity and adapted to biology. It is used to derive equations for gene expression describing transitions from ground to excited states, and for the 8 coupling states between geneon and anti-geneon transposon and retrotransposon in trash DNA. Specifically, we adopt a modified Grothendieck cohomology and a modified Khovanov cohomology for biology. The result is a Chern-Simons current in (8+3) extra dimensions of a given unoriented supermanifold with ghost fields of protein structures. The 8 dimensions come from the 8 hidden states of the spinor field of the genetic code. The extra dimensions come from the 3 types of principal fiber bundle in the secondary protein structure.

  20. Strategies for fitting nonlinear ecological models in R, AD Model Builder, and BUGS

    USGS Publications Warehouse

    Bolker, Benjamin M.; Gardner, Beth; Maunder, Mark; Berg, Casper W.; Brooks, Mollie; Comita, Liza; Crone, Elizabeth; Cubaynes, Sarah; Davies, Trevor; de Valpine, Perry; Ford, Jessica; Gimenez, Olivier; Kéry, Marc; Kim, Eun Jung; Lennert-Cody, Cleridy; Magnusson, Arni; Martell, Steve; Nash, John; Nielsen, Anders; Regentz, Jim; Skaug, Hans; Zipkin, Elise

    2013-01-01

    1. Ecologists often use nonlinear fitting techniques to estimate the parameters of complex ecological models, with attendant frustration. This paper compares three open-source model fitting tools and discusses general strategies for defining and fitting models. 2. R is convenient and (relatively) easy to learn, AD Model Builder is fast and robust but comes with a steep learning curve, while BUGS provides the greatest flexibility at the price of speed. 3. Our model-fitting suggestions range from general cultural advice (where possible, use the tools and models that are most common in your subfield) to specific suggestions about how to change the mathematical description of models to make them more amenable to parameter estimation. 4. A companion web site (https://groups.nceas.ucsb.edu/nonlinear-modeling/projects) presents detailed examples of application of the three tools to a variety of typical ecological estimation problems; each example links both to a detailed project report and to full source code and data.

  1. A Description of Advertisements for Alcohol on LinkNYC Kiosks in Manhattan, New York City: A Pilot Study.

    PubMed

    Basch, Corey H; Ethan, Danna; LeBlanc, Michael; Basch, Charles E

    2018-02-26

    Excessive alcohol consumption compromises health and increases risk of mortality. Advertisements for alcohol in city environments have been shown to influence consumption. The aim of this pilot study was to estimate the prevalence of alcohol advertisements displayed on LinkNYC kiosks, a new communication channel that provides outdoor Wi-Fi access and advertising on streets within urban environments. Direct observations were conducted to document advertisements on a 20% random sample of the 500 LinkNYC kiosks in Manhattan, NYC. From May to September of 2017, each of the 100 selected kiosks was observed for a 10-min period to document advertisements for alcohol. In addition, differences in prevalence of alcohol advertisements were examined by the location of the kiosk based on NYC zip codes' median annual income. Of the 2025 advertisements observed, 5.09% (N = 103) were for an alcohol product (including duplicates). Such advertisements were observed on 17% of the kiosks. No health warnings or age warnings were presented in any of the alcohol advertisements. Compared with kiosks located in zip codes with lower median annual income, significantly more alcohol advertisements were displayed in zip codes with higher median annual income. This is the first study to estimate the prevalence of alcohol advertising on the LinkNYC Wi-Fi and telecommunication system, now ubiquitous on Manhattan's sidewalks. This study adds to the current literature that suggests New York City residents could benefit from health-promoting versus health-compromising advertising. The findings also highlight the potential of LinkNYC kiosk marketing to undermine health-related social marketing efforts by City government and other organizations.

  2. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barahona, B.; Jonkman, J.; Damiani, R.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
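
    The Craig-Bampton step mentioned above is compact enough to sketch. The function below is the generic textbook reduction, not SubDyn source code: boundary (interface) degrees of freedom stay physical, while interior dynamics are condensed into static constraint modes plus a truncated set of fixed-interface normal modes.

      import numpy as np
      from scipy.linalg import eigh

      def craig_bampton(M, K, b_idx, i_idx, n_modes):
          """Reduce (M, K) to boundary DOFs b_idx plus n_modes modal DOFs."""
          Mii = M[np.ix_(i_idx, i_idx)]
          Kii = K[np.ix_(i_idx, i_idx)]
          Kib = K[np.ix_(i_idx, b_idx)]

          Psi = -np.linalg.solve(Kii, Kib)   # static (Guyan) constraint modes
          _, Phi = eigh(Kii, Mii)            # fixed-interface normal modes
          Phi = Phi[:, :n_modes]             # keep the lowest few

          nb, ni = len(b_idx), len(i_idx)
          T = np.zeros((nb + ni, nb + n_modes))
          T[:nb, :nb] = np.eye(nb)
          T[nb:, :nb] = Psi
          T[nb:, nb:] = Phi

          order = np.concatenate([b_idx, i_idx])
          Mo = M[np.ix_(order, order)]
          Ko = K[np.ix_(order, order)]
          return T.T @ Mo @ T, T.T @ Ko @ T  # reduced mass and stiffness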

  3. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that differ for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats suitable for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first is an HTML utility, developed in Visual Basic, to enhance analysis for plume modeling and to offer a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure, upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the link budgets required to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
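
    A minimal version of the calculation such tools automate is shown below; the frequency, range, and asset parameters are placeholders rather than Wallops values, and a real analysis adds antenna patterns, pointing, polarization, and atmospheric terms.

      import math

      def link_margin_db(f_hz, d_m, eirp_dbw, g_rx_dbi, t_sys_k, rate_bps,
                         losses_db=2.0, ebn0_req_db=9.6):
          fspl = 20 * math.log10(4 * math.pi * d_m * f_hz / 3.0e8)  # free-space loss
          cn0 = (eirp_dbw + g_rx_dbi - fspl - losses_db
                 - 10 * math.log10(1.380649e-23 * t_sys_k))         # C/N0, dB-Hz
          ebn0 = cn0 - 10 * math.log10(rate_bps)
          return ebn0 - ebn0_req_db                                 # margin, dB

      print(link_margin_db(f_hz=2.25e9, d_m=400e3, eirp_dbw=10.0,
                           g_rx_dbi=35.0, t_sys_k=250.0, rate_bps=1e6))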

  4. Multi-diversity combining and selection for relay-assisted mixed RF/FSO system

    NASA Astrophysics Data System (ADS)

    Chen, Li; Wang, Weidong

    2017-12-01

    We propose and analyze multi-diversity combining and selection to enhance the performance of a relay-assisted mixed radio frequency/free-space optics (RF/FSO) system. We focus on a practical cellular network scenario where a single-antenna source communicates with a multi-aperture destination through a relay equipped with multiple receive antennas and multiple transmit apertures. The RF single-input multiple-output (SIMO) links employ either maximal-ratio combining (MRC) or receive antenna selection (RAS), and the FSO multiple-input multiple-output (MIMO) links adopt either repetition coding (RC) or transmit laser selection (TLS). The performance is evaluated via an outage probability analysis over Rayleigh fading RF links and Gamma-Gamma atmospheric turbulence FSO links with pointing errors, where a channel state information (CSI) assisted amplify-and-forward (AF) scheme is considered. Asymptotic closed-form expressions at high signal-to-noise ratio (SNR) are also derived. Coding gain and diversity order for the different combining and selection schemes are further discussed. Numerical results are provided to verify and illustrate the analytical results.
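
    The diversity intuition on the RF hop can be checked with a few lines of Monte Carlo. This isolates the Rayleigh SIMO link only; it is not the paper's full mixed RF/FSO analysis with Gamma-Gamma turbulence and pointing errors.

      import numpy as np

      rng = np.random.default_rng(1)
      n_rx, avg_snr, snr_th, trials = 4, 10.0, 4.0, 1_000_000

      g = rng.exponential(avg_snr, size=(trials, n_rx))  # branch SNRs under Rayleigh
      out_mrc = np.mean(g.sum(axis=1) < snr_th)          # MRC adds branch SNRs
      out_ras = np.mean(g.max(axis=1) < snr_th)          # RAS keeps the best branch
      print(f"outage  MRC: {out_mrc:.2e}   RAS: {out_ras:.2e}")
      # Both schemes reach diversity order n_rx; MRC retains an extra coding gain.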

  5. A Brief Survey of Media Access Control, Data Link Layer, and Protocol Technologies for Lunar Surface Communications

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.

    2009-01-01

    This paper surveys and describes some of the existing media access control and data link layer technologies for possible application in lunar surface communications and in the advanced wideband Direct Sequence Code Division Multiple Access (DSCDMA) conceptual systems utilizing phased-array technology that will evolve in the next decade. Time Division Multiple Access (TDMA) and Code Division Multiple Access (CDMA) are standard Media Access Control (MAC) techniques that can be incorporated into lunar surface communications architectures. Another novel hybrid technique recently being developed for use with smart antenna technology combines the advantages of CDMA with those of TDMA. The relatively new and varied wireless LAN data link layer protocols that are continually under development offer distinct advantages for lunar surface applications over the legacy protocols, which are not wireless. In addition, several communication transport and routing protocols can be chosen with characteristics commensurate with smart antenna systems to provide high-capacity spacecraft communication links on the surface of the Moon. The proper choices depend on the specific communication requirements.

  6. Enhanced anti-counterfeiting measures for additive manufacturing: coupling lanthanide nanomaterial chemical signatures with blockchain technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Zachary C.; Stephenson, David E.; Christ, Josef F.

    The significant rise of additive manufacturing (AM) in recent years is in part due to the open-sourced nature of the printing processes and the reduced cost and capital barriers relative to traditional manufacturing. However, this democratization of manufacturing spurs an increased demand for producers and end-users to verify the authenticity and quality of individual parts. To this end, we introduce an anti-counterfeiting method composed of first embedding engineered nanomaterials into features of a 3D-printed part, followed by non-destructive interrogation of these features to quantify a chemical signature profile. The part-specific chemical signature data is then linked to a securitized, distributed, and time-stamped blockchain ledger entry. To demonstrate the utility of this approach, lanthanide-aspartic acid nanoscale coordination polymer (Ln3+-Asp NC)/poly(lactic acid) (PLA) composites were formulated and transformed into a filament feedstock for fused deposition modeling (FDM) 3D printing. In the present case, a quick-response (QR) code containing the doped Ln3+-Asp NCs was printed into pure PLA parts using a dual-extruder FDM printer. The QR code provides a searchable reference to an Ethereum-based blockchain entry. The QR code's physical features also serve as defined areas to probe the signatures arising from the embedded Ln3+-Asp NCs. Visible fluorescence emission under UV excitation was quantified in terms of color using a smartphone camera and incorporated into blockchain entries. Ultimately, linking unique chemical signature data to blockchain databases is anticipated to make the costs of counterfeiting AM materials significantly more prohibitive and transactions between those in the supply chain more trustworthy.
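
    The linkage step can be sketched as follows; the field names and the flat "ledger entry" dictionary are stand-ins for illustration, not the actual Ethereum transaction format used in the work.

      import hashlib, json, time

      signature = {"part_id": "AM-0042", "rgb_uv": [0.31, 0.12, 0.57]}  # measured colors
      digest = hashlib.sha256(
          json.dumps(signature, sort_keys=True).encode()).hexdigest()

      ledger_entry = {                       # what the QR code would point back to
          "part_id": signature["part_id"],
          "signature_sha256": digest,
          "timestamp": int(time.time()),
      }
      print(ledger_entry)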

  7. Expanding the Andersen Model: The Role of Psychosocial Factors in Long-Term Care Use

    PubMed Central

    Bradley, Elizabeth H; McGraw, Sarah A; Curry, Leslie; Buckser, Alison; King, Kinda L; Kasl, Stanislav V; Andersen, Ronald

    2002-01-01

    Objective To examine a prevailing conceptual model of health services use (Andersen 1995) and to suggest modifications that may enhance its explanatory power when applied to empirical studies of race/ethnicity and long-term care. Study Setting Twelve focus groups of African-American (five groups) and white (seven groups) individuals, aged 65 and older, residing in Connecticut during 2000. Study Design Using qualitative analysis, data were coded and analyzed in NUD-IST 4 software to facilitate the reporting of recurrent themes, supporting quotations, and links among the themes for developing the conceptual framework. Specific analysis was conducted to assess distinctions in common themes between African-American and white focus groups. Data Collection Data were collected using a standardized discussion guide, augmented by prompts for clarification. Audio taped sessions were transcribed and independently coded by investigators and crosschecked to enhance coding validity. An audit trail was maintained to document analytic decisions during data analysis and interpretation. Principal Findings Psychosocial factors (e.g., attitudes and knowledge, social norms, and perceived control) are identified as determinants of service use, thereby expanding the Andersen model (1995). African-American and white focus group members differed in their reported accessibility of information about long-term care, social norms concerning caregiving expectations and burden, and concerns of privacy and self-determination. Conclusions More comprehensive identification of psychosocial factors may enhance our understanding of the complex role of race/ethnicity in long-term care use as well as the effectiveness of policies and programs designed to address disparities in long-term care service use among minority and nonminority groups. PMID:12479494

  8. Expanding the Andersen model: the role of psychosocial factors in long-term care use.

    PubMed

    Bradley, Elizabeth H; McGraw, Sarah A; Curry, Leslie; Buckser, Alison; King, Kinda L; Kasl, Stanislav V; Andersen, Ronald

    2002-10-01

    To examine a prevailing conceptual model of health services use (Andersen 1995) and to suggest modifications that may enhance its explanatory power when applied to empirical studies of race/ethnicity and long-term care. Twelve focus groups of African-American (five groups) and white (seven groups) individuals, aged 65 and older, residing in Connecticut during 2000. Using qualitative analysis, data were coded and analyzed in NUD-IST 4 software to facilitate the reporting of recurrent themes, supporting quotations, and links among the themes for developing the conceptual framework. Specific analysis was conducted to assess distinctions in common themes between African-American and white focus groups. Data were collected using a standardized discussion guide, augmented by prompts for clarification. Audio taped sessions were transcribed and independently coded by investigators and crosschecked to enhance coding validity. An audit trail was maintained to document analytic decisions during data analysis and interpretation. Psychosocial factors (e.g., attitudes and knowledge, social norms, and perceived control) are identified as determinants of service use, thereby expanding the Andersen model (1995). African-American and white focus group members differed in their reported accessibility of information about long-term care, social norms concerning caregiving expectations and burden, and concerns of privacy and self-determination. More comprehensive identification of psychosocial factors may enhance our understanding of the complex role of race/ethnicity in long-term care use as well as the effectiveness of policies and programs designed to address disparities in long-term care service use among minority and nonminority groups.

  9. Conflict negotiation and autonomy processes in adolescent romantic relationships: an observational study of interdependency in boyfriend and girlfriend effects.

    PubMed

    McIsaac, Caroline; Connolly, Jennifer; McKenney, Katherine S; Pepler, Debra; Craig, Wendy

    2008-12-01

    This study examined the association between conflict negotiation and the expression of autonomy in adolescent romantic partners. Thirty-seven couples participated in a globally coded conflict interaction task. Actor-partner interdependence models (APIM) were used to quantify the extent to which boys' and girls' autonomy was linked solely to their own negotiation of the conflict or whether it was linked conjointly to their own and their partners' negotiation style. Combining agentic autonomy theories and peer socialization models, it was expected that boys' and girls' autonomy would be associated only with their own conflict behaviors when they employed conflict styles reflective of their same gender repertoire, and associated conjointly with self and partner behaviors when they employed gender-atypical conflict styles. Instead of an equal, albeit distinct, positioning in the autonomy dynamic, the results suggested that girls' autonomy is associated solely with their own conflict behaviors, whereas boys' autonomy is jointly associated with their own and their partners' conflict behaviors. We discuss the relative power of boys and girls in emergent dyadic contexts, emphasizing how romantic dynamics shape salient abilities.

  10. SRGULL - AN ADVANCED ENGINEERING MODEL FOR THE PREDICTION OF AIRFRAME INTEGRATED SCRAMJET CYCLE PERFORMANCE

    NASA Technical Reports Server (NTRS)

    Walton, J. T.

    1994-01-01

    The development of a single-stage-to-orbit aerospace vehicle intended to be launched horizontally into low Earth orbit, such as the National Aero-Space Plane (NASP), has concentrated on the use of the supersonic combustion ramjet (scramjet) propulsion cycle. SRGULL, a scramjet cycle analysis code, is an engineer's tool capable of nose-to-tail, hydrogen-fueled, airframe-integrated scramjet simulation in a real gas flow with equilibrium thermodynamic properties. This program facilitates initial estimates of scramjet cycle performance by linking a two-dimensional forebody, inlet, and nozzle code with a one-dimensional combustor code. Five computer codes (SCRAM, SEAGUL, INLET, Program HUD, and GASH) originally developed at NASA Langley Research Center in support of hypersonic technology are integrated in this program to analyze changing flow conditions. The one-dimensional combustor code is based on the combustor subroutine from SCRAM, and the two-dimensional coding is based on an inviscid Euler program (SEAGUL). Kinetic energy efficiency input for sidewall area variation modeling can be calculated by the INLET program code. At the completion of inviscid component analysis, Program HUD, an integral boundary layer code based on the Spalding-Chi method, is applied to determine the friction coefficient, which is then used in a modified Reynolds analogy to calculate heat transfer. Real gas flow properties such as flow composition, enthalpy, entropy, and density are calculated by the subroutine GASH. Combustor input conditions are obtained by one-dimensionalizing the two-dimensional inlet exit flow. The SEAGUL portions of this program are limited to supersonic flows, but the combustor (SCRAM) section can handle supersonic and dual-mode operation. SRGULL has been compared to scramjet engine tests with excellent results. SRGULL was written in FORTRAN 77 on an IBM PC compatible using IBM's FORTRAN/2 or Microway's NDP386 F77 compiler. The program is fully user interactive but can also run in batch mode. It operates under the UNIX, VMS, NOS, and DOS operating systems. The source code is not directly compatible with all PC compilers (e.g., Lahey or Microsoft FORTRAN) due to block and segment size requirements. SRGULL executable code requires about 490K RAM and a math coprocessor on PCs. The SRGULL program was developed in 1989, although the component programs originated in the 1960s and 1970s. IBM, IBM PC, and DOS are registered trademarks of International Business Machines. VMS is a registered trademark of Digital Equipment Corporation. UNIX is a registered trademark of Bell Laboratories. NOS is a registered trademark of Control Data Corporation.
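
    The friction-to-heat-transfer step can be written down directly. The sketch below uses the common Colburn form of the modified Reynolds analogy; the summary does not spell out SRGULL's exact modification, so this form is an assumption.

      def stanton_from_cf(cf: float, pr: float = 0.71) -> float:
          """Stanton number from skin-friction coefficient (Colburn analogy)."""
          return 0.5 * cf / pr ** (2.0 / 3.0)

      cf = 2.2e-3                      # e.g. from an integral boundary layer code
      print(f"St = {stanton_from_cf(cf):.2e}")
      # Wall heat flux then scales as St * rho * u * (adiabatic-wall minus wall enthalpy).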

  11. PYFLOW 2.0. A new open-source software for quantifying the impact and depositional properties of dilute pyroclastic density currents

    NASA Astrophysics Data System (ADS)

    Dioguardi, Fabio; Dellino, Pierfrancesco

    2017-04-01

    Dilute pyroclastic density currents (DPDCs) are ground-hugging turbulent gas-particle flows that move down volcano slopes under the combined action of density contrast and gravity. DPDCs are dangerous to human lives and infrastructure both because they exert a dynamic pressure in their direction of motion and because they transport volcanic ash particles, which remain in the atmosphere during the waning stage and after the passage of a DPDC. Deposits formed by the passage of a DPDC show peculiar characteristics that can be linked to flow field variables with sedimentological models. Here we present PYFLOW_2.0, a significantly improved version of the code of Dioguardi and Dellino (2014) that has already been used extensively for the hazard assessment of DPDCs at Campi Flegrei and Vesuvius (Italy). In the new version, the code structure, computation times, and data input method have been updated and improved. A set of shape-dependent drag laws has been implemented so as to better estimate the aerodynamic drag of particles transported and deposited by the flow. A depositional model for calculating the deposition time and rate of the ash and lapilli layer formed by the pyroclastic flow has also been included. This model links deposit characteristics (e.g. componentry, grain size) to flow characteristics (e.g. flow average density and shear velocity), the latter either calculated by the code itself or given as input by the user. The deposition rate is calculated by summing the contributions of each grain-size class of all components constituting the deposit (e.g. juvenile particles, crystals, etc.), which are in turn computed as a function of particle density, terminal velocity, concentration, and deposition probability. Here we apply the concept of deposition probability, previously introduced for estimating the deposition rates of turbidity currents (Stow and Bowen, 1980), to DPDCs, although with a different approach, i.e. starting from what is observed in the deposit (e.g. the weight-fraction ratios between the different grain-size classes). In this way, more realistic estimates of the deposition rate can be obtained, as the deposition probabilities of the different grain-size classes constituting the DPDC deposit can differ and need not equal unity. As experimental validation, deposition rates of large-scale experiments, previously computed with different methods, have been calculated. Results of model application to DPDCs and turbidity currents will also be presented. Dioguardi, F., and P. Dellino (2014), PYFLOW: A computer code for the calculation of the impact parameters of Dilute Pyroclastic Density Currents (DPDC) based on field data, Comput. Geosci., 66, 200-210, doi:10.1016/j.cageo.2014.01.013. Stow, D. A. V., and A. J. Bowen (1980), A physical model for the transport and sorting of fine-grained sediment by turbidity currents, Sedimentology, 27, 31-46.
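
    The deposition-rate bookkeeping described above reduces to a sum over grain-size classes. The sketch below paraphrases that structure with invented numbers; PYFLOW itself derives terminal velocities from the shape-dependent drag laws and deposition probabilities from the deposit.

      def deposition_rate(classes):
          """Sum of density * volumetric concentration * terminal velocity * P(dep)."""
          return sum(c["rho"] * c["conc"] * c["w_t"] * c["p_dep"] for c in classes)

      classes = [  # hypothetical grain-size classes of one component
          {"rho": 2500.0, "conc": 1.0e-6, "w_t": 3.2,  "p_dep": 1.0},   # lapilli
          {"rho": 2500.0, "conc": 2.0e-6, "w_t": 0.6,  "p_dep": 0.55},  # coarse ash
          {"rho": 1500.0, "conc": 3.0e-6, "w_t": 0.05, "p_dep": 0.10},  # fine ash
      ]
      print(f"{deposition_rate(classes):.2e} kg m^-2 s^-1")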

  12. Organization and annotation of the Xcat critical region: elimination of seven positional candidate genes.

    PubMed

    Huang, Kristen M; Geunes-Boyer, Scarlett; Wu, Sufen; Dutra, Amalia; Favor, Jack; Stambolian, Dwight

    2004-05-01

    Xcat mice display X-linked congenital cataracts and are a mouse model for the human X-linked cataract disease Nance Horan syndrome (NHS). The genetic defect in Xcat mice and NHS patients is not known. We isolated and sequenced a BAC contig representing a portion of the Xcat critical region. We combined our sequencing data with the most recent mouse sequence assemblies from both Celera and public databases. The sequence of the 2.2-Mb Xcat critical region was then analyzed for potential Xcat candidate genes. The coding regions of the seven known genes within this area (Rai2, Rbbp7, Ctps2, Calb3, Grpr, Reps2, and Syap1) were sequenced in Xcat mice and no mutations were detected. The expression of Rai2 was quantitatively identical in wild-type and Xcat mutant eyes. These results indicate that the Xcat mutation is within a novel, undiscovered gene.

  13. EarthCollab, building geoscience-centric implementations of the VIVO semantic software suite

    NASA Astrophysics Data System (ADS)

    Rowan, L. R.; Gross, M. B.; Mayernik, M. S.; Daniels, M. D.; Krafft, D. B.; Kahn, H. J.; Allison, J.; Snyder, C. B.; Johns, E. M.; Stott, D.

    2017-12-01

    EarthCollab, an EarthCube Building Block project, is extending an existing open-source semantic web application, VIVO, to enable the exchange of information about scientific researchers and resources across institutions. EarthCollab is a collaboration between UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy; The Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory; and Cornell University. VIVO has been implemented by more than 100 universities and research institutions to highlight research and institutional achievements. This presentation will discuss the benefits and drawbacks of working with and extending open-source software. Extensions to date include plotting georeferenced objects on a map, a mobile-friendly theme, integration of faceting via Elasticsearch, extending the VIVO ontology to capture geoscience-centric objects and relationships, and the ability to cross-link between VIVO instances. Most implementations of VIVO gather information about a single organization. The EarthCollab project created VIVO extensions to enable cross-linking of VIVO instances, both to reduce the amount of duplicate information about the same people and scientific resources and to enable dynamic linking of related information across VIVO installations. As the list of customizations grows, so does the effort required to maintain compatibility between the EarthCollab forks and the main VIVO code. For example, dozens of libraries and dependencies were updated prior to the VIVO v1.10 release, which introduced conflicts in the EarthCollab cross-linking code. The cross-linking code has, however, been developed to enable sharing of data across different versions of VIVO by using a JSON output schema standardized across versions. We will outline lessons learned in working with VIVO and its open-source dependencies, which include Jena, Solr, Freemarker, and jQuery, and discuss future work by EarthCollab, which includes refining the cross-linking capabilities through continued integration of persistent and unique identifiers to enable automated lookup and matching across institutional VIVOs.
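
    As a guess at the shape of such a version-agnostic record (the abstract does not show the actual schema), a cross-linkable person entry needs little more than a persistent identifier plus the fields to match on:

      import json

      record = {
          "type": "person",
          "identifiers": {"orcid": "0000-0000-0000-0000"},  # placeholder ORCID
          "name": {"family": "Doe", "given": "Jane"},
          "affiliation": "Example Consortium",
          "source_vivo": "https://vivo.example.edu",        # originating instance
      }
      print(json.dumps(record, indent=2))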

  14. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    PubMed

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization-multiplexed 16-quadrature-amplitude-modulation (PM-16QAM) transmission over a 100 km fiber link, enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back, using control plane logic and messaging, to the transmitter side for code adaptation, where the binary data are adaptively encoded with one of three large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.
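
    The control-plane decision itself is a simple threshold policy; the OSNR switch points below are invented for illustration and are not the measured values from the experiment.

      RATES = [(18.0, 0.80), (16.0, 0.75), (0.0, 0.70)]  # (min OSNR dB, LDPC rate)

      def pick_code_rate(osnr_db: float) -> float:
          """Highest-throughput LDPC rate the measured OSNR supports."""
          for threshold, rate in RATES:
              if osnr_db >= threshold:
                  return rate
          return RATES[-1][1]

      for osnr in (20.1, 17.2, 14.8):
          print(osnr, "->", pick_code_rate(osnr))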

  15. Highly efficient codec based on significance-linked connected-component analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Chai, Bing-Bing; Vass, Jozsef; Zhuang, Xinhua

    1997-04-01

    Recent success in wavelet coding is mainly attributed to the recognition of the importance of data organization. Several very competitive wavelet codecs have been developed, namely, Shapiro's Embedded Zerotree Wavelets (EZW), Servetto et al.'s Morphological Representation of Wavelet Data (MRWD), and Said and Pearlman's Set Partitioning in Hierarchical Trees (SPIHT). In this paper, we propose a new image compression algorithm called Significance-Linked Connected Component Analysis (SLCCA) of wavelet coefficients. SLCCA exploits both within-subband clustering of significant coefficients and cross-subband dependency in significant fields. A so-called significance link between connected components is designed to reduce the positional overhead of MRWD. In addition, the significant coefficients' magnitudes are encoded in bit-plane order to match the probability model of the adaptive arithmetic coder. Experiments show that SLCCA outperforms both EZW and MRWD, and is on par with SPIHT. Furthermore, SLCCA generally performs best on images with a large proportion of texture. When applied to fingerprint image compression, it outperforms the FBI's wavelet scalar quantization by about 1 dB.
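
    Bit-plane ordering, mentioned above, simply emits magnitude bits from the most significant plane downward, as in this toy sketch (the significance links and the arithmetic coder's context modeling are omitted):

      import numpy as np

      mags = np.array([9, 3, 12, 1, 6])          # magnitudes of significant coefficients
      for p in range(int(mags.max()).bit_length() - 1, -1, -1):
          plane = (mags >> p) & 1                # one bit per coefficient
          print(f"plane {p}:", plane.tolist())   # these streams feed the entropy coder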

  16. Children thinking mathematically beyond authoritative identities

    NASA Astrophysics Data System (ADS)

    MacMillan, Agnes

    1995-10-01

    A study into the mathematics-related interactions and developing attitudes of young children during the transition period between pre-school and school is reported. Transcripts of interactions during a six-week observation period in one of two preschool sites are coded according to the classifications defined within a theoretical framework. Two separate episodes of construction play were analysed and one of these is used to examine the mathematical nature of the children's interactions within an emerging model of autonomous learning. The results of the analysis indicate that access to self-regulatory social relations is very closely linked to the accessibility of mathematical meanings.

  17. Concurrent design of an RTP chamber and advanced control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spence, P.; Schaper, C.; Kermani, A.

    1995-12-31

    A concurrent-engineering approach is applied to the development of an axisymmetric rapid-thermal-processing (RTP) reactor and its associated temperature controller. Using a detailed finite-element thermal model as a surrogate for actual hardware, the authors have developed and tested a multi-input multi-output (MIMO) controller. Closed-loop simulations are performed by linking the control algorithm with the finite-element code. Simulations show that good temperature uniformity is maintained on the wafer during both steady and transient conditions. A numerical study shows the effect of ramp rate, feedback gain, sensor placement, and wafer-emissivity patterns on system performance.
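
    Schematically, the closed-loop simulation is a controller stepped against a plant model standing in for hardware, as in the sketch below. The two-zone linear thermal surrogate and the gains are invented; the actual work closed the loop around a detailed finite-element model instead.

      import numpy as np

      A = np.array([[-0.10, 0.02], [0.03, -0.08]])  # zone-coupled cooling dynamics
      B = np.array([[0.05, 0.01], [0.01, 0.04]])    # lamp banks -> zone heating
      T = np.zeros(2)                               # zone temperatures above ambient
      setpoint = np.array([700.0, 700.0])
      u = np.zeros(2)
      ki, dt = 0.02, 0.1

      for k in range(20000):
          err = setpoint - T
          u = np.clip(u + ki * dt * err, 0.0, 2000.0)  # integral action, actuator limits
          T = T + dt * (A @ T + B @ u)                 # surrogate model step

      print("final zone temperatures:", T)             # approaches the setpoint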

  18. A neural network model of semantic memory linking feature-based object representation and words.

    PubMed

    Cuppini, C; Magosso, E; Ursino, M

    2009-06-01

    Recent theories in cognitive neuroscience suggest that semantic memory is a distributed process, which involves many cortical areas and is based on a multimodal representation of objects. The aim of this work is to extend a previous model of object representation to realize a semantic memory, in which sensory-motor representations of objects are linked with words. The model assumes that each object is described as a collection of features, coded in different cortical areas via a topological organization. Features in different objects are segmented via gamma-band synchronization of neural oscillators. The feature areas are further connected with a lexical area, devoted to the representation of words. Synapses among the feature areas, and among the lexical area and the feature areas are trained via a time-dependent Hebbian rule, during a period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from acoustic inputs), can correctly associate objects with words and segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories, and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).
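
    A toy version of the training and retrieval loop described above is sketched here (sizes, learning rate, and sparsity are arbitrary; the oscillator dynamics and gamma-band segmentation of the full model are omitted):

      import numpy as np

      rng = np.random.default_rng(0)
      n_feat, n_lex, eta = 40, 10, 0.05
      patterns = (rng.random((n_lex, n_feat)) < 0.2).astype(float)  # object features
      W = np.zeros((n_lex, n_feat))              # feature -> lexical synapses

      for step in range(200):                    # paired object + word presentations
          obj = rng.integers(n_lex)
          pre, post = patterns[obj], np.eye(n_lex)[obj]
          W = np.clip(W + eta * np.outer(post, pre), 0.0, 1.0)  # Hebbian step

      cue = patterns[3] * (rng.random(n_feat) < 0.5)     # incomplete object input
      print("retrieved word:", int(np.argmax(W @ cue)))  # usually 3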

  19. Electronic health record analysis via deep poisson factor models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized by Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales well, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  20. Electronic health record analysis via deep poisson factor models

    DOE PAGES

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.; ...

    2016-01-01

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized by Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales well, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.
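
    The Bernoulli-Poisson link at the heart of the model is easy to state: a binary unit is active exactly when a latent Poisson count is nonzero, so P(h = 1 | lam) = 1 - exp(-lam). The quick numerical check below is ours, not the authors' code.

      import numpy as np

      rng = np.random.default_rng(0)
      lam = np.array([0.1, 0.5, 1.0, 3.0])             # nonnegative rates

      counts = rng.poisson(lam, size=(100_000, lam.size))
      print("analytic :", 1.0 - np.exp(-lam))          # Bernoulli-Poisson link
      print("empirical:", (counts > 0).mean(axis=0))   # matches 1 - exp(-lam)
      print("logistic :", 1.0 / (1.0 + np.exp(-lam)))  # the traditional alternative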

  1. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code... partially accepted, then the properties eligible for HUD benefits in that jurisdiction shall be constructed..., those portions of one of the model codes with which the property must comply. Schedule for Model Code...

  2. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code... partially accepted, then the properties eligible for HUD benefits in that jurisdiction shall be constructed..., those portions of one of the model codes with which the property must comply. Schedule for Model Code...

  3. IsoWeb: A Bayesian Isotope Mixing Model for Diet Analysis of the Whole Food Web

    PubMed Central

    Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku

    2012-01-01

    Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb. PMID:22848427
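
    The one-isotope, two-source case that such mixing models generalize can be solved by hand, which helps fix ideas (the numbers below are illustrative):

      d_consumer, tef = -24.0, 1.0   # consumer d13C and trophic enrichment factor
      d_src1, d_src2 = -28.0, -20.0  # resource signatures (per mil)

      # consumer = p*src1 + (1-p)*src2 + tef  =>  solve for diet proportion p
      p = (d_consumer - tef - d_src2) / (d_src1 - d_src2)
      print(f"diet proportion of source 1: {p:.3f}")   # 0.625 here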

  4. HowTo - Easy use of global unique identifier

    NASA Astrophysics Data System (ADS)

    Czerniak, A.; Fleischer, D.; Schirnick, C.

    2013-12-01

    The GEOMAR sample and core repository holds several thousand samples and cores collected over recent decades. In the current project, we are bringing this collection up to the new generation by tagging every sample and core with a unique identifier, in our case the International Geo Sample Number (IGSN). This work is done with our digital ink and handwriting recognition implementation. The Smart Pen technology saves time and resources when recording the information on every sample or core. The recording procedure involves several systematic steps: 1. gathering all information about the core or sample, such as cruise number and responsible person; 2. tagging with unique identifiers, in our case a QR code; 3. writing down the location of the sample or core. After the information is transmitted from the Smart Pen (currently via USB, although wireless is also an option) into our server infrastructure, linking to other information begins. Once linked in our Virtual Research Environment (VRE) through the unique identifier (IGSN), a sample or core can be located, and the QR code simply links back from the core or sample to the IGSN and to additional scientific information.

  5. An Open Software Platform for Sharing Water Resource Models, Code and Data

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
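
    A minimal node-link network of the kind such a platform passes between Apps might look as follows, with a toy export App writing the links to CSV. The dictionary schema is illustrative, not Hydra's actual data model.

      import csv

      network = {
          "name": "demo-basin",
          "nodes": [
              {"name": "reservoir", "type": "storage", "x": 0.0, "y": 0.0},
              {"name": "city",      "type": "demand",  "x": 5.0, "y": 1.0},
          ],
          "links": [
              {"name": "main-canal", "from": "reservoir", "to": "city", "capacity": 40.0},
          ],
      }

      with open("links.csv", "w", newline="") as f:  # a minimal export "App"
          w = csv.DictWriter(f, fieldnames=["name", "from", "to", "capacity"])
          w.writeheader()
          w.writerows(network["links"])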

  6. Phobos lander coding system: Software and analysis

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.; Pollara, F.

    1988-01-01

    The software developed for the decoding system used in the telemetry link of the Phobos Lander mission is described. Encoders and decoders are provided to cover the three possible telemetry configurations. The software can be used to decode actual data or to simulate the performance of the telemetry system. The theoretical properties of the codes chosen for this mission are analyzed and discussed.

  7. 75 FR 57278 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... Title 44, United States Code, as amended by the Paperwork Reduction Act of 1995, Pub. L. 104-13), the... encrypted before transfer to each client record, HRSA is able to link data for clients across Ryan White HIV..., Division of Policy and Information Coordination. [FR Doc. 2010-23416 Filed 9-17-10; 8:45 am] BILLING CODE...

  8. Whose Code of Conduct Matters Most? Examining the Link between Academic Integrity and Student Development

    ERIC Educational Resources Information Center

    Biswas, Ann E.

    2013-01-01

    Although most colleges strive to nurture a culture of integrity, incidents of dishonest behavior are on the rise. This article examines the role student development plays in students' perceptions of academic dishonesty and in their willingness to adhere to a code of conduct that may be in sharp contrast to traditional integrity policies.

  9. Apollo 16/AS-511/LM-11 operational calibration curves. Volume 1: Calibration curves for command service module CSM 113

    NASA Technical Reports Server (NTRS)

    Demoss, J. F. (Compiler)

    1971-01-01

    Calibration curves for the Apollo 16 command service module pulse code modulation downlink and onboard display are presented. Subjects discussed are: (1) measurement calibration curve format, (2) measurement identification, (3) multi-mode calibration data summary, (4) pulse code modulation bilevel events listing, and (5) calibration curves for instrumentation downlink and meter link.

  10. L1 and L2 Picture Naming in Mandarin-English Bilinguals: A Test of Bilingual Dual Coding Theory

    ERIC Educational Resources Information Center

    Jared, Debra; Poh, Rebecca Pei Yun; Paivio, Allan

    2013-01-01

    This study examined the nature of bilinguals' conceptual representations and the links from these representations to words in L1 and L2. Specifically, we tested an assumption of the Bilingual Dual Coding Theory that conceptual representations include image representations, and that learning two languages in separate contexts can result in…

  11. Advanced communications payload for mobile applications

    NASA Technical Reports Server (NTRS)

    Ames, S. A.; Kwan, R. K.

    1990-01-01

    An advanced satellite payload is proposed for single-hop linking of mobile terminals of all classes as well as Very Small Aperture Terminals (VSATs). It relies on intensive use of on-board communications processing and beam hopping for efficient link design to maximize capacity, and on a large satellite antenna aperture and high satellite transmitter power to minimize the cost of the ground terminals. Intersatellite links are used to improve link quality and for high-capacity relay. Power budgets are presented for links between the satellite and mobile, VSAT, and hub terminals. Defeating the effects of shadowing and fading requires the use of differentially coherent demodulation, concatenated forward error correction coding, and interleaving, all on a single-link basis.

  12. Adaptation and perceptual norms

    NASA Astrophysics Data System (ADS)

    Webster, Michael A.; Yasuda, Maiko; Haber, Sara; Leonard, Deanne; Ballardini, Nicole

    2007-02-01

    We used adaptation to examine the relationship between perceptual norms--the stimuli observers describe as psychologically neutral, and response norms--the stimulus levels that leave visual sensitivity in a neutral or balanced state. Adapting to stimuli on opposite sides of a neutral point (e.g. redder or greener than white) biases appearance in opposite ways. Thus the adapting stimulus can be titrated to find the unique adapting level that does not bias appearance. We compared these response norms to subjectively defined neutral points both within the same observer (at different retinal eccentricities) and between observers. These comparisons were made for visual judgments of color, image focus, and human faces, stimuli that are very different and may depend on very different levels of processing, yet which share the property that for each there is a well defined and perceptually salient norm. In each case the adaptation aftereffects were consistent with an underlying sensitivity basis for the perceptual norm. Specifically, response norms were similar to and thus covaried with the perceptual norm, and under common adaptation differences between subjectively defined norms were reduced. These results are consistent with models of norm-based codes and suggest that these codes underlie an important link between visual coding and visual experience.

  13. Information Object Definition–based Unified Modeling Language Representation of DICOM Structured Reporting

    PubMed Central

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804

  14. Theoretical and computer models of detonation in solid explosives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarver, C.M.; Urtiew, P.A.

    1997-10-01

    Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Doring (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two- and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.

  15. Rigorous study of low-complexity adaptive space-time block-coded MIMO receivers in high-speed mode multiplexed fiber-optic transmission links using few-mode fibers

    NASA Astrophysics Data System (ADS)

    Weng, Yi; He, Xuan; Wang, Junyi; Pan, Zhongqi

    2017-01-01

    Spatial-division multiplexing (SDM) techniques have been proposed to increase the capacity of optical fiber transmission links by utilizing multicore fibers or few-mode fibers (FMF). The most challenging impairments of SDM-based long-haul optical links are modal dispersion and mode-dependent loss (MDL); MDL arises from inline component imperfections and breaks modal orthogonality, degrading the capacity of multiple-input multiple-output (MIMO) receivers. Optical approaches to reducing MDL include mode scramblers and specialty fiber designs, but these methods carry high cost and cannot completely remove the MDL accumulated in the link. Space-time trellis codes (STTC) have also been proposed to lessen MDL, but suffer from high complexity. In this work, we investigated the performance of a space-time block-coding (STBC) scheme to mitigate MDL in SDM-based optical communication by exploiting space and delay diversity, where the weight matrices of the frequency-domain equalization (FDE) were updated heuristically using a decision-directed recursive-least-squares (RLS) algorithm for convergence and channel estimation. The STBC was evaluated in a six-mode multiplexed system over 30-km FMF via 6×6 MIMO FDE, with modal gain offset 3 dB, core refractive index 1.49, and numerical aperture 0.5. Results show that the optical-signal-to-noise-ratio (OSNR) tolerance can be improved via STBC by approximately 3.1, 4.9, and 7.8 dB for QPSK, 16- and 64-QAM at their respective bit-error rates (BER) with minimum-mean-square-error (MMSE) detection. We also evaluate the complexity optimization of the STBC decoding scheme with a zero-forcing decision-feedback (ZFDF) equalizer by shortening the coding slot length, which is robust to frequency-selective fading channels and can be scaled up for SDM systems with more dynamic channels.
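
    As a generic illustration of the space-time block-coding idea the paper builds on (not its 6x6 mode-multiplexed scheme), the classic two-branch Alamouti encoder looks like this in NumPy; the QPSK constellation is just an example:

        import numpy as np

        def alamouti_encode(symbols):
            """Map symbol pairs (s1, s2) onto two transmitters over two slots:
            slot 1 sends (s1, s2); slot 2 sends (-conj(s2), conj(s1))."""
            s = np.asarray(symbols).reshape(-1, 2)
            slot1 = s
            slot2 = np.column_stack([-np.conj(s[:, 1]), np.conj(s[:, 0])])
            return np.stack([slot1, slot2], axis=1)  # (pairs, time, tx)

        qpsk = np.exp(1j * np.pi / 4 * np.array([1, 3, 5, 7]))  # 4 symbols
        print(alamouti_encode(qpsk).shape)  # (2, 2, 2)

    The orthogonal structure is what provides the space (here: mode) and delay diversity exploited to average out mode-dependent loss.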

  16. Programming PHREEQC calculations with C++ and Python a comparative study

    USGS Publications Warehouse

    Charlton, Scott R.; Parkhurst, David L.; Muller, Mike

    2011-01-01

    The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
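
    In Python, the tight-coupling pattern described reduces to a few in-memory calls, sketched here with the third-party phreeqpy wrapper around IPhreeqc (method names follow phreeqpy's documented API; the database path and input script are placeholders):

        from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc

        phreeqc = IPhreeqc()                  # loads the IPhreeqc library
        phreeqc.load_database("phreeqc.dat")  # database path is an assumption
        phreeqc.run_string("""
        SOLUTION 1
            temp 25
            pH 7
            Na 10
            Cl 10
        SELECTED_OUTPUT
            -pH true
        """)
        # Results stay in memory -- no input/output files to write or parse.
        print(phreeqc.get_selected_output_array())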

  17. Application of Intel Many Integrated Core (MIC) architecture to the Yonsei University planetary boundary layer scheme in Weather Research and Forecasting model

    NASA Astrophysics Data System (ADS)

    Huang, Melin; Huang, Bormin; Huang, Allen H.

    2014-10-01

    The Weather Research and Forecasting (WRF) model provides operational services worldwide in many areas and is linked to our daily activities, in particular during severe weather events. The Yonsei University (YSU) scheme is one of the planetary boundary layer (PBL) schemes in WRF. The PBL scheme is responsible for vertical sub-grid-scale fluxes due to eddy transports in the whole atmospheric column; it determines the flux profiles within the well-mixed boundary layer and the stable layer, and thus provides atmospheric tendencies of temperature, moisture (including clouds), and horizontal momentum in the entire atmospheric column. The YSU scheme is well suited to massively parallel computation because there are no interactions among horizontal grid points. To accelerate the computation of the YSU scheme, we employ the Intel Many Integrated Core (MIC) architecture, a multiprocessor structure whose merits are efficient parallelization and vectorization. Our results show that the MIC-based optimization improved the performance of the first version of the multi-threaded code on a Xeon Phi 5110P by a factor of 2.4x. Furthermore, the same CPU-based optimizations improved the performance on an Intel Xeon E5-2603 by a factor of 1.6x compared to the first version of the multi-threaded code.
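
    The key property, that each horizontal column can be updated independently, is easy to picture with a toy driver (the per-column "physics" below is a placeholder, not the YSU scheme; on MIC hardware the parallelism comes from OpenMP threads and vectorization rather than Python threads):

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def pbl_tendency(column):
            # Stand-in for the per-column vertical mixing computation.
            return np.gradient(np.gradient(column))

        def step_all_columns(field, dt=0.1):
            # Columns are independent, so they can be processed in parallel.
            with ThreadPoolExecutor() as pool:
                tendencies = list(pool.map(pbl_tendency, field))
            return field + dt * np.array(tendencies)  # hypothetical time step

        field = np.random.rand(64, 40)        # 64 columns, 40 vertical levels
        print(step_all_columns(field).shape)  # (64, 40)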

  18. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    NASA Astrophysics Data System (ADS)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low-frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with a direct shear or double direct shear experimental setup, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study, consisting of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45 degrees is added to the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified strictly from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by Itasca Inc.

  19. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  20. Tobacco outlet density and converted versus native non-daily cigarette use in a national US sample

    PubMed Central

    Kirchner, Thomas R; Anesetti-Rothermel, Andrew; Bennett, Morgane; Gao, Hong; Carlos, Heather; Scheuermann, Taneisha S; Reitzel, Lorraine R; Ahluwalia, Jasjit S

    2017-01-01

    Objective Investigate whether non-daily smokers' (NDS) cigarette price and purchase preferences, recent cessation attempts, and current intentions to quit are associated with the density of the retail cigarette product landscape surrounding their residential address. Participants Cross-sectional assessment of N=904 converted NDS (CNDS), who previously smoked every day, and N=297 native NDS (NNDS), who only smoked non-daily, drawn from a national panel. Outcome measures Kernel density estimation was used to generate a nationwide probability surface of tobacco outlets linked to participants' residential ZIP code. Hierarchically nested log-linear models were compared to evaluate associations between outlet density, non-daily use patterns, price sensitivity and quit intentions. Results Overall, NDS in ZIP codes with greater outlet density were less likely than NDS in ZIP codes with lower outlet density to hold 6-month quit intentions when they also reported that price affected use patterns (G2=66.1, p<0.001) and purchase locations (G2=85.2, p<0.001). CNDS were more likely than NNDS to reside in ZIP codes with higher outlet density (G2=322.0, p<0.001). Compared with CNDS in ZIP codes with lower outlet density, CNDS in high-density ZIP codes were more likely to report that price influenced the amount they smoke (G2=43.9, p<0.001), and were more likely to look for better prices (G2=59.3, p<0.001). NDS residing in high-density ZIP codes were not more likely to report that price affected their cigarette brand choice compared with those in ZIP codes with lower density. Conclusions This paper provides initial evidence that the point-of-sale cigarette environment may be differentially associated with the maintenance of CNDS versus NNDS patterns. Future research should investigate how tobacco control efforts can be optimised to both promote cessation and curb the rising tide of non-daily smoking in the USA. PMID:26969172
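
    The outlet-density surface step can be sketched with SciPy's kernel density estimator (coordinates and evaluation points below are fabricated; the study's actual surface was built from real outlet locations nationwide):

        import numpy as np
        from scipy.stats import gaussian_kde

        outlets = np.random.rand(2, 500)  # hypothetical (x, y) outlet locations
        kde = gaussian_kde(outlets)       # bandwidth chosen by Scott's rule

        # Evaluate the density surface at ZIP-code centroids to link an
        # outlet-density value to each participant's residential ZIP.
        zip_centroids = np.array([[0.2, 0.8], [0.5, 0.5]]).T
        print(kde(zip_centroids))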

  1. Decoding the neural mechanisms of human tool use

    PubMed Central

    Gallivan, Jason P; McLean, D Adam; Valyear, Kenneth F; Culham, Jody C

    2013-01-01

    Sophisticated tool use is a defining characteristic of the primate species but how is it supported by the brain, particularly the human brain? Here we show, using functional MRI and pattern classification methods, that tool use is subserved by multiple distributed action-centred neural representations that are both shared with and distinct from those of the hand. In areas of frontoparietal cortex we found a common representation for planned hand- and tool-related actions. In contrast, in parietal and occipitotemporal regions implicated in hand actions and body perception we found that coding remained selectively linked to upcoming actions of the hand whereas in parietal and occipitotemporal regions implicated in tool-related processing the coding remained selectively linked to upcoming actions of the tool. The highly specialized and hierarchical nature of this coding suggests that hand- and tool-related actions are represented separately at earlier levels of sensorimotor processing before becoming integrated in frontoparietal cortex. DOI: http://dx.doi.org/10.7554/eLife.00425.001 PMID:23741616

  2. Analysis of automatic repeat request methods for deep-space downlinks

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Ekroot, L.

    1995-01-01

    Automatic repeat request (ARQ) methods cannot increase the capacity of a memoryless channel. However, they can be used to decrease the complexity of the channel-coding system to achieve essentially error-free transmission and to reduce link margins when the channel characteristics are poorly predictable. This article considers ARQ methods on a power-limited channel (e.g., the deep-space channel), where it is important to minimize the total power needed to transmit the data, as opposed to a bandwidth-limited channel (e.g., terrestrial data links), where the spectral efficiency or the total required transmission time is the most relevant performance measure. In the analysis, we compare the performance of three reference concatenated coded systems used in actual deep-space missions to that obtainable by ARQ methods using the same codes, in terms of required power, time to transmit with a given number of retransmissions, and achievable probability of word error. The ultimate limits of ARQ with an arbitrary number of retransmissions are also derived.
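
    The basic ARQ quantities compared in such an analysis are easy to reproduce (a back-of-envelope sketch, not the article's concatenated-code calculations): with word-error probability p and at most n transmissions per word,

        def expected_transmissions(p, n):
            """Mean sends per word: 1 + p + ... + p**(n-1)."""
            return sum(p ** k for k in range(n))

        def residual_wer(p, n):
            """Probability a word is still wrong after n attempts."""
            return p ** n

        for p in (0.1, 0.01):  # illustrative channel word-error rates
            print(p, expected_transmissions(p, 3), residual_wer(p, 3))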

  3. The neuromodulator of exploration: A unifying theory of the role of dopamine in personality

    PubMed Central

    DeYoung, Colin G.

    2013-01-01

    The neuromodulator dopamine is centrally involved in reward, approach behavior, exploration, and various aspects of cognition. Variations in dopaminergic function appear to be associated with variations in personality, but exactly which traits are influenced by dopamine remains an open question. This paper proposes a theory of the role of dopamine in personality that organizes and explains the diversity of findings, utilizing the division of the dopaminergic system into value coding and salience coding neurons (Bromberg-Martin et al., 2010). The value coding system is proposed to be related primarily to Extraversion and the salience coding system to Openness/Intellect. Global levels of dopamine influence the higher order personality factor, Plasticity, which comprises the shared variance of Extraversion and Openness/Intellect. All other traits related to dopamine are linked to Plasticity or its subtraits. The general function of dopamine is to promote exploration, by facilitating engagement with cues of specific reward (value) and cues of the reward value of information (salience). This theory constitutes an extension of the entropy model of uncertainty (EMU; Hirsh et al., 2012), enabling EMU to account for the fact that uncertainty is an innate incentive reward as well as an innate threat. The theory accounts for the association of dopamine with traits ranging from sensation and novelty seeking, to impulsivity and aggression, to achievement striving, creativity, and cognitive abilities, to the overinclusive thinking characteristic of schizotypy. PMID:24294198

  4. Does theory influence the effectiveness of health behavior interventions? Meta-analysis.

    PubMed

    Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan

    2014-05-01

    To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations between extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) with specific theoretical constructs and 91% did not report links between all the specified constructs with BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships between type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  5. Comorbid Parkinson's disease, falls and fractures in the 2010 National Emergency Department Sample

    PubMed Central

    Beydoun, Hind A.; Beydoun, May A.; Mishra, Nishant K.; Rostant, Ola S.; Zonderman, Alan B.; Eid, Shaker M.

    2017-01-01

    Introduction Parkinson's disease (PD) is a progressive, neurodegenerative disorder of multifactorial etiology affecting ~1% of older adults. Research focused on linking PD to falls and bone fractures has been limited in Emergency Department (ED) settings, where most injuries are identified. We assessed whether injured U.S. ED admissions with PD diagnoses were more likely to exhibit comorbid fall- or non-fall related bone fractures and whether a PD diagnosis with a concomitant fall or bone fracture is linked to worse prognosis. Methods We performed secondary analyses of 2010 Healthcare Utilization Project National ED Sample from 4,253,987 admissions to U.S. EDs linked to injured elderly patients. ED discharges with ICD-9-CM code (332.0) were identified as PD and those with ICD-9-CM code (800.0–829.0) were used to define bone fracture location. Linear and logistic regression models were constructed to estimate slopes (B) and odds ratios (OR) with 95% confidence intervals (CI). Results PD admissions had 28% increased adjusted prevalence of bone fracture. Non-fall injuries showed stronger relationship between PD and bone fracture (ORadj = 1.33, 95% CI: 1.22–1.45) than fall injuries (ORadj = 1.06, 95% CI: 1.01–1.10). PD had the strongest impact on hospitalization length when bone fracture and fall co-occurred, and total charges were directly associated with PD only for fall injuries. Finally, PD status was not related to in-hospital death in this population. Conclusions Among injured U.S. ED elderly patient visits, those with PD had higher bone fracture prevalence and more resource utilization especially among fall-related injuries. No association of PD with in-hospital death was noted. PMID:27887896
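
    The case definitions above amount to simple code-range tests (a minimal sketch; the record layout is hypothetical, and extending the fracture range to 829.9 to cover subcodes is an assumption):

        def has_parkinsons(dx_codes):
            return "332.0" in dx_codes  # ICD-9-CM code for PD

        def has_fracture(dx_codes):
            for code in dx_codes:
                try:
                    if 800.0 <= float(code) <= 829.9:  # fracture range
                        return True
                except ValueError:
                    continue  # skip non-numeric V/E codes
            return False

        visit = {"dx": ["332.0", "820.8", "V15.88"]}  # fabricated record
        print(has_parkinsons(visit["dx"]), has_fracture(visit["dx"]))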

  6. Using linked data to evaluate traumatic brain injuries in New Mexico : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1998-10-01

    The 1995 crude mortality rate of physician-diagnosed traumatic brain injury (TBI) in New Mexico is estimated to be 21 deaths per 100,000 population and the crude incidence of both hospitalized and fatal TBI is 110 cases per 100,000 population (1,...

  7. Impacts of Model Building Energy Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athalye, Rahul A.; Sivaraman, Deepak; Elliott, Douglas B.

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code's requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  8. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  9. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of a SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by reference to the bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
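
    The BER-versus-SNR step in such analyses is often taken with a Gaussian approximation; one form widely used in the SAC-OCDMA literature (an assumption here, not quoted from this paper) is BER = (1/2)erfc(sqrt(SNR/8)):

        from math import erfc, sqrt

        def ber_from_snr(snr_linear):
            return 0.5 * erfc(sqrt(snr_linear / 8.0))

        for snr_db in (20, 25, 30):          # illustrative SNR values
            snr = 10 ** (snr_db / 10)
            print(snr_db, "dB ->", ber_from_snr(snr))
        # Under this approximation, the 1e-9 BER threshold quoted above
        # corresponds to an SNR in the low 20s of dB.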

  10. Sifting, sorting and saturating data in a grounded theory study of information use by practice nurses: a worked example.

    PubMed

    Hoare, Karen J; Mills, Jane; Francis, Karen

    2012-12-01

    The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz including: initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds, a verb acting as a noun. If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when there are no new codes identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.

  11. A MATLAB based 3D modeling and inversion code for MT data

    NASA Astrophysics Data System (ADS)

    Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.

    2017-07-01

    The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.

  12. Thin Shell Model for NIF capsule stagnation studies

    NASA Astrophysics Data System (ADS)

    Hammer, J. H.; Buchoff, M.; Brandon, S.; Field, J. E.; Gaffney, J.; Kritcher, A.; Nora, R. C.; Peterson, J. L.; Spears, B.; Springer, P. T.

    2015-11-01

    We adapt the thin shell model of Ott et al. to asymmetric ICF capsule implosions on NIF. Through much of an implosion the shell aspect ratio is large, so the thin shell approximation is well satisfied. Asymmetric pressure drive is applied using an analytic form for ablation pressure as a function of the x-ray flux, as well as time-dependent 3D drive asymmetry from hohlraum calculations. Since deviations from a sphere are small through peak velocity, we linearize the equations, decompose them by spherical harmonics, and solve ODEs for the coefficients. The model gives the shell position, velocity, and areal mass variations at the time of peak velocity, near 250 microns radius. The variables are used to initialize 3D rad-hydro calculations with the HYDRA and ARES codes. At link time the cold fuel shell and ablator are each characterized by a density, adiabat, and mass. The thickness, position, and velocity of each point are taken from the thin shell model. The interior of the shell is filled with a uniform gas density and temperature consistent with the (3/2)PV energy found from 1D rad-hydro calculations. 3D linked simulations compare favorably with integrated simulations of the entire implosion. By generating synthetic diagnostic data, the model offers a method for quickly testing hypothetical sources of asymmetry and comparing with experiment. Prepared by LLNL under Contract DE-AC52-07NA27344.

  13. Development of a comprehensive watershed model applied to study stream yield under drought conditions

    USGS Publications Warehouse

    Perkins, S.P.; Sophocleous, M.

    1999-01-01

    We developed a model code to simulate a watershed's hydrology and the hydraulic response of an interconnected stream-aquifer system, and applied the model code to the Lower Republican River Basin in Kansas. The model code links two well-known computer programs: MODFLOW (modular 3-D flow model), which simulates ground water flow and stream-aquifer interaction; and SWAT (soil water assessment tool), a soil water budget simulator for an agricultural watershed. SWAT represents a basin as a collection of subbasins in terms of soil, land use, and weather data, and simulates each subbasin on a daily basis to determine runoff, percolation, evaporation, irrigation, pond seepages and crop growth. Because SWAT applies a lumped hydrologic model to each subbasin, spatial heterogeneities with respect to factors such as soil type and land use are not resolved geographically, but can instead be represented statistically. For the Republican River Basin model, each combination of six soil types and three land uses, referred to as a hydrologic response unit (HRU), was simulated with a separate execution of SWAT. A spatially weighted average was then taken over these results for each hydrologic flux and time step by a separate program, SWBAVG. We wrote a package for MODFLOW to associate each subbasin with a subset of aquifer grid cells and stream reaches, and to distribute the hydrologic fluxes given for each subbasin by SWAT and SWBAVG over MODFLOW's stream-aquifer grid to represent tributary flow, surface and ground water diversions, ground water recharge, and evapotranspiration from ground water. The Lower Republican River Basin model was calibrated with respect to measured ground water levels, streamflow, and reported irrigation water use. The model was used to examine the relative contributions of stream yield components and the impact on stream yield and base flow of administrative measures to restrict irrigation water use during droughts. Model results indicate that tributary flow is the dominant component of stream yield and that reduction of irrigation water use produces a corresponding increase in base flow and stream yield. However, the increase in stream yield resulting from reduced water use does not appear to be of sufficient magnitude to restore minimum desirable streamflows.
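
    The SWBAVG step is essentially an area-weighted average over HRUs, as in this sketch (HRU names, weights, and fluxes are invented):

        import numpy as np

        hru_recharge = {  # daily recharge per HRU (mm/day), hypothetical
            ("silt_loam", "irrigated_corn"): np.array([1.2, 0.9, 1.5]),
            ("sandy_loam", "rangeland"):     np.array([0.4, 0.3, 0.6]),
        }
        hru_area_fraction = {("silt_loam", "irrigated_corn"): 0.35,
                             ("sandy_loam", "rangeland"):     0.65}

        # One flux series per subbasin, ready to spread over MODFLOW cells.
        subbasin_recharge = sum(hru_area_fraction[k] * flux
                                for k, flux in hru_recharge.items())
        print(subbasin_recharge)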

  14. Convolution Operations on Coding Metasurface to Reach Flexible and Continuous Controls of Terahertz Beams.

    PubMed

    Liu, Shuo; Cui, Tie Jun; Zhang, Lei; Xu, Quan; Wang, Qiu; Wan, Xiang; Gu, Jian Qiang; Tang, Wen Xuan; Qing Qi, Mei; Han, Jia Guang; Zhang, Wei Li; Zhou, Xiao Yang; Cheng, Qiang

    2016-10-01

    The concept of coding metasurface makes a link between physical metamaterial particles and digital codes, and hence it is possible to perform digital signal processing on the coding metasurface to realize unusual physical phenomena. Here, this study performs Fourier operations on coding metasurfaces and proposes a principle, called scattering-pattern shift, based on the convolution theorem, which allows steering of the scattering pattern to an arbitrarily predesigned direction. Owing to the constant reflection amplitude of the coding particles, the required coding pattern can be achieved simply as the modulus of the sum of two coding matrices. This study demonstrates that the scattering patterns calculated directly from the coding pattern using the Fourier transform agree excellently with numerical simulations based on realistic coding structures, providing an efficient method for optimizing coding patterns to achieve predesigned scattering beams. The most important advantage of this approach over previous schemes for producing anomalous single-beam scattering is its flexible and continuous control of arbitrary directions. This work opens a new route to studying metamaterials from a fully digital perspective, predicting the possibility of combining conventional theorems in digital signal processing with the coding metasurface to realize more powerful manipulations of electromagnetic waves.
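
    The shift principle can be reproduced numerically in a few lines, approximating the far field as the 2-D Fourier transform of the reflection phase of a 1-bit (0 or pi) coding pattern; sizes and codes below are arbitrary:

        import numpy as np

        n = 64
        uniform = np.zeros((n, n), dtype=int)  # all-"0" coding pattern
        gradient = np.broadcast_to((np.arange(n) // 4) % 2, (n, n))  # stripes
        mixed = (uniform + gradient) % 2       # modulus of the matrix sum

        def far_field(code):
            field = np.exp(1j * np.pi * code)  # 1-bit coding: phase 0 or pi
            return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

        for code in (uniform, mixed):
            print(np.unravel_index(far_field(code).argmax(), (n, n)))
        # The broadside peak at (32, 32) moves off-axis once the gradient
        # code is added: multiplication in phase is convolution (a shift)
        # in the scattering pattern.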

  15. Investigation of Bandwidth-Efficient Coding and Modulation Techniques

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    The necessary technology was studied to improve the bandwidth efficiency of the space-to-ground communications network, using the current capabilities of that network as a baseline. The study was aimed at making space payloads, for example the Hubble Space Telescope, more capable without the need to completely redesign the link. Particular emphasis was placed on the following concepts: (1) the requirements necessary to convert an existing standard 4-ary phase shift keying communications link to one that can support, as a minimum, 8-ary phase shift keying with error correction applied; and (2) the feasibility of using the existing equipment configurations with additional signal processing equipment to realize the higher order modulation and coding schemes.

  16. A New Method to Classify Injury Severity by Diagnosis: Validation using Workers' Compensation and Trauma Registry Data

    PubMed Central

    Sears, Jeanne M.; Bowman, Stephen M.; Rotert, Mary; Hogg-Johnson, Sheilah

    2015-01-01

    Purpose Acute work-related trauma is a leading cause of death and disability among U.S. workers. Existing methods to estimate injury severity have important limitations. This study assessed a severe injury indicator constructed from a list of severe traumatic injury diagnosis codes previously developed for surveillance purposes. Study objectives were to: (1) describe the degree to which the severe injury indicator predicts work disability and medical cost outcomes; (2) assess whether this indicator adequately substitutes for estimating Abbreviated Injury Scale (AIS)-based injury severity from workers' compensation (WC) billing data; and (3) assess concordance between indicators constructed from Washington State Trauma Registry (WTR) and WC data. Methods WC claims for workers injured in Washington State from 1998-2008 were linked to WTR records. Competing risks survival analysis was used to model work disability outcomes. Adjusted total medical costs were modeled using linear regression. Information content of the severe injury indicator and AIS-based injury severity measures were compared using Akaike Information Criterion and R2. Results Of 208,522 eligible WC claims, 5% were classified as severe. Among WC claims linked to the WTR, there was substantial agreement between WC-based and WTR-based indicators (kappa=0.75). Information content of the severe injury indicator was similar to some AIS-based measures. The severe injury indicator was a significant predictor of WTR inclusion, early hospitalization, compensated time loss, total permanent disability, and total medical costs. Conclusions Severe traumatic injuries can be directly identified when diagnosis codes are available. This method provides a simple and transparent alternative to AIS-based injury severity estimation. PMID:25900409

  17. Major trauma: the unseen financial burden to trauma centres, a descriptive multicentre analysis.

    PubMed

    Curtis, Kate; Lam, Mary; Mitchell, Rebecca; Dickson, Cara; McDonnell, Karon

    2014-02-01

    This research examines the existing funding model for in-hospital trauma patient episodes in New South Wales (NSW), Australia and identifies factors that cause above-average treatment costs. Accurate information on the treatment costs of injury is needed to guide health-funding strategy and prevent inadvertent underfunding of specialist trauma centres, which treat a high trauma casemix. Admitted trauma patient data provided by 12 trauma centres were linked with financial data for 2008-09. Actual costs incurred by each hospital were compared with state-wide Australian Refined Diagnostic Related Groups (AR-DRG) average costs. Patient episodes where actual cost was higher than the AR-DRG cost allocation were examined. There were 16,693 patients at a total cost of AU$178.7 million. The total costs incurred by trauma centres were $14.7 million above the NSW peer-group average cost estimates. There were 10 AR-DRG where the total cost variance was greater than $500,000. The AR-DRG with the largest proportion of patients were the upper limb injury categories, many of whom had multiple body regions injured and/or a traumatic brain injury (P<0.001). AR-DRG classifications do not adequately describe the trauma patient episode and are not commensurate with the expense of trauma treatment. A revision of the AR-DRG used for trauma is needed. WHAT IS KNOWN ABOUT THIS TOPIC? Severely injured trauma patients often have multiple injuries, in more than one body region, and the determination of the appropriate AR-DRG can be difficult. Pilot research suggests that the AR-DRG do not accurately represent the care that is required for these patients. WHAT DOES THIS PAPER ADD? This is the first multicentre analysis of treatment costs and coding variance for major trauma in Australia. This research identifies the limitations of the current AR-DRGs and those that are particularly problematic. The value of linking trauma registry and financial data within each trauma centre is demonstrated. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? Further work should be conducted between trauma services, clinical coding and finance departments to improve the accuracy of clinical coding, review funding models and ensure that AR-DRG allocation is commensurate with the expense of trauma treatment.

  18. Projection of Patient Condition Code Distributions Based on Mechanism of Injury

    DTIC Science & Technology

    2003-01-01

    The Medical Readiness and Strategic Plan (MRSP) 1998-2004 requires that the military services develop a method for linking real world patient load...data with modern Patient Condition (PC) codes to enable planners to forecast medical workload and resource requirements. Determination of the likely...various levels of medical care. Medical planners and logisticians plan for medical contingencies based on anticipated patient streams, distributions of

  19. Implementation of a Parameterized Interacting Multiple Model Filter on an FPGA for Satellite Communications

    NASA Technical Reports Server (NTRS)

    Hackett, Timothy M.; Bilen, Sven G.; Ferreira, Paulo Victor R.; Wyglinski, Alexander M.; Reinhart, Richard C.

    2016-01-01

    In a communications channel, the space environment between a spacecraft and an Earth ground station can potentially cause the loss of a data link or at least degrade its performance due to atmospheric effects, shadowing, multipath, or other impairments. In adaptive and coded modulation, the signal power level at the receiver can be used to choose a modulation-coding technique that maximizes throughput while meeting bit error rate (BER) and other performance requirements. The goal of this research is to implement a generalized interacting multiple model (IMM) filter based on Kalman filters for improved received-power estimation on software-defined radio (SDR) technology for satellite communications applications. The IMM filter has been implemented in Verilog as a customizable bank of Kalman filters that allows a trade-off between performance and resource utilization. Each Kalman filter can be implemented using either solely a Schur complement module (for high area efficiency) or with Schur complement, matrix multiplication, and matrix addition modules (for high performance). These modules were simulated and synthesized for the Virtex II platform on the JPL Radio Experimenter Development System (EDS) at NASA Glenn Research Center. The results for simulation, synthesis, and hardware testing are presented.
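
    A generic scalar IMM of the kind described can be written compactly (a textbook Python sketch, not the Verilog/FPGA implementation; the two process-noise settings, measurement noise, and switching probability are invented):

        import numpy as np

        class ScalarIMM:
            """Two-model IMM over 1-D random-walk Kalman filters."""
            def __init__(self, q=(1e-4, 1e-1), r=0.5, stay=0.95):
                self.q = np.array(q)    # process noise: slow vs fast channel
                self.r = r              # measurement noise variance
                self.T = np.array([[stay, 1 - stay],
                                   [1 - stay, stay]])  # model-switch matrix
                self.x = np.zeros(2)    # per-model power estimates
                self.P = np.ones(2)     # per-model variances
                self.mu = np.array([0.5, 0.5])  # model probabilities

            def update(self, z):
                c = self.T.T @ self.mu                      # predicted probs
                w = self.T * self.mu[:, None] / c[None, :]  # mixing weights
                x0 = w.T @ self.x                           # mixed estimates
                P0 = (w * (self.P[:, None]
                           + (self.x[:, None] - x0[None, :]) ** 2)).sum(axis=0)
                Pp = P0 + self.q                 # predict (random-walk model)
                v, S = z - x0, Pp + self.r       # innovations and variances
                K = Pp / S                       # Kalman gains
                self.x, self.P = x0 + K * v, (1 - K) * Pp
                lik = np.exp(-0.5 * v ** 2 / S) / np.sqrt(2 * np.pi * S)
                self.mu = c * lik / (c * lik).sum()  # update model probs
                return float(self.mu @ self.x)       # combined estimate

        imm = ScalarIMM()
        for z in [0.1, 0.0, 0.2, 2.0, 2.1, 1.9]:  # power jumps mid-stream
            est = imm.update(z)
        print(round(est, 2), imm.mu.round(2))  # fast model gains weight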

  20. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    NASA Astrophysics Data System (ADS)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  1. Communications terminal breadboard

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A baseline design is presented of a digital communications link between an advanced manned spacecraft (AMS) and an earth terminal via an Intelsat 4 type communications satellite used as a geosynchronous orbiting relay station. The fabrication, integration, and testing of terminal elements at each end of the link are discussed. In the baseline link design, the information carrying capacity of the link was estimated for both the forward direction (earth terminal to AMS) and the return direction, based upon orbital geometry, relay satellite characteristics, terminal characteristics, and the improvement that can be achieved by the use of convolutional coding/Viterbi decoding techniques.

  2. Why Were Polysaccharides Necessary?

    NASA Astrophysics Data System (ADS)

    Tolstoguzov, Vladimir

    2004-12-01

    The main idea of this paper is that the primordial soup may be modelled by food systems whose structure-property relationship is based on non-specific interactions between denatured biopolymers. According to the proposed hypothesis, polysaccharides were the first biopolymers that decreased concentration of salts in the primordial soup, `compatibilised' and drove the joint evolution of proto-biopolymers. Synthesis of macromolecules within the polysaccharide-rich medium could have resulted in phase separation of the primordial soup and concentration of the polypeptides and nucleic acids in the dispersed phase particles. The concentration of proto-biopolymer mixtures favoured their cross-linking in hybrid supermacromolecules of conjugates. The cross-linking of proto-biopolymers could occur by hydrophobic, electrostatic interactions, H-bonds due to freezing aqueous mixed biopolymer dispersions and/or by covalent bonds due to the Maillard reaction. Cross-linking could have increased the local concentration of chemically different proto-biopolymers, fixed their relative positions and made their interactions reproducible. Attractive-repulsive interactions between cross-linked proto-biopolymer chains could develop pairing of the monomer units, improved chemical stability (against hydrolysis) and led to their mutual catalytic activity and coding. Conjugates could probably evolve to the first self-reproduced entities and then to specialized cellular organelles. Phase separation of the primordial soup with concentration of conjugates in the dispersed particles has probably resulted in proto-cells.

  3. Cellular miR-2909 RNomics governs the genes that ensure immune checkpoint regulation.

    PubMed

    Kaul, Deepak; Malik, Deepti; Wani, Sameena

    2018-06-20

    Cross-talk between coding RNAs and regulatory non-coding microRNAs within the human genome has provided compelling evidence for the existence of flexible checkpoint control of T-cell activation. The present study attempts to demonstrate that the interplay between miR-2909 and its effector KLF4 gene has the inherent capacity to regulate genes coding for CTLA4, CD28, CD40, CD134, PDL1, CD80, CD86, IL-6 and IL-10 within normal human peripheral blood mononuclear cells (PBMCs). Based upon these findings, we propose a pathway that links miR-2909 RNomics with the genes coding for immune checkpoint regulators required for the maintenance of immune homeostasis.

  4. Analysis of space telescope data collection systems

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.

    1984-01-01

    The Multiple Access (MA) communication link of the Space Telescope (ST) is described. An expected performance bit error rate is presented. The historical perspective and rationale behind the ESTL space shuttle end-to-end tests are given. The concatenated coding scheme using a convolutional encoder for the outer coder is developed. The ESTL end-to-end tests on the space shuttle communication link are described. Most important is how a concatenated coding system will perform; this is a go/no-go system with respect to received signal-to-noise ratio. A discussion of the verification requirements and the Specification document is presented, and those sections that apply to the Space Telescope data and communications system are discussed. The Space Telescope System consists of the Space Telescope Orbiting Observatory (ST), the Space Telescope Science Institute, and the Space Telescope Operations Control Center. The MA system consists of the ST, the return link from the ST via the Tracking and Data Relay Satellite System to White Sands, and from White Sands via the Domestic Communications Satellite to the STOCC.

  5. MicroRNAs and intellectual disability (ID) in Down syndrome, X-linked ID, and Fragile X syndrome

    PubMed Central

    Siew, Wei-Hong; Tan, Kai-Leng; Babaei, Maryam Abbaspour; Cheah, Pike-See; Ling, King-Hwa

    2013-01-01

    Intellectual disability (ID) is one of the many features manifested in various genetic syndromes, leading to deficits in cognitive function among affected individuals. ID is a feature affected by polygenes and multiple environmental factors, and it leads to a broad spectrum of affected clinical and behavioral characteristics among patients. Until now, the causative mechanism of ID has been unknown and the progression of the condition is poorly understood. Advances in technology and research have identified various genetic abnormalities and defects as potential causes of ID. However, the link between these abnormalities and ID remains inconclusive, and the roles of many newly discovered genetic components such as non-coding RNAs have not been thoroughly investigated. In this review, we aim to consolidate and assimilate the latest developments and findings on a class of small non-coding RNAs known as microRNAs (miRNAs) and their involvement in ID development and progression, with special focus on Down syndrome (DS) and X-linked ID (XLID) [including Fragile X syndrome (FXS)]. PMID:23596395

  6. Automating Traceability for Generated Software Artifacts

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Green, Jeffrey

    2004-01-01

    Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works -- with varying degrees of success -- for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.

  7. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    PubMed

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
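
    As a rough sketch of the tag-generation step, using the widely available Python `qrcode` package and a hypothetical per-patient URL scheme (the study's actual website layout is not given):

    ```python
    import qrcode  # third-party package: pip install qrcode[pil]

    def make_identity_tag(patient_id: str) -> None:
        """Encode a (hypothetical) per-patient record URL into a printable QR tag."""
        url = f"https://example-hospital.example/id/{patient_id}"  # assumed URL scheme
        img = qrcode.make(url)               # returns a PIL image
        img.save(f"tag_{patient_id}.png")    # print onto bracelet, necklace, or ID card

    make_identity_tag("P000174")
    ```

    Access control of the kind the paper describes (paramedics seeing more than the average smartphone user) would live on the server behind that URL, not in the code itself.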

  8. Reduced adaptability, but no fundamental disruption, of norm-based face coding following early visual deprivation from congenital cataracts.

    PubMed

    Rhodes, Gillian; Nishimura, Mayu; de Heering, Adelaide; Jeffery, Linda; Maurer, Daphne

    2017-05-01

    Faces are adaptively coded relative to visual norms that are updated by experience, and this adaptive coding is linked to face recognition ability. Here we investigated whether adaptive coding of faces is disrupted in individuals (adolescents and adults) who experience face recognition difficulties following visual deprivation from congenital cataracts in infancy. We measured adaptive coding using face identity aftereffects, where smaller aftereffects indicate less adaptive updating of face-coding mechanisms by experience. We also examined whether the aftereffects increase with adaptor identity strength, consistent with norm-based coding of identity, as in typical populations, or whether they show a different pattern indicating some more fundamental disruption of face-coding mechanisms. Cataract-reversal patients showed significantly smaller face identity aftereffects than did controls (Experiments 1 and 2). However, their aftereffects increased significantly with adaptor strength, consistent with norm-based coding (Experiment 2). Thus we found reduced adaptability but no fundamental disruption of norm-based face-coding mechanisms in cataract-reversal patients. Our results suggest that early visual experience is important for the normal development of adaptive face-coding mechanisms. © 2016 John Wiley & Sons Ltd.

  9. User's guide for GSMP, a General System Modeling Program. [In PL/I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, J. M.

    1979-10-01

    GSMP is designed for use by systems analysis teams. Given compiled subroutines that model the behavior of components, plus instructions as to how they are to be interconnected, this program links them together to model a complete system. GSMP offers a fast response to management requests for reconfigurations of old systems and even initial configurations of new systems. Standard system-analytic services are provided: parameter sweeps, graphics, free-form input and formatted output, file storage and recovery, user-tested error diagnostics, component model and integration checkout and debugging facilities, sensitivity analysis, and a multimethod optimizer with nonlinear constraint handling capability. Steady-state or cyclic time-dependence is simulated directly, initial-value problems only indirectly. The code is written in PL/I, but interfaces well with FORTRAN component models. Over the last five years GSMP has been used to model theta-pinch, tokamak, and heavy-ion fusion power plants, open- and closed-cycle magnetohydrodynamic power plants, and total community energy systems.

  10. An Open Source modular platform for hydrological model implementation

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2010-05-01

    An implementation framework for setup and evaluation of spatio-temporal models is developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (DLLs). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters, etc. ENKI is designed to meet three different levels of involvement in model construction:
    • Model application: running and evaluating a given model; regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation; uncertainty analysis directed towards input or parameter uncertainty. Users need not know the model's composition of subroutines, the internal variables in the model, or how method modules are created.
    • Model analysis: linking together different process methods, including parallel setup of alternative methods for solving the same task; investigating the effect of different spatial discretization schemes. Users need not write or compile computer code, or handle file I/O for each module.
    • Routine implementation and testing: implementing new process-simulating methods/equations, specialised objective functions or quality control routines, and testing them in an existing framework. Users need not implement user or model interfaces for the new routine, I/O handling, administration of model setup and runs, or calibration and validation routines.
    Originally developed for Norway's largest hydropower producer, Statkraft, ENKI is now being turned into an Open Source project. At the time of writing, the licence and the project administration are not yet established, and the application remains to be ported to other compilers and computer platforms. However, we hope that ENKI will prove useful for both academic and operational users.
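
    A minimal sketch of the narrow plug-in interface idea, written in Python as a stand-in for ENKI's compiled C++/DLL modules; all names and the variable kinds shown are illustrative, not ENKI's actual API:

    ```python
    # Each routine declares its variables by kind; the framework provisions
    # them (time series, initial states, calibrated parameters) before run().

    class SnowRoutine:
        inputs = ["precipitation", "temperature"]   # need time series
        states = ["snow_storage"]                   # need initialisation
        params = ["melt_rate"]                      # need calibrated values

        def run(self, env):
            melt = max(0.0, env["temperature"]) * env["melt_rate"]
            env["snow_storage"] = max(
                0.0, env["snow_storage"] + env["precipitation"] - melt)

    def provision(routine, forcing, init, calibrated):
        """Framework side: supply every declared variable, by kind."""
        env = {}
        for name in routine.inputs:
            env[name] = forcing[name]
        for name in routine.states:
            env[name] = init[name]
        for name in routine.params:
            env[name] = calibrated[name]
        return env

    env = provision(SnowRoutine(), {"precipitation": 2.0, "temperature": 1.5},
                    {"snow_storage": 10.0}, {"melt_rate": 0.4})
    SnowRoutine().run(env)
    print(env["snow_storage"])
    ```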

  11. Simulation of Attitude and Trajectory Dynamics and Control of Multiple Spacecraft

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.

    2009-01-01

    Agora software is a simulation of spacecraft attitude and orbit dynamics. It supports spacecraft models composed of multiple rigid bodies or flexible structural models. Agora simulates multiple spacecraft simultaneously, supporting rendezvous, proximity operations, and precision formation flying studies. The Agora environment includes ephemerides for all planets and major moons in the solar system, supporting design studies for deep space as well as geocentric missions. The environment also contains standard models for gravity, atmospheric density, and magnetic fields. Disturbance force and torque models include aerodynamic, gravity-gradient, solar radiation pressure, and third-body gravitation. In addition to the dynamic and environmental models, Agora supports geometrical visualization through an OpenGL interface. Prototype models are provided for common sensors, actuators, and control laws. A clean interface accommodates linking in actual flight code in place of the prototype control laws. The same simulation may be used for rapid feasibility studies, and then used for flight software validation as the design matures. Agora is open-source and portable across computing platforms, making it customizable and extensible. It is written to support the entire GNC (guidance, navigation, and control) design cycle, from rapid prototyping and design analysis, to high-fidelity flight code verification. As a top-down design, Agora is intended to accommodate a large range of missions, anywhere in the solar system. Both two-body and three-body flight regimes are supported, as well as seamless transition between them. Multiple spacecraft may be simultaneously simulated, enabling simulation of rendezvous scenarios, as well as formation flying. Built-in reference frames and orbit perturbation dynamics provide accurate modeling of precision formation control.

  12. Research on performance of three-layer MG-OXC system based on MLAG and OCDM

    NASA Astrophysics Data System (ADS)

    Wang, Yubao; Ren, Yanfei; Meng, Ying; Bai, Jian

    2017-10-01

    At present, as the traffic volume conveyed by optical transport networks and the variety of traffic grooming methods increase rapidly, optical switching techniques face a series of issues, such as growing demand for wavelengths and increasingly complicated structure management and implementation. This work introduces optical code switching on top of wavelength switching, constructs a three-layer multi-granularity optical cross connect (MG-OXC) system on the basis of optical code division multiplexing (OCDM), and presents a new traffic grooming algorithm. The proposed architecture can improve the flexibility of traffic grooming, reduce the number of wavelengths used, and save consumed ports; hence, it can simplify the routing devices and significantly enhance system performance. Through analysis of the switching structure's network model on a multicast layered auxiliary graph (MLAG) and of the establishment of traffic grooming links, together with simulations of blocking probability and throughput, this paper demonstrates the excellent performance of the proposed architecture.

  13. Photoactive Self-Shaping Hydrogels as Noncontact 3D Macro/Microscopic Photoprinting Platforms.

    PubMed

    Liao, Yue; An, Ning; Wang, Ning; Zhang, Yinyu; Song, Junfei; Zhou, Jinxiong; Liu, Wenguang

    2015-12-01

    A photocleavable terpolymer hydrogel cross-linked with an o-nitrobenzyl-derivative cross-linker is shown to be capable of self-shaping without losing its physical integrity and robustness, owing to spontaneous asymmetric swelling of the network caused by UV-light-induced gradient cleavage of chemical cross-linkages. A continuum model and the finite element method are used to elucidate the underlying curling mechanism. Remarkably, based on this self-changing principle, the photosensitive hydrogels can be developed into soft, wet photoprinting platforms onto which specific 3D characters and images are faithfully duplicated at macro/microscale, without contact, by UV light irradiation under customized photomasks. Importantly, a quick response (QR) code is accurately printed on the photoactive hydrogel for the first time; scanning the QR code with a smartphone quickly connects to a web page. This photoactive hydrogel is a promising new printing and recording material. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Signatures of criticality arise from random subsampling in simple population models.

    PubMed

    Nonnenmacher, Marcel; Behrens, Christian; Berens, Philipp; Bethge, Matthias; Macke, Jakob H

    2017-10-01

    The rise of large-scale recordings of neuronal activity has fueled the hope to gain new insights into the collective activity of neural ensembles. How can one link the statistics of neural population activity to underlying principles and theories? One attempt to interpret such data builds upon analogies to the behaviour of collective systems in statistical physics. Divergence of the specific heat, a measure of population statistics derived from thermodynamics, has been used to suggest that neural populations are optimized to operate at a "critical point". However, these findings have been challenged by theoretical studies which have shown that common inputs can lead to diverging specific heat. Here, we connect "signatures of criticality", and in particular the divergence of specific heat, back to statistics of neural population activity commonly studied in neural coding: firing rates and pairwise correlations. We show that the specific heat diverges whenever the average correlation strength does not depend on population size. This is necessarily true when data with correlations are randomly subsampled during the analysis process, irrespective of the detailed structure or origin of the correlations. We also show how the characteristic shape of specific heat capacity curves depends on firing rates and correlations, using both analytically tractable models and numerical simulations of a canonical feed-forward population model. To analyze these simulations, we develop efficient methods for characterizing large-scale neural population activity with maximum entropy models. We find that, consistent with experimental findings, increases in firing rates and correlations directly lead to more pronounced signatures. Thus, previous reports of thermodynamical criticality in neural populations based on the analysis of specific heat can be explained by average firing rates and correlations, and are not indicative of an optimized coding strategy. We conclude that a reliable interpretation of statistical tests for theories of neural coding is possible only in reference to relevant ground-truth models.
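
    For reference, a common convention in this literature (normalizations vary between papers, so this is indicative rather than the authors' exact definition) introduces a temperature into the population model and defines the specific heat per neuron from the energy fluctuations:

    ```latex
    P_T(\mathbf{x}) \propto e^{-E(\mathbf{x})/T}, \qquad
    c(T) = \frac{1}{N\,T^{2}}\left(\langle E^{2}\rangle_{T} - \langle E\rangle_{T}^{2}\right)
    ```

    The paper's claim is then that $c$ grows without bound with population size $N$ whenever the average pairwise correlation does not shrink with $N$, as happens under random subsampling.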

  15. Magnified Neural Envelope Coding Predicts Deficits in Speech Perception in Noise.

    PubMed

    Millman, Rebecca E; Mattys, Sven L; Gouws, André D; Prendergast, Garreth

    2017-08-09

    Verbal communication in noisy backgrounds is challenging. Understanding speech in background noise that fluctuates in intensity over time is particularly difficult for hearing-impaired listeners with a sensorineural hearing loss (SNHL). The reduction in fast-acting cochlear compression associated with SNHL exaggerates the perceived fluctuations in intensity in amplitude-modulated sounds. SNHL-induced changes in the coding of amplitude-modulated sounds may have a detrimental effect on the ability of SNHL listeners to understand speech in the presence of modulated background noise. To date, direct evidence for a link between magnified envelope coding and deficits in speech identification in modulated noise has been absent. Here, magnetoencephalography was used to quantify the effects of SNHL on phase locking to the temporal envelope of modulated noise (envelope coding) in human auditory cortex. Our results show that SNHL enhances the amplitude of envelope coding in posteromedial auditory cortex, whereas it enhances the fidelity of envelope coding in posteromedial and posterolateral auditory cortex. This dissociation was more evident in the right hemisphere, demonstrating functional lateralization in enhanced envelope coding in SNHL listeners. However, enhanced envelope coding was not perceptually beneficial. Our results also show that both hearing thresholds and, to a lesser extent, magnified cortical envelope coding in left posteromedial auditory cortex predict speech identification in modulated background noise. We propose a framework in which magnified envelope coding in posteromedial auditory cortex disrupts the segregation of speech from background noise, leading to deficits in speech perception in modulated background noise. SIGNIFICANCE STATEMENT People with hearing loss struggle to follow conversations in noisy environments. Background noise that fluctuates in intensity over time poses a particular challenge. Using magnetoencephalography, we demonstrate anatomically distinct cortical representations of modulated noise in normal-hearing and hearing-impaired listeners. This work provides the first link among hearing thresholds, the amplitude of cortical representations of modulated sounds, and the ability to understand speech in modulated background noise. In light of previous work, we propose that magnified cortical representations of modulated sounds disrupt the separation of speech from modulated background noise in auditory cortex. Copyright © 2017 Millman et al.

  16. Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L.

    2017-05-01

    MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, and the effect of code changes during this code development cycle (rev 6342 to rev 9496), along with a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.

  17. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  18. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    PubMed

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  19. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications is analyzed. The inner code is used for both error correction and detection; the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after inner code decoding. The probability of undetected error of this error control scheme is derived and upper bounded. Two specific examples are analyzed. In the first example, the inner code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^6+X+1) = X^7+X^6+X^2+1, and the outer code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^15+X^14+X^13+X^12+X^4+X^3+X^2+X+1) = X^16+X^12+X^5+1, which is the generator polynomial specified in the X.25 standard for packet-switched data networks. This example is proposed for error control on NASA telecommand links. In the second example, the inner code is the same as in the first example, but the outer code is a shortened Reed-Solomon code with symbols from GF(2^8) and generator polynomial (X+1)(X+alpha), where alpha is a primitive element in GF(2^8).
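
    For concreteness, mod-2 division by the X.25 generator polynomial can be sketched as below. This shows only the raw remainder computation (no initial register value, bit reflection, or final complement, as in the full X.25 frame check), not the paper's error control scheme.

    ```python
    POLY = 0x1021  # X^16 + X^12 + X^5 + 1, with the leading X^16 term implicit

    def crc16(data: bytes, reg: int = 0x0000) -> int:
        """Bitwise mod-2 polynomial division of data * X^16 by the generator."""
        for byte in data:
            reg ^= byte << 8
            for _ in range(8):
                if reg & 0x8000:                     # high bit set: subtract generator
                    reg = ((reg << 1) ^ POLY) & 0xFFFF
                else:
                    reg = (reg << 1) & 0xFFFF
        return reg

    print(hex(crc16(b"123456789")))  # 0x31c3 under this (XMODEM-style) convention
    ```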

  20. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.

  1. Modelling Per Capita Water Demand Change to Support System Planning

    NASA Astrophysics Data System (ADS)

    Garcia, M. E.; Islam, S.

    2016-12-01

    Water utilities have a number of levers to influence customer water usage. These include levers to proactively slow demand growth over time, such as building and landscape codes, as well as levers to decrease demand quickly in response to water stress, including price increases, education campaigns, water restrictions, and incentive programs. Even actions aimed at short-term reductions can produce long-term declines in water usage when they substantially change water efficiency (as with incentives for fixture replacement or turf removal) or usage patterns (as with permanent lawn-watering restrictions). Demand change is therefore linked to hydrological conditions and to the effects of past management decisions, both of which are typically included in water supply planning models. Yet demand itself is typically incorporated exogenously using scenarios, or endogenously using only price, even though utilities also influence water usage through rules and incentives issued in response to water stress and through codes specifying standards for new construction. Explicitly including these policy levers in planning models enables concurrent testing of infrastructure and policy strategies and illuminates interactions between the two. The City of Las Vegas is used as a case study to develop and demonstrate this modeling approach. First, a statistical analysis of system data was employed to rule out alternative hypotheses for the per capita demand decrease, such as changes in population density and economic structure. Next, four demand sub-models were developed, including one baseline model in which demand is a function of price alone. The sub-models were then calibrated and tested using monthly data from 1997 to 2012. Finally, the best performing sub-model was integrated with a full supply and demand model. The results highlight the importance of modeling water demand dynamics endogenously and of taking a broader view of the variables influencing demand change.
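
    The abstract does not give the sub-models' functional forms; a minimal sketch of the price-only baseline, assuming a constant-elasticity form (a common choice in the demand literature, not necessarily the authors'), with illustrative parameter values:

    ```python
    def per_capita_demand(price, base_price=1.0, base_demand=600.0,
                          elasticity=-0.3):
        """Constant-elasticity baseline: demand responds only to price.
        Units and values are illustrative (e.g. litres/person/day)."""
        return base_demand * (price / base_price) ** elasticity

    # A 20% price increase with elasticity -0.3 cuts demand by about 5%.
    print(per_capita_demand(1.2))
    ```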

  2. Communications and information research: Improved space link performance via concatenated forward error correction coding

    NASA Technical Reports Server (NTRS)

    Rao, T. R. N.; Seetharaman, G.; Feng, G. L.

    1996-01-01

    With the development of new advanced instruments for remote sensing applications, sensor data will be generated at a rate that not only requires increased onboard processing and storage capability, but also imposes demands on the space-to-ground communication link and the ground data management-communication system. Data compression and error control codes provide viable means to alleviate these demands. Two types of data compression have been studied by many researchers in the area of information theory: lossless techniques that guarantee full reconstruction of the data, and lossy techniques which generally give higher data compaction ratios but incur some distortion in the reconstructed data. To satisfy the many science disciplines which NASA supports, lossless data compression is the primary focus of this technology development. When transmitting data obtained by lossless data compression, it is very important to use an error-control code. For a long time, convolutional codes have been widely used in satellite telecommunications. To transmit data obtained by the Rice algorithm more efficiently, the decoder must supply the a posteriori probability (APP) of each decoded bit. A relevant algorithm for this purpose has been proposed which minimizes the bit error probability in decoding linear block and convolutional codes and provides the APP for each decoded bit. However, recent results on iterative decoding of 'turbo codes' turn conventional wisdom on its head and suggest fundamentally new techniques. During the past several months of this research, the following approaches have been developed: (1) a new lossless data compression algorithm, which outperforms the extended Rice algorithm for various types of sensor data; (2) a new approach to determine the generalized Hamming weights of the algebraic-geometric codes defined by a large class of curves in high-dimensional spaces; (3) some efficient improved geometric Goppa codes for disk memory systems and high-speed mass memory systems; and (4) a tree-based approach for data compression using dynamic programming.
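
    The extended Rice algorithm has many modes; the core Golomb-Rice mapping on which the family is built can be sketched as follows (illustrative only, not NASA's full algorithm):

    ```python
    def rice_encode(n: int, k: int) -> str:
        """Golomb-Rice codeword for a nonnegative integer: the quotient n >> k
        in unary (q ones and a terminating zero), then k remainder bits."""
        q, r = n >> k, n & ((1 << k) - 1)
        return "1" * q + "0" + format(r, f"0{k}b")

    print(rice_encode(19, 3))  # q=2, r=3 -> '110' + '011' = '110011'
    ```

    Small residuals (the common case after prediction) get short codewords, which is what makes the mapping effective on decorrelated sensor data.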

  3. dbWFA: a web-based database for functional annotation of Triticum aestivum transcripts

    PubMed Central

    Vincent, Jonathan; Dai, Zhanwu; Ravel, Catherine; Choulet, Frédéric; Mouzeyar, Said; Bouzidi, M. Fouad; Agier, Marie; Martre, Pierre

    2013-01-01

    The functional annotation of genes based on sequence homology with genes from model species genomes is time-consuming because it is necessary to mine several unrelated databases. The aim of the present work was to develop a functional annotation database for common wheat Triticum aestivum (L.). The database, named dbWFA, is based on the reference NCBI UniGene set, an expressed gene catalogue built by expressed sequence tag clustering, and on full-length coding sequences retrieved from the TriFLDB database. Information from good-quality heterogeneous sources, including annotations for model plant species Arabidopsis thaliana (L.) Heynh. and Oryza sativa L., was gathered and linked to T. aestivum sequences through BLAST-based homology searches. Even though the complexity of the transcriptome cannot yet be fully appreciated, we developed a tool to easily and promptly obtain information from multiple functional annotation systems (Gene Ontology, MapMan bin codes, MIPS Functional Categories, PlantCyc pathway reactions and TAIR gene families). The use of dbWFA is illustrated here with several query examples. We were able to assign a putative function to 45% of the UniGenes and 81% of the full-length coding sequences from TriFLDB. Moreover, comparison of the annotation of the whole T. aestivum UniGene set along with curated annotations of the two model species assessed the accuracy of the annotation provided by dbWFA. To further illustrate the use of dbWFA, genes specifically expressed during the early cell division or late storage polymer accumulation phases of T. aestivum grain development were identified using a clustering analysis and then annotated using dbWFA. The annotation of these two sets of genes was consistent with previous analyses of T. aestivum grain transcriptomes and proteomes. Database URL: urgi.versailles.inra.fr/dbWFA/ PMID:23660284

  4. RAINLINK: Retrieval algorithm for rainfall monitoring employing microwave links from a cellular communication network

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, R.; Overeem, A.; Leijnse, H.; Rios Gaona, M. F.

    2017-12-01

    The basic principle of rainfall estimation using microwave links is as follows. Rainfall attenuates the electromagnetic signals transmitted from one telephone tower to another. By measuring the received power at one end of a microwave link as a function of time, the path-integrated attenuation due to rainfall can be calculated, which can be converted to average rainfall intensities over the length of the link. Microwave links from cellular communication networks have been proposed as a promising new rainfall measurement technique for a decade. They are particularly interesting for countries where few surface rainfall observations are available. Yet to date no operational (real-time) link-based rainfall products exist. To advance this technique towards operational application and upscaling, freely available, user-friendly computer code for microwave link data processing and rainfall mapping is needed. Such software is now available as the R package "RAINLINK" on GitHub (https://github.com/overeem11/RAINLINK). It contains a working example that computes link-based 15-min rainfall maps for the entire surface area of The Netherlands for 40 hours from real microwave link data. This is, for the first time, a working example using actual data from an extensive network of commercial microwave links, which will allow users to test their own algorithms and compare their results with ours. The package consists of modular functions, which facilitates running only part of the algorithm. The main processing steps are:
    1) preprocessing of link data (initial quality and consistency checks);
    2) wet-dry classification using link data;
    3) reference signal determination;
    4) removal of outliers;
    5) correction of received signal powers;
    6) computation of mean path-averaged rainfall intensities;
    7) interpolation of rainfall intensities;
    8) rainfall map visualisation.
    Applications of RAINLINK are shown based on microwave link data from a temperate climate (the Netherlands) and from a subtropical climate (Brazil). We hope that RAINLINK will promote rainfall monitoring using microwave links in poorly gauged regions around the world, and we invite researchers to contribute to RAINLINK to make the code more generally applicable to data from different networks and climates.
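
    Step 6 typically rests on the power-law relation k = aR^b between specific attenuation k (dB/km) and rain rate R (mm/h). A sketch of the inversion, with placeholder coefficients rather than RAINLINK's frequency- and polarization-dependent values:

    ```python
    def rain_rate(attenuation_db: float, path_km: float,
                  a: float = 0.033, b: float = 1.04) -> float:
        """Invert k = a * R**b. The coefficients a and b depend on link
        frequency and polarization; these placeholders are merely of a
        plausible magnitude for cellular-backhaul frequencies."""
        k = attenuation_db / path_km      # path-averaged specific attenuation
        return (k / a) ** (1.0 / b)       # rain rate in mm/h

    print(rain_rate(3.0, 10.0))  # 3 dB of rain-induced loss over a 10 km path
    ```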

  5. User preference as quality markers of paediatric web sites.

    PubMed

    Hernández-Borges, Angel A; Macías-Cervi, Pablo; Gaspar-Guardado, Asunción; Torres-Alvarez De Arcaya, María Luisa; Ruíz-Rabaza, Ana; Jiménez-Sosa, Alejandro

    2003-09-01

    Little is known about the ability of internet users to distinguish the best medical resources online, and how their preferences, measured by usage and popularity indexes, correlate with established quality criteria. Our objective was to analyse whether the number of inbound links and/or daily visits to a sample of paediatric web pages are reliable quality markers of the pages. Two-year follow-up study of 363 web pages with paediatric information. The number of inbound links and the average number of daily visits to the pages were calculated on a yearly basis. In addition, their rates of compliance with the codes of conduct, guidelines and/or principles of three international organizations were evaluated. The quality code most widely met by the sample web pages was the Health on the Net Foundation Code of Conduct (overall rate, 60.2%). Sample pages showed a low degree of compliance with principles related to privacy, confidentiality and electronic commerce (overall rate less than 45%). Most importantly, we observed a moderate, significant correlation between compliance with quality criteria and the number of inbound links (p < 0.001). However, no correlation was found between the number of daily visits to a page and its degree of compliance with the principles. Some indexes derived from the analysis of webmasters' hyperlinks could be reliable quality markers of medical web resources.

  6. Description of movement quality in patients with low back pain: A qualitative study as a first step to a practical definition.

    PubMed

    van Dijk, Margriet J H; Smorenburg, Nienke T A; Visser, Bart; Nijhuis-van der Sanden, Maria W G; Heerkens, Yvonne F

    2017-03-01

    As a first step to formulate a practical definition for movement quality (MQ), this study aims to explore how Dutch allied health care professionals (AHCPs) describe MQ of daily life activities in patients with low back pain (LBP). In this qualitative cross-sectional digital survey study, Dutch AHCPs (n = 91) described MQ in open text (n = 91) and with three keywords (n = 90). After exploratory qualitative content analysis, the ICF linking rules (International Classification of Functioning, Disability and Health) were applied to classify MQ descriptions and keywords. The identified meaningful concepts (MCs) of the descriptions (274) and keywords (239) were linked to ICF codes (87.5% and 80.3%, respectively), Personal factors (5.8% and 5.9%, respectively), and supplementary codes (6.6% and 13.8%, respectively). The MCs were linked to a total of 31 ICF codes, especially to b760 'control of voluntary movement functions', b7602 'coordination of voluntary movements', d4 'Mobility', and d230 'carry out daily routine'. Negative and positive formulated descriptions elucidated different MQ interpretations. Descriptions of MQ given by Dutch AHCPs in patients with LBP cover all ICF components. Coordination and functional movements are seen as the most elementary concepts of MQ. Variation in MQ descriptions and interpretations hinders defining MQ and indicates the necessity of additional steps.

  7. Kinetic Modeling of Slow Energy Release in Non-Ideal Carbon Rich Explosives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vitello, P; Fried, L; Glaesemann, K

    2006-06-20

    We present here the first self-consistent kinetic-based model for long-time-scale energy release in detonation waves in the non-ideal explosive LX-17. Non-ideal, insensitive carbon-rich explosives, such as those based on TATB, are believed to have significant late-time slow energy release. One proposed source of this energy is diffusion-limited growth of carbon clusters. In this paper we consider the late-time energy release problem in detonation waves using the thermochemical code CHEETAH linked to a multidimensional ALE hydrodynamics model. The linked CHEETAH-ALE model treats slowly reacting chemical species using kinetic rate laws, with chemical equilibrium assumed for species coupled via fast time-scale reactions. In the model presented here we include separate rate equations for the transformation of the unreacted explosive to product gases and for the growth of a small particulate form of condensed graphite to a large particulate form. The small particulate graphite is assumed to be in chemical equilibrium with the gaseous species, allowing for coupling between the instantaneous thermodynamic state and the production of graphite clusters. For the explosive burn rate, a pressure-dependent rate law was used. Low-pressure freezing of the gas species mass fractions was also included to account for regions where the kinetic coupling rates become longer than the hydrodynamic time-scales. The model rate parameters were calibrated using cylinder and rate-stick experimental data. Excellent long-time agreement and size-effect results were achieved.
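
    The abstract does not give the calibrated form; a generic pressure-dependent burn rate of the kind often used in such models (an assumed form, not the paper's) is

    ```latex
    \frac{d\lambda}{dt} = k \, p^{\,n} \, (1 - \lambda)
    ```

    where $\lambda$ is the reacted mass fraction, $p$ the local pressure, and $k$ and $n$ constants calibrated against the cylinder and rate-stick data.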

  8. Improving Hospital Reporting of Patient Race and Ethnicity--Approaches to Data Auditing.

    PubMed

    Zingmond, David S; Parikh, Punam; Louie, Rachel; Lichtensztajn, Daphne Y; Ponce, Ninez; Hasnain-Wynia, Romana; Gomez, Scarlett Lin

    2015-08-01

    To investigate new metrics to improve the reporting of patient race and ethnicity (R/E) by hospitals. California Patient Discharge Database (PDD) and birth registry, 2008-2009; Healthcare Cost and Utilization Project State Inpatient Databases, 2008-2011; cancer registry, 2000-2008; and 2010 US Census Summary File 2. We examined agreement between hospital-reported R/E and self-report among mothers delivering babies and a cancer cohort in California. Metrics were created to measure the root mean squared difference (RMSD), by hospital, between the reported R/E distribution and R/E estimates derived from the R/E distribution within each patient's zip code of residence. RMSD comparisons were made to corresponding "gold standard" facility-level measures within the maternal cohort for California and six comparison states. Maternal birth hospitalizations (linked to the state birth registry) and cancer cohort records were linked to preceding and subsequent hospitalizations. Hospital discharges were linked to the corresponding Census zip code tabulation area using the patient zip code. Overall agreement between the PDD and the gold standard for the maternal cohort was 86 percent for the combined R/E measure and 71 percent for race alone. The RMSD measure is modestly correlated with the summary-level gold standard measure for R/E (r = 0.44). The RMSD metric revealed general improvement in data agreement and completeness across states. "Other" and "unknown" categories were inconsistently applied within inpatient databases. Comparison between reported R/E and R/E estimates using zip-code-level data may be a reasonable first approach to evaluating and tracking hospital R/E reporting. Further work should focus on using more granular geocoded data for estimates and on tracking data to improve hospital collection of R/E data. © Health Research and Educational Trust.
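
    A minimal sketch of the per-hospital metric as described, assuming the race/ethnicity shares are compared category by category (the paper's exact weighting is not stated):

    ```python
    import numpy as np

    def re_rmsd(reported_shares, zip_based_shares):
        """Root mean squared difference between a hospital's reported R/E
        distribution and the zip-code-derived estimate, both given as
        vectors of category shares summing to 1."""
        r = np.asarray(reported_shares, dtype=float)
        z = np.asarray(zip_based_shares, dtype=float)
        return float(np.sqrt(np.mean((r - z) ** 2)))

    # Four illustrative R/E categories; prints 0.03.
    print(re_rmsd([0.50, 0.30, 0.15, 0.05], [0.45, 0.33, 0.16, 0.06]))
    ```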

  9. Joint two dimensional inversion of gravity and magnetotelluric data using correspondence maps

    NASA Astrophysics Data System (ADS)

    Carrillo Lopez, J.; Gallardo, L. A.

    2016-12-01

    Inverse problems in Earth sciences are inherently non-unique. To improve models and reduce the number of admissible solutions, extra information must be provided. In a geological context this can be a priori information, for example geological constraints, well-log data, or smoothness assumptions, or it can come from measurements of different kinds of data. Joint inversion provides an approach to improve the solution and reduce the errors introduced by the assumptions of each individual method. To do this, we need a link between two or more models. Some approaches have been explored successfully in recent years. For example, Gallardo and Meju (2003, 2004, 2011) and Gallardo et al. (2012) used the directions of property gradients to measure the similarity between models by minimizing their cross gradients. In this work, we propose a joint iterative inversion method that uses the spatial distribution of properties as the link. Correspondence maps may characterize specific Earth systems better because they consider the relation between properties. We implemented a code in Fortran for two-dimensional inversion of magnetotelluric and gravity data, two of the standard methods in geophysical exploration. Synthetic tests show the advantages of joint inversion using correspondence maps over separate inversion. Finally, we applied this technique to magnetotelluric and gravity data from the geothermal zone located in Cerro Prieto, México.
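
    For context, the cross-gradient function cited from Gallardo and Meju takes the form

    ```latex
    \mathbf{t}(x,z) = \nabla m_{1}(x,z) \times \nabla m_{2}(x,z)
    ```

    where $m_1$ and $m_2$ are the two property models (here resistivity and density), and structural similarity is enforced by driving $\mathbf{t}$ toward zero. The correspondence-map approach proposed in this work replaces that geometric link with the joint spatial distribution of the inverted properties.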

  10. Parallel Distributed Processing Theory in the Age of Deep Networks.

    PubMed

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  11. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
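
    For background, direct-sequence PN codes are produced by linear feedback shift registers; a generic Fibonacci LFSR sketch follows (illustrative register length and taps, not the actual TDRSS code parameters):

    ```python
    def lfsr_sequence(taps, state, length, nbits):
        """Generic Fibonacci LFSR: output the LSB each step, feed the XOR of
        the tapped bits back into the MSB. With a primitive feedback
        polynomial the output is a maximal-length PN sequence of period
        2**nbits - 1. All parameters here are illustrative."""
        out = []
        for _ in range(length):
            out.append(state & 1)
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1
            state = (state >> 1) | (fb << (nbits - 1))
        return out

    # 5-bit register with primitive feedback taps: period 2**5 - 1 = 31 chips.
    print(lfsr_sequence(taps=(0, 2), state=0b00001, length=31, nbits=5))
    ```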

  12. Evolutionary Computation with Spatial Receding Horizon Control to Minimize Network Coding Resources

    PubMed Central

    Leeson, Mark S.

    2014-01-01

    The minimization of network coding resources, such as coding nodes and links, is a challenging task, not only because it is an NP-hard problem, but also because the problem scale is huge; for example, networks in the real world may have thousands or even millions of nodes and links. Genetic algorithms (GAs) have good potential for resolving NP-hard problems like the network coding problem (NCP), but as population-based algorithms, they often confront serious scalability and applicability problems when applied to large- or huge-scale systems. Inspired by temporal receding horizon control in control engineering, this paper proposes a novel spatial receding horizon control (SRHC) strategy as a network partitioning technology, and then designs an efficient GA to tackle the NCP. Traditional network partitioning methods can be viewed as a special case of the proposed SRHC, namely one-step-wide SRHC, whilst the method in this paper is a generalized N-step-wide SRHC, which makes better use of global information about network topologies. Besides the SRHC strategy, some useful designs are also reported in this paper. The advantages of the proposed SRHC and GA for the NCP are illustrated by extensive experiments, and they have good potential for extension to other large-scale complex problems. PMID:24883371

  13. Rocketdyne/Westinghouse nuclear thermal rocket engine modeling

    NASA Technical Reports Server (NTRS)

    Glass, James F.

    1993-01-01

    The topics are presented in viewgraph form and include the following: systems approach needed for nuclear thermal rocket (NTR) design optimization; generic NTR engine power balance codes; Rocketdyne nuclear thermal system code; software capabilities; steady state model; NTR engine optimizer code logic; reactor power calculation logic; sample multi-component configuration; NTR design code output; generic NTR code at Rocketdyne; Rocketdyne NTR model; and nuclear thermal rocket modeling directions.

  14. The Application of Social Characteristic and L1 Optimization in the Error Correction for Network Coding in Wireless Sensor Networks

    PubMed Central

    Zhang, Guangzhi; Cai, Shaobin; Xiong, Naixue

    2018-01-01

    One of the remarkable challenges in Wireless Sensor Networks (WSN) is how to transfer the collected data efficiently, given the energy limitations of sensor nodes. Network coding can increase the network throughput of a WSN dramatically due to the broadcast nature of WSN. However, network coding usually propagates a single original error over the whole network. Due to this error propagation property, most error correction methods cannot correct more than C/2 corrupted errors, where C is the max-flow min-cut of the network. To maximize the effectiveness of network coding applied in WSN, a new error-correcting mechanism to confront the propagated errors is urgently needed. Based on the social network characteristics inherent in WSN and L1 optimization, we propose a novel scheme which successfully corrects more than C/2 corrupted errors. Moreover, even if errors occur on all the links of the network, our scheme can still correct them successfully. By introducing a secret channel and a specially designed matrix which can trap some errors, we improve John and Yi's model so that it can correct the propagated errors in network coding, which usually pollute exactly 100% of the received messages. Taking advantage of the social characteristics inherent in WSN, we propose a new distributed approach that establishes reputation-based trust among sensor nodes in order to identify the informative upstream sensor nodes. Drawing on social network theory, the informative relay nodes are selected and marked with high trust values. The two methods, L1 optimization and the use of social characteristics, coordinate with each other and can correct propagated errors whose fraction is exactly 100% in a WSN where network coding is performed. The effectiveness of the error correction scheme is validated through simulation experiments. PMID:29401668
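
    Generically, such L1-based decoders solve the convex program

    ```latex
    \hat{\mathbf{x}} = \arg\min_{\mathbf{x}} \;\lVert \mathbf{y} - G\mathbf{x} \rVert_{1},
    \qquad \mathbf{y} = G\mathbf{x} + \mathbf{e}
    ```

    where $\mathbf{y}$ is the received block, $G$ the network transfer matrix, and $\mathbf{e}$ the propagated error; the minimizer recovers $\mathbf{x}$ when $\mathbf{e}$ is sufficiently sparse. This is the standard formulation underlying such schemes, not the paper's exact program, which augments it with a secret channel and an error-trapping matrix so that recovery survives even a 100% error fraction.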

  15. The Application of Social Characteristic and L1 Optimization in the Error Correction for Network Coding in Wireless Sensor Networks.

    PubMed

    Zhang, Guangzhi; Cai, Shaobin; Xiong, Naixue

    2018-02-03

    One of the remarkable challenges in Wireless Sensor Networks (WSN) is how to transfer the collected data efficiently, given the energy limitations of sensor nodes. Network coding can increase the network throughput of a WSN dramatically due to the broadcast nature of WSN. However, network coding usually propagates a single original error over the whole network. Due to this error propagation property, most error correction methods cannot correct more than C/2 corrupted errors, where C is the max-flow min-cut of the network. To maximize the effectiveness of network coding applied in WSN, a new error-correcting mechanism to confront the propagated errors is urgently needed. Based on the social network characteristics inherent in WSN and L1 optimization, we propose a novel scheme which successfully corrects more than C/2 corrupted errors. Moreover, even if errors occur on all the links of the network, our scheme can still correct them successfully. By introducing a secret channel and a specially designed matrix which can trap some errors, we improve John and Yi's model so that it can correct the propagated errors in network coding, which usually pollute exactly 100% of the received messages. Taking advantage of the social characteristics inherent in WSN, we propose a new distributed approach that establishes reputation-based trust among sensor nodes in order to identify the informative upstream sensor nodes. Drawing on social network theory, the informative relay nodes are selected and marked with high trust values. The two methods, L1 optimization and the use of social characteristics, coordinate with each other and can correct propagated errors whose fraction is exactly 100% in a WSN where network coding is performed. The effectiveness of the error correction scheme is validated through simulation experiments.

  16. User Instructions for the Policy Analysis Modeling System (PAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.

    PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.

  17. Decoder synchronization for deep space missions

    NASA Technical Reports Server (NTRS)

    Statman, J. I.; Cheung, K.-M.; Chauvin, T. H.; Rabkin, J.; Belongie, M. L.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) recommends that space communication links employ a concatenated, error-correcting, channel-coding system in which the inner code is a convolutional (7,1/2) code and the outer code is a (255,223) Reed-Solomon code. The traditional implementation is to perform node synchronization for the Viterbi decoder and frame synchronization for the Reed-Solomon decoder as separate, sequential operations. This article discusses a unified synchronization technique that is required for deep space missions with extremely low data rates and signal-to-noise ratios (SNRs). This technique combines frame synchronization in the bit and symbol domains with traditional accumulated-metric-growth techniques to establish joint frame and node synchronization. A variation on this technique is used for the Galileo spacecraft on its Jupiter-bound mission.

  18. Hydrological and water quality processes simulation by the integrated MOHID model

    NASA Astrophysics Data System (ADS)

    Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-04-01

    Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the fully distributed, physics-based MOHID model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model performs satisfactorily at simulating discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow an in-depth evaluation of the nutrient cycling processes, the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates were located near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model for simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
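
    The reported scores are Nash-Sutcliffe efficiencies,

    ```latex
    \mathrm{NSE} = 1 - \frac{\sum_{t}\left(Q_{\mathrm{obs}}^{\,t} - Q_{\mathrm{sim}}^{\,t}\right)^{2}}
                            {\sum_{t}\left(Q_{\mathrm{obs}}^{\,t} - \overline{Q}_{\mathrm{obs}}\right)^{2}}
    ```

    where $Q_{\mathrm{obs}}^{t}$ and $Q_{\mathrm{sim}}^{t}$ are observed and simulated discharge and $\overline{Q}_{\mathrm{obs}}$ is the observed mean; NSE = 1 indicates a perfect fit, and values around 0.7 are generally regarded as good for discharge simulation.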

  19. Hot Rolling Scrap Reduction through Edge Cracking and Surface Defects Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaudoin, Armand

    2016-05-29

    The design of future aircraft must address the combined demands for fuel efficiency, reduced emissions, and lower operating costs. One contribution to these goals is weight savings through the development of new alloys and design techniques for airframe structures. This research contributes to light-weighting through the fabrication of monolithic components from advanced aluminum alloys by making a link between alloy processing history and in-service performance. Specifically, this research demonstrates the link between crack growth and features of the alloy microstructure that follow from thermo-mechanical processing. This is achieved through a computer model of crack deviation. The model is validated against experimental data from production-scale aluminum alloy plate, and the effect of changes in processing history on crack growth is demonstrated. The model is cast in the open-source finite element code WARP3D, which is freely downloadable and well documented. This project provides benefit along several avenues. First, the technical contribution of the computer model offers the materials engineer a critical means of providing guidance both upstream, to process tuning to achieve optimal properties, and downstream, to enhance fault tolerance. Beyond the fuel savings and emissions reduction inherent in the light-weighting of aircraft structures, improved fault tolerance addresses demands for longer inspection intervals over baseline and a lower life cycle cost.

  20. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work was performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code itself, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. All new models and code improvements were validated, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.

  1. Toward a Probabilistic Automata Model of Some Aspects of Code-Switching.

    ERIC Educational Resources Information Center

    Dearholt, D. W.; Valdes-Fallis, G.

    1978-01-01

    The purpose of the model is to select either Spanish or English as the language to be used; its goals at this stage of development include modeling code-switching for lexical need, apparently random code-switching, dependency of code-switching upon sociolinguistic context, and code-switching within syntactic constraints. (EJS)
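
    To make the idea of a probabilistic automaton for language selection concrete, here is a minimal sketch; the states and switching probabilities are invented for illustration and are not from the ERIC record. Each word-emission step chooses Spanish or English according to state-dependent transition probabilities:

    ```python
    import random

    # Hypothetical switching probabilities P(next language | current language).
    # The values are illustrative only.
    TRANSITIONS = {
        "spanish": {"spanish": 0.8, "english": 0.2},
        "english": {"spanish": 0.3, "english": 0.7},
    }

    def generate_language_sequence(start, steps, seed=0):
        """Walk the two-state probabilistic automaton, yielding a language
        choice for each word slot in the utterance."""
        rng = random.Random(seed)
        state = start
        sequence = []
        for _ in range(steps):
            sequence.append(state)
            p_spanish = TRANSITIONS[state]["spanish"]
            state = "spanish" if rng.random() < p_spanish else "english"
        return sequence

    print(generate_language_sequence("spanish", 10))
    ```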

  2. Pictorial Superiority Effect

    ERIC Educational Resources Information Center

    Nelson, Douglas L.; And Others

    1976-01-01

    Pictures generally show superior recognition relative to their verbal labels. This experiment was designed to link this pictorial superiority effect to sensory or meaning codes associated with the two types of symbols. (Editor)

  3. Potential digitization/compression techniques for Shuttle video

    NASA Technical Reports Server (NTRS)

    Habibi, A.; Batson, B. H.

    1978-01-01

    The Space Shuttle initially will be using a field-sequential color television system but it is possible that an NTSC color TV system may be used for future missions. In addition to downlink color TV transmission via analog FM links, the Shuttle will use a high resolution slow-scan monochrome system for uplink transmission of text and graphics information. This paper discusses the characteristics of the Shuttle video systems, and evaluates digitization and/or bandwidth compression techniques for the various links. The more attractive techniques for the downlink video are based on a two-dimensional DPCM encoder that utilizes temporal and spectral as well as the spatial correlation of the color TV imagery. An appropriate technique for distortion-free coding of the uplink system utilizes two-dimensional HCK codes.
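
    As background on the DPCM idea referenced above, the sketch below shows a toy two-dimensional DPCM loop that predicts each pixel from its left and upper neighbors and quantizes only the prediction error. The predictor weights and quantizer step are illustrative assumptions, not the encoder described in the paper:

    ```python
    import numpy as np

    def dpcm2d_encode(image, step=8):
        """Toy 2-D DPCM: predict each pixel from the already-decoded left and
        upper neighbors, quantize the prediction error with a uniform step."""
        h, w = image.shape
        decoded = np.zeros((h, w))
        codes = np.zeros((h, w), dtype=int)
        for i in range(h):
            for j in range(w):
                left = decoded[i, j - 1] if j > 0 else 0.0
                up = decoded[i - 1, j] if i > 0 else 0.0
                prediction = 0.5 * (left + up)  # illustrative predictor weights
                error = image[i, j] - prediction
                codes[i, j] = int(round(error / step))
                decoded[i, j] = prediction + codes[i, j] * step  # encoder tracks decoder
        return codes, decoded

    img = np.tile(np.arange(8) * 16.0, (8, 1))  # smooth synthetic test image
    codes, recon = dpcm2d_encode(img)
    print(np.abs(img - recon).max())  # reconstruction error bounded by step/2
    ```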

  4. 24 CFR 200.926b - Model codes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 Housing and Urban Development, Minimum Property Standards: § 200.926b Model codes. (a) Incorporation by reference. The following model code publications are incorporated by reference in accordance…

  5. Challenges in Integrating Nondestructive Evaluation and Finite Element Methods for Realistic Structural Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Zagidulin, Dmitri; Rauser, Richard W.

    2000-01-01

    Capabilities and expertise related to the development of links between nondestructive evaluation (NDE) and finite element analysis (FEA) at Glenn Research Center (GRC) are demonstrated. Current tools to analyze data produced by computed tomography (CT) scans are exercised to help assess the damage state in high temperature structural composite materials. A utility translator was written to convert Velocity (an image-processing software package) STL data files to a suitable CAD/FEA file type. Finite element analyses are carried out with MARC, a commercial nonlinear finite element code, and the analytical results are discussed. Modeling was established by building an MSC/Patran-generated model (MSC/Patran is a pre- and post-processing finite element package) and comparing it to a model generated by Velocity in conjunction with MSC/Patran Graphics. Modeling issues and results are discussed in this paper. The entire process that outlines the tie between the data extracted via NDE and the finite element modeling and analysis is fully described.
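
    To illustrate the kind of translation step described, here is a hedged sketch assuming ASCII STL input (the actual file formats and translator used at GRC are not given in the record). The facet vertices of an STL file are collected into a node table and connectivity list suitable for downstream meshing:

    ```python
    def read_ascii_stl(path):
        """Collect unique vertices and triangle connectivity from an ASCII STL
        file. Returns (nodes, triangles), where triangles index the node list."""
        nodes, index, triangles, current = [], {}, [], []
        with open(path) as f:
            for line in f:
                parts = line.split()
                if parts[:1] == ["vertex"]:
                    xyz = tuple(float(v) for v in parts[1:4])
                    if xyz not in index:          # deduplicate shared vertices
                        index[xyz] = len(nodes)
                        nodes.append(xyz)
                    current.append(index[xyz])
                    if len(current) == 3:         # one facet complete
                        triangles.append(tuple(current))
                        current = []
        return nodes, triangles
    ```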

  6. Lower- and higher-level models of right hemisphere language. A selective survey.

    PubMed

    Gainotti, Guido

    2016-01-01

    The models advanced to explain right hemisphere (RH) language function can be divided into two main types. According to the older (lower-level) models, RH language reflects the ontogenesis of conceptual and semantic-lexical development; the more recent models, on the other hand, suggest that the RH plays an important role in the use of higher-level language functions, such as metaphors, to convey complex, abstract concepts. The hypothesis that the RH may be preferentially involved in processing the semantic-lexical components of language was advanced by Zaidel in split-brain patients, and his model was confirmed by neuropsychological investigations proving that right brain-damaged patients show selective semantic-lexical disorders. The possible links between lower and higher levels of RH language are discussed, as is the hypothesis that the RH may have privileged access to the figurative aspects of novel metaphorical expressions, whereas conventionalization of metaphorical meaning could be a bilaterally-mediated process involving abstract semantic-lexical codes.

  7. Enhanced modeling features within TREETOPS

    NASA Technical Reports Server (NTRS)

    Vandervoort, R. J.; Kumar, Manoj N.

    1989-01-01

    The original motivation for TREETOPS was to build a generic multi-body simulation and remove the burden of writing multi-body equations from the engineers. The motivation of the enhancement was twofold: (1) to extend the menu of built-in features (sensors, actuators, constraints, etc.) that do not require user code; and (2) to extend the control system design capabilities by linking with other government-funded software (NASTRAN and MATLAB). These enhancements also serve to bridge the gap between structures and controls groups. It is common on large space programs for the structures group to build high-fidelity models of the structure using NASTRAN and for the controls group to build lower-order models because they lack the tools to incorporate the former into their analysis. Now the controls engineers can accept the high-fidelity NASTRAN models into TREETOPS, add sensors and actuators, perform model reduction, and couple the result directly into MATLAB to perform their design. The controller can then be imported directly into TREETOPS for non-linear, time-history simulation.
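
    The model-reduction step mentioned above is commonly done by modal truncation; the generic sketch below keeps only the lowest-frequency modes of a structural model as a projection basis (this is a standard technique, not the TREETOPS implementation, and the toy model is invented):

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def modal_truncation(K, M, n_keep):
        """Reduce a structural model (stiffness K, mass M) by projecting onto
        the n_keep lowest-frequency mode shapes."""
        eigvals, eigvecs = eigh(K, M)   # generalized eigenproblem K v = w^2 M v
        basis = eigvecs[:, :n_keep]     # columns sorted by ascending frequency
        K_r = basis.T @ K @ basis       # reduced stiffness
        M_r = basis.T @ M @ basis       # reduced mass
        return K_r, M_r, basis

    # Toy 4-DOF spring-mass chain, purely illustrative.
    K = 2 * np.eye(4) - np.eye(4, k=1) - np.eye(4, k=-1)
    M = np.eye(4)
    K_r, M_r, basis = modal_truncation(K, M, 2)
    print(K_r.shape)  # (2, 2)
    ```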

  8. Computational Nuclear Physics and Post Hartree-Fock Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lietz, Justin; Novario, Sam; Hjorth-Jensen, M.

    We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing the reader to start writing his or her own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.
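
    In the spirit of the chapter's code examples, here is a minimal sketch of a second-order many-body perturbation theory (MBPT2) energy correction, E2 = (1/4) Σ |⟨ij||ab⟩|² / (e_i + e_j − e_a − e_b), with randomly generated matrix elements standing in for a real antisymmetrized interaction (purely illustrative; this is not the chapter's code):

    ```python
    import numpy as np

    def mbpt2_energy(eps, V, n_holes):
        """Second-order MBPT correction: sum over hole states i, j and
        particle states a, b of 1/4 |V[i,j,a,b]|^2 / (e_i + e_j - e_a - e_b)."""
        holes = range(n_holes)
        particles = range(n_holes, len(eps))
        e2 = 0.0
        for i in holes:
            for j in holes:
                for a in particles:
                    for b in particles:
                        denom = eps[i] + eps[j] - eps[a] - eps[b]
                        e2 += 0.25 * abs(V[i, j, a, b]) ** 2 / denom
        return e2

    rng = np.random.default_rng(0)
    eps = np.array([-4.0, -3.0, 1.0, 2.0])         # toy single-particle spectrum
    V = rng.normal(scale=0.1, size=(4, 4, 4, 4))   # stand-in matrix elements
    print(mbpt2_energy(eps, V, n_holes=2))
    ```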

  9. PRECONDITIONED CONJUGATE-GRADIENT 2 (PCG2), a computer program for solving ground-water flow equations

    USGS Publications Warehouse

    Hill, Mary C.

    1990-01-01

    This report documents PCG2, a numerical code to be used with the U.S. Geological Survey modular three-dimensional, finite-difference, ground-water flow model. PCG2 uses the preconditioned conjugate-gradient method to solve the equations produced by the model for hydraulic head. Linear or nonlinear flow conditions may be simulated. PCG2 includes two preconditioning options: modified incomplete Cholesky preconditioning, which is efficient on scalar computers; and polynomial preconditioning, which requires less computer storage and, with modifications that depend on the computer used, is most efficient on vector computers. Convergence of the solver is determined using both head-change and residual criteria. Nonlinear problems are solved using Picard iterations. This documentation provides a description of the preconditioned conjugate-gradient method and the two preconditioners, detailed instructions for linking PCG2 to the modular model, sample data inputs, a brief description of PCG2, and a FORTRAN listing.
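
    As a minimal illustration of the preconditioned conjugate-gradient iteration itself, the sketch below uses a simple Jacobi (diagonal) preconditioner as a stand-in for the modified incomplete Cholesky and polynomial options that PCG2 actually offers, applied to a toy finite-difference matrix:

    ```python
    import numpy as np

    def pcg(A, b, tol=1e-10, max_iter=200):
        """Preconditioned conjugate gradients for symmetric positive definite A,
        with a Jacobi preconditioner M = diag(A)."""
        x = np.zeros_like(b)
        r = b - A @ x
        M_inv = 1.0 / np.diag(A)
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:       # residual convergence criterion
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Toy 1-D finite-difference matrix, illustrative only.
    n = 10
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    print(np.allclose(A @ pcg(A, b), b))
    ```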

  10. Computational wing optimization and comparisons with experiment for a semi-span wing model

    NASA Technical Reports Server (NTRS)

    Waggoner, E. G.; Haney, H. P.; Ballhaus, W. F.

    1978-01-01

    A computational wing optimization procedure was developed and verified by an experimental investigation of a semi-span variable-camber wing model in the NASA Ames Research Center 14-foot transonic wind tunnel. The Bailey-Ballhaus transonic potential flow analysis and Woodward-Carmichael linear theory codes were linked to Vanderplaats' constrained minimization routine to optimize model configurations at several subsonic and transonic design points. The 35-deg swept wing is characterized by multi-segmented leading and trailing edge flaps whose hinge lines are swept relative to the leading and trailing edges of the wing. By varying the deflection angles of the flap segments, camber and twist distribution can be optimized for different design conditions. Results indicate that numerical optimization can be both an effective and an efficient design tool. The optimized configurations had lift-to-drag ratios at the design points as good as or better than the best designs previously tested during an extensive parametric study.
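
    The pattern of wrapping an aerodynamic analysis inside a constrained minimizer can be sketched as follows; the quadratic drag model and linear lift constraint here are invented placeholders for the Bailey-Ballhaus and Woodward-Carmichael analyses, and the routine shown is SciPy's general-purpose minimizer rather than the Vanderplaats code:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def drag(deflections):
        """Placeholder analysis: drag grows quadratically away from a preferred
        camber/twist distribution. A real code would run a flow solution here."""
        target = np.array([2.0, 1.0, -0.5])
        return float(np.sum((deflections - target) ** 2)) + 0.01

    def lift(deflections):
        """Placeholder lift model, linear in the flap deflection angles."""
        return float(np.array([0.3, 0.2, 0.1]) @ deflections)

    result = minimize(
        drag,
        x0=np.zeros(3),                            # initial flap deflections (deg)
        constraints=[{"type": "ineq", "fun": lambda d: lift(d) - 0.5}],  # lift >= 0.5
        bounds=[(-10, 10)] * 3,                    # mechanical deflection limits
    )
    print(result.x, drag(result.x), lift(result.x))
    ```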

  11. Multivariate temporal dictionary learning for EEG.

    PubMed

    Barthélemy, Q; Gouy-Pailler, C; Isaac, Y; Souloumiac, A; Larue, A; Mars, J I

    2013-04-30

    This article addresses the issue of representing electroencephalographic (EEG) signals in an efficient way. While classical approaches use a fixed Gabor dictionary to analyze EEG signals, this article proposes a data-driven method to obtain an adapted dictionary. To reach an efficient dictionary learning, appropriate spatial and temporal modeling is required. Inter-channel links are taken into account in the spatial multivariate model, and shift-invariance is used for the temporal model. Multivariate learned kernels are informative (a few atoms encode much of the signal energy) and interpretable (the atoms can have a physiological meaning). Using real EEG data, the proposed method is shown to outperform the classical multichannel matching pursuit used with a Gabor dictionary, as measured by the representative power of the learned dictionary and its spatial flexibility. Moreover, dictionary learning can capture interpretable patterns: this ability is illustrated on real data, learning a P300 evoked potential.
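
    For reference, the matching-pursuit baseline the authors compare against can be sketched in a few lines for the single-channel case; the random dictionary and signal below are toys, not the multivariate shift-invariant version developed in the paper:

    ```python
    import numpy as np

    def matching_pursuit(signal, dictionary, n_atoms):
        """Greedy sparse coding: at each step pick the unit-norm dictionary
        atom (column) with the largest correlation to the current residual."""
        residual = signal.astype(float).copy()
        coefficients = np.zeros(dictionary.shape[1])
        for _ in range(n_atoms):
            correlations = dictionary.T @ residual
            k = int(np.argmax(np.abs(correlations)))
            coefficients[k] += correlations[k]
            residual -= correlations[k] * dictionary[:, k]
        return coefficients, residual

    rng = np.random.default_rng(1)
    D = rng.normal(size=(64, 128))
    D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
    x = 2.0 * D[:, 3] - 1.5 * D[:, 40]      # signal built from two atoms
    coef, res = matching_pursuit(x, D, n_atoms=5)
    print(np.nonzero(coef)[0], np.linalg.norm(res))  # residual shrinks per step
    ```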

  12. G =  MAT: linking transcription factor expression and DNA binding data.

    PubMed

    Tretyakov, Konstantin; Laur, Sven; Vilo, Jaak

    2011-01-31

    Transcription factors are proteins that bind to motifs on the DNA and thus affect gene expression regulation. The qualitative description of the corresponding processes is therefore important for a better understanding of essential biological mechanisms. However, wet lab experiments targeted at the discovery of the regulatory interplay between transcription factors and binding sites are expensive. We propose a new, purely computational method for finding putative associations between transcription factors and motifs. This method is based on a linear model that combines sequence information with expression data. We present various methods for model parameter estimation and show, via experiments on simulated data, that these methods are reliable. Finally, we examine the performance of this model on biological data and conclude that it can indeed be used to discover meaningful associations. The developed software is available as a web tool and Scilab source code at http://biit.cs.ut.ee/gmat/.
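
    The core of such a linear model can be sketched with ordinary least squares: given per-gene motif counts and a measured readout, fit the association weights that best explain the readout. The matrices below are random placeholders, and G=MAT's actual estimators are those described in the paper:

    ```python
    import numpy as np

    # Placeholder data: motif counts per promoter (n_genes x n_motifs) and a
    # measured binding/expression readout per gene. Random, for illustration.
    rng = np.random.default_rng(2)
    motif_counts = rng.poisson(2.0, size=(200, 10)).astype(float)
    true_weights = rng.normal(size=10)
    readout = motif_counts @ true_weights + rng.normal(scale=0.1, size=200)

    # Ordinary least-squares estimate of the motif-to-readout association weights.
    weights, *_ = np.linalg.lstsq(motif_counts, readout, rcond=None)
    print(np.round(weights - true_weights, 2))  # recovery error is small
    ```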

  13. G = MAT: Linking Transcription Factor Expression and DNA Binding Data

    PubMed Central

    Tretyakov, Konstantin; Laur, Sven; Vilo, Jaak

    2011-01-01

    Transcription factors are proteins that bind to motifs on the DNA and thus affect gene expression regulation. The qualitative description of the corresponding processes is therefore important for a better understanding of essential biological mechanisms. However, wet lab experiments targeted at the discovery of the regulatory interplay between transcription factors and binding sites are expensive. We propose a new, purely computational method for finding putative associations between transcription factors and motifs. This method is based on a linear model that combines sequence information with expression data. We present various methods for model parameter estimation and show, via experiments on simulated data, that these methods are reliable. Finally, we examine the performance of this model on biological data and conclude that it can indeed be used to discover meaningful associations. The developed software is available as a web tool and Scilab source code at http://biit.cs.ut.ee/gmat/. PMID:21297945

  14. Genetic Variation Linked to Lung Cancer Survival in White Smokers | Center for Cancer Research

    Cancer.gov

    CCR investigators have discovered evidence that links lung cancer survival with genetic variations (called single nucleotide polymorphisms) in the MBL2 gene, a key player in innate immunity. The variations in the gene, which codes for a protein called the mannose-binding lectin, occur in its promoter region, where the RNA polymerase molecule binds to start transcription…

  15. The Winding Road to Discovering the Link between Genetic Material and DNA

    ERIC Educational Resources Information Center

    Cherif, Abour H.; Roze, Maris; Movahedzadeh, Farahnaz

    2015-01-01

    This is an account of the three-century-long journey to the discovery of the link between DNA and the transformation principle of heredity, beginning with the discovery of the cell in 1665 and leading up to the 1953 discovery of the genetic code and the structure of DNA. This account also illustrates the way science works and how scientists do…

  16. Children's Perceptions of Family Relationships as Assessed in a Doll Story Completion Task: Links to Parenting, Social Competence, and Externalizing Behavior

    ERIC Educational Resources Information Center

    Laible, Deborah; Carlo, Gustavo; Torquati, Julia; Ontai, Lenna

    2004-01-01

    This study was designed to examine the links between parenting, children's perceptions of family relationships, and children's social behavior. Seventy-four children (M age = 6.01 years; 39 boys; 35 girls) and their parents took part in the study. Children completed relationship-oriented doll stories that were coded for coherence, prosocial…

  17. Programmer's reference manual for the VAX-Gerber link software package. Revision 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isobe, G.W.

    1985-10-01

    This guide provides the information necessary to edit, modify, and run the VAX-Gerber software link. Since the project is in the testing stage and still being modified, this guide discusses the final desired stage along with the current stage. The current stage is set up so as to allow the programmer to easily modify and update codes as necessary.

  18. Structure and application of an interface program between a geographic-information system and a ground-water flow model

    USGS Publications Warehouse

    Van Metre, P.C.

    1990-01-01

    A computer-program interface between a geographic-information system and a groundwater flow model links two unrelated software systems for use in developing the flow models. The interface program allows the modeler to compile and manage geographic components of a groundwater model within the geographic information system. A significant savings of time and effort is realized in developing, calibrating, and displaying the groundwater flow model. Four major guidelines were followed in developing the interface program: (1) no changes to the groundwater flow model code were to be made; (2) a data structure was to be designed within the geographic information system that follows the same basic data structure as the groundwater flow model; (3) the interface program was to be flexible enough to support all basic data options available within the model; and (4) the interface program was to be as efficient as possible in terms of computer time used and online-storage space needed. Because some programs in the interface are written in control-program language, the interface will run only on a computer with the PRIMOS operating system. (USGS)
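
    Guideline (2), mirroring the model's data structure inside the GIS, amounts in practice to storing attributes on the same row-column grid the flow model uses. A generic sketch of exporting such a grid as a model-ready array file follows; the file name, attribute, and fixed format are illustrative assumptions, not the interface program's actual formats:

    ```python
    import numpy as np

    def write_model_array(grid, path, fmt="%10.3e", per_line=8):
        """Write a 2-D attribute grid (rows x columns, matching the flow-model
        grid) as fixed-format text, a common input layout for grid-based models."""
        with open(path, "w") as f:
            for row in np.asarray(grid):
                for start in range(0, len(row), per_line):
                    chunk = row[start:start + per_line]
                    f.write("".join(fmt % v for v in chunk) + "\n")

    hydraulic_conductivity = np.full((4, 12), 1.5e-4)  # illustrative values, m/s
    write_model_array(hydraulic_conductivity, "hk_layer1.txt")
    ```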

  19. Small Engine Technology (SET) - Task 14 Axisymmetric Engine Simulation Environment

    NASA Technical Reports Server (NTRS)

    Miller, Max J.

    1999-01-01

    As part of the NPSS (Numerical Propulsion Simulation System) project, NASA Lewis has a goal of developing a U.S. industry standard for an axisymmetric engine simulation environment. In this program, AlliedSignal Engines (AE) contributed to this goal by evaluating the ENG20 software and developing support tools. ENG20 is a NASA-developed axisymmetric engine simulation tool. The project was divided into six subtasks: (1) Evaluate the capabilities of the ENG20 code using an existing test case to see how this procedure can capture the component interactions for a full engine. (2) Link AE's compressor and turbine axisymmetric streamline curvature codes (UD0300M and TAPS) with ENG20, which will provide the necessary boundary conditions for an ENG20 engine simulation. (3) Evaluate GE's Global Data System (GDS), attempting to use GDS to do the linking of codes described in Subtask 2. (4) Use a turbofan engine test case to evaluate various aspects of the system, including the linkage of UD0300M and TAPS with ENG20 and the GE data storage system; compare the solution results with cycle deck results, axisymmetric solutions (UD0300M and TAPS), and test data to determine the accuracy of the solution; and evaluate the order of accuracy and the convergence time for the solution. (5) Provide a monthly status report and a final formal report documenting AE's evaluation of ENG20. (6) Provide the developed interfaces that link UD0300M and TAPS with ENG20 to NASA. The interface that links UD0300M with ENG20 will be compatible with the industry version of UD0300M.
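
    The boundary-condition hand-off in Subtask 2 can be pictured with a generic coupling sketch: one component's exit state becomes the next component's inlet condition. The component names, profile contents, and numbers below are invented; this is not the UD0300M/TAPS/ENG20 interface itself:

    ```python
    from dataclasses import dataclass

    @dataclass
    class StationProfile:
        """Radial profile at an interface station between component codes."""
        radii: list               # radial locations (m)
        total_pressure: list      # Pa
        total_temperature: list   # K

    def run_compressor(inlet: StationProfile) -> StationProfile:
        """Stand-in for a streamline-curvature compressor solution: applies a
        uniform illustrative pressure ratio and temperature rise."""
        return StationProfile(
            radii=inlet.radii,
            total_pressure=[p * 8.0 for p in inlet.total_pressure],
            total_temperature=[t * 1.9 for t in inlet.total_temperature],
        )

    def run_combustor(inlet: StationProfile) -> StationProfile:
        """Stand-in for the downstream component consuming the hand-off."""
        return StationProfile(inlet.radii, inlet.total_pressure,
                              [t + 1200.0 for t in inlet.total_temperature])

    ambient = StationProfile([0.1, 0.2, 0.3], [101325.0] * 3, [288.15] * 3)
    print(run_combustor(run_compressor(ambient)).total_temperature)
    ```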

  20. High-precision two-way optic-fiber time transfer using an improved time code.

    PubMed

    Wu, Guiling; Hu, Liang; Zhang, Hao; Chen, Jianping

    2014-11-01

    We present a novel high-precision two-way optic-fiber time transfer scheme. The Inter-Range Instrumentation Group (IRIG-B) time code is modified by increasing the bit rate and defining new fields. The modified time code can be transmitted directly using commercial optical transceivers and is able to efficiently suppress the effect of Rayleigh backscattering in the optical fiber. A dedicated codec (encoder and decoder) with low delay fluctuation is developed. The synchronization issue is addressed by adopting a mask technique and combinational logic circuit. Its delay fluctuation is less than 27 ps in terms of the standard deviation. The two-way optic-fiber time transfer using the improved codec scheme is verified experimentally over 2 m to 100 km fiber links. The results show that the stability over the 100 km fiber link is always less than 35 ps, with a minimum value of about 2 ps at an averaging time of around 1000 s. The uncertainty of the time difference induced by chromatic dispersion over 100 km is less than 22 ps.
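
    The arithmetic underlying any two-way time transfer is worth stating: if each end timestamps its own transmission and the other end's reception, the (assumed symmetric) path delay cancels out of the offset estimate. A minimal sketch with invented timestamps:

    ```python
    def two_way_offset(t1, t2, t3, t4):
        """Clock offset of B relative to A from a two-way exchange, assuming a
        symmetric path: A sends at t1 (A clock), B receives at t2 (B clock),
        B sends at t3 (B clock), A receives at t4 (A clock)."""
        return ((t2 - t1) - (t4 - t3)) / 2.0

    def path_delay(t1, t2, t3, t4):
        """One-way path delay under the same symmetry assumption."""
        return ((t2 - t1) + (t4 - t3)) / 2.0

    # Invented timestamps (s): true offset 5e-9 s, true one-way delay 500e-6 s.
    t1, t2 = 0.0, 500e-6 + 5e-9
    t3, t4 = 1.0, 1.0 + 500e-6 - 5e-9
    print(two_way_offset(t1, t2, t3, t4), path_delay(t1, t2, t3, t4))
    ```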
