Sample records for instantiations examples include

  1. Memes and their themata.

    PubMed

    Miranker, Willard L

    2010-01-01

    When it is instantiated as a neuronal state, a meme is characterized as a phenotype in a novel neuronal sense. A thema is an instantiation of a meme as a conscious experience (a thought-meme). It is a primitive to which no location may be attributed, and it serves as a canonical representative of a class of memes. Memes in such a class may have physical or ideal (Platonic) instantiations. Pairing of this memetic phenotype characterization with the ideal thematic primitive is an example of other pairings in nature that are identified, and in particular it informs a description of the pairing of the unconscious mind and manifestations of consciousness. Interrelationship of these pairings is what illuminates aspects of each of them. These constructs support introduction of a consciousness thesis and then a notion of a dynamic self-referential grammar that generates a growing repertoire of consciousness manifestations. A method showing how a neuronal state generates a specific concept (thema) is introduced, and a sample of a class of examples is given. Pointers to experiments relevant to development of the thesis are given.

  2. A reference architecture for the component factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Cantone, Giovanni

    1992-01-01

    Software reuse can be achieved through an organization that focuses on utilization of life cycle products from previous developments. The component factory is both an example of the more general concepts of experience and domain factory and an organizational unit worth being considered independently. The critical features of such an organization are flexibility and continuous improvement. In order to achieve these features we can represent the architecture of the factory at different levels of abstraction and define a reference architecture from which specific architectures can be derived by instantiation. A reference architecture is an implementation and organization independent representation of the component factory and its environment. The paper outlines this reference architecture, discusses the instantiation process, and presents some examples of specific architectures by comparing them in the framework of the reference model.

  3. Safety Case Patterns: Theory and Applications

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh J.

    2015-01-01

    We develop the foundations for a theory of patterns of safety case argument structures, clarifying the concepts involved in pattern specification, including choices, labeling, and well-founded recursion. We specify six new patterns in addition to those existing in the literature. We give a generic way to specify the data required to instantiate patterns and a generic algorithm for their instantiation. This generalizes earlier work on generating argument fragments from requirements tables. We describe an implementation of these concepts in AdvoCATE, the Assurance Case Automation Toolset, showing how patterns are defined and can be instantiated. In particular, we describe how our extended notion of patterns can be specified, how they can be instantiated in an interactive manner, and, finally, how they can be automatically instantiated using our algorithm.
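A minimal sketch of the generic instantiation idea described above: substituting parameters from rows of a requirements table into an argument-pattern template. The pattern string, field names, and `instantiate` helper are hypothetical illustrations, not AdvoCATE's actual algorithm or data format.

```python
# Toy sketch (NOT AdvoCATE's implementation): instantiate an argument pattern
# by substituting parameters drawn from rows of a requirements table.

PATTERN = "Requirement {req} is satisfied because {evidence}"  # hypothetical pattern

def instantiate(pattern: str, table: list) -> list:
    """Produce one concrete argument fragment per table row."""
    return [pattern.format(**row) for row in table]

rows = [
    {"req": "R1", "evidence": "test T1 passed"},
    {"req": "R2", "evidence": "analysis A7 is complete"},
]
fragments = instantiate(PATTERN, rows)
```

The automated instantiation the paper describes generalizes this table-driven substitution to patterns with choices, labels, and recursion.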

  4. A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT

    2007-01-30

    The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exist, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. We present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.


  5. Risk-Hedged Approach for Re-Routing Air Traffic Under Weather Uncertainty

    NASA Technical Reports Server (NTRS)

    Sadovsky, Alexander V.; Bilimoria, Karl D.

    2016-01-01

    This presentation corresponds to a paper that explores a new risk-hedged approach for re-routing air traffic around forecast convective weather. In this work, flying through a more likely weather instantiation is considered to pose a higher level of risk. Current operational practice strategically plans re-routes to avoid only the most likely (highest risk) weather instantiation, and then tactically makes any necessary adjustments as the weather evolves. The risk-hedged approach strategically plans re-routes by minimizing the risk-adjusted path length, incorporating multiple possible weather instantiations with associated likelihoods (risks). The resulting model is transparent, is readily analyzed for realism, and can be treated with well-understood shortest-path algorithms. Risk-hedged re-routes are computed for some example weather instantiations. The main result is that in some scenarios, relative to an operational-practice proxy solution, the risk-hedged solution provides the benefits of lower risk as well as shorter path length. In other scenarios, the benefits of the risk-hedged solution are ambiguous, because the solution is characterized by a tradeoff between risk and path length. The risk-hedged solution can be executed in those scenarios where it provides a clear benefit over current operational practice.
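The shortest-path treatment mentioned above can be illustrated with a small sketch: edge cost is physical length plus a likelihood-weighted risk penalty, minimized with Dijkstra's algorithm. The graph, the additive cost form, and the `risk_weight` parameter are illustrative assumptions, not the paper's exact formulation.

```python
import heapq

# Illustrative sketch (not the paper's model): shortest path on a graph whose
# edge cost is length plus a risk penalty weighted by weather likelihood.

def risk_adjusted_path(graph, start, goal, risk_weight=1.0):
    """graph: node -> list of (neighbor, length, risk). Returns (cost, path)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, length, risk in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + length + risk_weight * risk,
                                    nbr, path + [nbr]))
    return float("inf"), []

# Two candidate routes: shorter but through likely weather (via B),
# or longer but clear (via C). The risk penalty flips the choice.
g = {"A": [("B", 10, 5.0), ("C", 12, 0.0)],
     "B": [("D", 10, 5.0)],
     "C": [("D", 12, 0.0)]}
best = risk_adjusted_path(g, "A", "D", risk_weight=1.0)
```

With `risk_weight=0` the sketch reduces to plain shortest physical path; increasing it trades path length for risk, mirroring the tradeoff the abstract describes.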

  6. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solis, John Hector

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement to computational security in order to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.

  8. Dual deep modeling: multi-level modeling with dual potencies and its formalization in F-Logic.

    PubMed

    Neumayr, Bernd; Schuetz, Christoph G; Jeusfeld, Manfred A; Schrefl, Michael

    2018-01-01

    An enterprise database contains a global, integrated, and consistent representation of a company's data. Multi-level modeling facilitates the definition and maintenance of such an integrated conceptual data model in a dynamic environment of changing data requirements of diverse applications. Multi-level models transcend the traditional separation of class and object with clabjects as the central modeling primitive, which allows for a more flexible and natural representation of many real-world use cases. In deep instantiation, the number of instantiation levels of a clabject or property is indicated by a single potency. Dual deep modeling (DDM) differentiates between source potency and target potency of a property or association and supports the flexible instantiation and refinement of the property by statements connecting clabjects at different modeling levels. DDM comes with multiple generalization of clabjects, subsetting/specialization of properties, and multi-level cardinality constraints. Examples are presented using a UML-style notation for DDM together with UML class and object diagrams for the representation of two-level user views derived from the multi-level model. Syntax and semantics of DDM are formalized and implemented in F-Logic, supporting the modeler with integrity checks and rich query facilities.
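The deep-instantiation mechanism sketched in this abstract can be made concrete with a toy model: a clabject carries a potency counting how many instantiation levels remain below it. The class name and three-level example are hypothetical illustrations, not the paper's F-Logic formalization (which additionally distinguishes source and target potencies).

```python
# Rough sketch of deep instantiation with a single potency (NOT the DDM
# dual-potency formalization): potency p means the clabject can still be
# instantiated across p further modeling levels.

class Clabject:
    def __init__(self, name: str, potency: int):
        self.name, self.potency = name, potency

    def instantiate(self, name: str) -> "Clabject":
        if self.potency == 0:
            raise ValueError("potency exhausted: a pure object cannot be instantiated")
        return Clabject(name, self.potency - 1)

product = Clabject("Product", 2)     # metaclass level, potency 2
book = product.instantiate("Book")   # class level, potency 1
copy = book.instantiate("MyCopy")    # object level, potency 0
```

DDM's contribution is to split this single potency into source and target potencies on properties and associations, so one property can be instantiated at different levels on each end.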

  9. Environmental modeling and recognition for an autonomous land vehicle

    NASA Technical Reports Server (NTRS)

    Lawton, D. T.; Levitt, T. S.; Mcconnell, C. C.; Nelson, P. C.

    1987-01-01

    An architecture for object modeling and recognition for an autonomous land vehicle is presented. Examples of objects of interest include terrain features, fields, roads, horizon features, trees, etc. The architecture is organized around a set of data bases for generic object models and perceptual structures, temporary memory for the instantiation of object and relational hypotheses, and a long term memory for storing stable hypotheses that are affixed to the terrain representation. Multiple inference processes operate over these databases. Researchers describe these particular components: the perceptual structure database, the grouping processes that operate over this, schemas, and the long term terrain database. A processing example that matches predictions from the long term terrain model to imagery, extracts significant perceptual structures for consideration as potential landmarks, and extracts a relational structure to update the long term terrain database is given.

  10. Teaching Scientific Concepts with Transparent Detector Models: An Example from Optics.

    ERIC Educational Resources Information Center

    Allen, Sue; And Others

    This paper describes an attempt to facilitate students' learning of scientific concepts by using detectors that take as input physical information and output an instantiation of the concept. The principle hypothesis was that students would have a better understanding of the concept of image if they were taught to use a simplified, runnable model…

  11. Fingerprinted circuits and methods of making and identifying the same

    NASA Technical Reports Server (NTRS)

    Ferguson, Michael Ian (Inventor)

    2011-01-01

    A circuit having a fingerprint for identification of a particular instantiation of the circuit is disclosed. The circuit may include a plurality of digital circuits or gates. Each of the digital circuits or gates is responsive to a configuration voltage applied to its analog input, which controls whether or not the digital circuit or gate performs its intended digital function; each transitions between its functional state and at least one other state when the configuration voltage equals a boundary voltage. The boundary voltage varies between different instantiations of the circuit for a majority of the digital circuits or gates, and these differing boundary voltages serve to identify (or fingerprint) different instantiations of the same circuit.
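The identification step implied by this patent can be sketched as nearest-fingerprint matching: compare a measured vector of boundary voltages against enrolled fingerprints. The voltage values, chip identifiers, and L1-distance criterion are hypothetical illustrations, not the patented method.

```python
# Illustrative sketch (not the patented method): identify a circuit instance
# by matching measured boundary voltages to the closest enrolled fingerprint.

def identify(measured, enrolled):
    """enrolled: chip id -> list of boundary voltages. Returns the id whose
    fingerprint is nearest to the measurement in L1 distance."""
    def dist(fingerprint):
        return sum(abs(m - f) for m, f in zip(measured, fingerprint))
    return min(enrolled, key=lambda chip_id: dist(enrolled[chip_id]))

enrolled = {"chipA": [1.10, 0.95, 1.20],
            "chipB": [1.02, 1.01, 1.15]}
match = identify([1.09, 0.96, 1.19], enrolled)
```

Because the boundary voltages differ between physical instantiations of the same design, a fresh measurement lands closest to the fingerprint of the chip it came from.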

  12. Fingerprinted circuits and methods of making and identifying the same

    NASA Technical Reports Server (NTRS)

    Ferguson, Michael Ian (Inventor)

    2012-01-01

    A circuit having a fingerprint for identification of a particular instantiation of the circuit is disclosed. The circuit may include a plurality of digital circuits or gates. Each of the digital circuits or gates is responsive to a configuration voltage applied to its analog input, which controls whether or not the digital circuit or gate performs its intended digital function; each transitions between its functional state and at least one other state when the configuration voltage equals a boundary voltage. The boundary voltage varies between different instantiations of the circuit for a majority of the digital circuits or gates, and these differing boundary voltages serve to identify (or fingerprint) different instantiations of the same circuit.

  13. Empowering genomic medicine by establishing critical sequencing result data flows: the eMERGE example.

    PubMed

    Aronson, Samuel; Babb, Lawrence; Ames, Darren; Gibbs, Richard A; Venner, Eric; Connelly, John J; Marsolo, Keith; Weng, Chunhua; Williams, Marc S; Hartzler, Andrea L; Liang, Wayne H; Ralston, James D; Devine, Emily Beth; Murphy, Shawn; Chute, Christopher G; Caraballo, Pedro J; Kullo, Iftikhar J; Freimuth, Robert R; Rasmussen, Luke V; Wehbe, Firas H; Peterson, Josh F; Robinson, Jamie R; Wiley, Ken; Overby Taylor, Casey

    2018-05-31

    The eMERGE Network is establishing methods for electronic transmittal of patient genetic test results from laboratories to healthcare providers across organizational boundaries. We surveyed the capabilities and needs of different network participants, established a common transfer format, and implemented transfer mechanisms based on this format. The interfaces we created are examples of the connectivity that must be instantiated before electronic genetic and genomic clinical decision support can be effectively built at the point of care. This work serves as a case example for both standards bodies and other organizations working to build the infrastructure required to provide better electronic clinical decision support for clinicians.

  14. The cost of concreteness: the effect of nonessential information on analogical transfer.

    PubMed

    Kaminski, Jennifer A; Sloutsky, Vladimir M; Heckler, Andrew F

    2013-03-01

    Most theories of analogical transfer focus on similarities between the learning and transfer domains, where transfer is more likely between domains that share common surface features, similar elements, or common interpretations of structure. We suggest that characteristics of the learning instantiation alone can give rise to different levels of transfer. We propose that concreteness of the learning instantiation can hinder analogical transfer of well-defined structured concepts, such as mathematical concepts. We operationalize the term concreteness as the amount of information communicated through a specific instantiation of a concept. The 5 reported experiments with undergraduate students tested the hypothesis by presenting participants with the concept of a commutative mathematical group of order 3. The experiments varied the level of concreteness of the training instantiation and measured transfer of learning to a new instantiation. The results support the hypothesis, demonstrating better transfer from more generic instantiations (i.e., ones that communicate minimal extraneous information) than from more concrete instantiations. Specifically, concreteness was found to create an obstacle to successful structural alignment across domains, whereas generic instantiations led to spontaneous structural alignment. These findings have important implications for the theory of learning and transfer and practical implications for the design of educational material. Although some concreteness may activate prior knowledge and perhaps offer a leg up in the learning process, this benefit may come at the cost of transfer.
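A "generic instantiation" of the study's target concept, the commutative group of order 3, can be written down directly: the elements {0, 1, 2} under addition modulo 3. The concrete symbols used in the experiments differed; this is simply one instantiation that communicates minimal extraneous information, in the sense the abstract describes.

```python
# One generic instantiation of the commutative group of order 3:
# elements {0, 1, 2} under addition modulo 3.

def op(a: int, b: int) -> int:
    return (a + b) % 3

elements = [0, 1, 2]

# Verify the group axioms for this instantiation:
closed = all(op(a, b) in elements for a in elements for b in elements)
commutative = all(op(a, b) == op(b, a) for a in elements for b in elements)
has_identity = all(op(a, 0) == a for a in elements)
has_inverses = all(any(op(a, b) == 0 for b in elements) for a in elements)
```

A concrete instantiation of the same structure (e.g. combining three physical objects by a rule isomorphic to `op`) carries extra surface detail; the study's finding is that such detail hinders structural alignment during transfer.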

  15. "Concreteness Fading" Promotes Transfer of Mathematical Knowledge

    ERIC Educational Resources Information Center

    McNeil, Nicole M.; Fyfe, Emily R.

    2012-01-01

    Recent studies have suggested that educators should avoid concrete instantiations when the goal is to promote transfer. However, concrete instantiations may benefit transfer in the long run, particularly if they are "faded" into more abstract instantiations. Undergraduates were randomly assigned to learn a mathematical concept in one of three…

  16. Localized lossless authentication watermark (LAW)

    NASA Astrophysics Data System (ADS)

    Celik, Mehmet U.; Sharma, Gaurav; Tekalp, A. Murat; Saber, Eli S.

    2003-06-01

    A novel framework is proposed for lossless authentication watermarking of images which allows authentication and recovery of original images without any distortions. This overcomes a significant limitation of traditional authentication watermarks that irreversibly alter image data in the process of watermarking and authenticate the watermarked image rather than the original. In particular, authenticity is verified before full reconstruction of the original image, whose integrity is inferred from the reversibility of the watermarking procedure. This reduces computational requirements in situations when either the verification step fails or the zero-distortion reconstruction is not required. A particular instantiation of the framework is implemented using a hierarchical authentication scheme and the lossless generalized-LSB data embedding mechanism. The resulting algorithm, called localized lossless authentication watermark (LAW), can localize tampered regions of the image; has a low embedding distortion, which can be removed entirely if necessary; and supports public/private key authentication and recovery options. The effectiveness of the framework and the instantiation is demonstrated through examples.
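The reversibility property at the heart of this framework can be illustrated with a toy: replace pixel LSBs with payload bits while retaining the original LSBs needed for exact recovery. The actual LAW scheme uses generalized-LSB embedding with compression of the residues inside the payload; storing them separately here is a simplifying assumption.

```python
# Minimal sketch of the reversibility idea behind lossless LSB watermarking
# (NOT the LAW algorithm, which compresses the original residues and embeds
# them alongside the authentication payload).

def embed(pixels, payload_bits):
    """Replace each pixel's least significant bit with a payload bit; also
    return the original LSBs required for zero-distortion recovery."""
    original_lsbs = [p & 1 for p in pixels]
    marked = [(p & ~1) | b for p, b in zip(pixels, payload_bits)]
    return marked, original_lsbs

def recover(marked, original_lsbs):
    """Undo the embedding, restoring the original pixels exactly."""
    return [(p & ~1) | b for p, b in zip(marked, original_lsbs)]

pixels = [120, 121, 130, 131]
marked, saved = embed(pixels, [1, 0, 1, 0])
restored = recover(marked, saved)
```

Because recovery is exact, verification can authenticate the watermarked image first and reconstruct the distortion-free original only when needed, as the abstract notes.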

  17. Replicated computational results (RCR) report for "BLIS: A framework for rapidly instantiating BLAS functionality"

    DOE PAGES

    Willenbring, James Michael

    2015-06-03

    “BLIS: A Framework for Rapidly Instantiating BLAS Functionality” includes single-platform BLIS performance results for both level-2 and level-3 operations that are competitive with OpenBLAS, ATLAS, and Intel MKL. A detailed description of the configuration used to generate the performance results was provided to the reviewer by the authors. All the software components used in the comparison were reinstalled, and new performance results were generated and compared to the original results. After completing this process, the published results are deemed replicable by the reviewer.

  18. Intelligent sensor and controller framework for the power grid

    DOEpatents

    Akyol, Bora A.; Haack, Jereme Nathan; Craig, Jr., Philip Allen; Tews, Cody William; Kulkarni, Anand V.; Carpenter, Brandon J.; Maiden, Wendy M.; Ciraci, Selim

    2015-07-28

    Disclosed below are representative embodiments of methods, apparatus, and systems for monitoring and using data in an electric power grid. For example, one disclosed embodiment comprises a sensor for measuring an electrical characteristic of a power line, electrical generator, or electrical device; a network interface; a processor; and one or more computer-readable storage media storing computer-executable instructions. In this embodiment, the computer-executable instructions include instructions for implementing an authorization and authentication module for validating a software agent received at the network interface; instructions for implementing one or more agent execution environments for executing agent code that is included with the software agent and that causes data from the sensor to be collected; and instructions for implementing an agent packaging and instantiation module for storing the collected data in a data container of the software agent and for transmitting the software agent, along with the stored data, to a next destination.

  19. Intelligent sensor and controller framework for the power grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Haack, Jereme Nathan; Craig, Jr., Philip Allen

    Disclosed below are representative embodiments of methods, apparatus, and systems for monitoring and using data in an electric power grid. For example, one disclosed embodiment comprises a sensor for measuring an electrical characteristic of a power line, electrical generator, or electrical device; a network interface; a processor; and one or more computer-readable storage media storing computer-executable instructions. In this embodiment, the computer-executable instructions include instructions for implementing an authorization and authentication module for validating a software agent received at the network interface; instructions for implementing one or more agent execution environments for executing agent code that is included with the software agent and that causes data from the sensor to be collected; and instructions for implementing an agent packaging and instantiation module for storing the collected data in a data container of the software agent and for transmitting the software agent, along with the stored data, to a next destination.

  20. vis-react-components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richardson, Gregory D; Goodall, John R; Steed, Chad A

    In developing visualizations for different data sets, the end solution often becomes dependent on the data being visualized, forcing engineers to re-develop many common components multiple times. The vis-react-components library was designed to enable creating visualizations that are independent of the underlying data. This library utilizes the React.js pattern of instantiating components that may be re-used. The library exposes an example application that allows other developers to understand how to use the components in the library.
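The data-independence idea behind such reusable components can be sketched outside React as well: a component takes caller-supplied accessor functions instead of assuming a fixed record schema. The function name and accessor signature below are hypothetical illustrations, not the library's API.

```python
# Sketch of the data-independence pattern (the actual library is React.js):
# a reusable "component" maps arbitrary records to plot coordinates via
# caller-supplied accessor functions rather than a hard-coded schema.

def scatter_points(data, x=lambda d: d["x"], y=lambda d: d["y"]):
    """Turn any sequence of records into (x, y) pairs using the accessors."""
    return [(x(d), y(d)) for d in data]

# The same component serves two unrelated data sets unchanged:
default_schema = scatter_points([{"x": 1, "y": 2}])
custom_schema = scatter_points([{"t": 0, "v": 9}],
                               x=lambda d: d["t"], y=lambda d: d["v"])
```

Swapping accessors, rather than rewriting the component, is what lets one visualization serve many data sets.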

  1. Enabling Communication and Navigation Technologies for Future Near Earth Science Missions

    NASA Technical Reports Server (NTRS)

    Israel, David J.; Heckler, Gregory; Menrad, Robert; Hudiburg, John; Boroson, Don; Robinson, Bryan; Cornwell, Donald

    2016-01-01

    In 2015, the Earth Regimes Network Evolution Study (ERNESt) proposed an architectural concept and technologies that evolve to enable space science and exploration missions out to the 2040 timeframe. The architectural concept evolves the current instantiations of the Near Earth Network and Space Network with new technologies to provide a global communication and navigation network that provides communication and navigation services to a wide range of space users in the near Earth domain. The technologies included High Rate Optical Communications, Optical Multiple Access (OMA), Delay Tolerant Networking (DTN), User Initiated Services (UIS), and advanced Position, Navigation, and Timing technology. This paper describes the key technologies and their current technology readiness levels. Examples of science missions that could be enabled by the technologies and the projected operational benefits of the architecture concept to missions are also described.

  2. Simple re-instantiation of small databases using cloud computing.

    PubMed

    Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M

    2013-01-01

    Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.

  3. Simple re-instantiation of small databases using cloud computing

    PubMed Central

    2013-01-01

    Background Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. Results We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Conclusions Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear. PMID:24564380

  4. Interactions between spontaneous instantiations to the basic level and post-event suggestions.

    PubMed

    Pansky, Ainat; Tenenboim, Einat

    2011-11-01

    Extensive research shows that post-event suggestions can distort the memory for a target event. In this study we examined the effect of such suggestions as they interact with the products of a spontaneous memory process: instantiation of abstract information to an intermediate level of abstractness, the basic level (Pansky & Koriat, 2004). Participants read a narrative containing items presented at the superordinate level (e.g., FRUIT), were exposed to suggestions that referred to these items at the basic level (e.g., APPLE), and were finally asked to recall the original items. We found that the tendency to instantiate spontaneously in the control (non-misleading) condition, particularly over time, increased following exposure to suggestions that were likely to coincide with those instantiations. Exposure to such suggestions, either immediately or following a 24-hour delay, reduced subsequent correct recall of the original items only if the suggested information coincided with the information one tends to instantiate spontaneously in a given context. Suggestibility, in this case, was particularly pronounced and phenomenologically compelling in terms of remember/know judgements. The findings are taken to imply that effects of post-event suggestions can be understood in terms of the constructive processes that set the stage for their occurrence.

  5. Metaphors we think with: the role of metaphor in reasoning.

    PubMed

    Thibodeau, Paul H; Boroditsky, Lera

    2011-02-23

    The way we talk about complex and abstract ideas is suffused with metaphor. In five experiments, we explore how these metaphors influence the way that we reason about complex issues and forage for further information about them. We find that even the subtlest instantiation of a metaphor (via a single word) can have a powerful influence over how people attempt to solve social problems like crime and how they gather information to make "well-informed" decisions. Interestingly, we find that the influence of the metaphorical framing effect is covert: people do not recognize metaphors as influential in their decisions; instead they point to more "substantive" (often numerical) information as the motivation for their problem-solving decision. Metaphors in language appear to instantiate frame-consistent knowledge structures and invite structurally consistent inferences. Far from being mere rhetorical flourishes, metaphors have profound influences on how we conceptualize and act with respect to important societal issues. We find that exposure to even a single metaphor can induce substantial differences in opinion about how to solve social problems: differences that are larger, for example, than pre-existing differences in opinion between Democrats and Republicans.

  6. Ontology-Driven Business Modelling: Improving the Conceptual Representation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Gailly, Frederik; Poels, Geert

    Business modelling research is increasingly interested in exploring how domain ontologies can be used as reference models for business models. The Resource Event Agent (REA) ontology is a primary candidate for ontology-driven modelling of business processes because the REA point of view on business reality is close to the conceptual modelling perspective on business models. In this paper Ontology Engineering principles are employed to reengineer REA in order to make it more suitable for ontology-driven business modelling. The new conceptual representation of REA that we propose uses a single representation formalism, includes a more complete domain axiomatization (containing definitions of concepts, concept relations and ontological axioms), and is proposed as a generic model that can be instantiated to create valid business models. The effects of these proposed improvements on REA-driven business modelling are demonstrated using a business modelling example.

  7. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program that belongs to the class known among software experts as output truth-maintenance-systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial- intelligence software of the rule-based inference-engine type in a case in which data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
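The consistency check this abstract describes can be sketched as a test on combined rule consequences: a combination is rejected when it asserts a proposition and its negation, such as turning a device both on and off. The literal encoding and helper function are hypothetical illustrations, not the flight-software implementation.

```python
# Toy sketch (not the actual output TMS): decide whether the consequences of
# two or more activated rules can be combined without logical inconsistency.

def consistent(consequences: set) -> bool:
    """A set of literals is inconsistent if it contains both p and not_p."""
    return not any(("not_" + lit) in consequences for lit in consequences)

rule_a = {"device_on"}       # scenario: rule A turns the device on
rule_b = {"not_device_on"}   # scenario: rule B turns the device off
rule_c = {"valve_open"}      # scenario: rule C is unrelated

ok_combination = consistent(rule_a | rule_c)    # compatible consequences
bad_combination = consistent(rule_a | rule_b)   # device on AND off: rejected
```

An output TMS applying a check like this can prune inconsistent rule combinations from a scenario before the inference engine searches over them, shrinking the effective knowledge base when data are missing.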

  8. Object-oriented biomedical system modelling--the language.

    PubMed

    Hakman, M; Groth, T

    1999-11-01

    The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions, the language also includes formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input, output and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically, the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.

  9. Event-related potentials reveal the relations between feature representations at different levels of abstraction.

    PubMed

    Hannah, Samuel D; Shedden, Judith M; Brooks, Lee R; Grundy, John G

    2016-11-01

    In this paper, we use behavioural methods and event-related potentials (ERPs) to explore the relations between informational and instantiated features, as well as the relation between feature abstraction and rule type. Participants are trained to categorize two species of fictitious animals and then identify perceptually novel exemplars. Critically, two groups are given a perfectly predictive counting rule that, according to Hannah and Brooks (2009. Featuring familiarity: How a familiar feature instantiation influences categorization. Canadian Journal of Experimental Psychology/Revue Canadienne de Psychologie Expérimentale, 63, 263-275. Retrieved from http://doi.org/10.1037/a0017919), should orient them to using abstract informational features when categorizing the novel transfer items. A third group is taught a feature list rule, which should orient them to using detailed instantiated features. One counting-rule group were taught their rule before any exposure to the actual stimuli, and the other immediately after training, having learned the instantiations first. The feature-list group were also taught their rule after training. The ERP results suggest that at test, the two counting-rule groups processed items differently, despite their identical rule. This not only supports the distinction that informational and instantiated features are qualitatively different feature representations, but also implies that rules can readily operate over concrete inputs, in contradiction to traditional approaches that assume that rules necessarily act on abstract inputs.

  10. (E)pistemological Awareness, Instantiation of Methods, and Uninformed Methodological Ambiguity in Qualitative Research Projects

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Yendol-Hoppey, Diane; Smith, Jason Jude; Hayes, Sharon B.

    2009-01-01

    This article explores epistemological awareness and instantiation of methods, as well as uninformed ambiguity, in qualitative methodological decision making and research reporting. The authors argue that efforts should be made to make the research process, epistemologies, values, methodological decision points, and argumentative logic open,…

  11. Inquiry and Ideology: Teaching Everyday Forms of Historical Thinking

    ERIC Educational Resources Information Center

    Freedman, Eric B.

    2009-01-01

    In this design-based study, an eleven-week curricular module in recent American history was developed that departed from both the epistemology and ideology of traditional textbooks. The curriculum instantiated a constructivist epistemology by having students assess multiple historical narratives and sources of evidence. It instantiated a…

  12. Concrete Instantiations of Mathematics: A Double-Edged Sword

    ERIC Educational Resources Information Center

    Kaminski, Jennifer A.; Sloutsky, Vladimir M.; Heckler, Andrew F.

    2009-01-01

    What factors affect transfer of knowledge is a complex question. In recent research, the authors demonstrated that concreteness of the learning domain is one such factor (Kaminski, Sloutsky, & Heckler, 2008). Even when prompted and given no time delay, participants who learned a concrete instantiation of a mathematical concept failed to…

  13. Induction as Knowledge Integration

    NASA Technical Reports Server (NTRS)

    Smith, Benjamin D.; Rosenbloom, Paul S.

    1996-01-01

    Two key issues for induction algorithms are the accuracy of the learned hypothesis and the computational resources consumed in inducing that hypothesis. One of the most promising ways to improve performance along both dimensions is to make use of additional knowledge. Multi-strategy learning algorithms tackle this problem by employing several strategies for handling different kinds of knowledge in different ways. However, integrating knowledge into an induction algorithm can be difficult when the new knowledge differs significantly from the knowledge the algorithm already uses. In many cases the algorithm must be rewritten. This paper presents the Knowledge Integration framework for Induction (KII), which provides a uniform mechanism for integrating knowledge into induction. In theory, arbitrary knowledge can be integrated with this mechanism, but in practice the knowledge representation language determines both the knowledge that can be integrated and the costs of integration and induction. By instantiating KII with various set representations, algorithms can be generated at different trade-off points along these dimensions. One instantiation of KII, called RS-KII, is presented that can implement hybrid induction algorithms, depending on which knowledge it utilizes. RS-KII is demonstrated to implement AQ-11, as well as a hybrid algorithm that utilizes a domain theory and noisy examples. Other algorithms are also possible.

  14. Linear Time Algorithms to Restrict Insider Access using Multi-Policy Access Control Systems

    PubMed Central

    Mell, Peter; Shook, James; Harang, Richard; Gavrila, Serban

    2017-01-01

    An important way to limit malicious insiders from distributing sensitive information is to limit their access to information as tightly as possible. This has always been the goal of access control mechanisms, but individual approaches have been shown to be inadequate. Ensemble approaches of multiple methods instantiated simultaneously have been shown to restrict access more tightly, but approaches to do so have had limited scalability (resulting in exponential calculations in some cases). In this work, we take the Next Generation Access Control (NGAC) approach standardized by the American National Standards Institute (ANSI) and demonstrate its scalability. The existing publicly available reference implementations all use cubic algorithms, and thus NGAC was widely viewed as not scalable. The primary NGAC reference implementation took, for example, several minutes simply to display the set of files accessible to a user on a moderately sized system. In our approach, we make these cubic algorithms linear. We do this by reformulating the set-theoretic approach of the NGAC standard into a graph-theoretic approach and then applying standard graph algorithms. We can thus answer important access control decision questions (e.g., which files are available to a user and which users can access a file) using linear-time graph algorithms. We also provide a default linear-time mechanism to visualize and review user access rights for an ensemble of access control mechanisms. Our visualization appears to be a simple file directory hierarchy but is in reality an automatically generated structure abstracted from the underlying access control graph that works with any set of simultaneously instantiated access control policies. It also provides an implicit mechanism for symbolic linking that offers a powerful access capability. Our work thus provides the first efficient implementation of NGAC while enabling user privilege review through a novel visualization approach. This may help transition from concept to reality the idea of using ensembles of simultaneously instantiated access control methodologies, thereby limiting insider threat. PMID:28758045
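The set-to-graph reformulation can be illustrated with a minimal sketch. The policy graph below is hypothetical and this is not the NIST reference implementation, but it shows how "which files can this user access?" reduces to linear-time reachability.

```python
# Minimal sketch: users and attributes are interior nodes, objects are leaves,
# and policy assignments are directed edges. Breadth-first search visits each
# edge at most once, so privilege review runs in time linear in the graph size.
from collections import deque

edges = {  # hypothetical ensemble of simultaneously instantiated policies
    "alice": ["engineering", "project-x"],
    "engineering": ["design.doc"],
    "project-x": ["plan.doc", "budget.xls"],
}

def accessible(user):
    seen, queue, files = set(), deque([user]), []
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
                if nxt not in edges:  # leaves stand in for objects here
                    files.append(nxt)
    return sorted(files)

assert accessible("alice") == ["budget.xls", "design.doc", "plan.doc"]
```

The reverse question (which users can reach a file) is the same traversal on the transposed graph.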

  15. Service-Oriented Architecture (SOA) Instantiation within a Hard Real-Time, Deterministic Combat System Environment

    ERIC Educational Resources Information Center

    Moreland, James D., Jr

    2013-01-01

    This research investigates the instantiation of a Service-Oriented Architecture (SOA) within a hard real-time (stringent time constraints), deterministic (maximum predictability) combat system (CS) environment. There are numerous stakeholders across the U.S. Department of the Navy who are affected by this development, and therefore the system…

  16. A brain network instantiating approach and avoidance motivation.

    PubMed

    Spielberg, Jeffrey M; Miller, Gregory A; Warren, Stacie L; Engels, Anna S; Crocker, Laura D; Banich, Marie T; Sutton, Bradley P; Heller, Wendy

    2012-09-01

    Research indicates that dorsolateral prefrontal cortex (DLPFC) is important for pursuing goals, and areas of DLPFC are differentially involved in approach and avoidance motivation. Given the complexity of the processes involved in goal pursuit, DLPFC is likely part of a network that includes orbitofrontal cortex (OFC), cingulate, amygdala, and basal ganglia. This hypothesis was tested with regard to one component of goal pursuit, the maintenance of goals in the face of distraction. Examination of connectivity with motivation-related areas of DLPFC supported the network hypothesis. Differential patterns of connectivity suggest a distinct role for DLPFC areas, with one involved in selecting approach goals, one in selecting avoidance goals, and one in selecting goal pursuit strategies. Finally, differences in trait motivation moderated connectivity between DLPFC and OFC, suggesting that this connectivity is important for instantiating motivation. Copyright © 2012 Society for Psychophysiological Research.

  17. A Brain Network Instantiating Approach and Avoidance Motivation

    PubMed Central

    Spielberg, Jeffrey M.; Miller, Gregory A.; Warren, Stacie L.; Engels, Anna S.; Crocker, Laura D.; Banich, Marie T.; Sutton, Bradley P.; Heller, Wendy

    2015-01-01

    Research indicates that dorsolateral prefrontal cortex (DLPFC) is important for pursuing goals, and areas of DLPFC are differentially involved in approach and avoidance motivation. Given the complexity of the processes involved in goal pursuit, DLPFC is likely part of a network that includes orbitofrontal cortex (OFC), cingulate, amygdala, and basal ganglia. This hypothesis was tested with regard to one component of goal pursuit, the maintenance of goals in the face of distraction. Examination of connectivity with motivation-related areas of DLPFC supported the network hypothesis. Differential patterns of connectivity suggest a distinct role for DLPFC areas, with one involved in selecting approach goals, one in selecting avoidance goals, and one in selecting goal pursuit strategies. Finally, differences in trait motivation moderated connectivity between DLPFC and OFC, suggesting that this connectivity is important for instantiating motivation. PMID:22845892

  18. Reasoning about Function Objects

    NASA Astrophysics Data System (ADS)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
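A rough Python analogue of the methodology can make the idea concrete. The paper targets Eiffel, C#, and Scala; the contract-carrying wrapper below is a hypothetical illustration of attaching pure precondition and postcondition methods to a function object, so client code can be specified against the contract rather than a specific instantiation.

```python
# Sketch: a function object carrying side-effect-free pre/post methods.
# Clients that rely only on pre/post can be reasoned about abstractly,
# independently of which body is plugged in.
class FunctionObject:
    def __init__(self, body, pre, post):
        self._body, self.pre, self.post = body, pre, post

    def __call__(self, x):
        assert self.pre(x), "precondition violated"
        result = self._body(x)
        assert self.post(x, result), "postcondition violated"
        return result

# Hypothetical instantiation: integer square root on non-negative inputs.
isqrt = FunctionObject(
    body=lambda x: int(x ** 0.5),
    pre=lambda x: x >= 0,
    post=lambda x, r: r * r <= x < (r + 1) * (r + 1),
)
assert isqrt(10) == 3
```

A verifier in the authors' setting discharges these contracts statically; the runtime assertions here merely mimic that obligation.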

  19. A new statistic to express the uncertainty of kriging predictions for purposes of survey planning.

    NASA Astrophysics Data System (ADS)

    Lark, R. M.; Lapworth, D. J.

    2014-05-01

    It is well known that one advantage of kriging for spatial prediction is that, given the random effects model, the prediction error variance can be computed a priori for alternative sampling designs. This allows one to compare sampling schemes, in particular sampling at different densities, and so to decide on one which meets requirements in terms of the uncertainty of the resulting predictions. However, the planning of sampling schemes must account not only for statistical considerations but also for logistics and cost. This requires effective communication between statisticians, soil scientists and data users/sponsors such as managers, regulators or civil servants. In our experience the latter parties are not necessarily able to interpret the prediction error variance as a measure of uncertainty for decision making. In some contexts (particularly the solution of very specific problems at large cartographic scales, e.g. site remediation and precision farming) it is possible to translate uncertainty of predictions into a loss function directly comparable with the cost incurred in increasing precision. Often, however, sampling must be planned for more generic purposes (e.g. baseline or exploratory geochemical surveys). In this latter context the prediction error variance may be of limited value to a non-statistician who has to make a decision on sample intensity and associated cost. We propose an alternative criterion for these circumstances to aid communication between statisticians and data users about the uncertainty of geostatistical surveys based on different sampling intensities. The criterion is the consistency of estimates made from two non-coincident instantiations of a proposed sample design. We consider square sample grids; one instantiation is offset from the second by half the grid spacing along the rows and along the columns. If a sample grid is coarse relative to the important scales of variation in the target property, then the consistency of predictions from the two instantiations is expected to be small, and it can be increased by reducing the grid spacing. The measure of consistency is the correlation between estimates from the two instantiations of the sample grid, averaged over a grid cell. We call this the offset correlation; it can be calculated from the variogram. We propose that this measure is easier to grasp intuitively than the prediction error variance, and it has the advantage of an upper bound (1.0), which will aid its interpretation. This quality measure is illustrated for some hypothetical examples, considering both ordinary kriging and factorial kriging of the variable of interest. It is also illustrated using data on metal concentrations in the soil of north-east England.
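The offset-correlation idea can be illustrated numerically. The sketch below is an assumption-laden toy in one dimension: a known synthetic field and linear interpolation stand in for the authors' variogram-based kriging calculation. It shows only the qualitative behaviour, namely that a grid coarse relative to the fine-scale variation yields a lower offset correlation.

```python
# Toy offset correlation: predict a hypothetical 1-D field from two grids
# offset by half the spacing, then correlate the two sets of predictions.
import math

def offset_correlation(spacing):
    field = lambda x: math.sin(x) + 0.3 * math.sin(5 * x)  # hypothetical target
    targets = [x / 10 for x in range(10, 90)]              # common prediction points

    def predict(origin):
        grid = [origin + i * spacing for i in range(int(10 / spacing))]
        def interp(x):  # linear interpolation stands in for kriging here
            i = max(0, min(len(grid) - 2, int((x - origin) / spacing)))
            t = (x - grid[i]) / spacing
            return (1 - t) * field(grid[i]) + t * field(grid[i + 1])
        return [interp(x) for x in targets]

    a, b = predict(0.0), predict(spacing / 2)
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# A fine grid resolves the sin(5x) component; a coarse grid aliases it, so the
# two offset instantiations disagree and the offset correlation drops.
assert offset_correlation(0.2) > offset_correlation(1.5)
```

In the paper's setting the correlation is derived from the variogram rather than simulated, but the interpretation is the same: values near 1.0 indicate a sampling density adequate for the scales of variation present.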

  20. Planning and Realization of Complex Intentions in Traumatic Brain Injury and Normal Aging

    ERIC Educational Resources Information Center

    Kliegel, Matthias; Eschen, Anne; Thone-Otto, Angelika I. T.

    2004-01-01

    The realization of delayed intentions (i.e., prospective memory) is a highly complex process composed of four phases: intention formation, retention, re-instantiation, and execution. The aim of this study was to investigate if executive functioning impairments are related to problems in the formation, re-instantiation, and execution of a delayed…

  1. Pragmatic User Model Implementation in an Intelligent Help System.

    ERIC Educational Resources Information Center

    Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen

    1998-01-01

    Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…

  2. Generalizing a model beyond the inherence heuristic and applying it to beliefs about objective value.

    PubMed

    Wood, Graham

    2014-10-01

    The inherence heuristic is characterized as part of an instantiation of a more general model that describes the interaction between undeveloped intuitions, produced by System 1 heuristics, and developed beliefs, constructed by System 2 reasoning. The general model is described and illustrated by examining another instantiation of the process that constructs belief in objective moral value.

  3. Ontological Problem-Solving Framework for Dynamically Configuring Sensor Systems and Algorithms

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The deployment of ubiquitous sensor systems and algorithms has led to many challenges, such as matching sensor systems to compatible algorithms which are capable of satisfying a task. Compounding the challenges is the lack of the requisite knowledge models needed to discover sensors and algorithms and to subsequently integrate their capabilities to satisfy a specific task. A novel ontological problem-solving framework has been designed to match sensors to compatible algorithms to form synthesized systems, which are capable of satisfying a task, and then to assign the synthesized systems to high-level missions. The approach designed for the ontological problem-solving framework has been instantiated in the context of a persistent-surveillance prototype environment, which includes profiling sensor systems and algorithms to demonstrate proof-of-concept principles. Even though the problem-solving approach was instantiated with profiling sensor systems and algorithms, the ontological framework may be useful with other heterogeneous sensing-system environments. PMID:22163793
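The matching step can be sketched as a simple capability-coverage test. The sensor and algorithm names below are hypothetical, not drawn from the authors' ontology.

```python
# Sketch: a sensor matches an algorithm when its advertised capabilities
# cover the algorithm's required inputs; each matched pair is a candidate
# synthesized system that could then be assigned to a mission.
sensors = {"profiler-1": {"range", "silhouette"}, "cam-2": {"rgb"}}
algorithms = {"classify-human": {"silhouette"}, "track-color": {"rgb", "range"}}

def synthesize():
    return sorted((s, a)
                  for s, caps in sensors.items()
                  for a, needs in algorithms.items()
                  if needs <= caps)  # set inclusion: requirements covered

assert synthesize() == [("profiler-1", "classify-human")]
```

An ontology adds subsumption reasoning on top of this (a "silhouette" capability might satisfy a more general "shape" requirement), but plain set inclusion conveys the core discovery-and-matching idea.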

  4. Motion-adapted catheter navigation with real-time instantiation and improved visualisation

    PubMed Central

    Kwok, Ka-Wai; Wang, Lichao; Riga, Celia; Bicknell, Colin; Cheshire, Nicholas; Yang, Guang-Zhong

    2014-01-01

    The improvements to catheter manipulation by the use of robot-assisted catheter navigation for endovascular procedures include increased precision, stability of motion and operator comfort. However, navigation through the vasculature under fluoroscopic guidance is still challenging, mostly due to physiological motion and when tortuous vessels are involved. In this paper, we propose a motion-adaptive catheter navigation scheme based on shape modelling to compensate for these dynamic effects, permitting predictive and dynamic navigations. This allows for timed manipulations synchronised with the vascular motion. The technical contribution of the paper includes the following two aspects. Firstly, a dynamic shape modelling and real-time instantiation scheme based on sparse data obtained intra-operatively is proposed for improved visualisation of the 3D vasculature during endovascular intervention. Secondly, a reconstructed frontal view from the catheter tip using the derived dynamic model is used as an interventional aid to user guidance. To demonstrate the practical value of the proposed framework, a simulated aortic branch cannulation procedure is used with detailed user validation to demonstrate the improvement in navigation quality and efficiency. PMID:24744817

  5. Constructor theory of information

    PubMed Central

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  6. Towards a Context-Aware Proactive Decision Support Framework

    DTIC Science & Technology

    2013-11-15

    initiative that has developed text analytic technology that crosses the semantic gap into the area of event recognition and representation. The...recognizing operational context, and techniques for recognizing context shift. Additional research areas include: • Adequately capturing users...Universal Interaction Context Ontology [12] might serve as a foundation • Instantiating formal models of decision making based on information seeking

  7. Delayed Instantiation Bulk Operations for Management of Distributed, Object-Based Storage Systems

    DTIC Science & Technology

    2009-08-01

    source and destination object sets, while they have attribute pages to indicate that history . Fourth, we allow for operations to occur on any objects...client dialogue to the PostgreSQL database where server-side functions implement the service logic for the requests. The translation is done...to satisfy client requests, and performs delayed instantiation bulk operations. It is built around a PostgreSQL database with tables for storing

  8. Neurobiology of Schemas and Schema-Mediated Memory.

    PubMed

    Gilboa, Asaf; Marlatte, Hannah

    2017-08-01

    Schemas are superordinate knowledge structures that reflect abstracted commonalities across multiple experiences, exerting powerful influences over how events are perceived, interpreted, and remembered. Activated schema templates modulate early perceptual processing, as they get populated with specific informational instances (schema instantiation). Instantiated schemas, in turn, can enhance or distort mnemonic processing from the outset (at encoding), impact offline memory transformation and accelerate neocortical integration. Recent studies demonstrate distinctive neurobiological processes underlying schema-related learning. Interactions between the ventromedial prefrontal cortex (vmPFC), hippocampus, angular gyrus (AG), and unimodal associative cortices support context-relevant schema instantiation and schema mnemonic effects. The vmPFC and hippocampus may compete (as suggested by some models) or synchronize (as suggested by others) to optimize schema-related learning depending on the specific operationalization of schema memory. This highlights the need for more precise definitions of memory schemas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. CABARET: Rule Interpretation in a Hybrid Architecture

    DTIC Science & Technology

    1991-01-01

    reasoning. In this paper, we discuss CABARET in the application area of income tax law concerning the deduction for expenses relating to an office...then re-instantiated with knowledge relevant to an area of income tax law . The goal of the TAX-HYPO project was to determine how well the mechanisms...is currently instantiated is an area of U.S. Federal income tax law regarding the "home office deduction". The home office deduction domain is

  10. Enabling Communication and Navigation Technologies for Future Near Earth Science Missions

    NASA Technical Reports Server (NTRS)

    Israel, David J.; Heckler, Greg; Menrad, Robert J.; Hudiburg, John J.; Boroson, Don M.; Robinson, Bryan S.; Cornwell, Donald M.

    2016-01-01

    In 2015, the Earth Regimes Network Evolution Study (ERNESt) Team proposed a fundamentally new architectural concept, with enabling technologies, that defines an evolutionary pathway out to the 2040 timeframe in which an increasing user community comprised of more diverse space science and exploration missions can be supported. The architectural concept evolves the current instantiations of the Near Earth Network and Space Network through implementation of select technologies resulting in a global communication and navigation network that provides communication and navigation services to a wide range of space users in the Near Earth regime, defined as an Earth-centered sphere with radius of 2M Km. The enabling technologies include: High Rate Optical Communications, Optical Multiple Access (OMA), Delay Tolerant Networking (DTN), User Initiated Services (UIS), and advanced Position, Navigation, and Timing technology (PNT). This paper describes this new architecture, the key technologies that enable it and their current technology readiness levels. Examples of science missions that could be enabled by the technologies and the projected operational benefits of the architecture concept to missions are also described.

  11. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    NASA Astrophysics Data System (ADS)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve computational properties of a complex internal agent model, while preserving its behavioral properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioral agent model is obtained, which ensures a substantial increase of the computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, involving mutual affective-cognitive interactions.

  12. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  13. Semi Automatic Ontology Instantiation in the domain of Risk Management

    NASA Astrophysics Data System (ADS)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a semi-automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. This method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, and 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.
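The three-step pipeline might be caricatured as follows. This toy uses a single regular-expression pattern in place of real POS tagging and relation extraction, and the concept names are hypothetical rather than taken from the PRIMA ontology.

```python
# Toy pipeline: (steps 1-2) a naive "<subject> threatens <object>" pattern
# stands in for POS tagging plus semantic relation extraction; (step 3)
# instantiation files the validated pair under an ontology concept. The
# human-in-the-loop between steps 2 and 3 is reduced to a validate() callback.
import re

ontology = {"Risk": set(), "threatens": []}  # hypothetical concepts/relations

def extract_relations(text):
    return re.findall(r"(\w+) threatens (\w+)", text)

def instantiate(relations, validate=lambda rel: True):
    for subj, obj in relations:
        if validate((subj, obj)):           # human control/validation point
            ontology["Risk"].add(subj)
            ontology["threatens"].append((subj, obj))

instantiate(extract_relations("Flooding threatens infrastructure."))
assert ("Flooding", "infrastructure") in ontology["threatens"]
```

Because only the pattern layer touches the text, swapping in a different domain corpus leaves the instantiation step unchanged, which is the portability property the abstract highlights.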

  14. Does content affect whether users remember that Web pages were hyperlinked?

    PubMed

    Jones, Keith S; Ballew, Timothy V; Probst, C Adam

    2008-10-01

    We determined whether memory for hyperlinks improved when they represented relations between the contents of the Web pages. J. S. Farris (2003) found that memory for hyperlinks improved when they represented relations between the contents of the Web pages. However, Farris's (2003) participants could have used their knowledge of site content to answer questions about relations that were instantiated via the site's content and its hyperlinks. In Experiment 1, users navigated a Web site and then answered questions about relations that were instantiated only via content, only via hyperlinks, and via content and hyperlinks. Unlike Farris (2003), we split the latter into two sets. One asked whether certain content elements were related, and the other asked whether certain Web pages were hyperlinked. Experiment 2 replicated Experiment 1 with one modification: The questions that were asked about relations instantiated via content and hyperlinks were changed so that each question's wrong answer was also related to the question's target. Memory for hyperlinks improved when they represented relations instantiated within the content of the Web pages. This was true when (a) questions about content and hyperlinks were separated (Experiment 1) and (b) each question's wrong answer was also related to the question's target (Experiment 2). The accuracy of users' mental representations of local architecture depended on whether hyperlinks were related to the site's content. Designers who want users to remember hyperlinks should associate those hyperlinks with content that reflects the relation between the contents on the Web pages.

  15. Systematization of a set of closure techniques.

    PubMed

    Hausken, Kjell; Moxnes, John F

    2011-11-01

    Approximations in population dynamics are gaining popularity since stochastic models in large populations are time consuming even on a computer. Stochastic modeling causes an infinite set of ordinary differential equations for the moments. Closure models are useful since they recast this infinite set into a finite set of ordinary differential equations. This paper systematizes a set of closure approximations. We develop a system, which we call a power p closure of n moments, where 0≤p≤n. Keeling's (2000a,b) approximation with third order moments is shown to be an instantiation of this system which we call a power 3 closure of 3 moments. We present an epidemiological example and evaluate the system for third and fourth moments compared with Monte Carlo simulations. Copyright © 2011 Elsevier Inc. All rights reserved.
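As a concrete point of reference, simpler than the paper's general power-p scheme, the familiar normal closure truncates the moment hierarchy by setting the third central moment to zero, which expresses the third raw moment through the first two:

```python
# Normal (Gaussian) closure: with the third central moment set to zero,
# E[X^3] = 3*mu*var + mu^3, so the ODE for the third moment can be dropped
# and the moment hierarchy closes at second order.
def third_raw_moment_normal_closure(mu, var):
    return 3 * mu * var + mu ** 3

# For an exactly normal X the closure is exact: N(mu=2, var=1) has E[X^3] = 14.
assert third_raw_moment_normal_closure(2.0, 1.0) == 14.0
```

Keeling's third-order closure instead retains a nonzero third central moment derived from an assumed distributional form; in the paper's terminology that is a power 3 closure of 3 moments.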

  16. Exploring the Impact of Early Decisions in Variable Ordering for Constraint Satisfaction Problems.

    PubMed

    Ortiz-Bayliss, José Carlos; Amaya, Ivan; Conant-Pablos, Santiago Enrique; Terashima-Marín, Hugo

    2018-01-01

    When solving constraint satisfaction problems (CSPs), it is common practice to rely on heuristics to decide which variable should be instantiated at each stage of the search, and this ordering influences the search cost. Even so, and to the best of our knowledge, no earlier work has dealt with how first variable orderings affect the overall cost. In this paper, we explore the cost of finding high-quality orderings of variables within constraint satisfaction problems. We also study differences among the orderings produced by some commonly used heuristics and the way bad first decisions affect the search cost. One of the most important findings of this work confirms the paramount importance of first decisions. Another is the evidence that many of the existing variable ordering heuristics fail to appropriately select the first variable to instantiate. We propose a simple method to improve the early decisions of heuristics; by using it, the performance of the heuristics increases.
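
    The effect of first decisions is easy to reproduce with a toy solver. The following sketch is not the authors' code: it is a plain backtracking graph 3-colouring solver that counts value instantiations, run under two invented static orderings that differ in which variable goes first.

```python
def solve(domains, neighbors, order):
    """Backtracking search for a graph colouring; counts how many
    variable-value instantiations are tried along the way."""
    stats = {"instantiations": 0}
    assignment = {}

    def consistent(var, value):
        return all(assignment.get(n) != value for n in neighbors[var])

    def backtrack(i):
        if i == len(order):
            return True
        var = order[i]
        for value in domains[var]:
            stats["instantiations"] += 1
            if consistent(var, value):
                assignment[var] = value
                if backtrack(i + 1):
                    return True
                del assignment[var]
        return False

    found = backtrack(0)
    return found, dict(assignment), stats["instantiations"]

if __name__ == "__main__":
    # Invented 4-node graph; both orderings find a 3-colouring, but
    # the number of instantiations depends on the first variable.
    neighbors = {"a": {"b", "c", "d"}, "b": {"a", "c"},
                 "c": {"a", "b", "d"}, "d": {"a", "c"}}
    domains = {v: [0, 1, 2] for v in neighbors}
    for order in (["a", "c", "b", "d"], ["d", "b", "a", "c"]):
        found, colouring, cost = solve(domains, neighbors, order)
        print("first =", order[0], "found =", found, "cost =", cost)
```

    Even on this tiny instance the two orderings differ in cost (9 vs. 7 instantiations here); on hard instances the gap caused by a bad first variable can be orders of magnitude.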

  17. Exploring the Impact of Early Decisions in Variable Ordering for Constraint Satisfaction Problems

    PubMed Central

    Amaya, Ivan

    2018-01-01

    When solving constraint satisfaction problems (CSPs), it is common practice to rely on heuristics to decide which variable should be instantiated at each stage of the search, and this ordering influences the search cost. Even so, and to the best of our knowledge, no earlier work has dealt with how first variable orderings affect the overall cost. In this paper, we explore the cost of finding high-quality orderings of variables within constraint satisfaction problems. We also study differences among the orderings produced by some commonly used heuristics and the way bad first decisions affect the search cost. One of the most important findings of this work confirms the paramount importance of first decisions. Another is the evidence that many of the existing variable ordering heuristics fail to appropriately select the first variable to instantiate. We propose a simple method to improve the early decisions of heuristics; by using it, the performance of the heuristics increases. PMID:29681923

  18. The semantic architecture of the World-Wide Molecular Matrix (WWMM)

    PubMed Central

    2011-01-01

    The World-Wide Molecular Matrix (WWMM) is a ten-year project to create a peer-to-peer (P2P) system for the publication and collection of chemical objects, including over 250,000 molecules. It has now been instantiated in a number of repositories that include data encoded in Chemical Markup Language (CML) and linked by URIs and RDF. The technical specification and implementation are now complete. We discuss the types of architecture required to implement nodes in the WWMM and consider the social issues involved in adoption. PMID:21999475

  19. The semantic architecture of the World-Wide Molecular Matrix (WWMM).

    PubMed

    Murray-Rust, Peter; Adams, Sam E; Downing, Jim; Townsend, Joe A; Zhang, Yong

    2011-10-14

    The World-Wide Molecular Matrix (WWMM) is a ten-year project to create a peer-to-peer (P2P) system for the publication and collection of chemical objects, including over 250,000 molecules. It has now been instantiated in a number of repositories that include data encoded in Chemical Markup Language (CML) and linked by URIs and RDF. The technical specification and implementation are now complete. We discuss the types of architecture required to implement nodes in the WWMM and consider the social issues involved in adoption.

  20. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.

  1. Context-dependent control over attentional capture

    PubMed Central

    Cosman, Joshua D.; Vecera, Shaun P.

    2014-01-01

    A number of studies have demonstrated that the likelihood of a salient item capturing attention is dependent on the “attentional set” an individual employs in a given situation. The instantiation of an attentional set is often viewed as a strategic, voluntary process, relying on working memory systems that represent immediate task priorities. However, influential theories of attention and automaticity propose that goal-directed control can operate more or less automatically on the basis of longer-term task representations, a notion supported by a number of recent studies. Here, we provide evidence that longer-term contextual learning can rapidly and automatically influence the instantiation of a given attentional set. Observers learned associations between specific attentional sets and specific task-irrelevant background scenes during a training session, and in the ensuing test session simply reinstating particular scenes on a trial by trial basis biased observers to employ the associated attentional set. This directly influenced the magnitude of attentional capture, suggesting that memory for the context in which a task is performed can play an important role in the ability to instantiate a particular attentional set and overcome distraction by salient, task-irrelevant information. PMID:23025581

  2. Models of inhibitory control

    PubMed Central

    Logan, Gordon D.

    2017-01-01

    We survey models of response inhibition having different degrees of mathematical, computational and neurobiological specificity and generality. The independent race model accounts for performance of the stop-signal or countermanding task in terms of a race between GO and STOP processes with stochastic finishing times. This model affords insights into neurophysiological mechanisms that are reviewed by other authors in this volume. The formal link between the abstract GO and STOP processes and instantiating neural processes is articulated through interactive race models consisting of stochastic accumulator GO and STOP units. This class of model provides quantitative accounts of countermanding performance and replicates the dynamics of neural activity producing that performance. The interactive race can be instantiated in a network of biophysically plausible spiking excitatory and inhibitory units. Other models seek to account for interactions between units in frontal cortex, basal ganglia and superior colliculus. The strengths, weaknesses and relationships of the different models will be considered. We will conclude with a brief survey of alternative modelling approaches and a summary of problems to be addressed including accounting for differences across effectors, species, individuals, task conditions and clinical deficits. This article is part of the themed issue ‘Movement suppression: brain mechanisms for stopping and stillness’. PMID:28242727
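
    The independent race model lends itself to a compact Monte Carlo sketch (a hedged illustration, not the authors' implementation): GO finishing times are drawn from an invented normal distribution, the STOP process finishes at SSD + SSRT, and a response escapes inhibition whenever GO wins the race.

```python
import random

def p_respond(ssd, ssrt=200.0, go_mu=500.0, go_sd=100.0,
              n=20000, seed=0):
    """Estimate P(respond | stop signal) under the independent race
    model: the response escapes when the GO finishing time beats the
    STOP finishing time (ssd + ssrt). Parameters are illustrative."""
    rng = random.Random(seed)
    responded = sum(rng.gauss(go_mu, go_sd) < ssd + ssrt
                    for _ in range(n))
    return responded / n

if __name__ == "__main__":
    for ssd in (100, 250, 400):
        print(ssd, round(p_respond(ssd), 3))
```

    The resulting inhibition function rises with stop-signal delay (SSD), from roughly 0.02 at SSD = 100 ms to roughly 0.84 at SSD = 400 ms for these parameters, mirroring the classic countermanding result.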

  3. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  4. Instrumentation and control of harmonic oscillators via a single-board microprocessor-FPGA device.

    PubMed

    Picone, Rico A R; Davis, Solomon; Devine, Cameron; Garbini, Joseph L; Sidles, John A

    2017-04-01

    We report the development of an instrumentation and control system instantiated on a microprocessor-field programmable gate array (FPGA) device for a harmonic oscillator comprising a portion of a magnetic resonance force microscope. The specific advantages of the system are that it minimizes computation, increases maintainability, and reduces the technical barrier required to enter the experimental field of magnetic resonance force microscopy. Heterodyne digital control and measurement yields computational advantages. A single microprocessor-FPGA device improves system maintainability by using a single programming language. The system presented requires significantly less technical expertise to instantiate than the instrumentation of previous systems, yet integrity of performance is retained and demonstrated with experimental data.
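
    The heterodyne measurement at the heart of such a controller can be sketched as software lock-in detection. This is an illustrative stand-in (the instrument implements this in FPGA gateware, not Python); the test tone and sample rates below are invented.

```python
import math

def demodulate(samples, f_ref, f_s):
    """Software lock-in: mix the signal with quadrature references at
    f_ref and average (a crude low-pass filter) to recover the I and Q
    components, hence the slowly varying amplitude and phase."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, x in enumerate(samples):
        phase = 2.0 * math.pi * f_ref * k / f_s
        i_sum += x * math.cos(phase)
        q_sum += x * math.sin(phase)
    i, q = 2.0 * i_sum / n, 2.0 * q_sum / n
    return math.hypot(i, q), math.atan2(q, i)

if __name__ == "__main__":
    # Invented test tone: 1 kHz, amplitude 0.7, phase 0.3 rad,
    # sampled at 50 kHz for an integer number of periods.
    f_s, f_ref, amp, phi = 50000.0, 1000.0, 0.7, 0.3
    sig = [amp * math.cos(2.0 * math.pi * f_ref * k / f_s - phi)
           for k in range(5000)]
    print(demodulate(sig, f_ref, f_s))
```

    Acting on the demodulated amplitude and phase rather than the raw waveform is what lets the controller run at a far lower update rate than the oscillator frequency.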

  5. Instrumentation and control of harmonic oscillators via a single-board microprocessor-FPGA device

    NASA Astrophysics Data System (ADS)

    Picone, Rico A. R.; Davis, Solomon; Devine, Cameron; Garbini, Joseph L.; Sidles, John A.

    2017-04-01

    We report the development of an instrumentation and control system instantiated on a microprocessor-field programmable gate array (FPGA) device for a harmonic oscillator comprising a portion of a magnetic resonance force microscope. The specific advantages of the system are that it minimizes computation, increases maintainability, and reduces the technical barrier required to enter the experimental field of magnetic resonance force microscopy. Heterodyne digital control and measurement yields computational advantages. A single microprocessor-FPGA device improves system maintainability by using a single programming language. The system presented requires significantly less technical expertise to instantiate than the instrumentation of previous systems, yet integrity of performance is retained and demonstrated with experimental data.

  6. Balancing generality and specificity in component-based reuse

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.; Beck, Jon

    1992-01-01

    For a component industry to be successful, we must move beyond the current techniques of black box reuse and genericity to a more flexible framework supporting customization of components as well as instantiation and composition of components. Customization of components strikes a balance between creating dozens of variations of a base component and requiring the overhead of unnecessary features of an 'everything but the kitchen sink' component. We argue that design and instantiation of reusable components have competing criteria (design-for-use strives for generality; design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, we demonstrate how program slicing techniques can be applied to the customization of reusable components.
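
    As a hedged illustration of the slicing idea (an invented representation, not the paper's machinery): given each statement's defined and used variables, a backward slice keeps only the statements a slicing criterion transitively depends on, which is the kind of pruning that specializes a general component.

```python
def backward_slice(stmts, criterion_vars):
    """Backward slice over straight-line code (no control flow).
    Each statement is (lineno, defs, uses); walk backwards, keeping a
    statement when it defines a variable that is still needed, and
    propagating its own uses into the needed set."""
    needed = set(criterion_vars)
    kept = []
    for lineno, defs, uses in reversed(stmts):
        if defs & needed:
            kept.append(lineno)
            needed = (needed - defs) | uses
    return sorted(kept)

if __name__ == "__main__":
    # Toy program:  1: a = 1   2: b = 2   3: c = a + b   4: d = b * 2
    stmts = [(1, {"a"}, set()),
             (2, {"b"}, set()),
             (3, {"c"}, {"a", "b"}),
             (4, {"d"}, {"b"})]
    print(backward_slice(stmts, {"d"}))
```

    Slicing the four-line program on variable d keeps only lines 2 and 4; line 3, and whatever feature it carries, drops out of the customized component.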

  7. A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.

    PubMed

    Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos

    2015-01-01

    ADL is a formal language for expressing archetypes, independent of standards or domain. However, its specification is not precise enough regarding the specialization and semantics of archetypes, which causes difficulties in implementation and leaves few tools available. Archetypes may be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantics-oriented models, for example using OWL, a language defined by the W3C for defining and instantiating Web ontologies. OWL permits the user to define significant, detailed, precise, and consistent distinctions among classes, properties, and relations, ensuring greater consistency of knowledge than ADL techniques do. This paper presents a process for representing openEHR ADL archetypes as OWL ontologies. The process consists of converting ADL archetypes into OWL ontologies and validating the resulting ontologies using mutation testing.

  8. A 360° Vision for Virtual Organizations Characterization and Modelling: Two Intentional Level Aspects

    NASA Astrophysics Data System (ADS)

    Priego-Roche, Luz-María; Rieu, Dominique; Front, Agnès

    Nowadays, organizations aiming to be successful in an increasingly competitive market tend to group together into virtual organizations. Designing the information system (IS) of such a virtual organization on the basis of the ISs of the participating organizations is a real challenge. The IS of a virtual organization plays an important role in the collaboration and cooperation of the participants and in reaching the common goal. This article proposes criteria that allow virtual organizations to be identified and classified at an intentional level, as well as the information necessary for designing the organizations’ IS. Instantiating these criteria for a specific virtual organization and its participants allows simple graphical models to be generated in a modelling tool. The models are then used as the basis for IS design at the organizational and operational levels. The approach is illustrated by the example of the virtual organization UGRT (a regional stockbreeders union in Tabasco, Mexico).

  9. Rosen's (M,R) system in Unified Modelling Language.

    PubMed

    Zhang, Ling; Williams, Richard A; Gatherer, Derek

    2016-01-01

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly non-computable on a Turing machine. If (M,R) is truly non-computable, there are serious implications for the modelling of large biological networks in computer software. A body of work has now accumulated addressing Rosen's claim concerning (M,R) by attempting to instantiate it in various software systems. However, a conclusive refutation has remained elusive, principally because none of the attempts to date have unambiguously avoided the critique that they altered the properties of (M,R) in the coding process, producing merely approximate simulations of (M,R) rather than true computational models. In this paper, we use the Unified Modelling Language (UML), a diagrammatic notation standard, to express (M,R) as a system of objects having attributes, functions and relations. We believe that this instantiates (M,R) in such a way that none of the original properties of the system are corrupted in the process. Crucially, we demonstrate that (M,R) as classically represented in the relational biology literature is implicitly a UML communication diagram. Furthermore, since UML is formally compatible with object-oriented computing languages, instantiation of (M,R) in UML strongly implies its computability in object-oriented coding languages. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
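
    To make the computability intuition concrete, here is a hedged toy (not the paper's UML model): the relational wiring of (M,R), with metabolism f, repair Phi and replication beta each produced by another map in the system, expressed as Python higher-order functions. The arithmetic inside the maps is invented; only the wiring mirrors the relational diagram.

```python
def repair(offset):
    """Phi: from material of type B, (re)build a metabolism map f."""
    def metabolism(a):
        return a + offset        # f: A -> B (toy rule)
    return metabolism

def replication(f):
    """beta: from a metabolism f alone, rebuild the repair map Phi."""
    def new_repair(b):
        # Probe f to recover its defining offset (toy closure rule).
        return repair(f(b) - b)
    return new_repair

if __name__ == "__main__":
    f = repair(1)             # Phi instantiates metabolism f: a -> a + 1
    b = f(2)                  # metabolism produces B; b == 3
    Phi2 = replication(f)     # beta rebuilds a repair map from f
    f2 = Phi2(b)              # the rebuilt Phi re-instantiates f
    print(b, f2(10))
```

    That such mutually producing maps are directly expressible in an ordinary object-oriented or functional language is the intuition behind the computability argument; the contested question, as the abstract notes, is whether any such encoding preserves all of Rosen's properties.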

  10. A Biophysical Model for Hawaiian Coral Reefs: Coupling Local Ecology, Larval Transport and Climate Change

    NASA Astrophysics Data System (ADS)

    Kapur, M. R.

    2016-02-01

    Simulative models of reef ecosystems have been used to evaluate ecological responses to a myriad of disturbance events, including fishing pressure, coral bleaching, invasion by alien species, and nutrient loading. The Coral Reef Scenario Evaluation Tool (CORSET) has been developed and instantiated for both the Meso-American Reef (MAR) and South China Sea (SCS) regions. This model is novel in that it accounts for the many scales at which reef ecosystem processes take place and has a "bottom-up" structure wherein complex behaviors are not pre-programmed but emergent, making it highly portable to new systems. Local-scale dynamics are coupled across regions through larval connectivity matrices derived from sophisticated particle transport simulations that include key elements of larval behavior. By this approach, we are able to directly evaluate some of the potential consequences of larval connectivity patterns across a range of spatial scales and under multiple climate scenarios. This work develops and applies CORSET to the Main Hawaiian Islands under a suite of climate and ecological scenarios. We introduce an adaptation constant into reef-building coral dynamics to simulate observed resilience to bleaching events. This presentation will share results from the model's instantiation under two Representative Concentration Pathway climate scenarios, with emphasis upon larval connectivity dynamics, emergent coral tolerance to increasing thermal anomalies, and patterns of spatial fishing closures. Results suggest that under a business-as-usual scenario, thermal tolerance and herbivore removal will have synergistic effects on reef resilience.

  11. A Formal Basis for Safety Case Patterns

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2013-01-01

    By capturing common structures of successful arguments, safety case patterns provide an approach for reusing strategies for reasoning about safety. In the current state of the practice, patterns exist as descriptive specifications with informal semantics, which not only offer little opportunity for more sophisticated usage such as automated instantiation, composition and manipulation, but also impede standardization efforts and tool interoperability. To address these concerns, this paper gives (i) a formal definition for safety case patterns, clarifying both restrictions on the usage of multiplicity and well-founded recursion in structural abstraction, (ii) formal semantics for patterns, and (iii) a generic data model and algorithm for pattern instantiation. We illustrate our contributions by application to a new pattern, the requirements breakdown pattern, which builds upon our previous work.
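
    A minimal sketch of pattern instantiation in the spirit of (iii), with an invented pattern and data (the paper's data model is far richer): a parameterised goal is expanded once per data item, which is the multiplicity mechanism the formalisation has to constrain.

```python
def instantiate(pattern, data):
    """Expand a pattern node into one concrete goal per data item
    (multiplicity), substituting the formal parameter {req}."""
    return [pattern.format(req=item) for item in data]

if __name__ == "__main__":
    pattern = "Goal: requirement {req} is satisfied by the software"
    for goal in instantiate(pattern, ["REQ-1", "REQ-2", "REQ-3"]):
        print(goal)
```

    A formal semantics pins down exactly such questions as what happens when the data set is empty or when instantiated sub-patterns recurse, which informal pattern catalogues leave open.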

  12. Uniformity on the grid via a configuration framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Igor V Terekhov et al.

    2003-03-11

    As Grid computing permeates modern computing, Grid solutions continue to emerge and take shape. Grid development projects continue to provide higher-level services that evolve in functionality and operate with application-level concepts which are often specific to the virtual organizations that use them. Physically, however, grids are comprised of sites whose resources are diverse and seldom project readily onto a grid's set of concepts. In practice, this also creates problems for the site administrators who actually instantiate grid services. In this paper, we present a flexible, uniform framework to configure a grid site and its facilities, and otherwise describe the resources and services it offers. We start from a site configuration and instantiate services for resource advertisement, monitoring and data handling; we also apply our framework to hosting environment creation. We use our ideas in the Information Management part of the SAM-Grid project, a grid system that will deliver petabyte-scale data to hundreds of users. Our users are High Energy Physics experimenters who are scattered worldwide across dozens of institutions and always use facilities that are shared with other experiments as well as other grids. Our implementation represents information in the XML format and includes tools written in XQuery and XSLT.
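
    The paper's actual tooling is XQuery/XSLT, but the configuration-to-services step can be sketched in Python against an invented site schema: a single XML description of the site is parsed, and one service descriptor is instantiated per declared service.

```python
import xml.etree.ElementTree as ET

# Invented illustrative schema, not the SAM-Grid format.
SITE_CONFIG = """
<site name="example-site">
  <resource type="storage" path="/data"/>
  <service kind="monitoring" port="8080"/>
  <service kind="advertisement" port="8081"/>
</site>
"""

def instantiate_services(xml_text):
    """Derive one service descriptor per <service> element, so every
    deployed service traces back to the single site configuration."""
    root = ET.fromstring(xml_text)
    site = root.get("name")
    return [{"site": site,
             "kind": s.get("kind"),
             "port": int(s.get("port"))}
            for s in root.findall("service")]

if __name__ == "__main__":
    for svc in instantiate_services(SITE_CONFIG):
        print(svc["site"], svc["kind"], svc["port"])
```

    Keeping one declarative site description as the single source of truth is the uniformity the framework is after: site administrators edit the configuration, and the concrete services are instantiated from it.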

  13. Designing persuasive health materials using processing fluency: a literature review.

    PubMed

    Okuhara, Tsuyoshi; Ishikawa, Hirono; Okada, Masahumi; Kato, Mio; Kiuchi, Takahiro

    2017-06-08

    Health materials to promote health behaviors should be readable and generate favorable evaluations of the message. Processing fluency (the subjective experience of the ease with which people process information) has been increasingly studied over the past decade. In this review, we explore effects and instantiations of processing fluency and discuss the implications for designing effective health materials. We searched seven online databases using "processing fluency" as the keyword. In addition, we gathered relevant publications using reference snowballing. We included published records that were written in English and applicable to the design of health materials. We found 40 articles that were appropriate for inclusion. Various instantiations of fluency have a uniform effect on human judgment: fluently processed stimuli generate positive judgments (e.g., liking, confidence). Processing fluency is used to predict the effort needed for a given task; accordingly, it has an impact on willingness to undertake the task. Perceptual, lexical, syntactic, phonological, retrieval, and imagery fluency were found to be particularly relevant to the design of health materials. Health-care professionals should consider the use of a perceptually fluent design, plain language, numeracy with an appropriate degree of precision, a limited number of key points, and concrete descriptions that help recipients imagine healthy behavior. Such fluently processed materials that are easy to read and understand have enhanced perspicuity and persuasiveness.

  14. Reel-to-reel substrate tape polishing system

    DOEpatents

    Selvamanickam, Venkat; Gardner, Michael T.; Judd, Raymond D.; Weloth, Martin; Qiao, Yunfei

    2005-06-21

    Disclosed is a reel-to-reel single-pass mechanical polishing system (100) suitable for polishing long lengths of metal substrate tape (124) used in the manufacture of high-temperature superconductor (HTS) coated tape, including multiple instantiations of a polishing station (114) in combination with a subsequent rinsing station (116) arranged along the axis of the metal substrate tape (124) that is translating between a payout spool (110a) and a take-up spool (110b). The metal substrate tape obtains a surface smoothness that is suitable for the subsequent deposition of a buffer layer.

  15. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extendability, due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling is presented, using the extension mechanisms of UML to provide a group of UML extensions that meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams are defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns is also put forward, such that profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. The presented profile thus makes framework modeling more straightforward and therefore easier to understand and instantiate.

  16. Modular VO oriented Java EE service deployer

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Cepparo, Francesco; De Marco, Marco; Knapic, Cristina; Apollo, Pietro; Smareglia, Riccardo

    2014-07-01

    The International Virtual Observatory Alliance (IVOA) has produced many standards and recommendations whose aim is to create an architecture that starts from astrophysical resources, in a general sense, and ends up in deployed consumable services (which are themselves astrophysical resources). Focusing on the Data Access Layer (DAL) system architecture that these standards define, in recent years a web-based application has been developed and maintained at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) to deploy and manage multiple VO (Virtual Observatory) services in a uniform way: VO-Dance. However, a number of critical issues have arisen since the VO-Dance idea was conceived, and major changes have occurred, and are still occurring, at the IVOA DAL layer and in its related standards; this urged IA2 to identify a new solution for its own service layer. Keeping the basic ideas of VO-Dance (simple service configuration, service instantiation at call time, and modularity) while switching to different software technologies (e.g., dismissing Java Reflection in favour of an Enterprise Java Bean, EJB, based solution), the new solution has been sketched out and tested for feasibility. Here we present the results of this feasibility study. The main constraints on the new project come from several directions: a better-harmonized solution arising from the IVOA DAL standards, for example the new DALI (Data Access Layer Interface) specification, which acts as a common interface for previous and upcoming access protocols; the need for a modular system in which each component is based on a single VO specification, allowing services to rely on common capabilities instead of homogenizing them directly inside service components; and the search for a scalable system that takes advantage of distributed environments. These constraints are addressed by the solutions sketched below.
The development of the new system using Java Enterprise technologies can benefit from existing libraries to build up the individual components implementing the IVOA standards. Each component can be built from a single standard, and each deployed service (i.e., an instantiation of service components) can consume the other components' exposed methods and services without the need to homogenize them in dedicated libraries. Scalability can be achieved more easily by deploying components or sets of services in a distributed environment, using JNDI (Java Naming and Directory Interface) and RMI (Remote Method Invocation) technologies. Single-service configuration will not differ significantly from the VO-Dance solution, given that the class instantiation that previously relied on Java Reflection will simply move to Java EJB pooling (rather than, e.g., being embedded in bundles for subsequent deployment).

  17. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case.

    PubMed

    Russ, Thomas A; Ramakrishnan, Cartic; Hovy, Eduard H; Bota, Mihail; Burns, Gully A P C

    2011-08-22

    We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. 
This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS).

  18. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case

    PubMed Central

    2011-01-01

    Background: We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. Results: The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. Conclusions: We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS). PMID:21859449
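    The 'neural connection matrix' idea described above can be sketched very simply: a table keyed by (source, target) region pairs, where each entry carries an ordinal strength and links back to its supporting experiments. The regions, scale, and experiment ids below are invented placeholders, not BioScholar's actual data model.

```python
# Hypothetical sketch (not the BioScholar code): a "neural connection
# matrix" as a table of ordinal connection strengths, with each entry
# linked back to the experimental evidence that supports it.

# Ordinal strength scale, from absent to strong (illustrative).
STRENGTHS = ["absent", "weak", "moderate", "strong"]

# (source region, target region) -> (ordinal strength, evidence ids)
connection_matrix = {
    ("CA1", "subiculum"): ("strong", ["tract-tracing-expt-17"]),
    ("CA1", "entorhinal cortex"): ("moderate", ["tract-tracing-expt-03",
                                                "tract-tracing-expt-09"]),
}

def evidence_for(source: str, target: str) -> list[str]:
    """Return the experiment ids supporting a specific connection."""
    strength, evidence = connection_matrix.get((source, target), ("absent", []))
    return evidence

print(evidence_for("CA1", "subiculum"))
```

    A real system would back such a table with a database and a reasoner, as the record describes; the dictionary here only illustrates the evidence-linked-table shape of the interface.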

  19. Universal biology and the statistical mechanics of early life.

    PubMed

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-12-28

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These are concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).
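    The "minimal models of autocatalysis" mentioned above are often Frank-type models. The sketch below is a generic Frank-style toy (not the authors' specific models, and with purely illustrative parameters): two enantiomers autocatalytically consume an achiral resource and mutually antagonize each other, so a small initial imbalance amplifies into near-complete homochirality.

```python
# Minimal Frank-type model of emerging homochirality (a standard toy
# model; parameter values are illustrative, not taken from the paper).
# L and D enantiomers grow autocatalytically on an achiral resource `a`
# and mutually antagonize; a small initial excess of L amplifies.

def simulate(L=0.06, D=0.04, a=1.0, k=1.0, gamma=1.0, dt=0.01, steps=20000):
    for _ in range(steps):
        dL = k * a * L - gamma * L * D      # autocatalysis minus antagonism
        dD = k * a * D - gamma * L * D
        da = -k * a * (L + D)               # resource consumed by replication
        L = max(L + dL * dt, 0.0)
        D = max(D + dD * dt, 0.0)
        a = max(a + da * dt, 0.0)
    return (L - D) / (L + D)                # enantiomeric excess in [-1, 1]

ee = simulate()
print(f"final enantiomeric excess: {ee:.3f}")  # grows toward +1
```

    Starting from a modest excess (0.06 vs. 0.04), the antagonism term eliminates the minority enantiomer once the resource is exhausted, which is the qualitative point the abstract makes about homochirality co-evolving with replication.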

  20. Universal biology and the statistical mechanics of early life

    NASA Astrophysics Data System (ADS)

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-11-01

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These are concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  1. Modelica buildings library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Zuo, Wangda; Nouidui, Thierry S.

    This paper describes the Buildings library, a free open-source library that is implemented in Modelica, an equation-based object-oriented modeling language. The library supports rapid prototyping, as well as design and operation of building energy and control systems. First, we describe the scope of the library, which covers HVAC systems, multi-zone heat transfer and multi-zone airflow and contaminant transport. Next, we describe differentiability requirements and address how we implemented them. We describe the class hierarchy that allows implementing component models by extending partial implementations of base models of heat and mass exchangers, and by instantiating basic models for conservation equations and flow resistances. We also describe associated tools for pre- and post-processing, regression tests, co-simulation and real-time data exchange with building automation systems. Furthermore, the paper closes with an example of a chilled water plant, with and without water-side economizer, in which we analyzed the system-level efficiency for different control setpoints.
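    The compositional pattern described (a component model extends a partial base model and instantiates reusable sub-models for the conservation equation and flow resistance) can be loosely sketched outside Modelica. The classes and equations below are a simplified Python analogy, not the Buildings library's actual API.

```python
# Loose Python analogy (not Modelica, not the Buildings library API) for
# the pattern described: a component model extends a partial base model
# and instantiates sub-models for conservation and flow resistance.

class ConservationEquation:
    """Steady-state energy balance: Q = m_dot * cp * (T_out - T_in)."""
    def heat_flow(self, m_dot, cp, t_in, t_out):
        return m_dot * cp * (t_out - t_in)

class FlowResistance:
    """Quadratic pressure drop: dp = k * m_dot**2."""
    def __init__(self, k):
        self.k = k
    def pressure_drop(self, m_dot):
        return self.k * m_dot ** 2

class PartialHeatExchanger:
    """Partial base model: composes sub-models, leaves behavior abstract."""
    def __init__(self, k):
        self.balance = ConservationEquation()
        self.resistance = FlowResistance(k)

class Heater(PartialHeatExchanger):
    """Concrete component model extending the partial base model."""
    def outlet_temperature(self, m_dot, cp, t_in, q_added):
        return t_in + q_added / (m_dot * cp)

heater = Heater(k=100.0)
print(heater.outlet_temperature(m_dot=0.5, cp=4186.0, t_in=293.15, q_added=2093.0))
```

    In Modelica itself this would be expressed with `extends` on a `partial model` and component instantiation, with the compiler assembling the coupled equations rather than sequential method calls.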

  2. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
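    The LINDA-style communication the abstract refers to can be illustrated with a toy tuple space. This is an in-memory sketch, not the paper's database-backed implementation; the tuple contents are invented to echo the power-distribution example.

```python
# Toy LINDA-style tuple space (illustrative sketch): agents communicate
# by writing tuples with out(), reading with rd(), and consuming with
# in_(). None acts as a wildcard in match patterns.

class TupleSpace:
    def __init__(self):
        self.tuples = []

    def out(self, tup):
        """Write a tuple into the space."""
        self.tuples.append(tup)

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):
        """Read (without removing) the first tuple matching the pattern."""
        for tup in self.tuples:
            if self._match(pattern, tup):
                return tup
        return None

    def in_(self, pattern):
        """Read and remove the first tuple matching the pattern."""
        tup = self.rd(pattern)
        if tup is not None:
            self.tuples.remove(tup)
        return tup

space = TupleSpace()
space.out(("bus-voltage", "node-7", 119.8))       # one agent publishes a fact
fact = space.rd(("bus-voltage", "node-7", None))  # another agent reads it
print(fact)
```

    In the architecture described, the tuple space sits on top of a database, so whether the space is distributed depends on the database implementation; knowledge agents then layer objects and inference on top of these primitives.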

  3. Modelica buildings library

    DOE PAGES

    Wetter, Michael; Zuo, Wangda; Nouidui, Thierry S.; ...

    2013-03-13

    This paper describes the Buildings library, a free open-source library that is implemented in Modelica, an equation-based object-oriented modeling language. The library supports rapid prototyping, as well as design and operation of building energy and control systems. First, we describe the scope of the library, which covers HVAC systems, multi-zone heat transfer and multi-zone airflow and contaminant transport. Next, we describe differentiability requirements and address how we implemented them. We describe the class hierarchy that allows implementing component models by extending partial implementations of base models of heat and mass exchangers, and by instantiating basic models for conservation equations and flow resistances. We also describe associated tools for pre- and post-processing, regression tests, co-simulation and real-time data exchange with building automation systems. Furthermore, the paper closes with an example of a chilled water plant, with and without water-side economizer, in which we analyzed the system-level efficiency for different control setpoints.

  4. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach with the example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
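    The unified setup routine described rests on a single Compose file that declares every container and its dependencies. The fragment below is a minimal illustrative sketch; the service names and images are placeholders, not the actual platform's configuration.

```yaml
# Illustrative docker-compose.yml sketch (service names and images are
# invented placeholders): several containers plus shared infrastructure
# are declared once and started together with `docker-compose up -d`.
version: "3"
services:
  webapp:
    image: example/screening-webapp   # one of the integrated web applications
    ports:
      - "8080:8080"
    depends_on:
      - db                            # start the shared database first
  db:
    image: mysql:5.7                  # shared database infrastructure
    environment:
      MYSQL_ROOT_PASSWORD: example
```

    With such a file in place, the whole stack comes up with one `docker-compose up -d` invocation, which is how a multi-container platform can be "deployable in just two lines of code" (clone, then up).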

  5. Integrated image data and medical record management for rare disease registries. A general framework and its instantiation to the German Calciphylaxis Registry.

    PubMed

    Deserno, Thomas M; Haak, Daniel; Brandenburg, Vincent; Deserno, Verena; Classen, Christoph; Specht, Paula

    2014-12-01

    Especially for investigator-initiated research at universities and academic institutions, Internet-based rare disease registries (RDR) are required that integrate electronic data capture (EDC) with automatic image analysis or manual image annotation. We propose a modular framework merging alpha-numerical and binary data capture. In concordance with the Office of Rare Diseases Research recommendations, a requirement analysis was performed based on several RDR databases currently hosted at Uniklinik RWTH Aachen, Germany. With respect to the study management tool that is already successfully operating at the Clinical Trial Center Aachen, the Google Web Toolkit was chosen, with Hibernate and Gilead connecting a MySQL database management system. Image and signal data integration and processing are supported by the Apache Commons FileUpload library and ImageJ-based Java code, respectively. As a proof of concept, the framework is instantiated to the German Calciphylaxis Registry. The framework is composed of five mandatory core modules: (1) Data Core, (2) EDC, (3) Access Control, (4) Audit Trail, and (5) Terminology, as well as six optional modules: (6) Binary Large Object (BLOB), (7) BLOB Analysis, (8) Standard Operation Procedure, (9) Communication, (10) Pseudonymization, and (11) Biorepository. Modules 1-7 are implemented in the German Calciphylaxis Registry. The proposed RDR framework is easily instantiated and directly integrates image management and analysis. As open source software, it may support improved data collection and analysis of rare diseases in the near future.

  6. Human Error as an Emergent Property of Action Selection and Task Place-Holding.

    PubMed

    Tamborello, Franklin P; Trafton, J Gregory

    2017-05-01

    A computational process model could explain how the dynamic interaction of human cognitive mechanisms produces each of multiple error types. With increasing capability and complexity of technological systems, the potential severity of consequences of human error is magnified. Interruption greatly increases people's error rates, as does the presence of other information to maintain in an active state. The model executed as a software-instantiated Monte Carlo simulation. It drew on theoretical constructs such as associative spreading activation for prospective memory, explicit rehearsal strategies as a deliberate cognitive operation to aid retrospective memory, and decay. The model replicated the 30% effect of interruptions on postcompletion error in Ratwani and Trafton's Stock Trader task, the 45% interaction effect on postcompletion error of working memory capacity and working memory load from Byrne and Bovair's Phaser Task, as well as the 5% perseveration and 3% omission effects of interruption from the UNRAVEL Task. Error classes including perseveration, omission, and postcompletion error fall naturally out of the theory. The model explains post-interruption error in terms of task state representation and priming for recall of subsequent steps. Its performance suggests that task environments providing more cues to current task state will mitigate error caused by interruption. For example, interfaces could provide labeled progress indicators or facilities for operators to quickly write notes about their task states when interrupted.
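    The general mechanism the abstract invokes (activation-based memory with decay and noise) can be shown in miniature. The simulation below is a hedged sketch of that idea only, not the authors' model or its fitted parameters: a task step is "forgotten" when its noisy activation falls below a retrieval threshold, so longer interruptions yield higher error rates.

```python
# Toy Monte Carlo sketch of activation decay under interruption (all
# parameter values are invented for illustration, not from the paper).
# A step is omitted when its noisy activation misses a threshold.

import random

def error_rate(delay, trials=20000, base=2.0, decay=0.05,
               noise=0.6, threshold=1.2, seed=42):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        activation = base - decay * delay + rng.gauss(0.0, noise)
        if activation < threshold:
            errors += 1          # step not retrieved: an omission-type error
    return errors / trials

uninterrupted = error_rate(delay=2)    # task resumed quickly
interrupted = error_rate(delay=15)     # long interruption, more decay
print(uninterrupted, interrupted)
```

    The point of the sketch is qualitative: with decay acting over the interruption interval, the interrupted condition produces reliably more omissions, matching the direction (though not the specific magnitudes) of the effects replicated by the model.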

  7. The Adam language: Ada extended with support for multiway activities

    NASA Technical Reports Server (NTRS)

    Charlesworth, Arthur

    1993-01-01

    The Adam language is an extension of Ada that supports multiway activities, which are cooperative activities involving two or more processes. This support is provided by three new constructs: diva procedures, meet statements, and multiway accept statements. Diva procedures are recursive generic procedures having a particular restrictive syntax that facilitates translation for parallel computers. Meet statements and multiway accept statements provide two ways to express a multiway rendezvous, which is an n-way rendezvous generalizing Ada's 2-way rendezvous. While meet statements tend to have simpler rules than multiway accept statements, the latter approach is a more straightforward extension of Ada. The only nonnull statements permitted within meet statements and multiway accept statements are calls on instantiated diva procedures. A call on an instantiated diva procedure is also permitted outside a multiway rendezvous; thus sequential Adam programs using diva procedures can be written. Adam programs are translated into Ada programs appropriate for use on parallel computers.
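    A multiway rendezvous generalizes the two-party rendezvous to n parties that must all arrive before any proceeds. The sketch below shows that synchronization shape with a plain Python barrier; it is an analogy only, not Ada/Adam syntax or semantics.

```python
# Sketch of an n-way rendezvous using a barrier (plain Python threading,
# loosely analogous to the multiway constructs described): all three
# processes arrive before any proceeds with the cooperative activity.

import threading

N = 3
barrier = threading.Barrier(N)
arrivals = []
lock = threading.Lock()

def worker(name):
    with lock:
        arrivals.append(name)      # record arrival at the rendezvous
    barrier.wait()                 # block until all N parties have arrived
    # ... the cooperative multiway activity would happen here ...

threads = [threading.Thread(target=worker, args=(f"p{i}",)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(arrivals))            # all three parties reached the rendezvous
```

    In Adam, the body of such a rendezvous is further restricted: the only nonnull statements permitted inside meet statements and multiway accept statements are calls on instantiated diva procedures, which is what makes translation for parallel computers tractable.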

  8. A Data Analytics Approach to Discovering Unique Microstructural Configurations Susceptible to Fatigue

    NASA Astrophysics Data System (ADS)

    Jha, S. K.; Brockman, R. A.; Hoffman, R. M.; Sinha, V.; Pilchak, A. L.; Porter, W. J.; Buchanan, D. J.; Larsen, J. M.; John, R.

    2018-05-01

    Principal component analysis and fuzzy c-means clustering algorithms were applied to slip-induced strain and geometric metric data in an attempt to discover unique microstructural configurations and their frequencies of occurrence in statistically representative instantiations of a titanium alloy microstructure. Grain-averaged fatigue indicator parameters were calculated for the same instantiations. The fatigue indicator parameters strongly correlated with the spatial location of the microstructural configurations in the principal components space. The fuzzy c-means clustering method identified clusters of data that varied in terms of their average fatigue indicator parameters. Furthermore, the number of points in each cluster was inversely correlated to the average fatigue indicator parameter. This analysis demonstrates that data-driven methods have significant potential for providing unbiased determination of unique microstructural configurations and their frequencies of occurrence in a given volume from the point of view of strain localization and fatigue crack initiation.
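    The PCA-then-fuzzy-c-means pipeline can be reproduced on synthetic data. The sketch below uses invented 4-D features standing in for the slip-strain and geometric metrics; it is the generic pipeline, not the study's data or settings.

```python
# Illustrative PCA + fuzzy c-means pipeline on synthetic data (the real
# study used slip-strain and geometric metrics from simulated
# microstructure instantiations).

import numpy as np

rng = np.random.default_rng(0)
# Two synthetic "microstructural configuration" groups in 4-D feature space.
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 4)),
               rng.normal(3.0, 0.3, size=(50, 4))])

# --- Principal component analysis via SVD ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                 # coordinates in the first two PCs

# --- Basic fuzzy c-means (fuzzifier m = 2) on the PC scores ---
c, m = 2, 2.0
centers = scores[[0, -1]].copy()       # seed one center in each region
for _ in range(100):
    d = np.linalg.norm(scores[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = 1.0 / d ** (2.0 / (m - 1.0))
    memberships = inv / inv.sum(axis=1, keepdims=True)
    w = memberships ** m
    centers = (w.T @ scores) / w.sum(axis=0)[:, None]

labels = memberships.argmax(axis=1)
print(np.bincount(labels))             # points per discovered configuration
```

    Counting points per cluster, as in the last line, is the step that would reveal the inverse relationship the study reports between cluster population and average fatigue indicator parameter.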

  9. Instantiating informatics in nursing practice for integrated patient centred holistic models of care: a discussion paper.

    PubMed

    Hussey, Pamela A; Kennedy, Margaret Ann

    2016-05-01

    A discussion on how informatics knowledge and competencies can enable nursing to instantiate the transition to integrated models of care. Costs of traditional models of care are no longer sustainable given the spiralling incidence and costs of chronic illness. The international community looks towards technology-enabled solutions to support a shift towards integrated patient-centred models of care. Discussion paper. A search of the literature was performed dating from 2000-2015 and a purposeful data sample based on relevance to building the discussion was included. The holistic perspective of nursing knowledge can support and advance integrated healthcare models. Informatics skills are key for the profession to play a leadership role in the design, implementation and operation of next-generation health care. However, evidence suggests that nursing engagement with informatics strategic development for healthcare provision is currently variable. A pressing need exists to move health care towards integrated models of care. Strategic and tactical plans grounded in nursing insight and expertise are essential to achieving effective healthcare provision. To avoid exclusion from a discourse dominated by management and technology experts, nursing leaders must develop and actively promote the advancement of nursing informatics skills. For knowledge in nursing practice to flourish in contemporary health care, nurse leaders will need to incorporate informatics for optimal translation and interpretation. Defined nursing leadership roles informed by informatics are essential to generate concrete solutions sustaining nursing practice in integrated care models. © 2016 John Wiley & Sons Ltd.

  10. Towards Formal Implementation of PUS Standard

    NASA Astrophysics Data System (ADS)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete applications. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.

  11. Neurocognitive architecture of working memory

    PubMed Central

    Eriksson, Johan; Vogel, Edward K.; Lansner, Anders; Bergström, Fredrik; Nyberg, Lars

    2015-01-01

    The crucial role of working memory for temporary information processing and guidance of complex behavior has been recognized for many decades. There is emerging consensus that working memory maintenance results from the interactions among long-term memory representations and basic processes, including attention, that are instantiated as reentrant loops between frontal and posterior cortical areas, as well as subcortical structures. The nature of such interactions can account for capacity limitations, lifespan changes, and restricted transfer after working-memory training. Recent data and models indicate that working memory may also be based on synaptic plasticity, and that working memory can operate on non-consciously perceived information. PMID:26447571

  12. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
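    The registration-and-dispatch pattern the claim describes can be shown schematically: a registry of factory objects, with execution of a designated function going through the factory identified by a call parameter. The classes and operation names below are plain-Python placeholders, not the patented libraries.

```python
# Schematic illustration of the factory-registration pattern described
# (plain Python; names and operations are invented for illustration):
# registering a function instantiates a factory object, and executing
# the function passes a parameter identifying that factory, which then
# creates the object implementing the function.

class BroadcastFactory:
    def create(self):
        return lambda data: {"op": "broadcast", "data": data}

class AllReduceFactory:
    def create(self):
        return lambda data: {"op": "allreduce", "data": sum(data)}

registry = {}

def register(name, factory):
    """Register a data communications function with its factory object."""
    registry[name] = factory

def execute(name, data):
    """Create the implementing object via the identified factory, run it."""
    operation = registry[name].create()
    return operation(data)

register("broadcast", BroadcastFactory())
register("allreduce", AllReduceFactory())
print(execute("allreduce", [1, 2, 3]))
```

    The indirection through the factory is what lets the low-level library choose, at execution time, the object that implements the designated function according to the protocol in use.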

  13. Floating-Point Units and Algorithms for field-programmable gate arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Underwood, Keith D.; Hemmert, K. Scott

    2005-11-01

    The software that we are attempting to copyright is a package of floating-point unit descriptions and example algorithm implementations using those units for use in FPGAs. The floating-point units are best-in-class implementations of add, multiply, divide, and square root floating-point operations. The algorithm implementations are sample (not highly flexible) implementations of FFT, matrix multiply, matrix-vector multiply, and dot product. Together, one could think of the collection as an implementation of parts of the BLAS library or something similar to the FFTW packages (without the flexibility) for FPGAs. Results from this work have been published multiple times, and we are working on a publication to discuss the techniques we use to implement the floating-point units. For some more background, FPGAs are programmable hardware. "Programs" for this hardware are typically created using a hardware description language (examples include Verilog, VHDL, and JHDL). Our floating-point unit descriptions are written in JHDL, which allows them to include placement constraints that make them highly optimized relative to some other implementations of floating-point units. Many vendors (Nallatech from the UK, SRC Computers in the US) have similar implementations, but our implementations seem to be somewhat higher performance. Our algorithm implementations are written in VHDL, and models of the floating-point units are provided in VHDL as well. FPGA "programs" make multiple "calls" (hardware instantiations) to libraries of intellectual property (IP), such as the floating-point unit library described here. These programs are then compiled using a tool called a synthesizer (such as a tool from Synplicity, Inc.). The compiled file is a netlist of gates and flip-flops. This netlist is then mapped to a particular type of FPGA by a mapper and then a place-and-route tool. These tools assign the gates in the netlist to specific locations on the specific type of FPGA chip used and construct the required routes between them. The result is a "bitstream" that is analogous to a compiled binary. The bitstream is loaded into the FPGA to create a specific hardware configuration.

  14. The Dynamics of Perceptual Learning: An Incremental Reweighting Model

    ERIC Educational Resources Information Center

    Petrov, Alexander A.; Dosher, Barbara Anne; Lu, Zhong-Lin

    2005-01-01

    The mechanisms of perceptual learning are analyzed theoretically, probed in an orientation-discrimination experiment involving a novel nonstationary context manipulation, and instantiated in a detailed computational model. Two hypotheses are examined: modification of early cortical representations versus task-specific selective reweighting.…

  15. Clinician and Writer: Their Crucible of Involvement.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    Clinical report writing involves two interlinking processes--creation and communication. There are six stages of clinical inference that find parallels in generative writing stages: possessing a postulate system, constructing the major premise, observing for occurrences, instantiating (classifying) the occurrences, reaching a referential product,…

  16. Constructional and Conceptual Composition

    ERIC Educational Resources Information Center

    Dodge, Ellen Kirsten

    2010-01-01

    Goldberg's (1995) recognition that, in addition to various word-level constructions, sentences also instantiate meaningful argument structure constructions enables a non-polysemy-based analysis of various verb 'alternations' (Levin 1993). In such an analysis, meaning variations associated with the use of the same verb in different argument…

  17. First Things First: Internet Relay Chat Openings.

    ERIC Educational Resources Information Center

    Rintel, E. Sean; Mulholland, Joan; Pittam, Jeffery

    2001-01-01

    Argues that Internet Relay Chat (IRC) research needs to systematically address links between interaction structures, technological mediation and the instantiation and development of interpersonal relationships. Finds that openings that occur directly following user's entries into public IRC channels are often ambiguous, can disrupt relationship…

  18. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    PubMed

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and the main components are summarized in UML diagrams. A first implementation of the architecture was then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.
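    The kind of fuzzy rule such a generated DDSS evaluates can be illustrated with triangular membership functions and a min t-norm. The feature names, membership parameters, and rule below are invented for illustration; they are not from the paper or any clinical source.

```python
# Toy fuzzy rule of the kind a generated DDSS might evaluate (feature
# names, membership functions, and thresholds are invented, not
# clinically meaningful).

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rule_firing_strength(mass_size_mm, margin_irregularity):
    """IF size IS large AND margin IS irregular THEN risk IS high (min t-norm)."""
    size_is_large = triangular(mass_size_mm, 15.0, 30.0, 45.0)
    margin_is_irregular = triangular(margin_irregularity, 0.4, 0.7, 1.0)
    return min(size_is_large, margin_is_irregular)

print(rule_firing_strength(30.0, 0.7))   # both memberships at their peak
```

    In the methodology described, such fuzzy rules are not hand-written: the first three stages extract crisp rules from data and the last three turn them into fuzzy models of this general shape.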

  19. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background: The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods: We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. Results: The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and the main components are summarized in UML diagrams. A first implementation of the architecture was then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions: The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970

  20. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    PubMed

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D

    2009-11-01

    While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphics cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
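    The screening loop itself is conceptually simple: sample many parameter instantiations, evaluate each cheaply, and keep the top performers. The sketch below shows that loop in miniature with an invented scoring function standing in for object-recognition accuracy; it is the screening idea only, not the authors' models or hardware.

```python
# High-throughput screening in miniature (illustrative only): draw many
# random parameter sets, evaluate each with a cheap stand-in objective,
# and keep the top performers for further analysis.

import random

def evaluate(params):
    """Toy stand-in for recognition accuracy: peaks at one particular
    combination of 'units per layer' and 'pooling kernel size'."""
    units, pool = params["units"], params["pool"]
    return 1.0 / (1.0 + abs(units - 128) / 128 + abs(pool - 3))

rng = random.Random(7)
candidates = [{"units": rng.choice([16, 32, 64, 128, 256, 512]),
               "pool": rng.choice([1, 2, 3, 5, 7])}
              for _ in range(200)]

screened = sorted(candidates, key=evaluate, reverse=True)[:5]
print(screened[0])   # best parameter set found by the screen
```

    The paper's contribution is making each evaluation fast enough (via GPU/Cell stream processing) that thousands of such candidates become affordable, at which point the unexplored parameter space stops being the bottleneck.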

  1. A High-Throughput Screening Approach to Discovering Good Forms of Biologically Inspired Visual Representation

    PubMed Central

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J.; Cox, David D.

    2009-01-01

    While many models of biological object recognition share a common set of “broad-stroke” properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model—e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct “parts” have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphics cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision. PMID:19956750

  2. Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition.

    PubMed

    Smolensky, Paul; Goldrick, Matthew; Mathis, Donald

    2014-08-01

    Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.

  3. Iterating between Tools to Create and Edit Visualizations.

    PubMed

    Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah

    2017-01-01

    A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization; and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem - similar to when two people are editing a document and changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
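    The key insight above, recasting iteration as a merge problem, can be illustrated with a toy three-way merge over visual attributes. Everything below is a hypothetical sketch; Hanpuku itself operates on D3 scripts and Illustrator documents, not Python dictionaries:

    ```python
    def merge(base, generated, edited):
        """Three-way merge of a visual-attribute dictionary.

        base:      attributes as last exported from the generative tool
        generated: attributes after re-running the script on new data
        edited:    attributes after manual edits in the drawing tool

        Manual edits win unless the generative side also changed the
        attribute, in which case we record a conflict and (in this toy
        policy) resolve toward the data-driven value.
        """
        merged, conflicts = {}, []
        for key in set(base) | set(generated) | set(edited):
            b, g, e = base.get(key), generated.get(key), edited.get(key)
            if g != b and e != b and g != e:
                conflicts.append(key)  # both sides changed it differently
                merged[key] = g
            elif e != b:
                merged[key] = e        # only the designer changed it
            else:
                merged[key] = g        # only the data changed it (or nothing did)
        return merged, conflicts

    base      = {"fill": "steelblue", "height": 40, "stroke": "none"}
    generated = {"fill": "steelblue", "height": 55, "stroke": "none"}   # new data
    edited    = {"fill": "tomato",    "height": 40, "stroke": "black"}  # hand edits
    merged, conflicts = merge(base, generated, edited)
    ```

    Here the designer's color and stroke edits survive while the data-driven height change flows through, which is exactly the round trip a one-way export cannot provide.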

  4. A shared-world conceptual model for integrating space station life sciences telescience operations

    NASA Technical Reports Server (NTRS)

    Johnson, Vicki; Bosley, John

    1988-01-01

    Mental models of the Space Station and its ancillary facilities will be employed by users of the Space Station as they draw upon past experiences, perform tasks, and collectively plan for future activities. The operational environment of the Space Station will incorporate telescience, a new set of operational modes. To investigate properties of the operational environment, distributed users, and the mental models they employ to manipulate resources while conducting telescience, an integrating shared-world conceptual model of Space Station telescience is proposed. The model comprises distributed users and resources (active elements); agents who mediate interactions among these elements on the basis of intelligent processing of shared information; and telescience protocols which structure the interactions of agents as they engage in cooperative, responsive interactions on behalf of users and resources distributed in space and time. Examples from the life sciences are used to instantiate and refine the model's principles. Implications for transaction management and autonomy are discussed. Experiments employing the model are described which the authors intend to conduct using the Space Station Life Sciences Telescience Testbed currently under development at Ames Research Center.

  5. Mechanisms of Developmental Change in Infant Categorization

    ERIC Educational Resources Information Center

    Westermann, Gert; Mareschal, Denis

    2012-01-01

    Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…

  6. The Dynamics of Conditioning and Extinction

    PubMed Central

    Killeen, Peter R.; Sanabria, Federico; Dolgov, Igor

    2009-01-01

    Pigeons responded to intermittently reinforced classical conditioning trials with erratic bouts of responding to the CS. Responding depended on whether the prior trial contained a peck, food, or both. A linear-persistence/learning model moved animals into and out of a response state, and a Weibull distribution for number of within-trial responses governed in-state pecking. Variations of trial and inter-trial durations caused correlated changes in rate and probability of responding, and model parameters. A novel prediction—in the protracted absence of food, response rates can plateau above zero—was validated. The model predicted smooth acquisition functions when instantiated with the probability of food, but a more accurate jagged learning curve when instantiated with trial-to-trial records of reinforcement. The Skinnerian parameter was dominant only when food could be accelerated or delayed by pecking. These experiments provide a framework for trial-by-trial accounts of conditioning and extinction that increases the information available from the data, permitting them to comment more definitively on complex contemporary models of momentum and conditioning. PMID:19839699
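    The contrast drawn above, instantiating a model with the probability of food versus the trial-to-trial reinforcement record, can be illustrated with a generic linear-operator learning rule (a standard textbook rule used here for illustration; it is not claimed to be the paper's exact model):

    ```python
    import random

    def linear_operator(reinforcers, beta=0.2, p0=0.0):
        """Classic linear-operator rule: p <- p + beta * (r - p).

        Passing the trial-by-trial record (0s and 1s) yields a jagged
        learning curve; passing the constant probability of food yields
        a smooth one, mirroring the distinction in the abstract.
        """
        p, curve = p0, []
        for r in reinforcers:
            p += beta * (r - p)
            curve.append(p)
        return curve

    rng = random.Random(42)
    prob_food = 0.5
    trials = [1 if rng.random() < prob_food else 0 for _ in range(50)]

    smooth = linear_operator([prob_food] * 50)  # instantiated with probability
    jagged = linear_operator(trials)            # instantiated with the record
    ```

    The probability-driven curve climbs monotonically toward 0.5, while the record-driven curve rises and falls with each reinforced or unreinforced trial.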

  7. Buckets: Aggregative, Intelligent Agents for Publishing

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad

    1998-01-01

    Buckets are an aggregative, intelligent construct for publishing in digital libraries. The goal of research projects is to produce information. This information is often instantiated in several forms, differentiated by semantic types (report, software, video, datasets, etc.). A given semantic type can be further differentiated by syntactic representations as well (PostScript version, PDF version, Word version, etc.). Although the information was created together and subtle relationships can exist between them, different semantic instantiations are generally segregated along currently obsolete media boundaries. Reports are placed in report archives, software might go into a software archive, but most of the data and supporting materials are likely to be kept in informal personal archives or discarded altogether. Buckets provide an archive-independent container construct in which all related semantic and syntactic data types and objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services.

  8. Self-organization of meaning and the reflexive communication of information

    PubMed Central

    Leydesdorff, Loet; Petersen, Alexander M.; Ivanova, Inga

    2017-01-01

    Following a suggestion from Warren Weaver, we extend the Shannon model of communication piecemeal into a complex systems model in which communication is differentiated both vertically and horizontally. This model enables us to bridge the divide between Niklas Luhmann’s theory of the self-organization of meaning in communications and empirical research using information theory. First, we distinguish between communication relations and correlations among patterns of relations. The correlations span a vector space in which relations are positioned and can be provided with meaning. Second, positions provide reflexive perspectives. Whereas the different meanings are integrated locally, each instantiation opens global perspectives – ‘horizons of meaning’ – along eigenvectors of the communication matrix. These next-order codifications of meaning can be expected to generate redundancies when interacting in instantiations. Increases in redundancy indicate new options and can be measured as local reduction of prevailing uncertainty (in bits). The systemic generation of new options can be considered as a hallmark of the knowledge-based economy. PMID:28232771
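    The idea of measuring redundancy as a reduction of prevailing uncertainty in bits can be made concrete with Shannon's standard definitions (a toy numerical illustration, not the paper's empirical method):

    ```python
    from math import log2

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)), in bits."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def redundancy(probs):
        """Shannon redundancy: the gap between the maximum entropy over
        the same number of states and the observed entropy, in bits."""
        return log2(len(probs)) - entropy(probs)

    uniform = [0.25, 0.25, 0.25, 0.25]  # maximal uncertainty, no redundancy
    skewed  = [0.70, 0.10, 0.10, 0.10]  # codified: uncertainty reduced
    ```

    A uniform distribution over four states carries the full 2 bits of uncertainty and zero redundancy; the skewed distribution carries about 1.36 bits, so roughly 0.64 bits of redundancy have been generated.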

  9. Multiresolution multiscale active mask segmentation of fluorescence microscope images

    NASA Astrophysics Data System (ADS)

    Srinivasa, Gowri; Fickus, Matthew; Kovačević, Jelena

    2009-08-01

    We propose an active mask segmentation framework that combines the advantages of statistical modeling, smoothing, speed and flexibility offered by the traditional methods of region-growing, multiscale, multiresolution and active contours respectively. At the crux of this framework is a paradigm shift from evolving contours in the continuous domain to evolving multiple masks in the discrete domain. Thus, the active mask framework is particularly suited to segment digital images. We demonstrate the use of the framework in practice through the segmentation of punctate patterns in fluorescence microscope images. Experiments reveal that statistical modeling helps the multiple masks converge from a random initial configuration to a meaningful one. This obviates the need for an involved initialization procedure germane to most of the traditional methods used to segment fluorescence microscope images. While we provide the mathematical details of the functions used to segment fluorescence microscope images, this is only an instantiation of the active mask framework. We suggest some other instantiations of the framework to segment different types of images.

  10. Formal Semantic Definition of ELLA Timing

    DTIC Science & Technology

    1990-11-01

    [Garbled extraction of an ELLA timing example: an INPUT signal over time steps 0-4 and the corresponding outputs DELAY_1, DELAY_2, and DELAY_3, where DELAY_1 has been instantiated for the value n=2. The surviving text notes that the above description is very informal.]

  11. Authentic Game-Based Learning and Teachers' Dilemmas in Reconstructing Professional Practice

    ERIC Educational Resources Information Center

    Chee, Yam San; Mehrotra, Swati; Ong, Jing Chuan

    2015-01-01

    Teachers who attempt pedagogical innovation with authentic digital games face significant challenges because such games instantiate open systems of learner activity, inviting enquiry learning rather than knowledge acquisition. However, school environments are normatively sanctioned cultural spaces where direct instruction and high-stakes tests are…

  12. Learning Progressions as Tools for Assessment and Learning

    ERIC Educational Resources Information Center

    Shepard, Lorrie A.

    2018-01-01

    This article addresses the teaching and learning side of the learning progressions literature, calling out for measurement specialists the knowledge most needed when collaborating with subject-matter experts in the development of learning progressions. Learning progressions are one of the strongest instantiations of principles from "Knowing…

  13. Kinesiology's "Inconvenient Truth" and the Physical Cultural Studies Imperative

    ERIC Educational Resources Information Center

    Andrews, David L.

    2008-01-01

    This article explicates the "inconvenient truth" that is at the core of the crisis currently facing the field of kinesiology. Namely, the instantiation of an epistemological hierarchy that privileges positivist over postpositivist, quantitative over qualitative, and predictive over interpretive ways of knowing. The discussion outlines…

  14. The Production of "Proper Cheating" in Online Examinations within Technological Universities

    ERIC Educational Resources Information Center

    Kitto, Simon; Saltmarsh, Sue

    2007-01-01

    This paper uses poststructuralist theories of governmentality, agency, consumption and Barry's (2001) concept of Technological Societies, as a heuristic framework to trace the role of online education technologies in the instantiation of subjectification processes within contemporary Australian universities. This case study of the unintended…

  15. Teacher Education and the Best-Loved Self

    ERIC Educational Resources Information Center

    Craig, Cheryl J.

    2013-01-01

    Four narrative fragments involving research disseminated globally -- namely, United States, Israel, The Netherlands, The People's Republic of China -- are used to instantiate the phenomenon of teachers teaching their best-loved selves, without becoming the curriculum themselves. Next, the development of the best-loved self-conceptualization as it…

  16. Use of containerisation as an alternative to full virtualisation in grid environments.

    NASA Astrophysics Data System (ADS)

    Long, Robin

    2015-12-01

    Virtualisation is a key tool on the grid. It can be used to provide varying work environments or as part of a cloud infrastructure. Virtualisation itself carries certain overheads that decrease the performance of the system through requiring extra resources to virtualise the software and hardware stack, and CPU-cycles wasted instantiating or destroying virtual machines for each job. With the rise and improvements in containerisation, where only the software stack is kept separate and no hardware or kernel virtualisation is used, there is scope for speed improvements and efficiency increases over standard virtualisation. We compare containerisation and virtualisation, including a comparison against bare-metal machines as a benchmark.

  17. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  18. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
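    The instantiation step described above, loading source records into a common relational model and running analytic methods against it, can be sketched with a drastically simplified toy table. The column names and concept IDs below are illustrative only; the real OMOP CDM defines many tables and standardized vocabularies:

    ```python
    import sqlite3

    # A drastically simplified common-data-model table (illustrative
    # columns; not the actual OMOP CDM schema).
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE drug_exposure (
            person_id           INTEGER,
            drug_concept_id     INTEGER,
            exposure_start_date TEXT
        )
    """)

    # Records translated from an idiosyncratic source model into the
    # CDM's standardized terminology (concept IDs here are made up).
    source_records = [
        (1, 1112807, "2010-01-05"),
        (1, 1112807, "2010-02-05"),
        (2, 1125315, "2010-01-20"),
    ]
    conn.executemany("INSERT INTO drug_exposure VALUES (?, ?, ?)",
                     source_records)

    # An analytic method written once against the data model, runnable
    # unchanged over any source database loaded into it.
    counts = dict(conn.execute("""
        SELECT drug_concept_id, COUNT(*) FROM drug_exposure
        GROUP BY drug_concept_id
    """).fetchall())
    ```

    Because the analytic query addresses the common model rather than any one source schema, the same code runs against every database instance, which is the property the validation exercise set out to confirm.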

  19. A Computational Model of Selection by Consequences

    ERIC Educational Resources Information Center

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  20. AVCS Simulator Test Plan and Design Guide

    NASA Technical Reports Server (NTRS)

    Shelden, Stephen

    2001-01-01

    Internal document for communication of AVCS direction and documentation of simulator functionality. Discusses methods for AVCS simulation evaluation of pilot functions, implementation strategy of varying functional representation of pilot tasks (by instantiations of a base AVCS to reasonably approximate the interface of various vehicles -- e.g. Altair, GlobalHawk, etc.).

  1. The Unsustainability Imperative? Problems with "Sustainability" and "Sustainable Development" as Regulative Ideals

    ERIC Educational Resources Information Center

    Stables, Andrew

    2013-01-01

    Normality is imminently catastrophic. Climate change is a contemporary instantiation of the perpetual sense of crisis that characterises the human condition, and concepts such as sustainability and resilience serve as regulative ideals (cf. beauty, perfection, and truth) in the fight against ubiquitous unsustainability. Unsustainability is an…

  2. System Instantiation Comparison Method: A Technique for Comparing Military Headquarters

    DTIC Science & Technology

    2007-02-01

    [Fragmentary snippet from DSTO-RR-0322: "…be necessary to define another context relating to hibernation. The context determines the rate at which the…" The remainder is reference-list residue citing Larman, C. (1998), "The Use Case Model: What are the processes?", Java Report, Vol. 3, No. 8, pp. 62-72, and Levis.]

  3. A Comparison of Books and Hypermedia for Knowledge-based Sports Coaching.

    ERIC Educational Resources Information Center

    Vickers, Joan N.; Gaines, Brian R.

    1988-01-01

    Summarizes and illustrates the knowledge-based approach to instructional material design. A series of sports coaching handbooks and hypermedia presentations of the same material are described and the different instantiations of the knowledge and training structures are compared. Figures show knowledge structures for badminton and the architecture…

  4. Context-Dependent Control over Attentional Capture

    ERIC Educational Resources Information Center

    Cosman, Joshua D.; Vecera, Shaun P.

    2013-01-01

    A number of studies have demonstrated that the likelihood of a salient item capturing attention is dependent on the "attentional set" an individual employs in a given situation. The instantiation of an attentional set is often viewed as a strategic, voluntary process, relying on working memory systems that represent immediate task…

  5. Authentication in Reprogramming of Sensor Networks for Mote Class Adversaries

    DTIC Science & Technology

    2006-01-01

    …based approach. In this paper, we propose a symmetric key-based protocol for authenticating the reprogramming process. Our protocol is based on the secret instantiation algorithm, which requires only O(log n) keys to be maintained at each sensor. We integrate this algorithm with the existing…

  6. The Value of Fieldwork and Service Learning

    ERIC Educational Resources Information Center

    Ruppert, Nancy

    2013-01-01

    Colleges of education must instantiate their candidates' knowledge, skills, and dispositions for accreditation. Professors often have candidates reflect on field experience as a way to enhance their learning. This study examines reflections of 43 candidates over a 2-year period. Candidates engaged in an after-school enrichment program as…

  7. Influence of Familiar Features on Diagnosis: Instantiated Features in an Applied Setting

    ERIC Educational Resources Information Center

    Dore, Kelly L.; Brooks, Lee R.; Weaver, Bruce; Norman, Geoffrey R.

    2012-01-01

    Medical diagnosis can be viewed as a categorization task. There are two mechanisms whereby humans make categorical judgments: "analytical reasoning," based on explicit consideration of features and "nonanalytical reasoning," an unconscious holistic process of matching against prior exemplars. However, there is evidence that prior experience can…

  8. Transforming Systems Engineering through Model Centric Engineering

    DTIC Science & Technology

    2017-08-08

    [Fragmentary snippet: list-of-figures entries naming Figure 5, "Semantic Web Technologies related to Layers of Abstraction"; Figure 6, "NASA/JPL Instantiation of OpenMBEE (circa 2014)"; and Figure 7, "NASA/JPL Foundational Ontology for Systems Engineering"; followed by text on the Digital Engineering (DE) Transformation initiative and a relationship fostered with the National Aeronautics and Space Administration (NASA) Jet…]

  9. Quantitative Relationships Involving Additive Differences: Numerical Resilience

    ERIC Educational Resources Information Center

    Ramful, Ajay; Ho, Siew Yin

    2014-01-01

    This case study describes the ways in which problems involving additive differences with unknown starting quantities constrain the problem solver in articulating the inherent quantitative relationship. It gives empirical evidence to show how numerical reasoning takes over as a Grade 6 student instantiates the quantitative relation by resorting to…

  10. Enhancing Knowledge Integration: An Information System Capstone Project

    ERIC Educational Resources Information Center

    Steiger, David M.

    2009-01-01

    This database project focuses on learning through knowledge integration; i.e., sharing and applying specialized (database) knowledge within a group, and combining it with other business knowledge to create new knowledge. Specifically, the Tiny Tots, Inc. project described below requires students to design, build, and instantiate a database system…

  11. Language, Perception, and the Schematic Representation of Spatial Relations

    ERIC Educational Resources Information Center

    Amorapanth, Prin; Kranjec, Alexander; Bromberger, Bianca; Lehet, Matthew; Widick, Page; Woods, Adam J.; Kimberg, Daniel Y.; Chatterjee, Anjan

    2012-01-01

    Schemas are abstract nonverbal representations that parsimoniously depict spatial relations. Despite their ubiquitous use in maps and diagrams, little is known about their neural instantiation. We sought to determine the extent to which schematic representations are neurally distinguished from language on the one hand, and from rich perceptual…

  12. Ontogeny and Phylogeny from an Epigenetic Point of View.

    ERIC Educational Resources Information Center

    Lovtrup, Soren

    1984-01-01

    The correlation between ontogeny and phylogeny is analyzed through the discussion of four theories on the reality, history, epigenetic, and ecological aspects of the mechanism of evolution. Also discussed are historical and creative aspects of evolution and three epigenetic mechanisms instantiated in the case of the amphibian embryo. (Author/RH)

  13. Ape Metaphysics: Object Individuation without Language

    ERIC Educational Resources Information Center

    Mendes, Natacha; Rakoczy, Hannes; Call, Josep

    2008-01-01

    Developmental research suggests that whereas very young infants individuate objects purely on spatiotemporal grounds, from (at latest) around 1 year of age children are capable of individuating objects according to the kind they belong to and the properties they instantiate. As the latter ability has been found to correlate with language, some…

  14. Identifying the Enemy: Social Categorization and National Security Policy

    ERIC Educational Resources Information Center

    Unsworth, Kristene

    2010-01-01

    This dissertation seeks to understand the interplay between informal articulations of social categories and formal instantiations of those categories in official language. Specifically, it explores the process of social categorization as it is used to identify threats to national security. The research employed a qualitative, document-based,…

  15. Deservingness: Challenging Coloniality in Education and Migration Scholarship

    ERIC Educational Resources Information Center

    Patel, Leigh

    2015-01-01

    Rhetoric, policy, and debate about immigration and immigrants are saturated with the trope of deservingness. In nation/states built on stratification, deservingness acts as a discourse of racialization, narrating across racially minoritized groups to re-instantiate the benefits for the racially majoritized. In this theoretical essay, I draw from…

  16. The Cost of Concreteness: The Effect of Nonessential Information on Analogical Transfer

    ERIC Educational Resources Information Center

    Kaminski, Jennifer A.; Sloutsky, Vladimir M.; Heckler, Andrew F.

    2013-01-01

    Most theories of analogical transfer focus on similarities between the learning and transfer domains, where transfer is more likely between domains that share common surface features, similar elements, or common interpretations of structure. We suggest that characteristics of the learning instantiation alone can give rise to different levels of…

  17. Introducing a Collaborative E2 (Evaluation & Enhancement) Social Accountability Framework for Medical Schools

    ERIC Educational Resources Information Center

    Kirby, Jeffrey; O'Hearn, Shawna; Latham, Lesley; Harris, Bessie; Davis-Murdoch, Sharon; Paul, Kara

    2016-01-01

    Medical schools recognize that they have an important social mandate beyond their primary role to educate future physicians. The instantiation of social accountability (SA) within faculties of medicine requires intentional, effective partnering with diverse internal and external stakeholders. Despite early, promising academic work in the field of…

  18. Language Maintenance in a Multilingual Family: Informal Heritage Language Lessons in Parent-Child Interactions

    ERIC Educational Resources Information Center

    Kheirkhah, Mina; Cekaite, Asta

    2015-01-01

    The present study explores language socialization patterns in a Persian-Kurdish family in Sweden and examines how "one-parent, one-language" family language policies are instantiated and negotiated in parent-child interactions. The data consist of video-recordings and ethnographic observations of family interactions, as well as…

  19. Acquisition of Nonadjacent Phonological Dependencies in the Native Language during the First Year of Life

    ERIC Educational Resources Information Center

    Gonzalez-Gomez, Nayeli; Nazzi, Thierry

    2012-01-01

    Languages instantiate many different kinds of dependencies, some holding between adjacent elements and others holding between nonadjacent elements. In the domain of phonology-phonotactics, sensitivity to adjacent dependencies has been found to appear between 6 and 10 months. However, no study has directly established the emergence of sensitivity…

  20. Generalization through the Recurrent Interaction of Episodic Memories: A Model of the Hippocampal System

    ERIC Educational Resources Information Center

    Kumaran, Dharshan; McClelland, James L.

    2012-01-01

    In this article, we present a perspective on the role of the hippocampal system in generalization, instantiated in a computational model called REMERGE (recurrency and episodic memory results in generalization). We expose a fundamental, but neglected, tension between prevailing computational theories that emphasize the function of the hippocampus…

  1. Language Policy in Puerto Rico's Higher Education: Opening the Door for Translanguaging Practices

    ERIC Educational Resources Information Center

    Carroll, Kevin S.; Mazak, Catherine M.

    2017-01-01

    This paper investigates the relationship between meso university language policies in Puerto Rico and their micro instantiations in an undergraduate psychology classroom. We describe a typology of language policies used by 38 universities and campuses in Puerto Rico, whose openness allows for flexible implementation of everyday micro policy.…

  2. Home-School Literacy Connections: The Perceptions of African American and Immigrant ESL Parents in Two Urban Communities

    ERIC Educational Resources Information Center

    Dudley-Marling, Curt

    2009-01-01

    Background/Context: Educational reform has emphasized the importance of parent involvement. Perhaps the most common instantiations of parent involvement are various efforts to encourage particular reading practices in the home. Although there is some research supporting the efficacy of "family literacy" initiatives, these efforts have been…

  3. Video Games: Play That Can Do Serious Good

    ERIC Educational Resources Information Center

    Eichenbaum, Adam; Bavelier, Daphne; Green, C. Shawn

    2014-01-01

    The authors review recent research that reveals how today's video games instantiate naturally and effectively many principles psychologists, neuroscientists, and educators believe critical for learning. A large body of research exists showing that the effects of these games are much broader. In fact, some types of commercial games have been…

  4. Age-Related Declines in the Fidelity of Newly Acquired Category Representations

    ERIC Educational Resources Information Center

    Davis, Tyler; Love, Bradley C.; Maddox, W. Todd

    2012-01-01

    We present a theory suggesting that the ability to build category representations that reflect the nuances of category structures in the environment depends upon clustering mechanisms instantiated in an MTL-PFC-based circuit. Because function in this circuit declines with age, we predict that the ability to build category representations will be…

  5. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    ERIC Educational Resources Information Center

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…
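To make the contrast concrete, here is a minimal sketch (in Python, with illustrative parameter names and values; the paper itself fits these models to recognition data under far weaker assumptions) of how the two models map their parameters onto hit and false-alarm rates:

```python
import math

def sdt_hit_fa(d_prime, criterion):
    """Equal-variance signal detection theory (SDT): hit and false-alarm
    rates are areas under Gaussians on either side of a decision criterion."""
    Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    hit = 1 - Phi(criterion - d_prime)   # old items: mean d', sd 1
    fa = 1 - Phi(criterion)              # new items: mean 0, sd 1
    return hit, fa

def tht_hit_fa(p_detect_old, p_detect_new, guess_old):
    """2-high-threshold model: items are either detected (a discrete state)
    or not; undetected items are guessed "old" with a common bias."""
    hit = p_detect_old + (1 - p_detect_old) * guess_old
    fa = (1 - p_detect_new) * guess_old
    return hit, fa
```

Sweeping the criterion traces a curved ROC under SDT, whereas sweeping `guess_old` under the 2-high-threshold model traces a straight line; that qualitative difference is what model comparisons exploit.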

  6. Exploring the Synergies between the Object Oriented Paradigm and Mathematics: A Java Led Approach

    ERIC Educational Resources Information Center

    Conrad, Marc; French, Tim

    2004-01-01

    While the object oriented paradigm and its instantiation within programming languages such as Java have become a ubiquitous part of both the commercial and educational landscapes, its usage as a visualization technique within mathematics undergraduate programmes of study has perhaps been somewhat underestimated. By regarding the object oriented…
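The kind of synergy the authors describe can be sketched in a few lines. The paper works in Java, but the idea is language-independent, so a hypothetical Python example is used here: each object is an instantiation of a mathematical concept, and operator overloading mirrors algebraic notation:

```python
class Polynomial:
    """A mathematical object as a class: each instantiation is a concrete
    polynomial, and + on objects mirrors + on polynomials."""
    def __init__(self, *coeffs):   # coeffs[i] is the coefficient of x**i
        self.coeffs = list(coeffs)

    def __call__(self, x):
        return sum(c * x**i for i, c in enumerate(self.coeffs))

    def __add__(self, other):
        n = max(len(self.coeffs), len(other.coeffs))
        a = self.coeffs + [0] * (n - len(self.coeffs))
        b = other.coeffs + [0] * (n - len(other.coeffs))
        return Polynomial(*[x + y for x, y in zip(a, b)])

p = Polynomial(1, 0, 1)    # 1 + x^2
q = Polynomial(0, 2)       # 2x
r = p + q                  # 1 + 2x + x^2 = (1 + x)^2
```

Evaluating `r(3)` gives 16, i.e. (1 + 3)^2, so the class visibly instantiates the algebraic identity.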

  7. Let's Know! Proximal Impacts on Prekindergarten through Grade 3 Students' Comprehension-Related Skills

    ERIC Educational Resources Information Center

    Jiang, Hui; Davis, Dawn

    2017-01-01

    Let's Know! is a language-focused curriculum supplement developed through the Institute of Education Sciences' Reading for Understanding initiative aimed at supporting prekindergarten through grade 3 students' listening and reading comprehension. The current study reports results concerning the impacts of 2 instantiations of Let's Know! on…

  8. Taking back the Standards: Equity-Minded Teachers' Responses to Accountability-Related Instructional Constraints

    ERIC Educational Resources Information Center

    Stillman, Jamy

    2009-01-01

    This article offers three case studies of teachers who have been specially prepared to serve diverse students and examines their interpretations and instantiations of No Child Left Behind (NCLB)-driven language arts reform in "underperforming" schools, largely composed of Spanish-speaking English Learners (ELs). Drawing on literature…

  9. Retrieval Constraints on the Front End Create Differences in Recollection on a Subsequent Test

    ERIC Educational Resources Information Center

    Marsh, Richard L.; Meeks, J. Thadeus; Cook, Gabriel I.; Clark-Foos, Arlo; Hicks, Jason L.; Brewer, Gene A.

    2009-01-01

    Four experiments were conducted to investigate how the cognitive control of memory retrieval selects particular qualitative characteristics as a consequence of instantiating a retrieval mode for recognition memory. Adapting the memory for foils paradigm from Jacoby, Shimizu, Daniels, and Rhodes (Jacoby, L. L., Shimizu, Y., Daniels, K. A., &…

  10. Toward the Development of Socio-Metacognitive Expertise: An Approach to Developing Collaborative Competence

    ERIC Educational Resources Information Center

    Borge, Marcela; White, Barbara

    2016-01-01

    We proposed and evaluated an instructional framework for increasing students' ability to understand and regulate collaborative interactions called Co-Regulated Collaborative Learning (CRCL). In this instantiation of CRCL, models of collaborative competence were articulated through a set of socio-metacognitive roles. Our population consisted of 28…

  11. Open Education and the Sustainable Development Goals: Making Change Happen

    ERIC Educational Resources Information Center

    Lane, Andy

    2017-01-01

    Education for All has been a concept at the heart of international development since 1990 and has found its latest instantiation within the Sustainable Development Goals (SDGs) as SDG 4, "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Open education, in the form of resources and…

  12. On the Bus and Online: Instantiating an Interactive Learning Environment through Design-Based Research

    ERIC Educational Resources Information Center

    Kartoglu, Ümit; Vesper, James L.; Reeves, Thomas C.

    2017-01-01

    The World Health Organization converted an award-winning experiential learning course that takes place on a bus traveling down the "cold chain" for time- and temperature-sensitive pharmaceutical products in Turkey to an online interactive learning environment through design-based research. Similarities and differences in the objectives…

  13. The Internet, Political Communications Research and the Search for a New Information Paradigm

    ERIC Educational Resources Information Center

    Chiu, William Franklin

    2013-01-01

    The Internet, as a digital record of human discourse, provides an opportunity to directly analyze political communicative behavior. The rapid emergence of social online networks augurs a transformation in the quality and quantity of information people have to evaluate their political system. Digital formats instantiate new categories of actors and…

  14. The Enstranged Self: Recovering Some Grounds for Pluralism in Education

    ERIC Educational Resources Information Center

    Conroy, James C.

    2009-01-01

    For something approaching 50 years, multicultural education has been accepted as an educational, social and moral good by liberal educators. Its instantiation in the practices of education has, in various ways, largely depended on a series of strategies for making the other (the stranger) familiar within the majority culture. This essay suggests…

  15. Double Mapping in Metaphorical Expressions of Thought and Communication in Catalan Sign Language (LSC)

    ERIC Educational Resources Information Center

    Jarque, Maria-Josep

    2005-01-01

    This document illustrates that mental functioning and communication in Catalan Sign Language (LSC) are conceptualized through metaphorical projection of bodily experiences. The data in this document show how concepts are grasped, put on students' heads, exchanged, manipulated, and so on, constituting instantiations of the basic metaphors: ideas are…

  16. Physician Assimilation in Medical Schools: Dualisms of Biomedical and Biopsychosocial Ideologies in the Discourse of Physician Educators.

    PubMed

    Olufowote, James O; Wang, Guoyu E

    2017-06-01

    Although health communication research and popular literature on physicians have heightened awareness of the dualisms physicians face, research is yet to focus on the discourse of physician educators who assimilate students into medicine for dualisms of the biomedical (BMD) and biopsychosocial (BPS) ideologies. The study drew on a dualism-centered model to analyze the discourse of 19 behavioral science course directors at 10 medical schools for the emergence of dualisms in instantiations of BPS ideologies and for the management of dualism in discourse that instantiated both BMD and BPS ideologies as part of the curriculum. Dualism emerged in the BPS ideologies of "patient-centeredness" and "cultural competence." While a dualism between "patients' data" and "patients' stories" emerged in the patient-centeredness ideology, a dualism between enhancing "interaction skill" and "understanding" emerged in the cultural competence ideology. Moreover, the study found educator discourse managing dualism between BMD and BPS ideologies through the strategies of "connection" and "separation." The study concludes with a discussion and the implications for theory and research.

  17. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.
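One small but essential piece of such a system is mapping a tangible object's position on the table to coordinates in the projected map. The sketch below is illustrative only (Python, whereas the project's actual client is Java-based; the function name and extent handling are assumptions, not the paper's code):

```python
def table_to_geo(nx, ny, extent):
    """Map a tangible object's normalized tabletop position (TUIO-style
    x, y in [0, 1], with y increasing downward) to geospatial coordinates
    within the projected map extent (min_x, min_y, max_x, max_y)."""
    min_x, min_y, max_x, max_y = extent
    gx = min_x + nx * (max_x - min_x)
    gy = max_y - ny * (max_y - min_y)   # flip: screen y grows downward
    return gx, gy

# A token at the table centre lands at the centre of the map extent.
extent = (0.0, 0.0, 1000.0, 500.0)
```

With this extent, a token at the normalized centre (0.5, 0.5) maps to (500.0, 250.0); the same transform, run in reverse, places query results back under the physical object.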

  18. Computerization of guidelines: towards a "guideline markup language".

    PubMed

    Dart, T; Xu, Y; Chatellier, G; Degoulet, P

    2001-01-01

    Medical decision making is one of the most difficult daily tasks for physicians. Guidelines have been designed to reduce variance between physicians in daily practice, to improve patient outcomes and to control costs. In fact, few physicians use guidelines in daily practice. A way to ease the use of guidelines is to implement computerised guidelines (computer reminders). We present in this paper a method of computerising guidelines. Our objectives were: 1) to propose a generic model that can be instantiated for any specific guidelines; 2) to use eXtensible Markup Language (XML) as a guideline representation language to instantiate the generic model for a specific guideline. Our model is an object representation of a clinical algorithm; it has been validated by running two different guidelines issued by an official French agency. In spite of some limitations, we found that this model is expressive enough to represent complex guidelines devoted to diabetes and hypertension management. We conclude that XML can be used as a description format to structure guidelines and as an interface between paper-based guidelines and computer applications.
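The generic-model-plus-XML approach can be illustrated with a toy fragment. The element names and threshold values below are hypothetical, chosen only to show how a step/condition/action model might be instantiated from markup; they are not the schema the authors propose:

```python
import xml.etree.ElementTree as ET

# Hypothetical guideline fragment (illustrative element names and values).
GUIDELINE = """
<guideline name="hypertension">
  <step id="1">
    <condition>systolic_bp &gt;= 140</condition>
    <action>recommend lifestyle changes</action>
    <next>2</next>
  </step>
  <step id="2">
    <condition>systolic_bp &gt;= 160</condition>
    <action>consider drug therapy</action>
  </step>
</guideline>
"""

def load_steps(xml_text):
    """Instantiate the generic step/condition/action model from XML."""
    root = ET.fromstring(xml_text)
    return {
        step.get("id"): {
            "condition": step.findtext("condition"),
            "action": step.findtext("action"),
            "next": step.findtext("next"),
        }
        for step in root.findall("step")
    }

steps = load_steps(GUIDELINE)
```

The same loader works for any guideline encoded against the model, which is the separation (generic model vs. specific instantiation) the paper argues for.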

  19. Learning Midlevel Auditory Codes from Natural Sound Statistics.

    PubMed

    Młynarski, Wiktor; McDermott, Josh H

    2018-03-01

    Interaction with the world requires an organism to transform sensory signals into representations in which behaviorally meaningful properties of the environment are made explicit. These representations are derived through cascades of neuronal processing stages in which neurons at each stage recode the output of preceding stages. Explanations of sensory coding may thus involve understanding how low-level patterns are combined into more complex structures. To gain insight into such midlevel representations for sound, we designed a hierarchical generative model of natural sounds that learns combinations of spectrotemporal features from natural stimulus statistics. In the first layer, the model forms a sparse convolutional code of spectrograms using a dictionary of learned spectrotemporal kernels. To generalize from specific kernel activation patterns, the second layer encodes patterns of time-varying magnitude of multiple first-layer coefficients. When trained on corpora of speech and environmental sounds, some second-layer units learned to group similar spectrotemporal features. Others instantiate opponency between distinct sets of features. Such groupings might be instantiated by neurons in the auditory cortex, providing a hypothesis for midlevel neuronal computation.
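A drastically simplified sketch of the two-layer idea follows (plain Python; the actual model learns convolutional kernels and second-layer dictionaries from data, both of which this toy omits):

```python
def layer1_code(patch, kernels, threshold=0.5):
    """Toy first layer: take the dot product of each spectrotemporal
    kernel with a spectrogram patch and keep only strong responses,
    yielding a sparse code."""
    coeffs = []
    for kernel in kernels:
        resp = sum(s * k for s, k in zip(patch, kernel))
        coeffs.append(resp if abs(resp) > threshold else 0.0)
    return coeffs

def layer2_code(coeff_frames, groups):
    """Toy second layer: pool the magnitudes of first-layer coefficients
    over time within feature groups, discarding sign and fine timing."""
    return [
        sum(abs(frame[i]) for frame in coeff_frames for i in group)
        for group in groups
    ]
```

The second layer sees only time-varying magnitudes, so two first-layer activations of opposite sign pool to the same second-layer response, which is the sense in which it generalizes over specific activation patterns.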

  20. Blushing and the philosophy of mind.

    PubMed

    Bunge, Mario

    2007-01-01

    The introduction, an imaginary dialogue between a philosopher and a scientist, is followed by a brief discussion of the interactions between science, philosophy, and religion. Next comes an analysis of the three most popular philosophies of mind: classical mind-body dualism, computerism, and psychoneural monism. It is argued that the latter, held by medical psychologists since Hippocrates, and formulated explicitly by Cajal and Hebb, is the philosophy of mind that underlies contemporary cognitive and affective neuroscience. The standard objections to psychoneural monism (or materialism) are examined. Evolutionary psychology, though promissory, is judged to be more fancy than fact at its present stage. The conclusion is that the philosophy of mind is still in a poor shape, but that it can advance if it learns more from the science of mind. It would also help if scientific psychologists were to replace such tacitly dualistic expressions as "organ N instantiates (or subserves) mental function M" with "organ N performs mental function M", just as we say "the legs walk" instead of "walking is subserved by legs," and "the lungs breathe" instead of "the lungs instantiate breathing."

  1. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  2. Unaware Processing of Tools in the Neural System for Object-Directed Action Representation.

    PubMed

    Tettamanti, Marco; Conca, Francesca; Falini, Andrea; Perani, Daniela

    2017-11-01

    The hypothesis that the brain constitutively encodes observed manipulable objects for the actions they afford is still debated. Yet, crucial evidence demonstrating that, even in the absence of perceptual awareness, the mere visual appearance of a manipulable object triggers a visuomotor coding in the action representation system including the premotor cortex, has hitherto not been provided. In this fMRI study, we instantiated reliable unaware visual perception conditions by means of continuous flash suppression, and we tested in 24 healthy human participants (13 females) whether the visuomotor object-directed action representation system that includes left-hemispheric premotor, parietal, and posterior temporal cortices is activated even under subliminal perceptual conditions. We found consistent activation in the target visuomotor cortices, both with and without perceptual awareness, specifically for pictures of manipulable versus non-manipulable objects. By means of a multivariate searchlight analysis, we also found that the brain activation patterns in this visuomotor network enabled the decoding of manipulable versus non-manipulable object picture processing, both with and without awareness. These findings demonstrate the intimate neural coupling between visual perception and motor representation that underlies manipulable object processing: manipulable object stimuli specifically engage the visuomotor object-directed action representation system, in a constitutive manner that is independent from perceptual awareness. This perceptuo-motor coupling endows the brain with an efficient mechanism for monitoring and planning reactions to external stimuli in the absence of awareness. SIGNIFICANCE STATEMENT Our brain constantly encodes the visual information that hits the retina, leading to a stimulus-specific activation of sensory and semantic representations, even for objects that we do not consciously perceive. 
Do these unconscious representations encompass the motor programming of actions that could be accomplished congruently with the objects' functions? In this fMRI study, we instantiated unaware visual perception conditions, by dynamically suppressing the visibility of manipulable object pictures with mondrian masks. Despite escaping conscious perception, manipulable objects activated an object-directed action representation system that includes left-hemispheric premotor, parietal, and posterior temporal cortices. This demonstrates that visuomotor encoding occurs independently of conscious object perception. Copyright © 2017 the authors 0270-6474/17/3710712-13$15.00/0.

  3. Factors affecting the effectiveness of biomedical document indexing and retrieval based on terminologies.

    PubMed

    Dinh, Duy; Tamine, Lynda; Boubekeur, Fatiha

    2013-02-01

    The aim of this work is to evaluate a set of indexing and retrieval strategies based on the integration of several biomedical terminologies on the available TREC Genomics collections for an ad hoc information retrieval (IR) task. We propose a multi-terminology based concept extraction approach to selecting best concepts from free text by means of voting techniques. We instantiate this general approach on four terminologies (MeSH, SNOMED, ICD-10 and GO). We particularly focus on the effect of integrating terminologies into a biomedical IR process, and the utility of using voting techniques for combining the extracted concepts from each document in order to provide a list of unique concepts. Experimental studies conducted on the TREC Genomics collections show that our multi-terminology IR approach based on voting techniques yields statistically significant improvements over the baseline. For example, tested on the 2005 TREC Genomics collection, our multi-terminology based IR approach provides an improvement rate of +6.98% in terms of MAP (mean average precision) (p<0.05) compared to the baseline. In addition, our experimental results show that document expansion using preferred terms in combination with query expansion using terms from top-ranked expanded documents improves biomedical IR effectiveness. We have evaluated several voting models for combining concepts issued from multiple terminologies. Through this study, we presented many factors affecting the effectiveness of a biomedical IR system, including term weighting, query expansion, and document expansion models. The appropriate combination of those factors could be useful to improve the IR performance. Copyright © 2012 Elsevier B.V. All rights reserved.
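As a deliberately simple illustration of the voting idea, a concept can be kept when enough terminologies propose it. The paper evaluates several more sophisticated voting models, and the terminology outputs below are invented:

```python
from collections import Counter

def vote_concepts(extractions, min_votes=2):
    """Combine concepts extracted from one document by several
    terminologies: keep a concept if at least `min_votes` terminologies
    proposed it (one vote per terminology, however often it matched)."""
    votes = Counter()
    for concepts in extractions.values():
        votes.update(set(concepts))
    return sorted(c for c, n in votes.items() if n >= min_votes)

extractions = {
    "MeSH":   ["neoplasms", "apoptosis"],
    "SNOMED": ["neoplasms", "gene"],
    "GO":     ["apoptosis", "neoplasms"],
}
```

Raising `min_votes` trades recall for precision: with the invented data above, requiring all three terminologies to agree leaves only "neoplasms".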

  4. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks

    PubMed Central

    Valdivieso Caraguay, Ángel Leonardo; García Villalba, Luis Javier

    2017-01-01

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to support correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors. PMID:28362346

  5. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks.

    PubMed

    Caraguay, Ángel Leonardo Valdivieso; Villalba, Luis Javier García

    2017-03-31

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to support correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors.

  6. Towards ontology-driven navigation of the lipid bibliosphere

    PubMed Central

    Baker, Christopher JO; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R

    2008-01-01

    Background The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. Results We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. Conclusion As scientists accept the need to readjust the manner in which we search for information and derive knowledge we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically we have focussed on solving this challenge for lipidomics researchers who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. 
The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology. PMID:18315858

  7. Ventromedial prefrontal cortex generates pre-stimulus theta coherence desynchronization: A schema instantiation hypothesis.

    PubMed

    Gilboa, Asaf; Moscovitch, Morris

    2017-02-01

    The ventral medial prefrontal cortex (vmPFC) has long been implicated in monitoring of memory veracity, and more recently also in memory schema functions. In our model of strategic retrieval the two are related. We have proposed that the vmPFC has two schema-dependent functions: (i) to establish context-relevant templates against which the output of memory systems can be compared; (ii) to mediate automatic decision monitoring processes to ensure that only those responses that meet the criterion are enacted. Electroencephalogram (EEG) data were used to provide evidence that vmPFC supports both functions, and that schema instantiation informs monitoring. Participants viewed pictures of acquaintances, along with those of famous and nonfamous people, and were asked to respond positively only to pictures of individuals they had met (personal familiarity). The Self serves as a super-ordinate cognitive schema, facilitating accurate endorsement of acquaintances and exclusion of non-personal but familiar faces. For the present report we focused on pre-cue tonic oscillatory activity. Controls demonstrated theta coherence desynchronization between medial prefrontal areas, inferotemporal and lateral temporal cortices. These oscillatory coherence patterns were significantly reduced in patients with vmPFC damage, especially in those with clinical histories of spontaneous confabulation. Importantly, these pre-stimulus cortico-cortical desynchronizations predicted post-cue automatic memory activation, as indexed by a familiarity modulation of the face-sensitive posterior cortical N170. Pre-cue desynchronization also predicted early post-cue frontal positive modulation (P230) and response accuracy. The data are consistent with a schema instantiation model that suggests the vmPFC biases posterior neocortical long-term memory representations that enhance automatic memory cue processing and informs frontally-mediated rapid memory monitoring (P230). 
Damage to these structures can lead to inaccurate, context-irrelevant activation of schemas. These, in turn, impair monitoring signals and can lead to confabulation when memory control processes are also deficient. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Towards ontology-driven navigation of the lipid bibliosphere.

    PubMed

    Baker, Christopher Jo; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R

    2008-01-01

    The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. As scientists accept the need to readjust the manner in which we search for information and derive knowledge we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically we have focussed on solving this challenge for lipidomics researchers who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. 
The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology.

  9. Providing Personalized Energy Management and Awareness Services for Energy Efficiency in Smart Buildings.

    PubMed

    Fotopoulou, Eleni; Zafeiropoulos, Anastasios; Terroso-Sáenz, Fernando; Şimşek, Umutcan; González-Vidal, Aurora; Tsiolis, George; Gouvas, Panagiotis; Liapis, Paris; Fensel, Anna; Skarmeta, Antonio

    2017-09-07

    Considering that the largest part of end-use energy consumption worldwide is associated with the buildings sector, there is an inherent need for the conceptualization, specification, implementation, and instantiation of novel solutions in smart buildings, able to achieve significant reductions in energy consumption through the adoption of energy efficient techniques and the active engagement of the occupants. Towards the design of such solutions, the identification of the main energy consuming factors, trends, and patterns, along with the appropriate modeling and understanding of the occupants' behavior and the potential for the adoption of environmentally-friendly lifestyle changes have to be realized. In the current article, an innovative energy-aware information technology (IT) ecosystem is presented, aiming to support the design and development of novel personalized energy management and awareness services that can lead to occupants' behavioral change towards actions that can have a positive impact on energy efficiency. Novel information and communication technologies (ICT) are exploited towards this direction, related mainly to the evolution of the Internet of Things (IoT), data modeling, management and fusion, big data analytics, and personalized recommendation mechanisms. The combination of such technologies has resulted in an open and extensible architectural approach able to exploit in a homogeneous, efficient and scalable way the vast amount of energy, environmental, and behavioral data collected in energy efficiency campaigns and lead to the design of energy management and awareness services targeted to the occupants' lifestyles. The overall layered architectural approach is detailed, including design and instantiation aspects based on the selection of a set of available technologies and tools. 
Initial results from the usage of the proposed energy aware IT ecosystem in a pilot site at the University of Murcia are presented along with a set of identified open issues for future research.

  10. The importance of shared mental models and shared situation awareness for transforming robots from tools to teammates

    NASA Astrophysics Data System (ADS)

    Ososky, Scott; Schuster, David; Jentsch, Florian; Fiore, Stephen; Shumaker, Randall; Lebiere, Christian; Kurup, Unmesh; Oh, Jean; Stentz, Anthony

    2012-06-01

    Current ground robots are largely employed via tele-operation and provide their operators with useful tools to extend reach, improve sensing, and avoid dangers. To move from robots that are useful as tools to truly synergistic human-robot teaming, however, will require not only greater technical capabilities among robots, but also a better understanding of the ways in which the principles of teamwork can be applied from exclusively human teams to mixed teams of humans and robots. In this respect, a core characteristic that enables successful human teams to coordinate shared tasks is their ability to create, maintain, and act on a shared understanding of the world and the roles of the team and its members in it. The team performance literature clearly points towards two important cornerstones for shared understanding of team members: mental models and situation awareness. These constructs have been investigated as products of teams as well; amongst teams, they are shared mental models and shared situation awareness. Consequently, we are studying how these two constructs can be measured and instantiated in human-robot teams. In this paper, we report results from three related efforts that are investigating process and performance outcomes for human robot teams. Our investigations include: (a) how human mental models of tasks and teams change depending on whether a teammate is human, a service animal, or an advanced automated system; (b) how computer modeling can lead to mental models being instantiated and used in robots; (c) how we can simulate the interactions between human and future robotic teammates on the basis of changes in shared mental models and situation assessment.

  11. Providing Personalized Energy Management and Awareness Services for Energy Efficiency in Smart Buildings

    PubMed Central

    Fotopoulou, Eleni; Tsiolis, George; Gouvas, Panagiotis; Liapis, Paris; Fensel, Anna; Skarmeta, Antonio

    2017-01-01

Considering that the largest part of end-use energy consumption worldwide is associated with the buildings sector, there is an inherent need for the conceptualization, specification, implementation, and instantiation of novel solutions in smart buildings, able to achieve significant reductions in energy consumption through the adoption of energy efficient techniques and the active engagement of the occupants. Towards the design of such solutions, the identification of the main energy consuming factors, trends, and patterns, along with the appropriate modeling and understanding of the occupants’ behavior and the potential for the adoption of environmentally-friendly lifestyle changes have to be realized. In the current article, an innovative energy-aware information technology (IT) ecosystem is presented, aiming to support the design and development of novel personalized energy management and awareness services that can lead to occupants’ behavioral change towards actions that can have a positive impact on energy efficiency. Novel information and communication technologies (ICT) are exploited towards this direction, related mainly to the evolution of the Internet of Things (IoT), data modeling, management and fusion, big data analytics, and personalized recommendation mechanisms. The combination of such technologies has resulted in an open and extensible architectural approach able to exploit in a homogeneous, efficient and scalable way the vast amount of energy, environmental, and behavioral data collected in energy efficiency campaigns and lead to the design of energy management and awareness services targeted to the occupants’ lifestyles. The overall layered architectural approach is detailed, including design and instantiation aspects based on the selection of a set of available technologies and tools. Initial results from the usage of the proposed energy-aware IT ecosystem in a pilot site at the University of Murcia are presented along with a set of identified open issues for future research. PMID:28880227

  12. Uses and Misuses of Ted Kaczynski's MMPI.

    PubMed

    Ben-Porath, Yossef S

    2018-05-22

    Although case studies can be a helpful didactic aid when teaching personality assessment and illustrating use of a test, they cannot, of course, be used as "evidence" that a test "works" or does not work. This article, however, reviews and discusses the far more problematic uses instantiated in a case study of Ted Kaczynski's Minnesota Multiphasic Personality Inventory (MMPI). A series of errors of omission and commission are identified in Butcher, Hass, Greene, and Nelson's (2015) effort to criticize the MMPI-2-RF. These include not disclosing that Butcher's interpretive Minnesota Report for Forensic Settings indicates that the protocol is invalid, not including most of the MMPI-2 and MMPI-2-RF scores that contradict the authors' assertions, and mischaracterizing the MMPI-2-RF findings. Proper use of a case study is then illustrated by a discussion of diagnostic considerations indicated by the MMPI-2-RF findings.

  13. State institutions and social identity: National representation in soldiers' and civilians' interview talk concerning military service.

    PubMed

    Gibson, Stephen; Condor, Susan

    2009-06-01

    Theory and research deriving from social identity or self-categorization perspectives often starts out with the presumption that social actors necessarily view societal objects such as nations or states as human categories. However, recent work suggests that this may be only one of a number of forms that societal representation may take. For example, nations may be understood variously as peoples, places, or institutions. This paper presents findings from a qualitative interview study conducted in England, in which soldiers and civilians talked about nationhood in relation to military service. Analysis indicated that, in this context, speakers were often inclined to use the terms 'Britain', 'nation', and 'country' as references to a political institution as opposed to a category of people. In addition, there were systematic differences between the ways in which the two samples construed their nation in institutional terms. The civilians were inclined to treat military service as a matter of obedience to the dictates of the Government of the day. In contrast, the soldiers were more inclined to frame military service as a matter of loyalty to the state as symbolically instantiated in the body of the sovereign. Implications for work adopting a social identity perspective are discussed.

  14. Exploiting semantics for sensor re-calibration in event detection systems

    NASA Astrophysics Data System (ADS)

    Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini

    2008-01-01

    Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it is still a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames, which cannot be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas by using an appliance as an example--coffee pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.

  15. The influence of the visual modality on language structure and conventionalization: insights from sign language and gesture.

    PubMed

    Perniss, Pamela; Özyürek, Asli; Morgan, Gary

    2015-01-01

    For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and (co-speech) gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles in this issue explore and document how gestures and sign languages are similar or different and how communicative expression in the visual modality can change from being gestural to grammatical in nature through processes of conventionalization. As such, this issue contributes to our understanding of how the visual modality shapes language and the emergence of linguistic structure in newly developing systems. Studying the relationship between signs and gestures provides a new window onto the human ability to recruit multiple levels of representation (e.g., categorical, gradient, iconic, abstract) in the service of using or creating conventionalized communicative systems. Copyright © 2015 Cognitive Science Society, Inc.

  16. Inhibitory Competition between Shape Properties in Figure-Ground Perception

    ERIC Educational Resources Information Center

    Peterson, Mary A.; Skow, Emily

    2008-01-01

    Theories of figure-ground perception entail inhibitory competition between either low-level units (edge or feature units) or high-level shape properties. Extant computational models instantiate the 1st type of theory. The authors investigated a prediction of the 2nd type of theory: that shape properties suggested on the ground side of an edge are…

  17. The Design and Implementation of an Online Professional Development Program for Future Online Educators: A Case Study

    ERIC Educational Resources Information Center

    Quah, Joy

    2013-01-01

    This study examines the practices of an instructor who designed and instantiated an online professional development program to foster expertise in online instructional design. The main purpose of this study is to investigate how her application of technological affordances may inform a re-examination of Cognitive Apprenticeship (Collins, Brown,…

  18. For Want of a Nail: How Absences Cause Events

    ERIC Educational Resources Information Center

    Wolff, Phillip; Barbey, Aron K.; Hausknecht, Matthew

    2010-01-01

    Causation by omission is instantiated when an effect occurs from an absence, as in "The absence of nicotine causes withdrawal" or "Not watering the plant caused it to wilt." The phenomenon has been viewed as an insurmountable problem for process theories of causation, which specify causation in terms of conserved quantities, like force, but not…

  19. Grammatical Gender Agreement in L2 Spanish: The Role of Syntactic Context

    ERIC Educational Resources Information Center

    Spino-Seijas, Le Anne L.

    2017-01-01

    A pervasive question in second language (L2) research is whether L2 learners can acquire parameterized functional features that are not instantiated in their first language (L1). While some researchers have argued for a representational deficit (e.g., Clahsen & Muysken, 1989; Hawkins & Chan, 1997), claiming that L2 learners' competence is…

  20. Understanding the Co-construction of Inquiry Practices: A Case Study of a Responsive Teaching Environment

    ERIC Educational Resources Information Center

    Maskiewicz, April C.; Winters, Victoria A.

    2012-01-01

    We set out to understand how different instantiations of inquiry emerged in two different years of one elementary teacher's classroom. Longitudinal observations from Mrs. Charles' 5th grade science classroom forced us to carefully and deliberately consider who exactly was responsible for the change in the class activities and norms. We provide…

  1. General-Purpose Ada Software Packages

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.
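The pattern the record describes, a generic package written once and instantiated for types declared at the user's facility, can be sketched with Python generics. Python is used here purely for illustration; the `Stack` class and its operations are hypothetical and are not part of the NASA Ada collection:

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """Analogue of a generic package: one definition, instantiated per element type."""
    def __init__(self) -> None:
        self._items: List[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

# "Instantiation" for a user-declared type, here int:
ints: Stack[int] = Stack()
ints.push(3)
ints.push(7)
print(ints.pop())  # 7
```

In Ada the instantiation would be an explicit declaration (e.g. a `new` instantiation of the generic for a given type); Python's parameterized classes play the same role of reusing one definition across element types.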

  2. Software for Collaborative Use of Large Interactive Displays

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shab, Theodore; Wales, Roxana; Vera, Alonso; Tollinger, Irene; McCurdy, Michael; Lyubimov, Dmitriy

    2006-01-01

    The MERBoard Collaborative Workspace, which is currently being deployed to support the Mars Exploration Rover (MER) Missions, is the first instantiation of a new computing architecture designed to support collaborative and group computing using computing devices situated in a NASA mission operations room. It is a software system for generation of large-screen interactive displays by multiple users.

  3. A Funds of Knowledge Approach to the Appropriation of New Media in a High School Writing Classroom

    ERIC Educational Resources Information Center

    Schwartz, Lisa H.

    2015-01-01

    Youths' learner-generated designs, instantiated in digital practices, spaces and artifacts, are underutilized in schools. Additionally, digital media tools are often taken up in reductive ways that serve to perpetuate deficit discourses for youth from nondominant communities, rather than reflect the creativity and innovation that youth practice…

  4. Making a Minimalist Approach to Codeswitching Work: Adding the Matrix Language.

    ERIC Educational Resources Information Center

    Jake, Janice L.; Myers-Scotton, Carol; Gross, Steven

    2002-01-01

    Discusses the Matrix Language Frame model. Analysis of noun phrases in a Spanish-English corpus illustrates this compatibility and shows how recent minimalist proposals can explain the distribution of nouns and determiners in the data if they adopt the notion of matrix language as bilingual instantiation of structural uniformity in a CP.…

  5. Processing Focus Structure in L1 and L2 French: L2 Proficiency Effects on ERPs

    ERIC Educational Resources Information Center

    Reichle, Robert V.; Birdsong, David

    2014-01-01

    This study examined the event-related potentials (ERPs) elicited by focus processing among first language (L1) speakers and second language (L2) learners of French. Participants read wh-questions containing explicit focus marking, followed by responses instantiating contrastive and informational focus. We hypothesized that L2 proficiency would…

  6. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-02-01

    Call for Papers: Convergence Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. These include, but are not limited to:

  7. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-03-01

    Call for Papers: Convergence Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. These include, but are not limited to:

  8. Whatever the Law Says: Language Policy Implementation and Early-Grade Literacy Achievement in Kenya

    ERIC Educational Resources Information Center

    Trudell, Barbara; Piper, Benjamin

    2014-01-01

    Language policy is generally seen as a national-level decision regarding which languages the state will support, and in which public domains. However, the reality is that language policy plays out at regional and local levels as well. In fact, it could be argued that the most important instantiations of language policy are those which directly…

  9. Facilitating 3D Virtual World Learning Environments Creation by Non-Technical End Users through Template-Based Virtual World Instantiation

    ERIC Educational Resources Information Center

    Liu, Chang; Zhong, Ying; Ozercan, Sertac; Zhu, Qing

    2013-01-01

    This paper presents a template-based solution to overcome technical barriers non-technical computer end users face when developing functional learning environments in three-dimensional virtual worlds (3DVW). "iVirtualWorld," a prototype of a platform-independent 3DVW creation tool that implements the proposed solution, facilitates 3DVW…

  10. An Analytical Framework for Categorizing the Use of CAS Symbolic Manipulation in Textbooks

    ERIC Educational Resources Information Center

    Davis, Jon D.; Fonger, Nicole L.

    2015-01-01

    The symbolic manipulation capabilities of computer algebra systems, which we refer to as CAS-S, are now becoming instantiated within secondary mathematics textbooks in the United States for the first time. While a number of research studies have examined how teachers use this technology in their classrooms, one of the most important factors in how…

  11. The Effect on Pupils' Science Performance and Problem-Solving Ability through Lego: An Engineering Design-Based Modeling Approach

    ERIC Educational Resources Information Center

    Li, Yanyan; Huang, Zhinan; Jiang, Menglu; Chang, Ting-Wen

    2016-01-01

    Incorporating scientific fundamentals via engineering through a design-based methodology has proven to be highly effective for STEM education. Engineering design can be instantiated for learning as they involve mental and physical stimulation and develop practical skills especially in solving problems. Lego bricks, as a set of toys based on design…

  12. Social Capital and Stability Operations

    DTIC Science & Technology

    2008-03-26

    defined as an instantiated set of informal values or norms that permit cooperation between two or more individuals, is the sine qua non of stable... multi-dimensional research, and editorial opinions, relate to the means (resources) by which to accomplish stability operations: unified action...development phase requires weaning indigenous institutions from reliance on external assistance. Fukuyama asserts that this is hard for three reasons

  13. The Development of Conceptions of the Right to Literacy in Traditional Rural Africa

    ERIC Educational Resources Information Center

    Day, Kathryn Louise

    2010-01-01

    This study examined conceptions of the right to literacy in children, adolescents, and young adults living in rural Zulu villages in the mountains of KwaZulu Natal, South Africa, as one instantiation of the development of conceptions of human rights in a developing world setting. Of human rights, literacy was chosen because of its familiarity to…

  14. Non-Bayesian Noun Generalization in 3-to 5-Year-Old Children: Probing the Role of Prior Knowledge in the Suspicious Coincidence Effect

    ERIC Educational Resources Information Center

    Jenkins, Gavin W.; Samuelson, Larissa K.; Smith, Jodi R.; Spencer, John P.

    2015-01-01

    It is unclear how children learn labels for multiple overlapping categories such as "Labrador," "dog," and "animal." Xu and Tenenbaum (2007a) suggested that learners infer correct meanings with the help of Bayesian inference. They instantiated these claims in a Bayesian model, which they tested with preschoolers and…
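The Bayesian model being probed can be sketched via Xu and Tenenbaum's size principle: the likelihood of n independently sampled examples under hypothesis h is (1/|h|)^n, so narrower hypotheses gain posterior mass as consistent examples accumulate (the "suspicious coincidence"). The category sizes and uniform prior below are illustrative placeholders, not the paper's values:

```python
def size_principle_posterior(hypotheses, n):
    """Posterior over nested hypotheses after n examples consistent with all of
    them, assuming a uniform prior and the size-principle likelihood (1/size)**n."""
    scores = {h: (1.0 / size) ** n for h, size in hypotheses.items()}
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

# Nested label hypotheses for, e.g., an example labeled "fep" shown on a Dalmatian
# (hypothetical extension sizes):
hyps = {"dalmatian": 5, "dog": 50, "animal": 500}
after_one = size_principle_posterior(hyps, 1)
after_three = size_principle_posterior(hyps, 3)
print(after_one["dalmatian"], after_three["dalmatian"])
```

Three examples all drawn from the subordinate category would be a suspicious coincidence under the broader hypotheses, so the posterior on "dalmatian" rises with n; the developmental question in the record is whether children's generalization actually tracks this.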

  15. An Analysis of Peer-Submitted and Peer-Reviewed Answer Rationales, in an Asynchronous Peer Instruction Based Learning Environment

    ERIC Educational Resources Information Center

    Bhatnagar, Sameer; Lasry, Nathaniel; Desmarais, Michel; Dugdale, Michael; Whittaker, Chris; Charles, Elizabeth S.

    2015-01-01

    This paper reports on an analysis of data from a novel "Peer Instruction" application, named DALITE. The Peer Instruction paradigm is well suited to take advantage of peer-input in web-based learning environments. DALITE implements an asynchronous instantiation of peer instruction: after submitting their answer to a multiple-choice…

  16. Virtual Environments Overview

    DTIC Science & Technology

    2009-04-01

    Gabbard (2008), p. 36 The extensions of our framework to opportunities and instantiations respect this admonition in focusing on uses of this...points above, which we found in reviewing other public reports on the current state and issues with analytic practice (notably Treverton & Gabbard, 2008...agencies, with differing backgrounds and expertise will need to work together to unravel tomorrow’s threats, which Treverton & Gabbard (2008) likened

  17. Pens on the Prize: Linking School and Community through Contest-Inspired Literacy

    ERIC Educational Resources Information Center

    Jocson, Korina; Burnside, Sherdren; Collins, Mualimu

    2006-01-01

    This article looks closely at one instantiation of a poetry-centered practice as supported by June Jordan's Poetry for the People, specifically in the context of an inaugural poetry contest. It builds on an earlier empirical study, which investigated the partnership between Poetry for the People and one East Bay Area high school and the rich…

  18. Origins of Eukaryotic Sexual Reproduction

    PubMed Central

    2014-01-01

    Sexual reproduction is a nearly universal feature of eukaryotic organisms. Given its ubiquity and shared core features, sex is thought to have arisen once in the last common ancestor to all eukaryotes. Using the perspectives of molecular genetics and cell biology, we consider documented and hypothetical scenarios for the instantiation and evolution of meiosis, fertilization, sex determination, uniparental inheritance of organelle genomes, and speciation. PMID:24591519

  19. Hidden Markov models for character recognition.

    PubMed

    Vlontzos, J A; Kung, S Y

    1992-01-01

    A hierarchical system for character recognition with hidden Markov model knowledge sources which solve both the context sensitivity problem and the character instantiation problem is presented. The system achieves 97-99% accuracy using a two-level architecture and has been implemented using a systolic array, thus permitting real-time (1 ms per character) multifont and multisize printed character recognition as well as handwriting recognition.
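The abstract does not spell out the two-level architecture, but the core computation in any HMM-based recognizer of this kind is Viterbi decoding: finding the most likely hidden sequence (e.g., character identities) given the observed features. A minimal sketch, with toy states and deterministic emissions chosen purely for illustration:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden state sequence for an observation sequence."""
    # V[t][s] = (probability of best path ending in state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

# Toy HMM: each state emits its own symbol with certainty.
states = ["X", "Y"]
start_p = {"X": 0.5, "Y": 0.5}
trans_p = {s: {"X": 0.5, "Y": 0.5} for s in states}
emit_p = {"X": {"x": 1.0, "y": 0.0}, "Y": {"x": 0.0, "y": 1.0}}
print(viterbi(["x", "y", "x"], states, start_p, trans_p, emit_p))  # ['X', 'Y', 'X']
```

In a character recognizer, the states would range over character hypotheses and the transition probabilities would encode language context, which is how an HMM knowledge source addresses the context-sensitivity problem the record mentions.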

  20. Scalable System Design for Covert MIMO Communications

    DTIC Science & Technology

    2014-06-01

    Sample based resolution of the QRD and equalization processes in the MIMO receiver, for NQR = 11... NQR calculation parameters... Resources available on Xilinx Virtex-7 FPGAs... carried out for Na ∈ [2 3 4]. Extrapolation is used to determine trends as a function of the number of QRD blocks instantiated NQR and Na. This section

  1. A Model of Network Porosity

    DTIC Science & Technology

    2016-11-09

    the model does not become a full probabilistic attack graph analysis of the network, whose data requirements are currently unrealistic. The second...flow. – Untrustworthy persons may intentionally try to exfiltrate known sensitive data to external networks. People may also unintentionally leak...section will provide details on the components, procedures, data requirements, and parameters required to instantiate the network porosity model. These

  2. Fusion of Hard and Soft Information in Nonparametric Density Estimation

    DTIC Science & Technology

    2015-06-10

    and stochastic optimization models, in analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum...particular, density estimation is needed for generation of input densities to simulation and stochastic optimization models, in analysis of simulation output...an essential step in simulation analysis and stochastic optimization is the generation of probability densities for input random variables; see for

  3. Rosen's (M,R) system as an X-machine.

    PubMed

    Palmer, Michael L; Williams, Richard A; Gatherer, Derek

    2016-11-07

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly both irreducible to sub-models of its component states and non-computable on a Turing machine. (M,R) stands as an obstacle to both reductionist and mechanistic presentations of systems biology, principally due to its self-referential structure. If (M,R) has the properties claimed for it, computational systems biology will not be possible, or at best will be a science of approximate simulations rather than accurate models. Several attempts have been made, at both empirical and theoretical levels, to disprove this assertion by instantiating (M,R) in software architectures. So far, these efforts have been inconclusive. In this paper, we attempt to demonstrate why - by showing how both finite state machine and stream X-machine formal architectures fail to capture the self-referential requirements of (M,R). We then show that a solution may be found in communicating X-machines, which remove self-reference using parallel computation, and then synthesise such machine architectures with object-orientation to create a formal basis for future software instantiations of (M,R) systems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. A novel approach for connecting temporal-ontologies with blood flow simulations.

    PubMed

    Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M

    2013-06-01

    In this paper an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold: on the one hand, ontologies have been proven to be able to provide specialized medical knowledge, e.g., key parameters for simulations. On the other hand, based on a set of rules and the usage of a reasoner, a system for checking the plausibility as well as tracking the outcome of medical simulations can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers. Later on, this set of results can be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher as well. This approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison. Variations on the provided parameters can be automatically used to drive these simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example for an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given. Copyright © 2013 Elsevier Inc. All rights reserved.
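The rule-based plausibility checking described above can be sketched as a set of range rules applied to simulation inputs and outputs. The parameter names and bounds below are hypothetical placeholders for illustration, not values from the authors' ontology or rule base:

```python
# Toy analogue of an ontology-backed rule base: each rule names a simulation
# parameter and the interval it must fall in to be physically plausible.
RULES = {
    "blood_viscosity_Pa_s": (0.003, 0.004),  # hypothetical bounds
    "inlet_velocity_m_s": (0.0, 2.0),        # hypothetical bounds
}

def check_plausibility(params):
    """Return human-readable rule violations; an empty list means plausible."""
    violations = []
    for name, (lo, hi) in RULES.items():
        if name in params and not (lo <= params[name] <= hi):
            violations.append(f"{name}={params[name]} outside [{lo}, {hi}]")
    return violations

print(check_plausibility({"inlet_velocity_m_s": 5.0}))
```

In the paper's setting the rules would be stated over ontology concepts and evaluated by a reasoner rather than hard-coded, but the effect is the same: unphysical inputs or outputs are flagged and reported to the physician.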

  5. Addressing software security risk mitigations in the life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2003-01-01

    The NASA Office of Safety and Mission Assurance (OSMA) has funded the Jet Propulsion Laboratory (JPL) with a Center Initiative, 'Reducing Software Security Risk through an Integrated Approach' (RSSR), to address this need. The Initiative is a formal approach to addressing software security in the life cycle through the instantiation of a Software Security Assessment Instrument (SSAI) for the development and maintenance life cycles.

  6. An object oriented extension to CLIPS

    NASA Technical Reports Server (NTRS)

    Sobkowicz, Clifford

    1990-01-01

    A presentation of software sub-system developed to augment C Language Production Systems (CLIPS) with facilities for object oriented Knowledge representation. Functions are provided to define classes, instantiate objects, access attributes, and assert object related facts. This extension is implemented via the CLIPS user function interface and does not require modification of any CLIPS code. It does rely on internal CLIPS functions for memory management and symbol representation.
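
    The facilities the abstract lists (define classes, instantiate objects, access attributes, assert object-related facts) can be sketched as follows. The function names and data shapes here are hypothetical illustrations; the actual CLIPS extension's interface is not given in the abstract.

```python
# Hypothetical sketch of the kind of API the abstract describes: define
# classes, instantiate objects, access attributes, assert facts.

classes = {}   # class name -> list of slot names
facts = []     # asserted fact tuples

def defclass(name, slots):
    classes[name] = slots

def make_instance(name, cls, **values):
    slots = {slot: values.get(slot) for slot in classes[cls]}
    facts.append(("object", name, cls))  # assert an object-related fact
    return {"name": name, "class": cls, "slots": slots}

def slot_value(instance, slot):
    return instance["slots"][slot]

defclass("valve", ["state", "pressure"])
v1 = make_instance("v1", "valve", state="open", pressure=3.2)
print(slot_value(v1, "state"))  # open
print(facts)                    # [('object', 'v1', 'valve')]
```

    Asserting an object fact on instantiation mirrors how the extension lets CLIPS rules pattern-match on objects without modifying the CLIPS inference engine itself.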

  7. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a- service (IaaS) cloud computing services such as...Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business critical data and applications in the...thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features

  8. An Interlingual-based Approach to Reference Resolution

    DTIC Science & Technology

    2000-01-01

    unclassified c . THIS PAGE unclassified Standard Form 298 (Rev. 8-98) Prescribed by ANSI Std Z39-18 They do not generally consider implicit references or...AMOUNT: C [currency] etc. where TIME, LOCATION, AGENT, THEME, human, organization, object, etc. are all ontological concepts. On some particular...AMOUNT: C [currency] etc. These instantiated representational objects are, in turn, referents in the discourse context when the next sentence is

  9. The Development of a Research Environment for Neural Networks: Instantiating Neocognitions

    DTIC Science & Technology

    1990-12-21

    interactive activation to adaptive resonance. Cognitive Science, 11:23-63. Reprinted in (Grossberg, 1988). Grossberg, S., editor (1988). Neural...higher order correlation network. Physica 22D, pages 276-306. Rosenblatt, F. (1962). Principles of Neurodynamics: Perceptrons and the Theory of Brain...and the PDP Research Group (1986b). Parallel Distributed Processing: Explorations in the Microstructures of Cognition, volume 1: Foundations

  10. Semantic Document Model to Enhance Data and Knowledge Interoperability

    NASA Astrophysics Data System (ADS)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents: uniquely identified and semantically annotated composite resources that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF- and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and remote semantic document repositories for document content units (CUs) over Semantic Web protocols.

  11. Time-Warp–Invariant Neuronal Processing

    PubMed Central

    Gütig, Robert; Sompolinsky, Haim

    2009-01-01

    Fluctuations in the temporal durations of sensory signals constitute a major source of variability within natural stimulus ensembles. The neuronal mechanisms through which sensory systems can stabilize perception against such fluctuations are largely unknown. An intriguing instantiation of such robustness occurs in human speech perception, which relies critically on temporal acoustic cues that are embedded in signals with highly variable duration. Across different instances of natural speech, auditory cues can undergo temporal warping that ranges from 2-fold compression to 2-fold dilation without significant perceptual impairment. Here, we report that time-warp–invariant neuronal processing can be subserved by the shunting action of synaptic conductances that automatically rescales the effective integration time of postsynaptic neurons. We propose a novel spike-based learning rule for synaptic conductances that adjusts the degree of synaptic shunting to the temporal processing requirements of a given task. Applying this general biophysical mechanism to the example of speech processing, we propose a neuronal network model for time-warp–invariant word discrimination and demonstrate its excellent performance on a standard benchmark speech-recognition task. Our results demonstrate the important functional role of synaptic conductances in spike-based neuronal information processing and learning. The biophysics of temporal integration at neuronal membranes can endow sensory pathways with powerful time-warp–invariant computational capabilities. PMID:19582146

  12. A Design Rationale Capture Tool to Support Design Verification and Re-use

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.

    2012-01-01

    A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.

  13. Hydrological Modeling in Alaska with WRF-Hydro

    NASA Astrophysics Data System (ADS)

    Elmer, N. J.; Zavodsky, B.; Molthan, A.

    2017-12-01

    The operational National Water Model (NWM), implemented in August 2016, is an instantiation of the Weather Research and Forecasting hydrological extension package (WRF-Hydro). Currently, the NWM only covers the contiguous United States, but will be expanded to include an Alaska domain in the future. It is well known that Alaska presents several hydrological modeling challenges, including unique arctic/sub-arctic hydrological processes not observed elsewhere in the United States and a severe lack of in-situ observations for model initialization. This project sets up an experimental version of WRF-Hydro in Alaska mimicking the NWM to gauge the ability of WRF-Hydro to represent hydrological processes in Alaska and identify model calibration challenges. Recent and upcoming launches of hydrology-focused NASA satellite missions such as the Soil Moisture Active Passive (SMAP) and Surface Water Ocean Topography (SWOT) expand the spatial and temporal coverage of observations in Alaska, so this study also lays the groundwork for assimilating these NASA datasets into WRF-Hydro in the future.

  14. S-Band POSIX Device Drivers for RTEMS

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2011-01-01

    This is a set of POSIX device driver level abstractions in the RTEMS RTOS (Real-Time Executive for Multiprocessor Systems real-time operating system) to SBand radio hardware devices that have been instantiated in an FPGA (field-programmable gate array). These include A/D (analog-to-digital) sample capture, D/A (digital-to-analog) sample playback, PLL (phase-locked-loop) tuning, and PWM (pulse-width-modulation)-controlled gain. This software interfaces to Sband radio hardware in an attached Xilinx Virtex-2 FPGA. It uses plug-and-play device discovery to map memory to device IDs. Instead of interacting with hardware devices directly, using direct-memory mapped access at the application level, this driver provides an application programming interface (API) offering that easily uses standard POSIX function calls. This simplifies application programming, enables portability, and offers an additional level of protection to the hardware. There are three separate device drivers included in this package: sband_device (ADC capture and DAC playback), pll_device (RF front end PLL tuning), and pwm_device (RF front end AGC control).

  15. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  16. A cognitive robotics system: the symbolic and sub-symbolic robotic intelligence control system (SS-RICS)

    NASA Astrophysics Data System (ADS)

    Kelley, Troy D.; Avery, Eric

    2010-04-01

    This paper will detail the progress on the development of the Symbolic and Subsymbolic Robotics Intelligence Control System (SS-RICS). The system is a goal-oriented production system based loosely on the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture, with some additions and changes. We have found that in order to simulate complex cognition on a robot, many aspects of cognition (long-term memory (LTM), perception) needed to be in place before any generalized intelligent behavior could be produced. In working with ACT-R, we found that it was a good instantiation of working memory, but that we needed to add other aspects of cognition, including LTM and perception, to have a complete cognitive system. Our progress to date will be noted and the challenges that remain will be addressed.

  17. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    proliferation and popularity of infrastructure-as-a- service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup

  18. A High-Level Symbolic Representation for Intelligent Agents Across Multiple Architectures

    DTIC Science & Technology

    2004-07-01

    components of Soar that map to these concepts (instantiation support, selected operator). ...AnswerSpeedRequest ((msg> isa RequestSpeedChange... [remainder of excerpt is unreadable OCR of an interface screenshot]

  19. Developing a Corrective Action Simulator to Support Decision Making Research and Training

    DTIC Science & Technology

    2008-05-01

    positions, and any time-based simulation injects (e.g., JSTARS reporting tracks, the Engineer reporting a new aircraft bingo time, a threat being active...future instantiations would benefit from migrating to the IMPRINT Pro version. During the course of this development effort the Army Research...initiating corrective action when a subordinate is observed to make an error (of omission or commission) 58 • Benefits of a Corrective

  20. The Role of Critical Inquiry in (Re)constructing the Public Agenda for Higher Education: Confronting the Conservative Modernization of the Academy

    ERIC Educational Resources Information Center

    Gildersleeve, Ryan Evely; Kuntz, Aaron M.; Pasque, Penny A.; Carducci, Rozana

    2010-01-01

    As higher education seeks to become more socially responsive, the public agenda is one form that has taken root in explicating the relation of higher education to society. In this paper, we critically analyze two different instantiations of the public agenda for higher education, placing them against the backdrop of what Michael Apple (2006a)…

  1. Augmenting distractor filtering via transcranial magnetic stimulation of the lateral occipital cortex.

    PubMed

    Eštočinová, Jana; Lo Gerfo, Emanuele; Della Libera, Chiara; Chelazzi, Leonardo; Santandrea, Elisa

    2016-11-01

    Visual selective attention (VSA) optimizes perception and behavioral control by enabling efficient selection of relevant information and filtering of distractors. While focusing resources on task-relevant information helps counteract distraction, dedicated filtering mechanisms have recently been demonstrated, allowing neural systems to implement suitable policies for the suppression of potential interference. Limited evidence is presently available concerning the neural underpinnings of these mechanisms, and whether neural circuitry within the visual cortex might play a causal role in their instantiation, a possibility that we directly tested here. In two related experiments, transcranial magnetic stimulation (TMS) was applied over the lateral occipital cortex of healthy humans at different times during the execution of a behavioral task which entailed varying levels of distractor interference and need for attentional engagement. While earlier TMS boosted target selection, stimulation within a restricted time epoch close to (and in the course of) stimulus presentation engendered selective enhancement of distractor suppression, by affecting the ongoing, reactive instantiation of attentional filtering mechanisms required by specific task conditions. The results attest to a causal role of mid-tier ventral visual areas in distractor filtering and offer insights into the mechanisms through which TMS may have affected ongoing neural activity in the stimulated tissue. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. QCA Gray Code Converter Circuits Using LTEx Methodology

    NASA Astrophysics Data System (ADS)

    Mukherjee, Chiradeep; Panda, Saradindu; Mukhopadhyay, Asish Kumar; Maji, Bansibadan

    2018-07-01

    Quantum-dot Cellular Automata (QCA) is a prominent nanotechnology paradigm considered for continuing computation in the deep sub-micron regime. QCA realizations of several multilevel arithmetic-logic-unit circuits have been introduced in recent years. However, although high fan-in Binary-to-Gray (B2G) and Gray-to-Binary (G2B) converters exist in processor-based architectures, little attention has been paid to the QCA instantiation of Gray code converters, which are anticipated to be used in 8-bit, 16-bit, 32-bit or wider addressable machines with Gray code addressing schemes. In this work the two-input Layered T module is presented to exploit the operation of an Exclusive-OR gate (the LTEx module) as an elemental block. A defect-tolerance analysis of the two-input LTEx module establishes the scalability and reproducibility of the module in complex circuits. Novel formulations exploiting the operability of the LTEx module are proposed to instantiate area-delay-efficient B2G and G2B converters that can be used in Gray code addressing schemes. Moreover, this work formulates QCA design metrics such as O-Cost, effective area, delay and Cost α for the n-bit converter layouts.
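
    The conversions that the B2G and G2B converters realise in QCA hardware are the standard XOR recurrences, sketched here in software for reference.

```python
# Binary-to-Gray and Gray-to-Binary as the standard XOR recurrences.

def b2g(n: int) -> int:
    """Binary to Gray: each Gray bit is the XOR of adjacent binary bits."""
    return n ^ (n >> 1)

def g2b(g: int) -> int:
    """Gray to binary: fold the Gray bits back down with repeated XOR."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Round trip holds for every bit pattern.
assert all(g2b(b2g(n)) == n for n in range(256))
print([b2g(n) for n in range(8)])  # [0, 1, 3, 2, 6, 7, 5, 4]
```

    Consecutive Gray codes differ in exactly one bit, which is why Gray code addressing reduces switching activity on address buses.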

  3. QCA Gray Code Converter Circuits Using LTEx Methodology

    NASA Astrophysics Data System (ADS)

    Mukherjee, Chiradeep; Panda, Saradindu; Mukhopadhyay, Asish Kumar; Maji, Bansibadan

    2018-04-01

    Quantum-dot Cellular Automata (QCA) is a prominent nanotechnology paradigm considered for continuing computation in the deep sub-micron regime. QCA realizations of several multilevel arithmetic-logic-unit circuits have been introduced in recent years. However, although high fan-in Binary-to-Gray (B2G) and Gray-to-Binary (G2B) converters exist in processor-based architectures, little attention has been paid to the QCA instantiation of Gray code converters, which are anticipated to be used in 8-bit, 16-bit, 32-bit or wider addressable machines with Gray code addressing schemes. In this work the two-input Layered T module is presented to exploit the operation of an Exclusive-OR gate (the LTEx module) as an elemental block. A defect-tolerance analysis of the two-input LTEx module establishes the scalability and reproducibility of the module in complex circuits. Novel formulations exploiting the operability of the LTEx module are proposed to instantiate area-delay-efficient B2G and G2B converters that can be used in Gray code addressing schemes. Moreover, this work formulates QCA design metrics such as O-Cost, effective area, delay and Cost α for the n-bit converter layouts.

  4. CYBERWAR-2012/13: Siegel 2011 Predicted Cyberwar Via ACHILLES-HEEL DIGITS BEQS BEC ZERO-DIGIT BEC of/in ACHILLES-HEEL DIGITS Log-Law Algebraic-Inversion to ONLY BEQS BEC Digit-Physics U Barabasi Network/Graph-Physics BEQS BEC JAMMING Denial-of-Access(DOA) Attacks 2012-Instantiations

    NASA Astrophysics Data System (ADS)

    Huffmann, Master; Siegel, Edward Carl-Ludwig

    2013-03-01

    Newcomb-Benford(NeWBe)-Siegel log-law BEC Digit-Physics Network/Graph-Physics Barabasi et.al. evolving-``complex''-networks/graphs BEC JAMMING DOA attacks: Amazon(weekends: Microsoft I.E.-7/8(vs. Firefox): Memorial-day, Labor-day,...), MANY U.S.-Banks:WF,BoA,UB,UBS,...instantiations AGAIN militate for MANDATORY CONVERSION to PARALLEL ANALOG FAULT-TOLERANT but slow(er) SECURITY-ASSURANCE networks/graphs in parallel with faster ``sexy'' DIGITAL-Networks/graphs:``Cloud'', telecomm: n-G,..., because of common ACHILLES-HEEL VULNERABILITY: DIGITS!!! ``In fast-hare versus slow-tortoise race, Slow-But-Steady ALWAYS WINS!!!'' (Zeno). {Euler [#s(1732)] ∑- ∏()-Riemann[Monats. Akad. Berlin (1859)] ∑- ∏()- Kummer-Bernoulli (#s)}-Newcomb [Am.J.Math.4(1),39 (81) discovery of the QUANTUM!!!]-{Planck (01)]}-{Einstein (05)]-Poincar e [Calcul Probabilités,313(12)]-Weyl[Goett. Nach.(14); Math.Ann.77,313(16)]-(Bose (24)-Einstein(25)]-VS. -Fermi (27)-Dirac(27))-Menger [Dimensiontheorie(29)]-Benford [J.Am. Phil.Soc.78,115(38)]-Kac[Maths Stats.-Reason. (55)]- Raimi [Sci.Am.221,109(69)]-Jech-Hill [Proc.AMS,123,3,887(95)] log-function

  5. Trajectory Recognition as the Basis for Object Individuation: A Functional Model of Object File Instantiation and Object-Token Encoding

    PubMed Central

    Fields, Chris

    2011-01-01

    The perception of persisting visual objects is mediated by transient intermediate representations, object files, that are instantiated in response to some, but not all, visual trajectories. The standard object file concept does not, however, provide a mechanism sufficient to account for all experimental data on visual object persistence, object tracking, and the ability to perceive spatially disconnected stimuli as continuously existing objects. Based on relevant anatomical, functional, and developmental data, a functional model is constructed that bases visual object individuation on the recognition of temporal sequences of apparent center-of-mass positions that are specifically identified as trajectories by dedicated “trajectory recognition networks” downstream of the medial–temporal motion-detection area. This model is shown to account for a wide range of data, and to generate a variety of testable predictions. Individual differences in the recognition, abstraction, and encoding of trajectory information are expected to generate distinct object persistence judgments and object recognition abilities. Dominance of trajectory information over feature information in stored object tokens during early infancy, in particular, is expected to disrupt the ability to re-identify human and other individuals across perceptual episodes, and lead to developmental outcomes with characteristics of autism spectrum disorders. PMID:21716599

  6. Fourier power, subjective distance, and object categories all provide plausible models of BOLD responses in scene-selective visual areas

    PubMed Central

    Lescroart, Mark D.; Stansbury, Dustin E.; Gallant, Jack L.

    2015-01-01

    Perception of natural visual scenes activates several functional areas in the human brain, including the Parahippocampal Place Area (PPA), Retrosplenial Complex (RSC), and the Occipital Place Area (OPA). It is currently unclear what specific scene-related features are represented in these areas. Previous studies have suggested that PPA, RSC, and/or OPA might represent at least three qualitatively different classes of features: (1) 2D features related to Fourier power; (2) 3D spatial features such as the distance to objects in a scene; or (3) abstract features such as the categories of objects in a scene. To determine which of these hypotheses best describes the visual representation in scene-selective areas, we applied voxel-wise modeling (VM) to BOLD fMRI responses elicited by a set of 1386 images of natural scenes. VM provides an efficient method for testing competing hypotheses by comparing predictions of brain activity based on encoding models that instantiate each hypothesis. Here we evaluated three different encoding models that instantiate each of the three hypotheses listed above. We used linear regression to fit each encoding model to the fMRI data recorded from each voxel, and we evaluated each fit model by estimating the amount of variance it predicted in a withheld portion of the data set. We found that voxel-wise models based on Fourier power or the subjective distance to objects in each scene predicted much of the variance predicted by a model based on object categories. Furthermore, the response variance explained by these three models is largely shared, and the individual models explain little unique variance in responses. Based on an evaluation of previous studies and the data we present here, we conclude that there is currently no good basis to favor any one of the three alternative hypotheses about visual representation in scene-selective areas. We offer suggestions for further studies that may help resolve this issue. PMID:26594164

  7. C-semiring Frameworks for Minimum Spanning Tree Problems

    NASA Astrophysics Data System (ADS)

    Bistarelli, Stefano; Santini, Francesco

    In this paper we define general algebraic frameworks for the Minimum Spanning Tree problem based on the structure of c-semirings. We propose general algorithms that compute such trees under different cost criteria, each of which must be a specific instantiation of a c-semiring. Our algorithms extend well-known procedures, such as Prim's or Kruskal's, and show the expressivity of these algebraic structures. They can also deal with partially ordered costs on the edges.
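
    In the spirit of the framework above, Kruskal's algorithm can be parameterised by an abstract "better" relation on edge costs, so the same procedure serves different cost criteria (weights, probabilities, fuzzy levels) by swapping the comparison. This is an illustrative sketch, not the authors' formal c-semiring construction.

```python
import functools

# Kruskal's algorithm with a pluggable preference relation on costs.
def kruskal(nodes, edges, better):
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n

    # Rank edges by the abstract preference instead of plain numeric <.
    ranked = sorted(edges, key=functools.cmp_to_key(
        lambda a, b: -1 if better(a[2], b[2]) else 1))
    tree = []
    for u, v, cost in ranked:
        ru, rv = find(u), find(v)
        if ru != rv:             # adding the edge creates no cycle
            parent[ru] = rv
            tree.append((u, v, cost))
    return tree

edges = [("a", "b", 4), ("b", "c", 1), ("a", "c", 2)]
# Weighted semiring: "better" means smaller cost.
print(kruskal("abc", edges, lambda x, y: x < y))  # [('b', 'c', 1), ('a', 'c', 2)]
```

    Replacing the comparison with, say, "larger probability is better" turns the same code into a maximum-reliability spanning tree computation.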

  8. Cost Computations for Cyber Fighter Associate

    DTIC Science & Technology

    2015-05-01

    associate. Aberdeen Proving Ground (MD): Army Research Laboratory (US); in press. 2 Harman D, Brown S, Henz B, Marvel LM. A communication protocol... Harman , et al.2 A specific class called ListenThread was created for multithreaded listeners. When ListenThread is instantiated, it is passed a given...2. Harman D, Brown S, Henz B, Marvel LM. A communication protocol for CyAMS and the cyber associate interface. Aberdeen Proving Ground (MD): US Army
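
    The excerpt above is truncated, but the pattern it names, a ListenThread class for multithreaded listeners that is handed its inputs at instantiation, can be sketched generically. The constructor arguments and sentinel-based shutdown below are assumptions, not the actual CyAMS interface.

```python
import queue
import threading

# Generic sketch of a ListenThread-style multithreaded listener.
class ListenThread(threading.Thread):
    def __init__(self, inbox, handler):
        super().__init__(daemon=True)
        self.inbox = inbox      # queue of incoming messages
        self.handler = handler  # callback applied to each message

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg is None:     # sentinel value shuts the listener down
                break
            self.handler(msg)

received = []
inbox = queue.Queue()
listener = ListenThread(inbox, received.append)
listener.start()
for m in ["status", "alert", None]:
    inbox.put(m)
listener.join()
print(received)  # ['status', 'alert']
```

    One such thread per connection lets a communication protocol service many peers concurrently while keeping each handler sequential.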

  9. Morphosyntactic Processing in Advanced Second Language (L2) Learners: An Event-Related Potential Investigation of the Effects of L1-l2 Similarity and Structural Distance

    ERIC Educational Resources Information Center

    Alemán Bañón, José; Fiorentino, Robert; Gabriele, Alison

    2014-01-01

    Different theoretical accounts of second language (L2) acquisition differ with respect to whether or not advanced learners are predicted to show native-like processing for features not instantiated in the native language (L1). We examined how native speakers of English, a language with number but not gender agreement, process number and gender…

  10. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  11. A knowledge representation of local pandemic influenza planning models.

    PubMed

    Islam, Runa; Brandeau, Margaret L; Das, Amar K

    2007-10-11

    Planning for a pandemic flu outbreak at the local-government level can be aided through the use of mathematical policy models. Formulating and analyzing policy models, however, can be a time- and expertise-expensive process. We believe that a knowledge-based system for facilitating the instantiation of locale- and problem-specific policy models can reduce some of these costs. In this work, we present the ontology we have developed for pandemic influenza policy models.

  12. Locating relevant patient information in electronic health record data using representations of clinical concepts and database structures.

    PubMed

    Pan, Xuequn; Cimino, James J

    2014-01-01

    Clinicians and clinical researchers often seek information in electronic health records (EHRs) that are relevant to some concept of interest, such as a disease or finding. The heterogeneous nature of EHRs can complicate retrieval, risking incomplete results. We frame this problem as the presence of two gaps: 1) a gap between clinical concepts and their representations in EHR data and 2) a gap between data representations and their locations within EHR data structures. We bridge these gaps with a knowledge structure that comprises relationships among clinical concepts (including concepts of interest and concepts that may be instantiated in EHR data) and relationships between clinical concepts and the database structures. We make use of available knowledge resources to develop a reproducible, scalable process for creating a knowledge base that can support automated query expansion from a clinical concept to all relevant EHR data.
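
    The two-gap bridging described above can be sketched as graph traversal: expand a concept of interest through concept-to-concept relationships, then map the expanded set to database locations. The concept graph and table mapping below are illustrative assumptions, not the authors' knowledge base.

```python
from collections import deque

# Toy knowledge structure: concept-to-concept links (gap 1) and
# concept-to-database-structure links (gap 2). Contents are illustrative.
concept_links = {
    "diabetes": ["hyperglycemia", "HbA1c test"],
    "hyperglycemia": ["glucose measurement"],
    "HbA1c test": [],
    "glucose measurement": [],
}
concept_to_table = {
    "diabetes": "problem_list",
    "HbA1c test": "lab_results",
    "glucose measurement": "lab_results",
}

def expand(concept):
    """Collect the concept and everything reachable through its links."""
    seen, frontier = {concept}, deque([concept])
    while frontier:
        for nxt in concept_links.get(frontier.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

tables = {concept_to_table[c] for c in expand("diabetes") if c in concept_to_table}
print(sorted(tables))  # ['lab_results', 'problem_list']
```

    A query for "diabetes" thus reaches lab-result rows that never mention the disease by name, which is the incomplete-retrieval problem the knowledge base is built to solve.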

  13. Remote creation of hybrid entanglement between particle-like and wave-like optical qubits

    NASA Astrophysics Data System (ADS)

    Morin, Olivier; Huang, Kun; Liu, Jianli; Le Jeannic, Hanna; Fabre, Claude; Laurat, Julien

    2014-07-01

    The wave-particle duality of light has led to two different encodings for optical quantum information processing. Several approaches have emerged based either on particle-like discrete-variable states (that is, finite-dimensional quantum systems) or on wave-like continuous-variable states (that is, infinite-dimensional systems). Here, we demonstrate the generation of entanglement between optical qubits of these different types, located at distant places and connected by a lossy channel. Such hybrid entanglement, which is a key resource for a variety of recently proposed schemes, including quantum cryptography and computing, enables information to be converted from one Hilbert space to the other via teleportation and therefore the connection of remote quantum processors based upon different encodings. Beyond its fundamental significance for the exploration of entanglement and its possible instantiations, our optical circuit holds promise for implementations of heterogeneous networks, where discrete- and continuous-variable operations and techniques can be efficiently combined.

  14. Development and application of CATIA-GDML geometry builder

    NASA Astrophysics Data System (ADS)

    Belogurov, S.; Berchun, Yu; Chernogorov, A.; Malzacher, P.; Ovcharenko, E.; Schetinin, V.

    2014-06-01

    Due to the conceptual difference between geometry descriptions in Computer-Aided Design (CAD) systems and particle transport Monte Carlo (MC) codes, direct conversion of detector geometry in either direction is not feasible. The paper presents an update on the functionality and application practice of the CATIA-GDML geometry builder first introduced at CHEP2010. This set of CATIAv5 tools has been developed for building an MC-optimized GEANT4/ROOT-compatible geometry based on an existing CAD model. The model can be exported via the Geometry Description Markup Language (GDML). The builder also allows import and visualization of GEANT4/ROOT geometries in CATIA. The structure of a GDML file, including replicated volumes, volume assemblies and variables, is mapped into a part specification tree. A dedicated file template, a wide range of primitives, tools for measurement and implicit calculation of parameters, different types of multiple volume instantiation, mirroring, positioning and quality checks have been implemented. Several use cases are discussed.

  15. The SERENITY Runtime Framework

    NASA Astrophysics Data System (ADS)

    Crespo, Beatriz Gallego-Nicasio; Piñuela, Ana; Soria-Rodriguez, Pedro; Serrano, Daniel; Maña, Antonio

    The SERENITY Runtime Framework (SRF) supports applications at runtime by managing S&D Solutions and monitoring the systems’ context. The main functionality of the SRF, amongst others, is to provide S&D Solutions, by means of Executable Components, in response to applications’ security requirements. The runtime environment is defined in the SRF through the S&D Library and Context Manager components. The S&D Library is a local S&D Artefact repository that stores S&D Classes, S&D Patterns and S&D Implementations. The Context Manager component is in charge of storing and managing the information used by the SRF to select the most appropriate S&D Pattern for a given scenario. Managing the execution of Executable Components as running realizations of the S&D Patterns (including instantiation, de-activation and control), providing communication and monitoring mechanisms, and handling recovery and reconfiguration complete the list of tasks performed by the SRF.

  16. Instantiation and registration of statistical shape models of the femur and pelvis using 3D ultrasound imaging.

    PubMed

    Barratt, Dean C; Chan, Carolyn S K; Edwards, Philip J; Penney, Graeme P; Slomczykowski, Mike; Carter, Timothy J; Hawkes, David J

    2008-06-01

    Statistical shape modelling potentially provides a powerful tool for generating patient-specific, 3D representations of bony anatomy for computer-aided orthopaedic surgery (CAOS) without the need for a preoperative CT scan. Furthermore, freehand 3D ultrasound (US) provides a non-invasive method for digitising bone surfaces in the operating theatre that enables a much greater region to be sampled compared with conventional direct-contact (i.e., pointer-based) digitisation techniques. In this paper, we describe how these approaches can be combined to simultaneously generate and register a patient-specific model of the femur and pelvis to the patient during surgery. In our implementation, a statistical deformation model (SDM) was constructed for the femur and pelvis by performing a principal component analysis on the B-spline control points that parameterise the freeform deformations required to non-rigidly register a training set of CT scans to a carefully segmented template CT scan. The segmented template bone surface, represented by a triangulated surface mesh, is instantiated and registered to a cloud of US-derived surface points using an iterative scheme in which the weights corresponding to the first five principal modes of variation of the SDM are optimised in addition to the rigid-body parameters. The accuracy of the method was evaluated using clinically realistic data obtained on three intact human cadavers (three whole pelves and six femurs). For each bone, a high-resolution CT scan and rigid-body registration transformation, calculated using bone-implanted fiducial markers, served as the gold standard bone geometry and registration transformation, respectively. After aligning the final instantiated model and CT-derived surfaces using the iterative closest point (ICP) algorithm, the average root-mean-square distance between the surfaces was 3.5 mm over the whole bone and 3.7 mm in the region of surgical interest. The corresponding distances after aligning the surfaces using the marker-based registration transformation were 4.6 and 4.5 mm, respectively. We conclude that despite limitations on the regions of bone accessible using US imaging, this technique has potential as a cost-effective and non-invasive method to enable surgical navigation during CAOS procedures, without the additional radiation dose associated with performing a preoperative CT scan or intraoperative fluoroscopic imaging. However, further development is required to investigate errors using error measures relevant to specific surgical procedures.
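    The core of model instantiation above is linear: an instantiated shape is the mean shape plus a weighted sum of principal modes. The toy sketch below shows only that step, with synthetic orthonormal modes and known point correspondences, in which case the mode weights have a closed-form least-squares solution; the paper's actual scheme alternates this with a rigid registration to unordered ultrasound points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy statistical shape model: 20 surface points in 3D, 5 modes of variation.
n_pts, n_modes = 20, 5
mean_shape = rng.normal(size=(n_pts * 3,))
modes = np.linalg.qr(rng.normal(size=(n_pts * 3, n_modes)))[0]  # orthonormal columns

def instantiate(weights):
    # Instantiated shape = mean shape + linear combination of principal modes.
    return mean_shape + modes @ weights

def fit_weights(target):
    # With known correspondences and orthonormal modes, the least-squares
    # weights are simply the projection of the residual onto the modes.
    return modes.T @ (target - mean_shape)

true_w = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
recovered = fit_weights(instantiate(true_w))
print(np.round(recovered, 6))
```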

  17. Supporting the Virtual Soldier With a Physics-Based Software Architecture

    DTIC Science & Technology

    2005-06-01

    simple approach taken here). Rather, this paper demonstrates how existing solution schemes can rapidly expand; it embraces all theoretical solution... body_j. In (5) the superscript 'T' accompanying a vector denotes the transpose of the vector. The constraint force and moment are defined as F_C = Z1 a a...FE codes as there are meshes, and the requested MD code. This is described next. Exactly how the PM instantiated each physics process became an issue

  18. Intelligent Mobile Autonomous System (IMAS).

    DTIC Science & Technology

    1987-01-01

    the "tile" of tessellation at a level (grain, discrete, pixel, or voxel of the space). These terms can be used interchangeably, and each of them...search on the original level of traversability space is not fast enough to be considered for actual control application. Alternatives to limit the... (b) It must be concise and easy to "compute". In other words, there must exist simple, fast procedures for instantiating the "words" or "sentences

  19. Managing Complex Interoperability Solutions using Model-Driven Architecture

    DTIC Science & Technology

    2011-06-01

    such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2...reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational...importance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information
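    The PSM-to-script transformation mentioned above can be sketched as a toy model-to-text generator: a platform-independent data model (plain dicts) is mapped to SQL DDL. The entity names, types, and type mapping below are invented for illustration and do not come from the MDA tooling the report describes.

```python
# Platform-independent model: entity -> [(column, abstract type, is_key)].
PIM = {
    "Track": [("id", "int", True), ("callsign", "string", False)],
    "Position": [("track_id", "int", False), ("lat", "float", False),
                 ("lon", "float", False)],
}

# Platform-specific type mapping (one mapping per target RDBMS).
TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "float": "DOUBLE PRECISION"}

def to_sql(model):
    # Transform the model into the DDL script that instantiates its tables.
    stmts = []
    for entity, columns in model.items():
        cols = []
        for name, ptype, is_key in columns:
            col = f"{name} {TYPE_MAP[ptype]}"
            if is_key:
                col += " PRIMARY KEY"
            cols.append(col)
        stmts.append(f"CREATE TABLE {entity} ({', '.join(cols)});")
    return "\n".join(stmts)

print(to_sql(PIM))
```

    Swapping TYPE_MAP is the PSM step: the same PIM yields a different script per target platform.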

  20. Investigations of quantum heuristics for optimization

    NASA Astrophysics Data System (ADS)

    Rieffel, Eleanor; Hadfield, Stuart; Jiang, Zhang; Mandra, Salvatore; Venturelli, Davide; Wang, Zhihui

    We explore the design of quantum heuristics for optimization, focusing on the quantum approximate optimization algorithm, a metaheuristic developed by Farhi, Goldstone, and Gutmann. We develop specific instantiations of the quantum approximate optimization algorithm for a variety of challenging combinatorial optimization problems. Through theoretical analyses and numerical investigations of select problems, we provide insight into parameter setting and Hamiltonian design for quantum approximate optimization algorithms and related quantum heuristics, and into their implementation on hardware realizable in the near term.
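    As a concrete illustration of the algorithm named above (not of the authors' specific instantiations), the sketch below simulates depth-1 QAOA for MaxCut on a 3-node ring with a plain NumPy statevector: uniform superposition, cost-dependent phase separator, single-qubit X mixer, then a coarse grid search over the two parameters.

```python
import numpy as np
from itertools import product

# MaxCut on a 3-node ring; the cut value of a bitstring counts edges whose
# endpoints get different labels. The maximum cut for this graph is 2.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
cost = np.array([sum(b[i] != b[j] for i, j in edges)
                 for b in product([0, 1], repeat=n)], dtype=float)

def qaoa_state(gamma, beta):
    # |+...+>, then phase separator exp(-i*gamma*C), then Rx(2*beta) per qubit.
    state = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)
    state *= np.exp(-1j * gamma * cost)
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    for q in range(n):
        state = state.reshape([2] * n)
        state = np.tensordot(rx, state, axes=([1], [q]))
        state = np.moveaxis(state, 0, q).reshape(-1)
    return state

def expected_cut(gamma, beta):
    s = qaoa_state(gamma, beta)
    return float(np.real(np.sum(np.abs(s) ** 2 * cost)))

# Coarse grid search over the two depth-1 parameters.
best = max((expected_cut(g, b), g, b)
           for g in np.linspace(0, np.pi, 20)
           for b in np.linspace(0, np.pi, 20))
print(best[0])  # best expected cut found; random guessing gives 1.5
```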

  1. Conversational Interfaces: A Domain-Independent Architecture for Task-Oriented Dialogues

    DTIC Science & Technology

    2002-12-12

    system ought to be able to facilitate the understanding of the intentions of the human operator and because it should be able to communicate the plans...instantiated from the recipes with natural language a straightforward task for the dialogue front-end to facilitate. Moreover, it is designed so that constraints...take advantage of the framework discussed in this paper in order to facilitate more natural dialogues between the human operator and the device. The

  2. Beslisbevoegdheden van de Uitgestegen Soldaat. Deel B: Verbetering van Situational Awareness Met Behulp van de Soldier Digital Assistant in een Gesimuleerde Omgeving (Authority and Responsbility of the Dismounted Soldier. Part B. Improving the Situational Awareness using the Soldier Digital Assistant in a Simulated Environment)

    DTIC Science & Technology

    2007-04-01

    Such environments have to date been used almost exclusively for training and education, but only very sporadically for scientific...The following agencies/persons receive a complete copy of the report. 1 DMO/SC-DR&D, standard, including a digital version supplied on CD

  3. Objects as closures: Abstract semantics of object oriented languages

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.

    1989-01-01

    We discuss denotational semantics of object-oriented languages, using the concept of closure widely used in (semi) functional programming to encapsulate side effects. It is shown that this denotational framework is adequate to explain classes, instantiation, and inheritance in the style of Simula as well as SMALLTALK-80. This framework is then compared with that of Kamin, in his recent denotational definition of SMALLTALK-80, and the implications of the differences between the two approaches are discussed.
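    The "objects as closures" idea can be sketched directly in Python: a class is a function returning a message dispatcher, instantiation is a call, state is encapsulated in the enclosing scope, and inheritance is delegation of unhandled messages to a parent dispatcher. This is an informal illustration of the semantic idea, not the paper's denotational construction.

```python
def make_counter(start):
    # "Instance variables" live in the closure; only messages can reach them.
    state = {"count": start}
    def dispatch(msg, *args):
        if msg == "inc":
            state["count"] += 1
            return state["count"]
        if msg == "value":
            return state["count"]
        raise AttributeError(msg)
    return dispatch

def make_resettable_counter(start):
    # "Inheritance": delegate unknown messages to a parent object.
    parent = [make_counter(start)]
    def dispatch(msg, *args):
        if msg == "reset":
            parent[0] = make_counter(start)
            return start
        return parent[0](msg, *args)
    return dispatch

c = make_resettable_counter(10)
c("inc"); c("inc")
print(c("value"))  # 12
c("reset")
print(c("value"))  # 10
```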

  4. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    PubMed

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.
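    A minimal sketch of the object-class mapping described above: an ontology term becomes an agent class, a qualitative rule from the literature becomes a method, and running the model makes the conceptual model's dynamics explicit. The "macrophage"/"cytokine" names, probabilities, and feedback rule are illustrative inventions, not a validated biological model.

```python
import random

class Macrophage:
    # One agent class per ontology concept; rules encode qualitative knowledge.
    def __init__(self):
        self.activated = False

    def step(self, cytokine_level, rng):
        # Rule: activation probability grows with local cytokine level.
        if not self.activated and rng.random() < cytokine_level:
            self.activated = True

def run(n_agents=100, n_steps=20, seed=1):
    rng = random.Random(seed)
    agents = [Macrophage() for _ in range(n_agents)]
    level = 0.05
    for _ in range(n_steps):
        for a in agents:
            a.step(level, rng)
        # Positive feedback: activated agents raise the shared cytokine level.
        level = min(1.0, 0.05 + 0.01 * sum(a.activated for a in agents))
    return sum(a.activated for a in agents)

print(run())  # number of activated agents after the run
```

    Even this toy model exhibits the nonlinear, feedback-driven behavior that can "break" an intuitively plausible conceptual model, which is the verification role the abstract describes.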

  5. Visual Turing test for computer vision systems

    PubMed Central

    Geman, Donald; Geman, Stuart; Hallonquist, Neil; Younes, Laurent

    2015-01-01

    Today, computer vision systems are tested by their accuracy in detecting and localizing instances of objects. As an alternative, and motivated by the ability of humans to provide far richer descriptions and even tell a story about an image, we construct a “visual Turing test”: an operator-assisted device that produces a stochastic sequence of binary questions from a given test image. The query engine proposes a question; the operator either provides the correct answer or rejects the question as ambiguous; the engine proposes the next question (“just-in-time truthing”). The test is then administered to the computer-vision system, one question at a time. After the system’s answer is recorded, the system is provided the correct answer and the next question. Parsing is trivial and deterministic; the system being tested requires no natural language processing. The query engine employs statistical constraints, learned from a training set, to produce questions with essentially unpredictable answers—the answer to a question, given the history of questions and their correct answers, is nearly equally likely to be positive or negative. In this sense, the test is only about vision. The system is designed to produce streams of questions that follow natural story lines, from the instantiation of a unique object, through an exploration of its properties, and on to its relationships with other uniquely instantiated objects. PMID:25755262
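    The "unpredictable answers" criterion above can be sketched with a toy query engine: from a pool of binary questions, pick the one whose answer is closest to 50/50 among the annotated images still consistent with the answer history. The images, attributes, and selection rule below are invented stand-ins for the paper's statistically learned constraints.

```python
# Each "image" is a dict of binary attributes (invented annotations).
images = [
    {"person": 1, "vehicle": 0, "red": 0},
    {"person": 1, "vehicle": 1, "red": 1},
    {"person": 0, "vehicle": 1, "red": 0},
    {"person": 0, "vehicle": 1, "red": 1},
]

def next_question(history, pool):
    # history: (question, answer) pairs already established for the test image.
    consistent = [im for im in images
                  if all(im[q] == a for q, a in history)]
    def balance(q):
        # Distance of the conditional answer frequency from 1/2.
        p = sum(im[q] for im in consistent) / len(consistent)
        return abs(p - 0.5)
    asked = {q for q, _ in history}
    return min((q for q in pool if q not in asked), key=balance)

print(next_question([], ["person", "vehicle", "red"]))  # person
```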

  6. A METHODOLOGY FOR INTEGRATING IMAGES AND TEXT FOR OBJECT IDENTIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Hohimer, Ryan E.; Doucette, Peter J.

    2006-02-13

    Often text and imagery contain information that must be combined to solve a problem. One approach begins with transforming the raw text and imagery into a common structure that contains the critical information in a usable form. This paper presents an application in which the imagery of vehicles and the text from police reports were combined to demonstrate the power of data fusion to correctly identify the target vehicle--e.g., a red 2002 Ford truck identified in a police report--from a collection of diverse vehicle images. The imagery was abstracted into a common signature by first capturing the conceptual models of the imagery experts in software. Our system then (1) extracted fundamental features (e.g., wheel base, color), (2) made inferences about the information (e.g., it’s a red Ford) and then (3) translated the raw information into an abstract knowledge signature that was designed to both capture the important features and account for uncertainty. Likewise, the conceptual models of text analysis experts were instantiated into software that was used to generate an abstract knowledge signature that could be readily compared to the imagery knowledge signature. While this experiment's primary focus was to demonstrate the power of text and imagery fusion for a specific example, it also suggested several ways that text and geo-registered imagery could be combined to help solve other types of problems.
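    A hedged sketch of the "knowledge signature" idea: both pipelines reduce their inputs to the same attribute structure with per-attribute confidence, so the two can be compared directly. The attribute names, confidences, and similarity rule here are invented; the paper's signatures and uncertainty handling are more elaborate.

```python
# A signature maps attribute name -> (value, confidence in [0, 1]).
image_sig = {"color": ("red", 0.9), "make": ("Ford", 0.6), "type": ("truck", 0.8)}
text_sig = {"color": ("red", 1.0), "make": ("Ford", 1.0), "year": ("2002", 1.0)}

def match_score(a, b):
    # Confidence-weighted agreement over the attributes both signatures mention.
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    score = sum(a[k][1] * b[k][1] * (a[k][0] == b[k][0]) for k in shared)
    return score / len(shared)

print(round(match_score(image_sig, text_sig), 3))  # 0.75
```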

  7. How Do Children Restrict Their Linguistic Generalizations? An (Un-)Grammaticality Judgment Study

    PubMed Central

    Ambridge, Ben

    2013-01-01

    A paradox at the heart of language acquisition research is that, to achieve adult-like competence, children must acquire the ability to generalize verbs into non-attested structures, while avoiding utterances that are deemed ungrammatical by native speakers. For example, children must learn that, to denote the reversal of an action, un- can be added to many verbs, but not all (e.g., roll/unroll; close/*unclose). This study compared theoretical accounts of how this is done. Children aged 5–6 (N = 18), 9–10 (N = 18), and adults (N = 18) rated the acceptability of un- prefixed forms of 48 verbs (and, as a control, bare forms). Across verbs, a negative correlation was observed between the acceptability of ungrammatical un- prefixed forms (e.g., *unclose) and the frequency of (a) the bare form and (b) alternative forms (e.g., open), supporting the entrenchment and pre-emption hypotheses, respectively. Independent ratings of the extent to which verbs instantiate the semantic properties characteristic of a hypothesized semantic cryptotype for un- prefixation were a significant positive predictor of acceptability, for all age groups. The relative importance of each factor differed for attested and unattested un- forms and also varied with age. The findings are interpreted in the context of a new hybrid account designed to incorporate the three factors of entrenchment, pre-emption, and verb semantics. PMID:23252958
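    The entrenchment prediction above is a simple statistical claim: across verbs, higher bare-form frequency should go with lower acceptability of the ungrammatical un- form. The sketch below computes that correlation on invented numbers (the verbs are real examples of the paradigm; the frequencies and ratings are not the study's data).

```python
verbs = ["close", "reach", "squeeze", "grip"]
bare_frequency = [1200.0, 800.0, 90.0, 40.0]  # hypothetical corpus counts
unform_rating = [1.2, 1.6, 2.9, 3.4]          # hypothetical acceptability ratings

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(bare_frequency, unform_rating)
print(round(r, 3))  # negative, as entrenchment predicts on these toy numbers
```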

  8. A new formulation for air-blast fluid-structure interaction using an immersed approach: part II—coupling of IGA and meshfree discretizations

    NASA Astrophysics Data System (ADS)

    Bazilevs, Y.; Moutsanidis, G.; Bueno, J.; Kamran, K.; Kamensky, D.; Hillman, M. C.; Gomez, H.; Chen, J. S.

    2017-07-01

    In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology, which is capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a-priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.

  9. A new formulation for air-blast fluid-structure interaction using an immersed approach. Part I: basic methodology and FEM-based simulations

    NASA Astrophysics Data System (ADS)

    Bazilevs, Y.; Kamran, K.; Moutsanidis, G.; Benson, D. J.; Oñate, E.

    2017-07-01

    In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology, which is capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a-priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a Meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.

  10. Cultural context and a critical approach to eliminating health disparities.

    PubMed

    Griffith, Derek M; Johnson, Jonetta; Ellis, Katrina R; Schulz, Amy Jo

    2010-01-01

    The science of eliminating racial health disparities requires a clear understanding of the underlying social processes that drive persistent differences in health outcomes by self-identified race. Understanding these social processes requires analysis of cultural notions of race as these are instantiated in institutional policies and practices that ultimately contribute to health disparities. Racism provides a useful framework for understanding how social, political and economic factors directly and indirectly influence health outcomes. While it is important to capture how individuals are influenced by their psychological experience of prejudice and discrimination, racism is more than an intrapersonal or interpersonal variable. Considerable attention has focused on race-based residential segregation and other forms of institutional racism but less focus has been placed on how cultural values, frameworks and meanings shape institutional policies and practices. In this article, we highlight the intersection of cultural and institutional racism as a critical mechanism through which racial inequities in social determinants of health not only develop but persist. This distinction highlights and helps to explain processes and structures that contribute to racial disparities persisting across time and outcomes. Using two historical examples, the National Negro Health Movement and hospital desegregation during the Civil Rights Era, we identify key questions that an analysis of cultural racism might add to the more common focus on overt policy decisions and practices.

  11. Symbiosis-Based Alternative Learning Multi-Swarm Particle Swarm Optimization.

    PubMed

    Niu, Ben; Huang, Huali; Tan, Lijing; Duan, Qiqi

    2017-01-01

    Inspired by the mutual cooperation of symbiosis in natural ecosystems, this paper proposes a new variant of PSO, named Symbiosis-based Alternative Learning Multi-swarm Particle Swarm Optimization (SALMPSO). A learning probability is used to select one exemplar out of the center positions, the local best position, and the historical best position, drawing on the experience of internal and external multiple swarms to keep the diversity of the population. Two different levels of social interaction, within and between multiple swarms, are proposed. In the search process, particles not only exchange social experience with others from their own sub-swarms, but are also influenced by the experience of particles from other fellow sub-swarms. According to the different exemplars and learning strategies, this model is instantiated as four variants of SALMPSO, and a set of 15 test functions is used to compare them with several PSO variants in 10, 30 and 50 dimensions, respectively. Experimental results demonstrate that the alternative learning strategy in each SALMPSO version exhibits better performance in terms of convergence speed and optimal values on most multimodal functions in our simulation.
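    The exemplar-selection idea can be sketched with a bare-bones PSO on the sphere function, where a learning probability picks each particle's exemplar from its own best or the swarm best. This is a deliberate simplification of the multi-swarm scheme, not the authors' algorithm; all constants are conventional defaults.

```python
import random

def sphere(x):
    # Simple unimodal benchmark: f(x) = sum of squares, minimum 0 at origin.
    return sum(v * v for v in x)

def pso(dim=10, n_particles=20, iters=200, seed=3):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in X]
    pval = [sphere(x) for x in X]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    w, c, learn_p = 0.7, 1.5, 0.5  # inertia, attraction, learning probability
    for _ in range(iters):
        for i in range(n_particles):
            # Alternative learning: exemplar is swarm best or personal best.
            exemplar = gbest if rng.random() < learn_p else pbest[i]
            for d in range(dim):
                V[i][d] = w * V[i][d] + c * rng.random() * (exemplar[d] - X[i][d])
                X[i][d] += V[i][d]
            f = sphere(X[i])
            if f < pval[i]:
                pbest[i], pval[i] = list(X[i]), f
                if f < gval:
                    gbest, gval = list(X[i]), f
    return gval

print(pso())
```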

  12. ROSE Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinlan, D.; Yi, Q.; Buduc, R.

    2005-02-17

    ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is a part of current research on telescoping languages, which provides optimizations of the use of libraries in scientific applications. ROSE defines approaches to extend the optimization techniques, common in well defined languages, to the optimization of scientific applications using well defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of applications codes. We currently support full C and C++ (including template instantiation, etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full scale DOE applications codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on DOE full scale applications.
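    ROSE itself is a C/C++/Fortran infrastructure; as a language-level analogy only, the sketch below uses Python's standard ast module to write a tiny source-to-source translator that rewrites x ** 2 into x * x, the same parse-transform-unparse shape a ROSE translator follows.

```python
import ast
import copy

class SquareRewriter(ast.NodeTransformer):
    # Rewrite e ** 2 into e * e (a toy strength-reduction transformation).
    def visit_BinOp(self, node):
        self.generic_visit(node)  # transform nested expressions first
        if (isinstance(node.op, ast.Pow)
                and isinstance(node.right, ast.Constant)
                and node.right.value == 2):
            return ast.BinOp(left=node.left, op=ast.Mult(),
                             right=copy.deepcopy(node.left))
        return node

src = "y = (a + b) ** 2"
tree = SquareRewriter().visit(ast.parse(src))
ast.fix_missing_locations(tree)
print(ast.unparse(tree))  # y = (a + b) * (a + b)
```

    Requires Python 3.9+ for ast.unparse.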

  13. The evaluative imaging of mental models - Visual representations of complexity

    NASA Technical Reports Server (NTRS)

    Dede, Christopher

    1989-01-01

    The paper deals with some design issues involved in building a system that could visually represent the semantic structures of training materials and their underlying mental models. In particular, hypermedia-based semantic networks that instantiate classification problem solving strategies are thought to be a useful formalism for such representations; the complexity of these web structures can be best managed through visual depictions. It is also noted that a useful approach to implement in these hypermedia models would be some metrics of conceptual distance.

  14. The Challenge of New and Emerging Information Operations

    DTIC Science & Technology

    1999-06-01

    Information Dominance Center (IDC) are addressing the operational and technological needs. The IDC serves as a model for the DoD and a proposed virtual hearing room for Congress. As the IDC and its supporting technologies mature, individuals will be able to freely enter, navigate, plan, and execute operations within Perceptual and Knowledge Landscapes. This capability begins the transition from Information Dominance to Knowledge Dominance. The IDC is instantiating such entities as smart rooms, avatars, square pixel displays, polymorphic views, and

  15. Graph Unification and Tangram Hypothesis Explanation Representation (GATHER) and System and Component Modeling Framework (SCMF)

    DTIC Science & Technology

    2008-08-01

    services, DIDS and DMS, are deployable on the TanGrid system and are accessible via two APIs, a Java client and a servlet based interface. Additionally...but required the user to instantiate an IGraph object with several Java Maps containing the nodes, node attributes, edge types, and the connections...restrictions imposed by the bulk ingest process. Finally, once the bulk ingest process was available in the GraphUnification Java Archives (JAR), DC was

  16. Uncertainty and Anticipation in Anxiety

    PubMed Central

    Grupe, Dan W.; Nitschke, Jack B.

    2014-01-01

    Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we focus the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199

  17. Objects as closures - Abstract semantics of object oriented languages

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.

    1988-01-01

    The denotational semantics of object-oriented languages is discussed using the concept of closure widely used in (semi) functional programming to encapsulate side effects. It is shown that this denotational framework is adequate to explain classes, instantiation, and inheritance in the style of Simula as well as SMALLTALK-80. This framework is then compared with that of Kamin (1988), in his recent denotational definition of SMALLTALK-80, and the implications of the differences between the two approaches are discussed.

  18. ModelPlex: Verified Runtime Validation of Verified Cyber-Physical System Models

    DTIC Science & Technology

    2014-07-01

    nondeterministic choice (〈∪〉), deterministic assignment (〈:=〉) and logical connectives (∧ r etc.) replace current facts with simpler ones or branch...By sequent proof rule ∃ r , this existentially quantified variable is instantiated with an arbitrary term θ, which is often a new logical variable...that is implicitly existentially quantified [27]. Weakening (Wr) removes facts that are no longer necessary. (〈∗〉) ∃X〈x :=X〉φ 〈x := ∗〉φ 1 (∃ r ) Γ ` φ(θ

  19. Affordance Templates for Shared Robot Control

    NASA Technical Reports Server (NTRS)

    Hart, Stephen; Dinh, Paul; Hambuchen, Kim

    2014-01-01

    This paper introduces the Affordance Template framework used to supervise task behaviors on the NASA-JSC Valkyrie robot at the 2013 DARPA Robotics Challenge (DRC) Trials. This framework provides graphical interfaces to human supervisors that are adjustable based on the run-time environmental context (e.g., the size, location, and shape of objects that the robot must interact with). Additional improvements, described below, inject degrees of autonomy into instantiations of affordance templates at run-time in order to enable efficient human supervision of the robot for accomplishing tasks.

  20. A Zeus++ Code Tool, a Method for Implementing Same, and Storage Medium Storing Computer Readable Instructions for Instantiating the Zeus++ Code Tool

    DTIC Science & Technology

    1999-12-01

    applications, it should be understood that the invention is not limited thereto (Navy Case No. 79694). Those having ordinary skill in the art and access...processing. It should also be mentioned that Tecplot is a commercial plotting software package produced by Amtec Engineering, Inc. The following...conditions) 7. Ch (based on edge conditions) 8. Ch (based on reference conditions) 9. Momentum thickness 10. Displacement

  1. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-08-01

    Call for Papers: Convergence The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. 
Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. These include, but are not limited to: architecture, design and performance of optical wide-area-network (WAN), metro, and access networks; integration strategies for multiservice transport platforms; access methods that bridge traditional and emerging services; network signaling and control methodologies; and all-optical packet routing and switching techniques.

  2. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-06-01

    Call for Papers: Convergence The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. 
Examples include convergence, at the physical, logical, and operational levels, of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with the research vision for optical networks? Substantial challenges remain. Papers are solicited across a broad spectrum of interests, including, but not limited to: architecture, design, and performance of optical wide-area-network (WAN), metro, and access networks; integration strategies for multiservice transport platforms; access methods that bridge traditional and emerging services; network signaling and control methodologies; and all-optical packet routing and switching techniques.

  3. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-05-01

    Call for Papers: Convergence. The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase the lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking, leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections, have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and widely shared, the specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data-friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations.
Examples include convergence, at the physical, logical, and operational levels, of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with the research vision for optical networks? Substantial challenges remain. Papers are solicited across a broad spectrum of interests, including, but not limited to: architecture, design, and performance of optical wide-area-network (WAN), metro, and access networks; integration strategies for multiservice transport platforms; access methods that bridge traditional and emerging services; network signaling and control methodologies; and all-optical packet routing and switching techniques.

  4. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-04-01

    Call for Papers: Convergence. The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase the lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking, leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections, have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and widely shared, the specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data-friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations.
Examples include convergence, at the physical, logical, and operational levels, of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with the research vision for optical networks? Substantial challenges remain. Papers are solicited across a broad spectrum of interests, including, but not limited to: architecture, design, and performance of optical wide-area-network (WAN), metro, and access networks; integration strategies for multiservice transport platforms; access methods that bridge traditional and emerging services; network signaling and control methodologies; and all-optical packet routing and switching techniques.

  5. An Adynamical, Graphical Approach to Quantum Gravity and Unification

    NASA Astrophysics Data System (ADS)

    Stuckey, W. M.; Silberstein, Michael; McDevitt, Timothy

    We use graphical field gradients in an adynamical, background-independent fashion to propose a new approach to quantum gravity (QG) and unification. Our proposed reconciliation of general relativity (GR) and quantum field theory (QFT) is based on a modification of their graphical instantiations, i.e. Regge calculus and lattice gauge theory (LGT), respectively, which we assume are fundamental to their continuum counterparts. Accordingly, the fundamental structure is a graphical amalgam of space, time, and sources (in the parlance of QFT) called a "space-time source element". These are fundamental elements of space, time, and sources, not source elements in space and time. The transition amplitude for a space-time source element is computed using a path integral with a discrete graphical action. The action for a space-time source element is constructed from a difference matrix K and source vector J on the graph, as in lattice gauge theory. K is constructed from graphical field gradients so that it contains a non-trivial null space, and J is then restricted to the row space of K, so that it is divergence-free and represents a conserved exchange of energy-momentum. This construct of K and J represents an adynamical global constraint (AGC) between sources, the space-time metric, and the energy-momentum content of the element, rather than a dynamical law for time-evolved entities. In this view, one manifestation of quantum gravity becomes evident when, for example, a single space-time source element spans adjoining simplices of the Regge calculus graph. Thus, energy conservation for the space-time source element includes contributions to the deficit angles between simplices. This idea is used to correct proper distance in the Einstein-de Sitter (EdS) cosmology model, yielding a fit of the Union2 Compilation supernova data that matches ΛCDM without having to invoke accelerating expansion or dark energy.
A similar modification to LGT results in an adynamical account of quantum interference.
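The abstract's central linear-algebra constraint, a difference matrix K with a non-trivial null space and a source vector J restricted to K's row space, can be illustrated on a toy graph. The sketch below is purely illustrative: the 3-node matrix and source vector are invented for the example and are not the paper's actual Regge-calculus or LGT construction.

```python
import numpy as np

# Toy difference matrix K on a 3-node graph. Each row sums to zero, so
# the all-ones vector spans a non-trivial null space, as required.
K = np.array([[ 2., -1., -1.],
              [-1.,  2., -1.],
              [-1., -1.,  2.]])

ones = np.ones(3)
assert np.allclose(K @ ones, 0)  # non-trivial null space

# Restrict a raw source vector J0 to the row space of K (K is symmetric,
# so the row space is the orthogonal complement of the null space) by
# projecting out the null-space component. The result is "divergence-free"
# in the sense that its components sum to zero: a conserved exchange.
J0 = np.array([3., 1., 2.])
J = J0 - ones * (J0 @ ones) / (ones @ ones)

print(np.isclose(J.sum(), 0.0))                   # conserved exchange
print(np.allclose(K @ np.linalg.pinv(K) @ J, J))  # J lies in row space of K
```

With J in the row space of K, the constraint equation Kx = J is solvable, which is the sense in which the construct acts as a global constraint rather than a dynamical law.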

  6. Early perception and structural identity: neural implementation

    NASA Astrophysics Data System (ADS)

    Ligomenides, Panos A.

    1992-03-01

    It is suggested that there exists a minimal set of rules for the perceptual composition of the unending variety of spatio-temporal patterns in our perceptual world. Driven by perceptual discernment of "sudden change" and "unexpectedness", these rules specify conditions (such as co-linearity and virtual continuation) for perceptual grouping and for recursive compositions of perceptual "modalities" and "signatures". Beginning with a small set of primitive perceptual elements, selected contextually at some relevant level of abstraction, perceptual compositions can graduate to an unlimited variety of spatiotemporal signatures, scenes, and activities. Local discernible elements, often perceptually ambiguous by themselves, may be integrated into spatiotemporal compositions, which generate unambiguous perceptual separations between "figure" and "ground". The definition of computational algorithms for the effective instantiation of the rules of perceptual grouping remains a principal problem. In this paper we present our approach for solving the problem of perceptual recognition within the confines of one-D variational profiles. More specifically, concerning "early" (pre-attentive) recognition, we define the "structural identity of a k-norm, k ∈ K"--SkID--as a tool for discerning and locating the instantiation of spatiotemporal objects or events. The SkID profile also serves as a reference coordinate framework for the "perceptual focusing of attention" and the eventual assessment of resemblance. Neural network implementations of pre-attentive and attentive recognition are also discussed briefly. Our principles are exemplified by application to one-D perceptual profiles, which allows simplicity of definitions and of the rules of perceptual composition.
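As a purely illustrative sketch of the "sudden change" discernment this record builds on, a one-D profile can be scanned for jumps that exceed a threshold between consecutive samples. The function name and the first-difference threshold rule are assumptions for the example, not the paper's SkID algorithm.

```python
def sudden_changes(profile, threshold):
    """Return the indices where a 1-D profile jumps by more than
    `threshold` between consecutive samples -- a crude stand-in for a
    perceptually discernible "sudden change" (illustrative only)."""
    return [i + 1
            for i, (a, b) in enumerate(zip(profile, profile[1:]))
            if abs(b - a) > threshold]

# A flat segment, one abrupt jump, then another flat segment:
profile = [0.0, 0.1, 0.0, 5.0, 5.1, 5.0]
print(sudden_changes(profile, threshold=1.0))  # [3]
```

A real instantiation of the grouping rules would operate recursively on such detected discontinuities; this sketch only shows the primitive detection step.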

  7. Open NASA Earth Exchange (OpenNEX): Strategies for enabling cross organization collaboration in the earth sciences

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Ganguly, S.; Nemani, R. R.; Votava, P.; Wang, W.; Lee, T. J.; Dungan, J. L.

    2014-12-01

    Sharing community-valued codes, intermediary datasets, and results from individual efforts with others that are not in a directly funded collaboration can be a challenge. Cross-organization collaboration is often impeded by infrastructure security constraints, rigid financial controls, bureaucracy, workforce nationalities, etc., which can force groups to work in a segmented fashion and/or through awkward and suboptimal web services. We show how a focused community may come together and share modeling and analysis codes, computing configurations, scientific results, knowledge, and expertise on a public cloud platform: diverse groups of researchers working together at "arm's length". Through the OpenNEX experimental workshop, users can view short technical "how-to" videos and explore encapsulated working environments. Workshop participants can easily instantiate Amazon Machine Images (AMIs) or launch full cluster and data-processing configurations within minutes. Enabling users to instantiate computing environments from configuration templates on large public cloud infrastructures, such as Amazon Web Services, may provide a mechanism for groups to easily use each other's work and collaborate indirectly. Moreover, using the public cloud for this workshop allowed a single group to host a large read-only data archive, making datasets of interest to the community widely available on the public cloud, enabling other groups to connect directly to the data and reducing the costs of the collaborative work by freeing individual groups from redundantly retrieving, integrating, or financing the storage of the datasets of interest.
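The template-based instantiation described in this record can be sketched as a small data transformation: a shared configuration template expands into a per-user launch request, while every group mounts the same read-only archive instead of copying it. All field names, the image ID, and the S3 path below are hypothetical placeholders, not OpenNEX's actual configuration format.

```python
# Hypothetical configuration template of the kind the workshop describes:
# a shareable recipe from which any group can instantiate the same
# computing environment on a public cloud (all values are illustrative).
TEMPLATE = {
    "name": "opennex-climate-analysis",
    "image": "ami-example123",                        # pre-built machine image
    "instance_type": "r5.2xlarge",
    "shared_data": "s3://example-bucket/nex-dcp30/",  # read-only archive
}

def launch_spec(template, owner):
    """Expand a template into a per-user launch request; every group
    mounts the same shared dataset rather than retrieving its own copy."""
    return {
        "ImageId": template["image"],
        "InstanceType": template["instance_type"],
        "Tags": [{"Key": "owner", "Value": owner},
                 {"Key": "workshop", "Value": template["name"]}],
        "DataMount": template["shared_data"],
    }

spec = launch_spec(TEMPLATE, owner="group-a")
print(spec["DataMount"])  # s3://example-bucket/nex-dcp30/
```

In practice such a spec would be passed to a cloud API (e.g. an EC2 launch call); the point of the sketch is that the template, not the data, is what each group duplicates.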

  8. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability

    PubMed Central

    Chong, Ilyoung

    2018-01-01

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. The Web Objects enabled IoT environment applies the principle of reusability of objects in multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain a relationship with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object that exists in the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in the Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use-case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach. PMID:29373491

  9. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability.

    PubMed

    Jarwar, Muhammad Aslam; Kibria, Muhammad Golam; Ali, Sajjad; Chong, Ilyoung

    2018-01-26

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. The Web Objects enabled IoT environment applies the principle of reusability of objects in multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain a relationship with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object that exists in the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in the Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use-case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach.
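The get-or-create reuse pattern these two records describe (reuse an already-instantiated object when a similar request arrives, even from another domain) can be sketched as follows. The class, the matching rule (capability-set equality), and the field names are assumptions made for illustration, not the paper's actual discovery and matching algorithms.

```python
class ObjectRepository:
    """Toy central objects repository with get-or-create semantics
    (illustrative; the matching rule is an assumption, not the paper's
    Web-of-Objects algorithm)."""

    def __init__(self):
        self._objects = {}

    def request(self, domain, capabilities):
        # Match on capabilities only, so an object instantiated for one
        # domain can be reused by a similar request from another domain.
        key = frozenset(capabilities)
        if key not in self._objects:
            self._objects[key] = {"capabilities": sorted(capabilities),
                                  "created_for": domain}
        return self._objects[key]

repo = ObjectRepository()
a = repo.request("smart-home", {"temperature", "humidity"})
b = repo.request("smart-farm", {"humidity", "temperature"})
print(a is b)              # True: the second request reuses the instance
print(len(repo._objects))  # 1
```

The second request creates nothing: it finds the matching object in the repository and returns it, which is the duplication-avoidance property the abstract claims.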

  10. Theorising big IT programmes in healthcare: strong structuration theory meets actor-network theory.

    PubMed

    Greenhalgh, Trisha; Stones, Rob

    2010-05-01

    The UK National Health Service is grappling with various large and controversial IT programmes. We sought to develop a sharper theoretical perspective on the question "What happens - at macro-, meso- and micro-level - when government tries to modernise a health service with the help of big IT?" Using examples from data fragments at the micro-level of clinical work, we considered how structuration theory and actor-network theory (ANT) might be combined to inform empirical investigation. Giddens (1984) argued that social structures and human agency are recursively linked and co-evolve. ANT studies the relationships that link people and technologies in dynamic networks. It considers how discourses become inscribed in the data structures and decision models of software, making certain network relations irreversible. Stones' (2005) strong structuration theory (SST) is a refinement of Giddens' work, systematically concerned with empirical research. It views human agents as linked in dynamic networks of position-practices. A quadripartite approach considers [a] external social structures (conditions for action); [b] internal social structures (agents' capabilities and what they 'know' about the social world); [c] active agency and actions; and [d] outcomes as they feed back on the position-practice network. In contrast to early structuration theory and ANT, SST insists on disciplined conceptual methodology and on linking this with empirical evidence. In this paper, we adapt SST for the study of technology programmes, integrating elements from material interactionism and ANT. We argue, for example, that the position-practice network can be a socio-technical one in which technologies, in conjunction with humans, can be studied as 'actants'. Human agents, with their complex socio-cultural frames, are required to instantiate technology in social practices. Structurally relevant properties inscribed and embedded in technological artefacts constrain and enable human agency.
The fortunes of healthcare IT programmes might be studied in terms of the interplay between these factors. Copyright 2010 Elsevier Ltd. All rights reserved.

  11. Genomic instantiation of consciousness in neurons through a biophoton field theory.

    PubMed

    Cacha, Lleuvelyn A; Poznanski, Roman R

    2014-06-01

    A theoretical framework is developed based on the premise that brains evolved into sufficiently complex adaptive systems capable of instantiating genomic consciousness through self-awareness and complex interactions that recognize qualitatively the controlling factors of biological processes. Furthermore, our hypothesis assumes that the collective interactions in neurons yield macroergic effects, which can produce sufficiently strong electric energy fields for electronic excitations to take place on the surface of endogenous structures via alpha-helical integral proteins as electro-solitons. Specifically, the process of radiative relaxation of the electro-solitons allows for the transfer of energy via interactions with deoxyribonucleic acid (DNA) molecules to induce conformational changes in DNA molecules, producing an ultra-weak, non-thermal, spontaneous emission of coherent biophotons through a quantum effect. The instantiation of coherent biophotons confined in spaces of DNA molecules guides the biophoton field to be instantaneously conducted along the axonal and neuronal arbors, in between neurons, and throughout the cerebral cortex (cortico-thalamic system) and subcortical areas (e.g., midbrain and hindbrain). This provides an informational character for the electric coherence of the brain, referred to as quantum coherence. The biophoton field is realized as a conscious field upon the re-absorption of biophotons by exciplex states of DNA molecules. Such a quantum phenomenon brings about self-awareness and enables objectivity to have access to subjectivity in the unconscious. As such, subjective experiences can be recalled to consciousness as subjective conscious experiences or qualia through co-operative interactions between exciplex states of DNA molecules and biophotons, leading to metabolic activity and energy transfer across proteins as a result of protein-ligand binding during protein-protein communication.
The biophoton field as a conscious field is attributable to the resultant effect of specifying qualia from the metabolic energy field that is transported in macromolecular proteins throughout specific networks of neurons that are constantly transforming into more stable associable representations as molecular solitons. The metastability of subjective experiences based on resonant dynamics occurs when bottom-up patterns of neocortical excitatory activity are matched with top-down expectations as adaptive dynamic pressures. These dynamics of on-going activity patterns influenced by the environment and selected as the preferred subjective experience in terms of a functional field through functional interactions and biological laws are realized as subjectivity and actualized through functional integration as qualia. It is concluded that interactionism and not information processing is the key in understanding how consciousness bridges the explanatory gap between subjective experiences and their neural correlates in the transcendental brain.

  12. Computational models of music perception and cognition I: The perceptual and cognitive processing chain

    NASA Astrophysics Data System (ADS)

    Purwins, Hendrik; Herrera, Perfecto; Grachten, Maarten; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    We present a review of perception and cognition models designed for or applicable to music. Emphasis is placed on computational implementations. We include findings from different disciplines: neuroscience, psychology, cognitive science, artificial intelligence, and musicology. The article summarizes the methodology that these disciplines use to approach the phenomena of music understanding, the localization of musical processes in the brain, and the flow of cognitive operations involved in turning physical signals into musical symbols, going from the transducers to the memory systems of the brain. We discuss formal models developed to emulate, explain, and predict phenomena involved in early auditory processing, pitch processing, grouping, source separation, and music structure computation. We cover generic computational architectures of attention, memory, and expectation that can be instantiated and tuned to deal with specific musical phenomena. Criteria for the evaluation of such models are presented and discussed. Thereby, we lay out the general framework that provides the basis for the discussion of domain-specific music models in Part II.

  13. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
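A minimal sketch of the discrete-event approach this record describes: events sit in a time-ordered queue, and each handler updates node state and may schedule follow-on events, here modelling a classical header that trails its quantum signal. The event names, times, and two-node network are invented for the example; this is not the OSTI software's API.

```python
import heapq

def simulate(events, handlers):
    """Minimal discrete-event loop: pop the earliest event, let its
    handler update shared state and schedule follow-on events."""
    queue = list(events)
    heapq.heapify(queue)
    state = {}
    while queue:
        time, node, kind = heapq.heappop(queue)
        for new_event in handlers[kind](time, node, state):
            heapq.heappush(queue, new_event)
    return state

def send(time, node, state):
    # The classical header arrives 1 time unit after the quantum signal,
    # modelling the quantum/classical synchronization problem.
    return [(time + 0.5, "B", "quantum"), (time + 1.5, "B", "classical")]

def arrive(kind):
    def handler(time, node, state):
        state.setdefault(node, []).append((time, kind))
        return []
    return handler

handlers = {"send": send, "quantum": arrive("quantum"),
            "classical": arrive("classical")}
final = simulate([(0.0, "A", "send")], handlers)
print(final["B"])  # [(0.5, 'quantum'), (1.5, 'classical')]
```

Querying `final` at the end plays the role of the built-in diagnostics mentioned above: the expected state of each node at a future time falls out of the ordered event log.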

  14. N400 ERPs for actions: building meaning in context

    PubMed Central

    Amoruso, Lucía; Gelormini, Carlos; Aboitiz, Francisco; Alvarez González, Miguel; Manes, Facundo; Cardona, Juan F.; Ibanez, Agustín

    2013-01-01

    Converging neuroscientific evidence suggests the existence of close links between language and sensorimotor cognition. Accordingly, during the comprehension of meaningful actions, our brain would recruit semantic-related operations similar to those associated with the processing of language information. Consistent with this view, electrophysiological findings show that the N400 component, traditionally linked to the semantic processing of linguistic material, can also be elicited by action-related material. This review outlines recent data from N400 studies that examine the understanding of action events. We focus on three specific domains, including everyday action comprehension, co-speech gesture integration, and the semantics involved in motor planning and execution. Based on the reviewed findings, we suggest that both negativities (the N400 and the action-N400) reflect a common neurocognitive mechanism involved in the construction of meaning through the expectancies created by previous experiences and current contextual information. To shed light on how this process is instantiated in the brain, a testable contextual fronto-temporo-parietal model is proposed. PMID:23459873

  15. JPL Space Telecommunications Radio System Operating Environment

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.; Duncan, Courtney B.; Orozco, David S.; Stern, Ryan A.; Ahten, Earl R.; Girard, Mike

    2013-01-01

    A flight-qualified implementation of a Software Defined Radio (SDR) Operating Environment for the JPL-SDR built for the CoNNeCT Project has been developed. It is compliant with the NASA Space Telecommunications Radio System (STRS) Architecture Standard, and provides the software infrastructure for STRS-compliant waveform applications. This software provides a standards-compliant, abstracted view of the JPL-SDR hardware platform. It uses industry-standard POSIX interfaces for most functions, as well as exposing the STRS API (Application Programming Interface) required by the standard. This software includes a standardized interface for IP components instantiated within a Xilinx FPGA (Field Programmable Gate Array). The software provides a standardized, abstracted interface to platform resources such as data converters, the file system, etc., which can be used by STRS standards-conformant waveform applications. It provides a generic SDR operating environment with a much smaller resource footprint than similar products such as SCA (Software Communications Architecture) compliant implementations or the DoD Joint Tactical Radio System (JTRS).

  16. Ventromedial hypothalamic neurons control a defensive emotion state

    PubMed Central

    Kunwar, Prabhat S; Zelikowsky, Moriel; Remedios, Ryan; Cai, Haijiang; Yilmaz, Melis; Meister, Markus; Anderson, David J

    2015-01-01

    Defensive behaviors reflect underlying emotion states, such as fear. The hypothalamus plays a role in such behaviors, but prevailing textbook views depict it as an effector of upstream emotion centers, such as the amygdala, rather than as an emotion center itself. We used optogenetic manipulations to probe the function of a specific hypothalamic cell type that mediates innate defensive responses. These neurons are sufficient to drive multiple defensive actions, and are required for defensive behaviors in diverse contexts. The behavioral consequences of activating these neurons, moreover, exhibit properties characteristic of emotion states in general, including scalability, (negative) valence, generalization, and persistence. Importantly, these neurons can also condition learned defensive behavior, further refuting long-standing claims that the hypothalamus is unable to support emotional learning and is therefore not an emotion center. These data indicate that the hypothalamus plays an integral role in instantiating emotion states, and is not simply a passive effector of upstream emotion centers. DOI: http://dx.doi.org/10.7554/eLife.06633.001 PMID:25748136

  17. Demonstration of Supervisory Control and Data Acquisition (SCADA) Virtualization Capability in the US Army Research Laboratory (ARL)/Sustaining Base Network Assurance Branch (SBNAB) US Army Cyber Analytics Laboratory (ACAL) SCADA Hardware Testbed

    DTIC Science & Technology

    2015-05-01

    application, while the simulated PLC software is the open source ModbusPal Java application. When queried using the Modbus TCP protocol, ModbusPal reports... and programmable logic controller (PLC) components. The HMI and PLC components were instantiated with software and installed in multiple virtual... creating and capturing HMI–PLC network traffic over a 24-h period in the virtualized network and inspecting the packets for errors. Test the

  18. The deeper sources of political conflict: evidence from the psychological, cognitive, and neuro-sciences.

    PubMed

    Hibbing, John R; Smith, Kevin B; Peterson, Johnathan C; Feher, Balazs

    2014-03-01

    Political disputes ruin family reunions, scuttle policy initiatives, and spur violence and even terrorism. We summarize recent research indicating that the source of political differences can be found in biologically instantiated and often subthreshold predispositions as reflected in physiological, cognitive, and neural patterns that incline some people toward innovation and others toward conservatism. These findings suggest the need to revise traditional views that maintain that political opinions are the product of rational, conscious, socialized thought. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Generic framework for the secure Yuen 2000 quantum-encryption protocol employing the wire-tap channel approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mihaljevic, Miodrag J.

    2007-05-15

    It is shown that the security, against known-plaintext attacks, of the Yuen 2000 (Y00) quantum-encryption protocol can be considered via the wire-tap channel model assuming that the heterodyne measurement yields the sample for security evaluation. Employing the results reported on the wire-tap channel, a generic framework is proposed for developing secure Y00 instantiations. The proposed framework employs a dedicated encoding which together with inherent quantum noise at the attacker's side provides Y00 security.
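
    For background, the wire-tap framing invoked above is Wyner's. Writing the legitimate channel capacity as C_B and the eavesdropper's channel capacity as C_E (symbols chosen here for illustration; the abstract does not give them), reliable and secret communication is possible at any rate below the secrecy capacity:

```latex
% Wyner's wire-tap channel result (standard background, not a quotation
% from the record above): with legitimate capacity C_B and eavesdropper
% capacity C_E, the latter degraded here by the quantum noise seen at
% the attacker's heterodyne measurement,
C_s \;=\; \max\bigl(C_B - C_E,\; 0\bigr).
% A dedicated encoding that widens the gap C_B - C_E therefore
% strengthens a Y00 instantiation built on this framework.
```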

  20. A software development and evolution model based on decision-making

    NASA Technical Reports Server (NTRS)

    Wild, J. Christian; Dong, Jinghuan; Maly, Kurt

    1991-01-01

    Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However the design process is not well understood. The software design and evolution process is the focus of interest, and a three dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model called 3DPM(sub p) which was partly implemented, is presented. Discussion of the use of this model in software reuse and process management is given.

  1. The connection between logical and thermodynamic irreversibility

    NASA Astrophysics Data System (ADS)

    Ladyman, James; Presnell, Stuart; Short, Anthony J.; Groisman, Berry

    There has recently been a good deal of controversy about Landauer's Principle, which is often stated as follows: the erasure of one bit of information in a computational device is necessarily accompanied by a generation of kT ln 2 heat. This is often generalised to the claim that any logically irreversible operation cannot be implemented in a thermodynamically reversible way. Norton [2005. Eaters of the lotus: Landauer's principle and the return of Maxwell's demon. Studies in History and Philosophy of Modern Physics, 36, 375-411] and Maroney [2005. The (absence of a) relationship between thermodynamic and logical reversibility. Studies in History and Philosophy of Modern Physics, 36, 355-374] both argue that Landauer's Principle has not been shown to hold in general, and Maroney offers a method that he claims instantiates the operation Reset in a thermodynamically reversible way. In this paper we defend the qualitative form of Landauer's Principle, and clarify its quantitative consequences (assuming the second law of thermodynamics). We analyse in detail what it means for a physical system to implement a logical transformation L, and we make this precise by defining the notion of an L-machine. Then we show that logical irreversibility of L implies thermodynamic irreversibility of every corresponding L-machine. We do this in two ways. First, by assuming the phenomenological validity of the Kelvin statement of the second law, and second, by using information-theoretic reasoning. We illustrate our results with the example of the logical transformation 'Reset', and thereby recover the quantitative form of Landauer's Principle.
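
    The quantitative form of the principle defended in this record can be stated in two lines (standard textbook reasoning, not a quotation from the paper):

```latex
% Erasing one bit halves the number of accessible logical states, so the
% entropy of the information-bearing degrees of freedom drops by
\Delta S_{\mathrm{info}} = k \ln 2 .
% Since the second law forbids a decrease in total entropy, the
% environment at temperature T must absorb at least
Q \;\ge\; T\,\Delta S_{\mathrm{info}} \;=\; kT \ln 2
% per erased bit, which is the quantitative form of Landauer's Principle.
```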

  2. A PACS archive architecture supported on cloud services.

    PubMed

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2012-05-01

    Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a Cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider and the core modules were successfully instantiated in examples of two cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the Cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.

  3. Instantiating the multiple levels of analysis perspective in a program of study on externalizing behavior

    PubMed Central

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa M.

    2014-01-01

    During the last quarter century, developmental psychopathology has become increasingly inclusive and now spans disciplines ranging from psychiatric genetics to primary prevention. As a result, developmental psychopathologists have extended traditional diathesis–stress and transactional models to include causal processes at and across all relevant levels of analysis. Such research is embodied in what is known as the multiple levels of analysis perspective. We describe how multiple levels of analysis research has informed our current thinking about antisocial and borderline personality development among trait impulsive and therefore vulnerable individuals. Our approach extends the multiple levels of analysis perspective beyond simple Biology × Environment interactions by evaluating impulsivity across physiological systems (genetic, autonomic, hormonal, neural), psychological constructs (social, affective, motivational), developmental epochs (preschool, middle childhood, adolescence, adulthood), sexes (male, female), and methods of inquiry (self-report, informant report, treatment outcome, cardiovascular, electrophysiological, neuroimaging). By conducting our research using any and all available methods across these levels of analysis, we have arrived at a developmental model of trait impulsivity that we believe confers a greater understanding of this highly heritable trait and captures at least some heterogeneity in key behavioral outcomes, including delinquency and suicide. PMID:22781868

  4. Optimizing Within-Subject Experimental Designs for jICA of Multi-Channel ERP and fMRI

    PubMed Central

    Mangalathu-Arumana, Jain; Liebenthal, Einat; Beardsley, Scott A.

    2018-01-01

    Joint independent component analysis (jICA) can be applied within subject for fusion of multi-channel event-related potentials (ERP) and functional magnetic resonance imaging (fMRI), to measure brain function at high spatiotemporal resolution (Mangalathu-Arumana et al., 2012). However, the impact of experimental design choices on jICA performance has not been systematically studied. Here, the sensitivity of jICA for recovering neural sources in individual data was evaluated as a function of imaging SNR, number of independent representations of the ERP/fMRI data, relationship between instantiations of the joint ERP/fMRI activity (linear, non-linear, uncoupled), and type of sources (varying parametrically and non-parametrically across representations of the data), using computer simulations. Neural sources were simulated with spatiotemporal and noise attributes derived from experimental data. The best performance, maximizing both cross-modal data fusion and the separation of brain sources, occurred with a moderate number of representations of the ERP/fMRI data (10–30), as in a mixed block/event related experimental design. Importantly, the type of relationship between instantiations of the ERP/fMRI activity, whether linear, non-linear or uncoupled, did not in itself impact jICA performance, and was accurately recovered in the common profiles (i.e., mixing coefficients). Thus, jICA provides an unbiased way to characterize the relationship between ERP and fMRI activity across brain regions, in individual data, rendering it potentially useful for characterizing pathological conditions in which neurovascular coupling is adversely affected. PMID:29410611
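
    The shared-mixing-coefficient idea at the heart of jICA can be illustrated with a toy simulation. The sketch below substitutes an SVD for the ICA step purely to stay self-contained, and all dimensions and noise levels are invented, not those of the study: ERP and fMRI features for each representation are concatenated into one row, so a single decomposition must explain both modalities with shared per-representation loadings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_repr = 20              # representations of the ERP/fMRI data (e.g., conditions)
n_erp, n_fmri = 50, 80   # feature dimensions per modality (illustrative)

# Two latent sources whose expression varies across representations.
mixing = rng.normal(size=(n_repr, 2))     # shared per-representation loadings
src_erp = rng.normal(size=(2, n_erp))     # ERP spatiotemporal profiles
src_fmri = rng.normal(size=(2, n_fmri))   # fMRI spatial profiles

# Joint data matrix: each row concatenates both modalities, so one
# decomposition must explain them with SHARED mixing coefficients.
X = np.hstack([mixing @ src_erp, mixing @ src_fmri])
X += 0.01 * rng.normal(size=X.shape)      # imaging noise

U, s, Vt = np.linalg.svd(X, full_matrices=False)
recovered = U[:, :2] * s[:2]              # estimated mixing coefficients

# The recovered loadings span the same subspace as the true ones
# (up to an invertible 2x2 map, absorbed by least squares).
P = np.linalg.lstsq(recovered, mixing, rcond=None)[0]
residual = np.linalg.norm(recovered @ P - mixing) / np.linalg.norm(mixing)
print(residual < 0.05)  # True: shared profiles recovered up to rotation
```

    The point of the toy is the data layout, not the decomposition: the "common profiles" the abstract refers to are exactly the per-row loadings shared across the two concatenated modalities.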

  5. Decoding "us" and "them": Neural representations of generalized group concepts.

    PubMed

    Cikara, Mina; Van Bavel, Jay J; Ingbretsen, Zachary A; Lau, Tatiana

    2017-05-01

    Humans form social coalitions in every society on earth, yet we know very little about how the general concepts us and them are represented in the brain. Evolutionary psychologists have argued that the human capacity for group affiliation is a byproduct of adaptations that evolved for tracking coalitions in general. These theories suggest that humans possess a common neural code for the concepts in-group and out-group, regardless of the category by which group boundaries are instantiated. The authors used multivoxel pattern analysis to identify the neural substrates of generalized group concept representations. They trained a classifier to encode how people represented the most basic instantiation of a specific social group (i.e., arbitrary teams created in the lab with no history of interaction or associated stereotypes) and tested how well the neural data decoded membership along an objectively orthogonal, real-world category (i.e., political parties). The dorsal anterior cingulate cortex/middle cingulate cortex and anterior insula were associated with representing groups across multiple social categories. Restricting the analyses to these regions in a separate sample of participants performing an explicit categorization task, the authors replicated cross-categorization classification in anterior insula. Classification accuracy across categories was driven predominantly by the correct categorization of in-group targets, consistent with theories indicating in-group preference is more central than out-group derogation to group perception and cognition. These findings highlight the extent to which social group concepts rely on domain-general circuitry associated with encoding stimuli's functional significance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Mechanisms and significance of brain glucose signaling in energy balance, glucose homeostasis, and food-induced reward.

    PubMed

    Devarakonda, Kavya; Mobbs, Charles V

    2016-12-15

    The concept that hypothalamic glucose signaling plays an important role in regulating energy balance, e.g., as instantiated in the so-called "glucostat" hypothesis, is one of the oldest in the field of metabolism. However the mechanisms by which neurons in the hypothalamus sense glucose, and the function of glucose signaling in the brain, has been difficult to establish. Nevertheless recent studies probing mechanisms of glucose signaling have also strongly supported a role for glucose signaling in regulating energy balance, glucose homeostasis, and food-induced reward. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Reuse Tools to Support ADA Instantiation Construction

    DTIC Science & Technology

    1990-06-01

    Figure 3-1, Research Summary: Gaining a New Perspective. Working definitions of several relevant and driving terms are now in order: A software part...

  8. Empathy and Its Discontents.

    PubMed

    Bloom, Paul

    2017-01-01

    What role does the experience of feeling what you think others are feeling - often known as 'empathy' - have in moral deliberation and moral action? Empathy has many fans and there is abundant evidence that it can motivate prosocial behavior. However, empathy is narrow in its focus, rendering it innumerate and subject to bias. It can motivate cruelty and aggression and lead to burnout and exhaustion. Compassion is distinct from empathy in its neural instantiation and its behavioral consequences and is a better prod to moral action, particularly in the modern world we live in. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Ada Namelist Package

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Ada Namelist Package, developed for Ada programming language, enables calling program to read and write FORTRAN-style namelist files. Features are: handling of any combination of types defined by user; ability to read vectors, matrices, and slices of vectors and matrices; handling of mismatches between variables in namelist file and those in programmed list of namelist variables; and ability to avoid searching entire input file for each variable. Principle benefits derived by user: ability to read and write namelist-readable files, ability to detect most file errors in initialization phase, and organization keeping number of instantiated units to few packages rather than to many subprograms.
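
    A minimal flavor of namelist reading, transposed to Python for illustration (the actual package is Ada and far more capable, handling matrices, slices, and variable mismatches):

```python
import re

def parse_namelist(text):
    """Parse a minimal FORTRAN-style namelist: &GROUP name=values ... /
    Simplified sketch: scalars and comma-separated numeric vectors only."""
    groups = {}
    for m in re.finditer(r"&(\w+)(.*?)/", text, re.S):
        gname, body = m.group(1), m.group(2)
        variables = {}
        for vm in re.finditer(r"(\w+)\s*=\s*([^=/&]+?)(?=\s+\w+\s*=|\s*$)",
                              body.strip(), re.S):
            vals = [float(v) for v in vm.group(2).replace(",", " ").split()]
            variables[vm.group(1)] = vals[0] if len(vals) == 1 else vals
        groups[gname] = variables
    return groups

sample = """
&flight
  mass = 1200.0
  thrust = 3.1, 3.2, 3.3
/
"""
nl = parse_namelist(sample)
print(nl["flight"]["mass"])    # 1200.0
print(nl["flight"]["thrust"])  # [3.1, 3.2, 3.3]
```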

  10. A Self-Provisioning Mechanism in OpenStack for IoT Devices.

    PubMed

    Solano, Antonio; Dormido, Raquel; Duro, Natividad; Sánchez, Juan Miguel

    2016-08-17

    The aim of this paper is to introduce a plug-and-play mechanism for an Internet of Things (IoT) device to instantiate a Software as a Service (SaaS) application in a private cloud, built up with OpenStack. The SaaS application is the digital avatar of a physical object connected to the Internet. As a proof of concept, a Vending Machine is retrofitted and connected to the Internet with an Arduino Open Hardware device. Once the self-configuration mechanism is completed, it is possible to order a product from a mobile communication device.
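
    The self-provisioning flow can be caricatured in a few lines of Python. Everything below (the message fields and the mock cloud) is invented for illustration; the paper's actual protocol and OpenStack API calls are not reproduced here:

```python
import json
import uuid

def build_provision_request(device_type: str, capabilities: list) -> str:
    """Hypothetical message a device would send on first boot;
    field names are illustrative only."""
    return json.dumps({
        "device_id": str(uuid.uuid4()),
        "device_type": device_type,
        "capabilities": capabilities,
    })

class MockCloud:
    """Stand-in for the private-cloud side: on a provisioning request it
    instantiates a SaaS 'digital avatar' for the physical object."""
    def __init__(self):
        self.avatars = {}
    def provision(self, request_json: str) -> str:
        req = json.loads(request_json)
        self.avatars[req["device_id"]] = {
            "type": req["device_type"],
            # One service endpoint per advertised capability.
            "endpoints": [f"/{c}" for c in req["capabilities"]],
        }
        return req["device_id"]

cloud = MockCloud()
dev_id = cloud.provision(
    build_provision_request("vending-machine", ["order", "stock"]))
print(cloud.avatars[dev_id]["endpoints"])  # ['/order', '/stock']
```

    The design point the sketch captures is that the device carries enough self-description for the cloud to build its avatar without manual configuration.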

  11. A Self-Provisioning Mechanism in OpenStack for IoT Devices

    PubMed Central

    Solano, Antonio; Dormido, Raquel; Duro, Natividad; Sánchez, Juan Miguel

    2016-01-01

    The aim of this paper is to introduce a plug-and-play mechanism for an Internet of Things (IoT) device to instantiate a Software as a Service (SaaS) application in a private cloud, built up with OpenStack. The SaaS application is the digital avatar of a physical object connected to the Internet. As a proof of concept, a Vending Machine is retrofitted and connected to the Internet with an Arduino Open Hardware device. Once the self-configuration mechanism is completed, it is possible to order a product from a mobile communication device. PMID:27548166

  12. SERENITY Aware Development of Security and Dependability Solutions

    NASA Astrophysics Data System (ADS)

    Serrano, Daniel; Maña, Antonio; Llarena, Rafael; Crespo, Beatriz Gallego-Nicasio; Li, Keqin

    This chapter presents an infrastructure supporting the implementation of Executable Components (ECs). ECs represent S&D solutions at the implementation level, that is, by means of pieces of executable code. ECs are instantiated by the Serenity Runtime Framework (SRF) as a result of requests coming from applications. The development of ECs requires that programmers have specific technical knowledge about SERENITY, since they need to implement certain interfaces of the ECs according to SERENITY standards. Every EC has to implement the interface between the SRF and the EC itself, and the interface that the EC offers to applications.

  13. Creation of the relevant next: How living systems capture the power of the adjacent possible through sign use.

    PubMed

    Favareau, Donald F

    2015-12-01

    Stuart Kauffman's revolutionary notion of the Adjacent Possible as an organizing principle in nature shares much in common with logician Charles S. Peirce's understanding of the universe as an ever-unfolding 'process ontology' of possibility space that is brought about through the recursive interaction of genuine possibility, transiently actualized order, and emergent (but never fully deterministic) lawfulness. Proceeding from these three fundamental categories of becoming-as-being, Peirce developed a complementary logic of sign relations that, along with Estonian biologist Jakob von Uexküll's action-as-meaning-imprinting Umwelt theory, informs the work that is currently being undertaken under the aegis of Biosemiotics. In this paper, I will highlight the deep affinities between Kauffman's notion of the Adjacent Possible and Biosemiotics' hybrid Peircean/Uexküllian "sign" concept, by which living systems - both as individuals and in the aggregate (i.e., as co-actors, communities and lineages) - "capture" relevant aspects of their relations with the immediately given Adjacent Possible and preserve those recipes for future interaction possibilities as biologically instantiated signs. By so doing, living systems move into the Adjacent Possible by "collapsing the wave function" of possibility not just probabilistically, but guided by system-internal values arising from previously captured sign relations that are biologically instantiated as replicable system biases and generative constraints. The influence of such valenced and end-directed action in the world introduces into the universe the phenomenon of the Relevant (and not just deterministic, or even stochastic) Next. My argument in this paper is that organisms live out their lives perpetually confronted with negotiating the omnipresent Relevant Next, and are informed by the biological capture of their (and their lineage's) previous engagements in doing so.
And because that "capture" of previous agent-object-action relationships is instantiated as biological signs for the guidance of the organism, not only are "successful survival strategies" within a given possibility space captured (as in traditional accounts of Natural Selection), but captured as well within those signs is the entire complement of previously untaken but still veridical real-world possibility spaces that are inseparably 'entangled' with that sign, and just awaiting exploration by the organism. Thus, while all action in the universe is both current-context dependent and next-context creating, the emergence of ever-more complex semiotic capabilities in organisms has expanded the possibility space of immediate-next-action in the world exponentially, and has brought into being not a pre-given, singly end-directed ordered world, but an emergent, many ends-directed world of promiscuous, unforeseeable and interacting telos. The goal of Biosemiotics is to understand and to explore this world. Copyright © 2015. Published by Elsevier Ltd.

  14. Offshore Wind Energy Climate Projection Using UPSCALE Climate Data under the RCP8.5 Emission Scenario

    PubMed Central

    Gross, Markus; Magar, Vanesa

    2016-01-01

    In previous work, the authors demonstrated how data from climate simulations can be utilized to estimate regional wind power densities. In particular, it was shown that the quality of wind power densities, estimated from the UPSCALE global dataset in offshore regions of Mexico, compared well with regional high resolution studies. Additionally, a link between surface temperature and moist air density in the estimates was presented. UPSCALE is an acronym for UK on PRACE (the Partnership for Advanced Computing in Europe)—weather-resolving Simulations of Climate for globAL Environmental risk. The UPSCALE experiment was performed in 2012 by NCAS (National Centre for Atmospheric Science)-Climate, at the University of Reading and the UK Met Office Hadley Centre. The study included a 25.6-year, five-member ensemble simulation of the HadGEM3 global atmosphere, at 25 km resolution for present climate conditions. The initial conditions for the ensemble runs were taken from consecutive days of a test configuration. In the present paper, the emphasis is placed on the single climate run for a potential future climate scenario in the UPSCALE experiment dataset, using the Representative Concentration Pathways (RCP) 8.5 climate change scenario. Firstly, some tests were performed to ensure that the results using only one instantiation of the current climate dataset are as robust as possible within the constraints of the available data. In order to achieve this, an artificial time series over a longer sampling period was created. Then, it was shown that these longer time series provided almost the same results as the short ones, thus leading to the argument that the short time series is sufficient to capture the climate. Finally, with the confidence that one instantiation is sufficient, the future climate dataset was analysed to provide, for the first time, a projection of future changes in wind power resources using the UPSCALE dataset.
It is hoped that this, in turn, will provide some guidance for wind power developers and policy makers to prepare and adapt for climate change impacts on wind energy production. Although offshore locations around Mexico were used as a case study, the dataset is global and hence the methodology presented can be readily applied at any desired location. PMID:27788208

  15. Offshore Wind Energy Climate Projection Using UPSCALE Climate Data under the RCP8.5 Emission Scenario.

    PubMed

    Gross, Markus; Magar, Vanesa

    2016-01-01

    In previous work, the authors demonstrated how data from climate simulations can be utilized to estimate regional wind power densities. In particular, it was shown that the quality of wind power densities, estimated from the UPSCALE global dataset in offshore regions of Mexico, compared well with regional high resolution studies. Additionally, a link between surface temperature and moist air density in the estimates was presented. UPSCALE is an acronym for UK on PRACE (the Partnership for Advanced Computing in Europe)-weather-resolving Simulations of Climate for globAL Environmental risk. The UPSCALE experiment was performed in 2012 by NCAS (National Centre for Atmospheric Science)-Climate, at the University of Reading and the UK Met Office Hadley Centre. The study included a 25.6-year, five-member ensemble simulation of the HadGEM3 global atmosphere, at 25 km resolution for present climate conditions. The initial conditions for the ensemble runs were taken from consecutive days of a test configuration. In the present paper, the emphasis is placed on the single climate run for a potential future climate scenario in the UPSCALE experiment dataset, using the Representative Concentration Pathways (RCP) 8.5 climate change scenario. Firstly, some tests were performed to ensure that the results using only one instantiation of the current climate dataset are as robust as possible within the constraints of the available data. In order to achieve this, an artificial time series over a longer sampling period was created. Then, it was shown that these longer time series provided almost the same results as the short ones, thus leading to the argument that the short time series is sufficient to capture the climate. Finally, with the confidence that one instantiation is sufficient, the future climate dataset was analysed to provide, for the first time, a projection of future changes in wind power resources using the UPSCALE dataset.
It is hoped that this, in turn, will provide some guidance for wind power developers and policy makers to prepare and adapt for climate change impacts on wind energy production. Although offshore locations around Mexico were used as a case study, the dataset is global and hence the methodology presented can be readily applied at any desired location.

  16. Lightness computation by the human visual system

    NASA Astrophysics Data System (ADS)

    Rudd, Michael E.

    2017-05-01

    A model of achromatic color computation by the human visual system is presented, which is shown to account in an exact quantitative way for a large body of appearance matching data collected with simple visual displays. The model equations are closely related to those of the original Retinex model of Land and McCann. However, the present model differs in important ways from Land and McCann's theory in that it invokes additional biological and perceptual mechanisms, including contrast gain control, different inherent neural gains for incremental and decremental luminance steps, and two types of top-down influence on the perceptual weights applied to local luminance steps in the display: edge classification and attentional windowing of spatial integration. Arguments are presented to support the claim that these various visual processes must be instantiated by a particular underlying neural architecture. By pointing to correspondences between the architecture of the model and findings from visual neurophysiology, this paper suggests that edge classification involves a top-down gating of neural edge responses in early visual cortex (cortical areas V1 and/or V2) while spatial integration windowing occurs in cortical area V4 or beyond.

  17. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering, over the past five decades, has used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method, but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. In that most health information systems have safety implications, we argue that the design and development of such systems would benefit by implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists in the interdisciplinary development of health information systems.

  18. Advanced Software Development Workstation Project

    NASA Technical Reports Server (NTRS)

    Lee, Daniel

    1989-01-01

    The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.

  19. SWARMs Ontology: A Common Information Model for the Cooperation of Underwater Robots.

    PubMed

    Li, Xin; Bilbao, Sonia; Martín-Wanton, Tamara; Bastos, Joaquim; Rodriguez, Jonathan

    2017-03-11

    In order to facilitate cooperation between underwater robots, it is a must for robots to exchange information with unambiguous meaning. However, heterogeneity, existing in information pertaining to different robots, is a major obstruction. Therefore, this paper presents a networked ontology, named the Smart and Networking Underwater Robots in Cooperation Meshes (SWARMs) ontology, to address information heterogeneity and enable robots to have the same understanding of exchanged information. The SWARMs ontology uses a core ontology to interrelate a set of domain-specific ontologies, including the mission and planning, the robotic vehicle, the communication and networking, and the environment recognition and sensing ontology. In addition, the SWARMs ontology utilizes ontology constructs defined in the PR-OWL ontology to annotate context uncertainty based on the Multi-Entity Bayesian Network (MEBN) theory. Thus, the SWARMs ontology can provide both a formal specification for information that is necessarily exchanged between robots and a command and control entity, and also support for uncertainty reasoning. A scenario on chemical pollution monitoring is described and used to showcase how the SWARMs ontology can be instantiated, be extended, represent context uncertainty, and support uncertainty reasoning.
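
    How a core ontology can interrelate domain-specific ontologies and then be instantiated for a mission can be sketched with a toy triple store. The class and property names below are illustrative, not the actual SWARMs vocabulary, and the store ignores the PR-OWL/MEBN uncertainty machinery entirely:

```python
# Minimal in-memory triple store: (subject, predicate, object) tuples.
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

# Core ontology interrelating the domain-specific ontologies.
for domain in ["MissionPlanning", "RoboticVehicle",
               "Communication", "EnvironmentSensing"]:
    add(domain, "importedBy", "SWARMsCore")

# Instantiation for a chemical pollution monitoring scenario.
add("auv1", "instanceOf", "RoboticVehicle")
add("mission42", "instanceOf", "MissionPlanning")
add("mission42", "assignedTo", "auv1")
add("reading7", "observedBy", "auv1")
add("reading7", "measures", "ChemicalConcentration")

def query(p, o):
    """All subjects with predicate p pointing at object o."""
    return sorted(s for (s, pp, oo) in triples if pp == p and oo == o)

print(query("importedBy", "SWARMsCore"))
print(query("instanceOf", "RoboticVehicle"))  # ['auv1']
```

    A real implementation would use an OWL reasoner over the published vocabulary; the sketch only shows the shape of "core ontology interrelates domain ontologies, then gets instantiated per scenario".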

  20. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  1. Development and Application of Compatible Discretizations of Maxwell's Equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D; Koning, J; Rieben, R

    We present the development and application of compatible finite element discretizations of electromagnetics problems derived from the time-dependent, full-wave Maxwell equations. We review the H(curl)-conforming finite element method, using the concepts and notations of differential forms as a theoretical framework. We chose this approach because it can handle complex geometries, it is free of spurious modes, it is numerically stable without the need for filtering or artificial diffusion, it correctly models the discontinuity of fields across material boundaries, and it can be of very high order. Higher-order H(curl) and H(div) conforming basis functions are not unique, and we have designed an extensible C++ framework that supports a variety of specific instantiations of these, such as standard interpolatory bases, spectral bases, hierarchical bases, and semi-orthogonal bases. Virtually any electromagnetics problem that can be cast in the language of differential forms can be solved using our framework. For time-dependent problems a method-of-lines scheme is used, where the Galerkin method reduces the PDE to a semi-discrete system of ODEs, which are then integrated in time using finite difference methods. For time integration of wave equations we employ the unconditionally stable implicit Newmark-Beta method, as well as the high-order, energy-conserving explicit Maxwell Symplectic method; for diffusion equations, we employ a generalized Crank-Nicolson method. We conclude with computational examples from resonant cavity problems, time-dependent wave propagation problems, and transient eddy current problems, all obtained using the authors' massively parallel computational electromagnetics code EMSolve.
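
    As a minimal worked instance of the implicit Newmark-Beta integrator cited above, the sketch below applies it to the scalar oscillator u'' = -ω²u, a stand-in for a single mode of the semi-discrete wave equation (this is not EMSolve's code). With β = 1/4 and γ = 1/2 the scheme is unconditionally stable and introduces no algorithmic damping, so the oscillator's energy is preserved.

```python
import math

def newmark_beta(omega, u0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Integrate u'' = -omega^2 * u with the implicit Newmark-beta method.

    beta=1/4, gamma=1/2 (average acceleration) is the unconditionally
    stable variant referred to for wave equations in the abstract above.
    """
    u, v = u0, v0
    a = -omega**2 * u                      # consistent initial acceleration
    for _ in range(steps):
        # Predictor built from known quantities only.
        u_star = u + dt * v + dt**2 * (0.5 - beta) * a
        # Solve the (here scalar) implicit equation for u_{n+1}.
        u_new = u_star / (1.0 + beta * dt**2 * omega**2)
        a_new = -omega**2 * u_new
        v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
    return u, v

# One full period of a unit-amplitude oscillator: the discrete energy
# (v^2 + omega^2 u^2)/2 should match its initial value omega^2/2.
w = 2.0 * math.pi
u, v = newmark_beta(w, u0=1.0, v0=0.0, dt=0.01, steps=100)
print(u, v, 0.5 * (v**2 + w**2 * u**2))
```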

  2. Layout finishing of a 28nm, 3 billions transistors, multi-core processor

    NASA Astrophysics Data System (ADS)

    Morey-Chaisemartin, Philippe; Beisser, Eric

    2013-06-01

    Designing a fully new 256-core processor is a great challenge for a fabless startup. In addition to all the architecture, functionality, and timing issues, the layout by itself is a bottleneck due to all the process constraints of a 28nm technology. As developers of advanced layout finishing solutions, we were involved in the design flow of this huge chip with its 3 billion transistors. We had to face the issue of dummy pattern instantiation with respect to design constraints. All the design rules to generate the "dummies" are clearly defined in the Design Rule Manual, and some automatic procedures are provided by the foundry itself, but these routines do not take the designer's requests into account. Such a chip embeds both digital parts and analog modules for clock and power management. Each of these two types of design has its own set of constraints. In both cases, the insertion of dummies should not introduce unexpected variations leading to malfunctions. For example, on digital parts where signal race conditions are critical on long wires or buses, the introduction of uncontrolled parasitics along these nets is highly critical. For analog devices such as high-frequency, high-sensitivity comparators, the exact symmetry of the two parts of a current mirror generator must be guaranteed. Thanks to the easily customizable features of our dummy insertion tool, we were able to configure it to meet all the designer requirements as well as the process constraints. This paper presents all these advanced key features as well as the layout tricks used to fulfill all requirements.

  3. Silencing the Critics: Understanding the Effects of Cocaine Sensitization on Dorsolateral and Ventral Striatum in the Context of an Actor/Critic Model

    PubMed Central

    Takahashi, Yuji; Schoenbaum, Geoffrey; Niv, Yael

    2008-01-01

    A critical problem in daily decision making is how to choose actions now in order to bring about rewards later. Indeed, many of our actions have long-term consequences, and it is important not to be myopic in balancing the pros and cons of different options, but rather to take into account both immediate and delayed consequences of actions. Failures to do so may be manifest as persistent, maladaptive decision making, one example of which is addiction, where behavior seems to be driven by the immediate positive experiences with drugs, despite the delayed adverse consequences. A recent study by Takahashi et al. (2007) investigated the effects of cocaine sensitization on decision making in rats and showed that drug use resulted in altered representations in the ventral striatum and the dorsolateral striatum, areas that have been implicated in the neural instantiation of a computational solution to optimal long-term action selection called the Actor/Critic framework. In this Focus article we discuss their results and offer a computational interpretation in terms of drug-induced impairments in the Critic. We first survey the different lines of evidence linking the subparts of the striatum to the Actor/Critic framework, and then suggest two possible scenarios of breakdown that are suggested by Takahashi et al.'s (2007) data. As both are compatible with the current data, we discuss their different predictions and how these could be empirically tested in order to further elucidate (and hopefully inch towards curing) the neural basis of drug addiction. PMID:18982111
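
    A minimal tabular sketch of the Actor/Critic framework referred to above (illustrative only, not the model analyzed by Takahashi et al.): the Critic learns state values via a temporal-difference error, and that same error signal trains the Actor's action preferences.

```python
import math
import random

# Minimal tabular Actor/Critic on a short corridor with reward at the end.
# The Critic learns state values V via the temporal-difference (TD) error
# delta; the same delta gates the Actor's preference updates.

random.seed(0)
n_states = 5
V = [0.0] * n_states                           # Critic: state-value estimates
pref = [[0.0, 0.0] for _ in range(n_states)]   # Actor: preferences (left, right)

def policy(s):
    """Softmax choice between action 0 (left) and action 1 (right)."""
    z = [math.exp(p) for p in pref[s]]
    return 0 if random.random() < z[0] / (z[0] + z[1]) else 1

alpha, gamma = 0.1, 0.9
for episode in range(500):
    s = 0
    while s < n_states - 1:
        a = policy(s)
        s2 = s + 1 if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == n_states - 1 else 0.0
        v_next = 0.0 if s2 == n_states - 1 else V[s2]   # terminal value is 0
        delta = r + gamma * v_next - V[s]               # Critic's TD error
        V[s] += alpha * delta                           # Critic update
        pref[s][a] += alpha * delta                     # Actor update
        s = s2

print([round(v, 2) for v in V])   # values rise toward the rewarded end
```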

  4. Bayesian analogy with relational transformations.

    PubMed

    Lu, Hongjing; Chen, Dawn; Holyoak, Keith J

    2012-07-01

    How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy problems. We introduce Bayesian analogy with relational transformations (BART) and apply the model to the task of learning first-order comparative relations (e.g., larger, smaller, fiercer, meeker) from a set of animal pairs. Inputs are coded by vectors of continuous-valued features, based either on human magnitude ratings, normed feature ratings (De Deyne et al., 2008), or outputs of the topics model (Griffiths, Steyvers, & Tenenbaum, 2007). Bootstrapping from empirical priors, the model is able to induce first-order relations represented as probabilistic weight distributions, even when given positive examples only. These learned representations allow classification of novel instantiations of the relations and yield a symbolic distance effect of the sort obtained with both humans and other primates. BART then transforms its learned weight distributions by importance-guided mapping, thereby placing distinct dimensions into correspondence. These transformed representations allow BART to reliably solve 4-term analogies (e.g., larger:smaller::fiercer:meeker), a type of reasoning that is arguably specific to humans. Our results provide a proof-of-concept that structured analogies can be solved with representations induced from unstructured feature vectors by mechanisms that operate in a largely bottom-up fashion. We discuss potential implications for algorithmic and neural models of relational thinking, as well as for the evolution of abstract thought. Copyright 2012 APA, all rights reserved.
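
    The abstract's core claim — that a comparative relation can be induced from magnitude-coded feature vectors and then classify novel pairs, with a symbolic distance effect — can be illustrated with a far simpler stand-in than BART itself. The animal magnitudes and the perceptron-style rule below are invented for illustration only.

```python
# Toy stand-in for BART: learn the relation larger(A, B) from animal pairs
# coded by a 1-D magnitude feature, then classify novel instantiations.
# The magnitudes below are invented for illustration.

sizes = {"elephant": 9.0, "horse": 6.0, "dog": 3.0, "mouse": 1.0, "whale": 10.0}

# Positive examples of larger(A, B).
train = [("elephant", "horse"), ("horse", "dog"), ("dog", "mouse")]

# A single weight on the magnitude difference, fit by a perceptron-style rule.
w = 0.0
for _ in range(20):
    for a, b in train:
        d = sizes[a] - sizes[b]
        if w * d <= 0:          # positive example not yet classified correctly
            w += d

def larger(a, b):
    return w * (sizes[a] - sizes[b]) > 0

def confidence(a, b):
    """Grows with magnitude difference: a symbolic distance effect."""
    return abs(w * (sizes[a] - sizes[b]))

print(larger("whale", "mouse"), confidence("whale", "mouse") > confidence("horse", "dog"))
```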

  5. Not only … but also: REM sleep creates and NREM Stage 2 instantiates landmark junctions in cortical memory networks.

    PubMed

    Llewellyn, Sue; Hobson, J Allan

    2015-07-01

    This article argues that both rapid eye movement (REM) and non-rapid eye movement (NREM) sleep contribute to overnight episodic memory processes, but their roles differ. Episodic memory may have evolved from memory for spatial navigation in animals and humans. Equally, mnemonic navigation in world and mental space may rely on fundamentally equivalent processes. Consequently, the basic spatial network characteristics of pathways which meet at omnidirectional nodes or junctions may be conserved in episodic brain networks. A pathway is formally identified with the unidirectional, sequential phases of an episodic memory. In contrast, the function of omnidirectional junctions is not well understood. In evolutionary terms, both animals and early humans undertook tours to a series of landmark junctions, to take advantage of resources (food, water and shelter), whilst trying to avoid predators. Such tours required memory for emotionally significant landmark resource-place-danger associations and the spatial relationships amongst these landmarks. In consequence, these tours may have driven the evolution of both spatial and episodic memory. The environment is dynamic. Resource-place associations are liable to shift and new resource-rich landmarks may be discovered; these changes may require re-wiring in neural networks. To realise these changes, REM may perform an associative, emotional encoding function between memory networks, engendering an omnidirectional landmark junction which is instantiated in the cortex during NREM Stage 2. In sum, REM may preplay associated elements of past episodes (rather than replay individual episodes), to engender an unconscious representation which can be used by the animal on approach to a landmark junction in wake. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Such stuff as dreams are made on? Elaborative encoding, the ancient art of memory, and the hippocampus.

    PubMed

    Llewellyn, Sue

    2013-12-01

    This article argues that rapid eye movement (REM) dreaming is elaborative encoding for episodic memories. Elaborative encoding in REM can, at least partially, be understood through ancient art of memory (AAOM) principles: visualization, bizarre association, organization, narration, embodiment, and location. These principles render recent memories more distinctive through novel and meaningful association with emotionally salient, remote memories. The AAOM optimizes memory performance, suggesting that its principles may predict aspects of how episodic memory is configured in the brain. Integration and segregation are fundamental organizing principles in the cerebral cortex. Episodic memory networks interconnect profusely within the cortex, creating omnidirectional "landmark" junctions. Memories may be integrated at junctions but segregated along connecting network paths that meet at junctions. Episodic junctions may be instantiated during non-rapid eye movement (NREM) sleep after hippocampal associational function during REM dreams. Hippocampal association involves relating, binding, and integrating episodic memories into a mnemonic compositional whole. This often bizarre, composite image has not been present to the senses; it is not "real" because it hyperassociates several memories. During REM sleep, on the phenomenological level, this composite image is experienced as a dream scene. A dream scene may be instantiated as omnidirectional neocortical junction and retained by the hippocampus as an index. On episodic memory retrieval, an external stimulus (or an internal representation) is matched by the hippocampus against its indices. One or more indices then reference the relevant neocortical junctions from which episodic memories can be retrieved. Episodic junctions reach a processing (rather than conscious) level during normal wake to enable retrieval. If this hypothesis is correct, the stuff of dreams is the stuff of memory.

  7. SUMMA and Model Mimicry: Understanding Differences Among Land Models

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.

    2016-12-01

    Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. 
SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.
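
    SUMMA itself is a Fortran-based modeling framework; the hypothetical Python sketch below only illustrates the architectural idea of a master template whose process parameterizations are selected at runtime, so that two model instantiations differ in exactly one declared choice. Both drainage formulas and their parameters are invented.

```python
# Conceptual sketch (not SUMMA's actual code): one conservation-equation
# driver with interchangeable process parameterizations chosen at runtime.

# Two hypothetical alternatives for a single process: soil drainage.
def drainage_linear(storage, k=0.1):
    return k * storage

def drainage_power(storage, k=0.05, n=2.0):
    return k * storage ** n

ALTERNATIVES = {"linear": drainage_linear, "power": drainage_power}

def run_model(drainage_choice, storage=10.0, steps=5):
    """One water-balance equation shared by every model instantiation."""
    drain = ALTERNATIVES[drainage_choice]
    for _ in range(steps):
        storage -= drain(storage)          # conservation: dS/dt = -Q(S)
    return storage

# Any difference in outcome is isolated to the single declared choice:
print(run_model("linear"), run_model("power"))
```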

  8. Parallel grid library for rapid and flexible simulation development

    NASA Astrophysics Data System (ADS)

    Honkonen, I.; von Alfthan, S.; Sandroos, A.; Janhunen, P.; Palmroth, M.

    2013-04-01

    We present an easy to use and flexible grid library for developing highly scalable parallel simulations. The distributed cartesian cell-refinable grid (dccrg) supports adaptive mesh refinement and allows an arbitrary C++ class to be used as cell data. The amount of data in grid cells can vary both in space and time allowing dccrg to be used in very different types of simulations, for example in fluid and particle codes. Dccrg transfers the data between neighboring cells on different processes transparently and asynchronously allowing one to overlap computation and communication. This enables excellent scalability at least up to 32 k cores in magnetohydrodynamic tests depending on the problem and hardware. In the version of dccrg presented here part of the mesh metadata is replicated between MPI processes reducing the scalability of adaptive mesh refinement (AMR) to between 200 and 600 processes. Dccrg is free software that anyone can use, study and modify and is available at https://gitorious.org/dccrg. Users are also kindly requested to cite this work when publishing results obtained with dccrg. Catalogue identifier: AEOM_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOM_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU Lesser General Public License version 3 No. of lines in distributed program, including test data, etc.: 54975 No. of bytes in distributed program, including test data, etc.: 974015 Distribution format: tar.gz Programming language: C++. Computer: PC, cluster, supercomputer. Operating system: POSIX. The code has been parallelized using MPI and tested with 1-32768 processes RAM: 10 MB-10 GB per process Classification: 4.12, 4.14, 6.5, 19.3, 19.10, 20. 
    External routines: MPI-2 [1], boost [2], Zoltan [3], sfc++ [4]. Nature of problem: Grid library supporting arbitrary data in grid cells, parallel adaptive mesh refinement, transparent remote neighbor data updates and load balancing. Solution method: The simulation grid is represented by an adjacency list (graph) with vertices stored in a hash table and edges in contiguous arrays. The Message Passing Interface standard is used for parallelization. Cell data is given as a template parameter when instantiating the grid. Restrictions: Logically cartesian grid. Running time: Running time depends on the hardware, problem, and solution method. Small problems can be solved in under a minute and very large problems can take weeks. The examples and tests provided with the package take less than about one minute using default options. In the version of dccrg presented here the speed of adaptive mesh refinement is at most of the order of 10^6 total created cells per second. [1] http://www.mpi-forum.org/. [2] http://www.boost.org/. [3] K. Devine, E. Boman, R. Heaphy, B. Hendrickson, C. Vaughan, Zoltan data management services for parallel dynamic applications, Comput. Sci. Eng. 4 (2002) 90-97. http://dx.doi.org/10.1109/5992.988653. [4] https://gitorious.org/sfc++.
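
    dccrg is a C++ library whose cell data type is a compile-time template parameter; the toy Python sketch below mirrors only the storage scheme described above — cells in a hash table, neighborhoods as an adjacency list, arbitrary user data per cell — without the MPI parallelism or mesh refinement.

```python
# Toy, single-process sketch of the dccrg storage scheme (dccrg itself is
# C++ and parallel): a hash table of cells plus an adjacency list, with
# arbitrary user-defined data per cell.

class Grid:
    def __init__(self):
        self.cells = {}       # hash table: cell id -> arbitrary user data
        self.neighbors = {}   # adjacency list: cell id -> neighbor ids

    def add_cell(self, cid, data, neighbor_ids=()):
        self.cells[cid] = data
        self.neighbors[cid] = list(neighbor_ids)
        for n in neighbor_ids:
            self.neighbors.setdefault(n, []).append(cid)

    def neighbor_data(self, cid):
        """Stand-in for dccrg's transparent neighbor data updates."""
        return [self.cells[n] for n in self.neighbors[cid] if n in self.cells]

# Cell data can be any type, mirroring the C++ template parameter.
g = Grid()
g.add_cell(1, {"density": 1.0})
g.add_cell(2, {"density": 0.5}, neighbor_ids=[1])
print(g.neighbor_data(2))
```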

  9. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-09-01

    Call for Papers: Convergence The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. 
Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. 
    These include, but are not limited to:

    • Architecture, design and performance of optical wide-area-network (WAN), metro, and access networks
    • Integration strategies for multiservice transport platforms
    • Access methods that bridge traditional and emerging services
    • Network signaling and control methodologies
    • All-optical packet routing and switching techniques

    To submit to this special issue, follow the normal procedure for submission to JON, indicating "Convergence feature" in the "Comments" field of the online submission form. For all other questions relating to this feature issue, please send an e-mail to jon@osa.org, subject line "Convergence." Additional information can be found on the JON website: http://www.osa-jon.org/submission/. Submission Deadline: 1 October 2005

  10. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2004-12-01

    Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. Continual evolution of technology and applications continues to erase lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking-leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections-have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and shared widely, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations. Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. 
These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, the need for capital conservation and operational efficiency drives carriers away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. These include, but are not limited to:

    • Architecture, design and performance of optical wide-area-network (WAN), metro, and access networks
    • Integration strategies for multiservice transport platforms
    • Access methods that bridge traditional and emerging services
    • Network signaling and control methodologies
    • All-optical packet routing and switching techniques

    Manuscript Submission

    To submit to this special issue, follow the normal procedure for submission to JON, indicating "Convergence feature" in the "Comments" field of the online submission form. For all other questions relating to this feature issue, please send an e-mail to jon@osa.org, subject line "Convergence." Additional information can be found on the JON website: http://www.osa-jon.org/submission/. Submission Deadline: 1 July 2005

  11. Ground-Ground Data Communication-Assisted Planning and Coordination: Shorter Verbal Communications

    NASA Technical Reports Server (NTRS)

    Kessell, Angela Mary; Lee, Paul U.; Smith, Nancy M.; Lee, Hwasoo Eric

    2010-01-01

    A human-in-the-loop simulation was conducted to investigate the operational feasibility, technical requirements, and potential improvement in airspace efficiency of adding a Multi-Sector Planner position. A subset of the data from that simulation is analyzed here to determine the impact, if any, of ground-ground data communication (Data Comm) on verbal communication and coordination for multi-sector air traffic management. The results suggest that the use of Data Comm significantly decreases the duration of individual verbal communications. The results also suggest that the use of Data Comm, as instantiated in the current simulation, does not obviate the need for accompanying voice calls.

  12. Privacy-Preserving Relationship Path Discovery in Social Networks

    NASA Astrophysics Data System (ADS)

    Mezzour, Ghita; Perrig, Adrian; Gligor, Virgil; Papadimitratos, Panos

    As social networking sites continue to proliferate and are used for an increasing variety of purposes, the privacy risks raised by such sites' full access to user data become uncomfortable. A decentralized social network would help alleviate this problem, but offering the functionalities of social networking sites in a distributed manner is a challenging problem. In this paper, we provide techniques to instantiate one of the core functionalities of social networks: discovery of paths between individuals. Our algorithm preserves the privacy of relationship information and can operate offline during the path discovery phase. We simulate our algorithm on real social network topologies.
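
    Stripped of the cryptographic machinery that makes the published protocol privacy-preserving, the underlying functionality is shortest-path discovery in a friendship graph, for instance by breadth-first search. The graph below is invented for illustration.

```python
from collections import deque

# Plain (non-private) relationship path discovery: breadth-first search
# for a shortest chain of acquaintances. The friendship graph is invented.

friends = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice", "dave"],
    "dave": ["bob", "carol", "erin"],
    "erin": ["dave"],
}

def relationship_path(graph, src, dst):
    """Shortest path of acquaintances from src to dst, or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in graph.get(u, []):
            if v not in prev:
                prev[v] = u
                queue.append(v)
    return None

print(relationship_path(friends, "alice", "erin"))
```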

  13. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes it impractical to opportunistically discover a synthesis of systems and algorithms that can satisfy high-level mission specifications. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference for assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  14. Wavelet-based scalable L-infinity-oriented compression.

    PubMed

    Alecu, Alin; Munteanu, Adrian; Cornelis, Jan P H; Schelkens, Peter

    2006-09-01

    Among the different classes of coding techniques proposed in the literature, predictive schemes have proven their outstanding performance in near-lossless compression. However, these schemes are incapable of providing embedded L(infinity)-oriented compression, or, at most, provide a very limited number of potential L(infinity) bit-stream truncation points. We propose a new multidimensional wavelet-based L(infinity)-constrained scalable coding framework that generates a fully embedded L(infinity)-oriented bit stream and that retains the coding performance and all the scalability options of state-of-the-art L2-oriented wavelet codecs. Moreover, our codec instantiation of the proposed framework clearly outperforms JPEG2000 in the L(infinity) coding sense.
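
    The L(infinity) constraint that distinguishes near-lossless from L2-oriented coding can be illustrated with the simplest possible mechanism, a uniform quantizer: with step 2δ+1 on integer samples, no reconstructed sample deviates from the original by more than δ. This is only a sketch of the error model, not the wavelet codec proposed in the paper.

```python
# Sketch of an L-infinity (maximum-error) guarantee via uniform quantization:
# step 2*delta + 1 bounds every per-sample reconstruction error by delta.

def quantize(samples, delta):
    step = 2 * delta + 1
    return [round(s / step) for s in samples]   # indices to be entropy-coded

def dequantize(indices, delta):
    step = 2 * delta + 1
    return [i * step for i in indices]

samples = [0, 3, 7, 12, 100, -5]
delta = 2
rec = dequantize(quantize(samples, delta), delta)
linf = max(abs(s - r) for s, r in zip(samples, rec))
print(rec, linf)   # linf is guaranteed to be <= delta
```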

  15. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    PubMed

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

    Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  16. The Experience of Emotion

    PubMed Central

    Barrett, Lisa Feldman; Mesquita, Batja; Ochsner, Kevin N.; Gross, James J.

    2007-01-01

    Experiences of emotion are content-rich events that emerge at the level of psychological description, but must be causally constituted by neurobiological processes. This chapter outlines an emerging scientific agenda for understanding what these experiences feel like and how they arise. We review the available answers to what is felt (i.e., the content that makes up an experience of emotion) and how neurobiological processes instantiate these properties of experience. These answers are then integrated into a broad framework that describes, in psychological terms, how the experience of emotion emerges from more basic processes. We then discuss the role of such experiences in the economy of the mind and behavior. PMID:17002554

  17. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling.

    PubMed

    Kane, Patrick; Zollman, Kevin J S

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the "hybrid equilibrium," to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith's Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory.

  18. On-Orbit Range Set Applications

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Scheeres, D.

    2011-09-01

    History and methodology of Δv range set computation are briefly reviewed, followed by a short summary of the Δv optimal spacecraft servicing problem literature. Service vehicle placement is approached from a Δv range set viewpoint, providing a framework under which the analysis becomes quite geometric and intuitive. The optimal servicing tour design problem is shown to be a specific instantiation of the metric-Traveling Salesman Problem (TSP), which in general is an NP-hard problem. The Δv-TSP is argued to be quite similar to the Euclidean-TSP, for which approximately optimal solutions may be found in polynomial time. Applications of range sets are demonstrated using analytical and simulation results.
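    The metric-TSP framing above lends itself to a toy illustration. The sketch below is a generic nearest-neighbor tour construction on invented Euclidean points standing in for Δv costs; it is a minimal example of a polynomial-time TSP heuristic, not the tour-design method of the paper.

```python
import math

# Hypothetical coordinates standing in for orbital slots; Euclidean
# distance stands in for the (metric) delta-v cost between them.
points = [(0.0, 0.0), (3.0, 4.0), (6.0, 0.0), (3.0, 1.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_tour(pts, start=0):
    """Greedy metric-TSP heuristic: always visit the closest unvisited point.
    Runs in polynomial time but only approximates the optimal tour."""
    unvisited = set(range(len(pts))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(pts[last], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(pts, tour):
    # Closed tour: include the leg back to the starting point.
    return sum(dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = nearest_neighbor_tour(points)
print(tour, round(tour_length(points, tour), 2))  # -> [0, 3, 1, 2] 17.16
```

    A production solver would refine such a tour (e.g. with 2-opt exchanges), but the greedy pass already illustrates why metric structure makes the servicing-tour problem tractable to approximate.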

  19. Humans, elephants, diamonds and gold: patterns of intentional design in Girolamo Cardano's natural philosophy.

    PubMed

    Giglioni, Guido

    2014-01-01

    Distancing himself from both Aristotelian and Epicurean models of natural change, and resisting delusions of anthropocentric grandeur, Cardano advanced a theory of teleology centred on the notion of non-human selfhood. In keeping with Plato, he argued that nature was ruled by the mind, meaning by "mind" a universal paragon of intelligibility instantiated through patterns of purposive action ("noetic" teleology). This allowed Cardano to defend a theory of natural finalism in which life was regarded as a primordial attribute of being, already in evidence in the most elementary forms of nature, whose main categories were ability to feign, self-interest, self-preservation and indefinite persistence.

  20. Space Telecommunications Radio System (STRS) Architecture Standard. Release 1.02.1

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.; Handler, Louis M.; Hall, C. Steve; Mortensen, Dale J.; Johnson, Sandra K.; Briones, Janette C.; Nappier, Jennifer M.; Downey, Joseph A.; Lux, James P.

    2012-01-01

    This document contains the NASA architecture standard for software defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer.

  1. Assurance Cases for Proofs as Evidence

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Gurfinkel, Arie; Wallnau, Kurt; Weinstock, Charles

    2009-01-01

    Proof-carrying code (PCC) provides a 'gold standard' for establishing formal and objective confidence in program behavior. However, in order to extend the benefits of PCC - and other formal certification techniques - to realistic systems, we must establish the correspondence of a mathematical proof of a program's semantics and its actual behavior. In this paper, we argue that assurance cases are an effective means of establishing such a correspondence. To this end, we present an assurance case pattern for arguing that a proof is free from various proof hazards. We also instantiate this pattern for a proof-based mechanism to provide evidence about generic medical device software.

  2. Bilinearity, Rules, and Prefrontal Cortex

    PubMed Central

    Dayan, Peter

    2007-01-01

    Humans can be instructed verbally to perform computationally complex cognitive tasks; their performance then improves relatively slowly over the course of practice. Many skills underlie these abilities; in this paper, we focus on the particular question of a uniform architecture for the instantiation of habitual performance and the storage, recall, and execution of simple rules. Our account builds on models of gated working memory, and involves a bilinear architecture for representing conditional input-output maps and for matching rules to the state of the input and working memory. We demonstrate the performance of our model on two paradigmatic tasks used to investigate prefrontal and basal ganglia function. PMID:18946523

  3. A comparison of multiprocessor scheduling methods for iterative data flow architectures

    NASA Technical Reports Server (NTRS)

    Storch, Matthew

    1993-01-01

    A comparative study is made between the Algorithm to Architecture Mapping Model (ATAMM) and three other related multiprocessing models from the published literature. The primary focus of all four models is the non-preemptive scheduling of large-grain iterative data flow graphs as required in real-time systems, control applications, signal processing, and pipelined computations. Important characteristics of the models such as injection control, dynamic assignment, multiple node instantiations, static optimum unfolding, range-chart guided scheduling, and mathematical optimization are identified. The models from the literature are compared with the ATAMM for performance, scheduling methods, memory requirements, and complexity of scheduling and design procedures.

  4. Dynamic resource allocation in a hierarchical multiprocessor system: A preliminary study

    NASA Technical Reports Server (NTRS)

    Ngai, Tin-Fook

    1986-01-01

    An integrated system approach to dynamic resource allocation is proposed. Some of the problems in dynamic resource allocation and the relationship of these problems to system structures are examined. A general dynamic resource allocation scheme is presented. A hierarchical system architecture which dynamically maps between processor structure and programs at multiple levels of instantiations is described. Simulation experiments were conducted to study dynamic resource allocation on the proposed system. Preliminary evaluation based on simple dynamic resource allocation algorithms indicates that with the proposed system approach, the complexity of dynamic resource management could be significantly reduced while achieving reasonably effective dynamic resource allocation.

  5. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
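    For readers unfamiliar with Currie's methodology, the textbook decision, detection, and quantitation levels for a linear calibration with homoscedastic Gaussian noise can be sketched as follows. The numeric values are illustrative assumptions, and the sketch uses the standard Currie-style expressions rather than the specific figure of merit proposed in this work.

```python
# Textbook Currie-style limits for a linear calibration y = slope*x + blank,
# with homoscedastic Gaussian noise of known blank standard deviation sigma0.
z = 1.645          # one-sided 95% critical value (alpha = beta = 0.05)
sigma0 = 0.02      # blank noise standard deviation (signal units), assumed
slope = 0.50       # calibration sensitivity (signal per content unit), assumed

L_C = z * sigma0       # decision level: threshold for declaring "detected"
L_D = 2 * z * sigma0   # detection limit in the signal domain (alpha = beta)
L_Q = 10 * sigma0      # quantitation limit (Currie's conventional k_Q = 10)

# Convert from the signal domain to the content (concentration) domain.
x_D = L_D / slope
x_Q = L_Q / slope
print(L_C, L_D, L_Q, x_D, x_Q)  # -> 0.0329 0.0658 0.2 0.1316 0.4
```

    The ambiguity the paper targets lives largely in the choice of the quantitation multiplier (10 above) and its link to relative measurement error, which is where the significant-figure ideas of Coleman, Auses and Gram enter.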

  6. Bridging Theory with Practice: An Exploratory Study of Visualization Use and Design for Climate Model Comparison

    DOE PAGES

    Dasgupta, Aritra; Poco, Jorge; Wei, Yaxing; ...

    2015-03-16

    Evaluation methodologies in visualization have mostly focused on how well the tools and techniques cater to the analytical needs of the user. While this is important in determining the effectiveness of the tools and advancing the state of the art in visualization research, a key area that has mostly been overlooked is how well established visualization theories and principles are instantiated in practice. This is especially relevant when domain experts, and not visualization researchers, design visualizations for analysis of their data or for broader dissemination of scientific knowledge. There is very little research on exploring the synergistic capabilities of cross-domain collaboration between domain experts and visualization researchers. To fill this gap, in this paper we describe the results of an exploratory study of climate data visualizations conducted in tight collaboration with a pool of climate scientists. The study analyzes a large set of static climate data visualizations to identify their shortcomings in terms of visualization design. The outcome of the study is a classification scheme that categorizes the design problems in the form of a descriptive taxonomy. The taxonomy is a first attempt to systematically categorize the types, causes, and consequences of design problems in visualizations created by domain experts. We demonstrate the use of the taxonomy for a number of purposes, such as improving the existing climate data visualizations, reflecting on the impact of the problems to enable domain experts to design better visualizations, and learning about the gaps and opportunities for future visualization research. We demonstrate the applicability of our taxonomy through a number of examples and discuss the lessons learnt and implications of our findings.

  7. Bridging Theory with Practice: An Exploratory Study of Visualization Use and Design for Climate Model Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Wei, Yaxing

    Evaluation methodologies in visualization have mostly focused on how well the tools and techniques cater to the analytical needs of the user. While this is important in determining the effectiveness of the tools and advancing the state of the art in visualization research, a key area that has mostly been overlooked is how well established visualization theories and principles are instantiated in practice. This is especially relevant when domain experts, and not visualization researchers, design visualizations for analysis of their data or for broader dissemination of scientific knowledge. There is very little research on exploring the synergistic capabilities of cross-domain collaboration between domain experts and visualization researchers. To fill this gap, in this paper we describe the results of an exploratory study of climate data visualizations conducted in tight collaboration with a pool of climate scientists. The study analyzes a large set of static climate data visualizations to identify their shortcomings in terms of visualization design. The outcome of the study is a classification scheme that categorizes the design problems in the form of a descriptive taxonomy. The taxonomy is a first attempt to systematically categorize the types, causes, and consequences of design problems in visualizations created by domain experts. We demonstrate the use of the taxonomy for a number of purposes, such as improving the existing climate data visualizations, reflecting on the impact of the problems to enable domain experts to design better visualizations, and learning about the gaps and opportunities for future visualization research. We demonstrate the applicability of our taxonomy through a number of examples and discuss the lessons learnt and implications of our findings.

  8. Addressing the translational dilemma: dynamic knowledge representation of inflammation using agent-based modeling.

    PubMed

    An, Gary; Christley, Scott

    2012-01-01

    Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge by the biomedical research community at large.

  9. The role of the P3 and CNV components in voluntary and automatic temporal orienting: A high spatial-resolution ERP study.

    PubMed

    Mento, Giovanni

    2017-12-01

    A main distinction has been proposed between voluntary and automatic mechanisms underlying temporal orienting (TO) of selective attention. Voluntary TO implies the endogenous directing of attention induced by symbolic cues. Conversely, automatic TO is exogenously instantiated by the physical properties of stimuli. A well-known example of automatic TO is sequential effects (SEs), which refer to the adjustments in participants' behavioral performance as a function of the trial-by-trial sequential distribution of the foreperiod between two stimuli. In this study a group of healthy adults underwent a cued reaction time task purposely designed to assess both voluntary and automatic TO. During the task, both post-cue and post-target event-related potentials (ERPs) were recorded by means of a high spatial resolution EEG system. In the results of the post-cue analysis, the P3a and P3b were identified as two distinct ERP markers showing distinguishable spatiotemporal features and reflecting automatic and voluntary a priori expectancy generation, respectively. The brain source reconstruction further revealed that distinct cortical circuits supported these two temporally dissociable components. Namely, the voluntary P3b was supported by a left sensorimotor network, while the automatic P3a was generated by a more distributed frontoparietal circuit. Additionally, post-cue contingent negative variation (CNV) and post-target P3 modulations were observed as common markers of voluntary and automatic expectancy implementation and response selection, although partially dissociable neural networks subserved these two mechanisms. Overall, these results provide new electrophysiological evidence suggesting that distinct neural substrates can be recruited depending on the voluntary or automatic nature of the cognitive mechanisms subserving TO. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. A Novel Weighted Kernel PCA-Based Method for Optimization and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Thimmisetty, C.; Talbot, C.; Chen, X.; Tong, C. H.

    2016-12-01

    It has been demonstrated that machine learning methods can be successfully applied to uncertainty quantification for geophysical systems through the use of the adjoint method coupled with kernel PCA-based optimization. In addition, it has been shown through weighted linear PCA how optimization with respect to both observation weights and feature space control variables can accelerate convergence of such methods. Linear machine learning methods, however, are inherently limited in their ability to represent features of non-Gaussian stochastic random fields, as they are based on only the first two statistical moments of the original data. Nonlinear spatial relationships and multipoint statistics leading to the tortuosity characteristic of channelized media, for example, are captured only to a limited extent by linear PCA. With the aim of coupling the kernel-based and weighted methods discussed, we present a novel mathematical formulation of kernel PCA, Weighted Kernel Principal Component Analysis (WKPCA), that both captures nonlinear relationships and incorporates the attribution of significance levels to different realizations of the stochastic random field of interest. We also demonstrate how new instantiations retaining defining characteristics of the random field can be generated using Bayesian methods. In particular, we present a novel WKPCA-based optimization method that minimizes a given objective function with respect to both feature space random variables and observation weights through which optimal snapshot significance levels and optimal features are learned. We showcase how WKPCA can be applied to nonlinear optimal control problems involving channelized media, and in particular demonstrate an application of the method to learning the spatial distribution of material parameter values in the context of linear elasticity, and discuss further extensions of the method to stochastic inversion.
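    As background to the weighted methods mentioned above, a minimal weighted linear PCA (not the authors' WKPCA) shows how per-realization significance weights enter the covariance before its leading eigenpair is extracted; the toy data and weights below are invented.

```python
import math

# Toy 2-D realizations with per-sample significance weights (illustrative).
X = [(2.0, 1.9), (0.5, 0.6), (-1.0, -1.1), (-1.5, -1.4)]
w = [0.1, 0.4, 0.4, 0.1]   # weights sum to 1

# Weighted mean and weighted 2x2 covariance.
mx = sum(wi * x for wi, (x, _) in zip(w, X))
my = sum(wi * y for wi, (_, y) in zip(w, X))
cxx = sum(wi * (x - mx) ** 2 for wi, (x, _) in zip(w, X))
cyy = sum(wi * (y - my) ** 2 for wi, (_, y) in zip(w, X))
cxy = sum(wi * (x - mx) * (y - my) for wi, (x, y) in zip(w, X))

# Leading eigenpair of [[cxx, cxy], [cxy, cyy]] in closed form.
tr, det = cxx + cyy, cxx * cyy - cxy ** 2
lam = tr / 2 + math.sqrt(tr ** 2 / 4 - det)    # largest eigenvalue
vx, vy = cxy, lam - cxx                        # unnormalized eigenvector
norm = math.hypot(vx, vy)
pc1 = (vx / norm, vy / norm)                   # first principal direction
print(round(lam, 4), [round(c, 3) for c in pc1])
```

    WKPCA replaces the inner product above with a kernel evaluation, so the same weighting idea acts in feature space rather than on the raw coordinates.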

  11. Digital health and the biopolitics of the Quantified Self

    PubMed Central

    Ajana, Btihaj

    2017-01-01

    Recent years have witnessed an intensive growth of systems of measurement and an increasing integration of data processes into various spheres of everyday life. From smartphone apps that measure our activity and sleep, to digital devices that monitor our health and performance at the workplace, the culture of measurement is currently on the rise. Encouraged by movements such as the Quantified Self, whose motto is ‘self knowledge through numbers’, a growing number of people across the globe are embracing practices of self-quantification and tracking in the spirit of improving their wellbeing and productivity or charting their fitness progress. In this article, I examine the biopolitical aspects of the Quantified Self practices, exploring some of the ideologies and rationalities underlying self-tracking culture. I argue that such practices represent an instantiation of a ‘biopolitics of the self’ whereby the body is made amenable to management and monitoring techniques that often echo the ethos of neoliberalism. Rather than being restricted to an individualized form, self-tracking practices are also becoming part of a biosocial and communal phenomenon in which individuals are incited to share with others information about their physical activities and biodata. In exploring some examples of this data sharing culture, I critically address the extent to which the sharing of personal physical data can be seen as a ‘solidaristic’ act that can contribute to a larger Big Data ecosystem and inform the wider medical community and healthcare research and policy. I link this discussion to debates on ‘data philanthropy’, highlighting the emerging tension between philanthropic discourses of data sharing and issues of privacy. From here, I go on to discuss further ethical and political concerns, particularly in relation to data security and the marked shifts in healthcare responsibilities.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melton, Ron

    The Pacific Northwest Smart Grid Demonstration (PNWSGD), a $179 million project that was co-funded by the U.S. Department of Energy (DOE) in late 2009, was one of the largest and most comprehensive demonstrations of electricity grid modernization ever completed. The project was one of 16 regional smart grid demonstrations funded by the American Recovery and Reinvestment Act. It was the only demonstration that included multiple states and cooperation from multiple electric utilities, including rural electric co-ops, investor-owned, municipal, and other public utilities. No fewer than 55 unique instantiations of distinct smart grid systems were demonstrated at the project’s sites. The local objectives for these systems included improved reliability, energy conservation, improved efficiency, and demand responsiveness. The demonstration developed and deployed an innovative transactive system, unique in the world, that coordinated many of the project’s distributed energy resources and demand-responsive components. With the transactive system, additional regional objectives were also addressed, including the mitigation of renewable energy intermittency and the flattening of system load. Using the transactive system, the project coordinated a regional response across the 11 utilities. This region-wide connection from the transmission system down to individual premises equipment was one of the major successes of the project. The project showed that this can be done and assets at the end points can respond dynamically on a wide scale. In principle, a transactive system of this type might eventually help coordinate electricity supply, transmission, distribution, and end uses by distributing mostly automated control responsibilities among the many distributed smart grid domain members and their smart devices.

  13. Cue competition effects in human causal learning.

    PubMed

    Vogel, Edgar H; Glynn, Jacqueline Y; Wagner, Allan R

    2015-01-01

    Five experiments involving human causal learning were conducted to compare the cue competition effects known as blocking and unovershadowing, in proactive and retroactive instantiations. Experiment 1 demonstrated reliable proactive blocking and unovershadowing, but only retroactive unovershadowing. Experiment 2 replicated the same pattern and showed that the retroactive unovershadowing observed was interfered with by a secondary memory task that had no demonstrable effect on either proactive unovershadowing or blocking. Experiments 3a, 3b, and 3c demonstrated that retroactive unovershadowing was accompanied by an inflated memory effect that did not accompany proactive unovershadowing. The differential pattern of proactive versus retroactive cue competition effects is discussed in relation to amenable associative and inferential processing possibilities.
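    Proactive blocking of the kind compared here is classically illustrated by the Rescorla-Wagner associative model. The sketch below simulates a standard blocking design (A+ training followed by AX+ compound trials) against a compound-only control; it is a generic textbook illustration, not the experimental procedure of the paper, and the learning-rate values are arbitrary.

```python
def rescorla_wagner(trials, alpha=0.3, beta=1.0, lam=1.0):
    """Update associative strengths V for the cues present on each trial:
    delta-V = alpha * beta * (lambda - summed prediction of present cues)."""
    V = {}
    for cues in trials:
        total = sum(V.get(c, 0.0) for c in cues)
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * beta * (lam - total)
    return V

# Blocking group: cue A alone predicts the outcome, then compound AX does.
blocking = rescorla_wagner([("A",)] * 20 + [("A", "X")] * 20)
# Control group: X is trained in compound without prior A training.
control = rescorla_wagner([("A", "X")] * 20)

# A's prior training "blocks" learning about the redundant cue X.
print(round(blocking["X"], 3), round(control["X"], 3))  # -> 0.0 0.5
```

    Inferential accounts predict the same proactive pattern via reasoning rather than error-driven updating, which is exactly the contrast the retroactive conditions are designed to probe.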

  14. Accelerating Innovation in the Creation of Biovalue: The Cell and Gene Therapy Catapult.

    PubMed

    Gardner, John; Webster, Andrew

    2017-09-01

    The field of regenerative medicine (RM) has considerable therapeutic promise that is proving difficult to realize. As a result, governments have supported the establishment of intermediary agencies to "accelerate" innovation. This article examines in detail one such agency, the United Kingdom's Cell and Gene Therapy Catapult (CGTC). We describe CGTC's role as an accelerator agency and its value narrative, which combines both "health and wealth." Drawing on the notion of sociotechnical imaginaries, we unpack the tensions within this narrative and its instantiation as the CGTC cell therapy infrastructure is built and engages with other agencies, some of which have different priorities and roles to play within the RM field.

  15. Mediator assessment, documentation, and disposition of child custody cases involving intimate partner abuse: a naturalistic evaluation of one county's practices.

    PubMed

    Beck, Connie J A; Walsh, Michele E; Mechanic, Mindy B; Taylor, Caitilin S

    2010-06-01

    The contentious and costly nature of the adversarial process for resolving child custody disputes has prompted scholars, practitioners, and policy makers to advocate for the development and implementation of less divisive forms of dispute resolution, notably, mediation. Mediation has been championed for its potential to resolve disputes with less acrimony among disputants, reduced economic costs, increased satisfaction with outcomes, and fewer adverse consequences for family members. Despite the increasing popularity, arguments have cautioned against the use of mandated mediation when intimate partner abuse (IPA) is alleged. This research documents a mediation screening process and models mediators' decision-making process as instantiated, naturally, in one jurisdiction.

  16. A systems engineering approach to automated failure cause diagnosis in space power systems

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Faymon, Karl A.

    1987-01-01

    Automatic failure-cause diagnosis is a key element in autonomous operation of space power systems such as the Space Station's. A rule-based diagnostic system has been developed for determining the cause of degraded performance. The knowledge required for such diagnosis is elicited from the system engineering process by using traditional failure analysis techniques. Symptoms, failures, causes, and detector information are represented with structured data; and diagnostic procedural knowledge is represented with rules. Detected symptoms instantiate failure modes and possible causes consistent with currently held beliefs about the likelihood of the cause. A diagnosis concludes with an explanation of the observed symptoms in terms of a chain of possible causes and subcauses.

  17. Accelerating Innovation in the Creation of Biovalue

    PubMed Central

    Webster, Andrew

    2017-01-01

    The field of regenerative medicine (RM) has considerable therapeutic promise that is proving difficult to realize. As a result, governments have supported the establishment of intermediary agencies to “accelerate” innovation. This article examines in detail one such agency, the United Kingdom’s Cell and Gene Therapy Catapult (CGTC). We describe CGTC’s role as an accelerator agency and its value narrative, which combines both “health and wealth.” Drawing on the notion of sociotechnical imaginaries, we unpack the tensions within this narrative and its instantiation as the CGTC cell therapy infrastructure is built and engages with other agencies, some of which have different priorities and roles to play within the RM field. PMID:28845068

  18. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling

    PubMed Central

    Kane, Patrick; Zollman, Kevin J. S.

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the “hybrid equilibrium,” to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith’s Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory. PMID:26348617

  19. Design Mining Interacting Wind Turbines.

    PubMed

    Preen, Richard J; Bull, Larry

    2016-01-01

    An initial study of surrogate-assisted evolutionary algorithms used to design vertical-axis wind turbines has recently been presented, wherein candidate prototypes are evaluated under fan-generated wind conditions after being physically instantiated by a 3D printer. Unlike other approaches, such as computational fluid dynamics simulations, no mathematical formulations were used and no model assumptions were made. This paper extends that work by exploring alternative surrogate modelling and evolutionary techniques. The accuracy of various modelling algorithms used to estimate the fitness of evaluated individuals from the initial experiments is compared. The effect of temporally windowing surrogate model training samples is explored. A surrogate-assisted approach based on an enhanced local search is introduced, and alternative coevolution collaboration schemes are examined.

  20. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
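    The template-plus-rules materialization idea can be caricatured in a few lines. The sketch below applies predicate rules to case data to decide which optional steps of a generic template survive materialization; it is an illustrative stand-in, with invented step and rule names, for the paper's Prolog-based algorithm.

```python
# Generic process template: ordered steps, some guarded by a named rule.
template = [
    {"step": "receive_order"},
    {"step": "credit_check", "rule": "large_order"},
    {"step": "manual_approval", "rule": "new_customer"},
    {"step": "ship_order"},
]

# Business rules as predicates over the case data (names are invented).
rules = {
    "large_order": lambda case: case["amount"] > 1000,
    "new_customer": lambda case: case["customer_age_days"] < 30,
}

def materialize(template, rules, case):
    """Keep unconditional steps; keep guarded steps only if their rule fires."""
    return [s["step"] for s in template
            if "rule" not in s or rules[s["rule"]](case)]

case = {"amount": 2500, "customer_age_days": 400}
print(materialize(template, rules, case))
# -> ['receive_order', 'credit_check', 'ship_order']
```

    The resulting step list is what would be handed to a workflow engine; keeping the rules separate from the template is what lets a policy change re-materialize the process without redesigning it.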

  1. Promoting motivation with virtual agents and avatars: role of visual presence and appearance.

    PubMed

    Baylor, Amy L

    2009-12-12

    Anthropomorphic virtual agents can serve as powerful technological mediators to impact motivational outcomes such as self-efficacy and attitude change. Such anthropomorphic agents can be designed as simulated social models in the Bandurian sense, providing social influence as virtual 'role models'. Of particular value is the capacity for designing such agents as optimized social models for a target audience and context. Importantly, the visual presence and appearance of such agents can have a major impact on motivation and affect regardless of the underlying technical sophistication. Empirical results of different instantiations of agent presence and appearance are reviewed for both autonomous virtual agents and avatars that represent a user.

  2. Improving Grasp Skills Using Schema Structured Learning

    NASA Technical Reports Server (NTRS)

    Platt, Robert; Grupen, Roderic A.; Fagg, Andrew H.

    2006-01-01

    In the control-based approach to robotics, complex behavior is created by sequencing and combining control primitives. While it is desirable for the robot to autonomously learn the correct control sequence, searching through the large number of potential solutions can be time consuming. This paper constrains this search to variations of a generalized solution encoded in a framework known as an action schema. A new algorithm, SCHEMA STRUCTURED LEARNING, is proposed that repeatedly executes variations of the generalized solution in search of instantiations that satisfy action schema objectives. This approach is tested in a grasping task where Dexter, the UMass humanoid robot, learns which reaching and grasping controllers maximize the probability of grasp success.

  3. Movement as utopia.

    PubMed

    Couton, Philippe; López, José Julián

    2009-10-01

    Opposition to utopianism on ontological and political grounds has seemingly relegated it to a potentially dangerous form of antiquated idealism. This conclusion is based on a restrictive view of utopia as excessively ordered panoptic discursive constructions. This overlooks the fact that, from its inception, movement has been central to the utopian tradition. The power of utopianism indeed resides in its ability to instantiate the tension between movement and place that has marked social transformations in the modern era. This tension continues in contemporary discussions of movement-based social processes, particularly international migration and related identity formations, such as open borders transnationalism and cosmopolitanism. Understood as such, utopia remains an ongoing and powerful, albeit problematic, instrument of social and political imagination.

  4. The coronal fricative problem

    PubMed Central

    Dinnsen, Daniel A.; Dow, Michael C.; Gierut, Judith A.; Morrisette, Michele L.; Green, Christopher R.

    2013-01-01

    This paper examines a range of predicted versus attested error patterns involving coronal fricatives (e.g. [s, z, θ, ð]) as targets and repairs in the early sound systems of monolingual English-acquiring children. Typological results are reported from a cross-sectional study of 234 children with phonological delays (ages 3;0 to 7;9 [years;months]). Our analyses revealed different instantiations of a putative developmental conspiracy within and across children. Supplemental longitudinal evidence is also presented that replicates the cross-sectional results, offering further insight into the life-cycle of the conspiracy. Several of the observed typological anomalies are argued to follow from a modified version of Optimality Theory with Candidate Chains (McCarthy, 2007). PMID:24790247

  5. Operator Informational Needs for Multiple Autonomous Small Vehicles

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Fan, Henry; Cross, Charles D.; Hempley, Lucas E.; Cichella, Venanzio; Puig-Navarro, Javier; Mehdi, Syed Bilal

    2015-01-01

    With the anticipated explosion of small unmanned aerial vehicles, it is highly likely that operators will be controlling fleets of autonomous vehicles. To fulfill the promise of autonomy, vehicle operators will not be concerned with manual control of the vehicle; instead, they will deal with the overall mission. Furthermore, the one-operator-to-many-vehicles arrangement is becoming a constant theme across various industries, including package delivery, search and rescue, and utility companies. In order for an operator to concurrently control several vehicles, the operator's station must look and behave very differently from current ground control station instantiations. Furthermore, the vehicle will have to be much more autonomous, especially during non-normal operations, in order to accommodate the knowledge deficit or the information overload of the operator in charge of several vehicles. The expected usage increase of small drones requires presenting the operational information generated by a fleet of heterogeneous autonomous agents to an operator. NASA Langley Research Center's Autonomy Incubator has brought together researchers in various disciplines, including controls, trajectory planning, systems engineering, and human factors, to develop an integrated system to study autonomy issues. The initial human factors effort is focusing on mission displays that would give an operator the overall status of all autonomous agents involved in the current mission. This paper will discuss the specifics of the mission displays for operators controlling several vehicles.

  6. Childhood Maltreatment and Its Effect on Neurocognitive Functioning: Timing and Chronicity Matter

    PubMed Central

    Cowell, Raquel A.; Cicchetti, Dante; Rogosch, Fred A.; Toth, Sheree L.

    2015-01-01

    Childhood maltreatment represents a complex stressor, with the developmental timing, duration, frequency, and type of maltreatment varying with each child (Barnett, Manly, & Cicchetti, 1993; Cicchetti & Manly, 2001). Multiple brain regions and neural circuits are disrupted by the experience of child maltreatment (Cicchetti & Toth, in press; DeBellis et al., 2002; McCrory & Viding, 2010; Teicher, Anderson, & Polcari, 2012). These neurobiological compromises indicate the impairment of a number of important cognitive functions, including working memory and inhibitory control. The present study extends prior research by examining the effect of childhood maltreatment on neurocognitive functioning based on developmental timing of maltreatment, including onset, chronicity, and recency, in a sample of 3- to 9-year-old nonmaltreated (n = 136) and maltreated children (n = 223). Maltreated children performed more poorly on inhibitory control and working memory tasks than nonmaltreated children. Group differences between maltreated children based on the timing of maltreatment and the chronicity of maltreatment also were evident. Specifically, children who were maltreated during infancy, and children with a chronic history of maltreatment, exhibited significantly poorer inhibitory control and working memory performance than children without a history of maltreatment. The results suggest that maltreatment occurring during infancy, a period of major brain organization, disrupts normative structure and function, and these deficits are further instantiated by the prolonged stress of chronic maltreatment during the early years of life. PMID:25997769

  7. Electrophysiological evidence for the morpheme-based combinatoric processing of English compounds

    PubMed Central

    Fiorentino, Robert; Naito-Billen, Yuka; Bost, Jamie; Fund-Reznicek, Ella

    2014-01-01

    The extent to which the processing of compounds (e.g., “catfish”) makes recourse to morphological-level representations remains a matter of debate. Moreover, positing a morpheme-level route to complex word recognition entails not only access to morphological constituents, but also combinatoric processes operating on the constituent representations; however, the neurophysiological mechanisms subserving decomposition, and in particular morpheme combination, have yet to be fully elucidated. The current study presents electrophysiological evidence for the morpheme-based processing of both lexicalized (e.g., “teacup”) and novel (e.g., “tombnote”) visually-presented English compounds; these brain responses appear prior to and are dissociable from the eventual overt lexical decision response. The electrophysiological results reveal increased negativities for conditions with compound structure, including effects shared by lexicalized and novel compounds, as well as effects unique to each compound type, which may be related to aspects of morpheme combination. These findings support models positing across-the-board morphological decomposition, counter to models proposing that putatively complex words are primarily or solely processed as undecomposed representations, and motivate further electrophysiological research toward a more precise characterization of the nature and neurophysiological instantiation of complex word recognition. PMID:24279696

  8. SWARMs Ontology: A Common Information Model for the Cooperation of Underwater Robots

    PubMed Central

    Li, Xin; Bilbao, Sonia; Martín-Wanton, Tamara; Bastos, Joaquim; Rodriguez, Jonathan

    2017-01-01

    In order to facilitate cooperation between underwater robots, robots must exchange information with unambiguous meaning. However, the heterogeneity of information pertaining to different robots is a major obstacle. Therefore, this paper presents a networked ontology, named the Smart and Networking Underwater Robots in Cooperation Meshes (SWARMs) ontology, to address information heterogeneity and enable robots to have the same understanding of exchanged information. The SWARMs ontology uses a core ontology to interrelate a set of domain-specific ontologies, including the mission and planning, the robotic vehicle, the communication and networking, and the environment recognition and sensing ontology. In addition, the SWARMs ontology utilizes ontology constructs defined in the PR-OWL ontology to annotate context uncertainty based on the Multi-Entity Bayesian Network (MEBN) theory. Thus, the SWARMs ontology can provide both a formal specification for information that is necessarily exchanged between robots and a command and control entity, and also support for uncertainty reasoning. A scenario on chemical pollution monitoring is described and used to showcase how the SWARMs ontology can be instantiated, be extended, represent context uncertainty, and support uncertainty reasoning. PMID:28287468

  9. The impact of threat of shock on the framing effect and temporal discounting: executive functions unperturbed by acute stress?

    PubMed

    Robinson, Oliver J; Bond, Rebecca L; Roiser, Jonathan P

    2015-01-01

    Anxiety and stress-related disorders constitute a large global health burden, but are still poorly understood. Prior work has demonstrated clear impacts of stress upon basic cognitive function: biasing attention toward unexpected and potentially threatening information and instantiating a negative affective bias. However, the impact that these changes have on higher-order, executive, decision-making processes is unclear. In this study, we examined the impact of a translational within-subjects stress induction (threat of unpredictable shock) on two well-established executive decision-making biases: the framing effect (N = 83), and temporal discounting (N = 36). In both studies, we demonstrate (a) clear subjective effects of stress, and (b) clear executive decision-making biases, but (c) no impact of stress on these decision-making biases. Indeed, Bayes factor analyses confirmed substantial preference for decision-making models that did not include stress. We posit that while stress may induce subjective mood change and alter low-level perceptual and action processes (Robinson et al., 2013c), some higher-level executive processes remain unperturbed by these impacts. As such, although stress can induce transient affective biases and altered mood, these need not result in poor financial decision-making.

  10. Constructing Agent Model for Virtual Training Systems

    NASA Astrophysics Data System (ADS)

    Murakami, Yohei; Sugimoto, Yuki; Ishida, Toru

    Constructing highly realistic agents is essential if agents are to be employed in virtual training systems. In training for collaboration based on face-to-face interaction, the generation of emotional expressions is one key element. In training for guidance based on one-to-many interaction, such as direction giving for evacuations, emotional expressions must be supplemented by diverse agent behaviors to make the training realistic. To reproduce diverse behavior, we characterize agents by using various combinations of operation rules instantiated by the user operating the agent. To accomplish this goal, we introduce a user modeling method based on participatory simulations. These simulations enable us to acquire information observed by each user in the simulation and the operating history. Using these data and domain knowledge including known operation rules, we can generate an explanation for each behavior. Moreover, the application of hypothetical reasoning, which offers consistent selection of hypotheses, to the generation of explanations allows us to use otherwise incompatible operation rules as domain knowledge. In order to validate the proposed modeling method, we apply it to the acquisition of an evacuee's model in a fire-drill experiment. We successfully acquire a subject's model corresponding to the results of an interview with the subject.

  11. Tissue-aware RNA-Seq processing and normalization for heterogeneous and sparse data.

    PubMed

    Paulson, Joseph N; Chen, Cho-Yi; Lopes-Ramos, Camila M; Kuijjer, Marieke L; Platig, John; Sonawane, Abhijeet R; Fagny, Maud; Glass, Kimberly; Quackenbush, John

    2017-10-03

    Although ultrahigh-throughput RNA-Sequencing has become the dominant technology for genome-wide transcriptional profiling, the vast majority of RNA-Seq studies typically profile only tens of samples, and most analytical pipelines are optimized for these smaller studies. However, projects are generating ever-larger data sets comprising RNA-Seq data from hundreds or thousands of samples, often collected at multiple centers and from diverse tissues. These complex data sets present significant analytical challenges due to batch and tissue effects, but provide the opportunity to revisit the assumptions and methods that we use to preprocess, normalize, and filter RNA-Seq data - critical first steps for any subsequent analysis. We find that analysis of large RNA-Seq data sets requires both careful quality control and an accounting for the sparsity due to the heterogeneity intrinsic to multi-group studies. We developed the Yet Another RNA Normalization software pipeline (YARN), which includes quality control and preprocessing, gene filtering, and normalization steps designed to facilitate downstream analysis of large, heterogeneous RNA-Seq data sets, and we demonstrate its use with data from the Genotype-Tissue Expression (GTEx) project. An R package instantiating YARN is available at http://bioconductor.org/packages/yarn.
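YARN itself is an R/Bioconductor package, but the core idea of tissue-aware normalization — normalizing samples within each tissue group rather than across all samples, so real tissue-specific expression differences are not averaged away — can be sketched briefly. The sketch below uses plain quantile normalization per group; it is an illustrative simplification, not YARN's actual (qsmooth-based) procedure, and the function names are ours.

```python
import numpy as np

def quantile_normalize(counts):
    """Quantile-normalize the columns (samples) of a genes x samples
    matrix: every sample receives the same empirical distribution (the
    mean of the per-rank values) while each gene keeps its within-sample
    rank. Ties are broken by row order (crude, but fine for a sketch)."""
    ranks = np.argsort(np.argsort(counts, axis=0), axis=0)
    mean_by_rank = np.sort(counts, axis=0).mean(axis=1)
    return mean_by_rank[ranks]

def tissue_aware_normalize(counts, tissues):
    """Normalize each tissue group separately so that tissue-specific
    expression differences survive normalization."""
    out = np.empty_like(counts, dtype=float)
    for t in set(tissues):
        cols = [i for i, x in enumerate(tissues) if x == t]
        out[:, cols] = quantile_normalize(counts[:, cols].astype(float))
    return out
```

After normalization, samples within a tissue share one distribution, while samples from different tissues remain on their own scales.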

  12. Context-Based Urban Terrain Reconstruction from Uav-Videos for Geoinformation Applications

    NASA Astrophysics Data System (ADS)

    Bulatov, D.; Solbrig, P.; Gross, H.; Wernerus, P.; Repasi, E.; Heipke, C.

    2011-09-01

    Urban terrain reconstruction has many applications in areas of civil engineering, urban planning, surveillance and defense research. Therefore the needs of covering ad-hoc demand and performing a close-range urban terrain reconstruction with miniaturized and relatively inexpensive sensor platforms are constantly growing. Using (miniaturized) unmanned aerial vehicles, (M)UAVs, represents one of the most attractive alternatives to conventional large-scale aerial imagery. We cover in this paper a four-step procedure for obtaining georeferenced 3D urban models from video sequences. The four steps of the procedure - orientation, dense reconstruction, urban terrain modeling and geo-referencing - are robust, straightforward, and nearly fully automatic. The last two steps - namely, urban terrain modeling from almost-nadir videos and co-registration of models - represent the main contribution of this work and will therefore be covered in more detail. The essential substeps of the third step include digital terrain model (DTM) extraction, segregation of buildings from vegetation, as well as instantiation of building and tree models. The last step is subdivided into quasi-intrasensorial registration of Euclidean reconstructions and intersensorial registration with a geo-referenced orthophoto. Finally, we present reconstruction results from a real data-set and outline ideas for future work.

  13. The design of scenario-based training from the resilience engineering perspective: a study with grid electricians.

    PubMed

    Saurin, Tarcisio Abreu; Wachs, Priscila; Righi, Angela Weber; Henriqson, Eder

    2014-07-01

    Although scenario-based training (SBT) can be an effective means to help workers develop resilience skills, it has not yet been analyzed from the resilience engineering (RE) perspective. This study introduces a five-stage method for designing SBT from the RE view: (a) identification of resilience skills, work constraints and actions for re-designing the socio-technical system; (b) design of template scenarios, allowing the simulation of the work constraints and the use of resilience skills; (c) design of the simulation protocol, which includes briefing, simulation and debriefing; (d) implementation of both scenarios and simulation protocol; and (e) evaluation of the scenarios and simulation protocol. It is reported how the method was applied in an electricity distribution company, in order to train grid electricians. The study was framed as an application of design science research, and five research outputs are discussed: method, constructs, model of the relationships among constructs, instantiations of the method, and theory building. Concerning the last output, the operationalization of the RE perspective on three elements of SBT is presented: identification of training objectives; scenario design; and debriefing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. "My goose child Martina": the multiple uses of geese in the writings of Konrad Lorenz.

    PubMed

    Munz, Tania

    2011-01-01

    In 1935, the graylag goose Martina (1935-?) hatched from an egg in the home of the zoologist Konrad Lorenz (1903-1989). Martina imprinted on Lorenz, slept in his bedroom, mated with the gander Martin, and flew off in 1937. Over the following decades, Konrad Lorenz helped to establish the discipline of ethology, received a share of the 1973 Nobel Prize in Physiology or Medicine, and continued to write about his famous goose Martina. This essay examines the different instantiations of the geese in general, and Martina in particular, in Lorenz's writings aimed at readerships that included prewar zoologists, National Socialist psychologists, and popular audiences from the 1930s to 1980s. By developing an animal with her own biography, Lorenz created an individual whose lived and rhetorical agency made her especially well suited to perform widely divergent aspects of his evolving science. While a significant literature in the history of science has explored the standardization and stabilization of animals in science, I show how Lorenz's creation of a highly protean and increasingly public Martina was co-constitutive of the establishment of the science and public persona.

  15. Values: the dynamic nexus between biology, ecology and culture.

    PubMed

    Fischer, Ronald; Boer, Diana

    2016-04-01

    Values are motivational goals that influence attitudes, behaviors and evaluations. Cross-cultural evidence suggests that values show a systematic structure. Personal and cultural variations in the value structure, value priorities and value links to attitudes, behavior and well-being reflect contextual constraints and affordances in the environment, suggesting that values function as broadly adaptive psychological structures. The internal structure of values (the descriptive value system) becomes more clearly differentiated in more economically developed contexts. Value priorities shift toward more autonomous, self-expressive and individualistic orientations with greater economic resources and less ecological stress. In addition to systematic changes in internal structure, value links to attitudes, behaviors and well-being are influenced by economic, ecological and institutional contexts. Values are more likely to be expressed in attitudes and behavior if individuals have greater access to economic resources, experience less institutional and ecological stress or when the values reinforce culturally normative behavior. Frontiers for further value research include a greater examination of the neural underpinnings of values in specific ecological contexts and across the lifespan; and an examination of how values are behaviorally instantiated in different environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Learning in stochastic neural networks for constraint satisfaction problems

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.; Adorf, Hans-Martin

    1989-01-01

    Researchers describe a newly-developed artificial neural network algorithm for solving constraint satisfaction problems (CSPs) which includes a learning component that can significantly improve the performance of the network from run to run. The network, referred to as the Guarded Discrete Stochastic (GDS) network, is based on the discrete Hopfield network but differs from it primarily in that auxiliary networks (guards) are asymmetrically coupled to the main network to enforce certain types of constraints. Although the presence of asymmetric connections implies that the network may not converge, it was found that, for certain classes of problems, the network often quickly converges to find satisfactory solutions when they exist. The network can run efficiently on serial machines and can find solutions to very large problems (e.g., N-queens for N as large as 1024). One advantage of the network architecture is that network connection strengths need not be instantiated when the network is established: they are needed only when a participating neural element transitions from off to on. They have exploited this feature to devise a learning algorithm, based on consistency techniques for discrete CSPs, that updates the network biases and connection strengths and thus improves the network performance.
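The abstract's benchmark — N-queens at very large N — is a classic constraint satisfaction problem for stochastic local search. The sketch below is not the GDS network; it is the simpler, well-known min-conflicts repair heuristic, included only to illustrate the kind of CSP being solved and why local stochastic methods scale to large N. All names and the greedy-initialization choice are ours.

```python
import random

def min_conflicts_queens(n, max_steps=100000, seed=0):
    """Solve N-queens with one queen per column by stochastic repair:
    greedily place queens, then repeatedly move a conflicted queen to
    the row minimizing its conflicts (random tie-breaking)."""
    rng = random.Random(seed)

    def attacks(r1, c1, r2, c2):
        # Same row, or same diagonal (columns are distinct by construction).
        return r1 == r2 or abs(r1 - r2) == abs(c1 - c2)

    # Greedy initialization: each queen takes a least-conflicted row.
    rows = []
    for col in range(n):
        best = min(range(n), key=lambda r: (
            sum(attacks(r, col, rows[c], c) for c in range(col)), rng.random()))
        rows.append(best)

    def conflicts(col):
        return sum(attacks(rows[col], col, rows[c], c) for c in range(n) if c != col)

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c) > 0]
        if not conflicted:
            return rows          # all constraints satisfied
        col = rng.choice(conflicted)
        rows[col] = min(range(n), key=lambda r: (
            sum(attacks(r, col, rows[c], c) for c in range(n) if c != col),
            rng.random()))
    return None                  # no solution found within the step budget
```

With greedy initialization, the repair loop typically needs only a handful of moves even for large boards, which is the same qualitative behavior the GDS network paper reports for N up to 1024.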

  17. Logistic Model to Support Service Modularity for the Promotion of Reusability in a Web Objects-Enabled IoT Environment.

    PubMed

    Kibria, Muhammad Golam; Ali, Sajjad; Jarwar, Muhammad Aslam; Kumar, Sunil; Chong, Ilyoung

    2017-09-22

    Due to the very large number of connected virtual objects in the surrounding environment, intelligent service features in the Internet of Things require the reuse of existing virtual objects and composite virtual objects. If a new virtual object were created for each new service request, the number of virtual objects would increase exponentially. The Web of Objects applies the principle of service modularity in terms of virtual objects and composite virtual objects. Service modularity is a key concept in the Web Objects-Enabled Internet of Things (IoT) environment which allows for the reuse of existing virtual objects and composite virtual objects in heterogeneous ontologies. In the case of similar service requests occurring at the same or different locations, the already-instantiated virtual objects and their composites that exist in the same or different ontologies can be reused. In this case, similar types of virtual objects and composite virtual objects are searched and matched. Their reuse avoids duplication under similar circumstances, and reduces the time it takes to search and instantiate them from their repositories, where similar functionalities are provided by similar types of virtual objects and their composites. Controlling and maintaining a virtual object means controlling and maintaining a real-world object in the real world. Even though the functional costs of virtual objects are just a fraction of those for deploying and maintaining real-world objects, this article focuses on reusing virtual objects and composite virtual objects, and discusses similarity matching of virtual objects and composite virtual objects. This article proposes a logistic model that supports service modularity for the promotion of reusability in the Web Objects-enabled IoT environment. Necessary functional components and a flowchart of an algorithm for reusing composite virtual objects are discussed. Also, to realize the service modularity, a use case scenario is studied and implemented.
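The reuse principle at the heart of the abstract — match a new service request against already-instantiated virtual objects instead of creating a new one per request — reduces to a keyed registry. The sketch below is our own minimal illustration, not the paper's logistic model; the matching key (service type plus location) is a deliberately simplified stand-in for the paper's similarity matching.

```python
class VirtualObjectRegistry:
    """Toy sketch of virtual-object reuse: objects are cached by
    (service type, location), so similar service requests share one
    instantiation instead of spawning a new object per request."""

    def __init__(self):
        self._cache = {}
        self.instantiations = 0   # how many objects were actually created

    def acquire(self, service_type, location):
        key = (service_type, location)
        if key not in self._cache:
            # No matching virtual object exists yet: instantiate one.
            self.instantiations += 1
            self._cache[key] = {"type": service_type,
                                "location": location,
                                "users": 0}
        vo = self._cache[key]
        vo["users"] += 1          # one more service request sharing it
        return vo
```

Two requests for a temperature service in the same zone return the same object, so the instantiation count grows with distinct (type, location) pairs rather than with the request count.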

  18. What Makes Hydrologic Models Differ? Using SUMMA to Systematically Explore Model Uncertainty and Error

    NASA Astrophysics Data System (ADS)

    Bennett, A.; Nijssen, B.; Chegwidden, O.; Wood, A.; Clark, M. P.

    2017-12-01

    Model intercomparison experiments have been conducted to quantify the variability introduced during the model development process, but have had limited success in identifying the sources of this model variability. The Structure for Unifying Multiple Modeling Alternatives (SUMMA) has been developed as a framework which defines a general set of conservation equations for mass and energy, as well as a common core of numerical solvers, along with the ability to choose between different spatial discretizations and flux parameterizations. SUMMA can be thought of as a framework for implementing meta-models which allows for the investigation of the impacts of decisions made during the model development process. Through this flexibility we develop a hierarchy of definitions which allows models to be compared to one another. This vocabulary allows us to define the notion of weak equivalence between model instantiations. Through this weak equivalence we develop the concept of model mimicry, which can be used to investigate the introduction of uncertainty and error during the modeling process, as well as provide a framework for identifying modeling decisions which may complement or negate one another. We instantiate SUMMA models that mimic the behaviors of the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS) by choosing modeling decisions which are implemented in each model. We compare runs from these models and their corresponding mimics across the Columbia River Basin, located in the Pacific Northwest of the United States and Canada. From these comparisons, we are able to determine the extent to which model implementation has an effect on the results, as well as determine the changes in sensitivity of parameters due to these implementation differences. By examining these changes in results and sensitivities we can attempt to postulate changes in the modeling decisions which may provide better estimation of state variables.
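The meta-modeling idea the abstract describes — a fixed conservation core with swappable flux parameterizations, so two "model instantiations" differ only in their modeling decisions — can be sketched in miniature. The single-bucket water balance below is our own toy, not SUMMA's equations; the two runoff parameterizations and their coefficients are hypothetical.

```python
def simulate(storage0, precip, runoff_flux, dt=1.0):
    """Fixed conservation core (a simple mass balance, dS/dt = P - Q)
    with a pluggable runoff parameterization Q(S) -- one 'modeling
    decision' in the SUMMA sense."""
    s, series = storage0, []
    for p in precip:
        q = runoff_flux(s)       # the swappable flux parameterization
        s = s + dt * (p - q)     # the shared conservation equation
        series.append(s)
    return series

# Two instantiations of the same meta-model, differing only in one decision:
linear = lambda s, k=0.1: k * s              # linear reservoir
nonlinear = lambda s, k=0.01, b=1.5: k * s ** b  # nonlinear reservoir
```

Running both instantiations on identical forcing isolates the effect of that single decision, which is the comparison a framework like SUMMA performs at full model complexity.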

  19. Logistic Model to Support Service Modularity for the Promotion of Reusability in a Web Objects-Enabled IoT Environment

    PubMed Central

    Chong, Ilyoung

    2017-01-01

    Due to the very large number of connected virtual objects in the surrounding environment, intelligent service features in the Internet of Things require the reuse of existing virtual objects and composite virtual objects. If a new virtual object were created for each new service request, the number of virtual objects would increase exponentially. The Web of Objects applies the principle of service modularity in terms of virtual objects and composite virtual objects. Service modularity is a key concept in the Web Objects-Enabled Internet of Things (IoT) environment which allows for the reuse of existing virtual objects and composite virtual objects in heterogeneous ontologies. In the case of similar service requests occurring at the same or different locations, the already-instantiated virtual objects and their composites that exist in the same or different ontologies can be reused. In this case, similar types of virtual objects and composite virtual objects are searched and matched. Their reuse avoids duplication under similar circumstances, and reduces the time it takes to search and instantiate them from their repositories, where similar functionalities are provided by similar types of virtual objects and their composites. Controlling and maintaining a virtual object means controlling and maintaining a real-world object in the real world. Even though the functional costs of virtual objects are just a fraction of those for deploying and maintaining real-world objects, this article focuses on reusing virtual objects and composite virtual objects, and discusses similarity matching of virtual objects and composite virtual objects. This article proposes a logistic model that supports service modularity for the promotion of reusability in the Web Objects-enabled IoT environment. Necessary functional components and a flowchart of an algorithm for reusing composite virtual objects are discussed. Also, to realize the service modularity, a use case scenario is studied and implemented. PMID:28937590

  20. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, which is a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
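The "common graph-based abstraction" the abstract refers to — networks as graphs whose nodes carry component data and whose coupling edges join components across networks, such as a gas-fired generator linking the gas and electric systems — can be sketched in a few lines. This is our own dict-based illustration of the concept, not the PLASMO/DMNetwork data structures; all node names and attributes are hypothetical.

```python
class NetworkGraph:
    """Minimal graph abstraction: nodes hold component data, edges hold
    connection data. Separate instances model separate infrastructure
    networks (e.g. gas, electric)."""

    def __init__(self, name):
        self.name = name
        self.nodes = {}    # node id -> attribute dict
        self.edges = []    # (node a, node b, attribute dict)

    def add_node(self, nid, **data):
        self.nodes[nid] = data

    def add_edge(self, a, b, **data):
        self.edges.append((a, b, data))

def couple(net_a, net_b, pairs):
    """Return coupling links ((network, node), (network, node)) joining
    two networks into one composite model, e.g. a gas junction feeding a
    gas-fired generator on the electric side."""
    return [((net_a.name, a), (net_b.name, b)) for a, b in pairs]
```

A composite gas-electric model is then just the two graphs plus the coupling links, which is enough shared structure for an optimizer and a simulator to exchange data consistently.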

  1. Ada Compiler Validation Summary Report: Certificate Number: 940305W1. 11335 TLD Systems, Ltd. TLD Comanche VAX/i960 Ada Compiler System, Version 4.1.1 VAX Cluster under VMS 5.5 = Tronix JIAWG Execution Vehicle (i960MX) under TLD Real Time Executive, Version 4.1.1

    DTIC Science & Technology

    1994-03-14

    Comanche VAX/i960 Ada Compiler System, Version 4.1.1. Host Computer System: Digital Local Area Network VAX Cluster executing on (2) MicroVAX 3100 Model 90... Macro parameters: $MAX_DIGITS 15, $MAX_INT 2147483647, $MAX_INT_PLUS_1 2147483648, $MIN_INT -2147483648... $NAME NO_SUCH_INTEGER_TYPE, $NAME_LIST... Nested generics are supported and generics defined in library units are permitted. It is not possible to perform a macro instantiation for a generic...

  2. Resting-State Functional Connectivity Differentiates Anxious Apprehension and Anxious Arousal

    PubMed Central

    Burdwood, Erin N.; Infantolino, Zachary P.; Crocker, Laura D.; Spielberg, Jeffrey M.; Banich, Marie T.; Miller, Gregory A.; Heller, Wendy

    2016-01-01

    Brain regions in the default mode network (DMN) display greater functional connectivity at rest or during self-referential processing than during goal-directed tasks. The present study assessed resting-state connectivity as a function of anxious apprehension and anxious arousal, independent of depressive symptoms, in order to understand how these dimensions disrupt cognition. Whole-brain, seed-based analyses indicated differences between anxious apprehension and anxious arousal in DMN functional connectivity. Lower connectivity associated with higher anxious apprehension suggests decreased adaptive, inner-focused thought processes, whereas higher connectivity at higher levels of anxious arousal may reflect elevated monitoring of physiological responses to threat. These findings further the conceptualization of anxious apprehension and anxious arousal as distinct psychological dimensions with distinct neural instantiations. PMID:27406406

  3. Fast Bound Methods for Large Scale Simulation with Application for Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.; Peraire, Jaime; Zang, Thomas A. (Technical Monitor)

    2002-01-01

    In this work, we have focused on fast bound methods for large scale simulation with application to engineering optimization. The emphasis is on the development of techniques that provide both very fast turnaround and a certificate of fidelity; these attributes ensure that the results are indeed relevant to - and trustworthy within - the engineering context. The bound methodology which underlies this work has many different instantiations: finite element approximation; iterative solution techniques; and reduced-basis (parameter) approximation. In this grant we have, in fact, treated all three, but most of our effort has been concentrated on the first and third. We describe these below briefly - but with a pointer to an Appendix which describes, in some detail, the current "state of the art."

  4. Simulation and Analysis of Converging Shock Wave Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Shashkov, Mikhail J.

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  5. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, which is a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
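
The common graph-based abstraction this record describes can be sketched in a few lines: nodes carry component models (possibly of different fidelity) and edges record the couplings between them. The class and the toy gas/electric example below are illustrative only, not the PLASMO or DMNetwork APIs.

```python
# Minimal sketch of a graph abstraction for coupled networks: nodes hold
# component models, edges record couplings. All names are hypothetical.
class NetworkGraph:
    def __init__(self):
        self.nodes = {}   # name -> model data (any object)
        self.edges = []   # (node_a, node_b, coupling attributes)

    def add_node(self, name, model):
        self.nodes[name] = model

    def add_edge(self, a, b, **attrs):
        assert a in self.nodes and b in self.nodes, "couple known nodes only"
        self.edges.append((a, b, attrs))

    def neighbors(self, name):
        out = [b for a, b, _ in self.edges if a == name]
        out += [a for a, b, _ in self.edges if b == name]
        return out

# Instantiate a toy coupled gas/power system on the shared graph.
g = NetworkGraph()
g.add_node("gas_junction", {"pressure_bar": 60.0})
g.add_node("gas_turbine", {"efficiency": 0.4})
g.add_node("power_bus", {"voltage_kV": 230.0})
g.add_edge("gas_junction", "gas_turbine", flow="gas")
g.add_edge("gas_turbine", "power_bus", flow="electricity")
print(g.neighbors("gas_turbine"))  # the turbine couples the two networks
```

Because both an optimizer and a simulator can walk the same node/edge structure, models of different fidelity can be swapped in per node without changing the application code, which is the compatibility point the abstract makes.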

  6. Adaptable state based control system

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D. (Inventor); Dvorak, Daniel L. (Inventor); Gostelow, Kim P. (Inventor); Starbird, Thomas W. (Inventor); Gat, Erann (Inventor); Chien, Steve Ankuo (Inventor); Keller, Robert M. (Inventor)

    2004-01-01

    An autonomous controller, composed of a state knowledge manager, a control executor, hardware proxies, and a statistical estimator, collaborates with a goal elaborator, with which it shares common models of the behavior of the system and the controller. The elaborator uses the common models to generate, from temporally indeterminate sets of goals, executable goals to be executed by the controller. The controller may be updated to operate in a different system or environment than that for which it was originally designed by the replacement of shared statistical models and by the instantiation of a new set of state variable objects derived from a state variable class. The adaptation of the controller does not require substantial modification of the goal elaborator for its application to the new system or environment.
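
The adaptation mechanism this record describes, instantiating a new set of state variable objects derived from a common state variable class, can be sketched as plain object-oriented code. The class names, the trivial estimator, and the two example systems below are hypothetical illustrations, not the patented controller's design.

```python
# Hedged sketch: retargeting the controller to a new system means
# instantiating a different set of state variables from a shared base class.
class StateVariable:
    def __init__(self, name, initial):
        self.name = name
        self.value = initial

    def estimate(self, measurement):
        # Placeholder estimator: blend the prior value with the measurement.
        self.value = 0.5 * self.value + 0.5 * measurement
        return self.value

class WheelSpeed(StateVariable):
    pass

class TankPressure(StateVariable):
    pass

class Controller:
    def __init__(self, state_vars):
        self.state = {sv.name: sv for sv in state_vars}

    def update(self, measurements):
        return {name: sv.estimate(measurements[name])
                for name, sv in self.state.items()}

# Two different systems, same controller code, different state variables.
rover = Controller([WheelSpeed("wheel_speed", 0.0)])
lander = Controller([TankPressure("tank_pressure", 10.0)])
print(rover.update({"wheel_speed": 2.0}))      # blends 0.0 and 2.0
print(lander.update({"tank_pressure": 12.0}))  # blends 10.0 and 12.0
```

The `Controller` logic never changes between systems; only the instantiated state variable set does, which mirrors the abstract's claim that adaptation does not require substantial modification of the shared machinery.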

  7. Architecture-driven reuse of code in KASE

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    In order to support the synthesis of large, complex software systems, we need to focus on issues pertaining to the architectural design of a system in addition to algorithm and data structure design. We present an approach based on abstracting the architectural design of a set of problems in the form of a generic architecture, and on providing tools that can be used to instantiate the generic architecture for specific problem instances. Such an approach also facilitates reuse of code between different systems belonging to the same problem class. We describe an application of our approach to a realistic problem, present the results of the exercise, and discuss how our approach compares to other work in this area.
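
The generic-architecture idea in this record can be sketched with the template-method pattern: the generic architecture fixes the control flow, and each problem instance supplies the concrete steps. The pipeline shape and the toy sorting instance below are illustrative assumptions, not the KASE tooling.

```python
from abc import ABC, abstractmethod

# Sketch of architecture-level reuse: the generic architecture owns the
# control flow; instantiating it for a problem means filling in the steps.
class GenericPipeline(ABC):
    def run(self, data):
        return self.postprocess(self.transform(self.preprocess(data)))

    @abstractmethod
    def preprocess(self, data): ...

    @abstractmethod
    def transform(self, data): ...

    @abstractmethod
    def postprocess(self, data): ...

class SortingInstance(GenericPipeline):
    # One instantiation of the generic architecture for a toy problem:
    # clean the input, sort it, report the smallest three values.
    def preprocess(self, data):
        return [x for x in data if x is not None]

    def transform(self, data):
        return sorted(data)

    def postprocess(self, data):
        return data[:3]

print(SortingInstance().run([5, None, 2, 9, 1]))  # [1, 2, 5]
```

A second instance of `GenericPipeline` for a different problem in the same class would reuse `run` unchanged, which is the cross-system code reuse the abstract highlights.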

  8. Resisting the siren call of individualism in pediatric decision-making and the role of relational interests.

    PubMed

    Salter, Erica K

    2014-02-01

    The siren call of individualism is compelling. And although we have recognized its dangerous allure in the realm of adult decision-making, it has had profound and yet unnoticed dangerous effects in pediatric decision-making as well. Liberal individualism as instantiated in the best interest standard conceptualizes the child as independent and unencumbered and the goal of child rearing as rational autonomous adulthood, a characterization that is both ontologically false and normatively dangerous. Although a notion of the individuated child might have a place in establishing a threshold of care obligated and enforced by the state, beyond this context we should turn our attention more explicitly to the relational interests of children.

  9. Contingent Attentional Capture

    NASA Technical Reports Server (NTRS)

    Remington, Roger; Folk, Charles L.

    1994-01-01

    Four experiments address the degree of top-down selectivity in attention capture by feature singletons through manipulations of the spatial relationship and featural similarity of target and distractor singletons in a modified spatial cuing paradigm. Contrary to previous studies, all four experiments show that when searching for a singleton target, an irrelevant featural singleton captures attention only when defined by the same feature value as the target. Experiments 2, 3, and 4 provide a potential explanation for this empirical discrepancy by showing that irrelevant singletons can produce distraction effects that are independent of shifts of spatial attention. The results further support the notion that attentional capture is contingent on top-down attention control settings but indicate that such settings can be instantiated at the level of feature values.

  10. Complex systems and health behavior change: insights from cognitive science.

    PubMed

    Orr, Mark G; Plaut, David C

    2014-05-01

    To provide proof-of-concept that quantum health behavior can be instantiated as a computational model that is informed by cognitive science, the Theory of Reasoned Action, and quantum health behavior theory. We conducted a synthetic review of the intersection of quantum health behavior change and cognitive science. We conducted simulations, using a computational model of quantum health behavior (a constraint satisfaction artificial neural network) and tested whether the model exhibited quantum-like behavior. The model exhibited clear signs of quantum-like behavior. Quantum health behavior can be conceptualized as constraint satisfaction: a mitigation between current behavioral state and the social contexts in which it operates. We outlined implications for moving forward with computational models of both quantum health behavior and health behavior in general.
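
The constraint satisfaction artificial neural network this record names can be sketched as a small Hopfield-style network: units with symmetric weights update asynchronously and settle into a state that satisfies as many soft constraints as possible, i.e. one of minimal energy. The three-unit weight matrix below is an arbitrary toy constraint set, not the authors' model.

```python
import numpy as np

# Toy constraint satisfaction network: positive weights encode "these units
# should agree", negative weights "these should disagree". Symmetric, zero
# diagonal, as in a Hopfield network.
W = np.array([[ 0.0,  1.0, -1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0, -1.0,  0.0]])

def energy(s):
    # Lower energy = more constraints satisfied.
    return -0.5 * s @ W @ s

def settle(s, steps=20, seed=0):
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        # Asynchronous update: each flip keeps or lowers the energy.
        s[i] = 1.0 if W[i] @ s > 0 else -1.0
    return s

s0 = np.array([1.0, -1.0, 1.0])       # an inconsistent starting state
s_final = settle(s0)
print(energy(s0), energy(s_final))    # energy never increases while settling
```

The network's two attractors here are (+1, +1, -1) and (-1, -1, +1); which one is reached depends on the update order, a toy analogue of a behavioral state negotiated against its social-context constraints.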

  11. Designing for expansive science learning and identification across settings

    NASA Astrophysics Data System (ADS)

    Stromholt, Shelley; Bell, Philip

    2017-10-01

    In this study, we present a case for designing expansive science learning environments in relation to neoliberal instantiations of standards-based implementation projects in education. Using ethnographic and design-based research methods, we examine how the design of coordinated learning across settings can engage youth from non-dominant communities in scientific and engineering practices, resulting in learning experiences that are more relevant to youth and their communities. Analyses highlight: (a) transformative moments of identification for one fifth-grade student across school and non-school settings; (b) the disruption of societal, racial stereotypes on the capabilities of and expectations for marginalized youth; and (c) how youth recognized themselves as members of their community and agents of social change by engaging in personally consequential science investigations and learning.

  12. Learning for autonomous navigation : extrapolating from underfoot to the far field

    NASA Technical Reports Server (NTRS)

    Matthies, Larry; Turmon, Michael; Howard, Andrew; Angelova, Anelia; Tang, Benyang; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter. Enabling robots to learn from experience may alleviate both of these problems. We define two paradigms for this, learning from 3-D geometry and learning from proprioception, and describe initial instantiations of them we have developed under DARPA and NASA programs. Field test results show promise for learning traversability of vegetated terrain, learning to extend the lookahead range of the vision system, and learning how slip varies with slope.
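
The "learning from proprioception" paradigm this record defines can be sketched as a tiny classifier: terrain patches the robot has already driven over provide labeled examples (features measured underfoot, labels = traversable or not), and far-field patches are scored by their nearest labeled neighbors in feature space. The features, data, and labeling rule below are synthetic placeholders, not the DARPA/NASA systems.

```python
import numpy as np

# Synthetic "driven-over" training set: geometric features plus a measured
# traversability label for each patch.
rng = np.random.default_rng(1)
n = 200
slope = rng.uniform(0, 40, n)        # degrees, from near-field 3-D geometry
roughness = rng.uniform(0, 1, n)     # arbitrary units
labels = (slope + 25 * roughness < 30)  # stand-in for proprioceptive outcome

def traversable(s, r, k=5):
    # k-nearest-neighbor vote; roughness rescaled so both features span
    # comparable ranges before computing distances.
    d2 = (slope - s) ** 2 + (40 * (roughness - r)) ** 2
    nearest = np.argsort(d2)[:k]
    return labels[nearest].mean() > 0.5

# A gentle, smooth far-field patch vs. a steep, rough one.
print(traversable(5, 0.1), traversable(35, 0.9))
```

The same scheme extrapolates beyond the 3-D sensor's lookahead range once the features can be estimated from imagery alone, which is the "underfoot to the far field" idea in the title.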

  13. The road plan model: Information model for planning road building activities

    NASA Technical Reports Server (NTRS)

    Azinhal, Rafaela K.; Moura-Pires, Fernando

    1994-01-01

    The general building contractor is presented with an information model as an approach for deriving a high-level work plan of construction activities applied to road building. Road construction activities are represented in a Road Plan Model (RPM), which is modeled in the ISO standard STEP/EXPRESS and adopts various concepts from the GARM notation. The integration with the preceding road design stage and the succeeding phase of resource scheduling is discussed within the framework of a Road Construction Model. Construction knowledge is applied to the road design and the terrain model of the surrounding road infrastructure for the instantiation of the RPM. Issues regarding the implementation of a road planner application supporting the RPM are discussed.

  14. Causality, Measurement, and Elementary Interactions

    NASA Astrophysics Data System (ADS)

    Gillis, Edward J.

    2011-12-01

    Signal causality, the prohibition of superluminal information transmission, is the fundamental property shared by quantum measurement theory and relativity, and it is the key to understanding the connection between nonlocal measurement effects and elementary interactions. To prevent those effects from transmitting information between the generating and observing process, they must be induced by the kinds of entangling interactions that constitute measurements, as implied in the Projection Postulate. They must also be nondeterministic as reflected in the Born Probability Rule. The nondeterminism of entanglement-generating processes explains why the relevant types of information cannot be instantiated in elementary systems, and why the sequencing of nonlocal effects is, in principle, unobservable. This perspective suggests a simple hypothesis about nonlocal transfers of amplitude during entangling interactions, which yields straightforward experimental consequences.

  15. A user interface framework for the Square Kilometre Array: concepts and responsibilities

    NASA Astrophysics Data System (ADS)

    Marassi, Alessandro; Brajnik, Giorgio; Nicol, Mark; Alberti, Valentina; Le Roux, Gerhard

    2016-07-01

    The Square Kilometre Array (SKA) project is responsible for developing the SKA Observatory, the world's largest radio telescope, with eventually over a square kilometre of collecting area and including a general headquarters as well as two radio telescopes: SKA1-Mid in South Africa and SKA1-Low in Australia. The SKA project consists of a number of subsystems (elements) among which the Telescope Manager (TM) is the one involved in controlling and monitoring the SKA telescopes. The TM element has three primary responsibilities: management of astronomical observations, management of telescope hardware and software subsystems, and management of data to support system operations and all stakeholders (operators, maintainers, engineers and science users) in achieving operational, maintenance and engineering goals. Operators, maintainers, engineers and science users will interact with TM via appropriate user interfaces (UI). The TM UI framework envisaged is a complete set of general technical solutions (components, technologies and design information) for implementing a generic computing system (UI platform). Such a system will enable UI components to be instantiated to allow for human interaction via screens, keyboards, and mice, and to implement the necessary logic for acquiring or deriving the information needed for interaction. It will provide libraries and specific Application Programming Interfaces (APIs) to implement operator and engineer interactive interfaces. This paper will provide a status update of the TM UI framework, UI platform and UI components design effort, including the technology choices, and discuss key challenges in the TM UI architecture, as well as our approaches to addressing them.

  16. Frequency-specific electrophysiologic correlates of resting state fMRI networks.

    PubMed

    Hacker, Carl D; Snyder, Abraham Z; Pahwa, Mrinal; Corbetta, Maurizio; Leuthardt, Eric C

    2017-04-01

    Resting state functional MRI (R-fMRI) studies have shown that slow (<0.1Hz), intrinsic fluctuations of the blood oxygen level dependent (BOLD) signal are temporally correlated within hierarchically organized functional systems known as resting state networks (RSNs) (Doucet et al., 2011). Most broadly, this hierarchy exhibits a dichotomy between two opposed systems (Fox et al., 2005). One system engages with the environment and includes the visual, auditory, and sensorimotor (SMN) networks as well as the dorsal attention network (DAN), which controls spatial attention. The other system includes the default mode network (DMN) and the fronto-parietal control system (FPC), RSNs that instantiate episodic memory and executive control, respectively. Here, we test the hypothesis, based on the spectral specificity of electrophysiologic responses to perceptual vs. memory tasks (Klimesch, 1999; Pfurtscheller and Lopes da Silva, 1999), that these two large-scale neural systems also manifest frequency specificity in the resting state. We measured the spatial correspondence between electrocorticographic (ECoG) band-limited power (BLP) and R-fMRI correlation patterns in awake, resting, human subjects. Our results show that, while gamma BLP correspondence was common throughout the brain, theta (4-8Hz) BLP correspondence was stronger in the DMN and FPC, whereas alpha (8-12Hz) correspondence was stronger in the SMN and DAN. Thus, the human brain, at rest, exhibits frequency specific electrophysiology, respecting both the spectral structure of task responses and the hierarchical organization of RSNs. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
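
The band-limited power (BLP) quantity compared against R-fMRI patterns in this record can be sketched with a toy computation: the power of a signal restricted to a frequency band, obtained here via the FFT. The synthetic signal and band edges below are illustrative, not the authors' ECoG pipeline.

```python
import numpy as np

# Synthetic "recording": a strong 6 Hz (theta-band) rhythm plus a weaker
# 40 Hz (gamma-band) component, sampled at 256 Hz for 10 s.
fs = 256
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

def band_power(x, fs, lo, hi):
    # Sum the power spectrum over the [lo, hi) frequency band.
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].sum()

theta = band_power(sig, fs, 4, 8)    # 4-8 Hz, where the 6 Hz rhythm lives
gamma = band_power(sig, fs, 30, 80)  # 30-80 Hz, where the 40 Hz rhythm lives
print(theta > gamma)                 # the stronger theta rhythm dominates
```

In the study, such band power is tracked over time per electrode and its slow fluctuations are spatially correlated against BOLD patterns; this sketch shows only the band-restriction step.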

  17. Frequency-specific electrophysiologic correlates of resting state fMRI networks

    PubMed Central

    Hacker, Carl D.; Snyder, Abraham Z.; Pahwa, Mrinal; Corbetta, Maurizio; Leuthardt, Eric C.

    2017-01-01

    Resting state functional MRI (R-fMRI) studies have shown that slow (< 0.1 Hz), intrinsic fluctuations of the blood oxygen level dependent (BOLD) signal are temporally correlated within hierarchically organized functional systems known as resting state networks (RSNs) (Doucet et al., 2011). Most broadly, this hierarchy exhibits a dichotomy between two opposed systems (Fox et al., 2005). One system engages with the environment and includes the visual, auditory, and sensorimotor (SMN) networks as well as the dorsal attention network (DAN), which controls spatial attention. The other system includes the default mode network (DMN) and the fronto-parietal control system (FPC), RSNs that instantiate episodic memory and executive control, respectively. Here, we test the hypothesis, based on the spectral specificity of electrophysiologic responses to perceptual vs. memory tasks (Klimesch, 1999; Pfurtscheller and Lopes da Silva, 1999), that these two large-scale neural systems also manifest frequency specificity in the resting state. We measured the spatial correspondence between electrocorticographic (ECoG) band-limited power (BLP) and R-fMRI correlation patterns in awake, resting, human subjects. Our results show that, while gamma BLP correspondence was common throughout the brain, theta (4–8 Hz) BLP correspondence was stronger in the DMN and FPC, whereas alpha (8–12 Hz) correspondence was stronger in the SMN and DAN. Thus, the human brain, at rest, exhibits frequency specific electrophysiology, respecting both the spectral structure of task responses and the hierarchical organization of RSNs. PMID:28159686

  18. Convergence

    NASA Astrophysics Data System (ADS)

    Darcie, Thomas E.; Doverspike, Robert; Zirngibl, Martin; Korotky, Steven K.

    2005-01-01

    Call for Papers: Convergence

    Guest Editors: Thomas E. Darcie, University of Victoria; Robert Doverspike, AT&T; Martin Zirngibl, Lucent Technologies

    Coordinating Associate Editor: Steven K. Korotky, Lucent Technologies

    The Journal of Optical Networking (JON) invites submissions to a special issue on Convergence. Convergence has become a popular theme in telecommunications, one that has broad implications across all segments of the industry. The continual evolution of technology and applications erases lines between traditionally separate lines of business, with dramatic consequences for vendors, service providers, and consumers. Spectacular advances in all layers of optical networking, leading to abundant, dynamic, cost-effective, and reliable wide-area and local-area connections, have been essential drivers of this evolution. As services and networks continue to evolve towards some notion of convergence, the continued role of optical networks must be explored. One vision of convergence renders all information in a common packet (especially IP) format. This vision is driven by the proliferation of data services. For example, time-division multiplexed (TDM) voice becomes VoIP. Analog cable-television signals become MPEG bits streamed to digital set-top boxes. T1 or OC-N private lines migrate to Ethernet virtual private networks (VPNs). All these packets coexist peacefully within a single packet-routing methodology built on an optical transport layer that combines the flexibility and cost of data networks with telecom-grade reliability. While this vision is appealing in its simplicity and widely shared, specifics of implementation raise many challenges and differences of opinion. For example, many seek to expand the role of Ethernet in these transport networks, while massive efforts are underway to make traditional TDM networks more data-friendly within an evolved but backward-compatible SDH/SONET (synchronous digital hierarchy and synchronous optical network) multiplexing hierarchy. From this common underlying theme follow many specific instantiations.
Examples include the convergence at the physical, logical, and operational levels of voice and data, video and data, private-line and virtual private-line, fixed and mobile, and local and long-haul services. These trends have many consequences for consumers, vendors, and carriers. Faced with large volumes of low-margin data traffic mixed with traditional voice services, and needing to conserve capital and operate efficiently, carriers are moving away from today's separate overlay networks for each service and towards "converged" platforms. For example, cable operators require transport of multiple services over both hybrid fiber coax (HFC) and DWDM transport technologies. Local carriers seek an economical architecture to deliver integrated services on optically enabled broadband-access networks. Services over wireless-access networks must coexist with those from wired networks. In each case, convergence of networks and services inspires an important set of questions and challenges, driven by the need for low cost, operational efficiency, service performance requirements, and optical transport technology options. This Feature Issue explores the various interpretations and implications of network convergence pertinent to optical networking. How does convergence affect the evolution of optical transport-layer and control approaches? Are the implied directions consistent with the research vision for optical networks? Substantial challenges remain. Papers are solicited across the broad spectrum of interests. These include, but are not limited to:
    • Architecture, design and performance of optical wide-area-network (WAN), metro, and access networks
    • Integration strategies for multiservice transport platforms
    • Access methods that bridge traditional and emerging services
    • Network signaling and control methodologies
    • All-optical packet routing and switching techniques

    Manuscript Submission

    To submit to this special issue, follow the normal procedure for submission to JON, indicating "Convergence feature" in the "Comments" field of the online submission form. For all other questions relating to this feature issue, please send an e-mail to jon@osa.org, subject line "Convergence." Additional information can be found on the JON website: http://www.osa-jon.org/submission/. Submission Deadline: 1 July 2005

  19. Postural Communication of Emotion: Perception of Distinct Poses of Five Discrete Emotions.

    PubMed

    Lopez, Lukas D; Reschke, Peter J; Knothe, Jennifer M; Walle, Eric A

    2017-01-01

    Emotion can be communicated through multiple distinct modalities. However, an often-ignored channel of communication is posture. Recent research indicates that bodily posture plays an important role in the perception of emotion. However, research examining postural communication of emotion is limited by the variety of validated emotion poses and unknown cohesion of categorical and dimensional ratings. The present study addressed these limitations. Specifically, we examined individuals' (1) categorization of emotion postures depicting 5 discrete emotions (joy, sadness, fear, anger, and disgust), (2) categorization of different poses depicting the same discrete emotion, and (3) ratings of valence and arousal for each emotion pose. Findings revealed that participants successfully categorized each posture as the target emotion, including disgust. Moreover, participants accurately identified multiple distinct poses within each emotion category. In addition to the categorical responses, dimensional ratings of valence and arousal revealed interesting overlap and distinctions between and within emotion categories. These findings provide the first evidence of an identifiable posture for disgust and instantiate the principle of equifinality of emotional communication through the inclusion of distinct poses within emotion categories. Additionally, the dimensional ratings corroborated the categorical data and provide further granularity for future researchers to consider in examining how distinct emotion poses are perceived.

  20. Electron-beam lithography with character projection technique for high-throughput exposure with line-edge quality control

    NASA Astrophysics Data System (ADS)

    Ikeno, Rimon; Maruyama, Satoshi; Mita, Yoshio; Ikeda, Makoto; Asada, Kunihiro

    2016-07-01

    The high throughput of character projection (CP) electron-beam (EB) lithography makes it a promising technique for low-to-medium volume device fabrication with regularly arranged layouts, such as for standard-cell logics and memory arrays. However, non-VLSI applications such as MEMS and MOEMS may not be able to fully utilize the benefits of the CP method due to the wide variety of layout figures including curved and oblique edges. In addition, the stepwise shapes that appear because of the EB exposure process often result in intolerable edge roughness, which degrades device performances. In this study, we propose a general EB lithography methodology for such applications utilizing a combination of the CP and variable-shaped beam methods. In the process of layout data conversion with CP character instantiation, several control parameters were optimized to minimize the shot count, improve the edge quality, and enhance the overall device performance. We have demonstrated EB shot reduction and edge-quality improvement with our methodology by using a leading-edge EB exposure tool, ADVANTEST F7000S-VD02, and a high-resolution hydrogen silsesquioxane resist. Atomic force microscope observations were used to analyze the resist edge profiles' quality to determine the influence of the control parameters used in the data conversion process.

  1. Longitudinal Validation of General and Specific Structural Features of Personality Pathology

    PubMed Central

    Wright, Aidan G.C.; Hopwood, Christopher J.; Skodol, Andrew E.; Morey, Leslie C.

    2016-01-01

    Theorists have long argued that personality disorder (PD) is best understood in terms of general impairments shared across the disorders as well as more specific instantiations of pathology. A model based on this theoretical structure was proposed as part of the DSM-5 revision process. However, only recently has this structure been subjected to formal quantitative evaluation, with little in the way of validation efforts via external correlates or prospective longitudinal prediction. We used the Collaborative Longitudinal Study of Personality Disorders dataset to: (1) estimate structural models that parse general from specific variance in personality disorder features, (2) examine patterns of growth in general and specific features over the course of 10 years, and (3) establish concurrent and dynamic longitudinal associations in PD features and a host of external validators including basic personality traits and psychosocial functioning scales. We found that general PD exhibited much lower absolute stability and was most strongly related to broad markers of psychosocial functioning, concurrently and longitudinally, whereas specific features had much higher mean stability and exhibited more circumscribed associations with functioning. However, both general and specific factors showed recognizable associations with normative and pathological traits. These results can inform efforts to refine the conceptualization and diagnosis of personality pathology. PMID:27819472

  2. Behavioral conflict, anterior cingulate cortex, and experiment duration: implications of diverging data.

    PubMed

    Erickson, Kirk I; Milham, Michael P; Colcombe, Stanley J; Kramer, Arthur F; Banich, Marie T; Webb, Andrew; Cohen, Neal J

    2004-02-01

    We investigated the relationship between behavioral measures of conflict and the degree of activity in the anterior cingulate cortex (ACC). We reanalyzed an existing data set that employed the Stroop task using functional magnetic resonance imaging [Milham et al., Brain Cogn 2002;49:277-296]. Although we found no changes in the behavioral measures of conflict from the first to the second half of task performance, we found a reliable reduction in the activity of the anterior cingulate cortex. This result suggests the lack of a strong relationship between behavioral measurements of conflict and anterior cingulate activity. A concomitant increase in dorsolateral prefrontal cortex activity was also found, which may reflect a tradeoff in the neural substrates involved in supporting conflict resolution, detection, or monitoring processes. A second analysis of the data revealed that the duration of an experiment can dramatically affect interpretations of the results, including the roles that particular regions are thought to play in cognition. These results are discussed in relation to current conceptions of ACC's role in attentional control. In addition, we discuss the implications of our results for current conceptions of conflict and its instantiation in the brain. Hum. Brain Mapping 21:96-105, 2004. Copyright 2003 Wiley-Liss, Inc.

  3. From relational ontology to transformative activist stance on development and learning: expanding Vygotsky's (CHAT) project

    NASA Astrophysics Data System (ADS)

    Stetsenko, Anna

    2008-07-01

    This paper offers steps towards overcoming current fragmentation within sociocultural approaches by expansively reconstructing a broad dialectical view on human development and learning (drawing on Vygotsky's project) underwritten by ideology of social justice. The common foundation for sociocultural approaches is developed by dialectically supplanting relational ontology with the notion that collaborative purposeful transformation of the world is the core of human nature and the principled grounding for learning and development. An activist transformative stance suggests that people come to know themselves and their world as well as ultimately come to be human in and through (not in addition to) the processes of collaboratively transforming the world in view of their goals. This means that all human activities (including psychological processes and the self) are instantiations of contributions to collaborative transformative practices that are contingent on both the past and the vision for the future and therefore are profoundly imbued with ideology, ethics, and values. And because acting, being, and knowing are seen from a transformative activist stance as all rooted in, derivative of, and instrumental within a collaborative historical becoming, this stance cuts across and bridges the gaps (a) between individual and social and (b) among ontological, epistemological, and moral-ethical (ideological) dimensions of activity.

  4. Steam distribution and energy delivery optimization using wireless sensors

    NASA Astrophysics Data System (ADS)

    Olama, Mohammed M.; Allgood, Glenn O.; Kuruganti, Teja P.; Sukumar, Sreenivas R.; Djouadi, Seddik M.; Lake, Joe E.

    2011-05-01

    The Extreme Measurement Communications Center at Oak Ridge National Laboratory (ORNL) explores the deployment of a wireless sensor system with a real-time measurement-based energy efficiency optimization framework in the ORNL campus. With particular focus on the 12-mile long steam distribution network in our campus, we propose an integrated system-level approach to optimize the energy delivery within the steam distribution system. We address the goal of achieving significant energy-saving in steam lines by monitoring and acting on leaking steam valves/traps. Our approach leverages an integrated wireless sensor and real-time monitoring capabilities. We make assessments on the real-time status of the distribution system by mounting acoustic sensors on the steam pipes/traps/valves and observing the state measurements of these sensors. Our assessments are based on analysis of the wireless sensor measurements. We describe Fourier-spectrum based algorithms that interpret acoustic vibration sensor data to characterize flows and classify the steam system status. We are able to present the sensor readings, steam flow, steam trap status and the assessed alerts as an interactive overlay within a web-based Google Earth geographic platform that enables decision makers to take remedial action. We believe our demonstration serves as an instantiation of a platform whose implementation can be extended to newer modalities for managing water flow, sewage, and energy consumption.
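
A Fourier-spectrum classifier of the kind this record describes can be sketched as follows: a leaking trap's acoustic signature carries broadband high-frequency hiss, so the ratio of high-band to low-band spectral energy can flag leaks. The synthetic signals, band split, and threshold below are illustrative assumptions, not ORNL's algorithm.

```python
import numpy as np

# Synthetic 1 s acoustic recordings at 8 kHz: a healthy line hums at a
# single low frequency; a leaking trap adds broadband noise ("hiss").
fs = 8000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
quiet = np.sin(2 * np.pi * 120 * t)
leaking = quiet + 0.8 * rng.standard_normal(len(t))

def leak_score(x, fs, split_hz=1000):
    # Ratio of spectral energy above vs. below the split frequency.
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    hi = psd[freqs >= split_hz].sum()
    lo = psd[freqs < split_hz].sum()
    return hi / lo

def status(x, fs, threshold=0.5):
    return "leaking" if leak_score(x, fs) > threshold else "ok"

print(status(quiet, fs), status(leaking, fs))
```

In a deployment, a score like this would be computed per sensor and surfaced as the trap-status alerts the abstract mentions overlaying on the Google Earth view.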

  5. Early stress, parental motivation, and reproductive decision-making: applications of life history theory to parental behavior.

    PubMed

    Cabeza de Baca, Tomás; Ellis, Bruce J

    2017-06-01

    This review focuses on the impact of parental behavior on child development, as interpreted from an evolutionary-developmental perspective. We employ psychosocial acceleration theory to reinterpret the effects of variation in parental investment and involvement on child development, arguing that these effects have been structured by natural selection to match the developing child to current and expected future environments. Over time, an individual's development, physiology, and behavior are organized in a coordinated manner (as instantiated in 'life history strategies') that facilitates survival and reproductive success under different conditions. We review evidence to suggest that parental behavior (1) is strategic and contingent on environmental opportunities and constraints and (2) influences child life history strategies across behavioral, cognitive, and physiological domains. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Limits to ground control in autonomous spacecraft

    NASA Technical Reports Server (NTRS)

    Wan, Alfred D. M.; Braspenning, Peter J.; Vreeswijk, Gerrard A. W.

    1995-01-01

    In this paper the autonomy concept used by ESA and NASA is critically evaluated. Moreover, a more appropriate ground control/spacecraft organizational structure is proposed on the basis of a new, more elaborate concept of autonomy. In an extended theoretical discussion its definitional properties and functionalities are established. The rather basic property of adaptivity leads to the categorization of behaviour into the modes of satisfaction and avoidance behaviour. However, the autonomy property with the most profound consequences is goal-robustness. The mechanism that implements goal-robustness tests newly generated goals and externally received goals for consistency with high-level goals. If goals appear not to be good instantiations or more acceptable replacements of existing goals, they are rejected. This means that ground control has to cooperate with the spacecraft instead of (intermittently) commanding it.

  7. Characterization of metal additive manufacturing surfaces using synchrotron X-ray CT and micromechanical modeling

    NASA Astrophysics Data System (ADS)

    Kantzos, C. A.; Cunningham, R. W.; Tari, V.; Rollett, A. D.

    2018-05-01

    Characterizing complex surface topologies is necessary to understand stress concentrations created by rough surfaces, particularly those made via laser powder-bed additive manufacturing (AM). Synchrotron-based X-ray microtomography (μXCT) of AM surfaces was shown to provide high-resolution detail of surface features and near-surface porosity. Using the CT reconstructions to instantiate a micromechanical model indicated that surface notches and near-surface porosity both act as stress concentrators, while adhered powder carried little to no load. Differences in powder size distribution had no direct effect on the relevant surface features, nor on stress concentrations. Conventional measurements of surface roughness, which are highly influenced by adhered powder, are therefore unlikely to contain the information relevant to damage accumulation and crack initiation.

  8. Single-exposure visual memory judgments are reflected in inferotemporal cortex

    PubMed Central

    Meyer, Travis

    2018-01-01

    Our visual memory percepts of whether we have encountered specific objects or scenes before are hypothesized to manifest as decrements in neural responses in inferotemporal cortex (IT) with stimulus repetition. To evaluate this proposal, we recorded IT neural responses as two monkeys performed a single-exposure visual memory task designed to measure the rates of forgetting with time. We found that a weighted linear read-out of IT was a better predictor of the monkeys’ forgetting rates and reaction time patterns than a strict instantiation of the repetition suppression hypothesis, expressed as a total spike count scheme. Behavioral predictions could be attributed to visual memory signals that were reflected as repetition suppression and were intermingled with visual selectivity, but only when combined across the most sensitive neurons. PMID:29517485

  9. Resting-state functional connectivity differentiates anxious apprehension and anxious arousal.

    PubMed

    Burdwood, Erin N; Infantolino, Zachary P; Crocker, Laura D; Spielberg, Jeffrey M; Banich, Marie T; Miller, Gregory A; Heller, Wendy

    2016-10-01

    Brain regions in the default mode network (DMN) display greater functional connectivity at rest or during self-referential processing than during goal-directed tasks. The present study assessed resting-state connectivity as a function of anxious apprehension and anxious arousal, independent of depressive symptoms, in order to understand how these dimensions disrupt cognition. Whole-brain, seed-based analyses indicated differences between anxious apprehension and anxious arousal in DMN functional connectivity. Lower connectivity associated with higher anxious apprehension suggests decreased adaptive, inner-focused thought processes, whereas higher connectivity at higher levels of anxious arousal may reflect elevated monitoring of physiological responses to threat. These findings further the conceptualization of anxious apprehension and anxious arousal as distinct psychological dimensions with distinct neural instantiations. © 2016 Society for Psychophysiological Research.

  10. A national EHR strategy preparedness characterisation model and its application in the South-East European region.

    PubMed

    Orfanidis, Leonidas; Bamidis, Panagiotis; Eaglestone, Barry

    2006-01-01

    This paper is concerned with modelling national approaches to electronic health record system (NEHRS) development. A model framework is produced stepwise that allows the preparedness and readiness of a country to develop an NEHRS to be characterised. Secondary data from published reports are considered for the creation of the model. These sources are found to originate mostly from within a sample of five developed countries. Factors arising from these sources are identified, coded and scaled so as to allow a quantitative application of the model. Instantiation of the latter for the case of the five developed countries is contrasted with the set of countries from South East Europe (SEE). The likely importance and validity of this modelling approach is discussed, using the Delphi method.

  11. Characterization of metal additive manufacturing surfaces using synchrotron X-ray CT and micromechanical modeling

    NASA Astrophysics Data System (ADS)

    Kantzos, C. A.; Cunningham, R. W.; Tari, V.; Rollett, A. D.

    2017-12-01

    Characterizing complex surface topologies is necessary to understand stress concentrations created by rough surfaces, particularly those made via laser powder-bed additive manufacturing (AM). Synchrotron-based X-ray microtomography (μXCT) of AM surfaces was shown to provide high-resolution detail of surface features and near-surface porosity. Using the CT reconstructions to instantiate a micromechanical model indicated that surface notches and near-surface porosity both act as stress concentrators, while adhered powder carried little to no load. Differences in powder size distribution had no direct effect on the relevant surface features, nor on stress concentrations. Conventional measurements of surface roughness, which are highly influenced by adhered powder, are therefore unlikely to contain the information relevant to damage accumulation and crack initiation.

  12. Modeling the Car Crash Crisis Management System Using HiLA

    NASA Astrophysics Data System (ADS)

    Hölzl, Matthias; Knapp, Alexander; Zhang, Gefei

    An aspect-oriented modeling approach to the Car Crash Crisis Management System (CCCMS) using the High-Level Aspect (HiLA) language is described. HiLA is a language for expressing aspects for UML static structures and UML state machines. In particular, HiLA supports both a static graph transformational and a dynamic approach of applying aspects. Furthermore, it facilitates methodologically turning use case descriptions into state machines: for each main success scenario, a base state machine is developed; all extensions to this main success scenario are covered by aspects. Overall, the static structure of the CCCMS is modeled in 43 classes, the main success scenarios in 13 base machines, the use case extensions in 47 static and 31 dynamic aspects, most of which are instantiations of simple aspect templates.

  13. Knowledge representation requirements for model sharing between model-based reasoning and simulation in process flow domains

    NASA Technical Reports Server (NTRS)

    Throop, David R.

    1992-01-01

    The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.

  14. Tracing the Rationale Behind UML Model Change Through Argumentation

    NASA Astrophysics Data System (ADS)

    Jureta, Ivan J.; Faulkner, Stéphane

    Neglecting traceability—i.e., the ability to describe and follow the life of a requirement—is known to entail misunderstanding and miscommunication, leading to the engineering of poor quality systems. Following the simple principles that (a) changes to UML model instances ought to be justified to the stakeholders, (b) justification should proceed in a structured manner to ensure rigor in discussions, critique, and revisions of model instances, and (c) the concept of argument instantiated in a justification process ought to be well defined and understood, the present paper introduces the UML Traceability through Argumentation Method (UML-TAM) to enable the traceability of design rationale in UML while allowing the appropriateness of model changes to be checked by analysis of the structure of the arguments provided to justify such changes.

  15. The Research Data Alliance

    NASA Astrophysics Data System (ADS)

    Fontaine, K. S.

    2015-12-01

    The Research Data Alliance (RDA) is an international organization created in 2012 to provide researchers with a forum for identifying and removing barriers to data sharing. Since then, RDA has gained over 3000 individual members, over three dozen organizational members, 47 Interest Groups, and 17 Working Groups, all focused on research data sharing. Interoperability is one instantiation of data sharing, but is not the only barrier to overcome. Technology limitations, discipline-specific cultures that do not support sharing, lack of best practices, and lack of good definitions are just a few of a long list of situations preventing researchers from sharing their data. This presentation will cover how RDA has grown, some details on how the first eight solutions contribute to interoperability and sharing, and a sneak peek at what's in the pipeline.

  16. Obtaining P3P privacy policies for composite services.

    PubMed

    Sun, Yi; Huang, Zhiqiu; Ke, Changbo

    2014-01-01

    With the development of web services technology, web services have changed from single to composite services. Privacy protection in composite services is becoming an important issue. P3P (Platform for Privacy Preferences) is a privacy policy language that was designed for single web services. It enables service providers to express how they will deal with the privacy information of service consumers. In order to solve the problem that P3P cannot be applied to composite services directly, we propose a method to obtain P3P privacy policies for composite services. In this method, we present the definitions of the Purpose, Recipient, and Retention elements as well as the Optional and Required attributes for P3P policies of composite services. We also provide an instantiation to illustrate the feasibility of the method.

  17. Rapidly Mixing Gibbs Sampling for a Class of Factor Graphs Using Hierarchy Width.

    PubMed

    De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher

    2015-12-01

    Gibbs sampling on factor graphs is a widely used inference technique, which often produces good empirical results. Theoretical guarantees for its performance are weak: even for tree-structured graphs, the mixing time of Gibbs may be exponential in the number of variables. To help understand the behavior of Gibbs sampling, we introduce a new (hyper)graph property, called hierarchy width. We show that under suitable conditions on the weights, bounded hierarchy width ensures polynomial mixing time. Our study of hierarchy width is in part motivated by a class of factor graph templates, hierarchical templates, which have bounded hierarchy width regardless of the data used to instantiate them. We demonstrate a rich application from natural language processing in which Gibbs sampling provably mixes rapidly and achieves accuracy that exceeds human volunteers.
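    The Gibbs sampling procedure the abstract analyzes can be illustrated on the smallest possible factor graph. This sketch is not the paper's hierarchy-width construction; it only shows the sweep-and-resample loop whose mixing time the paper studies, on a two-variable graph with one pairwise factor.

```python
import math, random

def gibbs_two_variable(weight=1.0, n_sweeps=20_000, seed=0):
    """Gibbs sampling on a factor graph with two binary variables x0, x1
    and a single factor exp(weight * [x0 == x1]) rewarding agreement.
    Each sweep resamples each variable from its exact conditional; the
    function returns the estimated probability that the variables agree."""
    rng = random.Random(seed)
    x = [0, 0]
    agree = 0
    for _ in range(n_sweeps):
        for i in (0, 1):
            other = x[1 - i]
            # Conditional: P(x_i = other) ∝ e^weight, P(x_i != other) ∝ 1
            p_match = math.exp(weight) / (math.exp(weight) + 1.0)
            x[i] = other if rng.random() < p_match else 1 - other
        agree += int(x[0] == x[1])
    return agree / n_sweeps
```

    For this graph the exact agreement probability is e^w / (e^w + 1), so the Monte Carlo estimate can be checked directly.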

  18. Integration services to enable regional shared electronic health records.

    PubMed

    Oliveira, Ilídio C; Cunha, João P S

    2011-01-01

    eHealth is expected to integrate a comprehensive set of patient data sources into a coherent continuum, but implementations vary and Portugal is still lacking on electronic patient data sharing. In this work, we present a clinical information hub to aggregate multi-institution patient data and bridge the information silos. This integration platform enables a coherent object model, services-oriented applications development and a trust framework. It has been instantiated in the Rede Telemática de Saúde (www.RTSaude.org) to support a regional Electronic Health Record approach, fed dynamically from production systems at eight partner institutions, providing access to more than 11,000,000 care episodes, relating to over 350,000 citizens. The network has obtained the necessary clearance from the Portuguese data protection agency.

  19. Causation, constitution and context. Comment on "Seeing mental states: An experimental strategy for measuring the observability of other minds" by Cristina Becchio et al.

    NASA Astrophysics Data System (ADS)

    Zahavi, Dan

    2018-03-01

    In their new article [1], Becchio and her colleagues argue that recent claims concerning the possibility of directly perceiving other people's mental states will remain speculative as long as one has failed to demonstrate the availability of mentalistic information in observable behavior [p. 4]. The ambitious goal of the authors is then to outline an experimental setup that will permit one to determine whether and to what extent a mental state is observable. Drawing on Becchio's previous work on how regularities in the kinematic patterns specify the mental states of the agent, the authors suggest that a similar approach can be adopted to probe the observability of any mental state instantiated in behavioral patterns [p. 19].

  20. A constrained Rasch model of trace redintegration in serial recall.

    PubMed

    Roodenrys, Steven; Miller, Leonie M

    2008-04-01

    The notion that verbal short-term memory tasks, such as serial recall, make use of information in long-term as well as in short-term memory is instantiated in many models of these tasks. Such models incorporate a process in which degraded traces retrieved from a short-term store are reconstructed, or redintegrated (Schweickert, 1993), through the use of information in long-term memory. This article presents a conceptual and mathematical model of this process based on a class of item-response theory models. It is demonstrated that this model provides a better fit to three sets of data than does the multinomial processing tree model of redintegration (Schweickert, 1993) and that a number of conceptual accounts of serial recall can be related to the parameters of the model.
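    The item-response-theory family the abstract builds on can be sketched with the basic Rasch (one-parameter logistic) function. This is only the standard Rasch form, not the authors' constrained model, whose specific parameterization the abstract does not give.

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch (1PL) item-response function: the probability of a correct
    response (here, read as a successfully redintegrated trace) given a
    person ability and an item difficulty on the same logit scale:
    P = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))
```

    When ability equals difficulty the recall probability is exactly 0.5, and it rises monotonically as ability exceeds difficulty.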

  1. Agent-Based vs. Equation-Based Epidemiological Models: A Model Selection Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Nutaro, James J

    This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu, and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm to another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
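    The equation-based side of such a comparison is typically a compartmental model; a minimal sketch is the classic SIR system integrated by forward Euler. The rates below are illustrative placeholders, not the paper's 1918 Spanish flu calibration.

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the SIR equations
    ds/dt = -beta*s*i,  di/dt = beta*s*i - gamma*i,  dr/dt = gamma*i,
    where s, i, r are population fractions (they sum to 1)."""
    new_s = s - beta * s * i * dt
    new_i = i + (beta * s * i - gamma * i) * dt
    new_r = r + gamma * i * dt
    return new_s, new_i, new_r

def simulate_sir(beta=0.5, gamma=0.25, i0=0.001, days=200, dt=0.1):
    """Run the epidemic from a small seed infection and return the
    final (susceptible, infected, recovered) fractions."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
    return s, i, r
```

    With beta/gamma = 2 (basic reproduction number 2), the epidemic runs its course and a majority of the population ends up in the recovered compartment; the Euler step conserves s + i + r exactly, which is a convenient sanity check.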

  2. Flight Testing of Terrain-Relative Navigation and Large-Divert Guidance on a VTVL Rocket

    NASA Technical Reports Server (NTRS)

    Trawny, Nikolas; Benito, Joel; Tweddle, Brent; Bergh, Charles F.; Khanoyan, Garen; Vaughan, Geoffrey M.; Zheng, Jason X.; Villalpando, Carlos Y.; Cheng, Yang; Scharf, Daniel P.

    2015-01-01

    Since 2011, the Autonomous Descent and Ascent Powered-Flight Testbed (ADAPT) has been used to demonstrate advanced descent and landing technologies onboard the Masten Space Systems (MSS) Xombie vertical-takeoff, vertical-landing suborbital rocket. The current instantiation of ADAPT is a stand-alone payload comprising sensing and avionics for terrain-relative navigation and fuel-optimal onboard planning of large divert trajectories, thus providing complete pin-point landing capabilities needed for planetary landers. To this end, ADAPT combines two technologies developed at JPL, the Lander Vision System (LVS), and the Guidance for Fuel Optimal Large Diverts (G-FOLD) software. This paper describes the integration and testing of LVS and G-FOLD in the ADAPT payload, culminating in two successful free flight demonstrations on the Xombie vehicle conducted in December 2014.

  3. DABI: A data base for image analysis with nondeterministic inference capability

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.; Cunningham, R.

    1976-01-01

    A description is given of the data base used in the perception subsystem of the Mars robot vehicle prototype being implemented at the Jet Propulsion Laboratory. This data base contains two types of information. The first is generic (uninstantiated, abstract) information that specifies the general rules of perception of objects in the expected environments. The second kind of information is a specific (instantiated) description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge can be used by the approximate reasoning subsystem to obtain information on the specific structures which is not directly measurable by the sensory instruments. Raw measurements are input either from the sensory instruments or a human operator using a CRT or a TTY.

  4. Professional Development in a Reform Context: Understanding the Design and Enactment of Learning Experiences Created by Teacher Leaders for Science Educators

    NASA Astrophysics Data System (ADS)

    Shafer, Laura

    Teacher in-service learning about education reforms like NGSS often begins with professional development (PD) as a foundational component (Supovitz & Turner, 2000). Teacher Leaders, who are early implementers of education reform, are positioned to play a contributing role in the design of PD. As early implementers of reforms, Teacher Leaders are responsible for interpreting the purposes of reform, enacting reforms with fidelity to meet those intended goals, and are positioned to share their expertise with others. However, Teacher Leader knowledge is rarely accessed as a resource for the design of professional development programs. This study is unique in that I analyze the knowledge that Teacher Leaders, who are positioned as developers of PD, bring to the design of PD around science education reform. I use the extended interconnected model of professional growth (Clarke & Hollingsworth, 2002; Coenders & Terlouw, 2015) to analyze the knowledge pathways Teacher Leaders access as PD developers. I found that Teacher Leaders accessed knowledge pathways that cycled through their personal domain, domain of practice and domain of consequence. Additionally, the findings indicated that when Teacher Leaders did not have access to these knowledge domains they were unwilling to continue with PD design. These findings point to how Teacher Leaders prioritize their classroom experience to ground PD design and use their perceptions of student learning outcomes as an indicator of the success of the reform. Because professional development (PD) is viewed as an important resource for influencing teachers' knowledge and beliefs around the implementation of education reform efforts (Garet et al., 2001; Supovitz & Turner, 2000), I offer that Teacher Leaders, who are early implementers of reform measures, can contribute to the professional development system.
The second part of this dissertation documents the instantiation of the knowledge of Teacher Leaders, who are positioned as designers and facilitators of PD. I examine the extent to which Teacher Leader knowledge is instantiated into specific resources and tasks during PD specifically for the Next Generation Science Standards (NGSS). The findings indicate that Teacher Leaders' knowledge is instantiated in tasks that promote and facilitate alignment of Teacher Leader goals for NGSS science practices-based instruction, which are framed around student learning outcomes. I offer a number of ways in which these findings can help educators and PD developers to better structure activities that present an alternative vision for science education that also provides the needed resources to shape how classroom tasks are designed and managed in ways that attend to and build on the practical knowledge of Teacher Leaders. The third part of this dissertation addresses the role Teacher Leaders play in this reform context with respect to their contributions to the professional development system. Based on the analyses of the Teacher Leaders in this study, I claim Teacher Leaders are essential contributors to the professional development system that extends beyond their typical role of participant. I argue that Teacher Leaders bring special expertise to the role of designers and facilitators of PD programs, and to the role of ambassadors for professional learning communities in a reform context. Because Teacher Leaders have a broader influence on the professional development system as pictured here, the Teacher Leaders in this study represent an essential piece of the reform puzzle.

  5. Building a model for disease classification integration in oncology, an approach based on the national cancer institute thesaurus.

    PubMed

    Jouhet, Vianney; Mougin, Fleur; Bréchat, Bérénice; Thiessard, Frantz

    2017-02-07

    Identifying incident cancer cases within a population remains essential for scientific research in oncology. Data produced within electronic health records can be useful for this purpose. Due to the multiplicity of providers, heterogeneous terminologies such as ICD-10 and ICD-O-3 are used for recording oncology diagnoses. To enable disease identification based on these diagnoses, there is a need for integrating disease classifications in oncology. Our aim was to build a model integrating concepts involved in two disease classifications, namely ICD-10 (diagnosis) and ICD-O-3 (topography and morphology), despite their structural heterogeneity. Based on the NCIt, a "derivative" model for linking diagnosis and topography-morphology combinations was defined and built. ICD-O-3 and ICD-10 codes were then used to instantiate classes of the "derivative" model. Links between terminologies obtained through the model were then compared to mappings provided by the Surveillance, Epidemiology, and End Results (SEER) program. The model integrated 42% of neoplasm ICD-10 codes (excluding metastasis), 98% of ICD-O-3 morphology codes (excluding metastasis) and 68% of ICD-O-3 topography codes. For every code instantiating at least one class in the "derivative" model, comparison with the SEER mappings reveals that all mappings were available in the model as links between the corresponding codes. We have proposed a method to automatically build a model for integrating ICD-10 and ICD-O-3 based on the NCIt. The resulting "derivative" model is a machine understandable resource that enables an integrated view of these heterogeneous terminologies. The NCIt structure and the available relationships can help to bridge disease classifications taking into account their structural and granular heterogeneities.
However, (i) inconsistencies exist within the NCIt leading to misclassifications in the "derivative" model, (ii) the "derivative" model only integrates a part of ICD-10 and ICD-O-3. The NCIt is not sufficient for integration purpose and further work based on other termino-ontological resources is needed in order to enrich the model and avoid identified inconsistencies.

  6. Using articulation and inscription as catalysts for reflection: Design principles for reflective inquiry

    NASA Astrophysics Data System (ADS)

    Loh, Ben Tun-Bin

    2003-07-01

    The demand for students to engage in complex student-driven and information-rich inquiry investigations poses challenges to existing learning environments. Students are not familiar with this style of work, and lack the skills, tools, and expectations it demands, often forging blindly forward in the investigation. If students are to be successful, they need to learn to be reflective inquirers, periodically stepping back from an investigation to evaluate their work. The fundamental goal of my dissertation is to understand how to design learning environments to promote and support reflective inquiry. I have three basic research questions: how to define this mode of work, how to help students learn it, and understanding how it facilitates reflection when enacted in a classroom. I take an exploratory approach in which, through iterative cycles of design, development, and reflection, I develop principles of design for reflective inquiry, instantiate those principles in the design of a software environment, and test that software in the context of classroom work. My work contributes to the understanding of reflective inquiry in three ways: First, I define a task model that describes the kinds of operations (cognitive tasks) that students should engage in as reflective inquirers. These operations are defined in terms of two basic tasks: articulation and inscription, which serve as catalysts for externalizing student thinking as objects of and triggers for reflection. Second, I instantiate the task model in the design of software tools (the Progress Portfolio). And, through proof of concept pilot studies, I examine how the task model and tools helped students with their investigative classroom work. Finally, I take a step back from these implementations and articulate general design principles for reflective inquiry with the goal of informing the design of other reflective inquiry learning environments. 
There are three design principles: (1) Provide a designated work space for reflection activities to focus student attention on reflection. (2) Help students create and use artifacts that represent their work and their thinking as a means to create referents for reflection. (3) Support and take advantage of social processes that help students reflect on their own work.

  7. Top-Level Categories of Constitutively Organized Material Entities - Suggestions for a Formal Top-Level Ontology

    PubMed Central

    Vogt, Lars; Grobe, Peter; Quast, Björn; Bartolomaeus, Thomas

    2011-01-01

    Background Application oriented ontologies are important for reliably communicating and managing data in databases. Unfortunately, they often differ in the definitions they use and thus do not live up to their potential. This problem can be reduced when using a standardized and ontologically consistent template for the top-level categories from a top-level formal foundational ontology. This would support ontological consistency within application oriented ontologies and compatibility between them. The Basic Formal Ontology (BFO) is such a foundational ontology for the biomedical domain that has been developed following the single inheritance policy. It provides the top-level template within the Open Biological and Biomedical Ontologies Foundry. If it wants to live up to its expected role, its three top-level categories of material entity (i.e., ‘object’, ‘fiat object part’, ‘object aggregate’) must be exhaustive, i.e. every concrete material entity must instantiate exactly one of them. Methodology/Principal Findings By systematically evaluating all possible basic configurations of material building blocks we show that BFO's top-level categories of material entity are not exhaustive. We provide examples from biology and everyday life that demonstrate the necessity for two additional categories: ‘fiat object part aggregate’ and ‘object with fiat object part aggregate’. By distinguishing topological coherence, topological adherence, and metric proximity we furthermore provide a differentiation of clusters and groups as two distinct subcategories for each of the three categories of material entity aggregates, resulting in six additional subcategories of material entity. Conclusions/Significance We suggest extending BFO to incorporate two additional categories of material entity as well as two subcategories for each of the three categories of material entity aggregates. 
With these additions, BFO would exhaustively cover all top-level types of material entity that application oriented ontologies may use as templates. Our result, however, depends on the premise that all material entities are organized according to a constitutive granularity. PMID:21533043

  8. Confronting Space Debris: Strategies and Warnings from Comparable Examples Including Deepwater Horizon

    DTIC Science & Technology

    2010-01-01

    Deepwater Horizon (DH) was an ultra-deepwater, semisubmersible offshore drilling rig contracted to BP by its owner, Transocean. The rig was capable of... (Confronting Space Debris: Strategies and Warnings from Comparable Examples Including Deepwater Horizon, Dave Baiocchi)

  9. What Learning Systems do Intelligent Agents Need? Complementary Learning Systems Theory Updated.

    PubMed

    Kumaran, Dharshan; Hassabis, Demis; McClelland, James L

    2016-07-01

    We update complementary learning systems (CLS) theory, which holds that intelligent agents must possess two learning systems, instantiated in mammalians in neocortex and hippocampus. The first gradually acquires structured knowledge representations while the second quickly learns the specifics of individual experiences. We broaden the role of replay of hippocampal memories in the theory, noting that replay allows goal-dependent weighting of experience statistics. We also address recent challenges to the theory and extend it by showing that recurrent activation of hippocampal traces can support some forms of generalization and that neocortical learning can be rapid for information that is consistent with known structure. Finally, we note the relevance of the theory to the design of artificial intelligent agents, highlighting connections between neuroscience and machine learning. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Adding Spatially Correlated Noise to a Median Ionosphere

    NASA Astrophysics Data System (ADS)

    Holmes, J. M.; Egert, A. R.; Dao, E. V.; Colman, J. J.; Parris, R. T.

    2017-12-01

    We describe a process for adding spatially correlated noise to a background ionospheric model, in this case the International Reference Ionosphere (IRI). Monthly median models do a good job describing bulk features of the ionosphere in a median sense. It is well known that the ionosphere almost never actually looks like its median. For the purposes of constructing an Operational System Simulation Experiment, it may be desirable to construct an ionosphere more similar to a particular instant, hour, or day than to the monthly median. We will examine selected data from the Global Ionosphere Radio Observatory (GIRO) database and estimate the amount of variance captured by the IRI model. We will then examine spatial and temporal correlations within the residuals. This analysis will be used to construct a temporal-spatial gridded ionosphere that represents a particular instantiation of those statistics.
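    One common way to realize the kind of spatially correlated perturbation described above is to draw Gaussian noise through the Cholesky factor of a distance-based covariance matrix. This sketch is generic: the exponential covariance kernel, correlation length, variance, and the placeholder electron-density profile below are illustrative assumptions, not the GIRO-derived statistics or the IRI output used in the study.

```python
import numpy as np

def correlated_noise(grid, variance, corr_length, seed=0):
    """Draw one realization of zero-mean Gaussian noise on a 1-D grid with
    exponential spatial covariance C(d) = variance * exp(-d / corr_length).
    The Cholesky factor L of the covariance matrix maps independent normals
    to correlated ones (cov(L z) = L L^T = C)."""
    d = np.abs(grid[:, None] - grid[None, :])            # pairwise distances
    cov = variance * np.exp(-d / corr_length)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(grid)))  # jitter for stability
    rng = np.random.default_rng(seed)
    return L @ rng.standard_normal(len(grid))

# Perturb a placeholder median profile (hypothetical Gaussian-shaped layer)
grid = np.linspace(0.0, 1000.0, 101)                     # km
median_profile = 1e11 * np.exp(-((grid - 300.0) / 150.0) ** 2)
perturbed = median_profile * (1.0 + correlated_noise(grid, 0.01, 200.0))
```

    Nearby grid points receive similar perturbations while distant points are nearly independent, so the perturbed field deviates from the median coherently rather than as point-wise scatter.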

  11. Pubertal testosterone influences threat-related amygdala–orbitofrontal cortex coupling

    PubMed Central

    Forbes, Erika E.; Ladouceur, Cecile D.; Worthman, Carol M.; Olino, Thomas M.; Ryan, Neal D.; Dahl, Ronald E.

    2015-01-01

    Growing evidence indicates that normative pubertal maturation is associated with increased threat reactivity, and this developmental shift has been implicated in the increased rates of adolescent affective disorders. However, the neural mechanisms involved in this pubertal increase in threat reactivity remain unknown. Research in adults indicates that testosterone transiently decreases amygdala–orbitofrontal cortex (OFC) coupling. Consequently, we hypothesized that increased pubertal testosterone disrupts amygdala–OFC coupling, which may contribute to developmental increases in threat reactivity in some adolescents. Hypotheses were tested in a longitudinal study by examining the impact of testosterone on functional connectivity. Findings were consistent with hypotheses and advance our understanding of normative pubertal changes in neural systems instantiating affect/motivation. Finally, potential novel insights into the neurodevelopmental pathways that may contribute to adolescent vulnerability to behavioral and emotional problems are discussed. PMID:24795438

  12. Programming with models: modularity and abstraction provide powerful capabilities for systems biology

    PubMed Central

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2008-01-01

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously. PMID:18647734
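
    The modularity-and-abstraction idea can be illustrated with a small Python sketch (hypothetical reaction templates, not the authors' infrastructure): a module is a program parameterized by its components, so the same template can be instantiated with different species and composed incrementally:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Reaction:
    reactants: tuple
    products: tuple
    rate: float

def phosphorylation(kinase, substrate, k=1.0):
    """Generic template: kinase + substrate -> kinase + phospho-substrate."""
    return [Reaction((kinase, substrate), (kinase, substrate + "_P"), k)]

def cascade(kinases, substrate):
    """Compose the template incrementally into a multi-step cascade."""
    rxns, s = [], substrate
    for kin in kinases:
        rxns += phosphorylation(kin, s)
        s = s + "_P"
    return rxns

# The same template instantiated twice with different components
model = cascade(["RAF", "MEK"], "ERK")
```

    Because the template is a function rather than a fixed equation set, a library of such modules can be reused across models with different components, which is the capability the abstract argues monolithic equation systems lack.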

  13. Electro-optical co-simulation for integrated CMOS photonic circuits with VerilogA.

    PubMed

    Sorace-Agaskar, Cheryl; Leu, Jonathan; Watts, Michael R; Stojanovic, Vladimir

    2015-10-19

    We present a Cadence toolkit library written in VerilogA for simulation of electro-optical systems. We have identified and described a set of fundamental photonic components at the physical level such that characteristics of composite devices (e.g. ring modulators) are created organically, by simple instantiation of fundamental primitives. Both the amplitude and phase of optical signals as well as optical-electrical interactions are simulated. We show that the results match other simulations and analytic solutions that have previously been compared to theory for both simple devices, such as ring resonators, and more complicated devices and systems such as single-sideband modulators, WDM links and Pound-Drever-Hall locking loops. We also illustrate the capability of such a toolkit for co-simulation with electronic circuits, which is a key enabler of electro-optic system development and verification.
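
    For a flavor of the composite-device behavior such a toolkit reproduces, the through-port power transmission of an all-pass ring resonator follows from just two primitive parameters, the self-coupling r and the single-pass amplitude a. This is the standard analytic formula, sketched here in Python rather than VerilogA:

```python
import numpy as np

def ring_through_power(phi, r=0.95, a=0.95):
    """Through-port power transmission of an all-pass ring resonator.
    phi: round-trip phase; r: self-coupling coefficient; a: round-trip amplitude."""
    return (a**2 - 2*r*a*np.cos(phi) + r**2) / (1 - 2*r*a*np.cos(phi) + (r*a)**2)

# Sweep one free spectral range of round-trip phase
phi = np.linspace(-np.pi, np.pi, 1001)
T = ring_through_power(phi)
```

    At critical coupling (r = a) the transmission drops to zero on resonance (phi = 0) and recovers to near unity off resonance, the characteristic notch that emerges from the primitive coupler and waveguide-loss parameters.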

  14. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  15. Application of Ontology Technology in Health Statistic Data Analysis.

    PubMed

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistic data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of the health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. Through ontology instantiation, multi-source heterogeneous data can be integrated, enabling administrators to gain an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and for enhancement of management efficiency.

  16. Power and the objectification of social targets.

    PubMed

    Gruenfeld, Deborah H; Inesi, M Ena; Magee, Joe C; Galinsky, Adam D

    2008-07-01

    Objectification has been defined historically as a process of subjugation whereby people, like objects, are treated as means to an end. The authors hypothesized that objectification is a response to social power that involves approaching useful social targets regardless of the value of their other human qualities. Six studies found that under conditions of power, approach toward a social target was driven more by the target's usefulness, defined in terms of the perceiver's goals, than in low-power and baseline conditions. This instrumental response to power, which was linked to the presence of an active goal, was observed using multiple instantiations of power, different measures of approach, a variety of goals, and several types of instrumental and noninstrumental target attributes. Implications for research on the psychology of power, automatic goal pursuit, and self-objectification theory are discussed.

  17. Affective Neuronal Selection: The Nature of the Primordial Emotion Systems

    PubMed Central

    Toronchuk, Judith A.; Ellis, George F. R.

    2013-01-01

    Based on studies in affective neuroscience and evolutionary psychiatry, a tentative new proposal is made here as to the nature and identification of primordial emotional systems. Our model stresses phylogenetic origins of emotional systems, which we believe is necessary for a full understanding of the functions of emotions and additionally suggests that emotional organizing systems play a role in sculpting the brain during ontogeny. Nascent emotional systems thus affect cognitive development. A second proposal concerns two additions to the affective systems identified by Panksepp. We suggest there is substantial evidence for a primary emotional organizing program dealing with power, rank, dominance, and subordination which instantiates competitive and territorial behavior and is an evolutionary contributor to self-esteem in humans. A program underlying disgust reactions which originally functioned in ancient vertebrates to protect against infection and toxins is also suggested. PMID:23316177

  18. Obtaining P3P Privacy Policies for Composite Services

    PubMed Central

    Sun, Yi; Huang, Zhiqiu; Ke, Changbo

    2014-01-01

    With the development of web services technology, web services have changed from single to composite services. Privacy protection in composite services is becoming an important issue. P3P (platform for privacy preferences) is a privacy policy language which was designed for single web services. It enables service providers to express how they will deal with the privacy information of service consumers. In order to solve the problem that P3P cannot be applied to composite services directly, we propose a method to obtain P3P privacy policies for composite services. In this method, we present the definitions of Purpose, Recipient, and Retention elements as well as Optional and Required attributes for P3P policies of composite services. We also provide an instantiation to illustrate the feasibility of the method. PMID:25126609

  19. Biology, ideology, and epistemology: how do we know political attitudes are inherited and why should we care?

    PubMed

    Smith, Kevin; Alford, John R; Hatemi, Peter K; Eaves, Lindon J; Funk, Carolyn; Hibbing, John R

    2012-01-01

    Evidence that political attitudes and behavior are in part biologically and even genetically instantiated is much discussed in political science of late. Yet the classic twin design, a primary source of evidence on this matter, has been criticized for being biased toward finding genetic influence. In this article, we employ a new data source to test empirically the alternative, exclusively environmental, explanations for ideological similarities between twins. We find little support for these explanations and argue that even if we treat them as wholly correct, they provide reasons for political science to pay more rather than less attention to the biological basis of attitudes and behaviors. Our analysis suggests that the mainstream socialization paradigm for explaining attitudes and behaviors is not necessarily incorrect but is substantively incomplete.

  20. Programming with models: modularity and abstraction provide powerful capabilities for systems biology.

    PubMed

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2009-03-06

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously.

  1. An Overview of SAL

    NASA Technical Reports Server (NTRS)

    Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.

    2000-01-01

    To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PAS, and some preliminary experience of their use.

  2. Object-Oriented Query Language For Events Detection From Images Sequences

    NASA Astrophysics Data System (ADS)

    Ganea, Ion Eugen

    2015-09-01

    In this paper, a method is presented for representing events extracted from image sequences, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, the final goal of the research.

  3. Open Architecture Standard for NASA's Software-Defined Space Telecommunications Radio Systems

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.; Kacpura, Thomas J.; Hall, Charles S.; Smith, Carl R.; Liebetreu, John

    2008-01-01

    NASA is developing an architecture standard for software-defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer. This paper presents the initial Space Telecommunications Radio System (STRS) Architecture for NASA missions to provide the desired software abstraction and flexibility while minimizing the resources necessary to support the architecture.
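
    The abstraction described above can be sketched as a common interface that every compliant waveform implements, so applications and hardware can be swapped independently. This is a Python stand-in with hypothetical class and method names, not the actual STRS API:

```python
from abc import ABC, abstractmethod

class Waveform(ABC):
    """Common waveform interface; any compliant implementation is handled alike."""
    @abstractmethod
    def configure(self, params: dict) -> None: ...
    @abstractmethod
    def start(self) -> None: ...
    @abstractmethod
    def stop(self) -> None: ...

class BpskWaveform(Waveform):
    """One concrete waveform; the platform never depends on this type directly."""
    def __init__(self):
        self.running = False
        self.params = {}
    def configure(self, params):
        self.params = dict(params)
    def start(self):
        self.running = True
    def stop(self):
        self.running = False

def instantiate(cls, params):
    """Common method of instantiation and operation for any compliant waveform."""
    wf = cls()
    wf.configure(params)
    wf.start()
    return wf

radio = instantiate(BpskWaveform, {"carrier_hz": 435e6})
```

    Because `instantiate` touches only the abstract interface, a new waveform or a new hardware platform can be inserted without changing the other layer, which is the commonality the standard aims for.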

  4. Rapidly Mixing Gibbs Sampling for a Class of Factor Graphs Using Hierarchy Width

    PubMed Central

    De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher

    2016-01-01

    Gibbs sampling on factor graphs is a widely used inference technique, which often produces good empirical results. Theoretical guarantees for its performance are weak: even for tree structured graphs, the mixing time of Gibbs may be exponential in the number of variables. To help understand the behavior of Gibbs sampling, we introduce a new (hyper)graph property, called hierarchy width. We show that under suitable conditions on the weights, bounded hierarchy width ensures polynomial mixing time. Our study of hierarchy width is in part motivated by a class of factor graph templates, hierarchical templates, which have bounded hierarchy width—regardless of the data used to instantiate them. We demonstrate a rich application from natural language processing in which Gibbs sampling provably mixes rapidly and achieves accuracy exceeding that of human volunteers. PMID:27279724
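
    Gibbs sampling on a factor graph resamples each variable from its conditional distribution given the factors it touches. A minimal sketch on a toy chain of binary variables with agreement factors exp(w·[x_i = x_j]) (a hypothetical model, not the paper's hierarchical templates):

```python
import math
import random

def gibbs_chain(n, w, steps, seed=0):
    """Gibbs sampler for a chain factor graph over x_i in {0, 1}
    with pairwise factors exp(w * [x_i == x_{i+1}])."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            # Log-score of each value of x_i from the adjacent factors only
            score = [0.0, 0.0]
            for v in (0, 1):
                for j in (i - 1, i + 1):
                    if 0 <= j < n:
                        score[v] += w * (x[j] == v)
            p1 = math.exp(score[1]) / (math.exp(score[0]) + math.exp(score[1]))
            x[i] = 1 if rng.random() < p1 else 0
    return x

sample = gibbs_chain(n=10, w=2.0, steps=200)
```

    With large positive w the chain's stationary distribution concentrates on aligned configurations; the mixing-time question the paper studies is how many such sweeps are needed before samples resemble that distribution.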

  5. Mars 520-d mission simulation reveals protracted crew hypokinesis and alterations of sleep duration and timing.

    PubMed

    Basner, Mathias; Dinges, David F; Mollicone, Daniel; Ecker, Adrian; Jones, Christopher W; Hyder, Eric C; Di Antonio, Adrian; Savelev, Igor; Kan, Kevin; Goel, Namni; Morukov, Boris V; Sutton, Jeffrey P

    2013-02-12

    The success of interplanetary human spaceflight will depend on many factors, including the behavioral activity levels, sleep, and circadian timing of crews exposed to prolonged microgravity and confinement. To address the effects of the latter, we used a high-fidelity ground simulation of a Mars mission to objectively track sleep-wake dynamics in a multinational crew of six during 520 d of confined isolation. Measurements included continuous recordings of wrist actigraphy and light exposure (4.396 million min) and weekly computer-based neurobehavioral assessments (n = 888) to identify changes in the crew's activity levels, sleep quantity and quality, sleep-wake periodicity, vigilance performance, and workload throughout the record-long 17 mo of mission confinement. Actigraphy revealed that crew sedentariness increased across the mission as evident in decreased waking movement (i.e., hypokinesis) and increased sleep and rest times. Light exposure decreased during the mission. The majority of crewmembers also experienced one or more disturbances of sleep quality, vigilance deficits, or altered sleep-wake periodicity and timing, suggesting inadequate circadian entrainment. The results point to the need to identify markers of differential vulnerability to hypokinesis and sleep-wake changes during the prolonged isolation of exploration spaceflight and the need to ensure maintenance of circadian entrainment, sleep quantity and quality, and optimal activity levels during exploration missions. Therefore, successful adaptation to such missions will require crew to transit in spacecraft and live in surface habitats that instantiate aspects of Earth's geophysical signals (appropriately timed light exposure, food intake, exercise) required for temporal organization and maintenance of human behavior.

  6. Optical contact micrometer

    DOEpatents

    Jacobson, Steven D.

    2014-08-19

    Certain examples provide optical contact micrometers and methods of use. An example optical contact micrometer includes a pair of opposable lenses to receive an object and immobilize the object in a position. The example optical contact micrometer includes a pair of opposable mirrors positioned with respect to the pair of lenses to facilitate viewing of the object through the lenses. The example optical contact micrometer includes a microscope to facilitate viewing of the object through the lenses via the mirrors; and an interferometer to obtain one or more measurements of the object.

  7. Anterior cingulate cortex activity can be independent of response conflict in Stroop-like tasks.

    PubMed

    Roelofs, Ardi; van Turennout, Miranda; Coles, Michael G H

    2006-09-12

    Cognitive control includes the ability to formulate goals and plans of action and to follow these while facing distraction. Previous neuroimaging studies have shown that the presence of conflicting response alternatives in Stroop-like tasks increases activity in dorsal anterior cingulate cortex (ACC), suggesting that the ACC is involved in cognitive control. However, the exact nature of ACC function is still under debate. The prevailing conflict detection hypothesis maintains that the ACC is involved in performance monitoring. According to this view, ACC activity reflects the detection of response conflict and acts as a signal that engages regulative processes subserved by lateral prefrontal brain regions. Here, we provide evidence from functional MRI that challenges this view and favors an alternative view, according to which the ACC has a role in regulation itself. Using an arrow-word Stroop task, subjects responded to incongruent, congruent, and neutral stimuli. A critical prediction made by the conflict detection hypothesis is that ACC activity should be increased only when conflicting response alternatives are present. Our data show that ACC responses are larger for neutral than for congruent stimuli, in the absence of response conflict. This result demonstrates the engagement of the ACC in regulation itself. A computational model of Stroop-like performance instantiating a version of the regulative hypothesis is shown to account for our findings.

  8. Security and Policy for Group Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ian Foster; Carl Kesselman

    2006-07-31

    “Security and Policy for Group Collaboration” was a Collaboratory Middleware research project aimed at providing the fundamental security and policy infrastructure required to support the creation and operation of distributed, computationally enabled collaborations. The project developed infrastructure that exploits innovative new techniques to address challenging issues of scale, dynamics, distribution, and role. To reduce greatly the cost of adding new members to a collaboration, we developed and evaluated new techniques for creating and managing credentials based on public key certificates, including support for online certificate generation, online certificate repositories, and support for multiple certificate authorities. To facilitate the integration of new resources into a collaboration, we improved significantly the integration of local security environments. To make it easy to create and change the role and associated privileges of both resources and participants of collaboration, we developed community wide authorization services that provide distributed, scalable means for specifying policy. These services make it possible for the delegation of capability from the community to a specific user, class of user or resource. Finally, we instantiated our research results into a framework that makes it useable to a wide range of collaborative tools. The resulting mechanisms and software have been widely adopted within DOE projects and in many other scientific projects. The widespread adoption of our Globus Toolkit technology has provided, and continues to provide, a natural dissemination and technology transfer vehicle for our results.

  9. Design and characterization of the Large-aperture Experiment to Detect the Dark Age (LEDA) radiometer systems

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Greenhill, L. J.; Fialkov, A.; Bernardi, G.; Garsden, H.; Barsdell, B. R.; Kocz, J.; Anderson, M. M.; Bourke, S. A.; Craig, J.; Dexter, M. R.; Dowell, J.; Eastwood, M. W.; Eftekhari, T.; Ellingson, S. W.; Hallinan, G.; Hartman, J. M.; Kimberk, R.; Lazio, T. Joseph W.; Leiker, S.; MacMahon, D.; Monroe, R.; Schinzel, F.; Taylor, G. B.; Tong, E.; Werthimer, D.; Woody, D. P.

    2018-05-01

    The Large-Aperture Experiment to Detect the Dark Age (LEDA) was designed to detect the predicted O(100) mK sky-averaged absorption of the Cosmic Microwave Background by Hydrogen in the neutral pre- and intergalactic medium just after the cosmological Dark Age. The spectral signature would be associated with emergence of a diffuse Lyα background from starlight during `Cosmic Dawn'. Recently, Bowman et al. (2018) have reported detection of this predicted absorption feature, with an unexpectedly large amplitude of 530 mK, centered at 78 MHz. Verification of this result by an independent experiment, such as LEDA, is pressing. In this paper, we detail design and characterization of the LEDA radiometer systems, and a first-generation pipeline that instantiates a signal path model. Sited at the Owens Valley Radio Observatory Long Wavelength Array, LEDA systems include the station correlator, five well-separated redundant dual polarization radiometers and backend electronics. The radiometers deliver a 30-85 MHz band (16 < z < 34) and operate as part of the larger interferometric array, for purposes ultimately of in situ calibration. Here, we report on the LEDA system design, calibration approach, and progress in characterization as of January 2016. The LEDA systems are currently being modified to improve performance near 78 MHz in order to verify the purported absorption feature.

  10. 48 CFR 1845.7101-1 - Property classification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...

  11. 48 CFR 1845.7101-1 - Property classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...

  12. 48 CFR 1845.7101-1 - Property classification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...

  13. 48 CFR 1845.7101-1 - Property classification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...

  14. 48 CFR 1845.7101-1 - Property classification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... aeronautical and space programs, which are capable of stand-alone operation. Examples include research aircraft... characteristics. (ii) Examples of NASA heritage assets include buildings and structures designated as National...., it no longer provides service to NASA operations). Examples of obsolete property are items in...

  15. Persistent Infrared Spectral Hole-Burning for Impurity Vibrational Modes in Solids.

    DTIC Science & Technology

    1986-09-30

    ...observed consists of infrared vibrational transitions of impurity molecules in solids. Examples include 1,2-difluoroethane in rare gas matrices, perrhenate ions in alkali halide crystals, and most recently, cyanide and nitrite...

  16. Examples as an Instructional Tool in Mathematics and Science Classrooms: Teachers' Perceptions and Attitudes

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Cribbs, Jennifer

    2017-01-01

    This study examined mathematics and science teachers' perceptions and use of four types of examples, including typical textbook examples (standard worked examples) and erroneous worked examples in the written form as well as mastery modelling examples and peer modelling examples involving the verbalization of the problem-solving process. Data…

  17. An alternative database approach for management of SNOMED CT and improved patient data queries.

    PubMed

    Campbell, W Scott; Pedersen, Jay; McClay, James C; Rao, Praveen; Bastola, Dhundy; Campbell, James R

    2015-10-01

    SNOMED CT is the international lingua franca of terminologies for human health. Based in Description Logics (DL), the terminology enables data queries that incorporate inferences between data elements, as well as those relationships that are explicitly stated. However, the ontologic and polyhierarchical nature of the SNOMED CT concept model makes it difficult to implement in its entirety within electronic health record systems that largely employ object-oriented or relational database architectures. The result is a reduction of data richness, limitations of query capability and increased systems overhead. The hypothesis of this research was that a graph database (graph DB) architecture using SNOMED CT as the basis for the data model and subsequently modeling patient data upon the semantic core of SNOMED CT could exploit the full value of the terminology to enrich and support advanced data querying capability of patient data sets. The hypothesis was tested by instantiating a graph DB with the fully classified SNOMED CT concept model. The graph DB instance was tested for integrity by calculating the transitive closure table for the SNOMED CT hierarchy and comparing the results with transitive closure tables created using current, validated methods. The graph DB was then populated with 461,171 anonymized patient record fragments and over 2.1 million associated SNOMED CT clinical findings. Queries, including concept negation and disjunction, were then run against the graph database and an enterprise Oracle relational database (RDBMS) of the same patient data sets. The graph DB was then populated with laboratory data encoded using LOINC, as well as medication data encoded with RxNorm, and complex queries were performed using LOINC, RxNorm and SNOMED CT to identify uniquely described patient populations. A graph database instance was successfully created for two international releases of SNOMED CT and two US SNOMED CT editions. Transitive closure tables and descriptive statistics generated using the graph database were identical to those using validated methods. Patient queries produced identical patient count results to the Oracle RDBMS with comparable times. Database queries involving defining attributes of SNOMED CT concepts were possible with the graph DB. The same queries could not be directly performed with the Oracle RDBMS representation of the patient data and required the creation and use of external terminology services. Further, queries of undefined depth were successful in identifying unknown relationships between patient cohorts. The results of this study supported the hypothesis that a patient database built upon and around the semantic model of SNOMED CT was possible. The model supported queries that leveraged all aspects of the SNOMED CT logical model to produce clinically relevant query results. Logical disjunction and negation queries were possible using the data model, as well as queries that extended beyond the structural IS_A hierarchy of SNOMED CT to include queries that employed defining attribute-values of SNOMED CT concepts as search parameters. As medical terminologies, such as SNOMED CT, continue to expand, they will become more complex and model consistency will be more difficult to assure. Simultaneously, consumers of data will increasingly demand improvements to query functionality to accommodate additional granularity of clinical concepts without sacrificing speed. This new line of research provides an alternative approach to instantiating and querying patient data represented using advanced computable clinical terminologies. Copyright © 2015 Elsevier Inc. All rights reserved.
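
    The transitive-closure integrity check described above can be sketched in a few lines of Python, computing every ancestor reachable through the IS_A hierarchy (illustrative concept names, not real SNOMED CT identifiers):

```python
def transitive_closure(is_a):
    """is_a maps each concept to its set of direct parents (a DAG);
    returns a map from concept to the set of all its ancestors."""
    closure = {}
    def ancestors(c):
        if c not in closure:
            closure[c] = set()
            for p in is_a.get(c, ()):
                closure[c].add(p)           # direct parent
                closure[c] |= ancestors(p)  # plus everything above it
        return closure[c]
    for c in is_a:
        ancestors(c)
    return closure

# Toy IS_A fragment (hypothetical, not SNOMED CT codes)
is_a = {
    "bacterial_pneumonia": {"pneumonia", "bacterial_infection"},
    "pneumonia": {"lung_disease"},
    "bacterial_infection": {"infectious_disease"},
}
closure = transitive_closure(is_a)
```

    In a graph database this closure is implicit in path traversal, while an RDBMS must materialize it as a table; comparing a freshly computed table against a validated one is the integrity check the study used.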

  18. The soft embodiment of culture: camera angles and motion through time and space.

    PubMed

    Leung, Angela K-y; Cohen, Dov

    2007-09-01

    Cultural assumptions about one's relation to others and one's place in the world can be literally embodied in the way one cognitively maps out one's position and motion in time and space. In three experiments, we examined the psychological perspective that Asian American and Euro-American participants embodied as they both comprehended and produced narratives and mapped out metaphors of time and space. In social situations, Euro-American participants were more likely to embody their own perspective and a sense of their own motion (rather than those of a friend), whereas Asian American participants were more likely to embody a friend's perspective and sense of motion (rather than their own). We discuss how these psychological perspectives represent the soft embodiment of culture by implicitly instantiating cultural injunctions (a) to think about how you look to others and to harmonize with them or (b) to know yourself, trust yourself, and act with confidence.

  19. Anticipatory control through associative learning of subliminal relations: invisible may be better than visible.

    PubMed

    Farooqui, Ausaf A; Manly, Tom

    2015-03-01

    We showed that anticipatory cognitive control could be unconsciously instantiated through subliminal cues that predicted enhanced future control needs. In task-switching experiments, one of three subliminal cues preceded each trial. Participants had no conscious experience or knowledge of these cues, but their performance was significantly improved on switch trials after cues that predicted task switches (but not particular tasks). This utilization of subliminal information was flexible and adapted to a change in cues predicting task switches and occurred only when switch trials were difficult and effortful. When cues were consciously visible, participants were unable to discern their relevance and could not use them to enhance switch performance. Our results show that unconscious cognition can implicitly use subliminal information in a goal-directed manner for anticipatory control, and they also suggest that subliminal representations may be more conducive to certain forms of associative learning. © The Author(s) 2015.

  20. Temporal coding in a silicon network of integrate-and-fire neurons.

    PubMed

    Liu, Shih-Chii; Douglas, Rodney

    2004-09-01

    Spatio-temporal processing of spike trains by neuronal networks depends on a variety of mechanisms distributed across synapses, dendrites, and somata. In natural systems, the spike trains and the processing mechanisms cohere through their common physical instantiation. This coherence is lost when the natural system is encoded for simulation on a general purpose computer. By contrast, analog VLSI circuits are, like neurons, inherently related by their real-time physics, and so could provide a useful substrate for exploring neuron-like event-based processing. Here, we describe a hybrid analog-digital VLSI chip comprising a set of integrate-and-fire neurons and short-term dynamical synapses that can be configured into simple network architectures with some properties of neocortical neuronal circuits. We show that, despite considerable fabrication variance in the properties of individual neurons, the chip offers a viable substrate for exploring real-time spike-based processing in networks of neurons.
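
    The integrate-and-fire dynamics the chip implements in silicon can be caricatured in a few lines of simulation (a software sketch with arbitrary parameters, not the chip's analog circuit behavior):

```python
def lif_spikes(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron; return spike times (in steps)."""
    v, spikes = 0.0, []
    for t, i_in in enumerate(input_current):
        v += dt * (-v / tau + i_in)   # leaky integration of input current
        if v >= v_thresh:             # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset               # reset membrane potential
    return spikes

# Constant drive produces a regular spike train
spikes = lif_spikes([0.1] * 100)
print(spikes)  # [13, 27, 41, 55, 69, 83, 97]
```

    Fabrication variance on the chip corresponds here to each neuron getting slightly different tau and threshold values.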

  1. Life politics, nature and the state: Giddens' sociological theory and The Politics of Climate Change.

    PubMed

    Thorpe, Charles; Jacobson, Brynna

    2013-03-01

    Anthony Giddens' The Politics of Climate Change represents a significant shift in the way in which he addresses ecological politics. In this book, he rejects the relevance of environmentalism and demarcates climate-change policy from life politics. Giddens addresses climate change in the technocratic mode of simple rather than reflexive modernization. However, Giddens' earlier sociological theory provides the basis for a more reflexive understanding of climate change. Climate change instantiates how, in high modernity, the existential contradiction of the human relationship with nature returns in new form, expressed in life politics and entangled with the structural contradictions of the capitalist state. The interlinking of existential and structural contradiction is manifested in the tension between life politics and the capitalist nation-state. This tension is key for understanding the failures so far of policy responses to climate change. © London School of Economics and Political Science 2013.

  2. Children's understanding of the costs and rewards underlying rational action.

    PubMed

    Jara-Ettinger, Julian; Gweon, Hyowon; Tenenbaum, Joshua B; Schulz, Laura E

    2015-07-01

    Humans explain and predict other agents' behavior using mental state concepts, such as beliefs and desires. Computational and developmental evidence suggest that such inferences are enabled by a principle of rational action: the expectation that agents act efficiently, within situational constraints, to achieve their goals. Here we propose that the expectation of rational action is instantiated by a naïve utility calculus sensitive to both agent-constant and agent-specific aspects of costs and rewards associated with actions. In four experiments, we show that, given an agent's choices, children (range: 5-6 year olds; N=96) can infer unobservable aspects of costs (differences in agents' competence) from information about subjective differences in rewards (differences in agents' preferences) and vice versa. Moreover, children can design informative experiments on both objects and agents to infer unobservable constraints on agents' actions. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Squish: Near-Optimal Compression for Archival of Relational Datasets

    PubMed Central

    Gao, Yihan; Parameswaran, Aditya

    2017-01-01

    Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop Squish, a system that uses a combination of Bayesian Networks and Arithmetic Coding to capture multiple kinds of dependencies among attributes and achieve near-entropy compression rate. Squish also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: Squish achieves a reduction of over 50% in storage size relative to systems developed in prior work on a variety of real datasets. PMID:28180028
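
    The core observation, that generic compressors ignore table structure, is easy to demonstrate: merely serializing a toy relation column-by-column instead of row-by-row already exposes redundancy to a generic compressor, and Squish's Bayesian-network modeling of inter-attribute dependencies goes much further. A minimal illustration with hypothetical data (not Squish itself):

```python
import zlib

# Toy relation: a unique id column plus highly repetitive attributes
rows = [(f"{i:04d}", "Springfield", "Illinois", "USA") for i in range(2000)]

# Row-major serialization: what gzip-style tools see in a CSV dump
row_major = "\n".join(",".join(r) for r in rows).encode()

# Column-major serialization: grouping each attribute concentrates its redundancy
col_major = "\n".join("\n".join(col) for col in zip(*rows)).encode()

row_size = len(zlib.compress(row_major))
col_size = len(zlib.compress(col_major))
print(row_size, col_size)  # column-major compresses smaller
```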

  4. An Ontology-based Context-aware System for Smart Homes: E-care@home.

    PubMed

    Alirezaie, Marjan; Renoux, Jennifer; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Tsiftes, Nicolas; Voigt, Thiemo; Loutfi, Amy

    2017-07-06

    Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility to age at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home.

  5. Network Security via Biometric Recognition of Patterns of Gene Expression

    NASA Technical Reports Server (NTRS)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adapted for use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are also being undermined by social engineering and substandard implementations by IT (Information Technology) organizations. Molecular biology can provide countermeasures to these weak points in current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time assays of gene expression products.

  6. Genomics-Based Security Protocols: From Plaintext to Cipherprotein

    NASA Technical Reports Server (NTRS)

    Shaw, Harry; Hussein, Sayed; Helgert, Hermann

    2011-01-01

    The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.

  8. On the interaction of deaffrication and consonant harmony*

    PubMed Central

    Dinnsen, Daniel A.; Gierut, Judith A.; Morrisette, Michele L.; Green, Christopher R.; Farris-Trimble, Ashley W.

    2010-01-01

    Error patterns in children’s phonological development are often described as simplifying processes that can interact with one another with different consequences. Some interactions limit the applicability of an error pattern, and others extend it to more words. Theories predict that error patterns interact to their full potential. While specific interactions have been documented for certain pairs of processes, no developmental study has shown that the range of typologically predicted interactions occurs for those processes. To determine whether this anomaly is an accidental gap or a systematic peculiarity of particular error patterns, two commonly occurring processes were considered, namely Deaffrication and Consonant Harmony. Results are reported from a cross-sectional and longitudinal study of 12 children (age 3;0 – 5;0) with functional phonological delays. Three interaction types were attested to varying degrees. The longitudinal results further instantiated the typology and revealed a characteristic trajectory of change. Implications of these findings are explored. PMID:20513256

  9. Multiple Embedded Processors for Fault-Tolerant Computing

    NASA Technical Reports Server (NTRS)

    Bolotin, Gary; Watson, Robert; Katanyoutanant, Sunant; Burke, Gary; Wang, Mandy

    2005-01-01

    A fault-tolerant computer architecture has been conceived in an effort to reduce vulnerability to single-event upsets (spurious bit flips caused by impingement of energetic ionizing particles or photons). As in some prior fault-tolerant architectures, the redundancy needed for fault tolerance is obtained by use of multiple processors in one computer. Unlike prior architectures, the multiple processors are embedded in a single field-programmable gate array (FPGA). What makes this new approach practical is the recent commercial availability of FPGAs that are capable of having multiple embedded processors. A working prototype (see figure) consists of two embedded IBM PowerPC 405 processor cores and a comparator built on a Xilinx Virtex-II Pro FPGA. This relatively simple instantiation of the architecture implements an error-detection scheme. A planned future version, incorporating four processors and two comparators, would correct some errors in addition to detecting them.
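
    The distinction between the prototype's error detection (two cores plus a comparator) and error correction can be sketched abstractly. The majority voting shown here is classic triple modular redundancy, given for contrast; it is not necessarily the four-processor, two-comparator design the planned version describes:

```python
def duplex_detect(a, b):
    """Two redundant results + comparator: detects a fault but cannot
    tell which copy is bad, so the computation must be retried."""
    return a == b  # False -> single-event upset detected

def tmr_correct(a, b, c):
    """Triple modular redundancy: majority vote masks a single faulty result."""
    return a if a in (b, c) else b

# A single-event upset flips one core's result from 42 to 43
detected = duplex_detect(42, 43)     # False: fault detected, not located
corrected = tmr_correct(42, 43, 42)  # 42: majority vote masks the flip
print(detected, corrected)
```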

  10. Pubertal testosterone influences threat-related amygdala-orbitofrontal cortex coupling.

    PubMed

    Spielberg, Jeffrey M; Forbes, Erika E; Ladouceur, Cecile D; Worthman, Carol M; Olino, Thomas M; Ryan, Neal D; Dahl, Ronald E

    2015-03-01

    Growing evidence indicates that normative pubertal maturation is associated with increased threat reactivity, and this developmental shift has been implicated in the increased rates of adolescent affective disorders. However, the neural mechanisms involved in this pubertal increase in threat reactivity remain unknown. Research in adults indicates that testosterone transiently decreases amygdala-orbitofrontal cortex (OFC) coupling. Consequently, we hypothesized that increased pubertal testosterone disrupts amygdala-OFC coupling, which may contribute to developmental increases in threat reactivity in some adolescents. Hypotheses were tested in a longitudinal study by examining the impact of testosterone on functional connectivity. Findings were consistent with hypotheses and advance our understanding of normative pubertal changes in neural systems instantiating affect/motivation. Finally, potential novel insights into the neurodevelopmental pathways that may contribute to adolescent vulnerability to behavioral and emotional problems are discussed. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a Parallel Digital Forensics infrastructure, which is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets expected to keep growing significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics.
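
    Much forensic work, such as hashing evidence fragments for integrity and deduplication, is embarrassingly parallel, which is what makes such an infrastructure worthwhile. A toy sketch of fanning that work across workers (illustrative only, not the project's tools):

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def parallel_hashes(blobs, workers=4):
    """Compute SHA-256 digests of many evidence blobs concurrently --
    the kind of independent per-item work a forensics cluster scales out."""
    def digest(blob):
        return hashlib.sha256(blob).hexdigest()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(digest, blobs))  # results keep input order

# Hypothetical disk-image fragments
blobs = [b"disk-image-chunk-%d" % i for i in range(8)]
digests = parallel_hashes(blobs)
print(len(digests))
```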

  12. Forecasting of construction and demolition waste in Brazil.

    PubMed

    Paz, Diogo Hf; Lafayette, Kalinny Pv

    2016-08-01

    The objective of this article is to develop a computerised tool (software) that facilitates the analysis of strategies for waste management on construction sites through the use of indicators of construction and demolition waste generation. The development involved the following steps: knowledge acquisition, structuring the system, coding and system evaluation. The knowledge-acquisition step provides the groundwork for representing that knowledge through models. The system-structuring step formalises the acquired knowledge for development of the system in two stages: construction of the conceptual model and subsequent instantiation of that model. The coding step implements the conceptual model as a computational (digital) model. The results showed that the system is useful and applicable on construction sites, helping to improve the quality of waste management and creating a database that will support new research. © The Author(s) 2016.

  13. Social Identification in Sports Teams: The Role of Personal, Social, and Collective Identity Motives.

    PubMed

    Thomas, William E; Brown, Rupert; Easterbrook, Matthew J; Vignoles, Vivian L; Manzi, Claudia; D'Angelo, Chiara; Holt, Jeremy J

    2017-04-01

    Based on motivated identity construction theory (MICT; Vignoles, 2011), we offer an integrative approach examining the combined roles of six identity motives (self-esteem, distinctiveness, belonging, meaning, continuity, and efficacy) instantiated at three different motivational levels (personal, social, and collective identity) as predictors of group identification. These identity processes were investigated among 369 members of 45 sports teams from England and Italy in a longitudinal study over 6 months with four time points. Multilevel change modeling and cross-lagged analyses showed that satisfaction of four personal identity motives (individuals' personal feelings of self-esteem, distinctiveness, meaning, and efficacy derived from team membership), three social identity motives (individuals' feelings that the team identity carries a sense of belonging, meaning, and continuity), and one collective identity motive (a shared belief in group distinctiveness) significantly predicted group identification. Motivational processes underlying group identification are complex, multilayered, and not reducible to personal needs.

  14. From action to abstraction: Using the hands to learn math

    PubMed Central

    Novack, Miriam A.; Congdon, Eliza L.; Hemani-Lopez, Naureen; Goldin-Meadow, Susan

    2014-01-01

    Previous research has shown that children benefit from gesturing during math instruction. Here we ask whether gesturing promotes learning because it is itself a physical action, or because it uses physical action to represent abstract ideas. To address this question, we taught third-grade children a strategy for solving mathematical equivalence problems that was instantiated in one of three ways: (1) in the physical action children performed on objects, (2) in a concrete gesture miming that action, or (3) in an abstract gesture. All three types of hand movements helped children learn how to solve the problems on which they were trained. However, only gesture led to success on problems that required generalizing the knowledge gained. The results provide the first evidence that gesture promotes transfer of knowledge better than action, and suggest that the beneficial effects gesture has on learning may reside in the features that differentiate it from action. PMID:24503873

  15. Phenomenal and access consciousness in olfaction.

    PubMed

    Stevenson, Richard J

    2009-12-01

    Contemporary literature on consciousness, with some exceptions, rarely considers the olfactory system. In this article the characteristics of olfactory consciousness, viewed from the standpoint of the phenomenal (P)/access (A) distinction, are examined relative to the major senses. The review details several qualitative differences in both olfactory P consciousness (shifts in the felt location, universal synesthesia-like and affect-rich experiences, and misperceptions) and A consciousness (recovery from habituation, capacity for conscious processing, access to semantic and episodic memory, learning, attention, and in the serial-unitary nature of olfactory percepts). The basis for these differences is argued to arise from the functions that the olfactory system performs and from the unique neural architecture needed to instantiate them. These data suggest, at a minimum, that P and A consciousness are uniquely configured in olfaction and an argument can be made that the P and A distinction may not hold for this sensory system.

  16. Boudot's Range-Bounded Commitment Scheme Revisited

    NASA Astrophysics Data System (ADS)

    Cao, Zhengjun; Liu, Lihua

    Checking whether a committed integer lies in a specific interval has many cryptographic applications. In Eurocrypt'98, Chan et al. proposed an instantiation (the CFT Proof). Building on CFT, Boudot presented a popular range-bounded commitment scheme at Eurocrypt 2000. Both the CFT Proof and the Boudot Proof are based on the encryption E(x, r)=g^x h^r mod n, where n is an RSA modulus whose factorization is unknown to the prover. Unlike the usual practice, they did not use a single base, which increases cost. In this paper, we show that it suffices to adopt a single base: the cost of the modified Boudot Proof is about half that of the original scheme. Moreover, the key restriction in the original scheme, that neither the discrete logarithm of g in base h nor the discrete logarithm of h in base g may be known to the prover, a potential weakness of the Boudot Proof, is removed entirely.
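
    The commitment under discussion is easy to state concretely. A toy sketch with insecure, made-up parameters (a real scheme needs a large RSA modulus whose factorization the prover does not know):

```python
# Toy parameters only: n = 11 * 229; real schemes use a large RSA modulus
# whose factorization is hidden from the prover.
n = 2519
g, h = 2, 7

def commit(x, r):
    """Commitment E(x, r) = g^x * h^r mod n as in the CFT/Boudot construction."""
    return (pow(g, x, n) * pow(h, r, n)) % n

# The random blinding r hides x: same value, different randomness,
# different-looking commitments.
c1 = commit(42, 1001)
c2 = commit(42, 1002)
print(c1, c2, c1 != c2)
```

    The paper's point concerns the two bases g and h: the authors show a single base suffices, roughly halving the proof's cost.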

  17. Threaded cognition: an integrated theory of concurrent multitasking.

    PubMed

    Salvucci, Dario D; Taatgen, Niels A

    2008-01-01

    The authors propose the idea of threaded cognition, an integrated theory of concurrent multitasking--that is, performing 2 or more tasks at once. Threaded cognition posits that streams of thought can be represented as threads of processing coordinated by a serial procedural resource and executed across other available resources (e.g., perceptual and motor resources). The theory specifies a parsimonious mechanism that allows for concurrent execution, resource acquisition, and resolution of resource conflicts, without the need for specialized executive processes. By instantiating this mechanism as a computational model, threaded cognition provides explicit predictions of how multitasking behavior can result in interference, or lack thereof, for a given set of tasks. The authors illustrate the theory in model simulations of several representative domains ranging from simple laboratory tasks such as dual-choice tasks to complex real-world domains such as driving and driver distraction. (c) 2008 APA, all rights reserved
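
    The central mechanism, independent task threads interleaved over a single serial procedural resource, can be caricatured with a round-robin scheduler (a loose sketch, not the authors' computational model):

```python
from collections import deque

def run(threads):
    """Interleave named task 'threads' over one serial procedural resource:
    each cycle, the resource executes one step of the next ready thread."""
    queue = deque(threads.items())
    trace = []
    while queue:
        name, steps = queue.popleft()
        trace.append((name, steps.pop(0)))  # serial resource fires one step
        if steps:
            queue.append((name, steps))     # re-queue unfinished thread
    return trace

# Hypothetical dual task: dialing a phone while driving
trace = run({"dial": ["recall", "press"], "drive": ["perceive", "steer"]})
print(trace)
```

    Interference in the theory arises when threads contend for the same resource; in this caricature, contention shows up as each thread waiting its turn on the serial resource.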

  18. Towards a multi-level approach to the emergence of meaning processes in living systems.

    PubMed

    Queiroz, João; El-Hani, Charbel Niño

    2006-09-01

    Any description of the emergence and evolution of different types of meaning processes (semiosis, sensu C.S. Peirce) in living systems must be supported by a theoretical framework which makes it possible to understand the nature and dynamics of such processes. Here we propose that the emergence of semiosis of different kinds can be understood as resulting from fundamental interactions in a triadically-organized hierarchical process. To grasp these interactions, we develop a model grounded on Stanley Salthe's hierarchical structuralism. This model can be applied to establish, in a general sense, a set of theoretical constraints for explaining the instantiation of different kinds of meaning processes (iconic, indexical, symbolic) in semiotic systems. We use it to model a semiotic process in the immune system, namely, B-cell activation, in order to offer insights into the heuristic role it can play in the development of explanations for specific semiotic processes.

  19. The Challenge of Translation in Social Neuroscience: A Review of Oxytocin, Vasopressin, and Affiliative Behavior

    PubMed Central

    Insel, Thomas R.

    2010-01-01

    Social neuroscience is rapidly exploring the complex territory between perception and action where recognition, value, and meaning are instantiated. This review follows the trail of research on oxytocin and vasopressin as an exemplar of one path for exploring the “dark matter” of social neuroscience. Studies across vertebrate species suggest that these neuropeptides are important for social cognition, with gender and steroid-dependent effects. Comparative research in voles yields a model based on inter-species and intra-species variation of the geography of oxytocin receptors and vasopressin V1a receptors in the forebrain. Highly affiliative species have receptors in brain circuits related to reward or reinforcement. The neuroanatomical distribution of these receptors may be guided by variations in the regulatory regions of their respective genes. This review describes the promises and problems of extrapolating these findings to human social cognition, with specific reference to the social deficits of autism. PMID:20346754

  20. SDN-controlled topology-reconfigurable optical mobile fronthaul architecture for bidirectional CoMP and low latency inter-cell D2D in the 5G mobile era.

    PubMed

    Cvijetic, Neda; Tanaka, Akihiro; Kanonakis, Konstantinos; Wang, Ting

    2014-08-25

    We demonstrate the first SDN-controlled optical topology-reconfigurable mobile fronthaul (MFH) architecture for bidirectional coordinated multipoint (CoMP) and low latency inter-cell device-to-device (D2D) connectivity in the 5G mobile networking era. SDN-based OpenFlow control is used to dynamically instantiate the CoMP and inter-cell D2D features as match/action combinations in control plane flow tables of software-defined optical and electrical switching elements. Dynamic re-configurability is thereby introduced into the optical MFH topology, while maintaining back-compatibility with legacy fiber deployments. 10 Gb/s peak rates with <7 μs back-to-back transmission latency and 29.6 dB total power budget are experimentally demonstrated, confirming the attractiveness of the new approach for optical MFH of future 5G mobile systems.
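
    The match/action flow-table idea through which the SDN controller instantiates the CoMP and D2D features can be sketched as follows (field names and action strings are invented for illustration, not drawn from the paper):

```python
# Minimal sketch of OpenFlow-style match/action control.
# Entries are checked in priority order; the empty match is a wildcard.
flow_table = [
    ({"dst_cell": "cell-B", "type": "D2D"}, "forward:direct-inter-cell-path"),
    ({"type": "CoMP"}, "duplicate:coordinated-basestations"),
    ({}, "forward:default-backhaul"),
]

def apply_flow(packet):
    for match, action in flow_table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "drop"  # unreachable here thanks to the wildcard fallback

print(apply_flow({"type": "D2D", "dst_cell": "cell-B"}))
```

    Reconfiguring the optical topology then amounts to the controller rewriting these tables in the software-defined switching elements.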

  1. A neural circuit mechanism for regulating vocal variability during song learning in zebra finches.

    PubMed

    Garst-Orozco, Jonathan; Babadi, Baktash; Ölveczky, Bence P

    2014-12-15

    Motor skill learning is characterized by improved performance and reduced motor variability. The neural mechanisms that couple skill level and variability, however, are not known. The zebra finch, a songbird, presents a unique opportunity to address this question because production of learned song and induction of vocal variability are instantiated in distinct circuits that converge on a motor cortex analogue controlling vocal output. To probe the interplay between learning and variability, we made intracellular recordings from neurons in this area, characterizing how their inputs from the functionally distinct pathways change throughout song development. We found that inputs that drive stereotyped song-patterns are strengthened and pruned, while inputs that induce variability remain unchanged. A simple network model showed that strengthening and pruning of action-specific connections reduces the sensitivity of motor control circuits to variable input and neural 'noise'. This identifies a simple and general mechanism for learning-related regulation of motor variability.

  2. Interactive autonomy and robotic skills

    NASA Technical Reports Server (NTRS)

    Kellner, A.; Maediger, B.

    1994-01-01

    Current concepts of robot-supported operations for space laboratories (payload servicing, inspection, repair, and ORU exchange) are mainly based on the concept of 'interactive autonomy' which implies autonomous behavior of the robot according to predefined timelines, predefined sequences of elementary robot operations and within predefined world models supplying geometrical and other information for parameter instantiation on the one hand, and the ability to override and change the predefined course of activities by human intervention on the other hand. Although in principle a very powerful and useful concept, in practice the confinement of the robot to the abstract world models and predefined activities appears to reduce the robot's stability within real world uncertainties and its applicability to non-predefined parts of the world, calling for frequent corrective interaction by the operator, which in itself may be tedious and time-consuming. Methods are presented to improve this situation by incorporating 'robotic skills' into the concept of interactive autonomy.

  3. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    NASA Astrophysics Data System (ADS)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.
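
    A wellness rule of the kind described, an activity conditional on season and weather, can be mimicked in a few lines (hypothetical facts and rules; the actual system expresses these in Prolog and N3 and translates via RuleML/XML):

```python
# Hypothetical profile rules in the spirit of WellnessRules:
# suggest an activity when all of its conditions hold.
rules = [
    {"activity": "running", "if": {"season": "summer", "weather": "sunny"}},
    {"activity": "skating", "if": {"season": "winter"}},
]

def suggest(facts):
    """Return activities whose conditions are all satisfied by the facts."""
    return [r["activity"] for r in rules
            if all(facts.get(k) == v for k, v in r["if"].items())]

print(suggest({"season": "summer", "weather": "sunny"}))  # ['running']
```

    Rule Responder's job in the study is precisely to translate and distribute such rule evaluations across participants' heterogeneous engines.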

  4. Decision making in recurrent neuronal circuits.

    PubMed

    Wang, Xiao-Jing

    2008-10-23

    Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.
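
    The flavor of such a circuit, gradual evidence accumulation with competition until one alternative wins, can be caricatured by a noisy race between two mutually inhibiting accumulators (a minimal abstraction with arbitrary parameters, not the biophysical attractor network itself):

```python
import random

def decide(evidence_a, evidence_b, threshold=50.0, inhibition=0.1,
           noise=1.0, seed=0):
    """Two noisy accumulators with mutual inhibition race to threshold;
    the first to cross wins the categorical choice."""
    rng = random.Random(seed)
    a = b = 0.0
    while a < threshold and b < threshold:
        a += evidence_a - inhibition * b + rng.gauss(0, noise)
        b += evidence_b - inhibition * a + rng.gauss(0, noise)
        a, b = max(a, 0.0), max(b, 0.0)  # firing rates cannot go negative
    return "A" if a >= threshold else "B"

print(decide(1.0, 0.2))  # stronger evidence for A usually wins
```

    Irregular (noisy) activity makes the choice probabilistic when the evidence is closely matched, echoing the abstract's account of probabilistic choice behavior.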

  5. The Neuroscience Information Framework: A Data and Knowledge Environment for Neuroscience

    PubMed Central

    Akil, Huda; Ascoli, Giorgio A.; Bowden, Douglas M.; Bug, William; Donohue, Duncan E.; Goldberg, David H.; Grafstein, Bernice; Grethe, Jeffrey S.; Gupta, Amarnath; Halavi, Maryam; Kennedy, David N.; Marenco, Luis; Martone, Maryann E.; Miller, Perry L.; Müller, Hans-Michael; Robert, Adrian; Shepherd, Gordon M.; Sternberg, Paul W.; Van Essen, David C.; Williams, Robert W.

    2009-01-01

    With support from the Institutes and Centers forming the NIH Blueprint for Neuroscience Research, we have designed and implemented a new initiative for integrating access to and use of Web-based neuroscience resources: the Neuroscience Information Framework. The Framework arises from the expressed need of the neuroscience community for neuroinformatic tools and resources to aid scientific inquiry, builds upon prior development of neuroinformatics by the Human Brain Project and others, and directly derives from the Society for Neuroscience’s Neuroscience Database Gateway. Partnered with the Society, its Neuroinformatics Committee, and volunteer consultant-collaborators, our multi-site consortium has developed: (1) a comprehensive, dynamic inventory of Web-accessible neuroscience resources, (2) an extended and integrated terminology describing resources and contents, and (3) a framework accepting and aiding concept-based queries. Evolving instantiations of the Framework may be viewed at http://nif.nih.gov, http://neurogateway.org, and other sites as they come on line. PMID:18946742

  6. The effects of attention on perceptual implicit memory.

    PubMed

    Rajaram, S; Srinivas, K; Travers, S

    2001-10-01

    Reports on the effects of dividing attention at study on subsequent perceptual priming suggest that perceptual priming is generally unaffected by attentional manipulations as long as word identity is processed. We tested this hypothesis in three experiments by using the implicit word fragment completion and word stem completion tasks. Division of attention was instantiated with the Stroop task in order to ensure the processing of word identity even when the participant's attention was directed to a stimulus attribute other than the word itself. Under these conditions, we found that even though perceptual priming was significant, it was significantly reduced in magnitude. A stem cued recall test in Experiment 2 confirmed a more deleterious effect of divided attention on explicit memory. Taken together, our findings delineate the relative contributions of perceptual analysis and attentional processes in mediating perceptual priming on two ubiquitously used tasks of word fragment completion and word stem completion.

  7. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service generates the concrete BPEL, and a BPEL execution engine executes the resulting workflow. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
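
    The step from an abstract GPW model to a concrete, executable workflow can be caricatured in a few lines: an abstract template names service roles, and instantiation binds each role to a concrete endpoint. This is only a hedged sketch of the idea; the template format and the endpoint names below are invented, not the actual BPEL artifacts of the framework.

```python
# Abstract GPW template: ordered steps referring to abstract service
# roles rather than concrete endpoints.
TEMPLATE = ["retrieve:{data_service}",
            "process:{processing_service}",
            "present:{presentation_service}"]

def instantiate(template, bindings):
    """Bind abstract roles to concrete services, mimicking the step
    from the top-level GPW model to an executable workflow."""
    return [step.format(**bindings) for step in template]

# Hypothetical endpoint names for a fire-classification scenario.
concrete = instantiate(TEMPLATE, {
    "data_service": "EO1-SOS",
    "processing_service": "FireClassifierWPS",
    "presentation_service": "FireMapWMS",
})
print(concrete)
```

    A real instantiation service would additionally emit BPEL partner links and invoke activities, but the role-binding step is the same in spirit.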

  8. An Ontology-based Context-aware System for Smart Homes: E-care@home

    PubMed Central

    Alirezaie, Marjan; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Voigt, Thiemo; Loutfi, Amy

    2017-01-01

    Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility of aging at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home. PMID:28684686

  9. Open architectures for formal reasoning and deductive technologies for software development

    NASA Technical Reports Server (NTRS)

    Mccarthy, John; Manna, Zohar; Mason, Ian; Pnueli, Amir; Talcott, Carolyn; Waldinger, Richard

    1994-01-01

    The objective of this project is to develop an open architecture for formal reasoning systems. One goal is to provide a framework with a clear semantic basis for the specification and instantiation of generic components; for the construction of complex systems by interconnecting components; and for making incremental improvements and tailoring to specific applications. Another goal is to develop methods for specifying component interfaces and interactions to facilitate use of existing and newly built systems as 'off the shelf' components, thus helping bridge the gap between producers and consumers of reasoning systems. In this report we summarize results in several areas: our database of reasoning systems; a theory of binding structures; a theory of components of open systems; a framework for specifying components of open reasoning systems; and an analysis of the integration of rewriting and linear arithmetic modules in Boyer-Moore using the above framework.

  10. From genius inverts to gendered intelligence: Lewis Terman and the power of the norm.

    PubMed

    Hegarty, Peter

    2007-05-01

    The histories of "intelligence" and "sexuality" have largely been narrated separately. In Lewis Terman's work on individual differences, they intersect. Influenced by G. Stanley Hall, Terman initially described atypically accelerated development as problematic. Borrowing from Galton, Terman later positioned gifted children as nonaverage but ideal. Attention to the gifted effeminate subjects used to exemplify giftedness and gender nonconformity in Terman's work shows the selective instantiation of nonaverageness as pathological apropos of effeminacy, and as ideal apropos of high intelligence. Throughout, high intelligence is conflated with health, masculinity, and heterosexuality. Terman's research located marital sexual problems in women's bodies, further undoing possibilities for evaluating heterosexual men's practices as different from a normative position. Terman's research modernized Galton's imperialist vision of a society led by a male cognitive elite. Psychologists continue to traffic in his logic that values and inculcates intelligence only in the service of sexual and gender conformity.

  11. Potential implementation of reservoir computing models based on magnetic skyrmions

    NASA Astrophysics Data System (ADS)

    Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin

    2018-05-01

    Reservoir Computing is a type of recurrent neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on utilizing memristor techniques to implement recurrent neural networks. This paper examines the potential of magnetic skyrmion fabrics and the complex current patterns which form in them as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for an effective and energy-efficient nonlinear processing of spatio-temporal events with the aim of event recognition and prediction.
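
    The defining feature of the Reservoir Computing paradigm — a fixed, untrained dynamical reservoir with only a linear readout trained — can be sketched as a minimal echo state network in stdlib Python. The reservoir size, weight scaling, and one-step memory task below are arbitrary illustrative choices, not anything from the skyrmion proposal.

```python
import math, random

random.seed(0)
N = 30  # reservoir size

# Fixed random reservoir (never trained); the scaling keeps the
# spectral radius well below 1, so the network has fading memory.
W = [[random.uniform(-0.5, 0.5) / math.sqrt(N) for _ in range(N)]
     for _ in range(N)]
W_in = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(x, u):
    """One reservoir update: x' = tanh(W x + W_in u)."""
    return [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + W_in[i] * u)
            for i in range(N)]

# One-step memory task: from the current reservoir state, read out the
# previous input sample of a sine wave.
T = 300
inputs = [math.sin(0.2 * t) for t in range(T)]
x = [0.0] * N
states, targets = [], []
for t in range(1, T):
    x = step(x, inputs[t])
    states.append(x[:])
    targets.append(inputs[t - 1])

# Only the linear readout w is trained (here by plain SGD).
w = [0.0] * N
lr = 0.05
for _ in range(50):
    for s, y in zip(states, targets):
        err = sum(wi * si for wi, si in zip(w, s)) - y
        w = [wi - lr * err * si for wi, si in zip(w, s)]

mse = sum((sum(wi * si for wi, si in zip(w, s)) - y) ** 2
          for s, y in zip(states, targets)) / len(states)
print(f"readout MSE on the memory task: {mse:.4f}")
```

    Because only the readout is trained, the reservoir itself could equally be a physical medium — the point of proposals such as the skyrmion fabric.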

  12. MobiDiC: Context Adaptive Digital Signage with Coupons

    NASA Astrophysics Data System (ADS)

    Müller, Jörg; Krüger, Antonio

    In this paper we present a field study of a digital signage system that measures audience response with coupons in order to enable context adaptivity. In the concept for context adaptivity, the signs sense their environment, decide which content to show, and then sense the audience reaction to the content shown. From this audience measurement, the strategies for which content to show in which situation are refined. As one instantiation of audience measurement, we propose a novel, simple couponing system, where customers can photograph the coupons at the signs. Thus, it can be measured whether customers really went to the shop. To investigate the feasibility of this approach, we implemented a prototype of 20 signs in the city center of Münster, Germany. During one year of deployment, we investigated usage of the system through interviews with shop owners and customers. Our experiences show that customer attention towards the signs is a major hurdle to overcome.

  13. Probabilistic registration of an unbiased statistical shape model to ultrasound images of the spine

    NASA Astrophysics Data System (ADS)

    Rasoulian, Abtin; Rohling, Robert N.; Abolmaesumi, Purang

    2012-02-01

    The placement of an epidural needle is among the most difficult regional anesthetic techniques. Ultrasound has been proposed to improve success of placement. However, it has not become the standard-of-care because of limitations in the depictions and interpretation of the key anatomical features. We propose to augment the ultrasound images with a registered statistical shape model of the spine to aid interpretation. The model is created with a novel deformable group-wise registration method which utilizes a probabilistic approach to register groups of point sets. The method is compared to a volume-based model building technique and it demonstrates better generalization and compactness. We instantiate and register the shape model to a spine surface probability map extracted from the ultrasound images. Validation is performed on human subjects. The achieved registration accuracy (2-4 mm) is sufficient to guide the choice of puncture site and trajectory of an epidural needle.
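
    Instantiating a statistical shape model amounts to adding weighted modes of variation to a mean shape, s(b) = mean + sum_k b_k * mode_k. The toy 2-D landmark set and single bending mode below are invented for illustration and bear no relation to the paper's spine model.

```python
# Mean shape and one mode of variation over four 2-D landmarks; these
# numbers are invented for illustration (they are not spine data).
mean_shape = [(0.0, 0.0), (0.1, 1.0), (0.0, 2.0), (-0.1, 3.0)]
mode1 = [(0.0, 0.0), (0.3, 0.0), (0.5, 0.0), (0.3, 0.0)]  # lateral bend

def instantiate(mean, modes, weights):
    """Shape instance s(b) = mean + sum_k b_k * mode_k."""
    pts = [list(p) for p in mean]
    for b, mode in zip(weights, modes):
        for p, (dx, dy) in zip(pts, mode):
            p[0] += b * dx
            p[1] += b * dy
    return [tuple(p) for p in pts]

bent = instantiate(mean_shape, [mode1], [2.0])  # strongly bent instance
print(bent)
```

    Registration then searches over the mode weights (and a rigid transform) so that the instantiated shape best matches the observed surface probability map.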

  14. Why leaders punish: A power perspective.

    PubMed

    Mooijman, Marlon; van Dijk, Wilco W; Ellemers, Naomi; van Dijk, Eric

    2015-07-01

    We propose that power fundamentally changes why leaders punish, and we develop a theoretical model that specifies how and why this occurs. Specifically, we argue that power increases the reliance on deterrence, but not just deserts, as a punishment motive and relate this to power fostering a distrustful mindset. We tested our model in 9 studies using different instantiations of power and different measurements and manipulations of distrust while measuring punishment motives and recommended punishments across a number of different situations. These 9 studies demonstrate that power fosters distrust and thereby increases both the reliance on deterrence as a punishment motive and the implementation of punishments aimed at deterrence (i.e., public punishments, public naming of rule breakers and punishments with a mandatory minimum). We discuss the practical implications for leaders, managers and policymakers and the theoretical implications for scholars interested in power, trust, and punishments. (c) 2015 APA, all rights reserved.

  15. Information from multiple modalities helps 5-month-olds learn abstract rules.

    PubMed

    Frank, Michael C; Slemmer, Jonathan A; Marcus, Gary F; Johnson, Scott P

    2009-07-01

    By 7 months of age, infants are able to learn rules based on the abstract relationships between stimuli (Marcus et al., 1999), but they are better able to do so when exposed to speech than to some other classes of stimuli. In the current experiments we ask whether multimodal stimulus information will aid younger infants in identifying abstract rules. We habituated 5-month-olds to simple abstract patterns (ABA or ABB) instantiated in coordinated looming visual shapes and speech sounds (Experiment 1), shapes alone (Experiment 2), and speech sounds accompanied by uninformative but coordinated shapes (Experiment 3). Infants showed evidence of rule learning only in the presence of the informative multimodal cues. We hypothesize that the additional evidence present in these multimodal displays was responsible for the success of younger infants in learning rules, congruent with both a Bayesian account and with the Intersensory Redundancy Hypothesis.

  16. Evidence accumulation as a model for lexical selection.

    PubMed

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process of selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports) that largely result from an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related to or combined with conventional psycholinguistic theory and its simulatory instantiations (generally, neural network models). Then, with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
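
    One common evidence-accumulation formulation of selection among competing lexical candidates is a race model: each candidate accumulates noisy evidence at a rate set by its activation, and the first accumulator to reach threshold determines the choice and the reaction time. The sketch below is a generic race model in stdlib Python; all parameter values are illustrative, not fitted to the paper's data.

```python
import random

random.seed(1)

def race(activations, threshold=1.0, noise=0.5, dt=0.01, max_steps=10000):
    """Race model: each candidate accumulates noisy evidence at a rate
    given by its activation; the first accumulator to reach threshold
    determines the choice. Returns (winner index, reaction time)."""
    x = [0.0] * len(activations)
    for step in range(1, max_steps + 1):
        for i, a in enumerate(activations):
            x[i] += a * dt + random.gauss(0, noise) * dt ** 0.5
        if max(x) >= threshold:
            return max(range(len(x)), key=lambda i: x[i]), step * dt
    return max(range(len(x)), key=lambda i: x[i]), max_steps * dt

# The target word gets higher activation than its two competitors, so
# it usually wins, but accumulation noise makes selection probabilistic.
acts = [1.5, 0.8, 0.8]  # hypothetical lexical activations
trials = [race(acts) for _ in range(500)]
accuracy = sum(1 for w, _ in trials if w == 0) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"target selected on {accuracy:.0%} of trials, mean RT {mean_rt:.2f}")
```

    Fitting such a model to choice and reaction-time data yields per-candidate drift rates and thresholds, which is the kind of parameter estimate the paper relates back to psycholinguistic theory.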

  17. Finessing filter scarcity problem in face recognition via multi-fold filter convolution

    NASA Astrophysics Data System (ADS)

    Low, Cheng-Yaw; Teoh, Andrew Beng-Jin

    2017-06-01

    The deep convolutional neural networks for face recognition, from DeepFace to the recent FaceNet, demand a sufficiently large volume of filters for feature extraction, in addition to being deep. The shallow filter-bank approaches, e.g., principal component analysis network (PCANet), binarized statistical image features (BSIF), and other analogous variants, suffer from a filter scarcity problem: not all of the available PCA and ICA filters are discriminative enough to abstract noise-free features. This paper extends our previous work on multi-fold filter convolution (ℳ-FFC), where the pre-learned PCA and ICA filter sets are exponentially diversified by ℳ folds to instantiate PCA, ICA, and PCA-ICA offspring. The experimental results show that the 2-FFC operation alleviates the filter scarcity problem. The 2-FFC descriptors are also shown to be superior to those of PCANet, BSIF, and other face descriptors in terms of rank-1 identification rate (%).

  18. A Cognitive Approach to Teaching a Graduate-Level GEOBIA Course

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel A.

    2016-06-01

    Remote sensing image analysis training occurs both in the classroom and the research lab. Education in the classroom for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for development of an open-source repository for course materials.

  19. The neuroscience information framework: a data and knowledge environment for neuroscience.

    PubMed

    Gardner, Daniel; Akil, Huda; Ascoli, Giorgio A; Bowden, Douglas M; Bug, William; Donohue, Duncan E; Goldberg, David H; Grafstein, Bernice; Grethe, Jeffrey S; Gupta, Amarnath; Halavi, Maryam; Kennedy, David N; Marenco, Luis; Martone, Maryann E; Miller, Perry L; Müller, Hans-Michael; Robert, Adrian; Shepherd, Gordon M; Sternberg, Paul W; Van Essen, David C; Williams, Robert W

    2008-09-01

    With support from the Institutes and Centers forming the NIH Blueprint for Neuroscience Research, we have designed and implemented a new initiative for integrating access to and use of Web-based neuroscience resources: the Neuroscience Information Framework. The Framework arises from the expressed need of the neuroscience community for neuroinformatic tools and resources to aid scientific inquiry, builds upon prior development of neuroinformatics by the Human Brain Project and others, and directly derives from the Society for Neuroscience's Neuroscience Database Gateway. Partnered with the Society, its Neuroinformatics Committee, and volunteer consultant-collaborators, our multi-site consortium has developed: (1) a comprehensive, dynamic inventory of Web-accessible neuroscience resources, (2) an extended and integrated terminology describing resources and contents, and (3) a framework accepting and aiding concept-based queries. Evolving instantiations of the Framework may be viewed at http://nif.nih.gov, http://neurogateway.org, and other sites as they come on line.

  20. Graph theoretical model of a sensorimotor connectome in zebrafish.

    PubMed

    Stobb, Michael; Peterson, Joshua M; Mazzag, Borbala; Gahtan, Ethan

    2012-01-01

    Mapping the detailed connectivity patterns (connectomes) of neural circuits is a central goal of neuroscience. The best quantitative approach to analyzing connectome data is still unclear but graph theory has been used with success. We present a graph theoretical model of the posterior lateral line sensorimotor pathway in zebrafish. The model includes 2,616 neurons and 167,114 synaptic connections. Model neurons represent known cell types in zebrafish larvae, and connections were set stochastically following rules based on biological literature. Thus, our model is a uniquely detailed computational representation of a vertebrate connectome. The connectome has low overall connection density, with 2.45% of all possible connections, a value within the physiological range. We used graph theoretical tools to compare the zebrafish connectome graph to small-world, random and structured random graphs of the same size. For each type of graph, 100 randomly generated instantiations were considered. Degree distribution (the number of connections per neuron) varied more in the zebrafish graph than in same size graphs with less biological detail. There was high local clustering and a short average path length between nodes, implying a small-world structure similar to other neural connectomes and complex networks. The graph was found not to be scale-free, in agreement with some other neural connectomes. An experimental lesion was performed that targeted three model brain neurons, including the Mauthner neuron, known to control fast escape turns. The lesion decreased the number of short paths between sensory and motor neurons analogous to the behavioral effects of the same lesion in zebrafish. This model is expandable and can be used to organize and interpret a growing database of information on the zebrafish connectome.
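
    The small-world diagnostics used in such connectome analyses — high local clustering together with a short average path length — can be computed with stdlib Python alone. The sketch below builds a Watts-Strogatz small-world graph as a generic stand-in (not the zebrafish connectome) and measures both quantities.

```python
import random
from collections import deque

random.seed(2)

def watts_strogatz(n, k, p):
    """Ring lattice of n nodes, each linked to its k nearest neighbours
    on each side; every lattice edge is rewired with probability p."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in range(1, k + 1):
            if random.random() < p:
                old, new = (i + j) % n, random.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(old); adj[old].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def clustering(adj):
    """Mean local clustering coefficient."""
    total = 0.0
    for v, nbrs in adj.items():
        nbrs = list(nbrs)
        if len(nbrs) < 2:
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2.0 * links / (len(nbrs) * (len(nbrs) - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Average shortest-path length over reachable pairs (BFS per node)."""
    total = pairs = 0
    for s in adj:
        dist, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

g = watts_strogatz(100, 4, 0.1)
print(f"C = {clustering(g):.2f}, L = {avg_path_length(g):.2f}")
```

    A small-world graph shows clustering close to the underlying lattice but path lengths close to a random graph; comparing the empirical connectome against matched random and lattice graphs, as the study does, makes those two baselines explicit.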

  1. A Digitally Programmable Cytomorphic Chip for Simulation of Arbitrary Biochemical Reaction Networks.

    PubMed

    Woo, Sung Sik; Kim, Jaewook; Sarpeshkar, Rahul

    2018-04-01

    Prior work has shown that compact analog circuits can faithfully represent and model fundamental biomolecular circuits via efficient log-domain cytomorphic transistor equivalents. Such circuits have emphasized basis functions that are dominant in genetic transcription and translation networks and deoxyribonucleic acid (DNA)-protein binding. Here, we report a system featuring digitally programmable 0.35 μm BiCMOS analog cytomorphic chips that enable arbitrary biochemical reaction networks to be exactly represented thus enabling compact and easy composition of protein networks as well. Since all biomolecular networks can be represented as chemical reaction networks, our protein networks also include the former genetic network circuits as a special case. The cytomorphic analog protein circuits use one fundamental association-dissociation-degradation building-block circuit that can be configured digitally to exactly represent any zeroth-, first-, and second-order reaction including loading, dynamics, nonlinearity, and interactions with other building-block circuits. To address a divergence issue caused by random variations in chip fabrication processes, we propose a unique way of performing computation based on total variables and conservation laws, which we instantiate at both the circuit and network levels. Thus, scalable systems that operate with finite error over infinite time can be built. We show how the building-block circuits can be composed to form various network topologies, such as cascade, fan-out, fan-in, loop, dimerization, or arbitrary networks using total variables. We demonstrate results from a system that combines interacting cytomorphic chips to simulate a cancer pathway and a glycolysis pathway. Both simulations are consistent with conventional software simulations. Our highly parallel digitally programmable analog cytomorphic systems can lead to a useful design, analysis, and simulation tool for studying arbitrary large-scale biological networks in systems and synthetic biology.
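
    The conservation-law idea behind the total-variable trick is easy to see in software: for the association-dissociation building block A + B ⇌ C (with optional degradation of C), the totals A + C and B + C are exact invariants of the mass-action dynamics whenever degradation is off. A minimal forward-Euler sketch with illustrative rate constants (not values from the chip):

```python
def simulate(kon, koff, kdeg, A, B, C, dt=0.001, steps=5000):
    """Forward-Euler integration of the association-dissociation-
    degradation building block:  A + B <-> C,  C -> (degraded)."""
    for _ in range(steps):
        assoc, dissoc, deg = kon * A * B, koff * C, kdeg * C
        A += (dissoc - assoc) * dt
        B += (dissoc - assoc) * dt
        C += (assoc - dissoc - deg) * dt
    return A, B, C

# With degradation switched off, A + C and B + C are invariants of the
# dynamics, so they can serve as "total variables" against which
# accumulated numerical (or analog) error is checked.
A, B, C = simulate(kon=2.0, koff=0.5, kdeg=0.0, A=1.0, B=0.8, C=0.0)
print(f"A+C = {A + C:.6f}, B+C = {B + C:.6f}")
```

    Tracking the totals rather than the raw concentrations bounds drift over long runs, which is the essence of operating "with finite error over infinite time."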

  2. A systematic review of team formulation in clinical psychology practice: Definition, implementation, and outcomes.

    PubMed

    Geach, Nicole; Moghaddam, Nima G; De Boos, Danielle

    2018-06-01

    Team formulation is promoted by professional practice guidelines for clinical psychologists. However, it is unclear whether team formulation is understood/implemented in consistent ways - or whether there is outcome evidence to support the promotion of this practice. This systematic review aimed to (1) synthesize how team formulation practice is defined and implemented by practitioner psychologists and (2) analyse the range of team formulation outcomes in the peer-reviewed literature. Seven electronic bibliographic databases were searched in June 2016. Eleven articles met inclusion criteria and were quality assessed. Extracted data were synthesized using content analysis. Descriptions of team formulation revealed three main forms of instantiation: (1) a structured consultation approach; (2) semi-structured reflective practice meetings; and (3) unstructured/informal sharing of ideas through routine interactions. Outcome evidence linked team formulation to a range of outcomes for staff teams and service users, including some negative outcomes. Quality appraisal identified significant issues with evaluation methods; such that, overall, outcomes were not well-supported. There is weak evidence to support the claimed beneficial outcomes of team formulation in practice. There is a need for greater specification and standardization of 'team formulation' practices, to enable a clearer understanding of any relationships with outcomes and implications for best-practice implementations. Under the umbrella term of 'team formulation', three types of practice are reported: (1) highly structured consultation; (2) reflective practice meetings; and (3) informal sharing of ideas. Outcomes linked to team formulation, including some negative outcomes, were not well evidenced. Research using robust study designs is required to investigate the process and outcomes of team formulation practice. © 2017 The British Psychological Society.

  3. A Software Engine to Justify the Conclusions of an Expert System for Detecting Renal Obstruction on 99mTc-MAG3 Scans

    PubMed Central

    Garcia, Ernest V.; Taylor, Andrew; Manatunga, Daya; Folks, Russell

    2013-01-01

    The purposes of this study were to describe and evaluate a software engine to justify the conclusions reached by a renal expert system (RENEX) for assessing patients with suspected renal obstruction and to obtain from this evaluation new knowledge that can be incorporated into RENEX to attempt to improve diagnostic performance. Methods: RENEX consists of 60 heuristic rules extracted from the rules used by a domain expert to generate the knowledge base and a forward-chaining inference engine to determine obstruction. The justification engine keeps track of the sequence of the rules that are instantiated to reach a conclusion. The interpreter can then request justification by clicking on the specific conclusion. The justification process then reports the English translation of all concatenated rules instantiated to reach that conclusion. The justification engine was evaluated with a prospective group of 60 patients (117 kidneys). After reviewing the standard renal mercaptoacetyltriglycine (MAG3) scans obtained before and after the administration of furosemide, a masked expert determined whether each kidney was obstructed, whether the results were equivocal, or whether the kidney was not obstructed and identified and ranked the main variables associated with each interpretation. Two parameters were then tabulated: the frequency with which the main variables associated with obstruction by the expert were also justified by RENEX and the frequency with which the justification rules provided by RENEX were deemed to be correct by the expert. Only when RENEX and the domain expert agreed on the diagnosis (87 kidneys) were the results used to test the justification. Results: RENEX agreed with 91% (184/203) of the rules supplied by the expert for justifying the diagnosis. RENEX provided 103 additional rules justifying the diagnosis; the expert agreed that 102 (99%) were correct, although the rules were considered to be of secondary importance. Conclusion: We have described and evaluated a software engine to justify the conclusions of RENEX for detecting renal obstruction with MAG3 renal scans obtained before and after the administration of furosemide. This tool is expected to increase physician confidence in the interpretations provided by RENEX and to assist physicians and trainees in gaining a higher level of expertise. PMID:17332625

  4. A software engine to justify the conclusions of an expert system for detecting renal obstruction on 99mTc-MAG3 scans.

    PubMed

    Garcia, Ernest V; Taylor, Andrew; Manatunga, Daya; Folks, Russell

    2007-03-01

    The purposes of this study were to describe and evaluate a software engine to justify the conclusions reached by a renal expert system (RENEX) for assessing patients with suspected renal obstruction and to obtain from this evaluation new knowledge that can be incorporated into RENEX to attempt to improve diagnostic performance. RENEX consists of 60 heuristic rules extracted from the rules used by a domain expert to generate the knowledge base and a forward-chaining inference engine to determine obstruction. The justification engine keeps track of the sequence of the rules that are instantiated to reach a conclusion. The interpreter can then request justification by clicking on the specific conclusion. The justification process then reports the English translation of all concatenated rules instantiated to reach that conclusion. The justification engine was evaluated with a prospective group of 60 patients (117 kidneys). After reviewing the standard renal mercaptoacetyltriglycine (MAG3) scans obtained before and after the administration of furosemide, a masked expert determined whether each kidney was obstructed, whether the results were equivocal, or whether the kidney was not obstructed and identified and ranked the main variables associated with each interpretation. Two parameters were then tabulated: the frequency with which the main variables associated with obstruction by the expert were also justified by RENEX and the frequency with which the justification rules provided by RENEX were deemed to be correct by the expert. Only when RENEX and the domain expert agreed on the diagnosis (87 kidneys) were the results used to test the justification. RENEX agreed with 91% (184/203) of the rules supplied by the expert for justifying the diagnosis. RENEX provided 103 additional rules justifying the diagnosis; the expert agreed that 102 (99%) were correct, although the rules were considered to be of secondary importance. We have described and evaluated a software engine to justify the conclusions of RENEX for detecting renal obstruction with MAG3 renal scans obtained before and after the administration of furosemide. This tool is expected to increase physician confidence in the interpretations provided by RENEX and to assist physicians and trainees in gaining a higher level of expertise.
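
    The core of such a justification engine — forward chaining that records which rule instantiation produced each derived fact, so the chain can be replayed on demand — can be sketched in a few lines. The rules below are invented toy examples, not RENEX's actual knowledge base.

```python
# Rules: (name, premises, conclusion). These example rules are invented
# for illustration only.
RULES = [
    ("R1", {"delayed_washout", "adequate_function"}, "suspect_obstruction"),
    ("R2", {"suspect_obstruction", "flat_furosemide_curve"}, "obstruction"),
    ("R3", {"brisk_furosemide_response"}, "no_obstruction"),
]

def forward_chain(given):
    """Fire rules to fixpoint, recording which rule produced each
    derived fact so conclusions can be justified afterwards."""
    facts = set(given)
    derivation = {}  # fact -> (rule name, premises used)
    changed = True
    while changed:
        changed = False
        for name, premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                derivation[conclusion] = (name, premises)
                changed = True
    return facts, derivation

def justify(fact, derivation, indent=0):
    """Print the chain of instantiated rules behind a conclusion."""
    if fact not in derivation:
        print(" " * indent + f"{fact}: given")
        return
    name, premises = derivation[fact]
    print(" " * indent + f"{fact}: by {name} from {sorted(premises)}")
    for p in premises:
        justify(p, derivation, indent + 2)

facts, deriv = forward_chain({"delayed_washout", "adequate_function",
                              "flat_furosemide_curve"})
justify("obstruction", deriv)
```

    The recorded derivation map is exactly what lets the interpreter click on a conclusion and see the concatenated rules that produced it.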

  5. Exercise Habit

    MedlinePlus

    ... and lungs. Examples of aerobic exercise include walking, hiking, running, aerobic dance, biking, rowing, swimming, and cross- ... Examples of weight-bearing exercise include walking, yoga, hiking, climbing stairs, playing tennis, dancing, and strength training. ...

  6. Images that Twist and Turn.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Discusses various art movements that include examples of artworks with movement, such as Romantic, Classic, Rococo, and Art Nouveau. Addresses ways in which students can learn to incorporate movement into their own works of art. Includes examples of artists who included movement into their artworks. (CMK)

  7. The Behavioral Intervention Technology Model: An Integrated Conceptual and Technological Framework for eHealth and mHealth Interventions

    PubMed Central

    Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa

    2014-01-01

    A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or “elements” (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user’s environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs. PMID:24905070

  8. The behavioral intervention technology model: an integrated conceptual and technological framework for eHealth and mHealth interventions.

    PubMed

    Mohr, David C; Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa

    2014-06-05

    A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or "elements" (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user's environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs.
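    The why/what/how/when decomposition described in this abstract lends itself to a plain data structure. The sketch below is an illustrative reading of that decomposition; the class and field names are assumptions for exposition, not part of the published BIT or BIT-Tech framework.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class InterventionElement:
    name: str                                    # the "what": a concrete component
    characteristics: Dict[str, str] = field(default_factory=dict)  # the technical "how"

@dataclass
class BITSpec:
    aim: str                                     # the "why": a specific intervention aim
    strategy: str                                # the conceptual "how": behavior change strategy
    elements: List[InterventionElement]          # the "what": intervention components
    workflow: str                                # the "when": delivery rule

# Hypothetical example instance.
bit = BITSpec(
    aim="increase physical activity",
    strategy="goal setting and monitoring",
    elements=[InterventionElement("daily step-count prompt", {"medium": "SMS"})],
    workflow="deliver prompt each morning; adapt if goals are met",
)
print(bit.aim)
```

    A delivery framework in the spirit of BIT-Tech would then interpret such a specification over time, either with a predefined schedule or adaptively from user data.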

  9. Grounded understanding of abstract concepts: The case of STEM learning.

    PubMed

    Hayes, Justin C; Kraemer, David J M

    2017-01-01

    Characterizing the neural implementation of abstract conceptual representations has long been a contentious topic in cognitive science. At the heart of the debate is whether the "sensorimotor" machinery of the brain plays a central role in representing concepts, or whether the involvement of these perceptual and motor regions is merely peripheral or epiphenomenal. The domain of science, technology, engineering, and mathematics (STEM) learning provides an important proving ground for sensorimotor (or grounded) theories of cognition, as concepts in science and engineering courses are often taught through laboratory-based and other hands-on methodologies. In this review of the literature, we examine evidence suggesting that sensorimotor processes strengthen learning associated with the abstract concepts central to STEM pedagogy. After considering how contemporary theories have defined abstraction in the context of semantic knowledge, we propose our own explanation for how body-centered information, as computed in sensorimotor brain regions and visuomotor association cortex, can form a useful foundation upon which to build an understanding of abstract scientific concepts, such as mechanical force. Drawing from theories in cognitive neuroscience, we then explore models elucidating the neural mechanisms involved in grounding intangible concepts, including Hebbian learning, predictive coding, and neuronal recycling. Empirical data on STEM learning through hands-on instruction are considered in light of these neural models. We conclude the review by proposing three distinct ways in which the field of cognitive neuroscience can contribute to STEM learning by bolstering our understanding of how the brain instantiates abstract concepts in an embodied fashion.
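    Of the neural mechanisms this review lists, Hebbian learning is the simplest to make concrete. The toy rule below (parameters and values are hypothetical, not a model from the review) shows a weight growing in proportion to the co-activation of its pre- and postsynaptic units:

```python
def hebbian_update(weights, pre, post, lr=0.1):
    """One Hebbian step: dw_i = lr * pre_i * post.
    'Cells that fire together wire together': co-active units strengthen."""
    return [w + lr * p * post for w, p in zip(weights, pre)]

# A unit co-active with the output strengthens; an inactive one does not.
w = hebbian_update([0.0, 0.0], pre=[1.0, 0.0], post=1.0)
print(w)  # [0.1, 0.0]
```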

  10. Integrating HL7 RIM and ontology for unified knowledge and data representation in clinical decision support systems.

    PubMed

    Zhang, Yi-Fan; Tian, Yu; Zhou, Tian-Shu; Araki, Kenji; Li, Jing-Song

    2016-01-01

    The broad adoption of clinical decision support systems within clinical practice has been hampered mainly by the difficulty in expressing domain knowledge and patient data in a unified formalism. This paper presents a semantic-based approach to the unified representation of healthcare domain knowledge and patient data for practical clinical decision making applications. A four-phase knowledge engineering cycle is implemented to develop a semantic healthcare knowledge base based on an HL7 reference information model, including an ontology to model domain knowledge and patient data and an expression repository to encode clinical decision making rules and queries. A semantic clinical decision support system is designed to provide patient-specific healthcare recommendations based on the knowledge base and patient data. The proposed solution is evaluated in the case study of type 2 diabetes mellitus inpatient management. The knowledge base is successfully instantiated with relevant domain knowledge and testing patient data. Ontology-level evaluation confirms model validity. Application-level evaluation of diagnostic accuracy reaches a sensitivity of 97.5%, a specificity of 100%, and a precision of 98%; an acceptance rate of 97.3% is given by domain experts for the recommended care plan orders. The proposed solution has been successfully validated in the case study as providing clinical decision support at a high accuracy and acceptance rate. The evaluation results demonstrate the technical feasibility and application prospect of our approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
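    The accuracy figures reported above follow the standard confusion-matrix definitions for binary classification. A minimal sketch of those definitions (the counts below are hypothetical illustrations, not data from the study):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard evaluation metrics for a binary classifier such as a CDSS."""
    sensitivity = tp / (tp + fn)   # fraction of actual positives detected
    specificity = tn / (tn + fp)   # fraction of actual negatives correctly excluded
    precision = tp / (tp + fp)     # fraction of positive calls that are correct
    return sensitivity, specificity, precision

# Hypothetical counts for illustration only.
sens, spec, prec = diagnostic_metrics(tp=49, fp=1, tn=99, fn=1)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} precision={prec:.1%}")
```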

  11. Implementing recovery: an analysis of the key technologies in Scotland

    PubMed Central

    2011-01-01

    Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English-speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems. PMID:21569633

  12. The history of research on the filled pause as evidence of the written language bias in linguistics (Linell, 1982).

    PubMed

    O'Connell, Daniel C; Kowal, Sabine

    2004-11-01

    Erard's (2004) publication in the New York Times of a journalistic history of the filled pause serves as the occasion for this critical review of the past half-century of research on the filled pause. Historically, the various phonetic realizations or instantiations of the filled pause have been presented with an odd recurrent admixture of the interjection ah. In addition, the filled pause has been consistently associated with both hesitation and disfluency. The present authors hold that such a mandatory association of the filled pause with disfluency is the product of The written language bias in linguistics [Linell, 1982] and disregards much cogent evidence to the contrary. The implicit prescriptivism of well formedness--a demand derived from literacy--must be rejected; literate well formedness is not a necessary or even typical property of spontaneous spoken discourse; its structures and functions--including those of the filled pause--are very different from those of written language. The recent work of Clark and Fox Tree (2002) holds promise for moving the status of the filled pause not only toward that of a conventional word, but also toward its status as an interjection. This latter development is also being fostered by lexicographers. Nonetheless, in view of ongoing research regarding the disparate privileges of occurrence and functions of filled pauses in comparison with interjections, the present authors are reluctant to categorize the filled pause as an interjection.

  13. NASA Tech Briefs, February 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Topics covered include: Calibration Test Set for a Phase-Comparison Digital Tracker; Wireless Acoustic Measurement System; Spiral Orbit Tribometer; Arrays of Miniature Microphones for Aeroacoustic Testing; Predicting Rocket or Jet Noise in Real Time; Computational Workbench for Multibody Dynamics; High-Power, High-Efficiency Ka-Band Space Traveling-Wave Tube; Gratings and Random Reflectors for Near-Infrared PIN Diodes; Optically Transparent Split-Ring Antennas for 1 to 10 GHz; Ice-Penetrating Robot for Scientific Exploration; Power-Amplifier Module for 145 to 165 GHz; Aerial Videography From Locally Launched Rockets; SiC Multi-Chip Power Modules as Power-System Building Blocks; Automated Design of Restraint Layer of an Inflatable Vessel; TMS for Instantiating a Knowledge Base With Incomplete Data; Simulating Flights of Future Launch Vehicles and Spacecraft; Control Code for Bearingless Switched- Reluctance Motor; Machine Aided Indexing and the NASA Thesaurus; Arbitrating Control of Control and Display Units; Web-Based Software for Managing Research; Driver Code for Adaptive Optics; Ceramic Paste for Patching High-Temperature Insulation; Fabrication of Polyimide-Matrix/Carbon and Boron-Fiber Tape; Protective Skins for Aerogel Monoliths; Code Assesses Risks Posed by Meteoroids and Orbital Debris; Asymmetric Bulkheads for Cylindrical Pressure Vessels; Self-Regulating Water-Separator System for Fuel Cells; Self-Advancing Step-Tap Drills; Array of Bolometers for Submillimeter- Wavelength Operation; Delta-Doped CCDs as Detector Arrays in Mass Spectrometers; Arrays of Bundles of Carbon Nanotubes as Field Emitters; Staggering Inflation To Stabilize Attitude of a Solar Sail; and Bare Conductive Tether for Decelerating a Spacecraft.

  14. A Novel Active Imaging Model to Design Visual Systems: A Case of Inspection System for Specular Surfaces

    PubMed Central

    Azorin-Lopez, Jorge; Fuster-Guillo, Andres; Saval-Calvo, Marcelo; Mora-Mora, Higinio; Garcia-Chamizo, Juan Manuel

    2017-01-01

    The use of visual information is a very well known input from different kinds of sensors. However, most of the perception problems are individually modeled and tackled. It is necessary to provide a general imaging model that allows us to parametrize different input systems as well as their problems and possible solutions. In this paper, we present an active vision model considering the imaging system as a whole (including camera, lighting system, object to be perceived) in order to propose solutions for automated visual systems that present perception problems. As a concrete case study, we instantiate the model in a real application and still challenging problem: automated visual inspection. It is one of the most used quality control systems to detect defects on manufactured objects. However, it presents problems for specular products. We model these perception problems taking into account environmental conditions and camera parameters that allow a system to properly perceive the specific object characteristics to determine defects on surfaces. The validation of the model has been carried out using simulations providing an efficient way to perform a large set of tests (different environment conditions and camera parameters) as a previous step of experimentation in real manufacturing environments, which are more complex in terms of instrumentation and more expensive. Results prove the success of the model application adjusting scale, viewpoint and lighting conditions to detect structural and color defects on specular surfaces. PMID:28640211

  15. An emergentist perspective on the origin of number sense

    PubMed Central

    2018-01-01

    The finding that human infants and many other animal species are sensitive to numerical quantity has been widely interpreted as evidence for evolved, biologically determined numerical capacities across unrelated species, thereby supporting a ‘nativist’ stance on the origin of number sense. Here, we tackle this issue within the ‘emergentist’ perspective provided by artificial neural network models, and we build on computer simulations to discuss two different approaches to think about the innateness of number sense. The first, illustrated by artificial life simulations, shows that numerical abilities can be supported by domain-specific representations emerging from evolutionary pressure. The second assumes that numerical representations need not be genetically pre-determined but can emerge from the interplay between innate architectural constraints and domain-general learning mechanisms, instantiated in deep learning simulations. We show that deep neural networks endowed with basic visuospatial processing exhibit a remarkable performance in numerosity discrimination before any experience-dependent learning, whereas unsupervised sensory experience with visual sets leads to subsequent improvement of number acuity and reduces the influence of continuous visual cues. The emergent neuronal code for numbers in the model includes both numerosity-sensitive (summation coding) and numerosity-selective response profiles, closely mirroring those found in monkey intraparietal neurons. We conclude that a form of innatism based on architectural and learning biases is a fruitful approach to understanding the origin and development of number sense. This article is part of a discussion meeting issue ‘The origins of numerical abilities'. PMID:29292348
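    The two response profiles this abstract mentions (monotonic summation coding and numerosity-selective tuning) can be sketched with toy response functions. The functional forms and parameters below are illustrative assumptions, not the units of the published model; log-scale Gaussian tuning is the profile commonly reported for intraparietal neurons.

```python
import math

def summation_unit(n, gain=0.2):
    """Summation coding: response increases monotonically with numerosity n."""
    return 1.0 - math.exp(-gain * n)

def selective_unit(n, preferred, sigma=0.4):
    """Numerosity-selective coding: Gaussian tuning on a logarithmic scale,
    peaking at the unit's preferred numerosity."""
    return math.exp(-((math.log(n) - math.log(preferred)) ** 2) / (2 * sigma ** 2))

# A selective unit peaks at its preferred numerosity; a summation unit only grows.
print(selective_unit(4, preferred=4))                                   # 1.0
print(selective_unit(8, preferred=4) < selective_unit(4, preferred=4))  # True
```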

  16. Example-Based Learning: Effects of Different Types of Examples on Student Performance, Cognitive Load and Self-Efficacy in a Statistical Learning Task

    ERIC Educational Resources Information Center

    Huang, Xiaoxia

    2017-01-01

    Previous research has indicated the disconnect between example-based research focusing on worked examples (WEs) and that focusing on modeling examples. The purpose of this study was to examine and compare the effect of four different types of examples from the two separate lines of research, including standard WEs, erroneous WEs, expert (masterly)…

  17. 45 CFR 612.7 - Exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to which the information pertains. Examples of records exempt from disclosure include, but are not... information to be withheld. Examples of records exempt from disclosure include, but are not limited to: (i... Federal Government owns or may own a right, title, or interest (including a nonexclusive license), 35 U.S...

  18. 77 FR 33133 - Patient Protection and Affordable Care Act; Data Collection To Support Standards Related to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-05

    ... includes both quantitative and non-quantitative limits on benefits. Examples of quantitative limits include... duration of treatment. Examples of non-quantitative limits include prior authorization and step therapy... relevant issuers would submit data and descriptive information on the [[Page 33136

  19. 15 CFR 700.52 - Examples of assistance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Examples of assistance. 700.52 Section... DEFENSE PRIORITIES AND ALLOCATIONS SYSTEM Special Priorities Assistance § 700.52 Examples of assistance... an item needed to fill a rated order. (b) Other examples of special priorities assistance include: (1...

  20. 26 CFR 1.482-3 - Methods to determine taxable income in connection with a transfer of tangible property.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... range. (4) Examples. The principles of this paragraph (b) are illustrated by the following examples..., marketing, advertising programs and services, (including promotional programs, rebates, and co-op... sold and operating expenses. (4) Examples. The following examples illustrate the principles of this...
