Sample records for model abstraction techniques

  1. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. ... [table-of-contents fragment: 2.5.1 Multiple Reference Processes; 2.5.2 Adding Monitor Processes] ... model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite

  2. Localization Versus Abstraction: A Comparison of Two Search Reduction Techniques

    NASA Technical Reports Server (NTRS)

    Lansky, Amy L.

    1992-01-01

    There has been much recent work on the use of abstraction to improve planning behavior and cost. Another technique for dealing with the inherently explosive cost of planning is localization. This paper compares the relative strengths of localization and abstraction in reducing planning search cost. In particular, localization is shown to subsume abstraction. Localization techniques can model the various methods of abstraction that have been used, but also provide a much more flexible framework, with a broader range of benefits.

  3. Test Input Generation for Red-Black Trees using Abstraction

    NASA Technical Reports Server (NTRS)

    Visser, Willem; Pasareanu, Corina S.; Pelanek, Radek

    2005-01-01

    We consider the problem of test input generation for code that manipulates complex data structures. Test inputs are sequences of method calls from the data structure interface. We describe test input generation techniques that rely on state matching to avoid generation of redundant tests. Exhaustive techniques use explicit state model checking to explore all the possible test sequences up to predefined input sizes. Lossy techniques rely on abstraction mappings to compute and store abstract versions of the concrete states; they explore under-approximations of all the possible test sequences. We have implemented the techniques on top of the Java PathFinder model checker and we evaluate them using a Java implementation of red-black trees.
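
    To make the state-matching idea concrete, here is a minimal Python sketch (not the authors' Java PathFinder implementation): test sequences are enumerated breadth-first, each sequence's concrete data structure is mapped through an abstraction function, and sequences whose abstract state has already been seen are pruned. A plain binary search tree with an insert-only interface stands in for the red-black tree, and the shape abstraction is an illustrative choice.

    ```python
    # Lossy test-input generation via abstraction-based state matching (sketch).
    # The abstraction maps a concrete tree to its shape (keys dropped), so any
    # call sequence reaching an already-seen shape is pruned as redundant.
    from collections import deque

    class Node:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def insert(root, key):
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root

    def build(seq):
        root = None
        for k in seq:
            root = insert(root, k)
        return root

    def shape(root):  # abstraction mapping: structure only, values forgotten
        return None if root is None else (shape(root.left), shape(root.right))

    def generate_tests(values, max_len):
        visited, tests, frontier = set(), [], deque([()])
        while frontier:
            seq = frontier.popleft()
            a = shape(build(seq))
            if a in visited:
                continue                      # lossy state matching: prune
            visited.add(a)
            tests.append(seq)
            if len(seq) < max_len:
                for v in values:
                    frontier.append(seq + (v,))
        return tests

    tests = generate_tests(range(4), 4)
    print(len(tests), "non-redundant insert sequences kept")
    ```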

  4. Applying model abstraction techniques to optimize monitoring networks for detecting subsurface contaminant transport

    USDA-ARS's Scientific Manuscript database

    Improving strategies for monitoring subsurface contaminant transport includes performance comparison of competing models, developed independently or obtained via model abstraction. Model comparison and parameter discrimination involve specific performance indicators selected to better understand s...

  5. Application of model abstraction techniques to simulate transport in soils

    USDA-ARS's Scientific Manuscript database

    Successful understanding and modeling of contaminant transport in soils is the precondition of risk-informed predictions of the subsurface contaminant transport. Exceedingly complex models of subsurface contaminant transport are often inefficient. Model abstraction is the methodology for reducing th...

  6. Integrating model abstraction into monitoring strategies

    USDA-ARS's Scientific Manuscript database

    This study was designed and performed to investigate the opportunities and benefits of integrating model abstraction techniques into monitoring strategies. The study focused on future applications of modeling to contingency planning and management of potential and actual contaminant release sites wi...

  7. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    NASA Astrophysics Data System (ADS)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.
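
    The externally supplied abstraction the authors describe sits between the inference tool and the system under test, folding concrete data parameters into a small abstract alphabet. The sketch below is a hedged, greatly simplified illustration of such a mapper; the message types and session-id predicate are invented for the example, whereas the real setup connects LearnLib to ns-2.

    ```python
    # Minimal sketch of an externally supplied abstraction mapper. Concrete
    # messages carry data parameters; the mapper folds them into finitely many
    # abstract symbols using predicates over state it tracks (here a session id).
    class AbstractionMapper:
        def __init__(self):
            self.session_id = None   # data the mapper remembers across the run

        def abstract(self, msg_type, sid):
            if self.session_id is None:
                self.session_id = sid
                return (msg_type, "FRESH_ID")
            return (msg_type, "CUR_ID" if sid == self.session_id else "OTHER_ID")

    mapper = AbstractionMapper()
    print(mapper.abstract("INVITE", 42))   # ('INVITE', 'FRESH_ID')
    print(mapper.abstract("ACK", 42))      # ('ACK', 'CUR_ID')
    print(mapper.abstract("ACK", 7))       # ('ACK', 'OTHER_ID')
    ```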

  8. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  9. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction-refinement in the context of hybrid automata.

  10. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking, most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted, the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking, it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  11. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  12. Opponent Modeling in Interesting Adversarial Environments

    DTIC Science & Technology

    2008-07-01

    University of Alberta and the research team at Carnegie Mellon University have both explored various abstraction and solution techniques. In [10] Billings et... solution for an abstraction of the entire four-betting-round version of Limit Texas Hold'em. At Carnegie Mellon University, Gilpin and Sandholm... measuring how well opponent models met these goals. We've found that modeling opponents that adapt while playing is an active and fertile field of

  13. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    NASA Astrophysics Data System (ADS)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve computational properties of a complex internal agent model, while preserving its behavioral properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioral agent model is obtained, which ensures a substantial increase of the computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, involving mutual affective-cognitive interactions.

  14. Teaching, Learning and Evaluation Techniques in the Engineering Courses.

    ERIC Educational Resources Information Center

    Vermaas, Luiz Lenarth G.; Crepaldi, Paulo Cesar; Fowler, Fabio Roberto

    This article presents some techniques of professional formation from the Petra Model that can be applied in Engineering Programs. It shows its philosophy and teaching methods for listening, making abstracts, studying, researching, team working, and problem solving. Some questions regarding planning and evaluation, based on the model, are, as well,…

  15. Computational Difficulties in the Identification and Optimization of Control Systems.

    DTIC Science & Technology

    1980-01-01

    As more realistic models for resource management are developed, the need for efficient computational techniques for parameter... optimization (optimal control) in "state" models... This research was supported in part by the National Science Foundation under grant NSF-MCS 79-05774

  16. Formal methods for modeling and analysis of hybrid systems

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)

    2009-01-01

    A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
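
    A rough flavor of such qualitative abstraction can be given numerically: map each continuous state to the sign vector of a set of polynomials, so the infinite state space collapses to finitely many abstract states, and record which abstract transitions the flow can take. The sketch below merely samples trajectories rather than performing the quantifier-elimination construction described in the patent, and the flow and polynomials are illustrative choices.

    ```python
    # Toy sign-based (qualitative) abstraction of a 1-D flow. Abstract states
    # are sign vectors of the chosen polynomials; transitions are harvested
    # from sampled Euler trajectories. Illustrative only, not the QE procedure.
    f = lambda x: 0.5 - x                     # continuous flow dx/dt = 0.5 - x
    polys = [lambda x: x, lambda x: x - 1.0]  # abstraction polynomials
    EPS = 0.002                               # zero band wider than one step

    def sign(v):
        return 0 if abs(v) <= EPS else (1 if v > 0 else -1)

    def abstract(x):
        return tuple(sign(p(x)) for p in polys)

    transitions = set()
    for i in range(31):                       # initial states in [-1, 2]
        x = -1.0 + 0.1 * i
        prev = abstract(x)
        for _ in range(5000):
            x += 0.001 * f(x)                 # explicit Euler step
            cur = abstract(x)
            if cur != prev:
                transitions.add((prev, cur))
                prev = cur
    print(sorted(transitions))                # the finite abstract transitions
    ```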

  17. Automatic Review of Abstract State Machines by Meta Property Verification

    NASA Technical Reports Server (NTRS)

    Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia

    2010-01-01

    A model review is a validation technique aimed at determining if a model is of sufficient quality and allows defects to be identified early in the system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first detect a family of typical vulnerabilities and defects a developer can introduce during the modeling activity using the ASMs and we express such faults as the violation of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the result of applying this ASM review process to several specifications.

  18. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS's Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  19. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
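
    As a small numeric illustration of the kind of Markov reliability evaluation these tools automate (far below the scale and semi-Markov generality of RMG, ASSURE, and SURE), the sketch below integrates the Kolmogorov equations for a two-unit standby system; the failure and repair rates are assumed values, not from the abstract.

    ```python
    # Two-unit standby system: state 0 = both up, 1 = one up, 2 = failed.
    # Either unit fails at rate lam (so 2*lam out of state 0); repair rate mu.
    # Forward Kolmogorov equations stepped with explicit Euler.
    lam, mu, dt, T = 1e-3, 1e-1, 0.1, 1000.0   # per-hour rates, step, horizon
    p = [1.0, 0.0, 0.0]                        # initial state probabilities
    t = 0.0
    while t < T:
        d0 = -2 * lam * p[0] + mu * p[1]
        d1 = 2 * lam * p[0] - (lam + mu) * p[1]
        d2 = lam * p[1]
        p = [p[0] + dt * d0, p[1] + dt * d1, p[2] + dt * d2]
        t += dt
    print(f"unreliability at {T:.0f} h: {p[2]:.2e}")
    ```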

  20. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model. Thesis, John Haiducek, 1st Lt, USAF, AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. Abstract: The High Energy Laser End-to-End

  21. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  22. Computer Code for Interpreting 13C NMR Relaxation Measurements with Specific Models of Molecular Motion: The Rigid Isotropic and Symmetric Top Rotor Models and the Flexible Symmetric Top Rotor Model

    DTIC Science & Technology

    2017-01-01

    Carbon-13 nuclear magnetic resonance (13C NMR) spectroscopy is a powerful technique for... Nuclear magnetic resonance (NMR) spectroscopy is a tremendously powerful technique for... application of NMR spectroscopy concerns the property of molecular motion, which is related to many physical, and even biological, functions of molecules in

  23. Physical principles for DNA tile self-assembly.

    PubMed

    Evans, Constantine G; Winfree, Erik

    2017-06-19

    DNA tiles provide a promising technique for assembling structures with nanoscale resolution through self-assembly by basic interactions rather than top-down assembly of individual structures. Tile systems can be programmed to grow based on logical rules, allowing for a small number of tile types to assemble large, complex assemblies that can retain nanoscale resolution. Such algorithmic systems can even assemble different structures using the same tiles, based on inputs that seed the growth. While programming and theoretical analysis of tile self-assembly often makes use of abstract logical models of growth, experimentally implemented systems are governed by nanoscale physical processes that can lead to very different behavior, more accurately modeled by taking into account the thermodynamics and kinetics of tile attachment and detachment in solution. This review discusses the relationships between more abstract and more physically realistic tile assembly models. A central concern is how consideration of model differences enables the design of tile systems that robustly exhibit the desired abstract behavior in realistic physical models and in experimental implementations. Conversely, we identify situations where self-assembly in abstract models can not be well-approximated by physically realistic models, putting constraints on physical relevance of the abstract models. To facilitate the discussion, we introduce a unified model of tile self-assembly that clarifies the relationships between several well-studied models in the literature. Throughout, we highlight open questions regarding the physical principles for DNA tile self-assembly.
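
    For readers unfamiliar with the physically realistic models the review compares against, the sketch below evaluates the standard rate balance of the kinetic Tile Assembly Model (kTAM), in which attachment is concentration-driven and detachment depends on total bond strength. The rate constant and free energies are illustrative values, not parameters taken from the review.

    ```python
    # kTAM-style rate balance: attachment rate set by monomer concentration
    # (via G_mc), detachment rate by total sticky-end bond strength b (via G_se).
    # Growth is net-favorable only for attachments with enough matching bonds.
    import math

    k_f = 1e6                 # forward rate constant, illustrative
    G_mc, G_se = 17.0, 9.2    # unitless free energies (in kT), illustrative

    def r_on():
        return k_f * math.exp(-G_mc)       # same for every tile type

    def r_off(b):
        return k_f * math.exp(-b * G_se)   # b = number of matching bonds

    for b in (1, 2, 3):
        ratio = r_on() / r_off(b)
        regime = "grows" if ratio > 1 else "shrinks"
        print(f"b={b}: on/off = {ratio:.2e} -> a {b}-bond attachment {regime}")
    ```

    With these values only attachments held by two or more bonds are net-favorable, which is the regime algorithmic (rule-following) growth relies on; errors become likely as G_mc approaches 2 G_se.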

  24. Moving Target Techniques: Leveraging Uncertainty for CyberDefense

    DTIC Science & Technology

    2015-12-15

    cyberattacks is a continual struggle for system managers. Attackers often need only find one vulnerability (a flaw or bug that an attacker can exploit)... additional parsing code itself could have security-relevant software bugs. Techniques in the dynamic network domain change the... evaluation of MT techniques can benefit from a variety of evaluation approaches, including abstract analysis, modeling and simulation, test bed

  25. Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management

    NASA Astrophysics Data System (ADS)

    Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken

    2015-04-01

    The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform to surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing, as well as the introduction of water trading concepts. Dynamic licensing will permit varying levels of abstraction dependent on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare potential water resource availability under a modelled dynamic abstraction scenario, informed by a flow forecasting model, against observed abstraction under a conventional abstraction regime. The work also demonstrates the impacts of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised a conceptual rainfall-runoff model, PDM (the Probability-Distributed Model developed by the Centre for Ecology & Hydrology), set up in the Dove River catchment (UK) using 1 km2 resolution radar rainfall as input and 15 min resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data. Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and propagated through the model to assess their influence on the forecasted flow uncertainty. Furthermore, the effects of uncertainties at different forecast lead times on potential abstraction strategies are assessed. The results show that over a 10 year period, an average of approximately 70 ML/d of potentially available water is missed in the study catchment under a conventional abstraction regime. This indicates considerable potential for the use of flow forecasting models to effectively implement advanced abstraction management and more efficiently utilise available water resources in the study catchment.
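
    The uncertainty-propagation step described above can be sketched compactly: perturb the radar rainfall with a Gaussian error model, route each realisation through a rainfall-runoff model, and read percentiles off the resulting flow ensemble. In this hedged Python sketch a single linear reservoir stands in for the PDM model, and the rainfall series, error standard deviation, and reservoir constant are illustrative assumptions.

    ```python
    # Monte Carlo propagation of a Gaussian radar-rainfall error model through
    # a toy rainfall-runoff model (one linear reservoir standing in for PDM).
    import random

    def linear_reservoir(rain, k=20.0, dt=0.25, s0=0.0):
        """Route rainfall through storage: S' = R - S/k, outflow Q = S/k."""
        s, flows = s0, []
        for r in rain:
            s += dt * (r - s / k)
            flows.append(s / k)
        return flows

    rain = [0, 0, 2, 5, 8, 4, 1, 0, 0, 0]      # "observed" radar rainfall (mm)
    ens = []
    for _ in range(500):                        # Monte Carlo realisations
        noisy = [max(0.0, r * random.gauss(1.0, 0.3)) for r in rain]
        ens.append(linear_reservoir(noisy))

    peak = sorted(max(q) for q in ens)
    print("peak flow 5th-95th percentile:", peak[25], peak[475])
    ```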

  26. Lateral-Directional Parameter Estimation on the X-48B Aircraft Using an Abstracted, Multi-Objective Effector Model

    NASA Technical Reports Server (NTRS)

    Ratnayake, Nalin A.; Waggoner, Erin R.; Taylor, Brian R.

    2011-01-01

    The problem of parameter estimation on hybrid-wing-body aircraft is complicated by the fact that many design candidates for such aircraft involve a large number of aerodynamic control effectors that act in coplanar motion. This adds to the complexity already present in the parameter estimation problem for any aircraft with a closed-loop control system. Decorrelation of flight and simulation data must be performed in order to ascertain individual surface derivatives with any sort of mathematical confidence. Non-standard control surface configurations, such as clamshell surfaces and drag-rudder modes, further complicate the modeling task. In this paper, time-decorrelation techniques are applied to a model structure selected through stepwise regression for simulated and flight-generated lateral-directional parameter estimation data. A virtual effector model that uses mathematical abstractions to describe the multi-axis effects of clamshell surfaces is developed and applied. Comparisons are made between time history reconstructions and observed data in order to assess the accuracy of the regression model. The Cramér-Rao lower bounds of the estimated parameters are used to assess the uncertainty of the regression model relative to alternative models. Stepwise regression was found to be a useful technique for lateral-directional model design for hybrid-wing-body aircraft, as suggested by available flight data. Based on the results of this study, linear regression parameter estimation methods using abstracted effectors are expected to perform well for hybrid-wing-body aircraft properly equipped for the task.

  27. Abstract probabilistic CNOT gate model based on double encoding: study of the errors and physical realizability

    NASA Astrophysics Data System (ADS)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2015-03-01

    In this work, we study the error sources standing behind the non-perfect linear optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.

  28. COMPUTER SIMULATIONS OF LUNG AIRWAY STRUCTURES USING DATA-DRIVEN SURFACE MODELING TECHNIQUES

    EPA Science Inventory

    ABSTRACT

    Knowledge of human lung morphology is a subject critical to many areas of medicine. The visualization of lung structures naturally lends itself to computer graphics modeling due to the large number of airways involved and the complexities of the branching systems...

  29. The Living Cell as a Multi-agent Organisation: A Compositional Organisation Model of Intracellular Dynamics

    NASA Astrophysics Data System (ADS)

    Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.

    Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli it is shown how indeed agent-based organisational modelling techniques can be used to simulate and analyse E.coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.

  30. SHERLOCK: A Coached Practice Environment for an Electronics Troubleshooting Job

    DTIC Science & Technology

    1988-03-01

    context. At the most abstract level, it is the cognitive version of earlier approaches to errorless learning (Terrace, 1964). With support from a... not yet learned and is the basis for interactions with the student. Sherlock does not use simulation techniques to model student performance. Its... annotation of how well the student is expected to do at each point of the abstracted problem space. The object (microprogram) corresponding to each node

  31. Copy Hiding Application Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Holger; Poliakoff, David; Robinson, Peter

    2016-10-06

    CHAI is a light-weight framework which abstracts the automated movement of data (e.g., to/from host/device) via RAJA-like performance-portability programming model constructs. It can be viewed as a utility framework and an adjunct to RAJA (a performance portability framework). Performance portability is a technique that abstracts the complexities of modern heterogeneous architectures while allowing the original program to undergo incremental, minimally invasive code changes in order to adapt to the newer architectures.

  32. Spatial-temporal variability in groundwater abstraction across Uganda: Implications to sustainable water resources management

    NASA Astrophysics Data System (ADS)

    Nanteza, J.; Thomas, B. F.; Mukwaya, P. I.

    2017-12-01

    The general lack of knowledge about the current rates of water abstraction/use is a challenge to sustainable water resources management in many countries, including Uganda. Estimates of water abstraction/use rates over Uganda, currently available from the FAO are not disaggregated according to source, making it difficult to understand how much is taken out of individual water stores, limiting effective management. Modelling efforts have disaggregated water use rates according to source (i.e. groundwater and surface water). However, over Sub-Saharan Africa countries, these model use estimates are highly uncertain given the scale limitations in applying water use (i.e. point versus regional), thus influencing model calibration/validation. In this study, we utilize data from the water supply atlas project over Uganda to estimate current rates of groundwater abstraction across the country based on location, well type and other relevant information. GIS techniques are employed to demarcate areas served by each water source. These areas are combined with past population distributions and average daily water needed per person to estimate water abstraction/use through time. The results indicate an increase in groundwater use, and isolate regions prone to groundwater depletion where improved management is required to sustainably management groundwater use.
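
    The core estimation arithmetic described above reduces to: abstraction per source ≈ population in the demarcated service area × per-capita daily demand. The sketch below shows this back-of-envelope logic; the source list, populations, and the 20 L/person/day demand figure are invented illustrations, not values from the study.

    ```python
    # Per-source groundwater abstraction estimate: service-area population
    # times assumed per-capita daily demand, summed for a regional total.
    sources = [                      # (source_id, population served) - assumed
        ("borehole_A", 1200),
        ("borehole_B", 450),
        ("shallow_well_C", 300),
    ]
    LITRES_PER_PERSON_PER_DAY = 20   # assumed rural demand, illustrative

    total = 0.0
    for sid, pop in sources:
        m3_per_day = pop * LITRES_PER_PERSON_PER_DAY / 1000.0
        total += m3_per_day
        print(f"{sid}: {m3_per_day:.1f} m3/day")
    print(f"regional total: {total:.1f} m3/day")
    ```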

  33. Temporal abstraction and temporal Bayesian networks in clinical domains: a survey.

    PubMed

    Orphanou, Kalia; Stassopoulou, Athena; Keravnou, Elpida

    2014-03-01

    Temporal abstraction (TA) of clinical data aims to abstract and interpret clinical data into meaningful higher-level interval concepts. Abstracted concepts are used for diagnostic, prediction and therapy planning purposes. On the other hand, temporal Bayesian networks (TBNs) are temporal extensions of the known probabilistic graphical models, Bayesian networks. TBNs can represent temporal relationships between events and their state changes, or the evolution of a process, through time. This paper offers a survey on techniques/methods from these two areas that were used independently in many clinical domains (e.g. diabetes, hepatitis, cancer) for various clinical tasks (e.g. diagnosis, prognosis). A main objective of this survey, in addition to presenting the key aspects of TA and TBNs, is to point out important benefits from a potential integration of TA and TBNs in medical domains and tasks. The motivation for integrating these two areas is their complementary function: TA provides clinicians with high level views of data while TBNs serve as a knowledge representation and reasoning tool under uncertainty, which is inherent in all clinical tasks. Key publications from these two areas of relevance to clinical systems, mainly circumscribed to the latest two decades, are reviewed and classified. TA techniques are compared on the basis of: (a) knowledge acquisition and representation for deriving TA concepts and (b) methodology for deriving basic and complex temporal abstractions. TBNs are compared on the basis of: (a) representation of time, (b) knowledge representation and acquisition, (c) inference methods and the computational demands of the network, and (d) their applications in medicine. The survey performs an extensive comparative analysis to illustrate the separate merits and limitations of various TA and TBN techniques used in clinical systems with the purpose of anticipating potential gains through an integration of the two techniques, thus leading to a unified methodology for clinical systems. The surveyed contributions are evaluated using frameworks of respective key features. In addition, for the evaluation of TBN methods, a unifying clinical domain (diabetes) is used. The main conclusion transpiring from this review is that techniques/methods from these two areas, that so far are being largely used independently of each other in clinical domains, could be effectively integrated in the context of medical decision-support systems. The anticipated key benefits of the perceived integration are: (a) during problem solving, the reasoning can be directed at different levels of temporal and/or conceptual abstractions since the nodes of the TBNs can be complex entities, temporally and structurally and (b) during model building, knowledge generated in the form of basic and/or complex abstractions, can be deployed in a TBN. Copyright © 2014 Elsevier B.V. All rights reserved.
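
    As a concrete illustration of the most basic form of temporal abstraction surveyed here (state abstraction followed by interval merging), the sketch below maps point-based measurements to qualitative states and merges consecutive equal states into maximal intervals; the readings and thresholds are illustrative, not taken from any of the surveyed systems.

    ```python
    # Basic state temporal abstraction: point data -> qualitative states ->
    # maximal intervals of constant state.
    readings = [(0, 4.8), (1, 5.2), (2, 8.1), (3, 9.0), (4, 5.5), (5, 5.1)]

    def state(v):                         # illustrative clinical thresholds
        return "LOW" if v < 4.0 else "HIGH" if v > 7.0 else "NORMAL"

    intervals, cur = [], None
    for t, v in readings:
        s = state(v)
        if cur and cur[2] == s:
            cur = (cur[0], t, s)          # extend the open interval
        else:
            if cur:
                intervals.append(cur)
            cur = (t, t, s)
    intervals.append(cur)
    print(intervals)   # [(0, 1, 'NORMAL'), (2, 3, 'HIGH'), (4, 5, 'NORMAL')]
    ```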

  34. JIGSAW: Preference-directed, co-operative scheduling

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.; Gaw, David

    1992-01-01

    Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.

  35. Investigation of Antiangiogenic Mechanisms Using Novel Imaging Techniques

    DTIC Science & Technology

    2010-02-01

    of the tumor environment can sensitize the tumor to conventional cytotoxic therapies. To this end, we employ the window chamber model to optically... facilitate longitudinal, in vivo investigation into the parameters of interest. These include Doppler Optical Coherence Tomography for the measurement of... Subject terms: Optical Techniques, Tumor Pathophysiology, Treatment Response, Vascular Normalization

  36. Instruction-level performance modeling and characterization of multimedia applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y.; Cameron, K.W.

    1999-06-01

    One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique of characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effect, and they quantify utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization and empirical/analytical modeling.
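
    The CPI decomposition mentioned above can be illustrated with a few lines of arithmetic: counter totals give the measured CPI, an ideal-pipeline CPI_0 is estimated from the instruction mix and per-class latencies, and the difference is attributed to memory stalls. The mix, latencies, and cycle count below are invented illustrative numbers, not measurements from the paper.

    ```python
    # Instruction-level characterization arithmetic: measured CPI from counter
    # totals versus an ideal CPI_0 estimated from the instruction mix.
    counts = {"int": 6.0e9, "fp": 2.0e9, "load_store": 3.0e9, "branch": 1.0e9}
    latency = {"int": 1.0, "fp": 3.0, "load_store": 1.0, "branch": 1.5}

    total_instr = sum(counts.values())
    cpi0 = sum(counts[c] * latency[c] for c in counts) / total_instr
    measured_cycles = 30e9                     # from the cycle counter (assumed)
    cpi = measured_cycles / total_instr
    print(f"CPI_0 (no memory effect) = {cpi0:.2f}")
    print(f"measured CPI             = {cpi:.2f}")
    print(f"memory stall component   = {cpi - cpi0:.2f} cycles/instruction")
    ```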

  37. Reasoning about Function Objects

    NASA Astrophysics Data System (ADS)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
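
    A hedged Python rendering of the core idea (the paper itself targets languages such as C#, Eiffel, and Scala) bundles a function object with side-effect-free precondition and postcondition methods, so client code can be checked against the specification without knowing the concrete closure. The names and the runtime assertions are our illustrative simplification of what the authors verify statically.

    ```python
    # Function object extended with pure (side-effect-free) specification
    # methods: pre(x) and post(x, y). Clients rely only on the specification.
    class FunctionObject:
        def __init__(self, apply, pre, post):
            self.apply, self.pre, self.post = apply, pre, post

        def call(self, x):
            assert self.pre(x), "precondition violated"
            y = self.apply(x)
            assert self.post(x, y), "postcondition violated"
            return y

    sqrt_obj = FunctionObject(
        apply=lambda x: x ** 0.5,
        pre=lambda x: x >= 0,                       # pure: no side effects
        post=lambda x, y: abs(y * y - x) < 1e-9,
    )
    print(sqrt_obj.call(2.0))
    ```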

  38. Modeling biological gradient formation: combining partial differential equations and Petri nets.

    PubMed

    Bertens, Laura M F; Kleijn, Jetty; Hille, Sander C; Heiner, Monika; Koutny, Maciej; Verbeek, Fons J

    2016-01-01

    Both Petri nets and differential equations are important modeling tools for biological processes. In this paper we demonstrate how these two modeling techniques can be combined to describe biological gradient formation. Parameters derived from a partial differential equation describing the process of gradient formation are incorporated in an abstract Petri net model. The quantitative aspects of the resulting model are validated through a case study of gradient formation in the fruit fly.
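
    A toy version of this coupling: discretize the 1-D diffusion equation u_t = D u_xx into compartments that act like places of a Petri net, with token-flow rates derived from the PDE parameter D and the grid spacing. The sketch below runs the deterministic rate equation of such a net; all parameter values are illustrative assumptions, not the authors' model.

    ```python
    # Compartments as Petri-net places; the per-token hop rate between
    # neighbouring places comes from the PDE parameters: rate = D / dx^2.
    D, dx, dt = 1.0, 1.0, 0.1
    rate = D / dx**2

    places = [0.0] * 10
    places[0] = 100.0                     # morphogen source at left boundary
    for _ in range(200):                  # deterministic rate equation
        flux = [rate * (places[i] - places[i + 1]) for i in range(len(places) - 1)]
        for i, f in enumerate(flux):
            places[i] -= dt * f
            places[i + 1] += dt * f
    print([round(p, 1) for p in places])  # monotone gradient from the source
    ```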

  39. Computer Security Models

    DTIC Science & Technology

    1984-09-01

    "Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York... September 1984, MTR9531. J. K. Millen, C. M. Cerniglia, Computer Security Models. Contract sponsor: OUSDRE/C3I & ESD... Abstract: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development

  40. Control Theory Perspective of Effects-Based Thinking and Operations: Modelling Operations as a Feedback Control System

    DTIC Science & Technology

    2007-11-01

    Control Theory Perspective of Effects-Based Thinking and Operations: Modelling "Operations" as a Feedback Control System. Philip S. E. Farrell... Abstract: This paper explores operations that involve effects-based thinking (EBT) using Control Theory techniques in order to highlight the concept's

  41. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  42. Topological characterization versus synchronization for assessing (or not) dynamical equivalence

    NASA Astrophysics Data System (ADS)

    Letellier, Christophe; Mangiarotti, Sylvain; Sendiña-Nadal, Irene; Rössler, Otto E.

    2018-04-01

    Model validation from experimental data is an important and non-trivial topic, which is too often reduced to a simple visual inspection of the state portrait spanned by the variables of the system. Synchronization was suggested as a possible technique for model validation. By means of a topological analysis, we revisited this concept with the help of an abstract chemical reaction system and data from two electrodissolution experiments conducted by Jack Hudson's group. The fact that it was possible to synchronize topologically different global models led us to conclude that synchronization is not a recommendable technique for model validation. A short historical preamble evokes Jack Hudson's early career in interaction with Otto E. Rössler.

  43. GIS Learning Objects: An Approach to Content Aggregation

    ERIC Educational Resources Information Center

    Govorov, Michael; Gienko, Gennady

    2013-01-01

    Content development and maintenance of geographic information systems (GIS) related courses, especially designed for distance and online delivery, could be a tedious task even for an experienced instructor. The paper outlines application of abstract instructional design techniques for modeling course structure and developing corresponding course…

  44. An Abstraction-Based Data Model for Information Retrieval

    NASA Astrophysics Data System (ADS)

    McAllister, Richard A.; Angryk, Rafal A.

    Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entrée to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
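
    The "paths of abstraction" idea can be sketched with NLTK's WordNet interface: every hypernym on a path from a word's synset to the root becomes an index term, so a query at any abstraction level retrieves documents containing specializations. This assumes NLTK and its WordNet data are installed, and the indexing scheme below is our simplification, not the paper's exact structure.

    ```python
    # Inverted index keyed by abstraction-path terms from WordNet
    # (requires: pip install nltk, then nltk.download("wordnet")).
    from collections import defaultdict
    from nltk.corpus import wordnet as wn

    def abstraction_terms(word):
        """All hypernyms on every path from the word's first sense to the root."""
        terms = set()
        for path in wn.synsets(word)[0].hypernym_paths():
            terms.update(s.name() for s in path)
        return terms

    # Abstract concept -> documents containing one of its specializations.
    docs = {1: ["dog", "leash"], 2: ["cat"], 3: ["car"]}
    index = defaultdict(set)
    for doc_id, words in docs.items():
        for w in words:
            for t in abstraction_terms(w):
                index[t].add(doc_id)

    print(index["animal.n.01"])   # expected: {1, 2} - dog and cat are animals
    ```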

  45. Development and validation of techniques for improving software dependability

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    A collection of document abstracts is presented on the topic of improving software dependability, covering work performed under NASA grant NAG-1-1123. Specific topics include: modeling of error detection; software inspection; test cases; Magnetic Stereotaxis System safety specifications and fault trees; and injection of synthetic faults into software.

  46. Computer Description of Black Hawk Helicopter

    DTIC Science & Technology

    1979-06-01

    Keywords: Combinatorial Geometry Models, Black Hawk Helicopter, GIFT Computer Code, Geometric Description of Targets. Abstract: ...description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code which generates... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents

  47. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mardechay

    1992-01-01

    The purpose of the research project was to continue the development of new methods for efficient aeroservoelastic analysis and optimization. The main targets were as follows: to complete the development of analytical tools for the investigation of flutter with large stiffness changes; to continue the work on efficient continuous gust response and sensitivity derivatives; and to advance the techniques of calculating dynamic loads with control and unsteady aerodynamic effects. An efficient and highly accurate mathematical model for time-domain analysis of flutter during which large structural changes occur was developed in cooperation with Carol D. Wieseman of NASA LaRC. The model was based on the second-year work 'Modal Coordinates for Aeroelastic Analysis with Large Local Structural Variations'. The work on continuous gust response was completed. An abstract of the paper 'Continuous Gust Response and Sensitivity Derivatives Using State-Space Models' was submitted for presentation at the 33rd Israel Annual Conference on Aviation and Astronautics, Feb. 1993. The abstract is given in Appendix A. The work extends the optimization model to deal with continuous gust objectives in a way that facilitates their inclusion in the efficient multi-disciplinary optimization scheme. Currently under development is a work designed to extend the analysis and optimization capabilities to loads and stress considerations. The work is on aircraft dynamic loads in response to impulsive and non-impulsive excitation. The work extends the formulations of the mode-displacement and summation-of-forces methods to include modes with significant local distortions, and load modes. An abstract of the paper, 'Structural Dynamic Loads in Response to Impulsive Excitation', is given in Appendix B. Another work performed this year under the Grant was 'Size-Reduction Techniques for the Determination of Efficient Aeroservoelastic Models', given in Appendix C.

  48. Instruction-Level Characterization of Scientific Computing Applications Using Hardware Performance Counters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y.; Cameron, K.W.

    1998-11-24

    Workload characterization has been proven an essential tool for architecture design and performance evaluation in both scientific and commercial computing areas. Traditional workload characterization techniques include FLOPS rate, cache miss ratios, CPI (cycles per instruction; or IPC, instructions per cycle), etc. With the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of any architectural or functional improvement in a new processor design. To solve these problems, many people rely on simulators, which have substantial constraints, especially on large-scale scientific computing applications. This paper presents a new technique of characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs virtually without overhead or slowdown. A variety of instruction counts can be utilized to calculate some average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results can provide some insight into the problem that only a small percentage of processor peak performance can be achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.

  49. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  50. Programmers manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A one-dimensional Lagrangian transport model for simulating water-quality constituents such as temperature, dissolved oxygen, and suspended sediment in rivers is presented in this Programmers Manual. Lagrangian transport modeling techniques, the model's subroutines, and the user-written decay-coefficient subroutine are discussed in detail. Appendices list the program codes. The Programmers Manual is intended for the model user who needs to modify code either to adapt the model to a particular need or to use reaction kinetics not provided with the model. (Author's abstract)

  51. EMISSIONS INVENTORY OF PM 2.5 TRACE ELEMENTS ACROSS THE U.S.

    EPA Science Inventory

    This abstract describes work done to speciate PM2.5 emissions into emissions of trace metals to enable concentrations of metal species to be predicted by air quality models. Methods are described and initial results are presented. A technique for validating the resul...

  52. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    ERIC Educational Resources Information Center

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  53. Understanding and predicting changing use of groundwater with climate and other uncertainties: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Costa, F. A. F.; Keir, G.; McIntyre, N.; Bulovic, N.

    2015-12-01

    Most groundwater supply bores in Australia do not have flow metering equipment and so regional groundwater abstraction rates are not well known. Past estimates of unmetered abstraction for regional numerical groundwater modelling typically have not attempted to quantify the uncertainty inherent in the estimation process in detail. In particular, the spatial properties of errors in the estimates are almost always neglected. Here, we apply Bayesian spatial models to estimate these abstractions at a regional scale, using the state-of-the-art computationally inexpensive approaches of integrated nested Laplace approximation (INLA) and stochastic partial differential equations (SPDE). We examine a case study in the Condamine Alluvium aquifer in southern Queensland, Australia; even in this comparatively data-rich area with extensive groundwater abstraction for agricultural irrigation, approximately 80% of bores do not have reliable metered flow records. Additionally, the metering data in this area are characterised by complicated statistical features, such as zero-valued observations, non-normality, and non-stationarity. While this precludes the use of many classical spatial estimation techniques, such as kriging, our model (using the R-INLA package) is able to accommodate these features. We use a joint model to predict both probability and magnitude of abstraction from bores in space and time, and examine the effect of a range of high-resolution gridded meteorological covariates upon the predictive ability of the model. Deviance Information Criterion (DIC) scores are used to assess a range of potential models, which reward good model fit while penalising excessive model complexity. We conclude that maximum air temperature (as a reasonably effective surrogate for evapotranspiration) is the most significant single predictor of abstraction rate; and that a significant spatial effect exists (represented by the SPDE approximation of a Gaussian random field with a Matérn covariance function). Our final model adopts air temperature, solar exposure, and normalized difference vegetation index (NDVI) as covariates, shows good agreement with previous estimates at a regional scale, and additionally offers rigorous quantification of uncertainty in the estimate.

  54. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  55. Investigation of the SCS-CN initial abstraction ratio using a Monte Carlo simulation for the derived flood frequency curves

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Chiarello, V.; Galeati, G.

    2014-12-01

    Peak discharge estimates for a given return period are of primary importance in engineering practice for risk assessment and hydraulic structure design. Different statistical methods are chosen here for the assessment of the flood frequency curve: an indirect technique based on extreme rainfall event analysis, and the Peak Over Threshold (POT) model and the Annual Maxima approach as direct techniques using river discharge data. In the framework of the indirect method, a Monte Carlo simulation approach is adopted to determine a derived frequency distribution of peak runoff using a probabilistic formulation of the SCS-CN method as the stochastic rainfall-runoff model. A Monte Carlo simulation is used to generate a sample of runoff events from different stochastic combinations of rainfall depth, storm duration, and initial loss inputs. The distribution of the rainfall storm events is assumed to follow the GP law, whose parameters are estimated through the GEV parameters of annual maximum data. The evaluation of the initial abstraction ratio is investigated since it is one of the most questionable assumptions in the SCS-CN model and plays a key role in river basins characterized by high-permeability soils, mainly governed by the infiltration-excess mechanism. In order to take into account the uncertainty of the model parameters, this modified approach, which is able to revise and re-evaluate the original value of the initial abstraction ratio, is implemented. In the POT model the choice of the threshold is an essential issue, mainly based on a compromise between bias and variance. The Generalized Extreme Value (GEV) distribution fitted to the annual maxima discharges is therefore compared with the Pareto-distributed peaks to check the suitability of the frequency-of-occurrence representation. The methodology is applied to a large dam in the Serchio river basin, located in the Tuscany Region. The application has shown that the Monte Carlo simulation technique can be a useful tool to provide more robust estimates than those obtained by direct statistical methods alone.
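
    The indirect method's core machinery can be sketched briefly: sample storm depths and an uncertain initial-abstraction ratio, push each pair through the SCS-CN runoff equation Q = (P - Ia)^2 / (P - Ia + S) with Ia = λS and S = 25400/CN - 254 (mm), and read design quantiles off the simulated sample. The distributions, curve number, and one-event-per-year simplification below are illustrative assumptions, not the study's calibrated values.

    ```python
    # Monte Carlo SCS-CN with an uncertain initial-abstraction ratio lambda.
    import random

    CN = 75.0
    S = 25400.0 / CN - 254.0                 # potential retention (mm)

    def runoff(P, lam):
        Ia = lam * S                         # initial abstraction
        return 0.0 if P <= Ia else (P - Ia) ** 2 / (P - Ia + S)

    random.seed(1)
    sims = []
    for _ in range(100_000):
        P = random.paretovariate(3.0) * 20.0    # heavy-tailed storm depth (mm)
        lam = random.uniform(0.05, 0.2)         # uncertain ratio, not fixed 0.2
        sims.append(runoff(P, lam))

    sims.sort()
    for T in (10, 50, 100):                     # return periods, one event/year
        q = sims[int(len(sims) * (1 - 1 / T))]
        print(f"T={T:>3} yr: Q ~ {q:.1f} mm")
    ```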

  16. ASTRONAUTICS INFORMATION. Abstracts Vol. III, No. 1. Abstracts 3,082- 3,184

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1961-01-01

    Abstracts are presented on astronautics. The abstracts are generally restricted to spaceflight and to applicable techniques and data. The publication covers the period of January 1961. 102 references. (J.R.D.)

  17. Using Nonlinear Programming in International Trade Theory: The Factor-Proportions Model

    ERIC Educational Resources Information Center

    Gilbert, John

    2004-01-01

    Students at all levels benefit from a multi-faceted approach to learning abstract material. The most commonly used technique in teaching the pure theory of international trade is a combination of geometry and algebraic derivations. Numerical simulation can provide a valuable third support to these approaches. The author describes a simple…

  18. Utilizing a structural meta-ontology for family-based quality assurance of the BioPortal ontologies.

    PubMed

    Ochs, Christopher; He, Zhe; Zheng, Ling; Geller, James; Perl, Yehoshua; Hripcsak, George; Musen, Mark A

    2016-06-01

    An Abstraction Network is a compact summary of an ontology's structure and content. In previous research, we showed that Abstraction Networks support quality assurance (QA) of biomedical ontologies. The development of an Abstraction Network and its associated QA methodologies, however, is a labor-intensive process that previously was applicable only to one ontology at a time. To improve the efficiency of the Abstraction-Network-based QA methodology, we introduced a QA framework that uses uniform Abstraction Network derivation techniques and QA methodologies that are applicable to whole families of structurally similar ontologies. For the family-based framework to be successful, it is necessary to develop a method for classifying ontologies into structurally similar families. We now describe a structural meta-ontology that classifies ontologies according to certain structural features that are commonly used in the modeling of ontologies (e.g., object properties) and that are important for Abstraction Network derivation. Each class of the structural meta-ontology represents a family of ontologies with identical structural features, indicating which types of Abstraction Networks and QA methodologies are potentially applicable to all of the ontologies in the family. We derive a collection of 81 families, corresponding to classes of the structural meta-ontology, that enable a flexible, streamlined family-based QA methodology, offering multiple choices for classifying an ontology. The structure of 373 ontologies from the NCBO BioPortal is analyzed and each ontology is classified into multiple families modeled by the structural meta-ontology. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes nets and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. Using Bayes nets, nine syndromes of GAD were abstracted from the symptoms; eight syndromes were abstracted by cluster analysis. After screening out duplicate syndromes and drawing on expert experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on these results, a draft of the Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes nets and cluster analysis show promise for establishing syndrome models and analyzing syndrome discipline, and are thus suitable for research on syndrome differentiation.
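
    A minimal sketch of the cluster-analysis step described above, applying hierarchical (Ward) clustering to a hypothetical binary symptom matrix; the study's actual data and its Bayes-net step are not reproduced.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    # Hypothetical binary matrix: rows = symptoms, columns = patients (1 = present)
    symptoms = [f"symptom_{i}" for i in range(20)]
    X = (rng.random((20, 200)) < 0.3).astype(float)

    # Ward linkage on symptom co-occurrence profiles; cut into candidate syndromes
    Z = linkage(X, method="ward")
    labels = fcluster(Z, t=6, criterion="maxclust")  # e.g. six candidate syndromes

    for c in sorted(set(labels)):
        members = [s for s, l in zip(symptoms, labels) if l == c]
        print(f"cluster {c}: {members}")
    ```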

  20. ASTRONAUTICS INFORMATION. ABSTRACTS, VOL. V, NO. 3. Abstracts 5,201- 5,330

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardgrove, B.J.; Warren, F.L. comps.

    1962-03-01

    Abstracts of astronautics information covering the period March 1962 are presented. The 129 abstracts cover the subject of spaceflight and applicable data and techniques. Author, subject, and source indexes are included. (M.C.G.)

  1. Validating Human Behavioral Models for Combat Simulations Using Techniques for the Evaluation of Human Performance

    DTIC Science & Technology

    2004-01-01

    Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.

  2. Studies on Instabilities in Long-Baseline Two-Way Satellite Time and Frequency Transfer (TWSTFT) Including a Troposphere Delay Model

    DTIC Science & Technology

    2007-11-01

    TRANSFER (TWSTFT) INCLUDING A TROPOSPHERE DELAY MODEL D. Piester, A. Bauch Physikalisch-Technische Bundesanstalt (PTB) Bundesallee 100...Abstract Two-way satellite time and frequency transfer (TWSTFT) is one of the leading techniques for remote comparisons of atomic frequency standards...nanosecond level. These achievements are due to the fact that many delay variations of the transmitted signals cancel out in TWSTFT because of the

  3. A spatial operator algebra for manipulator modeling and control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Kreutz, K.; Jain, A.

    1989-01-01

    A spatial operator algebra for modeling, control, and trajectory design of manipulators is discussed, with emphasis on its analytical formulation and implementation in the Ada programming language. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The effect of these operators is equivalent to a spatial recursion along the span of the manipulator. Inversion is obtained using techniques of recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator as well as control and trajectory design algorithms. Implementable recursive algorithms can be derived immediately from the abstract operator expressions by inspection, thus greatly simplifying the transition from an abstract problem formulation and solution to the detailed mechanization of a specific algorithm.
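
    As a rough illustration of the spatial recursion described above, the sketch below propagates spatial velocities base-to-tip through a serial chain using 6x6 shift operators, in Python rather than the Ada of the original work. The chain geometry, joint axis map, and joint rates are invented values; real implementations use full link transforms rather than pure translational shifts.

    ```python
    import numpy as np

    def skew(r):
        """3x3 cross-product matrix of a vector r."""
        x, y, z = r
        return np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])

    def phi(r):
        """6x6 rigid-body shift operator across a link offset r
        (the basic element of the spatial operator algebra)."""
        P = np.eye(6)
        P[:3, 3:] = skew(r)
        return P

    # Hypothetical 3-link serial chain: revolute joints about z, unit link offsets
    offsets = [np.array([1.0, 0.0, 0.0])] * 3
    h = np.array([0, 0, 1.0, 0, 0, 0])      # joint axis map (angular z)
    qdot = [0.5, -0.2, 0.1]                 # joint rates

    # Base-to-tip recursion: v_k = phi(r_k)^T v_{k-1} + h * qdot_k
    v = np.zeros(6)
    for r, qd in zip(offsets, qdot):
        v = phi(r).T @ v + h * qd
    print("tip spatial velocity:", v)
    ```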

  4. Measurement of latent cognitive abilities involved in concept identification learning.

    PubMed

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Nock, Matthew K; Naifeh, James A; Heeringa, Steven; Ursano, Robert J; Stein, Murray B

    2015-01-01

    We used cognitive and psychometric modeling techniques to evaluate the construct validity and measurement precision of latent cognitive abilities measured by a test of concept identification learning: the Penn Conditional Exclusion Test (PCET). Item response theory parameters were embedded within classic associative- and hypothesis-based Markov learning models and were fitted to 35,553 Army soldiers' PCET data from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Data were consistent with a hypothesis-testing model with multiple latent abilities: abstraction and set shifting. Latent abstraction ability was positively correlated with number of concepts learned, and latent set-shifting ability was negatively correlated with number of perseverative errors, supporting the construct validity of the two parameters. Abstraction was most precisely assessed for participants with abilities ranging from 1.5 standard deviations below the mean to the mean itself. Measurement of set shifting was acceptably precise only for participants making a high number of perseverative errors. The PCET precisely measures latent abstraction ability in the Army STARRS sample, especially within the range of mildly impaired to average ability. This precision pattern is ideal for a test developed to measure cognitive impairment as opposed to cognitive strength. The PCET also measures latent set-shifting ability, but reliable assessment is limited to the impaired range of ability, reflecting that perseverative errors are rare among cognitively healthy adults. Integrating cognitive and psychometric models can provide information about construct validity and measurement precision within a single analytical framework.
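
    A toy sketch of the general idea of embedding an item response theory (IRT) curve inside a Markov learning model follows. The two-parameter logistic item model and the single not-yet-learned/learned transition are textbook components; the combination rule and all parameter values here are illustrative assumptions, not the fitted PCET model.

    ```python
    import numpy as np

    def irt_2pl(theta, a, b):
        """Two-parameter logistic IRT: probability of a correct response for
        latent ability theta, discrimination a, and difficulty b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def p_correct_over_trials(theta_abstraction, learn_rate, n_trials,
                              a=1.2, b=-0.5, guess=0.25):
        """Hypothetical two-state Markov learning model whose success
        probability in the learned state is driven by an IRT curve."""
        p_learned, out = 0.0, []
        for _ in range(n_trials):
            p_learned += (1 - p_learned) * learn_rate      # Markov transition
            p_item = irt_2pl(theta_abstraction, a, b)      # ability-driven success
            out.append(guess + (1 - guess) * p_learned * p_item)
        return out

    print(np.round(p_correct_over_trials(0.8, 0.3, 5), 3))
    ```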

  5. An Abstract Data Model for the IDEF0 Graphical Analysis Language

    DTIC Science & Technology

    1990-01-11

    whatever level was necessary to ensure an unambiguous interpretation of the system requirements. Marca and McGowan have written an excellent book which...December 1987. AFIT/GE/ENG/87D-28. [7] Marca, D. A., and McGowan, C. L. SADT Structured Analysis and Design Technique. McGraw-Hill Book Company, 1988. [8

  6. How Learning Techniques Initiate Simulation of Human Mind

    ERIC Educational Resources Information Center

    Girija, C.

    2014-01-01

    The simulation of the human mind often helps in understanding an abstract concept by representing it in a realistic model and simplistic way, so that a learner develops an understanding of the key concepts. Bain (1873) and James (1890) suggested in their work that thoughts and body activity result from interactions among neurons within the brain.…

  7. Enhancing the Scientific Process with Artificial Intelligence: Forest Science Applications

    Treesearch

    Ronald E. McRoberts; Daniel L. Schmoldt; H. Michael Rauscher

    1991-01-01

    Forestry, as a science, is a process for investigating nature. It consists of repeatedly cycling through a number of steps, including identifying knowledge gaps, creating knowledge to fill them, and organizing, evaluating, and delivering this knowledge. Much of this effort is directed toward creating abstract models of natural phenomena. The cognitive techniques of AI...

  8. Analytical Modelling of the Effects of Different Gas Turbine Cooling Techniques on Engine Performance

    NASA Astrophysics Data System (ADS)

    Uysal, Selcuk Can

    In this research, MATLAB Simulink® was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of an uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters using operating conditions, polytropic efficiencies, material information, and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the applied cooling technique. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness, and maximum allowable blade temperature on the main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.)

  9. Production of Diabetic Offspring Using Cryopreserved Epididymal Sperm by In Vitro Fertilization and Intrafallopian Insemination Techniques in Transgenic Pigs

    PubMed Central

    UMEYAMA, Kazuhiro; HONDA, Kasumi; MATSUNARI, Hitomi; NAKANO, Kazuaki; HIDAKA, Tatsuro; SEKIGUCHI, Keito; MOCHIZUKI, Hironori; TAKEUCHI, Yasuhiro; FUJIWARA, Tsukasa; WATANABE, Masahito; NAGAYA, Masaki; NAGASHIMA, Hiroshi

    2013-01-01

    Abstract Somatic cell nuclear transfer (SCNT) is a useful technique for creating pig strains that model human diseases. However, production of numerous cloned disease model pigs by SCNT for large-scale experiments is impractical due to its complexity and inefficiency. In the present study, we aimed to establish an efficient procedure for proliferating the diabetes model pig carrying the mutant human hepatocyte nuclear factor-1α gene. A founder diabetes transgenic cloned pig was generated by SCNT and treated with insulin to allow for normal growth to maturity, at which point epididymal sperm could be collected for cryopreservation. In vitro fertilization and intrafallopian insemination using the cryopreserved epididymal sperm resulted in diabetes model transgenic offspring. These results suggest that artificial reproductive technology using cryopreserved epididymal sperm could be a practical option for proliferation of genetically modified disease model pigs. PMID:23979397

  10. Machine Learning-Based Classification of 38 Years of Spine-Related Literature Into 100 Research Topics.

    PubMed

    Sing, David C; Metz, Lionel N; Dudli, Stefan

    2017-06-01

    Retrospective review. To identify the top 100 spine research topics. Recent advances in "machine learning," or computers learning without explicit instructions, have yielded broad technological advances. Topic modeling algorithms can be applied to large volumes of text to discover quantifiable themes and trends. Abstracts were extracted from the National Library of Medicine PubMed database from five prominent peer-reviewed spine journals (European Spine Journal [ESJ], The Spine Journal [SpineJ], Spine, Journal of Spinal Disorders and Techniques [JSDT], Journal of Neurosurgery: Spine [JNS]). Each abstract was entered into a latent Dirichlet allocation model specified to discover 100 topics, resulting in each abstract being assigned a probability of belonging to a topic. Topics were named using the five most frequently appearing terms within that topic. The significance of increasing ("hot") or decreasing ("cold") topic popularity over time was evaluated with simple linear regression. From 1978 to 2015, 25,805 spine-related research articles were extracted and classified into 100 topics. The top two most published topics were "clinical, surgeons, guidelines, information, care" (n = 496 articles) and "pain, back, low, treatment, chronic" (424). The top two hot trends were "disc, cervical, replacement, level, arthroplasty" (+0.05%/yr, P < 0.001) and "minimally, invasive, approach, technique" (+0.05%/yr, P < 0.001). By journal, the most published topics were ESJ-"operative, surgery, postoperative, underwent, preoperative"; SpineJ-"clinical, surgeons, guidelines, information, care"; Spine-"pain, back, low, treatment, chronic"; JNS-"tumor, lesions, rare, present, diagnosis"; JSDT-"cervical, anterior, plate, fusion, ACDF." Topics discovered through latent Dirichlet allocation modeling represent unbiased, meaningful themes relevant to spine care. Topic dynamics can provide historical context and direction for future research for aspiring investigators and trainees interested in spine careers. Please explore https://singdc.shinyapps.io/spinetopics.
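
    The pipeline described above can be sketched with a standard latent Dirichlet allocation implementation. The corpus below is a hypothetical stand-in for the 25,805 abstracts, and scikit-learn's LDA is used in place of whatever implementation the authors chose; topic naming by the five most probable terms mirrors the study's convention.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical stand-in corpus; the study used 25,805 PubMed abstracts
    abstracts = [
        "anterior cervical fusion plate outcomes",
        "chronic low back pain treatment",
        "minimally invasive surgical technique outcomes",
        "lumbar disc replacement arthroplasty trial",
    ]

    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(abstracts)

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Name each topic by its most probable terms, as in the study
    terms = vec.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:5]]
        print(f"topic {k}: {', '.join(top)}")

    # Each document gets a probability of belonging to each topic
    print(lda.transform(X).round(2))
    ```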

  11. Multi-Database Searching in the Behavioral Sciences--Part I: Basic Techniques and Core Databases.

    ERIC Educational Resources Information Center

    Angier, Jennifer J.; Epstein, Barbara A.

    1980-01-01

    Outlines practical searching techniques in seven core behavioral science databases accessing psychological literature: Psychological Abstracts, Social Science Citation Index, Biosis, Medline, Excerpta Medica, Sociological Abstracts, ERIC. Use of individual files is discussed and their relative strengths/weaknesses are compared. Appended is a list…

  12. Abstracts for the International Conference on Asteroids, Comets, Meteors 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Topics addressed include: chemical abundances; asteroidal belt evolution; sources of meteors and meteorites; cometary spectroscopy; gas diffusion; mathematical models; cometary nuclei; cratering records; imaging techniques; cometary composition; asteroid classification; radio telescopes and spectroscopy; magnetic fields; cosmogony; IUE observations; orbital distribution of asteroids, comets, and meteors; solar wind effects; computerized simulation; infrared remote sensing; optical properties; and orbital evolution.

  13. Technical Reliability Studies. EOS/ESD Technology Abstracts

    DTIC Science & Technology

    1982-01-01

    RESISTANT BIPOLAR TRANSISTOR DESIGN AND ITS APPLICATIONS TO LINEAR INTEGRATED CIRCUITS 16145 MODULE ELECTROSTATIC DISCHARGE SIMULATOR 15786 SOME...T.M. 16476 STATIC DISCHARGE MODELING TECHNIQUES FOR EVALUATION OF INTEGRATED (FET) CIRCUIT DESTRUCTION 16145 MODULE ELECTROSTATIC DISCHARGE SIMULATOR...PLASTIC LSI CIRCUITS PRICE, L.A., II 16145 MODULE ELECTROSTATIC DISCHARGE SIMULATOR PRICE, R.D. 13455 EVALUATION OF PLASTIC LSI CIRCUITS PSHAENICH, A

  14. Modeling Abstraction and Simulation Techniques

    DTIC Science & Technology

    2002-12-01

    for data reduction on the patterns stored in a normal database. In [58], J. Marin et al. proposed a hybrid model to profile user behavior by the...conv(Ad) as the feasible set in the "surrogate" continuous state space. When the feasible set Ad is not a polyhedron, the set Ac = conv(Ad) may...and is not necessarily convex. Note also that the definition reduces to Ac = conv(Ad) when Ad is formed by all the discrete points in a polyhedron. Now

  15. Generalized Abstract Symbolic Summaries

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Dwyer, Matthew B.

    2009-01-01

    Current techniques for validating and verifying program changes often consider the entire program, even for small changes, leading to enormous V&V costs over a program's lifetime. This is due, in large part, to the use of syntactic program techniques, which are necessarily imprecise. Building on recent advances in symbolic execution of heap-manipulating programs, in this paper we develop techniques for performing abstract semantic differencing of program behaviors that offer the potential for improved precision.

  16. Organic Techniques for Protecting Virtual Private Network (VPN) Services from Access Link Flooding Attacks

    DTIC Science & Technology

    2002-01-01

    Submitted to ICN 2002. Organic Techniques for Protecting Virtual Private Network (VPN) Services from Access Link Flooding Attacks. Ranga S. Ramanujan ...using these techniques is also described. Contact author: Dr. Ranga S. Ramanujan, Architecture Technology Corporation, 9971 Valley View Road, Eden Prairie...

  17. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  18. On the Power of Abstract Interpretation

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.; Kamin, Samuel N.

    1991-01-01

    Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis to detect some information may mean that the time or space complexity of the generated code is altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to a typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally oriented arguments instead of the detailed operational arguments used by Sekar et al.; hence, our proofs are much simpler and should be useful for future improvements.
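
    A minimal sketch of strictness analysis over a two-point, totally ordered abstract domain, the first-order setting this record discusses. The tiny abstract operators and the example function are invented for illustration; 0 abstracts "definitely diverges" and 1 abstracts "may be defined".

    ```python
    # A function is strict in an argument if feeding 0 (divergence)
    # in that position yields 0 under the abstract semantics.

    def abs_plus(a, b):
        # '+' needs both operands: diverges if either does
        return min(a, b)

    def abs_if(c, t, e):
        # a conditional needs its test; the result may come from either branch
        return min(c, max(t, e))

    def strict_in(f, arity, i):
        """Evaluate f abstractly with 0 in position i and 1 (unknown)
        everywhere else; strictness holds if the result is 0."""
        args = [1] * arity
        args[i] = 0
        return f(*args) == 0

    # f(x, y) = if x then y + 1 else 2  -- strict in x, not in y
    f = lambda x, y: abs_if(x, abs_plus(y, 1), 1)
    print(strict_in(f, 2, 0), strict_in(f, 2, 1))   # True False
    ```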

  19. High-level user interfaces for transfer function design with semantics.

    PubMed

    Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter

    2006-01-01

    Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.

  20. PyMCT: A Very High Level Language Coupling Tool For Climate System Models

    NASA Astrophysics Data System (ADS)

    Tobis, M.; Pierrehumbert, R. T.; Steder, M.; Jacob, R. L.

    2006-12-01

    At the Climate Systems Center of the University of Chicago, we have been examining strategies for applying agile programming techniques to complex high-performance modeling experiments. While the "agile" development methodology differs from a conventional requirements process and its associated milestones, the process remains a formal one. It is distinguished by continuous improvement in functionality, large numbers of small releases, extensive and ongoing testing strategies, and a strong reliance on very high level languages (VHLL). Here we report on PyMCT, which we intend as a core element in a model ensemble control superstructure. PyMCT is a set of Python bindings for MCT, the Fortran-90 based Model Coupling Toolkit, which forms the infrastructure for the inter-component communication in the Community Climate System Model (CCSM). MCT provides a scalable model communication infrastructure. In order to take maximum advantage of agile software development methodologies, we exposed MCT functionality to Python, a prominent VHLL. We describe how the scalable architecture of MCT allows us to overcome the relatively weak runtime performance of Python, so that the performance of the combined system is not severely impacted. To demonstrate these advantages, we reimplemented the CCSM coupler in Python. While this alone offers no new functionality, it does provide a rigorous test of PyMCT functionality and performance. We reimplemented the CPL6 library, presenting an interesting case study in the comparison between conventional Fortran-90 programming and the higher abstraction level provided by a VHLL. The powerful abstractions provided by Python will allow much more complex experimental paradigms. In particular, we hope to build on the scriptability of our coupling strategy to enable systematic sensitivity tests. Our most ambitious objective is to combine our efforts with Bayesian inverse modeling techniques toward objective tuning at the highest level, across model architectures.

  1. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  2. Computational Methods for Control and Estimation of Distributed System

    DTIC Science & Technology

    1988-08-01

    prey example. [1987, August] Estimation of Nonlinearities in Parabolic Models for Growth, Predation and Dispersal of Populations. ...ON A VARIATIONAL ...techniques for infinite dimensional systems. (v) Control and stabilization of visco-elastic structures. (vi) Approximation in delay and Volterra type

  3. 5. international workshop on the identification of transcribed sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    This workshop was held November 5--8, 1995 in Les Embiez, France. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on mapping the human genome. Attention is focused on the following topics: transcriptional maps; functional analysis; techniques; model organisms; and tissue specific libraries and genes. Abstracts are included of the papers that were presented.

  4. Integrated Efforts for Analysis of Geophysical Measurements and Models.

    DTIC Science & Technology

    1997-09-26

    This contract supported investigations of integrated applications of physics, ephemerides...REGIONS AND GPS DATA VALIDATIONS 2.5 PL-SCINDA: VISUALIZATION AND ANALYSIS TECHNIQUES 2.5.1 View Controls 2.5.2 Map Selection...and IR data, about cloudy pixels. Clustering and maximum likelihood classification algorithms categorize up to four cloud layers into stratiform or

  5. Clustering More than Two Million Biomedical Publications: Comparing the Accuracies of Nine Text-Based Similarity Approaches

    PubMed Central

    Boyack, Kevin W.; Newman, David; Duhon, Russell J.; Klavans, Richard; Patek, Michael; Biberstine, Joseph R.; Schijvenaars, Bob; Skupin, André; Ma, Nianli; Börner, Katy

    2011-01-01

    Background We investigate the accuracy of different similarity approaches for clustering over two million biomedical documents. Clustering large sets of text documents is important for a variety of information needs and applications such as collection management and navigation, summary and analysis. The few comparisons of clustering results from different similarity approaches have focused on small literature sets and have given conflicting results. Our study was designed to seek a robust answer to the question of which similarity approach would generate the most coherent clusters of a biomedical literature set of over two million documents. Methodology We used a corpus of 2.15 million recent (2004-2008) records from MEDLINE, and generated nine different document-document similarity matrices from information extracted from their bibliographic records, including titles, abstracts and subject headings. The nine approaches were comprised of five different analytical techniques with two data sources. The five analytical techniques are cosine similarity using term frequency-inverse document frequency vectors (tf-idf cosine), latent semantic analysis (LSA), topic modeling, and two Poisson-based language models – BM25 and PMRA (PubMed Related Articles). The two data sources were a) MeSH subject headings, and b) words from titles and abstracts. Each similarity matrix was filtered to keep the top-n highest similarities per document and then clustered using a combination of graph layout and average-link clustering. Cluster results from the nine similarity approaches were compared using (1) within-cluster textual coherence based on the Jensen-Shannon divergence, and (2) two concentration measures based on grant-to-article linkages indexed in MEDLINE. Conclusions PubMed's own related article approach (PMRA) generated the most coherent and most concentrated cluster solution of the nine text-based similarity approaches tested, followed closely by the BM25 approach using titles and abstracts. Approaches using only MeSH subject headings were not competitive with those based on titles and abstracts. PMID:21437291
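
    One of the nine approaches above, tf-idf cosine similarity with top-n filtering per document, can be sketched in a few lines; the documents below are hypothetical stand-ins for MEDLINE titles and abstracts, and scikit-learn is assumed in place of the study's actual tooling.

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical stand-ins for MEDLINE titles + abstracts
    docs = [
        "protein kinase signalling in tumour cells",
        "kinase inhibitors as cancer therapy",
        "randomised trial of influenza vaccination",
        "vaccine efficacy against seasonal influenza",
    ]

    X = TfidfVectorizer(stop_words="english").fit_transform(docs)
    sim = cosine_similarity(X)
    np.fill_diagonal(sim, 0.0)

    # Keep only the top-n highest similarities per document before clustering,
    # as done for each similarity matrix in the study (here n = 1)
    top_n = 1
    for i, row in enumerate(sim):
        keep = row.argsort()[::-1][:top_n]
        print(f"doc {i} -> doc {keep[0]} (cosine = {row[keep[0]]:.2f})")
    ```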

  6. Development of a high resolution interstellar dust engineering model - overview of the project

    NASA Astrophysics Data System (ADS)

    Sterken, V. J.; Strub, P.; Soja, R. H.; Srama, R.; Krüger, H.; Grün, E.

    2013-09-01

    Beyond 3 AU heliocentric distance, the flow of interstellar dust through the solar system is a dominant component of the total dust population. The modulation of this flux with the solar cycle and with position in the solar system has been predicted by theoretical studies since the seventies. The modulation was proven to exist by matching dust trajectory simulations with real spacecraft data from Ulysses in 1998, and was further analyzed and studied in detail in 2012. The current ESA interplanetary meteoroid model IMEM includes an interstellar dust component, but this component was modelled only with straight-line trajectories through the solar system. For the new ESA IMEX model, a high-resolution interstellar dust component is implemented separately from a dust streams module. The dust streams module focuses on dust streams released from comets (cf. Abstract R. Soja). Parallel processing techniques are used to improve computation time (cf. Abstract P. Strub). The goal is to make predictions for the interstellar dust flux as close to the Sun as 1 AU or closer, for future space mission design.

  7. Planner-Based Control of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott

    2005-01-01

    The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This allows for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.

  8. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    PubMed

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), which acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  10. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), which acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  11. Abstracts of papers presented at the Eleventh International Laser Radar Conference

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Abstracts of 39 papers discuss measurements of properties from the Earth's ocean surface to the mesosphere, made with techniques ranging from elastic and inelastic scattering to Doppler shifts and differential absorption. Topics covered include: (1) middle atmospheric measurements; (2) meteorological parameters: temperature, density, humidity; (3) trace gases by Raman and DIAL techniques; (4) techniques and technology; (5) plume dispersion; (6) boundary layer dynamics; (7) wind measurements; (8) visibility and aerosol properties; and (9) multiple scattering, clouds, and hydrometeors.

  12. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
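
    A toy sketch of the object-oriented style this record describes: receptor objects interact within a cell object and an aggregate activation level emerges from their individual state changes. The classes, kinetic constants, and ligand level are invented for illustration, not drawn from the paper.

    ```python
    import random

    class Receptor:
        """Toy object model of a cell-surface receptor that can bind a
        ligand and later release it (e.g. a growth-factor receptor)."""
        def __init__(self):
            self.bound = False

        def step(self, ligand_level, k_on=0.1, k_off=0.05):
            if not self.bound and random.random() < k_on * ligand_level:
                self.bound = True
            elif self.bound and random.random() < k_off:
                self.bound = False

    class Cell:
        def __init__(self, n_receptors=100):
            self.receptors = [Receptor() for _ in range(n_receptors)]

        def step(self, ligand_level):
            for r in self.receptors:
                r.step(ligand_level)

        @property
        def activation(self):
            return sum(r.bound for r in self.receptors) / len(self.receptors)

    random.seed(0)
    cell = Cell()
    for t in range(50):
        cell.step(ligand_level=1.0)
    print(f"fraction of bound receptors: {cell.activation:.2f}")  # emergent level
    ```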

  13. Abstracts of Research, July 1973 through June 1974.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Computer and Information Science Research Center.

    Abstracts of research papers in the fields of computer and information science are given; 72 papers are abstracted in the areas of information storage and retrieval, information processing, linguistic analysis, artificial intelligence, mathematical techniques, systems programing, and computer networks. In addition, the Ohio State University…

  14. A heuristic for efficient data distribution management in distributed simulation

    NASA Astrophysics Data System (ADS)

    Gupta, Pankaj; Guha, Ratan K.

    2005-05-01

    In this paper, we propose an algorithm for reducing the complexity of region matching and for efficient multicasting in the data distribution management (DDM) component of the High Level Architecture (HLA) Run Time Infrastructure (RTI). Current DDM techniques rely on computing the intersection between subscription and update regions. When a subscription region and an update region of different federates overlap, the RTI establishes communication between the publisher and the subscriber, and subsequently routes updates from the publisher to the subscriber. The proposed algorithm computes the update/subscription region matching for dynamic allocation of multicast groups. It provides new multicast routines that exploit the connectivity of the federation by communicating updates about interactions and routing information only to those federates that require them. The region-matching problem in DDM reduces to the clique-covering problem under a connection-graph abstraction in which federates are the vertices and update/subscribe relations are the edges. We develop an abstract model based on the connection graph for data distribution management. Using this abstract model, we propose a heuristic for solving the region-matching problem of DDM and provide a complexity analysis of the proposed heuristic.
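
    A minimal sketch of the update/subscription region matching that underlies such a heuristic. The region extents and federate names are hypothetical, and the clique-covering step is reduced here to grouping updates by identical subscriber sets, one multicast group per set.

    ```python
    from itertools import count

    def overlaps(a, b):
        """Axis-aligned extents overlap on every dimension.
        Regions are dicts of dimension -> (low, high)."""
        return all(a[d][0] <= b[d][1] and b[d][0] <= a[d][1] for d in a)

    # Hypothetical update (publisher) and subscription regions in a 2-D routing space
    updates = {"U1": {"x": (0, 10), "y": (0, 10)},
               "U2": {"x": (20, 30), "y": (0, 5)}}
    subs = {"S1": {"x": (5, 15), "y": (5, 15)},
            "S2": {"x": (25, 40), "y": (0, 10)},
            "S3": {"x": (8, 9), "y": (0, 3)}}

    # One multicast group per distinct subscriber set (a clique in the
    # connection graph); updates sharing a set reuse the same route
    groups, gid = {}, count(1)
    for u, ur in updates.items():
        members = frozenset(s for s, sr in subs.items() if overlaps(ur, sr))
        groups.setdefault(members, f"G{next(gid)}")
        print(u, "->", groups[members], sorted(members))
    ```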

  15. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.
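
    The state-abstraction half of this combination can be sketched concretely: states that agree under an abstraction function share a single value during value iteration, so no state is evaluated individually. The toy factored MDP below is invented for illustration; the symbolic (decision-diagram) machinery and the heuristic forward search of the actual algorithm are not reproduced.

    ```python
    import numpy as np

    # Toy factored MDP: state = (position 0..3, battery 0..1); the abstraction
    # ignores the battery bit, so states sharing a position are evaluated once.
    positions, batteries = 4, 2

    def abstract(s):                 # state abstraction function
        pos, _ = s
        return pos

    states = [(p, b) for p in range(positions) for b in range(batteries)]
    A = sorted({abstract(s) for s in states})   # 8 concrete -> 4 abstract states

    # Abstract dynamics (assumed): move right with prob 0.9, stay with 0.1;
    # reward 1 on reaching the absorbing goal position 3
    gamma, V = 0.95, np.zeros(len(A))
    for _ in range(100):
        V_new = np.zeros_like(V)
        for a in A:
            if a == positions - 1:
                V_new[a] = 0.0                  # absorbing goal
                continue
            move = 1.0 * (a + 1 == positions - 1) + gamma * V[a + 1]
            V_new[a] = 0.9 * move + 0.1 * gamma * V[a]
        V = V_new
    print("abstract values:", V.round(3))       # one value per block of states
    ```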

  16. Shallow Water Acoustics Workshop, 1983.

    DTIC Science & Technology

    1983-02-01

    WAGSTAFF, Ronald McKISIC, Mike 800 N. Quincy Street NVRDA Office of Naval Research Arlington, VA 22217 NSTL Station, MS 39529 Code 4250A Arlington, VA...Groups were: 1. Environmental Acoustics and Modeling (R. Wagstaff, NORDA, Chairman) 2. Measurements and Survey Techniques (G. Lewis, NAVOCEANO...NORDA's present program (S. Stanic) and its additional proposed work (W. Kuperman and R. Wagstaff) were delivered as invited papers. The abstracts of all

  17. RTO Technical Publications: A Quarterly Listing

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The titles of five reports are listed here, together with an abstract for each. The titles include: 1) 'Spectral Models of Kuiper Belt Objects and Centaurs'; 2) 'Simulation of and for Military Decision Making'; 3) 'Abundance of the Radioactive Be-10 in the Cosmic Radiation up to 2 GeV/nucleon with the Balloon-borne Instrument ISOMAX1998'; 4) 'Optical Air Flow Measurements in Flight'; 5) 'Flight Test Measurement Techniques for Laminar Flow'.

  18. Low Frequency Acoustic Intensity Propagation Modeling in Shallow Water Waveguides

    DTIC Science & Technology

    2016-06-01

    The views expressed in this thesis are those of the author and do not reflect the official policy or position of...release; distribution is unlimited... Three popular numerical techniques are employed to...planar interfacial two-fluid transmission and reflection are used to benchmark the commercial software package COMSOL. Canonical Pekeris-type

  19. Image Processing Techniques for Assessment of Dental Trays

    DTIC Science & Technology

    2001-10-25

    170 patients having Angle Class I molar relationships with minor malocclusions and teeth including second molars fully erupted without loss of tooth...Abstract: A tray selected for the dental patient must adapt to the curvature of the teeth and allow the impression material to be in appropriate...brands of perforated metal trays with 170 lower arch cast models collected from patients having Angle Class I type occlusion with minor malocclusions

  20. Advanced Diagnostics and Instrumentation for Chemically Reactive Flow Systems.

    DTIC Science & Technology

    1981-09-01

    graphic images from our model programs on the color display unit. We have written software for axial tomography image reconstruction that will be...technique for such applications. It can be shown that by making measurements, as described above, simultaneously at two wavelengths, one can derive a...

  1. Modelling Farm Animal Welfare

    PubMed Central

    Collins, Lisa M.; Part, Chérie E.

    2013-01-01

    Simple Summary In this review paper we discuss the different modeling techniques that have been used in animal welfare research to date. We look at what questions they have been used to answer, the advantages and pitfalls of the methods, and how future research can best use these approaches to answer some of the most important upcoming questions in farm animal welfare. Abstract The use of models in the life sciences has greatly expanded in scope and advanced in technique in recent decades. However, the range, type and complexity of models used in farm animal welfare is comparatively poor, despite the great scope for use of modeling in this field of research. In this paper, we review the different modeling approaches used in farm animal welfare science to date, discussing the types of questions they have been used to answer, the merits and problems associated with the method, and possible future applications of each technique. We find that the most frequently published types of model used in farm animal welfare are conceptual and assessment models; two types of model that are frequently (though not exclusively) based on expert opinion. Simulation, optimization, scenario, and systems modeling approaches are rarer in animal welfare, despite being commonly used in other related fields. Finally, common issues such as a lack of quantitative data to parameterize models, and model selection and validation are discussed throughout the review, with possible solutions and alternative approaches suggested. PMID:26487411

  2. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
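
    One of the simpler techniques surveyed, fitting a power-law learning curve with an asymptote to an operator's case series, takes only a few lines with scipy. The single-surgeon series below is synthetic; the functional form and starting values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical single-surgeon series: operation time (min) by case number
    case = np.arange(1, 191)
    rng = np.random.default_rng(3)
    time = 60 + 90 * case**-0.4 + rng.normal(0, 8, case.size)

    # Classic power-law learning curve with an asymptote: t(n) = c + a * n^-b
    def power_curve(n, a, b, c):
        return c + a * n**(-b)

    (a, b, c), _ = curve_fit(power_curve, case, time, p0=(80, 0.5, 50))
    print(f"asymptotic time ~ {c:.0f} min, learning exponent b = {b:.2f}")
    ```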

  3. Innovation Abstracts; Volume XIV, 1992.

    ERIC Educational Resources Information Center

    Roueche, Suanne D., Ed.

    1992-01-01

    This series of 30 one- to two-page abstracts covering 1992 highlights a variety of innovative approaches to teaching and learning in the community college. Topics covered in the abstracts include: (1) faculty recognition and orientation; (2) the Amado M. Pena, Jr., Scholarship Program; (3) innovative teaching techniques, with individual abstracts…

  4. Innovation Abstracts, Volume XV, 1993.

    ERIC Educational Resources Information Center

    Roueche, Suanne D., Ed.

    1993-01-01

    This volume of 30 one- to two-page abstracts from 1993 highlights a variety of innovative approaches to teaching and learning in the community college. Topics covered in the abstracts include: (1) role-playing to encourage critical thinking; (2) team learning techniques to cultivate business skills; (3) librarian-instructor partnerships to create…

  5. TES: A Text Extraction System.

    ERIC Educational Resources Information Center

    Goh, A.; Hui, S. C.

    1996-01-01

    Describes how TES, a text extraction system, is able to electronically retrieve a set of sentences from a document to form an indicative abstract. Discusses various text abstraction techniques and related work in the area, provides an overview of the TES system, and compares system results against manually produced abstracts. (LAM)

  6. Defeating Adversary Network Intelligence Efforts with Active Cyber Defense Techniques

    DTIC Science & Technology

    2008-06-01

    Hide Things from Hackers: Processes, Principles, and Techniques," Journal of Information Warfare, 5(3): 26-40 (2006). 20. Rosenau, William ...54 Additional Sources Apel, Thomas. Generating Fingerprints of Network Servers and their Use in Honeypots. Thesis. Aachen University, Aachen...Paul Williams, PhD (ENG)

  7. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an o-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in the these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete-they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. 
    In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries could, we believe, be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: (1) a regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure; (2) a proof that our approach is sound and complete with respect to the depth bound of symbolic execution; (3) an implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]; and (4) an empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
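    The core step, checking two impact summaries for logical equivalence with an off-the-shelf decision procedure, can be illustrated with a minimal sketch. The summaries below and the use of Z3's Python bindings are our own illustrative assumptions, not the authors' implementation:

    ```python
    # Minimal sketch: checking logical equivalence of two impact summaries
    # with an off-the-shelf SMT solver (Z3's Python bindings). The summaries
    # here are hypothetical input/output relations over the impacted inputs.
    from z3 import Ints, If, Solver, Not, unsat

    x, y = Ints("x y")

    # Impact summary of version 1: behavior of the changed code (hypothetical).
    out_v1 = If(x > 0, x + y, y)

    # Impact summary of version 2: a refactored, intended-equivalent form.
    out_v2 = If(x <= 0, y, y + x)

    s = Solver()
    # The versions are equivalent iff no input makes the summaries differ.
    s.add(Not(out_v1 == out_v2))
    print("equivalent" if s.check() == unsat else "not equivalent")
    ```

    An unsat answer means no input can distinguish the two impacted behaviors, which is exactly the equivalence claim scoped to the depth bound of symbolic execution.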

  8. Environmental information acquisition and maintenance techniques: reference guide. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riggins, R.E.; Young, V.T.; Goran, W.D.

    1980-08-01

    This report provides a guide to techniques for collecting, using and maintaining data about each of the 13 environmental technical specialties in the Environmental Impact Computer System (EICS). The technical specialties are: (1) ecology, (2) environmental health, (3) air, (4) surface water, (5) ground water, (6) sociology, (7) economics, (8) earth science, (9) land use, (10) noise, (11) transportation, (12) aesthetics, and (13) energy and resource conservation. Acquisition techniques are classified by the following general categories: (1) secondary data, (2) remote sensing, (3) mathematical modeling, (4) field work, (5) mapping/maps and (6) expert opinion. A matrix identifies the most appropriate techniques for collecting information on the EICS technical specialties. After selecting a method, the user may read an abstract of the report explaining that technique, and may also wish to obtain the original document for detailed information about applying the technique. Finally, this report offers guidelines on storing environmental information for future use, and on presenting that information effectively in environmental documents.

  9. 3D Representation of the 19th Century Balkan Architecture Using Scaled Museum-Maquette and Photogrammetry Methods

    NASA Astrophysics Data System (ADS)

    Georgiou, E.; Karachaliou, E.; Stylianidis, E.

    2017-08-01

    A characteristic example of 19th-century Balkan architecture is the "Tower house", which is found in the region of Epirus and Western Macedonia, Greece. Nowadays, the only information about these heritage buildings can be drawn from the architectural designs on hand and from the Tower model displayed as a maquette in the Folklore Museum of the Municipality of Kozani, Greece. The current work generates a scaled 3D digital model of the "Tower house" by applying photogrammetry techniques to the maquette displayed among the Museum's exhibits.

  10. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Astrophysics Data System (ADS)

    Moradi, I.; Prive, N.; McCarty, W.; Errico, R. M.; Gelaro, R.

    2017-12-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.

  11. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Technical Reports Server (NTRS)

    Moradi, Isaac; Prive, Nikki; McCarty, Will; Errico, Ronald M.; Gelaro, Ron

    2017-01-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.
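    A rough sketch of the error-perturbation step mentioned above is given below; the uncorrelated Gaussian error model, the observation values, and the noise level are simplifying assumptions of ours, not GMAO's actual error modeling:

    ```python
    # Illustrative sketch: perturbing simulated ("nature run") observations
    # with random errors so they behave like real instrument measurements.
    # The uncorrelated Gaussian error model is a simplifying assumption.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    simulated_obs = np.array([287.1, 288.4, 286.9, 290.2])  # e.g., brightness temps (K)
    obs_error_std = 0.5                                     # assumed instrument noise (K)

    perturbed_obs = simulated_obs + rng.normal(0.0, obs_error_std, simulated_obs.shape)
    print(perturbed_obs)  # these would then be fed to the data assimilation system
    ```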

  12. Scalable and Accurate SMT-based Model Checking of Data Flow Systems

    DTIC Science & Technology

    2013-10-30

    guided by the semantics of the description language. In this project we developed instead a complementary and novel approach based on a somewhat brute... believe that our approach could help considerably in expanding the reach of abstract interpretation techniques to a variety of target languages, as... project. We worked on developing a framework for compositional verification that capitalizes on the fact that data-flow languages, such as Lustre, have

  13. Using Supervised Learning Techniques for Diagnosis of Dynamic Systems

    DTIC Science & Technology

    2002-05-04

    M. Gasca, Juan A. Ortega. Abstract: This paper describes an approach based on supervised... diagnose system faults are needed to maintain the systems... labelled data will be used for this purpose [5][6]... treated to add additional information about the running of the system. In [7] the fundaments of the based... [8] proposes a classification tool for the set of labelled and treated data. This is a consistency-based approach with qualitative models... way, any

  14. A Survey of Terrain Modeling Technologies and Techniques

    DTIC Science & Technology

    2007-09-01

    Washington, DC 20314-1000. ERDC/TEC TR-08-2. Abstract: Test planning, rehearsal, and distributed test events for Future Combat Systems (FCS) require... [Figure-caption residue: distribution of errors (distance) for all five lines of control points, including line No. 729; blue circles are DSM errors (original data), red squares are DTM errors (bare earth, processed by Intermap).]

  15. Cognitive Bridging: Using Strategic Communication To Connect Abstract Goals With The Means To Achieve Them.

    PubMed

    Katz, Sherri Jean; Byrne, Sahara

    2018-01-29

    Three studies test several mechanisms of cognitive bridging, or how a strategic communication message functions to connect the abstract goal of an individual with the specific means to achieve that goal. Across all of the experiments (n = 276, n = 209, n = 145), participants who received an induced bridging mechanism were more likely to produce cognitive bridging outputs and to report more abstract responses than participants who did not receive a bridging technique. We do not find the same pattern of results among participants who received an integrated bridging technique. Taken together, these studies provide evidence that how abstractly or concretely an individual is thinking can be influenced by abstraction cues planted within a strategic message, providing promise for messaging efforts at the moment of decision. In other words, the level of abstract thinking an individual carries into an exposure situation can be changed using cues within the message itself. This is the first article to juxtapose the induced and integrated mechanisms of cognitive bridging.

  16. Remote sensing of natural resources

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Quarterly literature review compiles citations and abstracts from eight major abstracting and indexing services. Each issue contains author/keyword index. Includes data obtained or techniques used from space, aircraft, or ground-based stations.

  17. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  18. Geophysical abstracts 167, October-December 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  19. Geophysical abstracts 164, January-March 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. A new table of contents, alphabetically arranged, has been adapted to show more clearly the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  20. Geophysical abstracts 166, July-September 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  1. Geophysical abstracts 165, April-June 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  2. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review

    PubMed Central

    Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat

    2017-01-01

    Abstract: Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their root causes. The Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study is a review of risk analysis using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers which did not meet the inclusion criteria, 22 papers were finally selected and their full texts were thoroughly examined. At the end, the results were obtained. Results: The examined papers focused mostly on processes, had been conducted mainly in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to describe almost all the steps of model implementation; after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique’s effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The study revealed that a small number of studies failed to show effects of the FMEA technique. In general, however, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688

  3. Linking Somatic and Symbolic Representation in Semantic Memory: The Dynamic Multilevel Reactivation Framework

    PubMed Central

    Reilly, Jamie; Peelle, Jonathan E; Garcia, Amanda; Crutch, Sebastian J

    2016-01-01

    Biological plausibility is an essential constraint for any viable model of semantic memory. Yet, we have only the most rudimentary understanding of how the human brain conducts abstract symbolic transformations that underlie word and object meaning. Neuroscience has evolved a sophisticated arsenal of techniques for elucidating the architecture of conceptual representation. Nevertheless, theoretical convergence remains elusive. Here we describe several contrastive approaches to the organization of semantic knowledge, and in turn we offer our own perspective on two recurring questions in semantic memory research: 1) to what extent are conceptual representations mediated by sensorimotor knowledge (i.e., to what degree is semantic memory embodied)? 2) How might an embodied semantic system represent abstract concepts such as modularity, symbol, or proposition? To address these questions, we review the merits of sensorimotor (i.e., embodied) and amodal (i.e., disembodied) semantic theories and address the neurobiological constraints underlying each. We conclude that the shortcomings of both perspectives in their extreme forms necessitate a hybrid middle ground. We accordingly propose the Dynamic Multilevel Reactivation Framework, an integrative model premised upon flexible interplay between sensorimotor and amodal symbolic representations mediated by multiple cortical hubs. We discuss applications of the Dynamic Multilevel Reactivation Framework to abstract and concrete concept representation and describe how a multidimensional conceptual topography based on emotion, sensation, and magnitude can successfully frame a semantic space containing meanings for both abstract and concrete words. The consideration of ‘abstract conceptual features’ does not diminish the role of logical and/or executive processing in activating, manipulating and using information stored in conceptual representations. Rather, it proposes that the material on which these processes operate necessarily combine pure sensorimotor information and higher-order cognitive dimensions involved in symbolic representation. PMID:27294419

  4. Linking somatic and symbolic representation in semantic memory: the dynamic multilevel reactivation framework.

    PubMed

    Reilly, Jamie; Peelle, Jonathan E; Garcia, Amanda; Crutch, Sebastian J

    2016-08-01

    Biological plausibility is an essential constraint for any viable model of semantic memory. Yet, we have only the most rudimentary understanding of how the human brain conducts abstract symbolic transformations that underlie word and object meaning. Neuroscience has evolved a sophisticated arsenal of techniques for elucidating the architecture of conceptual representation. Nevertheless, theoretical convergence remains elusive. Here we describe several contrastive approaches to the organization of semantic knowledge, and in turn we offer our own perspective on two recurring questions in semantic memory research: (1) to what extent are conceptual representations mediated by sensorimotor knowledge (i.e., to what degree is semantic memory embodied)? (2) How might an embodied semantic system represent abstract concepts such as modularity, symbol, or proposition? To address these questions, we review the merits of sensorimotor (i.e., embodied) and amodal (i.e., disembodied) semantic theories and address the neurobiological constraints underlying each. We conclude that the shortcomings of both perspectives in their extreme forms necessitate a hybrid middle ground. We accordingly propose the Dynamic Multilevel Reactivation Framework-an integrative model predicated upon flexible interplay between sensorimotor and amodal symbolic representations mediated by multiple cortical hubs. We discuss applications of the dynamic multilevel reactivation framework to abstract and concrete concept representation and describe how a multidimensional conceptual topography based on emotion, sensation, and magnitude can successfully frame a semantic space containing meanings for both abstract and concrete words. The consideration of 'abstract conceptual features' does not diminish the role of logical and/or executive processing in activating, manipulating and using information stored in conceptual representations. Rather, it proposes that the materials upon which these processes operate necessarily combine pure sensorimotor information and higher-order cognitive dimensions involved in symbolic representation.

  5. A spatial operator algebra for manipulator modeling and control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Kreutz, Kenneth; Jain, Abhinandan

    1989-01-01

    A recently developed spatial operator algebra, useful for modeling, control, and trajectory design of manipulators, is discussed. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The effect of these operators is equivalent to a spatial recursion along the span of a manipulator. Inversion of operators can be efficiently obtained via techniques of recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator and for control and trajectory design algorithms. The interpretation of expressions within the algebraic framework leads to enhanced conceptual and physical understanding of manipulator dynamics and kinematics. Furthermore, implementable recursive algorithms can be derived immediately from the abstract operator expressions by inspection. Thus, the transition from an abstract problem formulation and solution to the detailed mechanization of specific algorithms is greatly simplified. The analytical formulation of the operator algebra, as well as its implementation in the Ada programming language, are discussed.
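    The equivalence between the operator form and a base-to-tip recursion can be illustrated with a deliberately small sketch; the planar three-joint chain, the lower-triangular operator, and all numbers are our own assumptions, reduced to angular velocities only:

    ```python
    # Minimal sketch of a spatial-operator-style recursion: a block
    # lower-triangular operator "phi" maps joint rates to link angular
    # velocities for a planar serial chain, and the same result is
    # obtained by a link-to-link recursion. All values are illustrative.
    import numpy as np

    n = 3                              # three revolute joints (assumed)
    qdot = np.array([0.2, -0.1, 0.4])  # joint rates (rad/s)

    # Operator form: omega = phi @ qdot, with identity joint maps assumed.
    phi = np.tril(np.ones((n, n)))     # phi[i, j] = 1 propagates joint j to link i
    omega_operator = phi @ qdot

    # Equivalent spatial recursion along the span of the manipulator:
    omega_recursive = np.zeros(n)
    w = 0.0
    for k in range(n):
        w = w + qdot[k]                # velocity propagates base-to-tip
        omega_recursive[k] = w

    assert np.allclose(omega_operator, omega_recursive)
    print(omega_operator)
    ```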

  6. On Design Mining: Coevolution and Surrogate Models.

    PubMed

    Preen, Richard J; Bull, Larry

    2017-01-01

    Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.
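    The surrogate-modeling step, fitting a cheap model to a few physically evaluated designs and using it to screen candidates, might be sketched as follows; the one-dimensional design space, the quadratic surrogate, and the toy objective standing in for a rapid-prototyping evaluation are all our assumptions:

    ```python
    # Toy sketch of surrogate-assisted design search: fit a cheap model to
    # a few "expensively evaluated" designs, then rank new candidates.
    import numpy as np

    rng = np.random.default_rng(0)

    def evaluate_design(x):
        # Stand-in for a rapid-prototyping evaluation (assumed objective).
        return -(x - 0.6) ** 2 + 0.05 * rng.normal()

    sampled = rng.uniform(0, 1, 8)                 # designs built and tested
    scores = np.array([evaluate_design(x) for x in sampled])

    coeffs = np.polyfit(sampled, scores, 2)        # quadratic surrogate
    surrogate = np.poly1d(coeffs)

    candidates = np.linspace(0, 1, 101)            # untested designs
    best = candidates[np.argmax(surrogate(candidates))]
    print(f"surrogate suggests building design x = {best:.2f} next")
    ```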

  7. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  8. Top-level modeling of an als system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
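    The modular object-oriented decomposition described above can be sketched roughly as follows; the paper's model is coded in Java, but the same structure is shown here in Python, and the class names, method signature, and flow rates are our assumptions:

    ```python
    # Rough sketch of a modular, object-oriented top-level ALS model:
    # each subsystem is a swappable module behind a common interface.
    from abc import ABC, abstractmethod

    class Subsystem(ABC):
        @abstractmethod
        def step(self, dt_hours: float) -> dict:
            """Advance the subsystem one time step; return resource flows."""

    class Crew(Subsystem):
        def __init__(self, size: int):
            self.size = size
        def step(self, dt_hours):
            # Assumed per-person rates, for illustration only.
            return {"o2_kg": -0.035 * self.size * dt_hours,
                    "co2_kg": 0.042 * self.size * dt_hours}

    class BiomassProduction(Subsystem):
        def step(self, dt_hours):
            return {"o2_kg": 0.030 * dt_hours, "co2_kg": -0.025 * dt_hours}

    # The top-level model just aggregates subsystem flows each step,
    # which is what makes individual modules easy to refine or replace.
    subsystems = [Crew(size=4), BiomassProduction()]
    totals = {"o2_kg": 0.0, "co2_kg": 0.0}
    for sub in subsystems:
        for resource, amount in sub.step(dt_hours=1.0).items():
            totals[resource] += amount
    print(totals)
    ```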

  9. Hardware description languages

    NASA Technical Reports Server (NTRS)

    Tucker, Jerry H.

    1994-01-01

    Hardware description languages are special-purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described over a wide range of abstraction levels, and they support top-down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application-specific integrated circuits (ASICs). However, VHDL is rapidly gaining in popularity.

  10. Annual Quality Assurance Conference Abstracts by Barbara Marshik

    EPA Pesticide Factsheets

    25th Annual Quality Assurance Conference. Abstracts: Material and Process Conditions for Successful Use of Extractive Sampling Techniques and Certification Methods; Errors in the Analysis of NMHC and VOCs in CNG-Based Engine Emissions, by Barbara Marshik

  11. Assessment of Metacognitive Knowledge among Science Students, a Case Study of Two Bilingual and Two NNS Students

    ERIC Educational Resources Information Center

    Ali, Gadacha

    2007-01-01

    This investigation aims to assess awareness of genre and writing skills among science students via an abstract writing task, with recall and follow-up protocols to monitor the students, and to characterize the relationship between the abstract and the base article. Abstract writing involves specific data selection techniques of activities involved…

  12. Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1998-01-01

    This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.

  13. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges

    PubMed Central

    Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.

    2017-01-01

    Abstract Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider the problem of predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. PMID:27436868
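    As a hedged sketch of the workflow the review describes (handling missing laboratory values, fitting a flexible learner, inspecting variable importance), the snippet below uses scikit-learn on synthetic data; the 13-marker setup and all numbers are illustrative assumptions, not the authors' cohort:

    ```python
    # Illustrative sketch: machine-learning risk prediction with missing
    # laboratory data, using synthetic stand-ins for 13 lab markers.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 13))                 # 13 "lab markers" (synthetic)
    X[rng.random(X.shape) < 0.1] = np.nan          # ~10% missing values
    y = (np.nan_to_num(X[:, 0]) > 0.5).astype(int) # toy mortality label

    model = make_pipeline(
        SimpleImputer(strategy="median"),          # one simple missing-data fix
        RandomForestClassifier(n_estimators=200, random_state=0),
    )
    model.fit(X, y)

    # Variable importance: which markers drive the predictions?
    importances = model.named_steps["randomforestclassifier"].feature_importances_
    print(importances.round(3))
    ```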

  14. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
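    To give a flavor of such induction-based proofs of bounded actuation agreement (here with k = 1 rather than general k-induction, and with Z3 standing in for the SAL tool chain), the sketch below asks the solver whether two redundant channels that rate-limit toward a shared command under bounded measurement error can ever leave an agreement bound; the toy channel model and all constants are our assumptions, not the case-study architectures:

    ```python
    # Sketch of a 1-induction proof of a bounded actuation-agreement
    # property. Toy model: each channel rate-limits toward a shared
    # command c plus its own bounded measurement error d_i.
    from z3 import Reals, If, Solver, And, Not, unsat

    def clamp(v, lo, hi):
        return If(v < lo, lo, If(v > hi, hi, v))

    x1, x2, c, d1, d2 = Reals("x1 x2 c d1 d2")
    RATE, ERR, BOUND = 1.0, 0.1, 0.2       # assumed constants; BOUND = 2*ERR

    def step(x, d):
        return x + clamp(c + d - x, -RATE, RATE)

    s = Solver()
    s.add(And(d1 >= -ERR, d1 <= ERR, d2 >= -ERR, d2 <= ERR))
    # Inductive step: agreement now implies agreement after one step.
    s.add(And(x1 - x2 <= BOUND, x2 - x1 <= BOUND))
    x1n, x2n = step(x1, d1), step(x2, d2)
    s.add(Not(And(x1n - x2n <= BOUND, x2n - x1n <= BOUND)))
    print("invariant holds" if s.check() == unsat else "counterexample found")
    ```

    The base case (both channels starting at the same output) is immediate; the solver's unsat answer discharges the inductive step.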

  15. Illustrative visualization of 3D city models

    NASA Astrophysics Data System (ADS)

    Doellner, Juergen; Buchholz, Henrik; Nienhaus, Marc; Kirsch, Florian

    2005-03-01

    This paper presents an illustrative visualization technique that provides expressive representations of large-scale 3D city models, inspired by the tradition of artistic and cartographic visualizations typically found in bird"s-eye view and panoramic maps. We define a collection of city model components and a real-time multi-pass rendering algorithm that achieves comprehensible, abstract 3D city model depictions based on edge enhancement, color-based and shadow-based depth cues, and procedural facade texturing. Illustrative visualization provides an effective visual interface to urban spatial information and associated thematic information complementing visual interfaces based on the Virtual Reality paradigm, offering a huge potential for graphics design. Primary application areas include city and landscape planning, cartoon worlds in computer games, and tourist information systems.

  16. Current Progress of Genetically Engineered Pig Models for Biomedical Research

    PubMed Central

    Gün, Gökhan

    2014-01-01

    Abstract The first transgenic pigs were generated for agricultural purposes about three decades ago. Since then, the micromanipulation techniques of pig oocytes and embryos expanded from pronuclear injection of foreign DNA to somatic cell nuclear transfer, intracytoplasmic sperm injection-mediated gene transfer, lentiviral transduction, and cytoplasmic injection. Mechanistically, the passive transgenesis approach based on random integration of foreign DNA was developed to active genetic engineering techniques based on the transient activity of ectopic enzymes, such as transposases, recombinases, and programmable nucleases. Whole-genome sequencing and annotation of advanced genome maps of the pig complemented these developments. The full implementation of these tools promises to immensely increase the efficiency and, in parallel, to reduce the costs for the generation of genetically engineered pigs. Today, the major application of genetically engineered pigs is found in the field of biomedical disease modeling. It is anticipated that genetically engineered pigs will increasingly be used in biomedical research, since this model shows several similarities to humans with regard to physiology, metabolism, genome organization, pathology, and aging. PMID:25469311

  17. Approximation, abstraction and decomposition in search and optimization

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1992-01-01

    In this paper, I discuss four different areas of my research. One portion of my research has focused on automatic synthesis of search control heuristics for constraint satisfaction problems (CSPs). I have developed techniques for automatically synthesizing two types of heuristics for CSPs; filtering functions, for example, are used to remove portions of a search space from consideration. Another portion of my research is focused on automatic synthesis of hierarchic algorithms for solving constraint satisfaction problems (CSPs). I have developed a technique for constructing hierarchic problem solvers based on numeric interval algebra. Another portion of my research is focused on automatic decomposition of design optimization problems. We are using the design of racing yacht hulls as a testbed domain for this research. Decomposition is especially important in the design of complex physical shapes such as yacht hulls. Another portion of my research is focused on intelligent model selection in design optimization. The model selection problem results from the difficulty of using exact models to analyze the performance of candidate designs.
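    A minimal sketch of the filtering-function idea, pruning domain values that cannot participate in any solution, is given below; the binary constraint and toy domains are our assumptions:

    ```python
    # Toy sketch of a CSP filtering function: prune domain values that
    # have no support under a binary constraint, shrinking the search space.
    def filter_domains(dom_x, dom_y, constraint):
        dom_x = {vx for vx in dom_x if any(constraint(vx, vy) for vy in dom_y)}
        dom_y = {vy for vy in dom_y if any(constraint(vx, vy) for vx in dom_x)}
        return dom_x, dom_y

    # Example constraint: x + y == 6 (assumed for illustration).
    dx, dy = {1, 2, 3, 4, 5}, {1, 2, 9}
    dx, dy = filter_domains(dx, dy, lambda x, y: x + y == 6)
    print(dx, dy)   # {4, 5} {1, 2}: unsupported values are filtered out
    ```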

  18. How learning to abstract shapes neural sound representations

    PubMed Central

    Ley, Anke; Vroomen, Jean; Formisano, Elia

    2014-01-01

    The transformation of acoustic signals into abstract perceptual representations is the essence of the efficient and goal-directed neural processing of sounds in complex natural environments. While the human and animal auditory system is perfectly equipped to process the spectrotemporal sound features, adequate sound identification and categorization require neural sound representations that are invariant to irrelevant stimulus parameters. Crucially, what is relevant and irrelevant is not necessarily intrinsic to the physical stimulus structure but needs to be learned over time, often through integration of information from other senses. This review discusses the main principles underlying categorical sound perception with a special focus on the role of learning and neural plasticity. We examine the role of different neural structures along the auditory processing pathway in the formation of abstract sound representations with respect to hierarchical as well as dynamic and distributed processing models. Whereas most fMRI studies on categorical sound processing employed speech sounds, the emphasis of the current review lies on the contribution of empirical studies using natural or artificial sounds that enable separating acoustic and perceptual processing levels and avoid interference with existing category representations. Finally, we discuss the opportunities of modern analysis techniques such as multivariate pattern analysis (MVPA) in studying categorical sound representations. With their increased sensitivity to distributed activation changes, even in the absence of changes in overall signal level, these analysis techniques provide a promising tool to reveal the neural underpinnings of perceptually invariant sound representations. PMID:24917783

  19. System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann

    2003-01-01

    A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System), do not attempt to isolate the cause of a failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code; the quantitative model requires less extensive processing code. Examples of fault diagnoses are given.

  20. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. Individual and combinational effects arising from the application of technologies within a framework are presently far too complex to evaluate quantitatively at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help in addressing JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  1. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
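    Bandera's abstractions follow classical abstract interpretation, so the underlying idea can be sketched generically (this is not Bandera's actual declaration mechanism): replace a concrete data domain with a small abstract one, together with sound abstract versions of the operations, to shrink the state space a model checker must explore:

    ```python
    # Generic sketch of data abstraction in the abstract-interpretation
    # style: integers are collapsed to the domain {NEG, ZERO, POS, TOP}.
    NEG, ZERO, POS, TOP = "NEG", "ZERO", "POS", "TOP"

    def alpha(n):                      # abstraction function: int -> sign
        return ZERO if n == 0 else (POS if n > 0 else NEG)

    def abs_add(a, b):                 # abstract version of '+'
        if ZERO in (a, b):
            return b if a == ZERO else a
        if a == b and a in (POS, NEG):
            return a                   # pos+pos=pos, neg+neg=neg
        return TOP                     # sign of pos+neg is unknown

    # Soundness spot-check: alpha(x + y) is covered by abs_add(alpha(x), alpha(y)).
    for x, y in [(3, 4), (-2, -5), (3, -3), (0, 7)]:
        a = abs_add(alpha(x), alpha(y))
        assert a == TOP or a == alpha(x + y)
    print("abstract addition is a sound over-approximation on these samples")
    ```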

  2. Abstraction and model evaluation in category learning.

    PubMed

    Vanpaemel, Wolf; Storms, Gert

    2010-05-01

    Thirty previously published data sets, from seminal category learning tasks, are reanalyzed using the varying abstraction model (VAM). Unlike a prototype-versus-exemplar analysis, which focuses on extreme levels of abstraction only, a VAM analysis also considers the possibility of partial abstraction. Whereas most data sets support no abstraction when only the extreme possibilities are considered, we show that evidence for abstraction can be provided using the broader view on abstraction provided by the VAM. The present results generalize earlier demonstrations of partial abstraction (Vanpaemel & Storms, 2008), in which only a small number of data sets was analyzed. Following the dominant modus operandi in category learning research, Vanpaemel and Storms evaluated the models on their best fit, a practice known to ignore the complexity of the models under consideration. In the present study, in contrast, model evaluation not only relies on the maximal likelihood, but also on the marginal likelihood, which is sensitive to model complexity. Finally, using a large recovery study, it is demonstrated that, across the 30 data sets, complexity differences between the models in the VAM family are small. This indicates that a (computationally challenging) complexity-sensitive model evaluation method is uncalled for, and that the use of a (computationally straightforward) complexity-insensitive model evaluation method is justified.
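    The two endpoints the VAM interpolates between can be made concrete with a small sketch; the same similarity-choice rule is applied once to all exemplars and once to one prototype per category, with the stimuli and sensitivity parameter assumed for illustration (intermediate VAM members would merge only some exemplars before applying the same rule):

    ```python
    # Sketch of the two extreme levels of abstraction the VAM spans:
    # exemplar representation (keep every training item) vs. prototype
    # representation (average items into one abstraction per category).
    import numpy as np

    def similarity(a, b, sensitivity=2.0):
        return np.exp(-sensitivity * np.linalg.norm(a - b))

    def evidence(stimulus, representation):
        return sum(similarity(stimulus, item) for item in representation)

    cat_a = [np.array([1.0, 1.0]), np.array([1.2, 0.8])]     # assumed exemplars
    cat_b = [np.array([-1.0, -1.0]), np.array([-0.8, -1.2])]
    probe = np.array([0.9, 1.1])

    for label, rep_a, rep_b in [
        ("exemplar ", cat_a, cat_b),
        ("prototype", [np.mean(cat_a, axis=0)], [np.mean(cat_b, axis=0)]),
    ]:
        ev_a, ev_b = evidence(probe, rep_a), evidence(probe, rep_b)
        print(label, "P(A) =", round(ev_a / (ev_a + ev_b), 3))
    ```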

  3. Modern Quantum Field Theory II - Proceedings of the International Colloquium

    NASA Astrophysics Data System (ADS)

    Das, S. R.; Mandal, G.; Mukhi, S.; Wadia, S. R.

    1995-08-01

    The Table of Contents for the book is as follows: * Foreword * 1. Black Holes and Quantum Gravity * Quantum Black Holes and the Problem of Time * Black Hole Entropy and the Semiclassical Approximation * Entropy and Information Loss in Two Dimensions * Strings on a Cone and Black Hole Entropy (Abstract) * Boundary Dynamics, Black Holes and Spacetime Fluctuations in Dilaton Gravity (Abstract) * Pair Creation of Black Holes (Abstract) * A Brief View of 2-Dim. String Theory and Black Holes (Abstract) * 2. String Theory * Non-Abelian Duality in WZW Models * Operators and Correlation Functions in c ≤ 1 String Theory * New Symmetries in String Theory * A Look at the Discretized Superstring Using Random Matrices * The Nested BRST Structure of Wn-Symmetries * Landau-Ginzburg Model for a Critical Topological String (Abstract) * On the Geometry of Wn Gravity (Abstract) * O(d, d) Transformations, Marginal Deformations and the Coset Construction in WZNW Models (Abstract) * Nonperturbative Effects and Multicritical Behaviour of c = 1 Matrix Model (Abstract) * Singular Limits and String Solutions (Abstract) * BV Algebra on the Moduli Spaces of Riemann Surfaces and String Field Theory (Abstract) * 3. Condensed Matter and Statistical Mechanics * Stochastic Dynamics in a Deposition-Evaporation Model on a Line * Models with Inverse-Square Interactions: Conjectured Dynamical Correlation Functions of the Calogero-Sutherland Model at Rational Couplings * Turbulence and Generic Scale Invariance * Singular Perturbation Approach to Phase Ordering Dynamics * Kinetics of Diffusion-Controlled and Ballistically-Controlled Reactions * Field Theory of a Frustrated Heisenberg Spin Chain * FQHE Physics in Relativistic Field Theories * Importance of Initial Conditions in Determining the Dynamical Class of Cellular Automata (Abstract) * Do Hard-Core Bosons Exhibit Quantum Hall Effect? (Abstract) * Hysteresis in Ferromagnets * 4. Fundamental Aspects of Quantum Mechanics and Quantum Field Theory * Finite Quantum Physics and Noncommutative Geometry * Higgs as Gauge Field and the Standard Model * Canonical Quantisation of an Off-Conformal Theory * Deterministic Quantum Mechanics in One Dimension * Spin-Statistics Relations for Topological Geons in 2+1 Quantum Gravity * Generalized Fock Spaces * Geometrical Expression for Short Distance Singularities in Field Theory * 5. Mathematics and Quantum Field Theory * Knot Invariants from Quantum Field Theories * Infinite Grassmannians and Moduli Spaces of G-Bundles * A Review of an Algebraic Geometry Approach to a Model Quantum Field Theory on a Curve (Abstract) * 6. Integrable Models * Spectral Representation of Correlation Functions in Two-Dimensional Quantum Field Theories * On Various Avatars of the Pasquier Algebra * Supersymmetric Integrable Field Theories and Eight Vertex Free Fermion Models (Abstract) * 7. Lattice Field Theory * From Kondo Model and Strong Coupling Lattice QCD to the Isgur-Wise Function * Effective Confinement from a Logarithmically Running Coupling (Abstract)

  4. A Technique for Machine-Aided Indexing

    ERIC Educational Resources Information Center

    Klingbiel, Paul H.

    1973-01-01

    The technique for machine-aided indexing developed at the Defense Documentation Center (DDC) is illustrated on a randomly chosen abstract. Additional text is provided in coded form so that the reader can more fully explore this technique. (2 references) (Author)

  5. The Abstraction Process of Limit Knowledge

    ERIC Educational Resources Information Center

    Sezgin Memnun, Dilek; Aydin, Bünyamin; Özbilen, Ömer; Erdogan, Günes

    2017-01-01

    The RBC+C abstraction model is an effective model in mathematics education because it gives the opportunity to analyze research data through cognitive actions. For this reason, we aim to examine the abstraction process of the limit knowledge of two volunteer participant students using the RBC+C abstraction model. With this aim, the students'…

  6. Speech Communication and Communication Processes: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," July through September 1977 (Vol. 38 Nos. 1 through 3).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 22 titles deal with a variety of topics, including the following: the rhetorical effectiveness of Senator Edmund S. Muskie's 1972 Presidential primary election campaign; persuasive speaking techniques of black college and…

  7. Manual on the Flight of Flexible Aircraft in Turbulence (Manuel sur le Vol des Avions Non-rigides en Milieu Turbulent)

    DTIC Science & Technology

    1991-05-01

    y = f(dx/dt) = -f(-dx/dt) ==> static non-linearity; y = f(x, sign(dx/dt)) = -f(-x, sign... ==> hysteresis-type non-linearity. de Havilland Division, Garratt Blvd., Downsview, Ontario M3K 1Y5, Canada. CONTENTS: Abstract; Notation; 1. Introduction; 2. The SDG Gust Model; 3. Establishing Critical... Requests should be addressed directly to the National Technical Information Service (NTIS) at the address below. Sales agencies: National Technical

  8. 2001 Flight Mechanics Symposium

    NASA Technical Reports Server (NTRS)

    Lynch, John P. (Editor)

    2001-01-01

    This conference publication includes papers and abstracts presented at the Flight Mechanics Symposium held on June 19-21, 2001. Sponsored by the Guidance, Navigation and Control Center of Goddard Space Flight Center, this symposium featured technical papers on a wide range of issues related to attitude/orbit determination, prediction and control; attitude simulation; attitude sensor calibration; theoretical foundation of attitude computation; dynamics model improvements; autonomous navigation; constellation design and formation flying; estimation theory and computational techniques; Earth environment mission analysis and design; and, spacecraft re-entry mission design and operations.

  9. Using Genotype Abundance to Improve Phylogenetic Inference

    PubMed Central

    Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A

    2018-01-01

    Abstract Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation. PMID:29474671

  10. An Abstract Systolic Model and Its Application to the Design of Finite Element Systems.

    DTIC Science & Technology

    1983-01-01

    networks as a collection of communicating parallel processes, some of the techniques for the verification of distributed systems (see for... item must be collected, even if there is no interest in its value. In this case, the collection of the data is simply achieved by changing the state of... the appropriate data as well as for collecting the output data and performing some additional tasks that we will discuss later. A basic functional

  11. Patch models and their applications to multivehicle command and control.

    PubMed

    Rao, Venkatesh G; D'Andrea, Raffaello

    2007-06-01

    We introduce patch models, a computational modeling formalism for multivehicle combat domains, based on spatiotemporal abstraction methods developed in the computer science community. The framework yields models that are expressive enough to accommodate nontrivial controlled vehicle dynamics while being within the representational capabilities of common artificial intelligence techniques used in the construction of autonomous systems. The framework allows several key design requirements of next-generation network-centric command and control systems, such as maintenance of shared situation awareness, to be achieved. Major features include support for multiple situation models at each decision node and rapid mission plan adaptation. We describe the formal specification of patch models and our prototype implementation, i.e., Patchworks. The capabilities of patch models are validated through a combat mission simulation in Patchworks, which involves two defending teams protecting a camp from an enemy attacking team.

  12. Groundwater management under uncertainty using a stochastic multi-cell model

    NASA Astrophysics Data System (ADS)

    Joodavi, Ata; Zare, Mohammad; Ziaei, Ali Naghi; Ferré, Ty P. A.

    2017-08-01

    The optimization of spatially complex groundwater management models over long time horizons requires the use of computationally efficient groundwater flow models. This paper presents a new stochastic multi-cell lumped-parameter aquifer model that explicitly considers uncertainty in groundwater recharge. To achieve this, the multi-cell model is combined with the constrained-state formulation method. In this method, the lower and upper bounds of groundwater heads are incorporated into the mass balance equation using indicator functions. This provides expressions for the means, variances and covariances of the groundwater heads, which can be included in the constraint set in an optimization model. This method was used to formulate two separate stochastic models: (i) groundwater flow in a two-cell aquifer model with normal and non-normal distributions of groundwater recharge; and (ii) groundwater management in a multiple cell aquifer in which the differences between groundwater abstractions and water demands are minimized. The comparison between the results obtained from the proposed modeling technique with those from Monte Carlo simulation demonstrates the capability of the proposed models to approximate the means, variances and covariances. Significantly, considering covariances between the heads of adjacent cells allows a more accurate estimate of the variances of the groundwater heads. Moreover, this modeling technique requires no discretization of state variables, thus offering an efficient alternative to computationally demanding methods.
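    A Monte Carlo counterpart of such a two-cell stochastic mass balance is easy to sketch, and it estimates the same moments (means, variances, and the covariance of heads) that the constrained-state formulation approximates analytically; the linear exchange model and all parameter values are our illustrative assumptions:

    ```python
    # Monte Carlo sketch of a two-cell lumped-parameter aquifer with random
    # recharge: estimate means, variances, and the covariance of heads.
    import numpy as np

    rng = np.random.default_rng(3)
    n_runs, n_steps = 5000, 120
    S, k_exch, k_out = 0.1, 0.02, 0.01          # storativity, exchange, outflow (assumed)

    h = np.zeros((n_runs, 2))                   # heads in cells 1 and 2
    for _ in range(n_steps):
        recharge = rng.normal(loc=[0.8, 0.5], scale=[0.3, 0.2], size=(n_runs, 2))
        exchange = k_exch * (h[:, [1, 0]] - h)  # flow between adjacent cells
        h = h + (recharge + exchange - k_out * h) / S * 0.01  # small time step

    print("mean heads:", h.mean(axis=0).round(2))
    print("head variances:", h.var(axis=0).round(3))
    print("cov(h1, h2):", np.cov(h.T)[0, 1].round(3))
    ```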

  13. Automatic Text Structuring and Summarization.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1997-01-01

    Discussion of the use of information retrieval techniques for automatic generation of semantic hypertext links focuses on automatic text summarization. Topics include World Wide Web links, text segmentation, and evaluation of text summarization by comparing automatically generated abstracts with manually prepared abstracts. (Author/LRW)
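    A minimal extractive version of such summarization (score sentences by the document's term frequencies, keep the top-ranked ones) might look like the sketch below; the scoring heuristic is a generic frequency rule, not the authors' exact method:

    ```python
    # Minimal extractive summarization sketch: rank sentences by the mean
    # frequency of their words and keep the top ones as an abstract.
    from collections import Counter
    import re

    def summarize(text, n_sentences=2):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"[a-z']+", text.lower()))
        def score(s):
            toks = re.findall(r"[a-z']+", s.lower())
            return sum(freq[t] for t in toks) / (len(toks) or 1)
        ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
        return [s for s in sentences if s in ranked]   # keep original order

    doc = ("Text segmentation splits a document into topical units. "
           "Each unit is scored against the document vocabulary. "
           "High scoring sentences form the generated abstract. "
           "The abstract is then compared with a manually prepared one.")
    print(" ".join(summarize(doc)))
    ```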

  14. Abstracts of Research. July 1974-June 1975.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Computer and Information Science Research Center.

    Abstracts of research papers in computer and information science are given for 68 papers in the areas of information storage and retrieval; human information processing; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems…

  15. Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    First, M.W.

    1991-02-01

    Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)

  16. Object-orientated DBMS techniques for time-oriented medical record.

    PubMed

    Pinciroli, F; Combi, C; Pozzi, G

    1992-01-01

    In implementing time-orientated medical record (TOMR) management systems, the relational model has played a major role. Many applications have been developed to extend query and data manipulation languages to temporal aspects of information. Our experience in developing a TOMR revealed some deficiencies in the relational model, such as: (a) abstract data type definition; (b) unified view of data at a programming level; (c) management of temporal data; (d) management of signals and images. We identified some initial topics to address through an object-orientated approach to database design. This paper describes the first steps in designing and implementing a TOMR with an object-orientated DBMS.

  17. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  18. "Proprietary Processed" Allografts: Clinical Outcomes and Biomechanical Properties in Anterior Cruciate Ligament Reconstruction.

    PubMed

    Roberson, Troy A; Abildgaard, Jeffrey T; Wyland, Douglas J; Siffri, Paul C; Geary, Stephen P; Hawkins, Richard J; Tokish, John M

    2017-11-01

    The processing of allograft tissues in anterior cruciate ligament (ACL) reconstruction continues to be controversial. While high-dose irradiation of grafts has received scrutiny for high failure rates, lower dose irradiation and "proprietary-based" nonirradiated sterilization techniques have become increasingly popular, with little in the literature to evaluate their outcomes. Recent studies have suggested that the specifics of allograft processing techniques may be a risk factor for higher failure rates. To assess these proprietary processes and their clinical outcomes and biomechanical properties. Systematic review. A systematic review was performed using searches of PubMed, EMBASE, Google Scholar, and Cochrane databases. English-language studies were identified with the following search terms: "allograft ACL reconstruction" (title/abstract), "novel allograft processing" (title/abstract), "allograft anterior cruciate ligament" (title/abstract), "anterior cruciate ligament allograft processing" (title/abstract), or "biomechanical properties anterior cruciate ligament allograft" (title/abstract). Duplicate studies, studies not providing the allograft processing technique, and those not containing the outcomes of interest were excluded. Outcomes of interest included outcome scores, complication and failure rates, and biomechanical properties of the processed allografts. Twenty-four studies (13 clinical, 11 biomechanical) met inclusion criteria for review. No demonstrable difference in patient-reported outcomes was appreciated between the processing techniques, with the exception of the Tutoplast process. The clinical failure rate of the Tutoplast process was unacceptably high (45% at 6 years), but no other difference was found between other processing techniques (BioCleanse: 5.4%; AlloTrue: 5.7%; MTF: 6.7%). Several studies did show an increased failure rate, but these studies either combined processing techniques or failed to delineate enough detail to allow a specific comparison for this study. The biomechanical studies showed overall maintenance of satisfactory biomechanical properties throughout multiple testing modes with normalization to the percentage of control specimens. A comparison of proprietary allograft processing techniques is difficult because of the variability and lack of specificity of reporting in the current literature. Among the available literature, except for the Tutoplast process, no notable differences were found in the clinical outcomes or biomechanical properties. Future study with a longer follow-up is necessary to determine the role and limitations of these grafts in the clinical setting.

  19. LEAD-FREE INTERCONNECT TECHNIQUE BY USING VARIABLE FREQUENCY MICROWAVE (VFM). (R831489)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  20. Symbolic Analysis of Concurrent Programs with Polymorphism

    NASA Technical Reports Server (NTRS)

    Rungta, Neha Shyam

    2010-01-01

    The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant in testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus in parts of the behavior space more likely to contain an error.
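
    The guidance idea can be illustrated with a small Python example: a best-first search over a control-flow graph that prioritizes paths which have already passed through more of the key locations on the way to the error location. The graph, the key locations, and the priority scheme are illustrative assumptions; a real symbolic executor would additionally carry path constraints and check their feasibility.

    ```python
    # Minimal sketch (assumed structure, not the tool's code): guiding a search
    # toward an error location through an ordered sequence of "key" locations.
    # States are just CFG nodes here; real symbolic execution tracks constraints.
    import heapq

    def guided_search(cfg, start, key_locations, error):
        """cfg: dict node -> list of successor nodes.
        key_locations: ordered locations believed relevant to reaching `error`."""
        targets = list(key_locations) + [error]
        # Priority = number of targets still to be reached, then depth: paths
        # that have already passed more key locations are explored first.
        frontier = [(len(targets), 0, start, [start])]
        seen = set()
        while frontier:
            remaining, depth, node, path = heapq.heappop(frontier)
            if node == error:
                return path                  # candidate path to the error state
            if (node, remaining) in seen:
                continue
            seen.add((node, remaining))
            nxt = remaining - 1 if remaining and node == targets[-remaining] else remaining
            for succ in cfg.get(node, []):
                heapq.heappush(frontier, (nxt, depth + 1, succ, path + [succ]))
        return None

    cfg = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["err", "a"]}
    print(guided_search(cfg, "a", ["b", "d"], "err"))  # ['a', 'b', 'd', 'err']
    ```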

  1. Reading Achievement: Characteristics Associated with Success and Failure: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," July through December 1985 (Vol. 46 Nos. 1 through 6).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 26 titles deal with a variety of topics, including the following: (1) the effect of selected biofeedback techniques on reading comprehension in a high school chemistry class; (2) an investigation of volunteer and nonvolunteer…

  2. Working Notes of the 1990 Spring Symposium on Automated Abduction

    DTIC Science & Technology

    1990-09-27

    possibilities for abstracting the leaf nodes in the proof tree using apprenticeship learning techniques. Morgan Kaufmann, 1987. A detailed...ibm.com Abstract: A major limitation of explanation-based learning...explanation process and compute particular operational descriptions of the target...learning that would be difficult or impossible using abduction...an educated, somewhat abstract guess at why the proposition is likely to

  3. Proceedings of the international conference on nuclear physics, August 24-30, 1980, Berkeley, California. Volume 1. Abstracts. [Berkeley, California, August 24-30, 1980 (abstracts only)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-01-01

    This volume contains all abstracts (931) received by the conference organizers before June 20, 1980. The abstracts are grouped according to the following topics: nucleon-nucleon interactions, free and in nuclei; distribution of matter, charge, and magnetism; exotic nuclei and exotic probes; giant resonances and other high-lying excitations; applications of nuclear science; nuclei with large angular momentum and deformation; heavy-ion reactions and relaxation phenomena; new techniques and instruments; pion absorption and scattering by nuclei; and miscellaneous. Some of these one-page abstracts contain data. A complete author index is provided. (RWR)

  4. Protein folding optimization based on 3D off-lattice model via an improved artificial bee colony algorithm.

    PubMed

    Li, Bai; Lin, Mu; Liu, Qiao; Li, Ya; Zhou, Changjun

    2015-10-01

    Protein folding is a fundamental topic in molecular biology. Conventional experimental techniques for protein structure identification or protein folding recognition require strict laboratory requirements and heavy operating burdens, which have largely limited their applications. Alternatively, computer-aided techniques have been developed to optimize protein structures or to predict the protein folding process. In this paper, we utilize a 3D off-lattice model to describe the original protein folding scheme as a simplified energy-optimal numerical problem, where all types of amino acid residues are binarized into hydrophobic and hydrophilic ones. We apply a balance-evolution artificial bee colony (BE-ABC) algorithm as the minimization solver, which is featured by the adaptive adjustment of search intensity to cater for the varying needs during the entire optimization process. In this work, we establish a benchmark case set with 13 real protein sequences from the Protein Data Bank database and evaluate the convergence performance of the BE-ABC algorithm through strict comparisons with several state-of-the-art ABC variants in short-term numerical experiments. In addition, our best-so-far protein structures are compared with those reported in previous literature. This study also provides preliminary insights into how artificial intelligence techniques can be applied to reveal the dynamics of protein folding. Graphical Abstract: Protein folding optimization using a 3D off-lattice model and advanced optimization techniques.
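
    For illustration, a hedged Python sketch of the energy function commonly used for 3D AB off-lattice protein models follows, with residues binarized into hydrophobic (A) and hydrophilic (B) beads; the functional form and coefficients are assumptions taken from the standard AB-model literature, not necessarily this paper's exact formulation. An optimizer such as BE-ABC would minimize this function over the bead coordinates.

    ```python
    # Minimal sketch (hedged): energy of a 3D AB off-lattice chain with unit
    # bond lengths. Bending term plus a species-dependent Lennard-Jones-like
    # pair term; treat the exact form as an assumption, not the paper's code.
    import numpy as np

    def ab_energy(coords, seq):
        """coords: (N,3) array of bead positions; seq: string of 'A'/'B'."""
        coords = np.asarray(coords, dtype=float)
        n = len(seq)
        e = 0.0
        # Backbone bending term over interior beads: (1 - cos theta) / 4.
        for i in range(1, n - 1):
            u = coords[i] - coords[i - 1]
            v = coords[i + 1] - coords[i]
            cos_t = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
            e += (1.0 - cos_t) / 4.0
        # Pair term over non-adjacent beads; coefficient depends on species.
        def c(a, b):
            return 1.0 if a == b == "A" else (0.5 if a == b == "B" else -0.5)
        for i in range(n - 2):
            for j in range(i + 2, n):
                r = np.linalg.norm(coords[i] - coords[j])
                e += 4.0 * (r ** -12 - c(seq[i], seq[j]) * r ** -6)
        return e

    # Tiny example: a 4-bead chain laid out along a zigzag.
    xyz = [(0, 0, 0), (1, 0, 0), (1.5, 0.87, 0), (2.5, 0.87, 0)]
    print(ab_energy(xyz, "ABAB"))
    ```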

  5. Constraint-Based Abstract Semantics for Temporal Logic: A Direct Approach to Design and Implementation

    NASA Astrophysics Data System (ADS)

    Banda, Gourinath; Gallagher, John P.

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. Then we show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
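
    The construction can be illustrated with a deliberately tiny Python sketch: concrete integer states are abstracted by parity (the abstraction side of a simple Galois connection), the abstract transition relation is the existential lift of the concrete one, and an EF (reachability) property is over-approximated by backward reachability in the abstract system. The domain and system are illustrative; the paper's implementation uses linear-constraint domains and an SMT solver.

    ```python
    # Minimal sketch (illustrative, not the paper's implementation): abstract
    # model checking of an EF property through a parity abstraction.

    concrete = range(100)
    def step(x): return [(x + 2) % 100, (3 * x) % 100]   # concrete transitions

    alpha = lambda x: x % 2                              # abstraction: parity
    abs_trans = {(alpha(x), alpha(y)) for x in concrete for y in step(x)}

    def abs_EF(target_abs):
        """Over-approximate the states satisfying EF target by backward
        reachability over the abstract transition relation."""
        reach = set(target_abs)
        changed = True
        while changed:
            changed = False
            for (a, b) in abs_trans:
                if b in reach and a not in reach:
                    reach.add(a)
                    changed = True
        return reach

    # Property: can an odd state be reached? The abstract answer is sound:
    # if the abstraction says "no", the concrete system certainly cannot.
    print(abs_EF({1}))   # {1}: even states provably never reach an odd state
    ```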

  6. Abstracting event-based control models for high autonomy systems

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  7. Methods for solving reasoning problems in abstract argumentation – A survey

    PubMed Central

    Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan

    2015-01-01

    Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has been recently faced from different angles. In this survey, we give an overview on different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods to practice. PMID:25737590
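
    As a concrete instance of one such reasoning problem, the following Python sketch computes the grounded extension of a Dung-style argumentation framework by iterating the characteristic function from the empty set until a fixed point; the framework itself is a toy example, not drawn from the survey.

    ```python
    # Minimal sketch: grounded extension of an abstract argumentation framework.
    # Arguments are opaque labels; `attacks` is the attack relation.

    def grounded_extension(arguments, attacks):
        """Iterate the characteristic function F(S) = {a | S defends a}
        from the empty set until a fixed point is reached."""
        attackers = {a: {b for (b, c) in attacks if c == a} for a in arguments}
        ext = set()
        while True:
            # a is acceptable w.r.t. ext if every attacker of a is attacked by ext
            new = {a for a in arguments
                   if all(any((d, b) in attacks for d in ext) for b in attackers[a])}
            if new == ext:
                return ext
            ext = new

    args = {"a", "b", "c"}
    atks = {("a", "b"), ("b", "c")}        # a attacks b, b attacks c
    print(grounded_extension(args, atks))  # {'a', 'c'}: a is unattacked and defends c
    ```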

  8. Estuarine research; an annotated bibliography of selected literature, with emphasis on the Hudson River estuary, New York and New Jersey

    USGS Publications Warehouse

    Embree, William N.; Wiltshire, Denise A.

    1978-01-01

    Abstracts of 177 selected publications on water movement in estuaries, particularly the Hudson River estuary, are compiled for reference in Hudson River studies. Subjects represented are the hydraulic, chemical, and physical characteristics of estuarine waters, estuarine modeling techniques, and methods of water-data collection and analysis. Summaries are presented in five categories: Hudson River estuary studies; hydrodynamic-model studies; water-quality-model studies; reports on data-collection equipment and methods; and bibliographies, literature reviews, conference proceedings, and textbooks. An author index is included. Omitted are most works published before 1965, environmental-impact statements, theses and dissertations, policy or planning reports, regional or economic reports, ocean studies, studies based on physical models, and foreign studies. (Woodard-USGS)

  9. Slicing AADL Specifications for Model Checking

    NASA Technical Reports Server (NTRS)

    Odenbrett, Maximilian; Nguyen, Viet Yen; Noll, Thomas

    2010-01-01

    To combat the state-space explosion problem in model checking larger systems, abstraction techniques can be employed. Here, methods that operate on the system specification before constructing its state space are preferable to those that try to minimize the resulting transition system as they generally reduce peak memory requirements. We sketch a slicing algorithm for system specifications written in (a variant of) the Architecture Analysis and Design Language (AADL). Given a specification and a property to be verified, it automatically removes those parts of the specification that are irrelevant for model checking the property, thus reducing the size of the corresponding transition system. The applicability and effectiveness of our approach is demonstrated by analyzing the state-space reduction for an example, employing a translator from AADL to Promela, the input language of the SPIN model checker.
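
    The essence of the approach can be sketched in a few lines of Python: given a dependency graph over specification components and the components referenced by the property, backward reachability yields the set of components that must be kept, and everything else can be removed before the state space is built. The data structures are illustrative assumptions, not the AADL tool's representation.

    ```python
    # Minimal sketch (assumed data structures, not the AADL slicer itself):
    # keep only the components on which the property transitively depends.

    def slice_spec(dependencies, property_refs):
        """dependencies: dict component -> set of components it depends on.
        property_refs: components mentioned by the property to verify.
        Returns the set of components that must be kept."""
        keep, frontier = set(), list(property_refs)
        while frontier:
            comp = frontier.pop()
            if comp in keep:
                continue
            keep.add(comp)
            frontier.extend(dependencies.get(comp, ()))
        return keep

    deps = {"monitor": {"sensor", "bus"}, "sensor": {"bus"},
            "actuator": {"bus"}, "logger": {"monitor"}}
    # The property only mentions the monitor: the logger is irrelevant and sliced away.
    print(slice_spec(deps, {"monitor"}))  # {'monitor', 'sensor', 'bus'}
    ```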

  10. Planarian brain regeneration as a model system for developmental neurotoxicology

    PubMed Central

    Hagstrom, Danielle; Cochet‐Escartin, Olivier

    2016-01-01

    Freshwater planarians, famous for their regenerative prowess, have long been recognized as a valuable in vivo animal model to study the effects of chemical exposure. In this review, we summarize the current techniques and tools used in the literature to assess toxicity in the planarian system. We focus on the planarian's particular amenability for neurotoxicology and neuroregeneration studies, owing to the planarian's unique ability to regenerate a centralized nervous system. Zooming in from the organismal to the molecular level, we show that planarians offer a repertoire of morphological and behavioral readouts while also being amenable to mechanistic studies of compound toxicity. Finally, we discuss the open challenges and opportunities for planarian brain regeneration to become an important model system for modern toxicology. PMID:27499880

  11. Rapid Prototyping 3D Model in Treatment of Pediatric Hip Dysplasia: A Case Report

    PubMed Central

    Holt, Andrew M.; Starosolski, Zbigniew; Kan, J. Herman

    2017-01-01

    Background: Rapid prototyping is an emerging technology that integrates common medical imaging with specialized production mechanisms to create detailed anatomic replicas. 3D-printed models of musculoskeletal anatomy have already proven useful in orthopedics and their applications continue to expand. Case Description: We present the case of a 10 year-old female with Down syndrome and left acetabular dysplasia and chronic hip instability who underwent periacetabular osteotomy. A rapid prototyping 3D model was created to better understand the anatomy, counsel the family about the problem and the surgical procedure, as well as guide surgical technique. The intricate detail and size match of the model with the patient's anatomy offered unparalleled hands-on experience pre-operatively and improved surgical precision. Conclusions: Our experience with rapid prototyping confirmed its ability to enhance orthopedic care by improving the surgeon's ability to understand complex anatomy. Additionally, we report a new application utilizing intraoperative fluoroscopic comparison of the model and patient to ensure surgical precision and minimize the risk of complications. This technique could be used in other challenging cases. The increasing availability of rapid prototyping welcomes further use in all areas of orthopedics. PMID:28852351

  12. Research Support for the Laboratory for Lightwave Technology

    DTIC Science & Technology

    1992-12-31

    34 .. . ."/ 12a. DISTRIBUTION AVAILABILITY STATEMENT 12b. DISTRIBUTION CODE UNLIMITED 13. ABSTRACT (Mawimum 200words) 4 SEE ATTACHED ABSTRACT DT I 14. SUBJECT...8217TERMS 15. NUMBER OF PAGES 16. PRICE CODE 17. SECURITY CLASSIFICATION 18. SECURITY CLASSIFICATION 19. SECURITY CLASSIFICATION 20. LIMITATION OF ABSTRACT...temperature ceramic nano- phase single crystal oxides that may be produced at a high rate . The synthesis of both glasses and ceramics using novel techniques

  13. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Evaluation of Water Stress Coefficient Methods to Estimate Actual Corn Evapotranspiration in Colorado

    USDA-ARS?s Scientific Manuscript database

    Abstract for Kullberg Hydrology Days: Increased competition for water resources is placing pressure on the agricultural sector to remain profitable while reducing water use. Remote sensing techniques have been developed to monitor crop water stress and produce information for evapotranspi...

  15. PURIFICATION OF ENTEROCYTOZOON BIENEUSI SPORES FROM STOOL SPECIMENS BY GRADIENT AND CELL SORTING TECHNIQUES. (R828042)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  16. Quantum Tunneling in Testosterone 6β-Hydroxylation by Cytochrome P450: Reaction Dynamics Calculations Employing Multiconfiguration Molecular-Mechanical Potential Energy Surfaces

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Lin, Hai

    2009-05-01

    Testosterone hydroxylation is a prototypical reaction of human cytochrome P450 3A4, which metabolizes about 50% of oral drugs on the market. Reaction dynamics calculations were carried out for the testosterone 6β-hydrogen abstraction and the 6β-d1-testosterone 6β-deuterium abstraction employing a model that consists of the substrate and the active oxidant compound I. The calculations were performed at the level of canonical variational transition state theory with multidimensional tunneling and were based on a semiglobal full-dimensional potential energy surface generated by the multiconfiguration molecular mechanics technique. The tunneling coefficients were found to be around 3, indicating substantial contributions by quantum tunneling. However, the tunneling made only modest contributions to the kinetic isotope effects. The kinetic isotope effects were computed to be about 2 in the doublet spin state and about 5 in the quartet spin state.

  17. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
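
    The rule-based generation described above can be sketched in Python: states are integer vectors, each transition statement is a (condition, destination, rate) triple, and the reachable Markov model is produced by fixed-point exploration from the initial state. The failure/replacement rules below are illustrative, not taken from the ASSIST distribution.

    ```python
    # Minimal sketch of ASSIST-style model generation: states are integer
    # vectors; each rule is a (condition, destination, rate) triple over states.

    def generate_model(initial, rules):
        states, transitions, frontier = {initial}, [], [initial]
        while frontier:
            s = frontier.pop()
            for cond, dest, rate in rules:
                if cond(s):                               # condition expression
                    t = dest(s)                           # destination expression
                    transitions.append((s, t, rate(s)))   # rate expression
                    if t not in states:
                        states.add(t)
                        frontier.append(t)
        return states, transitions

    LAMBDA, MU = 1e-4, 1.0   # illustrative failure and replacement rates
    rules = [
        # State (working, spares): a working unit fails...
        (lambda s: s[0] > 0, lambda s: (s[0] - 1, s[1]), lambda s: s[0] * LAMBDA),
        # ...and a spare is brought in to replace it.
        (lambda s: s[1] > 0 and s[0] < 2, lambda s: (s[0] + 1, s[1] - 1), lambda s: MU),
    ]
    states, trans = generate_model((2, 1), rules)
    print(len(states), "states,", len(trans), "transitions")  # 6 states, 6 transitions
    ```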

  18. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.

  19. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models (in English;Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtis, M.D.; Razuvaev, V.N.; Sivachok, S.G.

    1996-10-01

    This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  20. Concept Mapping: A Critical Thinking Technique

    ERIC Educational Resources Information Center

    Harris, Charles M.; Zha, Shenghua

    2013-01-01

    Concept mapping, graphically depicting the structure of abstract concepts, is based on the observation that pictures and line drawings are often more easily comprehended than the words that represent an abstract concept. The efficacy of concept mapping for facilitating critical thinking was assessed in four sections of an introductory psychology…

  1. Innovation Abstracts, Volume IV, Numbers 1-36.

    ERIC Educational Resources Information Center

    Watkins, Karen, Ed.

    1982-01-01

    Brief, two-page abstracts are provided on 36 educational topics of interest to community college faculty, administrators, and staff. The topics covered are: (1) a student retention technique; (2) educational productivity and quality; (3) competency-based adult education; (4) part-time faculty; (5) Beaver College's (Pennsylvania) writing across the…

  2. DEVELOPMENT OF HOURLY PROBABILISTIC UTILITY NOX EMISSION INVENTORIES USING TIME SERIES TECHNIQUES: PART 2-MULTIVARIATE APPROACH. (R826766)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  3. DEVELOPMENT OF HOURLY PROBABILISTIC UTILITY NOX EMISSION INVENTORIES USING TIME SERIES TECHNIQUES: PART 1-UNIVARIATE APPROACH. (R826766)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Proceedings of the seventh international conference on high voltage electron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, R.M.; Gronsky, R.; Westmacott, K.H.

    1983-01-01

    Eighty-four papers are arranged under the following headings: high resolution, techniques and instrumentation, radiation effects, in-situ and phase transformations, minerals and ceramics, and semiconductors and thin films. Twenty-three papers were abstracted separately for the data base; three of the remainder had previously been abstracted. (DLC)

  5. OZONE AIR QUALITY OVER NORTH AMERICA: PART II--AN ANALYSIS OF TREND DETECTION AND ATTRIBUTION TECHNIQUES. (R825260)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. A COMPARISON OF TWO SAMPLING TECHNIQUES IN THE STUDY OF SUBMERSED MACROPHYTE RICHNESS AND ABUNDANCE. (U915544)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  7. Information Processing Techniques Program. Volume II. Communications- Adaptive Internetting

    DTIC Science & Technology

    1977-09-30

    LABORATORY INFORMATION PROCESSING TECHNIQUES PROGRAM VOLUME II: COMMUNICATIONS-ADAPTIVE INTERNETTING, SEMIANNUAL TECHNICAL SUMMARY REPORT TO THE...MASSACHUSETTS ABSTRACT: This report describes work performed on the Communications-Adaptive Internetting program sponsored by the Information...information processing techniques, network speech terminal, communications-adaptive internetting, links, digital voice communications, time-varying

  8. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carey, Stephen A.; Minard, Kevin R.; Trease, Lynn L.

    Age-related changes in gross and microscopic structure of the nasal cavity can alter local tissue susceptibility as well as the dose of inhaled toxicant delivered to susceptible sites. This article describes a novel method for the use of magnetic resonance imaging, 3-dimensional airway modeling, and morphometric techniques to characterize the distribution and magnitude of ozone-induced nasal injury in infant monkeys. Using this method, we are able to generate age-specific, 3-dimensional, epithelial maps of the nasal airways of infant Rhesus macaques. The principal nasal lesions observed in this primate model of ozone-induced nasal toxicology were neutrophilic rhinitis, along with necrosis and exfoliation of the epithelium lining the anterior maxilloturbinate. These lesions, induced by acute or cyclic (episodic) exposures, were examined by light microscopy, quantified by morphometric techniques, and mapped on 3-dimensional models of the nasal airways. Here, we describe the histopathologic, imaging, and computational biology methods developed to efficiently characterize, localize, quantify, and map these nasal lesions. By combining these techniques, the location and severity of the nasal epithelial injury were correlated with epithelial type, nasal airway geometry, and local biochemical and molecular changes on an individual animal basis. These correlations are critical for accurate predictive modeling of exposure-dose-response relationships in the nasal airways, and subsequent extrapolation of nasal findings in animals to humans for developing risk assessment.

  10. Physics and Process Modeling (PPM) and Other Propulsion R and T. Volume 1; Materials Processing, Characterization, and Modeling; Lifting Models

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This CP contains the extended abstracts and presentation figures of 36 papers presented at the PPM and Other Propulsion R&T Conference. The focus of the research described in these presentations is on materials and structures technologies that are parts of the various projects within the NASA Aeronautics Propulsion Systems Research and Technology Base Program. These projects include Physics and Process Modeling; Smart, Green Engine; Fast, Quiet Engine; High Temperature Engine Materials Program; and Hybrid Hyperspeed Propulsion. Also presented were research results from the Rotorcraft Systems Program and work supported by the NASA Lewis Director's Discretionary Fund. Authors from NASA Lewis Research Center, industry, and universities conducted research in the following areas: material processing, material characterization, modeling, life, applied life models, design techniques, vibration control, mechanical components, and tribology. Key issues, research accomplishments, and future directions are summarized in this publication.

  11. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  12. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with the performance of native PVM and MPI.

  13. Chemically Aware Model Builder (camb): an R package for property and bioactivity modelling of small molecules.

    PubMed

    Murrell, Daniel S; Cortes-Ciriano, Isidro; van Westen, Gerard J P; Stott, Ian P; Bender, Andreas; Malliavin, Thérèse E; Glen, Robert C

    2015-01-01

    In silico predictive models have proved to be valuable for the optimisation of compound potency, selectivity and safety profiles in the drug discovery process. camb is an R package that provides an environment for the rapid generation of quantitative Structure-Property and Structure-Activity models for small molecules (including QSAR, QSPR, QSAM, PCM) and is aimed at both advanced and beginner R users. camb's capabilities include the standardisation of chemical structure representation, computation of 905 one-dimensional and 14 fingerprint type descriptors for small molecules, 8 types of amino acid descriptors, 13 whole protein sequence descriptors, filtering methods for feature selection, generation of predictive models (using an interface to the R package caret), as well as techniques to create model ensembles (using the R package caretEnsemble). Results can be visualised through high-quality, customisable plots (R package ggplot2). Overall, camb constitutes an open-source framework to perform the following steps: (1) compound standardisation, (2) molecular and protein descriptor calculation, (3) descriptor pre-processing and model training, visualisation and validation, and (4) bioactivity/property prediction for new molecules. camb aims to speed model generation, in order to provide reproducibility and tests of robustness. QSPR and proteochemometric case studies are included which demonstrate camb's application. Graphical abstract: From compounds and data to models: a complete model building workflow in one package.

  14. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state-explosion problem.
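
    A minimal Python sketch of the core procedure: an exhaustive breadth-first search of a finite state-transition system that either verifies a safety property over all reachable states or returns a shortest counterexample trace. The toy system is illustrative; real model checkers add temporal-logic specifications and the state-space reduction techniques the talk discusses.

    ```python
    # Minimal sketch: exhaustive search for a safety-property violation,
    # returning a counterexample execution trace when the property fails.
    from collections import deque

    def check_safety(initial, successors, safe):
        """BFS over reachable states; returns None if `safe` holds everywhere,
        otherwise the shortest trace to a violating state."""
        parent, queue = {initial: None}, deque([initial])
        while queue:                  # the state explosion problem lives here:
            s = queue.popleft()       # reachable states can grow exponentially
            if not safe(s):
                trace = []
                while s is not None:
                    trace.append(s)
                    s = parent[s]
                return trace[::-1]    # counterexample execution trace
            for t in successors(s):
                if t not in parent:
                    parent[t] = s
                    queue.append(t)
        return None

    # Toy system: a counter modulo 8 with two possible steps; property: never 5.
    succ = lambda n: [(n + 1) % 8, (n + 3) % 8]
    print(check_safety(0, succ, lambda n: n != 5))  # [0, 1, 2, 5]
    ```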

  15. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.

  16. The ChemViz Project: Using a Supercomputer To Illustrate Abstract Concepts in Chemistry.

    ERIC Educational Resources Information Center

    Beckwith, E. Kenneth; Nelson, Christopher

    1998-01-01

    Describes the Chemistry Visualization (ChemViz) Project, a Web venture maintained by the University of Illinois National Center for Supercomputing Applications (NCSA) that enables high school students to use computational chemistry as a technique for understanding abstract concepts. Discusses the evolution of computational chemistry and provides a…

  17. A Selected Annotated Bibliography on the Analysis of Water Resources System, Volume 2.

    ERIC Educational Resources Information Center

    Kriss, Carol; And Others

    Presented is an annotated bibliography of some recent selected publications pertaining to the application of systems analysis techniques for defining and evaluating alternative solutions to water resource problems. Both subject and author indices are provided. Keywords are listed at the end of each abstract. The abstracted material emphasizes the…

  18. A Proposed Multimedia Cone of Abstraction: Updating a Classic Instructional Design Theory

    ERIC Educational Resources Information Center

    Baukal, Charles E.; Ausburn, Floyd B.; Ausburn, Lynna J.

    2013-01-01

    Advanced multimedia techniques offer significant learning potential for students. Dale (1946, 1954, 1969) developed a Cone of Experience (CoE) which is a hierarchy of learning experiences ranging from direct participation to abstract symbolic expression. This paper updates the CoE for today's technology and learning context, specifically focused…

  19. Urban children and nature: a summary of research on camping and outdoor education

    Treesearch

    William R., Jr. Burch

    1977-01-01

    This paper reports the preliminary findings of an extensive bibliographic search that identified studies of urban children in camp and outdoor education programs. These studies were systematically abstracted and classified as qualitative or quantitative. Twenty-five percent of the abstracted studies were quantitative. The major findings, techniques of study, and policy...

  20. Proceedings of the first ERDA statistical symposium, Los Alamos, NM, November 3--5, 1975. [Sixteen papers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W L; Harris, J L

    1976-03-01

    The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)

  1. Extreme Rock Distributions on Mars and Implications for Landing Safety

    NASA Technical Reports Server (NTRS)

    Golombek, M. P.

    2001-01-01

    Prior to the landing of Mars Pathfinder, the size-frequency distribution of rocks from the two Viking landing sites and Earth analog surfaces was used to derive a size-frequency model for nominal rock distributions on Mars. This work, coupled with extensive testing of the Pathfinder airbag landing system, allowed an estimate of what total rock abundances derived from thermal differencing techniques could be considered safe for landing. Predictions based on this model proved largely correct for the size-frequency distribution of rocks at the Mars Pathfinder site and the fraction of potentially hazardous rocks. In this abstract, extreme rock distributions observed in Mars Orbiter Camera (MOC) images are compared with those observed at the three landing sites and model distributions as an additional constraint on potentially hazardous surfaces on Mars.
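
    For context, a hedged Python sketch of the exponential size-frequency form used in this line of work follows: the cumulative fractional area covered by rocks of diameter at least D is modeled as F_k(D) = k exp(-q(k) D) for total rock abundance k. The q(k) expression below follows values reported in the related literature and should be treated as an assumption here, not this abstract's fitted model.

    ```python
    # Hedged sketch of an exponential rock size-frequency model: cumulative
    # fractional area covered by rocks with diameter >= D, for a surface with
    # total rock abundance k. Coefficients are assumptions from related work.
    import math

    def cum_area_fraction(D, k):
        """D: rock diameter [m]; k: fraction of surface covered by rocks."""
        q = 1.79 + 0.152 / k          # assumed diameter-scale coefficient
        return k * math.exp(-q * D)

    # Fraction of a 20%-rock-abundance surface covered by rocks >= 0.5 m
    # across (roughly the size class hazardous to a lander):
    print(cum_area_fraction(0.5, 0.20))   # ~0.056
    ```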

  2. Comparing Noun Phrasing Techniques for Use with Medical Digital Library Tools.

    ERIC Educational Resources Information Center

    Tolle, Kristin M.; Chen, Hsinchun

    2000-01-01

    Describes a study that investigated the use of a natural language processing technique called noun phrasing to determine whether it is a viable technique for medical information retrieval. Evaluates four noun phrase generation tools for their ability to isolate noun phrases from medical journal abstracts, focusing on precision and recall…

  3. Prototyping for surgical and prosthetic treatment.

    PubMed

    Goiato, Marcelo Coelho; Santos, Murillo Rezende; Pesqueira, Aldiéris Alves; Moreno, Amália; dos Santos, Daniela Micheline; Haddad, Marcela Filié

    2011-05-01

    Techniques of rapid prototyping were introduced in the 1980s in the field of engineering for the fabrication of a solid model based on a computed file. After its introduction in the biomedical field, several applications emerged, including the fabrication of models to ease surgical planning and simulation in implantology, neurosurgery, and orthopedics, as well as the fabrication of maxillofacial prostheses. Hence, the literature has described the evolution of rapid prototyping techniques in health care, which has allowed easier procedures, improved surgical results, and the fabrication of maxillofacial prostheses. Accordingly, a literature review of the MEDLINE (PubMed) database was conducted using the keywords rapid prototyping, surgical planning, and maxillofacial prostheses, based on articles published from 1981 to 2010. After reading the titles and abstracts of the articles, 50 studies were selected owing to their relevance to the aim of the current study. Several studies show that the prototypes have been used in different dental-medical areas such as maxillofacial and craniofacial surgery; implantology; neurosurgery; orthopedics; scaffolds of ceramic, polymeric, and metallic materials; and fabrication of personalized maxillofacial prostheses. Therefore, prototyping has been an indispensable tool in several studies and helpful for surgical planning and fabrication of prostheses and implants.

  4. On 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.

    1986-01-01

    Accomplishments are described for the two-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element, and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  5. The 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.

    1992-01-01

    A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  6. An initial-abstraction, constant-loss model for unit hydrograph modeling for applicable watersheds in Texas

    USGS Publications Warehouse

    Asquith, William H.; Roussel, Meghan C.

    2007-01-01

    Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds. The statistical analyses of watershed-specific initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve number.
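
    The two parameters map directly onto a few lines of code. The sketch below (with hypothetical parameter values and a hypothetical toy storm, not figures from the report) shows how excess rainfall would be computed under the initial-abstraction, constant-loss conceptualization: rainfall first fills the abstraction storage, and afterwards only the portion of each time step's rainfall exceeding the constant loss becomes runoff.

```python
# Sketch of the two-parameter initial-abstraction, constant-loss model.
# Parameter values are illustrative only; the report tabulates
# watershed-specific means for the 92 Texas watersheds.

def excess_rainfall(rainfall, ia, cl):
    """Excess (runoff-producing) rainfall per time step, in inches.

    rainfall -- incremental rainfall depths (in) for each time step
    ia       -- initial abstraction (in): depth stored before any runoff
    cl       -- constant loss rate (in per time step) once ia is satisfied
    """
    remaining_ia = ia
    excess = []
    for depth in rainfall:
        absorbed = min(depth, remaining_ia)      # fill abstraction storage first
        remaining_ia -= absorbed
        leftover = depth - absorbed
        excess.append(max(leftover - cl, 0.0))   # constant loss; surplus runs off
    return excess

# Toy storm with ia = 1.0 in and cl = 0.2 in per time step.
print(excess_rainfall([0.5, 0.8, 1.0, 0.6, 0.1], ia=1.0, cl=0.2))
# -> [0.0, 0.1, 0.8, 0.4, 0.0] (up to floating-point rounding)
```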

  7. Novel concept of washing for microfluidic paper-based analytical devices based on capillary force of paper substrates.

    PubMed

    Mohammadi, Saeed; Busa, Lori Shayne Alamo; Maeki, Masatoshi; Mohamadi, Reza M; Ishida, Akihiko; Tani, Hirofumi; Tokeshi, Manabu

    2016-11-01

    A novel washing technique for microfluidic paper-based analytical devices (μPADs) that is based on the spontaneous capillary action of paper and eliminates unbound antigen and antibody in a sandwich immunoassay is reported. Liquids can flow through a porous medium (such as paper) in the absence of external pressure as a result of capillary action. Uniform results were achieved when washing a paper substrate in a PDMS holder which was integrated with a cartridge absorber acting as a porous medium. Our study demonstrated that applying this washing technique would allow μPADs to become the least expensive microfluidic device platform with high reproducibility and sensitivity. In a model μPAD assay that utilized this novel washing technique, C-reactive protein (CRP) was detected with a limit of detection (LOD) of 5 μg mL⁻¹.

  8. Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support

    NASA Astrophysics Data System (ADS)

    Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar

    This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model, is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using an MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.
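
    To make the annotation step concrete, here is a minimal sketch of basic-block timing annotation in the spirit described above; it is not the authors' generator, and the cycle estimates, clock rate, and instruction categories are invented for illustration. A real processor data model would also account for pipeline hazards, the memory hierarchy, and branch delays.

```python
# Minimal sketch of basic-block timing annotation (not the authors' tool).
# Each basic block gets a delay estimated from a toy processor model and
# is emitted with a SystemC-style wait() annotation.

CYCLE_ESTIMATES = {"alu": 1, "mul": 3, "load": 4, "store": 2, "branch": 2}
CLOCK_NS = 10  # hypothetical 100 MHz processor clock

def annotate(blocks):
    """blocks: list of (source_line, [instruction kinds]) per basic block."""
    out = []
    for src, instrs in blocks:
        cycles = sum(CYCLE_ESTIMATES[i] for i in instrs)
        out.append(src)
        out.append(f"wait({cycles * CLOCK_NS}, SC_NS);  // {cycles} cycles")
    return "\n".join(out)

print(annotate([
    ("x = a * b + c;", ["load", "load", "mul", "alu", "store"]),
    ("if (x > 0) y++;", ["load", "branch", "alu", "store"]),
]))
```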

  9. On the Impact of Execution Models: A Case Study in Computational Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram

    2015-05-25

    Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
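
    The gap between static scheduling and work stealing on irregular workloads can be illustrated with a toy simulation. The sketch below idealizes work stealing as a shared, least-loaded queue with zero overhead and uses synthetic task costs, so the numbers are illustrative only and unrelated to the paper's measurements.

```python
# Toy comparison of static scheduling vs. idealized work stealing for
# tasks of uneven cost (illustrative only).
import random

def makespan_static(tasks, workers):
    # Round-robin assignment fixed up front; idle workers cannot help others.
    loads = [0.0] * workers
    for i, t in enumerate(tasks):
        loads[i % workers] += t
    return max(loads)

def makespan_stealing(tasks, workers):
    # Idealization: tasks handed out largest-first to the least-loaded
    # worker, approximating a shared queue with negligible stealing overhead.
    loads = [0.0] * workers
    for t in sorted(tasks, reverse=True):
        loads[loads.index(min(loads))] += t
    return max(loads)

random.seed(0)
tasks = [random.expovariate(1.0) for _ in range(200)]  # skewed task costs
print("static  :", round(makespan_static(tasks, 8), 2))
print("stealing:", round(makespan_stealing(tasks, 8), 2))
```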

  10. Reports of planetary geology program, 1976 - 1977. [abstracts

    NASA Technical Reports Server (NTRS)

    Arvidson, R. (Compiler); Wahmann, R. (Compiler); Howard, J. H., III

    1977-01-01

    One hundred seventeen investigations undertaken in the NASA Planetary Geology Program in 1976-1977 are reported in abstract form. Topics discussed include solar system formation; planetary interiors; planetary evolution; asteroids, comets and moons; cratering; volcanic, eolian, fluvial and mass wasting processes; volatiles and the Martian regolith; mapping; and instrument development and techniques. An author index is provided.

  11. Query Processing for Probabilistic State Diagrams Describing Multiple Robot Navigation in an Indoor Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czejdo, Bogdan; Bhattacharya, Sambit; Ferragut, Erik M

    2012-01-01

    This paper describes the syntax and semantics of multi-level state diagrams to support the probabilistic behavior of cooperating robots. Techniques are presented to analyze these diagrams by querying combined robot behaviors. It is shown how to use state abstraction and transition abstraction to create, verify, and process large probabilistic state diagrams.
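
    A minimal sketch of the state-abstraction idea follows: concrete robot states are grouped into abstract states and their transition probabilities aggregated. The states, the grouping, and the averaging convention are hypothetical illustrations, not the paper's formalism.

```python
# Sketch of state abstraction for a probabilistic state diagram: concrete
# states are grouped into abstract states and transition probabilities are
# aggregated by averaging over each group (one common convention).
from collections import defaultdict

# Hypothetical robot states: P[s][t] = probability of moving from s to t.
P = {
    "hall_a": {"hall_b": 0.7, "room_1": 0.3},
    "hall_b": {"hall_a": 0.5, "room_2": 0.5},
    "room_1": {"hall_a": 1.0},
    "room_2": {"hall_b": 1.0},
}
groups = {"hall_a": "hallway", "hall_b": "hallway",
          "room_1": "room", "room_2": "room"}

def abstract(P, groups):
    agg = defaultdict(lambda: defaultdict(float))
    count = defaultdict(int)
    for s, row in P.items():
        count[groups[s]] += 1
        for t, p in row.items():
            agg[groups[s]][groups[t]] += p
    # Average so each abstract row still sums to 1.
    return {g: {h: p / count[g] for h, p in row.items()}
            for g, row in agg.items()}

print(abstract(P, groups))
# {'hallway': {'hallway': 0.6, 'room': 0.4}, 'room': {'hallway': 1.0}}
```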

  12. Composite use of numerical groundwater flow modeling and geoinformatics techniques for monitoring Indus Basin aquifer, Pakistan.

    PubMed

    Ahmad, Zulfiqar; Ashraf, Arshad; Fryar, Alan; Akhter, Gulraiz

    2011-02-01

    The integration of the Geographic Information System (GIS) with groundwater modeling and satellite remote sensing capabilities has provided an efficient way of analyzing and monitoring groundwater behavior and its associated land conditions. A 3-dimensional finite element model (Feflow) has been used for regional groundwater flow modeling of Upper Chaj Doab in the Indus Basin, Pakistan. The approach of using GIS techniques that partially fulfill the data requirements and define the parameters of existing hydrologic models was adopted. The numerical groundwater flow model is developed to configure the groundwater equipotential surface, hydraulic head gradient, and estimation of the groundwater budget of the aquifer. GIS is used for spatial database development, integration with remote sensing, and numerical groundwater flow modeling capabilities. The thematic layers of soils, land use, hydrology, infrastructure, and climate were developed using GIS. The Arcview GIS software is used as an additive tool to develop supportive data for numerical groundwater flow modeling and for integration and presentation of image processing and modeling results. The groundwater flow model was calibrated to simulate future changes in piezometric heads for the period 2006 to 2020. Different scenarios were developed to study the impact of extreme climatic conditions (drought/flood) and variable groundwater abstraction on the regional groundwater system. The model results indicated a significant response in the water table due to external influential factors. The developed model provides an effective tool for evaluating better management options for monitoring future groundwater development in the study area.

  13. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  14. A Strategy for Urban Astronomical Observatory Site Preservation: The Southern Arizona Example (Abstract)

    NASA Astrophysics Data System (ADS)

    Craine, E. R.; Craine, B. L.; Craine, P. R.; Craine, E. M.; Fouts, S.

    2014-12-01

    (Abstract only) Urbanized observatories are under financial pressures for numerous and complex reasons, including concerns that increasing sky brightness will continue to erode their scientific viability. The history of urbanized observatories is one of steady decline and divestiture. We argue that light at night (LAN) impacts of urban growth are inadequately understood, that current measurement techniques are incomplete in scope, and that both limit the effectiveness of mitigation programs. We give examples of these factors for Pima County, Arizona, and propose techniques and a program that could provide focus and power to mitigation efforts, and could extend the longevity of southern Arizona observatories.

  15. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  16. Radar studies of arctic ice and development of a real-time Arctic ice type identification system

    NASA Technical Reports Server (NTRS)

    Rouse, J. W., Jr.; Schell, J. A.; Permenter, J. A.

    1973-01-01

    Studies were conducted to develop a real-time Arctic ice type identification system. Data obtained by NASA Mission 126, conducted at Pt. Barrow, Alaska (Site 93) in April 1970, was analyzed in detail to more clearly define the major mechanisms at work affecting the radar energy illuminating a terrain cell of sea ice. General techniques for reduction of the scatterometer data to a form suitable for application of ice type decision criteria were investigated, and the electronic circuit requirements for implementation of these techniques were determined. Also, consideration of circuit requirements is extended to include the electronics necessary for analog programming of ice type decision algorithms. After completing the basic circuit designs, a laboratory model was constructed and a preliminary evaluation performed. Several system modifications for improved performance are suggested. (Modified author abstract)

  17. Factors controlling the evaporation of secondary organic aerosol from α‐pinene ozonolysis

    PubMed Central

    Pajunoja, Aki; Tikkanen, Olli‐Pekka; Buchholz, Angela; Faiola, Celia; Väisänen, Olli; Hao, Liqing; Kari, Eetu; Peräkylä, Otso; Garmash, Olga; Shiraiwa, Manabu; Ehn, Mikael; Lehtinen, Kari; Virtanen, Annele

    2017-01-01

    Secondary organic aerosols (SOA) form a major fraction of organic aerosols in the atmosphere. Knowledge of SOA properties that affect their dynamics in the atmosphere is needed for improving climate models. By combining experimental and modeling techniques, we investigated the factors controlling SOA evaporation under different humidity conditions. Our experiments support the conclusion of particle phase diffusivity limiting the evaporation under dry conditions. Viscosity of particles at dry conditions was estimated to increase several orders of magnitude during evaporation, up to 10⁹ Pa s. However, at atmospherically relevant relative humidity and time scales, our results show that diffusion limitations may have a minor effect on evaporation of the studied α‐pinene SOA particles. Based on previous studies and our model simulations, we suggest that, in warm environments dominated by biogenic emissions, the major uncertainty in models describing the SOA particle evaporation is related to the volatility of SOA constituents. PMID:28503004

  18. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
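
    The guiding role of the abstracted FSM can be sketched as a shortest-path search that produces a message prefix driving the implementation into a deep protocol state, from which a symbolic-execution engine would then explore paths. The three-state protocol below is invented for illustration, not taken from the paper.

```python
# Sketch of the model-guided idea: a finite state machine abstracted from a
# protocol yields message sequences that reach deep states; a symbolic
# executor would then be seeded with such a prefix.
from collections import deque

FSM = {  # (state, message) -> next state; hypothetical login protocol
    ("INIT", "HELLO"): "GREETED",
    ("GREETED", "AUTH"): "AUTHED",
    ("AUTHED", "CMD"): "AUTHED",
    ("AUTHED", "QUIT"): "INIT",
}

def sequence_to(target, start="INIT"):
    """Breadth-first search for the shortest message sequence reaching target."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for (s, msg), t in FSM.items():
            if s == state and t not in seen:
                seen.add(t)
                queue.append((t, path + [msg]))
    return None

# Seed the symbolic executor with a prefix that reaches the deep state.
print(sequence_to("AUTHED"))  # ['HELLO', 'AUTH']
```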

  19. Automated measurement of zebrafish larval movement

    PubMed Central

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-01-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry. PMID:21646414
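
    The published tool is a set of open-source MATLAB functions; purely as an illustration of the underlying idea, the following sketch quantifies movement by frame differencing, counting pixels that change between consecutive video frames. All names and thresholds here are hypothetical, not taken from the paper's code.

```python
# Hedged sketch of movement quantification by frame differencing
# (illustrative NumPy version; not the authors' MATLAB implementation).
import numpy as np

def movement_per_frame(frames, threshold=10):
    """frames: iterable of grayscale images as 2-D uint8 arrays.
    Returns the number of changed pixels between consecutive frames,
    a simple proxy for larval movement within one well."""
    scores, prev = [], None
    for f in frames:
        f = f.astype(np.int16)          # avoid uint8 wrap-around on subtraction
        if prev is not None:
            scores.append(int((np.abs(f - prev) > threshold).sum()))
        prev = f
    return scores

rng = np.random.default_rng(0)
fake_video = [rng.integers(0, 255, (64, 64), dtype=np.uint8) for _ in range(5)]
print(movement_per_frame(fake_video))
```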

  20. (abstract) Using an Inversion Algorithm to Retrieve Parameters and Monitor Changes over Forested Areas from SAR Data

    NASA Technical Reports Server (NTRS)

    Moghaddam, Mahta

    1995-01-01

    In this work, the application of an inversion algorithm based on a nonlinear optimization technique to retrieve forest parameters from multifrequency polarimetric SAR data is discussed. The approach discussed here allows for retrieving and monitoring changes in forest parameters in a quantitative and systematic fashion using SAR data. The parameters to be inverted directly from the data are the electromagnetic scattering properties of the forest components, such as their dielectric constants and size characteristics. Once these are known, attributes such as canopy moisture content can be obtained, which are useful in ecosystem models.
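
    The retrieval step amounts to fitting forward-model parameters to the observed data by nonlinear least squares. The sketch below uses a deliberately toy forward model with invented frequencies and values; a real inversion would use a physics-based polarimetric backscatter model of the forest canopy.

```python
# Sketch of parameter retrieval by nonlinear optimisation (toy forward model).
import numpy as np
from scipy.optimize import least_squares

def forward(params, freqs):
    dielectric, size = params
    # Hypothetical smooth dependence of backscatter on the two parameters.
    return dielectric * np.exp(-size * freqs) + 0.1 * size

freqs = np.array([1.25, 5.3, 9.6])        # e.g. L-, C-, X-band (GHz), illustrative
true = np.array([3.0, 0.4])
observed = forward(true, freqs) + np.random.default_rng(0).normal(0, 0.01, 3)

fit = least_squares(lambda p: forward(p, freqs) - observed,
                    x0=[1.0, 0.1], bounds=([0, 0], [20, 5]))
print(fit.x)  # recovered (dielectric, size), close to the true values
```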

  1. A Regularized Neural Net Approach for Retrieval of Atmospheric and Surface Temperatures with the IASI Instrument

    NASA Technical Reports Server (NTRS)

    Aires, F.; Chedin, A.; Scott, N. A.; Rossow, W. B.; Hansen, James E. (Technical Monitor)

    2001-01-01

    In this paper, a fast atmospheric and surface temperature retrieval algorithm is developed for the high resolution Infrared Atmospheric Sounding Interferometer (IASI) space-borne instrument. This algorithm is constructed on the basis of a neural network technique that has been regularized by introduction of a priori information. The performance of the resulting fast and accurate inverse radiative transfer model is presented for a large diversified dataset of radiosonde atmospheres including rare events. Two configurations are considered: a tropical-airmass specialized scheme and an all-air-masses scheme.

  2. A new modified speculum guided single nostril technique for endoscopic transnasal transsphenoidal surgery: an analysis of nasal complications.

    PubMed

    Waran, Vicknes; Tang, Ing Ping; Karuppiah, Ravindran; Abd Kadir, Khairul Azmi; Chandran, Hari; Muthusamy, Kalai Arasu; Prepageran, Narayanan

    2013-12-01

    The endoscopic transnasal, transsphenoidal surgical technique for pituitary tumour excision has generally been regarded as a less invasive technique, with approaches ranging from single nostril to dual nostril techniques. We propose a single nostril technique using a modified nasal speculum as a preferred technique. We initially reviewed 25 patients who underwent pituitary tumour excision, via endoscopic transnasal transsphenoidal surgery, using this new modified speculum-guided single nostril technique. The results show a shorter operation time with reduced intra- and post-operative nasal soft tissue injuries and complications.

  3. TOXICO-CHEMINFORMATICS AND QSAR MODELING OF ...

    EPA Pesticide Factsheets

    This abstract concludes that QSAR approaches combined with toxico-chemoinformatics descriptors can enhance predictive toxicology models.

  4. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    NASA Astrophysics Data System (ADS)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  5. Agents Technology Research

    DTIC Science & Technology

    2010-02-01

    multi-agent reputation management. State abstraction is a technique used to allow machine learning technologies to cope with problems that have large...state abstraction process to enable reinforcement learning in domains with large state spaces. State abstraction is vital to machine learning ...across a collective of independent platforms. These individual elements, often referred to as agents in the machine learning community, should exhibit both

  6. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA to SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for abstracting sufficiently smaller models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  7. Towards Clean Diesel Engines. Second Symposium. Book of Abstracts.

    DTIC Science & Technology

    1998-04-06

    and Mie-scattering imaging and EXCIPLEX technique, based on a fluorescence system. This last technique, even if it is able to distinguish...set of experimental data, obtained by a collaborative effort with researchers at the RWTH Aachen, is presented. Laser-induced exciplex fluorescence

  8. Training Methodology. Part 3: Instructional Methods and Techniques. An Annotated Bibliography. (Revised).

    ERIC Educational Resources Information Center

    Health Services and Mental Health Administration (DHEW), Bethesda, MD.

    The revised annotated bibliography contains abstracts of 345 documents published between January 1960 and March 1968 on specific instructional methods and techniques for groups and individuals. Among methods included are: job instruction, apprenticeship, demonstration, coaching, internship, correspondence and independent study, programed…

  9. Splatterplots: overcoming overdraw in scatter plots.

    PubMed

    Mayorga, Adrian; Gleicher, Michael

    2013-09-01

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the data set as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.
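
    The core abstraction step can be sketched with a simple density threshold in screen space: dense cells are abstracted into a filled region while a bounded sample of the remaining points is drawn explicitly. This toy NumPy version (invented data and thresholds) only illustrates the idea; the paper's implementation uses kernel density estimation and runs on the GPU.

```python
# Sketch of density-bounded abstraction in the Splatterplots spirit.
import numpy as np

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.5, (5000, 2)),    # dense cluster
                 rng.uniform(-4, 4, (100, 2))])    # sparse background

# Screen-space density on a coarse 40x40 grid.
H, xe, ye = np.histogram2d(pts[:, 0], pts[:, 1], bins=40,
                           range=[[-4, 4], [-4, 4]])
ix = np.clip(np.digitize(pts[:, 0], xe) - 1, 0, 39)
iy = np.clip(np.digitize(pts[:, 1], ye) - 1, 0, 39)
dense = H[ix, iy] > 25          # density threshold per cell

inside = pts[dense]             # would be rendered as a smooth filled shape
outliers = pts[~dense]
sampled = outliers[rng.choice(len(outliers), size=min(50, len(outliers)),
                              replace=False)]
print(f"{len(inside)} points abstracted into a dense region, "
      f"{len(sampled)} outliers drawn explicitly")
```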

  10. Splatterplots: Overcoming Overdraw in Scatter Plots

    PubMed Central

    Mayorga, Adrian; Gleicher, Michael

    2014-01-01

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the dataset as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen. PMID:23846097

  11. Splatterplots: Overcoming Overdraw in Scatter Plots.

    PubMed

    Mayorga, Adrian; Gleicher, Michael

    2013-03-20

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the dataset as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.

  12. Crisis Management Systems: A Case Study for Aspect-Oriented Modeling

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Guelfi, Nicolas; Mustafiz, Sadaf

    The intent of this document is to define a common case study for the aspect-oriented modeling research community. The domain of the case study is crisis management systems, i.e., systems that help in identifying, assessing, and handling a crisis situation by orchestrating the communication between all parties involved in handling the crisis, by allocating and managing resources, and by providing access to relevant crisis-related information to authorized users. This document contains informal requirements of crisis management systems (CMSs) in general, a feature model for a CMS product line, use case models for a car crash CMS (CCCMS), a domain model for the CCCMS, an informal physical architecture description of the CCCMS, as well as some design models of a possible object-oriented implementation of parts of the CCCMS backend. AOM researchers who want to demonstrate the power of their AOM approach or technique can hence apply the approach at the most appropriate level of abstraction.

  13. Predicting Seawater Intrusion in Coastal Groundwater Boreholes Using Self-Potential Data

    NASA Astrophysics Data System (ADS)

    Graham, M.; MacAllister, D. J.; Jackson, M.; Vinogradov, J.; Butler, A. P.

    2017-12-01

    Many coastal groundwater abstraction wells are under threat from seawater intrusion: this is exacerbated in summer by low water tables and increased abstraction. Existing hydrochemistry or geophysical techniques often fail to predict the timing of intrusion events. We investigate whether the presence and transport of seawater can influence self-potentials (SPs) measured within groundwater boreholes, with the aim of using SP monitoring to provide early warning of saline intrusion. SP data collection: SP data were collected from a coastal groundwater borehole and an inland borehole (> 60 km from the coast) in the Seaford Chalk of southern England. The SP gradient in the inland borehole was approximately 0.05 mV/m, while that in the coastal borehole varied from 0.16-0.26 mV/m throughout the monitoring period. Spectral analysis showed that semi-diurnal fluctuations in the SP gradient were several orders of magnitude higher at the coast than inland, indicating a strong influence from oceanic tides. A characteristic decrease in the gradient, or precursor, was observed in the coastal borehole several days prior to seawater intrusion. Modelling results: Hydrodynamic transport and geoelectric modelling suggest that observed pressure changes (associated with the streaming potential) are insufficient to explain either the magnitude of the coastal SP gradient or the semi-diurnal SP fluctuations. By contrast, a model of the exclusion-diffusion potential closely matches these observations and produces a precursor similar to that observed in the field. Sensitivity analysis suggests that both a sharp saline front and spatial variations in the exclusion efficiency arising from aquifer heterogeneities are necessary to explain the SP gradient observed in the coastal borehole. The presence of the precursor in the model depends also on the presence and depth of fractures near the base of the borehole. Conclusions: Our results indicate that SP monitoring, combined with hydrodynamic transport and geoelectric modelling, holds considerable promise as an early warning device for seawater intrusion. We now aim to refine our understanding of the technique by applying it to a range of aquifer types.

  14. Teaching Information Retrieval Using Telediscussion Techniques.

    ERIC Educational Resources Information Center

    Heiliger, Edward M.

    This paper concerns an experiment in teaching a graduate seminar on Information Retrieval using telediscussion techniques. Outstanding persons from Project INTREX, MEDLARS, Chemical Abstracts, the University of Georgia, the SUNY biomedical Network, AEC, NASA, and DDC gave hour-long telelectures. A Conference Telephone Set was used with success.…

  15. User-Extensible Graphics Using Abstract Structure,

    DTIC Science & Technology

    1987-08-01

    Flex. The Algol68 model of the graphical abstract structure. The creation of a PictureDefinition. The making of a picture from a PictureDefinition. ...data together with the operations that can be performed on that data... The Algol68 model of the graphical abstract structure: Every

  16. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction; nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence and concern of bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
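
    As an illustration of the technique being reviewed, the sketch below runs a simple probabilistic bias analysis for nondifferential exposure misclassification: sensitivity and specificity are drawn from explicitly stated distributions and the observed 2x2 table is back-corrected on each iteration. The counts and parameter ranges are hypothetical, not taken from any reviewed study.

```python
# Sketch of a probabilistic bias analysis for exposure misclassification.
import numpy as np

rng = np.random.default_rng(0)
a, b, c, d = 200, 800, 100, 900   # hypothetical exposed/unexposed cases, controls

estimates = []
for _ in range(5000):
    se = rng.uniform(0.80, 0.95)   # stated plausible range for sensitivity
    sp = rng.uniform(0.90, 0.99)   # ... and for specificity
    # Standard back-correction, same se/sp assumed in cases and controls.
    A = (a - (1 - sp) * (a + b)) / (se + sp - 1)
    C = (c - (1 - sp) * (c + d)) / (se + sp - 1)
    B, D = (a + b) - A, (c + d) - C
    if min(A, B, C, D) > 0:                  # discard impossible corrections
        estimates.append((A * D) / (B * C))  # bias-adjusted odds ratio

print(np.percentile(estimates, [2.5, 50, 97.5]))  # simulation interval
```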

  17. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design-trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  18. Facing the challenges of multiscale modelling of bacterial and fungal pathogen–host interactions

    PubMed Central

    Schleicher, Jana; Conrad, Theresia; Gustafsson, Mika; Cedersund, Gunnar; Guthke, Reinhard

    2017-01-01

    Recent and rapidly evolving progress on high-throughput measurement techniques and computational performance has led to the emergence of new disciplines, such as systems medicine and translational systems biology. At the core of these disciplines lies the desire to produce multiscale models: mathematical models that integrate multiple scales of biological organization, ranging from molecular, cellular and tissue models to organ, whole-organism and population scale models. Using such models, hypotheses can systematically be tested. In this review, we present state-of-the-art multiscale modelling of bacterial and fungal infections, considering both the pathogen and host as well as their interaction. Multiscale modelling of the interactions of bacteria, especially Mycobacterium tuberculosis, with the human host is quite advanced. In contrast, models for fungal infections are still in their infancy, in particular regarding infections with the most important human pathogenic fungi, Candida albicans and Aspergillus fumigatus. We reflect on the current availability of computational approaches for multiscale modelling of host–pathogen interactions and point out current challenges. Finally, we provide an outlook for future requirements of multiscale modelling. PMID:26857943

  19. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps in taking pre-emptive steps to counter an adversary's actions. An interactive Visual Analytic (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. The need for identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require a method for their filtration before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from sensor-generated semantic annotated messages. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a Temporal Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. In this paper, we present a new visual analytic tool for testing and evaluating group activities detected under this control scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
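
    The DTW component reduces to a classic dynamic program; a minimal scalar version is sketched below (illustrative only; the paper applies warping to semantic frames derived from an ontology rather than raw numbers).

```python
# Minimal dynamic time warping (DTW) distance between two sequences.
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of insertion, deletion, or match at each cell.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two activity signatures that differ only in pacing align at zero cost.
print(dtw([0, 1, 2, 3, 2, 1], [0, 1, 1, 2, 3, 3, 2, 1]))  # -> 0.0
```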

  20. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience.

    PubMed

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
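
    As a rough sketch of the kind of optimisation BluePyOpt standardizes, and explicitly not its API, the following generic evolutionary algorithm fits a parameter vector to a toy fitness function; BluePyOpt itself wraps existing open-source optimisers behind reusable evaluator abstractions.

```python
# Generic evolutionary-algorithm sketch of data-driven parameter fitting
# (toy fitness function; NOT the BluePyOpt interface).
import random

def fitness(params):
    # Hypothetical stand-in for comparing model output with experiments.
    target = (2.0, -1.0, 0.5)
    return sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(pop_size=50, gens=100, n=3, sigma=0.1):
    pop = [[random.uniform(-5, 5) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                   # truncation selection
        children = [[g + random.gauss(0, sigma) for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]  # Gaussian mutation
        pop = parents + children
    return min(pop, key=fitness)

random.seed(0)
print([round(g, 2) for g in evolve()])  # approaches (2.0, -1.0, 0.5)
```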

  1. Nanorods, nanospheres, nanocubes: Synthesis, characterization and catalytic activity of nanoferrites of Mn, Co, Ni, Part-89

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Supriya; Srivastava, Pratibha; Singh, Gurdip, E-mail: gsingh4us@yahoo.com

    2013-02-15

    Graphical abstract: Prepared nanoferrites were characterized by FE-SEM and bright field TEM micrographs. The catalytic effect of these nanoferrites was evaluated on the thermal decomposition of ammonium perchlorate using TG and TG–DSC techniques. The kinetics of thermal decomposition of AP was evaluated using isothermal TG data by model fitting as well as isoconversional method. Highlights: ► Synthesis of ferrite nanostructures (∼20.0 nm) by wet-chemical method under different synthetic conditions. ► Characterization using XRD, FE-SEM, EDS, TEM, HRTEM and SAED pattern. ► Catalytic activity of ferrite nanostructures on AP thermal decomposition by thermal techniques. ► Burning rate measurements of CSPs with ferrite nanostructures. ► Kinetics of thermal decomposition of AP + nanoferrites. -- Abstract: In this paper, the nanoferrites of Mn, Co and Ni were synthesized by wet chemical method and characterized by X-ray diffraction (XRD), field emission scanning electron microscopy (FE-SEM), energy dispersive X-ray spectra (EDS), transmission electron microscopy (TEM) and high resolution transmission electron microscopy (HR-TEM). Its catalytic activity was investigated on the thermal decomposition of ammonium perchlorate (AP) and composite solid propellants (CSPs) using thermogravimetry (TG), TG coupled with differential scanning calorimetry (TG–DSC) and ignition delay measurements. Kinetics of thermal decomposition of AP + nanoferrites have also been investigated using isoconversional and model fitting approaches which have been applied to data for isothermal TG decomposition. The burning rate of CSPs was considerably enhanced by these nanoferrites. Addition of nanoferrites to AP led to shifting of the high temperature decomposition peak toward lower temperature. All these studies reveal that ferrite nanorods show the best catalytic activity, superior to that of nanospheres and nanocubes.

  2. Preparation of water soluble L-arginine capped CdSe/ZnS QDs and their interaction with synthetic DNA: Picosecond-resolved FRET study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giri, Anupam; Goswami, Nirmal; Lemmens, Peter

    2012-08-15

    Graphical abstract: Förster resonance energy transfer (FRET) studies on the interaction of water soluble arginine-capped CdSe/ZnS QDs with ethidium bromide (EB) labeled synthetic dodecamer DNA. Highlights: ► We have solubilized CdSe/ZnS QD in water replacing their TOPO ligand by L-arginine. ► We have studied arginine@QD–DNA interaction using FRET technique. ► Arginine@QDs act as energy donor and ethidium bromide-DNA acts as energy acceptor. ► We have applied a kinetic model to understand the kinetics of energy transfer. ► Circular dichroism studies revealed negligible perturbation in the DNA B-form in the arg@QD-DNA complex. -- Abstract: We have exchanged the TOPO (trioctylphosphine oxide) ligand of CdSe/ZnS core/shell quantum dots (QDs) with an amino acid L-arginine (Arg) at the toluene/water interface and eventually rendered the QDs from toluene to aqueous phase. We have studied the interaction of the water soluble Arg-capped QDs (energy donor) with ethidium (EB) labeled synthetic dodecamer DNA (energy acceptor) using a picosecond-resolved Förster resonance energy transfer (FRET) technique. Furthermore, we have applied a model developed by M. Tachiya to understand the kinetics of energy transfer and the distribution of acceptor (EB-DNA) molecules around the donor QDs. Circular dichroism (CD) studies revealed a negligible perturbation in the native B-form structure of the DNA upon interaction with Arg-capped QDs. The melting and the rehybridization pathways of the DNA attached to the QDs have been monitored by the CD which reveals hydrogen bonding is the associative mechanism for interaction between Arg-capped QDs and DNA.

  3. Multi-Level Reduced Order Modeling Equipped with Probabilistic Error Bounds

    NASA Astrophysics Data System (ADS)

    Abdo, Mohammad Gamal Mohammad Mostafa

    This thesis develops robust reduced order modeling (ROM) techniques to achieve the efficiency needed to render feasible the use of high fidelity tools for routine engineering analyses. Markedly different from state-of-the-art ROM techniques, our work focuses only on techniques that can quantify the credibility of the reduction, measured by reduction errors that are upper-bounded over the envisaged range of ROM model application. Our objective is two-fold. First, further developments of ROM techniques are proposed when conventional ROM techniques are too taxing to be computationally practical. This is achieved via a multi-level ROM methodology designed to take advantage of the multi-scale modeling strategy typically employed for computationally taxing models such as those associated with the modeling of nuclear reactor behavior. Second, the discrepancies between the original model and ROM model predictions over the full range of model application conditions are upper-bounded in a probabilistic sense with high probability. ROM techniques may be classified into two broad categories: surrogate construction techniques and dimensionality reduction techniques, with the latter being the primary focus of this work. We focus on dimensionality reduction because it offers a rigorous approach by which reduction errors can be quantified via upper bounds that are met in a probabilistic sense. Surrogate techniques typically rely on fitting a parametric model form to the original model at a number of training points, with the residual of the fit taken as a measure of the prediction accuracy of the surrogate. This approach, however, does not generally guarantee that the surrogate model predictions at points not included in the training process will be bounded by the error estimated from the fitting residual. Dimensionality reduction techniques, however, employ a different philosophy to render the reduction, wherein randomized snapshots of the model variables, such as the model parameters, responses, or state variables, are projected onto lower dimensional subspaces, referred to as the "active subspaces", which are selected to capture a user-defined portion of the snapshots' variations. Once determined, the ROM model application involves constraining the variables to the active subspaces. In doing so, the contribution from the variables' discarded components can be estimated using a fundamental theorem from random matrix theory which has its roots in Dixon's theory, developed in 1983. This theory was initially presented for linear matrix operators. The thesis extends this theorem's results to allow reduction of general smooth nonlinear operators. The result is an approach by which the adequacy of a given active subspace, determined using a given set of snapshots generated either with the full high fidelity model or with other models of lower fidelity, can be assessed, providing insight to the analyst on the type of snapshots required to reach a reduction that satisfies user-defined preset tolerance limits on the reduction errors. Reactor physics calculations are employed as a test bed for the proposed developments. The focus will be on reducing the effective dimensionality of the various data streams such as the cross-section data and the neutron flux.
The developed methods will be applied to representative assembly level calculations, where the sizes of the cross-section and flux spaces are typically large, as required by downstream core calculations, in order to capture the broad range of conditions expected during reactor operation. (Abstract shortened by ProQuest.)
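
    The snapshot-projection idea can be sketched in a few lines: collect randomized snapshots, pick an active subspace capturing a user-defined share of their variation, and use the projection residual to gauge the contribution of the discarded components. The data, tolerance, and dimensions below are invented, and the thesis's probabilistic error bound is not reproduced here.

```python
# Sketch of snapshot-based dimensionality reduction via an "active subspace".
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical snapshots: 500-dimensional states driven by 5 latent modes.
latent = rng.normal(size=(5, 100))
snapshots = rng.normal(size=(500, 5)) @ latent + 0.01 * rng.normal(size=(500, 100))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1   # user-defined tolerance on variation
basis = U[:, :r]                              # active subspace

# ROM application constrains states to the subspace; the residual estimates
# the contribution of the discarded components.
x = snapshots[:, 0]
residual = np.linalg.norm(x - basis @ (basis.T @ x)) / np.linalg.norm(x)
print(f"rank {r} subspace, relative residual {residual:.2e}")
```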

  4. Abstracting of suspected illegal land use in urban areas using case-based classification of remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Fulong; Wang, Chao; Yang, Chengyun; Zhang, Hong; Wu, Fan; Lin, Wenjuan; Zhang, Bo

    2008-11-01

    This paper proposed a method that uses a case-based classification of remote sensing images and applied this method to abstract the information of suspected illegal land use in urban areas. Because the cases used for imagery classification are discrete, the proposed method handled the oscillation of spectrum or backscatter within the same land use category, and it not only overcame the deficiency of maximum likelihood classification (the prior probability of land use could not be obtained) but also inherited the advantages of the knowledge-based classification system, such as artificial intelligence and automatic characteristics. Consequently, the proposed method could perform the classification better. The researchers then used an object-oriented technique for shadow removal in highly dense city zones. With multi-temporal SPOT 5 images whose resolution was 2.5×2.5 meters, the researchers found that the method can abstract suspected illegal land use information in urban areas using a post-classification comparison technique.

  5. Proceedings of the second United Nations symposium on the development and use of geothermal resources. Volume 2 (in several languages)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Separate abstracts were prepared for 69 of the 77 papers presented. The remaining 7 papers were previously abstracted in ERA and can be found in the report number index under CONF-750525. The papers presented are under sections entitled geophysical techniques in exploration, environmental factors and waste disposal, and drilling technology. (WHK)

  6. A NOVEL SMALL-RATIO RELATIVE-RATE TECHNIQUE FOR MEASURING OH FORMATION YIELDS FROM THE REACTIONS OF O3 WITH ALKENES IN THE GAS PHASE, AND ITS APPLICATION TO THE REACTIONS OF ETHENE AND PROPENE. (R826236)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  7. Automatic specification of reliability models for fault-tolerant computers

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1993-01-01

    The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.
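
    The kind of Markov reliability model ARM is meant to specify can be illustrated with a tiny standby-redundancy example; the failure rate, coverage probability, and mission time below are invented, and the model omits repair and transient faults for brevity.

```python
# Sketch of a Markov reliability model: one active unit with a standby
# spare and imperfect coverage (illustrative rates only).
import numpy as np
from scipy.linalg import expm

LAMBDA, C = 1e-3, 0.95   # failure rate (per hour), coverage probability
# States: 0 = both good, 1 = spare in use, 2 = system failed (absorbing).
Q = np.array([
    [-LAMBDA,  C * LAMBDA, (1 - C) * LAMBDA],
    [0.0,     -LAMBDA,      LAMBDA],
    [0.0,      0.0,         0.0],
])

t = 1000.0                                  # mission time in hours
p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t) # state probabilities at time t
print(f"reliability at {t:.0f} h: {1 - p[2]:.4f}")
```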

  8. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
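
    As a hedged illustration of what an SPN adds to a functional model, the sketch below simulates a two-place net whose enabled transitions race with exponentially distributed delays; the net, rates, and horizon are invented for the example:

    ```python
    import random

    # Tiny SPN: a machine cycles between 'up' and 'down'.
    # marking: tokens per place; transitions: (inputs, outputs, rate).
    marking = {"up": 1, "down": 0}
    transitions = {
        "fail":   ({"up": 1},   {"down": 1}, 0.01),
        "repair": ({"down": 1}, {"up": 1},   0.10),
    }

    def enabled(m, inputs):
        return all(m[p] >= n for p, n in inputs.items())

    t, horizon = 0.0, 1000.0
    while t < horizon:
        # Race semantics: sample an exponential delay for each enabled transition.
        samples = [(random.expovariate(rate), name)
                   for name, (ins, outs, rate) in transitions.items()
                   if enabled(marking, ins)]
        if not samples:
            break
        delay, winner = min(samples)
        t += delay
        ins, outs, _ = transitions[winner]
        for p, n in ins.items():
            marking[p] -= n
        for p, n in outs.items():
            marking[p] += n
    print(f"final marking after {t:.1f} h:", marking)
    ```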

  10. Multilevel and Hybrid Architecture for Device Abstraction and Context Information Management in Smart Home Environments

    NASA Astrophysics Data System (ADS)

    Peláez, Víctor; González, Roberto; San Martín, Luis Ángel; Campos, Antonio; Lobato, Vanesa

    Hardware device management and context information acquisition and abstraction are key factors in developing the ambient intelligence paradigm in smart homes. This work presents an architecture that addresses these two problems and provides a usable framework for developing applications easily. In contrast to other proposals, this work specifically addresses performance issues. Results show that the execution performance of the developed prototype is suitable for deployment in a real environment. In addition, the modular design of the system allows the user to develop applications using different techniques and different levels of abstraction.
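
    A generic sketch (class and method names are hypothetical, not the paper's framework) of layered device abstraction: a low-level driver interface below, and a context layer above that turns raw readings into events applications can consume:

    ```python
    from abc import ABC, abstractmethod

    class Sensor(ABC):
        """Low-level device abstraction layer."""
        @abstractmethod
        def read(self) -> float: ...

    class ZigBeeThermometer(Sensor):
        def read(self) -> float:
            return 28.5  # a real driver would query the hardware here

    class ContextLayer:
        """Higher abstraction level: turns raw readings into context events."""
        def __init__(self, sensor: Sensor, threshold: float = 26.0):
            self.sensor, self.threshold = sensor, threshold
        def situation(self) -> str:
            return "too_warm" if self.sensor.read() > self.threshold else "comfortable"

    print(ContextLayer(ZigBeeThermometer()).situation())  # applications see events only
    ```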

  11. Remote sensing of natural resources: Quarterly literature review

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A quarterly review of technical literature concerning remote sensing techniques is presented. The format contains indexed and abstracted materials with emphasis on data gathering techniques performed or obtained remotely from space, aircraft, or ground-based stations. Remote sensor applications including the remote sensing of natural resources are presented.

  12. "What is relevant in a text document?": An interpretable machine learning approach

    PubMed Central

    Arras, Leila; Horn, Franziska; Montavon, Grégoire; Müller, Klaus-Robert

    2017-01-01

    Text documents can be described by a number of abstract concepts such as semantic category, writing style, or sentiment. Machine learning (ML) models have been trained to automatically map documents to these abstract concepts, allowing to annotate very large text collections, more than could be processed by a human in a lifetime. Besides predicting the text’s category very accurately, it is also highly desirable to understand how and why the categorization process takes place. In this paper, we demonstrate that such understanding can be achieved by tracing the classification decision back to individual words using layer-wise relevance propagation (LRP), a recently developed technique for explaining predictions of complex non-linear classifiers. We train two word-based ML models, a convolutional neural network (CNN) and a bag-of-words SVM classifier, on a topic categorization task and adapt the LRP method to decompose the predictions of these models onto words. Resulting scores indicate how much individual words contribute to the overall classification decision. This enables one to distill relevant information from text documents without an explicit semantic information extraction step. We further use the word-wise relevance scores for generating novel vector-based document representations which capture semantic information. Based on these document vectors, we introduce a measure of model explanatory power and show that, although the SVM and CNN models perform similarly in terms of classification accuracy, the latter exhibits a higher level of explainability which makes it more comprehensible for humans and potentially more useful for other applications. PMID:28800619
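
    For intuition, in a purely linear model the word-level decomposition LRP computes reduces to the product of weight and input, so the relevance scores sum to the pre-bias prediction. A toy sketch with an invented vocabulary and weights (not the paper's CNN or SVM):

    ```python
    import numpy as np

    vocab = ["stock", "market", "goal", "match"]
    w = np.array([1.2, 0.8, -1.0, -1.5])   # hypothetical weights of a linear classifier
    b = 0.1
    x = np.array([2.0, 1.0, 0.0, 1.0])     # term counts for one document

    score = w @ x + b                       # classifier output
    relevance = w * x                       # per-word relevance; sums to score - b
    for word, r in sorted(zip(vocab, relevance), key=lambda p: -abs(p[1])):
        print(f"{word:8s} {r:+.2f}")
    ```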

  13. Tracer transport in soils and shallow groundwater: model abstraction with modern tools

    USDA-ARS?s Scientific Manuscript database

    Vadose zone controls contaminant transport from the surface to groundwater, and modeling transport in vadose zone has become a burgeoning field. Exceedingly complex models of subsurface contaminant transport are often inefficient. Model abstraction is the methodology for reducing the complexity of a...

  14. Improving data retrieval quality: Evidence based medicine perspective.

    PubMed

    Kamalov, M; Dobrynin, V; Balykina, J; Kolbin, A; Verbitskaya, E; Kasimova, M

    2015-01-01

    The actively developing approach in modern medicine is one focused on the principles of evidence-based medicine, which requires assessing the quality and reliability of studies. However, even studies at the first level of evidence may contain errors in randomized controlled trials (RCTs). One solution to this problem is the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system. Studies in both medicine and information retrieval are being conducted to develop search engines for the MEDLINE database [1]; combined summarization and information-retrieval techniques aimed at finding the best medication based on levels of evidence are also being developed [2]. Given the relevance of and demand for such studies, development of a search engine for the MEDLINE database was started at Saint-Petersburg State University with the support of Pavlov First Saint-Petersburg State Medical University and the Tashkent Institute of Postgraduate Medical Education. The novelty and value of the proposed system lie in its method of ranking relevant abstracts; it is intended that the system will rank studies by level of evidence and apply GRADE criteria for system evaluation. The task falls within the domains of information retrieval and machine learning. Building on the results of previous work [3], whose main goal was to cluster MEDLINE abstracts by subtype of medical intervention, a set of clustering algorithms was selected: K-means, K-means++, and EM from the sklearn (http://scikit-learn.org) and WEKA (http://www.cs.waikato.ac.nz/~ml/weka/) libraries, together with Latent Semantic Analysis (LSA) [4] retaining the first 210 factors and the bag-of-words model [5] to represent the clustered documents. For abstract classification, several algorithms were tested, including Complement Naive Bayes [6], Sequential Minimal Optimization (SMO) [7], and a non-linear SVM from the WEKA library. The first step of the study was to mark up MEDLINE article abstracts as containing or not containing a medical intervention; for this purpose, a web crawler from earlier work [8] was modified. The next step was to evaluate the clustering algorithms on the marked-up abstracts. Clustering the abstracts into two groups with LSA (first 210 factors) gave the following results:
    - K-means: Purity = 0.5598, Normalized Entropy = 0.5994
    - K-means++: Purity = 0.6743, Normalized Entropy = 0.4996
    - EM: Purity = 0.5443, Normalized Entropy = 0.6344
    With the bag-of-words model:
    - K-means: Purity = 0.5134, Normalized Entropy = 0.6254
    - K-means++: Purity = 0.5645, Normalized Entropy = 0.5299
    - EM: Purity = 0.5247, Normalized Entropy = 0.6345
    Studies containing a medical intervention were then classified by subtype of medical intervention, with abstracts represented as a bag-of-words model with stop words removed:
    - Complement Naive Bayes: macro F-measure = 0.6934, micro F-measure = 0.7234
    - Sequential Minimal Optimization: macro F-measure = 0.6543, micro F-measure = 0.7042
    - Non-linear SVM: macro F-measure = 0.6835, micro F-measure = 0.7642
    Based on these computational experiments, the best clustering of abstracts by presence of a medical intervention was obtained using the K-means++ algorithm together with LSA (first 210 factors). The quality of classification by intervention subtype improved on existing results [8] using the non-linear SVM with the bag-of-words model and stop-word removal. The clustering results obtained in this study will help group abstracts by level of evidence, using the classification by intervention subtype, and will make it possible to extract information from abstracts on specific types of interventions.
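
    A minimal sketch of this kind of pipeline (toy corpus and hyperparameters invented; the study used MEDLINE abstracts, 210 LSA factors, and WEKA as well as sklearn implementations):

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import KMeans

    docs = ["drug reduced blood pressure in trial",
            "surgical intervention improved outcomes",
            "no intervention was administered to controls",
            "placebo group received no treatment"]
    labels = np.array([1, 1, 0, 0])  # 1 = contains a medical intervention

    X = TfidfVectorizer(stop_words="english").fit_transform(docs)
    X_lsa = TruncatedSVD(n_components=2).fit_transform(X)  # study used 210 factors
    pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_lsa)

    def purity(y_true, y_pred):
        # Each cluster is credited with its majority class.
        return sum(max(np.sum((y_pred == c) & (y_true == k)) for k in set(y_true))
                   for c in set(y_pred)) / len(y_true)

    print("purity =", purity(labels, pred))
    ```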

  15. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
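
    For intuition, here is a generic Horn-clause encoding (not SeaHorn's exact intermediate form) of the loop x := 0; while x < n do x := x + 1; assert x == n, with an unknown inductive invariant Inv:

    ```latex
    \begin{aligned}
    x = 0 \land n \ge 0 &\Rightarrow \mathit{Inv}(x, n) \\
    \mathit{Inv}(x, n) \land x < n &\Rightarrow \mathit{Inv}(x + 1, n) \\
    \mathit{Inv}(x, n) \land x \ge n \land x \ne n &\Rightarrow \mathit{false}
    \end{aligned}
    ```

    The clauses are satisfiable, for example with Inv(x, n) defined as x <= n, so the assertion holds; a Horn-clause solver searches for such an interpretation.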

  16. An Integrated Planning Representation Using Macros, Abstractions, and Cases

    NASA Technical Reports Server (NTRS)

    Baltes, Jacky; MacDonald, Bruce

    1992-01-01

    Planning will be an essential part of future autonomous robots and integrated intelligent systems. This paper focuses on learning problem-solving knowledge in planning systems. The system is based on a common representation for macros, abstractions, and cases; therefore, it is able to exploit both classical and case-based techniques. The general operators in a successful plan derivation are assessed for their potential usefulness, and some are stored. The feasibility of this approach was studied through the implementation of a learning system for abstraction. New macros are motivated by trying to improve the operator set. One heuristic used to improve the operator set is generating operators with more general preconditions than existing ones. This heuristic leads naturally to abstraction hierarchies. This investigation showed promising results on the Towers of Hanoi problem. The paper concludes by describing methods for learning other problem-solving knowledge. This knowledge can be represented by allowing operators at different levels of abstraction in a refinement.

  17. Conceptual FOM design tool

    NASA Astrophysics Data System (ADS)

    Krause, Lee S.; Burns, Carla L.

    2000-06-01

    This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA-compliant models by applying proven abstraction techniques to the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both text and graphical form and export a minimal FOM. The ability of domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.

  18. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking in searching for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.
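
    As a generic illustration of using exhaustive state exploration to solve a planning problem (not ANMLite or SAL syntax; the toy logistics domain below is invented), actions can be encoded as a transition function and a goal state found by breadth-first search:

    ```python
    from collections import deque

    # Toy planning problem: move a package from A to B with a truck.
    # State: (truck_location, package_location).
    def actions(state):
        truck, pkg = state
        yield ("drive", ("B" if truck == "A" else "A", pkg))
        if pkg == truck:                       # load and drive together
            yield ("carry", ("B" if truck == "A" else "A",) * 2)

    def plan(init, goal):
        frontier, parent = deque([init]), {init: None}
        while frontier:
            s = frontier.popleft()
            if s == goal:
                steps = []
                while parent[s]:
                    s, a = parent[s]
                    steps.append(a)
                return steps[::-1]
            for a, t in actions(s):
                if t not in parent:            # exhaustive, duplicate-free search
                    parent[t] = (s, a)
                    frontier.append(t)
        return None

    print(plan(("A", "A"), ("A", "B")))        # ['carry', 'drive']
    ```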

  19. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques, which enables automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  20. Evaluation of the 3d Urban Modelling Capabilities in Geographical Information Systems

    NASA Astrophysics Data System (ADS)

    Dogru, A. O.; Seker, D. Z.

    2010-12-01

    Geographical Information System (GIS) technology, which provides successful solutions to basic spatial problems, is currently widely used in 3-dimensional (3D) modeling of physical reality with its developing visualization tools. Modeling large and complicated phenomena is a challenging problem for the computer graphics currently in use; however, it is possible to visualize such phenomena in 3D by using computer systems. 3D models are used in developing computer games, military training, urban planning, tourism, and so on. The use of 3D models for planning and management of urban areas is a very popular issue for city administrations. In this context, 3D city models are produced and used for various purposes; however, the requirements of the models vary depending on the type and scope of the application. While high-level visualization, where photorealistic visualization techniques are widely used, is required for touristic and recreational purposes, an abstract visualization of the physical reality is generally sufficient for the communication of thematic information. The visual variables, which are the principal components of cartographic visualization, such as color, shape, pattern, orientation, size, position, and saturation, are used for communicating the thematic information; 3D city models of this kind are called abstract models. Standardization of the technologies used for 3D modeling is now available through CityGML. CityGML implements several novel concepts to support interoperability, consistency, and functionality. For example, it supports different Levels of Detail (LoD), which may arise from independent data collection processes and are used for efficient visualization and efficient data analysis. In one CityGML data set, the same object may be represented in different LoD simultaneously, enabling the analysis and visualization of the same object with regard to different degrees of resolution. Furthermore, two CityGML data sets containing the same object in different LoD may be combined and integrated. In this study, GIS tools used for 3D modeling were examined, including their availability for obtaining the different LoDs of the CityGML standard. Additionally, a 3D GIS application covering a small part of the city of Istanbul was implemented to communicate thematic information rather than photorealistic visualization by using a 3D model. An abstract model was created using the modeling tools of a commercial GIS software package, and the results of the implementation are also presented in the study.

  1. Efficiency measurement of health care organizations: What models are used?

    PubMed Central

    Jaafaripooyan, Ebrahim; Emamgholipour, Sara; Raei, Behzad

    2017-01-01

    Background: Literature abounds with various techniques for efficiency measurement of health care organizations (HCOs), which should be used cautiously and appropriately. The present study aimed at discovering the rules regulating the interplay among the number of inputs, outputs, and decision-making units (DMUs), identifying all methods used for the measurement of Iranian HCOs, and critically appraising all DEA studies on Iranian HCOs in their application of such rules. Methods: The present study employed a systematic search of all studies related to efficiency measurement of Iranian HCOs. A search was conducted in different databases such as PubMed and Scopus between 2001 and 2015 to identify the studies related to the measurement in health care. The retrieved studies passed through a multi-stage (title, abstract, body) filtering process. A data extraction table for each study was completed and included method, number of inputs and outputs, DMUs, and their efficiency score. Results: Various methods were found for efficiency measurement. Overall, 122 studies were retrieved, of which 73 had exclusively employed the DEA technique for measuring the efficiency of HCOs in Iran, and 23 had used hybrid models (including DEA). Only 6 studies had explicitly used the rules of thumb. Conclusion: The number of inputs, outputs, and DMUs should be cautiously selected in DEA-like techniques, as their proportionality can directly affect the discriminatory power of the technique. The given literature seemed to be, to a large extent, unsuccessful in attending to such proportionality. This study collected a list of key rules (of thumb) on the interplay of inputs, outputs, and DMUs, which could be considered by researchers keen to apply the DEA technique.
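
    A minimal sketch of the kind of rule-of-thumb check the review calls for, using two commonly cited heuristics (n >= 3(m+s) and n >= m*s); these thresholds are conventions from the DEA literature, not laws:

    ```python
    def dea_sample_size_ok(n_dmus: int, n_inputs: int, n_outputs: int) -> bool:
        """Check two widely used DEA rules of thumb for discriminatory power."""
        rule_sum = n_dmus >= 3 * (n_inputs + n_outputs)
        rule_product = n_dmus >= n_inputs * n_outputs
        return rule_sum and rule_product

    # Example: 20 hospitals evaluated with 3 inputs and 2 outputs.
    print(dea_sample_size_ok(20, 3, 2))  # True: 20 >= 15 and 20 >= 6
    ```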

  2. The travesty of choosing after positive prenatal diagnosis.

    PubMed

    Sandelowski, Margarete; Barroso, Julie

    2005-01-01

    To integrate the findings of qualitative studies of expectant parents receiving positive prenatal diagnosis. Seventeen published and unpublished reports appearing between 1984 and 2001 and retrieved between December of 2002 and March of 2003. The electronic databases searched include Academic Search Elite, AIDS Information Online (AIDSLINE), Anthropological Index Online, Anthropological Literature, Black Studies, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Digital Dissertations, Dissertation Abstracts Index (DAI), Educational Resource Information Center (ERIC), MEDLINE, PsycInfo, Public Affairs Information Service (PAIS), PubMed, Social Science Abstracts (SocSci Abstracts), Social Science Citation Index, Social Work Abstracts, Sociological Abstracts (Sociofile), Women's Resources International, and Women's Studies. Qualitative studies involving expectant parents living in the United States of any race, ethnicity, nationality, or class who learned during any time in pregnancy of any fetal impairment by any means of diagnosis were eligible for inclusion. Metasummary techniques, including the calculation of frequency effect sizes, were used to aggregate the findings. Metasynthesis techniques, including constant comparison analysis and the reciprocal translation of concepts, were used to interpret the findings. The topical emphasis in the findings is on the termination of pregnancy following positive diagnosis. The thematic emphasis is on the dilemmas of choice and decision making. Positive prenatal diagnosis was for couples an experience of chosen losses and lost choices. Couples managed information to minimize stigmatization and cognitive dissonance. Existing guidelines for caring for couples after perinatal losses must accommodate the chosen loss experientially defining positive prenatal diagnosis.

  3. Determination of 2p Excitation Transfer Rate Coefficient in Neon Gas Discharges

    NASA Astrophysics Data System (ADS)

    Smith, D. J.; Stewart, R. S.

    2001-10-01

    We will discuss our theoretical modelling and application of an array of four complementary optical diagnostic techniques for low-temperature plasmas. These are cw laser collisionally-induced fluorescence (LCIF), cw optogalvanic effect (OGE), optical emission spectroscopy (OES) and optical absorption spectroscopy (OAS). We will briefly present an overview of our investigation of neon positive column plasmas for reduced axial electric fields ranging from 3×10⁻¹⁷ V cm² to 2×10⁻¹⁶ V cm² (3-20 Td), detailing our determination of five sets of important collisional rate coefficients involving the fifteen lowest levels, the ¹S₀ ground state and the 1s and 2p excited states (in Paschen notation), hence information on several energy regions of the electron distribution function (EDF). The discussion will be extended to show the new results obtained from analysis of the argon positive column over similar reduced fields. Future work includes application of our multi-diagnostic technique to more complex systems, including the addition of molecules for EDF determination.

  4. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  5. A methodology for long-range prediction of air transportation

    NASA Technical Reports Server (NTRS)

    Ayati, M. B.; English, J. M.

    1980-01-01

    A framework and methodology for long-term projection of demand for aviation fuels is presented. The approach taken includes two basic components. The first was a new technique for establishing the socio-economic environment within which the future aviation industry is embedded. The concept utilized was a definition of an overall societal objective for the very long-run future. Within a framework so defined, a set of scenarios by which the future will unfold are then written. These scenarios provide the determinants of the air transport industry's operations and accordingly provide an assessment of future fuel requirements. The second part was the modeling of the industry in terms of an abstracted set of variables to represent the overall industry performance on a macro scale. The model was validated by testing its output variables against historical data over the past decades.

  6. Clinical innovation for promoting family care in paediatric intensive care: demonstration, role modelling and reflective practice.

    PubMed

    Tomlinson, Patricia S; Thomlinson, Elizabeth; Peden-McAlpine, Cynthia; Kirschbaum, Mark

    2002-04-01

    To explore family caregiving problems in paediatric crisis care and methods that could be applied to move the abstraction of family care to development of specific family interventions. Family centred care has been accepted as the ideal philosophy for holistic health care of children, but methods for its implementation are not well established. In paediatric health crises, family care requires special sensitivity to family needs and a type of complex nursing care for which many practitioners are not sufficiently prepared. Developing family sensitive models of intervention and finding a strategy for transfer of this knowledge to clinical practice is an important challenge facing family nursing today. Social learning theory provides a rich background to explore these issues. Specific techniques of role modelling and reflective practice are suggested as effective approaches to teach family sensitive care in clinical settings where families are part of the care environment.

  7. Towards new generation spectroscopic models of cool stars

    NASA Astrophysics Data System (ADS)

    Bergemann, Maria

    2018-06-01

    Abstract: Spectroscopy is a unique tool to determine the physical parameters of stars. Knowledge of stellar chemical abundances, masses, and ages is the key to understanding the evolution of their host populations. I will focus on the current outstanding problems in spectroscopy of cool stars, which are the most useful objects in studies of our local Galactic neighborhood but also very distant systems, like faint dwarf Spheroidal galaxies. Among the most debated issues is to what extent can we trust the techniques, which rely on the classical assumptions of local thermodynamic equilibrium and hydrostatic balance. I will summarise the ongoing efforts to improve the models of cool stars, with the emphasis on NLTE and 3D modelling. I will then discuss how these exciting observations impact our knowledge of abundances in the Milky Way and in dSph systems, and present outlook for the future studies.

  8. NeuroPhysics: Studying how neurons create the perception of space-time using Physics' tools and techniques

    NASA Astrophysics Data System (ADS)

    Dhingra, Shonali; Sandler, Roman; Rios, Rodrigo; Vuong, Cliff; Mehta, Mayank

    All animals naturally perceive the abstract concept of space-time. A brain region called the hippocampus is known to be important in creating these perceptions, but the underlying mechanisms are unknown. In our lab we employ several experimental and computational techniques from physics to tackle this fundamental puzzle. Experimentally, we use ideas from nanoscience and materials science to develop techniques to measure the activity of hippocampal neurons in freely-behaving animals. Computationally, we develop models to study neuronal activity patterns, which are point processes that are highly stochastic and multidimensional. We then apply these techniques to collect and analyze neuronal signals from rodents while they are exploring space in the real world or virtual reality with various stimuli. Our findings show that under these conditions neuronal activity depends on various parameters, such as sensory cues, including visual and auditory, and behavioral cues, including linear and angular position and velocity. Further, neuronal networks create internally-generated rhythms, which influence the perception of space and time. In totality, these results further our understanding of how the brain develops a cognitive map of our surrounding space and keeps track of time.

  9. Concrete Model Checking with Abstract Matching and Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem

    2005-01-01

    We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
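
    A minimal sketch of the core loop under stated assumptions (a toy counter system with parity predicates; no theorem-prover precision check or refinement): concrete transitions are executed, but only abstract versions of states are stored, so the search is an under-approximation that can miss errors until the abstraction is refined:

    ```python
    from collections import deque

    def successors(state):
        # Concrete transitions of a toy counter system (invented example).
        x, y = state
        return [((x + 1) % 8, y), (x, (y + 1) % 8)]

    def alpha(state):
        # Abstraction predicates: track only the parity of each component.
        x, y = state
        return (x % 2, y % 2)

    def explore(init, error):
        seen = {alpha(init)}
        frontier = deque([init])
        while frontier:
            s = frontier.popleft()
            if error(s):
                return s            # feasible by construction: concrete steps only
            for t in successors(s):
                a = alpha(t)
                if a not in seen:   # abstract matching: store abstract versions only
                    seen.add(a)
                    frontier.append(t)
        return None

    print(explore((0, 0), lambda s: s == (3, 3)))  # may miss errors (under-approximation)
    ```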

  10. Finite-element simulation of ground-water flow in the vicinity of Yucca Mountain, Nevada-California

    USGS Publications Warehouse

    Czarnecki, J.B.; Waddell, R.K.

    1984-01-01

    A finite-element model of the groundwater flow system in the vicinity of Yucca Mountain at the Nevada Test Site was developed using parameter estimation techniques. The model simulated steady-state ground-water flow occurring in tuffaceous, volcanic, and carbonate rocks, and alluvial aquifers. Hydraulic gradients in the modeled area range from 0.00001 for carbonate aquifers to 0.19 for barriers in tuffaceous rocks. Three model parameters were used in estimating transmissivity in six zones. Simulated hydraulic-head values range from about 1,200 m near Timber Mountain to about 300 m near Furnace Creek Ranch. Model residuals for simulated versus measured hydraulic heads range from -28.6 to 21.4 m; most are less than ±7 m, indicating an acceptable representation of the hydrologic system by the model. Sensitivity analyses of the model's flux boundary condition variables were performed to assess the effect of varying boundary fluxes on the calculation of estimated model transmissivities. Varying the flux variables representing discharge at Franklin Lake and Furnace Creek Ranch has greater effect than varying other flux variables. (Author's abstract)

  11. Bioinformatics Symposium of the Analytical Division of the American Chemical Society Meeting. Final Technical Report from 03/15/2000 to 03/14/2001 [sample pages of agenda, abstracts, index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Robert T.

    Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.

  12. ESSAA: Embedded system safety analysis assistant

    NASA Technical Reports Server (NTRS)

    Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry

    1987-01-01

    The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios. Embedded software issues hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause that outcome. The scenarios can then be examined in depth for plausibility using existing techniques.

  13. Nondestructive testing techniques

    NASA Astrophysics Data System (ADS)

    Bray, Don E.; McBride, Don

    A comprehensive reference covering a broad range of techniques in nondestructive testing is presented. Based on years of extensive research and application at NASA and other government research facilities, the book provides practical guidelines for selecting the appropriate testing methods and equipment. Topics discussed include visual inspection, penetrant and chemical testing, nuclear radiation, sonic and ultrasonic, thermal and microwave, magnetic and electromagnetic techniques, and training and human factors. (No individual items are abstracted in this volume)

  14. Synchronisation Technique of Data Recorded on a Multichannel Tape Recorder,

    DTIC Science & Technology

    1984-01-01

    A portable, self-contained, electronic digital unit, termed Data Synchroniser, was designed and developed by the Engineering Development Establishment (EDE) for the synchronisation of data recorded on a multichannel tape recorder (author: J.D. Dickens).

  15. A Study on User Interface Design of Knowledge Management Groupware in Selected Leading Organizations of Pakistan

    DTIC Science & Technology

    2004-06-01

    Information Systems, Faculty of ICT, International Islamic University, Malaysia. Several techniques for evaluating a groupware… inspection-based techniques could not be carried out in other parts of Pakistan, where the IT industry has mushroomed in the past few years. Nevertheless, there are no set standards for using any particular technique. Evaluating a groupware interface is an evolving process and requires more investigation.

  16. Physiologically based pharmacokinetic modeling using microsoft excel and visual basic for applications.

    PubMed

    Marino, Dale J

    2005-01-01

    Abstract Physiologically based pharmacokinetic (PBPK) models are mathematical descriptions depicting the relationship between external exposure and internal dose. These models have found great utility for interspecies extrapolation. However, specialized computer software packages, which are not widely distributed, have typically been used for model development and utilization. A few physiological models have been reported using more widely available software packages (e.g., Microsoft Excel), but these tend to include less complex processes and dose metrics. To ascertain the capability of Microsoft Excel and Visual Basic for Applications (VBA) for PBPK modeling, models for styrene, vinyl chloride, and methylene chloride were coded in Advanced Continuous Simulation Language (ACSL), Excel, and VBA, and simulation results were compared. For styrene, differences between ACSL and Excel or VBA compartment concentrations and rates of change were less than ±7.5E-10 using the same numerical integration technique and time step. Differences using the VBA fixed-step or ACSL Gear's methods were generally <1.00E-03, although larger differences involving very small values were noted after exposure transitions. For vinyl chloride and methylene chloride, Excel and VBA PBPK model dose metrics differed by no more than -0.013% or -0.23%, respectively, from ACSL results. These differences are likely attributable to different step sizes rather than different numerical integration techniques. These results indicate that Microsoft Excel and VBA can be useful tools for utilizing PBPK models, and given the availability of these software programs, it is hoped that this effort will help facilitate the use and investigation of PBPK modeling.
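
    To illustrate the step-size effect the author points to, here is a toy one-compartment kinetic model (not the paper's styrene or vinyl chloride models; the rates are invented), comparing fixed-step Euler integration at two step sizes against the analytic solution:

    ```python
    import math

    # Toy one-compartment model dC/dt = k_in - k_out * C, solved with fixed-step
    # Euler integration; the paper's PBPK models are multi-compartment.
    k_in, k_out, t_end = 1.0, 0.1, 24.0   # assumed uptake, clearance, hours

    def euler(dt):
        c, steps = 0.0, round(t_end / dt)
        for _ in range(steps):
            c += dt * (k_in - k_out * c)
        return c

    exact = (k_in / k_out) * (1.0 - math.exp(-k_out * t_end))
    for dt in (1.0, 0.01):
        print(f"dt={dt:5.2f} h: C(24 h) = {euler(dt):.6f}  (exact {exact:.6f})")
    ```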

  17. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.

  18. Processing Uav and LIDAR Point Clouds in Grass GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion (SfM) technique, and a low-cost 3D scanner. To take advantage of the vertical structure of multiple return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long term maintenance and reproducibility by the scientific community but also by the original authors themselves.
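
    One simple decimation strategy of the kind compared above is grid (bin) decimation, keeping one point per XY cell; a minimal sketch on synthetic data (function and array names are invented, not the GRASS GIS tools):

    ```python
    import numpy as np

    def grid_decimate(points: np.ndarray, cell: float) -> np.ndarray:
        """Keep the first point falling in each XY cell (one common decimation)."""
        keys = np.floor(points[:, :2] / cell).astype(np.int64)
        _, keep = np.unique(keys, axis=0, return_index=True)
        return points[np.sort(keep)]

    rng = np.random.default_rng(0)
    cloud = rng.uniform(0, 100, size=(100_000, 3))   # synthetic XYZ points
    thinned = grid_decimate(cloud, cell=1.0)
    print(len(cloud), "->", len(thinned))
    ```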

  19. Defence Reporter. Spring 2012

    DTIC Science & Technology

    2012-01-01

    …procedures and techniques for dealing with disruptive events. R0002869D, Visualisation Techniques: Communicating Results to Senior Decision-Makers (Dstl): …understanding and accurate recall of key information. Open-source literature provides evidence that good visualisations aid effective communication of abstract or complex information. General principles regarding the design of visualisations for use in presentations or reports are provided. Best…

  20. Modelling the sensitivity of river reaches to water abstraction: RAPHSA- a hydroecology tool for environmental managers

    NASA Astrophysics Data System (ADS)

    Klaar, Megan; Laize, Cedric; Maddock, Ian; Acreman, Mike; Tanner, Kath; Peet, Sarah

    2014-05-01

    A key challenge for environmental managers is the determination of environmental flows which allow a maximum yield of water resources to be taken from surface and sub-surface sources, whilst ensuring sufficient water remains in the environment to support biota and habitats. It has long been known that sensitivity to changes in water levels resulting from river and groundwater abstractions varies between rivers. Whilst assessment at the catchment scale is ideal for determining broad pressures on water resources and ecosystems, assessment of the sensitivity of reaches to changes in flow has previously been done on a site-by-site basis, often with the application of detailed but time-consuming techniques (e.g. PHABSIM). While this is appropriate for a limited number of sites, it is costly in money and time and therefore not suited to the national-level application required by responsible licensing authorities. To address this need, the Environment Agency (England) is developing an operational tool to predict relationships between physical habitat and flow which may be applied by field staff to rapidly determine the sensitivity of physical habitat to flow alteration for use in water resource management planning. An initial model of river sensitivity to abstraction (defined as the change in physical habitat related to changes in river discharge) was developed using site characteristics and data from 66 individual PHABSIM surveys throughout the UK (Booker & Acreman, 2008). By applying multivariate multiple linear regression to these data to define habitat availability-flow curves, with resource intensity as predictor variables, the model (known as RAPHSA: Rapid Assessment of Physical Habitat Sensitivity to Abstraction) is able to take a risk-based approach to modelled certainty. Site-specific information gathered through desk-based work or a variable amount of fieldwork can be used to predict the shape of the habitat-flow curves, with the uncertainty of the estimates reducing as more information is collected. Creation of generalized physical habitat-discharge relationships by the model allows environmental managers to select the desired level of confidence in the modelled results, based on environmental risk and the level of resource investment available. Hence, resources can be better directed according to the level of certainty required at each site. This model is intended to provide managers with an alternative to the existing use of either expert opinion or resource-intensive site-specific investigations in determining local environmental flows. Here, we outline the potential use of this tool by the Environment Agency in routine operational and investigation-specific scenarios, using case studies to illustrate its use.
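
    A sketch of fitting a generalized habitat-flow relationship from site descriptors (synthetic data; RAPHSA's actual predictors and curve form are not reproduced here), where a multivariate multiple linear regression maps site characteristics to the parameters of a habitat-flow curve H(Q) = a + b*log(Q):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic training set: site descriptors -> curve parameters (a, b).
    sites = rng.uniform(0, 1, size=(66, 3))          # e.g. slope, width, substrate
    true_w = np.array([[0.5, 1.0], [-0.2, 0.4], [0.1, -0.3]])  # invented weights
    params = sites @ true_w + rng.normal(0, 0.02, size=(66, 2))

    # Multivariate multiple linear regression via least squares.
    X = np.hstack([np.ones((66, 1)), sites])
    W, *_ = np.linalg.lstsq(X, params, rcond=None)

    new_site = np.array([1.0, 0.4, 0.6, 0.2])        # intercept + descriptors
    a, b = new_site @ W
    Q = np.linspace(0.1, 5, 5)
    print("predicted habitat availability:", a + b * np.log(Q))
    ```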

  1. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
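
    A minimal sketch of the abstraction side of this loop under stated assumptions (a toy labeled transition system; the real method model-checks the assume-guarantee premises and analyzes counterexamples): an assumption is a conservative quotient of a component, refined by splitting merged states:

    ```python
    # Toy labeled transition system: state -> action -> successor.
    component = {0: {"a": 1}, 1: {"b": 2}, 2: {"a": 0}}

    def quotient(lts, block):
        """Conservative, possibly nondeterministic abstraction obtained by
        merging all states that share a block id."""
        abs_lts = {}
        for s, edges in lts.items():
            for act, t in edges.items():
                abs_lts.setdefault(block[s], set()).add((act, block[t]))
        return abs_lts

    # Coarsest assumption: everything merged into one abstract state.
    block = {0: 0, 1: 0, 2: 0}
    print(quotient(component, block))   # {0: {('a', 0), ('b', 0)}}

    # Suppose model checking the premises yields a counterexample that is
    # spurious because state 2 was merged; refine by giving it its own block.
    block[2] = 1
    print(quotient(component, block))   # refined, still conservative
    ```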

  2. Aerobiology: Experimental Considerations, Observations, and Future Tools

    PubMed Central

    Haddrell, Allen E.

    2017-01-01

    ABSTRACT Understanding airborne survival and decay of microorganisms is important for a range of public health and biodefense applications, including epidemiological and risk analysis modeling. Techniques for experimental aerosol generation, retention in the aerosol phase, and sampling require careful consideration and understanding so that they are representative of the conditions the bioaerosol would experience in the environment. This review explores the current understanding of atmospheric transport in relation to advances and limitations of aerosol generation, maintenance in the aerosol phase, and sampling techniques. Potential tools for the future are examined at the interface between atmospheric chemistry, aerosol physics, and molecular microbiology where the heterogeneity and variability of aerosols can be explored at the single-droplet and single-microorganism levels within a bioaerosol. The review highlights the importance of method comparison and validation in bioaerosol research and the benefits that the application of novel techniques could bring to increasing the understanding of aerobiological phenomena in diverse research fields, particularly during the progression of atmospheric transport, where complex interdependent physicochemical and biological processes occur within bioaerosol particles. PMID:28667111

  3. Distraction techniques for face and smile aesthetic preventing ageing decay

    PubMed Central

    Barbaro, Roberto; Troisi, Donato; D’Alessio, Giuseppe; Amato, Maurizio; Lo Giudice, Roberto; Paolo Claudio, Pier

    2016-01-01

    Abstract Modern concepts in the world of beauty arise from popular models; the beautiful faces of actors document a bi-protrusive asset with high tension of the soft tissues. Facial symmetry has been proposed as a marker of development and stability that may be important in human mate choice. For various traits, any deviation from perfect symmetry can be considered a reflection of imperfect development. Additionally, a bi-protrusive profile is dependent on the hormonal level regardless of male or female sex. The goal of maxillofacial surgery is to provide the best results for both the aesthetic and functional aspects. Following these new concepts of facial aesthetics, new surgical procedures based on osteodistraction techniques lead to a very natural result, harmonizing the face and also preventing aesthetic decay in ageing faces. Ten cases are reported, with feedback on the aesthetic results collected using a five-point Likert scale after orthognathic surgery performed with the new distraction techniques in combination with ancillary surgical procedures. The aesthetic results in all patients were highly satisfactory. All patients accepted the new aesthetics of the face, avoiding elements of discrepancy and, consequently, medico-legal problems. PMID:28352833

  4. U.S. Geological Survey National Computer Technology Meeting; Program and abstracts, May 7-11, 1990

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1990-01-01

    Computer-related information from all Divisions of the U.S. Geological Survey are discussed in this compilation of abstracts. Some of the topics addressed are system administration; distributed information systems and data bases, both current (1990) and proposed; hydrologic applications; national water information systems; geographic information systems applications and techniques. The report contains some of the abstracts that were presented at the National Computer Technology Meeting that was held in May 1990. The meeting was sponsored by the Water Resources Division and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. (USGS)

  5. Modelling Metamorphism by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.

    Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state-automata abstraction of the phase semantics.

  6. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice…
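
    In practice, Koopman modes are typically approximated via dynamic mode decomposition (DMD), a finite-dimensional surrogate for the Koopman operator; a minimal sketch on synthetic snapshot data (the real program analyzes arctic sea ice fields; the signal and rank below are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic snapshots: two decaying/oscillating spatial patterns plus noise.
    t = np.arange(100)
    space = np.linspace(0, 1, 64)[:, None]
    X = (np.sin(2*np.pi*space) * np.cos(0.2*t) * 0.99**t
         + 0.5*np.cos(4*np.pi*space) * np.sin(0.05*t)) + 0.01*rng.normal(size=(64, 100))

    X1, X2 = X[:, :-1], X[:, 1:]                 # snapshot pairs x_k -> x_{k+1}
    U, s, Vt = np.linalg.svd(X1, full_matrices=False)
    r = 6                                        # truncation rank (assumed)
    U, s, Vt = U[:, :r], s[:r], Vt[:r]
    Atilde = U.T @ X2 @ Vt.T @ np.diag(1/s)      # reduced linear operator
    eigvals, W = np.linalg.eig(Atilde)
    modes = X2 @ Vt.T @ np.diag(1/s) @ W         # (exact) DMD / Koopman modes
    print("discrete-time frequencies (rad/sample):", np.log(eigvals).imag)
    ```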

  7. Demonstration of Novel Sampling Techniques for Measurement of Turbine Engine Volatile and Non-Volatile Particulate Matter (PM) Emissions

    DTIC Science & Technology

    2015-12-30

    This project consists of demonstrating the performance and viability of two devices to condition aircraft turbine…

  8. A Formal Model of Partitioning for Integrated Modular Avionics

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1998-01-01

    The aviation industry is gradually moving toward the use of integrated modular avionics (IMA) for civilian transport aircraft. An important concern for IMA is ensuring that applications are safely partitioned so they cannot interfere with one another. We have investigated the problem of ensuring safe partitioning and logical non-interference among separate applications running on a shared Avionics Computer Resource (ACR). This research was performed in the context of ongoing standardization efforts, in particular, the work of RTCA committee SC-182, and the recently completed ARINC 653 application executive (APEX) interface standard. We have developed a formal model of partitioning suitable for evaluating the design of an ACR. The model draws from the mathematical modeling techniques developed by the computer security community. This report presents a formulation of partitioning requirements expressed first using conventional mathematical notation, then formalized using the language of SRI's Prototype Verification System (PVS). The approach is demonstrated on three candidate designs, each an abstraction of features found in real systems.

  9. Modeling human comprehension of data visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzen, Laura E.; Haass, Michael Joseph; Divis, Kristin Marie

    This project was inspired by two needs. The first is a need for tools to help scientists and engineers to design effective data visualizations for communicating information, whether to the user of a system, an analyst who must make decisions based on complex data, or in the context of a technical report or publication. Most scientists and engineers are not trained in visualization design, and they could benefit from simple metrics to assess how well their visualization's design conveys the intended message. In other words, will the most important information draw the viewer's attention? The second is the need for cognition-based metrics for evaluating new types of visualizations created by researchers in the information visualization and visual analytics communities. Evaluating visualizations is difficult even for experts. However, all visualization methods and techniques are intended to exploit the properties of the human visual system to convey information efficiently to a viewer. Thus, developing evaluation methods that are rooted in the scientific knowledge of the human visual system could be a useful approach. In this project, we conducted fundamental research on how humans make sense of abstract data visualizations, and how this process is influenced by their goals and prior experience. We then used that research to develop a new model, the Data Visualization Saliency Model, that can make accurate predictions about which features in an abstract visualization will draw a viewer's attention. The model is an evaluation tool that can address both of the needs described above, supporting both visualization research and Sandia mission needs.
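
    Not the authors' Data Visualization Saliency Model, but a generic center-surround contrast sketch illustrating the kind of bottom-up prediction such saliency models make (all names and parameters are invented):

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def contrast_saliency(img: np.ndarray, small=3, large=21) -> np.ndarray:
        """Center-surround saliency: |local mean - neighborhood mean|."""
        center = uniform_filter(img, size=small)
        surround = uniform_filter(img, size=large)
        sal = np.abs(center - surround)
        return sal / (sal.max() + 1e-12)

    # Synthetic "visualization": flat background with one high-contrast glyph.
    img = np.zeros((100, 100))
    img[40:50, 60:70] = 1.0
    sal = contrast_saliency(img)
    print("most salient pixel:", np.unravel_index(sal.argmax(), sal.shape))
    ```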

  10. Mass storage system reference model, Version 4

    NASA Technical Reports Server (NTRS)

    Coleman, Sam (Editor); Miller, Steve (Editor)

    1993-01-01

    The high-level abstractions that underlie modern storage systems are identified. The information to generate the model was collected from major practitioners who have built and operated large storage facilities, and represents a distillation of the wisdom they have acquired over the years. The model provides a common terminology and set of concepts to allow existing systems to be examined and new systems to be discussed and built. It is intended that the model and the interfaces identified from it will allow and encourage vendors to develop mutually-compatible storage components that can be combined to form integrated storage systems and services. The reference model presents an abstract view of the concepts and organization of storage systems. From this abstraction will come the identification of the interfaces and modules that will be used in IEEE storage system standards. The model is not yet suitable as a standard; it does not contain implementation decisions, such as how abstract objects should be broken up into software modules or how software modules should be mapped to hosts; it does not give policy specifications, such as when files should be migrated; it does not describe how the abstract objects should be used or connected; and it does not refer to specific hardware components. In particular, it does not fully specify the interfaces.

  11. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated in Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.

  12. Abdomen and spinal cord segmentation with augmented active shape models

    PubMed Central

    Xu, Zhoubing; Conrad, Benjamin N.; Baucom, Rebeccah B.; Smith, Seth A.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-01-01

    Abstract. Active shape models (ASMs) have been widely used for extracting human anatomies in medical images given their capability for shape regularization and topology preservation. However, sensitivity to model initialization and local correspondence search often undermines their performances, especially around highly variable contexts in computed-tomography (CT) and magnetic resonance (MR) images. In this study, we propose an augmented ASM (AASM) by integrating the multiatlas label fusion (MALF) and level set (LS) techniques into the traditional ASM framework. Using AASM, landmark updates are optimized globally via a region-based LS evolution applied on the probability map generated from MALF. This augmentation effectively extends the searching range of correspondent landmarks while reducing sensitivity to the image contexts and improves the segmentation robustness. We propose the AASM framework as a two-dimensional segmentation technique targeting structures with one axis of regularity. We apply the AASM approach to abdomen CT and spinal cord (SC) MR segmentation challenges. On 20 CT scans, the AASM segmentation of the whole abdominal wall enables the subcutaneous/visceral fat measurement, with high correlation to the measurement derived from manual segmentation. On 28 3T MR scans, AASM yields better performances than other state-of-the-art approaches in segmenting white/gray matter in SC. PMID:27610400

  13. How Pupils Use a Model for Abstract Concepts in Genetics

    ERIC Educational Resources Information Center

    Venville, Grady; Donovan, Jenny

    2008-01-01

    The purpose of this research was to explore the way pupils of different age groups use a model to understand abstract concepts in genetics. Pupils from early childhood to late adolescence were taught about genes and DNA using an analogical model (the wool model) during their regular biology classes. Changing conceptual understandings of the…

  14. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic that taking the greatest advantage of the useful features available in a development system, while avoiding their negative interactions, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.

  15. Reliability of reporting nosocomial infections in the discharge abstract and implications for receipt of revenues under prospective reimbursement.

    PubMed Central

    Massanari, R M; Wilkerson, K; Streed, S A; Hierholzer, W J

    1987-01-01

    Proper reporting of discharge diagnoses, including complications of medical care, is essential for maximum recovery of revenues under the prospective reimbursement system. To evaluate the effectiveness of abstracting techniques in identifying nosocomial infections at discharge, discharge abstracts of patients with nosocomial infections were reviewed during September through November of 1984. Patients with nosocomial infections were identified using modified Centers for Disease Control (CDC) definitions and trained surveillance technicians. Records which did not include the diagnosis of nosocomial infections in the discharge abstract were identified, and potential lost revenues were estimated. We identified 631 infections in 498 patients. On average, only 57 per cent of the infections were properly recorded and coded in the discharge abstract. Of the additional monies which might be anticipated by the health care institution to assist in the cost of care of adverse events, approximately one-third would have been lost due to errors in coding in the discharge abstract. Although these lost revenues are substantial, they constitute but a small proportion of the potential costs to the institution when patients acquire nosocomial infections. PMID:3105338

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strout, Michelle

    Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs into full programs through the use of pragmas. These smaller, more restricted programming models enable orthogonal specification of many implementation details such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions in many scientific simulations. The ability to orthogonally manipulate the implementation for such computations will significantly ease performance programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.

  17. Modeling integrated cellular machinery using hybrid Petri-Boolean networks.

    PubMed

    Berestovsky, Natalie; Zhou, Wanding; Nagrath, Deepak; Nakhleh, Luay

    2013-01-01

    The behavior and phenotypic changes of cells are governed by a cellular circuitry that represents a set of biochemical reactions. Based on biological functions, this circuitry is divided into three types of networks, each encoding for a major biological process: signal transduction, transcription regulation, and metabolism. This division has generally enabled taming computational complexity dealing with the entire system, allowed for using modeling techniques that are specific to each of the components, and achieved separation of the different time scales at which reactions in each of the three networks occur. Nonetheless, with this division comes loss of information and power needed to elucidate certain cellular phenomena. Within the cell, these three types of networks work in tandem, and each produces signals and/or substances that are used by the others to process information and operate normally. Therefore, computational techniques for modeling integrated cellular machinery are needed. In this work, we propose an integrated hybrid model (IHM) that combines Petri nets and Boolean networks to model integrated cellular networks. Coupled with a stochastic simulation mechanism, the model simulates the dynamics of the integrated network, and can be perturbed to generate testable hypotheses. Our model is qualitative and is mostly built upon knowledge from the literature and requires fine-tuning of very few parameters. We validated our model on two systems: the transcriptional regulation of glucose metabolism in human cells, and cellular osmoregulation in S. cerevisiae. The model produced results that are in very good agreement with experimental data, and produces valid hypotheses. The abstract nature of our model and the ease of its construction makes it a very good candidate for modeling integrated networks from qualitative data. The results it produces can guide the practitioner to zoom into components and interconnections and investigate them using such more detailed mathematical models.

  18. Modeling Integrated Cellular Machinery Using Hybrid Petri-Boolean Networks

    PubMed Central

    Berestovsky, Natalie; Zhou, Wanding; Nagrath, Deepak; Nakhleh, Luay

    2013-01-01

    The behavior and phenotypic changes of cells are governed by a cellular circuitry that represents a set of biochemical reactions. Based on biological functions, this circuitry is divided into three types of networks, each encoding for a major biological process: signal transduction, transcription regulation, and metabolism. This division has generally enabled taming computational complexity dealing with the entire system, allowed for using modeling techniques that are specific to each of the components, and achieved separation of the different time scales at which reactions in each of the three networks occur. Nonetheless, with this division comes loss of information and power needed to elucidate certain cellular phenomena. Within the cell, these three types of networks work in tandem, and each produces signals and/or substances that are used by the others to process information and operate normally. Therefore, computational techniques for modeling integrated cellular machinery are needed. In this work, we propose an integrated hybrid model (IHM) that combines Petri nets and Boolean networks to model integrated cellular networks. Coupled with a stochastic simulation mechanism, the model simulates the dynamics of the integrated network, and can be perturbed to generate testable hypotheses. Our model is qualitative and is mostly built upon knowledge from the literature and requires fine-tuning of very few parameters. We validated our model on two systems: the transcriptional regulation of glucose metabolism in human cells, and cellular osmoregulation in S. cerevisiae. The model produced results that are in very good agreement with experimental data, and produces valid hypotheses. The abstract nature of our model and the ease of its construction makes it a very good candidate for modeling integrated networks from qualitative data. The results it produces can guide the practitioner to zoom into components and interconnections and investigate them using such more detailed mathematical models. PMID:24244124
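
    As a rough illustration of the hybrid scheme described in the two records above, the sketch below couples a two-variable Boolean regulatory layer to a one-transition stochastic Petri net. The network, rates, and update rules are invented for illustration and are far simpler than the paper's IHM.

        import random

        random.seed(1)

        # Petri-net layer: places hold token counts; one transition consumes a
        # substrate token and produces a product token, but only fires when the
        # Boolean layer has switched the "enzyme" variable on.
        marking = {"substrate": 20, "product": 0}
        state = {"signal": True, "enzyme": False}   # Boolean layer

        def boolean_step(s):
            # enzyme follows signal after one update (a toy regulatory rule)
            return {"signal": s["signal"], "enzyme": s["signal"]}

        def petri_step(m, enabled, p_fire=0.5):
            if enabled and m["substrate"] > 0 and random.random() < p_fire:
                m = dict(m, substrate=m["substrate"] - 1, product=m["product"] + 1)
            return m

        for t in range(30):
            state = boolean_step(state)                     # slow regulatory update
            marking = petri_step(marking, state["enzyme"])  # fast stochastic firing

        print(marking)   # most substrate converted once the enzyme switched on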

  19. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
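
    A schematic of the kind of optimisation BluePyOpt automates, written here as a bare evolutionary loop in plain Python. This is deliberately not the BluePyOpt API; the damped-oscillation model, fitness function, and GA settings are assumptions chosen for illustration.

        import math, random

        random.seed(0)

        def model(params, t):
            # toy "neuron response": a damped oscillation with two parameters
            amp, decay = params
            return amp * math.exp(-decay * t) * math.cos(t)

        target = [model((2.0, 0.3), t * 0.1) for t in range(100)]   # synthetic data

        def fitness(params):
            # negative squared error against the target trace
            return -sum((model(params, t * 0.1) - y) ** 2
                        for t, y in enumerate(target))

        def evolve(pop_size=30, generations=40, sigma=0.1):
            pop = [(random.uniform(0, 5), random.uniform(0, 1)) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]                       # selection
                children = [(max(0.0, a + random.gauss(0, sigma)),  # mutation
                             max(0.0, d + random.gauss(0, sigma)))
                            for a, d in parents]
                pop = parents + children
            return max(pop, key=fitness)

        print(evolve())   # converges near the true parameters (2.0, 0.3)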

  20. A TECHNIQUE FOR DETERMINING THE OPERATING CAPACITIES OF JUNIOR COLLEGE INSTRUCTIONAL FACILITIES.

    ERIC Educational Resources Information Center

    CLAWSON, KENNETH TED

    A TECHNIQUE FOR DETERMINING THE CAPACITY OF A COLLEGE PLANT SHOULD (1) CONSIDER THE FUNCTIONAL USE OF THE PLANT, (2) ATTACK THE CAPACITY PROBLEM DIRECTLY RATHER THAN THROUGH STATUS STUDIES, (3) INVOLVE THE SIGNIFICANT FACTORS RELATED TO CAPACITY, (4) USE OBJECTIVE FACTORS, (5) BE UNIVERSAL IN ITS APPLICATION, (6) NOT INVOLVE ABSTRACT STANDARDS,…

  1. Multiscale visual quality assessment for cluster analysis with self-organizing maps

    NASA Astrophysics Data System (ADS)

    Bernard, Jürgen; von Landesberger, Tatiana; Bremm, Sebastian; Schreck, Tobias

    2011-01-01

    Cluster analysis is an important data mining technique for analyzing large amounts of data, reducing many objects to a limited number of clusters. Cluster visualization techniques aim at supporting the user in better understanding the characteristics and relationships among the found clusters. While promising approaches to visual cluster analysis already exist, these usually fall short of incorporating the quality of the obtained clustering results. However, due to the nature of the clustering process, quality plays an important role, as for most practical data sets, typically many different clusterings are possible. Being aware of clustering quality is important to judge the expressiveness of a given cluster visualization, or to adjust the clustering process with refined parameters, among others. In this work, we present an encompassing suite of visual tools for quality assessment of an important visual cluster algorithm, namely, the Self-Organizing Map (SOM) technique. We define, measure, and visualize the notion of SOM cluster quality along a hierarchy of cluster abstractions. The quality abstractions range from simple scalar-valued quality scores up to the structural comparison of a given SOM clustering with output of additional supportive clustering methods. The suite of methods allows the user to assess the SOM quality on the appropriate abstraction level, and arrive at improved clustering results. We implement our tools in an integrated system, apply it on experimental data sets, and show its applicability.
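
    The simplest rung of the quality hierarchy described above is a scalar score such as the mean quantization error. The sketch below trains a small SOM with NumPy and reports that score; the grid size, learning schedule, and synthetic data are illustrative assumptions, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(0)
        data = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in (0.0, 2.0, 4.0)])

        rows, cols = 4, 4
        weights = rng.uniform(data.min(), data.max(), size=(rows * cols, 2))
        coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

        for epoch in range(20):
            lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
            radius = 2.0 * (1 - epoch / 20) + 0.5  # shrinking neighborhood
            for x in data:
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best matching unit
                d = np.linalg.norm(coords - coords[bmu], axis=1)
                h = np.exp(-(d ** 2) / (2 * radius ** 2))          # neighborhood kernel
                weights += lr * h[:, None] * (x - weights)

        # Mean quantization error: average distance from each sample to its BMU.
        qe = np.mean([np.min(np.linalg.norm(weights - x, axis=1)) for x in data])
        print(f"mean quantization error: {qe:.3f}")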

  2. Molecular modeling: An open invitation for applied mathematics

    NASA Astrophysics Data System (ADS)

    Mezey, Paul G.

    2013-10-01

    Molecular modeling methods provide a very wide range of challenges for innovative mathematical and computational techniques, where often high dimensionality, large sets of data, and complicated interrelations imply a multitude of iterative approximations. The physical and chemical basis of these methodologies involves quantum mechanics with several non-intuitive aspects, where classical interpretation and classical analogies are often misleading or outright wrong. Hence, instead of the everyday, common sense approaches which work so well in engineering, in molecular modeling one often needs to rely on rather abstract mathematical constraints and conditions, again emphasizing the high level of reliance on applied mathematics. Yet, the interdisciplinary aspects of the field of molecular modeling also generate some inertia and a perhaps too conservative reliance on tried and tested methodologies, which is at least partially caused by less than up-to-date involvement in the newest developments in applied mathematics. It is expected that as more applied mathematicians take up the challenge of employing the latest advances of their field in molecular modeling, important breakthroughs may follow. In this presentation some of the current challenges of molecular modeling are discussed.

  3. Local rules simulation of the kinetics of virus capsid self-assembly.

    PubMed

    Schwartz, R; Shor, P W; Prevelige, P E; Berger, B

    1998-12-01

    A computer model is described for studying the kinetics of the self-assembly of icosahedral viral capsids. Solution of this problem is crucial to an understanding of the viral life cycle, which currently cannot be adequately addressed through laboratory techniques. The abstract simulation model employed to address this is based on the local rules theory (Proc. Natl. Acad. Sci. USA 91:7732-7736). It is shown that the principle of local rules, generalized with a model of kinetics and other extensions, can be used to simulate complicated problems in self-assembly. This approach allows for a computationally tractable molecular dynamics-like simulation of coat protein interactions while retaining many relevant features of capsid self-assembly. Three simple simulation experiments are presented to illustrate the use of this model. These show the dependence of growth and malformation rates on the energetics of binding interactions, the tolerance of errors in binding positions, and the concentration of subunits in the examples. These experiments demonstrate a tradeoff within the model between growth rate and fidelity of assembly for the three parameters. A detailed discussion of the computational model is also provided.
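
    A much-reduced sketch of the kind of kinetic experiment described above: subunits attach at a concentration-dependent rate, and each attachment is malformed with a probability that falls as binding energy rises. The rates, energies, and error rule are invented for illustration and do not reproduce the paper's local-rules simulator.

        import math, random

        random.seed(2)

        def grow_capsid(n_sites=60, conc=1.0, binding_energy=4.0, steps=2000):
            correct, malformed = 0, 0
            p_attach = min(1.0, 0.01 * conc)      # attachment chance per step
            p_error = math.exp(-binding_energy)   # weaker bonds -> more errors
            for _ in range(steps):
                if correct + malformed >= n_sites:
                    break                          # shell complete
                if random.random() < p_attach:
                    if random.random() < p_error:
                        malformed += 1
                    else:
                        correct += 1
            return correct, malformed

        for e in (2.0, 4.0, 8.0):
            c, m = grow_capsid(binding_energy=e)
            print(f"binding energy {e}: {c} correct, {m} malformed attachments")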

  4. Techniques of Photometry and Astrometry with APASS, Gaia, and Pan-STARRs Results (Abstract)

    NASA Astrophysics Data System (ADS)

    Green, W.

    2017-12-01

    (Abstract only) The databases with the APASS DR9, Gaia DR1, and the Pan-STARRs 3pi DR1 data releases are publicly available for use. There is a bit of data-mining involved to download and manage these reference stars. This paper discusses the use of these databases to acquire accurate photometric references as well as techniques for improving results. Images are prepared in the usual way: zero, dark, flat-fields, and WCS solutions with Astrometry.net. Images are then processed with SExtractor to produce an ASCII table of identified photometric features. The database manages photometric catalogs and images converted to ASCII tables. Scripts convert the files into SQL and assimilate them into database tables. Using SQL techniques, each image star is merged with reference data to produce publishable results. The VYSOS has over 13,000 images of the ONC5 field to process with roughly 100 total fields in the campaign. This paper provides the overview for this daunting task.
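
    The crossmatch step described above can be sketched with an in-memory SQLite database: image detections and catalog references are joined by position to yield per-star zero points. The schema, tolerance, and coordinates are illustrative assumptions; a production pipeline would match in proper spherical geometry with indexed queries.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE image_star (id INTEGER, ra REAL, dec REAL, inst_mag REAL);
        CREATE TABLE catalog    (name TEXT,  ra REAL, dec REAL, ref_mag REAL);
        """)
        db.executemany("INSERT INTO image_star VALUES (?,?,?,?)",
                       [(1, 83.8221, -5.3911, -8.31), (2, 83.8660, -5.3897, -7.02)])
        db.executemany("INSERT INTO catalog VALUES (?,?,?,?)",
                       [("ref_A", 83.8222, -5.3912, 12.10),
                        ("ref_B", 83.8659, -5.3898, 13.45)])

        tol = 1.5 / 3600.0   # 1.5 arcsec matching tolerance, in degrees
        rows = db.execute("""
        SELECT i.id, c.name, c.ref_mag - i.inst_mag AS zero_point
        FROM image_star i JOIN catalog c
          ON ABS(i.ra - c.ra) < :tol AND ABS(i.dec - c.dec) < :tol
        """, {"tol": tol}).fetchall()

        for r in rows:
            print(r)   # per-star photometric zero points from the reference catalog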

  5. Correction of Phase Distortion by Nonlinear Optical Techniques

    DTIC Science & Technology

    1981-05-01

    Correction of Phase Distortion by Nonlinear Optical Techniques. Hughes Research Laboratories, 3011 Malibu Canyon... Authors: R.C. Lind, W.B. Browne, C.R. Giuliano, R.K... Keywords: phase conjugation, adaptive optics, laser compensation, SBS, four-wave mixing.

  6. Introduction of a Current Waveform, Waveshaping Technique to Limit Conduction Loss in High-Frequency DC-DC Converters Suitable for Space Power

    DTIC Science & Technology

    1990-06-01

    Space power supply manufacturers have tried to increase power density and construct smaller, highly efficient power supplies by increasing switching frequency. Incorporation of a power MOSFET as a... Keywords: resonant Buck converter.

  7. Technology for large space systems: A bibliography with indexes (supplement 11)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    This bibliography contains 539 abstracts of reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1984 and December 31, 1984. Abstracts are arranged in the following categories: systems; analysis and design techniques; structural concepts; structural and thermal analysis; structural dynamics and control; electronics; advanced materials; assembly concepts; propulsion; and miscellaneous. Subject, personal author, corporate source, contract number, report number, and accession number indexes are listed.

  8. Seeking health information on the web: positive hypothesis testing.

    PubMed

    Kayhan, Varol Onur

    2013-04-01

    The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Do Concretely and Abstractly Worded Arguments Require Different Models?

    ERIC Educational Resources Information Center

    Hample, Dale

    Dale Hample's cognitive model of argument is designed to reflect the operation of syllogistic thought processes. It has been suggested however, that the model applies more closely to abstractly worded arguments than to concrete thinking and that it also may work better with more interested respondents because it seems to describe the central…

  10. U.S. Geological Survey national computer technology meeting; program and abstracts, New Orleans, Louisiana, April 10-15, 1994

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1994-01-01

    This report contains some of the abstracts of papers that were presented at the National Computer Technology Meeting that was held in April 1994. This meeting was sponsored by the Water Resources Division of the U.S. Geological Survey, and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are data transfer, data-base management, hydrologic applications, national water information systems, and geographic information systems applications and techniques.

  11. Comprehensive investigations of kinetics of alkaline hydrolysis of TNT (2,4,6-trinitrotoluene), DNT (2,4-dinitrotoluene), and DNAN (2,4-dinitroanisole).

    PubMed

    Sviatenko, Liudmyla; Kinney, Chad; Gorb, Leonid; Hill, Frances C; Bednar, Anthony J; Okovytyy, Sergiy; Leszczynski, Jerzy

    2014-09-02

    Combined experimental and computational techniques were used to analyze multistep chemical reactions in the alkaline hydrolysis of three nitroaromatic compounds: 2,4,6-trinitrotoluene (TNT), 2,4-dinitrotoluene (DNT), and 2,4-dinitroanisole (DNAN). The study reveals common features and differences in the kinetic behavior of these compounds. The analysis of the predicted pathways includes modeling of the reactions, along with simulation of UV-vis spectra, experimental monitoring of reactions using LC/MS techniques, development of the kinetic model by designing and solving the system of differential equations, and obtaining computationally predicted kinetics for decay and accumulation of reactants and products. Obtained results suggest that DNT and DNAN are more resistant to alkaline hydrolysis than TNT. The direct substitution of a nitro group by a hydroxide represents the most favorable pathway for all considered compounds. The formation of Meisenheimer complexes leads to the kinetic first-step intermediates in the hydrolysis of TNT. Janovsky complexes can also be formed during hydrolysis of TNT and DNT but in small quantities. Methyl group abstraction is one of the suggested pathways of DNAN transformation during alkaline hydrolysis.
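
    The "designing and solving the system of differential equations" step can be sketched with a generic first-order chain (parent to intermediate to product). The two-step scheme and rate constants below are assumptions for illustration, not the paper's fitted values for TNT, DNT, or DNAN.

        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2 = 0.8, 0.2   # assumed first-order rate constants (1/h)

        def rhs(t, y):
            parent, intermediate, product = y
            return [-k1 * parent,
                    k1 * parent - k2 * intermediate,
                    k2 * intermediate]

        sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0, 0.0],
                        t_eval=np.linspace(0, 24, 7))
        for t, (p, i, q) in zip(sol.t, sol.y.T):
            print(f"t={t:5.1f} h  parent={p:.3f}  intermediate={i:.3f}  product={q:.3f}")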

  12. Managing a Common Pool Resource: Real Time Decision-Making in a Groundwater Aquifer

    NASA Astrophysics Data System (ADS)

    Sahu, R.; McLaughlin, D.

    2017-12-01

    In a Common Pool Resource (CPR) such as a groundwater aquifer, multiple landowners (agents) are competing for a limited resource of water. Landowners pump out the water to grow their own crops. Such problems can be posed as differential games, with agents all trying to control the behavior of the shared dynamic system. Each agent aims to maximize his/her own personal objective like agriculture yield, being aware that the action of every other agent collectively influences the behavior of the shared aquifer. The agents therefore choose a subgame perfect Nash equilibrium strategy that derives an optimal action for each agent based on the current state of the aquifer and assumes perfect information of every other agents' objective function. Furthermore, using an Iterated Best Response approach and interpolating techniques, an optimal pumping strategy can be computed for a more realistic description of the groundwater model under certain assumptions. The numerical implementation of dynamic optimization techniques for a relevant description of the physical system yields results qualitatively different from the previous solutions obtained from simple abstractions. This work aims to bridge the gap between extensive modeling approaches in hydrology and competitive solution strategies in differential game theory.
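
    A bare-bones version of the iterated best response loop mentioned above, for two agents whose payoff is a concave pumping benefit minus a drawdown cost that grows with total pumping. The functional forms and coefficients are invented for illustration, not the study's groundwater model.

        import numpy as np

        grid = np.linspace(0.0, 5.0, 501)      # candidate pumping rates

        def payoff(own, other, a=4.0, b=1.0, c=0.6):
            # concave yield benefit minus cost of aquifer drawdown
            return a * own - b * own ** 2 - c * own * (own + other)

        def best_response(other):
            return grid[np.argmax(payoff(grid, other))]

        p1 = p2 = 0.0
        for _ in range(50):                    # iterate to a fixed point
            p1, p2 = best_response(p2), best_response(p1)

        print(f"equilibrium pumping: agent1={p1:.2f}, agent2={p2:.2f}")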

  13. How to Express C++ Concepts in Fortran90

    NASA Technical Reports Server (NTRS)

    Decyk, V.; Norton, C.; Szymanski, B.

    1997-01-01

    This paper summarizes techniques for emulating in Fortran90 the most important object-oriented concepts of C++: classes (including abstract data types, encapsulation and function overloading), inheritance, and dynamic dispatching.

  14. How to Express C++ Concepts in Fortran90

    DOE PAGES

    Decyk, Viktor K.; Norton, Charles D.; Szymanski, Boleslaw K.

    1997-01-01

    This paper summarizes techniques for emulating in Fortran90 the most important object-oriented concepts of C++: classes (including abstract data types, encapsulation and function overloading), inheritance, and dynamic dispatching.

  15. Advanced Pediatric Brain Imaging Research and Training Program

    DTIC Science & Technology

    2014-10-01

    ...death and disability in children. Recent advances in pediatric magnetic resonance imaging (MRI) techniques are revolutionizing our understanding of... principles of pediatric brain injury and recovery following injury, as well as the clinical application of sophisticated MRI techniques that are... Keywords: MRI, brain injury.

  16. Effective Techniques for Augmenting Heat Transfer: An Application of Entropy Generation Minimization Principles.

    DTIC Science & Technology

    1980-12-01

    Keywords: augmentation techniques, entropy generation, irreversibility, exergy. Contents include internally finned tubes, internally roughened tubes, and other heat transfer... irreversibility and entropy generation as a fundamental criterion for evaluating and, eventually, minimizing the waste of usable energy (exergy) in energy...

  17. Toward More Effective Teaching in WCHEN Schools; The Report of a Course in New Training Techniques for Nurse Faculty.

    ERIC Educational Resources Information Center

    Elliott, Jo Eleanor

    Forty-five abstracts represent projects prepared by faculty personnel from Western Council on Higher Education for Nursing (WCHEN) member schools who were participants in a short-term course, "Improving Instruction Through the Use of Selected Tools and Techniques." Programed instruction projects involve various clinical areas and deal with such…

  18. Undersea Laser Communication with Narrow Beams

    DTIC Science & Technology

    2015-09-29

    Laser sources enable highly efficient optical communications links due to their ability to be focused into very directive beam profiles... Recent atmospheric and space optical links have demonstrated robust laser communications links at high rate with techniques that are applicable to the... undersea environment. These techniques contrast with the broad-angle beams utilized in most reported demonstrations of undersea optical communications

  19. Automatic blocking for complex three-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Dannenhoffer, John F., III

    1995-01-01

    A new blocking technique for complex three-dimensional configurations is described. This new technique is based upon the concept of an abstraction, or squared-up representation, of the configuration and the associated grid. By allowing the user to describe blocking requirements in natural terms (such as 'wrap a grid around this leading edge' or 'make all grid lines emanating from this wall orthogonal to it'), users can quickly generate complex grids around complex configurations, while still maintaining a high level of control where desired. An added advantage of the abstraction concept is that once a blocking is defined for a class of configurations, it can be automatically applied to other configurations of the same class, making the new technique particularly well suited for the parametric variations which typically occur during design processes. Grids have been generated for a variety of real-world, two- and three-dimensional configurations. In all cases, the time required to generate the grid, given just an electronic form of the configuration, was at most a few days. Hence with this new technique, the generation of a block-structured grid is only slightly more expensive than the generation of an unstructured grid for the same configuration.

  20. A Review of Equation of State Models, Chemical Equilibrium Calculations and CERV Code Requirements for SHS Detonation Modelling

    DTIC Science & Technology

    2009-10-01

    ...parameters for a large number of species. These authors provide many sample calculations with the JCZS database incorporated in CHEETAH 2.0, including...

  1. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high level language is described in a nonformal manner and illustrated by example.

  2. An abstract language for specifying Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1986-01-01

    Markov models can be used to compute the reliability of virtually any fault tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high level language is described in a nonformal manner and illustrated by example.
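
    The two records above describe a language whose specifications expand into explicit Markov models. A sketch of the expanded object, written directly in Python, is shown below: a three-state redundant system with assumed failure and repair rates, solved for transient reliability via the matrix exponential. The model and rates are illustrative, not from the reports.

        import numpy as np
        from scipy.linalg import expm

        # States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
        lam, mu = 1e-3, 1e-1          # per-hour failure and repair rates (assumed)
        Q = np.array([[-2 * lam,  2 * lam,       0.0],
                      [      mu, -(mu + lam),    lam],
                      [     0.0,       0.0,      0.0]])   # CTMC generator matrix

        p0 = np.array([1.0, 0.0, 0.0])                    # start with both units up
        for t in (10.0, 100.0, 1000.0):
            p = p0 @ expm(Q * t)                          # transient state probabilities
            print(f"t={t:6.0f} h  reliability={1.0 - p[2]:.6f}")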

  3. Researching Mental Health Disorders in the Era of Social Media: Systematic Review

    PubMed Central

    Vadillo, Miguel A; Curcin, Vasa

    2017-01-01

    Background Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. Objective The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. Methods We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. Results The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Conclusions Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorders is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. PMID:28663166

  4. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.

    2010-12-01

    Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified) are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
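
    The comparison pipeline described above condenses to: extract high-level metrics from each time series, reduce them with PCA, and compare positions in the resulting behavior space. The sketch below does exactly that on synthetic series; the metrics and data are illustrative stand-ins for the paper's.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)

        def metrics(ts):
            # simple high-level descriptors of a time series
            return [ts.mean(), ts.std(), np.abs(np.diff(ts)).mean(),
                    np.percentile(ts, 95)]

        detailed = [np.sin(np.linspace(0, 20, 500) * f) + rng.normal(0, 0.3, 500)
                    for f in rng.uniform(0.8, 1.2, 10)]
        abstract = [np.sin(np.linspace(0, 20, 500) * f) + rng.normal(0, 0.4, 500)
                    for f in rng.uniform(0.7, 1.3, 10)]

        X = np.array([metrics(ts) for ts in detailed + abstract])
        Z = PCA(n_components=2).fit_transform((X - X.mean(0)) / X.std(0))

        d_center = Z[:10].mean(axis=0)          # detailed-model centroid
        a_center = Z[10:].mean(axis=0)          # abstract-model centroid
        print("centroid separation in behavior space:",
              np.linalg.norm(d_center - a_center))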

  5. MODELING PHYTOREMEDIATION FOR PETROLEUM CONTAMINATED SOIL: MODEL DEVELOPMENT. (R827015C018)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. Mutant mice: experimental organisms as materialised models in biomedicine.

    PubMed

    Huber, Lara; Keuck, Lara K

    2013-09-01

    Animal models have received particular attention as key examples of material models. In this paper, we argue that the specificities of establishing animal models-acknowledging their status as living beings and as epistemological tools-necessitate a more complex account of animal models as materialised models. This becomes particularly evident in animal-based models of diseases that only occur in humans: in these cases, the representational relation between animal model and human patient needs to be generated and validated. The first part of this paper presents an account of how disease-specific animal models are established by drawing on the example of transgenic mice models for Alzheimer's disease. We will introduce an account of validation that involves a three-fold process including (1) from human being to experimental organism; (2) from experimental organism to animal model; and (3) from animal model to human patient. This process draws upon clinical relevance as much as scientific practices and results in disease-specific, yet incomplete, animal models. The second part of this paper argues that the incompleteness of models can be described in terms of multi-level abstractions. We qualify this notion by pointing to different experimental techniques and targets of modelling, which give rise to a plurality of models for a specific disease. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. New ARCH: Future Generation Internet Architecture

    DTIC Science & Technology

    2004-08-01

    ...a vocabulary to talk about a system. This provides a framework (a "reference model")... layered model. Modularity and abstraction are central tenets of Computer Science thinking. Modularity breaks a system into parts, normally to permit... this complexity is hidden. Abstraction suggests a structure for the system. A popular and simple structure is a layered model: lower layer...

  8. Corona Preionization Technique for Carbon Dioxide TEA Lasers.

    DTIC Science & Technology

    1982-11-30

    Corona Preionization Technique for Carbon Dioxide TEA Lasers. Walter R. Kaminski, United Technologies Research Center. Final Report, period beginning May 5, 1981... Keywords: UV preionization, pulsed CO2 laser, corona preionization, CO2 TEA laser.

  9. A Comparison of Direction Finding Results From an FFT Peak Identification Technique With Those From the Music Algorithm

    DTIC Science & Technology

    1991-07-01

    A Comparison of Direction Finding Results From an FFT Peak Identification Technique With Those From the MUSIC Algorithm, by L.E. Montbriand. CRC Report No. 1438, Government of Canada, Ottawa, July 1991.
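
    The record's abstract is truncated, but the MUSIC algorithm named in the title is standard. Below is a compact MUSIC bearing estimator for a uniform linear array in NumPy; the array geometry, SNR, and source angles are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(4)
        M, N, d = 8, 200, 0.5            # sensors, snapshots, spacing (wavelengths)
        angles = np.deg2rad([-20.0, 35.0])

        def steering(theta):
            return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

        A = np.column_stack([steering(t) for t in angles])            # (M, 2)
        S = rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))    # source signals
        X = A @ S + 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

        R = X @ X.conj().T / N                      # sample covariance matrix
        eigval, eigvec = np.linalg.eigh(R)          # eigenvalues in ascending order
        En = eigvec[:, : M - 2]                     # noise subspace (2 sources assumed)

        scan = np.deg2rad(np.linspace(-90, 90, 721))
        spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2
                             for t in scan])

        # crude peak picking: the two largest local maxima of the pseudospectrum
        loc = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
        idx = np.where(loc)[0] + 1
        top = idx[np.argsort(spectrum[idx])[-2:]]
        print("estimated bearings (deg):", np.sort(np.rad2deg(scan[top])))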

  10. Symbolic modeling of human anatomy for visualization and simulation

    NASA Astrophysics Data System (ADS)

    Pommert, Andreas; Schubert, Rainer; Riemer, Martin; Schiemann, Thomas; Tiede, Ulf; Hoehne, Karl H.

    1994-09-01

    Visualization of human anatomy in a 3D atlas requires both spatial and more abstract symbolic knowledge. Within our 'intelligent volume' model which integrates these two levels, we developed and implemented a semantic network model for describing human anatomy. Concepts for structuring (abstraction levels, domains, views, generic and case-specific modeling, inheritance) are introduced. The model, tools for its generation and exploration, and applications in our 3D anatomical atlas are presented and discussed.

  11. Recommendations for Model Driven Paradigms for Integrated Approaches to Cyber Defense

    DTIC Science & Technology

    2017-03-06

    ...analogy (e.g., Susceptible, Infected, Recovered [SIR]); abstract wargaming: a game-theoretic model of cyber conflict without modeling the underlying... malware. In abstract wargaming, a game-theoretic process is modeled with moves and effects inspired by cyber conflict but without modeling the underlying processes of cyber attack and defense. Examples in the literature include: Cho J-H, Gao J. Cyber war game in temporal networks.

  12. Implementation and Characterization of a Fault Injection Method at a High Level of Abstraction (Mise en oeuvre et caracterisation d'une methode d'injection de pannes a haut niveau d'abstraction)

    NASA Astrophysics Data System (ADS)

    Robache, Remi

    Nowadays, the effects of cosmic rays on electronics are well known. Studies have demonstrated that neutrons are the main cause of non-destructive errors in circuits embedded on airplanes. Moreover, the reduction of transistor sizes makes all circuits more sensitive to these effects. Radiation-tolerant circuits are sometimes used to improve robustness, but they are expensive and their technologies often lag a few generations behind non-tolerant circuits. Designers therefore prefer conventional circuits combined with mitigation techniques to improve tolerance to soft errors. The dependability of a circuit must be analyzed and verified throughout its design process, and conventional design methodologies need to be adapted to evaluate tolerance to non-destructive radiation-induced errors. Designers need new tools and new methodologies to validate their mitigation strategies against system requirements. In this thesis, we propose a new methodology that captures the faulty behavior of a circuit at a low level of abstraction and applies it at a higher level. To do so, we introduce the concept of faulty behavior signatures, which makes it possible to create, at a high level of abstraction (system level), models that reflect with high fidelity the faulty behavior learned at a low level of abstraction (gate level). We successfully replicated the faulty behavior of an 8-bit adder and an 8-bit multiplier in Simulink, with correlation coefficients of 98.53% and 99.86%, respectively. We also propose a methodology for generating a library of faulty components in Simulink, allowing designers to verify the dependability of their models early in the design flow. Results obtained for three different circuits are presented and analyzed throughout the thesis. Within the framework of this project, a paper was published at the NEWCAS 2013 conference (Robache et al., 2013); it presents the concept of faulty behavior signatures, the methodology we developed for generating signatures, and our experiments with an 8-bit multiplier.
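
    The signature idea can be illustrated with a toy gate-level fault campaign: inject single-bit upsets into a ripple-carry adder and tally the output error patterns that a higher-level model would replay. The adder and fault model below are simplified assumptions, not the thesis's Simulink flow; note that in this toy every sum-bit upset maps to a one-bit output error, while richer fault models yield richer signatures.

        import random
        from collections import Counter

        random.seed(5)

        def adder8(a, b, fault_bit=None):
            """Ripple-carry 8-bit adder; optionally flips one internal sum bit."""
            result, carry = 0, 0
            for i in range(8):
                abit, bbit = (a >> i) & 1, (b >> i) & 1
                s = abit ^ bbit ^ carry
                carry = (abit & bbit) | (carry & (abit ^ bbit))
                if i == fault_bit:
                    s ^= 1                  # single-event upset on this sum bit
                result |= s << i
            return result & 0xFF

        signature = Counter()
        for _ in range(10000):
            a, b = random.randrange(256), random.randrange(256)
            bit = random.randrange(8)
            err = adder8(a, b) ^ adder8(a, b, fault_bit=bit)
            signature[err] += 1             # XOR pattern of faulty vs golden run

        print(signature.most_common(5))     # distribution of output error patterns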

  13. Engaging Teenagers in Astronomy Using the Lens of Next Generation Science Standards and Common Core State Standards (Abstract)

    NASA Astrophysics Data System (ADS)

    Gillette, S.; Wolf, D.; Harrison, J.

    2017-12-01

    (Abstract only) The Vanguard Double Star Workshop has been developed to teach eighth graders the technique of measuring position angle and separation of double stars. Through this program, the students follow in the footsteps of a professional scientist by researching the topic, performing the experiment, writing a scientific article, publishing a scientific article, and finally presenting the material to peers. An examination of current educational standards grounds this program in educational practice and philosophy.
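
    The measurement the workshop teaches reduces to two formulas: separation from the coordinate differences (with the RA difference scaled by the cosine of the declination) and position angle measured from north through east. A small-angle sketch in Python, with made-up coordinates:

        import math

        def pa_sep(ra1, dec1, ra2, dec2):
            """Position angle (deg, N through E) and separation (arcsec).
            Inputs in degrees; valid for small separations."""
            d_ra = (ra2 - ra1) * math.cos(math.radians((dec1 + dec2) / 2))
            d_dec = dec2 - dec1
            sep = math.hypot(d_ra, d_dec) * 3600.0
            pa = math.degrees(math.atan2(d_ra, d_dec)) % 360.0
            return pa, sep

        pa, sep = pa_sep(250.0000, 36.0000, 250.0020, 36.0010)
        print(f"PA = {pa:.1f} deg, separation = {sep:.2f} arcsec")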

  14. Remote sensing in hydrology: A survey of applications with selected bibliography and abstracts

    NASA Technical Reports Server (NTRS)

    Sers, S. W. (Compiler)

    1971-01-01

    Remote infrared sensing as a water exploration technique is demonstrated. Various applications are described, demonstrating that infrared sensors can locate aquifers, geothermal water, water trapped by faults, springs and water in desert regions. The potentiality of airborne IR sensors as a water prospecting tool is considered. Also included is a selected bibliography with abstracts concentrating on those publications which will better acquaint the hydrologist with investigations using thermal remote sensors as applied to water exploration.

  15. Research in advanced formal theorem-proving techniques

    NASA Technical Reports Server (NTRS)

    Rulifson, J. F.

    1971-01-01

    The present status of a continuing research program aimed at the design and implementation of a language for expressing problem-solving procedures in several areas of artificial intelligence, including program synthesis, robot planning, and theorem proving, is summarized. Notations, concepts, and procedures common to the representation and solution of many of these problems were abstracted and incorporated as features into the language. The areas of research covered are described, and abstracts of six papers that contain extensive description and technical detail of the work are presented.

  16. Soviet-French working group interpretation of the scientific information during the search for celestial sources of gamma pulses, abstract of reports, 24-30 March 1977

    NASA Technical Reports Server (NTRS)

    Estulin, I. V.

    1977-01-01

    The progress made and techniques used by the Soviet-French group in the study of gamma and X ray pulses are described in abstracts of 16 reports. Experiments included calibration and operation of various recording instruments designed for measurements involving these pulses, specifically the location of sources of such pulses in outer space. Space vehicles are utilized in conjunction with ground equipment to accomplish these tests.

  17. Global planning of several plants

    NASA Technical Reports Server (NTRS)

    Bescos, Sylvie

    1992-01-01

    This paper discusses an attempt to solve the problem of planning several pharmaceutical plants at a global level. The interest in planning at this level is to increase the global control over the production process, to improve its overall efficiency, and to reduce the need for interaction between production plants. In order to reduce the complexity of this problem and to make it tractable, some abstractions were made. Based on these abstractions, a prototype is being developed within the framework of the EUREKA project PROTOS, using Constraint Logic Programming techniques.

  18. Exploring the Unknown: Detection of Fast Variability of Starlight (Abstract)

    NASA Astrophysics Data System (ADS)

    Stanton, R. H.

    2017-12-01

    (Abstract only) In previous papers the author described a photometer designed for observing high-speed events such as lunar and asteroid occultations, and for searching for new varieties of fast stellar variability. A significant challenge presented by such a system is how one deals with the large quantity of data generated in order to process it efficiently and reveal any hidden information that might be present. This paper surveys some of the techniques used to achieve this goal.

  19. Supervised Gamma Process Poisson Factorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dylan Zachary

    This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.

  20. Assessing the impacts of water abstractions on river ecosystem services: an eco-hydraulic modelling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carolli, Mauro, E-mail: mauro.carolli@unitn.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it; Zolezzi, Guido, E-mail: guido.zolezzi@unitn.it

    The provision of important river ecosystem services (ES) is dependent on the flow regime. This requires methods to assess the impacts on ES caused by interventions on rivers that affect flow regime, such as water abstractions. This study proposes a method to i) quantify the provision of a set of river ES, ii) simulate the effects of water abstraction alternatives that differ in location and abstracted flow, and iii) assess the impact of water abstraction alternatives on the selected ES. The method is based on river modelling science, and integrates spatially distributed hydrological, hydraulic and habitat models at different spatial and temporal scales. The method is applied to the hydropeaked upper Noce River (Northern Italy), which is regulated by hydropower operations. We selected locally relevant river ES: habitat suitability for the adult marble trout, white-water rafting suitability, hydroelectricity production from run-of-river (RoR) plants. Our results quantify the seasonality of river ES response variables and their intrinsic non-linearity, which explains why the same abstracted flow can produce different effects on trout habitat and rafting suitability depending on the morphology of the abstracted reach. An economic valuation of the examined river ES suggests that incomes from RoR hydropower plants are of comparable magnitude to touristic revenue losses related to the decrease in rafting suitability.

  1. Theorists and Techniques: Connecting Education Theories to Lamaze Teaching Techniques

    PubMed Central

    Podgurski, Mary Jo

    2016-01-01

    ABSTRACT Should childbirth educators connect education theory to technique? Is there more to learning about theorists than memorizing facts for an assessment? Are childbirth educators uniquely poised to glean wisdom from theorists and enhance their classes with interactive techniques inspiring participant knowledge and empowerment? Yes, yes, and yes. This article will explore how an awareness of education theory can enhance retention of material through interactive learning techniques. Lamaze International childbirth classes already prepare participants for the childbearing year by using positive group dynamics; theory will empower childbirth educators to address education through well-studied avenues. Childbirth educators can provide evidence-based learning techniques in their classes and create true behavioral change. PMID:26848246

  2. Classification of malignant and benign lung nodules using taxonomic diversity index and phylogenetic distance.

    PubMed

    de Sousa Costa, Robherson Wector; da Silva, Giovanni Lucca França; de Carvalho Filho, Antonio Oseas; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo

    2018-05-23

    Lung cancer is the leading cause of cancer death among patients around the world and has one of the lowest survival rates after diagnosis. Therefore, this study proposes a methodology for classifying lung nodules as benign or malignant based on image processing and pattern recognition techniques. Mean phylogenetic distance (MPD) and taxonomic diversity index (Δ) were used as texture descriptors. Finally, the genetic algorithm in conjunction with the support vector machine was applied to select the best training model. The proposed methodology was tested on computed tomography (CT) images from the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), with the best sensitivity of 93.42%, specificity of 91.21%, accuracy of 91.81%, and area under the ROC curve of 0.94. The results demonstrate the promising performance of texture extraction techniques using mean phylogenetic distance and taxonomic diversity index combined with phylogenetic trees. Graphical abstract: stages of the proposed methodology.
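
    The selection step described above, a genetic algorithm wrapped around an SVM, can be sketched as follows; the synthetic data, GA settings, and cross-validated scoring are illustrative assumptions, not the paper's LIDC-IDRI setup.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(6)
        X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                                   random_state=0)

        def score(mask):
            # fitness of a feature subset = cross-validated SVM accuracy
            if not mask.any():
                return 0.0
            return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

        pop = rng.random((12, 20)) < 0.5            # population of feature masks
        for _ in range(15):
            fitness = np.array([score(m) for m in pop])
            order = np.argsort(fitness)[::-1]
            parents = pop[order[:6]]                # keep the best half
            # uniform crossover between random parent pairs, then bit-flip mutation
            pairs = rng.integers(0, 6, size=(6, 2))
            cross = np.where(rng.random((6, 20)) < 0.5,
                             parents[pairs[:, 0]], parents[pairs[:, 1]])
            children = cross ^ (rng.random((6, 20)) < 0.05)
            pop = np.vstack([parents, children])

        best = pop[np.argmax([score(m) for m in pop])]
        print("selected features:", np.where(best)[0])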

  3. Fetoscopy for meningomyelocele repair: past, present and future

    PubMed Central

    Bevilacqua, Nicole Silva; Pedreira, Denise Araujo Lapa

    2015-01-01

    ABSTRACT Meningomyelocele is a malformation with high prevalence, and one of its main comorbidities is Arnold-Chiari malformation type II. The intrauterine repair of this defect has been studied to reduce the progressive spinal cord damage during gestation. The purpose of the present review was to describe the evolution of fetal surgery for meningomyelocele repair. Searches on the PubMed database were conducted, including articles published in the last 10 years. Twenty-seven articles were selected, 16 experimental studies and 11 studies in humans. A recent study demonstrated that fetal correction results in a better prognosis for neurological and psychomotor development, but open surgery, which has been used widely, carries considerable maternal risks. Studies in animal and human models show that the endoscopic approach is feasible and leads to lower maternal morbidity rates. Two endoscopic techniques are currently under assessment - one in Germany and another in Brazil - and we believe that the endoscopic approach will be the future technique for prenatal repair of this defect. PMID:26154549

  4. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  5. Correction to Hill (2005).

    PubMed

    Hill, Clara E

    2006-01-01

    Reports an error in "Therapist Techniques, Client Involvement, and the Therapeutic Relationship: Inextricably Intertwined in the Therapy Process" by Clara E. Hill (Psychotherapy: Theory, Research, Practice, Training, 2005 Win, Vol 42(4), 431-442). An author's name was incorrectly spelled in a reference. The correct reference is presented. (The following abstract of the original article appeared in record 2006-03309-003.) I propose that therapist techniques, client involvement, and the therapeutic relationship are inextricably intertwined and need to be considered together in any discussion of the therapy process. Furthermore, I present a pantheoretical model of how these three variables evolve over four stages of successful therapy: initial impression formation, beginning the therapy (involves the components of facilitating client exploration and developing case conceptualization and treatment strategies), the core work of therapy (involves the components of theory-relevant tasks and overcoming obstacles), and termination. Theoretical propositions as well as implications for training and research are presented. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  6. Bring It On, Complexity! Present and Future of Self-Organising Middle-Out Abstraction

    NASA Astrophysics Data System (ADS)

    Mammen, Sebastian Von; Steghöfer, Jan-Philipp

    The following sections are included: The Great Complexity Challenge; Self-Organising Middle-Out Abstraction; Optimising Graphics, Physics and Artificial Intelligence; Emergence and Hierarchies in a Natural System; The Technical Concept of SOMO; Observation of interactions; Interaction pattern recognition and behavioural abstraction; Creating and adjusting hierarchies; Confidence measures; Execution model; Learning SOMO: parameters, knowledge propagation, and procreation; Current Implementations; Awareness Beyond Virtuality; Integration and emergence; Model inference; SOMO net; SOMO after me; The Future of SOMO.

  7. Hologram representation of design data in an expert system knowledge base

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Klon, Peter F.

    1988-01-01

    A novel representational scheme for design object descriptions is presented. An abstract notion of modules and signals is developed as a conceptual foundation for the scheme. This abstraction relates the objects to the meaning of system descriptions. Anchored on this abstraction, a representational model which incorporates dynamic semantics for these objects is presented. This representational model is called a hologram scheme since it represents dual level information, namely, structural and semantic. The benefits of this scheme are presented.

  8. Selected Translated Abstracts of Chinese-Language Climate Change Publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cushman, R.M.; Burtis, M.D.

    1999-05-01

    This report contains English-translated abstracts of important Chinese-language literature concerning global climate change for the years 1995-1998. This body of literature includes the topics of adaptation, ancient climate change, climate variation, the East Asia monsoon, historical climate change, impacts, modeling, and radiation and trace-gas emissions. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Chinese. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  9. Natural Language Processing.

    ERIC Educational Resources Information Center

    Chowdhury, Gobinda G.

    2003-01-01

    Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…

  10. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  11. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    PubMed Central

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-01-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction. PMID:27445157
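
    To make the embedding idea concrete, the sketch below is a hypothetical toy (not the authors' implementation; the generated network and the radial formula are placeholder assumptions) that derives angular coordinates from the first non-trivial eigenvectors of the normalized graph Laplacian:

```python
# Toy Laplacian-based embedding: angles come from the two smallest non-trivial
# Laplacian eigenvectors; radii shrink with degree so hubs sit near the centre.
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(500, 2, seed=1)      # stand-in for a real network
L = nx.normalized_laplacian_matrix(G).toarray()

vals, vecs = np.linalg.eigh(L)                    # eigenpairs, ascending order
theta = np.arctan2(vecs[:, 2], vecs[:, 1])        # angular (similarity) coordinate

deg = np.array([d for _, d in G.degree()])
r = 1.0 / np.log(deg + 2.0)                       # toy radial (popularity) coordinate

coords = {node: (r[i], theta[i]) for i, node in enumerate(G.nodes())}
print(list(coords.items())[:3])
```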

  12. Rigorously modeling self-stabilizing fault-tolerant circuits: An ultra-robust clocking scheme for systems-on-chip.

    PubMed

    Dolev, Danny; Függer, Matthias; Posch, Markus; Schmid, Ulrich; Steininger, Andreas; Lenzen, Christoph

    2014-06-01

    We present the first implementation of a distributed clock generation scheme for Systems-on-Chip that recovers from an unbounded number of arbitrary transient faults despite a large number of arbitrary permanent faults. We devise self-stabilizing hardware building blocks and a hybrid synchronous/asynchronous state machine enabling metastability-free transitions of the algorithm's states. We provide a comprehensive modeling approach that makes it possible to prove, given correctness of the constructed low-level building blocks, the high-level properties of the synchronization algorithm (which have been established in a more abstract model). We believe this approach to be of interest in its own right, since it is the first technique that permits mathematically verifying, at manageable complexity, high-level properties of a fault-prone system in terms of its very basic components. We evaluate a prototype implementation, which has been designed in VHDL, using the Petrify tool in conjunction with some extensions, and synthesized for an Altera Cyclone FPGA.

  13. Rigorously modeling self-stabilizing fault-tolerant circuits: An ultra-robust clocking scheme for systems-on-chip☆

    PubMed Central

    Dolev, Danny; Függer, Matthias; Posch, Markus; Schmid, Ulrich; Steininger, Andreas; Lenzen, Christoph

    2014-01-01

    We present the first implementation of a distributed clock generation scheme for Systems-on-Chip that recovers from an unbounded number of arbitrary transient faults despite a large number of arbitrary permanent faults. We devise self-stabilizing hardware building blocks and a hybrid synchronous/asynchronous state machine enabling metastability-free transitions of the algorithm's states. We provide a comprehensive modeling approach that makes it possible to prove, given correctness of the constructed low-level building blocks, the high-level properties of the synchronization algorithm (which have been established in a more abstract model). We believe this approach to be of interest in its own right, since it is the first technique that permits mathematically verifying, at manageable complexity, high-level properties of a fault-prone system in terms of its very basic components. We evaluate a prototype implementation, which has been designed in VHDL, using the Petrify tool in conjunction with some extensions, and synthesized for an Altera Cyclone FPGA. PMID:26516290

  14. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    NASA Astrophysics Data System (ADS)

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-07-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction.

  15. MOGO: Model-Oriented Global Optimization of Petascale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Shende, Sameer S.

    The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework where empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO made a reasonable impact on existing DOE applications and systems. New tools and techniques were developed, which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  16. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  17. A High-Level Language for Rule-Based Modelling

    PubMed Central

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D.

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  18. Diffusion of a new intermediate product in a simple ‘classical‐Schumpeterian’ model

    PubMed Central

    2017-01-01

    Abstract This paper deals with the problem of new intermediate products within a simple model, where production is circular and goods enter into the production of other goods. It studies the process by which the new good is absorbed into the economy and the structural transformation that goes with it. By means of a long‐period method the forces of structural transformation are examined, in particular the shift of existing means of production towards the innovation and the mechanism of differential growth in terms of alternative techniques and their associated systems of production. We treat two important Schumpeterian topics: the question of technological unemployment and the problem of ‘forced saving’ and the related problem of an involuntary reduction of real consumption per capita. It is shown that both phenomena are potential by‐products of the transformation process. PMID:29695874

  19. Mutual information, neural networks and the renormalization group

    NASA Astrophysics Data System (ADS)

    Koch-Janusz, Maciej; Ringel, Zohar

    2018-06-01

    Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
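
    As a hand-crafted analogue of the RG step such a network learns, here is a minimal, purely illustrative sketch of one real-space coarse-graining pass on a random 2D Ising configuration using a 2x2 block-spin majority rule:

```python
# One block-spin RG step: each 2x2 block of spins is replaced by the sign of
# its sum (majority rule), halving the linear size of the lattice.
import numpy as np

rng = np.random.default_rng(42)
spins = rng.choice([-1, 1], size=(8, 8))          # random Ising configuration

def block_spin(s):
    """Coarse-grain by 2x2 majority vote; ties are broken at random."""
    blocks = s.reshape(s.shape[0] // 2, 2, s.shape[1] // 2, 2).sum(axis=(1, 3))
    ties = blocks == 0
    blocks[ties] = rng.choice([-1, 1], size=int(ties.sum()))
    return np.sign(blocks).astype(int)

coarse = block_spin(spins)
print(spins.shape, "->", coarse.shape)            # (8, 8) -> (4, 4)
```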

  20. Exploring the potential of high resolution mass spectrometry for the investigation of lignin-derived phenol substitutes in phenolic resin syntheses.

    PubMed

    Dier, Tobias K F; Fleckenstein, Marco; Militz, Holger; Volmer, Dietrich A

    2017-05-01

    Chemical degradation is an efficient method to obtain bio-oils and other compounds from lignin. Lignin bio-oils are potential substitutes for the phenol component of phenol formaldehyde (PF) resins. Here, we developed an analytical method based on high resolution mass spectrometry that provided structural information for the synthesized lignin-derived resins and supported the prediction of their properties. Different model resins based on typical lignin degradation products were analyzed by electrospray ionization in negative ionization mode. Utilizing enhanced mass defect filter techniques provided detailed structural information on the lignin-based model resins and readily complemented the analytical data from differential scanning calorimetry and thermogravimetric analysis. Relative reactivity and chemical diversity of the phenol substitutes were significant determinants of the outcome of the PF resin synthesis and thus controlled the areas of application of the resulting polymers.

  1. Temporal abstraction and inductive logic programming for arrhythmia recognition from electrocardiograms.

    PubMed

    Carrault, G; Cordier, M-O; Quiniou, R; Wang, F

    2003-07-01

    This paper proposes a novel approach to cardiac arrhythmia recognition from electrocardiograms (ECGs). ECGs record the electrical activity of the heart and are used to diagnose many heart disorders. The numerical ECG is first temporally abstracted into series of time-stamped events. Temporal abstraction makes use of artificial neural networks to extract interesting waves and their features from the input signals. A temporal reasoner called a chronicle recogniser processes such series in order to discover temporal patterns called chronicles which can be related to cardiac arrhythmias. Generally, it is difficult to elicit an accurate set of chronicles from a doctor. Thus, we propose to learn automatically from symbolic ECG examples the chronicles discriminating the arrhythmias belonging to some specific subset. Since temporal relationships are of major importance, inductive logic programming (ILP) is the tool of choice as it enables first-order relational learning. The approach has been evaluated on real ECGs taken from the MIT-BIH database. The performance of the different modules as well as the efficiency of the whole system is presented. The results are rather good and demonstrate that integrating numerical techniques for low level perception and symbolic techniques for high level classification is very valuable.
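
    As a toy illustration of chronicle recognition (hypothetical: the event stream, wave types and delay bounds below are invented, and the authors' system is far richer), the sketch scans a series of time-stamped wave events for subsequences satisfying pairwise delay constraints:

```python
# Toy chronicle recogniser: a chronicle is an ordered list of event types with
# allowed delays between consecutive events; we search the abstracted event
# stream for ordered subsequences meeting every constraint.
from itertools import combinations

stream = [("P", 0), ("QRS", 160), ("T", 420), ("P", 800), ("QRS", 1210)]

# chronicle: P, then QRS after 120-220 ms, then T after a further 200-400 ms
chronicle = [("P", None), ("QRS", (120, 220)), ("T", (200, 400))]

def matches(events):
    if any(etype != kind for (etype, _), (kind, _) in zip(chronicle, events)):
        return False
    for i in range(1, len(events)):
        lo, hi = chronicle[i][1]
        if not lo <= events[i][1] - events[i - 1][1] <= hi:
            return False
    return True

hits = [c for c in combinations(stream, len(chronicle)) if matches(c)]
print(hits)   # [(('P', 0), ('QRS', 160), ('T', 420))]
```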

  2. Practical Techniques for Language Design and Prototyping

    DTIC Science & Technology

    2005-01-01

    Practical Techniques for Language Design and Prototyping. Mark-Oliver Stehr and Carolyn L. Talcott. Abstract. Global computing involves the interplay of a vast variety of languages, but practically useful foundations for language ...framework, namely rewriting logic, that allows us to express (1) and (2) and, in addition, language aspects such as concurrency and non-determinism. We

  3. Analysis Using Bi-Spectral Related Technique

    DTIC Science & Technology

    1993-11-17

    filtering is employed as the data is processed (equation 1). Earlier results have shown that, in contrast to the Wigner-Ville Distribution (WVD), no spectral... (Ralph Hippenstiel, November 17, 1993. Approved for public release; distribution unlimited. Prepared for: Naval Command, Control...)

  4. DEVELOPMENT OF AN AGAR LIFT-DNA/DNA HYBRIDIZATION TECHNIQUE FOR USE IN VISUALIZATION OF THE SPATIAL DISTRIBUTION OF EUBACTERIA ON SOIL SURFACES. (R825415)

    EPA Science Inventory

    Abstract

    While microbial growth is well-understood in pure culture systems, less is known about growth in intact soil systems. The objective of this work was to develop a technique to allow visualization of the two-dimensional spatial distribution of bacterial growth o...

  5. Quality Control of True Height Profiles Obtained Automatically from Digital Ionograms.

    DTIC Science & Technology

    1982-05-01

    Keywords: Ionosphere; Digisonde; Electron Density Profile; Ionogram; Autoscaling; ARTIST. Abstract: ...analysis technique currently used with the ionogram traces scaled automatically by the ARTIST software [Reinisch and Huang, 1983; Reinisch et al., 1984], and the generalized polynomial analysis technique POLAN [Titheridge, 1985], using the same ARTIST-identified ionogram traces. 2. To determine how

  6. Free-Energy Profiles of Membrane Insertion of the M2 Transmembrane Peptide from Influenza A Virus

    DTIC Science & Technology

    2008-12-01

    ABSTRACT The insertion of the M2 transmembrane peptide from influenza A virus into a membrane has been studied with molecular-dynamics simulations ...performed replica-exchange molecular-dynamics simulations with umbrella-sampling techniques to characterize the probability distribution and conformation... atomic-detailed molecular dynamics (MD) simulation techniques represent a valuable complementary methodology to investigate membrane-insertion of

  7. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.

    1987-01-01

    Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.

  8. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    PubMed Central

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Abstract Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components. PMID:27877885

  9. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
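
    For intuition, here is a hypothetical sketch of a "hands-off flow plus rising block" abstraction rule of the kind the paper evaluates; the thresholds, block shares and cap are illustrative assumptions, not the River Itchen licence conditions:

```python
# Toy "smart licensing" rule: nothing may be abstracted below a hands-off
# flow; above it, rising blocks allow a growing share of the surplus, up to
# a licensed cap. All values are illustrative (m3/s).
def permitted_abstraction(q, hands_off=1.0,
                          blocks=((2.0, 0.1), (4.0, 0.3)),
                          top_share=0.5, cap=1.5):
    """Permitted take for river flow q under a rising-block rule."""
    if q <= hands_off:
        return 0.0
    take, lower = 0.0, hands_off
    for upper, share in blocks:
        if q <= upper:
            return min(cap, take + share * (q - lower))
        take += share * (upper - lower)
        lower = upper
    return min(cap, take + top_share * (q - lower))

for q in (0.8, 1.5, 3.0, 6.0):
    print(f"flow {q:.1f} -> abstract {permitted_abstraction(q):.2f}")
```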

  10. Becoming Syntactic

    ERIC Educational Resources Information Center

    Chang, Franklin; Dell, Gary S.; Bock, Kathryn

    2006-01-01

    Psycholinguistic research has shown that the influence of abstract syntactic knowledge on performance is shaped by particular sentences that have been experienced. To explore this idea, the authors applied a connectionist model of sentence production to the development and use of abstract syntax. The model makes use of (a) error-based learning to…

  11. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    Abstract We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  12. Modeling Adaptable Business Service for Enterprise Collaboration

    NASA Astrophysics Data System (ADS)

    Boukadi, Khouloud; Vincent, Lucien; Burlat, Patrick

    Nowadays, a Service Oriented Architecture (SOA) seems to be one of the most promising paradigms for leveraging enterprise information systems. SOA creates opportunities for enterprises to provide value-added services tailored for on-demand enterprise collaboration. With the emergence and rapid development of Web services technologies, SOA is being paid increasing attention and has become widespread. In spite of the popularity of SOA, a standardized framework for modeling and implementing business services is still in progress. For the purpose of supporting these service-oriented solutions, we adopt a model-driven development approach. This paper outlines the Contextual Service Oriented Modeling and Analysis (CSOMA) methodology and presents UML profiles for PIM-level service-oriented architectural modeling, as well as its corresponding meta-models. The proposed PIM (Platform Independent Model) describes the business SOA at a high level of abstraction, regardless of the techniques involved in the application employment. In addition, all essential service-specific concerns required for delivering quality and context-aware services are covered. Among the advantages of this approach are that it is generic, and thus not closely allied with Web service technology, and that it specifically treats service adaptability during the design stage.

  13. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    NASA Astrophysics Data System (ADS)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both an optimised communication service and improved system performance. Field testing is currently unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been a common practice. Nevertheless, such a layered modelling analysis framework of the protocol stack leads to a lack of global optimisation for protocol reconfiguration. In this article, we propose a general modelling analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraint, the task interrelation, the processor and the bus sub-models from upper and lower layers (application, data link and physical layer). Cross-layer design can help to overcome the inadequacy of global optimisation based on information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri-net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  14. Proceedings of the second United Nations symposium on the development and use of geothermal resources held at San Francisco, California, May 20--29, 1975. Volume 1 (in several languages)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The 299 papers in the Proceedings are presented in three volumes and are divided into twelve sections, each section dealing with a different aspect of geothermal energy. Rapporteurs' summaries of the contents of each section are grouped together in Vol. 1 of the Proceedings; a separate abstract was prepared for each summary. Volume 1 also contains ninety-eight papers under the following section headings: present status of resources development; geology, hydrology, and geothermal systems; and geochemical techniques in exploration. Separate abstracts were prepared for ninety-seven papers. One paper was previously abstracted for ERA and appeared as CONF-750525--17. (LBS)

  15. Quasistatic Evolution in Perfect Plasticity for General Heterogeneous Materials

    NASA Astrophysics Data System (ADS)

    Solombrino, Francesco

    2014-04-01

    Inspired by some recent developments in the theory of small-strain heterogeneous elastoplasticity, we both revisit and generalize the formulation of the quasistatic evolutionary problem in perfect plasticity given by Francfort and Giacomini (Commun Pure Appl Math, 65:1185-1241, 2012). We show that their definition of the plastic dissipation measure is equivalent to an abstract one, where it is defined as the supremum of the dualities between the deviatoric parts of admissible stress fields and the plastic strains. By means of this abstract definition, a viscoplastic approximation and variational techniques from the theory of rate-independent processes give the existence of an evolution satisfying an energy-dissipation balance and consequently Hill's maximum plastic work principle for an abstract and very large class of yield conditions.

  16. Interactional Metadiscourse in Research Article Abstracts

    ERIC Educational Resources Information Center

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  17. Temporal abstraction-based clinical phenotyping with Eureka!

    PubMed

    Post, Andrew R; Kurc, Tahsin; Willard, Richie; Rathod, Himanshu; Mansour, Michel; Pai, Akshatha Kalsanka; Torian, William M; Agravat, Sanjay; Sturm, Suzanne; Saltz, Joel H

    2013-01-01

    Temporal abstraction, a method for specifying and detecting temporal patterns in clinical databases, is very expressive and performs well, but it is difficult for clinical investigators and data analysts to understand. Such patterns are critical in phenotyping patients using their medical records in research and quality improvement. We have previously developed the Analytic Information Warehouse (AIW), which computes such phenotypes using temporal abstraction but requires software engineers to use it. We have extended the AIW's web user interface, Eureka! Clinical Analytics, to support specifying phenotypes using an alternative model that we developed with clinical stakeholders. The software converts phenotypes from this model to that of temporal abstraction prior to data processing. The model can represent all phenotypes in a quality improvement project and a growing set of phenotypes in a multi-site research study. Phenotyping that is accessible to investigators and IT personnel may enable its broader adoption.

  18. The Feedback-related Negativity Codes Components of Abstract Inference during Reward-based Decision-making.

    PubMed

    Reiter, Andrea M F; Koch, Stefan P; Schröger, Erich; Hinrichs, Hermann; Heinze, Hans-Jochen; Deserno, Lorenz; Schlagenhauf, Florian

    2016-08-01

    Behavioral control is influenced not only by learning from the choices made and the rewards obtained but also by "what might have happened," that is, inference about unchosen options and their fictive outcomes. Substantial progress has been made in understanding the neural signatures of direct learning from choices that are actually made and their associated rewards via reward prediction errors (RPEs). However, electrophysiological correlates of abstract inference in decision-making are less clear. One seminal theory suggests that the so-called feedback-related negativity (FRN), an ERP peaking 200-300 msec after a feedback stimulus at frontocentral sites of the scalp, codes RPEs. Hitherto, the FRN has been predominantly related to a so-called "model-free" RPE: The difference between the observed outcome and what had been expected. Here, by means of computational modeling of choice behavior, we show that individuals employ abstract, "double-update" inference on the task structure by concurrently tracking values of chosen stimuli (associated with observed outcomes) and unchosen stimuli (linked to fictive outcomes). In a parametric analysis, model-free RPEs as well as their modification because of abstract inference were regressed against single-trial FRN amplitudes. We demonstrate that components related to abstract inference uniquely explain variance in the FRN beyond model-free RPEs. These findings advance our understanding of the FRN and its role in behavioral adaptation. This might further the investigation of disturbed abstract inference, as proposed, for example, for psychiatric disorders, and its underlying neural correlates.
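
    As a worked toy contrast (not the authors' computational model; the parameters and the anti-correlated reward structure are assumed for illustration), the sketch below compares a model-free Rescorla-Wagner learner with a "double-update" learner that also revises the unchosen option from its fictive outcome:

```python
# Model-free vs double-update learning on a two-armed task whose rewards are
# anti-correlated, so the fictive outcome of the unchosen arm is informative.
import random

random.seed(0)
alpha = 0.3                        # learning rate
V_mf = [0.5, 0.5]                  # model-free values
V_du = [0.5, 0.5]                  # double-update values

for trial in range(200):
    choice = random.randrange(2)
    p_win = 0.8 if choice == 0 else 0.2
    reward = 1.0 if random.random() < p_win else 0.0
    fictive = 1.0 - reward         # outcome the unchosen arm would have given

    rpe = reward - V_mf[choice]    # model-free RPE (the classic FRN correlate)
    V_mf[choice] += alpha * rpe

    V_du[choice] += alpha * (reward - V_du[choice])
    V_du[1 - choice] += alpha * (fictive - V_du[1 - choice])

print("model-free values:   ", [round(v, 2) for v in V_mf])
print("double-update values:", [round(v, 2) for v in V_du])
```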

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Ho-Myoung; Kim, Hee-Dong; Kim, Tae Geun, E-mail: tgkim1@korea.ac.kr

    Graphical abstract: The degradation tendency extracted by the CP technique was almost the same in both the bulk-type and TFT-type cells. - Highlights: • D{sub it} is directly investigated from bulk-type and TFT-type CTF memory. • The charge pumping technique was employed to analyze the D{sub it} information. • The CP technique is applied to monitor the reliability of 3D NAND flash. - Abstract: The energy distribution and density of interface traps (D{sub it}) are directly investigated from bulk-type and thin-film transistor (TFT)-type charge trap flash memory cells with tunnel oxide degradation, under program/erase (P/E) cycling, using a charge pumping (CP) technique, in view of application in a 3-dimensional stackable NAND flash memory cell. After P/E cycling in bulk-type devices, the interface trap density gradually increased from 1.55 × 10{sup 12} cm{sup −2} eV{sup −1} to 3.66 × 10{sup 13} cm{sup −2} eV{sup −1} due to tunnel oxide damage, which was consistent with the subthreshold swing and transconductance degradation after P/E cycling. Its distribution moved toward shallow energy levels with increasing cycling numbers, which coincided with the decay rate degradation with short-term retention time. The tendency extracted with the CP technique for D{sub it} of the TFT-type cells was similar to that of the bulk-type cells.

  20. Modelling fate and transport of pesticides in river catchments with drinking water abstractions

    NASA Astrophysics Data System (ADS)

    Desmet, Nele; Seuntjens, Piet; Touchant, Kaatje

    2010-05-01

    When drinking water is abstracted from surface water, the presence of pesticides may have a large impact on purification costs. In order to respect imposed thresholds at points of drinking water abstraction in a river catchment, sustainable pesticide management strategies might be required in certain areas. To improve management strategies, a sound understanding of the emission routes, the transport, the environmental fate and the sources of pesticides is needed. However, pesticide monitoring data, on which measures are founded, are generally scarce. Data scarcity hampers interpretation and decision making. In such a case, a modelling approach can be very useful as a tool to obtain complementary information. Modelling makes it possible to take into account temporal and spatial variability in both discharges and concentrations. In the Netherlands, the Meuse river is used for drinking water abstraction and the government imposes the European drinking water standard for individual pesticides (0.1 µg L-1) for surface waters at points of drinking water abstraction. The reported glyphosate concentrations in the Meuse river frequently exceed the standard, and this enhances the request for targeted measures. In this study, a model for the Meuse river was developed to estimate the contribution of influxes at the Dutch-Belgian border to the concentration levels detected at the drinking water intake 250 km downstream and to assess the contribution of the tributaries to the glyphosate loads. The effects of glyphosate decay on environmental fate were considered as well. Our results show that the application of a river model makes it possible to assess the fate and transport of pesticides in a catchment in spite of monitoring data scarcity. Furthermore, the model provides insight into the contribution of different sub-basins to the pollution level. The modelling results indicate that the effect of local measures to reduce pesticide concentrations in the river at points of drinking water abstraction might be limited due to dominant transboundary loads. This emphasizes the need for transboundary management strategies on a river catchment scale.
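
    As a minimal illustration of the fate component (a textbook steady-state sketch, not the Meuse model itself; the initial concentration, decay rate and velocity are placeholders), first-order decay along a reach gives C(x) = C0 · exp(-k · x / v):

```python
# Steady-state concentration of a decaying pesticide x km downstream of its
# input: C(x) = C0 * exp(-k * x / v). Values are illustrative, not calibrated.
import math

def concentration(x_km, c0=0.2, k_per_day=0.05, v_km_per_day=40.0):
    """Concentration (ug/L) after travelling x_km at velocity v with decay k."""
    travel_time_days = x_km / v_km_per_day
    return c0 * math.exp(-k_per_day * travel_time_days)

for x in (0, 50, 125, 250):
    print(f"{x:>3} km downstream: {concentration(x):.3f} ug/L")
```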

  1. Effects of air annealing on CdS quantum dots thin film grown at room temperature by CBD technique intended for photosensor applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaikh, Shaheed U.; Desale, Dipalee J.; Siddiqui, Farha Y.

    2012-11-15

    Graphical abstract: The effect of different intensities (40, 60, 100 and 200 W) of light on CdS quantum dots thin film annealed at 350 °C, indicating enhancement in (a) photo-current and (b) photosensitivity. Highlights: ► The preparation of CdS nanodot thin film at room temperature by the M-CBD technique. ► Study of air annealing on the prepared CdS nanodot thin film. ► The optimized annealing temperature for CdS nanodot thin film is 350 °C. ► Modified CdS thin films can be used in photosensor applications. -- Abstract: CdS quantum dots thin-films have been deposited onto glass substrates at room temperature using a modified chemical bath deposition technique. The prepared thin films were further annealed in an air atmosphere at 150, 250 and 350 °C for 1 h and subsequently characterized by scanning electron microscopy, ultraviolet–visible spectroscopy, electrical resistivity and an I–V system. The modifications observed in the morphology and opto-electrical properties of the thin films are presented.

  2. MODELING DIFFUSION AND REACTION IN SOILS: X. A UNIFYING MODEL FOR SOLUTE AND GAS DIFFUSIVITY IN UNSATURATED SOIL. (R825433)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  3. MODELING THE FATE OF TOLUENE IN A CHAMBER WITH ALFALFA PLANTS 1. THEORY AND MODELING CONCEPTS. (R825549C062)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Labyrinth, An Abstract Model for Hypermedia Applications. Description of its Static Components.

    ERIC Educational Resources Information Center

    Diaz, Paloma; Aedo, Ignacio; Panetsos, Fivos

    1997-01-01

    The model for hypermedia applications called Labyrinth allows: (1) the design of platform-independent hypermedia applications; (2) the categorization, generalization and abstraction of sparse unstructured heterogeneous information in multiple and interconnected levels; (3) the creation of personal views in multiuser hyperdocuments for both groups…

  5. Modelling a Network of Decision Makers

    DTIC Science & Technology

    2004-06-01

    Briefing charts (31 pages, containing color images); dates covered: 2004. No abstract provided.

  6. Evaluation of the TEAM Train-the-Trainer program

    DOT National Transportation Integrated Search

    1992-05-22

    Author's abstract: The objective of this study was to evaluate the effectiveness of Techniques for Effective Alcohol Management (TEAM) Train-the-Trainer workshops. Effectiveness was measured in terms of the success facility representatives had, after...

  7. Students' Abstraction in Recognizing, Building with and Constructing a Quadrilateral

    ERIC Educational Resources Information Center

    Budiarto, Mega Teguh; Rahaju, Endah Budi; Hartono, Sugi

    2017-01-01

    This study aims to investigate empirically students' abstraction against the socio-cultural background of Indonesia. Abstraction is an activity that involves a vertical reorganization of previously constructed mathematics into a new mathematical structure. The principal components of the model are three dynamic nested epistemic actions: recognizing,…

  8. Beyond Captions: Linking Figures with Abstract Sentences in Biomedical Articles

    PubMed Central

    Bockhorst, Joseph P.; Conroy, John M.; Agarwal, Shashank; O’Leary, Dianne P.; Yu, Hong

    2012-01-01

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically consider only caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org. PMID:22815711

  9. Modelling difficulties in abstract thinking in psychosis: the importance of socio-developmental background.

    PubMed

    Berg, A O; Melle, I; Zuber, V; Simonsen, C; Nerhus, M; Ueland, T; Andreassen, O A; Sundet, K; Vaskinn, A

    2017-01-01

    Abstract thinking is important in the modern understanding of neurocognitive abilities, and a symptom of thought disorder in psychosis. In patients with psychosis, we assessed whether socio-developmental background influences abstract thinking, and the association with executive functioning and clinical psychosis symptoms. Participants (n = 174) had a diagnosis of psychotic or bipolar disorder, were 17-65 years old, had an intelligence quotient (IQ) > 70, were fluent in a Scandinavian language, and had completed their full primary education in Norway. Immigrants (N = 58) were matched (1:2) with participants without a history of migration (N = 116). All participants completed a neurocognitive and clinical assessment. Socio-developmental background was operationalised as the human development index (HDI) of country of birth, at year of birth. Structural equation modelling was used to identify the model with the best fit. The best-fitting model, χ2 = 96.591, df = 33, p < .001, confirmed a significant indirect effect of HDI scores on abstract thinking through executive functioning, but not through clinical psychosis symptoms. This study found that socio-developmental background influences abstract thinking in psychosis through an indirect effect via executive functioning. We should take socio-developmental background into account in the interpretation of neurocognitive performance in patients with psychosis, and prioritise cognitive remediation in the treatment of immigrant patients.

  10. Abstract memory representations in the ventromedial prefrontal cortex and hippocampus support concept generalization.

    PubMed

    Bowman, Caitlin R; Zeithamova, Dagmar

    2018-02-07

    Memory function involves both the ability to remember details of individual experiences and the ability to link information across events to create new knowledge. Prior research has identified the ventromedial prefrontal cortex (VMPFC) and the hippocampus as important for integrating across events in service of generalization in episodic memory. The degree to which these memory integration mechanisms contribute to other forms of generalization, such as concept learning, is unclear. The present study used a concept-learning task in humans (both sexes) coupled with model-based fMRI to test whether VMPFC and hippocampus contribute to concept generalization, and whether they do so by maintaining specific category exemplars or abstract category representations. Two formal categorization models were fit to individual subject data: a prototype model that posits abstract category representations and an exemplar model that posits category representations based on individual category members. Latent variables from each of these models were entered into neuroimaging analyses to determine whether VMPFC and the hippocampus track prototype or exemplar information during concept generalization. Behavioral model fits indicated that almost three quarters of the subjects relied on prototype information when making judgments about new category members. Paralleling prototype dominance in behavior, correlates of the prototype model were identified in VMPFC and the anterior hippocampus with no significant exemplar correlates. These results indicate that the VMPFC and portions of the hippocampus play a broad role in memory generalization and that they do so by representing abstract information integrated from multiple events. SIGNIFICANCE STATEMENT Whether people represent concepts as a set of individual category members or by deriving generalized concept representations abstracted across exemplars has been debated. In episodic memory, generalized memory representations have been shown to arise through integration across events supported by the ventromedial prefrontal cortex (VMPFC) and hippocampus. The current study combined formal categorization models with fMRI data analysis to show that the VMPFC and anterior hippocampus represent abstract prototype information during concept generalization, contributing novel evidence of generalized concept representations in the brain. Results indicate that VMPFC-hippocampal memory integration mechanisms contribute to knowledge generalization across multiple cognitive domains, with the degree of abstraction of memory representations varying along the long axis of the hippocampus. Copyright © 2018 the authors.
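
    To make the two model classes concrete, here is a minimal hypothetical sketch (not the study's fitting code; real fits add sensitivity and attention-weight parameters, and the stimuli below are invented): a prototype model compares a new item to each category's feature average, while an exemplar model sums its similarity to every stored member:

```python
# Prototype vs exemplar classification over binary feature vectors.
import numpy as np

def prototype_predict(x, cat_a, cat_b):
    """Pick the category whose prototype (feature mean) is nearer to x."""
    da = np.abs(x - cat_a.mean(axis=0)).sum()
    db = np.abs(x - cat_b.mean(axis=0)).sum()
    return "A" if da < db else "B"

def exemplar_predict(x, cat_a, cat_b, c=1.0):
    """Pick the category with greater summed similarity across exemplars."""
    sim = lambda members: np.exp(-c * np.abs(x - members).sum(axis=1)).sum()
    return "A" if sim(cat_a) > sim(cat_b) else "B"

cat_a = np.array([[1, 1, 1, 0], [1, 1, 0, 1], [1, 0, 1, 1]])
cat_b = np.array([[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]])
new_item = np.array([1, 1, 0, 0])
print(prototype_predict(new_item, cat_a, cat_b),
      exemplar_predict(new_item, cat_a, cat_b))   # A A
```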

  11. Surgery for cervical intraepithelial neoplasia

    PubMed Central

    Martin-Hirsch, Pierre PL; Paraskevaidis, Evangelos; Bryant, Andrew; Dickinson, Heather O; Keep, Sarah L

    2014-01-01

    Background Cervical intraepithelial neoplasia (CIN) is the most common pre-malignant lesion. Atypical squamous changes occur in the transformation zone of the cervix with mild, moderate or severe changes described by their depth (CIN 1, 2 or 3). Cervical intraepithelial neoplasia is treated by local ablation or lower morbidity excision techniques. Choice of treatment depends on the grade and extent of the disease. Objectives To assess the effectiveness and safety of alternative surgical treatments for CIN. Search methods We searched the Cochrane Gynaecological Cancer Group Trials Register, Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE and EMBASE (up to April 2009). We also searched registers of clinical trials, abstracts of scientific meetings and reference lists of included studies. Selection criteria Randomised controlled trials (RCTs) of alternative surgical treatments in women with cervical intraepithelial neoplasia. Data collection and analysis Two review authors independently abstracted data and assessed risks of bias. Risk ratios that compared residual disease after the follow-up examination and adverse events in women who received one of either laser ablation, laser conisation, large loop excision of the transformation zone (LLETZ), knife conisation or cryotherapy were pooled in random-effects model meta-analyses. Main results Twenty-nine trials were included. Seven surgical techniques were tested in various comparisons. No significant differences in treatment failures were demonstrated in terms of persistent disease after treatment. Large loop excision of the transformation zone appeared to provide the most reliable specimens for histology with the least morbidity. Morbidity was lower than with laser conisation, although the trials did not provide data for every outcome measure. There were not enough data to assess the effect on morbidity when compared with laser ablation. Authors’ conclusions The evidence suggests that there is no obvious superior surgical technique for treating cervical intraepithelial neoplasia in terms of treatment failures or operative morbidity. PMID:20556751

  12. Prediction of homoprotein and heteroprotein complexes by protein docking and template‐based modeling: A CASP‐CAPRI experiment

    PubMed Central

    Velankar, Sameer; Kryshtafovych, Andriy; Huang, Shen‐You; Schneidman‐Duhovny, Dina; Sali, Andrej; Segura, Joan; Fernandez‐Fuentes, Narcis; Viswanath, Shruthi; Elber, Ron; Grudinin, Sergei; Popov, Petr; Neveu, Emilie; Lee, Hasup; Baek, Minkyung; Park, Sangwoo; Heo, Lim; Rie Lee, Gyu; Seok, Chaok; Qin, Sanbo; Zhou, Huan‐Xiang; Ritchie, David W.; Maigret, Bernard; Devignes, Marie‐Dominique; Ghoorah, Anisah; Torchala, Mieczyslaw; Chaleil, Raphaël A.G.; Bates, Paul A.; Ben‐Zeev, Efrat; Eisenstein, Miriam; Negi, Surendra S.; Weng, Zhiping; Vreven, Thom; Pierce, Brian G.; Borrman, Tyler M.; Yu, Jinchao; Ochsenbein, Françoise; Guerois, Raphaël; Vangone, Anna; Rodrigues, João P.G.L.M.; van Zundert, Gydo; Nellen, Mehdi; Xue, Li; Karaca, Ezgi; Melquiond, Adrien S.J.; Visscher, Koen; Kastritis, Panagiotis L.; Bonvin, Alexandre M.J.J.; Xu, Xianjin; Qiu, Liming; Yan, Chengfei; Li, Jilong; Ma, Zhiwei; Cheng, Jianlin; Zou, Xiaoqin; Shen, Yang; Peterson, Lenna X.; Kim, Hyung‐Rae; Roy, Amit; Han, Xusi; Esquivel‐Rodriguez, Juan; Kihara, Daisuke; Yu, Xiaofeng; Bruce, Neil J.; Fuller, Jonathan C.; Wade, Rebecca C.; Anishchenko, Ivan; Kundrotas, Petras J.; Vakser, Ilya A.; Imai, Kenichiro; Yamada, Kazunori; Oda, Toshiyuki; Nakamura, Tsukasa; Tomii, Kentaro; Pallara, Chiara; Romero‐Durana, Miguel; Jiménez‐García, Brian; Moal, Iain H.; Férnandez‐Recio, Juan; Joung, Jong Young; Kim, Jong Yun; Joo, Keehyoung; Lee, Jooyoung; Kozakov, Dima; Vajda, Sandor; Mottarella, Scott; Hall, David R.; Beglov, Dmitri; Mamonov, Artem; Xia, Bing; Bohnuud, Tanggis; Del Carpio, Carlos A.; Ichiishi, Eichiro; Marze, Nicholas; Kuroda, Daisuke; Roy Burman, Shourya S.; Gray, Jeffrey J.; Chermak, Edrisse; Cavallo, Luigi; Oliva, Romina; Tovchigrechko, Andrey

    2016-01-01

    ABSTRACT We present the results for CAPRI Round 30, the first joint CASP‐CAPRI experiment, which brought together experts from the protein structure prediction and protein–protein docking communities. The Round comprised 25 targets from amongst those submitted for the CASP11 prediction experiment of 2014. The targets included mostly homodimers, a few homotetramers, and two heterodimers, and comprised protein chains that could readily be modeled using templates from the Protein Data Bank. On average 24 CAPRI groups and 7 CASP groups submitted docking predictions for each target, and 12 CAPRI groups per target participated in the CAPRI scoring experiment. In total more than 9500 models were assessed against the 3D structures of the corresponding target complexes. Results show that the prediction of homodimer assemblies by homology modeling techniques and docking calculations is quite successful for targets featuring large enough subunit interfaces to represent stable associations. Targets with ambiguous or inaccurate oligomeric state assignments, often featuring crystal contact‐sized interfaces, represented a confounding factor. For those, a much poorer prediction performance was achieved, while nonetheless often providing helpful clues on the correct oligomeric state of the protein. The prediction performance was very poor for genuine tetrameric targets, where the inaccuracy of the homology‐built subunit models and the smaller pair‐wise interfaces severely limited the ability to derive the correct assembly mode. Our analysis also shows that docking procedures tend to perform better than standard homology modeling techniques and that highly accurate models of the protein components are not always required to identify their association modes with acceptable accuracy. Proteins 2016; 84(Suppl 1):323–348. © 2016 The Authors Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc. PMID:27122118

  13. Qumquad: a UML-based approach for remodeling of legacy systems in health care.

    PubMed

    Garde, Sebastian; Knaup, Petra; Herold, Ralf

    2003-07-01

Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems, a thorough remodeling is indispensable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematic identification of weak points in this model and (iii) development of a target concept for the reimplementation considering the identified weak points. We applied Qumquad for remodeling a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.

  14. Towards Using Reo for Compliance-Aware Business Process Modeling

    NASA Astrophysics Data System (ADS)

    Arbab, Farhad; Kokash, Natallia; Meng, Sun

Business process modeling and implementation of process supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models that are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory/legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.

  15. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the ability to abstract quantitative information from SFG-VS experiments. In this review, we try to make assessments of the limitations, issues and techniques as well as methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also try to summarize recent developments in methodologies on quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cosθ⟩/⟨cos³θ⟩, and a comparison of the PNA method with the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively abstract crucial information from the SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expression with the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendixes. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectrum of the vapour/neat liquid interfaces in different polarizations. Using the polarization analysis in SFG-VS, polarization selection rules or guidelines are developed for assignment of the SFG-VS spectrum. Using the selection rules, SFG-VS spectra of vapour/diol and vapour/n-normal alcohol (n = 1-8) interfaces are assigned, and some of the ambiguity and confusion, as well as their implications in previous IR and Raman assignment, are duly discussed. The ability to assign a SFG-VS spectrum using the polarization selection rules makes SFG-VS not only an effective and useful vibrational spectroscopy technique for interface studies, but also a complementary vibrational spectroscopy method in general condensed phase studies. These developments will put quantitative orientational and spectral analysis in SFG-VS on a more solid foundation. The formulations, concepts and issues discussed in this review are expected to find broad applications for investigations on molecular interfaces in the future.

  16. Evaluating the morphology of the left atrial appendage by a transesophageal echocardiographic 3-dimensional printed model

    PubMed Central

    Song, Hongning; Zhou, Qing; Zhang, Lan; Deng, Qing; Wang, Yijia; Hu, Bo; Tan, Tuantuan; Chen, Jinling; Pan, Yiteng; He, Fazhi

    2017-01-01

Abstract The novel 3-dimensional printing (3DP) technique has shown its ability to assist personalized cardiac intervention therapy. This study aimed to determine the feasibility of 3D-printed left atrial appendage (LAA) models based on 3D transesophageal echocardiography (3D TEE) data and their application value in treating LAA occlusions. Eighteen patients with transcatheter LAA occlusion and preprocedure 3D TEE and cardiac computed tomography were enrolled. 3D TEE volumetric data of the LAA were acquired and postprocessed for 3DP. Two types of 3D models of the LAA (ie, hard chamber model and flexible wall model) were printed by a 3D printer. The morphological classification and lobe identification of the LAA were assessed by the 3D chamber model, and LAA dimensions were measured via the 3D wall model. Additionally, a simulation operative rehearsal was performed on the 3D models in cases of challenging LAA morphology for the purpose of understanding the interactions between the device and the model. Three-dimensional TEE volumetric data of the LAA were successfully reprocessed and printed as 3D LAA chamber models and 3D LAA wall models in all patients. The consistency of the morphological classifications of the LAA based on 3D models and cardiac computed tomography was 0.92 (P < .01). The LAA ostium dimensions and depth measured using the 3D models were not significantly different from those measured on 3D TEE (P > .05). A simulation occlusion was successfully performed on the 3D models of the 2 challenging cases and compared with the real procedure. The echocardiographic 3DP technique is feasible and accurate in reflecting the spatial morphology of the LAA, which may be promising for the personalized planning of transcatheter LAA occlusion. PMID:28930824

  17. Simulation and Verification of Synchronous Set Relations in Rewriting Logic

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Munoz, Cesar A.

    2011-01-01

This paper presents a mathematical foundation and a rewriting logic infrastructure for the execution and property verification of synchronous set relations. The mathematical foundation is given in the language of abstract set relations. The infrastructure consists of an order-sorted rewrite theory in Maude, a rewriting logic system, that enables the synchronous execution of a set relation provided by the user. By using the infrastructure, existing algorithm verification techniques already available in Maude for traditional asynchronous rewriting, such as reachability analysis and model checking, are automatically available to synchronous set rewriting. The use of the infrastructure is illustrated with an executable operational semantics of a simple synchronous language and the verification of temporal properties of a synchronous system.

  18. Manufacturing Techniques for Titanium Aluminide Based Alloys and Metal Matrix Composites

    DTIC Science & Technology

    2010-01-01

Titanium aluminides are being used in the low pressure turbine (LPT) blades. In addition, titanium aluminides were also investigated for use in the High Speed Civil Transport... Titanium aluminides are also being used in General Electric's GEnx gas turbine engine for the 6th and the 7th stage of the low pressure turbine blades... Title of Dissertation: Manufacturing Techniques for Titanium Aluminide Based Alloys and Metal Matrix Composites.

  19. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
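
    The core extract-and-quantify loop is easy to illustrate. The sketch below thresholds a scalar field and labels connected regions on a regular grid; the article's unstructured-grid case replaces pixel adjacency with mesh connectivity, and the threshold value here is an arbitrary assumption.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    field = rng.random((64, 64))             # stand-in for a simulation field
    mask = field > 0.9                       # threshold defines regions of interest
    labels, n = ndimage.label(mask)          # connected-component segmentation
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))  # quantify regions
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    print(n, sizes.max(), centroids[0])      # count, largest region, a centroid
    ```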

  20. BIOMARKER ASSAYS IN NIPPLE ASPIRATE FLUID

    EPA Science Inventory

    The noninvasive technique of nipple aspiration as a potential source of biomarkers of breast cancer risk was evaluated. The feasibility of performing mutagenesis assays, amplifying DNA and performing protein electrophoresis on nipple aspirate fluid was explored. ...

  1. Thermo-Reflectance Spectra of Eros: Unambiguous Detection of Olivine

    NASA Technical Reports Server (NTRS)

    Lucey, P. G.; Hinrichs, J. L.; Urquhart-Kelly, M.; Wellnitz, D.; Bell, J. F., III; Clark, B. E.

    2001-01-01

    Olivine is readily detected on 433 Eros using the new thermo-reflectance spectral technique applied to near-IR spectra obtained at Eros by the NEAR spacecraft. Additional information is contained in the original extended abstract.

  2. Certification trails for data structures

    NASA Technical Reports Server (NTRS)

    Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. The applicability of the certification trail technique is significantly generalized. Previously, certification trails had to be customized to each algorithm application; trails appropriate to wide classes of algorithms were developed. These certification trails are based on common data-structure operations such as those carried out using balanced binary trees and heaps. Any algorithms using these sets of operations can therefore employ the certification trail method to achieve software fault tolerance. To exemplify the scope of the generalization of the certification trail technique provided, constructions of trails for abstract data types such as priority queues and union-find structures are given. These trails are applicable to any data-structure implementation of the abstract data type. It is also shown that these ideas lead naturally to monitors for data-structure operations.

  3. T-L Plane Abstraction-Based Energy-Efficient Real-Time Scheduling for Multi-Core Wireless Sensors.

    PubMed

    Kim, Youngmin; Lee, Ki-Seong; Pham, Ngoc-Son; Lee, Sun-Ro; Lee, Chan-Gun

    2016-07-08

    Energy efficiency is considered as a critical requirement for wireless sensor networks. As more wireless sensor nodes are equipped with multi-cores, there are emerging needs for energy-efficient real-time scheduling algorithms. The T-L plane-based scheme is known to be an optimal global scheduling technique for periodic real-time tasks on multi-cores. Unfortunately, there has been a scarcity of studies on extending T-L plane-based scheduling algorithms to exploit energy-saving techniques. In this paper, we propose a new T-L plane-based algorithm enabling energy-efficient real-time scheduling on multi-core sensor nodes with dynamic power management (DPM). Our approach addresses the overhead of processor mode transitions and reduces fragmentations of the idle time, which are inherent in T-L plane-based algorithms. Our experimental results show the effectiveness of the proposed algorithm compared to other energy-aware scheduling methods on T-L plane abstraction.
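
    The idle-time fragmentation problem the paper targets shows up in a simple break-even calculation: a low-power state only pays off when an idle interval is long enough to amortize the mode-transition overhead. The sketch below is a generic DPM break-even check with made-up power numbers, not the paper's scheduling algorithm.

    ```python
    def dpm_decisions(idle_intervals, t_trans, p_idle, p_sleep, e_trans):
        """For each idle interval (seconds), compare the energy of staying
        idle against sleeping and paying the transition cost."""
        out = []
        for t in idle_intervals:
            e_stay = p_idle * t
            e_sleep = e_trans + p_sleep * max(0.0, t - t_trans)
            out.append("sleep" if t > t_trans and e_sleep < e_stay else "stay")
        return out

    # Fragmented idle time (many short gaps) defeats DPM; merged idle time helps.
    print(dpm_decisions([0.2, 0.2, 0.2, 0.2], 0.5,
                        p_idle=0.4, p_sleep=0.05, e_trans=0.3))  # all "stay"
    print(dpm_decisions([0.8], 0.5,
                        p_idle=0.4, p_sleep=0.05, e_trans=0.3))  # ["sleep"]
    ```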

  4. An intermediate level of abstraction for computational systems chemistry.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2017-12-28

Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions for that era are scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  5. Trajectory-Based Performance Assessment for Aviation Weather Information

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed, that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful visualization techniques of weather information.
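
    In its simplest conservative form, the 4-D intersection test at the heart of this abstraction reduces to interval overlap on each of the four axes. The sketch below uses axis-aligned bounding boxes; real trajectory hypertubes and weather hypervolumes are not boxes, so this is only the cheap first-pass filter one would run before a finer test, and the coordinates are toy values.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Box4D:
        lo: tuple  # (x, y, z, t) minima
        hi: tuple  # (x, y, z, t) maxima

    def intersects(a: Box4D, b: Box4D) -> bool:
        # Two axis-aligned 4-D boxes overlap iff they overlap on every axis.
        return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(4))

    # A flight segment (nm, nm, ft, s) vs. a hazardous-weather volume.
    tube = Box4D((0, 0, 30000, 0), (50, 5, 35000, 600))
    storm = Box4D((40, -10, 28000, 300), (90, 2, 40000, 1200))
    print(intersects(tube, storm))  # True: the segment meets the hazard in 4-D
    ```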

  6. 1976 annual summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-01

    Abstracts of papers published during the previous calendar year, arranged in accordance with the project titles used in the USDOE Schedule 189 Budget Proposals, are presented. The collection of abstracts supplements the listing of papers published in the Schedule 189. The following subject areas are represented: high-energy physics; nuclear physics; basic energy sciences (nuclear science, materials sciences, solid state physics, materials chemistry); molecular, mathematical, and earth sciences (fundamental interactions, processes and techniques, mathematical and computer sciences); environmental research and development; physical and technological studies (characterization, measurement and monitoring); and nuclear research and applications.

  7. Time-Reversal Based Range Extension Technique for Ultra-wideband (UWB) Sensors and Applications in Tactical Communications and Networking

    DTIC Science & Technology

    2007-10-16

...modulation pulse waveform, software defined or cognitive. From an information-theoretical viewpoint, the two parts as a whole form so-called "pre-coding". ...The time domain system of Fig. 2.3(b) is based on a digital sampling oscilloscope (DSO), the Tektronix TDS 7000E3. The time domain sounder has the capability...

  8. Artificial neural networks for defining the water quality determinants of groundwater abstraction in coastal aquifer

    NASA Astrophysics Data System (ADS)

    Lallahem, S.; Hani, A.

    2017-02-01

    Water sustainability in the lower Seybouse River basin, eastern Algeria, must take into account the importance of water quantity and quality integration. So, there is a need for a better knowledge and understanding of the water quality determinants of groundwater abstraction to meet the municipal and agricultural uses. In this paper, the artificial neural network (ANN) models were used to model and predict the relationship between groundwater abstraction and water quality determinants in the lower Seybouse River basin. The study area chosen is the lower Seybouse River basin and real data were collected from forty five wells for reference year 2006. Results indicate that the feed-forward multilayer perceptron models with back-propagation are useful tools to define and prioritize the important water quality parameters of groundwater abstraction and use. The model evaluation shows that the correlation coefficients are more than 95% for training, verification and testing data. The model aims to link the water quantity and quality with the objective to strengthen the Integrated Water Resources Management approach. It assists water planners and managers to better assess the water quality parameters and progress towards the provision of appropriate quantities of water of suitable quality.
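
    A feed-forward MLP with back-propagation of the kind the abstract describes can be sketched with scikit-learn. The synthetic data below stand in for the 45 wells and their quality parameters; the network size, split, and scaling are illustrative assumptions, not the authors' configuration.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(45, 6))                  # 45 wells x 6 quality parameters
    y = X @ rng.normal(size=6) + rng.normal(scale=0.1, size=45)  # abstraction rate

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_tr)
    mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                       random_state=0).fit(scaler.transform(X_tr), y_tr)
    print(mlp.score(scaler.transform(X_te), y_te))   # R^2 on held-out wells
    ```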

  9. The Abstract Selection Task: New Data and an Almost Comprehensive Model

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph; Stahl, Christoph; Erdfelder, Edgar

    2007-01-01

    A complete quantitative account of P. Wason's (1966) abstract selection task is proposed. The account takes the form of a mathematical model. It is assumed that some response patterns are caused by inferential reasoning, whereas other responses reflect cognitive processes that affect each card selection separately and independently of other card…

  10. Beyond Exemplars and Prototypes as Memory Representations of Natural Concepts: A Clustering Approach

    ERIC Educational Resources Information Center

    Verbeemen, Timothy; Vanpaemel, Wolf; Pattyn, Sven; Storms, Gert; Verguts, Tom

    2007-01-01

    Categorization in well-known natural concepts is studied using a special version of the Varying Abstraction Framework (Vanpaemel, W., & Storms, G. (2006). A varying abstraction framework for categorization. Manuscript submitted for publication; Vanpaemel, W., Storms, G., & Ons, B. (2005). A varying abstraction model for categorization. In B. Bara,…

  11. MRMAide: a mixed resolution modeling aide

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; McGraw, Robert M.

    2002-07-01

    The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.

  12. Model, analysis, and evaluation of the effects of analog VLSI arithmetic on linear subspace-based image recognition.

    PubMed

    Carvajal, Gonzalo; Figueroa, Miguel

    2014-07-01

    Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology to achieve compact and low-power implementations of these computationally intensive tasks for portable embedded devices. However, device mismatch limits the resolution of the circuits fabricated with this technology. Traditional layout techniques to reduce the mismatch aim to increase the resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects in the application level would allow designers to apply focalized mismatch compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For the feature extraction, we propose analog adaptive linear combiners with on-chip learning for both Least Mean Square (LMS) and Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For the classification, we derive analog models for the circuits necessary to implement Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers with standard databases of face and hand-writing digits. Formal analysis and experiments show how we can exploit adaptive structures and properties of the input space to compensate the effects of device mismatch at the application level, thus reducing the design overhead of traditional layout techniques. Results are also directly extensible to multiple application domains using linear subspace methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
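
    The adaptive linear combiner with LMS learning that the paper implements in analog hardware has a compact software form. The sketch below is the textbook LMS update; the paper's point is that this same feedback loop, run on-chip, absorbs certain mismatch parameters automatically. Tap count and step size here are arbitrary.

    ```python
    import numpy as np

    def lms(x, d, mu=0.01, n_taps=4):
        """Least Mean Square adaptive linear combiner.
        x: input signal, d: desired signal, mu: step size."""
        w = np.zeros(n_taps)
        y = np.zeros(len(x))
        for n in range(n_taps - 1, len(x)):
            u = x[n - n_taps + 1:n + 1][::-1]   # most recent samples first
            y[n] = w @ u
            e = d[n] - y[n]                     # error signal drives the update
            w += 2 * mu * e * u                 # gradient-descent adaptation
        return w, y

    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    d = np.convolve(x, [0.5, -0.3, 0.1, 0.05], mode="full")[:len(x)]
    w, _ = lms(x, d)
    print(np.round(w, 2))   # converges toward the unknown filter taps
    ```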

  13. Researching Mental Health Disorders in the Era of Social Media: Systematic Review.

    PubMed

    Wongkoblap, Akkapon; Vadillo, Miguel A; Curcin, Vasa

    2017-06-29

    Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorder is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. ©Akkapon Wongkoblap, Miguel A Vadillo, Vasa Curcin. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2017.

  14. Neural network submodel as an abstraction tool: relating network performance to combat outcome

    NASA Astrophysics Data System (ADS)

    Jablunovsky, Greg; Dorman, Clark; Yaworsky, Paul S.

    2000-06-01

Simulation of Command and Control (C2) networks has historically emphasized individual system performance with little architectural context or credible linkage to 'bottom-line' measures of combat outcomes. Renewed interest in modeling C2 effects and relationships stems from emerging network intensive operational concepts. This demands improved methods to span the analytical hierarchy between C2 system performance models and theater-level models. Neural network technology offers a modeling approach that can abstract the essential behavior of higher resolution C2 models within a campaign simulation. The proposed methodology uses off-line learning of the relationships between network state and campaign-impacting performance of a complex C2 architecture and then approximation of that performance as a time-varying parameter in an aggregated simulation. Ultimately, this abstraction tool offers an increased fidelity of C2 system simulation that captures dynamic network dependencies within a campaign context.

  15. Model-based object classification using unification grammars and abstract representations

    NASA Astrophysics Data System (ADS)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.

  16. Abstraction of the Relational Model from a Department of Veterans Affairs DHCP Database: Bridging Theory and Working Application

    PubMed Central

    Levy, C.; Beauchamp, C.

    1996-01-01

    This poster describes the methods used and working prototype that was developed from an abstraction of the relational model from the VA's hierarchical DHCP database. Overlaying the relational model on DHCP permits multiple user views of the physical data structure, enhances access to the database by providing a link to commercial (SQL based) software, and supports a conceptual managed care data model based on primary and longitudinal patient care. The goal of this work was to create a relational abstraction of the existing hierarchical database; to construct, using SQL data definition language, user views of the database which reflect the clinical conceptual view of DHCP, and to allow the user to work directly with the logical view of the data using GUI based commercial software of their choosing. The workstation is intended to serve as a platform from which a managed care information model could be implemented and evaluated.

  17. Achilles and the tortoise: Some caveats to mathematical modeling in biology.

    PubMed

    Gilbert, Scott F

    2018-01-31

    Mathematical modeling has recently become a much-lauded enterprise, and many funding agencies seek to prioritize this endeavor. However, there are certain dangers associated with mathematical modeling, and knowledge of these pitfalls should also be part of a biologist's training in this set of techniques. (1) Mathematical models are limited by known science; (2) Mathematical models can tell what can happen, but not what did happen; (3) A model does not have to conform to reality, even if it is logically consistent; (4) Models abstract from reality, and sometimes what they eliminate is critically important; (5) Mathematics can present a Platonic ideal to which biologically organized matter strives, rather than a trial-and-error bumbling through evolutionary processes. This "Unity of Science" approach, which sees biology as the lowest physical science and mathematics as the highest science, is part of a Western belief system, often called the Great Chain of Being (or Scala Natura), that sees knowledge emerge as one passes from biology to chemistry to physics to mathematics, in an ascending progression of reason being purification from matter. This is also an informal model for the emergence of new life. There are now other informal models for integrating development and evolution, but each has its limitations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Experimental evaluation of certification trails using abstract data type validation

    NASA Technical Reports Server (NTRS)

    Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

Certification trails are a recently introduced and promising approach to fault-detection and fault-tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer-validation of abstract data types allow a certification trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.
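
    The answer-validation idea can be illustrated in miniature for sorting: the primary computation emits its answer together with a trail, and an independent checker accepts in linear time. This toy certifies only the final output; the paper's trails certify the individual abstract-data-type operations (e.g., each priority-queue extraction), which is what makes the method transparent to module users.

    ```python
    from collections import Counter

    def heapsort_with_trail(xs):
        # Stand-in for the instrumented heapsort: here the trail is simply
        # the claimed extraction order of minima.
        out = sorted(xs)
        return out, out  # (answer, certification trail)

    def checker(xs, out, trail):
        # Accepts iff the answer matches the trail, is nondecreasing, and is
        # a permutation of the input: all checkable in linear time.
        return (out == trail
                and all(a <= b for a, b in zip(out, out[1:]))
                and Counter(out) == Counter(xs))

    data = [5, 1, 4, 1, 3]
    out, trail = heapsort_with_trail(data)
    print(checker(data, out, trail))  # True; a faulty run would be rejected
    ```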

  19. Structural methodologies for auditing SNOMED.

    PubMed

    Wang, Yue; Halper, Michael; Min, Hua; Perl, Yehoshua; Chen, Yan; Spackman, Kent A

    2007-10-01

    SNOMED is one of the leading health care terminologies being used worldwide. As such, quality assurance is an important part of its maintenance cycle. Methodologies for auditing SNOMED based on structural aspects of its organization are presented. In particular, automated techniques for partitioning SNOMED into smaller groups of concepts based primarily on relationships patterns are defined. Two abstraction networks, the area taxonomy and p-area taxonomy, are derived from the partitions. The high-level views afforded by these abstraction networks form the basis for systematic auditing. The networks tend to highlight errors that manifest themselves as irregularities at the abstract level. They also support group-based auditing, where sets of purportedly similar concepts are focused on for review. The auditing methodologies are demonstrated on one of SNOMED's top-level hierarchies. Errors discovered during the auditing process are reported.

  20. A review of volume‐area scaling of glaciers

    PubMed Central

    Bahr, David B.; Kaser, Georg

    2015-01-01

Abstract Volume‐area power law scaling, one of a set of analytical scaling techniques based on principles of dimensional analysis, has become an increasingly important and widely used method for estimating the future response of the world's glaciers and ice caps to environmental change. Over 60 papers since 1988 have been published in the glaciological and environmental change literature containing applications of volume‐area scaling, mostly for the purpose of estimating total global glacier and ice cap volume and modeling future contributions to sea level rise from glaciers and ice caps. The application of the theory is not entirely straightforward, however, and many of the recently published results contain analyses that are in conflict with the theory as originally described by Bahr et al. (1997). In this review we describe the general theory of scaling for glaciers in full three‐dimensional detail without simplifications, including an improved derivation of both the volume‐area scaling exponent γ and a new derivation of the multiplicative scaling parameter c. We discuss some common misconceptions of the theory, presenting examples of both appropriate and inappropriate applications. We also discuss potential future developments in power law scaling beyond its present uses, the relationship between power law scaling and other modeling approaches, and some of the advantages and limitations of scaling techniques. PMID:27478877
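
    In practice the scaling law is a one-liner. The sketch below uses the exponent γ = 1.375 derived for valley glaciers in Bahr et al. (1997); the multiplicative parameter c varies from glacier to glacier, and the global-mean value used here (0.034, for areas in km²) is a commonly cited choice in the literature, not a universal constant.

    ```python
    def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
        # Volume-area power law V = c * A**gamma (valley glaciers).
        return c * area_km2 ** gamma

    print(glacier_volume_km3(10.0))   # roughly 0.8 km^3 for a 10 km^2 glacier
    ```

    Because the relation is calibrated for ensembles of glaciers, applying it naively to a single glacier is the kind of misapplication the review cautions against.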

  1. A Mathematical Model for Railway Control Systems

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.

    1996-01-01

We present a general method for modeling safety aspects of railway control systems. Using our modeling method, one can progressively refine an abstract railway safety model, successively adding layers of detail about how a real system actually operates, while maintaining a safety property that refines the original abstract safety property. This method supports a top-down approach to specification of railway control systems and to proof of a variety of safety-related properties. We demonstrate our method by proving safety of the classical block control system.

  2. Evaporation and abstraction determined from stable isotopes during normal flow on the Gariep River, South Africa

    NASA Astrophysics Data System (ADS)

    Diamond, Roger E.; Jack, Sam

    2018-04-01

    Changes in the stable isotope composition of water can, with the aid of climatic parameters, be used to calculate the quantity of evaporation from a water body. Previous workers have mostly focused on small, research catchments, with abundant data, but of limited scope. This study aimed to expand such work to a regional or sub-continental scale. The first full length isotope survey of the Gariep River quantifies evaporation on the river and the man-made reservoirs for the first time, and proposes a technique to calculate abstraction from the river. The theoretically determined final isotope composition for an evaporating water body in the given climate lies on the empirically determined local evaporation line, validating the assumptions and inputs to the Craig-Gordon evaporation model that was used. Evaporation from the Gariep River amounts to around 20% of flow, or 40 m3/s, of which about half is due to evaporation from the surface of the Gariep and Vanderkloof Reservoirs, showing the wastefulness of large surface water impoundments. This compares well with previous estimates based on evapotranspiration calculations, and equates to around 1300 GL/a of water, or about the annual water consumption of Johannesburg and Pretoria, where over 10 million people reside. Using similar evaporation calculations and applying existing transpiration estimates to a gauged length of river, the remaining quantity can be attributed to abstraction, amounting to 175 L/s/km in the lower middle reaches of the river. Given that high water demand and climate change are global problems, and with the challenges of maintaining water monitoring networks, stable isotopes are shown to be applicable over regional to national scales for modelling hydrological flows. Stable isotopes provide a complementary method to conventional flow gauging for understanding hydrology and management of large water resources, particularly in arid areas subject to significant evaporation.

  3. An abstract approach to evaporation models in rarefied gas dynamics

    NASA Astrophysics Data System (ADS)

    Greenberg, W.; van der Mee, C. V. M.

    1984-03-01

    Strong evaporation models involving 1D stationary problems with linear self-adjoint collision operators and solutions in abstract Hilbert spaces are investigated analytically. An efficient algorithm for locating the transition from existence to nonexistence of solutions is developed and applied to the 1D and 3D BGK model equations and the 3D BGK model in moment form, demonstrating the nonexistence of stationary evaporation states with supersonic drift velocities. Applications to similar models in electron and phonon transport, radiative transfer, and neutron transport are suggested.

  4. Design and construction of stone columns, vol. I.

    DOT National Transportation Integrated Search

    1983-12-01

Abstract: Stone columns have been used since the 1950s as a technique for improving both cohesive soils and silty sands. Potential applications include (1) stabilizing foundation soils to support embankments and approach fills, (2) supporting re...

  5. Refining the maintenance techniques for Interlocking Concrete Paver GIs - abstract

    EPA Science Inventory

Surface clogging adversely affects the performance of Interlocking Concrete Pavements (ICP) by reducing their ability to infiltrate stormwater runoff. Determining the correct methods for remedial maintenance is crucial to recovering and maintaining efficient ICP performance. T...

  6. The High Angular Resolution Multiplicity of Massive Stars

    DTIC Science & Technology

    2009-02-01

Keywords: binaries: visual – stars: early-type – stars: individual (iota Ori, delta Ori, delta Sco) – techniques: interferometric. Online-only material...

  7. Stereoscopy in cinematographic synthetic imagery

    NASA Astrophysics Data System (ADS)

    Eisenmann, Jonathan; Parent, Rick

    2009-02-01

    In this paper we present experiments and results pertaining to the perception of depth in stereoscopic viewing of synthetic imagery. In computer animation, typical synthetic imagery is highly textured and uses stylized illumination of abstracted material models by abstracted light source models. While there have been numerous studies concerning stereoscopic capabilities, conventions for staging and cinematography in stereoscopic movies have not yet been well-established. Our long-term goal is to measure the effectiveness of various cinematography techniques on the human visual system in a theatrical viewing environment. We would like to identify the elements of stereoscopic cinema that are important in terms of enhancing the viewer's understanding of a scene as well as providing guidelines for the cinematographer relating to storytelling. In these experiments we isolated stereoscopic effects by eliminating as many other visual cues as is reasonable. In particular, we aim to empirically determine what types of movement in synthetic imagery affect the perceptual depth sensing capabilities of our viewers. Using synthetic imagery, we created several viewing scenarios in which the viewer is asked to locate a target object's depth in a simple environment. The scenarios were specifically designed to compare the effectiveness of stereo viewing, camera movement, and object motion in aiding depth perception. Data were collected showing the error between the choice of the user and the actual depth value, and patterns were identified that relate the test variables to the viewer's perceptual depth accuracy in our theatrical viewing environment.

  8. Graph Theory Roots of Spatial Operators for Kinematics and Dynamics

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan

    2011-01-01

Spatial operators have been used to analyze the dynamics of robotic multibody systems and to develop novel computational dynamics algorithms. Mass matrix factorization, inversion, diagonalization, and linearization are among several new insights obtained using such operators. While initially developed for serial rigid body manipulators, the spatial operators and the related mathematical analysis have been shown to extend very broadly including to tree and closed topology systems, to systems with flexible joints, links, etc. This work uses concepts from graph theory to explore the mathematical foundations of spatial operators. The goal is to study and characterize the properties of the spatial operators at an abstract level so that they can be applied to a broader range of dynamics problems. The rich mathematical properties of the kinematics and dynamics of robotic multibody systems have been an area of strong research interest for several decades. These properties are important to understand the inherent physical behavior of systems, for stability and control analysis, for the development of computational algorithms, and for the development of faithful models. Recurring patterns in spatial operators lead one to ask the more abstract question about the properties and characteristics of spatial operators that make them so broadly applicable. The idea is to step back from the specific application systems, and understand more deeply the generic requirements and properties of spatial operators, so that the insights and techniques are readily available across different kinematics and dynamics problems. In this work, techniques from graph theory were used to explore the abstract basis for the spatial operators. The close relationship between the mathematical properties of adjacency matrices for graphs and those of spatial operators and their kernels was established. The connections hold across very basic requirements on the system topology, the nature of the component bodies, the indexing schemes, etc. The relationship of the underlying structure is intimately connected with efficient, recursive computational algorithms. The results provide the foundational groundwork for a much broader look at the key problems in kinematics and dynamics. The properties of general graphs and trees of nodes and edges were examined, as well as the properties of adjacency matrices that are used to describe graph connectivity. The nilpotency property of such matrices for directed trees was reviewed, and the adjacency matrices were generalized to the notion of block weighted adjacency matrices that support block matrix elements. This leads us to the development of the notion of Spatial Kernel Operator (SKO) kernels. These kernels provide the basis for the development of SKO resolvent operators.
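
    The nilpotency property mentioned above is easy to verify numerically: for a directed tree, powers of the adjacency matrix count multi-step ancestor paths, and all paths die out beyond the tree's depth. A small sketch:

    ```python
    import numpy as np

    # Directed tree on 5 nodes; edges point from each child to its parent.
    parent = {1: 0, 2: 0, 3: 1, 4: 1}
    A = np.zeros((5, 5), dtype=int)
    for child, par in parent.items():
        A[child, par] = 1                    # adjacency matrix of the tree

    # A**k has a 1 at (i, j) iff j is i's k-step ancestor; the depth is 2
    # here, so the third power vanishes: A is nilpotent.
    print(np.linalg.matrix_power(A, 2))      # grandparent relations (3->0, 4->0)
    print(np.linalg.matrix_power(A, 3))      # all zeros
    ```

    This is the structural pattern that the abstract connects to efficient, recursive computational algorithms for kinematics and dynamics.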

  9. Research Article Abstracts in Two Subdisciplines of Business--Move Structure and Hedging between Management and Marketing

    ERIC Educational Resources Information Center

    Li, Qian; Pramoolsook, Issra

    2015-01-01

    The importance of RA abstracts lies in their influence on the readers' decision about whether the accompanying article is worth reading. A number of studies have investigated the move structure of abstracts and have generated several influential models. However, little research has been conducted on subdisciplinary variations in move structure of…

  10. PROPAGATION OF UNCERTAINTY IN HOURLY UTILITY NOX EMISSIONS THROUGH A PHOTOCHEMICAL GRID AIR QUALITY MODEL: A CASE STUDY FOR THE CHARLOTTE, NC, MODELING DOMAIN (R826766)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  11. The Impact of Quantum Theoretical Models of Consciousness on the Study of Education.

    ERIC Educational Resources Information Center

    Andris, James F.

    This paper abstracts and discusses the approaches of five educational theorists who have used quantum theory as a model for educational phenomena, sets forth and uses metatheoretical criteria to evaluate the work of these theorists, and states guidelines for further work in this domain. The paper abstracts and discusses the works of the following…

  12. Teaching Subtraction and Multiplication with Regrouping Using the Concrete-Representational-Abstract Sequence and Strategic Instruction Model

    ERIC Educational Resources Information Center

    Flores, Margaret M.; Hinton, Vanessa; Strozier, Shaunita D.

    2014-01-01

    Based on Common Core Standards (2010), mathematics interventions should emphasize conceptual understanding of numbers and operations as well as fluency. For students at risk for failure, the concrete-representational-abstract (CRA) sequence and the Strategic Instruction Model (SIM) have been shown effective in teaching computation with an emphasis…

  13. Momentum Concept in the Process of Knowledge Construction

    ERIC Educational Resources Information Center

    Ergul, N. Remziye

    2013-01-01

Abstraction is one of the methods for acquiring knowledge through mental processes, in cases where that knowledge cannot be obtained through experiment and observation. The RBC model, which is based on abstraction in the process of creating knowledge, is directly related to mental processes. In this study, the RBC model is used for the high school students' processes of…

  14. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Yakubova, Gulnoza; Hughes, Elizabeth M.; Shinaberry, Megan

    2016-01-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the…

  15. Population at risk: using areal interpolation and Twitter messages to create population models for burglaries and robberies

    PubMed Central

    2018-01-01

    ABSTRACT Population at risk of crime varies due to the characteristics of a population as well as the crime generator and attractor places where crime is located. This establishes different crime opportunities for different crimes. However, there are very few efforts of modeling structures that derive spatiotemporal population models to allow accurate assessment of population exposure to crime. This study develops population models to depict the spatial distribution of people who have a heightened crime risk for burglaries and robberies. The data used in the study include: Census data as source data for the existing population, Twitter geo-located data, and locations of schools as ancillary data to redistribute the source data more accurately in the space, and finally gridded population and crime data to evaluate the derived population models. To create the models, a density-weighted areal interpolation technique was used that disaggregates the source data in smaller spatial units considering the spatial distribution of the ancillary data. The models were evaluated with validation data that assess the interpolation error and spatial statistics that examine their relationship with the crime types. Our approach derived population models of a finer resolution that can assist in more precise spatial crime analyses and also provide accurate information about crime rates to the public. PMID:29887766
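
    The redistribution step is straightforward once the ancillary weights are in hand. The sketch below disaggregates one source zone's population over its grid cells in proportion to an ancillary density (here, tweet counts); the fallback rule and the numbers are illustrative, and the study's exact density-weighted scheme may differ in detail.

    ```python
    import numpy as np

    def density_weighted_split(source_pop, ancillary_counts):
        """Areal interpolation: split a source-zone population across target
        cells in proportion to an ancillary density surface."""
        w = np.asarray(ancillary_counts, dtype=float)
        if w.sum() == 0:
            w = np.ones_like(w)   # no ancillary signal: fall back to uniform
        return source_pop * w / w.sum()

    # A census tract of 1200 people split over 4 grid cells by tweet counts
    print(density_weighted_split(1200, [30, 5, 0, 15]))  # [720. 120. 0. 360.]
    ```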

  16. Will big data yield new mathematics? An evolving synergy with neuroscience

    PubMed Central

    Feng, S.; Holmes, P.

    2016-01-01

    New mathematics has often been inspired by new insights into the natural world. Here we describe some ongoing and possible future interactions among the massive data sets being collected in neuroscience, methods for their analysis and mathematical models of the underlying, still largely uncharted neural substrates that generate these data. We start by recalling events that occurred in turbulence modelling when substantial space-time velocity field measurements and numerical simulations allowed a new perspective on the governing equations of fluid mechanics. While no analogous global mathematical model of neural processes exists, we argue that big data may enable validation or at least rejection of models at cellular to brain area scales and may illuminate connections among models. We give examples of such models and survey some relatively new experimental technologies, including optogenetics and functional imaging, that can report neural activity in live animals performing complex tasks. The search for analytical techniques for these data is already yielding new mathematics, and we believe their multi-scale nature may help relate well-established models, such as the Hodgkin–Huxley equations for single neurons, to more abstract models of neural circuits, brain areas and larger networks within the brain. In brief, we envisage a closer liaison, if not a marriage, between neuroscience and mathematics. PMID:27516705

  18. Regional-scale, fully coupled modelling of stream aquifer interaction in a tropical catchment

    NASA Astrophysics Data System (ADS)

    Werner, Adrian D.; Gallagher, Mark R.; Weeks, Scott W.

    2006-09-01

Summary: The planning and management of water resources in the Pioneer Valley, north-eastern Australia, requires a tool for assessing the impact of groundwater and stream abstractions on water supply reliabilities and environmental flows in Sandy Creek (the main surface water system studied). Consequently, a fully coupled stream-aquifer model has been constructed using the code MODHMS, calibrated to near-stream observations of watertable behaviour and multiple components of gauged stream flow. This model has been tested using other methods of estimation, including stream depletion analysis and radon isotope tracer sampling. The coarseness of spatial discretisation, which is required for practical reasons of computational efficiency, limits the model's capacity to simulate small-scale processes (e.g., near-stream groundwater pumping, bank storage effects), and alternative approaches are required to complement the model's range of applicability. Model predictions of groundwater influx to Sandy Creek are compared with baseflow estimates from three different hydrograph separation techniques, which were found to be unable to reflect the dynamics of Sandy Creek stream-aquifer interactions. The model was also used to infer changes in the water balance of the system caused by historical land use change. This led to constraints on the recharge distribution which can be implemented to improve model calibration performance.

  19. The Use of Satellite Observed Cloud Patterns in Northern Hemisphere 300 mb and 1000/300 mb Numerical Analysis.

    DTIC Science & Technology

    1984-02-01

    Subject terms: …prediction, extratropical cyclones, objective analysis, bogus techniques. A quasi-objective statistical method for deriving 300 mb geopotential heights and 1000/300 mb thicknesses in the vicinity of extratropical cyclones with the aid of satellite imagery is presented. The technique utilizes satellite-observed extratropical spiral cloud pattern parameters in conjunction…

  20. Stereotactic body radiotherapy in lung cancer: an update

    PubMed Central

    Abreu, Carlos Eduardo Cintra Vita; Ferreira, Paula Pratti Rodrigues; de Moraes, Fabio Ynoe; Neves, Wellington Furtado Pimenta; Gadia, Rafael; Carvalho, Heloisa de Andrade

    2015-01-01

    Abstract For early-stage lung cancer, the treatment of choice is surgery. In patients who are not surgical candidates or are unwilling to undergo surgery, radiotherapy is the principal treatment option. Here, we review stereotactic body radiotherapy, a technique that has produced quite promising results in such patients and should be the treatment of choice, if available. We also present the major indications, technical aspects, results, and special situations related to the technique. PMID:26398758

  1. Aplicacion de nuevas tecnicas y procedimientos para la ensenanza de la lectura-escritura (Application of the New Techniques and Procedures for Teaching Reading-Writing).

    ERIC Educational Resources Information Center

    Instituto Nacional de Pedagogia (Mexico).

    This document is an English-language abstract (approximately 1,500 words) of experiments performed in Mexico, D. F. by way of introducing new techniques for teaching reading and writing, particularly in the remedial classes. The first part of the document deals with a series of experiments carried out with first grade remedial groups as follows:…

  2. Well-Posedness Results for a Class of Toxicokinetic Models

    DTIC Science & Technology

    2001-07-24

    estimation. The main result that we establish here regarding well-posedness of solutions is based on ideas presented in [5] and [1]. Banks and Musante [5...necessary regularity required for the model to fit into the second class of abstract problems discussed by Banks and Musante. Transport models for other...upon the results of Banks and Musante by achieving well-posedness for a more general class of abstract nonlinear parabolic equations. Ackleh, Banks and

  3. Exploitation of Self Organization in UAV Swarms for Optimization in Combat Environments

    DTIC Science & Technology

    2008-03-01

    behaviors and entangled hierarchy into the Swarmfare [59] UAV simulation environment to include these models. • Validate this new model's success through... The hierarchy of control emerges from the entangled hierarchy of the state relations at the simulation, swarm and rule/behaviors levels...

  4. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code, and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, are managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation process as a sequence of discrete equations which are assembled and solved. It is the coupling of the respective abstractions employed by libadjoint and the FEniCS project which produces the adjoint model automatically, without further intervention from the model developer. This presentation will demonstrate this new technology through linear and non-linear shallow water test cases. The exceptionally simple model syntax will be highlighted and the correctness of the resulting adjoint simulations will be demonstrated using rigorous convergence tests.
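
    The "sequence of discrete equations" abstraction attributed to libadjoint above can be made concrete with a small sketch. This is not the libadjoint or FEniCS API; it is a hypothetical, self-contained Python toy that records each forward solve on a tape and replays the tape in reverse with transposed operators to obtain sensitivities:

```python
import numpy as np

class Tape:
    """Toy record of forward solves of the form A u_n = B u_{n-1}."""

    def __init__(self):
        self.steps = []

    def record_solve(self, A, B, u_prev):
        u = np.linalg.solve(A, B @ u_prev)  # forward solve
        self.steps.append((A, B))           # annotate the discrete equation
        return u

    def adjoint(self, dJ_du_final):
        """Replay the tape backwards to get dJ/du0 for a scalar functional J."""
        lam = dJ_du_final
        for A, B in reversed(self.steps):
            w = np.linalg.solve(A.T, lam)   # adjoint solve uses A^T
            lam = B.T @ w                   # sensitivity w.r.t. earlier state
        return lam

# Two implicit steps of a 3-variable linear model, with J = ||u_final||^2.
rng = np.random.default_rng(0)
A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
B = np.eye(3)
tape = Tape()
u = tape.record_solve(A, B, np.array([1.0, 0.0, 0.0]))
u = tape.record_solve(A, B, u)
print(tape.adjoint(2 * u))  # gradient of J with respect to the initial state
```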

  5. Improvements and Limitations of Humanized Mouse Models for HIV Research: NIH/NIAID “Meet the Experts” 2015 Workshop Summary

    PubMed Central

    Akkina, Ramesh; Allam, Atef; Balazs, Alejandro B.; Blankson, Joel N.; Burnett, John C.; Casares, Sofia; Garcia, J. Victor; Hasenkrug, Kim J.; Kitchen, Scott G.; Klein, Florian; Kumar, Priti; Luster, Andrew D.; Poluektova, Larisa Y.; Rao, Mangala; Shultz, Leonard D.; Zack, Jerome A.

    2016-01-01

    Abstract The number of humanized mouse models for the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) and other infectious diseases has expanded rapidly over the past 8 years. Highly immunodeficient mouse strains, such as NOD/SCID/gamma chain-null (NSG, NOG), support better human hematopoietic cell engraftment. Another improvement is the derivation of highly immunodeficient mice, transgenic with human leukocyte antigens (HLAs) and cytokines, that supported development of HLA-restricted human T cells and heightened human myeloid cell engraftment. Humanized mice are also used to study the HIV reservoir using new imaging techniques. Despite these advances, there are still limitations in HIV immune responses and deficits in lymphoid structures in these models in addition to xenogeneic graft-versus-host responses. To understand and disseminate the improvements and limitations of humanized mouse models to the scientific community, the NIH sponsored and convened a meeting on April 15, 2015 to discuss the state of knowledge concerning these questions and best practices for selecting a humanized mouse model for a particular scientific investigation. This report summarizes the findings of the NIH meeting. PMID:26670361

  6. High performance cellular level agent-based simulation with FLAME for the GPU.

    PubMed

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.

  7. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 3. Relating Solution-Phase to Gas-Phase Structures.

    PubMed

    Kondalaji, Samaneh Ghassabi; Khakinejad, Mahdiar; Valentine, Stephen J

    2018-06-01

    Molecular dynamics (MD) simulations have been utilized to study peptide ion conformer establishment during the electrospray process. An explicit water model is used for nanodroplets containing a model peptide and hydronium ions. Simulations are conducted at 300 K for two different peptide ion charge configurations and for droplets containing varying numbers of hydronium ions. For all conditions, modeling has been performed until production of the gas-phase ions, and the resultant conformers have been compared to proposed gas-phase structures. The latter species were obtained from previous studies in which in silico candidate structures were filtered according to ion mobility and hydrogen-deuterium exchange (HDX) reactivity matches. The present study yields three key findings, namely: (1) the evidence from ion production modeling supports previous structure refinement studies based on mobility and HDX reactivity matching, (2) the modeling of the electrospray process is significantly improved by utilizing initial droplets existing below but close to the calculated Rayleigh limit, and (3) peptide ions in the nanodroplets sample significantly different conformers than those in the bulk solution due to altered physicochemical properties of the solvent.

  8. Content Abstract Classification Using Naive Bayes

    NASA Astrophysics Data System (ADS)

    Latif, Syukriyanto; Suwardoyo, Untung; Aldrin Wihelmus Sanadi, Edwin

    2018-03-01

    This study aims to classify abstract content based on the most frequently used words in the abstracts of English-language journals. The research uses text mining technology that extracts text data to search for information in a set of documents. Abstracts of 120 papers were downloaded from www.computer.org. The data are grouped into three categories: DM (Data Mining), ITS (Intelligent Transport System) and MM (Multimedia). The system is built using the naive Bayes algorithm to classify abstracts, and the feature selection process uses term weighting to give weight to each word. Dimension reduction techniques reduce the dimensionality by discarding words that rarely appear in each document, with dimension reduction parameters tested from 10%-90% of the 5,344 words. The performance of the classification system is tested using a confusion matrix comparing training data and test data. The results showed that the best classification was obtained with a split of 75% training data and 25% test data. Accuracy rates for the DM, ITS and MM categories were 100%, 100%, and 86%, respectively, with a dimension reduction parameter of 30% and a learning rate between 0.1 and 0.5.
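
    As an illustration of the pipeline this abstract describes (term weighting plus naive Bayes), here is a minimal sketch. The use of scikit-learn, the sample texts, and the specific weighting choice (TF-IDF) are assumptions made for the example; only the DM/ITS/MM categories and the overall approach come from the abstract:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented placeholder abstracts; the labels mirror the study's categories.
abstracts = [
    "association rules mined from large transaction databases",   # DM
    "vehicle routing and adaptive traffic signal control",        # ITS
    "video streaming compression and audio codecs",               # MM
    "clustering high dimensional data with feature selection",    # DM
]
labels = ["DM", "ITS", "MM", "DM"]

# TF-IDF supplies the term weighting; capping the vocabulary is a crude
# stand-in for the dimension reduction step described in the abstract.
model = make_pipeline(TfidfVectorizer(max_features=2000), MultinomialNB())
model.fit(abstracts, labels)
print(model.predict(["traffic flow prediction for smart intersections"]))
```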

  9. Directory of Energy Information Administration Model Abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-07-16

    This directory partially fulfills the requirements of Section 8c of the documentation order, which states in part that: The Office of Statistical Standards will annually publish an EIA document based on the collected abstracts and the appendices. This report contains brief statements about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models active through March 1985 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). EIA also leases models developed by proprietary software vendors. Documentation for these proprietary models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.

  10. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective on the role of uncertainty when interpreting a claim, what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%), or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  11. Adverse Outcome Pathway Network Analyses: Techniques and benchmarking the AOPwiki

    EPA Science Inventory

    Abstract: As the community of toxicological researchers, risk assessors, and risk managers adopt the adverse outcome pathway (AOP) paradigm for organizing toxicological knowledge, the number and diversity of adverse outcome pathways and AOP networks are continuing to grow. This ...

  12. The use of far infra-red radiation for the detection of concealed metal objects.

    DOT National Transportation Integrated Search

    1971-11-01

    Abstract The use of infrared radiation for the detection of concealed metal objects has been investigated both theoretically and experimentally. The investigation was divided into two phases, one which considered passive techniques, and anoth...

  13. Go Figure.

    ERIC Educational Resources Information Center

    Greenman, Geri

    2000-01-01

    Describes the first assignment for an intermediate oil painting class in which the students painted the human figure. Explains that the assignment involved three techniques: (1) abstract application of acrylic paint; (2) oil "Paintstiks" from Shiva; and (3) a final layer of actual oil paint. (CMK)

  14. 76 FR 14442 - 60-Day Notice of Proposed Information Collection: DS 6561 Pre-Assignment for Overseas Duty for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... automated collection techniques or other forms of technology. Abstract of proposed collection: The DS 6561 form provides a concise summary of basic medical history, lab tests and physical examination. Since...

  15. 77 FR 25225 - 60-Day Notice of Proposed Information Collection: DS 7655, Iraqi Citizens and Nationals Employed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-27

    ... the use of automated collection techniques or other forms of technology. Abstract of Proposed..., date(s) of employment, biometric, and other data must be collected and used to verify employment for...

  16. From data to information: Tools and techniques educators can use to enhance Google Earth imagery with geographic information systems data and three dimensional models

    NASA Astrophysics Data System (ADS)

    Simms, M.

    2007-12-01

    As with any educational technology, moving beyond basic information delivery to dynamic use can be a challenge and Google Earth (GE) is no exception. Moving beyond annotated placemarks and pictures, educators can utilize free, free-to-educators, and low cost tools to develop learning experiences within GE to facilitate dynamic interaction with real world data in the form of three dimensional models (3D) and geographic information systems data (GIS). Students take an active role in knowledge construction through self-directed navigation in 3D, seeing features of the landscape not in snapshot views found in textbooks, but in situ and in context. By incorporating categorized data, such as what is commonly found in GIS, an added dimension of human interaction can be incorporated. For example, GIS layers such as landuse, soil type, etc. provide students with data tools for investigating the role their community plays in supporting migrating Monarch butterfly habitat. Functionality for changing the appearance of layers in GE facilitates interaction with geospatial data in a manner that creates a type of "visual" GIS and can serve as an advanced organizer for later use of more powerful GIS software. GE can also be used as a metaphor to create a new context for an otherwise abstract concept, for example, scaling 3D models of the sun and planets to the size of a well known football stadium and placing each planet at the corresponding scaled distances from that location. Photorealistic 3D models created using SketchUp and Anim8or may help students relate to an otherwise abstract concept of planetary size and distances. Finally, digital elevation models (DEM) draped with imagery not available in GE or with GIS data can be used to make topic-specific 3D models either used within GE or in a 3D model viewer embedded in a website or email. With a little instruction, students can quickly learn how to make their own models as well. Procedures and software to accomplish each of these examples will be demonstrated.

  17. Modeling Multiple Stresses Placed Upon A Groundwater System In A Semi-Arid Brackish Environment

    NASA Astrophysics Data System (ADS)

    Toll, M.; Salameh, E.; Sauter, M.

    2008-12-01

    In semi-arid areas groundwater systems are frequently not sufficiently characterized hydrogeologically, and long-term data records are generally not available. Long-term time series are necessary, however, to design future groundwater abstraction scenarios or to predict the influence of future climate change on groundwater resources. To overcome these problems, an integrated approach for the provision of a reliable database based on sparse and fuzzy data is proposed. This integrated approach is demonstrated in the lowermost area of the Jordan Valley. The Jordan Valley is part of the Jordan Dead Sea Wadi Araba Rift Valley, which extends from the Red Sea to Lake Tiberias and beyond, with a major 107 km sinistral strike-slip fault between the Arabian plate to the east and the northeastern part of the African plate to the west. Due to extensional forces a topographic depression was formed. As a result of an arid environment it is filled with evaporites, lacustrine sediments, and clastic fluvial components. A subtropical climate with hot, dry summers and mild humid winters with low amounts of rainfall provides excellent farming conditions. Therefore the Jordan Valley is considered the food basket of Jordan and is used intensively for agriculture. As a result, hundreds of shallow wells were drilled and large amounts of groundwater were abstracted, since groundwater is the major source for irrigation. Consequently, groundwater quality has decreased rapidly since the sixties, and signs of overpumping and an increase in soil salinity can clearly be seen. In order to achieve a sustainable state of water resources and to quantify the impact of climate change on water resources, a proper assessment of the groundwater resources as well as their quality is a prerequisite. In order to sufficiently describe the complex hydrogeologic flow system, an integrated approach combining geological, geophysical, hydrogeological, historical, and chemical methods was chosen. The aquifer geometry and composition are described with the help of geological, hydrochemical, and geophysical methods. As far as the water budget is concerned, the recharge to the considered aquifer is estimated with geological methods and available data sets, while the abstraction from the aquifer is estimated with the help of remote sensing techniques. A historical approach is used to detect the general conditions under which the groundwater system operated in the past. Afterwards this information is implemented into a flow model. On the basis of the findings, a numerical 3-D transient model integrating all important features of the hydrogeological system was developed. In order to be able to give reliable predictions about the impacts of climate change scenarios on the groundwater system, the flow model was tested against stress periods depicted during the historical review of the test area (model period: 1955-2008). These stress periods include periods of intense rainfall, of drought, and of anthropogenic impacts, like the building of storage dams and violent conflicts. Recommendations for future sustainable groundwater abstractions are given.

  18. Global-scale assessment of groundwater depletion and related groundwater abstractions: Combining hydrological modeling with information from well observations and GRACE satellites

    NASA Astrophysics Data System (ADS)

    Döll, Petra; Müller Schmied, Hannes; Schuh, Carina; Portmann, Felix T.; Eicker, Annette

    2014-07-01

    Groundwater depletion (GWD) compromises crop production in major global agricultural areas and has negative ecological consequences. To derive GWD at the grid cell, country, and global levels, we applied a new version of the global hydrological model WaterGAP that simulates not only net groundwater abstractions and groundwater recharge from soils but also groundwater recharge from surface water bodies in dry regions. A large number of independent estimates of GWD as well as total water storage (TWS) trends determined from GRACE satellite data by three analysis centers were compared to model results. GWD and TWS trends are simulated best assuming that farmers in GWD areas irrigate at 70% of optimal water requirement. India, United States, Iran, Saudi Arabia, and China had the highest GWD rates in the first decade of the 21st century. On the Arabian Peninsula, in Libya, Egypt, Mali, Mozambique, and Mongolia, at least 30% of the abstracted groundwater was taken from nonrenewable groundwater during this time period. The rate of global GWD has likely more than doubled since the period 1960-2000. Estimated GWD of 113 km³/yr during 2000-2009, corresponding to a sea level rise of 0.31 mm/yr, is much smaller than most previous estimates. About 15% of the globally abstracted groundwater was taken from nonrenewable groundwater during this period. To monitor recent temporal dynamics of GWD and related water abstractions, GRACE data are best evaluated with a hydrological model that, like WaterGAP, simulates the impact of abstractions on water storage, but the low spatial resolution of GRACE remains a challenge.
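
    The sea-level figure quoted above can be sanity-checked with one line of arithmetic. The global ocean area used below (~3.6 x 10^8 km²) is a standard value assumed here, not taken from the abstract:

```python
# Spreading 113 km^3/yr of depleted groundwater over the global ocean
# (~3.61e8 km^2, an assumed standard value) should give ~0.31 mm/yr.
gwd_km3_per_yr = 113.0
ocean_area_km2 = 3.61e8
rise_mm_per_yr = gwd_km3_per_yr / ocean_area_km2 * 1e6  # km -> mm
print(round(rise_mm_per_yr, 2), "mm/yr")  # 0.31, matching the abstract
```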

  19. Publication Rate of Avian Medicine Conference Abstracts and Influencing Factors: 2011-2015.

    PubMed

    Doukaki, Christina; Beaufrère, Hugues; Huynh, Minh

    2018-06-01

    International conferences on avian medicine and surgery aim to disseminate scientific and evidence-based information in the form of oral presentations and posters. Most manuscripts presented are printed in the conference proceedings as abstracts. Subsequent publication in a scientific peer-reviewed journal is the natural outcome of the research cycle, although studies have shown that the vast majority of conference abstracts are not published. The purpose of this study was 1) to explore the fate of abstracts presented at avian conferences (Association of Avian Veterinarians, European Association of Avian Veterinarians, International Conference on Avian, Herpetological and Exotic Mammal Medicine) in the years 2011-2015, 2) to assess the publication rate in peer-reviewed journals, 3) to describe the time course of subsequent publication, and 4) to identify factors associated with an increased likelihood of publication. The results showed that 24% of conference abstracts were published within the following 2 years. Depending on the statistical model used, several factors were identified as associated with an increased publication rate. North American papers were published more frequently (univariate model), while European papers had the opposite trend (multivariable model). Likewise, experimental studies were more prone to being published overall (univariate model), whereas retrospective observational studies had a lower rate of publication (multivariable model). Increasing the number of authors was also associated with an increased publication rate. Most papers were published in the Journal of Avian Medicine and Surgery, which suggests that this journal is the main journal of the specialty. Some parameters highlighted in this study may assist conference attendees in assessing the likelihood of later publication.

  20. Integrating multiparametric prostate MRI into clinical practice

    PubMed Central

    2011-01-01

    Abstract Multifunctional magnetic resonance imaging (MRI) techniques are increasingly being used to address bottlenecks in prostate cancer patient management. These techniques yield qualitative, semi-quantitative and fully quantitative biomarkers that reflect on the underlying biological status of a tumour. If these techniques are to have a role in patient management, then standard methods of data acquisition, analysis and reporting have to be developed. Effective communication by the use of scoring systems, structured reporting and a graphical interface that matches prostate anatomy are key elements. Practical guidelines for integrating multiparametric MRI into clinical practice are presented. PMID:22187067

  1. New Therapies for Fibrofatty Infiltration

    DTIC Science & Technology

    2017-08-01

    The goal of this project is to test three classes of compounds in animal models of muscular dystrophy, and evaluate their therapeutic...inhibitor compound to be tested in animal models of disease, as a more efficacious drug was identified with similar substrate specificity. Subject terms: fibrofatty infiltration, drug testing, muscular dystrophy, fibrosis.

  2. Fifth SIAM conference on geometric design 97: Final program and abstracts. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-12-31

    The meeting was divided into the following sessions: (1) CAD/CAM; (2) Curve/Surface Design; (3) Geometric Algorithms; (4) Multiresolution Methods; (5) Robotics; (6) Solid Modeling; and (7) Visualization. This report contains the abstracts of papers presented at the meeting. Preceding the conference there was a short course entitled "Wavelets for Geometric Modeling and Computer Graphics".

  3. Long-term drug administration in the adult zebrafish using oral gavage for cancer preclinical studies

    PubMed Central

    Dang, Michelle; Henderson, Rachel E.; Garraway, Levi A.

    2016-01-01

    ABSTRACT Zebrafish are a major model for chemical genetics, and most studies use embryos when investigating small molecules that cause interesting phenotypes or that can rescue disease models. Limited studies have dosed adults with small molecules by means of water-borne exposure or injection techniques. Challenges in the form of drug delivery-related trauma and anesthesia-related toxicity have excluded the adult zebrafish from long-term drug efficacy studies. Here, we introduce a novel anesthetic combination of MS-222 and isoflurane to an oral gavage technique for a non-toxic, non-invasive and long-term drug administration platform. As a proof of principle, we established drug efficacy of the FDA-approved BRAFV600E inhibitor, Vemurafenib, in adult zebrafish harboring BRAFV600E melanoma tumors. In the model, adult casper zebrafish intraperitoneally transplanted with a zebrafish melanoma cell line (ZMEL1) and exposed to daily sub-lethal dosing at 100 mg/kg of Vemurafenib for 2 weeks via oral gavage resulted in an average 65% decrease in tumor burden and a 15% mortality rate. In contrast, Vemurafenib-resistant ZMEL1 cell lines, generated in culture from low-dose drug exposure for 4 months, did not respond to the oral gavage treatment regimen. Similarly, this drug treatment regimen can be applied for treatment of primary melanoma tumors in the zebrafish. Taken together, we developed an effective long-term drug treatment system that will allow the adult zebrafish to be used to identify more effective anti-melanoma combination therapies and opens up possibilities for treating adult models of other diseases. PMID:27482819

  4. Workshop on the Tectonic Evolution of Greenstone Belts (supplement containing abstracts of invited talks and late abstracts)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics addressed include: greenstone belt tectonics, thermal constraints, geological structure, rock components, crustal accretion model, geological evolution, synsedimentary deformation, Archean structures and geological faults.

  5. Visualization and dissemination of multidimensional proteomics data comparing protein abundance during Caenorhabditis elegans development.

    PubMed

    Riffle, Michael; Merrihew, Gennifer E; Jaschob, Daniel; Sharma, Vagisha; Davis, Trisha N; Noble, William S; MacCoss, Michael J

    2015-11-01

    Regulation of protein abundance is a critical aspect of cellular function, organism development, and aging. Alternative splicing may give rise to multiple possible proteoforms of gene products where the abundance of each proteoform is independently regulated. Understanding how the abundances of these distinct gene products change is essential to understanding the underlying mechanisms of many biological processes. Bottom-up proteomics mass spectrometry techniques may be used to estimate protein abundance indirectly by sequencing and quantifying peptides that are later mapped to proteins based on sequence. However, quantifying the abundance of distinct gene products is routinely confounded by peptides that map to multiple possible proteoforms. In this work, we describe a technique that may be used to help mitigate the effects of confounding ambiguous peptides and multiple proteoforms when quantifying proteins. We have applied this technique to visualize the distribution of distinct gene products for the whole proteome across 11 developmental stages of the model organism Caenorhabditis elegans. The result is a large multidimensional dataset for which web-based tools were developed for visualizing how translated gene products change during development and identifying possible proteoforms. The underlying instrument raw files and tandem mass spectra may also be downloaded. The data resource is freely available on the web at http://www.yeastrc.org/wormpes/.

  6. Foundations of reusable and interoperable facet models using category theory

    PubMed Central

    2016-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse. PMID:27942248

  7. The Motor System Contributes to Comprehension of Abstract Language

    PubMed Central

    Guan, Connie Qun; Meng, Wanjin; Yao, Ru; Glenberg, Arthur M.

    2013-01-01

    If language comprehension requires a sensorimotor simulation, how can abstract language be comprehended? We show that preparation to respond in an upward or downward direction affects comprehension of the abstract quantifiers “more and more” and “less and less” as indexed by an N400-like component. Conversely, the semantic content of the sentence affects the motor potential measured immediately before the upward or downward action is initiated. We propose that this bidirectional link between motor system and language arises because the motor system implements forward models that predict the sensory consequences of actions. Because the same movement (e.g., raising the arm) can have multiple forward models for different contexts, the models can make different predictions depending on whether the arm is raised, for example, to place an object or raised as a threat. Thus, different linguistic contexts invoke different forward models, and the predictions constitute different understandings of the language. PMID:24086463

  8. Supporting ontology adaptation and versioning based on a graph of relevance

    NASA Astrophysics Data System (ADS)

    Sassi, Najla; Jaziri, Wassim; Alharbi, Saad

    2016-11-01

    Ontologies have recently become a topic of interest in computer science, since they are seen as a semantic support for making data models explicit, enriching them, and ensuring data interoperability. Moreover, supporting ontology adaptation becomes essential and extremely important, mainly when using ontologies in changing environments. An important issue when dealing with ontology adaptation is the management of several versions. Ontology versioning is a complex and multifaceted problem, as it should take into account change management, version storage and access, consistency issues, etc. The purpose of this paper is to propose an approach and tool for ontology adaptation and versioning. A series of techniques are proposed to 'safely' evolve a given ontology and produce a new consistent version. The ontology versions are ordered in a graph according to their relevance. The relevance is computed based on four criteria: conceptualisation, usage frequency, abstraction and completeness. The techniques to carry out the versioning process are implemented in the Consistology tool, which has been developed to assist users in expressing adaptation requirements and managing ontology versions.
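
    As a purely illustrative sketch of how the four relevance criteria named above might be combined into a single ranking score (the weights and the 0-1 scores below are invented; the paper's actual scoring scheme is not given in the abstract):

```python
from dataclasses import dataclass

@dataclass
class OntologyVersion:
    name: str
    conceptualisation: float  # each criterion scored in [0, 1] (assumed)
    usage_frequency: float
    abstraction: float
    completeness: float

    def relevance(self, weights=(0.25, 0.25, 0.25, 0.25)):
        scores = (self.conceptualisation, self.usage_frequency,
                  self.abstraction, self.completeness)
        return sum(w * s for w, s in zip(weights, scores))

versions = [OntologyVersion("v1", 0.6, 0.9, 0.5, 0.7),
            OntologyVersion("v2", 0.8, 0.4, 0.7, 0.9)]
# Order the version graph by descending relevance.
for v in sorted(versions, key=lambda v: v.relevance(), reverse=True):
    print(v.name, round(v.relevance(), 3))
```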

  9. Workshop on Analysis of Returned Comet Nucleus Samples

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This volume contains abstracts that were accepted by the Program Committee for presentation at the workshop on the analysis of returned comet nucleus samples held in Milpitas, California, January 16 to 18, 1989. The abstracts deal with the nature of cometary ices, cryogenic handling and sampling equipment, origin and composition of samples, and spectroscopic, thermal and chemical processing methods of cometary nuclei. Laboratory simulation experimental results on dust samples are reported. Some results obtained from Halley's comet are also included. Microanalytic techniques for examining trace elements of cometary particles, synchrotron X-ray fluorescence and instrumental neutron activation analysis (INAA), are presented.

  10. Observing RR Lyrae Variables in the M3 Globular Cluster with the BYU West Mountain Observatory (Abstract)

    NASA Astrophysics Data System (ADS)

    Joner, M. D.

    2016-06-01

    (Abstract only) We have utilized the 0.9-meter telescope of the Brigham Young University West Mountain Observatory to secure data on the northern hemisphere globular cluster NGC 5272 (M3). We made 216 observations in the V filter spaced between March and August 2012. We present light curves of the M3 RR Lyrae stars using different techniques. We compare light curves produced using DAOPHOT and ISIS software packages for stars in both the halo and core regions of this globular cluster. The light curve fitting is done using FITLC.

  11. A model-Driven Approach to Customize the Vocabulary of Communication Boards: Towards More Humanization of Health Care.

    PubMed

    Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N

    2015-01-01

    This work presents a Modeling Language and its technological infrastructure to customize the vocabulary of Communication Boards (CB), which are important tools to provide more humanization of health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as an audiologist or speech therapist) and application code. Moreover, the use of a metamodel enables a syntactic corrector for preventing the creation of wrong models. Our ML and metamodel enable more autonomy for health professionals in creating customized CB because they abstract complexities and permit professionals to deal only with the domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models. This way, the vocabulary modelling effort will decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving more autonomy to health professionals when they need to customize, share, and reuse vocabularies for CB.

  12. Decadal climate predictions improved by ocean ensemble dispersion filtering

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.

    2017-06-01

    Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, decadal climate prediction falls in between these two time scales. In recent years, more precise initialization techniques of coupled Earth system models and increased ensemble sizes have improved decadal predictions. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called the ensemble dispersion filter, yields more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean.

    Plain Language Summary: Decadal predictions aim to predict the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts; the ocean's heat capacity gives its memory large potential skill. In recent years, more precise initialization techniques of coupled Earth system models (including atmosphere and ocean) have improved decadal predictions. Ensembles are another important aspect: slightly perturbed predictions, which trigger the famous butterfly effect, form an ensemble, and evaluating the ensemble average rather than a single prediction improves a prediction system. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Our study shows that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, which applies the average during the model run and is called the ensemble dispersion filter, gives more accurate results than the standard prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution.
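
    The core of the ensemble dispersion filter described above (periodically relaxing each member's ocean state toward the ensemble mean) can be sketched in a few lines. The toy dynamics, state size, and relaxation factor below are invented for illustration and are not the study's model:

```python
import numpy as np

rng = np.random.default_rng(42)
n_members, n_state, n_steps, season = 8, 50, 240, 30
alpha = 0.5  # 0 = no filtering; 1 = collapse all members onto the mean

states = rng.standard_normal((n_members, n_state))
for step in range(1, n_steps + 1):
    # Stand-in for one model time step: damped dynamics plus noise.
    states = 0.98 * states + 0.1 * rng.standard_normal(states.shape)
    if step % season == 0:  # "seasonal" filtering interval
        ens_mean = states.mean(axis=0)
        states += alpha * (ens_mean - states)  # shift members toward the mean

print("final ensemble spread:", states.std(axis=0).mean())
```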

  13. Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems

    PubMed Central

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.

    2016-01-01

    Abstract. The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
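
    For readers unfamiliar with the channelized Hotelling observer used in this entry, here is a toy numerical sketch on synthetic images. Real studies use physiologically motivated channel banks (e.g., Gabor channels); the random orthonormal channels, image size, and signal below are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_ch, n_imgs = 64 * 64, 10, 200

# Random orthonormal channels stand in for e.g. a Gabor channel bank.
U, _ = np.linalg.qr(rng.standard_normal((n_pix, n_ch)))
signal = np.zeros(n_pix)
signal[2000:2050] = 0.5  # toy lesion profile

noise_only = rng.standard_normal((n_imgs, n_pix))
signal_present = rng.standard_normal((n_imgs, n_pix)) + signal

v_n, v_s = noise_only @ U, signal_present @ U        # channelized data
S = 0.5 * (np.cov(v_n.T) + np.cov(v_s.T))            # pooled covariance
w = np.linalg.solve(S, v_s.mean(0) - v_n.mean(0))    # Hotelling template

t_n, t_s = v_n @ w, v_s @ w                          # observer test statistics
snr = (t_s.mean() - t_n.mean()) / np.sqrt(0.5 * (t_s.var() + t_n.var()))
print("channelized Hotelling detectability (SNR):", round(float(snr), 2))
```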

  14. Eleventh international symposium on radiopharmaceutical chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document contains abstracts of papers which were presented at the Eleventh International Symposium on Radiopharmaceutical Chemistry. Sessions included: radiopharmaceuticals for the dopaminergic system, strategies for the production and use of labelled reactive small molecules, radiopharmaceuticals for measuring metabolism, radiopharmaceuticals for the serotonin and sigma receptor systems, labelled probes for molecular biology applications, radiopharmaceuticals for receptor systems, radiopharmaceuticals utilizing coordination chemistry, radiolabelled antibodies, radiolabelling methods for small molecules, and analytical techniques in radiopharmaceutical chemistry.

  15. Time Dependent Channel Packet Calculation of Two Nucleon Scattering Matrix Elements

    DTIC Science & Technology

    2010-03-01

    solutions, ψ_L(r) = A kr [cos δ_L j_L(kr) − sin δ_L η_L(kr)]   (3.70). Here, A is an arbitrary constant and δ_L is the phase shift... AFIT/DS/ENP/10-M03. A new approach to calculating nucleon-nucleon scattering matrix elements using a proven atomic time-dependent wave packet technique is investigated. Using this technique, reactant and product wave packets containing

  16. International conference on bone mineral measurement, October 12-13, 1973, Chicago, Illinois

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1973-12-31

    From international conference on bone mineral measurement; Chicago, Illinois, USA (12 Oct 1973). Abstracts of papers presented at the international conference on bone mineral measurement are presented. The papers were grouped into two sessions: a physical session including papers on measuring techniques, errors, interpretation and correlations, dual photon techniques, and data handling and exchange; and a biomedical session including papers on bone disease, osteoporosis, normative data, non-disease influences, renal, and activity and inactivity. (ERB)

  17. Summary of flat-plate solar array project documentation. Abstracts of published documents, 1975 to June 1982

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Technologies that will enable the private sector to manufacture and widely use photovoltaic systems for the generation of electricity in residential, commercial, industrial, and government applications at a cost per watt that is competitive with other means are investigated. Silicon refinement processes, advanced silicon sheet growth techniques, solar cell development, encapsulation, automated fabrication process technology, advanced module/array design, and module/array test and evaluation techniques are developed.

  18. Modeling-based design and assessment of an acousto-optic guided high-intensity focused ultrasound system

    PubMed Central

    Adams, Matthew T.; Cleveland, Robin O.; Roy, Ronald A.

    2017-01-01

    Abstract. Real-time acousto-optic (AO) sensing has been shown to noninvasively detect changes in ex vivo tissue optical properties during high-intensity focused ultrasound (HIFU) exposures. The technique is particularly appropriate for monitoring noncavitating lesions that offer minimal acoustic contrast. A numerical model is presented for an AO-guided HIFU system with an illumination wavelength of 1064 nm and an acoustic frequency of 1.1 MHz. To confirm the model's accuracy, it is compared to previously published experimental data gathered during AO-guided HIFU in chicken breast. The model is used to determine an optimal design for an AO-guided HIFU system, to assess its robustness, and to predict its efficacy for the ablation of large volumes. It was found that a through-transmission geometry results in the best performance, and an optical wavelength around 800 nm was optimal as it provided sufficient contrast with low absorption. Finally, it was shown that the strategy employed while treating large volumes with AO guidance has a major impact on the resulting necrotic volume and symmetry. PMID:28114454

  19. Structural investigation of porcine stomach mucin by X-ray fiber diffraction and homology modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veluraja, K.; Vennila, K.N.; Umamakeshvari, K.

    Research highlights: techniques to obtain oriented mucin fibres; an X-ray fibre diffraction pattern for mucin; molecular modeling of mucin based on the X-ray fibre diffraction pattern. Abstract: The basic understanding of the three-dimensional structure of mucin is essential to understand its physiological function. Technology has been developed to achieve oriented porcine stomach mucin molecules. X-ray fiber diffraction of partially oriented porcine stomach mucin molecules shows d-spacing signals at 2.99, 4.06, 4.22, 4.7, 5.37 and 6.5 Å. The high-intensity d-spacing signal at 4.22 Å is attributed to the antiparallel β-sheet structure identified in the fraction of the homology-modeled mucin molecule (amino acid residues 800-980) using the Nidogen-Laminin complex structure as a template. The X-ray fiber diffraction signal at 6.5 Å reveals partial organization of oligosaccharides in porcine stomach mucin. This partial structure of mucin will be helpful in establishing a three-dimensional structure for the whole mucin molecule.

  20. Problem Posing with Realistic Mathematics Education Approach in Geometry Learning

    NASA Astrophysics Data System (ADS)

    Mahendra, R.; Slamet, I.; Budiyono

    2017-09-01

    One of the difficulties students face in learning geometry is the subject of plane geometry, which requires them to understand abstract matter. The aim of this research is to determine the effect of the Problem Posing learning model with a Realistic Mathematics Education approach in geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results indicate that the Problem Posing learning model with a Realistic Mathematics Education approach can significantly improve students' conceptual understanding in geometry learning, especially on plane topics. This is because students taught with Problem Posing and a Realistic Mathematics Education approach become active in constructing their knowledge, posing problems, and solving problems in realistic settings, so it is easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with a Realistic Mathematics Education approach is appropriate for mathematics learning, especially for geometry material. Furthermore, it can improve student achievement.
    In addition, DOE/EH reports not included in previous volumes of the bibliography appear in this volume (abstracts 611 to 684). This volume (Volume 5 of the series) contains 217 abstracts. An author index and a subject index are provided to facilitate use; both contain the abstract numbers from previous volumes as well as the current volume. Information that the reader feels might be included in the next volume of this bibliography should be submitted to the BNL ALARA Center.

482. Indexes of the proceedings for the nine symposia (international) on detonation, 1951-89

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crane, S.L.; Deal, W.E.; Ramsay, J.B.

    1993-01-01

    The Proceedings of the nine Detonation Symposia have become the major archival source of information on international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal; for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the nine existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.

483. Intravital Fluorescence Videomicroscopy to Study Tumor Angiogenesis and Microcirculation

    PubMed Central

    Vajkoczy, Peter; Ullrich, Axel; Menger, Michael D.

    2000-01-01

    Angiogenesis and microcirculation play a central role in the growth and metastasis of human neoplasms and thus represent a major target for novel treatment strategies. Mechanistic analysis of the processes involved in tumor vascularization, however, requires sophisticated in vivo experimental models and techniques. Intravital microscopy allows direct assessment of tumor angiogenesis, microcirculation, and overall perfusion. Its application to the study of tumor-induced neovascularization further provides information on molecular transport and delivery, intra- and extravascular cell-to-cell and cell-to-matrix interaction, as well as tumor oxygenation and metabolism. With the recent advances in bioluminescent and fluorescent reporter genes appropriate for in vivo imaging, the intravital fluorescence microscopy approach has to be considered a powerful tool for studying the microvascular, cellular, and molecular mechanisms of tumor growth.
    PMID:10933068

484. Indexes of the Proceedings for the Ten International Symposia on Detonation, 1951-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deal, William E.; Ramsay, John B.; Roach, Alita M.

    1998-09-01

    The Proceedings of the ten Detonation Symposia have become the major archival source of information on international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal; for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the ten existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.

485. Playing biology's name game: identifying protein names in scientific text

    PubMed

    Hanisch, Daniel; Fluck, Juliane; Mevissen, Heinz-Theodor; Zimmer, Ralf

    2003-01-01

    A growing body of work is devoted to the extraction of protein or gene interaction information from the scientific literature. Yet the basis for most extraction algorithms, i.e. the specific and sensitive recognition of protein and gene names and their numerous synonyms, has not been adequately addressed. Here we describe the construction of a comprehensive general-purpose name dictionary and an accompanying automatic curation procedure based on a simple token model of protein names. We designed an efficient search algorithm to analyze all abstracts in MEDLINE in a reasonable amount of time on standard computers. The parameters of our method are optimized using machine learning techniques. Used in conjunction, these ingredients lead to good search performance. A supplementary web page is available at http://cartan.gmd.de/ProMiner/.
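    The approach the record above describes — a synonym dictionary compiled into token sequences and matched against tokenized text — can be sketched compactly. The following is a minimal illustration of that general dictionary-matching technique, not ProMiner's actual code or API; the tokenizer and the dictionary entries are invented for the example:

        import re
        from collections import defaultdict

        # Toy synonym dictionary: canonical name -> known synonyms (illustrative only).
        SYNONYMS = {
            "TP53": ["p53", "tumor protein 53", "TP53"],
            "EGFR": ["EGFR", "epidermal growth factor receptor", "HER1"],
        }

        def tokenize(text):
            """Simple token model: lowercase alphanumeric runs. Real systems treat
            case, punctuation, and Greek letters far more carefully."""
            return re.findall(r"[a-z0-9]+", text.lower())

        # Index each synonym's token sequence by its first token for fast lookup.
        index = defaultdict(list)
        for canonical, names in SYNONYMS.items():
            for name in names:
                toks = tuple(tokenize(name))
                index[toks[0]].append((toks, canonical))

        def find_names(text):
            """Return (matched surface form, canonical name) pairs found in text."""
            tokens = tokenize(text)
            hits = []
            for i, tok in enumerate(tokens):
                for seq, canonical in index.get(tok, []):
                    if tuple(tokens[i:i + len(seq)]) == seq:
                        hits.append((" ".join(seq), canonical))
            return hits

        print(find_names("Mutations in p53 and the epidermal growth factor receptor"))
        # -> [('p53', 'TP53'), ('epidermal growth factor receptor', 'EGFR')]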
486. Interpreting Low Spatial Resolution Thermal Data from Active Volcanoes on Io and the Earth

    NASA Technical Reports Server (NTRS)

    Keszthelyi, L.; Harris, A. J. L.; Flynn, L.; Davies, A. G.; McEwen, A.

    2001-01-01

    The style of volcanism was successfully determined at a number of active volcanoes on Io and the Earth using the same techniques to interpret thermal remote sensing data. Additional information is contained in the original extended abstract.

487. ADVANCES IN GREEN CHEMISTRY: CHEMICAL SYNTHESES USING MICROWAVE IRRADIATION, ISBN 81-901238-5-8

    EPA Science Inventory

    Microwave-accelerated chemical syntheses in solvents, as well as under solvent-free conditions, have witnessed explosive growth. The technique has found widespread application predomi...

488. Team Training and Evaluation Strategies: A State-of-the-Art Review

    ERIC Educational Resources Information Center

    Wagner, H.; And Others

    The Educational Resources Information Center (ERIC), the Defense Documentation Center (DDC), the National Technical Information Service (NTIS), Psychological Abstracts, the HumRRO Library, and industrial training publications were surveyed to analyze instructional and evaluative techniques relevant to team training. Research studies and team training…

489. Large Eddy Simulations using oodlesDST

    DTIC Science & Technology

    2016-01-01

    Report DST-Group-TR-3205. The oodlesDST code is based on OpenFOAM software and performs Large Eddy Simulations of ... maritime platforms using a variety of simulation techniques.
    He is currently using OpenFOAM software to perform both Reynolds Averaged Navier-Stokes ...

490. A GIS TECHNIQUE FOR ESTIMATING NATURAL ATTENUATION RATES AND MASS BALANCES

    EPA Science Inventory

    Regulatory approval of monitored natural attenuation (MNA) as a component of site remediation often requires a demonstration that contaminant mass has decreased significantly over time. Successful approval of MNA also typically requires an estimate of past and future n...
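    The demonstration the record above refers to — showing that contaminant mass has decreased significantly over time — is commonly made by fitting a first-order decay rate to mass or concentration estimates. The sketch below shows that generic calculation, not the GIS technique of the EPA record; the monitoring data are invented for the example:

        import math

        # Invented example data: years since monitoring began -> estimated mass (kg).
        years = [0.0, 1.0, 2.0, 3.0, 4.0]
        mass = [120.0, 95.0, 80.0, 62.0, 51.0]

        # First-order attenuation assumes m(t) = m0 * exp(-k * t), so ln(m) is
        # linear in t; fit the rate k by least squares on log-transformed masses.
        n = len(years)
        t_mean = sum(years) / n
        log_m = [math.log(m) for m in mass]
        lm_mean = sum(log_m) / n
        slope = sum((t - t_mean) * (lm - lm_mean) for t, lm in zip(years, log_m)) \
                / sum((t - t_mean) ** 2 for t in years)
        k = -slope

        print(f"attenuation rate k = {k:.3f}/yr, half-life = {math.log(2) / k:.1f} yr")
        # -> k = 0.214/yr, half-life = 3.2 yr for this invented series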
491. Library Searching: An Industrial User's Viewpoint

    ERIC Educational Resources Information Center

    Hendrickson, W. A.

    1982-01-01

    Discusses library searching of the chemical literature from an industrial user's viewpoint, focusing on differences between academic and industrial researchers' techniques for searching the same problem area. Indicates that industry users need more exposure to patents, work with abstracting services, and continued improvement in computer searching…

492. In-Vivo Techniques for Measuring Electrical Properties of Tissues

    DTIC Science & Technology

    1980-09-01

    Keywords: probe; electromagnetic energy; dielectric properties; monopole antenna; in-situ tissues; antemortem/postmortem studies; renal blood flow. From the abstract: ... mice or rats, which were positioned beneath a fixed measurement probe. Several alternative methods involving the use of semi-rigid or flexible coaxial...

493. Investigations into the Contamination of Lunar Return Material. Part 1: Surface Analysis and Imaging Investigations

    NASA Technical Reports Server (NTRS)

    Steele, A.; Toporski, J. K. W.; Avci, R.; Agee, C. B.; McKay, D. S.

    2001-01-01

    A suite of lunar soils has been investigated by imaging and in-situ spectroscopy techniques. A suite of contaminant plastics and potential microbes has been found. Additional information is contained in the original extended abstract.

494. Study of optimum methods of optical communication [accounting for the effects of the turbulent atmosphere and quantum mechanics]

    NASA Technical Reports Server (NTRS)

    Harger, R. O.

    1974-01-01

    Abstracts are reported relating to the techniques used in research on the optical transmission of information. Communication through the turbulent atmosphere, quantum mechanics, and quantum communication theory are discussed, along with the results.

495. Risk Assessment Strategies and Techniques for Combined Exposures

    EPA Science Inventory

    Rider, Cynthia V.; Simmons, Jane Ellen

    Consideration of cumulative risk is necessary to properly evaluate the safety of, and the risks associated with, combined exposures.
    These combined exposures ("mixtures") commonly occur from exposure to: envi...

496. Abstracted Workflow Framework with a Structure from Motion Application

    NASA Astrophysics Data System (ADS)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural initial mindset is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of cases, it turns out the software persists for many years and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend the capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiency and eliminate implementation redundancies. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features are extracted from the images and correspondences are established, which provides sufficient information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are recovered via triangulation algorithms. The camera-model parameters for all views are solved through bundle adjustment, establishing a highly consistent sparse point cloud. The sparse point cloud and camera matrices are then used to generate a dense point cloud through patch-based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques; in some cases it is "draped" with the original imagery to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easily leveraged and extended by multiple users. Like many processes in scientific and engineering domains, the SfM workflow is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages and environments, without knowledge of how each component fits into the larger system. In practice, this generally leads to issues with interfacing the components, building the software for the desired platforms, understanding its concept of operations, and manipulating it to fit the desired function of a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and build new functionality on current research. This requires a framework whereby new components can be easily plugged in without affecting the functionality already implemented. The need for a universal programming environment motivates the abstracted workflow framework. Its software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates that certain requirements be satisfied to provide a complete implementation, and the developer must document the component in terms of its overall function and inputs. The component's interface input and output values must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send those values. This process requires developers to componentize their algorithms rather than implement them monolithically. Although the demands on the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead and result in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation; the benefits are illustrated through a detailed examination of the SfM process as an example application.
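    The framework pattern the record above describes — components derive from a common base class, declare their inputs and outputs, and are chained into a workflow — can be sketched in a few lines. The code below is an invented illustration of that pattern, not Catena's actual API; all class and method names are assumptions:

        from abc import ABC, abstractmethod

        class Component(ABC):
            """Base class each workflow stage derives from; subclasses declare
            the names of the inputs they consume and the outputs they produce."""
            inputs = []
            outputs = []

            @abstractmethod
            def run(self, data):
                """Read declared inputs from `data`; return declared outputs."""

        class ExtractFeatures(Component):
            inputs = ["images"]
            outputs = ["features"]
            def run(self, data):
                # Stand-in for a real detector (e.g. SIFT): one fake keypoint set per image.
                return {"features": [f"kp({img})" for img in data["images"]]}

        class MatchFeatures(Component):
            inputs = ["features"]
            outputs = ["matches"]
            def run(self, data):
                feats = data["features"]
                # Pair consecutive views, as a stand-in for correspondence search.
                return {"matches": list(zip(feats, feats[1:]))}

        def run_workflow(stages, data):
            """Chain components, checking each declared interface before it runs."""
            for stage in stages:
                missing = [k for k in stage.inputs if k not in data]
                if missing:
                    raise KeyError(f"{type(stage).__name__} missing inputs: {missing}")
                data.update(stage.run(data))
            return data

        result = run_workflow([ExtractFeatures(), MatchFeatures()],
                              {"images": ["img0.png", "img1.png", "img2.png"]})
        print(result["matches"])

    Later SfM stages (triangulation, bundle adjustment, densification) would slot in as further Component subclasses, which is exactly the componentization the thesis argues for.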
497. Material Modeling and Ballistic-Resistance Analysis of Armor-Grade Composites Reinforced with High-Performance Fibers

    DTIC Science & Technology

    2009-12-01

    From the report documentation page: a new ballistic material model for 0/90 cross-plied oriented ultra-high molecular weight (UHMW) polyethylene fiber-based armor ... recently developed unit cell-based ballistic material model for the same class of composites (M. Grujicic, G. Arakere, T. ...).

498. Development of Intelligent Computer-Assisted Instruction Systems to Facilitate Reading Skills of Learning-Disabled Children

    DTIC Science & Technology

    1993-12-01
    The purpose of this thesis is to develop a high-level model to create self-adapting software which teaches learning... stimulating and demanding. The power of the system model described herein is that it can vary as needed by the individual student. The system will...

499. Accelerating Coagulation in Traumatic Injuries Using Inorganic Polyphosphate-Coated Silica Nanoparticles in a Swine (Sus scrofa) Model

    DTIC Science & Technology

    2018-03-13

    PROTOCOL #: FDG20160012A. DATE: 13 March 2018. PROTOCOL TITLE: Accelerating Coagulation... Attachment 1: Defense Technical Information Center (DTIC) Abstract Submission (Mandatory). This abstract requires a brief (no more than 200 words) factual summary of the...

500. Determining the Cardiovascular Effect of Partial versus Complete REBOA in a Porcine (Sus scrofa) Model of Hemorrhagic Shock

    DTIC Science & Technology

    2018-03-09

    PROTOCOL #: FDG20170005A. DATE: 9 March 2018. PROTOCOL TITLE: Determining... Attachment 1: Defense Technical Information Center (DTIC) Abstract Submission (Mandatory). This abstract requires a brief (no more than 200 words) factual summary of the...