Sample records for typing rule-based transformations

  1. CONSTRAINTS ON VARIABLES IN SYNTAX.

    ERIC Educational Resources Information Center

Ross, John Robert

In attempting to define "syntactic variable," the author bases his discussion on the assumption that syntactic facts are a collection of two types of rules--context-free phrase structure rules (generating underlying or deep phrase markers) and grammatical transformations, which map underlying phrase markers onto superficial (or surface) phrase…

  2. Enumeration of Ring–Chain Tautomers Based on SMIRKS Rules

    PubMed Central

    2015-01-01

    A compound exhibits (prototropic) tautomerism if it can be represented by two or more structures that are related by a formal intramolecular movement of a hydrogen atom from one heavy atom position to another. When the movement of the proton is accompanied by the opening or closing of a ring it is called ring–chain tautomerism. This type of tautomerism is well observed in carbohydrates, but it also occurs in other molecules such as warfarin. In this work, we present an approach that allows for the generation of all ring–chain tautomers of a given chemical structure. Based on Baldwin’s Rules estimating the likelihood of ring closure reactions to occur, we have defined a set of transform rules covering the majority of ring–chain tautomerism cases. The rules automatically detect substructures in a given compound that can undergo a ring–chain tautomeric transformation. Each transformation is encoded in SMIRKS line notation. All work was implemented in the chemoinformatics toolkit CACTVS. We report on the application of our ring–chain tautomerism rules to a large database of commercially available screening samples in order to identify ring–chain tautomers. PMID:25158156

  3. Transformation of Arden Syntax's medical logic modules into ArdenML for a business rules management system.

    PubMed

    Jung, Chai Young; Choi, Jong-Ye; Jeong, Seong Jik; Cho, Kyunghee; Koo, Yong Duk; Bae, Jin Hee; Kim, Sukil

    2016-05-16

Arden Syntax is a Health Level Seven International (HL7) standard language that is used for representing medical knowledge as logic statements. Arden Syntax Markup Language (ArdenML) is a new representation of Arden Syntax based on XML. Compilers are required to execute medical logic modules (MLMs) in the hospital environment. However, ArdenML may also replace the compiler. The purpose of this study is to demonstrate that MLMs, encoded in ArdenML, can be transformed into a commercial rule engine format through an XSLT stylesheet and made executable in a target system. The target rule engine selected was Blaze Advisor. We developed an XSLT stylesheet to transform MLMs in ArdenML into Structured Rules Language (SRL) in Blaze Advisor, through a comparison of syntax between the two languages. The stylesheet was then refined recursively, by building and applying rules collected from the billing and coding guidelines of the Korean health insurance service. Two nurse coders collected and verified the rules, and two information technology (IT) specialists encoded the MLMs and built the XSLT stylesheet. Finally, the stylesheet was validated by importing the MLMs into Blaze Advisor and applying them to claims data. The language comparison revealed that Blaze Advisor requires the declaration of variables with explicit types, whereas we used both integer and real numbers for numeric types in ArdenML; we designed the XSLT stylesheet to resolve this difference. "IF-THEN" statements and assignment statements in ArdenML become rules in Blaze Advisor. In addition, we maintained the order of rule execution in the transformed rules, and added two small programs to support variable declarations and action statements. A total of 1489 rules were reviewed during this study, of which 324 rules were collected. We removed duplicate rules and encoded 241 unique MLMs in ArdenML, which were successfully transformed into SRL and imported to Blaze Advisor via the XSLT stylesheet.
When applied to 73,841 outpatients' insurance claims data, the review result was the same as that of the legacy system. We have demonstrated that ArdenML can replace a compiler for transforming MLMs into commercial rule engine format. While the proposed XSLT stylesheet requires refinement for general use, we anticipate that the development of further XSLT stylesheets will support various rule engines. Copyright © 2016 Elsevier B.V. All rights reserved.
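The core mapping described above, in which each IF-THEN statement of an MLM becomes one rule in the target language, can be sketched with Python's standard XML tooling. The actual study used an XSLT stylesheet targeting Blaze Advisor's SRL; the element names and the output rule syntax below are simplified stand-ins, not real ArdenML or SRL.

```python
import xml.etree.ElementTree as ET

# Hypothetical ArdenML-like fragment; real ArdenML element names differ.
mlm = ET.fromstring("""
<logic>
  <if>
    <condition>age &gt;= 65</condition>
    <then><assign variable="copay" value="0"/></then>
  </if>
</logic>
""")

def to_rule(if_elem, name):
    # Mirror the paper's mapping: each IF-THEN statement becomes one rule.
    cond = if_elem.findtext("condition").strip()
    assign = if_elem.find("then/assign")
    return (f"rule {name} is\n"
            f"  if {cond} then\n"
            f"    {assign.get('variable')} = {assign.get('value')}.")

rules = [to_rule(e, f"r{i}") for i, e in enumerate(mlm.iter("if"), 1)]
print(rules[0])
```

An XSLT stylesheet expresses the same mapping declaratively, which is what makes the approach portable across rule engines.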

  4. Transformation of Graphical ECA Policies into Executable PonderTalk Code

    NASA Astrophysics Data System (ADS)

    Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard

Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For the effective development of rules it is desirable to start at a high level, e.g. with graphical rules, and to refine them into code of a particular rule language for implementation purposes later. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type. These are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.

  5. 76 FR 63566 - Efficiency and Renewables Advisory Committee, Appliance Standards Subcommittee, Negotiated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... Medium- and Low-Voltage Dry-Type Distribution Transformers AGENCY: Department of Energy, Office of Energy... Dry-Type and the second addressing Low-Voltage Dry-Type Distribution Transformers. The Liquid Immersed... proposed rule for regulating the energy efficiency of distribution transformers, as authorized by the...

  6. Quadrature rules with multiple nodes for evaluating integrals with strong singularities

    NASA Astrophysics Data System (ADS)

    Milovanovic, Gradimir V.; Spalevic, Miodrag M.

    2006-05-01

    We present a method based on the Chakalov-Popoviciu quadrature formula of Lobatto type, a rather general case of quadrature with multiple nodes, for approximating integrals defined by Cauchy principal values or by Hadamard finite parts. As a starting point we use the results obtained by L. Gori and E. Santi (cf. On the evaluation of Hilbert transforms by means of a particular class of Turan quadrature rules, Numer. Algorithms 10 (1995), 27-39; Quadrature rules based on s-orthogonal polynomials for evaluating integrals with strong singularities, Oberwolfach Proceedings: Applications and Computation of Orthogonal Polynomials, ISNM 131, Birkhauser, Basel, 1999, pp. 109-119). We generalize their results by using some of our numerical procedures for stable calculation of the quadrature formula with multiple nodes of Gaussian type and proposed methods for estimating the remainder term in such type of quadrature formulae. Numerical examples, illustrations and comparisons are also shown.
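The basic difficulty these quadrature rules address, integrands defined only as Cauchy principal values, can be illustrated with a much simpler device than the Chakalov-Popoviciu multiple-node formulas: subtract the singularity and apply an ordinary composite rule. This sketch is only a baseline for comparison, not the method of the paper.

```python
import math

def pv_unit(f, n=2000):
    """Cauchy principal value of the integral of f(x)/x over [-1, 1] via
    singularity subtraction: PV of f(x)/x equals the ordinary integral of
    (f(x) - f(0))/x, because the PV of f(0)/x over a symmetric interval
    vanishes."""
    f0, h = f(0.0), 2.0 / n
    total = 0.0
    for k in range(n):  # composite midpoint rule; even n never samples x = 0
        x = -1.0 + (k + 0.5) * h
        total += (f(x) - f0) / x
    return total * h

# PV of e^x / x over [-1, 1]; the exact value is about 2.114502
print(pv_unit(math.exp))
```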

  7. Developing a modular architecture for creation of rule-based clinical diagnostic criteria.

    PubMed

    Hong, Na; Pathak, Jyotishman; Chute, Christopher G; Jiang, Guoqian

    2016-01-01

With recent advances in computerized patient record systems, there is an urgent need to produce computable and standards-based clinical diagnostic criteria. Notably, constructing rule-based clinical diagnostic criteria has become one of the goals of the International Classification of Diseases (ICD)-11 revision. However, few studies have been done on building a unified architecture to support the need for diagnostic criteria computerization. In this study, we present a modular architecture for enabling the creation of rule-based clinical diagnostic criteria leveraging Semantic Web technologies. The architecture consists of two modules: an authoring module that utilizes a standards-based information model and a translation module that leverages Semantic Web Rule Language (SWRL). In a prototype implementation, we created a diagnostic criteria upper ontology (DCUO) that integrates the ICD-11 content model with the Quality Data Model (QDM). Using the DCUO, we developed a transformation tool that converts QDM-based diagnostic criteria into SWRL representation. We evaluated the domain coverage of the upper ontology model using randomly selected diagnostic criteria from broad domains (n = 20). We also tested the transformation algorithms using 6 QDM templates for ontology population and 15 QDM-based criteria data for rule generation. As a result, the first draft of the DCUO contains 14 root classes, 21 subclasses, 6 object properties and 1 data property. Investigation Findings, and Signs and Symptoms, are the two most commonly used element types. All 6 HQMF templates were successfully parsed and populated into their corresponding domain-specific ontologies, and 14 rules (93.3%) passed the rule validation. Our efforts in developing and prototyping a modular architecture provide useful insight into how to build a scalable solution to support diagnostic criteria representation and computerization.

  8. Enhanced image fusion using directional contrast rules in fuzzy transform domain.

    PubMed

    Nandal, Amita; Rosales, Hamurabi Gamboa

    2016-01-01

    In this paper a novel image fusion algorithm based on directional contrast in fuzzy transform (FTR) domain is proposed. Input images to be fused are first divided into several non-overlapping blocks. The components of these sub-blocks are fused using directional contrast based fuzzy fusion rule in FTR domain. The fused sub-blocks are then transformed into original size blocks using inverse-FTR. Further, these inverse transformed blocks are fused according to select maximum based fusion rule for reconstructing the final fused image. The proposed fusion algorithm is both visually and quantitatively compared with other standard and recent fusion algorithms. Experimental results demonstrate that the proposed method generates better results than the other methods.

  9. Theoretical and subjective bit assignments in transform picture

    NASA Technical Reports Server (NTRS)

    Jones, H. W., Jr.

    1977-01-01

    It is shown that all combinations of symmetrical input distributions with difference distortion measures give a bit assignment rule identical to the well-known rule for a Gaussian input distribution with mean-square error. Published work is examined to show that the bit assignment rule is useful for transforms of full pictures, but subjective bit assignments for transform picture coding using small block sizes are significantly different from the theoretical bit assignment rule. An intuitive explanation is based on subjective design experience, and a subjectively obtained bit assignment rule is given.
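The well-known Gaussian/mean-square-error rule that the paper shows to hold more generally allocates to each transform coefficient the average rate plus half the base-2 log of its variance relative to the geometric mean of all variances. A minimal sketch with invented coefficient variances:

```python
import math

def bit_assignment(variances, total_bits):
    """Classic transform-coding bit allocation: each coefficient receives the
    average rate plus half the log2 ratio of its variance to the geometric
    mean variance (the rule the abstract shows is shared by all symmetric
    input distributions with difference distortion measures)."""
    n = len(variances)
    avg = total_bits / n
    log_gm = sum(math.log2(v) for v in variances) / n  # log2 of geometric mean
    return [avg + 0.5 * (math.log2(v) - log_gm) for v in variances]

# Hypothetical coefficient variances; higher variance earns more bits.
bits = bit_assignment([16.0, 4.0, 1.0, 0.25], total_bits=8)
print(bits)  # the allocation always sums to the total bit budget
```

Note that the rule can assign fractional or even negative bits; practical coders round and clamp, which is one source of the subjective deviations the paper reports.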

  10. Classification of the Gabon SAR Mosaic Using a Wavelet Based Rule Classifier

    NASA Technical Reports Server (NTRS)

    Simard, Marc; Saatchi, Sasan; DeGrandi, Gianfranco

    2000-01-01

A method is developed for semi-automated classification of SAR images of the tropical forest. Information is extracted using the wavelet transform (WT). The transform allows for extraction of structural information in the image as a function of scale. In order to classify the SAR image, a Decision Tree Classifier is used. The method of pruning is used to optimize classification rate versus tree size. The results give explicit insight into the type of information useful for a given class.

  11. Modeling for (physical) biologists: an introduction to the rule-based approach

    PubMed Central

    Chylek, Lily A; Harris, Leonard A; Faeder, James R; Hlavacek, William S

    2015-01-01

    Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain, specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions. PMID:26178138
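The notion of a rule as a generalized reaction, a transformation plus the local properties reactants must possess to participate, can be illustrated with a toy pattern matcher. All molecule names, site labels, and the rate constant below are invented; real rule-based tools such as BioNetGen use a much richer pattern language.

```python
# Toy illustration of a rule as a generalized reaction: one pattern-based
# rule yields a concrete reaction for every reactant whose local properties
# match (here: any molecule with an unphosphorylated site 'Y').
proteins = [
    {"name": "A", "sites": {"Y": "u"}},
    {"name": "B", "sites": {"Y": "p"}},            # already phosphorylated
    {"name": "C", "sites": {"Y": "u", "S": "u"}},  # extra site is irrelevant
]

def phosphorylation_rule(pool, site="Y", k=0.1):
    # Yield (reaction, rate constant) for each molecule matching the pattern;
    # the rate law attached to the rule applies to every generated reaction.
    for mol in pool:
        if mol["sites"].get(site) == "u":
            yield f"{mol['name']}({site}~u) -> {mol['name']}({site}~p)", k

reactions = list(phosphorylation_rule(proteins))
for rxn, k in reactions:
    print(rxn, "with rate constant", k)
```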

  12. 76 FR 45471 - Energy Efficiency Standards for Distribution Transformers; Notice of Intent To Negotiate Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-29

    ... EERE-2010-BT-STD-0048] RIN 1904-AC04 Energy Efficiency Standards for Distribution Transformers; Notice...-type distribution transformers. The purpose of the subcommittee will be to discuss and, if possible, reach consensus on a proposed rule for the energy efficiency of distribution transformers, as authorized...

  13. 76 FR 57007 - Efficiency and Renewables Advisory Committee, Appliance Standards Subcommittee, Negotiated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... Distribution Transformers AGENCY: Department of Energy, Office of Energy Efficiency and Renewable Energy... Rulemaking Working Group for Low-Voltage Dry-Type Distribution Transformers (hereafter ``LV Group''). The LV... proposed rule for regulating the energy efficiency of distribution transformers, as authorized by the...

  14. Multispectral multisensor image fusion using wavelet transforms

    USGS Publications Warehouse

    Lemeshewsky, George P.

    1999-01-01

    Fusion techniques can be applied to multispectral and higher spatial resolution panchromatic images to create a composite image that is easier to interpret than the individual images. Wavelet transform-based multisensor, multiresolution fusion (a type of band sharpening) was applied to Landsat thematic mapper (TM) multispectral and coregistered higher resolution SPOT panchromatic images. The objective was to obtain increased spatial resolution, false color composite products to support the interpretation of land cover types wherein the spectral characteristics of the imagery are preserved to provide the spectral clues needed for interpretation. Since the fusion process should not introduce artifacts, a shift invariant implementation of the discrete wavelet transform (SIDWT) was used. These results were compared with those using the shift variant, discrete wavelet transform (DWT). Overall, the process includes a hue, saturation, and value color space transform to minimize color changes, and a reported point-wise maximum selection rule to combine transform coefficients. The performance of fusion based on the SIDWT and DWT was evaluated with a simulated TM 30-m spatial resolution test image and a higher resolution reference. Simulated imagery was made by blurring higher resolution color-infrared photography with the TM sensors' point spread function. The SIDWT based technique produced imagery with fewer artifacts and lower error between fused images and the full resolution reference. Image examples with TM and SPOT 10-m panchromatic illustrate the reduction in artifacts due to the SIDWT based fusion.

  15. Target-Based Maintenance of Privacy Preserving Association Rules

    ERIC Educational Resources Information Center

    Ahluwalia, Madhu V.

    2011-01-01

    In the context of association rule mining, the state-of-the-art in privacy preserving data mining provides solutions for categorical and Boolean association rules but not for quantitative association rules. This research fills this gap by describing a method based on discrete wavelet transform (DWT) to protect input data privacy while preserving…
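The DWT building block underlying such a method can be sketched with a one-level orthonormal Haar transform. The dissertation's actual perturbation scheme for protecting quantitative data is not detailed in the truncated abstract, so only the transform pair is shown; the data values are invented.

```python
import math

def haar_forward(x):
    """One level of the orthonormal Haar DWT: pairwise scaled sums give the
    approximation band, pairwise scaled differences give the detail band."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    # Exact inverse of the step above: (a + d)/sqrt(2) and (a - d)/sqrt(2).
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

# Invented sample of a quantitative attribute (length must be even).
data = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0]
approx, detail = haar_forward(data)
restored = haar_inverse(approx, detail)  # perfect reconstruction
```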

  16. Developing a reversible rapid coordinate transformation model for the cylindrical projection

    NASA Astrophysics Data System (ADS)

    Ye, Si-jing; Yan, Tai-lai; Yue, Yan-li; Lin, Wei-yan; Li, Lin; Yao, Xiao-chuang; Mu, Qin-yun; Li, Yong-qin; Zhu, De-hai

    2016-04-01

Numerical models are widely used for coordinate transformations. However, in most numerical models, polynomials are generated to approximate "true" geographic coordinates or plane coordinates, and one polynomial is hard to make simultaneously appropriate for both forward and inverse transformations. As there is a transformation rule between geographic coordinates and plane coordinates, how accurate and efficient is the calculation of the coordinate transformation if we construct polynomials to approximate the transformation rule instead of the "true" coordinates? In addition, are models using such polynomials preferable to traditional numerical models with even higher exponents? Focusing on cylindrical projection, this paper reports on a grid-based rapid numerical transformation model - a linear rule approximation model (LRA-model) that constructs linear polynomials to approximate the transformation rule and uses a graticule to alleviate error propagation. Our experiments on cylindrical projection transformation between the WGS 84 Geographic Coordinate System (EPSG 4326) and the WGS 84 UTM ZONE 50N Plane Coordinate System (EPSG 32650) with simulated data demonstrate that the LRA-model exhibits high efficiency, high accuracy, and high stability; is simple and easy to use for both forward and inverse transformations; and can be applied to the transformation of a large amount of data with a requirement of high calculation efficiency. Furthermore, the LRA-model exhibits advantages in terms of calculation efficiency, accuracy and stability for coordinate transformations, compared to the widely used hyperbolic transformation model.
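The LRA-model's idea, approximating the transformation rule with linear pieces on a graticule rather than fitting one global polynomial to coordinates, can be sketched for a simple spherical Mercator projection. The paper itself targets the WGS 84 UTM transformation; the projection and 1-degree cell size here are illustrative only.

```python
import math

R = 6378137.0  # sphere radius; a spherical Mercator stands in for UTM here

def mercator_y(lat_deg):
    """Direct (exact) northing for a spherical Mercator projection."""
    phi = math.radians(lat_deg)
    return R * math.log(math.tan(math.pi / 4.0 + phi / 2.0))

# Graticule of 1-degree cells from 0 to 60 degrees north: the linear pieces
# approximate the transformation rule itself, not the coordinates.
nodes = [(float(d), mercator_y(d)) for d in range(61)]

def y_linear(lat_deg):
    """Forward transform: linear interpolation inside the graticule cell."""
    i = min(int(lat_deg), 59)
    (d0, y0), (d1, y1) = nodes[i], nodes[i + 1]
    return y0 + (y1 - y0) * (lat_deg - d0)

def lat_linear(y):
    """Inverse transform: the same linear pieces invert in closed form,
    which is what makes one model serve both directions."""
    for (d0, y0), (d1, y1) in zip(nodes, nodes[1:]):
        if y0 <= y <= y1:
            return d0 + (y - y0) / (y1 - y0)
    raise ValueError("northing outside the graticule")

lat = 45.5
print("interpolation error (m):", abs(y_linear(lat) - mercator_y(lat)))
print("forward/inverse round trip:", lat_linear(y_linear(lat)))
```

A finer graticule shrinks the interpolation error quadratically, which is the trade-off the paper's accuracy experiments quantify.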

  17. Implementation of artificial intelligence rules in a data base management system

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1986-01-01

The intelligent front end prototype was transformed into a RIM-integrated system. A RIM-based expert system was written which demonstrated the developed capability. The use of rules to provide extensibility of the intelligent front end, including the concept of demons and rule-manipulation rules, was investigated. Innovative approaches, such as syntax programming, were also to be considered.

  18. Local Subspace Classifier with Transform-Invariance for Image Classification

    NASA Astrophysics Data System (ADS)

    Hotta, Seiji

A family of linear subspace classifiers called the local subspace classifier (LSC) outperforms the k-nearest neighbor rule (kNN) and conventional subspace classifiers in handwritten digit classification. However, LSC suffers from very high sensitivity to image transformations because it uses projection and Euclidean distances for classification. In this paper, I present a combination of the local subspace classifier (LSC) and a tangent distance (TD) for improving the accuracy of handwritten digit recognition. In this classification rule, transform-invariance is easy to handle because tangent vectors can be used to approximate transformations. However, tangent vectors cannot be used for other types of images, such as color images. Hence, kernel LSC (KLSC) is proposed for incorporating transform-invariance into LSC via kernel mapping. The performance of the proposed methods is verified with experiments on handwritten digit and color image classification.

  19. Theory and operational rules for the discrete Hankel transform.

    PubMed

    Baddour, Natalie; Chouinard, Ugo

    2015-04-01

    Previous definitions of a discrete Hankel transform (DHT) have focused on methods to approximate the continuous Hankel integral transform. In this paper, we propose and evaluate the theory of a DHT that is shown to arise from a discretization scheme based on the theory of Fourier-Bessel expansions. The proposed transform also possesses requisite orthogonality properties which lead to invertibility of the transform. The standard set of shift, modulation, multiplication, and convolution rules are derived. In addition to the theory of the actual manipulated quantities which stand in their own right, this DHT can be used to approximate the continuous forward and inverse Hankel transform in the same manner that the discrete Fourier transform is known to be able to approximate the continuous Fourier transform.

  20. New derivation of the wavefront curvature transformation at an interface between two inhomogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uzsin, B.

The principles for ray-tracing and wavefront curvature calculations in a three-dimensional medium are reviewed. A new derivation of the transformation of the wavefront curvature matrix at an interface between two inhomogeneous media is given. The derivation is based on a Taylor series expansion of the ray refraction equation at the interface between two inhomogeneous media, and only elementary geometric arguments are used. The wavefront curvature transformation at the interface is obtained by neglecting all terms in the direction of the surface normal. With proper definition of the variables, the derivation is also valid for a reflected wavefront. A simplified transformation rule is derived for a reflected wave of the same type as the incident wave.

  1. Chinese Passives: Transformational or Lexical?

    ERIC Educational Resources Information Center

    Zhang, Jiuwu; Wen, Xiaohong

    Analysis of Chinese passive constructions indicates two types. The first is a verbal or syntactic passive because it is derived through a transformational rule. The second is a lexical passive that has certain properties in common with the predicate adjectives in both Chinese and English and is derived through the semantic function and in lexical…

  2. Some Lexical Redundancy Rules for English Nouns.

    ERIC Educational Resources Information Center

    Starosta, Stanley

    In line with current thinking in transformational grammar, syntax as a system can and should be studied before a study is made of the use of that system. Chomsky's lexical redundancy rule is an area for further study, possibly to come closer to defining and achieving explanatory adequacy. If it is observed that English nouns come in two types,…

  3. A new network representation of the metabolism to detect chemical transformation modules.

    PubMed

    Sorokina, Maria; Medigue, Claudine; Vallenet, David

    2015-11-14

Metabolism is generally modeled by directed networks in which nodes represent reactions and/or metabolites. In order to explore metabolic pathway conservation and divergence among organisms, previous studies relied on graph alignment to find similar pathways. A few years ago, the concept of chemical transformation modules, also called reaction modules, was introduced; these modules correspond to sequences of chemical transformations that are conserved in metabolism. We propose here a novel graph representation of the metabolic network in which reactions sharing the same chemical transformation type are grouped into Reaction Molecular Signatures (RMS). RMS were automatically computed for all reactions and encode changes in atoms and bonds. A reaction network containing all available metabolic knowledge was then reduced by an aggregation of reaction nodes and edges to obtain an RMS network. Paths in this network were explored, and a substantial number of conserved chemical transformation modules was detected. Furthermore, this graph-based formalism allows us to define several path scores reflecting different biological conservation meanings. These scores are significantly higher for paths corresponding to known metabolic pathways and were used jointly to build association rules that should predict metabolic pathway types such as biosynthesis or degradation. This representation of metabolism as an RMS network offers new insights to capture relevant metabolic contexts. Furthermore, along with genomic context methods, it should improve the detection of gene clusters corresponding to new metabolic pathways.

  4. Numerical calculation of the Fresnel transform.

    PubMed

    Kelly, Damien P

    2014-04-01

    In this paper, we address the problem of calculating Fresnel diffraction integrals using a finite number of uniformly spaced samples. General and simple sampling rules of thumb are derived that allow the user to calculate the distribution for any propagation distance. It is shown how these rules can be extended to fast-Fourier-transform-based algorithms to increase calculation efficiency. A comparison with other theoretical approaches is made.
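A commonly quoted sampling criterion for FFT-based Fresnel calculations (not necessarily the exact rules of thumb derived in the paper) says the direct single-FFT method adequately samples the quadratic chirp only beyond a critical distance set by the sample count and spacing. Parameter values below are illustrative.

```python
# Back-of-envelope check of a standard sampling criterion for FFT-based
# Fresnel propagation; the paper derives its own, more refined rules.
wavelength = 633e-9          # HeNe laser, metres
aperture = 5e-3              # extent L of the sampled field, metres
samples = 1024               # N uniformly spaced samples
delta = aperture / samples   # sample spacing

# The quadratic chirp exp(i*pi*x**2 / (wavelength*z)) is adequately sampled
# by the direct single-FFT method only beyond this critical distance.
z_critical = samples * delta**2 / wavelength
print(f"critical distance: {z_critical:.4f} m")
```

Below the critical distance, other formulations (e.g. the angular spectrum or two-step methods) are the usual remedy, which is why distance-dependent rules of thumb matter in practice.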

  5. Kinetic Monte Carlo Method for Rule-based Modeling of Biochemical Networks

    PubMed Central

    Yang, Jin; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.

    2009-01-01

    We present a kinetic Monte Carlo method for simulating chemical transformations specified by reaction rules, which can be viewed as generators of chemical reactions, or equivalently, definitions of reaction classes. A rule identifies the molecular components involved in a transformation, how these components change, conditions that affect whether a transformation occurs, and a rate law. The computational cost of the method, unlike conventional simulation approaches, is independent of the number of possible reactions, which need not be specified in advance or explicitly generated in a simulation. To demonstrate the method, we apply it to study the kinetics of multivalent ligand-receptor interactions. We expect the method will be useful for studying cellular signaling systems and other physical systems involving aggregation phenomena. PMID:18851068
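The essence of the method, computing propensities from rule match counts and selecting concrete reactants only after a rule fires, can be sketched for a single reversible ligand-receptor binding rule. The counts and rate constants below are arbitrary, and real implementations track individual particles with structured states.

```python
import random

random.seed(0)  # reproducible sketch

# Minimal particle-style Gillespie loop for one reversible binding rule,
# L + R <-> LR: propensities come from counts of matching particles, and a
# concrete event is drawn only after a rule fires, so the cost is
# independent of how many distinct reactions the rules could generate.
free_L, free_R, bound = 50, 30, 0
k_on, k_off = 1e-3, 0.1
t = 0.0
for _ in range(200):
    a_on = k_on * free_L * free_R   # rule propensity from match counts
    a_off = k_off * bound
    a_total = a_on + a_off
    if a_total == 0.0:
        break
    t += random.expovariate(a_total)          # exponential waiting time
    if random.random() < a_on / a_total:      # choose which rule fires
        free_L, free_R, bound = free_L - 1, free_R - 1, bound + 1
    else:
        free_L, free_R, bound = free_L + 1, free_R + 1, bound - 1

print(f"t = {t:.2f}, free L = {free_L}, free R = {free_R}, bound = {bound}")
```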

  6. 76 FR 53763 - Immigration Benefits Business Transformation, Increment I

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-29

    ...The Department of Homeland Security (DHS) is amending its regulations to enable U.S. Citizenship and Immigration Services (USCIS) to migrate from a paper file-based, non-integrated systems environment to an electronic customer-focused, centralized case management environment for benefit processing. This transformation process will allow USCIS to streamline benefit processing, eliminate the capture and processing of redundant data, and reduce the number of and automate its forms. This transformation process will be a phased multi-year initiative to restructure USCIS business processes and related information technology systems. DHS is removing references to form numbers, form titles, expired regulatory provisions, and descriptions of internal procedures, many of which will change during transformation. DHS is also finalizing interim rules that permitted submission of benefit requests with an electronic signature when such requests are submitted in an electronic format rather than on a paper form and that removed references to filing locations for immigration benefits. In addition, in this rule DHS is publishing the final rule for six other interim rules published during the past several years, most of which received no public comments.

  7. Multi-focus image fusion based on area-based standard deviation in dual tree contourlet transform domain

    NASA Astrophysics Data System (ADS)

    Dong, Min; Dong, Chenghui; Guo, Miao; Wang, Zhe; Mu, Xiaomin

    2018-04-01

Multiresolution-based methods, such as wavelet and Contourlet, are usually used for image fusion. This work presents a new image fusion framework utilizing area-based standard deviation in the dual-tree Contourlet transform domain. Firstly, the pre-registered source images are decomposed with the dual-tree Contourlet transform, yielding low-pass and high-pass coefficients. Then, the low-pass bands are fused with a weighted average based on area standard deviation rather than the simple "averaging" rule, while the high-pass bands are merged with the "max-absolute" fusion rule. Finally, the modified low-pass and high-pass coefficients are used to reconstruct the final fused image. The major advantage of the proposed fusion method over conventional fusion is the approximate shift invariance and multidirectional selectivity of the dual-tree Contourlet transform. The proposed method is compared with wavelet- and Contourlet-based methods and other state-of-the-art methods on commonly used multi-focus images. Experiments demonstrate that the proposed fusion framework is feasible and effective, and that it performs better in both subjective and objective evaluation.
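The two fusion rules named in the abstract can be sketched on plain 1-D blocks, leaving out the dual-tree Contourlet decomposition entirely; the sample values are invented.

```python
import statistics

def fuse_lowpass(block_a, block_b):
    """Low-pass rule: weight each source block by its (area) standard
    deviation instead of plain averaging, so the higher-contrast source
    dominates the fused approximation."""
    sa, sb = statistics.pstdev(block_a), statistics.pstdev(block_b)
    if sa + sb == 0.0:
        return [(a + b) / 2.0 for a, b in zip(block_a, block_b)]
    wa = sa / (sa + sb)
    return [wa * a + (1.0 - wa) * b for a, b in zip(block_a, block_b)]

def fuse_highpass(coeffs_a, coeffs_b):
    """High-pass "max-absolute" rule: keep the stronger coefficient."""
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

sharp = [10.0, 200.0, 15.0, 180.0]    # high contrast, larger std deviation
blurred = [90.0, 110.0, 95.0, 105.0]  # low contrast
print(fuse_lowpass(sharp, blurred))   # leans toward the sharper block
print(fuse_highpass([-5.0, 2.0], [3.0, -1.0]))
```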

  8. A real-time expert system for self-repairing flight control

    NASA Technical Reports Server (NTRS)

    Gaither, S. A.; Agarwal, A. K.; Shah, S. C.; Duke, E. L.

    1989-01-01

    An integrated environment for specifying, prototyping, and implementing a self-repairing flight-control (SRFC) strategy is described. At an interactive workstation, the user can select paradigms such as rule-based expert systems, state-transition diagrams, and signal-flow graphs and hierarchically nest them, assign timing and priority attributes, establish blackboard-type communication, and specify concurrent execution on single or multiple processors. High-fidelity nonlinear simulations of aircraft and SRFC systems can be performed off-line, with the possibility of changing SRFC rules, inference strategies, and other heuristics to correct for control deficiencies. Finally, the off-line-generated SRFC can be transformed into highly optimized application-specific real-time C-language code. An application of this environment to the design of aircraft fault detection, isolation, and accommodation algorithms is presented in detail.

  9. Transformation of PRT6 RNAi construct into tomato (Solanum lycopersicum) cv. Micro-Tom

    NASA Astrophysics Data System (ADS)

    Suka, Intan Elya; Chew, Bee Lynn; Goh, Hoe-Han; Isa, Nurulhikma Md

    2018-04-01

PROTEOLYSIS 6 plays a major role in the N-end rule pathway as an N-recognin, which functions as an E3 ligase enzyme. It mediates ubiquitination processes that lead to degradation of unstable substrate proteins. The aim of the current study is to transform the PRT6 gene into tomato (Solanum lycopersicum) cv. Micro-Tom and to investigate its function in regulating ripening in tomato fruits. The PRT6_RNAi construct was successfully transformed into Agrobacterium C58 via the heat shock method and transformed into seven-day-old cotyledon explants. Factors affecting transformation efficiency, such as co-cultivation time and the type of plant growth regulator combination, were evaluated. Results from this study found that pre-cultured cotyledons from seven-day-old seedlings incubated for 2 days in co-cultivation medium increased shoot regeneration. The plant growth hormone zeatin combined with auxin produced a higher number of callus formations but lower shoot proliferation and transformation frequency compared to treatments with a single plant hormone in the selection medium. Polymerase chain reaction (PCR) was performed on the regenerated shoots to confirm the integration of the PRT6 fragment into the genome of transgenic plants. Based on PCR analysis, all putative shoots were positive transformants.

  10. Transformation rules and degradation of CAHs by Fentonlike oxidation in growth ring of water distribution network-A review

    NASA Astrophysics Data System (ADS)

    Zhong, D.; Ma, W. C.; Jiang, X. Q.; Yuan, Y. X.; Yuan, Y.; Wang, Z. Q.; Fang, T. T.; Huang, W. Y.

    2017-08-01

Chlorinated hydrocarbons are widely used as organic solvents and chemical raw materials. After treatment, water polluted with trichloroethylene (TCE)/tetrachloroethylene (PCE) can reach water quality requirements, but water with trace amounts of TCE/PCE is still harmful to humans and can cause cancers. A water distribution network is an extremely complicated system, in which adsorption, desorption, flocculation, movement, transformation and reduction occur, leading to changes in TCE/PCE concentrations and products. Therefore, it is important to investigate the transformation rules of TCE/PCE in water distribution networks. What is more, growth ring, including deposits in drinking water pipes, can act as a catalyst for Fenton-like reagent (H2O2). This review summarizes the status of transformation rules of CAHs in water distribution networks. It also evaluates the effectiveness and results of CAHs degradation by Fenton-like reagent based on growth ring. This review is important for solving the potential safety problems caused by TCE/PCE in water distribution networks.

  11. Unconstrained handwritten numeral recognition based on radial basis competitive and cooperative networks with spatio-temporal feature representation.

    PubMed

    Lee, S; Pan, J J

    1996-01-01

    This paper presents a new approach to the representation and recognition of handwritten numerals. The approach first transforms a two-dimensional (2-D) spatial representation of a numeral into a three-dimensional (3-D) spatio-temporal representation by identifying the tracing sequence based on a set of heuristic rules acting as transformation operators. A multiresolution critical-point segmentation method is then proposed to extract local feature points at varying degrees of scale and coarseness. A new neural network architecture, referred to as the radial-basis competitive and cooperative network (RCCN), is presented specifically for handwritten numeral recognition. RCCN is a globally competitive and locally cooperative network with the capability of self-organizing hidden units to progressively achieve desired network performance, and it functions as a universal approximator of arbitrary input-output mappings. Three types of RCCNs are explored: input-space RCCN (IRCCN), output-space RCCN (ORCCN), and bidirectional RCCN (BRCCN). Experiments on handwritten ZIP code numerals acquired by the U.S. Postal Service indicated that the proposed method is robust to variations, deformations, transformations, and corruption, achieving a recognition rate of about 97%.

  12. Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    PubMed Central

    Stover, Lori J.; Nair, Niketh S.; Faeder, James R.

    2014-01-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. 
Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility. PMID:24699269

  13. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    PubMed

    Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R

    2014-04-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. 
Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility.
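    The particle-based "network-free" simulation that the hybrid method builds on is, at its core, Gillespie-style kinetic Monte Carlo. A minimal sketch for a single hypothetical reaction A + B -> C (illustrative only, not BioNetGen/NFsim code) is:

```python
import random

def gillespie_ssa(n_a, n_b, n_c, rate, t_end, seed=0):
    """Direct-method SSA for the single reaction A + B -> C.

    n_a, n_b, n_c: initial copy numbers; rate: stochastic rate constant.
    Returns the final copy numbers at time t_end (hypothetical system).
    """
    rng = random.Random(seed)
    t = 0.0
    while True:
        propensity = rate * n_a * n_b      # mass-action propensity
        if propensity == 0.0:
            break                          # no reaction can fire anymore
        t += rng.expovariate(propensity)   # exponentially distributed wait
        if t > t_end:
            break
        n_a, n_b, n_c = n_a - 1, n_b - 1, n_c + 1  # fire A + B -> C
    return n_a, n_b, n_c
```

    The cost per event is independent of how many species could potentially exist, which is the advantage the abstract attributes to network-free methods; memory and run time instead grow with the number of particles, motivating the hybrid treatment of abundant species as population variables.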

  14. Transformation based endorsement systems

    NASA Technical Reports Server (NTRS)

    Sudkamp, Thomas

    1988-01-01

    Evidential reasoning techniques classically represent support for a hypothesis by a numeric value or an evidential interval. The combination of support is performed by an arithmetic rule, which often requires restrictions to be placed on the set of possibilities; these restrictions usually require the hypotheses to be exhaustive and mutually exclusive. Endorsement-based classification systems represent support for the alternatives symbolically rather than numerically. A framework for constructing endorsement systems is presented in which transformations are defined to generate and update the knowledge base. The interaction of the knowledge base and the transformations produces a non-monotonic reasoning system. Two endorsement-based reasoning systems are presented to demonstrate the flexibility of the transformational approach for reasoning with ambiguous and inconsistent information.

  15. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  16. Willpower and Personal Rules.

    ERIC Educational Resources Information Center

    Benabou, Roland; Tirole, Jean

    2004-01-01

    We develop a theory of internal commitments or "personal rules" based on self-reputation over one's willpower, which transforms lapses into precedents that undermine future self-restraint. The foundation for this mechanism is the imperfect recall of past motives and feelings, leading people to draw inferences from their past actions. The degree of…

  17. Rule-based support system for multiple UMLS semantic type assignments

    PubMed Central

    Geller, James; He, Zhe; Perl, Yehoshua; Morrey, C. Paul; Xu, Julia

    2012-01-01

    Background When new concepts are inserted into the UMLS, they are assigned one or several semantic types from the UMLS Semantic Network by the UMLS editors. However, not every combination of semantic types is permissible. It was observed that many concepts with rare combinations of semantic types have erroneous semantic type assignments or prohibited combinations of semantic types. The correction of such errors is resource-intensive. Objective We design a computational system to inform UMLS editors as to whether a specific combination of two, three, four, or five semantic types is permissible, prohibited, or questionable. Methods We identify a set of inclusion and exclusion instructions in the UMLS Semantic Network documentation and derive corresponding rule-categories, as well as rule-categories from the UMLS concept content. We then design an algorithm, adviseEditor, based on these rule-categories. The algorithm specifies rules for how an editor should proceed when considering a tuple (pair, triple, quadruple, quintuple) of semantic types to be assigned to a concept. Results Eight rule-categories were identified. A Web-based system was developed to implement the adviseEditor algorithm, which returns, for an input combination of semantic types, whether it is permitted, prohibited, or (in a few cases) requires more research. The numbers of semantic type pairs assigned to each rule-category are reported. Interesting examples for each rule-category are illustrated. Cases of semantic type assignments that contradict rules are listed, including recently introduced ones. Conclusion The adviseEditor system implements explicit and implicit knowledge available in the UMLS in a system that informs UMLS editors about the permissibility of a desired combination of semantic types. Using adviseEditor might help accelerate the work of the UMLS editors and prevent erroneous semantic type assignments. PMID:23041716
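    The decision logic described above amounts to a lookup from a combination of semantic types to a verdict. A toy sketch follows; the rule table and type names below are hypothetical stand-ins, not the actual UMLS rule-categories:

```python
# Hypothetical rule table: frozensets of semantic-type names mapped to a
# permissibility verdict, in the spirit of the adviseEditor rule-categories.
RULES = {
    frozenset({"Disease or Syndrome", "Finding"}): "prohibited",
    frozenset({"Pharmacologic Substance", "Organic Chemical"}): "permitted",
    frozenset({"Gene or Genome", "Enzyme"}): "needs review",
}

def advise(semantic_types):
    """Return the verdict for a combination of semantic types.

    Unknown combinations default to 'needs review', mirroring the
    paper's 'requires more research' category.
    """
    return RULES.get(frozenset(semantic_types), "needs review")
```

    Using a frozenset makes the lookup order-independent, matching the fact that a combination of semantic types is a set, not a sequence.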

  18. Electrical and mechanical fully coupled theory and experimental verification of Rosen-type piezoelectric transformers.

    PubMed

    Hsu, Yu-Hsiang; Lee, Chih-Kung; Hsiao, Wen-Hsin

    2005-10-01

    A piezoelectric transformer is a power transfer device that converts its input and output voltage as well as current by effectively using the electrical and mechanical coupling effects of piezoelectric materials. Equivalent-circuit models, which are traditionally used to analyze piezoelectric transformers, merge each mechanical resonance effect into a series of ordinary differential equations. Because they rely on ordinary differential equations, equivalent-circuit models cannot fully reflect the mechanical behavior of piezoelectric plates. Electromechanically fully coupled governing equations of Rosen-type piezoelectric transformers, which are partial differential equations in nature, can be derived to address the deficiencies of the equivalent-circuit models. It can be shown that the modal actuator concept can be adopted to optimize the electromechanical coupling effect of the driving section once the added spatial-domain design parameters, namely the three-dimensional spatial dependencies of the electromechanical properties, are taken into account. The maximum power transfer condition for a Rosen-type piezoelectric transformer is detailed. Experimental results, which lead us to a series of new design rules, are also presented to prove the validity and effectiveness of the theoretical predictions.

  19. Semantic Segmentation of Building Elements Using Point Cloud Hashing

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.

    2018-05-01

    For the interpretation of point clouds, the semantic definition of segments extracted from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, buildings can be classified quite well and simply by a human operator (e.g. an architect) into different building types and structural elements (dome, nave, transept, etc.), including particular building parts that are visually detected. The key part of the procedure is a novel method based on hashing, in which point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is also suitable for other buildings and objects characterized by a particular typology in their construction (e.g. industrial objects in standardized environments with strict component design allowing clear semantic modelling).
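    The hashing step can be sketched as rasterising a point cloud projection into a binary occupancy image and fingerprinting the bit pattern. This is a simplified illustration; the grid size and hash function are assumptions, not the authors' parameters:

```python
import hashlib

def hash_projection(points, grid=8):
    """Hash a 2-D projection of a 3-D point cloud.

    points: iterable of (x, y, z) tuples. The cloud is projected onto
    the x-y plane, rasterised into a grid x grid binary occupancy
    image, and the bit pattern is hashed so that identical projections
    share a fingerprint.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    bits = [[0] * grid for _ in range(grid)]
    for x, y, _ in points:           # z is dropped by the projection
        i = min(int((x - x0) / (x1 - x0 + 1e-9) * grid), grid - 1)
        j = min(int((y - y0) / (y1 - y0 + 1e-9) * grid), grid - 1)
        bits[j][i] = 1
    raw = bytes(b for row in bits for b in row)
    return hashlib.sha256(raw).hexdigest()
```

    Equal fingerprints then indicate candidate matches between a segment and a stored prototype, which is what makes hashing attractive for classification against a typology of known building elements.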

  20. 76 FR 50148 - Notice of Intent to Negotiate Proposed Rule on Energy Efficiency Standards for Distribution...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ... Intent to Negotiate Proposed Rule on Energy Efficiency Standards for Distribution Transformers AGENCY... transformers. The purpose of the subcommittee will be to discuss and, if possible, reach consensus on a proposed rule for the energy efficiency of distribution transformers, as authorized by the Energy Policy...

  1. Kinetics of austenite-pearlite transformation in eutectoid carbon steel

    NASA Astrophysics Data System (ADS)

    Hawbolt, E. B.; Chau, B.; Brimacombe, J. K.

    1983-09-01

    The kinetics of the austenite-to-pearlite transformation have been measured under isothermal and continuous-cooling conditions on a eutectoid carbon (1080) steel using a diametral dilatometric technique. The isothermal transformation kinetics have been analyzed in terms of the Avrami equation containing the two parameters n and b; the initiation of transformation was characterized by an empirically determined transformation-start time (tAv). The parameter n was found to be nearly constant, and neither n nor b was dependent on the cooling rate between TA1 and the test temperature. Continuous-cooling tests were performed with cooling rates ranging from 7.5 to 108 °C per second, and the initiation of transformation was determined. Comparison of this transformation-start time for different cooling rates with the measured slow cooling of a test coupon immersed in a salt bath indicates that, particularly at lower temperatures, the transformation in the traditional T-T-T test specimen may not be isothermal. The additivity rule was found to predict accurately the time taken, relative to tAv, to reach a given fraction of austenite transformed, even though there is some question that the isokinetic condition was met above 660 °C. However, the additivity rule does not hold for the pretransformation or incubation period, as originally proposed by Scheil, and seriously overestimates the incubation time. Application of the additivity rule to the prediction of transformation-finish time, based on transformation start at TA1, also leads to overestimates, but these are less serious. The isothermal parameters n(T), b(T), and tAv(T) have been used to predict continuous-cooling transformation kinetics which are in close agreement with measurements at four cooling rates ranging from 7.5 to 64 °C per second.
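    The Avrami description and the Scheil additivity idea used above can be sketched directly; the constants below are illustrative, not the measured 1080-steel parameters:

```python
import math

def avrami_fraction(t, b, n, t_start=0.0):
    """Fraction transformed X(t) = 1 - exp(-b * (t - t_start)**n).

    t_start plays the role of the empirically determined
    transformation-start time tAv (illustrative values only).
    """
    if t <= t_start:
        return 0.0
    return 1.0 - math.exp(-b * (t - t_start) ** n)

def scheil_additivity(thermal_path, t_start_of):
    """Scheil additivity sum for incubation: transformation is taken
    to start once sum(dt / t_start(T)) along the cooling path, given
    as (temperature, dt) steps, reaches 1."""
    total = 0.0
    for temperature, dt in thermal_path:
        total += dt / t_start_of(temperature)
        if total >= 1.0:
            return True
    return False
```

    The paper's finding that the additivity rule overestimates incubation times corresponds to the Scheil sum reaching 1 later than transformation is actually observed to begin.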

  2. Bayesian probability estimates are not necessary to make choices satisfying Bayes' rule in elementary situations.

    PubMed

    Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta

    2015-01-01

    This paper has two aims. First, we investigate how often people make choices conforming to Bayes' rule when natural sampling is applied. Second, we show that using Bayes' rule is not necessary to make choices satisfying Bayes' rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes' rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes' rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes' rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes' rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes' rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes' rule. However, people tend to replace Bayes' rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient.

  3. Bayesian probability estimates are not necessary to make choices satisfying Bayes’ rule in elementary situations

    PubMed Central

    Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta

    2015-01-01

    This paper has two aims. First, we investigate how often people make choices conforming to Bayes’ rule when natural sampling is applied. Second, we show that using Bayes’ rule is not necessary to make choices satisfying Bayes’ rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes’ rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes’ rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes’ rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes’ rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes’ rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes’ rule. However, people tend to replace Bayes’ rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient. PMID:26347676
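    The contrast between Bayes' rule and the reported non-inverse rule can be made concrete: with natural frequencies, comparing the joint counts n(H, D) and n(not-H, D) picks the same option as comparing posteriors, without ever inverting conditions. A sketch with made-up counts:

```python
def bayes_posterior(prior_h, like_d_given_h, like_d_given_not_h):
    """P(H|D) computed via Bayes' rule."""
    num = prior_h * like_d_given_h
    den = num + (1.0 - prior_h) * like_d_given_not_h
    return num / den

def non_inverse_choice(count_h_and_d, count_noth_and_d):
    """The simpler 'non-inverse' strategy: with natural frequencies,
    just compare the joint counts n(H, D) vs n(not-H, D)."""
    return "H" if count_h_and_d > count_noth_and_d else "not-H"
```

    With 100 cases, 30 of them H (20 showing D) and 70 not-H (10 showing D), the posterior is 20/(20+10) = 2/3, and the non-inverse comparison of 20 vs 10 yields the same choice of H.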

  4. Data characteristic analysis of air conditioning load based on fast Fourier transform

    NASA Astrophysics Data System (ADS)

    Li, Min; Zhang, Yanchi; Xie, Da

    2018-04-01

    With economic development and the improvement of people's living standards, air conditioning equipment has become increasingly common, and the influence of air conditioning load on the power grid is becoming more serious. In this context it is necessary to study the characteristics of air conditioning load. This paper analyzes air conditioning power consumption data from an office building. The data are processed with a fast Fourier transform using data analysis software, and a series of plots is drawn from the transformed data; the characteristics of each plot are analyzed separately. Hidden rules in these data are thus mined from the frequency-domain perspective, rules that are hard to find in the time domain.
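    The frequency-domain analysis can be sketched with a direct discrete Fourier transform; the synthetic daily-cycle data below are illustrative, not the office-building measurements:

```python
import cmath
import math

def dominant_period(samples, dt):
    """Return the dominant non-DC period in a uniformly sampled signal,
    using a direct DFT (a numpy-free sketch, fine for short records).

    samples: list of load readings; dt: sampling interval (hours).
    """
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):          # skip k = 0 (the DC mean)
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return n * dt / best_k                  # period of strongest component
```

    For real load data one would use an FFT library for speed, but the principle is the same: periodicities such as a daily cycle appear as peaks in the spectrum that are hard to spot in the raw time series.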

  5. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on programming of logic controllers. It is important that a programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
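    As a rough illustration of the rule-to-code idea (not the authors' actual toolchain), a rule-based logical model can be mechanically translated into a C-style scan loop; the rule format and signal names below are hypothetical:

```python
def rules_to_c(rules):
    """Translate rule-based logic into a C fragment for a
    microcontroller main loop.

    rules: list of (input_conditions, output_assignments) pairs,
    each a set of C expressions/statements (hypothetical format).
    """
    lines = ["while (1) {"]
    for conds, actions in rules:
        guard = " && ".join(sorted(conds))       # deterministic order
        body = " ".join(f"{a};" for a in sorted(actions))
        lines.append(f"    if ({guard}) {{ {body} }}")
    lines.append("}")
    return "\n".join(lines)
```

    Because the generated code is produced by fixed formal rules from the same logical model that is model-checked, the consistency argument in the abstract carries over: whatever was verified of the model holds of the emitted program, up to the correctness of the translation rules themselves.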

  6. H2RM: A Hybrid Rough Set Reasoning Model for Prediction and Management of Diabetes Mellitus.

    PubMed

    Ali, Rahman; Hussain, Jamil; Siddiqi, Muhammad Hameed; Hussain, Maqbool; Lee, Sungyoung

    2015-07-03

    Diabetes is a chronic disease characterized by a high blood glucose level that results either from a deficiency of insulin produced by the body or from the body's resistance to the effects of insulin. Accurate and precise reasoning and prediction models greatly help physicians to improve the diagnosis, prognosis and treatment procedures of different diseases. Though numerous models have been proposed to address the diagnosis and management of diabetes, they have the following drawbacks: (1) they are restricted to one type of diabetes; (2) they lack the understandability and explanatory power of the techniques and decisions; (3) they are limited either to prediction or to management over structured contents; and (4) they lack competence for the dimensionality and vagueness of patients' data. To overcome these issues, this paper proposes a novel hybrid rough set reasoning model (H2RM) that resolves problems of inaccurate prediction and management of type-1 diabetes mellitus (T1DM) and type-2 diabetes mellitus (T2DM). For verification of the proposed model, experimental data from fifty patients, acquired from a local hospital in semi-structured format, are used. First, the data are transformed into structured format and then used for mining prediction rules. Rough set theory (RST) based techniques and algorithms are used to mine the prediction rules. During the online execution phase of the model, these rules are used to predict T1DM and T2DM for new patients. Furthermore, the proposed model assists physicians in managing diabetes using knowledge extracted from online diabetes guidelines. Correlation-based trend analysis techniques are used to manage diabetic observations. Experimental results demonstrate that the proposed model outperforms existing methods with 95.9% average and balanced accuracies.

  7. H2RM: A Hybrid Rough Set Reasoning Model for Prediction and Management of Diabetes Mellitus

    PubMed Central

    Ali, Rahman; Hussain, Jamil; Siddiqi, Muhammad Hameed; Hussain, Maqbool; Lee, Sungyoung

    2015-01-01

    Diabetes is a chronic disease characterized by a high blood glucose level that results either from a deficiency of insulin produced by the body or from the body’s resistance to the effects of insulin. Accurate and precise reasoning and prediction models greatly help physicians to improve the diagnosis, prognosis and treatment procedures of different diseases. Though numerous models have been proposed to address the diagnosis and management of diabetes, they have the following drawbacks: (1) they are restricted to one type of diabetes; (2) they lack the understandability and explanatory power of the techniques and decisions; (3) they are limited either to prediction or to management over structured contents; and (4) they lack competence for the dimensionality and vagueness of patients’ data. To overcome these issues, this paper proposes a novel hybrid rough set reasoning model (H2RM) that resolves problems of inaccurate prediction and management of type-1 diabetes mellitus (T1DM) and type-2 diabetes mellitus (T2DM). For verification of the proposed model, experimental data from fifty patients, acquired from a local hospital in semi-structured format, are used. First, the data are transformed into structured format and then used for mining prediction rules. Rough set theory (RST) based techniques and algorithms are used to mine the prediction rules. During the online execution phase of the model, these rules are used to predict T1DM and T2DM for new patients. Furthermore, the proposed model assists physicians in managing diabetes using knowledge extracted from online diabetes guidelines. Correlation-based trend analysis techniques are used to manage diabetic observations. Experimental results demonstrate that the proposed model outperforms existing methods with 95.9% average and balanced accuracies. PMID:26151207
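    The rough-set machinery behind such rule mining rests on lower and upper approximations of a target set under an indiscernibility relation; a minimal sketch with toy data (not the patient records) is:

```python
def rough_approximations(universe, equiv_class_of, target):
    """Lower and upper approximations of a target set.

    equiv_class_of(x) returns the indiscernibility class of x, i.e.
    all objects indistinguishable from x on the chosen attributes.
    """
    lower, upper = set(), set()
    for x in universe:
        cls = equiv_class_of(x)
        if cls <= target:      # class entirely inside the target
            lower |= cls
        if cls & target:       # class overlaps the target
            upper |= cls
    return lower, upper
```

    Objects in the lower approximation certainly belong to the target concept, those outside the upper approximation certainly do not, and the boundary between the two captures the vagueness that the abstract says RST handles well.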

  8. Improved FFT-based numerical inversion of Laplace transforms via fast Hartley transform algorithm

    NASA Technical Reports Server (NTRS)

    Hwang, Chyi; Lu, Ming-Jeng; Shieh, Leang S.

    1991-01-01

    The disadvantages of numerical inversion of the Laplace transform via the conventional fast Fourier transform (FFT) are identified, and an improved method is presented to remedy them. The improved method is based on introducing a new integration step length Delta(omega) = pi/(mT) for trapezoidal-rule approximation of the Bromwich integral, in which a new parameter, m, is introduced for controlling the accuracy of the numerical integration. Naturally, this method leads to multiple sets of complex FFT computations. A new inversion formula is derived such that N equally spaced samples of the inverse Laplace transform function can be obtained by (m/2) + 1 sets of N-point complex FFT computations or by m sets of real fast Hartley transform (FHT) computations.
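    The flavor of such Fourier-series inversion can be sketched with the classical Dubner-Abate formula, a simpler relative of the method above (frequency step pi/T; the paper's refinement pi/(mT) amounts to refining this step). Parameters below are illustrative and untuned:

```python
import cmath
import math

def invert_laplace(F, t, sigma=0.5, T=10.0, n_terms=4000):
    """Fourier-series (Dubner-Abate style) numerical inversion of a
    Laplace transform F(s), usable for 0 < t < 2T.

    f(t) ~ (exp(sigma*t)/T) * [F(sigma)/2
           + sum_k Re(F(sigma + i*k*pi/T) * exp(i*k*pi*t/T))]
    """
    total = 0.5 * F(complex(sigma, 0.0)).real
    for k in range(1, n_terms + 1):
        w = k * math.pi / T
        total += (F(complex(sigma, w)) * cmath.exp(1j * w * t)).real
    return math.exp(sigma * t) / T * total
```

    Each sample of f(t) is a trigonometric sum over samples of F along a vertical line in the complex plane, which is exactly the structure that lets the FFT (or the real-valued FHT, as in the paper) evaluate all N time samples at once.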

  9. Phases, phase equilibria, and phase rules in low-dimensional systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu

    2015-07-28

    We present a unified approach to the thermodynamic description of one-, two-, and three-dimensional phases and phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied for the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
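    For the familiar bulk (three-dimensional) case, the Gibbs phase rule that the paper generalizes reads F = C - P + 2, so at most C + 2 phases can coexist when F = 0; the interface and line phase rules derived in the paper modify this counting. A trivial sketch of the classical bulk case only:

```python
def gibbs_phase_rule(components, phases):
    """Degrees of freedom for a bulk system: F = C - P + 2."""
    return components - phases + 2

def max_coexisting_phases(components):
    """Setting F = 0 gives the maximum number of coexisting bulk phases."""
    return components + 2
```

    For pure water (C = 1), three phases coexist only at the zero-degree-of-freedom triple point.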

  10. A Computer Program for Testing Grammars On-Line.

    ERIC Educational Resources Information Center

    Gross, Louis N.

    This paper describes a computer system which is intended to aid the linguist in building a transformational grammar. The program operates as a rule tester, performing three services for the user through sets of functions which allow the user to--specify, change, and print base trees (to which transformations would apply); define transformations…

  11. Inference for Transition Network Grammars,

    DTIC Science & Technology

    1976-01-01

    ... if the arc is followed. A language L(G) is said to be structurally complete if each rewriting rule ... The power of an augmented transition network (ATN) is ... Clearly, a context-sensitive grammar can be represented as a context-free grammar (base) plus a set of transformational rules ... are the foundations of grammars of different complexities. The CSL is obtained by appl

  12. Extended canonical field theory of matter and space-time

    NASA Astrophysics Data System (ADS)

    Struckmeier, J.; Vasak, D.; Stoecker, H.

    2015-11-01

    Any physical theory that follows from an action principle should be invariant in its form under mappings of the reference frame in order to comply with the general principle of relativity. The required form-invariance of the action principle implies that the mapping must constitute a particular extended canonical transformation. In the realm of the covariant Hamiltonian formulation of field theory, the term "extended" implies that not only the fields but also the space-time geometry is subject to transformation. A canonical transformation maintains the general form of the action principle by simultaneously defining the appropriate transformation rules for the fields, the conjugate momentum fields, and the Hamiltonian. Provided that the given system of fields exhibits a particular global symmetry, the associated extended canonical transformation determines an amended Hamiltonian that is form-invariant under the corresponding local symmetry. This will be worked out for a Hamiltonian system of scalar and vector fields that is presupposed to be form-invariant under space-time transformations x^μ ↦ X^μ with ∂X^μ/∂x^ν = const., hence under global space-time transformations such as the Poincaré transformation. The corresponding amended system that is form-invariant under local space-time transformations ∂X^μ/∂x^ν ≠ const. then describes the coupling of the fields to the space-time geometry and thus yields the dynamics of space-time that is associated with the given physical system. Non-zero-spin matter thereby determines the space-time curvature via a well-defined source term in a covariant Poisson-type equation for the Riemann tensor.

  13. Characterization and classification of South American land cover types using satellite data

    NASA Technical Reports Server (NTRS)

    Townshend, J. R. G.; Justice, C. O.; Kalb, V.

    1987-01-01

    Various methods are compared for carrying out land cover classifications of South America using multitemporal Advanced Very High Resolution Radiometer data. Fifty-two images of the normalized difference vegetation index (NDVI) from a 1-year period are used to generate multitemporal data sets. Three main approaches to land cover classification are considered, namely the use of principal-components-transformed images, the use of a characteristic-curves procedure based on NDVI values plotted against time, and finally the application of the maximum likelihood rule to multitemporal data sets. Comparison of results from training sites indicates that the last approach yields the most accurate results. Despite the reliance on training site figures for performance assessment, the results are nevertheless extremely encouraging, with accuracies for several cover types exceeding 90 percent.

  14. Presenting Germany's drug pricing rule as a cost-per-QALY rule.

    PubMed

    Gandjour, Afschin

    2012-06-01

    In Germany, the Institute for Quality and Efficiency in Health Care (IQWiG) makes recommendations for ceiling prices of drugs based on an evaluation of the relationship between costs and effectiveness. To set ceiling prices, IQWiG uses the following decision rule: the incremental cost-effectiveness ratio of a new drug compared with the next effective intervention should not be higher than that of the next effective intervention compared to its comparator. The purpose of this paper is to show that IQWiG's decision rule can be presented as a cost-per-QALY rule by using equity-weighted QALYs. This transformation shows where both rules share commonalities. Furthermore, it makes the underlying ethical implications of IQWiG's decision rule transparent and open to debate.
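    The decision rule described above reduces to a one-line computation: the ceiling price is the cost at which the new drug's ICER equals the reference ICER of the next effective intervention versus its comparator. A minimal sketch with invented numbers; the equity weighting of QALYs discussed in the paper is not modeled here:

```python
def icer(cost_a, eff_a, cost_b, eff_b):
    # Incremental cost-effectiveness ratio of intervention A versus B.
    return (cost_a - cost_b) / (eff_a - eff_b)

def ceiling_cost(eff_new, cost_next, eff_next, cost_comp, eff_comp):
    """Highest total cost of the new drug such that its ICER versus the
    next effective intervention does not exceed that intervention's own
    ICER versus its comparator (the rule described in the abstract)."""
    reference_icer = icer(cost_next, eff_next, cost_comp, eff_comp)
    return cost_next + reference_icer * (eff_new - eff_next)

# Hypothetical numbers: effectiveness in QALYs, costs in euros.
print(ceiling_cost(eff_new=3.0, cost_next=20000, eff_next=2.5,
                   cost_comp=10000, eff_comp=2.0))  # -> 30000.0
```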

  15. TOMML: A Rule Language for Structured Data

    NASA Astrophysics Data System (ADS)

    Cirstea, Horatiu; Moreau, Pierre-Etienne; Reilles, Antoine

    We present the TOM language that extends Java with the purpose of providing high-level constructs inspired by the rewriting community. TOM thus bridges the gap between a general purpose language and high-level specifications based on rewriting. This approach was motivated by the promotion of rule-based techniques and their integration in large scale applications. Powerful matching capabilities along with a rich strategy language are among TOM's strong features that make it easy to use and competitive with respect to other rule-based languages. TOM is thus a natural choice for querying and transforming structured data and in particular XML documents [1]. We present here its main XML-oriented features and illustrate its use on several examples.

  16. QCD Sum Rules and Models for Generalized Parton Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anatoly Radyushkin

    2004-10-01

    I use QCD sum rule ideas to construct models for generalized parton distributions (GPDs). To this end, the perturbative parts of QCD sum rules for the pion and nucleon electromagnetic form factors are interpreted in terms of GPDs, and two models are discussed. One of them takes the double Borel transform at an adjusted value of the Borel parameter as a model for nonforward parton densities, and another is based on the local duality relation. Possible ways of improving these Ansätze are briefly discussed.

  17. SPARQL Query Re-writing Using Partonomy Based Transformation Rules

    NASA Astrophysics Data System (ADS)

    Jain, Prateek; Yeh, Peter Z.; Verma, Kunal; Henson, Cory A.; Sheth, Amit P.

    Often the information present in a spatial knowledge base is represented at a different level of granularity and abstraction than the query constraints. When querying ontologies containing spatial information, the precise relationships between spatial entities have to be specified in the basic graph pattern of a SPARQL query, which can result in long and complex queries. We present a novel approach to help users intuitively write SPARQL queries to query spatial data, rather than relying on knowledge of the ontology structure. Our framework re-writes queries using transformation rules that exploit part-whole relations between geographical entities to address the mismatches between query constraints and the knowledge base. Our experiments were performed on completely third-party datasets and queries. Evaluations were performed on the Geonames dataset using questions from the National Geographic Bee serialized into SPARQL, and on the British Administrative Geography Ontology using questions from a popular trivia website. These experiments demonstrate high precision in retrieval of results and ease in writing queries.
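    The flavor of such partonomy-based re-writing can be sketched at the string level. This is only an illustrative assumption, not the paper's actual framework (which operates on SPARQL basic graph patterns): a hypothetical rule replaces a direct containment predicate with a transitive part-whole property path, so entities nested several administrative levels deep still match.

```python
def rewrite_containment(query, transform_rules):
    """Apply simple string-level transformation rules, each replacing a
    direct spatial predicate with a transitive part-whole property path."""
    for pattern, replacement in transform_rules:
        query = query.replace(pattern, replacement)
    return query

# Hypothetical rule: "located in" becomes a transitive parentFeature path,
# so a city inside a province inside a country still matches the country.
rules = [("geo:locatedIn", "geo:parentFeature+")]

q = "SELECT ?city WHERE { ?city rdf:type geo:City . ?city geo:locatedIn :Spain }"
print(rewrite_containment(q, rules))
```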

  18. Rule acquisition in formal decision contexts based on formal, object-oriented and property-oriented concept lattices.

    PubMed

    Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.
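    The difference between the two rule forms can be shown directly. A minimal sketch with invented attribute names; conditions are plain Booleans here, whereas in formal decision contexts they are attributes of objects organized in concept lattices:

```python
def and_rule(conditions, facts):
    # A ∧-rule fires only when every condition attribute holds.
    return all(facts.get(c, False) for c in conditions)

def or_rule(conditions, facts):
    # A ∨-rule fires when at least one condition attribute holds.
    return any(facts.get(c, False) for c in conditions)

facts = {"a1": True, "a2": False, "a3": False}
print(and_rule(["a1", "a2"], facts))  # False: a2 fails
print(or_rule(["a1", "a2"], facts))   # True: a1 suffices
```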

  19. Rule Acquisition in Formal Decision Contexts Based on Formal, Object-Oriented and Property-Oriented Concept Lattices

    PubMed Central

    Ren, Yue; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: “if conditions 1,2,…, and m hold, then decisions hold.” In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency. PMID:25165744

  20. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    ERIC Educational Resources Information Center

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  1. Constructing a Geology Ontology Using a Relational Database

    NASA Astrophysics Data System (ADS)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multiple scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods such as the Geo-rule based method, the ontology life cycle method and the module design method have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of abstracted semantic information from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to the human-computer interaction method, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge the development and application. One is the transformation of multiple inheritances and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database to an OWL-based geological ontology, which is based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. 
In the geological ontology, inheritance and union operations between superclasses and subclasses were used to represent the nested relationships in a geochronology and the multiple inheritance relationships. Based on a Quaternary database of the downtown of Foshan city, Guangdong Province, in Southern China, a geological ontology was constructed using the proposed method. To measure how well semantics were maintained in the conversion process and its results, an inverse mapping from the ontology to a relational database was tested based on a proposed conversion rule. The comparison of schema and entities and the reduction of tables between the inverse database and the original database illustrated that the proposed method retains the semantic information well during the conversion process. An application for abstracting sandstone information showed that semantic relationships among concepts in the geological database were successfully reorganized in the constructed ontology. Key words: geological ontology; geological spatial database; multiple inheritance; OWL. Acknowledgement: This research is jointly funded by the Specialized Research Fund for the Doctoral Program of Higher Education of China (RFDP) (20100171120001), NSFC (41102207) and the Fundamental Research Funds for the Central Universities (12lgpy19).
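    The core of such a database-to-ontology rule set can be sketched in a few lines: a table becomes a class, each non-key column a datatype property, and table nesting an rdfs:subClassOf axiom. The table names and the Turtle-like output strings below are illustrative assumptions, not the paper's actual rule set:

```python
def table_to_owl(table, parent=None):
    """Map one relational table to a minimal set of OWL axioms
    (Turtle-like strings)."""
    name, columns = table
    axioms = [f":{name} rdf:type owl:Class ."]
    if parent:
        # A nested table is converted to a subclass of its parent table.
        axioms.append(f":{name} rdfs:subClassOf :{parent} .")
    for col in columns:
        axioms.append(f":{name}_{col} rdf:type owl:DatatypeProperty ; "
                      f"rdfs:domain :{name} .")
    return axioms

# Hypothetical geological tables: Stratum nested under GeologicUnit.
for axiom in table_to_owl(("Stratum", ["lithology", "age"]),
                          parent="GeologicUnit"):
    print(axiom)
```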

  2. Harmonic wavelet packet transform for on-line system health diagnosis

    NASA Astrophysics Data System (ADS)

    Yan, Ruqiang; Gao, Robert X.

    2004-07-01

    This paper presents a new approach to on-line health diagnosis of mechanical systems, based on the wavelet packet transform. Specifically, signals acquired from vibration sensors are decomposed into sub-bands by means of the discrete harmonic wavelet packet transform (DHWPT). Based on the Fisher linear discriminant criterion, features in the selected sub-bands are then used as inputs to three classifiers (one nearest-neighbor-rule-based and two neural-network-based) for system health condition assessment. Experimental results have confirmed that, compared to the conventional approach where statistical parameters from raw signals are used, the presented approach enables a higher signal-to-noise ratio and a more effective, intelligent use of the sensory information, thus leading to more accurate system health diagnosis.
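    The Fisher linear discriminant criterion used for sub-band selection can be sketched for a single scalar feature: between-class separation divided by within-class scatter, with larger values marking more discriminative sub-bands. The sub-band energy values below are invented for illustration:

```python
def fisher_criterion(class_a, class_b):
    """Fisher discriminant ratio for one scalar feature:
    (mean separation)^2 / (sum of within-class variances)."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs, m):
        return sum((x - m) ** 2 for x in xs) / len(xs)
    ma, mb = mean(class_a), mean(class_b)
    return (ma - mb) ** 2 / (var(class_a, ma) + var(class_b, mb))

# Hypothetical sub-band energies for healthy vs. faulty machine signals.
healthy = [0.10, 0.12, 0.11, 0.09]
faulty  = [0.30, 0.33, 0.28, 0.31]
print(fisher_criterion(healthy, faulty))  # large ratio -> keep this sub-band
```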

  3. ReactPRED: a tool to predict and analyze biochemical reactions.

    PubMed

    Sivakumar, Tadi Venkata; Giri, Varun; Park, Jin Hwan; Kim, Tae Yong; Bhaduri, Anirban

    2016-11-15

    Biochemical pathway engineering is often used to synthesize or degrade target chemicals. In silico screening of the biochemical transformation space allows predicting the feasible reactions constituting these pathways. Current enabling tools are customized to predict reactions based on pre-defined biochemical transformations or reaction rule sets. Reaction rule sets are usually curated manually and tailored to specific applications; they are not exhaustive. In addition, current systems are incapable of regulating and refining data with an aim to tune specificity and sensitivity. A robust and flexible tool that allows automated reaction rule set creation along with regulated pathway prediction and analyses is needed. ReactPRED aims to address this. ReactPRED is an open source, flexible and customizable tool enabling users to predict biochemical reactions and pathways. The tool allows automated reaction rule creation from a user-defined reaction set. Additionally, reaction rule degree and rule tolerance features allow refinement of predicted data. It is available as a flexible graphical user interface and a console application. ReactPRED is available at: https://sourceforge.net/projects/reactpred/ Contact: anirban.b@samsung.com or ty76.kim@samsung.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. THE APPLICATION AND IMPLEMENTATION OF DEACON TYPE SYSTEMS.

    DTIC Science & Technology

    management information system deriving from a project concerning development of techniques for computing with a computer in essentially unconstrained English. Deacon-type systems respond to instructions and queries concerning the subject matter of their data by appropriately manipulating and organizing the data internally. The clues that guide the organizing activity are the syntactic rules of the language and their semantic transformations. Three examples of Deacon systems are given. The ’Deacon Breadboard Summary’ of F. B. Thompson (RM 64TMP-9)

  5. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then type rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge-based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  6. Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.

    PubMed

    Juang, C F; Lin, J Y; Lin, C T

    2000-01-01

    An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules) are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. For TSK-type fuzzy rules in particular, to which the proposed learning algorithm is applied, only the significant input variables are selected to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. In these control problems, and in comparisons with some traditional GA-based fuzzy systems, SEFC has been verified to be efficient and superior.

  7. eFSM--a novel online neural-fuzzy semantic memory model.

    PubMed

    Tung, Whye Loon; Quek, Chai

    2010-01-01

    Fuzzy rule-based systems (FRBSs) have been successfully applied to many areas. However, traditional fuzzy systems are often manually crafted, and their rule bases that represent the acquired knowledge are static and cannot be trained to improve the modeling performance. This subsequently leads to intensive research on the autonomous construction and tuning of a fuzzy system directly from the observed training data to address the knowledge acquisition bottleneck, resulting in well-established hybrids such as neural-fuzzy systems (NFSs) and genetic fuzzy systems (GFSs). However, the complex and dynamic nature of real-world problems demands that fuzzy rule-based systems and models be able to adapt their parameters and ultimately evolve their rule bases to address the nonstationary (time-varying) characteristics of their operating environments. Recently, considerable research efforts have been directed to the study of evolving Takagi-Sugeno (T-S)-type NFSs based on the concept of incremental learning. In contrast, there are very few incremental learning Mamdani-type NFSs reported in the literature. Hence, this paper presents the evolving neural-fuzzy semantic memory (eFSM) model, a neural-fuzzy Mamdani architecture with a data-driven progressively adaptive structure (i.e., rule base) based on incremental learning. Issues related to the incremental learning of the eFSM rule base are carefully investigated, and a novel parameter learning approach is proposed for the tuning of the fuzzy set parameters in eFSM. The proposed eFSM model elicits highly interpretable semantic knowledge in the form of Mamdani-type if-then fuzzy rules from low-level numeric training data. These Mamdani fuzzy rules define the computing structure of eFSM and are incrementally learned with the arrival of each training data sample. New rules are constructed from the emergence of novel training data, and obsolete fuzzy rules that no longer describe the recently observed data trends are pruned. 
This enables eFSM to maintain a current and compact set of Mamdani-type if-then fuzzy rules that collectively generalizes and describes the salient associative mappings between the inputs and outputs of the underlying process being modeled. The learning and modeling performances of the proposed eFSM are evaluated using several benchmark applications and the results are encouraging.

  8. Algorithm Diversity for Resilent Systems

    DTIC Science & Technology

    2016-06-27

    …data structures. Subject terms: computer security, software diversity, program transformation. …systematic method for transforming Datalog rules with general universal and existential quantification into efficient algorithms with precise complexity… worst case in the size of the ground rules. There are numerous choices during the transformation that lead to diverse algorithms and different…

  9. Study on stress-strain response of multi-phase TRIP steel under cyclic loading

    NASA Astrophysics Data System (ADS)

    Dan, W. J.; Hu, Z. G.; Zhang, W. G.; Li, S. H.; Lin, Z. Q.

    2013-12-01

    The stress-strain response of multi-phase TRIP590 sheet steel is studied under cyclic loading at room temperature, based on a cyclic phase transformation model and a multi-phase mixed kinematic hardening model. The cyclic martensite transformation model is proposed based on shear-band intersection, where the repeat number, strain amplitude and cyclic frequency are used to control the phase transformation process. The multi-phase mixed kinematic hardening model is developed based on a non-linear kinematic hardening rule for each phase. The parameters of the transformation model are identified from the relationship between the austenite volume fraction and the repeat number. The parameters of the kinematic hardening model are confirmed against experimental hysteresis loops at different strain amplitudes. The responses of the hysteresis loop and stress amplitude are evaluated against tension-compression data.
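    For reference, a widely used non-linear kinematic hardening rule of the kind mentioned here is the Armstrong-Frederick form, shown only as a generic example; the per-phase rule actually calibrated in the paper may differ:

```latex
\dot{\boldsymbol{\alpha}}
  = \tfrac{2}{3}\, C\, \dot{\boldsymbol{\varepsilon}}^{p}
  - \gamma\, \boldsymbol{\alpha}\, \dot{p}
```

where α is the back stress, ε^p the plastic strain, ṗ the accumulated plastic strain rate, and C, γ material parameters; the recovery term γαṗ is what produces the saturating hysteresis loops fitted in the abstract.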

  10. How to select combination operators for fuzzy expert systems using CRI

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Tian, Y.

    1992-01-01

    A method to select combination operators for fuzzy expert systems using the Compositional Rule of Inference (CRI) is proposed. First, fuzzy inference processes based on CRI are classified into three categories in terms of their inference results: the Expansion Type Inference, the Reduction Type Inference, and Other Type Inferences. Further, implication operators under Sup-T composition are classified as the Expansion Type Operator, the Reduction Type Operator, and the Other Type Operators. Finally, the combination of rules or their consequences is investigated for inference processes based on CRI.
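    The sup-T composition at the heart of CRI can be sketched on discrete universes: B'(y) = sup_x T(A'(x), R(x, y)). A minimal sketch using the minimum t-norm (sup-min composition); the fuzzy set and relation values are invented for illustration:

```python
def cri_inference(a_prime, relation, t_norm=min):
    """Compositional Rule of Inference on discrete universes:
    B'(y) = sup_x T(A'(x), R(x, y)), here with T = min (sup-min)."""
    ys = range(len(relation[0]))
    return [max(t_norm(a_prime[x], relation[x][y])
                for x in range(len(a_prime)))
            for y in ys]

# Hypothetical fuzzy fact A' over 3 x-points and a 3x2 fuzzy relation R.
a_prime = [1.0, 0.5, 0.0]
relation = [[0.2, 0.8],
            [0.6, 0.4],
            [0.9, 0.1]]
print(cri_inference(a_prime, relation))  # -> [0.5, 0.8]
```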

  11. A Semantic Transformation Methodology for the Secondary Use of Observational Healthcare Data in Postmarketing Safety Studies.

    PubMed

    Pacaci, Anil; Gonul, Suat; Sinaci, A Anil; Yuksel, Mustafa; Laleci Erturkmen, Gokce B

    2018-01-01

    Background: Utilization of the available observational healthcare datasets is key to complement and strengthen the postmarketing safety studies. Use of common data models (CDM) is the predominant approach in order to enable large scale systematic analyses on disparate data models and vocabularies. Current CDM transformation practices depend on proprietarily developed Extract-Transform-Load (ETL) procedures, which require knowledge both on the semantics and technical characteristics of the source datasets and target CDM. Purpose: In this study, our aim is to develop a modular but coordinated transformation approach in order to separate semantic and technical steps of transformation processes, which do not have a strict separation in traditional ETL approaches. Such an approach would discretize the operations to extract data from source electronic health record systems, alignment of the source, and target models on the semantic level and the operations to populate target common data repositories. Approach: In order to separate the activities that are required to transform heterogeneous data sources to a target CDM, we introduce a semantic transformation approach composed of three steps: (1) transformation of source datasets to Resource Description Framework (RDF) format, (2) application of semantic conversion rules to get the data as instances of ontological model of the target CDM, and (3) population of repositories, which comply with the specifications of the CDM, by processing the RDF instances from step 2. The proposed approach has been implemented on real healthcare settings where Observational Medical Outcomes Partnership (OMOP) CDM has been chosen as the common data model and a comprehensive comparative analysis between the native and transformed data has been conducted. Results: Health records of ~1 million patients have been successfully transformed to an OMOP CDM based database from the source database. 
Descriptive statistics obtained from the source and target databases present analogous and consistent results. Discussion and Conclusion: Our method goes beyond the traditional ETL approaches by being more declarative and rigorous. Declarative because the use of RDF based mapping rules makes each mapping more transparent and understandable to humans while retaining logic-based computability. Rigorous because the mappings would be based on computer readable semantics which are amenable to validation through logic-based inference methods.
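    The separation the approach argues for, first flattening source records into RDF-style triples, then applying declarative predicate-rewriting rules toward the target model, can be caricatured in a few lines. A minimal sketch; the prefixes, field names, and the OMOP-like concept name are illustrative assumptions, not the actual OMOP CDM vocabulary:

```python
def record_to_rdf(record, subject_prefix="ex:patient"):
    # Step 1: flatten a source health record into (s, p, o) triples.
    s = f"{subject_prefix}{record['id']}"
    return [(s, f"src:{k}", v) for k, v in record.items() if k != "id"]

def apply_conversion_rules(triples, predicate_map):
    # Step 2: semantic conversion - rewrite source predicates onto the
    # ontological model of the target common data model.
    return [(s, predicate_map.get(p, p), o) for s, p, o in triples]

# Hypothetical mapping from a source EHR field to an OMOP-like concept.
rules = {"src:diagnosis": "omop:condition_concept"}
triples = record_to_rdf({"id": 42, "diagnosis": "I10"})
print(apply_conversion_rules(triples, rules))
```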

  12. Effects of Stoichiometry on Transformation Temperatures and Actuator-Type Performance of NiTiPd and NiTiPdX High-Temperature Shape Memory Alloys

    NASA Technical Reports Server (NTRS)

    Bigelow, Glen S.; Gaydosh, Darrell; Garg, Anita; Padula, Santo A., II; Noebe, Ronald D.

    2007-01-01

    High-temperature shape memory NiTiPd and NiTiPdX (X=Au, Pt, Hf) alloys were produced with titanium equivalent (Ti+Hf) compositions of 50.5, 50.0, 49.5, and 49.0 at.%. Thermo-mechanical testing in compression was used to evaluate the transformation temperatures, transformation strain, work output, and permanent deformation behavior of each alloy to study the effects of quaternary alloying and stoichiometry on high-temperature shape memory alloy behavior. Microstructural evaluation showed the presence of second phases for all alloy compositions. No load transformation temperatures in the stoichiometric alloys were relatively unchanged by Au and Pt substitutions, while the substitution of Hf for Ti causes a drop in transformation temperatures. The NiTiPd, NiTiPdAu and NiTiPdHf alloys exhibited transformation temperatures that were highest in the Ti-rich compositions, slightly lower at stoichiometry, and significantly reduced when the Ti equivalent composition was less than 50 at.%. For the NiTiPdPt alloy, transformation temperatures were highest for the Ti-rich compositions, lowest at stoichiometry, and slightly higher in the Ni-rich composition. When thermally cycled under constant stresses of up to 300 MPa, all of the alloys had transformation strains, and therefore work outputs, which increased with increasing stress. In each series of alloys, the transformation strain and thus work output was highest for stoichiometric or Ti-rich compositions while permanent strain associated with the constant-load thermal cycling was lowest for alloys with Ni-equivalent-rich compositions. Based on these results, basic rules for optimizing the composition of NiTiPd alloys for actuator performance will be discussed.

  13. Neural activity in superior parietal cortex during rule-based visual-motor transformations.

    PubMed

    Hawkins, Kara M; Sayegh, Patricia; Yan, Xiaogang; Crawford, J Douglas; Sergio, Lauren E

    2013-03-01

    Cognition allows for the use of different rule-based sensorimotor strategies, but the neural underpinnings of such strategies are poorly understood. The purpose of this study was to compare neural activity in the superior parietal lobule during a standard (direct interaction) reaching task, with two nonstandard (gaze and reach spatially incongruent) reaching tasks requiring the integration of rule-based information. Specifically, these nonstandard tasks involved dissociating the planes of reach and vision or rotating visual feedback by 180°. Single unit activity, gaze, and reach trajectories were recorded from two female Macaca mulattas. In all three conditions, we observed a temporal discharge pattern at the population level reflecting early reach planning and on-line reach monitoring. In the plane-dissociated task, we found a significant overall attenuation in the discharge rate of cells from deep recording sites, relative to standard reaching. We also found that cells modulated by reach direction tended to be significantly tuned either during the standard or the plane-dissociated task but rarely during both. In the standard versus feedback reversal comparison, we observed some cells that shifted their preferred direction by 180° between conditions, reflecting maintenance of directional tuning with respect to the reach goal. Our findings suggest that the superior parietal lobule plays an important role in processing information about the nonstandard nature of a task, which, through reciprocal connections with precentral motor areas, contributes to the accurate transformation of incongruent sensory inputs into an appropriate motor output. Such processing is crucial for the integration of rule-based information into a motor act.

  14. Evaluation of a rule base for decision making in general practice.

    PubMed Central

    Essex, B; Healy, M

    1994-01-01

    BACKGROUND. Decision making in general practice relies heavily on judgmental expertise. It should be possible to codify this expertise into rules and principles. AIM. A study was undertaken to evaluate the effectiveness of rules from a rule base designed to improve students' and trainees' management decisions relating to patients seen in general practice. METHOD. The rule base was developed after studying decisions about and management of thousands of patients seen in one general practice over an eight year period. Vignettes were presented to 93 fourth year medical students and 179 general practitioner trainees. They recorded their perception and management of each case before and after being presented with a selection of relevant rules. Participants also commented on their level of agreement with each of the rules provided with the vignettes. A panel of five independent assessors then rated the participants' perception and management of each case, before and after seeing the rules, as good, acceptable or poor. RESULTS. Exposure to a few selected rules of thumb improved the problem perception and management decisions of both undergraduates and trainees. The degree of improvement was not related to previous experience or to the stated level of agreement with the proposed rules. The assessors identified difficulties students and trainees experienced in changing their perceptions and management decisions when the rules suggested options they had not considered. CONCLUSION. The rules developed to improve decision making skills in general practice are effective when used with vignettes. The next phase is to transform the rule base into an expert system to train students and doctors to acquire decision making skills. It could also be used to provide decision support when confronted with difficult management decisions in general practice. PMID:8204334

  15. Image Fusion Algorithms Using Human Visual System in Transform Domain

    NASA Astrophysics Data System (ADS)

    Vadhi, Radhika; Swamy Kilari, Veera; Samayamantula, Srinivas Kumar

    2017-08-01

    The aim of digital image fusion is to combine the important visual parts from various sources to improve the visual quality of the image. The fused image has higher visual quality than any of the source images. In this paper, Human Visual System (HVS) weights are used in the transform domain to select appropriate information from the various source images and thereby obtain a fused image. Two main steps are involved. First, the DWT is applied to the registered source images. Then, qualitative sub-bands are identified using the HVS weights; these sub-bands, selected from the different sources, form a high-quality HVS-based fused image. The quality of the HVS-based fused image is evaluated with general fusion metrics. The results show its superiority among state-of-the-art multi-resolution transforms (MRT) such as the Discrete Wavelet Transform (DWT), Stationary Wavelet Transform (SWT), Contourlet Transform (CT), and Non-Subsampled Contourlet Transform (NSCT) using the maximum selection fusion rule.
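    The maximum-selection fusion rule referenced above is easy to state concretely: transform each source, keep the coefficient of larger magnitude at every position, and invert. A minimal sketch on 1-D signals with a single-level Haar transform; real image fusion operates on 2-D sub-bands, and the HVS weighting of the paper is not modeled here:

```python
def haar_dwt(signal):
    # One-level Haar transform: approximation and detail coefficients.
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    # Inverse of the one-level Haar transform above.
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def fuse_max(coeffs_a, coeffs_b):
    # Maximum-selection rule: keep the coefficient with larger magnitude.
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

x = [4, 4, 8, 0]   # two hypothetical 1-D "images"
y = [2, 6, 4, 4]
ax, dx = haar_dwt(x)
ay, dy = haar_dwt(y)
fused = haar_idwt(fuse_max(ax, ay), fuse_max(dx, dy))
print(fused)
```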

  16. Fusion of GFP and phase contrast images with complex shearlet transform and Haar wavelet-based energy rule.

    PubMed

    Qiu, Chenhui; Wang, Yuanyuan; Guo, Yanen; Xia, Shunren

    2018-03-14

    Image fusion techniques can integrate the information from different imaging modalities to get a composite image which is more suitable for human visual perception and further image processing tasks. Fusing green fluorescent protein (GFP) and phase contrast images is very important for subcellular localization, functional analysis of protein and genome expression. The fusion method of GFP and phase contrast images based on complex shearlet transform (CST) is proposed in this paper. Firstly the GFP image is converted to IHS model and its intensity component is obtained. Secondly the CST is performed on the intensity component and the phase contrast image to acquire the low-frequency subbands and the high-frequency subbands. Then the high-frequency subbands are merged by the absolute-maximum rule while the low-frequency subbands are merged by the proposed Haar wavelet-based energy (HWE) rule. Finally the fused image is obtained by performing the inverse CST on the merged subbands and conducting IHS-to-RGB conversion. The proposed fusion method is tested on a number of GFP and phase contrast images and compared with several popular image fusion methods. The experimental results demonstrate that the proposed fusion method can provide better fusion results in terms of subjective quality and objective evaluation. © 2018 Wiley Periodicals, Inc.

  17. An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.

    PubMed

    Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin

    2015-07-01

    We propose an integrated mechanism for discarding derogatory features and extracting fuzzy rules based on an interval type-2 neural fuzzy system (NFS); in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variables and a large number of rules not only increase the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for the simultaneous extraction of fuzzy rules and reduction of the impact of (or elimination of) inferior features is necessary. The proposed approach, namely an interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. Poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.
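    Two of the ingredients above can be sketched in a few lines: an interval type-2 Gaussian membership function (uncertain width gives a lower and an upper membership bound) and a membership modulator that, as it goes to zero, pushes the membership toward 1 so the feature stops influencing rule firing under a min/product t-norm. The exact modulator form used in IT2NFS-SIFE may differ; this is an illustrative assumption:

    ```python
    import numpy as np

    def it2_gaussian(x, m, sigma_lo, sigma_hi):
        """Interval type-2 Gaussian membership: bounds from an uncertain
        width in [sigma_lo, sigma_hi]."""
        lo = np.exp(-0.5 * ((x - m) / sigma_lo) ** 2)
        hi = np.exp(-0.5 * ((x - m) / sigma_hi) ** 2)
        return min(lo, hi), max(lo, hi)

    def modulated(mu_interval, modulator):
        """Membership modulator: modulator -> 1 keeps the feature;
        modulator -> 0 forces the membership to 1, i.e. the feature
        no longer affects the rule's firing strength."""
        lo, hi = mu_interval
        return lo ** modulator, hi ** modulator
    ```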

  18. An automatic fuzzy-based multi-temporal brain digital subtraction angiography image fusion algorithm using curvelet transform and content selection strategy.

    PubMed

    Momeni, Saba; Pourghassem, Hossein

    2014-08-01

    Image fusion has recently taken a prominent role in medical image processing and is useful for diagnosing and treating many diseases. Digital subtraction angiography is one of the most widely used imaging modalities for diagnosing brain vascular diseases and for brain radiosurgery. This paper proposes an automatic fuzzy-based multi-temporal fusion algorithm for 2-D digital subtraction angiography images. In this algorithm, for blood vessel map extraction, the valuable frames of the brain angiography video are automatically determined to form the digital subtraction angiography images, based on a novel definition of vessel dispersion generated by the injected contrast material. The proposed fusion scheme contains different fusion methods for high- and low-frequency contents, based on the coefficient characteristics of the wrapping second-generation curvelet transform and a novel content selection strategy. The content selection strategy is defined based on the sample correlation of the curvelet transform coefficients. In the proposed fuzzy-based fusion scheme, the selection of curvelet coefficients is optimized by applying weighted averaging and maximum selection rules for the high-frequency coefficients. For the low-frequency coefficients, the maximum selection rule based on a local energy criterion is applied for better visual perception. The fusion algorithm is evaluated on a brain angiography image dataset consisting of one hundred 2-D internal carotid rotational angiography videos. The obtained results demonstrate the effectiveness and efficiency of the proposed fusion algorithm in comparison with common and basic fusion algorithms.

  19. Rule-based reasoning is fast and belief-based reasoning can be slow: Challenging current explanations of belief-bias and base-rate neglect.

    PubMed

    Newman, Ian R; Gibb, Maia; Thompson, Valerie A

    2017-07-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this hypothesis about the relative speed of belief-based and rule-based processes. Participants solved base-rate problems (Experiment 1) and conditional inferences (Experiment 2) under a challenging deadline; they then gave a second response in free time. We found that fast responses were informed by rules of probability and logical validity, and that slow responses incorporated belief-based information. Implications for Dual-Process theories and future research options for dissociating Type I and Type II processes are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
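    The normative, rule-based answer to a base-rate problem of the kind used in Experiment 1 is given by Bayes' rule. The numbers below are invented for illustration; they show how the base rate dominates even a highly diagnostic description:

    ```python
    def posterior(base_rate, p_desc_given_h, p_desc_given_not_h):
        """Bayes' rule: P(H | description) for a base-rate problem."""
        num = base_rate * p_desc_given_h
        return num / (num + (1.0 - base_rate) * p_desc_given_not_h)

    # Illustrative problem: 0.5% of a sample are doctors, and the
    # personality description fits a doctor 9 times better than a nurse.
    p = posterior(0.005, 0.9, 0.1)   # ~0.043: still almost surely a nurse
    ```

    A belief-based response would follow the stereotype ("sounds like a doctor"); the rule-based response weighs the 0.5% base rate and stays below 5%.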

  20. Multispectral image sharpening using a shift-invariant wavelet transform and adaptive processing of multiresolution edges

    USGS Publications Warehouse

    Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.

    2002-01-01

    Enhanced false color images from mid-IR, near-IR (NIR), and visible bands of the Landsat thematic mapper (TM) are commonly used for visually interpreting land cover type. Described here is a technique for sharpening, or fusion, of NIR with higher resolution panchromatic (Pan) imagery that uses a shift-invariant implementation of the discrete wavelet transform (SIDWT) and a reported pixel-based selection rule to combine coefficients. There can be contrast reversals (e.g., at soil-vegetation boundaries between NIR and visible band images) and, consequently, degraded sharpening and edge artifacts. To improve performance under these conditions, a local area-based correlation technique originally reported for comparing image-pyramid-derived edges was applied to the adaptive processing of wavelet-derived edge data. Using the redundant data of the SIDWT also improves edge data generation, and there is additional improvement because sharpened subband imagery is used in the edge-correlation process. A previously reported technique for sharpening three-band spectral imagery used forward and inverse intensity, hue, and saturation transforms and wavelet-based sharpening of intensity. That technique had limitations with opposite-contrast data, so in this study sharpening was applied to single-band multispectral-Pan image pairs. Sharpening used simulated 30-m NIR imagery produced by degrading the spatial resolution of a higher resolution reference. Performance, evaluated by comparison between the sharpened and reference images, was improved when sharpened subband data were used with the edge correlation.
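    The local area-based correlation at the heart of the adaptive step can be sketched as a windowed Pearson correlation between two detail (edge) images; values near -1 flag the contrast reversals where a plain coefficient-selection rule fails. Window size and names are illustrative assumptions:

    ```python
    import numpy as np

    def box_sum(x, win):
        """Sum of x over a win x win neighbourhood (edge-padded)."""
        pad = win // 2
        xp = np.pad(x, pad, mode='edge')
        s = np.zeros_like(x, dtype=float)
        for i in range(win):
            for j in range(win):
                s += xp[i:i + x.shape[0], j:j + x.shape[1]]
        return s

    def local_correlation(a, b, win=5, eps=1e-12):
        """Windowed Pearson correlation between two edge images."""
        n = float(win * win)
        ma, mb = box_sum(a, win) / n, box_sum(b, win) / n
        cov = box_sum(a * b, win) / n - ma * mb
        va = box_sum(a * a, win) / n - ma ** 2
        vb = box_sum(b * b, win) / n - mb ** 2
        return cov / np.sqrt(np.maximum(va * vb, eps))
    ```

    An adaptive rule can then, for example, fall back to averaging wherever the local correlation drops below some threshold instead of selecting one source's coefficient.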

  1. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music.

    PubMed

    Giraldo, Sergio I; Ramirez, Rafael

    2016-01-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the musical features extracted contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data has consisted of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules.

  2. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music

    PubMed Central

    Giraldo, Sergio I.; Ramirez, Rafael

    2016-01-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the musical features extracted contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data has consisted of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules. PMID:28066290

  3. Field Dislocation Mechanics for heterogeneous elastic materials: A numerical spectral approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Djaka, Komlan Senam; Villani, Aurelien; Taupin, Vincent

    Spectral methods using Fast Fourier Transform (FFT) algorithms have recently seen a surge in interest in the mechanics of materials community. The present work addresses the critical question of determining accurate local mechanical fields using FFT methods without artificial fluctuations arising from material- and defect-induced discontinuities. Precisely, this work introduces a numerical approach based on intrinsic discrete Fourier transforms for the simultaneous treatment of material discontinuities arising from the presence of dislocations and from elastic stiffness heterogeneities. To this end, the elasto-static equations of the field dislocation mechanics theory for periodic heterogeneous materials are numerically solved with FFT in the case of dislocations in proximity to inclusions of varying stiffness. An optimal intrinsic discrete Fourier transform method is sought based on two distinct schemes. A centered finite difference scheme for the differential rules is used for numerically solving the Poisson-type equation in Fourier space, while centered finite differences on a rotated grid are chosen for the computation of the modified Fourier–Green's operator associated with the Lippmann–Schwinger-type equation. By comparing different methods with analytical solutions for an edge dislocation in a composite material, it is found that the present spectral method is accurate, devoid of any numerical oscillation, and efficient even for an infinite phase elastic contrast, such as a hole embedded in a matrix containing a dislocation. The present FFT method is then used to simulate physical cases such as the elastic fields of dislocation dipoles located near the matrix/inclusion interface in a 2D composite material and those due to dislocation loop distributions surrounding cubic inclusions in a 3D composite material. In these configurations, the spectral method allows accurate investigation of the elastic interactions and image stresses due to dislocation fields in the presence of elastic inhomogeneities.
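    The core discrete-transform idea (solving a Poisson-type equation in Fourier space with the symbol of a centered finite difference rather than the continuous wavenumber -k^2) can be sketched in 1-D. This is a minimal periodic illustration, not the paper's full elasto-static solver:

    ```python
    import numpy as np

    def poisson_fft_1d(f, L=2 * np.pi):
        """Solve u'' = f on a periodic domain with the FFT, replacing the
        continuous symbol -k^2 by the symbol of the centered second
        difference, -(4/h^2) sin^2(kh/2)."""
        n = len(f)
        h = L / n
        k = 2 * np.pi * np.fft.fftfreq(n, d=h)          # angular wavenumbers
        symbol = -(4.0 / h**2) * np.sin(k * h / 2.0)**2  # discrete symbol
        fhat = np.fft.fft(f)
        uhat = np.zeros_like(fhat)
        nz = symbol != 0
        uhat[nz] = fhat[nz] / symbol[nz]   # k = 0 mode fixed to 0 (zero mean)
        return np.real(np.fft.ifft(uhat))
    ```

    Because the solver uses the exact symbol of the centered difference, the returned u satisfies the discrete Laplacian equation to machine precision; this is the property that suppresses the Gibbs-type oscillations mentioned above.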

  4. Field Dislocation Mechanics for heterogeneous elastic materials: A numerical spectral approach

    DOE PAGES

    Djaka, Komlan Senam; Villani, Aurelien; Taupin, Vincent; ...

    2017-03-01

    Spectral methods using Fast Fourier Transform (FFT) algorithms have recently seen a surge in interest in the mechanics of materials community. The present work addresses the critical question of determining accurate local mechanical fields using FFT methods without artificial fluctuations arising from material- and defect-induced discontinuities. Precisely, this work introduces a numerical approach based on intrinsic discrete Fourier transforms for the simultaneous treatment of material discontinuities arising from the presence of dislocations and from elastic stiffness heterogeneities. To this end, the elasto-static equations of the field dislocation mechanics theory for periodic heterogeneous materials are numerically solved with FFT in the case of dislocations in proximity to inclusions of varying stiffness. An optimal intrinsic discrete Fourier transform method is sought based on two distinct schemes. A centered finite difference scheme for the differential rules is used for numerically solving the Poisson-type equation in Fourier space, while centered finite differences on a rotated grid are chosen for the computation of the modified Fourier–Green's operator associated with the Lippmann–Schwinger-type equation. By comparing different methods with analytical solutions for an edge dislocation in a composite material, it is found that the present spectral method is accurate, devoid of any numerical oscillation, and efficient even for an infinite phase elastic contrast, such as a hole embedded in a matrix containing a dislocation. The present FFT method is then used to simulate physical cases such as the elastic fields of dislocation dipoles located near the matrix/inclusion interface in a 2D composite material and those due to dislocation loop distributions surrounding cubic inclusions in a 3D composite material. In these configurations, the spectral method allows accurate investigation of the elastic interactions and image stresses due to dislocation fields in the presence of elastic inhomogeneities.

  5. Fuzzy self-learning control for magnetic servo system

    NASA Technical Reports Server (NTRS)

    Tarn, J. H.; Kuo, L. T.; Juang, K. Y.; Lin, C. E.

    1994-01-01

    It is known that an effective control system is the key condition for successful implementation of high-performance magnetic servo systems. Major issues in designing such control systems are nonlinearity; unmodeled dynamics, such as secondary effects of copper resistance, stray fields, and saturation; and disturbance rejection, since the load effect acts directly on the servo system without transmission elements. One typical approach to designing control systems under these conditions is a special type of nonlinear feedback called gain scheduling, which accommodates linear regulators whose parameters are changed as a function of operating conditions in a preprogrammed way. In this paper, an on-line learning fuzzy control strategy is proposed. To inherit the wealth of linear control design, the relations between linear feedback and fuzzy logic controllers have been established. The exercise of engineering axioms of linear control design is thus transformed into the tuning of appropriate fuzzy parameters. Furthermore, fuzzy logic control brings the domain of candidate control laws from linear into nonlinear, and brings new prospects into the design of the local controllers. On the other hand, a self-learning scheme is utilized to automatically tune the fuzzy rule base. It is based on a network learning infrastructure; statistical approximation to assign credit; an animal-learning method to update the reinforcement map with a fast learning rate; and a temporal difference predictive scheme to optimize the control laws. Unlike supervised and statistical unsupervised learning schemes, the proposed method learns on-line from past experience and information from the process, and forms the rule base of an FLC system from randomly assigned initial control rules.

  6. Neoliberal Justice and the Transformation of the Moral: The Privatization of the Right to Health Care in Colombia.

    PubMed

    Abadía-Barrero, César Ernesto

    2016-03-01

    Neoliberal reforms have transformed the legislative scope and everyday dynamics around the right to health care from welfare state social contracts to insurance markets administered by transnational financial capital. This article presents experiences of health care-seeking treatment, judicial rulings about the right to health care, and market-based health care legislation in Colombia. When insurance companies deny services, citizens petition the judiciary to issue a writ affirming their right to health care. The judiciary evaluates the finances of all relevant parties to rule whether a service should be provided and who should be responsible for the costs. A 2011 law claimed that citizens who demand, physicians who prescribe, and judges who grant uncovered services use the system's limited economic resources and undermine the state's capacity to expand coverage to the poor. This article shows how the consolidation of neoliberal ideology in health care requires the transformation of moral values around life. © 2015 by the American Anthropological Association.

  7. System Complexity Reduction via Feature Selection

    ERIC Educational Resources Information Center

    Deng, Houtao

    2011-01-01

    This dissertation transforms a set of system complexity reduction problems to feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce the feature selection bias in tree…

  8. Rule groupings: A software engineering approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify, and almost impossible to verify or validate. Partitioning rule-based systems into rule groups that reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.
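    As a rough illustration of the grouping idea (the paper's actual distance metrics and clustering algorithms differ), each rule can be represented by the set of symbols it mentions and grouped greedily under a Jaccard-style distance; all rule names and thresholds below are hypothetical:

    ```python
    def jaccard_distance(r1, r2):
        """Distance between two rules, each given as a set of symbols."""
        union = len(r1 | r2)
        return 1.0 - len(r1 & r2) / union if union else 0.0

    def group_rules(rules, threshold=0.5):
        """Greedy single-link grouping: a rule joins the first group that
        contains a rule within `threshold`; otherwise it starts a group."""
        groups = []
        for name, syms in rules.items():
            for g in groups:
                if any(jaccard_distance(syms, rules[m]) <= threshold
                       for m in g):
                    g.append(name)
                    break
            else:
                groups.append([name])
        return groups
    ```

    Rules sharing most of their facts land in one group; rules over disjoint vocabularies form separate subdomain groups.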

  9. STRATEGIES OF MARINE DINOFLAGELLATE SURVIVAL AND SOME RULES OF ASSEMBLY. (R829368)

    EPA Science Inventory

    Dinoflagellate ecology is based on multiple adaptive strategies and species having diverse habitat preferences. Nine types of mixing-irradiance-nutrient habitats selecting for specific marine dinoflagellate life-form types are recognised, with five rules of assembly proposed t...

  10. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    PubMed

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are challenging tasks. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that allows several data models of different domains to be transformed by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
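    A common family of mapping rules for this kind of transformation (table to owl:Class, column to owl:DatatypeProperty, foreign key to owl:ObjectProperty) can be sketched as string generation. This is a generic illustration, not the paper's algorithm, and the Turtle-style output and names are assumptions:

    ```python
    def table_to_owl(table, columns, fk=None):
        """Map one relational table to Turtle-style OWL axiom strings."""
        cls = f":{table} a owl:Class ."
        props = [f":{table}.{c} a owl:DatatypeProperty ; rdfs:domain :{table} ."
                 for c in columns]
        links = [f":{table}.{c} a owl:ObjectProperty ; rdfs:range :{t} ."
                 for c, t in (fk or {}).items()]
        return [cls] + props + links
    ```

    For example, a Patient table with a foreign key into Ward yields a class, datatype properties for its plain columns, and an object property linking Patient to Ward.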

  11. Optimal operating rules definition in complex water resource systems combining fuzzy logic, expert criteria and stochastic programming

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2016-04-01

    This contribution presents a methodology for defining optimal seasonal operating rules in multireservoir systems coupling expert criteria and stochastic optimization. Both sources of information are combined using fuzzy logic. The structure of the operating rules is defined based on expert criteria, via a joint expert-technician framework consisting of a series of meetings, workshops and surveys carried out between reservoir managers and modelers. As a result, the decision-making process used by managers can be assessed and expressed using fuzzy logic: fuzzy rule-based systems are employed to represent the operating rules and fuzzy regression procedures are used for forecasting future inflows. Once that is done, a stochastic optimization algorithm can be used to define optimal decisions and transform them into fuzzy rules. Finally, the optimal fuzzy rules and the inflow prediction scheme are combined into a Decision Support System for making seasonal forecasts and simulating the effect of different alternatives in response to the initial system state and the foreseen inflows. The approach presented has been applied to the Jucar River Basin (Spain). Reservoir managers explained how the system is operated, taking into account the reservoirs' states at the beginning of the irrigation season and the inflows foreseen during that season. According to the information given by them, the Jucar River Basin operating policies were expressed via two fuzzy rule-based (FRB) systems that estimate the amount of water to be allocated to the users and how the reservoir storages should be balanced to guarantee those deliveries. A stochastic optimization model using Stochastic Dual Dynamic Programming (SDDP) was developed to define optimal decisions, which are transformed into optimal operating rules by embedding them into the two FRBs previously created. As a benchmark, historical records are used to develop alternative operating rules. A fuzzy linear regression procedure was employed to foresee future inflows depending on present and past hydrological and meteorological variables actually used by the reservoir managers to define likely inflow scenarios. A Decision Support System (DSS) was created coupling the FRB systems and the inflow prediction scheme in order to give the user a set of possible optimal releases in response to the reservoir states at the beginning of the irrigation season and the fuzzy inflow projections made using hydrological and meteorological information. The results show that the DSS created using the optimal FRB operating policies is able to increase the amount of water allocated to the users by 20 to 50 Mm3 per irrigation season with respect to the current policies. Consequently, the mechanism used to define optimal operating rules and transform them into a DSS is able to increase the water deliveries in the Jucar River Basin, combining expert criteria and optimization algorithms in an efficient way. This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and FEDER funds. It also has received funding from the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811).
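    A minimal Sugeno-style fuzzy rule base of the kind described (storage state in, seasonal allocation out) might look as follows. The membership functions and allocation numbers are invented for illustration and are not the Jucar system's:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with peak at b (requires a < b < c)."""
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    def seasonal_release(storage):
        """Toy Sugeno FRB: storage (0-100 %) -> seasonal allocation (hm3).
        Weighted average of the rule consequents by firing strength."""
        rules = [
            ((-1, 0, 50),    80.0),   # IF storage low  THEN allocate  80
            ((0, 50, 100),  140.0),   # IF storage med  THEN allocate 140
            ((50, 100, 101), 200.0),  # IF storage high THEN allocate 200
        ]
        w = [tri(storage, *abc) for abc, _ in rules]
        z = [out for _, out in rules]
        return sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
    ```

    Between the anchor points the output interpolates smoothly, which is what makes such a rule base a convenient container for decisions produced by a stochastic optimizer.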

  12. View-tolerant face recognition and Hebbian learning imply mirror-symmetric neural tuning to head orientation

    PubMed Central

    Leibo, Joel Z.; Liao, Qianli; Freiwald, Winrich A.; Anselmi, Fabio; Poggio, Tomaso

    2017-01-01

    The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and robust against identity-preserving transformations like depth-rotations [1, 2]. Current computational models of object recognition, including recent deep learning networks, generate these properties through a hierarchy of alternating selectivity-increasing filtering and tolerance-increasing pooling operations, similar to simple-complex cells operations [3, 4, 5, 6]. Here we prove that a class of hierarchical architectures and a broad set of biologically plausible learning rules generate approximate invariance to identity-preserving transformations at the top level of the processing hierarchy. However, all past models tested failed to reproduce the most salient property of an intermediate representation of a three-level face-processing hierarchy in the brain: mirror-symmetric tuning to head orientation [7]. Here we demonstrate that one specific biologically-plausible Hebb-type learning rule generates mirror-symmetric tuning to bilaterally symmetric stimuli like faces at intermediate levels of the architecture and show why it does so. Thus the tuning properties of individual cells inside the visual stream appear to result from group properties of the stimuli they encode and to reflect the learning rules that sculpted the information-processing system within which they reside. PMID:27916522

  13. Ontological Modeling of Transformation in Heart Defect Diagrams

    PubMed Central

    Viswanath, Venkatesh; Tong, Tuanjie; Dinakarpandian, Deendayal; Lee, Yugyung

    2006-01-01

    The accurate portrayal of a large volume data of variable heart defects is crucial to providing good patient care in pediatric cardiology. Our research aims to span the universe of congenital heart defects by generating illustrative diagrams that enhance data interpretation. To accommodate the range and severity of defects to be represented, we base our diagrams on transformation models applied to a normal heart rather than a static set of defects. These models are based on a domain-specific ontology, clustering, association rule mining and the use of parametric equations specified in a mathematical programming language. PMID:17238451

  14. Large-scale optimization-based classification models in medicine and biology.

    PubMed

    Lee, Eva K

    2007-06-01

    We present novel optimization-based classification models that are general purpose and suitable for developing predictive rules for large heterogeneous biological and medical data sets. Our predictive model simultaneously incorporates (1) the ability to classify any number of distinct groups; (2) the ability to incorporate heterogeneous types of attributes as input; (3) a high-dimensional data transformation that eliminates noise and errors in biological data; (4) the ability to incorporate constraints to limit the rate of misclassification, and a reserved-judgment region that provides a safeguard against over-training (which tends to lead to high misclassification rates from the resulting predictive rule); and (5) successive multi-stage classification capability to handle data points placed in the reserved-judgment region. To illustrate the power and flexibility of the classification model and solution engine, and its multi-group prediction capability, application of the predictive model to a broad class of biological and medical problems is described. Applications include: the differential diagnosis of the type of erythemato-squamous diseases; predicting presence/absence of heart disease; genomic analysis and prediction of aberrant CpG island methylation in human cancer; discriminant analysis of motility and morphology data in human lung carcinoma; prediction of ultrasonic cell disruption for drug delivery; identification of tumor shape and volume in treatment of sarcoma; discriminant analysis of biomarkers for prediction of early atherosclerosis; fingerprinting of native and angiogenic microvascular networks for early diagnosis of diabetes, aging, macular degeneration and tumor metastasis; prediction of protein localization sites; and pattern recognition of satellite images in classification of soil types. In all these applications, the predictive model yields correct classification rates ranging from 80 to 100%. This provides motivation for pursuing its use as a medical diagnostic, monitoring and decision-making tool.
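    The reserved-judgment idea can be sketched as a reject option on top of any set of discriminant scores: commit to the winning group only when it is clearly ahead of the runner-up, and otherwise defer (e.g., to a later classification stage). The margin value and group names below are illustrative assumptions, not the paper's formulation:

    ```python
    def classify_with_reserve(scores, margin=0.2):
        """Multi-group decision with a reserved-judgment region: return the
        best-scoring group only if it beats the runner-up by `margin`."""
        ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
        (best, s1), (_, s2) = ranked[0], ranked[1]
        return best if s1 - s2 >= margin else 'reserved'
    ```

    Points that land in the reserved region would then be passed to the next classification stage rather than risk a confident misclassification.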

  15. A new modulated Hebbian learning rule--biologically plausible method for local computation of a principal subspace.

    PubMed

    Jankovic, Marko; Ogawa, Hidemitsu

    2003-08-01

    This paper presents one possible implementation of a transformation that performs linear mapping to a lower-dimensional subspace; the principal component subspace is the one analyzed. The idea implemented in this paper generalizes the recently proposed infinity OH neural method for principal component extraction. The calculations in the newly proposed method are performed locally, a feature usually considered desirable from the biological point of view. Compared to some other well-known methods, the proposed synaptic efficacy learning rule requires less information about the values of the other efficacies to make a single efficacy modification. Synaptic efficacies are modified by a Modulated Hebb-type (MH) learning rule. A slightly modified MH algorithm, named the Modulated Hebb-Oja (MHO) algorithm, is also introduced. The structural similarity of the proposed network to part of the retinal circuit is presented as well.
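    Oja's rule, the best-known Hebb-type rule of this family, illustrates the kind of local computation involved: each weight update uses only the pre- and post-synaptic activities and the weight itself, yet the weight vector converges to the first principal direction. The MH/MHO rules of the paper differ in how the Hebbian term is modulated; this sketch shows only the classical baseline:

    ```python
    import numpy as np

    def oja_first_pc(X, lr=0.005, epochs=300, seed=0):
        """Extract the first principal direction with Oja's rule:
        w += lr * y * (x - y * w), where y = w . x is the unit's output."""
        rng = np.random.default_rng(seed)
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(epochs):
            for x in X:
                y = w @ x                 # post-synaptic activity
                w += lr * y * (x - y * w)  # Hebbian term minus decay
        return w / np.linalg.norm(w)
    ```

    The subtraction of y*w is what keeps the weight norm bounded without any global normalization step, which is the locality property stressed above.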

  16. Simulating urban land cover changes at sub-pixel level in a coastal city

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaofeng; Deng, Lei; Feng, Huihui; Zhao, Yanchuang

    2014-10-01

    The simulation of urban expansion and land cover change is a major theme in both geographic information science and landscape ecology. Yet until now, almost all previous studies have been based on grid computations at the pixel level. With the prevalence of spectral mixture analysis in urban land cover research, the simulation of urban land cover at the sub-pixel level is coming onto the agenda. This study provides a new approach to land cover simulation at the sub-pixel level. Landsat TM/ETM+ images of Xiamen city, China, from January 2002 and January 2007 were used to acquire land cover data through supervised classification. The two classified land cover maps were then used to extract the transformation rule between 2002 and 2007 using logistic regression. The transformation possibility of each land cover type in a given pixel was taken as its percentage in that pixel after normalization, and cellular automata (CA) based grid computation was carried out to obtain the simulated land cover for 2007. The simulated 2007 sub-pixel land cover was validated against a sub-pixel land cover map of the same date derived by spectral mixture analysis in our previous studies. Finally, the sub-pixel land cover of 2017 was simulated for urban planning and management. The results show that our method is useful for land cover simulation at the sub-pixel level. Although the simulation accuracy is not yet satisfactory for all land cover types, it provides an important idea and a good start for CA-based urban land cover simulation.
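    The per-pixel normalization step described above (logistic transformation scores turned into sub-pixel class fractions) can be sketched as follows, assuming a plain sigmoid link; the logit values are illustrative:

    ```python
    import numpy as np

    def subpixel_fractions(logits):
        """Turn per-class logistic scores for one pixel into sub-pixel
        cover fractions: sigmoid per class, then normalise to sum to 1."""
        p = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
        return p / p.sum()
    ```

    Each pixel then carries a fraction per land cover class rather than a single label, and the CA grid computation updates these fractions instead of hard class assignments.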

  17. Multispectral image sharpening using wavelet transform techniques and spatial correlation of edges

    USGS Publications Warehouse

    Lemeshewsky, George P.; Schowengerdt, Robert A.

    2000-01-01

    Several reported image fusion or sharpening techniques are based on the discrete wavelet transform (DWT). The technique described here uses a pixel-based maximum selection rule to combine respective transform coefficients of lower spatial resolution near-infrared (NIR) and higher spatial resolution panchromatic (pan) imagery to produce a sharpened NIR image. Sharpening assumes a radiometric correlation between the spectral band images. However, there can be poor correlation, including edge contrast reversals (e.g., at soil-vegetation boundaries), between the fused images and, consequently, degraded performance. To improve sharpening, a local area-based correlation technique originally reported for edge comparison with image pyramid fusion is modified for application with the DWT process. Further improvements are obtained by using redundant, shift-invariant implementation of the DWT. Example images demonstrate the improvements in NIR image sharpening with higher resolution pan imagery.
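    A minimal single-level sketch of the maximum-selection fusion rule, using a hand-rolled Haar transform in place of the paper's DWT, and assuming (as one plausible choice) that the NIR approximation band is retained for radiometry:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar transform: approximation + 3 detail bands."""
    a = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4
    h = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 4
    v = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 4
    d = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 4
    return a, h, v, d

def haar_idwt2(a, h, v, d):
    """Exact inverse of haar_dwt2."""
    img = np.empty((a.shape[0] * 2, a.shape[1] * 2))
    img[0::2, 0::2] = a + h + v + d
    img[0::2, 1::2] = a + h - v - d
    img[1::2, 0::2] = a - h + v - d
    img[1::2, 1::2] = a - h - v + d
    return img

def fuse_max(x, y):
    # Pixel-based maximum-selection rule: keep the larger-magnitude coefficient.
    return np.where(np.abs(x) >= np.abs(y), x, y)

def fuse_images(nir, pan):
    na, nh, nv, nd = haar_dwt2(nir)
    pa, ph, pv, pd_ = haar_dwt2(pan)
    # Keep the NIR approximation band (radiometry); max-select the details.
    return haar_idwt2(na, fuse_max(nh, ph), fuse_max(nv, pv), fuse_max(nd, pd_))

img = np.arange(64, dtype=float).reshape(8, 8)
a, h, v, d = haar_dwt2(img)
recon = haar_idwt2(a, h, v, d)
fused_same = fuse_images(img, img)
```

    A redundant (undecimated) transform, as the paper recommends, would avoid the downsampling in this sketch and improve shift invariance.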

  18. SIRE: A Simple Interactive Rule Editor for NICBES

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1988-01-01

    To support the evolution of domain expertise and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation to Prolog-type rules (Horn clauses), subsequent rule assertion, and a simple mechanism for rule selection by the Prolog inference engine.

  19. Leadership: validation of a self-report scale: comment on Dussault, Frenette, and Fernet (2013).

    PubMed

    Chakrabarty, Subhra

    2014-10-01

    In a recent study, Dussault, Frenette, and Fernet (2013) developed a 21-item self-report instrument to measure leadership based on Bass's (1985) transformational/transactional leadership paradigm. The final specification included a third-order dimension (leadership), two second-order dimensions (transactional leadership and transformational leadership), and a first-order dimension (laissez-faire leadership). This note focuses on the need for assessing convergent and discriminant validity of the scale, and on ruling out the potential for common method bias.

  20. Modelling Chemical Reasoning to Predict and Invent Reactions.

    PubMed

    Segler, Marwin H S; Waller, Mark P

    2017-05-02

    The ability to reason beyond established knowledge allows organic chemists to solve synthetic problems and invent novel transformations. Herein, we propose a model that mimics chemical reasoning, and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outperforms a rule-based expert system in the reaction prediction task for 180 000 randomly selected binary reactions. The data-driven model generalises even beyond known reaction types, and is thus capable of effectively (re-)discovering novel transformations (even including transition metal-catalysed reactions). Our model enables computers to infer hypotheses about reactivity and reactions by only considering the intrinsic local structure of the graph and because each single reaction prediction is typically achieved in a sub-second time frame, the model can be used as a high-throughput generator of reaction hypotheses for reaction discovery. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Deciding Full Branching Time Logic by Program Transformation

    NASA Astrophysics Data System (ADS)

    Pettorossi, Alberto; Proietti, Maurizio; Senni, Valerio

    We present a method, based on logic program transformation, for verifying Computation Tree Logic (CTL*) properties of finite state reactive systems. The finite state systems and the CTL* properties we want to verify are encoded as logic programs on infinite lists. Our verification method consists of two steps. In the first step we transform the logic program that encodes the given system and the given property into a monadic ω-program, that is, a stratified program defining nullary or unary predicates on infinite lists. This transformation is performed by applying unfold/fold rules that preserve the perfect model of the initial program. In the second step we verify the property of interest by using a proof method for monadic ω-programs.

  2. Training the rivers and exploring the coasts. Knowledge evolution in the Netherlands in two engineering fields between 1800 and 1940

    NASA Astrophysics Data System (ADS)

    Toussaint, Bert

    In this paper, the author explores knowledge development in two crucial fields, river management and coastal management, in the 19th century and the first decades of the 20th century. Were there similar characteristics in this development? Which types of knowledge can be distinguished? Who were the principal actors in these processes? Did the knowledge evolution have a Dutch stamp or a rather international flavour? To structure the analysis, the author uses the concept of a technology regime, a set of technical rules which shapes the know-how of engineers, their design rules and research processes. The analysis shows that knowledge development in river management and coastal management followed different evolutionary paths between 1800 and 1940. In the field of river management, a substantial body of mathematical and physical theory had been gradually developed since the end of the 17th century. After 1850, the regularization approach gradually met with widespread support. Empirical data, design rules, theoretical knowledge and engineering pivoted around the regularization approach, and a technology regime emerged around it. The regularization regime developed further in the 20th century, and handbooks were increasingly shaped by mathematical and physical reasoning and formulas. Coastal management, on the other hand, remained until the 1880s a rather marginal activity. Coastal engineering was an extremely complex and multidimensional field of knowledge which no single engineer was able to grasp. The foundation of a Dutch weather institute was a first important step towards a more theoretical approach. The Zuiderzee works (starting in 1925) probably gave the most important stimuli to scientific coastal research; they were also a main factor in the setting up of scientific institutes by Rijkswaterstaat. From the 1920s, Rijkswaterstaat thus became a major producer of scientific knowledge, not only in tidal modelling but also in coastal research. Thanks to a multidisciplinary knowledge network, coastal research was transformed from a marginal into a first-rank scientific field, and this transformation enabled Rijkswaterstaat to set a much higher level of ambition in coastal management. The 1953 flood and the Delta Works marked a new era. New design rules for sea dykes and river levees, based on a revolutionary statistical risk approach, were determined, and design rules for the Delta Works estuary closures were developed, enabled by the development of hydraulic research.

  3. [A wavelet-transform-based method for the automatic detection of late-type stars].

    PubMed

    Liu, Zhong-tian; Zhao, Rui-zhen; Zhao, Yong-heng; Wu, Fu-chao

    2005-07-01

    The LAMOST project, the world's largest sky survey project, urgently needs an automatic late-type star detection system. However, to our knowledge, no effective methods for automatic late-type star detection have been reported in the literature up to now. The present work is intended to explore possible ways to deal with this issue. Here, by "late-type stars" we mean those stars with strong molecular absorption bands, including oxygen-rich M, L and T type stars and carbon-rich C stars. Based on experimental results, the authors find that after a wavelet transform with 5 scales is applied to late-type star spectra, the frequency spectrum of the transformed coefficients on the 5th scale consistently manifests a unimodal distribution, with its energy largely concentrated in a small neighborhood centered around the unique peak. For the spectra of other celestial bodies, by contrast, the corresponding frequency spectrum is multimodal and its energy is dispersed. Based on this finding, the authors present a wavelet-transform-based automatic late-type star detection method. The proposed method is shown by extensive experiments to be practical and robust.

  4. Rules or consequences? The role of ethical mind-sets in moral dynamics.

    PubMed

    Cornelissen, Gert; Bashshur, Michael R; Rode, Julian; Le Menestrel, Marc

    2013-04-01

    Recent research on the dynamics of moral behavior has documented two contrasting phenomena: moral consistency and moral balancing. Moral balancing refers to the phenomenon whereby behaving ethically or unethically decreases the likelihood of engaging in the same type of behavior again later. Moral consistency describes the opposite pattern: engaging in ethical or unethical behavior increases the likelihood of engaging in the same type of behavior later on. The three studies reported here supported the hypothesis that individuals' ethical mind-set (i.e., outcome-based vs. rule-based) moderates the impact of an initial ethical or unethical act on the likelihood of behaving ethically on a subsequent occasion. More specifically, an outcome-based mind-set facilitated moral balancing, and a rule-based mind-set facilitated moral consistency.

  5. PET-CT image fusion using random forest and à-trous wavelet transform.

    PubMed

    Seal, Ayan; Bhattacharjee, Debotosh; Nasipuri, Mita; Rodríguez-Esparragón, Dionisio; Menasalvas, Ernestina; Gonzalo-Martin, Consuelo

    2018-03-01

    New image fusion rules for multimodal medical images are proposed in this work. The fusion rules are defined by a random forest learning algorithm and a translation-invariant à-trous wavelet transform (AWT). The proposed method is threefold. First, source images are decomposed into approximation and detail coefficients using the AWT. Second, a random forest is used to choose pixels from the approximation and detail coefficients to form the approximation and detail coefficients of the fused image. Lastly, the inverse AWT is applied to reconstruct the fused image. All experiments were performed on 198 slices of both computed tomography and positron emission tomography images of a patient. A traditional fusion method based on the Mallat wavelet transform was also implemented on these slices. A new image fusion performance measure, along with four existing measures, is presented, which helps to compare the performance of the two pixel-level fusion methods. The experimental results clearly indicate that the proposed method outperforms the traditional method in terms of visual and quantitative quality and that the new measure is meaningful. Copyright © 2017 John Wiley & Sons, Ltd.
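    The shift-invariant à-trous decomposition at the heart of such methods can be sketched as follows (wraparound boundaries and a fixed B3-spline kernel are simplifying assumptions here; the random-forest fusion rule is omitted):

```python
import numpy as np

def smooth(img, step):
    """One B3-spline smoothing pass 'with holes': taps spaced `step` apart
    (wraparound boundaries for brevity)."""
    w = np.array([1, 4, 6, 4, 1]) / 16.0
    out = sum(wi * np.roll(img, k * step, axis=0) for wi, k in zip(w, [-2, -1, 0, 1, 2]))
    out = sum(wi * np.roll(out, k * step, axis=1) for wi, k in zip(w, [-2, -1, 0, 1, 2]))
    return out

def atrous(img, levels=3):
    """A-trous decomposition: detail planes w_j plus a residual approximation.
    All planes keep the full image size, which is what makes the transform
    translation-invariant (no downsampling)."""
    planes, c = [], img.astype(float)
    for j in range(levels):
        c_next = smooth(c, 2 ** j)   # hole spacing doubles at each level
        planes.append(c - c_next)    # detail plane at scale j
        c = c_next
    return planes, c                 # details + approximation

rng = np.random.default_rng(0)
img = rng.normal(size=(16, 16))
details, approx = atrous(img)
recon = approx + sum(details)        # exact reconstruction by telescoping sum
```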

  6. Dependent Measure and Time Constraints Modulate the Competition between Conflicting Feature-Based and Rule-Based Generalization Processes

    ERIC Educational Resources Information Center

    Cobos, Pedro L.; Gutiérrez-Cobo, María J.; Morís, Joaquín; Luque, David

    2017-01-01

    In our study, we tested the hypothesis that feature-based and rule-based generalization involve different types of processes that may affect each other producing different results depending on time constraints and on how generalization is measured. For this purpose, participants in our experiments learned cue-outcome relationships that followed…

  7. Graduated driver licensing and differential deterrence: The effect of license type on intentions to violate road rules.

    PubMed

    Poirier, Brigitte; Blais, Etienne; Faubert, Camille

    2018-01-01

    In keeping with the differential deterrence theory, this article assesses the moderating effect of license type on the relationship between social control and intention to violate road rules. More precisely, the article has two objectives: (1) to assess the effect of license type on intentions to infringe road rules; and (2) to pinpoint mechanisms of social control affecting intentions to violate road rules based on one's type of driver license (a restricted license or a full license). This effect is examined among a sample of 392 young drivers in the province of Quebec, Canada. Drivers taking part in the Graduated Driver Licensing (GDL) program have limited demerit points and there is zero tolerance for drinking-and-driving. Propensity score matching techniques were used to assess the effect of the license type on intentions to violate road rules and on various mechanisms of social control. Regression analyses were then conducted to estimate the moderating effect of license type. Average treatment effects from propensity score matching analyses indicate that respondents with a restricted license have lower levels of intention to infringe road rules. While moral commitment and, to a lesser extent, the perceived risk of arrest are both negatively associated with intentions to violate road rules, the license type moderates the relationship between delinquent peers and intentions to violate road rules. The effect of delinquent peers is reduced among respondents with a restricted driver license. Finally, a diminished capability to resist peer pressure could explain the increased crash risk in months following full licensing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    NASA Astrophysics Data System (ADS)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association rule-based predictive model for machine failure in the industrial Internet of things (IIoT), which can accurately predict machine failure in a real manufacturing environment by investigating the relationship between the cause and the type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, and 3) visualization. The binarization step translates item values in a dataset into ones or zeros; the rule creation step then creates association rules as IF-THEN structures using the Lattice model and the Apriori algorithm. Finally, the created rules are visualized in various ways for users' understanding. An experimental implementation was conducted using RStudio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
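    The support/confidence computation behind such IF-THEN rules can be illustrated with a brute-force miniature (toy failure items, pairs only; a full Apriori implementation would prune candidate itemsets level by level):

```python
from itertools import combinations

# Toy binarized failure log: each transaction lists observed conditions
# and the failure type (illustrative items, not from the paper).
transactions = [
    {"overheat", "vibration", "bearing_failure"},
    {"overheat", "bearing_failure"},
    {"vibration", "belt_failure"},
    {"overheat", "vibration", "bearing_failure"},
    {"overheat", "bearing_failure"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Apriori-style pass: frequent itemsets of size 1 and 2 (min support 0.4).
items = sorted(set().union(*transactions))
frequent = [frozenset([i]) for i in items if support(frozenset([i])) >= 0.4]
frequent += [frozenset(p) for p in combinations(items, 2)
             if support(frozenset(p)) >= 0.4]

# IF-THEN rules A -> B with confidence = support(A u B) / support(A).
rules = {}
for pair in (fs for fs in frequent if len(fs) == 2):
    for a in pair:
        consequent = pair - {a}
        rules[(a, next(iter(consequent)))] = support(pair) / support(frozenset([a]))
```

    In this toy log, every transaction with "overheat" also records "bearing_failure", so the rule (overheat -> bearing_failure) has confidence 1.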

  9. View-Tolerant Face Recognition and Hebbian Learning Imply Mirror-Symmetric Neural Tuning to Head Orientation.

    PubMed

    Leibo, Joel Z; Liao, Qianli; Anselmi, Fabio; Freiwald, Winrich A; Poggio, Tomaso

    2017-01-09

    The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and robust against identity-preserving transformations, like depth rotations [1, 2]. Current computational models of object recognition, including recent deep-learning networks, generate these properties through a hierarchy of alternating selectivity-increasing filtering and tolerance-increasing pooling operations, similar to simple-complex cells operations [3-6]. Here, we prove that a class of hierarchical architectures and a broad set of biologically plausible learning rules generate approximate invariance to identity-preserving transformations at the top level of the processing hierarchy. However, all past models tested failed to reproduce the most salient property of an intermediate representation of a three-level face-processing hierarchy in the brain: mirror-symmetric tuning to head orientation [7]. Here, we demonstrate that one specific biologically plausible Hebb-type learning rule generates mirror-symmetric tuning to bilaterally symmetric stimuli, like faces, at intermediate levels of the architecture and show why it does so. Thus, the tuning properties of individual cells inside the visual stream appear to result from group properties of the stimuli they encode and to reflect the learning rules that sculpted the information-processing system within which they reside. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Fusion of multi-spectral and panchromatic images based on 2D-PWVD and SSIM

    NASA Astrophysics Data System (ADS)

    Tan, Dongjie; Liu, Yi; Hou, Ruonan; Xue, Bindang

    2016-03-01

    A combined method using 2D pseudo Wigner-Ville distribution (2D-PWVD) and structural similarity(SSIM) index is proposed for fusion of low resolution multi-spectral (MS) image and high resolution panchromatic (PAN) image. First, the intensity component of multi-spectral image is extracted with generalized IHS transform. Then, the spectrum diagrams of the intensity components of multi-spectral image and panchromatic image are obtained with 2D-PWVD. Different fusion rules are designed for different frequency information of the spectrum diagrams. SSIM index is used to evaluate the high frequency information of the spectrum diagrams for assigning the weights in the fusion processing adaptively. After the new spectrum diagram is achieved according to the fusion rule, the final fusion image can be obtained by inverse 2D-PWVD and inverse GIHS transform. Experimental results show that, the proposed method can obtain high quality fusion images.

  11. The New Rule Paradigm Shift: Transforming At-Risk Programs by Matching Business Archetypes Strategies in the Global Market

    ERIC Educational Resources Information Center

    Stark, Paul S.

    2007-01-01

    The challenge was given to transform aviation-related programs to keep them from being eliminated. These programs were to be discontinued due to enrollment declines, costs, legislative mandates, lack of administrative support, and drastic state budget reductions. The New Rule was a paradigm shift of focus to the global market for program…

  12. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1990-01-01

    Run time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases, where compile-time information is inadequate. The methods presented involve execution time preprocessing of the loop. At compile-time, these methods set up the framework for performing a loop dependency analysis. At run time, wave fronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce: inspector procedures that perform execution time preprocessing and executors or transformed versions of source code loop structures. These transformed loop structures carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run time reordering of loop indices can have a significant impact on performance. Furthermore, the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
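    The inspector/executor split can be sketched in miniature; the dependence array below is illustrative, whereas in the real scheme the inspector derives it from the loop's subscript arrays at run time:

```python
# Iterations i of a loop read a[dep[i]] and write a[i]; dep[] is known
# only at run time (-1 means "no dependence"; values are illustrative).
dep = [-1, 0, 0, 1, 2, 3, 3, 5]

# Inspector: assign each iteration to a wavefront. An iteration can run
# as soon as the iteration it depends on has completed.
wavefront = [0] * len(dep)
for i, d in enumerate(dep):
    wavefront[i] = 0 if d < 0 else wavefront[d] + 1

# Executor plan: group iterations by wavefront. Iterations within a group
# are mutually independent and could execute concurrently.
schedule = {}
for i, w in enumerate(wavefront):
    schedule.setdefault(w, []).append(i)
```

    The inspector's cost is amortized exactly as the abstract notes: the schedule is reused whenever the loop runs again with the same dependency structure.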

  13. Transfer of Rule-Based Expertise through a Tutorial Dialogue

    DTIC Science & Technology

    1979-09-01

    be causing the infection (.2) [RULE633]. {The student asks, "Does the patient have a fever?"} FEBRILE ... MYCIN never needed to inquire about whether ... remaining clauses, most we classified as restrictions, and the one or two that remained constituted the key factor(s) of the rule. The "petechial" ... Infection is bacterial, KEY-FACTOR; 4) Petechial is one of the types of rash which the patient has, RESTRICTIONS; 5) Purpuric is not one of the types

  14. Analyzing Large Gene Expression and Methylation Data Profiles Using StatBicRM: Statistical Biclustering-Based Rule Mining

    PubMed Central

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and BeadChip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify a special type of rules and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. First, a novel statistical strategy is used to eliminate insignificant, low-significance, or redundant genes in such a way that the significance level respects the data distribution property (viz., normal or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding special type of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it thus saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we classify the data to assess how accurately the evolved rules describe the remaining test (unknown) data, and we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we include an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level. PMID:25830807

  15. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    PubMed

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and BeadChip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify a special type of rules and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. First, a novel statistical strategy is used to eliminate insignificant, low-significance, or redundant genes in such a way that the significance level respects the data distribution property (viz., normal or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding special type of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it thus saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we classify the data to assess how accurately the evolved rules describe the remaining test (unknown) data, and we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we include an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level.

  16. Rule-based land use/land cover classification in coastal areas using seasonal remote sensing imagery: a case study from Lianyungang City, China.

    PubMed

    Yang, Xiaoyan; Chen, Longgao; Li, Yingkui; Xi, Wenjia; Chen, Longqian

    2015-07-01

    Land use/land cover (LULC) inventories provide an important dataset for regional planning and environmental assessment. To obtain the LULC inventory efficiently, we compared LULC classifications based on single satellite images with a rule-based classification based on multi-seasonal imagery in Lianyungang City, a coastal city in China, using CBERS-02 (the 2nd China-Brazil Earth Resources Satellite) images. The overall accuracies of the classifications based on single images are 78.9, 82.8, and 82.0% in winter, early summer, and autumn, respectively. The rule-based classification improves the accuracy to 87.9% (kappa 0.85), suggesting that combining multi-seasonal images can considerably improve classification accuracy over any single-image-based classification. The method can also be used to analyze seasonal changes in LULC types, especially those associated with tidal changes in coastal areas. The resulting distribution and inventory of LULC types, with an overall accuracy of 87.9% and a spatial resolution of 19.5 m, can efficiently assist regional planning and environmental assessment in Lianyungang City. This rule-based classification provides guidance for improving accuracy in coastal areas with distinct temporal spectral features among LULC types.
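    The general shape of such seasonal decision rules can be sketched as follows; the thresholds, features, and class names are hypothetical stand-ins, not the rules actually derived in the study:

```python
# Illustrative rule-based classifier combining multi-seasonal features
# (per-season NDVI, a tidal flag). All thresholds and classes are
# hypothetical, chosen only to show the rule structure.
def classify(ndvi_winter, ndvi_summer, is_flooded_at_high_tide):
    if is_flooded_at_high_tide:
        return "tidal flat"
    if ndvi_summer > 0.5 and ndvi_winter < 0.2:
        return "cropland"          # strong seasonal vegetation cycle
    if ndvi_summer > 0.5 and ndvi_winter > 0.4:
        return "evergreen forest"  # vegetated year-round
    if ndvi_summer < 0.1 and ndvi_winter < 0.1:
        return "built-up / bare"
    return "other"
```

    The point of combining seasons is visible in the first two vegetation rules: a single summer image cannot separate cropland from evergreen forest, but the winter NDVI can.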

  17. LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings

    NASA Astrophysics Data System (ADS)

    Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan

    2018-01-01

    This paper applies Shepard's method to the original LIDAR point cloud data to generate a regular-grid DSM, filters ground and non-ground point clouds with a double least-squares method, and obtains a regularized DSM. A region-growing method is then used to segment the regularized DSM and remove non-building point clouds, yielding the building point cloud information. The Canny operator is used to extract the edges of the buildings from the segmented image, and Hough-transform line detection is used to regularize the extracted building edges so that they are smooth and uniform. Finally, the E3De3 software is used to establish the 3D model of the buildings.
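    The Hough-transform line detection step can be sketched with a minimal accumulator (toy edge points; a real pipeline would feed in Canny output and threshold the accumulator peaks):

```python
import numpy as np

def hough_lines(edge_points, n_theta=180):
    """Minimal Hough accumulator: each edge pixel (y, x) votes for every
    line rho = x*cos(theta) + y*sin(theta) that passes through it."""
    ys = [p[0] for p in edge_points]
    xs = [p[1] for p in edge_points]
    diag = int(np.ceil(np.hypot(max(ys) + 1, max(xs) + 1)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for y, x in edge_points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    return acc, diag

# Toy edge map: a horizontal roof edge at y = 5, x = 0..59.
pts = [(5, x) for x in range(60)]
acc, diag = hough_lines(pts)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
rho = rho_idx - diag  # strongest peak: theta = 90 degrees, rho = 5
```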

  18. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  19. Ductilizing bulk metallic glass composite by tailoring stacking fault energy.

    PubMed

    Wu, Y; Zhou, D Q; Song, W L; Wang, H; Zhang, Z Y; Ma, D; Wang, X L; Lu, Z P

    2012-12-14

    Martensitic transformation was successfully introduced to bulk metallic glasses as the reinforcement micromechanism. In this Letter, it was found that the twinning property of the reinforcing crystals can be dramatically improved by reducing the stacking fault energy through microalloying, which effectively alters the electron charge density redistribution on the slipping plane. The enhanced twinning propensity promotes the martensitic transformation of the reinforcing austenite and, consequently, improves plastic stability and the macroscopic tensile ductility. In addition, a general rule to identify effective microalloying elements based on their electronegativity and atomic size was proposed.

  20. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System (MARS) has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographic information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographic fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be created manually for any input journals with arbitrary or new layout types. It is therefore of interest to label journal articles independently of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographic fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  1. A Comparison of Methods for Transforming Sentences into Test Questions for Instructional Materials. Technical Report #1.

    ERIC Educational Resources Information Center

    Roid, Gale; And Others

    Several measurement theorists have convincingly argued that methods of writing test questions, particularly for criterion-referenced tests, should be based on operationally defined rules. This study was designed to examine and further refine a method for objectively generating multiple-choice questions for prose instructional materials. Important…

  2. SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks.

    PubMed

    Zenke, Friedemann; Ganguli, Surya

    2018-06-01

    A vast majority of computation in the brain is performed by spiking neural networks. Despite the ubiquity of such spiking, we currently lack an understanding of how biological spiking neural circuits learn and compute in vivo, as well as how we can instantiate such capabilities in artificial spiking circuits in silico. Here we revisit the problem of supervised learning in temporally coding multilayer spiking neural networks. First, by using a surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns. Second, inspired by recent results on feedback alignment, we compare the performance of our learning rule under different credit assignment strategies for propagating output errors to hidden units. Specifically, we test uniform, symmetric, and random feedback, finding that simpler tasks can be solved with any type of feedback, while more complex tasks require symmetric feedback. In summary, our results open the door to obtaining a better scientific understanding of learning and computation in spiking neural networks by advancing our ability to train them to solve nonlinear problems involving transformations between different spatiotemporal spike time patterns.
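The three-factor structure described above (presynaptic activity, an error signal, and a surrogate for the spike derivative) can be sketched for a single integrate-and-fire neuron. This is a hedged illustration, not the authors' implementation: the fast-sigmoid surrogate, the time constant, and the learning rate are illustrative choices.

```python
import numpy as np

def surrogate_grad(v, v_th=1.0, beta=10.0):
    """Fast-sigmoid surrogate for the derivative of the spike nonlinearity."""
    return 1.0 / (beta * np.abs(v - v_th) + 1.0) ** 2

def run_lif(weights, spikes_in, tau=0.9, v_th=1.0):
    """Simulate one leaky integrate-and-fire neuron.

    Returns the output spike train and the membrane voltage trace."""
    T = spikes_in.shape[0]
    v, out, vs = 0.0, np.zeros(T), np.zeros(T)
    for t in range(T):
        v = tau * v + spikes_in[t] @ weights  # leaky integration of input
        vs[t] = v
        if v >= v_th:
            out[t] = 1.0
            v = 0.0  # reset after spike
    return out, vs

# Three-factor update: presynaptic spikes x output error x surrogate derivative.
rng = np.random.default_rng(0)
spikes_in = (rng.random((50, 4)) < 0.2).astype(float)   # toy input spike trains
target = (rng.random(50) < 0.1).astype(float)           # toy target spike train
w = rng.normal(0, 0.5, 4)
out, vs = run_lif(w, spikes_in)
err = target - out
grad = (err[:, None] * surrogate_grad(vs)[:, None] * spikes_in).sum(axis=0)
w += 0.1 * grad  # one illustrative learning step
```

In a multilayer setting the same surrogate derivative is what lets the error be propagated (via symmetric or random feedback) through the otherwise non-differentiable spiking units.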

  3. A neural network architecture for implementation of expert systems for real time monitoring

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.

    1991-01-01

    Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.
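The mapping of conjunctive and disjunctive antecedents onto neurons can be sketched as simple threshold units: an AND neuron fires only when all antecedents hold, an OR neuron when any does. This is a minimal illustration of the idea, not the paper's translator, and the rule contents are invented examples.

```python
import numpy as np

def and_neuron(inputs):
    """AND of binary antecedents: threshold equals the number of inputs."""
    w = np.ones(len(inputs))
    return float(w @ inputs >= len(inputs))

def or_neuron(inputs):
    """OR of binary antecedents: threshold of one suffices."""
    w = np.ones(len(inputs))
    return float(w @ inputs >= 1.0)

# Hypothetical rule: IF fever AND cough THEN flu_suspected
facts = np.array([1.0, 1.0])
assert and_neuron(facts) == 1.0

# Hypothetical rule: IF smoke OR heat THEN alarm
assert or_neuron(np.array([0.0, 1.0])) == 1.0
```

A translator of the kind described would walk a rule base and instantiate one such neuron per antecedent expression, wiring rule outputs as inputs to downstream rules.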

  4. ER2OWL: Generating OWL Ontology from ER Diagram

    NASA Astrophysics Data System (ADS)

    Fahad, Muhammad

Ontology is the fundamental part of the Semantic Web. The W3C's goal is to bring the web to its full potential as a Semantic Web while reusing existing systems and artifacts. Most legacy systems have been documented in structured analysis and structured design (SASD), especially in simple or Extended ER Diagrams (ERD). Such systems need upgrading to become part of the Semantic Web. In this paper, we present ERD-to-OWL-DL ontology transformation rules at the concrete level. These rules facilitate an easy and understandable transformation from ERD to OWL. The set of transformation rules is tested on a structured analysis and design example. The framework produces OWL ontologies for the Semantic Web and helps software engineers upgrade the structured analysis and design artifact, the ERD, into components of the Semantic Web. Moreover, our transformation tool, ER2OWL, reduces the cost and time of building OWL ontologies by reusing existing entity-relationship models.
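A minimal sketch of the kind of ERD-to-OWL mapping described (entity → owl:Class, attribute → datatype property, relationship → object property) might look as follows. The function name and the exact Turtle emitted are illustrative assumptions, not the ER2OWL tool's actual output.

```python
def er_to_owl(entities, relationships):
    """Emit Turtle for a toy ERD: each entity becomes an owl:Class, each
    attribute a datatype property, each relationship an object property."""
    lines = [
        "@prefix : <http://example.org/onto#> .",
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
    ]
    for name, attrs in entities.items():
        lines.append(f":{name} a owl:Class .")
        for a in attrs:
            lines.append(f":{a} a owl:DatatypeProperty ; rdfs:domain :{name} .")
    for rel, (src, dst) in relationships.items():
        lines.append(f":{rel} a owl:ObjectProperty ; "
                     f"rdfs:domain :{src} ; rdfs:range :{dst} .")
    return "\n".join(lines)

# Hypothetical ERD fragment: Student --enrolledIn--> Course
onto = er_to_owl({"Student": ["studentId"], "Course": ["courseCode"]},
                 {"enrolledIn": ("Student", "Course")})
```

A real transformation would also handle keys, cardinalities and weak entities, which is where concrete-level rules of the kind the paper defines come in.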

  5. Toward semantic-based retrieval of visual information: a model-based approach

    NASA Astrophysics Data System (ADS)

    Park, Youngchoon; Golshani, Forouzan; Panchanathan, Sethuraman

    2002-07-01

This paper centers on the problem of automated visual content classification. To enable classification-based image or visual-object retrieval, we propose a new image representation scheme called the visual context descriptor (VCD), a multidimensional vector in which each element represents the frequency of a unique visual property of an image or region. VCD utilizes predetermined quality dimensions (i.e., types of features and quantization levels) and semantic model templates mined a priori. Not only observed visual cues but also contextually relevant visual features are proportionally incorporated into VCD. The contextual relevance of a visual cue to a semantic class is determined by correlation analysis of ground-truth samples. Such co-occurrence analysis of visual cues requires transforming a real-valued visual feature vector (e.g., a color histogram or Gabor texture) into a discrete event (e.g., terms in text). Good features to track, the rule of thirds, iterative k-means clustering and TSVQ are involved in transforming feature vectors into unified symbolic representations called visual terms. Similarity-based visual cue frequency estimation is also proposed and used to ensure the correctness of model learning and matching, since the sparseness of sample data makes frequency estimation of visual cues unstable. The proposed method naturally allows the integration of heterogeneous visual, temporal or spatial cues in a single classification or matching framework, and can easily be integrated into a semantic knowledge base such as a thesaurus or ontology. Robust semantic visual model template creation and object-based image retrieval are demonstrated based on the proposed content description scheme.

  6. Proposal to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as the nomenclatural type of the class Actinobacteria. Request for an Opinion

    PubMed Central

    2017-01-01

    The name of the class Actinobacteria is illegitimate according to Rules 15, 22 and 27(3) because it was proposed without the designation of a nomenclatural type. I therefore propose to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as its nomenclatural type, based on Rule 22 of the International Code of Nomenclature of Prokaryotes. PMID:28840812

  7. Proposal to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as the nomenclatural type of the class Actinobacteria. Request for an Opinion.

    PubMed

    Oren, Aharon

    2017-09-01

    The name of the class Actinobacteria is illegitimate according to Rules 15, 22 and 27(3) because it was proposed without the designation of a nomenclatural type. I therefore propose to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as its nomenclatural type, based on Rule 22 of the International Code of Nomenclature of Prokaryotes.

  8. Principals' Leadership in Mexican Upper High Schools: The Paradoxes between Rules and Practices

    ERIC Educational Resources Information Center

    Santizo Rodall, Claudia A.; Ortega Salazar, Sylvia B.

    2018-01-01

    This article discusses the type of organization and leadership that underlies a competency-based management rule established in Mexico (2008) applicable to principals in public upper high schools. This rule, identified as the 449 Agreement, describes competencies and communicates expected behavior. Implementation, however, is mediated by the…

  9. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing identifies relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow and one in the context of on-the-fly gesture recognition.
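The core CEP idea, recognizing an ordered pattern of events within a sliding window over a stream, can be sketched minimally. The matcher and the event names below are invented for illustration and bear no relation to the Viatra implementation.

```python
from collections import deque

def match_sequence(stream, pattern, window=5):
    """Minimal CEP-style matcher: report every stream position at which the
    events in `pattern` occur in order within the last `window` items."""
    buf = deque(maxlen=window)
    hits = []
    for i, ev in enumerate(stream):
        buf.append(ev)
        it = iter(buf)
        # in-order subsequence check over the current window
        if all(any(e == p for e in it) for p in pattern):
            hits.append(i)
    return hits

# Hypothetical model-change events; the rule fires on create followed by delete.
stream = ["create", "modify", "other", "delete", "create", "delete"]
hits = match_sequence(stream, ["create", "delete"], window=4)
```

In the architecture the abstract describes, such matches would then activate reactive transformation rules rather than merely being collected.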

  10. Automated diagnosis of coronary artery disease based on data mining and fuzzy modeling.

    PubMed

    Tsipouras, Markos G; Exarchos, Themis P; Fotiadis, Dimitrios I; Kotsia, Anna P; Vakalis, Konstantinos V; Naka, Katerina K; Michalis, Lampros K

    2008-07-01

    A fuzzy rule-based decision support system (DSS) is presented for the diagnosis of coronary artery disease (CAD). The system is automatically generated from an initial annotated dataset, using a four stage methodology: 1) induction of a decision tree from the data; 2) extraction of a set of rules from the decision tree, in disjunctive normal form and formulation of a crisp model; 3) transformation of the crisp set of rules into a fuzzy model; and 4) optimization of the parameters of the fuzzy model. The dataset used for the DSS generation and evaluation consists of 199 subjects, each one characterized by 19 features, including demographic and history data, as well as laboratory examinations. Tenfold cross validation is employed, and the average sensitivity and specificity obtained is 62% and 54%, respectively, using the set of rules extracted from the decision tree (first and second stages), while the average sensitivity and specificity increase to 80% and 65%, respectively, when the fuzzification and optimization stages are used. The system offers several advantages since it is automatically generated, it provides CAD diagnosis based on easily and noninvasively acquired features, and is able to provide interpretation for the decisions made.
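The third stage, turning a crisp rule extracted from the decision tree into a fuzzy one, can be illustrated with a toy rule. The features, thresholds and sigmoid memberships here are hypothetical, not the paper's actual model; the steepness parameters stand in for what the optimization stage would tune.

```python
import math

def crisp_rule(age, chol):
    """Crisp rule of the kind a decision tree yields:
    IF age > 55 AND cholesterol > 240 THEN CAD."""
    return 1.0 if (age > 55 and chol > 240) else 0.0

def fuzzy_rule(age, chol, steep_a=0.5, steep_c=0.1):
    """Fuzzified version: hard thresholds become sigmoid memberships and
    the AND becomes the product t-norm."""
    mu_age = 1.0 / (1.0 + math.exp(-steep_a * (age - 55)))
    mu_chol = 1.0 / (1.0 + math.exp(-steep_c * (chol - 240)))
    return mu_age * mu_chol

# Near the decision boundary the crisp rule flips abruptly,
# while the fuzzy rule degrades gracefully.
assert crisp_rule(54, 250) == 0.0
assert 0.0 < fuzzy_rule(54, 250) < 1.0
```

This graceful behavior near thresholds is one plausible reason the fuzzification and optimization stages raise sensitivity and specificity over the crisp rule set.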

  11. Optical Generation of Fuzzy-Based Rules

    NASA Astrophysics Data System (ADS)

    Gur, Eran; Mendlovic, David; Zalevsky, Zeev

    2002-08-01

    In the last third of the 20th century, fuzzy logic has risen from a mathematical concept to an applicable approach in soft computing. Today, fuzzy logic is used in control systems for various applications, such as washing machines, train-brake systems, automobile automatic gear, and so forth. The approach of optical implementation of fuzzy inferencing was given by the authors in previous papers, giving an extra emphasis to applications with two dominant inputs. In this paper the authors introduce a real-time optical rule generator for the dual-input fuzzy-inference engine. The paper briefly goes over the dual-input optical implementation of fuzzy-logic inferencing. Then, the concept of constructing a set of rules from given data is discussed. Next, the authors show ways to implement this procedure optically. The discussion is accompanied by an example that illustrates the transformation from raw data into fuzzy set rules.

  12. Image segmentation using association rule features.

    PubMed

    Rushing, John A; Ranganath, Heggere; Hinke, Thomas H; Graves, Sara J

    2002-01-01

A new type of texture feature based on association rules is described. Association rules have been used in applications such as market basket analysis to capture relationships present among items in large data sets. It is shown that association rules can be adapted to capture frequently occurring local structures in images. The frequency of occurrence of these structures can be used to characterize texture. Methods for segmentation of textured images based on association rule features are described. Simulation results using images consisting of man-made and natural textures show that association rule features perform well compared with other widely used texture features. Association rule features are used to detect cumulus cloud fields in GOES satellite images and are found to achieve higher accuracy than other statistical texture features for this problem.
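The underlying idea, counting frequently co-occurring local gray-level patterns and using their normalized frequencies as a texture descriptor, can be sketched roughly as follows. This simplified pair-counting over a pre-quantized image stands in for full association-rule mining and is not the paper's algorithm.

```python
import numpy as np
from collections import Counter

def assoc_rule_features(img, window=3):
    """Count co-occurring (center, neighbor) gray-level pairs inside sliding
    windows (the center pairs with every pixel in the window, itself
    included); the normalized counts act as a texture feature vector."""
    h, w = img.shape
    counts = Counter()
    r = window // 2
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = img[i - r:i + r + 1, j - r:j + r + 1]
            c = int(patch[r, r])
            for v in patch.flat:
                counts[(c, int(v))] += 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

rng = np.random.default_rng(1)
img = rng.integers(0, 4, size=(16, 16))  # toy image with 4 quantized levels
feats = assoc_rule_features(img)
```

Segmentation would then compare such frequency vectors between regions; a texture boundary shows up as a change in which local patterns are frequent.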

  13. Twisted supersymmetry: Twisted symmetry versus renormalizability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dimitrijevic, Marija; Nikolic, Biljana; Radovanovic, Voja

    We discuss a deformation of superspace based on a Hermitian twist. The twist implies a *-product that is noncommutative, Hermitian and finite when expanded in a power series of the deformation parameter. The Leibniz rule for the twisted supersymmetry transformations is deformed. A minimal deformation of the Wess-Zumino action is proposed and its renormalizability properties are discussed. There is no tadpole contribution, but the two-point function diverges. We speculate that the deformed Leibniz rule, or more generally the twisted symmetry, interferes with renormalizability properties of the model. We discuss different possibilities to render a renormalizable model.

  14. An Improved Quantum Information Hiding Protocol Based on Entanglement Swapping of χ-type Quantum States

    NASA Astrophysics Data System (ADS)

    Xu, Shu-Jiang; Chen, Xiu-Bo; Wang, Lian-Hai; Ding, Qing-Yan; Zhang, Shu-Hui

    2016-06-01

In 2011, Qu et al. proposed a quantum information hiding protocol based on the entanglement swapping of χ-type quantum states. Because a χ-type state can be described by 4-particle cat states, which have good symmetry, the possible outcomes of entanglement swapping between a given χ-type state and all 16 χ-type states fall into 8 rather than 16 distinct groups when the global phase is not considered. It is therefore difficult to read out the secret messages, since each result occurs twice in each line (column) of the secret-message encoding rule of the original protocol. In fact, only a 3-bit rather than a 4-bit secret message can be encoded by performing two unitary transformations on 2 particles of a χ-type quantum state in the original protocol. To overcome this defect, we propose an improved quantum information hiding protocol based on the general term formulas of the entanglement swapping among χ-type states.

  15. Remote Sensing Image Fusion Method Based on Nonsubsampled Shearlet Transform and Sparse Representation

    NASA Astrophysics Data System (ADS)

    Moonon, Altan-Ulzii; Hu, Jianwen; Li, Shutao

    2015-12-01

Remote sensing image fusion is an important preprocessing technique in remote sensing image processing. In this paper, a remote sensing image fusion method based on the nonsubsampled shearlet transform (NSST) with sparse representation (SR) is proposed. Firstly, the low-resolution multispectral (MS) image is upsampled and its color space is transformed from Red-Green-Blue (RGB) to Intensity-Hue-Saturation (IHS). Then, the high-resolution panchromatic (PAN) image and the intensity component of the MS image are decomposed by NSST into high- and low-frequency coefficients. The low-frequency coefficients of the PAN image and the intensity component are fused by SR with a learned dictionary. The high-frequency coefficients of the intensity component and the PAN image are fused by a local-energy-based fusion rule. Finally, the fused result is obtained by performing the inverse NSST and inverse IHS transforms. Experimental results on IKONOS and QuickBird satellite images demonstrate that the proposed method provides better spectral quality and superior spatial information in the fused image than other remote sensing image fusion methods, in both visual effect and objective evaluation.
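The local-energy fusion rule for the high-frequency coefficients can be sketched independently of the NSST machinery: at each position, keep the coefficient whose neighborhood carries more energy. The window size and the coefficient arrays below are illustrative, not the paper's settings.

```python
import numpy as np

def local_energy(c, win=3):
    """Sliding-window sum of squared coefficients (zero-padded borders)."""
    pad = win // 2
    cp = np.pad(c, pad)
    e = np.zeros_like(c, dtype=float)
    for i in range(c.shape[0]):
        for j in range(c.shape[1]):
            e[i, j] = np.sum(cp[i:i + win, j:j + win] ** 2)
    return e

def fuse_highfreq(c_pan, c_int):
    """Local-energy rule: per position, select the source coefficient
    whose neighborhood has the larger energy."""
    mask = local_energy(c_pan) >= local_energy(c_int)
    return np.where(mask, c_pan, c_int)

# Toy high-frequency subbands standing in for NSST outputs.
rng = np.random.default_rng(2)
pan, intensity = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
fused = fuse_highfreq(pan, intensity)
```

Selecting by neighborhood energy rather than by single-coefficient magnitude makes the rule less sensitive to isolated noisy coefficients.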

  16. Incorporating Technology and Cooperative Learning to Teach Function Transformations

    ERIC Educational Resources Information Center

    Boz, Burçak; Erbilgin, Evrim

    2015-01-01

    When teaching transformations of functions, teachers typically have students vary the coefficients of equations and examine the resulting changes in the graph. This approach, however, may lead students to memorise rules related to transformations. Students need opportunities to think deeply about transformations beyond superficial observations…

  17. Multiadaptive Bionic Wavelet Transform: Application to ECG Denoising and Baseline Wandering Reduction

    NASA Astrophysics Data System (ADS)

    Sayadi, Omid; Shamsollahi, Mohammad B.

    2007-12-01

We present a new modified wavelet transform, called the multiadaptive bionic wavelet transform (MABWT), that can be applied to ECG signals to remove noise under a wide range of noise variations. By using the definition of the bionic wavelet transform and adaptively determining both the center frequency of each scale and the corresponding adaptation function, the problem of desired signal decomposition is solved. A newly proposed thresholding rule works successfully in denoising the ECG. Moreover, by using the multiadaptation scheme, low-pass noisy interference on the ECG baseline is removed as a direct task. The method was extensively clinically tested with real and simulated ECG signals and showed high noise-reduction performance, comparable to that of the wavelet transform (WT). Quantitative evaluation of the proposed algorithm shows that the average SNR improvement of MABWT is 1.82 dB more than the WT-based results in the best case. The procedure has also proved largely advantageous over wavelet-based methods for baseline wandering cancellation, covering both DC components and baseline drifts.

  18. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  19. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

The RuleML Challenge started in 2007 with the objective of inspiring work on the implementation, management, integration, interoperation and interchange of rules in an open distributed environment such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules; reactive rules are further divided into ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we regard a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML and RIF, to achieve portability and interchange of rules across different rule systems; otherwise, executing real-life rule-based applications on the Web would be almost impossible. Several commercial and open-source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmark, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases are investigated to demonstrate the applicability of current rule systems on the Web.

  20. Abstract rule learning in 11- and 14-month-old infants.

    PubMed

    Koulaguina, Elena; Shi, Rushen

    2013-02-01

This study tests the hypothesis that distributional information can guide infants in the generalization of word order movement rules at the initial stage of language acquisition. Participants were 11- and 14-month-old infants. Stimuli were sentences in Russian, a language that was unknown to our infants. During training the word order of each sentence was transformed following a consistent pattern (e.g., ABC-BAC). During the test phase infants heard novel sentences that respected the trained rule and ones that violated the trained rule (i.e., a different transformation such as ABC-ACB). Stimuli words had highly variable phonological and morphological shapes. The cue available was the positional information of words and their non-adjacent relations across sentences. We found that 14-month-olds, but not 11-month-olds, showed evidence of abstract rule generalization to novel instances. The implications of this finding for early syntactic acquisition are discussed.
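The word-order transformations can be sketched as index permutations over tokenized sentences: ABC-BAC is the permutation (1, 0, 2), and the violating ABC-ACB is (0, 2, 1). The tokens below are placeholders, not the study's actual Russian stimuli.

```python
def apply_rule(sentence, rule):
    """Reorder a tokenized three-word sentence by position indices."""
    return [sentence[i] for i in rule]

ABC_TO_BAC = (1, 0, 2)  # trained rule
ABC_TO_ACB = (0, 2, 1)  # violating transformation used at test

# Placeholder tokens standing in for the Russian stimuli.
train = ["tok1", "tok2", "tok3"]
assert apply_rule(train, ABC_TO_BAC) == ["tok2", "tok1", "tok3"]

# A learner tracking only positions generalizes to entirely novel words,
# and the two rules remain distinguishable on any novel sentence.
novel = ["X", "Y", "Z"]
assert apply_rule(novel, ABC_TO_BAC) != apply_rule(novel, ABC_TO_ACB)
```

The point of the design is exactly this position-only cue: since word shapes varied widely, only the permutation itself was available to generalize.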

  1. 40 CFR 35.3565 - Specific cash draw rules for authorized types of assistance from the Fund.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to 5 percent of each fiscal year's capitalization grant or 2 million dollars, whichever is greater... 40 Protection of Environment 1 2011-07-01 2011-07-01 false Specific cash draw rules for authorized... the following rules: (a) Loans—(1) Eligible project costs. A State may draw cash based on the...

  2. 40 CFR 35.3565 - Specific cash draw rules for authorized types of assistance from the Fund.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to 5 percent of each fiscal year's capitalization grant or 2 million dollars, whichever is greater... 40 Protection of Environment 1 2014-07-01 2014-07-01 false Specific cash draw rules for authorized... the following rules: (a) Loans—(1) Eligible project costs. A State may draw cash based on the...

  3. 40 CFR 35.3565 - Specific cash draw rules for authorized types of assistance from the Fund.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to 5 percent of each fiscal year's capitalization grant or 2 million dollars, whichever is greater... 40 Protection of Environment 1 2013-07-01 2013-07-01 false Specific cash draw rules for authorized... the following rules: (a) Loans—(1) Eligible project costs. A State may draw cash based on the...

  4. 40 CFR 35.3565 - Specific cash draw rules for authorized types of assistance from the Fund.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to 5 percent of each fiscal year's capitalization grant or 2 million dollars, whichever is greater... 40 Protection of Environment 1 2012-07-01 2012-07-01 false Specific cash draw rules for authorized... the following rules: (a) Loans—(1) Eligible project costs. A State may draw cash based on the...

  5. Curriculum Transformation in a Post-Apartheid South African University: The Arts Faculty, Tshwane University of Technology

    ERIC Educational Resources Information Center

    Ebewo, Patrick J.; Sirayi, Mzo

    2018-01-01

    During the apartheid rule in South Africa, established universities and other tertiary institutions were forcibly segregated to serve particular racial groups. Some critics have stated that the apartheid regime in South Africa supported an exclusively Western model of education, and that university education was based on a mono-cultural approach…

  6. Explicit bounds for the positive root of classes of polynomials with applications

    NASA Astrophysics Data System (ADS)

    Herzberger, Jürgen

    2003-03-01

We consider a certain type of polynomial equation for which, according to Descartes' rule of signs, there exists only one simple positive root. These equations occur in numerical analysis when calculating or estimating the R-order or Q-order of convergence of certain iterative processes whose error recursion has a special form. On the other hand, such polynomial equations are very common as defining equations for the effective rate of return of certain cashflows, like bonds or annuities, in finance. The effective rate of interest i* for these cashflows is i* = q* - 1, where q* is the unique positive root of such a polynomial. We construct bounds for i* for a special problem concerning an ordinary simple annuity, obtained by changing the conditions of such an annuity with given data according to the German rule (Preisangabeverordnung, PAngV for short). Moreover, we revisit a number of results on such polynomial roots in numerical analysis and show that a simple variable transformation allows several formulas to be derived from earlier results. The same is possible in finance, generalizing results to more complicated cashflows.
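A hedged numerical sketch of the finance side: writing the annuity's defining polynomial in q = 1 + i and bisecting for its unique root q* > 1, whose existence and uniqueness follow from the single sign change Descartes' rule guarantees. The price, payment and term in the example are invented figures, and bisection stands in for the paper's analytic bounds.

```python
def annuity_poly(q, price, payment, n):
    """f(q) = price*q**n - payment*(q**n - 1)/(q - 1).
    For n*payment > price, f is negative near q = 1 and positive for large q,
    so the unique root q* > 1 gives the effective rate i* = q* - 1."""
    return price * q ** n - payment * (q ** n - 1) / (q - 1)

def effective_rate(price, payment, n, lo=1.0 + 1e-9, hi=2.0):
    """Bisection on [lo, hi]; assumes n*payment > price so the root exists."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if annuity_poly(lo, price, payment, n) * annuity_poly(mid, price, payment, n) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi) - 1.0

# Invented example: a 1000-unit loan repaid by 12 payments of 120.
i_star = effective_rate(1000.0, 120.0, 12)
```

Explicit bounds of the kind the paper derives would bracket i* without iteration, which matters when the defining polynomial must be analyzed symbolically rather than solved numerically.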

  7. Performance Evaluation of CMUT-Based Ultrasonic Transformers for Galvanic Isolation.

    PubMed

    Heller, Jacques; Boulme, Audren; Alquier, Daniel; Ngo, Sophie; Certon, Dominique

    2018-04-01

    This paper presents the development of a novel acoustic transformer with high galvanic isolation dedicated to power switch triggering. The transformer is based on two capacitive micromachined ultrasonic transducers layered on each side of a silicon substrate; one is the primary circuit, and the other is the secondary circuit. The thickness mode resonance of the substrate is leveraged to transmit the triggering signal. The fabrication and characterization of an initial prototype is presented in this paper. All experimental results are discussed, from the electrical impedance measurements to the power efficiency measurements, for different electrical load conditions. A comparison with a specifically developed finite-element method model is done. Simulations are finally used to identify the optimization rules of this initial prototype. It is shown that the power efficiency can be increased from 35% to 60%, and the transmitted power can be increased from 1.6 to 45 mW/Volt.

  8. An infrared-visible image fusion scheme based on NSCT and compressed sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Qiong; Maldague, Xavier

    2015-05-01

Image fusion, currently a major research topic in infrared computer vision, has been pursued with a wide variety of methods. Traditional image fusion algorithms tend to introduce problems such as heavy data-storage demands and increased computational complexity. Compressed sensing (CS) uses sparse sampling without prior knowledge and still reconstructs the image well, which reduces the cost and complexity of image processing. In this paper, an advanced compressed-sensing image fusion algorithm based on the non-subsampled contourlet transform (NSCT) is proposed. NSCT provides better sparsity than the wavelet transform in image representation. Through NSCT decomposition, the low-frequency and high-frequency coefficients are obtained separately. For the fusion of the low-frequency coefficients of the infrared and visible images, an adaptive regional-energy weighting rule is used, so only the high-frequency coefficients are specially measured. Here we use sparse representation and random projection to obtain the measured values of the high-frequency coefficients; afterwards, the coefficients of each image block are fused via the absolute-maximum selection rule and/or the regional standard-deviation rule. To reconstruct the compressive sampling results, a gradient-based iterative algorithm and the total variation (TV) method are employed to recover the high-frequency coefficients. Finally, the fused image is recovered by the inverse NSCT. Both the visual results and the numerical evaluations after experiments indicate that the presented approach achieves much higher fusion quality, accelerates the calculations, enhances various targets and extracts more useful information.
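Of the fusion rules mentioned, the absolute-maximum selection rule is the simplest to sketch: per position, keep whichever source coefficient has the larger magnitude. The coefficient arrays below are toy values standing in for NSCT high-frequency subbands.

```python
import numpy as np

def fuse_absmax(c_a, c_b):
    """Absolute-maximum selection rule for high-frequency coefficients."""
    return np.where(np.abs(c_a) >= np.abs(c_b), c_a, c_b)

# Toy 2x2 subbands from two sources (e.g., infrared and visible).
a = np.array([[0.5, -2.0], [1.0, 0.1]])
b = np.array([[-1.5, 0.3], [0.2, -0.4]])
f = fuse_absmax(a, b)
```

The rationale is that large-magnitude high-frequency coefficients correspond to salient edges and details, so each detail is taken from whichever source captured it more strongly.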

  9. Slant rectification in Russian passport OCR system using fast Hough transform

    NASA Astrophysics Data System (ADS)

    Limonova, Elena; Bezmaternykh, Pavel; Nikolaev, Dmitry; Arlazarov, Vladimir

    2017-03-01

In this paper, we introduce a slant detection method based on the fast Hough transform and demonstrate its application in an industrial system for recognizing Russian passports. About 1.5% of these documents are slanted or set in italic, which reduces the recognition rate, because optical character recognition systems are normally designed to process upright fonts. Our method uses the fast Hough transform to analyse the vertical strokes of characters, extracted with the help of the x-derivative of a text-line image. To improve the quality of the detector, we also introduce field-grouping rules. The resulting algorithm achieves high detection quality. Almost all errors of the considered approach occur on passports set in nonstandard fonts, while the slant detector otherwise works as intended.
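The paper's detector relies on the fast Hough transform; as a hedged stand-in, the same slant estimate can be sketched with a brute-force shear search that maximizes the sharpness of the vertical projection profile (upright strokes produce the most peaked column histogram). The angle range and the synthetic stroke are illustrative choices.

```python
import numpy as np

def estimate_slant(binary, angles_deg=range(-15, 16)):
    """Shear-search slant estimator: shear the binarized text image by each
    candidate angle and pick the angle maximizing the variance of the
    vertical projection profile."""
    h, w = binary.shape
    best, best_score = 0, -1.0
    for a in angles_deg:
        t = np.tan(np.radians(a))
        prof = np.zeros(w + h, dtype=float)
        ys, xs = np.nonzero(binary)
        cols = (xs + ys * t).astype(int)  # shear each row by its height
        np.add.at(prof, np.clip(cols, 0, w + h - 1), 1.0)
        score = prof.var()
        if score > best_score:
            best, best_score = a, score
    return best

# Synthetic test image: a single stroke leaning at roughly +10 degrees.
img = np.zeros((40, 60), dtype=int)
for y in range(40):
    img[y, 20 + int((40 - y) * np.tan(np.radians(10)))] = 1
est = estimate_slant(img)
```

A Hough-based detector reaches the same answer more efficiently by accumulating stroke directions directly, which is presumably why the fast Hough transform is used in the production system.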

  10. On the Nature of Syntactic Irregularity.

    ERIC Educational Resources Information Center

    Lakoff, George

    This dissertation is an attempt to characterize the notion "exception to a rule of grammar" within the context of Chomsky's conception of grammar as given in "Aspects of the Theory of Syntax." This notion depends on a prior notion of "rule government"--in each phrase marker on which a transformational rule may…

  11. Optimal joint detection and estimation that maximizes ROC-type curves

    PubMed Central

    Wunderlich, Adam; Goossens, Bart; Abbey, Craig K.

    2017-01-01

    Combined detection-estimation tasks are frequently encountered in medical imaging. Optimal methods for joint detection and estimation are of interest because they provide upper bounds on observer performance, and can potentially be utilized for imaging system optimization, evaluation of observer efficiency, and development of image formation algorithms. We present a unified Bayesian framework for decision rules that maximize receiver operating characteristic (ROC)-type summary curves, including ROC, localization ROC (LROC), estimation ROC (EROC), free-response ROC (FROC), alternative free-response ROC (AFROC), and exponentially-transformed FROC (EFROC) curves, succinctly summarizing previous results. The approach relies on an interpretation of ROC-type summary curves as plots of an expected utility versus an expected disutility (or penalty) for signal-present decisions. We propose a general utility structure that is flexible enough to encompass many ROC variants and yet sufficiently constrained to allow derivation of a linear expected utility equation that is similar to that for simple binary detection. We illustrate our theory with an example comparing decision strategies for joint detection-estimation of a known signal with unknown amplitude. In addition, building on insights from our utility framework, we propose new ROC-type summary curves and associated optimal decision rules for joint detection-estimation tasks with an unknown, potentially-multiple, number of signals in each observation. PMID:27093544
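In the simplest binary case, the linear expected-utility equation mentioned above reduces to the familiar likelihood-ratio test with a threshold set by the priors and utilities; a minimal sketch under a hypothetical equal-variance Gaussian observation model (all parameter values illustrative):

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_threshold(p1, u_tp, u_fn, u_tn, u_fp):
    """Bayes-optimal likelihood-ratio threshold from the signal prior p1 and utilities."""
    return ((1.0 - p1) * (u_tn - u_fp)) / (p1 * (u_tp - u_fn))

def decide(x, mu0=0.0, mu1=2.0, sigma=1.0, threshold=1.0):
    """Declare 'signal present' when the likelihood ratio exceeds the threshold."""
    lr = gaussian_pdf(x, mu1, sigma) / gaussian_pdf(x, mu0, sigma)
    return lr > threshold
```

With equal priors and symmetric utilities the threshold is 1, i.e., pick whichever hypothesis makes the observation more likely.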

  12. Optimal Joint Detection and Estimation That Maximizes ROC-Type Curves.

    PubMed

    Wunderlich, Adam; Goossens, Bart; Abbey, Craig K

    2016-09-01

    Combined detection-estimation tasks are frequently encountered in medical imaging. Optimal methods for joint detection and estimation are of interest because they provide upper bounds on observer performance, and can potentially be utilized for imaging system optimization, evaluation of observer efficiency, and development of image formation algorithms. We present a unified Bayesian framework for decision rules that maximize receiver operating characteristic (ROC)-type summary curves, including ROC, localization ROC (LROC), estimation ROC (EROC), free-response ROC (FROC), alternative free-response ROC (AFROC), and exponentially-transformed FROC (EFROC) curves, succinctly summarizing previous results. The approach relies on an interpretation of ROC-type summary curves as plots of an expected utility versus an expected disutility (or penalty) for signal-present decisions. We propose a general utility structure that is flexible enough to encompass many ROC variants and yet sufficiently constrained to allow derivation of a linear expected utility equation that is similar to that for simple binary detection. We illustrate our theory with an example comparing decision strategies for joint detection-estimation of a known signal with unknown amplitude. In addition, building on insights from our utility framework, we propose new ROC-type summary curves and associated optimal decision rules for joint detection-estimation tasks with an unknown, potentially-multiple, number of signals in each observation.

  13. Transforming the Way We Teach Function Transformations

    ERIC Educational Resources Information Center

    Faulkenberry, Eileen Durand; Faulkenberry, Thomas J.

    2010-01-01

    In this article, the authors discuss "function," a well-defined rule that relates inputs to outputs. They have found that by using the input-output definition of "function," they can examine transformations of functions simply by looking at changes to input or output and the respective changes to the graph. Applying transformations to the input…

  14. Transformers and the Electric Utility System

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2005-01-01

    For electric energy to get from the generating station to a home, it must pass through a transformer, a device that can change voltage levels easily. This article describes how transformers work, covering the following topics: (1) the magnetism-electricity link; (2) transformer basics; (3) the energy seesaw; (4) the turns ratio rule; and (5)…

  15. Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks

    PubMed Central

    Bennett, Kristin P.

    2014-01-01

    We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating expert rules can drastically increase classification accuracy when data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web. PMID:24864238

  16. Classification and disease prediction via mathematical programming

    NASA Astrophysics Data System (ADS)

    Lee, Eva K.; Wu, Tsung-Lin

    2007-11-01

    In this chapter, we present classification models based on mathematical programming approaches. We first provide an overview on various mathematical programming approaches, including linear programming, mixed integer programming, nonlinear programming and support vector machines. Next, we present our effort of novel optimization-based classification models that are general purpose and suitable for developing predictive rules for large heterogeneous biological and medical data sets. Our predictive model simultaneously incorporates (1) the ability to classify any number of distinct groups; (2) the ability to incorporate heterogeneous types of attributes as input; (3) a high-dimensional data transformation that eliminates noise and errors in biological data; (4) the ability to incorporate constraints to limit the rate of misclassification, and a reserved-judgment region that provides a safeguard against over-training (which tends to lead to high misclassification rates from the resulting predictive rule) and (5) successive multi-stage classification capability to handle data points placed in the reserved judgment region. To illustrate the power and flexibility of the classification model and solution engine, and its multigroup prediction capability, application of the predictive model to a broad class of biological and medical problems is described. 
Applications include: the differential diagnosis of the type of erythemato-squamous diseases; predicting presence/absence of heart disease; genomic analysis and prediction of aberrant CpG island methylation in human cancer; discriminant analysis of motility and morphology data in human lung carcinoma; prediction of ultrasonic cell disruption for drug delivery; identification of tumor shape and volume in treatment of sarcoma; multistage discriminant analysis of biomarkers for prediction of early atherosclerosis; fingerprinting of native and angiogenic microvascular networks for early diagnosis of diabetes, aging, macular degeneration and tumor metastasis; prediction of protein localization sites; and pattern recognition of satellite images in classification of soil types. In all these applications, the predictive model yields correct classification rates ranging from 80% to 100%. This provides motivation for pursuing its use as a medical diagnostic, monitoring and decision-making tool.

  17. Integration of Hospital Information and Clinical Decision Support Systems to Enable the Reuse of Electronic Health Record Data.

    PubMed

    Kopanitsa, Georgy

    2017-05-18

    The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. High heterogeneity of the existing legacy data formats has become the main barrier for the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying of medical data to allow its reuse. In this paper, we focus on the interoperability issues of hospital information system (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al., where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We built an integration tool that enables CDSSs to collect data from various institutions without requiring modifications to their implementation. The approach implies the development of a conceptual level as a set of archetypes representing the concepts required by a CDSS. Treatment case data from the Regional Clinical Hospital in Tomsk, Russia was extracted, transformed and loaded into the archetype database of a clinical decision support system. Test records' normalization was performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR compliant data. After the transformation, archetype data instances were loaded into the CDSS archetype based data storage. The performance times showed acceptable performance for the extraction stage, with a mean of 17.428 s per year (3436 case records). The transformation times were also acceptable, with 136.954 s per year (0.039 s per instance). The accuracy evaluation showed the correctness and applicability of the method for a wide range of HISs. These operations were performed without interrupting the HIS workflow, so that service provision to users was not disturbed.
The project results have proven that archetype based technologies are mature enough to be applied in routine operations that require extraction, transformation, loading and querying medical data from heterogeneous EHR systems. Inference models in clinical research and CDSS can benefit from this by defining queries to a valid data set with known structure and constraints. The standard based nature of the archetype approach allows an easy integration of CDSSs with existing EHR systems.

  18. Nonlinear dynamic systems identification using recurrent interval type-2 TSK fuzzy neural network - A novel structure.

    PubMed

    El-Nagar, Ahmad M

    2018-01-01

    In this study, a novel structure of a recurrent interval type-2 Takagi-Sugeno-Kang (TSK) fuzzy neural network (FNN) is introduced for the identification of nonlinear dynamic and time-varying systems. It combines type-2 fuzzy sets (T2FSs) and a recurrent FNN to cope with data uncertainties. The fuzzy firing strengths in the proposed structure are fed back to the network input as internal variables. Interval type-2 fuzzy sets (IT2FSs) are used to describe the antecedent part of each rule, while the consequent part is TSK-type: a linear function of the internal variables and the external inputs with interval weights. All the type-2 fuzzy rules of the proposed RIT2TSKFNN are learned on-line through structure and parameter learning, which are performed using type-2 fuzzy clustering. The antecedent and consequent parameters of the proposed RIT2TSKFNN are updated based on a Lyapunov function to guarantee network stability. The obtained results indicate that our proposed network achieves a small root mean square error (RMSE) and a small integral of square error (ISE) with a small number of rules and a small computation time compared with other type-2 FNNs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
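A minimal sketch of the interval firing strength produced by IT2FS antecedents, assuming Gaussian membership functions with an uncertain standard deviation and a product t-norm (the paper's exact parameterization may differ):

```python
import math

def it2_gaussian(x, mean, sigma_a, sigma_b):
    """Interval type-2 Gaussian membership with uncertain standard deviation.
    Returns the (lower, upper) membership grades at x."""
    mu_a = math.exp(-0.5 * ((x - mean) / sigma_a) ** 2)
    mu_b = math.exp(-0.5 * ((x - mean) / sigma_b) ** 2)
    return min(mu_a, mu_b), max(mu_a, mu_b)

def firing_interval(inputs, antecedents):
    """Rule firing strength as an interval: product t-norm over antecedent grades.
    antecedents is a list of (mean, sigma_lower, sigma_upper) per input."""
    f_lo, f_up = 1.0, 1.0
    for x, (m, s_lo, s_up) in zip(inputs, antecedents):
        lo, up = it2_gaussian(x, m, s_lo, s_up)
        f_lo *= lo
        f_up *= up
    return f_lo, f_up
```

The interval [f_lo, f_up] is what a TSK-type consequent with interval weights would then combine into the network output.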

  19. Microcomputer-based classification of environmental data in municipal areas

    NASA Astrophysics Data System (ADS)

    Thiergärtner, H.

    1995-10-01

    Multivariate data-processing methods used in mineral resource identification can also be used to classify urban regions. Using elements of expert systems, geographical information systems, and known classification and prognosis systems, it is possible to outline a single model that consists of persistent and temporary parts of a knowledge base, including graphical input and output handling, and of persistent and temporary elements of a bank of methods and algorithms. Whereas decision rules created by experts are stored directly in expert systems, powerful classification rules in the form of persistent but latent (implicit) decision algorithms may be implemented in the suggested model. The latent functions are transformed into temporary explicit decision rules by learning processes that depend on the actual task(s), parameter set(s), pixel selection(s), and expert control(s). This applies to both supervised and unsupervised classification of multivariately described pixel sets representing municipal subareas. The model is outlined briefly and illustrated by results obtained in a target area covering part of the city of Berlin (Germany).

  20. Medicare Program; Prospective Payment System and Consolidated Billing for Skilled Nursing Facilities (SNFs) for FY 2016, SNF Value-Based Purchasing Program, SNF Quality Reporting Program, and Staffing Data Collection. Final Rule.

    PubMed

    2015-08-04

    This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs) for fiscal year (FY) 2016. In addition, it specifies a SNF all-cause all-condition hospital readmission measure, as well as adopts that measure for a new SNF Value-Based Purchasing (VBP) Program, and includes a discussion of SNF VBP Program policies we are considering for future rulemaking to promote higher quality and more efficient health care for Medicare beneficiaries. Additionally, this final rule will implement a new quality reporting program for SNFs as specified in the Improving Medicare Post-Acute Care Transformation Act of 2014 (IMPACT Act). It also amends the requirements that a long-term care (LTC) facility must meet to qualify to participate as a skilled nursing facility (SNF) in the Medicare program, or a nursing facility (NF) in the Medicaid program, by establishing requirements that implement the provision in the Affordable Care Act regarding the submission of staffing information based on payroll data.

  1. The Hard Problem of Cooperation

    PubMed Central

    Eriksson, Kimmo; Strimling, Pontus

    2012-01-01

    Based on individual variation in cooperative inclinations, we define the “hard problem of cooperation” as that of achieving high levels of cooperation in a group of non-cooperative types. Can the hard problem be solved by institutions with monitoring and sanctions? In a laboratory experiment we find that the answer is affirmative if the institution is imposed on the group but negative if development of the institution is left to the group to vote on. In the experiment, participants were divided into groups of either cooperative types or non-cooperative types depending on their behavior in a public goods game. In these homogeneous groups they repeatedly played a public goods game regulated by an institution that incorporated several of the key properties identified by Ostrom: operational rules, monitoring, rewards, punishments, and (in one condition) change of rules. When change of rules was not possible and punishments were set to be high, groups of both types generally abided by operational rules demanding high contributions to the common good, and thereby achieved high levels of payoffs. Under less severe rules, both types of groups did worse but non-cooperative types did worst. Thus, non-cooperative groups profited the most from being governed by an institution demanding high contributions and employing high punishments. Nevertheless, in a condition where change of rules through voting was made possible, development of the institution in this direction was more often voted down in groups of non-cooperative types. We discuss the relevance of the hard problem and fit our results into a bigger picture of institutional and individual determinants of cooperative behavior. PMID:22792282

  2. Negotiating the Rules of Engagement: Exploring Perceptions of Dance Technique Learning through Bourdieu's Concept of "Doxa"

    ERIC Educational Resources Information Center

    Rimmer, Rachel

    2017-01-01

    This article presents the findings from a focus group discussion conducted with first year undergraduate dance students in March 2015. The focus group concluded a cycle of action research during which the researcher explored the use of enquiry-based learning approaches to teaching dance technique in higher education. Grounded in transformative and…

  3. How children perceive fractals: Hierarchical self-similarity and cognitive development

    PubMed Central

    Martins, Maurício Dias; Laaha, Sabine; Freiberger, Eva Maria; Choi, Soonja; Fitch, W. Tecumseh

    2014-01-01

    The ability to understand and generate hierarchical structures is a crucial component of human cognition, available in language, music, mathematics and problem solving. Recursion is a particularly useful mechanism for generating complex hierarchies by means of self-embedding rules. In the visual domain, fractals are recursive structures in which simple transformation rules generate hierarchies of infinite depth. Research on how children acquire these rules can provide valuable insight into the cognitive requirements and learning constraints of recursion. Here, we used fractals to investigate the acquisition of recursion in the visual domain, and probed for correlations with grammar comprehension and general intelligence. We compared second (n = 26) and fourth graders (n = 26) in their ability to represent two types of rules for generating hierarchical structures: Recursive rules, on the one hand, which generate new hierarchical levels; and iterative rules, on the other hand, which merely insert items within hierarchies without generating new levels. We found that the majority of fourth graders, but not second graders, were able to represent both recursive and iterative rules. This difference was partially accounted for by second graders’ impairment in detecting hierarchical mistakes, and correlated with between-grade differences in grammar comprehension tasks. Empirically, recursion and iteration also differed in at least one crucial aspect: While the ability to learn recursive rules seemed to depend on the previous acquisition of simple iterative representations, the opposite was not true, i.e., children were able to acquire iterative rules before they acquired recursive representations. These results suggest that the acquisition of recursion in vision follows learning constraints similar to the acquisition of recursion in language, and that both domains share cognitive resources involved in hierarchical processing. PMID:24955884
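The distinction between the two rule types can be made concrete with a small sketch: a recursive rule adds a new hierarchical level, while an iterative rule only inserts items within existing levels. The tree representation and parameter names below are our own, not from the study:

```python
def depth(node):
    """Number of levels below and including this node."""
    return 1 + max((depth(c) for c in node["children"]), default=0)

def size(node):
    """Total number of nodes in the hierarchy."""
    return 1 + sum(size(c) for c in node["children"])

def recursive_step(node, branching=2):
    """Recursive rule: expand every leaf into `branching` children (adds a level)."""
    if node["children"]:
        for c in node["children"]:
            recursive_step(c, branching)
    else:
        node["children"] = [{"children": []} for _ in range(branching)]

def iterative_step(node, extra=1):
    """Iterative rule: insert extra items at existing internal levels (no new level)."""
    if node["children"]:
        node["children"] += [{"children": []} for _ in range(extra)]
        for c in list(node["children"]):
            iterative_step(c, extra)
```

Applying `recursive_step` repeatedly grows the depth without bound; `iterative_step` grows the node count while leaving the depth fixed, mirroring the recursive/iterative contrast the study probes.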

  4. Toward sensor-based context aware systems.

    PubMed

    Sakurai, Yoshitaka; Takada, Kouhei; Anisetti, Marco; Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Tsuruta, Setsuo

    2012-01-01

    This paper proposes a methodology for sensor data interpretation that can combine sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to take a decision. The feasibility of our idea is demonstrated via a case study where a context-reasoning engine has been connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper context of operation of a system and trigger decision-making based on context information.
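The idea of moving uncertainty from events to rules by generating combs of standard Boolean predicates can be sketched as follows; the thresholds and the heartbeat rule are hypothetical, loosely following the paper's case study:

```python
def make_comb(thresholds=(0.25, 0.5, 0.75)):
    """Comb of standard Boolean predicates: one crisp confidence test per tooth."""
    return {t: (lambda ev, t=t: ev["confidence"] >= t) for t in thresholds}

def high_rate_rule(event, comb=None):
    """Hypothetical context rule: trigger on a high-rate heartbeat event whose
    confidence clears the 0.5 comb tooth (event schema is illustrative)."""
    comb = comb or make_comb()
    return event["type"] == "heartbeat_high" and comb[0.5](event)
```

The rule itself stays crisp; the event's uncertainty level is absorbed by choosing which tooth of the comb the rule tests against.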

  5. Wallace State's New Rules of Business: Affirming the Truths of Intentional Transformation

    ERIC Educational Resources Information Center

    Johnson, Mell

    2007-01-01

    Wallace State Community College in Hanceville, Alabama, took the Community College Futures Assembly challenge for the 2006 Bellwether Award from FAST COMPANY's release of "The Rules of Business: Timeless Truths from the Best Minds in Business" to identify its own substantive question for this year's competition: "The New Rules of…

  6. Expert systems for diagnostic purposes, prospected applications to the radar field

    NASA Astrophysics Data System (ADS)

    Filippi, Riccardo

    Expert systems applied to fault diagnosis, particularly electrical circuit troubleshooting, are introduced. Diagnostic systems consisting of sequences of rules of the symptom-disease type (rule-based systems) and systems based upon a physical and functional description of the unit subjected to fault diagnosis are treated. Application of such systems to radar equipment troubleshooting, in particular to the transmitter, is discussed.

  7. 75 FR 652 - Energy Conservation Program: Certification, Compliance, and Enforcement Requirements for Certain...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... manufacturer certification for distribution transformers. DATES: This rule is effective February 4, 2010 except... 2005--Commercial Equipment D. Distribution Transformers E. General Requirements IV. Procedural... distribution transformers that DOE proposed in the July 2006 NOPR. II. Summary of Today's Action DOE adopts...

  8. Relativistic corrections to a generalized sum rule

    NASA Astrophysics Data System (ADS)

    Sinky, H.; Leung, P. T.

    2006-09-01

    Relativistic corrections to a previously established generalized sum rule are obtained using the Foldy-Wouthuysen transformation. This sum rule, derived previously by Wang [Phys. Rev. A 60, 262 (1999)] for a nonrelativistic system, contains both the well-known Thomas-Reiche-Kuhn and Bethe sum rules, for which relativistic corrections have been obtained in the literature. Our results for the generalized formula will be applied to recover several results obtained previously in the literature, as well as to another sum rule whose relativistic corrections will be obtained.
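For orientation, the two classical nonrelativistic limits contained in the generalized sum rule can be written in single-particle form (notation assumed; the relativistic corrections discussed above modify the right-hand sides):

```latex
% Thomas-Reiche-Kuhn sum rule (dipole operator \hat{x}):
\sum_n \left(E_n - E_0\right)\,\bigl|\langle n\,|\,\hat{x}\,|\,0\rangle\bigr|^2 = \frac{\hbar^2}{2m}
% Bethe sum rule (momentum transfer \mathbf{q}):
\sum_n \left(E_n - E_0\right)\,\bigl|\langle n\,|\,e^{i\mathbf{q}\cdot\hat{\mathbf{r}}}\,|\,0\rangle\bigr|^2 = \frac{\hbar^2 q^2}{2m}
```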

  9. A new method of Quickbird own image fusion

    NASA Astrophysics Data System (ADS)

    Han, Ying; Jiang, Hong; Zhang, Xiuying

    2009-10-01

    With the rapid development of remote sensing technology, the means of accessing remote sensing data have become increasingly abundant; the same area can thus yield a large number of multi-temporal image sequences of different resolutions. At present, the main fusion methods are HPF, the IHS transform, PCA, Brovey, the Mallat algorithm, and the wavelet transform. The IHS transform introduces serious spectral distortion, while the Mallat algorithm omits the low-frequency information of the high-spatial-resolution image, so its fusion results show obvious blocking effects. Wavelet multi-scale decomposition handles different scales, directions, details, and edges very well, but different fusion rules and algorithms achieve different effects. This article takes Quickbird own-image fusion as an example, comparing fusion based on the wavelet transform with HVS against fusion based on the wavelet transform with IHS. The results show that the former is better. This paper uses the correlation coefficient, the relative average spectral error index, and other usual indices to evaluate the quality of the fused images.

  10. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation

    PubMed Central

    2018-01-01

    Early detection of power transformer fault is important because it can reduce the maintenance cost of the transformer and it can ensure continuous electricity supply in power systems. Dissolved Gas Analysis (DGA) technique is commonly used to identify oil-filled power transformer fault type but utilisation of artificial intelligence method with optimisation methods has shown convincing results. In this work, a hybrid support vector machine (SVM) with modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest correct identification percentage of faults in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site. PMID:29370230

  11. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation.

    PubMed

    Illias, Hazlee Azil; Zhao Liang, Wee

    2018-01-01

    Early detection of power transformer fault is important because it can reduce the maintenance cost of the transformer and it can ensure continuous electricity supply in power systems. Dissolved Gas Analysis (DGA) technique is commonly used to identify oil-filled power transformer fault type but utilisation of artificial intelligence method with optimisation methods has shown convincing results. In this work, a hybrid support vector machine (SVM) with modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest correct identification percentage of faults in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site.
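A generic sketch of the time-varying acceleration coefficient (TVAC) ingredient of such a PSO optimizer, here minimizing a toy objective; in the paper's setting the objective would instead evaluate the SVM's DGA fault-classification error over its hyperparameters. The coefficient schedules below are typical TVAC values, not taken from the paper:

```python
import random

def tvac_pso(objective, dim, bounds, swarm=20, iters=100,
             c1_range=(2.5, 0.5), c2_range=(0.5, 2.5), w_range=(0.9, 0.4)):
    """PSO with Time-Varying Acceleration Coefficients: the cognitive term c1
    decays while the social term c2 grows as iterations progress."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        frac = t / max(iters - 1, 1)
        c1 = c1_range[0] + (c1_range[1] - c1_range[0]) * frac  # cognitive, decaying
        c2 = c2_range[0] + (c2_range[1] - c2_range[0]) * frac  # social, growing
        w = w_range[0] + (w_range[1] - w_range[0]) * frac      # inertia, decaying
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

The early emphasis on the cognitive term encourages exploration; the late emphasis on the social term pulls the swarm toward the best-found hyperparameter setting.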

  12. Promoter Sequences Prediction Using Relational Association Rule Mining

    PubMed Central

    Czibula, Gabriela; Bocicor, Maria-Iuliana; Czibula, Istvan Gergely

    2012-01-01

    In this paper we are approaching, from a computational perspective, the problem of promoter sequences prediction, an important problem within the field of bioinformatics. As the conditions for a DNA sequence to function as a promoter are not known, machine learning based classification models are still developed to approach the problem of promoter identification in the DNA. We are proposing a classification model based on relational association rules mining. Relational association rules are a particular type of association rules and describe numerical orderings between attributes that commonly occur over a data set. Our classifier is based on the discovery of relational association rules for predicting if a DNA sequence contains or not a promoter region. An experimental evaluation of the proposed model and comparison with similar existing approaches is provided. The obtained results show that our classifier outperforms the existing techniques for identifying promoter sequences, confirming the potential of our proposal. PMID:22563233
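A relational association rule of the kind described, e.g. "attribute a < attribute b holds across the data set", can be checked and mined with a few lines (function names and the minimum-confidence value are illustrative, not from the paper):

```python
import operator

def relational_rule_confidence(records, attr_a, attr_b, relation):
    """Confidence of a relational association rule: the fraction of records in
    which relation(record[attr_a], record[attr_b]) holds."""
    matches = sum(1 for r in records if relation(r[attr_a], r[attr_b]))
    return matches / len(records)

def mine_lt_rules(records, attrs, min_conf=0.9):
    """Discover ordered-attribute rules 'a < b' that hold with high confidence."""
    rules = []
    for a in attrs:
        for b in attrs:
            if a != b:
                conf = relational_rule_confidence(records, a, b, operator.lt)
                if conf >= min_conf:
                    rules.append((a, "<", b, conf))
    return rules
```

A classifier of the paper's kind would mine such rules separately for promoter and non-promoter sequences and assign a new sequence to the class whose rules it satisfies best.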

  13. The Nonsubsampled Contourlet Transform Based Statistical Medical Image Fusion Using Generalized Gaussian Density

    PubMed Central

    Yang, Guocheng; Li, Meiling; Chen, Leiting; Yu, Jie

    2015-01-01

    We propose a novel medical image fusion scheme based on the statistical dependencies between coefficients in the nonsubsampled contourlet transform (NSCT) domain, in which the probability density function of the NSCT coefficients is concisely fitted using a generalized Gaussian density (GGD), and the similarity of two subbands is accurately computed as the Jensen-Shannon divergence of two GGDs. To preserve more useful information from the source images, new fusion rules are developed to combine the subbands of different frequencies. That is, the low frequency subbands are fused by utilizing two activity measures based on the regional standard deviation and Shannon entropy, and the high frequency subbands are merged via weight maps determined by the saliency values of pixels. The experimental results demonstrate that the proposed method significantly outperforms the conventional NSCT based medical image fusion approaches in both visual perception and evaluation indices. PMID:26557871

  14. A circadian rhythm in skill-based errors in aviation maintenance.

    PubMed

    Hobbs, Alan; Williamson, Ann; Van Dongen, Hans P A

    2010-07-01

    In workplaces where activity continues around the clock, human error has been observed to exhibit a circadian rhythm, with a characteristic peak in the early hours of the morning. Errors are commonly distinguished by the nature of the underlying cognitive failure, particularly the level of intentionality involved in the erroneous action. The Skill-Rule-Knowledge (SRK) framework of Rasmussen is used widely in the study of industrial errors and accidents. The SRK framework describes three fundamental types of error, according to whether behavior is under the control of practiced sensori-motor skill routines with minimal conscious awareness; is guided by implicit or explicit rules or expertise; or where the planning of actions requires the conscious application of domain knowledge. Up to now, examinations of circadian patterns of industrial errors have not distinguished between different types of error. Consequently, it is not clear whether all types of error exhibit the same circadian rhythm. A survey was distributed to aircraft maintenance personnel in Australia. Personnel were invited to anonymously report a safety incident and were prompted to describe, in detail, the human involvement (if any) that contributed to it. A total of 402 airline maintenance personnel reported an incident, providing 369 descriptions of human error in which the time of the incident was reported and sufficient detail was available to analyze the error. Errors were categorized using a modified version of the SRK framework, in which errors are categorized as skill-based, rule-based, or knowledge-based, or as procedure violations. An independent check confirmed that the SRK framework had been applied with sufficient consistency and reliability. Skill-based errors were the most common form of error, followed by procedure violations, rule-based errors, and knowledge-based errors. 
The frequency of errors was adjusted for the estimated proportion of workers present at work each hour of the day, and the 24 h pattern of each error type was examined. Skill-based errors exhibited a significant circadian rhythm, being most prevalent in the early hours of the morning. Variation in the frequency of rule-based errors, knowledge-based errors, and procedure violations over the 24 h did not reach statistical significance. The results suggest that during the early hours of the morning, maintenance technicians are at heightened risk of "absent minded" errors involving failures to execute action plans as intended.
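
    The exposure adjustment described above can be sketched as follows; the function and variable names are hypothetical, and the example counts and staffing proportions are illustrative, not the study's data.

```python
# Hypothetical sketch of the exposure adjustment described above:
# error counts per hour are divided by the estimated proportion of
# the workforce present at that hour, yielding a per-worker error rate
# so that hours with skeleton staffing are not undercounted.

def adjusted_error_rates(error_counts, presence_proportion):
    """error_counts: dict hour -> number of reported errors.
    presence_proportion: dict hour -> estimated fraction of staff at work."""
    rates = {}
    for hour, count in error_counts.items():
        exposure = presence_proportion.get(hour, 0.0)
        rates[hour] = count / exposure if exposure > 0 else float("nan")
    return rates

# Example: 8 errors at 03:00 with 20% of staff present is a far higher
# per-worker rate than 10 errors at 14:00 with full staffing.
rates = adjusted_error_rates({3: 8, 14: 10}, {3: 0.2, 14: 1.0})
```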

  15. Automating the Transformational Development of Software. Volume 1.

    DTIC Science & Technology

    1983-03-01

DRACO system [Neighbors 80] uses meta-rules to derive information about which new transformations will be applicable after a particular transformation has...transformation over another. The new model, as incorporated in a system called Glitter, explicitly represents transformation goals, methods, and selection...done anew for each new problem (compare this with Neighbors’ Draco system [Neighbors 80] which attempts to reuse domain analysis). o Is the user

  16. Beam steering performance of compressed Luneburg lens based on transformation optics

    NASA Astrophysics Data System (ADS)

    Gao, Ju; Wang, Cong; Zhang, Kuang; Hao, Yang; Wu, Qun

    2018-06-01

    In this paper, two types of compressed Luneburg lenses based on transformation optics are investigated and simulated using two different sources, namely, waveguides and dipoles, which represent plane and spherical wave sources, respectively. We determined that the largest beam steering angle and the related feed point are intrinsic characteristics of a certain type of compressed Luneburg lens, and that the optimized distance between the feed and lens, gain enhancement, and side-lobe suppression are related to the type of source. Based on our results, we anticipate that these lenses will prove useful in various future antenna applications.

  17. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  18. Compact high voltage, high peak power, high frequency transformer for converter type modulator applications.

    PubMed

    Reghu, T; Mandloi, V; Shrivastava, Purushottam

    2016-04-01

    The design and development of a compact high voltage, high peak power, high frequency transformer for a converter type modulator of klystron amplifiers is presented. The transformer has been designed to operate at a frequency of 20 kHz and at a flux swing of ±0.6 T. Iron (Fe) based nanocrystalline material has been selected as a core for the construction of the transformer. The transformer employs a specially designed solid Teflon bobbin having 120 kV insulation for winding the high voltage secondary windings. The flux swing of the core has been experimentally found by plotting the hysteresis loop at actual operating conditions. Based on the design, a prototype transformer has been built which is per se a unique combination of high voltage, high frequency, and peak power specifications. The transformer was able to provide 58 kV (pk-pk) at the secondary with a peak power handling capability of 700 kVA. The transformation ratio was 1:17. The performance of the transformer is also presented and discussed.

  19. Using the Interactive Learning Environment Aplusix for Teaching and Learning School Algebra: A Research Experiment in a Middle School

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2011-01-01

Most software tools that have been developed with the aim of helping students to learn school algebra have not yet achieved successful results in the classroom. Almost all of them are menu-based systems that provide transformation rules in menus and buttons. Aplusix is a new interactive software tool for learning school algebra. In contrast to…

  20. Intelligent Gearbox Diagnosis Methods Based on SVM, Wavelet Lifting and RBR

    PubMed Central

    Gao, Lixin; Ren, Zhiqiang; Tang, Wenliang; Wang, Huaqing; Chen, Peng

    2010-01-01

    Given the problems in intelligent gearbox diagnosis methods, it is difficult to obtain the desired information and a large enough sample size to study; therefore, we propose the application of various methods for gearbox fault diagnosis, including wavelet lifting, a support vector machine (SVM) and rule-based reasoning (RBR). In a complex field environment, it is less likely for machines to have the same fault; moreover, the fault features can also vary. Therefore, a SVM could be used for the initial diagnosis. First, gearbox vibration signals were processed with wavelet packet decomposition, and the signal energy coefficients of each frequency band were extracted and used as input feature vectors in SVM for normal and faulty pattern recognition. Second, precision analysis using wavelet lifting could successfully filter out the noisy signals while maintaining the impulse characteristics of the fault; thus effectively extracting the fault frequency of the machine. Lastly, the knowledge base was built based on the field rules summarized by experts to identify the detailed fault type. Results have shown that SVM is a powerful tool to accomplish gearbox fault pattern recognition when the sample size is small, whereas the wavelet lifting scheme can effectively extract fault features, and rule-based reasoning can be used to identify the detailed fault type. Therefore, a method that combines SVM, wavelet lifting and rule-based reasoning ensures effective gearbox fault diagnosis. PMID:22399894
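
    As a rough illustration of the feature-extraction step described above (wavelet packet decomposition followed by per-band signal energy coefficients), the following sketch uses a plain Haar wavelet packet purely for concreteness; the paper does not specify the wavelet. The normalised energy vector would be the input feature vector for the SVM.

```python
from math import sqrt

def haar_split(x):
    """One Haar analysis step: approximation and detail half-bands.
    Assumes len(x) is even."""
    a = [(x[i] + x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    return a, d

def wavelet_packet_energies(signal, depth):
    """Decompose `depth` levels (signal length must be a multiple of
    2**depth) and return the normalised energy of each of the 2**depth
    frequency bands, i.e. the candidate SVM feature vector."""
    bands = [list(signal)]
    for _ in range(depth):
        nxt = []
        for band in bands:
            a, d = haar_split(band)
            nxt.extend([a, d])
        bands = nxt
    energies = [sum(c * c for c in band) for band in bands]
    total = sum(energies) or 1.0
    return [e / total for e in energies]
```

    A constant signal concentrates all energy in the lowest band; a vibration signal with a fault-related impulse spreads energy into the higher bands, which is what the classifier keys on.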

  1. Intelligent gearbox diagnosis methods based on SVM, wavelet lifting and RBR.

    PubMed

    Gao, Lixin; Ren, Zhiqiang; Tang, Wenliang; Wang, Huaqing; Chen, Peng

    2010-01-01

    Given the problems in intelligent gearbox diagnosis methods, it is difficult to obtain the desired information and a large enough sample size to study; therefore, we propose the application of various methods for gearbox fault diagnosis, including wavelet lifting, a support vector machine (SVM) and rule-based reasoning (RBR). In a complex field environment, it is less likely for machines to have the same fault; moreover, the fault features can also vary. Therefore, a SVM could be used for the initial diagnosis. First, gearbox vibration signals were processed with wavelet packet decomposition, and the signal energy coefficients of each frequency band were extracted and used as input feature vectors in SVM for normal and faulty pattern recognition. Second, precision analysis using wavelet lifting could successfully filter out the noisy signals while maintaining the impulse characteristics of the fault; thus effectively extracting the fault frequency of the machine. Lastly, the knowledge base was built based on the field rules summarized by experts to identify the detailed fault type. Results have shown that SVM is a powerful tool to accomplish gearbox fault pattern recognition when the sample size is small, whereas the wavelet lifting scheme can effectively extract fault features, and rule-based reasoning can be used to identify the detailed fault type. Therefore, a method that combines SVM, wavelet lifting and rule-based reasoning ensures effective gearbox fault diagnosis.

  2. Design and performance of a pulse transformer based on Fe-based nanocrystalline core.

    PubMed

    Yi, Liu; Xibo, Feng; Lin, Fuchang

    2011-08-01

    A dry-type pulse transformer based on Fe-based nanocrystalline core with a load of 0.88 nF, output voltage of more than 65 kV, and winding ratio of 46 is designed and constructed. The dynamic characteristics of Fe-based nanocrystalline core under the impulse with the pulse width of several microseconds were studied. The pulse width and incremental flux density have an important effect on the pulse permeability, so the pulse permeability is measured under a certain pulse width and incremental flux density. The minimal volume of the toroidal pulse transformer core is determined by the coupling coefficient, the capacitors of the resonant charging circuit, incremental flux density, and pulse permeability. The factors of the charging time, ratio, and energy transmission efficiency in the resonant charging circuit based on magnetic core-type pulse transformer are analyzed. Experimental results of the pulse transformer are in good agreement with the theoretical calculation. When the primary capacitor is 3.17 μF and charge voltage is 1.8 kV, a voltage across the secondary capacitor of 0.88 nF with peak value of 68.5 kV, rise time (10%-90%) of 1.80 μs is obtained.

  3. Safety leadership at construction sites: the importance of rule-oriented and participative leadership.

    PubMed

    Grill, Martin; Pousette, Anders; Nielsen, Kent; Grytnes, Regine; Törner, Marianne

    2017-07-01

Objectives: The construction industry accounted for >20% of all fatal occupational accidents in Europe in 2014. Leadership is an essential antecedent to occupational safety. The aim of the present study was to assess the influence of transformational, active transactional, rule-oriented, participative, and laissez-faire leadership on safety climate, safety behavior, and accidents in the Swedish and Danish construction industry. Sweden and Denmark are similar countries but differ greatly in occupational accident rates. Methods: A questionnaire study was conducted among a random sample of construction workers in both countries: 811 construction workers from 85 sites responded, resulting in site and individual response rates of 73% and 64%, respectively. Results: The results indicated that transformational, active transactional, rule-oriented, and participative leadership predict positive safety outcomes, whereas laissez-faire leadership predicts negative safety outcomes. For example, rule-oriented leadership predicts a superior safety climate (β=0.40, P<0.001), enhanced safety behavior (β=0.15, P<0.001), and fewer accidents [odds ratio (OR) 0.78, 95% confidence interval (95% CI) 0.62-0.98]. The effect of rule-oriented leadership on workers' safety behavior was moderated by the level of participative leadership (β=0.10, P<0.001), suggesting that when rules and plans are established in a collaborative manner, workers' motivation to comply with safety regulations and participate in proactive safety activities is elevated. The influence of leadership behaviors on safety outcomes was largely similar in Sweden and Denmark. Rule-oriented and participative leadership were more common in the Swedish than the Danish construction industry, which may partly explain the difference in occupational accident rates. Conclusions: Applying less laissez-faire leadership and more transformational, active transactional, participative, and rule-oriented leadership appears to be an effective way for construction site managers to improve occupational safety in the industry.

  4. Automatic learning of rules. A practical example of using artificial intelligence to improve computer-based detection of myocardial infarction and left ventricular hypertrophy in the 12-lead ECG.

    PubMed

    Kaiser, W; Faber, T S; Findeis, M

    1996-01-01

The authors developed a computer program that detects myocardial infarction (MI) and left ventricular hypertrophy (LVH) in two steps: (1) by extracting parameter values from a 10-second, 12-lead electrocardiogram, and (2) by classifying the extracted parameter values with rule sets. Every disease has its dedicated set of rules. Hence, there are separate rule sets for anterior MI, inferior MI, and LVH. If at least one rule is satisfied, the disease is said to be detected. The computer program automatically develops these rule sets. A database (learning set) of healthy subjects and patients with MI, LVH, and mixed MI+LVH was used. After defining the rule type, initial limits, and expected quality of the rules (positive predictive value, minimum number of patients), the program creates a set of rules by varying the limits. The general rule type is defined as: disease = lim1l < p1 ≤ lim1u and lim2l < p2 ≤ lim2u and … and limnl < pn ≤ limnu. When defining the rule types, only the parameters (p1 ... pn) that are known as clinical electrocardiographic criteria (amplitudes [mV] of Q, R, and T waves and ST-segment; duration [ms] of Q wave; frontal angle [degrees]) were used. This allowed for submitting the learned rule sets to an independent investigator for medical verification. It also allowed the creation of explanatory texts with the rules. These advantages are not offered by the neurons of a neural network. The learned rules were checked against a test set and the following results were obtained: MI: sensitivity 76.2%, positive predictive value 98.6%; LVH: sensitivity 72.3%, positive predictive value 90.9%. The specificity ratings for MI are better than 98%; for LVH, better than 90%.
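
    The rule evaluation scheme described above (a disease is reported if every limit of at least one rule in its set holds) can be sketched as follows; the parameter names and limits are invented for illustration, not the program's learned values.

```python
# Minimal sketch of threshold rule-set evaluation of the kind described
# above. Each rule maps an ECG parameter name to its (lower, upper] limits;
# a rule fires only if all of its limits are satisfied, and the disease is
# detected if any rule in the set fires. Parameter names and limits below
# are hypothetical, for illustration only.

def rule_fires(rule, params):
    """params must contain every parameter the rule references."""
    return all(lo < params[p] <= hi for p, (lo, hi) in rule.items())

def detect(rule_set, params):
    return any(rule_fires(rule, params) for rule in rule_set)

anterior_mi_rules = [  # hypothetical limits, for illustration only
    {"q_amp_mV": (0.1, 5.0), "q_dur_ms": (40.0, 200.0)},
    {"st_amp_mV": (0.2, 5.0)},
]
```

    Because each rule is a plain conjunction of limits on clinically meaningful parameters, the learned rule set can be printed and reviewed by a cardiologist, which is the interpretability advantage over a neural network that the abstract notes.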

  5. A computational exploration of complementary learning mechanisms in the primate ventral visual pathway.

    PubMed

    Spoerer, Courtney J; Eguchi, Akihiro; Stringer, Simon M

    2016-02-01

In order to develop transformation invariant representations of objects, the visual system must make use of constraints placed upon object transformation by the environment. For example, objects transform continuously from one point to another in both space and time. These two constraints have been exploited separately in order to develop translation and view invariance in a hierarchical multilayer model of the primate ventral visual pathway in the form of continuous transformation learning and temporal trace learning. We show for the first time that these two learning rules can work cooperatively in the model. Using these two learning rules together can support the development of invariance in cells and help maintain object selectivity when stimuli are presented over a large number of locations or when trained separately over a large number of viewing angles. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
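
    The temporal trace learning rule referred to above can be sketched as follows; the exact rule form and constants used in the model are not given in the abstract, so this is a generic trace rule with illustrative parameters.

```python
# A minimal sketch of temporal trace learning of the kind referred to
# above: the postsynaptic term is an exponentially decaying trace of
# recent activity, so a weight comes to associate inputs that occur close
# together in time (e.g. successive transforms of the same object).
# The rule form and constants here are illustrative assumptions.

def trace_learning(inputs, activations, eta=0.5, alpha=0.1):
    """inputs: presynaptic firing rate per timestep.
    activations: postsynaptic firing rate per timestep.
    eta: trace decay; alpha: learning rate. Returns the final weight."""
    w, trace = 0.0, 0.0
    for x, y in zip(inputs, activations):
        trace = (1 - eta) * y + eta * trace   # short-term memory trace
        w += alpha * trace * x                # Hebb-like update on the trace
    return w
```

    Continuous transformation learning, by contrast, needs no trace: it relies on successive transforms activating overlapping input patterns, so a plain Hebbian update suffices.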

  6. Testing the significance of a correlation with nonnormal data: comparison of Pearson, Spearman, transformation, and resampling approaches.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2012-09-01

    It is well known that when data are nonnormally distributed, a test of the significance of Pearson's r may inflate Type I error rates and reduce power. Statistics textbooks and the simulation literature provide several alternatives to Pearson's correlation. However, the relative performance of these alternatives has been unclear. Two simulation studies were conducted to compare 12 methods, including Pearson, Spearman's rank-order, transformation, and resampling approaches. With most sample sizes (n ≥ 20), Type I and Type II error rates were minimized by transforming the data to a normal shape prior to assessing the Pearson correlation. Among transformation approaches, a general purpose rank-based inverse normal transformation (i.e., transformation to rankit scores) was most beneficial. However, when samples were both small (n ≤ 10) and extremely nonnormal, the permutation test often outperformed other alternatives, including various bootstrap tests.
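
    A minimal sketch of the rank-based inverse normal (rankit) transformation followed by a Pearson correlation, as recommended above. The (rank − 0.5)/n rankit convention used here is one common choice and is an assumption; only Python's standard library is used.

```python
from statistics import NormalDist, mean

def rankit(data):
    """Rank-based inverse normal transform: replace each value by
    Phi^{-1}((rank - 0.5) / n). Ties receive their average rank."""
    n = len(data)
    order = sorted(range(n), key=lambda i: data[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and data[order[j + 1]] == data[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1                 # average 1-based rank of the tie run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    inv = NormalDist().inv_cdf
    return [inv((r - 0.5) / n) for r in ranks]

def pearson(x, y):
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Transform both variables to an approximately normal shape, then
# assess Pearson's r on the transformed scores.
x = [1.0, 2.0, 4.0, 8.0, 100.0]   # heavily skewed
y = [2.0, 3.0, 5.0, 9.0, 90.0]
r = pearson(rankit(x), rankit(y))
```

    Because the transform depends only on ranks, the extreme value 100.0 no longer dominates the correlation the way it would for Pearson's r on the raw data.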

  7. Redundancy checking algorithms based on parallel novel extension rule

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai

    2017-05-01

Redundancy checking (RC) is a key knowledge reduction technology. Extension rule (ER) is a reasoning method first presented in 2003 that has been well received. Novel extension rule (NER) is an improved ER-based reasoning method, presented in 2009. In this paper, we first analyse the characteristics of the extension rule, and then present a simple algorithm for redundancy checking based on extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy. Using the aforementioned rule and strategy, we design and implement the RCHER algorithm, which relies on MIMF. Next, we design and implement an RCNER (redundancy checking based on NER) algorithm based on NER. Parallel computing greatly accelerates the NER algorithm, which has weak dependence among tasks when executed. Considering this, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking. Furthermore, we design and implement the RCPNER (redundancy checking based on PNER) and NCPPNER (necessary clause partition based on PNER) algorithms as well. The experimental results show that MIMF significantly accelerates the RCER algorithm on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can approach the number of task decompositions. Comparing NCPPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae grows. Finally, we describe the challenges facing the extension rule and suggest possible solutions.

  8. Hierarchical Leak Detection and Localization Method in Natural Gas Pipeline Monitoring Sensor Networks

    PubMed Central

    Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning

    2012-01-01

    In light of the problems of low recognition efficiency, high false rates and poor localization accuracy in traditional pipeline security detection technology, this paper proposes a type of hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, original monitoring signals are dealt with by wavelet transform technology to extract the single mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate initial recognition results for final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point’s position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method could effectively improve the accuracy of the leak point localization and reduce the undetected rate as well as false alarm rate. PMID:22368464
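
    The localization step can be illustrated with the standard two-sensor time-difference-of-arrival relation for a pipeline; the weighting-by-confidence scheme shown here is an assumption standing in for the paper's evidence-combination weights, and all numbers are illustrative.

```python
# Hedged sketch of time-difference-of-arrival leak localization between two
# sensors a distance L apart along the pipeline. If the leak-generated wave
# travels at speed v and dt = t1 - t2 (arrival time at sensor 1 minus at
# sensor 2), the leak sits at x = (L + v*dt)/2 from sensor 1. Several dt
# estimates are then combined by a weighted average; weighting by a
# per-detection confidence is an illustrative assumption.

def tdoa_position(L, v, dt):
    return (L + v * dt) / 2.0

def weighted_leak_position(L, v, estimates):
    """estimates: list of (dt, weight) pairs, e.g. weights taken from the
    confidence of each detection."""
    total = sum(w for _, w in estimates)
    return sum(w * tdoa_position(L, v, dt) for dt, w in estimates) / total
```

    For example, with L = 1000 m, v = 400 m/s and a leak 300 m from sensor 1, the wave reaches sensor 1 a full second before sensor 2 (dt = -1.0 s), and the formula recovers x = 300 m.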

  9. Hierarchical leak detection and localization method in natural gas pipeline monitoring sensor networks.

    PubMed

    Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning

    2012-01-01

    In light of the problems of low recognition efficiency, high false rates and poor localization accuracy in traditional pipeline security detection technology, this paper proposes a type of hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, original monitoring signals are dealt with by wavelet transform technology to extract the single mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate initial recognition results for final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point's position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method could effectively improve the accuracy of the leak point localization and reduce the undetected rate as well as false alarm rate.

  10. Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.

    PubMed

    Reena Benjamin, J; Jayasree, T

    2018-02-01

    In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The significance of maximum fusion rule applied in dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain of two different modalities (MRI and CT) are collected from whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated based on three main key factors, namely structure preservation, edge preservation, contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details. 
It also reduces redundant details, artifacts, and distortions.
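
    The PCA stage of the fusion scheme can be sketched as follows; this shows only the eigenvalue-decomposition weighting (the shift-invariant wavelet stage is omitted for brevity), and it follows the usual PCA fusion rule rather than the paper's exact cascaded implementation.

```python
# Minimal sketch of PCA-based weighting for fusing two co-registered images
# (here flattened to 1-D pixel lists): form their 2x2 covariance matrix,
# take the leading eigenvector, normalise its components to sum to one,
# and use them as pixel-wise averaging weights. Assumes the two inputs are
# positively correlated, as is typical for images of the same scene.

def pca_fusion_weights(img1, img2):
    n = len(img1)
    m1, m2 = sum(img1) / n, sum(img2) / n
    c11 = sum((a - m1) ** 2 for a in img1) / n
    c22 = sum((b - m2) ** 2 for b in img2) / n
    c12 = sum((a - m1) * (b - m2) for a, b in zip(img1, img2)) / n
    # leading eigenvalue/eigenvector of [[c11, c12], [c12, c22]]
    lam = ((c11 + c22) + ((c11 - c22) ** 2 + 4 * c12 * c12) ** 0.5) / 2
    if c12 != 0:
        v1, v2 = c12, lam - c11
    elif c11 >= c22:
        v1, v2 = 1.0, 0.0
    else:
        v1, v2 = 0.0, 1.0
    s = v1 + v2
    if s == 0:                      # degenerate (perfectly anticorrelated) case
        return 0.5, 0.5
    return v1 / s, v2 / s

def fuse(img1, img2):
    w1, w2 = pca_fusion_weights(img1, img2)
    return [w1 * a + w2 * b for a, b in zip(img1, img2)]
```

    The more informative (higher-variance) source receives the larger weight, which is why PCA preserves average information well; the wavelet stage then adds the directional and phase detail discussed above.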

  11. Fabrication of High-Performance Polymer Bulk-Heterojunction Solar Cells by the Interfacial Modifications III

    DTIC Science & Technology

    2011-04-30

    University of Tennessee) 3. "An ambipolar to n-type transformation in pentacene -based organic field-effect transistors" Org. Electron. 12, 509 (2011...OFETs). An ambipolar to n-type transformation in pentacene -based organic field-effect transistors (OFETs) of Al source-drain electrodes had been...correlated with the interfacial interactions between Al electrodes and pentacene , as characterized by analyzing Near-edge X-ray absorption fine structure

  12. Pushing the rules: effects and aftereffects of deliberate rule violations.

    PubMed

    Wirth, Robert; Pfister, Roland; Foerster, Anna; Huestegge, Lynn; Kunde, Wilfried

    2016-09-01

Most of our daily life is organized around rules and social norms. But what makes rules so special? And what if one were to break a rule intentionally? Can we simply free ourselves from the present set of rules, or do we automatically adhere to them? How do rule violations influence subsequent behavior? To investigate the effects and aftereffects of violating a simple S-R rule, we conducted three experiments that investigated continuous finger-tracking responses on an iPad. Our experiments show that rule violations are distinct from rule-based actions in both response times and movement trajectories: they take longer to initiate and execute, and their movement trajectory is heavily contorted. The data not only show differences between the two types of response (rule-based vs. violation), but also yielded a characteristic pattern of aftereffects in case of rule violations: rule violations do not trigger adaptation effects that render further rule violations less difficult; rather, every rule violation demands renewed effort from the agent. The study represents a first step towards understanding the signature and underlying mechanisms of deliberate rule violations: they cannot be acted out by themselves, but require the activation of the original rule first. Consequently, they are best understood as reformulations of existing rules that are not accessible on their own, but need to be constantly derived from the original rule, with an add-on that might entail an active tendency to steer away from mental representations that reflect (socially) unwanted behavior.

  13. Rule-Based Category Learning in Children: The Role of Age and Executive Functioning

    PubMed Central

    Rabi, Rahel; Minda, John Paul

    2014-01-01

    Rule-based category learning was examined in 4–11 year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight regarding the type of strategy being used to solve the categorization task, demonstrating that the use of the task appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning. PMID:24489658

  14. A hybrid learning method for constructing compact rule-based fuzzy models.

    PubMed

    Zhao, Wanqing; Niu, Qun; Li, Kang; Irwin, George W

    2013-12-01

    The Takagi–Sugeno–Kang-type rule-based fuzzy model has found many applications in different fields; a major challenge is, however, to build a compact model with optimized model parameters which leads to satisfactory model performance. To produce a compact model, most existing approaches mainly focus on selecting an appropriate number of fuzzy rules. In contrast, this paper considers not only the selection of fuzzy rules but also the structure of each rule premise and consequent, leading to the development of a novel compact rule-based fuzzy model. Here, each fuzzy rule is associated with two sets of input attributes, in which the first is used for constructing the rule premise and the other is employed in the rule consequent. A new hybrid learning method combining the modified harmony search method with a fast recursive algorithm is hereby proposed to determine the structure and the parameters for the rule premises and consequents. This is a hard mixed-integer nonlinear optimization problem, and the proposed hybrid method solves the problem by employing an embedded framework, leading to a significantly reduced number of model parameters and a small number of fuzzy rules with each being as simple as possible. Results from three examples are presented to demonstrate the compactness (in terms of the number of model parameters and the number of rules) and the performance of the fuzzy models obtained by the proposed hybrid learning method, in comparison with other techniques from the literature.
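
    The Takagi–Sugeno–Kang inference that such rule-based fuzzy models perform can be sketched as follows; the Gaussian membership functions and the example rule are illustrative assumptions, not structures produced by the proposed hybrid learning method.

```python
from math import exp

# Minimal sketch of first-order TSK fuzzy inference: each rule has Gaussian
# membership functions on its premise attributes and a linear function of
# its consequent attributes; the model output is the firing-strength-
# weighted average of the rule consequents. Note that, as in the paper's
# compact models, the premise and consequent may use different attributes.

def gauss(x, centre, width):
    return exp(-((x - centre) / width) ** 2)

def tsk_output(rules, x):
    """rules: list of (premises, consequent), where premises maps an input
    index to (centre, width) and consequent is (bias, {index: coefficient})."""
    num = den = 0.0
    for premises, (bias, coeffs) in rules:
        strength = 1.0
        for i, (c, w) in premises.items():
            strength *= gauss(x[i], c, w)     # product t-norm over premises
        out = bias + sum(k * x[i] for i, k in coeffs.items())
        num += strength * out
        den += strength
    return num / den if den else 0.0
```

    Compactness then amounts to keeping few rules, few premise attributes, and few consequent terms, which is exactly the mixed-integer structure-selection problem the hybrid method addresses.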

  15. The relationship between second-order false belief and display rules reasoning: the integration of cognitive and affective social understanding.

    PubMed

    Naito, Mika; Seki, Yoshimi

    2009-01-01

    To investigate the relation between cognitive and affective social understanding, Japanese 4- to 8-year-olds received tasks of first- and second-order false beliefs and prosocial and self-presentational display rules. From 6 to 8 years, children comprehended display rules, as well as second-order false belief, using social pressures justifications decreasingly and motivational justifications with embedded perspectives increasingly with age. Although not related to either type of display across ages, second-order tasks were associated with both types of display tasks only at 8 years when examined in each age group. Results suggest that children base their second-order theory of mind and display rules understanding on distinct reasoning until middle childhood, during which time the originally distinct aspects of social understanding are integrated.

  16. Nurses and electronic health records in a Canadian hospital: examining the social organisation and programmed use of digitised nursing knowledge.

    PubMed

    Campbell, Marie L; Rankin, Janet M

    2017-03-01

    Institutional ethnography (IE) is used to examine transformations in a professional nurse's work associated with her engagement with a hospital's electronic health record (EHR) which is being updated to integrate professional caregiving and produce more efficient and effective health care. We review in the technical and scholarly literature the practices and promises of information technology and, especially of its applications in health care, finding useful the more critical and analytic perspectives. Among the latter, scholarship on the activities of economising is important to our inquiry into the actual activities that transform 'things' (in our case, nursing knowledge and action) into calculable information for objective and financially relevant decision-making. Beginning with an excerpt of observational data, we explicate observed nurse-patient interactions, discovering in them traces of institutional ruling relations that the nurse's activation of the EHR carries into the nursing setting. The EHR, we argue, materialises and generalises the ruling relations across institutionally located caregivers; its authorised information stabilises their knowing and acting, shaping health care towards a calculated effective and efficient form. Participating in the EHR's ruling practices, nurses adopt its ruling standpoint; a transformation that we conclude needs more careful analysis and debate. © 2016 Foundation for the Sociology of Health & Illness.

  17. Multi-focus image fusion algorithm using NSCT and MPCNN

    NASA Astrophysics Data System (ADS)

    Liu, Kang; Wang, Lianli

    2018-04-01

Based on the nonsubsampled contourlet transform (NSCT) and a modified pulse coupled neural network (MPCNN), this paper proposes an effective image fusion method. First, the source images are decomposed into low-frequency and high-frequency components using NSCT, and the low-frequency components are fused by regional statistical fusion rules. For the high-frequency components, the spatial frequency (SF) is calculated and input into the MPCNN model to obtain the relevant coefficients according to the fire-mapping image of MPCNN. Finally, the fused image is reconstructed by inverse transformation of the low-frequency and high-frequency components. Compared with the wavelet transform (WT) and the traditional NSCT algorithm, experimental results indicate that the proposed method achieves an improvement in both human visual perception and objective evaluation, showing it to be effective and practical with good performance.

  18. Invariant Visual Object and Face Recognition: Neural and Computational Bases, and a Model, VisNet

    PubMed Central

    Rolls, Edmund T.

    2012-01-01

    Neurophysiological evidence for invariant representations of objects and faces in the primate inferior temporal visual cortex is described. Then a computational approach to how invariant representations are formed in the brain is described that builds on the neurophysiology. A feature hierarchy model in which invariant representations can be built by self-organizing learning based on the temporal and spatial statistics of the visual input produced by objects as they transform in the world is described. VisNet can use temporal continuity in an associative synaptic learning rule with a short-term memory trace, and/or it can use spatial continuity in continuous spatial transformation learning which does not require a temporal trace. The model of visual processing in the ventral cortical stream can build representations of objects that are invariant with respect to translation, view, size, and also lighting. The model has been extended to provide an account of invariant representations in the dorsal visual system of the global motion produced by objects such as looming, rotation, and object-based movement. The model has been extended to incorporate top-down feedback connections to model the control of attention by biased competition in, for example, spatial and object search tasks. The approach has also been extended to account for how the visual system can select single objects in complex visual scenes, and how multiple objects can be represented in a scene. The approach has also been extended to provide, with an additional layer, for the development of representations of spatial scenes of the type found in the hippocampus. PMID:22723777
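    The trace learning rule referred to above can be written, in one common form, as an associative update weighted by an exponentially decaying trace of the postsynaptic activity. The following numpy sketch is illustrative: the parameter values and the weight normalisation step are assumptions, not VisNet's exact implementation.

```python
import numpy as np

def trace_rule_updates(X, alpha=0.1, eta=0.8, w=None):
    """Hebbian learning with a short-term memory trace of the output.

    X: (T, n) sequence of input vectors (e.g. successive transforms of one
    object).  The trace y_bar couples learning across frames, so features
    present throughout a transforming object bind to the same output."""
    T, n = X.shape
    if w is None:
        w = np.ones(n) / n
    y_bar = 0.0
    for x in X:
        y = float(w @ x)                        # postsynaptic activation
        y_bar = (1 - eta) * y + eta * y_bar     # exponentially decaying trace
        w = w + alpha * y_bar * x               # associative update with trace
        w = w / np.linalg.norm(w)               # keep weights bounded
    return w

# Feature 0 is present in every frame; features 1 and 2 alternate
frames = np.array([[1.0, 1.0, 0.0],
                   [1.0, 0.0, 1.0],
                   [1.0, 1.0, 0.0],
                   [1.0, 0.0, 1.0]])
w = trace_rule_updates(frames)
```

After training, the persistent feature carries the largest weight, capturing the invariance-learning intuition: what is constant across an object's transforms is what gets associated.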

  19. Invariant Visual Object and Face Recognition: Neural and Computational Bases, and a Model, VisNet.

    PubMed

    Rolls, Edmund T

    2012-01-01

    Neurophysiological evidence for invariant representations of objects and faces in the primate inferior temporal visual cortex is described. Then a computational approach to how invariant representations are formed in the brain is described that builds on the neurophysiology. A feature hierarchy model in which invariant representations can be built by self-organizing learning based on the temporal and spatial statistics of the visual input produced by objects as they transform in the world is described. VisNet can use temporal continuity in an associative synaptic learning rule with a short-term memory trace, and/or it can use spatial continuity in continuous spatial transformation learning which does not require a temporal trace. The model of visual processing in the ventral cortical stream can build representations of objects that are invariant with respect to translation, view, size, and also lighting. The model has been extended to provide an account of invariant representations in the dorsal visual system of the global motion produced by objects such as looming, rotation, and object-based movement. The model has been extended to incorporate top-down feedback connections to model the control of attention by biased competition in, for example, spatial and object search tasks. The approach has also been extended to account for how the visual system can select single objects in complex visual scenes, and how multiple objects can be represented in a scene. The approach has also been extended to provide, with an additional layer, for the development of representations of spatial scenes of the type found in the hippocampus.

  20. Predictions of Crystal Structure Based on Radius Ratio: How Reliable Are They?

    ERIC Educational Resources Information Center

    Nathan, Lawrence C.

    1985-01-01

    Discussion of crystalline solids in undergraduate curricula often includes the use of radius ratio rules as a method for predicting which type of crystal structure is likely to be adopted by a given ionic compound. Examines this topic, establishing more definitive guidelines for the use and reliability of the rules. (JN)

  1. 24 CFR 5.315 - Content of pet rules: General requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...' content among projects and within individual projects, based on factors such as the size, type, location... not conflict with any applicable State or local law or regulation governing the owning or keeping of pets in dwelling accommodations. (d) Conflict with State or local law. The pet rules adopted by the...

  2. Working Memory Development in Monolingual and Bilingual Children

    ERIC Educational Resources Information Center

    Morales, Julia; Calvo, Alejandra; Bialystok, Ellen

    2013-01-01

    Two studies are reported comparing the performance of monolingual and bilingual children on tasks requiring different levels of working memory. In the first study, 56 5-year-olds performed a Simon-type task that manipulated working memory demands by comparing conditions based on two rules and four rules and manipulated conflict resolution demands…

  3. Web-based Weather Expert System (WES) for Space Shuttle Launch

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Rajkumar, T.

    2003-01-01

    The Web-based Weather Expert System (WES) is a critical module of the Virtual Test Bed development to support 'go/no go' decisions for Space Shuttle operations in the Intelligent Launch and Range Operations program of NASA. The weather rules characterize certain aspects of the environment related to the launching or landing site, the time of day or night, the pad or runway conditions, the mission duration, the runway equipment and the landing type. The expert system rules are derived from weather contingency rules, which were developed over years by NASA. Backward chaining, a goal-directed inference method, is adopted: a particular consequence or goal clause is evaluated first and then chained backward through the rules. Once a rule is satisfied (true), that rule is fired and the decision is expressed. The expert system continuously verifies the rules against the past one hour of weather conditions and makes decisions accordingly. The normal procedure of operations requires a formal pre-launch weather briefing held on Launch minus 1 day, which is a specific weather briefing for all areas of Space Shuttle launch operations. In this paper, the Web-based Weather Expert System of the Intelligent Launch and Range Operations program is presented.
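    The goal-directed evaluation described above can be conveyed by a minimal backward chainer; the weather rules shown are hypothetical stand-ins, not NASA's actual contingency rules.

```python
def backward_chain(goal, rules, facts, depth=10):
    """Return True if `goal` can be derived.  rules: list of
    (premises, conclusion) pairs; facts: set of known atoms."""
    if goal in facts:
        return True
    if depth == 0:
        return False
    for premises, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(p, rules, facts, depth - 1) for p in premises):
            return True   # the rule "fires": every premise was established
    return False

# Hypothetical weather-rule fragment (illustrative only)
rules = [(("wind_ok", "visibility_ok", "no_lightning"), "weather_go"),
         (("wind_below_limit",), "wind_ok")]
facts = {"wind_below_limit", "visibility_ok", "no_lightning"}
```

Evaluation starts from the goal clause ("weather_go") and chains backward until it bottoms out in observed facts, mirroring the inference direction the abstract describes.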

  4. Deductibles in health insurance

    NASA Astrophysics Data System (ADS)

    Dimitriyadis, I.; Öney, Ü. N.

    2009-11-01

    This study is an extension of a simulation study developed to determine ruin probabilities in health insurance. The study concentrates on inpatient and outpatient benefits for customers of varying age bands. Loss distributions are modelled through the Allianz tool pack for different classes of insureds. Premiums at different levels of deductibles are derived in the simulation, and ruin probabilities are computed assuming a linear loading on the premium. The increase in the probability of ruin at high levels of the deductible clearly shows the insufficiency of proportional loading in deductible premiums. The PH-transform pricing rule developed by Wang is analyzed as an alternative. A simple case, where the insured is assumed to be an exponential-utility decision maker while the insurer's pricing rule is a PH-transform, is also treated.
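    Wang's proportional-hazards (PH) transform prices a risk as the integral of the distorted survival function, pi = integral of S(x)**(1/rho) dx with rho >= 1. A numerical sketch, assuming an exponential loss with mean mu (for which the exact PH premium is rho * mu):

```python
import math

def ph_transform_premium(survival, rho, upper=10000.0, steps=200000):
    """Wang's PH premium: midpoint-rule integral of S(x)**(1/rho) over
    [0, upper].  rho >= 1 lifts the survival curve, loading the premium
    for risk; rho = 1 recovers the net (expected-loss) premium."""
    h = upper / steps
    return h * sum(survival((i + 0.5) * h) ** (1.0 / rho) for i in range(steps))

mu, rho = 100.0, 1.5                      # exponential claims with mean 100
S = lambda x: math.exp(-x / mu)           # survival function S(x) = P(X > x)
premium = ph_transform_premium(S, rho)    # exact value is rho * mu = 150
```

Unlike a proportional loading, the PH distortion loads the tail more heavily, which is why it behaves better at high deductibles.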

  5. Belief Function Based Decision Fusion for Decentralized Target Classification in Wireless Sensor Networks

    PubMed Central

    Zhang, Wenyu; Zhang, Zhenjiang

    2015-01-01

    Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and propose a new simple but effective decision fusion rule based on belief function theory. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any type of classifier because the basic belief assignments (BBAs) of each sensor are constructed on the basis of the classifier's training-output confusion matrix and real-time observations. We also derive an explicit global BBA in the fusion center under Dempster's combinational rule, greatly simplifying the decision-making operation in the fusion center and avoiding transmission of the whole BBA structure to the fusion center. Experimental results demonstrate that the proposed fusion rule achieves better fusion accuracy than the naïve Bayes rule and the weighted majority voting rule. PMID:26295399
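    Dempster's rule of combination, the fusion operator named above, can be sketched directly; the two example BBAs below are invented for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts mapping frozenset
    focal elements to masses) with Dempster's rule: intersect focal
    elements, multiply masses, renormalise by the non-conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb            # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors reporting beliefs over the classes {car, truck}
m1 = {frozenset({"car"}): 0.6, frozenset({"car", "truck"}): 0.4}
m2 = {frozenset({"car"}): 0.5, frozenset({"truck"}): 0.3,
      frozenset({"car", "truck"}): 0.2}
fused = dempster_combine(m1, m2)
```

The fused BBA concentrates mass on the hypothesis both sensors support, which is the behaviour the fusion center exploits when making the final class decision.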

  6. Mental Spatial Transformations of Objects and Bodies: Different Developmental Trajectories in Children from 7 to 11 Years of Age

    ERIC Educational Resources Information Center

    Crescentini, Cristiano; Fabbro, Franco; Urgesi, Cosimo

    2014-01-01

    Despite the large body of knowledge on adults suggesting that 2 basic types of mental spatial transformation--namely, object-based and egocentric perspective transformations--are dissociable and specialized for different situations, there is much less research investigating the developmental aspects of such spatial transformation systems. Here, an…

  7. Proposal to designate Methylothermus subterraneus Hirayama et al. 2011 as the type species of the genus Methylothermus. Request for an Opinion.

    PubMed

    Boden, Rich; Oren, Aharon

    2017-09-01

    Methylothermus thermalis, the designated type species of the genus Methylothermus, is not available from culture collections and its nomenclatural type is a patent strain. According to Rule 20a of the International Code of Nomenclature of Prokaryotes, only species whose names are legitimate may serve as types of genera. Therefore, the name Methylothermus and the names of the species Methylothermus thermalis and Methylothermus subterraneus are not validly published and are illegitimate. We therefore submit a Request for an Opinion to the Judicial Commission of the ICSP to consider the later-named Methylothermus subterraneus as the new type species of the genus Methylothermus based on Rule 20e(2).

  8. Skin image retrieval using Gabor wavelet texture feature.

    PubMed

    Ou, X; Pan, W; Zhang, X; Xiao, P

    2016-12-01

    Skin imaging plays a key role in many clinical studies. We have used many skin imaging techniques, including the recently developed capacitive contact skin imaging based on fingerprint sensors. The aim of this study was to develop an effective skin image retrieval technique using the Gabor wavelet transform, which can be used on different types of skin images, with a special focus on skin capacitive contact images. Content-based image retrieval (CBIR) is a useful technology for retrieving stored images from a database by supplying query images. In a typical CBIR system, images are retrieved based on colour, shape, texture, etc. In this study, texture features are used for retrieving skin images, and the Gabor wavelet transform is used for texture feature description and extraction. The results show that Gabor wavelet texture features work efficiently on different types of skin images. Although the Gabor wavelet transform is slower than other image retrieval techniques, such as principal component analysis (PCA) and the grey-level co-occurrence matrix (GLCM), it is the best for retrieving skin capacitive contact images and facial images with different orientations. The Gabor wavelet transform also works well on facial images with different expressions and on skin cancer/disease images. We have developed an effective skin image retrieval method based on the Gabor wavelet transform that is useful for retrieving different types of images, namely digital colour face images, digital colour skin cancer and skin disease images, and particularly greyscale skin capacitive contact images. The Gabor wavelet transform is also potentially useful for face recognition (with different orientations and expressions) and skin cancer/disease diagnosis. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
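    A simplified Gabor texture descriptor of the kind used in CBIR can be sketched with NumPy alone; the kernel parameters and the mean/std feature choice are illustrative assumptions, not the study's settings.

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lam=10.0, gamma=0.5):
    """Real part of a Gabor filter: Gaussian envelope times a cosine carrier
    oriented at angle theta with wavelength lam."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr ** 2 + gamma ** 2 * yr ** 2) / (2 * sigma ** 2))
    return env * np.cos(2 * np.pi * xr / lam)

def gabor_texture_features(img, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean and std of the filter-response magnitude per orientation:
    a common (simplified) Gabor texture descriptor for image retrieval."""
    feats = []
    F = np.fft.fft2(img)
    for th in thetas:
        k = gabor_kernel(theta=th)
        kp = np.zeros_like(img, dtype=float)
        kp[:k.shape[0], :k.shape[1]] = k          # zero-pad kernel to image size
        resp = np.abs(np.fft.ifft2(F * np.fft.fft2(kp)))  # circular convolution
        feats += [resp.mean(), resp.std()]
    return np.array(feats)

rng = np.random.default_rng(1)
feats = gabor_texture_features(rng.random((32, 32)))
```

Retrieval then reduces to ranking database images by the distance (e.g. Euclidean) between their feature vectors and the query's.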

  9. 76 FR 67400 - Capital Project Management

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ...-0030] RIN 2132-AA92 Capital Project Management AGENCY: Federal Transit Administration (FTA), DOT... extending the comment period on its proposed rule for Capital Project Management to December 2, 2011, to...) proposing to transform the current FTA rule for project management oversight into a discrete set of...

  10. Hierarchical charge distribution controls self-assembly process of silk in vitro

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Zhang, Cencen; Liu, Lijie; Kaplan, David L.; Zhu, Hesun; Lu, Qiang

    2015-12-01

    Silk materials with different nanostructures have been developed without an understanding of the inherent transformation mechanism. Here we attempt to reveal the conversion pathways of the various nanostructures and determine the critical regulating factors. The regulating conversion processes influenced by a hierarchical charge distribution were investigated, showing different transformations between molecules, nanoparticles and nanofibers. Various repulsive and compressive forces existed among silk fibroin molecules and aggregates due to the exterior and interior distribution of charge, which further controlled their aggregating and deaggregating behaviors and finally formed nanofibers of different sizes. Synergistic action derived from molecular mobility and concentration could also tune the assembly process and final nanostructures. It is suggested that the complicated silk fibroin assembly processes comply with the same rule based on charge distribution, offering a promising way to develop silk-based materials with designed nanostructures.

  11. The Affordable Care Act: the ethical call for value-based leadership to transform quality.

    PubMed

    Piper, Llewellyn E

    2013-01-01

    Hospitals in America face a daunting and historic challenge starting in 2013 as leadership navigates their organizations toward a new port of call: the Patient Protection and Affordable Care Act. Known as the Affordable Care Act (ACA), the law was signed in March 2010 and held in abeyance pending two pivotal events: the Supreme Court's June 2012 ruling upholding the constitutionality of the ACA, and the 2012 presidential election of Barack Obama, which made it a reality for health care organizations that leadership must now implement the mandates of health care delivery under the ACA. This article addresses the need for value-based leadership to transform the culture of health care organizations in order to navigate these uncharted waters successfully under the unprecedented challenges for change in the delivery of quality health care.

  12. Medicare Program; FY 2016 Hospice Wage Index and Payment Rate Update and Hospice Quality Reporting Requirements. Final rule.

    PubMed

    2015-08-06

    This final rule will update the hospice payment rates and the wage index for fiscal year (FY) 2016 (October 1, 2015 through September 30, 2016), including implementing the last year of the phase-out of the wage index budget neutrality adjustment factor (BNAF). Effective on January 1, 2016, this rule also finalizes our proposals to differentiate payments for routine home care (RHC) based on the beneficiary's length of stay and implement a service intensity add-on (SIA) payment for services provided in the last 7 days of a beneficiary's life, if certain criteria are met. In addition, this rule will implement changes to the aggregate cap calculation mandated by the Improving Medicare Post-Acute Care Transformation Act of 2014 (IMPACT Act), align the cap accounting year for both the inpatient cap and the hospice aggregate cap with the federal fiscal year starting in FY 2017, make changes to the hospice quality reporting program, clarify a requirement for diagnosis reporting on the hospice claim, and discuss recent hospice payment reform research and analyses.

  13. Compiling standardized information from clinical practice: using content analysis and ICF Linking Rules in a goal-oriented youth rehabilitation program.

    PubMed

    Lustenberger, Nadia A; Prodinger, Birgit; Dorjbal, Delgerjargal; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-09-23

    To illustrate how routinely written narrative admission and discharge reports of a rehabilitation program for eight youths with chronic neurological health conditions can be transformed to the International Classification of Functioning, Disability and Health (ICF). First, a qualitative content analysis was conducted by building meaningful units, assigning text segments of the reports to the five elements of the Rehab-Cycle®: goal; assessment; assignment; intervention; evaluation. Second, the meaningful units were linked to the ICF using the refined ICF Linking Rules. With the first step of transformation, the emphasis of the narrative reports changed to a process-oriented interdisciplinary layout, revealing three thematic blocks of goals: mobility, self-care, and mental and social functions. The 95 unique linked ICF codes could be grouped into clinically meaningful goal-centered codes. Between the two independent linkers, the agreement rate improved after complementing the rules with additional agreements. The ICF Linking Rules can be used to compile standardized health information from narrative reports if those reports are structured beforehand. The process requires time and expertise. To implement the ICF into common practice, the findings provide a starting point for reporting rehabilitation that builds upon existing practice and adheres to international standards. Implications for Rehabilitation This study provides evidence that routinely collected health information from rehabilitation practice can be transformed to the International Classification of Functioning, Disability and Health by using the "ICF Linking Rules"; however, this requires time and expertise. The Rehab-Cycle®, including assessments, assignments, goal setting, interventions and goal evaluation, serves as a feasible framework for structuring this rehabilitation program and ensures that the complexity of local practice is appropriately reflected. The refined "ICF Linking Rules" lead to a standardized transformation process for narrative text and thus higher quality with increased transparency. As a next step, the resulting format of goal codes supplemented by goal-clarifying codes could be validated to strengthen the implementation of the International Classification of Functioning, Disability and Health into rehabilitation routine while respecting the variety of clinical practice.

  14. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program that belongs to the class known among software experts as output truth-maintenance-systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial-intelligence software of the rule-based inference-engine type in a case in which data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
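    The consistency check described above can be illustrated with a toy version: each rule's consequences are (variable, value) pairs, and a combination of rules is rejected if it drives one variable to two different values (e.g. a device both on and off). The rule names and variables below are hypothetical.

```python
def consistent(rule_consequences):
    """Check whether the consequences of a set of fired rules can hold
    simultaneously.  Each element is a list of (variable, value) pairs;
    the combination is inconsistent if any variable gets two values."""
    assigned = {}
    for conseq in rule_consequences:
        for var, val in conseq:
            if assigned.setdefault(var, val) != val:
                return False          # e.g. device both "on" and "off"
    return True

# Hypothetical rule consequences
rule_a = [("device", "on")]
rule_b = [("valve", "open"), ("device", "off")]
rule_c = [("valve", "open")]
```

Combinations that fail this check would be pruned from the scenario space before any expensive search, which is the size reduction the TMS provides.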

  15. Alteration of a motor learning rule under mirror-reversal transformation does not depend on the amplitude of visual error.

    PubMed

    Kasuga, Shoko; Kurata, Makiko; Liu, Meigen; Ushiba, Junichi

    2015-05-01

    Humans' sophisticated motor learning system paradoxically interferes with motor performance when visual information is mirror-reversed (MR), because normal movement error correction further aggravates the error. This error-increasing mechanism makes even a simple reaching task difficult to perform, but is overcome by alterations in the error correction rule during the trials. To isolate the factors that trigger learners to change the error correction rule, we manipulated the gain of visual angular errors while participants made arm-reaching movements with mirror-reversed visual feedback, and compared the timing of the rule alteration between groups with normal or reduced gain. Trial-by-trial changes in the visual angular error were tracked to explain the timing of the change in the error correction rule. Under both gain conditions, visual angular errors increased under the MR transformation and suddenly decreased after 3-5 trials of increase. The increase levelled off at amplitudes that differed between the two groups, nearly proportional to the visual gain. The findings suggest that the alteration of the error-correction rule does not depend on the amplitude of visual angular errors and is possibly determined by the number of trials over which the errors increased or by statistical properties of the environment. The current results encourage future intensive studies focusing on the exact rule-change mechanism. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  16. Transactors, Transformers and Beyond. A Multi-Method Development of a Theoretical Typology of Leadership.

    ERIC Educational Resources Information Center

    Pearce, Craig L.; Sims, Henry P., Jr.; Cox, Jonathan F.; Ball, Gail; Schnell, Eugene; Smith, Ken A.; Trevino, Linda

    2003-01-01

    To extend the transactional-transformational model of leadership, four theoretical behavioral types of leadership were developed based on literature review and data from studies of executive behavior (n=253) and subordinate attitudes (n=208). Confirmatory factor analysis of a third data set (n=702) support the existence of four leadership types:…

  17. A blind dual color images watermarking based on IWT and state coding

    NASA Astrophysics Data System (ADS)

    Su, Qingtang; Niu, Yugang; Liu, Xianxi; Zhu, Yu

    2012-04-01

    In this paper, a state-coding based blind watermarking algorithm is proposed to embed a color image watermark into a color host image. The technique of state coding, which makes the state code of a data set equal to the hidden watermark information, is introduced. When embedding the watermark, using the Integer Wavelet Transform (IWT) and the rules of state coding, the R, G and B components of the color image watermark are embedded into the Y, Cr and Cb components of the color host image. Moreover, the rules of state coding are also used to extract the watermark from the watermarked image without resorting to the original watermark or original host image. Experimental results show that the proposed watermarking algorithm not only meets the demands of invisibility and robustness of the watermark, but also performs well compared with the other methods considered in this work.
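    The state-coding idea (force a code computed from the host data to equal the watermark information, then recompute it blindly at extraction) can be illustrated with a parity-based toy; the paper's actual coding over IWT coefficient groups is more elaborate, so this is an assumption-laden reduction.

```python
def embed_bit(coeffs, bit):
    """Force the parity (the 'state code') of an integer coefficient
    group to equal the watermark bit, changing at most one coefficient
    by 1 to keep the perturbation minimal."""
    c = list(coeffs)
    if sum(c) % 2 != bit:
        c[0] += 1            # minimal change flips the group parity
    return c

def extract_bit(coeffs):
    """Blind extraction: recompute the state code; neither the original
    host nor the original watermark is needed."""
    return sum(coeffs) % 2
```

Extraction needs only the (possibly watermarked) coefficients themselves, which is what makes the scheme blind.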

  18. An analytical fuzzy-based approach to L2-gain optimal control of input-affine nonlinear systems using Newton-type algorithm

    NASA Astrophysics Data System (ADS)

    Milic, Vladimir; Kasac, Josip; Novakovic, Branko

    2015-10-01

    This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm combines a recursive chain rule for first- and second-order derivatives, Newton's method, a multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.
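    The pairing of Newton's method with automatic differentiation can be shown in miniature with forward-mode dual numbers (a one-dimensional root-finding toy, not the paper's minimax algorithm): the chain rule is applied recursively as each arithmetic operation propagates a derivative alongside its value.

```python
class Dual:
    """Dual number a + b*eps: carries a value and its first derivative;
    each arithmetic operation applies the product/chain rule."""
    def __init__(self, a, b=0.0):
        self.a, self.b = float(a), float(b)
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.a + o.a, self.b + o.b)
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.a - o.a, self.b - o.b)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)

def newton(f, x0, iters=30):
    """Newton's method with f'(x) obtained automatically by evaluating
    f on the dual number Dual(x, 1)."""
    x = x0
    for _ in range(iters):
        y = f(Dual(x, 1.0))
        x = x - y.a / y.b
    return x

root = newton(lambda x: x * x - 2.0, 1.0)   # converges to sqrt(2)
```

No derivative is coded by hand: seeding the dual part with 1 makes every subsequent operation carry the exact derivative, which is the property the paper's algorithm relies on at much larger scale.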

  19. TOXPERT: An Expert System for Risk Assessment

    PubMed Central

    Soto, R. J.; Osimitz, T. G.; Oleson, A.

    1988-01-01

    TOXPERT is an artificial intelligence based system used to model product safety, toxicology (TOX) and regulatory (REG) decision processes. An expert system shell uses backward chaining rule control to link “marketing approval” goals to the type of product, REG agency, exposure conditions and TOX. Marketing risks are primarily a function of the TOX hazards and exposure potential. The method employed differentiates between REG requirements in goal-seeking control for various types of products. This is accomplished by controlling rule execution through frames defined for each REG agency. In addition, TOXPERT produces classifications of TOX ratings and suggested product labeling. This production rule system uses principles of TOX, REGs, corporate guidelines and internal “rules of thumb.” TOXPERT acts as an advisor for this narrow domain. Its advantages are that it can make routine decisions, freeing professionals' time for more complex problem solving, and provide backup and training.

  20. Mexican Civil-Military Relations: Stability and Strength in an Uncertain Environment

    DTIC Science & Technology

    2011-10-28

    colonial rule ended in 1821, Mexico was an unstable yet independent state until General Porfirio Diaz established a dictatorship in 1876, running the...ruthlessness that he ruled for 34 years. The Mexican people needed stability, and Diaz’ approach to governance offered economic reforms...Factbook (Washington: CSIS Americas Program, 1999), 3. transformed the ruling party’s structure by establishing four “corporate sectors” within

  1. Simulation of land use change in the Three Gorges reservoir area based on CART-CA

    NASA Astrophysics Data System (ADS)

    Yuan, Min

    2018-05-01

    This study proposes a new method to simulate spatiotemporally complex multiple land uses using a classification and regression tree (CART) based CA model. In this model, the classification and regression tree algorithm is used to calculate land-class conversion probabilities, which are combined with neighborhood and random factors to extract cellular transformation rules. In the land dynamics simulation of the Three Gorges reservoir area from 2000 to 2010, the overall Kappa coefficient is 0.8014 and the overall accuracy is 0.8821, and the simulation results are satisfactory.
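    A toy version of the CA update described above: a lookup table stands in for the CART-estimated conversion probabilities, and the neighbourhood factor is the share of target-class cells in a k×k window. The random factor mentioned in the abstract is omitted here to keep the sketch deterministic; class labels and thresholds are illustrative.

```python
import numpy as np

def ca_step(grid, p_convert, target=1, k=3, threshold=0.25):
    """One cellular-automaton update.  p_convert[c] stands in for the
    CART-estimated probability that class c converts to `target`; the
    neighbourhood factor is the fraction of `target` cells in the
    k x k window around each cell."""
    h, w = grid.shape
    r = k // 2
    new = grid.copy()
    for i in range(h):
        for j in range(w):
            if grid[i, j] == target:
                continue                       # already the target class
            win = grid[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            neigh = np.mean(win == target)     # neighbourhood factor
            if p_convert[grid[i, j]] * neigh > threshold:
                new[i, j] = target             # transformation rule fires
    return new

land = np.zeros((5, 5), dtype=int)   # 0 = undeveloped, 1 = built-up (target)
land[:, :2] = 1                      # built-up strip on the left
p_convert = {0: 0.9}                 # stand-in for a CART probability estimate
updated = ca_step(land, p_convert)
```

Cells adjacent to the built-up strip convert while distant cells do not, reproducing the edge-growth behaviour that the combined probability-and-neighbourhood rule is designed to capture.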

  2. Sieve-based relation extraction of gene regulatory networks from biological literature

    PubMed Central

    2015-01-01

    Background Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. Results We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. 
Analysis of distances between different mention types in the text shows that our choice of transforming data into skip-mention sequences is appropriate for detecting relations between distant mentions. Conclusions Linear-chain conditional random fields, along with appropriate data transformations, can be efficiently used to extract relations. The sieve-based architecture simplifies the system as new sieves can be easily added or removed and each sieve can utilize the results of previous ones. Furthermore, sieves with conditional random fields can be trained on arbitrary text data and hence are applicable to a broad range of relation extraction tasks and data domains. PMID:26551454
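    The skip-mention transformation described above can be sketched as follows; the indexing scheme and the example mention stream are illustrative assumptions, not the authors' exact encoding.

```python
def skip_mention_sequences(mentions, max_skip=2):
    """Rewrite a mention sequence as skip-mention sequences: the sequences
    at skip k keep every (k+1)-th mention, so mentions that are k apart in
    the text become neighbours, and a first-order (linear-chain) model can
    score a relation between them."""
    seqs = []
    for k in range(max_skip + 1):
        for start in range(k + 1):
            seqs.append(mentions[start::k + 1])
    return seqs

# Hypothetical mention stream from a sporulation-network sentence
mentions = ["sigF", "activates", "spoIIR", "which", "signals", "sigE"]
seqs = skip_mention_sequences(mentions)
```

At skip 2, "spoIIR" and "sigE" land next to each other, so a linear-chain CRF limited to adjacent labels can still capture their (distant) regulatory relation.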

  3. Sieve-based relation extraction of gene regulatory networks from biological literature.

    PubMed

    Žitnik, Slavko; Žitnik, Marinka; Zupan, Blaž; Bajec, Marko

    2015-01-01

    Relation extraction is an essential procedure in literature mining. It focuses on extracting semantic relations between parts of text, called mentions. Biomedical literature includes an enormous amount of textual descriptions of biological entities, their interactions and results of related experiments. To extract them in an explicit, computer readable format, these relations were at first extracted manually from databases. Manual curation was later replaced with automatic or semi-automatic tools with natural language processing capabilities. The current challenge is the development of information extraction procedures that can directly infer more complex relational structures, such as gene regulatory networks. We develop a computational approach for extraction of gene regulatory networks from textual data. Our method is designed as a sieve-based system and uses linear-chain conditional random fields and rules for relation extraction. With this method we successfully extracted the sporulation gene regulation network in the bacterium Bacillus subtilis for the information extraction challenge at the BioNLP 2013 conference. To enable extraction of distant relations using first-order models, we transform the data into skip-mention sequences. We infer multiple models, each of which is able to extract different relationship types. Following the shared task, we conducted additional analysis using different system settings that resulted in reducing the reconstruction error of bacterial sporulation network from 0.73 to 0.68, measured as the slot error rate between the predicted and the reference network. We observe that all relation extraction sieves contribute to the predictive performance of the proposed approach. Also, features constructed by considering mention words and their prefixes and suffixes are the most important features for higher accuracy of extraction. 
Analysis of distances between different mention types in the text shows that our choice of transforming data into skip-mention sequences is appropriate for detecting relations between distant mentions. Linear-chain conditional random fields, along with appropriate data transformations, can be efficiently used to extract relations. The sieve-based architecture simplifies the system as new sieves can be easily added or removed and each sieve can utilize the results of previous ones. Furthermore, sieves with conditional random fields can be trained on arbitrary text data and hence are applicable to a broad range of relation extraction tasks and data domains.

  4. The role of long-term familiarity and attentional maintenance in short-term memory for timbre.

    PubMed

    Siedenburg, Kai; McAdams, Stephen

    2017-04-01

    We study short-term recognition of timbre using familiar recorded tones from acoustic instruments and unfamiliar transformed tones that do not readily evoke sound-source categories. Participants indicated whether the timbre of a probe sound matched with one of three previously presented sounds (item recognition). In Exp. 1, musicians better recognised familiar acoustic compared to unfamiliar synthetic sounds, and this advantage was particularly large in the medial serial position. There was a strong correlation between correct rejection rate and the mean perceptual dissimilarity of the probe to the tones from the sequence. Exp. 2 compared musicians' and non-musicians' performance with concurrent articulatory suppression, visual interference, and with a silent control condition. Both suppression tasks disrupted performance by a similar margin, regardless of musical training of participants or type of sounds. Our results suggest that familiarity with sound source categories and attention play important roles in short-term memory for timbre, which rules out accounts solely based on sensory persistence.

  5. Implicit transfer of spatial structure in visuomotor sequence learning.

    PubMed

    Tanaka, Kanji; Watanabe, Katsumi

    2014-11-01

    Implicit learning and transfer in sequence learning are essential in daily life. Here, we investigated the implicit transfer of visuomotor sequences following a spatial transformation. In two experiments, participants used trial and error to learn a sequence consisting of several button presses, known as the m×n task (Hikosaka et al., 1995). After this learning session, participants learned another sequence in which the button configuration was spatially transformed in one of the following ways: mirrored, rotated, or randomly rearranged. Our results showed that even when participants were unaware of the transformation rules, accuracy in the transfer session was higher in the mirrored and rotated groups than in the random group (i.e., implicit transfer occurred). Both those who noticed the transformation rules and those who did not (i.e., explicit and implicit transfer instances, respectively) performed faster on the mirrored sequences than on the rotated sequences. Taken together, the present results suggest that people can use their implicit visuomotor knowledge to spatially transform sequences and that implicit transfer is modulated by a transformation cost, similar to that in explicit transfer. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. 40 CFR 35.3565 - Specific cash draw rules for authorized types of assistance from the Fund.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the following rules: (a) Loans—(1) Eligible project costs. A State may draw cash based on the... associated pre-project costs, cash may be drawn immediately upon execution of the loan agreement. (2) Eligible project reimbursement costs. A State may draw cash to reimburse assistance recipients for eligible...

  8. Automatic Assembly of Combined Checking Fixture for Auto-Body Components Based on Fixture Elements Libraries

    NASA Astrophysics Data System (ADS)

    Jiang, Jingtao; Sui, Rendong; Shi, Yan; Li, Furong; Hu, Caiqi

    In this paper, 3-D models of combined fixture elements are designed, classified by their functions, and stored as libraries: supporting elements, jointing elements, basic elements, localization elements, clamping elements, adjusting elements, etc. Automatic assembly of a 3-D combined checking fixture for an auto-body part is then presented based on modularization theory. In the virtual auto-body assembly space, locating-constraint mapping and rule-based assembly reasoning are used to calculate the positions of the modular elements according to the localization points and clamp points of the auto-body part. The auto-body part model is transformed from its own coordinate system into the virtual assembly space by a homogeneous transformation matrix. Automatic assembly of the different functional fixture elements and the auto-body part is implemented with API functions based on the secondary development of UG. Practice has shown the method to be feasible and highly efficient.
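
    The coordinate transformation step can be sketched as follows (a generic illustration with made-up numbers, not the authors' UG/API implementation): a 4x4 homogeneous matrix combines a rotation R and a translation t, mapping points from the part's own coordinate system into the virtual assembly space.

    ```python
    import numpy as np

    # Minimal sketch of a homogeneous transformation: map a point from the
    # part's coordinate system into the assembly space via a 4x4 matrix.
    def homogeneous_matrix(R, t):
        T = np.eye(4)
        T[:3, :3] = R   # rotation block
        T[:3, 3] = t    # translation column
        return T

    def transform_point(T, p):
        ph = np.append(p, 1.0)       # homogeneous coordinates
        return (T @ ph)[:3]

    # Example: rotate 90 degrees about z, then translate by (10, 0, 5)
    Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    T = homogeneous_matrix(Rz, np.array([10.0, 0.0, 5.0]))
    p_assembly = transform_point(T, np.array([1.0, 0.0, 0.0]))
    ```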

  8. Preparation of name and address data for record linkage using hidden Markov models

    PubMed Central

    Churches, Tim; Christen, Peter; Lim, Kim; Zhu, Justin Xi

    2002-01-01

    Background Record linkage refers to the process of joining records that relate to the same entity or event in one or more data collections. In the absence of a shared, unique key, record linkage involves the comparison of ensembles of partially-identifying, non-unique data items between pairs of records. Data items with variable formats, such as names and addresses, need to be transformed and normalised in order to validly carry out these comparisons. Traditionally, deterministic rule-based data processing systems have been used to carry out this pre-processing, which is commonly referred to as "standardisation". This paper describes an alternative approach to standardisation, using a combination of lexicon-based tokenisation and probabilistic hidden Markov models (HMMs). Methods HMMs were trained to standardise typical Australian name and address data drawn from a range of health data collections. The accuracy of the results was compared to that produced by rule-based systems. Results Training of HMMs was found to be quick and did not require any specialised skills. For addresses, HMMs produced equal or better standardisation accuracy than a widely-used rule-based system. However, accuracy was worse when used with simpler name data. Possible reasons for this poorer performance are discussed. Conclusion Lexicon-based tokenisation and HMMs provide a viable and effort-effective alternative to rule-based systems for pre-processing more complex variably formatted data such as addresses. Further work is required to improve the performance of this approach with simpler data such as names. Software which implements the methods described in this paper is freely available under an open source license for other researchers to use and improve. PMID:12482326
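
    The HMM standardisation step can be sketched with a toy Viterbi decoder (the states, probabilities, and tokens below are invented for illustration; the paper's models were trained on real Australian name and address data): each token of an address is assigned the hidden state with the highest joint probability path.

    ```python
    # Toy Viterbi decoding: assign a hidden state (address component) to each
    # token. All probabilities here are invented; unseen tokens get a small
    # smoothing probability.
    def viterbi(tokens, states, start_p, trans_p, emit_p):
        V = [{s: start_p[s] * emit_p[s].get(tokens[0], 1e-6) for s in states}]
        path = {s: [s] for s in states}
        for tok in tokens[1:]:
            V.append({})
            new_path = {}
            for s in states:
                prob, prev = max(
                    (V[-2][ps] * trans_p[ps][s] * emit_p[s].get(tok, 1e-6), ps)
                    for ps in states)
                V[-1][s] = prob
                new_path[s] = path[prev] + [s]
            path = new_path
        best = max(states, key=lambda s: V[-1][s])
        return path[best]

    states = ["number", "street", "type"]
    start_p = {"number": 0.8, "street": 0.1, "type": 0.1}
    trans_p = {"number": {"number": 0.1, "street": 0.8, "type": 0.1},
               "street": {"number": 0.05, "street": 0.35, "type": 0.6},
               "type":   {"number": 0.3, "street": 0.3, "type": 0.4}}
    emit_p = {"number": {"42": 0.9},
              "street": {"main": 0.8},
              "type":   {"st": 0.9}}

    labels = viterbi(["42", "main", "st"], states, start_p, trans_p, emit_p)
    ```

    The decoder labels "42 main st" as number, street-name, street-type, which is the kind of segmentation the standardisation step needs before record comparison.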

  9. Combining High Spatial Resolution Optical and LIDAR Data for Object-Based Image Classification

    NASA Astrophysics Data System (ADS)

    Li, R.; Zhang, T.; Geng, R.; Wang, L.

    2018-04-01

    In order to classify high spatial resolution images more accurately, a hierarchical rule-based object-based classification framework was developed in this research based on a high-resolution image with airborne Light Detection and Ranging (LiDAR) data. The eCognition software is employed to conduct the whole process. In detail, first, the FBSP optimizer (Fuzzy-based Segmentation Parameter) is used to obtain the optimal scale parameters for different land cover types. Second, using the segmented regions as basic units, the classification rules for the various land cover types are established according to the spectral, morphological and texture features extracted from the optical image and the height feature from LiDAR. Third, the object classification results are evaluated using the confusion matrix, overall accuracy and Kappa coefficient. As a result, the method combining an aerial image with airborne LiDAR data shows higher accuracy.
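
    The accuracy assessment step can be sketched as follows (the confusion matrix values are made up): overall accuracy is the trace of the confusion matrix divided by the total count, and Cohen's kappa corrects that figure for chance agreement.

    ```python
    import numpy as np

    # Overall accuracy and Cohen's kappa from a confusion matrix
    # (rows = reference classes, columns = predicted classes).
    def accuracy_and_kappa(cm):
        cm = np.asarray(cm, dtype=float)
        n = cm.sum()
        po = np.trace(cm) / n                       # observed agreement
        pe = (cm.sum(0) * cm.sum(1)).sum() / n**2   # chance agreement
        return po, (po - pe) / (1 - pe)

    # Illustrative 3-class matrix (invented counts)
    cm = [[50, 3, 2],
          [4, 60, 6],
          [1, 5, 69]]
    oa, kappa = accuracy_and_kappa(cm)
    ```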

  10. Region Based CNN for Foreign Object Debris Detection on Airfield Pavement

    PubMed Central

    Cao, Xiaoguang; Wang, Peng; Meng, Cai; Gong, Guoping; Liu, Miaoming; Qi, Jun

    2018-01-01

    In this paper, a novel algorithm based on a convolutional neural network (CNN) is proposed to detect foreign object debris (FOD) using optical imaging sensors. It contains two modules, an improved region proposal network (RPN) and a spatial transformer network (STN) based CNN classifier. In the improved RPN, additional selection rules are designed and deployed to generate fewer, higher-quality candidates. Moreover, the efficiency of the CNN detector is significantly improved by introducing the STN layer. Compared to Faster R-CNN and the single shot multibox detector (SSD), the proposed algorithm achieves better results for FOD detection on airfield pavement in the experiment. PMID:29494524

  11. Region Based CNN for Foreign Object Debris Detection on Airfield Pavement.

    PubMed

    Cao, Xiaoguang; Wang, Peng; Meng, Cai; Bai, Xiangzhi; Gong, Guoping; Liu, Miaoming; Qi, Jun

    2018-03-01

    In this paper, a novel algorithm based on a convolutional neural network (CNN) is proposed to detect foreign object debris (FOD) using optical imaging sensors. It contains two modules, an improved region proposal network (RPN) and a spatial transformer network (STN) based CNN classifier. In the improved RPN, additional selection rules are designed and deployed to generate fewer, higher-quality candidates. Moreover, the efficiency of the CNN detector is significantly improved by introducing the STN layer. Compared to Faster R-CNN and the single shot multibox detector (SSD), the proposed algorithm achieves better results for FOD detection on airfield pavement in the experiment.

  12. Equations for Scoring Rules When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents equations for scoring rules in a diagnostic and/or prognostic artificial-intelligence software system of the rule-based inference-engine type. The equations define a set of metrics that characterize the evaluation of a rule when data required for the antecedent clause(s) of the rule are missing. The metrics include a primary measure denoted the rule completeness metric (RCM) plus a number of subsidiary measures that contribute to the RCM. The RCM is derived from an analysis of a rule with respect to its truth and a measure of the completeness of its input data. The derivation is such that the truth value of an antecedent is independent of the measure of its completeness. The RCM can be used to compare the degree of completeness of two or more rules with respect to a given set of data. Hence, the RCM can be used as a guide to choosing among rules during the rule-selection phase of operation of the artificial-intelligence system.

  13. 76 FR 33794 - Self-Regulatory Organizations; Chicago Stock Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-09

    ... types and indications that are eligible for entry to and accepted by the Matching System. The Exchange... Exchange with the ability to determine on an order type by order type basis which orders and indications... Rule 43.2 relating to the types of orders handled on the CBOE's Screen Based Trading System (``SBT...

  14. Notions of "Generation" in Rhetorical Studies.

    ERIC Educational Resources Information Center

    Young, Richard

    A study of the meanings of "generation," a popular term in current rhetorical jargon, reveals important developments in the art and theory of rhetoric. As now used, it refers without clear distinction to rule-governed, heuristic, and trial-and-error procedures. The rule-governed procedures of transformation grammar are being employed to…

  15. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has relied on a simulation tool and a simple-pattern spider mask. At the early stage of a device, the estimates of the simulation tool are poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations, including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All test patterns were inspected within a few hours, and the mass silicon data were handled by statistical processing rather than by personal judgment. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  16. Transformation of Topic-Specific Professional Knowledge into Personal Pedagogical Content Knowledge through Lesson Planning

    ERIC Educational Resources Information Center

    Stender, Anita; Brückmann, Maja; Neumann, Knut

    2017-01-01

    This study investigates the relationship between two different types of pedagogical content knowledge (PCK): the topic-specific professional knowledge (TSPK) and practical routines, so-called teaching scripts. Based on the Transformation Model of Lesson Planning, we assume that teaching scripts originate from a transformation of TSPK during lesson…

  17. Five types of OECD healthcare systems: empirical results of a deductive classification.

    PubMed

    Böhm, Katharina; Schmid, Achim; Götze, Ralf; Landwehr, Claudia; Rothgang, Heinz

    2013-12-01

    This article classifies 30 OECD healthcare systems according to a deductively generated typology by Rothgang and Wendt [1]. This typology distinguishes three core dimensions of the healthcare system: regulation, financing, and service provision, and three types of actors: state, societal, and private actors. We argue that there is a hierarchical relationship between the three dimensions, led by regulation, followed by financing and finally service provision, where the superior dimension restricts the nature of the subordinate dimensions. This hierarchy rule limits the number of theoretically plausible types to ten. To test our argument, we classify 30 OECD healthcare systems, mainly using OECD Health Data and WHO country reports. The classification results in five system types: the National Health Service, the National Health Insurance, the Social Health Insurance, the Etatist Social Health Insurance, and the Private Health System. All five types belong to the group of healthcare system types considered theoretically plausible. Only Slovenia does not comply with our assumption of a hierarchy among dimensions and typical actors, due to its singular transformation history. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.

  18. An adaptive singular spectrum analysis method for extracting brain rhythms of electroencephalography

    PubMed Central

    Hu, Hai; Guo, Shengxin; Liu, Ran

    2017-01-01

    Artifact removal and rhythm extraction from electroencephalography (EEG) signals are important for portable and wearable EEG recording devices. Incorporating a novel grouping rule, we proposed an adaptive singular spectrum analysis (SSA) method for artifact removal and rhythm extraction. Based on the EEG signal amplitude, the grouping rule adaptively designates the first one or two SSA reconstructed components as artifacts and removes them. The remaining reconstructed components are then grouped according to their peak frequencies in the Fourier transform to extract the desired rhythms. The grouping rule thus enables SSA to adapt to EEG signals containing different levels of artifacts and rhythms. Simulated EEG data based on the Markov Process Amplitude (MPA) EEG model and experimental EEG data in the eyes-open and eyes-closed states were used to verify the adaptive SSA method. Results showed better performance in artifact removal and rhythm extraction compared with wavelet decomposition (WDec) and two other recently reported SSA methods. Features of the alpha rhythms extracted using adaptive SSA were calculated to distinguish between the eyes-open and eyes-closed states. Results showed a higher accuracy (95.8%) than those of the WDec method (79.2%) and the infinite impulse response (IIR) filtering method (83.3%). PMID:28674650
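
    The SSA procedure can be sketched compactly (a generic SSA with a fixed grouping choice, not the paper's adaptive amplitude-based rule): embed the signal into a trajectory matrix, decompose it by SVD, drop the leading component as a slow artifact, and reconstruct the rest by diagonal averaging.

    ```python
    import numpy as np

    # Minimal SSA sketch: remove the dominant slow component from a signal.
    def ssa_denoise(x, window, drop=1):
        n = len(x)
        k = n - window + 1
        # trajectory (Hankel) matrix: columns are lagged windows
        X = np.column_stack([x[i:i + window] for i in range(k)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # reconstruct from all components except the first `drop`
        Xr = (U[:, drop:] * s[drop:]) @ Vt[drop:]
        # diagonal averaging back to a 1-D series
        out = np.zeros(n)
        counts = np.zeros(n)
        for j in range(k):
            out[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        return out / counts

    t = np.linspace(0, 1, 200)
    signal = np.sin(2 * np.pi * 10 * t)   # 10 Hz "alpha-like" rhythm
    drifted = signal + 2 * t              # slow drift as a stand-in artifact
    cleaned = ssa_denoise(drifted, window=40)
    ```

    Dropping the first component removes most of the slow drift while the rhythm, carried by later components, survives the reconstruction.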

  19. 49 CFR 1100.2 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 1100-1129, Rules of General Applicability, establish general rules applicable to all types of proceedings. Other rules in this subchapter establish special rules applicable to particular types of proceedings. When there is a conflict or inconsistency between a rule of general applicability and a special...

  20. Comparison of methods for acid quantification: impact of resist components on acid-generating efficiency

    NASA Astrophysics Data System (ADS)

    Cameron, James F.; Fradkin, Leslie; Moore, Kathryn; Pohlers, Gerd

    2000-06-01

    Chemically amplified deep UV (CA-DUV) positive resists are the enabling materials for manufacture of devices at and below 0.18 micrometer design rules in the semiconductor industry. CA-DUV resists are typically based on a combination of an acid labile polymer and a photoacid generator (PAG). Upon UV exposure, a catalytic amount of a strong Brønsted acid is released and is subsequently used in a post-exposure bake step to deprotect the acid labile polymer. Deprotection transforms the acid labile polymer into a base soluble polymer and ultimately enables positive tone image development in dilute aqueous base. As CA-DUV resist systems continue to mature and are used in increasingly demanding situations, it is critical to develop a fundamental understanding of how robust these materials are. One of the most important factors to quantify is how much acid is photogenerated in these systems at key exposure doses. For the purpose of quantifying photoacid generation, several methods have been devised. These include spectrophotometric methods, ion conductivity methods and, most recently, an acid-base type titration similar to the standard addition method. This paper compares many of these techniques. First, comparisons between the most commonly used acid sensitive dye, tetrabromophenol blue sodium salt (TBPB), and a less common acid sensitive dye, Rhodamine B base (RB), are made in several resist systems. Second, the novel acid-base type titration based on the standard addition method is compared to the spectrophotometric titration method. During these studies, the makeup of the resist system is probed as follows: the photoacid generator and resist additives are varied to understand the impact of each of these resist components on the acid generation process.

  1. A CLIPS expert system for clinical flow cytometry data analysis

    NASA Technical Reports Server (NTRS)

    Salzman, G. C.; Duque, R. E.; Braylan, R. C.; Stewart, C. C.

    1990-01-01

    An expert system is being developed using CLIPS to assist clinicians in the analysis of multivariate flow cytometry data from cancer patients. Cluster analysis is used to find subpopulations representing various cell types in multiple datasets each consisting of four to five measurements on each of 5000 cells. CLIPS facts are derived from results of the clustering. CLIPS rules are based on the expertise of Drs. Stewart, Duque, and Braylan. The rules incorporate certainty factors based on case histories.
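
    The certainty-factor bookkeeping such a rule base must perform can be illustrated with the classic MYCIN-style combination formula (the numeric values below are invented and this is not the authors' rule base; it only shows how evidence from two rules supporting the same conclusion is merged):

    ```python
    # MYCIN-style certainty-factor combination: merge the certainty factors
    # of two rules that bear on the same conclusion.
    def combine_cf(cf1, cf2):
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)          # both support
        if cf1 <= 0 and cf2 <= 0:
            return cf1 + cf2 * (1 + cf1)          # both refute
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # conflicting

    # e.g. two rules both (hypothetically) supporting one cell-type call
    cf = combine_cf(0.6, 0.5)
    ```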

  2. Learning Optimal Individualized Treatment Rules from Electronic Health Record Data

    PubMed Central

    Wang, Yuanjia; Wu, Peng; Liu, Ying; Weng, Chunhua; Zeng, Donglin

    2016-01-01

    Medical research is experiencing a paradigm shift from a “one-size-fits-all” strategy to a precision medicine approach where the right therapy, for the right patient, and at the right time, will be prescribed. We propose a statistical method to estimate the optimal individualized treatment rules (ITRs) that are tailored according to subject-specific features using electronic health records (EHR) data. Our approach merges statistical modeling and medical domain knowledge with machine learning algorithms to assist personalized medical decision making using EHR. We transform the estimation of optimal ITR into a classification problem and account for the non-experimental features of the EHR data and confounding by clinical indication. We create a broad range of feature variables that reflect both patient health status and the healthcare data collection process. Using EHR data collected at the Columbia University clinical data warehouse, we construct a decision tree for choosing the best second-line therapy for treating type 2 diabetes patients. PMID:28503676
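
    The recasting of ITR estimation as a weighted classification problem can be sketched on synthetic data (everything below is illustrative; the paper's method additionally handles confounding by clinical indication and uses rich EHR features): treat the observed treatment as the class label, weighted by the outcome, and pick the decision rule that maximizes weighted agreement.

    ```python
    import numpy as np

    # Synthetic sketch of outcome-weighted classification for ITR estimation.
    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)              # a single patient feature
    a = rng.integers(0, 2, size=n)      # observed treatment, 0 or 1
    # outcome is better when the treatment matches the sign of x
    y = 1 + (2 * a - 1) * np.sign(x) + rng.normal(scale=0.1, size=n)

    # weighted classification: larger outcome -> more evidence the observed
    # treatment was right for this patient
    w = y - y.min()
    # trivial rule family: treat (a=1) iff x > threshold; choose the
    # threshold maximizing outcome-weighted agreement with observed treatments
    thresholds = np.linspace(-2, 2, 81)
    scores = [np.sum(w * (a == (x > t))) for t in thresholds]
    best_t = thresholds[int(np.argmax(scores))]
    ```

    Because the simulated outcome rewards treating exactly when x > 0, the recovered threshold should land near zero.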

  3. Skills, rules and knowledge in aircraft maintenance: errors in context

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.

  4. YouTube, Critical Pedagogy, and Media Activism

    ERIC Educational Resources Information Center

    Kellner, Douglas; Kim, Gooyong

    2010-01-01

    Critical pedagogy believes education to be a form of cultural politics that is fundamental to social transformation aiming to cultivate human agency and transformative activity. The explosion of information and communication technologies (ICTs) has provided ordinary people with unprecedented opportunities to take on the ruling educational power…

  5. An improved association-mining research for exploring Chinese herbal property theory: based on data of the Shennong's Classic of Materia Medica.

    PubMed

    Jin, Rui; Lin, Zhi-jian; Xue, Chun-miao; Zhang, Bing

    2013-09-01

    Knowledge Discovery in Databases is gaining attention and raising new hopes for traditional Chinese medicine (TCM) researchers. It is a useful tool in understanding and deciphering TCM theories. Aiming for a better understanding of Chinese herbal property theory (CHPT), this paper performed improved association rule learning to analyze semistructured text in the book entitled Shennong's Classic of Materia Medica. The text was first annotated and transformed to well-structured multidimensional data. Subsequently, an Apriori algorithm was employed to produce association rules after a sensitivity analysis of its parameters. From the 120 confirmed resulting rules that described the intrinsic relationships between herbal property (qi, flavor and their combinations) and herbal efficacy, two novel fundamental principles underlying CHPT were acquired and further elucidated: (1) the many-to-one mapping of herbal efficacy to herbal property; (2) the nonrandom overlap between the related efficacy of qi and flavor. This work provided innovative knowledge about CHPT, which should be helpful for its modern research.
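
    The support/confidence mining step can be sketched on toy records (the property and efficacy labels below are invented placeholders, not data from the book): a rule LHS → RHS is kept when the itemset is frequent enough and the conditional frequency of RHS given LHS is high enough.

    ```python
    from itertools import combinations

    # Toy transaction records standing in for annotated herb entries.
    records = [
        {"warm", "pungent", "dispel-cold"},
        {"warm", "pungent", "dispel-cold"},
        {"warm", "sweet", "tonify"},
        {"cold", "bitter", "clear-heat"},
        {"cold", "bitter", "clear-heat"},
    ]

    def support(itemset):
        return sum(itemset <= r for r in records) / len(records)

    def rules(min_support=0.3, min_confidence=0.8):
        items = sorted({i for r in records for i in r})
        out = []
        for size in (2, 3):
            for combo in combinations(items, size):
                s = support(set(combo))
                if s < min_support:
                    continue  # Apriori pruning: skip infrequent itemsets
                for k in range(1, size):
                    for lhs in combinations(combo, k):
                        conf = s / support(set(lhs))
                        if conf >= min_confidence:
                            rhs = tuple(i for i in combo if i not in lhs)
                            out.append((lhs, rhs, s, conf))
        return out

    found = rules()
    ```

    On these toy records the miner recovers, for example, that every "pungent" herb is also "warm" (confidence 1.0), the shape of rule the paper extracts between property and efficacy labels.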

  6. An agent-based model of dialect evolution in killer whales.

    PubMed

    Filatova, Olga A; Miller, Patrick J O

    2015-05-21

    The killer whale is one of the few animal species with vocal dialects that arise from socially learned group-specific call repertoires. We describe a new agent-based model of killer whale populations and test a set of vocal-learning rules to assess which mechanisms may lead to the formation of dialect groupings observed in the wild. We tested a null model with genetic transmission and no learning, and ten models with learning rules that differ by template source (mother or matriline), variation type (random errors or innovations) and type of call change (no divergence from kin vs. divergence from kin). The null model without vocal learning did not produce the pattern of group-specific call repertoires we observe in nature. Learning from either mother alone or the entire matriline with calls changing by random errors produced a graded distribution of the call phenotype, without the discrete call types observed in nature. Introducing occasional innovation or random error proportional to matriline variance yielded more or less discrete and stable call types. A tendency to diverge from the calls of related matrilines provided fast divergence of loose call clusters. A pattern resembling the dialect diversity observed in the wild arose only when rules were applied in combinations and similar outputs could arise from different learning rules and their combinations. Our results emphasize the lack of information on quantitative features of wild killer whale dialects and reveal a set of testable questions that can draw insights into the cultural evolution of killer whale dialects. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Mechanical Transformation of Task Heuristics into Operational Procedures

    DTIC Science & Technology

    1981-04-14

    Introduction: A central theme of recent research in artificial intelligence is that intelligent task performance requires large amounts of knowledge. [Davis 77b] R. Davis. Production rules as a representation for a knowledge based consultation system. Artificial Intelligence 8:15-45, Spring 1977.

  8. Gene Ontology synonym generation rules lead to increased performance in biomedical concept recognition.

    PubMed

    Funk, Christopher S; Cohen, K Bretonnel; Hunter, Lawrence E; Verspoor, Karin M

    2016-09-09

    Gene Ontology (GO) terms represent the standard for annotation and representation of molecular functions, biological processes and cellular compartments, but a large gap exists between the way concepts are represented in the ontology and how they are expressed in natural language text. The construction of highly specific GO terms is formulaic, consisting of parts and pieces from more simple terms. We present two different types of manually generated rules to help capture the variation of how GO terms can appear in natural language text. The first set of rules takes into account the compositional nature of GO and recursively decomposes the terms into their smallest constituent parts. The second set of rules generates derivational variations of these smaller terms and compositionally combines all generated variants to form the original term. By applying both types of rules, new synonyms are generated for two-thirds of all GO terms, and an increase in F-measure performance for recognition of GO on the CRAFT corpus from 0.498 to 0.636 is observed. Additionally, we evaluated the combination of both types of rules over one million full text documents from Elsevier; manual validation and error analysis show we are able to recognize GO concepts with reasonable accuracy (88%) based on random sampling of annotations. In this work we present a set of simple synonym generation rules that utilize the highly compositional and formulaic nature of the Gene Ontology concepts. We illustrate how the generated synonyms aid in improving recognition of GO concepts on two different biomedical corpora. We discuss other applications of our rules for GO ontology quality assurance, explore the issue of overgeneration, and provide examples of how similar methodologies could be applied to other biomedical terminologies. Additionally, we provide all generated synonyms for use by the text-mining community.
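
    The compositional synonym generation can be illustrated with a toy sketch (the substitution lists below are invented; the real rules recursively decompose curated GO term parts and apply derivational variants): each constituent part is varied independently and the variants are recombined.

    ```python
    from itertools import product

    # Toy compositional synonym generation: substitute variants for each
    # part of a term and recombine every combination.
    def synonyms(term, variants):
        parts = term.split()
        options = [variants.get(p, [p]) for p in parts]
        return {" ".join(combo) for combo in product(*options)}

    # Invented substitution lists (illustrative only)
    variants = {
        "regulation": ["regulation", "control"],
        "transcription": ["transcription", "transcriptional activity"],
    }
    syns = synonyms("regulation of transcription", variants)
    ```

    Two variants per substitutable part yield four surface forms for the one term, which is how a small rule set can cover many textual realizations.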

  9. An architecture for rule based system explanation

    NASA Technical Reports Server (NTRS)

    Fennel, T. R.; Johannes, James D.

    1990-01-01

    A system architecture is presented which incorporates both graphics and text into explanations provided by rule based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and allowing for more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.

  10. Reinforcement interval type-2 fuzzy controller design by online rule generation and q-value-aided ant colony optimization.

    PubMed

    Juang, Chia-Feng; Hsu, Chia-Hung

    2009-12-01

    This paper proposes a new reinforcement-learning method using online rule generation and Q-value-aided ant colony optimization (ORGQACO) for fuzzy controller design. The fuzzy controller is based on an interval type-2 fuzzy system (IT2FS). The antecedent part in the designed IT2FS uses interval type-2 fuzzy sets to improve controller robustness to noise. There are initially no fuzzy rules in the IT2FS. The ORGQACO concurrently designs both the structure and parameters of an IT2FS. We propose an online interval type-2 rule generation method for the evolution of system structure and flexible partitioning of the input space. Consequent part parameters in an IT2FS are designed using Q-values and the reinforcement local-global ant colony optimization algorithm. This algorithm selects the consequent part from a set of candidate actions according to ant pheromone trails and Q-values, both of which are updated using reinforcement signals. The ORGQACO design method is applied to the following three control problems: 1) truck-backing control; 2) magnetic-levitation control; and 3) chaotic-system control. The ORGQACO is compared with other reinforcement-learning methods to verify its efficiency and effectiveness. Comparisons with type-1 fuzzy systems verify the noise robustness property of using an IT2FS.

  11. HIT: a new approach for hiding multimedia information in text

    NASA Astrophysics Data System (ADS)

    El-Kwae, Essam A.; Cheng, Li

    2002-04-01

    A new technique for hiding multimedia data in text, called the Hiding in Text (HIT) technique, is introduced. The HIT technique can transform any type of media represented by a long binary string into innocuous text that follows correct grammatical rules. This technique divides English words into types where each word can appear in any number of types. For each type, there is a dictionary, which maps words to binary codes. Marker types are special types whose words do not repeat in any other type. Each generated sentence must include at least one word from the marker type. In the hiding phase, a binary string is input to the HIT encoding algorithm, which then selects sentence templates at random. The output is a set of English sentences according to the selected templates and the dictionaries of types. In the retrieving phase, the HIT technique uses the position of the marker word to identify the template used to build each sentence. The proposed technique greatly improves the efficiency and the security features of previous solutions. Examples for hiding text and image information in a cover text are given to illustrate the HIT technique.
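
    The encoding idea can be illustrated with a toy sketch (a single sentence template and tiny invented dictionaries; the real HIT technique selects among many templates at random and uses much larger type dictionaries): each word type maps code bits to words, and the marker word, which appears in no other type, tells the decoder how to parse the sentence back.

    ```python
    # Toy HIT-style encoder/decoder with one fixed template:
    # "The <noun> <verb> <marker>." consumes three bits per sentence.
    noun = {"0": "cat", "1": "dog"}            # type dictionaries: bit -> word
    verb = {"0": "sees", "1": "chases"}
    marker = {"0": "quietly", "1": "quickly"}  # marker words are unique to this type

    def hide(bits):
        sentences = []
        for i in range(0, len(bits), 3):
            b = bits[i:i + 3].ljust(3, "0")    # pad the final group with zeros
            sentences.append(f"The {noun[b[0]]} {verb[b[1]]} {marker[b[2]]}.")
        return " ".join(sentences)

    def reveal(text):
        inv = [{v: k for k, v in d.items()} for d in (noun, verb, marker)]
        bits = ""
        for sent in text.rstrip(".").split(". "):
            words = sent.rstrip(".").split()
            bits += inv[0][words[1]] + inv[1][words[2]] + inv[2][words[3]]
        return bits

    cover = hide("101100")
    ```

    The cover text reads as grammatical English while carrying the original bit string, which `reveal` recovers exactly.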

  12. Angular momentum conservation law in light-front quantum field theory

    DOE PAGES

    Chiu, Kelly Yu-Ju; Brodsky, Stanley J.

    2017-03-31

    We prove the Lorentz invariance of the angular momentum conservation law and the helicity sum rule for relativistic composite systems in the light-front formulation. We explicitly show that j^3, the z-component of the angular momentum, remains unchanged under Lorentz transformations generated by the light-front kinematical boost operators. The invariance of j^3 under Lorentz transformations is a feature unique to the front form. Applying the Lorentz invariance of the angular quantum number in the front form, we obtain a selection rule for the orbital angular momentum which can be used to eliminate certain interaction vertices in QED and QCD. We also generalize the selection rule to any renormalizable theory and show that there exists an upper bound on the change of orbital angular momentum in scattering processes at any fixed order in perturbation theory.

  13. Features of development process displacement of earth’s surface when dredging coal in Eastern Donbas

    NASA Astrophysics Data System (ADS)

    Posylniy, Yu V.; Versilov, S. O.; Shurygin, D. N.; Kalinchenko, V. M.

    2017-10-01

    The results of studies of the earth’s surface displacement process caused by the influence of adjacent longwalls are presented. It is established that the actual distributions of surface subsidence along the dip and the rise of the seam, under the same boundary of the settlement processes, differ both from each other and from the subsidence distribution recommended by the rules of structure protection. Applying a new boundary criterion - a relative subsidence of 0.03 - allows one to pass from two distributions to a single distribution, which still differs from the subsidence distribution of the protection rules. The use of a new geometrical element - a virtual point of the subsidence trough - allows the actual subsidence distribution to be transformed into the model distribution of the structure-protection rules. When the subsidence curves are transformed, the boundary points shift and, consequently, so do the boundary angles.

  16. A Caenorhabditis elegans Wild Type Defies the Temperature–Size Rule Owing to a Single Nucleotide Polymorphism in tra-3

    PubMed Central

    Kammenga, Jan E; Doroszuk, Agnieszka; Riksen, Joost A. G; Hazendonk, Esther; Spiridon, Laurentiu; Petrescu, Andrei-Jose; Tijsterman, Marcel; Plasterk, Ronald H. A; Bakker, Jaap

    2007-01-01

    Ectotherms rely on surrounding temperatures for their body heat. A key question in biology is why most ectotherms mature at a larger size at lower temperatures, a phenomenon known as the temperature–size rule. Since temperature affects virtually all processes in a living organism, current theories to explain this phenomenon are diverse and complex and often start from opposing assumptions. Although widely studied, the molecular genetic control of the temperature–size rule is unknown. We found that the Caenorhabditis elegans wild-type N2 complied with the temperature–size rule, whereas wild-type CB4856 defied it. Using a candidate gene approach based on an N2 × CB4856 recombinant inbred panel in combination with mutant analysis, complementation, and transgenic studies, we show that a single nucleotide polymorphism in tra-3 leads to mutation F96L in the encoded calpain-like protease. This mutation attenuates the ability of CB4856 to grow larger at low temperature. Homology modelling predicts that F96L reduces TRA-3 activity by destabilizing the DII-A domain. The data show that size adaptation of ectotherms to temperature changes may be less complex than previously thought because a subtle wild-type polymorphism modulates the temperature responsiveness of body size. These findings provide a novel step toward the molecular understanding of the temperature–size rule, which has puzzled biologists for decades. PMID:17335351

  17. Effect of Box-Cox transformation on power of Haseman-Elston and maximum-likelihood variance components tests to detect quantitative trait Loci.

    PubMed

    Etzel, C J; Shete, S; Beasley, T M; Fernandez, J R; Allison, D B; Amos, C I

    2003-01-01

    Non-normality of the phenotypic distribution can affect power to detect quantitative trait loci in sib pair studies. Previously, we observed that Winsorizing the sib pair phenotypes increased the power of quantitative trait locus (QTL) detection for both Haseman-Elston (HE) least-squares tests [Hum Hered 2002;53:59-67] and maximum likelihood-based variance components (MLVC) analysis [Behav Genet (in press)]. Winsorizing the phenotypes led to a slight increase in type I error in HE tests and a slight decrease in type I error for MLVC analysis. Herein, we considered transforming the sib pair phenotypes using the Box-Cox family of transformations. Data were simulated for normal and non-normal (skewed and kurtic) distributions. Phenotypic values were replaced by Box-Cox transformed values. Twenty thousand replications were performed for three HE tests of linkage and the likelihood ratio test (LRT), the Wald test and other robust versions based on the MLVC method. We calculated the relative nominal inflation rate as the ratio of the observed empirical type I error divided by the set alpha level (5, 1 and 0.1% alpha levels). MLVC tests applied to non-normal data had inflated type I errors (rate ratio greater than 1.0), which were controlled best by Box-Cox transformation and to a lesser degree by Winsorizing. For example, for non-transformed, skewed phenotypes (derived from a chi2 distribution with 2 degrees of freedom), the rates of empirical type I error with respect to set alpha level=0.01 were 0.80, 4.35 and 7.33 for the original HE test, LRT and Wald test, respectively. For the same alpha level=0.01, these rates were 1.12, 3.095 and 4.088 after Winsorizing and 0.723, 1.195 and 1.905 after Box-Cox transformation. Winsorizing reduced inflated error rates for the leptokurtic distribution (derived from a Laplace distribution with mean 0 and variance 8).
Further, power (adjusted for empirical type I error) at the 0.01 alpha level ranged from 4.7 to 17.3% across all tests using the non-transformed, skewed phenotypes, from 7.5 to 20.1% after Winsorizing and from 12.6 to 33.2% after Box-Cox transformation. Likewise, power (adjusted for empirical type I error) using leptokurtic phenotypes at the 0.01 alpha level ranged from 4.4 to 12.5% across all tests with no transformation, from 7 to 19.2% after Winsorizing and from 4.5 to 13.8% after Box-Cox transformation. Thus the Box-Cox transformation apparently provided the best type I error control and maximal power among the procedures we considered for analyzing a non-normal, skewed distribution (chi2), while Winsorizing worked best for the non-normal, kurtic distribution (Laplace). We repeated the same simulations using a larger sample size (200 sib pairs) and found similar results. Copyright 2003 S. Karger AG, Basel
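
    For reference, the Box-Cox family applied above maps a strictly positive phenotype y to (y^lambda - 1)/lambda for lambda != 0 and to log(y) for lambda = 0. A brief sketch follows; the lambda value is illustrative, since in practice lambda is estimated by maximum likelihood (e.g. via scipy.stats.boxcox):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox power transform for strictly positive data:
    (y**lam - 1) / lam for lam != 0, and log(y) for lam == 0."""
    y = np.asarray(y, dtype=float)
    if np.any(y <= 0):
        raise ValueError("Box-Cox requires strictly positive values")
    return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

# A right-skewed chi-square(2) sample, as in the simulations above, becomes
# much closer to symmetric after the transform. lam = 1/3 is illustrative
# (the cube root is known to approximately normalize chi-square data).
rng = np.random.default_rng(0)
sample = rng.chisquare(df=2, size=5000)
transformed = box_cox(sample, lam=1 / 3)
```

    The continuity at lambda = 0 (the -1/lambda offset and 1/lambda scaling make the power curve approach log(y)) is what lets the maximum-likelihood search over lambda treat the log transform as just another family member.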

  18. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to the cadastral change is put forward, and a coding format composed of a street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rules have been applied to the development of an urban cadastral information system called "ReGIS", which not only generates the cadastral code automatically according to both the type of parcel change and the coding rules, but also checks whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.
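
    The five-part code format can be sketched as follows; the digit widths and the split behavior below are illustrative assumptions, since the paper does not specify field lengths:

```python
def cadastral_code(street, block, father, child=0, grandchild=0):
    """Compose a hierarchical parcel code from kinship parts.
    Digit widths here are illustrative, not those of ReGIS."""
    return f"{street:02d}-{block:03d}-{father:04d}-{child:02d}-{grandchild:02d}"

def split_parcel(street, block, father, n_children):
    """A father parcel split into n child parcels keeps the father code;
    the children receive sequential child codes, preserving kinship."""
    return [cadastral_code(street, block, father, child=i + 1)
            for i in range(n_children)]

codes = split_parcel(3, 12, 57, 2)
# The two sibling parcels share the street/block/father prefix and differ
# only in the child field, so the parcel's lineage is readable from its code.
```

    Spatiotemporal uniqueness checking, as in ReGIS, would then amount to verifying that a newly composed code does not already exist in the database before the parcel record is stored.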

  19. Designing boosting ensemble of relational fuzzy systems.

    PubMed

    Scherer, Rafał

    2010-10-01

    A method frequently used in classification systems for improving classification accuracy is to combine the outputs of several classifiers. Among the various types of classifiers, fuzzy ones are tempting because they use intelligible fuzzy if-then rules. In the paper we build an AdaBoost ensemble of relational neuro-fuzzy classifiers. Relational fuzzy systems bond input and output fuzzy linguistic values by a binary relation; thus, fuzzy rules have additional weights, compared to traditional fuzzy systems: the elements of a fuzzy relation matrix. Thanks to this, the system is better adjustable to data during learning. In the paper an ensemble of relational fuzzy systems is proposed. The problem is that such an ensemble contains separate rule bases which cannot be directly merged. As the systems are separate, we cannot treat fuzzy rules coming from different systems as rules from the same (single) system. In the paper, the problem is addressed by a novel design of the fuzzy systems constituting the ensemble, resulting in normalization of the individual rule bases during learning. The method described in the paper is tested on several known benchmarks and compared with other machine learning solutions from the literature.
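
    The boosting side of the design follows the generic AdaBoost weight-update scheme. A compact sketch with labels in {-1, +1} and a toy decision-stump weak learner standing in for the relational neuro-fuzzy classifiers (the stump and the stopping rule are our simplifications):

```python
import numpy as np

def adaboost_fit(X, y, fit_weak, n_rounds=10):
    """Generic AdaBoost skeleton: fit_weak is any learner that accepts
    sample weights and returns a predict function."""
    w = np.full(len(y), 1.0 / len(y))
    models, alphas = [], []
    for _ in range(n_rounds):
        h = fit_weak(X, y, w)
        err = w[h(X) != y].sum()
        if err <= 0 or err >= 0.5:            # perfect or too weak: stop
            if err <= 0:
                models.append(h); alphas.append(1.0)
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(-alpha * y * h(X))        # up-weight misclassified samples
        w /= w.sum()
        models.append(h); alphas.append(alpha)
    return lambda X_: np.sign(sum(a * h(X_) for a, h in zip(alphas, models)))

def stump(X, y, w):
    """Weighted decision stump on one feature (toy weak learner)."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] >= t, s, -s)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, s)
    _, j, t, s = best
    return lambda X_: np.where(X_[:, j] >= t, s, -s)

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
clf = adaboost_fit(X, y, stump)               # clf(X) reproduces y on this toy set
```

    In the paper's setting, each boosting round would train a relational fuzzy system on the reweighted data, which is why the individual rule bases need the normalization the authors describe before their outputs can be combined.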

  20. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  1. Greek classicism in living structure? Some deductive pathways in animal morphology.

    PubMed

    Zweers, G A

    1985-01-01

    Classical temples in ancient Greece show two deterministic illusionistic principles of architecture, which govern their functional design: geometric proportionalism and a set of illusion-strengthening rules in the proportionalism's "stochastic margin". Animal morphology, in its mechanistic-deductive revival, applies just one architectural principle, which is not always satisfactory. Whether a "Greek Classical" situation occurs in the architecture of living structure is to be investigated by extreme testing with deductive methods. Three deductive methods for the explanation of living structure in animal morphology are proposed: the parts, the compromise, and the transformation deduction. The methods are based upon the systems concept for an organism, the flow chart for a functionalistic picture, and the network chart for a structuralistic picture, whereas the "optimal design" serves as the architectural principle for living structure. These methods clearly show the high explanatory power of deductive methods in morphology, but they also make one open end explicit: neutral issues do exist. Full explanation of living structure requires three entries: functional design within architectural and transformational constraints. The transformational constraint necessarily brings in a stochastic component: an at-random variation serving as a sort of "free management space". This variation must be a variation from the deterministic principle of the optimal design, since any transformation requires space for plasticity in structure and action, and flexibility in role fulfilment. Nevertheless, the question finally arises whether a situation similar to that of Greek Classical temples exists for animal structure. This would mean that the at-random variation found when the optimal design is used to explain structure comprises, apart from a stochastic part, real deviations that form yet another deterministic part.
This deterministic part could be a set of rules that governs actualization in the "free management space".

  2. Identifying new persistent and bioaccumulative organics among chemicals in commerce. III: byproducts, impurities, and transformation products.

    PubMed

    Howard, Philip H; Muir, Derek C G

    2013-05-21

    The goal of this series of studies was to identify commercial chemicals that might be persistent and bioaccumulative (PB) and that were not being considered in current wastewater and aquatic environmental measurement programs. In this study, we focus on chemicals that are not on commercial chemical lists such as U.S. EPA's Inventory Update Rule but may be found as byproducts or impurities in commercial chemicals or are likely transformation products from commercial chemical use. We evaluated the 610 chemicals from our earlier publication as well as high production volume chemicals and identified 320 chemicals (39 byproducts and impurities, and 281 transformation products) that could be potential PB chemicals. Four examples are discussed in detail; these chemicals had a fair amount of information on the commercial synthesis and byproducts and impurities that might be found in the commercial product. Unfortunately for many of the 610 chemicals, as well as the transformation products, little or no information was available. Use of computer-aided software to predict the transformation pathways in combination with the biodegradation rules of thumb and some basic organic chemistry has allowed 281 potential PB transformation products to be suggested for some of the 610 commercial chemicals; more PB transformation products were not selected since microbial degradation often results in less persistent and less bioaccumulative metabolites.

  3. The norms, rules and motivational values driving sustainable remediation of contaminated environments: A study of implementation.

    PubMed

    Prior, Jason

    2016-02-15

    Efforts to achieve sustainability are transforming the norms, rules and values that affect the remediation of contaminated environments. This is altering the ways in which remediation impacts on the total environment. Despite this transformation, few studies have provided systematic insights into the diverse norms and rules that drive the implementation of sustainable remediation at contaminated sites, and no studies have investigated how values motivate compliance with these norms and rules. This study is a systematic analysis of the rules, norms and motivational values embedded in sustainable remediation processes at three sites across Australia, using in-depth interviews conducted with 18 participants between 2011 and 2014, through the application of Crawford and Ostrom's Institutional Grammar and Schwartz's value framework. These approaches offered methods for identifying the rules, norms, and motivational values that guided participants' actions within remediation processes at these sites. The findings identify a core set of 16 norms and 18 rules (sanctions) used by participants to implement sustainable remediation at the sites. These norms and rules define the position of participants within the process, provide means for incorporating sustainability into established remediation practices, and define the scope of outcomes that constitute sustainable remediation. The findings revealed that motivational values focused on public interest and self-interest influenced participants' compliance with norms and rules. The findings also revealed strong interdependence between the norms and rules (sanctions) within the remediation processes and the normative principles operating within the broader domain of environmental management and planning.
The paper concludes with a discussion of: the system of norms operating within sustainable remediation (which far exceed those associated with ESD); their link, through rules (sanctions), to contemporary styles of regulatory enforcement; and the underlying balance of public-interest values and self-interest values that drives participants' involvement in sustainable remediation. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lue Xing; Sun Kun; Wang Pan

    In the framework of Bell-polynomial manipulations, under investigation hereby are three single-field bilinearizable equations: the (1+1)-dimensional shallow water wave model, the Boiti-Leon-Manna-Pempinelli model, and the (2+1)-dimensional Sawada-Kotera model. Based on the concept of scale invariance, a direct and unifying Bell-polynomial scheme is employed to achieve the Bäcklund transformations and Lax pairs associated with those three soliton equations. Note that the Bell-polynomial expressions and Bell-polynomial-typed Bäcklund transformations for those three soliton equations can be, respectively, cast into the bilinear equations and bilinear Bäcklund transformations with symbolic computation. Consequently, it is also shown that the Bell-polynomial-typed Bäcklund transformations can be linearized into the corresponding Lax pairs.

  5. Bäcklund Transformations in 10D SUSY Yang-Mills Theories

    NASA Astrophysics Data System (ADS)

    Gervais, Jean-Loup

    A Bäcklund transformation is derived for the Yang's type (super) equations previously derived (hep-th/9811108) by M. Saveliev and the author, from the ten-dimensional super-Yang-Mills field equations in an on-shell light cone gauge. It is shown to be based upon a particular gauge transformation satisfying nonlinear conditions which ensure that the equations retain the same form. These Yang's type field equations are shown to be precisely such that they automatically provide a solution of these conditions. This Bäcklund transformation is similar to the one proposed by A. Leznov for self-dual Yang-Mills in four dimensions. In the introduction a personal recollection on the birth of supersymmetry is given.

  6. Toy-Based Programming and Children's Knowledge of Products.

    ERIC Educational Resources Information Center

    Burroughs, W. Jeffrey; Ryan, John

    This study explored children's play behavior as it may be influenced by a particular type of television programming, i.e., shows based on commercially available toys. Subjects were two groups of 5- to 6-year-old and 7- to 8-year-old boys who were exposed to a representative program, The Transformers, which features the Transformer toys. Exposure…

  7. Microstructure of warm rolling and pearlitic transformation of ultrafine-grained GCr15 steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Jun-Jie; Lian, Fu-Liang; Liu, Hong-Ji

    2014-09-15

    Pearlitic transformation mechanisms have been investigated in ultra-fine grained GCr15 steel. The ultrafine-grained steel, whose grain size was less than 1 μm, was prepared by thermo-mechanical treatment at 873 K and then annealing at 923 K for 2 h. Pearlitic transformation was conducted by reheating the ultra-fine grained samples at 1073 K and 1123 K for different periods of time and then cooling in air. Scanning electron microscope observation shows that normal lamellar pearlite cannot be formed (granular cementite and ferrite form instead) when the grain size is less than approximately 4 (± 0.6) μm, which yields a critical grain size for normal lamellar pearlitic transformation in this chromium-alloyed steel. The result confirms that grain size has a great influence on pearlitic transformation by increasing the diffusion rate of carbon atoms in the ultra-fine grained steel, and the addition of chromium does not change this pearlitic phase transformation rule. Meanwhile, the grain growth rate is reduced by chromium alloying, which is beneficial for forming fine grains during austenitizing, thus facilitating pearlitic transformation by divorced eutectoid transformation. Moreover, chromium can form a relatively high gradient at the frontier of the undissolved carbide, which promotes carbide formation there, i.e., chromium promotes divorced eutectoid transformation. - Highlights: • Ultrafine-grained GCr15 steel was obtained by warm rolling and annealing technology. • Reduction of grain size changes the pearlite morphology from lamellar to granular. • Adding Cr does not change the normal pearlitic phase transformation rule in UFG steel. • Cr carbide resists grain growth and facilitates pearlitic transformation by DET.

  8. Hospital-based epidemiological and clinical characterisation of the malignant transformation of oral leukoplakia in a Chinese population.

    PubMed

    Lyu, Ming-Yue; Guo, Yu-Si; Li, Shuo; Yang, Di; Hua, Hong

    2017-08-01

    The aim of this review was to analyse, systematically, hospital-based epidemiological information concerning the malignant transformation rate (MTR) of oral leukoplakia (OL) in a Chinese population, as well as the associated risk factors. Four electronic databases were searched for studies dealing with OL and related risk factors, including age, gender, type of lesion, site, and smoking and drinking habits. The MTR of OL in the hospital-based Chinese population ranged from 4% to 13%, based on the studies analysed. Regarding risk factors, we found that female patients had a higher MTR than male patients, and that patients older than 50 years of age also had a higher MTR. Patients who smoked had a lower MTR, while alcohol consumption seemed to have no association with MTR. Malignant transformation occurred most commonly on the tongue. Regarding lesion type, non-homogeneous OL had a higher MTR, with the granular type having the highest MTR. Our results regarding the epidemiology of OL showed a similar trend to those reported in western populations and provided preliminary epidemiological information on the Chinese population. Our findings show that female gender, age >50 years and non-homogeneous OL are risk factors for malignant transformation. It is important to develop clinical strategies to educate, diagnose and treat patients with OL and to minimise the MTR of OL. © 2017 FDI World Dental Federation.

  9. Recognition of multiple imbalanced cancer types based on DNA microarray data using ensemble classifiers.

    PubMed

    Yu, Hualong; Hong, Shufang; Yang, Xibei; Ni, Jun; Dan, Yuanyuan; Qin, Bin

    2013-01-01

    DNA microarray technology can measure the activities of tens of thousands of genes simultaneously, which provides an efficient way to diagnose cancer at the molecular level. Although this strategy has attracted significant research attention, most studies neglect an important problem, namely, that most DNA microarray datasets are skewed, which causes traditional learning algorithms to produce inaccurate results. Some studies have considered this problem, yet they merely focus on the binary-class problem. In this paper, we dealt with the multiclass imbalanced classification problem, as encountered in cancer DNA microarrays, by using ensemble learning. We utilized a one-against-all coding strategy to transform the multiclass problem into multiple binary-class problems, applying to each the feature-subspace method, an evolving version of random subspace that generates multiple diverse training subsets. Next, we introduced one of two different correction technologies, namely, decision-threshold adjustment or random undersampling, into each training subset to alleviate the damage of class imbalance. Specifically, a support vector machine was used as the base classifier, and a novel voting rule called counter voting was presented for making the final decision. Experimental results on eight skewed multiclass cancer microarray datasets indicate that, unlike many traditional classification approaches, our methods are insensitive to class imbalance.
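
    The pipeline described above (one-against-all decomposition, random feature subspaces, and random undersampling) can be sketched as follows; a nearest-centroid toy learner stands in for the SVM base classifier, and a soft margin vote approximates the paper's counter-voting rule:

```python
import numpy as np

rng = np.random.default_rng(1)

def centroid_fit(X, y01):
    """Toy base learner (nearest centroid); the paper uses SVMs."""
    return X[y01 == 1].mean(axis=0), X[y01 == 0].mean(axis=0)

def centroid_score(model, X):
    """Signed margin, positive when a sample lies closer to the positive
    (single-class) centroid than to the rest-centroid."""
    pos, neg = model
    return np.linalg.norm(X - neg, axis=1) - np.linalg.norm(X - pos, axis=1)

def fit_ensemble(X, y, n_classes, n_subspaces=5, frac=0.5):
    """One-against-all decomposition + random feature subspaces + random
    undersampling of the majority side of each binary problem."""
    ensemble, n_feats = [], max(1, int(frac * X.shape[1]))
    for c in range(n_classes):
        y01 = (y == c).astype(int)
        pos = np.flatnonzero(y01 == 1)
        for _ in range(n_subspaces):
            feats = rng.choice(X.shape[1], size=n_feats, replace=False)
            neg = rng.choice(np.flatnonzero(y01 == 0),
                             size=min(int((y01 == 0).sum()), len(pos)),
                             replace=False)              # undersample the rest
            idx = np.concatenate([pos, neg])
            ensemble.append((c, feats, centroid_fit(X[idx][:, feats], y01[idx])))
    return ensemble

def predict(ensemble, X, n_classes):
    """Accumulate per-class margins and pick the class with the largest vote
    (a simplification of the paper's counter-voting rule)."""
    votes = np.zeros((len(X), n_classes))
    for c, feats, model in ensemble:
        votes[:, c] += centroid_score(model, X[:, feats])
    return votes.argmax(axis=1)

# Synthetic skewed 3-class data: 100/30/10 samples per class.
sizes, means = [100, 30, 10], [0.0, 4.0, 8.0]
X = np.vstack([m + rng.normal(0, 0.5, (n, 10)) for m, n in zip(means, sizes)])
y = np.concatenate([np.full(n, c) for c, n in enumerate(sizes)])
acc = (predict(fit_ensemble(X, y, 3), X, 3) == y).mean()
```

    Undersampling each binary problem down to the size of its minority ("one") class is what keeps the 10-sample class from being swamped, and the subspace randomization supplies the diversity the ensemble vote relies on.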

  10. Fisheries regulatory regimes and resilience to climate change.

    PubMed

    Ojea, Elena; Pearlman, Isaac; Gaines, Steven D; Lester, Sarah E

    2017-05-01

    Climate change is already producing ecological, social, and economic impacts on fisheries, and these effects are expected to increase in frequency and magnitude in the future. Fisheries governance and regulations can alter socio-ecological resilience to climate change impacts via harvest control rules and incentives driving fisher behavior, yet there are no syntheses or conceptual frameworks for examining how institutions and their regulatory approaches can alter fisheries resilience to climate change. We identify nine key climate resilience criteria for fisheries socio-ecological systems (SES), defining resilience as the ability of the coupled system of interacting social and ecological components (i.e., the SES) to absorb change while avoiding transformation into a different undesirable state. We then evaluate the capacity of four fisheries regulatory systems that vary in their degree of property rights, including open access, limited entry, and two types of rights-based management, to increase or inhibit resilience. Our exploratory assessment of evidence in the literature suggests that these regulatory regimes vary widely in their ability to promote resilient fisheries, with rights-based approaches appearing to offer more resilience benefits in many cases, but detailed characteristics of the regulatory instruments are fundamental.

  11. Sponsors' and investigative staffs' perceptions of the current investigational new drug safety reporting process in oncology trials.

    PubMed

    Perez, Raymond; Archdeacon, Patrick; Roach, Nancy; Goodwin, Robert; Jarow, Jonathan; Stuccio, Nina; Forrest, Annemarie

    2017-06-01

    The Food and Drug Administration's final rule on investigational new drug application safety reporting, effective from 28 March 2011, clarified the reporting requirements for serious and unexpected suspected adverse reactions occurring in clinical trials. The Clinical Trials Transformation Initiative released recommendations in 2013 to assist implementation of the final rule; however, anecdotal reports and data from a Food and Drug Administration audit indicated that a majority of reports being submitted were still uninformative and did not result in actionable changes. Clinical Trials Transformation Initiative investigated remaining barriers and potential solutions to full implementation of the final rule by polling and interviewing investigators, clinical research staff, and sponsors. In an opinion-gathering effort, two discrete online surveys designed to assess challenges and motivations related to management of expedited (7- to 15-day) investigational new drug safety reporting processes in oncology trials were developed and distributed to two populations: investigators/clinical research staff and sponsors. Data were collected for approximately 1 year. Twenty hour-long interviews were also conducted with Clinical Trials Transformation Initiative-nominated interview participants who were considered to have extensive knowledge of and experience with the topic. Interviewees included 13 principal investigators/study managers/research team members and 7 directors/vice presidents of pharmacovigilance operations from 5 large global pharmaceutical companies. The investigative site's responses indicate that too many individual reports are still being submitted, which are time-consuming to process and provide little value for patient safety assessments or for informing actionable changes. Fewer but higher quality reports would be more useful, and the investigator and staff would benefit from sponsors' "filtering" of reports and increased sponsor communication.
Sponsors replied that their greatest challenges include (1) lack of global harmonization in reporting rules, (2) determining causality, and (3) fear of regulatory repercussions. Interaction with the Food and Drug Administration has helped improve sponsors' adherence to the final rule, and sponsors would benefit from increased communication with the Food and Drug Administration and educational materials. The goal of the final rule is to minimize uninformative safety reports so that important safety signals can be captured and communicated early enough in a clinical program to make changes that help ensure patient safety. Investigative staff and sponsors acknowledge that the rule has not been fully implemented although they agree with the intention. Clinical Trials Transformation Initiative will use the results from the surveys and interviews to develop new recommendations and educational materials that will be available to sponsors to increase compliance with the final rule and facilitate discussion between sponsors, investigators, and Food and Drug Administration representatives.

  12. Sponsors’ and investigative staffs' perceptions of the current investigational new drug safety reporting process in oncology trials

    PubMed Central

    Perez, Raymond; Archdeacon, Patrick; Roach, Nancy; Goodwin, Robert; Jarow, Jonathan; Stuccio, Nina; Forrest, Annemarie

    2017-01-01

    Background/aims: The Food and Drug Administration’s final rule on investigational new drug application safety reporting, effective from 28 March 2011, clarified the reporting requirements for serious and unexpected suspected adverse reactions occurring in clinical trials. The Clinical Trials Transformation Initiative released recommendations in 2013 to assist implementation of the final rule; however, anecdotal reports and data from a Food and Drug Administration audit indicated that a majority of reports being submitted were still uninformative and did not result in actionable changes. Clinical Trials Transformation Initiative investigated remaining barriers and potential solutions to full implementation of the final rule by polling and interviewing investigators, clinical research staff, and sponsors. Methods: In an opinion-gathering effort, two discrete online surveys designed to assess challenges and motivations related to management of expedited (7- to 15-day) investigational new drug safety reporting processes in oncology trials were developed and distributed to two populations: investigators/clinical research staff and sponsors. Data were collected for approximately 1 year. Twenty-hour-long interviews were also conducted with Clinical Trials Transformation Initiative–nominated interview participants who were considered as having extensive knowledge of and experience with the topic. Interviewees included 13 principal investigators/study managers/research team members and 7 directors/vice presidents of pharmacovigilance operations from 5 large global pharmaceutical companies. Results: The investigative site’s responses indicate that too many individual reports are still being submitted, which are time-consuming to process and provide little value for patient safety assessments or for informing actionable changes. 
Fewer but higher quality reports would be more useful, and investigators and staff would benefit from sponsors’ “filtering” of reports and increased sponsor communication. Sponsors replied that their greatest challenges include (1) lack of global harmonization in reporting rules, (2) determining causality, and (3) fear of regulatory repercussions. Interaction with the Food and Drug Administration has helped improve sponsors’ adherence to the final rule, and sponsors would benefit from increased communication with the Food and Drug Administration and educational materials. Conclusion: The goal of the final rule is to minimize uninformative safety reports so that important safety signals can be captured and communicated early enough in a clinical program to make changes that help ensure patient safety. Investigative staff and sponsors acknowledge that the rule has not been fully implemented although they agree with its intention. Clinical Trials Transformation Initiative will use the results from the surveys and interviews to develop new recommendations and educational materials that will be available to sponsors to increase compliance with the final rule and facilitate discussion between sponsors, investigators, and Food and Drug Administration representatives. PMID:28345368

  13. Cellular automata rule characterization and classification using texture descriptors

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Ribas, Lucas C.; Scabini, Leonardo F. S.; Bruno, Odemir M.

    2018-05-01

    Cellular automata (CA) spatio-temporal patterns have attracted the attention of many researchers because they exhibit emergent behavior resulting from the dynamics of each individual cell. In this manuscript, we propose a texture image analysis approach to characterize and classify CA rules. The proposed method converts the CA spatio-temporal pattern into a gray-scale image. The gray-scale value is obtained by creating a binary number based on the 8-connected neighborhood of each cell of the CA spatio-temporal pattern. We demonstrate that this technique enhances CA rule characterization and allows different texture image analysis algorithms to be used. Various texture descriptors were thus evaluated in a supervised training approach aiming to characterize the CA's global evolution. Our results show the efficiency of the proposed method for the classification of elementary CA (ECAs), reaching a maximum accuracy of 99.57% under the Li-Packard scheme (6 classes) and 94.36% for classification under the 88-rules scheme. Moreover, within the image analysis context, we found that the method performs better when the binary states are transformed to gray-scale.
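    The 8-connected-neighborhood encoding described above can be sketched as follows; the bit order over the neighbours is an assumption, since the abstract does not specify it:

    ```python
    def ca_to_grayscale(pattern):
        """Convert a binary CA spatio-temporal pattern (2D list of 0/1,
        rows = time steps) to a gray-scale image: each pixel is the 8-bit
        number formed by its 8-connected neighbourhood (scan order is a
        hypothetical choice; out-of-bounds neighbours count as 0)."""
        rows, cols = len(pattern), len(pattern[0])
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]
        image = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                value = 0
                for bit, (dr, dc) in enumerate(offsets):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and pattern[rr][cc]:
                        value |= 1 << bit
                image[r][c] = value
        return image
    ```

    The resulting 0–255 image can then be fed to any standard texture descriptor, which is the point of the transformation.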

  14. Is the Factor-of-2 Rule Broadly Applicable for Evaluating the Prediction Accuracy of Metal-Toxicity Models?

    PubMed

    Meyer, Joseph S; Traudt, Elizabeth M; Ranville, James F

    2018-01-01

    In aquatic toxicology, a toxicity-prediction model is generally deemed acceptable if its predicted median lethal concentrations (LC50 values) or median effect concentrations (EC50 values) are within a factor of 2 of their paired, observed LC50 or EC50 values. However, that rule of thumb is based on results from only two studies: multiple LC50 values for the fathead minnow (Pimephales promelas) exposed to Cu in one type of exposure water, and multiple EC50 values for Daphnia magna exposed to Zn in another type of exposure water. We tested whether the factor-of-2 rule of thumb also is supported in a different dataset in which D. magna were exposed separately to Cd, Cu, Ni, or Zn. Overall, the factor-of-2 rule of thumb appeared to be a good guide to evaluating the acceptability of a toxicity model's underprediction or overprediction of observed LC50 or EC50 values in these acute toxicity tests.
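    The factor-of-2 acceptability criterion is simple to state in code; this is a minimal sketch of the rule of thumb itself, not the authors' evaluation procedure:

    ```python
    def within_factor_of_2(predicted, observed):
        """True if a predicted LC50/EC50 is within a factor of 2
        of its paired observed value (the rule of thumb)."""
        ratio = predicted / observed
        return 0.5 <= ratio <= 2.0
    ```

    For example, a predicted LC50 three times the observed value would be flagged as an unacceptable overprediction, while one at 1.5 times would pass.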

  15. 77 FR 17102 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... Rule Change To Adopt a New Order Type March 19, 2012. Pursuant to Section 19(b)(1) of the Securities... of the Proposed Rule Change The Exchange proposes to amend Rule 715 (Types of Orders) to adopt a new order type. The text of the proposed rule change is available on the Exchange's Internet Web site at...

  16. Analytical formulation of cellular automata rules using data models

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.

    2009-05-01

    We present a unique method for converting traditional cellular automata (CA) rules into analytical function form. CA rules have been successfully used for morphological image processing and volumetric shape recognition and classification. Further, the use of CA rules as analog models to the physical and biological sciences can be significantly extended if analytical (as opposed to discrete) models could be formulated. We show that such transformations are possible. We use as our example John Horton Conway's famous "Game of Life" rule set. We show that using Data Modeling, we are able to derive both polynomial and bi-spectrum models of the IF-THEN rules that yield equivalent results. Further, we demonstrate that the "Game of Life" rule set can be modeled using the multi-fluxion, yielding a closed form nth order derivative and integral. All of the demonstrated analytical forms of the CA rule are general and applicable to real-time use.
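    For reference, the discrete IF-THEN form of the "Game of Life" rule set that the paper converts to analytical form can be sketched as:

    ```python
    def life_rule(alive, live_neighbours):
        """Conway's 'Game of Life' as discrete IF-THEN rules:
        a live cell survives with 2 or 3 live neighbours;
        a dead cell is born with exactly 3."""
        if alive:
            return 1 if live_neighbours in (2, 3) else 0
        return 1 if live_neighbours == 3 else 0
    ```

    The paper's contribution is replacing this branching logic with fitted polynomial and bi-spectrum models; the sketch above is only the discrete starting point of that transformation.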

  17. Conceptual model of knowledge base system

    NASA Astrophysics Data System (ADS)

    Naykhanova, L. V.; Naykhanova, I. V.

    2018-05-01

    This article presents a conceptual model of a knowledge-based system of the production-system type. The production system is intended for automating problems whose solution is rigidly determined by legislation. The core component of the system is a knowledge base, which consists of a set of facts, a set of rules, a cognitive map, and an ontology. The cognitive map implements the control strategy, and the ontology implements the explanation mechanism. Representing situation-recognition knowledge in the form of rules makes it possible to describe knowledge of the pension legislation. This approach provides flexibility, originality, and scalability: when the legislation changes, only the rule set needs to be changed, so a change in the legislation is not a serious problem. The main advantage of the system is that it can easily be adapted to legislative changes.
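    A production system of this kind can be sketched as a minimal forward-chaining loop over facts and rules; the pension-eligibility rule and thresholds below are invented for illustration and not taken from the paper:

    ```python
    def infer(facts, rules):
        """Forward chaining: apply every rule whose condition holds on the
        current facts, adding its conclusion to the derived fact set."""
        derived = dict(facts)
        for condition, (key, value) in rules:
            if condition(derived):
                derived[key] = value
        return derived

    # Hypothetical rule: eligibility rigidly determined by (invented) thresholds.
    rules = [
        (lambda f: f["age"] >= 60 and f["service_years"] >= 25,
         ("pension", "eligible")),
    ]
    ```

    Changing the legislation then means editing entries in `rules`, leaving the inference loop untouched, which is the flexibility the article describes.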

  18. Semantic Mediation via Access Broker: the OWS-9 experiment

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Papeschi, Fabrizio; Craglia, Massimo; Nativi, Stefano

    2013-04-01

    Even with the use of common data model standards to publish and share geospatial data, users may still face semantic inconsistencies when they use Spatial Data Infrastructures - especially in multidisciplinary contexts. Several semantic mediation solutions exist to address this issue; they span from simple XSLT documents that transform one data model schema into another, to more complex services based on the use of ontologies. This work presents the activity done in the context of the OGC Web Services Phase 9 (OWS-9) Cross Community Interoperability to develop a semantic mediation solution by enhancing the GEOSS Discovery and Access Broker (DAB). This is a middleware component that provides harmonized access to geospatial datasets according to client applications' preferred service interfaces (Nativi et al. 2012, Vaccari et al. 2012). Given a set of remote feature data encoded in different feature schemas, the objective of the activity was to use the DAB to enable client applications to transparently access the feature data according to one single schema. Due to the flexible architecture of the Access Broker, it was possible to introduce a new transformation type in the configured chain of transformations. In fact, the Access Broker already provided the following transformations: Coordinate Reference System (CRS), spatial resolution, spatial extent (e.g., a subset of a data set), and data encoding format. A new software module was developed to invoke the needed external semantic mediation service and harmonize the accessed features. In OWS-9, the Access Broker invokes a SPARQL WPS to retrieve mapping rules for the OWS-9 schemas (USGS and NGA). The solution implemented to address this problem shows the flexibility and extensibility of the brokering framework underpinning the GEO DAB: new services can be added to augment the number of supported schemas without the need to modify other components and/or software modules.
Moreover, all other transformations (CRS, format, etc.) are available for client applications in a transparent way. Notwithstanding the encouraging results of this experiment, some issues (e.g. the automatic discovery of semantic mediation services to be invoked) still need to be solved. Future work will consider new semantic mediation services to broker, and compliance tests with the INSPIRE transformation service. References: Nativi S., Craglia M. and Pearlman J. 2012. The Brokering Approach for Multidisciplinary Interoperability: A Position Paper. International Journal of Spatial Data Infrastructures Research, Vol. 7, 1-15. http://ijsdir.jrc.ec.europa.eu/index.php/ijsdir/article/view/281/319 Vaccari L., Craglia M., Fugazza C. Nativi S. and Santoro M. 2012. Integrative Research: The EuroGEOSS Experience. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 5 (6) 1603-1611. http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=6187671&contentType=Journals+%26+Magazines&sortType%3Dasc_p_Sequence%26filter%3DAND%28p_IS_Number%3A6383184%29

  19. A Hybrid Approach Using Case-Based Reasoning and Rule-Based Reasoning to Support Cancer Diagnosis: A Pilot Study.

    PubMed

    Saraiva, Renata M; Bezerra, João; Perkusich, Mirko; Almeida, Hyggo; Siebra, Clauirton

    2015-01-01

    Recently there has been an increasing interest in applying information technology to support the diagnosis of diseases such as cancer. In this paper, we present a hybrid approach using case-based reasoning (CBR) and rule-based reasoning (RBR) to support cancer diagnosis. We used symptoms, signs, and personal information from patients as inputs to our model. To form specialized diagnoses, we used rules to define the input factors' importance according to the patient's characteristics. The model's output presents the probability of the patient having a type of cancer. To carry out this research, we had the approval of the ethics committee at Napoleão Laureano Hospital, in João Pessoa, Brazil. To define our model's cases, we collected real patient data at Napoleão Laureano Hospital. To define our model's rules and weights, we researched the specialized literature and interviewed health professionals. To validate our model, we used K-fold cross validation with the data collected at Napoleão Laureano Hospital. The results showed that our approach is an effective CBR system for diagnosing cancer.
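    The hybrid idea - rules (RBR) that set per-patient feature weights, then case-based similarity (CBR) over stored cases - can be sketched roughly as follows; the feature names, weights, and rule are invented for illustration, not the paper's model:

    ```python
    def rule_adjusted_weights(patient):
        """RBR part: rules adjust the importance of input factors according
        to the patient's characteristics (all values illustrative)."""
        weights = {"smoker": 1.0, "age_over_50": 1.0, "weight_loss": 1.0}
        if patient.get("age_over_50"):
            # hypothetical rule: weight loss is weighted more heavily
            # in older patients
            weights["weight_loss"] = 2.0
        return weights

    def similarity(patient, case, weights):
        """CBR part: weighted fraction of features matching a stored case."""
        total = sum(weights.values())
        matched = sum(w for f, w in weights.items()
                      if patient.get(f) == case.get(f))
        return matched / total
    ```

    Ranking stored cases by this rule-adjusted similarity and aggregating their outcomes would yield the kind of per-cancer-type probability the model outputs.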

  20. Determining a human cardiac pacemaker using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Varnavsky, A. N.; Antonenco, A. V.

    2017-01-01

    The paper presents a possibility of determining a human cardiac pacemaker using a combined application of nonlinear integral transformation and fuzzy logic, which allows the analysis to be carried out in real time. A fuzzy inference system is proposed, and membership functions and fuzzy production rules are defined. It is shown that the ratio of the truth degree of the winning rule's condition to the truth degree of any other rule's condition is at least 3.
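    The reported margin - the winning rule's truth degree being at least 3 times that of any other rule - can be checked with a small sketch (the truth degrees below are illustrative):

    ```python
    def winner_margin(truth_degrees):
        """Ratio of the winning rule condition's truth degree
        to that of the runner-up rule."""
        ordered = sorted(truth_degrees, reverse=True)
        return ordered[0] / ordered[1]
    ```

    A margin of 3 or more means the winning rule dominates every competing rule, which supports an unambiguous classification in real time.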

  1. A clocking discipline for two-phase digital integrated circuits

    NASA Astrophysics Data System (ADS)

    Noice, D. C.

    1983-09-01

    Sooner or later a designer of digital circuits must face the problem of timing verification so he can avoid errors caused by clock skew, critical races, and hazards. Unlike previous verification methods, such as timing simulation and timing analysis, the approach presented here guarantees correct operation despite uncertainty about delays in the circuit. The result is a clocking discipline that deals with timing abstractions only. It is not based on delay calculations; it is only concerned with correct, synchronous operation at some clock rate. Accordingly, it may be used earlier in the design cycle, which is particularly important for integrated circuit designs. The clocking discipline consists of a notation of clocking types and composition rules for using those types. Together, the notation and rules define a formal theory of two-phase clocking. The notation defines the names and exact characteristics of the different signals used in a two-phase digital system, and it makes it possible to develop rules for propagating the clocking types through particular circuits.

  2. The Analysis and Discussion in the Effective Application of the Dispatcher Training Based on Case Teaching Method with the Cause from the Action of the Gap Protection of Main Transformer

    NASA Astrophysics Data System (ADS)

    Yuanyuan, Xu; Zhengmao, Zhang; Xiang, Fang; Yuanshuai, Xu; Xinxin, Song

    2018-03-01

    The combination of theory and practice is a difficult problem in dispatcher training. Through a typical case, this paper presents an effective case-teaching method for dispatcher training that combines theoretical discussion of experience-based rules with concrete cases to achieve vividness. It helps students understand and grasp the key points of the theory and improve their practical skills.

  3. Langevin dynamics for vector variables driven by multiplicative white noise: A functional formalism

    NASA Astrophysics Data System (ADS)

    Moreno, Miguel Vera; Arenas, Zochil González; Barci, Daniel G.

    2015-04-01

    We discuss general multidimensional stochastic processes driven by a system of Langevin equations with multiplicative white noise. In particular, we address the problem of how time-reversed diffusion processes are affected by the variety of conventions available to deal with stochastic integrals. We present a functional formalism to build up the generating functional of correlation functions without any type of discretization of the Langevin equations at any intermediate step. The generating functional is characterized by a functional integration over two sets of commuting variables, as well as Grassmann variables. In this representation, the time-reversal transformation becomes a linear transformation in the extended variables, thereby simplifying the complexity introduced by the mixture of prescriptions and the associated calculus rules. The stochastic calculus is codified in our formalism in the structure of the Grassmann algebra. We study some examples such as higher-order derivative Langevin equations and the functional representation of the micromagnetic stochastic Landau-Lifshitz-Gilbert equation.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binder, Tobias; Covi, Laura; Kamada, Ayuki

    Dark Matter (DM) models providing possible alternative solutions to the small-scale crisis of the standard cosmology are nowadays of growing interest. We consider DM interacting with light hidden fermions via well-motivated fundamental operators showing the resultant matter power spectrum is suppressed on subgalactic scales within a plausible parameter region. Our basic description of the evolution of cosmological perturbations relies on a fully consistent first principles derivation of a perturbed Fokker-Planck type equation, generalizing existing literature. The cosmological perturbation of the Fokker-Planck equation is presented for the first time in two different gauges, where the results transform into each other according to the rules of gauge transformation. Furthermore, our focus lies on a derivation of a broadly applicable and easily computable collision term showing important phenomenological differences to other existing approximations. As one of the main results and concerning the small-scale crisis, we show the equal importance of vector and scalar boson mediated interactions between the DM and the light fermions.

  5. Judge Rules Plagiarism-Detection Tool Falls under "Fair Use"

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2008-01-01

    Judge Claude M. Hilton, of the U.S. District Court in Alexandria, Virginia, in March found that scanning the student papers for the purpose of detecting plagiarism is a "highly transformative" use that falls under the fair-use provision of copyright law. He ruled that the company "makes no use of any work's particular expressive or creative…

  6. A new type of simplified fuzzy rule-based system

    NASA Astrophysics Data System (ADS)

    Angelov, Plamen; Yager, Ronald

    2012-02-01

    Over the last quarter of a century, two types of fuzzy rule-based (FRB) systems have dominated, namely the Mamdani and Takagi-Sugeno types. They use the same type of scalar fuzzy sets defined per input variable in their antecedent part, which are aggregated at the inference stage by t-norms or co-norms representing logical AND/OR operations. In this paper, we propose a significantly simplified alternative to define the antecedent part of FRB systems by data Clouds and density distribution. This new type of FRB system goes further in conceptual and computational simplification while preserving the best features (flexibility, modularity, and human intelligibility) of its predecessors. The proposed concept offers an alternative non-parametric form of the rule antecedents, which fully reflects the real data distribution and does not require any explicit aggregation operations or scalar membership functions to be imposed. Instead, it derives the fuzzy membership of a particular data sample to a Cloud from the density distribution of the data associated with that Cloud. Contrast this with clustering, a parametric data-space decomposition/partitioning in which the fuzzy membership of a sample to a cluster is measured by its distance to the cluster centre/prototype, ignoring all the data that form that cluster or merely approximating their distribution. The proposed new approach takes into account fully and exactly the spatial distribution and similarity of all the real data by proposing an innovative and much simplified form of the antecedent part. In this paper, we provide several numerical examples aiming to illustrate the concept.
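    One common density form in the Cloud/AnYa literature is a Cauchy-type local density over all samples of the Cloud; the 1-D sketch below is illustrative and not necessarily the exact formula used in the paper:

    ```python
    def cloud_membership(x, cloud_samples):
        """Membership of sample x in a data Cloud, derived from the data
        density: 1 / (1 + mean squared distance to every Cloud sample)."""
        n = len(cloud_samples)
        mean_sq_dist = sum((x - s) ** 2 for s in cloud_samples) / n
        return 1.0 / (1.0 + mean_sq_dist)
    ```

    Unlike a distance to a single cluster prototype, this uses every sample of the Cloud, so the membership reflects the actual data distribution - the key contrast the abstract draws with clustering.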

  7. Frequency analysis of DC tolerant current transformers

    NASA Astrophysics Data System (ADS)

    Mlejnek, P.; Kaspar, P.

    2013-09-01

    This article deals with the wide-frequency-range behaviour of DC-tolerant current transformers, which are commonly used in modern static energy meters. In this application, current transformers must comply with European and International Standards for accuracy and DC tolerance; therefore, linear DC-tolerant current transformers and double-core current transformers are used in this field. More details about the problems of these particular types of transformers can be found in our previous works. Although these transformers are designed mainly for the power distribution network frequency (50/60 Hz), it is useful to understand their behaviour over a wider frequency range. Based on this knowledge, new generations of energy meters capable of measuring power quality will be produced. Such meters provide better measurement of the consumption of nonlinear loads and of non-sinusoidal voltage and current sources such as solar cells or fuel cells. The actual power consumption in these energy meters is determined from the individual harmonic components of current and voltage. To characterize several samples of current transformers of both types, we measured their phase and ratio errors, the most important parameters of current transformers.

  8. Modeling and analysis on ring-type piezoelectric transformers.

    PubMed

    Ho, Shine-Tzong

    2007-11-01

    This paper presents an electromechanical model for a ring-type piezoelectric transformer (PT). To establish this model, the vibration characteristics of a piezoelectric ring with free boundary conditions are analyzed in advance. Based on this vibration analysis, the operating frequency and vibration mode of the PT are chosen. Electromechanical equations of motion for the PT are then derived based on Hamilton's principle and used to simulate the coupled electromechanical system of the transformer. Quantities such as the voltage step-up ratio, input impedance, output impedance, input power, output power, and efficiency are calculated from these equations. The optimal load resistance and the maximum efficiency for the PT are also presented. Experiments were conducted to verify the theoretical analysis, and good agreement was obtained.

  9. Using new aggregation operators in rule-based intelligent control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Chen, Yung-Yaw; Yager, Ronald R.

    1990-01-01

    A new aggregation operator is applied in the design of an approximate reasoning-based controller. The ordered weighted averaging (OWA) operator has the property of lying between the And function and the Or function used in previous fuzzy set reasoning systems. It is shown here that, by applying OWA operators, more generalized types of control rules, which may include linguistic quantifiers such as Many and Most, can be developed. The new aggregation operators, as tested in a cart-pole balancing control problem, illustrate improved performance when compared with existing fuzzy control aggregation schemes.
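    The OWA operator itself is straightforward; the sketch below shows how one weight vector interpolates between the Or (max), the And (min), and the plain average:

    ```python
    def owa(weights, values):
        """Ordered weighted averaging: the weights are applied to the
        values after sorting them in descending order, not to fixed
        argument positions."""
        ordered = sorted(values, reverse=True)
        return sum(w * v for w, v in zip(weights, ordered))
    ```

    Weights (1, 0, 0) recover max (Or-like), (0, 0, 1) recover min (And-like), and uniform weights give the arithmetic mean; intermediate weight vectors realize linguistic quantifiers such as Many and Most.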

  10. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of the argument map and the Bayesian network, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  11. A variant selection model for predicting the transformation texture of deformed austenite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butron-Guillen, M.P.; Jonas, J.J.; Da Costa Viana, C.S.

    1997-09-01

    The occurrence of variant selection during the transformation of deformed austenite is examined, together with its effect on the product texture. A new prediction method is proposed based on the morphology of the austenite grains, on slip activity, and on the residual stresses remaining in the material after rolling. The aspect ratio of pancaked grains is demonstrated to play an important role in favoring selection of the transformed copper ({311}<011> and {211}<011>) components. The extent of shear on active slip planes during prior rolling is shown to promote the formation of the transformed brass ({332}<113> and {211}<113>) components. Finally, the residual stresses remaining in the material after rolling play an essential part by preventing growth of the {110}<110> and {100} orientations selected by the grain shape and slip activity rules. With the aid of these three variant selection criteria combined, it is possible to reproduce all the features of the transformation textures observed experimentally. The criteria also explain why the intensities of the transformed copper components are sensitive to the pancaking strain, while those of the transformed brass are a function of the cooling rate employed after hot rolling.

  12. Object-based land-cover classification for metropolitan Phoenix, Arizona, using aerial photography

    NASA Astrophysics Data System (ADS)

    Li, Xiaoxiao; Myint, Soe W.; Zhang, Yujia; Galletti, Christopher; Zhang, Xiaoxiang; Turner, Billie L.

    2014-12-01

    Detailed land-cover mapping is essential for a range of research issues addressed by the sustainability and land system sciences and planning. This study uses an object-based approach to create a 1 m land-cover classification map of the expansive Phoenix metropolitan area through the use of high spatial resolution aerial photography from National Agricultural Imagery Program. It employs an expert knowledge decision rule set and incorporates the cadastral GIS vector layer as auxiliary data. The classification rule was established on a hierarchical image object network, and the properties of parcels in the vector layer were used to establish land cover types. Image segmentations were initially utilized to separate the aerial photos into parcel sized objects, and were further used for detailed land type identification within the parcels. Characteristics of image objects from contextual and geometrical aspects were used in the decision rule set to reduce the spectral limitation of the four-band aerial photography. Classification results include 12 land-cover classes and subclasses that may be assessed from the sub-parcel to the landscape scales, facilitating examination of scale dynamics. The proposed object-based classification method provides robust results, uses minimal and readily available ancillary data, and reduces computational time.

  13. Fourier transform mass spectrometry.

    PubMed

    Scigelova, Michaela; Hornshaw, Martin; Giannakopulos, Anastassios; Makarov, Alexander

    2011-07-01

    This article provides an introduction to Fourier transform-based mass spectrometry. The key performance characteristics of Fourier transform-based mass spectrometry, mass accuracy and resolution, are presented in the view of how they impact the interpretation of measurements in proteomic applications. The theory and principles of operation of two types of mass analyzer, Fourier transform ion cyclotron resonance and Orbitrap, are described. Major benefits as well as limitations of Fourier transform-based mass spectrometry technology are discussed in the context of practical sample analysis, and illustrated with examples included as figures in this text and in the accompanying slide set. Comparisons highlighting the performance differences between the two mass analyzers are made where deemed useful in assisting the user with choosing the most appropriate technology for an application. Recent developments of these high-performing mass spectrometers are mentioned to provide a future outlook.

  14. Fourier Transform Mass Spectrometry

    PubMed Central

    Scigelova, Michaela; Hornshaw, Martin; Giannakopulos, Anastassios; Makarov, Alexander

    2011-01-01

    This article provides an introduction to Fourier transform-based mass spectrometry. The key performance characteristics of Fourier transform-based mass spectrometry, mass accuracy and resolution, are presented in the view of how they impact the interpretation of measurements in proteomic applications. The theory and principles of operation of two types of mass analyzer, Fourier transform ion cyclotron resonance and Orbitrap, are described. Major benefits as well as limitations of Fourier transform-based mass spectrometry technology are discussed in the context of practical sample analysis, and illustrated with examples included as figures in this text and in the accompanying slide set. Comparisons highlighting the performance differences between the two mass analyzers are made where deemed useful in assisting the user with choosing the most appropriate technology for an application. Recent developments of these high-performing mass spectrometers are mentioned to provide a future outlook. PMID:21742802

  15. Collaboration pathway(s) using new tools for optimizing operational climate monitoring from space

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2014-10-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the needs of decision makers, scientific investigators, and global users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent (2014) rule-based decision engine modeling runs that targeted optimizing the intended NPOESS architecture serve as a surrogate for global operational climate monitoring architecture(s). These rule-based system tools provide valuable insight for global climate architectures through the comparison and evaluation of the alternatives considered and the exhaustive range of trade space explored. A representative optimization of global ECV (essential climate variable) climate monitoring architecture(s) is explored and described in some detail, with thoughts on appropriate rule-based valuations. The optimization tool(s) suggest and support global collaboration pathways and will hopefully elicit responses from the audience and climate science stakeholders.

  16. Adjusting Wavelet-based Multiresolution Analysis Boundary Conditions for Robust Long-term Streamflow Forecasting Model

    NASA Astrophysics Data System (ADS)

    Maslova, I.; Ticlavilca, A. M.; McKee, M.

    2012-12-01

    There has been an increased interest in wavelet-based streamflow forecasting models in recent years. Often overlooked in this approach are the circularity assumptions of the wavelet transform. We propose a novel technique for minimizing the wavelet decomposition boundary condition effect to produce long-term, up to 12 months ahead, forecasts of streamflow. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data. A hybrid wavelet-multivariate relevance vector machine model is developed for forecasting the streamflow in real time for the Yellowstone River, Uinta Basin, Utah, USA. The inputs of the model utilize only the past monthly streamflow records, which are decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. This long-term streamflow modeling and forecasting methodology would enable better decision-making and management of water availability risk.
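    The boundary-condition issue can be illustrated with a single Haar analysis step: for an odd-length signal the last pair must be completed by some extension rule, and the choice changes the edge coefficients. This toy sketch (not the paper's wavelet or boundary rule) shows the effect:

    ```python
    def haar_step(signal, mode="periodic"):
        """One Haar averaging/differencing level with an explicit boundary
        rule: odd-length signals are extended by wrapping around (periodic)
        or by repeating the edge sample (symmetric)."""
        s = list(signal)
        if len(s) % 2:
            s.append(s[0] if mode == "periodic" else s[-1])
        approx = [(s[i] + s[i + 1]) / 2 for i in range(0, len(s), 2)]
        detail = [(s[i] - s[i + 1]) / 2 for i in range(0, len(s), 2)]
        return approx, detail
    ```

    The interior coefficients are identical under both rules; only the final (boundary) coefficient differs, and in a forecasting setting that boundary sits exactly on the most recent observations, which is why the choice of rule matters.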

  17. Object-based and egocentric mental rotation performance in older adults: the importance of gender differences and motor ability.

    PubMed

    Jansen, Petra; Kaltner, Sandra

    2014-01-01

    In this study, mental rotation performance was assessed in both an object-based task, human figures and letters as stimuli, and in an egocentric-based task, a human figure as a stimulus, in 60 older persons between 60 and 71 years old (30 women, 30 men). Additionally all participants completed three motor tests measuring balance and mobility. The results show that the reaction time was slower for letters than for both human figure tasks and the mental rotation speed was faster over all for egocentric mental rotation tasks. Gender differences were found in the accuracy measurement, favoring males, and were independent of stimulus type, kind of transformation, and angular disparity. Furthermore, a regression analysis showed that the accuracy rate for object-based transformations with body stimuli could be predicted by gender and balance ability. This study showed that the mental rotation performance in older adults depends on stimulus type, kind of transformation, and gender and that performance partially relates to motor ability.

  18. Multi Groups Cooperation based Symbiotic Evolution for TSK-type Neuro-Fuzzy Systems Design

    PubMed Central

    Cheng, Yi-Chang; Hsu, Yung-Chi

    2010-01-01

    In this paper, a TSK-type neuro-fuzzy system with a multi-group cooperation based symbiotic evolution method (TNFS-MGCSE) is proposed. The TNFS-MGCSE is developed from symbiotic evolution, which differs from traditional genetic algorithms (GAs) in that each chromosome represents a single rule of the fuzzy model rather than a complete solution. The MGCSE further differs from traditional symbiotic evolution in that the population is divided into several groups. Each group, formed by a set of chromosomes, represents one fuzzy rule and cooperates with the other groups to generate better chromosomes through the proposed cooperation-based crossover strategy (CCS). The proposed TNFS-MGCSE is evaluated on numerical examples (Mackey-Glass chaotic time series and sunspot number forecasting). In the simulations, the TNFS-MGCSE performs excellently in comparison with other existing models. PMID:21709856
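
    The group-wise cooperation can be sketched as a toy genetic search (the fitness function, parameters, and selection scheme below are hypothetical stand-ins, not the paper's CCS): each chromosome encodes one rule, and a full system is assembled by drawing one chromosome per group.

```python
import random

random.seed(0)
N_GROUPS, GROUP_SIZE, GENES = 3, 6, 4  # hypothetical sizes

def random_rule():
    # A chromosome encodes ONE fuzzy rule as a flat parameter vector.
    return [random.uniform(-1, 1) for _ in range(GENES)]

groups = [[random_rule() for _ in range(GROUP_SIZE)] for _ in range(N_GROUPS)]

def system_fitness(rules):
    # Placeholder fitness: reward rule parameters close to a target pattern.
    target = [0.5, -0.5, 0.25, 0.0]
    return -sum((g - t) ** 2 for rule in rules for g, t in zip(rule, target))

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

for generation in range(30):
    # Credit each chromosome with the fitness of the full systems it joins;
    # cooperation across groups happens through this shared evaluation.
    scores = [[0.0] * GROUP_SIZE for _ in range(N_GROUPS)]
    counts = [[0] * GROUP_SIZE for _ in range(N_GROUPS)]
    for _ in range(20):
        picks = [random.randrange(GROUP_SIZE) for _ in range(N_GROUPS)]
        f = system_fitness([groups[g][picks[g]] for g in range(N_GROUPS)])
        for g in range(N_GROUPS):
            scores[g][picks[g]] += f
            counts[g][picks[g]] += 1
    # Within each group, breed the weaker rules from the two best performers.
    for g in range(N_GROUPS):
        avg = [scores[g][i] / counts[g][i] if counts[g][i] else float("-inf")
               for i in range(GROUP_SIZE)]
        order = sorted(range(GROUP_SIZE), key=avg.__getitem__, reverse=True)
        best, second = groups[g][order[0]], groups[g][order[1]]
        for i in order[2:]:
            groups[g][i] = crossover(best, second)
```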

  19. Periodicity and Multi-scale Analysis of Runoff and Sediment Load in the Wulanghe River, Jinsha River

    NASA Astrophysics Data System (ADS)

    Chen, Yiming

    2018-01-01

    Based on annual runoff and sediment data (1959-2014) from the Zongguantian hydrological station, the time-frequency characteristics and periodic rules of alternating high and low flow were analyzed at multiple time scales using the Morlet continuous wavelet transform (CWT). It is concluded that the primary periods of the runoff and sediment load time series at the different time scales were 12, 3, and 26 years and 18, 13, and 5 years, respectively. The analysis predicts that both time series will gradually decrease and will remain in a high-flow period for about 8 years (from 2014 to 2022) and 10 years (from 2014 to 2020), respectively.
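
    A Morlet CWT periodicity analysis of this kind can be sketched in NumPy by direct convolution (synthetic monthly data; the scales and the central frequency w0 are illustrative choices, not the paper's settings):

```python
import numpy as np

def morlet_cwt_power(x, scales, w0=6.0):
    """Wavelet power of a 1-D series via direct convolution with a Morlet
    wavelet (complex exponential under a Gaussian envelope)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.empty((len(scales), len(x)))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)  # +/- 4 envelope widths
        psi = np.exp(1j * w0 * t / s - 0.5 * (t / s) ** 2) / np.sqrt(s)
        power[i] = np.abs(np.convolve(x, np.conj(psi)[::-1], mode="same")) ** 2
    return power

# Synthetic "monthly" series with a 12-sample period: the mean power peaks
# at the scale matching that period (for w0 = 6, scale ~ period * 6 / (2*pi)).
series = np.sin(2 * np.pi * np.arange(240) / 12.0)
scales = np.array([2.0, 6.0, 12.0 * 6.0 / (2.0 * np.pi), 20.0])
mean_power = morlet_cwt_power(series, scales).mean(axis=1)
```

    Applied to runoff or sediment series, the scale at which time-averaged power peaks identifies the dominant period, which is the quantity the multi-scale analysis above reports.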

  20. Optical chirp z-transform processor with a simplified architecture.

    PubMed

    Ngo, Nam Quoc

    2014-12-29

    Using a simplified chirp z-transform (CZT) algorithm based on the discrete-time convolution method, this paper presents the synthesis of a simplified architecture of a reconfigurable optical chirp z-transform (OCZT) processor based on silica-based planar lightwave circuit (PLC) technology. In the simplified architecture of the reconfigurable OCZT, the required number of optical components is small and there are no waveguide crossings, which makes fabrication easy. The design of a novel type of optical discrete Fourier transform (ODFT) processor as a special case of the synthesized OCZT is then presented to demonstrate its effectiveness. The designed ODFT can potentially be used as an optical demultiplexer at the receiver of an optical fiber orthogonal frequency division multiplexing (OFDM) transmission system.
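
    The discrete-time convolution (Bluestein) formulation of the CZT underlying such processors can be sketched numerically; with default parameters it reduces to the DFT, mirroring the ODFT special case (a software sketch of the algorithm only, unrelated to the optical implementation):

```python
import numpy as np

def czt(x, M=None, W=None, A=1.0 + 0j):
    """Chirp z-transform via Bluestein's discrete-time convolution method.

    Evaluates X[k] = sum_n x[n] * A**(-n) * W**(n*k) for k = 0..M-1. With
    the defaults (A = 1, W = exp(-2j*pi/N), M = N) this is exactly the DFT.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    M = N if M is None else M
    W = np.exp(-2j * np.pi / N) if W is None else W
    n = np.arange(N)
    k = np.arange(M)
    # n*k = (n**2 + k**2 - (k - n)**2) / 2 rewrites the sum as a convolution.
    a = x * A ** (-n) * W ** (n * n / 2.0)
    idx = np.arange(max(N, M))
    chirp = W ** (-(idx * idx) / 2.0)
    L = int(2 ** np.ceil(np.log2(N + M - 1)))  # FFT-friendly length
    b = np.zeros(L, dtype=complex)
    b[:M] = chirp[:M]                  # b[m] = W**(-m*m/2) for m >= 0
    b[L - N + 1:] = chirp[1:N][::-1]   # ... and for m < 0 (b is even in m)
    conv = np.fft.ifft(np.fft.fft(a, L) * np.fft.fft(b))
    return conv[:M] * W ** (k * k / 2.0)

# Sanity check: with default parameters the CZT matches the FFT.
x4 = np.array([1.0, 2.0, 0.5, -1.0])
```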

  1. Competition.

    PubMed

    Chambers, D W

    1997-01-01

    Our ambivalence toward competition can be traced to an unspoken preference for certain types of competition which give us an advantage over the types we value less. Four types are defined: (a) pure (same rules, same objectives), (b) collaborative (same rules, shared objective), (c) market share (different rules, same objectives), and (d) market growth (different rules, value-added orientation). The defining characteristics of the four types of competition are, respectively: needing a referee, arguing over the spoils, differentiation and substitutability, and customer focus. Dentistry has features of all four types of competition, thus making it difficult to have a meaningful discussion or frame a coherent policy on this topic.

  2. Selective Effects of Sport Expertise on the Stages of Mental Rotation Tasks With Object-Based and Egocentric Transformations

    PubMed Central

    Feng, Tian; Zhang, Zhongqiu; Ji, Zhiguang; Jia, Binbin; Li, Yawei

    2017-01-01

    It is well established that motor expertise is linked to superior mental rotation ability, but few studies have attempted to explain the factors that influence the stages of mental rotation in sport experts. Some authors have argued that athletes are faster in the perceptual and decision stages but not in the rotation stages of object-based transformations; however, stimuli related to sport have not been used to test mental rotation with egocentric transformations. Therefore, 24 adolescent elite divers and 23 adolescent nonathletes completed mental rotation tasks with object-based and egocentric transformations. The results showed faster reaction times (RTs) for the motor experts in tasks with both types of transformations (object-based cube, object-based body, and egocentric body). Additionally, the differences in favour of motor experts in the perceptual and decision stages were confirmed. Interestingly, motor experts also outperformed nonathletes in the rotation stages in the egocentric transformations. These findings are discussed against the background of the effects of sport expertise on mental rotation. PMID:29071008

  3. Sharing the knowledge gained from occupational cohort studies: a call for action.

    PubMed

    Behrens, Thomas; Mester, Birte; Fritschi, Lin

    2012-06-01

    An immense body of knowledge has been created by establishing various job-exposure matrices (JEMs) to assess occupational exposures in community- and industry-based cohort studies. These JEMs could be made available to occupational epidemiologists using knowledge-sharing technologies, thereby saving considerable amounts of time and money for researchers investigating occupation-related research questions. In this paper, the authors give an example of how a detailed JEM can be easily transformed into a job-specific module (JSM) for use in community-based studies. OccIDEAS is operationalised as a web-based software, combining the use of JSMs with an individual expert exposure assessment to assess occupational exposures in various industries according to a set of predefined rules. The authors used a JEM focusing on endocrine-disrupting chemicals from a German study on testicular cancer in the automobile industry to create a JSM in OccIDEAS. The JEM was easily translated into OccIDEAS requiring about 50 h of work by an epidemiologist familiar with the German JEM to learn about the OccIDEAS structure, establish the required set of exposure rules and to translate the JEM into OccIDEAS. Language did not represent an obstacle for translation either. To make the data available in an international context, an interpreter had to translate the German tasks and exposures after they were coded into OccIDEAS. JEMs which are constructed based on identifying tasks that determine exposure can be easily transformed into a JSM. Occupational epidemiologists are invited to contribute to the international scope of OccIDEAS by providing their previously established JEMs to make existing data on occupational exposures widely available to the epidemiological community.

  4. Fault diagnosis of power transformer based on fault-tree analysis (FTA)

    NASA Astrophysics Data System (ADS)

    Wang, Yongliang; Li, Xiaoqiang; Ma, Jianwei; Li, SuoYu

    2017-05-01

    Power transformers are important equipment in power plants and substations; as the link between power distribution and transmission, they form an important hub of the power system, and their performance directly affects the quality, reliability, and stability of the power system. This paper first classifies power transformer faults into five parts according to fault type, and then divides transformer faults into three stages along the time dimension. Routine dissolved gas analysis (DGA) and infrared diagnostic criteria are used to assess the transformer's running state. Finally, according to the needs of power transformer fault diagnosis, a dendritic fault tree for the power transformer is constructed by stepwise refinement from the general to the specific.
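
    A fault tree of this kind can be evaluated bottom-up against observed events; the sketch below uses hypothetical event names and gate structure, loosely following the transformer fault categories in the abstract, not the paper's actual tree:

```python
# OR gate at the top (any branch causes the top fault event); AND gates
# require all of their basic events. Event names are hypothetical.
fault_tree = ("OR", [
    ("AND", [("BASIC", "winding_insulation_aged"), ("BASIC", "overload")]),
    ("AND", [("BASIC", "oil_moisture_high"), ("BASIC", "dga_gas_abnormal")]),
    ("BASIC", "bushing_flashover"),
])

def evaluate(node, observed):
    """Evaluate a fault-tree node bottom-up against a set of observed events."""
    kind, body = node
    if kind == "BASIC":
        return body in observed
    child_values = [evaluate(child, observed) for child in body]
    return any(child_values) if kind == "OR" else all(child_values)
```

    For diagnosis, the same traversal run in reverse (which minimal sets of basic events make the top event true) yields the minimal cut sets of the tree.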

  5. Cytoskeletal motor-driven active self-assembly in in vitro systems

    DOE PAGES

    Lam, A. T.; VanDelinder, V.; Kabir, A. M. R.; ...

    2015-11-11

    Molecular motor-driven self-assembly has been an active area of soft matter research for the past decade. Because molecular motors transform chemical energy into mechanical work, systems which employ molecular motors to drive self-assembly processes are able to overcome kinetic and thermodynamic limits on assembly time, size, complexity, and structure. Here, we review the progress in elucidating and demonstrating the rules and capabilities of motor-driven active self-assembly. Lastly, we focus on the types of structures created and the degree of control realized over these structures, and discuss the next steps necessary to achieve the full potential of this assembly mode, which complements robotic manipulation and passive self-assembly.

  6. Modeling of a ring rosen-type piezoelectric transformer by Hamilton's principle.

    PubMed

    Nadal, Clément; Pigache, Francois; Erhart, Jiří

    2015-04-01

    This paper deals with the analytical modeling of a ring Rosen-type piezoelectric transformer. The developed model is based on a Hamiltonian approach, which enables the main parameters to be obtained and the performance to be evaluated for the first radial vibratory modes. The methodology is detailed, and the final results, both the input admittance and the electric potential distribution on the surface of the secondary part, are compared with numerical and experimental ones for discussion and validation.

  7. Analysis of Occupational Accidents in Underground and Surface Mining in Spain Using Data-Mining Techniques

    PubMed Central

    Sanmiquel, Lluís; Bascompta, Marc; Rossell, Josep M.; Anticoi, Hernán Francisco; Guash, Eduard

    2018-01-01

    An analysis of occupational accidents in the mining sector was conducted using the data from the Spanish Ministry of Employment and Social Safety between 2005 and 2015, and data-mining techniques were applied. Data was processed with the software Weka. Two scenarios were chosen from the accidents database: surface and underground mining. The most important variables involved in occupational accidents and their association rules were determined. These rules are composed of several predictor variables that cause accidents, defining its characteristics and context. This study exposes the 20 most important association rules in the sector—either surface or underground mining—based on the statistical confidence levels of each rule as obtained by Weka. The outcomes display the most typical immediate causes, along with the percentage of accidents with a basis in each association rule. The most important immediate cause is body movement with physical effort or overexertion, and the type of accident is physical effort or overexertion. On the other hand, the second most important immediate cause and type of accident are different between the two scenarios. Data-mining techniques were chosen as a useful tool to find out the root cause of the accidents. PMID:29518921
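
    The support and confidence statistics on which such rule rankings rest can be illustrated on a toy accident table (the variables and values below are invented for the example, not taken from the Spanish database):

```python
# Each record is a set of attribute=value items, mimicking the predictor
# variables (cause, accident type, scenario) used in the study.
records = [
    {"cause=overexertion", "type=physical_effort", "site=underground"},
    {"cause=overexertion", "type=physical_effort", "site=surface"},
    {"cause=falling_object", "type=blow", "site=underground"},
    {"cause=overexertion", "type=physical_effort", "site=surface"},
    {"cause=falling_object", "type=physical_effort", "site=surface"},
]

def support(itemset):
    """Fraction of records containing every item in the set."""
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent):
    """Confidence of the rule antecedent -> consequent: among accidents
    matching the antecedent, the share that also match the consequent.
    This is the statistic used to rank association rules."""
    return support(antecedent | consequent) / support(antecedent)

rule_conf = confidence({"cause=overexertion"}, {"type=physical_effort"})
```

    Ranking all candidate antecedent/consequent pairs by this confidence value is, in essence, how the 20 most important rules in the study are selected.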

  8. Analysis of Occupational Accidents in Underground and Surface Mining in Spain Using Data-Mining Techniques.

    PubMed

    Sanmiquel, Lluís; Bascompta, Marc; Rossell, Josep M; Anticoi, Hernán Francisco; Guash, Eduard

    2018-03-07

    An analysis of occupational accidents in the mining sector was conducted using the data from the Spanish Ministry of Employment and Social Safety between 2005 and 2015, and data-mining techniques were applied. Data was processed with the software Weka. Two scenarios were chosen from the accidents database: surface and underground mining. The most important variables involved in occupational accidents and their association rules were determined. These rules are composed of several predictor variables that cause accidents, defining its characteristics and context. This study exposes the 20 most important association rules in the sector-either surface or underground mining-based on the statistical confidence levels of each rule as obtained by Weka. The outcomes display the most typical immediate causes, along with the percentage of accidents with a basis in each association rule. The most important immediate cause is body movement with physical effort or overexertion, and the type of accident is physical effort or overexertion. On the other hand, the second most important immediate cause and type of accident are different between the two scenarios. Data-mining techniques were chosen as a useful tool to find out the root cause of the accidents.

  9. A departure from cognitivism: Implications of Chomsky's second revolution in linguistics.

    PubMed

    Schoneberger, T

    2000-01-01

    In 1957 Noam Chomsky published Syntactic Structures, expressing views characterized as constituting a "revolution" in linguistics. Chomsky proposed that the proper subject matter of linguistics is not the utterances of speakers, but what speakers and listeners know. To that end, he theorized that what they know is a system of rules that underlie actual performance. This theory became known as transformational grammar. In subsequent versions of this theory, rules continued to play a dominant role. However, in 1980 Chomsky began a second revolution by proposing the elimination of rules in a new theory: the principles-and-parameters approach. Subsequent writings finalized the abandonment of rules. Given the centrality of rules to cognitivism, this paper argues that Chomsky's second revolution constitutes a departure from cognitivism.

  10. Martensitic Transformation in a β-Type Mg-Sc Alloy

    NASA Astrophysics Data System (ADS)

    Ogawa, Yukiko; Ando, Daisuke; Sutou, Yuji; Somekawa, Hidetoshi; Koike, Junichi

    2018-03-01

    Recently, we found that a Mg-Sc alloy with a bcc (β) phase exhibits superelasticity and a shape memory effect at low temperature. In this work, we examined the stress-induced and thermally induced martensitic transformation of the β-type Mg-Sc alloy and investigated the crystal structure of the thermally induced martensite phase based on in situ X-ray diffraction (XRD) measurements. The lattice constants of the martensite phase were calculated to be a = 0.3285 nm, b = 0.5544 nm, and c = 0.5223 nm under the assumption that the martensite phase has an orthorhombic structure (Cmcm). Based on the lattice correspondence between bcc and orthorhombic structures, as in the case of β-Ti shape memory alloys, we estimated the transformation strain of the β Mg-Sc alloy. As a result, the transformation strains along the 001, 011, and 111 directions in the β phase were calculated to be +5.7%, +8.8%, and +3.3%, respectively.

  11. Superfast algorithms of multidimensional discrete k-wave transforms and Volterra filtering based on superfast radon transform

    NASA Astrophysics Data System (ADS)

    Labunets, Valeri G.; Labunets-Rundblad, Ekaterina V.; Astola, Jaakko T.

    2001-12-01

    Fast algorithms for a wide class of non-separable n-dimensional (nD) discrete unitary K-transforms (DKTs) are introduced. They need fewer 1D DKTs than the classical radix-2 FFT-type approach. The method utilizes a decomposition of the nD K-transform into the product of a new nD discrete Radon transform and a set of parallel, independent 1D K-transforms. If the nD K-transform has a separable kernel (e.g., the discrete Fourier transform), our approach reduces the multiplicative complexity by a factor of n compared with the classical row/column separable approach. It is well known that an nth-order Volterra filter of a one-dimensional signal can be evaluated by an appropriate nD linear convolution. This work describes a new superfast algorithm for Volterra filtering, based on the superfast discrete Radon and Nussbaumer polynomial transforms.

  12. Beauty vector meson decay constants from QCD sum rules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucha, Wolfgang; Melikhov, Dmitri (D. V. Skobeltsyn Institute of Nuclear Physics, M. V. Lomonosov Moscow State University, Moscow)

    We present the outcomes of a very recent investigation of the decay constants of nonstrange and strange heavy-light beauty vector mesons, with special emphasis on the ratio of any such decay constant to the decay constant of the corresponding pseudoscalar meson, by means of Borel-transformed QCD sum rules. Our results suggest that both these ratios are below unity.

  13. 47 CFR 95.193 - (FRS Rule 3) Types of communications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false (FRS Rule 3) Types of communications. 95.193 Section 95.193 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PERSONAL RADIO SERVICES Family Radio Service (FRS) General Provisions § 95.193 (FRS Rule 3) Types...

  14. Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties

    NASA Astrophysics Data System (ADS)

    Xie, Tian; Grossman, Jeffrey C.

    2018-04-01

    The use of machine learning methods for accelerating the design of crystalline materials usually requires manually constructed feature vectors or complex transformation of atom coordinates to input the crystal structure, which either constrains the model to certain crystal types or makes it difficult to provide chemical insights. Here, we develop a crystal graph convolutional neural networks framework to directly learn material properties from the connection of atoms in the crystal, providing a universal and interpretable representation of crystalline materials. Our method provides a highly accurate prediction of density functional theory calculated properties for eight different properties of crystals with various structure types and compositions after being trained with 10^4 data points. Further, our framework is interpretable because one can extract the contributions from local chemical environments to global properties. Using an example of perovskites, we show how this information can be utilized to discover empirical rules for materials design.
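
    One graph-convolution update over a crystal graph can be sketched in NumPy (random features, weights, and a fixed neighbor list stand in for learned and structural quantities; a schematic of the idea, not the authors' architecture):

```python
import numpy as np

# Hypothetical 4-atom crystal graph: node features plus an adjacency list.
rng = np.random.default_rng(0)
n_atoms, n_feat = 4, 8
features = rng.normal(size=(n_atoms, n_feat))
neighbors = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
W_self = rng.normal(size=(n_feat, n_feat))
W_nbr = rng.normal(size=(n_feat, n_feat))

def conv_step(feats):
    """One graph-convolution update: each atom mixes its own feature vector
    with the mean of its neighbors', through a bounded nonlinearity."""
    out = np.empty_like(feats)
    for i, nbrs in neighbors.items():
        out[i] = np.tanh(feats[i] @ W_self + feats[nbrs].mean(axis=0) @ W_nbr)
    return out

# Pooling the updated atom features gives a fixed-length crystal descriptor
# from which a property would be regressed; per-atom contributions to this
# pooled vector are what make the representation interpretable.
crystal_vec = conv_step(features).mean(axis=0)
```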

  15. 77 FR 34115 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change Amending...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... Equities Rule 7.31(h) To Add a PL Select Order Type June 4, 2012. Pursuant to Section 19(b)(1) of the... Change The Exchange proposes to amend NYSE Arca Equities Rule 7.31(h) to add a PL Select Order type. The... add a PL Select Order type. Pursuant to NYSE Arca Equities Rule 7.31(h)(4), a Passive Liquidity (``PL...

  16. Transfer between local and global processing levels by pigeons (Columba livia) and humans (Homo sapiens) in exemplar- and rule-based categorization tasks.

    PubMed

    Aust, Ulrike; Braunöder, Elisabeth

    2015-02-01

    The present experiment investigated pigeons' and humans' processing styles-local or global-in an exemplar-based visual categorization task in which category membership of every stimulus had to be learned individually, and in a rule-based task in which category membership was defined by a perceptual rule. Group Intact was trained with the original pictures (providing both intact local and global information), Group Scrambled was trained with scrambled versions of the same pictures (impairing global information), and Group Blurred was trained with blurred versions (impairing local information). Subsequently, all subjects were tested for transfer to the 2 untrained presentation modes. Humans outperformed pigeons regarding learning speed and accuracy as well as transfer performance and showed good learning irrespective of group assignment, whereas the pigeons of Group Blurred needed longer to learn the training tasks than the pigeons of Groups Intact and Scrambled. Also, whereas humans generalized equally well to any novel presentation mode, pigeons' transfer from and to blurred stimuli was impaired. Both species showed faster learning and, for the most part, better transfer in the rule-based than in the exemplar-based task, but there was no evidence of the used processing mode depending on the type of task (exemplar- or rule-based). Whereas pigeons relied on local information throughout, humans did not show a preference for either processing level. Additional tests with grayscale versions of the training stimuli, with versions that were both blurred and scrambled, and with novel instances of the rule-based task confirmed and further extended these findings. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  17. A method of extracting impervious surface based on rule algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Shuangyun; Hong, Liang; Xu, Quanli

    2018-02-01

    The impervious surface has become an important index for evaluating urban environmental quality and measuring the level of urbanization, and remote sensing has become the main way to extract it. In this paper, a method for extracting the impervious surface based on a rule algorithm is proposed. The main idea is to use rule-based criteria that exploit the differences between the impervious surface and the other three land-cover types (water, soil, and vegetation) in the seven original bands, NDWI, and NDVI. The method consists of three steps: 1) vegetation is extracted according to the principle that vegetation reflects more strongly in the near-infrared band than in the other bands; 2) water is extracted according to its characteristic of having the highest NDWI and the lowest NDVI; 3) the impervious surface is extracted based on the fact that it has a higher NDWI value and a lower NDVI value than soil. To test the accuracy of the rule algorithm, this paper applies the linear spectral mixture decomposition algorithm, the CART algorithm, and the NDII index algorithm to extract the impervious surface from six remote sensing images of the Dianchi Lake Basin from 1999 to 2014, and then compares their accuracy with that of the rule algorithm using overall classification accuracy. The extraction accuracy of the rule-based method is found to be clearly higher than that of the other three methods.
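
    The three-step rule cascade can be sketched on per-pixel band values (the NDVI/NDWI thresholds below are hypothetical round numbers for illustration; the paper derives its own cutoffs):

```python
import numpy as np

def classify(nir, red, green):
    """Rule cascade: vegetation first (high NDVI), then water (high NDWI,
    negative NDVI), then soil (moderate NDVI); what remains is impervious."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    ndwi = (green - nir) / (green + nir + 1e-9)
    cls = np.full(nir.shape, "impervious", dtype=object)
    vegetation = ndvi > 0.3                          # step 1 (threshold assumed)
    water = (ndwi > 0.2) & (ndvi < 0.0)              # step 2 (thresholds assumed)
    soil = (~vegetation) & (~water) & (ndvi > 0.1)   # step 3 residual split
    cls[soil] = "soil"
    cls[water] = "water"
    cls[vegetation] = "vegetation"
    return cls

# One pixel of each class, reflectances chosen to trip each rule in turn:
nir = np.array([0.60, 0.05, 0.30, 0.35])
red = np.array([0.20, 0.06, 0.25, 0.28])
green = np.array([0.20, 0.30, 0.25, 0.22])
labels = classify(nir, red, green)
```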

  18. Lessons for health care rationing from the case of child B.

    PubMed

    Price, D

    1996-01-20

    More details have emerged about the child B leukaemia case with the publication of the All England Law Report on the Appeal Court decision. At the time the view was widely held that the controversy might have been avoided if the responsible health authority had consulted the public. The law report reveals, however, that the courts adopted a moral language widely at variance with that of the patient's doctor. The courts were concerned to support a utilitarian decision procedure based on calculations of the greatest overall good; the doctor was concerned with the best interests of a sick child. The doctor-patient relationship may be damaged when public consideration transforms the issue in this way. Also, the Appeal Court supported a decision which claimed to have "weighed" opposing evaluations, but it excused the health authority from describing how that weighing took place. One of the main criticisms of the utilitarian approach, however, is that weighing of this type is extremely difficult to justify. By its ruling the court has made legal challenge on the grounds of inadequate consultation virtually impossible to substantiate.

  19. Lessons for health care rationing from the case of child B.

    PubMed Central

    Price, D.

    1996-01-01

    More details have emerged about the child B leukaemia case with the publication of the All England Law Report on the Appeal Court decision. At the time the view was widely held that the controversy might have been avoided if the responsible health authority had consulted the public. The law report reveals, however, that the courts adopted a moral language widely at variance with that of the patient's doctor. The courts were concerned to support a utilitarian decision procedure based on calculations of the greatest overall good; the doctor was concerned with the best interests of a sick child. The doctor-patient relationship may be damaged when public consideration transforms the issue in this way. Also, the Appeal Court supported a decision which claimed to have "weighed" opposing evaluations, but it excused the health authority from describing how that weighing took place. One of the main criticisms of the utilitarian approach, however, is that weighing of this type is extremely difficult to justify. By its ruling the court has made legal challenge on the grounds of inadequate consultation virtually impossible to substantiate. PMID:8563539

  20. Cavernous Transformation of Portal Vein Secondary to Portal Vein Thrombosis: A Case Report

    PubMed Central

    Ramos, Radhames; Park, Yoojin; Shazad, Ghulamullah; Garcia, Christine A.; Cohen, Ronny

    2012-01-01

    There are few reported cases of cavernous transformation of the portal vein (CTPV) in adults. We present the case of a 58-year-old African American male with chronic alcohol and tobacco use who presented with a 25-day history of weakness, generalized malaise, nausea, and vomiting associated with progressively worsening anorexia and weight loss. The patient was admitted for severe anemia in conjunction with abnormal liver function tests and electrolyte abnormalities, and to rule out end-stage liver disease or hepatic malignancy. The work-up for anemia showed no significant colon abnormalities, cholecystitis, liver cirrhosis, or liver abnormalities, but could not rule out malignancy. An esophagogastroduodenoscopy (EGD) was suspicious for a mass compressing the stomach and small bowel. After further work-up, the hepatic mass was diagnosed as cavernous transformation of the portal vein, a very rare and incurable complication of portal vein thrombosis (PVT) that should be considered as one of the differential diagnoses of a hepatic mass. Keywords: Cavernous transformation of the portal vein; Portal vein thrombosis; Portal hypertension; Hyperbilirubinemia; Hepatic mass PMID:22383935

  1. Structure identification in fuzzy inference using reinforcement learning

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap

    1993-01-01

    In our previous work on the GARIC architecture, we have shown that the system can start with surface structure of the knowledge base (i.e., the linguistic expression of the rules) and learn the deep structure (i.e., the fuzzy membership functions of the labels used in the rules) by using reinforcement learning. Assuming the surface structure, GARIC refines the fuzzy membership functions used in the consequents of the rules using a gradient descent procedure. This hybrid fuzzy logic and reinforcement learning approach can learn to balance a cart-pole system and to backup a truck to its docking location after a few trials. In this paper, we discuss how to do structure identification using reinforcement learning in fuzzy inference systems. This involves identifying both surface as well as deep structure of the knowledge base. The term set of fuzzy linguistic labels used in describing the values of each control variable must be derived. In this process, splitting a label refers to creating new labels which are more granular than the original label and merging two labels creates a more general label. Splitting and merging of labels directly transform the structure of the action selection network used in GARIC by increasing or decreasing the number of hidden layer nodes.
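
    Label splitting and merging can be illustrated on triangular membership functions encoded as (left, center, right) triples; the operations below are an illustrative guess at the granularity changes described, not GARIC's actual code:

```python
def split(label):
    """Replace one label with two narrower labels spanning the same support
    (the structural change that adds a hidden node to the network)."""
    left, center, right = label
    return [(left, (left + center) / 2, center),
            (center, (center + right) / 2, right)]

def merge(a, b):
    """Combine two adjacent labels into one more general label
    (the structural change that removes a hidden node)."""
    return (min(a[0], b[0]), (a[1] + b[1]) / 2, max(a[2], b[2]))

medium = (0.2, 0.5, 0.8)
low_medium, high_medium = split(medium)
```

    Splitting a label in the term set adds a node to the action selection network's hidden layer, and merging removes one, which is how these operations transform the network structure.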

  2. A comprehensive revisit of the ρ meson with improved Monte-Carlo based QCD sum rules

    NASA Astrophysics Data System (ADS)

    Wang, Qi-Nan; Zhang, Zhu-Feng; Steele, T. G.; Jin, Hong-Ying; Huang, Zhuo-Ran

    2017-07-01

    We improve the Monte-Carlo based QCD sum rules by introducing a rigorous Hölder-inequality-determined sum rule window and a Breit-Wigner type parametrization for the phenomenological spectral function. In this improved sum rule analysis methodology, the analysis window can be determined without any assumptions on OPE convergence or the QCD continuum. Therefore, an unbiased prediction can be obtained for the phenomenological parameters (the hadronic mass, width, etc.). We test the new approach in the ρ meson channel, re-examining and including α_s corrections to dimension-4 condensates in the OPE. We obtain results highly consistent with experimental values and discuss the possible extension of this method to some other channels. Supported by NSFC (11175153, 11205093, 11347020), the Open Foundation of the Most Important Subjects of Zhejiang Province, and the K. C. Wong Magna Fund in Ningbo University; TGS is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC); Z. F. Zhang and Z. R. Huang are grateful to the University of Saskatchewan for its warm hospitality.

  3. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models

    PubMed Central

    2017-01-01

    We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927

  4. Symmetry rules for the indirect nuclear spin-spin coupling tensor revisited

    NASA Astrophysics Data System (ADS)

    Buckingham, A. D.; Pyykkö, P.; Robert, J. B.; Wiesenfeld, L.

    The symmetry rules of Buckingham and Love (1970), relating the number of independent components of the indirect spin-spin coupling tensor J to the symmetry of the nuclear sites, are shown to require modification if the two nuclei are exchanged by a symmetry operation. In that case, the anti-symmetric part of J does not transform as a second-rank polar tensor under symmetry operations that interchange the coupled nuclei and may be called an anti-tensor. New rules are derived and illustrated by simple molecular models.
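
    The decomposition behind these rules can be shown numerically: J splits into symmetric and antisymmetric parts, and only the antisymmetric part acquires an extra sign under an operation that exchanges the coupled nuclei (the tensor and operation below are arbitrary examples, not data from the paper):

```python
import numpy as np

J = np.array([[1.0, 2.0, 0.5],
              [0.0, 3.0, 1.5],
              [1.0, 0.5, 2.0]])

J_sym = 0.5 * (J + J.T)    # transforms as an ordinary second-rank polar tensor
J_anti = 0.5 * (J - J.T)   # the "anti-tensor" part discussed in the abstract

# Exchanging the coupled nuclei maps J to J.T, so it leaves J_sym fixed and
# flips the sign of J_anti on top of the usual polar-tensor transformation:
R = np.diag([1.0, -1.0, 1.0])           # example symmetry operation (a mirror)
anti_exchanged = -(R @ J_anti @ R.T)    # extra minus sign from the exchange
```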

  5. An Algebraic Approach to the Study and Optimization of the Set of Rules of a Conditional Rewrite System

    NASA Astrophysics Data System (ADS)

    Makhortov, S. D.

    2018-03-01

    An algebraic system containing the semantics of a set of rules of the conditional equational theory (or the conditional term rewriting system) is introduced. The following basic questions are considered for the given model: existence of logical closure, structure of logical closure, possibility of equivalent transformations, and construction of logical reduction. The obtained results can be applied to the analysis and automatic optimization of the corresponding set of rules. The basis for the given research is the theory of lattices and binary relations.

  6. 78 FR 16051 - Vehicle/Track Interaction Safety Standards; High-Speed and High Cant Deficiency Operations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-13

    ...FRA is amending the Track Safety Standards and Passenger Equipment Safety Standards to promote the safe interaction of rail vehicles with the track over which they operate under a variety of conditions at speeds up to 220 m.p.h. The final rule revises standards for track geometry and safety limits for vehicle response to track conditions, enhances vehicle/track qualification procedures, and adds flexibility for permitting high cant deficiency train operations through curves at conventional speeds. The rule accounts for a range of vehicle types that are currently in operation, as well as vehicle types that may likely be used in future high-speed or high cant deficiency rail operations, or both. The rule is based on the results of simulation studies designed to identify track geometry irregularities associated with unsafe wheel/rail forces and accelerations, thorough reviews of vehicle qualification and revenue service test data, and consideration of international practices.

  7. Persistent free radicals in carbon-based materials on transformation of refractory organic contaminants (ROCs) in water: A critical review.

    PubMed

    Qin, Yaxin; Li, Guiying; Gao, Yanpeng; Zhang, Lizhi; Ok, Yong Sik; An, Taicheng

    2018-06-15

    With the increasing concentrations and variety of refractory organic contaminants (ROCs) in aquatic environments, many previous reviews have systematically summarized the applications of carbon-based materials in the adsorption and catalytic degradation of ROCs, owing to their economic viability and environmentally friendly behavior. Interestingly, recent studies indicate that carbon-based materials in the natural environment can also mediate the transformation of ROCs, directly or indirectly, due to their abundant persistent free radicals (PFRs). Understanding the formation mechanisms of PFRs in carbon-based materials and their interactions with ROCs is essential to developing further applications in environmental remediation. However, there has so far been no comprehensive review of the direct and indirect removal of ROCs mediated by PFRs in amorphous, porous and crystalline carbon-based materials. This review aims to evaluate the formation mechanisms of PFRs in carbon-based materials synthesized through pyrolysis and hydrothermal carbonization processes. The influence of synthesis conditions (temperature and time) and carbon sources on the types as well as the concentrations of PFRs in carbon-based materials is also discussed. In particular, the effects of metals on the concentrations and types of PFRs in carbon-based materials are highlighted because metals are considered catalysts for the formation of PFRs. The formation mechanisms of reactive species and the further transformation mechanisms of ROCs are briefly summarized, and the surface properties of carbon-based materials, including surface area and the types and number of functional groups, are found to be the key parameters controlling their activities. However, due to the diversity and complexity of carbon-based materials, the exact relationships between the activities of carbon-based materials and PFRs remain uncertain. Finally, the existing problems and current challenges for ROC transformation with carbon-based materials are pointed out. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. The Korean Question: Is There a Role for Forward-Based American Forces in a Unified Korea?

    DTIC Science & Technology

    2003-01-01

    century king, Sejong, is credited with remarkable advances in printing and type (indeed, the first metal types, and the oldest known examples of...Press, 1962; reprint, New York: Da Capo Press, 1996), 25. 8 colonial rule over Korea with the Korean King, the mentally handicapped Sunjong, abdicating

  9. Infrared space observatory photometry of circumstellar dust in Vega-type systems

    NASA Technical Reports Server (NTRS)

    Fajardo-Acosta, S. B.; Stencel, R. E.; Backman, D. E.; Thakur, N.

    1998-01-01

    The ISOPHOT (Infrared Space Observatory Photometry) instrument onboard the Infrared Space Observatory (ISO) was used to obtain 3.6-90 micron photometry of Vega-type systems. Photometric data were calibrated with the ISOPHOT fine calibration source 1 (FCS1). Linear regression was used to derive transformations to make comparisons to ground-based and IRAS photometry systems possible. These transformations were applied to the photometry of 14 main-sequence stars. Details of these results are reported on.

  10. PID tuning rules for SOPDT systems: review and some new results.

    PubMed

    Panda, Rames C; Yu, Cheng-Ching; Huang, Hsiao-Ping

    2004-04-01

    PID controllers are widely used in industries and so many tuning rules have been proposed over the past 50 years that users are often lost in the jungle of tuning formulas. Moreover, unlike PI control, different control laws and structures of implementation further complicate the use of the PID controller. In this work, five different tuning rules are taken for study to control second-order plus dead time systems with wide ranges of damping coefficients and dead time to time constant ratios (D/tau). Four of them are based on IMC design with different types of approximations on dead time and the other on desired closed-loop specifications (i.e., specified forward transfer function). The method of handling dead time in the IMC type of design is important especially for systems with large D/tau ratios. A systematic approach was followed to evaluate the performance of controllers. The regions of applicability of suitable tuning rules are highlighted and recommendations are also given. It turns out that IMC designed with the Maclaurin series expansion type PID is a better choice for both set point and load changes for systems with D/tau greater than 1. For systems with D/tau less than 1, the desired closed-loop specification approach is favored.
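    One of the IMC variants discussed above can be sketched numerically. For a SOPDT process K·exp(-D·s)/((τ1·s+1)(τ2·s+1)), the classic IMC design with a first-order Taylor approximation of the dead time gives the PID settings below; the filter time constant `lam` is a user tuning knob, and the example numbers are invented.

    ```python
    # IMC-style PID tuning for a SOPDT process, using a first-order Taylor
    # approximation of the dead time D (one of several dead-time handling
    # choices the review compares; not presented as the recommended rule).
    def imc_pid_sopdt(K, tau1, tau2, D, lam):
        Kc   = (tau1 + tau2) / (K * (lam + D))  # controller gain
        tauI = tau1 + tau2                      # integral time
        tauD = tau1 * tau2 / (tau1 + tau2)      # derivative time
        return Kc, tauI, tauD

    Kc, tauI, tauD = imc_pid_sopdt(K=2.0, tau1=4.0, tau2=1.0, D=1.0, lam=2.0)
    print(Kc, tauI, tauD)
    ```

    Note how the dead time enters only through the gain: larger D (relative to the time constants) detunes the controller, which is why the choice of dead-time approximation matters most for large D/tau ratios.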

  11. 14 CFR 302.1 - Applicability and description of part.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... this part sets forth general rules applicable to all types of proceedings. Each of the other subparts of this part sets forth special rules applicable to the type of proceedings described in the title of... and to the rules in the subpart relating to the particular type of proceeding, if any. In addition...

  12. A hybrid intelligence approach to artifact recognition in digital publishing

    NASA Astrophysics Data System (ADS)

    Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.

    2006-02-01

    The system presented integrates rule-based and case-based reasoning for artifact recognition in Digital Publishing. In Variable Data Printing (VDP), human proofing can be prohibitively expensive since a job may contain millions of different instances, each of which may contain two types of artifacts: 1) evident defects, such as text overflow or overlapping; and 2) style-dependent artifacts, subtle defects that show up as inconsistencies with respect to the original job design. We designed a Knowledge-Based Artifact Recognition tool for document segmentation, layout understanding, artifact detection, and document design quality assessment. Document evaluation is constrained by reference to one instance of the VDP job proofed by a human expert against the remaining instances. Fundamental rules of document design are used in the rule-based component for document segmentation and layout understanding. Ambiguities in the design principles not covered by the rule-based system are analyzed by case-based reasoning, using the Nearest Neighbor algorithm, where features from previous jobs are used to detect artifacts and inconsistencies within the document layout. We used a subset of XSL-FO and assembled a set of 44 document samples. The system detected all the job layout changes, obtaining an overall average accuracy of 84.56%, with the highest accuracy, 92.82%, for overlapping and the lowest, 66.7%, for lack of white space.
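    The case-based component can be illustrated with a minimal Nearest Neighbor classifier: feature vectors from previously proofed jobs (the case base) label a new instance by its closest stored case. The feature values and labels below are invented for illustration and do not reflect the paper's actual feature set.

    ```python
    import numpy as np

    # Minimal 1-Nearest-Neighbor classifier of the kind used for the
    # case-based reasoning component. Training cases are hypothetical
    # layout-feature vectors labeled by a human proofer.
    def nearest_neighbor(train_X, train_y, x):
        dists = np.linalg.norm(train_X - x, axis=1)
        return train_y[int(np.argmin(dists))]

    train_X = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
    train_y = np.array(["artifact", "ok", "ok", "ok"][:1] * 2 + ["ok"] * 2)

    train_y = np.array(["artifact", "artifact", "ok", "ok"])
    print(nearest_neighbor(train_X, train_y, np.array([0.85, 0.15])))  # artifact
    ```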

  13. Eawag-Soil in enviPath: a new resource for exploring regulatory pesticide soil biodegradation pathways and half-life data.

    PubMed

    Latino, Diogo A R S; Wicker, Jörg; Gütlein, Martin; Schmid, Emanuel; Kramer, Stefan; Fenner, Kathrin

    2017-03-22

    Developing models for predicting microbial biotransformation pathways and half-lives of trace organic contaminants in different environments requires, as training data, easily accessible and sufficiently large collections of biotransformation data annotated with metadata on study conditions. Here, we present the Eawag-Soil package, a public database that has been developed to contain all freely accessible regulatory data on pesticide degradation in laboratory soil simulation studies for pesticides registered in the EU (282 degradation pathways, 1535 reactions, 1619 compounds and 4716 biotransformation half-life values with corresponding metadata on study conditions). We provide a thorough description of this novel data resource, and discuss important features of the pesticide soil degradation data that are relevant for model development. Most notably, the variability of half-life values for individual compounds is large and only about one order of magnitude lower than the entire range of median half-life values spanned by all compounds, demonstrating the need to consider study conditions in the development of more accurate models for biotransformation prediction. We further show how the data can be used to find missing rules relevant for predicting soil biotransformation pathways. From this analysis, eight examples of reaction types are presented that should trigger the formulation of new biotransformation rules, e.g., Ar-OH methylation, or the extension of existing rules, e.g., hydroxylation in aliphatic rings. The data were also used to explore, as an example, the dependence of half-lives of different amide pesticides on chemical class and experimental parameters. This analysis highlighted the value of considering initial transformation reactions in developing meaningful quantitative structure-biotransformation relationships (QSBRs), a novel opportunity offered by the simultaneous encoding of transformation reactions and corresponding half-lives in Eawag-Soil. Overall, Eawag-Soil provides an unprecedentedly rich collection of manually extracted and curated biotransformation data, which should be useful in a great variety of applications.
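    The variability comparison described above (per-compound half-life spread vs. the range spanned by compound medians, both in orders of magnitude) can be sketched with a few lines of arithmetic. The half-life values below are invented; only the computation pattern is the point.

    ```python
    import math
    from statistics import median

    # Hypothetical DT50 values (days) for three compounds, each measured
    # in several soils. Real Eawag-Soil data would be used in practice.
    halflives = {
        "cmpdA": [2.0, 5.0, 40.0],
        "cmpdB": [30.0, 90.0, 300.0],
        "cmpdC": [0.5, 1.0, 6.0],
    }

    def orders(vals):
        """Spread of a set of values in orders of magnitude (log10)."""
        return math.log10(max(vals) / min(vals))

    per_compound = {c: round(orders(v), 2) for c, v in halflives.items()}
    medians = [median(v) for v in halflives.values()]
    print(per_compound, round(orders(medians), 2))
    ```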

  14. The Relative Success of Recognition-Based Inference in Multichoice Decisions

    ERIC Educational Resources Information Center

    McCloy, Rachel; Beaman, C. Philip; Smith, Philip T.

    2008-01-01

    The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") is identified for a range of cue…

  15. Mapping of invasive Acacia species in Brazilian Mussununga ecosystems using high- resolution IR remote sensing data acquired with an autonomous Unmanned Aerial System (UAS)

    NASA Astrophysics Data System (ADS)

    Lehmann, Jan Rudolf Karl; Zvara, Ondrej; Prinz, Torsten

    2015-04-01

    The biological invasion of Australian Acacia species into natural ecosystems outside Australia often has a negative impact on native and endemic plant species and the related biodiversity. In Brazil, the Atlantic rainforest of Bahia and Espirito Santo forms an associated type of ecosystem, the Mussununga. Today this biologically diverse ecosystem is negatively affected by the invasion of Acacia mangium and Acacia auriculiformis, both introduced to Brazil by the agroforestry industry to increase the production of pulp and high-grade woods. In order to detect the distribution of Acacia species and to monitor the expansion of this invasion, the use of high-resolution imagery acquired with an autonomous Unmanned Aerial System (UAS) proved to be a very promising approach. In this study, two types of datasets, CIR and RGB, were collected, since each provides different information: in the CIR imagery attention was paid to spectral signatures related to plants, whereas in the RGB imagery the focus was on surface characteristics. Orthophoto-mosaics and DSM/DTM were extracted for both datasets. RGB/IHS transformations of the imagery's colour space were utilized, as well as the NDVIblue index in the case of CIR imagery, to discriminate plant associations. Next, two test areas were defined in order to validate OBIA rule sets using eCognition software. For the RGB dataset, a rule set based on elevation distinction between high vegetation (including Acacia) and low vegetation (including soils) was developed. High vegetation was classified using the Nearest Neighbour algorithm while working with the CIR dataset. The IHS information was used to mask shadows, soils and low vegetation. A further Nearest Neighbour classification was used to distinguish between Acacia and other high vegetation types. Finally, an accuracy assessment was performed using a confusion matrix. The IHS information appeared to be helpful in Acacia detection, while the surface elevation information in the RGB dataset was helpful for distinguishing between low and high vegetation types. The successful use of a fixed-wing UAS proved to be a reliable and flexible technique for acquiring ecologically sensitive data over wide areas and during extended UAS flight missions.
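    A blue-band NDVI of the kind mentioned above is a simple per-pixel ratio, (NIR - Blue) / (NIR + Blue). The sketch below assumes this common definition (sometimes written BNDVI); the study's exact formulation is not quoted, and the band arrays are synthetic.

    ```python
    import numpy as np

    # Per-pixel blue-band NDVI on a CIR orthomosaic. The small epsilon
    # guards against division by zero on dark pixels.
    def ndvi_blue(nir, blue, eps=1e-9):
        nir, blue = nir.astype(float), blue.astype(float)
        return (nir - blue) / (nir + blue + eps)

    nir  = np.array([[0.8, 0.6], [0.2, 0.4]])
    blue = np.array([[0.1, 0.2], [0.2, 0.1]])
    print(np.round(ndvi_blue(nir, blue), 3))
    ```

    Vegetated pixels (strong NIR reflectance, low blue) score close to 1, while bare soil and water score near or below 0, which is what makes such an index usable inside an OBIA rule set.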

  16. Performing transformation: reflections of a lesbian academic couple.

    PubMed

    Gibson, Michelle; Meem, Deborah T

    2005-01-01

    We experience queer literacy as a kind of collision between the traditional and the transformative. Queer literacy is an acquired literacy of transformation, where the established rules of behavior and discourse are both challenged and transcended. As a lesbian academic couple in a privileged intellectual, political, and social location, we can move out of the traditional realm (through the closet) into an otherworldly queer space where knowledge and identity are destabilized. Moving in and out of queer transformative space requires a kind of blind faith-faith that believes in what the mind can neither see nor prove.

  17. Maximum type I error rate inflation from sample size reassessment when investigators are blind to treatment labels.

    PubMed

    Żebrowska, Magdalena; Posch, Martin; Magirr, Dominic

    2016-05-30

    Consider a parallel group trial for the comparison of an experimental treatment to a control, where the second-stage sample size may depend on the blinded primary endpoint data as well as on additional blinded data from a secondary endpoint. For the setting of normally distributed endpoints, we demonstrate that this may lead to an inflation of the type I error rate if the null hypothesis holds for the primary but not the secondary endpoint. We derive upper bounds for the inflation of the type I error rate, both for trials that employ random allocation and for those that use block randomization. We illustrate the worst-case sample size reassessment rule in a case study. For both randomization strategies, the maximum type I error rate increases with the effect size in the secondary endpoint and the correlation between endpoints. The maximum inflation increases with smaller block sizes if information on the block size is used in the reassessment rule. Based on our findings, we do not question the well-established use of blinded sample size reassessment methods with nuisance parameter estimates computed from the blinded interim data of the primary endpoint. However, we demonstrate that the type I error rate control of these methods relies on the application of specific, binding, pre-planned and fully algorithmic sample size reassessment rules and does not extend to general or unplanned sample size adjustments based on blinded data. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  18. Renormalisation group corrections to neutrino mixing sum rules

    NASA Astrophysics Data System (ADS)

    Gehrlein, J.; Petcov, S. T.; Spinrath, M.; Titov, A. V.

    2016-11-01

    Neutrino mixing sum rules are common to a large class of models based on the (discrete) symmetry approach to lepton flavour. In this approach the neutrino mixing matrix U is assumed to have an underlying approximate symmetry form Ũν, which is dictated by, or associated with, the employed (discrete) symmetry. In such a setup the cosine of the Dirac CP-violating phase δ can be related to the three neutrino mixing angles in terms of a sum rule which depends on the symmetry form of Ũν. We consider five extensively discussed possible symmetry forms of Ũν: i) bimaximal (BM) and ii) tri-bimaximal (TBM) forms, the forms corresponding to iii) golden ratio type A (GRA) mixing, iv) golden ratio type B (GRB) mixing, and v) hexagonal (HG) mixing. For each of these forms we investigate the renormalisation group corrections to the sum rule predictions for δ in the cases of neutrino Majorana mass term generated by the Weinberg (dimension 5) operator added to i) the Standard Model, and ii) the minimal SUSY extension of the Standard Model.

  19. Lost in transformation? Reviving ethics of care in hospital cultures of evidence-based healthcare.

    PubMed

    Norlyk, Annelise; Haahr, Anita; Dreyer, Pia; Martinsen, Bente

    2017-07-01

    Drawing on previous empirical research, we provide an exemplary narrative to illustrate how patients have experienced hospital care organized according to evidence-based fast-track programmes. The aim of this paper was to analyse and discuss whether and how it is possible to include patients' individual perspectives in an evidence-based practice, as seen from the point of view of nursing theory. The paper highlights two conflicting courses of development. One is a course of standardization founded on evidence-based recommendations, which specify a set of rules that the patient must follow rigorously. The other is a course of democratization based on patients' involvement in care. Referring to the analysis of the narrative, we argue that, in the current implementation of evidence-based practice, the proposed involvement of patients resembles empty rhetoric. We argue that the principles and values of evidence-based medicine are being lost in the transformation into the current evidence-based hospital culture, which potentially leads to a McDonaldization of nursing practice reflected as 'one best way'. We argue for reviving ethics of care perspectives in today's evidence-based practice, as the fundamental values of nursing may potentially bridge conflicts between evidence-based practice and the ideals of patient participation, thus preventing a practice of 'McNursing'. © 2017 John Wiley & Sons Ltd.

  20. Conormal distributions in the Shubin calculus of pseudodifferential operators

    NASA Astrophysics Data System (ADS)

    Cappiello, Marco; Schulz, René; Wahlberg, Patrik

    2018-02-01

    We characterize the Schwartz kernels of pseudodifferential operators of Shubin type by means of a Fourier-Bros-Iagolnitzer transform. Based on this, we introduce as a generalization a new class of tempered distributions called Shubin conormal distributions. We study their transformation behavior, normal forms, and microlocal properties.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobottka, Marcelo, E-mail: sobottka@mtm.ufsc.br; Hart, Andrew G., E-mail: ahart@dim.uchile.cl

    Highlights: • We propose a simple stochastic model to construct primitive DNA sequences. • The model provides an explanation for Chargaff's second parity rule in primitive DNA sequences. • The model is also used to predict a novel type of strand symmetry in primitive DNA sequences. • We extend the results to bacterial DNA sequences and compare distributional properties intrinsic to the model to statistical estimates from 1049 bacterial genomes. • We find statistical evidence that the novel type of strand symmetry holds for bacterial DNA sequences. -- Abstract: Chargaff's second parity rule for short oligonucleotides states that the frequency of any short nucleotide sequence on a strand is approximately equal to the frequency of its reverse complement on the same strand. Recent studies have shown that, with the exception of organellar DNA, this parity rule generally holds for double-stranded DNA genomes and fails to hold for single-stranded genomes. While Chargaff's first parity rule is fully explained by the Watson-Crick pairing in the DNA double helix, a definitive explanation for the second parity rule has not yet been determined. In this work, we propose a model based on a hidden Markov process for approximating the distributional structure of primitive DNA sequences. Then, we use the model to provide another possible theoretical explanation for Chargaff's second parity rule, and to predict novel distributional aspects of bacterial DNA sequences.
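    The rule as stated above is directly checkable on a single strand: count each k-mer and compare its frequency with that of its reverse complement. A minimal sketch, with a toy sequence (real analyses would use whole genomes):

    ```python
    from collections import Counter

    # Check Chargaff's second parity rule on one strand: each short
    # oligonucleotide should occur about as often as its reverse complement.
    COMP = str.maketrans("ACGT", "TGCA")

    def revcomp(kmer: str) -> str:
        return kmer.translate(COMP)[::-1]

    def parity_deviation(seq: str, k: int) -> float:
        """Worst-case frequency gap between any k-mer and its reverse complement."""
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        return max(abs(counts[m] - counts[revcomp(m)]) / total for m in counts)

    seq = "ATGCGCATATGCGCAT" * 10   # toy palindromic repeat, parity holds exactly
    print(parity_deviation(seq, 2))
    ```

    For real double-stranded genomes the deviation is small but nonzero; for single-stranded genomes it is typically much larger, which is the empirical pattern the abstract refers to.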

  2. Automated microaneurysm detection in diabetic retinopathy using curvelet transform

    NASA Astrophysics Data System (ADS)

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.
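    The local-thresholding step for preliminary MA candidates can be sketched as follows: a pixel is flagged when it is darker than its neighbourhood mean by some margin, since MAs appear as small dark spots in the green channel. Window size, margin, and the test image are illustrative, not the paper's values.

    ```python
    import numpy as np

    # Naive local thresholding for dark-spot candidates. A real pipeline
    # would use a fast filtered mean; this loop keeps the idea explicit.
    def local_threshold(img: np.ndarray, win: int = 3, margin: float = 0.2):
        h, w = img.shape
        pad = win // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.zeros_like(img, dtype=bool)
        for i in range(h):
            for j in range(w):
                local_mean = padded[i:i + win, j:j + win].mean()
                out[i, j] = img[i, j] < local_mean - margin
        return out

    img = np.array([[0.9, 0.9, 0.9],
                    [0.9, 0.1, 0.9],
                    [0.9, 0.9, 0.9]])
    print(local_threshold(img))  # only the dark center pixel is flagged
    ```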

  3. Automated microaneurysm detection in diabetic retinopathy using curvelet transform.

    PubMed

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.

  4. A departure from cognitivism: Implications of Chomsky's second revolution in linguistics

    PubMed Central

    Schoneberger, Ted

    2000-01-01

    In 1957 Noam Chomsky published Syntactic Structures, expressing views characterized as constituting a “revolution” in linguistics. Chomsky proposed that the proper subject matter of linguistics is not the utterances of speakers, but what speakers and listeners know. To that end, he theorized that what they know is a system of rules that underlie actual performance. This theory became known as transformational grammar. In subsequent versions of this theory, rules continued to play a dominant role. However, in 1980 Chomsky began a second revolution by proposing the elimination of rules in a new theory: the principles-and-parameters approach. Subsequent writings finalized the abandonment of rules. Given the centrality of rules to cognitivism, this paper argues that Chomsky's second revolution constitutes a departure from cognitivism. PMID:22477214

  5. Homogeneous illusion device exhibiting transformed and shifted scattering effect

    NASA Astrophysics Data System (ADS)

    Mei, Jin-Shuo; Wu, Qun; Zhang, Kuang; He, Xun-Jun; Wang, Yue

    2016-06-01

    Based on the theory of transformation optics, a type of homogeneous illusion device exhibiting a transformed and shifted scattering effect is proposed in this paper. The constitutive parameters of the proposed device are derived, and full-wave simulations are performed to validate the electromagnetic properties of the transformed and shifted scattering effect. The simulation results show that the proposed device not only can visually shift the image of a target in two dimensions, but can also visually transform the shape of the target. It is expected that such a homogeneous illusion device could have potential applications in military camouflage and other fields of electromagnetic engineering.

  6. 13 CFR 134.201 - Scope of the rules in this subpart B.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...: (1) Where another subpart of this part, pertaining to a specific type of OHA proceeding, provides a different rule; or (2) Where another part of this chapter, pertaining to a specific type of OHA proceeding... types of OHA proceedings, the rules of practice are located as follows: (1) For appeals from size...

  7. Comment legitimer une innovation theorique en grammaire transformationelle: la theorie des traces (How to Legitimize a Theoretical Innovation in Transformational Grammar: The Trace Theory)

    ERIC Educational Resources Information Center

    Pollock, J. Y.

    1976-01-01

    Taking as an example the "trace theory" of movement rules developed at MIT, the article shows the conditions to which a theoretical innovation must conform in order to be considered legitimate in the context of transformational grammar's "Extended Standard Theory." (Text is in French.) (CDSH/AM)

  8. Continuous Cooling Transformation in Cast Duplex Stainless Steels CD3MN and CD3MWCuN

    NASA Astrophysics Data System (ADS)

    Kim, Yoon-Jun; Chumbley, L. Scott; Gleeson, Brian

    2008-04-01

    The kinetics of brittle phase transformation in cast duplex stainless steels CD3MN and CD3MWCuN was investigated under continuous cooling conditions. Cooling rates slower than 5 °C/min were obtained using a conventional tube furnace with a programmable controller. In order to obtain controlled high cooling rates, a furnace equipped to grow crystals by means of the Bridgman method was used. Samples were soaked at 1100 °C for 30 min and cooled at different rates by moving the furnace at various velocities. The velocity of the furnace movement was correlated to a continuous-cooling temperature profile for the samples. Continuous-cooling-transformation (CCT) diagrams were constructed based on experimental observations through metallographic sample preparation and optical microscopy. These are compared to calculated diagrams derived from previously determined isothermal transformation diagrams. The theoretical calculations employed a modified Johnson-Mehl-Avrami (JMA, or Avrami) equation under the assumption of the additivity rule. Rockwell hardness tests were performed to establish the correlation between hardness change and the amount of brittle phases (determined by tint etching to most likely be a combination of sigma + chi) after cooling.
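    The additivity rule used in such calculations can be sketched numerically: the isothermal incubation time tau(T) is consumed in small time steps along the cooling path, and transformation is taken to start when the consumed fractions sum to one (Scheil's rule). The tau(T) function below is invented, standing in for measured isothermal TTT data.

    ```python
    import math

    # Scheil additivity sketch for continuous cooling. tau(T) is a made-up
    # C-curve with its nose near 850 C; real work would interpolate the
    # experimentally determined isothermal diagram.
    def transformation_start_time(cooling_rate, T0=1100.0, Tmin=600.0, dt=0.01):
        t, T, consumed = 0.0, T0, 0.0
        while T > Tmin and consumed < 1.0:
            tau = 5.0 + 0.05 * abs(T - 850.0) ** 1.5  # fastest near the nose
            consumed += dt / tau
            t += dt
            T = T0 - cooling_rate * t
        return t if consumed >= 1.0 else None

    fast = transformation_start_time(cooling_rate=200.0)  # quench: no transformation
    slow = transformation_start_time(cooling_rate=5.0)    # slow cool: transforms
    print(fast, slow)
    ```

    This reproduces the qualitative CCT behaviour: fast cooling outruns the C-curve entirely (no brittle phase), while slow cooling crosses it and transformation begins.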

  9. Counting supersymmetric branes

    NASA Astrophysics Data System (ADS)

    Kleinschmidt, Axel

    2011-10-01

    Maximal supergravity solutions are revisited and classified, with particular emphasis on objects of co-dimension at most two. This class of solutions includes branes whose tension scales with xxxx. We present a group theory derivation of the counting of these objects based on the corresponding tensor hierarchies derived from E 11 and discrete T- and U-duality transformations. This provides a rationale for the wrapping rules that were recently discussed for σ ≤ 3 in the literature and extends them. Explicit supergravity solutions that give rise to co-dimension two branes are constructed and analysed.

  10. Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.

    PubMed

    Cole, Steve W; Galic, Zoran; Zack, Jerome A

    2003-09-22

    Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
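    The "Boolean conjunction of fold-induction and raw fluorescence" screening rule has a very simple shape: a transcript is called only when both thresholds hold. The threshold values and gene data below are invented for illustration; PRIM would infer the thresholds from hold-out data.

    ```python
    # Conjoint screening rule: flag a transcript as differentially expressed
    # only if BOTH the fold-change and the raw-fluorescence thresholds hold.
    # Threshold values are hypothetical, not those inferred by PRIM.
    def conjoint_rule(fold_change, raw_fluorescence,
                      min_fold=1.8, min_raw=200.0):
        return fold_change >= min_fold and raw_fluorescence >= min_raw

    genes = [("g1", 2.5, 850.0),   # strong induction, bright spot -> hit
             ("g2", 3.0, 40.0),    # big fold change but dim (unreliable)
             ("g3", 1.2, 900.0)]   # bright but barely induced

    hits = [name for name, fc, raw in genes if conjoint_rule(fc, raw)]
    print(hits)  # ['g1']
    ```

    The conjunction is what controls false positives: large fold changes on dim spots (dominated by noise) are excluded, while the lowered fold threshold on bright spots recovers true inductions that a fixed global cutoff would miss.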

  11. Representing and computing regular languages on massively parallel networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M.I.; O'Sullivan, J.A.; Boysam, B.

    1991-01-01

    This paper proposes a general method for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach first establishes the formal connection of rules to Chomsky grammars, and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum-entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. The coupling to stochastic search methods yields the all-important practical result that fully parallel stochastic cellular automata may be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor, consisting of 1024 mesh-connected bit-serial processing elements, for performing automated segmentation of electron-micrograph images.

  12. Intelligent control for modeling of real-time reservoir operation, part II: artificial neural network with operating rule curves

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John

    2005-04-01

To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge from the historical inflow data with a design objective function and from the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for optimal input-output patterns, (2) the FRB can extract the knowledge embedded in the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be made more intelligent for reservoir operation if more information (or knowledge) is involved.
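    The kind of knowledge an operating rule curve encodes can be sketched as a zone lookup: monthly upper and lower storage curves pick an operating zone, and the zone sets a release factor. The curve values and factors below are illustrative assumptions, not Shihmen reservoir data.

```python
# Sketch of rule-curve operation: storage relative to the month's upper/lower
# curves selects a zone, and each zone maps to a release multiplier on demand.
# All numbers are toy values for illustration only.

UPPER = {4: 0.85}   # month -> upper curve, as a fraction of capacity
LOWER = {4: 0.55}   # month -> lower curve

def release_factor(month, storage_ratio):
    """Return a multiplier on planned demand chosen by the rule-curve zone."""
    if storage_ratio >= UPPER[month]:
        return 1.2   # surplus zone: release extra water
    if storage_ratio <= LOWER[month]:
        return 0.7   # conservation zone: ration supply
    return 1.0       # normal zone: meet planned demand

factors = [release_factor(4, r) for r in (0.90, 0.70, 0.40)]
```

    A fuzzy rule base, as in the paper, would soften these crisp zone boundaries into overlapping membership functions.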

  13. Presynaptic ionotropic receptors controlling and modulating the rules for spike timing-dependent plasticity.

    PubMed

    Verhoog, Matthijs B; Mansvelder, Huibert D

    2011-01-01

    Throughout life, activity-dependent changes in neuronal connection strength enable the brain to refine neural circuits and learn based on experience. In line with predictions made by Hebb, synapse strength can be modified depending on the millisecond timing of action potential firing (STDP). The sign of synaptic plasticity depends on the spike order of presynaptic and postsynaptic neurons. Ionotropic neurotransmitter receptors, such as NMDA receptors and nicotinic acetylcholine receptors, are intimately involved in setting the rules for synaptic strengthening and weakening. In addition, timing rules for STDP within synapses are not fixed. They can be altered by activation of ionotropic receptors located at, or close to, synapses. Here, we will highlight studies that uncovered how network actions control and modulate timing rules for STDP by activating presynaptic ionotropic receptors. Furthermore, we will discuss how interaction between different types of ionotropic receptors may create "timing" windows during which particular timing rules lead to synaptic changes.

  14. Exact sum rules for inhomogeneous strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amore, Paolo, E-mail: paolo.amore@gmail.com

    2013-11-15

We derive explicit expressions for the sum rules of the eigenvalues of inhomogeneous strings with arbitrary density and with different boundary conditions. We show that the sum rule of order N may be obtained in terms of a diagrammatic expansion, with (N−1)!/2 independent diagrams. These sum rules are used to derive upper and lower bounds to the energy of the fundamental mode of an inhomogeneous string; we also show that it is possible to improve these approximations by taking into account the asymptotic behavior of the spectrum and applying the Shanks transformation to the sequence of approximations obtained at the different orders. We discuss three applications of these results. -- Highlights: •We derive an explicit expression for the sum rules of an inhomogeneous string. •We obtain a diagrammatic representation for the sum rules of a given order. •We obtain precise bounds on the lowest eigenvalue of the string.
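    The Shanks transformation mentioned above accelerates a slowly converging sequence via S(A_n) = (A_{n+1} A_{n-1} - A_n²) / (A_{n+1} + A_{n-1} - 2 A_n). As a self-contained check (not the paper's string spectra), it is demonstrated here on partial sums of the alternating series for ln 2.

```python
import math

# Shanks transformation of a sequence of approximations A_n.
def shanks(seq):
    return [
        (seq[i + 1] * seq[i - 1] - seq[i] ** 2)
        / (seq[i + 1] + seq[i - 1] - 2 * seq[i])
        for i in range(1, len(seq) - 1)
    ]

partial = []
s = 0.0
for n in range(1, 12):
    s += (-1) ** (n + 1) / n   # 1 - 1/2 + 1/3 - ... -> ln 2
    partial.append(s)

accelerated = shanks(partial)
err_raw = abs(partial[-1] - math.log(2))       # error of the raw partial sum
err_acc = abs(accelerated[-1] - math.log(2))   # error after one Shanks pass
```

    For alternating sequences like this the transformed error is orders of magnitude smaller, which is why the authors apply it to the sequence of eigenvalue bounds.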

  15. The Manchester Acute Coronary Syndromes (MACS) decision rule: validation with a new automated assay for heart-type fatty acid binding protein.

    PubMed

    Body, Richard; Burrows, Gillian; Carley, Simon; Lewis, Philip S

    2015-10-01

    The Manchester Acute Coronary Syndromes (MACS) decision rule may enable acute coronary syndromes to be immediately 'ruled in' or 'ruled out' in the emergency department. The rule incorporates heart-type fatty acid binding protein (h-FABP) and high sensitivity troponin T levels. The rule was previously validated using a semiautomated h-FABP assay that was not practical for clinical implementation. We aimed to validate the rule with an automated h-FABP assay that could be used clinically. In this prospective diagnostic cohort study we included patients presenting to the emergency department with suspected cardiac chest pain. Serum drawn on arrival was tested for h-FABP using an automated immunoturbidimetric assay (Randox) and high sensitivity troponin T (Roche). The primary outcome, a diagnosis of acute myocardial infarction (AMI), was adjudicated based on 12 h troponin testing. A secondary outcome, major adverse cardiac events (MACE; death, AMI, revascularisation or new coronary stenosis), was determined at 30 days. Of the 456 patients included, 78 (17.1%) had AMI and 97 (21.3%) developed MACE. Using the automated h-FABP assay, the MACS rule had the same C-statistic for MACE as the original rule (0.91; 95% CI 0.88 to 0.92). 18.9% of patients were identified as 'very low risk' and thus eligible for immediate discharge with no missed AMIs and a 2.3% incidence of MACE (n=2, both coronary stenoses). 11.1% of patients were classed as 'high-risk' and had a 92.0% incidence of MACE. Our findings validate the performance of a refined MACS rule incorporating an automated h-FABP assay, facilitating use in clinical settings. The effectiveness of this refined rule should be verified in an interventional trial prior to implementation. UK CRN 8376. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
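    The C-statistic reported for the rule (0.91 for MACE) is the concordance probability: the chance that a randomly chosen event case receives a higher risk score than a randomly chosen non-event case. A minimal sketch of that metric follows; the scores and outcomes are toy data, and the actual MACS model coefficients are not reproduced here.

```python
# Sketch of the C-statistic (equivalently, the ROC AUC) from risk scores and
# binary outcomes. Toy data only; not the MACS derivation cohort.

def c_statistic(scores, outcomes):
    """Probability a random event case outscores a random non-event case."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = concordant = ties = 0
    for p in pos:
        for n in neg:
            pairs += 1
            if p > n:
                concordant += 1
            elif p == n:
                ties += 1
    return (concordant + 0.5 * ties) / pairs

scores   = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1,   1,   0,   1,   0,   0]
auc = c_statistic(scores, outcomes)
```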

  16. Spectral analysis of the Earth's topographic potential via 2D-DFT: a new data-based degree variance model to degree 90,000

    NASA Astrophysics Data System (ADS)

    Rexer, Moritz; Hirt, Christian

    2015-09-01

    Classical degree variance models (such as Kaula's rule or the Tscherning-Rapp model) often rely on low-resolution gravity data and so are subject to extrapolation when used to describe the decay of the gravity field at short spatial scales. This paper presents a new degree variance model based on the recently published GGMplus near-global land areas 220 m resolution gravity maps (Geophys Res Lett 40(16):4279-4283, 2013). We investigate and use a 2D-DFT (discrete Fourier transform) approach to transform GGMplus gravity grids into degree variances. The method is described in detail and its approximation errors are studied using closed-loop experiments. Focus is placed on tiling, azimuth averaging, and windowing effects in the 2D-DFT method and on analytical fitting of degree variances. Approximation errors of the 2D-DFT procedure on the (spherical harmonic) degree variance are found to be at the 10-20 % level. The importance of the reference surface (sphere, ellipsoid or topography) of the gravity data for correct interpretation of degree variance spectra is highlighted. The effect of the underlying mass arrangement (spherical or ellipsoidal approximation) on the degree variances is found to be crucial at short spatial scales. A rule-of-thumb for transformation of spectra between spherical and ellipsoidal approximation is derived. Application of the 2D-DFT on GGMplus gravity maps yields a new degree variance model to degree 90,000. The model is supported by GRACE, GOCE, EGM2008 and forward-modelled gravity at 3 billion land points over all land areas within the SRTM data coverage and provides gravity signal variances at the surface of the topography. The model yields omission errors of 9 mGal for gravity (1.5 cm for geoid effects) at scales of 10 km, 4 mGal (1 mm) at 2-km scales, and 2 mGal (0.2 mm) at 1-km scales.
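    The 2D-DFT step can be sketched as transforming a gridded field and averaging the power spectrum over azimuth to obtain an isotropic, degree-variance-like 1D spectrum. This is a simplified flat-grid illustration (a random test grid stands in for the GGMplus gravity tiles); the paper's tiling, windowing and spherical-harmonic mapping are omitted.

```python
import numpy as np

# Azimuthally averaged 2D power spectrum: FFT the grid, shift the zero
# frequency to the center, then bin power by integer radial wavenumber.
def azimuthal_power_spectrum(grid):
    f = np.fft.fft2(grid)
    power = np.abs(np.fft.fftshift(f)) ** 2
    ny, nx = grid.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(y - ny // 2, x - nx // 2).astype(int)  # radial bin index
    spectrum = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return spectrum / np.maximum(counts, 1)  # mean power per radial bin

rng = np.random.default_rng(0)
grid = rng.standard_normal((64, 64))   # stand-in for a gravity tile
spec = azimuthal_power_spectrum(grid)
```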

  17. Gene Model Annotations for Drosophila melanogaster: The Rule-Benders

    PubMed Central

    Crosby, Madeline A.; Gramates, L. Sian; dos Santos, Gilberto; Matthews, Beverley B.; St. Pierre, Susan E.; Zhou, Pinglei; Schroeder, Andrew J.; Falls, Kathleen; Emmert, David B.; Russo, Susan M.; Gelbart, William M.

    2015-01-01

    In the context of the FlyBase annotated gene models in Drosophila melanogaster, we describe the many exceptional cases we have curated from the literature or identified in the course of FlyBase analysis. These range from atypical but common examples such as dicistronic and polycistronic transcripts, noncanonical splices, trans-spliced transcripts, noncanonical translation starts, and stop-codon readthroughs, to single exceptional cases such as ribosomal frameshifting and HAC1-type intron processing. In FlyBase, exceptional genes and transcripts are flagged with Sequence Ontology terms and/or standardized comments. Because some of the rule-benders create problems for handlers of high-throughput data, we discuss plans for flagging these cases in bulk data downloads. PMID:26109356

  18. A graphical, rule based robotic interface system

    NASA Technical Reports Server (NTRS)

    Mckee, James W.; Wolfsberger, John

    1988-01-01

    The ability of a human to take control of a robotic system is essential in any use of robots in space in order to handle unforeseen changes in the robot's work environment or scheduled tasks. But in cases in which the work environment is known, a human controlling a robot's every move by remote control is both time consuming and frustrating. A system is needed in which the user can give the robotic system commands to perform tasks but need not tell the system how. To be useful, this system should be able to plan and perform the tasks faster than a telerobotic system. The interface between the user and the robot system must be natural and meaningful to the user. A high level user interface program under development at the University of Alabama, Huntsville, is described. A graphical interface is proposed in which the user selects objects to be manipulated by selecting representations of the object on projections of a 3-D model of the work environment. The user may move in the work environment by changing the viewpoint of the projections. The interface uses a rule based program to transform user selection of items on a graphics display of the robot's work environment into commands for the robot. The program first determines if the desired task is possible given the abilities of the robot and any constraints on the object. If the task is possible, the program determines what movements the robot needs to make to perform the task. The movements are transformed into commands for the robot. The information defining the robot, the work environment, and how objects may be moved is stored in a set of data bases accessible to the program and displayable to the user.

  19. Identifying Novice Student Programming Misconceptions and Errors from Summative Assessments

    ERIC Educational Resources Information Center

    Veerasamy, Ashok Kumar; D'Souza, Daryl; Laakso, Mikko-Jussi

    2016-01-01

    This article presents a study aimed at examining the novice student answers in an introductory programming final e-exam to identify misconceptions and types of errors. Our study used the Delphi concept inventory to identify student misconceptions and skill, rule, and knowledge-based errors approach to identify the types of errors made by novices…

  20. Decision Tree Repository and Rule Set Based Mingjiang River Estuarine Wetlands Classification

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Li, X.; Xiao, W.

    2018-05-01

    The increasing urbanization and industrialization have led to wetland losses in the estuarine area of the Mingjiang River over the past three decades, and increasing attention has been given to producing wetland inventories using remote sensing and GIS technology. Because training sites and training samples are inconsistent across organizations, traditional pixel-based image classification methods cannot achieve comparable results; object-oriented image classification shows great potential to solve this problem, and Landsat moderate-resolution remote sensing images are widely used for this purpose. First, standardized atmospheric correction and spectrally high-fidelity texture feature enhancement were conducted before implementing the object-oriented wetland classification method in eCognition. Second, we performed the multi-scale segmentation procedure, taking the scale, hue, shape, compactness and smoothness of the image into account to obtain appropriate parameters; using a region-merge algorithm that works upward from the single-pixel level, the optimal texture segmentation scale for each feature type was confirmed. The segmented objects are then used as classification units, for which spectral information (mean, maximum, minimum, brightness and normalized values), spatial features (area, length, tightness and shape rule) and texture features (mean, variance and entropy) are computed as classification features of the training samples. Based on reference images and on-the-spot sampling points, typical training samples were selected uniformly and randomly for each class of ground objects, and the value ranges of the spectral, texture and spatial characteristics of each class in each feature layer were used to create the decision tree repository. 
Finally, with the help of high-resolution reference images, a random-sampling field investigation achieved an overall accuracy of 90.31 %, with a Kappa coefficient of 0.88. The classification method based on decision tree threshold values and the rule set developed from the repository outperforms the results obtained from the traditional methodology. Our decision-tree-repository and rule-set based object-oriented classification technique is an effective method for producing comparable and consistent wetland data sets.
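    The accuracy assessment reported above (overall accuracy and Cohen's Kappa) can be computed from a confusion matrix of field-checked samples. The confusion matrix below is illustrative, not the paper's validation data.

```python
import numpy as np

# Overall accuracy and Cohen's Kappa from a classification confusion matrix
# (rows: mapped class, columns: reference class). Toy counts for illustration.
def accuracy_and_kappa(cm):
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    observed = np.trace(cm) / total                       # overall accuracy
    expected = (cm.sum(0) * cm.sum(1)).sum() / total**2   # chance agreement
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

cm = [[50,  2,  1],
      [ 3, 45,  2],
      [ 1,  2, 44]]
acc, kappa = accuracy_and_kappa(cm)
```

    Kappa discounts the agreement expected by chance, which is why it is reported alongside raw accuracy for wetland maps.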

  1. Estimating Snow Water Storage in North America Using CLM4, DART, and Snow Radiance Data Assimilation

    NASA Technical Reports Server (NTRS)

    Kwon, Yonghwan; Yang, Zong-Liang; Zhao, Long; Hoar, Timothy J.; Toure, Ally M.; Rodell, Matthew

    2016-01-01

This paper addresses continental-scale snow estimates in North America using a recently developed snow radiance assimilation (RA) system. A series of RA experiments with the ensemble adjustment Kalman filter are conducted by assimilating the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E) brightness temperature T(sub B) at 18.7- and 36.5-GHz vertical polarization channels. The overall RA performance in estimating snow depth for North America is improved by simultaneously updating the Community Land Model, version 4 (CLM4), snow/soil states and radiative transfer model (RTM) parameters involved in predicting T(sub B) based on their correlations with the prior T(sub B) (i.e., rule-based RA), although degradations are also observed. The RA system exhibits a more mixed performance for snow cover fraction estimates. Compared to the open-loop run (0.171 m RMSE), the overall snow depth estimates are improved by 1.6% (0.168 m RMSE) in the rule-based RA, whereas the default RA (without a rule) results in a degradation of 3.6% (0.177 m RMSE). Significant improvement of the snow depth estimates in the rule-based RA is observed for the tundra snow class (11.5%, p < 0.05) and the bare soil land-cover type (13.5%, p < 0.05). However, the overall improvement is not significant (p = 0.135) because snow estimates are degraded or only marginally improved for other snow classes and land covers, especially the taiga snow class and forest land cover (7.1% and 7.3% degradations, respectively). The current RA system needs to be further refined to enhance snow estimates for various snow types and forested regions.

  2. 75 FR 32523 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-08

    ... Effectiveness of Proposed Rule Change Related to Eligible Order Types June 2, 2010. Pursuant to Section 19(b)(1... proposing to amend its rules to clarify the applicability of various order types on the Exchange. The text... 1. Purpose The Exchange proposes to modify Rule 6.53, Certain Types of Orders Defined, to clarify...

  3. Transformation Model Choice in Nonlinear Regression Analysis of Fluorescence-based Serial Dilution Assays

    PubMed Central

    Fong, Youyi; Yu, Xuesong

    2016-01-01

    Many modern serial dilution assays are based on fluorescence intensity (FI) readouts. We study optimal transformation model choice for fitting five parameter logistic curves (5PL) to FI-based serial dilution assay data. We first develop a generalized least squares-pseudolikelihood type algorithm for fitting heteroscedastic logistic models. Next we show that the 5PL and log 5PL functions can approximate each other well. We then compare four 5PL models with different choices of log transformation and variance modeling through a Monte Carlo study and real data. Our findings are that the optimal choice depends on the intended use of the fitted curves. PMID:27642502
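    The five-parameter logistic (5PL) curve at the heart of the comparison has the standard form f(x) = d + (a − d) / (1 + (x/c)^b)^g, where the fifth parameter g allows asymmetry. A minimal sketch follows; the parameter values are illustrative, not fitted estimates from the paper.

```python
import numpy as np

# The 5PL dose-response function evaluated over serial-dilution style
# concentrations. Parameter values below are illustrative assumptions.
def five_pl(x, a, d, c, b, g):
    """a: response at zero dose, d: asymptote at high dose,
    c: mid-point, b: slope, g: asymmetry (g=1 reduces to 4PL)."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

x = np.logspace(-2, 4, 7)                      # toy dilution series
y = five_pl(x, a=0.1, d=3.0, c=10.0, b=1.0, g=1.0)
```

    With g = 1 the curve reduces to the symmetric 4PL, which is one way to see why the paper finds the 5PL and log-5PL can approximate each other well over typical FI ranges.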

  4. iRODS: A Distributed Data Management Cyberinfrastructure for Observatories

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; Vernon, F.

    2007-12-01

Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long periods in time (preservation). The system needs to manage data stored on multiple types of storage systems, including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount. 
The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed not only to describe management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or extracts its metadata automatically and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm, executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As the data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub-collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and its application to long-term and large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
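    The event-condition-action pattern described above can be sketched as rules matched against events, each chaining micro-services (plain functions here). The rule, micro-service names, and context fields are illustrative assumptions, not the iRODS rule language or API.

```python
# Minimal event-condition-action (ECA) engine sketch: a rule fires when its
# event name matches and its condition holds, then runs its micro-service
# chain. Names below are illustrative, not iRODS identifiers.

class RuleEngine:
    def __init__(self):
        self.rules = []   # list of (event, condition, actions)
        self.log = []

    def add_rule(self, event, condition, actions):
        self.rules.append((event, condition, actions))

    def fire(self, event, ctx):
        for ev, cond, actions in self.rules:
            if ev == event and cond(ctx):
                for action in actions:        # micro-service chain
                    action(ctx, self.log)

def replicate(ctx, log):
    log.append(f"replicated {ctx['path']}")

def register_metadata(ctx, log):
    log.append(f"indexed {ctx['path']}")

engine = RuleEngine()
engine.add_rule("file_added",
                lambda ctx: ctx["collection"] == "/archive",
                [replicate, register_metadata])
engine.fire("file_added", {"path": "/archive/obs001.dat", "collection": "/archive"})
```

    The log doubles as the execution-tracking record the abstract emphasizes: each fired micro-service leaves auditable state.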

  5. 46 CFR 174.080 - Flooding on self-elevating and surface type units.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling Units § 174.080 Flooding on self-elevating and surface type units. (a) On a surface type unit or...

  6. 46 CFR 174.080 - Flooding on self-elevating and surface type units.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling Units § 174.080 Flooding on self-elevating and surface type units. (a) On a surface type unit or...

  7. New publicly available chemical query language, CSRML, to support chemotype representations for application to data mining and modeling.

    PubMed

    Yang, Chihae; Tarkhov, Aleksey; Marusczyk, Jörg; Bienfait, Bruno; Gasteiger, Johann; Kleinoeder, Thomas; Magdziarz, Tomasz; Sacher, Oliver; Schwab, Christof H; Schwoebel, Johannes; Terfloth, Lothar; Arvidson, Kirk; Richard, Ann; Worth, Andrew; Rathman, James

    2015-03-23

    Chemotypes are a new approach for representing molecules, chemical substructures and patterns, reaction rules, and reactions. Chemotypes are capable of integrating types of information beyond what is possible using current representation methods (e.g., SMARTS patterns) or reaction transformations (e.g., SMIRKS, reaction SMILES). Chemotypes are expressed in the XML-based Chemical Subgraphs and Reactions Markup Language (CSRML), and can be encoded not only with connectivity and topology but also with properties of atoms, bonds, electronic systems, or molecules. CSRML has been developed in parallel with a public set of chemotypes, i.e., the ToxPrint chemotypes, which are designed to provide excellent coverage of environmental, regulatory, and commercial-use chemical space, as well as to represent chemical patterns and properties especially relevant to various toxicity concerns. A software application, ChemoTyper has also been developed and made publicly available in order to enable chemotype searching and fingerprinting against a target structure set. The public ChemoTyper houses the ToxPrint chemotype CSRML dictionary, as well as reference implementation so that the query specifications may be adopted by other chemical structure knowledge systems. The full specifications of the XML-based CSRML standard used to express chemotypes are publicly available to facilitate and encourage the exchange of structural knowledge.

  8. Collaborative Wideband Compressed Signal Detection in Interplanetary Internet

    NASA Astrophysics Data System (ADS)

    Wang, Yulin; Zhang, Gengxin; Bian, Dongming; Gou, Liang; Zhang, Wei

    2014-07-01

With the development of autonomous radio in deep space networks, it has become possible to establish communication between explorers, aircraft, rovers and satellites, e.g. from different countries, adopting different signal modes. The first task of an autonomous radio is to detect the explorer's signals autonomously without disturbing the original communication. This paper develops a collaborative wideband compressed signal detection approach for the InterPlaNetary (IPN) Internet, where sparse active signals exist in the deep space environment. Compressed sensing (CS) can be utilized by exploiting the sparsity of IPN Internet communication signals, whose useful frequency support occupies only a small portion of an entirely wide spectrum. An estimate of the signal spectrum can be obtained by using reconstruction algorithms. Against deep space shadowing and channel fading, multiple satellites collaboratively sense and make a final decision according to a certain fusion rule to gain spatial diversity. A couple of novel discrete cosine transform (DCT) and Walsh-Hadamard transform (WHT) based compressed spectrum detection methods are proposed which significantly improve the performance of spectrum recovery and signal detection. Finally, extensive simulation results are presented to show the effectiveness of our proposed collaborative scheme for signal detection in the IPN Internet. Compared with the conventional discrete Fourier transform (DFT) based method, our DCT and WHT based methods reduce computational complexity, decrease processing time, save energy and enhance the probability of detection.
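    The sparsity-based detection idea can be sketched as transform-domain energy detection: project the received samples onto DCT basis vectors (a direct DCT-II here) and declare a signal present when the peak coefficient energy exceeds a threshold. The threshold, toy tone, and noise level are illustrative assumptions, not the paper's fusion scheme.

```python
import numpy as np

# Direct DCT-II: X_k = sum_m x_m * cos(pi * (m + 0.5) * k / n).
def dct2(x):
    n = len(x)
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    return (np.cos(np.pi * (m + 0.5) * k / n) * x).sum(axis=1)

rng = np.random.default_rng(1)
n = 256
noise = 0.1 * rng.standard_normal(n)
tone = np.cos(np.pi * (np.arange(n) + 0.5) * 12 / n)   # sparse in the DCT basis
received = tone + noise

coeffs = dct2(received)
# Declare detection when the peak coefficient dwarfs the typical (median) one.
detected = np.abs(coeffs).max() > 5.0 * np.median(np.abs(coeffs))
```

    A signal that is sparse in the chosen basis concentrates its energy in a few coefficients, so the peak-to-median test separates it cleanly from noise.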

  9. Research on a new fiber-optic axial pressure sensor of transformer winding based on fiber Bragg grating

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Li, Lianqing; Zhao, Lin; Wang, Jiqiang; Liu, Tongyu

    2017-12-01

Based on the principle of the fiber Bragg grating, a new type of fiber-optic pressure sensor for axial-force measurement of transformer windings is designed. The design features a bending plate-beam structure together with an optimized packaging process and sensor materials. After calibration experiments, field tests at the Taikai transformer factory show that the sensitivity of the sensor is 0.133 pm/kPa and the repeatability error is 2.7% FS. The data from fiber-optic pressure sensors at different positions remain consistent and repeatable, which can meet the requirement of real-time monitoring of the axial force of transformer windings.

  10. A rule-based shell to hierarchically organize HST observations

    NASA Technical Reports Server (NTRS)

    Bose, Ashim; Gerb, Andrew

    1995-01-01

    An observing program on the Hubble Space Telescope (HST) is described in terms of exposures that are obtained by one or more of the instruments onboard the HST. These exposures are organized into a hierarchy of structures for purposes of efficient scheduling of observations. The process by which exposures get organized into the higher-level structures is called merging. This process relies on rules to determine which observations can be 'merged' into the same higher level structure, and which cannot. The TRANSformation expert system converts proposals for astronomical observations with HST into detailed observing plans. The conversion process includes the task of merging. Within TRANS, we have implemented a declarative shell to facilitate merging. This shell offers the following features: (1) an easy way of specifying rules on when to merge and when not to merge, (2) a straightforward priority mechanism for resolving conflicts among rules, (3) an explanation facility for recording the merging history, (4) a report generating mechanism to help users understand the reasons for merging, and (5) a self-documenting mechanism that documents all the merging rules that have been defined in the shell, ordered by priority. The merging shell is implemented using an object-oriented paradigm in CLOS. It has been a part of operational TRANS (after extensive testing) since July 1993. It has fulfilled all performance expectations, and has considerably simplified the process of implementing new or changed requirements for merging. The users are pleased with its report-generating and self-documenting features.
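    The shell's priority mechanism for resolving rule conflicts can be sketched as follows: each merge rule carries a priority, and the highest-priority rule whose predicate matches two exposures decides whether they merge, with the winning rule's name retained for the explanation facility. The rule names, predicates, and exposure fields are illustrative assumptions, not TRANS internals.

```python
# Priority-ordered rule evaluation: higher priority wins; the matched rule's
# name is returned so a merging-history explanation can be recorded.

def decide_merge(rules, exp_a, exp_b):
    """rules: list of (priority, name, predicate, verdict) tuples."""
    for priority, name, pred, verdict in sorted(rules, reverse=True):
        if pred(exp_a, exp_b):
            return verdict, name
    return True, "default-merge"          # no rule objected

rules = [
    (10, "different-instrument",
     lambda a, b: a["instr"] != b["instr"], False),
    (20, "same-target-same-orbit",
     lambda a, b: a["target"] == b["target"] and a["orbit"] == b["orbit"], True),
]

verdict, why = decide_merge(rules,
                            {"instr": "WFPC2", "target": "M31", "orbit": 5},
                            {"instr": "FOS",   "target": "M31", "orbit": 5})
```

    Here the priority-20 rule overrides the priority-10 rule that would otherwise block the merge, which is exactly the conflict-resolution behavior the shell's declarative priorities provide.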

  11. Image Encryption Algorithm Based on Hyperchaotic Maps and Nucleotide Sequences Database

    PubMed Central

    2017-01-01

Image encryption technology is one of the main means to ensure the safety of image information. Using the characteristics of chaos, such as randomness, regularity, ergodicity, and initial-value sensitivity, combined with the unique spatial conformation of DNA molecules and their unique information storage and processing ability, an efficient method for image encryption based on chaos theory and a DNA sequence database is proposed. In this paper, digital image encryption transforms the image pixel gray values by scrambling pixel locations with a chaotic sequence and establishing a hyperchaotic mapping between quaternary sequences and DNA sequences, combined with the logic of transformations between DNA sequences. The bases are replaced under the displacement rules by using DNA coding over a number of iterations based on the enhanced quaternary hyperchaotic sequence, which is generated by the Chen chaotic system. The cipher feedback mode and chaos iteration are employed in the encryption process to enhance the confusion and diffusion properties of the algorithm. Theoretical analysis and experimental results show that the proposed scheme not only demonstrates excellent encryption but also effectively resists chosen-plaintext, statistical, and differential attacks. PMID:28392799
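    One DNA-coding step of such schemes can be sketched as splitting each pixel byte into 2-bit pairs and mapping them to bases under a coding rule, with the inverse rule decoding. In the full scheme the rule choice per pixel is driven by the hyperchaotic sequence; here a single fixed rule pair (an illustrative assumption) shows the mechanics.

```python
# Sketch of DNA coding for image encryption: bytes <-> 4-base strands under
# one coding rule. The rule below is one of the legal complementary mappings;
# chaotic rule selection per pixel is omitted for clarity.

RULE_1 = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DECODE_1 = {v: k for k, v in RULE_1.items()}

def byte_to_dna(byte, rule):
    return "".join(rule[(byte >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_to_byte(dna, decode):
    byte = 0
    for base in dna:
        byte = (byte << 2) | decode[base]
    return byte

pixel = 0b10_01_11_00          # gray value 156
strand = byte_to_dna(pixel, RULE_1)
restored = dna_to_byte(strand, DECODE_1)
```

    Because encode and decode are exact inverses, decryption recovers the pixel bit-for-bit once the same chaotic rule sequence is regenerated from the key.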

  12. From Cues to Nudge: A Knowledge-Based Framework for Surveillance of Healthcare-Associated Infections.

    PubMed

    Shaban-Nejad, Arash; Mamiya, Hiroshi; Riazanov, Alexandre; Forster, Alan J; Baker, Christopher J O; Tamblyn, Robyn; Buckeridge, David L

    2016-01-01

We propose an integrated semantic web framework consisting of formal ontologies, web services, a reasoner and a rule engine that together recommend the appropriate level of patient care based on defined semantic rules and guidelines. The classification of healthcare-associated infections within the HAIKU (Hospital Acquired Infections - Knowledge in Use) framework enables hospitals to consistently follow the standards along with their routine clinical practice and diagnosis coding to improve quality of care and patient safety. The HAI ontology (HAIO) groups thousands of codes into a consistent hierarchy of concepts, along with relationships and axioms to capture knowledge on hospital-associated infections and complications, with a focus on the big four types: surgical site infections (SSIs), catheter-associated urinary tract infections (CAUTIs), hospital-acquired pneumonia, and bloodstream infections. By employing statistical inferencing in our study we use a set of heuristics to define the rule axioms to improve SSI case detection. We also demonstrate how the occurrence of an SSI is identified using semantic e-triggers. The e-triggers will be used to improve our risk assessment of post-operative surgical site infections (SSIs) for patients undergoing certain types of surgeries (e.g., coronary artery bypass graft surgery (CABG)).

  13. Unsupervised Biomedical Named Entity Recognition: Experiments with Clinical and Biological Texts

    PubMed Central

    Zhang, Shaodian; Elhadad, Noémie

    2013-01-01

    Named entity recognition is a crucial component of biomedical natural language processing, enabling information extraction and ultimately reasoning over and knowledge discovery from text. Much progress has been made in the design of rule-based and supervised tools, but they are often genre and task dependent. As such, adapting them to different genres of text or identifying new types of entities requires major effort in re-annotation or rule development. In this paper, we propose an unsupervised approach to extracting named entities from biomedical text. We describe a stepwise solution to tackle the challenges of entity boundary detection and entity type classification without relying on any handcrafted rules, heuristics, or annotated data. A noun phrase chunker followed by a filter based on inverse document frequency extracts candidate entities from free text. Classification of candidate entities into categories of interest is carried out by leveraging principles from distributional semantics. Experiments show that our system, especially the entity classification step, yields competitive results on two popular biomedical datasets of clinical notes and biological literature, and outperforms a baseline dictionary match approach. Detailed error analysis provides a road map for future work. PMID:23954592
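    The candidate-filtering step described above can be sketched as scoring noun-phrase candidates by inverse document frequency and dropping near-ubiquitous phrases. The corpus, candidate list, and cutoff below are toy assumptions, not the paper's datasets.

```python
import math

# IDF-based candidate filter: phrases that occur in (almost) every document
# carry little entity signal and are discarded before classification.
docs = [
    {"the patient", "atrial fibrillation", "discharge"},
    {"the patient", "pneumonia", "discharge"},
    {"the patient", "atrial fibrillation", "warfarin"},
]

def idf(term):
    df = sum(term in d for d in docs)       # document frequency
    return math.log(len(docs) / df)

candidates = ["the patient", "atrial fibrillation", "pneumonia"]
kept = [c for c in candidates if idf(c) > 0.3]   # cutoff is an assumption
```

    The surviving candidates would then be classified into entity types by distributional-semantics similarity, the paper's second stage.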

  14. Fuzzy Control of Robotic Arm

    NASA Astrophysics Data System (ADS)

    Lin, Kyaw Kyaw; Soe, Aung Kyaw; Thu, Theint Theint

    2008-10-01

    This research work investigates a Self-Tuning Proportional Derivative (PD) type Fuzzy Logic Controller (STPDFLC) for a two-link robot system. The proposed scheme adjusts the output Scaling Factor (SF) on-line by fuzzy rules according to the current trend of the robot. The rule base for tuning the output scaling factor is defined on the error (e) and the change in error (de). The scheme is also based on the fact that the controller always tries to manipulate the process input. The rules are in the familiar if-then format. All membership functions for the controller inputs (e and de) and the controller output (UN) are defined on the common interval [-1,1], whereas the membership function for the gain-updating factor (α) is defined on [0,1]. Among the various methods for computing the crisp output of the system, the Center of Gravity (COG) method is used in this application because it gives better results. The performance of the proposed STPDFLC is compared with that of the corresponding conventional PD-type Fuzzy Logic Controller (PDFLC). The proposed scheme shows remarkably improved performance over its conventional counterpart, especially under parameter variation (payload). The results for the two-link analysis are simulated; these simulation results are illustrated using MATLAB® programming.
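
    The COG defuzzification step mentioned above reduces the aggregated fuzzy output to a crisp value as the membership-weighted mean, crisp = Σ μ(x)·x / Σ μ(x). A minimal sketch on the normalized interval [-1, 1], with made-up membership values:

```python
# Center-of-gravity (COG) defuzzification sketch. The discretized output
# universe and the aggregated membership values are invented sample data.
def cog(xs, mus):
    """Crisp output = sum(mu * x) / sum(mu); 0 if no rule fired."""
    num = sum(m * x for x, m in zip(xs, mus))
    den = sum(mus)
    return num / den if den else 0.0

xs  = [-1.0, -0.5, 0.0, 0.5, 1.0]   # output universe on [-1, 1]
mus = [ 0.0,  0.2, 0.8, 0.4, 0.0]   # aggregated membership after rule firing
print(cog(xs, mus))                 # slightly positive crisp command
```
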

  15. Criterion learning in rule-based categorization: Simulation of neural mechanism and new data

    PubMed Central

    Helie, Sebastien; Ell, Shawn W.; Filoteo, J. Vincent; Maddox, W. Todd

    2015-01-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus-dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define ‘long’ and ‘short’). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing the rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) changing the relevant rule dimension and learning a new criterion is more difficult, but also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL’s implications for future research on rule learning. PMID:25682349

  16. Criterion learning in rule-based categorization: simulation of neural mechanism and new data.

    PubMed

    Helie, Sebastien; Ell, Shawn W; Filoteo, J Vincent; Maddox, W Todd

    2015-04-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus-dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define 'long' and 'short'). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing the rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) changing the relevant rule dimension and learning a new criterion is more difficult, but also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL's implications for future research on rule learning. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Managing curriculum transformation within strict university governance structures: an example from Damascus University Medical School.

    PubMed

    Kayyal, Mohammad; Gibbs, Trevor

    2012-01-01

    As the world of medical education moves forward, it becomes increasingly clear that the transformative process is not an easy one for all. Across the globe, there appear to be many barriers that obstruct or threaten innovation and change, most of which cause almost insurmountable problems for many schools. If transformative education is to result in an equitable raising of standards across such an unlevel playing field, schools have to find ways of overcoming these barriers. One seemingly common barrier to development occurs when medical schools are trapped within strict university governance structures: rules and regulations that are frequently inappropriate and obstructive to the transformation that must occur in today's medical educational paradigm. The Faculty of Medicine at Damascus University, one of the oldest and foremost medical schools in the Middle East, is one such school, where rigid rules and regulations and traditional values are obstructing transformative change. This paper describes the problems, which the authors believe to be common to many, and explores how attempts have been made to overcome them and move the school into the twenty-first century. The ultimate purpose of this paper is to raise awareness of the issue, share the lessons learned in order to assist others who are experiencing similar problems, and possibly create opportunities for dialogue between schools.

  18. Object-based locust habitat mapping using high-resolution multispectral satellite data in the southern Aral Sea basin

    NASA Astrophysics Data System (ADS)

    Navratil, Peter; Wilps, Hans

    2013-01-01

    Three different object-based image classification techniques are applied to high-resolution satellite data for mapping the habitats of the Asian migratory locust (Locusta migratoria migratoria) in the southern Aral Sea basin, Uzbekistan. A set of panchromatic and multispectral Système Pour l'Observation de la Terre-5 satellite images was spectrally enhanced by the normalized difference vegetation index and tasseled cap transformation and segmented into image objects, which were then classified by three different approaches: a rule-based hierarchical fuzzy threshold (HFT) classification method was compared to a supervised nearest-neighbor classifier and to classification tree analysis with the quick, unbiased, efficient statistical trees algorithm. Special emphasis was laid on the discrimination of locust feeding and breeding habitats because of the significance of this discrimination for practical locust control. Field data on vegetation and land cover, collected at the time of satellite image acquisition, were used to evaluate classification accuracy. The results show that a robust HFT classifier outperformed the two automated procedures by 13% in overall accuracy. The classification method allowed a reliable discrimination of locust feeding and breeding habitats, which is of significant importance for the application of the resulting data to economically and environmentally sound control of locust pests, because exact spatial knowledge of the habitat types allows more effective surveying and use of pesticides.
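
    The spectral enhancement step above relies on the standard normalized difference vegetation index, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch on toy reflectance values (in practice the inputs are whole image bands):

```python
# NDVI sketch: per-pixel (NIR - Red) / (NIR + Red), guarding against a zero
# denominator. The reflectance values below are made-up sample data.
def ndvi(nir, red):
    return [(n - r) / (n + r) if (n + r) else 0.0 for n, r in zip(nir, red)]

nir = [0.5, 0.4, 0.1]   # near-infrared reflectance per pixel
red = [0.1, 0.2, 0.1]   # red reflectance per pixel
print(ndvi(nir, red))   # dense vegetation yields values near 1
```
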

  19. An expert system design to diagnose cancer by using a new method reduced rule base.

    PubMed

    Başçiftçi, Fatih; Avuçlu, Emre

    2018-04-01

    A Medical Expert System (MES) was developed that uses a Reduced Rule Base to diagnose cancer risk according to the symptoms in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different combinations). By checking reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, with a dynamic number of inputs and outputs on different platforms, anyone can easily test their own cancer risk. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were used to determine the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean function minimization method is used to obtain fewer cases by simplifying logical functions, so cancer can be diagnosed quickly by checking the 4 simplified output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process, a 100% gain in diagnosis speed was obtained for breast and renal cancer diagnosis, and a 99% gain for cervical and lung cancer diagnosis. With Boolean function minimization, a smaller number of rules is evaluated instead of a large number of rules. Reducing the number of rules allows the designed system to work more efficiently, saves time, and makes it easier to transfer the rules to the designed expert system. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to assess their cancer risk using the determinative risk factors, and is thereby more likely to beat the cancer through early diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
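
    The speed-up from a reduced rule base can be illustrated on a toy scale: after Boolean minimization, each rule is an implicant with don't-care positions ('-'), so one rule covers many input combinations and far fewer checks are needed than scanning all 2^n truth-table rows. The 3-input function below is a stand-in for the paper's 13-symptom table, not its actual rules.

```python
# Reduced-rule-base sketch: match inputs against minimized implicants with
# don't-cares instead of enumerating the full truth table. The 3-input
# function f(a, b, c) = a AND b is a toy stand-in for the 13-symptom case.
def matches(implicant, inputs):
    """An implicant like '11-' matches any input whose fixed bits agree."""
    return all(c == "-" or c == b for c, b in zip(implicant, inputs))

reduced_rules = ["11-"]          # one implicant covers 2 of the 8 rows

def diagnose(inputs):
    return any(matches(r, inputs) for r in reduced_rules)

full_table_size = 2 ** 3         # 8 rows if every combination were stored
print(diagnose("110"), diagnose("011"), full_table_size)
```

    With 13 symptoms the same idea replaces 2^13 = 8192 row lookups with a handful of implicant checks per output function.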

  20. Limit of validity of Ostwald's rule of stages in a statistical mechanical model of crystallization.

    PubMed

    Hedges, Lester O; Whitelam, Stephen

    2011-10-28

    We have only rules of thumb with which to predict how a material will crystallize, chief among which is Ostwald's rule of stages. It states that the first phase to appear upon transformation of a parent phase is the one closest to it in free energy. Although sometimes upheld, the rule is without theoretical foundation and is not universally obeyed, highlighting the need for microscopic understanding of crystallization controls. Here we study in detail the crystallization pathways of a prototypical model of patchy particles. The range of crystallization pathways it exhibits is richer than can be predicted by Ostwald's rule, but a combination of simulation and analytic theory reveals clearly how these pathways are selected by microscopic parameters. Our results suggest strategies for controlling self-assembly pathways in simulation and experiment.

  1. 77 FR 61449 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ...) to add a new order type, the NBBO Offset Peg Order, to the rule. The text of the proposed rule change... Change Relating to EDGX Rule 11.5 To Add a New Order Type October 2, 2012. Pursuant to Section 19(b)(1... and discussed any comments it received on the proposed rule change. The text of these statements may...

  2. 77 FR 61463 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ...) to add a new order type, the NBBO Offset Peg Order, to the rule. The text of the proposed rule change... Change Relating to EDGA Rule 11.5 To Add a New Order Type October 2, 2012. Pursuant to Section 19(b)(1... and discussed any comments it received on the proposed rule change. The text of these statements may...

  3. Proceedings of Joint RL/AFOSR Workshop on Intelligent Information Systems Held at Griffiss AFB, New York on October 22-23, 1991

    DTIC Science & Technology

    1992-04-01

    AND SCHEDULING" TIM FINN, UNIVERSITY OF MARYLAND, BALTIMORE COUNTY E. " EXTRACTING RULES FROM SOFTWARE FOR KNOWLEDGE-BASES" NOAH S. PRYWES, UNIVERSITY...Databases for Planning and Scheduling" Tim Finin, Unisys Corporation 8:30 - 9:00 " Extracting Rules from Software for Knowledge Baseso Noah Prywes, U. of...Space Requirements are Tractable E.G.: FEM, Multiplication Routines, Sorting Programs Lebmwmy fo Al Roseew d. The Ohio Male Unlversity A-2 Type 2

  4. Complexity of line-seru conversion for different scheduling rules and two improved exact algorithms for the multi-objective optimization.

    PubMed

    Yu, Yang; Wang, Sihan; Tang, Jiafu; Kaku, Ikou; Sun, Wei

    2016-01-01

    Productivity can be greatly improved by converting a traditional assembly line to a seru system, especially in a business environment with short product life cycles, uncertain product types and fluctuating production volumes. Line-seru conversion includes two decision processes, i.e., seru formation and seru load. For simplicity, however, previous studies focus on seru formation with a given scheduling rule in the seru load. We select ten scheduling rules commonly used in seru load to investigate the influence of different scheduling rules on the performance of line-seru conversion. Moreover, we clarify the complexity of line-seru conversion for the ten scheduling rules from a theoretical perspective. In addition, multi-objective decisions are often used in line-seru conversion. To obtain Pareto-optimal solutions of multi-objective line-seru conversion, we develop two improved exact algorithms based on reducing time complexity and space complexity, respectively. Compared with enumeration based on non-dominated sorting for the multi-objective problem, the two improved exact algorithms save computation time greatly. Several numerical simulation experiments are performed to show the performance improvement brought by the two proposed exact algorithms.
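
    The Pareto-optimality notion underlying the multi-objective comparison can be sketched as non-dominated filtering: a solution is kept if no other solution is at least as good on every objective and strictly better on one (both objectives minimized here). The objective vectors are invented sample data, not results from the paper.

```python
# Non-dominated (Pareto) filtering sketch for a minimization problem.
# Candidate objective vectors are toy (makespan, total labor hours) pairs.
def dominates(a, b):
    """a dominates b if a is no worse everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

sols = [(10, 4), (8, 6), (9, 5), (12, 7)]
print(pareto_front(sols))   # (12, 7) is dominated by (10, 4) and dropped
```
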

  5. Evaluation of parameters affecting switchgrass tissue culture: toward a consolidated procedure for Agrobacterium-mediated transformation of switchgrass (Panicum virgatum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Chien-Yuan; Donohoe, Bryon S.; Ahuja, Neha

    Switchgrass (Panicum virgatum), a robust perennial C4-type grass, has been evaluated and designated as a model bioenergy crop by the U.S. DOE and USDA. Conventional breeding of switchgrass biomass is difficult because it displays self-incompatible hindrance. Therefore, direct genetic modifications of switchgrass have been considered the more effective approach to tailor switchgrass with traits of interest. Successful transformations have demonstrated increased biomass yields, reduction in the recalcitrance of cell walls and enhanced saccharification efficiency. Several tissue culture protocols have been previously described to produce transgenic switchgrass lines using different nutrient-based media, co-cultivation approaches, and antibiotic strengths for selection. After evaluating the published protocols, we consolidated these approaches and optimized the process to develop a more efficient protocol for producing transgenic switchgrass. First, seed sterilization was optimized, which led to a 20% increase in yield of induced calluses. Second, we have selected an N6 macronutrient/B5 micronutrient (NB)-based medium for callus induction from mature seeds of the Alamo cultivar, and chose a Murashige and Skoog-based medium to regenerate both Type I and Type II calluses. Third, Agrobacterium-mediated transformation was adopted that resulted in 50-100% positive regenerated transformants after three rounds (2 weeks/round) of selection with antibiotic. Genomic DNA PCR, RT-PCR, Southern blot, visualization of the red fluorescent protein and histochemical β-glucuronidase (GUS) staining were conducted to confirm the positive switchgrass transformants. The optimized methods developed here provide an improved strategy to promote the production and selection of callus and generation of transgenic switchgrass lines. The process for switchgrass transformation has been evaluated and consolidated to devise an improved approach for transgenic switchgrass production. 
    With the optimization of seed sterilization, callus induction, and regeneration steps, a reliable and effective protocol is established to facilitate switchgrass engineering.

  6. Evaluation of parameters affecting switchgrass tissue culture: toward a consolidated procedure for Agrobacterium-mediated transformation of switchgrass (Panicum virgatum)

    DOE PAGES

    Lin, Chien-Yuan; Donohoe, Bryon S.; Ahuja, Neha; ...

    2017-12-19

    Switchgrass (Panicum virgatum), a robust perennial C4-type grass, has been evaluated and designated as a model bioenergy crop by the U.S. DOE and USDA. Conventional breeding of switchgrass biomass is difficult because it displays self-incompatible hindrance. Therefore, direct genetic modifications of switchgrass have been considered the more effective approach to tailor switchgrass with traits of interest. Successful transformations have demonstrated increased biomass yields, reduction in the recalcitrance of cell walls and enhanced saccharification efficiency. Several tissue culture protocols have been previously described to produce transgenic switchgrass lines using different nutrient-based media, co-cultivation approaches, and antibiotic strengths for selection. After evaluating the published protocols, we consolidated these approaches and optimized the process to develop a more efficient protocol for producing transgenic switchgrass. First, seed sterilization was optimized, which led to a 20% increase in yield of induced calluses. Second, we have selected an N6 macronutrient/B5 micronutrient (NB)-based medium for callus induction from mature seeds of the Alamo cultivar, and chose a Murashige and Skoog-based medium to regenerate both Type I and Type II calluses. Third, Agrobacterium-mediated transformation was adopted that resulted in 50-100% positive regenerated transformants after three rounds (2 weeks/round) of selection with antibiotic. Genomic DNA PCR, RT-PCR, Southern blot, visualization of the red fluorescent protein and histochemical β-glucuronidase (GUS) staining were conducted to confirm the positive switchgrass transformants. The optimized methods developed here provide an improved strategy to promote the production and selection of callus and generation of transgenic switchgrass lines. The process for switchgrass transformation has been evaluated and consolidated to devise an improved approach for transgenic switchgrass production. 
    With the optimization of seed sterilization, callus induction, and regeneration steps, a reliable and effective protocol is established to facilitate switchgrass engineering.

  7. Mechanism and microstructures in Ga2O3 pseudomartensitic solid phase transition.

    PubMed

    Zhu, Sheng-Cai; Guan, Shu-Hui; Liu, Zhi-Pan

    2016-07-21

    Solid-to-solid phase transitions, although widely exploited in making new materials, persistently challenge our current theory for predicting their complex kinetics and the rich microstructures formed in transition. The Ga2O3 α-β phase transformation represents such a common but complex reaction, with a marked change in cation coordination and crystal density, which was known to yield either amorphous or crystalline products under different synthetic conditions. Here, via the recently developed stochastic surface walking (SSW) method, we resolve for the first time the atomistic mechanism of the Ga2O3 α-β phase transformation, the pathway of which turns out to be the first reaction pathway ever determined for a new type of diffusionless solid phase transition, namely, pseudomartensitic phase transition. We demonstrate that the sensitivity of product crystallinity is caused by its multi-step, multi-type reaction pathway, which passes through seven intermediate phases and involves all types of elementary solid-phase-transition steps, i.e. the shearing of O layers (martensitic type), the local diffusion of Ga atoms (reconstructive type) and significant lattice dilation (dilation type). While the migration of Ga atoms across the close-packed O layers is the rate-determining step and yields "amorphous-like" high-energy intermediates, the shearing of O layers contributes to the formation of coherent biphase junctions and the presence of a crystallographic orientation relation, (001)α//(201̄)β + [120]α//[13̄2]β. Our experiment using high-resolution transmission electron microscopy further confirms the theoretical predictions on the atomic structure of the biphase junction and the formation of the (201̄)β twin, and also discovers the late occurrence of lattice expansion in the nascent β phase that grows out from the parent α phase. 
    By distinguishing the pseudomartensitic transition from other types of mechanisms, we propose general rules to predict the product crystallinity of solid phase transitions. The new knowledge on the kinetics of pseudomartensitic transition complements the theory of diffusionless solid phase transitions.

  8. Fusion of infrared and visible images based on saliency scale-space in frequency domain

    NASA Astrophysics Data System (ADS)

    Chen, Yanfei; Sang, Nong; Dan, Zhiping

    2015-12-01

    A fusion algorithm for infrared and visible images based on saliency scale-space in the frequency domain is proposed. Human attention is directed towards salient targets, which carry the most important information in the image. For the given registered infrared and visible images, visual features are first extracted to obtain the input hypercomplex matrix. Second, the Hypercomplex Fourier Transform (HFT) is used to obtain the salient regions of the infrared and visible images respectively: the amplitude spectrum of the input hypercomplex matrix is convolved with a low-pass Gaussian kernel at an appropriate scale, which is equivalent to an image saliency detector. The saliency maps are obtained by reconstructing the 2D signal from the original phase and the filtered amplitude spectrum, at a scale selected by minimizing the saliency-map entropy. Third, the salient regions are fused with adaptive weighting fusion rules, and the non-salient regions are fused with a rule based on region energy (RE) and region sharpness (RS); the fused image is then obtained. Experimental results show that the presented algorithm preserves the rich spectral information of the visible image and effectively captures the thermal target information at different scales of the infrared image.
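
    The mechanics of the frequency-domain saliency step can be illustrated on a single grayscale channel: take the FFT, smooth the amplitude spectrum with a Gaussian kernel (one scale of the scale-space), and reconstruct with the original phase. This is only a sketch of the idea; the paper's method operates on a hypercomplex (multi-feature) matrix and selects the scale by entropy minimization.

```python
# Single-channel sketch of frequency-domain saliency: smooth the amplitude
# spectrum at one Gaussian scale, keep the original phase, reconstruct.
# The 16x16 test image and sigma are toy choices.
import numpy as np

def saliency(img, sigma=2.0):
    f = np.fft.fft2(img)
    amplitude, phase = np.abs(f), np.angle(f)
    # Gaussian kernel with wrap-around distances, so that multiplying FFTs
    # below performs a circular convolution over the amplitude spectrum.
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    y = np.minimum(y, h - y)
    x = np.minimum(x, w - x)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    g /= g.sum()
    smooth_amp = np.real(np.fft.ifft2(np.fft.fft2(amplitude) * np.fft.fft2(g)))
    # Reconstruct with smoothed amplitude and original phase; square for energy.
    return np.abs(np.fft.ifft2(smooth_amp * np.exp(1j * phase))) ** 2

img = np.zeros((16, 16))
img[6:10, 6:10] = 1.0            # a small bright block on a flat background
sal = saliency(img)
print(sal.shape)                 # saliency map, same size as the input
```
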

  9. Hierarchical graphs for rule-based modeling of biochemical systems

    PubMed Central

    2011-01-01

    Background In rule-based modeling, graphs are used to represent molecules: a colored vertex represents a component of a molecule, a vertex attribute represents the internal state of a component, and an edge represents a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions. A rule that specifies addition (removal) of an edge represents a class of association (dissociation) reactions, and a rule that specifies a change of a vertex attribute represents a class of reactions that affect the internal state of a molecular component. A set of rules comprises an executable model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Results For purposes of model annotation, we propose the use of hierarchical graphs to represent structural relationships among components and subcomponents of molecules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR) complex. We also show that computational methods developed for regular graphs can be applied to hierarchical graphs. In particular, we describe a generalization of Nauty, a graph isomorphism and canonical labeling algorithm. The generalized version of the Nauty procedure, which we call HNauty, can be used to assign canonical labels to hierarchical graphs or more generally to graphs with multiple edge types. The difference between the Nauty and HNauty procedures is minor, but for completeness, we provide an explanation of the entire HNauty algorithm. 
Conclusions Hierarchical graphs provide more intuitive formal representations of proteins and other structured molecules with multiple functional components than do the regular graphs of current languages for specifying rule-based models, such as the BioNetGen language (BNGL). Thus, the proposed use of hierarchical graphs should promote clarity and better understanding of rule-based models. PMID:21288338
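
    The canonical-labeling idea behind HNauty can be illustrated on the much simpler special case of rooted, colored trees (a component/subcomponent hierarchy): two trees receive the same canonical string exactly when they are isomorphic. This AHU-style sketch is not the HNauty algorithm, which handles general graphs with multiple edge types, and the component names below are hypothetical.

```python
# Canonical form of a rooted colored tree: encode each node as its color
# plus the sorted encodings of its subtrees, so child order is irrelevant.
def canon(color, children):
    """children: list of (color, children) subtrees."""
    return "(" + color + "".join(sorted(canon(c, ch) for c, ch in children)) + ")"

# Two descriptions of the same hypothetical protein hierarchy, with the
# children listed in different orders:
t1 = ("Lck", [("SH2", []), ("SH3", []), ("kinase", [("lobeN", []), ("lobeC", [])])])
t2 = ("Lck", [("kinase", [("lobeC", []), ("lobeN", [])]), ("SH3", []), ("SH2", [])])
print(canon(*t1) == canon(*t2))  # True
```
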

  10. Structure and Stoichiometry of MgxZny in Hot-Dipped Zn-Mg-Al Coating Layer on Interstitial-Free Steel

    NASA Astrophysics Data System (ADS)

    Kim, Jaenam; Lee, Chongsoo; Jin, Youngsool

    2018-03-01

    Correlations of stoichiometry and phase structure of MgxZny in hot-dipped Zn-Mg-Al coating layer which were modified by additive element have been established on the bases of diffraction and phase transformation principles. X-ray diffraction (XRD) results showed that MgxZny in the Zn-Mg-Al coating layers consist of Mg2Zn11 and MgZn2. The additive elements had a significant effect on the phase fraction of Mg2Zn11 while the Mg/Al ratio had a negligible effect. Transmission electron microscope (TEM) assisted selected area electron diffraction (SAED) results of small areas MgxZny were indexed dominantly as MgZn2 which have different Mg/Zn stoichiometry between 0.10 and 0.18. It is assumed that the MgxZny have deviated stoichiometry of the phase structure with additive element. The deviated Mg2Zn11 phase structure was interpreted as base-centered orthorhombic by applying two theoretical validity: a structure factor rule explained why the base-centered orthorhombic Mg2Zn11 has less reciprocal lattice reflections in the SAED compared to hexagonal MgZn2, and a phase transformation model elucidated its reasonable lattice point sharing of the corresponding unit cell during hexagonal MgZn2 (a, b = 0.5252 nm, c = 0.8577 nm) transform to intermediate tetragonal and final base-centered orthorhombic Mg2Zn11 (a = 0.8575 nm, b = 0.8874 nm, c = 0.8771 nm) in the equilibrium state.

  11. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.
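
    The grammar-driven idea can be sketched as rule rewriting: a non-terminal building element is expanded until only terminal shape primitives remain, which are then instantiated as 3D geometry. The symbols and rules below are invented for illustration; in the paper the grammar and rules are selected per segmented dataset.

```python
# Minimal formal-grammar sketch: rewrite non-terminals with their rule
# bodies until only terminal shape primitives remain. Symbols are invented.
rules = {
    "Building": ["Facade", "Roof"],
    "Facade":   ["Wall", "Window", "Wall"],
}

def derive(symbol):
    if symbol not in rules:          # terminal -> emit the shape primitive
        return [symbol]
    out = []
    for s in rules[symbol]:
        out.extend(derive(s))
    return out

print(derive("Building"))            # flat list of primitives to instantiate
```
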

  12. Improving the quality of the ECG signal by filtering in wavelet transform domain

    NASA Astrophysics Data System (ADS)

    Dzierżak, Róża; Surtel, Wojciech; Dzida, Grzegorz; Maciejewski, Marcin

    2016-09-01

    The article concerns methods of reducing the noise occurring in ECG signals. The method is based on filtering in the wavelet transform domain. The study was conducted on two types of signal: one recorded while the patient was at rest and one recorded during physical activity. For each signal, three types of filtering were used. The study was designed to determine the effectiveness of various wavelets for de-noising the signals obtained in both cases. The results confirm the suitability of the method for improving the quality of the electrocardiogram for both types of signal.
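
    The wavelet-domain filtering pipeline (transform, threshold the detail coefficients, invert) can be sketched with a one-level Haar transform. This is only an illustration of the mechanics: the study compares several mother wavelets and filter variants, and the toy signal and threshold below are invented.

```python
# One-level Haar wavelet denoising sketch: forward transform, soft-threshold
# the detail coefficients, inverse transform. Signal and threshold are toy.
def haar_forward(x):
    approx = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def soft(coeffs, t):
    """Soft thresholding: shrink magnitudes by t, zeroing small coefficients."""
    return [max(abs(v) - t, 0.0) * (1 if v >= 0 else -1) for v in coeffs]

signal = [1.0, 1.1, 2.0, 1.9, 1.0, 0.9, 0.0, 0.1]   # ECG-like toy samples
a, d = haar_forward(signal)
denoised = haar_inverse(a, soft(d, t=0.06))
print(denoised)   # small pairwise jitter is removed, trend is kept
```
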

  13. Quality evaluation of LC-MS/MS-based E. coli H antigen typing (MS-H) through label-free quantitative data analysis in a clinical sample setup.

    PubMed

    Cheng, Keding; Sloan, Angela; McCorrister, Stuart; Peterson, Lorea; Chui, Huixia; Drebot, Mike; Nadon, Celine; Knox, J David; Wang, Gehua

    2014-12-01

    The need for rapid and accurate H typing is evident during Escherichia coli outbreak situations. This study explores the transition of MS-H, a method originally developed for rapid H antigen typing of E. coli using LC-MS/MS of flagella digests of reference strains and some clinical strains, to E. coli isolates in a clinical scenario, through quantitative analysis and method validation. Motile and nonmotile strains were examined in batches to simulate a clinical sample scenario. Various LC-MS/MS batch run procedures and MS-H typing rules were compared and summarized through quantitative analysis of the MS-H data output to develop a standard method. Label-free quantitative analysis of MS-H typing data proved very useful for examining the quality of MS-H results and the effects of sample carryover from motile E. coli isolates. Based on this, a refined procedure and a protein identification rule specific to clinical MS-H typing were established and validated. With an LC-MS/MS batch run procedure and database search parameters unique to E. coli MS-H typing, the standard procedure maintained high accuracy and specificity in clinical situations, and its potential for use in a clinical setting was clearly established. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Broadband active electrically small superconductor antennas

    NASA Astrophysics Data System (ADS)

    Kornev, V. K.; Kolotinskiy, N. V.; Sharafiev, A. V.; Soloviev, I. I.; Mukhanov, O. A.

    2017-10-01

    A new type of broadband active electrically small antenna (ESA) based on superconducting quantum arrays (SQAs) has been proposed and developed. These antennas are capable of providing both sensing and amplification of broadband electromagnetic signals with a very high spurious-free dynamic range (SFDR)—up to 100 dB (and even more)—with high sensitivity. The frequency band can range up to tens of gigahertz, depending on Josephson junction characteristic frequency, set by fabrication. In this paper we review theoretical and experimental studies of SQAs and SQA-based antenna prototypes of both transformer and transformer-less types. The ESA prototypes evaluated were fabricated using a standard Nb process with critical current density 4.5 kA cm-2. Measured device characteristics, design issues and comparative analysis of various ESA types, as well as requirements for interfaces, are reviewed and discussed.

  15. New approaches to the estimation of the geosystem properties transformation in technogenesis

    NASA Astrophysics Data System (ADS)

    Sarapulova, G.; Fedotov, K.

    2013-03-01

    A new approach to the assessment of the environmental situation of the urbanized territories of a large city is offered. The approach is based on a complex of physical and chemical parameters adequately describing the transformation of soil properties, using a landscape-geochemical method and GIS technologies. The pollution of soil horizons by heavy metals (HM) and mineral oil (MO) in the zone of influence of dangerous industrial objects exceeds the maximum permissible concentration (MPC) by orders of magnitude. Sharp deterioration of the nitrogen and carbon regimes, an increase in alkalinity, and a decrease in the buffer capacity of the soil were revealed. The dynamics of technogenic streams and aureoles of contaminant migration in soils can result not only in further transformation of soil properties but also in closing the migration cycles of MO and HM. As a rule, this is accompanied by the formation of secondary local sites of toxic substance accumulation and anomalous geochemical fields (the lateral technogenic module). The revision of approaches to the analysis of soils under technogenesis and the development of a new system of ecological monitoring represent a challenging goal.

  16. Ontology-based data integration between clinical and research systems.

    PubMed

    Mate, Sebastian; Köpcke, Felix; Toddenroth, Dennis; Martin, Marcus; Prokosch, Hans-Ulrich; Bürkle, Thomas; Ganslandt, Thomas

    2015-01-01

    Data from the electronic medical record comprise numerous structured but uncoded elements, which are not linked to standard terminologies. Reuse of such data for secondary research purposes has gained in importance recently. However, the identification of relevant data elements and the creation of database jobs for extraction, transformation and loading (ETL) are challenging: With current methods such as data warehousing, it is not feasible to efficiently maintain and reuse semantically complex data extraction and transformation routines. We present an ontology-supported approach to overcome this challenge by making use of abstraction: Instead of defining ETL procedures at the database level, we use ontologies to organize and describe the medical concepts of both the source system and the target system. Instead of using unique, specifically developed SQL statements or ETL jobs, we define declarative transformation rules within ontologies and illustrate how these constructs can then be used to automatically generate SQL code to perform the desired ETL procedures. This demonstrates how a suitable level of abstraction may not only aid the interpretation of clinical data, but can also foster the reutilization of methods for unlocking it.
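
    The declarative rule-to-SQL idea can be illustrated with a small sketch. The rule structure, table names, and column names below are invented for illustration and are not the authors' actual ontology format; a minimal generator under those assumptions might look like this:

```python
def rule_to_sql(rule):
    """Translate one declarative mapping rule into an INSERT ... SELECT.

    The rule format is a hypothetical stand-in for the ontology
    constructs described in the abstract, not the authors' schema.
    """
    cols = ", ".join(rule["target_columns"])
    exprs = ", ".join(rule["source_expressions"])
    sql = (f"INSERT INTO {rule['target_table']} ({cols}) "
           f"SELECT {exprs} FROM {rule['source_table']}")
    if rule.get("condition"):
        sql += f" WHERE {rule['condition']}"
    return sql

# Illustrative rule: map a clinical lab table into a research observation table.
rule = {
    "target_table": "research_observation",
    "target_columns": ["patient_id", "concept_code"],
    "source_table": "clinical_lab_results",
    "source_expressions": ["pat_nr", "'LOINC:' || test_code"],
    "condition": "test_code IS NOT NULL",
}
print(rule_to_sql(rule))
```

    A real implementation would read such rules from the ontology rather than from a Python dict, and would emit complete ETL jobs instead of single statements.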

  17. Levels of integration in cognitive control and sequence processing in the prefrontal cortex.

    PubMed

    Bahlmann, Jörg; Korb, Franziska M; Gratton, Caterina; Friederici, Angela D

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) finds its neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex.

  18. Levels of Integration in Cognitive Control and Sequence Processing in the Prefrontal Cortex

    PubMed Central

    Bahlmann, Jörg; Korb, Franziska M.; Gratton, Caterina; Friederici, Angela D.

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) finds its neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex. PMID:22952762

  19. Relaxing the rule of ten events per variable in logistic and Cox regression.

    PubMed

    Vittinghoff, Eric; McCulloch, Charles E

    2007-03-15

    The rule of thumb that logistic and Cox models should be used with a minimum of 10 outcome events per predictor variable (EPV), based on two simulation studies, may be too conservative. The authors conducted a large simulation study of other influences on confidence interval coverage, type I error, relative bias, and other model performance measures. They found a range of circumstances in which coverage and bias were within acceptable levels despite less than 10 EPV, as well as other factors that were as influential as or more influential than EPV. They conclude that this rule can be relaxed, in particular for sensitivity analyses undertaken to demonstrate adequate control of confounding.

  20. Characteristics of knowledge content in a curated online evidence library.

    PubMed

    Varada, Sowmya; Lacson, Ronilda; Raja, Ali S; Ip, Ivan K; Schneider, Louise; Osterbur, David; Bain, Paul; Vetrano, Nicole; Cellini, Jacqueline; Mita, Carol; Coletti, Margaret; Whelan, Julia; Khorasani, Ramin

    2018-05-01

    To describe types of recommendations represented in a curated online evidence library, report on the quality of evidence-based recommendations pertaining to diagnostic imaging exams, and assess underlying knowledge representation. The evidence library is populated with clinical decision rules, professional society guidelines, and locally developed best practice guidelines. Individual recommendations were graded based on a standard methodology and compared using chi-square test. Strength of evidence ranged from grade 1 (systematic review) through grade 5 (recommendations based on expert opinion). Finally, variations in the underlying representation of these recommendations were identified. The library contains 546 individual imaging-related recommendations. Only 15% (16/106) of recommendations from clinical decision rules were grade 5 vs 83% (526/636) from professional society practice guidelines and local best practice guidelines that cited grade 5 studies (P < .0001). Minor head trauma, pulmonary embolism, and appendicitis were topic areas supported by the highest quality of evidence. Three main variations in underlying representations of recommendations were "single-decision," "branching," and "score-based." Most recommendations were grade 5, largely because studies to test and validate many recommendations were absent. Recommendation types vary in amount and complexity and, accordingly, the structure and syntax of statements they generate. However, they can be represented in single-decision, branching, and score-based representations. In a curated evidence library with graded imaging-based recommendations, evidence quality varied widely, with decision rules providing the highest-quality recommendations. The library may be helpful in highlighting evidence gaps, comparing recommendations from varied sources on similar clinical topics, and prioritizing imaging recommendations to inform clinical decision support implementation.

  1. 78 FR 54504 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... Change To Amend the Commentary to Rule 1080 To Add a New PIXL ISO Order Type August 28, 2013. Pursuant to... 1080 to add a new PIXL ISO order type. The text of the proposed rule change is available on the... Exchange proposes to amend the Commentary to Rule 1080 to add a new PIXL ISO order type. PIXL The price...

  2. 77 FR 313 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-04

    ... Change Deleting NYSE Arca Equities Rule 7.31(w)(1) to Remove the PNP Plus Order Type December 28, 2011... Equities Rule 7.31(w)(1) to remove the PNP Plus Order type. The text of the proposed rule change is...(w)(1) to remove the PNP (Post No Preference) Plus order type. By its terms, a PNP Order is a limit...

  3. Age-Related Differences in Contribution of Rule-Based Thinking toward Moral Evaluations.

    PubMed

    Caravita, Simona C S; De Silva, Lindamulage N; Pagani, Vera; Colombo, Barbara; Antonietti, Alessandro

    2017-01-01

    This study aims to investigate the interplay of different criteria of moral evaluation, related to the type of rule and to context characteristics, in the moral reasoning of children, early adolescents, and late adolescents. Students attending fourth, seventh, and tenth grade were asked to evaluate the acceptability of rule-breaking actions using ad hoc scenarios. Results suggest that the role of different moral evaluation criteria changes with age. During adolescence a greater integration of the moral criteria emerged. Moreover, adolescents also prioritized evaluating violations of moral rules (forbidding harm to others) as non-acceptable when the perpetrator harms an innocent victim by applying direct personal force. The relevance of these findings for understanding how moral reasoning changes with age, and for assessing impairments in the moral reasoning of non-normative groups, is also discussed.

  4. Rewrite Systems, Pattern Matching, and Code Generation

    DTIC Science & Technology

    1988-06-09

    Transformations. Quien a buen árbol se arrima, buena sombra le cobija (whoever leans against a good tree is sheltered by good shade) [Old Spanish Saying]. Trees are hierarchical mathematical objects. Their...subtrees of a tree may match one or more rewrite rules. Traditional research in term rewrite systems is concerned with determining if a given system...be simulated by sets of rewrite rules. Non-local conditions are described in an awkward way since the only way to transmit information is indirectly

  5. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    PubMed

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
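
    The contrast between Wald-type and profile-likelihood intervals can be sketched in a one-parameter toy setting (a binomial proportion, not an IRT model); the inversion of the likelihood-ratio statistic below is the same device the article applies to IRT parameters:

```python
import numpy as np

def profile_likelihood_ci(k, n, crit=3.841):
    """95% profile-likelihood CI for a binomial proportion p, given k
    successes in n trials.

    The interval is the set of p for which the likelihood-ratio statistic
    2*(logL(p_hat) - logL(p)) stays below the chi-square(1) cutoff.
    """
    p_hat = k / n
    def loglik(p):
        return k * np.log(p) + (n - k) * np.log(1.0 - p)
    grid = np.linspace(1e-6, 1.0 - 1e-6, 100_000)
    inside = grid[2.0 * (loglik(p_hat) - loglik(grid)) <= crit]
    return inside[0], inside[-1]

lo, hi = profile_likelihood_ci(k=7, n=50)
# For comparison, the Wald interval p_hat +/- 1.96*sqrt(p_hat*(1-p_hat)/n)
# is symmetric about p_hat; the profile interval is asymmetric for small counts.
```

    For multi-parameter models such as IRT, the same construction applies with the other parameters re-optimized ("profiled out") at each grid value.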

  6. The Plant Ionome Revisited by the Nutrient Balance Concept

    PubMed Central

    Parent, Serge-Étienne; Parent, Léon Etienne; Egozcue, Juan José; Rozane, Danilo-Eduardo; Hernandes, Amanda; Lapointe, Line; Hébert-Gentile, Valérie; Naess, Kristine; Marchand, Sébastien; Lafond, Jean; Mattos, Dirceu; Barlow, Philip; Natale, William

    2013-01-01

    Tissue analysis is commonly used in ecology and agronomy to portray plant nutrient signatures. Nutrient concentration data, or ionomes, belong to the compositional data class, i.e., multivariate data that are proportions of some whole, hence carrying important numerical properties. Statistics computed across raw or ordinary log-transformed nutrient data are intrinsically biased, hence possibly leading to wrong inferences. Our objective was to present a sound and robust approach based on a novel nutrient balance concept to classify plant ionomes. We analyzed leaf N, P, K, Ca, and Mg of two wild and six domesticated fruit species from Canada, Brazil, and New Zealand sampled during reproductive stages. Nutrient concentrations were (1) analyzed without transformation, (2) ordinary log-transformed as commonly but incorrectly applied in practice, (3) additive log-ratio (alr) transformed as surrogate to stoichiometric rules, and (4) converted to isometric log-ratios (ilr) arranged as sound nutrient balance variables. Raw concentration and ordinary log transformation both led to biased multivariate analysis due to redundancy between interacting nutrients. The alr- and ilr-transformed data provided unbiased discriminant analyses of plant ionomes, where wild and domesticated species formed distinct groups and the ionomes of species and cultivars were differentiated without numerical bias. The ilr nutrient balance concept is preferable to alr, because the ilr technique projects the most important interactions between nutrients into a convenient Euclidean space. This novel numerical approach allows rectifying historical biases and supervising phenotypic plasticity in plant nutrition studies. PMID:23526060
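
    The alr and ilr transformations used above have compact definitions; the sketch below (with made-up leaf-nutrient values, not the study's data) shows one standard construction of ilr coordinates as sequential balances:

```python
import numpy as np

def closure(x):
    """Rescale a composition so its parts sum to one."""
    return x / x.sum()

def alr(x):
    """Additive log-ratio: log of each part relative to the last part."""
    return np.log(x[:-1] / x[-1])

def ilr(x):
    """Isometric log-ratio via sequential balances (one common basis choice)."""
    D = len(x)
    out = np.empty(D - 1)
    for j in range(1, D):
        g = np.exp(np.mean(np.log(x[:j])))  # geometric mean of first j parts
        out[j - 1] = np.sqrt(j / (j + 1.0)) * np.log(g / x[j])
    return out

# Hypothetical leaf concentrations (e.g. N, P, K, Ca, Mg), then closed:
leaf = closure(np.array([24.0, 2.1, 18.0, 9.0, 2.4]))
balances = ilr(leaf)  # D-1 = 4 orthonormal balance coordinates
```

    Note that ilr coordinates are scale-invariant: multiplying every part by a constant leaves them unchanged, which is why statistics computed on them avoid the closure bias described above.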

  7. A Case of Endometrioid Adenocarcinoma Originating from the Serous Surface of the Small Intestine.

    PubMed

    Makihara, Natsuko; Fujita, Ichiro; Soudaf, Hiroo; Yamamoto, Takahisa; Sashikata, Terumasa; Mukohara, Toru; Maeda, Tetsuo

    2015-09-07

    Malignant transformation of endometriosis has been extensively described in the literature. However, extragonadal endometrioid adenocarcinoma, either de novo or arising from malignant transformation of endometriosis, is rare. The present case report describes a patient with endometrioid adenocarcinoma on the serous surface of the small intestine. A 25-year-old female with no history of endometriosis was referred to our hospital with an intrapelvic tumor. An internal examination, ultrasound, and magnetic resonance imaging revealed a round mass approximately 80 mm in diameter; however, identification of the affected organ was difficult. Because we could not rule out malignancy based on the non-specific radiologic findings, laparotomy was performed. A mass with ileal adhesions was detected intraoperatively. In addition, the uterus and bilateral adnexa appeared normal. The tumor was resected with part of the ileum. Histopathology confirmed a diagnosis of endometrioid adenocarcinoma originating from the serous surface of the small intestine.

  8. Application of preprocessing filtering on Decision Tree C4.5 and rough set theory

    NASA Astrophysics Data System (ADS)

    Chan, Joseph C. C.; Lin, Tsau Y.

    2001-03-01

    This paper compares two artificial intelligence methods, the Decision Tree C4.5 and Rough Set Theory, on stock market data. The Decision Tree C4.5 is reviewed alongside Rough Set Theory. An enhanced window application is developed to facilitate pre-processing filtering by introducing feature (attribute) transformations, which allow users to input formulas and create new attributes. The application also produces three varieties of data set, with delaying, averaging, and summation. The results demonstrate the improvement from pre-processing by applying feature (attribute) transformations with Decision Tree C4.5. Moreover, the comparison between Decision Tree C4.5 and Rough Set Theory is based on clarity, automation, accuracy, dimensionality, raw data, and speed, supported by the rule sets generated by both algorithms on three different data sets.

  9. Sparingly Solvating Electrolytes for High Energy Density Lithium-Sulfur Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Lei; Curtiss, Larry A.; Zavadil, Kevin R.

    2016-07-11

    Moving to lighter and less expensive battery chemistries compared to lithium-ion requires the control of energy storage mechanisms based on chemical transformations rather than intercalation. Lithium-sulfur (Li/S) has tremendous theoretical specific energy, but contemporary approaches to control this solution-mediated, precipitation-dissolution chemistry require using large excesses of electrolyte to fully solubilize the polysulfide intermediate. Achieving reversible electrochemistry under lean electrolyte operation is the only path for Li/S to move beyond niche applications to potentially transformational performance. An emerging topic for Li/S research is the use of sparingly solvating electrolytes and the creation of design rules for discovering new electrolyte systems that fundamentally decouple electrolyte volume from reaction mechanism. This perspective presents an outlook for sparingly solvating electrolytes as the key path forward for longer-lived, high-energy-density Li/S batteries, including an overview of this promising new concept and some strategies for accomplishing it.

  10. Sparingly solvating electrolytes for high energy density Lithium–sulfur batteries

    DOE PAGES

    Cheng, Lei; Curtiss, Larry A.; Zavadil, Kevin R.; ...

    2016-07-11

    Moving to lighter and less expensive battery chemistries compared to lithium-ion requires the control of energy storage mechanisms based on chemical transformations rather than intercalation. Lithium-sulfur (Li/S) has tremendous theoretical specific energy, but contemporary approaches to control this solution-mediated, precipitation-dissolution chemistry require using large excesses of electrolyte to fully solubilize the polysulfide intermediate. Achieving reversible electrochemistry under lean electrolyte operation is the only path for Li/S to move beyond niche applications to potentially transformational performance. An emerging topic for Li/S research is the use of sparingly solvating electrolytes and the creation of design rules for discovering new electrolyte systems that fundamentally decouple electrolyte volume from reaction mechanism. Furthermore, this perspective presents an outlook for sparingly solvating electrolytes as the key path forward for longer-lived, high-energy-density Li/S batteries, including an overview of this promising new concept and some strategies for accomplishing it.

  11. Lenses that provide the transformation between two given wavefronts

    NASA Astrophysics Data System (ADS)

    Criado, C.; Alamo, N.

    2016-12-01

    We give an original method to design four types of lenses that solve the following problems: focusing a given wavefront at a given point, and performing the transformation between two arbitrary incoming and outgoing wavefronts. The method for designing the lens profiles is based on the optical properties of the envelopes of certain families of Cartesian ovals of revolution.

  12. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.

  13. On spatial mutation-selection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kondratiev, Yuri, E-mail: kondrat@math.uni-bielefeld.de; Kutoviy, Oleksandr, E-mail: kutoviy@math.uni-bielefeld.de, E-mail: kutovyi@mit.edu; Department of Mathematics, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139

    2013-11-15

    We discuss the selection procedure in the framework of mutation models. We study the regulation of stochastically developing systems based on a transformation of the initial Markov process which includes a cost functional. The transformation of the initial Markov process by the cost functional has an analytic realization in terms of a Kimura-Maruyama type equation for the time evolution of states, or in terms of the corresponding Feynman-Kac formula on the path space. The state evolution of the system, including the limiting behavior, is studied for two types of mutation-selection models.

  14. Based on a multi-agent system for multi-scale simulation and application of household's LUCC: a case study for Mengcha village, Mizhi county, Shaanxi province.

    PubMed

    Chen, Hai; Liang, Xiaoying; Li, Rui

    2013-01-01

    Multi-Agent Systems (MAS) offer a conceptual approach to including multi-actor decision making in models of land use change. Through a simulation based on MAS, this paper illustrates the application of MAS to micro-scale LUCC and reveals the transformation mechanism across scales. The paper starts with a description of the context of MAS research. It then adopts the Nested Spatial Choice (NSC) method to construct a multi-scale LUCC decision-making model, and a case study of Mengcha village, Mizhi County, Shaanxi Province is reported. Finally, the potentials and drawbacks of the approach are discussed. From our design and implementation of the MAS in the multi-scale model, a number of observations and conclusions can be drawn. (1) The use of the LUCC decision-making and multi-scale transformation framework provides, in our view, a more realistic modeling of the multi-scale decision-making process. (2) Using a continuous function, rather than a discrete function, to construct household decision-making reflects its effects more realistically. (3) Attempts have been made to analyze household interaction quantitatively, providing the premise and foundation for research on communication and learning among households. (4) The scale transformation architecture constructed in this paper helps to accumulate theory and experience for research on the interaction between micro land use decision-making and the macro land use landscape pattern. Our future research will focus on: (1) how to rationally apply the risk-aversion principle and incorporate the rule on rotation between household parcels into the model; (2) exploring methods for studying household decision-making over a long period, allowing us to bridge long-term LUCC data and short-term household decision-making; (3) researching quantitative methods and models, especially scenario analysis models that may reflect the interaction among different household types.

  15. Toward Design Principles for Diffusionless Transformations: The Frustrated Formation of Co–Co Bonds in a Low-Temperature Polymorph of GdCoSi2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinokur, Anastasiya I.; Fredrickson, Daniel C.

    Diffusionless (or displacive) phase transitions allow inorganic materials to show exquisite responsiveness to external stimuli, as is illustrated vividly by the superelasticity, shape memory, and magnetocaloric effects exhibited by martensitic materials. In this Article, we present a new diffusionless transition in the compound GdCoSi2, whose origin in frustrated bonding points toward generalizable design principles for these transformations. We first describe the synthesis of GdCoSi2 and the determination of its structure using single crystal X-ray diffraction. While previous studies based on powder X-ray diffraction assigned this compound to the simple CeNi1–xSi2 structure type (space group Cmcm), our structure solution reveals a superstructure variant (space group Pbcm) in which the Co sublattice is distorted to create zigzag chains of Co atoms. DFT-calibrated Hückel calculations, coupled with a reversed approximation Molecular Orbital (raMO) analysis, trace this superstructure to the use of Co–Co isolobal bonds to complete filled 18 electron configurations on the Co atoms, in accordance with the 18–n rule. The formation of these Co–Co bonds is partially impeded, however, by a small degree of electron transfer from Si-based electronic states to those with Co–Co σ* character. The incomplete success of Co–Co bond creation suggests that these interactions are relatively weak, opening the possibility of them being overcome by thermal energy at elevated temperatures. In fact, high-temperature powder and single crystal X-ray diffraction data, as well as differential scanning calorimetry, indicate that a reversible Pbcm to Cmcm transition occurs at about 380 K. This transition is diffusionless, and the available data point toward it being first-order. We expect that similar cases of frustrated interactions could be staged in other rare earth–transition metal–main group phases, providing a potentially rich source of compounds exhibiting diffusionless transformations and the unique properties these transitions mediate.

  16. The generation of arbitrary order, non-classical, Gauss-type quadrature for transport applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spence, Peter J., E-mail: peter.spence@awe.co.uk

    A method is presented, based upon the Stieltjes method (1884), for the determination of non-classical Gauss-type quadrature rules, and the associated sets of abscissae and weights. The method is then used to generate a number of quadrature sets, to arbitrary order, which are primarily aimed at deterministic transport calculations. The quadrature rules and sets detailed include arbitrary order reproductions of those presented by Abu-Shumays in [4,8] (known as the QR sets, but labelled QRA here), in addition to a number of new rules and associated sets; these are generated in a similar way, and we label them the QRS quadrature sets. The method presented here shifts the inherent difficulty (encountered by Abu-Shumays) associated with solving the non-linear moment equations, particular to the required quadrature rule, to one of the determination of non-classical weight functions and the subsequent calculation of various associated inner products. Once a quadrature rule has been written in a standard form, with an associated weight function having been identified, the calculation of the required inner products is achieved using specific variable transformations, in addition to the use of rapid, highly accurate quadrature suited to this purpose. The associated non-classical Gauss quadrature sets can then be determined, and this can be done to any order very rapidly. In this paper, instead of listing weights and abscissae for the different quadrature sets detailed (of which there are a number), the MATLAB code written to generate them is included as Appendix D. The accuracy and efficacy (in a transport setting) of the quadrature sets presented is not tested in this paper (although the accuracy of the QRA quadrature sets has been studied in [12,13]), but comparisons to tabulated results listed in [8] are made. When comparisons are made with one of the azimuthal QRA sets detailed in [8], the inherent difficulty in the method of generation used there becomes apparent, with the highest-order tabulated sets showing unexpected anomalies. Although not in an actual transport setting, the accuracy of the sets presented here is assessed to some extent by using them to approximate integrals (over an octant of the unit sphere) of various high-order spherical harmonics. When this is done, errors in the tabulated QRA sets present themselves at the highest tabulated orders, whilst combinations of the new QRS quadrature sets offer some improvements in accuracy over the original QRA sets. Finally, in order to offer a quick, visual understanding of the various quadrature sets presented, when combined to give product sets for the purposes of integrating functions confined to the surface of a sphere, three-dimensional representations of points located on an octant of the unit sphere (as in [8,12]) are shown.
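
    Although the paper proceeds via Stieltjes-type determination of recurrence coefficients for non-classical weights, the final step, turning a three-term recurrence into abscissae and weights, is shared with standard practice and can be sketched with the classical Golub-Welsch eigenvalue approach (shown here for the ordinary Legendre weight, not one of the paper's non-classical weights):

```python
import numpy as np

def gauss_legendre(n):
    """Gauss-Legendre nodes and weights via the Golub-Welsch eigenproblem.

    The symmetric tridiagonal Jacobi matrix is built from the three-term
    recurrence coefficients of the monic Legendre polynomials; its
    eigenvalues are the nodes and the squared first components of its
    eigenvectors give the weights.
    """
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)   # off-diagonal recurrence terms
    J = np.diag(beta, 1) + np.diag(beta, -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = 2.0 * vecs[0, :] ** 2        # 2 = integral of the weight on [-1, 1]
    return nodes, weights

x, w = gauss_legendre(5)
approx = np.sum(w * x**8)  # a 5-point rule is exact through degree 9
```

    For a non-classical weight function, as in the paper, only the recurrence coefficients (and the normalization constant) change; the eigenvalue step is identical.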

  17. Biomedical image classification based on a cascade of an SVM with a reject option and subspace analysis.

    PubMed

    Lin, Dongyun; Sun, Lei; Toh, Kar-Ann; Zhang, Jing Bo; Lin, Zhiping

    2018-05-01

    Automated biomedical image classification must confront high levels of noise, image blur, illumination variation, and complicated geometric correspondence among various categorical biomedical patterns in practice. To handle these challenges, we propose a cascade method consisting of two stages for biomedical image classification. At stage 1, we propose a confidence-score-based classification rule with a reject option for a preliminary decision using the support vector machine (SVM). The testing images going through stage 1 are separated into two groups based on their confidence scores. Those testing images with sufficiently high confidence scores are classified at stage 1, while the others with low confidence scores are rejected and fed to stage 2. At stage 2, the rejected images from stage 1 are first processed by a subspace analysis technique called eigenfeature regularization and extraction (ERE), and then classified by another SVM trained in the transformed subspace learned by ERE. At both stages, images are represented by two types of local features, SIFT and SURF, respectively. They are encoded using various bag-of-words (BoW) models to handle biomedical patterns with and without geometric correspondence, respectively. Extensive experiments evaluate the proposed method on three benchmark real-world biomedical image datasets. The proposed method significantly outperforms several competing state-of-the-art methods in terms of classification accuracy. Copyright © 2018 Elsevier Ltd. All rights reserved.
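
    The two-stage reject-option logic can be shown schematically. The classifiers below are toy stand-in callables returning a label and a confidence score; in the paper they are SVMs over SIFT/SURF bag-of-words features, with stage 2 operating in the ERE subspace:

```python
def cascade_predict(samples, stage1, stage2, threshold):
    """Decide at stage 1 when confident; otherwise defer to stage 2.

    Returns (label, stage) pairs so the deciding stage is visible.
    """
    decisions = []
    for s in samples:
        label, conf = stage1(s)
        if conf >= threshold:
            decisions.append((label, 1))      # accepted at stage 1
        else:
            decisions.append((stage2(s), 2))  # rejected, resolved at stage 2
    return decisions

# Toy stand-ins: stage 1 is confident only far from its decision boundary.
stage1 = lambda x: (int(x > 0), min(1.0, abs(x)))
stage2 = lambda x: int(x > -0.1)  # hypothetical refined classifier
out = cascade_predict([2.5, 0.05, -1.3], stage1, stage2, threshold=0.5)
```

    The threshold trades stage-1 coverage against how much work is pushed to the slower, more discriminative second stage.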

  18. Nested sparse grid collocation method with delay and transformation for subsurface flow and transport problems

    NASA Astrophysics Data System (ADS)

    Liao, Qinzhuo; Zhang, Dongxiao; Tchelepi, Hamdi

    2017-06-01

    In numerical modeling of subsurface flow and transport problems, formation properties may not be deterministically characterized, which leads to uncertainty in simulation results. In this study, we propose a sparse grid collocation method, which adopts nested quadrature rules with delay and transformation to quantify the uncertainty of model solutions. We show that the nested Kronrod-Patterson-Hermite quadrature is more efficient than the unnested Gauss-Hermite quadrature. We compare the convergence rates of various quadrature rules including the domain truncation and domain mapping approaches. To further improve accuracy and efficiency, we present a delayed process in selecting quadrature nodes and a transformed process for approximating unsmooth or discontinuous solutions. The proposed method is tested by an analytical function and in one-dimensional single-phase and two-phase flow problems with different spatial variances and correlation lengths. An additional example is given to demonstrate its applicability to three-dimensional black-oil models. It is found from these examples that the proposed method provides a promising approach for obtaining satisfactory estimation of the solution statistics and is much more efficient than the Monte-Carlo simulations.
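As a baseline for the quadrature comparison above, the unnested Gauss-Hermite rule for expectations under a standard normal can be sketched as follows (illustrative only; the nested Kronrod-Patterson-Hermite construction is not reproduced):

```python
import numpy as np

# Gauss-Hermite quadrature to estimate E[f(Y)] for Y ~ N(0, 1).
# NumPy's hermgauss uses the physicists' weight exp(-x^2), so rescale:
#   E[f(Y)] = (1/sqrt(pi)) * sum_i w_i f(sqrt(2) x_i)
def gauss_hermite_mean(f, order=9):
    x, w = np.polynomial.hermite.hermgauss(order)
    return float(np.sum(w * f(np.sqrt(2.0) * x)) / np.sqrt(np.pi))

m2 = gauss_hermite_mean(lambda y: y**2)  # exact answer: E[Y^2] = 1
m3 = gauss_hermite_mean(lambda y: y**3)  # exact answer: E[Y^3] = 0
```

An order-n rule is exact for polynomial integrands up to degree 2n-1, which is why both moments above come out essentially exact.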

  19. More on approximations of Poisson probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kao, C

    1980-05-01

Calculation of Poisson probabilities frequently involves calculating high factorials, which becomes tedious and time-consuming with regular calculators. The usual way to overcome this difficulty has been to find approximations by making use of the table of the standard normal distribution. A new transformation proposed by Kao in 1978 appears to perform better for this purpose than traditional transformations. In the present paper several approximation methods are stated and compared numerically, including an approximation method that utilizes a modified version of Kao's transformation. An approximation based on a power transformation was found to outperform those based on the square-root type transformations proposed in the literature. The traditional Wilson-Hilferty approximation and the Makabe-Morimura approximation are extremely poor compared with this approximation. 4 tables. (RWR)
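For a feel of the comparison, the exact Poisson CDF (computed without large factorials) can be set against one classical square-root-type normal approximation; this generic textbook variant stands in for, and is not, Kao's transformation:

```python
import math

def poisson_cdf(k, lam):
    """Exact CDF via the pmf recurrence (avoids large factorials)."""
    term = math.exp(-lam)
    total = term
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def sqrt_normal_approx(k, lam):
    """One classical square-root-type approximation (a sketch):
    sqrt(X) is roughly N(sqrt(lam), 1/4), with a 0.5 continuity
    correction on the count."""
    z = 2.0 * (math.sqrt(k + 0.5) - math.sqrt(lam))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

exact = poisson_cdf(12, 10.0)
approx = sqrt_normal_approx(12, 10.0)
```

For lambda = 10 and k = 12 the two values agree to within a few percent, which is the level of accuracy the transformation-based methods are meant to trade against ease of calculation.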

  20. Recreational System Optimization to Reduce Conflict on Public Lands

    NASA Astrophysics Data System (ADS)

    Shilling, Fraser; Boggs, Jennifer; Reed, Sarah

    2012-09-01

In response to federal administrative rule, the Tahoe National Forest (TNF), California, USA engaged in trail-route prioritization for motorized recreation (e.g., off-highway-vehicles) and other recreation types. The prioritization was intended to identify routes that were suitable and ill-suited for maintenance in a transportation system. A recreational user survey was conducted online (n = 813) for user preferences for trail system characteristics, recreational use patterns, and demographics. Motorized trail users and non-motorized users displayed very clear and contrasting preferences for the same system. As has been found by previous investigators, non-motorized users expressed antagonism to motorized use on the same recreational travel system, whereas motorized users either supported multiple-use routes or dismissed non-motorized recreationists' concerns. To help the TNF plan for reduced conflict, a geographic information system (GIS) based modeling approach was used to identify recreational opportunities and potential environmental impacts of all travel routes. This GIS-based approach was based on an expert-derived rule set. The rules addressed particular environmental and recreation concerns in the TNF. Route segments were identified that could be incorporated into minimal-impact networks to support various types of recreation. The combination of potential impacts and user-benefits supported an optimization approach for an appropriate recreational travel network to minimize environmental impacts and user-conflicts in a multi-purpose system.

  2. Equivalent circuit of radio frequency-plasma with the transformer model

    NASA Astrophysics Data System (ADS)

    Nishida, K.; Mochizuki, S.; Ohta, M.; Yasumoto, M.; Lettry, J.; Mattei, S.; Hatayama, A.

    2014-02-01

The LINAC4 H- source is a radio frequency (RF) driven type of source. In the RF system, it is required to match the load impedance, which includes the H- source, to that of the final amplifier. We model the RF plasma inside the H- source as circuit elements using a transformer model so that the characteristics of the load impedance become calculable. It has been shown that the modeling based on the transformer model works well to predict the resistance and inductance of the plasma.
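A minimal sketch of such a transformer-model load calculation, with purely illustrative element values (not the paper's parameters), is:

```python
import math

def coupled_load_impedance(omega, L1, Lp, Rp, k):
    """Impedance seen at the RF coil for a transformer-model plasma (sketch).

    The primary (antenna) inductance L1 couples with coefficient k to a
    secondary loop (the plasma) modeled as Rp in series with Lp.  Standard
    transformer theory gives the reflected impedance:
        Z = j*omega*L1 + (omega*M)**2 / (Rp + j*omega*Lp),  M = k*sqrt(L1*Lp)
    """
    M = k * math.sqrt(L1 * Lp)
    return 1j * omega * L1 + (omega * M) ** 2 / (Rp + 1j * omega * Lp)

# 2 MHz drive, hypothetical element values
Z = coupled_load_impedance(2 * math.pi * 2e6, L1=5e-6, Lp=1e-6, Rp=2.0, k=0.3)
R_plasma = Z.real  # effective plasma resistance seen by the generator
```

The real part of Z is the plasma resistance reflected into the primary, which is the quantity the matching network must present to the final amplifier.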

  3. R package PRIMsrc: Bump Hunting by Patient Rule Induction Method for Survival, Regression and Classification

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    PRIMsrc is a novel implementation of a non-parametric bump hunting procedure, based on the Patient Rule Induction Method (PRIM), offering a unified treatment of outcome variables, including censored time-to-event (Survival), continuous (Regression) and discrete (Classification) responses. To fit the model, it uses a recursive peeling procedure with specific peeling criteria and stopping rules depending on the response. To validate the model, it provides an objective function based on prediction-error or other specific statistic, as well as two alternative cross-validation techniques, adapted to the task of decision-rule making and estimation in the three types of settings. PRIMsrc comes as an open source R package, including at this point: (i) a main function for fitting a Survival Bump Hunting model with various options allowing cross-validated model selection to control model size (#covariates) and model complexity (#peeling steps) and generation of cross-validated end-point estimates; (ii) parallel computing; (iii) various S3-generic and specific plotting functions for data visualization, diagnostic, prediction, summary and display of results. It is available on CRAN and GitHub. PMID:26798326
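The recursive peeling procedure can be sketched for the regression case as follows (a simplified illustration; PRIMsrc's actual peeling criteria, pasting step, and cross-validated stopping rules are richer):

```python
import numpy as np

def prim_peel(X, y, alpha=0.1, min_support=0.1):
    """Minimal sketch of PRIM's recursive peeling (regression flavor).

    At each step, for every covariate, try removing the lowest or highest
    alpha-quantile slice of the current box and keep the peel that yields
    the largest mean of y inside the remaining box.  Stops when box
    support drops to min_support or no admissible peel remains.
    """
    n = len(y)
    inside = np.ones(n, dtype=bool)
    while inside.mean() > min_support:
        best_mean, best_mask = -np.inf, None
        for j in range(X.shape[1]):
            xj = X[inside, j]
            lo, hi = np.quantile(xj, [alpha, 1 - alpha])
            for keep in (X[:, j] >= lo, X[:, j] <= hi):
                cand = inside & keep
                if cand.sum() < max(1, int(min_support * n)):
                    continue
                m = y[cand].mean()
                if m > best_mean:
                    best_mean, best_mask = m, cand
        if best_mask is None or best_mask.sum() == inside.sum():
            break
        inside = best_mask
    return inside

# toy data: y is high inside the region x0 > 0.5
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = (X[:, 0] > 0.5).astype(float) + 0.1 * rng.normal(size=500)
box = prim_peel(X, y)
```

On this toy problem the surviving box concentrates on the high-outcome region, which is the "bump" the method is hunting.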

  4. Quality Leadership and Quality Control

    PubMed Central

    Badrick, Tony

    2003-01-01

    Different quality control rules detect different analytical errors with varying levels of efficiency depending on the type of error present, its prevalence and the number of observations. The efficiency of a rule can be gauged by inspection of a power function graph. Control rules are only part of a process and not an end in itself; just as important are the trouble-shooting systems employed when a failure occurs. 'Average of patient normals' may develop as a usual adjunct to conventional quality control serum based programmes. Acceptable error can be based on various criteria; biological variation is probably the most sensible. Once determined, acceptable error can be used as limits in quality control rule systems. A key aspect of an organisation is leadership, which links the various components of the quality system. Leadership is difficult to characterise but its key aspects include trust, setting an example, developing staff and critically setting the vision for the organisation. Organisations also have internal characteristics such as the degree of formalisation, centralisation, and complexity. Medical organisations can have internal tensions because of the dichotomy between the bureaucratic and the shadow medical structures. PMID:18568046
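Two control rules of the kind discussed can be sketched as follows (Westgard-style rules are assumed here for illustration; the article does not prescribe these specific rules):

```python
def westgard_flags(values, mean, sd):
    """Hedged sketch of two common laboratory control rules.

    1_3s: any single control value beyond +/-3 SD -> reject run.
    2_2s: two consecutive values beyond the same +/-2 SD limit -> reject run.
    Different rules catch different error types (random vs systematic),
    which is what a power function graph compares.
    """
    z = [(v - mean) / sd for v in values]
    rule_1_3s = any(abs(x) > 3 for x in z)
    rule_2_2s = any((z[i] > 2 and z[i + 1] > 2) or (z[i] < -2 and z[i + 1] < -2)
                    for i in range(len(z) - 1))
    return {"1_3s": rule_1_3s, "2_2s": rule_2_2s}

# two consecutive values above +2 SD trip 2_2s but not 1_3s,
# the signature of a systematic shift rather than a random blunder
flags = westgard_flags([101, 104.5, 105, 99], mean=100, sd=2.0)
```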

  5. Moral empiricism and the bias for act-based rules.

    PubMed

    Ayars, Alisabeth; Nichols, Shaun

    2017-10-01

Previous studies on rule learning show a bias in favor of act-based rules, which prohibit intentionally producing an outcome but not merely allowing the outcome. Nichols, Kumar, Lopez, Ayars, and Chan (2016) found that exposure to a single sample violation in which an agent intentionally causes the outcome was sufficient for participants to infer that the rule was act-based. One explanation is that people have an innate bias to think rules are act-based. We suggest an alternative empiricist account: since most rules that people learn are act-based, people form an overhypothesis (Goodman, 1955) that rules are typically act-based. We report three studies that indicate that people can use information about violations to form overhypotheses about rules. In study 1, participants learned either three "consequence-based" rules that prohibited allowing an outcome or three "act-based" rules that prohibited producing the outcome; in a subsequent learning task, we found that participants who had learned three consequence-based rules were more likely to think that the new rule prohibited allowing an outcome. In study 2, we presented participants with either 1 consequence-based rule or 3 consequence-based rules, and we found that those exposed to 3 such rules were more likely to think that a new rule was also consequence-based. Thus, in both studies, it seems that learning 3 consequence-based rules generates an overhypothesis to expect new rules to be consequence-based. In a final study, we used a more subtle manipulation. We exposed participants to examples of act-based or accident-based (strict liability) laws and then had them learn a novel rule. We found that participants who were exposed to the accident-based laws were more likely to think a new rule was accident-based. 
The fact that participants' bias for act-based rules can be shaped by evidence from other rules supports the idea that the bias for act-based rules might be acquired as an overhypothesis from the preponderance of act-based rules. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. 77 FR 29435 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... types to make benchmarking easier and more efficient. For the members that already have such... benchmarking capability to firms that currently lack it or lack an exchange-based alternative. NASDAQ further...

  7. Wavelet analysis techniques applied to removing varying spectroscopic background in calibration model for pear sugar content

    NASA Astrophysics Data System (ADS)

    Liu, Yande; Ying, Yibin; Lu, Huishan; Fu, Xiaping

    2005-11-01

A new method is proposed to eliminate the varying background and noise simultaneously for multivariate calibration of Fourier transform near infrared (FT-NIR) spectral signals. An ideal spectrum signal prototype was constructed based on the FT-NIR spectrum of fruit sugar content measurement. The performances of wavelet based threshold de-noising approaches via different combinations of wavelet base functions were compared. Three families of wavelet base function (Daubechies, Symlets and Coiflets) were applied to estimate the performance of those wavelet bases and threshold selection rules by a series of experiments. The experimental results show that the best de-noising performance is reached with the Daubechies 4 or Symlet 4 wavelet base functions. Based on the optimized parameters, wavelet regression models for the sugar content of pear were also developed and resulted in a smaller prediction error than a traditional Partial Least Squares Regression (PLSR) model.
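The thresholding idea can be illustrated with a one-level Haar transform (a minimal stand-in for the Daubechies-4/Symlet-4 bases the study found best, which would normally come from a wavelet library):

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising (sketch).
    Signal length must be even."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass coefficients
    # soft thresholding shrinks small (mostly noise) detail coefficients
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # inverse Haar transform
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 2 * np.pi, 64))
noisy = clean + 0.1 * rng.normal(size=64)
denoised = haar_denoise(noisy, threshold=0.2)
```

With a zero threshold the transform reconstructs the input exactly; the threshold-selection rules the study compares are different recipes for choosing that shrinkage level.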

  8. Vector disformal transformation of cosmological perturbations

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Vassilis; Zarei, Moslem; Firouzjahi, Hassan; Mukohyama, Shinji

    2018-03-01

    We study disformal transformations of cosmological perturbations by vector fields in theories invariant under U (1 ) gauge transformations. Three types of vector disformal transformations are considered: (i) disformal transformations by a single timelike vector; (ii) disformal transformations by a single spacelike vector; and (iii) disformal transformations by three spacelike vectors. We show that transformations of type (i) do not change either curvature perturbation or gravitational waves; that those of type (ii) do not change curvature perturbation but change gravitational waves; and that those of type (iii) change both curvature perturbation and gravitational waves. Therefore, coupling matter fields to the metric after disformal transformations of type (ii) or (iii) in principle have observable consequences. While the recent multi-messenger observation of binary neutron stars has singled out a proper disformal frame at the present epoch with a high precision, the result of the present paper may thus help distinguishing disformal frames in the early universe.
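In schematic form (the standard disformal ansatz in this literature; the paper's precise definitions and invertibility conditions may differ), a vector disformal transformation of the metric reads

\tilde{g}_{\mu\nu} = A(X)\, g_{\mu\nu} + B(X)\, A_\mu A_\nu, \qquad X \equiv g^{\alpha\beta} A_\alpha A_\beta,

where A_\mu is the timelike or spacelike vector of cases (i) and (ii); for case (iii) the disformal piece generalizes to a sum B \sum_{a=1}^{3} A^{(a)}_\mu A^{(a)}_\nu over the three spacelike vectors.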

  9. Study on Ecological Risk Assessment of Guangxi Coastal Zone Based on 3s Technology

    NASA Astrophysics Data System (ADS)

    Zhong, Z.; Luo, H.; Ling, Z. Y.; Huang, Y.; Ning, W. Y.; Tang, Y. B.; Shao, G. Z.

    2018-05-01

    This paper takes Guangxi coastal zone as the study area, following the standards of land use type, divides the coastal zone of ecological landscape into seven kinds of natural wetland landscape types such as woodland, farmland, grassland, water, urban land and wetlands. Using TM data of 2000-2015 such 15 years, with the CART decision tree algorithm, for analysis the characteristic of types of landscape's remote sensing image and build decision tree rules of landscape classification to extract information classification. Analyzing of the evolution process of the landscape pattern in Guangxi coastal zone in nearly 15 years, we may understand the distribution characteristics and change rules. Combined with the natural disaster data, we use of landscape index and the related risk interference degree and construct ecological risk evaluation model in Guangxi coastal zone for ecological risk assessment results of Guangxi coastal zone.

  10. 14 CFR 91.189 - Category II and III operations: General operating rules.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... pilot who is controlling the aircraft has appropriate instrumentation for the type of flight control... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules Instrument Flight Rules § 91.189 Category II and III operations: General operating rules. (a) No...

  11. 14 CFR 91.189 - Category II and III operations: General operating rules.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... pilot who is controlling the aircraft has appropriate instrumentation for the type of flight control... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules Instrument Flight Rules § 91.189 Category II and III operations: General operating rules. (a) No...

  12. 14 CFR 91.189 - Category II and III operations: General operating rules.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... pilot who is controlling the aircraft has appropriate instrumentation for the type of flight control... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules Instrument Flight Rules § 91.189 Category II and III operations: General operating rules. (a) No...

  13. Takagi-Sugeno-Kang fuzzy models of the rainfall-runoff transformation

    NASA Astrophysics Data System (ADS)

    Jacquin, A. P.; Shamseldin, A. Y.

    2009-04-01

Fuzzy inference systems, or fuzzy models, are non-linear models that describe the relation between the inputs and the output of a real system using a set of fuzzy IF-THEN rules. This study deals with the application of Takagi-Sugeno-Kang type fuzzy models to the development of rainfall-runoff models operating on a daily basis, using a system based approach. The models proposed are classified in two types, each intended to account for different kinds of dominant non-linear effects in the rainfall-runoff relationship. Fuzzy models type 1 are intended to incorporate the effect of changes in the prevailing soil moisture content, while fuzzy models type 2 address the phenomenon of seasonality. Each model type consists of five fuzzy models of increasing complexity; the most complex fuzzy model of each model type includes all the model components found in the remaining fuzzy models of the respective type. The models developed are applied to data of six catchments from different geographical locations and sizes. Model performance is evaluated in terms of two measures of goodness of fit, namely the Nash-Sutcliffe criterion and the index of volumetric fit. The results of the fuzzy models are compared with those of the Simple Linear Model, the Linear Perturbation Model and the Nearest Neighbour Linear Perturbation Model, which use similar input information. Overall, the results of this study indicate that Takagi-Sugeno-Kang fuzzy models are a suitable alternative for modelling the rainfall-runoff relationship. However, it is also observed that increasing the complexity of the model structure does not necessarily produce an improvement in the performance of the fuzzy models. The relative importance of the different model components in determining the model performance is evaluated through sensitivity analysis of the model parameters in the accompanying study presented in this meeting. Acknowledgements: We would like to express our gratitude to Prof. Kieran M. O'Connor from the National University of Ireland, Galway, for providing the data used in this study.
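The rule structure described above can be sketched generically (a toy one-input TSK system; the rule parameters are invented and unrelated to the catchment models of the study):

```python
import math

def tsk_predict(x, rules):
    """First-order Takagi-Sugeno-Kang inference (generic sketch).

    Each rule is (center, width, (a, b)): IF x is about `center`
    (Gaussian membership with spread `width`) THEN y = a*x + b.
    The output is the firing-strength-weighted average of the
    rule consequents.
    """
    weights, outputs = [], []
    for center, width, (a, b) in rules:
        w = math.exp(-0.5 * ((x - center) / width) ** 2)
        weights.append(w)
        outputs.append(a * x + b)
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

# two hypothetical rules: "low input" and "high input" regimes
rules = [(0.0, 1.0, (0.2, 0.0)),   # low regime: gentle linear response
         (5.0, 1.0, (1.0, -2.0))]  # high regime: steeper linear response
y_low = tsk_predict(0.0, rules)
y_high = tsk_predict(5.0, rules)
```

Near each rule center the output follows that rule's local linear law, with a smooth blend in between; stacking more rules and inputs is how the model types of increasing complexity are built.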

  14. Operational formulation of time reversal in quantum theory

    NASA Astrophysics Data System (ADS)

    Oreshkov, Ognyan; Cerf, Nicolas J.

    2015-10-01

    The symmetry of quantum theory under time reversal has long been a subject of controversy because the transition probabilities given by Born’s rule do not apply backward in time. Here, we resolve this problem within a rigorous operational probabilistic framework. We argue that reconciling time reversal with the probabilistic rules of the theory requires a notion of operation that permits realizations through both pre- and post-selection. We develop the generalized formulation of quantum theory that stems from this approach and give a precise definition of time-reversal symmetry, emphasizing a previously overlooked distinction between states and effects. We prove an analogue of Wigner’s theorem, which characterizes all allowed symmetry transformations in this operationally time-symmetric quantum theory. Remarkably, we find larger classes of symmetry transformations than previously assumed, suggesting a possible direction in the search for extensions of known physics.

  15. Facilitating the Specification Capture and Transformation Process in the Development of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Filho, Aluzio Haendehen; Caminada, Numo; Haeusler, Edward Hermann; vonStaa, Arndt

    2004-01-01

To support the development of flexible and reusable MAS, we have built a framework designated MAS-CF. MAS-CF is a component framework that implements a layered architecture based on contextual composition. Interaction rules, controlled by architecture mechanisms, ensure very low coupling, making possible the sharing of distributed services in a transparent, dynamic and independent way. These properties facilitate large-scale reuse, since organizational abstractions can be reused and propagated to all instances created from a framework. The objective is to reduce complexity and development time of multi-agent systems through the reuse of generic organizational abstractions.

  16. 3D near-to-surface conductivity reconstruction by inversion of VETEM data using the distorted Born iterative method

    USGS Publications Warehouse

    Wang, G.L.; Chew, W.C.; Cui, T.J.; Aydiner, A.A.; Wright, D.L.; Smith, D.V.

    2004-01-01

Three-dimensional (3D) subsurface imaging by using inversion of data obtained from the very early time electromagnetic system (VETEM) was discussed. The study was carried out by using the distorted Born iterative method to match the internal nonlinear property of the 3D inversion problem. The forward solver was based on the total-current formulation bi-conjugate gradient-fast Fourier transform (BCCG-FFT). It was found that the selection of the regularization parameter follows a heuristic rule, as used in the Levenberg-Marquardt algorithm, so that the iteration is stable.
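The damping heuristic referred to can be sketched on a small curve-fitting problem (a generic Levenberg-Marquardt loop, not the VETEM inversion; all names and values are illustrative):

```python
import numpy as np

def lm_fit(f, jac, x0, y, lam0=1e-2, iters=100):
    """Levenberg-Marquardt with the classic heuristic damping rule:
    shrink lambda after a successful step (more Gauss-Newton-like),
    grow it after a failed one (more gradient-descent-like)."""
    x = np.asarray(x0, dtype=float)
    lam = lam0
    for _ in range(iters):
        r = y - f(x)
        J = jac(x)
        A = J.T @ J + lam * np.eye(len(x))   # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        if np.sum((y - f(x + step)) ** 2) < np.sum(r ** 2):
            x = x + step
            lam *= 0.5   # success: trust the Gauss-Newton direction more
        else:
            lam *= 2.0   # failure: regularize more strongly and retry
    return x

# fit y = exp(a*t) + b to noiseless data with true (a, b) = (0.5, 1.0)
t = np.linspace(0, 2, 30)
y = np.exp(0.5 * t) + 1.0
f = lambda p: np.exp(p[0] * t) + p[1]
jac = lambda p: np.column_stack([t * np.exp(p[0] * t), np.ones_like(t)])
p = lm_fit(f, jac, [0.0, 0.0], y)
```

The adaptive lambda is the "heuristic rule" for the regularization parameter: it is never chosen once and for all, but raised or lowered according to whether the last iterate reduced the residual.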

  17. Chaos-assisted broadband momentum transformation in optical microresonators

    NASA Astrophysics Data System (ADS)

    Jiang, Xuefeng; Shao, Linbo; Zhang, Shu-Xin; Yi, Xu; Wiersig, Jan; Wang, Li; Gong, Qihuang; Lončar, Marko; Yang, Lan; Xiao, Yun-Feng

    2017-10-01

    The law of momentum conservation rules out many desired processes in optical microresonators. We report broadband momentum transformations of light in asymmetric whispering gallery microresonators. Assisted by chaotic motions, broadband light can travel between optical modes with different angular momenta within a few picoseconds. Efficient coupling from visible to near-infrared bands is demonstrated between a nanowaveguide and whispering gallery modes with quality factors exceeding 10 million. The broadband momentum transformation enhances the device conversion efficiency of the third-harmonic generation by greater than three orders of magnitude over the conventional evanescent-wave coupling. The observed broadband and fast momentum transformation could promote applications such as multicolor lasers, broadband memories, and multiwavelength optical networks.

  18. Inertial Frames Without the Relativity Principle: Breaking Lorentz Symmetry

    NASA Astrophysics Data System (ADS)

    Baccetti, Valentina; Tate, Kyle; Visser, Matt

    2015-01-01

    We investigate inertial frames in the absence of Lorentz invariance, reconsidering the usual group structure implied by the relativity principle. We abandon the relativity principle, discarding the group structure for the transformations between inertial frames, while requiring these transformations to be at least linear (to preserve homogeneity). In theories with a preferred frame (aether), the set of transformations between inertial frames forms a groupoid/pseudogroup instead of a group, a characteristic essential to evading the von Ignatowsky theorems. In order to understand the dynamics, we also demonstrate that the transformation rules for energy and momentum are in general affine. We finally focus on one specific and compelling model implementing a minimalist violation of Lorentz invariance.

  19. Presynaptic Ionotropic Receptors Controlling and Modulating the Rules for Spike Timing-Dependent Plasticity

    PubMed Central

    Verhoog, Matthijs B.; Mansvelder, Huibert D.

    2011-01-01

    Throughout life, activity-dependent changes in neuronal connection strength enable the brain to refine neural circuits and learn based on experience. In line with predictions made by Hebb, synapse strength can be modified depending on the millisecond timing of action potential firing (STDP). The sign of synaptic plasticity depends on the spike order of presynaptic and postsynaptic neurons. Ionotropic neurotransmitter receptors, such as NMDA receptors and nicotinic acetylcholine receptors, are intimately involved in setting the rules for synaptic strengthening and weakening. In addition, timing rules for STDP within synapses are not fixed. They can be altered by activation of ionotropic receptors located at, or close to, synapses. Here, we will highlight studies that uncovered how network actions control and modulate timing rules for STDP by activating presynaptic ionotropic receptors. Furthermore, we will discuss how interaction between different types of ionotropic receptors may create “timing” windows during which particular timing rules lead to synaptic changes. PMID:21941664
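The canonical timing rule that such receptors modulate is often written as an exponential STDP window; a textbook sketch (parameter values illustrative, not from the review):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Canonical exponential STDP window (textbook form).

    dt = t_post - t_pre in ms.  Pre-before-post (dt > 0) potentiates,
    post-before-pre (dt < 0) depresses, with magnitude decaying
    exponentially in the spike-timing interval.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

ltp = stdp_dw(10.0)    # pre leads post by 10 ms -> potentiation
ltd = stdp_dw(-10.0)   # post leads pre by 10 ms -> depression
```

Receptor activation of the kind the review surveys effectively reshapes this window, e.g. changing the amplitudes, time constants, or even the sign of the two branches.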

  20. Dopaminergic neurons write and update memories with cell-type-specific rules

    PubMed Central

    Aso, Yoshinori; Rubin, Gerald M

    2016-01-01

    Associative learning is thought to involve parallel and distributed mechanisms of memory formation and storage. In Drosophila, the mushroom body (MB) is the major site of associative odor memory formation. Previously we described the anatomy of the adult MB and defined 20 types of dopaminergic neurons (DANs) that each innervate distinct MB compartments (Aso et al., 2014a, 2014b). Here we compare the properties of memories formed by optogenetic activation of individual DAN cell types. We found extensive differences in training requirements for memory formation, decay dynamics, storage capacity and flexibility to learn new associations. Even a single DAN cell type can either write or reduce an aversive memory, or write an appetitive memory, depending on when it is activated relative to odor delivery. Our results show that different learning rules are executed in seemingly parallel memory systems, providing multiple distinct circuit-based strategies to predict future events from past experiences. DOI: http://dx.doi.org/10.7554/eLife.16135.001 PMID:27441388

  1. Environmental transformations and ecological effects of iron-based nanoparticles.

    PubMed

    Lei, Cheng; Sun, Yuqing; Tsang, Daniel C W; Lin, Daohui

    2018-01-01

    The increasing application of iron-based nanoparticles (NPs), especially high concentrations of zero-valent iron nanoparticles (nZVI), has raised concerns regarding their environmental behavior and potential ecological effects. In the environment, iron-based NPs undergo physical, chemical, and/or biological transformations as influenced by environmental factors such as pH, ions, dissolved oxygen, natural organic matter (NOM), and biotas. This review presents recent research advances on environmental transformations of iron-based NPs, and articulates their relationships with the observed toxicities. The type and extent of physical, chemical, and biological transformations, including aggregation, oxidation, and bio-reduction, depend on the properties of NPs and the receiving environment. Toxicities of iron-based NPs to bacteria, algae, fish, and plants are increasingly observed, which are evaluated with a particular focus on the underlying mechanisms. The toxicity of iron-based NPs is a function of their properties, tolerance of test organisms, and environmental conditions. Oxidative stress induced by reactive oxygen species is considered as the primary toxic mechanism of iron-based NPs. Factors influencing the toxicity of iron-based NPs are addressed and environmental transformations play a significant role, for example, surface oxidation or coating by NOM generally lowers the toxicity of nZVI. Research gaps and future directions are suggested with an aim to boost concerted research efforts on environmental transformations and toxicity of iron-based NPs, e.g., toxicity studies of transformed NPs in field, expansion of toxicity endpoints, and roles of laden contaminants and surface coating. This review will enhance our understanding of potential risks of iron-based NPs and proper uses of environmentally benign NPs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Why Narrating Changes Memory: A Contribution to an Integrative Model of Memory and Narrative Processes.

    PubMed

    Smorti, Andrea; Fioretti, Chiara

    2016-06-01

This paper aims to reflect on the relation between autobiographical memory (ME) and autobiographical narrative (NA), examining studies on the effects of narrating on the narrator and showing how studying these relations can make the workings of both memory and narrating more comprehensible. Studies that explicitly address ME and NA are scarce and touch this issue only indirectly. The Authors consider different trends in the study of ME and NA: congruency vs incongruency hypotheses on retrieving, the ways of organizing memories according to gist or verbatim formats and their role in organizing positive and negative emotional experiences, the social roots of ME and NA, and the rules of conversation based on narrating. Analysis of these investigations leads the Authors to point out three basic results of their research. Firstly, NA transforms ME because it narrativizes memories according to a narrative format. This means that memories, when they are narrated, are transformed into stories (verbal language) and socialised. Secondly, the narrativization process is determined by the act of telling something within a communicative situation. Thus, the relational situation of the narrating act, by modifying the story, also modifies memories. The Authors propose the RE.NA.ME model (RElation, NArration, MEmory) to understand and study ME and NA. Finally, this study claims that ME and NA refer to two different types of processes with a wide area of overlap. This is due to common social, developmental and cultural roots that make NA include part of ME (narrative of memory) and ME include part of NA (memory of personal events that have been narrated).

  3. The Emerging Role of the Republic of South Africa as a Regional Power

    DTIC Science & Technology

    2003-04-07

    effect national power. It is written from the perspective that South Africa, even with its past of racial separation and minority rule, is overcoming...facing these issues head-on, has overcome racial diverseness, and is developing into a leading regional role. South Africa is transforming. Since 1994...that effect national power. It is written from the perspective that South Africa, even with its past of racial separation and minority rule, is

  4. A syllogistic system generated by the Aristotelian approach and the modern approach as an hyperincursive system

    NASA Astrophysics Data System (ADS)

    Grappone, Arturo G.; Malatesta, Michele

    2001-06-01

This paper proves that the set of syllogisms is transformed into a hyperincursive system composed of a single hyperincursive set, by using the Aristotelian rules to deduce syllogisms among them together with a new deduction rule proposed in this paper. An important consequence of this result is that every syllogism deduces all the possible syllogisms. Part One was written by Michele Malatesta and Part Two by Arturo Graziano Grappone.

  5. JFMCC: Theater C2 in Need of SOLE

    DTIC Science & Technology

    2003-02-03

    four command ships in service presently in the Navy. “Of the four command ships in service today, two—the Mount Whitney (LCC-20) and the Blue Ridge ...Dr. Henry H. Gaffney , Jr., The Top 100 Rules of the New American Way of War, (Washington, D.C.: Office of Force Transformation, Office of the...2001. Barnett, Thomas and Gaffney , Henry H. The Top 100 Rules of the New American Way of War. Washington, D.C.: Office of

  6. The transformation of targeted killing and international order.

    PubMed

    Senn, Martin; Troy, Jodok

    2017-05-04

    This article introduces the special issue's question of whether and how the current transformation of targeted killing is transforming the global international order and provides the conceptual ground for the individual contributions to the special issue. It develops a two-dimensional concept of political order and introduces a theoretical framework that conceives the maintenance and transformation of international order as a dynamic interplay between its behavioral dimension in the form of violence and discursive processes and its institutional dimension in the form of ideas, norms, and rules. The article also conceptualizes targeted killing and introduces a typology of targeted-killing acts on the basis of their legal and moral legitimacy. Building on this conceptual groundwork, the article takes stock of the current transformation of targeted killing and summarizes the individual contributions to this special issue.

  7. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer's disease.

    PubMed

    Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong

    2016-07-01

    Computer-based diagnosis of Alzheimer's disease can be performed by analyzing the functional and structural changes in the brain. Multispectral image fusion combines complementary information while discarding redundant information, yielding a single image that encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer's disease. The proposed fusion methodology begins with a color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT, followed by dimensionality reduction using a modified Principal Component Analysis algorithm on the low-frequency coefficients. The high-frequency coefficients are then enhanced using a non-linear enhancement function. Two different fusion rules are applied to the low-pass and high-pass sub-bands: phase congruency is applied to the low-frequency coefficients, and a combination of directive contrast and normalized Shannon entropy is applied to the high-frequency coefficients. The superiority of the fusion response is demonstrated by comparisons with other state-of-the-art fusion approaches in terms of various fusion metrics.
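
    As an illustrative sketch of an entropy-guided fusion rule (a simplification for exposition, not the paper's exact directive-contrast formulation), competing sub-band blocks can be weighted by their Shannon entropy:

```python
import math

def shannon_entropy(block):
    """Shannon entropy (bits) of a 2-D block's magnitude distribution."""
    total = sum(abs(v) for row in block for v in row)
    if total == 0:
        return 0.0
    ent = 0.0
    for row in block:
        for v in row:
            p = abs(v) / total
            if p > 0:
                ent -= p * math.log(p, 2)
    return ent

def entropy_weighted_fuse(a, b):
    """Fuse two coefficient blocks, weighting each source by its entropy
    (higher-entropy blocks are assumed to carry more detail)."""
    ea, eb = shannon_entropy(a), shannon_entropy(b)
    wa = ea / (ea + eb) if (ea + eb) > 0 else 0.5
    return [[wa * x + (1 - wa) * y for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]
```

Fusing a block with itself returns the block unchanged, and a uniform block scores higher entropy than a single-spike block, which is the behavior the weighting relies on.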

  8. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhateja, Vikrant, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn; Moin, Aisha; Srivastava, Anuja

    Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).

  9. Image Fusion of CT and MR with Sparse Representation in NSST Domain

    PubMed Central

    Qiu, Chenhui; Wang, Yuanyuan; Zhang, Huan

    2017-01-01

    Multimodal image fusion techniques can integrate information from different medical images into a single informative image that is better suited for joint diagnosis, preoperative planning, intraoperative guidance, and interventional treatment. This paper studies the fusion of CT images with different MR modalities. First, the CT and MR images are both transformed to the nonsubsampled shearlet transform (NSST) domain, yielding low-frequency and high-frequency components. The high-frequency components are merged using the absolute-maximum rule, while the low-frequency components are merged by a sparse representation (SR) based approach; a dynamic group sparsity recovery (DGSR) algorithm is proposed to improve the performance of the SR-based approach. Finally, the fused image is obtained by performing the inverse NSST on the merged components. The proposed fusion method is tested on a number of clinical CT and MR images and compared with several popular image fusion methods. The experimental results demonstrate that the proposed method provides better fusion results in terms of both subjective quality and objective evaluation. PMID:29250134
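
    The absolute-maximum rule for the high-frequency sub-bands is straightforward to state: at each position, keep the coefficient with the larger magnitude. A minimal sketch on toy arrays (pure-Python stand-ins for NSST sub-bands):

```python
def absolute_maximum_fusion(a, b):
    """Element-wise absolute-maximum rule: keep, at each position, the
    coefficient with the larger magnitude (ties go to the first source)."""
    return [[x if abs(x) >= abs(y) else y for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

ct_high = [[3.0, -1.0], [0.5, 4.0]]   # toy CT high-frequency sub-band
mr_high = [[-2.0, 2.5], [0.7, -3.0]]  # toy MR high-frequency sub-band
print(absolute_maximum_fusion(ct_high, mr_high))
# [[3.0, 2.5], [0.7, 4.0]]
```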

  10. Image Fusion of CT and MR with Sparse Representation in NSST Domain.

    PubMed

    Qiu, Chenhui; Wang, Yuanyuan; Zhang, Huan; Xia, Shunren

    2017-01-01

    Multimodal image fusion techniques can integrate information from different medical images into a single informative image that is better suited for joint diagnosis, preoperative planning, intraoperative guidance, and interventional treatment. This paper studies the fusion of CT images with different MR modalities. First, the CT and MR images are both transformed to the nonsubsampled shearlet transform (NSST) domain, yielding low-frequency and high-frequency components. The high-frequency components are merged using the absolute-maximum rule, while the low-frequency components are merged by a sparse representation (SR) based approach; a dynamic group sparsity recovery (DGSR) algorithm is proposed to improve the performance of the SR-based approach. Finally, the fused image is obtained by performing the inverse NSST on the merged components. The proposed fusion method is tested on a number of clinical CT and MR images and compared with several popular image fusion methods. The experimental results demonstrate that the proposed method provides better fusion results in terms of both subjective quality and objective evaluation.

  11. Nonequimolar Mixture of Organic Acids and Bases: An Exception to the Rule of Thumb for Salt or Cocrystal.

    PubMed

    Pratik, Saied Md; Datta, Ayan

    2016-08-04

    Formation of salt and/or cocrystal from organic acid-base mixtures has significant consequences in the pharmaceutical industry and its related intellectual property rights (IPR). On the basis of calculations using periodic dispersion corrected DFT (DFT-D2) on formic acid-pyridine adduct, we have demonstrated that an equimolar stoichiometric ratio (1:1) exists as a neutral cocrystal. On the other hand, the nonequimolar stoichiometry (4:1) readily forms an ionic salt. While the former result is in agreement with the ΔpKa rule between the base and the acid, the latter is not. Calculations reveal that, within the equimolar manifold (n:n; n = 1-4), the mixture exists as a hydrogen bonded complex in a cocrystal-like environment. However, the nonequimolar mixture in a ratio of 5:1 and above readily forms salt-like structures. Because of the cooperative nature of hydrogen bonding, the strength of the O-H···N hydrogen bond increases and eventually transforms into O(-)···H-N(+) (complete proton transfer) as the ratio of formic acid increases and forms salt as experimentally observed. Clearly, an enhanced polarization of formic acid on aggregation increases its acidity and, hence, facilitates its transfer to pyridine. Motion of the proton from formic acid to pyridine is shown to follow a relay mechanism wherein the proton that is far away from pyridine is ionized and is subsequently transferred to pyridine via hopping across the neutral formic acid molecules (Grotthuss type pathway). The dynamic nature of protons in the condensed phase is also evident for cocrystals as the barrier of intramolecular proton migration in formic acid (leading to tautomerism), ΔH(⧧)tautomer = 17.1 kcal/mol in the presence of pyridine is half of that in free formic acid (cf. ΔH(⧧)tautomer = 34.2 kcal/mol). We show that an acid-base reaction can be altered in the solid state to selectively form a cocrystal or salt depending on the strength and nature of aggregation.

  12. Suitability evaluation tool for lands (rice, corn and soybean) as mobile application

    NASA Astrophysics Data System (ADS)

    Rahim, S. E.; Supli, A. A.; Damiri, N.

    2017-09-01

    Evaluation of land suitability for specific purposes, e.g., for food crops, is essential as a means of understanding the determining factors in managing land successfully. A framework for evaluating land suitability for agricultural purposes was first introduced by the Food and Agriculture Organization (FAO) in the late 1970s. Used manually, the framework is time-consuming and unappealing to land users. The authors have therefore developed an effective tool by transforming the FAO framework into a smart mobile application. The application is designed using simple language for each factor and a rule-based system (RBS) algorithm. The factors involved are soil type, depth of soil solum, soil fertility, soil pH, drainage, risk of flood, etc. Suitability in this paper is limited to rice, corn, and soybean. The application is easier to understand and can automatically determine the suitability of land. Usability testing was conducted with 75 respondents; the results placed usability in the "very good" classification. The program is urgently needed by land managers, farmers, lecturers, students, and government officials (planners) to help them manage their land more easily for a better future.
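
    A rule-based suitability check of this kind can be sketched as follows. The factors match those listed in the abstract, but every threshold and the S1-N grading cutoffs are illustrative assumptions, not the FAO's actual criteria:

```python
def rice_suitability(soil_ph, solum_depth_cm, drainage, flood_risk):
    """Toy rule-based suitability classifier for rice. All thresholds
    below are illustrative assumptions, not FAO criteria."""
    satisfied = [
        5.0 <= soil_ph <= 7.0,             # slightly acid to neutral
        solum_depth_cm >= 50,              # adequate rooting depth
        drainage in ("poor", "moderate"),  # paddy rice tolerates poor drainage
        flood_risk in ("low", "moderate"),
    ]
    grades = {4: "S1 (highly suitable)", 3: "S2 (moderately suitable)",
              2: "S3 (marginally suitable)"}
    return grades.get(sum(satisfied), "N (not suitable)")

print(rice_suitability(6.0, 80, "poor", "low"))  # S1 (highly suitable)
```

A real RBS would carry one such rule table per crop (rice, corn, soybean) and take the minimum grade across factors rather than a simple count; the count is used here only to keep the sketch short.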

  13. Credibility battles in the autism litigation.

    PubMed

    Kirkland, Anna

    2012-04-01

    That vaccines do not cause autism is now a widely accepted proposition, though a few dissenters remain. An 8-year court process in the US federal vaccine injury compensation court ended in 2010 with rulings that autism was not an adverse reaction to vaccination. There were two sets of trials: one against the measles-mumps-rubella (MMR) vaccine and one against the mercury-based preservative thimerosal. The MMR story is more widely known because of publicity surrounding the main proponent of an MMR-autism link, British doctor Andrew Wakefield, but the story of thimerosal in court is largely untold. This study examines the credibility battles and boundary work in the two cases, illuminating the sustaining world of alternative science that supported the parents, lawyers, researchers, and expert witnesses against vaccines. After the loss in court, the families and their advocates transformed their scientific arguments into an indictment of procedural injustice in the vaccine court. I argue that the very efforts designed to produce legitimacy in this type of lopsided dispute will be counter-mobilized as evidence of injustice, helping us understand why settling a scientific controversy in court does not necessarily mean changing anyone's mind.

  14. Engineering monitoring expert system's developer

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.

    1991-01-01

    This research project is designed to apply artificial intelligence technology, including expert systems, a dynamic interface to neural networks, and hypertext, to construct an expert system developer. The developer environment is specifically suited to building expert systems that monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule construction and data scanning of the knowledge base. The project will result in a software system that allows its user to build monitoring-type expert systems for various equipment used in propulsion systems or ground testing facilities, accruing system performance information in a dynamic knowledge base.

  15. Unit operations for gas-liquid mass transfer in reduced gravity environments

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R.; Allen, David T.

    1992-01-01

    Basic scaling rules are derived for converting Earth-based designs of mass transfer equipment into designs for a reduced-gravity environment. Three types of gas-liquid mass transfer operations are considered: bubble columns, spray towers, and packed columns. Application of the scaling rules reveals that the height of a bubble column in lunar- and Mars-based operations would be lower than in terrestrial designs by factors of 0.64 and 0.79, respectively. The reduced-gravity columns would have greater cross-sectional areas, however, by factors of 2.4 and 1.6 for lunar and Martian settings. Similar results were obtained for spray towers. In contrast, packed column height was found to be nearly independent of gravity.
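
    The quoted factors are numerically consistent with bubble-column height scaling as g^(1/4) and cross-sectional area as g^(-1/2). This power-law reading is an inference from the numbers above, not a relation stated in the abstract; a quick check:

```python
def column_scaling(g_ratio):
    """Assumed scaling laws (inferred, not from the paper):
    height ~ g**0.25, cross-sectional area ~ g**-0.5."""
    return g_ratio ** 0.25, g_ratio ** -0.5

# g_ratio = local gravity / Earth gravity
for body, g in [("Moon", 0.165), ("Mars", 0.38)]:
    height_factor, area_factor = column_scaling(g)
    print(f"{body}: height x{height_factor:.2f}, area x{area_factor:.2f}")
# Moon: height x0.64, area x2.46
# Mars: height x0.79, area x1.62
```

The computed factors reproduce the abstract's 0.64/0.79 height values and come close to its 2.4/1.6 area values, which is what makes the inferred exponents plausible.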

  16. Error identification and recovery by student nurses using human patient simulation: opportunity to improve patient safety.

    PubMed

    Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L

    2010-02-01

    This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.

  17. Symbolic rule-based classification of lung cancer stages from free-text pathology reports.

    PubMed

    Nguyen, Anthony N; Lawley, Michael J; Hansen, David P; Bowman, Rayleen V; Clarke, Belinda E; Duhig, Edwina E; Colquist, Shoni

    2010-01-01

    To automatically classify lung tumor-node-metastasis (TNM) cancer stages from free-text pathology reports using symbolic rule-based classification. By exploiting report substructure and the symbolic manipulation of Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT) concepts in reports, statements in free text can be evaluated for relevance against factors relating to the staging guidelines. Post-coordinated SNOMED CT expressions based on templates were defined, populated by concepts in reports, and tested for subsumption by staging factors. The subsumption results were used to build logic according to the staging guidelines to calculate the TNM stage. Accuracy and confusion matrices were used to evaluate the TNM stages classified by the symbolic rule-based system, which was evaluated against a database of multidisciplinary-team staging decisions and against a machine learning-based text classification system using support vector machines. Overall accuracy on a corpus of pathology reports for 718 lung cancer patients, measured against a database of pathological TNM staging decisions, was 72%, 78%, and 94% for T, N, and M staging, respectively. The system's performance was also comparable to support vector machine classification approaches. A system to classify lung TNM stages from free-text pathology reports was developed, and it was verified that the symbolic rule-based approach using SNOMED CT can be used to extract key lung cancer characteristics from free-text reports. Future work will investigate the applicability of the proposed methodology to extracting other cancer characteristics and types.
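
    The final staging step — turning subsumption results into a stage via guideline logic — can be illustrated with a toy rule set. The thresholds and factor names below are hypothetical simplifications for illustration, not the actual TNM guideline criteria or the paper's SNOMED CT templates:

```python
def stage_m(distant_metastasis):
    """Toy M rule: M1 if any distant-metastasis concept was subsumed."""
    return "M1" if distant_metastasis else "M0"

def stage_t(tumor_size_mm, invades_chest_wall):
    """Toy T rule with simplified, illustrative thresholds."""
    if invades_chest_wall:
        return "T3"
    return "T1" if tumor_size_mm <= 30 else "T2"

def tnm(tumor_size_mm, invades_chest_wall, nodes_involved, distant_metastasis):
    """Combine the per-axis rules into a (T, N, M) triple."""
    n = "N1" if nodes_involved else "N0"
    return (stage_t(tumor_size_mm, invades_chest_wall), n,
            stage_m(distant_metastasis))

print(tnm(25, False, False, False))  # ('T1', 'N0', 'M0')
```

In the paper's system, the boolean inputs here would come from subsumption tests of post-coordinated SNOMED CT expressions against staging-factor expressions, rather than being supplied directly.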

  18. Age-Related Differences in Contribution of Rule-Based Thinking toward Moral Evaluations

    PubMed Central

    Caravita, Simona C. S.; De Silva, Lindamulage N.; Pagani, Vera; Colombo, Barbara; Antonietti, Alessandro

    2017-01-01

    This study investigates the interplay of different criteria of moral evaluation, related to the type of rule and to context characteristics, in the moral reasoning of children and early and late adolescents. Students attending fourth, seventh, and tenth grade were asked to evaluate the acceptability of rule-breaking actions using ad hoc scenarios. Results suggest that the role of different moral evaluation criteria changes with age: during adolescence, a greater integration of the moral criteria emerged. Moreover, adolescents prioritized evaluating violations of moral rules (forbidding harm to others) as non-acceptable when the perpetrator harms an innocent victim through direct personal force. The relevance of these findings for understanding how moral reasoning changes with age, and for assessing impairments in the moral reasoning of non-normative groups, is also discussed. PMID:28473788

  19. First-principles prediction of a promising p-type transparent conductive material CsGeCl3

    NASA Astrophysics Data System (ADS)

    Huang, Dan; Zhao, Yu-Jun; Ju, Zhi-Ping; Gan, Li-Yong; Chen, Xin-Man; Li, Chang-Sheng; Yao, Chun-mei; Guo, Jin

    2014-04-01

    Most reported p-type transparent conductive materials are Cu-based compounds such as CuAlO2 and CuCrO2. Here, we report that compounds based on ns2 cations with low binding energy can also possess a high valence band maximum, which is crucial for p-type doping according to the doping-limit rules. In particular, CsGeCl3, a compound whose valence band maximum derives from ns2 cations, is predicted by first-principles calculations to be a promising p-type transparent conductive material. Our results show that the p-type defect, the Ge vacancy, dominates its intrinsic defects with a shallow transition level, and the calculated hole effective masses in CsGeCl3 are low.

  20. Knowledge-driven institutional change: an empirical study on combating desertification in northern China from 1949 to 2004.

    PubMed

    Yang, Lihua; Wu, Jianguo

    2012-11-15

    Understanding institutional changes is crucial for environmental management. Here we investigated how institutional changes influenced the process and outcome of desertification control in northern China between 1949 and 2004. Our analysis was based on a case study of 21 field sites and a meta-analysis of an additional 29 sites reported in the literature. Our results show that imposed knowledge-driven institutional change was often perceived as a more progressive, scientific, and rational type of institutional change by entrepreneurs, scholars, experts, and technicians, while voluntary knowledge-driven institutional change based on the indigenous knowledge and experience of local populations was discouraged. Our findings also demonstrate that eight working rules of imposed knowledge-driven institutional change can be applied to control desertification effectively. These rules address the perception of potential gains, entrepreneurs' appeals and support, coordination of multiple goals, collaboration among multiple organizations, interest distribution and conflict resolution, incremental institutional change, external intervention, and coordination among the myriad institutions involved. Imposed knowledge-driven institutional change tended to be more successful when these rules were thoroughly implemented. These findings provide an outline for implementing future institutional changes and for policy making to combat desertification and to inform other types of ecological and environmental management. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Experiments on individual strategy updating in iterated snowdrift game under random rematching.

    PubMed

    Qi, Hang; Ma, Shoufeng; Jia, Ning; Wang, Guangchao

    2015-03-07

    How people actually play iterated snowdrift games, particularly under a random-rematching protocol, is far from well explored. Two sets of laboratory experiments on the snowdrift game were conducted to investigate human strategy-updating rules. Four groups of subjects were modeled by experience-weighted attraction (EWA) learning theory at the individual level; three of the four groups (75%) passed model validation. Substantial heterogeneity is observed among the players, who update their strategies in four typical ways, whereas few people behave like belief-based learners even under fixed pairing. Most subjects (63.9%) adopt reinforcement-learning (or similar) rules; interestingly, however, the performance of average reinforcement learners suffered. Two factors seem to benefit players in competition: sensitivity to their recent experiences and overall consideration of forgone payoffs. Moreover, subjects facing changing opponents tend to learn faster from their own recent experience, and display more diverse strategy-updating rules, than they do with a fixed opponent. These findings suggest that most subjects do apply reinforcement-learning-like updating rules even under random rematching, although these rules may not improve their performance. The findings help evolutionary biology researchers understand sophisticated human behavioral strategies in social dilemmas. Copyright © 2015 Elsevier Ltd. All rights reserved.
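
    A minimal sketch of the kind of reinforcement-learning updating rule most subjects appear to follow — a cumulative-reinforcement special case of the EWA family in which forgone payoffs are ignored (imagination weight delta = 0). This is an illustrative textbook form, not the authors' fitted model:

```python
def reinforce(attractions, chosen, payoff, phi=0.9):
    """Cumulative reinforcement update: decay every action's attraction
    by phi, then add the realized payoff to the chosen action only."""
    updated = {action: phi * value for action, value in attractions.items()}
    updated[chosen] += payoff
    return updated

a = {"cooperate": 0.0, "defect": 0.0}
a = reinforce(a, "cooperate", 3.0)  # cooperate's attraction becomes 3.0
a = reinforce(a, "defect", 5.0)     # cooperate decays, defect becomes 5.0
print(a)
```

Considering forgone payoffs (delta > 0) would also credit the unchosen action with a fraction of what it would have earned, which is one of the two factors the study finds beneficial.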

  2. Sediment-Hosted Zinc-Lead Deposits of the World - Database and Grade and Tonnage Models

    USGS Publications Warehouse

    Singer, Donald A.; Berger, Vladimir I.; Moring, Barry C.

    2009-01-01

    This report provides information on sediment-hosted zinc-lead mineral deposits based on the geologic settings that are observed on regional geologic maps. The foundation of mineral-deposit models is information about known deposits. The purpose of this publication is to make this kind of information available in digital form for sediment-hosted zinc-lead deposits. Mineral-deposit models are important in exploration planning and quantitative resource assessments: Grades and tonnages among deposit types are significantly different, and many types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Too few thoroughly explored mineral deposits are available in most local areas for reliable identification of the important geoscience variables, or for robust estimation of undiscovered deposits - thus, we need mineral-deposit models. Globally based deposit models allow recognition of important features because the global models demonstrate how common different features are. Well-designed and -constructed deposit models allow geologists to know from observed geologic environments the possible mineral-deposit types that might exist, and allow economists to determine the possible economic viability of these resources in the region. Thus, mineral-deposit models play the central role in transforming geoscience information to a form useful to policy makers. This publication contains a computer file of information on sediment-hosted zinc-lead deposits from around the world. It also presents new grade and tonnage models for nine types of these deposits and a file allowing locations of all deposits to be plotted in Google Earth. The data are presented in FileMaker Pro, Excel and text files to make the information available to as many as possible. 
The value of this information and any derived analyses depends critically on the consistent manner of data gathering. For this reason, we first discuss the rules applied in this compilation. Next, the fields of the data file are considered. Finally, we provide new grade and tonnage models that are, for the most part, based on a classification of deposits using observable geologic units from regional-scaled maps.

  3. Intelligent wear mode identification system for marine diesel engines based on multi-level belief rule base methodology

    NASA Astrophysics Data System (ADS)

    Yan, Xinping; Xu, Xiaojian; Sheng, Chenxing; Yuan, Chengqing; Li, Zhixiong

    2018-01-01

    Wear faults are among the chief causes of main-engine damage, significantly influencing the secure and economical operation of ships. It is difficult for engineers to utilize multi-source information to identify wear modes, so an intelligent wear mode identification model needs to be developed to assist engineers in diagnosing wear faults in diesel engines. For this purpose, a multi-level belief rule base (BBRB) system is proposed in this paper. The BBRB system consists of two-level belief rule bases, and the 2D and 3D characteristics of wear particles are used as antecedent attributes on each level. Quantitative and qualitative wear information with uncertainties can be processed simultaneously by the BBRB system. In order to enhance the efficiency of the BBRB, the silhouette value is adopted to determine referential points and the fuzzy c-means clustering algorithm is used to transform input wear information into belief degrees. In addition, the initial parameters of the BBRB system are constructed on the basis of expert-domain knowledge and then optimized by the genetic algorithm to ensure the robustness of the system. To verify the validity of the BBRB system, experimental data acquired from real-world diesel engines are analyzed. Five-fold cross-validation is conducted on the experimental data and the BBRB is compared with the other four models in the cross-validation. In addition, a verification dataset containing different wear particles is used to highlight the effectiveness of the BBRB system in wear mode identification. The verification results demonstrate that the proposed BBRB is effective and efficient for wear mode identification with better performance and stability than competing systems.
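
    The transformation of a numeric wear measurement into belief degrees over referential points can be sketched with the standard linear input transformation from the belief-rule-base literature (the paper instead derives referential points via the silhouette value and fuzzy c-means clustering; the referential values below are hypothetical):

```python
def belief_degrees(x, refs):
    """Distribute a numeric input over its two adjacent referential
    points so the degrees sum to 1 (standard BRB input transformation).
    refs must be sorted ascending."""
    if x <= refs[0]:
        return {refs[0]: 1.0}
    if x >= refs[-1]:
        return {refs[-1]: 1.0}
    for lo, hi in zip(refs, refs[1:]):
        if lo <= x <= hi:
            b = (x - lo) / (hi - lo)
            return {lo: 1.0 - b, hi: b}

# Hypothetical referential points for a wear-particle size attribute (um).
print(belief_degrees(7.0, [0.0, 5.0, 10.0, 20.0]))
# {5.0: 0.6, 10.0: 0.4}
```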

  4. The Patient-Centered Medical Home (PCMH) Framing Typology for Understanding the Structure, Function, and Outcomes of PCMHs.

    PubMed

    Kieber-Emmons, Autumn M; Miller, William L

    2017-01-01

    Patient-centered medical homes (PCMHs) aspire to transform today's challenged primary care services. However, it is unclear which PCMH characteristics produce specific outcomes of interest for care delivery. This study tested a novel typology of PCMH practice transformation, the PCMH framing typology, and evaluated measurable outcomes by each type. Using the Patient-Centered Primary Care Collaborative 2012 to 2013 Annual Review, this secondary analysis of the published PCMH literature extracted data from publications of 59 PCMHs. Each of the 59 sites was categorized as 1 of 4 PCMH types: add-on, renovated, hybrid, or integrated. Six outcome measures (cost reductions, decreased emergency department/hospital utilization, improved quality, improved access, increased preventive services, and improved patient satisfaction) were independently coded for each site. Practices were grouped by type, and mean outcome scores for each measure were displayed on radar graphs for comparison. While each type showed a characteristic pattern of success, only the integrated type improved in all 6 outcomes, and no type achieved high success in all measures. There appear to be 4 types of PCMH, each with a distinctive outcomes profile. Within the PCMH framing typology, direction is emerging for how best to transform primary care to achieve the greatest success. © Copyright 2017 by the American Board of Family Medicine.

  5. Culture and problem-solving: Congruency between the cultural mindset of individualism versus collectivism and problem type.

    PubMed

    Arieli, Sharon; Sagiv, Lilach

    2018-06-01

    This research investigates how the cultural mindset influences problem-solving. Drawing on the notion that cultural mindset influences the cognitive process individuals bring to bear at the moment of judgment, we propose that the congruency between the cultural mindset (individualistic vs. collectivistic) and problem type (rule-based vs. context-based) affects success in problem-solving. In 7 studies we incorporated the traditional approach to studying the impact of culture (i.e., comparing cultural groups) with contemporary approaches viewing cultural differences in a more dynamic and malleable manner. We first show that members of an individualistic group (Jewish Americans) perform better on rule-based problems, whereas members of collectivistic groups (ultra-Orthodox Jews and Arabs from Israel) perform better on context-based problems (Study 1). We then study Arabs in Israel using language (Arabic vs. Hebrew) to prime their collectivistic versus individualistic mindsets (Study 2). As hypothesized, among biculturals (those who internalize both cultures) Arabic facilitated solving context-based problems, whereas Hebrew facilitated solving rule-based problems. We follow up with 5 experiments priming the cultural mindset of individualism versus collectivism, employing various manifestations of the cultural dimension: focusing on the individual versus the collective (Studies 3, 6, and 7); experiencing independence versus interdependence (Study 4); and directing attention to objects versus the context (Studies 5a-b). Finally, we took a meta-analytic approach, showing that the effects found in Studies 3-6 are robust across priming tasks, problems, and samples. Taken together, the differences between cultural groups (Studies 1-2) were recreated when the individualistic/collectivistic cultural mindset was primed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  6. NASA Langley's Approach to the Sandia's Structural Dynamics Challenge Problem

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kenny, Sean P.; Crespo, Luis G.; Elliott, Kenny B.

    2007-01-01

    The objective of this challenge is to develop a data-based probabilistic model of uncertainty to predict the behavior of subsystems (payloads) by themselves and while coupled to a primary (target) system. Although this type of analysis is routinely performed and representative of issues faced in real-world system design and integration, several key technical challenges must still be addressed when analyzing uncertain interconnected systems. One key challenge is that there is limited data on target configurations. Moreover, it is typical to have multiple data sets from experiments conducted at the subsystem level, but sample sizes are often not sufficient to compute high-confidence statistics. In this challenge problem, additional constraints are placed as ground rules for the participants. One such rule is that mathematical models of the subsystem are limited to linear approximations of the nonlinear physics of the problem at hand. Participants are also constrained to use these models and the multiple data sets to make predictions about the target system response under completely different input conditions. Our approach initially involved screening several different methods; three of those considered are presented herein. The first is based on transforming the modal data to an orthogonal space where the mean and covariance of the data are matched by the model. The other two approaches work in physical space, where the uncertain parameter set consists of masses, stiffnesses, and damping coefficients; one matches confidence intervals of low-order moments of the statistics via optimization, while the second uses a kernel density estimation approach. The paper touches on all the approaches, lessons learned, validation metrics and their comparison, data quantity restrictions, and the assumptions/limitations of each approach.
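
    A one-dimensional Gaussian kernel density estimator, the building block of the third approach above, can be sketched as follows (a generic textbook form; the sample data are hypothetical, not from the challenge problem):

```python
import math

def gaussian_kde(samples, x, bandwidth=0.5):
    """Kernel density estimate at x from 1-D samples, Gaussian kernel:
    f(x) = (1 / (n*h*sqrt(2*pi))) * sum_i exp(-0.5*((x - x_i)/h)**2)."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                      for s in samples)

stiffness_samples = [0.9, 1.0, 1.1, 1.05]  # hypothetical normalized data
print(gaussian_kde(stiffness_samples, 1.0) > gaussian_kde(stiffness_samples, 2.0))
# True
```

The estimated density integrates to one by construction, which is what lets the fitted density stand in for a parametric distribution over the uncertain stiffness parameter.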
Keywords: Probabilistic modeling, model validation, uncertainty quantification, kernel density

  7. Detection algorithm for glass bottle mouth defect by continuous wavelet transform based on machine vision

    NASA Astrophysics Data System (ADS)

    Qian, Jinfang; Zhang, Changjiang

    2014-11-01

An efficient algorithm based on the continuous wavelet transform combined with prior knowledge, which can be used to detect defects of the glass bottle mouth, is proposed. First, under a ball integral light source, a perfect glass bottle mouth image is obtained by a Japanese Computar camera through an IEEE-1394b interface. A single-threshold method based on the gray-level histogram is used to obtain the binary image of the glass bottle mouth. In order to efficiently suppress noise, a moving-average filter is employed to smooth the histogram of the original glass bottle mouth image. The continuous wavelet transform is then applied to accurately determine the segmentation threshold. Mathematical morphology operations are used to obtain a normal binary bottle mouth mask. A glass bottle to be inspected is moved to the detection zone by a conveyor belt, and both its mouth image and binary image are obtained by the above method. The binary image is multiplied with the normal bottle mask to obtain a region of interest. Four parameters (number of connected regions, coordinate of the centroid position, diameter of the inner circle, and area of the annular region) are computed from the region of interest. Glass bottle mouth detection rules are designed from these four parameters so as to accurately detect and identify defect conditions. Finally, glass bottles from the Coca-Cola Company are used to verify the proposed algorithm. The experimental results show that the proposed algorithm accurately detects the defect conditions of the glass bottles, with 98% detection accuracy.
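A hedged sketch of how the four-parameter decision rules described above might look. The abstract does not give the actual rules, so the expected values, tolerance, and function names below are invented for illustration:

```python
# Illustrative four-parameter defect rule for a bottle-mouth mask.
# All expected values and the 5% tolerance are assumptions, not the
# paper's actual thresholds.

def classify_mouth(num_regions, centroid, inner_diameter, annulus_area,
                   expected_centroid=(240.0, 240.0), expected_diameter=180.0,
                   expected_area=12000.0, tol=0.05):
    """Return 'defective' or 'ok' from the four mask-derived parameters."""
    cx, cy = centroid
    ex, ey = expected_centroid
    if num_regions != 1:                 # a chipped rim splits the annulus
        return "defective"
    if abs(cx - ex) > tol * expected_diameter or abs(cy - ey) > tol * expected_diameter:
        return "defective"               # rim not centred in the mask
    if abs(inner_diameter - expected_diameter) > tol * expected_diameter:
        return "defective"               # inner circle too small or large
    if abs(annulus_area - expected_area) > tol * expected_area:
        return "defective"               # glass missing from the annular region
    return "ok"
```

Any real deployment would calibrate the expected values from the "perfect" reference image mentioned in the abstract.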

  8. 76 FR 61288 - Efficiency and Renewables Advisory Committee, Appliance Standards Subcommittee Negotiated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-04

    ... Medium- and Low-Voltage Dry-Type Distribution Transformers AGENCY: Department of Energy, Office of Energy... Dry-Type Distribution Transformers and the second addressing Low-Voltage Dry-Type Distribution Transformers. The Liquid Immersed and Medium-Voltage Dry-Type Group (MV Group) and the Low-Voltage Dry-Type...

  9. A hierarchical fuzzy rule-based approach to aphasia diagnosis.

    PubMed

    Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid

    2007-10-01

Aphasia diagnosis is a particularly challenging medical diagnostic task due to linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, a large number of imprecise measurements, and natural diversity and subjectivity in test subjects as well as in the opinions of the experts who diagnose the disease. To efficiently address this diagnostic process, a hierarchical fuzzy rule-based structure is proposed here that considers the effect of different features of aphasia by statistical analysis in its construction. This approach can be efficient for the diagnosis of aphasia, and possibly other medical diagnostic applications, due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The statistical parameters measured from the training set are then used to define the membership functions and the fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagating feed-forward neural network for the diagnosis of four aphasia types: anomic, Broca, global, and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also presenting a significant improvement in accuracy.
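As a hedged illustration of the statistics-to-membership-function step described above (the feature name, values, triangular form, and width factor are assumptions, not the paper's actual design):

```python
# Deriving a fuzzy membership function for one feature from training-set
# statistics (mean, std). Triangular shape and k = 2 are illustrative.

def triangular_mf(mean, std, k=2.0):
    """Membership peaking at the mean, reaching 0 at mean +/- k*std."""
    lo, hi = mean - k * std, mean + k * std
    def mu(x):
        if x <= lo or x >= hi:
            return 0.0
        if x <= mean:
            return (x - lo) / (mean - lo)
        return (hi - x) / (hi - mean)
    return mu

# Hypothetical feature: a fluency score with mean 50 and std 10.
mu_fluency = triangular_mf(mean=50.0, std=10.0)
```

A rule layer would then combine such memberships (e.g., by min/max operators) across the features of each symptom.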

  10. How people explain their own and others’ behavior: a theory of lay causal explanations

    PubMed Central

    Böhm, Gisela; Pfister, Hans-Rüdiger

    2015-01-01

    A theoretical model is proposed that specifies lay causal theories of behavior; and supporting experimental evidence is presented. The model’s basic assumption is that different types of behavior trigger different hypotheses concerning the types of causes that may have brought about the behavior. Seven categories are distinguished that are assumed to serve as both behavior types and explanation types: goals, dispositions, temporary states such as emotions, intentional actions, outcomes, events, and stimulus attributes. The model specifies inference rules that lay people use when explaining behavior (actions are caused by goals; goals are caused by higher order goals or temporary states; temporary states are caused by dispositions, stimulus attributes, or events; outcomes are caused by actions, temporary states, dispositions, stimulus attributes, or events; events are caused by dispositions or preceding events). Two experiments are reported. Experiment 1 showed that free-response explanations followed the assumed inference rules. Experiment 2 demonstrated that explanations which match the inference rules are generated faster and more frequently than non-matching explanations. Together, the findings support models that incorporate knowledge-based aspects into the process of causal explanation. The results are discussed with respect to their implications for different stages of this process, such as the activation of causal hypotheses and their subsequent selection, as well as with respect to social influences on this process. PMID:25741306
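The inference rules quoted in parentheses above map each behavior type to the cause types lay explainers treat as admissible, which lends itself to a direct lookup-table encoding. The sketch below is an illustration only; the type labels are paraphrased from the abstract:

```python
# Lay causal inference rules from the abstract, as a lookup table:
# behavior type -> set of admissible cause types.

INFERENCE_RULES = {
    "action":          {"goal"},
    "goal":            {"higher_order_goal", "temporary_state"},
    "temporary_state": {"disposition", "stimulus_attribute", "event"},
    "outcome":         {"action", "temporary_state", "disposition",
                        "stimulus_attribute", "event"},
    "event":           {"disposition", "event"},   # "preceding events"
}

def is_matching_explanation(behavior_type, cause_type):
    """True if cause_type is an admissible explanation for behavior_type."""
    return cause_type in INFERENCE_RULES.get(behavior_type, set())
```

In the reported experiments, explanations that satisfy this predicate would count as "matching" the inference rules.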

  11. Alternative formulations of the Laplace transform boundary element (LTBE) numerical method for the solution of diffusion-type equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, G.

    1992-03-01

The Laplace Transform Boundary Element (LTBE) method is a recently introduced numerical method that has been used for the solution of diffusion-type PDEs. It completely eliminates the time dependency of the problem and the need for time discretization, yielding solutions that are numerical in space and semi-analytical in time. In LTBE, solutions are obtained in the Laplace space and are then inverted numerically to yield the solution in time. The Stehfest and the DeHoog formulations of LTBE, based on two different inversion algorithms, are investigated. Both formulations produce comparable, extremely accurate solutions.
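The Stehfest formulation mentioned above rests on the Gaver-Stehfest algorithm for numerical inversion: f(t) is approximated as a weighted sum of Laplace-space samples F(i·ln2/t). A minimal sketch (not the paper's LTBE code), checked on F(s) = 1/(s+1), whose inverse is e^{-t}:

```python
# Gaver-Stehfest numerical inversion of a Laplace transform.
from math import exp, factorial, log

def stehfest_weights(N=12):
    """Gaver-Stehfest weights V_i; N must be even."""
    V = []
    for i in range(1, N + 1):
        s = 0.0
        for k in range((i + 1) // 2, min(i, N // 2) + 1):
            s += (k ** (N // 2) * factorial(2 * k)) / (
                factorial(N // 2 - k) * factorial(k) * factorial(k - 1)
                * factorial(i - k) * factorial(2 * k - i))
        V.append((-1) ** (i + N // 2) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from Laplace-space samples F(i * ln2 / t)."""
    V = stehfest_weights(N)
    a = log(2.0) / t
    return a * sum(V[i - 1] * F(i * a) for i in range(1, N + 1))

f1 = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)  # close to exp(-1)
```

The method is accurate for smooth, non-oscillatory f(t), which matches its use for diffusion-type problems; the de Hoog algorithm handles a broader class at higher cost.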

  12. Wavefront reconstruction from non-modulated pyramid wavefront sensor data using a singular value type expansion

    NASA Astrophysics Data System (ADS)

    Hutterer, Victoria; Ramlau, Ronny

    2018-03-01

    The new generation of extremely large telescopes includes adaptive optics systems to correct for atmospheric blurring. In this paper, we present a new method of wavefront reconstruction from non-modulated pyramid wavefront sensor data. The approach is based on a simplified sensor model represented as the finite Hilbert transform of the incoming phase. Due to the non-compactness of the finite Hilbert transform operator the classical theory for singular systems is not applicable. Nevertheless, we can express the Moore-Penrose inverse as a singular value type expansion with weighted Chebychev polynomials.

  13. Visual judgment of similarity across shape transformations: evidence for a compositional model of articulated objects.

    PubMed

    Barenholtz, Elan; Tarr, Michael J

    2008-06-01

A single biological object, such as a hand, can assume multiple, very different shapes due to the articulation of its parts, yet we are able to recognize all of these shapes as examples of the same object. How is this invariance to pose achieved? Here, we present evidence that the visual system maintains a model of object transformation that is based on rigid, convex parts articulating at extrema of negative curvature, i.e., part boundaries. We compared similarity judgments in a task in which subjects had to decide which of two transformed versions of a 'base' shape (one a 'biologically valid' articulation and one a geometrically similar but 'biologically invalid' articulation) was more similar to the base shape. Two types of comparisons were made: in the figure/ground reversal, the invalid articulation consisted of exactly the same contour transformation as the valid one, but with reversed figural polarity. In the axis-of-rotation reversal, the valid articulation consisted of a part rotated around its concave part boundaries, while the invalid articulation consisted of the same part rotated around the endpoints on the opposite side of the part. In two separate 2AFC similarity experiments (one in which the base and transformed shapes were presented simultaneously and one in which they were presented sequentially), subjects were more likely to match the base shape to a transform when it corresponded to a legitimate articulation. These results suggest that the visual system maintains expectations about the way objects will transform, based on their static geometry.

  14. Self-accommodation of B19' martensite in Ti-Ni shape memory alloys - Part I. Morphological and crystallographic studies of the variant selection rule

    NASA Astrophysics Data System (ADS)

    Nishida, M.; Nishiura, T.; Kawano, H.; Inamura, T.

    2012-06-01

The self-accommodation morphologies of B19' martensite in Ti-Ni alloys have been investigated by optical microscopy, scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Twelve pairs of minimum units, each consisting of two habit plane variants (HPVs) with a V-shaped morphology connected by a ? B19' type I variant-accommodation twin, were observed. Three types of self-accommodation morphologies based on the V-shaped minimum unit developed around one of the {111}B2 traces; these were triangular, rhombic and hexagonal, and consisted of three, four and six HPVs, respectively. In addition, the variant selection rule and the number of possible HPV combinations in each of these self-accommodation morphologies are discussed.

  15. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.
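A hedged sketch of the high-frequency fusion rule described above: for each detail coefficient, keep the coefficient from whichever source has the larger variance in a small region around that position. The 3x3 window is an illustrative assumption, and the lifting-wavelet decomposition and RPCA low-frequency step are omitted:

```python
# Regional-variance fusion rule for high-frequency (detail) coefficients.
# Images are plain nested lists; edge positions are clamped.

def region_variance(img, i, j, win=3):
    """Variance of the win x win neighborhood of img[i][j] (edges clamped)."""
    h, w = len(img), len(img[0])
    r = win // 2
    vals = [img[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
            for di in range(-r, r + 1) for dj in range(-r, r + 1)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def fuse_detail(a, b, win=3):
    """Per coefficient, take the source with the higher regional variance."""
    return [[a[i][j]
             if region_variance(a, i, j, win) >= region_variance(b, i, j, win)
             else b[i][j]
             for j in range(len(a[0]))] for i in range(len(a))]
```

High local variance is a proxy for salient detail (edges, texture), which is why the rule favors the sharper source at each position.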

  16. Job Shop Scheduling Focusing on Role of Buffer

    NASA Astrophysics Data System (ADS)

    Hino, Rei; Kusumi, Tetsuya; Yoo, Jae-Kyu; Shimizu, Yoshiaki

A scheduling problem is formulated in order to consistently manage each manufacturing resource, including machine tools, assembly robots, AGVs, storehouses, material shelves, and so on. The manufacturing resources are classified into three types: producer, location, and mover. This paper focuses especially on the role of the buffer, and the differences among these types are analyzed. A unified scheduling formulation is derived from the analytical results based on the resources' roles. Scheduling procedures based on dispatching rules are also proposed in order to numerically evaluate job shop-type production with finite buffer capacity. The influences on productivity of the capacity of bottlenecked production devices and of the buffer are discussed.
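The abstract does not name the dispatching rules used; as a generic illustration, the common shortest-processing-time (SPT) rule selects, from the jobs waiting in a finite input buffer, the one with the smallest processing time. Job data below are invented for the example:

```python
# SPT dispatching: pick the waiting job with the minimal processing time.
# A job is a (job_id, processing_time) pair.

def spt_dispatch(buffer):
    """Return the job in the buffer with the smallest processing time."""
    return min(buffer, key=lambda job: job[1])

queue = [("J1", 7.0), ("J2", 3.0), ("J3", 5.0)]
next_job = spt_dispatch(queue)   # ("J2", 3.0)
```

With finite buffers, such a rule would be applied each time a producer becomes free and its input buffer (a "location" resource in the paper's classification) is non-empty.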

  17. Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) share the same nomenclatural type (ATCC 13048) on the Approved Lists and are homotypic synonyms, with consequences for the name Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980).

    PubMed

    Tindall, B J; Sutton, G; Garrity, G M

    2017-02-01

    Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) were placed on the Approved Lists of Bacterial Names and were based on the same nomenclatural type, ATCC 13048. Consequently they are to be treated as homotypic synonyms. However, the names of homotypic synonyms at the rank of species normally are based on the same epithet. Examination of the Rules of the International Code of Nomenclature of Bacteria in force at the time indicates that the epithet mobilis in Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) was illegitimate at the time the Approved Lists were published and according to the Rules of the current International Code of Nomenclature of Prokaryotes continues to be illegitimate.

  18. Examining change detection approaches for tropical mangrove monitoring

    USGS Publications Warehouse

    Myint, Soe W.; Franklin, Janet; Buenemann, Michaela; Kim, Won; Giri, Chandra

    2014-01-01

This study evaluated the effectiveness of different band combinations and classifiers (unsupervised, supervised, object-oriented nearest neighbor, and object-oriented decision rule) for quantifying mangrove forest change using multitemporal Landsat data. A discriminant analysis using spectra of different vegetation types determined that bands 2 (0.52 to 0.6 μm), 5 (1.55 to 1.75 μm), and 7 (2.08 to 2.35 μm) were the most effective bands for differentiating mangrove forests from surrounding land cover types. A ranking of thirty-six change maps was produced by comparing the classification accuracy of twelve change detection approaches. The object-based nearest neighbor classifier produced the highest mean overall accuracy (84 percent) regardless of band combination. The automated decision rule-based approach (mean overall accuracy of 88 percent), as well as a composite of bands 2, 5, and 7 used with the unsupervised classifier, and the same composite or an all-band difference with the object-oriented nearest neighbor classifier, were the most effective approaches.

  19. Three types of Indian Ocean Basin modes

    NASA Astrophysics Data System (ADS)

    Guo, Feiyan; Liu, Qinyu; Yang, Jianling; Fan, Lei

    2017-04-01

The persistence of the Indian Ocean Basin Mode (IOBM) from March to August is important for the prediction of the Asian summer monsoon. Based on observational data and the pre-industrial control run outputs of the Community Climate System Model, version 4 (CCSM4), the IOBM is categorized into three types: the first type persists until August; the second type transforms from the positive (negative) IOBM into the negative (positive) Indian Ocean Dipole Mode (IODM), accompanied by the El Niño-to-La Niña (La Niña-to-El Niño) transition in the boreal summer; the third type transforms from the positive (negative) IOBM into the positive (negative) IODM in early summer. It is found that, aside from the influence of the anomalous Walker Circulation resulting from the phase transition of ENSO, the persistence of the Australia high anomaly (AHA) over the southeastern tropical Indian Ocean (TIO) and the west of Australia from March to May is favorable for the transformation of the positive (negative) IOBM into the positive (negative) IODM in the boreal summer. The stronger equatorially asymmetric sea surface temperature anomalies (SSTAs) in the boreal spring are the main mechanism for the persistence of the IOBM, because the asymmetric atmospheric responses to these SSTAs in the TIO confine the AHA to the east of Australia from May to August. This result indicates the possibility of predicting the summer atmospheric circulation based on the equatorial symmetry of SSTAs in the TIO in spring.

  20. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  1. Brushite coatings on titanium for orthopedic implants: Studies on deposition and transformation

    NASA Astrophysics Data System (ADS)

    Kumar, Mukesh

Hydroxyapatite (HA, Ca5(PO4)3OH) coating on a metallic substrate is expected to assist bone growth and implant integration. However, HA is quite stable in physiological solution, and the use of other, more reactive calcium phosphate ceramics (CPC) could induce faster bone growth by providing calcium and phosphate ions to the interacting physiological solution. This study utilized a non-line-of-sight electrodeposition process to achieve brushite (CaHPO4·2H2O) coatings. The use of potassium or sodium chloride as a conducting electrolyte in the deposition bath enhanced deposition rates and altered the morphology of the coatings. Analysis suggested a strained deposit with site-specific substitution of cations from the conducting electrolyte. Such a deposit (modified brushite) was determined to contain CaHPO4·2H2O and CaxY2(1-x)HPO4·2H2O (x ≈ 0.95), with Y as Na or K, whereas normal brushite was obtained from unsupported baths. The deposited mass of brushite increased with the charge consumed, and bonding to the substrate decreased with increasing deposition time. Though inconclusive, in-situ studies on electrodeposition did not rule out the possibility that ionic species are responsible for the deposit. Transformations of both forms of brushite were investigated in calcium-free Hank's-type simulated body fluid. Modified brushite showed the periodic appearance of freshly precipitated but poorly crystalline HA, without monetite (CaHPO4) as an intermediate, whereas normal brushite transformed to nonstoichiometric HA with monetite as an intermediate. Normal brushite also demonstrated slower transformation to HA than modified brushite. It is shown that lattice strain due to localized ion incorporation could be used to alter the properties of brushite coatings, adjusting the kinetics of transformation and, indirectly, the amount of calcium and phosphate ions released into the surroundings.

  2. The Development of a Graphical User Interface Engine for the Convenient Use of the HL7 Version 2.x Interface Engine

    PubMed Central

    Kim, Hwa Sun; Cho, Hune

    2011-01-01

Objectives The Health Level Seven Interface Engine (HL7 IE), developed by Kyungpook National University, has been employed in health information systems; however, users without a background in programming have reported difficulties in using it. Therefore, we developed a graphical user interface (GUI) engine to make the use of the HL7 IE more convenient. Methods The GUI engine was directly connected with the HL7 IE to handle the HL7 version 2.x messages. Furthermore, the information exchange rules (called the mapping data), represented by a conceptual graph in the GUI engine, were transformed into program objects that were made available to the HL7 IE; the mapping data were stored as binary files for reuse. The usefulness of the GUI engine was examined through information exchange tests between an HL7 version 2.x message and a health information database system. Results Users could easily create HL7 version 2.x messages by creating a conceptual graph through the GUI engine without requiring assistance from programmers. In addition, time could be saved when creating new information exchange rules by reusing the stored mapping data. Conclusions The GUI engine was not able to incorporate information types (e.g., extensible markup language, XML) other than the HL7 version 2.x messages and the database, because it was designed exclusively for the HL7 IE protocol. However, in future work, by including additional parsers to manage XML-based information such as Continuity of Care Documents (CCD) and Continuity of Care Records (CCR), we plan to ensure that the GUI engine will be more widely accessible for the health field. PMID:22259723

  3. The Development of a Graphical User Interface Engine for the Convenient Use of the HL7 Version 2.x Interface Engine.

    PubMed

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-12-01

The Health Level Seven Interface Engine (HL7 IE), developed by Kyungpook National University, has been employed in health information systems; however, users without a background in programming have reported difficulties in using it. Therefore, we developed a graphical user interface (GUI) engine to make the use of the HL7 IE more convenient. The GUI engine was directly connected with the HL7 IE to handle the HL7 version 2.x messages. Furthermore, the information exchange rules (called the mapping data), represented by a conceptual graph in the GUI engine, were transformed into program objects that were made available to the HL7 IE; the mapping data were stored as binary files for reuse. The usefulness of the GUI engine was examined through information exchange tests between an HL7 version 2.x message and a health information database system. Users could easily create HL7 version 2.x messages by creating a conceptual graph through the GUI engine without requiring assistance from programmers. In addition, time could be saved when creating new information exchange rules by reusing the stored mapping data. The GUI engine was not able to incorporate information types (e.g., extensible markup language, XML) other than the HL7 version 2.x messages and the database, because it was designed exclusively for the HL7 IE protocol. However, in future work, by including additional parsers to manage XML-based information such as Continuity of Care Documents (CCD) and Continuity of Care Records (CCR), we plan to ensure that the GUI engine will be more widely accessible for the health field.

  4. Advanced Transformer Demonstration And Validation Project Summary Report Based On Experiences At Nas, North Island, San Diego. California

    DTIC Science & Technology

    1992-08-01

4.4 Building 379 The Building 379 installation consisted of removing three existing 167 kVA PCB-filled, single-phase, polemount transformers...that were connected in a three-phase bank and replacing them with a single 300 kVA Square D Company VPI dry-type transformer. This task also involved

  5. Self-organization, transformity, and information.

    PubMed

    Odum, H T

    1988-11-25

    Ecosystems and other self-organizing systems develop system designs and mathematics that reinforce energy use, characteristically with alternate pulsing of production and consumption, increasingly recognized as the new paradigm. Insights from the energetics of ecological food chains suggest the need to redefine work, distinguishing kinds of energy with a new quantity, the transformity (energy of one type required per unit of another). Transformities may be used as an energy-scaling factor for the hierarchies of the universe including information. Solar transformities in the biosphere, expressed as solar emjoules per joule, range from one for solar insolation to trillions for categories of shared information. Resource contributions multiplied by their transformities provide a scientifically based value system for human service, environmental mitigation, foreign trade equity, public policy alternatives, and economic vitality.
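The valuation step in the last sentence is a weighted sum: each resource contribution (available energy, in joules) is multiplied by its solar transformity (solar emjoules per joule, sej/J) and the products are summed to give total emergy. The numbers below are illustrative assumptions, not Odum's published values:

```python
# Emergy accounting sketch: energy (J) times solar transformity (sej/J),
# summed over all inputs. Input values are invented for the example.

def total_emergy(contributions):
    """contributions: iterable of (energy_joules, transformity_sej_per_joule)."""
    return sum(energy * transformity for energy, transformity in contributions)

emergy = total_emergy([
    (1.0e9, 1.0),     # direct solar insolation: transformity = 1 by definition
    (2.0e6, 4.0e4),   # a higher-quality input, e.g. fuel (assumed transformity)
])
```

Note how the smaller energy flow dominates the total: transformity weighting is exactly what lets lower-energy, higher-quality inputs count for more.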

  6. Effects of Rolling and Cooling Conditions on Microstructure of Umbrella-Bone Steel

    NASA Astrophysics Data System (ADS)

    Wu, Yan-Xin; Fu, Jian-Xun; Zhang, Hua; Xu, Jie; Zhai, Qi-Jie

    2017-10-01

The effects of deformation temperature and cooling rate on the microstructure evolution of umbrella-bone steel were investigated using a Gleeble thermal-mechanical testing machine and dynamic continuous cooling transformation (CCT) curves. The results show that fast cooling, which lowers the starting temperature of the ferrite transformation, leads to finer ferrite grains and more pearlite. Low-temperature deformation enhances the hardening effect of austenite and reduces hardenability, allowing a wider range of cooling rates and thus avoiding martensite transformation after deformation. According to the phase transformation rules, the ultimate tensile strength and reduction in area of the wire rod formed in the optimized industrial trial are 636 MPa and 73.6%, respectively, showing excellent strength and plasticity.

  7. Complex dynamics and empirical evidence (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto

    2005-05-01

    Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous financially fragile agents is able to replicate a large number of scaling type stylized facts with a remarkable degree of statistical precision.

  8. A Flexible Mechanism of Rule Selection Enables Rapid Feature-Based Reinforcement Learning

    PubMed Central

    Balcarras, Matthew; Womelsdorf, Thilo

    2016-01-01

Learning in a new environment is influenced by prior learning and experience. Correctly applying a rule that maps a context to stimuli, actions, and outcomes enables faster learning and better outcomes compared to relying on strategies for learning that are ignorant of task structure. However, it is often difficult to know when and how to apply learned rules in new contexts. In our study, we explored how subjects employ different strategies for learning the relationship between stimulus features and positive outcomes in a probabilistic task context. We test the hypothesis that task-naive subjects will show enhanced learning of feature-specific reward associations by switching to the use of an abstract rule that associates stimuli by feature type and restricts selections to that dimension. To test this hypothesis we designed a decision-making task where subjects receive probabilistic feedback following choices between pairs of stimuli. In the task, trials are grouped in two contexts by blocks, where in one type of block there is no unique relationship between a specific feature dimension (stimulus shape or color) and positive outcomes, and following an uncued transition, alternating blocks have outcomes that are linked to either stimulus shape or color. Two-thirds of subjects (n = 22/32) exhibited behavior that was best fit by a hierarchical feature-rule model. Supporting the prediction of the model mechanism, these subjects showed significantly enhanced performance in feature-reward blocks, and rapidly switched their choice strategy to using abstract feature rules when reward contingencies changed. Choice behavior of other subjects (n = 10/32) was fit by a range of alternative reinforcement learning models representing strategies that do not benefit from applying previously learned rules.
In summary, these results show that untrained subjects are capable of flexibly shifting between behavioral rules by leveraging simple model-free reinforcement learning and context-specific selections to drive responses. PMID:27064794

  9. Enhancement of Lipid Productivity in Oleaginous Colletotrichum Fungus through Genetic Transformation Using the Yeast CtDGAT2b Gene under Model-Optimized Growth Condition

    PubMed Central

    Dey, Prabuddha; Mall, Nikunj; Chattopadhyay, Atrayee; Chakraborty, Monami; Maiti, Mrinal K.

    2014-01-01

Oleaginous fungi are of special interest among microorganisms for the production of lipid feedstocks as they can be cultured on a variety of substrates, particularly waste lignocellulosic materials, and few fungal strains are reported to accumulate inherently higher neutral lipid than bacteria or microalgae. Previously, we have characterized an endophytic filamentous fungus Colletotrichum sp. DM06 that can produce total lipid ranging from 34% to 49% of its dry cell weight (DCW) upon growing with various carbon sources and nutrient-stress conditions. In the present study, we report on the genetic transformation of this fungal strain with the CtDGAT2b gene, which encodes for a catalytically efficient isozyme of type-2 diacylglycerol acyltransferase (DGAT) from the oleaginous yeast Candida tropicalis SY005. Besides the increase in size of lipid bodies, total lipid titer by the transformed Colletotrichum (lipid content ∼73% DCW) was found to be ∼1.7-fold more than the wild type (lipid content ∼38% DCW) due to functional activity of the CtDGAT2b transgene when grown under standard growth conditions without the imposition of any nutrient stress. Analysis of lipid fractionation revealed that the neutral lipid titer in transformants increased up to 1.8-, 1.6- and 1.5-fold compared to the wild type when grown under standard, nitrogen-stress and phosphorus-stress conditions, respectively. Lipid titer of transformed cells was further increased 1.7-fold following model-based optimization of culture conditions. Taken together, ∼2.9-fold higher lipid titer was achieved in the Colletotrichum fungus due to overexpression of a rate-limiting crucial enzyme of lipid biosynthesis coupled with prediction-based bioprocess optimization. PMID:25375973

  10. The transformation of targeted killing and international order

    PubMed Central

    Senn, Martin; Troy, Jodok

    2017-01-01

    ABSTRACT This article introduces the special issue’s question of whether and how the current transformation of targeted killing is transforming the global international order and provides the conceptual ground for the individual contributions to the special issue. It develops a two-dimensional concept of political order and introduces a theoretical framework that conceives the maintenance and transformation of international order as a dynamic interplay between its behavioral dimension in the form of violence and discursive processes and its institutional dimension in the form of ideas, norms, and rules. The article also conceptualizes targeted killing and introduces a typology of targeted-killing acts on the basis of their legal and moral legitimacy. Building on this conceptual groundwork, the article takes stock of the current transformation of targeted killing and summarizes the individual contributions to this special issue. PMID:29097903

  11. Topological charge quantization via path integration: An application of the Kustaanheimo-Stiefel transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inomata, A.; Junker, G.; Wilson, R.

    1993-08-01

The unified treatment of the Dirac monopole, the Schwinger monopole, and the Aharonov-Bohm problem by Barut and Wilson is revisited via a path integral approach. The Kustaanheimo-Stiefel transformation of space and time is utilized to calculate the path integral for a charged particle in the singular vector potential. In the process of dimensional reduction, a topological charge quantization rule is derived, which contains Dirac's quantization condition as a special case.
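For reference, the special case mentioned at the end is Dirac's condition relating an electric charge e and a magnetic charge g; this is the standard textbook result (Gaussian units), not the paper's more general rule:

```latex
% Dirac quantization condition: the product of electric and magnetic
% charge is quantized in half-integer units of \hbar c.
\frac{e\,g}{\hbar c} = \frac{n}{2}, \qquad n \in \mathbb{Z}
```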

  12. 78 FR 60947 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... Rule Change Relating to Message Types, Connectivity and Bandwidth Allowance September 26, 2013... definitions, practices and requirements related to System connectivity, message types and bandwidth allowance... types and bandwidth allowance to promote transparency and maintain clarity in the rules. Specifically...

  13. 18 CFR 1b.4 - Types of investigations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Types of investigations. 1b.4 Section 1b.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.4 Types of investigations...

  14. 18 CFR 1b.4 - Types of investigations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Types of investigations. 1b.4 Section 1b.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.4 Types of investigations...

  15. 18 CFR 1b.4 - Types of investigations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Types of investigations. 1b.4 Section 1b.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.4 Types of investigations...

  16. 18 CFR 1b.4 - Types of investigations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Types of investigations. 1b.4 Section 1b.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES RULES RELATING TO INVESTIGATIONS § 1b.4 Types of investigations...

  17. Solving the interval type-2 fuzzy polynomial equation using the ranking method

    NASA Astrophysics Data System (ADS)

    Rahman, Nurhakimah Ab.; Abdullah, Lazim

    2014-07-01

    Polynomial equations with trapezoidal and triangular fuzzy numbers have attracted some interest among researchers in mathematics, engineering and the social sciences, and several methods have been developed to solve them. In this study we introduce the interval type-2 fuzzy polynomial equation and solve it using the ranking method of fuzzy numbers. The ranking method was first proposed for finding real roots of fuzzy polynomial equations; here we apply it to find real roots of interval type-2 fuzzy polynomial equations. We transform the interval type-2 fuzzy polynomial equation into a system of crisp polynomial equations using the ranking method of fuzzy numbers, which is based on three parameters: value, ambiguity and fuzziness. Finally, we illustrate our approach with a numerical example.
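    Two of the three ranking parameters named above have simple closed forms for an ordinary triangular fuzzy number. A minimal sketch, assuming the standard value/ambiguity indices of Delgado et al. and specializing to a triangular number (a, b, c); the abstract's interval type-2 case applies such indices to upper and lower membership functions, so this is illustrative only:

    ```python
    # Value and ambiguity indices for a triangular fuzzy number (a, b, c),
    # with level sets [L(alpha), R(alpha)], L(alpha) = a + (b - a)*alpha,
    # R(alpha) = c - (c - b)*alpha.

    def value(a, b, c):
        # V = integral_0^1 alpha * (L(alpha) + R(alpha)) d(alpha)
        #   = (a + 4b + c) / 6 for a triangular number.
        return (a + 4 * b + c) / 6

    def ambiguity(a, b, c):
        # A = integral_0^1 alpha * (R(alpha) - L(alpha)) d(alpha)
        #   = (c - a) / 6 for a triangular number.
        return (c - a) / 6

    print(value(1, 2, 3), ambiguity(1, 2, 3))  # 2.0 and 1/3
    ```

    Ranking then reduces to comparing these crisp indices, which is what turns the fuzzy polynomial equation into a crisp system.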

  18. Fabrication and characteristics of thin disc piezoelectric transformers based on piezoelectric buzzers with gap circles.

    PubMed

    Chang, Kuo-Tsai; Lee, Chun-Wei

    2008-04-01

    This paper investigates the design, fabrication and testing of thin disc piezoelectric transformers (PTs) based on piezoelectric buzzers with gap circles of different diameters. The performance tests focus on voltage-gain characteristics, including maximum voltage gains and maximum-gain frequencies, for each PT under different load conditions. Both a piezoelectric buzzer and a gap circle cut into the buzzer's silver electrode are needed to build each type of PT; the gap circle forms a ring-shaped input electrode and a circle-shaped output electrode. The structure and electrical connections of a PT are first described, followed by its operating principle and the associated vibration mode observed with a carbon-powder imaging technique. An experimental setup for characterizing each PT is then constructed, and the effects of gap-circle diameter on voltage-gain characteristics at different load resistances are discussed.

  19. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed, and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization was also investigated. Our results show that the rank-based transformation generally gives the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample sizes or genetic effects, the improvement in sensitivity from rank transformation outweighs the slight increase in false positive rate. However, for large sample sizes and genetic effects, normalization may not be necessary, since the increase in sensitivity is relatively modest. PMID:20003414
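    The rank-based transformation this abstract favors can be sketched as a rank-based inverse normal transform: replace each trait value by the standard-normal quantile of its rank. This sketch uses Blom's offset (c = 3/8), a common convention; the exact variant used in the study is not specified in the abstract:

    ```python
    # Rank-based inverse normal transformation: maps any trait distribution
    # onto approximate standard-normal scores while preserving the ordering.
    from statistics import NormalDist

    def rank_inverse_normal(values, c=3 / 8):
        n = len(values)
        # 1-based ranks (ties broken by position, for simplicity of the sketch)
        order = sorted(range(n), key=lambda i: values[i])
        ranks = [0] * n
        for r, i in enumerate(order, start=1):
            ranks[i] = r
        # map rank r to the standard-normal quantile of (r - c) / (n - 2c + 1)
        nd = NormalDist()
        return [nd.inv_cdf((r - c) / (n - 2 * c + 1)) for r in ranks]

    # A heavily skewed trait becomes a symmetric set of z-scores:
    transformed = rank_inverse_normal([3.1, 0.4, 7.9, 2.2, 150.0])
    print([round(z, 2) for z in transformed])
    ```

    Note that the extreme outlier (150.0) ends up only as far from zero as the minimum, which is why the rank transform is robust for non-normal traits at the cost of discarding the magnitude of differences.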

  20. Overcoming rule-based rigidity and connectionist limitations through massively-parallel case-based reasoning

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. A promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning. Case-based reasoning, which has received much attention recently, is essentially the same as analogy-based reasoning, and avoids many of the problems leveled at traditional artificial intelligence. Further problems are avoided by doing many strands of case-based reasoning in parallel, and by implementing the whole system as a neural net. In addition, such a system provides an approach to some aspects of the problems of noise, uncertainty and novelty in reasoning systems. The current neural net system (Conposit), which performs standard rule-based reasoning, is being modified into a massively parallel case-based reasoning version.
