Science.gov

Sample records for updating recursive xml

  1. KSRQuerying: XML Keyword with Recursive Querying

    NASA Astrophysics Data System (ADS)

    Taha, Kamal; Elmasri, Ramez

    We propose an XML search engine called KSRQuerying. The search engine employs recursive querying techniques, which allows a query to query the results of a previous application of itself or of another query. It answers recursive queries, keyword-based queries, and loosely structured queries. KSRQuerying uses a sort-merge algorithm, which selects subsets from the set of nodes containing keywords, where each subset contains the smallest number of nodes that: (1) are closely related to each other, and (2) contain at least one occurrence of each keyword. We experimentally evaluated the quality and efficiency of KSRQuerying and compared it with 3 systems: XSeek, Schema-Free XQuery, and XKSearch.
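
    A simplified, hypothetical sketch of the subset-selection idea described above (not the paper's actual sort-merge algorithm): keyword postings hold Dewey-style node IDs, "closely related" is approximated by the longest shared path prefix, and the best-related node set covering all keywords is returned.

```python
# Simplified illustration of selecting a small, closely related node subset
# that covers every keyword. Dewey IDs and the prefix-based "relatedness"
# measure are assumptions for this sketch, not the paper's algorithm.
from typing import Dict, List

def common_prefix_len(a: str, b: str) -> int:
    n = 0
    for x, y in zip(a.split("."), b.split(".")):
        if x != y:
            break
        n += 1
    return n

def smallest_related_subset(postings: Dict[str, List[str]]) -> List[str]:
    """postings maps each keyword to the Dewey IDs of nodes containing it."""
    anchor_kw = min(postings, key=lambda k: len(postings[k]))  # rarest keyword first
    best, best_score = None, -1
    for anchor in postings[anchor_kw]:
        subset, score = {anchor}, 0
        for kw, nodes in postings.items():
            if kw == anchor_kw:
                continue
            closest = max(nodes, key=lambda n: common_prefix_len(anchor, n))
            subset.add(closest)
            score += common_prefix_len(anchor, closest)
        if score > best_score:
            best, best_score = subset, score
    return sorted(best)

print(smallest_related_subset({
    "xml":   ["1.2.1", "1.3.4"],
    "query": ["1.2.2", "2.1.1"],
}))   # -> ['1.2.1', '1.2.2']
```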

  2. A recursive method for updating apple firmness prediction models based on spectral scattering images

    NASA Astrophysics Data System (ADS)

    Peng, Yankun; Lu, Renfu

    2007-09-01

    Multispectral scattering is effective for nondestructive prediction of fruit firmness. However, the established prediction models for multispectral scattering are variety specific and may not perform appropriately for fruit harvested from different orchards or at different times. In this research, a recursive least squares method was proposed to update the existing prediction model by adding samples from a new population to assure good performance of the model for predicting fruit from the new population. Multispectral scattering images acquired by a multispectral imaging system from Golden Delicious apples that were harvested at the same time but had different postharvest storage time periods were used to develop the updating method. Radial scattering profiles were described by the modified Lorentzian distribution (MLD) function with four profile parameters for eight wavelengths. Multi-linear regression was performed on MLD parameters to establish prediction models for fruit firmness for each group. The prediction model established in the first group was then updated by using selected samples from the second group, and four different sampling methods were compared and validated with the remaining apples. The prediction model corrected by the model-updating method gave good firmness predictions with a correlation coefficient (r) of 0.86 and a standard error of prediction (SEP) of 6.11 N. This model-updating method is promising for implementing the spectral scattering technique for real-time prediction of apple fruit firmness.
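
    The recursive least squares update at the heart of this kind of model revision fits in a few lines. The code below is a generic RLS sketch under assumed names (four MLD parameters per apple, a synthetic "new population"), not the authors' implementation.

```python
# Minimal recursive least-squares (RLS) sketch of the model-updating idea:
# start from a model fitted to population 1, then fold in selected samples
# from population 2 one at a time without refitting from scratch.
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One RLS step: theta = coefficients, P = inverse-covariance proxy,
    x = MLD profile parameters for a new apple, y = its measured firmness."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)        # gain vector
    theta = theta + (k * (y - x.T @ theta)).ravel()
    P = (P - k @ x.T @ P) / lam            # covariance update
    return theta, P

rng = np.random.default_rng(0)
theta = rng.normal(size=4)                 # stand-in for the model from the first population
P = np.eye(4) * 100.0
for _ in range(20):                        # selected samples from the new population
    x_new = rng.normal(size=4)
    y_new = x_new @ np.array([1.0, -0.5, 2.0, 0.3]) + rng.normal(scale=0.1)
    theta, P = rls_update(theta, P, x_new, y_new)
print(theta)
```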

  3. XML in Libraries.

    ERIC Educational Resources Information Center

    Tennant, Roy, Ed.

    This book presents examples of how libraries are using XML (eXtensible Markup Language) to solve problems, expand services, and improve systems. Part I contains papers on using XML in library catalog records: "Updating MARC Records with XMLMARC" (Kevin S. Clarke, Stanford University) and "Searching and Retrieving XML Records via the Web" (Theo van…

  4. SU-E-T-327: The Update of a XML Composing Tool for TrueBeam Developer Mode

    SciTech Connect

    Yan, Y; Mao, W; Jiang, S

    2014-06-01

    Purpose: To introduce a major upgrade of a novel XML beam composing tool to scientists and engineers who strive to translate certain capabilities of TrueBeam Developer Mode into future clinical benefits of radiation therapy. Methods: TrueBeam Developer Mode provides users with a test bed for unconventional plans utilizing certain unique features not accessible in the clinical mode. To access the full set of capabilities, an XML beam definition file accommodating all parameters, including kV/MV imaging triggers in the plan, can be locally loaded in this mode; however, it is difficult and laborious to compose one in a text editor. In this study, a stand-alone interactive XML beam composing application, TrueBeam TeachMod, was developed on Windows platforms to assist users in making their unique plans in a WYSIWYG manner. A conventional plan can be imported as a DICOM RT object as the start of the beam editing process, in which trajectories of all axes of a TrueBeam machine can be modified to the intended values at any control point. TeachMod also includes libraries of predefined imaging and treatment procedures to further expedite the process. Results: The TeachMod application is a major upgrade of the TeachMod module within DICOManTX. It fully supports TrueBeam 2.0. Trajectories of all axes, including all MLC leaves, can be graphically rendered and edited as needed. The time for XML beam composing has been reduced to a negligible amount regardless of the complexity of the plan. A good understanding of the XML language and the TrueBeam schema is not required, though preferred. Conclusion: Creating XML beams manually in a text editor would be a lengthy, error-prone process for sophisticated plans. An XML beam composing tool is highly desirable for R and D activities. It will bridge the gap between the scope of TrueBeam capabilities and their clinical application potential.

  5. Recursion Mathematics.

    ERIC Educational Resources Information Center

    Olson, Alton T.

    1989-01-01

    Discusses the application of the recursive method to permutations of n objects and to a problem of making c cents in change using pennies and nickels when order is important. Presents a LOGO program for the examples. (YP)

  6. XML under the Hood.

    ERIC Educational Resources Information Center

    Scharf, David

    2002-01-01

    Discusses XML (extensible markup language), particularly as it relates to libraries. Topics include organizing information; cataloging; metadata; similarities to HTML; organizations dealing with XML; making XML useful; a history of XML; the semantic Web; related technologies; XML at the Library of Congress; and its role in improving the…

  7. On recursion

    PubMed Central

    Watumull, Jeffrey; Hauser, Marc D.; Roberts, Ian G.; Hornstein, Norbert

    2014-01-01

    It is a truism that conceptual understanding of a hypothesis is required for its empirical investigation. However, the concept of recursion as articulated in the context of linguistic analysis has been perennially confused. Nowhere has this been more evident than in attempts to critique and extend Hauser et al.'s (2002) articulation. These authors put forward the hypothesis that what is uniquely human and unique to the faculty of language—the faculty of language in the narrow sense (FLN)—is a recursive system that generates and maps syntactic objects to conceptual-intentional and sensory-motor systems. This thesis was based on the standard mathematical definition of recursion as understood by Gödel and Turing, and yet has commonly been interpreted in other ways, most notably and incorrectly as a thesis about the capacity for syntactic embedding. As we explain, the recursiveness of a function is defined independently of such output, whether infinite or finite, embedded or unembedded—existent or non-existent. And to the extent that embedding is a sufficient, though not necessary, diagnostic of recursion, it has not been established that the apparent restriction on embedding in some languages is of any theoretical import. Misunderstanding of these facts has generated research that is often irrelevant to the FLN thesis as well as to other theories of language competence that focus on its generative power of expression. This essay is an attempt to bring conceptual clarity to such discussions as well as to future empirical investigations by explaining three criterial properties of recursion: computability (i.e., rules in intension rather than lists in extension); definition by induction (i.e., rules strongly generative of structure); and mathematical induction (i.e., rules for the principled—and potentially unbounded—expansion of strongly generated structure). By these necessary and sufficient criteria, the grammars of all natural languages are recursive. PMID

  8. EXRT: Towards a Simple Benchmark for XML Readiness Testing

    NASA Astrophysics Data System (ADS)

    Carey, Michael J.; Ling, Ling; Nicola, Matthias; Shao, Lin

    As we approach the ten-year anniversary of the first working draft of the XQuery language, one finds XML storage and query support in a number of commercial database systems. For many XML use cases, database vendors now recommend storing and indexing XML natively and using XQuery or SQL/XML to query and update XML directly. If the complexity of the XML data allows, shredding and reconstructing XML to/from relational tables is still an alternative as well, and might in fact outperform native XML processing. In this paper we report on an effort to evaluate these basic XML data management trade-offs for current commercial systems. We describe EXRT (Experimental XML Readiness Test), a simple micro-benchmark that methodically evaluates the impact of query characteristics on the comparison of shredded and native XML. We describe our experiences and preliminary results from EXRT'ing pressure on the XML data management facilities offered by two relational databases and one XML database system.

  9. ScotlandsPlaces XML: Bespoke XML or XML Mapping?

    ERIC Educational Resources Information Center

    Beamer, Ashley; Gillick, Mark

    2010-01-01

    Purpose: The purpose of this paper is to investigate web services (in the form of parameterised URLs), specifically in the context of the ScotlandsPlaces project. This involves cross-domain querying, data retrieval and display via the development of a bespoke XML standard rather than existing XML formats and mapping between them.…

  10. XML Based Course Websites.

    ERIC Educational Resources Information Center

    Wollowski, Michael

    XML, the extensible markup language, is a quickly evolving technology that presents a viable alternative to courseware products and promises to ease the burden of Web authors, who edit their course pages directly. XML uses tags to label kinds of contents, rather than format information. The use of XML enables faculty to focus on providing…

  11. XML-BASED REPRESENTATION

    SciTech Connect

    R. KELSEY

    2001-02-01

    For focused applications with limited user and use application communities, XML can be the right choice for representation. It is easy to use, maintain, and extend and enjoys wide support in commercial and research sectors. When the knowledge and information to be represented is object-based and use of that knowledge and information is a high priority, then XML-based representation should be considered. This paper discusses some of the issues involved in using XML-based representation and presents an example application that successfully uses an XML-based representation.

  12. XML: A Publisher's Perspective.

    ERIC Educational Resources Information Center

    Andrews, Timothy M.

    1999-01-01

    Explains eXtensible Markup Language (XML) and describes how Dow Jones Interactive is using it to improve the news-gathering and dissemination process through intranets and the World Wide Web. Discusses benefits of using XML, the relationship to HyperText Markup Language (HTML), lack of available software tools and industry support, and future…

  13. Adaptable Iterative and Recursive Kalman Filter Schemes

    NASA Technical Reports Server (NTRS)

    Zanetti, Renato

    2014-01-01

    Nonlinear filters are often very computationally expensive and usually not suitable for real-time applications. Real-time navigation algorithms are typically based on linear estimators, such as the extended Kalman filter (EKF) and, to a much lesser extent, the unscented Kalman filter. The iterated Kalman filter (IKF) and the Recursive Update Filter (RUF) are two algorithms that reduce the consequences of the linearization assumption of the EKF by performing N updates for each new measurement, where N, the number of recursions, is a tuning parameter. This paper introduces an adaptable RUF algorithm that calculates N on the fly; a similar technique can be used for the IKF as well.
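
    A schematic sketch of the "N partial updates per measurement" idea shared by the IKF/RUF family follows; the simple noise-inflation weighting and the toy range measurement are assumptions for illustration, not Zanetti's exact formulation.

```python
# Schematic sketch: split one measurement update into N weakened updates,
# re-linearizing h(x) each pass. Inflating R by N is an illustrative choice.
import numpy as np

def recursive_update(x, P, y, h, H_jac, R, N=5):
    for _ in range(N):
        H = H_jac(x)
        S = H @ P @ H.T + N * R              # inflated innovation covariance
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (y - h(x))
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy range measurement of a 2-D position.
h = lambda x: np.array([np.hypot(x[0], x[1])])
H_jac = lambda x: np.array([[x[0], x[1]]]) / np.hypot(x[0], x[1])
x, P = np.array([1.0, 1.0]), np.eye(2)
x, P = recursive_update(x, P, y=np.array([1.6]), h=h, H_jac=H_jac, R=np.eye(1) * 0.01)
print(x)
```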

  14. XML Files: MedlinePlus

    MedlinePlus

    https://medlineplus.gov/xml.html (MedlinePlus XML Files). ... information on all English and Spanish topic groups. Files generated on November 05, 2016. MedlinePlus Health Topic ...

  15. Recursion, Language, and Starlings

    ERIC Educational Resources Information Center

    Corballis, Michael C.

    2007-01-01

    It has been claimed that recursion is one of the properties that distinguishes human language from any other form of animal communication. Contrary to this claim, a recent study purports to demonstrate center-embedded recursion in starlings. I show that the performance of the birds in this study can be explained by a counting strategy, without any…

  16. XML: An Introduction.

    ERIC Educational Resources Information Center

    Lewis, John D.

    1998-01-01

    Describes XML (extensible markup language), a new language classification submitted to the World Wide Web Consortium that is defined in terms of both SGML (Standard Generalized Markup Language) and HTML (Hypertext Markup Language), specifically designed for the Internet. Limitations of PDF (Portable Document Format) files for electronic journals…

  17. Recursive Deadbeat Controller Design

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Phan, Minh Q.

    1997-01-01

    This paper presents a recursive algorithm for deadbeat predictive controller design. The method combines the concepts of system identification and deadbeat controller design. It starts with the multi-step output prediction equation and derives the control force in terms of past input and output time histories. The formulation thus derived simultaneously satisfies the system identification and deadbeat controller design requirements. As soon as the coefficient matrices are identified satisfying the output prediction equation, no further work is required to compute the deadbeat control gain matrices. The method can be implemented recursively, just like typical recursive system identification techniques.

  18. Document image representation using XML technologies

    NASA Astrophysics Data System (ADS)

    El-Kwae, Essam A.; Atmakuri, Kusuma H.

    2001-12-01

    Electronic documents have gained wide acceptance due to the ease of editing and sharing of information. However, paper documents are still widely used in many environments. Moving into a paperless and distributed office has become a major goal for document image research. A new approach for form document representation is presented. This approach allows for electronic document sharing over the World Wide Web (WWW) using Extensible Markup Language (XML) technologies. Each document is mapped into three different views: an XML view to represent the preprinted and filled-in data, an XSL (Extensible Style Sheets) view to represent the structure of the document, and a DTD (Document Type Definition) view to represent the document grammar and field constraints. The XML and XSL views are generated from a document template, either automatically using image processing techniques, or semi-automatically with minimal user interaction. The DTD representation may be fixed for general documents or may be generated semi-automatically by mining a number of filled-in document examples. Document templates need to be entered once to create the proposed representation. Afterwards, documents may be displayed, updated, or shared over the web. The merits of this approach are demonstrated using a number of examples of widely used forms.

  19. Language and Recursion

    NASA Astrophysics Data System (ADS)

    Lowenthal, Francis

    2010-11-01

    This paper examines whether the recursive structure embedded in some exercises used in the Non Verbal Communication Device (NVCD) approach is actually the factor that enables this approach to favor language acquisition and reacquisition in the case of children with cerebral lesions. To that end, a definition of the principle of recursion as it is used by logicians is presented. The two opposing approaches to the problem of language development are explained. For many authors such as Chomsky [1] the faculty of language is innate. This is known as the Standard Theory; the other researchers in this field, e.g. Bates and Elman [2], claim that language is entirely constructed by the young child: they thus speak of Language Acquisition. It is also shown that in both cases, a version of the principle of recursion is relevant for human language. The NVCD approach is defined and the results obtained in the domain of language while using this approach are presented: young subjects using this approach acquire a richer language structure or re-acquire such a structure in the case of cerebral lesions. Finally, it is shown that exercises used in this framework imply the manipulation of recursive structures leading to regular grammars. It is thus hypothesized that language development could be favored using recursive structures with the young child. It could also be the case that the NVCD-like exercises used with children lead to the elaboration of a regular language, as defined by Chomsky [3], which could be sufficient for language development but would not require full recursion. This double claim could reconcile Chomsky's approach with psychological observations made by adherents of the Language Acquisition approach, if it is confirmed by research combining the use of NVCDs, psychometric methods and the use of Neural Networks. This paper thus suggests that a research group oriented towards this problem should be organized.

  20. XML Schema Languages: Beyond DTD.

    ERIC Educational Resources Information Center

    Ioannides, Demetrios

    2000-01-01

    Discussion of XML (extensible markup language) and the traditional DTD (document type definition) format focuses on efforts of the World Wide Web Consortium's XML schema working group to develop a schema language to replace DTD that will be capable of defining the set of constraints of any possible data resource. (Contains 14 references.) (LRW)

  1. Recursion in Aphasia

    ERIC Educational Resources Information Center

    Banreti, Zoltan

    2010-01-01

    This study investigates how aphasic impairment impinges on syntactic and/or semantic recursivity of human language. A series of tests has been conducted with the participation of five Hungarian speaking aphasic subjects and 10 control subjects. Photographs representing simple situations were presented to subjects and questions were asked about…

  2. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

    The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  3. Recursion, Computers and Art

    ERIC Educational Resources Information Center

    Kemp, Andy

    2007-01-01

    "Geomlab" is a functional programming language used to describe pictures that are made up of tiles. The beauty of "Geomlab" is that it introduces students to recursion, a very powerful mathematical concept, through a very simple and enticing graphical environment. Alongside the software is a series of eight worksheets which lead into producing…

  4. Cytometry metadata in XML

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Stephanie H.

    2016-04-01

    Introduction: The International Society for Advancement of Cytometry (ISAC) has created a standard for the Minimum Information about a Flow Cytometry Experiment (MIFlowCyt 1.0). CytometryML will serve as a common metadata standard for flow and image cytometry (digital microscopy). Methods: The MIFlowCyt data-types were created, as is the rest of CytometryML, in the XML Schema Definition Language (XSD1.1). The datatypes are primarily based on the Flow Cytometry and the Digital Imaging and Communications in Medicine (DICOM) standards. A small section of the code was formatted with standard HTML formatting elements (p, h1, h2, etc.). Results: 1) The part of MIFlowCyt that describes the Experimental Overview, including the specimen and substantial parts of several other major elements, has been implemented as CytometryML XML schemas (www.cytometryml.org). 2) The feasibility of using MIFlowCyt to provide the combination of an overview, table of contents, and/or an index of a scientific paper or a report has been demonstrated. Previously, a sample electronic publication, EPUB, was created that could contain both MIFlowCyt metadata as well as the binary data. Conclusions: The use of CytometryML technology together with XHTML5 and CSS permits the metadata to be directly formatted and, together with the binary data, to be stored in an EPUB container. This will facilitate formatting, data-mining, presentation, data verification, and inclusion in structured research, clinical, and regulatory documents, as well as demonstrate a publication's adherence to the MIFlowCyt standard, promote interoperability, and should also result in the textual and numeric data being published using web technology without any change in composition.

  5. Publishing Scientific Articles in XML.

    NASA Astrophysics Data System (ADS)

    Shaya, E.; Borne, K.; Thomas, B.; Cheung, C. Y.

    2001-12-01

    Most publication houses are using SGML for electronic markup of pages intended for hardcopy. Since XML is a major subset of SGML with W3C backing and greater database compatibility, many publication houses are naturally considering switching to or including XML. Now, if authors were also to switch to XML for their manuscripts, it would greatly reduce the workload at the publication houses and reduce the number of errors that are introduced in the translation process. XML is also a logical progression for authors since it is rapidly becoming incorporated into editors such as Word Perfect, Notepad, Emacs, etc. There is an XML standard for equation markup, MathML, and equation editors exist for it. It is easy to put these manuscripts onto the Web; all one needs is to link to a standard cascading style sheet (CSS2). Leveraging our experience with encapsulating scientific data in XML, the ADC (Astronomical Data Center) staff are working out details of a scientific XML article format called "AXML" (Article XML Markup Language). We foresee using AXML eventually as an end-to-end solution for data from experiment/observation through analysis to publication. With fewer transformations needed on article text, equations, and tables, less human intervention will be required and fewer human errors will be introduced; for example, proofing of XML documents by publication houses could someday be unnecessary or (at least) vastly more efficient. In this poster we examine several important aspects of this technology, give the technical details of AXML (including a DTD) and give examples which show the power of AXML.

  6. Recursive Objects--An Object Oriented Presentation of Recursion

    ERIC Educational Resources Information Center

    Sher, David B.

    2004-01-01

    Generally, when recursion is introduced to students the concept is illustrated with a toy (Towers of Hanoi) and some abstract mathematical functions (factorial, power, Fibonacci). These illustrate recursion in the same sense that counting to 10 can be used to illustrate a for loop. These are all good illustrations, but do not represent serious…

  7. Using recursion to compute the inverse of the genomic relationship matrix.

    PubMed

    Misztal, I; Legarra, A; Aguilar, I

    2014-01-01

    Computing the inverse of the genomic relationship matrix using recursion was investigated. A traditional algorithm to invert the numerator relationship matrix is based on the observation that the conditional expectation for an additive effect of 1 animal given the effects of all other animals depends on the effects of its sire and dam only, each with a coefficient of 0.5. With genomic relationships, such an expectation depends on all other genotyped animals, and the coefficients do not have any set value. For each animal, the coefficients plus the conditional variance can be called a genomic recursion. If such recursions are known, the mixed model equations can be solved without explicitly creating the inverse of the genomic relationship matrix. Several algorithms were developed to create genomic recursions. In an algorithm with sequential updates, genomic recursions are created animal by animal. That algorithm can also be used to update a known inverse of a genomic relationship matrix for additional genotypes. In an algorithm with forward updates, a newly computed recursion is immediately applied to update recursions for remaining animals. The computing costs for both algorithms depend on the sparsity pattern of the genomic recursions, but are lower than or equal to those for regular inversion. An algorithm for proven and young animals assumes that the genomic recursions for young animals contain coefficients only for proven animals. Such an algorithm generates exact genomic EBV in genomic BLUP and is an approximation in single-step genomic BLUP. That algorithm has a cubic cost for the number of proven animals and a linear cost for the number of young animals. The genomic recursions can provide new insight into genomic evaluation and possibly reduce costs of genetic predictions with extremely large numbers of genotypes.
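
    The central identity behind these recursions can be checked numerically in a few lines: with regression coefficients b_i and conditional variances d_i computed animal by animal, G^{-1} = (I - P)' D^{-1} (I - P). The toy marker matrix below is illustrative, and the code is not the paper's sequential or forward-update algorithm.

```python
# Sketch: build genomic recursions (regressions on earlier animals plus a
# conditional variance) and reconstruct the inverse of G from them.
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(5, 50))               # toy marker matrix (5 animals)
G = M @ M.T / 50 + np.eye(5) * 1e-3        # toy genomic relationship matrix

n = G.shape[0]
P = np.zeros((n, n))                       # recursion (regression) coefficients
D = np.zeros(n)                            # conditional variances
for i in range(n):
    if i > 0:
        b = np.linalg.solve(G[:i, :i], G[:i, i])
        P[i, :i] = b
        D[i] = G[i, i] - G[i, :i] @ b
    else:
        D[i] = G[i, i]

I = np.eye(n)
G_inv = (I - P).T @ np.diag(1.0 / D) @ (I - P)
print(np.allclose(G_inv, np.linalg.inv(G)))   # expected: True
```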

  8. Recursive Feature Extraction in Graphs

    SciTech Connect

    2014-08-14

    ReFeX extracts recursive topological features from graph data. The input is a graph as a csv file and the output is a csv file containing feature values for each node in the graph. The features are based on topological counts in the neighborhood of each node, as well as recursive summaries of neighbors' features.

  9. A recursive technique for adaptive vector quantization

    NASA Technical Reports Server (NTRS)

    Lindsay, Robert A.

    1989-01-01

    Vector Quantization (VQ) is fast becoming an accepted, if not preferred, method for image compression. The VQ performs well when compressing all types of imagery including Video, Electro-Optical (EO), Infrared (IR), Synthetic Aperture Radar (SAR), Multi-Spectral (MS), and digital map data. The only requirement is to change the codebook to switch the compressor from one image sensor to another. There are several approaches for designing codebooks for a vector quantizer. Adaptive Vector Quantization is a procedure that simultaneously designs codebooks as the data is being encoded or quantized. This is done by computing the centroid as a recursive moving average where the centroids move after every vector is encoded. When computing the centroid of a fixed set of vectors the resultant centroid is identical to the previous centroid calculation. This method of centroid calculation can be easily combined with VQ encoding techniques. The defined quantizer changes after every encoded vector by recursively updating the centroid of minimum distance, which is selected by the encoder. Since the quantizer is changing definition or states after every encoded vector, the decoder must now receive updates to the codebook. This is done as side information by multiplexing bits into the compressed source data.
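
    The recursive moving-average centroid update is compact enough to sketch directly; codebook size, dimensionality, and data below are illustrative assumptions.

```python
# Minimal sketch of adaptive VQ: after each vector is encoded, only the
# winning codeword moves toward it via a recursive running mean.
import numpy as np

rng = np.random.default_rng(2)
codebook = rng.normal(size=(8, 4))          # 8 codewords, 4-dimensional vectors
counts = np.ones(len(codebook))             # vectors absorbed per codeword

def encode_and_adapt(x):
    i = np.argmin(np.linalg.norm(codebook - x, axis=1))   # nearest codeword
    counts[i] += 1
    codebook[i] += (x - codebook[i]) / counts[i]          # recursive running mean
    return i          # index sent to the decoder (plus codebook-update side info)

indices = [encode_and_adapt(v) for v in rng.normal(size=(100, 4))]
print(indices[:10], codebook[0])
```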

  10. XML: a lingua franca for science?

    PubMed

    Barillot, E; Achard, F

    2000-08-01

    XML is a new language designed to solve one of the biggest problems of the World Wide Web: its main language, HTML, is not extensible. In this article, the authors discuss the current successes and limitations of the World Wide Web, briefly explain the basics of XML and present the benefits of using XML as a data-exchange language. Finally, they discuss real-life applications that have been developed using XML, with a focus on biology.

  11. Recursivity in Lingua Cosmica

    NASA Astrophysics Data System (ADS)

    Ollongren, Alexander

    2011-02-01

    In a sequence of papers on the topic of message construction for interstellar communication by means of a cosmic language, the present author has discussed various significant requirements such a lingua should satisfy. The author's Lingua Cosmica is a (meta) system for annotating contents of possibly large-scale messages for ETI. LINCOS, based on formal constructive logic, was primarily designed for dealing with logic contents of messages but is also applicable for denoting structural properties of more general abstractions embedded in such messages. The present paper explains ways and means for achieving this for a special case: recursive entities. As usual two stages are involved: first the domain of discourse is enriched with suitable representations of the entities concerned, after which properties over them can be dealt with within the system itself. As a representative example the case of Russian dolls (Matrjoshka's) is discussed in some detail and relations with linguistic structures in natural languages are briefly exploited.

  12. Setting the Standard: XML on Campus.

    ERIC Educational Resources Information Center

    Rawlins, Mike

    2001-01-01

    Explains what XML (Extensible Markup Language) is; where to find it in a few years (everywhere from Web pages, to database management systems, to common campus applications); issues that will make XML somewhat of an experimental strategy in the near term; and the importance of decision-makers being abreast of XML trends in standards, tools…

  13. XML and the Future of Digital Libraries.

    ERIC Educational Resources Information Center

    Sperberg-McQueen, C. M.

    1998-01-01

    XML is a newly released subset of SGML that is intended to extend the benefits of that standard to the World Wide Web. Differences between SGML and XML are outlined, potential benefits are discussed, and answers are provided for some frequently asked questions regarding XML. (AEF)

  14. gfortran2xml V0.6

    2005-06-28

    The software tool gfortran2xml creates an XML representation of a parse tree created by the GNU Fortran compiler. The XML output file describes Fortran interfaces (procedure names and associated parameters and return values). gfortran2xml is useful for projects that perform static analysis to modify existing Fortran source files; tools can automatically insert performance monitoring calls to aid in improving the speed of parallel applications. gfortran2xml is used in projects that generate code for Fortran and C/C++ language interoperability.

  15. Hopf algebras and topological recursion

    NASA Astrophysics Data System (ADS)

    Esteves, João N.

    2015-11-01

    We consider a model for topological recursion based on the Hopf algebra of planar binary trees defined by Loday and Ronco (1998 Adv. Math. 139 293-309). We show that by extending this Hopf algebra by identifying pairs of nearest neighbor leaves, and thus producing graphs with loops, we obtain the full recursion formula discovered by Eynard and Orantin (2007 Commun. Number Theory Phys. 1 347-452).

  16. Generating Basic Units with XML

    NASA Astrophysics Data System (ADS)

    Gass, J.; Shaya, E.; Thomas, B.; Blackwell, J.; Cheung, C.

    2000-12-01

    A fundamental characteristic of any scientific data is its physical units. Unit comparisons and conversions are required in order to understand the meaning of the data and whether or not certain operations on the data are meaningful. For example, only data with equivalent units should be merged. Intelligent query of distributed systems, such as is envisioned for the NVO, will need to integrate or merge data with mixed conventions of units. It would be desirable to provide a standard process to enable machine understanding of units. We present a solution to this problem using XML entities. Our solution features a flexibility to encompass current unit systems, ease of human use, and interoperability (XML/MathML based) between heterogeneous operating systems.

  17. Tomcat, Oracle & XML Web Archive

    SciTech Connect

    Cothren, D. C.

    2008-01-01

    The TOX (Tomcat Oracle & XML) web archive is a foundation for development of HTTP-based applications using Tomcat (or some other servlet container) and an Oracle RDBMS. Use of TOX requires coding primarily in PL/SQL, JavaScript, and XSLT, but also in HTML, CSS and potentially Java. Coded in Java and PL/SQL itself, TOX provides the foundation for more complex applications to be built.

  18. XML Translator for Interface Descriptions

    NASA Technical Reports Server (NTRS)

    Boroson, Elizabeth R.

    2009-01-01

    A computer program defines an XML schema for specifying the interface to a generic FPGA from the perspective of software that will interact with the device. This XML interface description is then translated into header files for C, Verilog, and VHDL. User interface definition input is checked via both the provided XML schema and the translator module to ensure consistency and accuracy. Currently, programming used on both sides of an interface is inconsistent. This makes it hard to find and fix errors. By using a common schema, both sides are forced to use the same structure by using the same framework and toolset. This makes for easy identification of problems, which leads to the ability to formulate a solution. The toolset contains constants that allow a programmer to use each register, and to access each field in the register. Once programming is complete, the translator is run as part of the make process, which ensures that whenever an interface is changed, all of the code that uses the header files describing it is recompiled.
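
    A much-simplified, hypothetical sketch of the translation idea (the element and attribute names and the generated C output are assumptions, not the actual schema or toolset): read an XML register description and emit C #define lines.

```python
# Illustrative translation of a tiny, made-up XML interface description into
# C header constants, in the spirit of the tool described above.
import xml.etree.ElementTree as ET

SAMPLE = """
<interface name="FPGA0">
  <register name="STATUS"  offset="0x00"/>
  <register name="CONTROL" offset="0x04"/>
</interface>
"""

def to_c_header(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    dev = root.get("name")
    lines = [f"/* generated from XML interface description for {dev} */"]
    for reg in root.findall("register"):
        lines.append(f"#define {dev}_{reg.get('name')}_OFFSET {reg.get('offset')}")
    return "\n".join(lines)

print(to_c_header(SAMPLE))
```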

  19. Plug-and-play XML: a health care perspective.

    PubMed

    Schweiger, Ralf; Hoelzer, Simon; Altmann, Udo; Rieger, Joerg; Dudeck, Joachim

    2002-01-01

    The application of XML (Extensible Markup Language) is still costly. The authors present an approach to ease the development of XML applications. They have developed a Web-based framework that combines existing XML resources into a comprehensive XML application. The XML framework is model-driven, i.e., the authors primarily design XML document models (XML schema, document type definition), and users can enter, search, and view related XML documents using a Web browser. The XML model itself is flexible and might be composed of existing model standards. The second part of the paper relates the approach of the authors to some problems frequently encountered in the clinical documentation process.

  20. Speed up of XML parsers with PHP language implementation

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2012-11-01

    In this paper, the authors introduce PHP5's XML implementation and show how to read, parse, and write a short and uncomplicated XML file using SimpleXML in a PHP environment. The possibilities for joint use of the PHP5 language and the XML standard are described. The details of the parsing process with SimpleXML are also clarified. A practical PHP-XML-MySQL project presents the advantages of XML implementation in PHP modules. This approach allows comparatively simple searching of XML hierarchical data by means of PHP software tools. The proposed project includes a database, which can be extended with new data and new XML parsing functions.
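
    The read/parse/modify/write round trip described above is sketched here in Python's ElementTree as a language-neutral analogue of the PHP SimpleXML calls; the document content is made up for illustration.

```python
# Read, walk, modify, and re-serialize a small XML document (Python analogue
# of the SimpleXML workflow discussed in the paper).
import xml.etree.ElementTree as ET

doc = ET.fromstring("<library><book year='2012'>XML and PHP</book></library>")
for book in doc.findall("book"):                            # parse: walk the elements
    print(book.text, book.get("year"))
ET.SubElement(doc, "book", year="2016").text = "More XML"   # modify the tree
print(ET.tostring(doc, encoding="unicode"))                 # write the updated document
```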

  1. XML technology planning database : lessons learned

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Neff, Jon M.

    2005-01-01

    A hierarchical Extensible Markup Language (XML) database called XCALIBR (XML Analysis LIBRary) has been developed by the New Millennium Program to assist in return on investment (ROI) analysis and technology portfolio optimization. The database contains mission requirements and technology capabilities, which are related by use of an XML dictionary. The XML dictionary codifies a standardized taxonomy for space missions, systems, subsystems and technologies. In addition to being used for ROI analysis, the database is being examined for use in project planning, tracking and documentation. During the past year, the database has moved from development into alpha testing. This paper describes the lessons learned during construction and testing of the prototype database and the motivation for moving from an XML taxonomy to a standard XML-based ontology.

  2. Direct Waveform Inversion: a New Recursive Scheme

    NASA Astrophysics Data System (ADS)

    Zheng, Y.

    2015-12-01

    The goal of the full-waveform inversion (FWI) is to find an Earth model such that the synthetic waveforms computed using the model fit the observed ones. In practice, such a model is found in the context of the perturbation approach in an iterative fashion. Specifically, to find such a model, one starts from an initial global velocity model and performs model updating iteratively based on the Frechet derivative or single scattering by adjoint methods to minimize some cost function. However, this process often leads to local minima for the nonlinear cost function in the optimization and slow or no convergence when the starting model is far from the true model. To address the initial-model dependence and the convergence issue, we present a new direct waveform inversion (DWI) idea to invert the waveform data directly and recursively by explicitly enforcing the causality principle. The DWI offers the advantage that no global initial model is assumed and no iteration is needed for the model updating. Starting from the source-receiver region, the DWI builds the model outward recursively by fitting the earliest part of the reflection waveforms, and the DWI process is always convergent. The DWI combines seismic imaging and velocity model building into one single process, in contrast to many industrial applications where seismic imaging/migration and velocity model building are done alternately. The DWI idea is applicable to one-, two-, and three-dimensional spaces. We show numerical examples to support our idea using full waveform data including both free-surface and inter-bed multiples. Using reflection seismic data, we show that the DWI can invert for both velocity and density, separately.

  3. XML syntax for clinical laboratory procedure manuals.

    PubMed

    Saadawi, Gilan; Harrison, James H

    2003-01-01

    We have developed a document type definition (DTD) in Extensible Markup Language (XML) for clinical laboratory procedures. Our XML syntax can adequately structure a variety of procedure types across different laboratories and is compatible with current procedure standards. The combination of this format with an XML content management system and appropriate style sheets will allow efficient procedure maintenance, distributed access, customized display and effective searching across a large body of test information.

  4. XML Schema Representation of DICOM Structured Reporting

    PubMed Central

    Lee, K. P.; Hu, Jingkun

    2003-01-01

    Objective: The Digital Imaging and Communications in Medicine (DICOM) Structured Reporting (SR) standard improves the expressiveness, precision, and comparability of documentation about diagnostic images and waveforms. It supports the interchange of clinical reports in which critical features shown by images and waveforms can be denoted unambiguously by the observer, indexed, and retrieved selectively by subsequent reviewers. It is essential to provide access to clinical reports across the health care enterprise by using technologies that facilitate information exchange and processing by computers as well as provide support for robust and semantically rich standards, such as DICOM. This is supported by the current trend in the healthcare industry towards the use of Extensible Markup Language (XML) technologies for storage and exchange of medical information. The objective of the work reported here is to develop XML Schema for representing DICOM SR as XML documents. Design: We briefly describe the document type definition (DTD) for XML and its limitations, followed by XML Schema (the intended replacement for DTD) and its features. A framework for generating XML Schema for representing DICOM SR in XML is presented next. Measurements: None applicable. Results: A schema instance based on an SR example in the DICOM specification was created and validated against the schema. The schema is being used extensively in producing reports on Philips Medical Systems ultrasound equipment. Conclusion: With the framework described it is feasible to generate XML Schema using the existing DICOM SR specification. It can also be applied to generate XML Schemas for other DICOM information objects. PMID:12595410

  5. Recursion relations for conformal blocks

    NASA Astrophysics Data System (ADS)

    Penedones, João; Trevisani, Emilio; Yamazaki, Masahito

    2016-09-01

    In the context of conformal field theories in general space-time dimension, we find all the possible singularities of the conformal blocks as functions of the scaling dimension Δ of the exchanged operator. In particular, we argue, using representation theory of parabolic Verma modules, that in odd spacetime dimension the singularities are only simple poles. We discuss how to use this information to write recursion relations that determine the conformal blocks. We first recover the recursion relation introduced in [1] for conformal blocks of external scalar operators. We then generalize this recursion relation for the conformal blocks associated to the four point function of three scalar and one vector operator. Finally we specialize to the case in which the vector operator is a conserved current.

  6. Recursive least-squares learning algorithms for neural networks

    SciTech Connect

    Lewis, P.S.; Hwang, Jenq-Neng (Dept. of Electrical Engineering)

    1990-01-01

    This paper presents the development of a pair of recursive least squares (RLS) algorithms for online training of multilayer perceptrons, which are a class of feedforward artificial neural networks. These algorithms incorporate second order information about the training error surface in order to achieve faster learning rates than are possible using first order gradient descent algorithms such as the generalized delta rule. A least squares formulation is derived from a linearization of the training error function. Individual training pattern errors are linearized about the network parameters that were in effect when the pattern was presented. This permits the recursive solution of the least squares approximation, either via conventional RLS recursions or by recursive QR decomposition-based techniques. The computational complexity of the update is on the order of N^2, where N is the number of network parameters. This is due to the estimation of the N x N inverse Hessian matrix. Less computationally intensive approximations of the RLS algorithms can be easily derived by using only block diagonal elements of this matrix, thereby partitioning the learning into independent sets. A simulation example is presented in which a neural network is trained to approximate a two dimensional Gaussian bump. In this example, RLS training required an order of magnitude fewer iterations on average (527) than did training with the generalized delta rule (6331). 14 refs., 3 figs.

  7. A Novel Navigation Paradigm for XML Repositories.

    ERIC Educational Resources Information Center

    Azagury, Alain; Factor, Michael E.; Maarek, Yoelle S.; Mandler, Benny

    2002-01-01

    Discusses data exchange over the Internet and describes the architecture and implementation of an XML document repository that promotes a navigation paradigm for XML documents based on content and context. Topics include information retrieval and semistructured documents; and file systems as information storage infrastructure, particularly XMLFS.…

  8. Compressing Aviation Data in XML Format

    NASA Technical Reports Server (NTRS)

    Patel, Hemil; Lau, Derek; Kulkarni, Deepak

    2003-01-01

    Design, operations and maintenance activities in aviation involve analysis of a variety of aviation data. This data is typically in disparate formats, making it difficult to use with different software packages. Use of a self-describing and extensible standard called XML provides a solution to this interoperability problem. XML provides a standardized language for describing the contents of an information stream, performing the same kind of definitional role for Web content as a database schema performs for relational databases. XML data can be easily customized for display using Extensible Style Sheets (XSL). While the self-describing nature of XML makes it easy to reuse, it also increases the size of data significantly. Therefore, transferring a dataset in XML form can decrease throughput and increase data transfer time significantly. It also increases storage requirements significantly. A natural solution to the problem is to compress the data using a suitable algorithm and transfer it in the compressed form. We found that XML-specific compressors such as Xmill and XMLPPM generally outperform traditional compressors. However, optimal use of Xmill requires discovery of the optimal options to use while running Xmill. This, in turn, depends on the nature of the data used. Manual discovery of optimal settings can require an engineer to experiment for weeks. We have devised an XML compression advisory tool that can analyze sample data files and recommend which compression tool would work best for the data and the optimal settings to be used with an XML compression tool.
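
    The "compression advisory" selection idea can be illustrated with standard-library codecs (the paper's Xmill and XMLPPM are external tools and are not invoked here): compress a sample XML payload with several algorithms and report which one wins.

```python
# Sketch of an advisory-style comparison: try several general-purpose codecs
# on a sample XML payload and pick the one with the smallest output.
import bz2, lzma, zlib

def best_compressor(xml_bytes: bytes) -> str:
    candidates = {
        "zlib": lambda d: zlib.compress(d, 9),
        "bz2":  lambda d: bz2.compress(d, 9),
        "lzma": lambda d: lzma.compress(d),
    }
    sizes = {name: len(fn(xml_bytes)) for name, fn in candidates.items()}
    return min(sizes, key=sizes.get)

sample = b"<flight>" + b"<param name='altitude'>35000</param>" * 200 + b"</flight>"
print(best_compressor(sample))
```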

  9. Get It Together: Integrating Data with XML.

    ERIC Educational Resources Information Center

    Miller, Ron

    2003-01-01

    Discusses the use of XML for data integration to move data across different platforms, including across the Internet, from a variety of sources. Topics include flexibility; standards; organizing databases; unstructured data and the use of meta tags to encode it with XML information; cost effectiveness; and eliminating client software licenses.…

  10. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural programming…

  11. An XML portable chart format.

    PubMed Central

    Chueh, H. C.; Raila, W. F.; Berkowicz, D. A.; Barnett, G. O.

    1998-01-01

    The clinical chart remains the fundamental record of outpatient clinical care. As this information migrates to electronic form, there is an opportunity to create standard formats for transmitting these charts. This paper describes work toward a Portable Chart Format (PCF) that can represent the relevant aspects of an outpatient chart. The main goal of the format is to provide a packaging medium for outpatient clinical charts in a transfer of care scenario. A secondary goal is to support the aggregation of comparable clinical data for outcomes analysis. The syntax used for PCF is Extensible Markup Language (XML), a W3C standard. The structure of the PCF is based on a clinically relevant view of the data. The data definitions and nomenclature used are based primarily on existing clinical standards. PMID:9929315

  12. An Introduction to the Extensible Markup Language (XML).

    ERIC Educational Resources Information Center

    Bryan, Martin

    1998-01-01

    Describes Extensible Markup Language (XML), a subset of the Standard Generalized Markup Language (SGML) that is designed to make it easy to interchange structured documents over the Internet. Topics include Document Type Definition (DTD), components of XML, the use of XML, text and non-text elements, and uses for XML-coded files. (LRW)

  13. The use of XML in healthcare information management.

    PubMed

    Seals, M

    2000-01-01

    Extensible Markup Language (XML) is an emerging Internet standard that is gaining momentum in many industries, including healthcare. This article examines the origins of XML, its components, and some potential uses for XML in the healthcare industry. It then discusses a specific initiative to use XML as the basis for an industry-standard scheduling protocol.

  14. XML Format for SESAME and LEOS

    SciTech Connect

    Durrenberger, J K; Neely, J R; Sterne, P A

    2009-04-29

    The objective of this document is to describe the XML format used by LLNL and LANL to represent the equation-of-state and related material information in the LEOS and SESAME data libraries. The primary purpose of this document is to describe a specific XML format for representing EOS data that is tailored to the nature of the underlying data and is amenable to conversion to both legacy SESAME and LEOS binary formats. The secondary purpose is to describe an XML format that lends itself to a 'natural' representation in a binary file format of the SESAME, pdb or hdf5 form so that this format and related tools can be used for the rapid and efficient development and implementation of prototype data structures. This document describes the XML format only. A working knowledge of LEOS and SESAME formats is assumed.

  15. XML Flight/Ground Data Dictionary Management

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Wiklow, Colette

    2007-01-01

    A computer program generates Extensible Markup Language (XML) files that effect coupling between the command- and telemetry-handling software running aboard a spacecraft and the corresponding software running in ground support systems. The XML files are produced by use of information from the flight software and from flight-system engineering. The XML files are converted to legacy ground-system data formats for command and telemetry, transformed into Web-based and printed documentation, and used in developing new ground-system data-handling software. Previously, the information about telemetry and command was scattered in various paper documents that were not synchronized. The process of searching and reading the documents was time-consuming and introduced errors. In contrast, the XML files contain all of the information in one place. XML structures can evolve in such a manner as to enable the addition, to the XML files, of the metadata necessary to track the changes and the associated documentation. The use of this software has reduced the extent of manual operations in developing a ground data system, thereby saving considerable time and removing errors that previously arose in the translation and transcription of software information from the flight to the ground system.

  16. XML data compression in web publishing

    NASA Astrophysics Data System (ADS)

    Qiu, Ruiheng; Hu, Wei; Tang, Zhi; Lu, Xiaoqing; Zhang, Lei

    2012-03-01

    XML is widely used in various document formats on the web. But it has caused negative impacts such as increased document distribution time over the web and long content jumping and rendering delays, especially on mobile devices. Hence we proposed a Schema-based, efficient, queryable XML compressor, called XTrim, which significantly improves compression ratio by utilizing optimized information in the XML Schema while supporting efficient queries. Firstly, XTrim draws structure information from the XML document and the corresponding XML Schema. Then a novel technique is used to transform the XML tree-like structure into a compact indexed form to support efficient queries. At the same time, text values are obtained, and a language-based text trim method (LTT) that facilitates language-specific text compressors is adopted to reduce the size of text values in various languages. In LTT a word composition detection method is proposed to better process text in non-Latin languages. To evaluate the performance of XTrim, we have implemented a compressor and query engine prototype. Via extensive experiments, results show that XTrim outperforms XMill and existing queryable alternatives in terms of compression ratio, as well as query efficiency. By applying XTrim to documents, storage space can be reduced by up to 30% and the content jumping and rendering delay is reduced from 4 seconds to less than 100 ms.

  17. Recursively minimally-deformed oscillators

    NASA Astrophysics Data System (ADS)

    Katriel, J.; Quesne, C.

    1996-04-01

    A recursive deformation of the boson commutation relation is introduced. Each step consists of a minimal deformation of a commutator [a, a†] = f_k(…; n̂) into [a, a†]_{q_{k+1}} = f_k(…; n̂), where … stands for the set of deformation parameters that f_k depends on, followed by a transformation into the commutator [a, a†] = f_{k+1}(…, q_{k+1}; n̂) to which the deformed commutator is equivalent within the Fock space. Starting from the harmonic oscillator commutation relation [a, a†] = 1 we obtain the Arik-Coon and Macfarlane-Biedenharn oscillators at the first and second steps, respectively, followed by a sequence of multiparameter generalizations. Several other types of deformed commutation relations related to the treatment of integrable models and to parastatistics are also obtained. The "generic" form consists of a linear combination of exponentials of the number operator, and the various recursive families can be classified according to the number of free linear parameters involved, which depends on the form of the initial commutator.
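
    As a concrete illustration of the first step described above, written out in standard Arik-Coon conventions (our notation, not quoted from the paper):

```latex
% Minimal deformation of [a, a^dagger] = 1 with one parameter q (Arik-Coon),
% followed by the equivalent undeformed commutator within the Fock space.
\begin{align}
  [a, a^{\dagger}]_{q} \equiv a\,a^{\dagger} - q\,a^{\dagger}a = 1,
  \qquad
  a^{\dagger}a = [\hat{n}]_{q} = \frac{1 - q^{\hat{n}}}{1 - q}, \\[4pt]
  [a, a^{\dagger}] = [\hat{n}+1]_{q} - [\hat{n}]_{q} = q^{\hat{n}},
\end{align}
```

    an exponential of the number operator, consistent with the "generic" form noted at the end of the abstract.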

  18. Extensible User-Based XML Grammar Matching

    NASA Astrophysics Data System (ADS)

    Tekli, Joe; Chbeir, Richard; Yetongnon, Kokou

    XML grammar matching has found considerable interest recently due to the growing number of heterogeneous XML documents on the web and the increasing need to integrate, and consequently search and retrieve, XML data originating from different data sources. In this paper, we provide an approach for automatic XML grammar matching and comparison aiming to minimize the amount of user effort required to perform the match task. We propose an open framework based on the concept of tree edit distance, integrating different matching criteria so as to capture XML grammar element semantic and syntactic similarities, cardinality and alternativeness constraints, as well as data-type correspondences and relative ordering. It is flexible, enabling the user to choose mapping cardinality (1:1, 1:n, n:1, n:n), in comparison with existing static methods (constrained to 1:1), and it considers user feedback to adjust matching results to the user's perception of correct matches. Conducted experiments demonstrate the efficiency of our approach in comparison with alternative methods.

  19. Using Spreadsheets to Help Students Think Recursively

    ERIC Educational Resources Information Center

    Webber, Robert P.

    2012-01-01

    Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one or more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
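
    The spreadsheet check translates directly into a short script: compute the recursive sequence term by term and compare a hypothesized closed form against it. Fibonacci versus Binet's formula is used here as a stand-in example.

```python
# Verify a closed form against its recursion, the way a spreadsheet column
# of terms would be compared against a formula column.
from math import sqrt

phi = (1 + sqrt(5)) / 2
binet = lambda n: round((phi**n - (-phi) ** (-n)) / sqrt(5))   # hypothesized closed form

a, b = 0, 1                     # F(0), F(1)
for n in range(2, 30):
    a, b = b, a + b             # the recursion, like a row referencing prior cells
    assert b == binet(n), n
print("closed form matches the recursion up to n = 29")
```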

  20. The Recursive Paradigm: Suppose We Already Knew.

    ERIC Educational Resources Information Center

    Maurer, Stephen B.

    1995-01-01

    Explains the recursive model in discrete mathematics through five examples and problems. Discusses the relationship between the recursive model, mathematical induction, and inductive reasoning and the relevance of these concepts in the school curriculum. Provides ideas for approaching this material with students. (Author/DDD)

  1. Conjugate gradient algorithms using multiple recursions

    SciTech Connect

    Barth, T.; Manteuffel, T.

    1996-12-31

    Much is already known about when a conjugate gradient method can be implemented with short recursions for the direction vectors. The work done in 1984 by Faber and Manteuffel gave necessary and sufficient conditions on the iteration matrix A in order for a conjugate gradient method to be implemented with a single recursion of a certain form. However, this form does not take into account all possible recursions. This became evident when Jagels and Reichel used an algorithm of Gragg for unitary matrices to demonstrate that the class of matrices for which a practical conjugate gradient algorithm exists can be extended to include unitary and shifted unitary matrices. The implementation uses short double recursions for the direction vectors. This motivates the study of multiple recursion algorithms.
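
    For reference, the sketch below is the textbook conjugate gradient iteration with its single short recursion for the direction vectors (it does not implement the multiple-recursion variants discussed above); it assumes a symmetric positive definite matrix A.

      # Standard conjugate gradient sketch, showing the short recursion
      # p_{k+1} = r_{k+1} + beta * p_k for the direction vectors.
      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          x = np.zeros_like(b)
          r = b - A @ x
          p = r.copy()
          rs_old = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs_old / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs_old) * p   # short recursion for the next direction
              rs_old = rs_new
          return x

      A = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      print(conjugate_gradient(A, b))        # approaches the solution of Ax = b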

  2. Recursive estimation techniques for detection of small objects in infrared image data

    NASA Astrophysics Data System (ADS)

    Zeidler, J. R.; Soni, T.; Ku, W. H.

    1992-04-01

    This paper describes a recursive detection scheme for point targets in infrared (IR) images. Estimation of the background noise is done using a weighted autocorrelation matrix update method and the detection statistic is calculated using a recursive technique. A weighting factor allows the algorithm to have finite memory and deal with nonstationary noise characteristics. The detection statistic is created by using a matched filter for colored noise, using the estimated noise autocorrelation matrix. The relationship between the weighting factor, the nonstationarity of the noise and the probability of detection is described. Some results on one- and two-dimensional infrared images are presented.
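
    The sketch below illustrates the general scheme in Python: an exponentially weighted (finite-memory) update of the background autocorrelation matrix followed by a colored-noise matched-filter statistic. The forgetting factor, target template and dimensions are illustrative placeholders, not values from the paper.

      # Illustrative sketch: finite-memory autocorrelation update plus a
      # colored-noise matched-filter detection statistic.
      import numpy as np

      def update_autocorrelation(R, x, lam=0.95):
          """Recursive update R <- lam*R + (1-lam)*x x^T with forgetting factor lam."""
          x = x.reshape(-1, 1)
          return lam * R + (1.0 - lam) * (x @ x.T)

      def matched_filter_statistic(y, s, R):
          """Colored-noise matched filter: t = s^T R^{-1} y / sqrt(s^T R^{-1} s)."""
          Rinv_s = np.linalg.solve(R, s)
          return float(s @ np.linalg.solve(R, y)) / np.sqrt(float(s @ Rinv_s))

      rng = np.random.default_rng(0)
      R = np.eye(4)                      # initial noise autocorrelation estimate
      s = np.array([0.0, 1.0, 1.0, 0.0]) # template of the small target
      for _ in range(200):               # learn the background from noise-only data
          R = update_autocorrelation(R, rng.normal(size=4))
      print(matched_filter_statistic(rng.normal(size=4) + s, s, R))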

  3. Modeling geological objects with the XML Schema

    NASA Astrophysics Data System (ADS)

    Babaie, Hassan A.; Babaei, Abbed

    2005-11-01

    Interchange, storage, and management of geological data require the development of knowledge-based, standardized vocabularies and data structures. Concepts modeled and designed with the Unified Modeling Language (UML) can be mapped into the XML Schema Definition Language (XSDL) to compose modular markup languages for each discipline. Developing such efficient, intra-disciplinary, modular and reusable components, based on the XSDL namespace facility and the principles of object-oriented design, reduces redundancy, increases efficiency, scalability, and extensibility, and simplifies the maintenance and future extension of the code. This paper discusses the best practices of composition and reuse of modular intra-disciplinary components by applying XML Schema namespace syntax. In addition to several small examples given for a variety of geological cases, the paper constructs a UML conceptual model and markup language, applying an XML-type library, for a component of the plate tectonics knowledge base (TectonicsML) that deals with the divergent plate boundary.

  4. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself, the transformation functions, and also for applying the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
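
    The fragment below sketches the same XSLT-driven transformation idea using the third-party lxml package (libxslt) rather than the Java/Xalan stack described above; the stylesheet and dataset are invented examples, not part of XDMF.

      # Illustrative XSLT-driven transformation: a small XML dataset is
      # transformed into CSV-like text using lxml/libxslt.
      from lxml import etree

      XSLT = etree.XML(b"""<xsl:stylesheet version="1.0"
          xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
        <xsl:output method="text"/>
        <xsl:template match="/dataset">
          <xsl:for-each select="sample">
            <xsl:value-of select="@name"/>,<xsl:value-of select="."/><xsl:text>&#10;</xsl:text>
          </xsl:for-each>
        </xsl:template>
      </xsl:stylesheet>""")

      DATA = etree.XML(b'<dataset><sample name="t0">1.5</sample>'
                       b'<sample name="t1">2.5</sample></dataset>')

      transform = etree.XSLT(XSLT)
      print(str(transform(DATA)))   # prints one "name,value" line per sample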

  5. Pseudo Relevance Feedback Using Fast XML Retrieval

    NASA Astrophysics Data System (ADS)

    Tanioka, Hiroki

    This paper reports the results of experiments with our approach using the vector space model for retrieving large-scale XML data. The purposes of the experiments are to improve retrieval precision on the INitiative for the Evaluation of XML Retrieval (INEX) 2008 Adhoc Track, and to compare the retrieval time of our system to other systems on the INEX 2008 Efficiency Track. For the INEX 2007 Adhoc Track, we developed a system using a relative inverted-path (RIP) list and a bottom-up approach. The system achieved reasonable retrieval time for XML data; however, it has room for improvement in terms of retrieval precision. Therefore, for INEX 2008, the system uses CAS titles and Pseudo Relevance Feedback (PRF) to improve retrieval precision.
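
    As a generic illustration of pseudo relevance feedback (not the RIP-list system described above), the sketch below applies a Rocchio-style expansion to a term-frequency query vector, assuming the top-k documents of an initial ranking are relevant; alpha, beta and k are illustrative values.

      # Minimal Rocchio-style pseudo-relevance-feedback sketch over term vectors.
      from collections import Counter
      from math import sqrt

      def cosine(q, d):
          common = set(q) & set(d)
          num = sum(q[t] * d[t] for t in common)
          den = sqrt(sum(v * v for v in q.values())) * sqrt(sum(v * v for v in d.values()))
          return num / den if den else 0.0

      def prf_expand(query, docs, k=2, alpha=1.0, beta=0.75):
          ranked = sorted(docs, key=lambda d: cosine(query, d), reverse=True)[:k]
          expanded = Counter({t: alpha * w for t, w in query.items()})
          for d in ranked:                      # assume the top-k documents are relevant
              for t, w in d.items():
                  expanded[t] += beta * w / k
          return expanded

      docs = [Counter("xml retrieval element scoring".split()),
              Counter("xml keyword search element".split()),
              Counter("cooking recipes".split())]
      query = Counter("xml element".split())
      print(prf_expand(query, docs).most_common(5))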

  6. [XML/MathML version of above paper by Sevian]

    NASA Astrophysics Data System (ADS)

    Sevian, A. R.

    2003-06-01

    An XML/MathML version of A. R. Sevian's recent article [J. Opt. Netw. 6, 144 (2003)] is provided as an alternative (and experimental) format for online viewing. View the XML/MathML file. The "View Full Text" link below references a PDF file with additional information on XML/MathML technology and its relevance to scientific publishing.

  7. Recursive sequences in first-year calculus

    NASA Astrophysics Data System (ADS)

    Krainer, Thomas

    2016-02-01

    This article provides ready-to-use supplementary material on recursive sequences for a second-semester calculus class. It equips first-year calculus students with a basic methodical procedure based on which they can conduct a rigorous convergence or divergence analysis of many simple recursive sequences on their own without the need to invoke inductive arguments as is typically required in calculus textbooks. The sequences that are accessible to this kind of analysis are predominantly (eventually) monotonic, but also certain recursive sequences that alternate around their limit point as they converge can be considered.
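
    A typical example of the kind of sequence such a procedure handles is a(n+1) = sqrt(2 + a(n)) with a(1) = 0, which is increasing, bounded above by 2, and converges to the fixed point L satisfying L = sqrt(2 + L), i.e. L = 2; the short numerical check below illustrates this (it is an illustrative example, not taken from the article).

      # Numerical check of a monotone recursive sequence converging to 2.
      from math import sqrt

      a = 0.0
      for n in range(1, 11):
          a = sqrt(2.0 + a)
          print(f"a_{n + 1} = {a:.10f}")
      print("fixed point check: L = 2 satisfies L == sqrt(2 + L):", 2.0 == sqrt(4.0))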

  8. Shuttle-Data-Tape XML Translator

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    JSDTImport is a computer program for translating native Shuttle Data Tape (SDT) files from American Standard Code for Information Interchange (ASCII) format into databases in other formats. JSDTImport solves the problem of organizing the SDT content, affording flexibility to enable users to choose how to store the information in a database to better support client and server applications. JSDTImport can be dynamically configured by use of a simple Extensible Markup Language (XML) file. JSDTImport uses this XML file to define how each record and field will be parsed, its layout and definition, and how the resulting database will be structured. JSDTImport also includes a client application programming interface (API) layer that provides abstraction for the data-querying process. The API enables a user to specify the search criteria to apply in gathering all the data relevant to a query. The API can be used to organize the SDT content and translate into a native XML database. The XML format is structured into efficient sections, enabling excellent query performance by use of the XPath query language. Optionally, the content can be translated into a Structured Query Language (SQL) database for fast, reliable SQL queries on standard database server computers.

  9. Converting from XML to HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.

  10. Flight Dynamic Model Exchange using XML

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2002-01-01

    The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
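
    To make the idea concrete, the sketch below reads a one-dimensional function table from a small XML model and interpolates it; the element names are invented for the example and are not the tags defined by the AIAA standard.

      # Illustrative only: read a 1-D function table from XML and interpolate it.
      import xml.etree.ElementTree as ET
      import numpy as np

      MODEL = """<aeroModel>
        <functionTable name="CL_vs_alpha">
          <independent name="alpha_deg">-5 0 5 10 15</independent>
          <dependent   name="CL">-0.2 0.2 0.6 1.0 1.2</dependent>
        </functionTable>
      </aeroModel>"""

      root = ET.fromstring(MODEL)
      table = root.find("functionTable")
      alpha = np.array(table.find("independent").text.split(), dtype=float)
      cl = np.array(table.find("dependent").text.split(), dtype=float)
      print("CL at alpha = 7.5 deg:", np.interp(7.5, alpha, cl))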

  11. XML-based resources for simulation

    SciTech Connect

    Kelsey, R. L.; Riese, J. M.; Young, G. A.

    2004-01-01

    As simulations and the machines they run on become larger and more complex the inputs and outputs become more unwieldy. Increased complexity makes the setup of simulation problems difficult. It also contributes to the burden of handling and analyzing large amounts of output results. Another problem is that among a class of simulation codes (such as those for physical system simulation) there is often no single standard format or resource for input data. To run the same problem on different simulations requires a different setup for each simulation code. The extensible Markup Language (XML) is used to represent a general set of data resources including physical system problems, materials, and test results. These resources provide a 'plug and play' approach to simulation setup. For example, a particular material for a physical system can be selected from a material database. The XML-based representation of the selected material is then converted to the native format of the simulation being run and plugged into the simulation input file. In this manner a user can quickly and more easily put together a simulation setup. In the case of output data, an XML approach to regression testing includes tests and test results with XML-based representations. This facilitates the ability to query for specific tests and make comparisons between results. Also, output results can easily be converted to other formats for publishing online or on paper.

  12. XTCE. XML Telemetry and Command Exchange Tutorial

    NASA Technical Reports Server (NTRS)

    Rice, Kevin; Kizzort, Brad; Simon, Jerry

    2010-01-01

    An XML Telemetry Command Exchange (XTCE) tutorial oriented towards packets or minor frames is shown. The contents include: 1) The Basics; 2) Describing Telemetry; 3) Describing the Telemetry Format; 4) Commanding; 5) Forgotten Elements; 6) Implementing XTCE; and 7) GovSat.

  13. Application of recursive approaches to differential orbit correction of near Earth asteroids

    NASA Astrophysics Data System (ADS)

    Dmitriev, Vasily; Lupovka, Valery; Gritsevich, Maria

    2016-10-01

    A comparison of three approaches to the differential orbit correction of celestial bodies was performed: batch least squares fitting, Kalman filtering, and a recursive least squares filter. The first two techniques are well known and widely used (Montenbruck & Gill, 2000). Most attention is paid to the algorithm and the details of the program implementation of the recursive least squares filter. The filter's algorithm was derived from the recursive least squares technique that is widely used in data processing applications (Simon, 2006). Using a recursive least squares filter makes it possible to process a new set of observational data without reprocessing data that has already been processed. A specific feature of this approach is that the number of observations in a data set may be variable. This makes the recursive least squares filter a more flexible approach than batch least squares (which processes the complete set of observations in each iteration) and Kalman filtering (which assumes the state vector is updated with measurements at each epoch). The advantages of the proposed approach are demonstrated by processing real astrometric observations of near Earth asteroids. The case of 2008 TC3 was studied. 2008 TC3 was discovered just before its impact with Earth, and there are many closely spaced observations of 2008 TC3 in the interval between discovery and impact, which creates favorable conditions for recursive approaches. All of the approaches have very similar precision in the case of 2008 TC3; at the same time, the recursive least squares approach has much higher performance. Thus, this approach is more favorable for orbit fitting of a celestial body that is detected shortly before a collision or close approach to the Earth. This work was carried out at MIIGAiK and supported by the Russian Science Foundation, Project no. 14-22-00197. References: O. Montenbruck and E. Gill, "Satellite Orbits, Models, Methods and Applications," Springer-Verlag, 2000, pp. 1–369. D. Simon, "Optimal State Estimation
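
    The sketch below is the textbook recursive least squares update (cf. Simon, 2006), shown only to illustrate the key property used above: each new batch of observations refines the estimate without reprocessing earlier data. It is generic RLS on a toy linear problem, not the authors' orbit-determination implementation.

      # Textbook recursive least squares (RLS) update on a toy linear problem.
      import numpy as np

      def rls_update(x, P, H, y, R):
          """One RLS step for a measurement y = H x + noise with covariance R."""
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)          # gain
          x = x + K @ (y - H @ x)                 # state update
          P = (np.eye(len(x)) - K @ H) @ P        # covariance update
          return x, P

      rng = np.random.default_rng(1)
      true_state = np.array([2.0, -1.0])
      x, P = np.zeros(2), np.eye(2) * 1e3          # vague prior
      for _ in range(50):                          # observations arrive in small batches
          H = rng.normal(size=(3, 2))
          y = H @ true_state + 0.01 * rng.normal(size=3)
          x, P = rls_update(x, P, H, y, 1e-4 * np.eye(3))
      print("estimated state:", x)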

  14. Application of XML to Journal Table Archiving

    NASA Astrophysics Data System (ADS)

    Shaya, E. J.; Blackwell, J. H.; Gass, J. E.; Kargatis, V. E.; Schneider, G. L.; Weiland, J. L.; Borne, K. D.; White, R. A.; Cheung, C. Y.

    1998-12-01

    The Astronomical Data Center (ADC) at the NASA Goddard Space Flight Center is a major archive for machine-readable astronomical data tables. Many ADC tables are derived from published journal articles. Article tables are reformatted to be machine-readable and documentation is crafted to facilitate proper reuse by researchers. The recent switch of journals to web based electronic format has resulted in the generation of large amounts of tabular data that could be captured into machine-readable archive format at fairly low cost. The large data flow of the tables from all major North American astronomical journals (a factor of 100 greater than the present rate at the ADC) necessitates the development of rigorous standards for the exchange of data between researchers, publishers, and the archives. We have selected a suitable markup language that can fully describe the large variety of astronomical information contained in ADC tables. The eXtensible Markup Language XML is a powerful internet-ready documentation format for data. It provides a precise and clear data description language that is both machine- and human-readable. It is rapidly becoming the standard format for business and information transactions on the internet and it is an ideal common metadata exchange format. By labelling, or "marking up", all elements of the information content, documents are created that computers can easily parse. An XML archive can easily and automatically be maintained, ingested into standard databases or custom software, and even totally restructured whenever necessary. Structuring astronomical data into XML format will enable efficient and focused search capabilities via off-the-shelf software. The ADC is investigating XML's expanded hyperlinking power to enhance connectivity within the ADC data/metadata and developing XSL display scripts to enhance display of astronomical data. The ADC XML Definition Type Document can be viewed at http://messier.gsfc.nasa.gov/dtdhtml/DTD-TREE.html

  15. Time-varying modal parameters identification of a spacecraft with rotating flexible appendage by recursive algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhiyu; Mu, Ruinan; Xun, Guangbin; Wu, Zhigang

    2016-01-01

    The rotation of a spacecraft's flexible appendage may cause changes in modal parameters. For this time-varying system, the computational cost of the frequently used singular value decomposition (SVD) identification method is high. Some control problems, such as self-adaptive control, need the latest modal parameters to update the controller parameters in time. In this paper, the projection approximation subspace tracking (PAST) recursive algorithm is applied as an alternative method to identify the time-varying modal parameters. This method avoids the SVD by signal subspace projection and improves the computational efficiency. To verify the ability of this recursive algorithm in spacecraft modal parameter identification, a spacecraft model with a rapidly rotating appendage, the Soil Moisture Active/Passive (SMAP) satellite, is established, and the time-varying modal parameters of the satellite are identified recursively by designing the input and output signals. The results illustrate that this recursive algorithm can obtain the modal parameters at high signal-to-noise ratio (SNR) and that it has better computational efficiency than the SVD method. Moreover, to improve the identification precision of this recursive algorithm at low SNR, wavelet de-noising is used to decrease the effect of noise.
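
    A minimal real-valued sketch of the PAST recursion is given below: it tracks an r-dimensional signal subspace sample by sample without computing an SVD at every step. The dimensions, forgetting factor and noise level are illustrative, and the paper's full modal-parameter extraction is not reproduced here.

      # Minimal real-valued PAST (projection approximation subspace tracking) step.
      import numpy as np

      def past_step(W, P, x, beta=0.97):
          y = W.T @ x                     # project onto current subspace estimate
          h = P @ y
          g = h / (beta + y @ h)          # gain vector
          P = (P - np.outer(g, h)) / beta # RLS-style inverse-correlation update
          e = x - W @ y                   # projection error
          W = W + np.outer(e, g)          # subspace update
          return W, P

      rng = np.random.default_rng(0)
      n, r = 6, 2
      basis = np.linalg.qr(rng.normal(size=(n, r)))[0]   # true signal subspace
      W, P = np.eye(n, r), np.eye(r)
      for _ in range(500):
          x = basis @ rng.normal(size=r) + 0.05 * rng.normal(size=n)
          W, P = past_step(W, P, x)
      # Singular values close to 1 indicate the tracked subspace aligns with the true one.
      print(np.linalg.svd(np.linalg.qr(W)[0].T @ basis, compute_uv=False))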

  16. Recursive retrospective revaluation of causal judgments.

    PubMed

    Macho, Siegfried; Burkart, Judith

    2002-11-01

    Recursive causal evaluation is an iterative process in which the evaluation of a target cause, T, is based on the outcome of the evaluation of another cause, C, the evaluation of which itself depends on the evaluation of a 3rd cause, D. Retrospective revaluation consists of backward processing of information as indicated by the fact that the evaluation of T is influenced by subsequent information that is not concerned with T directly. Two experiments demonstrate recursive retrospective revaluation with contingency information presented in list format as well as with trial-by-trial acquisition. Existing associative models are unable to predict the results. The model of recursive causal disambiguation that conceptualizes the revaluation as a recursive process of disambiguation predicts the pattern of results correctly.

  17. Method for implementation of recursive hierarchical segmentation on parallel computers

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Inventor)

    2005-01-01

    A method, computer readable storage, and apparatus for implementing a recursive hierarchical segmentation algorithm on a parallel computing platform. The method includes setting a bottom level of recursion that defines where a recursive division of an image into sections stops dividing, and setting an intermediate level of recursion where the recursive division changes from a parallel implementation into a serial implementation. The segmentation algorithm is implemented according to the set levels. The method can also include setting a convergence check level of recursion with which the first level of recursion communicates with when performing a convergence check.
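
    The toy skeleton below (not the patented RHSEG algorithm) illustrates the two thresholds described above: recursive division runs in parallel above an intermediate recursion level, switches to a serial implementation below it, and stops dividing at the bottom level. The levels and the stand-in data are placeholders.

      # Toy skeleton of recursion levels with a parallel-to-serial switch.
      from concurrent.futures import ProcessPoolExecutor

      BOTTOM_LEVEL = 3        # stop dividing here
      INTERMEDIATE_LEVEL = 1  # switch from parallel to serial here

      def process_section(section, level):
          if level >= BOTTOM_LEVEL:
              return [section]                      # leaf: process this section directly
          halves = [section[: len(section) // 2], section[len(section) // 2 :]]
          if level < INTERMEDIATE_LEVEL:            # parallel branch of the recursion
              with ProcessPoolExecutor(max_workers=2) as pool:
                  results = list(pool.map(process_section, halves, [level + 1] * 2))
          else:                                     # serial branch of the recursion
              results = [process_section(h, level + 1) for h in halves]
          return [leaf for r in results for leaf in r]

      if __name__ == "__main__":
          image_rows = list(range(16))              # stand-in for image data
          print(process_section(image_rows, 0))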

  18. Is recursion language-specific? Evidence of recursive mechanisms in the structure of intentional action.

    PubMed

    Vicari, Giuseppe; Adenzato, Mauro

    2014-05-01

    In their 2002 seminal paper Hauser, Chomsky and Fitch hypothesize that recursion is the only human-specific and language-specific mechanism of the faculty of language. While debate focused primarily on the meaning of recursion in the hypothesis and on the human-specific and syntax-specific character of recursion, the present work focuses on the claim that recursion is language-specific. We argue that there are recursive structures in the domain of motor intentionality by way of extending John R. Searle's analysis of intentional action. We then discuss evidence from cognitive science and neuroscience supporting the claim that motor-intentional recursion is language-independent and suggest some explanatory hypotheses: (1) linguistic recursion is embodied in sensory-motor processing; (2) linguistic and motor-intentional recursions are distinct and mutually independent mechanisms. Finally, we propose some reflections about the epistemic status of HCF as presenting an empirically falsifiable hypothesis, and on the possibility of testing recursion in different cognitive domains. PMID:24762973

  20. Recursive Branching Simulated Annealing Algorithm

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew; Smith, J. Scott; Aronstein, David

    2012-01-01

    This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte-Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration. The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal
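
    For contrast with the recursive-branching variant, the sketch below is a conventional simulated-annealing loop of the kind described above (random neighbor, temperature-dependent acceptance of worse moves, gradual cooling) on an invented one-dimensional objective; the RBSA branching itself is not implemented here.

      # Conventional simulated-annealing sketch on a toy 1-D objective.
      import math
      import random

      def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
          x, best = x0, x0
          t = t0
          for _ in range(iters):
              candidate = x + random.uniform(-step, step)
              delta = objective(candidate) - objective(x)
              if delta < 0 or random.random() < math.exp(-delta / t):
                  x = candidate                  # accept better, or worse with probability
              if objective(x) < objective(best):
                  best = x
              t *= cooling                       # lower the annealing temperature
          return best

      random.seed(0)
      f = lambda x: (x - 3.0) ** 2 + 0.5 * math.sin(5.0 * x)   # toy objective
      print(simulated_annealing(f, x0=-10.0))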

  1. An Exponentiation Method for XML Element Retrieval

    PubMed Central

    2014-01-01

    XML is now widely used for modelling and storing structured documents. The structure is very rich and carries important information about contents and their relationships, for example in e-Commerce. XML data-centric collections require query terms allowing users to specify constraints on the document structure; mapping structure queries and assigning the weight are significant for the set of possibly relevant documents with respect to structural conditions. In this paper, we present an extension to the MEXIR search system that supports the combination of structural and content queries in the form of content-and-structure queries, which we call the Exponentiation function. It has been shown that the structural information improves the effectiveness of the search system by up to 52.60% over the baseline BM25 at MAP. PMID:24696643

  2. Internet-based data interchange with XML

    NASA Astrophysics Data System (ADS)

    Fuerst, Karl; Schmidt, Thomas

    2000-12-01

    In this paper, a complete concept for Internet Electronic Data Interchange (EDI) - a well-known buzzword in the area of logistics and supply chain management to enable the automation of the interactions between companies and their partners - using XML (eXtensible Markup Language) will be proposed. This approach is based on Internet and XML, because the implementation of traditional EDI (e.g. EDIFACT, ANSI X.12) is mostly too costly for small and medium sized enterprises, which want to integrate their suppliers and customers in a supply chain. The paper will also present the results of the implementation of a prototype for such a system, which has been developed for an industrial partner to improve the current situation of parts delivery. The main functions of this system are an early warning system to detect problems during the parts delivery process as early as possible, and a transport following system to pursue the transportation.

  3. An exponentiation method for XML element retrieval.

    PubMed

    Wichaiwong, Tanakorn

    2014-01-01

    XML is now widely used for modelling and storing structured documents. The structure is very rich and carries important information about contents and their relationships, for example in e-Commerce. XML data-centric collections require query terms allowing users to specify constraints on the document structure; mapping structure queries and assigning the weight are significant for the set of possibly relevant documents with respect to structural conditions. In this paper, we present an extension to the MEXIR search system that supports the combination of structural and content queries in the form of content-and-structure queries, which we call the Exponentiation function. It has been shown that the structural information improves the effectiveness of the search system by up to 52.60% over the baseline BM25 at MAP.

  4. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy

    2000-01-01

    This document describes a simple XML-based protocol that can be used for producers of events to communicate with consumers of events. The protocol described here is not meant to be the most efficient protocol, the most logical protocol, or the best protocol in any way. This protocol was defined quickly and its intent is to give us a reasonable protocol that we can implement relatively easily and then use to gain experience in distributed event services. This experience will help us evaluate proposals for event representations, XML-based encoding of information, and communication protocols. The next section of this document describes how we represent events in this protocol and then defines the two events that we choose to use for our initial experiments. These definitions are made by example so that they are informal and easy to understand. The following section then proceeds to define the producer-consumer protocol we have agreed upon for our initial experiments.

  5. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zubair, M.; Ziebartt, John (Technical Monitor)

    2001-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself, the transformation functions, and also for applying the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.

  7. XML for Detector Description at GLAST

    SciTech Connect

    Bogart, Joanne

    2002-04-30

    The problem of representing a detector in a form which is accessible to a variety of applications, allows retrieval of information in ways which are natural to those applications, and is maintainable has been vexing physicists for some time. Although invented to address an entirely different problem domain, the document markup meta-language XML is well-suited to detector description. This paper describes its use for a GLAST detector.

  8. IdealXML: An Interaction Design Tool

    NASA Astrophysics Data System (ADS)

    Montero, Francisco; López-Jaquero, Víctor

    Task modeling has become one of the cornerstones of model-based user interface design. In this paper, a task-based approach to user interface design is introduced. This approach is supported by a tool, namely IdealXML, that allows for the animation of the specified user interfaces to generate a hi-fi prototype of the future user interface while still in the first development stages.

  9. XML for Detector Description at GLAST

    NASA Astrophysics Data System (ADS)

    Bogart, J.

    2002-04-01

    The problem of representing a detector in a form which is accessible to a variety of applications, allows retrieval of information in ways which are natural to those applications, and is maintainable has been vexing physicists for some time. Although invented to address an entirely different problem domain, the document markup meta-language XML is well-suited to detector description. This paper describes its use for a GLAST detector.

  10. The potential of XML encoding in geomatics converting raster images to XML and SVG

    NASA Astrophysics Data System (ADS)

    Antoniou, Byron; Tsoulos, Lysandros

    2006-03-01

    The evolution of open standards and especially those pertaining to the family of XML technologies, have a considerable impact on the way the Geomatics community addresses the acquisition, storage, analysis and display of spatial data. The most recent version of the GML specification enables the merging of vector and raster data into a single "open" format. The notion of "coverage" as described in GML 3.0 can be the equivalent of a raster multi-band dataset. In addition, vector data storage is also described in detail through the GML Schemas and XML itself can store the values of a raster dataset, as values of a multi-table dataset. Under these circumstances an issue that must be addressed is the transformation of raster data into XML format and their subsequent visualization through SVG. The objective of this paper is to give an overview of the steps that can be followed in order to embody open standards and XML technologies in the raster domain. The last part of the work refers to a case study that suggests a step by step methodology to accomplish classification, an important function in Cartography and Remote Sensing, using the XML-encoded images.
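
    The final raster-to-SVG step can be sketched very simply: each cell of a small classified raster becomes an SVG rect whose fill encodes its class, as below. The raster, palette and cell size are invented, and the GML coverage handling described above is not shown.

      # Toy raster-to-SVG conversion: one <rect> per classified cell.
      raster = [[0, 0, 1],
                [1, 2, 2],
                [0, 1, 2]]                      # toy classified raster
      palette = {0: "#2b83ba", 1: "#abdda4", 2: "#fdae61"}
      cell = 10                                 # cell size in SVG user units

      rects = []
      for row, values in enumerate(raster):
          for col, cls in enumerate(values):
              rects.append(f'<rect x="{col * cell}" y="{row * cell}" '
                           f'width="{cell}" height="{cell}" fill="{palette[cls]}"/>')

      svg = ('<svg xmlns="http://www.w3.org/2000/svg" width="30" height="30">\n  '
             + "\n  ".join(rects) + "\n</svg>")
      print(svg)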

  11. XML Based Markup Languages for Specific Domains

    NASA Astrophysics Data System (ADS)

    Varde, Aparna; Rundensteiner, Elke; Fahrenholz, Sally

    A challenging area in web based support systems is the study of human activities in connection with the web, especially with reference to certain domains. This includes capturing human reasoning in information retrieval, facilitating the exchange of domain-specific knowledge through a common platform and developing tools for the analysis of data on the web from a domain expert's angle. Among the techniques and standards related to such work, we have XML, the eXtensible Markup Language. This serves as a medium of communication for storing and publishing textual, numeric and other forms of data seamlessly. XML tag sets are such that they preserve semantics and simplify the understanding of stored information by users. Often domain-specific markup languages are designed using XML, with a user-centric perspective. Standardization bodies and research communities may extend these to include additional semantics of areas within and related to the domain. This chapter outlines the issues to be considered in developing domain-specific markup languages: the motivation for development, the semantic considerations, the syntactic constraints and other relevant aspects, especially taking into account human factors. Illustrating examples are provided from domains such as Medicine, Finance and Materials Science. Particular emphasis in these examples is on the Materials Markup Language MatML and the semantics of one of its areas, namely, the Heat Treating of Materials. The focus of this chapter, however, is not the design of one particular language but rather the generic issues concerning the development of domain-specific markup languages.

  12. δ-dependency for privacy-preserving XML data publishing.

    PubMed

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need to protect individuals' privacy, and it becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis

  14. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of software based on an algebraic method for XML data processing, which accelerates the XML parsing process. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts to build an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is thus easily accessible to the web consumer, who can control XML file processing, search for different elements (tags), and delete or add XML content. The various tests presented show higher speed and lower resource consumption in comparison with some existing commercial parsers.

  15. Parallel scheduling of recursively defined arrays

    NASA Technical Reports Server (NTRS)

    Myers, T. J.; Gokhale, M. B.

    1986-01-01

    A new method of automatic generation of concurrent programs which constructs arrays defined by sets of recursive equations is described. It is assumed that the time of computation of an array element is a linear combination of its indices, and integer programming is used to seek a succession of hyperplanes along which array elements can be computed concurrently. The method can be used to schedule equations involving variable length dependency vectors and mutually recursive arrays. Portions of the work reported here have been implemented in the PS automatic program generation system.
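
    A tiny illustration of the hyperplane idea: for the recursively defined array a[i][j] = a[i-1][j] + a[i][j-1], every element on the anti-diagonal i + j = k depends only on earlier anti-diagonals, so each such hyperplane could be computed concurrently. The sketch below is a hypothetical example, not the PS system, and walks the hyperplanes serially just to show the dependency order.

      # Hyperplane (anti-diagonal) ordering for a recursively defined array.
      N = 5
      a = [[0] * N for _ in range(N)]
      a[0] = [1] * N                       # boundary conditions: first row ...
      for row in a:
          row[0] = 1                       # ... and first column

      for k in range(2, 2 * N - 1):        # hyperplane index k = i + j
          for i in range(max(1, k - N + 1), min(k, N)):
              j = k - i
              if 1 <= j < N:
                  a[i][j] = a[i - 1][j] + a[i][j - 1]   # all (i, j) on this hyperplane
                                                        # are mutually independent
      for row in a:
          print(row)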

  16. Recursive Implementations of the Consider Filter

    NASA Technical Reports Server (NTRS)

    Zanetti, Renato; DSouza, Chris

    2012-01-01

    One method to account for parameter errors in the Kalman filter is to consider their effect in the so-called Schmidt-Kalman filter. This work addresses issues that arise when implementing a consider Kalman filter as a real-time, recursive algorithm. A favorite implementation of the Kalman filter as an onboard navigation subsystem is the UDU formulation. A new way to implement a UDU consider filter is proposed. The non-optimality of the recursive consider filter is also analyzed, and a modified algorithm is proposed to overcome this limitation.

  17. Experimental Evaluation of Processing Time for the Synchronization of XML-Based Business Objects

    NASA Astrophysics Data System (ADS)

    Ameling, Michael; Wolf, Bernhard; Springer, Thomas; Schill, Alexander

    Business objects (BOs) are data containers for complex data structures used in business applications such as Supply Chain Management and Customer Relationship Management. Due to the replication of application logic, multiple copies of BOs are created which have to be synchronized and updated. This is a complex and time-consuming task because BOs vary considerably in their structure according to the distribution, number and size of elements. Since BOs are internally represented as XML documents, the parsing of XML is one major cost factor which has to be considered for minimizing the processing time during synchronization. The prediction of the parsing time for BOs is a significant criterion for the selection of an efficient synchronization mechanism. In this paper, we present a method to evaluate the influence of the structure of BOs on their parsing time. The results of our experimental evaluation incorporating four different XML parsers examine the dependencies between the distribution of elements and the parsing time. Finally, a general cost model is validated and simplified according to the results of the experimental setup.
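
    In the same spirit as the experiment, the sketch below measures how parsing time grows with the number of elements in a generated business-object-like document; only the standard-library parser is timed, not the four parsers of the paper, and the document shape is invented for illustration.

      # Small timing sketch: parse time versus number of elements.
      import timeit
      import xml.etree.ElementTree as ET

      def make_bo(n_items):
          items = "".join(f"<item id='{i}'><name>n{i}</name><qty>{i}</qty></item>"
                          for i in range(n_items))
          return f"<businessObject><header>order</header><items>{items}</items></businessObject>"

      for n in (100, 1000, 10000):
          doc = make_bo(n)
          t = timeit.timeit(lambda: ET.fromstring(doc), number=20) / 20
          print(f"{n:6d} items: {t * 1000:.2f} ms per parse")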

  18. Information persistence using XML database technology

    NASA Astrophysics Data System (ADS)

    Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.

    2005-05-01

    The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network centric core enterprise services and when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information that is designed to support a specific Community of Interest (COI), geographic area or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is represented and persisted for quick and easy retrieval. This paper will address information representation, persistence and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way. This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and

  19. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of

  20. Rosetta Ligand docking with flexible XML protocols.

    PubMed

    Lemmon, Gordon; Meiler, Jens

    2012-01-01

    RosettaLigand is premier software for predicting how a protein and a small molecule interact. Benchmark studies demonstrate that 70% of the top scoring RosettaLigand predicted interfaces are within 2Å RMSD from the crystal structure [1]. The latest release of the RosettaLigand software includes many new features, such as (1) docking of multiple ligands simultaneously, (2) representing ligands as fragments for greater flexibility, (3) redesign of the interface during docking, and (4) an XML script based interface that gives the user full control of the ligand docking protocol. PMID:22183535

  1. A recursive filter for despeckling SAR images.

    PubMed

    Subrahmanyam, G R K S; Rajagopalan, A N; Aravind, R

    2008-10-01

    This correspondence proposes a recursive algorithm for noise reduction in synthetic aperture radar imagery. Excellent despeckling in conjunction with feature preservation is achieved by incorporating a discontinuity-adaptive Markov random field prior within the unscented Kalman filter framework through importance sampling. The performance of this method is demonstrated on both synthetic and real examples.

  2. On the design of recursive digital filters

    NASA Technical Reports Server (NTRS)

    Shenoi, K.; Narasimha, M. J.; Peterson, A. M.

    1976-01-01

    A change of variables is described which transforms the problem of designing a recursive digital filter to that of approximation by a ratio of polynomials on a finite interval. Some analytic techniques for the design of low-pass filters are presented, illustrating the use of the transformation. Also considered are methods for the design of phase equalizers.
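
    As a concrete, minimal instance of a recursive digital filter, the sketch below implements a first-order IIR low-pass y[n] = (1 - a)·x[n] + a·y[n-1]; the change of variables and the equalizer design techniques of the paper are not reproduced here.

      # Minimal recursive (IIR) digital filter: first-order low-pass.
      def first_order_lowpass(x, a=0.9):
          y, y_prev = [], 0.0
          for sample in x:
              y_prev = (1.0 - a) * sample + a * y_prev   # output feeds back: recursion
              y.append(y_prev)
          return y

      # Step input: the output approaches 1 with a time constant set by 'a'.
      print([round(v, 3) for v in first_order_lowpass([1.0] * 10)])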

  3. TORTIS (Toddler's Own Recursive Turtle Interpreter System).

    ERIC Educational Resources Information Center

    Perlman, Radia

    TORTIS (Toddler's Own Recursive Turtle Interpreter System) is a device which can be used to study or nurture the cognitive development of preschool children. The device consists of a "turtle" which the child can control by use of buttons on a control panel. The "turtle" can be made to move in prescribed directions, to take a given number of paces,…

  4. Supporting the Learning of Recursive Problem Solving.

    ERIC Educational Resources Information Center

    Bhuiyan, Shawkat; And Others

    1994-01-01

    Presents the PETAL (Programming Environment Tool) learning environment, and discusses a study where one group of students used PETAL and another used a standard LISP environment to learn recursion. The PETAL group performed better during the learning period and on a written posttest. Theorizes why PETAL may be responsible for improved learning of…

  5. Scheduling Topics for Improved Student Comprehension of Recursion

    ERIC Educational Resources Information Center

    Zmuda, Michael; Hatch, Melanie

    2007-01-01

    This paper presents the results of an experiment conducted to assess the effects of teaching recursion in two disjoint, non-consecutive units of instruction. One group of students was taught basic and advanced recursion topics in four consecutive class periods, while a second group was taught recursion in two two-period blocks that were separated…

  6. A Survey on Teaching and Learning Recursive Programming

    ERIC Educational Resources Information Center

    Rinderknecht, Christian

    2014-01-01

    We survey the literature about the teaching and learning of recursive programming. After a short history of the advent of recursion in programming languages and its adoption by programmers, we present curricular approaches to recursion, including a review of textbooks and some programming methodology, as well as the functional and imperative…

  7. Adaptive Hypermedia Educational System Based on XML Technologies.

    ERIC Educational Resources Information Center

    Baek, Yeongtae; Wang, Changjong; Lee, Sehoon

    This paper proposes an adaptive hypermedia educational system using XML technologies, such as XML, XSL, XSLT, and XLink. Adaptive systems are capable of altering the presentation of the content of the hypermedia on the basis of a dynamic understanding of the individual user. The user profile can be collected in a user model, while the knowledge…

  8. Does Being Technical Matter? XML, Single Source, and Technical Communication.

    ERIC Educational Resources Information Center

    Sapienza, Filipp

    2002-01-01

    Describes XML, a recent Web design language that will enable technical communicators to produce documentation that can reuse information and present it across multiple types of media for diverse audiences. Argues that XML requires more interdisciplinary approaches toward the teaching and research of technical communication, particularly with…

  9. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  10. An XML file format for exchanging singlet lens specifications

    NASA Astrophysics Data System (ADS)

    Gay, Shawn C.; Gangadhara, Sanjay

    2015-10-01

    Zemax has developed an XML schema for the distribution of singlet lens specifications based on the ISO 10110 standard. In OpticStudio 15, this kind of XML data can be exported from the ISO Element Drawing analysis. The data file is then used in a feature that automates exchange of lens data between designer and manufacturer, the Cost Estimator. This Cost Estimator feature submits the XML data to various manufacturers to obtain cost estimates for prototype lens production. The workflow centered on the XML data exchange facilitates rapid cost estimate retrieval and eliminates the need for redundant manual data entry. The XML Schema Definition (XSD) for the XML format can be used with Microsoft developer tools to automatically create .NET classes to serialize and deserialize the singlet lens data to/from XML files. The format provides flexible unit specification for most parameters. Choosing XML as the basis for the file format has provided several benefits, such as the above mentioned automated serialization capabilities in .NET, a human-readable text-based format, and ready support for consumption by web services.

  11. Adding XML to the MIS Curriculum: Lessons from the Classroom

    ERIC Educational Resources Information Center

    Wagner, William P.; Pant, Vik; Hilken, Ralph

    2008-01-01

    eXtensible Markup Language (XML) is a new technology that is currently being extolled by many industry experts and software vendors. Potentially it represents a platform independent language for sharing information over networks in a way that is much more seamless than with previous technologies. It is extensible in that XML serves as a "meta"…

  12. EquiX-A Search and Query Language for XML.

    ERIC Educational Resources Information Center

    Cohen, Sara; Kanza, Yaron; Kogan, Yakov; Sagiv, Yehoshua; Nutt, Werner; Serebrenik, Alexander

    2002-01-01

    Describes EquiX, a search language for XML that combines querying with searching to query the data and the meta-data content of Web pages. Topics include search engines; a data model for XML documents; search query syntax; search query semantics; an algorithm for evaluating a query on a document; and indexing EquiX queries. (LRW)

  13. A Typed Text Retrieval Query Language for XML Documents.

    ERIC Educational Resources Information Center

    Colazzo, Dario; Sartiani, Carlo; Albano, Antonio; Manghi, Paolo; Ghelli, Giorgio; Lini, Luca; Paoli, Michele

    2002-01-01

    Discussion of XML focuses on a description of Tequyla-TX, a typed text retrieval query language for XML documents that can search on both content and structures. Highlights include motivations; numerous examples; word-based and char-based searches; tag-dependent full-text searches; text normalization; query algebra; data models and term language;…

  14. An Expressive and Efficient Language for XML Information Retrieval.

    ERIC Educational Resources Information Center

    Chinenyanga, Taurai Tapiwa; Kushmerick, Nicholas

    2002-01-01

    Discusses XML and information retrieval and describes a query language, ELIXIR (expressive and efficient language for XML information retrieval), with a textual similarity operator that can be used for similarity joins. Explains the algorithm for answering ELIXIR queries to generate intermediate relational data. (Author/LRW)

  15. Data Manipulation in an XML-Based Digital Image Library

    ERIC Educational Resources Information Center

    Chang, Naicheng

    2005-01-01

    Purpose: To help to clarify the role of XML tools and standards in supporting transition and migration towards a fully XML-based environment for managing access to information. Design/methodology/approach: The Ching Digital Image Library, built on a three-tier architecture, is used as a source of examples to illustrate a number of methods of data…

  16. Update on the Development and Validation of MERCURY: A Modern, Monte Carlo Particle Transport Code

    SciTech Connect

    Procassini, R J; Taylor, J M; McKinley, M S; Greenman, G M; Cullen, D E; O'Brien, M J; Beck, B R; Hagmann, C A

    2005-06-06

An update on the development and validation of the MERCURY Monte Carlo particle transport code is presented. MERCURY is a modern, parallel, general-purpose Monte Carlo code being developed at the Lawrence Livermore National Laboratory. During the past year, several major algorithm enhancements have been completed. These include the addition of particle trackers for 3-D combinatorial geometry (CG), 1-D radial meshes, 2-D quadrilateral unstructured meshes, as well as a feature known as templates for defining recursive, repeated structures in CG. New physics capabilities include an elastic-scattering neutron thermalization model, support for continuous energy cross sections and S(α, β) molecular bound scattering. Each of these new physics features has been validated through code-to-code comparisons with another Monte Carlo transport code. Several important computer science features have been developed, including an extensible input-parameter parser based upon the XML data description language, and a dynamic load-balance methodology for efficient parallel calculations. This paper discusses the recent work in each of these areas, and describes a plan for future extensions that are required to meet the needs of our ever-expanding user base.

  17. An introduction to on-shell recursion relations

    NASA Astrophysics Data System (ADS)

    Feng, Bo; Luo, Mingxing

    2012-10-01

    This article provides an introduction to on-shell recursion relations for calculations of tree-level amplitudes. Starting with the basics, such as spinor notations and color decompositions, we expose analytic properties of gauge-boson amplitudes, BCFW-deformations, the large z-behavior of amplitudes, and on-shell recursion relations of gluons. We discuss further developments of on-shell recursion relations, including generalization to other quantum field theories, supersymmetric theories in particular, recursion relations for off-shell currents, recursion relation with nonzero boundary contributions, bonus relations, relations for rational parts of one-loop amplitudes, recursion relations in 3D and a proof of CSW rules. Finally, we present samples of applications, including solutions of split helicity amplitudes and of N = 4 SYM theories, consequences of consistent conditions under recursion relation, Kleiss-Kuijf (KK) and Bern-Carrasco-Johansson (BCJ) relations for color-ordered gluon tree amplitudes, Kawai-Lewellen-Tye (KLT) relations.

  18. The recursion relation in Lagrangian perturbation theory

    SciTech Connect

    Rampf, Cornelius

    2012-12-01

We derive a recursion relation in the framework of Lagrangian perturbation theory, appropriate for studying the inhomogeneities of the large scale structure of the universe. We use the fact that the perturbative expansion of the matter density contrast is in one-to-one correspondence with standard perturbation theory (SPT) at any order. This correspondence has been recently shown to be valid up to fourth order for a non-relativistic, irrotational and dust-like component. Assuming it to be valid at arbitrary (higher) order, we express the Lagrangian displacement field in terms of the perturbative kernels of SPT, which are themselves given by their own well-known recursion relation. We argue that the Lagrangian solution always contains more non-linear information in comparison with the SPT solution, (mainly) if the non-perturbative density contrast is restored after the displacement field is obtained.

  19. Recursively indexed differential pulse code modulation

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Na, Sangsin

    1992-01-01

    The performance of a differential pulse code modulation (DPCM) system with a recursively indexed quantizer (RIQ) under various conditions, with first order Gauss-Markov and Laplace-Markov sources as inputs, is studied. When the predictor is matched to the input, the proposed system performs at or close to the optimum entropy constrained DPCM system. If one is willing to accept a 5 percent increase in the rate, the system is very forgiving of predictor mismatch.

  20. A Data Parallel Algorithm for XML DOM Parsing

    NASA Astrophysics Data System (ADS)

    Shah, Bhavik; Rao, Praveen R.; Moon, Bongki; Rajagopalan, Mohan

The extensible markup language XML has become the de facto standard for information representation and interchange on the Internet. XML parsing is a core operation performed on an XML document for it to be accessed and manipulated. This operation is known to cause performance bottlenecks in applications and systems that process large volumes of XML data. We believe that parallelism is a natural way to boost performance. Leveraging multicore processors can offer a cost-effective solution, because future multicore processors will support hundreds of cores, and will offer a high degree of parallelism in hardware. We propose a data parallel algorithm called ParDOM for XML DOM parsing that builds an in-memory tree structure for an XML document. ParDOM has two phases. In the first phase, an XML document is partitioned into chunks and parsed in parallel. In the second phase, partial DOM node tree structures created during the first phase are linked together (in parallel) to build a complete DOM node tree. ParDOM offers fine-grained parallelism by adopting a flexible chunking scheme - each chunk can contain an arbitrary number of start and end XML tags that are not necessarily matched. ParDOM can be conveniently implemented using a data parallel programming model that supports map and sort operations. Through empirical evaluation, we show that ParDOM yields better scalability than PXP [23] - a recently proposed parallel DOM parsing algorithm - on commodity multicore processors. Furthermore, ParDOM can process a wide variety of XML datasets with complex structures which PXP fails to parse.
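
    The two-phase structure described above can be sketched in a few lines of Python: chunks are tokenized independently and the resulting event streams are then linked into one tree. This toy ignores attributes, comments, and namespaces, and only illustrates the chunk-then-link idea; it is not the ParDOM algorithm itself.

    ```python
    # Toy two-phase parse: phase 1 tokenizes chunks (parallelizable), phase 2
    # links the partial event streams into a single nested-dict tree.
    import re
    from concurrent.futures import ThreadPoolExecutor

    TAG = re.compile(r"<(/?)([A-Za-z_][\w.-]*)[^>]*?(/?)>")

    def split_on_tag_boundaries(xml, n_chunks):
        """Cut the document only at '>' so no tag straddles two chunks."""
        cuts, step = [0], max(1, len(xml) // n_chunks)
        while cuts[-1] + step < len(xml):
            cut = xml.find(">", cuts[-1] + step) + 1
            if cut == 0:
                break
            cuts.append(cut)
        cuts.append(len(xml))
        return [xml[a:b] for a, b in zip(cuts, cuts[1:])]

    def tokenize(chunk):
        """Phase 1: turn a chunk into (kind, name) events, matched or not."""
        events = []
        for m in TAG.finditer(chunk):
            closing, name, selfclose = m.groups()
            if closing:
                events.append(("end", name))
            else:
                events.append(("start", name))
                if selfclose:
                    events.append(("end", name))
        return events

    def link(event_streams):
        """Phase 2 (sequential here): stitch events into a nested-dict tree."""
        root = {"name": None, "children": []}
        stack = [root]
        for events in event_streams:
            for kind, name in events:
                if kind == "start":
                    node = {"name": name, "children": []}
                    stack[-1]["children"].append(node)
                    stack.append(node)
                else:
                    stack.pop()
        return root["children"][0]

    xml = "<lib><book id='1'><title/></book><book id='2'/></lib>"
    with ThreadPoolExecutor() as pool:   # swap in a process pool for real parallelism
        streams = list(pool.map(tokenize, split_on_tag_boundaries(xml, 4)))
    print(link(streams))
    ```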

  1. On 2-D recursive LMS algorithms using ARMA prediction for ADPCM encoding of images.

    PubMed

    Chung, Y S; Kanefsky, M

    1992-01-01

A two-dimensional (2D) linear predictor which has an autoregressive moving average (ARMA) representation as well as a bias term is adapted for adaptive differential pulse code modulation (ADPCM) encoding of nonnegative images. The predictor coefficients are updated by using a 2D recursive LMS (TRLMS) algorithm. A constraint on optimum values for the convergence factors and an updating algorithm based on the constraint are developed. The coefficient updating algorithm can be modified with a stability control factor. This realization can operate in real time and in the spatial domain. A comparison of three different types of predictors is made for real images. ARMA predictors show improved performance relative to an AR algorithm. PMID:18296174
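
    For readers unfamiliar with LMS-adaptive prediction in DPCM, the following one-dimensional sketch (a normalized-LMS AR predictor in Python) illustrates the coefficient-update loop; the paper's two-dimensional ARMA/TRLMS scheme with a bias term and stability control is considerably more involved.

    ```python
    # 1-D normalized-LMS predictor: the residual is what a DPCM coder would
    # quantize and transmit; the coefficients adapt sample by sample.
    import numpy as np

    def nlms_dpcm(signal, order=3, mu=0.5, eps=1e-8):
        w = np.zeros(order)                     # predictor coefficients, adapted on-line
        residuals = np.zeros_like(signal, dtype=float)
        for n in range(order, len(signal)):
            past = signal[n - order:n][::-1]    # most recent samples first
            err = signal[n] - w @ past          # prediction error = what DPCM encodes
            w += mu * err * past / (past @ past + eps)   # normalized LMS update
            residuals[n] = err
        return residuals, w

    rng = np.random.default_rng(0)
    x = np.cumsum(rng.normal(size=2000)) * 0.1  # correlated 1-D test signal
    res, coeffs = nlms_dpcm(x)
    print("signal var:", x.var(), "residual var:", res.var())
    ```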

  2. A Study of XML in the Library Science Curriculum in Taiwan and South East Asia

    ERIC Educational Resources Information Center

    Chang, Naicheng; Huang, Yuhui; Hopkinson, Alan

    2011-01-01

    This paper aims to investigate the current XML-related courses available in 96 LIS schools in South East Asia and Taiwan's 9 LIS schools. Also, this study investigates the linkage of library school graduates in Taiwan who took different levels of XML-related education (that is XML arranged as an individual course or XML arranged as a section unit…

  3. Using XML to Separate Content from the Presentation Software in eLearning Applications

    ERIC Educational Resources Information Center

    Merrill, Paul F.

    2005-01-01

    This paper has shown how XML (extensible Markup Language) can be used to mark up content. Since XML documents, with meaningful tags, can be interpreted easily by humans as well as computers, they are ideal for the interchange of information. Because XML tags can be defined by an individual or organization, XML documents have proven useful in a…

  4. Algebraic Modeling of Information Retrieval in XML Documents

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2009-11-01

This paper presents an information retrieval approach for XML documents using tools based on linear algebra. The well-known transformation languages such as XSLT (XPath) are grounded on the features of higher-order logic for manipulating hierarchical trees. The presented conception is compared to existing higher-order logic formalisms, where the queries are realized by both languages XSLT and XPath. The possibilities of the proposed linear algebraic model combined with hierarchical data models permit more efficient solutions for searching, extracting and manipulating semi-structured data with hierarchical structures, avoiding global navigation over the XML tree components. The main purpose of this algebraic model representation, applied to the hierarchical relationships in the XML data structures, is to make the implementation of linear algebra tools possible for XML data manipulations, to eliminate existing problems related to regular grammar theory, and to avoid the difficulties connected with higher-order logic (first-order logic, monadic second-order logic, etc.).

  5. XML DTD and Schemas for HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Yang, Jingli

    2008-01-01

    An Extensible Markup Language (XML) document type definition (DTD) standard for the structure and contents of HDF-EOS files and their contents, and an equivalent standard in the form of schemas, have been developed.

  6. Recursiveness, switching, and fluctuations in a replicating catalytic network

    NASA Astrophysics Data System (ADS)

    Kaneko, Kunihiko

    2003-09-01

    A protocell model consisting of mutually catalyzing molecules is studied, in order to investigate how chemical compositions are transferred recursively through cell divisions under replication errors. Depending on the numbers of molecules and species, and the path rate, three phases are found: fast switching state without recursive production, recursive production, and itinerancy between the above two states. The number distributions of the molecules in the recursive states are shown to be log-normal except for those species that form a core hypercycle, and are explained with the help of a heuristic argument.

  7. Path Query Processing in Large-Scale XML Databases

    NASA Astrophysics Data System (ADS)

    Haw, Su-Cheng; Radha Krishna Rao, G. S. V.

With the ever-increasing popularity of XML (eXtensible Markup Language) as data representation and exchange on the Internet, querying XML data has become an important issue to be addressed. In Native XML Database (NXD), XML documents are usually modeled as trees and XML queries are typically specified in path expression. In path expression, the primitive structural relationships are Parent-Child (P-C) and Ancestor-Descendant (A-D). Thus, finding all occurrences of these relationships is crucial for XML query processing. Current methods for query processing on NXD usually employ either sequential traversal of a tree-structured model or a decomposition-matching-merging process. We adopt the latter approach and propose a novel hybrid query optimization technique, INLAB, comprising both indexing and labeling technologies. Furthermore, we also propose several algorithms to create INLAB encoding and analyze the path query. We implemented our technique and present performance results over several benchmarking datasets, which prove the viability of our approach.
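
    The structural relationships mentioned above are typically evaluated with region-encoded labels, as in the simplified Python sketch below; the labelling and the nested-loop join are generic illustrations and do not reproduce INLAB's actual encoding or its sort-merge machinery.

    ```python
    # Region encoding: each element gets (start, end, level); an element A is an
    # ancestor of D exactly when A.start < D.start and D.end < A.end.
    from itertools import count

    def label_tree(root):
        """Assign (start, end, level) region labels and index them by element name."""
        counter, index = count(), {}
        def visit(node, level):
            start = next(counter)
            for child in node.get("children", []):
                visit(child, level + 1)
            end = next(counter)
            index.setdefault(node["name"], []).append((start, end, level))
        visit(root, 0)
        return index

    def ancestor_descendant(ancestors, descendants):
        """Ancestor-descendant pairs reduce to interval containment of (start, end)."""
        return [(a, d) for a in ancestors for d in descendants
                if a[0] < d[0] and d[1] < a[1]]

    doc = {"name": "lib", "children": [
        {"name": "book", "children": [{"name": "title", "children": []}]},
        {"name": "book", "children": []},
    ]}
    index = label_tree(doc)
    print(ancestor_descendant(index["book"], index["title"]))
    ```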

  8. Container-component model and XML in ALMA ACS

    NASA Astrophysics Data System (ADS)

    Sommer, Heiko; Chiozzi, Gianluca; Zagar, Klemen; Voelter, Markus

    2004-09-01

    ALMA software, from high-level data flow applications down to instrument control, is built using the ACS framework. To meet the challenges of developing distributed software in distributed teams, ACS offers a container/component model that integrates the use of XML transfer objects. ACS containers are built on top of CORBA and are available for C++, Java, and Python, so that ALMA software can be written as components in any of these languages. The containers perform technical aspects of the software system, while components can focus on the implementation of functional requirements. Like Web services, components can use XML to exchange structured data by value. For Java components, the container seamlessly integrates the use of XML binding classes, which are Java classes that encapsulate access to XML data through type-safe methods. Binding classes are generated from XML schemas, allowing the Java compiler to enforce compliance of application code with the XML schemas. This presentation will explain the capabilities of the ACS container/component model, and how it relates to other middleware technologies that are popular in industry.

  9. Recursive deconvolution of combinatorial chemical libraries.

    PubMed

    Erb, E; Janda, K D; Brenner, S

    1994-11-22

    A recursive strategy that solves for the active members of a chemical library is presented. A pentapeptide library with an alphabet of Gly, Leu, Phe, and Tyr (1024 members) was constructed on a solid support by the method of split synthesis. One member of this library (NH2-Tyr-Gly-Gly-Phe-Leu) is a native binder to a beta-endorphin antibody. A variation of the split synthesis approach is used to build the combinatorial library. In four vials, a member of the library's alphabet is coupled to a solid support. After each coupling, a portion of the resin from each of the four reaction vials was set aside and catalogued. The solid support from each vial is then combined, mixed, and redivided. The steps of (i) coupling, (ii) saving and cataloging, and (iii) randomizing were repeated until a pentapeptide library was obtained. The four pentapeptide libraries where the N-terminal amino acid is defined were screened against the beta-endorphin antibody and quantitated via an ELISA. The amino acid of the four pools that demonstrated the most binding was then coupled to the four tetrapeptide partial libraries that had been set aside and catalogued during the split synthesis. This recursive deconvolution was repeated until the best binders were deduced. Besides the anticipated native binder, two other members of the library displayed significant binding. This recursive method of deconvolution does not use a molecular tag, requires only one split synthesis, and can be applied to the deconvolution of nonlinear small-molecule combinatorial libraries and linear oligomeric combinatorial libraries, since it is based only on the procedure of the synthesis. PMID:7972077
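
    The recursive logic of the deconvolution can be mimicked with a small simulation: fix one position at a time to whichever sub-pool scores best, then recurse on the remaining positions. In the toy Python sketch below, "binding" is faked as similarity to a hidden target sequence; the real assay is of course the ELISA described above, and the paper fixes residues against catalogued partial libraries rather than by enumeration.

    ```python
    # Toy recursive deconvolution of a 4^5 peptide library (G, L, F, Y).
    from itertools import product

    ALPHABET = "GLFY"                     # Gly, Leu, Phe, Tyr (one-letter codes)
    TARGET = "YGGFL"                      # hidden best binder in this toy

    def binding(peptide):
        """Pretend assay: number of positions matching the target."""
        return sum(a == b for a, b in zip(peptide, TARGET))

    def pool_signal(prefix, length=5):
        """Best binder present in the pool sharing this fixed prefix."""
        rest = length - len(prefix)
        return max(binding(prefix + "".join(tail)) for tail in product(ALPHABET, repeat=rest))

    def deconvolve(prefix="", length=5):
        if len(prefix) == length:
            return prefix
        best = max(ALPHABET, key=lambda aa: pool_signal(prefix + aa, length))
        return deconvolve(prefix + best, length)   # recurse with one more position fixed

    print(deconvolve())   # recovers 'YGGFL'
    ```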

  10. Citing geospatial feature inventories with XML manifests

    NASA Astrophysics Data System (ADS)

    Bose, R.; McGarva, G.

    2006-12-01

    Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.

  11. Recursion Relations for Double Ramification Hierarchies

    NASA Astrophysics Data System (ADS)

    Buryak, Alexandr; Rossi, Paolo

    2016-03-01

In this paper we study various properties of the double ramification hierarchy, an integrable hierarchy of Hamiltonian PDEs introduced in Buryak (Commun Math Phys 336(3):1085-1107, 2015) using intersection theory of the double ramification cycle in the moduli space of stable curves. In particular, we prove a recursion formula that recovers the full hierarchy starting from just one of the Hamiltonians, the one associated to the first descendant of the unit of a cohomological field theory. Moreover, we introduce analogues of the topological recursion relations and the divisor equation both for the Hamiltonian densities and for the string solution of the double ramification hierarchy. This machinery is very efficient and we apply it to various computations for the trivial and Hodge cohomological field theories, and for the r-spin Witten classes. Moreover, we prove the Miura equivalence between the double ramification hierarchy and the Dubrovin-Zhang hierarchy for the Gromov-Witten theory of the complex projective line (extended Toda hierarchy).

  12. A recursive algorithm for Zernike polynomials

    NASA Technical Reports Server (NTRS)

    Davenport, J. W.

    1982-01-01

The analysis of a function defined on a rotationally symmetric system, with either a circular or annular pupil, is discussed. In order to numerically analyze such systems it is typical to expand the given function in terms of a class of orthogonal polynomials. Because of their particular properties, the Zernike polynomials are especially suited for numerical calculations. A recursive algorithm is developed that can be used to generate the Zernike polynomials up to a given order. The algorithm is recursively defined over J, where R(J,N) is the Zernike polynomial of degree N obtained by orthogonalizing the sequence R(J), R(J+2), ..., R(J+2N) over (epsilon, 1). The terms in the preceding row - the (J-1) row - up to the (N+1)th term are needed for generating the (J,N)th term. Thus, the algorithm generates an upper left-triangular table. The algorithm was implemented as a computer program, together with the necessary supporting routines.
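
    For reference, the radial Zernike polynomials that such a recursion must reproduce can be evaluated from the standard explicit sum, as in the Python sketch below; this is not the paper's recursive orthogonalization over an annular pupil, only a circular-pupil cross-check one could validate it against.

    ```python
    # Reference evaluation of the Zernike radial polynomial R_n^m on the unit disk.
    from math import factorial

    def zernike_radial(n, m, rho):
        m = abs(m)
        if (n - m) % 2:                       # R_n^m vanishes when n - m is odd
            return 0.0
        return sum((-1) ** k * factorial(n - k)
                   / (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k))
                   * rho ** (n - 2 * k)
                   for k in range((n - m) // 2 + 1))

    # R_4^0(rho) = 6 rho^4 - 6 rho^2 + 1; at rho = 0.5 this gives -0.125
    print(zernike_radial(4, 0, 0.5))
    ```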

  13. Tetramethyleneethane Equivalents: Recursive Reagents for Serialized Cycloadditions

    PubMed Central

    2015-01-01

    New reactions and reagents that allow for multiple bond-forming events per synthetic operation are required to achieve structural complexity and thus value with step-, time-, cost-, and waste-economy. Here we report a new class of reagents that function like tetramethyleneethane (TME), allowing for back-to-back [4 + 2] cycloadditions, thereby amplifying the complexity-increasing benefits of Diels–Alder and metal-catalyzed cycloadditions. The parent recursive reagent, 2,3-dimethylene-4-trimethylsilylbutan-1-ol (DMTB), is readily available from the metathesis of ethylene and THP-protected 4-trimethylsilylbutyn-1-ol. DMTB and related reagents engage diverse dienophiles in an initial Diels–Alder or metal-catalyzed [4 + 2] cycloaddition, triggering a subsequent vinylogous Peterson elimination that recursively generates a new diene for a second cycloaddition. Overall, this multicomponent catalytic cascade produces in one operation carbo- and heterobicyclic building blocks for the synthesis of a variety of natural products, therapeutic leads, imaging agents, and materials. Its application to the three step synthesis of a new solvatochromic fluorophore, N-ethyl(6-N,N-dimethylaminoanthracene-2,3-dicarboximide) (6-DMA), and the photophysical characterization of this fluorophore are described. PMID:25961416

  14. Language, Mind, Practice: Families of Recursive Thinking in Human Reasoning

    ERIC Educational Resources Information Center

    Josephson, Marika

    2011-01-01

    In 2002, Chomsky, Hauser, and Fitch asserted that recursion may be the one aspect of the human language faculty that makes human language unique in the narrow sense--unique to language and unique to human beings. They also argue somewhat more quietly (as do Pinker and Jackendoff 2005) that recursion may be possible outside of language: navigation,…

  15. XML-BSPM: an XML format for storing Body Surface Potential Map recordings

    PubMed Central

    2010-01-01

    Background The Body Surface Potential Map (BSPM) is an electrocardiographic method, for recording and displaying the electrical activity of the heart, from a spatial perspective. The BSPM has been deemed more accurate for assessing certain cardiac pathologies when compared to the 12-lead ECG. Nevertheless, the 12-lead ECG remains the most popular ECG acquisition method for non-invasively assessing the electrical activity of the heart. Although data from the 12-lead ECG can be stored and shared using open formats such as SCP-ECG, no open formats currently exist for storing and sharing the BSPM. As a result, an innovative format for storing BSPM datasets has been developed within this study. Methods The XML vocabulary was chosen for implementation, as opposed to binary for the purpose of human readability. There are currently no standards to dictate the number of electrodes and electrode positions for recording a BSPM. In fact, there are at least 11 different BSPM electrode configurations in use today. Therefore, in order to support these BSPM variants, the XML-BSPM format was made versatile. Hence, the format supports the storage of custom torso diagrams using SVG graphics. This diagram can then be used in a 2D coordinate system for retaining electrode positions. Results This XML-BSPM format has been successfully used to store the Kornreich-117 BSPM dataset and the Lux-192 BSPM dataset. The resulting file sizes were in the region of 277 kilobytes for each BSPM recording and can be deemed suitable for example, for use with any telemonitoring application. Moreover, there is potential for file sizes to be further reduced using basic compression algorithms, i.e. the deflate algorithm. Finally, these BSPM files have been parsed and visualised within a convenient time period using a web based BSPM viewer. Conclusions This format, if widely adopted could promote BSPM interoperability, knowledge sharing and data mining. This work could also be used to provide conceptual

  16. The SGML Standardization Framework and the Introduction of XML

    PubMed Central

    Grütter, Rolf

    2000-01-01

    Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future. PMID:11720931

  17. Realization Of Algebraic Processor For XML Documents Processing

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2010-10-01

This paper presents some possibilities concerning the implementation of an algebraic method for XML hierarchical data processing that makes the XML search mechanism faster. It offers a different point of view on the creation of an advanced algebraic processor (with all the necessary software tools and programming modules). This nontraditional approach to fast XML navigation with the presented algebraic processor may help to build an easier, user-friendly interface for XML transformations, avoiding the difficulties of the complicated language constructions of XSL, XSLT and XPath. The approach allows comparatively simple searching of XML hierarchical data by means of two types of functions: specification functions and so-called built-in functions. The choice of the Java programming language may appear strange at first, but it is not when you consider that the applications can run on different kinds of computers. The specific search mechanism based on linear algebra theory is faster than MSXML parsers (by about 30% on the developed examples). There is also the possibility of creating new software tools based on linear algebra theory that cover the whole range of navigation and search techniques characterizing XSLT/XPath. The proposed method is able to replace more complicated operations in other SOA components.

  18. Realization Of Algebraic Processor For XML Documents Processing

    SciTech Connect

    Georgiev, Bozhidar; Georgieva, Adriana

    2010-10-25

This paper presents some possibilities concerning the implementation of an algebraic method for XML hierarchical data processing that makes the XML search mechanism faster. It offers a different point of view on the creation of an advanced algebraic processor (with all the necessary software tools and programming modules). This nontraditional approach to fast XML navigation with the presented algebraic processor may help to build an easier, user-friendly interface for XML transformations, avoiding the difficulties of the complicated language constructions of XSL, XSLT and XPath. The approach allows comparatively simple searching of XML hierarchical data by means of two types of functions: specification functions and so-called built-in functions. The choice of the Java programming language may appear strange at first, but it is not when you consider that the applications can run on different kinds of computers. The specific search mechanism based on linear algebra theory is faster than MSXML parsers (by about 30% on the developed examples). There is also the possibility of creating new software tools based on linear algebra theory that cover the whole range of navigation and search techniques characterizing XSLT/XPath. The proposed method is able to replace more complicated operations in other SOA components.

  19. Detecting recursive and nonrecursive filters using chaos.

    PubMed

    Carroll, T L

    2010-03-01

    Filtering a chaotic signal through a recursive [or infinite impulse response (IIR)] filter has been shown to increase the dimension of chaos under certain conditions. Filtering with a nonrecursive [or finite impulse response (FIR)] filter should not increase dimension, but it has been shown that if the FIR filter has a long tail, measurements of actual signals may appear to show a dimension increase. I simulate IIR and FIR filters that correspond to naturally occurring resonant objects, and I show that using dimension measurements, I can distinguish the filter type. These measurements could be used to detect resonances using radar, sonar, or laser signals, or to determine if a resonance is due to an IIR or an FIR filter.
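
    The measurement setup can be sketched as follows: a chaotic series is passed through a recursive (IIR) filter and a nonrecursive (FIR) filter with similar passbands. The dimension estimation that actually distinguishes the two filter types in the paper is not shown here.

    ```python
    # Filter a chaotic signal with an IIR and an FIR filter of similar bandwidth.
    import numpy as np
    from scipy.signal import butter, firwin, lfilter

    # chaotic test signal: logistic map at r = 4
    x = np.empty(5000)
    x[0] = 0.4
    for i in range(1, len(x)):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    x -= x.mean()

    b_iir, a_iir = butter(2, 0.3)          # recursive filter (feedback -> long memory)
    b_fir = firwin(31, 0.3)                # nonrecursive filter (finite impulse response)

    y_iir = lfilter(b_iir, a_iir, x)
    y_fir = lfilter(b_fir, [1.0], x)
    print(y_iir[:3], y_fir[:3])
    ```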

  20. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. The methodology is quite flexible in that it can incorporate both a semiparametric method using the Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development on a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300

  1. Recursive Partitioning Method on Competing Risk Outcomes.

    PubMed

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. The methodology is quite flexible in that it can incorporate both a semiparametric method using the Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development on a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300

  2. Semantic reasoning with XML-based biomedical information models.

    PubMed

    O'Connor, Martin J; Das, Amar

    2010-01-01

    The Extensible Markup Language (XML) is increasingly being used for biomedical data exchange. The parallel growth in the use of ontologies in biomedicine presents opportunities for combining the two technologies to leverage the semantic reasoning services provided by ontology-based tools. There are currently no standardized approaches for taking XML-encoded biomedical information models and representing and reasoning with them using ontologies. To address this shortcoming, we have developed a workflow and a suite of tools for transforming XML-based information models into domain ontologies encoded using OWL. In this study, we applied semantics reasoning methods to these ontologies to automatically generate domain-level inferences. We successfully used these methods to develop semantic reasoning methods for information models in the HIV and radiological image domains. PMID:20841831

  3. KAT: A Flexible XML-based Knowledge Authoring Environment

    PubMed Central

    Hulse, Nathan C.; Rocha, Roberto A.; Del Fiol, Guilherme; Bradshaw, Richard L.; Hanna, Timothy P.; Roemer, Lorrie K.

    2005-01-01

    As part of an enterprise effort to develop new clinical information systems at Intermountain Health Care, the authors have built a knowledge authoring tool that facilitates the development and refinement of medical knowledge content. At present, users of the application can compose order sets and an assortment of other structured clinical knowledge documents based on XML schemas. The flexible nature of the application allows the immediate authoring of new types of documents once an appropriate XML schema and accompanying Web form have been developed and stored in a shared repository. The need for a knowledge acquisition tool stems largely from the desire for medical practitioners to be able to write their own content for use within clinical applications. We hypothesize that medical knowledge content for clinical use can be successfully created and maintained through XML-based document frameworks containing structured and coded knowledge. PMID:15802477

  4. The Real Performance Drivers behind XML Lock Protocols

    NASA Astrophysics Data System (ADS)

    Bächle, Sebastian; Härder, Theo

    Fine-grained lock protocols should allow for highly concurrent transaction processing on XML document trees, which is addressed by the taDOM lock protocol family enabling specific lock modes and lock granules adjusted to the various XML processing models. We have already proved its operational flexibility and performance superiority when compared to competitor protocols. Here, we outline our experiences gained during the implementation and optimization of these protocols. We figure out their performance drivers to maximize throughput while keeping the response times at an acceptable level and perfectly exploiting the advantages of our tailor-made lock protocols for XML trees. Because we have implemented all options and alternatives in our prototype system XTC, benchmark runs for all “drivers” allow for comparisons in identical environments and illustrate the benefit of all implementation decisions. Finally, they reveal that careful lock protocol optimization pays off.

  5. Development Life Cycle and Tools for XML Content Models

    SciTech Connect

    Kulvatunyou, Boonserm; Morris, Katherine; Buhwan, Jeong; Goyal, Puja

    2004-11-01

Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  6. XML in an Adaptive Framework for Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy J.

    2004-01-01

    NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.

  7. XML: A Language To Manage the World Wide Web. ERIC Digest.

    ERIC Educational Resources Information Center

    Davis-Tanous, Jennifer R.

    This digest provides an overview of XML (Extensible Markup Language), a markup language used to construct World Wide Web pages. Topics addressed include: (1) definition of a markup language, including comparison of XML with SGML (Standard Generalized Markup Language) and HTML (HyperText Markup Language); (2) how XML works, including sample tags,…

  8. Applying Analogical Reasoning Techniques for Teaching XML Document Querying Skills in Database Classes

    ERIC Educational Resources Information Center

    Mitri, Michel

    2012-01-01

    XML has become the most ubiquitous format for exchange of data between applications running on the Internet. Most Web Services provide their information to clients in the form of XML. The ability to process complex XML documents in order to extract relevant information is becoming as important a skill for IS students to master as querying…

  9. XML does Real Programmers a Service

    SciTech Connect

    Gorton, Ian

    2008-09-01

As the sun slowly sets on this first decade of the new millennium, it seems appropriate to update the sojourn of the real programmers as they adapt to their ever-changing technical and business environment. Real Programmers were perfectly characterized and differentiated from their quiche-eating, Pascal-programming brethren in Ed Post's seminal "Real Programmers Don't Use Pascal" (Datamation, 1983). My follow-up ("Real Programmers Do Use Delphi," IEEE Software, vol. 12, no. 6, pp. 8, 10, 12, Nov. 1995) charted their evolution from FORTRAN-only programmers to embracing a wider range of mainstream languages and tools that still afforded ample opportunity for creativity, game-playing, irregular work hours, and importantly, long-term job security.

  10. An XML description of detector geometries for GEANT4

    NASA Astrophysics Data System (ADS)

    Figgins, J.; Walker, B.; Comfort, J. R.

    2006-12-01

    A code has been developed that enables the geometry of detectors to be specified easily and flexibly in the XML language, for use in the Monte Carlo program GEANT4. The user can provide clear documentation of the geometry without being proficient in the C++ language of GEANT4. The features and some applications are discussed.

  11. Improving the Virtual Learning Development Processes Using XML Standards.

    ERIC Educational Resources Information Center

    Suss, Kurt; Oberhofer, Thomas

    2002-01-01

    Suggests that distributed learning environments and content often lack a common basis for the exchange of learning materials, which can hinder or even delay innovation and delivery of learning technology. Standards for platforms and authoring may provide a way to improve interoperability and cooperative development. Provides an XML-based approach…

  12. Personalization of XML Content Browsing Based on User Preferences

    ERIC Educational Resources Information Center

    Encelle, Benoit; Baptiste-Jessel, Nadine; Sedes, Florence

    2009-01-01

    Personalization of user interfaces for browsing content is a key concept to ensure content accessibility. In this direction, we introduce concepts that result in the generation of personalized multimodal user interfaces for browsing XML content. User requirements concerning the browsing of a specific content type can be specified by means of…

  13. A Conversion Tool for Mathematical Expressions in Web XML Files.

    ERIC Educational Resources Information Center

    Ohtake, Nobuyuki; Kanahori, Toshihiro

    2003-01-01

    This article discusses the conversion of mathematical equations into Extensible Markup Language (XML) on the World Wide Web for individuals with visual impairments. A program is described that converts the presentation markup style to the content markup style in MathML to allow browsers to render mathematical expressions without other programs.…

  14. New NED XML/VOtable Services and Client Interface Applications

    NASA Astrophysics Data System (ADS)

    Pevunova, O.; Good, J.; Mazzarella, J.; Berriman, G. B.; Madore, B.

    2005-12-01

    The NASA/IPAC Extragalactic Database (NED) provides data and cross-identifications for over 7 million extragalactic objects fused from thousands of survey catalogs and journal articles. The data cover all frequencies from radio through gamma rays and include positions, redshifts, photometry and spectral energy distributions (SEDs), sizes, and images. NED services have traditionally supplied data in HTML format for connections from Web browsers, and a custom ASCII data structure for connections by remote computer programs written in the C programming language. We describe new services that provide responses from NED queries in XML documents compliant with the international virtual observatory VOtable protocol. The XML/VOtable services support cone searches, all-sky searches based on object attributes (survey names, cross-IDs, redshifts, flux densities), and requests for detailed object data. Initial services have been inserted into the NVO registry, and others will follow soon. The first client application is a Style Sheet specification for rendering NED VOtable query results in Web browsers that support XML. The second prototype application is a Java applet that allows users to compare multiple SEDs. The new XML/VOtable output mode will also simplify the integration of data from NED into visualization and analysis packages, software agents, and other virtual observatory applications. We show an example SED from NED plotted using VOPlot. The NED website is: http://nedwww.ipac.caltech.edu.
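
    Reading such a VOtable response programmatically is straightforward with astropy, as in the hedged sketch below; the file name is a placeholder for a previously downloaded NED query result, and the query URL and parameters themselves are documented on the NED site.

    ```python
    # Parse a saved VOTable response and convert it to an astropy Table.
    from astropy.io.votable import parse

    votable = parse("ned_cone_search_result.xml")   # previously downloaded response
    table = votable.get_first_table().to_table()    # convert to an astropy Table
    print(table.colnames)
    print(table[:5])
    ```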

  15. COMPARISON OF RECURSIVE ESTIMATION TECHNIQUES FOR POSITION TRACKING RADIOACTIVE SOURCES

    SciTech Connect

    K. MUSKE; J. HOWSE

    2000-09-01

    This paper compares the performance of recursive state estimation techniques for tracking the physical location of a radioactive source within a room based on radiation measurements obtained from a series of detectors at fixed locations. Specifically, the extended Kalman filter, algebraic observer, and nonlinear least squares techniques are investigated. The results of this study indicate that recursive least squares estimation significantly outperforms the other techniques due to the severe model nonlinearity.
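
    As background, the recursive least squares estimator referred to above can be written in a few lines; the sketch below fits a static linear model one measurement at a time, whereas the source-tracking problem itself is nonlinear and requires the linearized or nonlinear variants compared in the paper.

    ```python
    # Recursive least squares for y = h.x + noise, processed one measurement at a time.
    import numpy as np

    def rls_update(x, P, h, y, lam=1.0):
        """One RLS step: state estimate x, covariance P, regressor h, measurement y."""
        h = h.reshape(-1, 1)
        k = P @ h / (lam + h.T @ P @ h)          # gain
        x = x + (k * (y - h.T @ x)).ravel()      # innovation-weighted correction
        P = (P - k @ h.T @ P) / lam              # covariance downdate
        return x, P

    rng = np.random.default_rng(1)
    true = np.array([2.0, -1.0, 0.5])
    x, P = np.zeros(3), np.eye(3) * 1e3
    for _ in range(200):
        h = rng.normal(size=3)
        y = h @ true + 0.05 * rng.normal()
        x, P = rls_update(x, P, h, y)
    print(x)          # converges toward [2, -1, 0.5]
    ```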

  16. Recursion Operators for CBC system with reductions. Geometric theory

    NASA Astrophysics Data System (ADS)

    Yanovski, A.; Vilasi, G.

    2016-09-01

    We discuss some recent developments of the geometric theory of the Recursion Operators (Generating Operators) for Caudrey-Beals-Coifman systems (CBC systems) on semisimple Lie algebras. As is well known the essence of this interpretation is that the Recursion Operators could be considered as adjoint to Nijenhuis tensors on certain infinite-dimensional manifolds. In particular, we discuss the case when there are Zp reductions of Mikhailov type.

  17. Recursive Construction of Operator Product Expansion Coefficients

    NASA Astrophysics Data System (ADS)

    Holland, Jan; Hollands, Stefan

    2015-06-01

    We derive a novel formula for the derivative of operator product expansion (OPE) coefficients with respect to a coupling constant. The formula involves just the OPE coefficients themselves but no further input, and is in this sense self-consistent. Furthermore, unlike other formal identities of this general nature in quantum field theory (such as the formal expression for the Lagrangian perturbation of a correlation function), our formula requires no further UV-renormalization, i.e., it is completely well-defined from the start. This feature is a result of a cancelation of UV- and IR-divergences between various terms in our identity. Our proof, and an analysis of the features of the identity, is given for the example of massive, Euclidean theory in 4 dimensional Euclidean space. It relies on the renormalization group flow equation method and is valid to arbitrary, but finite orders in perturbation theory. The final formula, however, makes neither explicit reference to the renormalization group flow, nor to perturbation theory, and we conjecture that it also holds non-perturbatively. Our identity can be applied constructively because it gives a novel recursive algorithm for the computation of OPE coefficients to arbitrary (finite) perturbation order in terms of the zeroth order coefficients corresponding to the underlying free field theory, which in turn are trivial to obtain. We briefly illustrate the relation of this method to more standard methods for computing the OPE in some simple examples.

  18. Recursive stochastic effects in valley hybrid inflation

    NASA Astrophysics Data System (ADS)

    Levasseur, Laurence Perreault; Vennin, Vincent; Brandenberger, Robert

    2013-10-01

    Hybrid inflation is a two-field model where inflation ends because of a tachyonic instability, the duration of which is determined by stochastic effects and has important observational implications. Making use of the recursive approach to the stochastic formalism presented in [L. P. Levasseur, preceding article, Phys. Rev. D 88, 083537 (2013)], these effects are consistently computed. Through an analysis of backreaction, this method is shown to converge in the valley but points toward an (expected) instability in the waterfall. It is further shown that the quasistationarity of the auxiliary field distribution breaks down in the case of a short-lived waterfall. We find that the typical dispersion of the waterfall field at the critical point is then diminished, thus increasing the duration of the waterfall phase and jeopardizing the possibility of a short transition. Finally, we find that stochastic effects worsen the blue tilt of the curvature perturbations by an O(1) factor when compared with the usual slow-roll contribution.

  19. Progress on an implementation of MIFlowCyt in XML

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Stephanie H.

    2015-03-01

Introduction: The International Society for Advancement of Cytometry (ISAC) Data Standards Task Force (DSTF) has created a standard for the Minimum Information about a Flow Cytometry Experiment (MIFlowCyt 1.0). The CytometryML schemas are based in part upon the Flow Cytometry Standard and Digital Imaging and Communications in Medicine (DICOM) standards. CytometryML has been and will be extended and adapted to include MIFlowCyt, as well as to serve as a common standard for flow and image cytometry (digital microscopy). Methods: The MIFlowCyt data-types were created, as is the rest of CytometryML, in the XML Schema Definition Language (XSD1.1). Individual major elements of the MIFlowCyt schema were translated into XML and filled with reasonable data. A small section of the code was formatted with HTML formatting elements. Results: The differences in the amount of detail to be recorded for 1) users of standard techniques including data analysts and 2) others, such as method and device creators, laboratory and other managers, engineers, and regulatory specialists, required that separate data-types be created to describe the instrument configuration and components. A very substantial part of the MIFlowCyt element that describes the Experimental Overview, together with substantial parts of several other major elements, has been developed. Conclusions: The future use of structured XML tags and web technology should facilitate searching of experimental information, its presentation, and inclusion in structured research, clinical, and regulatory documents, as well as demonstrate in publications adherence to the MIFlowCyt standard. The use of CytometryML together with XML technology should also result in the textual and numeric data being published using web technology without any change in composition. Preliminary testing indicates that CytometryML XML pages can be directly formatted with the combination of HTML
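
    Validating instance documents against such schemas is routine with standard tooling, as in the hedged Python sketch below. File names are placeholders; note also that lxml/libxml2 implements XSD 1.0, so XSD 1.1 features of the kind CytometryML uses would require a validator such as the pure-Python xmlschema package instead.

    ```python
    # Validate an XML instance document against an XML Schema with lxml (XSD 1.0).
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("cytometryml_subset.xsd"))   # placeholder file
    doc = etree.parse("miflowcyt_experiment.xml")                     # placeholder file

    if schema.validate(doc):
        print("document is valid")
    else:
        for error in schema.error_log:
            print(error.line, error.message)
    ```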

  20. Recursion to food plants by free-ranging Bornean elephant.

    PubMed

    English, Megan; Gillespie, Graeme; Goossens, Benoit; Ismail, Sulaiman; Ancrenaz, Marc; Linklater, Wayne

    2015-01-01

Plant recovery rates after herbivory are thought to be a key factor driving recursion by herbivores to sites and plants to optimise resource-use but have not been investigated as an explanation for recursion in large herbivores. We investigated the relationship between plant recovery and recursion by elephants (Elephas maximus borneensis) in the Lower Kinabatangan Wildlife Sanctuary, Sabah. We identified 182 recently eaten food plants, from 30 species, along 14 × 50 m transects and measured their recovery growth each month over nine months or until they were re-browsed by elephants. The monthly growth in leaf and branch or shoot length for each plant was used to calculate the time required (months) for each species to recover to its pre-eaten length. Elephants returned to all but two transects with 10 eaten plants; a further 26 plants died, leaving 146 plants that could be re-eaten. Recursion occurred to 58% of all plants and 12 of the 30 species. Seventy-seven percent of the re-eaten plants were grasses. Recovery times to all plants varied from two to twenty months depending on the species. Recursion to all grasses coincided with plant recovery whereas recursion to most browsed plants occurred four to twelve months before they had recovered to their previous length. The small sample size of many browsed plants that received recursion and uneven plant species distribution across transects limit our ability to generalise for most browsed species but a prominent pattern in plant-scale recursion did emerge. Plant recovery time was a good predictor of time to recursion but varied as a function of growth form (grass, ginger, palm, liana and woody) and differences between sites. Time to plant recursion coincided with plant recovery time for the elephant's preferred food, grasses, and perhaps also gingers, but not the other browsed species. Elephants are bulk feeders so it is likely that they time their returns to bulk feed on these grass species when quantities have

  1. Recursion to food plants by free-ranging Bornean elephant

    PubMed Central

    Gillespie, Graeme; Goossens, Benoit; Ismail, Sulaiman; Ancrenaz, Marc; Linklater, Wayne

    2015-01-01

Plant recovery rates after herbivory are thought to be a key factor driving recursion by herbivores to sites and plants to optimise resource-use but have not been investigated as an explanation for recursion in large herbivores. We investigated the relationship between plant recovery and recursion by elephants (Elephas maximus borneensis) in the Lower Kinabatangan Wildlife Sanctuary, Sabah. We identified 182 recently eaten food plants, from 30 species, along 14 × 50 m transects and measured their recovery growth each month over nine months or until they were re-browsed by elephants. The monthly growth in leaf and branch or shoot length for each plant was used to calculate the time required (months) for each species to recover to its pre-eaten length. Elephants returned to all but two transects with 10 eaten plants; a further 26 plants died, leaving 146 plants that could be re-eaten. Recursion occurred to 58% of all plants and 12 of the 30 species. Seventy-seven percent of the re-eaten plants were grasses. Recovery times to all plants varied from two to twenty months depending on the species. Recursion to all grasses coincided with plant recovery whereas recursion to most browsed plants occurred four to twelve months before they had recovered to their previous length. The small sample size of many browsed plants that received recursion and uneven plant species distribution across transects limit our ability to generalise for most browsed species but a prominent pattern in plant-scale recursion did emerge. Plant recovery time was a good predictor of time to recursion but varied as a function of growth form (grass, ginger, palm, liana and woody) and differences between sites. Time to plant recursion coincided with plant recovery time for the elephant's preferred food, grasses, and perhaps also gingers, but not the other browsed species. Elephants are bulk feeders so it is likely that they time their returns to bulk feed on these grass species when quantities have

  2. Online recursive independent component analysis for real-time source separation of high-density EEG.

    PubMed

    Hsu, Sheng-Hsiou; Mullen, Tim; Jung, Tzyy-Ping; Cauwenberghs, Gert

    2014-01-01

    Online Independent Component Analysis (ICA) algorithms have recently seen increasing development and application across a range of fields, including communications, biosignal processing, and brain-computer interfaces. However, prior work in this domain has primarily focused on algorithmic proofs of convergence, with application limited to small `toy' examples or to relatively low channel density EEG datasets. Furthermore, there is limited availability of computationally efficient online ICA implementations, suitable for real-time application. This study describes an optimized online recursive ICA algorithm (ORICA), with online recursive least squares (RLS) whitening, for blind source separation of high-density EEG data. It is implemented as an online-capable plugin within the open-source BCILAB (EEGLAB) framework. We further derive and evaluate a block-update modification to the ORICA learning rule. We demonstrate the algorithm's suitability for accurate and efficient source identification in high density (64-channel) realistically-simulated EEG data, as well as real 61-channel EEG data recorded by a dry and wearable EEG system in a cognitive experiment. PMID:25570830
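
    As a simplified stand-in for the whitening stage of such a pipeline, the Python sketch below tracks an exponentially weighted covariance estimate and re-whitens each incoming sample; ORICA's actual recursive least-squares whitening and ICA weight updates are more efficient and are not reproduced here.

    ```python
    # Online whitening via an exponentially weighted covariance estimate.
    import numpy as np

    def online_whitener(n_channels, forget=0.01):
        C = np.eye(n_channels)                       # running covariance estimate
        def step(x):
            nonlocal C
            C = (1 - forget) * C + forget * np.outer(x, x)
            d, E = np.linalg.eigh(C)                 # symmetric whitening C^(-1/2)
            W = E @ np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12))) @ E.T
            return W @ x
        return step

    rng = np.random.default_rng(0)
    mix = rng.normal(size=(8, 8))                    # unknown mixing of 8 "sources"
    whiten = online_whitener(8)
    for _ in range(2000):
        z = whiten(mix @ rng.normal(size=8))         # one whitened sample per step
    print("whitened sample:", np.round(z, 2))
    ```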

  3. Phase Response Design of Recursive All-Pass Digital Filters Using a Modified PSO Algorithm

    PubMed Central

    Chang, Wei-Der

    2015-01-01

This paper develops a new design scheme for the phase response of an all-pass recursive digital filter. A variant of the particle swarm optimization (PSO) algorithm is utilized for solving this kind of filter design problem. It is here called the modified PSO (MPSO) algorithm, in which an additional adjusting factor is introduced into the velocity-updating formula in order to improve the searching ability. In the proposed method, all of the designed filter coefficients are first collected into a parameter vector, and this vector is regarded as a particle of the algorithm. The MPSO with the modified velocity formula forces all particles to move toward the optimal or near-optimal solution by minimizing a defined objective function of the optimization problem. To show the effectiveness of the proposed method, two different kinds of linear phase response design examples are illustrated and the general PSO algorithm is compared as well. The obtained results show that the MPSO is superior to the general PSO for the phase response design of recursive all-pass digital filters. PMID:26366168
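
    The generic PSO velocity and position update that the paper modifies looks as follows; the extra adjust factor in the sketch is a placeholder for the paper's additional term, whose exact form is not given in the abstract, and the toy objective stands in for the real phase-error measure of the filter coefficients.

    ```python
    # Generic PSO step with a placeholder 'adjust' factor on the velocity update.
    import numpy as np

    def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, adjust=1.0, rng=None):
        if rng is None:
            rng = np.random.default_rng()
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = adjust * (w * vel
                        + c1 * r1 * (pbest - pos)     # pull toward each particle's best
                        + c2 * r2 * (gbest - pos))    # pull toward the swarm's best
        return pos + vel, vel

    # toy objective: minimize ||x - 3||^2 in 4-D
    rng = np.random.default_rng(0)
    pos = rng.uniform(-5, 5, size=(20, 4)); vel = np.zeros_like(pos)
    pbest, gbest = pos.copy(), pos[0].copy()
    for _ in range(200):
        pos, vel = pso_step(pos, vel, pbest, gbest, rng=rng)
        f = ((pos - 3.0) ** 2).sum(axis=1)
        improved = f < ((pbest - 3.0) ** 2).sum(axis=1)
        pbest[improved] = pos[improved]
        gbest = pbest[((pbest - 3.0) ** 2).sum(axis=1).argmin()]
    print(gbest)      # approaches [3, 3, 3, 3]
    ```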

  4. XML: James Webb Space Telescope Database Issues, Lessons, and Status

    NASA Technical Reports Server (NTRS)

    Detter, Ryan; Mooney, Michael; Fatig, Curtis

    2003-01-01

This paper will present the current concept using Extensible Markup Language (XML) as the underlying structure for the James Webb Space Telescope (JWST) database. The purpose of using XML is to provide a JWST database, independent of any portion of the ground system, yet still compatible with the various systems using a variety of different structures. The testing of the JWST Flight Software (FSW) started in 2002, yet the launch is scheduled for 2011 with a planned 5-year mission and a 5-year follow-on option. The initial database and ground system elements, including the commands, telemetry, and ground system tools, will be used for 19 years, plus post-mission activities. During the Integration and Test (I&T) phases of the JWST development, 24 distinct laboratories, each geographically dispersed, will have local database tools with an XML database. Each of these laboratories' database tools will be used for the exporting and importing of data both locally and to a central database system, inputting data to the database certification process, and providing various reports. A centralized certified database repository will be maintained by the Space Telescope Science Institute (STScI), in Baltimore, Maryland, USA. One of the challenges for the database is to be flexible enough to allow for the upgrade, addition or changing of individual items without affecting the entire ground system. Also, using XML should allow for the altering of the import and export formats needed by the various elements, tracking the verification/validation of each database item, allowing many organizations to provide database inputs, and the merging of the many existing database processes into one central database structure throughout the JWST program. Many National Aeronautics and Space Administration (NASA) projects have attempted to take advantage of open source and commercial technology. Often this causes a greater reliance on the use of Commercial-Off-The-Shelf (COTS), which is often limiting

  5. Experiments with recursive estimation in astronomical image processing

    NASA Technical Reports Server (NTRS)

    Busko, I.

    1992-01-01

    Recursive estimation concepts have been applied to image enhancement problems since the 1970s. However, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for 2-dimensional images, from the well-known theory of Kalman filtering in one dimension. The historic reasons for applying these techniques to digital images are related to the images' scanned nature, in which the temporal output of a scanner device can be processed on-line by techniques borrowed directly from 1-dimensional recursive signal analysis. However, recursive estimation has particular properties that make it attractive even today, when large computer memories make the full scanned image available to the processor at any given time. One particularly important aspect is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena whose statistical properties vary in time (or with position in a 2-D image). Many image processing methods make underlying stationarity assumptions either for the stochastic field being imaged, for the imaging system properties, or both. They will underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, on the contrary, make it feasible to perform adaptive processing, that is, to process the image with a processor whose properties are tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by such phenomena as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio, and the autocorrelation function. Software was developed under IRAF, and as such will be made available to interested users.
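
    As a toy illustration of the adaptive recursive idea (and not the IRAF software described above), the sketch below runs a scalar Kalman-style estimator along a simulated scanline whose noise level changes midway, with the gain adapted to the local noise variance; all variances are assumed for the example.

```python
# Didactic sketch of recursive (Kalman-style) estimation along an image
# scanline with a locally adaptive gain; not the IRAF software described
# above. The process and noise variances are assumptions, and the local
# noise level is taken as known here (in practice it would be estimated).
import numpy as np

rng = np.random.default_rng(2)

# Simulated non-stationary scanline: smooth signal plus noise whose level
# changes halfway along the line.
n = 400
truth = 100 + 20 * np.sin(np.linspace(0, 4 * np.pi, n))
noise_sigma = np.where(np.arange(n) < n // 2, 2.0, 8.0)
data = truth + rng.normal(0, noise_sigma)

q = 0.5                       # assumed process variance (signal drift per pixel)
est, p = data[0], 10.0        # running estimate and its variance
estimates = []
for i in range(n):
    p = p + q                 # predict: the signal may have drifted
    r = noise_sigma[i] ** 2   # local noise variance drives the adaptive gain
    k = p / (p + r)
    est = est + k * (data[i] - est)   # update: smooth harder where noisier
    p = (1 - k) * p
    estimates.append(est)

print("rms error, raw vs filtered:",
      np.sqrt(np.mean((data - truth) ** 2)),
      np.sqrt(np.mean((np.array(estimates) - truth) ** 2)))
```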

  6. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    The design knowledge of modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. After analyzing the role of design knowledge and the information-management features of mechatronic products, a unified XML-based method for product information processing is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
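
    A possible shape of such an XML model is sketched below with Python's ElementTree; all element and attribute names are invented for illustration and are not the schema proposed in the paper.

```python
# Hypothetical sketch of the kind of XML the paper describes: function
# elements, structure elements and a function-to-structure mapping. All tag
# and attribute names are invented; this is not the schema from the paper.
import xml.etree.ElementTree as ET

product = ET.Element("product", name="parallel_friction_roller")

functions = ET.SubElement(product, "functions")
ET.SubElement(functions, "function", id="F1",
              description="transmit torque between shafts")

structures = ET.SubElement(product, "structures")
ET.SubElement(structures, "structure", id="S1",
              description="friction roller pair")

mappings = ET.SubElement(product, "mappings")
ET.SubElement(mappings, "map", function="F1", structure="S1")

print(ET.tostring(product, encoding="unicode"))
```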

  8. A new scene-based nonuniformity corrrection applied in boundary backward recursive reconstruction for arbitrary subpixel microscan

    NASA Astrophysics Data System (ADS)

    Chen, Yi-nan; Jin, Wei-qi; Zhao, Lei; Zhao, Lin; Wang, Tan

    2008-02-01

    A scene-based nonuniformity correction arising in the recursive reconstruction of high-resolution images from a subpixel microscan imaging (SMI) sequence is presented and analyzed. The block-by-block reconstruction algorithm, which recurses from the known boundary toward the centre in uniform 2×2 SMI, is extended to a two-dimensional focal plane array (FPA) with arbitrary scan translations that are not exactly equal to half a pixel. In this paper, the focus is on the nonuniform SMI model with fixed pattern noise (FPN), in which the image is corrupted by the gain and offset of the individual detector cells. We first demonstrate that, once our backward recursive reconstruction is applied to the undersampled sequence with FPN, the dominant effect for the majority of pixels is the elimination of the offset, owing to the inverse iterative function acting in each 2×2 region of the high-resolution lattice. The final achievement is a nonuniformity correction (NUC) obtained simultaneously with the higher resolution, so our method fully exploits the potential information of the scanned inter-frames. Application of the proposed algorithm to a simulated SMI procedure shows clear advantages, including much better image quality indexes once the FPN is removed, a processing time within one scan period (4 frames), no requirement for statistical assumptions and hence no ghost artifacts, and an inter-frame adaptive property.

  9. XSemantic: An Extension of LCA Based XML Semantic Search

    NASA Astrophysics Data System (ADS)

    Supasitthimethee, Umaporn; Shimizu, Toshiyuki; Yoshikawa, Masatoshi; Porkaew, Kriengkrai

    One of the most convenient ways to query XML data is a keyword search, because it does not require any knowledge of the XML structure or learning a new user interface. However, keyword search is ambiguous: users may use different terms to search for the same information. Furthermore, it is difficult for a system to decide which node is likely to be chosen as a return node and how much information should be included in the result. To address these challenges, we propose a keyword-based XML semantic search called XSemantic. On the one hand, we give three definitions to complete the query semantics. Firstly, through semantic term expansion, our system is robust to ambiguous keywords by using the domain ontology. Secondly, to return semantically meaningful answers, we automatically infer the return information from the user queries and take advantage of the shortest path to return meaningful connections between keywords. Thirdly, we present a semantic ranking that reflects the degree of similarity as well as the semantic relationship, so that the search results with higher relevance are presented to the users first. On the other hand, for the LCA and proximity search approaches, we investigate the problem of how much information is included in the search results. Therefore, we introduce the notion of the Lowest Common Element Ancestor (LCEA) and define a simple rule without any requirement on schema information such as a DTD or XML Schema. The first experiment indicated that XSemantic not only properly infers the return information but also generates compact, meaningful results. Additionally, the benefits of our proposed semantics are demonstrated by the second experiment.
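
    The core tree operation behind LCA-style keyword search can be sketched in a few lines; the snippet below finds the lowest common ancestor of keyword-matching nodes in a small invented document. XSemantic's LCEA rule, ontology-based term expansion, and semantic ranking are not reproduced here.

```python
# Sketch of the basic tree operation behind LCA-style XML keyword search:
# find the elements containing the keywords and compute their lowest common
# ancestor. XSemantic's LCEA rule, term expansion and ranking are not
# reproduced; the document and keywords are invented.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<bib>
  <book><title>XML querying</title><author>Smith</author></book>
  <book><title>Databases</title><author>Jones</author></book>
</bib>""")

# Map every element to its parent so we can walk upwards.
parent = {child: node for node in doc.iter() for child in node}

def path_to_root(node):
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def lca(nodes):
    common = set(path_to_root(nodes[0]))
    for n in nodes[1:]:
        common &= set(path_to_root(n))
    # The lowest common ancestor is the shared node deepest in the tree.
    return max(common, key=lambda n: len(path_to_root(n)))

hits = [e for e in doc.iter()
        if e.text and ("XML" in e.text or "Smith" in e.text)]
print(lca(hits).tag)   # 'book' for the invented document above
```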

  10. Enhance Reuse of Standard e-Business XML Schema Documents

    SciTech Connect

    Buhwan, Jeong; Kulvatunyou, Boonserm; Ivezic, Nenad; Jones, Albert

    2005-07-01

    Ideally, e-Business application interfaces would be built from highly reusable specifications of business document standards. Since many of these specifications are poorly understood, users often create new ones or customize existing ones every time a new integration problem arises. Consequently, even though there is a potential for reuse, the lack of a component discovery tool means that the cost of reuse is still prohibitively high. In this paper, we explore the potential of using similarity metrics to discover standard XML Schema documents. Our goal is to enhance reuse of XML Schema document/component standards in new integration contexts through the discovery process. We are motivated by the increasing access to application interface specifications expressed in the form of XML Schema. These specifications are created to facilitate business document exchange among software applications. Reuse can reduce both the proliferation of standards and the interoperability costs. To demonstrate these potential benefits, we propose and position our research based on an experimental scenario and a novel evaluation approach to qualify alternative similarity metrics for schema discovery. The edge equality in the evaluation method provides a conservative quality measure. We review a number of fundamental approaches to developing similarity metrics, and we organize these metrics into lexical, structural, and logical categories. For each of the metrics, we discuss its relevance and potential issues in its application to the XML Schema discovery task. We conclude that each of the similarity measures has its own strengths and weaknesses and each is expected to yield different results in different search situations. It is important, in the context of applying these measures to e-Business standards, that a schema discovery engine capable of assigning appropriate weights to different similarity measures be used when the search conditions change. This is a subject of our
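
    As a tiny example of the lexical category of metrics discussed above (the structural and logical categories are not covered), the following sketch tokenizes schema element names and compares their token sets with a Jaccard score; the element names are invented.

```python
# Tiny illustration of a lexical similarity metric of the kind surveyed in
# the paper: tokenize element names and compare their token sets. Structural
# and logical metrics are not covered, and the element names are invented.
import re

def tokens(name):
    # Split CamelCase / underscores / digits into lower-cased word tokens.
    parts = re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", name)
    return {p.lower() for p in parts}

def jaccard(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

print(jaccard("PurchaseOrderHeader", "purchase_order_id"))   # partial overlap
print(jaccard("ShipToAddress", "InvoiceLineItem"))           # no overlap
```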

  11. XTCE: XML Telemetry and Command Exchange Tutorial, XTCE Version 1

    NASA Technical Reports Server (NTRS)

    Rice, Kevin; Kizzort, Brad

    2008-01-01

    These presentation slides are a tutorial on the XML Telemetry and Command Exchange (XTCE). The goal of XTCE is to provide an industry-standard mechanism for describing telemetry and command streams (particularly from satellites). It will lower cost and increase validation over traditional formats, and support exchange or native formats. XTCE is designed to describe bit streams that are typical of telemetry and command in the historic space domain.

  12. The redundancy of recursion and infinity for natural language.

    PubMed

    Luuk, Erkki; Luuk, Hendrik

    2011-02-01

    An influential line of thought claims that natural language and arithmetic processing require recursion, a putative hallmark of human cognitive processing (Chomsky in Evolution of human language: biolinguistic perspectives. Cambridge University Press, Cambridge, pp 45-61, 2010; Fitch et al. in Cognition 97(2):179-210, 2005; Hauser et al. in Science 298(5598):1569-1579, 2002). First, we question the need for recursion in human cognitive processing by arguing that a generally simpler and less resource demanding process--iteration--is sufficient to account for human natural language and arithmetic performance. We argue that the only motivation for recursion, the infinity in natural language and arithmetic competence, is equally approachable by iteration and recursion. Second, we submit that the infinity in natural language and arithmetic competence reduces to imagining infinite embedding or concatenation, which is completely independent from the ability to implement infinite processing, and thus, independent from both recursion and iteration. Furthermore, we claim that a property of natural language is physically uncountable finity and not discrete infinity. PMID:20652723

  14. Knot Invariants from Topological Recursion on Augmentation Varieties

    NASA Astrophysics Data System (ADS)

    Gu, Jie; Jockers, Hans; Klemm, Albrecht; Soroush, Masoud

    2015-06-01

    Using the duality between Wilson loop expectation values of SU(N) Chern-Simons theory on S^3 and topological open-string amplitudes on the local mirror of the resolved conifold, we study knots on S^3 and their invariants encoded in colored HOMFLY polynomials by means of topological recursion. In the context of the local mirror Calabi-Yau threefold of the resolved conifold, we generalize the topological recursion of the remodelled B-model in order to study branes beyond the class of toric Harvey-Lawson special Lagrangians—as required for analyzing non-trivial knots on S^3. The basic ingredients for the proposed recursion are the spectral curve, given by the augmentation variety of the knot, and the calibrated annulus kernel, encoding the topological annulus amplitudes associated to the knot. We present an explicit construction of the calibrated annulus kernel for torus knots and demonstrate the validity of the topological recursion. We further argue that—if an explicit form of the calibrated annulus kernel is provided for any other knot—the proposed topological recursion should still be applicable. We study the implications of our proposal for knot theory, which exhibit interesting consequences for colored HOMFLY polynomials of mutant knots.

  15. Rock.XML - Towards a library of rock physics models

    NASA Astrophysics Data System (ADS)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating physical properties of rocks and their constituents to the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different types of rocks. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It contains not only data about the various constituents, but also the theories and how they are used to combine these building blocks into a representative model for a particular rock. The format is based on the Extensible Markup Language XML, making it flexible enough to handle complex models as well as scalable towards extending it with new theories and models. This technology has great advantages for documenting and exchanging models in an unambiguous way between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
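
    To make the idea concrete, here is a hypothetical sketch of what a Rock.XML-style description might contain, built and read back with Python's ElementTree; the element names, attributes, and the example theory are invented and do not reflect the actual Rock.XML schema.

```python
# Hypothetical sketch of what a Rock.XML-style description might contain,
# built and read back with ElementTree. Element names, attributes and the
# example theory are invented and do not reflect the actual Rock.XML schema.
import xml.etree.ElementTree as ET

rock = ET.Element("rockModel", name="sandstone_example")
constituents = ET.SubElement(rock, "constituents")
for mineral, frac, k, mu in [("quartz", "0.8", "37", "44"),
                             ("clay", "0.2", "21", "7")]:
    ET.SubElement(constituents, "constituent", mineral=mineral,
                  volumeFraction=frac, bulkModulusGPa=k, shearModulusGPa=mu)
ET.SubElement(rock, "theory", name="voigt-reuss-hill",
              role="effective mineral moduli")

xml_text = ET.tostring(rock, encoding="unicode")
print(xml_text)

# Software reading the file could dispatch on the <theory> element to decide
# how the constituents are combined into an effective medium.
parsed = ET.fromstring(xml_text)
print(parsed.find("theory").get("name"))
```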

  16. An XML Driven Graphical User Interface and Application Management Toolkit

    SciTech Connect

    White, Greg R

    2002-01-18

    In the past, the features of a user interface were limited by those available in the existing graphical widgets it used. Now, improvements in processor speed have fostered the emergence of interpreted languages, in which the appropriate method to render a given data object can be loaded at runtime. XML can be used to precisely describe the association of data types with their graphical handling (beans), and Java provides an especially rich environment for programming the graphics. We present a graphical user interface builder based on Java Beans and XML, in which the graphical screens are described textually (in files or a database) in terms of their screen components. Each component may be a simple text read back, or a complex plot. The programming model provides for dynamic data pertaining to a component to be forwarded synchronously or asynchronously, to the appropriate handler, which may be a built-in method, or a complex applet. This work was initially motivated by the need to move the legacy VMS display interface of the SLAC Control Program to another platform while preserving all of its existing functionality. However the model allows us a powerful and generic system for adding new kinds of graphics, such as Matlab, data sources, such as EPICS, middleware, such as AIDA[1], and transport, such as XML and SOAP. The system will also include a management console, which will be able to report on the present usage of the system, for instance who is running it where and connected to which channels.

  17. Recursive flexible multibody system dynamics using spatial operators

    NASA Technical Reports Server (NTRS)

    Jain, A.; Rodriguez, G.

    1992-01-01

    This paper uses spatial operators to develop new spatially recursive dynamics algorithms for flexible multibody systems. The operator description of the dynamics is identical to that for rigid multibody systems. Assumed-mode models are used for the deformation of each individual body. The algorithms are based on two spatial operator factorizations of the system mass matrix. The first (Newton-Euler) factorization of the mass matrix leads to recursive algorithms for the inverse dynamics, mass matrix evaluation, and composite-body forward dynamics for the systems. The second (innovations) factorization of the mass matrix leads to an operator expression for the mass matrix inverse and to a recursive articulated-body forward dynamics algorithm. The primary focus is on serial chains, but extensions to general topologies are also described. A comparison of computational costs shows that the articulated-body forward dynamics algorithm is much more efficient than the composite-body algorithm for most flexible multibody systems.

  18. Spin-1 Ising model on tetrahedron recursive lattices: Exact results

    NASA Astrophysics Data System (ADS)

    Jurčišinová, E.; Jurčišin, M.

    2016-11-01

    We investigate the ferromagnetic spin-1 Ising model on the tetrahedron recursive lattices. An exact solution of the model is found in the framework of which it is shown that the critical temperatures of the second order phase transitions of the model are driven by a single equation simultaneously on all such lattices. It is also shown that this general equation for the critical temperatures is equivalent to the corresponding polynomial equation for the model on the tetrahedron recursive lattice with arbitrary given value of the coordination number. The explicit form of these polynomial equations is shown for the lattices with the coordination numbers z = 6, 9, and 12. In addition, it is shown that the thermodynamic properties of all possible physical phases of the model are also completely driven by the corresponding single equations simultaneously on all tetrahedron recursive lattices. In this respect, the spontaneous magnetization, the free energy, the entropy, and the specific heat of the model are studied in detail.

  19. Recursive Query Facilities in Relational Databases: A Survey

    NASA Astrophysics Data System (ADS)

    Przymus, Piotr; Boniewicz, Aleksandra; Burzańska, Marta; Stencel, Krzysztof

    The relational model is the basis for most modern databases, while SQL is the most commonly used query language. However, there are data structures and computational problems that cannot be expressed using SQL-92 queries. Among them are those concerned with bills of material and corporate hierarchies. A newer standard, called SQL-99, introduced recursive queries which can be used to solve such tasks. Yet only recently have recursive queries been implemented in most of the leading relational databases. In this paper we have reviewed and compared implementations of the recursive queries defined by SQL:1999 through SQL:2008 and offered by leading vendors of DBMSs. Our comparison concerns features, syntax and performance.
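
    A minimal example of the SQL:1999-style recursive query discussed above is shown below, run against SQLite (whose WITH RECURSIVE support is reachable from Python's standard library); the bill-of-material table and data are invented.

```python
# A minimal SQL:1999-style recursive query of the kind the survey discusses,
# run against SQLite from the Python standard library: expand a
# bill-of-material style part hierarchy. Table and data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE part (id INTEGER PRIMARY KEY, name TEXT, parent INTEGER);
INSERT INTO part VALUES
  (1, 'bicycle', NULL),
  (2, 'wheel',   1),
  (3, 'frame',   1),
  (4, 'spoke',   2),
  (5, 'rim',     2);
""")

rows = con.execute("""
WITH RECURSIVE subparts(id, name, depth) AS (
    SELECT id, name, 0 FROM part WHERE name = 'wheel'
  UNION ALL
    SELECT p.id, p.name, s.depth + 1
    FROM part p JOIN subparts s ON p.parent = s.id
)
SELECT name, depth FROM subparts ORDER BY depth, name;
""").fetchall()

for name, depth in rows:
    print("  " * depth + name)   # wheel, then its subparts indented
```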

  20. The limits on combining recursive horn rules with description logics

    SciTech Connect

    Levy, A.Y.; Rousset, M.C.

    1996-12-31

    Horn rule languages have formed the basis for many Artificial Intelligence application languages, but are not expressive enough to model domains with a rich hierarchical structure. Description logics have been designed especially to model rich hierarchies. Several applications would significantly benefit from combining the expressive power of both formalisms. This paper focuses on combining recursive function-free Horn rules with the expressive description logic ALCNR, and shows exactly when a hybrid language with decidable inference can be obtained. First, we show that several of the core constructors of description logics lead by themselves to undecidability of inference when combined with recursive function-free Horn rules. We then show that without these constructors we obtain a maximal subset of ALCNR that yields a decidable hybrid language. Finally, we describe a restriction on the Horn rules that guarantees decidable inference when combined with all of ALCNR, and covers many of the common usages of recursive rules.

  1. Elucidating the stop bands of structurally colored systems through recursion

    NASA Astrophysics Data System (ADS)

    Amir, Ariel; Vukusic, Peter

    2013-04-01

    Interference is the source of some of the spectacular colors of animals and plants in nature. In some of these systems, the physical structure consists of an ordered array of layers with alternating high and low refractive indices. This periodicity leads to an optical band structure that is analogous to the electronic band structure encountered in semiconductor physics: specific bands of wavelengths (the stop bands) are perfectly reflected. Here, we present a minimal model for optical band structure in a periodic multilayer structure and solve it using recursion relations. The stop bands emerge in the limit of an infinite number of layers by finding the fixed point of the recursion. We compare to experimental data for various beetles, whose optical structure resembles the proposed model. Thus, using only the phenomenon of interference and the idea of recursion, we are able to elucidate the concept of band structure in the context of the experimentally observed high reflectance and iridescent appearance of structurally colored beetles.
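
    The recursion-relation approach can be made concrete: the reflection coefficient of the stack is built up interface by interface from the substrate, and the stop band appears as a band of near-total reflectance around the design wavelength. The sketch below uses illustrative refractive indices and a quarter-wave stack at normal incidence, not the beetle structures measured in the paper.

```python
# Sketch of the recursion-relation approach at normal incidence: the
# reflection coefficient of the stack is built up interface by interface
# from the substrate. The indices, thicknesses and substrate are
# illustrative (a quarter-wave stack tuned to 550 nm), not the beetle data.
import numpy as np

def stack_reflectance(wavelength_nm, n_layers, d_layers_nm, n_in=1.0, n_sub=1.5):
    """Reflectance of a layered stack via the interface-by-interface recursion."""
    # Start at the bottom: Fresnel coefficient of the last layer / substrate.
    r = (n_layers[-1] - n_sub) / (n_layers[-1] + n_sub)
    # Fold in the remaining interfaces from bottom to top.
    for idx in range(len(n_layers) - 1, -1, -1):
        n_above = n_in if idx == 0 else n_layers[idx - 1]
        r_top = (n_above - n_layers[idx]) / (n_above + n_layers[idx])
        phase = np.exp(2j * np.pi * 2 * n_layers[idx] * d_layers_nm[idx] / wavelength_nm)
        r = (r_top + r * phase) / (1 + r_top * r * phase)
    return abs(r) ** 2

# Quarter-wave stack: ten periods of alternating high/low index layers.
n_hi, n_lo, lam0 = 1.56, 1.33, 550.0
n_layers = [n_hi, n_lo] * 10
d_layers = [lam0 / (4 * n) for n in n_layers]

# Reflectance should peak near the design wavelength (the stop band).
for lam in (450, 550, 650):
    print(lam, "nm ->", round(stack_reflectance(lam, n_layers, d_layers), 3))
```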

  2. Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares

    NASA Technical Reports Server (NTRS)

    Grauer, Jared A.; Morelli, Eugene A.

    2016-01-01

    A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
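
    The underlying recursive least squares update (without the colored-residual correction that is the paper's contribution) can be sketched as follows; the linear model, noise level, and initial covariance are assumptions for the toy example.

```python
# Minimal recursive least squares sketch for a linear model y = x^T theta +
# noise. The aircraft model structure and the colored-residual correction
# that is the paper's contribution are not reproduced; noise level and
# initial covariance are assumptions.
import numpy as np

rng = np.random.default_rng(3)
theta_true = np.array([0.5, -1.2, 2.0])
n, p = 500, len(theta_true)

theta = np.zeros(p)
P = 1e3 * np.eye(p)        # large initial covariance = little prior knowledge

for _ in range(n):
    x = rng.normal(size=p)                   # regressors at this time step
    y = x @ theta_true + 0.1 * rng.normal()  # noisy measurement
    Px = P @ x
    k = Px / (1.0 + x @ Px)                  # RLS gain vector (no forgetting)
    theta = theta + k * (y - x @ theta)      # innovation-weighted correction
    P = P - np.outer(k, Px)                  # covariance downdate

print("estimates:", np.round(theta, 3))
# Conventional (white-residual) standard errors; the paper's point is that
# these need correction when the residuals are colored.
print("std errors:", np.round(0.1 * np.sqrt(np.diag(P)), 4))
```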

  3. A decoupled recursive approach for constrained flexible multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Lai, Hao-Jan; Kim, Sung-Soo; Haug, Edward J.; Bae, Dae-Sung

    1989-01-01

    A variational-vector calculus approach is employed to derive a recursive formulation for dynamic analysis of flexible multibody systems. Kinematic relationships for adjacent flexible bodies are derived in a companion paper, using a state vector notation that represents translational and rotational components simultaneously. Cartesian generalized coordinates are assigned for all body and joint reference frames to explicitly formulate the deformation kinematics under the small-deformation assumption, and an efficient recursive flexible dynamics algorithm is developed. Dynamic analysis of a closed-loop robot is performed to illustrate the efficiency of the algorithm.

  4. Recursive multibody dynamics and discrete-time optimal control

    NASA Technical Reports Server (NTRS)

    Deleuterio, G. M. T.; Damaren, C. J.

    1989-01-01

    A recursive algorithm is developed for the solution of the simulation dynamics problem for a chain of rigid bodies. Arbitrary joint constraints are permitted, that is, joints may allow translational and/or rotational degrees of freedom. The recursive procedure is shown to be identical to that encountered in a discrete-time optimal control problem. For each relevant quantity in the multibody dynamics problem, there exists an analog in the context of optimal control. The performance index that is minimized in the control problem is identified as Gibbs' function for the chain of bodies.

  5. Recursive dynamics algorithm for multibody systems with prescribed motion

    NASA Astrophysics Data System (ADS)

    Jain, Abhinandan; Rodriguez, Guillermo

    1993-10-01

    This paper uses spatial operator techniques to develop a new algorithm for the dynamics of multibody systems with hinges undergoing prescribed motion. This algorithm is spatially recursive, and its computational complexity grows only linearly with the number of degrees of freedom in the system. Its structure is a hybrid of known recursive forward and inverse dynamics algorithms for regular multibody systems. Changes to the prescribed/nonprescribed nature of hinges can be implemented during run time since they are handled with very low overhead in the algorithm.

  6. Efficient Scheduling of Recursive Control Flow on GPUs

    SciTech Connect

    Huo, Xin; Krishnamoorthy, Sriram; Agrawal, Gagan

    2013-06-10

    Graphics processing units (GPUs) have rapidly emerged as a very significant player in high performance computing. Single instruction multiple thread (SIMT) pipelines are typically used in GPUs to exploit parallelism and maximize performance. Although support for unstructured control flow has been included in GPUs, efficiently managing thread divergence for arbitrary parallel programs remains a critical challenge. In this paper, we focus on the problem of supporting recursion in modern GPUs. We design and comparatively evaluate various algorithms to manage thread divergence encountered in recursive programs. The results improve upon traditional post-dominator based reconvergence mechanisms designed to handle thread divergence due to control flow within a procedure.

  7. Use of XML and MathML for Scientific Documents and Data Sets

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan

    2001-06-01

    The Extensible Markup Language is quickly becoming a part of the infrastructure of the Web and is the next "Web language", after HTML. We will provide an overview of the use of XML for modeling, exchange, and storage of document information. We will provide a brief overview of the basic features, including XML syntax, DTDs, and generic tools for authoring, validating, storing, and querying XML. We will focus on how XML technologies may affect future electronic versions of math and science documents and teaching materials. In particular, the use of MathML is permitting Web access to research and educational materials in which the mathematical equations and figures convey content in more than graphical form. This permits multi-modal access to the materials for diverse groups as well as user interactions with the equations and figures. We will provide examples of how one can develop flexible, interactive interfaces using XML data and open XML standards.

  8. [Quantitatively Determination of Available Phosphorus and Available Potassium in Soil by Near Infrared Spectroscopy Combining with Recursive Partial Least Squares].

    PubMed

    Jia, Sheng-yao; Yang, Xiang-long; Li, Guang; Zhang, Jian-ming

    2015-09-01

    Soil available phosphorus (P) and available potassium (K) do not possess a direct spectral response in the near infrared (NIR) region. They are predictable because of their correlation with spectrally active constituents (organic matter, carbonates, clays, water, etc.). Such correlation may of course differ between soil sample sets. Therefore, NIR calibration models with a fixed structure find it difficult to achieve good prediction performance for soil P and K. In this work, the method of recursive partial least squares (RPLS), which is able to update the model coefficients recursively during the prediction process, has been applied to improve the predictive abilities of the calibration models. This work compared the performance of partial least squares regression (PLS), locally weighted PLS (LW-PLS), moving window LW-PLS (LW-PLS2) and RPLS for the measurement of soil P and K. The entire data set of 194 soil samples was split into a calibration set and a prediction set based on soil types. The calibration set was composed of 120 Anthrosols samples, while the prediction set included 29 Ferralsols samples, 23 Anthrosols samples and 22 Primarosols samples. The best prediction results were obtained by the RPLS model: the coefficient of determination (R2) and residual prediction deviation (RPD) were 0.61 and 1.60 for soil P, and 0.76 and 2.05 for soil K, respectively. The results indicate that RPLS is able to learn the information from the latest modeling sample by recursively updating the model coefficients. The proposed RPLS method has the advantages of wider applicability and better performance for NIR prediction of soil P and K compared with the other methods in this work. PMID:26669158

  9. XML — an opportunity for data standards in the geosciences

    NASA Astrophysics Data System (ADS)

    Houlding, Simon W.

    2001-08-01

    Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for development of metadata (markup) standards for information transfer in specific fields. XML allows development of markup languages that describe what information is rather than how it should be presented. This allows computer applications to process the information in intelligent ways. In contrast hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multi-media and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities and uses this as a platform for discussion of its potential for development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.

  10. Recursive single-layer nets for output error dynamic models.

    PubMed

    Berger, C S

    1995-01-01

    An algorithm for training recursive single-layer nets that has been shown to exhibit rapid convergence is presented. Convergence is not guaranteed, but a sufficient condition is given to justify the method. The method is demonstrated on a difficult modeling problem from bioengineering.

  11. Recursive Vocal Pattern Learning and Generalization in Starlings

    ERIC Educational Resources Information Center

    Bloomfield, Tiffany Corinna

    2012-01-01

    Among known communication systems, human language alone exhibits open-ended productivity of meaning. Interest in the psychological mechanisms supporting this ability, and their evolutionary origins, has resurged following the suggestion that the only uniquely human ability underlying language is a mechanism of recursion. This "Unique…

  12. Experimental verification of a recursive method to calculate evapotranspiration

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, a recursive combination method (RCM) to calculate potential and crop evapotranspiration (ET) was given by Lascano and Van Bavel (Agron. J. 2007, 99:585–590). The RCM differs from the Penman-Monteith (PM) method, the main difference being that the assumptions made regarding the temperature ...

  13. Exploring the Recursive Nature of Food and Family Communication

    ERIC Educational Resources Information Center

    Manning, Linda D.

    2006-01-01

    Family meals act as a barometer to signify the changing nature of family life. The primary objective of this activity is to allow students to experience the many ways in which a recursive relationship exists between the food families eat and the patterns of communication families enact. Through this activity, students experience how food and…

  14. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  15. Semantics Boosts Syntax in Artificial Grammar Learning Tasks with Recursion

    ERIC Educational Resources Information Center

    Fedor, Anna; Varga, Mate; Szathmary, Eors

    2012-01-01

    Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures is in the focus of attention because this could be one of the cognitive capacities that make humans distinct from all other animals. The ability to parse CER is usually tested by…

  16. Recursivity: A Working Paper on Rhetoric and "Mnesis"

    ERIC Educational Resources Information Center

    Stormer, Nathan

    2013-01-01

    This essay proposes the genealogical study of remembering and forgetting as recursive rhetorical capacities that enable discourse to place itself in an ever-changing present. "Mnesis" is a meta-concept for the arrangements of remembering and forgetting that enable rhetoric to function. Most of the essay defines the materiality of "mnesis", first…

  17. A Further Investigation of Children's Understanding of Recursive Thinking.

    ERIC Educational Resources Information Center

    Eliot, John; And Others

    1979-01-01

    Forty children of different ages responded individually to cartoon drawing in one of two orders of presentation in order to investigate children's understanding of recursive thinking. Five boys and five girls in each of the age ranges five to six, six to seven, seven to eight, and eight to nine served as subjects. (MP)

  18. Teaching and Learning Recursive Programming: A Review of the Research Literature

    ERIC Educational Resources Information Center

    McCauley, Renée; Grissom, Scott; Fitzgerald, Sue; Murphy, Laurie

    2015-01-01

    Hundreds of articles have been published on the topics of teaching and learning recursion, yet fewer than 50 of them have published research results. This article surveys the computing education research literature and presents findings on challenges students encounter in learning recursion, mental models students develop as they learn recursion,…

  19. An XML-based protocol for distributed event services

    SciTech Connect

    Gunter, Dan K.; Smith, Warren; Quesnel, Darcy

    2001-06-25

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.

  20. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.

  1. Research on Heterogeneous Data Exchange based on XML

    NASA Astrophysics Data System (ADS)

    Li, Huanqin; Liu, Jinfeng

    Integration of multiple data sources is becoming increasingly important for enterprises that cooperate closely with their partners for e-commerce. OLAP enables analysts and decision makers fast access to various materialized views from data warehouses. However, many corporations have internal business applications deployed on different platforms. This paper introduces a model for heterogeneous data exchange based on XML. The system can exchange and share the data among the different sources. The method used to realize the heterogeneous data exchange is given in this paper.

  2. Recursion method for deriving an energy-independent effective interaction

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Kumagai, Hiroo; Okamoto, Ryoji; Matsuzaki, Masayuki

    2014-04-01

    The effective-interaction theory has been one of the useful and practical methods for solving nuclear many-body problems based on the shell model. Various approaches have been proposed which are constructed in terms of the so-called Q̂ box and its energy derivatives introduced by Kuo et al. In order to find a method of calculating them, we decompose the full Hilbert space into subspaces (the Krylov subspaces) and transform the Hamiltonian to a block-tridiagonal form. This transformation brings about much simplification of the calculation of the Q̂ box. In previous work a recursion method was derived for calculating the Q̂ box analytically on the basis of such a transformation of the Hamiltonian. In the present study, by extending the recursion method for the Q̂ box, we derive another recursion relation to calculate the derivatives of the Q̂ box of arbitrary order. With the Q̂ box and its derivatives thus determined, we apply them to the calculation of the E-independent effective interaction given in the so-called Lee-Suzuki (LS) method for a system with a degenerate unperturbed energy. We show that the recursion method can also be applied to the generalized LS scheme for a system with nondegenerate unperturbed energies. If the Hilbert space is taken to be sufficiently large, the theory provides an exact way of calculating the Q̂ box and its derivatives. This approach enables us to perform recursive calculations for the effective interaction to arbitrary order for both systems with degenerate and nondegenerate unperturbed energies.

  3. On the convergence improvement in the metadynamics simulations: a Wang-Landau recursion approach.

    PubMed

    Min, Donghong; Liu, Yusong; Carbone, Irina; Yang, Wei

    2007-05-21

    As a popular tool for exploring free energy landscapes, the metadynamics method has been widely applied to elucidate various chemical or biochemical processes. As discussed in detail by Laio et al. [J. Phys. Chem. B 109, 6714 (2005)], the size of the updating Gaussian function is pivotal to the free energy convergence toward the target free energy surface. For instance, a greater Gaussian height can facilitate quick visits to a conformational region of interest; however, it may lead to a larger error in the calculated free energy surface. In contrast, a lower Gaussian height can guarantee a better resolution of the calculated free energy surface; however, such a simulation will take longer to navigate through the defined conformational region. In order to reconcile this conflict, the authors present a method that implements the Wang-Landau recursion scheme in the metadynamics simulations to adaptively update the height of the unit Gaussian function. As demonstrated in their model studies on both a toy system and a realistic molecular system treated with a hybrid quantum mechanical and molecular mechanical (QM/MM) potential, the present approach can quickly result in better-converged free energy surfaces, compared with classical metadynamics simulations employing fixed Gaussian heights.
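
    A toy one-dimensional sketch of the idea is given below: metadynamics hills are deposited on a double-well potential, and the hill height is halved, Wang-Landau style, whenever the recent visit histogram over the collective variable looks flat. The potential, the flatness threshold, and all numerical parameters are invented, and nothing here involves the QM/MM system of the paper.

```python
# Toy 1-D sketch: metadynamics hills on a double-well potential, with the
# hill height halved (Wang-Landau style) whenever the recent visit histogram
# over the collective variable is roughly flat. Potential, thresholds and
# all numerical parameters are invented; no QM/MM system is involved.
import numpy as np

rng = np.random.default_rng(4)

def potential(x):
    return (x ** 2 - 1.0) ** 2            # double well with minima at +/- 1

edges = np.linspace(-2, 2, 41)
centers = 0.5 * (edges[:-1] + edges[1:])
bias = np.zeros_like(centers)             # amplitudes of deposited Gaussians
hist = np.zeros_like(centers)             # visits since the last halving
height, width, beta = 0.2, 0.2, 5.0

def biased_energy(x):
    return potential(x) + np.sum(bias * np.exp(-(x - centers) ** 2 / (2 * width ** 2)))

x = -1.0
for step in range(20000):
    # Metropolis move on the biased surface.
    x_new = x + rng.normal(0, 0.1)
    d_e = biased_energy(x_new) - biased_energy(x)
    if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
        x = x_new
    hist[np.clip(np.searchsorted(edges, x) - 1, 0, len(centers) - 1)] += 1
    if step % 50 == 0:                    # deposit a hill at the current position
        bias[np.argmin(np.abs(centers - x))] += height
    # Wang-Landau-style check: halve the hill height once visits look flat.
    if step % 2000 == 0 and hist.min() > 0.6 * hist.mean():
        height *= 0.5
        hist[:] = 0

# The wells should have accumulated more bias than the barrier region, and
# the hill height should have shrunk as the bias converged.
print("final hill height:", round(height, 4))
print("bias in well vs at barrier:",
      round(biased_energy(-1.0) - potential(-1.0), 2),
      round(biased_energy(0.0) - potential(0.0), 2))
```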

  4. An investigation of a manipulative simulation in the learning of recursive programming

    NASA Astrophysics Data System (ADS)

    Bower, Randall Wayne

    Recursion is a fundamentally important topic in computer science. Even so, it is often omitted in introductory courses, or discussed only briefly. This is likely due, at least in part, to the fact that teaching recursion has been difficult. Perhaps the biggest problem in teaching recursion is that there are few, if any, naturally existing examples of recursion in our lives. However, successful simulations have shown that the computer may hold the key to solving this problem. A simulation of recursion presented to students before formal classroom instruction can provide a foundation of concrete experiences to build upon. The challenge is to develop an appropriate simulation and lesson plan for introducing recursion to students early in their programming experience. This research reviews previous attempts at teaching recursion, including detailed lesson plans, mental models of recursion, and other simulations. Then, a new simulation and lesson plan for its use are described. The effectiveness of the simulation is studied using two groups of students enrolled in a college-level, introductory programming course. Results indicate that students who used the simulation as their first exposure to recursion gained a deeper understanding of recursion than students receiving a lecture-based introduction to recursion. Specifically, students who used the simulation required fewer attempts to complete a set of recursive programming exercises and performed better on a follow-up exam given six weeks after the experiment. This research concludes with a discussion of two important questions: How should students think about recursion and how do they think about recursion. The simulation's strengths and shortcomings in fostering effective ways of thinking about recursion are also discussed.

  5. XML schemas and mark-up practices of taxonomic literature

    PubMed Central

    Penev, Lyubomir; Lyal, Christopher HC; Weitzman, Anna; Morse, David R.; King, David; Sautter, Guido; Georgiev, Teodor; Morris, Robert A.; Catapano, Terry; Agosti, Donat

    2011-01-01

    Abstract We review the three most widely used XML schemas used to mark-up taxonomic texts, TaxonX, TaxPub and taXMLit. These are described from the viewpoint of their development history, current status, implementation, and use cases. The concept of “taxon treatment” from the viewpoint of taxonomy mark-up into XML is discussed. TaxonX and taXMLit are primarily designed for legacy literature, the former being more lightweight and with a focus on recovery of taxon treatments, the latter providing a much more detailed set of tags to facilitate data extraction and analysis. TaxPub is an extension of the National Library of Medicine Document Type Definition (NLM DTD) for taxonomy focussed on layout and recovery and, as such, is best suited for mark-up of new publications and their archiving in PubMedCentral. All three schemas have their advantages and shortcomings and can be used for different purposes. PMID:22207808

  6. Personalising e-learning modules: targeting Rasmussen levels using XML.

    PubMed

    Renard, J M; Leroy, S; Camus, H; Picavet, M; Beuscart, R

    2003-01-01

    The development of Internet technologies has made it possible to increase the number and the diversity of on-line resources for teachers and students. Initiatives like the French-speaking Virtual Medical University Project (UMVF) try to organise access to these resources. But both teachers and students are working on a partly redundant subset of knowledge. From the analysis of some French courses we propose a model for knowledge organisation derived from Rasmussen's stepladder. In the context of decision-making, Rasmussen has identified skill-based, rule-based and knowledge-based levels for the mental process. In the medical context of problem-solving, we apply these three levels to the definition of three student levels: beginners, intermediate-level learners, and experts. Based on our model, we build a representation of the hierarchical structure of the data using the XML language. We use the XSLT transformation language in order to filter relevant data according to the student's level and to propose an appropriate display on the student's terminal. The model and the XML implementation we define help to design tools for building personalised e-learning modules. PMID:14664075

  7. XML-Based Visual Specification of Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Al-Theneyan, Ahmed; Jakatdar, Amol; Mehrotra, Piyush; Zubair, Mohammad

    2001-01-01

    The advancements in the Internet and Web technologies have fueled a growing interest in developing a web-based distributed computing environment. We have designed and developed Arcade, a web-based environment for designing, executing, monitoring, and controlling distributed heterogeneous applications, which is easy to use and access, portable, and provides support through all phases of the application development and execution. A major focus of the environment is the specification of heterogeneous, multidisciplinary applications. In this paper we focus on the visual and script-based specification interface of Arcade. The web/browser-based visual interface is designed to be intuitive to use and can also be used for visual monitoring during execution. The script specification is based on XML to: (1) make it portable across different frameworks, and (2) make the development of our tools easier by using the existing freely available XML parsers and editors. There is a one-to-one correspondence between the visual and script-based interfaces allowing users to go back and forth between the two. To support this we have developed translators that translate a script-based specification to a visual-based specification, and vice-versa. These translators are integrated with our tools and are transparent to users.

  8. On recursive least-squares filtering algorithms and implementations. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Hsieh, Shih-Fu

    1990-01-01

    In many real-time signal processing applications, fast and numerically stable algorithms for solving least-squares problems are necessary and important. In particular, under non-stationary conditions, these algorithms must be able to adapt themselves to reflect the changes in the system and take appropriate adjustments to achieve optimum performances. Among existing algorithms, the QR-decomposition (QRD)-based recursive least-squares (RLS) methods have been shown to be useful and effective for adaptive signal processing. In order to increase the speed of processing and achieve high throughput rate, many algorithms are being vectorized and/or pipelined to facilitate high degrees of parallelism. A time-recursive formulation of RLS filtering employing block QRD will be considered first. Several methods, including a new non-continuous windowing scheme based on selectively rejecting contaminated data, were investigated for adaptive processing. Based on systolic triarrays, many other forms of systolic arrays are shown to be capable of implementing different algorithms. Various updating and downdating systolic algorithms and architectures for RLS filtering are examined and compared in details, which include Householder reflector, Gram-Schmidt procedure, and Givens rotation. A unified approach encompassing existing square-root-free algorithms is also proposed. For the sinusoidal spectrum estimation problem, a judicious method of separating the noise from the signal is of great interest. Various truncated QR methods are proposed for this purpose and compared to the truncated SVD method. Computer simulations provided for detailed comparisons show the effectiveness of these methods. This thesis deals with fundamental issues of numerical stability, computational efficiency, adaptivity, and VLSI implementation for the RLS filtering problems. In all, various new and modified algorithms and architectures are proposed and analyzed; the significance of any of the new method depends

  9. A Practical Introduction to the XML, Extensible Markup Language, by Way of Some Useful Examples

    ERIC Educational Resources Information Center

    Snyder, Robin

    2004-01-01

    XML, Extensible Markup Language, is important as a way to represent and encapsulate the structure of underlying data in a portable way that supports data exchange regardless of the physical storage of the data. This paper (and session) introduces some useful and practical aspects of XML technology for sharing information in an educational setting…

  10. Integrated Syntactic/Semantic XML Data Validation with a Reusable Software Component

    ERIC Educational Resources Information Center

    Golikov, Steven

    2013-01-01

    Data integration is a critical component of enterprise system integration, and XML data validation is the foundation for sound data integration of XML-based information systems. Since B2B e-commerce relies on data validation as one of the critical components for enterprise integration, it is imperative for financial industries and e-commerce…

  11. Teaching and learning recursive programming: a review of the research literature

    NASA Astrophysics Data System (ADS)

    McCauley, Renée; Grissom, Scott; Fitzgerald, Sue; Murphy, Laurie

    2015-01-01

    Hundreds of articles have been published on the topics of teaching and learning recursion, yet fewer than 50 of them have published research results. This article surveys the computing education research literature and presents findings on challenges students encounter in learning recursion, mental models students develop as they learn recursion, and best practices in introducing recursion. Effective strategies for introducing the topic include using different contexts such as recurrence relations, programming examples, fractal images, and a description of how recursive methods are processed using a call stack. Several studies compared the efficacy of introducing iteration before recursion and vice versa. The paper concludes with suggestions for future research into how students learn and understand recursion, including a look at the possible impact of instructor attitude and newer pedagogies.
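
    For readers new to the topic, a classroom-style example of the contrast the literature keeps returning to (recursive calls and the call stack versus an equivalent loop) is sketched below; it is purely didactic and not drawn from the surveyed studies.

```python
# Classroom-style illustration of the contrast the review keeps returning
# to: the same computation written recursively (with an explicit trace of
# call depth, mirroring the call stack) and iteratively. Purely didactic.
def factorial_recursive(n, depth=0):
    print("  " * depth + f"call factorial({n})")
    if n <= 1:                       # base case stops the recursion
        result = 1
    else:                            # recursive case: delegate a smaller problem
        result = n * factorial_recursive(n - 1, depth + 1)
    print("  " * depth + f"return {result}")
    return result

def factorial_iterative(n):
    result = 1
    for k in range(2, n + 1):        # the same recurrence unrolled into a loop
        result *= k
    return result

print(factorial_recursive(4), factorial_iterative(4))
```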

  12. An update on HL7's XML-based document representation standards.

    PubMed

    Dolin, R H; Alschuler, L; Boyer, S; Beebe, C

    2000-01-01

    Many people know of HL7 as an organization that creates healthcare messaging standards. But HL7 is also developing standards for the representation of clinical documents (such as discharge summaries and consultation notes). These document standards comprise the HL7 Clinical Document Architecture (CDA). Last year we presented a high-level conceptual overview of the CDA. Since that time, CDA has entered HL7's formal ballot process (which when successful will make the CDA an ANSI-approved HL7 standard). This article delves into the technical details of the current CDA proposal. Note that due to space limitations, only a subset of CDA details can be described. Also, because the ballot process elicits considerable feedback, it is likely that the material presented here will undergo evolution prior to becoming a final standard. The most up-to-date information is available on HL7's web site (www.hl7.org).

  13. EEG and MEG source localization using recursively applied (RAP) MUSIC

    SciTech Connect

    Mosher, J.C.; Leahy, R.M.

    1996-12-31

    The multiple signal characterization (MUSIC) algorithm locates multiple asynchronous dipolar sources from electroencephalography (EEG) and magnetoencephalography (MEG) data. A signal subspace is estimated from the data, then the algorithm scans a single dipole model through a three-dimensional head volume and computes projections onto this subspace. To locate the sources, the user must search the head volume for local peaks in the projection metric. Here we describe a novel extension of this approach which we refer to as RAP (Recursively APplied) MUSIC. This new procedure automatically extracts the locations of the sources through a recursive use of subspace projections, which uses the metric of principal correlations as a multidimensional form of correlation analysis between the model subspace and the data subspace. The dipolar orientations, a form of 'diverse polarization', are easily extracted using the associated principal vectors.

  14. Berends-Giele recursion for double-color-ordered amplitudes

    NASA Astrophysics Data System (ADS)

    Mafra, Carlos R.

    2016-07-01

    Tree-level double-color-ordered amplitudes are computed using Berends-Giele recursion relations applied to the bi-adjoint cubic scalar theory. The standard notion of Berends-Giele currents is generalized to double-currents and their recursions are derived from a perturbiner expansion of linearized fields that solve the non-linear field equations. Two applications are given. Firstly, we prove that the entries of the inverse KLT matrix are equal to Berends-Giele double-currents (and are therefore easy to compute). And secondly, a simple formula to generate tree-level BCJ-satisfying numerators for arbitrary multiplicity is proposed by evaluating the field-theory limit of tree-level string amplitudes for various color orderings using double-color-ordered amplitudes.

  15. An algorithm for protein engineering: simulations of recursive ensemble mutagenesis.

    PubMed Central

    Arkin, A P; Youvan, D C

    1992-01-01

    An algorithm for protein engineering, termed recursive ensemble mutagenesis, has been developed to produce diverse populations of phenotypically related mutants whose members differ in amino acid sequence. This method uses a feedback mechanism to control successive rounds of combinatorial cassette mutagenesis. Starting from partially randomized "wild-type" DNA sequences, a highly parallel search of sequence space for peptides fitting an experimenter's criteria is performed. Each iteration uses information gained from the previous rounds to search the space more efficiently. Simulations of the technique indicate that, under a variety of conditions, the algorithm can rapidly produce a diverse population of proteins fitting specific criteria. In the experimental analog, genetic selection or screening applied during recursive ensemble mutagenesis should force the evolution of an ensemble of mutants to a targeted cluster of related phenotypes. PMID:1502200
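
    The feedback loop itself can be sketched abstractly, as below: partially randomized sequences are screened, and the survivors bias the randomization used in the next round. The scoring function and all parameters are invented stand-ins, not the published algorithm.

```python
# Abstract sketch of the feedback idea: partially randomized sequences are
# screened, and the survivors bias the randomization of the next round. The
# scoring function and every parameter here are toy stand-ins, not the
# published algorithm or its simulations.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
TARGET = "MKTAYIAK"                      # invented "ideal" peptide for scoring

def score(seq):                          # toy fitness: matches to the target
    return sum(a == b for a, b in zip(seq, TARGET))

def sample(profile):
    # Draw one sequence from a per-position amino-acid weight profile.
    return "".join(random.choices(AMINO_ACIDS, weights=profile[i])[0]
                   for i in range(len(TARGET)))

random.seed(0)
profile = [[1.0] * len(AMINO_ACIDS) for _ in TARGET]    # start fully random

for generation in range(10):
    population = [sample(profile) for _ in range(200)]
    survivors = sorted(population, key=score, reverse=True)[:20]
    # Feedback step: rebuild the per-position profile from the survivors,
    # keeping a pseudocount so no residue is excluded outright.
    profile = [[1.0 + 5.0 * sum(s[i] == aa for s in survivors)
                for aa in AMINO_ACIDS]
               for i in range(len(TARGET))]
    print("generation", generation, "best score:", score(survivors[0]))
```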

  16. On Recursion Operator of the q-KP Hierarchy

    NASA Astrophysics Data System (ADS)

    Tian, Ke-Lei; Zhu, Xiao-Ming; He, Jing-Song

    2016-09-01

    It is the aim of the present article to give a general expression for the flow equations of the q-KP hierarchy. The distinct difference between the q-KP hierarchy and the KP hierarchy is due to the q-binomial and the action of the q-shift operator θ, which originate from the Leibniz rule of quantum calculus. We further show that the n-reduction leads to a recursive scheme for these flow equations. The recursion operator for the flow equations of the q-KP hierarchy under the n-reduction is also derived. Supported by the National Natural Science Foundation of China under Grant Nos. 11271210 and 11201451, and Anhui Province Natural Science Foundation under Grant No. 1608085MA04.

  17. Development of a recursion RNG-based turbulence model

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Vahala, George; Thangam, S.

    1993-01-01

    Reynolds stress closure models based on the recursion renormalization group theory are developed for the prediction of turbulent separated flows. The proposed model uses a finite wavenumber truncation scheme to account for the spectral distribution of energy. In particular, the model incorporates effects of both local and nonlocal interactions. The nonlocal interactions are shown to yield a contribution identical to that from the epsilon-renormalization group (RNG), while the local interactions introduce higher order dispersive effects. A formal analysis of the model is presented and its ability to accurately predict separated flows is analyzed from a combined theoretical and computational standpoint. Turbulent flow past a backward-facing step is chosen as a test case and the results obtained based on detailed computations demonstrate that the proposed recursion-RNG model with finite cut-off wavenumber can yield very good predictions for the backstep problem.

  18. A Precision Recursive Estimate for Ephemeris Refinement (PREFER)

    NASA Technical Reports Server (NTRS)

    Gibbs, B.

    1980-01-01

    A recursive filter/smoother orbit determination program was developed to refine the ephemerides produced by a batch orbit determination program (e.g., CELEST, GEODYN). The program PREFER can handle a variety of ground and satellite-to-satellite tracking types as well as satellite altimetry. It was tested on simulated data which contained significant modeling errors, and the results clearly demonstrated the superiority of the program compared to batch estimation.

  19. An Accelerated Recursive Doubling Algorithm for Block Tridiagonal Systems

    SciTech Connect

    Seal, Sudip K

    2014-01-01

    Block tridiagonal systems of linear equations arise in a wide variety of scientific and engineering applications. The recursive doubling algorithm is a well-known prefix computation-based numerical algorithm that requires O(M^3(N/P + log P)) work to compute the solution of a block tridiagonal system with N block rows and block size M on P processors. In real-world applications, solutions of tridiagonal systems are most often sought with multiple, often hundreds or thousands, of different right hand sides but with the same tridiagonal matrix. Here, we show that the recursive doubling algorithm is sub-optimal when computing solutions of block tridiagonal systems with multiple right hand sides and present a novel algorithm, called the accelerated recursive doubling algorithm, that delivers O(R) improvement when solving block tridiagonal systems with R distinct right hand sides. Since R is typically about 100-1000, this improvement translates to very significant speedups in practice. Detailed complexity analyses of the new algorithm with empirical confirmation of runtime improvements are presented. To the best of our knowledge, this algorithm has not been reported before in the literature.
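
    The log P term comes from the underlying recursive-doubling (parallel prefix) pattern. The sketch below shows only that scalar prefix skeleton for an arbitrary associative operation, not the block-tridiagonal solver itself:

      # After step k, position i holds the combination of inputs i-2^k+1 .. i.
      def recursive_doubling_prefix(values, op):
          x = list(values)
          n = len(x)
          step = 1
          while step < n:
              # all combines at a given step are independent, hence parallelizable
              x = [op(x[i - step], x[i]) if i >= step else x[i] for i in range(n)]
              step *= 2               # O(log n) doubling steps in total
          return x

      import operator
      print(recursive_doubling_prefix([1, 2, 3, 4, 5, 6, 7, 8], operator.add))
      # -> [1, 3, 6, 10, 15, 21, 28, 36]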

  20. Recursive linear optical networks for realizing quantum algorithms

    NASA Astrophysics Data System (ADS)

    Tabia, Gelo Noel

    Linear optics has played a leading role in the development of practical quantum technologies. In recent years, advances in integrated quantum photonics have significantly improved the functionality and scalability of linear optical devices. In this talk, I present recursive schemes for implementing quantum Fourier transforms and inversion about the mean in Grover's algorithm with photonic integrated circuits. By recursive, I mean that two copies of a d-dimensional unitary operation are used to build the corresponding unitary operation on 2d modes. The linear optical networks operate on path-encoded qudits and realize d-dimensional unitary operations using O(d^2) elements. To demonstrate that the recursive circuits are viable in practice, I conducted simulations of proof-of-principle experiments using a fabrication model of realistic errors in silicon-based photonic integrated devices. The results indicate high-fidelity performance in the circuits for 2-qubit and 3-qubit quantum Fourier transforms, and for quantum search on 4-item and 8-item databases. This work was funded by institutional research grant IUT2-1 from the Estonian Research Council and by the European Union through the European Regional Development Fund.
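
    The matrix identity behind this kind of doubling is the Cooley-Tukey factorization: two copies of the d-point transform, a diagonal of twiddle phases, and an even/odd reordering give the 2d-point transform. A small numpy check of that identity (only the matrix recursion, not the optical-network layout itself):

      import numpy as np

      def dft(n):
          """Unnormalized n-point DFT matrix (n a power of two), built by doubling."""
          if n == 1:
              return np.array([[1.0 + 0.0j]])
          d = n // 2
          F = dft(d)
          D = np.diag(np.exp(-2j * np.pi * np.arange(d) / n))   # twiddle phases
          combine = np.block([[F, D @ F], [F, -D @ F]])
          # route even-indexed inputs to the first copy of F, odd-indexed to the second
          perm = np.concatenate([np.arange(0, n, 2), np.arange(1, n, 2)])
          return combine @ np.eye(n)[perm]

      n = 8
      direct = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)
      print(np.allclose(dft(n), direct))    # True
      # Dividing each combining stage by sqrt(2) (and the whole matrix by sqrt(n))
      # gives the unitary factorization that a stage-by-stage circuit could realize.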

  1. Dynamics of deformable multibody systems using recursive projection methods

    NASA Astrophysics Data System (ADS)

    Shabana, A. A.

    1992-12-01

    In this investigation, generalized Newton-Euler equations are developed for deformable bodies that undergo large translational and rotational displacements. The configuration of the deformable body is identified using coupled sets of reference and elastic variables. The nonlinear generalized Newton-Euler equations are formulated in terms of a set of time-invariant scalars and matrices that depend on the spatial coordinates as well as the assumed displacement field. These time-invariant quantities appear in the nonlinear terms that represent the dynamic coupling between the rigid body modes and the elastic deformation. A set of recursive kinematic equations, in which the absolute accelerations are expressed in terms of the joint and elastic accelerations, is developed for several joint types. The recursive kinematic equations and the joint reaction relationships are combined with the generalized Newton-Euler equations in order to obtain a system of loosely coupled equations which have a sparse matrix structure. Using matrix partitioning and recursive projection techniques based on optimal block factorization, an order-n solution for the system equations is obtained.

  2. Recursive linearization of multibody dynamics equations of motion

    NASA Technical Reports Server (NTRS)

    Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    The equations of motion of a multibody system are nonlinear in nature, and thus pose a difficult problem in linear control design. One approach is to obtain a first-order approximation through numerical perturbations at a given configuration, and to design a control law based on the linearized model. Here, a linearized model is generated analytically by following the footsteps of the recursive derivation of the equations of motion. The equations of motion are first written in a Newton-Euler form, which is systematic and easy to construct; then, they are transformed into a relative coordinate representation, which is more efficient in computation. A new computational method for linearization is obtained by applying a series of first-order analytical approximations to the recursive kinematic relationships. The method has proved to be computationally more efficient because of its recursive nature. It has also turned out to be more accurate because analytical perturbation circumvents numerical differentiation and other associated numerical operations that may accumulate computational error, thus requiring only analytical operations on matrices and vectors. The power of the proposed linearization algorithm is demonstrated, in comparison to a numerical perturbation method, with a two-link manipulator and a seven-degree-of-freedom robotic manipulator. Its application to control design is also demonstrated.

  3. Parallelizable approximate solvers for recursions arising in preconditioning

    SciTech Connect

    Shapira, Y.

    1996-12-31

    For the recursions used in the Modified Incomplete LU (MILU) preconditioner, namely, the incomplete decomposition, forward elimination and back substitution processes, a parallelizable approximate solver is presented. The present analysis shows that the solutions of the recursions depend only weakly on their initial conditions and may be interpreted to indicate that the inexact solution is close, in some sense, to the exact one. The method is based on a domain decomposition approach, suitable for parallel implementations with message passing architectures. It requires a fixed number of communication steps per preconditioned iteration, independently of the number of subdomains or the size of the problem. The overlapping subdomains are either cubes (suitable for mesh-connected arrays of processors) or constructed by the data-flow rule of the recursions (suitable for line-connected arrays with possibly SIMD or vector processors). Numerical examples show that, in both cases, the overhead in the number of iterations required for convergence of the preconditioned iteration is small relative to the speed-up gained.

  4. Teaching object concepts for XML-based representations.

    SciTech Connect

    Kelsey, R. L.

    2002-01-01

    Students learned about object-oriented design concepts and knowledge representation through the use of a set of toy blocks. The blocks represented a limited and focused domain of knowledge and one that was physical and tangible. The blocks helped the students to better visualize, communicate, and understand the domain of knowledge as well as how to perform object decomposition. The blocks were further abstracted to an engineering design kit for water park design. This helped the students to work on techniques for abstraction and conceptualization. It also led the project from tangible exercises into software and programming exercises. Students employed XML to create object-based knowledge representations and Java to use the represented knowledge. The students developed and implemented software allowing a lay user to design and create their own water slide and then to take a simulated ride on their slide.
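
    A minimal sketch of the kind of object-to-XML exercise described, using Python's standard library; the element and attribute names are invented for illustration and are not the course's actual schema:

      import xml.etree.ElementTree as ET

      # Serialize a toy-block "object" as XML.
      block = ET.Element("block", id="b1", shape="cube", color="red")
      ET.SubElement(block, "dimensions", width="4", depth="4", height="4")
      ET.SubElement(block, "connectsTo", ref="b2")
      xml_text = ET.tostring(block, encoding="unicode")
      print(xml_text)

      # A consumer (in Java or Python) can rebuild objects from the markup.
      parsed = ET.fromstring(xml_text)
      print(parsed.get("shape"), parsed.find("dimensions").get("height"))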

  5. XML-based Gating Descriptions in Flow Cytometry

    PubMed Central

    Spidlen, Josef; Leif, Robert; Moore, Wayne; Roederer, Mario; Brinkman, Ryan R.

    2008-01-01

    Background: The lack of software interoperability with respect to gating, due to the lack of a standardized mechanism for data exchange, has traditionally been a bottleneck preventing reproducibility of flow cytometry (FCM) data analysis and the usage of multiple analytical tools. Methods: To facilitate interoperability among FCM data analysis tools, members of the International Society for the Advancement of Cytometry (ISAC) Data Standards Task Force (DSTF) have developed an XML-based mechanism to formally describe gates (Gating-ML). Results: Gating-ML, an open specification for encoding gating, data transformations and compensation, has been adopted by the ISAC DSTF as a Candidate Recommendation (CR). Conclusions: Gating-ML can facilitate the exchange of gating descriptions in the same way that FCS facilitated the exchange of raw FCM data. Its adoption will open new collaborative opportunities as well as possibilities for advanced analyses and methods development. The ISAC DSTF is satisfied that the standard addresses the requirements for a gating exchange standard. PMID:18773465

  6. Using XML and Java Technologies for Astronomical Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Case, Lynne; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center, under the Instrument Remote Control (IRC) project, is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is that the software is driven by an instrument description, written using the Instrument Markup Language (IML), a dialect of XML. IML is used to describe the command sets and command formats of the instrument, communication mechanisms, format of the data coming from the instrument, and characteristics of the graphical user interface to control and monitor the instrument. The IRC framework allows the users to define a data analysis pipeline which converts data coming out of the instrument. The data can be used in visualizations in order for the user to assess the data in real-time, if necessary. The data analysis pipeline algorithms can be supplied by the user in a variety of forms or programming languages. Although the current integration effort is targeted for the High-resolution Airborne Wideband Camera (HAWC) and the Submillimeter and Far Infrared Experiment (SAFIRE), first-light instruments of the Stratospheric Observatory for Infrared Astronomy (SOFIA), the framework is designed to be generic and extensible so that it can be applied to any instrument. Plans are underway to test the framework
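
    The general pattern of description-driven control can be sketched as follows: a generic engine reads an XML instrument description and builds its command table from it, so the engine contains no instrument-specific code. The element names below are hypothetical placeholders, not the actual IML schema:

      import xml.etree.ElementTree as ET

      DESCRIPTION = """
      <instrument name="demo-camera">
        <command name="SET_EXPOSURE" format="EXP {value:d}"/>
        <command name="TAKE_FRAME"   format="SNAP"/>
      </instrument>
      """

      root = ET.fromstring(DESCRIPTION)
      commands = {c.get("name"): c.get("format") for c in root.iter("command")}

      def build_command(name, **params):
          # the engine knows nothing about cameras; everything comes from the description
          return commands[name].format(**params)

      print(build_command("SET_EXPOSURE", value=250))   # -> "EXP 250"
      print(build_command("TAKE_FRAME"))                # -> "SNAP"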

  7. Categorial compositionality III: F-(co)algebras and the systematicity of recursive capacities in human cognition.

    PubMed

    Phillips, Steven; Wilson, William H

    2012-01-01

    Human cognitive capacity includes recursively definable concepts, which are prevalent in domains involving lists, numbers, and languages. Cognitive science currently lacks a satisfactory explanation for the systematic nature of such capacities (i.e., why the capacity for some recursive cognitive abilities-e.g., finding the smallest number in a list-implies the capacity for certain others-finding the largest number, given knowledge of number order). The category-theoretic constructs of initial F-algebra, catamorphism, and their duals, final coalgebra and anamorphism provide a formal, systematic treatment of recursion in computer science. Here, we use this formalism to explain the systematicity of recursive cognitive capacities without ad hoc assumptions (i.e., to the same explanatory standard used in our account of systematicity for non-recursive capacities). The presence of an initial algebra/final coalgebra explains systematicity because all recursive cognitive capacities, in the domain of interest, factor through (are composed of) the same component process. Moreover, this factorization is unique, hence no further (ad hoc) assumptions are required to establish the intrinsic connection between members of a group of systematically-related capacities. This formulation also provides a new perspective on the relationship between recursive cognitive capacities. In particular, the link between number and language does not depend on recursion, as such, but on the underlying functor on which the group of recursive capacities is based. Thus, many species (and infants) can employ recursive processes without having a full-blown capacity for number and language.
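
    A minimal Python illustration of "factoring through the same component process": both finding the smallest and finding the largest element are the same list recursion (a fold, i.e. the list catamorphism), differing only in the algebra supplied to it:

      from functools import reduce

      def list_catamorphism(algebra, unit, xs):
          # the unique fold determined by the list structure and the given algebra
          return reduce(algebra, xs, unit)

      smallest = lambda xs: list_catamorphism(min, float("inf"), xs)
      largest = lambda xs: list_catamorphism(max, float("-inf"), xs)

      print(smallest([7, 2, 9, 4]), largest([7, 2, 9, 4]))   # 2 9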

  8. Using XML to improve the productivity and robustness in application development in geosciences

    NASA Astrophysics Data System (ADS)

    Mello, Ulisses T.; Xu, Liqing

    2006-12-01

    In this paper, we describe an approach to apply Extensible Markup Language (XML) technologies to improve the robustness of geological and geophysical applications as well as to increase the efficacy of the application development process. Geological and geophysical applications are often data centric and I/O intensive, and their development is incremental. Therefore, a significant amount of development resources is devoted to the design and reengineering of the container data structures that store data. This process is time consuming, mechanical and error prone. Normally, ad hoc parsers are necessary for reading inputs, as well as numerous filters or adapters to transform the data for integration with other legacy applications. Most of this can be avoided by using XML-related technologies. XML has a type system schema that can be used to define input parameters and constraints. The XML parser can validate the input data using the constraints defined in the schema. Exporting results in XML format allows the use of Extensible Stylesheet Language Transformations (XSLT) to transform XML output to any other format necessary for integration with legacy applications. Additionally, XML data-binding code can be automatically generated in specified languages such as C++ and Java. We used this approach to develop applications for seismic ray-tracing and basin modeling with great success; the major benefits were significant gains in development productivity and in application robustness.
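
    A minimal sketch of the approach described (schema-constrained input plus an XSLT transformation of the output), assuming the third-party lxml library; the element names, schema, and stylesheet are invented for illustration and are not from the paper:

      from lxml import etree

      XSD = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
        <xs:element name="layer">
          <xs:complexType>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="thickness_m" type="xs:double" use="required"/>
          </xs:complexType>
        </xs:element>
      </xs:schema>"""

      XSLT = b"""<xsl:stylesheet version="1.0"
          xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
        <xsl:output method="text"/>
        <xsl:template match="/layer">layer,<xsl:value-of select="@name"/>,<xsl:value-of select="@thickness_m"/></xsl:template>
      </xsl:stylesheet>"""

      schema = etree.XMLSchema(etree.fromstring(XSD))
      transform = etree.XSLT(etree.fromstring(XSLT))

      doc = etree.fromstring(b'<layer name="shale" thickness_m="120.5"/>')
      print(schema.validate(doc))   # True: input satisfies the declared constraints
      print(str(transform(doc)))    # "layer,shale,120.5", e.g. for a legacy CSV reader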

  9. XML as a cross-platform representation for medical imaging with fuzzy algorithms.

    PubMed

    Gal, Norbert; Stoicu-Tivadar, Vasile

    2011-01-01

    Machines that perform linguistic medical image interpretation are based on fuzzy algorithms. There are several frameworks that can edit and simulate fuzzy algorithms, but they are not compatible with most of the implemented applications. This paper suggests a representation for fuzzy algorithms in XML files, and the use of this XML as a cross-platform bridge between the simulation framework and the software applications. The paper presents a parsing algorithm that dynamically converts files created by the simulation framework into an XML file, keeping the original logical structure of the files. PMID:21685590

  10. PRIDEViewer: a novel user-friendly interface to visualize PRIDE XML files.

    PubMed

    Medina-Aunon, J Alberto; Carazo, José M; Albar, Juan Pablo

    2011-01-01

    Current standardization initiatives have greatly contributed to sharing the information derived from proteomics experiments. One of these initiatives is the XML-based repository PRIDE (PRoteomics IDEntification database), although an XML-based document does not appear to present a user-friendly view at first glance. PRIDEViewer is a novel Java-based application that presents the information available in a PRIDE XML file in a user-friendly manner, facilitating the interaction among end users as well as the understanding and evaluation of the compiled information. PRIDEViewer is freely available at: http://proteo.cnb.csic.es/prideviewer/.

  11. Discovery of a Recursive Principle: An Artificial Grammar Investigation of Human Learning of a Counting Recursion Language.

    PubMed

    Cho, Pyeong Whan; Szkudlarek, Emily; Tabor, Whitney

    2016-01-01

    Learning is typically understood as a process in which the behavior of an organism is progressively shaped until it closely approximates a target form. It is easy to comprehend how a motor skill or a vocabulary can be progressively learned-in each case, one can conceptualize a series of intermediate steps which lead to the formation of a proficient behavior. With grammar, it is more difficult to think in these terms. For example, center embedding recursive structures seem to involve a complex interplay between multiple symbolic rules which have to be in place simultaneously for the system to work at all, so it is not obvious how the mechanism could gradually come into being. Here, we offer empirical evidence from a new artificial language (or "artificial grammar") learning paradigm, Locus Prediction, that, despite the conceptual conundrum, recursion acquisition occurs gradually, at least for a simple formal language. In particular, we focus on a variant of the simplest recursive language, a^n b^n, and find evidence that (i) participants trained on two levels of structure (essentially ab and aabb) generalize to the next higher level (aaabbb) more readily than participants trained on one level of structure (ab) combined with a filler sentence; nevertheless, they do not generalize immediately; (ii) participants trained up to three levels (ab, aabb, aaabbb) generalize more readily to four levels than participants trained on two levels generalize to three; (iii) when we present the levels in succession, starting with the lower levels and including more and more of the higher levels, participants show evidence of transitioning between the levels gradually, exhibiting intermediate patterns of behavior on which they were not trained; (iv) the intermediate patterns of behavior are associated with perturbations of an attractor in the sense of dynamical systems theory. We argue that all of these behaviors indicate a theory of mental representation in which recursive

  12. Discovery of a Recursive Principle: An Artificial Grammar Investigation of Human Learning of a Counting Recursion Language

    PubMed Central

    Cho, Pyeong Whan; Szkudlarek, Emily; Tabor, Whitney

    2016-01-01

    Learning is typically understood as a process in which the behavior of an organism is progressively shaped until it closely approximates a target form. It is easy to comprehend how a motor skill or a vocabulary can be progressively learned—in each case, one can conceptualize a series of intermediate steps which lead to the formation of a proficient behavior. With grammar, it is more difficult to think in these terms. For example, center embedding recursive structures seem to involve a complex interplay between multiple symbolic rules which have to be in place simultaneously for the system to work at all, so it is not obvious how the mechanism could gradually come into being. Here, we offer empirical evidence from a new artificial language (or “artificial grammar”) learning paradigm, Locus Prediction, that, despite the conceptual conundrum, recursion acquisition occurs gradually, at least for a simple formal language. In particular, we focus on a variant of the simplest recursive language, a^n b^n, and find evidence that (i) participants trained on two levels of structure (essentially ab and aabb) generalize to the next higher level (aaabbb) more readily than participants trained on one level of structure (ab) combined with a filler sentence; nevertheless, they do not generalize immediately; (ii) participants trained up to three levels (ab, aabb, aaabbb) generalize more readily to four levels than participants trained on two levels generalize to three; (iii) when we present the levels in succession, starting with the lower levels and including more and more of the higher levels, participants show evidence of transitioning between the levels gradually, exhibiting intermediate patterns of behavior on which they were not trained; (iv) the intermediate patterns of behavior are associated with perturbations of an attractor in the sense of dynamical systems theory. We argue that all of these behaviors indicate a theory of mental representation in which recursive
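
    To make the "levels of structure" concrete: level n of the counting-recursion language is simply a^n b^n, so level 1 is "ab", level 2 is "aabb", and so on. A minimal generator and recognizer (illustrative only, not the Locus Prediction task itself):

      def level(n):
          # the single grammatical string at level n of the a^n b^n language
          return "a" * n + "b" * n

      def is_anbn(s):
          n = len(s) // 2
          return n >= 1 and len(s) == 2 * n and s == "a" * n + "b" * n

      print([level(n) for n in range(1, 4)])       # ['ab', 'aabb', 'aaabbb']
      print(is_anbn("aaabbb"), is_anbn("aabbb"))   # True False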

  13. Absence of the Kosterlitz-Thouless fixed points in the Migdal-Kadanoff recursion formulas

    SciTech Connect

    Ito, K.R.

    1985-06-03

    It is shown that the Migdal approximate-renormalization recursion formulas always bring the initial state into the high-temperature region for the U(1)-invariant models at the critical dimensions. This means that the Migdal recursion formulas fail to exhibit the Kosterlitz-Thouless transitions. This is also the case for the recursion formulas of Kadanoff type. The method developed here is extended to non-Abelian systems by some additional tricks.

  14. Genome-wide Identification of Zero Nucleotide Recursive Splicing in Drosophila

    PubMed Central

    Duff, Michael O.; Olson, Sara; Wei, Xintao; Garrett, Sandra C.; Osman, Ahmad; Bolisetty, Mohan; Plocik, Alex; Celniker, Susan; Graveley, Brenton R.

    2015-01-01

    Recursive splicing is a process in which large introns are removed in multiple steps by resplicing at ratchet points (5′ splice sites recreated after splicing) [1]. Recursive splicing was first identified in the Drosophila Ultrabithorax (Ubx) gene [1] and only three additional Drosophila genes have since been experimentally shown to undergo recursive splicing [2,3]. Here, we identify 197 zero nucleotide exon ratchet points in 130 introns of 115 Drosophila genes from total RNA sequencing data generated from developmental time points, dissected tissues, and cultured cells. The sequential nature of recursive splicing was confirmed by identification of lariat introns generated by splicing to and from the ratchet points. We also show that recursive splicing is a constitutive process, that depletion of U2AF inhibits recursive splicing, and that the sequence and function of ratchet points are evolutionarily conserved in Drosophila. Finally, we identified four recursively spliced human genes, one of which is also recursively spliced in Drosophila. Together these results indicate that recursive splicing is commonly used in Drosophila, occurs in humans and provides insight into the mechanisms by which some large introns are removed. PMID:25970244

  15. A Tool Kit for Implementing XML Schema Naming and Design Rules

    SciTech Connect

    Lubell, Joshua; Kulvatunyou, Boonserm

    2005-11-01

    A tool kit being developed at the National Institute of Standards and Technology (NIST) encodes XML schema Naming and Design Rules in a computer-interpretable fashion, enabling automated rule enforcement and improving schema quality.

  16. Using XML technology for the ontology-based semantic integration of life science databases.

    PubMed

    Philippi, Stephan; Köhler, Jacob

    2004-06-01

    Several hundred internet-accessible life science databases, with constantly growing contents and varying areas of specialization, are publicly available. Database integration, consequently, is a fundamental prerequisite for answering complex biological questions. Due to the presence of syntactic, schematic, and semantic heterogeneities, large-scale database integration at present takes considerable effort. As there is growing acceptance of the Extensible Markup Language (XML) as a means for data exchange in the life sciences, this article focuses on the impact of XML technology on database integration in this area. In detail, a general architecture for ontology-driven data integration based on XML technology is introduced, which overcomes some of the traditional problems in this area. As a proof of concept, a prototypical implementation of this architecture based on a native XML database and an expert system shell is described for the realization of a real-world integration scenario.

  17. XML-based approaches for the integration of heterogeneous bio-molecular data

    PubMed Central

    Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David

    2009-01-01

    Background: Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, bio-medical and bioinformatics research, but also raising new problems for their integration and computational processing. Results: In this paper we survey the most interesting and novel approaches for the representation, integration and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. Conclusion: XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, making effective integration of bioinformatics data schemes difficult. The adoption of a few semantically rich standard formats is urgently needed to achieve a seamless integration of the current biological resources. PMID:19828083

  18. Validation and Simplification of the Radiation Therapy Oncology Group Recursive Partitioning Analysis Classification for Glioblastoma

    SciTech Connect

    Li Jing; Wang Meihua; Won, Minhee; Shaw, Edward G.; Coughlin, Christopher; Curran, Walter J.; Mehta, Minesh P.

    2011-11-01

    Purpose: Previous recursive partitioning analysis (RPA) of patients with malignant glioma (glioblastoma multiforme [GBM] and anaplastic astrocytoma [AA]) produced six prognostic groups (I-VI) classified by six factors. We sought here to determine whether the classification for GBM could be improved by using an updated Radiation Therapy Oncology Group (RTOG) GBM database excluding AA and by considering additional baseline variables. Methods and Materials: The new analysis considered 42 baseline variables and 1,672 GBM patients from the expanded RTOG glioma database. Patients receiving radiation only were excluded such that all patients received radiation+carmustine. 'Radiation dose received' was replaced with 'radiation dose assigned.' The new RPA models were compared with the original model by applying them to a test dataset comprising 488 patients from six other RTOG trials. Fitness of the original and new models was evaluated using explained variation. Results: The original RPA model explained more variations in survival in the test dataset than did the new models (20% vs. 15%) and was therefore chosen for further analysis. It was reduced by combining Classes V and VI to produce three prognostic classes (Classes III, IV, and V+VI), as Classes V and VI had indistinguishable survival in the test dataset. The simplified model did not further improve performance (explained variation 18% vs. 20%) but is easier to apply because it involves only four variables: age, performance status, extent of resection, and neurologic function. Applying this simplified model to the updated GBM database resulted in three distinct classes with median survival times of 17.1, 11.2, and 7.5 months for Classes III, IV, and V+VI, respectively. Conclusions: The final model, the simplified original RPA model combining Classes V and VI, resulted in three distinct prognostic groups defined by age, performance status, extent of resection, and neurologic function. This classification will be used

  19. XML-based information system for planetary sciences

    NASA Astrophysics Data System (ADS)

    Carraro, F.; Fonte, S.; Turrini, D.

    2009-04-01

    EuroPlaNet (EPN in the following) has been developed by the planetological community under the "Sixth Framework Programme" (FP6 in the following), the European programme devoted to the improvement of European research efforts through the creation of an internal market for science and technology. The goal of the EPN programme is the creation of a European network aimed at the diffusion of data produced by space missions dedicated to the study of the Solar System. A special place within the EPN programme is held by I.D.I.S. (Integrated and Distributed Information Service). The main goal of IDIS is to offer the planetary science community user-friendly access to the data and information produced by the various types of research activities, i.e. Earth-based observations, space observations, modeling, theory and laboratory experiments. During the FP6 programme, IDIS development consisted of the creation of a series of thematic nodes, each of them specialized in a specific scientific domain, and a technical coordination node. The four thematic nodes are the Atmosphere node, the Plasma node, the Interiors & Surfaces node and the Small Bodies & Dust node. The main task of the nodes has been the building up of selected scientific cases related to the scientific domain of each node. The second task of the EPN nodes has been the creation of a catalogue of resources related to their main scientific theme. Both these efforts have been used as the basis for the development of the main IDIS goal, i.e. the integrated distributed service. An XML-based data model has been developed to describe resources using metadata and to store the metadata within an XML-based database called eXist. A search engine has then been developed in order to allow users to search for resources within the database. Users can select the resource type and can insert one or more values or choose a value among those present in a list, depending on the selected resource. The system searches for all

  1. Design of a recursive vector processor using polynomial splines

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Shen, C. N.

    1980-01-01

    The problem of obtaining smoothed estimates of function values, particularly their derivatives, from a finite set of inaccurate measurements is considered. A recursive two-dimensional vector processor is introduced as an approximation to the nonrecursive constrained least-squares estimation. Here, piecewise bicubic Hermite polynomials are extensively used as approximating functions, and the smoothing integral is converted to a discrete quadratic form. This makes it possible to convert the problem of fitting an approximating function to one of estimating the function values and derivatives at the nodes.

  2. Random recursive trees and the elephant random walk

    NASA Astrophysics Data System (ADS)

    Kürsten, Rüdiger

    2016-03-01

    One class of random walks with infinite memory, so-called elephant random walks, are simple models describing anomalous diffusion. We present a surprising connection between these models and bond percolation on random recursive trees. We use a coupling between the two models to translate results from elephant random walks to the percolation process. We calculate, besides other quantities, exact expressions for the first and the second moment of the root cluster size and of the number of nodes in child clusters of the first generation. We further introduce another model, the skew elephant random walk, and calculate the first and second moment of this process.
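
    A minimal simulation of the elephant random walk itself (the coupling to bond percolation on random recursive trees is not reproduced here): each new step recalls a uniformly chosen earlier step and repeats it with probability p, reversing it otherwise.

      import random

      def elephant_walk(n_steps, p, seed=0):
          rng = random.Random(seed)
          steps = [rng.choice((-1, 1))]           # the first step is unbiased
          for _ in range(n_steps - 1):
              remembered = rng.choice(steps)      # uniform memory over the whole past
              steps.append(remembered if rng.random() < p else -remembered)
          return sum(steps)                       # final position of the walker

      print([elephant_walk(10_000, p) for p in (0.25, 0.5, 0.9)])

    The memory parameter p controls the anomaly; p > 3/4 is commonly reported as the superdiffusive regime, which can be seen empirically by comparing the spread of final positions over many seeds.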

  3. Vision-based recursive estimation of rotorcraft obstacle locations

    NASA Technical Reports Server (NTRS)

    Leblanc, D. J.; Mcclamroch, N. H.

    1992-01-01

    The authors address vision-based passive ranging during nap-of-the-earth (NOE) rotorcraft flight. They consider the problem of estimating the relative location of identifiable features on nearby obstacles, assuming a sequence of noisy camera images and imperfect measurements of the camera's translation and rotation. An iterated extended Kalman filter is used to provide recursive range estimation. The correspondence problem is simplified by predicting and tracking each feature's image within the Kalman filter framework. Simulation results are presented which show convergent estimates and generally successful feature point tracking. Estimation performance degrades for features near the optical axis and for accelerating motions. Image tracking is also sensitive to angular rate.

  4. Random recursive trees and the elephant random walk.

    PubMed

    Kürsten, Rüdiger

    2016-03-01

    One class of random walks with infinite memory, so-called elephant random walks, are simple models describing anomalous diffusion. We present a surprising connection between these models and bond percolation on random recursive trees. We use a coupling between the two models to translate results from elephant random walks to the percolation process. We calculate, besides other quantities, exact expressions for the first and the second moment of the root cluster size and of the number of nodes in child clusters of the first generation. We further introduce another model, the skew elephant random walk, and calculate the first and second moment of this process. PMID:27078296

  5. Integrated Design and Production Reference Integration with ArchGenXML V1.00

    SciTech Connect

    Barter, R H

    2004-07-20

    ArchGenXML is a tool that allows easy creation of Zope products through the use of Archetypes. The Integrated Design and Production Reference (IDPR) should be highly configurable in order to meet the needs of a diverse engineering community. Ease of configuration is key to the success of IDPR. The purpose of this paper is to describe a method of using a UML diagram editor to configure IDPR through ArchGenXML and Archetypes.

  6. NeXML: rich, extensible, and verifiable representation of comparative data and metadata.

    PubMed

    Vos, Rutger A; Balhoff, James P; Caravas, Jason A; Holder, Mark T; Lapp, Hilmar; Maddison, Wayne P; Midford, Peter E; Priyam, Anurag; Sukumaran, Jeet; Xia, Xuhua; Stoltzfus, Arlin

    2012-07-01

    In scientific research, integration and synthesis require a common understanding of where data come from, how much they can be trusted, and what they may be used for. To make such an understanding computer-accessible requires standards for exchanging richly annotated data. The challenges of conveying reusable data are particularly acute in regard to evolutionary comparative analysis, which comprises an ever-expanding list of data types, methods, research aims, and subdisciplines. To facilitate interoperability in evolutionary comparative analysis, we present NeXML, an XML standard (inspired by the current standard, NEXUS) that supports exchange of richly annotated comparative data. NeXML defines syntax for operational taxonomic units, character-state matrices, and phylogenetic trees and networks. Documents can be validated unambiguously. Importantly, any data element can be annotated, to an arbitrary degree of richness, using a system that is both flexible and rigorous. We describe how the use of NeXML by the TreeBASE and Phenoscape projects satisfies user needs that cannot be satisfied with other available file formats. By relying on XML Schema Definition, the design of NeXML facilitates the development and deployment of software for processing, transforming, and querying documents. The adoption of NeXML for practical use is facilitated by the availability of (1) an online manual with code samples and a reference to all defined elements and attributes, (2) programming toolkits in most of the languages used commonly in evolutionary informatics, and (3) input-output support in several widely used software applications. An active, open, community-based development process enables future revision and expansion of NeXML. PMID:22357728

  7. NeXML: Rich, Extensible, and Verifiable Representation of Comparative Data and Metadata

    PubMed Central

    Vos, Rutger A.; Balhoff, James P.; Caravas, Jason A.; Holder, Mark T.; Lapp, Hilmar; Maddison, Wayne P.; Midford, Peter E.; Priyam, Anurag; Sukumaran, Jeet; Xia, Xuhua; Stoltzfus, Arlin

    2012-01-01

    In scientific research, integration and synthesis require a common understanding of where data come from, how much they can be trusted, and what they may be used for. To make such an understanding computer-accessible requires standards for exchanging richly annotated data. The challenges of conveying reusable data are particularly acute in regard to evolutionary comparative analysis, which comprises an ever-expanding list of data types, methods, research aims, and subdisciplines. To facilitate interoperability in evolutionary comparative analysis, we present NeXML, an XML standard (inspired by the current standard, NEXUS) that supports exchange of richly annotated comparative data. NeXML defines syntax for operational taxonomic units, character-state matrices, and phylogenetic trees and networks. Documents can be validated unambiguously. Importantly, any data element can be annotated, to an arbitrary degree of richness, using a system that is both flexible and rigorous. We describe how the use of NeXML by the TreeBASE and Phenoscape projects satisfies user needs that cannot be satisfied with other available file formats. By relying on XML Schema Definition, the design of NeXML facilitates the development and deployment of software for processing, transforming, and querying documents. The adoption of NeXML for practical use is facilitated by the availability of (1) an online manual with code samples and a reference to all defined elements and attributes, (2) programming toolkits in most of the languages used commonly in evolutionary informatics, and (3) input–output support in several widely used software applications. An active, open, community-based development process enables future revision and expansion of NeXML. PMID:22357728

  8. A Generic Data Form Designer Based on XML/XSL Technology

    PubMed Central

    Kim, Ju Han; Safran, Charles; Slack, Warner V.

    2000-01-01

    Extensible Markup Language (XML) and Extensible Stylesheet Language (XSL) are newly developed Internet protocols. Development of custom data entry forms requires significant programming. Visual design tools and a modifiable, template-driven approach may facilitate this process. However, these approaches generally require the predefinition of data form element types. This paper describes an approach enabling post hoc definition of elementary and composite data entry form element types using XML/XSL technologies.

  9. Using XML and XSLT for flexible elicitation of mental-health risk knowledge.

    PubMed

    Buckingham, C D; Ahmed, A; Adams, A E

    2007-03-01

    Current tools for assessing risks associated with mental-health problems require assessors to make high-level judgements based on clinical experience. This paper describes how new technologies can enhance qualitative research methods to identify lower-level cues underlying these judgements, which can be collected by people without a specialist mental-health background. Content analysis of interviews with 46 multidisciplinary mental-health experts exposed the cues and their interrelationships, which were represented by a mind map using software that stores maps as XML. All 46 mind maps were integrated into a single XML knowledge structure and analysed by a Lisp program to generate quantitative information about the numbers of experts associated with each part of it. The knowledge was refined by the experts, using software developed in Flash to record their collective views within the XML itself. These views specified how the XML should be transformed by XSLT, a technology for rendering XML, which resulted in a validated hierarchical knowledge structure associating patient cues with risks. Changing knowledge elicitation requirements were accommodated by flexible transformations of XML data using XSLT, which also facilitated generation of multiple data-gathering tools suiting different assessment circumstances and levels of mental-health knowledge. PMID:17365646

  10. Using XML and Java for Astronomical Instrumentation Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Koons, Lisa; Sall, Ken; Warsaw, Craig

    2000-01-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, and communication mechanisms. Although the current effort is targeted for the High-resolution Airborne Wideband Camera, a first-light instrument of the Stratospheric Observatory for Infrared Astronomy, the framework is designed to be generic and extensible so that it can be applied to any instrument.

  11. Modeling relationships between calving traits: a comparison between standard and recursive mixed models

    PubMed Central

    2010-01-01

    Background: The use of structural equation models for the analysis of recursive and simultaneous relationships between phenotypes has become more popular recently. The aim of this paper is to illustrate how these models can be applied in animal breeding to achieve parameterizations of different levels of complexity and, more specifically, to model phenotypic recursion between three calving traits: gestation length (GL), calving difficulty (CD) and stillbirth (SB). All recursive models considered here postulate heterogeneous recursive relationships between GL and liabilities to CD and SB, and between liability to CD and liability to SB, depending on categories of GL phenotype. Methods: Four models were compared in terms of goodness of fit and predictive ability: 1) standard mixed model (SMM), a model with unstructured (co)variance matrices; 2) recursive mixed model 1 (RMM1), assuming that residual correlations are due to the recursive relationships between phenotypes; 3) RMM2, assuming that correlations between residuals and contemporary groups are due to recursive relationships between phenotypes; and 4) RMM3, postulating that the correlations between genetic effects, contemporary groups and residuals are due to recursive relationships between phenotypes. Results: For all the RMM considered, the estimates of the structural coefficients were similar. Results revealed a nonlinear relationship between GL and the liabilities both to CD and to SB, and a linear relationship between the liabilities to CD and SB. Differences in terms of goodness of fit and predictive ability of the models considered were negligible, suggesting that RMM3 is plausible. Conclusions: The applications examined in this study suggest the plausibility of a nonlinear recursive effect from GL onto CD and SB. Also, the fact that the most restrictive model RMM3, which assumes that the only cause of correlation is phenotypic recursion, performs as well as the others indicates that the phenotypic recursion

  12. Parsimonious extreme learning machine using recursive orthogonal least squares.

    PubMed

    Wang, Ning; Er, Meng Joo; Han, Min

    2014-10-01

    Novel constructive and destructive parsimonious extreme learning machines (CP- and DP-ELM) are proposed in this paper. By virtue of the proposed ELMs, parsimonious structure and excellent generalization of multi-input multi-output single hidden-layer feedforward networks (SLFNs) are obtained. The proposed ELMs are developed by innovative decomposition of the recursive orthogonal least squares procedure into sequential partial orthogonalization (SPO). The salient features of the proposed approaches are as follows: 1) Initial hidden nodes are randomly generated by the ELM methodology and recursively orthogonalized into an upper triangular matrix with dramatic reduction in matrix size; 2) the constructive SPO in the CP-ELM focuses on the partial matrix with the subcolumn of the selected regressor including nonzeros as the first column while the destructive SPO in the DP-ELM operates on the partial matrix including elements determined by the removed regressor; 3) termination criteria for CP- and DP-ELM are simplified by the additional residual error reduction method; and 4) the output weights of the SLFN need not be solved in the model selection procedure and are derived from the final upper triangular equation by backward substitution. Both single- and multi-output real-world regression data sets are used to verify the effectiveness and superiority of the CP- and DP-ELM in terms of parsimonious architecture and generalization accuracy. Innovative applications to nonlinear time-series modeling demonstrate superior identification results. PMID:25291736

  13. Grid Based Nonlinear Filtering Revisited: Recursive Estimation & Asymptotic Optimality

    NASA Astrophysics Data System (ADS)

    Kalogerias, Dionysios S.; Petropulu, Athina P.

    2016-08-01

    We revisit the development of grid based recursive approximate filtering of general Markov processes in discrete time, partially observed in conditionally Gaussian noise. The grid based filters considered rely on two types of state quantization: the Markovian type and the marginal type. We propose a set of novel, relaxed sufficient conditions, ensuring strong and fully characterized pathwise convergence of these filters to the respective MMSE state estimator. In particular, for marginal state quantizations, we introduce the notion of conditional regularity of stochastic kernels, which, to the best of our knowledge, constitutes the most relaxed condition proposed, under which asymptotic optimality of the respective grid based filters is guaranteed. Further, we extend our convergence results, including filtering of bounded and continuous functionals of the state, as well as recursive approximate state prediction. For both Markovian and marginal quantizations, the whole development of the respective grid based filters relies more on linear-algebraic techniques and less on measure theoretic arguments, making the presentation considerably shorter and technically simpler.

  14. Change-point detection for recursive Bayesian geoacoustic inversions.

    PubMed

    Tan, Bien Aik; Gerstoft, Peter; Yardim, Caglar; Hodgkiss, William S

    2015-04-01

    In order to carry out geoacoustic inversion in low signal-to-noise ratio (SNR) conditions, extended duration observations coupled with source and/or receiver motion may be necessary. As a result, change in the underlying model parameters due to time or space is anticipated. In this paper, an inversion method is proposed for cases when the model parameters change abruptly or slowly. A model parameter change-point detection method is developed to detect the change in the model parameters using the importance samples and corresponding weights that are already available from the recursive Bayesian inversion. If the model parameters change abruptly, a change-point will be detected and the inversion will restart with the pulse measurement after the change-point. If the model parameters change gradually, the inversion (based on constant model parameters) may proceed until the accumulated model parameter mismatch is significant and triggers the detection of a change-point. These change-point detections form the heuristics for controlling the coherent integration time in recursive Bayesian inversion. The method is demonstrated in simulation with parameters corresponding to the low SNR, 100-900 Hz linear frequency modulation pulses observed in the Shallow Water 2006 experiment [Tan, Gerstoft, Yardim, and Hodgkiss, J. Acoust. Soc. Am. 136, 1187-1198 (2014)].

  15. Efficient routing and broadcasting in recursive interconnection networks

    SciTech Connect

    Fernandes, R.; Friesen, D.K.; Kanevsky, A.

    1994-12-31

    The WK-Recursive Network (WKRN) is a hierarchical interconnection network that is recursively defined and has excellent properties for scalable message-passing multicomputer systems. In this paper, we present efficient routing and broadcasting schemes in a WKRN. For efficient routing, we define the MP-graph between the source and destination nodes of the message. For efficient broadcasting, we define the EDHP-graph and the NDST-graph. The MP-graph can also be used for message routing in the presence of faulty nodes. Similarly, the EDHP-graph (the NDST-graph) can be used for message broadcast in the presence of faulty links (nodes). Fault-tolerant communication schemes using these graphs have the advantage that no information about the presence or location of faulty components is required. Moreover, the MP-graph and the NDST-graph can be used under different fault models. We analyze the communication delays for message routing (broadcast) along MP-graphs (EDHP-graphs and NDST-graphs) under fault-free and faulty conditions.

  16. Fine tuning points of generating function construction for linear recursions

    NASA Astrophysics Data System (ADS)

    Yolcu, Bahar; Demiralp, Metin

    2014-10-01

    Recursions are important mathematical tools, since many systems are mathematically modelled in a way that ultimately leads to these equations, whose algebraic nature is comparatively simple. They also fit computer programming needs well in many circumstances. However, it is generally desirable to find the asymptotic behaviour of the general term of the relevant sequence, for reasons of convergence and therefore practicality. One common approach to finding the asymptotic behaviour of the general term, as its index grows without bound, is an integral representation over a generating function that does not depend on individual sequence elements. This is attempted for almost all types of recursions, although the linear cases are the most important because they can be investigated effectively with linear-algebraic tools. Although this may seem rather trivial, there are many theoretical fine-tuning issues in the construction of correct integral representations over appropriate intervals on the real axis or paths in complex domains. This work focuses on this issue, starting from scratch, for a better understanding of the matter. The example cases are chosen to best illuminate these situations and to provide information for future generalization, even though the work can be considered introductory.
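
    As a standard worked illustration of this construction (not taken from the paper itself), consider the linear recursion a_{n+2} = a_{n+1} + a_n with given a_0 and a_1. Multiplying by x^{n+2} and summing over n >= 0 relates the generating function G(x) = \sum_{n \ge 0} a_n x^n to itself, in LaTeX notation:

      G(x) - a_0 - a_1 x = x\,\bigl(G(x) - a_0\bigr) + x^2 G(x)
      \quad\Longrightarrow\quad
      G(x) = \frac{a_0 + (a_1 - a_0)\,x}{1 - x - x^2}.

    The asymptotics of a_n are then read off from the singularity of G nearest the origin: the smaller positive root of 1 - x - x^2 = 0 is x_0 = (\sqrt{5} - 1)/2, so a_n grows like x_0^{-n} = ((1 + \sqrt{5})/2)^n. Turning such a closed form into a contour or interval integral representation, and choosing that contour correctly, is exactly the kind of fine tuning the abstract refers to.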

  17. Recursive mentalizing and common knowledge in the bystander effect.

    PubMed

    Thomas, Kyle A; De Freitas, Julian; DeScioli, Peter; Pinker, Steven

    2016-05-01

    The more potential helpers there are, the less likely any individual is to help. A traditional explanation for this bystander effect is that responsibility diffuses across the multiple bystanders, diluting the responsibility of each. We investigate an alternative, which combines the volunteer's dilemma (each bystander is best off if another responds) with recursive theory of mind (each infers what the others know about what he knows) to predict that actors will strategically shirk when they think others feel compelled to help. In 3 experiments, participants responded to a (fictional) person who needed help from at least 1 volunteer. Participants were in groups of 2 or 5 and had varying information about whether other group members knew that help was needed. As predicted, people's decision to help zigzagged with the depth of their asymmetric, recursive knowledge (e.g., "John knows that Michael knows that John knows help is needed"), and replicated the classic bystander effect when they had common knowledge (everyone knowing what everyone knows). The results demonstrate that the bystander effect may result not from a mere diffusion of responsibility but specifically from actors' strategic computations.

  18. REQUEST: A Recursive QUEST Algorithm for Sequential Attitude Determination

    NASA Technical Reports Server (NTRS)

    Bar-Itzhack, Itzhack Y.

    1996-01-01

    In order to find the attitude of a spacecraft with respect to a reference coordinate system, vector measurements are taken. The vectors are pairs of measurements of the same generalized vector, taken in the spacecraft body coordinates, as well as in the reference coordinate system. We are interested in finding the best estimate of the transformation between these coordinate systems. The algorithm called QUEST yields that estimate where attitude is expressed by a quaternion. QUEST is an efficient algorithm which provides a least squares fit of the quaternion of rotation to the vector measurements. QUEST, however, is a single time point (single frame) batch algorithm, thus measurements that were taken at previous time points are discarded. The algorithm presented in this work provides a recursive routine which considers all past measurements. The algorithm is based on the fact that the so-called K matrix, one of whose eigenvectors is the sought quaternion, is linearly related to the measured pairs, and on the ability to propagate K. The extraction of the appropriate eigenvector is done according to the classical QUEST algorithm. This stage, however, can be eliminated, and the computation simplified, if a standard eigenvalue-eigenvector solver algorithm is used. The development of the recursive algorithm is presented and illustrated via a numerical example.
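
    A minimal sketch of the core step, assuming NumPy: Davenport's K matrix is assembled from weighted vector pairs and the optimal quaternion is read off as the eigenvector belonging to its largest eigenvalue, in line with the standard eigenvalue-solver variant mentioned above. The fading-memory accumulation of K used here is only an illustrative stand-in for the propagation step of the actual REQUEST algorithm, and the synthetic measurements are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def k_matrix(body_vecs, ref_vecs, weights):
    """Davenport K matrix built from weighted unit-vector pairs."""
    B = sum(w * np.outer(b, r) for b, r, w in zip(body_vecs, ref_vecs, weights))
    z = sum(w * np.cross(b, r) for b, r, w in zip(body_vecs, ref_vecs, weights))
    sigma = np.trace(B)
    K = np.zeros((4, 4))
    K[:3, :3] = B + B.T - sigma * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = sigma
    return K

def quaternion_from_k(K):
    """Optimal quaternion (vector part first, scalar last) = eigenvector of K
    belonging to its largest eigenvalue."""
    _, vecs = np.linalg.eigh(K)              # eigh returns ascending eigenvalues
    return vecs[:, -1]

# synthetic data: true attitude is a 30-degree rotation about the z axis
ang = np.radians(30.0)
A_true = np.array([[np.cos(ang),  np.sin(ang), 0.0],
                   [-np.sin(ang), np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
refs = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
bodys = [A_true @ r + 0.01 * rng.standard_normal(3) for r in refs]
bodys = [b / np.linalg.norm(b) for b in bodys]

# recursive accumulation with a fading-memory factor (illustrative choice,
# standing in for the K-propagation step of the paper)
K_acc, fade = np.zeros((4, 4)), 0.95
for b, r in zip(bodys, refs):                # feed vector pairs one at a time
    K_acc = fade * K_acc + k_matrix([b], [r], [1.0])
    q_est = quaternion_from_k(K_acc)

print("estimated quaternion   :", q_est)
print("rotation angle (deg)   :", np.degrees(2 * np.arccos(abs(q_est[3]))))
```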

  19. Efficient Execution of Recursive Programs on Commodity Vector Hardware

    SciTech Connect

    Ren, Bin; Jo, Youngjoon; Krishnamoorthy, Sriram; Agrawal, Kunal; Kulkarni, Milind

    2015-06-13

    The pursuit of computational efficiency has led to the proliferation of throughput-oriented hardware, from GPUs to increasingly-wide vector units on commodity processors and accelerators. This hardware is designed to efficiently execute data-parallel computations in a vectorized manner. However, many algorithms are more naturally expressed as divide-and-conquer, recursive, task-parallel computations; in the absence of data parallelism, it seems that such algorithms are not well-suited to throughput-oriented architectures. This paper presents a set of novel code transformations that expose the data-parallelism latent in recursive, task-parallel programs. These transformations facilitate straightforward vectorization of task-parallel programs on commodity hardware. We also present scheduling policies that maintain high utilization of vector resources while limiting space usage. Across several task-parallel benchmarks, we demonstrate both efficient vector resource utilization and substantial speedup on chips using Intel's SSE4.2 vector units as well as accelerators using Intel's AVX512 units.

  1. Automatic line detection in document images using recursive morphological transforms

    NASA Astrophysics Data System (ADS)

    Kong, Bin; Chen, Su S.; Haralick, Robert M.; Phillips, Ihsin T.

    1995-03-01

    In this paper, we describe a system that detects lines of various types, e.g., solid lines and dotted lines, on document images. The main techniques are based on the recursive morphological transforms, namely the recursive opening and closing transforms. The advantages of the transforms are that they can perform binary opening and closing with any sized structuring element simultaneously in constant time per pixel, and that they offer a solution to morphological image analysis problems where the sizes of the structuring elements have to be determined after the examination of the image itself. The system is evaluated on about 1,200 totally ground-truthed IRS tax form images of different qualities. The line detection output is compared with a set of hand-drawn groundtruth lines. Statistics such as the number and rate of correct detections, missed detections, and false alarms are calculated. The performance of 32 algorithms for solid line detection is compared to find the best one. The optimal algorithm tuning parameter settings could be estimated on the fly using a regression tree.
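
    A hedged sketch of the basic idea, using an ordinary (non-recursive) morphological opening from SciPy with long, thin structuring elements; the recursive opening and closing transforms of the paper, which handle all structuring-element sizes simultaneously, are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def detect_solid_lines(binary_img, min_len=30):
    """Keep pixels that survive an opening with a long, thin structuring
    element -- a plain (non-recursive) stand-in for the paper's transforms."""
    h_se = np.ones((1, min_len), dtype=bool)     # horizontal lines
    v_se = np.ones((min_len, 1), dtype=bool)     # vertical lines
    horiz = ndimage.binary_opening(binary_img, structure=h_se)
    vert = ndimage.binary_opening(binary_img, structure=v_se)
    return horiz, vert

# tiny synthetic "form": one horizontal rule, one vertical rule, one text blob
img = np.zeros((100, 200), dtype=bool)
img[50, 20:180] = True                           # horizontal line
img[10:90, 120] = True                           # vertical line
img[30:33, 40:45] = True                         # small blob (rejected)
horiz, vert = detect_solid_lines(img)
print(horiz.sum(), "horizontal-line pixels,", vert.sum(), "vertical-line pixels")
```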

  2. Optimal Recursive Digital Filters for Active Bending Stabilization

    NASA Technical Reports Server (NTRS)

    Orr, Jeb S.

    2013-01-01

    In the design of flight control systems for large flexible boosters, it is common practice to utilize active feedback control of the first lateral structural bending mode so as to suppress transients and reduce gust loading. Typically, active stabilization or phase stabilization is achieved by carefully shaping the loop transfer function in the frequency domain via the use of compensating filters combined with the frequency response characteristics of the nozzle/actuator system. In this paper we present a new approach for parameterizing and determining optimal low-order recursive linear digital filters so as to satisfy phase shaping constraints for bending and sloshing dynamics while simultaneously maximizing attenuation in other frequency bands of interest, e.g. near higher frequency parasitic structural modes. By parameterizing the filter directly in the z-plane with certain restrictions, the search space of candidate filter designs that satisfy the constraints is restricted to stable, minimum phase recursive low-pass filters with well-conditioned coefficients. Combined with optimal output feedback blending from multiple rate gyros, the present approach enables rapid and robust parametrization of autopilot bending filters to attain flight control performance objectives. Numerical results are presented that illustrate the application of the present technique to the development of rate gyro filters for an exploration-class multi-engined space launch vehicle.
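
    A minimal sketch, assuming SciPy, of what parameterizing a filter directly in the z-plane can look like: a second-order low-pass section is built from an explicit pole/zero placement, and its gain and phase are then inspected at a hypothetical bending-mode frequency and in a higher-frequency band. The sample rate, frequencies and pole radius are invented for illustration, and no constrained optimization is performed.

```python
import numpy as np
from scipy import signal

fs = 100.0                    # control-loop sample rate, Hz (assumed)
f_bend = 3.0                  # hypothetical first bending-mode frequency, Hz
f_cut = 5.0                   # low-pass corner, Hz

# place a stable complex pole pair directly in the z-plane; a double zero at
# z = -1 gives a conventional low-pass numerator
r, w0 = 0.85, 2 * np.pi * f_cut / fs
poles = [r * np.exp(1j * w0), r * np.exp(-1j * w0)]
zeros = [-1.0, -1.0]
b, a = signal.zpk2tf(zeros, poles, k=1.0)
b, a = np.real(b), np.real(a)                # conjugate pair -> real coefficients
b = b * (np.sum(a) / np.sum(b))              # unity gain at DC (z = 1)

# inspect gain/phase at the bending frequency and in a high-frequency band
w, h = signal.freqz(b, a, worN=2048, fs=fs)
idx = np.argmin(np.abs(w - f_bend))
print("gain  at bending mode : %.2f dB" % (20 * np.log10(np.abs(h[idx]))))
print("phase at bending mode : %.1f deg" % np.degrees(np.angle(h[idx])))
hi_band = w > 20.0
print("worst high-band gain  : %.2f dB" % (20 * np.log10(np.abs(h[hi_band]).max())))
```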

  3. Recursive Ant Colony Global Optimization: a new technique for the inversion of geophysical data

    NASA Astrophysics Data System (ADS)

    Gupta, D. K.; Gupta, J. P.; Arora, Y.; Singh, U. K.

    2011-12-01

    We present a new method, the Recursive Ant Colony Global Optimization (RACO) technique, a modified form of general ACO, which can be used to find the best solutions to inversion problems in geophysics. RACO simulates the social behaviour of ants to find the best path between the nest and the food source. A new term, depth, has been introduced, which controls the extent of recursion. A selective number of cities qualify for the successive depth. The results of one depth are used to construct the models for the next depth, and the range of values for each of the parameters is reduced without any change to the number of models. The three additional steps performed after each depth are pheromone tracking, pheromone updating and city selection. One of the advantages of RACO over ACO is that if a problem has multiple solutions, then pheromone accumulation will take place at more than one city, thereby leading to the formation of multiple nested ACO loops within the ACO loop of the previous depth. Also, while the convergence of ACO is almost linear, RACO shows exponential convergence and hence is faster than ACO. RACO also compares favorably with some other global optimization techniques, as it does not require any initial values to be assigned to the parameters. The method has been tested on some mathematical functions, synthetic self-potential (SP) and synthetic gravity data. The obtained results reveal the efficiency and practicability of the method. The method is found to be efficient enough to solve the problems of SP and gravity anomalies due to a horizontal cylinder, a sphere, an inclined sheet and multiple idealized bodies buried inside the earth. These anomalies with and without noise were inverted using the RACO algorithm. The obtained results were compared with those obtained from the conventional methods and it was found that RACO results are more accurate. Finally, this optimization technique was applied to real field data collected over the Surda

  4. Legislative update.

    PubMed

    1999-04-01

    Updates are presented from nine States on HIV-related legislation. Legislative topics include HIV case surveillance, testing, corrections, safety of health-care workers, HIV status notification, the definition of disability, viatical settlements, and confidentiality breaches. PMID:11366638

  5. Generalized Path Analysis and Generalized Simultaneous Equations Model for Recursive Systems with Responses of Mixed Types

    ERIC Educational Resources Information Center

    Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang

    2006-01-01

    This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…

  6. Round-off error propagation in four generally applicable, recursive, least-squares-estimation schemes

    NASA Technical Reports Server (NTRS)

    Verhaegen, M. H.

    1987-01-01

    The numerical robustness of four generally applicable, recursive, least-squares-estimation schemes is analyzed by means of a theoretical round-off propagation study. This study highlights a number of practical, interesting insights into widely used recursive least-squares schemes. These insights have been confirmed in an experimental study as well.

  7. Stochastic and recursive calibration for operational, large-scale, agricultural land and water use management models

    NASA Astrophysics Data System (ADS)

    Maneta, M. P.; Kimball, J. S.; Jencso, K. G.

    2015-12-01

    Managing the impact of climatic cycles on agricultural production, on land allocation, and on the state of active and projected water sources is challenging. This is because in addition to the uncertainties associated with climate projections, it is difficult to anticipate how farmers will respond to climatic change or to economic and policy incentives. Some sophisticated decision support systems available to water managers consider farmers' adaptive behavior but they are data intensive and difficult to apply operationally over large regions. Satellite-based observational technologies, in conjunction with models and assimilation methods, create an opportunity for new, cost-effective analysis tools to support policy and decision-making over large spatial extents at seasonal scales. We present an integrated modeling framework that can be driven by satellite remote sensing to enable robust regional assessment and prediction of climatic and policy impacts on agricultural production, water resources, and management decisions. The core of this framework is a widely used model of agricultural production and resource allocation adapted to be used in conjunction with remote sensing inputs to quantify the amount of land and water farmers allocate for each crop they choose to grow on a seasonal basis in response to reduced or enhanced access to water due to climatic or policy restrictions. A recursive Bayesian update method is used to adjust the model parameters by assimilating information on crop acreage, production, and crop evapotranspiration as a proxy for water use that can be estimated from high spatial resolution satellite remote sensing. The data assimilation framework blends new and old information to avoid over-calibration to the specific conditions of a single year and permits the updating of parameters to track gradual changes in the agricultural system. This integrated framework provides an operational means of monitoring and forecasting what crops will be grown

  8. James Webb Space Telescope XML Database: From the Beginning to Today

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Fatig, Curtis C.

    2005-01-01

    The James Webb Space Telescope (JWST) Project has been defining, developing, and exercising the use of a common eXtensible Markup Language (XML) for the command and telemetry (C&T) database structure. JWST is the first large NASA space mission to use XML for databases. The JWST project started developing the concepts for the C&T database in 2002. The database will need to last at least 20 years since it will be used beginning with flight software development, continuing through Observatory integration and test (I&T) and through operations. Also, a database tool kit has been provided to the 18 various flight software development laboratories located in the United States, Europe, and Canada that allows the local users to create their own databases. Recently the JWST Project has been working with the Jet Propulsion Laboratory (JPL) and Object Management Group (OMG) XML Telemetry and Command Exchange (XTCE) personnel to provide all the information needed by JWST and JPL for exchanging database information using a XML standard structure. The lack of standardization requires custom ingest scripts for each ground system segment, increasing the cost of the total system. Providing a non-proprietary standard of the telemetry and command database definition formation will allow dissimilar systems to communicate without the need for expensive mission specific database tools and testing of the systems after the database translation. The various ground system components that would benefit from a standardized database are the telemetry and command systems, archives, simulators, and trending tools. JWST has exchanged the XML database with the Eclipse, EPOCH, ASIST ground systems, Portable spacecraft simulator (PSS), a front-end system, and Integrated Trending and Plotting System (ITPS) successfully. This paper will discuss how JWST decided to use XML, the barriers to a new concept, experiences utilizing the XML structure, exchanging databases with other users, and issues that have
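
    For illustration only, the sketch below parses a toy command-and-telemetry definition with Python's standard xml.etree.ElementTree; the element and attribute names are invented and do not reflect the JWST database schema or XTCE.

```python
import xml.etree.ElementTree as ET

# A toy command-and-telemetry definition; the element names are invented for
# illustration and are not the JWST or XTCE schema.
doc = """
<database mission="DEMO">
  <telemetry>
    <parameter name="BATT_VOLT" type="float" units="V"/>
    <parameter name="MODE" type="enum" states="SAFE,IDLE,SCIENCE"/>
  </telemetry>
  <commands>
    <command name="SET_MODE" opcode="0x1A">
      <arg name="target_mode" type="enum" states="SAFE,IDLE,SCIENCE"/>
    </command>
  </commands>
</database>
"""

root = ET.fromstring(doc)
for p in root.findall("./telemetry/parameter"):
    print("TLM", p.get("name"), p.get("type"), p.get("units"))
for c in root.findall("./commands/command"):
    args = [a.get("name") for a in c.findall("arg")]
    print("CMD", c.get("name"), "opcode", c.get("opcode"), "args", args)
```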

  9. Pedestrian navigation system using XML-based data

    NASA Astrophysics Data System (ADS)

    Moro, Maiko; Tanaka, Kenichiro; Utagawa, Yuka; Shigeno, Hiroshi; Matsushita, Yutaka

    2001-11-01

    In this paper, we discuss pedestrian navigation using a cellular phone. To offer navigation information that is intelligible to the user, and to cope with the small display area of a cellular phone, we provide navigation sentences and landscape images from the user's viewpoint. When a pedestrian walks toward a destination, landmarks such as buildings and crossings lie along the way. We therefore provide two navigation sentences at every landmark, for example 'Go to the bank at the corner' and 'Turn right at the bank.' At points that are important or easy to mistake, the system also provides landscape images so that users can check their direction. With this detailed information, navigation sentences plus landscape images, it is easy for users to reach the destination. In addition, because the system does not hold all of the navigation data itself, it has only a small amount of data to manage. The navigation data are created by an informer associated with the destination and uploaded to the informer's web site. An informer who knows the way to the destination can thus supply the route for users visiting it for the first time, and new guidance can be assembled by reusing navigation data that others have already created. The guidance data, describing the route from a station to the destination, are encoded in XML (eXtensible Markup Language). A pedestrian navigation system using these navigation data on a cellular phone has been implemented. The proposed system realizes intelligible route guidance that suits both the small display area of a cellular phone and the needs of pedestrians.

  10. Recursive dynamics for flexible multibody systems using spatial operators

    NASA Technical Reports Server (NTRS)

    Jain, A.; Rodriguez, G.

    1990-01-01

    Due to their structural flexibility, spacecraft and space manipulators are multibody systems with complex dynamics and possess a large number of degrees of freedom. Here the spatial operator algebra methodology is used to develop a new dynamics formulation and spatially recursive algorithms for such flexible multibody systems. A key feature of the formulation is that the operator description of the flexible system dynamics is identical in form to the corresponding operator description of the dynamics of rigid multibody systems. A significant advantage of this unifying approach is that it allows ideas and techniques for rigid multibody systems to be easily applied to flexible multibody systems. The algorithms use standard finite-element and assumed modes models for the individual body deformation. A Newton-Euler Operator Factorization of the mass matrix of the multibody system is first developed. It forms the basis for recursive algorithms such as for the inverse dynamics, the computation of the mass matrix, and the composite body forward dynamics for the system. Subsequently, an alternative Innovations Operator Factorization of the mass matrix, each of whose factors is invertible, is developed. It leads to an operator expression for the inverse of the mass matrix, and forms the basis for the recursive articulated body forward dynamics algorithm for the flexible multibody system. For simplicity, most of the development here focuses on serial chain multibody systems. However, extensions of the algorithms to general topology flexible multibody systems are described. While the computational cost of the algorithms depends on factors such as the topology and the amount of flexibility in the multibody system, in general, it appears that in contrast to the rigid multibody case, the articulated body forward dynamics algorithm is the more efficient algorithm for flexible multibody systems containing even a small number of flexible bodies. The variety of algorithms described

  11. Recursive Subsystems in Aphasia and Alzheimer's Disease: Case Studies in Syntax and Theory of Mind.

    PubMed

    Bánréti, Zoltán; Hoffmann, Ildikó; Vincze, Veronika

    2016-01-01

    The relationship between recursive sentence embedding and theory-of-mind (ToM) inference is investigated in three persons with Broca's aphasia, two persons with Wernicke's aphasia, and six persons with mild and moderate Alzheimer's disease (AD). We asked questions of four types about photographs of various real-life situations. Type 4 questions asked participants about intentions, thoughts, or utterances of the characters in the pictures ("What may X be thinking/asking Y to do?"). The expected answers typically involved subordinate clauses introduced by conjunctions or direct quotations of the characters' utterances. Broca's aphasics did not produce answers with recursive sentence embedding. Rather, they projected themselves into the characters' mental states and gave direct answers in the first person singular, with relevant ToM content. We call such replies "situative statements." Where the question concerned the mental state of the character but did not require an answer with sentence embedding ("What does X hate?"), aphasics gave descriptive answers rather than situative statements. Most replies given by persons with AD to Type 4 questions were grammatical instances of recursive sentence embedding. They also gave a few situative statements but the ToM content of these was irrelevant. In more than one third of their well-formed sentence embeddings, too, they conveyed irrelevant ToM contents. Persons with moderate AD were unable to pass secondary false belief tests. The results reveal double dissociation: Broca's aphasics are unable to access recursive sentence embedding but they can make appropriate ToM inferences; moderate AD persons make the wrong ToM inferences but they are able to access recursive sentence embedding. The double dissociation may be relevant for the nature of the relationship between the two recursive capacities. Broca's aphasics compensated for the lack of recursive sentence embedding by recursive ToM reasoning represented in very simple

  12. Source localization using recursively applied and projected (RAP) MUSIC

    SciTech Connect

    Mosher, J.C.; Leahy, R.M.

    1998-03-01

    A new method for source localization is described that is based on a modification of the well-known multiple signal classification (MUSIC) algorithm. In classical MUSIC, the array manifold vector is projected onto an estimate of the signal subspace, but errors in the estimate can make location of multiple sources difficult. Recursively applied and projected (RAP) MUSIC uses each successively located source to form an intermediate array gain matrix, and projects both the array manifold and the signal subspace estimate into its orthogonal complement. The MUSIC projection is then performed in this reduced subspace. Using the metric of principal angles, the authors describe a general form of the RAP-MUSIC algorithm for the case of diversely polarized sources. Through a uniform linear array simulation, the authors demonstrate the improved Monte Carlo performance of RAP-MUSIC relative to MUSIC and two other sequential subspace methods, S-MUSIC and IES-MUSIC.
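
    A simplified, hedged sketch of the recursive projection idea for a uniform linear array, assuming NumPy: after each source is located, the array manifold and the signal-subspace estimate are projected onto the orthogonal complement of the located steering vectors before the next subspace-correlation search. Details such as the handling of the shrinking subspace dimension differ from the published algorithm, and the scenario parameters are invented.

```python
import numpy as np

def steering(thetas, m):
    """Steering vectors of a half-wavelength uniform linear array."""
    n = np.arange(m)[:, None]
    return np.exp(1j * np.pi * n * np.sin(thetas)[None, :])

def rap_music(R, n_src, grid):
    """Simplified RAP-MUSIC: project out each located source, then repeat
    the subspace-correlation search in the reduced subspace."""
    m = R.shape[0]
    A = steering(grid, m)
    found, B = [], np.zeros((m, 0), dtype=complex)
    for _ in range(n_src):
        P = np.eye(m) - B @ np.linalg.pinv(B) if B.shape[1] else np.eye(m)
        _, V = np.linalg.eigh(P @ R @ P.conj().T)
        Us = V[:, -n_src:]                       # (re-)estimated signal subspace
        Ap = P @ A
        corr = np.linalg.norm(Us.conj().T @ Ap, axis=0) / (
            np.linalg.norm(Ap, axis=0) + 1e-12)  # subspace correlation
        k = int(np.argmax(corr))
        found.append(grid[k])
        B = np.hstack([B, A[:, [k]]])
    return np.sort(np.array(found))

# two uncorrelated sources at -20 and 15 degrees, 8-element array
rng = np.random.default_rng(2)
m, snaps, true_doa = 8, 500, np.radians([-20.0, 15.0])
A = steering(true_doa, m)
S = rng.standard_normal((2, snaps)) + 1j * rng.standard_normal((2, snaps))
N = rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps))
X = A @ S + 0.1 * N
R = X @ X.conj().T / snaps
grid = np.radians(np.linspace(-90, 90, 361))
print("estimated DOAs (deg):", np.degrees(rap_music(R, 2, grid)))
```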

  13. Adaptive control of large space structures using recursive lattice filters

    NASA Technical Reports Server (NTRS)

    Sundararajan, N.; Goglia, G. L.

    1985-01-01

    The use of recursive lattice filters for identification and adaptive control of large space structures is studied. Lattice filters were used to identify the structural dynamics model of the flexible structures. This identification model is then used for adaptive control. Before the identified model and control laws are integrated, the identified model is passed through a series of validation procedures and only when the model passes these validation procedures is control engaged. This type of validation scheme prevents instability when the overall loop is closed. Another important area of research, namely that of robust controller synthesis, was investigated using frequency domain multivariable controller synthesis methods. The method uses the Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) approach to ensure stability against unmodeled higher frequency modes and achieves the desired performance.

  14. Recursive approach to the moment-based phase unwrapping method.

    PubMed

    Langley, Jason A; Brice, Robert G; Zhao, Qun

    2010-06-01

    The moment-based phase unwrapping algorithm approximates the phase map as a product of Gegenbauer polynomials, but the weight function for the Gegenbauer polynomials generates artificial singularities along the edge of the phase map. A method is presented to remove the singularities inherent to the moment-based phase unwrapping algorithm by approximating the phase map as a product of two one-dimensional Legendre polynomials and applying a recursive property of derivatives of Legendre polynomials. The proposed phase unwrapping algorithm is tested on simulated and experimental data sets. The results are then compared to those of PRELUDE 2D, a widely used phase unwrapping algorithm, and a Chebyshev-polynomial-based phase unwrapping algorithm. It was found that the proposed phase unwrapping algorithm provides results that are comparable to those obtained by using PRELUDE 2D and the Chebyshev phase unwrapping algorithm. PMID:20517381

  15. Lanczos and Recursion Techniques for Multiscale Kinetic Monte Carlo Simulations

    SciTech Connect

    Rudd, R E; Mason, D R; Sutton, A P

    2006-03-13

    We review an approach to the simulation of the class of microstructural and morphological evolution involving both relatively short-ranged chemical and interfacial interactions and long-ranged elastic interactions. The calculation of the anharmonic elastic energy is facilitated with Lanczos recursion. The elastic energy changes affect the rate of vacancy hopping, and hence the rate of microstructural evolution due to vacancy mediated diffusion. The elastically informed hopping rates are used to construct the event catalog for kinetic Monte Carlo simulation. The simulation is accelerated using a second order residence time algorithm. The effect of elasticity on the microstructural development has been assessed. This article is related to a talk given in honor of David Pettifor at the DGP60 Workshop in Oxford.

  16. Recursive Estimation for the Tracking of Radioactive Sources

    SciTech Connect

    Howse, J.W.; Muske, K.R.; Ticknor, L.O.

    1999-02-01

    This paper describes a recursive estimation algorithm used for tracking the physical location of radioactive sources in real-time as they are moved around in a facility. The algorithm is a nonlinear least squares estimation that minimizes the change in the source location and the deviation between measurements and model predictions simultaneously. The measurements used to estimate position consist of four count rates reported by four different gamma ray detectors. There is an uncertainty in the source location due to the variance of the detected count rate. This work represents part of a suite of tools which will partially automate security and safety assessments, allow some assessments to be done remotely, and provide additional sensor modalities with which to make assessments.
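
    A minimal sketch of the estimation idea, assuming an inverse-square count-rate model, a known source strength, and scipy.optimize.least_squares; the detector layout, weights, and penalty factor are invented. The movement-penalty term plays the role of the "change in the source location" component of the objective described above.

```python
import numpy as np
from scipy.optimize import least_squares

# four detector positions in the facility (metres) -- illustrative values
detectors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
SOURCE_STRENGTH = 5.0e4       # counts/s at 1 m, assumed known here
MOVE_PENALTY = 2.0            # weight on the change-in-position term

def predicted_rates(pos):
    d2 = np.sum((detectors - pos) ** 2, axis=1)
    return SOURCE_STRENGTH / np.maximum(d2, 1e-6)   # inverse-square model

def residuals(pos, measured, prev_pos):
    meas_res = (predicted_rates(pos) - measured) / np.sqrt(np.maximum(measured, 1.0))
    move_res = MOVE_PENALTY * (pos - prev_pos)      # keep the track smooth
    return np.concatenate([meas_res, move_res])

def track(rate_history, start=np.array([5.0, 5.0])):
    """Recursively re-estimate position as each set of count rates arrives."""
    est, prev = [], start
    for measured in rate_history:
        sol = least_squares(residuals, prev, args=(measured, prev))
        prev = sol.x
        est.append(prev)
    return np.array(est)

# simulate a source walking across the room and recover its track
rng = np.random.default_rng(3)
true_path = np.linspace([2.0, 2.0], [8.0, 6.0], 20)
rates = [rng.poisson(predicted_rates(p)) for p in true_path]
print("final estimate:", track(rates)[-1], "vs true:", true_path[-1])
```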

  17. Recursive estimation for the tracking of radioactive sources

    SciTech Connect

    Howse, J.W.; Ticknor, L.O.; Muske, K.R.

    1998-12-31

    This paper describes a recursive estimation algorithm used for tracking the physical location of radioactive sources in real-time as they are moved around in a facility. The algorithm is related to a nonlinear least squares estimation that minimizes the change in the source location and the deviation between measurements and model predictions simultaneously. The measurements used to estimate position consist of four count rates reported by four different gamma ray detectors. There is an uncertainty in the source location due to the large variance of the detected count rate. This work represents part of a suite of tools which will partially automate security and safety assessments, allow some assessments to be done remotely, and provide additional sensor modalities with which to make assessments.

  18. Chaotic spin-spin entanglement on a recursive lattice.

    PubMed

    Chakhmakhchyan, Levon; Guérin, Stéphane; Leroy, Claude

    2015-08-01

    We propose an exactly solvable multisite interaction spin-1/2 Ising-Heisenberg model on a triangulated Husimi lattice for rigorous studies of chaotic entanglement. By making use of the generalized star-triangle transformation, we map the initial model onto an effective Ising one on a Husimi lattice, which we then solve exactly by applying the recursive method. Expressing the entanglement of the Heisenberg spins, which we quantify by means of the concurrence, in terms of the magnetic quantities of the system, we demonstrate its bifurcation and chaotic behavior. Furthermore, we show that the underlying chaos may slightly enhance the amount of the entanglement and present on the phase diagram the transition lines from the uniform to periodic and from the periodic to chaotic regimes.

  19. Structure damage detection based on random forest recursive feature elimination

    NASA Astrophysics Data System (ADS)

    Zhou, Qifeng; Zhou, Hao; Zhou, Qingqing; Yang, Fan; Luo, Linkai

    2014-05-01

    Feature extraction is a key preliminary step in structural damage detection. In this paper, a structural damage detection method based on wavelet packet decomposition (WPD) and random forest recursive feature elimination (RF-RFE) is proposed. In order to obtain the most effective feature subset and to improve the identification accuracy, a two-stage feature selection method is adopted after WPD. First, the damage features are sorted according to the original random forest variable importance analysis. Second, RF-RFE is used to eliminate the least important feature and reorder the feature list at each step, yielding a new feature importance sequence. Finally, the k-nearest neighbor (KNN) algorithm, as a benchmark classifier, is used to evaluate the extracted feature subset. A four-storey steel shear building model is chosen as an example for method verification. The experimental results show that using the smaller feature set obtained from the proposed method achieves higher identification accuracy and reduces the detection time cost.
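
    A hedged sketch using scikit-learn's RFE wrapper around a random forest, with KNN as the benchmark classifier as in the paper; synthetic classification data stand in for the wavelet-packet features of the shear-building model.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# synthetic stand-in for wavelet-packet damage features (4 damage classes)
X, y = make_classification(n_samples=400, n_features=64, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

# recursive feature elimination driven by random-forest variable importance:
# drop the least important feature and re-rank at every step
selector = RFE(estimator=RandomForestClassifier(n_estimators=200, random_state=0),
               n_features_to_select=8, step=1)
selector.fit(X, y)
X_sel = selector.transform(X)

# KNN as the benchmark classifier on the full and the reduced feature sets
knn = KNeighborsClassifier(n_neighbors=5)
print("all features :", cross_val_score(knn, X, y, cv=5).mean())
print("RF-RFE subset:", cross_val_score(knn, X_sel, y, cv=5).mean())
```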

  20. Adaptive control of large space structures using recursive lattice filters

    NASA Technical Reports Server (NTRS)

    Goglia, G. L.

    1985-01-01

    The use of recursive lattice filters for identification and adaptive control of large space structures was studied. Lattice filters are used widely in the areas of speech and signal processing. Herein, they are used to identify the structural dynamics model of the flexible structures. This identified model is then used for adaptive control. Before the identified model and control laws are integrated, the identified model is passed through a series of validation procedures and only when the model passes these validation procedures is control engaged. This type of validation scheme prevents instability when the overall loop is closed. The results obtained from simulation were compared to those obtained from experiments. In this regard, the flexible beam and grid apparatus at the Aerospace Control Research Lab (ACRL) of NASA Langley Research Center were used as the principal candidates for carrying out the above tasks. Another important area of research, namely that of robust controller synthesis, was investigated using frequency domain multivariable controller synthesis methods.

  1. A class of recursion operators on a tangent bundle

    NASA Astrophysics Data System (ADS)

    Vermeire, F.; Sarlet, W.; Crampin, M.

    2006-06-01

    We generalize the construction of a class of type (1, 1) tensor fields R on a tangent bundle which was introduced in a preceding paper. The generalization comes from the fact that, apart from a given Lagrangian, the further data consist of a type (1, 1) tensor J along the tangent bundle projection τ: TQ →Q, rather than a tensor on Q. The main features under investigation are two kinds of recursion properties of R, namely its potential invariance under the flow of the given dynamics and the property of having vanishing Nijenhuis torsion. The theory is applied, in particular, to the case of second-order dynamics coming from a Finsler metric.

  2. Food-web formation with recursive evolutionary branching.

    PubMed

    Ito, Hiroshi C; Ikegami, Takashi

    2006-01-01

    A reaction-diffusion model describing the evolutionary dynamics of a food-web was constructed. In this model, predator-prey relationships among organisms were determined by their position in a two-dimensional phenotype space defined by two traits: as prey and as predator. The mutation process is expressed with a diffusion process of biomass in the phenotype space. Numerical simulation of this model showed co-evolutionary dynamics of isolated phenotypic clusters, including various types of evolutionary branching, which were classified into branching as prey, branching as predators, and co-evolutionary branching of both prey and predators. A complex food-web develops with recursive evolutionary branching from a single phenotypic cluster. Biodiversity peaks at the medium strength of the predator-prey interaction, where the food-web is maintained at medium biomass by a balanced frequency between evolutionary branching and extinction.

  3. Multiangle dynamic light scattering analysis using an improved recursion algorithm

    NASA Astrophysics Data System (ADS)

    Li, Lei; Li, Wei; Wang, Wanyan; Zeng, Xianjiang; Chen, Junyao; Du, Peng; Yang, Kecheng

    2015-10-01

    Multiangle dynamic light scattering (MDLS) compensates for the low information in a single-angle dynamic light scattering (DLS) measurement by combining the light intensity autocorrelation functions from a number of measurement angles. Reliable estimation of PSD from MDLS measurements requires accurate determination of the weighting coefficients and an appropriate inversion method. We propose the Recursion Nonnegative Phillips-Twomey (RNNPT) algorithm, which is insensitive to the noise of correlation function data, for PSD reconstruction from MDLS measurements. The procedure includes two main steps: 1) the calculation of the weighting coefficients by the recursion method, and 2) the PSD estimation through the RNNPT algorithm. Suitable regularization parameters for the algorithm were obtained using the MR-L-curve, since the overall computational cost of this method is appreciably less than that of the L-curve for large problems. Furthermore, the convergence behavior of the MR-L-curve method is in general superior to that of the L-curve method, and the error of the MR-L-curve method is monotonically decreasing. First, the method was evaluated on simulated unimodal lognormal PSDs and multimodal lognormal PSDs. For comparison, reconstruction results obtained with a classical regularization method were included. Then, to further study the stability and sensitivity of the proposed method, all examples were analyzed using correlation function data with different levels of noise. The simulated results proved that the RNNPT method yields more accurate results in the determination of PSDs from MDLS than those obtained with the classical regularization method for both unimodal and multimodal PSDs.

  4. Recursive least square vehicle mass estimation based on acceleration partition

    NASA Astrophysics Data System (ADS)

    Feng, Yuan; Xiong, Lu; Yu, Zhuoping; Qu, Tong

    2014-05-01

    Vehicle mass is an important parameter in vehicle dynamics control systems. Although many algorithms have been developed for the estimation of mass, none of them have yet taken into account the different types of resistance that occur under different conditions. This paper proposes a vehicle mass estimator. The estimator incorporates road gradient information in the longitudinal accelerometer signal, and it removes the road grade from the longitudinal dynamics of the vehicle. Then, two different recursive least square method (RLSM) schemes are proposed to estimate the driving resistance and the mass independently based on the acceleration partition under different conditions. A 6 DOF dynamic model of a four in-wheel-motor vehicle is built to assist in the design of the algorithm and in the setting of the parameters. The acceleration limits are determined to not only reduce the estimated error but also ensure enough data for the resistance estimation and mass estimation in some critical situations. The modification of the algorithm is also discussed to improve the result of the mass estimation. Experimental data on asphalt road, plastic runway, and gravel road surfaces and on sloping roads are used to validate the estimation algorithm. The adaptability of the algorithm is improved by using data collected under several critical operating conditions. The experimental results show the error of the estimation process to be within 2.6%, which indicates that the algorithm can estimate mass with great accuracy regardless of the road surface and gradient changes and that it may be valuable in engineering applications. This paper proposes a recursive least square vehicle mass estimation method based on acceleration partition.
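
    A minimal sketch of the underlying recursive least squares step for the regression F_drive = m*a + F_resist, assuming NumPy; the acceleration-partition logic, grade compensation, and acceleration limits of the paper are not reproduced, and all numbers are illustrative.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.995):
    """One recursive-least-squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)            # gain
    theta = theta + (k * (y - phi.T @ theta)).ravel()
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# regression model: F_drive = m * a + F_resist, unknowns theta = [m, F_resist]
rng = np.random.default_rng(4)
m_true, f_res_true = 1500.0, 300.0                   # kg, N (illustrative)
theta = np.array([1000.0, 0.0])                      # initial guess
P = np.diag([1e3, 1e5])

for _ in range(500):
    a = rng.uniform(-1.5, 1.5)                       # grade-compensated accel, m/s^2
    f = m_true * a + f_res_true + 50.0 * rng.standard_normal()
    theta, P = rls_update(theta, P, np.array([a, 1.0]), f)

print("estimated mass: %.0f kg, resistance: %.0f N" % (theta[0], theta[1]))
```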

  5. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases development. The results show that NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both of them demonstrate potential to become a key database technology for clinical data management as the technology further advances. PMID:23177219

  6. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases development. The results show that NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both of them demonstrate potential to become a key database technology for clinical data management as the technology further advances.

  7. Theory of Mind Development in Adolescence and Early Adulthood: The Growing Complexity of Recursive Thinking Ability.

    PubMed

    Valle, Annalisa; Massaro, Davide; Castelli, Ilaria; Marchetti, Antonella

    2015-02-01

    This study explores the development of theory of mind, operationalized as recursive thinking ability, from adolescence to early adulthood (N = 110; young adolescents = 47; adolescents = 43; young adults = 20). The construct of theory of mind has been operationalized in two different ways: as the ability to recognize the correct mental state of a character, and as the ability to attribute the correct mental state in order to predict the character's behaviour. The Imposing Memory Task, with five recursive thinking levels, and a third-order false-belief task with three recursive thinking levels (devised for this study) have been used. The relationship among working memory, executive functions, and linguistic skills are also analysed. Results show that subjects exhibit less understanding of elevated recursive thinking levels (third, fourth, and fifth) compared to the first and second levels. Working memory is correlated with total recursive thinking, whereas performance on the linguistic comprehension task is related to third level recursive thinking in both theory of mind tasks. An effect of age on third-order false-belief task performance was also found. A key finding of the present study is that the third-order false-belief task shows significant age differences in the application of recursive thinking that involves the prediction of others' behaviour. In contrast, such an age effect is not observed in the Imposing Memory Task. These results may support the extension of the investigation of the third order false belief after childhood. PMID:27247645

  8. Theory of Mind Development in Adolescence and Early Adulthood: The Growing Complexity of Recursive Thinking Ability

    PubMed Central

    Valle, Annalisa; Massaro, Davide; Castelli, Ilaria; Marchetti, Antonella

    2015-01-01

    This study explores the development of theory of mind, operationalized as recursive thinking ability, from adolescence to early adulthood (N = 110; young adolescents = 47; adolescents = 43; young adults = 20). The construct of theory of mind has been operationalized in two different ways: as the ability to recognize the correct mental state of a character, and as the ability to attribute the correct mental state in order to predict the character’s behaviour. The Imposing Memory Task, with five recursive thinking levels, and a third-order false-belief task with three recursive thinking levels (devised for this study) have been used. The relationship among working memory, executive functions, and linguistic skills are also analysed. Results show that subjects exhibit less understanding of elevated recursive thinking levels (third, fourth, and fifth) compared to the first and second levels. Working memory is correlated with total recursive thinking, whereas performance on the linguistic comprehension task is related to third level recursive thinking in both theory of mind tasks. An effect of age on third-order false-belief task performance was also found. A key finding of the present study is that the third-order false-belief task shows significant age differences in the application of recursive thinking that involves the prediction of others’ behaviour. In contrast, such an age effect is not observed in the Imposing Memory Task. These results may support the extension of the investigation of the third order false belief after childhood. PMID:27247645

  9. Standardization of XML Database Exchanges and the James Webb Space Telescope Experience

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Detter, Ryan; Jones, Ron; Fatig, Curtis C.

    2007-01-01

    Personnel from the National Aeronautics and Space Administration (NASA) James Webb Space Telescope (JWST) Project have been working with various standards communities such as the Object Management Group (OMG) and the Consultative Committee for Space Data Systems (CCSDS) to assist in the definition of a common eXtensible Markup Language (XML) database exchange format. The CCSDS and OMG standards are intended for the exchange of core command and telemetry information, not for all database information needed to exercise a NASA space mission. The mission-specific database, containing all the information needed for a space mission, is translated from/to the standard using a translator. The standard is meant to provide a system that encompasses 90% of the information needed for command and telemetry processing. This paper will discuss standardization of the XML database exchange format, tools used, and the JWST experience, as well as future work with XML standards groups, both commercial and government.

  10. Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    This paper discusses the detailed design of an XML databinding framework for aircraft engine simulation. The framework provides an object interface to access and use engine data, while at the same time preserving the meaning of the original data. The language-independent representation of engine component data enables users to move XML data across disparate networks using HTTP. The application of this framework is demonstrated via a web-based turbofan propulsion system simulation using the World Wide Web (WWW). A Java Servlet-based web component architecture is used for rendering XML engine data into HTML format and dealing with input events from the user, which allows users to interact with simulation data from a web browser. The simulation data can also be saved to a local disk for archiving or to restart the simulation at a later time.

  11. An adaptable neural-network model for recursive nonlinear traffic prediction and modeling of MPEG video sources.

    PubMed

    Doulamis, A D; Doulamis, N D; Kollias, S D

    2003-01-01

    Multimedia services, and especially digital video, are expected to be the major traffic component transmitted over communication networks [such as internet protocol (IP)-based networks]. For this reason, traffic characterization and modeling of such services are required for an efficient network operation. The generated models can be used as traffic rate predictors, during the network operation phase (online traffic modeling), or as video generators for estimating the network resources, during the network design phase (offline traffic modeling). In this paper, an adaptable neural-network architecture is proposed covering both cases. The scheme is based on an efficient recursive weight estimation algorithm, which adapts the network response to current conditions. In particular, the algorithm updates the network weights so that 1) the network output, after the adaptation, is approximately equal to current bit rates (current traffic statistics) and 2) a minimal degradation over the obtained network knowledge is provided. It can be shown that the proposed adaptable neural-network architecture simulates a recursive nonlinear autoregressive model (RNAR) similar to the notation used in the linear case. The algorithm presents low computational complexity and high efficiency in tracking traffic rates in contrast to conventional retraining schemes. Furthermore, for the problem of offline traffic modeling, a novel correlation mechanism is proposed for capturing the burstiness of the actual MPEG video traffic. The performance of the model is evaluated using several real-life MPEG coded video sources of long duration and compared with other linear/nonlinear techniques used for both cases. The results indicate that the proposed adaptable neural-network architecture presents better performance than other examined techniques.

  12. Efficient design of two-dimensional recursive digital filters. Final report

    SciTech Connect

    Twogood, R.E.; Mitra, S.K.

    1980-01-01

    This report outlines the research progress during the period August 1978 to July 1979. This work can be divided into seven basic project areas. Project 1 deals with a comparative study of 2-D recursive and nonrecursive digital filters. The second project addresses a new design technique for 2-D half-plane recursive filters, and Projects 3 thru 5 deal with implementation issues. The sixth project presents our recent study of the applicability of array processors to 2-D digital signal processing. The final project involves our investigation into techniques for incorporating symmetry constraints on 2-D recursive filters in order to yield more efficient implementations.

  13. Update '98.

    ERIC Educational Resources Information Center

    Mock, Karen R.

    1998-01-01

    Updates cases and issues previously discussed in this regular column on human rights in Canada, including racism and anti-Semitism, laws on hate crimes, hate sites on the World Wide Web, the use of the "free speech" defense by hate groups, and legal challenges to antiracist groups by individuals criticized by them. (DSK)

  14. Dynamic XML-based exchange of relational data: application to the Human Brain Project.

    PubMed

    Tang, Zhengming; Kadiyska, Yana; Li, Hao; Suciu, Dan; Brinkley, James F

    2003-01-01

    This paper discusses an approach to exporting relational data in XML format for data exchange over the web. We describe the first real-world application of SilkRoute, a middleware program that dynamically converts existing relational data to a user-defined XML DTD. The application, called XBrain, wraps SilkRoute in a Java Server Pages framework, thus permitting a web-based XQuery interface to a legacy relational database. The application is demonstrated as a query interface to the University of Washington Brain Project's Language Map Experiment Management System, which is used to manage data about language organization in the brain.

  15. Development and Validation of XML-based Calculations within Order Sets

    PubMed Central

    Hulse, Nathan C.; Del Fiol, Guilherme; Rocha, Roberto A.

    2005-01-01

    We have developed two XML Schemas to support the implementation of calculations within XML-based order sets for use within a physician order entry system. The models support the representation of variable-based algorithms and include data elements designed to support ancillary functions such as input range checking, rounding, and minimum/maximum value constraints. Two clinicians successfully authored 57 unique calculated orders derived from a set of 11 calculations using the models within our authoring environment. The resultant knowledge base content was subsequently tested and found to produce the desired results within the electronic physician order entry environment. PMID:16779062
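
    The published schemas are not reproduced in the abstract, so the sketch below invents a tiny XML structure for a weight-based dose calculation with input range checking, rounding, and a maximum-value cap, and evaluates it with Python's standard library; it is an illustration of the kind of calculation described, not the actual model.

```python
import xml.etree.ElementTree as ET

# invented XML for a weight-based dose: dose = weight * per_kg, rounded,
# with an input range check and a maximum-dose cap (not the published schema)
calc_xml = """
<calculation name="weight_based_dose" result-units="mg">
  <input name="weight" units="kg" min="1" max="250"/>
  <input name="per_kg" units="mg/kg"/>
  <formula op="multiply" args="weight per_kg"/>
  <round decimals="1"/>
  <maximum value="1000"/>
</calculation>
"""

def evaluate(xml_text, **inputs):
    node = ET.fromstring(xml_text)
    for inp in node.findall("input"):              # input range checking
        val = inputs[inp.get("name")]
        lo, hi = inp.get("min"), inp.get("max")
        if lo is not None and val < float(lo):
            raise ValueError(f"{inp.get('name')} below minimum")
        if hi is not None and val > float(hi):
            raise ValueError(f"{inp.get('name')} above maximum")
    f = node.find("formula")
    args = [inputs[name] for name in f.get("args").split()]
    result = 1.0
    if f.get("op") == "multiply":                  # only one operator in this toy
        for a in args:
            result *= a
    rnd = node.find("round")
    if rnd is not None:
        result = round(result, int(rnd.get("decimals")))
    mx = node.find("maximum")
    if mx is not None:
        result = min(result, float(mx.get("value")))
    return result

print(evaluate(calc_xml, weight=72.0, per_kg=2.5), "mg")   # -> 180.0 mg
```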

  16. Multi-Resolution Seismic Tomography Based on Recursive Tessellation Hierarchy

    SciTech Connect

    Simmons, N A; Myers, S C; Ramirez, A

    2009-07-01

    A 3-D global tomographic model that reconstructs velocity structure at multiple scales and incorporates laterally variable seismic discontinuities is currently being developed. The model parameterization is node-based, with nodes placed at the vertices defined by triangular tessellations of a spheroidal surface. The triangular tessellation framework is hierarchical. Starting with a tetrahexahedron representing the whole globe (1st level of the hierarchy, 24 faces), they divide each triangle of the tessellation into daughter triangles. The collection of all daughter triangles comprises the 2nd level of the tessellation hierarchy, and further recursion produces an arbitrary number of tessellation levels and arbitrarily fine node spacing. They have developed an inversion procedure that takes advantage of the recursive properties of the tessellation hierarchies by progressively solving for shorter-wavelength heterogeneities. In this procedure, they first perform the tomographic inversion using a tessellation level with coarse node spacing. They find that a coarse node spacing of approximately 8° is adequate to capture bulk regional properties. They then conduct the tomographic inversion on a 4° tessellation level using the residuals and inversion results from the 8° run. In practice they find that the progressive tomography approach is robust, providing an intrinsic regularization for inversion stability and avoiding the issue of predefining resolution levels. Further, determining average regional properties with coarser tessellation levels enables long-wavelength heterogeneities to account for sparsely sampled regions (or regions of the mantle where longer wavelength patterns of heterogeneity suffice) while allowing shorter length-scale heterogeneities to emerge where necessary. They demonstrate the inversion approach with a set of synthetic test cases that mimic the complex nature of data arrangements (mixed-determined inversion) common to most
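
    A hedged sketch of the recursive subdivision itself, assuming NumPy: each spherical triangle is split into four daughters through its edge midpoints, re-projected to the sphere. An octahedron is used as the seed here for brevity instead of the 24-face tetrahexahedron described above, and no tomography is performed.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def subdivide(triangles):
    """One level of the tessellation hierarchy: split every spherical triangle
    into four daughters through its edge midpoints, re-projected to the sphere."""
    out = []
    for a, b, c in triangles:
        ab = normalize((a + b) / 2)
        bc = normalize((b + c) / 2)
        ca = normalize((c + a) / 2)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

# seed: an octahedron (8 faces) -- a stand-in for the 24-face tetrahexahedron
p = [np.array(v, float) for v in
     [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]]
seed = [(p[0], p[2], p[4]), (p[2], p[1], p[4]), (p[1], p[3], p[4]), (p[3], p[0], p[4]),
        (p[2], p[0], p[5]), (p[1], p[2], p[5]), (p[3], p[1], p[5]), (p[0], p[3], p[5])]

levels = [seed]
for _ in range(3):                      # three recursions: 8 -> 32 -> 128 -> 512 faces
    levels.append(subdivide(levels[-1]))
for i, tris in enumerate(levels):
    print("level", i + 1, ":", len(tris), "triangles")
```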

  17. A Parallel Implementation of Multilevel Recursive Spectral Bisection for Application to Adaptive Unstructured Meshes. Chapter 1

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen T.; Simon, Horst; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    The design of a parallel implementation of multilevel recursive spectral bisection is described. The goal is to implement a code that is fast enough to enable dynamic repartitioning of adaptive meshes.
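
    A minimal sketch of recursive spectral bisection on a small dense graph, assuming NumPy: each subdomain is split at the median of its Fiedler vector (the eigenvector of the second-smallest eigenvalue of the graph Laplacian), and the split is applied recursively. The multilevel coarsening and the parallelization discussed above are not included.

```python
import numpy as np

def fiedler_split(nodes, adj):
    """Split a vertex set in two using the Fiedler vector of its graph Laplacian."""
    sub = adj[np.ix_(nodes, nodes)]
    lap = np.diag(sub.sum(axis=1)) - sub
    _, vecs = np.linalg.eigh(lap)
    fiedler = vecs[:, 1]                    # eigenvector of 2nd-smallest eigenvalue
    order = np.argsort(fiedler)
    half = len(nodes) // 2
    nodes = np.asarray(nodes)
    return nodes[order[:half]], nodes[order[half:]]

def recursive_bisection(nodes, adj, levels):
    """Recursively bisect until 2**levels parts are produced."""
    if levels == 0:
        return [list(nodes)]
    left, right = fiedler_split(list(nodes), adj)
    return (recursive_bisection(left, adj, levels - 1) +
            recursive_bisection(right, adj, levels - 1))

# small 4 x 8 grid mesh as a toy "unstructured" graph
nx_, ny = 4, 8
n = nx_ * ny
adj = np.zeros((n, n))
for i in range(nx_):
    for j in range(ny):
        u = i * ny + j
        if i + 1 < nx_:
            adj[u, u + ny] = adj[u + ny, u] = 1
        if j + 1 < ny:
            adj[u, u + 1] = adj[u + 1, u] = 1

parts = recursive_bisection(list(range(n)), adj, levels=2)   # 4 partitions
print([sorted(p) for p in parts])
```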

  18. Recursive least squares method of regression coefficients estimation as a special case of Kalman filter

    NASA Astrophysics Data System (ADS)

    Borodachev, S. M.

    2016-06-01

    The simple derivation of the recursive least squares (RLS) method equations is given as a special case of Kalman filter estimation of a constant system state under changing observation conditions. A numerical example illustrates the application of RLS to the multicollinearity problem.
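
    The correspondence can be made concrete in a few lines, assuming NumPy: a Kalman filter for a constant state with no process noise, observed through y_k = phi_k^T theta + e_k, reduces exactly to the RLS recursion.

```python
import numpy as np

def kalman_constant_state(theta, P, phi, y, r=1.0):
    """Kalman update for a constant state x_{k+1} = x_k (no process noise)
    observed through y_k = phi_k^T x_k + e_k with var(e_k) = r."""
    phi = phi.reshape(-1, 1)
    s = (phi.T @ P @ phi).item() + r             # innovation variance
    k = (P @ phi) / s                            # Kalman gain
    theta = theta + (k * (y - (phi.T @ theta).item())).ravel()
    P = P - k @ phi.T @ P                        # covariance update
    return theta, P                              # identical to the RLS recursion

# fit y = 2*x1 - 3*x2 + noise recursively
rng = np.random.default_rng(5)
theta, P = np.zeros(2), 1e3 * np.eye(2)
for _ in range(200):
    phi = rng.standard_normal(2)
    y = phi @ np.array([2.0, -3.0]) + 0.1 * rng.standard_normal()
    theta, P = kalman_constant_state(theta, P, phi, y, r=0.01)
print("recovered coefficients:", theta)
```

    With a unit observation variance and a forgetting factor of one, the gain and covariance updates above coincide term by term with the usual RLS equations, which is the point of the derivation summarized in the abstract.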

  19. Recursive dynamic programming for adaptive sequence and structure alignment

    SciTech Connect

    Thiele, R.; Zimmer, R.; Lengauer, T.

    1995-12-31

    We propose a new alignment procedure that is capable of aligning protein sequences and structures in a unified manner. Recursive dynamic programming (RDP) is a hierarchical method which, on each level of the hierarchy, identifies locally optimal solutions and assembles them into partial alignments of sequences and/or structures. In contrast to classical dynamic programming, RDP can also handle alignment problems that use objective functions not obeying the principle of prefix optimality, e.g. scoring schemes derived from energy potentials of mean force. For such alignment problems, RDP aims at computing solutions that are near-optimal with respect to the involved cost function and biologically meaningful at the same time. Towards this goal, RDP maintains a dynamic balance between different factors governing alignment fitness such as evolutionary relationships and structural preferences. As in the RDP method gaps are not scored explicitly, the problematic assignment of gap cost parameters is circumvented. In order to evaluate the RDP approach we analyse whether known and accepted multiple alignments based on structural information can be reproduced with the RDP method.

  20. Semantics boosts syntax in artificial grammar learning tasks with recursion.

    PubMed

    Fedor, Anna; Varga, Máté; Szathmáry, Eörs

    2012-05-01

    Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures is in the focus of attention because this could be one of the cognitive capacities that make humans distinct from all other animals. The ability to parse CER is usually tested by means of artificial grammar learning (AGL) tasks, during which participants have to infer the rule from a set of artificial sentences. One of the surprising results of previous AGL experiments is that learning CER is not as easy as had been thought. We hypothesized that because artificial sentences lack semantic content, semantics could help humans learn the syntax of center-embedded sentences. To test this, we composed sentences from 4 vocabularies of different degrees of semantic content due to 3 factors (familiarity, meaning of words, and semantic relationship between words). According to our results, these factors have no effect one by one but they make learning significantly faster when combined. This leads to the assumption that there were different mechanisms at work when CER was parsed in natural and in artificial languages. This finding questions the suitability of AGL tasks with artificial vocabularies for studying the learning and processing of linguistic CER.
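
    For illustration, a few lines of Python that generate center-embedded (A^n B^n) sentences with matched noun/verb pairs, in the spirit of the stimuli described above; the mini-vocabulary is invented.

```python
NOUNS = ["the malt", "the rat", "the cat"]        # invented mini-vocabulary
VERBS = ["lay in the house", "ate", "chased"]     # verb i belongs to noun i

def center_embedded(depth):
    """Generate an A^n B^n sentence: noun_1 that noun_2 ... verb_2 verb_1."""
    nouns = [NOUNS[i % len(NOUNS)] for i in range(depth)]
    verbs = [VERBS[i % len(VERBS)] for i in range(depth)]
    return " ".join([" that ".join(nouns)] + list(reversed(verbs[1:])) + [verbs[0]])

for d in (1, 2, 3):
    print(center_embedded(d))
# depth 2 reproduces the example sentence from the abstract:
# "the malt that the rat ate lay in the house"
```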

  1. Recursive recovery of Markov transition probabilities from boundary value data

    SciTech Connect

    Patch, S.K.

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 × 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 × 2 × 2 problem, is solved.

  2. A Semantic Analysis of XML Schema Matching for B2B Systems Integration

    ERIC Educational Resources Information Center

    Kim, Jaewook

    2011-01-01

    One of the most critical steps to integrating heterogeneous e-Business applications using different XML schemas is schema matching, which is known to be costly and error-prone. Many automatic schema matching approaches have been proposed, but the challenge is still daunting because of the complexity of schemas and immaturity of technologies in…

  3. XML and Bibliographic Data: The TVS (Transport, Validation and Services) Model.

    ERIC Educational Resources Information Center

    de Carvalho, Joaquim; Cordeiro, Maria Ines

    This paper discusses the role of XML in library information systems at three major levels: as a representation language that enables the transport of bibliographic data in a way that is technologically independent and universally understood across systems and domains; as a language that enables the specification of complex validation rules…

  4. Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms

    ERIC Educational Resources Information Center

    Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy

    2005-01-01

    Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…

  5. A distributed computing system for magnetic resonance imaging: Java-based processing and binding of XML.

    PubMed

    de Beer, R; Graveron-Demilly, D; Nastase, S; van Ormondt, D

    2004-03-01

    Recently we have developed a Java-based heterogeneous distributed computing system for the field of magnetic resonance imaging (MRI). It is a software system for embedding the various image reconstruction algorithms that we have created for handling MRI data sets with sparse sampling distributions. Since these data sets may result from multi-dimensional MRI measurements our system has to control the storage and manipulation of large amounts of data. In this paper we describe how we have employed the extensible markup language (XML) to realize this data handling in a highly structured way. To that end we have used Java packages, recently released by Sun Microsystems, to process XML documents and to compile pieces of XML code into Java classes. We have effectuated a flexible storage and manipulation approach for all kinds of data within the MRI system, such as data describing and containing multi-dimensional MRI measurements, data configuring image reconstruction methods and data representing and visualizing the various services of the system. We have found that the object-oriented approach, possible with the Java programming environment, combined with the XML technology is a convenient way of describing and handling various data streams in heterogeneous distributed computing systems.

  6. XML3D and Xflow: combining declarative 3D for the Web with generic data flows.

    PubMed

    Klein, Felix; Sons, Kristian; Rubinstein, Dmitri; Slusallek, Philipp

    2013-01-01

    Researchers have combined XML3D, which provides declarative, interactive 3D scene descriptions based on HTML5, with Xflow, a language for declarative, high-performance data processing. The result lets Web developers combine a 3D scene graph with data flows for dynamic meshes, animations, image processing, and postprocessing. PMID:24808080

  7. An Electronic Finding Aid Using Extensible Markup Language (XML) and Encoded Archival Description (EAD).

    ERIC Educational Resources Information Center

    Chang, May

    2000-01-01

    Describes the development of electronic finding aids for archives at the University of Illinois, Urbana-Champaign that used XML (extensible markup language) and EAD (encoded archival description) to enable more flexible information management and retrieval than using MARC or a relational database management system. EAD template is appended.…

  8. Integration of HTML documents into an XML-based knowledge repository.

    PubMed

    Roemer, Lorrie K; Rocha, Roberto A; Del Fiol, Guilherme

    2005-01-01

    The Emergency Patient Instruction Generator (EPIG) is an electronic content compiler/viewer/editor developed by Intermountain Health Care. The content is vendor-licensed HTML patient discharge instructions. This work describes the process by which discharge instructions were converted from ASCII-encoded HTML to XML, then loaded into a database for use by EPIG.

  9. Association Rule Extraction from XML Stream Data for Wireless Sensor Networks

    PubMed Central

    Paik, Juryon; Nam, Junghyun; Kim, Ung Mo; Won, Dongho

    2014-01-01

    As wireless sensor networks advance, they yield massive volumes of disparate, dynamic, geographically distributed and heterogeneous data. The data mining community has attempted to extract knowledge from the huge amount of data that these networks generate. However, previous mining work in WSNs has focused on supporting simple relational data structures, like one table per network, while there is a need for more complex data structures. This deficiency motivates the use of XML, the current de facto format for data exchange and for modeling a wide variety of data sources over the web, in WSNs in order to encourage the interchangeability of heterogeneous types of sensors and systems. However, mining XML data for WSNs raises two challenging issues: one is the endless data flow; the other is the complex tree structure. In this paper, we present several new definitions and techniques related to association rule mining over XML data streams in WSNs. To the best of our knowledge, this work provides the first approach to mining XML stream data that generates frequent tree items without any redundancy. PMID:25046017

  10. Applying XML-Based Technologies to Developing Online Courses: The Case of a Prototype Learning Environment

    ERIC Educational Resources Information Center

    Jedrzejowicz, Joanna; Neumann, Jakub

    2007-01-01

    Purpose: This paper seeks to describe XML technologies and to show how they can be applied for developing web-based courses and supporting authors who do not have much experience with the preparation of web-based courses. Design/methodology/approach: When developing online courses the academic staff has to address the following problem--how to…

  11. XML-Based Generator of C++ Code for Integration With GUIs

    NASA Technical Reports Server (NTRS)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increased susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner; more importantly, XML allows not just storing the data but also describing what each data item is. The XML file thus contains information useful for rendering the data by other applications. From this file the program generates data structures in the C++ language that are used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
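
    The general idea of deriving C++ data structures from an XML specification can be illustrated with a small, hypothetical sketch (not the NASA XML-to-C tool itself): the XML names each parameter and its type, and a generator emits a matching C++ struct. The element and attribute names below are invented for illustration.

        import xml.etree.ElementTree as ET

        # Hypothetical input specification; the real tool's schema differs.
        SPEC = """
        <input name="SimulationInput">
          <param name="mesh_size"  type="int"    default="128"/>
          <param name="time_step"  type="double" default="0.001"/>
          <param name="out_prefix" type="std::string" default='"run"'/>
        </input>
        """

        def generate_cpp(spec_xml: str) -> str:
            """Emit a C++ struct declaration from the XML parameter specification."""
            root = ET.fromstring(spec_xml)
            lines = [f"struct {root.get('name')} {{"]
            for p in root.findall("param"):
                lines.append(f"    {p.get('type')} {p.get('name')} = {p.get('default')};")
            lines.append("};")
            return "\n".join(lines)

        print(generate_cpp(SPEC))
        # struct SimulationInput {
        #     int mesh_size = 128;
        #     double time_step = 0.001;
        #     std::string out_prefix = "run";
        # };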

  12. Recursive Subsystems in Aphasia and Alzheimer's Disease: Case Studies in Syntax and Theory of Mind

    PubMed Central

    Bánréti, Zoltán; Hoffmann, Ildikó; Vincze, Veronika

    2016-01-01

    The relationship between recursive sentence embedding and theory-of-mind (ToM) inference is investigated in three persons with Broca's aphasia, two persons with Wernicke's aphasia, and six persons with mild and moderate Alzheimer's disease (AD). We asked questions of four types about photographs of various real-life situations. Type 4 questions asked participants about intentions, thoughts, or utterances of the characters in the pictures (“What may X be thinking/asking Y to do?”). The expected answers typically involved subordinate clauses introduced by conjunctions or direct quotations of the characters' utterances. Broca's aphasics did not produce answers with recursive sentence embedding. Rather, they projected themselves into the characters' mental states and gave direct answers in the first person singular, with relevant ToM content. We call such replies “situative statements.” Where the question concerned the mental state of the character but did not require an answer with sentence embedding (“What does X hate?”), aphasics gave descriptive answers rather than situative statements. Most replies given by persons with AD to Type 4 questions were grammatical instances of recursive sentence embedding. They also gave a few situative statements but the ToM content of these was irrelevant. In more than one third of their well-formed sentence embeddings, too, they conveyed irrelevant ToM contents. Persons with moderate AD were unable to pass secondary false belief tests. The results reveal double dissociation: Broca's aphasics are unable to access recursive sentence embedding but they can make appropriate ToM inferences; moderate AD persons make the wrong ToM inferences but they are able to access recursive sentence embedding. The double dissociation may be relevant for the nature of the relationship between the two recursive capacities. Broca's aphasics compensated for the lack of recursive sentence embedding by recursive ToM reasoning represented in very

  13. Colloid update.

    PubMed

    Argalious, Maged Y

    2012-01-01

    This update aims to provide an evidence-based review of natural and synthetic colloids, with special emphasis on the various generations of the synthetic colloid hydroxyethyl starch. The effects of 1st-, 2nd- and 3rd-generation hetastarches on bleeding, coagulopathy, acute kidney injury and mortality will be discussed. The results of randomised controlled trials addressing morbidity and mortality outcomes of colloid versus crystalloid resuscitation in critically ill patients will be described. In addition, the rationale and evidence behind early goal-directed fluid therapy (EGDFT), including a practical approach to the assessment of dynamic measures of fluid responsiveness, will be presented.

  14. Rhabdomyolysis updated

    PubMed Central

    Efstratiadis, G; Voulgaridou, A; Nikiforou, D; Kyventidis, A; Kourkouni, E; Vergoulas, G

    2007-01-01

    Rhabdomyolysis constitutes a common cause of acute renal failure and is of paramount interest. A large variety of causes with different pathogenetic mechanisms can involve skeletal muscles, resulting in rhabdomyolysis with or without acute renal failure. Crush syndrome, one of the most common causes of rhabdomyolysis, is of particular clinical interest in areas frequently affected by earthquakes, such as Greece and Turkey. Drug abusers are another sensitive group of young patients prone to rhabdomyolysis, which attracts the clinical interest of a variety of medical specialties. We herein review the evidence from the updated literature concerning the pathogenetic mechanisms and pathophysiology, as well as the management, of this interesting syndrome. PMID:19582207

  15. Development of the Plate Tectonics and Seismology markup languages with XML

    NASA Astrophysics Data System (ADS)

    Babaie, H.; Babaei, A.

    2003-04-01

    The Extensible Markup Language (XML) and its specifications such as the XSD Schema, allow geologists to design discipline-specific vocabularies such as Seismology Markup Language (SeismML) or Plate Tectonics Markup Language (TectML). These languages make it possible to store and interchange structured geological information over the Web. Development of a geological markup language requires mapping geological concepts, such as "Earthquake" or "Plate" into a UML object model, applying a modeling and design environment. We have selected four inter-related geological concepts: earthquake, fault, plate, and orogeny, and developed four XML Schema Definitions (XSD), that define the relationships, cardinalities, hierarchies, and semantics of these concepts. In such a geological concept model, the UML object "Earthquake" is related to one or more "Wave" objects, each arriving to a seismic station at a specific "DateTime", and relating to a specific "Epicenter" object that lies at a unique "Location". The "Earthquake" object occurs along a "Segment" of a "Fault" object, which is related to a specific "Plate" object. The "Fault" has its own associations with such things as "Bend", "Step", and "Segment", and could be of any kind (e.g., "Thrust", "Transform'). The "Plate" is related to many other objects such as "MOR", "Subduction", and "Forearc", and is associated with an "Orogeny" object that relates to "Deformation" and "Strain" and several other objects. These UML objects were mapped into XML Metadata Interchange (XMI) formats, which were then converted into four XSD Schemas. The schemas were used to create and validate the XML instance documents, and to create a relational database hosting the plate tectonics and seismological data in the Microsoft Access format. The SeismML and TectML allow seismologists and structural geologists, among others, to submit and retrieve structured geological data on the Internet. A seismologist, for example, can submit peer-reviewed and

  16. Content-Aware DataGuide with Incremental Index Update using Frequently Used Paths

    NASA Astrophysics Data System (ADS)

    Sharma, A. K.; Duhan, Neelam; Khattar, Priyanka

    2010-11-01

    The size of the WWW is increasing day by day. Due to the absence of structured data on the Web, it is very difficult for information retrieval tools to fully utilize Web information. XML pages address this problem to some extent by providing structural information to users. Without efficient indexes, however, query processing can be quite inefficient because it requires an exhaustive traversal of the XML data. In this paper an improved content-centric variant of the Content-Aware DataGuide, an indexing technique for XML databases, is proposed that uses frequently used paths from historical query logs to improve query performance. The index can be updated incrementally according to changes in the query workload, so the overhead of reconstruction is minimized. Frequently used paths are extracted by applying a sequential pattern mining algorithm to successive queries in the query workload; the data structures are then incrementally updated. This indexing technique proves to be efficient: partial-matching queries can be executed efficiently, and users get more relevant documents in the results.
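
    As a rough sketch of the idea (not the paper's algorithm), frequently used root-to-node paths can be mined from a query log simply by counting path prefixes and keeping those above a support threshold; these paths then become the candidates for incremental index maintenance. Path strings and the support threshold below are illustrative assumptions.

        from collections import Counter

        def frequent_paths(query_log, min_support):
            """Count every prefix of each queried path and keep the frequent ones."""
            counts = Counter()
            for path in query_log:                    # e.g. "/lib/book/title"
                steps = path.strip("/").split("/")
                for i in range(1, len(steps) + 1):
                    counts["/" + "/".join(steps[:i])] += 1
            return {p: c for p, c in counts.items() if c >= min_support}

        log = ["/lib/book/title", "/lib/book/author", "/lib/book/title", "/lib/journal/issue"]
        print(frequent_paths(log, min_support=3))
        # {'/lib': 4, '/lib/book': 3}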

  17. Document Update and Compare

    NASA Technical Reports Server (NTRS)

    Knoch, C. F.; Caldwell, D. C.; Caldwell, D. L.

    1983-01-01

    The Document Update and Compare programs provide a simple computerized document-maintenance system on the Data General NOVA 840 computer. The Document Update program allows the user to update a document by either batch or terminal input. Documents are modified and lists of the modifications are printed out.

  18. Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers.

    PubMed

    Alonso, Roberto; Monroy, Raúl; Trejo, Luis A

    2016-08-17

    The Domain Name System (DNS) is a critical infrastructure of any network, and, not surprisingly a common target of cybercrime. There are numerous works that analyse higher level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of the recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies on a simple observation: Recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes on DNS social structure. We have built an anomaly-based detection mechanism, which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting) and with it the suitability of our abstraction to detect flooding attacks. To the best of our knowledge, this is the first time that work is successful in using this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions considering this new abstraction, so we have designed and tested two additional experiments which exhibit promising results to detect other types of anomalies in recursive DNS servers.
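
    A minimal sketch of the kind of per-window features the abstract describes follows (illustrative only, not the authors' exact feature set or heuristic): given the (client IP, queried domain) pairs seen in one time window, summarize the bipartite "social" structure so that successive windows can be compared for drastic changes.

        from collections import defaultdict

        def window_features(queries):
            """queries: iterable of (client_ip, domain) pairs seen in one time window."""
            by_ip = defaultdict(set)
            for ip, domain in queries:
                by_ip[ip].add(domain)
            n_ips = len(by_ip)
            n_domains = len({d for ds in by_ip.values() for d in ds})
            n_edges = sum(len(ds) for ds in by_ip.values())
            return {
                "ips": n_ips,
                "domains": n_domains,
                "edges": n_edges,
                # Density of the bipartite IP-domain graph; a flood from many bots
                # hitting few domains (or one victim) shifts this sharply.
                "density": n_edges / (n_ips * n_domains) if n_ips and n_domains else 0.0,
                "avg_domains_per_ip": n_edges / n_ips if n_ips else 0.0,
            }

        normal = [("10.0.0.1", "a.com"), ("10.0.0.2", "b.com"), ("10.0.0.1", "c.com")]
        flood = [(f"10.0.1.{i}", "victim.com") for i in range(200)]
        print(window_features(normal))
        print(window_features(flood))   # density 1.0, avg_domains_per_ip 1.0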

  19. Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers.

    PubMed

    Alonso, Roberto; Monroy, Raúl; Trejo, Luis A

    2016-01-01

    The Domain Name System (DNS) is a critical infrastructure of any network, and, not surprisingly a common target of cybercrime. There are numerous works that analyse higher level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of the recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies on a simple observation: Recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes on DNS social structure. We have built an anomaly-based detection mechanism, which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting) and with it the suitability of our abstraction to detect flooding attacks. To the best of our knowledge, this is the first time that work is successful in using this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions considering this new abstraction, so we have designed and tested two additional experiments which exhibit promising results to detect other types of anomalies in recursive DNS servers. PMID:27548169

  20. The language faculty that wasn't: a usage-based account of natural language recursion

    PubMed Central

    Christiansen, Morten H.; Chater, Nick

    2015-01-01

    In the generative tradition, the language faculty has been shrinking—perhaps to include only the mechanism of recursion. This paper argues that even this view of the language faculty is too expansive. We first argue that a language faculty is difficult to reconcile with evolutionary considerations. We then focus on recursion as a detailed case study, arguing that our ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on domain-general sequence learning abilities. Evidence from genetics, comparative work on non-human primates, and cognitive neuroscience suggests that humans have evolved complex sequence learning skills, which were subsequently pressed into service to accommodate language. Constraints on sequence learning therefore have played an important role in shaping the cultural evolution of linguistic structure, including our limited abilities for processing recursive structure. Finally, we re-evaluate some of the key considerations that have often been taken to require the postulation of a language faculty. PMID:26379567

  1. Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers

    PubMed Central

    Alonso, Roberto; Monroy, Raúl; Trejo, Luis A.

    2016-01-01

    The Domain Name System (DNS) is a critical infrastructure of any network, and, not surprisingly a common target of cybercrime. There are numerous works that analyse higher level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of the recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies on a simple observation: Recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes on DNS social structure. We have built an anomaly-based detection mechanism, which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting) and with it the suitability of our abstraction to detect flooding attacks. To the best of our knowledge, this is the first time that work is successful in using this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions considering this new abstraction, so we have designed and tested two additional experiments which exhibit promising results to detect other types of anomalies in recursive DNS servers. PMID:27548169

  2. Recursive least squares estimation and Kalman filtering by systolic arrays

    NASA Technical Reports Server (NTRS)

    Chen, M. J.; Yao, K.

    1988-01-01

    One of the most promising new directions for high-throughput-rate problems is that based on systolic arrays. In this paper, using the matrix-decomposition approach, a systolic Kalman filter is formulated as a modified square-root information filter consisting of a whitening filter followed by a simple least-squares operation based on the systolic QR algorithm. By proper skewing of the input data, a fully pipelined time- and measurement-update systolic Kalman filter can be achieved with O(n^2) processing cells, resulting in a system throughput rate of O(n).
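
    For reference, a conventional (non-systolic) Kalman filter time and measurement update is sketched below in NumPy; the paper's contribution, mapping an equivalent square-root information form onto an O(n^2) systolic array, is not attempted here. The model matrices in the usage example are arbitrary.

        import numpy as np

        def kalman_step(x, P, z, A, H, Q, R):
            """One time update (prediction) followed by one measurement update."""
            # Time update.
            x_pred = A @ x
            P_pred = A @ P @ A.T + Q
            # Measurement update.
            S = H @ P_pred @ H.T + R                # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # Example: 1D position/velocity (constant-velocity) model.
        A = np.array([[1.0, 1.0], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q, R = 0.01 * np.eye(2), np.array([[1.0]])
        x, P = np.zeros(2), np.eye(2)
        for z in [1.0, 2.1, 2.9, 4.2]:
            x, P = kalman_step(x, P, np.array([z]), A, H, Q, R)
        print(x)  # position/velocity estimate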

  3. [Pharmacovigilance update].

    PubMed

    Diezi, Léonore; Renard, Delphine; Rothuizen, Laura E; Livio, Françoise

    2014-01-15

    The main pharmacovigilance updates in 2013 are reviewed. Nitrofurantoin: lower efficacy and an increased risk of adverse events when creatinine clearance is below 60 ml/min. Dabigatran: contraindicated in patients with mechanical heart valves. Azithromycin: QT prolongation and increased risk of death. Zolpidem: towards a lower dosage. Roflumilast: avoid in patients known or at risk for mood disorders. Retigabine: indication restricted to last-line use and new monitoring requirements after reports of pigment changes in retina and other tissues. Telaprevir and rituximab: severe mucocutaneous reactions. Fingolimod: rare cases of progressive multifocal leucoencephalopathy. Tolvaptan: potential for hepatotoxicity. Nicotinic acid/laropiprant: suspension of marketing authorization as benefits no longer outweigh risks. PMID:24558915

  4. Transparent XML Binding using the ALMA Common Software (ACS) Container/Component Framework

    NASA Astrophysics Data System (ADS)

    Sommer, H.; Chiozzi, G.; Fugate, D.; Sekoranja, M.

    2004-07-01

    ALMA software, from high-level data flow applications down to instrument control, is built using the ACS framework. The common architecture and infrastructure used for the whole ALMA software is presented at this conference in (Schwarz, Farris, & Sommer 2004). ACS offers a CORBA-based container/component model and supports the exchange and persistence of XML data. For the Java programming language, the container integrates transparently the use of type-safe Java binding classes to let applications conveniently work with XML transfer objects without having to parse or serialize them. This paper will show how the ACS container/component architecture serves to pass complex data structures, such as observation meta-data, between heterogeneous applications.

  5. Using XML Configuration-Driven Development to Create a Customizable Ground Data System

    NASA Technical Reports Server (NTRS)

    Nash, Brent; DeMore, Martha

    2009-01-01

    The Mission Data Processing and Control Subsystem (MPCS) is being developed as a multi-mission Ground Data System, with the Mars Science Laboratory (MSL) as the first fully supported mission. MPCS is a fully featured, Java-based Ground Data System (GDS) for telecommand and telemetry processing based on Configuration-Driven Development (CDD). The eXtensible Markup Language (XML) is the ideal language for CDD because it is easily readable and editable by all levels of users and is also backed by a World Wide Web Consortium (W3C) standard and numerous powerful processing tools that make it uniquely flexible. The CDD approach adopted by MPCS minimizes changes to compiled code by using XML to create a series of configuration files that provide both coarse- and fine-grained control over all aspects of GDS operation.

  6. Representation of thermal infrared imaging data in the DICOM using XML configuration files.

    PubMed

    Ruminski, Jacek

    2007-01-01

    The DICOM standard has become a widely accepted and implemented format for the exchange and storage of medical imaging data. Different imaging modalities are supported; however, there is no dedicated solution for thermal infrared imaging in medicine. In this article we propose new ideas and improvements to the final proposal for the new DICOM Thermal Infrared Imaging structures and services. Additionally, we designed, implemented and tested software packages for universal conversion of existing thermal imaging files to the DICOM format using XML configuration files. The proposed solution works fast and requires a minimal number of user interactions. The XML configuration file makes it possible to compose a set of attributes for any source file format of a thermal imaging camera.

  7. Enhancement of the Work in Scia Engineer's Environment by Employment of XML Programming Language

    NASA Astrophysics Data System (ADS)

    Kortiš, Ján

    2015-12-01

    In recent years, the productivity of engineers designing building structures according to the rules of technical standards [1] has been increasing through the use of various software products. These products offer engineers new possibilities for designing different structures. Problems remain, however, especially in the design of structures with similar static schemes, where the same work-steps must be repeated. The work can be made more efficient if these steps are performed automatically, using a programming language to drive the processes carried out by the software. The article presents the design process of a timber structure carried out in the Scia Engineer environment. XML is used to automate the design, and the XML code is modified in the Excel environment using the VBA programming language [2], [3].

  8. Light at Night Markup Language (LANML): XML Technology for Light at Night Monitoring Data

    NASA Astrophysics Data System (ADS)

    Craine, B. L.; Craine, E. R.; Craine, E. M.; Crawford, D. L.

    2013-05-01

    Light at Night Markup Language (LANML) is a standard, based upon XML, useful in acquiring, validating, transporting, archiving and analyzing multi-dimensional light at night (LAN) datasets of any size. The LANML standard can accommodate a variety of measurement scenarios including single spot measures, static time-series, web based monitoring networks, mobile measurements, and airborne measurements. LANML is human-readable, machine-readable, and does not require a dedicated parser. In addition, LANML is flexible, ensuring that future extensions of the format will remain backward compatible with analysis software. The XML technology is at the heart of communicating over the internet and can be equally useful at the desktop level, making this standard particularly attractive for web based applications, educational outreach and efficient collaboration between research groups.

  9. Technical Knowledge Management Platform Based on Ontology Engineering and XML Technology

    NASA Astrophysics Data System (ADS)

    Takafuji, Sunao; Kitamura, Yoshinobu; Mizoguchi, Riichiro

    We propose a new knowledge management system for the manufacturing field, based on ontology engineering and XML technology. The purpose of the system is to support designers in externalizing the implicit functional knowledge of a product, to share that knowledge among those involved in the product's development, and to let any designer use it to collaboratively create or invent new products. There are, of course, many capable design tools, such as CAD/CAE software and simulation tools. Although such tools are indispensable for completing a product, they are insufficient to reveal the designer's rationale, that is, the reason why he or she adopted a particular function, structure, or material. Ontology engineering helps clarify a designer's rationale by describing a function decomposition tree (hereinafter FDT), a tree that shows functional structure based on device and functional ontologies and has the distinctive feature of separating function from the way it is achieved. While this functional modeling framework is independent of any data representation infrastructure, XML has a strong synergy with it because XML can handle the framework's semantics, such as subject, object, and function. Beyond semantics, XML also has the flexibility to compose a document from multiple information fragments, which allows the FDT to adapt to a wide range of manufacturing tasks. From this viewpoint, we created OntoGear, a foundation system that supports designers and developers in authoring, sharing, and using FDTs, built on the xfy technology that JustSystems, Corp. has developed as an XML application development framework.

  10. XTCE and XML Database Evolution and Lessons from JWST, LandSat, and Constellation

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Kreistle, Steven; Fatig. Cirtos; Jones, Ronald

    2008-01-01

    The database organizations within three different NASA projects have advanced current practices by creating database synergy between the various spacecraft life cycle stakeholders and educating users in the benefits of the Consultative Committee for Space Data Systems (CCSDS) XML Telemetric and Command Exchange (XTCE) format. The combination of XML for managing program data and CCSDS XTCE for exchange is a robust approach that will meet all user requirements using standards and non-proprietary tools. COTS tools for XTCE/XML are wide and varied. Combining various low-cost and free tools can be more expensive in the long run than choosing a more expensive COTS tool that meets all the needs; this was especially important when deploying to 32 remote sites with no need for licenses. A common mission XTCE/XML format between dissimilar systems is possible and is not difficult. Command XTCE/XML is more complex than telemetry, and XTCE/XML metadata is needed to describe pages and scripts because of the proprietary nature of most current ground systems. Other mission and science products, such as spacecraft loads, science image catalogs, and mission operation procedures, can all be described with XML as well to increase their flexibility as systems evolve and change. Figure 10 is an example of a spacecraft table load. The word is out and the XTCE community is growing: the first XTCE user group was held in October and, in addition to ESA/ESOC, SC02000, and CNES, identified several systems based on XTCE. The second XTCE user group is scheduled for March 10, 2008, with LDMC and others joining. As experience with XTCE grows and the user community receives the promised benefits of XTCE and XML, interest is growing fast.

  11. Customized Document Validation to Support a Flexible XML-based Knowledge Management Framework

    PubMed Central

    Hanna, Timothy P.; Rocha, Roberto A.; Hulse, Nathan C.; Del Fiol, Guilherme; Bradshaw, Richard L.; Roemer, Lorrie K.

    2005-01-01

    This paper describes a validation architecture used within Intermountain Health Care’s Clinical Knowledge Repository (CKR). The architecture provides additional functionality that complements XML Schema validation, producing user-friendly error messages and enabling validation rules reuse. The validation architecture helps document authors fix their own errors. As a result, less than 1% of all documents in the CKR are considered invalid. PMID:16779048
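
    The complementary-validation idea can be illustrated with a hedged sketch (this is not Intermountain's architecture; it uses the third-party lxml package and an invented schema): XML Schema validation runs first, and a small rule layer then rewrites the raw errors into author-friendly messages.

        from lxml import etree  # third-party: pip install lxml

        SCHEMA = etree.XMLSchema(etree.XML("""
        <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="instruction">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="title" type="xs:string"/>
                <xs:element name="body"  type="xs:string"/>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
        </xs:schema>
        """))

        # Reusable rules that translate raw schema errors into author-friendly text.
        FRIENDLY = {"body": "Every instruction needs a <body> section after the <title>."}

        def validate(doc_text):
            """Return a list of friendly error messages; empty means the document is valid."""
            doc = etree.XML(doc_text)
            if SCHEMA.validate(doc):
                return []
            messages = []
            for err in SCHEMA.error_log:
                hint = next((m for key, m in FRIENDLY.items() if key in err.message), err.message)
                messages.append(f"line {err.line}: {hint}")
            return messages

        print(validate("<instruction><title>Wound care</title></instruction>"))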

  12. XML-based 3D model visualization and simulation framework for dynamic models

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Fishwick, Paul A.

    2002-07-01

    Relatively recent advances in computer technology enable us to create three-dimensional (3D) dynamic models and simulate them within a 3D web environment. The use of such models is especially valuable when teaching simulation and the concepts behind dynamic models, since the models are made more accessible to the students. Students tend to enjoy a construction process in which they are able to employ their own cultural and aesthetic forms. The challenge is to create a language that allows for a grammar for modeling while simultaneously permitting arbitrary presentation styles. For further flexibility, we need an effective way to represent and simulate dynamic models that can be shared by modelers over the Internet. We present an Extensible Markup Language (XML)-based framework that guides a modeler in creating personalized 3D models, visualizing their dynamic behaviors, and simulating the created models. A model author uses XML files to represent the geometry and topology of a dynamic model. The Model Fusion Engine, written in Extensible Stylesheet Language Transformations (XSLT), expedites the modeling process by automating the creation of dynamic models from the user-defined XML files. Modelers can also link simulation programs with a created model to analyze its characteristics. The advantages of this system lie in teaching the modeling and simulation of dynamic models and in visualizing dynamic model behaviors.

  13. Methods for assessing movement path recursion with application to African buffalo in South Africa

    USGS Publications Warehouse

    Bar-David, S.; Bar-David, I.; Cross, P.C.; Ryan, S.J.; Knechtel, C.U.; Getz, W.M.

    2009-01-01

    Recent developments of automated methods for monitoring animal movement, e.g., global positioning systems (GPS) technology, yield high-resolution spatiotemporal data. To gain insights into the processes creating movement patterns, we present two new techniques for extracting information from these data on repeated visits to a particular site or patch ("recursions"). Identification of such patches and quantification of recursion pathways, when combined with patch-related ecological data, should contribute to our understanding of the habitat requirements of large herbivores, of factors governing their space-use patterns, and their interactions with the ecosystem. We begin by presenting output from a simple spatial model that simulates movements of large-herbivore groups based on minimal parameters: resource availability and rates of resource recovery after a local depletion. We then present the details of our new techniques of analyses (recursion analysis and circle analysis) and apply them to data generated by our model, as well as two sets of empirical data on movements of African buffalo (Syncerus caffer): the first collected in Klaserie Private Nature Reserve and the second in Kruger National Park, South Africa. Our recursion analyses of model outputs provide us with a basis for inferring aspects of the processes governing the production of buffalo recursion patterns, particularly the potential influence of resource recovery rate. Although the focus of our simulations was a comparison of movement patterns produced by different resource recovery rates, we conclude our paper with a comprehensive discussion of how recursion analyses can be used when appropriate ecological data are available to elucidate various factors influencing movement. Inter alia, these include the various limiting and preferred resources, parasites, and topographical and landscape factors. © 2009 by the Ecological Society of America.
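
    The core bookkeeping behind a recursion analysis can be sketched simply (a toy illustration, not the authors' recursion or circle analyses): given a GPS track and a candidate patch, defined here as a circle of radius r, count the separate visits, where a visit ends once the animal leaves the circle.

        from math import hypot

        def count_recursions(track, center, radius):
            """Count separate visits of a movement track to a circular patch.

            track  : list of (x, y) positions ordered in time
            center : (x, y) of the patch
            radius : patch radius
            """
            visits, inside = 0, False
            for x, y in track:
                now_inside = hypot(x - center[0], y - center[1]) <= radius
                if now_inside and not inside:
                    visits += 1          # a new entry into the patch
                inside = now_inside
            return visits

        track = [(0, 0), (1, 0), (5, 5), (1, 1), (0, 1), (6, 6), (0, 0)]
        print(count_recursions(track, center=(0, 0), radius=2))  # 3 visits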

  14. The early acquisition of viable knowledge: A use of recursive model as an analytical devise (methodolosocial).

    PubMed

    Karstadt, Lyn; Thomas, Keith Robert; Abed, Shaymaa N

    2016-01-01

    In nurse education, information is typically presented to students in the classroom and then applied in a clinical situation. Acquisition of the knowledge required to inform the student's early practice is the focus of this research. This paper centres upon the construction of a cognitive model that is recursive in nature, and forms an integral part of a qualitative research study. The primary study investigated how first year student nurses use the information received in the classroom to underpin their early practice. Data were collected from 10 students and 4 of their lecturers, via blogs and interviews, and used iteratively to create a model that is recursive in nature. Recursion is a process of repeatedly revisiting the same thing, in this case the data, which are considered in an iterative or progressive way. Recursion thus facilitated the development of a model, which was seen to change and develop in sophistication as more data were considered and evaluated. Visual devices were used throughout to bring clarity during the construction of the model. This visual process was pivotal to the analysis. This paper chronicles the development of an analytical device through the medium of the study presented. Viable knowledge is represented as the synthesis of concepts, as presented in the classroom, and practice, as experienced within the clinical area. It illustrates how conceptual material delivered within the classroom has become embedded within an individual student's consciousness and is used during a clinical placement to make sense of a specific situation. The study identifies how students use information and makes recommendations as to how appropriate curricula integrate all the facets of the recursive model. The process of recursive modelling is thus offered as an analytical device, which may be applied by researchers to other qualitative data. PMID:26586185

  15. Recursive geometric integrators for wave propagation in a functionally graded multilayered elastic medium

    NASA Astrophysics Data System (ADS)

    Wang, Lugen; Rokhlin, S. I.

    2004-11-01

    The differential equations governing transfer and stiffness matrices and acoustic impedance for a functionally graded generally anisotropic magneto-electro-elastic medium have been obtained. It is shown that the transfer matrix satisfies a linear 1st order matrix differential equation, while the stiffness matrix satisfies a nonlinear Riccati equation. For a thin nonhomogeneous layer, approximate solutions with different levels of accuracy have been formulated in the form of a transfer matrix using a geometrical integration in the form of a Magnus expansion. This integration method preserves qualitative features of the exact solution of the differential equation, in particular energy conservation. The wave propagation solution for a thick layer or a multilayered structure of inhomogeneous layers is obtained recursively from the thin layer solutions. Since the transfer matrix solution becomes computationally unstable with increase of frequency or layer thickness, we reformulate the solution in the form of a stable stiffness-matrix solution which is obtained from the relation of the stiffness matrices to the transfer matrices. Using an efficient recursive algorithm, the stiffness matrices of the thin nonhomogeneous layer are combined to obtain the total stiffness matrix for an arbitrary functionally graded multilayered system. It is shown that the round-off error for the stiffness-matrix recursive algorithm is higher than that for the transfer matrices. To optimize the recursive procedure, a computationally stable hybrid method is proposed which first starts the recursive computation with the transfer matrices and then, as the thickness increases, transits to the stiffness matrix recursive algorithm. Numerical results show this solution to be stable and efficient. As an application example, we calculate the surface wave velocity dispersion for a functionally graded coating on a semispace.

  16. A new design for SLAM front-end based on recursive SOM

    NASA Astrophysics Data System (ADS)

    Yang, Xuesi; Xia, Shengping

    2015-12-01

    Aiming at graph-optimization-based monocular SLAM, a novel design for the front-end of single-camera SLAM is proposed, based on the recursive SOM. Pixel intensities are used directly for image registration and motion estimation, which can save time compared with current appearance-based frameworks, which usually include feature extraction and matching. Once a key-frame is identified, a recursive SOM is used to perform loop-closure detection, resulting in a more precise location. An experiment on a public dataset validates our method, giving faster and effective results.

  17. Recursive analytical solution describing artificial satellite motion perturbed by an arbitrary number of zonal terms

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical first order solution has been developed which describes the motion of an artificial satellite perturbed by an arbitrary number of zonal harmonics of the geopotential. A set of recursive relations for the solution, which was deduced from recursive relations of the geopotential, was derived. The method of solution is based on Von-Zeipel's technique applied to a canonical set of two-body elements in the extended phase space which incorporates the true anomaly as a canonical element. The elements are of Poincare type, that is, they are regular for vanishing eccentricities and inclinations. Numerical results show that this solution is accurate to within a few meters after 500 revolutions.

  18. Student Monks--Teaching Recursion in an IS or CS Programming Course Using the Towers of Hanoi

    ERIC Educational Resources Information Center

    Benander, Alan C.; Benander, Barbara A.

    2008-01-01

    Educators have been using the Towers of Hanoi problem for many years as an example of a problem that has a very elegant recursive solution. However, the elegance and conciseness of this solution can make it difficult for students to understand the amount of computer time required in the execution of this solution. And, like many recursive computer…
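
    The classic recursive solution and its cost are easy to state: moving n disks takes 2^n - 1 moves, which makes the execution-time point in the abstract concrete. A minimal sketch:

        def hanoi(n, source, target, spare, moves):
            """Move n disks from source to target using spare; record each move."""
            if n == 0:
                return
            hanoi(n - 1, source, spare, target, moves)   # clear the way
            moves.append((source, target))               # move the largest disk
            hanoi(n - 1, spare, target, source, moves)   # put the rest back on top

        for n in (3, 10, 20):
            moves = []
            hanoi(n, "A", "C", "B", moves)
            print(n, len(moves))   # 7, 1023, 1048575  == 2**n - 1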

  19. Closed-form recursive formula for an optimal tracker with terminal constraints

    NASA Technical Reports Server (NTRS)

    Juang, J.-N.; Turner, J. D.; Chun, H. M.

    1984-01-01

    Feedback control laws are derived for a class of optimal finite time tracking problems with terminal constraints. Analytical solutions are obtained for the feedback gain and the closed-loop response trajectory. Such formulations are expressed in recursive forms so that a real-time computer implementation becomes feasible. Two examples are given to illustrate the validity and usefulness of the formulations.
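
    As a generic illustration of a recursive, precomputable feedback law (a standard finite-horizon LQ regulator, not the paper's terminally constrained tracker), the time-varying gains are obtained by a backward Riccati recursion and then applied forward in real time. The model matrices and weights below are arbitrary assumptions.

        import numpy as np

        def lq_gains(A, B, Q, R, Qf, N):
            """Backward recursion for time-varying finite-horizon LQR gains."""
            P = Qf
            gains = []
            for _ in range(N):
                K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # feedback gain
                P = Q + A.T @ P @ (A - B @ K)                        # Riccati step
                gains.append(K)
            return list(reversed(gains))     # gains[k] is used at time step k

        # Double integrator driven toward the origin over N steps.
        A = np.array([[1.0, 0.1], [0.0, 1.0]])
        B = np.array([[0.0], [0.1]])
        Q, R, Qf = np.eye(2), np.array([[0.1]]), 100 * np.eye(2)
        Ks = lq_gains(A, B, Q, R, Qf, N=50)
        x = np.array([1.0, 0.0])
        for K in Ks:
            x = A @ x + B @ (-K @ x)
        print(x)   # near the origin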

  20. Recursive Frame Analysis: Reflections on the Development of a Qualitative Research Method

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford

    2012-01-01

    The origin of recursive frame analysis (RFA) is revisited and discussed as a postmodern alternative to modernist therapeutic models and research methods that foster hegemony of a preferred therapeutic metaphor, narrative, or strategy. It encourages improvisational performance while enabling a means of scoring the change and movement of the…

  1. Split-remerge method for eliminating processing window artifacts in recursive hierarchical segmentation

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Inventor)

    2010-01-01

    A method, computer readable storage, and apparatus for implementing recursive segmentation of data with spatial characteristics into regions, including splitting-remerging of pixels with contiguous region designations and a user-controlled parameter for providing a preference for merging adjacent regions to eliminate window artifacts.

  2. On recursive edit distance kernels with application to time series classification.

    PubMed

    Marteau, Pierre-Francois; Gibet, Sylvie

    2015-06-01

    This paper proposes some extensions to the work on kernels dedicated to string or time series global alignment based on the aggregation of scores obtained by local alignments. The extensions that we propose allow us to construct, from classical recursive definition of elastic distances, recursive edit distance (or time-warp) kernels that are positive definite if some sufficient conditions are satisfied. The sufficient conditions we end up with are original and weaker than those proposed in earlier works, although a recursive regularizing term is required to get proof of the positive definiteness as a direct consequence of the Haussler's convolution theorem. Furthermore, the positive definiteness is maintained when a symmetric corridor is used to reduce the search space, and thus the algorithmic complexity, which is quadratic in the worst case. The classification experiment we conducted on three classical time-warp distances (two of which are metrics), using support vector machine classifier, leads to the conclusion that when the pairwise distance matrix obtained from the training data is far from definiteness, the positive definite recursive elastic kernels outperform in general the distance substituting kernels for several classical elastic distances we have tested.
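
    The flavor of a recursively defined, positive-definite elastic kernel can be sketched as follows; this is in the spirit of a global-alignment kernel that sums local-similarity products over all monotone alignments, not the specific regularized kernels of the paper, and the Gaussian local kernel and sample series are assumptions.

        import math

        def local_kernel(a, b, sigma=1.0):
            """Positive definite local similarity between two samples."""
            return math.exp(-((a - b) ** 2) / (2 * sigma ** 2))

        def alignment_kernel(x, y, sigma=1.0):
            """Sum the product of local similarities over all monotone alignments.

            K[i][j] accumulates contributions from its three predecessor cells,
            mirroring the classical edit-distance / time-warp recursion.
            """
            n, m = len(x), len(y)
            K = [[0.0] * (m + 1) for _ in range(n + 1)]
            K[0][0] = 1.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    K[i][j] = local_kernel(x[i - 1], y[j - 1], sigma) * (
                        K[i - 1][j] + K[i][j - 1] + K[i - 1][j - 1]
                    )
            return K[n][m]

        a = [0.0, 0.5, 1.0, 0.5]
        b = [0.0, 0.4, 1.1, 0.6, 0.5]
        print(alignment_kernel(a, b))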

  3. Evapotranspiration: Measured with a lysimeter vs. calculated with a recursive method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, a recursive combination method (RCM) to calculate potential and crop evapotranspiration (ET) was given by Lascano and Van Bavel (Agron. J. 2007, 99:585-590) that differs from the Penman-Monteith (PM) method. The main difference between the two methods is that the assumptions made regarding...

  4. Quantitative genetic models for describing simultaneous and recursive relationships between phenotypes.

    PubMed Central

    Gianola, Daniel; Sorensen, Daniel

    2004-01-01

    Multivariate models are of great importance in theoretical and applied quantitative genetics. We extend quantitative genetic theory to accommodate situations in which there is linear feedback or recursiveness between the phenotypes involved in a multivariate system, assuming an infinitesimal, additive, model of inheritance. It is shown that structural parameters defining a simultaneous or recursive system have a bearing on the interpretation of quantitative genetic parameter estimates (e.g., heritability, offspring-parent regression, genetic correlation) when such features are ignored. Matrix representations are given for treating a plethora of feedback-recursive situations. The likelihood function is derived, assuming multivariate normality, and results from econometric theory for parameter identification are adapted to a quantitative genetic setting. A Bayesian treatment with a Markov chain Monte Carlo implementation is suggested for inference and developed. When the system is fully recursive, all conditional posterior distributions are in closed form, so Gibbs sampling is straightforward. If there is feedback, a Metropolis step may be embedded for sampling the structural parameters, since their conditional distributions are unknown. Extensions of the model to discrete random variables and to nonlinear relationships between phenotypes are discussed. PMID:15280252

  5. A Cognitive Processing Account of Individual Differences in Novice Logo Programmers' Conceptualisation and Use of Recursion.

    ERIC Educational Resources Information Center

    Gibbons, Pamela

    1995-01-01

    Describes a study that investigated individual differences in the construction of mental models of recursion in LOGO programming. The learning process was investigated from the perspective of Norman's mental models theory and employed diSessa's ontology regarding distributed, functional, and surrogate mental models, and the Luria model of brain…

  6. User's Guide for the Precision Recursive Estimator for Ephemeris Refinement (PREFER)

    NASA Technical Reports Server (NTRS)

    Gibbs, B. P.

    1982-01-01

    PREFER is a recursive orbit determination program which is used to refine the ephemerides produced by a batch least squares program (e.g., GTDS). It is intended to be used primarily with GTDS and, thus, is compatible with some of the GTDS input/output files.

  7. Analysis of litter size and average litter weight in pigs using a recursive model.

    PubMed

    Varona, Luis; Sorensen, Daniel; Thompson, Robin

    2007-11-01

    An analysis of litter size and average piglet weight at birth in Landrace and Yorkshire using a standard two-trait mixed model (SMM) and a recursive mixed model (RMM) is presented. The RMM establishes a one-way link from litter size to average piglet weight. It is shown that there is a one-to-one correspondence between the parameters of SMM and RMM and that they generate equivalent likelihoods. As parameterized in this work, the RMM tests for the presence of a recursive relationship between additive genetic values, permanent environmental effects, and specific environmental effects of litter size, on average piglet weight. The equivalent standard mixed model tests whether or not the covariance matrices of the random effects have a diagonal structure. In Landrace, posterior predictive model checking supports a model without any form of recursion or, alternatively, a SMM with diagonal covariance matrices of the three random effects. In Yorkshire, the same criterion favors a model with recursion at the level of specific environmental effects only, or, in terms of the SMM, the association between traits is shown to be exclusively due to an environmental (negative) correlation. It is argued that the choice between a SMM or a RMM should be guided by the availability of software, by ease of interpretation, or by the need to test a particular theory or hypothesis that may best be formulated under one parameterization and not the other.

  8. The Recursive Process in and of Critical Literacy: Action Research in an Urban Elementary School

    ERIC Educational Resources Information Center

    Cooper, Karyn; White, Robert E.

    2012-01-01

    This paper provides an overview of the recursive process of initiating an action research project on literacy for students-at-risk in a Canadian urban elementary school. As this paper demonstrates, this requires development of a school-wide framework, which frames the action research project and desired outcomes, and a shared ownership of this…

  9. Fuzzy logic recursive change detection for tracking and denoising of video sequences

    NASA Astrophysics Data System (ADS)

    Zlokolica, Vladimir; De Geyter, Matthias; Schulte, Stefan; Pizurica, Aleksandra; Philips, Wilfried; Kerre, Etienne

    2005-03-01

    In this paper we propose a fuzzy logic recursive scheme for motion detection and temporal filtering that can deal with Gaussian noise and unsteady illumination conditions in both the temporal and spatial directions. Our focus is on applications concerning tracking and denoising of image sequences. We process a noisy input sequence with fuzzy logic motion detection in order to determine the degree of motion confidence. The proposed motion detector combines the membership degrees appropriately using defined fuzzy rules, where the membership degree of motion for each pixel in a 2D sliding window is determined by the proposed membership function. Both the fuzzy membership function and the fuzzy rules are defined in such a way that the performance of the motion detector is optimized in terms of its robustness to noise and unsteady lighting conditions. We simultaneously perform tracking and recursive adaptive temporal filtering, where the amount of filtering is inversely proportional to the confidence in the existence of motion. Finally, the temporally filtered frames are further processed by the proposed spatial filter to obtain the denoised image sequence. The main contribution of this paper is the robust novel fuzzy recursive scheme for motion detection and temporal filtering. We evaluate the proposed motion detection algorithm using two criteria: robustness to noise and to changing illumination conditions, and motion blur in temporal recursive denoising. Additionally, we make comparisons in terms of noise reduction with other state-of-the-art video denoising techniques.
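
    The recursive temporal filter described above can be sketched in a simplified scalar form (illustrative only; the paper's fuzzy membership functions and rules are not reproduced): each pixel is blended with its previous filtered value, and the blend weight grows with the motion confidence so that moving areas are filtered less. The parameter names and the crude confidence estimate are assumptions.

        import numpy as np

        def recursive_temporal_filter(frames, alpha_static=0.15, noise_scale=20.0):
            """Motion-adaptive recursive temporal filtering of a grayscale sequence.

            frames       : array of shape (T, H, W)
            alpha_static : blend weight used where no motion is detected
            noise_scale  : difference magnitude mapped to full motion confidence
            """
            out = np.empty_like(frames, dtype=float)
            out[0] = frames[0]
            for t in range(1, len(frames)):
                # Crude motion confidence in [0, 1] from the frame difference;
                # the paper derives this from fuzzy rules over a sliding window.
                diff = np.abs(frames[t].astype(float) - out[t - 1])
                confidence = np.clip(diff / noise_scale, 0.0, 1.0)
                # Less temporal averaging where motion confidence is high.
                alpha = alpha_static + (1.0 - alpha_static) * confidence
                out[t] = alpha * frames[t] + (1.0 - alpha) * out[t - 1]
            return out

        # Usage on a synthetic noisy sequence.
        rng = np.random.default_rng(1)
        clean = np.zeros((10, 32, 32))
        noisy = clean + 5 * rng.normal(size=clean.shape)
        print(recursive_temporal_filter(noisy).std(), noisy.std())  # reduced noise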

  10. Web-Based Delivery System for Disaster Prevention Information Using a New Jma Dpi Xml Format and Amedas Data

    NASA Astrophysics Data System (ADS)

    Nishio, M.; Mori, M.

    2012-07-01

    The Automated Meteorological Data Acquisition System (AMeDAS) data is used as compound disaster information for a geographic information system (GIS) by integrating it with the Japan Meteorological Agency (JMA) disaster prevention information XML data. The JMA XML format is a next-generation format that contains weather warnings, tsunami warnings, earthquake information, etc. However, the disaster prevention information XML data and the AMeDAS data cannot be read directly into the GIS system. Therefore, development of a program that converts the data structure is important for consolidating a variety of disaster prevention information on the GIS system. Information on escape routes, evacuation sites, etc. was given as points, and regional meteorological observations and forecasts from the AMeDAS data were integrated with the disaster prevention information XML data for the area where the disaster occurred, giving the extent of damage and a damage level. There are two main aims: the first is to deliver these compound data, the disaster prevention information XML data and the AMeDAS data, via the Internet; the second is to provide GIS files (shapefile format) of these data to users such as local governments for their own analysis. It was furthermore confirmed that a system using WebGIS (Google Maps) and open source GIS software can be constructed to monitor disaster information at low cost.

  11. An Assistant System for Dairy Works Using CBR Based on XML

    NASA Astrophysics Data System (ADS)

    Yasumura, Yoshiaki; Suzuki, Sachiko; Nitta, Katsumi

    This paper introduces an agent system that supports a user's daily work on the Internet like a secretary. In this system, an agent is assigned to a user and receives requests from the user or from other agents. Since there are various kinds of requests, it is difficult to prepare a complete set of request-handling rules in advance. In order to handle various requests, the agent uses Case-Based Reasoning (CBR), an approach that solves a problem by referring to old cases. Requests and cases are described as XML documents that are easily understandable for both the user and the agent, because the agent needs to interact with them. Describing documents in XML enables an agent to match a request and a case more exactly. The agent consists of a request receipt module, a planning module, an executing module, and a case storage module. The request receipt module receives a request from the user or other agents; the request is described as an XML document through interaction with the user. The planning module searches for an old case similar to the request and generates a sequence of basic operations as a plan by referring to that case. The executing module executes the plan; if the agent fails to execute a basic operation or requires the user's instruction, it carries out the plan by interacting with the user. The case storage module stores the new case, with the user's evaluation score, in the case base. The experimental results show that increasing the number of cases raises the proposal rate and the accuracy rate. However, too many cases may cause a decline in the accuracy rate due to inconsistencies in the user's evaluations.

  12. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed for another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and turbine-blade data produced by an independent blade design program (UD0300).
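    A minimal Python sketch of the kind of data interchange described above is shown below: one helper serializes a flat set of design values into an XML text string, and another reads the string back and extracts only the items a downstream application needs. The element names and example values are invented for illustration and are not taken from the Lapin work.

    ```python
    import xml.etree.ElementTree as ET

    def to_xml_string(name, values):
        """Serialize a flat dict of design data into an XML text string."""
        root = ET.Element("dataset", attrib={"name": name})
        for key, val in values.items():
            item = ET.SubElement(root, "item", attrib={"key": key})
            item.text = str(val)
        return ET.tostring(root, encoding="unicode")

    def from_xml_string(xml_text, wanted=None):
        """Extract all (or a selected subset of) items from the XML string."""
        root = ET.fromstring(xml_text)
        items = {e.get("key"): e.text for e in root.findall("item")}
        return items if wanted is None else {k: items[k] for k in wanted}

    # one application writes, another reads only what it needs
    xml_text = to_xml_string("blade_row", {"chord_m": 0.12, "stagger_deg": 31.5})
    subset = from_xml_string(xml_text, wanted=["chord_m"])
    ```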

  13. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed for another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and turbine-blade data produced by an independent blade design program (UD0300).

  14. Agent Based Manufacturing Capability Assessment in the Extended Enterprise Using Step AP224 and XML

    NASA Astrophysics Data System (ADS)

    Ratchev, Svetan; Medani, Omar

    Data exchange in the extended enterprise is one of the most critical tasks in supporting collaborative decision-making. Companies often rely on geographically distributed suppliers for efficient product design and manufacture. Early design assessment can substantially reduce the cost of product development and production. This research proposes a new STEP AP224 EXPRESS based data model to facilitate the exchange of part and process data between distributed key agents in the early design process. The approach is illustrated using a prototype XML/CORBA environment to support the information exchange between collaborating design and manufacturing agents.

  15. A Transcription System from MusicXML Format to Braille Music Notation

    NASA Astrophysics Data System (ADS)

    Goto, D.; Gotoh, T.; Minamikawa-Tachino, R.; Tamura, N.

    2006-12-01

    The Internet enables us to freely access music as recorded sound and even as music scores. For the visually impaired, music scores must be transcribed from computer-based musical formats into Braille music notation. This paper proposes a transcription system from the MusicXML format to Braille music notation using a structural model of Braille music notation. The resulting Braille scores, inspected by volunteer transcribers, meet the international standard. Using this simple and efficient transcription system, it should be possible to provide Braille music scores to the visually impaired via the Internet.

  16. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. PMID:21357413
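    To make the intermediary step concrete, the Python sketch below writes a small measurement annotation to an XML file. The element and attribute names are simplified placeholders chosen for illustration; the actual AIM schema is considerably richer and is not reproduced here.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical element and attribute names -- not the real AIM schema.
    annotation = ET.Element("ImageAnnotation", attrib={"uid": "1.2.3.4"})
    ET.SubElement(annotation, "Finding", attrib={"codeMeaning": "pulmonary nodule"})
    meas = ET.SubElement(annotation, "Measurement",
                         attrib={"name": "longest diameter", "unit": "mm"})
    meas.text = "8.4"

    # write the structured data so reporting software (or a database) can pick it up
    ET.ElementTree(annotation).write("annotation.xml",
                                     encoding="utf-8", xml_declaration=True)
    ```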

  17. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research.

  18. Distribution of immunodeficiency fact files with XML – from Web to WAP

    PubMed Central

    Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno

    2005-01-01

    Background Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables the creation of open-source databases for storage, maintenance and delivery on different platforms. Methods Here we present a new data model called the fact file and an XML-based specification, the Inherited Disease Markup Language (IDML), that were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. Results IDML and fact files were used to build a comprehensive Web- and WAP-accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at . A fact file is a user-oriented interface, which serves as a starting point for exploring information on hereditary diseases. Conclusion The IDML enables the seamless integration and presentation of genetic and disease information resources on the Internet. IDML can be used to build information services for all kinds of inherited diseases. The open-source specification and related programs are available at . PMID:15978138

  19. MASCOT HTML and XML parser: an implementation of a novel object model for protein identification data.

    PubMed

    Yang, Chunguang G; Granite, Stephen J; Van Eyk, Jennifer E; Winslow, Raimond L

    2006-11-01

    Protein identification using MS is an important technique in proteomics as well as a major generator of proteomics data. We have designed the protein identification data object model (PDOM) and developed a parser based on this model to facilitate the analysis and storage of these data. The parser works with HTML or XML files saved or exported from MASCOT MS/MS ions search in peptide summary report or MASCOT PMF search in protein summary report. The program creates PDOM objects, eliminates redundancy in the input file, and has the capability to output any PDOM object to a relational database. This program facilitates additional analysis of MASCOT search results and aids the storage of protein identification information. The implementation is extensible and can serve as a template to develop parsers for other search engines. The parser can be used as a stand-alone application or can be driven by other Java programs. It is currently being used as the front end for a system that loads HTML and XML result files of MASCOT searches into a relational database. The source code is freely available at http://www.ccbm.jhu.edu and the program uses only free and open-source Java libraries.
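    The abstract does not spell out the PDOM classes, so the Python sketch below only mirrors the general workflow it describes: parse an XML result file into simple objects, drop redundant entries, and write the objects to a relational database. Tag and attribute names are invented and do not correspond to the actual MASCOT export schema.

    ```python
    import sqlite3
    import xml.etree.ElementTree as ET
    from dataclasses import dataclass

    @dataclass
    class ProteinHit:          # simplified stand-in for a PDOM-style object
        accession: str
        score: float

    def parse_hits(xml_text):
        """Parse <hit accession=... score=...> elements (illustrative tags only)
        and eliminate redundant entries from the input."""
        root = ET.fromstring(xml_text)
        seen, hits = set(), []
        for e in root.iter("hit"):
            if e.get("accession") in seen:
                continue
            seen.add(e.get("accession"))
            hits.append(ProteinHit(e.get("accession"), float(e.get("score"))))
        return hits

    def store(hits, db_path=":memory:"):
        """Write the parsed objects to a relational database table."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS hit (accession TEXT, score REAL)")
        con.executemany("INSERT INTO hit VALUES (?, ?)",
                        [(h.accession, h.score) for h in hits])
        con.commit()
        return con

    xml_text = '<result><hit accession="P12345" score="87.2"/><hit accession="P12345" score="87.2"/></result>'
    con = store(parse_hits(xml_text))
    ```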

  20. Use of XML and Java for collaborative petroleum reservoir modeling on the Internet

    USGS Publications Warehouse

    Victorine, J.; Watney, W.L.; Bhattacharya, S.

    2005-01-01

    GEMINI (Geo-Engineering Modeling through INternet Informatics) is public-domain, web-based freeware made up of an integrated suite of 14 Java-based software tools for on-line, real-time geologic and engineering reservoir modeling. GEMINI facilitates distant collaborations for small-company and academic clients, supporting analyses of both single and multiple wells. The system operates on a single server and an enterprise database; external data sets must be uploaded into this database. Feedback from GEMINI users provided the impetus to develop Stand Alone Web Start Applications of GEMINI modules that reside in and operate from the user's PC. In this version, the GEMINI modules run as applets, which may reside on local user PCs, on the server, or be launched via Java Web Start. In this enhanced version, XML-based data handling procedures are used to access data from remote and local databases and to save results for later access and analyses. The XML data handling process also integrates the different stand-alone GEMINI modules, enabling the user(s) to access multiple databases. It gives the user the flexibility to customize the analytical approach, database location, and level of collaboration. An example integrated field study using GEMINI modules and Stand Alone Web Start Applications is provided to demonstrate the versatile applicability of this freeware for cost-effective reservoir modeling. © 2005 Elsevier Ltd. All rights reserved.

  1. Recursive neural networks for processing graphs with labelled edges: theory and applications.

    PubMed

    Bianchini, M; Maggini, M; Sarti, L; Scarselli, F

    2005-10-01

    In this paper, we introduce a new recursive neural network model able to process directed acyclic graphs with labelled edges. The model uses a state transition function which considers the edge labels and is independent both from the number and the order of the children of each node. The computational capabilities of the new recursive architecture are assessed. Moreover, in order to test the proposed architecture on a practical challenging application, the problem of object detection in images is also addressed. In fact, the localization of target objects is a preliminary step in any recognition system. The proposed technique is general and can be applied in different detection systems, since it does not exploit any a priori knowledge on the particular problem. Some experiments on face detection, carried out on scenes acquired by an indoor camera, are reported, showing very promising results. PMID:16181770
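    The abstract's key architectural point is that the state transition aggregates edge-label-dependent contributions from the children, so the result does not depend on how many children a node has or on their order. The Python sketch below implements that idea for a toy DAG; the dimensions, weight matrices and node labels are arbitrary assumptions, and no training step is shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    STATE = 4
    # one transition matrix per possible edge label (assumption for this sketch)
    W = {"a": rng.standard_normal((STATE, STATE)),
         "b": rng.standard_normal((STATE, STATE))}
    W_in = rng.standard_normal((STATE, 2))   # node labels are 2-dimensional here

    def node_state(node, graph, labels, cache):
        """State of a node = tanh(W_in @ label + sum over labelled edges of
        W[edge] @ child state). The sum makes the transition independent of
        the number and order of the children."""
        if node in cache:
            return cache[node]
        acc = W_in @ labels[node]
        for edge_label, child in graph.get(node, []):
            acc = acc + W[edge_label] @ node_state(child, graph, labels, cache)
        cache[node] = np.tanh(acc)
        return cache[node]

    # toy DAG: root -(a)-> n1, root -(b)-> n2, n1 -(a)-> n2
    graph = {"root": [("a", "n1"), ("b", "n2")], "n1": [("a", "n2")]}
    labels = {n: rng.standard_normal(2) for n in ["root", "n1", "n2"]}
    root_state = node_state("root", graph, labels, {})
    ```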

  2. Berends-Giele recursions and the BCJ duality in superspace and components

    NASA Astrophysics Data System (ADS)

    Mafra, Carlos R.; Schlotterer, Oliver

    2016-03-01

    The recursive method of Berends and Giele to compute tree-level gluon amplitudes is revisited using the framework of ten-dimensional super Yang-Mills. First, we prove that the pure spinor formula to compute SYM tree amplitudes derived in 2010 reduces to the standard Berends-Giele formula from the 80s when restricted to gluon amplitudes and additionally determine the fermionic completion. Second, using BRST cohomology manipulations in superspace, alternative representations of the component amplitudes are explored and the Bern-Carrasco-Johansson relations among partial tree amplitudes are derived in a novel way. Finally, it is shown how the supersymmetric components of manifestly local BCJ-satisfying tree-level numerators can be computed in a recursive fashion.

  3. Non-recursive augmented Lagrangian algorithms for the forward and inverse dynamics of constrained flexible multibodies

    NASA Technical Reports Server (NTRS)

    Bayo, Eduardo; Ledesma, Ragnar

    1993-01-01

    A technique is presented for solving the inverse dynamics of flexible planar multibody systems. This technique yields the non-causal joint efforts (inverse dynamics) as well as the internal states (inverse kinematics) that produce a prescribed nominal trajectory of the end effector. A non-recursive global Lagrangian approach is used in formulating the equations of motion as well as in solving the inverse dynamics equations. In contrast to the recursive method presented previously, the proposed method solves the inverse problem in a systematic and direct manner for both open-chain and closed-chain configurations. Numerical simulation shows that the proposed procedure provides excellent tracking of the desired end effector trajectory.

  4. A Note on Local Stability Conditions for Two Types of Monetary Models with Recursive Utility

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kenji; Utsunomiya, Hitoshi

    2009-09-01

    This note explores local stability conditions for money-in-utility-function (MIUF) and transaction-costs (TC) models with recursive utility. Although Chen et al. [Chen, B.-L., M. Hsu, and C.-H. Lin, 2008, Inflation and growth: impatience and a qualitative equivalent, Journal of Money, Credit, and Banking, Vol. 40, No. 6, 1310-1323] investigated the relationship between inflation and growth in MIUF and TC models with recursive utility, they conducted only a comparative static analysis in a steady state. By establishing sufficient conditions for local stability, this note proves that impatience should be increasing in consumption and real balances. Increasing impatience, although less plausible from an empirical point of view, receives more support from a theoretical viewpoint.

  5. A recursive vesicle-based model protocell with a primitive model cell cycle

    PubMed Central

    Kurihara, Kensuke; Okura, Yusaku; Matsuo, Muneyuki; Toyota, Taro; Suzuki, Kentaro; Sugawara, Tadashi

    2015-01-01

    Self-organized lipid structures (protocells) have been proposed as an intermediate between nonliving material and cellular life. Synthetic production of model protocells can demonstrate the potential processes by which living cells first arose. While we have previously described a giant vesicle (GV)-based model protocell in which amplification of DNA was linked to self-reproduction, the ability of a protocell to recursively self-proliferate for multiple generations has not been demonstrated. Here we show that newborn daughter GVs can be restored to the status of their parental GVs by pH-induced vesicular fusion of daughter GVs with conveyer GVs filled with depleted substrates. We describe a primitive model cell cycle comprising four discrete phases (ingestion, replication, maturity and division), each of which is selectively activated by a specific external stimulus. The production of recursive self-proliferating model protocells represents a step towards eventual production of model protocells that are able to mimic evolution. PMID:26418735

  6. Memory efficient and constant time 2D-recursive spatial averaging filter for embedded implementations

    NASA Astrophysics Data System (ADS)

    Gan, Qifeng; Seoud, Lama; Ben Tahar, Houssem; Langlois, J. M. Pierre

    2016-04-01

    Spatial Averaging Filters (SAFs) are extensively used in image processing for image smoothing and denoising. The latest implementations already achieve constant-time computational complexity regardless of kernel size. However, all the existing O(1) algorithms require additional memory for temporary data storage. In order to minimize memory usage in embedded systems, we introduce a new two-dimensional recursive SAF. It uses previously computed pixel values along both rows and columns to calculate the current one, and it achieves constant-time computational complexity without any additional memory. Experimental comparisons with previous SAF implementations show that the proposed 2D-recursive SAF does not require any additional memory while offering a computational time similar to that of the most efficient existing SAF algorithm. These features make it especially suitable for embedded systems with limited memory capacity.
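    The abstract does not disclose the exact update rule, so the Python sketch below shows only the basic building block of constant-time averaging: a 1-D running average in which each output reuses the previous window sum, so the per-pixel cost is independent of the radius. The paper's contribution, an in-place 2-D recursion along rows and columns, is not reproduced here.

    ```python
    import numpy as np

    def recursive_row_average(row, radius):
        """1-D running average computed recursively: each output reuses the
        previous sum, so the cost per pixel is constant regardless of radius."""
        n = len(row)
        out = np.empty(n)
        window = row[:radius + 1].sum()      # initial (truncated) window
        count = radius + 1
        for i in range(n):
            out[i] = window / count
            if i + radius + 1 < n:           # pixel entering the window
                window += row[i + radius + 1]
                count += 1
            if i - radius >= 0:              # pixel leaving the window
                window -= row[i - radius]
                count -= 1
        return out

    smoothed = recursive_row_average(np.arange(10, dtype=float), radius=2)
    ```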

  7. Parsing recursive sentences with a connectionist model including a neural stack and synaptic gating.

    PubMed

    Fedor, Anna; Ittzés, Péter; Szathmáry, Eörs

    2011-02-21

    It is supposed that humans are genetically predisposed to be able to recognize sequences of context-free grammars with centre-embedded recursion while other primates are restricted to the recognition of finite state grammars with tail-recursion. Our aim was to construct a minimalist neural network that is able to parse artificial sentences of both grammars in an efficient way without using the biologically unrealistic backpropagation algorithm. The core of this network is a neural stack-like memory where the push and pop operations are regulated by synaptic gating on the connections between the layers of the stack. The network correctly categorizes novel sentences of both grammars after training. We suggest that the introduction of the neural stack memory will turn out to be substantial for any biological 'hierarchical processor' and the minimalist design of the model suggests a quest for similar, realistic neural architectures.

  8. A recursive vesicle-based model protocell with a primitive model cell cycle.

    PubMed

    Kurihara, Kensuke; Okura, Yusaku; Matsuo, Muneyuki; Toyota, Taro; Suzuki, Kentaro; Sugawara, Tadashi

    2015-01-01

    Self-organized lipid structures (protocells) have been proposed as an intermediate between nonliving material and cellular life. Synthetic production of model protocells can demonstrate the potential processes by which living cells first arose. While we have previously described a giant vesicle (GV)-based model protocell in which amplification of DNA was linked to self-reproduction, the ability of a protocell to recursively self-proliferate for multiple generations has not been demonstrated. Here we show that newborn daughter GVs can be restored to the status of their parental GVs by pH-induced vesicular fusion of daughter GVs with conveyer GVs filled with depleted substrates. We describe a primitive model cell cycle comprising four discrete phases (ingestion, replication, maturity and division), each of which is selectively activated by a specific external stimulus. The production of recursive self-proliferating model protocells represents a step towards eventual production of model protocells that are able to mimic evolution. PMID:26418735

  9. A recursive vesicle-based model protocell with a primitive model cell cycle

    NASA Astrophysics Data System (ADS)

    Kurihara, Kensuke; Okura, Yusaku; Matsuo, Muneyuki; Toyota, Taro; Suzuki, Kentaro; Sugawara, Tadashi

    2015-09-01

    Self-organized lipid structures (protocells) have been proposed as an intermediate between nonliving material and cellular life. Synthetic production of model protocells can demonstrate the potential processes by which living cells first arose. While we have previously described a giant vesicle (GV)-based model protocell in which amplification of DNA was linked to self-reproduction, the ability of a protocell to recursively self-proliferate for multiple generations has not been demonstrated. Here we show that newborn daughter GVs can be restored to the status of their parental GVs by pH-induced vesicular fusion of daughter GVs with conveyer GVs filled with depleted substrates. We describe a primitive model cell cycle comprising four discrete phases (ingestion, replication, maturity and division), each of which is selectively activated by a specific external stimulus. The production of recursive self-proliferating model protocells represents a step towards eventual production of model protocells that are able to mimic evolution.

  10. 2-D impulse noise suppression by recursive gaussian maximum likelihood estimation.

    PubMed

    Chen, Yang; Yang, Jian; Shu, Huazhong; Shi, Luyao; Wu, Jiasong; Luo, Limin; Coatrieux, Jean-Louis; Toumoulin, Christine

    2014-01-01

    An effective approach termed Recursive Gaussian Maximum Likelihood Estimation (RGMLE) is developed in this paper to suppress 2-D impulse noise. Two algorithms, RGMLE-C and RGMLE-CS, are derived using spatially adaptive variances, which are estimated from certainty information and from joint certainty and similarity information, respectively. To give a reliable implementation of the RGMLE-C and RGMLE-CS algorithms, a novel recursion-stopping strategy is proposed that evaluates the estimation error of uncorrupted pixels. Numerical experiments at different noise densities show that the proposed algorithms lead to significantly better results than some typical median-type filters. Efficient implementation is also realized via GPU (Graphics Processing Unit)-based parallelization techniques.

  11. Closed-form recursive formula for an optimal tracker with terminal constraints

    NASA Technical Reports Server (NTRS)

    Juang, J. N.; Turner, J. D.; Chun, H. M.

    1986-01-01

    Feedback control laws are derived for a class of optimal finite time tracking problems with terminal constraints. Analytical solutions are obtained for the feedback gain and the closed-loop response trajectory. Such formulations are expressed in recursive forms so that a real-time computer implementation becomes feasible. An example involving the feedback slewing of a flexible spacecraft is given to illustrate the validity and usefulness of the formulations.

  12. Recursive Estimation of the Stein Center of SPD Matrices & its Applications*

    PubMed Central

    Salehian, Hesamoddin; Cheng, Guang; Ho, Jeffrey

    2014-01-01

    Symmetric positive-definite (SPD) matrices are ubiquitous in Computer Vision, Machine Learning and Medical Image Analysis. Finding the center/average of a population of such matrices is a common theme in many algorithms such as clustering, segmentation, principal geodesic analysis, etc. The center of a population of such matrices can be defined using a variety of distance/divergence measures as the minimizer of the sum of squared distances/divergences from the unknown center to the members of the population. It is well known that the computation of the Karcher mean for the space of SPD matrices which is a negatively-curved Riemannian manifold is computationally expensive. Recently, the LogDet divergence-based center was shown to be a computationally attractive alternative. However, the LogDet-based mean of more than two matrices can not be computed in closed form, which makes it computationally less attractive for large populations. In this paper we present a novel recursive estimator for center based on the Stein distance – which is the square root of the LogDet divergence – that is significantly faster than the batch mode computation of this center. The key theoretical contribution is a closed-form solution for the weighted Stein center of two SPD matrices, which is used in the recursive computation of the Stein center for a population of SPD matrices. Additionally, we show experimental evidence of the convergence of our recursive Stein center estimator to the batch mode Stein center. We present applications of our recursive estimator to K-means clustering and image indexing depicting significant time gains over corresponding algorithms that use the batch mode computations. For the latter application, we develop novel hashing functions using the Stein distance and apply it to publicly available data sets, and experimental results have shown favorable comparisons to other competing methods. PMID:25350135

  13. Constraint on the Multi-Component CKP Hierarchy and Recursion Operators

    NASA Astrophysics Data System (ADS)

    Song, Tao; Li, Chuanzhong; He, Jingsong

    2016-06-01

    In this article, we give the definition of the multi-component constrained CKP (McCKP) and two-component constrained CKP (cCKP) hierarchies (under the condition N=2). Then we give recursion operators for the two-component cCKP hierarchy. At last, we give the constrained condition from the two-component cCKP hierarchies to cCKP hierarchy.

  14. Inferring relationships between health and fertility in Norwegian Red cows using recursive models.

    PubMed

    Heringstad, B; Wu, X-L; Gianola, D

    2009-04-01

    Health and fertility are complex traits, and the phenotype for one trait may affect the phenotype of one or more other traits. For instance, disease in early lactation may impair a cow's ability to show estrus and to conceive after insemination. The objectives of the present study were to explore phenotypic and genetic relationships among health and fertility traits in Norwegian Red cows using a recursive effects model, which allows disentangling causal effects of phenotypes from the genetic and environmental correlations among traits. Records of interval from calving to first insemination (CFI), nonreturn rate within 56 d after first insemination (NR56), clinical mastitis (CM), ketosis (KET), and retained placenta from 55,568 first-lactation daughters of 1,577 Norwegian Red sires were analyzed. Trivariate recursive Gaussian-threshold models were used to analyze the 2 fertility traits (CFI and NR56) together with 1 disease trait in each analysis. The estimated structural coefficients of the recursive models imply that presence of KET or retained placenta lengthened CFI, whereas causal effects from CM to fertility were negligible. Recursive effects of disease on NR56, and of CFI on NR56, were all close to zero. Genetic correlations between health and fertility traits were low or moderate. The strongest genetic correlation was between KET and CFI (0.29), whereas genetic correlations between CM and NR56 and between CFI and NR56 were nil. In general, selection against disease is expected to slightly improve fertility (shorter CFI and higher NR56) as a correlated response and vice versa. The present results suggest that the use of structural-equation models, such as the one used here, may enhance our understanding of complex relationships among traits.

  15. Majorant recursions to determine eigenstate bounds of a symmetric exponential quantum anharmonic oscillator

    SciTech Connect

    Özdemir, Semra Bayat; Demiralp, Metin

    2015-12-31

    The determination of energy states is a highly studied issue in quantum mechanics. Energy states can be observed through the dynamics of expectation values, but the conditions and calculations vary depending on the system under consideration. In this work, a symmetric exponential anharmonic oscillator is considered and a recursive approximation method is developed to find its ground energy state. The use of majorant values facilitates the approximate calculation of the expectation values.

  16. Identifying Homogeneous Subgroups in Neurological Disorders: Unbiased Recursive Partitioning in Cervical Complete Spinal Cord Injury.

    PubMed

    Tanadini, Lorenzo G; Steeves, John D; Hothorn, Torsten; Abel, Rainer; Maier, Doris; Schubert, Martin; Weidner, Norbert; Rupp, Rüdiger; Curt, Armin

    2014-07-01

    Background The reliable stratification of homogeneous subgroups and the prediction of future clinical outcomes within heterogeneous neurological disorders is a particularly challenging task. Nonetheless, it is essential for the implementation of targeted care and effective therapeutic interventions. Objective This study was designed to assess the value of a recently developed regression tool from the family of unbiased recursive partitioning methods in comparison to established statistical approaches (eg, linear and logistic regression) for predicting clinical endpoints and for prospective patients' stratification for clinical trials. Methods A retrospective, longitudinal analysis of prospectively collected neurological data from the European Multicenter study about Spinal Cord Injury (EMSCI) network was undertaken on C4-C6 cervical sensorimotor complete subjects. Predictors were based on a broad set of early (<2 weeks) clinical assessments. Endpoints were based on later clinical examinations of upper extremity motor scores and recovery of motor levels, at 6 and 12 months, respectively. Prediction accuracy for each statistical analysis was quantified by resampling techniques. Results For all settings, overlapping confidence intervals indicated similar prediction accuracy of unbiased recursive partitioning to established statistical approaches. In addition, unbiased recursive partitioning provided a direct way of identification of more homogeneous subgroups. The partitioning is carried out in a data-driven manner, independently from a priori decisions or predefined thresholds. Conclusion Unbiased recursive partitioning techniques may improve prediction of future clinical endpoints and the planning of future SCI clinical trials by providing easily implementable, data-driven rationales for early patient stratification based on simple decision rules and clinical read-outs.

  17. An application of change-point recursive models to the relationship between litter size and number of stillborns in pigs.

    PubMed

    Ibáñez-Escriche, N; López de Maturana, E; Noguera, J L; Varona, L

    2010-11-01

    We developed and implemented change-point recursive models and compared them with a linear recursive model and a standard mixed model (SMM), in the scope of the relationship between litter size (LS) and number of stillborns (NSB) in pigs. The proposed approach allows us to estimate the point of change in multiple-segment modeling of a nonlinear relationship between phenotypes. We applied the procedure to a data set provided by a commercial Large White selection nucleus. The data file consisted of LS and NSB records of 4,462 parities. The results of the analysis clearly identified the location of the change points between different structural regression coefficients. The magnitude of these coefficients increased with LS, indicating an increasing incidence of LS on the NSB ratio. However, posterior distributions of correlations were similar across subpopulations (defined by the change points on LS), except for those between residuals. The heritability estimates of NSB did not present differences between recursive models. Nevertheless, these heritabilities were greater than those obtained for SMM (0.05) with a posterior probability of 85%. These results suggest a nonlinear relationship between LS and NSB, which supports the adequacy of a change-point recursive model for its analysis. Furthermore, the results from model comparisons support the use of recursive models. However, the adequacy of the different recursive models depended on the criteria used: the linear recursive model was preferred on account of its smallest deviance value, whereas nonlinear recursive models provided a better fit and predictive ability based on the cross-validation approach.

  18. Development of Updated ABsorption SIMulation Software (ABSIM)

    SciTech Connect

    Yang, Zhiyao; Tang, Xin; Qu, Ming; Abdelaziz, Omar; Gluesenkamp, Kyle R

    2014-01-01

    ABsorption SIMulation, ABSIM, was developed for the simulation of absorption systems by Oak Ridge National Laboratory during the 1980s and 1990s. ABSIM provides a platform for users to investigate various cycle configurations and working fluids, to calculate their operating parameters, to predict their performance, and to compare them with each other on a uniform basis. ABSIM is indeed a very useful and accurate tool for researchers investigating various absorption systems. However, it has not been well maintained: it is incompatible with recent operating systems, the interface needs improved user-friendliness, and the system needs better parameter-setting and debugging tools to help achieve convergence. Therefore, updating and improving ABSIM is highly needed. The paper presents recent efforts to improve ABSIM's compatibility with current operating systems, its user interface, and its analysis capabilities, and details the features and functions of the newly updated ABSIM software. The new ABSIM still uses the previously validated calculation engine of the old ABSIM. The new graphical user interface (GUI) was developed in Qt, an open-source GUI framework based on C++, and XML is used as the database for data storage in the new ABSIM. The new ABSIM has been designed to be easily learned and used. It has enhanced editing and construction functions, plus enhanced analysis features including parametric tables, plotting, property plots, and master panels for debugging. A single-effect water/LiBr absorption system is used as a case study in this paper to illustrate the features, capabilities, and functions of the new ABSIM; this case study was actually an example system available in the old ABSIM. The new version of ABSIM will be continuously developed to include additional subroutines for the components in liquid desiccant systems. The new ABSIM will be available to the public for free. The ultimate goal of the new ABSIM is to allow it to become a simulation

  19. XML Storage for Magnetotelluric Transfer Functions: Towards a Comprehensive Online Reference Database

    NASA Astrophysics Data System (ADS)

    Kelbert, A.; Blum, C.

    2015-12-01

    Magnetotelluric Transfer Functions (MT TFs) represent most of the information about Earth electrical conductivity found in the raw electromagnetic data, providing inputs for further inversion and interpretation. To be useful for scientific interpretation, they must also contain carefully recorded metadata. Making these data available in a discoverable and citable fashion would provide the most benefit to the scientific community, but such a development requires that the metadata is not only present in the file but is also searchable. The most commonly used MT TF format to date, the historical Society of Exploration Geophysicists Electromagnetic Data Interchange Standard 1987 (EDI), no longer supports some of the needs of modern magnetotellurics, most notably accurate error bars recording. Moreover, the inherent heterogeneity of EDI's and other historic MT TF formats has mostly kept the community away from healthy data sharing practices. Recently, the MT team at Oregon State University in collaboration with IRIS Data Management Center developed a new, XML-based format for MT transfer functions, and an online system for long-term storage, discovery and sharing of MT TF data worldwide (IRIS SPUD; www.iris.edu/spud/emtf). The system provides a query page where all of the MT transfer functions collected within the USArray MT experiment and other field campaigns can be searched for and downloaded; an automatic on-the-fly conversion to the historic EDI format is also included. To facilitate conversion to the new, more comprehensive and sustainable, XML format for MT TFs, and to streamline inclusion of historic data into the online database, we developed a set of open source format conversion tools, which can be used for rotation of MT TFs as well as a general XML <-> EDI converter (https://seiscode.iris.washington.edu/projects/emtf-fcu). Here, we report on the newly established collaboration between the USGS Geomagnetism Program and the Oregon State University to gather

  20. 77 FR 28541 - Request for Comments on the Recommendation for the Disclosure of Sequence Listings Using XML...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ...) Feature Keys and Qualifiers. ST.25 uses a controlled vocabulary of feature keys to describe nucleic acid... the standard for the presentation of nucleotide and/or amino acid sequences and the consequent changes... standard for the filing of nucleotide and/or amino acid sequence listings in XML format...

  1. Integrating XQuery-Enabled SCORM XML Metadata Repositories into an RDF-Based E-Learning P2P Network

    ERIC Educational Resources Information Center

    Qu, Changtao; Nejdl, Wolfgang

    2004-01-01

    Edutella is an RDF-based E-Learning P2P network that aims to accommodate heterogeneous learning resource metadata repositories in a P2P manner and to further facilitate the exchange of metadata between these repositories based on RDF. Whereas Edutella provides RDF metadata repositories with a quite natural integration approach, XML metadata…

  2. Using Extensible Markup Language (XML) for the Single Source Delivery of Educational Resources by Print and Online: A Case Study

    ERIC Educational Resources Information Center

    Walsh, Lucas

    2007-01-01

    This article seeks to provide an introduction to Extensible Markup Language (XML) by looking at its use in a single source publishing approach to the provision of teaching resources in both hardcopy and online. Using the development of the International Baccalaureate Organisation's online Economics Subject Guide as a practical example, this…

  3. Evaluation of ISO EN 13606 as a result of its implementation in XML

    PubMed Central

    Sun, Shanghua; Hassan, Taher; Kalra, Dipak

    2013-01-01

    The five parts of the ISO EN 13606 standard define a means by which health-care records can be exchanged between computer systems. Starting within the European standardisation process, it has now become internationally ratified in ISO. However, ISO standards do not require that a reference implementation be provided, and in order for ISO EN 13606 to deliver the expected benefits, it must be provided not as a document, but as an operational system that is not vendor specific. This article describes the evolution of an Extensible Markup Language (XML) Schema through three iterations, each of which emphasised one particular approach to delivering an executable equivalent to the printed standard. Developing these operational versions and incorporating feedback from users of these demonstrated where implementation compromises were needed and exposed defects in the standard. These are discussed herein. They may require a future technical revision to ISO EN 13606 to resolve the issues identified. PMID:23995217

  4. Evaluation of ISO EN 13606 as a result of its implementation in XML.

    PubMed

    Austin, Tony; Sun, Shanghua; Hassan, Taher; Kalra, Dipak

    2013-12-01

    The five parts of the ISO EN 13606 standard define a means by which health-care records can be exchanged between computer systems. Starting within the European standardisation process, it has now become internationally ratified in ISO. However, ISO standards do not require that a reference implementation be provided, and in order for ISO EN 13606 to deliver the expected benefits, it must be provided not as a document, but as an operational system that is not vendor specific. This article describes the evolution of an Extensible Markup Language (XML) Schema through three iterations, each of which emphasised one particular approach to delivering an executable equivalent to the printed standard. Developing these operational versions and incorporating feedback from users of these demonstrated where implementation compromises were needed and exposed defects in the standard. These are discussed herein. They may require a future technical revision to ISO EN 13606 to resolve the issues identified. PMID:23995217

  5. Encoding of Fundamental Chemical Entities of Organic Reactivity Interest using chemical ontology and XML.

    PubMed

    Durairaj, Vijayasarathi; Punnaivanam, Sankar

    2015-09-01

    Fundamental chemical entities are identified in the context of organic reactivity and classified as appropriate concept classes namely ElectronEntity, AtomEntity, AtomGroupEntity, FunctionalGroupEntity and MolecularEntity. The entity classes and their subclasses are organized into a chemical ontology named "ChemEnt" for the purpose of assertion, restriction and modification of properties through entity relations. Individual instances of entity classes are defined and encoded as a library of chemical entities in XML. The instances of entity classes are distinguished with a unique notation and identification values in order to map them with the ontology definitions. A model GUI named Entity Table is created to view graphical representations of all the entity instances. The detection of chemical entities in chemical structures is achieved through suitable algorithms. The possibility of asserting properties to the entities at different levels and the mechanism of property flow within the hierarchical entity levels is outlined. PMID:26188793
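    The abstract describes an XML library of entity instances, each carrying a unique identification value that maps it back to its ontology class. The Python sketch below illustrates that indexing step with invented markup; the tag and attribute names are placeholders, not the paper's actual schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical markup -- tag/attribute names are illustrative only.
    library_xml = """
    <EntityLibrary>
      <FunctionalGroupEntity id="FG001" notation="C(=O)O" name="carboxyl"/>
      <AtomEntity id="AT006" notation="C" name="carbon"/>
    </EntityLibrary>
    """

    def load_entities(xml_text):
        """Index every entity instance by its unique id so detection code can
        map structure fragments back to their ontology classes."""
        root = ET.fromstring(xml_text)
        return {e.get("id"): {"class": e.tag,
                              "notation": e.get("notation"),
                              "name": e.get("name")}
                for e in root}

    entities = load_entities(library_xml)
    ```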

  6. Encoding of Fundamental Chemical Entities of Organic Reactivity Interest using chemical ontology and XML.

    PubMed

    Durairaj, Vijayasarathi; Punnaivanam, Sankar

    2015-09-01

    Fundamental chemical entities are identified in the context of organic reactivity and classified as appropriate concept classes namely ElectronEntity, AtomEntity, AtomGroupEntity, FunctionalGroupEntity and MolecularEntity. The entity classes and their subclasses are organized into a chemical ontology named "ChemEnt" for the purpose of assertion, restriction and modification of properties through entity relations. Individual instances of entity classes are defined and encoded as a library of chemical entities in XML. The instances of entity classes are distinguished with a unique notation and identification values in order to map them with the ontology definitions. A model GUI named Entity Table is created to view graphical representations of all the entity instances. The detection of chemical entities in chemical structures is achieved through suitable algorithms. The possibility of asserting properties to the entities at different levels and the mechanism of property flow within the hierarchical entity levels is outlined.

  7. An XML Standard for Virtual Patients: Exchanging Case-Based Simulations in Medical Education

    PubMed Central

    Triola, Marc M.; Campion, Ned; McGee, James B.; Albright, Susan; Greene, Peter; Smothers, Valerie; Ellaway, Rachel

    2007-01-01

    Virtual Patients are computer-based simulations of a clinical encounter where the user plays the role of a healthcare provider while receiving in-context instruction. This unique pedagogical approach enables active case-based learning for learners. Academic institutions around the world have developed high-quality virtual patients using many different authoring and playback technologies. However, sustainability and scalability have proved challenging due to the number of cases needed and production costs. In an effort to promote sharing of Virtual Patients and broader adoption into medical education at all levels, MedBiquitous organized an international working group to create an XML-based “MedBiquitous Virtual Patient Standard” (MVP) describing a common structure for virtual patient content and activities. The MVP enables virtual patient exchange across systems, modification, and display within conformant player software. PMID:18693935

  8. HepML, an XML-based format for describing simulated data in high energy physics

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.

    2010-10-01

    In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton-level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included in event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define the necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and for representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing a header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community. Program summary: Program title: libhepml Catalogue identifier: AEGL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 138 866 No. of bytes in distributed program, including test data, etc.: 613 122 Distribution format: tar.gz Programming language: C++, C Computer: PCs and workstations Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10 RAM: 1 073 741 824 bytes (1 Gb) Classification: 6.2, 11.1, 11.2 External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/) Nature of problem: Monte Carlo simulation in high

  9. ISS Update: Suitport

    NASA Video Gallery

    ISS Update commentator Lynnette Madison interviews Mallory Jennings, Suitport Human Testing Lead, about making spacewalks easier and more efficient with the Suitport. Questions? Ask us on Twitter @...

  10. A pipeline VLSI design of fast singular value decomposition processor for real-time EEG system based on on-line recursive independent component analysis.

    PubMed

    Huang, Kuan-Ju; Shih, Wei-Yeh; Chang, Jui Chung; Feng, Chih Wei; Fang, Wai-Chi

    2013-01-01

    This paper presents a pipeline VLSI design of a fast singular value decomposition (SVD) processor for a real-time electroencephalography (EEG) system based on on-line recursive independent component analysis (ORICA). Since SVD is used frequently in the computations of the real-time EEG system, a low-latency and high-accuracy SVD processor is essential. During EEG processing, the proposed SVD processor computes the diagonal, inverse and inverse-square-root matrices of the target matrices in real time. In general, SVD requires a huge amount of computation in a hardware implementation; therefore, this work proposes a novel data-flow updating concept to assist the pipeline VLSI implementation. The SVD processor can greatly improve the feasibility of real-time EEG applications such as brain-computer interfaces (BCIs). The proposed architecture is implemented using TSMC 90 nm CMOS technology. The EEG raw data are sampled at 128 Hz. The core size of the SVD processor is 580×580 um^2, its operating frequency is 20 MHz, and it consumes 0.774 mW of power during execution of the 8-channel EEG processing.
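    For readers who want the software analogue of what the processor computes, the short NumPy sketch below derives the diagonal (singular-value), inverse and inverse-square-root matrices of a target matrix from its SVD. This is a floating-point illustration only; it says nothing about the fixed-point pipeline architecture of the VLSI design, and the inverse-square-root expression is meaningful for symmetric positive-definite inputs.

    ```python
    import numpy as np

    def svd_derived_matrices(A):
        """Diagonal (singular-value), inverse and inverse-square-root matrices
        of A, all obtained from one SVD. Inverse square root assumes A is
        symmetric positive definite."""
        U, s, Vt = np.linalg.svd(A)
        S = np.diag(s)
        A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
        A_inv_sqrt = Vt.T @ np.diag(1.0 / np.sqrt(s)) @ U.T
        return S, A_inv, A_inv_sqrt

    # example on a symmetric positive-definite covariance-like matrix
    X = np.random.randn(8, 3)
    C = X.T @ X + 1e-3 * np.eye(3)
    S, C_inv, C_inv_sqrt = svd_derived_matrices(C)
    ```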

  11. Bayesian recursive mixed linear model for gene expression analyses with continuous covariates.

    PubMed

    Casellas, J; Ibáñez-Escriche, N

    2012-01-01

    The analysis of microarray gene expression data has experienced remarkable growth in scientific research over the last few years and is helping to decipher the genetic background of several productive traits. Nevertheless, most analytical approaches have relied on the comparison of 2 (or a few) well-defined groups of biological conditions where continuous covariates are not meaningful (e.g., healthy vs. cancerous cells). Continuous effects could be of special interest when analyzing gene expression in animal production-oriented studies (e.g., birth weight), although very few studies address this peculiarity in the animal science framework. Within this context, we have developed a recursive linear mixed model in which linear covariates are not only accounted for during gene expression analyses but also hierarchized, so that the effects of their genetic, environmental, and residual components on differential gene expression are inferred independently. This parameterization allows a step forward in the inference of differential gene expression linked to a given quantitative trait such as birth weight. The statistical performance of this recursive model was exemplified under simulation by accounting for different sample sizes (n), heritabilities for the quantitative trait (h^2), and magnitudes of differential gene expression (λ). It is important to highlight that statistical power increased with n, h^2, and λ, and the recursive model exceeded the standard linear mixed model with linear (nonrecursive) covariates in the majority of scenarios. This new parameterization would provide new insights about gene expression in the animal science framework, opening a new research scenario where within-covariate sources of differential gene expression could be individualized and estimated. The source code of the program accommodating these analytical developments and additional information about practical aspects of running the program are freely available by request to the corresponding

  12. Recursive Focal Plane Wavefront and Bias Estimation for the Direct Imaging of Exoplanets

    NASA Astrophysics Data System (ADS)

    Eldorado Riggs, A. J.; Kasdin, N. Jeremy; Groff, Tyler Dean

    2016-01-01

    To image the reflected light from exoplanets and disks, an instrument must suppress diffracted starlight by about nine orders of magnitude. A coronagraph alters the stellar PSF to create regions of high contrast, but it is extremely sensitive to wavefront aberrations. Deformable mirrors (DMs) are necessary to mitigate these quasi-static aberrations and recover high-contrast. To avoid non-common path aberrations, the science camera must be used as the primary wavefront sensor. Focal plane wavefront correction is an iterative process, and obtaining sufficient signal in the dark holes requires long exposure times. The fastest coronagraphic wavefront correction techniques require estimates of the stellar electric field. The main challenge of coronagraphy is thus to perform complex wavefront estimation quickly and efficiently using intensity images from the camera. The most widely applicable and tested technique is DM Diversity, in which a DM modulates the focal plane intensity and several images are used to reconstruct the stellar electric field in a batch process. At the High Contrast Imaging Lab (HCIL) at Princeton, we have developed an iterative extended Kalman filter (IEKF) to improve upon this technique. The IEKF enables recursive starlight estimation and can utilize fewer images per iteration, thereby speeding up wavefront correction. This IEKF formulation also estimates the bias in the images recursively. Since exoplanets and disks are embedded in the incoherent bias signal, the IEKF enables detection of science targets during wavefront correction. Here we present simulated and experimental results from Princeton's HCIL demonstrating the effectiveness of the IEKF for recursive electric field estimation and exoplanet detection.

  13. Use of the recursion formula of the Gompertz survival function to evaluate life-table data.

    PubMed

    Bassukas, I D

    1996-08-29

    The recursion formula of the Gompertz function is an established method for the analysis of growth processes. In the present study the recursion formula of the Gompertz survival function, ln S(t + s) = a + b · ln S(t), is introduced for the analysis of survival data, where S(t) is the survival fraction at age t, s is the constant age increment between two consecutive measurements of the survival fraction, and a and b are parameters. With the help of this method, and provided stroboscopic measurements of survival rates are available, the Gompertz survival function, instead of the corresponding mortality function, can be determined directly using linear regression analysis. The application of the present algorithm is demonstrated by analysing two sets of data taken from the literature (survival of Drosophila imagoes and of female centenarians), using linear regression analysis to fit survival or mortality rates to the corresponding models. In both cases the quality of fit was superior when using the algorithm introduced here. Moreover, survival functions calculated from the fits to the mortality law predict the survival data only poorly. By contrast, the results of the present method not only fit the measurements, but, for both sets of data, the mortality parameters calculated by the present method are essentially identical to those obtained by applying a non-linear Marquardt-Levenberg algorithm to fit the same data to the explicit form of the Gompertz survival function. Taking into consideration the advantages of using a linear fit (goodness-of-fit testing and efficient statistical comparison of survival patterns), the recursion-formula method is the most preferable way to fit survival data to the Gompertz function.
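    Because the recursion is linear in ln S, fitting it amounts to an ordinary linear regression of ln S(t+s) on ln S(t). The Python sketch below demonstrates this on synthetic survival fractions generated from a Gompertz hazard; the parameter values are arbitrary illustrations, not data from the paper.

    ```python
    import numpy as np

    def fit_gompertz_recursion(surv):
        """Fit ln S(t+s) = a + b * ln S(t) by ordinary linear regression.
        `surv` holds survival fractions sampled at a constant age increment s."""
        x = np.log(surv[:-1])
        y = np.log(surv[1:])
        b, a = np.polyfit(x, y, 1)      # slope b, intercept a
        return a, b

    # synthetic data from a Gompertz hazard h(t) = alpha * exp(beta * t)
    t = np.arange(0.0, 10.0, 1.0)
    alpha, beta = 0.02, 0.3
    surv = np.exp((alpha / beta) * (1.0 - np.exp(beta * t)))   # exact S(t)
    a, b = fit_gompertz_recursion(surv)    # recovers b = exp(beta*s), a = (alpha/beta)*(1 - b)
    ```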

  15. System Simulation by Recursive Feedback: Coupling a Set of Stand-Alone Subsystem Simulations

    NASA Technical Reports Server (NTRS)

    Nixon, Douglas D.; Ryan, Stephen G. (Technical Monitor)

    2002-01-01

    Recursive feedback is defined and discussed as a framework for development of specific algorithms and procedures that propagate the time-domain solution for a dynamical system simulation consisting of multiple numerically coupled, self-contained, stand-alone subsystem simulations. A satellite motion example containing three subsystems (orbit dynamics, attitude dynamics, and aerodynamics) has been defined and constructed using this approach. Conventional solution methods are used in the subsystem simulations. Centralized and distributed versions of coupling structure have been addressed. Numerical results are evaluated by direct comparison with a standard total-system, simultaneous-solution approach.
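
    As a rough illustration of the idea (not the NTRS algorithm itself), the sketch below couples two stand-alone subsystem integrators through their interface outputs and, within each global time step, re-runs them recursively on each other's latest output until the interface values settle. The toy dynamics, step size, and tolerance are invented.

```python
import numpy as np

# Minimal sketch of recursive feedback coupling: two self-contained subsystem
# integrators exchange only their interface outputs, and the pair is iterated
# within each time step until those outputs stop changing.

dt, n_steps = 0.01, 500
x1, x2 = 1.0, 0.0            # subsystem states of a simple harmonic pair

def sub1(x1, u, dt):          # stand-alone integrator for x1' = u
    return x1 + dt * u

def sub2(x2, u, dt):          # stand-alone integrator for x2' = -u
    return x2 - dt * u

for _ in range(n_steps):
    y1, y2 = x1, x2           # interface guesses carried over from the last step
    for _ in range(10):       # recursive feedback iterations within the step
        x1_new = sub1(x1, y2, dt)
        x2_new = sub2(x2, y1, dt)
        if abs(x1_new - y1) < 1e-12 and abs(x2_new - y2) < 1e-12:
            break
        y1, y2 = x1_new, x2_new
    x1, x2 = x1_new, x2_new

# the pair should remain close to the unit circle (x1**2 + x2**2 near 1),
# apart from a small amount of numerical damping
print(x1, x2, x1 ** 2 + x2 ** 2)
```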

  16. Recursive Recovery of Sparse Signal Sequences From Compressive Measurements: A Review

    NASA Astrophysics Data System (ADS)

    Vaswani, Namrata; Zhan, Jinchun

    2016-07-01

    In this article, we review the literature on design and analysis of recursive algorithms for reconstructing a time sequence of sparse signals from compressive measurements. The signals are assumed to be sparse in some transform domain or in some dictionary. Their sparsity patterns can change with time, although, in many practical applications, the changes are gradual. An important class of applications where this problem occurs is dynamic projection imaging, e.g., dynamic magnetic resonance imaging (MRI) for real-time medical applications such as interventional radiology, or dynamic computed tomography.

  17. Gauge amplitude identities by on-shell recursion relation in S-matrix program

    NASA Astrophysics Data System (ADS)

    Feng, Bo; Huang, Rijun; Jia, Yin

    2011-01-01

    Using only the Britto-Cachazo-Feng-Witten (BCFW) on-shell recursion relation we prove the color-order reversed relation, the U(1)-decoupling relation, the Kleiss-Kuijf (KK) relation and the Bern-Carrasco-Johansson (BCJ) relation for color-ordered gauge amplitudes in the framework of the S-matrix program, without relying on a Lagrangian description. Our derivation is the first pure field theory proof of the newly discovered BCJ identity, which substantially reduces the color-ordered basis from (n-2)! to (n-3)!. Our proof also gives its physical interpretation as the mysterious bonus relation following from the 1/z behavior under a suitable on-shell deformation of a non-adjacent pair.

  18. Chandrasekhar-type algorithms for fast recursive estimation in linear systems with constant parameters

    NASA Technical Reports Server (NTRS)

    Choudhury, A. K.; Djalali, M.

    1975-01-01

    In the recursive method proposed, the gain matrix for the Kalman filter and the covariance of the state vector are computed not via the Riccati equation but from certain other equations. These differential equations are of Chandrasekhar type. The 'invariant imbedding' idea resulted in the reduction of the basic boundary value problem of transport theory to an equivalent initial value system, a significant computational advance. Experience with the initial value formulation showed that the method offers some computational savings and is less vulnerable to loss of positive definiteness of the covariance matrix.

  19. Recursive ideal observer detection of known M-ary signals in multiplicative and additive Gaussian noise.

    NASA Technical Reports Server (NTRS)

    Painter, J. H.; Gupta, S. C.

    1973-01-01

    This paper presents the derivation of the recursive algorithms necessary for real-time digital detection of M-ary known signals that are subject to independent multiplicative and additive Gaussian noises. The motivating application is minimum probability of error detection of digital data-link messages aboard civil aircraft in the earth reflection multipath environment. For each known signal, the detector contains one Kalman filter and one probability computer. The filters estimate the multipath disturbance. The estimates and the received signal drive the probability computers. Outputs of all the computers are compared in amplitude to give the signal decision. The practicality and usefulness of the detector are extensively discussed.

  20. Cyclic period-3 window in antiferromagnetic potts and Ising models on recursive lattices

    NASA Astrophysics Data System (ADS)

    Ananikian, N. S.; Ananikyan, L. N.; Chakhmakhchyan, L. A.

    2011-09-01

    The magnetic properties of the antiferromagnetic Potts model with two-site interaction and the antiferromagnetic Ising model with three-site interaction on recursive lattices have been studied. A cyclic period-3 window has been revealed by the recurrence relation method in the antiferromagnetic Q-state Potts model on the Bethe lattice (at Q < 2) and in the antiferromagnetic Ising model with three-site interaction on the Husimi cactus. The Lyapunov exponents have been calculated, and modulated phases and a chaotic regime in the cyclic period-3 window have been found for the one-dimensional rational mappings that determine the properties of these systems.
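
    For readers unfamiliar with the recurrence-relation approach, the toy sketch below shows how a Lyapunov exponent is estimated for a generic one-dimensional map. The logistic map is used as a stand-in; the actual rational mappings of the Potts/Ising recursive lattices are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: estimate the Lyapunov exponent of a 1D recursion
# x_{n+1} = f(x_n) by averaging ln|f'(x_n)| along the orbit.

def lyapunov(f, df, x0, n_transient=1000, n_iter=100000):
    x = x0
    for _ in range(n_transient):          # discard transient behaviour
        x = f(x)
    acc = 0.0
    for _ in range(n_iter):
        x = f(x)
        acc += np.log(abs(df(x)))         # running sum of ln|f'(x_n)|
    return acc / n_iter

r = 3.9                                   # a chaotic regime of the logistic map
lam = lyapunov(lambda x: r * x * (1 - x), lambda x: r * (1 - 2 * x), x0=0.4)
print(f"Lyapunov exponent at r={r}: {lam:.3f}  (> 0 indicates chaos)")
```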

  1. A convolutional recursive modified Self Organizing Map for handwritten digits recognition.

    PubMed

    Mohebi, Ehsan; Bagirov, Adil

    2014-12-01

    It is well known that handwritten digit recognition is a challenging problem. Different classification algorithms have been applied to solve it. Among them, Self Organizing Maps (SOM) have produced promising results. In this paper, we first introduce a Modified SOM for the vector quantization problem with an improved initialization process and topology preservation. We then develop a Convolutional Recursive Modified SOM and apply it to the problem of handwritten digit recognition. The computational results obtained using the well-known MNIST dataset demonstrate the superiority of the proposed algorithm over existing SOM-based algorithms.
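
    For context, the sketch below implements only the basic SOM weight update on toy two-dimensional data. It is not the authors' Convolutional Recursive Modified SOM, and the grid size, learning-rate schedule and neighbourhood schedule are arbitrary choices.

```python
import numpy as np

# Minimal self-organizing map (SOM) update, shown only to make the
# vector-quantization step concrete.

rng = np.random.default_rng(0)
data = rng.random((1000, 2))              # toy 2-D inputs (stand-in for digit features)
grid = rng.random((10, 10, 2))            # 10x10 map of 2-D prototype vectors

def train(grid, data, epochs=5, lr0=0.5, sigma0=3.0):
    rows, cols, _ = grid.shape
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5  # decaying neighbourhood width
        for x in data:
            d = np.linalg.norm(grid - x, axis=2)     # distance of x to every prototype
            bi, bj = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
            h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
            grid += lr * h[..., None] * (x - grid)   # pull neighbouring prototypes toward x
    return grid

grid = train(grid, data)
print(grid.shape)
```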

  2. Recursive polarization of nuclear spins in diamond at arbitrary magnetic fields

    SciTech Connect

    Pagliero, Daniela; Laraoui, Abdelghani; Henshaw, Jacob D.; Meriles, Carlos A.

    2014-12-15

    We introduce an alternate route to dynamically polarize the nuclear spin host of nitrogen-vacancy (NV) centers in diamond. Our approach articulates optical, microwave, and radio-frequency pulses to recursively transfer spin polarization from the NV electronic spin. Using two complementary variants of the same underlying principle, we demonstrate nitrogen nuclear spin initialization approaching 80% at room temperature both in ensemble and single NV centers. Unlike existing schemes, our approach does not rely on level anti-crossings and is thus applicable at arbitrary magnetic fields. This versatility should prove useful in applications ranging from nanoscale metrology to sensitivity-enhanced NMR.

  3. Recursive encoding and decoding of the noiseless subsystem and decoherence-free subspace

    SciTech Connect

    Li, Chi-Kwong; Nakahara, Mikio; Poon, Yiu-Tung; Sze, Nung-Sing; Tomita, Hiroyuki

    2011-10-15

    When an environmental disturbance to a quantum system has a wavelength much larger than the system size, all qubits in the system are under the action of the same error operator. The noiseless subsystem and decoherence-free subspace are immune to such collective noise. We construct simple quantum circuits that implement these error-avoiding codes for a small number n of physical qubits. A single logical qubit is encoded with n=3 and 4, while two and three logical qubits are encoded with n=5 and 7, respectively. Recursive relations among subspaces employed in these codes play essential roles in our implementation.

  4. Recursive solution of number of reachable states of a simple subclass of FMS

    NASA Astrophysics Data System (ADS)

    Chao, Daniel Yuh

    2014-03-01

    This paper aims to compute the number of reachable (forbidden, live and deadlock) states for flexible manufacturing systems (FMS) without the construction of reachability graph. The problem is nontrivial and takes, in general, an exponential amount of time to solve. Hence, this paper focusses on a simple version of Systems of Simple Sequential Processes with Resources (S3PR), called kth-order system, where each resource place holds one token to be shared between two processes. The exact number of reachable (forbidden, live and deadlock) states can be computed recursively.

  5. Recursive Averaging

    ERIC Educational Resources Information Center

    Smith, Scott G.

    2015-01-01

    In this article, Scott Smith presents an innocent problem (Problem 12 of the May 2001 Calendar from "Mathematics Teacher" ("MT" May 2001, vol. 94, no. 5, p. 384) that was transformed by several timely "what if?" questions into a rewarding investigation of some interesting mathematics. These investigations led to two…

  6. Genomics and Health Impact Update

    MedlinePlus

    About the Genomics & Health Impact Update: The Office of Public Health Genomics provides updated and credible ...

  7. A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science.

    PubMed

    Waltman, Ludo; Yan, Erjia; van Eck, Nees Jan

    2011-10-01

    Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighing (like in PageRank-inspired indicators). We combine these two ideas in a single indicator, referred to as the recursive mean normalized citation score indicator, and we study the validity of this indicator. Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme. Based on these observations, we advise against the use of indicators in which the idea of normalization based on a field classification scheme and the idea of recursive citation weighing are combined.
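
    The following toy sketch combines the two ideas in the simplest possible way: citations are weighted by the citing papers' current scores, and the scores are renormalized within each field on every pass. The citation matrix, field labels and iteration scheme are invented for illustration and do not reproduce the indicator studied in the paper.

```python
import numpy as np

# Toy sketch of recursive citation weighing combined with field normalization.

C = np.array([                # C[i, j] = 1 if paper i cites paper j
    [0, 1, 1, 0, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
], dtype=float)
field = np.array([0, 0, 0, 1, 1])     # hypothetical field label of each paper

score = np.ones(C.shape[0])
for _ in range(100):
    raw = C.T @ score                 # citations weighted by the citing papers' scores
    new = raw.copy()
    for f in np.unique(field):        # renormalize so each field has mean score 1
        mask = field == f
        mean = new[mask].mean()
        new[mask] = new[mask] / mean if mean > 0 else 1.0
    if np.allclose(new, score, atol=1e-10):
        break
    score = new

print(np.round(score, 3))
```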

  8. VKCDB: voltage-gated K+ channel database updated and upgraded.

    PubMed

    Gallin, Warren J; Boutet, Patrick A

    2011-01-01

    The Voltage-gated K(+) Channel DataBase (VKCDB) (http://vkcdb.biology.ualberta.ca) makes a comprehensive set of sequence data readily available for phylogenetic and comparative analysis. The current update contains 2063 entries for full-length or nearly full-length unique channel sequences from Bacteria (477), Archaea (18) and Eukaryotes (1568), an increase from 346 solely eukaryotic entries in the original release. In addition to the protein sequences of the channels, the nucleotide sequences of the corresponding open reading frames are now available and can be extracted in parallel with sets of protein sequences. Channels are categorized into subfamilies by phylogenetic analysis and by using hidden Markov model analyses. Although the raw database contains a number of fragmentary, duplicated, obsolete and non-channel sequences that were collected in early steps of data collection, the web interface will only return entries that have been validated as likely K(+) channels. The retrieval function of the web interface allows retrieval of entries that contain a substantial fraction of the core structural elements of VKCs, of fragmentary entries, or of both. The full database can be downloaded as either a MySQL dump or an XML dump from the web site. We have now implemented automated updates at quarterly intervals.

  9. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

    PubMed Central

    Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996
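
    Of the five ingredients, recursive least squares (iii) is the one that removes explicit matrix inversions. The sketch below shows only that building block, for an ordinary linear regression problem, using the Sherman-Morrison identity to propagate the inverse Gram matrix; the kernel, sparsification, regularization and sliding-window parts of the paper are not reproduced.

```python
import numpy as np

# Plain recursive least squares (RLS): the inverse Gram matrix P is updated
# with the Sherman-Morrison identity instead of being recomputed each step.

rng = np.random.default_rng(1)
d = 3
w_true = np.array([0.5, -1.0, 2.0])      # made-up target weights

w = np.zeros(d)
P = np.eye(d) * 1e3                      # inverse of the (regularized) Gram matrix

for _ in range(500):
    x = rng.normal(size=d)
    y = x @ w_true + 0.01 * rng.normal()
    Px = P @ x
    k = Px / (1.0 + x @ Px)              # gain vector
    w = w + k * (y - x @ w)              # recursive update of the weights
    P = P - np.outer(k, Px)              # Sherman-Morrison update of the inverse

print(np.round(w, 3))                    # should approach w_true
```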

  10. Real-Time Adaptive EEG Source Separation Using Online Recursive Independent Component Analysis.

    PubMed

    Hsu, Sheng-Hsiou; Mullen, Tim R; Jung, Tzyy-Ping; Cauwenberghs, Gert

    2016-03-01

    Independent component analysis (ICA) has been widely applied to electroencephalographic (EEG) biosignal processing and brain-computer interfaces. The practical use of ICA, however, is limited by its computational complexity, data requirements for convergence, and assumption of data stationarity, especially for high-density data. Here we study and validate an optimized online recursive ICA algorithm (ORICA) with online recursive least squares (RLS) whitening for blind source separation of high-density EEG data, which offers instantaneous incremental convergence upon presentation of new data. Empirical results of this study demonstrate the algorithm's: 1) suitability for accurate and efficient source identification in high-density (64-channel) realistically-simulated EEG data; 2) capability to detect and adapt to nonstationarity in 64-ch simulated EEG data; and 3) utility for rapidly extracting principal brain and artifact sources in real 61-channel EEG data recorded by a dry and wearable EEG system in a cognitive experiment. ORICA was implemented as functions in BCILAB and EEGLAB and was integrated in an open-source Real-time EEG Source-mapping Toolbox (REST), supporting applications in ICA-based online artifact rejection, feature extraction for real-time biosignal monitoring in clinical environments, and adaptable classifications in brain-computer interfaces. PMID:26685257

  11. A recursive genetic framework for evolutionary decision-making in problems with high dynamism

    NASA Astrophysics Data System (ADS)

    Pashaei, Kaveh; Taghiyareh, Fattaneh; Badie, Kambiz

    2015-11-01

    Communication and coordination are at the core of reaching a constructive agreement among multi-agent systems (MASs). Dividing the overall performance of a MAS among individual agents may lead to group learning as opposed to individual learning, which is one of the weak points of MASs. This paper proposes a recursive genetic framework for solving problems with high dynamism. In this framework, a combination of genetic algorithm and multi-agent capabilities is utilised to accelerate team learning and accurate credit assignment. The argumentation feature is used to accomplish agent learning, and the negotiation features of MASs are used to achieve credit assignment. The proposed framework is quite general and its recursive hierarchical structure can be extended. A dedicated controlling module is included to improve convergence. Due to the complexity of blackjack, we have applied it as a test bed to evaluate the system's performance. The learning rate of the agents is measured, as well as their credit assignment. Analysis of the obtained results led us to believe that our robust framework with the proposed negotiation operator is a promising methodology for solving similar problems in other areas with high dynamism.

  12. Recursive graphical construction of Feynman diagrams in φ⁴ theory: asymmetric case and effective energy

    PubMed

    Kastening

    2000-04-01

    The free energy of a multicomponent scalar field theory is considered as a functional W[G,J] of the free correlation function G and an external current J. It obeys nonlinear functional differential equations which are turned into recursion relations for the connected Green's functions in a loop expansion. These relations amount to a simple proof that W[G,J] generates only connected graphs and can be used to find all such graphs with their combinatoric weights. A Legendre transformation with respect to the external current converts the functional differential equations for the free energy into those for the effective energy Gamma[G,Phi], which is considered as a functional of the free correlation function G and the field expectation Phi. These equations are turned into recursion relations for the one-particle irreducible Green's functions. These relations amount to a simple proof that Gamma[G,Phi] generates only one-particle irreducible graphs and can be used to find all such graphs with their combinatoric weights. The techniques used also allow for a systematic investigation into resummations of classes of graphs. Examples are given for resumming one-loop and multiloop tadpoles, both through all orders of perturbation theory. Since the functional differential equations derived are nonperturbative, they also constitute a convenient starting point for expansions other than those in numbers of loops or powers of coupling constants. We work with general interactions containing up to four powers of the field.

  13. Recursive forward dynamics for multiple robot arms moving a common task object

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1988-01-01

    Recursive forward dynamics algorithms are developed for an arbitrary number of robot arms moving a commonly held object. The multiarm forward dynamics problem is to find the angular accelerations at the joints and the contact forces that the arms impart to the task object. The problem also involves finding the acceleration of this object. The multiarm forward dynamics solutions provide a thorough physical and mathematical understanding of the way several arms behave in response to a set of applied joint moments. Such an understanding simplifies and guides the subsequent control design and experimentation process. The forward dynamics algorithms also provide the necessary analytical foundation for conducting analysis and simulation studies. The multiarm algorithms are based on the filtering and smoothing approach recently advanced for single-arm dynamics, and they can be built up modularly from the single-arm algorithms. The algorithms compute recursively the joint-angle accelerations, the contact forces, and the task-object accelerations. Algorithms are also developed to evaluate in closed form the linear transformations from the active joint moments to the joint-angle accelerations, to the task-object accelerations, and to the task-object contact forces. A possible computing architecture is presented as a precursor to a more complete investigation of the computational performance of the dynamics algorithms.

  14. Recursive directional ligation by plasmid reconstruction allows rapid and seamless cloning of oligomeric genes.

    PubMed

    McDaniel, Jonathan R; Mackay, J Andrew; Quiroz, Felipe García; Chilkoti, Ashutosh

    2010-04-12

    This paper reports a new strategy, recursive directional ligation by plasmid reconstruction (PRe-RDL), to rapidly clone highly repetitive polypeptides of any sequence and specified length over a large range of molecular weights. In a single cycle of PRe-RDL, two halves of a parent plasmid, each containing a copy of an oligomer, are ligated together, thereby dimerizing the oligomer and reconstituting a functional plasmid. This process is carried out recursively to assemble an oligomeric gene with the desired number of repeats. PRe-RDL has several unique features that stem from the use of type IIs restriction endonucleases: first, PRe-RDL is a seamless cloning method that leaves no extraneous nucleotides at the ligation junction. Because it uses type IIs endonucleases to ligate the two halves of the plasmid, PRe-RDL also addresses the major limitation of RDL in that it abolishes any restriction on the gene sequence that can be oligomerized. The reconstitution of a functional plasmid only upon successful ligation in PRe-RDL also addresses two other limitations of RDL: the significant background from self-ligation of the vector observed in RDL, and the decreased efficiency of ligation due to nonproductive circularization of the insert. PRe-RDL can also be used to assemble genes that encode different sequences in a predetermined order to encode block copolymers or append leader and trailer peptide sequences to the oligomerized gene. PMID:20184309

  15. Development of Fast Algorithms Using Recursion, Nesting and Iterations for Computational Electromagnetics

    NASA Technical Reports Server (NTRS)

    Chew, W. C.; Song, J. M.; Lu, C. C.; Weedon, W. H.

    1995-01-01

    In the first phase of our work, we have concentrated on laying the foundation to develop fast algorithms, including the use of recursive structure like the recursive aggregate interaction matrix algorithm (RAIMA), the nested equivalence principle algorithm (NEPAL), the ray-propagation fast multipole algorithm (RPFMA), and the multi-level fast multipole algorithm (MLFMA). We have also investigated the use of curvilinear patches to build a basic method of moments code where these acceleration techniques can be used later. In the second phase, which is mainly reported on here, we have concentrated on implementing three-dimensional NEPAL on a massively parallel machine, the Connection Machine CM-5, and have been able to obtain some 3D scattering results. In order to understand the parallelization of codes on the Connection Machine, we have also studied the parallelization of 3D finite-difference time-domain (FDTD) code with PML material absorbing boundary condition (ABC). We found that simple algorithms like the FDTD with material ABC can be parallelized very well allowing us to solve within a minute a problem of over a million nodes. In addition, we have studied the use of the fast multipole method and the ray-propagation fast multipole algorithm to expedite matrix-vector multiplication in a conjugate-gradient solution to integral equations of scattering. We find that these methods are faster than LU decomposition for one incident angle, but are slower than LU decomposition when many incident angles are needed as in the monostatic RCS calculations.

  16. Accelerated solution of non-linear flow problems using Chebyshev iteration polynomial based RK recursions

    SciTech Connect

    Lorber, A.A.; Carey, G.F.; Bova, S.W.; Harle, C.H.

    1996-12-31

    The connection between the solution of linear systems of equations by iterative methods and explicit time stepping techniques is used to accelerate to steady state the solution of ODE systems arising from discretized PDEs which may involve either physical or artificial transient terms. Specifically, a class of Runge-Kutta (RK) time integration schemes with extended stability domains has been used to develop recursion formulas which lead to accelerated iterative performance. The coefficients for the RK schemes are chosen based on the theory of Chebyshev iteration polynomials in conjunction with a local linear stability analysis. We refer to these schemes as Chebyshev Parameterized Runge Kutta (CPRK) methods. CPRK methods of one to four stages are derived as functions of the parameters which describe an ellipse ε which the stability domain of the methods is known to contain. Of particular interest are two-stage, first-order CPRK and four-stage, first-order methods. It is found that the former method can be identified with any two-stage RK method through the correct choice of parameters. The latter method is found to have a wide range of stability domains, with a maximum extension of 32 along the real axis. Recursion performance results are presented below for a model linear convection-diffusion problem as well as non-linear fluid flow problems discretized by both finite-difference and finite-element methods.

  17. Efficient O(N) recursive computation of the operational space inertial matrix

    SciTech Connect

    Lilly, K.W.; Orin, D.E.

    1993-09-01

    The operational space inertia matrix Λ reflects the dynamic properties of a robot manipulator to its tip. In the control domain, it may be used to decouple force and/or motion control about the manipulator workspace axes. The matrix Λ also plays an important role in the development of efficient algorithms for the dynamic simulation of closed-chain robotic mechanisms, including simple closed-chain mechanisms such as multiple manipulator systems and walking machines. The traditional approach used to compute Λ has a computational complexity of O(N³) for an N degree-of-freedom manipulator. This paper presents the development of a recursive algorithm for computing the operational space inertia matrix (OSIM) that reduces the computational complexity to O(N). This algorithm, the inertia propagation method, is based on a single recursion that begins at the base of the manipulator and progresses out to the last link. Also applicable to redundant systems and mechanisms with multiple-degree-of-freedom joints, the inertia propagation method is the most efficient method known for computing Λ for N ≥ 6. The numerical accuracy of the algorithm is discussed for a PUMA 560 robot with a fixed base.

  18. A generalized recursive convolution method for time-domain propagation in porous media.

    PubMed

    Dragna, Didier; Pineau, Pierre; Blanc-Benon, Philippe

    2015-08-01

    An efficient numerical method, referred to as the auxiliary differential equation (ADE) method, is proposed to compute convolutions between relaxation functions and acoustic variables arising in sound propagation equations in porous media. For this purpose, the relaxation functions are approximated in the frequency domain by rational functions. The time variation of the convolution is thus governed by first-order differential equations which can be straightforwardly solved. The accuracy of the method is first investigated and compared to that of recursive convolution methods. It is shown that, while recursive convolution methods are first- or second-order accurate in time, the ADE method does not introduce any additional error. The ADE method is then applied to outdoor sound propagation using the ground model equations proposed by Wilson et al. [(2007). Appl. Acoust. 68, 173-200]. A first one-dimensional case shows that only five poles are necessary to accurately approximate the relaxation functions for typical applications. Finally, the ADE method is used to compute sound propagation in a three-dimensional geometry over an absorbing ground. Results obtained with Wilson's equations are compared to those obtained with Zwikker and Kosten's equations and with an impedance surface for different flow resistivities.
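
    The sketch below illustrates the underlying idea on a made-up kernel: once a relaxation function is approximated by a sum of decaying exponentials (poles), its convolution with a signal can be advanced recursively through one first-order auxiliary equation per pole, instead of storing the whole signal history. The pole values, time step and input signal are invented for illustration.

```python
import numpy as np

# Sketch of the auxiliary-differential-equation (ADE) idea for convolutions
# with a kernel h(t) = sum_k A_k * exp(-lam_k * t).

dt, n = 1e-3, 4000
t = np.arange(n) * dt
x = np.sin(40 * t) * np.exp(-t)             # some input signal

A = np.array([1.0, 0.5])                    # pole amplitudes (assumed fit)
lam = np.array([50.0, 400.0])               # pole decay rates (assumed fit)

# direct, history-storing convolution for reference
h = (A[:, None] * np.exp(-lam[:, None] * t)).sum(axis=0)
y_direct = np.convolve(x, h)[:n] * dt

# ADE / recursive evaluation: dy_k/dt = -lam_k * y_k + A_k * x(t), y = sum_k y_k
yk = np.zeros_like(A)
y_ade = np.zeros(n)
for i in range(n):
    yk = yk + dt * (-lam * yk + A * x[i])   # explicit Euler step per pole
    y_ade[i] = yk.sum()

print(np.max(np.abs(y_ade - y_direct)))     # small discretization mismatch
```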

  19. A recursive regularization algorithm for estimating the particle size distribution from multiangle dynamic light scattering measurements

    NASA Astrophysics Data System (ADS)

    Li, Lei; Yang, Kecheng; Li, Wei; Wang, Wanyan; Guo, Wenping; Xia, Min

    2016-07-01

    Conventional regularization methods have been widely used for estimating the particle size distribution (PSD) in single-angle dynamic light scattering, but they cannot be used directly in multiangle dynamic light scattering (MDLS) measurements for lack of accurate angular weighting coefficients, which greatly affects the PSD determination; moreover, none of the regularization methods perform well for both unimodal and multimodal distributions. In this paper, we propose a recursive regularization method, the Recursion Nonnegative Tikhonov-Phillips-Twomey (RNNT-PT) algorithm, for estimating the weighting coefficients and the PSD from MDLS data. This is a self-adaptive algorithm which distinguishes the characteristics of PSDs and chooses the optimal inversion method from the Nonnegative Tikhonov (NNT) and Nonnegative Phillips-Twomey (NNPT) regularization algorithms efficiently and automatically. In simulations, the proposed algorithm was able to estimate the PSDs more accurately than the classical regularization methods, performed stably against random noise, and was adaptable to both unimodal and multimodal distributions. Furthermore, we found that a six-angle analysis in the 30-130° range is an optimal angle set for both unimodal and multimodal PSDs.
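
    As a minimal illustration of one building block, the sketch below solves a non-negative Tikhonov problem by augmenting the system matrix and calling a standard NNLS routine. The kernel matrix is a toy stand-in rather than an MDLS scattering model, and the recursive selection between NNT and NNPT described in the paper is not reproduced.

```python
import numpy as np
from scipy.optimize import nnls

# Non-negative Tikhonov regularization:
#   min ||A f - g||^2 + alpha * ||f||^2   subject to  f >= 0,
# solved by augmenting the least-squares system and calling NNLS.

rng = np.random.default_rng(2)
m, n = 60, 40
A = np.exp(-np.linspace(0, 5, m)[:, None] * np.linspace(0.1, 2, n)[None, :])
f_true = np.exp(-0.5 * ((np.arange(n) - 15) / 4.0) ** 2)   # smooth unimodal "PSD"
g = A @ f_true + 1e-3 * rng.normal(size=m)                  # noisy synthetic data

alpha = 1e-2
A_aug = np.vstack([A, np.sqrt(alpha) * np.eye(n)])
g_aug = np.concatenate([g, np.zeros(n)])
f_est, _ = nnls(A_aug, g_aug)

print(np.round(np.abs(f_est - f_true).max(), 3))   # reconstruction error of the toy inversion
```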

  20. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization.

    PubMed

    Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996

  1. Multi-fidelity modelling via recursive co-kriging and Gaussian–Markov random fields

    PubMed Central

    Perdikaris, P.; Venturi, D.; Royset, J. O.; Karniadakis, G. E.

    2015-01-01

    We propose a new framework for design under uncertainty based on stochastic computer simulations and multi-level recursive co-kriging. The proposed methodology simultaneously takes into account multi-fidelity in models, such as direct numerical simulations versus empirical formulae, as well as multi-fidelity in the probability space (e.g. sparse grids versus tensor product multi-element probabilistic collocation). We are able to construct response surfaces of complex dynamical systems by blending multiple information sources via auto-regressive stochastic modelling. A computationally efficient machine learning framework is developed based on multi-level recursive co-kriging with sparse precision matrices of Gaussian–Markov random fields. The effectiveness of the new algorithms is demonstrated in numerical examples involving a prototype problem in risk-averse design, regression of random functions, as well as uncertainty quantification in fluid mechanics involving the evolution of a Burgers equation from a random initial state, and random laminar wakes behind circular cylinders. PMID:26345079

  2. A small-world network derived from the deterministic uniform recursive tree by line graph operation

    NASA Astrophysics Data System (ADS)

    Hou, Pengfeng; Zhao, Haixing; Mao, Yaping; Wang, Zhao

    2016-03-01

    The deterministic uniform recursive tree (DURT) is one of the deterministic versions of the uniform recursive tree (URT). Zhang et al (2008 Eur. Phys. J. B 63 507-13) studied the properties of DURT, including its topological characteristics and spectral properties. Although DURT shows a logarithmic scaling with the size of the network, DURT is not a small-world network since its clustering coefficient is zero. Lu et al (2012 Physica A 391 87-92) proposed a deterministic small-world network by adding some edges with a simple rule in each DURT iteration. In this paper, we introduce a method for constructing a new deterministic small-world network by applying the line graph operation in each DURT iteration. The line graph operation creates cliques at each node of the previous graph, so the resulting line graph possesses larger clustering coefficients. On the other hand, this operation decreases the diameter by about one, which allows analytic solutions for several topological characteristics of the proposed model. Supported by The Ministry of Science and Technology 973 project (No. 2010C B334708); National Science Foundation of China (Nos. 61164005, 11161037, 11101232, 11461054, 11551001); The Ministry of education scholars and innovation team support plan of Yangtze River (No. IRT1068); Qinghai Province Nature Science Foundation Project (Nos. 2012-Z-943, 2014-ZJ-907).
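
    The construction step itself is easy to reproduce with standard graph tools. The sketch below applies the line-graph operation to a tree (which, like DURT, has zero clustering) and compares clustering coefficients and diameters; a balanced binary tree stands in for a DURT iteration, whose exact growth rule is not reproduced here.

```python
import networkx as nx

# Sketch of the construction step only: line graph of a zero-clustering tree.

G = nx.balanced_tree(2, 4)          # a tree, so its clustering coefficient is 0
L = nx.line_graph(G)                # the line graph turns each node's incident edges into a clique

print("clustering (tree):      ", nx.average_clustering(G))
print("clustering (line graph):", round(nx.average_clustering(L), 3))
print("diameter before/after:  ", nx.diameter(G), nx.diameter(L))
```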

  3. Optimal waveform-based clutter suppression algorithm for recursive synthetic aperture radar imaging systems

    NASA Astrophysics Data System (ADS)

    Zhu, Binqi; Gao, Yesheng; Wang, Kaizhi; Liu, Xingzhao

    2016-04-01

    A computational method for suppressing clutter and generating clear microwave images of targets is proposed in this paper; it combines synthetic aperture radar (SAR) principles with a recursive method and waveform design theory, and it is suitable for SAR in special applications. The nonlinear recursive model is introduced into the SAR operation principle, and the cubature Kalman filter algorithm is used to estimate target and clutter responses in each azimuth position based on their previous states, which are both assumed to be Gaussian distributions. NP-criterion-based optimal waveforms are designed repeatedly as the sensor flies along its azimuth path and are used as the transmitting signals. A clutter suppression filter is then designed and added to suppress the clutter response while maintaining most of the target response. Thus, with fewer disturbances from the clutter response, we can generate the SAR image with traditional azimuth matched filters. Our simulations show that the clutter suppression filter significantly reduces the clutter response, and our algorithm greatly improves the SINR of the SAR image for different clutter suppression filter parameters. As such, this algorithm may be preferable for special target imaging when prior information on the target is available.

  4. A conceptual basis to encode and detect organic functional groups in XML.

    PubMed

    Sankar, Punnaivanam; Krief, Alain; Vijayasarathi, Durairaj

    2013-06-01

    A conceptual basis to define and detect organic functional groups is developed. The basic model of a functional group is termed a primary functional group and is characterized by a group center composed of one or more group center atoms bonded to terminal atoms and skeletal carbon atoms. The generic group center patterns are identified from the structures of known functional groups. Accordingly, a chemical ontology 'Font' is developed to organize the existing functional groups as well as the new ones to be defined by chemists. The basic model is extended to accommodate various combinations of primary functional groups as functional group assemblies. A concept of skeletal group is proposed to define the characteristic groups composed of only carbon atoms, which are regarded as equivalent to functional groups. The combination of primary functional groups with skeletal groups is categorized as a skeletal group assembly. In order to make the model suitable for reaction modeling purposes, a Graphical User Interface (GUI) is developed to define the functional groups and encode them in an XML format appropriate for detecting them in chemical structures. The system is capable of detecting multiple instances of primary functional groups as well as overlapping poly-functional groups as the respective assemblies. PMID:23666030
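
    A hypothetical flavour of such an encoding is sketched below with Python's standard XML tooling; the element and attribute names are invented for illustration and are not the schema of the 'Font' ontology described in the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of encoding a primary functional group in XML.
# Tag and attribute names are invented, not the paper's schema.

group = ET.Element("FunctionalGroup", name="carboxylic acid")
center = ET.SubElement(group, "GroupCenter")
ET.SubElement(center, "CenterAtom", symbol="C")
ET.SubElement(center, "TerminalAtom", symbol="O", bond="double")
ET.SubElement(center, "TerminalAtom", symbol="O", bond="single")
ET.SubElement(group, "SkeletalCarbon", count="1")

print(ET.tostring(group, encoding="unicode"))
```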

  5. Automatic Indexing for Content Analysis of Whale Recordings and XML Representation

    NASA Astrophysics Data System (ADS)

    Bénard, Frédéric; Glotin, Hervé

    2010-12-01

    This paper focuses on the robust indexing of sperm whale hydrophone recordings based on a set of features extracted from a real-time passive underwater acoustic tracking algorithm for multiple whales using four hydrophones. Acoustic localization permits the study of whale behavior in deep water without interfering with the environment. Given the position coordinates, we are able to generate different features such as the speed, the energy of the clicks, the Inter-Click-Interval (ICI), and so on. These features allow us to construct different markers with which to index and structure the audio files. The behavior study is thus facilitated by choosing and accessing the corresponding index in the audio file. The complete indexing algorithm is run on real data from the NUWC (Naval Undersea Warfare Center of the US Navy) and the AUTEC (Atlantic Undersea Test & Evaluation Center-Bahamas). Our model is validated by similar results from the US Navy (NUWC) and SOEST (School of Ocean and Earth Science and Technology) Hawaii university labs in a single-whale case. Finally, as an illustration, we index a single-whale sound file using the extracted whale features provided by the tracking, and we present an example of an XML script structuring it.

  6. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  7. Development of XML Schema for Broadband Digital Seismograms and Data Center Portal

    NASA Astrophysics Data System (ADS)

    Takeuchi, N.; Tsuboi, S.; Ishihara, Y.; Nagao, H.; Yamagishi, Y.; Watanabe, T.; Yanaka, H.; Yamaji, H.

    2008-12-01

    There are a number of data centers around the globe where digital broadband seismograms are open to researchers. Those centers use their own user interfaces, and there is no standard for accessing and retrieving seismograms from different data centers through a unified interface. One of the emergent technologies for realizing a unified user interface across different data centers is the concept of Web services and a Web service portal. Here we have developed a prototype data center portal for digital broadband seismograms. This Web service portal uses WSDL (Web Services Description Language) to accommodate differences among the data centers. By using WSDL, alteration and addition of data center user interfaces can be easily managed. This portal, called the NINJA Portal, assumes three Web services: (1) a database query service, (2) a seismic event data request service, and (3) a seismic continuous data request service. The current system supports both the station search of the database query service and the seismic continuous data request service. The data centers supported by the NINJA portal will initially be the OHP data center at ERI and the Pacific21 data center at IFREE/JAMSTEC. We have developed a metadata standard for seismological data based on QuakeML for parametric data, which has been developed by ETH Zurich, and XML-SEED for waveform data, which was developed by IFREE/JAMSTEC. The prototype NINJA portal is now released through the IFREE web page (http://www.jamstec.go.jp/pacific21/).

  8. ISS Update: Suitport Testing

    NASA Video Gallery

    ISS Update commentator Lynnette Madison interviews Joel Maganza, Test Director, about thermal vacuum chambers and unmanned and human-testing with the Suitport. Questions? Ask us on Twitter @NASA_Jo...

  9. ISS Update: NEEMO 16

    NASA Video Gallery

    ISS Update commentator Josh Byerly interviews astronaut Stan Love about the NEEMO 16 mission from Aquarius Base. Questions? Ask us on Twitter @NASA_Johnson and include the hashtag #askStation. For ...

  10. ACS Updates Environmental Report.

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1978

    1978-01-01

    Describes a new publication of a report prepared by the American Chemical Society's Committee on Environmental Improvement. This is a new version that updates a 1969 report and contains additional material and expanded recommendations. (GA)

  11. Environmental regulatory update table

    SciTech Connect

    Brown, K.J.; Langston, M.E.; Tucker, C.S.; Reed, R.M.

    1987-06-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  12. Using Recursive Regression to Explore Nonlinear Relationships and Interactions: A Tutorial Applied to a Multicultural Education Study

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2009-01-01

    This paper discusses how a seldom-used statistical procedure, recursive regression (RR), can numerically and graphically illustrate data-driven nonlinear relationships and interactions of variables. This routine falls into the family of exploratory techniques, yet a few interesting features make it a valuable complement to factor analysis and…

  13. A Comparison between Cure Model and Recursive Partitioning: A Retrospective Cohort Study of Iranian Female with Breast Cancer

    PubMed Central

    Safe, Mozhgan; Faradmal, Javad

    2016-01-01

    Background. Breast cancer, which is the most common cause of cancer death among women, has increasing incidence and mortality rates in Iran. Proper modeling would correctly detect the factors' effects on breast cancer, which may form the basis of health care planning. Therefore, this study aimed to apply two recently introduced statistical models and to compare them as survival prediction tools for breast cancer patients. Materials and Methods. For this retrospective cohort study, the 18-year follow-up information of 539 breast cancer patients was analyzed by the “Parametric Mixture Cure Model” and “Model-Based Recursive Partitioning.” Furthermore, a simulation study was carried out to compare the performance of the two models in different situations. Results. “Model-Based Recursive Partitioning” was able to present a better description of the dataset and provided a fine separation of individuals with different risk levels. Additionally, the results of the simulation study confirmed the superiority of this recursive partitioning approach for nonlinear model structures. Conclusion. “Model-Based Recursive Partitioning” seems to be a potential instrument for processing complex mixture cure models. Therefore, applying this model is recommended for long-term survival patients. PMID:27660647

  15. An Introduction to Recursive Partitioning: Rationale, Application, and Characteristics of Classification and Regression Trees, Bagging, and Random Forests

    ERIC Educational Resources Information Center

    Strobl, Carolin; Malley, James; Tutz, Gerhard

    2009-01-01

    Recursive partitioning methods have become popular and widely used tools for nonparametric regression and classification in many scientific fields. Especially random forests, which can deal with large numbers of predictor variables even in the presence of complex interactions, have been applied successfully in genetics, clinical medicine, and…

  16. On construction of symmetries and recursion operators from zero-curvature representations and the Darboux-Egoroff system

    NASA Astrophysics Data System (ADS)

    Igonin, Sergei; Marvan, Michal

    2014-11-01

    The Darboux-Egoroff system of PDEs with any number n≥3 of independent variables plays an essential role in the problems of describing n-dimensional flat diagonal metrics of Egoroff type and Frobenius manifolds. We construct a recursion operator and its inverse for symmetries of the Darboux-Egoroff system and describe some symmetries generated by these operators. The constructed recursion operators are not pseudodifferential, but are Bäcklund autotransformations for the linearized system whose solutions correspond to symmetries of the Darboux-Egoroff system. For some other PDEs, recursion operators of similar types were considered previously by Papachristou, Guthrie, Marvan, Pobořil, and Sergyeyev. In the structure of the obtained third and fifth order symmetries of the Darboux-Egoroff system, one finds the third and fifth order flows of an (n-1)-component vector modified KdV hierarchy. The constructed recursion operators generate also an infinite number of nonlocal symmetries. In particular, we obtain a simple construction of nonlocal symmetries that were studied by Buryak and Shadrin in the context of the infinitesimal version of the Givental-van de Leur twisted loop group action on the space of semisimple Frobenius manifolds. We obtain these results by means of rather general methods, using only the zero-curvature representation of the considered PDEs.

  17. Question Utilization in Solution-Focused Brief Therapy: A Recursive Frame Analysis of Insoo Kim Berg's Solution Talk

    ERIC Educational Resources Information Center

    Cotton, Jeffrey

    2010-01-01

    Recursive frame analysis (RFA) was used to conduct a single case investigation of Insoo Kim Berg's question utilization talk in a solution-focused brief therapy (SFBT) session. Due to the lack of process research that explores how SFBT questions facilitate change, the author investigated how Berg's solution language influenced a client to respond…

  18. Diverse Pathways to Positive and Negative Affect in Adulthood and Later Life: An Integrative Approach Using Recursive Partitioning

    ERIC Educational Resources Information Center

    Gruenewald, Tara L.; Mroczek, Daniel K.; Ryff, Carol D.; Singer, Burton H.

    2008-01-01

    Recursive partitioning is an analytic technique that is useful for identifying complex combinations of conditions that predict particular outcomes as well as for delineating multiple subgroup differences in how such factors work together. As such, the methodology is well suited to multidisciplinary, life course inquiry in which the goal is to…

  19. On the Shock-Response-Spectrum Recursive Algorithm of Kelly and Richman

    NASA Technical Reports Server (NTRS)

    Martin, Justin N.; Sinclair, Andrew J.; Foster, Winfred A.

    2010-01-01

    The monograph Principles and Techniques of Shock Data Analysis written by Kelly and Richman in 1969 has become a seminal reference on the shock response spectrum (SRS) [1]. Because of its clear physical descriptions and mathematical presentation of the SRS, it has been cited in multiple handbooks on the subject [2, 3] and research articles [4-10]. Because of continued interest, two additional versions of the monograph have been published: a second edition by Scavuzzo and Pusey in 1996 [11] and a reprint of the original edition in 2008 [12]. The main purpose of this note is to correct several typographical errors in the monograph's presentation of a recursive algorithm for SRS calculations. These errors are consistent across all three editions of the monograph. The secondary purpose of this note is to present a Matlab implementation of the corrected algorithm.
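
    The note itself supplies a Matlab implementation of the corrected recursive algorithm. Purely for orientation, the Python sketch below computes an SRS the slow way, by directly simulating each single-degree-of-freedom oscillator with a standard linear-system solver rather than with the Kelly-Richman recursive formulas; the pulse, damping ratio and frequency grid are arbitrary choices.

```python
import numpy as np
from scipy.signal import lsim

# Brute-force SRS: for each natural frequency, simulate the SDOF response to
# the base acceleration and record the peak absolute response acceleration.

dt = 1e-4
t = np.arange(0, 0.1, dt)
base_accel = np.where(t < 0.01, 500.0, 0.0)   # toy rectangular acceleration pulse
zeta = 0.05                                   # damping ratio (Q = 10)

freqs = np.logspace(1, 3, 30)                 # 10 Hz .. 1 kHz
srs = []
for f in freqs:
    wn = 2 * np.pi * f
    # transfer function from base acceleration to absolute response acceleration
    num = [2 * zeta * wn, wn ** 2]
    den = [1, 2 * zeta * wn, wn ** 2]
    _, y, _ = lsim((num, den), base_accel, t)
    srs.append(np.max(np.abs(y)))

print([round(v, 1) for v in srs[:5]])
```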

  20. Management of Large-Scale Wireless Sensor Networks Utilizing Multi-Parent Recursive Area Hierarchies

    SciTech Connect

    Cree, Johnathan V.; Delgado-Frias, Jose

    2013-04-19

    Autonomously configuring and self-healing a large-scale wireless sensor network requires a light-weight maintenance protocol that is scalable. Further, in a battery-powered wireless sensor network, duty-cycling a node's radio can reduce the power consumption of a device and extend the lifetime of the network. With duty-cycled nodes, the power consumption of a node's radio depends on the amount of communication it must perform, and by reducing the communication the power consumption can also be reduced. Multi-parent hierarchies can be used to reduce the communication cost when constructing a recursive area clustering hierarchy, compared to single-parent solutions that utilize inefficient communication methods such as flooding and information propagation via single-hop broadcasts. The multi-parent hierarchies remain scalable and provide a level of redundancy for the hierarchy.

  1. Recursive state estimation for discrete time-varying stochastic nonlinear systems with randomly occurring deception attacks

    NASA Astrophysics Data System (ADS)

    Ding, Derui; Shen, Yuxuan; Song, Yan; Wang, Yongxiong

    2016-07-01

    This paper is concerned with the state estimation problem for a class of discrete time-varying stochastic nonlinear systems with randomly occurring deception attacks. The stochastic nonlinearity, which is described by statistical means and covers several classes of well-studied nonlinearities as special cases, is taken into consideration. The randomly occurring deception attacks are modelled by a set of random variables obeying Bernoulli distributions with given probabilities. The purpose of the addressed state estimation problem is to design an estimator that minimizes the upper bound on the estimation error covariance at each sampling instant. Such an upper bound is minimized by properly designing the estimator gain. The proposed estimation scheme, in the form of two Riccati-like difference equations, is recursive. Finally, a simulation example is exploited to demonstrate the effectiveness of the proposed scheme.
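
    The toy simulation below conveys the setting only: a scalar system whose measurements are occasionally corrupted by a Bernoulli-distributed deception signal, filtered with an ordinary recursive Kalman-type update. The system coefficients, noise levels and attack parameters are invented, and the estimator is not the upper-bound-minimizing design derived in the paper.

```python
import numpy as np

# Scalar system with randomly occurring (Bernoulli) deception attacks on the
# measurement, filtered with a standard recursive Kalman update.

rng = np.random.default_rng(3)
n = 200
a, c = 0.95, 1.0          # state and measurement coefficients
q, r = 0.01, 0.04         # process and measurement noise variances
p_attack, bias = 0.1, 2.0 # attack probability and injected false signal

x, xhat, P = 0.0, 0.0, 1.0
err = []
for k in range(n):
    x = a * x + np.sqrt(q) * rng.normal()
    attack = rng.random() < p_attack
    y = c * x + np.sqrt(r) * rng.normal() + (bias if attack else 0.0)

    # time update
    xhat, P = a * xhat, a * a * P + q
    # measurement update (one Riccati-like recursion step per sample)
    K = P * c / (c * c * P + r)
    xhat, P = xhat + K * (y - c * xhat), (1 - K * c) * P
    err.append(abs(x - xhat))

print("mean absolute estimation error:", round(float(np.mean(err)), 3))
```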

  2. Human motion planning based on recursive dynamics and optimal control techniques

    NASA Technical Reports Server (NTRS)

    Lo, Janzen; Huang, Gang; Metaxas, Dimitris

    2002-01-01

    This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.

  3. A Recursive Multiscale Correlation-Averaging Algorithm for an Automated Distributed Road Condition Monitoring System

    SciTech Connect

    Ndoye, Mandoye; Barker, Alan M; Krogmeier, James; Bullock, Darcy

    2011-01-01

    A signal processing approach is proposed to jointly filter and fuse spatially indexed measurements captured from many vehicles. It is assumed that these measurements are influenced by both sensor noise and measurement indexing uncertainties. Measurements from low-cost vehicle-mounted sensors (e.g., accelerometers and Global Positioning System (GPS) receivers) are properly combined to produce higher quality road roughness data for cost-effective road surface condition monitoring. The proposed algorithms are recursively implemented and thus require only moderate computational power and memory space. These algorithms are important for future road management systems, which will use on-road vehicles as a distributed network of sensing probes gathering spatially indexed measurements for condition monitoring, in addition to other applications, such as environmental sensing and/or traffic monitoring. Our method and the related signal processing algorithms have been successfully tested using field data.
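
    A minimal sketch of the recursive averaging component is given below: spatially indexed roughness samples arriving from many vehicles are folded into per-bin running means one at a time. The bin length and sample values are invented, and the index-uncertainty handling and multiscale correlation steps of the paper are not reproduced.

```python
from collections import defaultdict

# Streaming (recursive) averaging of spatially indexed roughness samples.

bin_m = 10.0                                  # spatial bin length in metres
count = defaultdict(int)
mean = defaultdict(float)

def update(position_m, roughness):
    """Recursively fold one (position, roughness) sample into its bin mean."""
    b = int(position_m // bin_m)
    count[b] += 1
    mean[b] += (roughness - mean[b]) / count[b]   # running-mean recursion

# samples arriving from different vehicles, in any order (made-up numbers)
for pos, rough in [(3.2, 1.1), (4.8, 0.9), (12.5, 2.4), (14.0, 2.6), (5.1, 1.0)]:
    update(pos, rough)

print(dict(mean))   # per-bin averaged roughness
```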

  4. Solution of the antiferromagnetic Ising model on a tetrahedron recursive lattice.

    PubMed

    Jurčišinová, E; Jurčišin, M

    2014-03-01

    We consider the antiferromagnetic spin-1/2 Ising model on the recursive tetrahedron lattice on which two elementary tetrahedrons are connected at each site. The model represents the simplest approximation of the antiferromagnetic Ising model on the real three-dimensional tetrahedron lattice which takes into account effects of frustration. An exact analytical solution of the model is found and discussed. It is shown that the model exhibits neither the first-order nor the second-order phase transitions. A detailed analysis of the magnetization of the model in the presence of the external magnetic field is performed and the existence of the magnetization plateaus for low temperatures is shown. All possible ground states of the model are found and discussed. The existence of nontrivial singular ground states is proven and exact explicit expressions for them are found.

  5. Towards Interactive Construction of Topical Hierarchy: A Recursive Tensor Decomposition Approach

    PubMed Central

    Wang, Chi; Liu, Xueqing; Song, Yanglei; Han, Jiawei

    2015-01-01

    Automatic construction of user-desired topical hierarchies over large volumes of text data is a highly desirable but challenging task. This study proposes to give users freedom to construct topical hierarchies via interactive operations such as expanding a branch and merging several branches. Existing hierarchical topic modeling techniques are inadequate for this purpose because (1) they cannot consistently preserve the topics when the hierarchy structure is modified; and (2) the slow inference prevents swift response to user requests. In this study, we propose a novel method, called STROD, that allows efficient and consistent modification of topic hierarchies, based on a recursive generative model and a scalable tensor decomposition inference algorithm with theoretical performance guarantee. Empirical evaluation shows that STROD reduces the runtime of construction by several orders of magnitude, while generating consistent and quality hierarchies. PMID:26705505

  6. Cloud Computing Application for Hotspot Clustering Using Recursive Density Based Clustering (RDBC)

    NASA Astrophysics Data System (ADS)

    Santoso, Aries; Khiyarin Nisa, Karlina

    2016-01-01

    Indonesia has vast areas of tropical forest, but these are often burned, causing extensive damage to property and human life. Monitoring hotspots is one component of forest fire management. Each hotspot is recorded in a dataset so that it can be processed and analyzed. This research aims to build a cloud computing application that visualizes hotspot clustering. The application uses the R programming language with the Shiny web framework and implements the Recursive Density Based Clustering (RDBC) algorithm. Clustering is performed on hotspot datasets for Kalimantan Island and South Sumatra Province to find the spread pattern of hotspots. The clustering results are evaluated using the Silhouette Coefficient (SC), which yields a best value of 0.3220798 for the Kalimantan dataset. Clustering patterns are displayed as web pages so that they can be widely accessed and serve as a reference for fire occurrence prediction.

  7. Weak coupling expansion of Yang-Mills theory on recursive infinite genus surfaces

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debashis; Imbimbo, Camillo; Kumar, Dushyant

    2014-10-01

    We analyze the partition function of two dimensional Yang-Mills theory on a family of surfaces of infinite genus. These surfaces have a recursive structure, which was used by one of us to compute the partition function that results in a generalized Migdal formula. In this paper we study the `small area' (weak coupling) expansion of the partition function, by exploiting the fact that the generalized Migdal formula is analytic in the (complexification of the) Euler characteristic. The structure of the perturbative part of the weak coupling expansion suggests that the moduli space of flat connections (of the SU(2) and SO(3) theories) on these infinite genus surfaces are well defined, perhaps in an appropriate regularization.

  8. Parallel Implementation of the Recursive Approximation of an Unsupervised Hierarchical Segmentation Algorithm. Chapter 5

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Plaza, Antonio J. (Editor); Chang, Chein-I. (Editor)

    2008-01-01

    The hierarchical image segmentation algorithm (referred to as HSEG) is a hybrid of hierarchical step-wise optimization (HSWO) and constrained spectral clustering that produces a hierarchical set of image segmentations. HSWO is an iterative approach to region growing segmentation in which the optimal image segmentation is found at N_R regions, given a segmentation at N_{R+1} regions. HSEG's addition of constrained spectral clustering makes it a computationally intensive algorithm for all but the smallest of images. To counteract this, a computationally efficient recursive approximation of HSEG (called RHSEG) has been devised. Further improvements in processing speed are obtained through a parallel implementation of RHSEG. This chapter describes this parallel implementation and demonstrates its computational efficiency on a Landsat Thematic Mapper test scene.

  9. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP1000 series E with an RTE 2 disc operating system and 32K words of memory. A Grinnel 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on an 800 BPI, 9 track tape. Four 512 x 512 images representing four spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.

  10. An explicit matrix formulation of the dynamical equations for flexible multibody systems - A recursive approach

    NASA Astrophysics Data System (ADS)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    The dynamic simulation of complex rigid/flexible multibody systems relies greatly on the presentation and development of the equations of motion. To achieve computational speed in the execution and to further develop the control algorithms, the expressions involved in the kinematics and the subsequent coefficients associated with the equations of motion must be clearly defined. The intention of this paper is to develop a recursive formulation based on the finite element method in which all terms are presented in matrix form. The methodology permits one to identify the coupling between rigid and flexible body motion and to build the necessary arrays for the application at hand. The equations of motion are based on Kane's equations and a general matrix representation of the partial velocities and partial angular velocities for n bodies. The algorithm developed is applied to a single two-link robot manipulator, and the resulting explicit equations of motion are presented.

  11. System Simulation by Recursive Feedback: Coupling a Set of Stand-Alone Subsystem Simulations

    NASA Technical Reports Server (NTRS)

    Nixon, D. D.

    2001-01-01

    Conventional construction of digital dynamic system simulations often involves collecting differential equations that model each subsystem, arranging them into a standard form, and obtaining their numerical solution as a single coupled, total-system simultaneous set. Simulation by numerical coupling of independent stand-alone subsimulations is a fundamentally different approach that is attractive because, among other things, the architecture naturally facilitates high fidelity, broad scope, and discipline independence. Recursive feedback is defined and discussed as a candidate approach to multidiscipline dynamic system simulation by numerical coupling of self-contained, single-discipline subsystem simulations. A satellite motion example containing three subsystems (orbit dynamics, attitude dynamics, and aerodynamics) has been defined and constructed using this approach. Conventional solution methods are used in the subsystem simulations. Distributed and centralized implementations of coupling have been considered. Numerical results are evaluated by direct comparison with a standard total-system, simultaneous-solution approach.

  12. Topological recursion for chord diagrams, RNA complexes, and cells in moduli spaces

    NASA Astrophysics Data System (ADS)

    Andersen, Jørgen E.; Chekhov, Leonid O.; Penner, R. C.; Reidys, Christian M.; Sułkowski, Piotr

    2013-01-01

    We introduce and study the Hermitian matrix model with potential V(x) = x^2/2 - stx/(1-tx), which enumerates the number of linear chord diagrams with no isolated vertices of fixed genus with specified numbers of backbones generated by s and chords generated by t. For the one-cut solution, the partition function, correlators and free energies are convergent for small t and all s as a perturbation of the Gaussian potential, which arises for st = 0. This perturbation is computed using the formalism of the topological recursion. The corresponding enumeration of chord diagrams gives at once the number of RNA complexes of a given topology as well as the number of cells in Riemann's moduli spaces for bordered surfaces. The free energies are computed here in principle for all genera and explicitly in genus less than four.

  13. Recursive method to obtain the parametric representation of a generic Feynman diagram

    SciTech Connect

    Gonzalez, Ivan; Schmidt, Ivan

    2005-11-15

    A recursive algebraic method which allows one to obtain the Feynman or Schwinger parametric representation of a generic L-loops and (E+1) external lines diagram, in a scalar φ^3 + φ^4 theory, is presented. The representation is obtained starting from an initial parameters matrix, which relates the scalar products between internal and external momenta, and which appears directly when this parametrization is applied to the momentum space representation of the graph. The final product is an algebraic formula that shows explicitly the external momenta dependence and also an algorithm that can be easily programmed, either in a computer programming language (C/C++, Fortran, ...) or in a symbolic calculation package (Maple, Mathematica, ...).

  14. Laplacian spectrum of a family of recursive trees and its applications in network coherence

    NASA Astrophysics Data System (ADS)

    Sun, Weigang; Xuan, Tengfei; Qin, Sen

    2016-06-01

    Many of the topological and dynamical properties of a network are related to its Laplacian spectrum; these properties include network diameter, Kirchhoff index, and mean first-passage time. This paper investigates consensus dynamics in a linear dynamical system with additive stochastic disturbances, which is characterized as network coherence by the Laplacian spectrum. We choose a family of uniform recursive trees as our model, and propose a method to calculate the first- and second-order network coherence. Using the tree structures, we identify a relationship between the Laplacian matrix and Laplacian eigenvalues. We then derive the exact solutions for the reciprocals and square reciprocals of all nonzero Laplacian eigenvalues. We also obtain the scalings of network coherence with network size. The scalings of network coherence of the studied trees are smaller than those of Vicsek fractals and are not related to its fractal dimension.
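
    The quantities computed in this record can be checked numerically: first- and second-order network coherence are commonly defined (up to the normalization convention) through the sums of reciprocals and squared reciprocals of the nonzero Laplacian eigenvalues. The sketch below evaluates them for a small randomly grown recursive tree; it is a brute-force numerical check, not the paper's closed-form solution.

```python
import numpy as np
import networkx as nx

def network_coherence(G):
    """Evaluate first- and second-order network coherence from the nonzero
    Laplacian eigenvalues (the 1/(2N) normalization follows one common
    convention and may differ from the paper's)."""
    lam = np.sort(nx.laplacian_spectrum(G))[1:]   # drop the single zero eigenvalue
    n = G.number_of_nodes()
    h1 = np.sum(1.0 / lam) / (2 * n)
    h2 = np.sum(1.0 / lam ** 2) / (2 * n)
    return h1, h2

# Grow a small uniform recursive tree: each new node attaches to a
# uniformly chosen existing node.
rng = np.random.default_rng(0)
G = nx.Graph([(0, 1)])
for new in range(2, 64):
    G.add_edge(new, int(rng.integers(new)))
print(network_coherence(G))
```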

  15. Recursive multiport schemes for implementing quantum algorithms with photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Tabia, Gelo Noel M.

    2016-01-01

    We present recursive multiport schemes for implementing quantum Fourier transforms and the inversion step in Grover's algorithm on an integrated linear optics device. In particular, each scheme shows how to execute a quantum operation on 2d modes using a pair of circuits for the same operation on d modes. The circuits operate on path-encoded qudits and realize d-dimensional unitary transformations on these states using linear optical networks with O(d^2) optical elements. To evaluate the schemes against realistic errors, we ran simulations of proof-of-principle experiments using a simple fabrication model of silicon-based photonic integrated devices that employ directional couplers and thermo-optic modulators for beam splitters and phase shifters, respectively. We find that high-fidelity performance is achievable with our multiport circuits for 2-qubit and 3-qubit quantum Fourier transforms, and for quantum search on four-item and eight-item databases.

  16. Spatial join optimization among WFSs based on recursive partitioning and filtering rate estimation

    NASA Astrophysics Data System (ADS)

    Lan, Guiwen; Wu, Congcong; Shi, Guangyi; Chen, Qi; Yang, Zhao

    2015-12-01

    Spatial joins among Web Feature Services (WFS) are time-consuming because most non-candidate spatial objects may be encoded in GML and transferred to the client side. In this paper, an optimization strategy is proposed to enhance the performance of these joins by filtering out as many non-candidate spatial objects as possible. Recursive partitioning exploits the data skew of sub-areas to reduce data transmission via spatial semi-joins. Moreover, the filtering rate is used to determine whether a spatial semi-join for a sub-area is profitable and to choose a suitable execution plan for it. The experimental results show that the proposed strategy is feasible under most circumstances.

  17. Parallel 2D and 3D Prestack Depth Migration Using Recursive Kirchhoff Wavefield Extrapolation

    NASA Astrophysics Data System (ADS)

    Geiger, H. D.; Margrave, G. F.; Liu, K.

    2004-05-01

    Recursive Kirchhoff wavefield extrapolation in the space-frequency domain can be thought of as a simple convolutional filter that calculates a single output point at depth z+dz using a weighted summation of all input points within the extrapolator aperture at depth z. The desired velocity values for the extrapolator are the ones that provide the best approximation of the true phase (propagation time) of the seismic wavefield between the input points and the output point. Recursive Kirchhoff extrapolators can be designed to handle lateral variations in velocity in a number of ways: a PSPI-type (phase shift plus interpolation) extrapolator uses only the velocity at the output point, a NSPS-type (nonstationary phase shift) extrapolator uses the velocities at the input points; a SNPS-type (symmetric nonstationary phase shift) extrapolator incorporates two extrapolation steps of dz/2 where the first step uses the velocities at the input points (NSPS-type) and the second step uses the velocity at the output point (PSPI-type); while the Weyl-type extrapolator uses an average of the velocities between each input point and the output point. Here, we introduce the PAVG-type (slowness averaged) extrapolator, which uses velocity values calculated by an average of slowness along straight raypaths between each input point and the output point. Parallel 2D and 3D prestack depth migration algorithms have been coded in both MATLAB and C and tested on a small Linux cluster. A simple synthetic with a lateral step in velocity shows that the PAVG Kirchhoff extrapolator is very close to the exact desired response. Tests using the 2D Marmousi synthetic data set suggest that the extrapolator behaviour is only one of many considerations that must be addressed for accurate depth imaging. Other important considerations include preprocessing, aperture size, taper width, extrapolator stability, and imaging condition.

  18. Recursive adjustment approach for the inversion of the Euler-Liouville Equation

    NASA Astrophysics Data System (ADS)

    Kirschner, S.; Seitz, F.

    2012-04-01

    Earth rotation is physically described by the Euler-Liouville Equation, which is based on the balance of angular momentum in the Earth system. The Earth orientation parameters (EOP), polar motion and length of day, have been observed with high precision by geodetic methods over many decades. A sensitivity analysis showed that some weakly determined Earth parameters have a great influence on the numerical forward modeling of the EOP. We therefore concentrate on the inversion of the Euler-Liouville Equation in order to estimate and improve such parameters. A recursive adjustment approach allows the Euler-Liouville Equation to be inverted efficiently. Here we concentrate on the estimation of parameters related to the period and damping of the free rotation of the Earth (Chandler oscillation). Before we apply the approach to the complex Earth system, we demonstrate its concept on the simplified example of a spring-mass-damper system. The spring-mass-damper system is analogous to the damped Chandler oscillation and the results can be transferred directly. The differential equation describing the motion of the spring also has the same structure as the Euler-Liouville Equation. The spring constant and damping coefficient describing the anelastic behavior of the system correspond to the real and imaginary parts of the Earth's pole tide Love number. Therefore the simplified model is ideal for studying various aspects, e.g. the influences of sampling rate, overall time frame, and the number of observations on the numerical results. It is shown that the recursive adjustment approach is an adequate method for the estimation of the spring parameters and therewith for the parameters describing the Earth's rheology. The study is carried out in the frame of the German research unit on Earth Rotation and Global Dynamic Processes.

  19. SU-E-T-48: Automated Quality Assurance for XML Controlled Linacs

    SciTech Connect

    Valdes, G; Morin, O; Pouliot, J; Chuang, C

    2014-06-01

    Purpose: To automate routine imaging QA procedures so that complying with TG 142 and TG 179 can be efficient and reliable. Methods: Two QA tests for a TrueBeam linac were automated: a Winston-Lutz test as described by Lutz et al., using the Winston-Lutz test kit from BrainLab, Germany, and a CBCT image quality test as described in TG 179, using the EMMA phantom (Siemens Medical Physics, Germany). For each QA procedure tested, a three-step paradigm was used. First, the data were automatically acquired using TrueBeam Developer Mode and XML scripting. Second, the acquired data were automatically processed using in-house Matlab GUIs. Third, machine learning algorithms were used to automatically classify the processed data and generate reports. Results: The Winston-Lutz test could be performed by an experienced medical physicist in 29.0 ± 8.0 min. The same test, automated using our paradigm, could be performed in 3.0 ± 0.1 min. Similarly, substantial time could be saved for image quality tests; in this case, the amount of time saved depends on the phantoms used and the initial localization method. Additionally, machine learning algorithms could automatically identify the root causes of any problems and possibly help reduce machine downtime. Conclusion: Modern linear accelerators are equipped with advanced 2D and 3D imaging used for patient alignment, substantially improving IGRT protocols. However, this extra complexity greatly increases the number of QA tests needed. Using the paradigm described above, not only bare-minimum but best-practice QA programs could be implemented with the same manpower. This work is supported by Varian, Palo Alto, CA.

  20. Using XML for Instrument Description, Communication and Control of the SOFIA/HAWC Instrument

    NASA Astrophysics Data System (ADS)

    Ames, T. A.; Sall, K. B.; Warsaw, C. E.; Shafer, R. A.

    1998-12-01

    The goal of the Instrument Remote Control (IRC) project is to develop a distributed framework from science user to instrument which will provide robust interactive and reconfigurable control and monitoring of remote instrumentation. The focus of the joint effort between NASA/GSFC's Advanced Architectures and Automation branch (Code 588) and Century Computing has been infrared astronomy, although most of the techniques employed have much wider applicability. This poster presentation will describe the work currently underway for Stratospheric Observatory For Infrared Astronomy (SOFIA) in developing an Extensible Markup Language (XML) vocabulary to aid in instrument description, communication and control. In particular, the instruments to be controlled are the High-resolution Airborne Wideband Camera (HAWC) and ultimately the Submillimeter And Far InfraRed Experiment (SAFIRE). IRC will enable trusted infrared astronomers around the world to easily access infrared astronomical instruments located in remote, inhospitable environments. The long-term focus is to develop an extensible framework to which new instruments can be added with relative ease. This will eventually be accomplished by implementing our own Instrument Control Markup Language (ICML) based on a custom Document Type Definition (DTD). ICML will be used to describe control capabilities, data streams, message formats, and communication mechanisms, as well as for online documentation and the association of housekeeping metadata with acquired images. Some of these aspects of instrument control will be reflected in Java graphical user interfaces, generated from the instrument descriptions. Other sections of the instrument description will be applied to data capture, as well as to other instrument subsystems.
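
    The ICML vocabulary itself is not reproduced in this record, but the general idea of driving control software from an XML instrument description can be sketched with Python's standard xml.etree.ElementTree; every element and attribute name below is an invented placeholder, not part of IRC or ICML.

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument description -- the element/attribute names are
# illustrative only and do not reflect the actual ICML DTD.
doc = """
<instrument name="HAWC">
  <command name="setBias" type="float" units="V" min="0.0" max="5.0"/>
  <command name="readFrame" type="image"/>
  <telemetry name="detectorTemp" units="K" rate="1Hz"/>
</instrument>
"""

root = ET.fromstring(doc)
# Build a command table from which a generic GUI or dispatcher could be
# generated, in the spirit of what the abstract describes.
commands = {c.get("name"): dict(c.attrib) for c in root.findall("command")}
print(root.get("name"), "->", sorted(commands))
```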

  1. Recursive mass matrix factorization and inversion: An operator approach to open- and closed-chain multibody dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Kreutz, K.

    1988-01-01

    This report advances a linear operator approach for analyzing the dynamics of systems of joint-connected rigid bodies. It is established that the mass matrix M for such a system can be factored as M = (I + HφL) D (I + HφL)^T. This yields an immediate inversion M^{-1} = (I - HψL)^T D^{-1} (I - HψL), where H and φ are given by known link geometric parameters, and L, ψ and D are obtained recursively by a spatial discrete-step Kalman filter and by the corresponding Riccati equation associated with this filter. The factors (I + HφL) and (I - HψL) are lower triangular matrices which are inverses of each other, and D is a diagonal matrix. This factorization and inversion of the mass matrix lead to recursive algorithms for forward dynamics based on spatially recursive filtering and smoothing. The primary motivation for advancing the operator approach is to provide a better means to formulate, analyze and understand spatial recursions in multibody dynamics. This is achieved because the linear operator notation allows manipulation of the equations of motion using a very high-level analytical framework (a spatial operator algebra) that is easy to understand and use. Detailed lower-level recursive algorithms can readily be obtained for inspection from the expressions involving spatial operators. The report consists of two main sections. In Part 1, the problem of serial chain manipulators is analyzed and solved. Extensions to a closed-chain system formed by multiple manipulators moving a common task object are contained in Part 2. To retain ease of exposition in the report, only these two types of multibody systems are considered. However, the same methods can be easily applied to arbitrary multibody systems formed by a collection of joint-connected rigid bodies.

  2. Modular representation of the guideline text: an approach for maintaining and updating the content of medical education.

    PubMed

    Kumar, Anand; Quaglini, Silvana; Stefanelli, Mario; Ciccarese, Paolo; Caffi, Ezio

    2003-06-01

    One of the principal challenges in medical practice is keeping practitioners' knowledge up to date. One of the prime roles of Continuing Medical Education is to train medical practitioners in the latest advances in health care, specialized to their needs. Online courses and classroom teaching with computer-based representations have become an established mode of delivering medical education. This paper deals with the modularized representation of medical text concerning clinical practice guidelines. The proposed system takes into consideration the semantics of the Unified Medical Language System and is based upon marking up and displaying the knowledge using the XML and XSLT languages. This modularization of concepts allows the context of a portion, or of the whole document, to be determined. Thus, after marking up with our system, the text components can be exchanged, modified or reconstructed, which in turn helps keep medical knowledge up to date.

  3. Poster — Thur Eve — 55: An automated XML technique for isocentre verification on the Varian TrueBeam

    SciTech Connect

    Asiev, Krum; Mullins, Joel; DeBlois, François; Liang, Liheng; Syme, Alasdair

    2014-08-15

    Isocentre verification tests, such as the Winston-Lutz (WL) test, have gained popularity in recent years as techniques such as stereotactic radiosurgery/radiotherapy (SRS/SRT) are more commonly performed on radiotherapy linacs. These highly conformal treatments require frequent monitoring of the geometrical accuracy of the isocentre to ensure proper radiation delivery. At our clinic, the WL test is performed by acquiring with the EPID a collection of 8 images of a WL phantom fixed on the couch at various couch/gantry angles. This set of images is later analyzed to determine the isocentre size. The current work addresses the acquisition process. A manual WL test acquisition performed by an experienced physicist takes on average 25 minutes and is prone to user manipulation errors. We have automated this acquisition on a Varian TrueBeam STx linac (Varian, Palo Alto, USA). The Varian Developer Mode allows the execution of custom-made XML script files to control all aspects of linac operation. We have created an XML-WL script that cycles through each couch/gantry combination, taking an EPID image at each position. This automated acquisition is done in less than 4 minutes. The reproducibility of the method was verified by repeating the execution of the XML file 5 times. Analysis of the images showed variation of the isocentre size of less than 0.1 mm along the X, Y and Z axes, which compares favorably to a manual acquisition, for which we typically observe variations up to 0.5 mm.
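
    The actual TrueBeam Developer Mode schema is proprietary and is not reproduced in this record. Purely as an illustration of how such an acquisition sequence might be generated programmatically rather than typed by hand, the sketch below enumerates eight couch/gantry combinations and writes them out with Python's xml.etree.ElementTree; all element names, attribute names and angle values are hypothetical placeholders.

```python
import xml.etree.ElementTree as ET

# Eight couch/gantry combinations of a typical Winston-Lutz acquisition.
# Angles and element names are illustrative placeholders only; they are
# NOT the TrueBeam Developer Mode schema.
combos = [(0, 0), (0, 90), (0, 180), (0, 270),
          (45, 0), (90, 0), (270, 0), (315, 0)]

seq = ET.Element("AcquisitionSequence")
for couch, gantry in combos:
    cp = ET.SubElement(seq, "ControlPoint")
    ET.SubElement(cp, "CouchRtn").text = str(couch)
    ET.SubElement(cp, "GantryRtn").text = str(gantry)
    ET.SubElement(cp, "AcquireMVImage").text = "true"

ET.ElementTree(seq).write("wl_acquisition.xml")
```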

  4. The Open Microscopy Environment (OME) Data Model and XML file: open tools for informatics and quantitative analysis in biological imaging

    PubMed Central

    Goldberg, Ilya G; Allan, Chris; Burel, Jean-Marie; Creager, Doug; Falconi, Andrea; Hochheiser, Harry; Johnston, Josiah; Mellen, Jeff; Sorger, Peter K; Swedlow, Jason R

    2005-01-01

    The Open Microscopy Environment (OME) defines a data model and a software implementation to serve as an informatics framework for imaging in biological microscopy experiments, including representation of acquisition parameters, annotations and image analysis results. OME is designed to support high-content cell-based screening as well as traditional image analysis applications. The OME Data Model, expressed in Extensible Markup Language (XML) and realized in a traditional database, is both extensible and self-describing, allowing it to meet emerging imaging and analysis needs. PMID:15892875

  5. DbMap: improving database interoperability issues in medical software using a simple, Java-Xml based solution.

    PubMed Central

    Karadimas, H.; Hemery, F.; Roland, P.; Lepage, E.

    2000-01-01

    In medical software development, the use of databases plays a central role. However, most databases have heterogeneous encodings and data models. Dealing with these variations directly in the application code is error-prone and reduces the potential reuse of the produced software. Several approaches to overcoming these limitations have been proposed in the medical database literature and are presented here. We present a simple solution based on a Java library and a central metadata description file in XML. This development approach offers several benefits across the software design and development cycle, the main one being simplicity of maintenance. PMID:11079915

  6. Updating: Learning versus Supposing

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel

    2012-01-01

    Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…
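
    Written as an equation (in standard notation, not the authors'), the conditionalization rule being tested is:

```latex
P_{\mathrm{new}}(A) \;=\; P_{\mathrm{old}}(A \mid B)
  \;=\; \frac{P_{\mathrm{old}}(A \cap B)}{P_{\mathrm{old}}(B)},
  \qquad P_{\mathrm{old}}(B) > 0 .
```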

  7. ''Smart Gun'' Technology Update

    SciTech Connect

    WIRSBINSKI, JOHN W.

    2001-11-01

    This report is an update to previous ''smart gun'' work and the corresponding report that were completed in 1996. It incorporates some new terminology and expanded definitions. This effort is the product of an open source look at what has happened to the ''smart gun'' technology landscape since the 1996 report was published.

  8. Updated opal opacities

    SciTech Connect

    Iglesias, C.A.; Rogers, F.J.

    1996-06-01

    The reexamination of astrophysical opacities has eliminated gross discrepancies between a variety of observations and theoretical calculations, thus allowing for more detailed tests of stellar models. A number of such studies indicate that model results are sensitive to modest changes in the opacity. Consequently, it is desirable to update available opacity databases with recent improvements in physics, refinements of element abundances, and other such factors affecting the results. Updated OPAL Rosseland mean opacities are presented. The new results have incorporated improvements in the physics and numerical procedures as well as corrections. The main opacity changes are increases of as much as 20% for Population I stars due to the explicit inclusion of 19 metals (compared to 12 metals in the earlier calculations), with the other modifications introducing opacity changes smaller than 10%. In addition, the temperature and density range covered by the updated opacity tables has been extended. As before, the tables allow accurate interpolation in density and temperature as well as hydrogen, helium, carbon, oxygen, and metal mass fractions. Although a specific metal composition is emphasized, opacity tables for different metal distributions can be made readily available. The updated opacities are compared to other work. © 1996 The American Astronomical Society.

  9. SEI: An update

    NASA Technical Reports Server (NTRS)

    Peach, Lewis L., Jr.

    1991-01-01

    An update on the Space Exploration Initiative (SEI) is given in viewgraph form. Topics covered include the key prerequisites of human exploration, project planning, Mars and lunar explorations, supporting technologies, near-term strategies for SEI, human support elements, and Space Station Freedom SEI accommodations.

  10. Veterinary medicines update.

    PubMed

    2016-07-01

    The following information has been produced for Veterinary Record by the Veterinary Medicines Directorate (VMD) to provide an update for veterinary surgeons on recent changes to marketing authorisations for veterinary medicines in the UK and on other relevant issues. PMID:27365238

  11. Veterinary medicines update.

    PubMed

    2016-06-11

    The following information has been produced for Veterinary Record by the Veterinary Medicines Directorate (VMD) to provide an update for veterinary surgeons on recent changes to marketing authorisations for veterinary medicines in the UK and on other relevant issues. PMID:27288166

  12. Veterinary medicines update.

    PubMed

    2016-08-01

    The following information has been produced for Veterinary Record by the Veterinary Medicines Directorate (VMD) to provide an update for veterinary surgeons on recent changes to marketing authorisations for veterinary medicines in the UK and on other relevant issues. PMID:27493045

  13. Supreme Court Update

    ERIC Educational Resources Information Center

    Taylor, Kelley R.

    2009-01-01

    "Chief Justice Flubs Oath." "Justice Ginsburg Has Cancer Surgery." At the start of this year, those were the news headlines about the U.S. Supreme Court. But January 2009 also brought news about key education cases--one resolved and two others on the docket--of which school administrators should take particular note. The Supreme Court updates on…

  14. Community Update, 2001.

    ERIC Educational Resources Information Center

    Ashby, Nicole, Ed.

    2001-01-01

    This document consists of 10 issues (covering January through December 2000) of the newsletter, "Community Update," which features articles on community and family involvement in education. In addition to the articles, each issue (except the Special Issue) includes a preview of the month's Satellite Town Meeting; events and information discussed…

  15. Updating Older Fume Hoods.

    ERIC Educational Resources Information Center

    Saunders, G. Thomas

    1985-01-01

    Provides information on updating older fume hoods. Areas addressed include: (1) adjustment of the hood's back baffle; (2) hood air leakage; (3) light level; (4) hood location in relation to room traffic and room air; and (5) establishing and maintaining hood performance. (JN)

  16. Veterinary medicines update.

    PubMed

    2016-09-10

    The following information has been produced for Veterinary Record by the Veterinary Medicines Directorate (VMD) to provide an update for veterinary surgeons on recent changes to marketing authorisations for veterinary medicines in the UK and on other relevant issues. PMID:27609956

  17. Veterinary medicines update.

    PubMed

    2016-10-01

    The following information has been produced for Veterinary Record by the Veterinary Medicines Directorate (VMD) to provide an update for veterinary surgeons on recent changes to marketing authorisations for veterinary medicines in the UK and on other relevant issues. PMID:27687269

  18. Update: Biological Nitrogen Fixation.

    ERIC Educational Resources Information Center

    Wiseman, Alan; And Others

    1985-01-01

    Updates knowledge on nitrogen fixation, indicating that investigation of free-living nitrogen-fixing organisms is proving useful in understanding bacterial partners and is expected to lead to development of more effective symbioses. Specific areas considered include biochemistry/genetics, synthesis control, proteins and enzymes, symbiotic systems,…

  19. Technology Update-87

    SciTech Connect

    Not Available

    1987-12-01

    The five papers in this issue of Technology Update reflect improvements in equipment reliability, inspection techniques, data storage techniques, and production technology - all aimed at reducing process variations. Each paper represents an achievement by our technical staff that allows Mound to make more effective use of our resources. A separate abstract has been prepared for one of the papers.

  1. An enhanced security solution for electronic medical records based on AES hybrid technique with SOAP/XML and SHA-1.

    PubMed

    Kiah, M L Mat; Nabi, Mohamed S; Zaidan, B B; Zaidan, A A

    2013-10-01

    This study aims to provide security solutions for implementing electronic medical records (EMRs). E-Health organizations could utilize the proposed method and implement recommended solutions in medical/health systems. Majority of the required security features of EMRs were noted. The methods used were tested against each of these security features. In implementing the system, the combination that satisfied all of the security features of EMRs was selected. Secure implementation and management of EMRs facilitate the safeguarding of the confidentiality, integrity, and availability of e-health organization systems. Health practitioners, patients, and visitors can use the information system facilities safely and with confidence anytime and anywhere. After critically reviewing security and data transmission methods, a new hybrid method was proposed to be implemented on EMR systems. This method will enhance the robustness, security, and integration of EMR systems. The hybrid of simple object access protocol/extensible markup language (XML) with advanced encryption standard and secure hash algorithm version 1 has achieved the security requirements of an EMR system with the capability of integrating with other systems through the design of XML messages. PMID:24037086

  2. [Formula: see text]-regularized recursive total least squares based sparse system identification for the error-in-variables.

    PubMed

    Lim, Jun-Seok; Pang, Hee-Suk

    2016-01-01

    In this paper an [Formula: see text]-regularized recursive total least squares (RTLS) algorithm is considered for sparse system identification. Although recursive least squares (RLS) has been successfully applied to sparse system identification, the estimation performance of RLS-based algorithms degrades when both input and output are contaminated by noise (the error-in-variables problem). We propose an algorithm to handle the error-in-variables problem. The proposed [Formula: see text]-RTLS algorithm is an RLS-like iteration using the [Formula: see text] regularization. The proposed algorithm not only gives excellent performance but also reduces the required complexity through effective handling of the matrix inversion. Simulations demonstrate the superiority of the proposed [Formula: see text]-regularized RTLS in the sparse system identification setting. PMID:27652035
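
    For readers unfamiliar with the baseline being extended, a plain exponentially weighted recursive least squares update (not the paper's regularized total-least-squares variant) looks as follows; the forgetting factor and initialization constant are conventional choices, not values from the paper.

```python
import numpy as np

def rls_identify(X, d, lam=0.99, delta=1e3):
    """Standard exponentially weighted recursive least squares.
    X: (N, p) regressor rows, d: (N,) desired outputs."""
    p = X.shape[1]
    w = np.zeros(p)
    P = delta * np.eye(p)                 # inverse correlation matrix estimate
    for x, y in zip(X, d):
        Px = P @ x
        k = Px / (lam + x @ Px)           # gain vector
        e = y - w @ x                     # a priori error
        w = w + k * e
        P = (P - np.outer(k, Px)) / lam   # rank-one update of P
    return w

# Quick check on a noisy FIR system
rng = np.random.default_rng(1)
w_true = np.array([0.5, -0.2, 0.0, 0.1])
X = rng.standard_normal((500, 4))
d = X @ w_true + 0.01 * rng.standard_normal(500)
print(np.round(rls_identify(X, d), 3))
```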

  3. Patellar segmentation from 3D magnetic resonance images using guided recursive ray-tracing for edge pattern detection

    NASA Astrophysics Data System (ADS)

    Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.

    2016-03-01

    The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search space outlines are incorporated into recursive ray-tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to between an outer and inner search region. A section-based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than the current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) agreement between the auto-segmentation and ground truth surfaces.

  4. Multiple concurrent recursive least squares identification with application to on-line spacecraft mass-property identification

    NASA Technical Reports Server (NTRS)

    Wilson, Edward (Inventor)

    2006-01-01

    The present invention is a method for identifying unknown parameters in a system having a set of governing equations describing its behavior that cannot be put into regression form with the unknown parameters linearly represented. In this method, the vector of unknown parameters is segmented into a plurality of groups where each individual group of unknown parameters may be isolated linearly by manipulation of said equations. Multiple concurrent and independent recursive least squares identifications, one for each said group, are run, treating other unknown parameters appearing in their regression equations as if they were known perfectly, with said values provided by recursive least squares estimation from the other groups. This enables the use of fast, compact, efficient linear algorithms to solve problems that would otherwise require nonlinear solution approaches. The invention is presented with application to the identification of mass and thruster properties for a thruster-controlled spacecraft.

  5. Recursive graphical construction of Feynman diagrams and their multiplicities in φ^4 and φ^2 A theory

    PubMed

    Kleinert; Pelster; Kastening; Bachmann

    2000-08-01

    The free energy of a field theory can be considered as a functional of the free correlation function. As such it obeys a nonlinear functional differential equation that can be turned into a recursion relation. This is solved order by order in the coupling constant to find all connected vacuum diagrams with their proper multiplicities. The procedure is applied to a multicomponent scalar field theory with a φ^4 self-interaction and then to a theory of two scalar fields φ and A with an interaction φ^2 A. All Feynman diagrams with external lines are obtained from functional derivatives of the connected vacuum diagrams with respect to the free correlation function. Finally, the recursive graphical construction is automated by computer algebra with the help of a unique matrix notation for the Feynman diagrams.

  6. A non-recursive Lagrangian solution of the non-causal inverse dynamics of flexible multibody systems - The planar case

    NASA Astrophysics Data System (ADS)

    Ledesma, Ragnar; Bayo, Eduardo

    1993-08-01

    A technique is presented for solving the inverse dynamics of flexible planar multibody systems. This technique yields the non-causal joint efforts (inverse dynamics) as well as the internal states (inverse kinematics) that produce a prescribed nominal trajectory of the end effector. A non-recursive Lagrangian approach is used in formulating the equations of motion as well as in solving the inverse dynamics equations. In contrast to the recursive method previously presented, the proposed method solves the inverse problem in a systematic and direct manner for both open-chain and closed-chain configurations. Numerical simulation shows that the proposed procedure provides excellent tracking of the desired end effector trajectory.

  7. Post-processing for JPEG 2000 image coding using recursive line filtering based on a fuzzy control model

    NASA Astrophysics Data System (ADS)

    Yao, Susu; Rahardja, Susanto; Lin, Xiao; Lim, Keng Pang; Lu, Zhongkang

    2003-06-01

    In this paper, we propose a new method for removing the coding artifacts that appear in JPEG 2000 coded images. The proposed method uses a fuzzy control model to control the weighting function for different image edges according to the pixel gradients and membership functions. A regularized post-processing approach and a recursive line algorithm are described. Experimental results demonstrate that the proposed algorithm can significantly improve image quality in terms of both objective and subjective evaluation.

  8. On the structural limitations of recursive digital filters for base flow estimation

    NASA Astrophysics Data System (ADS)

    Su, Chun-Hsu; Costelloe, Justin F.; Peterson, Tim J.; Western, Andrew W.

    2016-06-01

    Recursive digital filters (RDFs) are widely used for estimating base flow from streamflow hydrographs, and various forms of RDFs have been developed based on different physical models. Numerical experiments have been used to objectively evaluate their performance, but they have not been sufficiently comprehensive to assess a wide range of RDFs. This paper extends these studies to understand the limitations of a generalized RDF method as a pathway for future field calibration. Two formalisms are presented to generalize most existing RDFs, allowing systematic tuning of their complexity. The RDFs with variable complexity are evaluated collectively in a synthetic setting, using modeled daily base flow produced by Li et al. (2014) from a range of synthetic catchments simulated with HydroGeoSphere. Our evaluation reveals that there are optimal RDF complexities in reproducing base flow simulations but shows that there is an inherent physical inconsistency within the RDF construction. Even under the idealized setting where true base flow data are available to calibrate the RDFs, there is persistent disagreement between true and estimated base flow over catchments with small base flow components, low saturated hydraulic conductivity of the soil and larger surface runoff. The simplest explanation is that low base flow "signal" in the streamflow data is hard to distinguish, although more complex RDFs can improve upon the simpler Eckhardt filter at these catchments.
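
    The Eckhardt filter mentioned above is one of the simplest RDFs and is commonly written with two parameters, a recession constant a and a maximum base flow index BFImax. A sketch under that commonly cited formulation is given below; the initialization and parameter values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def eckhardt_baseflow(streamflow, a=0.98, bfi_max=0.80):
    """Recursive digital filter for base flow separation in the commonly
    cited Eckhardt form:
        b_t = ((1 - BFImax) * a * b_{t-1} + (1 - a) * BFImax * y_t)
              / (1 - a * BFImax),
    with base flow capped at total streamflow."""
    y = np.asarray(streamflow, dtype=float)
    b = np.zeros_like(y)
    b[0] = bfi_max * y[0]                     # simple (assumed) initialization
    for t in range(1, len(y)):
        b[t] = ((1 - bfi_max) * a * b[t - 1]
                + (1 - a) * bfi_max * y[t]) / (1 - a * bfi_max)
        b[t] = min(b[t], y[t])                # base flow cannot exceed streamflow
    return b
```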

  9. An empirical test of the weighted effect approach to generalized prediction using recursive neural nets

    SciTech Connect

    Lang, R.

    1996-12-31

    The requirement of a strict and fixed distinction between dependent variables and independent variables, together with the presence of missing data, typically imposes considerable problems for most standard statistical prediction procedures. This paper describes a solution of these problems through the "weighted effect" approach, in which recursive neural nets are used to learn how to compensate for any main and interaction effects attributable to missing data through the use of an "effect set" in addition to the data of actual cases. Extensive simulations of the approach based on an existing psychological database showed high predictive validity, and a graceful degradation in performance with an increase in the number of unknown predictor variables. Moreover, the method proved amenable to the use of two-parameter logistic curves to arrive at a three-way "low," "high," and "undecided" decision scheme with a priori known error rates.

  10. Recursive, in-place algorithm for the hexagonal orthogonal oriented quadrature image pyramid

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1989-01-01

    Pyramid image transforms have proven useful in image coding and pattern recognition. The hexagonal orthogonal oriented quadrature image pyramid (HOP), transforms an image into a set of orthogonal, oriented, odd and even bandpass subimages. It operates on a hexagonal input lattice and employs seven kernels, each of which occupies a neighborhood consisting of a point and a hexagon of six nearest neighbors. The kernels consist of one lowpass and six bandpass kernels that are orthogonal, self-similar, and localized in space, spatial frequency, orientation, and phase. The kernels are first applied to the image samples to create the first level of the pyramid, then to the lowpass coefficients to create the next level. The resulting pyramid is a compact, efficient image code. Here, a recursive, in-place algorithm for computation of the HOP transform is described. The transform may be regarded as a depth-first traversal of a tree structure. It is shown that the algorithm requires a number of operations that is on the order of the number of pixels.
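
    The HOP transform itself operates on a hexagonal lattice with seven oriented kernels, which is not reproduced here. As a generic illustration of the recursive structure (apply local kernels, keep the detail coefficients, then recurse on the lowpass band), the sketch below uses a square-lattice analogue with simple block averaging.

```python
import numpy as np

def recursive_pyramid(image, levels):
    """Generic recursive pyramid: split the image into lowpass and detail
    coefficients, then recurse on the lowpass band.  This is a square-lattice
    block-averaging analogue, not the hexagonal seven-kernel HOP transform."""
    if levels == 0 or min(image.shape) < 2:
        return [image]
    h, w = (image.shape[0] // 2) * 2, (image.shape[1] // 2) * 2
    blocks = image[:h, :w].reshape(h // 2, 2, w // 2, 2).transpose(0, 2, 1, 3)
    low = blocks.mean(axis=(2, 3))                  # lowpass coefficients
    detail = blocks - low[..., None, None]          # detail (bandpass) residual
    return [detail] + recursive_pyramid(low, levels - 1)

bands = recursive_pyramid(np.random.default_rng(4).random((64, 64)), 4)
print([b.shape for b in bands])
```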

  11. Paired MEG data set source localization using recursively applied and projected (RAP) MUSIC.

    PubMed

    Ermer, J J; Mosher, J C; Huang, M; Leahy, R M

    2000-09-01

    An important class of experiments in functional brain mapping involves collecting pairs of data corresponding to separate "Task" and "Control" conditions. The data are then analyzed to determine what activity occurs during the Task experiment but not in the Control. Here we describe a new method for processing paired magnetoencephalographic (MEG) data sets using our recursively applied and projected multiple signal classification (RAP-MUSIC) algorithm. In this method the signal subspace of the Task data is projected against the orthogonal complement of the Control data signal subspace to obtain a subspace which describes spatial activity unique to the Task. A RAP-MUSIC localization search is then performed on this projected data to localize the sources which are active in the Task but not in the Control data. In addition to dipolar sources, effective blocking of more complex sources, e.g., multiple synchronously activated dipoles or synchronously activated distributed source activity, is possible since these topographies are well-described by the Control data signal subspace. Unlike previously published methods, the proposed method is shown to be effective in situations where the time series associated with Control and Task activity possess significant cross correlation. The method also allows for straightforward determination of the estimated time series of the localized target sources. A multiepoch MEG simulation and a phantom experiment are presented to demonstrate the ability of this method to successfully identify sources and their time series in the Task data.

  12. Magnetic Reconnection: Recursive Current Sheet Collapse Triggered by “Ideal” Tearing

    NASA Astrophysics Data System (ADS)

    Tenerani, Anna; Velli, Marco; Rappazzo, Antonio Franco; Pucci, Fulvia

    2015-11-01

    We study, by means of MHD simulations, the onset and evolution of fast reconnection via the “ideal” tearing mode within a collapsing current sheet at high Lundquist numbers (S ≫ 10^4). We first confirm that as the collapse proceeds, fast reconnection is triggered well before a Sweet–Parker-type configuration can form: during the linear stage, plasmoids rapidly grow in a few Alfvén times when the predicted “ideal” tearing threshold S^{-1/3} is approached from above; after the linear phase of the initial instability, X-points collapse and reform nonlinearly. We show that these give rise to a hierarchy of tearing events repeating faster and faster on current sheets at ever smaller scales, corresponding to the triggering of “ideal” tearing at the renormalized Lundquist number. In resistive MHD, this process should end with the formation of sub-critical (S ≤ 10^4) Sweet–Parker sheets at microscopic scales. We present a simple model describing the nonlinear recursive evolution that explains the timescale of the disruption of the initial sheet.

  13. Orthogonal recursive bisection as data decomposition strategy for massively parallel cardiac simulations.

    PubMed

    Reumann, Matthias; Fitch, Blake G; Rayshubskiy, Aleksandr; Pitman, Michael C; Rice, John J

    2011-06-01

    We present the orthogonal recursive bisection algorithm that hierarchically segments the anatomical model structure into subvolumes that are distributed to cores. The anatomy is derived from the Visible Human Project, with electrophysiology based on the FitzHugh-Nagumo (FHN) and ten Tusscher (TT04) models with monodomain diffusion. Benchmark simulations with up to 16,384 and 32,768 cores on IBM Blue Gene/P and L supercomputers for both FHN and TT04 results show good load balancing with almost perfect speedup factors that are close to linear with the number of cores. Hence, strong scaling is demonstrated. With 32,768 cores, a 1000 ms simulation of full heart beat requires about 6.5 min of wall clock time for a simulation of the FHN model. For the largest machine partitions, the simulations execute at a rate of 0.548 s (BG/P) and 0.394 s (BG/L) of wall clock time per 1 ms of simulation time. To our knowledge, these simulations show strong scaling to substantially higher numbers of cores than reported previously for organ-level simulation of the heart, thus significantly reducing run times. The ability to reduce runtimes could play a critical role in enabling wider use of cardiac models in research and clinical applications. PMID:21657987
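
    A minimal sketch of orthogonal recursive bisection on a cloud of element coordinates (splitting at the median of the longest bounding-box axis until the requested number of subdomains, assumed to be a power of two, is reached) is given below; it illustrates the decomposition idea only and is not the paper's production implementation.

```python
import numpy as np

def orb_partition(points, n_parts):
    """Orthogonal recursive bisection: recursively split the point set at the
    median of its longest bounding-box axis until `n_parts` (assumed to be a
    power of two) balanced subdomains remain."""
    if n_parts == 1:
        return [points]
    axis = int(np.argmax(points.max(axis=0) - points.min(axis=0)))
    order = np.argsort(points[:, axis])
    half = len(points) // 2
    left, right = points[order[:half]], points[order[half:]]
    return orb_partition(left, n_parts // 2) + orb_partition(right, n_parts // 2)

# Example: 10,000 random "voxel" coordinates split across 8 subdomains
parts = orb_partition(np.random.default_rng(2).random((10000, 3)), 8)
print([len(p) for p in parts])
```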

  14. A landscape-based cluster analysis using recursive search instead of a threshold parameter.

    PubMed

    Gladwin, Thomas E; Vink, Matthijs; Mars, Roger B

    2016-01-01

    Cluster-based analysis methods in neuroimaging provide control of whole-brain false positive rates without the need to conservatively correct for the number of voxels and the associated false negative results. The current method defines clusters based purely on shapes in the landscape of activation, instead of requiring the choice of a statistical threshold that may strongly affect results. Statistical significance is determined using permutation testing, combining both size and height of activation. A method is proposed for dealing with relatively small local peaks. Simulations confirm the method controls the false positive rate and correctly identifies regions of activation. The method is also illustrated using real data. Highlights: (1) a landscape-based method to define clusters in neuroimaging data avoids the need to pre-specify a threshold to define clusters; (2) the implementation of the method works as expected, based on simulated and real data; (3) the recursive method used for defining clusters, the method used for combining clusters, and the definition of the "value" of a cluster may be of interest for future variations.

  15. Recursive Factorization of the Inverse Overlap Matrix in Linear-Scaling Quantum Molecular Dynamics Simulations.

    PubMed

    Negre, Christian F A; Mniszewski, Susan M; Cawkwell, Marc J; Bock, Nicolas; Wall, Michael E; Niklasson, Anders M N

    2016-07-12

    We present a reduced complexity algorithm to compute the inverse overlap factors required to solve the generalized eigenvalue problem in a quantum-based molecular dynamics (MD) simulation. Our method is based on the recursive, iterative refinement of an initial guess of Z (inverse square root of the overlap matrix S). The initial guess of Z is obtained beforehand by using either an approximate divide-and-conquer technique or dynamical methods, propagated within an extended Lagrangian dynamics from previous MD time steps. With this formulation, we achieve long-term stability and energy conservation even under the incomplete, approximate, iterative refinement of Z. Linear-scaling performance is obtained using numerically thresholded sparse matrix algebra based on the ELLPACK-R sparse matrix data format, which also enables efficient shared-memory parallelization. As we show in this article using self-consistent density-functional-based tight-binding MD, our approach is faster than conventional methods based on the diagonalization of overlap matrix S for systems as small as a few hundred atoms, substantially accelerating quantum-based simulations even for molecular structures of intermediate size. For a 4158-atom water-solvated polyalanine system, we find an average speedup factor of 122 for the computation of Z in each MD step. PMID:27267207
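
    The paper's exact recursion is not given in this record. As an illustration of iteratively refining a guess of Z = S^{-1/2}, the sketch below uses a standard Newton-Schulz-type iteration in dense NumPy; the real implementation relies on thresholded sparse matrix algebra, which is not reproduced here.

```python
import numpy as np

def refine_inv_sqrt(S, Z0, n_iter=10):
    """Newton-Schulz-type refinement of Z ~ S^(-1/2):
        Z_{k+1} = 0.5 * Z_k @ (3 I - Z_k S Z_k).
    Converges quadratically when the initial guess is close enough
    (roughly ||I - Z0 S Z0|| < 1)."""
    I = np.eye(S.shape[0])
    Z = Z0.copy()
    for _ in range(n_iter):
        Z = 0.5 * Z @ (3 * I - Z @ S @ Z)
    return Z

# Toy "overlap" matrix: symmetric, positive definite, close to identity.
rng = np.random.default_rng(3)
A = 0.05 * rng.standard_normal((50, 50))
S = np.eye(50) + 0.5 * (A + A.T)
Z = refine_inv_sqrt(S, np.eye(50))        # identity is an adequate first guess here
print(np.linalg.norm(Z @ S @ Z - np.eye(50)))
```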

  16. ECG compression using non-recursive wavelet transform with quality control

    NASA Astrophysics Data System (ADS)

    Liu, Je-Hung; Hung, King-Chu; Wu, Tsung-Ching

    2016-09-01

    While wavelet-based electrocardiogram (ECG) data compression using scalar quantisation (SQ) yields excellent compression performance, a wavelet SQ scheme must select a set of multilevel quantisers for each quantisation process. Because of the properties of the multiple-to-one mapping, this scheme is not conducive to reconstruction error control. To address this problem, this paper presents a single-variable-control SQ scheme able to guarantee the reconstruction quality of wavelet-based ECG data compression. Based on the reversible round-off non-recursive discrete periodised wavelet transform (RRO-NRDPWT), the SQ scheme is derived with a three-stage design process: the first stage uses a genetic algorithm (GA) for high compression ratio (CR), the second uses quadratic curve fitting for linear distortion control, and the third uses fuzzy decision-making to minimise the data dependency effect and select the optimal SQ. Two databases, the Physikalisch-Technische Bundesanstalt (PTB) and the Massachusetts Institute of Technology (MIT) arrhythmia databases, are used to evaluate quality control performance. Experimental results show that the design method guarantees a high-compression-performance SQ scheme with statistically linear distortion. This property can be independent of training data and can facilitate rapid error control.

  17. Literacity: A multimedia adult literacy package combining NASA technology, recursive ID theory, and authentic instruction theory

    NASA Technical Reports Server (NTRS)

    Willis, Jerry; Willis, Dee Anna; Walsh, Clare; Stephens, Elizabeth; Murphy, Timothy; Price, Jerry; Stevens, William; Jackson, Kevin; Villareal, James A.; Way, Bob

    1994-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application under development is LiteraCity, a simulation-based instructional package for adults who do not have functional reading skills. Using fuzzy logic routines and other technologies developed by NASA's Information Systems Directorate, together with hypermedia sound, graphics, and animation technologies, the project attempts to overcome the limited impact of adult literacy assessment and instruction by involving the adult in an interactive simulation of real-life literacy activities. The project uses a recursive instructional development model and authentic instruction theory. This paper describes one component of a project to design, develop, and produce a series of computer-based, multimedia instructional packages. The packages are being developed for use in adult literacy programs, particularly in correctional education centers. They use the concepts of authentic instruction and authentic assessment to guide development. All the packages to be developed are instructional simulations. The first is a simulation of 'finding a friend a job.'

  18. Prognostic factors for survival of patients with glioblastoma: Recursive partitioning analysis

    PubMed Central

    Lamborn, Kathleen R.; Chang, Susan M.; Prados, Michael D.

    2004-01-01

    Survival for patients with glioblastoma multiforme is short, and current treatments provide limited benefit. Therefore, there is interest in conducting phase 2 trials of experimental treatments in newly diagnosed patients. However, this requires historical data with which to compare the experimental therapies. Knowledge of prognostic markers would also allow stratification into risk groups for phase 3 randomized trials. In this retrospective study of 832 glioblastoma multiforme patients enrolled into prospective clinical trials at the time of initial diagnosis, we evaluated several potential prognostic markers for survival to establish risk groups. Analyses were done using both Cox proportional hazards modeling and recursive partitioning analyses. Initially, patients from 8 clinical trials, 6 of which included adjuvant chemotherapy, were included. Subsequent analyses excluded trials with interstitial brachytherapy, and finally included only nonbrachytherapy trials with planned adjuvant chemotherapy. The initial analysis defined 4 risk groups. The 2 lower risk groups included patients under the age of 40, the lowest risk group being young patients with tumor in the frontal lobe only. An intermediate-risk group included patients with Karnofsky performance status (KPS) >70, subtotal or total resection, and age between 40 and 65. The highest risk group included all patients over 65 and patients between 40 and 65 with either KPS < 80 or biopsy only. Subgroup analyses indicated that inclusion of adjuvant chemotherapy provides an increase in survival, although that improvement tends to be minimal for patients over age 65, for patients over age 40 with KPS less than 80, and for those treated with brachytherapy. PMID:15279715
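
    Purely as a reading aid, the snippet below encodes the four risk groups reported in the initial analysis as a small rule function. Variable names and the handling of the age and KPS cut-points are assumptions, and this is not the recursive partitioning procedure itself, which is learned from the survival data rather than hand-coded.

```python
# Hypothetical encoding of the four risk groups described in the abstract.

def rpa_risk_group(age, kps, resection, frontal_only):
    """resection: 'biopsy', 'subtotal', or 'total'; kps: Karnofsky score."""
    if age < 40:
        return 1 if frontal_only else 2          # lowest risk: young, frontal lobe only
    if age <= 65 and kps >= 80 and resection in ("subtotal", "total"):
        return 3                                 # intermediate risk
    return 4                                     # over 65, or 40-65 with KPS < 80 / biopsy only

print(rpa_risk_group(age=35, kps=90, resection="total", frontal_only=True))   # 1
print(rpa_risk_group(age=70, kps=90, resection="total", frontal_only=False))  # 4
```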

  19. Prognostic factors for survival of patients with glioblastoma: recursive partitioning analysis.

    PubMed

    Lamborn, Kathleen R; Chang, Susan M; Prados, Michael D

    2004-07-01

    Survival for patients with glioblastoma multiforme is short, and current treatments provide limited benefit. Therefore, there is interest in conducting phase 2 trials of experimental treatments in newly diagnosed patients. However, this requires historical data with which to compare the experimental therapies. Knowledge of prognostic markers would also allow stratification into risk groups for phase 3 randomized trials. In this retrospective study of 832 glioblastoma multiforme patients enrolled into prospective clinical trials at the time of initial diagnosis, we evaluated several potential prognostic markers for survival to establish risk groups. Analyses were done using both Cox proportional hazards modeling and recursive partitioning analyses. Initially, patients from 8 clinical trials, 6 of which included adjuvant chemotherapy, were included. Subsequent analyses excluded trials with interstitial brachytherapy, and finally included only nonbrachytherapy trials with planned adjuvant chemotherapy. The initial analysis defined 4 risk groups. The 2 lower risk groups included patients under the age of 40, the lowest risk group being young patients with tumor in the frontal lobe only. An intermediate-risk group included patients with Karnofsky performance status (KPS) >70, subtotal or total resection, and age between 40 and 65. The highest risk group included all patients over 65 and patients between 40 and 65 with either KPS<80 or biopsy only. Subgroup analyses indicated that inclusion of adjuvant chemotherapy provides an increase in survival, although that improvement tends to be minimal for patients over age 65, for patients over age 40 with KPS less than 80, and for those treated with brachytherapy.

  20. Recursive Bayesian filtering framework for lithium-ion cell state estimation

    NASA Astrophysics Data System (ADS)

    Tagade, Piyush; Hariharan, Krishnan S.; Gambhire, Priya; Kolake, Subramanya Mayya; Song, Taewon; Oh, Dukjin; Yeo, Taejung; Doo, Seokgwang

    2016-02-01

    A robust battery management system is critical for safe and reliable electric vehicle operation. One of the most important functions of the battery management system is to accurately estimate the battery state using minimal on-board instrumentation. This paper presents a recursive Bayesian filtering framework for on-board battery state estimation that assimilates measurables such as cell voltage, current, and temperature with predictions from a physics-based reduced order model (ROM). The paper proposes an improved particle filtering algorithm for implementing the framework and compares its performance against the unscented Kalman filter. Functionality of the proposed framework is demonstrated for state estimation of a commercial NCA/C cell at different operating conditions, including constant-current discharge at room and low temperatures, hybrid pulse power characterization (HPPC), and urban dynamometer driving schedule (UDDS) protocols. In addition to accurate voltage prediction, the electrochemical nature of the ROM enables physical insights into the cell behavior. Advantages of using electrode concentrations over conventional Coulomb counting for accessible capacity estimation are discussed. In addition to the mean state estimate, the framework provides the associated confidence bounds, which are used to establish its predictive capability.
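
    The following is a hypothetical, generic bootstrap particle filter for a single battery state (state of charge), included only to illustrate the recursive Bayesian predict-update-resample loop and the confidence bounds mentioned above. The toy open-circuit-voltage observation model stands in for the paper's reduced order electrochemical model, this is not the paper's improved particle filtering algorithm, and all parameters and noise levels are assumptions.

```python
import numpy as np

# Hypothetical bootstrap particle filter for state-of-charge (SoC) estimation.

def ocv(soc):
    return 3.0 + 1.2 * soc                      # toy linear OCV curve [V]

def particle_filter(currents, voltages, dt, capacity_As,
                    n_particles=2000, q_std=1e-3, r_std=0.01, seed=0):
    rng = np.random.default_rng(seed)
    particles = rng.uniform(0.0, 1.0, n_particles)       # SoC particles
    means, lowers, uppers = [], [], []
    for i_k, v_k in zip(currents, voltages):
        # Predict: Coulomb-counting dynamics plus process noise.
        particles = particles - i_k * dt / capacity_As \
                    + rng.normal(0.0, q_std, n_particles)
        particles = np.clip(particles, 0.0, 1.0)
        # Update: weight particles by the likelihood of the measured voltage.
        w = np.exp(-0.5 * ((v_k - ocv(particles)) / r_std) ** 2)
        w /= w.sum()
        # Resample and record the mean estimate and 95% confidence bounds.
        particles = rng.choice(particles, size=n_particles, p=w)
        means.append(particles.mean())
        lowers.append(np.percentile(particles, 2.5))
        uppers.append(np.percentile(particles, 97.5))
    return np.array(means), np.array(lowers), np.array(uppers)

# Synthetic constant-current discharge of a 2 Ah cell at 1 A, 1 s steps.
dt, capacity = 1.0, 2.0 * 3600.0
true_soc = 0.9 - np.arange(600) * dt / capacity
rng = np.random.default_rng(1)
currents = np.full(600, 1.0)
voltages = ocv(true_soc) + rng.normal(0.0, 0.01, 600)
mean, lo, hi = particle_filter(currents, voltages, dt, capacity)
print(f"final SoC estimate: {mean[-1]:.3f} (true {true_soc[-1]:.3f}), "
      f"95% bounds [{lo[-1]:.3f}, {hi[-1]:.3f}]")
```

    Swapping the toy observation model for a ROM voltage prediction, and the scalar state for the ROM's state vector, recovers the general structure of such a framework; an unscented Kalman filter differs mainly in how the predict and update steps propagate the state distribution.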