Sample records for hyper text markup

  1. A Leaner, Meaner Markup Language.

    ERIC Educational Resources Information Center

    Online & CD-ROM Review, 1997

    1997-01-01

    In 1996 a working group of the World Wide Web Consortium developed and released a simpler form of markup language, Extensible Markup Language (XML), combining the flexibility of Standard Generalized Markup Language (SGML) and the Web suitability of HyperText Markup Language (HTML). Reviews SGML and discusses XML's suitability for journal…

  2. XML: A Language To Manage the World Wide Web. ERIC Digest.

    ERIC Educational Resources Information Center

    Davis-Tanous, Jennifer R.

    This digest provides an overview of XML (Extensible Markup Language), a markup language used to construct World Wide Web pages. Topics addressed include: (1) definition of a markup language, including comparison of XML with SGML (Standard Generalized Markup Language) and HTML (HyperText Markup Language); (2) how XML works, including sample tags,…

  3. Analysis of the Effect of Environmental Conditions in Conducting Amphibious Assaults Using a Ship Simulator/Vessel-Response Model Proof-of-Concept Study

    DTIC Science & Technology

    2017-05-01

    Center ESRI Environmental Systems Research Institute GIS Geographic Information System HTML Hyper-Text Markup Language LCAC Landing Craft Air... loop.” The ship simulator bridge is generic in that its layout is similar to that found in a variety of ships. As shown in Figures 17 and 18, the... information stored in the geodatabases. The Hyper-Text Markup Language (HTML) capability built into ArcMap permits a planner to click on a vessel track and…

  4. XML: A Publisher's Perspective.

    ERIC Educational Resources Information Center

    Andrews, Timothy M.

    1999-01-01

    Explains eXtensible Markup Language (XML) and describes how Dow Jones Interactive is using it to improve the news-gathering and dissemination process through intranets and the World Wide Web. Discusses benefits of using XML, the relationship to HyperText Markup Language (HTML), lack of available software tools and industry support, and future…

  5. HyperText MARCup: A Conceptualization for Encoding, De-Constructing, Searching, Retrieving, and Using Traditional Knowledge Tools.

    ERIC Educational Resources Information Center

    Wall, C. Edward; And Others

    1995-01-01

    Discusses the integration of Standard Generalized Markup Language, Hypertext Markup Language, and MARC format to parse classified analytical bibliographies. Use of the resulting electronic knowledge constructs in local library systems as maps of a specified subset of resources is discussed, and an example is included. (LRW)

  6. Data Archival and Retrieval Enhancement (DARE) Metadata Modeling and Its User Interface

    NASA Technical Reports Server (NTRS)

    Hyon, Jason J.; Borgen, Rosana B.

    1996-01-01

    The Defense Nuclear Agency (DNA) has acquired terabytes of valuable data which need to be archived and effectively distributed to the entire nuclear weapons effects community and others... This paper describes the DARE (Data Archival and Retrieval Enhancement) metadata model and explains how it is used as a source for generating HyperText Markup Language (HTML) or Standard Generalized Markup Language (SGML) documents for access through web browsers such as Netscape.

  7. How to use the WWW to distribute STI

    NASA Technical Reports Server (NTRS)

    Roper, Donna G.

    1994-01-01

    This presentation explains how to use the World Wide Web (WWW) to distribute scientific and technical information as hypermedia. WWW clients and servers use the HyperText Transfer Protocol (HTTP) to transfer documents containing links to other text, graphics, video, and sound. The standard language for these documents is the HyperText Markup Language (HTML). These are simply text files with formatting codes that contain layout information and hyperlinks. HTML documents can be created with any text editor or with one of the publicly available HTML editors or converters. HTML can also include links to available image formats. This presentation is available online. The URL is http://sti.larc.nasa.gov/demos/workshop/introtext.html.
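
    The presentation's point that an HTML document is just a text file of formatting codes and hyperlinks can be illustrated with a short sketch. The page content below is invented (only the file name echoes the URL above), and Python is used here purely to write the file; any text editor or HTML converter would produce the same result.

    ```python
    # Minimal sketch: an HTML document is an ordinary text file with markup codes.
    # The page content and link targets are illustrative, not from the presentation.
    from pathlib import Path

    page = """<!DOCTYPE html>
    <html>
      <head><title>Distributing STI on the WWW</title></head>
      <body>
        <h1>Scientific and Technical Information</h1>
        <p>Reports are linked as ordinary hypertext anchors:</p>
        <a href="report1.html">Report 1</a>
        <img src="figure1.gif" alt="Sample figure">
      </body>
    </html>
    """

    # A script merely automates what any text editor could do by hand.
    Path("introtext.html").write_text(page, encoding="utf-8")
    print("Wrote", Path("introtext.html").resolve())
    ```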

  8. Home Page, Sweet Home Page: Creating a Web Presence.

    ERIC Educational Resources Information Center

    Falcigno, Kathleen; Green, Tim

    1995-01-01

    Focuses primarily on design issues and practical concerns involved in creating World Wide Web documents for use within an organization. Concerns for those developing Web home pages are: learning HyperText Markup Language (HTML); defining customer group; allocating staff resources for maintenance of documents; providing feedback mechanism for…

  9. The World-Wide Web and Mosaic: An Overview for Librarians.

    ERIC Educational Resources Information Center

    Morgan, Eric Lease

    1994-01-01

    Provides an overview of the Internet's World-Wide Web (Web), a hypertext system. Highlights include the client/server model; Uniform Resource Locator; examples of software; Web servers versus Gopher servers; HyperText Markup Language (HTML); converting files; Common Gateway Interface; organizing Web information; and the role of librarians in…

  10. Development and Evaluation of a Thai Learning System on the Web Using Natural Language Processing.

    ERIC Educational Resources Information Center

    Dansuwan, Suyada; Nishina, Kikuko; Akahori, Kanji; Shimizu, Yasutaka

    2001-01-01

    Describes the Thai Learning System, which is designed to help learners acquire the Thai word order system. The system facilitates the lessons on the Web using HyperText Markup Language and Perl programming, which interfaces with natural language processing by means of Prolog. (Author/VWL)

  11. So You Wanna Be a Web Author?

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1996-01-01

    Defines HyperText Markup Language (HTML) as it relates to the World Wide Web (WWW). Describes steps needed to create HTML files on a UNIX system and to make them accessible via the WWW. Presents a list of basic HTML formatting codes and explains the coding used in the author's personal HTML file. (JMV)

  12. Online Survey, Enrollment, and Examination: Special Internet Applications in Teacher Education.

    ERIC Educational Resources Information Center

    Tu, Jho-Ju; Babione, Carolyn; Chen, Hsin-Chu

    The Teachers College at Emporia State University in Kansas is now utilizing World Wide Web technology for automating the application procedure for student teaching. The general concepts and some of the key terms that are important for understanding the process involved in this project include: a client-server model, HyperText Markup Language,…

  13. The place of SGML and HTML in building electronic patient records.

    PubMed

    Pitty, D; Gordon, C; Reeves, P; Capey, A; Vieyra, P; Rickards, T

    1997-01-01

    The authors are concerned that, although popular, SGML (Standard Generalized Markup Language) is only one approach to capturing, storing, viewing and exchanging healthcare information and does not provide a suitable paradigm for solving most of the problems associated with paper-based patient record systems. Although a discussion of the relative merits of SGML and HTML (HyperText Markup Language) may be interesting, we feel such a discussion avoids the real issues associated with the most appropriate way to model, represent, and store electronic patient information in order to solve healthcare problems, and therefore the medical informatics community should first concern itself with these issues. The paper substantiates this viewpoint and concludes with some suggestions for how progress can be made.

  14. Automating Information Assurance for Cyber Situational Awareness within a Smart Cloud System of Systems

    DTIC Science & Technology

    2014-03-01

    Humanitarian Assistance and Disaster Relief HTML HyperText Markup Language IA Information Assurance IAI Israel Aerospace Industries IASA Information... decision maker at the Command and Control “mini cloud” was of utmost interest. This discussion not only confirmed the need to have information... (2) monitoring for specific cyber attacks on a specified system, (3) alerting information of interest to an operator, and finally (4) allowing the…

  15. The Air Force Records Management Program: A Paradigm Shift from Compliance to Guiding Principles in an Ever-Changing Information Environment

    DTIC Science & Technology

    2014-06-22

    GIG Global Information Grid GOTS Government Off-the-Shelf HTML Hyper Text Markup Language ICT Information and Communication Technology IEC... maintenance, retrieval, and preservation of vital information created in public and private organizations in all sectors of the economy. It is also the... constructed in the 1940s, as part of a government effort to provide employment during the Depression, and boost the economy. This road is set in…

  16. An interactive HTML ocean nowcast GUI based on Perl and JavaScript

    NASA Astrophysics Data System (ADS)

    Sakalaukus, Peter J.; Fox, Daniel N.; Louise Perkins, A.; Smedstad, Lucy F.

    1999-02-01

    We describe the use of Hyper Text Markup Language (HTML), JavaScript code, and Perl I/O to create and validate forms in an Internet-based graphical user interface (GUI) for the Naval Research Laboratory (NRL) Ocean models and Assimilation Demonstration System (NOMADS). The resulting nowcast system can be operated from any compatible browser across the Internet, for although the GUI was prepared in a Netscape browser, it used no Netscape extensions. Code available at: http://www.iamg.org/CGEditor/index.htm
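
    The abstract describes Perl code that creates and validates HTML forms for the nowcast GUI; that implementation is not reproduced here. As a hedged, language-neutral sketch of the validation step only (the field names and bounds are invented), the same idea in Python looks like this:

    ```python
    # Hedged sketch of server-side form validation, loosely analogous to the
    # Perl I/O described in the abstract. Field names and limits are invented.
    def validate_nowcast_request(form: dict) -> list[str]:
        """Return a list of error messages; an empty list means the form is valid."""
        errors = []
        try:
            lat = float(form.get("latitude", ""))
            if not -90.0 <= lat <= 90.0:
                errors.append("latitude must be between -90 and 90")
        except ValueError:
            errors.append("latitude must be a number")
        try:
            lon = float(form.get("longitude", ""))
            if not -180.0 <= lon <= 180.0:
                errors.append("longitude must be between -180 and 180")
        except ValueError:
            errors.append("longitude must be a number")
        if form.get("model") not in {"global", "regional"}:
            errors.append("model must be 'global' or 'regional'")
        return errors

    if __name__ == "__main__":
        print(validate_nowcast_request({"latitude": "32.7", "longitude": "-117.2", "model": "regional"}))
        print(validate_nowcast_request({"latitude": "999", "model": "tidal"}))
    ```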

  17. ADASS Web Database XML Project

    NASA Astrophysics Data System (ADS)

    Barg, M. I.; Stobie, E. B.; Ferro, A. J.; O'Neil, E. J.

    In the spring of 2000, at the request of the ADASS Program Organizing Committee (POC), we began organizing information from previous ADASS conferences in an effort to create a centralized database. The beginnings of this database originated from data (invited speakers, participants, papers, etc.) extracted from HyperText Markup Language (HTML) documents from past ADASS host sites. Unfortunately, not all HTML documents are well formed and parsing them proved to be an iterative process. It was evident at the beginning that if these Web documents were organized in a standardized way, such as XML (Extensible Markup Language), the processing of this information across the Web could be automated, more efficient, and less error prone. This paper will briefly review the many programming tools available for processing XML, including Java, Perl and Python, and will explore the mapping of relational data from our MySQL database to XML.
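
    A minimal sketch of the relational-to-XML mapping the paper explores, using Python's standard library with SQLite standing in for the MySQL database; the table layout and element names are assumptions for illustration, not the ADASS schema.

    ```python
    # Sketch: map rows from a relational table to well-formed XML.
    # A SQLite table stands in for the MySQL database; names are illustrative.
    import sqlite3
    import xml.etree.ElementTree as ET

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE papers (id INTEGER, author TEXT, title TEXT)")
    conn.executemany("INSERT INTO papers VALUES (?, ?, ?)",
                     [(1, "A. Author", "Pipelines"), (2, "B. Author", "Archives")])

    root = ET.Element("conference", attrib={"name": "ADASS"})
    for paper_id, author, title in conn.execute("SELECT id, author, title FROM papers"):
        paper = ET.SubElement(root, "paper", attrib={"id": str(paper_id)})
        ET.SubElement(paper, "author").text = author
        ET.SubElement(paper, "title").text = title

    print(ET.tostring(root, encoding="unicode"))
    ```

    Because the XML is generated from structured records rather than scraped from hand-written HTML, downstream processing avoids the iterative parsing problems described above.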

  18. Records and history of the United States Geological Survey

    USGS Publications Warehouse

    Nelson, Clifford M.

    2000-01-01

    This publication contains two presentations in Portable Document Format (PDF). The first is Renee M. Jaussaud's inventory of the documents accessioned by the end of 1997 into Record Group 57 (Geological Survey) at the National Archives and Records Administration's (NARA) Archives II facility in College Park, Md., but not the materials in NARA's regional archives. The second is Mary C. Rabbitt's 'The United States Geological Survey 1879-1989,' which appeared in 1989 as USGS Circular 1050. USGS Circular 1050 is also presented in Hyper Text Markup Language (HTML) format.

  19. Home Page: The Mode of Transport through the Information Superhighway

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1995-01-01

    The purpose of the project with the Aeroacoustics Branch was to create and submit a home page for the Internet containing branch information. In order to do this, one must also become familiar with the way that the Internet operates. Learning HyperText Markup Language (HTML) and gaining the ability to create a document in that language was the final objective needed to place a home page on the Internet (World Wide Web). A manual of instructions on maintaining the home page and keeping it up to date was also necessary to give branch members the opportunity to make any pertinent changes.

  20. STS Case Study Development Support

    NASA Technical Reports Server (NTRS)

    Rosa de Jesus, Dan A.; Johnson, Grace K.

    2013-01-01

    The Shuttle Case Study Collection (SCSC) has been developed using lessons learned documented by NASA engineers, analysts, and contractors. The SCSC provides educators with a new tool to teach real-world engineering processes with the goal of providing unique educational materials that enhance critical thinking, decision-making and problem-solving skills. During this third phase of the project, responsibilities included: the revision of the Hyper Text Markup Language (HTML) source code to ensure all pages follow World Wide Web Consortium (W3C) standards, and the addition and editing of website content, including text, documents, and images. Basic HTML knowledge was required, as was basic knowledge of photo editing software, and training to learn how to use NASA's Content Management System for website design. The outcome of this project was its release to the public.

  1. Web-Based Collaborative Publications System: R&Tserve

    NASA Technical Reports Server (NTRS)

    Abrams, Steve

    1997-01-01

    R&Tserve is a publications system based on 'commercial, off-the-shelf' (COTS) software that provides a persistent, collaborative workspace for authors and editors to support the entire publication development process from initial submission, through iterative editing in a hierarchical approval structure, and on to 'publication' on the WWW. It requires no specific knowledge of the WWW (beyond basic use) or HyperText Markup Language (HTML). Graphics and URLs are automatically supported. The system includes a transaction archive, a comments utility, help functionality, automated graphics conversion, automated table generation, and an email-based notification system. It may be configured and administered via the WWW and can support publications ranging from single page documents to multiple-volume 'tomes'.

  2. SGML-Based Markup for Literary Texts: Two Problems and Some Solutions.

    ERIC Educational Resources Information Center

    Barnard, David; And Others

    1988-01-01

    Identifies the Standard Generalized Markup Language (SGML) as the best basis for a markup standard for encoding literary texts. Outlines solutions to problems using SGML and discusses the problem of maintaining multiple views of a document. Examines several ways of reducing the burden of markups. (GEA)

  3. Test Generator for MATLAB Simulations

    NASA Technical Reports Server (NTRS)

    Henry, Joel

    2011-01-01

    MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, E.T.; Truett, D.F.; Truett, L.F.

    In the handful of years since the World Wide Web (WWW or Web) came into being, Web sites have developed at an astonishing rate. With the influx of Web pages comes a disparity of site types, including personal homepages, commercial sales sites, and educational data. The variety of sites and the deluge of information contained on the Web exemplify the individual nature of the WWW. Whereas some people argue that it is this eclecticism which gives the Web its charm, we propose that sites which are repositories of technical data would benefit from standardization. This paper proffers a methodology for publishing ecological research on the Web. The template we describe uses capabilities of HTML (the HyperText Markup Language) to enhance the value of the traditional scientific paper.

  5. TOPS On-Line: Automating the Construction and Maintenance of HTML Pages

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.

    1994-01-01

    After the Technology Opportunities Showcase (TOPS), in October, 1993, Langley Research Center's (LaRC) Information Systems Division (ISD) accepted the challenge to preserve the investment in information assembled in the TOPS exhibits by establishing a data base. Following the lead of several people at LaRC and others around the world, the HyperText Transfer Protocol (HTTP) server and Mosaic were the obvious tools of choice for implementation. Initially, some TOPS exhibitors began the conventional approach of constructing HyperText Markup Language (HTML) pages of their exhibits as input to Mosaic. Considering the number of pages to construct, a better approach was conceived that would automate the construction of pages. This approach allowed completion of the data base construction in a shorter period of time using fewer resources than would have been possible with the conventional approach. It also provided flexibility for the maintenance and enhancement of the data base. Since that time, this approach has been used to automate construction of other HTML data bases. Through these experiences, it is concluded that the most effective use of the HTTP/Mosaic technology will require better tools and techniques for creating, maintaining and managing the HTML pages. The development and use of these tools and techniques are the subject of this document.
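
    The document's central idea, generating many similar HTML pages from stored records instead of hand-writing them, can be sketched briefly. The exhibit records and page template below are invented for illustration and are not the TOPS data or layout.

    ```python
    # Sketch of automated HTML page construction from records.
    # Exhibit records and the template are invented for illustration.
    from pathlib import Path
    from string import Template

    TEMPLATE = Template("""<html>
      <head><title>$title</title></head>
      <body><h1>$title</h1><p>$summary</p></body>
    </html>""")

    exhibits = [
        {"slug": "composites", "title": "Composite Structures", "summary": "Lightweight materials exhibit."},
        {"slug": "aeroacoustics", "title": "Aeroacoustics", "summary": "Noise-reduction research exhibit."},
    ]

    out_dir = Path("tops_pages")
    out_dir.mkdir(exist_ok=True)
    for record in exhibits:
        page = TEMPLATE.substitute(title=record["title"], summary=record["summary"])
        (out_dir / f"{record['slug']}.html").write_text(page, encoding="utf-8")
    print(f"Generated {len(exhibits)} pages in {out_dir}/")
    ```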

  6. An Introduction to the Extensible Markup Language (XML).

    ERIC Educational Resources Information Center

    Bryan, Martin

    1998-01-01

    Describes Extensible Markup Language (XML), a subset of the Standard Generalized Markup Language (SGML) that is designed to make it easy to interchange structured documents over the Internet. Topics include Document Type Definition (DTD), components of XML, the use of XML, text and non-text elements, and uses for XML-coded files. (LRW)

  7. Hyper Text Mark-up Language and Dublin Core metadata element set usage in websites of Iranian State Universities' libraries.

    PubMed

    Zare-Farashbandi, Firoozeh; Ramezan-Shirazi, Mahtab; Ashrafi-Rizi, Hasan; Nouri, Rasool

    2014-01-01

    Recent progress in providing innovative solutions in the organization of electronic resources and research in this area shows a global trend in the use of new strategies such as metadata to facilitate description, place for, organization and retrieval of resources in the web environment. In this context, library metadata standards have a special place; therefore, the purpose of the present study has been a comparative study on the Central Libraries' Websites of Iran State Universities for Hyper Text Mark-up Language (HTML) and Dublin Core metadata elements usage in 2011. The method of this study is applied-descriptive and the data collection tool is a set of checklists created by the researchers. The statistical community includes 98 websites of the Iranian State Universities of the Ministry of Health and Medical Education and Ministry of Science, Research and Technology and the method of sampling is the census. Information was collected through observation and direct visits to websites and data analysis was prepared by Microsoft Excel software, 2011. The results of this study indicate that none of the websites use Dublin Core (DC) metadata and that only a few of them have used overlapping elements between HTML meta tags and Dublin Core (DC) elements. The percentage of overlaps of DC elements centralization in the Ministry of Health was 56% for both description and keywords and, in the Ministry of Science, was 45% for the keywords and 39% for the description. But HTML meta tags have a moderate presence in both Ministries, as the most-used elements were keywords and description (56%) and the least-used elements were date and formatter (0%). It was observed that the Ministry of Health and the Ministry of Science follow the same path for using the Dublin Core standard on their websites in the future. Because Central Library Websites are an example of scientific web pages, special attention in designing them can help researchers to achieve faster and more accurate information resources. Therefore, the influence of librarians' ideas on the awareness of web designers and developers will be important for using metadata elements in general, and specifically for applying such standards.
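
    In practice the comparison amounts to inspecting each site's HTML meta tags and checking their names against the Dublin Core element set. A rough sketch of that check with Python's standard HTML parser, using an invented page snippet and element list rather than the study's checklist:

    ```python
    # Sketch: detect Dublin Core and ordinary HTML meta tags in a page.
    # The sample HTML and the element list are illustrative, not the study's checklist.
    from html.parser import HTMLParser

    DC_ELEMENTS = {"dc.title", "dc.creator", "dc.subject", "dc.description", "dc.date"}

    class MetaScanner(HTMLParser):
        def __init__(self):
            super().__init__()
            self.dc_found, self.html_meta_found = set(), set()

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            name = dict(attrs).get("name", "").lower()
            if name in DC_ELEMENTS:
                self.dc_found.add(name)
            elif name:
                self.html_meta_found.add(name)

    sample = '<head><meta name="keywords" content="library"><meta name="DC.Title" content="Central Library"></head>'
    scanner = MetaScanner()
    scanner.feed(sample)
    print("Dublin Core elements:", sorted(scanner.dc_found))
    print("Other HTML meta tags:", sorted(scanner.html_meta_found))
    ```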

  8. Hyper Text Mark-up Language and Dublin Core metadata element set usage in websites of Iranian State Universities’ libraries

    PubMed Central

    Zare-Farashbandi, Firoozeh; Ramezan-Shirazi, Mahtab; Ashrafi-Rizi, Hasan; Nouri, Rasool

    2014-01-01

    Introduction: Recent progress in providing innovative solutions in the organization of electronic resources and research in this area shows a global trend in the use of new strategies such as metadata to facilitate description, place for, organization and retrieval of resources in the web environment. In this context, library metadata standards have a special place; therefore, the purpose of the present study has been a comparative study on the Central Libraries’ Websites of Iran State Universities for Hyper Text Mark-up Language (HTML) and Dublin Core metadata elements usage in 2011. Materials and Methods: The method of this study is applied-descriptive and the data collection tool is a set of checklists created by the researchers. The statistical community includes 98 websites of the Iranian State Universities of the Ministry of Health and Medical Education and Ministry of Science, Research and Technology and the method of sampling is the census. Information was collected through observation and direct visits to websites and data analysis was prepared by Microsoft Excel software, 2011. Results: The results of this study indicate that none of the websites use Dublin Core (DC) metadata and that only a few of them have used overlapping elements between HTML meta tags and Dublin Core (DC) elements. The percentage of overlaps of DC elements centralization in the Ministry of Health was 56% for both description and keywords and, in the Ministry of Science, was 45% for the keywords and 39% for the description. But HTML meta tags have a moderate presence in both Ministries, as the most-used elements were keywords and description (56%) and the least-used elements were date and formatter (0%). Conclusion: It was observed that the Ministry of Health and the Ministry of Science follow the same path for using the Dublin Core standard on their websites in the future. Because Central Library Websites are an example of scientific web pages, special attention in designing them can help researchers to achieve faster and more accurate information resources. Therefore, the influence of librarians’ ideas on the awareness of web designers and developers will be important for using metadata elements in general, and specifically for applying such standards. PMID:24741646

  9. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed

    Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.
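
    To make the markup-and-retrieval idea concrete: sections of text are tagged with concepts, and a predefined question retrieves the sections whose concept markup covers the question's concepts. The toy sketch below shows only that matching step, with invented concepts and passage identifiers; it is not the MYCIN II or ACQUIRE code.

    ```python
    # Toy sketch of concept-based retrieval over marked-up text.
    # Concepts, passage identifiers, and the query are invented for illustration.
    passages = {
        "ch3-sec2": {"fever", "endocarditis", "antibiotic-therapy"},
        "ch7-sec1": {"meningitis", "antibiotic-therapy"},
        "ch9-sec4": {"fever", "travel-history"},
    }

    def answer(query_concepts: set[str]) -> list[str]:
        """Return passages whose concept markup contains every query concept."""
        return [pid for pid, concepts in passages.items() if query_concepts <= concepts]

    # "Which sections discuss antibiotic therapy for fever?"
    print(answer({"fever", "antibiotic-therapy"}))   # -> ['ch3-sec2']
    ```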

  10. Archive of Boomer seismic reflection data: collected during USGS Cruise 96CCT01, nearshore south central South Carolina coast, June 26 - July 1, 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Wiese, Dana S.

    2003-01-01

    This archive consists of marine seismic reflection profile data collected in four survey areas from southeast of Charleston Harbor to the mouth of the North Edisto River of South Carolina. These data were acquired June 26 - July 1, 1996, aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper Text Markup Language (HTML), Portable Document Format (PDF), Rich Text Format (RTF), Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images, and shapefiles. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) map documents provided were created with Environmental Systems Research Institute (ESRI) GIS software ArcView 3.2 and 8.1.

  11. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed Central

    Dugan, J. M.; Berrios, D. C.; Liu, X.; Kim, D. K.; Kaizer, H.; Fagan, L. M.

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models. PMID:10566457

  12. ArdenML: The Arden Syntax Markup Language (or Arden Syntax: It's Not Just Text Any More!)

    PubMed Central

    Sailors, R. Matthew

    2001-01-01

    It is no longer necessary to think of Arden Syntax as simply a text-based knowledge base format. The development of ArdenML (Arden Syntax Markup Language), an XML-based markup language, allows structured access to most of the maintenance and library categories without the need to write or buy a compiler, and it may lead to the development of simple commercial and freeware tools for processing Arden Syntax Medical Logic Modules (MLMs).
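
    Because ArdenML expresses an MLM's maintenance and library categories as XML, generic XML tooling can read those fields without an Arden compiler. The fragment below illustrates the idea with Python's ElementTree and a deliberately simplified element layout; it is not the actual ArdenML schema.

    ```python
    # Sketch: structured access to MLM maintenance/library fields via XML.
    # Element names are simplified stand-ins, not the real ArdenML schema.
    import xml.etree.ElementTree as ET

    mlm_xml = """<mlm>
      <maintenance>
        <title>Penicillin allergy alert</title>
        <version>1.02</version>
      </maintenance>
      <library>
        <purpose>Warn when penicillin is ordered for an allergic patient.</purpose>
      </library>
    </mlm>"""

    root = ET.fromstring(mlm_xml)
    print("Title:  ", root.findtext("maintenance/title"))
    print("Version:", root.findtext("maintenance/version"))
    print("Purpose:", root.findtext("library/purpose"))
    ```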

  13. jsNMR: an embedded platform-independent NMR spectrum viewer.

    PubMed

    Vosegaard, Thomas

    2015-04-01

    jsNMR is a lightweight NMR spectrum viewer written in JavaScript/HyperText Markup Language (HTML), which provides a cross-platform spectrum visualizer that runs on all computer architectures including mobile devices. Experimental (and simulated) datasets are easily opened in jsNMR by (i) drag and drop on a jsNMR browser window, (ii) by preparing a jsNMR file from the jsNMR web site, or (iii) by mailing the raw data to the jsNMR web portal. jsNMR embeds the original data in the HTML file, so a jsNMR file is a self-transforming dataset that may be exported to various formats, e.g. comma-separated values. The main applications of jsNMR are to provide easy access to NMR data without the need for dedicated software installed and to provide the possibility to visualize NMR spectra on web sites. Copyright © 2015 John Wiley & Sons, Ltd.
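
    The distinctive design choice is that jsNMR embeds the raw dataset inside the HTML file itself, so one file is both viewer and data container and can later be exported, for example to comma-separated values. A rough stand-alone illustration of that pattern in Python, with invented spectrum values and an embedding layout that is not jsNMR's actual file format:

    ```python
    # Sketch: embed a dataset inside an HTML file and read it back out as CSV.
    # The spectrum values and the embedding layout are invented for illustration.
    import csv
    import io
    import json

    spectrum = [{"ppm": 7.26, "intensity": 1.00}, {"ppm": 2.50, "intensity": 0.42}]

    # Write the data into the page inside a script data block.
    html = ("<html><body><h1>Spectrum</h1>\n"
            '<script type="application/json" id="nmr-data">'
            + json.dumps(spectrum) + "</script>\n</body></html>")

    # Later, recover the embedded block and export it as comma-separated values.
    start = html.index('id="nmr-data">') + len('id="nmr-data">')
    end = html.index("</script>", start)
    points = json.loads(html[start:end])

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["ppm", "intensity"])
    writer.writeheader()
    writer.writerows(points)
    print(out.getvalue())
    ```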

  14. Client-side Medical Image Colorization in a Collaborative Environment.

    PubMed

    Virag, Ioan; Stoicu-Tivadar, Lăcrămioara; Crişan-Vida, Mihaela

    2015-01-01

    The paper presents an application related to collaborative medicine using a browser-based medical visualization system with a focus on the medical image colorization process and the underlying open source web development technologies involved. Browser-based systems allow physicians to share medical data with their remotely located counterparts or medical students, assisting them during patient diagnosis, treatment monitoring, surgery planning or for educational purposes. This approach brings forth the advantage of ubiquity. The system can be accessed from any device in order to process the images, ensuring independence from any specific proprietary operating system. The current work starts with processing of DICOM (Digital Imaging and Communications in Medicine) files and ends with the rendering of the resulting bitmap images on an HTML5 (fifth revision of the HyperText Markup Language) canvas element. The application improves image visualization by emphasizing different tissue densities.

  15. Archive of chirp seismic reflection data collected during USGS cruises 00SCC02 and 00SCC04, Barataria Basin, Louisiana, May 12-31 and June 17-July 2, 2000

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, S.V.; Kindinger, J.L.; Flocks, J.G.; Wiese, D.S.; Kulp, Mark; Penland, Shea; Britsch, L.D.; Brooks, G.R.

    2003-01-01

    This archive consists of two-dimensional marine seismic reflection profile data collected in the Barataria Basin of southern Louisiana. These data were acquired in May, June, and July of 2000 aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper-Text Markup Language (HTML), shapefiles, and Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) information provided here is compatible with Environmental Systems Research Institute (ESRI) GIS software.

  16. Semantic Markup for Literary Scholars: How Descriptive Markup Affects the Study and Teaching of Literature.

    ERIC Educational Resources Information Center

    Campbell, D. Grant

    2002-01-01

    Describes a qualitative study which investigated the attitudes of literary scholars towards the features of semantic markup for primary texts in XML format. Suggests that layout is a vital part of the reading process which implies that the standardization of DTDs (Document Type Definitions) should extend to styling as well. (Author/LRW)

  17. FastScript3D - A Companion to Java 3D

    NASA Technical Reports Server (NTRS)

    Koenig, Patti

    2005-01-01

    FastScript3D is a computer program, written in the Java 3D(TM) programming language, that establishes an alternative language that helps users who lack expertise in Java 3D to use Java 3D for constructing three-dimensional (3D)-appearing graphics. The FastScript3D language provides a set of simple, intuitive, one-line text-string commands for creating, controlling, and animating 3D models. The first word in a string is the name of a command; the rest of the string contains the data arguments for the command. The commands can also be used as an aid to learning Java 3D. Developers can extend the language by adding custom text-string commands. The commands can define new 3D objects or load representations of 3D objects from files in formats compatible with such other software systems as X3D. The text strings can be easily integrated into other languages. FastScript3D facilitates communication between scripting languages [which enable programming of hyper-text markup language (HTML) documents to interact with users] and Java 3D. The FastScript3D language can be extended and customized on both the scripting side and the Java 3D side.

  18. Training Joint, Interagency, Intergovernmental, and Multinational (JIIM) Participants for Stability Operations

    DTIC Science & Technology

    2012-09-01

    boxes) using a third-party commercial software component. When creating version 1, it was necessary to enter raw Hypertext Markup Language (HTML) tags... Markup Language (HTML) web page. Figure 12. Authors create procedures using the Procedure Editor. Users run procedures using the... step presents instructions to the user using formatted text and graphics specified using the Hypertext Markup Language (HTML). Instructions can…

  19. Descriptive Metadata: Emerging Standards.

    ERIC Educational Resources Information Center

    Ahronheim, Judith R.

    1998-01-01

    Discusses metadata, digital resources, cross-disciplinary activity, and standards. Highlights include Standard Generalized Markup Language (SGML); Extensible Markup Language (XML); Dublin Core; Resource Description Framework (RDF); Text Encoding Initiative (TEI); Encoded Archival Description (EAD); art and cultural-heritage metadata initiatives;…

  20. Real-time WebRTC-based design for a telepresence wheelchair.

    PubMed

    Van Kha Ly Ha; Rifai Chai; Nguyen, Hung T

    2017-07-01

    This paper presents a novel approach to the telepresence wheelchair system which is capable of real-time video communication and remote interaction. The investigation of this emerging technology aims at providing a low-cost and efficient way for assisted-living of people with disabilities. The proposed system has been designed and developed by deploying the JavaScript with Hyper Text Markup Language 5 (HTML5) and Web Real-time Communication (WebRTC) in which the adaptive rate control algorithm for video transmission is invoked. We conducted experiments in real-world environments, and the wheelchair was controlled from a distance using the Internet browser to compare with existing methods. The results show that the adaptively encoded video streaming rate matches the available bandwidth. The video streaming is high-quality with approximately 30 frames per second (fps) and round trip time less than 20 milliseconds (ms). These performance results confirm that the WebRTC approach is a potential method for developing a telepresence wheelchair system.
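
    The adaptive rate control described here matches the encoded video bitrate to the measured available bandwidth. The loop below is only a schematic restatement of that idea, with invented step sizes, bounds, and bandwidth estimates; the actual algorithm and its WebRTC integration are those described in the paper.

    ```python
    # Schematic sketch of adaptive bitrate control: raise the encoding rate when
    # bandwidth headroom exists, drop it quickly when bandwidth falls.
    # All numbers (steps, bounds, measurements) are invented for illustration.
    MIN_RATE_KBPS, MAX_RATE_KBPS = 200, 2500

    def next_rate(current_kbps: float, available_kbps: float) -> float:
        if available_kbps < current_kbps:          # congestion: back off sharply
            target = 0.85 * available_kbps
        else:                                      # headroom: probe upward gently
            target = current_kbps + 0.05 * (available_kbps - current_kbps)
        return max(MIN_RATE_KBPS, min(MAX_RATE_KBPS, target))

    rate = 600.0
    for bandwidth in [900, 1200, 1500, 400, 450, 800]:   # simulated estimates (kbps)
        rate = next_rate(rate, bandwidth)
        print(f"bandwidth={bandwidth:5d} kbps -> encode at {rate:7.1f} kbps")
    ```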

  1. SGML by Evolution.

    ERIC Educational Resources Information Center

    Ensign, Chet

    1993-01-01

    Describes how the change to Standard Generalized Markup Language at Information Builders began with the use of SGML-like markup in text because it solved a specific problem. Notes that many additional unexpected benefits led to an investigation of converting to formal SGML-based electronic publishing. (SR)

  2. Automated Text Markup for Information Retrieval from an Electronic Textbook of Infectious Disease

    PubMed Central

    Berrios, Daniel C.; Kehler, Andrew; Kim, David K.; Yu, Victor L.; Fagan, Lawrence M.

    1998-01-01

    The information needs of practicing clinicians frequently require textbook or journal searches. Making these sources available in electronic form improves the speed of these searches, but precision (i.e., the fraction of relevant to total documents retrieved) remains low. Improving the traditional keyword search by transforming search terms into canonical concepts does not improve search precision greatly. Kim et al. have designed and built a prototype system (MYCIN II) for computer-based information retrieval from a forthcoming electronic textbook of infectious disease. The system requires manual indexing by experts in the form of complex text markup. However, this mark-up process is time consuming (about 3 person-hours to generate, review, and transcribe the index for each of 218 chapters). We have designed and implemented a system to semiautomate the markup process. The system, information extraction for semiautomated indexing of documents (ISAID), uses query models and existing information-extraction tools to provide support for any user, including the author of the source material, to mark up tertiary information sources quickly and accurately.

  3. PubMed-EX: a web browser extension to enhance PubMed search with text mining features.

    PubMed

    Tsai, Richard Tzong-Han; Dai, Hong-Jie; Lai, Po-Ting; Huang, Chi-Hsin

    2009-11-15

    PubMed-EX is a browser extension that marks up PubMed search results with additional text-mining information. PubMed-EX's page mark-up, which includes section categorization and gene/disease and relation mark-up, can help researchers to quickly focus on key terms and provide additional information on them. All text processing is performed server-side, freeing up user resources. PubMed-EX is freely available at http://bws.iis.sinica.edu.tw/PubMed-EX and http://iisr.cse.yzu.edu.tw:8000/PubMed-EX/.

  4. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  5. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and a server-side scripting language, PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) To review the results of QA/QC for a radiotherapy machine, manuals, and reports necessary for routinely performing radiotherapy through this system. By disclosing the results, transparency can be maintained; (ii) To reveal a protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators, but also all staff involved in radiotherapy, can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.

  6. A comprehensive strategy for designing a Web-based medical curriculum.

    PubMed Central

    Zucker, J.; Chase, H.; Molholt, P.; Bean, C.; Kahn, R. M.

    1996-01-01

    In preparing for a full featured online curriculum, it is necessary to develop scaleable strategies for software design that will support the pedagogical goals of the curriculum and which will address the issues of acquisition and updating of materials, of robust content-based linking, and of integration of the online materials into other methods of learning. A complete online curriculum, as distinct from an individual computerized module, must provide dynamic updating of both content and structure and an easy pathway from the professor's notes to the finished online product. At the College of Physicians and Surgeons, we are developing such strategies including a scripted text conversion process that uses the Hypertext Markup Language (HTML) as structural markup rather than as display markup, automated linking by the use of relational databases and the Unified Medical Language System (UMLS), integration of text, images, and multimedia along with interface designs which promote multiple contexts and collaborative study. PMID:8947624

  7. Development of Markup Language for Medical Record Charting: A Charting Language.

    PubMed

    Jung, Won-Mo; Chae, Younbyoung; Jang, Bo-Hyoung

    2015-01-01

    Nowadays many efforts to collect electronic medical records (EMRs) exist. However, structuring the data format for an EMR is an especially labour-intensive task for practitioners. Here we propose a new mark-up language for medical record charting (called Charting Language), which borrows useful properties from programming languages. Thus, with Charting Language, the text data described in dynamic situations can easily be used to extract information.

  8. Teaching with HyperCard in Place of a Textbook.

    ERIC Educational Resources Information Center

    Mackey, Neosha; And Others

    1992-01-01

    To alleviate the staffing pressures of increased demands for tours and classes at the Duane G. Meyer Library, Southwest Missouri State University, two HyperCard programs were developed--a library instruction text and a library orientation tour. A study of the relative effectiveness of the HyperCard text with paper texts for bibliographic…

  9. Evolution of a Structure-Searchable Database into a Prototype for a High-Fidelity SmartPhone App for 62 Common Pesticides Used in Delaware.

    PubMed

    D'Souza, Malcolm J; Barile, Benjamin; Givens, Aaron F

    2015-05-01

    Synthetic pesticides are widely used in the modern world for human benefit. They are usually classified according to their intended pest target. In Delaware (DE), approximately 42 percent of the arable land is used for agriculture. In order to manage insectivorous and herbaceous pests (such as insects, weeds, nematodes, and rodents), pesticides are used profusely to biologically control the normal pest's life stage. In this undergraduate project, we first created a usable relational database containing 62 agricultural pesticides that are common in Delaware. Chemically pertinent quantitative and qualitative information was first stored in Bio-Rad's KnowItAll® Informatics System. Next, we extracted the data out of the KnowItAll® system and created additional sections on a Microsoft® Excel spreadsheet detailing pesticide use(s) and safety and handling information. Finally, in an effort to promote good agricultural practices, to increase efficiency in business decisions, and to make pesticide data globally accessible, we developed a mobile application for smartphones that displayed the pesticide database using Appery.io™; a cloud-based HyperText Markup Language (HTML5), jQuery Mobile and Hybrid Mobile app builder.

  10. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation model.

    PubMed

    Mongkolwat, Pattanasak; Kleper, Vladimir; Talbot, Skip; Rubin, Daniel

    2014-12-01

    Knowledge contained within in vivo imaging annotated by human experts or computer programs is typically stored as unstructured text and separated from other associated information. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation information model is an evolution of the National Institute of Health's (NIH) National Cancer Institute's (NCI) Cancer Bioinformatics Grid (caBIG®) AIM model. The model applies to various image types created by various techniques and disciplines. It has evolved in response to the feedback and changing demands from the imaging community at NCI. The foundation model serves as a base for other imaging disciplines that want to extend the type of information the model collects. The model captures physical entities and their characteristics, imaging observation entities and their characteristics, markups (two- and three-dimensional), AIM statements, calculations, image source, inferences, annotation role, task context or workflow, audit trail, AIM creator details, equipment used to create AIM instances, subject demographics, and adjudication observations. An AIM instance can be stored as a Digital Imaging and Communications in Medicine (DICOM) structured reporting (SR) object or Extensible Markup Language (XML) document for further processing and analysis. An AIM instance consists of one or more annotations and associated markups of a single finding along with other ancillary information in the AIM model. An annotation describes information about the meaning of pixel data in an image. A markup is a graphical drawing placed on the image that depicts a region of interest. This paper describes fundamental AIM concepts and how to use and extend AIM for various imaging disciplines.

  11. World Wide Web and Internet: applications for radiologists.

    PubMed

    Wunderbaldinger, P; Schima, W; Turetschek, K; Helbich, T H; Bankier, A A; Herold, C J

    1999-01-01

    Global exchange of information is one of the major sources of scientific progress in medicine. For management of the rapidly growing body of medical information, computers and their applications have become an indispensable scientific tool. Approximately 36 million computer users are part of a worldwide network called the Internet or "information highway" and have created a new infrastructure to promote rapid and efficient access to medical, and thus also to radiological, information. With the establishment of the World Wide Web (WWW) by a consortium of computer users who used a standardized, nonproprietary syntax termed HyperText Markup Language (HTML) for composing documents, it has become possible to provide interactive multimedia presentations to a wide audience. The extensive use of images in radiology makes education, worldwide consultation (review) and scientific presentation via the Internet a major beneficiary of this technical development. This is possible, since both information (text) as well as medical images can be transported via the Internet. Presently, the Internet offers an extensive database for radiologists. Since many radiologists and physicians have to be considered "Internet novices" and, hence, cannot yet avail themselves of the broad spectrum of the Internet, the aim of this article is to present a general introduction to the WWW/Internet and its applications for radiologists. All Internet sites mentioned in this article can be found at the following Internet address: http://www.univie.ac.at/radio/radio.html (Department of Radiology, University of Vienna)

  12. Real-time Data Display System of the Korean Neonatal Network

    PubMed Central

    Lee, Byong Sop; Moon, Wi Hwan

    2015-01-01

    Real-time data reporting in clinical research networks can provide network members with interim analyses of the registered data, which can facilitate further studies and quality improvement activities. The aim of this report was to describe the building process of the data display system (DDS) of the Korean Neonatal Network (KNN) and its basic structure. After member verification at the KNN member's site, users can choose a variable of interest that is listed in the in-hospital data statistics (for 90 variables) or in the follow-up data statistics (for 54 variables). The statistical results of the outcome variables are displayed on the HyperText Markup Language 5-based chart graphs and tables. Participating hospitals can compare their performance to that of KNN as a whole and identify the trends over time. Ranking of each participating hospital is also displayed in terms of key outcome variables such as mortality and major neonatal morbidities with the names of other centers blinded. The most powerful function of the DDS is the ability to perform 'conditional filtering' which allows users to exclusively review the records of interest. Further collaboration is needed to upgrade the DDS to a more sophisticated analytical system and to provide a more user-friendly interface. PMID:26566352
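
    The 'conditional filtering' function amounts to restricting the interim statistics to records that satisfy user-chosen conditions. A toy sketch of that operation over invented registry records (not the KNN schema or data):

    ```python
    # Toy sketch of "conditional filtering" over registry records.
    # Records, field names, and the condition are invented for illustration.
    records = [
        {"center": "A", "gestational_age_wk": 26, "survived": True},
        {"center": "A", "gestational_age_wk": 31, "survived": True},
        {"center": "B", "gestational_age_wk": 25, "survived": False},
        {"center": "B", "gestational_age_wk": 28, "survived": True},
    ]

    def filter_records(rows, **conditions):
        """Keep rows whose fields match every condition (field=value or field=callable)."""
        def matches(row):
            return all(cond(row[k]) if callable(cond) else row[k] == cond
                       for k, cond in conditions.items())
        return [r for r in rows if matches(r)]

    subset = filter_records(records, gestational_age_wk=lambda wk: wk < 28)
    survival = sum(r["survived"] for r in subset) / len(subset)
    print(f"{len(subset)} infants < 28 weeks, survival {survival:.0%}")
    ```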

  13. Making Technology Work for Scholarship: Investing in the Data.

    ERIC Educational Resources Information Center

    Hockey, Susan

    This paper examines issues related to how providers and consumers can make the best use of electronic information, focusing on the humanities. Topics include: new technology or old; electronic text and data formats; Standard Generalized Markup Language (SGML); text encoding initiative; encoded archival description (EAD); other applications of…

  14. HGML: a hypertext guideline markup language.

    PubMed Central

    Hagerty, C. G.; Pickens, D.; Kulikowski, C.; Sonnenberg, F.

    2000-01-01

    Existing text-based clinical practice guidelines can be difficult to put into practice. While a growing number of such documents have gained acceptance in the medical community and contain a wealth of valuable information, the time required to digest them is substantial. Yet the expressive power, subtlety and flexibility of natural language pose challenges when designing computer tools that will help in their application. At the same time, formal computer languages typically lack such expressiveness and the effort required to translate existing documents into these languages may be costly. We propose a method based on the mark-up concept for converting text-based clinical guidelines into a machine-operable form. This allows existing guidelines to be manipulated by machine, and viewed in different formats at various levels of detail according to the needs of the practitioner, while preserving their originally published form. PMID:11079898

  15. Hospital markup and operation outcomes in the United States.

    PubMed

    Gani, Faiz; Ejaz, Aslam; Makary, Martin A; Pawlik, Timothy M

    2016-07-01

    Although the price hospitals charge for operations has broad financial implications, hospital pricing is not subject to regulation. We sought to characterize national variation in hospital price markup for major cardiothoracic and gastrointestinal operations and to evaluate perioperative outcomes of hospitals relative to hospital price markup. All hospitals in which a patient underwent a cardiothoracic or gastrointestinal procedure were identified using the Nationwide Inpatient Sample for 2012. Markup ratios (ratio of charges to costs) for the total cost of hospitalization were compared across hospitals. Risk-adjusted morbidity, failure-to-rescue, and mortality were calculated using multivariable, hierarchical logistic regression. Among the 3,498 hospitals identified, markup ratios ranged from 0.5-12.2, with a median markup ratio of 2.8 (interquartile range 2.7-3.9). For the 888 hospitals with extreme markup (greatest markup ratio quartile: markup ratio >3.9), the median markup ratio was 4.9 (interquartile range 4.3-6.0), with 10% of these hospitals billing more than 7 times the Medicare-allowable costs (markup ratio ≥7.25). Extreme markup hospitals were more often large (46.3% vs 33.8%, P < .001), urban, nonteaching centers (57.0% vs 37.9%, P < .001), and located in the Southern (46.4% vs 32.8%, P < .001) or Western (27.8% vs 17.6%, P < .001) regions of the United States. Of the 639 investor-owned, for-profit hospitals, 401 hospitals (62.8%) had an extreme markup ratio compared with 19.3% (n = 452) and 6.8% (n = 35) of nonprofit and government hospitals, respectively. Perioperative morbidity (32.7% vs 26.4%, P < .001) was greater at extreme markup hospitals. There is wide variation in hospital markup for cardiothoracic and gastrointestinal procedures, with approximately a quarter of hospital charges being 4 times greater than the actual cost of hospitalization. Hospitals with an extreme markup had greater perioperative morbidity. Copyright © 2016 Elsevier Inc. All rights reserved.
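
    For readers unfamiliar with the terminology, the markup ratio is simply total charges divided by total (Medicare-allowable) costs, and 'extreme markup' denotes the highest quartile, a ratio above 3.9 in this sample. A brief worked example with made-up hospital figures:

    ```python
    # Worked example: markup ratio = total charges / total costs.
    # Hospital names and dollar amounts are made up; the >3.9 cutoff is the
    # greatest-quartile threshold reported in the abstract.
    EXTREME_CUTOFF = 3.9

    hospitals = {
        "Hospital A": {"charges": 1_400_000, "costs": 500_000},
        "Hospital B": {"charges": 2_600_000, "costs": 520_000},
        "Hospital C": {"charges":   900_000, "costs": 450_000},
    }

    for name, fin in hospitals.items():
        ratio = fin["charges"] / fin["costs"]
        flag = "extreme markup" if ratio > EXTREME_CUTOFF else "within typical range"
        print(f"{name}: markup ratio = {ratio:.1f} ({flag})")
    ```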

  16. Making journals accessible to the visually impaired: the future is near

    PubMed Central

    GARDNER, John; BULATOV, Vladimir; KELLY, Robert

    2010-01-01

    The American Physical Society (APS) has been a leader in using markup languages for publishing. ViewPlus has led development of innovative technologies for graphical information accessibility by people with print disabilities. APS, ViewPlus, and other collaborators in the Enhanced Reading Project are working together to develop the necessary technology and infrastructure for APS to publish its journals in the DAISY (Digital Accessible Information SYstem) eXtensible Markup Language (XML) format, in which all text, math, and figures would be accessible to people who are blind or have other print disabilities. The first APS DAISY XML publications are targeted for late 2010. PMID:20676358

  17. Database Reports Over the Internet

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data is stored in a DBMS (Data Base Management System). The client asks for the information from the database using an HTML (Hyper Text Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page; others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (Personal Computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
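
    The flow described is: browser form, then an SQL query against the DBMS, then a report filled from a template and returned to the browser. The sketch below compresses that flow into a few lines of Python, with SQLite standing in for Microsoft Access and an invented test-report table; it is not the project's servlet code.

    ```python
    # Sketch of the form -> SQL query -> filled report template flow.
    # SQLite stands in for Access; the table and template are invented.
    import sqlite3
    from string import Template

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE test_reports (report_id TEXT, engineer TEXT, result TEXT)")
    conn.execute("INSERT INTO test_reports VALUES ('TR-001', 'D. Smith', 'PASS')")

    def build_report(report_id: str) -> str:
        # Parameterized SQL, as a servlet would issue against the DBMS.
        row = conn.execute(
            "SELECT report_id, engineer, result FROM test_reports WHERE report_id = ?",
            (report_id,),
        ).fetchone()
        if row is None:
            return f"<p>Error: no report named {report_id}</p>"   # error page back to browser
        template = Template("<h1>Report $rid</h1><p>Engineer: $eng</p><p>Result: $res</p>")
        return template.substitute(rid=row[0], eng=row[1], res=row[2])

    print(build_report("TR-001"))   # what the requesting browser would receive
    ```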

  18. Ancient Egypt

    NASA Astrophysics Data System (ADS)

    Swamy, Ashwin Balegar

    This thesis involves the development of an interactive GIS (Geographic Information System) based application which gives information about the ancient history of Egypt. The astonishing architecture, the strange burial rituals and the civilization itself were some of the intriguing questions that motivated me to develop this application. The application is a historical timeline starting from 3100 BC and leading up to 664 BC, focusing on the evolution of the Egyptian dynasties. The tool holds information regarding some of the famous monuments which were constructed during that era and also about the civilizations that co-existed. It also provides details about the religions followed by the kings and the languages spoken during those periods. The tool is developed using Java, a programming language, and MOJO (Map Objects Java Objects), a product of ESRI (Environmental Systems Research Institute), to create map objects and provide geographic information. Java Swing is used for designing the user interface. HTML (Hyper Text Markup Language) pages are created to provide the user with more information related to the historic period. CSS (Cascading Style Sheets) and JavaScript are used with HTML5 to achieve a creative display of content. The tool is kept simple and easy for the user to interact with. The tool also includes pictures and videos for the user to get a feel of the historic period. The application is built to motivate people to know more about one of the prominent ancient civilizations of the Mediterranean world.

  19. HyperCard K-12: Classroom Computer Learning Special Supplement Sponsored by Apple Computer.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1989

    1989-01-01

    Follows the development of hypertext, which allows electronic movement through large amounts of text. Probes the use of the Macintosh HyperCard and its applications in education. Notes that HyperCard programs are organized as stacks on the computer. Provides a tool, resource, and stack directory along with tips for using HyperCard. (MVL)

  20. Dictionary as Database.

    ERIC Educational Resources Information Center

    Painter, Derrick

    1996-01-01

    Discussion of dictionaries as databases focuses on the digitizing of The Oxford English dictionary (OED) and the use of Standard Generalized Mark-Up Language (SGML). Topics include the creation of a consortium to digitize the OED, document structure, relational databases, text forms, sequence, and discourse. (LRW)

  1. HyperCard as a Text Analysis Tool for the Qualitative Researcher.

    ERIC Educational Resources Information Center

    Handler, Marianne G.; Turner, Sandra V.

    HyperCard is a general-purpose program for the Macintosh computer that allows multiple ways of viewing and accessing a large body of information. Two ways in which HyperCard can be used as a research tool are illustrated. One way is to organize and analyze qualitative data from observations, interviews, surveys, and other documents. The other way…

  2. Castles Made of Sand: Building Sustainable Digitized Collections Using XML.

    ERIC Educational Resources Information Center

    Ragon, Bart

    2003-01-01

    Describes work at the University of Virginia library to digitize special collections. Discusses the use of XML (Extensible Markup Language); providing access to original source materials; DTD (Document Type Definition); TEI (Text Encoding Initiative); metadata; XSL (Extensible Style Language); and future possibilities. (LRW)

  3. Balancing medicine prices and business sustainability: analyses of pharmacy costs, revenues and profit shed light on retail medicine mark-ups in rural Kyrgyzstan

    PubMed Central

    2010-01-01

    Background Numerous not-for-profit pharmacies have been created to improve access to medicines for the poor, but many have failed due to insufficient financial planning and management. These pharmacies are not well described in health services literature despite strong demand from policy makers, implementers, and researchers. Surveys reporting unaffordable medicine prices and high mark-ups have spurred efforts to reduce medicine prices, but price reduction goals are arbitrary in the absence of information on pharmacy costs, revenues, and profit structures. Health services research is needed to develop sustainable and "reasonable" medicine price goals and strategic initiatives to reach them. Methods We utilized cost accounting methods on inventory and financial information obtained from a not-for-profit rural pharmacy network in mountainous Kyrgyzstan to quantify costs, revenues, profits and medicine mark-ups during establishment and maintenance periods (October 2004-December 2007). Results Twelve pharmacies and one warehouse were established in remote Kyrgyzstan with < US $25,000 due to governmental resource-sharing. The network operated at break-even profit, leaving little room to lower medicine prices and mark-ups. Medicine mark-ups needed for sustainability were greater than originally envisioned by network administration. In 2005, 55%, 35%, and 10% of the network's top 50 products revealed mark-ups of < 50%, 50-99% and > 100%, respectively. Annual mark-ups increased dramatically each year to cover increasing recurrent costs, and by 2007, only 19% and 46% of products revealed mark-ups of < 50% and 50-99%, respectively; while 35% of products revealed mark-ups > 100%. 2007 medicine mark-ups varied substantially across these products, ranging from 32% to 244%. Mark-ups needed to sustain private pharmacies would be even higher in the absence of government subsidies. Conclusion Pharmacy networks can be established in hard-to-reach regions with little funding using public-private partnership, resource-sharing models. Medicine prices and mark-ups must be interpreted with consideration for regional costs of business. Mark-ups vary dramatically across medicines. Some mark-ups appear "excessive" but are likely necessary for pharmacy viability. Pharmacy financial data is available in remote settings and can be used towards determination of "reasonable" medicine price goals. Health systems researchers must document the positive and negative financial experiences of pharmacy initiatives to inform future projects and advance access to medicines goals. PMID:20626904

  4. Balancing medicine prices and business sustainability: analyses of pharmacy costs, revenues and profit shed light on retail medicine mark-ups in rural Kyrgyzstan.

    PubMed

    Waning, Brenda; Maddix, Jason; Soucy, Lyne

    2010-07-13

    Numerous not-for-profit pharmacies have been created to improve access to medicines for the poor, but many have failed due to insufficient financial planning and management. These pharmacies are not well described in health services literature despite strong demand from policy makers, implementers, and researchers. Surveys reporting unaffordable medicine prices and high mark-ups have spurred efforts to reduce medicine prices, but price reduction goals are arbitrary in the absence of information on pharmacy costs, revenues, and profit structures. Health services research is needed to develop sustainable and "reasonable" medicine price goals and strategic initiatives to reach them. We utilized cost accounting methods on inventory and financial information obtained from a not-for-profit rural pharmacy network in mountainous Kyrgyzstan to quantify costs, revenues, profits and medicine mark-ups during establishment and maintenance periods (October 2004-December 2007). Twelve pharmacies and one warehouse were established in remote Kyrgyzstan with < US $25,000 due to governmental resource-sharing. The network operated at break-even profit, leaving little room to lower medicine prices and mark-ups. Medicine mark-ups needed for sustainability were greater than originally envisioned by network administration. In 2005, 55%, 35%, and 10% of the network's top 50 products revealed mark-ups of < 50%, 50-99% and > 100%, respectively. Annual mark-ups increased dramatically each year to cover increasing recurrent costs, and by 2007, only 19% and 46% of products revealed mark-ups of < 50% and 50-99%, respectively; while 35% of products revealed mark-ups > 100%. 2007 medicine mark-ups varied substantially across these products, ranging from 32% to 244%. Mark-ups needed to sustain private pharmacies would be even higher in the absence of government subsidies. Pharmacy networks can be established in hard-to-reach regions with little funding using public-private partnership, resource-sharing models. Medicine prices and mark-ups must be interpreted with consideration for regional costs of business. Mark-ups vary dramatically across medicines. Some mark-ups appear "excessive" but are likely necessary for pharmacy viability. Pharmacy financial data is available in remote settings and can be used towards determination of "reasonable" medicine price goals. Health systems researchers must document the positive and negative financial experiences of pharmacy initiatives to inform future projects and advance access to medicines goals.

  5. BioC: a minimalist approach to interoperability for biomedical text processing

    PubMed Central

    Comeau, Donald C.; Islamaj Doğan, Rezarta; Ciccarese, Paolo; Cohen, Kevin Bretonnel; Krallinger, Martin; Leitner, Florian; Lu, Zhiyong; Peng, Yifan; Rinaldi, Fabio; Torii, Manabu; Valencia, Alfonso; Verspoor, Karin; Wiegers, Thomas C.; Wu, Cathy H.; Wilbur, W. John

    2013-01-01

    A vast amount of scientific information is encoded in natural language text, and the quantity of such text has become so great that it is no longer economically feasible to have a human as the first step in the search process. Natural language processing and text mining tools have become essential to facilitate the search for and extraction of information from text. This has led to vigorous research efforts to create useful tools and to create humanly labeled text corpora, which can be used to improve such tools. To encourage combining these efforts into larger, more powerful and more capable systems, a common interchange format to represent, store and exchange the data in a simple manner between different language processing systems and text mining tools is highly desirable. Here we propose a simple extensible mark-up language format to share text documents and annotations. The proposed annotation approach allows a large number of different annotations to be represented including sentences, tokens, parts of speech, named entities such as genes or diseases and relationships between named entities. In addition, we provide simple code to hold this data, read it from and write it back to extensible mark-up language files and perform some sample processing. We also describe completed as well as ongoing work to apply the approach in several directions. Code and data are available at http://bioc.sourceforge.net/. Database URL: http://bioc.sourceforge.net/ PMID:24048470
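
    The BioC schema itself is defined by the project (see the URL above); the short Python sketch below only illustrates the general shape of such an interchange file - a collection containing a document, a passage of text, and one named-entity annotation - using element names that approximate, but are not guaranteed to match, the published format.

        import xml.etree.ElementTree as ET

        # Build a minimal BioC-style collection: one document, one passage,
        # one named-entity annotation. Element names are approximate.
        collection = ET.Element("collection")
        doc = ET.SubElement(collection, "document")
        ET.SubElement(doc, "id").text = "PMC0000001"
        passage = ET.SubElement(doc, "passage")
        ET.SubElement(passage, "offset").text = "0"
        ET.SubElement(passage, "text").text = "BRCA1 mutations increase breast cancer risk."
        annotation = ET.SubElement(passage, "annotation", id="T1")
        ET.SubElement(annotation, "infon", key="type").text = "Gene"
        ET.SubElement(annotation, "location", offset="0", length="5")
        ET.SubElement(annotation, "text").text = "BRCA1"

        xml_bytes = ET.tostring(collection, encoding="utf-8")

        # Any other tool in a text-mining pipeline can read the same file back.
        for ann in ET.fromstring(xml_bytes).iter("annotation"):
            print(ann.get("id"), ann.findtext("text"))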

  6. Geospatial Data Management Platform for Urban Groundwater

    NASA Astrophysics Data System (ADS)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and are spread across different institutions or private companies. Time-consuming operations such as data processing and information harmonisation are the main reason the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parkings, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, and CityGML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis tools) and a front-end geoportal service. The SIMPA platform makes use of mark-up transfer standards to provide a user-friendly application that can be accessed through the internet to query, analyse, and visualise geospatial data related to urban groundwater. The platform holds the information within the local groundwater geospatial databases, and the user is able to access these data through a geoportal service. The database architecture allows storing accurate and very detailed geological, hydrogeological, and infrastructure information that can be straightforwardly generalized and further upscaled. The geoportal service offers the possibility of querying a dataset from the spatial database. The query is coded in a standard mark-up language and sent to the server over the Hyper Text Transfer Protocol (HTTP) to be processed by the local application. After validation of the query, the results are sent back to the user to be displayed by the geoportal application. The main advantage of the SIMPA platform is that it offers the user the possibility of making a primary multi-criteria query, which results in a smaller set of data to be analysed afterwards. This improves both the transfer process parameters and the user's means of creating the desired query.
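
    The abstract describes the query round trip (a mark-up-coded query sent to the server over HTTP, validated there, with results returned for display) but does not give the request format, so the Python sketch below is purely hypothetical: the endpoint URL and the query elements are invented placeholders, not the actual SIMPA vocabulary.

        import urllib.request

        # Hypothetical multi-criteria query; element names are placeholders.
        query_xml = (
            "<GroundwaterQuery>"
            "<Aquifer>example-aquifer</Aquifer>"
            "<Parameter>piezometric_head</Parameter>"
            "<Period start='2011-01-01' end='2011-12-31'/>"
            "</GroundwaterQuery>"
        )

        request = urllib.request.Request(
            "http://geoportal.example.org/query",   # placeholder URL, not the real geoportal
            data=query_xml.encode("utf-8"),
            headers={"Content-Type": "application/xml"},
            method="POST",
        )

        # The server would validate the query, run it against the local geospatial
        # database, and return the (smaller) result set for display.
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                print(response.read().decode("utf-8"))
        except OSError as exc:
            print("Geoportal not reachable in this sketch:", exc)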

  7. Variation in markup of general surgical procedures by hospital market concentration.

    PubMed

    Cerullo, Marcelo; Chen, Sophia Y; Dillhoff, Mary; Schmidt, Carl R; Canner, Joseph K; Pawlik, Timothy M

    2018-04-01

    Increasing hospital market concentration (with concomitantly decreasing hospital market competition) may be associated with rising hospital prices. Hospital markup - the relative increase in price over costs - has been associated with greater hospital market concentration. Patients undergoing a cardiothoracic or gastrointestinal procedure in the 2008-2011 Nationwide Inpatient Sample (NIS) were identified and linked to Hospital Market Structure Files. The association between market concentration, hospital markup and hospital for-profit status was assessed using mixed-effects log-linear models. A weighted total of 1,181,936 patients were identified. In highly concentrated markets, private for-profit status was associated with an 80.8% higher markup compared to public/private not-for-profit status (95%CI: +69.5% - +96.9%; p < 0.001). However, private for-profit status in highly concentrated markets was associated with only a 62.9% higher markup compared to public/private not-for-profit status in unconcentrated markets (95%CI: +45.4% - +81.1%; p < 0.001). Hospital for-profit status modified the association between hospitals' market concentration and markup. Government and private not-for-profit hospitals employed lower markups in more concentrated markets, whereas private for-profit hospitals employed higher markups in more concentrated markets. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Definition of an XML markup language for clinical laboratory procedures and comparison with generic XML markup.

    PubMed

    Saadawi, Gilan M; Harrison, James H

    2006-10-01

    Clinical laboratory procedure manuals are typically maintained as word processor files and are inefficient to store and search, require substantial effort for review and updating, and integrate poorly with other laboratory information. Electronic document management systems could improve procedure management and utility. As a first step toward building such systems, we have developed a prototype electronic format for laboratory procedures using Extensible Markup Language (XML). Representative laboratory procedures were analyzed to identify document structure and data elements. This information was used to create a markup vocabulary, CLP-ML, expressed as an XML Document Type Definition (DTD). To determine whether this markup provided advantages over generic markup, we compared procedures structured with CLP-ML or with the vocabulary of the Health Level Seven, Inc. (HL7) Clinical Document Architecture (CDA) narrative block. CLP-ML includes 124 XML tags and supports a variety of procedure types across different laboratory sections. When compared with a general-purpose markup vocabulary (CDA narrative block), CLP-ML documents were easier to edit and read, less complex structurally, and simpler to traverse for searching and retrieval. In combination with appropriate software, CLP-ML is designed to support electronic authoring, reviewing, distributing, and searching of clinical laboratory procedures from a central repository, decreasing procedure maintenance effort and increasing the utility of procedure information. A standard electronic procedure format could also allow laboratories and vendors to share procedures and procedure layouts, minimizing duplicative word processor editing. Our results suggest that laboratory-specific markup such as CLP-ML will provide greater benefit for such systems than generic markup.
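
    The 124-tag CLP-ML vocabulary itself is not reproduced in this record, so the Python sketch below only conveys the idea of procedure-specific structure as opposed to generic narrative markup; the tag names are invented for illustration and are not taken from the actual CLP-ML DTD.

        import xml.etree.ElementTree as ET

        # Invented, CLP-ML-flavoured tags; the real vocabulary is defined by the DTD.
        procedure_xml = """
        <procedure name="Serum Glucose, Hexokinase Method" section="Chemistry">
          <principle>Glucose is phosphorylated by hexokinase; NADPH is measured at 340 nm.</principle>
          <specimen type="serum" minimumVolumeML="0.5"/>
          <reagent name="Hexokinase reagent" storage="2-8 C"/>
          <step number="1">Calibrate the analyzer with the two-level calibrator.</step>
          <step number="2">Run quality control material before patient samples.</step>
        </procedure>
        """

        root = ET.fromstring(procedure_xml)

        # Because the structure is laboratory-specific, search and retrieval reduce
        # to element lookups instead of scanning word-processor text.
        print(root.get("name"))
        for step in root.findall("step"):
            print("Step", step.get("number"), "-", step.text)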

  9. XML: An Introduction.

    ERIC Educational Resources Information Center

    Lewis, John D.

    1998-01-01

    Describes XML (extensible markup language), a new language classification submitted to the World Wide Web Consortium that is defined in terms of both SGML (Standard Generalized Markup Language) and HTML (Hypertext Markup Language), specifically designed for the Internet. Limitations of PDF (Portable Document Format) files for electronic journals…

  10. The Digital electronic Guideline Library (DeGeL): a hybrid framework for representation and use of clinical guidelines.

    PubMed

    Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya

    2004-01-01

    We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semi-formal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.

  11. Telemetry Attributes Transfer Standard (TMATS) Handbook

    DTIC Science & Technology

    2015-07-01

    Front-matter excerpt: table of contents and acronym list (TG, Telemetry Group; TM, telemetry; TMATS, Telemetry Attributes Transfer Standard; XML, eXtensible Markup Language), including an appendix on Extensible Markup Language TMATS differences.

  12. XML Content Finally Arrives on the Web!

    ERIC Educational Resources Information Center

    Funke, Susan

    1998-01-01

    Explains extensible markup language (XML) and how it differs from hypertext markup language (HTML) and standard generalized markup language (SGML). Highlights include features of XML, including better formatting of documents, better searching capabilities, multiple uses for hyperlinking, and an increase in Web applications; Web browsers; and what…

  13. HyperGLOB/Freedom: Preparing Student Designers for a New Media.

    ERIC Educational Resources Information Center

    Slawson, Brian

    The HyperGLOB project introduced university-level graphic design students to interactive multimedia. This technology involves using the personal computer to display and manipulate a variety of electronic media simultaneously (combining elements of text and speech, music and sound, still images, motion video, and animated graphics) and allows…

  14. 48 CFR 552.243-71 - Equitable Adjustments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Markups. (3) Change to the time for completion specified in the contract. (e) Direct costs. The Contractor... contract regarding the Contractor's project schedule. (h) Markups. For each firm whose direct costs are... applicable, a bond rate and insurance rate. Markups shall be determined and applied as follows: (1) Overhead...

  15. Competition in the economic crisis: Analysis of procurement auctions.

    PubMed

    Gugler, Klaus; Weichselbaumer, Michael; Zulehner, Christine

    2015-01-01

    We study the effects of the recent economic crisis on firms' bidding behavior and markups in sealed bid auctions. Using data from Austrian construction procurements, we estimate bidders' construction costs within a private value auction model. We find that markups of all bids submitted decrease by 1.5 percentage points in the recent economic crisis, and markups of winning bids decrease by 3.3 percentage points. We also find that without the government stimulus package this decrease would have been larger. These two pieces of evidence point to pro-cyclical markups.

  16. The caBIG annotation and image Markup project.

    PubMed

    Channin, David S; Mongkolwat, Pattanasak; Kleper, Vladimir; Sepukar, Kastubh; Rubin, Daniel L

    2010-04-01

    Image annotation and markup are at the core of medical interpretation in both the clinical and the research setting. Digital medical images are managed with the DICOM standard format. While DICOM contains a large amount of meta-data about by whom, where, and how the image was acquired, DICOM says little about the content or meaning of the pixel data. An image annotation is the explanatory or descriptive information about the pixel data of an image that is generated by a human or machine observer. An image markup is the set of graphical symbols placed over the image to depict an annotation. While DICOM is the standard for medical image acquisition, manipulation, transmission, storage, and display, there are no standards for image annotation and markup. Many systems expect annotation to be reported verbally, while markups are stored in graphical overlays or proprietary formats. This makes it difficult to extract and compute with both of them. The goal of the Annotation and Image Markup (AIM) project is to develop a mechanism for modeling, capturing, and serializing image annotation and markup data that can be adopted as a standard by the medical imaging community. The AIM project produces both human- and machine-readable artifacts. This paper describes the AIM information model, schemas, software libraries, and tools so as to prepare researchers and developers for their use of AIM.
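
    The record distinguishes the annotation (the meaning attached to the pixels) from the markup (the graphics drawn over them). The Python sketch below serializes both into one XML artifact to make that split concrete; the element names are invented for illustration and are far simpler than the actual AIM information model and schemas.

        import xml.etree.ElementTree as ET

        # Invented element names; the real AIM model defines its own schema.
        aim = ET.Element("ImageAnnotation", uid="1.2.3.4")

        # The annotation: descriptive information about the pixel data.
        ET.SubElement(aim, "Finding", codeMeaning="Pulmonary nodule", observer="reader01")

        # The markup: the graphical symbol placed over the image.
        markup = ET.SubElement(aim, "Markup", shape="circle")
        ET.SubElement(markup, "Center", x="212.5", y="318.0")
        ET.SubElement(markup, "RadiusMM").text = "4.2"

        # Reference back to the DICOM object that holds the pixels.
        ET.SubElement(aim, "ImageReference", sopInstanceUID="1.2.840.99999.1")

        print(ET.tostring(aim, encoding="unicode"))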

  17. Today's Authoring Tools for Tomorrow's Semantic Web.

    ERIC Educational Resources Information Center

    Dingley, Andy; Shabajee, Paul

    This paper reports on the development of a prototype authoring tool developed as part of on-going research around the needs of the ARKive project. The project holds text, rich-media and descriptions of factual statements about bio-diversity and conservation information. A key user community is that of school age children, requiring the mark-up of…

  18. Data Display Markup Language (DDML) Handbook

    DTIC Science & Technology

    2017-01-31

    Moreover, the tendency of T&E is towards a plug-and-play-like data acquisition system that requires standard languages and modules for data displays.

  19. A Conversion Tool for Mathematical Expressions in Web XML Files.

    ERIC Educational Resources Information Center

    Ohtake, Nobuyuki; Kanahori, Toshihiro

    2003-01-01

    This article discusses the conversion of mathematical equations into Extensible Markup Language (XML) on the World Wide Web for individuals with visual impairments. A program is described that converts the presentation markup style to the content markup style in MathML to allow browsers to render mathematical expressions without other programs.…

  20. Answer Markup Algorithms for Southeast Asian Languages.

    ERIC Educational Resources Information Center

    Henry, George M.

    1991-01-01

    Typical markup methods for providing feedback to foreign language learners are not applicable to languages not written in a strictly linear fashion. A modification of Hart's edit markup software is described, along with a second variation based on a simple edit distance algorithm adapted to a general Southeast Asian font system. (10 references)…

  1. Chemical Markup, XML and the World-Wide Web. 8. Polymer Markup Language.

    PubMed

    Adams, Nico; Winter, Jerry; Murray-Rust, Peter; Rzepa, Henry S

    2008-11-01

    Polymers are among the most important classes of materials but are only inadequately supported by modern informatics. The paper discusses the reasons why polymer informatics is considerably more challenging than small molecule informatics and develops a vision for the computer-aided design of polymers, based on modern semantic web technologies. The paper then discusses the development of Polymer Markup Language (PML). PML is an extensible language, designed to support the (structural) representation of polymers and polymer-related information. PML closely interoperates with Chemical Markup Language (CML) and overcomes a number of the previously identified challenges.

  2. Data on the interexaminer variation of minutia markup on latent fingerprints.

    PubMed

    Ulery, Bradford T; Hicklin, R Austin; Roberts, Maria Antonia; Buscaglia, JoAnn

    2016-09-01

    The data in this article supports the research paper entitled "Interexaminer variation of minutia markup on latent fingerprints" [1]. The data in this article describes the variability in minutia markup during both analysis of the latents and comparison between latents and exemplars. The data was collected in the "White Box Latent Print Examiner Study," in which each of 170 volunteer latent print examiners provided detailed markup documenting their examinations of latent-exemplar pairs of prints randomly assigned from a pool of 320 pairs. Each examiner examined 22 latent-exemplar pairs; an average of 12 examiners marked each latent.

  3. Development of Human Face Literature Database Using Text Mining Approach: Phase I.

    PubMed

    Kaur, Paramjit; Krishan, Kewal; Sharma, Suresh K

    2018-06-01

    The face is an important part of the human body through which an individual communicates in society; its importance is highlighted by the fact that a person cannot readily function in the world without one. The number of experiments being performed and the number of research papers being published in the domain of the human face have surged in the past few decades. Several scientific disciplines conduct research on the human face, including Medical Science, Anthropology, Information Technology (Biometrics, Robotics, Artificial Intelligence, etc.), Psychology, Forensic Science, and Neuroscience. This highlights the need to collect and manage the data concerning the human face so that free public access to it can be provided to the scientific community. This can be attained by developing databases and tools on the human face using a bioinformatics approach. The current research emphasizes creating a database of the literature on the human face. The database can be accessed on the basis of specific keywords, journal name, date of publication, author's name, etc. The collected research papers are stored in the form of a database. Hence, the database will be beneficial to the research community, as comprehensive information dedicated to the human face can be found in one place. Information related to facial morphologic features, facial disorders, facial asymmetry, facial abnormalities, and many other parameters can be extracted from this database. The front end has been developed using Hyper Text Mark-up Language and Cascading Style Sheets. The back end has been developed using the hypertext preprocessor (PHP). JavaScript is used as the scripting language. MySQL (Structured Query Language) is used for database development, as it is the most widely used Relational Database Management System. XAMPP (X (cross platform), Apache, MySQL, PHP, Perl), an open-source web application software stack, has been used as the server. The database is still in the developmental phase; the current paper discusses the initial steps of its creation and the work done to date.
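
    The stack described is HTML/CSS on the front end with PHP and MySQL behind it. As a language-neutral sketch of the retrieval idea (searching stored papers by keyword, journal, or author), here is a minimal Python stand-in in which SQLite replaces MySQL; the table and column names are invented for illustration.

        import sqlite3

        # SQLite stands in for the MySQL back end; the schema is invented.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE face_papers (id INTEGER PRIMARY KEY, title TEXT, "
                   "authors TEXT, journal TEXT, pub_date TEXT, keywords TEXT)")
        db.execute("INSERT INTO face_papers VALUES (1, 'Facial asymmetry and identification', "
                   "'Doe J; Roe A', 'J Forensic Sci', '2017-03-01', 'facial asymmetry; forensic')")

        def search(keyword=None, journal=None, author=None):
            """Search the literature table on any combination of the supported fields."""
            clauses, params = [], []
            if keyword:
                clauses.append("keywords LIKE ?")
                params.append("%" + keyword + "%")
            if journal:
                clauses.append("journal = ?")
                params.append(journal)
            if author:
                clauses.append("authors LIKE ?")
                params.append("%" + author + "%")
            where = " AND ".join(clauses) or "1=1"
            sql = "SELECT title, journal, pub_date FROM face_papers WHERE " + where
            return db.execute(sql, params).fetchall()

        print(search(keyword="asymmetry"))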

  4. XML Style Guide

    DTIC Science & Technology

    2015-07-01

    Front-matter excerpt: acronym list (ASCII, American Standard Code for Information Interchange; DAU, data acquisition unit; DDML, data display markup language; URI, uniform resource identifier; W3C, World Wide Web Consortium; XML, extensible markup language; XSD, XML schema definition) and the opening of the introduction, which states that the next generation of telemetry systems will rely heavily on extensible markup language (XML).

  5. Gopher Is No Longer Just a Rodent: Using Gopher and World Wide Web in Composition Studies.

    ERIC Educational Resources Information Center

    Krause, Steve

    Gopher and World Wide Web (WWW) are two useful Internet technologies for the composition and rhetoric classroom. Gopher software makes available a wide variety of text-based information in the Internet. A Gopher at Bowling Green State University offers many types of information. The World Wide Web, using a fairly simple markup language, is also…

  6. Making the Most of Scarce Resources: A Small College Language Department's Experience with HyperCard.

    ERIC Educational Resources Information Center

    Donaldson, Randall P.; Morgan, Leslie Zarker

    1994-01-01

    The development and use of two programs developed at Loyola College using HyperCard are described. One is a reading comprehension program of a Renaissance Italian text; the other, in German, uses scanned-in maps of the various stages of German political development to illustrate German history. (21 references) (LB)

  7. DDML Schema Validation

    DTIC Science & Technology

    2016-02-08

    Front-matter excerpt: acronym list (DDML, Data Display Markup Language; HUD, heads-up display; IRIG, Inter-Range Instrumentation Group; RCC, Range Commanders Council; SVG, Scalable Vector Graphics; T&E, test and evaluation; TMATS, Telemetry Attributes Transfer Standard; XML, eXtensible Markup Language) and the opening of the introduction to the DDML schema validation document (RCC 126-16, February 2016).

  8. 77 FR 47896 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ... that it is appropriate to charge a markup with respect to directed orders to reflect the costs of offering routing services and the value of such services. Notably, in all instances NASDAQ charges a markup... that it does not currently charge a markup with respect to non-directed orders that are routed to PSX...

  9. Application of whole slide image markup and annotation for pathologist knowledge capture.

    PubMed

    Campbell, Walter S; Foster, Kirk W; Hinrichs, Steven H

    2013-01-01

    The ability to transfer image markup and annotation data from one scanned image of a slide to a newly acquired image of the same slide within a single vendor platform was investigated. The goal was to study the ability to use image markup and annotation data files as a mechanism to capture and retain pathologist knowledge without retaining the entire whole slide image (WSI) file. Accepted mathematical principles were investigated as a method to overcome variations in scans of the same glass slide and to accurately associate image markup and annotation data across different WSI of the same glass slide. Trilateration was used to link fixed points within the image and slide to the placement of markups and annotations of the image in a metadata file. Variation in markup and annotation placement between WSI of the same glass slide was reduced from over 80 μ to less than 4 μ in the x-axis and from 17 μ to 6 μ in the y-axis (P < 0.025). This methodology allows for the creation of a highly reproducible image library of histopathology images and interpretations for educational and research use.
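
    Trilateration recovers a point's coordinates from its distances to fixed reference points. The Python sketch below solves the standard two-dimensional case by subtracting the circle equations pairwise to obtain a small linear system; the reference coordinates and distances are made-up numbers for illustration, not data from the study.

        def trilaterate(p1, p2, p3, d1, d2, d3):
            """Recover (x, y) from distances to three non-collinear fixed points."""
            (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
            # Subtracting the circle equations pairwise gives a 2x2 linear system.
            a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
            c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
            a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
            c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
            det = a1 * b2 - a2 * b1          # zero if the reference points are collinear
            return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

        # Made-up fixed points (e.g. landmarks on the slide) and distances to a
        # marked feature whose true position is (40.0, 25.0).
        p1, p2, p3 = (0.0, 0.0), (100.0, 0.0), (0.0, 80.0)
        d1 = (40.0 ** 2 + 25.0 ** 2) ** 0.5
        d2 = (60.0 ** 2 + 25.0 ** 2) ** 0.5
        d3 = (40.0 ** 2 + 55.0 ** 2) ** 0.5
        print(trilaterate(p1, p2, p3, d1, d2, d3))   # approximately (40.0, 25.0)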

  10. Application of whole slide image markup and annotation for pathologist knowledge capture

    PubMed Central

    Campbell, Walter S.; Foster, Kirk W.; Hinrichs, Steven H.

    2013-01-01

    Objective: The ability to transfer image markup and annotation data from one scanned image of a slide to a newly acquired image of the same slide within a single vendor platform was investigated. The goal was to study the ability to use image markup and annotation data files as a mechanism to capture and retain pathologist knowledge without retaining the entire whole slide image (WSI) file. Methods: Accepted mathematical principles were investigated as a method to overcome variations in scans of the same glass slide and to accurately associate image markup and annotation data across different WSI of the same glass slide. Trilateration was used to link fixed points within the image and slide to the placement of markups and annotations of the image in a metadata file. Results: Variation in markup and annotation placement between WSI of the same glass slide was reduced from over 80 μ to less than 4 μ in the x-axis and from 17 μ to 6 μ in the y-axis (P < 0.025). Conclusion: This methodology allows for the creation of a highly reproducible image library of histopathology images and interpretations for educational and research use. PMID:23599902

  11. TMATS/ IHAL/ DDML Schema Validation

    DTIC Science & Technology

    2017-02-01

    The task was to create a method for performing IRIG eXtensible Markup Language (XML) schema validation, as opposed to XML instance document validation. The remainder of the excerpt is front matter: the document's acronym list and page headers (RCC 126-17, February 2017).

  12. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and the process of the harmonization process between the eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.

  13. Managing and Querying Image Annotation and Markup in XML.

    PubMed

    Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel

    2010-01-01

    Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standard based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through native extension of XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid.
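
    The paper's approach uses a native XML database with extensions to the XQuery language; the Python sketch below is only a much simpler stand-in, showing how hierarchical AIM-style annotation documents can be queried with XPath expressions. The document structure is invented for illustration rather than taken from the AIM schema.

        import xml.etree.ElementTree as ET

        # Invented AIM-flavoured document; the real AIM XML schema is richer.
        xml_doc = """
        <annotations>
          <ImageAnnotation id="a1">
            <Finding codeMeaning="Pulmonary nodule"/>
            <Measurement name="diameter" value="8.4" unit="mm"/>
          </ImageAnnotation>
          <ImageAnnotation id="a2">
            <Finding codeMeaning="Pleural effusion"/>
          </ImageAnnotation>
        </annotations>
        """

        root = ET.fromstring(xml_doc)

        # XPath-style query (a stand-in for the richer XQuery used in the paper):
        # find every annotation that carries a measurement.
        for ann in root.findall(".//ImageAnnotation[Measurement]"):
            m = ann.find("Measurement")
            print(ann.get("id"), m.get("name"), m.get("value"), m.get("unit"))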

  14. Development of clinical contents model markup language for electronic health records.

    PubMed

    Yun, Ji-Hyun; Ahn, Sun-Ju; Kim, Yoon

    2012-09-01

    To develop dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Based on analysis of the structure and characteristics of CCM in the clinical domain, we designed extensible markup language (XML) based CCM markup language (CCML) schema manually. CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied for existing electronic health record systems.

  15. Managing and Querying Image Annotation and Markup in XML

    PubMed Central

    Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel

    2010-01-01

    Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standard based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through native extension of XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid. PMID:21218167

  16. Interexaminer variation of minutia markup on latent fingerprints.

    PubMed

    Ulery, Bradford T; Hicklin, R Austin; Roberts, Maria Antonia; Buscaglia, JoAnn

    2016-07-01

    Latent print examiners often differ in the number of minutiae they mark during analysis of a latent, and also during comparison of a latent with an exemplar. Differences in minutia counts understate interexaminer variability: examiners' markups may have similar minutia counts but differ greatly in which specific minutiae were marked. We assessed variability in minutia markup among 170 volunteer latent print examiners. Each provided detailed markup documenting their examinations of 22 latent-exemplar pairs of prints randomly assigned from a pool of 320 pairs. An average of 12 examiners marked each latent. The primary factors associated with minutia reproducibility were clarity, which regions of the prints examiners chose to mark, and agreement on value or comparison determinations. In clear areas (where the examiner was "certain of the location, presence, and absence of all minutiae"), median reproducibility was 82%; in unclear areas, median reproducibility was 46%. Differing interpretations regarding which regions should be marked (e.g., when there is ambiguity in the continuity of a print) contributed to variability in minutia markup: especially in unclear areas, marked minutiae were often far from the nearest minutia marked by a majority of examiners. Low reproducibility was also associated with differences in value or comparison determinations. Lack of standardization in minutia markup and unfamiliarity with test procedures presumably contribute to the variability we observed. We have identified factors accounting for interexaminer variability; implementing standards for detailed markup as part of documentation and focusing future training efforts on these factors may help to facilitate transparency and reduce subjectivity in the examination process. Published by Elsevier Ireland Ltd.

  17. Looking Tasks Online: Utilizing Webcams to Collect Video Data from Home

    PubMed Central

    Semmelmann, Kilian; Hönekopp, Astrid; Weigelt, Sarah

    2017-01-01

    Online experimentation is emerging as a new methodology within classical data acquisition in psychology. It allows for easy, fast, broad, and cheap data conduction from the comfort of people’s homes. To add another method to the array of available tools, here we used recent developments in web technology to investigate the technical feasibility of online HyperText Markup Language-5/JavaScript-based video data recording. We employed a preferential looking task with children between 4 and 24 months. Parents and their children participated from home through a three-stage process: First, interested adults registered and took pictures through a webcam-based photo application. In the second step, we edited the pictures and integrated them into the design. Lastly, participants returned to the website and the video data acquisition took place through their webcam. In sum, we were able to create and employ the video recording application with participants as young as 4 months old. Quality-wise, no participant had to be removed due to the framerate or quality of videos and only 7% of data was excluded due to behavioral factors (lack of concentration). Results-wise, interrater reliability of rated looking side (left/right) showed a high agreement between raters, Fleiss’ Kappa, κ = 0.97, which can be translated to sufficient data quality for further analyses. With regard to on-/off-screen attention attribution, we found that children lost interest after about 10 s after trial onset using a static image presentation or 60 s total experimental time. Taken together, we were able to show that online video data recording is possible and viable for developmental psychology and beyond. PMID:28955284

  18. Looking Tasks Online: Utilizing Webcams to Collect Video Data from Home.

    PubMed

    Semmelmann, Kilian; Hönekopp, Astrid; Weigelt, Sarah

    2017-01-01

    Online experimentation is emerging as a new methodology within classical data acquisition in psychology. It allows for easy, fast, broad, and cheap data conduction from the comfort of people's homes. To add another method to the array of available tools, here we used recent developments in web technology to investigate the technical feasibility of online HyperText Markup Language-5/JavaScript-based video data recording. We employed a preferential looking task with children between 4 and 24 months. Parents and their children participated from home through a three-stage process: First, interested adults registered and took pictures through a webcam-based photo application. In the second step, we edited the pictures and integrated them into the design. Lastly, participants returned to the website and the video data acquisition took place through their webcam. In sum, we were able to create and employ the video recording application with participants as young as 4 months old. Quality-wise, no participant had to be removed due to the framerate or quality of videos and only 7% of data was excluded due to behavioral factors (lack of concentration). Results-wise, interrater reliability of rated looking side (left/right) showed a high agreement between raters, Fleiss' Kappa, κ = 0.97, which can be translated to sufficient data quality for further analyses. With regard to on-/off-screen attention attribution, we found that children lost interest after about 10 s after trial onset using a static image presentation or 60 s total experimental time. Taken together, we were able to show that online video data recording is possible and viable for developmental psychology and beyond.

  19. WebGIS Platform Adressed to Forest Fire Management Methodologies

    NASA Astrophysics Data System (ADS)

    André Ramos-Simões, Nuno; Neto Paixão, Helena Maria; Granja Martins, Fernando Miguel; Pedras, Celestina; Lança, Rui; Silva, Elisa; Jordán, António; Zavala, Lorena; Soares, Cristina

    2015-04-01

    Forest fires are among the natural disasters that cause the greatest damage to nature, as well as high material costs and, sometimes, significant losses of human life. In the summer season, when high temperatures are attained, fire may progress rapidly and destroy vast areas of forest as well as rural and urban areas. Forest fires affect forest species, forest composition and structure, soil properties and soil capacity for nutrient retention. In order to minimize the negative impact of forest fires on the environment, many studies have been developed, e.g. Jordán et al (2009), Cerdà & Jordán (2010), and Gonçalves & Vieira (2013). Nowadays, Remote Sensing (RS) and Geographic Information System (GIS) technologies are used as support tools in fire management decisions, namely during the fire, but also before and after. This study presents the development of a user-friendly WebGIS dedicated to sharing data and maps and providing updated information on forest fire management for stakeholders in the Iberian Peninsula. The WebGIS platform was developed with ArcGIS Online, ArcGIS for Desktop, HyperText Markup Language (HTML), and JavaScript. The platform has a database that includes spatial and alphanumeric information such as origin, burned areas, vegetation change over time, terrain natural slope, land use, soil erosion and fire-related hazards. The same database also contains the following relevant information: water sources, forest tracks and traffic ways, lookout posts and urban areas. The aim of this study is to provide the authorities with a tool to assess risk areas and manage forest fire hazards more efficiently, giving more support to their decisions and helping the populations when facing this kind of phenomenon.

  20. Education Office Application Design and Development

    NASA Technical Reports Server (NTRS)

    Johnson, Jamie E.

    2013-01-01

    The content of this project focuses on designing and implementing a new prototype website for the Kennedy Intern Tracking System (KITS). The goal of the new website is to allow the user to search for interns based on several different categories and fields, making it easier to find a count of interns matching a set of criteria. The KSC Office of Education, whose job is to recruit interns year-round, comprises the primary users of KITS. As a secondary goal, each user will be able to generate a report of their searches as a portable document format (PDF) file. The results of each search will be limited to a set number per page. This site will be used for Kennedy Space Center internal purposes only. After the implementations are done, a visual walk-through using screen shots will guide the users through all of the different scenarios that are likely to occur when navigating the site. In addition, a demo of the site will be presented to the KSC Office of Education. JavaScript and jQuery will provide the functionality of the implementation. Hyper Text Markup Language will be used to form the foundation for the body structure of the website. Ruby will be the programming language used to elevate the prototype to a dynamic website and enable the programmer to finish within an efficient time frame. Cascading Style Sheets will be the language used for design and styling purposes. Rails is the framework that the new website will be built upon. By default, the database will be managed by SQLite. All users will need to be granted special privileges in order to use the site.

  1. A Cloud Architecture for Teleradiology-as-a-Service.

    PubMed

    Melício Monteiro, Eriksson J; Costa, Carlos; Oliveira, José L

    2016-05-17

    Telemedicine has been promoted by healthcare professionals as an efficient way to obtain remote assistance from specialised centres, to get a second opinion about complex diagnosis or even to share knowledge among practitioners. The current economic restrictions in many countries are increasing the demand for these solutions even more, in order to optimize processes and reduce costs. However, despite some technological solutions already in place, their adoption has been hindered by the lack of usability, especially in the set-up process. In this article we propose a telemedicine platform that relies on a cloud computing infrastructure and social media principles to simplify the creation of dynamic user-based groups, opening up opportunities for the establishment of teleradiology trust domains. The collaborative platform is provided as a Software-as-a-Service solution, supporting real time and asynchronous collaboration between users. To evaluate the solution, we have deployed the platform in a private cloud infrastructure. The system is made up of three main components - the collaborative framework, the Medical Management Information System (MMIS) and the HTML5 (Hyper Text Markup Language) Web client application - connected by a message-oriented middleware. The solution allows physicians to create easily dynamic network groups for synchronous or asynchronous cooperation. The network created improves dataflow between colleagues and also knowledge sharing and cooperation through social media tools. The platform was implemented and it has already been used in two distinct scenarios: teaching of radiology and tele-reporting. Collaborative systems can simplify the establishment of telemedicine expert groups with tools that enable physicians to improve their clinical practice. Streamlining the usage of this kind of systems through the adoption of Web technologies that are common in social media will increase the quality of current solutions, facilitating the sharing of clinical information, medical imaging studies and patient diagnostics among collaborators.

  2. Development of Clinical Contents Model Markup Language for Electronic Health Records

    PubMed Central

    Yun, Ji-Hyun; Kim, Yoon

    2012-01-01

    Objectives To develop dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Methods Based on analysis of the structure and characteristics of CCM in the clinical domain, we designed extensible markup language (XML) based CCM markup language (CCML) schema manually. Results CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. Conclusions CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied for existing electronic health record systems. PMID:23115739

  3. Variation in Emergency Department vs Internal Medicine Excess Charges in the United States.

    PubMed

    Xu, Tim; Park, Angela; Bai, Ge; Joo, Sarah; Hutfless, Susan M; Mehta, Ambar; Anderson, Gerard F; Makary, Martin A

    2017-08-01

    Uninsured and insured but out-of-network emergency department (ED) patients are often billed hospital chargemaster prices, which exceed amounts typically paid by insurers. To examine the variation in excess charges for services provided by emergency medicine and internal medicine physicians. Retrospective analysis was conducted of professional fee payment claims made by the Centers for Medicare & Medicaid Services for all services provided to Medicare Part B fee-for-service beneficiaries in calendar year 2013. Data analysis was conducted from January 1 to July 31, 2016. Markup ratios for ED and internal medicine professional services, defined as the charges submitted by the hospital divided by the Medicare allowable amount. Our analysis included 12 337 emergency medicine physicians from 2707 hospitals and 57 607 internal medicine physicians from 3669 hospitals in all 50 states. Services provided by emergency medicine physicians had an overall markup ratio of 4.4 (340% excess charges), which was greater than the markup ratio of 2.1 (110% excess charges) for all services performed by internal medicine physicians. Markup ratios for all ED services ranged by hospital from 1.0 to 12.6 (median, 4.2; interquartile range [IQR], 3.3-5.8); markup ratios for all internal medicine services ranged by hospital from 1.0 to 14.1 (median, 2.0; IQR, 1.7-2.5). The median markup ratio by hospital for ED evaluation and management procedure codes varied between 4.0 and 5.0. Among the most common ED services, laceration repair had the highest median markup ratio (7.0); emergency medicine physician review of a head computed tomographic scan had the greatest interhospital variation (range, 1.6-27.7). Across hospitals, markups in the ED were often substantially higher than those in the internal medicine department for the same services. Higher ED markup ratios were associated with hospital for-profit ownership (median, 5.7; IQR, 4.0-7.1), a greater percentage of uninsured patients seen (median, 5.0; IQR, 3.5-6.7 for ≥20% uninsured), and location (median, 5.3; IQR, 3.8-6.8 for the southeastern United States). Across hospitals, there is wide variation in excess charges on ED services, which are often priced higher than internal medicine services. Our results inform policy efforts to protect uninsured and out-of-network patients from highly variable pricing.
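
    Since the markup ratio is defined here as the charge submitted by the hospital divided by the Medicare allowable amount, a ratio of 4.4 corresponds to charges 340% above the allowable amount. The short Python check below works through that arithmetic with a hypothetical $100 allowable service.

        def markup_ratio(charge, medicare_allowable):
            """Charge submitted by the hospital divided by the Medicare allowable amount."""
            return charge / medicare_allowable

        def excess_charge_pct(ratio):
            """Excess charges above the allowable amount, expressed as a percentage."""
            return (ratio - 1.0) * 100.0

        # Hypothetical service: $100 Medicare allowable billed at $440.
        ratio = markup_ratio(440.0, 100.0)
        print(ratio)                     # 4.4
        print(excess_charge_pct(ratio))  # 340.0 -> matches "4.4 (340% excess charges)"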

  4. A quality assessment tool for markup-based clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical GLs (guidelines). Using this graphical tool, the expert physician and knowledge engineer collaborate to score, on a pre-defined scoring scale, each of the knowledge roles of the mark-ups, comparing them to a gold standard. The tool enables different users at different locations to score the mark-ups simultaneously.

  5. Changes in latent fingerprint examiners' markup between analysis and comparison.

    PubMed

    Ulery, Bradford T; Hicklin, R Austin; Roberts, Maria Antonia; Buscaglia, JoAnn

    2015-02-01

    After the initial analysis of a latent print, an examiner will sometimes revise the assessment during comparison with an exemplar. Changes between analysis and comparison may indicate that the initial analysis of the latent was inadequate, or that confirmation bias may have affected the comparison. 170 volunteer latent print examiners, each randomly assigned 22 pairs of prints from a pool of 320 total pairs, provided detailed markup documenting their interpretations of the prints and the bases for their comparison conclusions. We describe changes in value assessments and markup of features and clarity. When examiners individualized, they almost always added or deleted minutiae (90.3% of individualizations); every examiner revised at least some markups. For inconclusive and exclusion determinations, changes were less common, and features were added more frequently when the image pair was mated (same source). Even when individualizations were based on eight or fewer corresponding minutiae, in most cases some of those minutiae had been added during comparison. One erroneous individualization was observed: the markup changes were notably extreme, and almost all of the corresponding minutiae had been added during comparison. Latents assessed to be of value for exclusion only (VEO) during analysis were often individualized when compared to a mated exemplar (26%); in our previous work, where examiners were not required to provide markup of features, VEO individualizations were much less common (1.8%). Published by Elsevier Ireland Ltd.

  6. 17 CFR 240.15c2-7 - Identification of quotations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., guarantee of profit, guarantee against loss, commission, markup, markdown, indication of interest and... account, guarantee of profit, guarantee against loss, commission, markup, markdown, indication of interest...

  7. XML Based Markup Languages for Specific Domains

    NASA Astrophysics Data System (ADS)

    Varde, Aparna; Rundensteiner, Elke; Fahrenholz, Sally

    A challenging area in web based support systems is the study of human activities in connection with the web, especially with reference to certain domains. This includes capturing human reasoning in information retrieval, facilitating the exchange of domain-specific knowledge through a common platform and developing tools for the analysis of data on the web from a domain expert's angle. Among the techniques and standards related to such work, we have XML, the eXtensible Markup Language. This serves as a medium of communication for storing and publishing textual, numeric and other forms of data seamlessly. XML tag sets are such that they preserve semantics and simplify the understanding of stored information by users. Often domain-specific markup languages are designed using XML, with a user-centric perspective. Standardization bodies and research communities may extend these to include additional semantics of areas within and related to the domain. This chapter outlines the issues to be considered in developing domain-specific markup languages: the motivation for development, the semantic considerations, the syntactic constraints and other relevant aspects, especially taking into account human factors. Illustrating examples are provided from domains such as Medicine, Finance and Materials Science. Particular emphasis in these examples is on the Materials Markup Language MatML and the semantics of one of its areas, namely, the Heat Treating of Materials. The focus of this chapter, however, is not the design of one particular language but rather the generic issues concerning the development of domain-specific markup languages.
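
    As a concrete illustration of the kind of domain-specific tagging discussed above, the short sketch below emits a small heat-treating record as XML. The element and attribute names are invented for illustration; they are not taken from the actual MatML schema.

        # Sketch of domain-specific XML markup; element names are illustrative, not real MatML.
        import xml.etree.ElementTree as ET

        record = ET.Element("HeatTreatment", material="AISI 4140")  # hypothetical tag names
        step = ET.SubElement(record, "Step", name="austenitize")
        ET.SubElement(step, "Temperature", units="C").text = "845"
        ET.SubElement(step, "HoldTime", units="min").text = "30"
        quench = ET.SubElement(record, "Step", name="quench")
        ET.SubElement(quench, "Medium").text = "oil"

        print(ET.tostring(record, encoding="unicode"))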

  8. Impact of the zero-markup drug policy on hospitalisation expenditure in western rural China: an interrupted time series analysis.

    PubMed

    Yang, Caijun; Shen, Qian; Cai, Wenfang; Zhu, Wenwen; Li, Zongjie; Wu, Lina; Fang, Yu

    2017-02-01

    To assess the long-term effects of the introduction of China's zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditures after reimbursement. An interrupted time series was used to evaluate the impact of the zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditure after reimbursement at primary health institutions in Fufeng County of Shaanxi Province, western China. Two regression models were developed. Monthly average hospitalisation expenditure and monthly average hospitalisation expenditure after reimbursement in primary health institutions were analysed covering the period 2009 through to 2013. For the monthly average hospitalisation expenditure, the increasing trend was slowed down after the introduction of the zero-markup drug policy (coefficient = -16.49, P = 0.009). For the monthly average hospitalisation expenditure after reimbursement, the increasing trend was slowed down after the introduction of the zero-markup drug policy (coefficient = -10.84, P = 0.064), and a significant decrease in the intercept was noted after the second intervention of changes in reimbursement schemes of the new rural cooperative medical insurance (coefficient = -220.64, P < 0.001). A statistically significant absolute decrease in the level or trend of monthly average hospitalisation expenditure and monthly average hospitalisation expenditure after reimbursement was detected after the introduction of the zero-markup drug policy in western China. However, hospitalisation expenditure and hospitalisation expenditure after reimbursement were still increasing. More effective policies are needed to prevent these costs from continuing to rise. © 2016 John Wiley & Sons Ltd.

  9. Representing nested semantic information in a linear string of text using XML.

    PubMed

    Krauthammer, Michael; Johnson, Stephen B; Hripcsak, George; Campbell, David A; Friedman, Carol

    2002-01-01

    XML has been widely adopted as an important data interchange language. The structure of XML enables sharing of data elements with variable degrees of nesting as long as the elements are grouped in a strict tree-like fashion. This requirement potentially restricts the usefulness of XML for marking up written text, which often includes features that do not properly nest within other features. We encountered this problem while marking up medical text with structured semantic information from a Natural Language Processor. Traditional approaches to this problem separate the structured information from the actual text mark up. This paper introduces an alternative solution, which tightly integrates the semantic structure with the text. The resulting XML markup preserves the linearity of the medical texts and can therefore be easily expanded with additional types of information.
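
    The nesting problem described above can be made concrete with a small check on character-offset annotations: two spans can be expressed as nested inline elements only if one contains the other or they do not overlap at all. This is an illustrative sketch, not the paper's encoding.

        # Illustrative check of the XML nesting constraint on text annotations (not the paper's method).
        def can_nest(a, b):
            """True if spans a and b (start, end offsets) either nest or are disjoint."""
            (a1, a2), (b1, b2) = a, b
            disjoint = a2 <= b1 or b2 <= a1
            contained = (a1 <= b1 and b2 <= a2) or (b1 <= a1 and a2 <= b2)
            return disjoint or contained

        finding = (0, 25)    # e.g. a clinical finding span
        modifier = (15, 40)  # e.g. a modifier span that partially overlaps it
        print(can_nest(finding, modifier))  # False: these spans cannot be written as nested tags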

  10. Representing nested semantic information in a linear string of text using XML.

    PubMed Central

    Krauthammer, Michael; Johnson, Stephen B.; Hripcsak, George; Campbell, David A.; Friedman, Carol

    2002-01-01

    XML has been widely adopted as an important data interchange language. The structure of XML enables sharing of data elements with variable degrees of nesting as long as the elements are grouped in a strict tree-like fashion. This requirement potentially restricts the usefulness of XML for marking up written text, which often includes features that do not properly nest within other features. We encountered this problem while marking up medical text with structured semantic information from a Natural Language Processor. Traditional approaches to this problem separate the structured information from the actual text mark up. This paper introduces an alternative solution, which tightly integrates the semantic structure with the text. The resulting XML markup preserves the linearity of the medical texts and can therefore be easily expanded with additional types of information. PMID:12463856

  11. Variation in Emergency Department vs Internal Medicine Excess Charges in the United States

    PubMed Central

    Park, Angela; Bai, Ge; Joo, Sarah; Hutfless, Susan M.; Mehta, Ambar; Anderson, Gerard F.; Makary, Martin A.

    2017-01-01

    Importance Uninsured and insured but out-of-network emergency department (ED) patients are often billed hospital chargemaster prices, which exceed amounts typically paid by insurers. Objective To examine the variation in excess charges for services provided by emergency medicine and internal medicine physicians. Design, Setting, and Participants Retrospective analysis was conducted of professional fee payment claims made by the Centers for Medicare & Medicaid Services for all services provided to Medicare Part B fee-for-service beneficiaries in calendar year 2013. Data analysis was conducted from January 1 to July 31, 2016. Main Outcomes and Measures Markup ratios for ED and internal medicine professional services, defined as the charges submitted by the hospital divided by the Medicare allowable amount. Results Our analysis included 12 337 emergency medicine physicians from 2707 hospitals and 57 607 internal medicine physicians from 3669 hospitals in all 50 states. Services provided by emergency medicine physicians had an overall markup ratio of 4.4 (340% excess charges), which was greater than the markup ratio of 2.1 (110% excess charges) for all services performed by internal medicine physicians. Markup ratios for all ED services ranged by hospital from 1.0 to 12.6 (median, 4.2; interquartile range [IQR], 3.3-5.8); markup ratios for all internal medicine services ranged by hospital from 1.0 to 14.1 (median, 2.0; IQR, 1.7-2.5). The median markup ratio by hospital for ED evaluation and management procedure codes varied between 4.0 and 5.0. Among the most common ED services, laceration repair had the highest median markup ratio (7.0); emergency medicine physician review of a head computed tomographic scan had the greatest interhospital variation (range, 1.6-27.7). Across hospitals, markups in the ED were often substantially higher than those in the internal medicine department for the same services. Higher ED markup ratios were associated with hospital for-profit ownership (median, 5.7; IQR, 4.0-7.1), a greater percentage of uninsured patients seen (median, 5.0; IQR, 3.5-6.7 for ≥20% uninsured), and location (median, 5.3; IQR, 3.8-6.8 for the southeastern United States). Conclusions and Relevance Across hospitals, there is wide variation in excess charges on ED services, which are often priced higher than internal medicine services. Our results inform policy efforts to protect uninsured and out-of-network patients from highly variable pricing. PMID:28558093

  12. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  13. Prices and mark-ups on antimalarials: evidence from nationally representative studies in six malaria-endemic countries

    PubMed Central

    Palafox, Benjamin; Patouillard, Edith; Tougher, Sarah; Goodman, Catherine; Hanson, Kara; Kleinschmidt, Immo; Torres Rueda, Sergio; Kiefer, Sabine; O’Connell, Kate; Zinsou, Cyprien; Phok, Sochea; Akulayi, Louis; Arogundade, Ekundayo; Buyungo, Peter; Mpasela, Felton; Poyer, Stephen; Chavasse, Desmond

    2016-01-01

    The private for-profit sector is an important source of treatment for malaria. However, private patients face high prices for the recommended treatment for uncomplicated malaria, artemisinin combination therapies (ACTs), which makes them more likely to receive cheaper, less effective non-artemisinin therapies (nATs). This study seeks to better understand consumer antimalarial prices by documenting and exploring the pricing behaviour of retailers and wholesalers. Using data collected in 2009–10, we present survey estimates of antimalarial retail prices, and wholesale- and retail-level price mark-ups from six countries (Benin, Cambodia, the Democratic Republic of Congo, Nigeria, Uganda and Zambia), along with qualitative findings on factors affecting pricing decisions. Retail prices were lowest for nATs, followed by ACTs and artemisinin monotherapies (AMTs). Retailers applied the highest percentage mark-ups on nATs (range: 40% in Nigeria to 100% in Cambodia and Zambia), whereas mark-ups on ACTs (range: 22% in Nigeria to 71% in Zambia) and AMTs (range: 22% in Nigeria to 50% in Uganda) were similar in magnitude, but lower than those applied to nATs. Wholesale mark-ups were generally lower than those at retail level, and were similar across antimalarial categories in most countries. When setting prices wholesalers and retailers commonly considered supplier prices, prevailing market prices, product availability, product characteristics and the costs related to transporting goods, staff salaries and maintaining a property. Price discounts were regularly used to encourage sales and were sometimes used by wholesalers to reward long-term customers. Pricing constraints existed only in Benin where wholesaler and retailer mark-ups are regulated; however, unlicensed drug vendors based in open-air markets did not adhere to the pricing regime. These findings indicate that mark-ups on antimalarials are reasonable. Therefore, improving ACT affordability would be most readily achieved by interventions that reduce commodity prices for retailers, such as ACT subsidies, pooled purchasing mechanisms and cost-effective strategies to increase the distribution coverage area of wholesalers. PMID:25944705

  14. Improving Interoperability by Incorporating UnitsML Into Markup Languages

    PubMed Central

    Celebi, Ismet; Dragoset, Robert A.; Olsen, Karen J.; Schaefer, Reinhold; Kramer, Gary W.

    2010-01-01

    Maintaining the integrity of analytical data over time is a challenge. Years ago, data were recorded on paper that was pasted directly into a laboratory notebook. The digital age has made maintaining the integrity of data harder. Nowadays, digitized analytical data are often separated from information about how the sample was collected and prepared for analysis and how the data were acquired. The data are stored on digital media, while the related information about the data may be written in a paper notebook or stored separately in other digital files. Sometimes the connection between this “scientific meta-data” and the analytical data is lost, rendering the spectrum or chromatogram useless. We have been working with ASTM Subcommittee E13.15 on Analytical Data to create the Analytical Information Markup Language or AnIML—a new way to interchange and store spectroscopy and chromatography data based on XML (Extensible Markup Language). XML is a language for describing what data are by enclosing them in computer-useable tags. Recording the units associated with the analytical data and metadata is an essential issue for any data representation scheme that must be addressed by all domain-specific markup languages. As scientific markup languages proliferate, it is very desirable to have a single scheme for handling units to facilitate moving information between different data domains. At NIST, we have been developing a general markup language just for units that we call UnitsML. This presentation will describe how UnitsML is used and how it is being incorporated into AnIML. PMID:27134778

  15. Improving Interoperability by Incorporating UnitsML Into Markup Languages.

    PubMed

    Celebi, Ismet; Dragoset, Robert A; Olsen, Karen J; Schaefer, Reinhold; Kramer, Gary W

    2010-01-01

    Maintaining the integrity of analytical data over time is a challenge. Years ago, data were recorded on paper that was pasted directly into a laboratory notebook. The digital age has made maintaining the integrity of data harder. Nowadays, digitized analytical data are often separated from information about how the sample was collected and prepared for analysis and how the data were acquired. The data are stored on digital media, while the related information about the data may be written in a paper notebook or stored separately in other digital files. Sometimes the connection between this "scientific meta-data" and the analytical data is lost, rendering the spectrum or chromatogram useless. We have been working with ASTM Subcommittee E13.15 on Analytical Data to create the Analytical Information Markup Language or AnIML-a new way to interchange and store spectroscopy and chromatography data based on XML (Extensible Markup Language). XML is a language for describing what data are by enclosing them in computer-useable tags. Recording the units associated with the analytical data and metadata is an essential issue for any data representation scheme that must be addressed by all domain-specific markup languages. As scientific markup languages proliferate, it is very desirable to have a single scheme for handling units to facilitate moving information between different data domains. At NIST, we have been developing a general markup language just for units that we call UnitsML. This presentation will describe how UnitsML is used and how it is being incorporated into AnIML.

  16. Aligning Greek-English parallel texts

    NASA Astrophysics Data System (ADS)

    Galiotou, Eleni; Koronakis, George; Lazari, Vassiliki

    2015-02-01

    In this paper, we discuss issues concerning the alignment of parallel texts written in languages with different alphabets, based on an experiment of aligning texts from the proceedings of the European Parliament in Greek and English. First, we describe our implementation of the k-vec algorithm and its application to the bilingual corpus. Then the output of the algorithm is used as a starting point for an alignment procedure at the sentence level, which also takes into account mark-ups of meta-information. The results of the implementation are compared to those of applying the Church and Gale alignment algorithm to the Europarl corpus. The conclusions of this comparison can give useful insights into the efficiency of alignment algorithms when applied to this particular bilingual corpus.
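
    A heavily simplified sketch of the k-vec idea: each text is cut into K segments, every word type gets a binary occurrence vector over those segments, and word pairs whose vectors agree are proposed as candidate translation anchors. This illustrates the general technique only; it is not the authors' implementation, and the tiny example sentences are invented.

        # Simplified k-vec-style scoring (illustrative; not the authors' implementation).
        def occurrence_vector(words, k):
            """Binary vector per word type: does it occur in each of k equal segments?"""
            vectors = {}
            seg_len = max(1, len(words) // k)
            for i, w in enumerate(words):
                vectors.setdefault(w, [0] * k)[min(i // seg_len, k - 1)] = 1
            return vectors

        def dice(u, v):
            both = sum(1 for a, b in zip(u, v) if a and b)
            total = sum(u) + sum(v)
            return 2.0 * both / total if total else 0.0

        k = 3
        src = occurrence_vector("the committee approved the report".split(), k)
        tgt = occurrence_vector("η επιτροπή ενέκρινε την έκθεση".split(), k)
        # Score every source/target word pair; high scores are candidate translation pairs.
        pairs = sorted(((dice(src[s], tgt[t]), s, t) for s in src for t in tgt), reverse=True)
        print(pairs[:3])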

  17. ScienceCentral: open access full-text archive of scientific journals based on Journal Article Tag Suite regardless of their languages.

    PubMed

    Huh, Sun

    2013-01-01

    ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, extensible markup language files of all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step for free access or open access academic scientific journals of all languages to leap to the world, including scientific journals from Croatia.

  18. NAVAIR Portable Source Initiative (NPSI) Data Preparation Standard V2.2: NPSI DPS V2.2

    DTIC Science & Technology

    2012-05-22

    Keyhole Markup Language (file format); KMZ ... Keyhole Markup...required for the geo-specific texture may differ within the database depending on the mission parameters. When operating close to the ground (e.g...

  19. TumorML: Concept and requirements of an in silico cancer modelling markup language.

    PubMed

    Johnson, David; Cooper, Jonathan; McKeever, Steve

    2011-01-01

    This paper describes the initial groundwork carried out as part of the European Commission funded Transatlantic Tumor Model Repositories project, to develop a new markup language for computational cancer modelling, TumorML. In this paper we describe the motivations for such a language, arguing that current state-of-the-art biomodelling languages are not suited to the cancer modelling domain. We go on to describe the work that needs to be done to develop TumorML, the conceptual design, and a description of what existing markup languages will be used to compose the language specification.

  20. SGML Authoring Tools for Technical Communication.

    ERIC Educational Resources Information Center

    Davidson, W. J.

    1993-01-01

    Explains that structured authoring systems designed for the creation of generically encoded reusable information have context-sensitive application of markup, markup suppression, queuing and automated formatting, structural navigation, and self-validation features. Maintains that they are a real alternative to conventional publishing systems. (SR)

  1. 76 FR 70187 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... rates. In effect, the Exchange is obtaining wholesale rates from the carriers and then charging a markup... a markup to allow the Exchange to cover its administrative costs and to earn a profit on its...

  2. 76 FR 8791 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ..., thereby maintaining the $0.0002 markup that exists in the current fee schedule. \\4\\ SR-PHLX-2011-11... recent pricing changes by that venue, and allows NASDAQ to maintain the current markup of $0.0002 per...

  3. XML and E-Journals: The State of Play.

    ERIC Educational Resources Information Center

    Wusteman, Judith

    2003-01-01

    Discusses the introduction of the use of XML (Extensible Markup Language) in publishing electronic journals. Topics include standards, including DTDs (Document Type Definition), or document type definitions; aggregator requirements; SGML (Standard Generalized Markup Language); benefits of XML for e-journals; XML metadata; the possibility of…

  4. 76 FR 70184 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... rates. In effect, the Exchange is obtaining wholesale rates from the carriers and then charging a markup... a markup to allow the Exchange to cover its administrative costs and to earn a profit on its...

  5. 76 FR 70199 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... rates. In effect, the Exchange is obtaining wholesale rates from the carriers and then charging a markup... a markup to allow the Exchange to cover its administrative costs and to earn a profit on its...

  6. Informatics in radiology: An open-source and open-access cancer biomedical informatics grid annotation and image markup template builder.

    PubMed

    Mongkolwat, Pattanasak; Channin, David S; Kleper, Vladimir; Rubin, Daniel L

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and image markup (AIM), a project supported by the National Cancer Institute's cancer biomedical informatics grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible.

  7. Informatics in Radiology: An Open-Source and Open-Access Cancer Biomedical Informatics Grid Annotation and Image Markup Template Builder

    PubMed Central

    Mongkolwat, Pattanasak; Channin, David S.; Kleper, Vladimir; Rubin, Daniel L.

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and Image Markup (AIM), a project supported by the National Cancer Institute’s cancer Biomedical Informatics Grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible. © RSNA, 2012 PMID:22556315

  8. Semi-automated XML markup of biosystematic legacy literature with the GoldenGATE editor.

    PubMed

    Sautter, Guido; Böhm, Klemens; Agosti, Donat

    2007-01-01

    Today, digitization of legacy literature is a big issue. This also applies to the domain of biosystematics, where this process has just started. Digitized biosystematics literature requires a very precise and fine grained markup in order to be useful for detailed search, data linkage and mining. However, manual markup on sentence level and below is cumbersome and time consuming. In this paper, we present and evaluate the GoldenGATE editor, which is designed for the special needs of marking up OCR output with XML. It is built in order to support the user in this process as far as possible: Its functionality ranges from easy, intuitive tagging through markup conversion to dynamic binding of configurable plug-ins provided by third parties. Our evaluation shows that marking up an OCR document using GoldenGATE is three to four times faster than with an off-the-shelf XML editor like XML-Spy. Using domain-specific NLP-based plug-ins, these numbers are even higher.

  9. Knowledge requirements for automated inference of medical textbook markup.

    PubMed Central

    Berrios, D. C.; Kehler, A.; Fagan, L. M.

    1999-01-01

    Indexing medical text in journals or textbooks requires a tremendous amount of resources. We tested two algorithms for automatically indexing nouns, noun-modifiers, and noun phrases, and inferring selected binary relations between UMLS concepts in a textbook of infectious disease. Sixty-six percent of nouns and noun-modifiers and 81% of noun phrases were correctly matched to UMLS concepts. Semantic relations were identified with 100% specificity and 94% sensitivity. For some medical sub-domains, these algorithms could permit expeditious generation of more complex indexing. PMID:10566445
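
    For reference, the specificity and sensitivity quoted above are the usual ratios over true and false positives and negatives; a minimal sketch with hypothetical counts, not the study's data:

        # Hypothetical counts for illustration; not the study's data.
        def sensitivity(tp, fn):
            return tp / (tp + fn)   # proportion of true relations that were identified

        def specificity(tn, fp):
            return tn / (tn + fp)   # proportion of non-relations correctly left unmarked

        print(sensitivity(tp=94, fn=6))   # 0.94, i.e. 94% sensitivity as reported
        print(specificity(tn=50, fp=0))   # 1.0, i.e. 100% specificity as reported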

  10. HTML5 May Provide Vital Link for Friendly Future Mobile Applications

    DTIC Science & Technology

    2012-01-01

    Army Communicator. HTML5 may provide vital link for friendly future mobile applications. By LTC Gregory Motes. An important thread that has exist...would be accomplished with the maturation of HTML5. To fully grasp the opportunity, one needs to be aware of the history of HyperText Markup...number of devices in both on-line and off-line states. To many, this would be accomplished with the maturation of HTML5.

  11. Prices and mark-ups on antimalarials: evidence from nationally representative studies in six malaria-endemic countries.

    PubMed

    Palafox, Benjamin; Patouillard, Edith; Tougher, Sarah; Goodman, Catherine; Hanson, Kara; Kleinschmidt, Immo; Torres Rueda, Sergio; Kiefer, Sabine; O'Connell, Kate; Zinsou, Cyprien; Phok, Sochea; Akulayi, Louis; Arogundade, Ekundayo; Buyungo, Peter; Mpasela, Felton; Poyer, Stephen; Chavasse, Desmond

    2016-03-01

    The private for-profit sector is an important source of treatment for malaria. However, private patients face high prices for the recommended treatment for uncomplicated malaria, artemisinin combination therapies (ACTs), which makes them more likely to receive cheaper, less effective non-artemisinin therapies (nATs). This study seeks to better understand consumer antimalarial prices by documenting and exploring the pricing behaviour of retailers and wholesalers. Using data collected in 2009-10, we present survey estimates of antimalarial retail prices, and wholesale- and retail-level price mark-ups from six countries (Benin, Cambodia, the Democratic Republic of Congo, Nigeria, Uganda and Zambia), along with qualitative findings on factors affecting pricing decisions. Retail prices were lowest for nATs, followed by ACTs and artemisinin monotherapies (AMTs). Retailers applied the highest percentage mark-ups on nATs (range: 40% in Nigeria to 100% in Cambodia and Zambia), whereas mark-ups on ACTs (range: 22% in Nigeria to 71% in Zambia) and AMTs (range: 22% in Nigeria to 50% in Uganda) were similar in magnitude, but lower than those applied to nATs. Wholesale mark-ups were generally lower than those at retail level, and were similar across antimalarial categories in most countries. When setting prices wholesalers and retailers commonly considered supplier prices, prevailing market prices, product availability, product characteristics and the costs related to transporting goods, staff salaries and maintaining a property. Price discounts were regularly used to encourage sales and were sometimes used by wholesalers to reward long-term customers. Pricing constraints existed only in Benin where wholesaler and retailer mark-ups are regulated; however, unlicensed drug vendors based in open-air markets did not adhere to the pricing regime. These findings indicate that mark-ups on antimalarials are reasonable. Therefore, improving ACT affordability would be most readily achieved by interventions that reduce commodity prices for retailers, such as ACT subsidies, pooled purchasing mechanisms and cost-effective strategies to increase the distribution coverage area of wholesalers. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  12. Symmetric Key Services Markup Language (SKSML)

    NASA Astrophysics Data System (ADS)

    Noor, Arshad

    Symmetric Key Services Markup Language (SKSML) is the eXtensible Markup Language (XML) being standardized by the OASIS Enterprise Key Management Infrastructure Technical Committee for requesting and receiving symmetric encryption cryptographic keys within a Symmetric Key Management System (SKMS). This protocol is designed to be used between clients and servers within an Enterprise Key Management Infrastructure (EKMI) to secure data, independent of the application and platform. Building on many security standards such as XML Signature, XML Encryption, Web Services Security and PKI, SKSML provides standards-based capability to allow any application to use symmetric encryption keys, while maintaining centralized control. This article describes the SKSML protocol and its capabilities.

  13. The "New Oxford English Dictionary" Project.

    ERIC Educational Resources Information Center

    Fawcett, Heather

    1993-01-01

    Describes the conversion of the 22,000-page Oxford English Dictionary to an electronic version incorporating a modified Standard Generalized Markup Language (SGML) syntax. Explains that the database designers chose structured markup because it supports users' data searching needs, allows textual components to be extracted or modified, and allows…

  14. 77 FR 21607 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... believes that it is reasonable to charge a $0.0001 per share markup on such routed orders as a means of..., and accordingly, it is equitable for NASDAQ to charge members a markup for making use of NASDAQ's...

  15. CytometryML, an XML format based on DICOM and FCS for analytical cytology data.

    PubMed

    Leif, Robert C; Leif, Suzanne B; Leif, Stephanie H

    2003-07-01

    Flow Cytometry Standard (FCS) was initially created to standardize the software researchers use to analyze, transmit, and store data produced by flow cytometers and sorters. Because of the clinical utility of flow cytometry, it is necessary to have a standard consistent with the requirements of medical regulatory agencies. We extended the existing mapping of FCS to the Digital Imaging and Communications in Medicine (DICOM) standard to include list-mode data produced by flow cytometry, laser scanning cytometry, and microscopic image cytometry. FCS list-mode was mapped to the DICOM Waveform Information Object. We created a collection of Extensible Markup Language (XML) schemas to express the DICOM analytical cytologic text-based data types except for large binary objects. We also developed a cytometry markup language, CytometryML, in an open environment subject to continuous peer review. The feasibility of expressing the data contained in FCS, including list-mode in DICOM, was demonstrated; and a preliminary mapping for list-mode data in the form of XML schemas and documents was completed. DICOM permitted the creation of indices that can be used to rapidly locate in a list-mode file the cells that are members of a subset. DICOM and its coding schemes for other medical standards can be represented by XML schemas, which can be combined with other relevant XML applications, such as Mathematical Markup Language (MathML). The use of XML format based on DICOM for analytical cytology met most of the previously specified requirements and appears capable of meeting the others; therefore, the present FCS should be retired and replaced by an open, XML-based, standard CytometryML. Copyright 2003 Wiley-Liss, Inc.

  16. Designing Multimedia for the Hypertext Markup Language.

    ERIC Educational Resources Information Center

    Schwier, Richard A.; Misanchuk, Earl R.

    Dynamic discussions have begun to emerge concerning style of presentation on world wide web sites. Some hypertext markup language (HTML) designers seek an intimate and chatty ambience, while others want to project a more professional image. Evaluators see many sites as overdecorated and indecipherable. This paper offers suggestions on selecting…

  17. 106-17 Telemetry Standards Metadata Configuration Chapter 23

    DTIC Science & Technology

    2017-07-01

    23.2 Metadata Description Language. Acronyms: HTML, Hypertext Markup Language; MDL, Metadata Description Language; PCM, pulse code modulation; TMATS, Telemetry Attributes Transfer Standard; W3C, World Wide Web Consortium; XML, eXtensible Markup Language; XSD, XML schema document; Telemetry Network Standard.

  18. DAVE-ML Utility Programs

    NASA Technical Reports Server (NTRS)

    Jackson, Bruce

    2006-01-01

    DAVEtools is a set of Java archives that embodies tools for manipulating flight-dynamics models that have been encoded in the Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML). DAVE-ML is an application of the Extensible Markup Language (XML) for encoding complete computational models of the dynamics of aircraft and spacecraft.

  19. An Introduction to the Resource Description Framework.

    ERIC Educational Resources Information Center

    Miller, Eric

    1998-01-01

    Explains the Resource Description Framework (RDF), an infrastructure developed under the World Wide Web Consortium that enables the encoding, exchange, and reuse of structured metadata. It is an application of Extensible Markup Language (XML), which is a subset of Standard Generalized Markup Language (SGML), and helps with expressing semantics.…

  20. ScienceCentral: open access full-text archive of scientific journals based on Journal Article Tag Suite regardless of their languages

    PubMed Central

    Huh, Sun

    2013-01-01

    ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, extensible markup language files of all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step for free access or open access academic scientific journals of all languages to leap to the world, including scientific journals from Croatia. PMID:24266292

  1. WITH: a system to write clinical trials using XML and RDBMS.

    PubMed Central

    Fazi, Paola; Luzi, Daniela; Manco, Mariarosaria; Ricci, Fabrizio L.; Toffoli, Giovanni; Vignetti, Marco

    2002-01-01

    The paper illustrates the system WITH (Write on Internet clinical Trials in Haematology), which supports the writing of a clinical trial (CT) document. The requirements of this system have been defined by analysing the writing process of a CT and then modelling the content of its sections together with their logical and temporal relationships. The system WITH allows: a) editing the document text; b) re-using the text; and c) facilitating cooperation and collaborative writing. It is based on the XML mark-up language and on an RDBMS. This choice guarantees: a) process standardisation; b) process management; c) efficient delivery of information-based tasks; and d) explicit focus on process design. PMID:12463823

  2. Modeling the Arden Syntax for medical decisions in XML.

    PubMed

    Kim, Sukil; Haug, Peter J; Rocha, Roberto A; Choi, Inyoung

    2008-10-01

    A new model expressing Arden Syntax with the eXtensible Markup Language (XML) was developed to increase its portability. Every example was manually parsed and reviewed until the schema and the style sheet were considered to be optimized. When the first schema was finished, several MLMs in Arden Syntax Markup Language (ArdenML) were validated against the schema. They were then transformed to HTML format with the style sheet, during which they were compared to the original text version of their own MLM. When faults were found in the transformed MLM, the schema and/or style sheet was fixed. This cycle continued until all the examples were encoded into XML documents. The original MLMs were encoded in XML according to the proposed XML schema, and reverse-parsed MLMs in ArdenML were checked using a public domain Arden Syntax checker. Two hundred seventy-seven examples of MLMs were successfully transformed into XML documents using the model, and the reverse-parse yielded the original text version of the MLMs. Two hundred sixty-five of the 277 MLMs showed the same error patterns before and after transformation, and all 11 errors related to statement structure were resolved in the XML version. The model uses two syntax checking mechanisms: first, an XML validation process, and second, a syntax check using an XSL style sheet. Now that we have a schema for ArdenML, we can also begin the development of style sheets for transforming ArdenML into other languages.
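
    The two checking mechanisms described above, schema validation followed by an XSLT transformation to HTML, can be sketched with the widely used lxml library. This is an illustrative sketch, not the authors' code; the file names (ardenml.xsd, example_mlm.xml, ardenml_to_html.xsl) are placeholders.

        # Sketch of XSD validation plus XSLT transformation using lxml; file names are placeholders.
        from lxml import etree

        schema = etree.XMLSchema(etree.parse("ardenml.xsd"))   # hypothetical schema file
        doc = etree.parse("example_mlm.xml")                   # hypothetical MLM encoded in XML

        if schema.validate(doc):
            transform = etree.XSLT(etree.parse("ardenml_to_html.xsl"))  # hypothetical style sheet
            html = transform(doc)
            print(str(html))         # HTML rendering of the MLM
        else:
            print(schema.error_log)  # schema violations found during validation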

  3. Phased development of a web-based PACS viewer

    NASA Astrophysics Data System (ADS)

    Gidron, Yoad; Shani, Uri; Shifrin, Mark

    2000-05-01

    The Web browser is an excellent environment for the rapid development of an effective and inexpensive PACS viewer. In this paper we will share our experience in developing a browser-based viewer, from the inception and prototype stages to its current state of maturity. There are many operational advantages to a browser-based viewer, even when native viewers already exist in the system (with multiple and/or high-resolution screens): (1) It can be used on existing personal workstations throughout the hospital. (2) It is easy to make the service available from physicians' homes. (3) The viewer is extremely portable and platform independent. There is a wide variety of means available for implementing the browser-based viewer. Each file sent to the client by the server can perform some end-user or client/server interaction. These means range from HTML (HyperText Markup Language) files, through JavaScript, to Java applets. Some data types may also invoke plug-in code in the client; although this would reduce the portability of the viewer, it would provide the needed efficiency in critical places. On the server side the range of means is also very rich: (1) a set of files: HTML, JavaScript, Java applets, etc.; (2) extensions of the server via cgi-bin programs; (3) extensions of the server via servlets; (4) any other helper application residing and working with the server to access the DICOM archive. The viewer architecture consists of two basic parts: the first part performs query and navigation through the DICOM archive image folders; the second part does the image access and display. While the first part deals with low data traffic, it involves many database transactions. The second part is simple as far as access transactions are concerned, but requires much more data traffic and display functions. Our web-based viewer has gone through three development stages, characterized by the complexity of the means and tools employed on both client and server sides.

  4. Resolving Controlled Vocabulary in DITA Markup: A Case Example in Agroforestry

    ERIC Educational Resources Information Center

    Zschocke, Thomas

    2012-01-01

    Purpose: This paper aims to address the issue of matching controlled vocabulary on agroforestry from knowledge organization systems (KOS) and incorporating these terms in DITA markup. The paper has been selected for an extended version from MTSR'11. Design/methodology/approach: After a general description of the steps taken to harmonize controlled…

  5. Developing a Markup Language for Encoding Graphic Content in Plan Documents

    ERIC Educational Resources Information Center

    Li, Jinghuan

    2009-01-01

    While deliberating and making decisions, participants in urban development processes need easy access to the pertinent content scattered among different plans. A Planning Markup Language (PML) has been proposed to represent the underlying structure of plans in an XML-compliant way. However, PML currently covers only textual information and lacks…

  6. Overview of the World Wide Web Consortium (W3C) (SIGs IA, USE).

    ERIC Educational Resources Information Center

    Daly, Janet

    2000-01-01

    Provides an overview of a planned session to describe the work of the World Wide Web Consortium, including technical specifications for HTML (Hypertext Markup Language), XML (Extensible Markup Language), CSS (Cascading Style Sheets), and over 20 other Web standards that address graphics, multimedia, privacy, metadata, and other technologies. (LRW)

  7. Making the World Wide Web Accessible to All Students.

    ERIC Educational Resources Information Center

    Guthrie, Sally A.

    2000-01-01

    Examines the accessibility of Web sites belonging to 80 colleges of communications and schools of journalism by examining the hypertext markup language (HTML) used to format the pages. Suggests ways to revise the markup of pages to make them more accessible to students with vision, hearing, and mobility problems. Lists resources of the latest…

  8. An Electronic Finding Aid Using Extensible Markup Language (XML) and Encoded Archival Description (EAD).

    ERIC Educational Resources Information Center

    Chang, May

    2000-01-01

    Describes the development of electronic finding aids for archives at the University of Illinois, Urbana-Champaign that used XML (extensible markup language) and EAD (encoded archival description) to enable more flexible information management and retrieval than using MARC or a relational database management system. EAD template is appended.…

  9. Modularization and Structured Markup for Learning Content in an Academic Environment

    ERIC Educational Resources Information Center

    Schluep, Samuel; Bettoni, Marco; Schar, Sissel Guttormsen

    2006-01-01

    This article aims to present a flexible component model for modular, web-based learning content, and a simple structured markup schema for the separation of content and presentation. The article will also contain an overview of the dynamic Learning Content Management System (dLCMS) project, which implements these concepts. Content authors are a…

  10. The Adoption of Mark-Up Tools in an Interactive e-Textbook Reader

    ERIC Educational Resources Information Center

    Van Horne, Sam; Russell, Jae-eun; Schuh, Kathy L.

    2016-01-01

    Researchers have more often examined whether students prefer using an e-textbook over a paper textbook or whether e-textbooks provide a better resource for learning than paper textbooks, but students' adoption of mark-up tools has remained relatively unexamined. Drawing on the concept of Innovation Diffusion Theory, we used educational data mining…

  11. A methodology for evaluation of a markup-based specification of clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a three-phase, nine-step methodology for specification of clinical guidelines (GLs) by expert physicians, clinical editors, and knowledge engineers, and for quantitative evaluation of the specification's quality. We applied this methodology to a particular framework for incremental GL structuring (mark-up) and to GLs in three clinical domains with encouraging results.

  12. Extensible Markup Language: How Might It Alter the Software Documentation Process and the Role of the Technical Communicator?

    ERIC Educational Resources Information Center

    Battalio, John T.

    2002-01-01

    Describes the influence that Extensible Markup Language (XML) will have on the software documentation process and subsequently on the curricula of advanced undergraduate and master's programs in technical communication. Recommends how curricula of advanced undergraduate and master's programs in technical communication ought to change in order to…

  13. Thin client (web browser)-based collaboration for medical imaging and web-enabled data.

    PubMed

    Le, Tuong Huu; Malhi, Nadeem

    2002-01-01

    Utilizing thin client software and open source server technology, a collaborative architecture was implemented allowing for sharing of Digital Imaging and Communications in Medicine (DICOM) and non-DICOM images with real-time markup. Using the Web browser as a thin client integrated with standards-based components, such as DHTML (dynamic hypertext markup language), JavaScript, and Java, collaboration was achieved through a Web server/proxy server combination utilizing Java Servlets and Java Server Pages. A typical collaborative session involved the driver, who directed the navigation of the other collaborators, the passengers, and provided collaborative markups of medical and nonmedical images. The majority of processing was performed on the server side, allowing for the client to remain thin and more accessible.

  14. Astronomical Instrumentation System Markup Language

    NASA Astrophysics Data System (ADS)

    Goldbaum, Jesse M.

    2016-05-01

    The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML)-based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments as well as one for a sample AIS are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.

  16. Using Extensible Markup Language (XML) for the Single Source Delivery of Educational Resources by Print and Online: A Case Study

    ERIC Educational Resources Information Center

    Walsh, Lucas

    2007-01-01

    This article seeks to provide an introduction to Extensible Markup Language (XML) by looking at its use in a single source publishing approach to the provision of teaching resources in both hardcopy and online. Using the development of the International Baccalaureate Organisation's online Economics Subject Guide as a practical example, this…

  17. A Practical Introduction to the XML, Extensible Markup Language, by Way of Some Useful Examples

    ERIC Educational Resources Information Center

    Snyder, Robin

    2004-01-01

    XML, Extensible Markup Language, is important as a way to represent and encapsulate the structure of underlying data in a portable way that supports data exchange regardless of the physical storage of the data. This paper (and session) introduces some useful and practical aspects of XML technology for sharing information in an educational setting…

  18. Knowledge Provenance in Semantic Wikis

    NASA Astrophysics Data System (ADS)

    Ding, L.; Bao, J.; McGuinness, D. L.

    2008-12-01

    Collaborative online environments with a technical Wiki infrastructure are becoming more widespread. One of the strengths of a Wiki environment is that it is relatively easy for numerous users to contribute original content and modify existing content (potentially originally generated by others). As more users begin to depend on informational content that is evolving by Wiki communities, it becomes more important to track the provenance of the information. Semantic Wikis expand upon traditional Wiki environments by adding some computationally understandable encodings of some of the terms and relationships in Wikis. We have developed a semantic Wiki environment that expands a semantic Wiki with provenance markup. Provenance of original contributions as well as modifications is encoded using the provenance markup component of the Proof Markup Language. The Wiki environment provides the provenance markup automatically, thus users are not required to make specific encodings of author, contribution date, and modification trail. Further, our Wiki environment includes a search component that understands the provenance primitives and thus can be used to provide a provenance-aware search facility. We will describe the knowledge provenance infrastructure of our Semantic Wiki and show how it is being used as the foundation of our group web site as well as a number of project web sites.

  19. WikiHyperGlossary (WHG): an information literacy technology for chemistry documents.

    PubMed

    Bauer, Michael A; Berleant, Daniel; Cornell, Andrew P; Belford, Robert E

    2015-01-01

    The WikiHyperGlossary is an information literacy technology that was created to enhance reading comprehension of documents by connecting them to socially generated multimedia definitions as well as semantically relevant data. The WikiHyperGlossary enhances reading comprehension by using the lexicon of a discipline to generate dynamic links in a document to external resources that can provide implicit information the document did not explicitly provide. Currently, the most common method to acquire additional information when reading a document is to access a search engine and browse the web. This may lead to skimming of multiple documents, with the novice never actually returning to the original document of interest. The WikiHyperGlossary automatically brings information to the user within the current document they are reading, enhancing the potential for deeper document understanding. The WikiHyperGlossary allows users to submit a web URL or text to be processed against a chosen lexicon, returning the document with tagged terms. The selection of a tagged term results in the appearance of the WikiHyperGlossary Portlet containing a definition, and depending on the type of word, tabs to additional information and resources. Current types of content include multimedia enhanced definitions, ChemSpider query results, 3D molecular structures, and 2D editable structures connected to ChemSpider queries. Existing glossaries can be bulk uploaded, locked for editing and associated with multiple socially generated definitions. The WikiHyperGlossary leverages both social and semantic web technologies to bring relevant information to a document. This can not only aid reading comprehension but also increase users' ability to obtain additional information within the document. We have demonstrated a molecular-editor-enabled knowledge framework that can result in a semantic web inductive reasoning process, and integration of the WikiHyperGlossary into other software technologies, like the Jikitou Biomedical Question and Answer system. Although this work was developed in the chemical sciences and took advantage of open science resources and initiatives, the technology is extensible to other knowledge domains. Through the DeepLit (Deeper Literacy: Connecting Documents to Data and Discourse) startup, we seek to extend WikiHyperGlossary technologies to other knowledge domains, and integrate them into other knowledge acquisition workflows.
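
    The central step described above, processing text against a chosen lexicon and returning it with tagged terms, can be illustrated by a generic dictionary-based tagger. A minimal sketch, assuming a toy glossary and a simple <term> wrapper; it is not the WikiHyperGlossary implementation.

        # Generic dictionary-based term tagger (illustrative; not the WikiHyperGlossary code).
        import re

        glossary = {  # hypothetical lexicon entries
            "benzene": "An aromatic hydrocarbon, C6H6.",
            "mole": "The SI unit for amount of substance.",
        }

        def tag_terms(text, lexicon):
            """Wrap each lexicon term found in the text in a simple <term> tag."""
            pattern = re.compile(r"\b(" + "|".join(map(re.escape, lexicon)) + r")\b", re.IGNORECASE)
            return pattern.sub(lambda m: "<term>" + m.group(0) + "</term>", text)

        print(tag_terms("One mole of benzene weighs about 78 g.", glossary))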

  20. Determinants of price setting decisions on anti-malarial drugs at retail shops in Cambodia.

    PubMed

    Patouillard, Edith; Hanson, Kara; Kleinschmidt, Immo; Palafox, Benjamin; Tougher, Sarah; Pok, Sochea; O'Connell, Kate; Goodman, Catherine

    2015-05-30

    In many low-income countries, the private commercial sector plays an important role in the provision of malaria treatment. However, the quality of care it provides is often poor, with artemisinin combination therapy (ACT) generally being too costly for consumers. Decreasing ACT prices is critical for improving private sector treatment outcomes and reducing the spread of artemisinin resistance. Yet limited evidence exists on the factors influencing retailers' pricing decisions. This study investigates the determinants of price mark-ups on anti-malarial drugs in retail outlets in Cambodia. Taking an economics perspective, the study tests the hypothesis that the structure of the anti-malarial market determines the way providers set their prices. Providers facing weak competition are hypothesized to apply high mark-ups and set prices above the competitive level. To analyse the relationship between market competition and provider pricing, the study used cross-sectional data from retail outlets selling anti-malarial drugs, including outlet characteristics data (e.g. outlet type, anti-malarial sales volumes), range of anti-malarial drugs stocked (e.g. dosage form, brand status) and purchase and selling prices. Market concentration, a measure of the level of market competition, was estimated using sales volume data. Market accessibility was defined based on travel time to the closest main commercial area. Percent mark-ups were calculated using price data. The relationship between mark-ups and market concentration was explored using regression analysis. The anti-malarial market was on average highly concentrated, suggesting weak competition. Higher concentration was positively associated with higher mark-ups in moderately accessible markets only, with no significant relationship or a negative relationship in other markets. Other determinants of pricing included anti-malarial brand status and generic type, with higher mark-ups on cheaper products. The results indicate that provider pricing as well as other key elements of anti-malarial supply and demand may have played an important role in the limited access to appropriate malaria treatment in Cambodia. The potential for an ACT price subsidy at manufacturer level combined with effective communications directed at consumers and supportive private sector regulation should be explored to improve access to quality malaria treatment in Cambodia.
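
    Two quantities central to the analysis above reduce to simple formulas: the percent mark-up from purchase and selling prices, and market concentration from outlet sales volumes. The sketch below uses hypothetical numbers and assumes the Herfindahl-Hirschman index as the concentration measure, which the abstract does not name explicitly.

        # Hypothetical numbers; the HHI is assumed as the concentration measure (not named in the abstract).
        def percent_markup(purchase_price, selling_price):
            """Mark-up as a percentage of the purchase price."""
            return (selling_price - purchase_price) / purchase_price * 100.0

        def herfindahl(sales_volumes):
            """Sum of squared market shares; values near 1 indicate a highly concentrated market."""
            total = float(sum(sales_volumes))
            return sum((v / total) ** 2 for v in sales_volumes)

        print(percent_markup(purchase_price=2.0, selling_price=3.0))  # 50.0 (% mark-up)
        print(herfindahl([800, 100, 50, 50]))                         # ~0.65, a concentrated market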

  1. Integrating and visualizing primary data from prospective and legacy taxonomic literature

    PubMed Central

    Agosti, Donat; Penev, Lyubomir; Sautter, Guido; Georgiev, Teodor; Catapano, Terry; Patterson, David; King, David; Pereira, Serrano; Vos, Rutger Aldo; Sierra, Soraya

    2015-01-01

    Specimen data in taxonomic literature are among the highest quality primary biodiversity data. Innovative cybertaxonomic journals are using workflows that maintain data structure and disseminate electronic content to aggregators and other users; such structure is lost in traditional taxonomic publishing. Legacy taxonomic literature is a vast repository of knowledge about biodiversity. Currently, access to that resource is cumbersome, especially for non-specialist data consumers. Markup is a mechanism that makes this content more accessible, and is especially suited to machine analysis. Fine-grained XML (Extensible Markup Language) markup was applied to all (37) open-access articles published in the journal Zootaxa containing treatments on spiders (Order: Araneae). The markup approach was optimized to extract primary specimen data from legacy publications. These data were combined with data from articles containing treatments on spiders published in Biodiversity Data Journal, where XML structure is part of the routine publication process. A series of charts was developed to visualize the content of specimen data in XML-tagged taxonomic treatments, either singly or in aggregate. The data can be filtered by several fields (including journal, taxon, institutional collection, collecting country, collector, author, article and treatment) to query particular aspects of the data. We demonstrate here that XML markup using GoldenGATE can address the challenge presented by unstructured legacy data, can extract structured primary biodiversity data which can be aggregated with and jointly queried with data from other Darwin Core-compatible sources, and show how visualization of these data can communicate key information contained in biodiversity literature. We complement recent studies on aspects of biodiversity knowledge using XML structured data to explore 1) the time lag between species discovery and description, and 2) the prevalence of rarity in species descriptions. PMID:26023286
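
    To make the idea of fine-grained specimen markup concrete, the short Python sketch below builds a Darwin Core-style specimen record for a single treatment; the element names and values are illustrative stand-ins rather than the exact GoldenGATE/TaxonX tag set used in the study.

      import xml.etree.ElementTree as ET

      # Illustrative treatment with one materials citation (specimen record).
      treatment = ET.Element("treatment", taxon="Araneus diadematus")
      citation = ET.SubElement(treatment, "materialsCitation")
      ET.SubElement(citation, "collectingCountry").text = "Switzerland"
      ET.SubElement(citation, "collectorName").text = "J. Smith"
      ET.SubElement(citation, "collectionCode").text = "NMBE"
      ET.SubElement(citation, "specimenCount").text = "2"

      print(ET.tostring(treatment, encoding="unicode"))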

  2. A New Method of Viewing Attachment Document of eMail on Various Mobile Devices

    NASA Astrophysics Data System (ADS)

    Ko, Heeae; Seo, Changwoo; Lim, Yonghwan

    As the computing power of mobile devices improves rapidly, many kinds of web services, such as e-mail, are also becoming available on mobile devices. Mobile Mail Service began early, but it has mostly been limited to particular devices such as smartphones, so users have had to purchase a specific phone to benefit from it. In this paper, this problem is solved by using the DIDL (Digital Item Declaration Language) markup type defined in MPEG-21 together with the MobileGate Server. DIDL can be converted to other markup types that mobile devices can display. By transforming PC web mail content, including attached documents, to DIDL markup through the MobileGate Server, Mobile Mail Service becomes available for all kinds of mobile devices.

  3. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
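
    The sketch below illustrates the general pattern described above: measurements produced at an imaging workstation are serialized to a standardized XML file that reporting software can then ingest. It is a simplified Python stand-in; the element names are not the actual AIM schema, and the measurements are invented.

      import xml.etree.ElementTree as ET

      # Hypothetical measurements exported by an advanced imaging workstation.
      measurements = [
          {"label": "Lesion long axis", "value": "2.3", "unit": "cm"},
          {"label": "Lesion short axis", "value": "1.1", "unit": "cm"},
      ]

      root = ET.Element("ImageAnnotation", study="CT abdomen", date="2011-01-01")
      for m in measurements:
          calc = ET.SubElement(root, "Calculation", label=m["label"])
          ET.SubElement(calc, "Value", unit=m["unit"]).text = m["value"]

      # Reporting software would read this file and merge it into the report,
      # while the same file could be stored in a database for later data mining.
      ET.ElementTree(root).write("annotation.xml", xml_declaration=True, encoding="utf-8")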

  4. Developing and integrating an adverse drug reaction reporting system with the hospital information system.

    PubMed

    Kataoka, Satoshi; Ohe, Kazuhiko; Mochizuki, Mayumi; Ueda, Shiro

    2002-01-01

    We have developed an adverse drug reaction (ADR) reporting system integrated with the Hospital Information System (HIS) of the University of Tokyo Hospital. Since this system is written in Java, it is portable without recompilation to any operating system on which a Java virtual machine runs. In this system, we implemented an automatic data-filling function using XML (Extensible Markup Language) files generated by the HIS. This new specification should decrease the time needed for physicians and pharmacists to fill in spontaneous ADR reports. At the click of a button, the report is sent to the text database through Simple Mail Transfer Protocol (SMTP) electronic mail. The destination of the report mail can be changed arbitrarily by administrators, which gives the system more flexibility in practical operation. Although we tried to use the SGML-based (Standard Generalized Markup Language) ICH M2 guideline to follow the global standard for case reports, we eventually adopted XML as the output report format, because we found problems in handling two-byte characters with the ICH guideline and because XML has many useful features. According to our pilot survey conducted at the University of Tokyo Hospital, many physicians answered that our idea of integrating the ADR reporting system with the HIS would increase the number of ADR reports.

  5. The content and efficacy of environmental public health journal homepages.

    PubMed

    Lindars, E S; Spickett, J T

    2000-01-01

    The information on several environmental public health journal homepages has been assessed for its quality and quantity, using selected key criteria. These criteria included the extent of text available, the ability to search the website, free delivery of the table of contents via email, and the presence of hyper-links. A high degree of variability is seen, with services and facilities offered ranging from none to the entire journal available for no fee. The most comprehensive journal homepages are those associated with major institutions and hence financed by contributions from their members or public money, i.e. the British Medical Association, the World Health Organisation and the US National Institute of Environmental Health Sciences. The journal homepages associated with these institutions offered full text of both current and archived issues as well as additions such as the ability to search other sites, web links, and in some cases hyper-linked references and information on related articles. The provision of text on the Internet should be an essential aim for all journal homepages, to ensure fast and effective conveyance of information to health professionals.

  6. Producing a Data Dictionary from an Extensible Markup Language (XML) Schemain the Global Force Management Data Initiative

    DTIC Science & Technology

    2017-02-01

    Front-matter excerpts: an acronym list expanding entries such as entity relationship (diagram), EwID (Enterprise-wide Identifier), FMID (Force Management Identifier), GFM (Global Force Management), and HTML (Hypertext Markup Language); and the title page of Producing a Data Dictionary from an Extensible Markup Language (XML) Schema in the Global Force Management Data Initiative, by Frederick S Brundick, Computing and Information Sciences Directorate, ARL, approved for public release; distribution is unlimited.

  7. RTML: remote telescope markup language and you

    NASA Astrophysics Data System (ADS)

    Hessman, F. V.

    2001-12-01

    In order to coordinate the use of robotic and remotely operated telescopes in networks -- like Göttingen's MOnitoring NEtwork of Telescopes (MONET) -- a standard format for the exchange of observing requests and reports is needed. I describe the benefits of Remote Telescope Markup Language (RTML), an XML-based protocol originally developed by the Hands-On Universe Project, which is being used and further developed by several robotic telescope projects and firms.

  8. Visualization Development of the Ballistic Threat Geospatial Optimization

    DTIC Science & Technology

    2015-07-01

    ...topographic globes, Keyhole Markup Language (KML), and Collada files. World Wind gives the user the ability to import 3-D models and navigate...present. After the first-person view window is closed, the images stored in memory are then converted to a QuickTime movie (.MOV). The video will be... The report's acronym list expands HPC (high-performance computing), JOGL (Java implementation of OpenGL), KML (Keyhole Markup Language), and NASA (National Aeronautics and Space Administration), among others.

  9. Markup of temporal information in electronic health records.

    PubMed

    Hyun, Sookyung; Bakken, Suzanne; Johnson, Stephen B

    2006-01-01

    Temporal information plays a critical role in the understanding of clinical narrative (i.e., free text). We developed a representation for marking up temporal information in a narrative, consisting of five elements: 1) reference point, 2) direction, 3) number, 4) time unit, and 5) pattern. We identified 254 temporal expressions from 50 discharge summaries and represented them using our scheme. The overall inter-rater reliability among raters applying the representation model was 75 percent agreement. The model can contribute to temporal reasoning in computer systems for decision support, data mining, and process and outcomes analyses by providing structured temporal information.
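
    A minimal sketch of the five-element representation described above, written as a Python data class; the field values and the example expression are illustrative, not taken from the study's discharge summaries.

      from dataclasses import dataclass

      @dataclass
      class TemporalExpression:
          reference_point: str   # e.g. "admission", "discharge"
          direction: str         # "before" or "after" the reference point
          number: int            # how many units
          time_unit: str         # "day", "week", "month", ...
          pattern: str           # e.g. "point" or "recurring"

      # "three days prior to admission"
      expr = TemporalExpression("admission", "before", 3, "day", "point")
      print(expr)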

  10. Chemical Markup, XML, and the World Wide Web. 7. CMLSpect, an XML vocabulary for spectral data.

    PubMed

    Kuhn, Stefan; Helmus, Tobias; Lancashire, Robert J; Murray-Rust, Peter; Rzepa, Henry S; Steinbeck, Christoph; Willighagen, Egon L

    2007-01-01

    CMLSpect is an extension of Chemical Markup Language (CML) for managing spectral and other analytical data. It is designed to be flexible enough to contain a wide variety of spectral data. The paper describes the CMLElements used and gives practical examples for common types of spectra. In addition it demonstrates how different views of the data can be expressed and what problems still exist.

  11. Computer support for physiological cell modelling using an ontology on cell physiology.

    PubMed

    Takao, Shimayoshi; Kazuhiro, Komurasaki; Akira, Amano; Takeshi, Iwashita; Masanori, Kanazawa; Tetsuya, Matsuda

    2006-01-01

    The development of electrophysiological whole cell models to support the understanding of biological mechanisms is increasing rapidly. Due to the complexity of biological systems, comprehensive cell models, which are composed of many imported sub-models of functional elements, can get quite complicated as well, making computer modification difficult. Here, we propose a computer support to enhance structural changes of cell models, employing the markup languages CellML and our original PMSML (physiological model structure markup language), in addition to a new ontology for cell physiological modelling. In particular, a method to make references from CellML files to the ontology and a method to assist manipulation of model structures using markup languages together with the ontology are reported. Using these methods three software utilities, including a graphical model editor, are implemented. Experimental results proved that these methods are effective for the modification of electrophysiological models.

  12. Development of the Plate Tectonics and Seismology markup languages with XML

    NASA Astrophysics Data System (ADS)

    Babaie, H.; Babaei, A.

    2003-04-01

    The Extensible Markup Language (XML) and its specifications such as the XSD Schema, allow geologists to design discipline-specific vocabularies such as Seismology Markup Language (SeismML) or Plate Tectonics Markup Language (TectML). These languages make it possible to store and interchange structured geological information over the Web. Development of a geological markup language requires mapping geological concepts, such as "Earthquake" or "Plate", into a UML object model using a modeling and design environment. We have selected four inter-related geological concepts: earthquake, fault, plate, and orogeny, and developed four XML Schema Definitions (XSD) that define the relationships, cardinalities, hierarchies, and semantics of these concepts. In such a geological concept model, the UML object "Earthquake" is related to one or more "Wave" objects, each arriving at a seismic station at a specific "DateTime", and relating to a specific "Epicenter" object that lies at a unique "Location". The "Earthquake" object occurs along a "Segment" of a "Fault" object, which is related to a specific "Plate" object. The "Fault" has its own associations with such things as "Bend", "Step", and "Segment", and could be of any kind (e.g., "Thrust", "Transform"). The "Plate" is related to many other objects such as "MOR", "Subduction", and "Forearc", and is associated with an "Orogeny" object that relates to "Deformation" and "Strain" and several other objects. These UML objects were mapped into XML Metadata Interchange (XMI) formats, which were then converted into four XSD Schemas. The schemas were used to create and validate the XML instance documents, and to create a relational database hosting the plate tectonics and seismological data in the Microsoft Access format. The SeismML and TectML allow seismologists and structural geologists, among others, to submit and retrieve structured geological data on the Internet. A seismologist, for example, can submit peer-reviewed and reliable data about a specific earthquake to a Java Server Page on our web site hosting the XML application. Other geologists can readily retrieve the submitted data, saved in files or special tables of the designed database, through a search engine designed with J2EE (JSP, servlet, Java Bean) and XML specifications such as XPath, XPointer, and XSLT. When extended to include all the important concepts of seismology and plate tectonics, the two markup languages will make global interchange of geological data a reality.
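
    As an illustration of the object model described above, the Python sketch below emits a small SeismML-style instance document relating an Earthquake to its Epicenter, an arriving Wave, and a Fault segment; the element names follow the concepts in the abstract rather than the published schemas, and the values are invented.

      import xml.etree.ElementTree as ET

      quake = ET.Element("Earthquake", id="eq-2003-001")
      epicenter = ET.SubElement(quake, "Epicenter")
      ET.SubElement(epicenter, "Location", latitude="35.7", longitude="-117.6", depthKm="8.1")
      wave = ET.SubElement(quake, "Wave", type="P")
      ET.SubElement(wave, "ArrivalDateTime").text = "2003-02-14T07:32:10Z"
      fault = ET.SubElement(quake, "Fault", kind="Transform")
      ET.SubElement(fault, "Segment", name="Segment-3")

      print(ET.tostring(quake, encoding="unicode"))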

  13. An Overview of Genomic Sequence Variation Markup Language (GSVML)

    PubMed Central

    Nakaya, Jun; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Kimura, Michio

    2006-01-01

    Internationally accumulated genomic sequence variation data on humans require an interoperable data exchange format. We developed GSVML as that data exchange format. GSVML is oriented toward human health and has three categories. Analyses of use cases in the human health domain and an investigation of existing databases and markup languages were conducted. Its ability to interface with the Health Level Seven Genotype Model was examined. GSVML provides a sharable platform for both clinical and research applications.

  14. MYCIN II: design and implementation of a therapy reference with complex content-based indexing.

    PubMed Central

    Kim, D. K.; Fagan, L. M.; Jones, K. T.; Berrios, D. C.; Yu, V. L.

    1998-01-01

    We describe the construction of MYCIN II, a prototype system that provides for content-based markup and search of a forthcoming clinical therapeutics textbook, Antimicrobial Therapy and Vaccines. Existing commercial search technology for digital references utilizes generic tools such as textword-based searches with geographical or statistical refinements. We suggest that the drawbacks of such systems significantly restrict their use in everyday clinical practice. This is in spite of the fact that there is a great need for the information contained within these same references. The system we describe is intended to supplement keyword searching so that certain important questions can be asked easily and can be answered reliably (in terms of precision and recall). Our method attacks this problem in a restricted domain of knowledge-clinical infectious disease. For example, we would like to be able to answer the class of questions exemplified by the following query: "What antimicrobial agents can be used to treat endocarditis caused by Eikenella corrodens?" We have compiled and analyzed a list of such questions to develop a concept-based markup scheme. This scheme was then applied within an HTML markup to electronically "highlight" passages from three textbook chapters. We constructed a functioning web-based search interface. Our system also provides semi-automated querying of PubMed using our concept markup and the user's actions as a guide. PMID:9929205

  15. MYCIN II: design and implementation of a therapy reference with complex content-based indexing.

    PubMed

    Kim, D K; Fagan, L M; Jones, K T; Berrios, D C; Yu, V L

    1998-01-01

    We describe the construction of MYCIN II, a prototype system that provides for content-based markup and search of a forthcoming clinical therapeutics textbook, Antimicrobial Therapy and Vaccines. Existing commercial search technology for digital references utilizes generic tools such as textword-based searches with geographical or statistical refinements. We suggest that the drawbacks of such systems significantly restrict their use in everyday clinical practice. This is in spite of the fact that there is a great need for the information contained within these same references. The system we describe is intended to supplement keyword searching so that certain important questions can be asked easily and can be answered reliably (in terms of precision and recall). Our method attacks this problem in a restricted domain of knowledge-clinical infectious disease. For example, we would like to be able to answer the class of questions exemplified by the following query: "What antimicrobial agents can be used to treat endocarditis caused by Eikenella corrodens?" We have compiled and analyzed a list of such questions to develop a concept-based markup scheme. This scheme was then applied within an HTML markup to electronically "highlight" passages from three textbook chapters. We constructed a functioning web-based search interface. Our system also provides semi-automated querying of PubMed using our concept markup and the user's actions as a guide.

  16. Instrument Remote Control via the Astronomical Instrument Markup Language

    NASA Technical Reports Server (NTRS)

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human-readable, manner has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control apply to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.

  17. The tissue micro-array data exchange specification: a web based experience browsing imported data

    PubMed Central

    Nohle, David G; Hackman, Barbara A; Ayers, Leona W

    2005-01-01

    Background The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers to approved researchers HIV infected biologic samples and uninfected control tissues including tissue cores in micro-arrays (TMA) accompanied by de-identified clinical data. Researchers interested in the type and quality of TMA tissue cores and the associated clinical data need an efficient method for viewing available TMA materials. Because each of the tissue samples within a TMA has separate data including a core tissue digital image and clinical data, an organized, standard approach to producing, navigating and publishing such data is necessary. The Association for Pathology Informatics (API) extensible mark-up language (XML) TMA data exchange specification (TMA DES) proposed in April 2003 provides a common format for TMA data. Exporting TMA data into the proposed format offers an opportunity to implement the API TMA DES. Using our public BrowseTMA tool, we created a web site that organizes and cross references TMA lists, digital "virtual slide" images, TMA DES export data, linked legends and clinical details for researchers. Microsoft Excel® and Microsoft Word® are used to convert tabular clinical data and produce an XML file in the TMA DES format. The BrowseTMA tool contains Extensible Stylesheet Language Transformation (XSLT) scripts that convert XML data into Hyper-Text Mark-up Language (HTML) web pages with hyperlinks automatically added to allow rapid navigation. Results Block lists, virtual slide images, legends, clinical details and exports have been placed on the ACSR web site for 14 blocks with 1623 cores of 2.0, 1.0 and 0.6 mm sizes. Our virtual microscope can be used to view and annotate these TMA images. Researchers can readily navigate from TMA block lists to TMA legends and to clinical details for a selected tissue core. Exports for 11 blocks with 3812 cores from three other institutions were processed with the BrowseTMA tool. Fifty common data elements (CDE) from the TMA DES were used and 42 more created for site-specific data. Researchers can download TMA clinical data in the TMA DES format. Conclusion Virtual TMAs with clinical data can be viewed on the Internet by interested researchers using the BrowseTMA tool. We have organized our approach to producing, sorting, navigating and publishing TMA information to facilitate such review. We have converted Excel TMA data into TMA DES XML, and imported it and TMA DES XML from another institution into BrowseTMA to produce web pages that allow us to browse through the merged data. We proposed enhancements to the TMA DES as a result of this experience. We implemented improvements to the API TMA DES as a result of using exported data from several institutions. A document type definition was written for the API TMA DES (that optionally includes proposed enhancements). Independent validators can be used to check exports against the DTD (with or without the proposed enhancements). Linking tissue core images to readily navigable clinical data greatly improves the value of the TMA. PMID:16086837
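
    The core of the BrowseTMA navigation idea, reading TMA data in XML and emitting an HTML index whose rows hyperlink to per-core details, can be sketched in a few lines of Python; the element names, attributes, and file paths below are assumptions for illustration, not the published TMA DES.

      import xml.etree.ElementTree as ET

      TMA_XML = """<tma_block block_id="ACSR-14">
        <core core_id="A1" tissue="lymph node" image="cores/A1.jpg"/>
        <core core_id="A2" tissue="spleen" image="cores/A2.jpg"/>
      </tma_block>"""

      block = ET.fromstring(TMA_XML)
      rows = [
          f'<tr><td><a href="{c.get("image")}">{c.get("core_id")}</a></td>'
          f'<td>{c.get("tissue")}</td></tr>'
          for c in block.findall("core")
      ]
      html = (f"<html><body><h1>Block {block.get('block_id')}</h1>"
              f"<table>{''.join(rows)}</table></body></html>")
      with open("block_index.html", "w") as fh:
          fh.write(html)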

  18. Effects on the medical revenue of comprehensive pricing reform in Chinese urban public hospitals after removing drug markups: case of Nanjing.

    PubMed

    Tang, Wenxi; Xie, Jing; Lu, Yijuan; Liu, Qizhi; Malone, Daniel; Ma, Aixia

    2018-04-01

    The State Council of China requires that all urban public hospitals eliminate drug markups by September 2017, and that hospital drugs be sold at the purchase price. Nanjing, one of the first provincial capital cities to implement the reform, is studied to evaluate the effects of the comprehensive reform on drug prices in public hospitals, and to explore differential compensation plans. Sixteen hospitals were selected, and financial data were collected over the 48-month period before the reform and for 12 months after the reform. An analysis was carried out using a simple linear interrupted time series model. The average difference ratio of drug surplus fell 13.39% after the reform, and drug markups were essentially eliminated. Revenue from medical services showed a net growth of 28.25%. The overall compensation received from the government financial budget and medical service revenue growth was 103.69% for the loss from policy-permitted 15% markup sales, and 116.48% for the net loss. However, there were large differences in compensation levels across hospitals, ranging from -21.92% to 413.74% for compensation by medical services revenue growth, causing the combined rate of financial and service compensation to vary from 28.87% to 413.74%. There was a significant positive correlation between the services compensation rate and the proportion of medical service revenue (p < .001), and the compensation rate increased by 8% for every 1% increase in the proportion of services revenue. Nanjing's pricing and compensation reform has basically achieved the policy targets of eliminating drug markups, promoting the growth of medical services revenue, and adjusting the structure of medical revenue. However, the growth rate of service revenue varied significantly among hospitals. Nanjing's reform represents a successful pricing and compensation reform in Chinese urban public hospitals. It is recommended that a differentiated and dynamic compensation plan be established in accordance with the revenue structure of different hospitals.

  19. Evaluating Drug Prices, Availability, Affordability, and Price Components: Implications for Access to Drugs in Malaysia

    PubMed Central

    Babar, Zaheer Ud Din; Ibrahim, Mohamed Izham Mohamed; Singh, Harpal; Bukahri, Nadeem Irfan; Creese, Andrew

    2007-01-01

    Background Malaysia's stable health care system is facing challenges with increasing medicine costs. To investigate these issues a survey was carried out to evaluate medicine prices, availability, affordability, and the structure of price components. Methods and Findings The methodology developed by the World Health Organization (WHO) and Health Action International (HAI) was used. Price and availability data for 48 medicines was collected from 20 public sector facilities, 32 private sector retail pharmacies and 20 dispensing doctors in four geographical regions of West Malaysia. Medicine prices were compared with international reference prices (IRPs) to obtain a median price ratio. The daily wage of the lowest paid unskilled government worker was used to gauge the affordability of medicines. Price component data were collected throughout the supply chain, and markups, taxes, and other distribution costs were identified. In private pharmacies, innovator brand (IB) prices were 16 times higher than the IRPs, while generics were 6.6 times higher. In dispensing doctor clinics, the figures were 15 times higher for innovator brands and 7.5 for generics. Dispensing doctors applied high markups of 50%–76% for IBs, and up to 316% for generics. Retail pharmacy markups were also high—25%–38% and 100%–140% for IBs and generics, respectively. In the public sector, where medicines are free, availability was low even for medicines on the National Essential Drugs List. For a month's treatment for peptic ulcer disease and hypertension people have to pay about a week's wages in the private sector. Conclusions The free market by definition does not control medicine prices, necessitating price monitoring and control mechanisms. Markups for generic products are greater than for IBs. Reducing the base price without controlling markups may increase profits for retailers and dispensing doctors without reducing the price paid by end users. To increase access and affordability, promotion of generic medicines and improved availability of medicines in the public sector are required. PMID:17388660

  20. Evaluating drug prices, availability, affordability, and price components: implications for access to drugs in Malaysia.

    PubMed

    Babar, Zaheer Ud Din; Ibrahim, Mohamed Izham Mohamed; Singh, Harpal; Bukahri, Nadeem Irfan; Creese, Andrew

    2007-03-27

    Malaysia's stable health care system is facing challenges with increasing medicine costs. To investigate these issues a survey was carried out to evaluate medicine prices, availability, affordability, and the structure of price components. The methodology developed by the World Health Organization (WHO) and Health Action International (HAI) was used. Price and availability data for 48 medicines was collected from 20 public sector facilities, 32 private sector retail pharmacies and 20 dispensing doctors in four geographical regions of West Malaysia. Medicine prices were compared with international reference prices (IRPs) to obtain a median price ratio. The daily wage of the lowest paid unskilled government worker was used to gauge the affordability of medicines. Price component data were collected throughout the supply chain, and markups, taxes, and other distribution costs were identified. In private pharmacies, innovator brand (IB) prices were 16 times higher than the IRPs, while generics were 6.6 times higher. In dispensing doctor clinics, the figures were 15 times higher for innovator brands and 7.5 for generics. Dispensing doctors applied high markups of 50%-76% for IBs, and up to 316% for generics. Retail pharmacy markups were also high-25%-38% and 100%-140% for IBs and generics, respectively. In the public sector, where medicines are free, availability was low even for medicines on the National Essential Drugs List. For a month's treatment for peptic ulcer disease and hypertension people have to pay about a week's wages in the private sector. The free market by definition does not control medicine prices, necessitating price monitoring and control mechanisms. Markups for generic products are greater than for IBs. Reducing the base price without controlling markups may increase profits for retailers and dispensing doctors without reducing the price paid by end users. To increase access and affordability, promotion of generic medicines and improved availability of medicines in the public sector are required.

  1. Importing MAGE-ML format microarray data into BioConductor.

    PubMed

    Durinck, Steffen; Allemeersch, Joke; Carey, Vincent J; Moreau, Yves; De Moor, Bart

    2004-12-12

    The microarray gene expression markup language (MAGE-ML) is a widely used XML (eXtensible Markup Language) standard for describing and exchanging information about microarray experiments. It can describe microarray designs, microarray experiment designs, gene expression data and data analysis results. We describe RMAGEML, a new Bioconductor package that provides a link between cDNA microarray data stored in MAGE-ML format and the Bioconductor framework for preprocessing, visualization and analysis of microarray experiments. http://www.bioconductor.org. Open Source.

  2. Chemical markup, XML, and the World Wide Web. 5. Applications of chemical metadata in RSS aggregators.

    PubMed

    Murray-Rust, Peter; Rzepa, Henry S; Williamson, Mark J; Willighagen, Egon L

    2004-01-01

    Examples of the use of the RSS 1.0 (RDF Site Summary) specification together with CML (Chemical Markup Language) to create a metadata-based alerting service for molecular content, termed CMLRSS, are presented. CMLRSS can be viewed either using generic software or with modular open-source chemical viewers and editors enhanced with CMLRSS modules. We discuss the more automated use of CMLRSS as a component of a World Wide Molecular Matrix of semantically rich chemical information.

  3. Computer-Aided Writing.

    DTIC Science & Technology

    1988-04-01

    ...(e.g., definitions, references, pictures) on the selected item in a separate window. For example, in a hypertext document on astronomy, the reader...might arrive at the highlighted word "Copernicus", select the word with the keyboard or mouse, and then be offered a number of related topics from

  4. Archive of Boomer and Chirp Seismic Reflection Data Collected During USGS Cruise 01RCE02, Southern Louisiana, April and May 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2003-01-01

    In April and May of 2001, the U.S. Geological Survey conducted a geophysical study of the Mississippi River Delta, Atchafalaya River Delta, and Shell Island Pass in southern Louisiana. This study was part of a larger USGS River Contaminant Evaluation (RCE) Project. This disc serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, shotpoint navigation maps, observers' logbooks, GIS information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a filtered and gained digital GIF-formatted image of each seismic profile is provided. For your convenience, a list of acronyms and abbreviations frequently used in this report has also been provided. This DVD (Digital Versatile Disc) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information by using a web browser (i.e. Netscape or Internet Explorer). To access the information contained on this disc, open the file 'index.htm' located at the top level of the disc using your web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the internet is available while viewing the DVD. The archived boomer and chirp seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes. Examples of SU processing scripts are provided in the boom.tar and chirp.tar files located in the SU subfolder of the SOFTWARE folder located at the top level of this DVD. In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zilhman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, shotpoint navigation maps, logbooks, and formal metadata may be viewed with your web browser.

  5. Operational Monitoring of Volcanoes Using Keyhole Markup Language

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Bailey, J. E.; Webley, P.

    2007-12-01

    Volcanoes are some of the most geologically powerful, dynamic, visually appealing structures on the Earth's landscape. Volcanic eruptions are hard to predict, difficult to quantify and impossible to prevent, making effective monitoring a difficult proposition. In Alaska, volcanoes are an intrinsic part of the culture, with over 100 volcanoes and volcanic fields that have been active in historic time monitored by the Alaska Volcano Observatory (AVO). Observations and research are performed using a suite of methods and tools in the fields of remote sensing, seismology, geodesy and geology, producing large volumes of geospatial data. Keyhole Markup Language (KML) offers a context in which these different, and in the past disparate, data can be displayed simultaneously. Dynamic links keep these data current, allowing them to be used in an operational capacity. KML is used to display information ranging from the aviation color codes and activity alert levels for volcanoes to locations of thermal anomalies, earthquake locations and ash plume modeling. The dynamic refresh and time primitive are used to display volcano webcam and satellite image overlays in near real-time. In addition, a virtual globe browser using KML, such as Google Earth, provides an interface to further information using the hyperlink, rich-text and Flash-embedding abilities supported within object description balloons. By merging these data sets in an easy-to-use interface, a virtual globe browser provides a better tool for scientists and emergency managers alike to mitigate volcanic crises.
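
    The near-real-time overlay pattern mentioned above (dynamic refresh of webcam and satellite imagery) is typically achieved with a KML NetworkLink that reloads on a fixed interval. The Python sketch below writes such a file; the feed URL is a placeholder, not an actual AVO service.

      import xml.etree.ElementTree as ET

      KML_NS = "http://www.opengis.net/kml/2.2"
      ET.register_namespace("", KML_NS)

      kml = ET.Element(f"{{{KML_NS}}}kml")
      doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
      net_link = ET.SubElement(doc, f"{{{KML_NS}}}NetworkLink")
      ET.SubElement(net_link, f"{{{KML_NS}}}name").text = "Volcano webcam overlay"
      link = ET.SubElement(net_link, f"{{{KML_NS}}}Link")
      ET.SubElement(link, f"{{{KML_NS}}}href").text = "https://example.org/avo/webcam.kml"
      ET.SubElement(link, f"{{{KML_NS}}}refreshMode").text = "onInterval"
      ET.SubElement(link, f"{{{KML_NS}}}refreshInterval").text = "300"  # seconds

      ET.ElementTree(kml).write("webcam_link.kml", xml_declaration=True, encoding="utf-8")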

  6. CytometryML: a markup language for analytical cytology

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Stephanie H.; Leif, Suzanne B.

    2003-06-01

    Cytometry Markup Language, CytometryML, is a proposed new analytical cytology data standard. CytometryML is a set of XML schemas for encoding both flow cytometry and digital microscopy text based data types. CytometryML schemas reference both DICOM (Digital Imaging and Communications in Medicine) codes and FCS keywords. These schemas provide representations for the keywords in FCS 3.0 and will soon include DICOM microscopic image data. Flow Cytometry Standard (FCS) list-mode has been mapped to the DICOM Waveform Information Object. A preliminary version of a list mode binary data type, which does not presently exist in DICOM, has been designed. This binary type is required to enhance the storage and transmission of flow cytometry and digital microscopy data. Index files based on Waveform indices will be used to rapidly locate the cells present in individual subsets. DICOM has the advantage of employing standard file types, TIF and JPEG, for Digital Microscopy. Using an XML schema based representation means that standard commercial software packages such as Excel and MathCad can be used to analyze, display, and store analytical cytometry data. Furthermore, by providing one standard for both DICOM data and analytical cytology data, it eliminates the need to create and maintain special purpose interfaces for analytical cytology data thereby integrating the data into the larger DICOM and other clinical communities. A draft version of CytometryML is available at www.newportinstruments.com.

  7. Ontology aided modeling of organic reaction mechanisms with flexible and fragment based XML markup procedures.

    PubMed

    Sankar, Punnaivanam; Aghila, Gnanasekaran

    2007-01-01

    Mechanism models are developed for primary organic reactions, encoding the structural fragments undergoing substitution, addition, elimination, and rearrangement. In the proposed models, every structural component of a mechanistic pathway is represented with a flexible, fragment-based markup technique in XML syntax. A significant feature of the system is the encoding of electron movements along with the other components, such as charges, partial charges, half-bonded species, lone-pair electrons, free radicals, and reaction arrows, needed for a complete representation of a reaction mechanism. The rendering of reaction schemes described with the proposed methodology is achieved with a concise XML extension language interoperating with the structure markup. Reaction schemes are visualized as 2D graphics in a browser by converting them into SVG documents, enabling the layouts conventionally used by chemists. Automatic representation of complex reaction mechanism patterns is achieved by reusing knowledge in chemical ontologies and developing artificial intelligence components in terms of axioms.
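
    A hedged sketch of the fragment-based idea described above: one mechanistic step encoded as reacting fragments plus explicit electron movements (the curved arrows). The element names are illustrative, not the authors' exact XML vocabulary, and the chemistry shown is a generic SN2-style step.

      import xml.etree.ElementTree as ET

      step = ET.Element("mechanismStep", type="nucleophilic-substitution")
      nucleophile = ET.SubElement(step, "fragment", id="f1", role="nucleophile")
      ET.SubElement(nucleophile, "atom", id="a1", element="O", charge="-1", lonePairs="3")
      substrate = ET.SubElement(step, "fragment", id="f2", role="substrate")
      ET.SubElement(substrate, "atom", id="a2", element="C")
      ET.SubElement(substrate, "atom", id="a3", element="Br")
      ET.SubElement(substrate, "bond", id="b1", atoms="a2 a3", order="1")
      # Curved arrows: the lone pair attacks the carbon; the C-Br bond pair leaves with Br.
      ET.SubElement(step, "electronMovement", source="a1", target="a2", electrons="2")
      ET.SubElement(step, "electronMovement", source="b1", target="a3", electrons="2")

      print(ET.tostring(step, encoding="unicode"))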

  8. Do state minimum markup/price laws work? Evidence from retail scanner data and TUS-CPS

    PubMed Central

    Huang, Jidong; Chriqui, Jamie F; DeLong, Hillary; Mirza, Maryam; Diaz, Megan C; Chaloupka, Frank J

    2016-01-01

    Background Minimum markup/price laws (MPLs) have been proposed as an alternative non-tax pricing strategy to reduce tobacco use and access. However, the empirical evidence on the effectiveness of MPLs in increasing cigarette prices is very limited. This study aims to fill this critical gap by examining the association between MPLs and cigarette prices. Methods State MPLs were compiled from primary legal research databases and were linked to cigarette prices constructed from the Nielsen retail scanner data and the self-reported cigarette prices from the Tobacco Use Supplement to the Current Population Survey. Multivariate regression analyses were conducted to examine the association between MPLs and the major components of MPLs and cigarette prices. Results The presence of MPLs was associated with higher cigarette prices. In addition, cigarette prices were higher, above and beyond the higher prices resulting from MPLs, in states that prohibit below-cost combination sales; do not allow any distributing party to use trade discounts to reduce the base cost of cigarettes; prohibit distributing parties from meeting the price of a competitor, and prohibit distributing below-cost coupons to the consumer. Moreover, states that had total markup rates >24% were associated with significantly higher cigarette prices. Conclusions MPLs are an effective way to increase cigarette prices. The impact of MPLs can be further strengthened by imposing greater markup rates and by prohibiting coupon distribution, competitor price matching, and use of below-cost combination sales and trade discounts. PMID:27697948

  9. XML — an opportunity for data standards in the geosciences

    NASA Astrophysics Data System (ADS)

    Houlding, Simon W.

    2001-08-01

    Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for development of metadata (markup) standards for information transfer in specific fields. XML allows development of markup languages that describe what information is rather than how it should be presented. This allows computer applications to process the information in intelligent ways. In contrast hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multi-media and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities and uses this as a platform for discussion of its potential for development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.
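
    In the spirit of the borehole-log demonstration mentioned above, the Python sketch below produces descriptive markup that records what the data are (depth intervals and lithology) rather than how to present them; the tag names are illustrative, not a published geoscience standard.

      import xml.etree.ElementTree as ET

      log = ET.Element("boreholeLog", hole="BH-07", datum="ground surface")
      for top, base, lithology in [("0.0", "3.2", "clay"),
                                   ("3.2", "11.5", "sand"),
                                   ("11.5", "18.0", "granite")]:
          ET.SubElement(log, "interval", topM=top, baseM=base, lithology=lithology)

      print(ET.tostring(log, encoding="unicode"))

      # A separate stylesheet could then render the same data as a table or a strip log.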

  10. Dealing Your Own Hands with Hypercard.

    ERIC Educational Resources Information Center

    Larsen, Mark D.

    1988-01-01

    Extensively reviews HyperCard, a multifaceted software package for the Macintosh. HyperCard uses a scripting language called HyperTalk, which was patterned after everyday language and designed to allow flexibility in the linking and manipulation of text, graphics, and sounds. Describes one use for HyperCard in an advanced course on Latin American…

  11. Hypertext Image Retrieval: The Evolution of an Application.

    ERIC Educational Resources Information Center

    Roberts, G. Louis; Kenney, Carol E.

    1991-01-01

    Describes the development and implementation of a full-text image retrieval system at the Boeing Commercial Airplane Group. The conversion of card formats to a microcomputer-based system using HyperCard is described; the online system architecture is explained; and future plans are discussed, including conversion to digital images. (LRW)

  12. Assessment of Cognitive Style to Examine Students' Use of Hypermedia within Historic Costume.

    ERIC Educational Resources Information Center

    Frey, Diane; Simonson, Michael

    1993-01-01

    Cognitive style of 70 students in fashion merchandising or education was compared to their choice of media in a hypermedia lesson on costume using HyperCard. Students used hypermedia effectively to accommodate preferred style; 56% chose visual, 30% text, and 14% auditory. (SK)

  13. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    PubMed

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical products information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, ETHER achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.

  14. XML in an Adaptive Framework for Instrument Control

    NASA Technical Reports Server (NTRS)

    Ames, Troy J.

    2004-01-01

    NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.
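
    The description-driven approach can be illustrated with a small Python sketch: a generic controller reads an IML-like instrument description and discovers the commands the instrument exposes. The element names and the instrument are simplified stand-ins, not the actual IML dialect.

      import xml.etree.ElementTree as ET

      IML = """<instrument name="ExampleSpectrometer">
        <command name="setFilter">
          <argument name="wheel" type="int"/>
          <argument name="position" type="int"/>
        </command>
        <telemetry name="detectorTemp" unit="K"/>
      </instrument>"""

      description = ET.fromstring(IML)
      for cmd in description.findall("command"):
          args = ", ".join(a.get("name") for a in cmd.findall("argument"))
          print(f"{description.get('name')}.{cmd.get('name')}({args})")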

  15. Experimental Applications of Automatic Test Markup Language (ATML)

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; McCartney, Patrick; Gorringe, Chris

    2012-01-01

    The authors describe challenging use-cases for Automatic Test Markup Language (ATML), and evaluate solutions. The first case uses ATML Test Results to deliver active features to support test procedure development and test flow, and bridging mixed software development environments. The second case examines adding attributes to Systems Modelling Language (SysML) to create a linkage for deriving information from a model to fill in an ATML document set. Both cases are outside the original concept of operations for ATML but are typical when integrating large heterogeneous systems with modular contributions from multiple disciplines.

  16. Pharmacometrics Markup Language (PharmML): Opening New Perspectives for Model Exchange in Drug Development.

    PubMed

    Swat, M J; Moodie, S; Wimalaratne, S M; Kristensen, N R; Lavielle, M; Mari, A; Magni, P; Smith, M K; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, A C; Kaye, R; Keizer, R; Kloft, C; Kok, J N; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, H B; Parra-Guillen, Z P; Plan, E; Ribba, B; Smith, G; Trocóniz, I F; Yvon, F; Milligan, P A; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N

    2015-06-01

    The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps.

  17. Pharmacometrics Markup Language (PharmML): Opening New Perspectives for Model Exchange in Drug Development

    PubMed Central

    Swat, MJ; Moodie, S; Wimalaratne, SM; Kristensen, NR; Lavielle, M; Mari, A; Magni, P; Smith, MK; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, AC; Kaye, R; Keizer, R; Kloft, C; Kok, JN; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, HB; Parra-Guillen, ZP; Plan, E; Ribba, B; Smith, G; Trocóniz, IF; Yvon, F; Milligan, PA; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N

    2015-01-01

    The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps. PMID:26225259

  18. Digitizing the Past: A History Book on CD-ROM.

    ERIC Educational Resources Information Center

    Rosenzweig, Roy

    1993-01-01

    Describes the development of an American history book with interactive CD-ROM technology that includes text, pictures, graphs and charts, audio, and film. Topics discussed include the use of HyperCard software to link information; access to primary sources of information; greater student control over learning; and the concept of collaborative…

  19. Guide to the Internet. The world wide web.

    PubMed Central

    Pallen, M.

    1995-01-01

    The world wide web provides a uniform, user friendly interface to the Internet. Web pages can contain text and pictures and are interconnected by hypertext links. The addresses of web pages are recorded as uniform resource locators (URLs), transmitted by hypertext transfer protocol (HTTP), and written in hypertext markup language (HTML). Programs that allow you to use the web are available for most operating systems. Powerful on line search engines make it relatively easy to find information on the web. Browsing through the web--"net surfing"--is both easy and enjoyable. Contributing to the web is not difficult, and the web opens up new possibilities for electronic publishing and electronic journals. PMID:8520402

  20. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.
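
    A minimal sketch of the kind of user-modifiable XML control file described above, together with how a framework might read it; the element names and paths are hypothetical, not the actual OUI schema.

      import xml.etree.ElementTree as ET

      CONTROL = """<oui_project name="watershed-demo">
        <model id="prms" executable="bin/prms" workdir="runs/prms"/>
        <data id="climate" path="data/climate.csv" type="timeseries"/>
      </oui_project>"""

      project = ET.fromstring(CONTROL)
      models = {m.get("id"): m.get("executable") for m in project.findall("model")}
      print(project.get("name"), models)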

  1. Impact of cigarette minimum price laws on the retail price of cigarettes in the USA.

    PubMed

    Tynan, Michael A; Ribisl, Kurt M; Loomis, Brett R

    2013-05-01

    Cigarette price increases prevent youth initiation, reduce cigarette consumption and increase the number of smokers who quit. Cigarette minimum price laws (MPLs), which typically require cigarette wholesalers and retailers to charge a minimum percentage mark-up for cigarette sales, have been identified as an intervention that can potentially increase cigarette prices. 24 states and the District of Columbia have cigarette MPLs. Using data extracted from SCANTRACK retail scanner data from the Nielsen company, average cigarette prices were calculated for designated market areas in states with and without MPLs in three retail channels: grocery stores, drug stores and convenience stores. Regression models were estimated using the average cigarette pack price in each designated market area and calendar quarter in 2009 as the outcome variable. The average difference in cigarette pack prices are 46 cents in the grocery channel, 29 cents in the drug channel and 13 cents in the convenience channel, with prices being lower in states with MPLs for all three channels. The findings that MPLs do not raise cigarette prices could be the result of a lack of compliance and enforcement by the state or could be attributed to the minimum state mark-up being lower than the free-market mark-up for cigarettes. Rather than require a minimum mark-up, which can be nullified by promotional incentives and discounts, states and countries could strengthen MPLs by setting a simple 'floor price' that is the true minimum price for all cigarettes or could prohibit discounts to consumers and retailers.

  2. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.

    2017-01-01

    Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569

  3. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
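    To give a feel for what quantitative dynamics data in XML means, the toy snippet below parses object positions over time. The element names (timepoint, object) are hypothetical stand-ins, not BDML's actual schema, which is defined by the schema file at the project site.

    ```python
    # Toy parser for BDML-style data: positions of tracked objects over time.
    # Element names here are hypothetical, not the actual BDML schema.
    import xml.etree.ElementTree as ET

    DOC = """
    <dynamics>
      <timepoint t="0.0">
        <object id="cell_1" x="1.2" y="3.4" z="0.0"/>
      </timepoint>
      <timepoint t="1.0">
        <object id="cell_1" x="1.5" y="3.1" z="0.2"/>
      </timepoint>
    </dynamics>
    """

    root = ET.fromstring(DOC)
    for tp in root.findall("timepoint"):
        t = float(tp.get("t"))
        for obj in tp.findall("object"):
            xyz = tuple(float(obj.get(axis)) for axis in ("x", "y", "z"))
            print(t, obj.get("id"), xyz)
    ```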

  4. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.

    2017-01-01

    Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564

  5. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.

  6. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  7. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  8. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  9. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  10. Collaborative Planning of Robotic Exploration

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Backes, Paul; Powell, Mark; Vona, Marsette; Steinke, Robert

    2004-01-01

    The Science Activity Planner (SAP) software system includes an uplink-planning component, which enables collaborative planning of activities to be undertaken by an exploratory robot on a remote planet or on Earth. Included in the uplink-planning component is the SAP-Uplink Browser, which enables users to load multiple spacecraft activity plans into a single window, compare them, and merge them. The uplink-planning component includes a subcomponent that implements the Rover Markup Language Activity Planning format (RML-AP), based on the Extensible Markup Language (XML) format that enables the representation, within a single document, of planned spacecraft and robotic activities together with the scientific reasons for the activities. Each such document is highly parseable and can be validated easily. Another subcomponent of the uplink-planning component is the Activity Dictionary Markup Language (ADML), which eliminates the need for two mission activity dictionaries - one in a human-readable format and one in a machine-readable format. Style sheets that have been developed along with the ADML format enable users to edit one dictionary in a user-friendly environment without compromising

  11. The carbohydrate sequence markup language (CabosML): an XML description of carbohydrate structures.

    PubMed

    Kikuchi, Norihiro; Kameyama, Akihiko; Nakaya, Shuuichi; Ito, Hiromi; Sato, Takashi; Shikanai, Toshihide; Takahashi, Yoriko; Narimatsu, Hisashi

    2005-04-15

    Bioinformatics resources for glycomics are very poor as compared with those for genomics and proteomics. The complexity of carbohydrate sequences makes it difficult to define a common language to represent them, and the development of bioinformatics tools for glycomics has not progressed. In this study, we developed a carbohydrate sequence markup language (CabosML), an XML description of carbohydrate structures. The language definition (XML Schema) and an experimental database of carbohydrate structures using an XML database management system are available at http://www.phoenix.hydra.mki.co.jp/CabosDemo.html. Contact: kikuchi@hydra.mki.co.jp.

  12. cluML: A markup language for clustering and cluster validity assessment of microarray data.

    PubMed

    Bolshakova, Nadia; Cunningham, Pádraig

    2005-01-01

    cluML is a new markup language for microarray data clustering and cluster validity assessment. The XML-based format has been designed to address some of the limitations observed in traditional formats, such as the inability to store multiple clustering (including biclustering) and validation results within a dataset. cluML is an effective tool to support biomedical knowledge representation in gene expression data analysis. Although cluML was developed for DNA microarray analysis applications, it can also be used effectively to represent clustering and validation results for other biomedical and physical data.

  13. Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.

    PubMed

    Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J

    2015-08-21

    In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).

  14. Getting a Jump on the Future: Everything You'll Ever Need to Know about Multimedia Authoring Tools.

    ERIC Educational Resources Information Center

    D'Ignazio, Fred

    1992-01-01

    Discusses issues involved with buying and using multimedia authoring programs. Six programs are compared: (1) MediaText, (2) HyperCard, (3) LinkWay Live!, (4) AmigaVision, (5) Director, and (6) Multimedia Desktop. Highlights include the use of multimedia in education, sequential versus hierarchical organization, price, system requirements, digital…

  15. Do state minimum markup/price laws work? Evidence from retail scanner data and TUS-CPS.

    PubMed

    Huang, Jidong; Chriqui, Jamie F; DeLong, Hillary; Mirza, Maryam; Diaz, Megan C; Chaloupka, Frank J

    2016-10-01

    Minimum markup/price laws (MPLs) have been proposed as an alternative non-tax pricing strategy to reduce tobacco use and access. However, the empirical evidence on the effectiveness of MPLs in increasing cigarette prices is very limited. This study aims to fill this critical gap by examining the association between MPLs and cigarette prices. State MPLs were compiled from primary legal research databases and were linked to cigarette prices constructed from the Nielsen retail scanner data and the self-reported cigarette prices from the Tobacco Use Supplement to the Current Population Survey. Multivariate regression analyses were conducted to examine the association between MPLs and the major components of MPLs and cigarette prices. The presence of MPLs was associated with higher cigarette prices. In addition, cigarette prices were higher, above and beyond the higher prices resulting from MPLs, in states that prohibit below-cost combination sales; do not allow any distributing party to use trade discounts to reduce the base cost of cigarettes; prohibit distributing parties from meeting the price of a competitor, and prohibit distributing below-cost coupons to the consumer. Moreover, states that had total markup rates >24% were associated with significantly higher cigarette prices. MPLs are an effective way to increase cigarette prices. The impact of MPLs can be further strengthened by imposing greater markup rates and by prohibiting coupon distribution, competitor price matching, and use of below-cost combination sales and trade discounts. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  16. Facilitating NCAR Data Discovery by Connecting Related Resources

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2012-12-01

    Linking datasets, creators, and users by employing the proper standards helps to increase the impact of funded research. In order for users to find a dataset, it must first be named. Data citations play the important role of giving datasets a persistent presence by assigning a formal "name" and location. This project focuses on the next step of the "name-find-use" sequence: enhancing discoverability of NCAR data by connecting related resources on the web. By examining metadata schemas that document datasets, I examined how Semantic Web approaches can help to ensure the widest possible range of data users. The focus was to move from search engine optimization (SEO) to information connectivity. Two main markup types are very visible in the Semantic Web and applicable to scientific dataset discovery: The Open Archives Initiative-Object Reuse and Exchange (OAI-ORE - www.openarchives.org) and Microdata (HTML5 and www.schema.org). My project creates pilot aggregations of related resources using both markup types for three case studies: The North American Regional Climate Change Assessment Program (NARCCAP) dataset and related publications, the Palmer Drought Severity Index (PSDI) animation and image files from NCAR's Visualization Lab (VisLab), and the multidisciplinary data types and formats from the Advanced Cooperative Arctic Data and Information Service (ACADIS). This project documents the differences between these markups and how each creates connectedness on the web. My recommendations point toward the most efficient and effective markup schema for aggregating resources within the three case studies based on the following assessment criteria: ease of use, current state of support and adoption of technology, integration with typical web tools, available vocabularies and geoinformatic standards, interoperability with current repositories and access portals (e.g. ESG, Java), and relation to data citation tools and methods.
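    As a rough illustration of the Microdata approach mentioned above (not taken from the case studies themselves), the snippet below emits a schema.org Dataset description using the standard HTML5 itemscope/itemtype/itemprop attributes; the dataset details are placeholders.

    ```python
    # Emit a schema.org Dataset description as HTML5 Microdata.
    # The dataset name, description and URL below are placeholders.
    from html import escape

    def dataset_microdata(name: str, description: str, url: str) -> str:
        return (
            '<div itemscope itemtype="http://schema.org/Dataset">\n'
            f'  <span itemprop="name">{escape(name)}</span>\n'
            f'  <span itemprop="description">{escape(description)}</span>\n'
            f'  <a itemprop="url" href="{escape(url)}">landing page</a>\n'
            '</div>'
        )

    print(dataset_microdata(
        "NARCCAP regional climate simulations",
        "Regional climate change scenario output for North America.",
        "https://example.org/narccap",
    ))
    ```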

  17. The Educator's Guide to HyperCard and HyperTalk. A Longwood Professional Book.

    ERIC Educational Resources Information Center

    Culp, George H.; Watkins, G. Morgan

    This book and three accompanying floppy disks introduce HyperCard 2.1 for the Macintosh microcomputer and its programming component, HyperTalk, to educators. The first four chapters introduce the basics of HyperCard, including its structure, which is based on a hierarchy of units; the use of tools and graphics; and ways of linking information…

  18. Probable alpha and 14C cluster emission from hyper Ac nuclei

    NASA Astrophysics Data System (ADS)

    Santhosh, K. P.

    2013-10-01

    A systematic study on the probability for the emission of 4He and 14C clusters from hyper {Λ/207-234}Ac and non-strange normal 207-234Ac nuclei is performed for the first time using our fission model, the Coulomb and proximity potential model (CPPM). The predicted half-lives show that hyper {Λ/207-234}Ac nuclei are unstable against 4He emission, and 14C emission from hyper {Λ/217-228}Ac is favorable for measurement. Our study also shows that hyper {Λ/207-234}Ac are stable against hyper {Λ/4}He and {Λ/14}C emission. The role of the neutron shell closure (N = 126) in the hyper {Λ/214}Fr daughter and the role of the proton/neutron shell closure (Z ≈ 82, N = 126) in the hyper {Λ/210}Bi daughter are also revealed. As hyper-nuclei decay to normal nuclei by mesonic/non-mesonic decay, and since most of the predicted half-lives for 4He and 14C emission from normal Ac nuclei are favorable for measurement, we presume that alpha and 14C cluster emission from hyper Ac nuclei can be detected in the laboratory in a cascade (two-step) process.

  19. Differential responses of targeted lung redox enzymes to rat exposure to 60 or 85% oxygen

    PubMed Central

    Gan, Zhuohui; Roerig, David L.; Clough, Anne V.

    2011-01-01

    Rat exposure to 60% O2 (hyper-60) or 85% O2 (hyper-85) for 7 days confers susceptibility or tolerance, respectively, of the otherwise lethal effects of exposure to 100% O2. The objective of this study was to determine whether activities of the antioxidant cytosolic enzyme NAD(P)H:quinone oxidoreductase 1 (NQO1) and mitochondrial complex III are differentially altered in hyper-60 and hyper-85 lungs. Duroquinone (DQ), an NQO1 substrate, or its hydroquinone (DQH2), a complex III substrate, was infused into the arterial inflow of isolated, perfused lungs, and the venous efflux rates of DQH2 and DQ were measured. Based on inhibitor effects and kinetic modeling, capacities of NQO1-mediated DQ reduction (Vmax1) and complex III-mediated DQH2 oxidation (Vmax2) increased by ∼140 and ∼180% in hyper-85 lungs, respectively, compared with rates in lungs of rats exposed to room air (normoxic). In hyper-60 lungs, Vmax1 increased by ∼80%, with no effect on Vmax2. Additional studies revealed that mitochondrial complex I activity in hyper-60 and hyper-85 lung tissue homogenates was ∼50% lower than in normoxic lung homogenates, whereas mitochondrial complex IV activity was ∼90% higher in only hyper-85 lung tissue homogenates. Thus NQO1 activity increased in both hyper-60 and hyper-85 lungs, whereas complex III activity increased in hyper-85 lungs only. This increase, along with the increase in complex IV activity, may counter the effects the depression in complex I activity might have on tissue mitochondrial function and/or reactive oxygen species production and may be important to the tolerance of 100% O2 observed in hyper-85 rats. PMID:21551015

  20. Differential responses of targeted lung redox enzymes to rat exposure to 60 or 85% oxygen.

    PubMed

    Gan, Zhuohui; Roerig, David L; Clough, Anne V; Audi, Said H

    2011-07-01

    Rat exposure to 60% O(2) (hyper-60) or 85% O(2) (hyper-85) for 7 days confers susceptibility or tolerance, respectively, of the otherwise lethal effects of exposure to 100% O(2). The objective of this study was to determine whether activities of the antioxidant cytosolic enzyme NAD(P)H:quinone oxidoreductase 1 (NQO1) and mitochondrial complex III are differentially altered in hyper-60 and hyper-85 lungs. Duroquinone (DQ), an NQO1 substrate, or its hydroquinone (DQH(2)), a complex III substrate, was infused into the arterial inflow of isolated, perfused lungs, and the venous efflux rates of DQH(2) and DQ were measured. Based on inhibitor effects and kinetic modeling, capacities of NQO1-mediated DQ reduction (V(max1)) and complex III-mediated DQH(2) oxidation (V(max2)) increased by ∼140 and ∼180% in hyper-85 lungs, respectively, compared with rates in lungs of rats exposed to room air (normoxic). In hyper-60 lungs, V(max1) increased by ∼80%, with no effect on V(max2). Additional studies revealed that mitochondrial complex I activity in hyper-60 and hyper-85 lung tissue homogenates was ∼50% lower than in normoxic lung homogenates, whereas mitochondrial complex IV activity was ∼90% higher in only hyper-85 lung tissue homogenates. Thus NQO1 activity increased in both hyper-60 and hyper-85 lungs, whereas complex III activity increased in hyper-85 lungs only. This increase, along with the increase in complex IV activity, may counter the effects the depression in complex I activity might have on tissue mitochondrial function and/or reactive oxygen species production and may be important to the tolerance of 100% O(2) observed in hyper-85 rats.

  1. Web portal for dynamic creation and publication of teaching materials in multiple formats from a single source representation

    NASA Astrophysics Data System (ADS)

    Roganov, E. A.; Roganova, N. A.; Aleksandrov, A. I.; Ukolova, A. V.

    2017-01-01

    We implement a web portal that dynamically creates documents in more than 30 different formats, including HTML, PDF and DOCX, from a single original source. It is built using a number of free software tools, such as Markdown (markup language), Pandoc (document converter), MathJax (a library for displaying mathematical notation in web browsers) and the Ruby on Rails framework. The portal enables the creation of documents with high-quality visualization of mathematical formulas, is compatible with mobile devices and allows one to search documents by text or formula fragments. Moreover, it gives professors the ability to develop educational materials with the latest technology, without the assistance of qualified technicians, thus improving the quality of the whole educational process.
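    A single-source workflow of this kind can be driven by a few lines of script around the pandoc command line. The sketch below assumes pandoc (and a LaTeX engine for PDF output) is installed and converts one Markdown file to three of the supported formats; file names are placeholders.

    ```python
    # Convert one Markdown source to several output formats with pandoc.
    # Assumes pandoc is installed (and a LaTeX engine for the PDF target).
    import subprocess

    SOURCE = "lecture.md"  # placeholder file name

    outputs = {
        "lecture.html": ["--mathjax"],  # render formulas with MathJax in the browser
        "lecture.pdf": [],
        "lecture.docx": [],
    }

    for out, extra in outputs.items():
        subprocess.run(["pandoc", SOURCE, *extra, "-o", out], check=True)
        print("wrote", out)
    ```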

  2. A Converter from the Systems Biology Markup Language to the Synthetic Biology Open Language.

    PubMed

    Nguyen, Tramy; Roehner, Nicholas; Zundel, Zach; Myers, Chris J

    2016-06-17

    Standards are important to synthetic biology because they enable exchange and reproducibility of genetic designs. This paper describes a procedure for converting between two standards: the Systems Biology Markup Language (SBML) and the Synthetic Biology Open Language (SBOL). SBML is a standard for behavioral models of biological systems at the molecular level. SBOL describes structural and basic qualitative behavioral aspects of a biological design. Converting SBML to SBOL enables a consistent connection between behavioral and structural information for a biological design. The conversion process described in this paper leverages Systems Biology Ontology (SBO) annotations to enable inference of a designs qualitative function.

  3. The Surgical Simulation and Training Markup Language (SSTML): an XML-based language for medical simulation.

    PubMed

    Bacon, James; Tardella, Neil; Pratt, Janey; Hu, John; English, James

    2006-01-01

    Under contract with the Telemedicine & Advanced Technology Research Center (TATRC), Energid Technologies is developing a new XML-based language for describing surgical training exercises, the Surgical Simulation and Training Markup Language (SSTML). SSTML must represent everything from organ models (including tissue properties) to surgical procedures. SSTML is an open language (i.e., freely downloadable) that defines surgical training data through an XML schema. This article focuses on the data representation of the surgical procedures and organ modeling, as they highlight the need for a standard language and illustrate the features of SSTML. Integration of SSTML with software is also discussed.

  4. Field Markup Language: biological field representation in XML.

    PubMed

    Chang, David; Lovell, Nigel H; Dokos, Socrates

    2007-01-01

    With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.

  5. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2018-03-09

    Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.

  6. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)

    PubMed Central

    Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain. PMID:29202125
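    The confidence bounds that PMML 4.3 can now carry correspond to the predictive standard deviation a fitted GPR model returns. The snippet below (scikit-learn, not PMML) is a small illustration of those quantities on synthetic data.

    ```python
    # Illustration (scikit-learn, not PMML) of the GPR outputs that PMML 4.3
    # can represent: a predictive mean and a standard deviation from which
    # confidence bounds are formed. Data are synthetic.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    X = np.linspace(0, 10, 20).reshape(-1, 1)
    y = np.sin(X).ravel() + 0.1 * np.random.default_rng(0).standard_normal(20)

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.01)
    gpr.fit(X, y)

    mean, std = gpr.predict(np.array([[2.5], [7.5]]), return_std=True)
    lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% confidence bounds
    print(mean, lower, upper)
    ```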

  7. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).

    PubMed

    Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.

  8. Extreme Markup: The Fifty US Hospitals With The Highest Charge-To-Cost Ratios.

    PubMed

    Bai, Ge; Anderson, Gerard F

    2015-06-01

    Using Medicare cost reports, we examined the fifty US hospitals with the highest charge-to-cost ratios in 2012. These hospitals have markups (ratios of charges over Medicare-allowable costs) approximately ten times their Medicare-allowable costs compared to a national average of 3.4 and a mode of 2.4. Analysis of the fifty hospitals showed that forty-nine are for profit (98 percent), forty-six are owned by for-profit hospital systems (92 percent), and twenty (40 percent) operate in Florida. One for-profit hospital system owns half of these fifty hospitals. While most public and private health insurers do not use hospital charges to set their payment rates, uninsured patients are commonly asked to pay the full charges, and out-of-network patients and casualty and workers' compensation insurers are often expected to pay a large portion of the full charges. Because it is difficult for patients to compare prices, market forces fail to constrain hospital charges. Federal and state governments may want to consider limitations on the charge-to-cost ratio, some form of all-payer rate setting, or mandated price disclosure to regulate hospital markups. Project HOPE—The People-to-People Health Foundation, Inc.

  9. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
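    A rough sketch of what such a file contains is given below: a model reference, a time-course simulation, and a task tying them together, parsed with the Python standard library. The element names follow the SED-ML specification, but the namespace URI is assumed from the usual Level/Version pattern and should be checked against the specification.

    ```python
    # Minimal SED-ML-style document parsed with the standard library.
    # The namespace URI is assumed from the usual Level/Version pattern.
    import xml.etree.ElementTree as ET

    SEDML = """
    <sedML xmlns="http://sed-ml.org/sed-ml/level1/version2" level="1" version="2">
      <listOfModels>
        <model id="m1" language="urn:sedml:language:sbml" source="model.xml"/>
      </listOfModels>
      <listOfSimulations>
        <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                           outputEndTime="10" numberOfPoints="100"/>
      </listOfSimulations>
      <listOfTasks>
        <task id="t1" modelReference="m1" simulationReference="sim1"/>
      </listOfTasks>
    </sedML>
    """

    ns = {"s": "http://sed-ml.org/sed-ml/level1/version2"}
    root = ET.fromstring(SEDML)
    for task in root.findall(".//s:task", ns):
        print(task.get("id"), "runs", task.get("simulationReference"),
              "on model", task.get("modelReference"))
    ```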

  10. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-06-01

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.

  11. Pricing and components analysis of some key essential pediatric medicine in Odisha state.

    PubMed

    Samal, Satyajit; Swain, Trupti Rekha

    2017-01-01

    A study highlighting the prices that patients actually pay at the ground level is important for interventions such as alternate procurement schemes or for expediting regulatory assessment of essential medicines for children. The present study was undertaken to analyze the pricing and price components of a few key essential medicines in Odisha state. Six child-specific medicines of different formulations were selected based on their use in different disease conditions and on having the widest price variation. Data were collected, entered, and analyzed in the price-components data collection form of the World Health Organization-Health Action International (WHO-HAI) 2007 Workbook version 5 - Part II, provided as part of the WHO/HAI methodology. The analysis includes the cumulative percent markup, the total cumulative percent markup, and the percent contribution of individual components to the final medicine price in both the public and private sectors of Odisha state. Add-on costs such as taxes and wholesale and retail markups contribute substantially to the final price of medicines in the private sector, particularly for branded-generic products. The largest contribution to add-on costs occurs at the retail level. Policy should be framed to achieve greater transparency and uniformity in the pricing of medicines across the different health sectors of Odisha.
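    The price-component analysis is, at its core, a chain of percentage markups applied to a manufacturer's price. The short worked example below shows how a cumulative percent markup is computed; the stage names and rates are illustrative, not figures from the study.

    ```python
    # Worked example of a cumulative percent markup along a supply chain.
    # Stage names and rates are illustrative, not data from the study.
    stages = [
        ("tax",              0.05),
        ("wholesale markup", 0.10),
        ("retail markup",    0.25),
    ]

    price = 100.0  # manufacturer price, in arbitrary currency units
    for name, rate in stages:
        price *= 1 + rate
        print(f"{name:17s} -> {price:7.2f}")

    print(f"total cumulative markup: {(price / 100.0 - 1) * 100:.1f}%")  # ~44.4%
    ```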

  12. Helping Students Design HyperCard Stacks.

    ERIC Educational Resources Information Center

    Dunham, Ken

    1995-01-01

    Discusses how to teach students to design HyperCard stacks. Highlights include introducing HyperCard, developing storyboards, introducing design concepts and scripts, presenting stacks, evaluating storyboards, and continuing projects. A sidebar presents a HyperCard stack evaluation form. (AEF)

  13. Engineering software development with HyperCard

    NASA Technical Reports Server (NTRS)

    Darko, Robert J.

    1990-01-01

    The successful and unsuccessful techniques used in the development of software using HyperCard are described. The viability of the HyperCard for engineering is evaluated and the future use of HyperCard by this particular group of developers is discussed.

  14. Genetics Home Reference: X-linked hyper IgM syndrome

    MedlinePlus

    Description: X-linked hyper IgM syndrome is a condition that ...

  15. Icon Images in HyperCard: An Exploration of Visual Concepts with Middle School Students.

    ERIC Educational Resources Information Center

    Philleo, Tom

    The purpose of this project was to investigate, in an informal and exploratory manner, the reactions of middle school students to unfamiliar symbols used as computer screen icons. The project focused on discovering a means to address the following issues: (1) the appearance of buttons containing text compared to those with graphics; (2) the…

  16. The Memory Stack: New Technologies Harness Talking for Writing.

    ERIC Educational Resources Information Center

    Gannon, Maureen T.

    In this paper, an elementary school teacher describes her experiences with the Memory Stack--a HyperCard based tool that can accommodate a voice recording, a graphic image, and a written text on the same card--which she designed to help her second and third grade students integrate their oral language fluency into the process of learning how to…

  17. Where's Your Thesis Statement and What Happened to Your Topic Sentences? Identifying Organizational Challenges in Undergraduate Student Argumentative Writing

    ERIC Educational Resources Information Center

    Miller, Ryan T.; Pessoa, Silvia

    2016-01-01

    The authors examine the challenges students faced in trying to write organized texts using effective thesis statements and topic sentences by analyzing argumentative history essays written by multilingual students enrolled in an undergraduate history course. They use the notions of macro-Theme (i.e., thesis statement) and hyper-Theme (i.e., topic…

  18. Educational Implications of Psychopathology for Brain-Injured Children; Lesley College Annual Graduate Symposium (3rd, Cambridge, Massachusetts, May 13, 1967).

    ERIC Educational Resources Information Center

    Gertz, Boris, Ed.

    The symposium report includes the text of an illustrated lecture given by William M. Cruickshank on "Psychopathology and Implications for Educating Brain-Injured Children." Considered in the lecture are hyperactivity, the needs of hyperactive children, and educational setting and curriculum. Panel reactions are provided by E.F. Rabe, a pediatric…

  19. An Examination of Undergraduate Student's Perceptions and Predilections of the Use of YouTube in the Teaching and Learning Process

    ERIC Educational Resources Information Center

    Buzzetto-More, Nicole A.

    2014-01-01

    Pervasive social networking and media sharing technologies have augmented perceptual understanding and information gathering and, while text-based resources have remained the standard for centuries, they do not appeal to the hyper-stimulated visual learners of today. In particular, the research suggests that targeted YouTube videos enhance student…

  20. HyperCLIPS: A HyperCard interface to CLIPS

    NASA Technical Reports Server (NTRS)

    Pickering, Brad; Hill, Randall W., Jr.

    1990-01-01

    HyperCLIPS combines the intuitive, interactive user interface of the Apple Macintosh(TM) with the powerful symbolic computation of an expert system interpreter. HyperCard(TM) is an excellent environment for quickly developing the front end of an application with buttons, dialogs, and pictures, while the CLIPS interpreter provides a powerful inference engine for complex problem solving and analysis. By integrating HyperCard and CLIPS, the advantages of both packages are made available for a wide range of uses: rapid prototyping of knowledge-based expert systems, interactive simulations of physical systems, and intelligent control of hypertext processes, to name a few. Interfacing HyperCard and CLIPS is natural. HyperCard was designed to be extended through the use of external commands (XCMDs), and CLIPS was designed to be embedded through the use of the I/O router facilities and callable interface routines. With the exception of some technical difficulties which will be discussed later, HyperCLIPS implements this interface in a straightforward manner, using the facilities provided. An XCMD called 'ClipsX' was added to HyperCard to give access to the CLIPS routines clear, load, reset, and run, and an I/O router was added to CLIPS to handle the communication of data between CLIPS and HyperCard.

  1. Hyperunified field theory and gravitational gauge-geometry duality

    NASA Astrophysics Data System (ADS)

    Wu, Yue-Liang

    2018-01-01

    A hyperunified field theory is built in detail based on the postulates of gauge invariance and coordinate independence along with the conformal scaling symmetry. All elementary particles are merged into a single hyper-spinor field and all basic forces are unified into a fundamental interaction governed by the hyper-spin gauge symmetry SP(1, D_h-1). The dimension D_h of hyper-spacetime is conjectured to have a physical origin in correlation with the hyper-spin charge of elementary particles. The hyper-gravifield fiber bundle structure of biframe hyper-spacetime appears naturally with the globally flat Minkowski hyper-spacetime as a base spacetime and the locally flat hyper-gravifield spacetime as a fiber that is viewed as a dynamically emerged hyper-spacetime characterized by a non-commutative geometry. The gravitational origin of gauge symmetry is revealed with the hyper-gravifield that plays an essential role as a Goldstone-like field. The gauge-gravity and gravity-geometry correspondences bring about the gravitational gauge-geometry duality. The basic properties of hyperunified field theory and the issue on the fundamental scale are analyzed within the framework of quantum field theory, which allows us to describe the laws of nature in deriving the gauge gravitational equation with the conserved current and the geometric gravitational equations of Einstein-like type and beyond.

  2. [Study of sharing platform of web-based enhanced extracorporeal counterpulsation hemodynamic waveform data].

    PubMed

    Huang, Mingbo; Hu, Ding; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian

    2011-12-01

    Enhanced extracorporeal counterpulsation (EECP) information consists of both text and hemodynamic waveform data. At present, EECP text information has been successfully managed through web browsers, while the management and sharing of hemodynamic waveform data over the Internet has not yet been solved. In order to manage EECP information completely, and based on an in-depth analysis of the EECP hemodynamic waveform file in Digital Imaging and Communications in Medicine (DICOM) format and its disadvantages for Internet sharing, we proposed using the Extensible Markup Language (XML), a popular data exchange standard on the Internet, as the storage specification for sharing EECP waveform data. We then designed a web-based sharing system for EECP hemodynamic waveform data on the ASP.NET 2.0 platform. We also describe the four main system function modules and their implementation: the DICOM-to-XML conversion module, the EECP waveform data management module, the EECP waveform retrieval and display module, and the security mechanism of the system.
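    The DICOM-to-XML conversion step described above can be approximated with pydicom and the standard XML library; the sketch below dumps a few data elements of a DICOM file to an XML record. The file path, selected tags and XML element names are illustrative, not the authors' schema.

    ```python
    # Rough sketch of a DICOM-to-XML conversion step. Requires pydicom.
    # The file path, tag selection and XML element names are illustrative.
    import xml.etree.ElementTree as ET
    import pydicom

    ds = pydicom.dcmread("eecp_waveform.dcm")  # placeholder path

    root = ET.Element("eecpRecord")
    for keyword in ("PatientID", "StudyDate", "Modality"):
        value = getattr(ds, keyword, None)
        if value is not None:
            ET.SubElement(root, keyword).text = str(value)

    ET.ElementTree(root).write("eecp_waveform.xml", encoding="utf-8",
                               xml_declaration=True)
    ```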

  3. A Java viewer to publish Digital Imaging and Communications in Medicine (DICOM) radiologic images on the World Wide Web.

    PubMed

    Setti, E; Musumeci, R

    2001-06-01

    The World Wide Web is an exciting service that allows one to publish electronic documents made of text and images on the internet. Client software called a web browser can access these documents, and display and print them. The most popular browsers are currently Microsoft Internet Explorer (Microsoft, Redmond, WA) and Netscape Communicator (Netscape Communications, Mountain View, CA). These browsers can display text in hypertext markup language (HTML) format and images in Joint Photographic Expert Group (JPEG) and Graphic Interchange Format (GIF). Currently, neither browser can display radiologic images in native Digital Imaging and Communications in Medicine (DICOM) format. With the aim of publishing radiologic images on the internet, we wrote a dedicated Java applet. Our software can display radiologic and histologic images in DICOM, JPEG, and GIF formats, and provides a number of functions such as windowing and a magnification lens. The applet is compatible with some web browsers, even older versions. The software is free and available from the author.

  4. Development of a Google-based search engine for data mining radiology reports.

    PubMed

    Erinjeri, Joseph P; Picus, Daniel; Prior, Fred W; Rubin, David A; Koppel, Paul

    2009-08-01

    The aim of this study is to develop a secure, Google-based data-mining tool for radiology reports using free and open source technologies and to explore its use within an academic radiology department. A Health Insurance Portability and Accountability Act (HIPAA)-compliant data repository, search engine and user interface were created to facilitate treatment, operations, and reviews preparatory to research. The Institutional Review Board waived review of the project, and informed consent was not required. Comprising 7.9 GB of disk space, 2.9 million text reports were downloaded from our radiology information system to a fileserver. Extensible markup language (XML) representations of the reports were indexed using Google Desktop Enterprise search engine software. A hypertext markup language (HTML) form allowed users to submit queries to Google Desktop, and Google's XML response was interpreted by a practical extraction and report language (PERL) script, presenting ranked results in a web browser window. The query, reason for search, results, and documents visited were logged to maintain HIPAA compliance. Indexing averaged approximately 25,000 reports per hour. Keyword search of a common term like "pneumothorax" yielded the first ten most relevant results of 705,550 total results in 1.36 s. Keyword search of a rare term like "hemangioendothelioma" yielded the first ten most relevant results of 167 total results in 0.23 s; retrieval of all 167 results took 0.26 s. Data mining tools for radiology reports will improve the productivity of academic radiologists in clinical, educational, research, and administrative tasks. By leveraging existing knowledge of Google's interface, radiologists can quickly perform useful searches.
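    As a small illustration of the audit-logging step described above (query, reason, and result count recorded for HIPAA review), the sketch below appends one row per search to a log file; the file name and field layout are made up for illustration.

    ```python
    # Append one audit row per search: timestamp, user, query, reason, results.
    # The log file name and field layout are illustrative only.
    import csv
    import datetime

    def log_query(user: str, query: str, reason: str, n_results: int,
                  logfile: str = "search_audit.csv") -> None:
        with open(logfile, "a", newline="") as fh:
            csv.writer(fh).writerow([datetime.datetime.now().isoformat(),
                                     user, query, reason, n_results])

    log_query("jdoe", "pneumothorax", "operations review", 705550)
    ```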

  5. The HyperLeda project en route to the astronomical virtual observatory

    NASA Astrophysics Data System (ADS)

    Golev, V.; Georgiev, V.; Prugniel, Ph.

    2002-07-01

    HyperLeda (Hyper-Linked Extragalactic Databases and Archives) aims to study the evolution of galaxies, their kinematics and stellar populations, and the structure of the Local Universe. HyperLeda is involved in catalogue and software production, data-mining and massive data processing. The products are served to the community through web mirrors. The development of HyperLeda is distributed among different sites and is based on the background experience of the LEDA and Hypercat databases. The HyperLeda project is focused both on the European iAstro collaboration and on serving as a unique database for studies of the physics of extragalactic objects.

  6. Web-based X-ray quality control documentation.

    PubMed

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has not been to password protect the page, which we feared would hinder access for occasional legitimate users, but also not to provide links to it from other hospital and department pages. Utility and productivity were improved and time and money were saved by making radiological equipment quality control documentation instantly available on-line.

  7. Impact of styrenic polymer one-step hyper-cross-linking on volatile organic compound adsorption and desorption performance.

    PubMed

    Ghafari, Mohsen; Atkinson, John D

    2018-06-05

    A novel one-step hyper-cross-linking method, using 1,2-dichloroethane (DCE) and 1,6-dichlorohexane (DCH) cross-linkers, expands the micropore volume of commercial styrenic polymers. Performance of virgin and modified polymers was evaluated by measuring hexane, toluene, and methyl-ethyl-ketone (MEK) adsorption capacity, adsorption/desorption kinetics, and desorption efficiency. Hyper-cross-linked polymers have up to 128% higher adsorption capacity than virgin polymers at P/P0 = 0.05 due to micropore volume increases up to 330%. Improvements are most pronounced with the DCE cross-linker. Hyper-cross-linking has minimal impact on hexane adsorption kinetics, but adsorption rates for toluene and MEK decrease by 6-41%. Desorption rates decreased (3-36%) for all materials after hyper-cross-linking, with larger decreases for DCE hyper-cross-linked polymers due to smaller average pore widths. For room temperature desorption, 20-220% more adsorbate remains in hyper-cross-linked polymers after regeneration compared to virgin materials. DCE hyper-cross-linked polymers have 13-92% more residual adsorbate than DCH counterparts. Higher temperatures were required for DCE hyper-cross-linked polymers to completely desorb VOCs compared to the DCH hyper-cross-linked and virgin counterparts. Results show that the one-step hyper-cross-linking method for modifying styrenic polymers improves adsorption capacity because of added micropores, but decreases adsorption/desorption kinetics and desorption efficiency for large VOCs due to a decrease in average pore width. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. HyperCard--A Science Teaching Tool.

    ERIC Educational Resources Information Center

    Parker, Carol

    1992-01-01

    Discussion of new technological resources available for science instruction focuses on the use of the HyperCard software for the Macintosh to design customized materials. Topics addressed include general features of HyperCard, designing HyperCard stacks, graphics, and designing buttons (i.e., links for moving through the stacks). Several sample…

  9. Optimality of profit-including prices under ideal planning.

    PubMed

    Samuelson, P A

    1973-07-01

    Although prices calculated by a constant percentage markup on all costs (nonlabor as well as direct-labor) are usually admitted to be more realistic for a competitive capitalistic model, the view is often expressed that, for optimal planning purposes, the "values" model of Marx's Capital, Volume I, is to be preferred. It is shown here that an optimal-control model that maximizes discounted social utility of consumption per capita and that ultimately approaches a steady state must ultimately have optimal pricing that involves equal rates of steady-state profit in all industries; and such optimal pricing will necessarily deviate from Marx's model of equal rates of surplus value (markups on direct-labor only) in all industries.
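    Schematically, the two pricing rules being contrasted can be written as follows; the notation is illustrative, not taken from the paper, with per-unit direct-labor cost and non-labor cost for each good i.

    ```latex
    % Schematic contrast of the two pricing rules (illustrative notation):
    % w\ell_i = direct-labor cost, c_i = non-labor cost per unit of good i.
    \begin{align*}
      \text{equal profit rate on all costs:}\qquad
        p_i &= (1+r)\,(w\ell_i + c_i) \\[4pt]
      \text{equal surplus-value markup on direct labor only:}\qquad
        p_i &= (1+s)\,w\ell_i + c_i
    \end{align*}
    ```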

  10. Optimality of Profit-Including Prices Under Ideal Planning

    PubMed Central

    Samuelson, Paul A.

    1973-01-01

    Although prices calculated by a constant percentage markup on all costs (nonlabor as well as direct-labor) are usually admitted to be more realistic for a competitive capitalistic model, the view is often expressed that, for optimal planning purposes, the “values” model of Marx's Capital, Volume I, is to be preferred. It is shown here that an optimal-control model that maximizes discounted social utility of consumption per capita and that ultimately approaches a steady state must ultimately have optimal pricing that involves equal rates of steady-state profit in all industries; and such optimal pricing will necessarily deviate from Marx's model of equal rates of surplus value (markups on direct-labor only) in all industries. PMID:16592102

  11. Upcoming hearings in Congress

    NASA Astrophysics Data System (ADS)

    The following hearings and markups have been tentatively scheduled for the coming weeks by the Senate and House of Representatives. Dates and times should be verified with the committee or subcommittee holding the hearing or markup; all offices on Capitol Hill may be reached by telephoning 202-224-3121. For guidelines on contacting a member of Congress, see AGU's Guide to Legislative Information and Contacts (Eos, August 28, 1984, p. 669). October 8: A joint hearing by the Energy Research & Development Subcommittee of the Senate Energy and Natural Resources Committee and the Nuclear Regulation Subcommittee of the Senate Environment and Public Works Committee on low-level radioactive waste (S. 1517 and S. 1578). Room SD-366, Dirksen Building, 9:30 A.M.

  12. Earth Science Markup Language: Transitioning From Design to Application

    NASA Technical Reports Server (NTRS)

    Moe, Karen; Graves, Sara; Ramachandran, Rahul

    2002-01-01

    The primary objective of the proposed Earth Science Markup Language (ESML) research is to transition from design to application. The resulting schema and prototype software will foster community acceptance for the "define once, use anywhere" concept central to ESML. Supporting goals include: 1. Refinement of the ESML schema and software libraries in cooperation with the user community. 2. Application of the ESML schema and software libraries to a variety of Earth science data sets and analysis tools. 3. Development of supporting prototype software for enhanced ease of use. 4. Cooperation with standards bodies in order to assure ESML is aligned with related metadata standards as appropriate. 5. Widespread publication of the ESML approach, schema, and software.

  13. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
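
    A minimal sketch of what such a representation might look like is given below; the element and attribute names are assumptions made for illustration, not the schema used by the authors or by CEN/TC 251, although the ICD-10 codes and titles themselves are real:

      <classification system="ICD-10" language="en">
        <chapter code="IX" title="Diseases of the circulatory system">
          <block code="I10-I15" title="Hypertensive diseases">
            <category code="I10" title="Essential (primary) hypertension"/>
            <category code="I11" title="Hypertensive heart disease">
              <subcategory code="I11.0" title="Hypertensive heart disease with (congestive) heart failure"/>
            </category>
          </block>
        </chapter>
      </classification>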

  14. A hypermedia reference system to the Forest Ecosystem Management Assessment team report and some related publications.

    Treesearch

    K.M. Reynolds; H.M. Rauscher; C.V. Worth

    1995-01-01

    The hypermedia system, ForestEM, was developed in HyperWriter for use in Microsoft Windows. ForestEM version 1.0 includes text and figures from the FEMAT report and the Record of Decision and Standards and Guidelines. Hypermedia introduces two fundamental changes to knowledge management. The first is the capability to interactively store and retrieve large amounts of...

  15. Toward Effective and Compelling Instruction for High School eCommerce Students: Results from a Small Field Study

    ERIC Educational Resources Information Center

    Luterbach, Kenneth J.; Rodriguez, Diane; Love, Lakecia

    2012-01-01

    This paper describes an instructional development effort to create effective and compelling instruction for eCommerce students. Results from a small field study inform the development project. Four high school students in an eCommerce course completed the standalone tutorial developed to teach them how to create a web page in the HyperText Markup…

  16. Comparing the Efficacy of an Engineered-Based System (College Livetext) with an Off-the-Shelf General Tool (Hyperstudio) for Developing Electronic Portfolios in Teacher Education

    ERIC Educational Resources Information Center

    Johnson-Leslie, Natalie A.

    2009-01-01

    In teacher education, electronic portfolios provide an authentic form of assessment documenting students' personal and professional growth. Using the engineered-based system, College LiveText, and an off-the-shelf general tool, HyperStudio, pre-service teachers constructed e-portfolios as part of their teacher preparation requirements. This case…

  17. HyperCard for Educators. An Introduction.

    ERIC Educational Resources Information Center

    Bull, Glen L.; Harris, Judi

    This guide is designed to provide a quick introduction to the basic elements of HyperCard for teachers who are familiar with other computer applications but may not have worked with hypermedia applications; previous familiarity with HyperCard or with Macintosh computers is not necessary. It is noted that HyperCard is a software construction…

  18. The HyperCard Launching Pad.

    ERIC Educational Resources Information Center

    Aufdenspring, Gary; Aufdenspring, Deborah

    1992-01-01

    Describes how HyperCard software can be used to direct students to databases, applications, and explanations in an online environment. The use of HyperCard with other software is discussed; using HyperCard to set up tutorials is explained; and limitations are addressed, including the amount of memory needed and the speed of the hardware. (LRW)

  19. Talent in autism: hyper-systemizing, hyper-attention to detail and sensory hypersensitivity

    PubMed Central

    Baron-Cohen, Simon; Ashwin, Emma; Ashwin, Chris; Tavassoli, Teresa; Chakrabarti, Bhismadev

    2009-01-01

    We argue that hyper-systemizing predisposes individuals to show talent, and review evidence that hyper-systemizing is part of the cognitive style of people with autism spectrum conditions (ASC). We then clarify the hyper-systemizing theory, contrasting it to the weak central coherence (WCC) and executive dysfunction (ED) theories. The ED theory has difficulty explaining the existence of talent in ASC. While both hyper-systemizing and WCC theories postulate excellent attention to detail, by itself excellent attention to detail will not produce talent. By contrast, the hyper-systemizing theory argues that the excellent attention to detail is directed towards detecting ‘if p, then q’ rules (or [input–operation–output] reasoning). Such law-based pattern recognition systems can produce talent in systemizable domains. Finally, we argue that the excellent attention to detail in ASC is itself a consequence of sensory hypersensitivity. We review an experiment from our laboratory demonstrating sensory hypersensitivity detection thresholds in vision. We conclude that the origins of the association between autism and talent begin at the sensory level, include excellent attention to detail and end with hyper-systemizing. PMID:19528020

  20. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.
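
    The fragment below is a hand-written sketch of the overall shape of a SED-ML document covering points (i) through (iii): a model reference, a time-course simulation, and a task binding the two. It follows the published Level 1 layout, but it has not been validated against the schema, so the namespace string and attribute values should be read as placeholders:

      <sedML xmlns="http://sed-ml.org/sed-ml/level1/version3" level="1" version="3">
        <listOfModels>
          <model id="model1" language="urn:sedml:language:sbml" source="oscillator.xml"/>
        </listOfModels>
        <listOfSimulations>
          <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                             outputEndTime="100" numberOfPoints="1000">
            <algorithm kisaoID="KISAO:0000019"/>  <!-- KISAO term for CVODE -->
          </uniformTimeCourse>
        </listOfSimulations>
        <listOfTasks>
          <task id="task1" modelReference="model1" simulationReference="sim1"/>
        </listOfTasks>
      </sedML>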

  1. SBRML: a markup language for associating systems biology data with models.

    PubMed

    Dada, Joseph O; Spasić, Irena; Paton, Norman W; Mendes, Pedro

    2010-04-01

    Research in systems biology is carried out through a combination of experiments and models. Several data standards have been adopted for representing models (Systems Biology Markup Language) and various types of relevant experimental data (such as FuGE and those of the Proteomics Standards Initiative). However, until now, there has been no standard way to associate a model and its entities to the corresponding datasets, or vice versa. Such a standard would provide a means to represent computational simulation results as well as to frame experimental data in the context of a particular model. Target applications include model-driven data analysis, parameter estimation, and sharing and archiving model simulations. We propose the Systems Biology Results Markup Language (SBRML), an XML-based language that associates a model with several datasets. Each dataset is represented as a series of values associated with model variables, and their corresponding parameter values. SBRML provides a flexible way of indexing the results to model parameter values, which supports both spreadsheet-like data and multidimensional data cubes. We present and discuss several examples of SBRML usage in applications such as enzyme kinetics, microarray gene expression and various types of simulation results. The XML Schema file for SBRML is available at http://www.comp-sys-bio.org/SBRML under the Academic Free License (AFL) v3.0.
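
    By way of illustration only, a simulation dataset tied to a model might be laid out roughly as follows. The element names here are invented for the sketch and the namespace simply reuses the project URL as a placeholder; the actual SBRML schema at the address above should be consulted for the real vocabulary:

      <sbrml xmlns="http://www.comp-sys-bio.org/SBRML" version="1">
        <model source="glycolysis.xml" language="SBML"/>  <!-- hypothetical attributes -->
        <dataset id="timeCourse1" type="simulation">
          <variables>
            <variable id="t" name="time" units="s"/>
            <variable id="glc" modelReference="glucose" units="mM"/>
          </variables>
          <values>
            <row>0.0 5.00</row>
            <row>1.0 4.82</row>
            <row>2.0 4.61</row>
          </values>
        </dataset>
      </sbrml>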

  2. Pricing and components analysis of some key essential pediatric medicine in Odisha state

    PubMed Central

    Samal, Satyajit; Swain, Trupti Rekha

    2017-01-01

    Objective: A study highlighting prices, i.e., what patients actually pay at the ground level, is important for interventions such as alternate procurement schemes or to expedite regulatory assessment of essential medicines for children. The present study was undertaken to analyze the pricing and components of a few key essential medicines in Odisha state. Methodology: Six child-specific medicines of different formulations were selected based on use in different disease conditions and having the widest pricing variation. Data were collected, entered, and analyzed in the price components data collection form of the World Health Organization-Health Action International (WHO-HAI) 2007 Workbook version 5 – Part II provided as part of the WHO/HAI methodology. The analysis includes the cumulative percent markup, total cumulative percent markup, and percent contribution of individual components to the final medicine price in both the public and private sectors of Odisha state. Results: Add-on costs such as taxes, wholesale, and retail markups contribute substantially to the final price of medicines in the private sector, particularly for branded-generic products. The largest contributor to add-on costs is at the level of the retailer shop. Conclusion: Policy should be framed to achieve greater transparency and uniformity of medicine pricing across the different health sectors of Odisha. PMID:28458429

  3. Cross-instrument Analysis Correlation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy R.

    This program has been designed to assist with tracking a sample from one analytical instrument to another, such as SEM, microscopes, micro x-ray diffraction, and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software allows easy entry of the positions of fiducials and locations of interest so that, in a future session on the same or a different instrument, the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform each point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based eXtensible Markup Language (XML) files.
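
    The abstract does not reproduce the file layout, but a session file of the kind described might, as a sketch with invented element names, record fiducials and points of interest like this; once the same fiducials are located in both a reference and a current session, a simple affine (or similarity) transform can map each stored point into the current session's stage coordinates:

      <session instrument="SEM" operator="initials" date="2010-03-14">
        <fiducial id="F1" x="1.250" y="3.402"/>
        <fiducial id="F2" x="8.917" y="3.395"/>
        <fiducial id="F3" x="5.120" y="9.881"/>
        <pointOfInterest id="P1" x="4.337" y="6.012" note="inclusion, photographed at 5000x"/>
      </session>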

  4. Risks, prices, and positions: A social network analysis of illegal drug trafficking in the world-economy.

    PubMed

    Boivin, Rémi

    2014-03-01

    Illegal drug prices are extremely high, compared to similar goods. There is, however, considerable variation in value depending on place, market level and type of drugs. A prominent framework for the study of illegal drugs is the "risks and prices" model (Reuter & Kleiman, 1986). Enforcement is seen as a "tax" added to the regular price. In this paper, it is argued that such economic models are not sufficient to explain price variations at country-level. Drug markets are analysed as global trade networks in which a country's position has an impact on various features, including illegal drug prices. This paper uses social network analysis (SNA) to explain price markups between pairs of countries involved in the trafficking of illegal drugs between 1998 and 2007. It aims to explore a simple question: why do prices increase between two countries? Using relational data from various international organizations, separate trade networks were built for cocaine, heroin and cannabis. Wholesale price markups are predicted with measures of supply, demand, risks of seizures, geographic distance and global positioning within the networks. Reported prices (in $US) and purchasing power parity-adjusted values are analysed. Drug prices increase more sharply when drugs are headed to countries where law enforcement imposes higher costs on traffickers. The position and role of a country in global drug markets are also closely associated with the value of drugs. Price markups are lower if the destination country is a transit to large potential markets. Furthermore, price markups for cocaine and heroin are more pronounced when drugs are exported to countries that are better positioned in the legitimate world-economy, suggesting that relations in legal and illegal markets are directed in opposite directions. Consistent with the world-system perspective, evidence is found of coherent world drug markets driven by both local realities and international relations. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Measuring market performance in restructured electricity markets: An empirical analysis of the PJM energy market

    NASA Astrophysics Data System (ADS)

    Tucker, Russell Jay

    2002-09-01

    Today the electric industry in the U.S. is transitioning to competitive markets for wholesale electricity. Independent system operators (ISOs) now manage broad regional markets for electrical energy in several areas of the U.S. A recent rulemaking by the Federal Energy Regulatory Commission (FERC) encourages the development of regional transmission organizations (RTOs) and restructured competitive wholesale electricity markets nationwide. To date, the transition to competitive wholesale markets has not been easy. The increased reliance on market forces coupled with unusually high electricity demand for some periods have created conditions amenable to market power abuse in many regions throughout the U.S. In the summer of 1999, hot and humid summer conditions in Pennsylvania, New Jersey, Maryland, Delaware, and the District of Columbia pushed peak demand in the PJM Interconnection to record levels. These demand conditions coincided with the introduction of market-based pricing in the wholesale electricity market. Prices for electricity increased on average by 55 percent, and reached the $1,000/MWh range. This study examines the extent to which generator market power raised prices above competitive levels in the PJM Interconnection during the summer of 1999. It simulates hourly market-clearing prices assuming competitive market behavior and compares these prices with observed market prices in computing price markups over the April 1-August 31, 1999 period. The results of the simulation analysis are supported with an examination of actual generator bid data of incumbent generators. Price markups averaged 14.7 percent above expected marginal cost over the 5-month period for all non-transmission-constrained hours. The evidence presented suggests that the June and July monthly markups were strongly influenced by generator market power as price inelastic peak demand approached the electricity generation capacity constraint of the market. While this analysis of the performance of the PJM market finds evidence of market power, the measured markups are markedly less than estimates from prior analysis of the PJM market.

  6. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany; Gallo, Giulia; Brinkman, Gregory

    Revenue insufficiency, or the missing money problem, occurs when the revenues that generators earn from the market are not sufficient to cover both fixed and variable costs to remain in the market and/or justify investments in new capacity, which may be needed for reliability. The near-zero marginal cost of variable renewable generators further exacerbates these revenue challenges. Estimating the extent of the missing money problem in current electricity markets is an important, nontrivial task that requires representing both how the power system operates and how market participants behave. This paper explores the missing money problem using a production cost model that represented a simplified version of the Electric Reliability Council of Texas (ERCOT) energy-only market for the years 2012-2014. We evaluate how various market structures -- including market behavior, ancillary services, and changing fleet compositions -- affect net revenues in this ERCOT-like system. In most production cost modeling exercises, resources are assumed to offer their marginal capabilities at marginal costs. Although this assumption is reasonable for feasibility studies and long-term planning, it does not adequately consider the market behaviors that impact revenue sufficiency. In this work, we simulate a limited set of market participant strategic bidding behaviors by means of different sets of markups; these markups are applied to the true production costs of all gas generators, which are the most prominent generators in ERCOT. Results show that markups can help generators increase their net revenues overall, although net revenues may increase or decrease depending on the technology and the year under study. Results also confirm that conventional, variable-cost-based production cost simulations do not capture prices accurately, and this particular feature calls for proxies for strategic behaviors (e.g., markups) and more accurate representations of how electricity markets work. The analysis also shows that generators face revenue sufficiency challenges in this ERCOT-like energy-only market model; net revenues provided by the market in all base markup cases and sensitivity scenarios (except when a large fraction of the existing coal fleet is retired) are not sufficient to justify investments in new capacity for thermal and nuclear power units. Overall, the work described in this paper points to the need for improved behavioral models of electricity markets to more accurately study current and potential market design issues that could arise in systems with high penetrations of renewable generation.

  7. Informatics in radiology (infoRAD): HTML and Web site design for the radiologist: a primer.

    PubMed

    Ryan, Anthony G; Louis, Luck J; Yee, William C

    2005-01-01

    A Web site has enormous potential as a medium for the radiologist to store, present, and share information in the form of text, images, and video clips. With a modest amount of tutoring and effort, designing a site can be as painless as preparing a Microsoft PowerPoint presentation. The site can then be used as a hub for the development of further offshoots (eg, Web-based tutorials, storage for a teaching library, publication of information about one's practice, and information gathering from a wide variety of sources). By learning the basics of hypertext markup language (HTML), the reader will be able to produce a simple and effective Web page that permits display of text, images, and multimedia files. The process of constructing a Web page can be divided into five steps: (a) creating a basic template with formatted text, (b) adding color, (c) importing images and multimedia files, (d) creating hyperlinks, and (e) uploading one's page to the Internet. This Web page may be used as the basis for a Web-based tutorial comprising text documents and image files already in one's possession. Finally, there are many commercially available packages for Web page design that require no knowledge of HTML.
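
    A minimal page illustrating steps (a) through (d), that is, formatted text, color, an imported image, and a hyperlink, might look like the following; the case content and file names are placeholders rather than examples taken from the article:

      <html>
        <head><title>Chest Radiography Teaching File</title></head>
        <body bgcolor="#ffffff" text="#003366">
          <h1>Case 1: Right lower lobe pneumonia</h1>
          <p>Frontal radiograph of a 54-year-old patient with cough and fever.</p>
          <img src="case1_pa.jpg" alt="PA chest radiograph" width="400">
          <p><a href="case2.html">Next case</a></p>
        </body>
      </html>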

  8. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Surface Anatomy: A Holistic Treatment of Multiple Disconnected Anatomical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Koay, Cheng Guan; Schaefer, Stacey M.; van Reekum, Carien M.; Schmitz, Lara Peschke; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2015-01-01

    Image-based parcellation of the brain often leads to multiple disconnected anatomical structures, which pose significant challenges for analyses of morphological shapes. Existing shape models, such as the widely used spherical harmonic (SPHARM) representation, assume topological invariance, so are unable to simultaneously parameterize multiple disjoint structures. In such a situation, SPHARM has to be applied separately to each individual structure. We present a novel surface parameterization technique using 4D hyperspherical harmonics in representing multiple disjoint objects as a single analytic function, terming it HyperSPHARM. The underlying idea behind HyperSPHARM is to stereographically project an entire collection of disjoint 3D objects onto the 4D hypersphere and subsequently simultaneously parameterize them with the 4D hyperspherical harmonics. Hence, HyperSPHARM allows for a holistic treatment of multiple disjoint objects, unlike SPHARM. In an imaging dataset of healthy adult human brains, we apply HyperSPHARM to the hippocampi and amygdalae. The HyperSPHARM representations are employed as a data smoothing technique, while the HyperSPHARM coefficients are utilized in a support vector machine setting for object classification. HyperSPHARM yields nearly identical results to SPHARM, as will be shown in the paper. Its key advantage over SPHARM is computational: HyperSPHARM possesses greater computational efficiency than SPHARM because it can parameterize multiple disjoint structures using far fewer basis functions, and stereographic projection obviates SPHARM's burdensome surface flattening. In addition, HyperSPHARM can handle any type of topology, unlike SPHARM, whose analysis is confined to topologically invariant structures. PMID:25828650
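
    For orientation, the standard inverse stereographic projection that lifts a point of R^3 onto the unit 3-sphere in R^4 is sketched below in LaTeX; the paper may use a scaled or translated variant, so the exact convention shown here is an assumption:

      % inverse stereographic projection from R^3 to S^3 (reaching all of S^3 except the pole (0,0,0,1))
      \[
        (x_1, x_2, x_3) \longmapsto
        \left( \frac{2x_1}{1+\lVert x\rVert^2},\,
               \frac{2x_2}{1+\lVert x\rVert^2},\,
               \frac{2x_3}{1+\lVert x\rVert^2},\,
               \frac{\lVert x\rVert^2 - 1}{1+\lVert x\rVert^2} \right) \in S^3
      \]
      % Each coordinate function sampled on S^3 is then expanded in 4D hyperspherical
      % harmonics, giving one analytic representation for the whole collection of objects.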

  9. Split2 Protein-Ligation Generates Active IL-6-Type Hyper-Cytokines from Inactive Precursors.

    PubMed

    Moll, Jens M; Wehmöller, Melanie; Frank, Nils C; Homey, Lisa; Baran, Paul; Garbers, Christoph; Lamertz, Larissa; Axelrod, Jonathan H; Galun, Eithan; Mootz, Henning D; Scheller, Jürgen

    2017-12-15

    Trans-signaling of the major pro- and anti-inflammatory cytokines Interleukin (IL)-6 and IL-11 has the unique feature of activating virtually all cells of the body and is critically involved in chronic inflammation and regeneration. Hyper-IL-6 and Hyper-IL-11 are single chain designer trans-signaling cytokines, in which the cytokine and soluble receptor units are trapped in one complex via a flexible peptide linker. Although Hyper-cytokines are essential tools to study trans-signaling in vitro and in vivo, the superior potency of these designer cytokines is accompanied by undesirable stress responses. To enable tailor-made generation of Hyper-cytokines, we developed inactive split-cytokine-precursors adapted for posttranslational reassembly by split-intein mediated protein trans-splicing (PTS). We identified cutting sites within IL-6 (E134/S135) and IL-11 (G116/S117) and obtained inactive split-Hyper-IL-6 and split-Hyper-IL-11 cytokine precursors. After fusion with split-inteins, PTS resulted in reconstitution of active Hyper-cytokines, which were efficiently secreted from transfected cells. Our strategy comprises the development of a background-free cytokine signaling system from reversibly inactivated precursor cytokines.

  10. Reducing tobacco use and access through strengthened minimum price laws.

    PubMed

    McLaughlin, Ian; Pearson, Anne; Laird-Metke, Elisa; Ribisl, Kurt

    2014-10-01

    Higher prices reduce consumption and initiation of tobacco products. A minimum price law that establishes a high statutory minimum price and prohibits the industry's discounting tactics for tobacco products is a promising pricing strategy as an alternative to excise tax increases. Although some states have adopted minimum price laws on the basis of statutorily defined price "markups" over the invoice price, existing state laws have been largely ineffective at increasing the retail price. We analyzed 3 new variations of minimum price laws that hold great potential for raising tobacco prices and reducing consumption: (1) a flat rate minimum price law similar to a recent enactment in New York City, (2) an enhanced markup law, and (3) a law that incorporates both elements.

  11. Networking observers and observatories with remote telescope markup language

    NASA Astrophysics Data System (ADS)

    Hessman, Frederic V.; Tuparev, Georg; Allan, Alasdair

    2006-06-01

    Remote Telescope Markup Language (RTML) is an XML-based protocol for the transport of the high-level description of a set of observations to be carried out on a remote, robotic or service telescope. We describe how RTML is being used in a wide variety of contexts: the transport of service and robotic observing requests in the Hands-On Universe™, ACP, eSTAR, and MONET networks; how RTML is easily combined with other XML protocols for more localized control of telescopes; RTML as a secondary observation report format for the IVOA's VOEvent protocol; the input format for a general-purpose observation simulator; and the observatory-independent means for carrying out request transactions for the international Heterogeneous Telescope Network (HTN).
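
    The outline below is a rough, hand-made sketch of an RTML-style observing request; the element names follow the general spirit of the protocol but are not guaranteed to match any particular RTML version, so they should be treated as illustrative only:

      <RTML version="3.1" mode="request">
        <Contact><Username>observer1</Username></Contact>
        <Request>
          <Target name="SS Cyg">
            <Coordinates>
              <RightAscension units="hours">21.713</RightAscension>
              <Declination units="degrees">43.586</Declination>
            </Coordinates>
          </Target>
          <Schedule>
            <Exposure count="3" units="seconds">120</Exposure>
            <Filter type="V"/>
          </Schedule>
        </Request>
      </RTML>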

  12. The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem.

    PubMed

    Phadungsukanan, Weerapong; Kraft, Markus; Townsend, Joe A; Murray-Rust, Peter

    2012-08-07

    This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications.
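
    As a sketch of the general shape of such a document (the dictRef values mirror the compchem convention described in the paper, but the fragment is illustrative, uses placeholder numbers, and has not been validated against the CML Schema):

      <cml xmlns="http://www.xml-cml.org/schema"
           xmlns:compchem="http://www.xml-cml.org/dictionary/compchem/">
        <module dictRef="compchem:jobList">
          <module dictRef="compchem:job">
            <module dictRef="compchem:initialization">
              <molecule id="water">
                <atomArray>
                  <atom id="a1" elementType="O" x3="0.000" y3="0.000" z3="0.117"/>
                  <atom id="a2" elementType="H" x3="0.000" y3="0.757" z3="-0.469"/>
                  <atom id="a3" elementType="H" x3="0.000" y3="-0.757" z3="-0.469"/>
                </atomArray>
              </molecule>
            </module>
            <module dictRef="compchem:finalization">
              <property dictRef="compchem:scfEnergy">
                <scalar units="nonSi:hartree">-76.0267</scalar>  <!-- placeholder value -->
              </property>
            </module>
          </module>
        </module>
      </cml>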

  13. Pathology data integration with eXtensible Markup Language.

    PubMed

    Berman, Jules J

    2005-02-01

    It is impossible to overstate the importance of XML (eXtensible Markup Language) as a data organization tool. With XML, pathologists can annotate all of their data (clinical and anatomic) in a format that can transform every pathology report into a database, without compromising narrative structure. The purpose of this manuscript is to provide an overview of XML for pathologists. Examples will demonstrate how pathologists can use XML to annotate individual data elements and to structure reports in a common format that can be merged with other XML files or queried using standard XML tools. This manuscript gives pathologists a glimpse into how XML allows pathology data to be linked to other types of biomedical data and reduces our dependence on centralized proprietary databases.

  14. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    PubMed

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.

  15. The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem

    PubMed Central

    2012-01-01

    This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications. PMID:22870956

  16. Scattering and cloaking of binary hyper-particles in metamaterials.

    PubMed

    Alexopoulos, A; Yau, K S B

    2010-09-13

    We derive the d-dimensional scattering cross section for homogeneous and composite hyper-particles inside a metamaterial. The polarizability of the hyper-particles is expressed in multi-dimensional form and is used in order to examine various scattering characteristics. We introduce scattering bounds that display interesting results when d → ∞ and in particular consider the special limit of hyper-particle cloaking in some detail. We demonstrate cloaking via resonance for homogeneous particles and show that composite hyper-particles can be used in order to obtain electromagnetic cloaking with either negative or all positive constitutive parameters respectively. Our approach not only considers cloaking of particles of integer dimension but also particles with non-integer dimension such as fractals. Theoretical results are compared to full-wave numerical simulations for two interacting hyper-particles in a medium.

  17. Role of glutathione in lung retention of 99mTc-hexamethylpropyleneamine oxime in two unique rat models of hyperoxic lung injury

    PubMed Central

    Roerig, David L.; Haworth, Steven T.; Clough, Anne V.

    2012-01-01

    Rat exposure to 60% oxygen (O2) for 7 days (hyper-60) or to >95% O2 for 2 days followed by 24 h in room air (hyper-95R) confers susceptibility or tolerance, respectively, of the otherwise lethal effects of subsequent exposure to 100% O2. The objective of this study was to determine if lung retention of the radiopharmaceutical agent technetium-labeled-hexamethylpropyleneamine oxime (HMPAO) is differentially altered in hyper-60 and hyper-95R rats. Tissue retention of HMPAO is dependent on intracellular content of the antioxidant GSH and mitochondrial function. HMPAO was injected intravenously in anesthetized rats, and planar images were acquired. We investigated the role of GSH in the lung retention of HMPAO by pretreating rats with the GSH-depleting agent diethyl maleate (DEM) prior to imaging. We also measured GSH content and activities of mitochondrial complexes I and IV in lung homogenate. The lung retention of HMPAO increased by ∼50% and ∼250% in hyper-60 and hyper-95R rats, respectively, compared with retention in rats exposed to room air (normoxic). DEM decreased retention in normoxic (∼26%) and hyper-95R (∼56%) rats compared with retention in the absence of DEM. GSH content increased by 19% and 40% in hyper-60 and hyper-95R lung homogenate compared with normoxic lung homogenate. Complex I activity decreased by ∼50% in hyper-60 and hyper-95R lung homogenate compared with activity in normoxic lung homogenate. However, complex IV activity was increased by 32% in hyper-95R lung homogenate only. Furthermore, we identified correlations between the GSH content in lung homogenate and the DEM-sensitive fraction of HMPAO retention and between the complex IV/complex I activity ratio and the DEM-insensitive fraction of HMPAO retention. These results suggest that an increase in the GSH-dependent component of the lung retention of HMPAO may be a marker of tolerance to sustained exposure to hyperoxia. PMID:22628374

  18. Role of glutathione in lung retention of 99mTc-hexamethylpropyleneamine oxime in two unique rat models of hyperoxic lung injury.

    PubMed

    Audi, Said H; Roerig, David L; Haworth, Steven T; Clough, Anne V

    2012-08-15

    Rat exposure to 60% oxygen (O2) for 7 days (hyper-60) or to >95% O2 for 2 days followed by 24 h in room air (hyper-95R) confers susceptibility or tolerance, respectively, of the otherwise lethal effects of subsequent exposure to 100% O2. The objective of this study was to determine if lung retention of the radiopharmaceutical agent technetium-labeled-hexamethylpropyleneamine oxime (HMPAO) is differentially altered in hyper-60 and hyper-95R rats. Tissue retention of HMPAO is dependent on intracellular content of the antioxidant GSH and mitochondrial function. HMPAO was injected intravenously in anesthetized rats, and planar images were acquired. We investigated the role of GSH in the lung retention of HMPAO by pretreating rats with the GSH-depleting agent diethyl maleate (DEM) prior to imaging. We also measured GSH content and activities of mitochondrial complexes I and IV in lung homogenate. The lung retention of HMPAO increased by ≈ 50% and ≈ 250% in hyper-60 and hyper-95R rats, respectively, compared with retention in rats exposed to room air (normoxic). DEM decreased retention in normoxic (≈ 26%) and hyper-95R (≈ 56%) rats compared with retention in the absence of DEM. GSH content increased by 19% and 40% in hyper-60 and hyper-95R lung homogenate compared with normoxic lung homogenate. Complex I activity decreased by ≈ 50% in hyper-60 and hyper-95R lung homogenate compared with activity in normoxic lung homogenate. However, complex IV activity was increased by 32% in hyper-95R lung homogenate only. Furthermore, we identified correlations between the GSH content in lung homogenate and the DEM-sensitive fraction of HMPAO retention and between the complex IV/complex I activity ratio and the DEM-insensitive fraction of HMPAO retention. These results suggest that an increase in the GSH-dependent component of the lung retention of HMPAO may be a marker of tolerance to sustained exposure to hyperoxia.

  19. Two dissimilar approaches to dynamical systems on hyper MV-algebras and their information entropy

    NASA Astrophysics Data System (ADS)

    Mehrpooya, Adel; Ebrahimi, Mohammad; Davvaz, Bijan

    2017-09-01

    Measuring the flow of information that is related to the evolution of a system which is modeled by applying a mathematical structure is of capital significance for science and usually for mathematics itself. Regarding this fact, a major issue in concern with hyperstructures is their dynamics and the complexity of the varied possible dynamics that exist over them. Notably, the dynamics and uncertainty of hyper MV-algebras which are hyperstructures and extensions of a central tool in infinite-valued Lukasiewicz propositional calculus that models many valued logics are of primary concern. Tackling this problem, in this paper we focus on the subject of dynamical systems on hyper MV-algebras and their entropy. In this respect, we adopt two varied approaches. One is the set-based approach in which hyper MV-algebra dynamical systems are developed by employing set functions and set partitions. By the other method that is based on points and point partitions, we establish the concept of hyper injective dynamical systems on hyper MV-algebras. Next, we study the notion of entropy for both kinds of systems. Furthermore, we consider essential ergodic characteristics of those systems and their entropy. In particular, we introduce the concept of isomorphic hyper injective and hyper MV-algebra dynamical systems, and we demonstrate that isomorphic systems have the same entropy. We present a couple of theorems in order to help calculate entropy. In particular, we prove a contemporary version of addition and Kolmogorov-Sinai Theorems. Furthermore, we provide a comparison between the indispensable properties of hyper injective and semi-independent dynamical systems. Specifically, we present and prove theorems that draw comparisons between the entropies of such systems. Lastly, we discuss some possible relationships between the theories of hyper MV-algebra and MV-algebra dynamical systems.

  20. Psychotic experiences and hyper-theory-of-mind in preadolescence--a birth cohort study.

    PubMed

    Clemmensen, L; van Os, J; Drukker, M; Munkholm, A; Rimvall, M K; Væver, M; Rask, C U; Bartels-Velthuis, A A; Skovgaard, A M; Jeppesen, P

    2016-01-01

    Knowledge on the risk mechanisms of psychotic experiences (PE) is still limited. The aim of this population-based study was to explore developmental markers of PE with a particular focus on the specificity of hyper-theory-of-mind (HyperToM) as correlate of PE as opposed to correlate of any mental disorder. We assessed 1630 children from the Copenhagen Child Cohort 2000 regarding PE and HyperToM at the follow-up at 11-12 years. Mental disorders were diagnosed by clinical ratings based on standardized parent-, teacher- and self-reported psychopathology. Logistic regression analyses were performed to test the correlates of PE and HyperToM, and the specificity of correlates of PE v. correlates of any Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) mental disorder. Univariate analyses showed the following correlates of PE: familial psychiatric liability; parental mental illness during early child development; change in family composition; low family income; regulatory problems in infancy; onset of puberty; bullying; concurrent mental disorder; and HyperToM. When estimating the adjusted effects, only low family income, concurrent mental disorder, bullying and HyperToM remained significantly associated with PE. Further analyses of the specificity of these correlates with regard to outcome revealed that HyperToM was the only variable specifically associated with PE without concurrent mental disorder. Finally, HyperToM did not share any of the investigated precursors with PE. HyperToM may have a specific role in the risk trajectories of PE, being specifically associated with PE in preadolescent children, independently of other family and child risk factors associated with PE and overall psychopathology at this age.

  1. Systems of evidence-based healthcare and personalised health information: some international and national trends.

    PubMed

    Gordon, C; Gray, J A; Toth, B; Veloso, M

    2000-01-01

    In Europe, North America and elsewhere, growing interest has focussed on evidence-based healthcare systems, incorporating the deployment of practice guidelines, as a field of application for health telematics. The clinical benefit and technical feasibility of common European approaches to this task has recently been demonstrated. In Europe it is likely that, building on recent progress in electronic health record architecture (EHRA) standards, a sufficient state of maturity can be reached to justify initiation within CEN TC251 of a prestandards process on guideline content formats during the current 5th Framework of EC RT&D activity. There is now a similar impetus to agree standards for this field in North America. Thanks to fruitful EC-USA contacts during the 4th Framework programme, there is now a chance, given well-planned coordination, to establish a global consensus optimally suited to serve the world-wide delivery and application of evidence-based medicine. This review notes three factors which may accelerate progress to convergence: (1) revolutionary changes in the knowledge basis of professional/patient/public healthcare partnerships, involving the key role of the Web as a health knowledge resource for citizens, and a rapidly growing market for personalised health information and advice; (2) the emergence at national levels of digital warehouses of clinical guidelines and EBM knowledge resources, agencies which are capable of brokering common mark-up and interchange media definitions between knowledge providers, industry and healthcare organizations; (3) the closing gap in knowledge management technology, with the advent of XML and RDF, between approaches and services based respectively on text mark-up and knowledge-base paradigms. A current project in the UK National Health Service (the National electronic Library of Health) is cited as an example of a national initiative designed to harness these trends.

  2. Systematic in J-PARC/Hyper-K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minamino, Akihiro

    The Hyper-Kamiokande (Hyper-K) detector is a next-generation underground water Cherenkov detector. The J-PARC to Hyper-K experiment has good potential for precision measurements of neutrino oscillation parameters and discovery reach for CP violation in the lepton sector. With a total exposure of 10 years to a neutrino beam produced by the 750 kW J-PARC proton synchrotron, it is expected that the CP phase δ can be determined to better than 18 degrees for all possible values of δ if sin²2θ₁₃ > 0.03 and the mass hierarchy is known. Control of systematic uncertainties is critical to make maximum use of the Hyper-K potential. Based on lessons learned from T2K experience, a strategy to reduce systematic uncertainties in J-PARC/Hyper-K is developed.

  3. Recent progress of push-broom infrared hyper-spectral imager in SITP

    NASA Astrophysics Data System (ADS)

    Wang, Yueming; Hu, Weida; Shu, Rong; Li, Chunlai; Yuan, Liyin; Wang, Jianyu

    2017-02-01

    Over the past decades, hyper-spectral imaging technologies have been well developed at SITP, CAS. Many innovations in system design and key parts of hyper-spectral imagers have been completed. The world's first airborne hyper-spectral imager operating from VNIR to TIR emerged at SITP; it is well known as OMIS (Operational Modular Imaging Spectrometer). New technologies have been introduced in recent years to improve the performance of hyper-spectral imaging systems. A high-spatial-resolution space-borne hyper-spectral imager aboard the Tiangong-1 spacecraft was launched on Sep. 29, 2011. Thanks to ground motion compensation and a high-optical-efficiency prismatic spectrometer, a large amount of hyper-spectral imagery with high sensitivity and good quality has been acquired over the past years, and some important phenomena have been observed. To diminish spectral distortion and expand the field of view, a new type of prismatic imaging spectrometer based on curved prisms was proposed by SITP. A prototype hyper-spectral imager based on a spherical fused silica prism was manufactured, which can operate from 400 nm to 2500 nm. We also made progress in the development of LWIR hyper-spectral imaging technology. A compact, low-F-number LWIR imaging spectrometer was designed, manufactured, and integrated. The spectrometer operates in a cryogenically cooled vacuum box to restrain background radiation. The system performed well during a flight experiment on an airborne platform. Thanks to the high-sensitivity FPA and high-performance optics, the spatial resolution, spectral resolution, and SNR of the system are improved enormously. However, more work should be done to achieve high radiometric accuracy in the future.

  4. Parallel Evolution of Sperm Hyper-Activation Ca2+ Channels

    PubMed Central

    Phadnis, Nitin

    2017-01-01

    Abstract Sperm hyper-activation is a dramatic change in sperm behavior where mature sperm burst into a final sprint in the race to the egg. The mechanism of sperm hyper-activation in many metazoans, including humans, consists of a jolt of Ca2+ into the sperm flagellum via CatSper ion channels. Surprisingly, all nine CatSper genes have been independently lost in several animal lineages. In Drosophila, sperm hyper-activation is performed through the cooption of the polycystic kidney disease 2 (pkd2) Ca2+ channel. The parallels between CatSpers in primates and pkd2 in Drosophila provide a unique opportunity to examine the molecular evolution of the sperm hyper-activation machinery in two independent, nonhomologous calcium channels separated by > 500 million years of divergence. Here, we use a comprehensive phylogenomic approach to investigate the selective pressures on these sperm hyper-activation channels. First, we find that the entire CatSper complex evolves rapidly under recurrent positive selection in primates. Second, we find that pkd2 has parallel patterns of adaptive evolution in Drosophila. Third, we show that this adaptive evolution of pkd2 is driven by its role in sperm hyper-activation. These patterns of selection suggest that the evolution of the sperm hyper-activation machinery is driven by sexual conflict with antagonistic ligands that modulate channel activity. Together, our results add sperm hyper-activation channels to the class of fast evolving reproductive proteins and provide insights into the mechanisms used by the sexes to manipulate sperm behavior. PMID:28810709

  5. State cigarette minimum price laws - United States, 2009.

    PubMed

    2010-04-09

    Cigarette price increases reduce the demand for cigarettes and thereby reduce smoking prevalence, cigarette consumption, and youth initiation of smoking. Excise tax increases are the most effective government intervention to increase the price of cigarettes, but cigarette manufacturers use trade discounts, coupons, and other promotions to counteract the effects of these tax increases and appeal to price-sensitive smokers. State cigarette minimum price laws, initiated by states in the 1940s and 1950s to protect tobacco retailers from predatory business practices, typically require a minimum percentage markup to be added to the wholesale and/or retail price. If a statute prohibits trade discounts from the minimum price calculation, these laws have the potential to counteract discounting by cigarette manufacturers. To assess the status of cigarette minimum price laws in the United States, CDC surveyed state statutes and identified those states with minimum price laws in effect as of December 31, 2009. This report summarizes the results of that survey, which determined that 25 states had minimum price laws for cigarettes (median wholesale markup: 4.00%; median retail markup: 8.00%), and seven of those states also expressly prohibited the use of trade discounts in the minimum retail price calculation. Minimum price laws can help prevent trade discounting from eroding the positive effects of state excise tax increases and higher cigarette prices on public health.

  6. XML at the ADC: Steps to a Next Generation Data Archive

    NASA Astrophysics Data System (ADS)

    Shaya, E.; Blackwell, J.; Gass, J.; Oliversen, N.; Schneider, G.; Thomas, B.; Cheung, C.; White, R. A.

    1999-05-01

    The eXtensible Markup Language (XML) is a document markup language that allows users to specify their own tags, to create hierarchical structures to qualify their data, and to support automatic checking of documents for structural validity. It is being intensively supported by nearly every major corporate software developer. Under funding from a NASA AISRP proposal, the Astronomical Data Center (ADC, http://adc.gsfc.nasa.gov) is developing an infrastructure for importation, enhancement, and distribution of data and metadata using XML as the document markup language. We discuss the preliminary Document Type Definition (DTD, at http://adc.gsfc.nasa.gov/xml) which specifies the elements and their attributes in our metadata documents. This attempts to define both the metadata of an astronomical catalog and the `header' information of an astronomical table. In addition, we give an overview of the planned flow of data through automated pipelines from authors and journal presses into our XML archive and retrieval through the web via the XML-QL Query Language and eXtensible Style Language (XSL) scripts. When completed, the catalogs and journal tables at the ADC will be tightly hyperlinked to enhance data discovery. In addition, one will be able to search on fragmentary information. For instance, one could query for a table by entering that the second author is so-and-so or that the third author is at such-and-such institution.
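
    To make the idea concrete, a catalog-header document of the kind discussed might carry elements along these lines; the names and values are invented for illustration and are not the ADC's actual DTD, which is available at the URL given above:

      <dataset>
        <title>Example Star Catalog</title>
        <author><lastName>Smith</lastName><firstName>A.</firstName></author>
        <tableHead>
          <field name="RAdeg" unit="deg" datatype="float">Right ascension (J2000)</field>
          <field name="DEdeg" unit="deg" datatype="float">Declination (J2000)</field>
          <field name="Vmag"  unit="mag" datatype="float">Visual magnitude</field>
        </tableHead>
      </dataset>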

  7. Propulsion System Airframe Integration Issues and Aerodynamic Database Development for the Hyper-X Flight Research Vehicle

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Holland, Scott D.; Cockrell, Charles E., Jr.; Bittner, Robert D.

    1999-01-01

    NASA's Hyper-X Research Vehicle will provide a unique opportunity to obtain data on an operational airframe integrated scramjet propulsion system at true flight conditions. The airframe integrated nature of the scramjet engine with the Hyper-X vehicle results in a strong coupling effect between the propulsion system operation and the airframe's basic aerodynamic characteristics. Comments on general airframe integrated scramjet propulsion system effects on vehicle aerodynamic performance, stability, and control are provided, followed by examples specific to the Hyper-X research vehicle. An overview is provided of the current activities associated with the development of the Hyper-X aerodynamic database, including wind tunnel test activities and parallel CFD analysis efforts. A brief summary of the Hyper-X aerodynamic characteristics is provided, including the direct and indirect effects of the airframe integrated scramjet propulsion system operation on the basic airframe stability and control characteristics.

  8. Exploring expressivity and emotion with artificial voice and speech technologies.

    PubMed

    Pauletto, Sandra; Balentine, Bruce; Pidcock, Chris; Jones, Kevin; Bottaci, Leonardo; Aretoulaki, Maria; Wells, Jez; Mundy, Darren P; Balentine, James

    2013-10-01

    Emotion in audio-voice signals, as synthesized by text-to-speech (TTS) technologies, was investigated to formulate a theory of expression for user interface design. Emotional parameters were specified with markup tags, and the resulting audio was further modulated with post-processing techniques. Software was then developed to link a selected TTS synthesizer with an automatic speech recognition (ASR) engine, producing a chatbot that could speak and listen. Using these two artificial voice subsystems, investigators explored both artistic and psychological implications of artificial speech emotion. Goals of the investigation were interdisciplinary, with interest in musical composition, augmentative and alternative communication (AAC), commercial voice announcement applications, human-computer interaction (HCI), and artificial intelligence (AI). The work-in-progress points towards an emerging interdisciplinary ontology for artificial voices. As one study output, HCI tools are proposed for future collaboration.
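
    The paper does not reproduce its tag set, but markup of this general kind is familiar from W3C SSML, where prosody attributes approximate emotional coloring; the snippet below is therefore a generic SSML-style illustration, not the tags actually used by the investigators:

      <speak>
        <prosody rate="slow" pitch="low" volume="soft">
          I am sorry. The results were not what we hoped for.
        </prosody>
        <break time="700ms"/>
        <prosody rate="fast" pitch="+20%">
          But there is a plan, and we can start today.
        </prosody>
      </speak>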

  9. Electronic Procedures for Medical Operations

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Electronic procedures are replacing text-based documents for recording the steps in performing medical operations aboard the International Space Station. S&K Aerospace, LLC, has developed a content-based electronic system, based on the Extensible Markup Language (XML) standard, that separates text from formatting standards and tags items contained in procedures so they can be recognized by other electronic systems. For example, to change a standard format, electronic procedures are changed in a single batch process, and the entire body of procedures will have the new format. Procedures can be quickly searched to determine which are affected by software and hardware changes. Similarly, procedures are easily shared with other electronic systems. The system also enables real-time data capture and automatic bookmarking of current procedure steps. In Phase II of the project, S&K Aerospace developed a Procedure Representation Language (PRL) and tools to support the creation and maintenance of electronic procedures for medical operations. The goal is to develop these tools in such a way that new advances can be inserted easily, leading to an eventual medical decision support system.
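
    A sketch of a tagged procedure step, with invented element names standing in for the actual PRL vocabulary, suggests how content can be separated from formatting and how referenced hardware and values can be tagged for later queries:

      <procedure id="MED-023" title="Intravenous fluid administration">
        <step number="3">
          <instruction>Attach the tubing to the <hardware ref="PUMP-01">infusion pump</hardware>
            and verify that the flow-rate display reads <value units="mL/h">125</value>.</instruction>
          <caution>Do not exceed the prescribed flow rate.</caution>
        </step>
      </procedure>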

  10. Light at Night Markup Language (LANML): XML Technology for Light at Night Monitoring Data

    NASA Astrophysics Data System (ADS)

    Craine, B. L.; Craine, E. R.; Craine, E. M.; Crawford, D. L.

    2013-05-01

    Light at Night Markup Language (LANML) is a standard, based upon XML, useful in acquiring, validating, transporting, archiving and analyzing multi-dimensional light at night (LAN) datasets of any size. The LANML standard can accommodate a variety of measurement scenarios including single spot measures, static time-series, web based monitoring networks, mobile measurements, and airborne measurements. LANML is human-readable, machine-readable, and does not require a dedicated parser. In addition LANML is flexible; ensuring future extensions of the format will remain backward compatible with analysis software. The XML technology is at the heart of communicating over the internet and can be equally useful at the desktop level, making this standard particularly attractive for web based applications, educational outreach and efficient collaboration between research groups.
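
    The abstract describes the scope of LANML but not its syntax; a single static measurement record might, as a purely illustrative sketch with invented element names and placeholder values, look something like this:

      <lanml version="1.0">
        <site id="site42" latitude="32.233" longitude="-110.948" elevation="787"/>
        <instrument type="sky-quality-meter"/>
        <measurement utc="2012-06-15T06:32:00Z">
          <skyBrightness units="mag/arcsec2">21.35</skyBrightness>
          <cloudCover units="percent">0</cloudCover>
        </measurement>
      </lanml>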

  11. Dealing with Diversity in Computational Cancer Modeling

    PubMed Central

    Johnson, David; McKeever, Steve; Stamatakos, Georgios; Dionysiou, Dimitra; Graf, Norbert; Sakkalis, Vangelis; Marias, Konstantinos; Wang, Zhihui; Deisboeck, Thomas S.

    2013-01-01

    This paper discusses the need for interconnecting computational cancer models from different sources and scales within clinically relevant scenarios to increase the accuracy of the models and speed up their clinical adaptation, validation, and eventual translation. We briefly review current interoperability efforts drawing upon our experiences with the development of in silico models for predictive oncology within a number of European Commission Virtual Physiological Human initiative projects on cancer. A clinically relevant scenario, addressing brain tumor modeling that illustrates the need for coupling models from different sources and levels of complexity, is described. General approaches to enabling interoperability using XML-based markup languages for biological modeling are reviewed, concluding with a discussion on efforts towards developing cancer-specific XML markup to couple multiple component models for predictive in silico oncology. PMID:23700360

  12. Medicare program; revisions to payment policies under the Physician Fee Schedule, and other part B payment policies for CY 2008; delay of the date of applicability of the revised anti-markup provisions for certain services furnished in certain locations (Sec. 414.50). Final rule.

    PubMed

    2008-01-03

    This final rule delays until January 1, 2009 the applicability of the anti-markup provisions in Sec. 414.50, as revised at 72 FR 66222, except with respect to the technical component of a purchased diagnostic test and with respect to any anatomic pathology diagnostic testing services furnished in space that: Is utilized by a physician group practice as a "centralized building" (as defined at Sec. 411.351 of this chapter) for purposes of complying with the physician self-referral rules; and does not qualify as a "same building" under Sec. 411.355(b)(2)(i) of this chapter.

  13. Improving the Interoperability of Disaster Models: a Case Study of Proposing Fireml for Forest Fire Model

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.

    2018-04-01

    This paper presents a new standardized data format named Fire Markup Language (FireML), an extension of the OGC Geography Markup Language (GML), for describing forest fire hazard models. FireML standardizes the input and output documents of a fire model so that the model can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the use of FireML and verify its feasibility, a forest fire spread model adapted to be compatible with FireML is described, and a 3D GIS disaster management system is developed to simulate the dynamic process of forest fire spread from the defined FireML documents. The proposed approach should inform the standardization of other disaster models.
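
    The FireML schema itself is not reproduced in the abstract; the fragment below is a hypothetical sketch (invented element names, with the GML namespace shown only to indicate the extension relationship) of what an input/output document for a fire spread model of this kind might contain:

        <fireml:FireEvent xmlns:fireml="http://example.org/fireml"
                          xmlns:gml="http://www.opengis.net/gml">
          <fireml:ignition time="2017-08-01T13:20:00Z">
            <gml:Point srsName="EPSG:4326">
              <gml:pos>40.512 116.233</gml:pos>
            </gml:Point>
          </fireml:ignition>
          <fireml:environment windSpeed_ms="6.5" windDirection_deg="225" fuelMoisture="0.12"/>
          <fireml:spreadResult timeStep_min="30">
            <gml:Polygon srsName="EPSG:4326">
              <gml:exterior>
                <gml:LinearRing>
                  <gml:posList>40.512 116.233 40.515 116.236 40.510 116.238 40.512 116.233</gml:posList>
                </gml:LinearRing>
              </gml:exterior>
            </gml:Polygon>
          </fireml:spreadResult>
        </fireml:FireEvent>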

  14. Multicenter analysis of treatment outcomes in adult patients with lymphoblastic lymphoma who received hyper-CVAD induction followed by hematopoietic stem cell transplantation.

    PubMed

    Jeong, Seong Hyun; Moon, Joon Ho; Kim, Jin Seok; Yang, Deok-Hwan; Park, Yong; Cho, Seok Goo; Kwak, Jae-Yong; Eom, Hyeon Seok; Won, Jong Ho; Hong, Jun Shik; Oh, Sung Yong; Lee, Ho Sup; Kim, Seok Jin

    2015-04-01

    The hyperfractionated cyclophosphamide, vincristine, doxorubicin, and dexamethasone (hyper-CVAD) regimen has been widely used for lymphoblastic lymphoma (LBL) as a primary treatment. However, there are few data on its treatment outcomes in Asian patients. Thus, we conducted this study to evaluate the efficacy of hyper-CVAD induction and stem cell transplantation (SCT) consolidation in LBL patients. The treatment responses of 49 patients treated with the hyper-CVAD regimen were retrospectively analyzed in 13 institutions. Because 24 of the patients who responded to hyper-CVAD underwent consolidation treatment with SCT, the overall survival (OS) and progression-free survival (PFS) of patients who received SCT were compared with those of patients who did not. The overall response rate was 79 %: 73 % (36/49) complete responses, 6 % (3/49) partial responses, and 4 % (2/49) induction deaths. The major limitation for the delivery of the planned hyper-CVAD cycles was hematological toxicity. Among 39 responders, 24 patients underwent autologous (n = 16) and allogeneic SCT (n = 8) consolidation. Their 3-year OS and PFS rates were 76 and 78 %, respectively, and there was no difference in survival outcomes between autologous and allogeneic SCT. However, 15 patients without SCT consolidation showed poorer PFS even though they all achieved complete response. Thus, only seven patients maintained their response at the time of analysis. In conclusion, the hyper-CVAD regimen is effective for remission induction in LBL, and SCT consolidation after hyper-CVAD induction produced better clinical outcomes than did continuation of hyper-CVAD.

  15. Responses of heart rate and blood pressure to KC-135 hyper-gravity

    NASA Technical Reports Server (NTRS)

    Satake, Hirotaka; Matsunami, Ken'ichi; Reschke, Millard F.

    1992-01-01

    Many investigators have clarified the effects of hyper gravitational-inertial forces (G) upon the cardiovascular system using centrifugal apparatus with a short rotating radius. We investigated the cardiovascular responses to KC-135 hyper-G flight with negligibly small angular velocity. Six normal, healthy subjects 29 to 40 years old (5 males and 1 female) took part in this experiment. Hyper gravitational-inertial force was generated by the KC-135 hyper-G flight, flown in a spiral path with a very long radius of 1.5 miles. Hyper-G was sustained at 1.8 +Gz for 3 minutes in each session, and each subject, seated in a chair, was exposed 5 times. The preliminary results for blood pressure and R-R interval are discussed. Exposure to 1.8 +Gz stress resulted in a remarkable increase in systolic and diastolic blood pressure, while the pulse pressure did not change and remained at the control level regardless of hyper-G exposure. These blood pressure results indicate an increase in peripheral vascular resistance during hyper-G exposure. The R-R interval was calculated from the ECG. The R-R interval changed in all subjects, though not systematically; it became clearly shorter during the hyper-G period than during the 1 +Gz control period, although it varied widely in some cases. The coefficient of variation of the R-R interval was estimated to assess autonomic nerve activity, but no significant change was detectable.

  16. Parallel Evolution of Sperm Hyper-Activation Ca2+ Channels.

    PubMed

    Cooper, Jacob C; Phadnis, Nitin

    2017-07-01

    Sperm hyper-activation is a dramatic change in sperm behavior where mature sperm burst into a final sprint in the race to the egg. The mechanism of sperm hyper-activation in many metazoans, including humans, consists of a jolt of Ca2+ into the sperm flagellum via CatSper ion channels. Surprisingly, all nine CatSper genes have been independently lost in several animal lineages. In Drosophila, sperm hyper-activation is performed through the cooption of the polycystic kidney disease 2 (pkd2) Ca2+ channel. The parallels between CatSpers in primates and pkd2 in Drosophila provide a unique opportunity to examine the molecular evolution of the sperm hyper-activation machinery in two independent, nonhomologous calcium channels separated by > 500 million years of divergence. Here, we use a comprehensive phylogenomic approach to investigate the selective pressures on these sperm hyper-activation channels. First, we find that the entire CatSper complex evolves rapidly under recurrent positive selection in primates. Second, we find that pkd2 has parallel patterns of adaptive evolution in Drosophila. Third, we show that this adaptive evolution of pkd2 is driven by its role in sperm hyper-activation. These patterns of selection suggest that the evolution of the sperm hyper-activation machinery is driven by sexual conflict with antagonistic ligands that modulate channel activity. Together, our results add sperm hyper-activation channels to the class of fast evolving reproductive proteins and provide insights into the mechanisms used by the sexes to manipulate sperm behavior. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  17. [Application of hyper-spectral remote sensing technology in environmental protection].

    PubMed

    Zhao, Shao-Hua; Zhang, Feng; Wang, Qiao; Yao, Yun-Jun; Wang, Zhong-Ting; You, Dai-An

    2013-12-01

    Hyper-spectral remote sensing (RS) technology has been widely used in environmental protection. The present work introduces its recent applications in the RS monitoring of polluting gases, greenhouse gases, algal blooms, the water quality of catchment environments, the safety of drinking water sources, biodiversity, vegetation classification, soil pollution, and so on. Finally, issues such as the scarcity of hyper-spectral satellites and the limitations of data processing and information extraction are discussed. Some proposals are also presented, including developing successors to the HJ-1 satellite carrying differential optical absorption spectroscopy, greenhouse gas spectroscopy and a hyper-spectral imager; strengthening research on hyper-spectral data processing and information extraction; and promoting the construction of environmental application systems.

  18. Characteristic optical coherence tomography findings in patients with primary vitreoretinal lymphoma: a novel aid to early diagnosis.

    PubMed

    Barry, Robert J; Tasiopoulou, Anastasia; Murray, Philip I; Patel, Praveen J; Sagoo, Mandeep S; Denniston, Alastair K; Keane, Pearse A

    2018-01-06

    The diagnosis of primary vitreoretinal lymphoma (PVRL) poses significant difficulties; presenting features are non-specific and confirmation usually necessitates invasive vitreoretinal biopsy. Diagnosis is often delayed, resulting in increased morbidity and mortality. Non-invasive imaging modalities such as spectral domain optical coherence tomography (SD-OCT) offer simple and rapid aids to diagnosis. We present characteristic SD-OCT images of patients with biopsy-positive PVRL and propose a number of typical features, which we believe are useful in identifying these lesions at an early stage. Medical records of all patients attending Moorfields Eye Hospital between April 2010 and April 2016 with biopsy-positive PVRL were reviewed. Pretreatment SD-OCT images were collected for all eyes and were reviewed independently by two researchers for features suggestive of PVRL. Pretreatment SD-OCT images of 32 eyes of 22 patients with biopsy-proven PVRL were reviewed. Observed features included hyper-reflective subretinal infiltrates (17/32), hyper-reflective infiltration in inner retinal layers (6/32), retinal pigment epithelium (RPE) undulation (5/32), clumps of vitreous cells (5/32) and sub-RPE deposits (3/32). Of these, the hyper-reflective subretinal infiltrates have an appearance unique to PVRL, with features not seen in other diseases. We have identified a range of SD-OCT features, which we believe to be consistent with a diagnosis of PVRL. We propose that the observation of hyper-reflective subretinal infiltrates as described is highly suggestive of PVRL. This case series further demonstrates the utility of SD-OCT as a non-invasive and rapid aid to diagnosis, which may improve both visual outcomes and survival of patients with intraocular malignancies such as PVRL. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Neuromodulation of detrusor hyper-reflexia by functional magnetic stimulation of the sacral roots.

    PubMed

    Sheriff, M K; Shah, P J; Fowler, C; Mundy, A R; Craggs, M D

    1996-07-01

    To investigate the acute effects of functional magnetic stimulation (FMS) on detrusor hyper-reflexia using a multi-pulse magnetic stimulator. Seven male patients with established and intractable detrusor hyper-reflexia following spinal cord injury were studied. No patient was on medication and none had had previous surgery for detrusor hyper-reflexia. After optimization of magnetic stimulation of S2-S4 sacral anterior roots by recording toe flexor electromyograms, unstable detrusor activity was provoked during cystometry by rapid infusion of fluid into the bladder. The provocation test produced consistent and predictable detrusor hyper-reflexia. On some provocations, supramaximal FMS at 20 pulses/s for 5 s was applied at detrusor pressures which were > 15 cmH2O. Following FMS there was an obvious acute suppression of detrusor hyper-reflexia. There was a profound reduction in detrusor contraction, as assessed by the area under the curves of detrusor pressure with time. Functional magnetic stimulation applied over the sacrum can profoundly suppress detrusor hyper-reflexia in man. It may provide a non-invasive method of assessing patients for implantable electrical neuromodulation devices and as a therapeutic option in its own right.

  20. Genomic Sequence Variation Markup Language (GSVML).

    PubMed

    Nakaya, Jun; Kimura, Michio; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Tanaka, Hiroshi

    2010-02-01

    With the aim of making good use of internationally accumulated genomic sequence variation data, which is increasing rapidly due to the explosive growth of genomic research, the development of an interoperable data exchange format and its international standardization are necessary. Genomic Sequence Variation Markup Language (GSVML) focuses on genomic sequence variation data and human health applications, such as gene-based medicine and pharmacogenomics. We developed GSVML through eight steps, based on use case analysis and domain investigations. By limiting the design scope to human health applications and genomic sequence variation, we attempted to eliminate ambiguity and ensure practicability. We intended to satisfy the requirements derived from the use case analysis of human-based clinical genomic applications. Based on database investigations, we attempted to minimize the redundancy of the data format while maximizing its coverage. We also attempted to ensure the ability to communicate and interface with other markup languages for the exchange of omics data among various omics researchers and facilities. The ability to interface with developing clinical standards, such as the Health Level Seven Genotype Information model, was analyzed. We developed the human health-oriented GSVML comprising variation data, direct annotation, and indirect annotation categories; the variation data category is required, while the direct and indirect annotation categories are optional. The annotation categories contain omics and clinical information and have internal relationships. For the design, we examined six use cases against three criteria for human health applications and 15 data elements against three criteria for data formats for genomic sequence variation data exchange. The data formats of five international SNP databases and six markup languages, and the ability to interface with the Health Level Seven Genotype Model, were investigated in terms of 317 items. GSVML was developed as a potential exchange format for genomic sequence variation data focusing on human health applications. The international standardization of GSVML is necessary, and is currently underway. GSVML can be applied to enhance the utilization of genomic sequence variation data worldwide by providing a communicable platform between clinical and research applications. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
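
    The published GSVML schema is not reproduced in the abstract; the fragment below is a hypothetical sketch, with invented element names and placeholder values, of the three categories described (a required variation-data block plus optional direct and indirect annotation):

        <gsvml>
          <variationData>                                  <!-- required category -->
            <variation id="rs0000000" type="SNP">
              <position chromosome="1" assembly="example-build">123456</position>
              <alleles reference="A" observed="G"/>
            </variation>
          </variationData>
          <directAnnotation>                               <!-- optional: omics-level information -->
            <gene symbol="EXAMPLE1"/>
            <aminoAcidChange>placeholder</aminoAcidChange>
          </directAnnotation>
          <indirectAnnotation>                             <!-- optional: clinical information -->
            <phenotype>hypothetical drug-response phenotype</phenotype>
          </indirectAnnotation>
        </gsvml>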

  1. The intense world theory - a unifying theory of the neurobiology of autism.

    PubMed

    Markram, Kamila; Markram, Henry

    2010-01-01

    Autism covers a wide spectrum of disorders for which there are many views, hypotheses and theories. Here we propose a unifying theory of autism, the Intense World Theory. The proposed neuropathology is hyper-functioning of local neural microcircuits, best characterized by hyper-reactivity and hyper-plasticity. Such hyper-functional microcircuits are speculated to become autonomous and memory trapped leading to the core cognitive consequences of hyper-perception, hyper-attention, hyper-memory and hyper-emotionality. The theory is centered on the neocortex and the amygdala, but could potentially be applied to all brain regions. The severity on each axis depends on the severity of the molecular syndrome expressed in different brain regions, which could uniquely shape the repertoire of symptoms of an autistic child. The progression of the disorder is proposed to be driven by overly strong reactions to experiences that drive the brain to a hyper-preference and overly selective state, which becomes more extreme with each new experience and may be particularly accelerated by emotionally charged experiences and trauma. This may lead to obsessively detailed information processing of fragments of the world and an involuntary and systematic decoupling of the autistic individual from what becomes a painfully intense world. The autistic individual is proposed to become trapped in a limited, but highly secure internal world with minimal extremes and surprises. We present the key studies that support this theory of autism, show how this theory can better explain past findings, and how it could resolve apparently conflicting data and interpretations. The theory also makes further predictions from the molecular to the behavioral levels, provides a treatment strategy and presents its own falsifying hypothesis.

  2. Resting Energy Expenditure of Rats Acclimated to Hyper-Gravity

    NASA Technical Reports Server (NTRS)

    Wade, Charles E.; Moran, Megan M.; Oyama, Jiro; Schwenke, David; Dalton, Bonnie P. (Technical Monitor)

    2000-01-01

    To determine the influence of body mass and age on resting energy expenditure (EE) following acclimation to hyper-gravity, oxygen consumption (VO2) and carbon dioxide production (VCO2) were measured to calculate resting EE in male rats, ages 40 to 400 days, acclimated to 2.3 or 4.1 G for a minimum of two weeks. Animals were maintained on a centrifuge to produce the hyper-gravity environment. Measurements were made over three hours in hyper-gravity during the period when the lights were on, the inactive period of rats. In rats matched for body mass (approximately 400 g), hyper-gravity increased VO2 by 18% and VCO2 by 27% compared to controls, resulting in an increase in RER from 0.80 to 0.87. Resting EE increased with an increase in gravity, and this increase was greater when the mass of the rat was larger. Resting EE for 400 g animals increased from 47 +/- 1 kcal/kg/day at 1 G to 57 +/- 1.5 and 58 +/- 2.2 kcal/kg/day at 2.3 and 4.1 G, respectively. There was no difference between the two hyper-gravity environments. When differences in the age of the animals were accounted for, the increase in resting EE adjusted for body mass was over 36% in older animals due to exposure to hyper-gravity. Acclimation to hyper-gravity increases the resting EE of rats, dependent upon body mass and age, and appears to alter substrate metabolism. Increasing the level of hyper-gravity from 2.3 to 4.1 G produced no further changes, raising questions as to a dose effect of gravity level on resting metabolism.

  3. The Intense World Theory – A Unifying Theory of the Neurobiology of Autism

    PubMed Central

    Markram, Kamila; Markram, Henry

    2010-01-01

    Autism covers a wide spectrum of disorders for which there are many views, hypotheses and theories. Here we propose a unifying theory of autism, the Intense World Theory. The proposed neuropathology is hyper-functioning of local neural microcircuits, best characterized by hyper-reactivity and hyper-plasticity. Such hyper-functional microcircuits are speculated to become autonomous and memory trapped leading to the core cognitive consequences of hyper-perception, hyper-attention, hyper-memory and hyper-emotionality. The theory is centered on the neocortex and the amygdala, but could potentially be applied to all brain regions. The severity on each axis depends on the severity of the molecular syndrome expressed in different brain regions, which could uniquely shape the repertoire of symptoms of an autistic child. The progression of the disorder is proposed to be driven by overly strong reactions to experiences that drive the brain to a hyper-preference and overly selective state, which becomes more extreme with each new experience and may be particularly accelerated by emotionally charged experiences and trauma. This may lead to obsessively detailed information processing of fragments of the world and an involuntary and systematic decoupling of the autistic individual from what becomes a painfully intense world. The autistic individual is proposed to become trapped in a limited, but highly secure internal world with minimal extremes and surprises. We present the key studies that support this theory of autism, show how this theory can better explain past findings, and how it could resolve apparently conflicting data and interpretations. The theory also makes further predictions from the molecular to the behavioral levels, provides a treatment strategy and presents its own falsifying hypothesis. PMID:21191475

  4. Automatic reconstruction of a bacterial regulatory network using Natural Language Processing

    PubMed Central

    Rodríguez-Penagos, Carlos; Salgado, Heladia; Martínez-Flores, Irma; Collado-Vides, Julio

    2007-01-01

    Background Manual curation of biological databases, an expensive and labor-intensive process, is essential for high quality integrated data. In this paper we report the implementation of a state-of-the-art Natural Language Processing system that creates computer-readable networks of regulatory interactions directly from different collections of abstracts and full-text papers. Our major aim is to understand how automatic annotation using Text-Mining techniques can complement manual curation of biological databases. We implemented a rule-based system to generate networks from different sets of documents dealing with regulation in Escherichia coli K-12. Results Performance evaluation is based on the most comprehensive transcriptional regulation database for any organism, the manually-curated RegulonDB, 45% of which we were able to recreate automatically. From our automated analysis we were also able to find some new interactions from papers not already curated, or that were missed in the manual filtering and review of the literature. We also put forward a novel Regulatory Interaction Markup Language better suited than SBML for simultaneously representing data of interest for biologists and text miners. Conclusion Manual curation of the output of automatic processing of text is a good way to complement a more detailed review of the literature, either for validating the results of what has been already annotated, or for discovering facts and information that might have been overlooked at the triage or curation stages. PMID:17683642

  5. Intelligent tutoring using HyperCLIPS

    NASA Technical Reports Server (NTRS)

    Hill, Randall W., Jr.; Pickering, Brad

    1990-01-01

    HyperCard is a popular hypertext-like system used for building user interfaces to databases and other applications, and CLIPS is a highly portable government-owned expert system shell. We developed HyperCLIPS in order to fill a gap in the U.S. Army's computer-based instruction tool set; it was conceived as a development environment for building adaptive practical exercises for subject-matter problem-solving, though it is not limited to this approach to tutoring. Once HyperCLIPS was developed, we set out to implement a practical exercise prototype using HyperCLIPS in order to demonstrate the following concepts: learning can be facilitated by doing; student performance evaluation can be done in real-time; and the problems in a practical exercise can be adapted to the individual student's knowledge.

  6. A spectrum fractal feature classification algorithm for agriculture crops with hyper spectrum image

    NASA Astrophysics Data System (ADS)

    Su, Junying

    2011-11-01

    A fractal dimension feature analysis method in the spectral domain is proposed for agricultural crop classification with hyperspectral images. First, a fractal dimension calculation algorithm in the spectral domain is presented, together with a fast fractal dimension calculation algorithm based on the step measurement method. Second, a hyperspectral image classification algorithm and flowchart based on fractal dimension feature analysis in the spectral domain are presented. Finally, experimental results for agricultural crop classification on the FCL1 hyperspectral image set are reported for both the proposed method and SAM (spectral angle mapper). The experimental results show that the proposed method obtains better classification results than traditional SAM feature analysis and can make fuller use of the spectral information of hyperspectral images to realize precision classification of agricultural crops.

  7. Theoretical studies of surface enhanced hyper-Raman spectroscopy: The chemical enhancement mechanism

    NASA Astrophysics Data System (ADS)

    Valley, Nicholas; Jensen, Lasse; Autschbach, Jochen; Schatz, George C.

    2010-08-01

    Hyper-Raman spectra for pyridine and pyridine on the surface of a tetrahedral 20 silver atom cluster are calculated using static hyperpolarizability derivatives obtained from time dependent density functional theory. The stability of the results with respect to choice of exchange-correlation functional and basis set is verified by comparison with experiment and with Raman spectra calculated for the same systems using the same methods. Calculated Raman spectra were found to match well with experiment and previous theoretical calculations. The calculated normal and surface enhanced hyper-Raman spectra closely match experimental results. The chemical enhancement factors for hyper-Raman are generally larger than for Raman (10^2-10^4 versus 10^1-10^2). Integrated hyper-Raman chemical enhancement factors are presented for a set of substituted pyridines. A two-state model is developed to predict these chemical enhancement factors and this was found to work well for the majority of the molecules considered, providing a rationalization for the difference between hyper-Raman and Raman enhancement factors.

  8. A hyper-spherical adaptive sparse-grid method for high-dimensional discontinuity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max D.

    This work proposes and analyzes a hyper-spherical adaptive hierarchical sparse-grid method for detecting jump discontinuities of functions in high-dimensional spaces. The method is motivated by the theoretical and computational inefficiencies of well-known adaptive sparse-grid methods for discontinuity detection. Our novel approach constructs a function representation of the discontinuity hyper-surface of an N-dimensional discontinuous quantity of interest, by virtue of a hyper-spherical transformation. Then, a sparse-grid approximation of the transformed function is built in the hyper-spherical coordinate system, whose value at each point is estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hyper-surface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. Moreover, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. Rigorous error estimates and complexity analyses of the new method are provided, as are several numerical examples that illustrate the effectiveness of the approach.
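
    The paper's own formulation is not reproduced above; as a reminder of what the hyper-spherical transformation being referred to looks like, the standard change of variables from Cartesian coordinates (x_1, ..., x_N) to a radius r and angles (theta_1, ..., theta_{N-1}) can be written in LaTeX as

        \begin{aligned}
        x_1     &= r\cos\theta_1,\\
        x_2     &= r\sin\theta_1\cos\theta_2,\\
                &\;\;\vdots\\
        x_{N-1} &= r\sin\theta_1\cdots\sin\theta_{N-2}\cos\theta_{N-1},\\
        x_N     &= r\sin\theta_1\cdots\sin\theta_{N-2}\sin\theta_{N-1},
        \end{aligned}
        \qquad r\ge 0,\quad \theta_1,\dots,\theta_{N-2}\in[0,\pi],\quad \theta_{N-1}\in[0,2\pi),

    so that a jump surface enclosing the coordinate origin can be represented (under suitable assumptions) as a single-valued function r = f(\theta_1,\dots,\theta_{N-1}), which is then approximated on the sparse grid.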

  9. Coupled Thermo-Electro-Magneto-Elastic Response of Smart Stiffened Panels

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Yarrington, Phillip W.

    2009-01-01

    This report documents the procedures developed for incorporating smart laminate and panel analysis capabilities within the HyperSizer aerospace structural sizing software package. HyperSizer analyzes stiffened panels composed of arbitrary composite laminates through stiffener homogenization, or "smearing" techniques. The result is an effective constitutive equation for the stiffened panel that is suitable for use in a full vehicle-scale finite element analysis via MSC/NASTRAN. The existing thermo-elastic capabilities of HyperSizer have herein been extended to include coupled thermo-electro-magneto-elastic analysis capabilities. This represents a significant step toward realization of design tools capable of guiding the development of the next generation of smart aerospace structures. Verification results are presented that compare the developed smart HyperSizer capability with an ABAQUS piezoelectric finite element solution for a facesheet-flange combination. These results show good agreement between HyperSizer and ABAQUS, but highlight a limitation of the HyperSizer formulation in that constant electric field components are assumed.

  10. The hyper-enrichment of V and Zn in black shales of the Late Devonian-Early Mississippian Bakken Formation (USA)

    USGS Publications Warehouse

    Scott, Clinton T.; Slack, John F.; Kelley, Karen Duttweiler

    2017-01-01

    Black shales of the Late Devonian to Early Mississippian Bakken Formation are characterized by high concentrations of organic carbon and the hyper-enrichment (> 500 to 1000s of mg/kg) of V and Zn. Deposition of black shales resulted from shallow seafloor depths that promoted rapid development of euxinic conditions. Vanadium hyper-enrichments, which are unknown in modern environments, are likely the result of very high levels of dissolved H2S (~ 10 mM) in bottom waters or sediments. Because modern hyper-enrichments of Zn are documented only in Framvaren Fjord (Norway), it is likely that the biogeochemical trigger responsible for Zn hyper-enrichment in Framvaren Fjord was also present in the Bakken basin. With Framvaren Fjord as an analogue, we propose a causal link between the activity of phototrophic sulfide oxidizing bacteria, related to the development of photic-zone euxinia, and the hyper-enrichment of Zn in black shales of the Bakken Formation.

  11. GASP- General Aviation Synthesis Program. Volume 7: Economics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The economic analysis includes: manufacturing costs; labor costs; parts costs; operating costs; markups and consumer price. A user's manual for a computer program to calculate the final consumer price is included.

  12. Introducing Today's SGML.

    ERIC Educational Resources Information Center

    Gilmore, Elizabeth

    1993-01-01

    Describes the fundamental concepts and potential of Standard Generalized Markup Language (SGML), a system that allows computer users to exchange, reuse, and reformat information without constraint. Illustrates the concepts of SGML through a simple example. (SR)

  13. HYPERCLIPS

    NASA Technical Reports Server (NTRS)

    Hill, R. W.

    1994-01-01

    The integration of CLIPS into HyperCard combines the intuitive, interactive user interface of the Macintosh with the powerful symbolic computation of an expert system interpreter. HyperCard is an excellent environment for quickly developing the front end of an application with buttons, dialogs, and pictures, while the CLIPS interpreter provides a powerful inference engine for complex problem solving and analysis. In order to understand the benefit of integrating HyperCard and CLIPS, consider the following: HyperCard is an information storage and retrieval system which exploits the use of the graphics and user interface capabilities of the Apple Macintosh computer. The user can easily define buttons, dialog boxes, information templates, pictures, and graphic displays through the use of the HyperCard tools and scripting language. What is generally lacking in this environment is a powerful reasoning engine for complex problem solving, and this is where CLIPS plays a role. CLIPS 5.0 (C Language Integrated Production System, v5.0) was developed at the Johnson Space Center Software Technology Branch to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 5.0 supports forward chaining rule systems, object-oriented language, and procedural programming for the construction of expert systems. It features incremental reset, seven conflict resolution strategies, truth maintenance, and user-defined external functions. Since CLIPS is implemented in the C language it is highly portable; in addition, it is embeddable as a callable routine from a program written in another language such as Ada or Fortran. By integrating HyperCard and CLIPS the advantages and uses of both packages are made available for a wide range of applications: rapid prototyping of knowledge-based expert systems, interactive simulations of physical systems and intelligent control of hypertext processes, to name a few. HyperCLIPS 2.0 is written in C-Language (54%) and Pascal (46%) for Apple Macintosh computers running Macintosh System 6.0.2 or greater. HyperCLIPS requires HyperCard 1.2 or higher, and at least 2 MB of RAM is recommended. An executable is provided. To compile the source code, the Macintosh Programmer's Workshop (MPW) version 3.0, CLIPS 5.0 (MSC-21927), and the MPW C-Language compiler are also required. NOTE: Installing this program under Macintosh System 7 requires HyperCard v2.1. This program is distributed on a 3.5 inch Macintosh format diskette. A copy of the program documentation is included on the diskette, but may be purchased separately. HyperCLIPS was developed in 1990 and version 2.0 was released in 1991. HyperCLIPS is a copyrighted work with all copyright vested in NASA. Apple, Macintosh, MPW, and HyperCard are registered trademarks of Apple Computer, Inc.

  14. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.

  15. Control of hyper-extended passive margin architecture on subduction initiation with application to the Alps and present-day North Atlantic ocean

    NASA Astrophysics Data System (ADS)

    Candioti, Lorenzo; Bauville, Arthur; Picazo, Suzanne; Mohn, Geoffroy; Kaus, Boris

    2016-04-01

    Hyper-extended magma-poor margins are characterized by extremely thinned crust and exhumation of partially serpentinized mantle. As this can act as a zone of weakness during a subsequent compressional event, a hyper-extended margin can potentially facilitate subduction initiation. Hyper-extended margins are also found today as passive margins fringing the Atlantic and North Atlantic ocean, e.g. the Iberia and Newfoundland margins [1] and the Porcupine, Rockall and Hatton basins. It has been proposed in the literature that hyper-extension in the Alpine Tethys does not exceed ~600 km in width [2]. The geodynamical evolution of the Alpine and Atlantic passive margins is distinct: no subduction is yet initiated in the North Atlantic, whereas the Alpine Tethys basin has undergone subduction. Here, we investigate the control of the presence of a hyper-extended margin on subduction initiation. We perform high resolution 2D simulations considering realistic rheologies and temperature profiles for these locations. We systematically vary the length and thickness of the hyper-extended crust and serpentinized mantle, to better understand the conditions for subduction initiation. References: [1] G. Manatschal. New models for evolution of magma-poor rifted margins based on a review of data and concepts from West Iberia and the Alps. Int J Earth Sci (Geol Rundsch) (2004); 432-466. [2] G. Mohn, G. Manatschal, M. Beltrando, I. Haupert. The role of rift-inherited hyper-extension in alpine-type orogens. Terra Nova (2014); 347-353.

  16. A practical approach to spectral calibration of short wavelength infrared hyper-spectral imaging systems

    NASA Astrophysics Data System (ADS)

    Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan

    2010-02-01

    Near-infrared spectroscopy is a promising, rapidly developing, reliable and noninvasive technique, used extensively in biomedicine and in the pharmaceutical industry. With the introduction of acousto-optic tunable filters (AOTF) and highly sensitive InGaAs focal plane sensor arrays, real-time high resolution hyper-spectral imaging has become feasible for a number of new biomedical in vivo applications. However, due to the specificity of the AOTF technology and lack of spectral calibration standardization, maintaining long-term stability and compatibility of the acquired hyper-spectral images across different systems is still a challenging problem. Efficiently solving both is essential, as the majority of methods for analysis of hyper-spectral images rely on a priori knowledge extracted from large spectral databases, serving as the basis for reliable qualitative or quantitative analysis of various biological samples. In this study, we propose and evaluate fast and reliable spectral calibration of hyper-spectral imaging systems in the short wavelength infrared spectral region. The proposed spectral calibration method is based on light sources or materials exhibiting distinct spectral features, which enable robust non-rigid registration of the acquired spectra. The calibration accounts for all of the components of a typical hyper-spectral imaging system such as AOTF, light source, lens and optical fibers. The obtained results indicated that practical, fast and reliable spectral calibration of hyper-spectral imaging systems is possible, thereby assuring long-term stability and inter-system compatibility of the acquired hyper-spectral images.

  17. Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.

    PubMed

    Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique

    2015-05-01

    The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model. © 2014 Society for Risk Analysis.
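
    The paper's exact parameterization is not reproduced in the abstract; in one common form of the distribution (due to Bardwell and Crow), the hyper-Poisson probability mass function can be written in LaTeX as

        P(Y = y) \;=\; \frac{1}{{}_{1}F_{1}(1;\gamma;\lambda)}\,\frac{\lambda^{y}}{(\gamma)_{y}},
        \qquad y = 0, 1, 2, \ldots,

    where (\gamma)_{y} = \gamma(\gamma+1)\cdots(\gamma+y-1) is the ascending factorial and {}_{1}F_{1} is the confluent hypergeometric function. Setting \gamma = 1 recovers the Poisson distribution, \gamma > 1 gives overdispersion and 0 < \gamma < 1 underdispersion; in the generalized linear model discussed above, the parameters are linked to covariates so that the dispersion can be observation-specific.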

  18. Intelligent retrieval of medical images from the Internet

    NASA Astrophysics Data System (ADS)

    Tang, Yau-Kuo; Chiang, Ted T.

    1996-05-01

    The objective of this study is to use Internet resources to provide a cost-effective, user-friendly method of accessing the medical image archive system and an easy method for the user to identify the images required. This paper describes the prototype system architecture, the implementation, and results. In the study, we prototype the Intelligent Medical Image Retrieval (IMIR) system as a Hypertext Transfer Protocol (HTTP) server and provide Hypertext Markup Language (HTML) forms through which the user, as an Internet client, enters image retrieval criteria in a browser for review. We are developing the intelligent retrieval engine, with the capability to map free-text search criteria to the standard terminology used for medical image identification. We evaluate retrieved records based on the number of free-text entries matched and their relevance to the standard terminology. We are in the integration and testing phase. We have collected only a few different types of images for testing and have trained a few phrases to map the free text to the standard medical terminology. Nevertheless, we are able to demonstrate IMIR's ability to search, retrieve, and review medical images from the archives using a general Internet browser. The prototype also uncovered potential problems in performance, security, and accuracy. Additional studies and enhancements will make the system clinically operational.
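
    The paper's actual form layout is not described above; a minimal sketch of an HTML form of the kind the abstract refers to, with invented field names and a hypothetical server endpoint, might look like:

        <form action="/cgi-bin/imir-search" method="post">
          Modality:  <select name="modality">
                       <option>CT</option><option>MR</option><option>Ultrasound</option>
                     </select><br>
          Body part: <input type="text" name="bodypart"><br>
          Free text: <input type="text" name="freetext" size="60"><br>
          <input type="submit" value="Retrieve images">
        </form>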

  19. MacMouse. Developing Preschool Readiness Concepts and Skills with HyperCard and MacRecorder.

    ERIC Educational Resources Information Center

    Fitterman, L. Jeffrey

    Through developments with the use of the "Apple Macintosh" computer, "HyperCard," and "MacRecorder," children in preschool handicapped programs are now capable of participating in appropriate computerized learning experiences. "HyperCard" allows educators to produce their own computerized instructional…

  20. XML DTD and Schemas for HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Yang, Jingli

    2008-01-01

    An Extensible Markup Language (XML) document type definition (DTD) standard for the structure and contents of HDF-EOS files, and an equivalent standard in the form of XML schemas, have been developed.
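
    The actual HDF-EOS DTD is not reproduced in this brief description; purely for illustration (hypothetical element and attribute names), a DTD of the same general flavor, declaring a file element that contains named objects with typed attributes, would read:

        <!ELEMENT file      (object*)>
        <!ELEMENT object    (attribute*, data?)>
        <!ELEMENT attribute (#PCDATA)>
        <!ELEMENT data      (#PCDATA)>
        <!ATTLIST object    name CDATA #REQUIRED
                            type (grid|swath|point) #REQUIRED>
        <!ATTLIST attribute name CDATA #REQUIRED>

    An XML Schema can express the same constraints with richer data typing, which is the role of the equivalent schema standard mentioned above.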

  1. Web GIS in practice VIII: HTML5 and the canvas element for interactive online mapping.

    PubMed

    Boulos, Maged N Kamel; Warren, Jeffrey; Gong, Jianya; Yue, Peng

    2010-03-03

    HTML5 is being developed as the next major revision of HTML (Hypertext Markup Language), the core markup language of the World Wide Web. It aims at reducing the need for proprietary, plug-in-based rich Internet application (RIA) technologies such as Adobe Flash. The canvas element is part of HTML5 and is used to draw graphics using scripting (e.g., JavaScript). This paper introduces Cartagen, an open-source, vector-based, client-side framework for rendering plug-in-free, offline-capable, interactive maps in native HTML5 on a wide range of Web browsers and mobile phones. Cartagen was developed at MIT Media Lab's Design Ecology group. Potential applications of the technology as an enabler for participatory online mapping include mapping real-time air pollution, citizen reporting, and disaster response, among many other possibilities.
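
    As a minimal, generic illustration of the plug-in-free drawing model the paper builds on (this is not Cartagen's own code), an HTML5 page can render a simple vector path with the canvas element and a few lines of JavaScript:

        <!DOCTYPE html>
        <html>
          <body>
            <canvas id="map" width="400" height="300"></canvas>
            <script>
              // Draw a simple polyline (e.g., a road segment) as client-side vector graphics.
              var ctx = document.getElementById('map').getContext('2d');
              ctx.strokeStyle = '#cc0000';
              ctx.lineWidth = 2;
              ctx.beginPath();
              ctx.moveTo(20, 250);
              ctx.lineTo(150, 120);
              ctx.lineTo(380, 60);
              ctx.stroke();
            </script>
          </body>
        </html>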

  2. Running MONET and SALT with Remote Telescope Markup Language 3.0

    NASA Astrophysics Data System (ADS)

    Hessman, F. V.; Romero, E.

    2003-05-01

    Complex robotic and service observations in heterogeneous networks of telescopes require a common telescopic lingua franca for the description and transport of observing requests and results. Building upon the experience gained within the Hands-On Universe (HOU) and advanced amateur communities with Remote Telescope Markup Language (RTML) Version 2.1 (http://sunra.lbl.gov/rtml), we have implemented a revised RTML syntax (Version 3.0) which is fully capable of:
    - running the two 1.2m MONET robotic telescopes for a very inhomogeneous clientele from 3 research institutions and high school classes all over the world;
    - connecting MONET to the HOU telescope network;
    - connecting MONET as a trigger to the 11m SALT telescope;
    - providing all the objects needed to perform and document internet-based user support, ranging all the way from proposal submission and time-allocation to observation reports.

  3. The Schema.org Datasets Schema: Experiences at the National Snow and Ice Data Center

    NASA Astrophysics Data System (ADS)

    Duerr, R.; Billingsley, B. W.; Harper, D.; Kovarik, J.

    2014-12-01

    Data discovery is still a major challenge for many users. Relevant data may be located anywhere, and there are currently no universal data registries. Often users start with a simple query through their web browser. But how do you get your data to actually show up near the top of the results? One relatively new way to accomplish this is to use schema.org Dataset markup in your data pages. Theoretically this provides web crawlers the additional information needed so that a query for data will preferentially return those pages that were marked up accordingly. The National Snow and Ice Data Center recently implemented an initial set of markup in the data set pages returned by its catalog. The Dataset data model, our process, the challenges encountered, and the results will be described.
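
    The markup NSIDC actually deployed is not shown in the abstract; a generic schema.org Dataset annotation (hypothetical values), embedded in a landing page with microdata attributes, looks roughly like:

        <div itemscope itemtype="http://schema.org/Dataset">
          <h1 itemprop="name">Example Sea Ice Extent Data Set</h1>
          <p itemprop="description">Hypothetical description of the data set, readable by crawlers.</p>
          <span itemprop="temporalCoverage">1979-01-01/2014-12-31</span>
          <link itemprop="url" href="https://example.org/data/sea-ice-extent"/>
        </div>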

  4. The development of MML (Medical Markup Language) version 3.0 as a medical document exchange format for HL7 messages.

    PubMed

    Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki

    2004-12-01

    Medical Markup Language (MML), as a set of standards, has been developed over the last 8 years to allow the exchange of medical data between different medical information providers. MML Version 2.21 used XML as a metalanguage and was announced in 1999. In 2001, MML was updated to Version 2.3, which contained 12 modules. The latest version--Version 3.0--is based on the HL7 Clinical Document Architecture (CDA). During the development of this new version, the structure of MML Version 2.3 was analyzed, subdivided into several categories, and redefined so the information defined in MML could be described in HL7 CDA Level One. As a result of this development, it has become possible to exchange MML Version 3.0 medical documents via HL7 messages.

  5. Instrument Remote Control Application Framework

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Hostetter, Carl F.

    2006-01-01

    The Instrument Remote Control (IRC) architecture is a flexible, platform-independent application framework that is well suited for the control and monitoring of remote devices and sensors. IRC enables significant savings in development costs by utilizing Extensible Markup Language (XML) descriptions to configure the framework for a specific application. The Instrument Markup Language (IML) is used to describe the commands used by an instrument, the data streams produced, the rules for formatting commands and parsing the data, and the method of communication. Often no custom code is needed to communicate with a new instrument or device. An IRC instance can advertise and publish a description of a device or subscribe to another device's description on a network. This simple capability of dynamically publishing and subscribing to interfaces enables a very flexible, self-adapting architecture for monitoring and control of complex instruments in diverse environments.
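
    The IML schema is not reproduced in the description above; the fragment below is a hypothetical sketch, with invented element names, of the kind of command, argument, and data-stream description such a framework could read at run time:

        <instrument name="ExampleSpectrometer">
          <communication type="tcp" host="instrument.example.org" port="5000"/>
          <command name="SET_EXPOSURE" format="EXP %d">
            <argument name="milliseconds" type="int" min="1" max="60000"/>
          </command>
          <dataStream name="spectrum" encoding="2048 little-endian uint16 values"/>
        </instrument>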

  6. Improving access to malaria medicine through private-sector subsidies in seven African countries.

    PubMed

    Tougher, Sarah; Mann, Andrea G; Ye, Yazoume; Kourgueni, Idrissa A; Thomson, Rebecca; Amuasi, John H; Ren, Ruilin; Willey, Barbara A; Ansong, Daniel; Bruxvoort, Katia; Diap, Graciela; Festo, Charles; Johanes, Boniface; Kalolella, Admirabilis; Mallam, Oumarou; Mberu, Blessing; Ndiaye, Salif; Nguah, Samual Blay; Seydou, Moctar; Taylor, Mark; Wamukoya, Marilyn; Arnold, Fred; Hanson, Kara; Goodman, Catherine

    2014-09-01

    Improving access to quality-assured artemisinin combination therapies (ACTs) is an important component of malaria control in low- and middle-income countries. In 2010 the Global Fund to Fight AIDS, Tuberculosis, and Malaria launched the Affordable Medicines Facility--malaria (AMFm) program in seven African countries. The goal of the program was to decrease malaria morbidity and delay drug resistance by increasing the use of ACTs, primarily through subsidies intended to reduce costs. We collected data on price and retail markups on antimalarial medicines from 19,625 private for-profit retail outlets before and 6-15 months after the program's implementation. We found that in six of the AMFm pilot programs, prices for quality-assured ACTs decreased by US$1.28-$4.34, and absolute retail markups on these therapies decreased by US$0.31-$1.03. Prices and markups on other classes of antimalarials also changed during the evaluation period, but not to the same extent. In all but two of the pilot programs, we found evidence that prices could fall further without suppliers' losing money. Thus, concerns may be warranted that wholesalers and retailers are capturing subsidies instead of passing them on to consumers. These findings demonstrate that supranational subsidies can dramatically reduce retail prices of health commodities and that recommended retail prices communicated to a wide audience may be an effective mechanism for controlling the market power of private-sector antimalarial retailers and wholesalers. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    NASA Astrophysics Data System (ADS)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., which was acquired by Google in 2004, KML was given over to the Open Geospatial Consortium (OGC) in 2007. It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, has been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML, with no or relatively little knowledge of manipulating computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
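
    As a concrete reminder of how compact the language is (the coordinates and names below are arbitrary placeholders, not taken from any of the layers mentioned), a complete KML document that places a single labeled point on the globe reads:

        <?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Placemark>
            <name>Example point</name>
            <description>Hypothetical sample placemark.</description>
            <!-- coordinates are longitude,latitude,altitude -->
            <Point>
              <coordinates>-155.28,19.41,0</coordinates>
            </Point>
          </Placemark>
        </kml>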

  8. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
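
    The authors' full electronic representation is not shown in the abstract; a simplified, hypothetical fragment (invented element and attribute names) of a hierarchical, XML-encoded classification of this kind could look like:

        <classification name="ICD-10" language="en">
          <chapter code="X" title="Diseases of the respiratory system">
            <block code="J40-J47" title="Chronic lower respiratory diseases">
              <category code="J45" title="Asthma">
                <subcategory code="J45.0" title="Predominantly allergic asthma"/>
              </category>
            </block>
          </chapter>
        </classification>

    XML topic maps, as mentioned above, could then link such category elements to semantically associated entries in other sources (for example, DRG definitions).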

  9. Transparent ICD and DRG Coding Using Information Technology: Linking and Associating Information Sources with the eXtensible Markup Language

    PubMed Central

    Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813

  10. Gene Fusion Markup Language: a prototype for exchanging gene fusion data.

    PubMed

    Kalyana-Sundaram, Shanker; Shanmugam, Achiraman; Chinnaiyan, Arul M

    2012-10-16

    An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Here we propose a prototype, Gene Fusion Markup Language (GFML) as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an inter-operable and query-able fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses.
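
    As a hedged illustration of the kind of machine-readable record GFML aims at, the sketch below builds and reads a GFML-like fusion entry with Python's standard library. The element names are assumptions made for illustration; the actual prototype schema is the one published at the project URL above.

      # Sketch of a GFML-like fusion record; element names are assumptions,
      # not the published GFML schema (see the project URL in the abstract).
      import xml.etree.ElementTree as ET

      doc = """
      <geneFusion id="example-1" assay="RNA-seq">
        <partner role="5prime" geneSymbol="TMPRSS2" chromosome="21"/>
        <partner role="3prime" geneSymbol="ERG" chromosome="21"/>
        <evidence supportingReads="87"/>
      </geneFusion>
      """

      fusion = ET.fromstring(doc)
      partners = {p.get("role"): p.get("geneSymbol") for p in fusion.findall("partner")}
      print(fusion.get("id"), partners["5prime"], "->", partners["3prime"])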

  11. What Are HyperCard? (Part 2).

    ERIC Educational Resources Information Center

    Marcus, Stephen

    1989-01-01

    Presents the second article in a two-part series on HyperCard materials (computer software used to build structures that create patterns and connections) designed for English and language arts classes. Suggests assignments for use with early HyperCard software that can be adapted to a variety of nonverbal "stackware." (MM)

  12. A Study of the Effect of HyperCard and Pen-Paper Performance Assessment Methods on Expert-Novice Chemistry Problem Solving.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    1994-01-01

    Investigates HyperCard as a tool for assessment in science education and determines whether or not a HyperCard assessment instrument could differentiate between expert and novice student performance on balancing stoichiometric equations in science education. (ZWH)

  13. DEVELOPMENT OF BIO-BASED MOLECULAR TECHNOLOGIES FOR REMOVAL AND REAL-TIME MONITORING OF TOXIC METALS

    EPA Science Inventory

    Transformation of heavy-metal related genes from a hyper-accumulator to a high-biomass species is expected to promote a zinc hyper-accumulating phenotype in the normally non-hyper-accumulating poplar. Coupling fluorescence with heavy metal proteins is anticipated to allow ...

  14. Design and fabrication of a 900-1700 nm hyper-spectral imaging spectrometer

    NASA Astrophysics Data System (ADS)

    Kim, Tae Hyoung; Kong, Hong Jin; Kim, Tae Hoon; Shin, Jae Sung

    2010-02-01

    This paper presents a 900-1700 nm hyper-spectral imaging spectrometer which offers low distortion, a low F-number, a compact size, an easily fabricated design, and a low cost. The starting point for its optical design is discussed according to geometrical aberration theory and the Rowland circle condition. It is shown that these methods are useful in designing a push-broom hyper-spectral imaging spectrometer that has an aperture of f/2.4, modulation transfer functions of less than 0.8 at 25 cycles/mm, and spot sizes less than 10 μm. A prototype of the optimized hyper-spectral imaging spectrometer has been fabricated using a high precision machine, and an experimental demonstration with the fabricated hyper-spectral imaging spectrometer is presented.

  15. Self-assembled tunable photonic hyper-crystals

    PubMed Central

    Smolyaninova, Vera N.; Yost, Bradley; Lahneman, David; Narimanov, Evgenii E.; Smolyaninov, Igor I.

    2014-01-01

    We demonstrate a novel artificial optical material, the “photonic hyper-crystal”, which combines the most interesting features of hyperbolic metamaterials and photonic crystals. Similar to hyperbolic metamaterials, photonic hyper-crystals exhibit broadband divergence in their photonic density of states due to the lack of usual diffraction limit on the photon wave vector. On the other hand, similar to photonic crystals, hyperbolic dispersion law of extraordinary photons is modulated by forbidden gaps near the boundaries of photonic Brillouin zones. Three dimensional self-assembly of photonic hyper-crystals has been achieved by application of external magnetic field to a cobalt nanoparticle-based ferrofluid. Unique spectral properties of photonic hyper-crystals lead to extreme sensitivity of the material to monolayer coatings of cobalt nanoparticles, which should find numerous applications in biological and chemical sensing. PMID:25027947

  16. Self-assembled tunable photonic hyper-crystals.

    PubMed

    Smolyaninova, Vera N; Yost, Bradley; Lahneman, David; Narimanov, Evgenii E; Smolyaninov, Igor I

    2014-07-16

    We demonstrate a novel artificial optical material, the "photonic hyper-crystal", which combines the most interesting features of hyperbolic metamaterials and photonic crystals. Similar to hyperbolic metamaterials, photonic hyper-crystals exhibit broadband divergence in their photonic density of states due to the lack of usual diffraction limit on the photon wave vector. On the other hand, similar to photonic crystals, hyperbolic dispersion law of extraordinary photons is modulated by forbidden gaps near the boundaries of photonic Brillouin zones. Three dimensional self-assembly of photonic hyper-crystals has been achieved by application of external magnetic field to a cobalt nanoparticle-based ferrofluid. Unique spectral properties of photonic hyper-crystals lead to extreme sensitivity of the material to monolayer coatings of cobalt nanoparticles, which should find numerous applications in biological and chemical sensing.

  17. [Hyper-reactive malarial splenomegaly].

    PubMed

    Maazoun, F; Deschamps, O; Barros-Kogel, E; Ngwem, E; Fauchet, N; Buffet, P; Froissart, A

    2015-11-01

    Hyper-reactive malarial splenomegaly is a rare and severe form of chronic malaria. This condition is a common cause of splenomegaly in endemic areas. The pathophysiology of hyper-reactive malarial splenomegaly involves an intense immune reaction (predominantly B cell-driven) to repeated/chronic infections with Plasmodium sp. The diagnosis may be difficult, due to a poorly specific clinical presentation (splenomegaly, fatigue, cytopenias), a long delay between residence in a malaria-endemic area and onset of symptoms, and a frequent absence of parasites on conventional thin and thick blood smears. A strongly contributive laboratory parameter is the presence of high levels of total immunoglobulin M. When the diagnosis of hyper-reactive malarial splenomegaly is considered, a search for anti-Plasmodium antibodies and Plasmodium nucleic acids (genus and species) by PCR is useful. Diagnosis of hyper-reactive malarial splenomegaly relies on the simultaneous presence of epidemiological, clinical, biological and follow-up findings. Regression of both splenomegaly and hypersplenism following antimalarial therapy allows the differential diagnosis with splenic lymphoma, a common complication of hyper-reactive malarial splenomegaly. Although rare in Western countries, hyper-reactive malarial splenomegaly deserves increased medical awareness to reduce the incidence of incorrect diagnosis, to prevent progression to splenic lymphoma and to avoid splenectomy. Copyright © 2015 Société nationale française de médecine interne (SNFMI). Published by Elsevier SAS. All rights reserved.

  18. Renal responses to plasma volume expansion and hyperosmolality in fasting seal pups

    NASA Technical Reports Server (NTRS)

    Ortiz, Rudy M.; Wade, Charles E.; Costa, Daniel P.; Ortiz, C. Leo

    2002-01-01

    Renal responses were quantified in northern elephant seal (Mirounga angustirostris) pups during their postweaning fast to examine their excretory capabilities. Pups were infused with either isotonic (0.9%; n = 8; Iso) or hypertonic (16.7%; n = 7; Hyper) saline via an indwelling catheter such that each pup received 3 mmol NaCl/kg. Diuresis after the infusions was similar in magnitude between the two treatments. Osmotic clearance increased by 37% in Iso and 252% in Hyper. Free water clearance was reduced 3.4-fold in Hyper but was not significantly altered in Iso. Glomerular filtration rate increased 71% in the 24-h period after Hyper, but no net change occurred during the same time after Iso. Natriuresis increased 3.6-fold in Iso and 5.3-fold in Hyper. Iso decreased plasma arginine vasopressin (AVP) and cortisol acutely, whereas Hyper increased plasma and excreted AVP and cortisol. Iso was accompanied by the retention of water and electrolytes, whereas the Hyper load was excreted within 24 h. Natriuresis is attributed to increased filtration and is independent of an increase in atrial natriuretic peptide and decreases in ANG II and aldosterone. Fasting pups appear to have well-developed kidneys capable of both extreme conservation and excretion of Na(+).

  19. Future perspectives - proposal for Oxford Physiome Project.

    PubMed

    Oku, Yoshitaka

    2010-01-01

    The Physiome Project is an effort to understand living creatures using an "analysis by synthesis" strategy, i.e., by reproducing their behaviors. In order to achieve its goal, sharing developed models between different computer languages and application programs so that they can be incorporated into integrated models is critical. To date, several XML-based markup languages have been developed for this purpose. However, source code written in XML-based languages is very difficult to read and edit using text editors. An alternative is to use an object-oriented meta-language, which can be translated to different computer languages and transplanted to different application programs. Object-oriented languages are suitable for describing structural organization with hierarchical classes and for taking advantage of statistical properties to reduce the number of parameters while keeping the complexity of behaviors. Using object-oriented languages to describe each element and posting these descriptions to a public domain should be the next step in building up integrated models of the respiratory control system.
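
    A minimal sketch of the object-oriented idea outlined above, assuming invented class names and a placeholder equation: model elements are described as a hierarchy of classes that could, in principle, be translated to other languages or application programs.

      # Illustrative class hierarchy for describing model elements; the names
      # and the ohmic current law are placeholders, not a Physiome specification.
      from dataclasses import dataclass, field

      @dataclass
      class ModelElement:
          name: str
          parameters: dict = field(default_factory=dict)

      @dataclass
      class IonChannel(ModelElement):
          reversal_potential_mV: float = 0.0

          def current(self, conductance_nS: float, v_mV: float) -> float:
              # Ohmic current through the channel: I = g * (V - E_rev)
              return conductance_nS * (v_mV - self.reversal_potential_mV)

      leak = IonChannel(name="leak", reversal_potential_mV=-65.0)
      print(leak.current(conductance_nS=1.5, v_mV=-50.0))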

  20. Emotional hyper-reactivity and cardiometabolic risk in remitted bipolar patients: a machine learning approach.

    PubMed

    Dargél, A A; Roussel, F; Volant, S; Etain, B; Grant, R; Azorin, J-M; M'Bailara, K; Bellivier, F; Bougerol, T; Kahn, J-P; Roux, P; Aubin, V; Courtet, P; Leboyer, M; Kapczinski, F; Henry, C

    2018-05-15

    Remitted bipolar disorder (BD) patients frequently present with chronic mood instability and emotional hyper-reactivity, associated with poor psychosocial functioning and low-grade inflammation. We investigated emotional hyper-reactivity as a dimension for characterization of remitted BD patients, and clinical and biological factors for identifying those with and without emotional hyper-reactivity. A total of 635 adult remitted BD patients, evaluated in the French Network of Bipolar Expert Centers from 2010-2015, were assessed for emotional reactivity using the Multidimensional Assessment of Thymic States. Machine learning algorithms were used on clinical and biological variables to enhance characterization of patients. After adjustment, patients with emotional hyper-reactivity (n = 306) had significantly higher levels of systolic and diastolic blood pressure (P < 1.0 × 10^-8), high-sensitivity C-reactive protein (P < 1.0 × 10^-8), fasting glucose (P < 2.23 × 10^-6), glycated hemoglobin (P = 0.0008) and suicide attempts (P = 1.4 × 10^-8). Using models of combined clinical and biological factors for distinguishing BD patients with and without emotional hyper-reactivity, the strongest predictors were: systolic and diastolic blood pressure, fasting glucose, C-reactive protein and number of suicide attempts. This predictive model identified patients with emotional hyper-reactivity with 84.9% accuracy. The assessment of emotional hyper-reactivity in remitted BD patients is clinically relevant, particularly for identifying those at higher risk of cardiometabolic dysfunction, chronic inflammation, and suicide. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
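
    As a hedged, generic sketch of the kind of supervised model the abstract refers to (not the authors' pipeline, and with synthetic data in place of cohort data), a classifier can be trained on clinical and biological variables roughly as follows.

      # Generic sketch with synthetic data; not the authors' pipeline or results.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(635, 5))   # stand-ins for blood pressure, CRP, glucose, HbA1c, suicide attempts
      y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=635) > 0).astype(int)

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
      print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))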

  1. (Relatively) Painless Computer-Assisted Instruction with HyperStudio.

    ERIC Educational Resources Information Center

    Pina, Anthony A.

    The College of the Desert (California) has created a multi-station technology training and development facility for faculty. HyperStudio has been adopted as the introductory tool for multimedia/hypermedia authoring for the following reasons: (1) the card/stack metaphor used by HyperStudio is easy for novices to understand and familiar to users of…

  2. A Tour of the Stacks--HyperCard for Libraries.

    ERIC Educational Resources Information Center

    Ertel, Monica; Oros, Jane

    1989-01-01

    Description of HyperCard, a software package that runs on Macintosh microcomputers, focuses on its use in the Apple Computer, Inc., Library as a user guide to the library. Examples of screen displays are given, and a list of resources is included to help use and understand HyperCard more completely. (LRW)

  3. Hyper-crosslinked cyclodextrin porous polymer: An efficient CO2 capturing material with tunable porosity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Bo; Li, Haiyang; East China Univ. of Science and Technology, Shanghai

    We designed and synthesized the cyclodextrin (CD)-based hyper-crosslinked porous polymers (HCPPs) for selective CO2 adsorption and storage. We also explored the effect of monomer size on micropore formation, and determined a feasible way to tailor the porosity of the materials during the hyper-crosslinking process.

  4. Hyper-crosslinked cyclodextrin porous polymer: An efficient CO2 capturing material with tunable porosity

    DOE PAGES

    Meng, Bo; Li, Haiyang; East China Univ. of Science and Technology, Shanghai; ...

    2016-11-11

    We designed and synthesized the cyclodextrin (CD)-based hyper-crosslinked porous polymers (HCPPs) for selective CO2 adsorption and storage. We also explored the effect of monomer size on micropore formation, and determined a feasible way to tailor the porosity of the materials during the hyper-crosslinking process.

  5. Hyper-Activity in Children Having Behavior Disorders

    ERIC Educational Resources Information Center

    Childers, A. T.

    2009-01-01

    Frequently, child guidance clinics, pediatricians, teachers, and others have brought to their attention children who manifest hyper-activity as an outstanding feature and of such a degree as to be regarded outside the bounds of normal conduct. The literature on this subject, except for hyper-activity in infancy, has mostly to do with the…

  6. Hyper-Fit: Fitting Linear Models to Multidimensional Data with Multivariate Gaussian Uncertainties

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Obreschkow, D.

    2015-09-01

    Astronomical data is often uncertain with errors that are heteroscedastic (different for each data point) and covariant between different dimensions. Assuming that a set of D-dimensional data points can be described by a (D - 1)-dimensional plane with intrinsic scatter, we derive the general likelihood function to be maximised to recover the best fitting model. Alongside the mathematical description, we also release the hyper-fit package for the R statistical language (http://github.com/asgr/hyper.fit) and a user-friendly web interface for online fitting (http://hyperfit.icrar.org). The hyper-fit package offers access to a large number of fitting routines, includes visualisation tools, and is fully documented in an extensive user manual. Most of the hyper-fit functionality is accessible via the web interface. In this paper, we include applications to toy examples and to real astronomical data from the literature: the mass-size, Tully-Fisher, Fundamental Plane, and mass-spin-morphology relations. In most cases, the hyper-fit solutions are in good agreement with published values, but uncover more information regarding the fitted model.
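
    For orientation, the likelihood referred to above, for N data points x_i with covariance matrices C_i scattered about a (D - 1)-dimensional plane with unit normal \hat{n}, orthogonal offset c_\perp, and intrinsic scatter \sigma_\perp normal to the plane, takes the standard form below; this is the generic expression implied by the description, not an equation copied from the paper.

      \ln \mathcal{L} = -\frac{1}{2} \sum_{i=1}^{N} \left[ \ln\!\left( 2\pi \left( \sigma_{\perp}^{2} + \hat{n}^{\mathsf{T}} C_{i} \hat{n} \right) \right) + \frac{\left( \hat{n}^{\mathsf{T}} x_{i} - c_{\perp} \right)^{2}}{\sigma_{\perp}^{2} + \hat{n}^{\mathsf{T}} C_{i} \hat{n}} \right]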

  7. Aerodynamic Database Development for the Hyper-X Airframe Integrated Scramjet Propulsion Experiments

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Holland, Scott D.; Cockrell, Charles E., Jr.; Bittner, Robert D.

    2000-01-01

    This paper provides an overview of the activities associated with the aerodynamic database which is being developed in support of NASA's Hyper-X scramjet flight experiments. Three flight tests are planned as part of the Hyper-X program. Each will utilize a small, nonrecoverable research vehicle with an airframe integrated scramjet propulsion engine. The research vehicles will be individually rocket boosted to the scramjet engine test points at Mach 7 and Mach 10. The research vehicles will then separate from the first stage booster vehicle and the scramjet engine test will be conducted prior to the terminal descent phase of the flight. An overview is provided of the activities associated with the development of the Hyper-X aerodynamic database, including wind tunnel test activities and parallel CFD analysis efforts for all phases of the Hyper-X flight tests. A brief summary of the Hyper-X research vehicle aerodynamic characteristics is provided, including the direct and indirect effects of the airframe integrated scramjet propulsion system operation on the basic airframe stability and control characteristics. Brief comments on the planned post flight data analysis efforts are also included.

  8. USAF Summer Research Program - 1995 High School Apprenticeship Program Final Reports, Volume 12A, Armstrong Laboratory

    DTIC Science & Technology

    1995-12-01

    square cms and ground up for ten minutes using a mortar and pestle. Ten ml of sterile water was added, and the roots were ground up for an additional... preparation as well as others in the Human Resources Directorate was positive. HyperText Technical Reports and Papers are available any night or holiday... Analysis." Psychological Bulletin 1994: 429-456. Feingold, Alan. "Sex Differences in Variability in Intellectual Abilities: A New Look at an Old

  9. Archive of Boomer Seismic Reflection Data Collected During USGS Cruises 01SCC01 and 01SCC02, Timbalier Bay and Offshore East Timbalier Island, Louisiana, June-August, 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Kindinger, Jack G.; Wiese, Dana S.

    2003-01-01

    In June, July, and August of 2001, the U.S. Geological Survey (USGS), in cooperation with the University of New Orleans (UNO), the U.S. Army Corps of Engineers, and the Louisiana Department of Natural Resources, conducted a shallow geophysical and sediment core survey of Timbalier Bay and the Gulf of Mexico offshore East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, trackline navigation maps, observers' logbooks, Geographic Information Systems (GIS) information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a filtered and gained digital Graphics Interchange Format (GIF) image of each seismic profile is provided. Please see Kulp and others (2002), Flocks and others (2003), and Kulp and others (in prep.) for further information about the sediment cores collected and the geophysical results. For convenience, a list of acronyms and abbreviations frequently used in this report is also included. This Digital Versatile Disc (DVD) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information using a web browser (i.e. Netscape, Internet Explorer). To access the information contained on this disc, open the file 'index.htm' located at the top level of the disc using a web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the Internet is available while viewing this DVD. The archived boomer seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes/index.html. Examples of SU processing scripts are provided in the BOOM.tar file located in the SU subfolder of the SOFTWARE folder located at the top level of this disc. In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zihlman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, trackline navigation maps, logbooks, and formal metadata may be viewed with a web browser.
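
    As a small, hedged aid to working with the archived files outside the tools listed above: SEG-Y rev 0 places a 3200-byte textual header before a 400-byte binary file header, and a few standard binary-header fields can be read with Python's standard library as sketched below. The byte offsets follow the published SEG-Y convention and should be checked against the observers' logbooks before being relied on for a given line; the file name in the comment is hypothetical.

      # Sketch: read a few SEG-Y rev 0 binary-header fields (big-endian).
      import struct

      def read_segy_summary(path):
          with open(path, "rb") as f:
              f.read(3200)                   # 3200-byte EBCDIC textual header
              binary_header = f.read(400)    # 400-byte binary file header
          sample_interval_us, = struct.unpack(">H", binary_header[16:18])
          samples_per_trace, = struct.unpack(">H", binary_header[20:22])
          format_code, = struct.unpack(">H", binary_header[24:26])
          return sample_interval_us, samples_per_trace, format_code

      # Example with a hypothetical file name from the archive:
      # print(read_segy_summary("line01.sgy"))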

  10. Archive of Chirp Seismic Reflection Data Collected During USGS Cruises 01SCC01 and 01SCC02, Timbalier Bay and Offshore East Timbalier Island, Louisiana, June 30 - July 9 and August 1 - 12, 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.

    2003-01-01

    In June, July, and August of 2001, the U.S. Geological Survey (USGS), in cooperation with the University of New Orleans, the U.S. Army Corps of Engineers, and the Louisiana Department of Natural Resources, conducted a shallow geophysical and sediment core survey of Timbalier Bay and the Gulf of Mexico offshore East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, trackline navigation maps, observers' logbooks, Geographic Information Systems (GIS) information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a gained digital Graphics Interchange Format (GIF) image of each seismic profile is provided. Please see Kulp and others (2002), Flocks and others (2003), and Kulp and others (in prep.) for further information about the sediment cores collected and the geophysical results. For convenience, a list of acronyms and abbreviations frequently used in this report is also included. This Digital Versatile Disc (DVD) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information using a web browser (i.e. Netscape, Internet Explorer). To access the information contained on these discs, open the file 'index.htm' located at the top level of each disc using a web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the internet is available while viewing these DVDs. The archived chirp seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes/index.html. Examples of SU processing scripts are provided in the CHIRP.tar file located in the SU subfolder of the SOFTWARE folder located at the top level of each disc. In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zihlman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, trackline navigation maps, logbooks, and formal metadata may be viewed with a web browser.

  11. Characterization of benign thyroid nodules with HyperSPACE (Hyper Spectral Analysis for Characterization in Echography) before and after percutaneous laser ablation: a pilot study.

    PubMed

    Granchi, Simona; Vannacci, Enrico; Biagi, Elena

    2017-04-22

    To evaluate the capability of the HyperSPACE (Hyper SPectral Analysis for Characterization in Echography) method in tissue characterization, in order to provide information for the laser treatment of benign thyroid nodules with respect to conventional B-mode images and elastography. The method, based on the spectral analysis of the raw radiofrequency ultrasonic signal, was applied to characterize the nodule before and after laser treatment. Thirty patients (25 females and 5 males, aged between 37 and 81 years) with a benign thyroid nodule at cytology (Thyr 2) were evaluated by conventional ultrasonography, elastography, and HyperSPACE, before and after laser ablation. The images processed by HyperSPACE exhibit different color distributions that correspond to different tissue features. By calculating the percentages of the color coverages, the analysed nodules were subdivided into 3 groups. Each nodule belonging to the same group experienced, on average, similar necrosis extension. The nodules exhibit different Configuration (color) distributions that could be indicative of the response of nodular tissue to the laser treatment. Conclusions: HyperSPACE can characterize benign nodules by providing additional information with respect to conventional ultrasound and elastography, which is useful as support in the laser treatment of nodules in order to increase the probability of success.

  12. OntologyWidget – a reusable, embeddable widget for easily locating ontology terms

    PubMed Central

    Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin

    2007-01-01

    Background: Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results: We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. Conclusion: We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from http://smd.stanford.edu/ontologyWidget/. PMID:17854506

  13. OntologyWidget - a reusable, embeddable widget for easily locating ontology terms.

    PubMed

    Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, J H Pate; Ball, Catherine A; Sherlock, Gavin

    2007-09-13

    Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website 1. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat 2 on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website 1, as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from http://smd.stanford.edu/ontologyWidget/.
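
    A hedged sketch of the server-side idea behind the auto-complete element described above: a term-search function that returns ontology terms matching the typed-in text. It is not the SMD web service or the OntologyWidget API; the term list is a tiny illustrative sample of Gene Ontology entries.

      # Sketch of the auto-complete idea only: substring search over ontology terms.
      from typing import List, Tuple

      TERMS: List[Tuple[str, str]] = [
          ("GO:0006915", "apoptotic process"),
          ("GO:0007049", "cell cycle"),
          ("GO:0008283", "cell population proliferation"),
      ]

      def search_terms(query: str, limit: int = 10) -> List[Tuple[str, str]]:
          q = query.lower()
          hits = [(acc, name) for acc, name in TERMS if q in name.lower()]
          return hits[:limit]

      print(search_terms("cell"))   # results would populate the drop-down list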

  14. Principles for Instructional Stack Development in HyperCard.

    ERIC Educational Resources Information Center

    McEneaney, John E.

    The purpose of this paper is to provide information about obtaining and using HyperCard stacks that introduce users to principles of stack development. The HyperCard stacks described are available for downloading free of charge from a server at Indiana University South Bend. Specific directions are given for stack use, with advice for beginners. A…

  15. NETL's Hybrid Performance, or Hyper, facility

    ScienceCinema

    None

    2018-02-13

    NETL's Hybrid Performance, or Hyper, facility is a one-of-a-kind laboratory built to develop control strategies for the reliable operation of fuel cell/turbine hybrids and enable the simulation, design, and implementation of commercial equipment. The Hyper facility provides a unique opportunity for researchers to explore issues related to coupling fuel cell and gas turbine technologies.

  16. Hyper-Binding across Time: Age Differences in the Effect of Temporal Proximity on Paired-Associate Learning

    ERIC Educational Resources Information Center

    Campbell, Karen L.; Trelle, Alexandra; Hasher, Lynn

    2014-01-01

    Older adults show hyper- (or excessive) binding effects for simultaneously and sequentially presented distraction. Here, we addressed the potential role of hyper-binding in paired-associate learning. Older and younger adults learned a list of word pairs and then received an associative recognition task in which rearranged pairs were formed from…

  17. Web Browser Trends and Technologies.

    ERIC Educational Resources Information Center

    Goodwin-Jones, Bob

    2000-01-01

    Discusses Web browsers and how their capabilities have been expanded, support for Web browsing on different devices (cell phones, palmtop computers, TV sets), and browser support for the next-generation Web authoring language, XML ("extensible markup language"). (Author/VWL)

  18. CLAIM (CLinical Accounting InforMation)--an XML-based data exchange standard for connecting electronic medical record systems to patient accounting systems.

    PubMed

    Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Takahashi, Kiwamu; Daimon, Hiroyuki; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki

    2005-08-01

    With the evolving and diverse electronic medical record (EMR) systems, there appears to be an ever greater need to link EMR systems and patient accounting systems with a standardized data exchange format. To this end, the CLinical Accounting InforMation (CLAIM) data exchange standard was developed. CLAIM is subordinate to the Medical Markup Language (MML) standard, which allows the exchange of medical data among different medical institutions. CLAIM uses eXtensible Markup Language (XML) as a meta-language. The current version, 2.1, inherited the basic structure of MML 2.x and contains two modules including information related to registration, appointment, procedure and charging. CLAIM 2.1 was implemented successfully in Japan in 2001. Consequently, it was confirmed that CLAIM could be used as an effective data exchange format between EMR systems and patient accounting systems.

  19. Restructuring an EHR system and the Medical Markup Language (MML) standard to improve interoperability by archetype technology.

    PubMed

    Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki

    2015-01-01

    In 2001, we developed an EHR system for regional healthcare information inter-exchange and to provide individual patient data to patients. This system was adopted in three regions in Japan. We also developed a Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system now has the capability to catch up with new requirements, maintaining semantic interoperability with archetype technology. It is also more flexible than the legacy EHR system.

  20. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    DOEpatents

    Boucheron, Laura E

    2013-07-16

    Quantitative object and spatial arrangement-level analysis of tissue are detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprises nuclear material, cytoplasm material, and stromal material. The method further allows a user to markup the image subsequent to the classification to re-classify said materials. The markup is performed via a graphic user interface to edit designated regions in the image.
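
    A minimal sketch of the two-step idea summarized above, with a toy image and an arbitrary intensity threshold standing in for the classifier: pixels are first labeled by class, then connected same-class pixels are agglomerated into segmented regions. SciPy is used here purely for illustration; it is not the patented implementation.

      # Step 1: pixel-by-pixel classification; Step 2: agglomeration into regions.
      import numpy as np
      from scipy import ndimage

      image = np.array([[0.9, 0.8, 0.1, 0.1],
                        [0.7, 0.9, 0.2, 0.1],
                        [0.1, 0.2, 0.8, 0.9],
                        [0.1, 0.1, 0.9, 0.7]])

      nuclear_mask = image > 0.5                        # toy rule standing in for the classifier
      labels, n_regions = ndimage.label(nuclear_mask)   # connected-component agglomeration
      print(n_regions, "regions")
      print(labels)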

  1. HDF-EOS Web Server

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: Extract metadata in Object Definition Language (ODL) from an HDF-EOS file, Convert the metadata from ODL to Extensible Markup Language (XML), Reformat the XML metadata into human-readable Hypertext Markup Language (HTML), Publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeN-DAP) server computer, and Reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-Science data.
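
    A hedged sketch of the same chain expressed as Python orchestration rather than a shell script. The converter command names ("odl2xml", "xml2html") are placeholders for the Data Usability Group tools named above, not real command-line utilities, and the file paths are hypothetical.

      # Placeholder pipeline: extract/convert metadata, then publish alongside the file.
      import shutil
      import subprocess
      from pathlib import Path

      def publish(hdfeos_file: Path, web_root: Path) -> None:
          xml_meta = hdfeos_file.with_suffix(".xml")
          html_meta = hdfeos_file.with_suffix(".html")
          subprocess.run(["odl2xml", str(hdfeos_file), str(xml_meta)], check=True)   # ODL -> XML
          subprocess.run(["xml2html", str(xml_meta), str(html_meta)], check=True)    # XML -> HTML
          for f in (hdfeos_file, html_meta):
              shutil.copy2(f, web_root / f.name)        # publish to the Web/OPeNDAP server root

      # publish(Path("granule.hdf"), Path("/var/www/hdfeos"))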

  2. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zubair, M.; Ziebartt, John (Technical Monitor)

    2001-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of HTML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself, the transformation functions, and also for applying the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
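
    As a small illustration of the transformation step described above, the sketch below applies an XSLT stylesheet to an XML document from Python using lxml, which stands in here for the Xalan engine the authors used; the stylesheet and document are toy examples, not XDMF itself.

      # Toy XSLT transformation; lxml stands in for Xalan, and the document is invented.
      from lxml import etree

      xml_doc = etree.fromstring("<dataset><name>wingflow</name><format>HDF</format></dataset>")
      xslt = etree.fromstring("""
      <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
        <xsl:template match="/dataset">
          <record><xsl:value-of select="name"/> (<xsl:value-of select="format"/>)</record>
        </xsl:template>
      </xsl:stylesheet>
      """)

      transform = etree.XSLT(xslt)
      print(str(transform(xml_doc)))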

  3. XML Based Scientific Data Management Facility

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself, the transformation functions, and also for applying the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.

  4. Web GIS in practice VIII: HTML5 and the canvas element for interactive online mapping

    PubMed Central

    2010-01-01

    HTML5 is being developed as the next major revision of HTML (Hypertext Markup Language), the core markup language of the World Wide Web. It aims at reducing the need for proprietary, plug-in-based rich Internet application (RIA) technologies such as Adobe Flash. The canvas element is part of HTML5 and is used to draw graphics using scripting (e.g., JavaScript). This paper introduces Cartagen, an open-source, vector-based, client-side framework for rendering plug-in-free, offline-capable, interactive maps in native HTML5 on a wide range of Web browsers and mobile phones. Cartagen was developed at MIT Media Lab's Design Ecology group. Potential applications of the technology as an enabler for participatory online mapping include mapping real-time air pollution, citizen reporting, and disaster response, among many other possibilities. PMID:20199681

  5. Standard Generalized Markup Language for self-defining structured reports.

    PubMed

    Kahn, C E

    1999-01-01

    Structured reporting is the process of using standardized data elements and predetermined data-entry formats to record observations. The Standard Generalized Markup Language (SGML; International Standards Organization (ISO) 8879:1986)--an open, internationally accepted standard for document interchange--was used to encode medical observations acquired in an Internet-based structured reporting system. The resulting report is self-documenting: it includes a definition of its allowable data fields and values encoded as a report-specific SGML document type definition (DTD). The data-entry forms, DTD, and report document instances are based on report specifications written in a simple, SGML-based language designed for that purpose. Reporting concepts can be linked with those of external vocabularies such as the Unified Medical Language System (UMLS) Metathesaurus. The use of open standards such as SGML is an important step in the creation of open, universally comprehensible structured reports.
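
    A hedged sketch of the "self-documenting report" idea: a report-specific DTD derived from a simple field specification, emitted together with a conforming document instance. The field names and values are invented for illustration and are not the paper's reporting language.

      # Derive a report-specific DTD from a field specification, plus an instance.
      fields = {"site": ["liver", "kidney", "spleen"], "severity": ["mild", "moderate", "severe"]}

      attlist = "\n".join(
          f'  <!ATTLIST finding {name} ({"|".join(values)}) #REQUIRED>'
          for name, values in fields.items()
      )
      dtd = (
          "<!DOCTYPE report [\n"
          "  <!ELEMENT report (finding+)>\n"
          "  <!ELEMENT finding (#PCDATA)>\n"
          f"{attlist}\n]>"
      )
      instance = '<report>\n  <finding site="liver" severity="mild">No focal lesion.</finding>\n</report>'
      print(dtd)
      print(instance)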

  6. GEM at 10: a decade's experience with the Guideline Elements Model.

    PubMed

    Hajizadeh, Negin; Kashyap, Nitu; Michel, George; Shiffman, Richard N

    2011-01-01

    The Guideline Elements Model (GEM) was developed in 2000 to organize the information contained in clinical practice guidelines using XML and to represent guideline content in a form that can be understood by human readers and processed by computers. In this work, we systematically reviewed the literature to better understand how GEM was being used, potential barriers to its use, and suggestions for improvement. Fifty external and twelve internally produced publications were identified and analyzed. GEM was used most commonly for modeling and ontology creation. Other investigators applied GEM for knowledge extraction and data mining, for clinical decision support, and for guideline generation. The GEM Cutter software, used to mark up guidelines for translation into XML, has been downloaded 563 times since 2000. Although many investigators found GEM to be valuable, others critiqued its failure to clarify guideline semantics, difficulties in markup, and the fact that GEM files are not usually executable.

  7. The markup is the model: reasoning about systems biology models in the Semantic Web era.

    PubMed

    Kell, Douglas B; Mendes, Pedro

    2008-06-07

    Metabolic control analysis, co-invented by Reinhart Heinrich, is a formalism for the analysis of biochemical networks, and is a highly important intellectual forerunner of modern systems biology. Exchanging ideas and exchanging models are part of the international activities of science and scientists, and the Systems Biology Markup Language (SBML) allows one to perform the latter with great facility. Encoding such models in SBML allows their distributed analysis using loosely coupled workflows, and with the advent of the Internet the various software modules that one might use to analyze biochemical models can reside on entirely different computers and even on different continents. Optimization is at the core of many scientific and biotechnological activities, and Reinhart made many major contributions in this area, stimulating our own activities in the use of the methods of evolutionary computing for optimization.

  8. SBML-PET: a Systems Biology Markup Language-based parameter estimation tool.

    PubMed

    Zi, Zhike; Klipp, Edda

    2006-11-01

    The estimation of model parameters from experimental data remains a bottleneck for a major breakthrough in systems biology. We present a Systems Biology Markup Language (SBML) based Parameter Estimation Tool (SBML-PET). The tool is designed to enable parameter estimation for biological models including signaling pathways, gene regulation networks and metabolic pathways. SBML-PET supports import and export of the models in the SBML format. It can estimate the parameters by fitting a variety of experimental data from different experimental conditions. SBML-PET has a unique feature of supporting event definition in the SBML model. SBML models can also be simulated in SBML-PET. Stochastic Ranking Evolution Strategy (SRES) is incorporated in SBML-PET for parameter estimation jobs. A classic ODE solver called ODEPACK is used to solve the Ordinary Differential Equation (ODE) system. http://sysbio.molgen.mpg.de/SBML-PET/. The website also contains detailed documentation for SBML-PET.
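
    A hedged sketch of the underlying estimation task (not SBML-PET itself, which uses SRES and ODEPACK): fit a rate constant of a small ODE model to noisy synthetic observations with SciPy.

      # Estimate a decay rate constant from synthetic data; illustration only.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import minimize_scalar

      t_obs = np.linspace(0.0, 5.0, 20)
      rng = np.random.default_rng(1)
      y_obs = np.exp(-0.8 * t_obs) + rng.normal(scale=0.02, size=t_obs.size)   # true k = 0.8

      def simulate(k):
          sol = solve_ivp(lambda t, y: -k * y, (0.0, 5.0), [1.0], t_eval=t_obs)
          return sol.y[0]

      def sse(k):
          return float(np.sum((simulate(k) - y_obs) ** 2))

      result = minimize_scalar(sse, bounds=(0.01, 5.0), method="bounded")
      print("estimated k:", result.x)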

  9. A standard MIGS/MIMS compliant XML Schema: toward the development of the Genomic Contextual Data Markup Language (GCDML).

    PubMed

    Kottmann, Renzo; Gray, Tanya; Murphy, Sean; Kagan, Leonid; Kravitz, Saul; Lombardot, Thierry; Field, Dawn; Glöckner, Frank Oliver

    2008-06-01

    The Genomic Contextual Data Markup Language (GCDML) is a core project of the Genomic Standards Consortium (GSC) that implements the "Minimum Information about a Genome Sequence" (MIGS) specification and its extension, the "Minimum Information about a Metagenome Sequence" (MIMS). GCDML is an XML Schema for generating MIGS/MIMS compliant reports for data entry, exchange, and storage. When mature, this sample-centric, strongly-typed schema will provide a diverse set of descriptors for describing the exact origin and processing of a biological sample, from sampling to sequencing, and subsequent analysis. Here we describe the need for such a project, outline design principles required to support the project, and make an open call for participation in defining the future content of GCDML. GCDML is freely available, and can be downloaded, along with documentation, from the GSC Web site (http://gensc.org).

  10. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    PubMed

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.

  11. Intended and unintended consequences of China's zero markup drug policy.

    PubMed

    Yi, Hongmei; Miller, Grant; Zhang, Linxiu; Li, Shaoping; Rozelle, Scott

    2015-08-01

    Since economic liberalization in the late 1970s, China's health care providers have grown heavily reliant on revenue from drugs, which they both prescribe and sell. To curb abuse and to promote the availability, safety, and appropriate use of essential drugs, China introduced its national essential drug list in 2009 and implemented a zero markup policy designed to decouple provider compensation from drug prescription and sales. We collected and analyzed representative data from China's township health centers and their catchment-area populations both before and after the reform. We found large reductions in drug revenue, as intended by policy makers. However, we also found a doubling of inpatient care that appeared to be driven by supply, instead of demand. Thus, the reform had an important unintended consequence: China's health care providers have sought new, potentially inappropriate, forms of revenue. Project HOPE—The People-to-People Health Foundation, Inc.

  12. Unique hyper-thermal composting process in Kagoshima City forms distinct bacterial community structures.

    PubMed

    Tashiro, Yukihiro; Tabata, Hanae; Itahara, Asuka; Shimizu, Natsuki; Tashiro, Kosuke; Sakai, Kenji

    2016-11-01

    A unique compost, Satsuma soil, is produced from three types of wastewater sludge using hyper-thermal processes at temperatures much higher than those of general thermophilic processes in Kagoshima City, Japan. We analyzed the bacterial community structures of this hyper-thermal compost sample and other sludges and composts by a high-throughput barcoded pyrosequencing method targeting the 16S rRNA gene. In total, 621,076 reads were derived from 17 samples and filtered. Artificial sequences were deleted and the reads were clustered based on the operational taxonomic units (OTUs) at 97% similarity. Phylum-level analysis of the hyper-thermal compost revealed drastic changes of the sludge structures (each relative abundance) from Firmicutes (average 47.8%), Proteobacteria (average 22.3%), and Bacteroidetes (average 10.1%) to two main phyla including Firmicutes (73.6%) and Actinobacteria (25.0%) with less Proteobacteria (∼0.3%) and Bacteroidetes (∼0.1%). Furthermore, we determined the predominant species (each relative abundance) of the hyper-thermal compost including Firmicutes related to Staphylococcus cohnii (13.8%), Jeotgalicoccus coquinae (8.01%), and Staphylococcus lentus (5.96%), and Actinobacteria related to Corynebacterium stationis (6.41%), and found that these species were not predominant in wastewater sludge. In contrast, we did not observe any common structures among eight other composts produced, using the hyper-thermal composts as the inoculums, under thermophilic conditions from different materials. Principal coordinate analysis of the hyper-thermal compost indicated a large difference in bacterial community structures from material sludge and other composts. These results suggested that a distinct bacterial community structure was formed by hyper-thermal composting. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  13. Development of hyper osmotic resistant CHO host cells for enhanced antibody production.

    PubMed

    Kamachi, Yasuharu; Omasa, Takeshi

    2018-04-01

    Cell culture platform processes are generally employed to shorten the duration of new product development. A fed-batch process with continuous feeding is a conventional platform process for monoclonal antibody production using Chinese hamster ovary (CHO) cells. To establish a simplified platform process, the feeding method can be changed from continuous feed to bolus feed. However, this change induces a rapid increase of osmolality by the bolus addition of nutrients. The increased osmolality suppresses cell culture growth, and the final product concentration is decreased. In this study, osmotic resistant CHO host cells were developed to attain a high product concentration. To establish hyper osmotic resistant CHO host cells, CHO-S host cells were passaged long-term in a hyper osmotic basal medium. There were marked differences in cell growth of the original and established host cells under iso- (328 mOsm/kg) or hyper-osmolality (over 450 mOsm/kg) conditions. Cell growth of the original CHO host cells was markedly decreased by the induction of osmotic stress, whereas cell growth of the hyper osmotic resistant CHO host cells was not affected. The maximum viable cell concentration of hyper osmotic resistant CHO host cells was 132% of CHO-S host cells after the induction of osmotic stress. Moreover, the hyper osmotic resistant characteristic of established CHO host cells was maintained even after seven passages in iso-osmolality basal medium. The use of hyper osmotic resistant CHO host cells to create a monoclonal antibody production cell line might be a new approach to increase final antibody concentrations with a fed-batch process. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  14. Analytic calculations of hyper-Raman spectra from density functional theory hyperpolarizability gradients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringholm, Magnus; Ruud, Kenneth; Bast, Radovan

    We present the first analytic calculations of the geometrical gradients of the first hyperpolarizability tensors at the density-functional theory (DFT) level. We use the analytically calculated hyperpolarizability gradients to explore the importance of electron correlation effects, as described by DFT, on hyper-Raman spectra. In particular, we calculate the hyper-Raman spectra of the all-trans and 11-cis isomers of retinal at the Hartree-Fock (HF) and density-functional levels of theory, also allowing us to explore the sensitivity of the hyper-Raman spectra on the geometrical characteristics of these structurally related molecules. We show that the HF results, using B3LYP-calculated vibrational frequencies and force fields, reproduce the experimental data for all-trans-retinal well, and that electron correlation effects are of minor importance for the hyper-Raman intensities.
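
    As a schematic reminder of the quantity involved (a textbook-level relation under the double-harmonic approximation, not an equation taken from this report), the hyper-Raman activity of normal mode Q_p is governed by the geometrical derivatives of the first hyperpolarizability β:

      I^{\mathrm{HR}}_{p} \;\propto\; \sum_{i,j,k} \left( \frac{\partial \beta_{ijk}}{\partial Q_{p}} \right)^{2}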

  15. An efficient approach for inverse kinematics and redundancy resolution scheme of hyper-redundant manipulators

    NASA Astrophysics Data System (ADS)

    Chembuly, V. V. M. J. Satish; Voruganti, Hari Kumar

    2018-04-01

    Hyper-redundant manipulators have a larger number of degrees of freedom (DOF) than required to perform a given task. The additional DOF provide the flexibility to work in highly cluttered environments and in constrained workspaces. Inverse kinematics (IK) of hyper-redundant manipulators is complicated due to the large number of DOF, and these manipulators have multiple IK solutions. The redundancy gives a choice of selecting the best solution out of multiple solutions based on certain criteria such as obstacle avoidance, singularity avoidance, joint limit avoidance and joint torque minimization. This paper focuses on the IK solution and redundancy resolution of hyper-redundant manipulators using a classical optimization approach. Joint positions are computed by optimizing various criteria for a serial hyper-redundant manipulator while traversing different paths in the workspace. Several cases are addressed using this scheme to obtain the inverse kinematic solution while optimizing criteria such as obstacle avoidance and joint limit avoidance.
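
    A toy sketch of the optimization-based IK idea for a planar hyper-redundant arm, not the authors' scheme: the end-effector error is penalized while a secondary joint-limit-avoidance criterion (deviation from mid-range angles) is minimized with SciPy. Link lengths, weights, and the target are arbitrary.

      # Toy optimization-based IK for a planar 8-link arm; weights are arbitrary.
      import numpy as np
      from scipy.optimize import minimize

      L = np.ones(8)                        # eight unit-length links
      target = np.array([4.0, 3.0])

      def end_effector(theta):
          angles = np.cumsum(theta)         # absolute link angles from relative joint angles
          return np.array([np.sum(L * np.cos(angles)), np.sum(L * np.sin(angles))])

      def cost(theta):
          reach_error = np.sum((end_effector(theta) - target) ** 2)
          joint_limit = np.sum(theta ** 2)  # secondary criterion: stay near mid-range (0 rad)
          return 1000.0 * reach_error + joint_limit

      sol = minimize(cost, np.full(8, 0.1), method="BFGS")
      print("end effector:", end_effector(sol.x))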

  16. Effects of Inaccurate Identification of Interictal Epileptiform Discharges in Concurrent EEG-fMRI

    NASA Astrophysics Data System (ADS)

    Gkiatis, K.; Bromis, K.; Kakkos, I.; Karanasiou, I. S.; Matsopoulos, G. K.; Garganis, K.

    2017-11-01

    Concurrent continuous EEG-fMRI is a novel multimodal technique that is finding its way into clinical practice in epilepsy. EEG time series are used to identify the timing of interictal epileptiform discharges (IEDs), which is then included in a GLM analysis of the fMRI data to localize the epileptic onset zone. Nevertheless, there are still concerns about the reliability of BOLD changes correlated with IEDs. Even though IEDs are identified by an experienced neurologist-epileptologist, the reliability and concordance of the mark-ups depend on many factors, including the level of fatigue, the amount of time spent or, in some cases, even the screen used to display the time series. This investigation aims to unravel the effect of misidentification or inaccuracy in the mark-ups of IEDs on the fMRI statistical parametric maps. Concurrent EEG-fMRI was conducted in six subjects with various types of epilepsy. IEDs were identified by an experienced neurologist-epileptologist. Analysis of the EEG was performed with EEGLAB and analysis of the fMRI was conducted in FSL. Preliminary results revealed lower statistical significance when events were missed or when IED periods were marked longer than the actual ones, and the introduction of false positives and false negatives in the statistical parametric maps when random events were included in the GLM on top of the IEDs. Our results suggest that mark-ups in EEG for simultaneous EEG-fMRI should be done with caution by an experienced and well-rested neurologist, as inaccuracies affect the fMRI results in various and unpredictable ways.

  17. Gene Fusion Markup Language: a prototype for exchanging gene fusion data

    PubMed Central

    2012-01-01

    Background An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Results Here we propose a prototype, Gene Fusion Markup Language (GFML) as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. Conclusion The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an inter-operable and query-able fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses. PMID:23072312
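
    As an illustration of what such a machine-readable exchange format might look like, the short sketch below emits a minimal XML fragment describing one gene fusion. The element and attribute names are invented for illustration only and are not taken from the actual GFML schema (which is available at the project URL above).

      import xml.etree.ElementTree as ET

      # Hypothetical element names -- illustrative only, not the real GFML schema.
      fusion = ET.Element("geneFusion", id="GF0001")
      ET.SubElement(fusion, "fivePrimePartner", symbol="TMPRSS2", chromosome="21")
      ET.SubElement(fusion, "threePrimePartner", symbol="ERG", chromosome="21")
      evidence = ET.SubElement(fusion, "evidence")
      ET.SubElement(evidence, "sample", tissue="prostate", platform="RNA-seq")
      ET.SubElement(evidence, "supportingReads", spanning="42", junction="17")

      ET.indent(fusion)                       # pretty-print (Python 3.9+)
      print(ET.tostring(fusion, encoding="unicode"))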

  18. The Development of Hyper-MNP: Hyper-Media Navigational Performance Scale

    ERIC Educational Resources Information Center

    Firat, Mehmet; Yurdakul, Isil Kabakci

    2016-01-01

    The present study aimed at developing a scale to evaluate navigational performance as a whole, which is one of the factors influencing learning in hyper media. In line with this purpose, depending on the related literature, an item pool of 15 factors was prepared, and these variables were decreased to 5 based on the views of 38 field experts. In…

  19. Pre-Creating the HyperNews Classroom Community: (Not)Speaking, (Not)Writing the Subtext.

    ERIC Educational Resources Information Center

    Satie, Stephanie

    As two groups of teachers met to set up a HyperNews network for a grant project, it became clear that politics cannot be kept out of the classroom. In creating a community of diverse writers via HyperNews, six composition classes were linked for online discourse among departments: Asian American Studies, Chicano Studies, Pan African Studies, and…

  20. Parallel Nonnegative Least Squares Solvers for Model Order Reduction

    DTIC Science & Technology

    2016-03-01

    NNLS problems that arise when the Energy Conserving Sampling and Weighting (ECSW) hyper-reduction procedure is used when constructing a reduced-order model...ScaLAPACK and performance results are presented. Keywords: nonnegative least squares, model order reduction, hyper-reduction, Energy Conserving Sampling and Weighting. (Recoverable front-matter fragment: "Table 6: Reduced mesh sizes produced for each solver in the ECSW hyper-reduction step.")
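
    The DTIC record is fragmentary, but the computational kernel it names, nonnegative least squares, is easy to illustrate: in ECSW-style hyper-reduction one seeks sparse nonnegative element weights w such that G w approximates b. The sketch below uses invented matrix sizes and data and SciPy's dense NNLS solver rather than the parallel ScaLAPACK solvers studied in the report.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)

      # Toy stand-in for the ECSW system: rows = training snapshots/constraints,
      # columns = mesh elements; b = assembled reduced quantities to be matched.
      n_constraints, n_elements = 20, 200
      G = rng.random((n_constraints, n_elements))
      b = G @ np.ones(n_elements)           # exact weights would be all ones

      # Nonnegative least squares: minimize ||G w - b||_2 subject to w >= 0.
      w, residual = nnls(G, b)

      selected = np.flatnonzero(w > 1e-10)  # elements kept in the "reduced mesh"
      print(f"residual = {residual:.2e}, elements selected = {selected.size}/{n_elements}")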

  1. Self-assembled Tunable Photonic Hyper-crystals

    DTIC Science & Technology

    2014-07-16

    Self-assembly of photonic hyper-crystals has been achieved by application of an external magnetic field to a cobalt nanoparticle-based ferrofluid. Unique spectral properties of photonic hyper-crystals lead to extreme sensitivity of the material to...monolayer coatings of cobalt nanoparticles, which should find numerous applications in biological and chemical sensing.

  2. Hyper-heuristics with low level parameter adaptation.

    PubMed

    Ren, Zhilei; Jiang, He; Xuan, Jifeng; Luo, Zhongxuan

    2012-01-01

    Recent years have witnessed the great success of hyper-heuristics applied to numerous real-world applications. Hyper-heuristics raise the generality of search methodologies by manipulating a set of low level heuristics (LLHs) to solve problems, and aim to automate the algorithm design process. However, those LLHs are usually parameterized, which may contradict the domain-independent motivation of hyper-heuristics. In this paper, we show how to automatically maintain low level parameters (LLPs) using a hyper-heuristic with LLP adaptation (AD-HH), and exemplify the feasibility of AD-HH by adaptively maintaining the LLPs for two hyper-heuristic models. Furthermore, aiming to tackle the search space expansion due to the LLP adaptation, we apply a heuristic space reduction (SAR) mechanism to improve the AD-HH framework. The integration of the LLP adaptation and the SAR mechanism is able to explore the heuristic space more effectively and efficiently. To evaluate the performance of the proposed algorithms, we choose the p-median problem as a case study. The empirical results show that, with the adaptation of the LLPs and the SAR mechanism, the proposed algorithms are able to achieve competitive results over the three heterogeneous classes of benchmark instances.
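
    The abstract's central idea, a high-level controller that picks among parameterized low-level heuristics and tunes their parameters online, can be sketched generically. Everything below (the heuristic functions, the epsilon-greedy selection rule, the credit and parameter updates, the toy objective) is an illustrative stand-in and not the AD-HH algorithm itself.

      import random

      def swap_move(sol, strength):
          # LLH 1: swap a few random positions; 'strength' is its low-level parameter.
          sol = sol[:]
          for _ in range(max(1, round(strength))):
              i, j = random.sample(range(len(sol)), 2)
              sol[i], sol[j] = sol[j], sol[i]
          return sol

      def shuffle_segment(sol, strength):
          # LLH 2: shuffle a short segment; 'strength' sets the segment length.
          sol = sol[:]
          k = min(len(sol), max(2, round(strength)))
          i = random.randrange(len(sol) - k + 1)
          segment = sol[i:i + k]
          random.shuffle(segment)
          sol[i:i + k] = segment
          return sol

      def cost(sol):
          # Toy objective: distance of a permutation from sorted order.
          return sum(abs(v - i) for i, v in enumerate(sol))

      llhs = [swap_move, shuffle_segment]
      params = [2.0, 3.0]          # one adaptable low-level parameter (LLP) per LLH
      scores = [1.0, 1.0]          # running credit used by the high-level selector
      sol = random.sample(range(30), 30)

      for _ in range(3000):
          # High-level selection: epsilon-greedy over the LLH credit scores.
          h = random.randrange(2) if random.random() < 0.2 else max(range(2), key=scores.__getitem__)
          candidate = llhs[h](sol, params[h])
          delta = cost(sol) - cost(candidate)
          if delta >= 0:
              sol = candidate
          # Credit assignment plus a crude online adaptation of the LLP.
          scores[h] = 0.9 * scores[h] + 0.1 * max(delta, 0)
          params[h] = min(6.0, max(1.0, params[h] + (0.1 if delta > 0 else -0.1)))

      print("final cost:", cost(sol))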

  3. Research on marine and freshwater fish identification model based on hyper-spectral imaging technology

    NASA Astrophysics Data System (ADS)

    Fu, Yan; Guo, Pei-yuan; Xiang, Ling-zi; Bao, Man; Chen, Xing-hai

    2013-08-01

    With the gradual maturation of hyper-spectral imaging technology, its application to nondestructive detection and recognition of meat has become one of the current research focuses. In this study of marine and freshwater fish, the collected spectral curve data were pre-processed and features were extracted; combined with BP and LVQ network structures, a predictive model of hyper-spectral image data for marine and freshwater fish was initially established, finally realizing qualitative analysis and identification of marine and freshwater fish quality. The results show that hyper-spectral imaging technology combined with BP and LVQ artificial neural network models can be used for the identification and detection of marine and freshwater fish. Hyper-spectral data acquisition can be carried out without any pretreatment of the samples; thus hyper-spectral imaging is a lossless, high-accuracy and rapid method for detecting fish quality. In this study, only 30 samples were used for exploratory qualitative identification; although satisfactory results were achieved, we will further increase the sample size to carry out quantitative identification and verify the feasibility of this approach.
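
    As a rough illustration of the kind of model the abstract describes (a back-propagation network trained on extracted spectral features), the sketch below trains a small multilayer perceptron on synthetic spectra. The feature dimensions, class labels and network size are invented, and the LVQ variant is not shown because scikit-learn has no built-in LVQ.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)

      # Synthetic stand-in for pre-processed spectral curves: 60 samples x 128 bands,
      # two classes (0 = marine, 1 = freshwater) with slightly shifted band means.
      n, bands = 60, 128
      y = rng.integers(0, 2, n)
      X = rng.normal(0.0, 1.0, (n, bands)) + y[:, None] * np.linspace(0, 0.8, bands)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # A small back-propagation (MLP) classifier on the spectral features.
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))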

  4. Micromechanics and Piezo Enhancements of HyperSizer

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Yarrington, Phillip; Collier, Craig S.

    2006-01-01

    The commercial HyperSizer aerospace-composite-material-structure-sizing software has been enhanced by incorporating capabilities for representing coupled thermal, piezoelectric, and piezomagnetic effects on the levels of plies, laminates, and stiffened panels. This enhancement is based on a formulation similar to that of the pre-existing HyperSizer capability for representing thermal effects. As a result of this enhancement, the electric and/or magnetic response of a material or structure to a mechanical or thermal load, or its mechanical response to an applied electric or magnetic field, can be predicted. In another major enhancement, a capability for representing micromechanical effects has been added by establishment of a linkage between HyperSizer and Glenn Research Center's Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC) computer program, which was described in several prior NASA Tech Briefs articles. The linkage enables HyperSizer to localize to the fiber and matrix level rather than only to the ply level, making it possible to predict local failures and to predict properties of plies from those of the component fiber and matrix materials. Advanced graphical user interfaces and database structures have been developed to support the new HyperSizer micromechanics capabilities.

  5. AAS Publishing News: Preparing Your Manuscript Just Got Easier

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-03-01

    [Figure caption: Watermarking using the command \watermark{DRAFT, v2}.] Are you an astronomer considering submitting a paper to an AAS journal (i.e., AJ, ApJ, ApJ Letters, or ApJ Supplements)? If so, this post is for you! Read on to find out about the exciting new things you can do with the AAS's newest LaTeX class file, available for download now.
    Why the Update? AAS publishing has maintained a consistent class file for LaTeX manuscript preparation for the past decade. But academic publishing is changing rapidly in today's era of electronic journals. Since its journals went fully electronic, the AAS has been continuously adding new publishing capabilities based on the recommendations of the Journals Task Force and the needs and requests of AAS authors. The AAS's manuscript preparation tools are now being updated accordingly.
    What's New in AASTex 6.0? There are many exciting new features and capabilities; here are just a few. Tracking options for author revisions include \added{text}, \deleted{text}, \replaced{old}{new}, and \explain{text}. Based on emulateapj: do you use the popular class file emulateapj to prepare your manuscripts? AASTex 6.0 is based on emulateapj rather than on the older AASTex 5.2 (though 5.2 is still supported), which means it is easy to produce a double-columned, single-spaced, astro-ph-ready manuscript. Since two thirds of the AAS journals' authors use emulateapj, this transition was designed to make manuscript preparation and sharing an easier and more seamless process. Tools for collaborations: do you work in a large collaboration? AASTex now includes new tools to make preparing a manuscript within a collaboration easier. Drafts can now be watermarked to differentiate between versions. New markup for large author lists streamlines the display so that readers can access article information immediately, yet they can still access the full author list and affiliations at the end of the paper. And author revision markup allows members of a collaboration to track their edits within a manuscript, for clearer organization of versions and edits. [Figure caption: An example figure set, which the reader can download as a .tar.gz high-resolution set or as PowerPoint slides.] Additional figure support: do you have a lot of similar figures that you'd like associated with the electronic journal article but that don't all need to be included in the article PDF? New support is now available for figure sets, which allow readers efficient access to the full set of images without slowing down their ability to read your article. In addition, AASTex 6.0 now offers new markup for displaying figures in a grid, providing authors with more control over figure placement. New features for tables: do you frequently work with large data tables? You might be especially happy with the changes in table handling in AASTex 6.0. Now you can automatically number columns, hide columns with a single command, specify math mode automatically for a designated column, control decimal alignment, and even split wide tables into multiple parts. [Figure caption: Example use of the new \software command.] Software citation support: do you want to cite software and third-party repositories within your articles? With AASTex 6.0, there is now a \software command that can be used to highlight and link to software that you used in your work. In addition, the ApJ BibTeX style file has been updated to support software citation.
    Where Can You Get More Information? Learn more about AASTex 6.0, watch a video presentation about AASTex 6.0 by AAS Data Scientist Greg Schwarz, or download AASTex 6.0. Wishing for still more improvements? The AAS publishing team would love your input! You can contact them at aastex-help@aas.org with additional suggestions or ideas for the next iteration of AASTex.

  6. Why SGML? Why Now?

    ERIC Educational Resources Information Center

    Marcoux, Yves; Sevigny, Martin

    1997-01-01

    Defines Standard Generalized Markup Language (SGML), a format for electronic documents that provides documentary information for efficient accessibility, dissemination, and preservation. Compares SGML to Open Document Architecture (ODA) based on standards by the International Organization for Standardization (ISO), and describes the principles and…

  7. The application of the HyPer fluorescent sensor in the real-time detection of H2O2 in Babesia bovis merozoites in vitro.

    PubMed

    Asada, Masahito; Hakimi, Hassan; Kawazu, Shin-Ichiro

    2018-05-15

    In recent years, genetically encoded fluorescent probes have allowed a dramatic advancement in time-lapse imaging, enabling this imaging modality to be used to investigate intracellular events in several apicomplexan parasite species. In this study, we constructed a plasmid vector to stably express a genetically encoded H2O2 sensor probe called HyPer in Babesia bovis. The HyPer-transfected parasite population was successfully developed and subjected to a time-lapse imaging analysis under in vitro culture conditions. HyPer was capable of sensing an increasing H2O2 concentration in the parasite cells, which was induced by the administration of paraquat as a superoxide donor. HyPer fluorescence co-staining with MitoTracker Red indicated the mitochondria as the major source of reactive oxygen species (ROS) in parasite cells. The fluctuating ROS dynamics in the parasite gliding toward, attaching to, and invading the target red blood cell were visualized and monitored in real time with the HyPer-expressing parasite population. This is the first report to describe the application of the HyPer probe in an imaging analysis involving Babesia parasites. HyPer-expressing parasites can be widely utilized in studies to investigate the mechanisms of emergence and the reduction of oxidative stress, as well as the signal transduction in the parasite cells during host invasion and intracellular development. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. An adaptive band selection method for dimension reduction of hyper-spectral remote sensing image

    NASA Astrophysics Data System (ADS)

    Yu, Zhijie; Yu, Hui; Wang, Chen-sheng

    2014-11-01

    Hyper-spectral remote sensing data are acquired by imaging the same area at multiple wavelengths, and a dataset normally consists of hundreds of band images. Hyper-spectral images provide not only spatial information but also high-resolution spectral information, and they have been widely used in environmental monitoring, mineral investigation and military reconnaissance. However, because of the correspondingly large data volume, hyper-spectral images are very difficult to transmit and store, and dimension reduction techniques are needed to resolve this problem. Because of the high correlation and high redundancy among hyper-spectral bands, applying dimension reduction to compress the data volume is very feasible. This paper proposes a novel band-selection-based dimension reduction method which can adaptively select the bands that contain more information and detail. The proposed method is based on principal component analysis (PCA): an index is computed for every band, the indexes are ranked in descending order of magnitude, and, based on a threshold, the system adaptively and reasonably selects the bands. The proposed method overcomes the shortcomings of transform-based dimension reduction methods and prevents the original spectral information from being lost. The performance of the proposed method has been validated in several experiments. The experimental results show that the proposed algorithm can reduce the dimensionality of hyper-spectral images with little information loss by adaptively selecting band images.
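
    The abstract does not define the per-band index, so the sketch below uses one plausible choice (the band's PCA loading magnitudes weighted by explained variance) purely to illustrate the select-bands-rather-than-transform idea; the cube dimensions and threshold are also invented.

      import numpy as np

      rng = np.random.default_rng(0)
      H, W, B = 64, 64, 100                       # assumed hyper-spectral cube size
      cube = rng.normal(size=(H, W, B)).cumsum(axis=2)   # correlated synthetic bands

      X = cube.reshape(-1, B)                     # pixels x bands
      Xc = X - X.mean(axis=0)

      # PCA via the band-by-band covariance matrix.
      cov = np.cov(Xc, rowvar=False)
      eigval, eigvec = np.linalg.eigh(cov)
      order = np.argsort(eigval)[::-1]
      eigval, eigvec = eigval[order], eigvec[:, order]

      # One plausible per-band index: loading magnitudes weighted by explained variance.
      explained = eigval / eigval.sum()
      band_index = np.abs(eigvec) @ explained

      # Keep the bands whose index exceeds a threshold (selection, not transformation).
      threshold = np.percentile(band_index, 80)   # assumed: keep the top 20% of bands
      selected = np.flatnonzero(band_index >= threshold)
      print(f"selected {selected.size} of {B} bands:", selected[:10], "...")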

  9. Research on hyperspectral dynamic scene and image sequence simulation

    NASA Astrophysics Data System (ADS)

    Sun, Dandan; Gao, Jiaobo; Sun, Kefeng; Hu, Yu; Li, Yu; Xie, Junhu; Zhang, Lei

    2016-10-01

    This paper presents a simulation method for hyper-spectral dynamic scenes and image sequences, intended for hyper-spectral equipment evaluation and target detection algorithms. Because of its high spectral resolution, strong band continuity, anti-interference capability and other advantages, hyper-spectral imaging technology has developed rapidly in recent years and is widely used in areas such as optoelectronic target detection, military defense and remote sensing systems. Digital imaging simulation, as a crucial part of hardware-in-the-loop simulation, can be applied to testing and evaluating hyper-spectral imaging equipment with lower development cost and a shorter development period. Meanwhile, visual simulation can produce large amounts of original image data under various conditions for hyper-spectral image feature extraction and classification algorithms. Based on a radiation physics model and material characteristic parameters, this paper proposes a method for generating digital scenes. By building multiple sensor models for different bands and bandwidths, hyper-spectral scenes in the visible, MWIR and LWIR bands, with spectral resolutions of 0.01 μm, 0.05 μm and 0.1 μm, have been simulated. The final dynamic scenes are highly realistic and run in real time at frame rates up to 100 Hz. By saving all of the scene gray data from the same viewpoint, an image sequence is obtained. The analysis shows that, in both the infrared and visible bands, the grayscale variations of the simulated hyper-spectral images are consistent with theoretical analysis.

  10. More than ten million years of hyper-aridity recorded in the Atacama Gravels

    NASA Astrophysics Data System (ADS)

    Sun, Tao; Bao, Huiming; Reich, Martin; Hemming, Sidney R.

    2018-04-01

    The Atacama Desert's hyper-aridity is closely linked to the development of world-class copper and nitrate/iodine ores and to regional tectonics and global paleoclimate changes in the Cenozoic era. The timing of the onset of hyper-aridity remains controversial, with proposed ages ranging from the Late Oligocene to the Pleistocene. In this study, we provide an independent constraint on the initiation of Atacama hyper-aridity utilizing a 100-m deep profile within the Atacama Gravels and underneath the Spence porphyry copper deposit in northern Chile. The overall high concentration of sulfate (up to 10 wt%) and a multimodal distribution of water-soluble salts (sulfates, chlorides and nitrates) indicate multiple generations of sedimentation and salt accumulation events under semi-arid to hyper-arid climate conditions. The multiple sulfate isotope compositions (Δ17O, δ18O, δ34S) of the upper section (-15.0 to -34.5 m) are close to those of modern hyper-arid surface sulfates, while the lower section (-34.5 to -65 m) displays a depth-dependent isotope trend that is best interpreted as marking a period of climate change from semi-arid to hyper-arid. When these data are combined with new 40Ar/39Ar dates obtained from a volcanic ash layer at a depth of -28.0 m, our results show that hyper-arid conditions in the Atacama Desert prevailed at least prior to 9.47 Ma and may extend back to the middle Miocene.

  11. Learning from Experience Case Studies of the Hyper-X Project

    NASA Technical Reports Server (NTRS)

    Peebles, Curtis

    2009-01-01

    The Hyper-X project (X-43A) provides a number of "lessons learned" that can be applied to other aerospace projects. The specific areas examined were the selection of the goals of Hyper-X and how the technical unknowns and assumptions were handled. The final lesson was the ambiguous nature of risk assessment, and how trying to remove a technical unknown can have unintended consequences.

  12. X43 Hyper-X

    NASA Image and Video Library

    2004-02-11

    NASA's Hyper-X Program Manager, Vince Rausch, talks about the upcoming launch of the X-43A vehicle over the Pacific Ocean later this month from his office at NASA Langley Research Center in Hampton, VA. Hyper-X is a high-risk, high-payoff program. The flight of the X-43A will demonstrate in flight, for the first time, air-breathing hypersonic propulsion technology. (Photo by Jeff Caplan)

  13. The XML approach to implementing space link extension service management

    NASA Technical Reports Server (NTRS)

    Tai, W.; Welz, G. A.; Theis, G.; Yamada, T.

    2001-01-01

    A feasibility study has been conducted at JPL, ESOC, and ISAS to assess the possible applications of the eXtensible Mark-up Language (XML) capabilities to the implementation of the CCSDS Space Link Extension (SLE) Service Management function.

  14. Designing for the Next Web.

    ERIC Educational Resources Information Center

    Bremser, Wayne

    1998-01-01

    Discusses how to choose from the available interactive graphic-design possibilities for the World Wide Web. Compatibility and appropriateness are discussed; and DHTML (Dynamic Hypertext Markup Language), Java, CSS (Cascading Style Sheets), plug-ins, ActiveX, and Push and channel technologies are described. (LRW)

  15. Experimental Realization of Tunable Metamaterial Hyper-transmitter

    PubMed Central

    Yoo, Young Joon; Yi, Changhyun; Hwang, Ji Sub; Kim, Young Ju; Park, Sang Yoon; Kim, Ki Won; Rhee, Joo Yull; Lee, YoungPak

    2016-01-01

    We realized the tunable metamaterial hyper-transmitter in the microwave range utilizing a simple planar meta-structure. The single-layer metamaterial hyper-transmitter shows that the transmission peak occurs at 14 GHz. In the case of the dual-layer one, it is possible to control the transmission peak from 5 to 10 GHz. Moreover, all the transmission peaks reveal transmission over 100%. We experimentally and theoretically investigated these phenomena through 3-dimensional simulation and measurement. The reason for being over 100% is also elucidated. The suggested hyper-transmitter can be used, for example, in enhancing the operating distance of the electromagnetic wave in Wi-Fi, military radar, wireless power transfer and self-driving cars. PMID:27629804

  16. Experimental Realization of Tunable Metamaterial Hyper-transmitter

    NASA Astrophysics Data System (ADS)

    Yoo, Young Joon; Yi, Changhyun; Hwang, Ji Sub; Kim, Young Ju; Park, Sang Yoon; Kim, Ki Won; Rhee, Joo Yull; Lee, Youngpak

    2016-09-01

    We realized the tunable metamaterial hyper-transmitter in the microwave range utilizing a simple planar meta-structure. The single-layer metamaterial hyper-transmitter shows that the transmission peak occurs at 14 GHz. In the case of the dual-layer one, it is possible to control the transmission peak from 5 to 10 GHz. Moreover, all the transmission peaks reveal transmission over 100%. We experimentally and theoretically investigated these phenomena through 3-dimensional simulation and measurement. The reason for being over 100% is also elucidated. The suggested hyper-transmitter can be used, for example, in enhancing the operating distance of the electromagnetic wave in Wi-Fi, military radar, wireless power transfer and self-driving cars.

  17. Dense Annotation of Free-Text Critical Care Discharge Summaries from an Indian Hospital and Associated Performance of a Clinical NLP Annotator.

    PubMed

    Ramanan, S V; Radhakrishna, Kedar; Waghmare, Abijeet; Raj, Tony; Nathan, Senthil P; Sreerama, Sai Madhukar; Sampath, Sriram

    2016-08-01

    Electronic Health Record (EHR) use in India is generally poor, and structured clinical information is mostly lacking. This work is the first attempt aimed at evaluating unstructured text mining for extracting relevant clinical information from Indian clinical records. We annotated a corpus of 250 discharge summaries from an Intensive Care Unit (ICU) in India, with markups for diseases, procedures, and lab parameters, their attributes, as well as key demographic information and administrative variables such as patient outcomes. In this process, we have constructed guidelines for an annotation scheme useful to clinicians in the Indian context. We evaluated the performance of an NLP engine, Cocoa, on a cohort of these Indian clinical records. We have produced an annotated corpus of roughly 90 thousand words, which to our knowledge is the first tagged clinical corpus from India. Cocoa was evaluated on a test corpus of 50 documents. The overlap F-scores across the major categories, namely disease/symptoms, procedures, laboratory parameters and outcomes, are 0.856, 0.834, 0.961 and 0.872 respectively. These results are competitive with results from recent shared tasks based on US records. The annotated corpus and associated results from the Cocoa engine indicate that unstructured text mining is a viable method for cohort analysis in the Indian clinical context, where structured EHR records are largely absent.
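
    The "overlap F-scores" reported here combine precision and recall computed from counts of system annotations that overlap gold annotations. A minimal version of that bookkeeping is sketched below with invented character spans; the Cocoa engine's actual matching rules are not described in the record.

      def overlaps(a, b):
          # True if two (start, end) character spans overlap at all.
          return a[0] < b[1] and b[0] < a[1]

      def overlap_f1(gold, predicted):
          tp_pred = sum(any(overlaps(p, g) for g in gold) for p in predicted)
          tp_gold = sum(any(overlaps(g, p) for p in predicted) for g in gold)
          precision = tp_pred / len(predicted) if predicted else 0.0
          recall = tp_gold / len(gold) if gold else 0.0
          return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

      # Invented example spans for one discharge summary (character offsets).
      gold_spans = [(10, 22), (40, 55), (80, 95)]       # annotated disease mentions
      pred_spans = [(12, 20), (41, 50), (120, 130)]     # system output
      print(f"overlap F1 = {overlap_f1(gold_spans, pred_spans):.3f}")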

  18. Single-photon test of hyper-complex quantum theories using a metamaterial.

    PubMed

    Procopio, Lorenzo M; Rozema, Lee A; Wong, Zi Jing; Hamel, Deny R; O'Brien, Kevin; Zhang, Xiang; Dakić, Borivoje; Walther, Philip

    2017-04-21

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. To accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. We show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.

  19. ECN-2301

    NASA Image and Video Library

    1969-09-10

    The Hyper III was a low-cost test vehicle for an advanced lifting-body shape. Like the earlier M2-F1, it was a "homebuilt" research aircraft, i.e., built at the Flight Research Center (FRC), later redesignated the Dryden Flight Research Center. It had a steel-tube frame covered with Dacron, a fiberglass nose, sheet aluminum fins, and a wing from an HP-11 sailplane. Construction was by volunteers at the FRC. Although the Hyper III was to be flown remotely in its initial tests, it was fitted with a cockpit for a pilot. On the Hyper III's only flight, it was towed aloft attached to a Navy SH-3 helicopter by a 400-foot cable. NASA research pilot Bruce Peterson flew the SH-3. After he released the Hyper III from the cable, NASA research pilot Milt Thompson flew the vehicle by radio control until the final approach when Dick Fischer took over control using a model-airplane radio-control box. The Hyper III flared, then landed and slid to a stop on Rogers Dry Lakebed.

  20. Rapid Molecular Analysis of the STAT3 Gene in Job Syndrome of Hyper-IgE and Recurrent Infectious Diseases

    PubMed Central

    Kumánovics, Attila; Wittwer, Carl T.; Pryor, Robert J.; Augustine, Nancy H.; Leppert, Mark F.; Carey, John C.; Ochs, Hans D.; Wedgwood, Ralph J.; Faville, Ralph J.; Quie, Paul G.; Hill, Harry R.

    2010-01-01

    With the recent discovery of mutations in the STAT3 gene in the majority of patients with classic Hyper-IgE syndrome, it is now possible to make a molecular diagnosis in most of these cases. We have developed a PCR-based high-resolution DNA-melting assay to scan selected exons of the STAT3 gene for mutations responsible for Hyper-IgE syndrome, which is then followed by targeted sequencing. We scanned for mutations in 10 unrelated pedigrees, which include 16 patients with classic Hyper-IgE syndrome. These pedigrees include both sporadic and familial cases and their relatives, and we have found STAT3 mutations in all affected individuals. High-resolution melting analysis allows a single day turn-around time for mutation scanning and targeted sequencing of the STAT3 gene, which will greatly facilitate the rapid diagnosis of the Hyper-IgE syndrome, allowing prompt and appropriate therapy, prophylaxis, improved clinical outcome, and accurate genetic counseling. PMID:20093388

  1. Single-photon test of hyper-complex quantum theories using a metamaterial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. In order to accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. Here, we show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.

  2. Single-photon test of hyper-complex quantum theories using a metamaterial

    DOE PAGES

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing; ...

    2017-04-21

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. In order to accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. Here, we show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.

  3. Single-photon test of hyper-complex quantum theories using a metamaterial

    PubMed Central

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing; Hamel, Deny R.; O'Brien, Kevin; Zhang, Xiang; Dakić, Borivoje; Walther, Philip

    2017-01-01

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. To accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. We show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories. PMID:28429711

  4. Preventive Intra Oral Treatment of Sea Cucumber Ameliorate OVA-Induced Allergic Airway Inflammation.

    PubMed

    Lee, Da-In; Park, Mi-Kyung; Kang, Shin Ae; Choi, Jun-Ho; Kang, Seok-Jung; Lee, Jeong-Yeol; Yu, Hak Sun

    2016-01-01

    Sea cucumber extracts have potent biological effects, including anti-viral, anti-cancer, antibacterial, anti-oxidant, and anti-inflammation effects. To understand their anti-asthma effects, we induced allergic airway inflammation in mice after 7 oral administrations of the extract. The hyper-responsiveness value in mice with ovalbumin (OVA)-alum-induced asthma after oral administration of sea cucumber extracts was significantly lower than that in the OVA-alum-induced asthma group. In addition, the number of eosinophils in the lungs of asthma-induced mice pre-treated with sea cucumber extract was significantly decreased compared to that of PBS pre-treated mice. Additionally, CD4+CD25+Foxp3+ T (regulatory T; Treg) cells significantly increased in mesenteric lymph nodes after 7 administrations of the extract. These results suggest that sea cucumber extract can ameliorate allergic airway inflammation via Treg cell activation and recruitment to the lung.

  5. Flight Test of the Engine Fuel Schedules of the X-43A Hyper-X Research Vehicles

    NASA Technical Reports Server (NTRS)

    Jones, Thomas

    2006-01-01

    The Hyper-X program flew two X-43A Hyper-X Research Vehicles (HXRVs) in 2004, referred to as Ship 2 and Ship 3. The scramjet engine of the X-43A research vehicle was autonomously controlled in flight to track a predetermined fueling schedule. Ship 2 flew at approximately Mach 7 and Ship 3 flew at approximately Mach 10.

  6. Controlled growth of novel hyper-branched nanostructures in nanoporous alumina membrane.

    PubMed

    Zhang, Junping; Day, Cynthia S; Carroll, David L

    2009-12-07

    This paper proposes a novel approach to fabricate hyper-branched anodic aluminium oxide (AAO) nanostructures with different branches on the vertically-aligned trunk and at the trunk terminal. Silver nanowires with different dimensional and multifunctional complexity have been prepared from this hyper-branched AAO template by varying the electrodeposition time. These kinds of novel nanostructures may be used as building blocks for nanoelectronic and nanophotonic devices.

  7. [Would the Screening of Common Mental Disorders in Primary-Care Health Services Hyper-Frequent Patients Be Useful?].

    PubMed

    Rincón-Hoyos, Hernán G; López, Mérida R Rodríguez; Ruiz, Ana María Villa; Hernández, Carlos Augusto; Ramos, Martha Lucía

    2012-12-01

    Hyper-frequentation in health services is a problem for patients, their families and the institutions. This study aimed to determine the frequency and characteristics of common mental disorders in hyper-frequent patients presenting with vague symptoms and signs at a primary healthcare service during 2007 in the city of Cali (Colombia). Cross-sectional study. The most frequent mental disorders in hyper-frequent patients were detected through a telephone interview which included several modules of the PRIME-MD instrument. In general, healthcare service hyper-frequenters are working women, 38.7 years old on average. The consultation is basically due to cephalalgia, but these patients also exhibit a high prevalence of common mental disorders (somatization, depression and anxiety) not easily diagnosed by physicians in primary care. Expenses for additional health activities generated by these patients are attributed basically to medical consultation and required procedures. Treating hyper-frequenters of healthcare services as a risk group for common mental disorders makes screening an efficient strategy to prevent overuse of services and to improve satisfaction with the care received. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  8. Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato

    2018-02-01

    This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise while the number of image data available for image analysis is decreased. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. According to the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.
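
    To make the setting concrete, the sketch below restores a 1-D noisy signal with a quadratic (Gaussian) MRF smoothness prior, where the smoothness weight plays the role of the hyper-parameter discussed in the abstract. The signal, noise level and weight are all invented, and this is a simplification of the paper's Bayesian treatment (a MAP estimate only, with no posterior averaging or free-energy evaluation).

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      x_true = np.sin(np.linspace(0, 4 * np.pi, n))          # assumed clean signal

      K = 8                                                   # number of noisy copies
      noisy = x_true + rng.normal(0.0, 0.5, size=(K, n))
      y = noisy.mean(axis=0)                                  # averaging preprocessing

      # Quadratic MRF prior on first differences: the MAP estimate solves
      # (I + lam * D^T D) x = y, with lam the smoothness hyper-parameter.
      lam = 10.0                                              # assumed hyper-parameter
      D = np.diff(np.eye(n), axis=0)                          # (n-1) x n difference operator
      x_map = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

      print("RMSE of averaged data  :", np.sqrt(np.mean((y - x_true) ** 2)))
      print("RMSE of MRF restoration:", np.sqrt(np.mean((x_map - x_true) ** 2)))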

  9. Hyper-Methylated Loci Persisting from Sessile Serrated Polyps to Serrated Cancers.

    PubMed

    Andrew, Angeline S; Baron, John A; Butterly, Lynn F; Suriawinata, Arief A; Tsongalis, Gregory J; Robinson, Christina M; Amos, Christopher I

    2017-03-02

    Although serrated polyps were historically considered to pose little risk, it is now understood that progression down the serrated pathway could account for as many as 15%-35% of colorectal cancers. The sessile serrated adenoma/polyp (SSA/P) is the most prevalent pre-invasive serrated lesion. Our objective was to identify the CpG loci that are persistently hyper-methylated during serrated carcinogenesis, from the early SSA/P lesion through the later cancer phases of neoplasia development. We queried the loci hyper-methylated in serrated cancers within our right-sided SSA/Ps from the New Hampshire Colonoscopy Registry, using the Illumina Infinium Human Methylation 450k panel to comprehensively assess the DNA methylation status. We identified CpG loci and regions consistently hyper-methylated throughout the serrated carcinogenesis spectrum, in both our SSA/P specimens and in serrated cancers. Hyper-methylated CpG loci included the known tumor suppressor gene RET (p = 5.72 x 10^-10), as well as loci in differentially methylated regions for GSG1L, MIR4493, NTNG1, MCIDAS, ZNF568, and RERG. The hyper-methylated loci that we identified help characterize the biology of SSA/P development, and could be useful as therapeutic targets, or for future identification of patients who may benefit from shorter surveillance intervals.

  10. An advanced scanning method for space-borne hyper-spectral imaging system

    NASA Astrophysics Data System (ADS)

    Wang, Yue-ming; Lang, Jun-Wei; Wang, Jian-Yu; Jiang, Zi-Qing

    2011-08-01

    Space-borne hyper-spectral imagery is an important means for studies and applications in earth science. High cost efficiency can be achieved through optimized system design. In this paper, an advanced scanning method is proposed which contributes to implementing an imaging system with both high temporal and high spatial resolution. The revisit frequency and effective working time of space-borne hyper-spectral imagers can be greatly improved by adopting a two-axis scanning system if spatial resolution and radiometric accuracy are not harshly demanded. In order to avoid the quality degradation caused by image rotation, an idea of two-axis rotation is presented based on the analysis and simulation of the two-dimensional scanning motion path and its features. Further improvement of the imagers' detection ability under conditions of small solar altitude angle and low surface reflectance can be realized by ground motion compensation on the pitch axis. The structure and control performance are also described. An intelligent integration of two-dimensional scanning and image motion compensation is elaborated in this paper. With this technology, sun-synchronous hyper-spectral imagers are able to make quick visits to hot spots, acquiring hyper-spectral images with both high spatial and high temporal resolution, which enables rapid response to emergencies. The results have reference value for developing operational space-borne hyper-spectral imagers.

  11. Follicular fluid lipid fingerprinting from women with PCOS and hyper response during IVF treatment.

    PubMed

    Cordeiro, Fernanda Bertuccez; Cataldi, Thaís Regiani; do Vale Teixeira da Costa, Lívia; de Lima, Camila Bruna; Stevanato, Juliana; Zylbersztejn, Daniel Suslik; Ferreira, Christina Ramires; Eberlin, Marcos Nogueira; Cedenho, Agnaldo Pereira; Turco, Edson Guimarães Lo

    2015-01-01

    Polycystic ovary syndrome (PCOS) is an endocrine-metabolic disorder that leads to lower natural reproductive potential and presents a challenge for assisted reproductive medicine because patients may exhibit immature oocyte retrieval and a higher risk of ovarian hyperstimulation syndrome during in vitro fertilization (IVF) treatment. This study aimed to identify potential lipid biomarkers for women with PCOS and a hyper response to controlled ovarian stimulation. Follicular fluid samples were collected from patients who underwent IVF, including normal responder women who became pregnant (control group, n = 11), women with PCOS and a hyper response to gonadotropins (PCOS group, n = 7) and women with only a hyper response to gonadotropins (HR group, n = 7). A lipidomic analysis was performed by electrospray ionization mass spectrometry, and candidate biomarkers were analyzed by tandem mass spectrometry experiments. The lipid profiles indicated particularities related to differences in phosphatidylcholine (PCOS and HR), phosphatidylserine, phosphatidylinositol and phosphatidylglycerol (control), sphingolipids (PCOS) and phosphatidylethanolamine (control and HR). These findings contribute to the understanding of the molecular mechanisms associated with lipid metabolism in the PCOS-related hyper response, and strongly suggest that these lipids may be useful as biomarkers, leading to the development of more individualized treatment for pregnancy outcome.

  12. Hyper-X Engine Testing in the NASA Langley 8-Foot High Temperature Tunnel

    NASA Technical Reports Server (NTRS)

    Huebner, Lawrence D.; Rock, Kenneth E.; Witte, David W.; Ruf, Edward G.; Andrews, Earl H., Jr.

    2000-01-01

    Airframe-integrated scramjet engine tests have been completed at Mach 7 in the NASA Langley 8-Foot High Temperature Tunnel under the Hyper-X program. These tests provided critical engine data as well as design and database verification for the Mach 7 flight tests of the Hyper-X research vehicle (X-43), which will provide the first-ever airframe-integrated scramjet flight data. The first model tested was the Hyper-X Engine Model (HXEM), and the second was the Hyper-X Flight Engine (HXFE). The HXEM, a partial-width, full-height engine that is mounted on an airframe structure to simulate the forebody features of the X-43, was tested to provide data linking flowpath development databases to the complete airframe-integrated three-dimensional flight configuration and to isolate effects of ground testing conditions and techniques. The HXFE, an exact geometric representation of the X-43 scramjet engine mounted on an airframe structure that duplicates the entire three-dimensional propulsion flowpath from the vehicle leading edge to the vehicle base, was tested to verify the complete design as it will be flight tested. This paper presents an overview of these two tests, their importance to the Hyper-X program, and the significance of their contribution to scramjet database development.

  13. Clinical features and dysfunctions of iron metabolism in Parkinson disease patients with hyper echogenicity in substantia nigra: a cross-sectional study.

    PubMed

    Yu, Shu-Yang; Cao, Chen-Jie; Zuo, Li-Jun; Chen, Ze-Jie; Lian, Teng-Hong; Wang, Fang; Hu, Yang; Piao, Ying-Shan; Li, Li-Xia; Guo, Peng; Liu, Li; Yu, Qiu-Jin; Wang, Rui-Dan; Chan, Piu; Chen, Sheng-di; Wang, Xiao-Min; Zhang, Wei

    2018-01-17

    Transcranial ultrasound is a useful tool for providing evidence for the early diagnosis and differential diagnosis of Parkinson disease (PD). However, the relationship between hyper echogenicity in the substantia nigra (SN) and the clinical symptoms of PD patients remains unknown, and the role of dysfunction of iron metabolism in the pathogenesis of SN hyper echogenicity is unclear. PD patients were examined by transcranial sonography and divided into a group with no hyper echogenicity (PDSN-) and a group with hyper echogenicity (PDSN+). Motor symptoms (MS) and non-motor symptoms (NMS) were evaluated, and the levels of iron and related proteins in serum and cerebrospinal fluid (CSF) were measured for PD patients. Comparisons between the two groups and correlation analyses were performed. The PDSN+ group was significantly older, and had a significantly older age of onset, more advanced Hoehn-Yahr stage, higher SCOPA-AUT score and lower MoCA score than the PDSN- group (P < 0.05). Compared with the PDSN- group, the levels of transferrin and light ferritin in serum and the iron level in CSF were significantly elevated (P < 0.05), while the ferroportin level in CSF was significantly decreased in the PDSN+ group (P < 0.05). PD patients with hyper echogenicity in the SN are older, are at a more advanced disease stage, and have more severe motor symptoms as well as the non-motor symptoms of cognitive impairment and autonomic dysfunction. Hyper echogenicity of the SN in PD patients is related to dysfunction of iron metabolism, involving increased iron transport from the peripheral system to the central nervous system, reduced intracellular iron release and excessive iron deposition in the brain.

  14. Hyper-Resolution Groundwater Modeling using MODFLOW 6

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Langevin, C.

    2017-12-01

    MODFLOW 6 is the latest version of the U.S. Geological Survey's modular hydrologic model. MODFLOW 6 was developed to synthesize many of the recent versions of MODFLOW into a single program, to improve the way different process models are coupled, and to provide an object-oriented framework for adding new types of models and packages. The object-oriented framework and underlying numerical solver make it possible to tightly couple any number of hyper-resolution models within coarser regional models. The hyper-resolution models can be used to evaluate local-scale groundwater issues that may be affected by regional-scale forcings. In MODFLOW 6, hyper-resolution meshes can be maintained as separate model datasets, similar to MODFLOW-LGR, which simplifies developing embedded hyper-resolution models from an existing coarse regional model. For example, the South Atlantic Coastal Plain regional water availability model was converted from a MODFLOW-2000 model to a MODFLOW 6 model. The horizontal discretization of the original model is approximately 3,218 m x 3,218 m. Hyper-resolution models of the Aiken and Sumter County water budget areas in South Carolina, with a horizontal discretization of approximately 322 m x 322 m, were developed and tightly coupled to a modified version of the original coarse regional model that excluded these areas. Hydraulic property and aquifer geometry data from the coarse model were mapped to the hyper-resolution models. The discretization of the hyper-resolution models is fine enough to allow detailed analyses of the effect that changes in groundwater withdrawals in the production aquifers have on the water table and on surface-water/groundwater interactions. The approach used in this analysis could be applied to other regional water availability models developed by the U.S. Geological Survey to evaluate local-scale groundwater issues.
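
    The coarse-to-fine mapping described above (3,218 m cells refined to roughly 322 m cells, about a 10:1 ratio) can be sketched without the MODFLOW API at all. The array shapes and hydraulic-conductivity values below are invented, and a real model would use a MODFLOW 6 exchange to couple the two grids.

      import numpy as np

      coarse_dx = 3218.0            # regional grid spacing (m), from the record
      fine_dx = 322.0               # local grid spacing (m), from the record
      ratio = round(coarse_dx / fine_dx)   # ~10 fine cells per coarse cell per direction

      # Assumed coarse-grid hydraulic conductivity field for a small window (m/d).
      k_coarse = np.array([[12.0, 15.0, 9.0],
                           [11.0, 14.0, 10.0],
                           [13.0,  8.0,  7.0]])

      # Simplest mapping of coarse properties onto the embedded fine grid:
      # each coarse cell value is replicated over a ratio x ratio block of fine cells.
      k_fine = np.kron(k_coarse, np.ones((ratio, ratio)))

      print("refinement ratio:", ratio)
      print("coarse grid:", k_coarse.shape, "-> fine grid:", k_fine.shape)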

  15. An Object-Oriented Approach to the Development of Computer-Assisted Instructional Material Using Hypertext

    DTIC Science & Technology

    1988-12-01

    reading on computers for more than 10 or 15 minutes. If it takes any longer I would rather have a piece of paper in front of me. It did provide an outline...advisor, Capt. David Umphress. I thank Dave also for the moral support he provided by agreeing to advise this thesis, and by providing timely...information using "text, graphics, video, music, voice, and animation" (Williams, 1987:109; Conklin, 1987a:32). Even so, HyperCard has been very

  16. Teaching Critical Thinking in the Business Mathematics Course.

    ERIC Educational Resources Information Center

    Rosenbaum, Roberta

    1986-01-01

    Appropriate strategies for teaching students to interpret and understand quantitative data in marketing, management, accounting, and data processing are described. Accompanying figures illustrate samples of percentage markups, trade discounts, gross earning, gross commissions, accounting entries, balance sheet entries, and percentage problems. (CT)

  17. "The Wonder Years" of XML.

    ERIC Educational Resources Information Center

    Gazan, Rich

    2000-01-01

    Surveys the current state of Extensible Markup Language (XML), a metalanguage for creating structured documents that describe their own content, and its implications for information professionals. Predicts that XML will become the common language underlying Web, word processing, and database formats. Also discusses Extensible Stylesheet Language…

  18. Morphine induced exacerbation of sepsis is mediated by tempering endotoxin tolerance through modulation of miR-146a

    PubMed Central

    Banerjee, Santanu; Meng, Jingjing; Das, Subhas; Krishnan, Anitha; Haworth, Justin; Charboneau, Richard; Zeng, Yan; Ramakrishnan, Sundaram; Roy, Sabita

    2013-01-01

    Development of tolerance to endotoxin prevents sustained hyper-inflammation during systemic infections. Here we report for the first time that chronic morphine treatment tempers endotoxin tolerance, resulting in persistent inflammation, septicemia and septic shock. Morphine was found to down-regulate endotoxin/LPS-induced miR-146a and miR-155 in macrophages. However, only miR-146a over-expression, but not miR-155, abrogates morphine-mediated hyper-inflammation. Conversely, antagonizing miR-146a (but not miR-155) heightened the severity of morphine-mediated hyper-inflammation. These results suggest that miR-146a acts as a molecular switch controlling hyper-inflammation in clinical and/or recreational use of morphine. PMID:23756365

  19. Hyper-X Program Status

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Rausch, Vincent L.; Sitz, Joel; Reukauf, Paul

    2001-01-01

    This paper provides an overview of the objectives and status of the Hyper-X program, which is tailored to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. The first Hyper-X research vehicle (HXRV), designated X-43, is being prepared at the Dryden Flight Research Center for flight at Mach 7. Extensive risk reduction activities for the first flight are completed, and non-recurring design activities for the Mach 10 X-43 (3rd flight) are nearing completion. The Mach 7 flight of the X-43, in the spring of 2001, will be the first flight of an airframe-integrated scramjet-powered vehicle. The Hyper-X program is continuing to plan follow-on activities to focus an orderly continuation of hypersonic technology development through flight research.

  20. Hyper-X Program Status

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Reubush, David E.; Sitz, Joel; Reukauf, Paul

    2001-01-01

    This paper provides an overview of the objectives and status of the Hyper-X program, which is tailored to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. The first Hyper-X research vehicle (HXRV), designated X-43, is being prepared at the Dryden Flight Research Center for flight at Mach 7. Extensive risk reduction activities for the first flight are completed, and non-recurring design activities for the Mach 10 X-43 (third flight) are nearing completion. The Mach 7 flight of the X-43, in the spring of 2001, will be the first flight of an airframe-integrated scramjet-powered vehicle. The Hyper-X program is continuing to plan follow-on activities to focus an orderly continuation of hypersonic technology development through flight research.

  1. Bit-level quantum color image encryption scheme with quantum cross-exchange operation and hyper-chaotic system

    NASA Astrophysics Data System (ADS)

    Zhou, Nanrun; Chen, Weiwei; Yan, Xinyu; Wang, Yunqian

    2018-06-01

    In order to obtain higher encryption efficiency, a bit-level quantum color image encryption scheme exploiting a quantum cross-exchange operation and a 5D hyper-chaotic system is designed. Additionally, to enhance the scrambling effect, the quantum channel swapping operation is employed to swap the gray values of corresponding pixels. The proposed color image encryption algorithm has a larger key space and higher security since the 5D hyper-chaotic system has more complex dynamic behavior, better randomness and unpredictability than low-dimensional chaotic systems. Simulations and theoretical analyses demonstrate that the presented bit-level quantum color image encryption scheme outperforms its classical counterparts in efficiency and security.
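
    The quantum operations themselves cannot be shown in a few lines, but the hyper-chaotic ingredient has a simple classical analogue: a keyed chaotic sequence defines a permutation of the image's bit positions. The sketch below uses a 1-D logistic map instead of the paper's 5-D hyper-chaotic system, purely to illustrate the bit-level scrambling idea; the key values and image size are invented.

      import numpy as np

      def logistic_sequence(x0, r, n):
          # n iterates of the logistic map x <- r*x*(1-x) (classical chaos).
          xs = np.empty(n)
          for i in range(n):
              x0 = r * x0 * (1.0 - x0)
              xs[i] = x0
          return xs

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, (32, 32), dtype=np.uint8)   # assumed grayscale image

      # Bit-level scrambling: a key-dependent chaotic sequence orders all bit positions.
      bits = np.unpackbits(img.flatten())
      key_x0, key_r = 0.3141, 3.99                            # assumed secret key
      bit_perm = np.argsort(logistic_sequence(key_x0, key_r, bits.size))
      scrambled = np.packbits(bits[bit_perm]).reshape(img.shape)

      # Decryption inverts the same keyed permutation.
      recovered_bits = np.empty_like(bits)
      recovered_bits[bit_perm] = np.unpackbits(scrambled.flatten())
      assert np.array_equal(np.packbits(recovered_bits).reshape(img.shape), img)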

  2. Descriptive Analysis on the Impacts of Universal Zero-Markup Drug Policy on a Chinese Urban Tertiary Hospital

    PubMed Central

    Yang, Dong

    2016-01-01

    Background Universal Zero-Markup Drug Policy (UZMDP) mandates no price mark-ups on any drug dispensed by a healthcare institution, and covers the medicines not included in China's National Essential Medicine System. Five tertiary hospitals in Beijing, China implemented UZMDP in 2012. Its impacts on these hospitals are unknown. We described the effects of UZMDP on a participating hospital, Jishuitan Hospital, Beijing, China (JST). Methods This retrospective longitudinal study examined the hospital-level data of JST and city-level data of tertiary hospitals of Beijing, China (BJT) 2009–2015. Rank-sum tests and join-point regression analyses were used to assess absolute changes and differences in trends, respectively. Results In absolute terms, after the UZMDP implementation, there were increased annual patient-visits and decreased ratios of medicine-to-healthcare-charges (RMOH) in JST outpatient and inpatient services; however, in outpatient service, physician work-days decreased and physician-workload and inflation-adjusted per-visit healthcare charges increased, while the inpatient physician work-days increased and the inpatient mortality-rate decreased. Interestingly, the decreasing trend in inpatient mortality-rate was neutralized after UZMDP implementation. Compared with BJT and under the influence of UZMDP, JST outpatient and inpatient services both had increasing trends in annual patient-visits (annual percentage changes [APC] = 8.1% and 6.5%, respectively) and decreasing trends in RMOH (APC = -4.3% and -5.4%, respectively), while JST outpatient services had an increasing trend in inflation-adjusted per-visit healthcare charges (APC = 3.4%) and JST inpatient service had a decreasing trend in inflation-adjusted per-visit medicine-charges (APC = -5.2%). Conclusion Implementation of UZMDP seems to increase annual patient-visits, reduce RMOH and have different impacts on outpatient and inpatient services in a Chinese urban tertiary hospital. PMID:27627811

  3. Medicine prices, availability, and affordability in 36 developing and middle-income countries: a secondary analysis.

    PubMed

    Cameron, A; Ewen, M; Ross-Degnan, D; Ball, D; Laing, R

    2009-01-17

    WHO and Health Action International (HAI) have developed a standardised method for surveying medicine prices, availability, affordability, and price components in low-income and middle-income countries. Here, we present a secondary analysis of medicine availability in 45 national and subnational surveys done using the WHO/HAI methodology. Data from 45 WHO/HAI surveys in 36 countries were adjusted for inflation or deflation and purchasing power parity. International reference prices from open international procurements for generic products were used as comparators. Results are presented for 15 medicines included in at least 80% of surveys and four individual medicines. Average public sector availability of generic medicines ranged from 29.4% to 54.4% across WHO regions. Median government procurement prices for 15 generic medicines were 1.11 times corresponding international reference prices, although purchasing efficiency ranged from 0.09 to 5.37 times international reference prices. Low procurement prices did not always translate into low patient prices. Private sector patients paid 9-25 times international reference prices for lowest-priced generic products and over 20 times international reference prices for originator products across WHO regions. Treatments for acute and chronic illness were largely unaffordable in many countries. In the private sector, wholesale mark-ups ranged from 2% to 380%, whereas retail mark-ups ranged from 10% to 552%. In countries where value added tax was applied to medicines, the amount charged varied from 4% to 15%. Overall, public and private sector prices for originator and generic medicines were substantially higher than would be expected if purchasing and distribution were efficient and mark-ups were reasonable. Policy options such as promoting generic medicines and alternative financing mechanisms are needed to increase availability, reduce prices, and improve affordability.

  4. Descriptive Analysis on the Impacts of Universal Zero-Markup Drug Policy on a Chinese Urban Tertiary Hospital.

    PubMed

    Tian, Wei; Yuan, Jiangfan; Yang, Dong; Zhang, Lanjing

    2016-01-01

    Universal Zero-Markup Drug Policy (UZMDP) mandates no price mark-ups on any drug dispensed by a healthcare institution, and covers the medicines not included in China's National Essential Medicine System. Five tertiary hospitals in Beijing, China implemented UZMDP in 2012. Its impacts on these hospitals are unknown. We described the effects of UZMDP on a participating hospital, Jishuitan Hospital, Beijing, China (JST). This retrospective longitudinal study examined the hospital-level data of JST and city-level data of tertiary hospitals of Beijing, China (BJT) 2009-2015. Rank-sum tests and join-point regression analyses were used to assess absolute changes and differences in trends, respectively. In absolute terms, after the UZMDP implementation, there were increased annual patient-visits and decreased ratios of medicine-to-healthcare-charges (RMOH) in JST outpatient and inpatient services; however, in outpatient service, physician work-days decreased and physician-workload and inflation-adjusted per-visit healthcare charges increased, while the inpatient physician work-days increased and the inpatient mortality-rate decreased. Interestingly, the decreasing trend in inpatient mortality-rate was neutralized after UZMDP implementation. Compared with BJT and under the influence of UZMDP, JST outpatient and inpatient services both had increasing trends in annual patient-visits (annual percentage changes [APC] = 8.1% and 6.5%, respectively) and decreasing trends in RMOH (APC = -4.3% and -5.4%, respectively), while JST outpatient services had an increasing trend in inflation-adjusted per-visit healthcare charges (APC = 3.4%) and JST inpatient service had a decreasing trend in inflation-adjusted per-visit medicine-charges (APC = -5.2%). Implementation of UZMDP seems to increase annual patient-visits, reduce RMOH and have different impacts on outpatient and inpatient services in a Chinese urban tertiary hospital.

  5. The E-Book: Pipe Dream or Potential Disaster?

    ERIC Educational Resources Information Center

    Dorman, David

    1999-01-01

    Discusses the development of electronic books and considers marketing and distribution, rights management, and technical standards. Economic and institutional relationships, copyrights, Extensible Markup Language (XML), access to content, free access versus fees, preservation versus loss of control over long-term storage and access, and trusted…

  6. Developing Intranets: Practical Issues for Implementation and Design.

    ERIC Educational Resources Information Center

    Trowbridge, Dave

    1996-01-01

    An intranet is a system that has "domesticated" the technologies of the Internet for specific organizational settings and goals. Although the adaptability of Hypertext Markup Language to intranets is sometimes limited, implementing various protocols and technologies enables organizations to share files among heterogeneous computers,…

  7. Designing and Managing Your Digital Library.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Discusses digital libraries and Web site design issues. Highlights include accessibility issues, including standards, markup languages like HTML and XML, and metadata; building virtual communities; the use of Web portals for customized delivery of information; quality assurance tools, including data mining; and determining user needs, including…

  8. SCOPIC Design and Overview

    ERIC Educational Resources Information Center

    Barth, Danielle; Evans, Nicholas

    2017-01-01

    This paper provides an overview of the design and motivation for creating the Social Cognition Parallax Interview Corpus (SCOPIC), an open-ended, accessible corpus that balances the need for language-specific annotation with typologically-calibrated markup. SCOPIC provides richly annotated data, focusing on functional categories relevant to social…

  9. XML under the Hood.

    ERIC Educational Resources Information Center

    Scharf, David

    2002-01-01

    Discusses XML (extensible markup language), particularly as it relates to libraries. Topics include organizing information; cataloging; metadata; similarities to HTML; organizations dealing with XML; making XML useful; a history of XML; the semantic Web; related technologies; XML at the Library of Congress; and its role in improving the…

  10. Accessing Electronic Theses: Progress?

    ERIC Educational Resources Information Center

    Tennant, Roy

    2000-01-01

    Describes various ways by which universities provide access to their electronic theses and dissertations (ETDs), discussing UMI (University Microfilms International), XML (eXtensible Markup Language), and other formats. Discusses key leaders--national and international--in the ETD effort. Outlines the two main methods for locating ETDs. Presents a…

  11. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Jong, Wibe A.; Walker, Andrew M.; Hanwell, Marcus D.

    Background Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper the generation of semantically rich data from the NWChem computational chemistry software is discussed within the Chemical Markup Language (CML) framework. Results The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files used by the computational chemistry software. Conclusions The production of CML compliant XML files for the computational chemistry software NWChem can be relatively easily accomplished using the FoX library. A unified computational chemistry or CompChem convention and dictionary needs to be developed through a community-based effort. The long-term goal is to enable a researcher to do Google-style chemistry and physics searches.
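
    As a rough illustration of the kind of CML-compliant output described above, the following Python sketch writes a minimal CML-style file for one molecule. The actual NWChem coupling uses the Fortran FoX library, and the CompChem convention and dictionary are still community work in progress, so the element layout and the dictRef term below are illustrative assumptions, not the NWChem output format.

      # Illustrative sketch only: writes a minimal CML-style XML document for one
      # computed molecule. Element names follow common CML conventions; the
      # compchem dictionary term used for dictRef is a placeholder assumption.
      import xml.etree.ElementTree as ET

      CML_NS = "http://www.xml-cml.org/schema"

      def write_cml(atoms, energy_hartree, path):
          """atoms: list of (element_symbol, x, y, z) in Angstroms."""
          ET.register_namespace("", CML_NS)
          root = ET.Element("{%s}cml" % CML_NS)
          mol = ET.SubElement(root, "{%s}molecule" % CML_NS, id="m1")
          atom_array = ET.SubElement(mol, "{%s}atomArray" % CML_NS)
          for i, (sym, x, y, z) in enumerate(atoms, start=1):
              ET.SubElement(atom_array, "{%s}atom" % CML_NS, id="a%d" % i,
                            elementType=sym, x3=str(x), y3=str(y), z3=str(z))
          # A scalar property (total energy); the dictRef term is hypothetical.
          prop = ET.SubElement(root, "{%s}property" % CML_NS,
                               dictRef="compchem:totalEnergy")
          ET.SubElement(prop, "{%s}scalar" % CML_NS,
                        units="units:hartree").text = str(energy_hartree)
          ET.ElementTree(root).write(path, xml_declaration=True, encoding="utf-8")

      write_cml([("O", 0.0, 0.0, 0.0), ("H", 0.96, 0.0, 0.0), ("H", -0.24, 0.93, 0.0)],
                -76.026, "water.cml")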

  12. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapid searching of unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
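
    NETMARK itself decomposes documents into an Oracle 8i object-relational schema; the short sketch below only illustrates the underlying idea of keyword search across both the "context" (element path) and the "content" of decomposed documents, using SQLite for portability. The table and column names are hypothetical, not NETMARK's schema.

      # Toy illustration of context/content keyword search over decomposed
      # documents; NETMARK uses Oracle object-relational features, so the
      # schema below is purely hypothetical.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE nodes (doc TEXT, context TEXT, content TEXT)")
      conn.executemany("INSERT INTO nodes VALUES (?, ?, ?)", [
          ("report1.xml", "/report/title",   "Wind tunnel test summary"),
          ("report1.xml", "/report/section", "Scramjet inlet performance at Mach 7"),
          ("memo2.html",  "/html/body/p",    "Audit findings for fiscal year 2005"),
      ])

      keyword = "Mach"
      hits = conn.execute(
          "SELECT doc, context, content FROM nodes "
          "WHERE content LIKE ? OR context LIKE ?",
          (f"%{keyword}%", f"%{keyword}%")).fetchall()
      for doc, context, content in hits:
          print(doc, context, "->", content)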

  13. The Biological Connection Markup Language: a SBGN-compliant format for visualization, filtering and analysis of biological pathways.

    PubMed

    Beltrame, Luca; Calura, Enrica; Popovici, Razvan R; Rizzetto, Lisa; Guedez, Damariz Rivero; Donato, Michele; Romualdi, Chiara; Draghici, Sorin; Cavalieri, Duccio

    2011-08-01

    Many models and analyses of signaling pathways have been proposed. However, none of them takes into account that a biological pathway is not a fixed system; instead it depends on the organism, tissue and cell type as well as on physiological, pathological and experimental conditions. The Biological Connection Markup Language (BCML) is a format to describe, annotate and visualize pathways. BCML is able to store multiple kinds of information, permitting a selective view of the pathway as it exists and/or behaves in specific organisms, tissues and cells. Furthermore, BCML can be automatically converted into data formats suitable for analysis and into a fully SBGN-compliant graphical representation, making it an important tool that can be used by both computational biologists and 'wet lab' scientists. The XML schema and the BCML software suite are freely available under the LGPL for download at http://bcml.dc-atlas.net. They are implemented in Java and supported on MS Windows, Linux and OS X.

  14. Converting CSV Files to RKSML Files

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Liebersbach, Robert

    2009-01-01

    A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
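
    The RKSML schema itself is not reproduced in this record, so the sketch below only shows the general shape of such a CSV-to-XML conversion in Python; the root element, per-row elements and column names are placeholders, not the actual RKSML vocabulary.

      # Sketch of a CSV-to-XML conversion of arm telemetry; element and column
      # names are placeholders, not the real RKSML schema.
      import csv
      import xml.etree.ElementTree as ET

      def csv_to_rksml_like(csv_path, xml_path):
          root = ET.Element("ArmStates")                # placeholder root element
          with open(csv_path, newline="") as f:
              for row in csv.DictReader(f):             # expects a header row
                  state = ET.SubElement(root, "State", time=row["time"])
                  for joint in ("joint1", "joint2", "joint3", "joint4", "joint5"):
                      ET.SubElement(state, "Angle", name=joint).text = row[joint]
          ET.ElementTree(root).write(xml_path, xml_declaration=True, encoding="utf-8")

      # Usage, assuming a CSV with columns time, joint1 ... joint5:
      # csv_to_rksml_like("idd_telemetry.csv", "idd_telemetry.rksml")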

  15. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable, software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
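
    Of the global methods listed, the partial rank correlation coefficient is the easiest to sketch generically: rank-transform the sampled parameters and the model output, then correlate the residuals after removing the rank-linear influence of the other parameters. The example below is a generic NumPy/SciPy illustration of that idea, not the SBML-SAT implementation.

      # Generic partial rank correlation coefficient (PRCC) sketch for sampled
      # parameters X and model output y; illustrative only.
      import numpy as np
      from scipy.stats import rankdata

      def prcc(X, y):
          """X: (n_samples, n_params) parameter samples; y: (n_samples,) output."""
          Xr = np.column_stack([rankdata(col) for col in X.T])
          yr = rankdata(y)
          n, k = Xr.shape
          out = np.empty(k)
          for j in range(k):
              others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
              # Residuals of parameter j and of the output after removing the
              # rank-linear effect of all other parameters.
              rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
              ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
              out[j] = np.corrcoef(rx, ry)[0, 1]
          return out

      rng = np.random.default_rng(0)
      X = rng.uniform(size=(500, 3))
      y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.05 * rng.normal(size=500)   # toy model
      print(prcc(X, y))   # strongly positive, negative, and near-zero coefficients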

  16. AllerML: markup language for allergens.

    PubMed

    Ivanciuc, Ovidiu; Gendel, Steven M; Power, Trevor D; Schein, Catherine H; Braun, Werner

    2011-06-01

    Many concerns have been raised about the potential allergenicity of novel recombinant proteins introduced into food crops. Guidelines, proposed by WHO/FAO and EFSA, include the use of bioinformatics screening to assess the risk of potential allergenicity or cross-reactivities of all proteins introduced, for example, to improve nutritional value or promote crop resistance. However, there are no universally accepted standards that can be used to encode data on the biology of allergens to facilitate using data from multiple databases in this screening. Therefore, we developed AllerML, a markup language for allergens, to assist in the automated exchange of information between databases and in the integration of the bioinformatics tools that are used to investigate allergenicity and cross-reactivity. As proof of concept, AllerML was implemented using the Structural Database of Allergenic Proteins (SDAP; http://fermi.utmb.edu/SDAP/) database. General implementation of AllerML will promote automatic flow of validated data that will aid in allergy research and regulatory analysis. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable, software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.

  18. The gel electrophoresis markup language (GelML) from the Proteomics Standards Initiative.

    PubMed

    Gibson, Frank; Hoogland, Christine; Martinez-Bartolomé, Salvador; Medina-Aunon, J Alberto; Albar, Juan Pablo; Babnigg, Gyorgy; Wipat, Anil; Hermjakob, Henning; Almeida, Jonas S; Stanislaus, Romesh; Paton, Norman W; Jones, Andrew R

    2010-09-01

    The Human Proteome Organisation's Proteomics Standards Initiative has developed the GelML (gel electrophoresis markup language) data exchange format for representing gel electrophoresis experiments performed in proteomics investigations. The format closely follows the reporting guidelines for gel electrophoresis, which are part of the Minimum Information About a Proteomics Experiment (MIAPE) set of modules. GelML supports the capture of metadata (such as experimental protocols) and data (such as gel images) resulting from gel electrophoresis so that laboratories can be compliant with the MIAPE Gel Electrophoresis guidelines, while allowing such data sets to be exchanged or downloaded from public repositories. The format is sufficiently flexible to capture data from a broad range of experimental processes, and complements other PSI formats for MS data and the results of protein and peptide identifications to capture entire gel-based proteome workflows. GelML has resulted from the open standardisation process of PSI consisting of both public consultation and anonymous review of the specifications.

  19. AllerML: Markup Language for Allergens

    PubMed Central

    Ivanciuc, Ovidiu; Gendel, Steven M.; Power, Trevor D.; Schein, Catherine H.; Braun, Werner

    2011-01-01

    Many concerns have been raised about the potential allergenicity of novel recombinant proteins introduced into food crops. Guidelines, proposed by WHO/FAO and EFSA, include the use of bioinformatics screening to assess the risk of potential allergenicity or cross-reactivities of all proteins introduced, for example, to improve nutritional value or promote crop resistance. However, there are no universally accepted standards that can be used to encode data on the biology of allergens to facilitate using data from multiple databases in this screening. Therefore, we developed AllerML, a markup language for allergens, to assist in the automated exchange of information between databases and in the integration of the bioinformatics tools that are used to investigate allergenicity and cross-reactivity. As proof of concept, AllerML was implemented using the Structural Database of Allergenic Proteins (SDAP; http://fermi.utmb.edu/SDAP/) database. General implementation of AllerML will promote automatic flow of validated data that will aid in allergy research and regulatory analysis. PMID:21420460

  20. HyperART: non-invasive quantification of leaf traits using hyperspectral absorption-reflectance-transmittance imaging.

    PubMed

    Bergsträsser, Sergej; Fanourakis, Dimitrios; Schmittgen, Simone; Cendrero-Mateo, Maria Pilar; Jansen, Marcus; Scharr, Hanno; Rascher, Uwe

    2015-01-01

    Combined assessment of leaf reflectance and transmittance is currently limited to spot (point) measurements. This study introduces a tailor-made hyperspectral absorption-reflectance-transmittance imaging (HyperART) system, yielding a non-invasive determination of both reflectance and transmittance of the whole leaf. We addressed its applicability for analysing plant traits, i.e. assessing Cercospora beticola disease severity or leaf chlorophyll content. To test the accuracy of the obtained data, they were compared with reflectance and transmittance measurements of selected leaves acquired by the point spectroradiometer ASD FieldSpec, equipped with the FluoWat device. The working principle of the HyperART system relies on the upward redirection of transmitted and reflected light (range of 400 to 2500 nm) of a plant sample towards two line scanners. By using both the reflectance and transmittance image, an image of leaf absorption can be calculated. The comparison with the high-resolution ASD FieldSpec data showed good correlation, underlining the accuracy of the HyperART system. Our experiments showed that variation in both leaf chlorophyll content of four different crop species, due to different fertilization regimes during growth, and fungal symptoms on sugar beet leaves could be accurately estimated and monitored. The use of leaf reflectance and transmittance, as well as their sum (by which the non-absorbed radiation is calculated), obtained by the HyperART system gave considerably improved results in classification of Cercospora leaf spot disease and determination of chlorophyll content. The HyperART system offers the possibility for non-invasive and accurate mapping of leaf transmittance and absorption, significantly expanding the applicability of reflectance-based mapping spectroscopy in plant sciences. Therefore, the HyperART system may be readily employed for non-invasive determination of the spatio-temporal dynamics of various plant properties.
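
    The absorption image follows from the two measured images by a simple energy balance: assuming negligible fluorescence and other losses, the per-pixel, per-wavelength relation is

      A(\lambda) = 1 - R(\lambda) - T(\lambda)

    where R and T are reflectance and transmittance expressed as fractions of the incident radiation, so that R + T is the non-absorbed fraction mentioned above.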

  1. Response of general practitioners to computer-generated critiques of hypertension therapy.

    PubMed

    van der Lei, J; van der Does, E; Man in 't Veld, A J; Musen, M A; van Bemmel, J H

    1993-04-01

    We recently have shown that a computer system, known as HyperCritic, can successfully audit general practitioners' treatment of hypertension by analyzing computer-based patient records. HyperCritic reviews the electronic medical records and offers unsolicited advice. To determine which unsolicited advice might be perceived as inappropriate, builders of programs such as HyperCritic need insight into providers' responses to computer-generated critique of their patient care. Twenty medical charts, describing in total 243 visits of patients with hypertension, were audited by 8 human reviewers and by the critiquing system HyperCritic. A panel of 14 general practitioners subsequently judged the relevance of those critiques on a five-point scale ranging from relevant critique to erroneous or harmful critique. The panel judged reviewers' comments to be either relevant or somewhat relevant in 61 to 68% of cases, and either erroneous or possibly erroneous in 15 to 18%; the panel judged HyperCritic's comments to be either relevant or somewhat relevant in 65% of cases, and either erroneous or possibly erroneous in 16%. Comparison of individual members of the panel showed large differences; for example, the portion of HyperCritic's comments judged relevant ranged from 0 to 82%. We conclude that, from the perspective of general practitioners, critiques generated by the critiquing system HyperCritic are perceived to be as beneficial as critiques generated by human reviewers. Different general practitioners, however, judge the critiques differently. Before auditing systems based on computer-based patient records can be introduced in a form acceptable to practitioners, additional studies are needed to evaluate the reasons a physician may have for judging critiques to be irrelevant, and to evaluate the effect of critiques on physician behavior.

  2. High-resolution, time-resolved MRA provides superior definition of lower-extremity arterial segments compared to 2D time-of-flight imaging.

    PubMed

    Thornton, F J; Du, J; Suleiman, S A; Dieter, R; Tefera, G; Pillai, K R; Korosec, F R; Mistretta, C A; Grist, T M

    2006-08-01

    To evaluate a novel time-resolved contrast-enhanced (CE) projection reconstruction (PR) magnetic resonance angiography (MRA) method for identifying potential bypass graft target vessels in patients with Class II-IV peripheral vascular disease. Twenty patients (M:F = 15:5, mean age = 58 years, range = 48-83 years) were recruited from routine MRA referrals. All imaging was performed on a 1.5 T MRI system with fast gradients (Signa LX; GE Healthcare, Waukesha, WI). Images were acquired with a novel technique that combined undersampled PR with a time-resolved acquisition to yield an MRA method with high temporal and spatial resolution. The method is called PR hyper time-resolved imaging of contrast kinetics (PR-hyperTRICKS). Quantitative and qualitative analyses were used to compare two-dimensional (2D) time-of-flight (TOF) and PR-hyperTRICKS in 13 arterial segments per lower extremity. Statistical analysis was performed with the Wilcoxon signed-rank test. Fifteen percent (77/517) of the vessels were scored as missing or nondiagnostic with 2D TOF, but were scored as diagnostic with PR-hyperTRICKS. Image quality was superior with PR-hyperTRICKS vs. 2D TOF (on a four-point scale, mean rank = 3.3 +/- 1.2 vs. 2.9 +/- 1.2, P < 0.0001). PR-hyperTRICKS produced images with high contrast-to-noise ratios (CNR) and high spatial and temporal resolution. 2D TOF images were of inferior quality due to moderate spatial resolution, inferior CNR, greater flow-related artifacts, and absence of temporal resolution. PR-hyperTRICKS provides superior preoperative assessment of lower limb ischemia compared to 2D TOF.

  3. Unique atom hyper-kagome order in Na4Ir3O8 and in low-symmetry spinel modifications.

    PubMed

    Talanov, V M; Shirokov, V B; Talanov, M V

    2015-05-01

    Group-theoretical and thermodynamic methods of the Landau theory of phase transitions are used to investigate the hyper-kagome atomic order in structures of ordered spinels and a spinel-like Na4Ir3O8 crystal. The formation of an atom hyper-kagome sublattice in Na4Ir3O8 is described theoretically on the basis of the archetype (hypothetical parent structure/phase) concept. The archetype structure of Na4Ir3O8 has a spinel-like structure (space group Fd-3m) and composition [Na1/2Ir3/2](16d)[Na3/2](16c)O(32e)4. The critical order parameter which induces the hypothetical phase transition is stated. It is shown that the derived structure of Na4Ir3O8 is formed as a result of the displacements of Na, Ir and O atoms, ordering of Na, Ir and O atoms, and ordering of the dxy, dxz and dyz orbitals as well. Ordering of all atoms takes place according to the type 1:3. Ir and Na atoms form an intriguing atom order: a network of corner-shared Ir triangles called a hyper-kagome lattice. The Ir atoms form nanoclusters which are named decagons. The existence of hyper-kagome lattices in six types of ordered spinel structures is predicted theoretically. The structure mechanisms of the formation of the predicted hyper-kagome atom order in some ordered spinel phases are established. For a number of cases typical diagrams of possible crystal phase states are built in the framework of the Landau theory of phase transitions. Thermodynamic conditions of hyper-kagome order formation are discussed by means of these diagrams. The proposed theory is in accordance with experimental data.

  4. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and the acceptance criterion) and low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
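
    As a simplified sketch of the general idea (online bandit selection of low-level heuristics inside a hyper-heuristic loop), the Python fragment below uses a plain UCB1 selector and a greedy acceptance rule; the paper's dynamic multi-armed bandit, extreme-value reward and GEP-evolved acceptance criterion are not reproduced, so treat all of it as illustrative only.

      # Simplified bandit-driven hyper-heuristic loop: a UCB1 selector picks a
      # low-level heuristic each iteration; improvements in cost are the reward.
      import math
      import random

      def hyper_heuristic(initial, cost, low_level_heuristics, iterations=1000):
          counts = [0] * len(low_level_heuristics)
          rewards = [0.0] * len(low_level_heuristics)
          current, best = initial, initial
          for t in range(1, iterations + 1):
              if 0 in counts:                       # try every heuristic once first
                  h = counts.index(0)
              else:                                 # UCB1: mean reward + exploration bonus
                  h = max(range(len(low_level_heuristics)),
                          key=lambda i: rewards[i] / counts[i]
                          + math.sqrt(2 * math.log(t) / counts[i]))
              candidate = low_level_heuristics[h](current)
              improvement = cost(current) - cost(candidate)
              counts[h] += 1
              rewards[h] += max(improvement, 0.0)   # reward only improvements
              if improvement >= 0:                  # greedy acceptance criterion
                  current = candidate
              if cost(current) < cost(best):
                  best = current
          return best

      # Toy usage: minimise (x - 2)^2 with two "heuristics" (small and large random moves).
      h_small = lambda x: x + random.uniform(-0.1, 0.1)
      h_large = lambda x: x + random.uniform(-1.0, 1.0)
      print(hyper_heuristic(5.0, lambda x: (x - 2.0) ** 2, [h_small, h_large]))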

  5. Toxoplasmosis Presenting as Hyper Viscosity Syndrome due to Polyclonal Gammopathy.

    PubMed

    Puranik, Shaila C; Rathod, Kalpana B; Kudrimoti, Jyoti K

    2014-03-01

    We present a rare case of toxoplasma lymphadenopathy with hyper viscosity syndrome due to polyclonal gammopathy. A 30-year-old female presented with generalized lymphadenopathy. Lymph node biopsy findings suggestive of toxoplasmosis were confirmed on serology. Bone marrow aspiration showed 50% plasma cells. On serum electrophoresis, a broad, diffuse band was noted, indicative of polyclonal gammopathy. The M band was absent. The patient was immunocompetent and presented with hyper viscosity syndrome masking the symptoms of underlying toxoplasmosis.

  6. Surface-enhanced hyper-Raman spectroscopy with a picosecond laser: gold and copper colloids

    NASA Astrophysics Data System (ADS)

    Lipscomb, Leigh Ann; Nie, Shuming; Feng, Sibo; Yu, Nai-Teng

    1990-07-01

    We have obtained surface-enhanced hyper-Raman scattering (SEHRS) spectra of crystal violet, rhodamine 6G and Ru(trpy) (BPE) 32+ adsorbed on gold and copper colloidal surfaces (where trpy=2,2',2″-terpyridine, BPE=trans-bis(4-pyridyl)ethylene). Our results demonstrate that the SEHRS effect is not intrinsically restricted to a Ag substrate and that surface enhancements at the emitted hyper-Raman photon frequencies are not required for observing SEHRS signals.

  7. A study on the nature of interactions of mixed-mode ligands HEA and PPA HyperCel using phenylglyoxal modified lysozyme.

    PubMed

    Pezzini, J; Cabanne, C; Dupuy, J-W; Gantier, R; Santarelli, X

    2014-06-01

    Mixed mode chromatography, or multimodal chromatography, involves the exploitation of combinations of several interactions in a controlled manner, to facilitate the rapid capture of proteins. Mixed-mode ligands like HEA and PPA HyperCel™ facilitate different kinds of interactions (hydrophobic, ionic, etc.) under different conditions. In order to better characterize the nature of this multi-modal interaction, we sought to study a protein, lysozyme, which is normally not retained by these mixed mode resins under normal binding conditions. Lysozyme was modified specifically at Arginine residues by the action of phenylglyoxal, and was extensively studied in this work to better characterize the mixed-mode interactions of HEA HyperCel™ and PPA HyperCel™ chromatographic supports. We show here that the adsorption behaviour of lysozyme on HEA and PPA HyperCel™ mixed mode sorbents varies depending on the degree of charge modification at the surface of the protein. Experiments using conventional cation exchange and hydrophobic interaction chromatography confirm that both charge and hydrophobicity modification occurs at the surface of the protein after lysozyme reaction with phenylglyoxal. The results emanating from this work using HEA and PPA HyperCel sorbents strongly suggest that mixed mode chromatography can efficiently separate closely related proteins of only minor surface charge and/or hydrophobicity differences. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Negative Refraction with Superior Transmission in Graphene-Hexagonal Boron Nitride (hBN) Multilayer Hyper Crystal

    PubMed Central

    Sayem, Ayed Al; Rahman, Md. Masudur; Mahdy, M. R. C.; Jahangir, Ifat; Rahman, Md. Saifur

    2016-01-01

    In this article, we have theoretically investigated the performance of a graphene-hexagonal Boron Nitride (hBN) multilayer structure (hyper crystal) to demonstrate all-angle negative refraction along with superior transmission. hBN, one of the latest natural hyperbolic materials, can be a very strong contender to form a hyper crystal with graphene due to its excellence as a graphene-compatible substrate. Although bare hBN can exhibit negative refraction, the transmission is generally low due to its high reflectivity. In contrast, due to graphene's 2D nature and metallic characteristics in the frequency range where hBN behaves as a type-I hyperbolic material, we have found graphene-hBN hyper-crystals to exhibit all-angle negative refraction with superior transmission. Interestingly, superior transmission from the whole structure can be fully controlled by the tunability of graphene without hampering the negative refraction originating mainly from hBN. We have also presented an effective medium description of the hyper crystal in the low-k limit and validated the proposed theory analytically and with full wave simulations. Along with the current extensive research on hybridization of graphene plasmon polaritons with (hyperbolic) hBN phonon polaritons, this work might have some substantial impact on this field of research and can be very useful in applications such as hyper-lensing. PMID:27146561

  9. Geometrical calibration of an AOTF hyper-spectral imaging system

    NASA Astrophysics Data System (ADS)

    Špiclin, Žiga; Katrašnik, Jaka; Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan

    2010-02-01

    Optical aberrations present an important problem in optical measurements. Geometrical calibration of an imaging system is therefore of the utmost importance for achieving accurate optical measurements. In hyper-spectral imaging systems, the problem of optical aberrations is even more pronounced because optical aberrations are wavelength dependent. Geometrical calibration must therefore be performed over the entire spectral range of the hyper-spectral imaging system, which is usually far greater than that of the visible light spectrum. This problem is especially severe in AOTF (Acousto-Optic Tunable Filter) hyper-spectral imaging systems, as the diffraction of light in AOTF filters is dependent on both wavelength and angle of incidence. Geometrical calibration of the hyper-spectral imaging system was performed using a stable caliber of known dimensions, which was imaged at different wavelengths over the entire spectral range. The acquired images were then automatically registered to the caliber model by both parametric and nonparametric transformations based on B-splines, minimizing the normalized correlation coefficient. The calibration method was tested on an AOTF hyper-spectral imaging system in the near infrared spectral range. The results indicated substantial wavelength-dependent optical aberration that is especially pronounced in the spectral range closer to the infrared part of the spectrum. The calibration method was able to accurately characterize the aberrations and produce transformations for efficient sub-pixel geometrical calibration over the entire spectral range, finally yielding better spatial resolution of the hyper-spectral imaging system.
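
    The similarity measure driving the registration step is easy to state on its own: the normalized correlation coefficient between the acquired caliber image and the transformed caliber model, evaluated at every wavelength band (the search drives this value towards 1, whether phrased as a maximisation or as a minimisation of its negative). The sketch below shows only the metric; the parametric and B-spline transforms are omitted.

      # Normalized correlation coefficient between two same-shape images;
      # the registration transform itself is not shown here.
      import numpy as np

      def normalized_correlation(a, b):
          a = a.astype(float) - a.mean()
          b = b.astype(float) - b.mean()
          return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

      img = np.random.rand(64, 64)
      print(normalized_correlation(img, img))         # 1.0 for identical images
      print(normalized_correlation(img, 1.0 - img))   # -1.0 for inverted images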

  10. Autobiographical accounts of sensing in Asperger syndrome and high-functioning autism.

    PubMed

    Elwin, Marie; Ek, Lena; Schröder, Agneta; Kjellin, Lars

    2012-10-01

    Sensory experiences in Asperger syndrome (AS) or high-functioning autism (HFA) were explored by qualitative content analysis of autobiographical texts by persons with AS/HFA. Predetermined categories of hyper- and hyposensitivity were applied to texts. Hypersensitivity consists of strong reactions and heightened apprehension in reaction to external stimuli, sometimes together with overfocused or unselective attention. It was common in vision, hearing, and touch. In contrast, hyposensitivity was frequent in reaction to internal and body stimuli such as interoception, proprioception, and pain. It consists of less registration, discrimination, and recognition of stimuli as well as cravings for specific stimuli. Awareness of the strong impact of sensitivity is essential for creating good environments and encounters in the context of psychiatric and other health care. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Increased gluconeogenesis in rats exposed to hyper-G stress

    NASA Technical Reports Server (NTRS)

    Daligcon, B. C.; Oyama, J.; Hannak, K.

    1985-01-01

    The effect of gluconeogenesis on the levels of plasma glucose and liver glycogen was studied in rats exposed to hyper-G stress. Incorporation of lactate, alanine, or glycerol, labeled with C-14, into plasma glucose and liver glycogen was measured in rats centrifuged at 3.1 G for 0.25, 0.50, and 1.0-hr periods, and was compared to noncentrifuged controls injected with appropriate glycogen precursors. It was found that exposure to G-stress leads to increased incorporation from all three substrates into both plasma glucose and liver glycogen. These early incorporation increases were blocked upon pre-G administration of 5-methoxyindole-2-carboxylic acid, a gluconeogenesis inhibitor, or propranolol, a beta-adrenergic blocker, as well as by adrenodemedullation. Results indicate that the rapid rise in plasma glucose, as well as in liver glycogen in rats exposed to hyper-G stress is due to an increased rate of gluconeogenesis, and that epinephrine, released in response to hyper-G-induced activation of the sympathetic-adrenal system, plays a dominant role during the early stages of hyper-G stress.

  12. Biologically-inspired data decorrelation for hyper-spectral imaging

    NASA Astrophysics Data System (ADS)

    Picon, Artzai; Ghita, Ovidiu; Rodriguez-Vaamonde, Sergio; Iriondo, Pedro Ma; Whelan, Paul F.

    2011-12-01

    Hyper-spectral data allows the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods require complex and subjective training procedures, and in addition the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
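
    For contrast with the proposed vision-inspired scheme, the conventional PCA baseline named above can be sketched in a few lines: each pixel's spectrum is projected onto the leading eigenvectors of the band-by-band covariance, yielding a small number of decorrelated components.

      # Conventional PCA decorrelation of a hyper-spectral cube (one of the
      # baseline methods named in the abstract, not the proposed approach).
      import numpy as np

      def pca_decorrelate(cube, n_components=5):
          """cube: (rows, cols, bands) hyper-spectral image."""
          rows, cols, bands = cube.shape
          X = cube.reshape(-1, bands).astype(float)
          X -= X.mean(axis=0)                       # centre each band
          cov = np.cov(X, rowvar=False)             # band-by-band covariance
          eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
          top = eigvecs[:, ::-1][:, :n_components]  # leading components
          return (X @ top).reshape(rows, cols, n_components)

      cube = np.random.rand(32, 32, 100)            # toy 100-band cube
      print(pca_decorrelate(cube).shape)            # (32, 32, 5)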

  13. The X-43A Hyper-X Mach 7 Flight 2 Guidance, Navigation, and Control Overview and Flight Test Results

    NASA Technical Reports Server (NTRS)

    Bahm, Catherine; Baumann, Ethan; Martin, John; Bose, David; Beck, Roger E.; Strovers, Brian

    2005-01-01

    The objective of the Hyper-X program was to flight demonstrate an airframe-integrated hypersonic vehicle. On March 27, 2004, the Hyper-X program team successfully conducted flight 2 and achieved all of the research objectives. The Hyper-X research vehicle successfully separated from the Hyper-X launch vehicle and achieved the desired engine test conditions before the experiment began. The research vehicle rejected the disturbances caused by the cowl door opening and the fuel turning on and off and maintained the engine test conditions throughout the experiment. After the engine test was complete, the vehicle recovered and descended along a trajectory while performing research maneuvers. The last data acquired showed that the vehicle maintained control to the water. This report will provide an overview of the research vehicle guidance and control systems and the performance of the vehicle during the separation event and engine test. The research maneuvers were performed to collect data for aerodynamics and flight controls research. This report also will provide an overview of the flight controls related research and results.

  14. Cation exchange properties of zeolites in hyper alkaline aqueous media.

    PubMed

    Van Tendeloo, Leen; de Blochouse, Benny; Dom, Dirk; Vancluysen, Jacqueline; Snellings, Ruben; Martens, Johan A; Kirschhock, Christine E A; Maes, André; Breynaert, Eric

    2015-02-03

    Construction of multibarrier concrete based waste disposal sites and management of alkaline mine drainage water requires cation exchangers combining excellent sorption properties with a high stability and predictable performance in hyper alkaline media. Though highly selective organic cation exchange resins have been developed for most pollutants, they can serve as a growth medium for bacterial proliferation, impairing their long-term stability and introducing unpredictable parameters into the evolution of the system. Zeolites represent a family of inorganic cation exchangers, which naturally occur in hyper alkaline conditions and cannot serve as an electron donor or carbon source for microbial proliferation. Despite their successful application as industrial cation exchangers under near neutral conditions, their performance in hyper alkaline, saline water remains highly undocumented. Using Cs(+) as a benchmark element, this study aims to assess the long-term cation exchange performance of zeolites in concrete derived aqueous solutions. Comparison of their exchange properties in alkaline media with data obtained in near neutral solutions demonstrated that the cation exchange selectivity remains unaffected by the increased hydroxyl concentration; the cation exchange capacity did however show an unexpected increase in hyper alkaline media.

  15. Optimized Hyper Beamforming of Linear Antenna Arrays Using Collective Animal Behaviour

    PubMed Central

    Ram, Gopi; Mandal, Durbadal; Kar, Rajib; Ghoshal, Sakti Prasad

    2013-01-01

    A novel optimization technique developed by mimicking collective animal behaviour (CAB) is applied to the optimal design of hyper beamforming of linear antenna arrays. Hyper beamforming is based on the sum and difference beam patterns of the array, each raised to the power of a hyperbeam exponent parameter. The optimized hyperbeam is achieved by optimization of the current excitation weights and the uniform interelement spacing. As compared to conventional hyper beamforming of a linear antenna array, real-coded genetic algorithm (RGA), particle swarm optimization (PSO), and differential evolution (DE) applied to the hyperbeam of the same array can achieve a reduction in sidelobe level (SLL) and the same or smaller first null beam width (FNBW), keeping the same value of the hyperbeam exponent. Further reductions of SLL and FNBW have been achieved by the proposed collective animal behaviour (CAB) algorithm. CAB finds a near-global optimal solution, unlike RGA, PSO, and DE, in the present problem. The above comparative optimization is illustrated through 10-, 14-, and 20-element linear antenna arrays to establish the optimization efficacy of CAB. PMID:23970843
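
    A commonly cited formulation of the hyperbeam forms the sum and difference patterns of the two array halves and combines their magnitudes raised to the hyperbeam exponent u, R_H(theta) = | |S(theta)|^u - |D(theta)|^u |^(1/u). The sketch below follows that form for a uniform linear array; the exact expression used in the paper and the CAB optimisation of weights and spacing are not reproduced, so it is illustrative only.

      # Hyperbeam of a uniform linear array from the sum and difference patterns
      # of its two halves, raised to a hyperbeam exponent u (illustrative form).
      import numpy as np

      def hyperbeam(weights, spacing_lambda, theta, u=0.5):
          w = np.asarray(weights, dtype=float)
          n = len(w)
          k = 2 * np.pi                              # 2*pi/lambda, spacing in wavelengths
          phase = k * spacing_lambda * np.arange(n)[:, None] * np.cos(theta)[None, :]
          element = w[:, None] * np.exp(1j * phase)
          left, right = element[: n // 2].sum(0), element[n // 2 :].sum(0)
          r_sum, r_diff = np.abs(left + right), np.abs(left - right)
          return np.abs(r_sum ** u - r_diff ** u) ** (1.0 / u)

      theta = np.linspace(0.0, np.pi, 721)           # angle measured from the array axis
      pattern = hyperbeam(np.ones(10), 0.5, theta, u=0.5)
      pattern_db = 20 * np.log10(np.maximum(pattern, 1e-12) / pattern.max())
      print(pattern_db.max(), round(pattern_db.min(), 1))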

  16. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Multiple Disconnected Brain Subcortical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2014-01-01

    We present a novel surface parameterization technique using hyperspherical harmonics (HSH) for representing compact, multiple, disconnected brain subcortical structures as a single analytic function. The proposed hyperspherical harmonic representation (HyperSPHARM) has many advantages over the widely used spherical harmonic (SPHARM) parameterization technique. SPHARM requires flattening 3D surfaces to a 3D sphere, which can be time consuming for large surface meshes, and cannot represent multiple disconnected objects with a single parameterization. On the other hand, HyperSPHARM treats a 3D object, via a simple stereographic projection, as the surface of a 4D hypersphere with an extremely large radius, hence avoiding the computationally demanding flattening process. HyperSPHARM is shown to achieve a better reconstruction with only 5 basis functions, compared to SPHARM, which requires more than 441. PMID:24505716

  17. Symmetries of hyper-Kähler (or Poisson gauge field) hierarchy

    NASA Astrophysics Data System (ADS)

    Takasaki, K.

    1990-08-01

    Symmetry properties of the space of complex (or formal) hyper-Kähler metrics are studied in the language of hyper-Kähler hierarchies. The construction of finite symmetries is analogous to the theory of Riemann-Hilbert transformations, loop group elements now taking values in a (pseudo-) group of canonical transformations of a symplectic manifold. In spite of their highly nonlinear and involved nature, infinitesimal expressions of these symmetries are shown to have a rather simple form. These infinitesimal transformations are extended to the Plebanski key functions to give rise to a nonlinear realization of a Poisson loop algebra. The Poisson algebra structure turns out to originate in a contact structure behind a set of symplectic structures inherent in the hyper-Kähler hierarchy. Possible relations to membrane theory are briefly discussed.

  18. Does the HyperCP evidence for the decay Sigma+ --> p mu+ mu- indicate a light pseudoscalar Higgs boson?

    PubMed

    He, Xiao-Gang; Tandean, Jusak; Valencia, G

    2007-02-23

    The HyperCP Collaboration has observed three events for the decay Sigma+ --> p mu+ mu- which may be interpreted as a new particle of mass 214.3 MeV. However, existing data from kaon and B-meson decays provide stringent constraints on the construction of models that support this interpretation. In this Letter we show that the "HyperCP particle" can be identified with the light pseudoscalar Higgs boson in the next-to-minimal supersymmetric standard model, the A1^0. In this model there are regions of parameter space where the A1^0 can satisfy all the existing constraints from kaon and B-meson decays and mediate Sigma+ --> p mu+ mu- at a level consistent with the HyperCP observation.

  19. The study of hydrogen peroxide level under cisplatin action using genetically encoded sensor hyper

    NASA Astrophysics Data System (ADS)

    Belova, A. S.; Orlova, A. G.; Maslennikova, A. V.; Brilkina, A. A.; Balalaeva, I. V.; Antonova, N. O.; Mishina, N. M.; Shakhova, N. M.; Belousov, V. V.

    2014-03-01

    The aim of the work was to study the participation of hydrogen peroxide in the response of the cervical cancer cell line HeLa Kyoto to cisplatin. Determination of the hydrogen peroxide level was performed using the genetically encoded fluorescent sensor HyPer2. The dependence of cell viability on cisplatin concentration was determined using the MTT assay. Mechanisms of cell death as well as the HyPer2 response were revealed by flow cytometry after 6 hours of incubation with cisplatin at different concentrations. Cisplatin used at low concentrations had no effect on the hydrogen peroxide level in HeLa Kyoto cells. An increase of HyPer2 fluorescence was detected only after exposure to cisplatin at a high concentration. The response was not a consequence of cell death.

  20. Relating UMLS semantic types and task-based ontology to computer-interpretable clinical practice guidelines.

    PubMed

    Kumar, Anand; Ciccarese, Paolo; Quaglini, Silvana; Stefanelli, Mario; Caffi, Ezio; Boiocchi, Lorenzo

    2003-01-01

    Medical knowledge in clinical practice guideline (GL) texts is the source of task-based computer-interpretable clinical guideline models (CIGMs). We have used Unified Medical Language System (UMLS) semantic types (STs) to understand the percentage of GL text which belongs to a particular ST. We also use the UMLS semantic network together with the CIGM-specific ontology to derive a semantic meaning behind the GL text. In order to achieve this objective, we took nine GL texts from the National Guideline Clearinghouse (NGC) and marked up the text dealing with a particular ST. The STs we took into consideration were restricted, taking into account the requirements of a task-based CIGM. We used DARPA Agent Markup Language and Ontology Inference Layer (DAML + OIL) to create the UMLS and CIGM-specific semantic network. For the latter, as a bench test, we used the 1999 WHO-International Society of Hypertension Guidelines for the Management of Hypertension. We took into consideration the UMLS STs closest to the clinical tasks. The percentages of the GL text dealing with the ST "Health Care Activity" and the subtypes "Laboratory Procedure", "Diagnostic Procedure" and "Therapeutic or Preventive Procedure" were measured. The parts of text belonging to other STs or comments were separated. A mapping of terms belonging to other STs was done to the STs under "HCA" for representation in DAML + OIL. As a result, we found that the three STs under "HCA" were the predominant STs present in the GL text. In cases where the terms of related STs existed, they were mapped into one of the three STs. The DAML + OIL representation was able to describe the hierarchy in task-based CIGMs. To conclude, we understood that the three STs could be used to represent the semantic network of the task-based CIGMs. We identified some mapping operators which could be used for the mapping of other STs into these.

  1. Combining dictionary techniques with extensible markup language (XML)--requirements to a new approach towards flexible and standardized documentation.

    PubMed Central

    Altmann, U.; Tafazzoli, A. G.; Noelle, G.; Huybrechts, T.; Schweiger, R.; Wächter, W.; Dudeck, J. W.

    1999-01-01

    In oncology, various international and national standards exist for the documentation of different aspects of a disease. Since elements of these standards are repeated in different contexts, a common data dictionary could support consistent representation in any context. For the construction of such a dictionary, existing documents have to be worked up in a complex procedure that considers aspects of hierarchical decomposition of documents and of domain control, as well as aspects of user presentation and of the underlying model of patient data. In contrast to other thesauri, text chunks like definitions or explanations are very important and have to be preserved, since oncologic documentation often means coding and classification on an aggregate level and the safe use of coding systems is an important precondition for comparability of data. This paper discusses the potentials of the use of XML in combination with a dictionary for the promotion and development of standard-conformable applications for tumor documentation. PMID:10566311

  2. Atmosphere-based image classification through luminance and hue

    NASA Astrophysics Data System (ADS)

    Xu, Feng; Zhang, Yujin

    2005-07-01

    In this paper a novel image classification system is proposed. Atmosphere serves an important role in generating the scene's topic or in conveying the message behind the scene's story, and belongs to the abstract-attribute level among semantic levels. First, five atmosphere semantic categories are defined according to the rules of photo and film grammar, followed by global luminance and hue features. Then hierarchical SVM classifiers are applied. In each classification stage, the corresponding features are extracted and a trained linear SVM is applied, resulting in two classes. After three stages of classification, five atmosphere categories are obtained. Finally, text annotation of the atmosphere semantics and the corresponding features in Extensible Markup Language (XML) within MPEG-7 is defined, which can be integrated into further multimedia applications (such as searching, indexing and accessing of multimedia content). The experiment is performed on Corel images and film frames. The classification results prove the effectiveness of the definition of the atmosphere semantic classes and the corresponding features.

  3. Fournier gangrene associated with hyper IgE syndrome (Job syndrome).

    PubMed

    Hori, Junichi; Yamaguchi, Satoshi; Watanabe, Masaki; Osanai, Hiroaki; Hori, Masako

    2008-04-01

    We report a case of a 32-year-old man with hyper IgE syndrome (Job syndrome) who developed Fournier gangrene due to infectious multiple atheromas of the scrotal skin that progressed to the right groin and thigh. The patient required surgical debridement and subsequent skin grafting. This is a rare case of Fournier gangrene associated with hyper IgE syndrome (Job syndrome). When a patient without diabetes mellitus has repeated infections and atopic-like dermatitis, Job syndrome should be considered.

  4. Autonomic Recovery: HyperCheck: A Hardware-Assisted Integrity Monitor

    DTIC Science & Technology

    2013-08-01

    system (OS). HyperCheck leverages the CPU System Management Mode (SMM), present in x86 systems, to securely generate and transmit the full state of the ... HyperCheck harnesses the CPU System Management Mode (SMM), which is present in all x86 commodity systems, to create a snapshot view of the current state of the ... protect the software above it. Our assumptions are that the attacker does not have physical access to the machine and that the SMM BIOS is locked and

  5. Low Income Life-Styles and the Consumption of Durable Goods: Implications for Consumer Educators

    ERIC Educational Resources Information Center

    Jolly, Desmond A.

    1978-01-01

    Low-income consumers badly need special purchasing skills, due to merchandising practices and greater markups for durable goods in low-income communities. The author discusses some of the ways in which these people are victimized, with implications for consumer education. (MF)

  6. XBRL: Beyond Basic XML

    ERIC Educational Resources Information Center

    VanLengen, Craig Alan

    2010-01-01

    The Securities and Exchange Commission (SEC) has recently announced a proposal that will require all public companies to report their financial data in Extensible Business Reporting Language (XBRL). XBRL is an extension of Extensible Markup Language (XML). Moving to a standard reporting format makes it easier for organizations to report the…

  7. Computer Literacy and Non-IS Majors

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.; Blackwood, Martina

    2010-01-01

    This paper presents an investigation of non-Information Systems (IS) majors' perceptions and performance when enrolled in a required introductory Computer Information Systems course. Students of various academic backgrounds were taught Excel, Hypertext Markup Language (HTML), JavaScript and computer literacy in a 14-week introductory course, in…

  8. UCD IIRG at TREC 2012 Medical Track

    DTIC Science & Technology

    2012-11-01

    documents. For example, the query “shakespeare.author” would ensure that documents matching shakespeare in the author field are returned. On the... corpus side, field extents are identified using XML-like markup, e.g. <author> shakespeare </author>. 3 System Background & Motivation: This section outlines
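
    A toy version of the field-restricted matching described in the snippet, assuming XML-like field extents in the corpus. The "term.field" query form follows the quoted example, while the regular-expression matching is a simplified stand-in for the actual retrieval engine.

        # Toy field-restricted matching over XML-like field extents, e.g. the query
        # "shakespeare.author" should match only <author>...</author> spans.
        import re

        def field_match(document: str, query: str) -> bool:
            term, field = query.rsplit(".", 1)
            # Collect the text inside every <field>...</field> extent.
            extents = re.findall(rf"<{field}>(.*?)</{field}>", document, flags=re.S | re.I)
            return any(term.lower() in extent.lower() for extent in extents)

        doc = "<title>Complete Works</title> <author> shakespeare </author>"
        print(field_match(doc, "shakespeare.author"))   # True
        print(field_match(doc, "shakespeare.title"))    # False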

  9. Global Situational Awareness with Free Tools

    DTIC Science & Technology

    2015-01-15

    ...multiple data sources: Snort (Snorby on Security Onion), Nagios, SharePoint RSS, Flow, and others; leverage standard data formats such as Keyhole Markup Language

  10. XML in Libraries.

    ERIC Educational Resources Information Center

    Tennant, Roy, Ed.

    This book presents examples of how libraries are using XML (eXtensible Markup Language) to solve problems, expand services, and improve systems. Part I contains papers on using XML in library catalog records: "Updating MARC Records with XMLMARC" (Kevin S. Clarke, Stanford University) and "Searching and Retrieving XML Records via the…

  11. Applying Data Mining Principles to Library Data Collection.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Explains how libraries can use data mining techniques for more effective data collection. Highlights include three phases: data selection and acquisition; data preparation and processing, including a discussion of the use of XML (extensible markup language); and data interpretation and integration, including database management systems. (LRW)

  12. How Does XML Help Libraries?

    ERIC Educational Resources Information Center

    Banerjee, Kyle

    2002-01-01

    Discusses XML, how it has transformed the way information is managed and delivered, and its impact on libraries. Topics include how XML differs from other markup languages; the document object model (DOM); style sheets; practical applications for archival materials, interlibrary loans, digital collections, and MARC data; and future possibilities.…
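
    As a small illustration of the document object model (DOM) mentioned above, the snippet below walks a MARC-like XML fragment with Python's standard DOM module; the record content is invented for the example.

        # Walk a small, invented MARC-like XML record with the standard DOM API.
        from xml.dom import minidom

        RECORD = """<record>
          <datafield tag="245"><subfield code="a">XML in Libraries</subfield></datafield>
          <datafield tag="260"><subfield code="b">Example Press</subfield></datafield>
        </record>"""

        dom = minidom.parseString(RECORD)
        for field in dom.getElementsByTagName("datafield"):
            tag = field.getAttribute("tag")
            for sub in field.getElementsByTagName("subfield"):
                print(tag, sub.getAttribute("code"), sub.firstChild.data)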

  13. Cornerstone: Foundational Models and Services for Integrated Battle Planning

    DTIC Science & Technology

    2012-06-01

    We close with a summary of planned future research. 3 Cross-Domain Knowledge Representation: One of the primary reasons behind the... mission data using Google Earth to display the results of a Keyhole Markup Language (KML) mission data translator. Finally, we successfully ran Thread 1

  14. Public accessibility of biomedical articles from PubMed Central reduces journal readership--retrospective cohort analysis.

    PubMed

    Davis, Philip M

    2013-07-01

    Does PubMed Central--a government-run digital archive of biomedical articles--compete with scientific society journals? A longitudinal, retrospective cohort analysis of 13,223 articles (5999 treatment, 7224 control) published in 14 society-run biomedical research journals in nutrition, experimental biology, physiology, and radiology between February 2008 and January 2011 reveals a 21.4% reduction in full-text hypertext markup language (HTML) article downloads and a 13.8% reduction in portable document format (PDF) article downloads from the journals' websites when U.S. National Institutes of Health-sponsored articles (treatment) become freely available from the PubMed Central repository. In addition, the effect of PubMed Central on reducing PDF article downloads is increasing over time, growing at a rate of 1.6% per year. There was no longitudinal effect for full-text HTML downloads. While PubMed Central may be providing complementary access to readers traditionally underserved by scientific journals, the loss of article readership from the journal website may weaken the ability of the journal to build communities of interest around research papers, impede the communication of news and events to scientific society members and journal readers, and reduce the perceived value of the journal to institutional subscribers.

  15. Mobile terrestrial light detection and ranging (T-LiDAR) survey of areas on Dauphin Island, Alabama, in the aftermath of Hurricane Isaac, 2012

    USGS Publications Warehouse

    Kimbrow, Dustin R.

    2014-01-01

    Topographic survey data of areas on Dauphin Island on the Alabama coast were collected using a truck-mounted mobile terrestrial light detection and ranging system. This system is composed of a high frequency laser scanner in conjunction with an inertial measurement unit and a position and orientation computer to produce highly accurate topographic datasets. A global positioning system base station was set up on a nearby benchmark and logged vertical and horizontal position information during the survey for post-processing. Survey control points were also collected throughout the study area to determine residual errors. Data were collected 5 days after Hurricane Isaac made landfall in early September 2012 to document sediment deposits prior to clean-up efforts. Three data files in ASCII text format with the extension .xyz are included in this report, and each file is named according to both the acquisition date and the relative geographic location on Dauphin Island (for example, 20120903_Central.xyz). Metadata are also included for each of the files in both Extensible Markup Language with the extension .xml and ASCII text formats. These topographic data can be used to analyze the effects of storm surge on barrier island environments and also serve as a baseline dataset for future change detection analyses.
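
    Since the .xyz files are plain ASCII with one x y z triple per line, a few lines of Python are enough to load and summarize one; the file name below follows the naming convention quoted in the report and is otherwise assumed.

        # Load an ASCII .xyz point file (x y z per line) and print basic statistics.
        import numpy as np

        points = np.loadtxt("20120903_Central.xyz")   # columns: x, y, z
        x, y, z = points.T
        print(f"{len(points)} points")
        print(f"elevation range: {z.min():.2f} to {z.max():.2f}")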

  16. Cosmological Parameters and Hyper-Parameters: The Hubble Constant from Boomerang and Maxima

    NASA Astrophysics Data System (ADS)

    Lahav, Ofer

    Recently several studies have jointly analysed data from different cosmological probes with the motivation of estimating cosmological parameters. Here we generalise this procedure to allow freedom in the relative weights of various probes. This is done by including in the joint likelihood function a set of `Hyper-Parameters', which are dealt with using Bayesian considerations. The resulting algorithm, which assumes uniform priors on the log of the Hyper-Parameters, is very simple to implement. We illustrate the method by estimating the Hubble constant H0 from different sets of recent CMB experiments (including Saskatoon, Python V, MSAM1, TOCO, Boomerang and Maxima). The approach can be generalised for a combination of cosmic probes, and for other priors on the Hyper-Parameters. Reference: Lahav, Bridle, Hobson, Lasenby & Sodre, 2000, MNRAS, in press (astro-ph/9912105)
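
    In outline, the hyper-parameter scheme replaces the equally weighted sum of the individual probes' chi-square statistics with a weighted sum whose weights are themselves marginalized over. The lines below are a hedged LaTeX sketch of that structure in generic notation, not a transcription of the paper's equations; N_j denotes the number of data points in probe j.

        % Standard joint analysis: all probes enter with equal weight
        \chi^2_{\rm joint} \;=\; \sum_j \chi^2_j
        % Hyper-parameter generalization: one weight \alpha_j per probe,
        % treated as a nuisance parameter
        \chi^2_{\rm HP}(\alpha) \;=\; \sum_j \alpha_j\,\chi^2_j
        % For a prior uniform in \ln\alpha_j, marginalizing over the weights
        % gives an effective statistic of the form
        -2\ln P \;\propto\; \sum_j N_j \ln\chi^2_j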

  17. A Comparison of Genetic Programming Variants for Hyper-Heuristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Sean

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved, such as routing vehicles over highways with constantly changing traffic flows, because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. Hyper-heuristics typically employ Genetic Programming (GP) and this project has investigated the relationship between the choice of GP and performance in Hyper-heuristics. Results are presented demonstrating the existence of problems for which there is a statistically significant performance differential between the use of different types of GP.

  18. User-customized brain computer interfaces using Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali

    2016-04-01

    Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject’s brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
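
    A compact sketch of such a tuning loop, with a synthetic objective standing in for BCI cross-validation accuracy. The search space (frequency-band edges and a time-window start) mirrors the kinds of hyper-parameters named in the abstract, and the Gaussian-process surrogate with an expected-improvement rule is one simple realization of Bayesian optimization, not the authors' implementation.

        # Minimal Bayesian-optimization loop: Gaussian-process surrogate plus an
        # expected-improvement acquisition over a random candidate pool. The
        # objective is a synthetic stand-in for BCI cross-validation accuracy.
        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(1)

        def cv_accuracy(h):
            # h = [band_low_hz, band_width_hz, window_start_s]; toy response surface
            low, width, start = h
            return -((low - 10) ** 2 + (width - 20) ** 2 + 10 * (start - 0.5) ** 2)

        def sample(n):
            return np.column_stack([rng.uniform(4, 16, n),      # band lower edge (Hz)
                                    rng.uniform(10, 36, n),     # band width (Hz)
                                    rng.uniform(0.0, 2.0, n)])  # window start (s)

        H = sample(5)                                  # initial random evaluations
        y = np.array([cv_accuracy(h) for h in H])

        for _ in range(20):
            gp = GaussianProcessRegressor(normalize_y=True).fit(H, y)
            cand = sample(500)
            mu, sd = gp.predict(cand, return_std=True)
            imp = mu - y.max()
            ei = imp * norm.cdf(imp / (sd + 1e-9)) + sd * norm.pdf(imp / (sd + 1e-9))
            h_next = cand[np.argmax(ei)]
            H = np.vstack([H, h_next])
            y = np.append(y, cv_accuracy(h_next))

        print("best hyper-parameters:", H[np.argmax(y)].round(2))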

  19. Real-time imaging of hydrogen peroxide dynamics in vegetative and pathogenic hyphae of Fusarium graminearum.

    PubMed

    Mentges, Michael; Bormann, Jörg

    2015-10-08

    Balanced dynamics of reactive oxygen species in the phytopathogenic fungus Fusarium graminearum play key roles for development and infection. To monitor those dynamics, ratiometric analysis using the novel hydrogen peroxide (H2O2) sensitive fluorescent indicator protein HyPer-2 was established for the first time in phytopathogenic fungi. H2O2 changes the excitation spectrum of HyPer-2 with an excitation maximum at 405 nm for the reduced and 488 nm for the oxidized state, facilitating ratiometric readouts with maximum emission at 516 nm. HyPer-2 analyses were performed using a microtiter fluorometer and confocal laser scanning microscopy (CLSM). Addition of external H2O2 to mycelia caused a steep and transient increase in fluorescence excited at 488 nm. This can be reversed by the addition of the reducing agent dithiothreitol. HyPer-2 in F. graminearum is highly sensitive and specific to H2O2 even in tiny amounts. Hyperosmotic treatment elicited a transient internal H2O2 burst. Hence, HyPer-2 is suitable to monitor the intracellular redox balance. Using CLSM, developmental processes like nuclear division, tip growth, septation, and infection structure development were analyzed. The latter two processes imply marked accumulations of intracellular H2O2. Taken together, HyPer-2 is a valuable and reliable tool for the analysis of environmental conditions, cellular development, and pathogenicity.
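
    The ratiometric readout itself is simple arithmetic: the emission intensity measured with 488 nm excitation divided by that measured with 405 nm excitation, optionally after background subtraction. The snippet below is a generic per-pixel sketch with synthetic arrays, not the authors' CLSM analysis pipeline.

        # Per-pixel ratiometric readout for a HyPer-style sensor: intensity excited
        # at 488 nm (oxidized form) over intensity excited at 405 nm (reduced form).
        import numpy as np

        rng = np.random.default_rng(0)
        img_488 = rng.uniform(100, 400, size=(64, 64))   # synthetic emission images
        img_405 = rng.uniform(100, 400, size=(64, 64))
        background = 50.0

        ratio = (img_488 - background) / np.clip(img_405 - background, 1e-6, None)
        print("mean 488/405 ratio:", ratio.mean().round(3))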

  20. Influence of maternal hyperthyroidism in the rat on the expression of neuronal and astrocytic cytoskeletal proteins in fetal brain.

    PubMed

    Evans, I M; Pickard, M R; Sinha, A K; Leonard, A J; Sampson, D C; Ekins, R P

    2002-12-01

    Maternal hypothyroidism during pregnancy impairs brain function in human and rat offspring, but little is known regarding the influence of maternal hyperthyroidism on neurodevelopment. We have previously shown that the expression of neuronal and glial differentiation markers in fetal brain is compromised in hypothyroid rat dam pregnancies and have now therefore extended this investigation to hyperthyroid rat dams. Study groups comprised partially thyroidectomised dams, implanted with osmotic pumps infusing either vehicle (TX dams) or a supraphysiological dose of thyroxine (T4) (HYPER dams), and euthyroid dams infused with vehicle (N dams). Cytoskeletal protein abundance was determined in fetal brain at 21 days of gestation by immunoblot analysis. Relative to N dams, circulating total T4 levels were reduced to around one-third in TX dams but were doubled in HYPER dams. Fetal brain weight was increased in HYPER dams, whereas litter size and fetal body weight were reduced in TX dams. Glial fibrillary acidic protein expression was similar in HYPER and TX dams, being reduced in both cases relative to N dams. alpha-Internexin (INX) abundance was reduced in HYPER dams and increased in TX dams, whereas neurofilament 68 (NF68) exhibited increased abundance in HYPER dams. Furthermore, INX was inversely related to - and NF68 directly related to - maternal serum total T4 levels, independently of fetal brain weight. In conclusion, maternal hyperthyroidism compromises the expression of neuronal cytoskeletal proteins in late fetal brain, suggestive of a pattern of accelerated neuronal differentiation.

  1. Hyper III on ramp, front view

    NASA Technical Reports Server (NTRS)

    1969-01-01

    The Hyper III was a low-cost test vehicle for an advanced lifting-body shape. Like the earlier M2-F1, it was a 'homebuilt' research aircraft, i.e., built at the Flight Research Center (FRC), later redesignated the Dryden Flight Research Center. It had a steel-tube frame covered with Dacron, a fiberglass nose, sheet aluminum fins, and a wing from an HP-11 sailplane. Construction was by volunteers at the FRC. Although the Hyper III was to be flown remotely in its initial tests, it was fitted with a cockpit for a pilot. On the Hyper III's only flight, it was towed aloft attached to a Navy SH-3 helicopter by a 400-foot cable. NASA research pilot Bruce Peterson flew the SH-3. After he released the Hyper III from the cable, NASA research pilot Milt Thompson flew the vehicle by radio control until the final approach when Dick Fischer took over control using a model-airplane radio-control box. The Hyper III flared, then landed and slid to a stop on Rogers Dry Lakebed. The Flight Research Center (FRC--as Dryden was named from 1959 until 1976) already had experience with testing small-scale aircraft using model-airplane techniques, but the first true remotely piloted research vehicle was the Hyper III, which flew only once in December 1969. At that time, the Center was engaged in flight research with a variety of reentry shapes called lifting bodies, and there was a desire both to expand the flight research experience with maneuverable reentry vehicles, including a high-performance, variable-geometry craft, and to investigate a remotely piloted flight research technique that made maximum use of a research pilot's skill and experience by placing him 'in the loop' as if he were in the cockpit. (There have been, as yet, no female research pilots assigned to Dryden.) The Hyper III as originally conceived was a stiletto-shaped lifting body that had resulted from a study at NASA's Langley Research Center in Hampton, Virginia. It was one of a number of hypersonic, cross-range reentry vehicles studied at Langley. (Hypersonic means Mach 5--five times the speed of sound--or faster; cross-range means able to fly a considerable distance to the left or right of the initial reentry path.) The FRC added a small, deployable, skewed wing to compensate for the shape's extremely low glide ratio. Shop personnel built the 32-foot-long Hyper III and covered its tubular frame with Dacron, aluminum, and fiberglass, for about $6,500. Hyper III employed the same '8-ball' attitude indicator developed for control-room use when flying the X-15, two model-airplane receivers to command the vehicle's hydraulic controls, and a telemetry system (surplus from the X-15 program) to transmit 12 channels of data to the ground not only for display and control but for data analysis. Dropped from a helicopter at 10,000 feet, Hyper III flew under the control of research pilot Milt Thompson to a near landing using instruments for control. When the vehicle was close to the ground, he handed the vehicle off to experienced model pilot Dick Fischer for a visual landing using standard controls. The flight demonstrated the feasibility of remotely piloting research vehicles and, among other things, that control of the vehicle in roll was much better than predicted and that the vehicle had a much lower lift-to-drag ratio than predicted (a maximum of 4.0 rather than 5.0). Pilot Milt Thompson exhibited some surprising reactions during the Hyper III flight; he behaved as if he were in the cockpit of an actual research aircraft.
'I was really stimulated emotionally and physically in exactly the same manner that I have been during actual first flights.' 'Flying the Hyper III from a ground cockpit was just as dramatic as an actual flight in any of the other vehicles....responsibility rather than fear of personal safety is the real emotional driver. I have never come out of a simulator emotionally and physically tired as is often the case after a test flight in a research aircraft. I was emotionally and physically tired after a 3-minute flight of the Hyper III.'

  2. 42 CFR 405.515 - Reimbursement for clinical laboratory services billed by physicians.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Reimbursement for clinical laboratory services... Criteria for Determining Reasonable Charges § 405.515 Reimbursement for clinical laboratory services billed... limitation on reimbursement for markups on clinical laboratory services billed by physicians. If a physician...

  3. Using the Structured Product Labeling format to index versatile chemical data (ACS Spring meeting)

    EPA Science Inventory

    Structured Product Labeling (SPL) is a document markup standard approved by the Health Level Seven (HL7) standards organization and adopted by the FDA as a mechanism for exchanging product and facility information. Product information provided by companies in SPL format may be ac...

  4. A Google Earth Grand Tour of the Terrestrial Planets

    ERIC Educational Resources Information Center

    De Paor, Declan; Coba, Filis; Burgin, Stephen

    2016-01-01

    Google Earth is a powerful instructional resource for geoscience education. We have extended the virtual globe to include all terrestrial planets. Downloadable Keyhole Markup Language (KML) files (Google Earth's scripting language) associated with this paper include lessons about Mercury, Venus, the Moon, and Mars. We created "grand…
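
    KML itself is plain XML, so a lesson placemark of the kind described can be produced with the standard library alone; the coordinates, name, and description below are illustrative and are not taken from the paper's downloadable KML files.

        # Write a minimal KML placemark (KML is XML) using the standard library.
        import xml.etree.ElementTree as ET

        KML_NS = "http://www.opengis.net/kml/2.2"
        ET.register_namespace("", KML_NS)

        kml = ET.Element(f"{{{KML_NS}}}kml")
        doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Olympus Mons (illustrative)"
        ET.SubElement(pm, f"{{{KML_NS}}}description").text = "Example lesson stop."
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "-133.8,18.65,0"

        ET.ElementTree(kml).write("grand_tour_stop.kml", encoding="utf-8",
                                  xml_declaration=True)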

  5. Ontario Hydro and SGML.

    ERIC Educational Resources Information Center

    Rockley, Ann

    1993-01-01

    Describes how an analysis of Ontario Hydro's conversion of 20,000 pages of paper manuals to online documentation established the scope of the project, provided a set of design criteria, and recommended the use of Standard Generalized Markup Language to create the new documentation and the purchase of the "Dinatext" program to produce it.…

  6. Teaching XBRL to Graduate Business Students: A Hands-On Approach

    ERIC Educational Resources Information Center

    Pinsker, Robert

    2004-01-01

    EXtensible Business Reporting Language (XBRL) is a non-proprietary, computer language that has many uses. Known primarily as the Extensible Markup Language (XML) for business reporting, XBRL allows entities to report their business information (i.e., financial statements, announcements, etc.) on the Internet and communicate with other entities'…

  7. The Essen Learning Model--A Step towards a Representation of Learning Objectives.

    ERIC Educational Resources Information Center

    Bick, Markus; Pawlowski, Jan M.; Veith, Patrick

    The importance of the Extensible Markup Language (XML) technology family in the field of Computer Assisted Learning (CAL) can not be denied. The Instructional Management Systems Project (IMS), for example, provides a learning resource XML binding specification. Considering this specification and other implementations using XML to represent…

  8. 78 FR 39023 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-28

    ... confirmations than larger firms because they effect fewer transactions. The Commission staff estimates the costs... regarding their securities transactions. This information includes: the date and time of the transaction... as well as mark-up and mark-down information. For transactions in debt securities, Rule 10b-10...

  9. Assessing Place Location Knowledge Using a Virtual Globe

    ERIC Educational Resources Information Center

    Zhu, Liangfeng; Pan, Xin; Gao, Gongcheng

    2016-01-01

    Advances in the Google Earth virtual globe and the concomitant Keyhole Markup Language (KML) are providing educators with a convenient platform to cultivate and assess one's place location knowledge (PLK). This article presents a general framework and associated implementation methods for the online testing of PLK using Google Earth. The proposed…

  10. Searchers Net Treasure in Monterey.

    ERIC Educational Resources Information Center

    McDermott, Irene E.

    1999-01-01

    Reports on Web keyword searching, metadata, Dublin Core, Extensible Markup Language (XML), metasearch engines (metasearch engines search several Web indexes and/or directories and/or Usenet and/or specific Web sites), and the Year 2000 (Y2K) dilemma, all topics discussed at the second annual Internet Librarian Conference sponsored by Information…

  11. The Implications of Well-Formedness on Web-Based Educational Resources.

    ERIC Educational Resources Information Center

    Mohler, James L.

    Within all institutions, Web developers are beginning to utilize technologies that make sites more than static information resources. XML (Extensible Markup Language) and XSL (Extensible Stylesheet Language) are key technologies that promise to extend the Web beyond the "information storehouse" paradigm and provide…

  12. Migrating an Online Service to WAP - A Case Study.

    ERIC Educational Resources Information Center

    Klasen, Lars

    2002-01-01

    Discusses mobile access via wireless application protocol (WAP) to online services that is offered in Sweden through InfoTorg. Topics include the Swedish online market; filtering HTML data from an Internet/Web server into WML (wireless markup language); mobile phone technology; microbrowsers; WAP protocol; and future possibilities. (LRW)

  13. Semantic-Aware Components and Services of ActiveMath

    ERIC Educational Resources Information Center

    Melis, Erica; Goguadze, Giorgi; Homik, Martin; Libbrecht, Paul; Ullrich, Carsten; Winterstein, Stefan

    2006-01-01

    ActiveMath is a complex web-based adaptive learning environment with a number of components and interactive learning tools. The basis for handling semantics of learning content is provided by its semantic (mathematics) content markup, which is additionally annotated with educational metadata. Several components, tools and external services can…

  14. XML Schema Languages: Beyond DTD.

    ERIC Educational Resources Information Center

    Ioannides, Demetrios

    2000-01-01

    Discussion of XML (extensible markup language) and the traditional DTD (document type definition) format focuses on efforts of the World Wide Web Consortium's XML schema working group to develop a schema language to replace DTD that will be capable of defining the set of constraints of any possible data resource. (Contains 14 references.) (LRW)
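
    To make the DTD-versus-schema contrast concrete, the fragment below validates the same toy document against a DTD and against an equivalent W3C XML Schema using the lxml library; the element names and constraints are invented for the example.

        # Validate one toy document against a DTD and an equivalent XML Schema (XSD).
        from io import StringIO
        from lxml import etree

        doc = etree.fromstring("<book><title>XML Schema Languages</title></book>")

        dtd = etree.DTD(StringIO("<!ELEMENT book (title)> <!ELEMENT title (#PCDATA)>"))

        xsd = etree.XMLSchema(etree.fromstring("""
        <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="book">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="title" type="xs:string"/>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
        </xs:schema>"""))

        print("DTD valid:", dtd.validate(doc))   # True
        print("XSD valid:", xsd.validate(doc))   # True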

  15. Converting from XML to HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.

  16. 17 CFR 1.33 - Monthly and confirmation statements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... customer— (i) The open contracts with prices at which acquired; (ii) The net unrealized profits or losses... transactions received from or disbursed to such customer, premiums charged and received, and realized profits... of the premium, as well as each mark-up thereon, if applicable, and all other commissions, costs...

  17. Silicon Graphics' IRIS InSight: An SGML Success Story.

    ERIC Educational Resources Information Center

    Glushko, Robert J.; Kershner, Ken

    1993-01-01

    Offers a case history of the development of the Silicon Graphics "IRIS InSight" system, a system for viewing on-line documentation using Standard Generalized Markup Language. Notes that SGML's explicit encoding of structure and separation of structure and presentation make possible structure-based search, alternative structural views of…

  18. A data model and database for high-resolution pathology analytical image informatics.

    PubMed

    Wang, Fusheng; Kong, Jun; Cooper, Lee; Pan, Tony; Kurc, Tahsin; Chen, Wenjin; Sharma, Ashish; Niedermayr, Cristobal; Oh, Tae W; Brat, Daniel; Farris, Alton B; Foran, David J; Saltz, Joel

    2011-01-01

    The systematic analysis of imaged pathology specimens often results in a vast amount of morphological information at both the cellular and sub-cellular scales. While microscopy scanners and computerized analysis are capable of capturing and analyzing data rapidly, microscopy image data remain underutilized in research and clinical settings. One major obstacle which tends to reduce wider adoption of these new technologies throughout the clinical and scientific communities is the challenge of managing, querying, and integrating the vast amounts of data resulting from the analysis of large digital pathology datasets. This paper presents a data model, which addresses these challenges, and demonstrates its implementation in a relational database system. This paper describes a data model, referred to as Pathology Analytic Imaging Standards (PAIS), and a database implementation, which are designed to support the data management and query requirements of detailed characterization of micro-anatomic morphology through many interrelated analysis pipelines on whole-slide images and tissue microarrays (TMAs). (1) Development of a data model capable of efficiently representing and storing virtual slide related image, annotation, markup, and feature information. (2) Development of a database, based on the data model, capable of supporting queries for data retrieval based on analysis and image metadata, queries for comparison of results from different analyses, and spatial queries on segmented regions, features, and classified objects. The work described in this paper is motivated by the challenges associated with characterization of micro-scale features for comparative and correlative analyses involving whole-slide tissue images and TMAs. Technologies for digitizing tissues have advanced significantly in the past decade. Slide scanners are capable of producing high-magnification, high-resolution images from whole slides and TMAs within several minutes. Hence, it is becoming increasingly feasible for basic, clinical, and translational research studies to produce thousands of whole-slide images. Systematic analysis of these large datasets requires efficient data management support for representing and indexing results from hundreds of interrelated analyses generating very large volumes of quantifications such as shape and texture and of classifications of the quantified features. We have designed a data model and a database to address the data management requirements of detailed characterization of micro-anatomic morphology through many interrelated analysis pipelines. The data model represents virtual slide related image, annotation, markup and feature information. The database supports a wide range of metadata and spatial queries on images, annotations, markups, and features. We currently have three databases running on a Dell PowerEdge T410 server with CentOS 5.5 Linux operating system. The database server is IBM DB2 Enterprise Edition 9.7.2. The set of databases consists of 1) a TMA database containing image analysis results from 4740 cases of breast cancer, with 641 MB storage size; 2) an algorithm validation database, which stores markups and annotations from two segmentation algorithms and two parameter sets on 18 selected slides, with 66 GB storage size; and 3) an in silico brain tumor study database comprising results from 307 TCGA slides, with 365 GB storage size. The latter two databases also contain human-generated annotations and markups for regions and nuclei.
Modeling and managing pathology image analysis results in a database provide immediate benefits on the value and usability of data in a research study. The database provides powerful query capabilities, which are otherwise difficult or cumbersome to support by other approaches such as programming languages. Standardized, semantic annotated data representation and interfaces also make it possible to more efficiently share image data and analysis results.
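
    A drastically simplified, hypothetical schema in the spirit of the model described (images, markups, and computed features), using SQLite only to make the relationships concrete; the real PAIS model and its DB2 implementation are far richer, and all table and column names below are invented.

        # Hypothetical, minimal relational sketch of an image / markup / feature
        # model in the spirit of PAIS (table and column names invented here).
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE image   (image_id INTEGER PRIMARY KEY, slide_name TEXT);
        CREATE TABLE markup  (markup_id INTEGER PRIMARY KEY,
                              image_id  INTEGER REFERENCES image(image_id),
                              geometry  TEXT);            -- e.g. polygon WKT
        CREATE TABLE feature (markup_id INTEGER REFERENCES markup(markup_id),
                              name TEXT, value REAL);     -- e.g. area, texture
        """)
        con.execute("INSERT INTO image VALUES (1, 'TCGA-slide-001')")
        con.execute("INSERT INTO markup VALUES (1, 1, 'POLYGON((0 0,1 0,1 1,0 0))')")
        con.execute("INSERT INTO feature VALUES (1, 'area', 0.5)")

        for row in con.execute("""SELECT i.slide_name, f.name, f.value
                                  FROM image i JOIN markup m USING (image_id)
                                  JOIN feature f USING (markup_id)"""):
            print(row)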

  19. Wind-Tunnel Results of the B-52B with the X-43A Stack

    NASA Technical Reports Server (NTRS)

    Davis, Mark C.; Sim, Alexander G.; Rhode, Matthew; Johnson, Kevin D., Sr.

    2007-01-01

    A low-speed wind-tunnel test was performed with a 3%-scale model of a booster rocket mated to an X-43A research vehicle, a combination referred to as the Hyper-X launch vehicle. The test was conducted both in freestream air and in the presence of a partial model of the B-52B airplane. The objectives of the test were to obtain force and moment data to generate structural loads affecting the pylon of the B-52B airplane and to determine the aerodynamic influence of the B-52B on the Hyper-X launch vehicle for evaluating launch separation characteristics. The wind-tunnel test was conducted at a low-speed wind tunnel in Hampton, Virginia. All moments and forces reported either reflect the aerodynamic influence of the B-52B airplane or apply to the Hyper-X launch vehicle in freestream air. Overall, the test showed that the B-52B airplane imparts a strong downwash onto the Hyper-X launch vehicle, reducing the net lift of the Hyper-X launch vehicle. Pitching and rolling moments are also imparted onto the booster and are a strong function of the launch-drop angle of attack.

  20. Hyper-dry conditions provide new insights into the cause of extreme floods after wildfire

    USGS Publications Warehouse

    Moody, John A.; Ebel, Brian A.

    2012-01-01

    A catastrophic wildfire in the foothills of the Rocky Mountains near Boulder, Colorado, provided a unique opportunity to investigate soil conditions immediately after a wildfire and before alteration by rainfall. Measurements of near-surface soil-water content (θ), matric suction (ψ), rainfall, and wind velocity were started 8 days after the wildfire began. These measurements established that hyper-dry conditions existed (extremely low volumetric water content θ, in cm³ cm⁻³, and ψ > ~3 × 10⁵ cm) and provided an in-situ retention curve for these conditions. Such conditions exacerbate the effects of water repellency (natural and fire-induced) and limit the effectiveness of capillarity- and gravity-driven infiltration into fire-affected soils. The important consequence is that, given hyper-dry conditions, the critical rewetting process before the first rain is restricted to the diffusion–adsorption of water vapor. This process typically has a time scale of days to weeks (especially when the hydrologic effects of the ash layer are included), longer than the typical time scale (minutes to hours) of some rainstorms, so that under hyper-dry conditions essentially no rain infiltrates. The existence of hyper-dry conditions provides insight into why, frequently during the first rainstorm after a wildfire, nearly all rainfall becomes runoff, causing extreme floods and debris flows.
