Sample records for multi-tier web application

  1. SLA-based optimisation of virtualised resource for multi-tier web applications in cloud data centres

    NASA Astrophysics Data System (ADS)

    Bi, Jing; Yuan, Haitao; Tie, Ming; Tan, Wei

    2015-10-01

    Dynamic virtualised resource allocation is the key to quality-of-service assurance for multi-tier web application services in cloud data centres. In this paper, we develop a self-management architecture of cloud data centres with a virtualisation mechanism for multi-tier web application services. Based on this architecture, we establish a flexible hybrid queueing model to determine the number of virtual machines for each tier of the virtualised application service environment. In addition, we propose a non-linear constrained optimisation problem with restrictions defined in the service level agreement. Furthermore, we develop a heuristic mixed optimisation algorithm to maximise the profit of cloud infrastructure providers while meeting the performance requirements of different clients. Finally, we compare the effectiveness of our dynamic allocation strategy with two other allocation strategies. The simulation results show that the proposed resource allocation method is effective in improving overall performance and reducing resource energy cost.
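    The queueing-driven allocation described above can be illustrated with a toy heuristic. This is a sketch under simplifying assumptions, not the authors' algorithm: each tier is treated as a single queue whose service capacity scales linearly with its VM count, and VMs are added greedily where they buy the most latency reduction per unit cost until the end-to-end response-time SLA is met.

```python
import math

def allocate_vms(lam, mus, costs, r_sla):
    """Greedy VM allocation across tiers.

    Tier i is modeled as an M/M/1 queue with aggregate service rate
    m[i] * mus[i], so its mean response time is 1 / (m[i]*mus[i] - lam).
    Starting from the minimal stable allocation, add one VM at a time to
    the tier with the best latency-reduction-per-cost ratio until the
    summed response time meets the SLA r_sla.
    """
    m = [math.floor(lam / mu) + 1 for mu in mus]   # minimal stable allocation
    t = lambda i: 1.0 / (m[i] * mus[i] - lam)      # mean response time of tier i
    while sum(t(i) for i in range(len(mus))) > r_sla:
        gains = [(t(i) - 1.0 / ((m[i] + 1) * mus[i] - lam)) / costs[i]
                 for i in range(len(mus))]
        m[gains.index(max(gains))] += 1            # cheapest latency win
    return m
```

    For example, with an arrival rate of 10 req/s, three tiers with per-VM service rates 4, 5 and 8 req/s, unit VM costs and a 0.5 s SLA, the heuristic settles on 4, 4 and 2 VMs respectively.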

  2. Optimizing Libraries’ Content Findability Using Simple Object Access Protocol (SOAP) With Multi-Tier Architecture

    NASA Astrophysics Data System (ADS)

    Lahinta, A.; Haris, I.; Abdillah, T.

    2017-03-01

    The aim of this paper is to describe a developed application of the Simple Object Access Protocol (SOAP) as a model for improving the findability of libraries' digital content on the library web. The study applies XML text-based protocol tools to collect data about libraries' visibility performance in book search results. Models from the integrated Web Services Description Language (WSDL) and Universal Description, Discovery and Integration (UDDI) are applied to analyse SOAP as an element within the system. The results showed that the developed application of SOAP with a multi-tier architecture can help people easily access the website of the Gorontalo Province library server, and supports access to digital collections, subscription databases, and library catalogs in each Regency or City library in Gorontalo Province.
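    A SOAP exchange like the one this system relies on boils down to posting an XML envelope to a service endpoint. The sketch below builds a minimal SOAP 1.1 envelope with the standard library; the `searchBook` operation and the `example.org` namespace are hypothetical placeholders, not the actual library service's WSDL-defined interface.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(method, params, ns="http://example.org/library"):
    """Assemble a minimal SOAP 1.1 envelope for an RPC-style call."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{ns}}}{method}")   # the operation element
    for name, value in params.items():
        ET.SubElement(call, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")
```

    The resulting string would then be POSTed to the service endpoint with a `SOAPAction` HTTP header; in a real deployment the operation names and namespaces come from the service's WSDL.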

  3. A framework for integration of scientific applications into the OpenTopography workflow

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C.; Baru, C.

    2012-12-01

    The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service-oriented architecture comprising an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g., the portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines developed and maintained by the community to be integrated into the OpenTopography system, so that the wider earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions with the help of a back-end database, and allows monitoring of job and system status via charting tools. All core components in OpenTopography have been developed, maintained and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most new scientific applications will have their own active development teams performing regular updates, maintenance and other improvements. It would be optimal to have each application co-located where its developers can continue to actively work on it while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system.
    This will be accomplished by virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. It involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an Opal-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs, along with metadata annotations encoded in XML, can be utilized to automate the generation of user interfaces, with appropriate tool tips and user help features, and the generation of other internal software. The OpenTopography Opal toolkit will also include customizations that enable security authentication, authorization and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information can then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable application developers to continue enhancing their applications while making the latest iteration available in a timely manner to the earth sciences community. It will also help us establish an overall framework that other scientific application providers will be able to use going forward.

  4. Research and development of web oriented remote sensing image publication system based on Servlet technique

    NASA Astrophysics Data System (ADS)

    Juanle, Wang; Shuang, Li; Yunqiang, Zhu

    2005-10-01

    According to the requirements of the China National Scientific Data Sharing Program (NSDSP), a web-oriented RS Image Publication System (RSIPS) was researched and developed based on the Java Servlet technique. The RSIPS framework is composed of three tiers: a Presentation Tier, an Application Service Tier and a Data Resource Tier. The Presentation Tier provides the user interface for data query, review and download; for the convenience of users, a visual spatial query interface is included. Serving as the middle tier, the Application Service Tier controls all actions between users and databases. The Data Resource Tier stores RS images in file and relational databases. RSIPS is developed with cross-platform programming based on Java Servlet tools, one of the advanced techniques in the J2EE architecture. The RSIPS prototype has been developed and applied in geosciences clearinghouse practice, which is among the experimental units of the NSDSP in China.

  5. Methods and apparatus for constructing and implementing a universal extension module for processing objects in a database

    NASA Technical Reports Server (NTRS)

    Li, Chung-Sheng (Inventor); Smith, John R. (Inventor); Chang, Yuan-Chi (Inventor); Jhingran, Anant D. (Inventor); Padmanabhan, Sriram K. (Inventor); Hsiao, Hui-I (Inventor); Choy, David Mun-Hien (Inventor); Lin, Jy-Jine James (Inventor); Fuh, Gene Y. C. (Inventor); Williams, Robin (Inventor)

    2004-01-01

    Methods and apparatus for providing a multi-tier object-relational database architecture are disclosed. In one illustrative embodiment of the present invention, a multi-tier database architecture comprises an object-relational database engine as a top tier, one or more domain-specific extension modules as a bottom tier, and one or more universal extension modules as a middle tier. The individual extension modules of the bottom tier operationally connect with the one or more universal extension modules which, themselves, operationally connect with the database engine. The domain-specific extension modules preferably provide such functions as search, index, and retrieval services of images, video, audio, time series, web pages, text, XML, spatial data, etc. The domain-specific extension modules may include one or more IBM DB2 extenders, Oracle data cartridges and/or Informix datablades, although other domain-specific extension modules may be used.
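    The layering described in this patent abstract, with a universal middle tier insulating the engine from domain-specific modules, is essentially a registry-and-dispatch pattern. The class names below are illustrative stand-ins, not the patent's actual interfaces:

```python
class DomainExtension:
    """Bottom tier: a domain-specific module (text, image, spatial, ...)."""
    def __init__(self, domain):
        self.domain = domain

    def search(self, query):
        # A real module would run a domain-specific index lookup here.
        return f"{self.domain} results for {query!r}"

class UniversalExtension:
    """Middle tier: routes engine requests to registered domain modules."""
    def __init__(self):
        self._modules = {}

    def register(self, ext):
        self._modules[ext.domain] = ext

    def search(self, domain, query):
        return self._modules[domain].search(query)

class Engine:
    """Top tier: the object-relational engine talks only to the middle tier."""
    def __init__(self, universal):
        self.universal = universal

    def query(self, domain, q):
        return self.universal.search(domain, q)
```

    The design payoff is the one the abstract names: new bottom-tier modules (extenders, cartridges, datablades) plug in by registration alone, with no change to the engine.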

  6. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered complexes of interconnected software tools for working with geospatial data. In this report a complex web mapping system is presented, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client comprises three basic tiers:
    1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format
    2. A tier of JavaScript objects implementing methods for handling:
       - NetCDF metadata
       - the task XML object for configuring user calculations and input/output formats
       - OGC WMS/WFS cartographical services
    3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic
    The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.).
    The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object and the WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on remote servers, and working with the WMS/WFS cartographical services, including: obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of such popular desktop GIS applications as uDIG and QuantumGIS. The web mapping system has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
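    The layer-export calls the middleware makes are, at bottom, parameterized WMS requests. A minimal sketch of assembling a WMS 1.1.1 GetMap URL follows; the parameter names come from the OGC WMS specification, while the base URL and layer name are placeholders rather than this system's actual endpoints.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(800, 600), fmt="image/png",
                   srs="EPSG:4326", version="1.1.1"):
    """Build a WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the coordinate system given by srs.
    """
    params = {
        "SERVICE": "WMS", "VERSION": version, "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": srs,
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)
```

    Swapping `FORMAT` between `image/png`, `image/jpeg` and `image/geotiff` is exactly the raster-export choice the abstract lists; vector export goes through the analogous WFS GetFeature request instead.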

  7. Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications

    ERIC Educational Resources Information Center

    Jung, Gueyoung

    2010-01-01

    Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…

  8. Regulatory Compliance in Multi-Tier Supplier Networks

    NASA Technical Reports Server (NTRS)

    Goossen, Emray R.; Buster, Duke A.

    2014-01-01

    Over the years, avionics systems have increased in complexity to the point where 1st tier suppliers to an aircraft OEM find it financially beneficial to outsource designs of subsystems to 2nd tier and at times to 3rd tier suppliers. Combined with challenging schedule and budgetary pressures, the environment in which safety-critical systems are being developed introduces new hurdles for regulatory agencies and industry. This new environment of both complex systems and tiered development has raised concerns about the ability of designers to ensure safety considerations are fully addressed throughout the tier levels. It has also raised questions about the sufficiency of current regulatory guidance to ensure: proper flow down of safety awareness, avionics application understanding at the lower tiers, OEM and 1st tier oversight practices, and capabilities of lower tier suppliers. Therefore, NASA established a research project to address regulatory compliance in a multi-tier supplier network. This research was divided into three major study efforts:
    1. Describe Modern Multi-tier Avionics Development
    2. Identify Current Issues in Achieving Safety and Regulatory Compliance
    3. Short-term/Long-term Recommendations Toward Higher Assurance Confidence
    This report presents our findings on the risks and weaknesses, together with our recommendations. It also includes a collection of industry-identified risks, an assessment of guideline weaknesses related to multi-tier development of complex avionics systems, and a postulation of potential modifications to guidelines to close the identified risks and weaknesses.

  9. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    ERIC Educational Resources Information Center

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  10. Evaluating Web Sources in an EAP Course: Introducing a Multi-Trait Instrument for Feedback and Assessment

    ERIC Educational Resources Information Center

    Stapleton, Paul; Helms-Park, Rena

    2006-01-01

    This paper introduces the Website Acceptability Tiered Checklist (WATCH), a preliminary version of a multi-trait scale that could be used by instructors and students to assess the quality of websites chosen as source materials in students' research papers in a Humanities program. The scale includes bands for assessing: (i) the authority and…

  11. Optimal Resource Allocation under Fair QoS in Multi-tier Server Systems

    NASA Astrophysics Data System (ADS)

    Akai, Hirokazu; Ushio, Toshimitsu; Hayashi, Naoki

    Recent developments in network technology have enabled multi-tier server systems, in which several tiers perform functionally different processing requested by clients. It is an important issue to allocate the systems' resources to clients dynamically based on their current requests. Q-RAM has been proposed for resource allocation in real-time systems. In server systems, it is important that the execution results of all applications requested by clients attain the same QoS (quality of service) level. In this paper, we extend Q-RAM to multi-tier server systems and propose a method for optimal resource allocation with fairness among the QoS levels of clients' requests. We also consider an assignment problem of physical machines to be put to sleep in each tier so that energy consumption is minimized.
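    The fairness criterion above, every client's request ending up at the same QoS level, can be shown with a toy closed-form allocation. This is a sketch with a linear stand-in for Q-RAM's utility curves, not the authors' method: if client i converts resource into QoS at rate efficiency[i], equal QoS means each share is proportional to the inverse efficiency.

```python
def fair_allocation(capacity, efficiency):
    """Split total capacity so every client reaches the same QoS level.

    Client i's QoS is modeled as efficiency[i] * share_i (a toy linear
    utility). Equalizing QoS across clients gives shares proportional
    to 1/efficiency[i], rescaled to use the full capacity.
    """
    inv = [1.0 / e for e in efficiency]
    scale = capacity / sum(inv)
    return [scale * w for w in inv]
```

    With concave, discrete utility curves as in Q-RAM proper, the same idea becomes an iterative marginal-utility allocation rather than a closed form, which is what the extension to multiple tiers has to handle.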

  12. Earth Science Computational Architecture for Multi-disciplinary Investigations

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Blom, R.; Gurrola, E.; Katz, D.; Lyzenga, G.; Norton, C.

    2005-12-01

    Understanding the processes underlying Earth's deformation and mass transport requires a non-traditional, integrated, interdisciplinary approach dependent on multiple space- and ground-based data sets, modeling, and computational tools. Currently, the details of geophysical data acquisition, analysis, and modeling largely limit research to discipline domain experts. Interdisciplinary research requires a new computational architecture that is optimized to perform complex data processing of multiple solid Earth science data types in a user-friendly environment. A web-based computational framework is being developed and integrated with applications for automatic interferometric radar processing; models for high-resolution deformation & gravity; forward models of viscoelastic mass loading over short wavelengths & complex time histories; forward-inverse codes for characterizing surface loading-response over time scales of days to tens of thousands of years; and inversion of combined space magnetic & gravity fields to constrain deep crustal and mantle properties. This framework combines an adaptation of the QuakeSim distributed services methodology with the Pyre framework for multiphysics development. The system uses a three-tier architecture, with a middle-tier server that manages user projects, available resources, and security. This ensures scalability to very large networks of collaborators. Users log into a web page and have a personal project area, persistently maintained between connections, for each application. Upon selection of an application and host from a list of available entities, inputs may be uploaded or constructed from web forms and available data archives, including gravity, GPS and imaging radar data. The user is notified of job completion and directed to results posted via URLs.
Interdisciplinary work is supported through easy availability of all applications via common browsers, application tutorials and reference guides, and worked examples with visual response. At the platform level, multi-physics application development and workflow are available in the enriched environment of the Pyre framework. Advantages for combining separate expert domains include: multiple application components efficiently interact through Python shared libraries, investigators may nimbly swap models and try new parameter values, and a rich array of common tools are inherent in the Pyre system. The first four specific investigations to use this framework are: Gulf Coast subsidence: understanding of partitioning between compaction, subsidence and growth faulting; Gravity & deformation of a layered spherical earth model due to large earthquakes; Rift setting of Lake Vostok, Antarctica; and global ice mass changes.

  13. Customizable scientific web-portal for DIII-D nuclear fusion experiment

    NASA Astrophysics Data System (ADS)

    Abla, G.; Kim, E. N.; Schissel, D. P.

    2010-04-01

    Increasing utilization of the Internet and convenient web technologies has made the web-portal a major application interface for remote participation in and control of scientific instruments. While web-portals have provided a centralized gateway for multiple computational services, the amount of visual output is often overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in the experiment, filtering and organizing information based on the individual user's needs can increase the usability and efficiency of a web-portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web-portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and a list of services for users to select from, so each user can create a unique working environment to fit his or her own needs and interests. Customizable services include real-time experiment status monitoring, diagnostic data access, and interactive data analysis and visualization. The web-portal also supports interactive collaboration by providing a collaborative logbook and online instant announcement services. The DIII-D web-portal development utilizes a multi-tier software architecture and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly interactive and customizable user interface.

  14. Development of a 3D WebGIS System for Retrieving and Visualizing CityGML Data Based on their Geometric and Semantic Characteristics by Using Free and Open Source Technology

    NASA Astrophysics Data System (ADS)

    Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    CityGML is considered an optimal standard for representing 3D city models. However, international experience has shown that visualization of such models is quite difficult to implement on the web, due to the large size of the data and the complexity of CityGML. In the context of this paper, a 3D WebGIS application is therefore developed in order to successfully retrieve and visualize CityGML data in accordance with their respective geometric and semantic characteristics. Furthermore, the available web technologies and the architecture of WebGIS systems are investigated, as documented in international experience, in order to be utilized in the most appropriate way for the purposes of this paper. Specifically, a PostgreSQL/PostGIS database is used, in compliance with the 3DCityDB schema. At the server tier, Apache HTTP Server and GeoServer are utilized, with PHP as the server-side programming language. At the client tier, which implements the interface of the application, the following technologies are used: jQuery, AJAX, JavaScript, HTML5, WebGL and Ol3-Cesium. Finally, it is worth mentioning that the application's primary objectives are a user-friendly interface and a fully open source development.

  15. Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment

    DTIC Science & Technology

    2013-06-01

    architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), Data (Database) ... organization in each period. This data will be collected for analysis ... notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will perform statistical tests on this data, including Pareto analyses and confirmation

  16. Validity and reliability of an application review process using dedicated reviewers in one stage of a multi-stage admissions model.

    PubMed

    Zeeman, Jacqueline M; McLaughlin, Jacqueline E; Cox, Wendy C

    2017-11-01

    With increased emphasis placed on non-academic skills in the workplace, a need exists to identify an admissions process that evaluates these skills. This study assessed the validity and reliability of an application review process involving three dedicated application reviewers in a multi-stage admissions model. A multi-stage admissions model was utilized during the 2014-2015 admissions cycle. After advancing through the academic review, each application was independently reviewed by two dedicated application reviewers utilizing a six-construct rubric (written communication, extracurricular and community service activities, leadership experience, pharmacy career appreciation, research experience, and resiliency). Rubric scores were extrapolated to a three-tier ranking to select candidates for on-site interviews. Kappa statistics were used to assess interrater reliability. A three-facet Many-Facet Rasch Model (MFRM) determined reviewer severity, candidate suitability, and rubric construct difficulty. The kappa statistic for candidates' tier rank score (n = 388 candidates) was 0.692 with a perfect agreement frequency of 84.3%. There was substantial interrater reliability between reviewers for the tier ranking (kappa: 0.654-0.710). Highest construct agreement occurred in written communication (kappa: 0.924-0.984). A three-facet MFRM analysis explained 36.9% of variance in the ratings, with 0.06% reflecting application reviewer scoring patterns (i.e., severity or leniency), 22.8% reflecting candidate suitability, and 14.1% reflecting construct difficulty. Utilization of dedicated application reviewers and a defined tiered rubric provided a valid and reliable method to effectively evaluate candidates during the application review process. These analyses provide insight into opportunities for improving the application review process among schools and colleges of pharmacy. Copyright © 2017 Elsevier Inc. All rights reserved.
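    The interrater agreement figures reported above (kappa 0.654-0.710 for tier ranking) use Cohen's kappa, which discounts raw agreement by the agreement expected from the raters' marginal rating frequencies. A minimal implementation of the standard unweighted statistic for two raters:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters rating the same candidates.

    po = observed agreement; pe = agreement expected by chance, from
    each rater's marginal category frequencies.
    """
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (po - pe) / (1 - pe)
```

    With two balanced categories, 50% raw agreement is exactly what chance predicts, so kappa is 0; values near 0.7, as in the study, indicate substantial agreement beyond chance under the commonly cited Landis and Koch benchmarks.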

  17. Advanced Networks in Motion Mobile Sensorweb

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Stewart, David H.

    2011-01-01

    Advanced mobile networking technology applicable to mobile sensor platforms was developed, deployed and demonstrated. A two-tier sensorweb design was developed. The first tier utilized mobile network technology to provide mobility. The second tier, which sits above the first tier, utilizes 6LowPAN (Internet Protocol version 6 Low Power Wireless Personal Area Networks) sensors. The entire network was IPv6 enabled. Successful mobile sensorweb system field tests took place in late August and early September of 2009. The entire network utilized IPv6 and was monitored and controlled using a remote Web browser via IPv6 technology. This paper describes the mobile networking and 6LowPAN sensorweb design, implementation, deployment and testing as well as wireless systems and network monitoring software developed to support testing and validation.

  18. The “NetBoard”: Network Monitoring Tools Integration for INFN Tier-1 Data Center

    NASA Astrophysics Data System (ADS)

De Girolamo, D.; dell'Agnello, L.; Zani, S.

    2012-12-01

    The monitoring and alert system is fundamental for the management and operation of the network in a large data center such as an LHC Tier-1. The network of the INFN Tier-1 at CNAF is a multi-vendor environment: several tools have been adopted and different sensors have been developed for its management and monitoring. In this paper, after an overview of the different aspects to be monitored and the tools used for them (e.g. MRTG, Nagios, Arpwatch, NetFlow, Syslog), we describe the “NetBoard”, a monitoring toolkit developed at the INFN Tier-1. NetBoard, developed for a multi-vendor network, is able to install and auto-configure all the tools needed for network monitoring, via a network device discovery mechanism, a configuration file or a wizard. In this way, we are also able to activate different types of sensors and Nagios checks according to the equipment vendor's specifications. Moreover, when a new device is connected to the LAN, NetBoard can detect where it is plugged in. Finally, the NetBoard web interface shows the overall status of the entire network “at a glance”: local and wide-area (including LHCOPN and LHCONE) link utilization, the health status of network devices (with active alerts) and flow analysis.

  19. 77 FR 67053 - Self-Regulatory Organizations; New York Stock Exchange LLC; Order Granting Approval of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-08

    ... Tier Two, at each such issuer's election, the Exchange offers either market analytics or Web-hosting... One, the Exchange offers market surveillance and Web-hosting products and services to U.S. issuers... Tier One offerings, the Exchange proposes to permit a Tier One issuer to choose market analytics...

  20. The CMS dataset bookkeeping service

    NASA Astrophysics Data System (ADS)

    Afaq, A.; Dolgert, A.; Guo, Y.; Jones, C.; Kosyakov, S.; Kuznetsov, V.; Lueking, L.; Riley, D.; Sekhri, V.

    2008-07-01

    The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger sources, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and DBS provides support for localized processing or private analysis as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS, with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.

  1. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. It is now generally accepted that client applications forming part of such an infrastructure should be built on modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of the economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts:
    1. A server-side part comprising the PHP applications of the SDI geoportal, realizing the interaction with the computational core backend and the WMS/WFS/WPS cartographical services, and implementing an open API for browser-based client software. This part provides a limited set of procedures accessible via a standard HTTP interface.
    2. A front-end part: a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of such popular desktop GIS applications as uDIG and QuantumGIS. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development.
    In line with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers:
    - A tier of NetCDF metadata in JSON format
    - A middleware tier of JavaScript objects implementing methods to work with:
      - NetCDF metadata
      - an XML file describing the selected calculation configuration (the XML task)
      - WMS/WFS/WPS cartographical services
    - A graphical user interface tier of JavaScript objects realizing the general application business logic
    The Web GIS provides launching of computational processing services to support tasks in the area of environmental monitoring, as well as presentation of calculation results as WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by the Russian Science Foundation grant No 16-19-10257.
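    The "XML task" that the middleware tier hands to the computational backend can be sketched as a small serializer. All element names, and the dataset and method values in the usage note, are illustrative assumptions, not this project's actual task schema:

```python
import xml.etree.ElementTree as ET

def build_task_xml(dataset, method, region, out_format="NetCDF"):
    """Serialize a calculation request as a flat XML task document.

    region is (min_lat, max_lat, min_lon, max_lon); out_format names
    the desired result encoding (e.g. NetCDF, GeoTIFF).
    """
    task = ET.Element("task")
    ET.SubElement(task, "dataset").text = dataset
    ET.SubElement(task, "method").text = method
    ET.SubElement(task, "region").text = ",".join(str(c) for c in region)
    ET.SubElement(task, "output").text = out_format
    return ET.tostring(task, encoding="unicode")
```

    For instance, `build_task_xml("reanalysis", "trend", (50.0, 60.0, 90.0, 100.0))` yields a one-line XML document the GUI tier could POST to the geoportal's HTTP API.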

  2. Use of Self-Monitoring to Maintain Program Fidelity of Multi-Tiered Interventions

    ERIC Educational Resources Information Center

    Nelson, J. Ron; Oliver, Regina M.; Hebert, Michael A.; Bohaty, Janet

    2015-01-01

    Multi-tiered system of supports represents one of the most significant advancements in improving the outcomes of students for whom typical instruction is not effective. While many practices need to be in place to make multi-tiered systems of support effective, accurate implementation of evidence-based practices by individuals at all tiers is…

  3. Development and Application of a Novel Rasch-based Methodology for Evaluating Multi-Tiered Assessment Instruments: Validation and utilization of an undergraduate diagnostic test of the water cycle

    NASA Astrophysics Data System (ADS)

    Romine, William L.; Schaffer, Dane L.; Barrow, Lloyd

    2015-11-01

    We describe the development and validation of a three-tiered diagnostic test of the water cycle (DTWC) and use it to evaluate the impact of prior learning experiences on undergraduates' misconceptions. While most approaches to instrument validation take a positivist perspective using singular criteria such as reliability and fit with a measurement model, we extend this to a multi-tiered approach which supports multiple interpretations. Using a sample of 130 undergraduate students from two colleges, we utilize the Rasch model to place students and items along traditional one-, two-, and three-tiered scales as well as a misconceptions scale. In the three-tiered and misconceptions scales, high confidence was indicative of mastery. In the latter scale, a 'misconception' was defined as mastery of an incorrect concept. We found that integrating confidence into mastery did little to change item functioning; however, three-tiered usage resulted in higher reliability and lower student ability estimates than two-tiered usage. The misconceptions scale showed high efficacy in predicting items on which particular students were likely to express misconceptions, and revealed several tenacious misconceptions that all students were likely to express regardless of ability. Previous coursework on the water cycle did little to change the prevalence of undergraduates' misconceptions.
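    A three-tiered item couples a content answer, a reasoning choice, and a confidence rating; the scoring rule described above — mastery requires both tiers correct plus high confidence, and a misconception is a confidently held incorrect concept — can be sketched as follows (the function and its signature are illustrative, not the authors' code):

```python
def score_three_tier(answer_ok, reason_ok, confident):
    """Score one three-tiered item.

    Returns (three_tier_score, misconception_flag):
      three_tier_score: 1 only if answer, reasoning, and high confidence align;
      misconception: an incorrect concept held with high confidence.
    """
    correct = answer_ok and reason_ok
    return (1 if (correct and confident) else 0,
            bool((not correct) and confident))
```

    Note how low confidence never yields a misconception flag: an unconfident wrong answer counts as a knowledge gap rather than a misconception.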

  4. Web-based multimedia information retrieval for clinical application research

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Hoo, Kent S., Jr.; Zhang, Hong; Ching, Wan; Zhang, Ming; Wong, Stephen T. C.

    2001-08-01

    We described a web-based data warehousing method for retrieving and analyzing neurological multimedia information. The web-based method supports convenient access, effective search and retrieval of clinical textual and image data, and on-line analysis. To improve the flexibility and efficiency of multimedia information query and analysis, a three-tier, multimedia data warehouse for epilepsy research has been built. The data warehouse integrates clinical multimedia data related to epilepsy from disparate sources and archives them into a well-defined data model.

  5. Web-Based Two-Tier Diagnostic Test and Remedial Learning Experiment

    ERIC Educational Resources Information Center

    Lai, Ah-Fur; Chen, Deng-Jyi

    2010-01-01

    Offering a series of diagnosis and individual remedial learning activities for a general class by means of web and multimedia technology can overcome the dilemma of conventional diagnosis and remedial instruction. The study proposes a three-layer conceptual framework and adopts a two-tier diagnostic test theory to develop a web-based two-tier…

  6. Web-based GIS for collaborative planning and public participation: an application to the strategic planning of wind farm sites.

    PubMed

    Simão, Ana; Densham, Paul J; Haklay, Mordechai Muki

    2009-05-01

    Spatial planning typically involves multiple stakeholders. To any specific planning problem, stakeholders often bring different levels of knowledge about the components of the problem and make assumptions, reflecting their individual experiences, that yield conflicting views about desirable planning outcomes. Consequently, stakeholders need to learn about the likely outcomes that result from their stated preferences; this learning can be supported through enhanced access to information, increased public participation in spatial decision-making and support for distributed collaboration amongst planners, stakeholders and the public. This paper presents a conceptual system framework for web-based GIS that supports public participation in collaborative planning. The framework combines an information area, a Multi-Criteria Spatial Decision Support System (MC-SDSS) and an argumentation map to support distributed and asynchronous collaboration in spatial planning. After analysing the novel aspects of this framework, the paper describes its implementation, as a proof of concept, in a system for Web-based Participatory Wind Energy Planning (WePWEP). Details are provided on the specific implementation of each of WePWEP's four tiers, including technical and structural aspects. Throughout the paper, particular emphasis is placed on the need to support user learning throughout the planning process.
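    The MC-SDSS component of such a framework typically combines stakeholder-supplied criterion weights into a single site score; a minimal weighted-sum sketch (the criteria names, weights, and normalization to [0, 1] are invented for illustration, not WePWEP's actual model):

```python
def rank_sites(sites, weights):
    """Weighted-sum multi-criteria scoring; criterion values assumed in [0, 1]."""
    def score(criteria):
        return sum(weights[k] * criteria[k] for k in weights)
    return sorted(sites, key=lambda s: score(s["criteria"]), reverse=True)

sites = [
    {"name": "A", "criteria": {"wind": 0.9, "visual": 0.2, "grid": 0.6}},
    {"name": "B", "criteria": {"wind": 0.7, "visual": 0.8, "grid": 0.5}},
]
weights = {"wind": 0.5, "visual": 0.3, "grid": 0.2}
ranked = rank_sites(sites, weights)
```

    Different stakeholders supplying different weight vectors is precisely what produces the conflicting rankings that the argumentation map then helps to negotiate.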

  7. A component-based, distributed object services architecture for a clinical workstation.

    PubMed

    Chueh, H C; Raila, W F; Pappas, J J; Ford, M; Zatsman, P; Tu, J; Barnett, G O

    1996-01-01

    Attention to an architectural framework in the development of clinical applications can promote reusability of both legacy systems as well as newly designed software. We describe one approach to an architecture for a clinical workstation application which is based on a critical middle tier of distributed object-oriented services. This tier of network-based services provides flexibility in the creation of both the user interface and the database tiers. We developed a clinical workstation for ambulatory care using this architecture, defining a number of core services including those for vocabulary, patient index, documents, charting, security, and encounter management. These services can be implemented through proprietary or more standard distributed object interfaces such as CORBA and OLE. Services are accessed over the network by a collection of user interface components which can be mixed and matched to form a variety of interface styles. These services have also been reused with several applications based on World Wide Web browser interfaces.

  8. A component-based, distributed object services architecture for a clinical workstation.

    PubMed Central

    Chueh, H. C.; Raila, W. F.; Pappas, J. J.; Ford, M.; Zatsman, P.; Tu, J.; Barnett, G. O.

    1996-01-01

    Attention to an architectural framework in the development of clinical applications can promote reusability of both legacy systems as well as newly designed software. We describe one approach to an architecture for a clinical workstation application which is based on a critical middle tier of distributed object-oriented services. This tier of network-based services provides flexibility in the creation of both the user interface and the database tiers. We developed a clinical workstation for ambulatory care using this architecture, defining a number of core services including those for vocabulary, patient index, documents, charting, security, and encounter management. These services can be implemented through proprietary or more standard distributed object interfaces such as CORBA and OLE. Services are accessed over the network by a collection of user interface components which can be mixed and matched to form a variety of interface styles. These services have also been reused with several applications based on World Wide Web browser interfaces. PMID:8947744

  9. Multi-Tiered System of Support: Best Differentiation Practices for English Language Learners in Tier 1

    ERIC Educational Resources Information Center

    Izaguirre, Cecilia

    2017-01-01

    Purpose: This qualitative case study explored the best practices of differentiation of Tier 1 instruction within a multi-tiered system of support for English Language Learners who were predominately Spanish speaking. Theoretical Framework: The zone of proximal development theory, cognitive theory, and the affective filter hypothesis guided this…

  10. Multi-tiered system of support incorporating the R.E.N.E.W. process and its relationship to perception of school safety and office discipline referrals

    NASA Astrophysics Data System (ADS)

    Flood, Molly M.

    This study examined the relationship between the fidelity of multi-tier school-wide positive behavior interventions and supports (SWPBIS) and staff perception of school safety and office discipline referrals. This research provided a case study on multi-tier supports and interventions, and the RENEW person-centered planning process in an alternative special education center following the implementation of a multi-tier SWPBIS model. Pennsylvania is one of several states looking to adopt an effective Tier III behavioral tool. The research described the results of an analysis of implementation fidelity on a multi-tiered school-wide positive behavior support model developed at a special education center operated by a public school system entity. This research explored the fidelity of SWPBIS implementation; analyzed the relationship of SWPBIS to school climate as measured by staff perceptions and reduction of office discipline referrals (ODR); explored tier III supports incorporating a process Rehabilitation and Empowerment, Natural Supports, Education and Work (RENEW); and investigated the potential sustainability of the RENEW process as a multi-tier system of support. This study investigated staff perceptions on integrated supports between schools and communities and identified the degree of relationship to school risk factors, school protective factors, and office discipline referrals following the building of cooperative partnerships between Systems of Care and Local Education Agencies.

  11. Analytical Study on Multi-Tier 5G Heterogeneous Small Cell Networks: Coverage Performance and Energy Efficiency.

    PubMed

    Xiao, Zhu; Liu, Hongjing; Havyarimana, Vincent; Li, Tong; Wang, Dong

    2016-11-04

    In this paper, we investigate the coverage performance and energy efficiency of multi-tier heterogeneous cellular networks (HetNets) which are composed of macrocells and different types of small cells, i.e., picocells and femtocells. By virtue of stochastic geometry tools, we model the multi-tier HetNets based on a Poisson point process (PPP) and analyze the Signal to Interference Ratio (SIR) via studying the cumulative interference from pico-tier and femto-tier. We then derive the analytical expressions of coverage probabilities in order to evaluate coverage performance in different tiers and investigate how it varies with the small cells' deployment density. By taking the fairness and user experience into consideration, we propose a disjoint channel allocation scheme and derive the system channel throughput for various tiers. Further, we formulate the energy efficiency optimization problem for multi-tier HetNets in terms of throughput performance and resource allocation fairness. To solve this problem, we devise a linear programming based approach to obtain the available area of the feasible solutions. System-level simulations demonstrate that the small cells' deployment density has a significant effect on the coverage performance and energy efficiency. Simulation results also reveal that there exists an optimal small cell base station (SBS) density ratio between pico-tier and femto-tier which can be applied to maximize the energy efficiency and at the same time enhance the system performance. Our findings provide guidance for the design of multi-tier HetNets for improving the coverage performance as well as the energy efficiency.
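    The PPP-based coverage analysis the authors carry out in closed form can also be approximated numerically; a simplified single-tier Monte Carlo sketch (no fading, no noise, finite disc instead of the whole plane — all simplifications not in the paper):

```python
import math, random

def _poisson(rng, mu):
    # Knuth's method; adequate for the modest means used here
    l, k, p = math.exp(-mu), 0, 1.0
    while p > l:
        k += 1
        p *= rng.random()
    return k - 1

def coverage_probability(lam, radius, alpha, theta, trials=4000, seed=7):
    """Estimate P[SIR > theta] for a typical user at the origin.

    Base stations are drawn from a PPP of density lam in a disc of the given
    radius; path loss r^-alpha with unit transmit power, nearest-BS
    association, interference summed over all other BSs.
    """
    rng = random.Random(seed)
    mu = lam * math.pi * radius ** 2
    covered = 0
    for _ in range(trials):
        n = _poisson(rng, mu)
        if n == 0:
            continue  # no serving BS: count as not covered
        # sqrt trick gives points uniform in the disc; keep distances only
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n))
        signal = dists[0] ** -alpha
        interference = sum(d ** -alpha for d in dists[1:])
        if interference == 0 or signal / interference > theta:
            covered += 1
    return covered / trials

p = coverage_probability(lam=0.05, radius=10, alpha=4, theta=1.0)
```

    Extending this to multiple tiers means superposing independent PPPs with per-tier powers and densities, which is where the paper's pico/femto density ratio enters.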

  12. Analytical Study on Multi-Tier 5G Heterogeneous Small Cell Networks: Coverage Performance and Energy Efficiency

    PubMed Central

    Xiao, Zhu; Liu, Hongjing; Havyarimana, Vincent; Li, Tong; Wang, Dong

    2016-01-01

    In this paper, we investigate the coverage performance and energy efficiency of multi-tier heterogeneous cellular networks (HetNets) which are composed of macrocells and different types of small cells, i.e., picocells and femtocells. By virtue of stochastic geometry tools, we model the multi-tier HetNets based on a Poisson point process (PPP) and analyze the Signal to Interference Ratio (SIR) via studying the cumulative interference from pico-tier and femto-tier. We then derive the analytical expressions of coverage probabilities in order to evaluate coverage performance in different tiers and investigate how it varies with the small cells’ deployment density. By taking the fairness and user experience into consideration, we propose a disjoint channel allocation scheme and derive the system channel throughput for various tiers. Further, we formulate the energy efficiency optimization problem for multi-tier HetNets in terms of throughput performance and resource allocation fairness. To solve this problem, we devise a linear programming based approach to obtain the available area of the feasible solutions. System-level simulations demonstrate that the small cells’ deployment density has a significant effect on the coverage performance and energy efficiency. Simulation results also reveal that there exists an optimal small cell base station (SBS) density ratio between pico-tier and femto-tier which can be applied to maximize the energy efficiency and at the same time enhance the system performance. Our findings provide guidance for the design of multi-tier HetNets for improving the coverage performance as well as the energy efficiency. PMID:27827917

  13. Development and Application of a Novel Rasch-Based Methodology for Evaluating Multi-Tiered Assessment Instruments: Validation and Utilization of an Undergraduate Diagnostic Test of the Water Cycle

    ERIC Educational Resources Information Center

    Romine, William L.; Schaffer, Dane L.; Barrow, Lloyd

    2015-01-01

    We describe the development and validation of a three-tiered diagnostic test of the water cycle (DTWC) and use it to evaluate the impact of prior learning experiences on undergraduates' misconceptions. While most approaches to instrument validation take a positivist perspective using singular criteria such as reliability and fit with a measurement…

  14. The Deep Impact Network Experiment Operations Center Monitor and Control System

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan

    2009-01-01

    The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
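    The monitoring pipeline described — receive protocol status messages, classify them, store them in a database for web queries — can be sketched as follows (the JSON message format and field names are hypothetical; ION's actual status reports differ):

```python
import json, sqlite3

def store_status(conn, raw):
    """Classify a JSON status message by its 'type' field and persist it."""
    msg = json.loads(raw)
    kind = msg.get("type", "unknown")
    conn.execute(
        "INSERT INTO status (node, kind, payload) VALUES (?, ?, ?)",
        (msg.get("node"), kind, raw),
    )
    return kind

conn = sqlite3.connect(":memory:")  # stand-in for the EOC database
conn.execute("CREATE TABLE status (node TEXT, kind TEXT, payload TEXT)")
kind = store_status(conn, '{"node": "lander-1", "type": "bundle-delivered"}')
count = conn.execute("SELECT COUNT(*) FROM status").fetchone()[0]
```

    The web interfaces mentioned in the abstract would then be read-only views over the same table, serving both live dashboards and interactive queries.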

  15. Multicenter breast cancer collaborative registry.

    PubMed

    Sherman, Simon; Shats, Oleg; Fleissner, Elizabeth; Bascom, George; Yiee, Kevin; Copur, Mehmet; Crow, Kate; Rooney, James; Mateen, Zubeena; Ketcham, Marsha A; Feng, Jianmin; Sherman, Alexander; Gleason, Michael; Kinarsky, Leo; Silva-Lopez, Edibaldo; Edney, James; Reed, Elizabeth; Berger, Ann; Cowan, Kenneth

    2011-01-01

    The Breast Cancer Collaborative Registry (BCCR) is a multicenter web-based system that efficiently collects and manages a variety of data on breast cancer (BC) patients and BC survivors. This registry is designed as a multi-tier web application that utilizes Java Servlet/JSP technology and has an Oracle 11g database as a back-end. The BCCR questionnaire has accommodated standards accepted in breast cancer research and healthcare. By harmonizing the controlled vocabulary with the NCI Thesaurus (NCIt) or Systematized Nomenclature of Medicine-Clinical Terms (SNOMED-CT), the BCCR provides a standardized approach to data collection and reporting. The BCCR has been recently certified by the National Cancer Institute's Center for Biomedical Informatics and Information Technology (NCI CBIIT) as a cancer Biomedical Informatics Grid (caBIG(®)) Bronze Compatible product. The BCCR is aimed at facilitating rapid and uniform collection of critical information and biological samples to be used in developing diagnostic, prevention, treatment, and survivorship strategies against breast cancer. Currently, seven cancer institutions are participating in the BCCR that contains data on almost 900 subjects (BC patients and survivors, as well as individuals at high risk of getting BC).
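    Harmonizing site-entered terms against a controlled vocabulary such as NCIt or SNOMED-CT amounts, at its simplest, to normalizing synonyms to a preferred label; a toy sketch (the mapping table is invented for illustration and is not the BCCR's actual crosswalk):

```python
# Illustrative synonym table: local free-text entries -> preferred label.
SYNONYMS = {
    "idc": "Invasive ductal carcinoma",
    "infiltrating ductal carcinoma": "Invasive ductal carcinoma",
}

def harmonize(term):
    """Normalize a site-entered term to its preferred controlled-vocabulary label."""
    key = term.strip().lower()
    return SYNONYMS.get(key, term.strip())

label = harmonize("  IDC ")
```

    A production registry would of course map to vocabulary concept identifiers rather than bare strings, so that reports from the seven participating sites aggregate cleanly.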

  16. Multicenter Breast Cancer Collaborative Registry

    PubMed Central

    Sherman, Simon; Shats, Oleg; Fleissner, Elizabeth; Bascom, George; Yiee, Kevin; Copur, Mehmet; Crow, Kate; Rooney, James; Mateen, Zubeena; Ketcham, Marsha A.; Feng, Jianmin; Sherman, Alexander; Gleason, Michael; Kinarsky, Leo; Silva-Lopez, Edibaldo; Edney, James; Reed, Elizabeth; Berger, Ann; Cowan, Kenneth

    2011-01-01

    The Breast Cancer Collaborative Registry (BCCR) is a multicenter web-based system that efficiently collects and manages a variety of data on breast cancer (BC) patients and BC survivors. This registry is designed as a multi-tier web application that utilizes Java Servlet/JSP technology and has an Oracle 11g database as a back-end. The BCCR questionnaire has accommodated standards accepted in breast cancer research and healthcare. By harmonizing the controlled vocabulary with the NCI Thesaurus (NCIt) or Systematized Nomenclature of Medicine-Clinical Terms (SNOMED-CT), the BCCR provides a standardized approach to data collection and reporting. The BCCR has been recently certified by the National Cancer Institute’s Center for Biomedical Informatics and Information Technology (NCI CBIIT) as a cancer Biomedical Informatics Grid (caBIG®) Bronze Compatible product. The BCCR is aimed at facilitating rapid and uniform collection of critical information and biological samples to be used in developing diagnostic, prevention, treatment, and survivorship strategies against breast cancer. Currently, seven cancer institutions are participating in the BCCR that contains data on almost 900 subjects (BC patients and survivors, as well as individuals at high risk of getting BC). PMID:21918596

  17. The CMS dataset bookkeeping service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afaq, Anzar (Fermilab); Dolgert, Andrew

    2007-10-01

    The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify the MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via a Python API, a command line, and a Discovery web page interface. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS, with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
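    One of the roles named above is tracking data provenance across processing steps; conceptually this is a parent-pointer walk over dataset records, sketched here with invented dataset names (this is not the real DBS schema or API):

```python
# Each derived dataset records its parent, so an analysis dataset can be
# traced back to its Monte Carlo or detector source.
parents = {
    "/Zee/RECO": "/Zee/RAW",
    "/Zee/AOD": "/Zee/RECO",
}

def provenance(dataset):
    """Walk parent links back to the original source dataset."""
    chain = [dataset]
    while chain[-1] in parents:
        chain.append(parents[chain[-1]])
    return chain

chain = provenance("/Zee/AOD")
```

    The federation of per-service-level catalogs then only has to agree on these identifiers for migration tools to move entries without losing lineage.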

  18. A socio-technical critique of tiered services: implications for interprofessional care.

    PubMed

    Hood, Rick

    2015-01-01

    In the health and social care sector, tiered services have become an increasingly influential way of organising professional expertise to address the needs of vulnerable people. Drawing on its application to UK child welfare services, this paper discusses the merits of the tiered model from a socio-technical perspective - an approach that has emerged from the fields of accident analysis and systems design. The main elements of a socio-technical critique are outlined and used to explore how tiered services provide support to families and prevent harm to children. Attention is drawn to the distribution of expertise and resources in a tiered system, and to the role of referral and gate-keeping procedures in dispersing accountability for outcomes. An argument is made for designing systems "against demand", and the paper concludes by discussing some alternative models of multi-agency provision.

  19. Storing files in a parallel computing system based on user or application specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Nick, Jeffrey M.

    2016-03-29

    Techniques are provided for storing files in a parallel computing system based on a user specification. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a specification from the distributed application indicating how the plurality of files should be stored, and storing one or more of the plurality of files in one or more storage nodes of a multi-tier storage system based on the specification. The plurality of files comprise a plurality of complete files and/or a plurality of sub-files. The specification can optionally be processed by a daemon executing on one or more nodes in a multi-tier storage system. The specification indicates how the plurality of files should be stored, for example, identifying one or more storage nodes where the plurality of files should be stored.
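    Such specification-driven placement can be pictured as a small rule table mapping file patterns to storage tiers; a sketch under an invented spec format (the record does not prescribe this syntax):

```python
import fnmatch

# Hypothetical specification: first matching glob pattern decides the tier.
SPEC = [
    ("*.ckpt", "flash"),    # checkpoints to the fast burst-buffer tier
    ("*.log", "disk"),      # logs to the capacity tier
    ("*", "archive"),       # everything else to the archival tier
]

def place(filename, spec=SPEC):
    """Return the storage tier for a file per the application's specification."""
    for pattern, tier in spec:
        if fnmatch.fnmatch(filename, pattern):
            return tier
    return "archive"

tier = place("step100.ckpt")
```

    A daemon on the storage nodes would evaluate rules like these as files arrive, which is how application-level intent reaches the multi-tier storage system.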

  20. Transitioning Client Based NALCOMIS to a Multi Function Web Based Application

    DTIC Science & Technology

    2016-09-23

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Transitioning Client-Based NALCOMIS to a Multi-Function Web-Based Application, by Aaron P. Schnetzler. NALCOMIS has two configurations that are used by organizational and intermediate level maintenance activities, Optimized Organizational…

  1. Development and implementation of an Integrated Water Resources Management System (IWRMS)

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.; Busch, C.

    2011-04-01

    One of the innovative objectives in the EC project BRAHMATWINN was the development of a stakeholder-oriented Integrated Water Resources Management System (IWRMS). The toolset integrates the findings of the project and presents them in a user-friendly way for decision support in sustainable integrated water resources management (IWRM) in river basins. IWRMS is a framework which integrates different types of basin information and supports the development of IWRM options for climate change mitigation. It is based on the River Basin Information System (RBIS) data models and delivers a graphical user interface for stakeholders. A special interface was developed for the integration of the enhanced DANUBIA model input and the NetSyMod model with its Mulino decision support system (mulino mDss) component. The web-based IWRMS contains and combines different types of data and methods to provide river basin data and information for decision support. IWRMS is based on a three-tier software framework which uses (i) HTML/JavaScript at the client tier, (ii) the PHP programming language to realize the application tier, and (iii) a PostgreSQL/PostGIS database tier to manage and store all data, except the DANUBIA modelling raw data, which are file based and registered in the database tier. All three tiers can reside on one or different computers and are adapted to the local hardware infrastructure. IWRMS as well as RBIS are based on Open Source Software (OSS) components, and flexible, time-saving access to the database is guaranteed by web-based interfaces for data visualization and retrieval. The IWRMS is accessible via the BRAHMATWINN homepage: http://www.brahmatwinn.uni-jena.de and a user manual for the RBIS is available for download as well.

  2. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool suite adheres to the model-driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  3. A Forecast Skill Comparison between CliPAS One-Tier and Two-Tier Hindcast Experiments

    NASA Astrophysics Data System (ADS)

    Lee, J.; Wang, B.; Kang, I.

    2006-05-01

    A 24-year (1981-2004) MME hindcast experimental dataset was produced under the "Climate Prediction and Its Application to Society" (CliPAS) project sponsored by the Korea Meteorological Administration (KMA). This dataset consists of 5 one-tier model systems from the National Aeronautics and Space Administration (NASA), the National Center for Environmental Prediction (NCEP), the Frontier Research Center for Global Change (FRCGC), Seoul National University (SNU), and the University of Hawaii (UH), and 5 two-tier model systems from Florida State University (FSU), the Geophysical Fluid Dynamics Lab (GFDL), SNU, and UH. Multi-model ensemble (MME) forecast skills for seasonal mean precipitation and atmospheric circulation are compared between the CliPAS one-tier and two-tier hindcast experiments. For winter prediction, the two-tier MME has skill comparable to the one-tier MME. However, in the heavy-precipitation regions of the Asian-Australian monsoon (A-AM), one-tier systems are shown to be superior to two-tier systems in the summer season, because inclusion of the local warm pool-monsoon interaction in the one-tier systems improves the ENSO teleconnection with monsoon regions. Both one-tier and two-tier MMEs fail to predict the Indian monsoon circulation, while they show significantly good skill for the broad-scale monsoon circulation defined by the Webster and Yang index. The one-tier systems have much better skill than the two-tier systems in predicting the monsoon circulation over the western North Pacific, where air-sea interaction plays an important role.
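    The basic machinery behind such comparisons is an equal-weight multi-model mean evaluated with a correlation skill score; a sketch with toy numbers (not CliPAS data):

```python
import math

def mme_mean(forecasts):
    """Equal-weight multi-model ensemble: average member forecasts pointwise."""
    n = len(forecasts)
    return [sum(vals) / n for vals in zip(*forecasts)]

def correlation_skill(forecast, observed):
    """Pearson correlation between forecast and observed anomalies."""
    fm = sum(forecast) / len(forecast)
    om = sum(observed) / len(observed)
    fa = [f - fm for f in forecast]
    oa = [o - om for o in observed]
    num = sum(f * o for f, o in zip(fa, oa))
    den = math.sqrt(sum(f * f for f in fa) * sum(o * o for o in oa))
    return num / den

mme = mme_mean([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
skill = correlation_skill(mme, [1.0, 2.0, 4.0])
```

    Computing this score separately for the one-tier and two-tier ensembles over the 24 hindcast years is, in essence, what the comparison above reports.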

  4. Data Access System for Hydrology

    NASA Astrophysics Data System (ADS)

    Whitenack, T.; Zaslavsky, I.; Valentine, D.; Djokic, D.

    2007-12-01

    As part of the CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc., Hydrologic Information System), the CUAHSI HIS team has developed the Data Access System for Hydrology, or DASH. DASH is based on commercial off-the-shelf technology developed in conjunction with a commercial partner, ESRI. DASH is a web-based user interface, developed in ASP.NET using ESRI ArcGIS Server 9.2, that provides a mapping, querying and data retrieval interface over observation and GIS databases and web services. This is the front-end application for the CUAHSI Hydrologic Information System Server. The HIS Server is a software stack that organizes observation databases, geographic data layers, data importing and management tools, and online user interfaces such as the DASH application into a flexible multi-tier application for serving both national-level and locally maintained observation data. The user interface of the DASH web application allows online users to query observation networks by location and attributes, selecting stations in a user-specified area where a particular variable was measured during a given time interval. Once one or more stations and variables are selected, the user can retrieve and download the observation data for further off-line analysis. The DASH application is highly configurable. The mapping interface can be configured to display map services from multiple sources in multiple formats, including ArcGIS Server, ArcIMS, and WMS. The observation network data is configured in an XML file that specifies each network's web service location and its corresponding map layer. Upon initial deployment, two national-level observation networks (USGS NWIS daily values and USGS NWIS instantaneous values) are already pre-configured. There is also an optional login page which can be used to restrict access, as well as providing an alternative to immediate downloads: for large requests, users are notified via email with a link to their data when it is ready.
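    The station query described — filter an observation-network catalog by bounding box, variable, and time interval — reduces to a predicate over station records; a toy sketch with invented catalog fields (not the NWIS schema):

```python
from datetime import date

stations = [  # toy observation-network catalog
    {"id": "S1", "lat": 41.0, "lon": -96.5, "variable": "discharge",
     "start": date(1990, 1, 1), "end": date(2005, 12, 31)},
    {"id": "S2", "lat": 44.2, "lon": -95.0, "variable": "precipitation",
     "start": date(1998, 1, 1), "end": date(2010, 12, 31)},
]

def select_stations(bbox, variable, t0, t1, catalog=stations):
    """Stations inside bbox=(minlat, minlon, maxlat, maxlon) that measured
    `variable` at some point during [t0, t1]."""
    minlat, minlon, maxlat, maxlon = bbox
    return [s["id"] for s in catalog
            if minlat <= s["lat"] <= maxlat and minlon <= s["lon"] <= maxlon
            and s["variable"] == variable
            and s["start"] <= t1 and s["end"] >= t0]

ids = select_stations((40, -97, 45, -94), "discharge",
                      date(2000, 1, 1), date(2001, 1, 1))
```

    In DASH the same filter runs server-side against the configured observation databases, with the map interface supplying the bounding box.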

  5. An Improved Publication Process for the UMVF.

    PubMed

    Renard, Jean-Marie; Brunetaud, Jean-Marc; Cuggia, Marc; Darmoni, Stephan; Lebeux, Pierre; Beuscart, Régis

    2005-01-01

    The "Université Médicale Virtuelle Francophone" (UMVF) is a federation of French medical schools. Its main goal is to share the production and use of pedagogic medical resources generated by academic medical teachers. We developed an open-source application, based upon a workflow system, which provides an improved publication process for the UMVF. For teachers, the tool permits easy and efficient upload of new educational resources. For web masters it provides a mechanism to easily locate and validate the resources. For both teachers and web masters, the utility provides the control and communication functions that define a workflow system. For all users, students in particular, the application improves the value of the UMVF repository by providing an easy way to find a detailed description of a resource and to check any resource from the UMVF to ascertain its quality and integrity, even if the resource is an old deprecated version. The server tier of the application implements the main workflow functionality and is deployed on certified UMVF servers using the PHP language, an LDAP directory and an SQL database. The client tier provides both the workflow and the search-and-check functionality and is implemented as a Java applet running in a W3C-compliant web browser. A unique signature for each resource, needed for the security functionality, is implemented using the MD5 digest algorithm. Testing performed by Rennes and Lille verified the functionality and its conformity with our specifications.
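    The per-resource MD5 signature used for the integrity check is a one-liner with a standard library; a minimal sketch (the resource bytes are illustrative):

```python
import hashlib

def resource_signature(content: bytes) -> str:
    """MD5 digest used as a unique signature to verify resource integrity."""
    return hashlib.md5(content).hexdigest()

sig = resource_signature(b"lecture-notes-v1")
```

    Any change to the resource bytes changes the digest, which is what lets a client detect a corrupted or substituted copy, including an old deprecated version whose recorded signature no longer matches.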

  6. A multi-tiered architecture for content retrieval in mobile peer-to-peer networks.

    DOT National Transportation Integrated Search

    2012-01-01

    In this paper, we address content retrieval in Mobile Peer-to-Peer (P2P) Networks. We design a multi-tiered architecture for content : retrieval, where at Tier 1, we design a protocol for content similarity governed by a parameter that trades accu...

  7. EDSP Tier 2 test (T2T) guidances and protocols are delivered, including web-based guidance for diagnosing and scoring, and evaluating EDC-induced pathology in fish and amphibian

    EPA Science Inventory

    The Agency’s Endocrine Disruptor Screening Program (EDSP) consists of two tiers. The first tier provides information regarding whether a chemical may have endocrine disruption properties. Tier 2 tests provide confirmation of ED effects and dose-response information to be us...

  8. A cloud medication safety support system using QR code and Web services for elderly outpatients.

    PubMed

    Tseng, Ming-Hseng; Wu, Hui-Ching

    2014-01-01

    Drugs are an important part of disease treatment, but medication errors happen frequently and have significant clinical and financial consequences. The prevalence of prescription medication use among the ambulatory adult population increases with advancing age. Because of the globally aging society, medication safety is an even greater concern for outpatients than for inpatients. The elderly with multiple chronic conditions face the complex task of medication management. To reduce medication errors for elderly outpatients with chronic diseases, a cloud medication safety support system is designed, demonstrated and evaluated. The proposed system is composed of a three-tier architecture: the front-end tier, the mobile tier and the cloud tier. The mobile tier hosts the personalized medication safety support application on Android platforms, which provides primary functions including medication reminders, assistance with pill-dispensing, recording of medications, locating of medications and notices of forgotten medications for elderly outpatients. Finally, the hybrid technology acceptance model is employed to understand the intention and satisfaction level of potential users of this mobile medication safety support application. The result of the system acceptance testing indicates that this system, implementing patient-centered services, is highly accepted by the elderly. The proposed m-health system could assist elderly outpatients' homecare by preventing medication errors and improving their medication safety.
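
As a rough illustration of the QR-code link between the tiers, the sketch below packs a medication record into a compact string that a QR code could carry and the mobile tier could decode to drive reminders; the payload format is an assumption, not the paper's actual encoding.

```python
import base64
import json

def make_qr_payload(record: dict) -> str:
    """Serialize a medication record into a compact, QR-embeddable string."""
    raw = json.dumps(record, separators=(",", ":")).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")

def read_qr_payload(payload: str) -> dict:
    """Inverse of make_qr_payload: recover the medication record."""
    return json.loads(base64.urlsafe_b64decode(payload))

rx = {"drug": "metformin", "dose_mg": 500, "times": ["08:00", "20:00"]}
assert read_qr_payload(make_qr_payload(rx)) == rx
```

A real deployment would render the payload with a QR library and likely sign or encrypt it, since medication data is sensitive.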

  9. Diagnosing Students' Misconceptions in Number Sense via a Web-Based Two-Tier Test

    ERIC Educational Resources Information Center

    Lin, Yung-Chi; Yang, Der-Ching; Li, Mao-Neng

    2016-01-01

    A web-based two-tier test (WTTT-NS) which combined the advantages of traditional written tests and interviews in assessing number sense was developed and applied to assess students' answers and reasons for the questions. In addition, students' major misconceptions can be detected. A total of 1,248 sixth graders in Taiwan were selected to…

  10. Using School Improvement and Implementation Science to Integrate Multi-Tiered Systems of Support in Secondary Schools

    ERIC Educational Resources Information Center

    Bohanon, Hank; Gilman, Carrie; Parker, Ben; Amell, Chris

    2016-01-01

    The purpose of this paper is to describe the integration of tiered interventions and supports in secondary schools, sometimes referred to as multi-tiered systems of support (MTSS). The interventions include academic, behavioural, social, and emotional supports for all students. A description of the connections across specific MTSS systems,…

  11. In Vitro Pulmonary Toxicity of Metal Oxide Nanoparticles

    EPA Science Inventory

    The diversity of engineered-nanomaterials and their applications as well as potential unknown health effects of these novel materials are significant challenges to assessing the health risks of nanotechnology. An integrated multi-tier testing strategy (www.epa.gov/nanoscience/) ...

  12. An Examination of Multi-Tier Designs for Legacy Data Access

    DTIC Science & Technology

    1997-12-01

    heterogeneous relational database management systems. The first test system incorporates a two-tier architecture design using Java, and the second system employs a three-tier architecture design using Java and CORBA. Data on replication times for the two-tier and three-tier designs are presented

  13. Tier-scalable reconnaissance: the challenge of sensor optimization, sensor deployment, sensor fusion, and sensor interoperability

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang; George, Thomas; Tarbell, Mark A.

    2007-04-01

    Robotic reconnaissance operations are called for in extreme environments, not only those such as space, including planetary atmospheres, surfaces, and subsurfaces, but also in potentially hazardous or inaccessible operational areas on Earth, such as mine fields, battlefield environments, enemy occupied territories, terrorist infiltrated environments, or areas that have been exposed to biochemical agents or radiation. Real-time reconnaissance enables the identification and characterization of transient events. A fundamentally new mission concept for tier-scalable reconnaissance of operational areas, originated by Fink et al., is aimed at replacing the engineering- and safety-constrained mission designs of the past. The tier-scalable paradigm integrates multi-tier (orbit, atmosphere, surface/subsurface) and multi-agent (satellite, UAV/blimp, surface/subsurface sensing platforms) hierarchical mission architectures, introducing not only mission redundancy and safety, but also enabling and optimizing intelligent, less constrained, and distributed reconnaissance in real time. Given the mass, size, and power constraints faced by such a multi-platform approach, this is an ideal application scenario for a diverse set of MEMS sensors. To support such mission architectures, a high degree of operational autonomy is required. Essential elements of such operational autonomy are: (1) automatic mapping of an operational area from different vantage points (including vehicle health monitoring); (2) automatic feature extraction and target/region-of-interest identification within the mapped operational area; and (3) automatic target prioritization for close-up examination. These requirements imply the optimal deployment of MEMS sensors and sensor platforms, sensor fusion, and sensor interoperability.

  14. Web Service for Positional Quality Assessment: the Wps Tier

    NASA Astrophysics Data System (ADS)

    Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.

    2015-08-01

    In the field of spatial data, more and more information becomes available every day, yet we still have very little information about its quality. We consider the automation of spatial data quality assessment a true need for the geomatics sector, and that automation is possible by means of web processing services (WPS) and the application of specific assessment procedures. In this paper we propose and develop a WPS tier centered on the automation of positional quality assessment. An experiment using the NSSDA positional accuracy method is presented. In the experiment, the client uploads two datasets (reference and evaluation data). The processing determines homologous pairs of points (by distance) and calculates the positional accuracy value under the NSSDA standard. The process generates a small report that is sent to the client. From our experiment, we reached some conclusions on the advantages and disadvantages of WPSs when applied to the automation of spatial data accuracy assessments.
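
The processing step described above (match homologous points by distance, then report horizontal accuracy at 95% confidence as 1.7308 × RMSE_r per the NSSDA standard) can be sketched as follows; the matching tolerance and test coordinates are illustrative.

```python
import math

def match_by_distance(reference, test, tol):
    """Pair each reference point with the nearest test point within tol."""
    pairs = []
    for r in reference:
        t = min(test, key=lambda p: math.dist(r, p))
        if math.dist(r, t) <= tol:
            pairs.append((r, t))
    return pairs

def nssda_horizontal_accuracy(pairs):
    """NSSDA horizontal accuracy at 95% confidence: 1.7308 * RMSE_r,
    with RMSE_r computed over homologous (reference, test) pairs."""
    sq = sum((xr - xt) ** 2 + (yr - yt) ** 2
             for (xr, yr), (xt, yt) in pairs)
    rmse_r = math.sqrt(sq / len(pairs))
    return 1.7308 * rmse_r

ref = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
tst = [(0.3, 0.4), (10.3, 0.4), (0.3, 10.4)]
pairs = match_by_distance(ref, tst, tol=2.0)
print(round(nssda_horizontal_accuracy(pairs), 4))  # 0.8654
```

Each pair here is offset by (0.3, 0.4), i.e. 0.5 m, so RMSE_r = 0.5 and the reported accuracy is 1.7308 × 0.5 = 0.8654.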

  15. Collaborative medical informatics research using the Internet and the World Wide Web.

    PubMed Central

    Shortliffe, E. H.; Barnett, G. O.; Cimino, J. J.; Greenes, R. A.; Huff, S. M.; Patel, V. L.

    1996-01-01

    The InterMed Collaboratory is an interdisciplinary project involving six participating medical institutions. There are two broad mandates for the effort. The first is to further the development, sharing, and demonstration of numerous software and system components, data sets, procedures and tools that will facilitate the collaborations and support the application goals of these projects. The second is to provide a distributed suite of clinical applications, guidelines, and knowledge-bases for clinical, educational, and administrative purposes. To define the interactions among the components, datasets, procedures, and tools that we are producing and sharing, we have identified a model composed of seven tiers, each of which supports the levels above it. In this paper we briefly describe those tiers and the nature of the collaborative process with which we have experimented. PMID:8947641

  16. Tier 2 Interventions in Positive Behavior Support: A Survey of School Implementation

    ERIC Educational Resources Information Center

    Rodriguez, Billie Jo; Loman, Sheldon L.; Borgmeier, Christopher

    2016-01-01

    As increasing numbers of schools implement Multi-Tiered Systems of Support (MTSS), schools are looking for and implementing evidence-based practices for students whose needs are not fully met by Tier 1 supports. Although there is relative consistency and clarity in what constitutes Tier 1 behavior support within MTSS, Tier 2 supports may be more…

  17. Implementation of an Enterprise Information Portal (EIP) in the Loyola University Health System

    PubMed Central

    Price, Ronald N.; Hernandez, Kim

    2001-01-01

    Loyola University Chicago Stritch School of Medicine and Loyola University Medical Center have long histories in the development of applications to support the institutions' missions of education, research and clinical care. In late 1998, the institutions' application development group undertook an ambitious program to re-architect more than 10 years of legacy application development (30+ core applications) into a unified World Wide Web (WWW) environment. The primary project objectives were to construct an environment that would support the rapid development of n-tier, web-based applications while providing standard methods for user authentication/validation, security/access control and definition of a user's organizational context. The project's efforts resulted in Loyola's Enterprise Information Portal (EIP), which meets the aforementioned objectives. This environment: 1) allows access to other vertical Intranet portals (e.g., electronic medical record, patient satisfaction information and faculty effort); 2) supports end-user desktop customization; and 3) provides a means for standardized application “look and feel.” The portal was constructed utilizing readily available hardware and software. Server hardware consists of multiprocessor (Intel Pentium 500 MHz) Compaq 6500 servers with one gigabyte of random access memory and 75 gigabytes of hard disk storage. Microsoft SQL Server was selected to house the portal's internal security data structures. Netscape Enterprise Server was selected for the web server component of the environment, and Allaire's ColdFusion was chosen for the access and application tiers. Total costs for the portal environment were less than $40,000. User data storage is accomplished through two Microsoft SQL Servers and an existing SUN Microsystems enterprise server with eight processors and 750 gigabytes of disk storage running the Sybase relational database manager. Total storage capacity across all systems exceeds one terabyte. 
In the past 12 months, the EIP has supported development of more than 88 applications and is utilized by more than 2,200 users.

  18. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    NASA Astrophysics Data System (ADS)

    Brun, R.; Duellmann, D.; Ganis, G.; Hanushevsky, A.; Janyst, L.; Peters, A. J.; Rademakers, F.; Sindrilaru, E.

    2011-12-01

    The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and showed significant performance with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyse the possible performance gains, operational impact on site services and applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
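
A file-access proxy cache of the kind studied here can be sketched, under heavy simplification, as a site-local LRU store that fetches through to remote storage on a miss; the class and parameter names are illustrative, not from the XROOT code.

```python
from collections import OrderedDict

class ProxyCache:
    """Toy site-local file cache: serve hits locally, fetch through on a miss."""

    def __init__(self, fetch, capacity=4):
        self.fetch = fetch          # callable: path -> bytes (remote read)
        self.capacity = capacity
        self.store = OrderedDict()  # path -> bytes, in LRU order
        self.hits = self.misses = 0

    def read(self, path):
        if path in self.store:
            self.hits += 1
            self.store.move_to_end(path)    # mark as most recently used
            return self.store[path]
        self.misses += 1
        data = self.fetch(path)
        self.store[path] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return data

cache = ProxyCache(fetch=lambda p: b"data:" + p.encode())
for p in ["/a.root", "/b.root", "/a.root"]:
    cache.read(p)
print(cache.hits, cache.misses)  # 1 2
```

The cache-efficiency question the paper studies is exactly how the hit/miss ratio behaves under the access patterns of ROOT-based analysis jobs.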

  19. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, R.; Dullmann, D.; Ganis, G.

    2012-04-19

    The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and showed significant performance with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyze the possible performance gains, operational impact on site services and applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.

  20. Documenting clinical pharmacist intervention before and after the introduction of a web-based tool.

    PubMed

    Nurgat, Zubeir A; Al-Jazairi, Abdulrazaq S; Abu-Shraie, Nada; Al-Jedai, Ahmed

    2011-04-01

    To develop a database for documenting pharmacist intervention through a web-based application. The secondary endpoint was to determine whether the new, web-based application provides any benefits with regard to documentation compliance by clinical pharmacists and ease of calculating cost savings compared with our previous method of documenting pharmacist interventions. A tertiary care hospital in Saudi Arabia. The documentation of interventions using a web-based documentation application was retrospectively compared with the previous method of documenting clinical pharmacists' interventions (multi-user PC software). The number and types of interventions recorded by pharmacists, data mining of archived data, efficiency, cost savings, and the accuracy of the data generated. The number of documented clinical interventions increased from 4,926, using the multi-user PC software, to 6,840 for the web-based application. On average, we observed 653 interventions per clinical pharmacist using the web-based application, an increase compared to an average of 493 interventions using the old multi-user PC software. However, using a paired Student's t-test there was no statistically significant difference between the two means (P = 0.201). Using a χ² test, which captured management level and the type of system used, we found a strong effect of management level (P < 2.2 × 10⁻¹⁶) on the number of documented interventions. We also found a moderately significant relationship between educational level and the number of interventions documented (P = 0.045). The mean ± SD time required to document an intervention using the web-based application was 66.55 ± 8.98 s. Using the web-based application, 29.06% of documented interventions resulted in cost savings, while using the multi-user PC software only 4.75% of interventions did so. The majority of cost savings across both platforms resulted from the discontinuation of unnecessary drugs and a change in dosage regimen. 
Data collection using the web-based application was consistently more complete when compared to the multi-user PC software. The web-based application is an efficient system for documenting pharmacist interventions. Its flexibility and accessibility, as well as its detailed report functionality is a useful tool that will hopefully encourage other primary and secondary care facilities to adopt similar applications.
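
The paired t-test used above can be reproduced in miniature with the standard library; the per-pharmacist counts below are made-up illustration data, not the study's.

```python
import math
import statistics

def paired_t(before, after):
    """t statistic for a paired-samples t-test (df = n - 1)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical interventions per pharmacist on the old vs new system.
t = paired_t(before=[480, 500, 490, 510], after=[630, 655, 660, 640])
print(round(t, 2))
```

The resulting t statistic would then be compared against the t distribution with n − 1 degrees of freedom to obtain the P value the study reports.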

  1. The Molecule Pages database

    PubMed Central

    Saunders, Brian; Lyon, Stephen; Day, Matthew; Riley, Brenda; Chenette, Emily; Subramaniam, Shankar

    2008-01-01

    The UCSD-Nature Signaling Gateway Molecule Pages (http://www.signaling-gateway.org/molecule) provides essential information on more than 3800 mammalian proteins involved in cellular signaling. The Molecule Pages contain expert-authored and peer-reviewed information based on the published literature, complemented by regularly updated information derived from public data source references and sequence analysis. The expert-authored data includes both a full-text review about the molecule, with citations, and highly structured data for bioinformatics interrogation, including information on protein interactions and states, transitions between states and protein function. The expert-authored pages are anonymously peer reviewed by the Nature Publishing Group. The Molecule Pages data is present in an object-relational database format and is freely accessible to the authors, the reviewers and the public from a web browser that serves as a presentation layer. The Molecule Pages are supported by several applications that along with the database and the interfaces form a multi-tier architecture. The Molecule Pages and the Signaling Gateway are routinely accessed by a very large research community. PMID:17965093

  2. The Molecule Pages database.

    PubMed

    Saunders, Brian; Lyon, Stephen; Day, Matthew; Riley, Brenda; Chenette, Emily; Subramaniam, Shankar; Vadivelu, Ilango

    2008-01-01

    The UCSD-Nature Signaling Gateway Molecule Pages (http://www.signaling-gateway.org/molecule) provides essential information on more than 3800 mammalian proteins involved in cellular signaling. The Molecule Pages contain expert-authored and peer-reviewed information based on the published literature, complemented by regularly updated information derived from public data source references and sequence analysis. The expert-authored data includes both a full-text review about the molecule, with citations, and highly structured data for bioinformatics interrogation, including information on protein interactions and states, transitions between states and protein function. The expert-authored pages are anonymously peer reviewed by the Nature Publishing Group. The Molecule Pages data is present in an object-relational database format and is freely accessible to the authors, the reviewers and the public from a web browser that serves as a presentation layer. The Molecule Pages are supported by several applications that along with the database and the interfaces form a multi-tier architecture. The Molecule Pages and the Signaling Gateway are routinely accessed by a very large research community.

  3. VAAPA: a web platform for visualization and analysis of alternative polyadenylation.

    PubMed

    Guan, Jinting; Fu, Jingyi; Wu, Mingcheng; Chen, Longteng; Ji, Guoli; Quinn Li, Qingshun; Wu, Xiaohui

    2015-02-01

    Polyadenylation [poly(A)] is an essential process during the maturation of most mRNAs in eukaryotes. Alternative polyadenylation (APA) as an important layer of gene expression regulation has been increasingly recognized in various species. Here, a web platform for visualization and analysis of alternative polyadenylation (VAAPA) was developed. This platform can visualize the distribution of poly(A) sites and poly(A) clusters of a gene or a section of a chromosome. It can also highlight genes with switched APA sites among different conditions. VAAPA is an easy-to-use web-based tool that provides functions of poly(A) site query, data uploading, downloading, and APA sites visualization. It was designed in a multi-tier architecture and developed based on Smart GWT (Google Web Toolkit) using Java as the development language. VAAPA will be a valuable addition to the community for the comprehensive study of APA, not only by making the high quality poly(A) site data more accessible, but also by providing users with numerous valuable functions for poly(A) site analysis and visualization. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  5. Integrating Model-Based Transmission Reduction into a multi-tier architecture

    NASA Astrophysics Data System (ADS)

    Straub, J.

    A multi-tier architecture consists of numerous craft as part of the system, orbital, aerial, and surface tiers. Each tier is able to collect progressively greater levels of information. Generally, craft from lower-level tiers are deployed to a target of interest based on its identification by a higher-level craft. While the architecture promotes significant amounts of science being performed in parallel, this may overwhelm the computational and transmission capabilities of higher-tier craft and links (particularly the deep space link back to Earth). Because of this, a new paradigm in in-situ data processing is required. Model-based transmission reduction (MBTR) is such a paradigm. Under MBTR, each node (whether a single spacecraft in orbit of the Earth or another planet or a member of a multi-tier network) is given an a priori model of the phenomenon that it is assigned to study. It performs activities to validate this model. If the model is found to be erroneous, corrective changes are identified, assessed to ensure their significance for being passed on, and prioritized for transmission. A limited amount of verification data is sent with each MBTR assertion message to allow those that might rely on the data to validate the correct operation of the spacecraft and MBTR engine onboard. Integrating MBTR with a multi-tier framework creates an MBTR hierarchy. Higher levels of the MBTR hierarchy task lower levels with data collection and assessment tasks that are required to validate or correct elements of its model. A model of the expected conditions is sent to the lower level craft; which then engages its own MBTR engine to validate or correct the model. This may include tasking a yet lower level of craft to perform activities. 
When the MBTR engine at a given level receives all of its component data (whether directly collected or from delegation), it randomly chooses some to validate (by reprocessing the validation data), performs analysis and sends its own results (validation and/or changes of model elements and supporting validation data) to its upstream node. This constrains data transmission to only significant information (either because it includes a change or is validation data critical for assessing overall performance) and reduces the processing requirements at higher-level nodes (by not having to process insignificant data). This paper presents a framework for multi-tier MBTR and two demonstration mission concepts: an Earth sensornet and a mission to Mars. These multi-tier MBTR concepts are compared to a traditional mission approach.
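
The core MBTR loop described above (compare observations to the a-priori model, transmit only the corrections that matter, update the local model) can be sketched as follows; the telemetry keys and significance threshold are illustrative assumptions.

```python
def mbtr_update(model, observed, threshold):
    """Model-Based Transmission Reduction sketch: emit only corrections
    that deviate from the a-priori model by more than the threshold."""
    corrections = {}
    for key, value in observed.items():
        expected = model.get(key)
        if expected is None or abs(value - expected) > threshold:
            corrections[key] = value
    model.update(corrections)  # local model now reflects the observation
    return corrections         # only this dict is transmitted upstream

# A-priori model held by the node vs what its sensors actually observe.
model = {"temp_K": 210.0, "dust_tau": 0.4, "pressure_Pa": 610.0}
obs   = {"temp_K": 211.0, "dust_tau": 1.9, "pressure_Pa": 610.5}
sent = mbtr_update(model, obs, threshold=1.0)
print(sent)  # {'dust_tau': 1.9}
```

Only the dust opacity correction crosses the threshold, so the upstream node receives one value instead of the full observation set, which is the transmission saving MBTR is after.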

  6. Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform

    NASA Astrophysics Data System (ADS)

    Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.

    2012-12-01

    This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built by accessing data from a database directly with a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs and send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, whether web-based for human consumption or programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have even thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also allows utilizing distributed datasets easily. Companion mechanisms for querying data snapshots (aka time travel), usage tracking and license management, and verification of semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
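
The intermediate "data application" tier can be sketched as a function from a RESTful URI to a JSON result; the URI scheme and toy records below are assumptions for illustration, not Earth-Base's actual API or data.

```python
import json
from urllib.parse import parse_qs, urlparse

# Toy records standing in for the backing data store.
UNITS = [
    {"id": 1, "unit": "Tunnel City Group", "period": "Cambrian"},
    {"id": 2, "unit": "Prairie du Chien", "period": "Ordovician"},
]

def data_app(uri: str) -> str:
    """Sketch of the data-application tier: resolve a RESTful URI's query
    parameters against the store and return the matches as JSON."""
    parts = urlparse(uri)
    query = {k: v[0] for k, v in parse_qs(parts.query).items()}
    rows = [r for r in UNITS
            if all(str(r.get(k)) == v for k, v in query.items())]
    return json.dumps(rows)

print(data_app("/api/v1/units?period=Cambrian"))
```

Because the tier speaks only URIs in and JSON out, any client (a web page, a plotting script, someone else's mashup) can consume it without knowing the database behind it.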

  7. Multi-Tiered Systems of Support Preservice Residency: A Pilot Undergraduate Teacher Preparation Model

    ERIC Educational Resources Information Center

    Ross, Scott Warren; Lignugaris-Kraft, Ben

    2015-01-01

    This case study examined the implementation of a novel nontraditional teacher preparation program, "Multi-Tiered Systems of Support Preservice Residency Project" (MTSS-PR). The two-year program placed general and special education composite undergraduate majors full time in high-need schools implementing evidence-based systems of…

  8. AnaBench: a Web/CORBA-based workbench for biomolecular sequence analysis

    PubMed Central

    Badidi, Elarbi; De Sousa, Cristina; Lang, B Franz; Burger, Gertraud

    2003-01-01

    Background Sequence data analyses such as gene identification, structure modeling or phylogenetic tree inference involve a variety of bioinformatics software tools. Due to the heterogeneity of bioinformatics tools in usage and data requirements, scientists spend much effort on technical issues including data format, storage and management of input and output, and memorization of numerous parameters and multi-step analysis procedures. Results In this paper, we present the design and implementation of AnaBench, an interactive, Web-based bioinformatics Analysis workBench allowing streamlined data analysis. Our philosophy was to minimize the technical effort not only for the scientist who uses this environment to analyze data, but also for the administrator who manages and maintains the workbench. With new bioinformatics tools published daily, AnaBench permits easy incorporation of additional tools. This flexibility is achieved by employing a three-tier distributed architecture and recent technologies including CORBA middleware, Java, JDBC, and JSP. A CORBA server permits transparent access to a workbench management database, which stores information about the users, their data, as well as the description of all bioinformatics applications that can be launched from the workbench. Conclusion AnaBench is an efficient and intuitive interactive bioinformatics environment, which offers scientists application-driven, data-driven and protocol-driven analysis approaches. The prototype of AnaBench, managed by a team at the Université de Montréal, is accessible on-line at: . Please contact the authors for details about setting up a local-network AnaBench site elsewhere. PMID:14678565

  9. A Multi-Channel Approach for Collaborative Web-Based Learning

    ERIC Educational Resources Information Center

    Azeta, A. A.

    2008-01-01

    This paper describes an architectural framework and a prototype implementation of a web-based multi-channel e-Learning application that allows students, lecturers and the research communities to collaborate irrespective of the communication device a user is carrying. The application was developed based on the concept of "write once run on any…

  10. Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL

    NASA Astrophysics Data System (ADS)

    Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong

    2011-12-01

    We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over for multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, is discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing are outlined. Integration of the systems with our Nagios-based facility monitoring and alerting is also described, and the application characteristics of GUMS and VOMS that enable effective clustering are explained. We then summarize our practical experiences and real-world scenarios resulting from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
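
Health-monitor-governed routing with fail-over reduces, in its simplest reading, to picking the highest-priority backend that is currently healthy; the sketch below is a toy model of that decision, not F5's actual behavior.

```python
def route(pool, health):
    """Return the first healthy server in priority order (master first),
    mimicking how health monitors govern routing in a fail-over pair."""
    for server in pool:
        if health.get(server, False):
            return server
    raise RuntimeError("no healthy backend")

pool = ["gums-db-master", "gums-db-slave"]
# Normal operation: traffic goes to the master.
assert route(pool, {"gums-db-master": True, "gums-db-slave": True}) == "gums-db-master"
# Master fails its health check: traffic fails over to the slave.
assert route(pool, {"gums-db-master": False, "gums-db-slave": True}) == "gums-db-slave"
```

A real smart switch also load-balances across healthy members and handles fail-back, but the health-check-then-route ordering is the essential idea.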

  11. Design of the Resources and Environment Monitoring Website in Kashgar

    NASA Astrophysics Data System (ADS)

    Huang, Z.; Lin, Q. Z.; Wang, Q. J.

    2014-03-01

    Despite the development of web geographical information systems (web GIS), many useful spatial analysis functions are ignored in system implementations. As Kashgar is rich in natural resources, it is of great significance to monitor the region's natural resource and environmental situation. Therefore, making extensive use of spatial analysis, a resources and environment monitoring website for Kashgar was built. Functions for water, vegetation, and ice and snow extraction, task management, change assessment, as well as thematic mapping and reporting based on TM remote sensing images, were implemented in the website. The design of the website is presented in terms of the database management tier, the business logic tier and the top-level presentation tier. The vital operations of the website are introduced and its general performance is evaluated.

  12. Planetary Data Systems (PDS) Imaging Node Atlas II

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; McAuley, James M.

    2013-01-01

    The Planetary Image Atlas (PIA) is a Rich Internet Application (RIA) that serves planetary imaging data to the science community and the general public. PIA also utilizes the USGS Unified Planetary Coordinate system (UPC) and the on-Mars map server. The Atlas was designed to provide the ability to search and filter through more than 8 million planetary image files. This software is a three-tier Web application comprising a search-engine backend (MySQL, Java), a Web service interface (SOAP) between server and client, and a GWT Google Maps API client front end. This application allows for the search, retrieval, and download of planetary images and associated metadata from the following missions: 2001 Mars Odyssey, Cassini, Galileo, LCROSS, Lunar Reconnaissance Orbiter, Mars Exploration Rover, Mars Express, Magellan, Mars Global Surveyor, Mars Pathfinder, Mars Reconnaissance Orbiter, MESSENGER, Phoenix, Viking Lander, Viking Orbiter, and Voyager. The Atlas utilizes the UPC to translate mission-specific coordinate systems into a unified coordinate system, allowing the end user to query across missions of similar targets. If desired, the end user can also use a mission-specific view of the Atlas; the mission-specific views rely on the same code base. This application is a major improvement over the initial version of the Planetary Image Atlas. It is a multi-mission search engine with both basic and advanced search capabilities, providing a product search tool to interrogate the collection of planetary images. This tool lets the end user query information about each image while filtering out data the user has no interest in. Users can reduce the number of images to look at by defining an area of interest with latitude and longitude ranges.
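
    The area-of-interest filter described above can be sketched as a simple range query over an image catalogue. The record fields and sample data below are illustrative assumptions, not PIA's actual schema.

```python
# Hypothetical catalogue records: id, mission, and footprint centre.
images = [
    {"id": "PIA00001", "mission": "Viking Orbiter", "lat": 22.5, "lon": 48.0},
    {"id": "PIA00002", "mission": "Mars Odyssey",  "lat": -14.6, "lon": 175.5},
    {"id": "PIA00003", "mission": "Mars Odyssey",  "lat": 21.0, "lon": 50.2},
]

def search(images, lat_range, lon_range, mission=None):
    """Return images inside the lat/lon box, optionally for one mission."""
    lo_lat, hi_lat = lat_range
    lo_lon, hi_lon = lon_range
    return [img for img in images
            if lo_lat <= img["lat"] <= hi_lat
            and lo_lon <= img["lon"] <= hi_lon
            and (mission is None or img["mission"] == mission)]

# Cross-mission query for a region of interest, as the UPC enables.
hits = search(images, lat_range=(20, 25), lon_range=(45, 55))
print([img["id"] for img in hits])
```

    Omitting the mission argument queries across missions, which is the benefit the unified coordinate system provides.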

  13. Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.

    PubMed

    Smith, Jonathan W N; Kerrison, Gavin

    2013-01-01

    Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.

  14. Formal Analysis of Privacy Requirements Specifications for Multi-Tier Applications

    DTIC Science & Technology

    2013-07-30

    Requirements Engineering Lab and co-founder of the Requirements Engineering and Law Workshop and has several publications in ACM- and IEEE-sponsored journals...Advertising that serves the online ad "Buying Razors Sucks" in this game. Zynga also produces a version of this game for the Android and iPhone mobile

  15. An Autonomic Framework for Integrating Security and Quality of Service Support in Databases

    ERIC Educational Resources Information Center

    Alomari, Firas

    2013-01-01

    The back-end databases of multi-tiered applications are a major data security concern for enterprises. The abundance of these systems and the emergence of new and different threats require multiple and overlapping security mechanisms. Therefore, providing multiple and diverse database intrusion detection and prevention systems (IDPS) is a critical…

  16. LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns

    NASA Astrophysics Data System (ADS)

    Netzel, P.; Stepinski, T.

    2012-12-01

    The amount of satellite-based spatial data is continuously increasing, making the development of efficient data-search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified images rather than raw ones. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) over the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. at a resolution of 30 meters/pixel and depicts 16 land cover classes. The size of NLCD2006 is about 10 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source Software. Its main components are GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. A scene pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps, and pattern similarity is based on the notion of mutual information. The resultant similarity map can be viewed and navigated in a web browser, or downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
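
    The comparison step described above, where each scene is summarised as a normalised 2D histogram and compared with an information-theoretic measure, can be sketched with the Jensen-Shannon divergence, which can be read as the mutual information between a sample's origin and its value. This illustrates the idea only and is not LandEx's actual code; the histograms below are toy data.

```python
import numpy as np

def jensen_shannon(p, q):
    """JS divergence (bits) between two non-negative histograms."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)   # 0 = identical, 1 = disjoint

def similarity(hist_a, hist_b):
    return 1.0 - jensen_shannon(hist_a, hist_b)

# Toy 2D histograms: land-cover class x clump-size bin.
a = np.array([[4.0, 1.0], [0.0, 5.0]])       # reference scene
b = np.array([[4.0, 1.0], [0.0, 5.0]])       # identical local pattern
c = np.array([[0.0, 5.0], [5.0, 0.0]])       # very different pattern
print(similarity(a, b), similarity(a, c))
```

    Evaluating this similarity for every local window against the reference scene yields exactly the kind of U.S.-wide similarity map the tool serves.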

  17. Multi-Tier Mental Health Program for Refugee Youth

    ERIC Educational Resources Information Center

    Ellis, B. Heidi; Miller, Alisa B.; Abdi, Saida; Barrett, Colleen; Blood, Emily A.; Betancourt, Theresa S.

    2013-01-01

    Objective: We sought to establish that refugee youths who receive a multi-tiered approach to services, Project SHIFA, would show high levels of engagement in treatment appropriate to their level of mental health distress, improvements in mental health symptoms, and a decrease in resource hardships. Method: Study participants were 30 Somali and…

  18. Integrating a Multi-Tiered System of Supports with Comprehensive School Counseling Programs

    ERIC Educational Resources Information Center

    Ziomek-Daigle, Jolie; Goodman-Scott, Emily; Cavin, Jason; Donohue, Peg

    2016-01-01

    A multi-tiered system of supports, including Response to Intervention and Positive Behavioral Interventions and Supports, is a widely utilized framework implemented in K-12 schools to address the academic and behavioral needs of all students. School counselors are leaders who facilitate comprehensive school counseling programs and demonstrate…

  19. Exploring an Ecological Model of Perceived Usability within a Multi-Tiered Vocabulary Intervention

    ERIC Educational Resources Information Center

    Neugebauer, Sabina R.; Chafouleas, Sandra M.; Coyne, Michael D.; McCoach, D. Betsy; Briesch, Amy M.

    2016-01-01

    The present study examines an ecological model for intervention use to explain student vocabulary performance in a multi-tiered intervention setting. A teacher self-report measure composed of factors hypothesized to influence intervention use at multiple levels (i.e., individual, intervention, and system level) was administered to 54 teachers and…

  20. Multi-Tiered System of Supports (MTSS) and Implementation Science

    ERIC Educational Resources Information Center

    Dillard, Christina

    2017-01-01

    Many districts and schools are having difficulty implementing Multi-Tiered System of Supports (MTSS) in school settings. This quantitative study set out to examine the stage of MTSS implementation schools are at and identify factors from the implementation science framework that account for the different reported student outcomes related to MTSS…

  1. Systematic Review of Real-time Remote Health Monitoring System in Triage and Priority-Based Sensor Technology: Taxonomy, Open Challenges, Motivation and Recommendations.

    PubMed

    Albahri, O S; Albahri, A S; Mohammed, K I; Zaidan, A A; Zaidan, B B; Hashim, M; Salman, Omar H

    2018-03-22

    The new and ground-breaking real-time remote monitoring in triage and priority-based sensor technology used in telemedicine has significantly bounded and dispersed communication components. To examine these technologies and provide researchers with a clear vision of this area, we must first be aware of the utilised approaches and existing limitations in this line of research. To this end, an extensive search was conducted to find articles dealing with (a) telemedicine, (b) triage, (c) priority and (d) sensors, in order to comprehensively review related applications and establish a coherent taxonomy of these articles. ScienceDirect, IEEE Xplore and Web of Science databases were checked for articles on triage and priority-based sensor technology in telemedicine. The retrieved articles were filtered according to the type of telemedicine technology explored. A total of 150 articles were selected and classified into two categories. The first category includes reviews and surveys of triage and priority-based sensor technology in telemedicine. The second category includes articles on the three-tiered architecture of telemedicine. Tier 1 represents the users. Sensors acquire the vital signs of the users and send them to Tier 2, which is the personal gateway that uses local area network protocols or a wireless body area network. Medical data are sent from Tier 2 to Tier 3, which is the healthcare provider in medical institutes. Then, the motivation for using triage and priority-based sensor technology in telemedicine, the issues obstructing its application and the development and utilisation of telemedicine are examined on the basis of the findings presented in the literature.
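
    The Tier 2 role described above, forwarding sensor readings to the Tier 3 provider in triage order rather than arrival order, is essentially a priority queue. The sketch below illustrates that idea; the heart-rate thresholds and field names are assumptions for illustration, not any system from the surveyed literature.

```python
import heapq

def triage_level(heart_rate):
    """Lower number = more urgent, so it sorts first in the heap."""
    if heart_rate > 140 or heart_rate < 40:
        return 0   # emergency
    if heart_rate > 110 or heart_rate < 50:
        return 1   # urgent
    return 2       # routine

# Tier 2 gateway: readings arrive from Tier 1 sensors in arrival order.
queue = []
for seq, (patient, hr) in enumerate([("p1", 72), ("p2", 150), ("p3", 115)]):
    # seq breaks ties so equal-priority patients keep arrival order.
    heapq.heappush(queue, (triage_level(hr), seq, patient))

# Tier 3 receives the emergency case first, then urgent, then routine.
order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)  # ['p2', 'p3', 'p1']
```

    Real systems score priority over several vital signs at once, but the queueing structure is the same.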

  2. OpenGL in Multi-User Web-Based Applications

    NASA Astrophysics Data System (ADS)

    Szostek, K.; Piórkowski, A.

    In this article, the construction and potential of multi-user web-based OpenGL applications are presented. The most common technologies (.NET ASP, Java and Mono) were used with specific OpenGL libraries to visualize three-dimensional medical data. The most important conclusion of this work is that server-side applications can easily take advantage of a fast GPU and produce results of advanced computations as efficiently as visualizations.

  3. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this, it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been completed and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
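
    The core computation such a tool performs, a surface exposure age from a measured nuclide concentration, can be sketched in its simplest form. The version below assumes zero erosion and a constant production rate; WebCN itself also models the spatial and temporal variation of the cosmic-ray flux. The numeric values are illustrative.

```python
import math

def exposure_age(concentration, production_rate, half_life):
    """Age (yr) solved from N = (P / lam) * (1 - exp(-lam * t)),
    where lam is the decay constant. Assumes zero erosion."""
    lam = math.log(2) / half_life            # decay constant (1/yr)
    return -math.log(1 - concentration * lam / production_rate) / lam

# 10Be-style example: N in atoms/g, P in atoms/g/yr, half-life ~1.39 Myr.
age = exposure_age(concentration=5.0e4, production_rate=5.0,
                   half_life=1.39e6)
print(round(age))  # roughly N/P ~ 10,000 yr for a young surface
```

    For young surfaces the decay term is negligible and the age reduces to N/P, which is a useful sanity check on the full formula.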

  4. Tiered Systems of Support: Practical Considerations for School Districts. Issue Focus

    ERIC Educational Resources Information Center

    MDRC, 2017

    2017-01-01

    Students learn or progress at their own paces. How can schools make sure that they get the help they need--and only the help they need? Many are turning to multi-tiered systems of support. This brief provides some practical considerations for schools contemplating tiered approaches.

  5. Capacity Development and Multi-Tiered Systems of Support: Guiding Principles

    ERIC Educational Resources Information Center

    Sugai, George; Simonsen, Brandi; Freeman, Jennifer; La Salle, Tamika

    2016-01-01

    Implementation of multi-tiered systems of support is occurring within and across a number of countries with an increased recent focus on the development of local system capacity to maintain high levels of practice implementation fidelity. The purpose of this paper is to describe the importance of local capacity development in the high fidelity and…

  6. Implementing a Multi-Tiered System of Support (MTSS): Collaboration between School Psychologists and Administrators to Promote Systems-Level Change

    ERIC Educational Resources Information Center

    Eagle, John W.; Dowd-Eagle, Shannon E.; Snyder, Andrew; Holtzman, Elizabeth Gibbons

    2015-01-01

    Current educational reform mandates the implementation of school-based models for early identification and intervention, progress monitoring, and data-based assessment of student progress. This article provides an overview of interdisciplinary collaboration for systems-level consultation within a Multi-Tiered System of Support (MTSS) framework.…

  7. Evaluation Brief: Implementation and Outcomes of Kansas Multi-Tier System of Supports: 2011-2014

    ERIC Educational Resources Information Center

    Reedy, Kristen; Lacireno-Paquet, Natalie

    2015-01-01

    States, school districts, and schools across the country are increasingly implementing multi-tier systems of support (MTSS) to improve outcomes for all students. Kansas is no exception. The Kansas MTSS is designed to improve outcomes for all students by instituting system-level change across the classroom, school, district, and state. Such…

  8. Incorporating a Multi-Tiered System of Supports into School Counselor Preparation

    ERIC Educational Resources Information Center

    Sink, Christopher A.

    2016-01-01

    With the advent of a multi-tiered system of supports (MTSS) in schools, counselor preparation programs are once again challenged to further extend the education and training of pre-service and in-service school counselors. To introduce and contextualize this special issue, an MTSS's intent and foci, as well as its theoretical and research…

  9. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Oriented Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching:
    - WMS requests are made for 256x256 tiles for fixed areas and zoom levels;
    - a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles;
    - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache;
    - the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers.
    (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data.
    These are locally rendered as maps or graphs, and combined with the WMS pre-rendered images and text, in a FLEX application, to provide a sophisticated, user-impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database, and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
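
    The tile-caching tier described above can be sketched as a cache keyed by (layer, zoom, x, y): a hit is served directly, and a miss triggers a single WMS render that is then stored for later requests. The names below are illustrative, not the Met Office implementation.

```python
class TileCache:
    def __init__(self, render_tile):
        self.render_tile = render_tile   # falls through to the WMS on a miss
        self.cache = {}
        self.misses = 0

    def get(self, layer, z, x, y):
        key = (layer, z, x, y)
        if key not in self.cache:
            self.misses += 1
            self.cache[key] = self.render_tile(layer, z, x, y)
        return self.cache[key]

# Stand-in for an expensive 256x256 WMS render.
def fake_wms(layer, z, x, y):
    return f"tile:{layer}/{z}/{x}/{y}"

cache = TileCache(fake_wms)
cache.get("rainfall", 6, 31, 21)   # miss: rendered by the WMS
cache.get("rainfall", 6, 31, 21)   # hit: served from the cache
print(cache.misses)  # 1
```

    The edge-server tier repeats the same pattern one level out: pre-cached tiles are served from the CDN, and only new keys reach the tile cache (and, beyond it, the WMS).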

  10. Just healthcare? The moral failure of single-tier basic healthcare.

    PubMed

    Meadowcroft, John

    2015-04-01

    This article sets out the moral failure of single-tier basic healthcare. Single-tier basic healthcare has been advocated on the grounds that the provision of healthcare should be divorced from ability to pay and unequal access to basic healthcare is morally intolerable. However, single-tier basic healthcare encounters a host of catastrophic moral failings. Given the fact of human pluralism it is impossible to objectively define "basic" healthcare. Attempts to provide single-tier healthcare therefore become political processes in which interest groups compete for control of scarce resources with the most privileged possessing an inherent advantage. The focus on outputs in arguments for single-tier provision neglects the question of justice between individuals when some people provide resources for others without reciprocal benefits. The principle that only healthcare that can be provided to everyone should be provided at all leads to a leveling-down problem in which advocates of single-tier provision must prefer a situation where some individuals are made worse-off without any individual being made better-off compared to plausible multi-tier alternatives. Contemporary single-tier systems require the exclusion of noncitizens, meaning that their universalism is a myth. In the light of these pathologies, it is judged that multi-tier healthcare is morally required.

  11. Responsiveness-to-Intervention and School-Wide Positive Behavior Supports: Integration of Multi-Tiered System Approaches

    ERIC Educational Resources Information Center

    Sugai, George; Horner, Robert H.

    2009-01-01

    The Individuals with Disabilities Education Act and No Child Left Behind emphasize the use of scientifically based research to improve outcomes for students. From this emphasis, response-to-intervention has evolved. We present one perspective on the defining features of response-to-intervention and application of those features to school-wide…

  12. Using Brief Experimental Analysis to Intensify Tier 3 Reading Interventions

    ERIC Educational Resources Information Center

    Coolong-Chaffin, Melissa; Wagner, Dana

    2015-01-01

    As implementation of multi-tiered systems of support becomes common practice across the nation, practitioners continue to need strategies for intensifying interventions and supports for the subset of students who fail to make adequate progress despite strong programs at Tiers 1 and 2. Experts recommend making several changes to the structure and…

  13. Handbook of Response to Intervention: The Science and Practice of Multi-Tiered Systems of Support, 2nd Edition

    ERIC Educational Resources Information Center

    Jimerson, Shane R., Ed.; Burns, Matthew K., Ed.; VanDerHeyden, Amanda M., Ed.

    2016-01-01

    The second edition of this essential handbook provides a comprehensive, updated overview of the science that informs best practices for the implementation of response to intervention (RTI) processes within Multi-Tiered Systems of Support (MTSS) to facilitate the academic success of all students. The volume includes insights from leading scholars…

  14. Local Implementation Effectiveness of a Multi-Tier System of Support in Elementary School Settings

    ERIC Educational Resources Information Center

    Houlton, Terry P.

    2017-01-01

    Ensuring all students learn at high levels is demanding. Multi-tier systems of supports (MTSS) has shown promise as a way to promote high levels of learning for all students while catching students who are struggling to learn. However, implementing MTSS models in school districts and schools has seen its challenges. The context of an individual…

  15. Using Direct Behavior Rating--Single Item Scales to Assess Student Behavior within Multi-Tiered Systems of Support

    ERIC Educational Resources Information Center

    Miller, Faith G.; Patwa, Shamim S.; Chafouleas, Sandra M.

    2014-01-01

    An increased emphasis on collecting and using data in schools has occurred, in part, because of the implementation of multi-tiered systems of support (MTSS). Commonly referred to as response to intervention in the academic domain and school-wide positive behavioral interventions and supports in the behavioral domain, these initiatives have a…

  16. Using Response to Intervention/Multi-Tiered Systems of Supports to Promote Social Justice in Schools

    ERIC Educational Resources Information Center

    Avant, Deneca Winfrey

    2016-01-01

    Purpose: The purpose of this study was to explore the use of response to intervention/multi-tiered systems of supports (RtI/MTSS) in promoting social justice in schools. Design/methodology/approach: This study used survey research, using a 32-item questionnaire, and presented results of approximately 200 school social workers (SSWs). Findings:…

  17. Implementation and Outcomes of Kansas Multi-Tier System of Supports: Final Evaluation Report-2014

    ERIC Educational Resources Information Center

    Reedy, Kristen; Lacireno-Paquet, Natalie

    2015-01-01

    Implementation of multi-tier system of supports (MTSS) has grown rapidly in Kansas and is a key strategy for turning around low-performing schools in the state. MTSS is designed to improve outcomes for all students by instituting system-level change across the classroom, school, district, and state. Such systemic change is accomplished by…

  18. The ASCA Model and a Multi-Tiered System of Supports: A Framework to Support Students of Color with Problem Behavior

    ERIC Educational Resources Information Center

    Belser, Christopher T.; Shillingford, M. Ann; Joe, J. Richelle

    2016-01-01

    The American School Counselor Association (ASCA) National Model and a multi-tiered system of supports (MTSS) both provide frameworks for systematically solving problems in schools, including student behavior concerns. The authors outline a model that integrates overlapping elements of the National Model and MTSS as a support for marginalized…

  19. Supporting Teaching and Learning Via the Web: Transforming Hard-Copy Linear Mindsets into Web-Flexible Creative Thinking.

    ERIC Educational Resources Information Center

    Borkowski, Ellen Yu; Henry, David; Larsen, Lida L.; Mateik, Deborah

    This paper describes a four-tiered approach to supporting University of Maryland faculty in the development of instructional materials to be delivered via the World Wide Web. The approach leverages existing equipment and staff by the design of Web posting, editing, and management tools for use on the campus-wide information server,…

  20. Beyond the Pink Sand: Case Studies of Experiences of Multi-Tier System of Supports Implementation in the Bermuda Public School System

    ERIC Educational Resources Information Center

    Francis-Thompson, Nyshawana

    2017-01-01

    This qualitative study examined how Multi-tier System of Supports (MTSS), a systematic approach to providing academic and behavioral supports to students, was implemented and experienced by macro and micro levels of educators in the Bermuda Public School system. I asked three research questions regarding: (a) how MTSS was being implemented in the…

  1. Multi-tiered S-SOA, Parameter-Driven New Islamic Syariah Products of Holistic Islamic Banking System (HiCORE): Virtual Banking Environment

    NASA Astrophysics Data System (ADS)

    Halimah, B. Z.; Azlina, A.; Sembok, T. M.; Sufian, I.; Sharul Azman, M. N.; Azuraliza, A. B.; Zulaiha, A. O.; Nazlia, O.; Salwani, A.; Sanep, A.; Hailani, M. T.; Zaher, M. Z.; Azizah, J.; Nor Faezah, M. Y.; Choo, W. O.; Abdullah, Chew; Sopian, B.

    The Holistic Islamic Banking System (HiCORE) is a banking system suitable for virtual banking environments, created through a university-industry collaboration between Universiti Kebangsaan Malaysia (UKM) and Fuziq Software Sdn Bhd. HiCORE was modeled on a multi-tiered Simple Services-Oriented Architecture (S-SOA), using a parameter-based semantic approach. HiCORE's existence is timely, as the financial world is looking for a new approach to creating banking and financial products that are interest-free, i.e. based on Islamic Syariah principles and jurisprudence. Interest-free banking has currently caught the interest of bankers and financiers all over the world. HiCORE's parameter-based module houses the Customer Information File (CIF), Deposit and Financing components, and represents the third tier of the multi-tiered Simple SOA approach. This paper highlights the multi-tiered, parameter-driven approach to the creation of new Islamic products based on the 'dalil' (Quranic basis), 'syarat' (rules) and 'rukun' (procedures) required by Syariah principles and jurisprudence, as reflected by the semantic ontology embedded in the parameter module of the system.
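
    The parameter-driven idea described above, assembling a new product from parameter records rather than new code and validating it against the required Syariah elements, can be sketched as follows. The field names mirror the abstract ('dalil', 'syarat', 'rukun'); the validation logic itself is an assumption for illustration, not HiCORE's implementation.

```python
# Required elements of a product definition in the parameter module.
REQUIRED_FIELDS = ("name", "dalil", "syarat", "rukun")

def create_product(params):
    """Accept a new product definition only if every required
    Syariah element is supplied."""
    missing = [f for f in REQUIRED_FIELDS if f not in params]
    if missing:
        raise ValueError(f"incomplete product definition: {missing}")
    return dict(params)   # accepted into the parameter module

# Hypothetical parameter record for a new interest-free product.
product = create_product({
    "name": "Mudharabah Savings",
    "dalil": "Quranic basis reference",
    "syarat": ["depositor is the account holder"],
    "rukun": ["offer and acceptance", "capital", "profit-sharing ratio"],
})
print(product["name"])
```

    The point of the design is that launching a new product means adding a parameter record like this one, with the semantic ontology enforcing the required elements, rather than changing application code.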

  2. Development, deployment and operations of ATLAS databases

    NASA Astrophysics Data System (ADS)

    Vaniachine, A. V.; Schmitt, J. G. v. d.

    2008-07-01

    In preparation for ATLAS data taking, a coordinated shift from development towards operations has occurred in ATLAS database activities. In addition to development and commissioning activities in databases, ATLAS is active in the development and deployment (in collaboration with the WLCG 3D project) of the tools that allow the worldwide distribution and installation of databases and related datasets, as well as the actual operation of this system on ATLAS multi-grid infrastructure. We describe development and commissioning of major ATLAS database applications for online and offline. We present the first scalability test results and ramp-up schedule over the initial LHC years of operations towards the nominal year of ATLAS running, when the database storage volumes are expected to reach 6.1 TB for the Tag DB and 1.0 TB for the Conditions DB. ATLAS database applications require robust operational infrastructure for data replication between online and offline at Tier-0, and for the distribution of the offline data to Tier-1 and Tier-2 computing centers. We describe ATLAS experience with Oracle Streams and other technologies for coordinated replication of databases in the framework of the WLCG 3D services.

  3. An Energy-Efficient Multi-Tier Architecture for Fall Detection Using Smartphones.

    PubMed

    Guvensan, M Amac; Kansiz, A Oguz; Camgoz, N Cihan; Turkmen, H Irem; Yavuz, A Gokhan; Karsligil, M Elif

    2017-06-23

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy-efficiency without compromising on the fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented in a mobile application, called uSurvive, for Android smartphones. It runs as a background service, monitors the activities of a person in daily life and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has an accuracy of 93%, which is superior to thresholding methods and better than machine-learning-only solutions.
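
    The energy-saving structure described above, a cheap threshold that runs continuously and wakes a costlier classifier only for candidate events, can be sketched as follows. The thresholds and the toy classifier are assumptions for illustration, not uSurvive's actual tiers.

```python
import math

IMPACT_G = 2.5   # tier 1: acceleration-magnitude threshold (in g)

def magnitude(ax, ay, az):
    return math.sqrt(ax * ax + ay * ay + az * az)

def classify(window):
    # Stand-in for the upper tiers: a real system extracts features and
    # runs a trained model; here a candidate counts as a fall if the
    # impact is followed by near-stillness (post-fall inactivity).
    peak = max(magnitude(*s) for s in window)
    settled = magnitude(*window[-1]) < 1.2
    return peak > IMPACT_G and settled

def detect(window):
    # Tier 1 gate: skip the classifier unless a sample crosses the
    # threshold -- this is where the energy saving comes from.
    if all(magnitude(*s) <= IMPACT_G for s in window):
        return False
    return classify(window)

# Samples are (ax, ay, az) in g: normal walking, then impact + lying still.
walking = [(0.1, 0.9, 0.3)] * 5
fall = [(0.2, 1.0, 0.1), (2.8, 1.5, 0.9), (0.05, 0.95, 0.05)]
print(detect(walking), detect(fall))  # False True
```

    Because most windows never cross the tier-1 threshold, the expensive classification path runs rarely, which is how the hybrid design cuts energy use relative to running the model on every window.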

  4. The Impact of Tier 1 Reading Instruction on Reading Outcomes for Students in Grades 4-12: A Meta-Analysis

    ERIC Educational Resources Information Center

    Swanson, Elizabeth; Stevens, Elizabeth A.; Scammacca, Nancy K.; Capin, Philip; Stewart, Alicia A.; Austin, Christy R.

    2017-01-01

    Understanding the efficacy of evidence-based reading practices delivered in the Tier 1 (i.e. general classroom) setting is critical to successful implementation of multi-tiered systems, meeting a diverse range of student learning needs, and providing high quality reading instruction across content areas. This meta-analysis presents evidence on the…

  5. DBMap: a TreeMap-based framework for data navigation and visualization of brain research registry

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Zhang, Hong; Tjandra, Donny; Wong, Stephen T. C.

    2003-05-01

    The purpose of this study is to investigate and apply a new, intuitive and space-conscious visualization framework to facilitate efficient data presentation and exploration of large-scale data warehouses. We have implemented the DBMap framework for the UCSF Brain Research Registry. Such a utility helps medical specialists and clinical researchers explore and evaluate the many attributes organized in the brain research registry. The current UCSF Brain Research Registry consists of a federation of disease-oriented database modules, including Epilepsy, Brain Tumor, Intracerebral Hemorrhage, and CJD (Creutzfeldt-Jakob disease). These database modules organize large volumes of imaging and non-imaging data to support Web-based clinical research. While the data warehouse supports general information retrieval and analysis, it lacks an effective way to visualize and present the voluminous and complex data stored. This study investigates whether the TreeMap algorithm can be adapted to display and navigate a categorical biomedical data warehouse or registry. TreeMap is a space-constrained graphical representation of large hierarchical data sets, mapped to a matrix of rectangles whose size and color represent database fields of interest. It allows the display of a large amount of numerical and categorical information in the limited real estate of a computer screen with an intuitive user interface. The paper describes DBMap, the proposed data visualization framework for large biomedical databases. Built upon XML, Java and JDBC technologies, the prototype system includes a set of software modules that reside in the application server tier and provide interfaces to the back-end database tier and front-end Web tier of the brain registry.
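    The TreeMap idea can be sketched in a few lines. The following slice-and-dice layout, the simplest TreeMap variant and not necessarily the one DBMap uses (DBMap itself is built on XML, Java and JDBC), tiles a rectangle so that tile areas are proportional to node weights:

```python
# Minimal slice-and-dice treemap layout: split a rectangle into tiles
# whose areas are proportional to the given weights. Illustrative only.

def treemap(weights, x, y, w, h, horizontal=True):
    """Return one (x, y, w, h) tile per weight, tiling the rectangle."""
    total = float(sum(weights))
    tiles = []
    offset = 0.0
    for wt in weights:
        frac = wt / total
        if horizontal:                      # split along the x-axis
            tiles.append((x + offset, y, w * frac, h))
            offset += w * frac
        else:                               # split along the y-axis
            tiles.append((x, y + offset, w, h * frac))
            offset += h * frac
    return tiles

# Three database categories weighted 4:2:2 over a 100x50 canvas.
tiles = treemap([4, 2, 2], 0, 0, 100, 50)
```

    A full TreeMap recurses: each tile is subdivided again with the split axis alternated, which is how hierarchical registry fields map to nested rectangles.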

  6. Enhancing efficiency and quality of statistical estimation of immunogenicity assay cut points through standardization and automation.

    PubMed

    Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don

    2015-10-01

    Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibody (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at a click of the button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.
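    The kind of cut-point computation that tools like CAST standardize can be illustrated with the common parametric screening rule: mean plus 1.645 standard deviations of drug-naive sample signals, targeting a 5% false-positive rate. The data and the simplification below are ours, not the paper's; CAST's validated algorithms are more elaborate (outlier removal, transformation, mixed-effects models):

```python
# Hedged sketch of a parametric ADA screening cut point. The z-value
# 1.645 targets a 5% false-positive rate among drug-naive samples.
import statistics

def screening_cut_point(signals, z=1.645):
    mean = statistics.mean(signals)
    sd = statistics.stdev(signals)   # sample standard deviation
    return mean + z * sd

# Hypothetical drug-naive assay signals (arbitrary units).
naive_signals = [0.92, 1.05, 0.98, 1.10, 0.95, 1.02, 0.99, 1.04]
cp = screening_cut_point(naive_signals)
# Samples with a signal above `cp` are flagged as potentially positive.
```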

  7. Application of a three-tier framework to assess ecological condition of Gulf of Mexico coastal wetlands.

    PubMed

    Nestlerode, Janet A; Hansen, Virginia D; Teague, Aarin; Harwell, Matthew C

    2014-06-01

    A multi-level coastal wetland assessment strategy was applied to wetlands in the northern Gulf of Mexico (GOM) to evaluate the feasibility of this approach for a broad national-scale wetland condition assessment (US Environmental Protection Agency's National Wetlands Condition Assessment). Landscape-scale assessment indicators (tier 1) were developed and applied at the sub-watershed (12-digit hydrologic unit code (HUC)) level within the GOM coastal wetland sample frame, with scores calculated using land-use maps and a geographic information system. Rapid assessment protocols (tier 2), using a combination of data analysis and field work, evaluated metrics associated with landscape context, hydrology, physical structure, and biological structure. Intensive site monitoring (tier 3) included measures of soil chemistry and composition, water column and pore-water chemistry, and dominant macrophyte community composition and tissue chemistry. Relationships within and among assessment levels were evaluated using multivariate analyses, with few significant correlations found. More detailed measures of hydrology, soils, and macrophyte species composition from sites across a known condition gradient, in conjunction with validation of a standardized rapid assessment method, may be necessary to fully characterize coastal wetlands across the region.

  8. CMS Readiness for Multi-Core Workload Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  9. CMS readiness for multi-core workload scheduling

    NASA Astrophysics Data System (ADS)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.; Aftab Khan, F.; Letts, J.; Mason, D.; Verguilov, V.

    2017-10-01

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.
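    The partitionable-pilot idea can be sketched as first-fit carving of dynamic slots out of one pilot's cores. In production this is delegated to HTCondor partitionable slots; the job names and core counts below are hypothetical:

```python
# Illustrative sketch of mixing single-core and multi-core jobs on one
# multi-slot partitionable pilot (first-fit; the real scheduling is
# done by HTCondor/GlideinWMS, not by code like this).

def schedule(total_cores, job_queue):
    """Carve dynamic slots out of one pilot: first-fit over the queue."""
    free = total_cores
    running, pending = [], []
    for name, cores in job_queue:
        if cores <= free:
            free -= cores
            running.append(name)
        else:
            pending.append(name)
    return running, pending, free

jobs = [("reco-8core", 8), ("sim-1core", 1), ("sim-1core-b", 1),
        ("reco-8core-b", 8), ("sim-1core-c", 1)]
running, pending, idle = schedule(16, jobs)
# On a 16-core pilot the first 8-core job and all single-core jobs start;
# the second 8-core job must wait for cores to drain.
```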

  10. Bellerophon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Messer, II, Otis E

    2017-01-02

    The Bellerophon software system supports CHIMERA, a production-level HPC application that simulates the evolution of core-collapse supernovae. Bellerophon enables CHIMERA's geographically dispersed team of collaborators to perform job monitoring and real-time data analysis from multiple supercomputing resources, including platforms at OLCF, NERSC, and NICS. Its multi-tier architecture provides an encapsulated, end-to-end software solution that enables the CHIMERA team to quickly and easily access highly customizable animated and static views of results from anywhere in the world via a cross-platform desktop application.

  11. Multi-Tiered Systems of Supports: An Investigative Study of Their Impact on Third Grade Reading Test Scores in an Urban District

    ERIC Educational Resources Information Center

    Haynes, Heather A.

    2012-01-01

    This study analyzed the impact of implementing response to intervention (RTI), a three-tiered system of intervention of increasing intensity, in this case for reading, schoolwide in 32 elementary schools. When a three-tiered framework is applied schoolwide, with all students and addressing academic and/or behavioral curricular instruction, it is…

  12. The Development of a Web-Based Assessment System to Identify Students' Misconception Automatically on Linear Kinematics with a Four-Tier Instrument Test

    ERIC Educational Resources Information Center

    Pujayanto, Pujayanto; Budiharti, Rini; Adhitama, Egy; Nuraini, Niken Rizky Amalia; Putri, Hanung Vernanda

    2018-01-01

    This research proposes the development of a web-based assessment system to identify students' misconception. The system, named WAS (web-based assessment system), can identify students' misconception profile on linear kinematics automatically after the student has finished the test. The test instrument was developed and validated. Items were…

  13. Efficient provisioning for multi-core applications with LSF

    NASA Astrophysics Data System (ADS)

    Dal Pra, Stefano

    2015-12-01

    Tier-1 sites providing computing power for HEP experiments are usually designed for high-throughput performance. This is pursued by reducing the variety of supported use cases and tuning for those that remain, the most important of which has been single-core jobs. Moreover, the usual workload state is saturation: each available core in the farm is in use and queued jobs are waiting for their turn to run. Enabling multi-core jobs thus requires dedicating a number of hosts to them and waiting for those hosts to free the needed number of cores. This drain time introduces a loss of computing power driven by the number of unusable empty cores. As demand for multi-core-capable resources grew, a Task Force was constituted in WLCG with the goal of defining a simple and efficient multi-core resource provisioning model. This paper details the work done at the INFN Tier-1 to enable multi-core support in the LSF batch system, with the intent of minimizing the average number of unused cores. The adopted strategy dedicates to multi-core jobs a dynamic set of nodes, whose size is driven mainly by the number of pending multi-core requests and the fair-share priority of the submitting users. The node status transition, from single-core to multi-core and vice versa, is driven by a finite state machine implemented in a custom multi-core director script running in the cluster. After describing and motivating the implementation and the details specific to the LSF batch system, performance results are reported. Factors with positive and negative impact on overall efficiency are discussed, and solutions to mitigate the negative ones are proposed.
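    The node-transition finite state machine can be sketched as a transition table. The states, events, and director loop below are illustrative, not the INFN director script itself:

```python
# Hedged sketch of a single-core <-> multi-core node state machine of
# the kind described above. State and event names are hypothetical.

TRANSITIONS = {
    ("singlecore",  "demand_high"): "draining",    # stop accepting 1-core jobs
    ("draining",    "cores_free"):  "multicore",   # enough cores have drained
    ("multicore",   "demand_low"):  "draining_mc", # multi-core demand dropped
    ("draining_mc", "cores_free"):  "singlecore",  # return node to 1-core pool
}

def step(state, event):
    # Events irrelevant in the current state leave the node unchanged.
    return TRANSITIONS.get((state, event), state)

state = "singlecore"
for event in ["demand_high", "cores_free", "demand_low", "cores_free"]:
    state = step(state, event)
# After a full duty cycle the node is back in the single-core pool.
```

    Keeping the transition logic in one table makes the drain policy easy to audit and to tune (e.g. how aggressively nodes are reclaimed when multi-core demand drops).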

  14. Progress toward a Semantic eScience Framework; building on advanced cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    McGuinness, D. L.; Fox, P. A.; West, P.; Rozell, E.; Zednik, S.; Chang, C.

    2010-12-01

    Development of the configurable and extensible semantic eScience framework (SESF) has begun, with several semantic application components already implemented. Extensions and improvements to several ontologies have been made based on distinct interdisciplinary use cases ranging from solar physics to biological and chemical oceanography. Importantly, these semantic representations mediate access to a diverse set of existing and emerging cyberinfrastructure. Among the advances is the population of triple stores with web-accessible query services. A triple store is akin to a relational data store where the basic stored unit is a subject-predicate-object tuple. Query access is provided by SPARQL, a W3C Recommendation language specification. Upon this middle tier of semantic cyberinfrastructure, we have developed several forms of semantic faceted search, including provenance-awareness. We report on the rapid advances in semantic technologies and tools and on how we are sustaining the software path for the required technical advances, as well as the ontology improvements and increased functionality of the semantic applications, including how they are integrated into web-based portals (e.g. Drupal) and web services. Lastly, we indicate future work directions and opportunities for collaboration.
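    The triple-store model can be illustrated with plain tuples: a SPARQL basic graph pattern is, at its core, tuple matching with variables. This stdlib-only sketch uses made-up oceanography triples; a real deployment would query a SPARQL endpoint instead:

```python
# A triple store holds subject-predicate-object tuples. Matching one
# SPARQL-style pattern, where strings starting with '?' are variables,
# reduces to elementwise comparison with variable binding.

triples = {
    ("chlorophyll", "measuredBy", "fluorometer"),
    ("chlorophyll", "unit", "mg/m3"),
    ("salinity", "measuredBy", "CTD"),
}

def match(pattern, store):
    """Return variable bindings for each triple matching (s, p, o)."""
    results = []
    for s, p, o in store:
        binding = {}
        for term, value in zip(pattern, (s, p, o)):
            if term.startswith("?"):
                binding[term] = value       # bind the variable
            elif term != value:
                break                       # constant mismatch
        else:
            results.append(binding)
    return results

# Analogous to: SELECT ?var ?inst WHERE { ?var :measuredBy ?inst }
rows = match(("?var", "measuredBy", "?inst"), triples)
```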

  15. Genetic and economic benefits of selection based on performance recording and genotyping in lower tiers of multi-tiered sheep breeding schemes.

    PubMed

    Santos, Bruno F S; van der Werf, Julius H J; Gibson, John P; Byrne, Timothy J; Amer, Peter R

    2017-01-17

    Performance recording and genotyping in the multiplier tier of multi-tiered sheep breeding schemes could potentially reduce the difference in average genetic merit between nucleus and commercial flocks, and create additional economic benefits for the breeding structure. The genetic change in a multiple-trait breeding objective was predicted for various selection strategies that included performance recording, parentage testing and genomic selection. A deterministic simulation model was used to predict selection differentials and the flow of genetic superiority through the different tiers. Cumulative discounted economic benefits were calculated based on trait gains achieved in each of the tiers, considering the extra revenue and associated costs of applying recording, genotyping and selection practices in the multiplier tier of the breeding scheme. Performance recording combined with genomic or parentage information in the multiplier tier reduced the genetic lag between the nucleus and commercial flocks by 2 to 3 years. The overall economic benefits of improved performance in the commercial tier offset the costs of recording in the multiplier. However, it took more than 18 years before the cumulative net present value of benefits offset the costs at current test prices. Strategies in which recorded multiplier ewes were selected as replacements for the nucleus flock modestly increased profitability when compared to a closed nucleus structure. Applying genomic selection is the most beneficial strategy if testing costs can be reduced or if only a proportion of the selection candidates is genotyped. When the cost of genotyping was reduced, scenarios that combined performance recording with genomic selection were more profitable and reached the breakeven point about 10 years earlier. Economic benefits can be generated in multiplier flocks by implementing performance recording in conjunction with either DNA pedigree recording or genomic technology. These recording practices reduce the long genetic lag between the nucleus and commercial flocks in multi-tiered breeding programs. Under current genotyping costs, the time to breakeven was found to be generally very long, although this varied between strategies. Strategies using either genomic selection or DNA pedigree verification were found to be economically viable in the long term, provided the price paid for the tests is lower than current prices.
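    The breakeven reasoning can be sketched as a cumulative discounted cash flow in which benefits lag the up-front recording and genotyping costs by the years it takes genetic gain to flow down the tiers. All figures below are hypothetical, not the paper's:

```python
# Hedged sketch of a breakeven calculation: annual costs start at once,
# annual benefits only after a genetic-lag delay. All numbers invented.

def breakeven_year(annual_benefit, annual_cost, rate, lag_years, horizon=30):
    """First year in which cumulative discounted net benefit turns positive."""
    npv = 0.0
    for year in range(1, horizon + 1):
        benefit = annual_benefit if year > lag_years else 0.0
        npv += (benefit - annual_cost) / (1.0 + rate) ** year
        if npv >= 0.0:
            return year
    return None  # never breaks even within the horizon

# Cheaper genotyping (lower annual cost) pulls the breakeven point earlier,
# in the spirit of the scenario comparisons above.
baseline = breakeven_year(25.0, 10.0, 0.07, lag_years=4)
cheaper = breakeven_year(25.0, 5.0, 0.07, lag_years=4)
```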

  16. Benchmarking multimedia performance

    NASA Astrophysics Data System (ADS)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital audio or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured, and the system is classified accordingly. At the next step the performance of the system is measured. Many multimedia applications, such as DVD playback, need to run at a specific rate; in this case the measurement of excess processing power makes all the difference. All of this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of these problems are presented and analyzed.

  17. The Brief Classroom Interaction Observation-Revised: An Observation System to Inform and Increase Teacher Use of Universal Classroom Management Practices

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Stormont, Melissa; Herman, Keith C.; Wachsmuth, Sean; Newcomer, Lori

    2015-01-01

    Schools are increasingly using multi-tiered prevention models to address the academic and behavior needs of students. The foundation of these models is the implementation of universal, or Tier 1, practices designed to support the academic and behavioral needs of the vast majority of students. To support teachers in the use of effective Tier 1…

  18. Welfare of organic laying hens kept at different indoor stocking densities in a multi-tier aviary system. II: live weight, health measures and perching.

    PubMed

    Steenfeldt, S; Nielsen, B L

    2015-09-01

    Multi-tier aviary systems, where conveyor belts below the tiers remove the manure at regular intervals, are becoming more common in organic egg production. The area on the tiers can be included in the net area available to the hens (also referred to as usable area) when calculating maximum indoor stocking densities in organic systems within the EU. In this article, results on live weight, health measures and perching are reported for organic laying hens housed in a multi-tier system with permanent access to a veranda and kept at stocking densities (D) of 6, 9 and 12 hens/m² available floor area, with concomitant increases in the number of hens per trough, drinker, perch and nest space. In a fourth treatment, access to the top tier was blocked, reducing vertical, trough and perch access at the lowest stocking density (D6x). In all aspects other than stocking density, the experiment followed the EU regulations on the keeping of organic laying hens. Hen live weight, mortality and foot health were not affected by the stocking densities used in the present study. Other variables (plumage condition, presence of breast redness and blisters, pecked tail feathers, and perch use) were indirectly affected by the increase in stocking density through the simultaneous reduction in access to other resources, mainly perches and troughs. The welfare of the hens was mostly affected by these associated constraints, despite all of them being within the allowed minimum requirements for organic production in the EU. Although the welfare consequences reported here were assessed to be moderate to minor, it is important to take into account concurrent constraints on access to other resources when higher stocking densities are used in organic production.

  19. An Energy-Efficient Multi-Tier Architecture for Fall Detection on Smartphones

    PubMed Central

    Guvensan, M. Amac; Kansiz, A. Oguz; Camgoz, N. Cihan; Turkmen, H. Irem; Yavuz, A. Gokhan; Karsligil, M. Elif

    2017-01-01

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those that run 24/7 in the background, is essential for longer smartphone use. To improve energy efficiency without compromising fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented in a mobile application, called uSurvive, for Android smartphones. It runs as a background service, monitors a person's activities in daily life, and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction can save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has 93% accuracy, which is superior to thresholding methods and better than machine-learning-only solutions. PMID:28644378
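    The 3-tier idea can be sketched as a cheap always-on threshold gate that wakes the costlier feature-extraction and classification stages only on candidate impacts. The thresholds, features, and stand-in classifier below are illustrative, not uSurvive's actual pipeline:

```python
# Hedged sketch of tiered fall detection: tier 1 is a cheap threshold
# on acceleration magnitude; tiers 2-3 run only for candidates, which
# is where the energy saving comes from. All parameters are invented.

def tier1_threshold(accel_magnitude, g=9.8, factor=2.5):
    """Always-on gate: a large acceleration spike marks a candidate fall."""
    return accel_magnitude > factor * g

def tier2_features(window):
    """Reduced feature set, computed only for tier-1 candidates."""
    return {"peak": max(window), "range": max(window) - min(window)}

def tier3_classify(features):
    """Stand-in for the ML classifier invoked on tier-2 features."""
    return features["peak"] > 24.0 and features["range"] > 15.0

window = [9.8, 9.9, 30.1, 3.2, 9.7]   # simulated |accel| samples (m/s^2)
is_fall = tier1_threshold(max(window)) and tier3_classify(tier2_features(window))
```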

  20. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    PubMed Central

    Camargo, João; Rochol, Juergen; Gerla, Mario

    2018-01-01

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. We also evaluate such a service migration for video services. Finally, we present potential research challenges and trends. PMID:29364172

  1. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.

    PubMed

    Rosário, Denis; Schimuneck, Matias; Camargo, João; Nobre, Jéferson; Both, Cristiano; Rochol, Juergen; Gerla, Mario

    2018-01-24

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. We also evaluate such a service migration for video services. Finally, we present potential research challenges and trends.
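    A minimal sketch of the kind of migration trigger discussed above: place the video service on a fog node once local request volume justifies the edge copy. The threshold and latency figures are hypothetical, not the paper's:

```python
# Illustrative cloud-vs-fog placement rule driven by request patterns.
# Thresholds and latencies are invented for the sketch.

def placement(requests_per_min, cloud_latency_ms=120, fog_latency_ms=20,
              migration_threshold=50):
    """Return (tier, expected latency) for the current request rate."""
    if requests_per_min >= migration_threshold:
        return "fog", fog_latency_ms    # enough local demand: migrate to edge
    return "cloud", cloud_latency_ms    # serve from the central cloud

tier, latency = placement(80)
# A multi-tier fog would apply the same idea per tier, pushing the
# service closer to users as demand concentrates.
```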

  2. Expanding the role of unattended ground sensors to multi-tiered systems

    NASA Astrophysics Data System (ADS)

    Garrison, David R., II

    2009-05-01

    Unattended Ground Sensors (UGS) have recently gained momentum in surveillance and protection applications. Many of these Unattended Ground Sensors are deployed in current operations today across the Department of Defense (DoD) and Department of Homeland Security (DHS). In addition to UGS needs, there is a growing desire to leverage existing UGS for incorporation into higher level systems for a broadening role in defense and homeland security applications. The architecture to achieve this goal and examples of non-traditional scenarios that leverage higher level systems are discussed in this paper.

  3. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  4. DOTAGWA: A CASE STUDY IN WEB-BASED ARCHITECTURES FOR CONNECTING SURFACE WATER MODELS TO SPATIALLY ENABLED WEB APPLICATIONS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...

  5. An IT-enabled supply chain model: a simulation study

    NASA Astrophysics Data System (ADS)

    Cannella, Salvatore; Framinan, Jose M.; Barbosa-Póvoa, Ana

    2014-11-01

    During the last decades, supply chain collaboration practices and the underlying enabling technologies have evolved from the classical electronic data interchange (EDI) approach to web-based and radio frequency identification (RFID)-enabled collaboration. In this field, most of the literature has focused on the study of optimal parameters for reducing the total cost of suppliers by adopting operational research (OR) techniques. Herein we are interested in showing that the considered information technology (IT)-enabled structure is resilient, that is, it works well across a reasonably broad range of parameter settings. Adopting a methodological approach based on system dynamics, we study a multi-tier collaborative supply chain. Results show that the IT-enabled supply chain improves operational performance and customer service level. Nonetheless, benefits for geographically dispersed networks are smaller.
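    The effect of information sharing that such simulations study can be caricatured with a toy order-propagation model: without sharing, each tier reacts to (and amplifies) its downstream neighbour's orders; with EDI/RFID-style sharing, every tier sees end-customer demand directly. This is a crude sketch, not the paper's system-dynamics model, and the amplification factor is invented:

```python
# Toy bullwhip sketch: order distortion grows tier by tier without
# demand-information sharing. Parameters are illustrative only.

def orders_up_the_chain(customer_demand, tiers=3, amplification=1.2,
                        shared=False):
    """Order quantity placed by each tier for one period, downstream first."""
    orders = []
    signal = customer_demand
    for _ in range(tiers):
        # With sharing, every tier orders against true end-customer demand;
        # without it, each tier only sees (and amplifies) downstream orders.
        order = customer_demand if shared else signal * amplification
        orders.append(round(order, 2))
        signal = order
    return orders

no_sharing = orders_up_the_chain(100)        # orders grow tier by tier
with_sharing = orders_up_the_chain(100, shared=True)  # distortion damped
```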

  6. Supporting Families to Support Students

    ERIC Educational Resources Information Center

    Kelly, John; Rossen, Eric; Cowan, Katherine C.

    2018-01-01

    Collaboration between students' families and the school is an essential component to promoting student mental and behavioral health. Many schools structure their mental health services using a Multi-Tiered System of Supports that offers three different tiers of support from universal supports to personalized help for students with serious…

  7. Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools.

    PubMed

    Bruns, Eric J; Duong, Mylien T; Lyon, Aaron R; Pullmann, Michael D; Cook, Clayton R; Cheney, Douglas; McCauley, Elizabeth

    2016-03-01

    The education sector offers compelling opportunities to address the shortcomings of traditional mental health delivery systems and to prevent and treat youth mental, emotional, and behavioral (MEB) problems. Recognizing that social and emotional wellness is intrinsically related to academic success, schools are moving to adopt multi-tier frameworks based on the public health model that provide a continuum of services to all children, including services to address both academic and MEB problems. In this article, we review the potential value of multi-tier frameworks in facilitating access to, and increasing the effectiveness of, mental health services in schools, and review the empirical support for school-based mental health interventions by tier. We go on to describe a community-academic partnership between the Seattle Public Schools and the University of Washington School Mental Health Assessment, Research, and Training (SMART) Center that exemplifies how multi-tier educational frameworks, research and evidence, and purposeful collaboration can combine to improve development and implementation of a range of school-based strategies focused on MEB needs of students. Finally, we present a set of 10 recommendations that may help guide other research and practice improvement efforts to address MEB problems in youth through effective school mental health programming. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Fostering SMART Partnerships to Develop an Effective Continuum of Behavioral Health Services and Supports in Schools

    PubMed Central

    Bruns, Eric J.; Duong, Mylien T.; Lyon, Aaron R.; Pullmann, Michael D.; Cook, Clayton R.; Cheney, Douglas; McCauley, Elizabeth

    2015-01-01

    The education sector offers compelling opportunities to address the shortcomings of traditional mental health delivery systems and to prevent and treat youth mental, emotional, and behavioral (MEB) problems. Recognizing that social and emotional wellness is intrinsically related to academic success, schools are moving to adopt multi-tier frameworks based on the public health model that provide a continuum of services to all children, including services to address both academic and MEB problems. In this paper, we review the potential value of multi-tier frameworks in facilitating access to, and increasing the effectiveness of, mental health services in schools and review the empirical support for school-based mental health interventions by tier. We go on to describe a community-academic partnership between the Seattle Public Schools and the University of Washington School Mental Health Assessment, Research, and Training (SMART) Center that exemplifies how multi-tier educational frameworks, research and evidence, and purposeful collaboration can combine to improve development and implementation of a range of school-based strategies focused on MEB needs of students. Finally, we present a set of 10 recommendations that may help guide other research and practice improvement efforts to address MEB problems in youth through effective school mental health programming. PMID:26963185

  9. A Collaborative Planning Framework for Teachers Implementing Tiered Instruction

    ERIC Educational Resources Information Center

    Stuart, Shannon K.; Rinaldi, Claudia

    2009-01-01

    The recent reauthorization and regulations of the Individuals With Disabilities Education Improvement Act (IDEA 2004) encourage the use of school-wide interventions including response to intervention (RTI; Bradley, Danielson, & Doolittle, 2007). RTI refers to a multi-tiered system that addresses the academic needs of all students by using…

  10. visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.

    PubMed

    Scales, Matthew; Jäger, Roland; Migliorini, Gabriele; Houlston, Richard S; Henrion, Marc Y R

    2014-01-01

    We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed in mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos requires significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g. association, functional annotation, intensity, interaction and heat map data, among others). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or combined in a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be zoomed in on. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).

  11. Towards a Ubiquitous User Model for Profile Sharing and Reuse

    PubMed Central

    de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil

    2012-01-01

    People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995
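
    The two-tier alignment strategy can be illustrated with a small sketch. This is a hypothetical stand-in, not the authors' algorithm: tier 1 matches concepts whose normalized labels are identical (syntactic heterogeneity), while tier 2 falls back to string similarity as a crude proxy for the semantic matching step; the 0.8 threshold is an arbitrary assumption.

```python
from difflib import SequenceMatcher

def normalize(term):
    """Lowercase and strip punctuation for syntactic comparison."""
    return "".join(ch for ch in term.lower() if ch.isalnum() or ch == " ").strip()

def two_tier_match(source_concepts, target_concepts, fuzzy_threshold=0.8):
    """Tier 1: exact match on normalized labels.
    Tier 2: fuzzy similarity for the concepts tier 1 could not align
    (a rough stand-in for semantic alignment)."""
    alignments = {}
    unmatched = []
    targets = {normalize(t): t for t in target_concepts}
    for src in source_concepts:
        key = normalize(src)
        if key in targets:                      # tier 1: exact
            alignments[src] = targets[key]
        else:
            unmatched.append(src)
    for src in unmatched:                       # tier 2: fuzzy
        best, best_score = None, 0.0
        for tkey, torig in targets.items():
            score = SequenceMatcher(None, normalize(src), tkey).ratio()
            if score > best_score:
                best, best_score = torig, score
        if best_score >= fuzzy_threshold:
            alignments[src] = best
    return alignments
```

    For instance, a profile exporting "Body Weight" aligns with a target schema's "body weight" in tier 1, whereas "food-preference" versus "Food Preference" only resolves in tier 2.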

  12. A multi-tiered approach to addressing the mental health issues surrounding obesity in children and youth.

    PubMed

    Bazyk, Susan; Winne, Rebecca

    2013-04-01

    Obesity in children and youth is a major public health concern known to have a significant impact on physical and mental health. Although traditional approaches to obesity have emphasized diet and exercise at the individual level, broader attention to the mental health consequences of obesity is crucial. Individuals who are obese live in a world where they are often less accepted, resulting in social exclusion and discrimination. A public health multi-tiered approach to obesity focusing on mental health promotion, prevention, and individualized intervention is presented.

  13. A Multi-Tiered Approach for Building Capacity in Hydrologic Modeling for Water Resource Management in Developing Regions

    NASA Astrophysics Data System (ADS)

    Markert, K. N.; Limaye, A. S.; Rushi, B. R.; Adams, E. C.; Anderson, E.; Ellenburg, W. L.; Mithieu, F.; Griffin, R.

    2017-12-01

    Water resource management is the process by which governments, businesses and/or individuals reach and implement decisions that are intended to address the future quantity and/or quality of water for societal benefit. The implementation of water resource management typically requires the understanding of the quantity and/or timing of a variety of hydrologic variables (e.g. discharge, soil moisture and evapotranspiration). Often these variables are simulated using hydrologic models, particularly in data-sparse regions. However, there are several large barriers to entry in learning how to use models, applying best practices during the modeling process, and selecting and understanding the most appropriate model for diverse applications. This presentation focuses on a multi-tiered approach to bring state-of-the-art hydrologic modeling capabilities and methods to developing regions through the SERVIR program, a joint NASA and USAID initiative that builds capacity of regional partners and their end users on the use of Earth observations for environmental decision making. The first tier is a series of trainings on the use of multiple hydrologic models, including the Variable Infiltration Capacity (VIC) and Ensemble Framework For Flash Flood Forecasting (EF5), which focus on model concepts and steps to successfully implement the models. We present a case study for this in a pilot area, the Nyando Basin in Kenya. The second tier is focused on building a community of practice on applied hydrologic modeling aimed at creating a support network for hydrologists in SERVIR regions and promoting best practices. The third tier is a hydrologic inter-comparison project under development in the SERVIR regions. The objective of this step is to understand model performance under specific decision-making scenarios, and to share knowledge among hydrologists in SERVIR regions.
The results of these efforts include computer programs, training materials, and new scientific understanding, all of which are shared in an open and collaborative environment for transparency and subsequent capacity building in SERVIR regions and beyond. The outcome of this work is increased awareness and capacity on the use of hydrologic models in developing regions to support water resource management and water security.

  14. Helping Children with Emotional Difficulties: A Response to Intervention Investigation

    ERIC Educational Resources Information Center

    Pearce, Lee R.

    2009-01-01

    This article describes a Response to Intervention (RTI) model of service delivery implemented within a rural elementary school for students in kindergarten through fifth grade experiencing significant emotional and behavioral difficulties. A multi-tiered model is presented that includes school wide interventions in Tier 1, as well as a six…

  15. Implementing Positive Behavior Support in Preschools: An Exploratory Study of CW-FIT Tier 1

    ERIC Educational Resources Information Center

    Jolstead, Krystine A.; Caldarella, Paul; Hansen, Blake; Korth, Byran B.; Williams, Leslie; Kamps, Debra

    2017-01-01

    Challenging behavior in preschool is a serious concern for teachers. Positive behavior interventions and supports (PBIS) have been shown to be effective in reducing such behaviors. Class-Wide Function-Related Intervention Teams (CW-FIT) is a specific multi-tiered intervention for implementing effective classroom management strategies using PBIS…

  16. Implementing Positive Behavior Support in Preschools: An Exploratory Study of CW-FIT Tier 1

    ERIC Educational Resources Information Center

    Jolstead, Krystine A.; Caldarella, Paul; Hansen, Blake D.; Korth, Byran B.; Williams, Leslie; Kamps, Debra M.

    2017-01-01

    Challenging behavior in preschool is a serious concern for teachers. Positive behavior interventions and supports (PBIS) has been shown to be effective in reducing such behaviors. Class-Wide Function-Related Intervention Teams (CW-FIT) is a specific multi-tiered intervention for implementing effective classroom management strategies using PBIS…

  17. Numbered Heads Together as a Tier 1 Instructional Strategy in Multitiered Systems of Support

    ERIC Educational Resources Information Center

    Hunter, William C.; Maheady, Lawrence; Jasper, Andrea D.; Williamson, Robert L.; Murley, Renee C.; Stratton, Elizabeth

    2015-01-01

    Federal mandates (Individuals with Disabilities Education Improvement Act, 2004; No Child Left Behind Act, 2001) require teachers to accommodate students with more diverse academic and behavioral needs in inclusive general educational settings. To assist general educators in meeting this instructional challenge, multi-tiered systems of support…

  18. Web-based access to near real-time and archived high-density time-series data: cyber infrastructure challenges & developments in the open-source Waveform Server

    NASA Astrophysics Data System (ADS)

    Reyes, J. C.; Vernon, F. L.; Newman, R. L.; Steidl, J. H.

    2010-12-01

    The Waveform Server is an interactive web-based interface to multi-station, multi-sensor and multi-channel high-density time-series data stored in Center for Seismic Studies (CSS) 3.0 schema relational databases (Newman et al., 2009). In the last twelve months, based on expanded specifications and current user feedback, both the server-side infrastructure and client-side interface have been extensively rewritten. The Python Twisted server-side code-base has been fundamentally modified to now present waveform data stored in cluster-based databases using a multi-threaded architecture, in addition to supporting the pre-existing single database model. This allows interactive web-based access to high-density (broadband @ 40Hz to strong motion @ 200Hz) waveform data that can span multiple years, the common lifetime of broadband seismic networks. The client-side interface expands on its use of simple JSON-based AJAX queries to now incorporate a variety of User Interface (UI) improvements including standardized calendars for defining time ranges, applying on-the-fly data calibration to display SI-unit data, and increased rendering speed. This presentation will outline the various cyber infrastructure challenges we have faced while developing this application, the use-cases currently in existence, and the limitations of web-based application development.
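
    The on-the-fly calibration to SI units mentioned above can be sketched in a few lines. The function name is hypothetical, and it assumes the CSS 3.0 convention of a per-channel calibration factor in nanometers per digitizer count; a full implementation would instead deconvolve the instrument response in the frequency domain.

```python
def counts_to_si(raw_counts, calib):
    """Convert raw digitizer counts to ground motion in SI units (meters):
    apply the per-channel calibration factor (nm/count, as in the CSS 3.0
    wfdisc 'calib' field), then scale nanometers to meters."""
    NM_PER_M = 1e9
    return [c * calib / NM_PER_M for c in raw_counts]
```

    A client displaying "SI-unit data" would apply this per channel before rendering, keeping the raw counts untouched in the database.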

  19. Solar-Terrestrial Ontology Development

    NASA Astrophysics Data System (ADS)

    McGuinness, D.; Fox, P.; Middleton, D.; Garcia, J.; Cinquni, L.; West, P.; Darnell, J. A.; Benedict, J.

    2005-12-01

    The development of an interdisciplinary virtual observatory (the Virtual Solar-Terrestrial Observatory; VSTO) as a scalable environment for searching, integrating, and analyzing databases distributed over the Internet requires a higher level of semantic interoperability than heretofore required by most (if not all) distributed data systems or discipline-specific virtual observatories. The formalization of semantics using ontologies and their encodings for the internet (e.g. OWL - the Web Ontology Language), as well as the use of accompanying tools, such as reasoning, inference and explanation, open up both a substantial leap in interoperability options and a need for formal development principles to guide ontology development and use within modern, multi-tiered network data environments. In this presentation, we outline the formal methodologies we utilize in the VSTO project, the currently developed use-cases, and ontologies and their relation to existing ontologies (such as SWEET).

  20. A Web-Based Information System for Field Data Management

    NASA Astrophysics Data System (ADS)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript implement the user interface in the front-end tier, the Apache web server runs PHP scripts in the middle tier, and a MySQL server provides the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and populate them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into XML format that is both human-readable and machine-readable, and thus ready for reuse.
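
    The XML export step described above can be sketched as follows; the element names (fieldData, record) are illustrative placeholders, not the system's actual schema.

```python
import xml.etree.ElementTree as ET

def records_to_xml(records):
    """Serialize field-data records (dicts) into an XML document that is
    both human- and machine-readable, mirroring the sharing/export step.
    Each record becomes a <record> element keyed by its id attribute."""
    root = ET.Element("fieldData")
    for rec in records:
        node = ET.SubElement(root, "record", id=str(rec["id"]))
        for key, value in rec.items():
            if key == "id":
                continue
            ET.SubElement(node, key).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

    The resulting document can be parsed back by any XML-aware tool, which is what makes the exported field data "ready for reuse".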

  1. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

    The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas of several soil threats including landslides using a coherent and compatible approach based on the use of common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as of high susceptibility by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at European scale ("Tier 1"), a number of thematic terrain and environmental data sets already available for the whole of Europe can be used as input for a continental scale susceptibility model. However, since no coherent landslide inventory data is available at the moment over the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equally weighted combination of a reduced, continent-wide common dataset of landslide conditioning factors including soil parent material, slope angle and land cover, to derive a landslide susceptibility index using raster mapping units consisting of 1 x 1 km pixels. A preliminary European-wide susceptibility map has thus been produced at 1:1 Million scale, since this is compatible with that of the datasets used. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary and the United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model reveals a relatively good performance in many European regions at a 1:1 Million scale.
An additional "Tier 1" susceptibility map at the same scale and using the same or equivalent thematic data as for the one above has been generated for six French departments using a heuristic, weighting-based multi-criteria evaluation model applied also to raster-cell mapping units. In this experiment, thematic data class weights have been differentiated for two stratification areas, namely mountains and plains, and four main landslide types. Separate susceptibility maps for each landslide type and a combined map for all types have been produced. Results have been validated using BRGM's BDMvT landslide inventory. Unlike "Tier 1", "Tier 2" assessment requires landslide inventory data and additional thematic data on conditioning factors which may not be available for all European countries. For the "Tier 2", a nation-wide quantitative landslide susceptibility assessment has been performed for Italy by applying a statistical model. In this assessment, multivariate analysis was applied using bedrock, soil and climate data together with a number of derivatives from SRTM90 DEM. In addition, separate datasets from a historical landslide inventory were used for model training and validation respectively. The mapping units selected were based on administrative boundaries (municipalities). The performance of this nation-wide, quantitative susceptibility assessment has been evaluated using multi-temporal landslide inventory data. Finally, model limitations for "Tier 1" are discussed, and recommendations for enhanced Tier 1 and Tier 2 models including additional thematic data for conditioning factors are drawn. This project is part of the collaborative research carried out within the European Landslide Expert Group coordinated by JRC in support to the EU Soil Thematic Strategy. It is also supported by the International Programme on Landslides of the International Consortium on Landslides.

  2. Designing Multi-Channel Web Frameworks for Cultural Tourism Applications: The MUSE Case Study.

    ERIC Educational Resources Information Center

    Garzotto, Franca; Salmon, Tullio; Pigozzi, Massimiliano

    A framework for the design of multi-channel (MC) applications in the cultural tourism domain is presented. Several heterogeneous interface devices are supported including location-sensitive mobile units, on-site stationary devices, and personalized CDs that extend the on-site experience beyond the visit time thanks to personal memories gathered…

  3. A Cultural, Linguistic, and Ecological Framework for Response to Intervention with English Language Learners

    ERIC Educational Resources Information Center

    Brown, Julie Esparza; Doolittle, Jennifer

    2008-01-01

    Response to Intervention (RTI) has been heralded by many as the long-awaited alternative to using a discrepancy formula for special education eligibility decisions. RTI focuses on intervening early through a multi-tiered approach where each tier provides interventions of increasing intensity. RTI has the potential to affect change for English…

  4. Mapping 21st Century Skills: Investigating the Curriculum Preparing Teachers and Librarians

    ERIC Educational Resources Information Center

    Witte, Shelbie D.; Gross, Melissa R.; Latham, Don L., Jr.

    2015-01-01

    In the first tier of a multi-tier research project, U.S. faculty from the School of Teacher Education and the School of Library and Information Studies seek to create synergies between teacher education and library initiatives in order to understand the best ways to encourage collaboration between teachers and librarians. This article discusses…

  5. Designing and Implementing Group Contingencies in the Classroom: A Teacher's Guide

    ERIC Educational Resources Information Center

    Chow, Jason C.; Gilmour, Allison F.

    2016-01-01

    Group contingencies are a positive, proactive classroom management technique that works well as Tier 1 of a multi-tiered system of behavior support. These programs are adaptable to student and classroom needs and work well to support the behavior of students with disabilities in general education classrooms. Off-the-shelf programs exist, but…

  6. Multitiered Support Framework for Teachers' Classroom-Management Practices: Overview and Case Study of Building the Triangle for Teachers

    ERIC Educational Resources Information Center

    Simonsen, Brandi; MacSuga-Gage, Ashley S.; Briere, Donald E.; Freeman, Jennifer; Myers, Diane; Scott, Terrance M.; Sugai, George

    2014-01-01

    Many teachers enter the field without sufficient training in classroom management and continue to experience challenges throughout their careers. Therefore, school-based leaders need a multi-tiered support (MTS) framework to (a) provide training to all teachers in classroom management (Tier 1), (b) identify teachers who require additional…

  7. Identifying Preschool Children for Higher Tiers of Language and Early Literacy Instruction within a Response to Intervention Framework

    ERIC Educational Resources Information Center

    Carta, Judith J.; Greenwood, Charles R.; Atwater, Jane; McConnell, Scott R.; Goldstein, Howard; Kaminski, Ruth A.

    2014-01-01

    Response to Intervention (RTI) or Multi-Tiered Systems of Support (MTSS) is beginning to be implemented in preschool programs to improve outcomes and to reduce the need for special education services. The proportions of children in programs identified as struggling learners through universal screening have important implications for the…

  8. 34 CFR 75.224 - What are the procedures for using a multiple tier review process to evaluate applications?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false What are the procedures for using a multiple tier... applications received. (d) The Secretary may, in any tier— (1) Use more than one group of experts to gain... procedures for using a multiple tier review process to evaluate applications? (a) The Secretary may use a...

  9. Web-based UMLS concept retrieval by automatic text scanning: a comparison of two methods.

    PubMed

    Brandt, C; Nadkarni, P

    2001-01-01

    The Web is increasingly the medium of choice for multi-user application program delivery. Yet the selection of an appropriate programming environment for rapid prototyping, code portability, and maintainability remains an issue. We summarize our experience with the conversion of a LISP Web application, Search/SR, to a new, functionally identical application, Search/SR-ASP, using a relational database and active server pages (ASP) technology. Our results indicate that provision of easy access to database engines and external objects is almost essential for a development environment to be considered viable for rapid and robust application delivery. While LISP itself is a robust language, its use in Web applications may be hard to justify given that current vendor implementations do not provide such functionality. Alternative, currently available scripting environments for Web development appear to have most of LISP's advantages and few of its disadvantages.

  10. A Prescription for Drug Formulary Evaluation: An Application of Price Indexes

    PubMed Central

    Glazer, Jacob; Huskamp, Haiden A.; McGuire, Thomas G.

    2012-01-01

    Existing economic approaches to the design and evaluation of health insurance do not readily apply to coverage decisions in the multi-tiered drug formularies characterizing drug coverage in private health insurance and Medicare. This paper proposes a method for evaluating a change in the value of a formulary to covered members based on the economic theory of price indexes. A formulary is cast as a set of demand-side prices, and our measure approximates the compensation (positive or negative) that would need to be paid to consumers to accept the new set of prices. The measure also incorporates any effect of the formulary change on plan drug acquisition costs and “offset effects” on non-drug services covered by the plan. Data needed to calculate formulary value are known or can be forecast by a health plan. We illustrate the method with data from a move from a two- to a three-tier formulary. PMID:23372543
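
    The price-index idea can be sketched with a first-order, Laspeyres-style approximation that holds baseline utilization fixed. The function and figures below are illustrative only; the paper's full measure also incorporates plan drug acquisition costs and offset effects on non-drug services.

```python
def formulary_compensation(baseline_qty, old_copay, new_copay):
    """First-order approximation of the compensation members would need
    to accept the new tier structure: sum over drugs of baseline quantity
    times the change in the demand-side price (copayment). A positive
    value means consumers are worse off under the new formulary."""
    return sum(q * (new_copay[d] - old_copay[d])
               for d, q in baseline_qty.items())
```

    For example, moving one drug from a $20 to a $35 copayment tier while leaving another unchanged yields a compensation equal to the baseline fills of the re-tiered drug times the $15 copay increase.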

  11. A neural network approach for enhancing information extraction from multispectral image data

    USGS Publications Warehouse

    Liu, J.; Shao, G.; Zhu, H.; Liu, S.

    2005-01-01

    A back-propagation artificial neural network (ANN) was applied to classify multispectral remote sensing imagery data. The classification procedure included four steps: (i) noisy training that adds minor random variations to the sampling data to make the data more representative and to reduce the training sample size; (ii) iterative or multi-tier classification that reclassifies the unclassified pixels by making a subset of training samples from the original training set, which means the neural model can focus on fewer classes; (iii) spectral channel selection based on neural network weights that can distinguish the relative importance of each channel in the classification process to simplify the ANN model; and (iv) voting rules that adjust the accuracy of classification and produce outputs of different confidence levels. The Purdue Forest, located west of Purdue University, West Lafayette, Indiana, was chosen as the test site. The 1992 Landsat thematic mapper imagery was used as the input data. High-quality airborne photographs of the same time period were used for the ground truth. A total of 11 land use and land cover classes were defined, including water, broadleaved forest, coniferous forest, young forest, urban and road, and six types of cropland-grassland. The experiment indicated that the back-propagation neural network application was satisfactory in distinguishing different land cover types at US Geological Survey levels II-III. The single-tier classification reached an overall accuracy of 85%, and the multi-tier classification an overall accuracy of 95%. For the whole test region, the final output of this study reached an overall accuracy of 87%. © 2005 CASI.
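
    Step (iv), the voting rule, can be sketched as follows. The agreement threshold and function are illustrative assumptions, not the authors' exact formulation: a pixel is labeled only when enough trained networks agree, and pixels that fail the threshold stay unclassified for a later tier trained on a reduced class set.

```python
from collections import Counter

def vote_classify(pixel, classifiers, min_agreement=0.75):
    """Voting rule: each trained network votes on the pixel's class; the
    majority label is accepted only if the agreement fraction reaches the
    confidence threshold, otherwise the pixel is left unclassified (None)
    so a later tier can revisit it with fewer candidate classes."""
    votes = Counter(clf(pixel) for clf in classifiers)
    label, count = votes.most_common(1)[0]
    agreement = count / len(classifiers)
    return (label, agreement) if agreement >= min_agreement else (None, agreement)
```

    Raising min_agreement trades coverage for confidence, which is how the voting step "produces outputs of different confidence levels".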

  12. The effect of a three-tier formulary on antidepressant utilization and expenditures.

    PubMed

    Hodgkin, Dominic; Parks Thomas, Cindy; Simoni-Wastila, Linda; Ritter, Grant A; Lee, Sue

    2008-06-01

    Health plans in the United States are struggling to contain rapid growth in their spending on medications. They have responded by implementing multi-tiered formularies, which label certain brand medications 'non-preferred' and require higher patient copayments for those medications. This multi-tier policy relies on patients' willingness to switch medications in response to copayment differentials. The antidepressant class has certain characteristics that may pose problems for implementation of three-tier formularies, such as differences in which medication works for which patient, and high rates of medication discontinuation. To measure the effect of a three-tier formulary on antidepressant utilization and spending, including decomposing spending allocations between patient and plan. We use claims and eligibility files for a large, mature nonprofit managed care organization that started introducing its three-tier formulary on January 1, 2000, with a staggered implementation across employer groups. The sample includes 109,686 individuals who were continuously enrolled members during the study period. We use a pretest-posttest quasi-experimental design that includes a comparison group, comprising members whose employer had not adopted three-tier as of March 1, 2000. This permits some control for potentially confounding changes that could have coincided with three-tier implementation. For the antidepressants that became nonpreferred, prescriptions per enrollee decreased 11% in the three-tier group and increased 5% in the comparison group. The own-copay elasticity of demand for nonpreferred drugs can be approximated as -0.11. Difference-in-differences regression finds that the three-tier formulary slowed the growth in the probability of using antidepressants in the post-period, which was 0.3 percentage points lower than it would have been without three-tier. 
The three-tier formulary also increased out-of-pocket payments while reducing plan payments and total spending. The results indicate that the plan enrollees were somewhat responsive to the changed incentives, shifting away from the drugs that became nonpreferred. However, the intervention also resulted in cost-shifting from plan to enrollees, indicating some price-inelasticity. The reduction in the proportion of enrollees filling any prescriptions contrasts with results of prior studies for non-psychotropic drug classes. Limitations include the possibility of confounding changes coinciding with three-tier implementation (if they affected the two groups differentially); restriction to continuous enrollees; and lack of data on rebates the plan paid to drug manufacturers. The results of this study suggest that the impact of the three-tier formulary approach may be somewhat different for antidepressants than for some other classes. Policymakers should monitor the effects of three-tier programs on utilization in psychotropic medication classes. Future studies should seek to understand the reasons for patients' limited response to the change in incentives, perhaps using physician and/or patient surveys. Studies should also examine the effects of three-tier programs on patient adherence, quality of care, and clinical and economic outcomes.
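
    The headline comparison lends itself to a simple difference-in-differences sketch (illustrative arithmetic only; the study itself uses a regression framework): with nonpreferred prescriptions per enrollee falling 11% in the three-tier group while rising 5% in the comparison group, the net effect is roughly a 16-point swing.

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences estimate: the change in the treated
    (three-tier) group net of the change in the comparison group over
    the same period, which controls for shared time trends."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Nonpreferred prescriptions per enrollee, normalized to 1.0 pre-period:
# three-tier group fell 11%, comparison group rose 5%.
effect = diff_in_diff(1.0, 0.89, 1.0, 1.05)  # roughly -0.16
```

    The comparison group is what lets the design net out confounding changes that affected both groups, the same logic the abstract invokes for its pretest-posttest quasi-experimental design.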

  13. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    NASA Astrophysics Data System (ADS)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring is essential for hazard assessment and mitigation, and is often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards as well as web-based risk maps and decision support systems. Also, the European Commission implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user into the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web-processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data and other geospatial data on geo-hazards, as well as descriptions and protocols for data processing and analysis. An interface for integrating data from external sources (e.g. Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services make up the service tier; sub-services include map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources and (4) an infrastructure service to identify affected landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on current web technology standards makes use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, so that it can assist targeted mapping and monitoring of geo-hazards in a global context.

  14. 48 CFR 1852.227-11 - Patent Rights-Retention by the Contractor (Short Form).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Technology Reporting Web site http://invention.nasa.gov. (End of addition) (f)(5) The Contractor shall... of tier, for experimental, developmental, research, design, or engineering work to be performed by...

  15. A Study of West Virginia Elementary Special Education Teachers' Roles, Responsibilities, and Practices within a Multi-Tiered Instructional System: Implications for Policy and Practice

    ERIC Educational Resources Information Center

    Palenchar, Linda M.

    2012-01-01

    The purpose of the study was to provide a data-based description of West Virginia special education teachers' roles, responsibilities, and practices relevant to their participation in selected components of the Response to Intervention (RTI) process. Special educators' practices related to assessment, tiered instruction, decision making, and…

  16. Looking beyond RtI Standard Treatment Approach: It's Not Too Late to Embrace the Problem-Solving Approach

    ERIC Educational Resources Information Center

    King, Diane; Coughlin, Patricia Kathleen

    2016-01-01

    There are two approaches for providing Tier 2 interventions within Response to Intervention (RtI): standard treatment protocol (STP) and the problem-solving approach (PSA). This article describes the multi-tiered RtI prevention model being implemented across the United States through an analysis of these two approaches in reading instruction. It…

  17. Educational Support for Low-Performing Students in Mathematics: The Three-Tier Support Model in Finnish Lower Secondary Schools

    ERIC Educational Resources Information Center

    Ekstam, Ulrika; Linnanmäki, Karin; Aunio, Pirjo

    2015-01-01

    In 2011, there was a legislative reform regarding educational support in Finland, with a focus on early identification, differentiation and flexible arrangement of support using a multi-professional approach, the three-tier support model. The main aim of this study was to investigate what educational support practices are used with low-performing…

  18. Supporting Comprehensive, Integrated, Three-Tiered Models of Prevention in Schools: Administrators' Perspectives

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Carter, Erik W.; Jenkins, Abbie; Dwiggins, Lauren; Germer, Kathryn

    2015-01-01

    We report findings from a statewide survey of 365 site-level administrators developed to (a) learn about the extent to which schools across the state were implementing components of multi-tiered systems of support and (b) determine the areas in which these schools might need professional development or resources to support them. At least half of…

  19. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology

    PubMed Central

    Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and then running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring, and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capabilities to our system. PMID:24917804
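    The multi-tier arrangement described above (front-end sensors, near-end Fog Servers, far-end Cloud Servers) is, at its core, staged buffering and forwarding of high-rate sensor streams. A toy sketch of one fog tier, with all names and the batching policy as illustrative assumptions rather than the authors' implementation:

    ```python
    class FogNode:
        """Buffers samples from front-end devices and flushes them upstream
        in batches, acting as a near-end tier between sensors and cloud."""

        def __init__(self, upstream, batch_size=8):
            self.upstream = upstream        # callable: receives a list of samples
            self.batch_size = batch_size
            self.buffer = []

        def ingest(self, sample):
            self.buffer.append(sample)
            if len(self.buffer) >= self.batch_size:
                self.flush()

        def flush(self):
            if self.buffer:
                self.upstream(list(self.buffer))
                self.buffer.clear()

    cloud_batches = []                      # stand-in for a cloud-side queue
    node = FogNode(cloud_batches.append, batch_size=4)
    for t in range(10):                     # ten samples from a headset
        node.ingest({"t": t, "uV": 0.0})
    node.flush()                            # push the remaining partial batch
    ```

    In a real deployment the upstream callable would be a network send to the cloud tier; the batching simply trades a little latency for far fewer upstream messages.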

  1. Proof of Concept Integration of a Single-Level Service-Oriented Architecture into a Multi-Domain Secure Environment

    DTIC Science & Technology

    2008-03-01

    Machine [29]. OC4J applications support Java Servlets, Web services, and the following J2EE specific standards: Extensible Markup Language (XML...IMAP Internet Message Access Protocol IP Internet Protocol IT Information Technology xviii J2EE Java Enterprise Environment JSR 168 Java ...LDAP), World Wide Web Distributed Authoring and Versioning (WebDav), Java Specification Request 168 (JSR 168), and Web Services for Remote

  2. 26 CFR 1.1446-5 - Tiered partnership structures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 12 2010-04-01 2010-04-01 false Tiered partnership structures. 1.1446-5 Section...-Free Covenant Bonds § 1.1446-5 Tiered partnership structures. (a) In general. The rules of this section... prescribes rules applicable to a publicly traded partnership in a tiered partnership structure. Paragraph (e...

  3. 26 CFR 1.1503-2 - Dual consolidated loss.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... tiers of separate units. If a separate unit of a domestic corporation is owned indirectly through... upper-tier separate unit were a subsidiary of the domestic corporation and the lower-tier separate unit were a lower-tier subsidiary. (4) Examples. The following examples illustrate the application of this...

  4. An Internet supported workflow for the publication process in UMVF (French Virtual Medical University).

    PubMed

    Renard, Jean-Marie; Bourde, Annabel; Cuggia, Marc; Garcelon, Nicolas; Souf, Nathalie; Darmoni, Stephan; Beuscart, Régis; Brunetaud, Jean-Marc

    2007-01-01

    The " Université Médicale Virtuelle Francophone" (UMVF) is a federation of French medical schools. Its main goal is to share the production and use of pedagogic medical resources generated by academic medical teachers. We developed an Open-Source application based upon a workflow system, which provides an improved publication process for the UMVF. For teachers, the tool permits easy and efficient upload of new educational resources. For web masters it provides a mechanism to easily locate and validate the resources. For librarian it provide a way to improve the efficiency of indexation. For all, the utility provides a workflow system to control the publication process. On the students side, the application improves the value of the UMVF repository by facilitating the publication of new resources and by providing an easy way to find a detailed description of a resource and to check any resource from the UMVF to ascertain its quality and integrity, even if the resource is an old deprecated version. The server tier of the application is used to implement the main workflow functionalities and is deployed on certified UMVF servers using the PHP language, an LDAP directory and an SQL database. The client tier of the application provides both the workflow and the search and check functionalities. A unique signature for each resource, was needed to provide security functionality and is implemented using a Digest algorithm. The testing performed by Rennes and Lille verified the functionality and conformity with our specifications.

  5. Intercontinental Multi-Domain Monitoring for LHC with perfSONAR

    NASA Astrophysics Data System (ADS)

    Vicinanza, D.

    2012-12-01

    The Large Hadron Collider (LHC) is currently running at CERN in Geneva, Switzerland. Physicists are using the LHC to recreate the conditions just after the Big Bang, by colliding two beams of particles and heavy ions head-on at very high energy. The project is generating more than 15 TB of raw data per year, plus 10 TB of “event summary data”. These data are sent out from CERN to eleven Tier 1 research centres in Europe, Asia, and North America over a multi-gigabit Optical Private Network (OPN), the LHCOPN. Tier 1 sites are in turn connected to 100+ academic and research institutions worldwide (the Tier 2s) through a multipoint-to-multipoint network, the LHC Open Network Environment (LHCONE). Network monitoring on such a complex network architecture, to ensure robust and reliable operation, is of crucial importance. The chosen approach for monitoring the LHCOPN and LHCONE is based on the perfSONAR framework, which is designed for multi-domain monitoring environments. perfSONAR (www.perfsonar.net) is an infrastructure for exchanging performance monitoring data between networks, making it easier to solve performance problems occurring between network measurement points interconnected through several network domains.

  6. The benefits of convergence.

    PubMed

    Chang, Gee-Kung; Cheng, Lin

    2016-03-06

    A multi-tier radio access network (RAN) combining the strength of fibre-optic and radio access technologies employing adaptive microwave photonics interfaces and radio-over-fibre (RoF) techniques is envisioned for future heterogeneous wireless communications. All-band radio spectrum from 0.1 to 100 GHz will be used to deliver wireless services with high capacity, high link speed and low latency. The multi-tier RAN will improve the cell-edge performance in an integrated heterogeneous environment enabled by fibre-wireless integration and networking for mobile fronthaul/backhaul, resource sharing and all-layer centralization of multiple standards with different frequency bands and modulation formats. In essence, this is a 'no-more-cells' architecture in which carrier aggregation among multiple frequency bands can be easily achieved with seamless handover between cells. In this way, current and future mobile network standards such as 4G and 5G can coexist with optimized and continuous cell coverage using multi-tier RoF regardless of the underlying network topology or protocol. In terms of users' experience, the future-proof approach achieves the goals of system capacity, link speed, latency and continuous heterogeneous cell coverage while overcoming the bandwidth crunch in next-generation communication networks. © 2016 The Author(s).

  7. A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems

    NASA Astrophysics Data System (ADS)

    Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.

    2017-05-01

    Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings, both in cost and in hours of work. This paper proposes a research study for the determination of a hierarchy between three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria, in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technology hierarchy.
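    The TOPSIS ranking step of the approach above can be sketched generically. This is the standard textbook algorithm, not the authors' SuperDecision model, and the decision matrix, weights and criteria below are illustrative only:

    ```python
    import math

    def topsis(matrix, weights, benefit):
        """Rank alternatives with TOPSIS.
        matrix:  rows = alternatives, columns = criteria
        weights: criterion weights summing to 1 (e.g. taken from an AHP step)
        benefit: True if higher is better for that criterion, False if lower
        Returns closeness coefficients in [0, 1]; higher is better."""
        cols = list(zip(*matrix))
        norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
        weighted = [[w * v / n for v, w, n in zip(row, weights, norms)]
                    for row in matrix]
        wcols = list(zip(*weighted))
        ideal = [max(c) if b else min(c) for c, b in zip(wcols, benefit)]
        anti = [min(c) if b else max(c) for c, b in zip(wcols, benefit)]
        scores = []
        for row in weighted:
            d_pos = math.dist(row, ideal)   # distance to ideal solution
            d_neg = math.dist(row, anti)    # distance to anti-ideal solution
            scores.append(d_neg / (d_pos + d_neg))
        return scores

    # Three hypothetical technologies scored on performance (benefit
    # criterion) and development effort in hours (cost criterion).
    scores = topsis(
        [[0.8, 30.0], [0.6, 20.0], [0.9, 45.0]],
        weights=[0.6, 0.4],
        benefit=[True, False],
    )
    best = max(range(3), key=scores.__getitem__)
    ```

    The weights would come from the AHP pairwise-comparison step; TOPSIS then turns the weighted matrix into a single closeness score per alternative.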

  8. Integrated forward and reverse supply chain: A tire case study.

    PubMed

    Pedram, Ali; Yusoff, Nukman Bin; Udoncy, Olugu Ezutah; Mahat, Abu Bakar; Pedram, Payam; Babalola, Ayo

    2017-02-01

    This paper integrates a forward and a reverse supply chain to design a closed-loop supply chain (CLSC) network. The central problem in the design of a CLSC network is uncertainty in demand, in returned products and in the quality of returned products. Scenario analyses are generated to overcome this uncertainty. In contrast to existing supply chain network design models, a new application of a CLSC network was studied in this paper to reduce waste. A multi-product, multi-tier mixed integer linear model is developed for the CLSC network design. The main objective is to maximize profit and provide waste management decision support in order to minimize pollution. The results show the applicability of the model in the tire industry. The model determines the number and the locations of facilities and the material flows between these facilities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. CBrowse: a SAM/BAM-based contig browser for transcriptome assembly visualization and analysis.

    PubMed

    Li, Pei; Ji, Guoli; Dong, Min; Schmidt, Emily; Lenox, Douglas; Chen, Liangliang; Liu, Qi; Liu, Lin; Zhang, Jie; Liang, Chun

    2012-09-15

    To address the impending need for exploring rapidly increasing transcriptomics data generated for non-model organisms, we developed CBrowse, an AJAX-based web browser for visualizing and analyzing transcriptome assemblies and contigs. Designed in a standard three-tier architecture with a data pre-processing pipeline, CBrowse is essentially a Rich Internet Application that offers many seamlessly integrated web interfaces and allows users to navigate, sort, filter, search and visualize data smoothly. The pre-processing pipeline takes the contig sequence file in FASTA format and its relevant SAM/BAM file as the input; detects putative polymorphisms, simple sequence repeats and sequencing errors in contigs; and generates image, JSON and database-compatible CSV text files that are directly utilized by different web interfaces. CBrowse is a generic visualization and analysis tool that facilitates close examination of assembly quality, genetic polymorphisms, sequence repeats and/or sequencing errors in transcriptome sequencing projects. CBrowse is distributed under the GNU General Public License, available at http://bioinfolab.muohio.edu/CBrowse/ liangc@muohio.edu or liangc.mu@gmail.com; glji@xmu.edu.cn Supplementary data are available at Bioinformatics online.
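    The simple-sequence-repeat (SSR) detection performed by the pre-processing pipeline can be illustrated with a small tandem-repeat scan. This is a generic sketch, not CBrowse's actual implementation, and the thresholds are invented:

    ```python
    import re

    def find_ssrs(seq, min_repeats=3, max_motif=3):
        """Find simple sequence repeats: motifs of length 1..max_motif
        tandemly repeated at least min_repeats times.
        Returns (start_position, motif, repeat_count) tuples."""
        hits = []
        for k in range(1, max_motif + 1):
            # (.{k}) captures a candidate motif; \1{n,} demands tandem copies
            pattern = re.compile(r"(.{%d})\1{%d,}" % (k, min_repeats - 1))
            for m in pattern.finditer(seq):
                motif = m.group(1)
                if len(set(motif)) == 1 and k > 1:
                    continue  # homopolymer already reported at motif length 1
                hits.append((m.start(), motif, len(m.group(0)) // k))
        return hits

    # A toy contig containing an AC dinucleotide repeat and a poly-T run.
    hits = find_ssrs("GGACACACACTTTTTTAGC")
    ```

    A real pipeline would run such a scan per contig and cross-reference the SSR coordinates with the SAM/BAM read alignments to separate repeats from sequencing errors.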

  10. Mentoring to Promote Courage and Confidence among Elementary School Students with Internalizing Problems: A Single-Case Design Pilot Study

    ERIC Educational Resources Information Center

    Fiat, Aria E.; Cook, Clayton R.; Zhang, Yanchen; Renshaw, Tyler L.; DeCano, Polocarpio; Merrick, Jillian S.

    2017-01-01

    There is a paucity of selective, Tier 2 interventions that educators can implement for students with internalizing problems as part of their schools' Multi-Tiered Systems of Supports. To fill this void, the authors' purpose was to evaluate the efficacy, acceptability, and integrity of a structured school-based mentoring program, the Courage and…

  11. 40 CFR 86.1811-09 - Emission standards for light-duty vehicles, light-duty trucks and medium-duty passenger vehicles.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Diurnal Plus Hot Soak Evaporative Emission Standards: Non-Gasoline Portion of Multi-Fueled Vehicles Model...) Evaporative emission in-use standards. (1) For LDVs and LLDTs certified prior to the 2012 model year, the Tier... the 2011 model year must meet the Tier 2 LDV/LLDT evaporative emission standards (Table S04-3) in-use...

  12. The Effects of Tier 2 Intervention on the Mathematics Performance of First-Grade Students Who Are at Risk for Mathematics Difficulties

    ERIC Educational Resources Information Center

    Bryant, Diane Pedrotty; Bryant, Brian R.; Gersten, Russell M.; Scammacca, Nancy N.; Funk, Catherine; Winter, Amanda; Shih, Minyi; Pool, Cathy

    2008-01-01

    Responsiveness to Intervention (RtI) is recommended both as an essential step before identifying learning disabilities (LD) and as a mechanism for preventing learning difficulties. The use of evidence-based multi-tiered interventions is of critical importance when implementing RtI. This article presents the results of a study that examined the…

  13. Using a Multi-Tier Diagnostic Test to Explore the Nature of Students' Alternative Conceptions on Reaction Kinetics

    ERIC Educational Resources Information Center

    Yan, Yaw Kai; Subramaniam, R.

    2018-01-01

    This study focused on grade 12 students' understanding of reaction kinetics. A 4-tier diagnostic instrument was developed for this purpose and administered to 137 students in the main study. Findings showed that reaction kinetics is a difficult topic for these students, with a total of 25 alternative conceptions (ACs) being uncovered. Except for…

  14. 40 CFR 1033.135 - Labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Tier 1 and later locomotives. The label on the engine is replaced each time the locomotive is... 0 and Tier 1 locomotives, the label may be made up of more than one piece, as long as all pieces are... to Tier 1+ locomotives.” (4) “This locomotive conforms to U.S. EPA regulations applicable to Tier 2...

  15. An Automated End-To-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    NASA Astrophysics Data System (ADS)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address it. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive, involving diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
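    The quality-driven selection described above reduces to scoring functionally equivalent services on normalised QoS attributes and returning the best fit. A minimal weighted-sum sketch, in which the service names, attributes and weights are hypothetical:

    ```python
    def qos_score(service, weights):
        """Weighted-sum QoS utility. Attribute values are assumed to be
        pre-normalised to [0, 1] with 1 always best (so cost-type
        attributes such as latency must be inverted beforehand)."""
        return sum(weights[attr] * service[attr] for attr in weights)

    def select_best(services, weights):
        """Return the name of the best-fit service for the requester."""
        return max(services, key=lambda name: qos_score(services[name], weights))

    # Hypothetical map-service providers offering the same layer.
    candidates = {
        "wms-a": {"availability": 0.99, "speed": 0.60, "reliability": 0.90},
        "wms-b": {"availability": 0.95, "speed": 0.90, "reliability": 0.85},
    }
    weights = {"availability": 0.5, "speed": 0.3, "reliability": 0.2}
    best = select_best(candidates, weights)
    ```

    In a multi-agent setting, broker agents would gather the QoS measurements and apply a scoring rule like this on behalf of the service requester.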

  16. Small satellite multi mission C2 for maximum effect

    USGS Publications Warehouse

    Miller, E.; Medina, O.; Lane, C.R.; Kirkham, A.; Ivancic, W.; Jones, B.; Risty, R.

    2006-01-01

    This paper discusses US Air Force, US Army, US Navy, and NASA demonstrations based around the Virtual Mission Operations Center (VMOC) and its application in fielding a Multi Mission Satellite Operations Center (MMSOC) designed to integrate small satellites into the inherently tiered system environment of operations. The intent is to begin standardizing the spacecraft to ground interfaces needed to reduce costs, maximize space effects to the user, and allow the generation of Tactics, Techniques and Procedures (TTPs) that lead to Responsive Space employment. Combining the US Air Force/Army focus of theater command and control of payloads with the US Navy's user collaboration and FORCEnet consistent approach lays the groundwork for the fundamental change needed to maximize responsive space effects.

  17. Acute tier-1 and tier-2 effect assessment approaches in the EFSA Aquatic Guidance Document: are they sufficiently protective for insecticides?

    PubMed

    van Wijngaarden, René P A; Maltby, Lorraine; Brock, Theo C M

    2015-08-01

    The objective of this paper is to evaluate whether the acute tier-1 and tier-2 methods proposed by the Aquatic Guidance Document recently published by the European Food Safety Authority (EFSA) are appropriate for deriving regulatory acceptable concentrations (RACs) for insecticides. The tier-1 and tier-2 RACs were compared with RACs based on threshold concentrations from micro/mesocosm studies (ETO-RAC). A lower-tier RAC was considered sufficiently protective if it was less than the corresponding ETO-RAC. ETO-RACs were calculated for repeated (n = 13) and/or single pulsed applications (n = 17) of 26 insecticides to micro/mesocosms, giving a maximum of 30 insecticide × application combinations (i.e. cases) for comparison. Acute tier-1 RACs (for 24 insecticides) were lower than the corresponding ETO-RACs in 27 out of 29 cases, while tier-2 Geom-RACs (for 23 insecticides) were lower in 24 out of 26 cases. The tier-2 SSD-RAC (for 21 insecticides) using HC5/3 was lower than the ETO-RAC in 23 out of 27 cases, whereas the tier-2 SSD-RAC using HC5/6 was protective in 25 out of 27 cases. The tier-1 and tier-2 approaches proposed by EFSA for acute effect assessment are thus sufficiently protective for the majority of the insecticides evaluated. Further evaluation may be needed for insecticides with more novel chemistries (neonicotinoids, biopesticides) and compounds that show delayed effects (insect growth regulators). © 2014 Society of Chemical Industry.
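    The tier-2 SSD-RACs compared above are obtained by dividing the hazardous concentration for 5% of species (HC5) by an assessment factor of 3 or 6. Assuming the commonly used log-normal species sensitivity distribution, a sketch of the HC5 step follows; this is a simplification of EFSA's actual fitting procedure, and the toxicity values are invented:

    ```python
    import math
    import statistics

    def hc5_lognormal(ec50s):
        """5th percentile of a log-normal species sensitivity distribution
        fitted to acute toxicity values (e.g. EC50s, in ug/L)."""
        logs = [math.log10(x) for x in ec50s]
        mu = statistics.mean(logs)
        sigma = statistics.stdev(logs)    # sample standard deviation
        z05 = -1.6449                     # 5th percentile of N(0, 1)
        return 10 ** (mu + z05 * sigma)

    ec50s = [0.8, 1.5, 3.2, 6.0, 12.5, 24.0]   # hypothetical species data
    hc5 = hc5_lognormal(ec50s)
    rac = hc5 / 3                              # tier-2 SSD-RAC with AF = 3
    ```

    Dividing by 6 instead of 3 would give the more conservative SSD-RAC that the paper found protective in 25 out of 27 cases.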

  18. 77 FR 36027 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ... Rates; and For Investor Tiers 1-3, the applicable rate based on an ETP Holder's qualifying levels. The... applicable Tier, Step Up Tier or Basic Rate and would be based on an ETP Holder's qualifying levels, as...-listed Tape B securities based on an ETP Holder's qualifying levels. $0.0026 per share fee for Tape B...

  19. Design and implementation of space physics multi-model application integration based on web

    NASA Astrophysics Data System (ADS)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, how to provide an online network computing environment for space weather, space environment and space physics models to the Chinese scientific community has become increasingly important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The traditional, stand-alone C/S mode demands that a team or workshop drawn from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This brings two shortcomings: it reduces the efficiency of the researchers who use the models to compute, and it makes accessing the data inconvenient. It is therefore necessary to create a shared network resource access environment that lets users reach the computing resources of space physics models quickly from a terminal, for conducting space science research and forecasting the space environment. The SPMAIS is accordingly developed in the B/S mode around high-performance, first-principles computational models of the space environment, and uses these models to predict "space weather", to interpret space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF and T96 models as well as a solar proton prediction model, a geomagnetic transmission model, etc., developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for online high-speed model computing.
In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of the physical separation between multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the space physics multi-model application integrated system, and JSP + servlet + JavaBean technology is used to integrate the web application programs of the space physics models. This solves the problem of multiple users requesting the same model-computing job and effectively balances the computing tasks across servers. In addition, we completed the following tasks: establishing a standard graphical user interface based on a Java applet application program; designing the interface between model computation and the visualization of model results; realising three-dimensional network visualization without plug-ins; using Java3D technology to achieve interaction with a three-dimensional network scene; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering of three-dimensional graphics and control of fonts and colours. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-model applications. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.
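The MVC separation applied in the SPMAIS can be shown in miniature. This is a generic pattern sketch in Python rather than the JSP/servlet implementation, and the registered model is a toy stand-in, not a real space-physics model such as IGRF or AP8/AE8:

```python
class ModelRegistry:
    """Model layer: holds the available computational models."""
    def __init__(self):
        self.models = {}
    def register(self, name, fn):
        self.models[name] = fn
    def run(self, name, **params):
        return self.models[name](**params)

class TextView:
    """View layer: renders a computation result for the client."""
    def render(self, name, result):
        return f"{name} -> {result}"

class Controller:
    """Controller layer: maps a user request onto model plus view."""
    def __init__(self, registry, view):
        self.registry, self.view = registry, view
    def handle(self, name, **params):
        return self.view.render(name, self.registry.run(name, **params))

registry = ModelRegistry()
# Hypothetical stand-in model: a trivial function of latitude/longitude.
registry.register("toy_field", lambda lat, lon: round(lat * 0.1 + lon * 0.01, 3))
page = Controller(registry, TextView()).handle("toy_field", lat=40.0, lon=116.0)
```

Because each model is registered behind a uniform interface, new models can be added without touching the controller or view, which is the point of the SOA/MVC split the abstract describes.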

  20. Evaluation of a Two-Phase Experimental Study of a Small Group ("MultiLit") Reading Intervention for Older Low-Progress Readers

    ERIC Educational Resources Information Center

    Buckingham, Jennifer; Beaman-Wheldall, Robyn; Wheldall, Kevin

    2014-01-01

    The study reported here examined the efficacy of a small group (Tier 2 in a three-tier Response to Intervention model) literacy intervention for older low-progress readers (in Years 3-6). This article focuses on the second phase of a two-phase, crossover randomized control trial involving 26 students. In Phase 1, the experimental group (E1)…

  1. Testing a multi-tiered stress-gradient model for risk assessment using sediment constituents from coral reef environments

    USGS Publications Warehouse

    Lidz, B.H.; Hallock, P.; ,

    2000-01-01

    Coral reefs are threatened worldwide by stresses ranging from local to global in extent. One of the major challenges in studies of reef decline is understanding how to distinguish between changes resulting from natural, anthropogenic, local, and global environmental perturbations. As such, a conceptual risk-assessment model is developed that includes tiers for natural stresses, global/regional stresses, and local anthropogenic stresses.

  2. Prospective Environmental Risk Assessment for Sediment-Bound Organic Chemicals: A Proposal for Tiered Effect Assessment.

    PubMed

    Diepens, Noël J; Koelmans, Albert A; Baveco, Hans; van den Brink, Paul J; van den Heuvel-Greve, Martine J; Brock, Theo C M

    A broadly accepted framework for prospective environmental risk assessment (ERA) of sediment-bound organic chemicals is currently lacking. Such a framework requires clear protection goals, evidence-based concepts that link exposure to effects and a transparent tiered-effect assessment. In this paper, we provide a tiered prospective sediment ERA procedure for organic chemicals in sediment, with a focus on the applicable European regulations and the underlying data requirements. Using the ecosystem services concept, we derived specific protection goals for ecosystem service providing units: microorganisms, benthic algae, sediment-rooted macrophytes, benthic invertebrates and benthic vertebrates. Triggers for sediment toxicity testing are discussed. We recommend a tiered approach (Tier 0 through Tier 3). Tier-0 is a cost-effective screening based on chronic water-exposure toxicity data for pelagic species and equilibrium partitioning. Tier-1 is based on spiked sediment laboratory toxicity tests with standard benthic test species and standardised test methods. If comparable chronic toxicity data for both standard and additional benthic test species are available, the Species Sensitivity Distribution (SSD) approach is a more viable Tier-2 option than the geometric mean approach. This paper includes criteria for accepting results of sediment-spiked single species toxicity tests in prospective ERA, and for the application of the SSD approach. We propose micro/mesocosm experiments with spiked sediment, to study colonisation success by benthic organisms, as a Tier-3 option. Ecological effect models can be used to supplement the experimental tiers. A strategy for unifying information from various tiers by experimental work and exposure- and effect modelling is provided.

  3. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  4. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  5. Fast access to the CMS detector condition data employing HTML5 technologies

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets increases performance and reduces costs in terms of computation power, memory usage and network bandwidth for both client and server. Above all, web workers allow scripts to be executed in separate threads, exploiting multi-core microprocessors. Web workers have been employed to substantially decrease the web-page rendering time needed to display the condition data stored in the CMS condition database.

  6. 76 FR 56248 - Self-Regulatory Organizations; Chicago Stock Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... this tiered schedule, there were three volume-based Tiers and the rate of applicable take fees and provide credits varied based upon the Tier into which a Participant falls. \\5\\ Through its filing on....0026/share to $0.0025/share for the lowest Tier of activity, from $0.0028/share to $0.0027/share in the...

  7. Development of Web-based Distributed Cooperative Development Environmentof Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

    A web-based distributed cooperative development environment for a sign-language animation system has been developed. The system extends a previous animation system that was constructed as a three-tier architecture consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, were added to the previous system. The system supports a humanoid-model avatar for interoperability and can use stored sign-language animation data shared through the database. The evaluation showed that the inverse kinematics function of the web client improves sign-language animation authoring.

  8. 44th Annual Gun and Missile Systems Conference and Exhibition

    DTIC Science & Technology

    2009-04-09

    ...08-Q-1904 on NSWCDD web site – FY10 Announcement (Pending) • EO/IR/LST integration into a Tier 2 UAS targeting ball • Solicitation FY10, award at...present picture ID. Badges must be worn at all conference functions. Proceedings will be available on the web one to two weeks after

  9. Preschool Multi-Tier Prevention-Intervention Model for Language and Early Literacy (Pre-3T): Development Summary and Implementation Guide. CYFS Working Paper No. 2014-3

    ERIC Educational Resources Information Center

    Clarke, Brandy L.; Knoche, Lisa L.; Abbott, Mary I.; Sheridan, Susan M.; Carta, Judith J.; Sjuts, Tara S.

    2014-01-01

    The primary objective of this development study was to develop and pilot a three-tiered prevention model (universal, targeted, individualized) in early education for children at risk of reading difficulties. The aims of this study were to: (1) Define and develop a Pre-3T model to address the early literacy and language needs of young children in…

  10. EMERGING TECHNOLOGIES FOR THE MANAGEMENT AND UTILIZATION OF LANDFILL GAS

    EPA Science Inventory

    The report gives information on emerging technologies that are considered to be commercially available (Tier 1), currently undergoing research and development (Tier 2), or considered as potentially applicable (Tier 3) for the management of landfill gas (LFG) emissions or for the ...

  11. Development of online diary and self-management system on e-Healthcare for asthmatic children in Taiwan.

    PubMed

    Lin, Hsueh-Chun; Chiang, Li-Chi; Wen, Tzu-Ning; Yeh, Kuo-Wei; Huang, Jing-Long

    2014-10-01

    Many national and regional programs educate asthmatic children and their families to manage healthcare data. This study aims to establish a web-based self-management system, eAsthmaCare, to promote electronic healthcare (e-Healthcare) services for asthmatic children in Taiwan. The platform performs real-time online functions on a five-tier infrastructure with mutually supportive components that acquire asthma diaries, quality-of-life assessments and health education. We designed five multi-disciplinary portions of the interactive interface, supported by analytical diagrams: (1) online asthma diary, (2) remote asthma assessment, (3) instantaneous asthma alert, (4) diagrammatical clinic support, and (5) asthma health education. The Internet-based asthma diary and assessment program was developed for patients to manage their own healthcare at home. In addition, the online analytical charts help healthcare professionals evaluate multi-domain health information of patients immediately. eAsthmaCare was developed with Java™ Servlet/JSP technology on an Apache Tomcat™ web server and an Oracle™ database. Forty-one voluntary asthmatic children (and their parents) were enrolled to evaluate the proposed system. Satisfaction was assessed across seven domains, and the average scores fell within the acceptable range for each domain, supporting the feasibility of the proposed system. The study details the system infrastructure and the functions developed to help asthmatic children self-manage their healthcare and to enhance communication between patients and hospital professionals. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. The ATLAS Tier-0: Overview and operational experience

    NASA Astrophysics Data System (ADS)

    Elsing, Markus; Goossens, Luc; Nairz, Armin; Negri, Guido

    2010-04-01

    Within the ATLAS hierarchical, multi-tier computing infrastructure, the Tier-0 centre at CERN is mainly responsible for promptly processing the raw data coming from the online DAQ system, archiving the raw and derived data on tape, registering the data with the relevant catalogues, and distributing them to the associated Tier-1 centres. The Tier-0 is already fully functional. It has successfully participated in all cosmic and commissioning data taking since May 2007, and was ramped up to its foreseen full size, performance and throughput for the cosmic (and short single-beam) run periods between July and October 2008. Data and work flows for collision data taking were exercised in several "Full Dress Rehearsals" (FDRs) in the course of 2008. The transition from an expert-operated to a shifter-based system was successfully established in July 2008. This article will give an overview of the Tier-0 system, its data and work flows, and its operations model. It will review the operational experience gained in cosmic, commissioning, and FDR exercises during the past year, and it will give an outlook on planned developments and the evolution of the system towards first collision data taking, expected in late Autumn 2009.

  13. To Wait in Tier 1 or Intervene Immediately: A Randomized Experiment Examining First Grade Response to Intervention (RTI) in Reading.

    PubMed

    Al Otaiba, Stephanie; Connor, Carol M; Folsom, Jessica S; Wanzek, Jeanne; Greulich, Luana; Schatschneider, Christopher; Wagner, Richard K

    2014-10-01

    This randomized controlled experiment compared the efficacy of two Response to Intervention (RTI) models - Typical RTI and Dynamic RTI - and included 34 first-grade classrooms ( n = 522 students) across 10 socio-economically and culturally diverse schools. Typical RTI was designed to follow the two-stage RTI decision rules that wait to assess response to Tier 1 in many districts, whereas Dynamic RTI provided Tier 2 or Tier 3 interventions immediately according to students' initial screening results. Interventions were identical across conditions except for when intervention began. Reading assessments included letter-sound, word, and passage reading, and teacher-reported severity of reading difficulties. An intent-to-treat analysis using multi-level modeling indicated an overall effect favoring the Dynamic RTI condition ( d = .36); growth curve analyses demonstrated that students in Dynamic RTI showed an immediate score advantage, and effects accumulated across the year. Analyses of standard score outcomes confirmed that students in the Dynamic condition who received Tier 2 and Tier 3 ended the study with significantly higher reading performance than students in the Typical condition. Implications for RTI implementation practice and for future research are discussed.
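
    The effect size reported above (d = .36) is a standardized mean difference; a quick sketch of how such a value is computed, using made-up scores rather than the study's data:

```python
import statistics
from math import sqrt

# Hypothetical end-of-year reading scores for two conditions
dynamic = [10.0, 12.0, 14.0]
typical = [9.0, 10.0, 11.0]

def cohens_d(a, b):
    """Standardized mean difference using a pooled standard deviation."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled
```

    A d of .36, as in the study, means the treated group's mean sits about a third of a pooled standard deviation above the comparison group's.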

  14. To Wait in Tier 1 or Intervene Immediately: A Randomized Experiment Examining First Grade Response to Intervention (RTI) in Reading

    PubMed Central

    Al Otaiba, Stephanie; Connor, Carol M.; Folsom, Jessica S.; Wanzek, Jeanne; Greulich, Luana; Schatschneider, Christopher; Wagner, Richard K.

    2014-01-01

    This randomized controlled experiment compared the efficacy of two Response to Intervention (RTI) models – Typical RTI and Dynamic RTI - and included 34 first-grade classrooms (n = 522 students) across 10 socio-economically and culturally diverse schools. Typical RTI was designed to follow the two-stage RTI decision rules that wait to assess response to Tier 1 in many districts, whereas Dynamic RTI provided Tier 2 or Tier 3 interventions immediately according to students’ initial screening results. Interventions were identical across conditions except for when intervention began. Reading assessments included letter-sound, word, and passage reading, and teacher-reported severity of reading difficulties. An intent-to-treat analysis using multi-level modeling indicated an overall effect favoring the Dynamic RTI condition (d = .36); growth curve analyses demonstrated that students in Dynamic RTI showed an immediate score advantage, and effects accumulated across the year. Analyses of standard score outcomes confirmed that students in the Dynamic condition who received Tier 2 and Tier 3 ended the study with significantly higher reading performance than students in the Typical condition. Implications for RTI implementation practice and for future research are discussed. PMID:25530622

  15. 40 CFR 86.1810-09 - General standards; increase in emissions; unsafe condition; waivers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... light-duty vehicles and light-duty trucks fueled by gasoline, diesel, methanol, ethanol, natural gas and... applicable to methanol fueled vehicles are also applicable to Tier 2 and interim non-Tier 2 ethanol fueled...

  16. 40 CFR 86.1810-09 - General standards; increase in emissions; unsafe condition; waivers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... light-duty vehicles and light-duty trucks fueled by gasoline, diesel, methanol, ethanol, natural gas and... applicable to methanol fueled vehicles are also applicable to Tier 2 and interim non-Tier 2 ethanol fueled...

  17. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  18. The CMS TierO goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  19. Deska: Tool for Central Administration of a Grid Site

    NASA Astrophysics Data System (ADS)

    Kundrát, Jan; Krejčová, Martina; Hubík, Tomáš; Kerpl, Lukáš

    2011-12-01

    Running a typical Tier-2 site requires mastering quite a few tools for fabric management. Keeping an inventory of installed HW machines, their roles and detailed information, from IP addresses to rack locations, is typically done using various in-house applications ranging from simple spreadsheets to web applications. Such solutions, whose documentation usually leaves much to be desired, typically do not prevent a significant duplication of information, and therefore the data therein quickly become obsolete. After having deployed Cfengine as one of a few sites in the WLCG environment, the Prague Tier-2 site set forth to further automate the fabric management, developing the Deska project. The aim of the system is to provide a central place to perform changes, from adding new machines or moving them between racks to changing their assigned service roles and additional metadata. The database provides an authoritative source of information from which all other systems and services (like DHCP servers, Ethernet switches or the Cfengine system) pull their data, using newly developed configuration adaptors. An easy-to-use command line interface modelled after the Cisco IOS-based switches was developed, enabling the data center administrators to easily change any information in an intuitive way. We provide an overview of the current status of the implementation and describe our design choices aimed at further reducing the system engineers' workload.
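
    The "configuration adaptor" idea, in which downstream services pull their configuration from the central inventory, can be sketched as follows. The inventory fields and host values are hypothetical; the output follows ISC dhcpd's host-declaration syntax:

```python
# Hypothetical records from the central inventory (the authoritative source)
inventory = [
    {"name": "node01", "mac": "00:16:3e:aa:bb:01", "ip": "192.168.10.11"},
    {"name": "node02", "mac": "00:16:3e:aa:bb:02", "ip": "192.168.10.12"},
]

def dhcp_host_entry(machine):
    """Render one ISC dhcpd host declaration from an inventory record."""
    return ("host {name} {{\n"
            "  hardware ethernet {mac};\n"
            "  fixed-address {ip};\n"
            "}}").format(**machine)

# An adaptor regenerates the whole service config from the database,
# so nothing is hand-edited in two places.
dhcpd_conf = "\n".join(dhcp_host_entry(m) for m in inventory)
```

    Regenerating configs wholesale from one authoritative store is what removes the duplication the abstract complains about.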

  20. Thoughts on Access, Differentiation, and Implementation of a Multicultural Curriculum

    ERIC Educational Resources Information Center

    Cavilla, Derek

    2014-01-01

    Identification of gifted students from diverse and underserved communities is traditionally low; however, there are ways to expand identification methods in order to make access to gifted education programs more equitable. Creation and implementation of multi-faceted and multi-dimensional assessments as well as tiered access into gifted education…

  1. 75 FR 20085 - Subpart B-Advanced Biofuel Payment Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... biofuels industry is very capital intensive, the Agency is proposing multi-year contracts to enable advanced biofuels producers the assurance of a multi-year revenue stream. This approach is consistent with the goal of creating a stable industry. Finally, the Agency is proposing a two- tiered payment...

  2. 76 FR 74079 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-30

    ... applicable Exchange listing tier into which such products fall. \\3\\ 15 U.S.C. 78f(b). \\4\\ 15 U.S.C. 78f(b)(5... on the Exchange pursuant to Rule 14.11 as Tier I securities. Exchange Rule 14.11 sets forth the... the Exchange's current rules, ETPs are not designated as either Tier I or Tier II securities. The...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sondrup, Andrus Jeffrey

    The Department of Energy Idaho Operations Office (DOE-ID) is applying for a synthetic minor, Sitewide, air quality permit to construct (PTC) with a facility emission cap (FEC) component from the Idaho Department of Environmental Quality (DEQ) for Idaho National Laboratory (INL) to limit its potential to emit to less than major facility limits for criteria air pollutants (CAPs) and hazardous air pollutants (HAPs) regulated under the Clean Air Act. This document is supplied as an appendix to the application, Idaho National Laboratory Application for a Synthetic Minor Sitewide Air Quality Permit to Construct with a Facility Emissions Cap Component, hereafter referred to as “permit application” (DOE-ID 2015). Air dispersion modeling was performed as part of the permit application process to demonstrate pollutant emissions from the INL will not cause a violation of any ambient air quality standards. This report documents the modeling methodology and results for the air dispersion impact analysis. All CAPs regulated under Section 109 of the Clean Air Act were modeled with the exception of lead (Pb) and ozone, which are not required to be modeled by DEQ. Modeling was not performed for toxic air pollutants (TAPs) as uncontrolled emissions did not exceed screening emission levels for carcinogenic and non-carcinogenic TAPs. Modeling for CAPs was performed with the EPA approved AERMOD dispersion modeling system (Version 14134) (EPA 2004a) and five years (2000-2004) of meteorological data. The meteorological data set was produced with the companion AERMET model (Version 14134) (EPA 2004b) using surface data from the Idaho Falls airport, and upper-air data from Boise International Airport supplied by DEQ. Onsite meteorological data from the Grid 3 Mesonet tower located near the center of the INL (north of INTEC) and supplied by the local National Oceanic and Atmospheric Administration (NOAA) office was used for surface wind directions and wind speeds.
Surface data (i.e., land use data that defines roughness, albedo, Bowen ratio, and other parameters) were processed using the AERSURFACE utility (Version 13016) (EPA 2013). Emission sources were modeled as point sources using actual stack locations and dimensions. Emissions, flow rates and exit temperatures were based on the design operating capacity of each source. All structures close enough to produce an area of wake effect were included for all sources. For multi-tiered structures, the heights of the tiers were included or the entire building height was assumed to be equal to the height of the tallest tier. Concentrations were calculated at 1,352 receptor locations provided by DEQ. All receptors were considered for each pollutant and averaging period. Maximum modeled CAP concentrations summed with average background concentration values were presented and compared to National Ambient Air Quality Standards (NAAQS). The background concentration values used were obtained using the Washington State University’s Laboratory for Atmospheric Research North West Airquest web-based retrieval tool (http://lar.wsu.edu/nw airquest/lookup.html). The air dispersion modeling results show the maximum impacts for CAPs are less than applicable standards and demonstrate the INL will not cause a violation of any ambient air quality standards.
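
    The final compliance check described above reduces to comparing a design value (maximum modeled concentration plus background) against each standard; a sketch with invented concentrations, not the report's results:

```python
# Hypothetical values in micrograms per cubic metre for one pollutant
# and one averaging period
max_modeled = 42.0   # highest modeled concentration at any receptor
background = 15.0    # average background concentration
standard = 100.0     # applicable ambient standard

# Design value: modeled maximum summed with background, per the report's method
design_value = max_modeled + background
complies = design_value <= standard
```

    The comparison is repeated for every pollutant and averaging period; compliance requires every design value to stay at or below its standard.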

  4. Distributed run of a one-dimensional model in a regional application using SOAP-based web services

    NASA Astrophysics Data System (ADS)

    Smiatek, Gerhard

    This article describes the setup of a distributed computing system in Perl. It facilitates the parallel run of a one-dimensional environmental model on a number of simple network PC hosts. The system uses Simple Object Access Protocol (SOAP) driven web services offering the model run on remote hosts and a multi-thread environment distributing the work and accessing the web services. Its application is demonstrated in a regional run of a process-oriented biogenic emission model for the area of Germany. Within a network consisting of up to seven web services implemented on Linux and MS-Windows hosts, a performance increase of approximately 400% has been reached compared to a model run on the fastest single host.
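
    A Python stand-in for the dispatch pattern described above, with concurrent.futures in place of the article's Perl multi-thread environment and a local function in place of the remote SOAP call:

```python
from concurrent.futures import ThreadPoolExecutor

def run_model(grid_cell):
    """Stand-in for a SOAP web-service call that runs the
    one-dimensional model for one grid cell on a remote host."""
    return grid_cell, 2.0 * grid_cell  # hypothetical emission estimate

cells = range(8)
# Each worker thread would normally talk to a different web-service host,
# so model runs for separate grid cells proceed in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_model, cells))
```

    Because each grid cell is independent, throughput scales with the number of hosts until the slowest host or the network becomes the bottleneck, which is consistent with the roughly 400% speed-up reported for seven service hosts.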

  5. Wireless Testbed Bonsai

    DTIC Science & Technology

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal...deployment, we conducted spatial scaling tests on our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45m and 90m separations respectively...on W and its scaled version W̃. III. EXPERIMENTAL SETUP Description of Kansei testbed. A Stargate is a single-board Linux-based computer [7]. It uses a

  6. Mobile User Objective System (MUOS) Multi-Service Operational Test and Evaluation-2 Report (with Classified Annex)

    DTIC Science & Technology

    2016-06-01

    an effective system monitoring and display capability. The SOM, C-SSE, and resource managers access MUOS via a web portal called the MUOS Planning...and Provisioning Application (PlanProvApp). This web portal is their window into MUOS and is designed to provide them with a shared understanding of...including page loading errors, partially loaded web pages, incomplete reports, and inaccurate reports. For example, MUOS reported that there were

  7. Using latent class analysis to identify academic and behavioral risk status in elementary students.

    PubMed

    King, Kathleen R; Lembke, Erica S; Reinke, Wendy M

    2016-03-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in the areas of reading, mathematics, and behavior were used as indicators of success on an end of year statewide achievement test. Results identified 3 subclasses of children, including a class with minimal academic and behavioral concerns (Tier 1; 32% of the sample), a class at-risk for academic problems and somewhat at-risk for behavior problems (Tier 2; 37% of the sample), and a class with significant academic and behavior problems (Tier 3; 31%). Each class was predictive of end of year performance on the statewide achievement test, with the Tier 1 class performing significantly higher on the test than the Tier 2 class, which in turn scored significantly higher than the Tier 3 class. The results of this study indicated that distinct classes of children can be determined through brief screening measures and are predictive of later academic success. Further implications are discussed for prevention and intervention for students at risk for academic failure and behavior problems. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
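
    A deliberately simplified, rule-based stand-in for the class assignment above (real latent class analysis fits a mixture model to the screening scores; here risk is merely counted across the three domains, with a hypothetical percentile cut-off):

```python
def assign_tier(reading, mathematics, behavior, cut_off=40):
    """Count screening domains scoring below a (hypothetical) cut-off
    and map the count onto the three tiers described in the study."""
    flags = sum(score < cut_off for score in (reading, mathematics, behavior))
    if flags == 0:
        return "Tier 1"   # minimal academic and behavioral concerns
    if flags == 1:
        return "Tier 2"   # at risk in one domain
    return "Tier 3"       # significant multi-domain problems
```

    The point of the study is that even brief fall screeners, combined this way, separate children into groups with meaningfully different end-of-year outcomes.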

  8. Evicase: an evidence-based case structuring approach for personalized healthcare.

    PubMed

    Carmeli, Boaz; Casali, Paolo; Goldbraich, Anna; Goldsteen, Abigail; Kent, Carmel; Licitra, Lisa; Locatelli, Paolo; Restifo, Nicola; Rinott, Ruty; Sini, Elena; Torresani, Michele; Waks, Zeev

    2012-01-01

    The personalized medicine era stresses a growing need to combine evidence-based medicine with case based reasoning in order to improve the care process. To address this need we suggest a framework to generate multi-tiered statistical structures we call Evicases. Evicase integrates established medical evidence together with patient cases from the bedside. It then uses machine learning algorithms to produce statistical results and aggregators, weighted predictions, and appropriate recommendations. Designed as a stand-alone structure, Evicase can be used for a range of decision support applications including guideline adherence monitoring and personalized prognostic predictions.

  9. EOforge: Generic Open Framework for Earth Observation Data Processing Systems

    DTIC Science & Technology

    2006-09-01

    Allow the use of existing interfaces, i.e. MUIS: ESA multimission catalogue for EO products. • Support latest EO system technologies, i.e. MASS ...5. Extensibility and configurability to allow customisation and the inclusion of new functionality. 6. Multi-instrument and multi-mission processing...such as: • MUIS: ESA multimission catalogue for EO products. • MASS (Multi-Application Support Service System): ESA web services technology standard

  10. Web-based three-dimensional geo-referenced visualization

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Gong, Jianhua; Wang, Freeman

    1999-12-01

    This paper addresses several approaches to implementing web-based, three-dimensional (3-D), geo-referenced visualization. The discussion focuses on the relationship between multi-dimensional data sets and applications, as well as the thick/thin client and heavy/light server structure. Two models of data sets are addressed in this paper. One is the use of traditional 3-D data format such as 3-D Studio Max, Open Inventor 2.0, Vis5D and OBJ. The other is modelled by a web-based language such as VRML. Also, traditional languages such as C and C++, as well as web-based programming tools such as Java, Java3D and ActiveX, can be used for developing applications. The strengths and weaknesses of each approach are elaborated. Four practical solutions for using VRML and Java, Java and Java3D, VRML and ActiveX and Java wrapper classes (Java and C/C++), to develop applications are presented for web-based, real-time interactive and explorative visualization.

  11. Enabling interspecies epigenomic comparison with CEpBrowser.

    PubMed

    Cao, Xiaoyi; Zhong, Sheng

    2013-05-01

    We developed the Comparative Epigenome Browser (CEpBrowser) to allow the public to perform multi-species epigenomic analysis. The web-based CEpBrowser integrates, manages and visualizes sequencing-based epigenomic datasets. Five key features were developed to maximize the efficiency of interspecies epigenomic comparisons. CEpBrowser is a web application implemented with PHP, MySQL, C and Apache. URL: http://www.cepbrowser.org/.

  12. Using a Java Web-based Graphical User Interface to access the SOHO Data Archive

    NASA Astrophysics Data System (ADS)

    Scholl, I.; Girard, Y.; Bykowski, A.

    This paper presents the architecture of a Java web-based graphical interface dedicated to the access of the SOHO Data archive. This application allows local and remote users to search in the SOHO data catalog and retrieve the SOHO data files from the archive. It has been developed at MEDOC (Multi-Experiment Data and Operations Centre), located at the Institut d'Astrophysique Spatiale (Orsay, France), which is one of the European Archives for the SOHO data. This development is part of a joint effort between ESA, NASA and IAS in order to implement long term archive systems for the SOHO data. The software architecture is built as a client-server application using Java language and SQL above a set of components such as an HTTP server, a JDBC gateway, a RDBMS server, a data server and a Web browser. Since HTML pages and CGI scripts are not powerful enough to allow user interaction during a multi-instrument catalog search, this type of requirement enforces the choice of Java as the main language. We also discuss performance issues, security problems and portability on different Web browsers and operating systems.

  13. NOBAI: a web server for character coding of geometrical and statistical features in RNA structure

    PubMed Central

    Knudsen, Vegeir; Caetano-Anollés, Gustavo

    2008-01-01

    The Numeration of Objects in Biology: Alignment Inferences (NOBAI) web server provides a web interface to the applications in the NOBAI software package. This software codes topological and thermodynamic information related to the secondary structure of RNA molecules as multi-state phylogenetic characters, builds character matrices directly in NEXUS format and provides sequence randomization options. The web server is an effective tool that facilitates the search for evolutionary history embedded in the structure of functional RNA molecules. The NOBAI web server is accessible at ‘http://www.manet.uiuc.edu/nobai/nobai.php’. This web site is free and open to all users and there is no login requirement. PMID:18448469
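
    The NEXUS character matrix that NOBAI emits can be illustrated with a hand-rolled writer; the taxa and multi-state character codes below are invented, but the block layout follows the NEXUS standard:

```python
def nexus_matrix(taxa):
    """Render a minimal NEXUS DATA block from {taxon: character-state string}."""
    nchar = len(next(iter(taxa.values())))
    rows = "\n".join("    {} {}".format(name, states)
                     for name, states in taxa.items())
    return ("#NEXUS\n"
            "BEGIN DATA;\n"
            "  DIMENSIONS NTAX={} NCHAR={};\n"
            "  FORMAT DATATYPE=STANDARD SYMBOLS=\"0123\" MISSING=?;\n"
            "  MATRIX\n{}\n  ;\nEND;\n").format(len(taxa), nchar, rows)

# Invented multi-state characters coding, say, stem lengths of two RNA structures
block = nexus_matrix({"tRNA_Ala": "0123", "tRNA_Gly": "1203"})
```

    A matrix in this format can be fed directly to standard phylogenetic software, which is what makes the character-coding step useful.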

  14. Toward a workbench for rodent brain image data: systems architecture and design.

    PubMed

    Moene, Ivar A; Subramaniam, Shankar; Darin, Dmitri; Leergaard, Trygve B; Bjaalie, Jan G

    2007-01-01

    We present a novel system for storing and manipulating microscopic images from sections through the brain and higher-level data extracted from such images. The system is designed and built on a three-tier paradigm and provides the research community with a web-based interface for facile use in neuroscience research. The Oracle relational database management system provides the ability to store a variety of objects relevant to the images and provides the framework for complex querying of data stored in the system. Further, the suite of applications intimately tied into the infrastructure in the application layer provides the user with the ability not only to query and visualize the data, but also to perform analysis operations using the tools embedded in the system. The presentation layer uses the standard protocols of the modern web browser, which makes the system easy to use. The present release, named Functional Anatomy of the Cerebro-Cerebellar System (FACCS), available through The Rodent Brain Workbench (http://rbwb.org/), is targeted at the functional anatomy of the cerebro-cerebellar system in rats, and holds axonal tracing data from these projections. The system is extensible to other circuits and projections and to other categories of image data and provides a unique environment for analysis of rodent brain maps in the context of anatomical data. The FACCS application assumes standard animal brain atlas models and can be extended to future models. The system is available both for interactive use from a remote web-browser client as well as for download to a local server machine.

  15. 40 CFR 89.207 - Credit calculation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Trading Provisions § 89.207 Credit calculation. (a) Requirements for calculating NO X credits from Tier 1...) × (Volume) × (AvgPR) × (UL) × (10−6) Where: Std = the applicable Tier 1 NOX nonroad engine emission standard...) of this section, to be applied to Tier 1 NOX credits to be banked or traded for determining...

  16. DocML: A Digital Library of University Data.

    ERIC Educational Resources Information Center

    Papadakis, Ioannis; Karakoidas, Vassileios; Chrissikopoulos, Vassileios

    2002-01-01

    Describes DocML, a Web-based digital library of university data that is used to build a system capable of preserving and managing student assignments. Topics include requirements for a digital library of university data; metadata and XML; three-tier architecture; user interface; searching; browsing; content delivery; and administrative issues.…

  17. Do Students Know What They Know and What They Don't Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students' Alternative Conceptions

    ERIC Educational Resources Information Center

    Caleon, Imelda S.; Subramaniam, R.

    2010-01-01

    This study reports on the development and application of a four-tier multiple-choice (4TMC) diagnostic instrument, which has not been reported in the literature. It is an enhanced version of the two-tier multiple-choice (2TMC) test. As in 2TMC tests, its answer and reason tiers measure students' content knowledge and explanatory knowledge,…

  18. Towards 100,000 CPU Cycle-Scavenging by Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Globus, Al; Biegel, Bryan A. (Technical Monitor)

    2001-01-01

    We examine a web-centric design using standard tools such as web servers, web browsers, PHP, and mySQL. We also consider the applicability of Information Power Grid tools such as the Globus (no relation to the author) Toolkit. We intend to implement this architecture with JavaGenes running on at least two cycle-scavengers: Condor and United Devices. JavaGenes, a genetic algorithm code written in Java, will be used to evolve multi-species reactive molecular force field parameters.
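    A minimal generational genetic algorithm of the kind JavaGenes implements can be sketched as follows; this is not JavaGenes itself, and the tournament selection, blend crossover, and test fitness function are illustrative choices.

```python
import random

def evolve(fitness, pop_size=40, genes=1, generations=60, seed=1):
    # Minimal generational GA: tournament selection, blend crossover,
    # Gaussian mutation. A stand-in for the JavaGenes approach, not its code.
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)          # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            w = rng.random()                   # blend crossover
            child = [w * x + (1 - w) * y for x, y in zip(p, q)]
            if rng.random() < 0.2:             # occasional mutation
                i = rng.randrange(genes)
                child[i] += rng.gauss(0, 0.5)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness: maximize -(x - 3)^2, so the optimum genome is [3.0].
best = evolve(lambda g: -(g[0] - 3.0) ** 2)
print(best)
```

    In a cycle-scavenging deployment, the fitness evaluations (the expensive force-field computations) are what get farmed out to idle machines.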

  19. Update on Plans to Establish a National Phenology Network in the U.S.A.

    NASA Astrophysics Data System (ADS)

    Betancourt, J.; Schwartz, M.; Breshears, D.; Cayan, D.; Dettinger, M.; Inouye, D.; Post, E.; Reed, B.; Gray, S.

    2005-12-01

    The passing of the seasons is the most pervasive source of climatic and biological variability on Earth, yet phenological monitoring has been spotty worldwide. Formal phenological networks were recently established in Europe and Canada, and we are now following their lead in organizing a National Phenology Network (NPN) for the U.S.A. With support from federal agencies (NSF, USGS, NPS, USDA-FS, EPA, NOAA, NASA), on Aug. 22-26 we organized a workshop in Tucson, Arizona to begin planning a national-scale, multi-tiered phenological network. A prototype for a web-based NPN and preliminary workshop results are available at http://www.npn.uwm.edu. The main goals of NPN will be to: (1) facilitate thorough understanding of phenological phenomena, including causes and effects; (2) provide ground truthing to make the most of heavy public investment in remote sensing data; (3) allow detection and prediction of environmental change for a wide variety of applications; (4) harness the power of mass participation and engage tens of thousands of "citizen scientists" in meeting national needs in Education, Health, Commerce, Natural Resources and Agriculture; (5) develop a model system for substantive collaboration across different levels of government, academia and the private sector. Just as the national networks of weather stations and stream gauges are critical for providing weather, climate and water-related information, NPN will help safeguard and procure goods and services that ecosystems provide. We expect that NPN will consist of a four-tiered, expandable structure: 1) a backbone network linked to existing weather stations, run by recruited public observers; 2) a smaller, second tier of intensive observations, run by scientists at established research sites; 3) a much larger network of observations made by citizen scientists; and 4) remote sensing observations that can be validated with surface observations, thereby providing wall-to-wall coverage for the U.S.A.
Key to the success of NPN will be formal linkages with other ecological networks (e.g., LTER, AmeriFlux, NEON, USDA-FS Inventory and Analysis, NPS Inventory and Monitoring) and strategic co-location of phenological measurements with weather stations (e.g., NOAA's Real-Time Observation Network and state mesonets). Establishment and operation of NPN will require partnerships among multiple federal and state agencies, universities, and NGO's. Interagency agreements will facilitate data sharing, staff commitments, and the transfer of funds, while demonstrating policy-level support for NPN and smoothing the path for use of phenological data in decision-making. A formal implementation report will be completed and circulated for review by Dec. 1, 2005. As soon as the network can start assimilating observations from the public at large (tier 3), NPN will start recruiting observers through NGO's, as well as regional and national media. Every effort will be made to start making observations and expanding the monitoring network by Spring 2006.

  20. Is the two-tier ovarian serous carcinoma grading system potentially useful in stratifying uterine serous carcinoma? A large multi-institutional analysis.

    PubMed

    Ahmed, Quratulain; Hussein, Yaser; Hayek, Kinda; Bandyopadhyay, Sudeshna; Semaan, Assaad; Abdul-Karim, Fadi; Al-Wahab, Zaid; Munkarah, Adnan R; Elshaikh, Mohamed A; Alosh, Baraa; Nucci, Marisa R; Van de Vijver, Koen K; Morris, Robert T; Oliva, Esther; Ali-Fehmi, Rouba

    2014-02-01

    A subset of uterine serous carcinoma (USC) may have better clinical behavior, raising the possibility that morphologic features could help in their categorization. The aim of this study is to evaluate the potential use of the MD Anderson Cancer Center 2-tier grading system for ovarian carcinoma in USC. Tumors included in this analysis were assigned a combined score: 1) low grade: tumors without marked atypia and ≤12 mitoses/10 high-power fields (HPF); and 2) high grade: tumors with severe nuclear atypia and >12 mitoses/10 HPF. Clinicopathologic parameters evaluated included patients' age, tumor size, myometrial invasion (MI), lymphovascular invasion (LVI), lymph node (LN) status, FIGO stage, and patient outcome. 140 patients with USC were included, 30 low-grade uterine serous carcinoma (LGUSC) and 110 high-grade uterine serous carcinoma (HGUSC). Of all parameters, only 2 (MI and stage IA) reached statistical significance. 67% of LGUSC cases showed myometrial invasion versus 93.6% of HGUSC cases (p = 0.003). A higher percentage of LGUSC (63.3%) versus HGUSC (32.7%) were in stage IA (p = 0.01). However, by multivariate analysis including age, LVI, stage and tumor grade, only stage was an independent prognostic factor. The presence of atypia and mitosis across a uterine serous carcinoma is notoriously variable in magnitude and extent, potentially making evaluation of these features difficult and subsequent grading subjective. Our findings thus suggest that the MDACC two-tier grading system may have limited prognostic utility when applied to uterine serous carcinoma. Copyright © 2013 Elsevier Inc. All rights reserved.
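    The two-tier split summarized above can be written as a small rule; the low-grade mitotic threshold is read here as ≤12/10 HPF (the complement of the high-grade >12 criterion), and a "discordant" outcome is added for cases meeting neither definition. This is a paraphrase of the abstract's criteria, not clinical software.

```python
def two_tier_grade(severe_atypia: bool, mitoses_per_10hpf: int) -> str:
    # MDACC-style two-tier split as summarized in the abstract:
    # both criteria must agree; other combinations are flagged for review.
    if not severe_atypia and mitoses_per_10hpf <= 12:
        return "low grade"
    if severe_atypia and mitoses_per_10hpf > 12:
        return "high grade"
    return "discordant"

print(two_tier_grade(False, 8))    # low grade
print(two_tier_grade(True, 20))    # high grade
```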

  1. Recent progress in low-temperature-process monolithic three dimension technology

    NASA Astrophysics Data System (ADS)

    Yang, Chih-Chao; Hsieh, Tung-Ying; Huang, Wen-Hsien; Shen, Chang-Hong; Shieh, Jia-Min; Yeh, Wen-Kuan; Wu, Meng-Chyi

    2018-04-01

    Monolithic three-dimensional (3D) integration is a promising route to fabricating high-density, high-performance, multi-functional integrated circuits, and it offers a new approach to increasing system performance. Managing the thermal impact of multi-tier processes, such as dopant activation, source/drain silicidation, and channel formation, and preventing degradation of pre-existing devices/circuits are key challenges. In this paper, we provide updates on several important monolithic 3D works, particularly on sequentially stackable channels, and on our recent achievements in monolithic 3D integrated circuits (3D-ICs). These results indicate that the advanced 3D architecture with novel design tools enables ultrahigh-density stackable circuits with superior performance and low power consumption for future artificial intelligence (AI) and internet of things (IoT) applications.

  2. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    NASA Astrophysics Data System (ADS)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware providing fast co-processing in computer systems. Developments in IoT, social-networking web applications, and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA-API-based programming.
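    The data-level parallelism such workloads contain is essentially one kernel mapped over many elements. A sketch of the pattern, using Python threads to express the chunked mapping that an OpenMP parallel-for or a CUDA grid launch would perform (the threads illustrate the structure only; real throughput gains need processes or a GPU):

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(chunk, a=2.0):
    # Element-wise kernel: the same operation applied to every element,
    # which is the data-level parallelism SIMD/GPU hardware exploits.
    x, y = chunk
    return [a * xi + yi for xi, yi in zip(x, y)]

def parallel_saxpy(x, y, workers=4):
    # Split the arrays into chunks and map the kernel over them; ex.map
    # preserves chunk order, so the results concatenate cleanly.
    n = len(x)
    step = (n + workers - 1) // workers
    chunks = [(x[i:i + step], y[i:i + step]) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        parts = ex.map(saxpy, chunks)
    return [v for part in parts for v in part]

x = [float(i) for i in range(10)]
y = [1.0] * 10
print(parallel_saxpy(x, y))  # [1.0, 3.0, 5.0, ...]
```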

  3. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. 
Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.
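    The UI-to-Product-Server protocol described above (an XML request embedded in an HTTP GET URL) can be sketched as follows; the element names and endpoint are invented, not the actual LAS request schema.

```python
from urllib.parse import urlencode, parse_qs
import xml.etree.ElementTree as ET

def build_request(dataset, variable, region):
    # Encode an XML request document into a GET query string, as the
    # LAS UI does toward the Product Server (element names are made up).
    req = ET.Element("lasRequest")
    ET.SubElement(req, "dataset").text = dataset
    ET.SubElement(req, "variable").text = variable
    ET.SubElement(req, "region").text = region
    xml = ET.tostring(req, encoding="unicode")
    return "http://las.example.org/ProductServer.do?" + urlencode({"xml": xml})

def parse_request(url):
    # What the workflow-orchestration tier would do on receipt, before
    # fanning the request out to SOAP back-end services.
    query = url.split("?", 1)[1]
    xml = parse_qs(query)["xml"][0]
    req = ET.fromstring(xml)
    return {child.tag: child.text for child in req}

url = build_request("coads_climatology", "sst", "30S:30N")
print(parse_request(url))
```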

  4. Application of a three-tier framework to assess ecological ...

    EPA Pesticide Factsheets

    A multi-level coastal wetland assessment strategy was applied to wetlands in the northern Gulf of Mexico (GOM) to evaluate the feasibility of this approach for a broad national-scale wetland condition assessment (U.S. Environmental Protection Agency's National Wetlands Condition Assessment). Landscape-scale assessment indicators (Tier 1) were developed and applied at the sub-watershed (12-digit Hydrologic Unit) level within the GOM coastal wetland sample frame, with scores calculated using land-use maps and GIS. Rapid assessment protocols (Tier 2), using a combination of office and field work, evaluated metrics associated with landscape context, hydrology, physical structure, and biological structure. Intensive site monitoring (Tier 3) included measures of soil chemistry and composition, water column and pore-water chemistry, and dominant macrophyte community composition and tissue chemistry. Relationships within and among assessment levels were evaluated using multivariate and principal component analyses; few significant correlations were found. More detailed measures of hydrology, soils, and macrophyte species composition from sites across a known condition gradient, in conjunction with validation of a standardized rapid assessment method, may be necessary to fully characterize coastal wetlands across the region.

  5. A Novel Two-Tier Cooperative Caching Mechanism for the Optimization of Multi-Attribute Periodic Queries in Wireless Sensor Networks

    PubMed Central

    Zhou, ZhangBing; Zhao, Deng; Shu, Lei; Tsang, Kim-Fung

    2015-01-01

    Wireless sensor networks, serving as an important interface between physical environments and computational systems, have been used extensively for supporting domain applications, where multiple-attribute sensory data are queried from the network continuously and periodically. Usually, certain sensory data may not vary significantly within a certain time duration for certain applications. In this setting, sensory data gathered at a certain time slot can be used for answering concurrent queries and may be reused for answering forthcoming queries when the variation of these data is within a certain threshold. To address this challenge, a popularity-based cooperative caching mechanism is proposed in this article, where the popularity of sensory data is calculated according to the queries issued in recent time slots. This popularity reflects the likelihood that sensory data will be of interest to forthcoming queries. Generally, sensory data with the highest popularity are cached at the sink node, while sensory data less likely to be of interest to forthcoming queries are cached in the head nodes of divided grid cells. Leveraging these cooperatively cached sensory data, queries are answered by composing these two tiers of cached data. Experimental evaluation shows that this approach can reduce the network communication cost significantly and increase the network capability. PMID:26131665
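    The caching scheme can be sketched as follows; the class shape, eviction rule, and attribute names are illustrative, not taken from the paper.

```python
from collections import Counter

class TwoTierCache:
    # Sketch of the popularity-based scheme: the sink caches the k most
    # popular attributes; everything else goes to grid-cell head nodes.
    def __init__(self, sink_capacity=1):
        self.sink_capacity = sink_capacity
        self.popularity = Counter()   # queries seen in recent time slots
        self.sink = {}                # tier 1: cached at the sink node
        self.grid_heads = {}          # tier 2: cached at grid-cell heads

    def record_reading(self, attribute, value):
        top = {a for a, _ in self.popularity.most_common(self.sink_capacity)}
        if attribute in top:
            self.sink[attribute] = value
        else:
            self.grid_heads[attribute] = value

    def query(self, attribute):
        self.popularity[attribute] += 1
        # Compose the two cache tiers; None would mean a fresh network read.
        return self.sink.get(attribute, self.grid_heads.get(attribute))

cache = TwoTierCache(sink_capacity=1)
for _ in range(5):
    cache.query("temperature")        # temperature becomes popular
cache.query("humidity")
cache.record_reading("temperature", 21.5)   # lands in the sink cache
cache.record_reading("humidity", 0.4)       # lands at a grid head
print(cache.query("temperature"), "temperature" in cache.sink)  # 21.5 True
```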

  6. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    PubMed

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.

  7. Assessing the Nutritional Quality of Diets of Canadian Adults Using the 2014 Health Canada Surveillance Tool Tier System.

    PubMed

    Jessri, Mahsa; Nishi, Stephanie K; L'Abbé, Mary R

    2015-12-12

    The 2014 Health Canada Surveillance Tool (HCST) was developed to assess the adherence of dietary intakes with Canada's Food Guide. HCST classifies foods into one of four Tiers based on thresholds for sodium, total fat, saturated fat and sugar, with Tier 1 representing the healthiest and Tier 4 the unhealthiest foods. This study presents the first application of HCST to assess (a) the dietary patterns of Canadians; and (b) the applicability of this tool as a measure of diet quality among 19,912 adult participants of the Canadian Community Health Survey 2.2. Findings indicated that even though most processed meats and potatoes were Tier 4, the majority of reported foods in general were categorized as Tiers 2 and 3 due to the relatively lenient criteria used in HCST. Moving from the 1st to the 4th quartile of Tier 4 and "other" foods/beverages, there was a significant trend towards increased calories (1876 kcal vs. 2290 kcal) and "harmful" nutrients (e.g., sodium) as well as decreased "beneficial" nutrients. Compliance with the HCST was not associated with lower body mass index. Future nutrient profiling systems need to incorporate both "positive" and "negative" nutrients, an overall score and a wider range of nutrient thresholds to better capture differences between food products.
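    The threshold-based tier assignment can be sketched as a small function; the nutrient thresholds below are invented for illustration, whereas the real HCST applies food-category-specific criteria.

```python
def hcst_tier(sodium_mg, total_fat_g, sat_fat_g, sugar_g):
    # Count how many nutrient thresholds a food (per 100 g) exceeds and
    # map that count to Tiers 1-4. The limits here are hypothetical.
    limits = {"sodium": (sodium_mg, 400),
              "total_fat": (total_fat_g, 10),
              "sat_fat": (sat_fat_g, 3),
              "sugar": (sugar_g, 15)}
    exceeded = sum(value > limit for value, limit in limits.values())
    return 1 + min(exceeded, 3)   # Tier 1 (healthiest) .. Tier 4 (worst)

print(hcst_tier(sodium_mg=120, total_fat_g=2, sat_fat_g=0.5, sugar_g=4))   # 1
print(hcst_tier(sodium_mg=900, total_fat_g=25, sat_fat_g=9, sugar_g=30))   # 4
```

    The abstract's closing point maps directly onto this shape: a binary pass/fail per nutrient with lenient limits pushes most foods into the middle tiers.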

  8. 76 FR 68642 - Fisheries of the Northeastern United States; Atlantic Mackerel, Squid, and Butterfish Fisheries...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ...: Copies of supporting documents used by the Mid-Atlantic Fishery Management Council (Council), including..., Tier 1 and Tier 2 vessel owners are required to obtain a fish hold capacity measurement from a certified marine surveyor. The hold capacity measurement submitted at the time of application for a Tier 1...

  9. Role of Professional Development and Multi-Level Coaching in Promoting Evidence-Based Practice in Education

    ERIC Educational Resources Information Center

    Wood, Charles L.; Goodnight, Crystalyn I.; Bethune, Keri S.; Preston, Angela I.; Cleaver, Samantha L.

    2016-01-01

    Professional development through in-service training may not be of sufficient duration, intensity, and specificity to improve teachers' instructional skills. Due to the increased need to support teachers' use of evidence-based practices in multi-tiered systems of support such as RTI [Response to Intervention] and PBIS [Positive Behavior…

  10. Multi-Criteria Evaluation of the Web-Based E-Learning System: A Methodology Based on Learner Satisfaction and Its Applications

    ERIC Educational Resources Information Center

    Shee, Daniel Y.; Wang, Yi-Shun

    2008-01-01

    The web-based e-learning system (WELS) has emerged as a new means of skill training and knowledge acquisition, encouraging both academia and industry to invest resources in the adoption of this system. Traditionally, most pre- and post-adoption tasks related to evaluation are carried out from the viewpoints of technology. Since users have been…

  11. Comparison of tiered formularies and reference pricing policies: a systematic review

    PubMed Central

    Morgan, Steve; Hanley, Gillian; Greyson, Devon

    2009-01-01

    Objectives To synthesize methodologically comparable evidence from the published literature regarding the outcomes of tiered formularies and therapeutic reference pricing of prescription drugs. Methods We searched the following electronic databases: ABI/Inform, CINAHL, Clinical Evidence, Digital Dissertations & Theses, Evidence-Based Medicine Reviews (which incorporates ACP Journal Club, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Cochrane Methodology Register, Database of Abstracts of Reviews of Effectiveness, Health Technology Assessments and NHS Economic Evaluation Database), EconLit, EMBASE, International Pharmaceutical Abstracts, MEDLINE, PAIS International and PAIS Archive, and the Web of Science. We also searched the reference lists of relevant articles and several grey literature sources. We sought English-language studies published from 1986 to 2007 that examined the effects of either therapeutic reference pricing or tiered formularies, reported on outcomes relevant to patient care and cost-effectiveness, and employed quantitative study designs that included concurrent or historical comparison groups. We abstracted and assessed potentially appropriate articles using a modified version of the data abstraction form developed by the Cochrane Effective Practice and Organisation of Care Group. Results From an initial list of 2964 citations, 12 citations (representing 11 studies) were deemed eligible for inclusion in our review: 3 studies (reported in 4 articles) of reference pricing and 8 studies of tiered formularies. The introduction of reference pricing was associated with reduced plan spending, switching to preferred medicines, reduced overall drug utilization and short-term increases in the use of physician services. Reference pricing was not associated with adverse health impacts. 
The introduction of tiered formularies was associated with reduced plan expenditures, greater patient costs and increased rates of non-compliance with prescribed drug therapy. From the data available, we were unable to examine the hypothesis that tiered formulary policies result in greater use of physician services and potentially worse health outcomes. Conclusion The available evidence does not clearly differentiate between reference pricing and tiered formularies in terms of policy outcomes. Reference pricing appears to have a slight evidentiary advantage, given that patients’ health outcomes under tiered formularies have not been well studied and that tiered formularies are associated with increased rates of medicine discontinuation. PMID:21603047

  12. web cellHTS2: a web-application for the analysis of high-throughput screening data.

    PubMed

    Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael

    2010-04-12

    The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
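    A simplified version of the per-plate normalization and hit ranking such a pipeline performs, using plain median scaling and robust z-scores (cellHTS2 offers many more methods than this sketch):

```python
import statistics

def normalize_plate(raw):
    # Per-plate normalization: divide by the plate median so plates
    # measured at different overall intensities become comparable.
    center = statistics.median(raw)
    return [x / center for x in raw]

def z_scores(values):
    # Robust z-score (median/MAD) used to rank candidate hits;
    # 1.4826 makes the MAD consistent with a normal-distribution sigma.
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values]) * 1.4826
    return [(v - med) / mad for v in values]

plate = [200.0, 210.0, 190.0, 205.0, 800.0]   # one strong hit, invented data
norm = normalize_plate(plate)
scores = z_scores(norm)
print(scores.index(max(scores)))   # index of the top-ranked hit
```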

  13. DataFed: A Federated Data System for Visualization and Analysis of Spatio-Temporal Air Quality Data

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.

    2017-12-01

    DataFed is a distributed web-services-based computing environment for accessing, processing, and visualizing atmospheric data in support of air quality science and management. The flexible, adaptive environment facilitates the access and flow of atmospheric data from provider to users by enabling the creation of user-driven data processing/visualization applications. DataFed "wrapper" components non-intrusively wrap heterogeneous, distributed datasets for access by standards-based GIS web services. The mediator components (also web services) map the heterogeneous data into a spatio-temporal data model. Chained web services provide homogeneous data views (e.g., geospatial, time views) using a global multi-dimensional data model. In addition to data access and rendering, the data processing component services can be programmed for filtering, aggregation, and fusion of multidimensional data. Complete applications are written in a custom-made data-flow language. Currently, the federated data pool consists of over 50 datasets originating from globally distributed data providers delivering surface-based air quality measurements, satellite observations, and emissions data, as well as regional- and global-scale air quality models. The web browser-based user interface allows point-and-click navigation and browsing of the XYZT multi-dimensional data space. The key applications of DataFed are exploring spatial patterns of pollutants, and seasonal, weekly, and diurnal cycles and frequency distributions, for exploratory air quality research. Since 2008, DataFed has been used to support EPA in the implementation of the Exceptional Event Rule. The data system is also used at universities in the US, Europe and Asia.
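    The filter-then-aggregate chaining of services can be sketched in miniature; the records and function names below are invented stand-ins for DataFed's web services.

```python
from statistics import mean

# Spatio-temporal records: (lon, lat, hour, pm25). Values are invented.
records = [(-90.2, 38.6, 0, 12.0), (-90.2, 38.6, 1, 15.0),
           (-118.2, 34.1, 0, 30.0), (-118.2, 34.1, 1, 34.0)]

def spatial_filter(recs, lon_min, lon_max):
    # One service in the chain: subset the multidimensional data space.
    return [r for r in recs if lon_min <= r[0] <= lon_max]

def time_view(recs):
    # The next service: aggregate over space to produce a time view.
    hours = sorted({r[2] for r in recs})
    return {h: mean(r[3] for r in recs if r[2] == h) for h in hours}

# Chained like DataFed's web services: filter, then aggregate, then render.
print(time_view(spatial_filter(records, -100, -80)))
```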

  14. General Framework for Animal Food Safety Traceability Using GS1 and RFID

    NASA Astrophysics Data System (ADS)

    Cao, Weizhu; Zheng, Limin; Zhu, Hong; Wu, Ping

    GS1 is a global traceability standard composed of an encoding system (EAN/UCC, EPC), automatic-identification data carriers (bar codes, RFID), and electronic data interchange standards (EDI, XML). RFID is a non-contact, multi-object automatic identification technique. Tracing food back to its source, standardizing RFID tags, and sharing dynamic data are urgent problems for current traceability systems. This paper presents a general framework for animal food safety traceability using GS1 and RFID. The framework uses RFID tags encoded to the EPCglobal tag data standards. Each information server has an access tier, a business tier and a resource tier. These servers are heterogeneous and distributed, providing user access interfaces over SOAP or HTTP protocols. To share dynamic data, a discovery service and an object name service are used to locate the dynamic, distributed information servers.
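    The encode-then-discover flow can be sketched as follows; the EPC URI layout is simplified (the real SGTIN scheme is more involved) and the object-name-service mapping is a hypothetical example.

```python
def encode_epc(company_prefix, item_ref, serial):
    # Illustrative EPC-style URI; real GS1 SGTIN encoding has more fields.
    return f"urn:epc:id:sgtin:{company_prefix}.{item_ref}.{serial}"

# Object name service: maps an EPC class to the information server that
# holds its trace records, so distributed servers can be discovered.
ons = {"urn:epc:id:sgtin:0614141.812345": "https://trace.example.com/pork"}

def lookup(epc):
    cls = epc.rsplit(".", 1)[0]     # drop the serial to get the EPC class
    return ons.get(cls)

epc = encode_epc("0614141", "812345", "6789")
print(lookup(epc))  # https://trace.example.com/pork
```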

  15. Web-Based Learning System for Developing and Assessing Clinical Diagnostic Skills for Dermatology Residency Program

    ERIC Educational Resources Information Center

    Kuo, Fan-Ray; Chin, Yi-Ying; Lee, Chao-Hsien; Chiu, Yu-Hsien; Hong, Chien-Hu; Lee, Kuang-Lieh; Ho, Wen-Hsien; Lee, Chih-Hung

    2016-01-01

    Few studies have explored the learning difficulties and misconceptions that students encounter when using information and communication technology for e-learning. To address this issue, this research developed a system for evaluating the learning efficiency of medical students by applying two-tier diagnosis assessment. The effectiveness of the…

  16. Enhancing Students' Understanding of Photosynthesis and Respiration in Plant through Conceptual Change Approach

    ERIC Educational Resources Information Center

    Yenilmez, Ayse; Tekkaya, Ceren

    2006-01-01

    This study investigated the effectiveness of combining conceptual change text and discussion web strategies on students' understanding of photosynthesis and respiration in plants. Students' conceptual understanding of photosynthesis and respiration in plants was measured using the two-tier diagnostic test developed by Haslam and Treagust (1987,…

  17. STEM TIPS: Supporting the Beginning Secondary STEM Teacher

    ERIC Educational Resources Information Center

    Jones, Griff; Dana, Thomas; LaFramenta, Joanne; Adams, Thomasenia Lott; Arnold, Jason Dean

    2016-01-01

    The STEM TIPS mobile-ready support platform gives institutions or school districts the ability to provide immediate and customized mentoring to teachers through multiple tiers of web-based support and resources. Using the results of a needs assessment, STEM TIPS was created and launched in partnership with 18 Florida school districts. Further…

  18. Sample EPA Biotech Form

    EPA Pesticide Factsheets

    This sample “EPA Biotech Form” is a header sheet that will accompany all biotechnology submission choices, including MCANs, TERAs, Tier I and Tier II exemption, and biotechnology Test Market Exemption Applications (TMEAs).

  19. Multi-tiered sensing and data processing for monitoring ship structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, Charles; Salvino, Liming; Lynch, Jerome

    2009-01-01

    A comprehensive structural health monitoring (SHM) system is a critical mechanism to ensure hull integrity and evaluate structural performance over the life of a ship, especially for lightweight high-speed ships. One of the most important functions of an SHM system is to provide real-time performance guidance and reduce the risk of structural damage during operations at sea. This is done by continuous feedback from onboard sensors providing measurements of seaway loads and structural responses. Applications of SHM should also include diagnostic capabilities, such as identifying the presence of damage and assessing its location and extent when it does occur, in order to plan for future inspection and maintenance. The development of such SHM systems is extremely challenging because of the physical size of these structures, the widely varying and often extreme operational and environmental conditions associated with the missions of high-performance ships, the lack of data from known damage conditions, the limited onboard sensing that was not designed specifically for SHM, the management of the vast amounts of data, and the need for continuous, real-time data processing. This paper will discuss some of these challenges and several outstanding issues that need to be addressed in the context of applying various SHM approaches to sea-trials data measured on an aluminum high-speed catamaran, the HSV-2 Swift. A multi-tiered approach to sensing and data processing will be discussed as a potential SHM architecture for future shipboard application. This approach involves the application of low-cost, dense sensor arrays with wireless communications in selected areas of the ship hull, in addition to conventional sensors measuring the global structural response of the ship. A recent wireless hull monitoring demonstration on the FSF-1 Sea Fighter will be discussed as an example showing that this proposed architecture is a viable approach for long-term, real-time hull monitoring.

  20. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
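
    The multi-tiered classification flow described above (a cheap screen followed by a trained classifier) can be sketched as follows. This is an illustrative stand-in: it uses a nearest-centroid rule instead of the paper's SVMs, and all feature names, values and thresholds are invented.

```python
# Two-tier classification cascade (illustrative; not the paper's code).
# Tier 1: a cheap geometric filter rejects implausible candidate regions.
# Tier 2: a trained model (here a toy nearest-centroid rule) labels survivors.

def tier1_screen(candidate, min_area=5.0):
    """Cheap filter: discard regions that are obviously too small."""
    return candidate["area"] >= min_area

def tier2_classify(candidate, centroids):
    """Assign the surviving candidate to the nearest class centroid."""
    feats = candidate["features"]
    def sq_dist(label):
        return sum((a - b) ** 2 for a, b in zip(feats, centroids[label]))
    return min(centroids, key=sq_dist)

def classify(candidates, centroids):
    return [tier2_classify(c, centroids)
            for c in candidates if tier1_screen(c)]

# Hypothetical training result and candidates:
centroids = {"head": [1.0, 0.0], "tail": [0.0, 1.0]}
candidates = [
    {"area": 10.0, "features": [0.9, 0.1]},  # passes tier 1, near "head"
    {"area": 2.0,  "features": [0.0, 0.0]},  # rejected by tier 1
    {"area": 8.0,  "features": [0.1, 0.8]},  # passes tier 1, near "tail"
]
print(classify(candidates, centroids))  # → ['head', 'tail']
```

    The cascade shape is the point: an inexpensive first tier prunes the candidate set so the costlier model only runs on plausible regions.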

  1. 75 FR 48397 - Self-Regulatory Organizations; The Chicago Stock Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-10

    ... transactions) for the calendar month in which the executions occurred. There are three volume-based Tiers and the rate of applicable take fees and provide credits vary based upon the Tier into which a Participant falls. \\5\\ Through its filing on January 4, 2010, the Exchange instituted a tiered fee and rebate...

  2. 12 CFR 208.73 - What additional provisions are applicable to state member banks with financial subsidiaries?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... subsidiaries from both the bank's Tier 1 capital and Tier 2 capital; and (ii) Deduct the entire amount of the... deducted from the bank's Tier 1 capital. (b) Financial statement disclosure of capital deduction. Any... (including the well capitalized standard of § 208.71(a)(1)): (1) The bank must not consolidate the assets and...

  3. Development and Application of a Four-Tier Test to Assess Pre-Service Physics Teachers' Misconceptions about Geometrical Optics

    ERIC Educational Resources Information Center

    Kaltakci-Gurel, Derya; Eryilmaz, Ali; McDermott, Lillian Christie

    2017-01-01

    Background: Correct identification of misconceptions is an important first step in order to gain an understanding of student learning. More recently, four-tier multiple choice tests have been found to be effective in assessing misconceptions. Purpose: The purposes of this study are (1) to develop and validate a four-tier misconception test to…

  4. Digital image archiving: challenges and choices.

    PubMed

    Dumery, Barbara

    2002-01-01

    In the last five years, imaging exam volume has grown rapidly. In addition to increased image acquisition, there is more patient information per study. RIS-PACS integration and information-rich DICOM headers now provide us with more patient information relative to each study. The volume of archived digital images is increasing and will continue to rise at a steeper incline than film-based storage of the past. Many filmless facilities have been caught off guard by this increase, which has been stimulated by many factors. The most significant factor is investment in new digital and DICOM-compliant modalities. A huge volume driver is the increase in images per study from multi-slice technology. Storage requirements also are affected by disaster recovery initiatives and state retention mandates. This burgeoning rate of imaging data volume presents many challenges: cost of ownership, data accessibility, storage media obsolescence, database considerations, physical limitations, reliability and redundancy. There are two basic approaches to archiving--single tier and multi-tier. Each has benefits. With a single-tier approach, all the data is stored on a single media that can be accessed very quickly. A redundant copy of the data is then stored onto another less expensive media. This is usually a removable media. In this approach, the on-line storage is increased incrementally as volume grows. In a multi-tier approach, storage levels are set up based on access speed and cost. In other words, all images are stored at the deepest archiving level, which is also the least expensive. Images are stored on or moved back to the intermediate and on-line levels if they will need to be accessed more quickly. It can be difficult to decide what the best approach is for your organization. 
The options include RAIDs (redundant array of independent disks), direct attached RAID storage (DAS), network storage using RAIDs (NAS and SAN), removable media such as different types of tape, compact disks (CDs and DVDs) and magneto-optical disks (MODs). As you evaluate the various options for storage, it is important to consider both performance and cost. For most imaging enterprises, a single-tier archiving approach is the best solution. With the cost of hard drives declining, NAS is a very feasible solution today. It is highly reliable, offers immediate access to all exams, and easily scales as imaging volume grows. Best of all, media obsolescence challenges need not be of concern. For back-up storage, removable media can be implemented, with a smaller investment needed as it will only be used for a redundant copy of the data. There is no need to keep it online and available. If further system redundancy is desired, multiple servers should be considered. The multi-tier approach still has its merits for smaller enterprises, but with a detailed long-term cost of ownership analysis, NAS will probably still come out on top as the solution of choice for many imaging facilities.
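
    The multi-tier placement rule described above (every study resides in the deep archive; recently accessed studies are promoted to faster, more expensive tiers) can be sketched minimally. The tier names and age cutoffs below are hypothetical.

```python
# Illustrative tiered-archive placement policy (names and cutoffs invented).
# Fastest/most expensive tier first; the deep archive holds everything.

def place(days_since_access, online_cutoff=30, nearline_cutoff=365):
    """Pick the fastest tier a study should currently occupy."""
    if days_since_access <= online_cutoff:
        return "online"      # e.g. RAID/NAS, immediate access
    if days_since_access <= nearline_cutoff:
        return "nearline"    # e.g. tape library or MOD jukebox
    return "deep"            # e.g. shelved removable media

print(place(7))     # → online
print(place(90))    # → nearline
print(place(1000))  # → deep
```

    A single-tier design collapses this function to a constant: everything stays "online", with removable media holding only the redundant copy.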

  5. Integrating scales of seagrass monitoring to meet conservation needs

    USGS Publications Warehouse

    Neckles, Hilary A.; Kopp, Blaine S.; Peterson, Bradley J.; Pooler, Penelope S.

    2012-01-01

    We evaluated a hierarchical framework for seagrass monitoring in two estuaries in the northeastern USA: Little Pleasant Bay, Massachusetts, and Great South Bay/Moriches Bay, New York. This approach includes three tiers of monitoring that are integrated across spatial scales and sampling intensities. We identified monitoring attributes for determining attainment of conservation objectives to protect seagrass ecosystems from estuarine nutrient enrichment. Existing mapping programs provided large-scale information on seagrass distribution and bed sizes (tier 1 monitoring). We supplemented this with bay-wide, quadrat-based assessments of seagrass percent cover and canopy height at permanent sampling stations following a spatially distributed random design (tier 2 monitoring). Resampling simulations showed that four observations per station were sufficient to minimize bias in estimating mean percent cover on a bay-wide scale, and sample sizes of 55 stations in a 624-ha system and 198 stations in a 9,220-ha system were sufficient to detect absolute temporal increases in seagrass abundance from 25% to 49% cover and from 4% to 12% cover, respectively. We made high-resolution measurements of seagrass condition (percent cover, canopy height, total and reproductive shoot density, biomass, and seagrass depth limit) at a representative index site in each system (tier 3 monitoring). Tier 3 data helped explain system-wide changes. Our results suggest tiered monitoring as an efficient and feasible way to detect and predict changes in seagrass systems relative to multi-scale conservation objectives.
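
    The resampling idea behind the tier 2 analysis can be illustrated with synthetic data: subsample k quadrat observations per station and compare the subsampled bay-wide mean cover to the full-data mean. The station count mirrors the 624-ha system, but all cover values below are simulated, not the study's data.

```python
# Resampling simulation sketch (synthetic data, not the study's measurements).
import random

random.seed(0)  # reproducible synthetic data
# 55 stations (as in the 624-ha system), 8 quadrat observations each,
# percent cover drawn uniformly from 0-100 for illustration.
stations = [[random.uniform(0.0, 100.0) for _ in range(8)] for _ in range(55)]
full_mean = sum(sum(s) for s in stations) / (8 * len(stations))

def subsampled_mean(k, trials=2000):
    """Bay-wide mean cover when only k observations per station are kept."""
    total = 0.0
    for _ in range(trials):
        picks = [random.sample(s, k) for s in stations]
        total += sum(sum(p) for p in picks) / (k * len(stations))
    return total / trials

# With k=4 of 8 observations retained, the estimate stays close to full_mean.
bias = abs(subsampled_mean(4) - full_mean)
print(round(bias, 3))
```

    Because each station subsample is drawn without replacement, the estimator is unbiased; the simulation only quantifies how small the residual Monte Carlo error is for a given k.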

  6. Negotiation-based Order Lot-Sizing Approach for Two-tier Supply Chain

    NASA Astrophysics Data System (ADS)

    Chao, Yuan; Lin, Hao Wen; Chen, Xili; Murata, Tomohiro

    This paper focuses on a negotiation based collaborative planning process for the determination of order lot-size over multi-period planning, and confined to a two-tier supply chain scenario. The aim is to study how negotiation based planning processes would be used to refine locally preferred ordering patterns, which would consequently affect the overall performance of the supply chain in terms of costs and service level. Minimal information exchanges in the form of mathematical models are suggested to represent the local preferences and used to support the negotiation processes.

  7. Assessing the Nutritional Quality of Diets of Canadian Adults Using the 2014 Health Canada Surveillance Tool Tier System

    PubMed Central

    Jessri, Mahsa; Nishi, Stephanie K.; L’Abbé, Mary R.

    2015-01-01

    The 2014 Health Canada Surveillance Tool (HCST) was developed to assess adherence of dietary intakes with Canada’s Food Guide. HCST classifies foods into one of four Tiers based on thresholds for sodium, total fat, saturated fat and sugar, with Tier 1 representing the healthiest and Tier 4 foods being the unhealthiest. This study presents the first application of HCST to assess (a) dietary patterns of Canadians; and (b) applicability of this tool as a measure of diet quality among 19,912 adult participants of Canadian Community Health Survey 2.2. Findings indicated that even though most of processed meats and potatoes were Tier 4, the majority of reported foods in general were categorized as Tiers 2 and 3 due to the adjustable lenient criteria used in HCST. Moving from the 1st to the 4th quartile of Tier 4 and “other” foods/beverages, there was a significant trend towards increased calories (1876 kcal vs. 2290 kcal) and “harmful” nutrients (e.g., sodium) as well as decreased “beneficial” nutrients. Compliance with the HCST was not associated with lower body mass index. Future nutrient profiling systems need to incorporate both “positive” and “negative” nutrients, an overall score and a wider range of nutrient thresholds to better capture food product differences. PMID:26703721
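
    The threshold-based tier assignment can be sketched as follows: a food falls into the worst tier triggered by any of its nutrient values. The nutrient cut points below are invented for illustration and are not Health Canada's actual HCST criteria.

```python
# Illustrative tier classifier (thresholds invented, not the HCST's).
# Each nutrient has cut points separating tiers 1..3; exceeding the last
# cut point puts the food in tier 4. The food's tier is its worst nutrient.

THRESHOLDS = {
    "sodium_mg": [140, 400, 700],
    "sugar_g":   [5, 10, 20],
    "satfat_g":  [2, 5, 10],
}

def tier(food):
    worst = 1
    for nutrient, cuts in THRESHOLDS.items():
        value = food.get(nutrient, 0)
        t = 1 + sum(value > c for c in cuts)  # count of cut points exceeded
        worst = max(worst, t)
    return worst

print(tier({"sodium_mg": 100, "sugar_g": 4, "satfat_g": 1}))   # → 1
print(tier({"sodium_mg": 800, "sugar_g": 4, "satfat_g": 1}))   # → 4
```

    A "worst nutrient wins" rule like this explains the study's observation that lenient cut points concentrate most foods in the middle tiers.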

  8. Multi-National Information Sharing -- Cross Domain Collaborative Information Environment (CDCIE) Solution. Revision 4

    DTIC Science & Technology

    2005-04-12

    Hardware, Database, and Operating System independence using Java • Enterprise-class Architecture using Java2 Enterprise Edition 1.4 • Standards based...portal applications. Compliance with the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote Portals...authentication and authorization • Portal Standards using Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote

  9. Developing a ubiquitous health management system with healthy diet control for metabolic syndrome healthcare in Taiwan.

    PubMed

    Kan, Yao-Chiang; Chen, Kai-Hong; Lin, Hsueh-Chun

    2017-06-01

    Self-management in healthcare allows patients to manage their health data anytime and anywhere for the prevention of chronic diseases. This study established a prototype ubiquitous health management system (UHMS) with healthy diet control (HDC) for people who need metabolic syndrome healthcare services in Taiwan. The system infrastructure comprises three portals and a database tier with mutually supportive components that provide diet diaries, nutrition guides, and health risk assessments for self-health management. With the diet, nutrition, and personal health databases, the design enables analytical diagrams on the interactive interface to support a mobile application for the diet diary, a Web-based platform for health management, and research and development modules for medical care. For database integrity, dietary data can be stored offline and then synchronized between the mobile device and the server site once online. UHMS-HDC was developed with open-source technology for ubiquitous health management with personalized dietary criteria. The system integrates mobile, internet, and electronic healthcare services with the diet diary functions to manage users' dietary behaviors. Virtual patients were used to simulate the self-health management procedure, and the assessment functions were verified by capturing screen snapshots during the procedure. The proposed approach details an expandable framework with collaborative components for the self-developed UHMS-HDC. Its multi-disciplinary applications for self-health management can help healthcare professionals reduce medical resource use and improve healthcare outcomes for patients who require monitoring of personal health conditions with diet control, and the system can be applied for intervention in the hospital. Copyright © 2017 Elsevier B.V. All rights reserved.
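
    The offline/online diary handling described above amounts to an offline-first queue that flushes to the server once connectivity returns. A minimal sketch, with all class and field names hypothetical:

```python
# Offline-first diet diary sketch (illustrative; not the UHMS-HDC code).

class DietDiary:
    def __init__(self):
        self.pending = []   # entries held locally while offline
        self.server = []    # entries persisted on the server
        self.online = False

    def record(self, entry):
        """Always store locally first; sync immediately if online."""
        self.pending.append(entry)
        if self.online:
            self.flush()

    def flush(self):
        """Push all locally queued entries to the server store."""
        self.server.extend(self.pending)
        self.pending.clear()

    def go_online(self):
        self.online = True
        self.flush()

diary = DietDiary()
diary.record({"meal": "breakfast", "kcal": 420})  # offline: queued locally
diary.record({"meal": "lunch", "kcal": 650})      # offline: queued locally
diary.go_online()                                 # both entries sync
diary.record({"meal": "dinner", "kcal": 580})     # online: synced at once
print(len(diary.server), len(diary.pending))      # → 3 0
```

    Writing locally first keeps the diary usable without connectivity and preserves entries for later transfer, which is the integrity property the abstract describes.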

  10. 78 FR 41166 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... greater specificity related to (i) the applicable ``tier 3'' Supplemental Liquidity Provider (``SLP... amend its Price List to add greater specificity related to (i) the applicable ``tier 3'' SLP rate and... the fee change effective July 1, 2013. SLP Credits \\3\\ \\3\\ The SLP program provides incentives for...

  11. Development and Application of a Two-Tier Diagnostic Test for High School Students' Understanding of Flowering Plant Growth and Development

    ERIC Educational Resources Information Center

    Lin, Sheau-Wen

    2004-01-01

    This study involved the development and application of a two-tier diagnostic test measuring students' understanding of flowering plant growth and development. The instrument development procedure had three general steps: defining the content boundaries of the test, collecting information on students' misconceptions, and instrument development.…

  12. Development and Application of a Two-Tier Multiple-Choice Diagnostic Test for High School Students' Understanding of Cell Division and Reproduction

    ERIC Educational Resources Information Center

    Sesli, Ertugrul; Kara, Yilmaz

    2012-01-01

    This study involved the development and application of a two-tier diagnostic test for measuring students' understanding of cell division and reproduction. The instrument development procedure had three general steps: defining the content boundaries of the test, collecting information on students' misconceptions, and instrument development.…

  13. 12 CFR 347.111 - Underwriting and dealing limits applicable to foreign organizations held by insured state...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the lesser of $60 million or 25 percent of the bank's Tier 1 capital, except as otherwise provided in..., with at least 50 percent of the deduction being taken from Tier 1 capital, with the bank remaining well...: (1) May not exceed the lesser of $30 million or 5 percent of the bank's Tier 1 capital, subject to...

  14. MySQL/PHP web database applications for IPAC proposal submission

    NASA Astrophysics Data System (ADS)

    Crane, Megan K.; Storrie-Lombardi, Lisa J.; Silbermann, Nancy A.; Rebull, Luisa M.

    2008-07-01

    The Infrared Processing and Analysis Center (IPAC) is NASA's multi-mission center of expertise for long-wavelength astrophysics. Proposals for various IPAC missions and programs are ingested via MySQL/PHP web database applications. Proposers use web forms to enter coversheet information and upload PDF files related to the proposal. Upon proposal submission, a unique directory is created on the webserver into which all of the uploaded files are placed. The coversheet information is converted into a PDF file using a PHP extension called FPDF. The files are concatenated into one PDF file using the command-line tool pdftk and then forwarded to the review committee. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
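
    The submission pipeline above (a per-proposal directory, uploaded PDFs, and a generated coversheet concatenated with the command-line tool pdftk) can be sketched as follows. The paths and filenames are hypothetical, and the pdftk invocation is only constructed here, not executed.

```python
# Sketch of the final concatenation step (hypothetical paths/filenames).
# pdftk's real syntax is: pdftk in1.pdf in2.pdf ... cat output out.pdf
import os

def build_pdftk_command(proposal_dir, coversheet, uploads, out="proposal.pdf"):
    """Build the pdftk argv that merges coversheet + uploads into one PDF."""
    parts = [coversheet] + sorted(uploads)   # coversheet goes first
    cmd = ["pdftk"] + [os.path.join(proposal_dir, p) for p in parts]
    cmd += ["cat", "output", os.path.join(proposal_dir, out)]
    return cmd

cmd = build_pdftk_command("/proposals/0042", "coversheet.pdf",
                          ["science.pdf", "budget.pdf"])
print(" ".join(cmd))
```

    In a live system this argv would be handed to `subprocess.run(cmd, check=True)` after the upload directory is populated.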

  15. The Saccharomyces Genome Database Variant Viewer

    PubMed Central

    Sheppard, Travis K.; Hitz, Benjamin C.; Engel, Stacia R.; Song, Giltae; Balakrishnan, Rama; Binkley, Gail; Costanzo, Maria C.; Dalusag, Kyla S.; Demeter, Janos; Hellerstedt, Sage T.; Karra, Kalpana; Nash, Robert S.; Paskov, Kelley M.; Skrzypek, Marek S.; Weng, Shuai; Wong, Edith D.; Cherry, J. Michael

    2016-01-01

    The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org) is the authoritative community resource for the Saccharomyces cerevisiae reference genome sequence and its annotation. In recent years, we have moved toward increased representation of sequence variation and allelic differences within S. cerevisiae. The publication of numerous additional genomes has motivated the creation of new tools for their annotation and analysis. Here we present the Variant Viewer: a dynamic open-source web application for the visualization of genomic and proteomic differences. Multiple sequence alignments have been constructed across high quality genome sequences from 11 different S. cerevisiae strains and stored in the SGD. The alignments and summaries are encoded in JSON and used to create a two-tiered dynamic view of the budding yeast pan-genome, available at http://www.yeastgenome.org/variant-viewer. PMID:26578556

  16. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department over the last 5 years has resulted in a reliable, high-performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based on an integration of the IBM GPFS parallel filesystem with the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility runs databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, such as Real Application Cluster (RAC), Automatic Storage Manager (ASM) and the Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state of the art of the INFN-CNAF Tier1 Storage department infrastructure and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  17. The Design and Development of a Computerized Attention-Training Game System for School-Aged Children

    ERIC Educational Resources Information Center

    Wang, Tsui-Ying; Huang, Ho-Chuan

    2013-01-01

    A computerized attention-training game system has been developed to support attention training for school-aged children. The present system offers various types of computer games that provide training in different aspects of attention, such as selective attention, sustained attention, and divided attention. The N-tier architecture of the Web-based…

  18. Utilization of data estimation via existing models, within a tiered data quality system, for populating species sensitivity distributions

    EPA Science Inventory

    The acquisition toxicity test data of sufficient quality from open literature to fulfill taxonomic diversity requirements can be a limiting factor in the creation of new 304(a) Aquatic Life Criteria. The use of existing models (WebICE and ACE) that estimate acute and chronic eff...

  19. A Two-Tier Test-Based Approach to Improving Students' Computer-Programming Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Yang, Tzu-Chi; Hwang, Gwo-Jen; Yang, Stephen J. H.; Hwang, Gwo-Haur

    2015-01-01

    Computer programming is an important skill for engineering and computer science students. However, teaching and learning programming concepts and skills has been recognized as a great challenge to both teachers and students. Therefore, the development of effective learning strategies and environments for programming courses has become an important…

  20. 76 FR 77032 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-09

    ...'s Web site at http://nasdaq.cchwallstreet.com/Filings , at the principal office of the Exchange, and... offer an alternative to trading venues that are entirely dark. For members qualifying for this tier, the... otherwise be traded in ``dark pool'' alternative trading systems that have been exempted from compliance...

  1. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    NASA Astrophysics Data System (ADS)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of a `Digital Earth': a multi-resolution, three-dimensional representation of the planet with which to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data in support of scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections, allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach in which data stored in distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command-line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurers in the agriculture field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  2. Linked data scientometrics in semantic e-Science

    NASA Astrophysics Data System (ADS)

    Narock, Tom; Wimmer, Hayden

    2017-03-01

    The Semantic Web is inherently multi-disciplinary, and many domains have taken advantage of semantic technologies. The geosciences in particular are one of the fields leading the way in Semantic Web adoption and validation. Astronomy, Earth science, hydrology, and solar-terrestrial physics have seen a noteworthy amount of semantic integration. The geoscience community has been a willing early adopter of semantic technologies and has provided essential feedback to the broader Semantic Web community. Yet there has been no systematic study of the community as a whole, and there exist no quantitative data on the impact and status of semantic technologies in the geosciences. We explore the applicability of Linked Data to scientometrics in the geosciences. In doing so, we gain an initial understanding of the breadth and depth of the Semantic Web in the geosciences and identify what appears to be a transitional period in the applicability of these technologies.

  3. Development and Application of a Two-Tier Multiple Choice Diagnostic Instrument To Assess High School Students' Understanding of Inorganic Chemistry Qualitative Analysis.

    ERIC Educational Resources Information Center

    Tan, Kim Chwee Daniel; Goh, Ngoh Khang; Chia, Lian Sai; Treagust, David F.

    2002-01-01

    Describes the development and application of a two-tier multiple choice diagnostic instrument to assess high school students' understanding of inorganic chemistry qualitative analysis. Shows that the Grade 10 students had difficulty understanding the reactions involved in the identification of cations and anions, for example, double decomposition…

  4. NaviCom: a web application to create interactive molecular network portraits using multi-level omics data.

    PubMed

    Dorel, Mathurin; Viara, Eric; Barillot, Emmanuel; Zinovyev, Andrei; Kuperstein, Inna

    2017-01-01

    Human diseases such as cancer are routinely characterized by high-throughput molecular technologies, and multi-level omics data are accumulating in public databases at an increasing rate. Retrieval and visualization of these data in the context of molecular network maps can provide insights into the pattern of regulation of molecular functions reflected by an omics profile. To make this task easy, we developed NaviCom, a Python package and web platform for visualization of multi-level omics data on top of biological network maps. NaviCom bridges the gap between cBioPortal, the most used resource of large-scale cancer omics data, and NaviCell, a data visualization web service that contains several molecular network map collections. NaviCom proposes several standardized modes of data display on top of molecular network maps, allowing specific biological questions to be addressed. We illustrate how users can easily create interactive network-based cancer molecular portraits via the NaviCom web interface using the maps of the Atlas of Cancer Signalling Network (ACSN) and other maps. Analysis of these molecular portraits can help in formulating scientific hypotheses on the molecular mechanisms deregulated in the studied disease. NaviCom is available at https://navicom.curie.fr. © The Author(s) 2017. Published by Oxford University Press.

  5. A false sense of security? Can tiered approach be trusted to accurately classify immunogenicity samples?

    PubMed

    Jaki, Thomas; Allacher, Peter; Horling, Frank

    2016-09-05

    Detecting and characterizing anti-drug antibodies (ADA) against a protein therapeutic is crucially important for monitoring unwanted immune responses. Patient samples are usually tested for ADA activity with a multi-tiered approach that initially screens rapidly for positive samples, which are subsequently confirmed in a separate assay. In this manuscript we evaluate the ability of different methods to classify subjects with screening and competition-based confirmatory assays. We find that the confirmation method matters most for the overall performance of the multi-stage process, with a t-test performing best when differences are moderate to large. Moreover, we find that when differences between positive and negative samples are not sufficiently large, a competition-based confirmation step yields poor classification of positive samples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
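
    The two-step tiered decision rule can be sketched as below. The screening cut point and inhibition threshold are illustrative, not the paper's values, and the paper's comparison of confirmation methods (including the t-test) is not reproduced here.

```python
# Tiered ADA classification sketch (cut points and data are illustrative).
# Tier 1: a screening assay flags samples above a signal cut point.
# Tier 2: a competition assay "confirms" if spiking free drug suppresses
# the signal by at least a fixed fraction.

def classify_sample(signal, inhibited_signal,
                    screen_cut=1.2, inhibition_cut=0.30):
    if signal < screen_cut:                 # tier 1: screening assay
        return "negative"
    inhibition = 1.0 - inhibited_signal / signal
    if inhibition >= inhibition_cut:        # tier 2: confirmatory assay
        return "positive"
    return "negative"

print(classify_sample(2.0, 1.0))   # strong signal, 50% inhibition → positive
print(classify_sample(2.0, 1.8))   # strong signal, 10% inhibition → negative
print(classify_sample(0.8, 0.4))   # fails the screen → negative
```

    The paper's finding can be read directly off this structure: when the positive and negative distributions overlap, the tier 2 inhibition ratio becomes noisy and true positives fall below `inhibition_cut`.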

  6. A systems relations model for Tier 2 early intervention child mental health services with schools: an exploratory study.

    PubMed

    van Roosmalen, Marc; Gardner-Elahi, Catherine; Day, Crispin

    2013-01-01

    Over the last 15 years, policy initiatives have aimed at the provision of more comprehensive child and adolescent mental health care. These presented a series of new challenges in organising and delivering Tier 2 child mental health services, particularly in schools. This exploratory study aimed to examine and clarify the service model underpinning a Tier 2 child mental health service offering school-based mental health work. Using semi-structured interviews, clinician descriptions of operational experiences were gathered and analysed using grounded theory methods; the analysis was validated by respondents at two stages. A pathway for casework emerged that included a systemic consultative function as part of an overall three-function service model, which required: (1) activity as a member of the multi-agency system; (2) activity to improve the system working around a particular child; and (3) activity to universally develop a Tier 1 workforce confident in supporting children at risk of or experiencing mental health problems. The study challenged the perception that such a service serves solely a Tier 2 function, clarified the requisite workforce to deliver the service model, and could give service providers a rationale for negotiating service models that include an explicit focus on improving children's environments.

  7. Determining the coordinates of lamps in an illumination dome

    NASA Astrophysics Data System (ADS)

    MacDonald, Lindsay W.; Ahmadabadian, Ali H.; Robson, Stuart

    2015-05-01

    The UCL Dome consists of an acrylic hemisphere of nominal diameter 1030 mm, fitted with 64 flash lights, arranged in three tiers of 16, one tier of 12, and one tier of 4 lights at approximately equal intervals. A Nikon D200 digital camera is mounted on a rigid steel frame at the `north pole' of the dome pointing vertically downwards with its optical axis normal to the horizontal baseboard in the `equatorial' plane. It is used to capture sets of images in pixel register for visualisation and surface reconstruction. Three techniques were employed for the geometric calibration of flash light positions in the dome: (1) the shadow cast by a vertical pin onto graph paper; (2) multi-image photogrammetry with retro-reflective targets; and (3) multi-image photogrammetry using the flash lights themselves as targets. The precision of the coordinates obtained by the three techniques was analysed, and it was found that although photogrammetric methods could locate individual targets to an accuracy of 20 μm, the uncertainty of locating the centroids of the flash lights was approximately 1.5 mm. This result was considered satisfactory for the purposes of using the dome for photometric imaging, and in particular for the visualisation of object surfaces by the polynomial texture mapping (PTM) technique.
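The nominal lamp geometry described above (tiers of 16, 16, 16, 12, and 4 lights at roughly equal azimuthal intervals on a hemisphere of radius 515 mm, half the stated 1030 mm diameter) can be generated from spherical coordinates. The tier elevation angles below are assumed for illustration, since the abstract does not state them:

```python
from math import cos, sin, radians

def dome_lamp_coords(radius_mm=515.0):
    """Nominal lamp positions for tiers of 16, 16, 16, 12 and 4 lights.
    The elevation angle of each tier is an illustrative assumption."""
    tiers = [(16, 15.0), (16, 35.0), (16, 55.0), (12, 72.0), (4, 86.0)]
    coords = []
    for count, elev in tiers:
        for k in range(count):
            az = 360.0 * k / count  # equal azimuthal spacing within a tier
            x = radius_mm * cos(radians(elev)) * cos(radians(az))
            y = radius_mm * cos(radians(elev)) * sin(radians(az))
            z = radius_mm * sin(radians(elev))
            coords.append((x, y, z))
    return coords

coords = dome_lamp_coords()
print(len(coords))  # 64 lamps
```

Such nominal coordinates are only a starting point; as the abstract notes, photogrammetric calibration located the actual flash-light centroids with an uncertainty of about 1.5 mm.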

  8. The KUPNetViz: a biological network viewer for multiple -omics datasets in kidney diseases.

    PubMed

    Moulos, Panagiotis; Klein, Julie; Jupp, Simon; Stevens, Robert; Bascands, Jean-Loup; Schanstra, Joost P

    2013-07-24

    Constant technological advances have allowed scientists in biology to migrate from conventional single-omics to multi-omics experimental approaches, challenging bioinformatics to bridge this multi-tiered information. Ongoing research in renal biology is no exception. The results of large-scale and/or high throughput experiments, presenting a wealth of information on kidney disease, are scattered across the web. To tackle this problem, we recently presented the KUPKB, a multi-omics data repository for renal diseases. In this article, we describe KUPNetViz, a biological graph exploration tool allowing the exploration of KUPKB data through the visualization of biomolecule interactions. KUPNetViz enables the integration of multi-layered experimental data over different species, renal locations and renal diseases to protein-protein interaction networks and allows association with biological functions, biochemical pathways and other functional elements such as miRNAs. KUPNetViz focuses on the simplicity of its usage and the clarity of resulting networks by reducing and/or automating advanced functionalities present in other biological network visualization packages. In addition, it allows the extrapolation of biomolecule interactions across different species, leading to the formulation of plausible new hypotheses, adequate experimental design and the suggestion of novel biological mechanisms. We demonstrate the value of KUPNetViz with two usage examples: the integration of calreticulin as a key player in a larger interaction network in renal graft rejection and the novel observation of the strong association of interleukin-6 with polycystic kidney disease. KUPNetViz is an interactive and flexible biological network visualization and exploration tool. It provides renal biologists with biological network snapshots of the complex integrated data of the KUPKB, allowing the formulation of new hypotheses in a user-friendly manner.
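The cross-species extrapolation of biomolecule interactions that the abstract describes can be sketched as an ortholog-table lookup; the gene names and the mapping table here are hypothetical examples, not KUPKB data:

```python
def extrapolate(interactions, orthologs):
    """Carry protein-protein edges from one species to another via an
    ortholog table; edges with no ortholog for either endpoint are dropped."""
    return {(orthologs[a], orthologs[b])
            for a, b in interactions
            if a in orthologs and b in orthologs}

# Hypothetical mouse edges and mouse-to-human ortholog table
mouse_edges = {("Trp53", "Mdm2"), ("Calr", "Lrp1")}
mouse_to_human = {"Trp53": "TP53", "Mdm2": "MDM2", "Calr": "CALR", "Lrp1": "LRP1"}
print(extrapolate(mouse_edges, mouse_to_human))
```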

  9. The KUPNetViz: a biological network viewer for multiple -omics datasets in kidney diseases

    PubMed Central

    2013-01-01

    Background Constant technological advances have allowed scientists in biology to migrate from conventional single-omics to multi-omics experimental approaches, challenging bioinformatics to bridge this multi-tiered information. Ongoing research in renal biology is no exception. The results of large-scale and/or high throughput experiments, presenting a wealth of information on kidney disease are scattered across the web. To tackle this problem, we recently presented the KUPKB, a multi-omics data repository for renal diseases. Results In this article, we describe KUPNetViz, a biological graph exploration tool allowing the exploration of KUPKB data through the visualization of biomolecule interactions. KUPNetViz enables the integration of multi-layered experimental data over different species, renal locations and renal diseases to protein-protein interaction networks and allows association with biological functions, biochemical pathways and other functional elements such as miRNAs. KUPNetViz focuses on the simplicity of its usage and the clarity of resulting networks by reducing and/or automating advanced functionalities present in other biological network visualization packages. In addition, it allows the extrapolation of biomolecule interactions across different species, leading to the formulations of new plausible hypotheses, adequate experiment design and to the suggestion of novel biological mechanisms. We demonstrate the value of KUPNetViz by two usage examples: the integration of calreticulin as a key player in a larger interaction network in renal graft rejection and the novel observation of the strong association of interleukin-6 with polycystic kidney disease. Conclusions The KUPNetViz is an interactive and flexible biological network visualization and exploration tool. It provides renal biologists with biological network snapshots of the complex integrated data of the KUPKB allowing the formulation of new hypotheses in a user friendly manner. 
PMID:23883183

  10. Photometric redshift estimation based on data mining with PhotoRApToR

    NASA Astrophysics Data System (ADS)

    Cavuoti, S.; Brescia, M.; De Stefano, V.; Longo, G.

    2015-03-01

    Photometric redshifts (photo-z) are crucial to the scientific exploitation of modern panchromatic digital surveys. In this paper we present PhotoRApToR (Photometric Research Application To Redshift): a Java/C++ based desktop application capable of solving non-linear regression and multi-variate classification problems, specialized in particular for photo-z estimation. It embeds a machine learning algorithm, namely a multi-layer neural network trained by the quasi-Newton learning rule, and special tools dedicated to pre- and post-processing data. PhotoRApToR has been successfully tested on several scientific cases. The application is available for free download from the DAME Program web site.

  11. Cloud Based Web 3d GIS Taiwan Platform

    NASA Astrophysics Data System (ADS)

    Tsai, W.-F.; Chang, J.-Y.; Yan, S. Y.; Chen, B.

    2011-09-01

    This article presents the status of the web 3D GIS platform developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for disaster monitoring and assessment in Taiwan. For quick response with preliminary and detailed assessment after a natural disaster occurs, the web 3D GIS platform is useful to access, transfer, integrate, display and analyze huge multi-scale data following international OGC standards. The framework of the cloud service for data warehousing management and efficiency enhancement using VMware is illustrated in this article.

  12. Second-Tier Database for Ecosystem Focus, 2002-2003 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Holmes, Chris; Muongchanh, Christine; Anderson, James J.

    2003-11-01

    The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities. The Second-Tier Database known as Data Access in Realtime (DART) integrates public data for effective access, consideration and application. DART also provides analysis tools and performance measures helpful in evaluating the condition of Columbia Basin salmonid stocks.

  13. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.

  14. The Application of a Three-Tier Model of Intervention to Parent Training

    PubMed Central

    Phaneuf, Leah; McIntyre, Laura Lee

    2015-01-01

    A three-tier intervention system was designed for use with parents with preschool children with developmental disabilities to modify parent–child interactions. A single-subject changing-conditions design was used to examine the utility of a three-tier intervention system in reducing negative parenting strategies, increasing positive parenting strategies, and reducing child behavior problems in parent–child dyads (n = 8). The three intervention tiers consisted of (a) self-administered reading material, (b) group training, and (c) individualized video feedback sessions. Parental behavior was observed to determine continuation or termination of intervention. Results support the utility of a tiered model of intervention to maximize treatment outcomes and increase efficiency by minimizing the need for more costly time-intensive interventions for participants who may not require them. PMID:26213459

  15. 77 FR 43903 - Intent To Prepare an Environmental Impact Statement for Proposed Transit Improvements to the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... Administration (FTA), as the lead federal agency, and the Chicago Transit Authority (CTA) published a Notice of... interested parties that the EIS will no longer be a Tier 1 EIS as originally proposed. The methodology and... analysis are available on the CTA Web site www.transitchicago.com/rpmproject . The CTA operates the rapid...

  16. Assessing the nutritional quality of diets of Canadian children and adolescents using the 2014 Health Canada Surveillance Tool Tier System.

    PubMed

    Jessri, Mahsa; Nishi, Stephanie K; L'Abbe, Mary R

    2016-05-10

    Health Canada's Surveillance Tool (HCST) Tier System was developed in 2014 with the aim of assessing the adherence of dietary intakes to Eating Well with Canada's Food Guide (EWCFG). HCST uses a Tier system to categorize all foods into one of four Tiers based on thresholds for total fat, saturated fat, sodium, and sugar, with Tier 4 reflecting the unhealthiest and Tier 1 the healthiest foods. This study presents the first application of the HCST to examine (i) the dietary patterns of Canadian children, and (ii) the applicability and relevance of the HCST as a measure of diet quality. Data were from the nationally representative, cross-sectional Canadian Community Health Survey 2.2. A total of 13,749 participants aged 2-18 years who had complete lifestyle and 24-hour dietary recall data were examined. Dietary patterns of Canadian children and adolescents demonstrated a high prevalence of Tier 4 foods within the sub-groups of processed meats and potatoes. On average, 23-31 % of daily calories were derived from "other" foods and beverages not recommended in EWCFG. However, the majority of food choices fell within the Tier 2 and 3 classifications due to the lenient criteria used by the HCST for classifying foods. Adherence to the recommendations presented in the HCST was associated with closer compliance with nutrient Dietary Reference Intake recommendations; however, it did not relate to reduced obesity as assessed by body mass index (p > 0.05). EWCFG recommendations are currently not being met by most children and adolescents. Future nutrient profiling systems need to incorporate both positive and negative nutrients and an overall score. In addition, a wider range of nutrient thresholds should be considered for the HCST to better capture product differences, prevent categorization of most foods as Tiers 2-3 and provide incentives for product reformulation.
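A toy version of the threshold-based Tier assignment described above might look like the following; the cutoff values and the counting rule are invented for illustration and are not Health Canada's actual criteria:

```python
def classify_tier(fat_g, sat_fat_g, sodium_mg, sugar_g):
    """Toy tier assignment: count how many nutrient thresholds (per serving)
    a food exceeds.  Cutoffs are illustrative, not the HCST's."""
    exceeded = sum([fat_g > 10, sat_fat_g > 4, sodium_mg > 400, sugar_g > 15])
    # Tier 1 (healthiest) .. Tier 4 (unhealthiest), capped at 4
    return min(exceeded + 1, 4)

print(classify_tier(15, 6, 600, 20))  # exceeds every threshold: Tier 4
```

A scheme like this illustrates the paper's critique: with lenient cutoffs, most foods land in the middle Tiers, blunting the system's ability to discriminate between products.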

  17. Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research harnesses the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of SDI with the interfaces of Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between Sensor Web and SDI, and conduct case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, Metadata and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). In conclusion, it is important for geospatial studies to integrate SDI with Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  18. Quantitative Acylcarnitine Determination by UHPLC-MS/MS – Going Beyond Tandem MS Acylcarnitine “Profiles”

    PubMed Central

    Minkler, Paul E.; Stoll, Maria S.K.; Ingalls, Stephen T.; Kerner, Janos; Hoppel, Charles L.

    2016-01-01

    Tandem MS “profiling” of acylcarnitines and amino acids was conceived as a first-tier screening method, and its application to expanded newborn screening has been enormously successful. However, unlike amino acid screening (which uses amino acid analysis as its second-tier validation of screening results), acylcarnitine “profiling” also assumed the role of second-tier validation, due to the lack of a generally accepted second-tier acylcarnitine determination method. In this report, we present results from the application of our validated UHPLC-MS/MS second-tier method for the quantification of total carnitine, free carnitine, butyrobetaine, and acylcarnitines to patient samples with known diagnoses: malonic acidemia, short-chain acyl-CoA dehydrogenase deficiency (SCADD) or isobutyryl-CoA dehydrogenase deficiency (IBD), 3-methyl-crotonyl carboxylase deficiency (3-MCC) or β-ketothiolase deficiency (BKT), and methylmalonic acidemia (MMA). We demonstrate the assay’s ability to separate constitutional isomers and diastereomeric acylcarnitines and generate values with a high level of accuracy and precision. These capabilities are unavailable when using tandem MS “profiles”. We also show examples of research interest, where separation of acylcarnitine species and accurate and precise acylcarnitine quantification is necessary. PMID:26458767

  19. gQTL: A Web Application for QTL Analysis Using the Collaborative Cross Mouse Genetic Reference Population.

    PubMed

    Konganti, Kranti; Ehrlich, Andre; Rusyn, Ivan; Threadgill, David W

    2018-06-07

    Multi-parental recombinant inbred populations, such as the Collaborative Cross (CC) mouse genetic reference population, are increasingly being used for analysis of quantitative trait loci (QTL). However, specialized analytic software for these complex populations is typically built in R and works only on the command line, which limits the utility of these powerful resources for many users. To overcome these analytic limitations, we developed gQTL, a web-accessible, simple graphical user interface application based on the DOQTL platform in R to perform QTL mapping using data from CC mice. Copyright © 2018, G3: Genes, Genomes, Genetics.

  20. Cyberinfrastructure for Open Science at the Montreal Neurological Institute

    PubMed Central

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S.; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M.; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D. Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A.; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C.

    2017-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: (1) comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.); (2) secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); (3) querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; (4) configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; (5) robust workflows and quality control mechanisms ensuring transparency and consistency in best practices; (6) long-term storage (and web access) of data, reducing loss of institutional data assets; (7) enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and (8) numerous modules for data filtering, summary statistics, and personalized and configurable dashboards.
Implementing the vision of Open Science at the Montreal Neurological Institute will be a concerted undertaking that seeks to facilitate data sharing for the global research community. Our goal is to utilize the years of experience in multi-site collaborative research infrastructure to implement the technical requirements to achieve this level of public data sharing in a practical yet robust manner, in support of accelerating scientific discovery. PMID:28111547

  1. Cyberinfrastructure for Open Science at the Montreal Neurological Institute.

    PubMed

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C

    2016-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: (1) comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.); (2) secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); (3) querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; (4) configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; (5) robust workflows and quality control mechanisms ensuring transparency and consistency in best practices; (6) long-term storage (and web access) of data, reducing loss of institutional data assets; (7) enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and (8) numerous modules for data filtering, summary statistics, and personalized and configurable dashboards.
Implementing the vision of Open Science at the Montreal Neurological Institute will be a concerted undertaking that seeks to facilitate data sharing for the global research community. Our goal is to utilize the years of experience in multi-site collaborative research infrastructure to implement the technical requirements to achieve this level of public data sharing in a practical yet robust manner, in support of accelerating scientific discovery.

  2. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications.

    PubMed

    Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remain labor-intensive. Given the complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software, implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner in reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready-to-order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
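The core idea of fragmenting a large design into fabricable, re-assemblable blocks can be sketched as fixed-size partitioning with overlaps between neighbours. The block size, overlap length, and reassembly rule below are illustrative assumptions, not the published Genome Partitioner algorithm:

```python
def partition(seq, block_size=1000, overlap=40):
    """Split a long DNA design into subblocks of at most block_size bases,
    with a fixed overlap between neighbours so they can be reassembled
    in order.  Sizes are illustrative only."""
    step = block_size - overlap
    blocks = []
    start = 0
    while start < len(seq):
        blocks.append(seq[start:start + block_size])
        if start + block_size >= len(seq):
            break
        start += step
    return blocks

design = "ACGT" * 5000  # a 20 kb test segment, as in the abstract
blocks = partition(design)
print(len(blocks))
```

Each adjacent pair of blocks shares its 40-base overlap, so the full design can be rebuilt by concatenating the first block with the non-overlapping tail of each subsequent block.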

  3. The NOvA software testing framework

    NASA Astrophysics Data System (ADS)

    Tamsett, M.; C Group

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner.
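A minimal sketch of the wrap-monitor-report pattern described above, assuming pipeline stages are plain callables and a JSON report feeds a web front end (the real framework wraps C++ jobs and software tiers, not Python functions, and its report format is not shown in the abstract):

```python
import json

def run_pipeline(stages, data):
    """Run each software tier in order, recording a per-stage pass/fail
    status for a front-end report.  Stage names and the JSON report shape
    are invented for illustration."""
    report = []
    for name, fn in stages:
        try:
            data = fn(data)
            report.append({"stage": name, "status": "pass"})
        except Exception as exc:
            report.append({"stage": name, "status": "fail", "error": str(exc)})
            break  # downstream tiers depend on this stage's output
    return data, json.dumps(report)

stages = [
    ("unpack", lambda d: d + ["unpacked"]),
    ("calibrate", lambda d: d + ["calibrated"]),
    ("reconstruct", lambda d: d + ["reconstructed"]),
]
result, report = run_pipeline(stages, ["raw"])
print(report)
```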

  4. The emerging Web 2.0 social software: an enabling suite of sociable technologies in health and health care education.

    PubMed

    Kamel Boulos, Maged N; Wheeler, Steve

    2007-03-01

    Web 2.0 sociable technologies and social software are presented as enablers in health and health care, for organizations, clinicians, patients and laypersons. They include social networking services, collaborative filtering, social bookmarking, folksonomies, social search engines, file sharing and tagging, mashups, instant messaging, and online multi-player games. The more popular Web 2.0 applications in education, namely wikis, blogs and podcasts, are but the tip of the social software iceberg. Web 2.0 technologies represent a quite revolutionary way of managing and repurposing/remixing online information and knowledge repositories, including clinical and research information, in comparison with the traditional Web 1.0 model. The paper also offers a glimpse of future software, touching on Web 3.0 (the Semantic Web) and how it could be combined with Web 2.0 to produce the ultimate architecture of participation. Although the tools presented in this review look very promising and potentially fit for purpose in many health care applications and scenarios, careful thinking, testing and evaluation research are still needed in order to establish 'best practice models' for leveraging these emerging technologies to boost our teaching and learning productivity, foster stronger 'communities of practice', and support continuing medical education/professional development (CME/CPD) and patient education.

  5. Dynamic analysis of space structures including elastic, multibody, and control behavior

    NASA Technical Reports Server (NTRS)

    Pinson, Larry; Soosaar, Keto

    1989-01-01

    The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.

  6. Validation of a multi-analyte panel with cell-bound complement activation products for systemic lupus erythematosus.

    PubMed

    Dervieux, Thierry; Conklin, John; Ligayon, Jo-Anne; Wolover, Leilani; O'Malley, Tyler; Alexander, Roberta Vezza; Weinstein, Arthur; Ibarra, Claudia A

    2017-07-01

    We describe the analytical validation of an assay panel intended to assist clinicians with the diagnosis of systemic lupus erythematosus (SLE). The multi-analyte panel includes quantitative assessment of complement activation and measurement of autoantibodies. The levels of the complement split product C4d bound to erythrocytes (EC4d) and B-lymphocytes (BC4d) (expressed as mean fluorescence intensity [MFI]) are measured by quantitative flow cytometry, while autoantibodies (inclusive of antinuclear and anti-double stranded DNA antibodies) are determined by immunoassays. Results of the multi-analyte panel are reported as positive or negative based on a 2-tiered index score. Post-phlebotomy stability of EC4d and BC4d in EDTA-anticoagulated blood is determined using specimens collected from patients with SLE and normal donors. Three-level C4-coated positive beads are run daily as controls. Analytical validity is reported using intra-day and inter-day coefficients of variation (CV). EC4d and BC4d are stable for 2 days at ambient temperature and for 4 days at 4°C post-phlebotomy. Median intra-day and inter-day CV range from 2.9% to 7.8% (n=30) and 7.3% to 12.4% (n=66), respectively. The 2-tiered index score is reproducible over 4 consecutive days upon storage of blood at 4°C. A total of 2,888 three-level quality control data points were collected from 6 flow cytometers with an overall failure rate below 3%. Median EC4d level is 6 net MFI (interquartile [IQ] range 4-9 net MFI) and median BC4d is 18 net MFI (IQ range 13-27 net MFI) among 86,852 specimens submitted for testing. The incidence of 2-tiered positive test results is 13.4%. We have established the analytical validity of a multi-analyte assay panel for SLE. Copyright © 2017 Elsevier B.V. All rights reserved.
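A 2-tiered index of this general shape could be sketched as below; the MFI cutoffs and the exact combination rule are illustrative assumptions, not the validated panel's algorithm:

```python
def two_tier_result(ana_positive, anti_dsdna_positive, ec4d_mfi, bc4d_mfi,
                    ec4d_cutoff=14, bc4d_cutoff=60):
    """Tier 1: a highly specific autoantibody alone reports positive.
    Tier 2: otherwise require ANA plus elevated cell-bound C4d.
    Cutoffs and the combination rule are hypothetical."""
    if anti_dsdna_positive:
        return "positive"
    if ana_positive and (ec4d_mfi > ec4d_cutoff or bc4d_mfi > bc4d_cutoff):
        return "positive"
    return "negative"

# An ANA-positive specimen with elevated BC4d reports positive
print(two_tier_result(True, False, 10, 75))
```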

  7. Research and Design of the Three-tier Distributed Network Management System Based on COM / COM + and DNA

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Bi, Yushen

Considering the distributed network management system's demands for distribution, extensibility and reusability, a framework model for a three-tier distributed network management system based on COM/COM+ and DNA is proposed, adopting software component technology and the N-tier application software framework design approach. We also give a concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.

  8. CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.

    PubMed

    Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee

    2018-04-20

The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. There are already existing tools for analyzing and visualizing alternative mRNA splicing patterns in large-scale RNA-seq experiments. However, these existing web-based tools are limited to analyzing one TCGA data type at a time, such as transcriptomic information alone. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratios to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns with clinical data to identify potential transcripts associated with different survival outcomes for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.

  9. Chromhome: A rich internet application for accessing comparative chromosome homology maps

    PubMed Central

    Nagarajan, Sridevi; Rens, Willem; Stalker, James; Cox, Tony; Ferguson-Smith, Malcolm A

    2008-01-01

Background Comparative genomics has become a significant research area in recent years, following the availability of a number of sequenced genomes. The comparison of genomes is of great importance in the analysis of functionally important genome regions. It can also be used to understand the phylogenetic relationships of species and the mechanisms leading to rearrangement of karyotypes during evolution. Many species have been studied at the cytogenetic level by cross species chromosome painting. With the large amount of such information, it has become vital to computerize the data and make them accessible worldwide. Chromhome is a comprehensive web application that is designed to provide cytogenetic comparisons among species and to fulfil this need. Results The Chromhome application architecture is multi-tiered with an interactive client layer, business logic and database layers. The Enterprise Java platform with the open-source framework OpenLaszlo is used to implement the Chromhome rich internet application. Cross-species comparative mapping raw data are collected and the processed information is stored in the MySQL Chromhome database. Chromhome Release 1.0 contains 109 homology maps from 51 species. The data cover species from 14 orders and 30 families. The homology map displays all the chromosomes of the compared species as one image, making comparisons among species easier. Inferred data also provide maps of homologous regions that could serve as a guideline for researchers involved in phylogenetic or evolution based studies. Conclusion Chromhome provides a useful resource for comparative genomics, holding graphical homology maps of a wide range of species. It brings together cytogenetic data of many genomes under one roof. Inferred painting can often determine the chromosomal homologous regions between two species, if each has been compared with a common third species. 
Inferred painting greatly reduces the need to map entire genomes and helps focus only on relevant regions of the chromosomes of the species under study. Future releases of Chromhome will accommodate more species and their respective gene and BAC maps, in addition to chromosome painting data. Chromhome application provides a single-page interface (SPI) with desktop style layout, delivering a better and richer user experience. PMID:18366796
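The inferred-painting idea above (linking two species through their shared comparisons with a third) is essentially a transitive join over homology maps. A minimal sketch, with hypothetical region labels and no claim to match Chromhome's actual algorithm:

```python
def infer_homology(a_to_c, b_to_c):
    """Infer A<->B homologies via a common reference species C:
    if region a maps to some c and region b maps to the same c, link a and b."""
    # Invert the B->C map so we can look up B regions by C region.
    c_to_b = {}
    for b, cs in b_to_c.items():
        for c in cs:
            c_to_b.setdefault(c, set()).add(b)
    inferred = {}
    for a, cs in a_to_c.items():
        for c in cs:
            inferred.setdefault(a, set()).update(c_to_b.get(c, set()))
    return inferred

# Hypothetical painting data: chromosome regions keyed by species-C homologies.
a_to_c = {"A1": ["C1", "C2"], "A2": ["C3"]}
b_to_c = {"B1": ["C2"], "B2": ["C3", "C4"]}
inferred = infer_homology(a_to_c, b_to_c)
print(inferred)
```

Here A1 and B1 are linked because both map to C2, and A2 and B2 because both map to C3, without a direct A-to-B painting experiment.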

  10. Chromhome: a rich internet application for accessing comparative chromosome homology maps.

    PubMed

    Nagarajan, Sridevi; Rens, Willem; Stalker, James; Cox, Tony; Ferguson-Smith, Malcolm A

    2008-03-26

Comparative genomics has become a significant research area in recent years, following the availability of a number of sequenced genomes. The comparison of genomes is of great importance in the analysis of functionally important genome regions. It can also be used to understand the phylogenetic relationships of species and the mechanisms leading to rearrangement of karyotypes during evolution. Many species have been studied at the cytogenetic level by cross species chromosome painting. With the large amount of such information, it has become vital to computerize the data and make them accessible worldwide. Chromhome http://www.chromhome.org is a comprehensive web application that is designed to provide cytogenetic comparisons among species and to fulfil this need. The Chromhome application architecture is multi-tiered with an interactive client layer, business logic and database layers. The Enterprise Java platform with the open-source framework OpenLaszlo is used to implement the Chromhome rich internet application. Cross-species comparative mapping raw data are collected and the processed information is stored in the MySQL Chromhome database. Chromhome Release 1.0 contains 109 homology maps from 51 species. The data cover species from 14 orders and 30 families. The homology map displays all the chromosomes of the compared species as one image, making comparisons among species easier. Inferred data also provide maps of homologous regions that could serve as a guideline for researchers involved in phylogenetic or evolution based studies. Chromhome provides a useful resource for comparative genomics, holding graphical homology maps of a wide range of species. It brings together cytogenetic data of many genomes under one roof. Inferred painting can often determine the chromosomal homologous regions between two species, if each has been compared with a common third species. 
Inferred painting greatly reduces the need to map entire genomes and helps focus only on relevant regions of the chromosomes of the species under study. Future releases of Chromhome will accommodate more species and their respective gene and BAC maps, in addition to chromosome painting data. Chromhome application provides a single-page interface (SPI) with desktop style layout, delivering a better and richer user experience.

  11. Multi-tiered Approach to Development of Increased Throughput Assay Models to Assess Endocrine-Disrupting Activity of Chemicals

    EPA Science Inventory

    Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...

  12. Resistance Management Research Status

    EPA Science Inventory

    Long-term sustainability of genetically modified corn expressing Bt relies on the validity of assumptions underlying IRM models used by the EPA and the ability of EPA to monitor, detect and react to insect resistance when it develops. The EPA is developing a multi-tiered approac...

  13. SEM Characterization of Extinguished Grains from Plasma-Ignited M30 Charges

    NASA Technical Reports Server (NTRS)

    Kinkennon, A.; Birk, A.; DelGuercio, M.; Kaste, P.; Lieb, R.; Newberry, J.; Pesce-Rodriguez, R.; Schroeder, M.

    2000-01-01

M30 propellant grains that had been ignited in interrupted closed bomb experiments were characterized by scanning electron microscopy (SEM). Previous chemical analysis of extinguished grains had given no indications of plasma-propellant chemical interactions that could explain the increased burning rates that had been previously observed in full-pressure closed bomb experiments. (This does not mean that there is no unique chemistry occurring with plasma ignition. It may occur very early in the ignition event and then become obscured by the burning chemistry.) In this work, SEM was used to look at grain morphologies to determine if there were increases in the surface areas of the plasma-ignited grains which would contribute to the apparent increase in the burning rate. Charges were made using 30 propellant grains (approximately 32 grams) stacked in two tiers and in two concentric circles around a plastic straw. Each grain was notched so that, when the grains were expelled from the bomb during extinguishment, it could be determined in which tier and which circle each grain was originally packed. Charges were ignited in a closed bomb by either a nickel wire/Mylar-capillary plasma or black powder. The bomb contained a blowout disk that ruptured when the pressure reached 35 MPa, and the propellant was vented into a collection chamber packed with polyurethane foam. SEM analysis of the grains fired with a conventional black powder igniter showed no signs of unusual burning characteristics. The surfaces seemed to be evenly burned on the exteriors of the grains and in the perforations. Grains that had been subjected to plasma ignition, however, had pits, gouges, chasms, and cracks in the surfaces. The sides of the grains closest to the plasma had the greatest amount of damage, but even surfaces facing the outer wall of the bomb had small pits. The perforations contained gouges and abnormally burned regions (wormholes) that extended into the web. 
The SEM photos indicated that a grain from the top tier, which was farther away from the plasma ignition source, sustained more plasma-induced damage to the perforations and the web than did the grains on the bottom tier.

  14. Train the trainer? A randomized controlled trial of a multi-tiered oral health education programme in community-based residential services for adults with intellectual disability.

    PubMed

    Mac Giolla Phadraig, Caoimhin; Guerin, Suzanne; Nunn, June

    2013-04-01

To assess the impact of a multi-tiered oral health education programme on care staff caring for people with intellectual disability (ID), postal questionnaires were sent to all care staff of a community-based residential care service for adults, randomly assigned to control and intervention groups. A specifically developed training programme was delivered to residential staff nominees, who then trained all staff within the intervention group. The control group received no training. Post-test questionnaires were sent to both groups. A paired-samples t-test was used to compare oral health-related knowledge (K) and behaviour, attitude and self-efficacy (BAS) scores. Of the initial 219 respondents, 154 (response rates between 40% and 35.8%, with an attrition rate of 29.7% from baseline to repeat) returned completed questionnaires at post-test (M=8.5 months, range=6.5-11 months). Control and intervention groups were comparable for general training, employment and demographic variables. In the intervention group, the mean Knowledge Index score rose from K=7.2 to K=7.9 (P<0.001) and the mean BAS scale score rose from BAS=4.7 to BAS=5.4 (P<0.001). There was no statistically significant increase in mean scores from test (K=7.0, BAS=4.7) to post-test (K=7.2, BAS=4.9) for the control group. Mean scores regarding knowledge, attitude, self-efficacy and reported behaviour increased significantly at 8.5 months in staff where training was provided. The results indicate that a multi-tiered training programme improved knowledge, attitude, self-efficacy and reported behaviour amongst staff caring for people with ID. © 2012 John Wiley & Sons A/S.
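The analysis above compares pre- and post-test scores with a paired-samples t-test, whose statistic is t = mean(d) / (sd(d) / √n) over the paired differences d. A minimal sketch; the scores below are invented for illustration, and in practice a statistics package (e.g. SciPy's `ttest_rel`) would also supply the p-value:

```python
import math
from statistics import mean, stdev

def paired_t_statistic(pre, post):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = post - pre."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical knowledge-index scores for five staff members, before and after training.
pre = [7.0, 6.5, 7.5, 7.2, 6.8]
post = [7.8, 7.4, 8.1, 7.9, 7.6]
print(f"t = {paired_t_statistic(pre, post):.2f}")
```

A large positive t on the intervention group and a near-zero t on the control group would correspond to the pattern of significance reported in the abstract.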

  15. Trend and impact of international collaboration in clinical medicine papers published in Malaysia.

    PubMed

    Low, Wah Yun; Ng, Kwan Hoong; Kabir, M A; Koh, Ai Peng; Sinnasamy, Janaki

    2014-01-01

Research collaboration is a way to improve the quality and impact of research findings, and international research collaboration has resulted in international co-authorship in scientific communications and publications. This study highlights collaborative research and authorship trends in clinical medicine in Malaysia from 2001 to 2010. Malaysian-based author affiliations in the Web of Science (Science Citation Index Expanded) and clinical medicine journals ( n  = 999) and articles ( n  = 3951) as of 30th Oct 2011 were downloaded. The types of document analyzed were articles and reviews, and impact factors (IF) in the 2010 Journal Citation Report Science Edition were taken to assess the quality of the articles. The number of publications in clinical medicine increased from 4.5 % ( n  = 178) in 2001 to 23.9 % ( n  = 944) in 2010. The top three contributors in the subject categories are Pharmacology and Pharmacy (13.9 %), General and Internal Medicine (13.6 %) and Tropical Medicine (7.3 %). By journal tier system: Tier 1 (18.7 %, n  = 738), Tier 2 (22.5 %, n  = 888), Tier 3 (29.6 %, n  = 1170), Tier 4 (27.2 %, n  = 1074), and journals without IF (2.1 %, n  = 81). University of Malaya was the most productive. Local collaborators accounted for 60.3 % and international collaborations 39.7 %. Articles with international collaborations appeared in journals with higher IFs than those without international collaboration, and were also cited significantly more often. Citations, impact factor and journal tiers were significantly associated with international collaboration in Malaysia's clinical medicine publications. Malaysia has achieved a significant number of ISI publications in clinical medicine through participation in international collaboration.
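Journal tier systems of the kind used above commonly assign Tier 1-4 by impact-factor quartile within a ranked list; the study does not state its exact cut-offs, so the quartile scheme below is an assumption, and the journal names and impact factors are invented:

```python
def assign_tiers(impact_factors):
    """Split journals into quartile tiers (Tier 1 = top quartile by IF).

    Assumes a simple quartile split; real tier systems may use
    per-subject-category rankings instead of one global list.
    """
    ranked = sorted(impact_factors.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    tiers = {}
    for i, (journal, _) in enumerate(ranked):
        tiers[journal] = 1 + (4 * i) // n  # position -> quartile 1..4
    return tiers

# Hypothetical journals with made-up impact factors.
impact_factors = {"A": 10.0, "B": 8.0, "C": 6.0, "D": 5.0,
                  "E": 4.0, "F": 3.0, "G": 2.0, "H": 1.0}
tiers = assign_tiers(impact_factors)
print(tiers)
```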

  16. RCrawler: An R package for parallel web crawling and scraping

    NASA Astrophysics Data System (ADS)

    Khalil, Salim; Fakir, Mohamed

RCrawler is a contributed R package for domain-based web crawling and content scraping. As the first implementation of a parallel web crawler in the R environment, RCrawler can crawl, parse, store pages, extract contents, and produce data that can be directly employed for web content mining applications. However, it is also flexible, and could be adapted to other applications. The main features of RCrawler are multi-threaded crawling, content extraction, and duplicate content detection. In addition, it includes functionalities such as URL and content-type filtering, depth level controlling, and a robots.txt parser. The crawler is highly optimized and can download a large number of pages per second while being robust against certain crashes and spider traps. In this paper, we describe the design and functionality of RCrawler, and report on our experience of implementing it in an R environment, including different optimizations that handle the limitations of R. Finally, we discuss our experimental results.
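The crawling behaviour described above (breadth-first traversal with depth-level control and duplicate-URL detection, restricted to one domain) can be sketched as follows. RCrawler itself is written in R; this is a language-neutral illustration with placeholder fetch and link-extraction callbacks, not RCrawler's actual implementation:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

def crawl(start_url, fetch, extract_links, max_depth=2):
    """Breadth-first, depth-limited crawl of a single domain.

    `fetch(url)` returns page content; `extract_links(content)` returns hrefs.
    A visited set provides the duplicate-URL detection.
    """
    domain = urlparse(start_url).netloc
    visited, pages = set(), {}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if url in visited or depth > max_depth:
            continue
        visited.add(url)
        content = fetch(url)
        pages[url] = content
        for href in extract_links(content):
            link = urljoin(url, href)  # resolve relative links
            if urlparse(link).netloc == domain and link not in visited:
                queue.append((link, depth + 1))
    return pages

# Toy in-memory "site" so the sketch runs without network access:
# each page's "content" is just its URL, mapped to its outgoing links.
site = {
    "http://example.com/": ["/a", "/b"],
    "http://example.com/a": ["/b"],
    "http://example.com/b": [],
}
pages = crawl("http://example.com/", fetch=lambda u: u,
              extract_links=lambda u: site.get(u, []))
print(sorted(pages))
```

A real crawler would add the robots.txt check (e.g. via `urllib.robotparser`), content-type filtering, and a thread or process pool for parallel fetching.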

  17. DESIGNING A COMPREHENSIVE, INTEGRATED WATER RESOURCES MONITORING PROGRAM FOR FLORIDA

    EPA Science Inventory

    Proceedings of the National Water Quality Monitoring Conference "Monitoring Critical Foundations to Protect Our Waters," 7-9 July 1998, Reno, NV.

    In late 1996, Florida Department of Environmental Protection (FDEP) initiated an effort to design a multi-tiered monitoring and...

  18. Resistance Management Research Status-May 2008

    EPA Science Inventory

    Long-term sustainability of genetically modified corn expressing Bt relies on the validity of assumptions underlying IRM models used by the EPA and the ability of EPA to monitor, detect and react to insect resistance when it develops. The EPA is developing a multi-tiered approac...

  19. Transforming Big Data into cancer-relevant insight: An initial, multi-tier approach to assess reproducibility and relevance

    PubMed Central

    2016-01-01

    The Cancer Target Discovery and Development (CTD2) Network was established to accelerate the transformation of “Big Data” into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding. This manuscript represents a first attempt to delineate the challenges of supporting and confirming discoveries arising from the systematic analysis of large-scale data resources in a collaborative work environment and to provide a framework that would begin a community discussion to resolve these challenges. The Network implemented a multi-Tier framework designed to substantiate the biological and biomedical relevance as well as the reproducibility of data and insights resulting from its collaborative activities. The same approach can be used by the broad scientific community to drive development of novel therapeutic and biomarker strategies for cancer. PMID:27401613

  20. Sustainability assessment of electrokinetic bioremediation compared with alternative remediation options for a petroleum release site.

    PubMed

    Gill, R T; Thornton, S F; Harbottle, M J; Smith, J W N

    2016-12-15

Sustainable management practices can be applied to the remediation of contaminated land to maximise the economic, environmental and social benefits of the process. The Sustainable Remediation Forum UK (SuRF-UK) have developed a framework to support the implementation of sustainable practices within contaminated land management and decision making. This study applies the framework, including qualitative (Tier 1) and semi-quantitative (Tier 2) sustainability assessments, to a complex site where the principal contaminant source is unleaded gasoline, giving rise to a dissolved phase BTEX and MTBE plume. The pathway is groundwater migration through a chalk aquifer and the receptor is a water supply borehole. A hydraulic containment system (HCS) has been installed to manage the MTBE plume migration. The options considered to remediate the MTBE source include monitored natural attenuation (MNA), air sparging/soil vapour extraction (AS/SVE), pump and treat (PT) and electrokinetic-enhanced bioremediation (EK-BIO). A sustainability indicator set from the SuRF-UK framework, including priority indicator categories selected during a stakeholder engagement workshop, was used to frame the assessments. At Tier 1 the options are ranked based on qualitative supporting information, whereas in Tier 2 a multi-criteria analysis is applied. Furthermore, the multi-criteria analysis was refined for scenarios where photovoltaics (PVs) are included and amendments are excluded from the EK-BIO option. Overall, the analysis identified AS/SVE and EK-BIO as more sustainable remediation options at this site than either PT or MNA. 
The wider implications of this study include: (1) an appraisal of the management decision from each Tier of the assessment with the aim to highlight areas for time and cost savings for similar assessments in the future; (2) the observation that EK-BIO performed well against key indicator categories compared to the other intensive treatments; and (3) introducing methods to improve the sustainability of the EK-BIO treatment design (such as PVs) did not have a significant effect in this instance. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
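The Tier 2 multi-criteria analysis described above amounts to scoring each remediation option against weighted indicator categories and ranking by the weighted sum. A minimal sketch; the scores and weights below are invented for illustration and are not the values used in the study:

```python
def weighted_sum_scores(options, weights):
    """Rank options by a weighted sum of their criterion scores, best first."""
    return sorted(
        ((name, sum(w * s for w, s in zip(weights, scores)))
         for name, scores in options.items()),
        key=lambda pair: pair[1], reverse=True)

# Hypothetical 1-5 scores against three indicator categories
# (environmental, social, economic) with made-up stakeholder weights.
options = {
    "MNA":    [4, 2, 5],
    "AS/SVE": [4, 4, 3],
    "PT":     [2, 3, 2],
    "EK-BIO": [5, 4, 3],
}
weights = [0.5, 0.3, 0.2]
for name, score in weighted_sum_scores(options, weights):
    print(name, round(score, 2))
```

With these illustrative numbers EK-BIO and AS/SVE come out ahead of MNA and PT, mirroring the shape (though not the substance) of the study's conclusion.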

  1. A two-tier atmospheric circulation classification scheme for the European-North Atlantic region

    NASA Astrophysics Data System (ADS)

    Guentchev, Galina S.; Winkler, Julie A.

    A two-tier classification of large-scale atmospheric circulation was developed for the European-North-Atlantic domain. The classification was constructed using a combination of principal components and k-means cluster analysis applied to reanalysis fields of mean sea-level pressure for 1951-2004. Separate classifications were developed for the winter, spring, summer, and fall seasons. For each season, the two classification tiers were identified independently, such that the definition of one tier does not depend on the other tier having already been defined. The first tier of the classification is comprised of supertype patterns. These broad-scale circulation classes are useful for generalized analyses such as investigations of the temporal trends in circulation frequency and persistence. The second, more detailed tier consists of circulation types and is useful for numerous applied research questions regarding the relationships between large-scale circulation and local and regional climate. Three to five supertypes and up to 19 circulation types were identified for each season. An intuitive nomenclature scheme based on the physical entities (i.e., anomaly centers) which dominate the specific patterns was used to label each of the supertypes and types. Two example applications illustrate the potential usefulness of a two-tier classification. In the first application, the temporal variability of the supertypes was evaluated. In general, the frequency and persistence of supertypes dominated by anticyclonic circulation increased during the study period, whereas the supertypes dominated by cyclonic features decreased in frequency and persistence. The usefulness of the derived circulation types was exemplified by an analysis of the circulation associated with heat waves and cold spells reported at several cities in Bulgaria. 
These extreme temperature events were found to occur with a small number of circulation types, a finding that can be helpful in understanding past variability and projecting future changes in the occurrence of extreme weather and climate events.
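The two-step procedure above (principal components for dimension reduction of the pressure fields, then k-means clustering of the PC scores) can be sketched as follows. The grid size, component count, and cluster count here are arbitrary placeholders, not the authors' configuration:

```python
import numpy as np

def classify_circulation(fields, n_components=3, k=4, iters=20, seed=0):
    """Two-step classification: PCA for dimension reduction, then k-means.

    `fields` is (n_days, n_gridpoints) of mean sea-level pressure values;
    returns one cluster label per day. A simplified stand-in for the
    combined PCA / k-means procedure described in the abstract.
    """
    X = fields - fields.mean(axis=0)           # anomalies about the mean field
    # PCA via SVD: project onto the leading principal components.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T
    # Plain Lloyd's k-means on the PC scores.
    rng = np.random.default_rng(seed)
    centroids = scores[rng.choice(len(scores), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(scores[:, None, :] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = scores[labels == j].mean(axis=0)
    return labels

# Synthetic example: 200 "days" over a 50-point grid.
rng = np.random.default_rng(1)
fields = rng.normal(size=(200, 50))
labels = classify_circulation(fields)
print(labels.shape, sorted(set(labels.tolist())))
```

In the paper's two-tier setting, one run with a small k would yield the supertypes and a separate run with a larger k the circulation types.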

  2. Optical fibre multi-parameter sensing with secure cloud based signal capture and processing

    NASA Astrophysics Data System (ADS)

    Newe, Thomas; O'Connell, Eoin; Meere, Damien; Yuan, Hongwei; Leen, Gabriel; O'Keeffe, Sinead; Lewis, Elfed

    2016-05-01

Recent advancements in cloud computing technologies in the context of optical and optical fibre based systems are reported. The proliferation of real-time and multi-channel sensor systems represents significant growth in data volume. This, coupled with a growing need for security, presents many challenges and a huge opportunity for an evolutionary step in the widespread application of these sensing technologies. A tiered infrastructural system approach is adopted that is designed to facilitate the delivery of optical fibre-based "SENsing as a Service" (SENaaS). Within this infrastructure, novel optical sensing platforms, deployed within different environments, are interfaced with a Cloud-based backbone infrastructure which facilitates the secure collection, storage and analysis of real-time data. Feedback systems, which harness this data to effect a change within the monitored location/environment/condition, are also discussed. The cloud based system presented here can also be used with chemical and physical sensors that require real-time data analysis, processing and feedback.

  3. 77 FR 44047 - Federal Acquisition Regulation; Reporting Executive Compensation and First-Tier Subcontract Awards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... Compensation B. Definitions C. Thresholds D. Paperwork Burden E. Applicability F. Subcontract Award Data G... definition of ``first- tier subcontractor.'' Comment: A number of respondents believed that executive... technologies and ideas, and increase the Government's costs by reducing competition. These respondents also...

  4. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

This data- and event-driven persistent storage system leverages commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
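The persist-then-publish pattern the abstract describes (events logged to database tables and simultaneously pushed to subscribed monitoring applications) can be sketched generically. SMDB itself uses Oracle stored procedures and JMS; the classes, category names, and example message below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    category: str   # structured classification, e.g. a hypothetical "SCHEDULING"
    message: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class EventLog:
    """Toy stand-in for an integrated event logging system: each event is
    persisted (here, to a list standing in for a database table) and then
    pushed to every subscriber, as a JMS publish would."""
    def __init__(self):
        self.records = []
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        self.records.append(event)      # persist for historical analysis
        for cb in self.subscribers:     # notify monitoring applications
            cb(event)

log = EventLog()
seen = []
log.subscribe(seen.append)
log.publish(Event("SCHEDULING", "hypothetical equipment-allocation message"))
print(len(log.records), seen[0].category)
```

The historical archive and the real-time feed come from the same `publish` call, which is the essence of the pattern.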

  5. S3DB core: a framework for RDF generation and management in bioinformatics infrastructures

    PubMed Central

    2010-01-01

Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way of the personalization of Medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed was found to address the design criteria of biomedical computational infrastructure, such as those supporting large scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315

  6. Flood Resilient Systems and their Application for Flood Resilient Planning

    NASA Astrophysics Data System (ADS)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. This can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of developing resilient cities - a resilient built environment - can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales, from the building to the city level. A method to define and systematise different FRS across those scales remains a matter of research. Further, the decision on which resilient system is to be applied for the given conditions and given scale is a complex task, calling for the utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social system and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT and findings on its efficiency into decision support tools. 
This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system, developed within the 7th FP Project SMARTeST. A web-based three-tier advisory system, FLORETO-KALYPSO (http://floreto.wb.tu-harburg.de/, Manojlovic et al, 2009), devoted to supporting the decision-making process at the building level, has been further developed to support multi-scale decision making on resilient systems, improving the existing data mining algorithms of the Business Logic tier. Further tuning of the algorithms is to be performed based on new developments and findings on the applicability and efficiency of different FRe Technology for different flood typologies. The first results obtained at the case studies in Greater Hamburg, Germany indicate the potential of this approach to contribute to multi-scale resilient planning on the road to flood resilient cities. FIAC (2007): "Final report from the Awareness and Assistance Sub-committee", FIAC, Scottish Government. Zevenbergen C. et al (2008): "Challenges in urban flood management: travelling across spatial and temporal scales", Journal of FRM Volume 1 Issue 2, p 81-88. Manojlovic N., et al (2009): "Capacity Building in FRM through a DSS Utilising Data Mining Approach", Proceed. 8th HIC, Concepcion, Chile, January, 2009.

  7. 75 FR 80790 - Multi-Family Housing Program 2011 Industry Forums-Open Teleconference and/or Web Conference Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ...--Open Teleconference and/or Web Conference Meetings AGENCY: Rural Housing Service, USDA. ACTION: Notice. SUMMARY: This Notice announces a series of teleconference and/or Web conference meetings regarding the USDA Multi-Family Housing Program. The teleconference and/or Web conference meetings will be scheduled...

  8. FwWebViewPlus: integration of web technologies into WinCC OA based Human-Machine Interfaces at CERN

    NASA Astrophysics Data System (ADS)

    Golonka, Piotr; Fabian, Wojciech; Gonzalez-Berges, Manuel; Jasiun, Piotr; Varela-Rodriguez, Fernando

    2014-06-01

    The rapid growth in popularity of web applications gives rise to a plethora of reusable graphical components, such as Google Chart Tools and jQuery Sparklines, implemented in JavaScript and run inside a web browser. In this paper we describe a tool that allows seamless integration of web-based widgets into WinCC Open Architecture, the SCADA system commonly used at CERN to build complex Human-Machine Interfaces. Reusing widely available widget libraries and pushing the development effort to a higher abstraction layer based on a scripting language allow for a significant reduction in code maintenance in multi-platform environments compared to the C++ visualization plugins currently used. Adequately designed interfaces allow for rapid integration of new web widgets into WinCC OA. At the same time, the mechanisms familiar to HMI developers are preserved, making the use of new widgets "native". Perspectives for further integration between the realms of WinCC OA and Web development are also discussed.

  9. RiboSketch: Versatile Visualization of Multi-stranded RNA and DNA Secondary Structure.

    PubMed

    Lu, Jacob S; Bindewald, Eckart; Kasprzak, Wojciech; Shapiro, Bruce A

    2018-06-15

    Creating clear, visually pleasing 2D depictions of RNA and DNA strands and their interactions is important to facilitate and communicate insights related to nucleic acid structure. Here we present RiboSketch, a secondary structure image production application that enables the visualization of multistranded structures via layout algorithms, comprehensive editing capabilities, and a multitude of simulation modes. These interactive features allow RiboSketch to create publication quality diagrams for structures with a wide range of composition, size, and complexity. The program may be run in any web browser without the need for installation, or as a standalone Java application. https://binkley2.ncifcrf.gov/users/bindewae/ribosketch_web.

  10. wHospital: a web-based application with digital signature for drugs dispensing management.

    PubMed

    Rossi, Lorenzo; Margola, Lorenzo; Manzelli, Vacia; Bandera, Alessandra

    2006-01-01

    wHospital is the result of an information technology research project based on the use of a web-based application for managing hospital drug dispensing. Part of the wHospital backbone, and its key distinguishing characteristic, is the adoption of the digital signature system initially deployed by the Government of Lombardia, a Northern Italy region, through the distribution of smart cards to all healthcare and hospital staff. The developed system is a web-based application with a proposed Health Records Digital Signature (HReDS) handshake to comply with the national law and with the Joint Commission International standards. The prototype application, for a single hospital Operative Unit (OU), has focused on data and process management related to drug therapy. Following a multi-faceted selection process, the Infective Disease OU of the Hospital in Busto Arsizio, Lombardia, was chosen for the development and prototype implementation. The project lead time, from user requirement analysis to training and deployment, was approximately 8 months. This paper highlights the applied project methodology, the system architecture, and the preliminary results achieved.

  11. Techno-economic performance evaluation of solar tower plants with integrated multi-layered PCM thermocline thermal energy storage - A comparative study to conventional two-tank storage systems

    NASA Astrophysics Data System (ADS)

    Guedéz, Rafael; Ferruzza, Davide; Arnaudo, Monica; Rodríguez, Ivette; Perez-Segarra, Carlos D.; Hassar, Zhor; Laumert, Björn

    2016-05-01

    Solar tower power plants with thermal energy storage are a promising technology for dispatchable renewable energy in the near future. Storage integration makes it possible to shift electricity production to more profitable peak hours. Usually two tanks are used to store cold and hot fluids, but this means both higher investment costs and difficulties in operating the variable-volume tanks. An alternative solution is a single-tank thermocline storage in a multi-layered configuration. In such a tank, both latent and sensible fillers are employed to decrease the related cost by up to 30% while maintaining high efficiencies. This paper analyses a multi-layered solid PCM storage tank concept for solar tower applications and describes a comprehensive methodology to determine under which market structures such devices can outperform the more conventional two-tank storage systems. A detailed model of the tank has been developed and introduced into an existing techno-economic tool developed by the authors (DYESOPT). The results show that, under current cost estimates and technical limitations, the multi-layered solid PCM storage concept is a better solution when peaking operating strategies are desired, as is the case for the two-tier South African tariff scheme.

  12. Evolution of the Building Management System in the INFN CNAF Tier-1 data center facility.

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Donatelli, Massimo; Falabella, Antonio; Mazza, Andrea; Onofri, Michele

    2017-10-01

    The INFN CNAF Tier-1 data center is composed of two main rooms containing IT resources and four additional locations that host the technology infrastructure providing electrical power and cooling to the facility. Power supply and continuity are ensured by a dedicated room with three 15,000 V to 400 V transformers in a separate part of the principal building and two redundant 1.4 MW diesel rotary uninterruptible power supplies. Cooling is provided by six free-cooling chillers of 320 kW each in an N+2 redundancy configuration. Given the complex physical distribution of the technical plants, a detailed Building Management System (BMS) was designed and implemented as part of the original project in order to monitor and collect all the necessary information and to provide alarms in case of malfunctions or major failures. After almost 10 years of service, a revision of the BMS was necessary. In addition, the increasing cost of electrical power is nowadays a strong motivation for improving the energy efficiency of the infrastructure. The exact calculation of the power usage effectiveness (PUE) metric has therefore become one of the most important factors when optimizing a modern data center. For these reasons, an evolution of the BMS was designed using the Schneider StruxureWare infrastructure hardware and software products. This solution proves to be a natural and flexible development of the previous TAC Vista software, with advantages in ease of use and the possibility to customize the data collection and the graphical interface displays. Moreover, the addition of protocols such as open-standard Web services makes it possible to communicate with the BMS from custom user applications and permits the exchange of data and information over the Web between different third-party systems.
Specific Web services SOAP requests have been implemented in our Tier-1 monitoring system in order to collect historical trends of power demands and to calculate the partial PUE (pPUE) of specific parts of the infrastructure. This helps identify "spots" that may need further energy optimization. The StruxureWare system maintains compatibility with standard protocols like Modbus as well as native LonWorks, making it possible to reuse the existing network between physical locations as well as a considerable number of programmable controllers and I/O modules that interact with the facility. The greatly increased detail of statistical information about power consumption and HVAC (heating, ventilation and air conditioning) parameters could prove to be a very valuable strategic choice for improving the overall PUE. This will bring remarkable benefits to overall management costs, despite the limits of the facility's non-optimal current location, and it will help us in the process of making a more energy-efficient data center that embraces the concept of green IT.
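The partial PUE mentioned in this record is a standard Green Grid metric: the total power entering a zone (IT load plus supporting infrastructure) divided by the IT power consumed in that zone. A minimal sketch of the calculation, with hypothetical readings standing in for values that would be retrieved via the SOAP requests (the figures and data layout are illustrative assumptions, not from the paper):

```python
def partial_pue(it_kw: float, infra_kw: float) -> float:
    """Partial PUE of a zone: (IT load + supporting infrastructure load) / IT load."""
    if it_kw <= 0:
        raise ValueError("IT load must be positive")
    return (it_kw + infra_kw) / it_kw

# Hypothetical hourly readings (kW) for one room
it_load = [430.0, 455.0, 470.0]                  # servers, storage, network
cooling_and_losses = [140.0, 150.0, 158.0]       # chillers, UPS losses, etc.

hourly_ppue = [partial_pue(it, infra)
               for it, infra in zip(it_load, cooling_and_losses)]

# Energy-weighted average over the period (closer to what a trend report uses)
avg_ppue = (sum(it_load) + sum(cooling_and_losses)) / sum(it_load)
```

A pPUE of 1.0 would mean all power in the zone reaches IT equipment; the gap above 1.0 quantifies the "spots" worth optimizing.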

  13. Advances in Schoolwide Inclusive School Reform

    ERIC Educational Resources Information Center

    Sailor, Wayne

    2015-01-01

    This article highlights three significant advances in schoolwide inclusive school reform and suggests three next steps to improve educational outcomes for "all" students, particularly for students for whom typical instruction is not effective. Significant advances are as follows: (a) a multi-tiered system of support (MTSS) with embedded…

  14. RTI Goes Mainstream

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2010-01-01

    In more districts than ever, Response-to-Intervention programs are gaining ground, nipping learning problems in the bud and keeping more students out of unnecessary special education classes, which is the goal. RTI, a multi-tier intervention used to diagnose and address potential learning or behavioral problem early, is also increasing in…

  15. Current Advances and Future Directions in Behavior Assessment

    ERIC Educational Resources Information Center

    Riley-Tillman, T. Chris; Johnson, Austin H.

    2017-01-01

    Multi-tiered problem-solving models that focus on promoting positive outcomes for student behavior continue to be emphasized within educational research. Although substantial work has been conducted to support systems-level implementation and intervention for behavior, concomitant advances in behavior assessment have been limited. This is despite…

  16. Application of a web-based Decision Support System in risk management

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri

    2013-04-01

    Risk information is increasingly widely available, helped by advanced technologies such as Earth observation satellites and global positioning technologies, coupled with hazard modeling and analysis and geographical information systems (GIS). Yet even where such information exists, no action will follow if it is not properly presented to decision makers. This information needs to be communicated clearly and its usefulness demonstrated, so that people can make better informed decisions. Communicating available risk information has therefore become an important challenge, and decision support systems are one of the significant approaches that can help not only in presenting risk information to decision makers but also in making efficient decisions while reducing the human resources and time needed. In this study, the conceptual framework of an internet-based decision support system is presented to highlight its important role in the risk management framework and how it can be applied in the chosen case study areas. The main purpose of the proposed system is to make available risk information useful for risk reduction by taking into account changes in climate, land use and socio-economic conditions along with the risk scenarios. It allows users to formulate, compare and select risk reduction scenarios (mainly for floods and landslides) through an enhanced participatory platform with diverse stakeholders involved in the decision-making process. It is based on a three-tier (client-server) architecture which integrates web-GIS and DSS functionalities together with cost-benefit analysis and other supporting tools. Embedding web-GIS lets end users make better planning and informed decisions referenced to a geographical location, one of the essential factors in disaster risk reduction programs.
Different risk reduction measures for a specific area (local scale) will be evaluated using this web-GIS tool, available risk scenarios obtained from a Probabilistic Risk Assessment (PRA) model, and knowledge collected from experts. The visualization of the risk reduction scenarios can also be shared among users on the web to support the online participatory process. In addition, cost-benefit ratios of the different risk reduction scenarios can be prepared to serve as inputs for high-level decision makers. The most appropriate risk reduction scenarios will be chosen using the Multi-Criteria Evaluation (MCE) method, by weighting different parameters according to the preferences and criteria defined by the users. The role of public participation has been changing from one-way communication between authorities, experts, stakeholders and citizens towards more intensive two-way interaction. Involving the affected public and interest groups can enhance the level of legitimacy, transparency, and confidence in the decision-making process. Because of its important part in decision making, an online participatory tool is included in the DSS to involve stakeholders interactively in risk reduction and make them aware of the existing vulnerability conditions of the community. Moreover, it aims to achieve a more transparent and better informed decision-making process. The system is under development, and the first tools implemented will be presented, showing the wide possibilities of new web technologies, which can have a great impact on the decision-making process. It will be applied in four pilot areas in Europe: the French Alps, north-eastern Italy, Romania and Poland. Nevertheless, the framework will be designed and implemented in a way that is applicable to other regions.
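The MCE step described in this record is, in its simplest weighted-sum form, a normalise-weight-rank procedure. A minimal sketch under that assumption (the criterion names, scores and weights below are invented for illustration and are not from the paper):

```python
def mce_rank(scenarios: dict, weights: dict) -> list:
    """Rank scenarios by weighted sum of min-max normalised criterion scores.

    scenarios: {name: {criterion: raw_score}}, higher raw score = better.
    weights:   {criterion: weight}; weights need not sum to 1.
    """
    criteria = list(weights)
    lo = {c: min(s[c] for s in scenarios.values()) for c in criteria}
    hi = {c: max(s[c] for s in scenarios.values()) for c in criteria}
    total_w = sum(weights.values())

    def score(s):
        # Normalise each criterion to [0, 1], then apply the user-defined weights.
        return sum(
            weights[c] * ((s[c] - lo[c]) / (hi[c] - lo[c]) if hi[c] > lo[c] else 1.0)
            for c in criteria
        ) / total_w

    return sorted(scenarios, key=lambda name: score(scenarios[name]), reverse=True)

# Illustrative flood-mitigation scenarios scored on three criteria
scenarios = {
    "dike_raising":    {"risk_reduction": 0.8, "cost_benefit": 0.4, "acceptance": 0.5},
    "retention_basin": {"risk_reduction": 0.6, "cost_benefit": 0.7, "acceptance": 0.8},
    "early_warning":   {"risk_reduction": 0.3, "cost_benefit": 0.9, "acceptance": 0.9},
}
weights = {"risk_reduction": 0.5, "cost_benefit": 0.3, "acceptance": 0.2}
ranking = mce_rank(scenarios, weights)
```

Changing the weights is exactly the participatory lever the abstract describes: stakeholders with different priorities obtain, and can compare, different rankings of the same scenarios.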

  17. Development of a safe ground to space laser propagation system for the optical communications telescope laboratory

    NASA Technical Reports Server (NTRS)

    Wu, Janet P.

    2003-01-01

    Furthering pursuits in high-bandwidth communications with future NASA deep-space and near-Earth probes, the Jet Propulsion Laboratory (JPL) is building the Optical Communications Telescope Laboratory (OCTL) atop Table Mountain in Southern California. This R&D optical antenna will be used to develop optical communication strategies for future optical ground stations. Initial experiments to be conducted include propagating high-powered, Q-switched laser beams to retro-reflecting satellites. Laser beam propagation from the ground to space, however, is under the cognizance of various government agencies, namely: the Occupational Safety and Health Administration (OSHA), responsible for protecting workforce personnel; the Federal Aviation Administration (FAA), responsible for protecting pilots and aircraft; and the Laser Clearinghouse of Space Command, responsible for protecting space assets. To ensure that laser beam propagation from the OCTL and future autonomously operated ground stations complies with the guidelines of these organizations, JPL is developing a multi-tiered safety system that will meet the coordination, monitoring, and reporting functions required by the agencies. At Tier 0, laser operators will meet OSHA safety standards for protection, and access to the high-power laser area will be restricted and interlocked. Tier 1, the area from the telescope dome out to a range of 3.4 km, will use long-wave infrared camera sensors to alert operators to at-risk aircraft in FAA-controlled airspace. Tier 2, extending from 3.4 km out to the aircraft service ceiling in FAA airspace, will detect at-risk aircraft by radar. Lastly, beam propagation into space, defined as Tier 3, will require coordination with the Laser Clearinghouse. A detailed description of the four tiers is presented along with the design of the integrated monitoring and beam transmission control system.

  18. Improving three-tier environmental assessment model by using a 3D scanning FLS-AM series hyperspectral lidar

    NASA Astrophysics Data System (ADS)

    Samberg, Andre; Babichenko, Sergei; Poryvkina, Larisa

    2005-05-01

    The delay between the time when a natural disaster, for example an oil accident in coastal water, occurs and the time when environmental protection actions, for example water and shoreline clean-up, start is of significant importance. Remote sensing techniques are mostly considered (near) real-time and suitable for multiple tasks. These techniques, in combination with rapid environmental assessment methodologies, would form a multi-tier environmental assessment model which allows creating (near) real-time datasets and optimizing sampling scenarios. This paper presents the idea of a three-tier environmental assessment model. All three tiers are briefly described to show the linkages between them, with a particular focus on the first tier. Furthermore, it is described how large-scale environmental assessment can be improved by using an airborne 3-D scanning FLS-AM series hyperspectral lidar. This new aircraft-based sensor is typically applied to mapping oil on the sea or ground surface and extracting optical features of subjects. In general, a sampling network based on the three-tier environmental assessment model can include ships and aircraft. The airborne 3-D scanning FLS-AM series hyperspectral lidar helps to speed up significantly the whole process of assessing the area of a natural disaster, because it is a real-time remote sensing means. For instance, it can deliver such information as the georeferenced oil spill position in WGS-84, the estimated size of the whole oil spill, and the estimated amount of oil in seawater or on the ground. All information is produced in digital form and can thus be directly transferred into a customer's GIS (Geographical Information System).

  19. VT-136 Market Research and Sourcing Case Exercise

    DTIC Science & Technology

    2006-04-30

    Josh Parsons (USAF), Army AL&T Magazine, January-February 2005 Web Edition. “The Yoder Three-tier Model for Optimal Planning and Execution of...relevant to pricing and ability to determine “fair and reasonable” price2 Identification of suitable substitutes and alternative products or...Tests for Determining Fair and Reasonable Price By far, the best methodology for making the determination that a product or service cost or price

  20. A two-tiered self-powered wireless monitoring system architecture for bridge health management

    NASA Astrophysics Data System (ADS)

    Kurata, Masahiro; Lynch, Jerome P.; Galchev, Tzeno; Flynn, Michael; Hipley, Patrick; Jacob, Vince; van der Linden, Gwendolyn; Mortazawi, Amir; Najafi, Khalil; Peterson, Rebecca L.; Sheng, Li-Hong; Sylvester, Dennis; Thometz, Edward

    2010-04-01

    Bridges are an important societal resource used to carry vehicular traffic within a transportation network. As such, the economic impact of the failure of a bridge is high; the recent failure of the I-35W Bridge in Minnesota (2007) serves as a poignant example. Structural health monitoring (SHM) systems can be adopted to detect and quantify structural degradation and damage in an affordable and real-time manner. This paper presents a detailed overview of a multi-tiered architecture for the design of a low-power wireless monitoring system for large and complex infrastructure systems. The monitoring system architecture employs two wireless sensor nodes, each with unique functional features and varying power demand. At the lowest tier of the system architecture is the ultra-low-power Phoenix wireless sensor node, whose design has been optimized to draw minimal power during standby. These ultra-low-power nodes are configured to communicate their measurements to a more functionally rich wireless sensor node residing on the second tier of the monitoring system architecture. While the Narada wireless sensor node offers more memory, greater processing power and longer communication ranges, it also consumes more power during operation. Radio frequency (RF) and mechanical vibration power harvesting is integrated with the wireless sensor nodes to allow them to operate freely for long periods of time (e.g., years). Elements of the proposed two-tiered monitoring system architecture are validated on an operational long-span suspension bridge.
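The power-budget motivation for the two tiers can be made concrete with a simple duty-cycle calculation: a node's average draw is its active power weighted by the fraction of time it is awake, plus its standby power for the rest. All figures below are illustrative assumptions, not values from the paper:

```python
def avg_power_uw(active_uw: float, standby_uw: float, duty_cycle: float) -> float:
    """Average draw (microwatts) of a node active a fraction `duty_cycle` of the time."""
    return active_uw * duty_cycle + standby_uw * (1.0 - duty_cycle)

def lifetime_days(battery_uwh: float, avg_uw: float) -> float:
    """Runtime in days on a battery holding `battery_uwh` microwatt-hours."""
    return battery_uwh / avg_uw / 24.0

# Hypothetical figures: a low-tier node sleeps almost always at near-zero draw;
# a high-tier node is awake far more often and draws more when on.
low_tier  = avg_power_uw(active_uw=50_000.0,  standby_uw=1.0,   duty_cycle=0.001)
high_tier = avg_power_uw(active_uw=300_000.0, standby_uw=500.0, duty_cycle=0.05)

battery_uwh = 3_000_000.0  # roughly a 1000 mAh cell at 3 V
low_tier_days = lifetime_days(battery_uwh, low_tier)
high_tier_days = lifetime_days(battery_uwh, high_tier)
```

Under these assumed numbers the low-tier node lasts years on a small cell while the high-tier node drains it in days, which is why the architecture pairs the richer node with energy harvesting.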

  1. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications

    PubMed Central

    Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult to synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready to order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner. PMID:28531174

  2. The Saccharomyces Genome Database Variant Viewer.

    PubMed

    Sheppard, Travis K; Hitz, Benjamin C; Engel, Stacia R; Song, Giltae; Balakrishnan, Rama; Binkley, Gail; Costanzo, Maria C; Dalusag, Kyla S; Demeter, Janos; Hellerstedt, Sage T; Karra, Kalpana; Nash, Robert S; Paskov, Kelley M; Skrzypek, Marek S; Weng, Shuai; Wong, Edith D; Cherry, J Michael

    2016-01-04

    The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org) is the authoritative community resource for the Saccharomyces cerevisiae reference genome sequence and its annotation. In recent years, we have moved toward increased representation of sequence variation and allelic differences within S. cerevisiae. The publication of numerous additional genomes has motivated the creation of new tools for their annotation and analysis. Here we present the Variant Viewer: a dynamic open-source web application for the visualization of genomic and proteomic differences. Multiple sequence alignments have been constructed across high quality genome sequences from 11 different S. cerevisiae strains and stored in the SGD. The alignments and summaries are encoded in JSON and used to create a two-tiered dynamic view of the budding yeast pan-genome, available at http://www.yeastgenome.org/variant-viewer. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. The research and implementation of PDM systems based on the .NET platform

    NASA Astrophysics Data System (ADS)

    Gao, Hong-li; Jia, Ying-lian; Yang, Ji-long; Jiang, Wei

    2005-12-01

    A new PDM system scheme based on the .NET platform, designed to solve the application problems of current PDM systems used in enterprises, is described. The key technologies of this system, such as .NET, data access, information processing, and the Web, are discussed. The 3-tier architecture of a PDM system based on a mixed C/S and B/S mode is presented. In this system, all users share the same database server in order to ensure the coherence and safety of client data. ADO.NET leverages the power of XML to provide disconnected access to data, which frees the connection to be used by other clients. Using this approach, the system performance was improved. Moreover, important function modules of a PDM system, such as project management, product structure management and document management, were developed and realized.
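ADO.NET's DataSet is specific to .NET, but the disconnected-access pattern this record relies on is language-neutral: read a snapshot, release the connection immediately, work on the in-memory copy, and reconnect only briefly to write changes back. A minimal illustration of that pattern using Python's sqlite3 (the `parts` table and its columns are invented for the example):

```python
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "pdm.db")

# --- set up a stand-in for the shared database server ---
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE parts (id INTEGER PRIMARY KEY, name TEXT, revision INTEGER)")
con.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                [(1, "housing", 1), (2, "gearbox", 3)])
con.commit()
con.close()

# --- disconnected read: snapshot the rows, then release the connection ---
con = sqlite3.connect(db_path)
snapshot = {row[0]: {"name": row[1], "revision": row[2]}
            for row in con.execute("SELECT id, name, revision FROM parts")}
con.close()  # the connection is free for other clients while we work offline

# --- edit the in-memory copy, no connection held ---
snapshot[1]["revision"] += 1

# --- reconnect briefly to write the change back ---
con = sqlite3.connect(db_path)
con.execute("UPDATE parts SET revision = ? WHERE id = ?",
            (snapshot[1]["revision"], 1))
con.commit()
con.close()
```

Holding connections only for the instants of reading and writing is what lets many clients share one database server, the performance benefit the abstract attributes to ADO.NET's disconnected model.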

  4. 7 CFR 1469.20 - Application for contracts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 1469.7 for the eligible land uses on the entire operation or, if Tier I, for the portion being enrolled; (3) Any other requirements specified in the sign-up notice; (4) For Tier I, clear indication of which... to meet the relevant contract requirements outlined in the sign-up notice. (b) Producers who are...

  5. 7 CFR 1469.20 - Application for contracts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... § 1469.7 for the eligible land uses on the entire operation or, if Tier I, for the portion being enrolled; (3) Any other requirements specified in the sign-up notice; (4) For Tier I, clear indication of which... to meet the relevant contract requirements outlined in the sign-up notice. (b) Producers who are...

  6. 48 CFR 52.232-27 - Prompt payment for construction contracts.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... met, if applicable. However, when the due date falls on a Saturday, Sunday, or legal holiday, the... include such clauses in their subcontracts with each lower-tier subcontractor or supplier. (d) Subcontract... impair the right of the Contractor or a subcontractor at any tier to negotiate, and to include in their...

  7. MCSDSS: A Multi-Criteria Decision Support System for Merging Geoscience Information with Natural User Interfaces, Preference Ranking, and Interactive Data Utilities

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Gentle, J.

    2015-12-01

    The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single-page application (SPA) running atop a web- and cloud-based architecture built on open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads ranging from simple data display to complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure, while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices to develop, share, and document the codebase and application. Code is documented and shared via an online repository with the option for programmers to view, contribute to, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers, and the metadata about the application has been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and the inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources. 
In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance.

  8. ATM over hybrid fiber-coaxial cable networks: practical issues in deploying residential ATM services

    NASA Astrophysics Data System (ADS)

    Laubach, Mark

    1996-11-01

    Residential broadband access network technology based on asynchronous transfer mode (ATM) will soon reach commercial availability. The capabilities provided by an ATM access network promise integrated-services bandwidth in excess of that provided by traditional twisted-pair copper-wire public telephone networks. ATM to the side of the home places the needed quality-of-service capability closest to the subscriber, allowing immediate support for Internet services and traditional voice telephony. Other services, such as desktop video teleconferencing and enhanced server-based application support, can be added as part of the future evolution of the network. Additionally, advanced subscriber home networks can be supported easily. This paper presents an updated summary of the standardization efforts for the ATM over HFC definition work currently taking place in the ATM Forum's residential broadband working group and the standards progress in the IEEE 802.14 cable TV media access control and physical protocol working group. This update is fundamental for establishing the foundation for delivering ATM-based integrated services via a cable TV network. An economic model for deploying multi-tiered services is presented, showing that a single-tier service is insufficient for a viable cable operator business. Finally, an ATM-based system lends itself well to various deployment scenarios of synchronous optical networks (SONET).

  9. Multi-tier drugs assessment in a decentralised health care system. The Italian case-study.

    PubMed

    Jommi, Claudio; Costa, Enrico; Michelon, Alessandra; Pisacane, Maria; Scroccaro, Giovanna

    2013-10-01

    To investigate the organisation and decision-making processes of regional and local therapeutic committees in Italy, as a case study of decentralised health care systems. A structured questionnaire was designed, validated, and self-administered to respondents. Committee membership, prioritisation, assessment processes and criteria, and the transparency of committees were investigated. The respondents represent 100% of the 17 regional committees in the 21 regions (4 regions have no regional formulary), 88% of the 16 hospital networks and 42% of the 183 public hospitals. The assessment process appears fragmented and may take a long time: the inclusion of drugs in hospital formularies requires two steps in most regions (regional and local assessment). Most of the therapeutic committees are closed to the involvement of industry and patient associations. Prioritisation in the assessment is mostly driven by disease severity, clinical evidence, and the absence of therapeutic alternatives. Only 13 of the 17 regional committees have a public application form for the inclusion of drugs in the regional formulary. Regional and local committees (i) often re-assess clinical evidence already evaluated at the central level and (ii) mostly rely on comparative drug unit prices per DDD and drug budget impact. The level of transparency is quite low. The Italian case study provides useful insights into the appropriate management of multi-tier drug assessment, which is particularly complex in decentralised health care systems but also exists in centralised systems where drugs are assessed by local therapeutic committees. A clear definition of regulatory competences at the different levels, greater collaboration between central, regional and local actors, and increased transparency are necessary to pursue consistency between central policies on price and reimbursement and budget accountability at the regional and local levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
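The price-per-DDD comparison the committees rely on is straightforward arithmetic: the pack price divided by the number of WHO defined daily doses (DDDs) the pack contains. A minimal sketch (the drug names, prices, strengths and DDD value below are invented for illustration):

```python
def cost_per_ddd(pack_price: float, strength_mg: float,
                 units_per_pack: int, ddd_mg: float) -> float:
    """Price per WHO defined daily dose (DDD) for one pack."""
    ddds_in_pack = strength_mg * units_per_pack / ddd_mg
    return pack_price / ddds_in_pack

# Hypothetical comparators for the same indication (DDD assumed 1000 mg)
drugs = {
    "drug_a": cost_per_ddd(pack_price=30.0, strength_mg=500, units_per_pack=28, ddd_mg=1000),
    "drug_b": cost_per_ddd(pack_price=45.0, strength_mg=250, units_per_pack=56, ddd_mg=1000),
}
cheapest = min(drugs, key=drugs.get)
```

Both packs above contain 14 DDDs, so the comparison reduces to pack price alone; in general the DDD normalisation is what makes packs of different strengths and sizes comparable.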

  10. The Need for a Harmonized Repository for Next-Generation Human Activity Data

    EPA Science Inventory

    Multi-tiered human time-activity-location data can inform many efforts to describe human exposures to air pollutants and other chemicals on a range of temporal and spatial scales. In the last decade, EPA's Consolidated Human Activity Database (CHAD) has served as a harmonized rep...

  11. New Hampshire's Multi-Tiered Approach to Dropout Prevention. Snapshot: New Hampshire

    ERIC Educational Resources Information Center

    National High School Center, 2007

    2007-01-01

    Many states and districts across the country struggle with designing and implementing coherent dropout prevention initiatives that promote academic advancement, especially for special needs students, who drop out at much higher rates than the general student population. This "snapshot" describes New Hampshire's innovative use of data…

  12. Sustaining SWPBIS for Inclusive Behavior Instruction. Research to Practice Brief

    ERIC Educational Resources Information Center

    National Center on Schoolwide Inclusive School Reform: The SWIFT Center, 2017

    2017-01-01

    Inclusive Behavior Instruction features universal or schoolwide positive behavior interventions and supports in a multi-tiered system of support. Many schools use School-Wide Positive Behavior Intervention and Supports (SWPBIS; Horner et al., 2009) for this purpose. Andreau, McIntosh, Ross, and Kahn describe thirteen characteristics of…

  13. Negotiating Peer Mentoring Roles in Undergraduate Research Lab Settings

    ERIC Educational Resources Information Center

    Packard, Becky W.; Marciano, Vincenza N.; Payne, Jessica M.; Bledzki, Leszek A.; Woodard, Craig T.

    2014-01-01

    Undergraduate research is viewed as an important catalyst for educational engagement and persistence, with an emphasis on the faculty mentoring relationship. Despite the common practice of having multi-tiered lab teams composed of newer undergraduates and more seasoned undergraduates serving as peer mentors, less is understood about the experience…

  14. Early Intervention for Reading Difficulties: The Interactive Strategies Approach

    ERIC Educational Resources Information Center

    Scanlon, Donna M.; Anderson, Kimberly L.; Sweeney, Joan M.

    2010-01-01

    This book presents a research-supported framework for early literacy instruction that aligns with multi-tiered response-to-intervention (RTI) models. The book focuses on giving teachers a better understanding of literacy development and how to effectively support children as they begin to read and write. The authors' interactive strategies…

  15. Using Qualitative Methods to Create a Home Health Web Application User Interface for Patients with Low Computer Proficiency.

    PubMed

    Baier, Rosa R; Cooper, Emily; Wysocki, Andrea; Gravenstein, Stefan; Clark, Melissa

    2015-01-01

    Despite the investment in public reporting for a number of healthcare settings, evidence indicates that consumers do not routinely use available data to select providers. This suggests that existing reports do not adequately incorporate recommendations for consumer-facing reports or web applications. Healthcentric Advisors and Brown University undertook a multi-phased approach to create a consumer-facing home health web application in Rhode Island. This included reviewing the evidence base to identify design recommendations and then creating a paper prototype and wireframe. We performed qualitative research to iteratively test our proposed user interface with two user groups, home health consumers and hospital case managers, refining our design to create the final web application. To test our prototype, we conducted two focus groups, with a total of 13 consumers, and 28 case manager interviews. Both user groups responded favorably to the prototype, with the majority commenting that they felt this type of tool would be useful. Case managers suggested revisions to ensure the application conformed to laws requiring Medicare patients to have the freedom to choose among providers and could be incorporated into hospital workflow. After incorporating changes and creating the wireframe, we conducted usability testing interviews with 14 home health consumers and six hospital case managers. We found that consumers needed prompting to navigate through the wireframe; they demonstrated confusion through both their words and body language. As a result, we modified the web application's sequence, navigation, and function to provide additional instructions and prompts. Although we designed our web application for low literacy and low health literacy, using recommendations from the evidence base, we overestimated the extent to which older adults were familiar with using computers.
Some of our key learnings and recommendations run counter to general web design principles, leading us to believe that such guidelines need to be adapted for this user group. As web applications proliferate, it is important to ensure that those who are most vulnerable, with the least knowledge and the lowest literacy, health literacy, and computer proficiency, can access, understand, and use them. In order for the investment in public reporting to produce value, consumer-facing web applications need to be designed to address end users' unique strengths and limitations. Our findings may help others to build consumer-facing tools or technology targeted to a predominantly older population. We encourage others designing consumer-facing web technologies to critically evaluate their assumptions about user interface design, particularly if they are designing tools for older adults, and to test products with their end users.

  16. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on merging multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface that defines genes according to their similarity across the molecular changes driving a disease phenotype. The tool was developed to make a previously published method, Gene Integrated Set Profile Analysis (GISPA), accessible to researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  17. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on merging multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface that defines genes according to their similarity across the molecular changes driving a disease phenotype. The tool was developed to make a previously published method, Gene Integrated Set Profile Analysis (GISPA), accessible to researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  18. Monitoring of IaaS and scientific applications on the Cloud using the Elasticsearch ecosystem

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.

    2015-05-01

    The private Cloud at the Torino INFN computing centre offers IaaS services to different scientific computing applications. The infrastructure is managed with the OpenNebula cloud controller. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at the LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BES-III collaboration, plus an increasing number of other small tenants. Besides keeping track of usage, automating the dynamic allocation of resources to tenants requires detailed monitoring and accounting of resource usage. As a first step towards this, we set up a monitoring system to inspect the site's activities both in terms of IaaS and of the applications running on the hosted virtual instances. For this purpose we used the Elasticsearch, Logstash and Kibana stack. In the current implementation, the heterogeneous accounting information is fed to different MySQL databases and sent to Elasticsearch via a custom Logstash plugin. For IaaS metering, we developed sensors for the OpenNebula API. The IaaS-level information gathered through the API is sent to the MySQL database through a purpose-built RESTful web service, which is also used for other accounting purposes. At the application level, we used the ROOT plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BES-III virtual instances used to be monitored with Zabbix, so as a proof of concept we also retrieve the information contained in the Zabbix database. Each of these three cases is indexed separately in Elasticsearch. We are now considering removing the intermediate SQL database layer and evaluating a NoSQL option as a single central database for all the monitoring information. We set up a collection of Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case.
In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools.
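
    The record above describes feeding heterogeneous accounting records into Elasticsearch, with each source indexed separately. As a minimal sketch of what such a feeder hands to Elasticsearch's bulk API (newline-delimited JSON), assuming illustrative index names and document fields rather than the Torino site's actual schema:

```python
import json

def build_bulk_payload(records):
    """Build an Elasticsearch _bulk NDJSON payload.

    Each record is an (index_name, document) pair; the bulk API expects
    an action line followed by the document source line, and the whole
    payload must end with a newline.
    """
    lines = []
    for index_name, doc in records:
        lines.append(json.dumps({"index": {"_index": index_name}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# Illustrative records for the three sources named in the abstract:
# IaaS metering, the interactive analysis facility, and Zabbix.
records = [
    ("iaas-accounting", {"vm_id": "one-42", "cpu_hours": 3.5, "tenant": "alice-t2"}),
    ("proof-accounting", {"user": "analyst1", "query_seconds": 12.7}),
    ("zabbix-accounting", {"host": "bes3-vm01", "load": 0.42}),
]
payload = build_bulk_payload(records)
```

    Posting this payload to the `_bulk` endpoint would index all three documents in one request, which is why bulk feeders like Logstash batch their output this way.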

  19. XML — an opportunity for data standards in the geosciences

    NASA Astrophysics Data System (ADS)

    Houlding, Simon W.

    2001-08-01

    Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for development of metadata (markup) standards for information transfer in specific fields. XML allows development of markup languages that describe what information is rather than how it should be presented. This allows computer applications to process the information in intelligent ways. In contrast, hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with the presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multi-media and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities and uses this as a platform for discussion of its potential for development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.
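
    The point that XML markup says what the data is, so applications can process it intelligently, can be illustrated with a hypothetical borehole-log document parsed using Python's standard library; the element and attribute names below are invented for illustration and are not the paper's actual markup or schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical borehole-log markup (illustrative names only).
BOREHOLE_XML = """<?xml version="1.0"?>
<boreholeLog id="BH-01">
  <collar lat="51.5" lon="-0.12" elevation="34.2"/>
  <interval from="0.0" to="3.2" lithology="clay"/>
  <interval from="3.2" to="7.8" lithology="sand"/>
  <interval from="7.8" to="12.1" lithology="gravel"/>
</boreholeLog>
"""

root = ET.fromstring(BOREHOLE_XML)
# Because the markup describes meaning, not presentation, a consumer can
# compute derived quantities (here, interval thickness per lithology)
# without any knowledge of how the log is displayed.
thicknesses = {
    iv.get("lithology"): float(iv.get("to")) - float(iv.get("from"))
    for iv in root.findall("interval")
}
```

    An HTML rendering of the same log (a table styled for display) would carry none of this machine-usable structure, which is the contrast the paper draws.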

  20. 40 CFR 1033.655 - Special provisions for certain Tier 0/Tier 1 locomotives.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... locomotives covered by this section. (c) You may ask us to allow these locomotives to exceed otherwise applicable line-haul cycle NOX standard for high ambient temperatures and/or altitude because of limitations... locomotives, you may ask for relief for ambient temperatures above 23 °C and/or barometric pressure below 97.5...

  1. 40 CFR 1033.655 - Special provisions for certain Tier 0/Tier 1 locomotives.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... locomotives covered by this section. (c) You may ask us to allow these locomotives to exceed otherwise applicable line-haul cycle NOX standard for high ambient temperatures and/or altitude because of limitations... locomotives, you may ask for relief for ambient temperatures above 23 °C and/or barometric pressure below 97.5...

  2. 40 CFR 1033.655 - Special provisions for certain Tier 0/Tier 1 locomotives.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... locomotives covered by this section. (c) You may ask us to allow these locomotives to exceed otherwise applicable line-haul cycle NOX standard for high ambient temperatures and/or altitude because of limitations... locomotives, you may ask for relief for ambient temperatures above 23 °C and/or barometric pressure below 97.5...

  3. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    ERIC Educational Resources Information Center

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  4. 12 CFR 607.3 - Assessment of banks, associations, and designated other System entities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... fall within the graduated risk-adjusted asset tiers contained in the following table. An institution's... falling into each applicable tier, subject to adjustment for its FIRS rating as required in paragraphs (b... percentage of X1 in the following table) will be applied to each dollar value of risk-adjusted assets falling...
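
    The excerpt describes applying a graduated rate to the dollars of risk-adjusted assets falling into each tier, in the manner of marginal tax brackets. A generic marginal-tier computation can be sketched as follows; since the actual table is elided in the excerpt, the tier boundaries and rates below are invented for illustration and are not the 12 CFR 607.3 values.

```python
def graduated_assessment(assets, tiers):
    """Apply a marginal rate to the portion of assets falling in each tier.

    `tiers` is a list of (upper_bound, rate) pairs in ascending order;
    the final bound may be float('inf') for an open-ended top tier.
    """
    total, lower = 0.0, 0.0
    for upper, rate in tiers:
        if assets <= lower:
            break
        # Only the dollars falling within this tier get this tier's rate.
        total += (min(assets, upper) - lower) * rate
        lower = upper
    return total

# Illustrative tiers only -- not the actual regulatory table.
TIERS = [(500e6, 0.0010), (1e9, 0.0008), (float("inf"), 0.0005)]
assessment = graduated_assessment(750e6, TIERS)
```

    With these made-up tiers, an institution holding $750M of risk-adjusted assets pays 0.10% on the first $500M and 0.08% on the remaining $250M, rather than a single flat rate on the whole amount.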

  5. The Assessment of Public Issue Perception: Exploration of a Three-Tiered, Social-Network Based Methodology in the Champlain Basin.

    ERIC Educational Resources Information Center

    Gore, Peter H.; And Others

    Design, application, and interpretation of a three-tiered sampling framework as a strategy for eliciting public participation in planning and program implementation is presented, with emphasis on implications for federal programs which mandate citizen participation (for example, Level B planning of Water Resources Planning Act, Federal Water…

  6. Three-tier multi-granularity switching system based on PCE

    NASA Astrophysics Data System (ADS)

    Wang, Yubao; Sun, Hao; Liu, Yanfei

    2017-10-01

    With the growing demand for business communications, optical path switching based on electrical signal processing can no longer meet demand, and multi-granularity switching systems that improve node routing and switching capabilities have emerged. In a traditional network, each node is responsible for calculating paths and synchronizing the whole network state, which increases the burden on the network; the concept of the path computation element (PCE) was therefore proposed. The PCE is responsible for routing and allocating resources in the network. In a traditional waveband-switched optical network, the wavelength is used as the basic routing unit, resulting in relatively low wavelength utilization. Due to the wavelength-continuity constraint, the routing design of waveband technology becomes complicated, which directly affects the utilization of the system. In this paper, optical-code granularity is adopted: optical codes carry no continuity constraint, and the number of optical codes is more flexible than the number of wavelengths. For the introduction of optical code switching, we propose a Code Group Routing Entity (CGRE) algorithm. In short, the combination of a three-tier multi-granularity optical switching system and the PCE can simplify the network structure, reduce node load, and enhance network scalability and survivability, realizing intelligent optical networking.

  7. Design and Implementation of a Three-Tiered Web-Based Inventory Ordering and Tracking System Prototype Using CORBA and Java

    DTIC Science & Technology

    2000-03-01

    languages yet still be able to access the legacy relational databases that businesses have huge investments in. JDBC is a low-level API designed for...consider the return on investment. The system requirements, discussed in Chapter II, are the main source of input to developing the relational...1996. Inprise, Gatekeeper Guide, Inprise Corporation, 1999. Kroenke, D., Database Processing: Fundamentals, Design, and Implementation, Sixth Edition

  8. Web vulnerability study of online pharmacy sites.

    PubMed

    Kuzma, Joanne

    2011-01-01

    Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers' personal data. There is a gap in the research addressing security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, expanding on prior studies. Technical, managerial and legal recommendations on how to mitigate security issues are presented. The proposed four-step method consists of choosing an online testing tool, choosing a list of 60 online pharmacy sites to test, running the software analysis to compile a list of flaws, and performing an in-depth analysis of the types of web application vulnerabilities found. Most sites had serious vulnerabilities, the most common flaws being cross-site scripting and outdated software versions that had not been patched. A method is proposed for securing web pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.

  9. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of an equivalence acceptance criterion and a quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of the test product must fall within the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on the Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed.
A biosimilar is a generic version of the original biological drug product. A key component of a biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on application of statistical methods to establish a similarity margin and appropriate test for equivalence between the two products. This paper discusses statistical issues with demonstration of analytical similarity and provides alternate approaches to potentially mitigate these problems. © PDA, Inc. 2016.
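
    The paper's central claim, that S_R computed from correlated reference lots is biased below σ_R, can be checked with a small simulation. The AR(1) correlation structure, ρ value, and lot count below are illustrative assumptions for the sketch, not the paper's settings.

```python
import random
import statistics

def simulate_mean_sample_sd(n_lots=10, rho=0.8, sigma=1.0, reps=2000, seed=7):
    """Average sample SD of AR(1)-correlated lots whose true SD is sigma."""
    rng = random.Random(seed)
    innov = (1 - rho**2) ** 0.5  # keeps the stationary variance at sigma^2
    total = 0.0
    for _ in range(reps):
        x = rng.gauss(0, sigma)           # stationary start
        lots = [x]
        for _ in range(n_lots - 1):
            x = rho * x + innov * rng.gauss(0, sigma)
            lots.append(x)
        total += statistics.stdev(lots)   # S_R for one simulated study
    return total / reps

sigma_R = 1.0
avg_S_R = simulate_mean_sample_sd(sigma=sigma_R)
# With positive lot-to-lot correlation, avg_S_R comes out well below sigma_R,
# so a Tier 1 margin of 1.5*S_R and a Tier 2 range X_bar_R +/- K*S_R both
# shrink, which is the loss of power the paper describes.
```

    With independent lots (ρ = 0) the same simulation recovers a mean S_R near σ_R, which is why the bias only surfaces when correlation among reference lots is ignored.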

  10. Web-based monitoring and management system for integrated enterprise-wide imaging networks

    NASA Astrophysics Data System (ADS)

    Ma, Keith; Slik, David; Lam, Alvin; Ng, Won

    2003-05-01

    Mass proliferation of IP networks and the maturity of standards have enabled the creation of sophisticated image distribution networks that operate over Intranets, Extranets, Communities of Interest (CoI) and even the public Internet. Unified monitoring, provisioning and management of such systems at the application and protocol levels represent a challenge. This paper presents a web-based monitoring and management tool that employs established telecom standards to create an open system enabling proactive management, provisioning and monitoring of image management systems at the enterprise level and across multi-site, geographically distributed deployments. Utilizing established standards including ITU-T M.3100, and web technologies such as XML/XSLT, JSP/JSTL, and J2SE, the system allows for seamless device and protocol adaptation between multiple disparate devices. The goal has been to develop a unified interface that provides network topology views, multi-level customizable alerts, real-time fault detection, and real-time and historical reporting of all monitored resources, including network connectivity, system load, DICOM transactions and storage capacities.

  11. Integration of a Zero-footprint Cloud-based Picture Archiving and Communication System with Customizable Forms for Radiology Research and Education.

    PubMed

    Hostetter, Jason; Khanna, Nishanth; Mandell, Jacob C

    2018-06-01

    The purpose of this study was to integrate web-based forms with a zero-footprint cloud-based Picture Archiving and Communication System (PACS) to create a tool of potential benefit to radiology research and education. Web-based forms were created with a front-end and back-end architecture utilizing common programming languages and tools, including Vue.js, Node.js and MongoDB, and were integrated into an existing zero-footprint cloud-based PACS. The web-based forms application can be accessed in any modern internet browser on desktop or mobile devices and allows the creation of customizable forms consisting of a variety of question types. Each form can be linked to an individual DICOM examination or a collection of DICOM examinations. Several uses are demonstrated through a series of case studies, including implementation of a research platform for multi-reader multi-case (MRMC) studies and other imaging research, and creation of an online Objective Structured Clinical Examination (OSCE) and an educational case file. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
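
    The record describes linking each form to one DICOM examination or a collection of them. A minimal data-model sketch of that linkage, assuming forms are keyed by DICOM Study Instance UIDs; the field names and UIDs below are illustrative, not the application's actual MongoDB schema.

```python
# A form definition linked to DICOM examinations by Study Instance UID.
# All field names and UID values here are illustrative assumptions.
form = {
    "title": "MRMC reader study, case 12",
    "questions": [
        {"type": "choice", "text": "Lesion present?", "options": ["yes", "no"]},
        {"type": "scale", "text": "Confidence (1-5)", "min": 1, "max": 5},
    ],
    "study_uids": ["1.2.840.99999.1.12", "1.2.840.99999.1.13"],
}

def forms_for_study(forms, study_uid):
    """Return the forms attached to a given DICOM examination."""
    return [f for f in forms if study_uid in f["study_uids"]]

matches = forms_for_study([form], "1.2.840.99999.1.12")
```

    Keeping the link as a list of study UIDs lets one form cover a whole case collection (the MRMC use) while a single-element list covers the one-examination case.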

  12. Impact of multi-tiered pharmacy benefits on attitudes of plan members with chronic disease states.

    PubMed

    Nair, Kavita V; Ganther, Julie M; Valuck, Robert J; McCollum, Marianne M; Lewis, Sonya J

    2002-01-01

    To evaluate the effects of 2- and 3-tiered pharmacy benefit plans on member attitudes regarding their pharmacy benefits. We performed a mail survey and cross-sectional comparison of the outcome variables in a large managed care population in the western United States. Participants were persons with chronic disease states who were in 2- or 3-tier copay drug plans. A random sample of 10,662 was selected from a total of 25,008 members who had received 2 or more prescriptions for a drug commonly used to treat one of 5 conditions: hypertension, diabetes, dyslipidemia, gastroesophageal reflux disease (GERD), or arthritis. Statistical analysis included bivariate comparisons and regression analysis of the factors affecting member attitudes, including satisfaction, loyalty, health plan choices, and willingness to pay a higher out-of-pocket cost for medications. A response rate of 35.8% was obtained from continuously enrolled plan members. Respondents were older, sicker, and consumed more prescriptions than nonrespondents. There were significant differences in age and health plan characteristics between 2- and 3-tier plan members: respondents aged 65 or older represented 11.7% of 2-tier plan members and 54.7% of 3-tier plan members, and 10.0% of 2-tier plan members were in Medicare+Choice plans versus 61.4% in Medicare+Choice plans for 3-tier plan members (P<0.05). Controlling for demographic characteristics, number of comorbidities, and the cost of health care, 2-tier plan members were more satisfied with their plan, more likely to recommend their plan to others, and less likely to switch their current plans to obtain better prescription drug coverage than 3-tier plan members. While members were willing to purchase higher cost nonformulary and brand-name medications, in general, they were not willing to pay more than 10 dollars (in addition to their copayment amount) for these medications. 
Older respondents and sicker individuals (those with higher scores on the Chronic Disease Indicator) appeared to have more positive attitudes toward their pharmacy benefit plans in general. Higher reported incomes by respondents were also associated with greater satisfaction with prescription drug coverage and increased loyalty toward the pharmacy benefit plan. Conversely, the more individuals spent for either their health care or prescription medications, the less satisfied they were with their prescription drug coverage and less loyalty they appeared to have for their health plans. An inverse relationship also appeared to exist between the out-of-pocket costs for prescription medications and members' willingness to pay for nonformulary medications. Three-tier members had lower reported satisfaction with their plans compared to members in 2-tier plans. The financial resources available to members (which may be a function of being older and having more education and higher incomes), the number of chronic disease states that members have, and other factors may influence their attitudes toward their prescription drug coverage.

  13. Research on networked manufacturing system for reciprocating pump industry

    NASA Astrophysics Data System (ADS)

    Wu, Yangdong; Qi, Guoning; Xie, Qingsheng; Lu, Yujun

    2005-12-01

    Networked manufacturing is a trend in the reciprocating pump industry. Based on the enterprises' requirements, an architecture for a networked manufacturing system for the reciprocating pump industry was proposed, composed of an infrastructure layer, a system management layer, an application service layer and a user layer. Its main functions include product data management, ASP services, business management, and customer relationship management, and its physical framework is a multi-tier internet-based model; the concept of ASP service integration is put forward and its process model is also established. As a result, a networked manufacturing system tailored to the characteristics of the reciprocating pump industry was built. By implementing this system, the reciprocating pump industry can obtain a new way to fully utilize its own resources and enhance its capability to respond quickly to the global market.

  14. Global Benchmarking of Marketing Doctoral Program Faculty and Institutions by Subarea

    ERIC Educational Resources Information Center

    Elbeck, Matt; Vander Schee, Brian A.

    2014-01-01

    This study benchmarks marketing doctoral programs worldwide in five popular subareas by faculty and institutional scholarly impact. A multi-item approach identifies a collection of top-tier scholarly journals for each subarea, while citation data over the decade 2003 to 2012 identify high scholarly impact marketing faculty by subarea used to…

  15. Integrating Wraparound into a Schoolwide System of Positive Behavior Supports

    ERIC Educational Resources Information Center

    Eber, Lucille; Hyde, Kelly; Suter, Jesse C.

    2011-01-01

    We describe the structure for implementation of the wraparound process within a multi-tiered system of school wide positive behavior support (SWPBS) to address the needs of the 1-5% of students with complex emotional/behavioral challenges. The installation of prerequisite system features that, based on a 3 year demonstration process, we consider…

  16. An Initial Study of the Diagnostic Utility of the Emotional and Behavioural Screener in Lithuania

    ERIC Educational Resources Information Center

    Sointu, Erkko; Lambert, Matthew C.; Nordness, Philip D.; Geležiniene, Renata; Epstein, Michael H.

    2018-01-01

    In schools, screening is an effective method to identify students at-risk for emotional and behavioural disorders. Several intervention programmes such as Positive Behaviour Interventions and Supports, Response to Intervention, and Multi-tiered Systems of Supports call for the use of psychometrically sound screening instruments. This study…

  17. Effectiveness of a Class-Wide Peer-Mediated Elementary Math Differentiation Strategy

    ERIC Educational Resources Information Center

    Lloyd, Jason D.

    2017-01-01

    Approximately 60% of classroom students have insufficient math skills. Within a Multi-Tiered Systems of Support (MTSS) framework, teachers can implement core differentiation strategies targeted at improving math skills of an entire class of students. Differentiation programs are developed in order to target academic skills of groups of students…

  18. Quality and Utility of the Multi-Tiered Instruction Self-Efficacy Scale

    ERIC Educational Resources Information Center

    Barnes, Susan K.; Burchard, Melinda S.

    2011-01-01

    Response to Intervention (RTI) is an educational approach that integrates ongoing assessment of individual student progress with targeted instruction. Administrators and teachers in P-12 schools expressed a need for colleagues in higher education to provide training to general education pre-service and in-service teachers in selecting appropriate…

  19. "Choose, Explore, Analyze": A Multi-Tiered Approach to Social Media in the Classroom

    ERIC Educational Resources Information Center

    Rosatelli, Meghan

    2015-01-01

    In this essay, social media are presented as complex tools that require student involvement from potential classroom implementation to the post-mortem. The "choose, explore, analyze" approach narrows social media options for the classroom based on student feedback and allows students and teachers to work together to understand why and…

  20. DEVELOPMENT OF A MULTI-TIERED INSECT RESISTANCE MANAGEMENT PROGRAM FOR GENETICALLY MODIFIED CORN HYBRIDS EXPRESSING THE PLANT INCORPORATED PROTECTANT, BACILLUS THURINGIENSIS

    EPA Science Inventory

    A significant increase in genetically modified corn planting driven by biofuel demand is expected for the 2007 growing season with future planted acreages approaching 80% of total corn plantings anticipated by 2009. As demand increases, incidence of farmer non-compliance with ma...

  1. An AlgU-regulated antisense transcript encoded within the Pseudomonas syringae fleQ gene has a positive effect on motility

    USDA-ARS's Scientific Manuscript database

    Bacterial flagella production is controlled by a multi-tiered regulatory system that coordinates expression of 40-50 subunits and correct assembly of these complicated structures. Flagellar expression is environmentally controlled, presumably to optimize the benefits and liabilities of flagellar ex...

  2. Relationship between School-Wide Positive Behavior Interventions and Supports and Academic, Attendance, and Behavior Outcomes in High Schools

    ERIC Educational Resources Information Center

    Freeman, Jennifer; Simonsen, Brandi; McCoach, D. Betsy; Sugai, George; Lombardi, Allison; Horner, Robert

    2016-01-01

    Attendance, behavior, and academic outcomes are important indicators of school effectiveness and long-term student outcomes. "Multi-tiered systems of support" (MTSS), such as "School-Wide Positive Behavior Interventions and Supports" (SWPBIS), have emerged as potentially effective frameworks for addressing student needs and…

  3. Examining Perceptions of Culturally Responsive Pedagogy in Teacher Preparation and Teacher Leadership Candidates

    ERIC Educational Resources Information Center

    Samuels, Amy J.; Samuels, Gregory L.; Cook, Tammy M.

    2017-01-01

    The study examined a multi-tiered approach for facilitating learning and examining perceptions about culturally responsive pedagogy in teacher preparation and teacher leadership programs. The study aligned with a learning unit we designed to (1) increase understanding of culturally responsive pedagogy and (2) investigate perceptions of cultural…

  4. Implementing Intensive Intervention: How Do We Get There from Here?

    ERIC Educational Resources Information Center

    Zumeta, Rebecca O.

    2015-01-01

    Despite years of school reform intended to help students reach high academic standards, students with disabilities continue to struggle, suggesting a need for more intensive intervention as a part of special education and multi-tiered systems of support. At the same time, greater inclusion of students with disabilities in large-scale assessment,…

  5. Managing Student Behavior with Class-Wide Function-Related Intervention Teams: An Observational Study in Early Elementary Classrooms

    ERIC Educational Resources Information Center

    Caldarella, Paul; Williams, Leslie; Hansen, Blake D.; Wills, Howard

    2015-01-01

    Comprehensive evidence-based interventions are needed to help early childhood educators manage challenging student behaviors. One such intervention, class-wide function-related intervention teams (CW-FIT), is a multi-tiered behavioral intervention program based on positive behavior support principles, including four main elements: (a) teaching…

  6. Efficacy of Rich Vocabulary Instruction in Fourth- and Fifth-Grade Classrooms

    ERIC Educational Resources Information Center

    Vadasy, Patricia F.; Sanders, Elizabeth A.; Logan Herrera, Becky

    2015-01-01

    A multi-cohort cluster randomized trial was conducted to estimate effects of rich vocabulary classroom instruction on vocabulary and reading comprehension. A total of 1,232 fourth- and fifth-grade students from 61 classrooms in 24 schools completed the study. Students received instruction in 140 Tier Two vocabulary words featured in two…

  7. Multipurpose floating platform for hyperspectral imaging, sampling and sensing of surface water sources used in irrigation and recreation

    USDA-ARS's Scientific Manuscript database

    The objective of this work was to design, construct, and test the self-propelled aquatic platform for imaging, multi-tier water sampling, water quality sensing, and depth profiling to document microbial content and environmental covariates in the interior of irrigation ponds and reservoirs. The plat...

  8. Using Research-Based Instruction to Improve Math Outcomes with Underprepared Students

    ERIC Educational Resources Information Center

    Pearce, Lee R.; Pearce, Kristi L.; Siewert, Daluss J.

    2017-01-01

    The authors used a mixed-methods research design to evaluate a multi-tiered system of supports model to address the disturbing failure rates of underprepared college students placed in developmental mathematics at a small state university. While qualitative data gathered from using Participatory Action Research methods directed the two-year…

  9. The School Implementation Scale: Measuring Implementation in Response to Intervention Models

    ERIC Educational Resources Information Center

    Erickson, Amy Gaumer; Noonan, Pattie M.; Jenson, Ronda

    2012-01-01

    Models of response to intervention (RTI) have been widely developed and implemented and have expanded to include integrated academic/behavior RTI models. Until recently, evaluation of model effectiveness has focused primarily on student-level data, but additional measures of treatment integrity within these multi-tiered models are emerging to…

  10. Understanding Mental Health Intervention and Assessment within a Multi-Tiered Framework: Contemporary Science, Practice, and Policy

    ERIC Educational Resources Information Center

    Kilgus, Stephen P.; Reinke, Wendy M.; Jimerson, Shane R.

    2015-01-01

    This special topic section features research regarding practices that will support mental health service delivery within a school-based multitiered framework. The articles include data and discussions regarding the evaluation of universal, targeted, or intensive intervention addressing mental health concerns and assessment tools intended for use…

  11. Integration of Academic and Behavioral MTSS at the District Level Using Implementation Science

    ERIC Educational Resources Information Center

    Freeman, Rachel; Miller, Dawn; Newcomer, Lori

    2015-01-01

    The evolution of multi-tier systems of support (MTSS) for both academics and behavior has reflected the diverse interests of those leading implementation efforts, the influence of various state and local regulatory requirements, and differing funding methods and priorities. These variations have naturally led to many different pathways for…

  12. Human service delivery in a multi-tier system: the subtleties of collaboration among partners.

    PubMed

    Mayhew, Fred

    2012-01-01

    This article examines the nature of interorganizational relationships that are formed within a multi-tier human service delivery system. Taking into account the hierarchical structure of a statewide initiative to support early childhood education, the study investigates the differences in the relationships between organizations at the service and administrative levels of the system. Forty-nine administrative level and 146 service delivery level relationships are evaluated. Findings indicate that organizations involved in direct service delivery form more collaborative relationships. Thus, when government provides funding for human services, policymakers must seek to balance public accountability with the advantages believed to be inherent in devolved service delivery. Furthermore, practitioners who appreciate the importance and nuances of interorganizational relationships will be in a position to better manage their organizations in an environment of increased collaborative activity and joint delivery of services. Going forward, human service systems will continue to involve organizations from the public, nonprofit, and private sector. A better understanding of how these organizations work together is crucial to the effective delivery of these essential services.

  13. Tools and Strategies for Product Life Cycle Management - A Case Study in Foundry

    NASA Astrophysics Data System (ADS)

    Patil, Rajashekar; Kumar, S. Mohan; Abhilash, E.

    2012-08-01

    Advances in information and communication technology (ICT) have opened new possibilities for collaboration among customers, suppliers, manufacturers and partners to effectively tackle various business challenges. Product Life Cycle Management (PLM) has been a proven approach for Original Equipment Manufacturers (OEMs) to increase productivity, improve product quality, speed up delivery, increase profit and become more efficient. However, their Tier 2 and Tier 3 suppliers, such as foundry industries, are still in their infancy and have yet to adopt PLM. Hence, to enhance their understanding, the basic concepts, tools and strategies for PLM are presented in this paper. By selecting and implementing appropriate PLM strategies in a small foundry, an attempt was also made to understand the immediate benefits of using PLM tools (commercial PLM software and digital manufacturing tools). This study indicated a reduction in lead time and improved utilization of organizational resources in the production of an automobile impeller. These observations may be further extrapolated to other multi-product, multi-discipline and multi-customer companies to realize the advantages of using PLM technology.

  14. A new three-tier architecture design for multi-sphere neutron spectrometer with the FLUKA code

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Yang, Jian-Bo; Tuo, Xian-Guo; Liu, Zhi; Wang, Qi-Biao; Wang, Xu

    2016-07-01

    The current commercially available Bonner sphere neutron spectrometer (BSS) has high sensitivity to neutrons below 20 MeV, which leaves it poorly suited to measuring neutrons ranging from a few MeV to 100 MeV. In this paper, moderator layers and an auxiliary material layer were added around 3He proportional counters and modelled with the FLUKA code, with a view to improving this response. The results showed that the response peaks for neutrons below 20 MeV gradually shift to a higher energy region and decrease slightly with increasing moderator thickness. In contrast, the response to neutrons above 20 MeV remained very low until auxiliary materials such as copper (Cu), lead (Pb) or tungsten (W) were embedded in the moderator layers. Pb was chosen as the most suitable auxiliary material for the design of a three-tier multi-sphere neutron spectrometer (NBSS). Calculation and comparison showed that the NBSS is advantageous in terms of response for 5-100 MeV, the highest response being 35.2 times that of a polyethylene (PE) ball of the same PE thickness.

  15. Using container orchestration to improve service management at the RAL Tier-1

    NASA Astrophysics Data System (ADS)

    Lahiff, Andrew; Collier, Ian

    2017-10-01

    In recent years container orchestration has emerged as a means of gaining many potential benefits over a traditional static infrastructure, such as increased utilisation through multi-tenancy, improved availability due to self-healing, and the ability to handle changing loads due to elasticity and auto-scaling. To this end we have been investigating migrating services at the RAL Tier-1 to an Apache Mesos cluster. In this model the concept of individual machines is abstracted away and services run in containers on a cluster of machines, managed by schedulers, enabling a high degree of automation. Here we describe Mesos, the infrastructure deployed at RAL, and, in detail, the example of running a batch farm on Mesos.
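
    The scheduling model described above, in which individual machines are abstracted away and a scheduler places containers wherever capacity exists, can be illustrated with a toy first-fit placer. This is a minimal sketch with hypothetical node names and resource sizes, not the Mesos scheduler itself:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cpus: float
    mem: int  # MB
    tasks: list = field(default_factory=list)

def place(nodes, task_name, cpus, mem):
    """First-fit placement: the caller never chooses a machine."""
    for node in nodes:
        used_cpus = sum(c for _, c, _ in node.tasks)
        used_mem = sum(m for _, _, m in node.tasks)
        if node.cpus - used_cpus >= cpus and node.mem - used_mem >= mem:
            node.tasks.append((task_name, cpus, mem))
            return node.name
    return None  # no capacity anywhere: task stays queued

cluster = [Node("mesos-01", 8, 16000), Node("mesos-02", 8, 16000)]
print(place(cluster, "batch-worker-1", 4, 8000))  # mesos-01
print(place(cluster, "batch-worker-2", 6, 8000))  # mesos-02 (no longer fits on 01)
print(place(cluster, "batch-worker-3", 6, 8000))  # None - queued until capacity frees
```

    A real scheduler adds self-healing on top of this: when a node fails, its tasks are simply re-placed by the same routine, which is where the availability benefit cited above comes from.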

  16. RTI in a Middle School: Findings and Practical Implications of a Tier 2 Reading Comprehension Study

    ERIC Educational Resources Information Center

    Faggella-Luby, Michael; Wardwell, Michelle

    2011-01-01

    Response to intervention (RTI) has received considerable attention from both researchers and practitioners as a schoolwide model for service delivery. However, research is limited on RTI applications in middle and high schools. The purpose of this article is to describe the outcomes of an experimental examination of a secondary (Tier 2) literacy…

  17. Development of a Three-Tier Test as a Valid Diagnostic Tool for Identification of Misconceptions Related to Carbohydrates

    ERIC Educational Resources Information Center

    Milenkovic, Dusica D.; Hrin, Tamara N.; Segedinac, Mirjana D.; Horvat, Sasa

    2016-01-01

    This study describes the development and application of a three-tier test as a valid and reliable tool in diagnosing students' misconceptions regarding some basic concepts about carbohydrates. The test was administrated to students of the Pharmacy Department at the University of Bijeljina (Serb Republic). The results denoted construct and content…

  18. Density-based parallel skin lesion border detection with webCL

    PubMed Central

    2015-01-01

    Background Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. Methods A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multi-cores and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently from internet browsers. Results Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect can be mitigated when they are implemented in parallel. In this study, a density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We describe WebCL and our parallel algorithm design. In addition, we tested the parallel code on 100 dermoscopy images and report the execution speedups with respect to the serial version. Results indicate that the parallel (WebCL) and serial versions of the density-based lesion border detection method generate the same accuracy rates for the 100 dermoscopy images, with a mean border error of 6.94%, mean recall of 76.66%, and mean precision of 99.29%. Moreover, the WebCL version's speedup factor for lesion border detection on the 100 dermoscopy images averages around ~491.2. Conclusions When the large number of high-resolution dermoscopy images encountered in a usual clinical setting is considered, along with the critical importance of early detection and diagnosis of melanoma before metastasis, the importance of fast processing of dermoscopy images becomes obvious. In this paper, we introduce WebCL and its use for biomedical image processing applications. WebCL is a JavaScript binding of OpenCL, which takes advantage of GPU computing from a web browser. Therefore, the WebCL-parallel version of density-based skin lesion border detection introduced in this study can supplement expert dermatologists and aid them in the early diagnosis of skin lesions. While WebCL is currently an emerging technology, full adoption of WebCL into the HTML5 standard would allow this implementation to run on a very large set of hardware and software systems. WebCL takes full advantage of parallel computational resources, including multi-cores and GPUs on a local machine, and allows compiled code to run directly from the web browser. PMID:26423836

  19. Density-based parallel skin lesion border detection with webCL.

    PubMed

    Lemon, James; Kockara, Sinan; Halic, Tansel; Mete, Mutlu

    2015-01-01

    Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multi-cores and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently from internet browsers. Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect can be mitigated when they are implemented in parallel. In this study, a density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We describe WebCL and our parallel algorithm design. In addition, we tested the parallel code on 100 dermoscopy images and report the execution speedups with respect to the serial version. Results indicate that the parallel (WebCL) and serial versions of the density-based lesion border detection method generate the same accuracy rates for the 100 dermoscopy images, with a mean border error of 6.94%, mean recall of 76.66%, and mean precision of 99.29%. Moreover, the WebCL version's speedup factor for lesion border detection on the 100 dermoscopy images averages around ~491.2. When the large number of high-resolution dermoscopy images encountered in a usual clinical setting is considered, along with the critical importance of early detection and diagnosis of melanoma before metastasis, the importance of fast processing of dermoscopy images becomes obvious. In this paper, we introduce WebCL and its use for biomedical image processing applications. WebCL is a JavaScript binding of OpenCL, which takes advantage of GPU computing from a web browser. Therefore, the WebCL-parallel version of density-based skin lesion border detection introduced in this study can supplement expert dermatologists and aid them in the early diagnosis of skin lesions. While WebCL is currently an emerging technology, full adoption of WebCL into the HTML5 standard would allow this implementation to run on a very large set of hardware and software systems. WebCL takes full advantage of parallel computational resources, including multi-cores and GPUs on a local machine, and allows compiled code to run directly from the web browser.
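
    The accuracy figures quoted in these records (border error, recall, precision) are standard mask-overlap metrics. A minimal sketch on toy binary masks, assuming border error is defined as the XOR area between the automatic and manual masks divided by the manual mask area (an illustrative assumption, not taken from the paper):

```python
def border_metrics(auto_mask, manual_mask):
    """Compare an automatically detected lesion mask against a manually
    drawn ground truth. Masks are same-sized 2D lists of 0/1."""
    tp = fp = fn = 0
    for a_row, m_row in zip(auto_mask, manual_mask):
        for a, m in zip(a_row, m_row):
            if a and m: tp += 1
            elif a and not m: fp += 1
            elif m and not a: fn += 1
    manual_area = tp + fn
    border_error = (fp + fn) / manual_area  # XOR area / ground-truth area
    recall = tp / (tp + fn)                 # fraction of true lesion recovered
    precision = tp / (tp + fp)              # fraction of detection that is lesion
    return border_error, recall, precision

auto   = [[0, 1, 1, 0],
          [0, 1, 1, 0]]
manual = [[0, 1, 1, 1],
          [0, 1, 1, 1]]
print(border_metrics(auto, manual))  # (0.333..., 0.666..., 1.0)
```

    The paper's reported high precision (99.29%) with lower recall (76.66%) would correspond, under this definition, to detections that rarely spill outside the true lesion but miss some of its area.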

  20. Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user community. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains their own instance of Giovanni, exposing the variables of most interest to their user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.

  1. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component for organizing diverse image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research projects. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing, as well as database querying and management, with security and data anonymization concerns well taken care of. The database is structured as a multi-tier client-server architecture with a Relational Database Management System, Security Layer, Application Layer and User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to use. We used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in the image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. A prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.
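
    The system itself was built in Java, but the core idea of the application layer, selecting image cohorts by clinically relevant metadata without touching the image files, can be sketched with a hypothetical schema. Python and SQLite are used here purely for illustration; table and column names are assumptions, not the actual schema:

```python
import sqlite3

# Hypothetical metadata table: images indexed by pathology and modality
# so research cohorts can be selected by query alone.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE image_meta (
    image_id   INTEGER PRIMARY KEY,
    patient_id TEXT,
    pathology  TEXT,
    modality   TEXT,
    acquired   TEXT)""")
rows = [(1, "P001", "glioma", "MR-T1", "2003-02-11"),
        (2, "P002", "stroke", "CT",    "2003-05-30"),
        (3, "P003", "glioma", "MR-T2", "2003-09-18")]
db.executemany("INSERT INTO image_meta VALUES (?, ?, ?, ?, ?)", rows)

# Select a validation cohort by clinically relevant criteria,
# sorted by acquisition date.
cohort = db.execute(
    "SELECT image_id, modality FROM image_meta "
    "WHERE pathology = ? ORDER BY acquired", ("glioma",)).fetchall()
print(cohort)  # [(1, 'MR-T1'), (3, 'MR-T2')]
```

    Parameterized queries like the one above also matter for the security layer the abstract mentions, since they prevent injection from user-supplied search terms.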

  2. Spider-web inspired multi-resolution graphene tactile sensor.

    PubMed

    Liu, Lu; Huang, Yu; Li, Fengyu; Ma, Ying; Li, Wenbo; Su, Meng; Qian, Xin; Ren, Wanjie; Tang, Kanglai; Song, Yanlin

    2018-05-08

    Accurate multi-dimensional response and smooth signal transmission are critical challenges in the advancement of multi-resolution recognition and complex environment analysis. Inspired by the structure-activity relationship between the discrepant microstructures of the spiral and radial threads in a spider web, we designed and printed graphene with porous and densely-packed microstructures and integrated them into a multi-resolution graphene tactile sensor. The three-dimensional (3D) porous graphene structure provides multi-dimensional deformation responses. The laminar densely-packed graphene structure contributes excellent conductivity with flexible stability. The spider-web-inspired printed pattern enables orientational and locational motion tracking. This multi-structure construction from a single graphene material can integrate discrepant electronic properties with remarkable flexibility, making it attractive for electronic skin, wearable devices and human-machine interactions.

  3. Tethys: A Platform for Water Resources Modeling and Decision Support Apps

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Swain, N. R.

    2015-12-01

    The interactive nature of web applications or "web apps" makes them an excellent medium for conveying complex scientific concepts to lay audiences and creating decision support tools that harness cutting-edge modeling techniques. However, the technical expertise required to develop web apps represents a barrier for would-be developers. This barrier can be characterized by the following hurdles that developers must overcome: (1) identify, select, and install software that meets the spatial and computational capabilities commonly required for water resources modeling; (2) orchestrate the use of multiple free and open source (FOSS) projects and navigate their differing application programming interfaces; (3) learn the multi-language programming skills required for modern web development; and (4) develop a web-secure and fully featured web portal to host the app. Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. It includes (1) a suite of FOSS that addresses the unique data and computational needs common to water resources web app development, (2) a Python software development kit that streamlines development, and (3) a customizable web portal that is used to deploy the completed web apps. Tethys synthesizes several software projects including PostGIS, 52°North WPS, GeoServer, Google Maps™, OpenLayers, and Highcharts. It has been used to develop a broad array of web apps for water resources modeling and decision support for several projects including CI-WATER, HydroShare, and the National Flood Interoperability Experiment. The presentation will include live demos of some of the apps that have been developed using Tethys to demonstrate its capabilities.

  4. EURODELTA-Trends, a multi-model experiment of air quality hindcast in Europe over 1990-2010

    NASA Astrophysics Data System (ADS)

    Colette, Augustin; Andersson, Camilla; Manders, Astrid; Mar, Kathleen; Mircea, Mihaela; Pay, Maria-Teresa; Raffort, Valentin; Tsyro, Svetlana; Cuvelier, Cornelius; Adani, Mario; Bessagnet, Bertrand; Bergström, Robert; Briganti, Gino; Butler, Tim; Cappelletti, Andrea; Couvidat, Florian; D'Isidoro, Massimo; Doumbia, Thierno; Fagerli, Hilde; Granier, Claire; Heyes, Chris; Klimont, Zig; Ojha, Narendra; Otero, Noelia; Schaap, Martijn; Sindelarova, Katarina; Stegehuis, Annemiek I.; Roustan, Yelva; Vautard, Robert; van Meijgaard, Erik; Garcia Vivanco, Marta; Wind, Peter

    2017-09-01

    The EURODELTA-Trends multi-model chemistry-transport experiment has been designed to facilitate a better understanding of the evolution of air pollution and its drivers for the period 1990-2010 in Europe. The main objective of the experiment is to assess the efficiency of air pollutant emissions mitigation measures in improving regional-scale air quality. The present paper formulates the main scientific questions and policy issues being addressed by the EURODELTA-Trends modelling experiment with an emphasis on how the design and technical features of the modelling experiment answer these questions. The experiment is designed in three tiers, with increasing degrees of computational demand in order to facilitate the participation of as many modelling teams as possible. The basic experiment consists of simulations for the years 1990, 2000, and 2010. Sensitivity analysis for the same three years using various combinations of (i) anthropogenic emissions, (ii) chemical boundary conditions, and (iii) meteorology complements it. The most demanding tier consists of two complete time series from 1990 to 2010, simulated using either time-varying emissions for corresponding years or constant emissions. Eight chemistry-transport models have contributed with calculation results to at least one experiment tier, and five models have - to date - completed the full set of simulations (and 21-year trend calculations have been performed by four models). The modelling results are publicly available for further use by the scientific community. The main expected outcomes are (i) an evaluation of the models' performances for the three reference years, (ii) an evaluation of the skill of the models in capturing observed air pollution trends for the 1990-2010 time period, (iii) attribution analyses of the respective role of driving factors (e.g. 
emissions, boundary conditions, meteorology), (iv) a dataset based on a multi-model approach, to provide more robust model results for use in impact studies related to human health, ecosystem, and radiative forcing.
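
    The 21-year trend calculations mentioned above reduce each model's annual series to a linear trend before comparison across the ensemble. A minimal least-squares sketch on synthetic data (the concentration series, the two-model ensemble, and the averaging are illustrative assumptions, not EURODELTA-Trends results):

```python
def linear_trend(years, values):
    """Ordinary least-squares slope (units per year) of an annual series."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Synthetic annual-mean concentrations, 1990-2010 (21 values), one series
# per model; the multi-model trend is the mean of the per-model trends.
years = list(range(1990, 2011))
model_a = [30 - 0.5 * i for i in range(21)]  # steadily declining pollutant
model_b = [28 - 0.4 * i for i in range(21)]
trends = [linear_trend(years, m) for m in (model_a, model_b)]
print(trends)                     # ≈ [-0.5, -0.4] units per year
print(sum(trends) / len(trends))  # ensemble-mean trend ≈ -0.45
```

    A relative trend, as discussed for particulate matter above, would divide this slope by the series mean, which is why models with an absolute bias can still capture relative trends well.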

  5. Eurodelta-Trends, a Multi-Model Experiment of Air Quality Hindcast in Europe over 1990-2010. Experiment Design and Key Findings

    NASA Astrophysics Data System (ADS)

    Colette, A.; Ciarelli, G.; Otero, N.; Theobald, M.; Solberg, S.; Andersson, C.; Couvidat, F.; Manders-Groot, A.; Mar, K. A.; Mircea, M.; Pay, M. T.; Raffort, V.; Tsyro, S.; Cuvelier, K.; Adani, M.; Bessagnet, B.; Bergstrom, R.; Briganti, G.; Cappelletti, A.; D'isidoro, M.; Fagerli, H.; Ojha, N.; Roustan, Y.; Vivanco, M. G.

    2017-12-01

    The Eurodelta-Trends multi-model chemistry-transport experiment has been designed to better understand the evolution of air pollution and its drivers for the period 1990-2010 in Europe. The main objective of the experiment is to assess the efficiency of air pollutant emissions mitigation measures in improving regional-scale air quality. The experiment is designed in three tiers with increasing degrees of computational demand in order to facilitate the participation of as many modelling teams as possible. The basic experiment consists of simulations for the years 1990, 2000 and 2010. Sensitivity analysis for the same three years using various combinations of (i) anthropogenic emissions, (ii) chemical boundary conditions and (iii) meteorology complements it. The most demanding tier consists of two complete time series from 1990 to 2010, simulated using either time-varying emissions for the corresponding years or constant emissions. Eight chemistry-transport models have contributed calculation results to at least one experiment tier, and six models have completed the 21-year trend simulations. The modelling results are publicly available for further use by the scientific community. We assess the skill of the models in capturing observed air pollution trends for the 1990-2010 time period. The average relative particulate matter trends are well captured by the models, even if they display the usual low bias in reproducing absolute levels. Ozone trends are also well reproduced, yet slightly overestimated in the 1990s. The attribution study emphasizes the efficiency of mitigation measures in reducing air pollution over Europe, although a strong impact of long-range transport on ozone trends is pointed out. Meteorological variability is also an important factor in some regions of Europe. Results of the first health and ecosystem impact studies building upon a regional-scale multi-model ensemble over a 20-year time period will also be presented.

  6. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

    The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the relation between users and doctors by providing more services inside healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The proposed framework is used to improve healthcare scalability and efficiency by enhancing the remote triage and remote prioritization processes for patients, and to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Our main goal is to achieve a high level of accuracy in prioritizing and triaging patients remotely. Meanwhile, the role of multi-source data fusion in telemonitoring healthcare systems is demonstrated, and we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results for different symptoms, relating to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm compared with conventional algorithms in terms of classifying and prioritizing patients remotely.
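
    As a purely illustrative sketch of the kind of rule-based fusion and tiered prioritization the abstract describes (the thresholds, field names, and tier numbers below are assumptions for the example, not the authors' MSHA algorithm):

```python
# Hypothetical sketch: fuse sensor readings (heart rate, SpO2, blood
# pressure) and a text-based symptom flag into one priority tier, then
# sort patients most-urgent-first. Thresholds are illustrative only.

def triage_priority(heart_rate, spo2, systolic_bp, symptom_flag=False):
    """Return a priority tier: 1 (urgent), 2 (elevated), or 3 (routine)."""
    score = 0
    if heart_rate < 50 or heart_rate > 120:
        score += 2
    if spo2 < 90:
        score += 2
    elif spo2 < 94:
        score += 1
    if systolic_bp < 90 or systolic_bp > 180:
        score += 2
    if symptom_flag:  # e.g. patient reports chest pain in a text input
        score += 1
    if score >= 3:
        return 1      # urgent: forward to a doctor immediately
    if score >= 1:
        return 2      # elevated: queue for review
    return 3          # routine monitoring

def prioritise(patients):
    """Sort patient records (dicts of vitals) by fused priority tier."""
    return sorted(patients, key=lambda p: triage_priority(**p))
```

    In a three-tier deployment like the one described, a function of this shape would run at the base station, deciding which records to forward to the server first.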

  7. Scientific Visualization and Simulation for Multi-dimensional Marine Environment Data

    NASA Astrophysics Data System (ADS)

    Su, T.; Liu, H.; Wang, W.; Song, Z.; Jia, Z.

    2017-12-01

    With growing attention on the ocean and the rapid development of marine detection, there are increasing demands for realistic simulation and interactive visualization of the marine environment in real time. Based on advanced technologies such as GPU rendering, CUDA parallel computing and a rapid grid-oriented strategy, this paper proposes a series of efficient, high-quality visualization methods that can deal with large-scale, multi-dimensional marine data in different environmental circumstances. Firstly, a high-quality seawater simulation is realized with an FFT algorithm, bump mapping and texture animation. Secondly, large-scale multi-dimensional marine hydrological data are visualized with 3D interaction and volume rendering techniques. Thirdly, seabed terrain data are simulated with an improved Delaunay algorithm, surface reconstruction, a dynamic LOD algorithm and GPU programming techniques. Fourthly, seamless real-time modelling of both ocean and land on a digital globe is achieved with WebGL to meet the requirements of web-based applications. The experiments suggest that these methods not only produce a satisfying marine environment simulation but also meet the rendering requirements of global multi-dimensional marine data. Additionally, a simulation system for underwater oil spills is built on the OSG 3D rendering engine. It is integrated with the marine visualization methods mentioned above and shows movement processes, physical parameters, and current velocity and direction for different types of deep-water oil spill particles (oil particles, hydrate particles, gas particles, etc.) dynamically and simultaneously in multiple dimensions. Such an application can provide valuable reference and decision-making information for understanding the progress of an oil spill in deep water, which is helpful for ocean disaster forecasting, warning and emergency response.

  8. A Security Architecture for Grid-enabling OGC Web Services

    NASA Astrophysics Data System (ADS)

    Angelini, Valerio; Petronzio, Luca

    2010-05-01

    In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security, the integration of OWS-compliant infrastructures and gLite Grids must address relevant challenges arising from their respective design principles. OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web based) with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure.
Applying the separation-of-concerns principle, each of these tiers is responsible for controlling access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This makes it possible to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS Security system, access restrictions are applied making use of the enhanced geospatial capabilities specified by the OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks whether the user is entitled to access a Grid infrastructure. In that case his/her identity is translated to a temporary Grid security token using the Short Lived Credential Services (IGTF Standard). In our case, for the specific gLite Grid infrastructure, some information (VOMS Attributes) is plugged into the Grid Security Token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and also by the various gLite middleware elements to verify the user's grants.
Based on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other web authentication services such as OpenID, Kerberos and WS-Federation.

  9. Unlocking Solar for Low- and Moderate-Income Residents: A Matrix of Financing Options by Resident, Provider, and Housing Type

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeffrey J.; Bird, Lori A.

    Historically, the low- and moderate-income (LMI) market has been underserved by solar photovoltaics (PV), in part due to the unique barriers LMI residents face to participation in the PV market. The intent of this report is to identify the most promising strategies state policymakers might consider to finance PV for LMI customers across three housing types: single family, multi-family, and manufactured housing. The result is a financing matrix that documents the first- and second-tier financing options states may consider for each housing type. The first-tier options were selected based upon their potential impact on LMI PV deployment. Second-tier financing approaches could also be used to achieve state policy goals, but may not have as much effect on the relevant LMI market segment. Nevertheless, each financing option comes with tradeoffs that state policymakers may wish to consider when they make decisions about which financing approaches are best suited to achieve their LMI PV deployment goals.

  10. Predicting First Grade Reading Performance from Kindergarten Response to Tier 1 Instruction

    PubMed Central

    Al Otaiba, Stephanie; Folsom, Jessica S.; Schatschneider, Christopher; Wanzek, Jeanne; Greulich, Luana; Meadows, Jane; Li, Zhi; Connor, Carol M

    2010-01-01

    Many schools are beginning to implement multi-tier response to intervention (RTI) models for the prevention of reading difficulties and to assist in the identification of students with learning disabilities (LD). The present study was part of our larger ongoing longitudinal RTI investigation within the Florida Learning Disabilities Center grant. This study used a longitudinal correlational design, conducted in 7 ethnically and socio-economically diverse schools. We observed reading instruction in 20 classrooms, examined response rates to kindergarten Tier 1 instruction, and predicted students’ first grade reading performance based upon kindergarten growth and end of year reading performance (n = 203). Teachers followed an explicit core reading program and overall, classroom instruction was rated as effective. Results indicate that controlling for students’ end of kindergarten reading, their growth across kindergarten on a variety of language and literacy measures suppressed predictions of first grade performance. Specifically, the steeper the students’ trajectory to a satisfactory outcome, the less likely they were to demonstrate good performance in first grade. Implications for future research and RTI implementation are discussed. PMID:21857718

  11. Development of Two-Tier Diagnostic Test Pictorial-Based for Identifying High School Students Misconceptions on the Mole Concept

    NASA Astrophysics Data System (ADS)

    Siswaningsih, W.; Firman, H.; Zackiyah; Khoirunnisa, A.

    2017-02-01

    The aim of this study was to develop a two-tier pictorial-based diagnostic test for identifying student misconceptions on the mole concept. The study used a development-and-validation method. The test was developed through four phases: development of items, validation, determination of the answer key, and application of the test. The test was developed in pictorial form with two tiers: the first tier consists of four possible answers and the second tier of four possible reasons. Based on the content validity results for 20 items using the CVR (Content Validity Ratio), 18 items were declared valid. Based on the reliability test using SPSS, 17 items were obtained with a Cronbach's Alpha value of 0.703, which means the items were acceptable. A total of 10 items was administered to 35 senior high school students who had studied the mole concept at one of the high schools in Cimahi. Based on the results of the application test, student misconceptions were identified for each concept label within the mole concept, with misconception percentages for the mole (60.15%), Avogadro's number (34.28%), relative atomic mass (62.84%), relative molecular mass (77.08%), molar mass (68.53%), molar volume of gas (57.11%), molarity (71.32%), chemical equations (82.77%), limiting reactants (91.40%), and molecular formula (77.13%).
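
    A two-tier item is typically scored on the combination of the answer tier and the reason tier. As a hedged sketch of one common scoring scheme (the category names and the "partial" category are assumptions for illustration, not necessarily the authors' exact rubric):

```python
# Illustrative two-tier scoring: a response counts as full understanding
# only when both the answer tier and the reason tier are correct; wrong
# combinations of both tiers are counted toward the misconception rate.

def classify_response(answer, reason, key_answer, key_reason):
    if answer == key_answer and reason == key_reason:
        return "understands"
    if answer == key_answer or reason == key_reason:
        return "partial"       # one tier right, one wrong
    return "misconception"     # both tiers wrong

def misconception_rate(responses, key_answer, key_reason):
    """Percentage of (answer, reason) pairs classified as misconception."""
    hits = sum(
        classify_response(a, r, key_answer, key_reason) == "misconception"
        for a, r in responses
    )
    return 100.0 * hits / len(responses)
```

    Per-concept percentages like those reported above would then come from running `misconception_rate` over the responses to each item.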

  12. Adoption of Second Life in Higher Education: Comparing the Effect of Utilitarian and Hedonic Behaviours

    ERIC Educational Resources Information Center

    Saeed, Nauman; Sinnappan, Sukunesan

    2013-01-01

    Second Life is a three dimensional multi-user virtual environment within the Web 2.0 suite of applications which has gained wide spread popularity amongst educators in the recent years. However, limited empirical research has been reported on the adoption of Second Life, especially within higher education. The majority of technology adoption…

  13. Avatars Go to Class: A Virtual Environment Soil Science Activity

    ERIC Educational Resources Information Center

    Mamo, M.; Namuth-Covert, D.; Guru, A.; Nugent, G.; Phillips, L.; Sandall, L.; Kettler, T.; McCallister, D.

    2011-01-01

    Web 2.0 technology is expanding rapidly from social and gaming uses into the educational applications. Specifically, the multi-user virtual environment (MUVE), such as SecondLife, allows educators to fill the gap of first-hand experience by creating simulated realistic evolving problems/games. In a pilot study, a team of educators at the…

  14. The discovery of student experiences using the Frayer model map as a Tier 2 intervention in secondary science

    NASA Astrophysics Data System (ADS)

    Miller, Cory D.

    The purpose of this study was to discover students' experiences of using the Frayer model map as a Tier 2 intervention in science. In response to the criticized discrepancy model and the mandates contained in NCLB and the IDEA, response to intervention (RtI) has been implemented in education to increase achievement for all students and to discover which students need further interventions. Based on Cronbach's (1957) aptitude X treatment interaction theory, RtI assumes that progress over time can be measured when interventions are applied. While RtI has been actively implemented in reading and math, it has not been implemented in science. Therefore, it was not known what students' experiences are of using the Frayer model map as a Tier 2 intervention to impact science achievement. The multiple case study used a qualitative methodology that included pre- and post-intervention web-based surveys, field notes during observations, and student work collected during the course of the study. The population studied was seventh- and eighth-grade students considered at-risk who attended a Title I school in Florida. The sample was purposively selected according to a set of criteria similar to Tier 2 selection in RtI. The research question was, "What are the experiences of middle grades students using the Frayer model map as an instructional intervention in science?" The answer was that participants perceived the Frayer model map as a tool to organize tasks and create meaning while they completed the work independently and with accuracy. Even though there were limitations in the quantity of data, the research question was adequately answered. Overall, the study fills a gap in the literature related to RtI and science education.

  15. Pesticides exposure assessment of kettleman city using the industrial source complex short-term model version 3.

    PubMed

    Tao, Jing; Barry, Terrell; Segawa, Randy; Neal, Rosemary; Tuli, Atac

    2013-01-01

    Kettleman City, California, reported a higher than expected number of birth defect cases between 2007 and 2010, raising the concern of community and government agencies. A pesticide exposure evaluation was conducted as part of a complete assessment of community chemical exposure. Nineteen pesticides that potentially cause birth defects were investigated. The Industrial Source Complex Short-Term Model Version 3 (ISCST3) was used to estimate off-site air concentrations associated with pesticide applications within 8 km of the community from late 2006 to 2009. The health screening levels were designed to indicate potential health effects and used for preliminary health evaluations of estimated air concentrations. A tiered approach was conducted. The first tier modeled simple, hypothetical worst-case situations for each of 19 pesticides. The second tier modeled specific applications of the pesticides with estimated concentrations exceeding health screening levels in the first tier. The pesticide use report database of the California Department of Pesticide Regulation provided application information. Weather input data were summarized from the measurements of a local weather station in the California Irrigation Management Information System. The ISCST3 modeling results showed that during the target period, only two application days of one pesticide (methyl isothiocyanate) produced air concentration estimates above the health screening level for developmental effects at the boundary of Kettleman City. These results suggest that the likelihood of birth defects caused by pesticide exposure was low. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  16. Influence of genetic strain and access to litter on spatial distribution of 4 strains of laying hens in an aviary system1

    PubMed Central

    Ali, A. B. A.; Campbell, D. L. M.; Karcher, D. M.; Siegford, J. M.

    2016-01-01

    Many laying hen producers are transitioning from conventional cages to new housing systems including multi-tier aviaries. Aviary resources, such as litter areas, are intended to encourage hens’ expression of natural behaviors to improve their welfare. Little research has examined the influence of laying hen strain on distribution and behavior inside aviaries, yet differences could influence a strain's suitability for an aviary design. This research examined how laying hens of 4 strains (Hy-Line Brown [HB], Bovans Brown [BB], DeKalb White [DW], and Hy-Line W36) distributed themselves among 3 enclosed aviary tiers and 2 litter areas at peak lay (25 to 28 wk of age) and after gaining access to litter on the floor (26 wk). Observations of hens’ spatial distribution were conducted immediately before and after, and 3 wk after hens gained access to litter. More HB and BB hens were in upper tiers in morning compared to DW and W36 (all P ≤ 0.05). However, DW and W36 hens roosted in upper tiers in larger numbers than HB and BB during evening (all P ≤ 0.05). More DW and W36 hens were on litter compared to BB and HB, particularly when litter was first accessible (all P ≤ 0.05). The number of hens on litter increased over time for all strains (P ≤ 0.06). White hens on litter occupied open areas in higher numbers (P ≤ 0.05), while more brown hens occupied litter under the aviary after acclimation (P ≤ 0.05). In the dark period, W36 and DW hens were present in higher numbers in upper tiers than HB and BB, while HB and BB showed higher tier-to-tier movement than DW and W36 (P ≤ 0.05). In general, more white hens roosted higher at night and explored litter sooner, while more brown hens were near or in nests in the morning and moved at night. Distinct strain differences indicate that attention should be paid to the match between configuration of the aviary design and strain of laying hen. PMID:27444438

  17. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356

  18. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool, the Climate Model Diagnostic Analyzer (CMDA), that facilitates this process. It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) No installation of any software other than a browser, hence platform compatibility; (2) Co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) Multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) Cloud deployment, so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
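
    The wrapping pattern described (an existing science routine exposed as a web service via a Python wrapper) can be sketched minimally. CMDA itself uses Flask/Gunicorn/Tornado; to keep this sketch self-contained it uses the standard WSGI interface instead, and the "science code" here is a stand-in, not CMDA's:

```python
# Hedged sketch of wrapping a science routine as a web service: parse
# query parameters, call the computation, return JSON. A WSGI app like
# this could be served by Gunicorn; Flask provides the same pattern
# with routing and request parsing built in.
import json
from urllib.parse import parse_qs

def anomaly(values, baseline):
    """Stand-in 'science code': departure of each value from a baseline."""
    return [v - baseline for v in values]

def cmda_like_app(environ, start_response):
    """WSGI wrapper: query params in, JSON result out."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    baseline = float(qs.get("baseline", ["0"])[0])
    body = json.dumps({"anomaly": anomaly(values, baseline)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]
```

    Keeping the science function free of any web-framework code, as here, is what makes the co-location benefit (2) cheap: the same function can run in a batch job or behind a server unchanged.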

  19. Arrowland v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BIRKEL, GARRETT; GARCIA MARTIN, HECTOR; MORRELL, WILLIAM

    "Arrowland" is a web-based software application primarily for mapping, integrating and visualizing a variety of metabolism data of living organisms, including but not limited to metabolomics, proteomics, transcriptomics and fluxomics. This software application makes multi-omics data analysis intuitive and interactive. It improves data sharing and communication by enabling users to visualize their omics data using a web browser (on a PC or mobile device). It increases users' productivity by simplifying multi-omics data analysis using well-developed maps as a guide. With this tool, users can gain insights into their data sets that would be difficult or even impossible to tease out by looking at raw numbers, or by using their existing toolchains to generate static single-use maps. Arrowland helps users save time by visualizing relative changes across conditions or over time, and helps users produce more significant insights faster. Preexisting maps decrease the learning curve for beginners in the omics field. Sets of multi-omics data are presented in the browser as a two-dimensional flowchart resembling a map, with varying levels of detail based on the scaling of the map. Users can pan and zoom to explore different maps, compare maps, upload their own research data sets onto desired maps, alter map appearance in ways that facilitate interpretation, visualization and analysis of the given data, and export data, reports and actionable items.

  20. Risk prediction with procalcitonin and clinical rules in community-acquired pneumonia

    PubMed Central

    Huang, David T.; Weissfeld, Lisa A.; Kellum, John A.; Yealy, Donald M.; Kong, Lan; Martino, Michael; Angus, Derek C.

    2009-01-01

    Objective The Pneumonia Severity Index (PSI) and CURB-65 predict outcomes in community acquired pneumonia (CAP), but have limitations. Procalcitonin, a biomarker of bacterial infection, may provide prognostic information in CAP. Our objective was to describe the pattern of procalcitonin in CAP, and determine if procalcitonin provides prognostic information beyond PSI and CURB-65. Methods We conducted a multi-center prospective cohort study in 28 community and teaching emergency departments. Patients presenting with a clinical and radiographic diagnosis of CAP were enrolled. We stratified procalcitonin levels a priori into four tiers – I: < 0.1; II: ≥ 0.1 to <0.25; III: ≥ 0.25 to < 0.5; and IV: ≥ 0.5 ng/ml. Primary outcome was 30d mortality. Results 1651 patients formed the study cohort. Procalcitonin levels were broadly spread across tiers: 32.8% (I), 21.6% (II), 10.2% (III), 35.4% (IV). Used alone, procalcitonin had modest test characteristics: specificity (35%), sensitivity (92%), positive likelihood ratio (LR) (1.41), and negative LR (0.22). Adding procalcitonin to PSI in all subjects minimally improved performance. Adding procalcitonin to low risk PSI subjects (Class I–III) provided no additional information. However, subjects in procalcitonin tier I had low 30d mortality regardless of clinical risk, including those in higher risk classes (1.5% vs. 1.6% for those in PSI Class I–III vs. Class IV/V). Among high risk PSI subjects (Class IV/V), one quarter (126/546) were in procalcitonin tier I, and the negative LR of procalcitonin tier I was 0.09. Procalcitonin tier I was also associated with lower burden of other adverse outcomes. Similar results were seen with CURB-65 stratification. Conclusions Selective use of procalcitonin as an adjunct to existing rules may offer additional prognostic information in high risk patients. PMID:18342993
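
    The a-priori stratification described above maps directly to a small classifier. As an illustrative sketch only (not clinical software), using the cutoffs stated in the abstract:

```python
# Procalcitonin tiers as stated in the study: I: < 0.1; II: >= 0.1 to
# < 0.25; III: >= 0.25 to < 0.5; IV: >= 0.5 ng/ml. Illustrative only.

def procalcitonin_tier(pct_ng_ml):
    """Return the study's a-priori tier for a procalcitonin level."""
    if pct_ng_ml < 0.1:
        return "I"
    if pct_ng_ml < 0.25:
        return "II"
    if pct_ng_ml < 0.5:
        return "III"
    return "IV"
```

    The study's key finding is then a lookup on this output: subjects in tier I had low 30-day mortality even within high-risk PSI classes.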

  1. Race to the Top--Early Learning Challenge (RTT-ELC): Descriptive Study of Tiered Quality Rating and Improvement Systems (TQRIS). Master Data Collection Protocol

    ERIC Educational Resources Information Center

    Mathematica Policy Research, Inc., 2015

    2015-01-01

    This master data collection protocol describes the data that Mathematica collected for the Race to the Top-Early Learning Challenge Study of Tiered Quality Rating and Improvement Systems. This study was conducted for the Department of Education's Institute of Education Sciences. The data were collected from reviews of applications, documents, and…

  2. The feasibility of a modified shoe for multi-segment foot motion analysis: a preliminary study.

    PubMed

    Halstead, J; Keenan, A M; Chapman, G J; Redmond, A C

    2016-01-01

    The majority of multi-segment kinematic foot studies have been limited to barefoot conditions, because shod conditions have the potential for confounding surface-mounted markers. The aim of this study was to investigate whether a shoe modified with a webbed upper can accommodate multi-segment foot marker sets without compromising kinematic measurements under barefoot and shod conditions. Thirty participants (15 controls and 15 participants with midfoot pain) underwent gait analysis in two conditions; barefoot and wearing a shoe (shod) in a random order. The shod condition employed a modified shoe (rubber plimsoll) with a webbed upper, allowing skin mounted reflective markers to be visualised through slits in the webbed material. Three dimensional foot kinematics were captured using the Oxford multi-segment foot model whilst participants walked at a self-selected speed. The foot pain group showed greater hindfoot eversion and less hindfoot dorsiflexion than controls in the barefoot condition and these differences were maintained when measured in the shod condition. Differences between the foot pain and control participants were also observed for walking speed in the barefoot and in the shod conditions. No significant differences between foot pain and control groups were demonstrated at the forefoot in either condition. Subtle differences between pain and control groups, which were found during barefoot walking are retained when wearing the modified shoe. The novel properties of the modified shoe offers a potential solution for the use of passive infrared based motion analysis for shod applications, for instance to investigate the kinematic effect of foot orthoses.

  3. Astrophysical data mining with GPU. A case study: Genetic classification of globular clusters

    NASA Astrophysics Data System (ADS)

    Cavuoti, S.; Garofalo, M.; Brescia, M.; Paolillo, M.; Pescape', A.; Longo, G.; Ventre, G.

    2014-01-01

    We present a multi-purpose genetic algorithm, designed and implemented with GPGPU/CUDA parallel computing technology. The model was derived from our CPU serial implementation, named GAME (Genetic Algorithm Model Experiment). It was successfully tested and validated on the detection of candidate Globular Clusters in deep, wide-field, single band HST images. The GPU version of GAME will be made available to the community by integrating it into the web application DAMEWARE (DAta Mining Web Application REsource, http://dame.dsf.unina.it/beta_info.html), a public data mining service specialized on massive astrophysical data. Since genetic algorithms are inherently parallel, the GPGPU computing paradigm leads to a speedup of a factor of 200× in the training phase with respect to the CPU based version.
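
    GAME itself is a multi-purpose GA for classification, derived from a serial CPU implementation. As a hedged illustration of the serial model such a GA parallelizes (selection, one-point crossover, point mutation; the toy fitness and parameters are assumptions, not GAME's):

```python
# Minimal serial genetic algorithm maximizing a toy fitness (number of
# 1-bits). Each generation: truncation selection, one-point crossover,
# single-bit mutation. The per-individual fitness evaluations are the
# inherently parallel part that maps well to GPGPU/CUDA.
import random

def evolve(bits=16, pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    fitness = lambda g: sum(g)                      # toy objective
    pop = [[rng.randint(0, 1) for _ in range(bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, bits)            # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(bits)                 # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children                    # elitist replacement
    return max(pop, key=fitness)
```

    Because the fitness of every individual can be evaluated independently, the training-phase speedup reported (a factor of about 200 on GPU) comes essentially for free from this structure.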

  4. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  5. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology for integrating available technical information and stakeholder values to support decisions in many fields, and it can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by environmental application area and decision or intervention type. In addition, the papers were classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of the few papers in which several methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.

  6. Pathways to the Profession Survey 2008: Report and Results

    ERIC Educational Resources Information Center

    Spencer, Sarah E.; Kreutzer, Kim; Shallenberger, David

    2008-01-01

    "Pathways to the Profession" has been a multi-tiered project that has looked at the profession of education abroad and the individuals who serve in the profession. The first Pathways survey was conducted by Dr. Joe Brockington of Kalamazoo College. The first survey analyzed how people came to the field of education abroad, what knowledge and…

  7. The Somatic Appraisal Model of Affect: Paradigm for Educational Neuroscience and Neuropedagogy

    ERIC Educational Resources Information Center

    Patten, Kathryn E.

    2011-01-01

    This chapter presents emotion as a function of brain-body interaction, as a vital part of a multi-tiered phylogenetic set of neural mechanisms, evoked by both instinctive processes and learned appraisal systems, and argues to establish the primacy of emotion in relation to cognition. Primarily based on Damasio's somatic marker hypothesis, but also…

  8. An Ecobehavioral Analysis of Child Academic Engagement: Implications for Preschool Children Not Responding to Instructional Intervention

    ERIC Educational Resources Information Center

    Greenwood, Charles R.; Beecher, Constance; Atwater, Jane; Petersen, Sarah; Schiefelbusch, Jean; Irvin, Dwight

    2018-01-01

    A gap exists in the information needed to make intervention decisions with preschool children who are unresponsive to instructional intervention. "Multi-Tiered System of Supports/Response to Intervention" (MTSS/RTI) progress monitoring is helpful in indicating when an intervention change is needed but provides little information on what…

  9. How Educators Perceive the Process and Implementation of a Multi-Tiered System of Supports (MTSS): A Case Study

    ERIC Educational Resources Information Center

    Maniglia, Linda M.

    2017-01-01

    In this qualitative case study the researcher examined educators' perceptions of an RTI/MTSS model of instruction and how staff perceptions relate to fidelity of implementation. The purpose of this research was to investigate educators' understanding of the critical components necessary to support implementation of RTI/MTSS framework. The…

  10. Providing Educationally Related Mental Health Services in California Schools: The Roles of School Psychologists

    ERIC Educational Resources Information Center

    Sosa-Estrella, Olga

    2017-01-01

    Although there is a great need for school-based mental health services (SBMH), these needs are not adequately met in California's public schools. To meet these needs better, evidence-based methods have been used, including multi-tiered systems of support, training and workforce development, cultural competence, and family and youth engagement and…

  11. 75 FR 6033 - Agency Information Collection Request; 60-Day Public Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... Initial Telephone Screen........ Active Control 2400 1 20 minutes 800 hours Group (ACG)/ Experimental Group (EG) In-person interview EG 1200 1 1.25 hours 1,500 hours Jump start phone call EG 1200 1 30... care insurance who are age 75 and over using a multi- tiered random experimental research design to...

  12. 75 FR 19976 - Agency Information Collection Request; 30-Day Public Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ...) hours Initial Telephone Screen...... Experimental 240 1 20/60 80 Group. In-person interview 240 1 80/60... Telephone Screen...... Active Control 240 1 20/60 80 Group. Quarterly phone calls......... 240 4 10/60 160... private long-term care insurance who are age 75 and over using a multi- tiered random experimental...

  13. Effects of CW-FIT on Teachers' Ratings of Elementary School Students at Risk for Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    Caldarella, Paul; Larsen, Ross A. A.; Williams, Leslie; Wills, Howard P.; Kamps, Debra M.; Wehby, Joseph H.

    2017-01-01

    Students with deficits in social skills have been found to experience both short- and long-term problems, including interpersonal conflicts and academic difficulties. These problems are compounded for students with emotional and behavioral disorders (EBD). Class-wide function-related intervention teams (CW-FIT), a multi-tiered classroom management…

  14. Perceptions of Pre-Service Teachers Regarding the Response-to-Intervention Model

    ERIC Educational Resources Information Center

    Arroyo, Kimberly A.

    2014-01-01

    A Response-to-Intervention (RTI) model of educational service delivery is a multi-tiered, preventative approach designed to meet the educational and behavioral needs of all learners. While the New York State (NYS) Department of Education has mandated the use of this model in grades K-4, the extent to which RTI competencies are taught within…

  15. Bridging the Gap: Individual Growth and Development Indicators--The Which One Doesn't Belong Task

    ERIC Educational Resources Information Center

    Rodriguez, Megan I.; Wackerle-Hollman, Alisha K.

    2015-01-01

    The Individual Growth and Development Indicator (IGDI): Which One Doesn't Belong (WODB) task is an early comprehension screening assessment designed for use in pre-Kindergarten multi-tiered systems of support. This article summarizes the purpose, procedures, and evidence base currently available in the literature to support the WODB task. Example…

  16. Assessing African American Students for Specific Learning Disabilities: The Promises and Perils of Response to Intervention

    ERIC Educational Resources Information Center

    Proctor, Sherrie L.; Graves, Scott L., Jr.; Esch, Rachel C.

    2012-01-01

    Response to Intervention (RtI) consists of multi-tiered instructional delivery systems in which educators provide research-based interventions to students that increase in intensity depending on students' instructional response. RtI is currently being implemented in schools across the United States. RtI's shift away from standardized testing…

  17. Using Latent Class Analysis to Identify Academic and Behavioral Risk Status in Elementary Students

    ERIC Educational Resources Information Center

    King, Kathleen R.; Lembke, Erica S.; Reinke, Wendy M.

    2016-01-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in…

  18. Cloud Based Earth Observation Data Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years, the data produced daily by several private and public Earth Observation (EO) satellites has reached the order of tens of terabytes, representing for scientists and commercial application developers both a big opportunity for exploitation and a challenge for management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer scientists and application developers the means to access and use EO data in a quick and cost-effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, and (v) collaboration tools (e.g. forums, wikis, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas, and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular, the platform offers (i) multi-cloud data discovery, (ii) multi-cloud data management and access, and (iii) multi-cloud application deployment. 
This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.

  19. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform.

    PubMed

    Zheng, Wenning; Mutha, Naresh V R; Heydari, Hamed; Dutta, Avirup; Siow, Cheuk Chuen; Jakubovics, Nicholas S; Wee, Wei Yee; Tan, Shi Yang; Ang, Mia Yang; Wong, Guat Jah; Choo, Siew Woh

    2016-01-01

    Background. The gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server, and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity, and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house developed bioinformatics tools implemented in NeisseriaBase were developed using the Python, Perl, BioPerl and R languages. Results. Currently, NeisseriaBase houses 603,500 Coding Sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes. 
NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence Factor Database (VFDB) specific homology searches, the VFDB BLAST is also incorporated into the database. In addition, NeisseriaBase is equipped with in-house designed tools such as the Pairwise Genome Comparison tool (PGC) for comparative genomic analysis and the Pathogenomics Profiling Tool (PathoProT) for the comparative pathogenomics analysis of Neisseria strains. Discussion. This user-friendly database not only provides access to a host of genomic resources on Neisseria but also enables high-quality comparative genome analysis, which is crucial for the expanding scientific community interested in Neisseria research. This database is freely available at http://neisseria.um.edu.my.
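    The per-gene GC-content statistic mentioned above was computed with in-house Perl scripts that are not part of this record. As a rough illustration of the calculation, a minimal Python sketch (the `gc_content` helper and the example sequence are hypothetical, not NeisseriaBase code):

```python
def gc_content(seq: str) -> float:
    """Return the GC content of a nucleotide sequence as a percentage."""
    seq = seq.upper()
    gc = sum(1 for base in seq if base in "GC")
    return 100.0 * gc / len(seq)

# Hypothetical coding sequence, for illustration only
cds = "ATGGCGTGCAAGGGCTAA"
print(round(gc_content(cds), 1))  # → 55.6
```

The same per-gene loop would feed each value into the database alongside the hydrophobicity and molecular-weight columns.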

  20. NeisseriaBase: a specialised Neisseria genomic resource and analysis platform

    PubMed Central

    Zheng, Wenning; Mutha, Naresh V.R.; Heydari, Hamed; Dutta, Avirup; Siow, Cheuk Chuen; Jakubovics, Nicholas S.; Wee, Wei Yee; Tan, Shi Yang; Ang, Mia Yang; Wong, Guat Jah

    2016-01-01

    Background. The gram-negative Neisseria is associated with two of the most potent human epidemic diseases: meningococcal meningitis and gonorrhoea. In both cases, disease is caused by bacteria colonizing human mucosal membrane surfaces. Overall, the genus shows great diversity and genetic variation, mainly due to its ability to acquire and incorporate genetic material from a diverse range of sources through horizontal gene transfer. Although a number of databases exist for the Neisseria genomes, they are mostly focused on the pathogenic species. In this study we present the freely available NeisseriaBase, a database dedicated to the genus Neisseria encompassing the complete and draft genomes of 15 pathogenic and commensal Neisseria species. Methods. The genomic data were retrieved from the National Center for Biotechnology Information (NCBI), annotated using the RAST server, and then stored in a MySQL database. The protein-coding genes were further analyzed to obtain information such as GC content (%), predicted hydrophobicity, and molecular weight (Da) using in-house Perl scripts. The web application was developed following the secure four-tier web application architecture: (1) client workstation, (2) web server, (3) application server, and (4) database server. The web interface was constructed using PHP, JavaScript, jQuery, AJAX and CSS, utilizing the model-view-controller (MVC) framework. The in-house developed bioinformatics tools implemented in NeisseriaBase were developed using the Python, Perl, BioPerl and R languages. Results. Currently, NeisseriaBase houses 603,500 Coding Sequences (CDSs), 16,071 RNAs and 13,119 tRNA genes from 227 Neisseria genomes. The database is equipped with interactive web interfaces. Incorporation of the JBrowse genome browser in the database enables fast and smooth browsing of Neisseria genomes. 
NeisseriaBase includes the standard BLAST program to facilitate homology searching, and for Virulence Factor Database (VFDB) specific homology searches, the VFDB BLAST is also incorporated into the database. In addition, NeisseriaBase is equipped with in-house designed tools such as the Pairwise Genome Comparison tool (PGC) for comparative genomic analysis and the Pathogenomics Profiling Tool (PathoProT) for the comparative pathogenomics analysis of Neisseria strains. Discussion. This user-friendly database not only provides access to a host of genomic resources on Neisseria but also enables high-quality comparative genome analysis, which is crucial for the expanding scientific community interested in Neisseria research. This database is freely available at http://neisseria.um.edu.my. PMID:27017950

  1. A Web Application For Visualizing Empirical Models of the Space-Atmosphere Interface Region: AtModWeb

    NASA Astrophysics Data System (ADS)

    Knipp, D.; Kilcommons, L. M.; Damas, M. C.

    2015-12-01

    We have created a simple and user-friendly web application to visualize output from empirical atmospheric models that describe the lower atmosphere and the Space-Atmosphere Interface Region (SAIR). The Atmospheric Model Web Explorer (AtModWeb) is a lightweight, multi-user, Python-driven application which uses standard web technology (jQuery, HTML5, CSS3) to provide an in-browser interface that can produce plots of modeled quantities such as temperature and the individual-species and total densities of the neutral and ionized upper atmosphere. Output may be displayed as: 1) a contour plot over a map projection, 2) a pseudo-color plot (heatmap) which allows visualization of a variable as a function of two spatial coordinates, or 3) a simple line plot of one spatial coordinate versus any number of desired model output variables. The application is designed around an abstraction of an empirical atmospheric model, essentially treating the model code as a black box, which makes it simple to add additional models without modifying the main body of the application. Currently implemented are the Naval Research Laboratory NRLMSISE-00 model for the neutral atmosphere and the International Reference Ionosphere (IRI). These models are relevant to the Low Earth Orbit environment and the SAIR. The interface is simple and usable, allowing users (students and experts) to specify time and location, and to choose between historical values (i.e. the values for the given date) and manual specification of whichever solar or geomagnetic activity drivers are required by the model. We present a number of use-case examples from research and education: 1) How does atmospheric density between the surface and 1000 km vary with time of day, season, and solar cycle? 2) How do ionospheric layers change with the solar cycle? 3) How does the composition of the SAIR vary between day and night at a fixed altitude?
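    The black-box model abstraction described above can be sketched as an adapter interface; a minimal, hypothetical Python sketch (class and method names are illustrative, not AtModWeb's actual API):

```python
import math
from abc import ABC, abstractmethod

class AtmosphericModel(ABC):
    """Abstract black-box wrapper: every model exposes the same run()
    interface, so the plotting front end never depends on model internals."""

    @abstractmethod
    def run(self, lat: float, lon: float, alt_km: float, drivers: dict) -> dict:
        """Return a dict of output variables (e.g. temperature, densities)."""

class ConstantScaleHeightModel(AtmosphericModel):
    """Toy stand-in for a real model such as NRLMSISE-00 or IRI."""
    def run(self, lat, lon, alt_km, drivers):
        # Toy profile: density falls off exponentially (scale height 50 km)
        return {"density": math.exp(-alt_km / 50.0)}

# Registering models by name lets new ones be added without modifying
# the main body of the application, as the abstract describes.
MODELS = {"toy": ConstantScaleHeightModel()}
out = MODELS["toy"].run(40.0, -105.0, 100.0, drivers={})
```

A real adapter would additionally translate the driver dict (F10.7, Ap, etc.) into whatever inputs the wrapped model code expects.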

  2. Multi-target screening mines hesperidin as a multi-potent inhibitor: Implication in Alzheimer's disease therapeutics.

    PubMed

    Chakraborty, Sandipan; Bandyopadhyay, Jaya; Chakraborty, Sourav; Basu, Soumalee

    2016-10-04

    Alzheimer's disease (AD) is the most frequent form of neurodegenerative disorder in elderly people. The involvement of several pathogenic events and their interconnections makes this disease a complex disorder. Therefore, designing compounds that can inhibit multiple toxic pathways is the most attractive therapeutic strategy in complex disorders like AD. Here, we have designed a multi-tier screening protocol combining ensemble docking to mine BACE1 inhibitors with 2-D QSAR models for anti-amyloidogenic and antioxidant activities. An in-house developed library of 200 phytochemicals was screened through this multi-target procedure, which mined hesperidin, a flavanone glycoside commonly found in citrus food items, as a multi-potent phytochemical in AD therapeutics. Steady-state and time-resolved fluorescence spectroscopy reveal that binding of hesperidin to the active site of BACE1 induces a conformational transition of the protein from the open to the closed form. Hesperidin docks close to the catalytic aspartate residues and orients itself in a way that blocks the cavity opening, thereby precluding substrate binding. Hesperidin is a high-affinity BACE1 inhibitor, and only 500 nM of the compound shows complete inhibition of the enzyme activity. Furthermore, ANS and Thioflavin-T binding assays show that hesperidin completely inhibits amyloid fibril formation, which is further supported by atomic force microscopy. Hesperidin exhibits moderate ABTS(+) radical scavenging activity but strong hydroxyl radical scavenging ability, as evident from a DNA nicking assay. The present study demonstrates the applicability of a novel multi-target screening procedure to mine multi-potent agents of natural origin for AD therapeutics. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
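    A multi-tier screen of this kind amounts to applying successive filters to a compound library, each tier narrowing the candidate set. A generic Python sketch, with invented thresholds and compound records (not the study's actual data, docking scores, or QSAR models):

```python
# Hypothetical multi-tier screen: a compound must pass every tier in order.
compounds = [
    {"name": "cmpd_A", "dock_score": -9.1, "qsar_amyloid": 0.8, "qsar_antiox": 0.7},
    {"name": "cmpd_B", "dock_score": -5.0, "qsar_amyloid": 0.9, "qsar_antiox": 0.9},
    {"name": "cmpd_C", "dock_score": -8.5, "qsar_amyloid": 0.2, "qsar_antiox": 0.9},
]

# Illustrative cutoffs; real tiers would call docking and QSAR software.
tiers = [
    ("ensemble docking",         lambda c: c["dock_score"] <= -8.0),
    ("anti-amyloidogenic QSAR",  lambda c: c["qsar_amyloid"] >= 0.5),
    ("antioxidant QSAR",         lambda c: c["qsar_antiox"] >= 0.5),
]

hits = compounds
for label, keep in tiers:
    hits = [c for c in hits if keep(c)]  # each tier narrows the candidate set

print([c["name"] for c in hits])  # → ['cmpd_A']
```

Ordering the tiers from cheapest to most expensive filter keeps the costly assays for the few compounds that survive the early screens.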

  3. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. 
With a growing user base and maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service-based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT-hosted data.
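    The hot/cold storage-tiering decision described above can be sketched as a simple threshold on per-region access counts; a hypothetical Python sketch (the threshold and tile records are illustrative, not OT's actual metrics or storage layout):

```python
# Hypothetical access counts per data tile; OT's real metrics are richer
# (spatiotemporal access patterns across the whole archive).
access_counts = {"tileA": 5200, "tileB": 14, "tileC": 870}

HOT_THRESHOLD = 500  # accesses per period; illustrative value

def assign_tier(count: int) -> str:
    """Place frequently accessed (hot) data on fast SSD storage and
    rarely accessed (cold) data on less expensive, slower disk."""
    return "ssd" if count >= HOT_THRESHOLD else "hdd"

placement = {tile: assign_tier(n) for tile, n in access_counts.items()}
# placement == {"tileA": "ssd", "tileB": "hdd", "tileC": "ssd"}
```

In practice the placement would be recomputed periodically as usage patterns shift, migrating tiles between tiers to keep the price-to-performance ratio optimized.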

  4. Analysis and application of ichnofabrics

    NASA Astrophysics Data System (ADS)

    Taylor, Andrew; Goldring, Roland; Gowland, Stuart

    2003-02-01

    Bioturbation at all scales, which tends to replace the primary fabric of a sediment by the ichnofabric (the overall fabric of a sediment that has been bioturbated), is now recognised as playing a major role in facies interpretation. The manner in which the substrate may be colonized, and the physical, chemical and ecological controls (grainsize, sedimentation rate, oxygenation, nutrition, salinity, ethology, community structure and succession), together with the several ways in which the substrate is tiered by bioturbators, are the factors and processes that determine the nature of the ichnofabric. Eleven main styles of substrate tiering are described, ranging from single, pioneer colonization to complex tiering under equilibria, their modification under environmental deterioration and amelioration, and diagenetic enhancement or obscuration. Ichnofabrics may be assessed by four attributes: primary sedimentary factors, Bioturbation Index (BI), burrow size and frequency, and ichnological diversity. Construction of tier and ichnofabric constituent diagrams aids visualization and comparison. The breaks or changes in colonization and style of tiering at key stratal surfaces accentuate the surfaces, and many reflect a major environmental shift of the trace-forming biota due to change in hydrodynamic regime (leading to non-deposition and/or erosion and/or lithification), change in salinity regime, or subaerial exposure. The succession of gradational or abrupt changes in ichnofabric through genetically related successions, together with changes in colonization and tiering across event beds, may also be interpreted in terms of changes in environmental parameters. It is not the ichnotaxa per se that are important in discriminating between ichnofabrics, but rather the environmental conditions that determine the overall style of colonization. 
Fabrics composed of different ichnotaxa (and different taphonomies) but similar tier structure and ichnoguild may form in similar environments of different age or different latitude. Appreciation of colonization and tiering styles places ancient ichnofabrics on a sound process-related basis for environmental interpretation.

  5. Evaluation of Computer-Aided Instruction in a Gross Anatomy Course: A Six-Year Study

    ERIC Educational Resources Information Center

    McNulty, John A.; Sonntag, Beth; Sinacore, James M.

    2009-01-01

    Web-based computer-aided instruction (CAI) has become increasingly important to medical curricula. This multi-year study investigated the effectiveness of CAI and the factors affecting level of individual use. Three CAI were tested that differed in specificity of applicability to the curriculum and in the level of student interaction with the CAI.…

  6. Enhancing Collaborative Peer-to-Peer Systems Using Resource Aggregation and Caching: A Multi-Attribute Resource and Query Aware Approach

    ERIC Educational Resources Information Center

    Bandara, H. M. N. Dilum

    2012-01-01

    Resource-rich computing devices, decreasing communication costs, and Web 2.0 technologies are fundamentally changing the way distributed applications communicate and collaborate. With these changes, we envision Peer-to-Peer (P2P) systems that will allow for the integration and collaboration of peers with diverse capabilities to a virtual community…

  7. Geospatial Multi-Agency Coordination (GeoMAC) wildland fire perimeters, 2008

    USGS Publications Warehouse

    Walters, Sandra P.; Schneider, Norma J.; Guthrie, John D.

    2011-01-01

    The Geospatial Multi-Agency Coordination (GeoMAC) has been collecting and storing data on wildland fire perimeters since August 2000. The dataset presented via this U.S. Geological Survey Data Series product contains the GeoMAC wildland fire perimeter data for the calendar year 2008, which are based upon input from incident intelligence sources, Global Positioning System (GPS) data, and infrared (IR) imagery. Wildland fire perimeter data are obtained from the incidents, evaluated for completeness and accuracy, and processed to reflect consistent field names and attributes. After a quality check, the perimeters are loaded to GeoMAC databases, which support the GeoMAC Web application for access by wildland fire managers and the public. The wildland fire perimeters are viewed through the Web application. The data are subsequently archived according to year and state and are made available for downloading through the Internet in shapefile and Keyhole Markup Language (KML) format. These wildland fire perimeter data are also retained for historical, planning, and research purposes. The datasets that pertain to this report can be found on the Rocky Mountain Geographic Science Center HTTP site at http://rmgsc.cr.usgs.gov/outgoing/GeoMAC/historic_fire_data/. The links are also provided on the sidebar.

  8. Evaluation of protein safety in the context of agricultural biotechnology.

    PubMed

    Delaney, Bryan; Astwood, James D; Cunny, Helen; Conn, Robin Eichen; Herouet-Guicheney, Corinne; Macintosh, Susan; Meyer, Linda S; Privalle, Laura; Gao, Yong; Mattsson, Joel; Levine, Marci

    2008-05-01

    One component of the safety assessment of agricultural products produced through biotechnology is evaluation of the safety of newly expressed proteins. The ILSI International Food Biotechnology Committee has developed a scientifically based two-tiered, weight-of-evidence strategy to assess the safety of novel proteins used in the context of agricultural biotechnology. Recommendations draw upon knowledge of the biological and chemical characteristics of proteins and testing methods for evaluating potential intrinsic hazards of chemicals. Tier I (potential hazard identification) includes an assessment of the biological function or mode of action and intended application of the protein, history of safe use, comparison of the amino acid sequence of the protein to other proteins, as well as the biochemical and physico-chemical properties of the proteins. Studies outlined in Tier II (hazard characterization) are conducted when the results from Tier I are not sufficient to allow a determination of safety (reasonable certainty of no harm) on a case-by-case basis. These studies may include acute and repeated dose toxicology studies and hypothesis-based testing. The application of these guidelines is presented using examples of transgenic proteins applied for agricultural input and output traits in genetically modified crops along with recommendations for future research considerations related to protein safety assessment.

  9. Status of forensic odontology in metro and in tier 2 city in urban India.

    PubMed

    Khare, Parul; Chandra, Shaleen; Raj, Vineet; Verma, Poonam; Subha, G; Khare, Abhishek

    2013-07-01

    Dentists can play a significant role in identifying the victims or perpetrators of crime, as well as in disasters. Knowledge about the various aspects of forensic science, as well as dental and related evidence, can help a dental practitioner assist civil agencies in such cases. The aim was to evaluate the awareness and knowledge of forensic odontology among dentists in a metropolitan and a tier 2 city. Seven hundred and seventy-four dentists were included in this survey. A questionnaire was designed to assess the knowledge, aptitude, and status of practice of forensic odontology. Data were analyzed by comparing overall awareness of forensic odontology among dentists in the metro and the tier 2 city, as well as between the different groups. Apart from the source of knowledge, no significant differences were seen between respondents of the metropolitan and tier 2 city. A significantly higher proportion of subjects in the metro reported journals as their source of knowledge (P < 0.001), whereas it was newspapers in the tier 2 city (P = 0.001). On comparing the mean scores of knowledge (k), aptitude (a), and practice (p) among the different study groups, all three scores were highest for the practitioner cum academician (PA) group (k = 2.37, a = 0.69, p = 0.17). Knowledge scores were lowest for the pure practitioner (PP) group (1.98), and the attitude and practice scores of the pure academician (A) group were lowest (a = 0.53, p = 0.06). Respondents had low knowledge about the applications of forensic odontology in routine practice; hence, steps must be taken to educate dental practitioners about its clinical applications.

  10. Secure medical digital libraries.

    PubMed

    Papadakis, I; Chrissikopoulos, V; Polemi, D

    2001-12-01

In this paper, a secure medical digital library is presented. It is based on the CORBA specifications for distributed systems. The described approach relies on a three-tier architecture. Interaction between the medical digital library and its users is achieved through a Web server. The choice of employing Web technology for the dissemination of medical data has many advantages compared to older approaches, but also poses extra requirements that need to be fulfilled. Thus, special attention is paid to the sensitive nature of such medical data, whose integrity and confidentiality should be preserved at all costs. This is achieved through the employment of Trusted Third Party (TTP) technology for the support of the required security services. Additionally, the proposed digital library employs smartcards for the management of the various security tokens used by the above services.

  11. System approach to distributed sensor management

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid

    2010-04-01

Since 2003, the US Army's RDECOM CERDEC Night Vision Electronic Sensor Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework which demonstrates application layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a system with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensor or process) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations, specifically designed to accommodate the need for standard representations of common functions while supporting feature-based functions that are typically vendor specific. The dynamic qualities of the protocol give a user GUI application the flexibility to map widget-level controls to each device based on capabilities reported in real time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network will be described in this paper.
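The tiered message structure described above can be sketched as follows. This is a minimal illustration, not the NVESD protocol itself: the field names (`header`, `standard`, `extended`, `payload`) and the example capabilities are assumptions chosen to show how a standard tier for common functions, an extended tier for vendor-specific features, and an opaque payload could coexist, and how a GUI might map reported capabilities to widget-level controls.

```python
import json

def build_message(device_id, msg_type, standard, extended=None, payload=None):
    """Assemble a tiered message: standard (common), extended (vendor), payload (opaque).
    All field names here are illustrative assumptions."""
    msg = {
        "header": {"device_id": device_id, "type": msg_type},
        "standard": standard,          # common functions, e.g. pan/tilt
    }
    if extended:
        msg["extended"] = extended     # vendor-specific feature-based functions
    if payload is not None:
        msg["payload"] = payload       # opaque data block
    return msg

def capabilities_to_widgets(msg):
    """Map every reported capability to a widget-level control, as a GUI might."""
    caps = list(msg["standard"]) + list(msg.get("extended", {}))
    return {cap: "widget:" + cap for cap in caps}

# A hypothetical device announcing itself when joining the sensor network
announce = build_message("eo-cam-01", "JOIN",
                         standard={"pan": {}, "tilt": {}},
                         extended={"vendor_stabilize": {}})
widgets = capabilities_to_widgets(announce)
print(json.dumps(widgets, indent=2))
```

Because the GUI derives its controls from the reported capabilities rather than a fixed device model, a new vendor feature shows up as one more entry in the extended tier with no GUI code change.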

  12. Using a web-based system for the continuous distance education in cytopathology.

    PubMed

    Stergiou, Nikolaos; Georgoulakis, Giannis; Margari, Niki; Aninos, Dionisios; Stamataki, Melina; Stergiou, Efi; Pouliakis, Abraam; Karakitsos, Petros

    2009-12-01

The evolution of information technologies and telecommunications has made the World Wide Web a low-cost and easily accessible tool for the dissemination of information and knowledge. Continuing Medical Education (CME) sites dedicated to the field of cytopathology are rather scarce; they do not succeed in following the constant changes and lack the ability to provide cytopathologists with a dynamic learning environment adaptable to the development of cytopathology. Learning methods involving skills such as decision making, reasoning, and problem solving are critical in the development of such a learning environment. The objectives of this study are (1) to demonstrate, on the basis of a web-based training system, the successful application of traditional learning theories and methods and (2) to effectively evaluate users' perception of the educational program, using a combination of observers, theories, and methods. Trainees are given the opportunity to browse through the educational material, collaborate in synchronous and asynchronous modes, practice their skills through problems and tasks, and test their knowledge using the self-evaluation tool. The trainers, on the other hand, are responsible for editing learning material, attending to students' progress, and organizing the problem-based and task-based scenarios. The implementation of the web-based training system is based on the three-tier architecture and uses an Apache Tomcat web server and a MySQL database server. As of December 2008, CytoTrainer's learning environment contains two courses in cytopathology, Gynaecological Cytology and Thyroid Cytology, offering about 2000 digital images and 20 case sessions. Our evaluation method combines qualitative and quantitative approaches to explore how the various parts of the system and students' attitudes work together. Trainees approved of the course's content, methodology, and learning activities.
The triangulation of evaluation methods revealed that the training program is suitable for the continuous distance education in cytopathology and that it has improved the trainees' skills in diagnostic cytopathology. The web-based training system can be successfully involved in the continuous distance education in cytopathology. It provides the opportunity to access learning material from any place at any time and supports the acquisition of diagnostic knowledge.

  13. ESA web mapping activities applied to Earth observation

    NASA Astrophysics Data System (ADS)

    Caspar, C.; Petiteville, I.; Kohlhammer, G.; Tandurella, G.

    2002-05-01

Thousands of Earth Observation satellite instrument products are generated daily, in a multitude of formats, using a variety of projection coordinate systems. This diversity is a barrier to the development of EO multi-mission-based applications and prevents the merging of EO data with GIS data, which is requested by the user community (value-added companies, service providers, scientists, institutions, commercial users, and academic users). The web mapping technologies introduced in this article represent an elegant and low-cost solution. The extraordinary added value that is achieved may be considered a revolution in the use of EO data products.

  14. Web Content Management and One EPA Web Factsheet

    EPA Pesticide Factsheets

    One EPA Web is a multi-year project to improve EPA’s website to better meet the needs of our Web visitors. Content is developed and managed in the WebCMS which supports One EPA Web goals by standardizing how we create and publish content.

  15. Revisiting the Regular Education Initiative: Multi-Tiered Systems of Support Can Strengthen the Connection between General and Special Education

    ERIC Educational Resources Information Center

    Leach, Debra; Helf, Shawnna

    2016-01-01

    In 1986 Madeleine Will proposed the Regular Education Initiative (REI) to share possibilities for eliminating the divide between general and special education. Although great strides have been made over the past several decades in regard to the inclusion of students with disabilities, a significant divide between general and special education…

  16. Middle School Teacher Satisfaction with Response to Intervention (RtI): An Assessment between Inception and Implementation

    ERIC Educational Resources Information Center

    Zahedi, Karynn Jensen

    2010-01-01

    Response to intervention (RtI) is a multi-tiered process of monitoring student responses to remediation that is designed to help struggling learners succeed within the purview of regular education. Under the RtI model, students are referred to special education only after a series of documented interventions have been attempted. This study…

  17. A Preliminary Investigation into the Added Value of Multiple Gates and Informants in Universal Screening for Behavioral and Emotional Risk

    ERIC Educational Resources Information Center

    Dowdy, Erin; Dever, Bridget V.; Raines, Tara C.; Moffa, Kathryn

    2016-01-01

    Mental health screening in schools is a progressive practice to identify students for prevention and intervention services. Multiple gating procedures, in which students are provided more intensive assessments following initial identification of risk, are aligned with prevention science and poised to enhance multi-tiered systems of support. Yet,…

  18. Cedar Middle School's Response to Intervention Journey: A Systematic, Multi-Tier, Problem-Solving Approach to Program Implementation

    ERIC Educational Resources Information Center

    Dulaney, Shannon Kay

    2010-01-01

    The purpose of the present study was to record Cedar Middle School's (CMS) response to intervention implementation journey. It is a qualitative case study that examines one school's efforts to bring school improvements under the response to inventory (RtI) umbrella in order to achieve a more systematic approach to providing high-quality…

  19. Elementary Principals' Perception of Response to Intervention (RtI) Implementation in North Carolina: An Exploratory Study

    ERIC Educational Resources Information Center

    Buckner, Jerry W.

    2013-01-01

    Conceptually, Response to Intervention (RtI) is a multi-tiered problem solving process rooted in special education yet designed as a framework for early and on-going instructional interventions for students across a continuum of academic need. In recent years, however, RtI has become an increasingly significant part of the discourse on school…

  20. A Grant Project to Initiate School Counselors' Development of a Multi-Tiered System of Supports Based on Social-Emotional Data

    ERIC Educational Resources Information Center

    Harrington, Karen; Griffith, Catherine; Gray, Katharine; Greenspan, Scott

    2016-01-01

    This article provides an overview of a grant project designed to create a district-wide elementary school counseling program with a strong data-based decision-making process. Project goals included building data literacy skills among school counselors and developing the infrastructure to efficiently collect important social-emotional indicators…

  1. Elementary Teachers' Perspectives of the Implementation of Response to Intervention and Special Education Rates

    ERIC Educational Resources Information Center

    Dupuis, Susan D.

    2010-01-01

    Response to Intervention (RTI) employs a multi-tiered approach to providing targeted interventions for students who are at risk for school failure. With the reauthorization of the Individual with Disabilities Act (IDEA) 2004 and No Child Left Behind (NCLB) 2001 districts are given the option to implement RTI prior to student referral for special…

  2. UPM: unified policy-based network management

    NASA Astrophysics Data System (ADS)

    Law, Eddie; Saxena, Achint

    2001-07-01

Besides providing network management to the Internet, it has become essential to offer different Quality of Service (QoS) levels to users. Policy-based management provides control of network routers to achieve this goal. The Internet Engineering Task Force (IETF) has proposed a two-tier architecture whose implementation is based on the Common Open Policy Service (COPS) protocol and the Lightweight Directory Access Protocol (LDAP). However, there are several limitations to this design, such as scalability and cross-vendor hardware compatibility. To address these issues, we present a functionally enhanced multi-tier policy management architecture design in this paper. Several extensions are introduced, adding flexibility and scalability. In particular, an intermediate entity between the policy server and the policy rule database, called the Policy Enforcement Agent (PEA), is introduced. By keeping internal data in a common format, using a standard protocol, and by interpreting and translating request and decision messages from multi-vendor hardware, this agent allows a dynamic Unified Information Model throughout the architecture. We have tailor-made this information system to save policy rules in the directory server and allow execution of policy rules with dynamic addition of new equipment at run time.
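The PEA's translation role can be sketched as below. This is an illustrative sketch only: the adapter functions, vendor names, and message fields are invented to show the idea of normalizing multi-vendor request messages into one common internal format, with new equipment registered dynamically at run time; it is not the UPM implementation.

```python
# Fields of the assumed common internal format (an illustrative stand-in
# for the paper's Unified Information Model).
COMMON_FIELDS = ("action", "target", "priority")

def from_vendor_a(raw):
    # Hypothetical vendor A uses terse flat keys.
    return {"action": raw["act"], "target": raw["tgt"], "priority": raw["pri"]}

def from_vendor_b(raw):
    # Hypothetical vendor B nests its fields and may omit QoS.
    return {"action": raw["request"]["verb"],
            "target": raw["request"]["object"],
            "priority": raw.get("qos", 0)}

class PolicyEnforcementAgent:
    """Sits between the policy server and devices, normalizing messages."""
    def __init__(self):
        self.adapters = {}

    def register(self, vendor, adapter):
        # New equipment can be added dynamically at run time.
        self.adapters[vendor] = adapter

    def normalize(self, vendor, raw):
        common = self.adapters[vendor](raw)
        assert all(k in common for k in COMMON_FIELDS)
        return common

pea = PolicyEnforcementAgent()
pea.register("vendor_a", from_vendor_a)
pea.register("vendor_b", from_vendor_b)
```

Because the policy server only ever sees the common format, adding a third vendor is a matter of registering one more adapter, which is the scalability argument the multi-tier design makes.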

  3. Implementing a low-cost web-based clinical trial management system for community studies: a case study.

    PubMed

    Geyer, John; Myers, Kathleen; Vander Stoep, Ann; McCarty, Carolyn; Palmer, Nancy; DeSalvo, Amy

    2011-10-01

Clinical trials with multiple intervention locations and a single research coordinating center can be logistically difficult to implement. Increasingly, web-based systems are used to provide clinical trial support, with many commercial, open source, and proprietary systems in use. New web-based tools are available which can be customized without programming expertise to deliver web-based clinical trial management and data collection functions. The objective was to demonstrate the feasibility of utilizing low-cost configurable applications to create a customized web-based data collection and study management system for a five-site randomized clinical trial establishing the efficacy of providing evidence-based treatment via teleconferencing to children with attention-deficit hyperactivity disorder. The sites are small communities that would not usually be included in traditional randomized trials. A major goal was to develop a database that participants could access from computers in their home communities for direct data entry. Discussed is the selection process leading to the identification and utilization of a cost-effective and user-friendly set of tools capable of customization for data collection and study management tasks. An online assessment collection application, a template-based web portal creation application, and a web-accessible Access 2007 database were selected and customized to provide the following features: schedule appointments, administer and monitor online secure assessments, issue subject incentives, and securely transmit electronic documents between sites. Each tool was configured by users with limited programming expertise. As of June 2011, the system has successfully been used by 125 participants in 5 communities, who have completed 536 sets of assessment questionnaires, and by 8 community therapists and 11 research staff at the research coordinating center.
Total automation of processes is not possible with the current set of tools as each is loosely affiliated, creating some inefficiency. This system is best suited to investigations with a single data source e.g., psychosocial questionnaires. New web-based applications can be used by investigators with limited programming experience to implement user-friendly, efficient, and cost-effective tools for multi-site clinical trials with small distant communities. Such systems allow the inclusion in research of populations that are not usually involved in clinical trials.

  4. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division

    2007-01-01

The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
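The "two-tier level of displayed search results" idea can be illustrated with a small grouping sketch: a flat list of hits from a global search is bucketed by viewer category (tier 1), with the individual records listed beneath each category (tier 2). The category and record names below are invented for illustration and are not the actual IRMIS schema.

```python
def two_tier_results(hits):
    """Group flat search hits into {category: [record names]},
    preserving the order in which hits were returned."""
    tiers = {}
    for hit in hits:
        tiers.setdefault(hit["category"], []).append(hit["name"])
    return tiers

# Hypothetical hits a global search might return across viewer categories
hits = [
    {"category": "Process Variables", "name": "S1A:H1:CurrentAO"},
    {"category": "IOC", "name": "iocs1bpm"},
    {"category": "Process Variables", "name": "S1A:H2:CurrentAO"},
]
tiers = two_tier_results(hits)
for category, names in tiers.items():
    print(category, "->", names)
```

Presenting the category tier first lets a user collapse whole categories, which keeps navigation manageable when a search term matches thousands of process variables.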

  5. An integrated approach to assess broad-scale condition of coastal wetlands - The Gulf of Mexico Coastal Wetlands pilot survey

    USGS Publications Warehouse

    Nestlerode, J.A.; Engle, V.D.; Bourgeois, P.; Heitmuller, P.T.; Macauley, J.M.; Allen, Y.C.

    2009-01-01

The Environmental Protection Agency (EPA) and U.S. Geological Survey (USGS) initiated a two-year regional pilot survey in 2007 to develop, test, and validate tools and approaches to assess the condition of northern Gulf of Mexico (GOM) coastal wetlands. Sampling sites were selected from estuarine and palustrine wetland areas with herbaceous, forested, and shrub/scrub habitats delineated by the US Fish and Wildlife Service National Wetlands Inventory Status and Trends (NWI S&T) program and contained within northern GOM coastal watersheds. A multi-level, stepwise, iterative survey approach is being applied to multiple wetland classes at 100 probabilistically-selected coastal wetlands sites. Tier 1 provides information at the landscape scale about habitat inventory, land use, and environmental stressors associated with the watershed in which each wetland site is located. Tier 2, a rapid assessment conducted through a combination of office and field work, is based on best professional judgment and on-site evidence. Tier 3, an intensive site assessment, involves on-site collection of vegetation, water, and sediment samples to establish an integrated understanding of current wetland condition and validate methods and findings from Tiers 1 and 2. The results from this survey, along with other similar regional pilots from the Mid-Atlantic, West Coast, and Great Lakes Regions will contribute to a design and implementation approach for the National Wetlands Condition Assessment to be conducted by EPA's Office of Water in 2011. © Springer Science+Business Media B.V. 2008.

  6. Storage assignment optimization in a multi-tier shuttle warehousing system

    NASA Astrophysics Data System (ADS)

    Wang, Yanyan; Mou, Shandong; Wu, Yaohua

    2016-03-01

The current mathematical models for the storage assignment problem are generally established based on the traveling salesman problem (TSP), which has been widely applied in the conventional automated storage and retrieval system (AS/RS). However, the previous mathematical models for conventional AS/RS do not match multi-tier shuttle warehousing systems (MSWS), because the characteristics of parallel retrieval in multiple tiers and progressive vertical movement invalidate the foundation of the TSP. In this study, a two-stage open queuing network model, in which shuttles and a lift are regarded as servers at different stages, is proposed to analyze system performance in terms of shuttle waiting period (SWP) and lift idle period (LIP) during the transaction cycle time. A mean arrival time difference matrix for pairwise stock keeping units (SKUs) is presented to determine the mean waiting time and queue length, and thereby to optimize the storage assignment problem on the basis of SKU correlation. The decomposition method is applied to analyze the interactions among outbound task time, SWP, and LIP. An ant colony clustering algorithm is designed to determine storage partitions by clustering items. In addition, goods are assigned for storage according to the rearranging permutation and the combination of storage partitions in a 2D plane, derived from the analysis results of the queuing network model and three basic principles. The storage assignment method and its optimization algorithm as applied in an MSWS are verified through a practical engineering project in the tobacco industry. The results show that the total SWP and LIP can be reduced effectively, improving the utilization rates of all devices and increasing the throughput of the distribution center.
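The pairwise matrix idea can be sketched as follows. This is a minimal reading of the "mean arrival time difference matrix" under illustrative assumptions: given historical request timestamps per SKU, compute each SKU's mean arrival time and the absolute pairwise differences; SKUs with a small difference tend to be requested close together and are candidates for the same storage partition. The data and exact formulation are invented for illustration, not taken from the paper.

```python
def mean_arrival(times):
    """Mean of a SKU's historical arrival (request) timestamps."""
    return sum(times) / len(times)

def difference_matrix(arrivals):
    """arrivals: {sku: [timestamps]} -> {(i, j): |mean_i - mean_j|} for i < j."""
    means = {sku: mean_arrival(ts) for sku, ts in arrivals.items()}
    skus = sorted(means)
    return {(i, j): abs(means[i] - means[j])
            for i in skus for j in skus if i < j}

# Hypothetical history: A and B are requested close together, C much later
arrivals = {"A": [1.0, 3.0], "B": [2.0, 4.0], "C": [10.0, 12.0]}
matrix = difference_matrix(arrivals)
print(matrix)
```

A clustering step (the paper uses an ant colony clustering algorithm) would then group A and B into one partition and leave C elsewhere, so correlated SKUs can be retrieved in parallel across tiers.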

  7. Integration of Web-based and PC-based clinical research databases.

    PubMed

    Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M

    2004-01-01

    We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
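The integration step described above, mapping each source schema onto a common semantic model and flagging attribute-level conflicts, can be sketched as below. The table and column names, the toy common model, and the mapping dict are all hypothetical; the sketch only illustrates the mechanism of a mapping schema plus conflict detection, not GRIL's actual schemas.

```python
# A toy common semantic model: one table of measurement instruments.
COMMON_MODEL = {"instrument": {"id", "title", "keywords"}}

def map_schema(schema, mapping):
    """Rename source tables/attributes to common-model names via a mapping dict."""
    mapped = {}
    for table, cols in schema.items():
        target = mapping.get(table, table)
        mapped[target] = {mapping.get(c, c) for c in cols}
    return mapped

def name_conflicts(mapped):
    """Attributes that have no counterpart in the common model for their table."""
    return {t: cols - COMMON_MODEL.get(t, set()) for t, cols in mapped.items()}

# Hypothetical source system: calls the table "instr" and the title "name".
sys1 = {"instr": {"id", "name", "keywords"}}
mapped = map_schema(sys1, {"instr": "instrument", "name": "title"})
print(mapped)            # now expressed in common-model terms
print(name_conflicts(mapped))
```

Any non-empty conflict set marks an attribute the mapping has not yet resolved, which is the point at which a conflict taxonomy (name conflict, relationship conflict, and so on) guides the resolution.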

  8. "Less Clicking, More Watching": Results from the User-Centered Design of a Multi-Institutional Web Site for Art and Culture.

    ERIC Educational Resources Information Center

    Vergo, John; Karat, Clare-Marie; Karat, John; Pinhanez, Claudio; Arora, Renee; Cofino, Thomas; Riecken, Doug; Podlaseck, Mark

    This paper summarizes a 10-month long research project conducted at the IBM T.J. Watson Research Center aimed at developing the design concept of a multi-institutional art and culture web site. The work followed a user-centered design (UCD) approach, where interaction with prototypes and feedback from potential users of the web site were sought…

  9. Social Security privatization in Latin America.

    PubMed

    Kritzer, B E

    2000-01-01

    The new, partially privatized social security system adopted by Chile in 1981 has attracted attention in many parts of the world. Since then, a number of Latin American countries have implemented the Chilean model, with some variations: either with a single- or multi-tier system, or with a period of transition to take care of those in the labor force at the time of the change. The single-tier version consists of a privatized program with individual accounts in pension fund management companies. Multi-tier systems have a privatized component and retain some form of public program. This article describes each of the new programs in Latin America, their background, and similarities and differences among them. Much more information is available for Chile than for the other countries (in part because Chile has the oldest system), enough to be able to evaluate what, in most cases, is the most accurate information. That is often not the case for the other countries, especially when dealing with subjects such as transition costs and net rates of return (rates of return minus administrative fees). No country has copied the Chilean system exactly. Bolivia, El Salvador, and Mexico have closed their public systems and set up mandatory individual accounts. Argentina has a mixed public/private system with three tiers. In Colombia and Peru, workers have a choice between the public and private programs. Uruguay created a two-tier mixed system. Costa Rica has a voluntary program for individual accounts as a supplement to the pay-as-you-go program and has just passed a law setting up mandatory accounts containing employer contributions for severance pay. 
All of the countries continue to face unresolved issues, including: High rates of noncompliance--the percentage of enrollees who do not actively and regularly contribute to their accounts--which could lead to low benefits and greater costs to the governments that offer a guaranteed minimum benefit; Proportionately lower benefits for women and lower earners than for men and higher earners; A minimum required rate of return among the pension fund management companies (in most of these countries) that has resulted in similarity among the companies and the consequent lack of meaningful choice; and High administrative fees in most of these countries, which reduce the individual's effective rate of return. To what extent these issues can be mitigated or resolved in the future is not yet clear. In general, a definitive assessment of the Chilean model and its Latin American variations will not be possible until a cohort of retirees has spent most of its career under the new system.

  10. Colorectal Cancer Screening: Recommendations for Physicians and Patients from the U.S. Multi-Society Task Force on Colorectal Cancer.

    PubMed

    Rex, Douglas K; Boland, C Richard; Dominitz, Jason A; Giardiello, Francis M; Johnson, David A; Kaltenbach, Tonya; Levin, Theodore R; Lieberman, David; Robertson, Douglas J

    2017-07-01

    This document updates the colorectal cancer (CRC) screening recommendations of the U.S. Multi-Society Task Force of Colorectal Cancer (MSTF), which represents the American College of Gastroenterology, the American Gastroenterological Association, and The American Society for Gastrointestinal Endoscopy. CRC screening tests are ranked in 3 tiers based on performance features, costs, and practical considerations. The first-tier tests are colonoscopy every 10 years and annual fecal immunochemical test (FIT). Colonoscopy and FIT are recommended as the cornerstones of screening regardless of how screening is offered. Thus, in a sequential approach based on colonoscopy offered first, FIT should be offered to patients who decline colonoscopy. Colonoscopy and FIT are recommended as tests of choice when multiple options are presented as alternatives. A risk-stratified approach is also appropriate, with FIT screening in populations with an estimated low prevalence of advanced neoplasia and colonoscopy screening in high prevalence populations. The second-tier tests include CT colonography every 5 years, the FIT-fecal DNA test every 3 years, and flexible sigmoidoscopy every 5 to 10 years. These tests are appropriate screening tests, but each has disadvantages relative to the tier 1 tests. Because of limited evidence and current obstacles to use, capsule colonoscopy every 5 years is a third-tier test. We suggest that the Septin9 serum assay (Epigenomics, Seattle, Wash) not be used for screening. Screening should begin at age 50 years in average-risk persons, except in African Americans in whom limited evidence supports screening at 45 years. CRC incidence is rising in persons under age 50, and thorough diagnostic evaluation of young persons with suspected colorectal bleeding is recommended. 
Discontinuation of screening should be considered when persons up to date with screening, who have prior negative screening (particularly colonoscopy), reach age 75 or have <10 years of life expectancy. Persons without prior screening should be considered for screening up to age 85, depending on age and comorbidities. Persons with a family history of CRC or a documented advanced adenoma in a first-degree relative age <60 years or 2 first-degree relatives with these findings at any age are recommended to undergo screening by colonoscopy every 5 years, beginning 10 years before the age at diagnosis of the youngest affected relative or age 40, whichever is earlier. Persons with a single first-degree relative diagnosed at ≥60 years with CRC or an advanced adenoma can be offered average-risk screening options beginning at age 40 years.

  11. Pilot implementation and user preferences of a Bariatric After-care application.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M; Hawa, Raed; Sockalingam, Sanjeev

    2015-01-01

The respective rates of obesity in Canada and the United States are estimated to be 24.1% and 34.1%. Due to the increased incidence of obesity, bariatric surgery has been recognized as one of the treatment options. Patients who have undergone bariatric surgery tend to need chronic long-term follow-up with a multi-disciplinary team. In the past decade, there has been massive advancement and development in Internet, Web-based, and smartphone technologies. However, there is a paucity of applications in this area that enable post-bariatric patients to self-manage their own condition. In addition, past research has highlighted the limited evidence base of currently available bariatric applications, mainly due to the lack of involvement of medical professionals. Our research objective is to illustrate the development of a Bariatric After-care smartphone application and to highlight user preferences with regards to the features integrated within the application. The Bariatric After-care application was developed between March 2014 and April 2014. Making use of low-cost online web-based application development technologies, the authors embarked on the development of the web-based application. Patients who attended their routine follow-up appointments were given links to the web-based application. They were also recruited to participate in an online user evaluation survey to identify their preferences with regards to the features integrated. Since the inception of the web-based application to date, there has been a cumulative total of 385 unique accesses to the online web-based application. There is a slight change in the confidence levels of the participants with regards to using the application to help them self-manage their own condition.
The majority of the users indicated that they preferred the information pertaining to what happens during each consult with members of the multidisciplinary team, and also greatly valued the ability to re-schedule their appointments. The vast majority also found the additional resources helpful. This is one of the first studies to demonstrate the potential use of smartphone innovations in bariatric after-care self-management. The study has shown that users are generally receptive towards such an innovative implementation, and has highlighted some of their preferences regarding an application for self-management of their health condition post bariatric surgery. In addition, the authors have demonstrated how clinicians could be involved in the formulation of a bariatric care application that has an evidence base.

  12. Person-centred web-based support--development through a Swedish multi-case study.

    PubMed

    Josefsson, Ulrika; Berg, Marie; Koinberg, Ingalill; Hellström, Anna-Lena; Nolbris, Margaretha Jenholt; Ranerup, Agneta; Lundin, Carina Sparud; Skärsäter, Ingela

    2013-10-19

Departing from the widespread use of the internet in modern society and the emerging use of web applications in healthcare, this project captures persons' needs and expectations in order to develop highly usable web resources. The purpose of this paper is to outline a multi-case research project focused on the development and evaluation of person-centred web-based support for people with long-term illness. To support the underlying idea to move beyond the illness, we approach the development of web support from the perspective of the emergent area of person-centred care. The project aims to contribute to the ongoing development of web-based supports in health care and to the emerging field of person-centred care. The research design uses a meta-analytical approach through its focus on synthesizing experiences from four Swedish regional and national cases of design and use of web-based support in long-term illness. The cases include children (bladder dysfunction and urogenital malformation), young adults (living close to persons with mental illness), and two different cases of adults (women with breast cancer and childbearing women with type 1 diabetes). All of the cases are ongoing, though in different stages of design, implementation, and analysis. This, we argue, will lead to a synthesis of results on a meta-level not yet described. To allow valid comparisons between the four cases we explore and problematize them in relation to four main aspects: 1) the use of people's experiences and needs; 2) the role of theories in the design of person-centred web-based supports; 3) the evaluation of the effects on health outcomes for the informants involved; and 4) the development of a generic person-centred model for learning and social support for people with long-term illness and their significant others. Person-centred web-based support is a new area, and few studies focus on how web-based interventions can contribute to the development of person-centred care.
In summary, the main intention of the project outlined here is to deliver both a meta-level synthesis of results from the four cases and a substantive contribution to the field of person-centred care.

  13. Research into a distributed fault diagnosis system and its application

    NASA Astrophysics Data System (ADS)

    Qian, Suxiang; Jiao, Weidong; Lou, Yongjian; Shen, Xiaomei

    2005-12-01

    CORBA (Common Object Request Broker Architecture) is a solution for distributed computing across heterogeneous systems that establishes a communication protocol between distributed objects, with strong emphasis on interoperation between them. However, only after developing practical application approaches and technology for monitoring and diagnosis can customers share monitoring and diagnosis information, so that remote multi-expert cooperative diagnosis online can be achieved. This paper aims at building an open fault monitoring and diagnosis platform combining CORBA, the Web, and agents. Heterogeneous diagnosis objects interoperate in independent threads through the CORBA 'soft bus', realizing resource sharing and online multi-expert cooperative diagnosis, and overcoming shortcomings such as lack of diagnosis knowledge, narrowness of diagnosis techniques, and incomplete analysis functions, so that more complicated and deeper diagnosis can be carried out. Taking a high-speed centrifugal air compressor set as an example, we demonstrate distributed diagnosis based on CORBA. This shows that integrating CORBA, Web techniques, and an agent frame model in complementary research yields more efficient approaches to problems such as real-time monitoring and diagnosis over the network and the decomposition of complicated tasks. In this system, a multi-diagnosis intelligent agent helps improve diagnosis efficiency. Besides, the system offers an open environment that makes it easy for diagnosis objects to be upgraded and for new diagnosis server objects to join.
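
    The record above treats CORBA as the object bus linking diagnosis objects across heterogeneous hosts. As a rough illustration of the distributed-object idea only (using Python's standard xmlrpc rather than CORBA, with an invented DiagnosisServer interface and an assumed vibration threshold), a remote diagnosis call might look like:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# A hypothetical diagnosis object: flags vibration readings above a threshold.
class DiagnosisServer:
    WARN_RMS = 4.5  # assumed alarm threshold (mm/s RMS), illustrative only

    def diagnose(self, rms_velocity):
        # Return a coarse verdict that a remote expert client can refine.
        return "alarm" if rms_velocity > self.WARN_RMS else "normal"

def serve(port=8421):
    """Publish the diagnosis object so remote clients can invoke it."""
    server = SimpleXMLRPCServer(("127.0.0.1", port), logRequests=False)
    server.register_instance(DiagnosisServer())
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = serve()
    client = ServerProxy("http://127.0.0.1:8421")  # remote "diagnosis object"
    print(client.diagnose(6.2))  # prints: alarm
    srv.shutdown()
    srv.server_close()
```

    CORBA additionally provides interface definitions (IDL), naming, and language interoperability that this sketch omits; the point is only that the caller never sees where the diagnosis object runs.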

  14. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    NASA Astrophysics Data System (ADS)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.

  15. Second-Tier Database for Ecosystem Focus, 2003-2004 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    University of Washington, Columbia Basin Research, DART Project Staff,

    2004-12-01

    The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities essential to sound operational and resource management. The database also assists with juvenile and adult mainstem passage modeling supporting federal decisions affecting the operation of the FCRPS. The Second-Tier Database known as Data Access in Real Time (DART) integrates public data for effective access, consideration and application. DART also provides analysis tools and performance measures for evaluating the condition of Columbia Basin salmonid stocks. These services are critical to BPA's implementation of its fish and wildlife responsibilities under the Endangered Species Act (ESA).

  16. UAV field demonstration of social media enabled tactical data link

    NASA Astrophysics Data System (ADS)

    Olson, Christopher C.; Xu, Da; Martin, Sean R.; Castelli, Jonathan C.; Newman, Andrew J.

    2015-05-01

    This paper addresses the problem of enabling Command and Control (C2) and data exfiltration functions for missions using small, unmanned, airborne surveillance and reconnaissance platforms. The authors demonstrated the feasibility of using existing commercial wireless networks as the data transmission infrastructure to support Unmanned Aerial Vehicle (UAV) autonomy functions such as transmission of commands, imagery, metadata, and multi-vehicle coordination messages. The authors developed and integrated a C2 Android application for ground users with a common smart phone, a C2 and data exfiltration Android application deployed on-board the UAVs, and a web server with database to disseminate the collected data to distributed users using standard web browsers. The authors performed a mission-relevant field test and demonstration in which operators commanded a UAV from an Android device to search and loiter; and remote users viewed imagery, video, and metadata via web server to identify and track a vehicle on the ground. Social media served as the tactical data link for all command messages, images, videos, and metadata during the field demonstration. Imagery, video, and metadata were transmitted from the UAV to the web server via multiple Twitter, Flickr, Facebook, YouTube, and similar media accounts. The web server reassembled images and video with corresponding metadata for distributed users. The UAV autopilot communicated with the on-board Android device via on-board Bluetooth network.

  17. NASA Planning for Orion Multi-Purpose Crew Vehicle Ground Operations

    NASA Technical Reports Server (NTRS)

    Letchworth, Gary; Schlierf, Roland

    2011-01-01

    The NASA Orion Ground Processing Team was originally formed by the Kennedy Space Center (KSC) Constellation (Cx) Project Office's Orion Division to define, refine and mature pre-launch and post-landing ground operations for the Orion human spacecraft. The multidisciplinary KSC Orion team consisted of KSC civil servant, SAIC, Productivity Apex, Inc. and Boeing-CAPPS engineers, project managers and safety engineers, as well as engineers from Constellation's Orion Project and the Lockheed Martin Orion prime contractor. The team evaluated the Orion design configurations as the spacecraft concept matured between Systems Design Review (SDR), Systems Requirement Review (SRR) and Preliminary Design Review (PDR). The team functionally decomposed pre-launch and post-landing steps at three levels of detail, or tiers, beginning with functional flow block diagrams (FFBDs). The third-tier FFBDs were used to build logic networks and nominal timelines. Orion ground support equipment (GSE) was identified and mapped to each step. This information was subsequently used in developing lower-level operations steps in a Ground Operations Planning Document PDR product. Subject matter experts for each spacecraft and GSE subsystem defined 5th-95th percentile processing times for each FFBD step using the Delphi method. Discrete event simulations used this information and the logic network to provide processing timeline confidence intervals for launch rate assessments. The team also used the capabilities of the KSC Visualization Lab, the FFBDs and knowledge of the spacecraft, GSE and facilities to build visualizations of Orion pre-launch and post-landing processing at KSC. Visualizations were a powerful tool for communicating planned operations within the KSC community (i.e., the Ground Systems design team), and externally to the Orion Project, Lockheed Martin spacecraft designers and other Constellation Program stakeholders during the SRR to PDR timeframe. 
Other operations planning tools included Kaizen/Lean events, mockups and human factors analysis. The majority of products developed by this team are applicable as KSC prepares 21st Century Ground Systems for the Orion Multi-Purpose Crew Vehicle and Space Launch System.
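
    The Delphi-elicited 5th-95th percentile step times feeding discrete event simulation can be sketched as a Monte Carlo estimate of timeline confidence intervals. The steps, durations, and triangular-distribution assumption below are illustrative only; the actual effort used full FFBD logic networks with branching and parallelism:

```python
import random

# Hypothetical elicited (5th pct, mode, 95th pct) hours for three serial
# steps; the real FFBD logic networks include branching this sketch omits.
STEPS = {
    "power-up checkout": (4.0, 6.0, 10.0),
    "hatch closeout":    (2.0, 3.0, 6.0),
    "pad transport":     (8.0, 10.0, 16.0),
}

def simulate_timeline(n_trials=20000, seed=7):
    """Monte Carlo total-duration samples, treating the elicited bounds as
    the low/high of a triangular distribution (a common simplification)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in STEPS.values()))
    totals.sort()
    return totals

def percentile(sorted_xs, p):
    """Nearest-rank percentile of an ascending list."""
    return sorted_xs[int(p / 100 * (len(sorted_xs) - 1))]

if __name__ == "__main__":
    t = simulate_timeline()
    print(f"50th pct: {percentile(t, 50):.1f} h, "
          f"95th pct: {percentile(t, 95):.1f} h")
```

    The 95th-percentile total is what a launch rate assessment would compare against available pad time.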

  18. Chemical Space: Big Data Challenge for Molecular Diversity.

    PubMed

    Awale, Mahendra; Visini, Ricardo; Probst, Daniel; Arús-Pous, Josep; Reymond, Jean-Louis

    2017-10-25

    Chemical space describes all possible molecules as well as multi-dimensional conceptual spaces representing the structural diversity of these molecules. Part of this chemical space is available in public databases ranging from thousands to billions of compounds. Exploiting these databases for drug discovery represents a typical big data problem limited by computational power, data storage and data access capacity. Here we review recent developments of our laboratory, including progress in the chemical universe databases (GDB) and the fragment subset FDB-17, tools for ligand-based virtual screening by nearest neighbor searches, such as our multi-fingerprint browser for the ZINC database to select purchasable screening compounds, and their application to discover potent and selective inhibitors for calcium channel TRPV6 and Aurora A kinase, the polypharmacology browser (PPB) for predicting off-target effects, and finally interactive 3D-chemical space visualization using our online tools WebDrugCS and WebMolCS. All resources described in this paper are available for public use at www.gdb.unibe.ch.
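
    Nearest-neighbor searches over molecular fingerprints, as used by the multi-fingerprint browser described above, reduce to ranking compounds by Tanimoto similarity. The toy bit-set fingerprints and compound names below are placeholders for real fingerprints such as extended-connectivity types:

```python
# Toy binary fingerprints as Python sets of "on" bit positions; real
# browsers compute e.g. substructure or extended-connectivity fingerprints.
LIBRARY = {
    "cmpd_A": {1, 4, 9, 12, 31},
    "cmpd_B": {1, 4, 9, 13, 31, 40},
    "cmpd_C": {2, 7, 22},
}

def tanimoto(fp1, fp2):
    """Tanimoto similarity: |intersection| / |union| of on-bits."""
    inter = len(fp1 & fp2)
    return inter / (len(fp1) + len(fp2) - inter)

def nearest_neighbors(query_fp, library, k=2):
    """Rank library compounds by Tanimoto similarity to the query."""
    ranked = sorted(library.items(),
                    key=lambda kv: tanimoto(query_fp, kv[1]),
                    reverse=True)
    return ranked[:k]

if __name__ == "__main__":
    query = {1, 4, 9, 31}
    for name, fp in nearest_neighbors(query, LIBRARY):
        print(name, round(tanimoto(query, fp), 3))  # cmpd_A 0.8, cmpd_B 0.667
```

    At billion-compound scale the big-data challenge is doing exactly this ranking without a linear scan, e.g. via indexing or dimensionality reduction.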

  19. Interaction Design and Usability of Learning Spaces in 3D Multi-user Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Minocha, Shailey; Reeves, Ahmad John

    Three-dimensional virtual worlds are multimedia, simulated environments, often managed over the Web, which users can 'inhabit' and interact in via their own graphical self-representations known as 'avatars'. 3D virtual worlds are being used in many applications: education/training, gaming, social networking, marketing and commerce. Second Life is the most widely used 3D virtual world in education. However, problems associated with usability, navigation and wayfinding in 3D virtual worlds may impact on student learning and engagement. Based on empirical investigations of learning spaces in Second Life, this paper presents design guidelines to improve the usability and ease of navigation in 3D spaces. Methods of data collection include semi-structured interviews with Second Life students, educators and designers. The findings have revealed that design principles from the fields of urban planning, Human-Computer Interaction, Web usability, geography and psychology can influence the design of spaces in 3D multi-user virtual environments.

  20. Interoperative fundus image and report sharing in compliance with integrating the healthcare enterprise conformance and web access to digital imaging and communication in medicine persistent object protocol.

    PubMed

    Wu, Hui-Qun; Lv, Zheng-Min; Geng, Xing-Yun; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng

    2013-01-01

    To address issues in interoperability between different fundus image systems, we proposed a web eye-picture archiving and communication system (PACS) framework in conformance with the digital imaging and communication in medicine (DICOM) and health level 7 (HL7) protocols to realize fundus image and report sharing and communication over the internet. First, a telemedicine-based eye care workflow was established based on the integrating the healthcare enterprise (IHE) Eye Care technical framework. Then, a three-tier, browser/server eye-PACS system was established in conformance with the web access to DICOM persistent object (WADO) protocol. From any client system with a web browser, clinicians can log in to the eye-PACS to view fundus images and reports. A structured report saved as a pdf/html MIME type, with a reference link to the relevant fundus image expressed in WADO syntax, can provide sufficient information for clinicians. Functions provided by the open-source Oviyam viewer can be used to query, zoom, pan, measure and view DICOM fundus images. Such a web eye-PACS in compliance with the WADO protocol can be used to store and communicate fundus images and reports, and is therefore of great significance for teleophthalmology.
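
    The WADO syntax referenced above retrieves a single DICOM object through a plain HTTP query string. A minimal sketch of building such a WADO-URI request (the host and UID values are placeholders):

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="application/dicom"):
    """Build a WADO-URI request for one DICOM object (e.g. a fundus image).
    Parameter names follow the WADO-URI query syntax."""
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,  # e.g. image/jpeg for in-browser viewing
    }
    return base + "?" + urlencode(params)

if __name__ == "__main__":
    # Placeholder server and UIDs, for illustration only.
    print(wado_uri("http://pacs.example.org/wado",
                   "1.2.840.1", "1.2.840.1.1", "1.2.840.1.1.7",
                   content_type="image/jpeg"))
```

    A structured report's reference link is essentially such a URI, so any browser-based client can dereference the image without DICOM networking.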

  1. Molecule database framework: a framework for creating database applications with chemical structure search capability

    PubMed Central

    2013-01-01

    Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization) For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. 
Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the projects web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
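
    The framework's key claim is that structure search is abstracted into method calls, so application developers never touch the database cartridge directly. A hedged sketch of that idea follows; the repository class, table and column names, and the Bingo substructure operator syntax shown are assumptions for illustration, not code from Molecule Database Framework itself:

```python
# Sketch of "search as a method call": the repository hides the SQL and
# the database cartridge from the caller. The Bingo operator syntax below
# is an assumption based on Bingo's PostgreSQL cartridge, not verified
# against Molecule Database Framework.
class CompoundRepository:
    def __init__(self, table="compound", column="structure"):
        self.table = table
        self.column = column

    def substructure_query(self, smiles):
        """Return (sql, params) for a parameterized substructure search."""
        sql = (f"SELECT id FROM {self.table} "
               f"WHERE {self.column} @ (%s, '')::bingo.sub")
        return sql, (smiles,)

if __name__ == "__main__":
    repo = CompoundRepository()
    sql, params = repo.substructure_query("c1ccccc1")  # benzene substructure
    print(sql)
    print(params)
```

    In the real framework this layer also contributes type safety, caching, transactions and optional method-level security around the same call boundary.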

  2. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    PubMed

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes:
    • Support for multi-component compounds (mixtures)
    • Import and export of SD-files
    • Optional security (authorization)
    For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. 
By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the projects web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.

  3. Application of elements of microbiological risk assessment in the food industry via a tiered approach.

    PubMed

    van Gerwen, Suzanne J C; Gorris, Leon G M

    2004-09-01

    Food safety control is a matter for concern for all parts of the food supply chain, including governments that develop food safety policy, food industries that must control potential hazards, and consumers who need to keep to the intended use of the food. In the future, food safety policy may be set using the framework of risk analysis, part of which is the development of (inter)national microbiological risk assessment (MRA) studies. MRA studies increase our understanding of the impact of risk management interventions and of the relationships among subsequent parts of food supply chains with regard to the safety of the food when it reaches the consumer. Application of aspects of MRA in the development of new food concepts has potential benefits for the food industry. A tiered approach to applying MRA can best realize these benefits. The tiered MRA approach involves calculation of microbial fate for a product and process design on the basis of experimental data (e.g., monitoring data on prevalence) and predictive microbiological models. Calculations on new product formulations and novel processing technologies provide improved understanding of microbial fate beyond currently known boundaries, which enables identification of new opportunities in process design. The outcome of the tiered approach focuses on developing benchmarks of potential consumer exposure to hazards associated with new products by comparison with exposure associated with products that are already on the market and have a safe history of use. The tiered prototype is a tool to be used by experienced microbiologists as a basis for advice to product developers and can help to make safety assurance for new food concepts transparent to food inspection services.

  4. Social security reform in Central and Eastern Europe: variations on a Latin American theme.

    PubMed

    Kritzer, B E

    After Chile reformed its social security system in 1981, several other Latin American countries and certain Central and Eastern European (CEE) countries implemented the Chilean model, with some variations: either a single- or multitier system, or with a period of transition to take care of those in the labor force at the time of the change. The single-tier version consists of individual accounts in pension fund management companies. Multi-tier systems retain some form of public program and add mandatory individual accounts. Most of the CEE countries did not want to incur the high transition costs associated with the Chilean model. The switch to a market economy had already strained their economies. Also, the countries' desire to adopt the European Union's Euro as their currency--a move that required a specific debt ceiling--limited the amount of additional debt they could incur. This article describes the CEE reforms and makes some comparisons with the Latin American experience. Most of the CEE countries have chosen a mixed system and have restructured the pay-as-you-go (PAYGO) tier, while the Latin American countries have both single- and multi-tier systems. Some CEE countries have set up notional defined contribution (NDC) schemes for the PAYGO tier in which each insured person has a hypothetical account made up of all contributions during his or her working life. Survivors and disability programs in CEE have remained in the public tier, but in most of the Latin American programs the insured must purchase a separate insurance policy. Issues common to both regions include: Administrative costs are high and competition is keen, which has led to consolidation and mergers among the companies and a large market share controlled by a few companies. Benefits are proportionately lower for women than for men. A large, informal sector is not covered by social security. This sector is apparently much larger in Latin America than in the CEE countries. 
Issues that are unique to some of the CEE countries include: Individual accounts in Hungary and Poland have proved more attractive than originally anticipated. As a result, contributions to the public PAYGO system in Hungary and Poland fell short of expectations. In several countries, laws setting up the programs were enacted without all the details of providing benefits. For example, in some countries laws must now be drawn up for establishment of annuities because they do not yet exist. Setting up a coherent pension policy has been difficult in some countries because of frequent and significant changes in government. This situation has affected the progress of reform in various stages of development. In general, a definitive assessment of individual accounts in these countries will not be possible until a cohort of retirees has spent most of its career under the new system.

  5. Evaluation of Decision Rules in a Tiered Assessment of Inhalation Exposure to Nanomaterials.

    PubMed

    Brouwer, Derk; Boessen, Ruud; van Duuren-Stuurman, Birgit; Bard, Delphine; Moehlmann, Carsten; Bekker, Cindy; Fransman, Wouter; Klein Entink, Rinke

    2016-10-01

    Tiered or stepwise approaches to assess occupational exposure to nano-objects, and their agglomerates and aggregates, have been proposed, which require decision rules (DRs) to move to a next tier or terminate the assessment. In a desk study, the performance of a number of DRs based on the evaluation of results from direct-reading instruments was investigated both by statistical simulations and by applying the DRs to real workplace data sets. A statistical model that accounts for autocorrelation patterns in time series, i.e. autoregressive integrated moving average (ARIMA), was used as the 'gold' standard. The simulations showed that none of the proposed DRs covered the entire range of simulated scenarios with respect to the ARIMA model parameters; however, a combined DR showed slightly better agreement. Application of the DRs to real workplace datasets (n = 117) revealed sensitivity up to 0.72, whereas the lowest observed specificity was 0.95. The selection of the most appropriate DR is very much dependent on the consequences of the decision, i.e. ruling scenarios in or out of further evaluation. Since a basic assessment may also comprise other types of measurements and information, an evaluation logic was proposed which embeds the DRs and furthermore supports decision making in view of a tiered-approach exposure assessment. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
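
    The simulation side of such a study can be illustrated with a pure-Python AR(1) series (a special case of ARIMA) and one deliberately simple decision rule; the rule, the factor of 1.5, and the concentration levels below are assumptions for illustration, not the paper's actual DRs:

```python
import random

def ar1_series(n, mean, phi=0.6, sigma=1.0, rng=None):
    """Simulate an autocorrelated AR(1) particle-count series around `mean`."""
    rng = rng or random.Random()
    x, out = mean, []
    for _ in range(n):
        x = mean + phi * (x - mean) + rng.gauss(0, sigma)
        out.append(x)
    return out

def decision_rule(activity, background, factor=1.5):
    """Flag the scenario if mean activity exceeds `factor` times the
    background mean (an assumed, simple DR)."""
    return (sum(activity) / len(activity)
            > factor * sum(background) / len(background))

def evaluate(dr_trials=500, seed=3):
    """Sensitivity: fraction of truly elevated runs flagged.
    Specificity: fraction of clean runs not flagged."""
    rng = random.Random(seed)
    tp = tn = 0
    for _ in range(dr_trials):
        bg = ar1_series(60, mean=10, rng=rng)
        elevated = ar1_series(60, mean=20, rng=rng)  # true emission source
        clean = ar1_series(60, mean=10, rng=rng)     # no extra emission
        tp += decision_rule(elevated, bg)
        tn += not decision_rule(clean, bg)
    return tp / dr_trials, tn / dr_trials

if __name__ == "__main__":
    sens, spec = evaluate()
    print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

    Sweeping phi and sigma over a grid of scenarios is what reveals where a given DR stops agreeing with the ARIMA 'gold' standard.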

  6. Implementation of system intelligence in a 3-tier telemedicine/PACS hierarchical storage management system

    NASA Astrophysics Data System (ADS)

    Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.

    1995-05-01

    Our telemedicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, an optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PC) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as telemedicine, teleradiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte with 10+ years storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) or those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible, based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape media. Clustering of patient data on the same tape eliminates multiple tape loads and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drives' high-performance data-streaming capabilities, thereby reducing the data retrieval delays typical of streaming tape devices.
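
    The two intelligence rules described, HIS/RIS-triggered prefetching and per-patient clustering during optical-to-tape migration, can be sketched as follows (the event and index structures are assumptions for illustration):

```python
from collections import defaultdict

# Rule 1 (assumed form): an HIS/RIS admission event triggers retrieval of
# that patient's prior studies into the magnetic-disk cache.
def prefetch_candidates(admission_events, archive_index):
    """Return study IDs to stage on disk for newly admitted patients."""
    staged = []
    for patient_id in admission_events:
        staged.extend(archive_index.get(patient_id, []))
    return staged

# Rule 2 (assumed form): during disk/optical -> tape migration, group
# studies by patient so one tape load serves a whole patient retrieval.
def cluster_for_migration(studies):
    """Group (patient_id, study_id) pairs so each patient's studies land
    contiguously on the same tape."""
    by_patient = defaultdict(list)
    for patient_id, study_id in studies:
        by_patient[patient_id].append(study_id)
    return dict(by_patient)
```

    Both rules trade a little write-time bookkeeping for read-time locality, which is what lets a streaming tape drive approach its sequential throughput.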

  7. Prototype development of a web-based participative decision support platform in risk management

    NASA Astrophysics Data System (ADS)

    Aye, Zar Chi; Olyazadeh, Roya; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    This paper discusses the proposed background architecture and prototype development of an internet-based decision support system (DSS) in the field of natural hazards and risk management using open-source geospatial software and web technologies. It is based on a three-tier, client-server architecture supported by the Boundless (OpenGeo) framework and its client-side SDK application environment using customized gxp components and data utility classes. The main purpose of the system is to integrate the workflow of risk management systematically with the diverse involvement of stakeholders from different organizations dealing with natural hazards and risk, for evaluation of management measures through an active online participation approach. It aims to develop an adaptive, user-friendly web-based environment that allows users to set up risk management strategies based on actual context and data by integrating web-GIS and DSS functionality associated with process flow and other visualization tools. A web-GIS interface has been integrated within the DSS to deliver maps and provide certain geo-processing capabilities on the web, which can be easily accessed and shared by the different organizations located in the case study sites of the project. This platform is envisaged not only as a common web-based platform for the centralized sharing of data such as hazard maps, elements-at-risk maps and additional information, but also as an integrated risk management platform where users can upload data, analyze risk and identify possible alternative scenarios for risk reduction, especially for floods and landslides, either quantitatively or qualitatively depending on the risk information provided by the stakeholders in the case study regions. 
The level of involvement, access to and interaction with the provided functionality of the system varies depending on the roles and responsibilities of the stakeholders, for example, only the experts (planners, geological services, etc.) can have access to the alternative definition component to formulate the risk reduction measures. The development of such a participative platform would finally lead to an integrated risk management approach highlighting the needs to deal with involved experts and civil society in the decision-making process for evaluation of risk management measures through the active participation approach. The system will be applied and evaluated in four case study areas of the CHANGES project in Europe: Romania, North Eastern Italy, French Alps and Poland. However, the framework of the system is designed in a generic way so as to be applicable in other regions to achieve the high adaptability and flexibility of the system. The research has been undertaken as a part of the CHANGES project funded by the European Commission's 7th framework program.

  8. A Team Approach to Data-Driven Decision-Making Literacy Instruction in Preschool Classrooms: Child Assessment and Intervention through Classroom Team Self-Reflection

    ERIC Educational Resources Information Center

    Abbott, Mary; Beecher, Constance; Petersen, Sarah; Greenwood, Charles R.; Atwater, Jane

    2017-01-01

    Many schools around the country are getting positive responses implementing Response to Intervention (RTI) within a Multi-Tiered System of Support (MTSS) framework (e.g., Abbott, 2011; Ball & Trammell, 2011; Buysee & Peisner-Feinberg, 2009). RTI refers to an instructional model that is based on a student's response to instruction. RTI…

  9. Examining the Average and Local Effects of a Standardized Treatment for Fourth Graders with Reading Difficulties

    ERIC Educational Resources Information Center

    Wanzek, Jeanne; Petscher, Yaacov; Al Otaiba, Stephanie; Kent, Shawn; Christopher, Schatschneider; Haynes, Martha; Rivas, Brenna K.; Jones, Francesca G.

    2016-01-01

    The present study used a randomized control trial to examine the effects of a widely-used multi-component Tier 2 type intervention, Passport to Literacy, on the reading ability of 221 fourth graders who initially scored at or below the 30th percentile in reading comprehension. Intervention was provided by research staff to groups of 4-7 students…

  10. Organizational Structures and Processes to Support and Sustain Effective Technical Assistance in a State-Wide Multi-Tiered System of Support Initiative

    ERIC Educational Resources Information Center

    Morrison, Julie Q.; Russell, Christine; Dyer, Stephanie; Metcalf, Terri; Rahschulte, Rebecca L.

    2014-01-01

    Despite the national proliferation of technical assistance as a driver for school reform and as a model for embedded and sustained professional development, very little is known about the organizational structures and processes needed to support technical assistance. The purpose of this paper is to describe a structured needs assessment process…

  11. STITCHER: A web resource for high-throughput design of primers for overlapping PCR applications.

    PubMed

    O'Halloran, Damien M

    2015-06-01

    Overlapping PCR is routinely used in a wide number of molecular applications. These include stitching PCR fragments together, generating fluorescent transcriptional and translational fusions, inserting mutations, making deletions, and PCR cloning. Overlapping PCR is also used for genotyping by traditional PCR techniques and in detection experiments using techniques such as loop-mediated isothermal amplification (LAMP). STITCHER is a web tool providing a central resource for researchers conducting all types of overlapping PCR experiments, with an intuitive interface for automated primer design that's fast, easy to use, and freely available online (http://ohalloranlab.net/STITCHER.html). STITCHER can handle both single-sequence and multi-sequence input, and specific features facilitate numerous other PCR applications, including assembly PCR, adapter PCR, and primer walking. Field PCR, and in particular LAMP, offers promise as an on-site tool for pathogen detection in underdeveloped areas, and STITCHER includes off-target detection features for pathogens commonly targeted using LAMP technology.
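
    STITCHER's own algorithm is not reproduced in the abstract, but the core idea of overlap/stitching PCR primer design can be sketched as follows. The function names, overlap lengths and toy sequences below are illustrative assumptions, not STITCHER's actual interface:

```python
# Sketch of a bridging primer for overlap (stitching) PCR: the primer
# carries the tail of fragment A fused to the head of fragment B, so the
# two amplicons share complementary ends and can be stitched together in
# a subsequent PCR. All names and sequences here are hypothetical.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

def bridge_primers(frag_a: str, frag_b: str, overlap: int = 10, anneal: int = 10):
    """Design a forward bridging primer into fragment B and its reverse
    counterpart back into fragment A.

    Forward primer = last `overlap` bases of A + first `anneal` bases of B;
    the reverse primer is the reverse complement of that junction.
    """
    junction = frag_a[-overlap:] + frag_b[:anneal]
    return junction, reverse_complement(junction)

fwd, rev = bridge_primers("ATGGCCATTG", "CCAACTTAAT", overlap=4, anneal=4)
# fwd == "ATTGCCAA", rev == "TTGGCAAT"
```

    A real tool would additionally score melting temperature, GC content and off-target hits, which this sketch omits.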

  12. Development and preliminary testing of a web-based, self-help application for disaster-affected families

    PubMed Central

    Yuen, Erica K; Gros, Kirstin; Welsh, Kyleen E; McCauley, Jenna; Resnick, Heidi S; Danielson, Carla K; Price, Matthew; Ruggiero, Kenneth J

    2015-01-01

    Technology-based self-help interventions have the potential to increase access to evidence-based mental healthcare, especially for families affected by natural disasters. However, development of these interventions is a complex process and poses unique challenges. Usability testing, which assesses the ability of individuals to use an application successfully, can have a significant impact on the quality of a self-help intervention. This article describes (a) the development of a novel web-based multi-module self-help intervention for disaster-affected adolescents and their parents and (b) a mixed-methods formal usability study to evaluate user response. A total of 24 adolescents were observed, videotaped, and interviewed as they used the depressed mood component of the self-help intervention. Quantitative results indicated an above-average user experience, and qualitative analysis identified 120 unique usability issues. We discuss the challenges of developing self-help applications, including design considerations and the value of usability testing in technology-based interventions, as well as our plan for widespread dissemination. PMID:25933798

  13. ICM: a web server for integrated clustering of multi-dimensional biomedical data.

    PubMed

    He, Song; He, Haochen; Xu, Wenjian; Huang, Xin; Jiang, Shuai; Li, Fei; He, Fuchu; Bo, Xiaochen

    2016-07-08

    Large-scale efforts for parallel acquisition of multi-omics profiling continue to generate extensive amounts of multi-dimensional biomedical data. Integrated clustering of multiple types of omics data is therefore essential for developing individual-based treatments and precision medicine. However, while rapid progress has been made, methods for integrated clustering lack an intuitive web interface that supports biomedical researchers without strong programming skills. Here, we present a web tool, named Integrated Clustering of Multi-dimensional biomedical data (ICM), that provides an interface from which to fuse, cluster and visualize multi-dimensional biomedical data and knowledge. With ICM, users can explore the heterogeneity of a disease or a biological process by identifying subgroups of patients. The results obtained can then be interactively modified through an intuitive user interface. Researchers can also share results from ICM with collaborators via a web link containing a Project ID number that directly pulls up the analysis results being shared. ICM also supports incremental clustering, which allows users to add new sample data to the data of a previous study to obtain an updated clustering result. The ICM web server is currently available with no login requirement and at no cost at http://biotech.bmi.ac.cn/icm/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. A hierarchical analysis of terrestrial ecosystem model Biome-BGC: Equilibrium analysis and model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thornton, Peter E; Wang, Weile; Law, Beverly E.

    2009-01-01

    The increasing complexity of ecosystem models represents a major difficulty in tuning model parameters and analyzing simulated results. To address this problem, this study develops a hierarchical scheme that simplifies the Biome-BGC model into three functionally cascaded tiers and analyzes them sequentially. The first-tier model focuses on leaf-level ecophysiological processes; it simulates evapotranspiration and photosynthesis with prescribed leaf area index (LAI). The restriction on LAI is then lifted in the following two model tiers, which analyze how carbon and nitrogen are cycled at the whole-plant level (the second tier) and in all litter/soil pools (the third tier) to dynamically support the prescribed canopy. In particular, this study analyzes the steady state of these two model tiers by a set of equilibrium equations that are derived from Biome-BGC algorithms and are based on the principle of mass balance. Instead of spinning up the model for thousands of climate years, these equations are able to estimate carbon/nitrogen stocks and fluxes of the target (steady-state) ecosystem directly from the results obtained by the first-tier model. The model hierarchy is examined with model experiments at four AmeriFlux sites. The results indicate that the proposed scheme can effectively calibrate Biome-BGC to simulate observed fluxes of evapotranspiration and photosynthesis, and the carbon/nitrogen stocks estimated by the equilibrium analysis approach are highly consistent with the results of model simulations. Therefore, the scheme developed in this study may serve as a practical guide to calibrate and analyze Biome-BGC; it also provides an efficient way to solve the problem of model spin-up, especially for applications over large regions. The same methodology may help analyze other similar ecosystem models as well.
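
    The mass-balance shortcut described above can be illustrated for a single first-order pool: at steady state, input flux equals turnover times pool size, so the pool can be solved directly rather than spun up. The flux and turnover numbers below are invented for the example, not Biome-BGC parameters:

```python
# At equilibrium a first-order pool satisfies
#   d(pool)/dt = input_flux - turnover_rate * pool = 0,
# so the steady-state pool is input_flux / turnover_rate. The numeric
# spin-up below is only for comparison with the analytic shortcut.

def steady_state_pool(input_flux: float, turnover_rate: float) -> float:
    """Analytic steady-state pool size for a first-order pool."""
    return input_flux / turnover_rate

def spin_up_pool(input_flux, turnover_rate, years, dt=1.0, pool0=0.0):
    """Brute-force spin-up (forward Euler) toward the same equilibrium."""
    pool = pool0
    for _ in range(int(years / dt)):
        pool += dt * (input_flux - turnover_rate * pool)
    return pool

analytic = steady_state_pool(0.3, 0.02)            # 15.0 (e.g. kgC/m2)
numeric = spin_up_pool(0.3, 0.02, years=1000)      # converges to ~15.0
```

    The point of the equilibrium equations in the paper is exactly this: the analytic value is available immediately, while the spin-up needs on the order of a thousand simulated years to converge.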

  15. A Novel Segment-Based Approach for Improving Classification Performance of Transport Mode Detection.

    PubMed

    Guvensan, M Amac; Dusun, Burak; Can, Baris; Turkmen, H Irem

    2017-12-30

    Transportation planning and solutions have an enormous impact on city life. To minimize transport duration, urban planners must understand and characterize the mobility of a city. Thus, researchers look toward monitoring people's daily activities, including transportation types and durations, by taking advantage of individuals' smartphones. This paper introduces a novel segment-based transport mode detection architecture that improves on the results of traditional classification algorithms in the literature. The proposed post-processing algorithm, namely the Healing algorithm, aims to correct the misclassification results of machine learning-based solutions. Our real-life test results show that the Healing algorithm could achieve up to 40% improvement of the classification results. As a result, the implemented mobile application could predict eight classes, including stationary, walking, car, bus, tram, train, metro and ferry, with a success rate of 95% thanks to the proposed multi-tier architecture and Healing algorithm.
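
    The paper's Healing algorithm is not specified in the abstract; a minimal sketch of the general idea of such label post-processing is a majority vote over neighbouring segments, which removes isolated misclassifications in an otherwise stable stretch of travel. The window size, labels and tie-break rule below are assumptions for illustration:

```python
from collections import Counter

def healing(labels, window=2):
    """Relabel each segment by majority vote over a sliding window of its
    neighbours; isolated outliers are 'healed' to the surrounding mode."""
    healed = []
    for i in range(len(labels)):
        lo, hi = max(0, i - window), min(len(labels), i + window + 1)
        votes = Counter(labels[lo:hi])
        # Keep the original label on ties so stable stretches are not disturbed.
        best = max(votes, key=lambda m: (votes[m], m == labels[i]))
        healed.append(best)
    return healed

preds = ["bus", "bus", "car", "bus", "bus", "walk", "walk", "walk"]
print(healing(preds))
# ['bus', 'bus', 'bus', 'bus', 'bus', 'walk', 'walk', 'walk']
```

    The stray "car" segment inside the bus trip is corrected, while the genuine bus-to-walk transition is preserved.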

  16. Coordinating teams of autonomous vehicles: an architectural perspective

    NASA Astrophysics Data System (ADS)

    Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo

    2005-05-01

    In defense-related robotics research, a mission-level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance, are being developed by a U.S. Army-funded SBIR project (RDECOM Contract N61339-04-C-0005).

  17. Explorations Around "Graceful Failure" in Transportation Infrastructure: Lessons Learned By the Infrastructure and Climate Network (ICNet)

    NASA Astrophysics Data System (ADS)

    Jacobs, J. M.; Thomas, N.; Mo, W.; Kirshen, P. H.; Douglas, E. M.; Daniel, J.; Bell, E.; Friess, L.; Mallick, R.; Kartez, J.; Hayhoe, K.; Croope, S.

    2014-12-01

    Recent events have demonstrated that the United States' transportation infrastructure is highly vulnerable to extreme weather events which will likely increase in the future. In light of the 60% shortfall of the $900 billion investment needed over the next five years to maintain this aging infrastructure, hardening of all infrastructures is unlikely. Alternative strategies are needed to ensure that critical aspects of the transportation network are maintained during climate extremes. Preliminary concepts around multi-tier service expectations of bridges and roads with reference to network capacity will be presented. Drawing from recent flooding events across the U.S., specific examples for roads/pavement will be used to illustrate impacts, disruptions, and trade-offs between performance during events and subsequent damage. This talk will also address policy and cultural norms within the civil engineering practice that will likely challenge the application of graceful failure pathways during extreme events.

  18. The Barrie Jones Lecture—Eye care for the neglected population: challenges and solutions

    PubMed Central

    Rao, G N

    2015-01-01

    Globally, pockets of ‘neglected populations’ do not have access to basic health-care services and carry a much greater risk of blindness and visual impairment. While large-scale public health approaches to control blindness due to vitamin A deficiency, onchocerciasis, and trachoma have been successful, other causes of blindness still take a heavy toll in the population. High-quality, equitable, comprehensive eye care is the approach that needs wide-scale application to alleviate this inequity. The L V Prasad Eye Institute of India developed a multi-tier pyramidal model of eye care delivery that encompasses all levels from primary to advanced tertiary (quaternary) care. This has demonstrated the feasibility of ‘Universal Eye Health Coverage’ covering promotive, preventive, corrective, and rehabilitative aspects of eye care. Using human resources with competency-based training, effective and cost-effective care could be provided to many disadvantaged people. PMID:25567375

  19. Shape-and-behavior encoded tracking of bee dances.

    PubMed

    Veeraraghavan, Ashok; Chellappa, Rama; Srinivasan, Mandyam

    2008-03-01

    Behavior analysis of social insects has gained impetus in recent years and has led to advances in fields such as control systems and flight navigation. Manual labeling of the insect motions required for behavior analysis demands a significant investment of time and effort. In this paper, we propose general principles that enable simultaneous automatic tracking and behavior analysis, with applications to tracking bees and recognizing specific behaviors exhibited by them. The state space for tracking is defined using the position, orientation and current behavior of the insect being tracked. The position and orientation are parametrized using a shape model, while the behavior is explicitly modeled using a three-tier hierarchical motion model. The first tier (dynamics) models the local motions exhibited, and the models built in this tier act as a vocabulary for behavior modeling. The second tier is a Markov motion model built on top of the local motion vocabulary, which serves as the behavior model. The third tier of the hierarchy models the switching between behaviors, which is also modeled as a Markov model. We address issues in learning the three-tier behavioral model, in discriminating between models, and in detecting and modeling abnormal behaviors. Another important aspect of this work is that it leads to joint tracking and behavior analysis instead of the traditional track-then-recognize approach. We apply these principles to tracking bees in a hive while they execute the waggle dance and the round dance.
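
    The top tier of such a hierarchy, switching between behaviours, can be sketched as a simple Markov chain. The behaviour names and transition probabilities below are invented for illustration, not the paper's learned model:

```python
import random

# Toy top-tier switching model: the tracked bee's current behaviour
# ("waggle" or "round" dance) evolves as a Markov chain. The transition
# probabilities are hypothetical.

SWITCH = {
    "waggle": {"waggle": 0.9, "round": 0.1},
    "round":  {"round": 0.85, "waggle": 0.15},
}

def next_behavior(current: str, rng: random.Random) -> str:
    """Sample the next behaviour from the switching model."""
    r, acc = rng.random(), 0.0
    for behavior, p in SWITCH[current].items():
        acc += p
        if r < acc:
            return behavior
    return current  # guard against floating-point shortfall

rng = random.Random(0)
track = ["waggle"]
for _ in range(20):
    track.append(next_behavior(track[-1], rng))
```

    In the full system the lower two tiers would score each candidate behaviour against the observed local motion, so tracking and behaviour recognition are performed jointly rather than sequentially.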

  20. Is the chronic Tier-1 effect assessment approach for insecticides protective for aquatic ecosystems?

    PubMed

    Brock, Theo Cm; Bhatta, Ranjana; van Wijngaarden, René Pa; Rico, Andreu

    2016-10-01

    We investigated the appropriateness of several methods, including those recommended in the Aquatic Guidance Document of the European Food Safety Authority (EFSA), for the derivation of chronic Tier-1 regulatory acceptable concentrations (RACs) for insecticides and aquatic organisms. The insecticides represented different chemical classes (organophosphates, pyrethroids, benzoylureas, insect growth regulators, biopesticides, carbamates, neonicotinoids, and miscellaneous). Chronic Tier-1 RACs derived using toxicity data for the standard species Daphnia magna, Chironomus spp., and/or Americamysis bahia were compared with Tier-3 RACs derived from micro- and mesocosm studies on the basis of the ecological threshold option (ETO-RACs). ETO-RACs could be derived for 31 insecticides applied to micro- and mesocosms in single or multiple applications, yielding a total of 36 cases for comparison. The chronic Tier-1 RACs calculated according to the EFSA approach resulted in a sufficient protection level, except for 1 neonicotinoid (slightly underprotective) and for several pyrethroids if toxicity data for A. bahia were not included. This latter observation can be explained by 1) the fact that A. bahia is the most sensitive standard test species for pyrethroids, 2) the hydrophobic properties of pyrethroids, and 3) the fact that long-term effects observed in (epi)benthic arthropods may be better explained by exposure via the sediment than via the overlying water. Besides including toxicity data for A. bahia, the protection level for pyrethroids can be improved by selecting both D. magna and Chironomus spp. as standard test species for chronic Tier-1 derivation. Although protective in the majority of cases, the conservativeness of the recommended chronic Tier-1 RACs appears to be less than an order of magnitude for a relatively large proportion of insecticides when compared with their Tier-3 ETO-RACs. This may leave limited options for refinement of the chronic effect assessment using laboratory toxicity data for additional species. Integr Environ Assess Manag 2016;12:747-758. © 2015 SETAC.
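
    The comparison logic the study performs can be sketched in a few lines: a chronic Tier-1 RAC is taken as the lowest standard-species chronic NOEC divided by an assessment factor (10 for chronic effects under the EFSA Aquatic Guidance), and it is judged protective when it does not exceed the mesocosm-derived ETO-RAC. The species values below are illustrative, not measured toxicity data:

```python
# Hedged sketch of chronic Tier-1 RAC derivation and its comparison
# against a Tier-3 ETO-RAC. NOEC values are invented for illustration.

CHRONIC_AF = 10.0  # EFSA default assessment factor for chronic Tier-1

def tier1_rac(noecs_ug_per_l: dict) -> float:
    """Chronic Tier-1 RAC (ug/L): lowest standard-species NOEC / AF."""
    return min(noecs_ug_per_l.values()) / CHRONIC_AF

def is_protective(tier1: float, eto_rac: float) -> bool:
    """Tier-1 is protective when it is at or below the Tier-3 ETO-RAC."""
    return tier1 <= eto_rac

noecs = {"Daphnia magna": 1.2, "Chironomus spp.": 0.4, "Americamysis bahia": 0.05}
rac = tier1_rac(noecs)                    # 0.005 ug/L
print(is_protective(rac, eto_rac=0.02))   # True
```

    Dropping the most sensitive species (here A. bahia) would raise the Tier-1 RAC eightfold, which mirrors the paper's finding that pyrethroid Tier-1 RACs become underprotective without A. bahia data.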

  1. Multi-dimensional construction of a novel active yolk@conductive shell nanofiber web as a self-standing anode for high-performance lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Liu, Hao; Chen, Luyi; Liang, Yeru; Fu, Ruowen; Wu, Dingcai

    2015-11-01

    A novel active yolk@conductive shell nanofiber web with a unique synergistic advantage of various hierarchical nanodimensional objects, including the 0D monodisperse SiO2 yolks, the 1D continuous carbon shell and the 3D interconnected non-woven fabric web, has been developed by an innovative multi-dimensional construction method, and thus demonstrates excellent electrochemical properties as a self-standing LIB anode. Electronic supplementary information (ESI) available: Experimental details and additional information about material characterization. See DOI: 10.1039/c5nr06531c

  2. Web-Based Architecture to Enable Compute-Intensive CAD Tools and Multi-user Synchronization in Teleradiology

    NASA Astrophysics Data System (ADS)

    Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin

    Teleradiology is the electronic transmission of radiological patient images, such as X-rays, CT, or MR images, across multiple locations. The goal could be interpretation, consultation, or medical record keeping. Information technology solutions have enabled electronic records, and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces and computer-assisted diagnostic (CAD) tools are yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications, and case studies are presented here.

  3. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE PAGES

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    2018-04-17

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps of realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7× I/O performance improvement for scientific data.
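
    The placement decision at the heart of such a multi-tier storage model can be sketched as a simple fall-through policy: hot, latency-sensitive objects go to the fastest tier with room, while cold objects skip it. This is an illustration of the general technique, not the paper's implementation; the tier names, capacities and hot/cold heuristic are assumptions:

```python
# Illustrative multi-tier object placement: try tiers fastest-first,
# falling through to capacity tiers when a tier is full or the object
# is cold. Tier names and sizes are hypothetical.

TIERS = [
    {"name": "nvm_burst_buffer", "capacity_gb": 100, "used_gb": 0},
    {"name": "parallel_fs",      "capacity_gb": 10_000, "used_gb": 0},
    {"name": "archive",          "capacity_gb": 1_000_000, "used_gb": 0},
]

def place_object(size_gb: float, hot: bool) -> str:
    """Return the tier chosen for an object; cold data skips the top tier."""
    candidates = TIERS if hot else TIERS[1:]
    for tier in candidates:
        if tier["used_gb"] + size_gb <= tier["capacity_gb"]:
            tier["used_gb"] += size_gb
            return tier["name"]
    raise RuntimeError("hierarchy full")

print(place_object(50, hot=True))    # nvm_burst_buffer
print(place_object(80, hot=True))    # parallel_fs (burst buffer lacks room)
print(place_object(500, hot=False))  # parallel_fs (cold data skips top tier)
```

    A production system would also migrate objects between tiers as access patterns change, which this sketch omits.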

  4. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps of realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7× I/O performance improvement for scientific data.

  5. BioPartsDB: a synthetic biology workflow web-application for education and research.

    PubMed

    Stracquadanio, Giovanni; Yang, Kun; Boeke, Jef D; Bader, Joel S

    2016-11-15

    Synthetic biology has become a widely used technology, and expanding applications in research, education and industry require progress tracking for team-based DNA synthesis projects. Although some vendors are beginning to supply multi-kilobase sequence-verified constructs, synthesis workflows starting with short oligos remain important for cost savings and pedagogical benefit. We developed BioPartsDB as an open-source, extendable workflow management system for synthetic biology projects, with entry points for oligos and larger DNA constructs and ending with sequence-verified clones. BioPartsDB is released under the MIT license and available for download at https://github.com/baderzone/biopartsdb. Additional documentation and video tutorials are available at https://github.com/baderzone/biopartsdb/wiki. An Amazon Web Services image is available from the AWS Market Place (ami-a01d07c8). Contact: joel.bader@jhu.edu. © The Author 2016. Published by Oxford University Press.

  6. 38 CFR 39.84 - Application requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., the State or Tribal Organization must submit an application (as opposed to a preapplication... Debarment, Suspension, Ineligibility and Voluntary Exclusion-Lower Tier Covered Transactions (Contractor... Management and Budget has approved the information collection requirements in this section under control...

  7. 38 CFR 39.84 - Application requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., the State or Tribal Organization must submit an application (as opposed to a preapplication... Debarment, Suspension, Ineligibility and Voluntary Exclusion-Lower Tier Covered Transactions (Contractor... Management and Budget has approved the information collection requirements in this section under control...

  8. Rich internet application system for patient-centric healthcare data management using handheld devices.

    PubMed

    Constantinescu, L; Pradana, R; Kim, J; Gong, P; Fulham, Michael; Feng, D

    2009-01-01

    Rich Internet Applications (RIAs) are an emerging software platform that blurs the line between web service and native application, and is a powerful tool for handheld device deployment. By democratizing health data management and widening its availability, this software platform has the potential to revolutionize telemedicine, clinical practice, medical education and information distribution, particularly in rural areas, and to make patient-centric medical computing a reality. In this paper, we propose a telemedicine application that leverages the ability of a mobile RIA platform to transcode, organise and present textual and multimedia data sourced from medical database software. We adopted a web-based approach to communicate, in real time, with an established hospital information system via a custom RIA. The proposed solution allows communication between handheld devices and a hospital information system for media streaming, with support for real-time encryption, on any RIA-enabled platform. We demonstrate our prototype's ability to securely and rapidly access, without installation requirements, medical data ranging from simple textual records to multi-slice PET-CT images and maximum intensity projections (MIPs).

  9. An innovative middle tier design for protecting federal privacy act data

    NASA Astrophysics Data System (ADS)

    Allen, Thomas G. L.

    2008-03-01

    This paper identifies an innovative middle tier technique and design that provides a solid layer of network security for a single source of human resources (HR) data that falls under the Federal Privacy Act. The paper also discusses functionality for both retrieving data and updating data in a secure way. It will be shown that access to this information is limited by a security mechanism that authorizes all connections based on both application (client) and user information.
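
    The authorization rule described above, where every connection is checked against both the calling application and the user, can be sketched as follows. The ACL contents and names are hypothetical, not the paper's design:

```python
# Minimal sketch of middle-tier authorization keyed on the combination of
# application (client) and user, so Privacy Act data is never reachable
# by user credentials alone. Entries below are invented for illustration.

ACL = {
    ("hr_portal", "alice"): {"read", "update"},
    ("hr_portal", "bob"):   {"read"},
}

def authorize(app: str, user: str, action: str) -> bool:
    """Grant an action only if this (application, user) pair is allowed it."""
    return action in ACL.get((app, user), set())

print(authorize("hr_portal", "bob", "read"))    # True
print(authorize("hr_portal", "bob", "update"))  # False: user lacks update
print(authorize("rogue_app", "alice", "read"))  # False: unknown application
```

    Keying the check on the pair, rather than on the user alone, is what lets the middle tier deny an otherwise-valid user arriving through an unapproved client.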

  10. Ground/Air Task Oriented Radar (G/ATOR)

    DTIC Science & Technology

    2015-12-01

    G/ATOR December 2015 Selected Acquisition Report (SAR), March 18, 2016. UNCLASSIFIED. Block 4 growth: integrate MK XIIA IFF Mode 5 (Level 2) and Mode S (Level 2) capabilities, with planned growth to IFF Mode 5 (Level 3) and Mode S (Level 3). Tier 1: Logistics; Tier 2: Operational Contract Support Sustainment.

  11. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    PubMed

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de novo assembly and taxonomic classification of viruses, as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable, searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as the human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  12. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    NASA Astrophysics Data System (ADS)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open-access ground station measurements, remote sensing images, volunteer observer snow reports, and cross-country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations listed in the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy-balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real-time sensor observations to a CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time-series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average time required to retrieve 100 days of data using this technique is 5.4 seconds, significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can serve as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third and final step of the data integration procedure is generating continuous daily snow cover maps. A custom inverse distance weighting method has been developed to combine volunteer snow reports, cross-country ski track reports and station measurements to fill cloud gaps in the MODIS snow cover product. The method is demonstrated by producing a continuous daily time-step snow presence probability map dataset for the Czech Republic. The ability of the presented methodology to reconstruct MODIS snow cover under cloud is validated by simulating cloud cover datasets and comparing estimated snow cover to actual MODIS snow cover. The percent-correctly-classified indicator showed accuracy between 80 and 90% using this method. Using crowdsourced data (volunteer snow reports and ski tracks) improves the map accuracy by 0.7-1.2%. The output snow probability map datasets are published online using web applications and web services. Keywords: crowdsourcing, image analysis, interpolation, MODIS, R statistical software, snow cover, snowpack probability, Tethys platform, time series, WaterML, web services, winter sports.
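
    The abstract's custom method is a variant of inverse distance weighting (IDW); the standard form of IDW used to estimate a cloud-masked cell from nearby snow reports can be sketched as follows. The report locations, values and power parameter are illustrative, not the dissertation's data:

```python
# Standard inverse distance weighting: each nearby report contributes
# with weight 1/d^power; an exact-location report is returned directly.

def idw(target, reports, power=2.0):
    """Estimate a value (e.g. snow presence probability) at `target` (x, y)
    from a list of (x, y, value) reports."""
    num = den = 0.0
    for x, y, value in reports:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return value  # report exactly at the target point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

reports = [(0, 0, 1.0), (2, 0, 0.0)]   # snow at one site, none at another
print(idw((1, 0), reports))            # 0.5: equidistant between the two
```

    Points closer to a snow report receive estimates pulled toward that report's value, which is how station, volunteer and ski-track observations can fill a cloud gap smoothly.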

  13. 76 FR 33380 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ... Two New Pricing Tiers, Investor Tier 1 and Investor Tier 2 June 3, 2011. Pursuant to Section 19(b)(1... Services (the ``Schedule'') to introduce two new pricing tiers, Investor Tier 1 and Investor Tier 2. The... proposes to introduce two new pricing tier levels, Investor Tier 1 and Investor Tier 2. Investor Tier 1...

  14. "Tactic": Traffic Aware Cloud for Tiered Infrastructure Consolidation

    ERIC Educational Resources Information Center

    Sangpetch, Akkarit

    2013-01-01

    Large-scale enterprise applications are deployed as distributed applications. These applications consist of many inter-connected components with heterogeneous roles and complex dependencies. Each component typically consumes 5-15% of the server capacity. Deploying each component as a separate virtual machine (VM) allows us to consolidate the…

  15. Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, Stephen J.; Wright, David W.; Zhang, Hailiang

    2016-10-14

The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers.

  16. Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS).

    PubMed

    Perkins, Stephen J; Wright, David W; Zhang, Hailiang; Brookes, Emre H; Chen, Jianhan; Irving, Thomas C; Krueger, Susan; Barlow, David J; Edler, Karen J; Scott, David J; Terrill, Nicholas J; King, Stephen M; Butler, Paul D; Curtis, Joseph E

    2016-12-01

The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers.

  17. CropEx Web-Based Agricultural Monitoring and Decision Support

    NASA Technical Reports Server (NTRS)

Harvey, Craig; Lawhead, Joel

    2011-01-01

CropEx is a Web-based agricultural Decision Support System (DSS) that monitors changes in crop health over time. It is designed to be used by a wide range of both public and private organizations, including individual producers and regional government offices with a vested interest in tracking vegetation health. The database and data management system automatically retrieve and ingest data for the area of interest. A second database stores results of the processing and supports the DSS. The processing engine allows server-side analysis of imagery with support for image sub-setting and a set of core raster operations for image classification, creation of vegetation indices, and change detection. The system includes the Web-based (CropEx) interface, data ingestion system, server-side processing engine, and a database processing engine. It contains a Web-based interface that has multi-tiered security profiles for multiple users. The interface provides the ability to identify areas of interest to specific users, user profiles, and methods of processing and data types for selected or created areas of interest. A compilation of programs is used to ingest available data into the system, classify that data, profile that data for quality, and make data available to the processing engine immediately upon the data's availability to the system (near real time). The processing engine consists of methods and algorithms used to process the data in a real-time fashion without copying, storing, or moving the raw data. The engine makes results available to the database processing engine for storage and further manipulation. The database processing engine ingests data from the image processing engine, distills those results into numerical indices, and stores each index for an area of interest.
This process happens each time new data is ingested and processed for the area of interest, and upon subsequent database entries, the database processing engine qualifies each value for each area of interest and conducts a logical processing of results indicating when and where thresholds are exceeded. Reports are provided at regular, operator-determined intervals that include variances from thresholds and links to view raw data for verification, if necessary. The technology and method of development allow the code base to easily be modified for varied use in the real-time and near-real-time processing environments. In addition, the final product will be demonstrated as a means for rapid draft assessment of imagery.
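The abstract mentions vegetation indices, change detection, and threshold-based alerts as the core raster operations. A minimal sketch of that pipeline, assuming the common NDVI index as the vegetation metric (the band values, baseline, and threshold below are illustrative, not CropEx's actual parameters):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for co-registered band arrays:
    (NIR - Red) / (NIR + Red), ranging from -1 to 1."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

def change_flags(current_ndvi, baseline_ndvi, threshold=0.1):
    """Flag pixels whose NDVI dropped more than `threshold` below the
    baseline, indicating a possible decline in crop health."""
    return (baseline_ndvi - current_ndvi) > threshold

# Hypothetical 2x2 scenes: current acquisition vs. a stored baseline
nir_now  = np.array([[0.6, 0.5], [0.4, 0.7]])
red_now  = np.array([[0.2, 0.2], [0.3, 0.1]])
nir_base = np.array([[0.7, 0.6], [0.6, 0.7]])
red_base = np.array([[0.1, 0.1], [0.1, 0.1]])

flags = change_flags(ndvi(nir_now, red_now), ndvi(nir_base, red_base))
print(flags)  # boolean mask of pixels exceeding the decline threshold
```

In a system like the one described, the per-area index values would be stored by the database processing engine and the boolean exceedances would drive the periodic threshold-variance reports.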

  18. A win-win solution?: A critical analysis of tiered pricing to improve access to medicines in developing countries.

    PubMed

    Moon, Suerie; Jambert, Elodie; Childs, Michelle; von Schoen-Angerer, Tido

    2011-10-12

    Tiered pricing - the concept of selling drugs and vaccines in developing countries at prices systematically lower than in industrialized countries - has received widespread support from industry, policymakers, civil society, and academics as a way to improve access to medicines for the poor. We carried out case studies based on a review of international drug price developments for antiretrovirals, artemisinin combination therapies, drug-resistant tuberculosis medicines, liposomal amphotericin B (for visceral leishmaniasis), and pneumococcal vaccines. We found several critical shortcomings to tiered pricing: it is inferior to competition for achieving the lowest sustainable prices; it often involves arbitrary divisions between markets and/or countries, which can lead to very high prices for middle-income markets; and it leaves a disproportionate amount of decision-making power in the hands of sellers vis-à-vis consumers. In many developing countries, resources are often stretched so tight that affordability can only be approached by selling medicines at or near the cost of production. Policies that "de-link" the financing of R&D from the price of medicines merit further attention, since they can reward innovation while exploiting robust competition in production to generate the lowest sustainable prices. However, in special cases - such as when market volumes are very small or multi-source production capacity is lacking - tiered pricing may offer the only practical option to meet short-term needs for access to a product. In such cases, steps should be taken to ensure affordability and availability in the longer-term. To ensure access to medicines for populations in need, alternate strategies should be explored that harness the power of competition, avoid arbitrary market segmentation, and/or recognize government responsibilities. 
Competition should generally be the default option for achieving affordability, as it has proven superior to tiered pricing for reliably achieving the lowest sustainable prices.

  19. A win-win solution?: A critical analysis of tiered pricing to improve access to medicines in developing countries

    PubMed Central

    2011-01-01

    Background Tiered pricing - the concept of selling drugs and vaccines in developing countries at prices systematically lower than in industrialized countries - has received widespread support from industry, policymakers, civil society, and academics as a way to improve access to medicines for the poor. We carried out case studies based on a review of international drug price developments for antiretrovirals, artemisinin combination therapies, drug-resistant tuberculosis medicines, liposomal amphotericin B (for visceral leishmaniasis), and pneumococcal vaccines. Discussion We found several critical shortcomings to tiered pricing: it is inferior to competition for achieving the lowest sustainable prices; it often involves arbitrary divisions between markets and/or countries, which can lead to very high prices for middle-income markets; and it leaves a disproportionate amount of decision-making power in the hands of sellers vis-à-vis consumers. In many developing countries, resources are often stretched so tight that affordability can only be approached by selling medicines at or near the cost of production. Policies that "de-link" the financing of R&D from the price of medicines merit further attention, since they can reward innovation while exploiting robust competition in production to generate the lowest sustainable prices. However, in special cases - such as when market volumes are very small or multi-source production capacity is lacking - tiered pricing may offer the only practical option to meet short-term needs for access to a product. In such cases, steps should be taken to ensure affordability and availability in the longer-term. Summary To ensure access to medicines for populations in need, alternate strategies should be explored that harness the power of competition, avoid arbitrary market segmentation, and/or recognize government responsibilities. 
Competition should generally be the default option for achieving affordability, as it has proven superior to tiered pricing for reliably achieving the lowest sustainable prices. PMID:21992405

  20. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the data input, and its effect on model results assessed by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high-tier exposure assessment.
