Development of an IHE MRRT-compliant open-source web-based reporting platform.
Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P
2017-01-01
To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform in clinical routine. The reporting platform uses only standard web technologies (HTML/JavaScript and PHP/MySQL). Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports have been generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.
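The platform described stores finished reports in a MySQL database for later analysis. The sketch below illustrates that storage step in standard-library Python, with SQLite standing in for MySQL; the table layout, column names and template identifier are assumptions for illustration, not the platform's actual schema.

```python
import json
import sqlite3

# Stand-in for the platform's MySQL report store (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE report (
        id INTEGER PRIMARY KEY,
        template_id TEXT,   -- e.g. an MRRT template identifier
        patient_id TEXT,
        fields_json TEXT    -- the filled-in template field values
    )
""")

def store_report(template_id, patient_id, fields):
    """Serialize the template's field values and insert one report row."""
    cur = conn.execute(
        "INSERT INTO report (template_id, patient_id, fields_json) VALUES (?, ?, ?)",
        (template_id, patient_id, json.dumps(fields)),
    )
    return cur.lastrowid

report_id = store_report(
    "RPT144",    # hypothetical template ID
    "PAT-0001",  # hypothetical patient ID
    {"modality": "CT", "finding": "no acute abnormality"},
)
```

Keeping the filled-in fields as structured rows rather than free text is what makes the stored reports "easily accessible for further analyses", as the abstract notes.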
Eccher, Claudio; Eccher, Lorenzo; Izzo, Umberto
2005-01-01
In this poster we describe the security solutions implemented in a web-based cooperative work framework for managing heart failure patients among the different health care professionals involved in the care process. The solution, developed in close collaboration with the Law Department of the University of Trento, is compliant with the new Italian Personal Data Protection Code, issued in 2003, which also regulates the storage and processing of health data.
40 CFR 63.4292 - What operating limits must I meet?
Code of Federal Regulations, 2010 CFR
2010-07-01
... systems on the web coating/printing operation(s) and dyeing/finishing operations for which you use this... Limitations § 63.4292 What operating limits must I meet? (a) For any web coating/printing operation, slashing operation, or dyeing/finishing operation on which you use the compliant material option; web coating...
Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication
NASA Astrophysics Data System (ADS)
Kadlec, J.
2013-12-01
The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is their complicated set-up and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a web hosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture for a standards-compliant, lightweight and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet web hosting services. The architecture and design are validated by developing HydroServer Lite: a PHP- and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) hydrologic information system. It is already being used for management of field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the HydroServer Lite software has been installed on multiple free and low-cost web hosting sites including GoDaddy, Bluehost and 000webhost. The number of steps required to set up the server is compared with the number required to set up other standards-compliant hydrologic data hosting systems, including THREDDS, istSOS and MapServer SOS.
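A minimal sketch of how a client might request one site's observation time series from a lightweight, shared-hosting-friendly data server of the kind described. The endpoint path and parameter names are illustrative assumptions, not the actual HydroServer Lite API; only the URL is built here, no request is sent.

```python
from urllib.parse import urlencode

def build_values_url(base, site_code, variable_code, start, end):
    """Compose a GET URL asking for one site/variable time series
    (hypothetical endpoint layout for a PHP-based data server)."""
    query = urlencode({
        "site": site_code,
        "variable": variable_code,
        "startDate": start,
        "endDate": end,
    })
    return f"{base}/index.php/values?{query}"

url = build_values_url(
    "http://example.org/hydroserver",    # placeholder host
    "LBR:USU-LBR-Mendon", "LBR:USU39",   # CUAHSI-style site/variable codes
    "2013-01-01", "2013-12-31",
)
```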
Section 508 Standards Resources
Learn which software applications, operating systems, web-based applications, and other electronic and information technology (EIT) products are covered by Section 508 of the Rehabilitation Act, and find resources for making sure your EIT products are compliant.
Compliant threads maximize spider silk connection strength and toughness
Meyer, Avery; Pugno, Nicola M.; Cranford, Steven W.
2014-01-01
Millions of years of evolution have adapted spider webs to achieve a range of functionalities, including the well-known capture of prey, with efficient use of material. One feature that has escaped extensive investigation is the silk-on-silk connection joints within spider webs, particularly from a structural mechanics perspective. We report a joint theoretical and computational analysis of an idealized silk-on-silk fibre junction. By modifying the theory of multiple peeling, we quantitatively compare the performance of the system while systematically increasing the rigidity of the anchor thread, by both scaling the stress–strain response and the introduction of an applied pre-strain. The results of our study indicate that compliance is a virtue—the more extensible the anchorage, the tougher and stronger the connection becomes. In consideration of the theoretical model, in comparison with rigid substrates, a compliant anchorage enormously increases the effective adhesion strength (work required to detach), independent of the adhered thread itself, attributed to a nonlinear alignment between thread and anchor (contact peeling angle). The results can direct novel engineering design principles to achieve possible load transfer from compliant fibre-to-fibre anchorages, be they silk-on-silk or another, as-yet undeveloped, system. PMID:25008083
Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke
2011-10-24
Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447
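The core SADI pattern (a service consumes RDF describing an input instance and returns new properties attached to that same URI, which is what lets clients chain services into workflows) can be sketched in plain Python. Triples are modelled as tuples; the class and property URIs below are invented for illustration and are not from any real SADI service.

```python
EX = "http://example.org/"  # hypothetical ontology namespace

def sadi_like_service(input_triples):
    """For each instance typed as ex:Gene in the input graph, return an
    output graph attaching an ex:hasSequence property to the SAME subject
    URI -- the identity-preserving behaviour central to the SADI pattern."""
    output = set()
    for s, p, o in input_triples:
        if p == "rdf:type" and o == EX + "Gene":
            output.add((s, EX + "hasSequence", "ATGGCC"))  # made-up value
    return output

# A one-triple input graph describing a (hypothetical) gene instance.
input_graph = {("urn:gene:BRCA2", "rdf:type", EX + "Gene")}
output_graph = sadi_like_service(input_graph)
```

Because the output is about the same subject as the input, a client can merge the two graphs and feed the result to the next service in a workflow without any identifier mapping.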
Conwell, J L; Creek, K L; Pozzi, A R; Whyte, H M
2001-02-01
The Industrial Hygiene and Safety Group at Los Alamos National Laboratory (LANL) developed a database application known as IH DataView, which manages industrial hygiene monitoring data. IH DataView replaces a LANL legacy system, IHSD, that restricted user access to a single point of data entry, needed enhancements to support new operational requirements, and was not Year 2000 (Y2K) compliant. IH DataView features a comprehensive suite of data collection and tracking capabilities. Through the use of Oracle database management and application development tools, the system is Y2K compliant and Web enabled for easy deployment and user access via the Internet. System accessibility is particularly important because LANL operations are spread over 43 square miles, and industrial hygienists (IHs) located across the laboratory will use the system. IH DataView shows promise of continued usefulness because it eliminates these problems: it has a flexible architecture and a sophisticated capability to collect, track, and analyze data in an easy-to-use form.
Scientific & Intelligence Exascale Visualization Analysis System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Money, James H.
SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. SIEVAS provides the ability to connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability by using a combination of microservices, real-time messaging, and a web-service-compliant back-end system.
dos-Santos, M; Fujino, A
2012-01-01
Radiology teaching usually employs a systematic and comprehensive set of medical images and related information. Databases with representative radiological images and documents are highly desirable and widely used in Radiology teaching programs. Currently, computer-based teaching file systems are widely used in Medicine and Radiology teaching as an educational resource. This work addresses a user-centered radiology electronic teaching file system built as an instance of a MIRC-compliant medical image database. As in a digital library, the clinical cases are accessible using a web browser. The system has offered great opportunities for some Radiology residents to interact with experts. This has been achieved by applying user-centered techniques and creating usage-context-based tools in order to make the system interactive.
On Campus Web-Monitoring Rules, Colleges and the FCC Have a Bad Connection
ERIC Educational Resources Information Center
Hartle, Terry W.
2006-01-01
A regulation issued by the US Federal Communications Commission (FCC) requires facilities-based Internet service providers who operate their own equipment, including colleges, to make their Internet systems compliant with a statute known as the Communications Assistance for Law Enforcement Act (CALEA) by April 2007. However, the FCC does not…
A SCORM Compliant Courseware Authoring Tool for Supporting Pervasive Learning
ERIC Educational Resources Information Center
Wang, Te-Hua; Chang, Flora Chia-I
2007-01-01
The sharable content object reference model (SCORM) includes a representation of distance learning contents and a behavior definition of how users should interact with the contents. Generally, SCORM-compliant systems were based on multimedia and Web technologies on PCs. We further build a pervasive learning environment, which allows users to read…
Information System through ANIS at CeSAM
NASA Astrophysics Data System (ADS)
Moreau, C.; Agneray, F.; Gimenez, S.
2015-09-01
ANIS (AstroNomical Information System) is a generic web tool developed at CeSAM to facilitate and standardize the implementation of astronomical data of various kinds through private and/or public dedicated information systems. The architecture of ANIS is composed of a database server which contains the project data, a web user interface template which provides high-level services (search, extraction and display of imaging and spectroscopic data using a combination of criteria, an object list, an SQL query module or a cone search interface), a framework composed of several packages, and a metadata database managed by a web administration entity. The process to implement a new ANIS instance at CeSAM is easy and fast: the scientific project has to submit data or secure access to the data, the CeSAM team installs the new instance (web interface template and metadata database), and the project administrator can configure the instance with the ANIS web administration entity. Currently, CeSAM offers through ANIS web access to VO-compliant information systems for different projects (HeDaM, HST-COSMOS, CFHTLS-ZPhots, ExoDAT, ...).
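One of the VO services ANIS exposes is a cone search, an IVOA standard whose query takes RA, DEC and SR (search radius), all in decimal degrees. A minimal sketch of composing such a query; the base URL is a placeholder, and no request is actually issued.

```python
from urllib.parse import urlencode

def cone_search_url(base, ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search GET URL for sources within
    radius_deg of (ra_deg, dec_deg); all values in decimal degrees."""
    return base + "?" + urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})

# Placeholder endpoint; RA/DEC/SR are the standard parameter names.
url = cone_search_url("http://example.org/anis/conesearch", 150.1, 2.2, 0.05)
```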
40 CFR 63.4321 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2011 CFR
2011-07-01
... compliant material option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations in the affected... HAP concentration option for any web coating/printing operation(s) in the affected source for which...
40 CFR 63.4321 - How do I demonstrate initial compliance with the emission limitations?
Code of Federal Regulations, 2010 CFR
2010-07-01
... compliant material option for any individual web coating/printing operation, for any group of web coating/printing operations in the affected source, or for all the web coating/printing operations in the affected... HAP concentration option for any web coating/printing operation(s) in the affected source for which...
Koçkaya, Güvenç; Wertheimer, Albert
2011-06-01
The current study was designed to calculate the direct cost to the US health system of hypertensive patients' noncompliance. Understanding these expenses can inform screening and education budget policy regarding expenditure levels that can be calculated to be cost-beneficial. The study was conducted in 3 parts. First, a computer search of National Institutes of Health Web sites and the Web sites of professional societies whose members treat hypertension, together with a PubMed search, was performed to obtain the numbers required for calculations. Second, formulas were developed to estimate the risk of noncompliance and undiagnosed hypertension. Third, risk calculations were performed using the information obtained in part 1 and the formulas developed in part 2. Direct risk reduction for stroke caused by hypertension, heart attack, kidney disease, and heart disease was calculated for a 100% compliant strategy. Risk, case, and cost reductions for a 100% compliant strategy for hypertension were 32%, 8.5 million cases, and US$72 billion, respectively. Our analysis suggests that society could spend up to the cost of noncompliance on screening, education, and prevention efforts in an attempt to reduce these costly and traumatic sequelae of poorly controlled hypertension, in light of the published analysis.
Burgarella, Sarah; Cattaneo, Dario; Pinciroli, Francesco; Masseroli, Marco
2005-12-01
Improvements in bio-nano-technologies and biomolecular techniques have led to increasing production of high-throughput experimental data. Spotted cDNA microarray is one of the most widespread technologies, used in single research laboratories and in biotechnology service facilities. Although they are routinely performed, spotted microarray experiments are complex procedures entailing several experimental steps and actors with different technical skills and roles. During an experiment, the actors involved, who may also be located at a distance, need to access and share specific experiment information according to their roles. Furthermore, complete information describing all experimental steps must be collected in an orderly fashion to allow subsequent correct interpretation of experimental results. We developed MicroGen, a web system for managing information and workflow in the production pipeline of spotted microarray experiments. It consists of a core multi-database system able to store all data completely characterizing different spotted microarray experiments according to the Minimum Information About a Microarray Experiment (MIAME) standard, and of an intuitive and user-friendly web interface able to support the collaborative work required among the multidisciplinary actors and roles involved in spotted microarray experiment production. MicroGen supports six types of user roles: the researcher who designs and requests the experiment, the spotting operator, the hybridisation operator, the image processing operator, the system administrator, and the generic public user who can access the unrestricted part of the system to get information about MicroGen services. MicroGen represents a MIAME-compliant information system that enables managing workflow and supporting collaborative work in spotted microarray experiment production.
DADOS-Survey: an open-source application for CHERRIES-compliant Web surveys
Shah, Anand; Jacobs, Danny O; Martins, Henrique; Harker, Matthew; Menezes, Andreia; McCready, Mariana; Pietrobon, Ricardo
2006-01-01
Background The Internet has been increasingly utilized in biomedical research. From online searching for literature to data sharing, the Internet has emerged as a primary means of research for many physicians and scientists. As a result, Web-based surveys have been employed as an alternative to traditional, paper-based surveys. We describe DADOS-Survey, an open-source Web-survey application developed at our institution that, to the best of our knowledge, is the first to be compliant with the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). DADOS-Survey was designed with usability as a priority, allowing investigators to design and execute their own studies with minimal technical difficulties in doing so. Results To date, DADOS-Survey has been successfully implemented in five Institutional Review Board-approved studies conducted by various departments within our academic center. Each of these studies employed a Web-survey design as their primary methodology. Our initial experience indicates that DADOS-Survey has been used with relative ease by each of the investigators and survey recipients. This has been further demonstrated through formal and field usability testing, during which time suggestions for improvement were incorporated into the software design. Conclusion DADOS-Survey has the potential to have an important role in the future direction of Web-survey administration in biomedical research. This CHERRIES-compliant application is tailored to the emerging requirements of quality data collection in medicine. PMID:16978409
40 CFR 63.4292 - What operating limits must I meet?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Limitations § 63.4292 What operating limits must I meet? (a) For any web coating/printing operation, slashing operation, or dyeing/finishing operation on which you use the compliant material option; web coating..., you are not required to meet any operating limits. (b) For any controlled web coating/printing...
OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems
NASA Astrophysics Data System (ADS)
Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.
2009-12-01
An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open, collaborative way requires that the process match the expected code functionality to developers' personal expertise and organizational needs, as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source Perl, Python, Java and ASP toolkits and reference implementations are helping the marine community publish near-real-time observation data in interoperable standard formats. In some cases, publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files, a database, or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalog Service for the Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the "GetCapabilities" response of the SOS. OPENIOOS is the web client, developed in Perl, to visualize the sensors in the SOS services.
While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing, and ease of use has played a large role in spreading interoperable, standards-compliant web services widely in the marine community.
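The registry workflow above hinges on harvesting a service's GetCapabilities response. A sketch of the kind of OGC SOS key-value-pair request a client or registry harvester issues; the parameter names follow the standard SOS KVP binding, while the endpoint URL is a placeholder and no request is actually sent.

```python
from urllib.parse import urlencode

def sos_get_capabilities_url(endpoint):
    """Build the GetCapabilities URL whose XML response describes the
    service's observations and sensors (the metadata the CSW harvests)."""
    return endpoint + "?" + urlencode({
        "service": "SOS",
        "request": "GetCapabilities",
    })

url = sos_get_capabilities_url("http://example.org/sos")  # placeholder host
```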
Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures
ERIC Educational Resources Information Center
Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin
2006-01-01
Our article presents a pedagogical scenarios-based web application that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures, to edit pedagogical content…
Methodology for Localized and Accessible Image Formation and Elucidation
ERIC Educational Resources Information Center
Patil, Sandeep R.; Katiyar, Manish
2009-01-01
Accessibility is one of the key checkpoints in all software products, applications, and Web sites. Accessibility with digital images has always been a major challenge for the industry. Images form an integral part of certain type of documents and most Web 2.0-compliant Web sites. Individuals challenged with blindness and many dyslexics only make…
NASA Astrophysics Data System (ADS)
Plessel, T.; Szykman, J.; Freeman, M.
2012-12-01
EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high performance, data integrity and security.
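A sketch of an OGC-WCS-style subset request of the general kind rsigserver serves: one variable, clipped to a geographic box and a time range. The host, coverage name and exact parameters here are illustrative, following the generic WCS key-value-pair convention rather than rsigserver's documented API, and the URL is only composed, not fetched.

```python
from urllib.parse import urlencode

def wcs_subset_url(endpoint, coverage, bbox, time_range):
    """Compose a GetCoverage URL asking for one variable clipped to a
    lon/lat bounding box (west, south, east, north) and a time range."""
    return endpoint + "?" + urlencode({
        "SERVICE": "WCS",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,                   # e.g. a model ozone variable
        "BBOX": ",".join(str(v) for v in bbox),
        "TIME": time_range,
    })

url = wcs_subset_url(
    "http://example.org/rsig",              # placeholder endpoint
    "ozone",                                # hypothetical coverage name
    (-90.0, 30.0, -80.0, 40.0),
    "2012-07-01/2012-07-02",
)
```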
Biographer: web-based editing and rendering of SBGN compliant biochemical networks.
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-06-01
The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL.
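biographer imports networks from SBML, SBGN-ML and jSBGN. The abstract does not reproduce the jSBGN schema, so the node/edge JSON layout below is a hypothetical stand-in, used only to illustrate the import step of parsing a lightweight network exchange format.

```python
import json

# Invented jSBGN-like payload: nodes carry an SBGN class, edges link node ids.
network_json = """
{
  "nodes": [
    {"id": "glc", "sbgn_class": "simple chemical"},
    {"id": "hk",  "sbgn_class": "macromolecule"}
  ],
  "edges": [
    {"source": "glc", "target": "hk", "sbgn_class": "consumption"}
  ]
}
"""

def load_network(text):
    """Parse the JSON and index nodes by id so edges can be resolved
    before rendering or layout."""
    data = json.loads(text)
    nodes = {n["id"]: n for n in data["nodes"]}
    edges = [(e["source"], e["target"], e["sbgn_class"]) for e in data["edges"]]
    return nodes, edges

nodes, edges = load_network(network_json)
```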
VO-compliant libraries of high resolution spectra of cool stars
NASA Astrophysics Data System (ADS)
Montes, D.
2008-10-01
In this contribution we describe a Virtual Observatory (VO) compliant version of the libraries of high resolution spectra of cool stars described by Montes et al. (1997; 1998; and 1999). Since their publication the fully reduced spectra in FITS format have been available via ftp and in the World Wide Web. However, in the VO all the spectra will be accessible using a common web interface following the standards of the International Virtual Observatory Alliance (IVOA). These libraries include F, G, K and M field stars, from dwarfs to giants. The spectral coverage is from 3800 to 10000 Å, with spectral resolution ranging from 0.09 to 3.0 Å.
2008-06-01
We introduce World Wide Web Consortium (W3C) compliant services into the planning and battle management processes where a computer can be more...which the software services comprising the command, control, and battle management (C2BM) element of the BMD system need to operate within hard real
First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Manaud, N.; Gonzalez, J.
2014-04-01
We present a first prototype of a Web Map Interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) Standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for the Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of planetary GIS users, and define key areas of improvement for the future web-based PSA User Interface. Its web map interface shall provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology.
A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, shall enhance this user experience. In addition, users should be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised and analysed directly from GIS software, along with data from other sources. The following figure illustrates how the PSA web map interface and services shall fit into a typical planetary GIS user's working environment.
Biographer: web-based editing and rendering of SBGN compliant biochemical networks
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-01-01
Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737
Formula Manufacturers' Web Sites: Are They Really Non-Compliant Advertisements?
ERIC Educational Resources Information Center
Gunter, Barrie; Dickinson, Roger; Matthews, Julian; Cole, Jennifer
2013-01-01
Purpose: In the UK, advertising of infant formula products direct to consumers is not permitted. These products must be used on the recommendation of suitably qualified health or medical professionals. The aim of this study is to examine formula manufacturers' web sites to ascertain whether these are used as alternative forms of advertising that…
NASA Astrophysics Data System (ADS)
Morton, J. J.; Ferrini, V. L.
2015-12-01
The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for hundreds of thousands of data files curated in the Marine-Geo Digital Library has yielded precise geographic bounds, which enable geographic searches to an extent not previously possible. All MGDS map interfaces utilize the web services of the Global Multi-Resolution Topography (GMRT) synthesis for displaying global basemap imagery and for dynamically providing depth values at the cursor location.
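Programmatic access of the kind described usually reduces to templated REST URLs. A sketch of a bounding-box metadata search, with the endpoint path and parameter names assumed for illustration rather than taken from the actual MGDS API:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names, loosely modelled on a
# GeoWS-style bounding-box search; the real MGDS API may differ.
BASE = "https://www.marine-geo.org/services/search"

def search_url(data_type, north, south, east, west, output="json"):
    """Build a REST query URL for data files within a geographic box."""
    params = {
        "data_type": data_type,
        "north": north, "south": south,
        "east": east, "west": west,
        "format": output,
    }
    return BASE + "?" + urlencode(params)

url = search_url("bathymetry", north=10, south=-10, east=160, west=140)
```

A client would issue an HTTP GET on this URL and receive machine-readable metadata, which is what lets the same services drive both the map interface and third-party tools.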
Knowledge Repository for Fmea Related Knowledge
NASA Astrophysics Data System (ADS)
Cândea, Gabriela Simona; Kifor, Claudiu Vasile; Cândea, Ciprian
2014-11-01
This paper presents an innovative use of a knowledge system in the Failure Mode and Effects Analysis (FMEA) process, using an ontology to represent the knowledge. The knowledge system is built to serve the multi-project work that is nowadays in place at any manufacturing or service provider, where knowledge must be retained and reused at the company level, not only at the project level. The system follows the FMEA methodology, and the validation of the concept is compliant with the automotive industry standards published by the Automotive Industry Action Group, among others. Collaboration is assured through a web-based GUI that supports multiple-user access at any time.
Job submission and management through web services: the experience with the CREAM service
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Fina, S. D.; Ronco, S. D.; Dorigo, A.; Gianelle, A.; Marzolla, M.; Mazzucato, M.; Sgaravatto, M.; Verlato, M.; Zangrando, L.; Corvo, M.; Miccio, V.; Sciaba, A.; Cesini, D.; Dongiovanni, D.; Grandi, C.
2008-07-01
Modern Grid middleware is built around components providing basic functionality, such as data storage, authentication, security, job management, resource monitoring and reservation. In this paper we describe the Computing Resource Execution and Management (CREAM) service. CREAM provides a Web service-based job execution and management capability for Grid systems; in particular, it is being used within the gLite middleware. CREAM exposes a Web service interface allowing conforming clients to submit and manage computational jobs to a Local Resource Management System. We developed a special component, called ICE (Interface to CREAM Environment) to integrate CREAM in gLite. ICE transfers job submissions and cancellations from the Workload Management System, allowing users to manage CREAM jobs from the gLite User Interface. This paper describes some recent studies aimed at assessing the performance and reliability of CREAM and ICE; those tests have been performed as part of the acceptance tests for integration of CREAM and ICE in gLite. We also discuss recent work towards enhancing CREAM with a BES and JSDL compliant interface.
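Job submission through a BES/JSDL-style interface, as mentioned at the end of the abstract, centres on an XML job description. A minimal JSDL document can be sketched as follows (real CREAM submissions carry additional elements such as data staging and resource requirements; the namespaces are the published JSDL ones):

```python
import xml.etree.ElementTree as ET

JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

# Minimal JSDL job description: run /bin/hostname on the remote resource.
job = ET.Element("{%s}JobDefinition" % JSDL)
desc = ET.SubElement(job, "{%s}JobDescription" % JSDL)
app = ET.SubElement(desc, "{%s}Application" % JSDL)
posix = ET.SubElement(app, "{%s}POSIXApplication" % POSIX)
ET.SubElement(posix, "{%s}Executable" % POSIX).text = "/bin/hostname"

xml_text = ET.tostring(job, encoding="unicode")
```

A conforming client would wrap such a document in a submission call to the service, which hands the job to the underlying Local Resource Management System.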
HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation
NASA Astrophysics Data System (ADS)
Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.
2006-03-01
As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, whether living or deceased. HIPAA requires security services supporting the following implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. Among the controls proposed in the HIPAA Security Standards, audit trails are the focus here. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation, or forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed the HIPAA-Compliant Automatic Monitoring System for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running in each PACS component computer and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation, and gives preliminary results obtained with this monitoring system on a clinical RIS-integrated PACS.
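DICOM Supplement 95 audit messages follow an XML schema in the spirit of RFC 3881. The sketch below builds and re-parses one illustrative record; the element set is simplified and the attribute values are made up, though event ID 110103 ("DICOM Instances Accessed") is a real DICOM audit event code:

```python
import xml.etree.ElementTree as ET

# Simplified RFC 3881-style audit record, as used by DICOM Supplement 95
# "Audit Trail Messages". Values are illustrative only.
msg = ET.Element("AuditMessage")
event = ET.SubElement(msg, "EventIdentification", {
    "EventActionCode": "R",                  # R = read/view
    "EventDateTime": "2006-03-01T12:00:00",
    "EventOutcomeIndicator": "0",            # 0 = success
})
ET.SubElement(event, "EventID", {
    "code": "110103",
    "displayName": "DICOM Instances Accessed",
})
ET.SubElement(msg, "ActiveParticipant", {"UserID": "radiologist01"})

# A monitoring agent would serialize this and ship it to the Monitor Server,
# which parses it back and classifies the event.
parsed = ET.fromstring(ET.tostring(msg))
outcome = parsed.find("EventIdentification").get("EventOutcomeIndicator")
```

The Monitor Server side of the system described above essentially aggregates thousands of such records and rolls them up into the three reporting levels (system resources, applications, data access).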
Globus Identity, Access, and Data Management: Platform Services for Collaborative Science
NASA Astrophysics Data System (ADS)
Ananthakrishnan, R.; Foster, I.; Wagner, R.
2016-12-01
Globus is software-as-a-service for research data management, developed at, and operated by, the University of Chicago. Globus, accessible at www.globus.org, provides high speed, secure file transfer; file sharing directly from existing storage systems; and data publication to institutional repositories. 40,000 registered users have used Globus to transfer tens of billions of files totaling hundreds of petabytes between more than 10,000 storage systems within campuses and national laboratories in the US and internationally. Web, command line, and REST interfaces support both interactive use and integration into applications and infrastructures. An important component of the Globus system is its foundational identity and access management (IAM) platform service, Globus Auth. Both Globus research data management and other applications use Globus Auth for brokering authentication and authorization interactions between end-users, identity providers, resource servers (services), and a range of clients, including web, mobile, and desktop applications, and other services. Compliant with important standards such as OAuth, OpenID, and SAML, Globus Auth provides mechanisms required for an extensible, integrated ecosystem of services and clients for the research and education community. It underpins projects such as the US National Science Foundation's XSEDE system, NCAR's Research Data Archive, and the DOE Systems Biology Knowledge Base. Current work is extending Globus services to be compliant with FEDRAMP standards for security assessment, authorization, and monitoring for cloud services. We will present Globus IAM solutions and give examples of Globus use in various projects for federated access to resources. We will also describe how Globus Auth and Globus research data management capabilities enable rapid development and low-cost operations of secure data sharing platforms that leverage Globus services and integrate them with local policy and security.
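OAuth-style brokering of the kind Globus Auth performs centres on the authorization-code token exchange. The sketch below only builds the form body a client would POST to a token endpoint; the endpoint URL and credential values are placeholders, not Globus Auth specifics:

```python
from urllib.parse import urlencode

# Placeholder token endpoint; a real client would use the provider's
# documented URL and authenticate with its client secret.
TOKEN_URL = "https://auth.example.org/v2/oauth2/token"

# OAuth 2.0 authorization-code grant: exchange the code received on the
# redirect URI for an access token usable against resource servers.
payload = {
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_REDIRECT",       # placeholder, not a real code
    "redirect_uri": "https://myapp.example.org/callback",
    "client_id": "my-client-id",
}
body = urlencode(payload)
# The client POSTs `body` to TOKEN_URL; the JSON response carries the
# access token that downstream services validate.
```

This three-party flow (end user, identity provider, client) is what lets one IAM service mediate between web, mobile, and desktop applications without any of them handling user passwords directly.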
ERIC Educational Resources Information Center
Barnes, David G.; Fluke, Christopher J.; Jones, Nicholas T.; Maddison, Sarah T.; Kilborn, Virginia A.; Bailes, Matthew
2008-01-01
We adopt the Web 2.0 paradigm as a mechanism for preparing, editing, delivering and maintaining educational content, and for fostering ongoing innovation in the online education field. We report here on the migration of legacy course materials from "PowerPoint" slides on CD to a fully online delivery mode for use in the "Swinburne Astronomy…
SWS: accessing SRS sites contents through Web Services.
Romano, Paolo; Marra, Domenico
2008-03-26
Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking these technologies into account, but many existing databases do not allow programmatic access. Only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of WS that can query both srsdb, for information on sites and databases, and SRS sites. SWS returns results in a text-only format and can be accessed through a WSDL-compliant client. SWS enables interoperability between workflow systems and SRS implementations, by also managing access to alternative sites, in order to cope with network and maintenance problems, and selecting the most up-to-date among available systems. The development and implementation of Web Services allowing programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making biological databanks that are managed in public SRS sites available through a programmatic interface.
Compliant flooring to prevent fall-related injuries: a scoping review protocol.
Lachance, Chantelle C; Jurkowski, Michal P; Dymarz, Ania C; Mackey, Dawn C
2016-08-16
Fall-related injuries can have serious consequences for older adults, including increased risk of dependence in daily activities and mortality. Compliant flooring is a passive intervention that may reduce the incidence and severity of fall-related injuries in healthcare settings, including acute and long-term care, but few sites have implemented compliant flooring, in part because synthesised evidence about key performance aspects has not been available. We will conduct a scoping review to address the question: what is presented about the biomechanical efficacy, clinical effectiveness, cost-effectiveness, and workplace safety associated with compliant flooring systems that aim to prevent fall-related injuries? We will conduct a comprehensive and systematic literature search of academic databases (AgeLine, CINAHL, EBM Reviews, MEDLINE (Ovid), SportDiscus and Web of Science) and grey literature (clinical trial registries, theses/dissertations, abstracts/conference proceedings and relevant websites). 2 team members will independently screen records (first titles and abstracts, then full text) and extract data from included records. Numerical and narrative analyses will be presented by theme (biomechanical efficacy, clinical effectiveness, cost-effectiveness, workplace safety). This scoping review responds to the information needs of healthcare decision-makers tasked with preventing fall-related injuries. This review will summarise evidence about compliant flooring as a potential intervention for preventing fall-related injuries in older adults and identify gaps in evidence and new avenues for research. Results will be especially useful in long-term care, but also applicable in acute care, assisted living and home care. We will disseminate the review's findings via open-access publications, conference presentations, a webinar, a Stakeholder Symposium and a Knowledge-to-Action Report. Published by the BMJ Publishing Group Limited. 
The Information System at CeSAM
NASA Astrophysics Data System (ADS)
Agneray, F.; Gimenez, S.; Moreau, C.; Roehlly, Y.
2012-09-01
Modern large observational programmes produce large amounts of data from various origins, and need high-level quality control, fast data access via easy-to-use graphical interfaces, as well as the possibility to cross-correlate information coming from different observations. The Centre de donnéeS Astrophysique de Marseille (CeSAM) offers web access to VO-compliant Information Systems for accessing data of different projects (VVDS, HeDAM, EXODAT, HST-COSMOS,…), including ancillary data obtained outside Laboratoire d'Astrophysique de Marseille (LAM) control. The CeSAM Information Systems provide catalogue downloads and additional services to search, extract and display imaging and spectroscopic data through multi-criteria and Cone Search interfaces.
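An IVOA Simple Cone Search query, of the kind such interfaces expose, is just an HTTP GET request with RA, DEC (decimal degrees, ICRS) and a search radius SR; the base URL below is a placeholder, not an actual CeSAM endpoint:

```python
from urllib.parse import urlencode

# Placeholder Cone Search base URL; a real service advertises its own.
BASE = "https://cesam.example.fr/vvds/conesearch"

def cone_search_url(ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search request (RA/DEC/SR in degrees)."""
    return BASE + "?" + urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})

# All sources within 0.05 degrees of (RA=150.1, DEC=+2.2)
url = cone_search_url(150.1, 2.2, 0.05)
```

The service answers with a VOTable of matching sources, which is what allows any VO-aware client to query the archive without knowing its internal schema.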
Executing SADI services in Galaxy.
Aranguren, Mikel Egaña; González, Alejandro Rodríguez; Wilkinson, Mark D
2014-01-01
In recent years Galaxy has become a popular workflow management system in bioinformatics, due to its ease of installation, use and extension. The availability of Semantic Web-oriented tools in Galaxy, however, is limited. This is also the case for Semantic Web Services such as those provided by the SADI project, i.e. services that consume and produce RDF. Here we present SADI-Galaxy, a tool generator that deploys selected SADI Services as typical Galaxy tools. SADI-Galaxy is a Galaxy tool generator: through SADI-Galaxy, any SADI-compliant service becomes a Galaxy tool that can participate in other outstanding features of Galaxy such as data storage, history, workflow creation, and publication. Galaxy can also be used to execute and combine SADI services as it does with other Galaxy tools. Finally, we have semi-automated the packing and unpacking of data into RDF such that other Galaxy tools can easily be combined with SADI services, plugging the rich SADI Semantic Web Service environment into the popular Galaxy ecosystem. SADI-Galaxy bridges the gap between Galaxy, an easy-to-use but "static" workflow system with a wide user-base, and SADI, a sophisticated, semantic, discovery-based framework for Web Services, thus benefiting both user communities.
Kamauu, Aaron W C; DuVall, Scott L; Wiggins, Richard H; Avrin, David E
2008-09-01
In the creation of interesting radiological cases in a digital teaching file, it is necessary to adjust the window and level settings of an image to effectively display the educational focus. The web-based applet described in this paper presents an effective solution for real-time window and level adjustments without leaving the picture archiving and communications system workstation. Optimized images are created, as user-defined parameters are passed between the applet and a servlet on the Health Insurance Portability and Accountability Act-compliant teaching file server.
GODIVA2: interactive visualization of environmental data on the Web.
Blower, J D; Haines, K; Santokhee, A; Liu, C L
2009-03-13
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
Sirepo for Synchrotron Radiation Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, Robert; Moeller, Paul; Rakitin, Maksim
Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is the Synchrotron Radiation Workshop (SRW). SRW computes synchrotron radiation from relativistic electrons in arbitrary magnetic fields and propagates the radiation wavefronts through optical beamlines. SRW is open source and is primarily supported by Dr. Oleg Chubar of NSLS-II at Brookhaven National Laboratory.
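The WSGI standard that Werkzeug implements defines a Python web application as a callable taking an `environ` dict and a `start_response` callback. A minimal, self-contained illustration (not Sirepo code) that exercises such an app in-process, without an HTTP server:

```python
from wsgiref.util import setup_testing_defaults

# A complete WSGI application: any WSGI-compliant server (Werkzeug,
# Gunicorn, wsgiref.simple_server, ...) can host this callable.
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello from a WSGI app"]

# Exercise the app directly, supplying a synthetic environ and capturing
# the status passed to start_response.
environ = {}
setup_testing_defaults(environ)
captured = {}

def start_response(status, headers):
    captured["status"] = status

body = b"".join(app(environ, start_response))
```

Standardising on this one callable interface is what lets a proxy like Nginx sit in front of interchangeable Python servers without the application changing.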
NASA Astrophysics Data System (ADS)
Teng, W.; Chiu, L.; Kempler, S.; Liu, Z.; Nadeau, D.; Rui, H.
2006-12-01
Using NASA satellite remote sensing data from multiple sources for hydrologic applications can be a daunting task and requires a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. In order to facilitate such investigations, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure or "Giovanni," which supports a family of Web interfaces (instances) that allow users to perform interactive visualization and analysis online without downloading any data. Two such Giovanni instances are particularly relevant to hydrologic applications: the Tropical Rainfall Measuring Mission (TRMM) Online Visualization and Analysis System (TOVAS) and the Agricultural Online Visualization and Analysis System (AOVAS), both highly popular and widely used for a variety of applications, including those related to several NASA Applications of National Priority, such as Agricultural Efficiency, Disaster Management, Ecological Forecasting, Homeland Security, and Public Health. Dynamic, context-sensitive Web services provided by TOVAS and AOVAS enable users to seamlessly access NASA data from within, and deeply integrate the data into, their local client environments. One example is between TOVAS and Florida International University's TerraFly, a Web-enabled system that serves a broad segment of the research and applications community, by facilitating access to various textual, remotely sensed, and vector data. Another example is between AOVAS and the U.S. Department of Agriculture Foreign Agricultural Service (USDA FAS)'s Crop Explorer, the primary decision support tool used by FAS to monitor the production, supply, and demand of agricultural commodities worldwide.
AOVAS is also part of GES DISC's Agricultural Information System (AIS), which can operationally provide satellite remote sensing data products (e.g., near-real-time rainfall) and analysis services to agricultural users. AIS enables the remote, interoperable access to distributed data, by using the GrADS-Data Server (GDS) and the Open Geospatial Consortium (OGC)-compliant MapServer. The latter allows the access of AIS data from any OGC-compliant client, such as the Earth-Sun System Gateway (ESG) or Google Earth. The Giovanni system is evolving towards a Service-Oriented Architecture and is highly customizable (e.g., adding new products or services), thus availing the hydrologic applications user community of Giovanni's simple-to-use and powerful capabilities to improve decision-making.
Reproductive and Hormonal Risk Factors for Breast Cancer in Blind Women
2005-06-01
evidence for possible effects of exposure to light at night (LAN) on cancer risk due to the increased use of modern electric lighting (2-8...being collected via; e) We have developed and finalized informed consent procedures for the range of media being used which have been formally...have developed the questionnaire for web use <www.bvihealthsurvey.bwh.harvard.edu>. The web-site is Section 508- and W3C-compliant as required for
NASA Astrophysics Data System (ADS)
Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.
2007-12-01
The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow.
Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be OGC-compliant specifications) also provide access.
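Subsetting a gridded dataset by region, time period and variable before download, as described above, amounts to a parameterised request in the style of the THREDDS NetCDF Subset Service; the host, dataset path and exact parameter names here are illustrative only:

```python
from urllib.parse import urlencode

# Illustrative NetCDF-Subset-Service-style endpoint; a real THREDDS server
# publishes its own dataset paths and catalogs.
BASE = "https://thredds.example.gov/ncss/grid/radar/dataset"

# Request only one variable, over a geographic box and a one-day window,
# returned as NetCDF (readable by ArcGIS and most analysis tools).
params = {
    "var": "reflectivity",
    "north": 40.0, "south": 35.0, "east": -75.0, "west": -80.0,
    "time_start": "2007-06-01T00:00:00Z",
    "time_end": "2007-06-02T00:00:00Z",
    "accept": "netcdf",
}
url = BASE + "?" + urlencode(params)
```

Server-side subsetting like this is what keeps downloads tractable: the user receives only the slab of the multidimensional archive they asked for.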
NASA Astrophysics Data System (ADS)
Moreau, N.; Dubernet, M. L.
2006-07-01
Basecol is a combination of a website (using PHP and HTML) and a MySQL database concerning molecular ro-vibrational transitions induced by collisions with atoms or molecules. This database has been created in view of the scientific preparation of the Heterodyne Instrument for the Far-Infrared on board the Herschel Space Observatory (HSO). Basecol offers access to numerical and bibliographic data through various output methods such as ASCII, HTML or VOTable (which is a first step towards a VO-compliant system). A web service using Apache Axis has been developed in order to provide direct access to data for external applications.
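A VOTable is ordinary XML, which is what makes it a convenient step towards VO compliance. A stripped-down example (namespace omitted for brevity, field names and values invented) parsed with the standard library:

```python
import xml.etree.ElementTree as ET

# Miniature VOTable with one collisional-data row; illustrative only.
votable_xml = """
<VOTABLE>
  <RESOURCE>
    <TABLE>
      <FIELD name="transition" datatype="char"/>
      <FIELD name="rate_coeff" datatype="double"/>
      <DATA><TABLEDATA>
        <TR><TD>1-0</TD><TD>3.2e-11</TD></TR>
      </TABLEDATA></DATA>
    </TABLE>
  </RESOURCE>
</VOTABLE>
"""

root = ET.fromstring(votable_xml)
fields = [f.get("name") for f in root.iter("FIELD")]       # column names
rows = [[td.text for td in tr] for tr in root.iter("TR")]  # data rows
```

Because the FIELD metadata travels with the data, any VO-aware client can interpret the table without out-of-band documentation.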
Kouchri, Farrokh Mohammadzadeh
2012-11-06
A Voice over Internet Protocol (VoIP) communications system, a method of managing a communications network in such a system and a program product therefore. The system/network includes an ENERGY STAR (E-star) aware softswitch and E-star compliant communications devices at system endpoints. The E-star aware softswitch allows E-star compliant communications devices to enter and remain in power saving mode. The E-star aware softswitch spools messages and forwards only selected messages (e.g., calls) to the devices in power saving mode. When the E-star compliant communications devices exit power saving mode, the E-star aware softswitch forwards spooled messages.
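The spool-and-forward behaviour described in the claim can be sketched as a toy softswitch that queues non-urgent messages for devices in power-saving mode and forwards only selected, urgent ones (e.g. calls) immediately; this is a conceptual illustration, not the patented implementation:

```python
from collections import deque

class Softswitch:
    """Toy E-star-aware softswitch: spool messages for sleeping endpoints."""

    def __init__(self):
        self.sleeping = set()    # devices currently in power-saving mode
        self.spool = {}          # per-device queue of held messages
        self.delivered = []      # (device, message) pairs actually forwarded

    def enter_power_save(self, device):
        self.sleeping.add(device)
        self.spool.setdefault(device, deque())

    def send(self, device, message, urgent=False):
        if device in self.sleeping and not urgent:
            self.spool[device].append(message)       # hold until wake-up
        else:
            self.delivered.append((device, message))  # forward immediately

    def wake(self, device):
        self.sleeping.discard(device)
        while self.spool.get(device):                 # flush held messages
            self.delivered.append((device, self.spool[device].popleft()))

sw = Softswitch()
sw.enter_power_save("phone1")
sw.send("phone1", "status-update")                # spooled
sw.send("phone1", "incoming-call", urgent=True)   # forwarded immediately
sw.wake("phone1")                                 # spool flushed on wake-up
```

The design point is that the endpoint can stay in power-saving mode indefinitely: only traffic that justifies waking it (a call) reaches it before the spool is flushed.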
E-DECIDER Decision Support Gateway For Earthquake Disaster Response
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.
2013-12-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). 
The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
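KML products like those the gateway serves are straightforward to emit programmatically. A minimal Placemark document built with the standard library, using the published OGC KML 2.2 namespace (the name and coordinates are illustrative, not E-DECIDER output):

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

# One Placemark such as a service might emit for a forecast or damage point.
kml = ET.Element("{%s}kml" % KML_NS)
pm = ET.SubElement(kml, "{%s}Placemark" % KML_NS)
ET.SubElement(pm, "{%s}name" % KML_NS).text = "Forecast point"
point = ET.SubElement(pm, "{%s}Point" % KML_NS)
# KML coordinates are longitude,latitude[,altitude]
ET.SubElement(point, "{%s}coordinates" % KML_NS).text = "-118.25,34.05,0"

doc = ET.tostring(kml, encoding="unicode")
```

Emitting standards-compliant KML (or WMS/WFS/WCS) means consumers as different as Google Earth, a mobile client, and decision-support middleware can all render the same product without custom integration.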
Nachshon, Liat; Goldberg, Michael R; Elizur, Arnon; Levy, Michael B; Schwartz, Naama; Katz, Yitzhak
2015-06-01
Reactions during the home treatment phase of oral immunotherapy (OIT) are not uncommon. An ongoing accurate reporting of home treatment outcomes is crucial for the safety and success of OIT. Previous reports have shown that as few as 20% of patients are truly compliant with paper-based diaries. To develop a Web site-based electronic reporting system (web-RS) for monitoring home treatment during OIT for food allergy. A web-RS was developed, incorporating a thorough questionnaire querying for pertinent data including the dose(s) consumed, occurrence and details of adverse reactions, treatment(s), and relevant potential exacerbating factors. All patients enrolled in milk, peanut, or egg OIT programs for at least 4 weeks from November 2012 through January 2014 were introduced to web-RS (n = 157). Successful reporting through web-RS was defined by consecutive reporting during the first home treatment phase (24 days) after its introduction. Comparisons were made with a previous group of OIT-treated patients (n = 100) who reported by E-mail. Successful reporting was achieved by 142 of 157 patients (90.44%) in contrast to a 75% success rate with E-mail (P = .0009). The odds for successful reporting using web-RS were 3.1 (95% confidence interval 1.6-6.3) times higher compared with using E-mail. Mild reactions were reported more frequently with web-RS (P = .0032). Patient reports were constantly available in real time for medical staff review. No complaints regarding web-RS feasibility were reported. One risk factor for failure to use web-RS was a patient's prior successful OIT experience without using web-RS (P = .012). A web-RS can be a powerful tool for improving OIT safety by achieving a high level of patient cooperation in reporting home treatment results. Copyright © 2015 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
77 FR 427 - EPAAR Clause for Compliance With EPA Policies for Information Resources Management
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
... the IRM policies, standards, and procedures set forth on the Office of Environmental Information..., standards, and procedures. (c) Section 508 requirements. Contract deliverables are required to be compliant... contracts. This revision incorporates to the EPAAR, administrative changes to update terminology and Web...
van Rooij, S B T; Peluso, J P; Sluzewski, M; Kortman, H G; van Rooij, W J
2018-05-01
The Woven EndoBridge (WEB) is an intrasaccular flow diverter intended to treat wide-neck aneurysms. Until recently, even the smallest WEB sizes required a 0.021-inch microcatheter. A lower profile range of WEBs compliant with a 0.017-inch microcatheter (WEB 17) has now been introduced. We present the first clinical results of treatment of both ruptured and unruptured aneurysms with the WEB 17. Between December 2016 and September 2017, forty-six aneurysms in 40 patients were treated with the WEB 17. No supporting stents or balloons were used. Twenty-five aneurysms were ruptured (54%). There were 6 men and 34 women (mean age, 62 years; median, 63 years; range, 46-87 years). The mean aneurysm size was 4.9 mm (median, 5 mm; range, 2-7 mm). There were 2 thromboembolic procedural complications without clinical sequelae and no ruptures. The overall permanent procedural complication rate was 0% (0 of 40; 97.5% CI, 0%-10.4%). Imaging follow-up at 3 months was available in 33 patients with 39 aneurysms (97.5% of 40 eligible aneurysms). In 1 aneurysm, the detached WEB was undersized and the remnant was additionally treated with coils after 1 week. This same aneurysm reopened at 3 months and was again treated with a second WEB. One other aneurysm showed persistent WEB filling at 3 months. Complete occlusion was achieved in 28 of 39 aneurysms (72%), and 9 aneurysms (23%) showed a neck remnant. The WEB 17 is safe and effective for both ruptured and unruptured aneurysms. The WEB 17 is a valuable addition to the existing WEB size range, especially for very small aneurysms. © 2018 by American Journal of Neuroradiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, Robert; Moeller, Paul
Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5-compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp. Warp is a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp's capabilities. Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.
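As an aside on the stack described above, Werkzeug's WSGI compliance means the server ultimately speaks the standard WSGI callable interface. A minimal sketch of that interface, using only the Python standard library (illustrative only, not Sirepo's actual code):

```python
def simple_app(environ, start_response):
    # WSGI contract: the server calls the application with the request
    # environ dict and a start_response callback; the application
    # returns an iterable of bytes for the response body.
    path = environ.get("PATH_INFO", "/")
    body = ("hello from " + path).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

Frameworks such as Flask wrap this contract in routing and request objects, but any WSGI server (for instance one sitting behind an Nginx proxy) can host a callable of this shape.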
A National Crop Progress Monitoring System Based on NASA Earth Science Results
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.
2011-12-01
Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed, making intensive use of data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratio change relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with NASS' remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To meet operational requirements and increase the accessibility of data and products for different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of the Open Geospatial Consortium Inc. Sensor observations and data are accessed through Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) if available. Products are also served through such open-specification-compliant services. For rendering and presentation, Web Map Service (WMS) is used. A Web-service-based system is set up and deployed at dss.csiss.gmu.edu/NDVIDownload.
Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
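The vegetation indices named in this record follow standard definitions; the arithmetic can be sketched as follows (textbook formulas, not the project's own code):

```python
def ndvi(nir, red):
    # Normalized difference vegetation index from surface reflectances
    return (nir - red) / (nir + red)

def vci(ndvi_now, ndvi_min, ndvi_max):
    # Vegetation condition index: current NDVI scaled between the
    # multi-year minimum and maximum for the same pixel and period,
    # expressed in percent
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)
```

In practice these would run over whole MOD09 raster arrays rather than scalars, with masking for clouds and water.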
An efficient architecture to support digital pathology in standard medical imaging repositories.
Marques Godinho, Tiago; Lebre, Rui; Silva, Luís Bastião; Costa, Carlos
2017-07-01
In the past decade, digital pathology and whole-slide imaging (WSI) have been gaining momentum with the proliferation of digital scanners from different manufacturers. The literature reports significant advantages associated with the adoption of digital images in pathology, namely, improvements in diagnostic accuracy and better support for telepathology. Moreover, it also offers new clinical and research applications. However, numerous barriers have been slowing the adoption of WSI, among which the most important are performance issues associated with storage and distribution of huge volumes of data, and lack of interoperability with other hospital information systems, most notably Picture Archive and Communications Systems (PACS) based on the DICOM standard. This article proposes an architecture of a Web Pathology PACS fully compliant with DICOM standard communications and data formats. The solution includes a PACS Archive responsible for storing whole-slide imaging data in DICOM WSI format and offers a communication interface based on the most recent DICOM Web services. The second component is a zero-footprint viewer that runs in any web-browser. It consumes data using the PACS archive standard web services. Moreover, it features a tiling engine especially suited to deal with the WSI image pyramids. These components were designed with special focus on efficiency and usability. The performance of our system was assessed through a comparative analysis of the state-of-the-art solutions. The results demonstrate that it is possible to have a very competitive solution based on standard workflows. Copyright © 2017 Elsevier Inc. All rights reserved.
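For context on the "most recent DICOM Web services" this record mentions: WSI tiles are exposed as frames of a multi-frame DICOM instance, retrievable over WADO-RS. A sketch of building such a request (the base URL and UIDs are hypothetical):

```python
def wsi_frame_request(base_url, study_uid, series_uid, instance_uid, frame):
    # WADO-RS frame retrieval URL: one pyramid tile per frame
    url = "{}/studies/{}/series/{}/instances/{}/frames/{}".format(
        base_url, study_uid, series_uid, instance_uid, frame)
    # The Accept header negotiates the tile encoding with the archive
    headers = {"Accept": 'multipart/related; type="image/jpeg"'}
    return url, headers
```

A zero-footprint viewer like the one described would issue one such GET per visible tile and assemble the current pyramid level client-side.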
VisIVO: A Tool for the Virtual Observatory and Grid Environment
NASA Astrophysics Data System (ADS)
Becciani, U.; Comparato, M.; Costa, A.; Larsson, B.; Gheller, C.; Pasian, F.; Smareglia, R.
2007-10-01
We present the new features of VisIVO, software for the visualization and analysis of astrophysical data, including data retrieved from the Virtual Observatory framework and output from cosmological simulations; it runs on both Windows and GNU/Linux platforms. VisIVO is VO standards compliant and supports the most important astronomical data formats, such as FITS, HDF5 and VOTables. It is free software and can be downloaded from the web site http://visivo.cineca.it. VisIVO can interoperate with other VO-compliant astronomical tools through PLASTIC (PLatform for AStronomical Tool InterConnection). This feature allows VisIVO to share data with many other astronomical packages to further analyze the loaded data.
MendeLIMS: a web-based laboratory information management system for clinical genome sequencing.
Grimes, Susan M; Ji, Hanlee P
2014-08-27
Large clinical genomics studies using next generation DNA sequencing require the ability to select and track samples from a large population of patients through many experimental steps. With the number of clinical genome sequencing studies increasing, it is critical to maintain adequate laboratory information management systems to manage the thousands of patient samples that are subject to this type of genetic analysis. To meet the needs of clinical population studies using genome sequencing, we developed a web-based laboratory information management system (LIMS) with a flexible configuration that is adaptable to the continuously evolving experimental protocols of next generation DNA sequencing technologies. Our system, referred to as MendeLIMS, is easily implemented with open source tools and is highly configurable and extensible. MendeLIMS has been invaluable in the management of our clinical genome sequencing studies. We maintain a publicly available demonstration version of the application for evaluation purposes at http://mendelims.stanford.edu. MendeLIMS is programmed in Ruby on Rails (RoR) and accesses data stored in SQL-compliant relational databases. Software is freely available for non-commercial use at http://dna-discovery.stanford.edu/software/mendelims/.
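As an illustration of the SQL-compliant relational storage this record describes, sample tracking reduces to a few tables and status updates. The layout below is a toy sketch, not MendeLIMS's actual schema (which is implemented in Ruby on Rails):

```python
import sqlite3

# In-memory database standing in for the LIMS backend
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patients (id INTEGER PRIMARY KEY, external_id TEXT NOT NULL);
CREATE TABLE samples (
    id         INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patients(id),
    status     TEXT NOT NULL DEFAULT 'received'
);
""")
conn.execute("INSERT INTO patients (external_id) VALUES ('P-001')")
conn.execute("INSERT INTO samples (patient_id) VALUES (1)")
# Advance the sample through an experimental step
conn.execute("UPDATE samples SET status = 'library_prep' WHERE id = 1")
status = conn.execute("SELECT status FROM samples WHERE id = 1").fetchone()[0]
```

The point of the relational approach is that each protocol change becomes a schema or status-vocabulary change rather than a rewrite of the tracking logic.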
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Reißland, Sven; Schulz, Jana
2013-04-01
On November 27-28, 2012, the Kandilli Observatory and Earthquake Research Institute (KOERI) and the Portuguese Institute for the Sea and Atmosphere (IPMA) joined other countries in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region as participants in an international tsunami response exercise. The exercise, titled NEAMWave12, simulated widespread Tsunami Watch situations throughout the NEAM region. It was the first international exercise of its kind in this region, in which the UNESCO-IOC ICG/NEAMTWS tsunami warning chain was tested at full scale with different systems. One of these systems is developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) and was validated in this exercise by, among others, KOERI and IPMA. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the NEAM region. The TRIDEC system will be implemented in three phases, each with a demonstrator; successively, the demonstrators address related challenges. The first- and second-phase system demonstrators, deployed in KOERI's crisis management room and at IPMA, were designed and implemented, first, to support plausible scenarios for the Turkish and Portuguese NTWCs and to demonstrate the treatment of simulated tsunami threats with an essential subset of an NTWC, and second, to demonstrate the feasibility and potential of the implemented approach, covering ICG/NEAMTWS standard operations as well as tsunami detection and alerting functions beyond ICG/NEAMTWS requirements.
The demonstrator presented addresses information management and decision-support processes for hypothetical tsunami-related crisis situations in the context of the ICG/NEAMTWS NEAMWave12 exercise for the Turkish and Portuguese tsunami exercise scenarios. Impressions gained with the standards compliant TRIDEC system during the exercise will be reported. The system version presented is based on event-driven architecture (EDA) and service-oriented architecture (SOA) concepts and is making use of relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using OGC Web Map Service (WMS) and Web Feature Service (WFS) spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). This demonstration is linked with the talk 'Experiences with TRIDEC's Crisis Management Demonstrator in the Turkish NEAMWave12 exercise tsunami scenario' (EGU2013-2833) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.6).
NASA Astrophysics Data System (ADS)
Ma, Kevin; Wang, Ximing; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent
2015-03-01
In the past, we have developed and displayed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification, with results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatments and disease tracking. This year, we have further developed the eFolder system to handle big data analysis and data mining in today's medical imaging field. The database has been updated to allow data mining and data look-up from DICOM-SR lesion analysis contents. Longitudinal studies are tracked, and any changes in lesion volumes and brain parenchyma volumes are calculated and shown on the web-based user interface as graphical representations. Longitudinal lesion characteristic changes are compared with patients' disease history, including treatments, symptom progressions, and any other changes in the disease profile. The image viewer is updated such that imaging studies can be viewed side-by-side to allow visual comparisons. We aim to use the web-based medical imaging informatics eFolder system to demonstrate big data analysis in medical imaging, and use the analysis results to predict MS disease trends and patterns in Hispanic and Caucasian populations in our pilot study. The discovery of disease patterns among the two ethnicities is a big data analysis result that will help lead to personalized patient care and treatment planning.
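The longitudinal tracking described here amounts to comparing each follow-up study's lesion volume against a baseline; a minimal sketch of that calculation (illustrative only, not the eFolder's DICOM-SR pipeline):

```python
def longitudinal_changes(volumes_ml):
    # Percent change of each follow-up lesion volume relative to the
    # first (baseline) study in the series
    baseline = volumes_ml[0]
    return [100.0 * (v - baseline) / baseline for v in volumes_ml[1:]]
```

Such a series, plotted over study dates, is the kind of graphical representation a web UI would render alongside the treatment history.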
Building Geospatial Web Services for Ecological Monitoring and Forecasting
NASA Astrophysics Data System (ADS)
Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.
2008-12-01
The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for infrastructure to remotely browse, visualize, and analyze the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
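A WMS endpoint like the one described is driven by key-value GetMap requests; a sketch of assembling one (the endpoint and layer name below are hypothetical):

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height, time=None):
    # Standard WMS 1.1.1 GetMap key-value parameters
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time is not None:
        params["TIME"] = time  # temporal dimension, where the layer supports it
    return endpoint + "?" + urlencode(params)
```

The optional TIME parameter is exactly the temporal dimension the abstract flags as poorly supported by many WMS clients.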
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
... at http://www.sec.gov and also on the Exchange's Internet Web site at http://nasdaqomxbx... Options Order Protection and Locked/Crossed Market Plan (``Decentralized Plan'').\\6\\ \\5\\ See Securities... mechanisms to remain fully compliant with the Decentralized Plan and BOX Rules and to no longer rely upon a...
40 CFR 63.4322 - How do I demonstrate continuous compliance with the emission limitations?
Code of Federal Regulations, 2010 CFR
2010-07-01
... which the mass fraction of organic HAP, determined according to the requirements of § 63.4321(e)(1)(iv... report required by § 63.4311, you must identify any web coating/printing operation, slashing operation, or dyeing/finishing operation for which you used the compliant material option. If there were no...
A flexible geospatial sensor observation service for diverse sensor data based on Web service
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min
Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes the development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services — Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the OGC Sensor Web Enablement (SWE) specifications is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
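The "core" GetObservation operation named above is typically invoked with a small XML request; a simplified sketch follows, trimmed to the SOS namespace only (a real request also carries om:, ows: and ogc: namespaces plus temporal filters; the offering and property values are hypothetical):

```python
import xml.etree.ElementTree as ET

SOS_NS = "http://www.opengis.net/sos/1.0"

def get_observation_request(offering, observed_property):
    # Minimal SOS 1.0 GetObservation body
    root = ET.Element("{%s}GetObservation" % SOS_NS,
                      {"service": "SOS", "version": "1.0.0"})
    ET.SubElement(root, "{%s}offering" % SOS_NS).text = offering
    ET.SubElement(root, "{%s}observedProperty" % SOS_NS).text = observed_property
    return ET.tostring(root, encoding="unicode")
```

A client would POST this document to the servlet endpoint and receive an O&M observation collection in response.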
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
Grist : grid-based data mining for astronomy
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden;
2004-01-01
The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.
Grist: Grid-based Data Mining for Astronomy
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Katz, D. S.; Miller, C. D.; Walia, H.; Williams, R. D.; Djorgovski, S. G.; Graham, M. J.; Mahabal, A. A.; Babu, G. J.; vanden Berk, D. E.; Nichol, R.
2005-12-01
The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the ``hyperatlas'' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.
Web client and ODBC access to legacy database information: a low cost approach.
Sanders, N. W.; Mann, N. H.; Spengler, D. M.
1997-01-01
A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously this data was stored only in the medical center's mainframe DB2 database; it is now additionally stored in a departmental SQL database. Access to this data is available via any ODBC-compliant front-end or a web client. With a small budget and no full time staff, we were able to give our department on-line access to many years' worth of patient data that was previously inaccessible. PMID:9357735
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real-time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies presented many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available in common standards formats, "home grown" software has been developed; one example is software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
Lubowitz, James H; Smith, Patrick A
2012-03-01
In 2011, postsurgical patient outcome data may be compiled in a research registry, allowing comparative-effectiveness research and cost-effectiveness analysis by use of Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, Food and Drug Administration-approved, remote, Web-based data collection systems. Computerized automation minimizes cost and minimizes surgeon time demand. A research registry can be a powerful tool to observe and understand variations in treatment and outcomes, to examine factors that influence prognosis and quality of life, to describe care patterns, to assess effectiveness, to monitor safety, and to change provider practice through feedback of data. Registry of validated, prospective outcome data is required for arthroscopic and related researchers and the public to advocate with governments and health payers. The goal is to develop evidence-based data to determine the best methods for treating patients. Copyright © 2012 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Systems and Methods for Implementing Bulk Metallic Glass-Based Macroscale Compliant Mechanisms
NASA Technical Reports Server (NTRS)
Hofmann, Douglas C. (Inventor); Agnes, Gregory (Inventor)
2017-01-01
Systems and methods in accordance with embodiments of the invention implement bulk metallic glass-based macroscale compliant mechanisms. In one embodiment, a bulk metallic glass-based macroscale compliant mechanism includes: a flexible member that is strained during the normal operation of the compliant mechanism; where the flexible member has a thickness of 0.5 mm; where the flexible member comprises a bulk metallic glass-based material; and where the bulk metallic glass-based material can survive a fatigue test that includes 1000 cycles under a bending loading mode at an applied stress to ultimate strength ratio of 0.25.
The evolution of the CUAHSI Water Markup Language (WaterML)
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Valentine, D.; Maidment, D.; Tarboton, D. G.; Whiteaker, T.; Hooper, R.; Kirschtel, D.; Rodriguez, M.
2009-04-01
The CUAHSI Hydrologic Information System (HIS, his.cuahsi.org) uses web services as the core data exchange mechanism, providing programmatic connection between many heterogeneous sources of hydrologic data and a variety of online and desktop client applications. The service message schema follows the CUAHSI Water Markup Language (WaterML) 1.x specification (see OGC Discussion Paper 07-041r1). Data sources that can be queried via WaterML-compliant water data services include national and international repositories such as USGS NWIS (National Water Information System), USEPA STORET (Storage & Retrieval), USDA SNOTEL (Snowpack Telemetry), NCDC ISH and ISD (Integrated Surface Hourly and Daily Data), MODIS (Moderate Resolution Imaging Spectroradiometer), and DAYMET (Daily Surface Weather Data and Climatological Summaries). Besides government data sources, CUAHSI HIS provides access to a growing number of academic hydrologic observation networks. These networks are registered by researchers associated with 11 hydrologic observatory testbeds around the US, and by other research, government and commercial groups wishing to join the emerging CUAHSI Water Data Federation. The Hydrologic Information Server (HIS Server) software stack, deployed at NSF-supported hydrologic observatory sites and other universities around the country, supports a hydrologic data publication workflow which includes the following steps: (1) observational data are loaded from static files or streamed from sensors into a local instance of an Observations Data Model (ODM) database; (2) a generic web service template is configured for the new ODM instance to expose the data as a WaterML-compliant water data service; and (3) the new water data service is registered at the HISCentral registry (hiscentral.cuahsi.org), where its metadata are harvested and semantically tagged using concepts from a hydrologic ontology.
As a result, the new service is indexed in the CUAHSI central metadata catalog, and becomes available for spatial and semantics-based queries. The main component of interoperability across hydrologic data repositories in CUAHSI HIS is mapping different repository schemas and semantics to a shared community information model for observations made at stationary points. This information model has been implemented as both a relational schema (ODM) and an XML schema (WaterML). Its main design drivers have been data storage and data interchange needs of hydrology researchers, a series of community reviews of the ODM, and the practices of hydrologic data modeling and presentation adopted by federal agencies as observed in agency online data access applications, such as NWISWeb and USEPA STORET. The goal of the first version of WaterML was to encode the semantics of hydrologic observations discovery and retrieval and implement water data services in a way that is generic across different data providers. In particular, this implied maintaining a single common representation for the key constructs returned to web service calls, related to observations, features of interest, observation procedures, observation series, etc. Another WaterML design consideration was to create (in version 1 of CUAHSI HIS in particular) a fairly rigid, compact, and simple XML schema which was easy to generate and parse, thus creating the least barrier for adoption by hydrologists. Each of the three main request methods in the water data web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SiteResponse, VariableResponse, and TimeSeriesResponse. The strictness and compactness of the first version of WaterML supported its community adoption. 
Over the last two years, several ODM and WaterML implementations for various platforms have emerged, and several Water Data Services client applications have been created by outside groups in both industry and academia. In a significant development, the WaterML specification has been adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns WaterML-compliant TimeSeriesResponse. NCDC is also prototyping WaterML for data delivery, and has developed a REST-based service that generates WaterML-compliant output for its integrated station network. These agency-supported web services provide a much more efficient way to deliver agency data compared to the web site scraper services that the CUAHSI HIS project developed initially. Adoption of WaterML by the US Geological Survey is particularly significant because the USGS maintains by far the largest water data repository in the United States. For version 1.1, WaterML has evolved to reflect the deployment experience at hydrologic observatory testbeds, as well as feedback from hydrologic data repository managers at federal and state agencies. Further development of WaterML and enhancement of the underlying information model is the focus of the recently established OGC Hydrology Domain Working Group, whose mission is to profile OGC standards (GML, O&M, SOS, WCS, WFS) for the water resources domain and thus ensure WaterML's wider applicability and easier implementation. WaterML 2.0 is envisioned as an OGC-compliant application schema that supports OGC features, can express different types of observations and various groupings of observations, and allows researchers to define custom metadata elements. This presentation will discuss the information model underlying WaterML and describe the rationale, design drivers and evolution of WaterML and the water data services, illustrating their recent application in the context of CUAHSI HIS and the hydrologic observatory testbeds.
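A GetValues call returns a TimeSeriesResponse document. The sketch below shows how a client might pull the site name, variable, and data values out of such a response with Python's standard library; the XML fragment is an illustrative simplification of the WaterML 1.x structure (real documents are namespaced and far richer), not the actual schema.

```python
import xml.etree.ElementTree as ET

# Illustrative, heavily simplified TimeSeriesResponse fragment
# (real WaterML 1.x documents use namespaces and many more elements).
SAMPLE = """
<timeSeriesResponse>
  <timeSeries>
    <sourceInfo>
      <siteName>Example Creek near Exampleville</siteName>
      <siteCode>01234567</siteCode>
    </sourceInfo>
    <variable><variableName>Discharge</variableName></variable>
    <values>
      <value dateTime="2008-10-01T00:00:00">12.5</value>
      <value dateTime="2008-10-02T00:00:00">13.1</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

def parse_time_series(xml_text):
    """Return (site name, variable name, list of (dateTime, value) pairs)."""
    root = ET.fromstring(xml_text)
    series = root.find("timeSeries")
    site = series.findtext("sourceInfo/siteName")
    variable = series.findtext("variable/variableName")
    values = [(v.get("dateTime"), float(v.text))
              for v in series.findall("values/value")]
    return site, variable, values
```

The rigidity of the version-1 schema is what makes this kind of straightforward, path-based parsing feasible for client applications.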
NASA Technical Reports Server (NTRS)
Chimiak, Reine; Harris, Bernard; Williams, Phillip
2013-01-01
Basic Common Data Format (CDF) tools (e.g., cdfedit) provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While it is possible for someone who is familiar with the ISTP/SPDF metadata guidelines to create compliant files using just the basic tools, such as cdfedit and skeletoncdf, the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines. The SPDF ISTP CDF skeleton editor addresses this: it is a cross-platform, Java-based graphical user interface (GUI) application that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create and edit guideline-compliant skeleton CDF files. The editor consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML web page for distribution. It is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating and editing application-unique files.
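The core service such an editor provides is validating a skeleton against the guidelines. A minimal sketch of one such check, verifying that required ISTP global attributes are present, is shown below; the attribute list is a representative subset drawn from the ISTP guidelines, not the complete requirement set.

```python
# Representative subset of ISTP/SPDF required global attributes
# (the full guidelines define more; this list is illustrative only).
REQUIRED_GLOBAL_ATTRS = {
    "Project", "Source_name", "Discipline", "Data_type",
    "Descriptor", "Data_version", "Logical_file_id", "PI_name",
}

def missing_global_attrs(global_attrs):
    """Return required global attributes absent from a skeleton's
    global-attribute mapping, sorted for stable reporting."""
    return sorted(REQUIRED_GLOBAL_ATTRS - set(global_attrs))
```

A GUI editor would surface the result of such a check as it guides the user field by field, rather than leaving the user to discover omissions after file creation.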
Design Drivers of Water Data Services
NASA Astrophysics Data System (ADS)
Valentine, D.; Zaslavsky, I.
2008-12-01
The CUAHSI Hydrologic Information System (HIS) is being developed as a geographically distributed network of hydrologic data sources and functions that are integrated using web services so that they function as a connected whole. The core of the HIS service-oriented architecture is a collection of water web services, which provide uniform access to multiple repositories of observation data. These services use SOAP protocols communicating WaterML (Water Markup Language). When a client makes a data or metadata request using a CUAHSI HIS web service, these requests are made in a standard manner, following the CUAHSI HIS web service signatures - regardless of how the underlying data source may be organized. Also, regardless of the format in which the data are returned by the source, the web services respond to requests by returning the data in the standard WaterML format. The goal of WaterML design has been to capture the semantics of hydrologic observations discovery and retrieval and express the point observations information model as an XML schema. To a large extent, it follows the representation of the information model as adopted by the CUAHSI Observations Data Model (ODM) relational design. Another driver of WaterML design is the specifications and metadata adopted by USGS NWIS, EPA STORET, and other federal agencies, as WaterML seeks to provide a common foundation for exchanging both agency data and data collected in multiple academic projects. A further design principle was to create, in version 1 of HIS in particular, a fairly rigid and simple XML schema which is easy to generate and parse, thus creating the least barrier for adoption by hydrologists. WaterML includes a series of elements that reflect common notions used in describing hydrologic observations, such as site, variable, source, observation series, seriesCatalog, and data values. 
Each of the three main request methods in the water web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SitesResponse, VariableResponse, and TimeSeriesResponse. The WaterML specification is being adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns WaterML-compliant TimeSeriesResponse. The National Climatic Data Center is also prototyping WaterML for data delivery, and has developed a REST-based service that generates WaterML-compliant output for the NCDC ASOS network. Such agency-supported web services coming online provide a much more efficient way to deliver agency data compared to the web site scraper services that the CUAHSI HIS project developed initially. The CUAHSI water data web services will continue to serve as the main communication mechanism within CUAHSI HIS, connecting a variety of data sources with a growing set of web service clients being developed in both academia and the commercial sector. The driving forces for the development of web services continue to be: - Application experience and needs of the growing number of CUAHSI HIS users, who experiment with additional data types, analysis modes, data browsing and searching strategies, and provide feedback to WaterML developers; - Data description requirements posed by various federal and state agencies; - Harmonization with standards being adopted or developed in neighboring communities, in particular the relevant standards being explored within the Open Geospatial Consortium. CUAHSI WaterML is the standard output schema for CUAHSI HIS water web services. Its formal specification is available as an OGC discussion paper at www.opengeospatial.org/standards/dp/.
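The REST-based services mentioned above take their arguments as key-value pairs in the request URL. A hedged sketch of composing such a request is shown below; the endpoint and parameter names are illustrative placeholders, not the exact interface of the USGS or NCDC services.

```python
from urllib.parse import urlencode

def build_get_values_url(base_url, site_code, variable_code, start, end):
    """Compose a key-value REST request against a daily-values water
    data service. Parameter names here are illustrative; a real client
    must follow the specific service's documented interface."""
    params = {
        "site": site_code,       # station identifier
        "variable": variable_code,  # observed variable code
        "startDT": start,        # ISO date, start of period
        "endDT": end,            # ISO date, end of period
    }
    return base_url + "?" + urlencode(params)
```

The service would answer such a request with a WaterML TimeSeriesResponse document, which a client then parses as described in the companion abstract above.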
A Generalized-Compliant-Motion Primitive
NASA Technical Reports Server (NTRS)
Backes, Paul G.
1993-01-01
Computer program bridges gap between planning and execution of compliant robotic motions developed and installed in control system of telerobot. Called "generalized-compliant-motion primitive," one of several task-execution-primitive computer programs, which receives commands from higher-level task-planning programs and executes commands by generating required trajectories and applying appropriate control laws. Program comprises four parts corresponding to nominal motion, compliant motion, ending motion, and monitoring. Written in C language.
NASA Astrophysics Data System (ADS)
Danobeitia, J.; Oscar, G.; Bartolomé, R.; Sorribas, J.; Del Rio, J.; Cadena, J.; Toma, D. M.; Bghiel, I.; Martinez, E.; Bardaji, R.; Piera, J.; Favali, P.; Beranzoli, L.; Rolin, J. F.; Moreau, B.; Andriani, P.; Lykousis, V.; Hernandez Brito, J.; Ruhl, H.; Gillooly, M.; Terrinha, P.; Radulescu, V.; O'Neill, N.; Best, M.; Marinaro, G.
2016-12-01
European Multidisciplinary Seafloor and Water Column Observatory Development (EMSODEV) is a Horizon 2020 EU project whose overall objective is the operationalization of eleven marine observatories and four test sites distributed throughout Europe, from the Arctic to the Atlantic and from the Mediterranean to the Black Sea. The whole infrastructure is managed by the European consortium EMSO-ERIC (European Research Infrastructure Consortium) with the participation of 8 European countries and other partner countries. We are now implementing a Generic Sensor Module (EGIM) within the EMSO-ERIC distributed marine research infrastructure. Our involvement mainly concerns developing standard-compliant generic software for Sensor Web Enablement (SWE) on the EGIM device. The main goal of this development is to support sensor data acquisition on a new interoperable EGIM system. The EGIM software structure consists of an acquisition layer located between the data recorded at the EGIM module and the data management services. Two main interfaces are therefore implemented: the first handles EGIM hardware acquisition, and the second allows data to be pushed to and pulled from the data management layer, compliant with the Sensor Web Enablement standard. All software components used are open-source licensed and have been configured to manage different roles in the whole system (52°North SOS server, Zabbix monitoring system). The data acquisition module has been implemented to join all components for EGIM data acquisition and to serve the data through an SOS standard-compliant interface. The system is complete and awaiting the first laboratory bench test and a shallow-water test connection to the OBSEA node, offshore of Vilanova i la Geltrú (Barcelona, Spain). 
The EGIM module will record a wide range of ocean parameters in a long-term, consistent, accurate and comparable manner, spanning disciplines such as biology, geology, chemistry, physics, engineering, and computer science, from polar to subtropical environments, through the water column down to the deep sea. The measurements recorded at EMSO nodes are critical for responding accurately to social and scientific challenges such as climate change, changes in marine ecosystems, and marine hazards.
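The 52°North server implements the OGC Sensor Observation Service (SOS) standard, so a data management layer pulls observations with standard SOS requests. The sketch below builds an SOS 2.0 KVP GetObservation request URL; the endpoint, offering, and observed-property identifiers are placeholders, not real EGIM identifiers.

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property):
    """Build an OGC SOS 2.0 KVP GetObservation request URL.
    The endpoint and the offering/property identifiers passed in are
    placeholders for whatever a deployed EGIM service actually exposes."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)
```

Because the interface is standard, any SWE-aware client can pull EGIM observations without device-specific code, which is the interoperability goal the abstract describes.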
Lachance, Chantelle C; Jurkowski, Michal P; Dymarz, Ania C; Robinovitch, Stephen N; Feldman, Fabio; Laing, Andrew C; Mackey, Dawn C
2017-01-01
Compliant flooring, broadly defined as flooring systems or floor coverings with some level of shock absorbency, may reduce the incidence and severity of fall-related injuries in older adults; however, a lack of synthesized evidence may be limiting widespread uptake. Informed by the Arksey and O'Malley framework and guided by a Research Advisory Panel of knowledge users, we conducted a scoping review to answer: what is presented about the biomechanical efficacy, clinical effectiveness, cost-effectiveness, and workplace safety associated with compliant flooring systems that aim to prevent fall-related injuries in healthcare settings? We searched academic and grey literature databases. Any record that discussed a compliant flooring system and at least one of biomechanical efficacy, clinical effectiveness, cost-effectiveness, or workplace safety was eligible for inclusion. Two independent reviewers screened and abstracted records, charted data, and summarized results. After screening 3611 titles and abstracts and 166 full-text articles, we included 84 records plus 56 companion (supplementary) reports. Biomechanical efficacy records (n = 50) demonstrate compliant flooring can reduce fall-related impact forces with minimal effects on standing and walking balance. Clinical effectiveness records (n = 20) suggest that compliant flooring may reduce injuries, but may increase risk for falls. Preliminary evidence suggests that compliant flooring may be a cost-effective strategy (n = 12), but may also result in increased physical demands for healthcare workers (n = 17). In summary, compliant flooring is a promising strategy for preventing fall-related injuries from a biomechanical perspective. Additional research is warranted to confirm whether compliant flooring (i) prevents fall-related injuries in real-world settings, (ii) is a cost-effective intervention strategy, and (iii) can be installed without negatively impacting workplace safety. 
Avenues for future research are provided, which will help to determine whether compliant flooring is recommended in healthcare environments.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
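A WPS client typically first asks the server what a shared model expects, then invokes it. The sketch below builds the two corresponding WPS 1.0.0 KVP requests; the endpoint and the process identifier are hypothetical, and the input encoding follows the standard's `DataInputs=key=value;key=value` form.

```python
from urllib.parse import urlencode

# Common KVP parameters for WPS 1.0.0 requests.
WPS_COMMON = {"service": "WPS", "version": "1.0.0"}

def wps_describe_process_url(endpoint, identifier):
    """WPS DescribeProcess request: asks the server for the inputs and
    outputs a shared model expects (identifier is hypothetical)."""
    params = dict(WPS_COMMON, request="DescribeProcess", identifier=identifier)
    return endpoint + "?" + urlencode(params)

def wps_execute_url(endpoint, identifier, inputs):
    """WPS Execute request with inputs in the KVP
    'DataInputs=key=value;key=value' encoding."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = dict(WPS_COMMON, request="Execute", identifier=identifier,
                  DataInputs=data_inputs)
    return endpoint + "?" + urlencode(params)
```

Because the requests are standard, any WPS-compliant tool can discover and run a published model without knowing how it is implemented, which is the interoperability point of the abstract.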
Parametric studies to determine the effect of compliant layers on metal matrix composite systems
NASA Technical Reports Server (NTRS)
Caruso, J. J.; Chamis, C. C.; Brown, H. C.
1990-01-01
Computational simulation studies are conducted to identify compliant layers that reduce matrix stresses resulting from the coefficient of thermal expansion mismatch and the large temperature range over which current metal matrix composites will be used. The present study includes variations of compliant layers and their properties to determine their influence on unidirectional composite and constituent response. Two simulation methods are used for these studies. The first approach is based on a three-dimensional linear finite element analysis of a nine-fiber unidirectional composite system. The second approach is a micromechanics-based nonlinear computer code developed to determine the behavior of metal matrix composite systems under thermal and mechanical loads. The results show that an effective compliant layer for the SCS-6 (SiC)/Ti-24Al-11Nb (Ti3Al + Nb) and SCS-6 (SiC)/Ti-15V-3Cr-3Sn-3Al (Ti-15-3) composite systems should have a modulus about 15 percent that of the matrix and a coefficient of thermal expansion roughly equal to that of the composite system without the compliant layer. The matrix stresses in the longitudinal and transverse tangential (hoop) directions are tensile for the Ti3Al + Nb and Ti-15-3 composite systems upon cool-down from fabrication. The fiber longitudinal stress is compressive after fabrication cool-down. Addition of a recommended compliant layer results in a reduction in the composite modulus.
A New Data Management System for Biological and Chemical Oceanography
NASA Astrophysics Data System (ADS)
Groman, R. C.; Chandler, C.; Allison, D.; Glover, D. M.; Wiebe, P. H.
2007-12-01
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created to serve PIs principally funded by NSF to conduct marine chemical and ecological research. The new office is dedicated to providing open access to data and information developed in the course of scientific research on short and intermediate time-frames. The data management system developed in support of U.S. JGOFS and U.S. GLOBEC programs is being modified to support the larger scope of the BCO-DMO effort, which includes ultimately providing a way to exchange data with other data systems. The open access system is based on a philosophy of data stewardship, support for existing and evolving data standards, and use of public domain software. The DMO staff work closely with originating PIs to manage data gathered as part of their individual programs. In the new BCO-DMO data system, project and data set metadata records designed to support re-use of the data are stored in a relational database (MySQL) and the data are stored in or made accessible by the JGOFS/GLOBEC object-oriented relational data management system. Data access will be provided via any standard Web browser client user interface through a GIS application (Open Source, OGC-compliant MapServer), a directory listing from the data holdings catalog, or a custom search engine that facilitates data discovery. In an effort to maximize data system interoperability, data will also be available via Web Services; and data set descriptions will be generated to comply with a variety of metadata content standards. The office is located at the Woods Hole Oceanographic Institution and web access is via http://www.bco-dmo.org.
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancements at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broad Web paradigms.
Hancock, David; Wilson, Michael; Velarde, Giles; Morrison, Norman; Hayes, Andrew; Hulme, Helen; Wood, A Joseph; Nashar, Karim; Kell, Douglas B; Brass, Andy
2005-11-03
maxdLoad2 is a relational database schema and Java application for microarray experimental annotation and storage. It is compliant with all standards for microarray meta-data capture, including the specification of what data should be recorded, extensive use of standard ontologies and support for data exchange formats. The output from maxdLoad2 is of a form acceptable for submission to the ArrayExpress microarray repository at the European Bioinformatics Institute. maxdBrowse is a PHP web-application that makes the contents of maxdLoad2 databases accessible via web-browser, command-line and web-service environments. It thus acts as both a dissemination and data-mining tool. maxdLoad2 presents an easy-to-use interface to an underlying relational database and provides a full complement of facilities for browsing, searching and editing. There is a tree-based visualization of data connectivity and the ability to explore the links between any pair of data elements, irrespective of how many intermediate links lie between them. Its principal novel features are: the flexibility of the meta-data that can be captured, the tools provided for importing data from spreadsheets and other tabular representations, the tools provided for the automatic creation of structured documents, and the ability to browse and access the data via web and web-services interfaces. Within maxdLoad2 it is very straightforward to customise the meta-data that is being captured or change the definitions of the meta-data. These meta-data definitions are stored within the database itself, allowing client software to connect properly to a modified database without having to be specially configured. The meta-data definitions (configuration file) can also be centralized, allowing changes made in response to revisions of standards or terminologies to be propagated to clients without user intervention. maxdBrowse is hosted on a web-server and presents multiple interfaces to the contents of maxd databases. 
maxdBrowse emulates many of the browse and search features available in the maxdLoad2 application via a web-browser. This allows users who are not familiar with maxdLoad2 to browse and export microarray data from the database for their own analysis. The same browse and search features are also available via command-line and SOAP server interfaces. This both enables scripted data export for embedding in data repositories and analysis environments, and allows access to the maxd databases via web-service architectures. maxdLoad2 (http://www.bioinf.man.ac.uk/microarray/maxd/) and maxdBrowse (http://dbk.ch.umist.ac.uk/maxdBrowse) are portable and compatible with all common operating systems and major database servers. They provide a powerful, flexible package for annotation of microarray experiments and a convenient dissemination environment. They are available for download and open-sourced under the Artistic License.
ExoDat Information System at CeSAM
NASA Astrophysics Data System (ADS)
Agneray, F.; Moreau, C.; Chabaud, P.; Damiani, C.; Deleuil, M.
2014-05-01
CoRoT (Convection, Rotation and planetary Transits) is a space-based mission led by the French space agency (CNES) in association with French and international laboratories. One of CoRoT's goals is to detect exoplanets by the transit method. The Exoplanet Database (ExoDat) is a VO-compliant information system for the CoRoT exoplanet program. The main functions of ExoDat are to provide a source catalog for observation field and target selection; to characterize the CoRoT targets (spectral type, variability, contamination, ...); and to support follow-up programs. ExoDat is built using the AstroNomical Information System (ANIS) developed by CeSAM (Centre de donneeS Astrophysique de Marseille). It offers downloads of observation catalogs and additional services for searching, extracting and displaying data using a combination of criteria, object-list, and cone-search interfaces. Web services have been developed to provide easy access for users' software and pipelines.
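The cone-search interface mentioned above follows the Virtual Observatory Simple Cone Search convention, in which a position (RA, DEC) and search radius (SR), all in decimal degrees, are passed as URL parameters. A minimal sketch of composing such a request is below; the endpoint is a placeholder, not the actual ExoDat service address.

```python
from urllib.parse import urlencode

def cone_search_url(endpoint, ra_deg, dec_deg, radius_deg):
    """Build a VO Simple Cone Search request URL.
    RA, DEC and SR are decimal degrees; the endpoint passed in is a
    placeholder for a real cone-search service address."""
    params = {"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg}
    return endpoint + "?" + urlencode(params)
```

A compliant service answers such a request with a VOTable of sources inside the cone, which is what makes the same pipeline code reusable against any VO cone-search service.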
NASA Astrophysics Data System (ADS)
Raup, B. H.; Khalsa, S. S.; Armstrong, R.
2007-12-01
The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER scenes or glacier outlines from 2002 only, or from autumn of any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. 
The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
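Because the map server is OGC-compliant, other sites fetch the glacier layers with standard WMS requests. The sketch below builds a WMS 1.1.1 GetMap request URL; the layer name is a placeholder, not a real GLIMS layer identifier.

```python
from urllib.parse import urlencode

def wms_get_map_url(endpoint, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request for one layer.
    The layer name passed in is a placeholder; a real client would take
    layer names from the server's GetCapabilities response."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",                                # default styling
        "SRS": "EPSG:4326",                          # lon/lat
        "BBOX": ",".join(str(c) for c in bbox),      # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)
```

The same request shape works against any WMS, which is exactly why outside sites can overlay GLIMS layers without custom integration code.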
ERIC Educational Resources Information Center
Isotani, Seiji; Mizoguchi, Riichiro; Isotani, Sadao; Capeli, Olimpio M.; Isotani, Naoko; de Albuquerque, Antonio R. P. L.; Bittencourt, Ig. I.; Jaques, Patricia
2013-01-01
When the goal of group activities is to support long-term learning, the task of designing well-thought-out collaborative learning (CL) scenarios is an important key to success. To help students adequately acquire and develop their knowledge and skills, a teacher can plan a scenario that increases the probability for learning to occur. Such a…
Marco-Ruiz, Luis; Pedrinaci, Carlos; Maldonado, J A; Panziera, Luca; Chen, Rong; Bellika, J Gustav
2016-08-01
The high costs involved in the development of Clinical Decision Support Systems (CDSS) make it necessary to share their functionality across different systems and organizations. Service Oriented Architectures (SOA) have been proposed to allow reusing CDSS by encapsulating them in a Web service. However, strong barriers to sharing CDS functionality remain as a consequence of the lack of expressiveness of services' interfaces. Linked Services are the evolution of the Semantic Web Services paradigm to process Linked Data. They aim to provide semantic descriptions over SOA implementations to overcome the limitations derived from the syntactic nature of Web services technologies. Our objective is to facilitate the publication, discovery and interoperability of CDS services by evolving them into Linked Services that expose their interfaces as Linked Data. We developed methods and models to enhance CDS SOA as Linked Services that define a rich semantic layer based on machine-interpretable ontologies that powers their interoperability and reuse. These ontologies provide unambiguous descriptions of CDS service properties to expose them to the Web of Data. We developed models compliant with Linked Data principles to create a semantic representation of the components that compose CDS services. To evaluate our approach, we implemented a set of CDS Linked Services using a Web service definition ontology. The definitions of Web services were linked to the models developed in order to attach unambiguous semantics to the service components. All models were bound to SNOMED-CT and public ontologies (e.g. Dublin Core) in order to provide a lingua franca for exploring them. Discovery and analysis of CDS services based on machine-interpretable models were performed by reasoning over the ontologies built. Linked Services can be used effectively to expose CDS services to the Web of Data by building on current CDS standards. 
This allows building shared Linked Knowledge Bases that provide machine-interpretable semantics for CDS service descriptions, alleviating the challenges of interoperability and reuse. Linked Services allow for building 'digital libraries' of distributed CDS services that can be hosted and maintained in different organizations. Copyright © 2016 Elsevier Inc. All rights reserved.
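To make the idea of a Linked Data service description concrete, the sketch below expresses a toy description of a hypothetical CDS service as JSON-LD using Dublin Core terms. The service URI, title, and the SNOMED concept IRI are all invented placeholders; a real description would follow the paper's Web service definition ontology and bind to actual SNOMED-CT codes.

```python
import json

def describe_cds_service(service_uri, title, concept_uri):
    """Toy JSON-LD description of a CDS service using Dublin Core terms.
    All identifiers passed in are hypothetical placeholders."""
    return {
        "@context": {"dc": "http://purl.org/dc/terms/"},
        "@id": service_uri,                 # the service's Linked Data IRI
        "dc:title": title,                  # human-readable name
        "dc:subject": {"@id": concept_uri}, # clinical concept the service covers
    }

doc = describe_cds_service(
    "https://example.org/cds/anticoagulation-advisor",   # placeholder
    "Anticoagulation dosing advisor",
    "http://snomed.info/id/EXAMPLE",                     # placeholder concept IRI
)
serialized = json.dumps(doc, indent=2)
```

Publishing such machine-readable descriptions is what lets a client discover and reason over distributed CDS services instead of relying on syntactic WSDL interfaces alone.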
TerraLook: GIS-Ready Time-Series of Satellite Imagery for Monitoring Change
2008-01-01
TerraLook is a joint project of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL) with a goal of providing satellite images that anyone can use to see changes in the Earth's surface over time. Each TerraLook product is a user-specified collection of satellite images selected from imagery archived at the USGS Earth Resources Observation and Science (EROS) Center. Images are bundled with standards-compliant metadata, a world file, and an outline of each image's ground footprint, enabling their use in geographic information systems (GIS), image processing software, and Web mapping applications. TerraLook images are available through the USGS Global Visualization Viewer (http://glovis.usgs.gov).
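The world file bundled with each TerraLook image is a six-line text sidecar defining the affine transform from pixel to ground coordinates, which is what makes the images GIS-ready. A minimal sketch of applying one (the sample numbers are illustrative):

```python
# Sketch of applying a "world file" (the six-line affine sidecar bundled with
# each TerraLook image) to map pixel coordinates to ground coordinates.

def parse_world_file(text):
    """Parse the six world-file lines: x-scale, y-skew, x-skew, y-scale,
    then x and y of the center of the upper-left pixel."""
    a, d, b, e, c, f = (float(line) for line in text.split())
    return a, d, b, e, c, f

def pixel_to_geo(params, col, row):
    """Affine transform from pixel (col, row) to georeferenced (x, y)."""
    a, d, b, e, c, f = params
    return a * col + b * row + c, d * col + e * row + f

# 30 m pixels, no rotation, upper-left pixel centered at (500000, 4600000):
params = parse_world_file("30.0\n0.0\n0.0\n-30.0\n500000.0\n4600000.0\n")
```

With this transform, any pixel in the image can be placed on a map, which is why a world file plus footprint outline is enough for GIS and web-mapping use.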
Using Taxonomic Indexing Trees to Efficiently Retrieve SCORM-Compliant Documents in e-Learning Grids
ERIC Educational Resources Information Center
Shih, Wen-Chung; Tseng, Shian-Shyong; Yang, Chao-Tung
2008-01-01
With the flourishing development of e-Learning, more and more SCORM-compliant teaching materials are developed by institutes and individuals in different sites. In addition, the e-Learning grid is emerging as an infrastructure to enhance traditional e-Learning systems. Therefore, information retrieval schemes supporting SCORM-compliant documents…
Web-Based Real-Time Emergency Monitoring
NASA Technical Reports Server (NTRS)
Harvey, Craig A.; Lawhead, Joel
2007-01-01
The Web-based Real-Time Asset Monitoring (RAM) module for emergency operations and facility management enables emergency personnel in federal agencies and local and state governments to monitor and analyze data in the event of a natural disaster or other crisis that threatens a large number of people and property. The software can manage many disparate sources of data within a facility, city, or county. It was developed on industry-standard geospatial software and is compliant with open GIS standards. RAM View can function as a standalone system, or as an integrated plugin module to Emergency Operations Center (EOC) software suites such as REACT (Real-time Emergency Action Coordination Tool), thus ensuring the widest possible distribution among potential users. RAM has the ability to monitor various data sources, including streaming data. Many disparate systems are included in the initial suite of supported hardware systems, such as mobile GPS units, ambient measurements of temperature, moisture and chemical agents, flow meters, air quality, asset location, and meteorological conditions. RAM View displays real-time data streams such as gauge heights from the U.S. Geological Survey gauging stations, flood crests from the National Weather Service, and meteorological data from numerous sources. Data points are clearly visible on the map interface, and attributes as specified in the user requirements can be viewed and queried.
TAPAS, a VO archive at the IRAM 30-m telescope
NASA Astrophysics Data System (ADS)
Leon, Stephane; Espigares, Victor; Ruíz, José Enrique; Verdes-Montenegro, Lourdes; Mauersberger, Rainer; Brunswig, Walter; Kramer, Carsten; Santander-Vela, Juan de Dios; Wiesemeyer, Helmut
2012-07-01
Astronomical observatories today generate increasingly large volumes of data. To use these data efficiently, databases have been built following the standards proposed by the International Virtual Observatory Alliance (IVOA), which provide a common protocol for querying them and making them interoperable. The IRAM 30-m radio telescope, located in the Sierra Nevada (Granada, Spain), is a millimeter-wavelength telescope with a constantly renewed, extensive choice of instruments, capable of covering the frequency range between 80 and 370 GHz. It continuously produces a large amount of data through the more than 200 scientific projects observed each year. The TAPAS archive at the IRAM 30-m telescope aims to provide public access to the headers describing the observations performed with the telescope, according to a defined data policy, while also making the technical data available to IRAM staff members. Special emphasis has been placed on making it Virtual Observatory (VO) compliant and on offering a VO-compliant web interface that makes the information available to the scientific community. TAPAS is built using the Django Python framework on top of a relational MySQL database, and is fully integrated with the telescope control system. The TAPAS data model (DM) is based on the Radio Astronomical DAta Model for Single dish radio telescopes (RADAMS), to allow for easy integration into the VO infrastructure. A metadata modeling layer is used by the data filler to keep the implementation free from assumptions about the control system and the underlying database. TAPAS and its public web interface (
Schofield, E C; Carver, T; Achuthan, P; Freire-Pritchett, P; Spivakov, M; Todd, J A; Burren, O S
2016-08-15
Promoter capture Hi-C (PCHi-C) allows the genome-wide interrogation of physical interactions between distal DNA regulatory elements and gene promoters in multiple tissue contexts. Visual integration of the resultant chromosome interaction maps with other sources of genomic annotations can provide insight into underlying regulatory mechanisms. We have developed Capture HiC Plotter (CHiCP), a web-based tool that allows interactive exploration of PCHi-C interaction maps and integration with both public and user-defined genomic datasets. CHiCP is freely accessible from www.chicp.org and supports most major HTML5-compliant web browsers. Full source code and installation instructions are available from http://github.com/D-I-L/django-chicp. Contact: ob219@cam.ac.uk. © The Author 2016. Published by Oxford University Press. All rights reserved.
Secure password-based authenticated key exchange for web services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Fang; Meder, Samuel; Chevassut, Olivier
This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.
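To make the shared-secret idea concrete, here is an illustrative toy challenge-response sketch. This is NOT the proven-secure AuthA protocol the paper implements: a real password-authenticated key exchange also derives a session key and resists offline dictionary attacks, which this sketch does not.

```python
# Illustrative toy only: shared-secret challenge-response, not AuthA.
import hashlib
import hmac
import os

def make_challenge():
    """Server issues a fresh random challenge per authentication attempt."""
    return os.urandom(16)

def prove(password, challenge):
    """Client proof: HMAC of the server challenge under the shared password."""
    key = hashlib.sha256(password.encode()).digest()
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def verify(password, challenge, proof):
    """Server check, using a constant-time comparison."""
    return hmac.compare_digest(prove(password, challenge), proof)
```

In the paper's setting, the equivalent exchanges would be carried inside WS-Trust/WS-SecureConversation message primitives rather than raw byte strings.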
Web Services and Data Enhancements at the Northern California Earthquake Data Center
NASA Astrophysics Data System (ADS)
Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.
2013-12-01
The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System).
We are currently adding time series from the UCB networks and other networks archived at the NCEDC to these older event gathers, and ensuring that each channel in the event gathers carries the highest-quality waveform available in the archive.
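The FDSN web service queries described above are plain HTTP requests with standardized parameters. A sketch of building an fdsn-dataselect request URL; the parameter names (net, sta, loc, cha, starttime, endtime) come from the FDSN web services specification, while the NCEDC host below is an assumption of this sketch:

```python
# Sketch of an FDSN dataselect request such as the NCEDC's fdsn-dataselect
# service accepts. Host is assumed; parameter names are per the FDSN spec.
from urllib.parse import urlencode

NCEDC_DATASELECT = "https://service.ncedc.org/fdsnws/dataselect/1/query"

def dataselect_url(net, sta, loc, cha, start, end):
    """Build a dataselect query URL for one channel and time window."""
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "starttime": start, "endtime": end}
    return NCEDC_DATASELECT + "?" + urlencode(params)

url = dataselect_url("BK", "CMB", "00", "BHZ",
                     "2013-01-01T00:00:00", "2013-01-01T00:10:00")
```

Fetching that URL would return MiniSEED bytes, which is what lets simple program interfaces replace the older batch and email-based request paths.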
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Arya, Vinod K.; Melis, Matthew E.
1990-01-01
High residual stresses within intermetallic and metal matrix composite systems can develop upon cooling from the processing temperature to room temperature due to the coefficient of thermal expansion (CTE) mismatch between the fiber and matrix. As a result, within certain composite systems, radial, circumferential, and/or longitudinal cracks have been observed to form at the fiber-matrix interface. The compliant layer concept (insertion of a compensating interface material between the fiber and matrix) was proposed to reduce or eliminate the residual stress buildup during cooling and thus minimize cracking. The viability of the proposed compliant layer concept is investigated both elastically and elastoplastically. A detailed parametric study was conducted using a unit cell model consisting of three concentric cylinders to determine the required character (i.e., thickness and material properties) of the compliant layer as well as its applicability. The unknown compliant layer mechanical properties were expressed as ratios of the corresponding temperature dependent Ti-24Al-11Nb (a/o) matrix properties. The fiber properties taken were those corresponding to SCS-6 (SiC). Results indicate that the compliant layer can be used to reduce, if not eliminate, radial and circumferential residual stresses within the fiber and matrix and therefore also reduce or eliminate the radial cracking. However, with this decrease in in-plane stresses, one obtains an increase in longitudinal stress, thus potentially initiating longitudinal cracking. Guidelines are given for the selection of a specific compliant material, given a perfectly bonded system.
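The residual-stress driver studied above can be illustrated with a back-of-envelope 1-D estimate: on cooling, the matrix stress from the fiber/matrix CTE mismatch scales as sigma ~ E_m (alpha_m - alpha_f) dT. The property values below are round illustrative numbers, not the paper's temperature-dependent Ti-24Al-11Nb or SCS-6 data:

```python
# 1-D sketch of CTE-mismatch residual stress on cooldown. Inputs are
# illustrative round numbers, not the paper's material data.

def mismatch_stress(e_matrix_gpa, alpha_matrix, alpha_fiber, delta_t):
    """Residual stress estimate in MPa (E in GPa, CTEs in 1/degC, dT in degC)."""
    return e_matrix_gpa * 1e3 * (alpha_matrix - alpha_fiber) * delta_t

# Cooling through 900 degC with assumed E_m = 100 GPa and a 6e-6 /degC mismatch:
sigma = mismatch_stress(100.0, 11e-6, 5e-6, 900.0)  # -> 540.0 MPa
```

Even this crude estimate lands in the hundreds of MPa, which is why the paper explores inserting a compliant interlayer to absorb the mismatch rather than letting the matrix carry it.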
Applications of Dynamic Deployment of Services in Industrial Automation
NASA Astrophysics Data System (ADS)
Candido, Gonçalo; Barata, José; Jammes, François; Colombo, Armando W.
Service-oriented Architecture (SOA) is becoming a de facto paradigm for business and enterprise integration. SOA is expanding into several domains of application, envisioning a unified solution suitable across all layers of an enterprise infrastructure. Applying SOA based on open web standards can significantly enhance the interoperability and openness of field devices. By embedding a dynamic deployment service even into small field devices, machine builders could ship built-in services while still allowing the integrator to deploy at run time the services that best fit the current application. This approach allows developers to keep their preferred development language yet still deliver a SOA-compliant application. A dynamic deployment service is envisaged as a fundamental framework to support more complex applications, reducing deployment delays while increasing overall system agility. As a use-case scenario, a dynamic deployment service was implemented over the DPWS and WS-Management specifications, allowing an automation application to be designed and programmed using IEC 61131 languages and its components to be deployed as web services into devices.
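The dynamic-deployment idea can be reduced to a device hosting a registry into which services are installed or removed at run time. A minimal sketch with hypothetical names; the actual implementation sits on DPWS/WS-Management, not in-process callables:

```python
# Sketch of run-time service deployment on a device. Class and service
# names are hypothetical; the paper's version uses DPWS/WS-Management.

class DeviceServiceHost:
    def __init__(self):
        self._services = {}

    def deploy(self, name, handler):
        """Install (or replace) a service implementation on the fly."""
        self._services[name] = handler

    def undeploy(self, name):
        """Remove a service; unknown names are ignored."""
        self._services.pop(name, None)

    def invoke(self, name, *args):
        """Dispatch a call to a currently deployed service."""
        return self._services[name](*args)

host = DeviceServiceHost()
host.deploy("start_conveyor", lambda speed: f"conveyor running at {speed} rpm")
```

An integrator can thus replace a machine builder's built-in service with a site-specific one without reflashing the device, which is the agility argument made above.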
A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data
NASA Astrophysics Data System (ADS)
Meek, Sam; Jackson, Mike; Leibovici, Didier G.
2016-02-01
The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard includes the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, often due to vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the Business Process Model and Notation (BPMN) standard. A prototype system has been developed on top of an existing open-source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
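Service chaining as described above is, at its core, composition: each step's output feeds the next. A sketch with plain callables standing in for OGC WPS Execute requests; the quality-assurance steps below are hypothetical examples, not the project's actual processes:

```python
# Sketch of service chaining: processes composed into a reusable workflow.
# The QA steps are hypothetical stand-ins for WPS process invocations.

def chain(*processes):
    """Compose processes left-to-right into one reusable workflow."""
    def workflow(data):
        for process in processes:
            data = process(data)
        return data
    return workflow

# Hypothetical QA steps for crowdsourced observations:
drop_missing_coords = lambda recs: [r for r in recs if r.get("lat") is not None]
flag_low_accuracy = lambda recs: [dict(r, qa="ok" if r["acc"] <= 10 else "check")
                                  for r in recs]

qa_workflow = chain(drop_missing_coords, flag_low_accuracy)
```

A BPMN diagram expresses exactly this composition graphically, with the added benefit that branches, joins, and human tasks are first-class notation rather than hand-written orchestration code.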
NASA Astrophysics Data System (ADS)
Ma, Kevin; Liu, Joseph; Zhang, Xuejun; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent
2016-03-01
We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated into DICOM-compliant clinical and research environments to aid clinicians in patient treatment and data analysis. The system needs to quantify lesion volumes and to identify and register lesion locations in order to track shifts in the volume and quantity of lesions in a longitudinal study. To perform lesion registration, we have developed a brain warping and normalizing methodology using the Statistical Parametric Mapping (SPM) MATLAB toolkit for brain MRI. Patients' brain MR images are processed via SPM's normalization processes, and the brain images are analyzed and warped according to the tissue probability map. Lesion identification and contouring are completed by neuroradiologists, and lesion volume quantification is completed by the eFolder's CAD program. Lesion comparison results in longitudinal studies show key growth and active regions. The results display successful lesion registration and tracking over a longitudinal study. Lesion change results are graphically represented in the web-based user interface, and users are able to correlate patient progress and changes in the MRI images. The completed lesion and disease tracking tool would enable the eFolder to provide complete patient profiles, improve the efficiency of patient care, and perform comprehensive data analysis through an integrated imaging informatics system.
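The volume-quantification step above reduces to counting lesion-mask voxels and multiplying by the voxel volume. A minimal sketch; the mask format and voxel size are illustrative, not the eFolder CAD program's internals:

```python
# Sketch of lesion volume from a binary mask: voxel count times voxel volume.
# Mask layout and voxel size are illustrative assumptions.

def lesion_volume_ml(mask, voxel_mm3):
    """Sum voxels flagged 1 in a nested-list [slice][row][col] mask; return mL."""
    count = sum(v for sl in mask for row in sl for v in row)
    return count * voxel_mm3 / 1000.0  # 1 mL = 1000 mm^3

# One slice of 1 mm x 1 mm x 2 mm voxels containing 4 lesion voxels:
mask = [[[0, 1, 1],
         [0, 1, 1]]]
volume = lesion_volume_ml(mask, voxel_mm3=2.0)
```

Tracking the same quantity per registered lesion across time points is what yields the longitudinal growth and activity curves shown in the web interface.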
An Improved Publication Process for the UMVF.
Renard, Jean-Marie; Brunetaud, Jean-Marc; Cuggia, Marc; Darmoni, Stephan; Lebeux, Pierre; Beuscart, Régis
2005-01-01
The "Université Médicale Virtuelle Francophone" (UMVF) is a federation of French medical schools. Its main goal is to share the production and use of pedagogic medical resources generated by academic medical teachers. We developed an open-source application based upon a workflow system which provides an improved publication process for the UMVF. For teachers, the tool permits easy and efficient upload of new educational resources. For web masters, it provides a mechanism to easily locate and validate the resources. For both the teachers and the web masters, the utility provides the control and communication functions that define a workflow system. For all users, students in particular, the application improves the value of the UMVF repository by providing an easy way to find a detailed description of a resource and to check any resource from the UMVF to ascertain its quality and integrity, even if the resource is an old deprecated version. The server tier of the application is used to implement the main workflow functionalities and is deployed on certified UMVF servers using the PHP language, an LDAP directory and an SQL database. The client tier of the application provides both the workflow and the search and check functionalities and is implemented using a Java applet through a W3C-compliant web browser. A unique signature for each resource was needed to provide security functionality; it is implemented using the MD5 digest algorithm. The testing performed by Rennes and Lille verified the functionality and conformity with our specifications.
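The per-resource signature described above is an MD5 digest used as an integrity fingerprint. A sketch of computing and checking one (MD5 serves as a fingerprint here, though it is no longer considered collision-resistant for adversarial settings):

```python
# Sketch of the resource-signature scheme: an MD5 digest as an integrity check.
import hashlib

def resource_signature(data: bytes) -> str:
    """32-character hex MD5 fingerprint of a resource's bytes."""
    return hashlib.md5(data).hexdigest()

def check_integrity(data: bytes, signature: str) -> bool:
    """True if the resource still matches its recorded signature."""
    return resource_signature(data) == signature

sig = resource_signature(b"lecture-notes-v1.pdf contents")
```

Storing the digest alongside each resource lets any client re-verify quality and integrity later, including for old deprecated versions.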
Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto
2008-03-01
The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, the experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data workflow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation, and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation, and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.
Kottmann, Renzo; Gray, Tanya; Murphy, Sean; Kagan, Leonid; Kravitz, Saul; Lombardot, Thierry; Field, Dawn; Glöckner, Frank Oliver
2008-06-01
The Genomic Contextual Data Markup Language (GCDML) is a core project of the Genomic Standards Consortium (GSC) that implements the "Minimum Information about a Genome Sequence" (MIGS) specification and its extension, the "Minimum Information about a Metagenome Sequence" (MIMS). GCDML is an XML Schema for generating MIGS/MIMS compliant reports for data entry, exchange, and storage. When mature, this sample-centric, strongly-typed schema will provide a diverse set of descriptors for describing the exact origin and processing of a biological sample, from sampling to sequencing, and subsequent analysis. Here we describe the need for such a project, outline design principles required to support the project, and make an open call for participation in defining the future content of GCDML. GCDML is freely available, and can be downloaded, along with documentation, from the GSC Web site (http://gensc.org).
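GCDML reports are XML documents validated against an XML Schema. A sketch of generating a minimal MIGS/MIMS-style report with the standard library; the element names below are illustrative placeholders, not the actual GCDML schema vocabulary:

```python
# Sketch of emitting a minimal MIGS/MIMS-style XML report. Element names
# are placeholders, not the real GCDML schema.
import xml.etree.ElementTree as ET

def migs_report(sample_id, lat, lon, env):
    """Build a small sample-centric report and serialize it to XML text."""
    root = ET.Element("sampleReport", id=sample_id)
    ET.SubElement(root, "latitude").text = str(lat)
    ET.SubElement(root, "longitude").text = str(lon)
    ET.SubElement(root, "environment").text = env
    return ET.tostring(root, encoding="unicode")

xml_doc = migs_report("GSC-0001", 54.09, 7.89, "marine water column")
```

Because the real schema is strongly typed, a validating parser can reject reports whose descriptors are missing or malformed before they enter exchange or storage.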
JPL Space Telecommunications Radio System Operating Environment
NASA Technical Reports Server (NTRS)
Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.; Duncan, Courtney B.; Orozco, David S.; Stern, Ryan A.; Ahten, Earl R.; Girard, Mike
2013-01-01
A flight-qualified implementation of a Software Defined Radio (SDR) Operating Environment for the JPL-SDR built for the CoNNeCT Project has been developed. It is compliant with the NASA Space Telecommunications Radio System (STRS) Architecture Standard, and provides the software infrastructure for STRS-compliant waveform applications. This software provides a standards-compliant abstracted view of the JPL-SDR hardware platform. It uses industry-standard POSIX interfaces for most functions, as well as exposing the STRS API (Application Programming Interface) required by the standard. This software includes a standardized interface for IP components instantiated within a Xilinx FPGA (Field Programmable Gate Array). The software provides a standardized abstracted interface to platform resources such as data converters, file system, etc., which can be used by STRS-conformant waveform applications. It provides a generic SDR operating environment with a much smaller resource footprint than similar products such as SCA (Software Communications Architecture)-compliant implementations or the DoD Joint Tactical Radio System (JTRS).
Dugas, Martin
2016-11-29
Clinical trials use many case report forms (CRFs) per patient. Because of the astronomical number of potential CRFs, data element re-use at the design stage is attractive to foster compatibility of data from different trials. The objective of this work is to assess the technical feasibility of a CRF editor with connection to a public metadata registry (MDR) to support data element re-use. Based on the Medical Data Models portal, an ISO/IEC 11179-compliant MDR was implemented and connected to a web-based CRF editor. Three use cases were implemented: re-use at the form, item group and data element levels. CRF design with data element re-use from a public MDR is feasible. A prototypic system is available. The main limitation of the system is the amount of available MDR content.
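The three re-use levels described above (form, item group, data element) can be modeled as registry lookups at increasing granularity. A sketch with plain dictionaries; the identifiers are hypothetical, not actual ISO/IEC 11179 MDR content:

```python
# Sketch of CRF data-element re-use from a metadata registry, at the
# element and item-group levels. All identifiers are hypothetical.

MDR = {
    "elements": {"DE:heart_rate": {"label": "Heart rate", "unit": "bpm"}},
    "groups": {"IG:vitals": ["DE:heart_rate"]},
    "forms": {"F:baseline_visit": ["IG:vitals"]},
}

def reuse_element(crf, element_id):
    """Copy a registry data element into a CRF, preserving its identifier."""
    crf.setdefault("items", []).append({"id": element_id,
                                        **MDR["elements"][element_id]})
    return crf

def reuse_group(crf, group_id):
    """Re-use a whole item group by re-using each member element."""
    for element_id in MDR["groups"][group_id]:
        reuse_element(crf, element_id)
    return crf

crf = reuse_group({"name": "My trial CRF"}, "IG:vitals")
```

Because each copied item keeps its registry identifier, data collected on different trials' CRFs can later be pooled by matching those shared identifiers, which is the compatibility payoff the paper targets.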
2014-01-01
Background There is controversy as to whether conservative management that includes wearing a brace and exercises is effective in stabilising idiopathic scoliosis curves. A brace only prevents progression of the curve and has been shown to have favourable outcomes when patients are compliant. The aim of this study was therefore to determine the effect of compliance with the Rigo System Cheneau (RSC) brace and a specific exercise programme on idiopathic scoliosis curvature, and to compare the quality of life (QoL) and psychological traits of compliant and non-compliant subjects. Methods A pre-/post-test study design was used with a post-study comparison between subjects who complied with the management and those who did not. Fifty-one subjects (girls aged 12-16 years, Cobb angles 20-50 degrees) participated in the study. Subjects were divided into two groups, according to their compliance, at the end of the study. The compliant group wore the brace 20 or more hours a day and exercised three or more times per week. The non-compliant group wore the brace less than 20 hours a day and exercised less than three times per week. Cobb angles, vertebral rotation, scoliometer readings, peak flow, quality of life and personality traits were compared between groups, using Student's two-sample t-test and an analysis of covariance. Results The compliant group wore the brace 21.5 hours per day and exercised four times a week, and significantly improved in all measures compared to non-compliant subjects, who wore the brace 12 hours per day, exercised 1.7 times a week and significantly deteriorated (p < 0.0001). The major Cobb angles improved 10.19° (±5.5) in the compliant group and deteriorated 5.52° (±4.3) in the non-compliant group (p < 0.0001). Compliant subjects had a significantly better QoL than the non-compliant subjects (p = 0.001). The compliant group was significantly more emotionally mature, stable and realistic than the non-compliant group (p = 0.03). 
Conclusions Good compliance with the RSC brace and a specific exercise regimen resulted in a significant improvement in curvatures; poor compliance resulted in progression/deterioration. The poorer QoL in the non-compliant group was possibly caused by the personality traits of that group, which was more emotionally immature and unstable. PMID:24926318
Geo-hazard harmonised data a driven process to environmental analysis system
NASA Astrophysics Data System (ADS)
Cipolloni, Carlo; Iadanza, Carla; Pantaloni, Marco; Trigila, Alessandro
2015-04-01
In the last decade an increase in damage caused by natural disasters has been recorded in Italy. Supporting environmental safety and human protection, by reducing the vulnerability of exposed elements and improving the resilience of the communities involved, requires access to harmonised and customised data, which is one of several steps towards delivering adequate support to risk assessment, reduction and management. In this context, SEIS and Copernicus-GMES have been developed as web-service-based infrastructures for environmental analysis that integrate specifications and results from INSPIRE. The two landslide risk scenarios developed in different European projects drove the harmonisation of the data, which is the basic requirement for interoperable web services in an environmental analysis system. From two different perspectives we built a common methodology to analyse the datasets and transform them into INSPIRE-compliant format, following the INSPIRE Data Specifications on Geology and on Natural Risk Zones. To ensure maximum re-usability of the data, we also applied to the landslide and geological datasets a wider data model standard, GeoSciML, which represents a natural extension of the INSPIRE data model and provides richer information. The aim of this work is to present the first results of the two projects' data harmonisation process, in which an important role is played by semantic harmonisation using ontology services and/or hierarchical vocabularies available as Linked Data or Linked Open Data, referenced by URI directly in the spatial data services. We will show how the harmonised web services can add value in a risk scenario analysis system, presenting the first results of the landslide environmental analysis developed by the eENVplus and LIFE+IMAGINE projects.
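The harmonisation step above is essentially a schema mapping: source fields are renamed onto target fields defined by the INSPIRE data specifications. A sketch with a declarative field map; the source field names and simplified target names are illustrative, not the actual Natural Risk Zones schema:

```python
# Sketch of the harmonisation step: declarative field mapping from a source
# landslide record onto INSPIRE-style fields. All names are illustrative.

FIELD_MAP = {                 # source field -> harmonised field (assumed)
    "id_frana": "inspireId",
    "tipo": "typeOfHazard",
    "data_evento": "validityPeriod",
}

def harmonise(record, field_map=FIELD_MAP):
    """Rename mapped fields; keep unmapped ones so no information is lost."""
    return {field_map.get(k, k): v for k, v in record.items()}

out = harmonise({"id_frana": "IT-001", "tipo": "rotational slide",
                 "data_evento": "2014-11-10"})
```

In practice the mapping also involves vocabulary alignment (the semantic harmonisation via ontologies mentioned above), not just renaming, but a declarative map is typically the backbone of the transformation.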
Turbine airfoil with dual wall formed from inner and outer layers separated by a compliant structure
Campbell, Christian X.; Morrison, Jay A. [Oviedo, FL]
2011-12-20
A turbine airfoil usable in a turbine engine, with a cooling system and a compliant dual-wall configuration that enables thermal expansion between inner and outer layers while eliminating stress formation, is disclosed. The compliant dual-wall configuration may be formed from inner and outer layers separated by a compliant structure. The compliant structure may be configured such that the outer layer may thermally expand without limitation by the inner layer. The compliant structure may be formed from a plurality of pedestals positioned generally parallel with each other. The pedestals may include a first foot attached to a first end of the pedestal and extending in a first direction aligned with the outer layer, and a second foot attached to a second end of the pedestal and extending in a second direction aligned with the inner layer.
2000-10-01
control systems and prototyped the approach by porting the ILU ORB from Xerox to the Lynx real-time operating system. They then provided a distributed...compliant real-time operating system, a real-time ORB, and an ODMG-compliant real-time ODBMS [12]. The MITRE system is an infrastructure for...the server's local operating system can handle. For instance, on a node controlled by the VxWorks real-time operating system with 256 local
Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
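The evaluation metrics reported above (recall, precision, balanced F-score) follow directly from true-positive, false-positive and false-negative counts. A sketch of computing them:

```python
# Sketch of the challenge's evaluation metrics from TP/FP/FN counts.

def precision_recall_f1(tp, fp, fn):
    """Return (precision, recall, balanced F1), guarding zero denominators."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

The balanced F-score is the harmonic mean of precision and recall, so (as in the chemical NER result above) an F of 74% requires both components to be reasonably high; neither alone can carry it.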
Improving Data Catalogs with Free and Open Source Software
NASA Astrophysics Data System (ADS)
Schweitzer, R.; Hankin, S.; O'Brien, K.
2013-12-01
The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well-known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to crawl and analyze catalogs and to build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM).
We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor which describes the quality of data catalogs. This rubric, used in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
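The catalog-cleaning idea above can be sketched in a few lines: crawl a THREDDS catalog and keep only the datasets that advertise every required access service. The sample catalog XML and the service names below are illustrative stand-ins, not the UAF tool's actual configuration.

```python
# Minimal sketch of catalog cleaning: keep only datasets whose <access>
# entries cover the full set of required data services.
import xml.etree.ElementTree as ET

REQUIRED = {"odap", "wms"}  # e.g. OPeNDAP and WMS access (assumed names)

SAMPLE_CATALOG = """<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <dataset name="sst" urlPath="sst.nc">
    <access serviceName="odap" urlPath="sst.nc"/>
    <access serviceName="wms" urlPath="sst.nc"/>
  </dataset>
  <dataset name="winds" urlPath="winds.nc">
    <access serviceName="odap" urlPath="winds.nc"/>
  </dataset>
</catalog>"""

def clean_catalog(xml_text: str) -> list:
    ns = {"t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"}
    root = ET.fromstring(xml_text)
    kept = []
    for ds in root.findall("t:dataset", ns):
        services = {a.get("serviceName") for a in ds.findall("t:access", ns)}
        if REQUIRED <= services:  # dataset offers every required service
            kept.append(ds.get("name"))
    return kept

print(clean_catalog(SAMPLE_CATALOG))  # ['sst'] -- only 'sst' has both services
```

A real crawler would additionally follow `catalogRef` links recursively and emit a new, uniform catalog rather than a name list.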
NASA Astrophysics Data System (ADS)
Fortin, V.; Durnford, D.; Gaborit, E.; Davison, B.; Dimitrijevic, M.; Matte, P.
2016-12-01
Environment and Climate Change Canada has recently deployed a water cycle prediction system for the Great Lakes and St. Lawrence River. The model domain includes both the Canadian and US portions of the watershed. It provides 84-h forecasts of weather elements, lake level, lake ice cover and surface currents based on two-way coupling of the GEM numerical weather prediction (NWP) model with the NEMO ocean model. Streamflow of all the major tributaries of the Great Lakes and St. Lawrence River is estimated by the WATROUTE routing model, which routes the surface runoff forecasted by GEM's land-surface scheme and assimilates streamflow observations where available. Streamflow forecasts are updated twice daily and are disseminated through an OGC-compliant Web Map Service (WMS) and a Web Feature Service (WFS). In this presentation, in addition to describing the system and documenting its forecast skill, we show how it is being used by clients for various environmental prediction applications. We then discuss the importance of two-way coupling, land-surface and hillslope modelling and the impact of horizontal resolution on hydrological prediction skill. In the second portion of the talk, we discuss plans for implementing a similar system at the national scale, using what we have learned in the Great Lakes and St. Lawrence watershed. Early results obtained for the headwaters of the Saskatchewan River as well as for the whole Nelson-Churchill watershed are presented.
NASA Astrophysics Data System (ADS)
Ma, Kevin C.; Forsyth, Sydney; Amezcua, Lilyana; Liu, Brent J.
2017-03-01
We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results to allow patient tracking. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatment and data analysis. The system quantifies lesion volumes and identifies and registers lesion locations to track shifts in the volume and number of lesions in a longitudinal study. We aim to evaluate the two most important features of the system, data mining and longitudinal lesion tracking, to demonstrate the MS eFolder's capability to improve clinical workflow efficiency and outcome analysis for research. To evaluate data mining capabilities, we collected radiological and neurological data from 72 patients, 36 Caucasian and 36 Hispanic, matched by gender, disease duration, and age. Data analysis on those patients based on ethnicity was performed, and analysis results are displayed by the system's web-based user interface. The data mining module is able to successfully separate Hispanic and Caucasian patients and compare their disease profiles. For longitudinal lesion tracking, we collected 4 longitudinal cases and simulated different lesion growths over the following year. The eFolder was able to detect changes in lesion volume and identify the lesions with the most change. Data mining and lesion tracking evaluation results show the high potential of the eFolder's usefulness in patient care and informatics research for multiple sclerosis.
Lachance, Chantelle C; Korall, Alexandra M B; Russell, Colin M; Feldman, Fabio; Robinovitch, Stephen N; Mackey, Dawn C
2016-09-01
The aim of this study was to investigate the effects of flooring type and resident weight on external hand forces required to push floor-based lifts in long-term care (LTC). Novel compliant flooring is designed to reduce fall-related injuries among LTC residents but may increase forces required for staff to perform pushing tasks. A motorized lift may offset the effect of flooring on push forces. Fourteen female LTC staff performed straight-line pushes with two floor-based lifts (conventional, motor driven) loaded with passengers of average and 90th-percentile resident weights over four flooring systems (concrete+vinyl, compliant+vinyl, concrete+carpet, compliant+carpet). Initial and sustained push forces were measured by a handlebar-mounted triaxial load cell and compared to participant-specific tolerance limits. Participants rated pushing difficulty. Novel compliant flooring increased initial and sustained push forces and subjective ratings compared to concrete flooring. Compared to the conventional lift, the motor-driven lift substantially reduced initial and sustained push forces and perceived difficulty of pushing for all four floors and both resident weights. Participants exerted forces above published tolerance limits only when using the conventional lift on the carpet conditions (concrete+carpet, compliant+carpet). With the motor-driven lift only, resident weight did not affect push forces. Novel compliant flooring increased linear push forces generated by LTC staff using floor-based lifts, but forces did not exceed tolerance limits when pushing over compliant+vinyl. The motor-driven lift substantially reduced push forces compared to the conventional lift. Results may help to address risk of work-related musculoskeletal injury, especially in locations with novel compliant flooring. © 2016, Human Factors and Ergonomics Society.
Ontology-based, Tissue MicroArray oriented, image centered tissue bank
Viti, Federica; Merelli, Ivan; Caprera, Andrea; Lazzari, Barbara; Stella, Alessandra; Milanesi, Luciano
2008-01-01
Background Tissue MicroArray technique is becoming increasingly important in pathology for the validation of experimental data from transcriptomic analysis. This approach produces many images which need to be properly managed, if possible with an infrastructure able to support tissue sharing between institutes. Moreover, the available frameworks oriented to Tissue MicroArray provide good storage for clinical patient, sample treatment and block construction information, but their utility is limited by the lack of data integration with biomolecular information. Results In this work we propose a web-oriented Tissue MicroArray system that supports researchers in managing bio-samples and, through the use of ontologies, enables tissue sharing aimed at the design of Tissue MicroArray experiments and the evaluation of results. Indeed, our system provides an ontological description both for pre-analysis tissue images and for post-process analysis image results, which is crucial for information exchange. Moreover, working on well-defined terms makes it possible to query web resources for literature articles to integrate both pathology and bioinformatics data. Conclusions Using this system, users associate an ontology-based description to each image uploaded into the database and also integrate results with the ontological description of biosequences identified in every tissue. Moreover, it is possible to integrate the ontological description provided by the user with a fully compliant Gene Ontology definition, enabling statistical studies about the correlation between the analyzed pathology and the most commonly related biological processes. PMID:18460177
NASA Astrophysics Data System (ADS)
Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry
2010-05-01
The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool. (1) Mind maps are used to capture information requirements from domain experts and build a controlled vocabulary, (2) a Python parser processes the XML files generated by the mind maps, (3) Django (Python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM, (4) Python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML, (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (Python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire.
It will also address how the choice of development tools listed above provided a framework that enabled working scientists (whom we would never ordinarily expect to interact with UML and XML) to be part of the iterative development process, and ensured that the CMIP5 model documentation questionnaire reflects what scientists want to know about the models. Keywords: metadata, CMIP5, automatic information capture, tool development
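Step (2) of the pipeline above, parsing mind-map XML into a controlled vocabulary, can be sketched briefly. The node layout below imitates a typical mind-map XML export; the actual METAFOR files and their mapping onto the CIM are more elaborate.

```python
# Illustrative parser: extract a flat controlled vocabulary from mind-map XML.
# The XML structure and term names are invented stand-ins.
import xml.etree.ElementTree as ET

MINDMAP = """<map><node TEXT="ocean model">
  <node TEXT="advection scheme"/>
  <node TEXT="vertical coordinate"/>
</node></map>"""

def vocabulary(xml_text: str) -> list:
    """Collect the TEXT attribute of every node, in document order."""
    root = ET.fromstring(xml_text)
    return [n.get("TEXT") for n in root.iter("node")]

print(vocabulary(MINDMAP))
# ['ocean model', 'advection scheme', 'vertical coordinate']
```

In the questionnaire pipeline, a vocabulary like this would seed the choices offered by the Django-generated form fields.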
Gregory, Shaun D; Stevens, Michael C; Pauls, Jo P; Schummy, Emma; Diab, Sara; Thomson, Bruce; Anderson, Ben; Tansley, Geoff; Salamonsen, Robert; Fraser, John F; Timms, Daniel
2016-09-01
Preventing ventricular suction and venous congestion through balancing flow rates and circulatory volumes with dual rotary ventricular assist devices (VADs) configured for biventricular support is clinically challenging due to their low preload and high afterload sensitivities relative to the natural heart. This study presents the in vivo evaluation of several physiological control systems, which aim to prevent ventricular suction and venous congestion. The control systems included a sensor-based, master/slave (MS) controller that altered left and right VAD speed based on pressure and flow; a sensor-less compliant inflow cannula (IC), which altered inlet resistance and, therefore, pump flow based on preload; a sensor-less compliant outflow cannula (OC) on the right VAD, which altered outlet resistance and thus pump flow based on afterload; and a combined controller, which incorporated the MS controller, compliant IC, and compliant OC. Each control system was evaluated in vivo under step increases in systemic (SVR ∼1400-2400 dyn·s/cm⁵) and pulmonary (PVR ∼200-1000 dyn·s/cm⁵) vascular resistances in four sheep supported by dual rotary VADs in a biventricular assist configuration. Constant speed support was also evaluated for comparison and resulted in suction events during all resistance increases and pulmonary congestion during SVR increases. The MS controller reduced suction events and prevented congestion through an initial sharp reduction in pump flow followed by a gradual return to baseline (5.0 L/min). The compliant IC prevented suction events; however, reduced pump flows and pulmonary congestion were noted during the SVR increase. The compliant OC maintained pump flow close to baseline (5.0 L/min) and prevented suction and congestion during PVR increases. The combined controller responded similarly to the MS controller to prevent suction and congestion events in all cases while providing a backup system in the event of single controller failure.
© 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
Autovibration and chaotic motion of an unbalanced rotor in massive non-linear compliant supports
NASA Astrophysics Data System (ADS)
Pasynkova, I. A.; Stepanova, P. P.
2018-05-01
Stability loss scenarios of an unbalanced rotor with a flexible massless shaft mounted in massive non-linear compliant supports are studied using the example of cylindrical precession. A Duffing-type non-linearity in the compliant supports is considered. The system "rotor - supports" has eight degrees of freedom. Internal and external friction are taken into account. Autovibrations and chaotic vibrations are obtained. The results are confirmed by a numerical check.
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution.
The Matsu Wheel allows many shared data services to be performed together, making efficient use of resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
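The "Wheel" pattern described above, one pass over the data shared by every analytic, can be sketched as follows. The scene representation, the preprocessing step, and the two toy analytics are invented for illustration; the real system operates on hyperspectral imagery in a Hadoop cluster.

```python
# Sketch of the Wheel idea: each scene is read and preprocessed once, and
# every registered analytic is applied within that single pass, so adding an
# analytic costs no additional scan of the data.
def wheel(scenes, analytics):
    """Apply every analytic to each scene in one pass over the data."""
    reports = {name: [] for name in analytics}
    for scene in scenes:                      # data accessed once
        preprocessed = scene.lower()          # stand-in for heavy preprocessing
        for name, fn in analytics.items():    # analytics slot into the pass
            reports[name].append(fn(preprocessed))
    return reports

scenes = ["Flood Plain Scene", "Desert Scene"]
analytics = {"water": lambda s: "flood" in s,
             "anomaly": lambda s: len(s) > 15}
print(wheel(scenes, analytics))
# {'water': [True, False], 'anomaly': [True, False]}
```

The design choice mirrors the abstract: preprocessing dominates the cost, so amortizing it across all analytics is what makes the Wheel efficient.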
Advances in a distributed approach for ocean model data interoperability
Signell, Richard P.; Snowden, Derrick P.
2014-01-01
An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
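Interoperable access of the kind described above typically begins by checking whether a dataset advertises the CF convention before standard tools are applied. The sketch below uses a plain dict to stand in for netCDF global attributes (as exposed by, e.g., netCDF4 or xarray); the attribute values are invented examples.

```python
# Hedged sketch: detect the CF convention version in a dataset's global
# attributes. A dict stands in for real netCDF global attributes.
def cf_version(attrs):
    """Return the CF convention version string, or None if not CF-tagged."""
    conventions = attrs.get("Conventions", "")
    for token in conventions.replace(",", " ").split():
        if token.startswith("CF-"):
            return token
    return None

assert cf_version({"Conventions": "CF-1.6, UGRID-0.9"}) == "CF-1.6"
assert cf_version({"title": "raw model output"}) is None
```

A client would branch on this check: CF-tagged datasets go straight to standards-based tools, while non-standard files need the aggregation and metadata-augmentation step the abstract describes.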
Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data
NASA Astrophysics Data System (ADS)
Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii
2013-04-01
Over last decades we have witnessed the upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood risk directive). In order to enable operational flood monitoring and assessment of flood risk, it is required to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment integration of Grid and Sensor Web approaches is proposed [1]. Grid represents a distributed environment that integrates heterogeneous computing and storage resources administrated by multiple organizations. SensorWeb is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, triggering events by observed or predicted conditions, remote data access and processing capabilities to generate and deliver data products. Sensor Web is governed by the set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding integration of Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from the integration with the Grid platform like Globus Toolkit. 
The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecasts derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by the Grid and cloud infrastructure, it was possible to generate flood maps within 24-48 h after the trigger was alerted. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.
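The trigger pattern at the heart of the Sensor Web workflow above, an observed or predicted condition crossing a threshold and raising an alert that can task a sensor, can be sketched minimally. The station identifiers and threshold are invented; the real system consumes TRMM and CREST products.

```python
# Illustrative sketch of condition-based triggering: rainfall estimates above
# a threshold raise alerts that could drive an SPS re-imaging request.
def flood_alerts(observations, threshold_mm):
    """Return station ids whose rainfall estimate exceeds the alert threshold."""
    return sorted(sid for sid, mm in observations.items() if mm > threshold_mm)

obs = {"zambezi-01": 132.5, "zambezi-02": 18.0, "okavango-07": 95.1}
print(flood_alerts(obs, threshold_mm=90.0))  # ['okavango-07', 'zambezi-01']
```

In the pilot project, each such alert would be handed to the SPS service for EO-1 tasking rather than merely printed.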
Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian
2017-06-05
Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are useful for building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
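The ITEM/ITEM_GROUP pattern mentioned above nests leaf data elements inside reusable groups. The sketch below shows the nesting idea only; the element names for "clinical pharmaceutical" are invented examples, not the TCGA CDEs.

```python
# Hedged sketch of the CIMI ITEM/ITEM_GROUP pattern: an ITEM_GROUP holds
# ITEMs and other groups, forming reusable mini-Archetypes.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str

@dataclass
class ItemGroup:
    name: str
    members: list = field(default_factory=list)

    def flatten(self):
        """List all leaf item names, depth-first."""
        out = []
        for m in self.members:
            out.extend(m.flatten() if isinstance(m, ItemGroup) else [m.name])
        return out

drug = ItemGroup("clinical_pharmaceutical", [
    Item("drug_name"),
    ItemGroup("dosage", [Item("dose_value"), Item("dose_unit")]),
])
print(drug.flatten())  # ['drug_name', 'dose_value', 'dose_unit']
```

In the repository, each such group instance would additionally carry ISO 11179 administrative metadata and be serialized as RDF.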
The impact of a telehealth web-based solution on neurosurgery triage and consultation.
Moya, Monica; Valdez, Jessica; Yonas, Howard; Alverson, Dale C
2010-11-01
To enhance the quality of neurosurgery consultations, triage, and transport decisions between a Level I trauma service neurosurgery program at the University of New Mexico Hospital and referring hospitals, a secure Health Insurance Portability and Accountability Act (HIPAA)-compliant Web-based system was developed, to which digital neurological images could be sent for review by a neurosurgeon for consultation or patient transfer. Based upon the prior experience of the neurosurgery service, it was predicted that 25% of transfer requests would be avoided if the neurosurgeons reviewed the computerized tomography scans at the time of a transfer request. In addition, it was predicted that in 25% of the cases changes in management recommendations would take place independent of the transfer decision. The program was designed to allow referring hospitals to transmit digital images to the Web site, providing consulting doctors with additional patient information. This project analyzed the neurosurgeons' responses to questions designed to determine if transport or management decisions were altered when using this telehealth program in response to a request for consultation or transfer from a rural facility. Analysis of the responses of the consulting neurosurgeons revealed that, after viewing the images, 44% of the potential transfers were avoided and 44% of consulted cases resulted in management recommendation changes independent of the transfer decision. Use of the system resulted in improved triage and changes in transfer or management recommendations. A significant number of potential transfers were avoided, resulting in transport cost avoidance, more effective use of resources, and more appropriate use of the neurosurgery service as well as improved patient preparation.
Framework for Managing Metadata Security Tags as the Basis for Making Security Decisions.
2002-12-01
This thesis presents an analysis of a capability to employ CAPCO (Controlled Access Program Coordination Office)-compliant metadata security tags as the basis for making security decisions.
Indexing method of digital audiovisual medical resources with semantic Web integration.
Cuggia, Marc; Mougin, Fleur; Le Beux, Pierre
2005-03-01
Digitalization of audiovisual resources and network capability offer many possibilities which are the subject of intensive work in scientific and industrial sectors. Indexing such resources is a major challenge. Recently, the Motion Pictures Expert Group (MPEG) has developed MPEG-7, a standard for describing multimedia content. The goal of this standard is to develop a rich set of standardized tools to enable efficient retrieval from digital archives or the filtering of audiovisual broadcasts on the Internet. How could this kind of technology be used in the medical context? In this paper, we propose a simpler indexing system, based on the Dublin Core standard and compliant with MPEG-7. We use MeSH and the UMLS to introduce conceptual navigation. We also present a video platform which enables encoding and gives access to audiovisual resources in streaming mode.
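A Dublin Core record of the kind proposed above is a small, flat set of elements; MeSH headings can be carried in repeated `dc:subject` elements. The element names below are standard Dublin Core, but the sample values and the overall record wrapper are invented for illustration.

```python
# Minimal sketch of a Dublin Core record for an audiovisual medical resource,
# with MeSH terms as dc:subject values.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"

def dc_record(title, fmt, subjects):
    """Build a simple record with dc:title, dc:format and dc:subject elements."""
    ET.register_namespace("dc", DC)
    rec = ET.Element("record")
    ET.SubElement(rec, f"{{{DC}}}title").text = title
    ET.SubElement(rec, f"{{{DC}}}format").text = fmt
    for s in subjects:
        ET.SubElement(rec, f"{{{DC}}}subject").text = s
    return ET.tostring(rec, encoding="unicode")

xml = dc_record("Laparoscopy teaching video", "video/mpeg",
                ["Laparoscopy", "Education, Medical"])
```

Because the vocabulary is controlled (MeSH/UMLS), conceptual navigation reduces to following relations between the subject terms rather than free-text search.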
Constraint-Based Routing Models for the Transport of Radioactive Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Steven K
2015-01-01
The Department of Energy (DOE) has a historic programmatic interest in the safe and secure routing, tracking, and transportation risk analysis of radiological materials in the United States. In order to address these program goals, DOE has funded the development of several tools and related systems designed to provide insight to planners and other professionals handling radioactive materials shipments. These systems include the WebTRAGIS (Transportation Routing Analysis Geographic Information System) platform. WebTRAGIS is a browser-based routing application developed at Oak Ridge National Laboratory (ORNL) focused primarily on the safe transport of spent nuclear fuel from US nuclear reactors via railway, highway, or waterway. It is also used for the transport planning of low-level radiological waste to depositories such as the Waste Isolation Pilot Plant (WIPP) facility. One particular feature of WebTRAGIS is its coupling with high-resolution population data from ORNL's LandScan project. This allows users to obtain highly accurate population count and density information for use in route planning and risk analysis. To perform the routing and risk analysis, WebTRAGIS incorporates a basic routing model methodology, with the additional application of various constraints designed to mimic US Department of Transportation (DOT), DOE, and Nuclear Regulatory Commission (NRC) regulations. Aside from the routing models available in WebTRAGIS, the system relies on detailed or specialized modal networks for the route solutions. These include a highly detailed network model of the US railroad system, the inland and coastal waterways, and a specialized highway network that focuses on the US interstate system and the designated hazardous materials and Highway Route Controlled Quantity (HRCQ)-designated roadways. The route constraints in WebTRAGIS rely upon a series of attributes assigned to the various components of the different modal networks.
Routes are determined via a constrained shortest-path Dijkstra algorithm that has an assigned impedance factor. The route constraints modify the various impedance weights to bias or prefer particular network characteristics as desired by the user. Both the basic route model and the constrained impedance function calculations are determined by a series of network characteristics and shipment types. The study examines solutions under various constraints modeled by WebTRAGIS, including possible routes from select shut-down reactor sites in the US to specific locations in the US. For purposes of illustration, the designated destinations are Oak Ridge National Laboratory in Tennessee and the Savannah River Site in South Carolina. The degree to which routes express sameness or variety under constraints serves to illustrate either (a) the determinism of particular transport modes by either configuration or regulatory compliance, and/or (b) the variety of constrained routes that are regulation-compliant but may not be operationally feasible.
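The constrained-Dijkstra idea above can be sketched compactly: each edge's cost is its length multiplied by an impedance factor for its link class, so raising the impedance of a class biases routes away from it. The tiny network, link classes, and factors below are invented for illustration, not WebTRAGIS data.

```python
# Sketch of impedance-weighted shortest-path routing.
import heapq

def dijkstra(graph, start, goal, impedance):
    """graph: node -> list of (neighbor, length, link_class)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, length, cls in graph.get(node, []):
            if nbr not in seen:
                # edge cost = physical length x class impedance factor
                heapq.heappush(pq, (cost + length * impedance.get(cls, 1.0),
                                    nbr, path + [nbr]))
    return float("inf"), []

net = {"A": [("B", 10, "interstate"), ("C", 6, "local")],
       "B": [("D", 5, "interstate")],
       "C": [("D", 6, "local")]}
# Penalizing local roads shifts the route onto the interstate links:
cost, path = dijkstra(net, "A", "D", impedance={"local": 3.0})
print(path)  # ['A', 'B', 'D']
```

With unit impedance everywhere the shorter local route A-C-D (length 12) would win; the factor of 3 on local links makes the interstate route (cost 15) preferred, which is exactly the bias-or-prefer mechanism the abstract describes.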
ISA-97 Compliant Architecture Testbed (ICAT) Project
1992-03-30
by the System Integration Directorate of the USAISEC, August 29, 1992. The report discusses the refinement of the ISA-97 Compliant Architecture Model...browser and iconic representations of system objects and resources. When the user is interacting with an application which has multiple components, it is...computer communications, it is not uncommon for large information systems to be shared by users on multiple machines. The trend towards the desktop
A document centric metadata registration tool constructing earth environmental data infrastructure
NASA Astrophysics Data System (ADS)
Ichino, M.; Kinutani, H.; Ono, M.; Shimizu, T.; Yoshikawa, M.; Masuda, K.; Fukuda, K.; Kawamoto, H.
2009-12-01
DIAS (Data Integration and Analysis System) is one of the GEOSS activities in Japan. It is also a leading part of the GEOSS task with the same name defined in the GEOSS Ten Year Implementation Plan. The main mission of DIAS is to construct a data infrastructure that can effectively integrate earth environmental data such as observation data, numerical model outputs, and socio-economic data provided from the fields of climate, water cycle, ecosystem, ocean, biodiversity and agriculture. Some of DIAS's data products are available at the following web site: http://www.jamstec.go.jp/e/medid/dias. Most earth environmental data commonly have spatial and temporal attributes such as the covering geographic scope or the created date. The metadata standards including these common attributes are published by the geographic information technical committee (TC211) of ISO (the International Organization for Standardization) as specifications ISO 19115:2003 and ISO 19139:2007. Accordingly, DIAS metadata is developed based on the ISO/TC211 metadata standards. From the viewpoint of data users, metadata is useful not only for data retrieval and analysis but also for interoperability and information sharing among experts, beginners and nonprofessionals. On the other hand, from the viewpoint of data providers, two problems were pointed out after discussions. One is that data providers prefer to minimize additional tasks and the time spent creating metadata. The other is that data providers want to manage and publish documents that explain their data sets more comprehensively. To solve these problems, we have been developing a document-centric metadata registration tool. The features of our tool are that the generated documents are available instantly and there is no extra cost for data providers to generate metadata. The tool is developed as a web application, so it does not require data providers to install any software as long as they have a web browser.
The interface of the tool presents the section titles of the documents; by filling out the content of each section, the documents for the data sets are automatically published in PDF and HTML format. Furthermore, a metadata XML file compliant with ISO 19115 and ISO 19139 is created at the same moment. The generated metadata are managed in the metadata database of the DIAS project, and will be used in various ISO 19139-compliant metadata management tools, such as GeoNetwork.
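The document-centric idea — the same section content drives both the human-readable document and the metadata record — can be sketched as follows. This is purely illustrative: the element names are simplified stand-ins for the real ISO 19139 schema (which uses gmd:/gco: namespaces and far more structure), and the section titles and text are invented examples.

```python
import xml.etree.ElementTree as ET

def build_metadata(sections):
    """Map document sections (title -> text) to a minimal metadata record."""
    root = ET.Element("MD_Metadata")  # simplified stand-in for the ISO root
    for title, text in sections.items():
        section = ET.SubElement(root, "section", {"title": title})
        section.text = text
    return ET.tostring(root, encoding="unicode")

# Hypothetical section content a data provider might enter:
xml_record = build_metadata({
    "Abstract": "Daily river discharge observations for the Lena basin.",
    "Lineage": "Gauge readings, quality-controlled and gap-filled.",
})
```

In the real tool this record would be produced at the same moment as the PDF/HTML documents, so metadata creation costs the provider no extra step.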
NASA Astrophysics Data System (ADS)
Kirsch, Peter; Breen, Paul
2013-04-01
We wish to highlight outputs of a project conceived from a science requirement to improve discovery of and access to Antarctic meteorological data in near real-time. Given that the data is distributed in both the spatial and temporal domains and is accessed across several science disciplines, the creation of an interoperable, OGC-compliant web service was deemed the most appropriate approach. We will demonstrate an implementation of the OGC SOS Interface Standard to discover, browse, and access Antarctic meteorological data-sets. A selection of programmatic (R, Perl) and web client interfaces utilizing open technologies (e.g. jQuery, Flot, OpenLayers) will be demonstrated. In addition we will show how high-level abstractions can be constructed to allow users flexible and straightforward access to SOS-retrieved data.
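A client request against such a service might look like the following sketch, which builds an SOS 1.0 GetObservation request in the key-value-pair binding. The endpoint, offering and observed property names are hypothetical placeholders, not the project's actual identifiers.

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property, t0, t1):
    """Assemble an OGC SOS 1.0.0 GetObservation request as a KVP URL."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": f"{t0}/{t1}",          # ISO 8601 time interval
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation_url(
    "https://example.org/sos",     # hypothetical endpoint
    "met_station_rothera",         # hypothetical offering
    "air_temperature",
    "2013-01-01T00:00:00Z", "2013-01-02T00:00:00Z",
)
```

Higher-level abstractions like those mentioned above would wrap this kind of request construction and the parsing of the returned O&M XML.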
NASA Astrophysics Data System (ADS)
Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim
2014-05-01
The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Prompted by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data is published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface for requesting geographic features. In addition, GeoServer implements, among others, a high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is based solely on Free and Open Source Software and relies on the WebGL JavaScript API, which allows the interactive rendering of 2D and 3D graphics with GPU acceleration as part of the web page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera).
The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will ease spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. They will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in decision processes (e-Governance). Yet the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
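The 2D map layers described above are retrieved from GeoServer via WMS GetMap requests; a minimal sketch of assembling such a request is shown below. The endpoint, layer name and bounding box are hypothetical placeholders (the CRS is an example UTM zone for Brandenburg), not the project's actual configuration.

```python
from urllib.parse import urlencode

def wms_get_map_url(endpoint, layer, bbox, width, height, crs="EPSG:25833"):
    """Assemble a WMS 1.3.0 GetMap request for a geo-referenced map image."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": layer,
        "crs": crs,
        "bbox": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "width": width,
        "height": height,
        "format": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_get_map_url("https://example.org/geoserver/wms",
                      "b3d:geological_map",   # hypothetical layer name
                      (290000, 5690000, 470000, 5910000), 800, 600)
```

The client would draw the returned PNG onto the WebGL-backed canvas alongside the W3DS-delivered 3D content.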
Compliant walking appears metabolically advantageous at extreme step lengths.
Kim, Jaehoon; Bertram, John E A
2018-05-19
Humans alter gait in response to unusual circumstances to accomplish the task of walking. For instance, subjects spontaneously increase leg compliance beyond a step length threshold as step length increases. Here we test the hypothesis that this transition occurs based on the level of energy expenditure, where compliant walking becomes less energetically demanding at long step lengths. Our objective was to map and compare the metabolic cost of normal and compliant walking as step length increases. Ten healthy individuals walked on a treadmill using progressively increasing step lengths (100%, 120%, 140% and 160% of preferred step length) in both normal and compliant leg walking, while energy expenditure was recorded via indirect calorimetry. Leg compliance was controlled by lowering the center-of-mass trajectory during stance, forcing the leg to flex and extend as the body moved over the foot contact. For normal step lengths, compliant leg walking was more costly than normal walking, but its energetic cost did not increase as rapidly at longer step lengths. This led to an intersection between the normal and compliant walking cost curves at 114% relative step length (regression analysis; r² = 0.92 for normal walking; r² = 0.65 for compliant walking). Compliant leg walking is less energetically demanding at longer step lengths, where a spontaneous shift to compliant walking has been observed, suggesting the human motor control system is sensitive to energetic requirements and will employ alternate movement patterns if advantageous strategies are available. The transition can be attributed to the interplay between (i) leg work controlling body travel during single stance and (ii) leg work controlling energy loss in the step-to-step transition. Compliant leg walking requires more stance leg work at normal step lengths, but involves less energy loss at the step-to-step transition for very long steps. Copyright © 2018 Elsevier B.V. All rights reserved.
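The crossing of the two cost curves can be illustrated with a toy calculation. The quadratic coefficients below are purely illustrative (not the study's fitted values): they encode the qualitative finding that normal-walking cost rises steeply with step length while compliant walking starts costlier but rises more slowly, with the crossing placed near the reported 114%.

```python
import math

def crossing(a_n, a_c, base_n, base_c):
    """Solve base_n + a_n*(s-1)**2 == base_c + a_c*(s-1)**2 for s > 1,
    where s is step length relative to preferred (s = 1 is 100%)."""
    return 1.0 + math.sqrt((base_c - base_n) / (a_n - a_c))

# Illustrative curves: normal walking rises fast (a_n), compliant walking
# starts higher (base_c > base_n) but rises slowly (a_c << a_n).
s_cross = crossing(a_n=32.0, a_c=1.4, base_n=3.0, base_c=3.6)
# s_cross lands near 1.14, i.e. compliant walking becomes cheaper beyond ~114%
```

Beyond this crossing, the cheaper strategy switches, which is the energetic rationale the study proposes for the observed spontaneous gait transition.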
Supporting NEESPI with Data Services - The SIB-ESS-C e-Infrastructure
NASA Astrophysics Data System (ADS)
Gerlach, R.; Schmullius, C.; Frotscher, K.
2009-04-01
Data discovery and retrieval is commonly among the first steps performed for any Earth science study. The way scientific data is searched and accessed has changed significantly over the past two decades. In particular, the development of the World Wide Web and the technologies that evolved alongside it has shortened the data discovery and data exchange process. At the same time, the amount of data collected and distributed by earth scientists has increased exponentially, requiring new concepts for data management and sharing. One such concept to meet this demand is to build Spatial Data Infrastructures (SDI) or e-Infrastructures. These infrastructures usually contain components for data discovery, allowing users (or other systems) to query a catalogue or registry and retrieve metadata on available data holdings and services. Data access is typically granted using FTP/HTTP protocols or, more advanced, through Web Services. A Service Oriented Architecture (SOA) approach based on standardized services enables users to benefit from interoperability among different systems and to integrate distributed services into their applications. The Siberian Earth System Science Cluster (SIB-ESS-C), being established at the University of Jena (Germany), is such a spatial data infrastructure, following these principles and implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). The prime objective is to provide researchers focusing on Siberia with the technical means for data discovery, data access, data publication and data analysis. The region of interest covers the entire Asian part of the Russian Federation from the Ural to the Pacific Ocean, including the Ob, Lena and Yenissey river catchments. The aim of SIB-ESS-C is to provide a comprehensive set of data products for Earth system science in this region.
Although SIB-ESS-C will be equipped with processing capabilities for in-house data generation (mainly from Earth observation), the current data holdings of SIB-ESS-C have been created in collaboration with a number of partners in previous and ongoing research projects (e.g. SIBERIA-II, SibFORD, IRIS). At the current development stage the SIB-ESS-C system comprises a federated metadata catalogue accessible through the SIB-ESS-C Web Portal or from any OGC-CSW-compliant client. Thanks to full interoperability with other metadata catalogues, users of the SIB-ESS-C Web Portal are also able to search external metadata repositories. The Web Portal also contains a simple visualization component, which will be extended to a comprehensive visualization and analysis tool in the near future. All data products are already accessible as a Web Map Service and will soon be made available as Web Feature and Web Coverage Services, allowing users to incorporate the data directly into their applications. The SIB-ESS-C infrastructure will be further developed as one node in a network of similar systems (e.g. NASA GIOVANNI) in the NEESPI region.
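An OGC-CSW-compliant client talks to such a federated catalogue by POSTing a GetRecords request. The sketch below assembles a heavily simplified CSW 2.0.2 request body; real requests usually add a Constraint element with an OGC Filter (bounding box, keywords), omitted here for brevity.

```python
import xml.etree.ElementTree as ET

# CSW 2.0.2 namespace, as used by catalogue services such as the one described.
CSW = "http://www.opengis.net/cat/csw/2.0.2"

req = ET.Element(f"{{{CSW}}}GetRecords", {
    "service": "CSW", "version": "2.0.2",
    "resultType": "results", "maxRecords": "10",
})
query = ET.SubElement(req, f"{{{CSW}}}Query", {"typeNames": "csw:Record"})
ET.SubElement(query, f"{{{CSW}}}ElementSetName").text = "summary"

body = ET.tostring(req, encoding="unicode")  # POST this to the CSW endpoint
```

The catalogue answers with ISO-style metadata records that the portal (or any other compliant client) can render or forward.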
Scherer, N M; Basso, D M
2008-09-16
DNATagger is a web-based tool for coloring and editing DNA, RNA and protein sequences and alignments. It is dedicated to the visualization of protein coding sequences and also protein sequence alignments to facilitate the comprehension of evolutionary processes in sequence analysis. The distinctive feature of DNATagger is the use of codons as informative units for coloring DNA and RNA sequences. The codons are colored according to their corresponding amino acids. It is the first program that colors codons in DNA sequences without being affected by "out-of-frame" gaps of alignments. It can handle single gaps and gaps inside the triplets. The program also provides the possibility to edit the alignments and change color patterns and translation tables. DNATagger is a JavaScript application, following the W3C guidelines, designed to work on standards-compliant web browsers. It therefore requires no installation and is platform independent. The web-based DNATagger is available as free and open source software at http://www.inf.ufrgs.br/~dmbasso/dnatagger/.
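The codon-as-unit idea — grouping nucleotides into triplets without letting alignment gaps shift the reading frame — can be sketched in a few lines. The codon table below is a tiny illustrative subset of the standard genetic code, not DNATagger's actual implementation (which is JavaScript and also handles color assignment and editing).

```python
# Minimal illustrative subset of the standard codon table.
CODON_TABLE = {"ATG": "M", "TTT": "F", "GGA": "G", "TAA": "*"}

def codons_with_gaps(aligned_seq):
    """Yield (codon, amino_acid) pairs, ignoring '-' gaps inside triplets."""
    buffer, result = "", []
    for base in aligned_seq:
        if base == "-":      # gaps do not advance the reading frame
            continue
        buffer += base
        if len(buffer) == 3:
            result.append((buffer, CODON_TABLE.get(buffer, "?")))
            buffer = ""
    return result

pairs = codons_with_gaps("ATG--TT-TGGA")
# the gaps between and inside triplets do not break the frame: ATG, TTT, GGA
```

Coloring is then a simple lookup from amino acid to color per recovered codon, which is why "out-of-frame" gaps do not disturb the display.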
Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra; Pereira, Emiliano; Schnetzer, Julia; Arvanitidis, Christos; Jensen, Lars Juhl
2016-01-01
The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, sample manual annotation is a highly labor intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/. © The Author(s) 2016. Published by Oxford University Press.
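At its simplest, the term-spotting side of such a system resembles dictionary lookup against controlled vocabularies. The sketch below is a drastically reduced stand-in: the term lists are invented examples in the spirit of ENVO and NCBI Taxonomy, whereas EXTRACT combines published named entity recognition methods rather than plain substring matching.

```python
# Hypothetical mini-dictionaries standing in for real vocabularies.
TERMS = {
    "hydrothermal vent": "ENVO (environment)",
    "seawater": "ENVO (environment)",
    "Escherichia coli": "NCBI Taxonomy (organism)",
}

def spot_terms(text):
    """Return sorted (term, source) pairs found in the text."""
    lowered = text.lower()
    return sorted((term, source) for term, source in TERMS.items()
                  if term.lower() in lowered)

hits = spot_terms("Samples of seawater were collected near a hydrothermal vent.")
```

A curator-facing tool then highlights these spans so the curator can accept or reject each suggested annotation, which is where the reported 15-25% speed-up comes from.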
Turbine airfoil with a compliant outer wall
Campbell, Christian X [Oviedo, FL; Morrison, Jay A [Oviedo, FL
2012-04-03
A turbine airfoil usable in a turbine engine with a cooling system and a compliant dual wall configuration, configured to enable thermal expansion between inner and outer layers while eliminating stress formation in the outer layer, is disclosed. The compliant dual wall configuration may be formed from inner and outer layers separated by a support structure. The outer layer may be a compliant layer configured such that it may thermally expand and thereby reduce the stress within it. The outer layer may be formed from a nonplanar surface configured to thermally expand. In another embodiment, the outer layer may be planar and include a plurality of slots enabling unrestricted thermal expansion in a direction aligned with the outer layer.
Minati, L; Ghielmetti, F; Ciobanu, V; D'Incerti, L; Maccagnano, C; Bizzi, A; Bruzzone, M G
2007-03-01
Advanced neuroimaging techniques, such as functional magnetic resonance imaging (fMRI), chemical shift spectroscopy imaging (CSI), diffusion tensor imaging (DTI), and perfusion-weighted imaging (PWI), create novel challenges in terms of data storage and management: huge amounts of raw data are generated, the results of analysis may depend on the software and settings that have been used, and most often intermediate files are inherently not compliant with the current DICOM (Digital Imaging and Communications in Medicine) standard, as they contain multidimensional complex and tensor arrays and various other types of data structures. A software architecture, referred to as the Bio-Image Warehouse System (BIWS), which can be used alongside a radiology information system/picture archiving and communication system (RIS/PACS) to store neuroimaging data for research purposes, is presented. The system architecture is conceived to enable querying by diagnosis according to a predefined two-layered classification taxonomy. The operational impact of the system and the time needed to get acquainted with the web-based interface and with the taxonomy are found to be limited. The development of modules enabling the automated creation of statistical templates is proposed.
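A two-layered query-by-diagnosis scheme can be pictured as a nested mapping from top-level class to diagnosis to studies. The class and diagnosis names below are invented for illustration; they are not taken from the BIWS taxonomy itself.

```python
# Hypothetical two-layer taxonomy: top-level class -> diagnosis -> studies.
TAXONOMY = {
    "neoplastic": {"glioma": ["study-001", "study-007"],
                   "meningioma": ["study-003"]},
    "vascular": {"ischemic stroke": ["study-004"]},
}

def query_by_diagnosis(taxonomy, top_level, diagnosis=None):
    """Return study IDs for a whole top-level class or one specific diagnosis."""
    branch = taxonomy.get(top_level, {})
    if diagnosis is not None:
        return branch.get(diagnosis, [])
    return [s for studies in branch.values() for s in studies]

glioma_studies = query_by_diagnosis(TAXONOMY, "neoplastic", "glioma")
all_neoplastic = query_by_diagnosis(TAXONOMY, "neoplastic")
```

Querying at either layer is what makes the taxonomy useful both for narrow case retrieval and for assembling broader research cohorts, e.g. for the proposed statistical templates.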
NASA Technical Reports Server (NTRS)
Tesar, Delbert; Tosunoglu, Sabri; Lin, Shyng-Her
1990-01-01
Research results on general serial robotic manipulators modeled with structural compliances are presented. Two compliant manipulator modeling approaches, distributed and lumped parameter models, are used in this study. System dynamic equations for both compliant models are derived by using the first- and second-order influence coefficients. Also, the properties of compliant manipulator system dynamics are investigated. One of these properties, defined as the inaccessibility of vibratory modes, is shown to display a distinct character associated with compliant manipulators. This property indicates the impact of robot geometry on the control of structural oscillations. Example studies are provided to illustrate the physical interpretation of the inaccessibility of vibratory modes. Two types of controllers are designed for compliant manipulators modeled by either lumped or distributed parameter techniques. In order to maintain the generality of the results, no linearization is introduced. Example simulations are given to demonstrate the controller performance. The second controller, also built for general serial robot arms, is adaptive in nature: it can estimate uncertain payload parameters on-line while simultaneously maintaining trajectory tracking. The relation between manipulator motion tracking capability and convergence of the parameter estimation is discussed through example case studies. The effect of control input update delays on adaptive controller performance is also studied.
Dual-Arm Generalized Compliant Motion With Shared Control
NASA Technical Reports Server (NTRS)
Backes, Paul G.
1994-01-01
Dual-Arm Generalized Compliant Motion (DAGCM) primitive computer program implementing improved unified control scheme for two manipulator arms cooperating in task in which both grasp same object. Provides capabilities for autonomous, teleoperation, and shared control of two robot arms. Unifies cooperative dual-arm control with multi-sensor-based task control and makes complete task-control capability available to higher-level task-planning computer system via large set of input parameters used to describe desired force and position trajectories followed by manipulator arms. Some concepts discussed in "A Generalized-Compliant-Motion Primitive" (NPO-18134).
DIRAC3 - the new generation of the LHCb grid software
NASA Astrophysics Data System (ADS)
Tsaregorodtsev, A.; Brook, N.; Casajus Ramo, A.; Charpentier, Ph; Closier, J.; Cowan, G.; Graciani Diaz, R.; Lanciotti, E.; Mathe, Z.; Nandakumar, R.; Paterson, S.; Romanovsky, V.; Santinelli, R.; Sapunov, M.; Smith, A. C.; Seco Miguelez, M.; Zhelezov, A.
2010-04-01
DIRAC, the LHCb community Grid solution, was considerably reengineered in order to meet all the requirements for processing the data coming from the LHCb experiment. It covers all tasks, starting with raw data transportation from the experiment area to grid storage, through data processing, up to the final user analysis. The reengineered DIRAC3 version of the system includes a fully grid-security-compliant framework for building service-oriented distributed systems; a complete Pilot Job framework for creating efficient workload management systems; and several subsystems to manage high-level operations such as data production and distribution management. The user interfaces of the DIRAC3 system, providing rich command line and scripting tools, are complemented by a full-featured Web portal giving users secure access to all the details of the system status and ongoing activities. We will present an overview of the DIRAC3 architecture, new innovative features and the achieved performance. Extending DIRAC3 to manage computing resources beyond the WLCG grid will be discussed. Experience with using DIRAC3 by user communities other than LHCb and in application domains other than High Energy Physics will be shown to demonstrate the general-purpose nature of the system.
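The pilot-job pattern at the heart of such a workload management system can be reduced to a simple loop: a pilot lands on a worker node, then repeatedly pulls payload jobs from a central task queue until none remain. The sketch below is schematic only; names are illustrative, and DIRAC's real WMS adds job matching, priorities and proxy handling.

```python
from queue import Queue, Empty

def run_pilot(task_queue, execute):
    """Drain the central queue, executing each payload job in turn."""
    completed = []
    while True:
        try:
            job = task_queue.get_nowait()
        except Empty:
            break                 # queue drained: the pilot terminates
        completed.append(execute(job))
    return completed

# Hypothetical payload jobs queued by the production system:
jobs = Queue()
for name in ["reco_run_1", "reco_run_2", "user_analysis_42"]:
    jobs.put(name)

done = run_pilot(jobs, execute=lambda job: f"{job}:ok")
```

Because the pilot only requests work once it is already running on a healthy node, late binding of jobs to resources makes the overall system far more efficient than direct job submission.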
A new reference implementation of the PSICQUIC web service.
del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Jimenez, Rafael C; Galeota, Eugenia; Launay, Guillaume; Goll, Johannes; Breuer, Karin; Ono, Keiichiro; Salwinski, Lukasz; Hermjakob, Henning
2013-07-01
The Proteomics Standard Initiative Common QUery InterfaCe (PSICQUIC) specification was created by the Human Proteome Organization Proteomics Standards Initiative (HUPO-PSI) to enable computational access to molecular-interaction data resources by means of a standard Web Service and query language. Currently providing >150 million binary interaction evidences from 28 servers globally, the PSICQUIC interface allows the concurrent search of multiple molecular-interaction information resources using a single query. Here, we present an extension of the PSICQUIC specification (version 1.3), which has been released to be compliant with the enhanced standards in molecular interactions. The new release also includes a new reference implementation of the PSICQUIC server available to the data providers. It offers augmented web service capabilities and improves the user experience. PSICQUIC has been running for almost 5 years, with a user base growing from only 4 data providers to 28 (April 2013) allowing access to 151 310 109 binary interactions. The power of this web service is shown in PSICQUIC View web application, an example of how to simultaneously query, browse and download results from the different PSICQUIC servers. This application is free and open to all users with no login requirement (http://www.ebi.ac.uk/Tools/webservices/psicquic/view/main.xhtml).
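The PSICQUIC REST pattern appends a MIQL query to a registered service's base URL and returns rows in the tab-separated MITAB format. The sketch below shows both halves; the base URL is a hypothetical placeholder, and only the first two MITAB columns (the interactor identifiers) are parsed.

```python
from urllib.parse import quote

def psicquic_query_url(rest_base, miql):
    """Build a REST query URL from a service base and a MIQL query string."""
    return f"{rest_base}/query/{quote(miql)}"

def parse_mitab_line(line):
    """Return the two interactor identifier columns of one MITAB row."""
    columns = line.rstrip("\n").split("\t")
    return columns[0], columns[1]

url = psicquic_query_url(
    "https://example.org/psicquic/webservices/current/search",  # placeholder
    'species:human AND detmethod:"two hybrid"',
)
# A truncated, illustrative MITAB row (real rows carry 15+ columns):
id_a, id_b = parse_mitab_line(
    "uniprotkb:P04637\tuniprotkb:Q00987\tpsi-mi:...\t..."
)
```

Because every registered provider answers the same query language with the same tabular format, a client can fan one query out to all 28 servers and merge the results, which is exactly what PSICQUIC View does.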
ART OF THE POSSIBLE: SECURING AIR FORCE SPACE COMMAND MISSION SYSTEMS FOR THE WARFIGHTER
2016-10-23
…Initiation (Adversarial); Table 2. Assessment Scale - Likelihood of Threat Event Occurrence (Non-Adversarial); Table 3. Assessment Scale… action to thwart the attacks from adversarial nation states and non-state actors alike. While there are numerous cybersecurity concerns, or non-compliant cybersecurity controls across all weapon systems, not all non-compliant controls contribute equally to the cyber-attack surface and overall…
Compliant displacement-multiplying apparatus for microelectromechanical systems
Kota, Sridhar; Rodgers, M. Steven; Hetrick, Joel A.
2001-01-01
A pivotless compliant structure is disclosed that can be used to increase the geometric advantage or mechanical advantage of a microelectromechanical (MEM) actuator such as an electrostatic comb actuator, a capacitive-plate electrostatic actuator, or a thermal actuator. The compliant structure, based on a combination of interconnected flexible beams and cross-beams formed of one or more layers of polysilicon or silicon nitride, can provide a geometric advantage of from about 5:1 to about 60:1 to multiply a 0.25-3 µm displacement provided by a short-stroke actuator so that such an actuator can be used to generate a displacement stroke of about 10-34 µm to operate a ratchet-driven MEM device or a microengine. The compliant structure has less play than conventional displacement-multiplying devices based on lever arms and pivoting joints, and is expected to be more reliable than such devices. The compliant structure and an associated electrostatic or thermal actuator can be formed on a common substrate (e.g. silicon) using surface micromachining.
A metadata-aware application for remote scoring and exchange of tissue microarray images
2013-01-01
Background The use of tissue microarrays (TMA) and advances in digital scanning microscopy have enabled the collection of thousands of tissue images. There is a need for software tools to annotate, query and share this data amongst researchers in different physical locations. Results We have developed an open-source web-based application for remote scoring of TMA images, which exploits the value of Microsoft Silverlight Deep Zoom to provide an intuitive interface for zooming and panning around digital images. We use and extend existing XML-based standards to ensure that the data collected can be archived and that our system is interoperable with other standards-compliant systems. Conclusion The application has been used for multi-centre scoring of TMA slides composed of tissues from several Phase III breast cancer trials and ten different studies participating in the Breast Cancer Association Consortium (BCAC). The system has enabled researchers to simultaneously score large collections of TMA and export the standardised data to integrate with pathological and clinical outcome data, thereby facilitating biomarker discovery. PMID:23635078
Mobile Visualization and Analysis Tools for Spatial Time-Series Data
NASA Astrophysics Data System (ADS)
Eberle, J.; Hüttich, C.; Schmullius, C.
2013-12-01
The Siberian Earth System Science Cluster (SIB-ESS-C) provides access and analysis services for spatial time-series data built on products from the Moderate Resolution Imaging Spectroradiometer (MODIS) and climate data from meteorological stations. A web portal for data access, visualization and analysis based on standard-compliant web services had already been developed for SIB-ESS-C. As a further enhancement, a mobile app was developed to provide easy access to these time-series data during field campaigns. The app sends the current position from the GPS receiver and a specific dataset (such as land surface temperature or vegetation indices), selected by the user, to our SIB-ESS-C web service and receives the requested time-series data for the identified pixel in real time. The data is then plotted directly in the app. Furthermore, the user can analyze the time-series data for breakpoints and other phenological values. These analyses are executed on demand on our SIB-ESS-C web server, and the results are transferred to the app. Any analysis can also be done via the SIB-ESS-C web portal. The aim of this work is to make spatial time-series data and analysis functions available to end users without the need for data processing. In this presentation the author gives an overview of this new mobile app, its functionality, the technical infrastructure, and technological issues (how the app was developed and the lessons learned).
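The server-side step that turns the GPS position sent by the app into a time series begins by mapping latitude/longitude to a pixel in the gridded product. The geotransform values below are illustrative (a 0.25-degree grid anchored at 60°E, 80°N covering Siberia), not SIB-ESS-C's actual grid definition.

```python
def latlon_to_pixel(lat, lon, origin_lat=80.0, origin_lon=60.0, res=0.25):
    """Map a lat/lon position to (row, col) in a regular geographic grid."""
    col = int((lon - origin_lon) / res)   # columns increase eastward
    row = int((origin_lat - lat) / res)   # rows increase southward
    return row, col

row, col = latlon_to_pixel(lat=62.5, lon=129.75)  # a position near Yakutsk
```

Once the pixel is known, the service simply extracts that pixel's values across all time steps of the MODIS product and returns them for plotting in the app.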
The VO-Dance web application at the IA2 data center
NASA Astrophysics Data System (ADS)
Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo
2012-09-01
Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible in adding new instruments from other telescopes. VO resource publishing, along with the data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need arises for tools to easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamic way from already existing database tables or views. The tool consists of a Java web application, potentially DBMS- and platform-independent, that internally stores the services' metadata and information, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response to the call, VO-Dance translates the database answer back in a VO-compliant way.
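The dynamic translation step can be illustrated with a Simple Cone Search style request (RA, DEC, search radius in degrees) rewritten as SQL over an already existing table. Table and column names below are hypothetical configuration values of the kind VO-Dance stores per service; a real implementation must also handle spherical geometry (cos-declination scaling, RA wrap-around) rather than a plain box.

```python
def cone_search_sql(table, ra_col, dec_col, ra, dec, sr):
    """Naive box approximation of a cone search as a SQL query string."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {ra_col} BETWEEN {ra - sr} AND {ra + sr} "
        f"AND {dec_col} BETWEEN {dec - sr} AND {dec + sr}"
    )

# Hypothetical published view and column mapping:
sql = cone_search_sql("tng_observations", "ra_deg", "dec_deg",
                      ra=150.0, dec=2.2, sr=0.1)
```

The rows returned by the database are then serialized back into a VOTable, which is what "translating the answer back in a VO-compliant way" amounts to.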
NASA Astrophysics Data System (ADS)
Ma, Kevin; Wong, Jonathan; Zhong, Mark; Zhang, Jeff; Liu, Brent
2014-03-01
In the past, we have presented an imaging-informatics-based eFolder system for managing and analyzing imaging and lesion data of multiple sclerosis (MS) patients, which allows for data storage, data analysis, and data mining in clinical and research settings. The system integrates the patient's clinical data with imaging studies and a computer-aided detection (CAD) algorithm for quantifying MS lesion volume, lesion contour, locations, and sizes in brain MRI studies. For compliance with IHE integration protocols, long-term storage in PACS, and data query and display in a DICOM-compliant clinical setting, CAD results need to be converted into DICOM Structured Report (SR) format. The open-source dcmtk toolkit and customized XML templates are used to convert quantitative MS CAD results from MATLAB to DICOM-SR format. A web-based GUI, based on our existing web-accessible DICOM object (WADO) image viewer, has been designed to display the CAD results from the generated SR files. The GUI is able to parse DICOM-SR files and extract SR document data, then display lesion volume, location, and brain matter volume along with the referenced DICOM imaging study. In addition, the GUI supports lesion contour overlay, which matches a detected MS lesion with its corresponding DICOM-SR data when a user selects either the lesion or the data. The methodology of converting CAD data in native MATLAB format to DICOM-SR, and of displaying the tabulated DICOM-SR along with the patient's clinical information and relevant study images in the GUI, will be demonstrated. The developed SR conversion model and GUI support aim to further demonstrate how to incorporate CAD post-processing components in a PACS and imaging-informatics-based environment.
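The intermediate XML step in such a pipeline — serializing quantitative CAD results into a template that a converter can then turn into DICOM-SR — might look like the following sketch. The element names, patient ID and lesion values are illustrative stand-ins, not the project's actual template schema.

```python
import xml.etree.ElementTree as ET

def cad_results_to_xml(patient_id, lesions):
    """Serialize per-lesion CAD measurements into a simple XML template."""
    root = ET.Element("report", {"patient": patient_id})
    for lesion in lesions:
        node = ET.SubElement(root, "lesion", {"location": lesion["location"]})
        ET.SubElement(node, "volume_ml").text = str(lesion["volume_ml"])
    return ET.tostring(root, encoding="unicode")

# Hypothetical CAD output for one study:
xml_doc = cad_results_to_xml("MS-0042", [
    {"location": "periventricular", "volume_ml": 0.8},
    {"location": "juxtacortical", "volume_ml": 0.3},
])
```

Keeping the measurements in a structured intermediate form is what lets the same results feed both the SR conversion and the tabulated display in the web GUI.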
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web-service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the earth and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC) Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, the open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
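The OGC services named above are plain HTTP interfaces, so a client needs little more than a correctly formed request URL. As a minimal sketch, the following builds a WMS 1.1.1 GetMap request; the endpoint URL and layer name are hypothetical, not the actual LMMP service.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the given spatial reference system.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/lmmp/wms", "LRO_WAC_Mosaic",
                     (-180, -90, 180, 90), 512, 256)
```

Because the parameters are standardized, the same URL shape works against any compliant WMS server, which is what makes the Portal's services accessible from iPad, Android, and desktop clients alike.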
PPI layouts: BioJS components for the display of Protein-Protein Interactions.
Salazar, Gustavo A; Meintjes, Ayton; Mulder, Nicola
2014-01-01
We present two web-based components for the display of Protein-Protein Interaction networks using different self-organizing layout methods: force-directed and circular. These components conform to the BioJS standard and can be rendered in an HTML5-compliant browser without the need for third-party plugins. We provide examples of interaction networks and how the components can be used to visualize them, and refer to a more complex tool that uses these components. http://github.com/biojs/biojs; http://dx.doi.org/10.5281/zenodo.7753.
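The force-directed layout method the components use is language-agnostic. As an illustrative sketch, here is a minimal Fruchterman-Reingold-style spring embedder in Python; it is not the BioJS implementation, and the node names are made up.

```python
import math, random

def force_directed_layout(nodes, edges, iterations=200, k=0.3):
    """Minimal spring-embedder layout: all node pairs repel, nodes
    joined by an interaction edge attract.

    nodes: list of ids; edges: list of (a, b) pairs.
    Returns {node: (x, y)} positions.
    """
    random.seed(42)  # fixed seed for a reproducible layout
    pos = {n: (random.uniform(-1, 1), random.uniform(-1, 1)) for n in nodes}
    for _ in range(iterations):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsion between every pair of nodes.
        for i, a in enumerate(nodes):
            for b in nodes[i + 1:]:
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-6
                f = k * k / d
                disp[a][0] += dx / d * f; disp[a][1] += dy / d * f
                disp[b][0] -= dx / d * f; disp[b][1] -= dy / d * f
        # Attraction along interaction edges.
        for a, b in edges:
            dx = pos[a][0] - pos[b][0]
            dy = pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-6
            f = d * d / k
            disp[a][0] -= dx / d * f; disp[a][1] -= dy / d * f
            disp[b][0] += dx / d * f; disp[b][1] += dy / d * f
        # Cap the displacement per step for stability.
        step = 0.02
        for n in nodes:
            dx, dy = disp[n]
            d = math.hypot(dx, dy) or 1e-6
            pos[n] = (pos[n][0] + dx / d * min(d, step),
                      pos[n][1] + dy / d * min(d, step))
    return pos

# A triangle of interacting proteins plus one unconnected protein.
layout = force_directed_layout(["P1", "P2", "P3", "P4"],
                               [("P1", "P2"), ("P2", "P3"), ("P3", "P1")])
```

After iteration, connected nodes settle close together while the isolated node is pushed away, which is the self-organizing behavior that makes such layouts useful for interaction networks.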
SU-E-T-211: Peer Review System for Ensuring Quality of Radiation Therapy Treatments.
Kapoor, R; Kapur, P; Kumar, S A; Alex, D; Ranka, S; Palta, J
2012-06-01
To demonstrate a Web-based electronic peer review system that has the potential to improve quality of care for radiation therapy patients. The system provides tools that allow radiation oncologists to seek peer review of target and critical structure delineation and treatment plans, and to share clinical data with peers to optimize radiation therapy treatments. Peer review of radiation therapy treatment planning data prior to its initiation improves the quality of radiation therapy and clinical outcomes. Web-based access to radiation therapy treatment planning data and medical records mitigates existing geographical and temporal constraints. With internet access, the healthcare provider can access the data from any location and review it in an interactive and collaborative manner. Interoperability standards like DICOM-RT and IHE-RO-compliant RT systems have facilitated the design and implementation of the PRS with Silverlight Web technology, the .NET Framework, and SQL Server. A local DICOM-RT archive and cloud-based services are deployed to facilitate remote peer reviews. To validate the PRS, we tested the system for 100 patients with the Philips Pinnacle v9.0 and Varian Eclipse v8.9 treatment planning systems (TPS). We transmitted the DICOM-RT data from the TPS to the cloud-based services via the PRS local DICOM-RT archive. Various CT simulation-based parameters, such as orientation of the CT and properties of RT structures, were compared between the TPS and the PRS. Data integrity of other parameters, such as patient demographics (patient name, ID, attending physician, etc.) and dose-volume-related parameters, was also evaluated. Such rigorous testing allowed us to optimize the functionalities and clinical implementation of the PRS. We believe that the PRS will improve the quality and safety of a broad spectrum of radiation therapy patients treated in underserved areas while discouraging the overutilization of expensive radiation treatment modalities.
This research and development project is supported by the James and Ester King Biomedical Research Program grant # RC1-09KW-09-26829. © 2012 American Association of Physicists in Medicine.
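The data-integrity checks described above amount to comparing selected DICOM-RT attributes between the TPS export and the copy held by the review system. The abstract does not detail how this was implemented; the sketch below is a generic field-by-field comparison with hypothetical attribute names and values.

```python
def verify_integrity(tps_record, prs_record, fields):
    """Compare selected DICOM-RT attributes between a TPS export and
    the peer-review system's copy; return the mismatched field names."""
    return [f for f in fields
            if tps_record.get(f) != prs_record.get(f)]

# Hypothetical attribute names and values, for illustration only.
tps = {"PatientName": "DOE^JANE", "PatientID": "12345",
       "PatientOrientation": "HFS", "NumberOfStructures": 14}
prs = {"PatientName": "DOE^JANE", "PatientID": "12345",
       "PatientOrientation": "HFS", "NumberOfStructures": 14}
checked = ["PatientName", "PatientID",
           "PatientOrientation", "NumberOfStructures"]
mismatches = verify_integrity(tps, prs, checked)  # [] when data agree
```

Running such a comparison over every transmitted study gives a simple pass/fail integrity report of the kind used to validate the 100 test patients.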
NASA Astrophysics Data System (ADS)
Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.
2014-12-01
The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate, and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry, and private-sector individuals. The major objectives of this effort include: 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions to improve capability and analysis that produce "on-the-fly" data products, extending these past the single location to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays, optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and usage; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and to provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency, cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the services available to the wider public.
An object-oriented approach to deploying highly configurable Web interfaces for the ATLAS experiment
NASA Astrophysics Data System (ADS)
Lange, Bruno; Maidantchik, Carmen; Pommes, Kathy; Pavani, Varlen; Arosa, Breno; Abreu, Igor
2015-12-01
The ATLAS Technical Coordination maintains 17 Web systems to support its operation. These applications, ranging from managing the process of publishing scientific papers to monitoring radiation levels in the equipment in the experimental cavern, are constantly subject to changes in requirements due to the collaborative nature of the experiment and its management. In this context, a Web framework is proposed to unify the generation of the supporting interfaces. FENCE assembles classes to build applications by making extensive use of JSON configuration files. It relies heavily on Glance, a technology set forth in 2003 to create an abstraction layer on top of the heterogeneous sources that store the technical coordination data. Once Glance maps out the database modeling, records can be referenced in the configuration files by wrapping unique identifiers in double enclosing brackets. The deployed content can be individually secured by attaching clearance attributes to its description, thus ensuring that view/edit privileges are granted to eligible users only. The framework also provides tools for securely writing into a database. Fully HTML5-compliant multi-step forms can be generated from their JSON description to assure that the submitted data comply with a series of constraints. Input validation is carried out primarily on the server side but, following progressive enhancement guidelines, verification might also be performed on the client side by enabling specific markup data attributes, which are then handed over to the jQuery validation plug-in. User monitoring is accomplished by thoroughly logging user requests along with any POST data. Documentation is built from the source code using the phpDocumentor tool and made readily available for developers online. FENCE therefore speeds up the implementation of Web interfaces and reduces the response time to requirement changes by minimizing maintenance overhead.
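The double-bracket referencing scheme can be sketched as a simple template substitution over the JSON configuration text. The placeholder syntax `[[identifier]]`, the record identifiers, and their values below are assumptions for illustration; the actual FENCE/Glance mapping is not specified in the abstract.

```python
import json, re

# Hypothetical Glance records keyed by unique identifier.
GLANCE_RECORDS = {
    "member.name": "ATLAS Member Name",
    "paper.title": "Draft Paper Title",
}

PLACEHOLDER = re.compile(r"\[\[([^\]]+)\]\]")

def resolve(config_text, records):
    """Replace [[identifier]] placeholders in a JSON configuration
    string with the corresponding database record value."""
    return PLACEHOLDER.sub(lambda m: records[m.group(1)], config_text)

config = '{"label": "[[member.name]]", "title": "[[paper.title]]"}'
resolved = json.loads(resolve(config, GLANCE_RECORDS))
```

Keeping the substitution step separate from the JSON parsing means the same configuration file can be re-resolved whenever the underlying records change, which fits the framework's goal of absorbing requirement changes through configuration rather than code.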
OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets
NASA Astrophysics Data System (ADS)
Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa
2017-04-01
The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have been quality-controlled: the shot gathers have been cross-checked and comprehensive errata have been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided, together with guidelines for setting up a computing environment and plotting the data. An open-access web service, "OpenFIRE", for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources, such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, including an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe) compliant discovery metadata have been produced, and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed, and the service can be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire.
Electromechanical acoustic liner
NASA Technical Reports Server (NTRS)
Sheplak, Mark (Inventor); Cattafesta, III, Louis N. (Inventor); Nishida, Toshikazu (Inventor); Horowitz, Stephen Brian (Inventor)
2007-01-01
A multi-resonator-based system responsive to acoustic waves includes at least two resonators, each including a bottom plate, side walls secured to the bottom plate, and a top plate disposed on top of the side walls. The top plate includes an orifice so that a portion of an incident acoustical wave compresses gas in the resonators. The bottom plate or the side walls include at least one compliant portion. A reciprocal electromechanical transducer coupled to the compliant portion of each of the resonators forms a first and second transducer/compliant composite. An electrical network is disposed between the reciprocal electromechanical transducer of the first and second resonator.
A Novel Model to Simulate Flexural Complements in Compliant Sensor Systems
Tang, Hongyan; Zhang, Dan; Guo, Sheng; Qu, Haibo
2018-01-01
The main challenge in analyzing compliant sensor systems is how to calculate the large deformation of flexural complements. Our study proposes a new model, called the spline pseudo-rigid-body model (spline PRBM). It combines dynamic splines and the pseudo-rigid-body model (PRBM) to simulate the flexural complements. The axial deformations of flexural complements are modeled using dynamic splines, which makes it possible to consider the nonlinear compliance of the system using four control points. Three rigid rods connected by two revolute (R) pins with two torsion springs replace the three lines connecting the four control points. The kinematic behavior of the system is described using Lagrange equations. Both optimization and numerical fitting methods are used to resolve the characteristic parameters of the new model. An example compliant mechanism is given to validate the accuracy of the model. The spline PRBM is important in expanding the applications of the PRBM to the design and simulation of flexural force sensors. PMID:29596377
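For context, the classical single-spring PRBM that the spline PRBM generalizes replaces a flexible segment with a rigid link pinned at a characteristic pivot carrying a torsional spring. As we recall it (standard PRBM values for an end-loaded cantilever; the spline PRBM's own parameters are determined by the optimization and fitting described above):

```latex
% A flexible beam of length l, modulus E, second moment of area I is
% replaced by a rigid link pinned at \gamma l with a torsional spring:
T = K\,\Theta, \qquad K = \gamma\, K_{\Theta}\, \frac{EI}{l},
\qquad \gamma \approx 0.85, \quad K_{\Theta} \approx 2.65
```

The spline PRBM replaces this single spring with two torsion springs at the revolute pins between three rigid rods, so the characteristic parameters play the role of $\gamma$ and $K_{\Theta}$ above.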
NASA Astrophysics Data System (ADS)
Kuhlmann, Arne; Herd, Daniel; Röβler, Benjamin; Gallmann, Eva; Jungbluth, Thomas
In pig production, software and electronic systems are widely used for process control and management. Unfortunately, most devices on farms are proprietary solutions working autonomously. To unify data communication of devices in agricultural husbandry, the international standard ISOagriNET (ISO 17532:2007) was developed. It defines data formats and exchange protocols to link up devices like climate controls, feeding systems, and sensors, but also management software. The aim of the research project "Information and Data Collection in Livestock Systems" is to develop an ISOagriNET-compliant IT system, a so-called Farming Cell. It integrates all electronic components to acquire the available data and information for pig fattening. In this way, additional benefits for humans, animals, and the environment regarding process control and documentation can be generated. Developing the Farming Cell is very complex; in particular, it is difficult and time-consuming to integrate hardware and software from various vendors into an ISOagriNET-compliant IT system. This ISOagriNET prototype shows, as a test environment, the potential of this new standard.
Wright, Alexander D; Laing, Andrew C
2012-10-01
Novel compliant flooring systems are a promising approach for reducing fall-related injuries in seniors, as they may provide up to 50% attenuation in peak force during simulated hip impacts while eliciting only minimal influences on balance. This study aimed to determine the protective capacity of novel compliant floors during simulated 'high severity' head impacts compared to common flooring systems. A headform was impacted onto a common Commercial-Carpet at 1.5, 2.5, and 3.5 m/s in front, back, and side orientations using a mechanical drop tower. Peak impact force applied to the headform (F(max)), peak linear acceleration of the headform (g(max)) and Head Injury Criterion (HIC) were determined. For the 3.5 m/s trials, backwards-oriented impacts were associated with the highest F(max) and HIC values (p<0.001); accordingly, this head orientation was used to complete additional trials on three common floors (Resilient Rubber, Residential-Loop Carpet, Berber Carpet) and six novel compliant floors at each impact velocity. ANOVAs indicated that flooring type was associated with all parameters at each impact velocity (p<0.001). Compared to impacts on the Commercial Carpet, Dunnett's post hoc indicated all variables were smaller (25-80%) for the novel compliant floors (p<0.001), but larger for Resilient Rubber (31-159%, p<0.01). This study demonstrates that during 'high severity' simulated impacts, novel compliant floors can substantially reduce the forces and accelerations applied to a headform compared to common floors including carpet and resilient rubber. In combination with reports of minimal balance impairments, these findings support the promise of novel compliant floors as a biomechanically effective strategy for reducing fall-related injuries including traumatic brain injuries and skull fractures. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.
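The Head Injury Criterion used above is a standard impact-severity measure: the maximum over all time windows (t1, t2) of (t2 - t1)·[average acceleration in g over the window]^2.5. A direct sketch of that computation from a sampled acceleration trace, with a synthetic half-sine pulse as input:

```python
import math

def hic(times, accels, max_window=0.015):
    """Head Injury Criterion from a resultant head acceleration trace.

    times in seconds, accels in g; windows limited to max_window
    seconds (0.015 gives the common HIC15 variant).
    """
    n = len(times)
    # Cumulative integral of a(t) by the trapezoidal rule.
    cum = [0.0]
    for i in range(1, n):
        cum.append(cum[-1] + 0.5 * (accels[i] + accels[i - 1])
                   * (times[i] - times[i - 1]))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dt = times[j] - times[i]
            if dt > max_window:
                break
            avg = (cum[j] - cum[i]) / dt   # mean acceleration in window
            best = max(best, dt * avg ** 2.5)
    return best

# Synthetic half-sine pulse: 100 g peak over 10 ms, sampled at 0.5 ms.
ts = [k * 0.0005 for k in range(21)]
gs = [100 * math.sin(math.pi * t / 0.01) for t in ts]
value = hic(ts, gs)
```

The optimal window typically covers only the central portion of the pulse, which is why the criterion is maximized over all sub-intervals rather than computed once over the whole impact.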
Visualizing NetCDF Files by Using the EverVIEW Data Viewer
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
BμG@Sbase—a microbial gene expression and comparative genomic database
Witney, Adam A.; Waldron, Denise E.; Brooks, Lucy A.; Tyler, Richard H.; Withers, Michael; Stoker, Neil G.; Wren, Brendan W.; Butcher, Philip D.; Hinds, Jason
2012-01-01
The reducing cost of high-throughput functional genomic technologies is creating a deluge of high volume, complex data, placing the burden on bioinformatics resources and tool development. The Bacterial Microarray Group at St George's (BμG@S) has been at the forefront of bacterial microarray design and analysis for over a decade and, while serving as a hub of a global network of microbial research groups, has developed BμG@Sbase, a microbial gene expression and comparative genomic database. BμG@Sbase (http://bugs.sgul.ac.uk/bugsbase/) is a web-browsable, expertly curated, MIAME-compliant database that stores comprehensive experimental annotation and multiple raw and analysed data formats. Consistent annotation is enabled through a structured set of web forms, which guide the user through the process following a set of best practices and controlled vocabulary. The database currently contains 86 expertly curated publicly available data sets (with a further 124 not yet published) and full annotation information for 59 bacterial microarray designs. The data can be browsed and queried using an explorer-like interface, integrating intuitive tree diagrams to present complex experimental details clearly and concisely. Furthermore, the modular design of the database will provide a robust platform for integrating other data types beyond microarrays into a more systems-analysis-based future. PMID:21948792
Integration of LDSE and LTVS logs with HIPAA compliant auditing system (HCAS)
NASA Astrophysics Data System (ADS)
Zhou, Zheng; Liu, Brent J.; Huang, H. K.; Guo, Bing; Documet, Jorge; King, Nelson
2006-03-01
The deadline of the HIPAA (Health Insurance Portability and Accountability Act) Security Rule passed in February 2005; therefore, being HIPAA compliant has become extremely critical to healthcare providers. HIPAA mandates healthcare providers to protect the privacy and integrity of health data and to be able to demonstrate examples of mechanisms that can be used to accomplish this task. It is also required that a healthcare institution be able to provide audit trails on image data access on demand for a specific patient. For these reasons, we have developed a HIPAA-compliant auditing system (HCAS) for image data security in a PACS by auditing every image data access. The HCAS was presented at SPIE 2005. This year, two new components, LDSE (Lossless Digital Signature Embedding) and LTVS (Patient Location Tracking and Verification System) logs, have been added to the HCAS. The LDSE can assure medical image integrity in a PACS, while the LTVS can provide access control for a PACS by creating a security zone in the clinical environment. By integrating the LDSE and LTVS logs with the HCAS, the privacy and integrity of image data can be audited as well. Thus, a PACS with the HCAS installed can become HIPAA compliant in image data privacy and integrity, access control, and audit control.
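An audit trail is only useful for compliance if later tampering is detectable. The abstract does not describe the HCAS internals, so the following is a generic tamper-evident sketch, not the published design: each entry stores a SHA-256 digest over its content plus the previous entry's digest, so modifying any earlier entry breaks the chain.

```python
import hashlib, json

class AuditTrail:
    """Append-only access log with hash chaining (illustrative sketch
    only; not the published HCAS implementation)."""

    def __init__(self):
        self.entries = []

    def record(self, user, patient_id, action):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"user": user, "patient": patient_id,
                "action": action, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        """Recompute the whole chain; False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("user", "patient", "action", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != digest:
                return False
            prev = e["hash"]
        return True

log = AuditTrail()
log.record("dr_smith", "PID-001", "view CT series")
log.record("dr_jones", "PID-001", "view MR series")
```

Producing the per-patient audit trail HIPAA requires is then a filter over `entries` by patient ID, with `verify()` attesting that the log itself is intact.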
Contreras-Manzano, Alejandra; Jáuregui, Alejandra; Velasco-Bernal, Anabel; Vargas-Meza, Jorge; Rivera, Juan A; Tolentino-Mayo, Lizbeth; Barquera, Simón
2018-06-07
Nutrient profiling systems (NPS) are used around the world. In some countries, the food industry participates in the design of these systems. We aimed to compare the ability of various NPS to identify processed and ultra-processed Mexican products containing excessive amounts of critical nutrients. A sample of 2544 foods and beverages available in the Mexican market were classified as compliant and non-compliant according to seven NPS: the Pan American Health Organization (PAHO) model, which served as our reference, the Nutrient Profiling Scoring Criterion (NPSC), the Mexican Committee of Nutrition Experts (MCNE), the Health Star Rating (HSR), the Mexican Nutritional Seal (MNS), the Chilean Warning Octagons (CWO) 2016, 2018 and 2019 criteria, and Ecuador's Multiple Traffic Light (MTL). Overall, the proportion of foods classified as compliant by the HSR, MTL and MCNE models was similar to the PAHO model. In contrast, the NPSC, the MNS and the CWO-2016 classified a higher amount of foods as compliant. Larger differences between NPS classification were observed across food categories. Results support the notion that models developed with the involvement of food manufacturers are more permissive than those based on scientific evidence. Results highlight the importance of thoroughly evaluating the underlying criteria of a model.
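At its core, each NPS reduces to a set of per-nutrient thresholds applied to a product's composition. The sketch below illustrates the mechanics with thresholds that reflect our reading of the PAHO model (critical nutrients as shares of energy, sodium per kcal); treat the exact cut-offs as illustrative, not an official implementation.

```python
def classify(energy_kcal, sodium_mg, free_sugars_kcal,
             total_fat_kcal, sat_fat_kcal):
    """Return the list of critical nutrients present in excessive
    amounts; an empty list means the product is compliant.
    Thresholds are illustrative approximations of the PAHO model."""
    flags = []
    if sodium_mg / energy_kcal >= 1.0:          # >= 1 mg sodium per kcal
        flags.append("sodium")
    if free_sugars_kcal / energy_kcal >= 0.10:  # >= 10% of energy
        flags.append("free sugars")
    if total_fat_kcal / energy_kcal >= 0.30:    # >= 30% of energy
        flags.append("total fat")
    if sat_fat_kcal / energy_kcal >= 0.10:      # >= 10% of energy
        flags.append("saturated fat")
    return flags

# A hypothetical sweetened beverage: nearly all energy from free sugars.
flags = classify(energy_kcal=44, sodium_mg=10,
                 free_sugars_kcal=42, total_fat_kcal=0, sat_fat_kcal=0)
```

Comparing systems then amounts to running the same product database through each system's threshold set and tabulating the compliant/non-compliant counts, which is how differences between industry-designed and evidence-based models become visible.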
Kaliki, Swathi; Patel, Anamika; Iram, Sadiya; Palkonda, Vijay Anand Reddy
2017-05-01
To describe the clinical features and outcomes of patients with stage III or IV retinoblastoma. This was a retrospective study of 80 patients. Based on the International Retinoblastoma Staging System (IRSS), the tumors (n = 81) belonged to stage IIIa (n = 38, 47%), IIIb (n = 1, 1%), IVa2 (n = 10, 12%), IVb1 (n = 14, 17%), and IVb3 (n = 18, 22%). Of 80 patients, 42 (53%) were compliant to treatment and 38 (47%) were non-compliant. All 38 patients who were non-compliant to treatment died of the disease at a mean duration of 13 months from diagnosis. Of the 42 patients compliant to treatment, 22 (52%) died before completion of treatment. Twenty patients with stage III disease (25%) could complete the multimodal treatment and 17 (71%) were alive and well at a median follow-up duration of 77 months. Compliant multimodality treatment is beneficial in patients with IRSS stage III disease. IRSS stage IV retinoblastoma has poor prognosis despite treatment. [J Pediatr Ophthalmol Strabismus. 2017;54(3):177-184.]. Copyright 2017, SLACK Incorporated.
Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.
Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J
2015-06-01
An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
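The actual term mapping used the NCBO BioPortal Annotator web service; as a self-contained sketch of the matching step, the following looks report words up in a small local lexicon by preferred term or synonym. The concept IDs and entries are hypothetical, not real RadLex records.

```python
# Hypothetical lexicon fragment: concept ID -> preferred term + synonyms.
LEXICON = {
    "RID0001": {"preferred": "embolization", "synonyms": ["embolisation"]},
    "RID0002": {"preferred": "angiography", "synonyms": ["angiogram"]},
}

def map_terms(report_words, lexicon):
    """Map report words to lexicon concept IDs by case-insensitive
    exact or synonym match; unmatched words map to None."""
    index = {}
    for cid, entry in lexicon.items():
        index[entry["preferred"].lower()] = cid
        for syn in entry["synonyms"]:
            index[syn.lower()] = cid
    return {w: index.get(w.lower()) for w in report_words}

mapping = map_terms(["Embolization", "angiogram", "catheter"], LEXICON)
```

Words that map to None are exactly the non-standardized terms a template author would need to revise, which is the annotation feedback the web application surfaces.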
NASA Astrophysics Data System (ADS)
Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen
Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols, and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independently of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services, and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VM), so that they can be run in their native environments and easily deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployment, and automatic VM generation.
7 CFR 1753.6 - Standards, specifications, and general requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...
7 CFR 1753.6 - Standards, specifications, and general requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...
7 CFR 1753.6 - Standards, specifications, and general requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...
7 CFR 1753.6 - Standards, specifications, and general requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...
Enhanced DIII-D Data Management Through a Relational Database
NASA Astrophysics Data System (ADS)
Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.
2000-10-01
A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Data in the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
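The cross-shot query pattern is the key benefit of the relational design. A minimal sketch with SQLite stands in for the actual server; the table layout, column names, and shot numbers are hypothetical, not the real DIII-D schema.

```python
import sqlite3

# Illustrative schema: one row of summary physics quantities per shot.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE shots (
    shot INTEGER PRIMARY KEY,
    date TEXT,
    ip_max REAL,       -- peak plasma current (MA), hypothetical
    betan_max REAL)    -- peak normalized beta, hypothetical
""")
con.executemany("INSERT INTO shots VALUES (?, ?, ?, ?)", [
    (100001, "2000-06-01", 1.2, 2.1),
    (100002, "2000-06-01", 1.5, 2.8),
    (100003, "2000-06-02", 0.9, 1.7),
])

# A cross-shot query of the kind the relational design enables:
# all shots whose peak normalized beta exceeded 2.0.
rows = con.execute(
    "SELECT shot FROM shots WHERE betan_max > 2.0 ORDER BY shot").fetchall()
high_beta = [r[0] for r in rows]
```

Because the interface is plain SQL, the same query works unchanged from C, Java, IDL, or any ODBC-compliant application such as Excel.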
NASA Astrophysics Data System (ADS)
Miller, C. J.; Gasson, D.; Fuentes, E.
2007-10-01
The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).
NCBI GEO: mining tens of millions of expression profiles--database and tools update.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron
2007-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/
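SOFT is a line-oriented text format in which `^TYPE = name` lines open an entity and `!key = value` lines attach attributes to it. The parser below handles only those two line types and is a simplified sketch; the sample record is invented, not real GEO content.

```python
def parse_soft(text):
    """Minimal parser for SOFT entity/attribute lines.

    '^TYPE = name' opens an entity; '!key = value' adds an attribute
    to the most recent entity. Data-table lines are ignored here.
    """
    entities = []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("^"):
            etype, _, name = line[1:].partition("=")
            entities.append({"type": etype.strip(),
                             "name": name.strip(), "attrs": {}})
        elif line.startswith("!") and entities:
            key, _, value = line[1:].partition("=")
            entities[-1]["attrs"][key.strip()] = value.strip()
    return entities

sample = """^SAMPLE = GSM0001
!Sample_title = liver, control replicate 1
!Sample_organism_ch1 = Mus musculus
"""
records = parse_soft(sample)
```

The same line-prefix convention scales from a single sample to a full series submission, which is what makes SOFT convenient for both spreadsheet-generated deposits and bulk downloads.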
XMM-Newton Remote Interface to Science Analysis Software: First Public Version
NASA Astrophysics Data System (ADS)
Ibarra, A.; Gabriel, C.
2011-07-01
We present the first public beta release of the XMM-Newton Remote Interface to Science Analysis (RISA) software, available through the official XMM-Newton web pages. In a nutshell, RISA is a web-based application that encapsulates the XMM-Newton data analysis software. The client identifies observations and creates XMM-Newton workflows; the server processes the client request, creates job templates and sends the jobs to a computer. RISA has been designed to serve non-expert and professional XMM-Newton users alike. Thanks to predefined threads, non-expert users can easily produce light curves and spectra, while expert users can use the full parameter interface to tune their own analysis. In both cases, the VO-compliant client/server design frees users from having to install any specific software to analyze XMM-Newton data.
Gmz: a Gml Compression Model for Webgis
NASA Astrophysics Data System (ADS)
Khandelwal, A.; Rajan, K. S.
2017-09-01
Geography Markup Language (GML) is an XML specification for expressing geographical features. Defined by the Open Geospatial Consortium (OGC), it is widely used for the storage and transmission of maps over the Internet. XML schemas provide the convenience of defining custom feature profiles in GML for specific needs, as seen in the widely popular CityGML, the simple features profile, coverages, etc. The Simple Features Profile (SFP) is a simpler subset of GML with support for point, line and polygon geometries, constructed to cover the most commonly used GML geometries. The Web Feature Service (WFS) serves query results in SFP by default, but SFP falls short of being an ideal choice due to its high verbosity and size-heavy nature, which leaves immense scope for compression. GMZ is a lossless compression model developed to work on SFP-compliant GML files. Our experiments indicate that GMZ achieves reasonably good compression ratios and can be useful in WebGIS-based applications.
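GMZ itself is a custom model whose internals the abstract does not detail. Purely to illustrate how much redundancy verbose SFP-style GML leaves for a lossless compressor, one can compress a synthetic GML fragment (invented here) with a generic codec such as zlib:

```python
# Compress a synthetic, highly repetitive SFP-style GML fragment with a
# generic lossless codec to show the redundancy GMZ can exploit.
import zlib

gml = "".join(
    f"<gml:Point><gml:pos>{x}.0 {x + 1}.0</gml:pos></gml:Point>"
    for x in range(1000)
).encode("utf-8")

packed = zlib.compress(gml, level=9)
ratio = len(gml) / len(packed)
assert zlib.decompress(packed) == gml   # lossless round trip
print(f"{len(gml)} -> {len(packed)} bytes (ratio {ratio:.1f}x)")
```

A generic codec already benefits heavily from GML's repetitive tag structure; a schema-aware model like GMZ is designed to do better than such a baseline.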
Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform
NASA Technical Reports Server (NTRS)
Roche, Rigoberto
2016-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The Glenn Research Center (GRC) team made a software-defined radio (SDR) platform STRS-compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort provides a framework for waveform development on an STRS-compliant platform to support future space communication systems for advanced exploration missions. Validated STRS-compliant applications provide tested code with extensive documentation to potentially reduce the risk, cost and effort of developing space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, the sample waveform, and the wrapper development efforts. The paper emphasizes the infusion of the STRS architecture onto the RIACS platform for potential use in next-generation SDRs for advanced exploration missions.
NASA Astrophysics Data System (ADS)
Mihajlovski, Andrej; Plieger, Maarten; Som de Cerff, Wim; Page, Christian
2016-04-01
The CLIPC project is developing a portal to provide a single point of access for scientific information on climate change. This is made possible through the Copernicus Earth Observation Programme for Europe, which will deliver a new generation of environmental measurements of climate quality. The data about the physical environment used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses (syntheses of all available observations constrained with numerical weather prediction systems). These data categories are managed by different communities; CLIPC will provide a single point of access to the whole range of data. The CLIPC portal will provide a number of indicators showing impacts on specific sectors, generated using a range of factors selected through structured expert consultation. It will also, as part of the transformation services, allow users to explore the consequences of using different combinations of driving factors which they consider to be of particular relevance to their work or life. The portal will provide information on the scientific quality and pitfalls of such transformations to prevent misleading usage of the results. The CLIPC project will develop an end-to-end processing chain (an indicator tool kit), from comprehensive information on the climate state through to highly aggregated decision-relevant products. Indicators of climate change and climate change impact will be provided, and a tool kit to update and post-process the collection of indicators will be integrated into the portal. The CLIPC portal has a distributed architecture, making use of OGC services provided by, e.g., climate4impact.eu and CEDA. CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses 2. 
A climate impact tool kit to evaluate, rank and aggregate indicators. Key to this is the availability of standardized metadata describing indicator data and services, which enables standardization and interoperability between the different distributed services of CLIPC. To disseminate CLIPC indicator data, transformed data products enabling impact assessments, and climate change impact indicators, a standardized metadata infrastructure is provided. The challenge is that compliance of existing metadata with INSPIRE ISO standards and GEMINI standards needs to be extended so that the web portal can be generated from the available metadata blueprint. The information provided in the headers of netCDF files available through multiple catalogues allows us to generate ISO-compliant metadata, which is in turn used to generate web-based interface content, as well as OGC-compliant web services such as WCS and WMS for the front end, and WPS interactions for scientific users to combine and generate new datasets. The goal of the metadata infrastructure is to provide a blueprint for creating a data-driven science portal generated from the underlying GIS data, web services and processing infrastructure. In the presentation we will present the results and lessons learned.
Universally Designed Text on the Web: Towards Readability Criteria Based on Anti-Patterns.
Eika, Evelyn
2016-01-01
The readability of web texts affects accessibility. The Web Content Accessibility Guidelines (WCAG) state that the recommended reading level should match that of someone who has completed basic schooling, but WCAG does not give advice on what constitutes an appropriate reading level. Web authors need tools that help them compose WCAG-compliant texts, and specific criteria are needed. Classic readability metrics are generally based on the lengths of words and sentences and have been criticized as over-simplistic. Automatic measurement and classification of texts' reading levels employing more advanced constructs remain an unresolved problem. If such measures were feasible, what should they be? This work examines three language constructs not captured by current readability indices but believed to significantly affect actual readability: relative clauses, garden-path sentences, and left-branching structures. The goal is to see whether quantifications of these stylistic features reflect readability and how they correspond to common readability measures. Manual assessments of a set of authentic web texts for such uses were conducted. The results reveal that texts related to narratives, such as children's stories, which are given the highest readability value, do not contain these constructs. The structures in question occur more frequently in expository texts that aim to educate or disseminate information, such as strategy and journal articles. The results suggest that language anti-patterns hold potential for establishing a set of deeper readability criteria.
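For context, the classic length-based indices the study critiques can be computed in a few lines. This sketch uses the Flesch-Kincaid grade-level formula with a crude vowel-group syllable approximation (the example sentences are invented, and the syllable counter is only a heuristic, not a real syllabifier):

```python
# Flesch-Kincaid grade level from word/sentence lengths only -- exactly
# the kind of surface metric that ignores relative clauses, garden-path
# sentences and left-branching structures.
import re

def syllables(word):
    # Approximate syllables as runs of vowels (always at least one).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syl / len(words) - 15.59

simple = "The cat sat. The dog ran. We all laughed."
dense = ("Automatic classification of readability employing advanced "
         "syntactic constructs remains an unresolved computational problem.")
print(fk_grade(simple), fk_grade(dense))
```

The two invented snippets differ sharply on the length-based score even before considering the deeper constructs the study probes.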
A new approach of active compliance control via fuzzy logic control for multifingered robot hand
NASA Astrophysics Data System (ADS)
Jamil, M. F. A.; Jalani, J.; Ahmad, A.
2016-07-01
Safety is a vital issue in human-robot interaction (HRI). In order to guarantee safety in HRI, model-reference impedance control can be a very useful approach for introducing compliant control. In particular, this paper establishes a fuzzy logic compliance control (i.e. active compliance control) to reduce impact and forces during physical interaction between humans/objects and robots. Exploiting a virtual mass-spring-damper system allows us to determine a desired compliance level by understanding the behavior of the model-reference impedance control. The performance of the fuzzy logic compliance control is tested in simulation for a robotic hand known as the RED Hand. The results show that fuzzy logic is a feasible control approach, particularly for controlling position and providing compliant behavior. In addition, fuzzy logic control allows us to simplify the controller design process (i.e. avoid complex computation) when dealing with nonlinearities and uncertainties.
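The virtual mass-spring-damper idea can be sketched numerically: the reference model M·x'' + B·x' + K·(x − x_d) = F_ext turns a contact force into a compliant position deviation. The parameters and force profile below are illustrative placeholders, not the paper's RED Hand values:

```python
# Model-reference impedance sketch: integrate the virtual dynamics with
# forward Euler; a constant contact force produces a deviation of F/K,
# which decays back to the desired position once the force is removed.

def impedance_step(x, v, x_d, f_ext, M=1.0, B=8.0, K=40.0, dt=0.001):
    a = (f_ext - B * v - K * (x - x_d)) / M   # virtual dynamics
    v += a * dt
    x += v * dt
    return x, v

x = v = 0.0
x_d = 0.0                                # desired position
x_push = 0.0
for step in range(5000):                 # 5 s of simulated time
    f = 2.0 if step < 2500 else 0.0      # 2 N contact force, then release
    x, v = impedance_step(x, v, x_d, f)
    if step == 2499:
        x_push = x                       # deviation under load ~ F/K
print(round(x_push, 3), round(x, 6))
```

Lowering K makes the hand yield more under the same contact force, which is the compliance level a fuzzy logic layer would adjust online.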
Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation
NASA Astrophysics Data System (ADS)
Lu, B.; Piasecki, M.
2008-12-01
This paper presents the development of an integrated hydrologic model that combines digital watershed processing, online data retrieval, hydrologic simulation and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step, the physics-based distributed hydrologic model PIHM (Penn State Integrated Hydrologic Model) is wrapped into the OpenMI (Open Modeling Interface and Environment) environment so as to interact seamlessly with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open-source GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial and boundary conditions. These are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a further module automatically displays output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are stored either in an observations database (OD) following the schema of the Observations Data Model (ODM), for time-series support, or in a grid-based storage facility such as a netCDF file or a grid-based database schema. Specific development steps include the creation of a bridge to overcome the interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model; this module is responsible for delineating the watershed and stream network from digital elevation models. 
Visualizing and editing geospatial data is achieved using MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a practical watershed, the performance of the model can be tested with the post-event analysis module.
NASA Astrophysics Data System (ADS)
Das, I.; Oberai, K.; Sarathi Roy, P.
2012-07-01
Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spread of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, since proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open-source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the back end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of landslides, and the resulting damage, closer to the affected people and user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
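A client retrieves such a WFS layer with a GetFeature request, which is just a parameterized URL. The endpoint and layer name below are hypothetical placeholders, not the project's actual service:

```python
# Build an OGC WFS GetFeature request URL (WFS 1.1.0 key-value encoding).
from urllib.parse import urlencode

def wfs_getfeature_url(base, typename, bbox=None, version="1.1.0"):
    params = {
        "service": "WFS",
        "version": version,
        "request": "GetFeature",
        "typeName": typename,
    }
    if bbox:  # (minx, miny, maxx, maxy) in the layer's CRS
        params["bbox"] = ",".join(map(str, bbox))
    return f"{base}?{urlencode(params)}"

url = wfs_getfeature_url(
    "http://example.org/cgi-bin/mapserv",   # hypothetical UMN MapServer endpoint
    "landslide:susceptibility",             # hypothetical layer name
    bbox=(78.0, 30.0, 78.5, 30.5),
)
print(url)
```

Any OGC-compliant client issues essentially this request; the response is a GML feature collection.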
A Security Architecture for Grid-enabling OGC Web Services
NASA Astrophysics Data System (ADS)
Angelini, Valerio; Petronzio, Luca
2010-05-01
In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security, the integration of OWS-compliant infrastructures and gLite Grids must address relevant challenges arising from their respective design principles: OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing, mostly web-based, security systems with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. 
Applying the separation-of-concerns principle, each of these tiers is responsible for controlling the access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This makes it possible to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS Security system, access restrictions are applied making use of the enhanced geospatial capabilities specified by OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks whether the user is entitled to access a Grid infrastructure. In that case his/her identity is translated into a temporary Grid security token using the Short Lived Credential Services (IGTF standard). In our case, for the specific gLite Grid infrastructure, some information (VOMS attributes) is plugged into the Grid security token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid, and also by the various gLite middleware elements to verify the user's grants. 
Building on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other web authentication services such as OpenID, Kerberos and WS-Federation.
PPI layouts: BioJS components for the display of Protein-Protein Interactions
Salazar, Gustavo A.; Meintjes, Ayton; Mulder, Nicola
2014-01-01
Summary: We present two web-based components for the display of Protein-Protein Interaction networks using different self-organizing layout methods: force-directed and circular. These components conform to the BioJS standard and can be rendered in an HTML5-compliant browser without the need for third-party plugins. We provide examples of interaction networks and how the components can be used to visualize them, and refer to a more complex tool that uses these components. Availability: http://github.com/biojs/biojs; http://dx.doi.org/10.5281/zenodo.7753 PMID:25075288
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph, A.; Seuntjens, J.; Parker, W.
We describe the development of automated, web-based, electronic health record (EHR) auditing software for use within our paperless radiation oncology clinic. By facilitating access to multiple databases within the clinic, each patient's EHR is audited prior to treatment, regularly during treatment, and post treatment. Anomalies such as missing documentation, non-compliant workflow and treatment parameters that differ significantly from the norm may be monitored, flagged and brought to the attention of clinicians. By determining historical trends using existing patient data and by comparing new patient data with the historical data, we expect our software to provide a measurable improvement in the quality of radiotherapy at our centre.
Galbusera, Fabio; Brayda-Bruno, Marco; Freutel, Maren; Seitz, Andreas; Steiner, Malte; Wehrle, Esther; Wilke, Hans-Joachim
2012-01-01
Previous surveys showed poor quality among web sites providing health information about low back pain. However, the rapid and continuous evolution of Internet content may call the current validity of those investigations into question. The present study aims to quantitatively assess the quality of the Internet information about low back pain retrieved with the most commonly employed search engines. An Internet search with the keywords "low back pain" was performed with Google, Yahoo!® and Bing™ in the English language. The top 30 hits obtained with each search engine were evaluated by five independent raters and averaged, following criteria derived from previous works. All search results were categorized by whether they declared compliance with a quality standard for health information (e.g. HONcode) and by web site type (Institutional, Free informative, Commercial, News, Social Network, Unknown). The quality of the hits retrieved by the three search engines was extremely similar. The web sites had a clear purpose and were easy to navigate, but mostly lacked validity and quality of the provided links. Conformity to a quality standard was correlated with markedly greater quality of the web sites in all respects. Institutional web sites had the best validity and ease of use. Free informative web sites had good quality but markedly lower validity compared to Institutional web sites. Commercial web sites provided more biased information. News web sites were well designed and easy to use, but lacked validity. The average quality of the hits retrieved by the most commonly employed search engines can be defined as satisfactory and compares favorably with previous investigations. User awareness of the need to check the quality of the information remains a concern.
Practical solutions to implementing "Born Semantic" data systems
NASA Astrophysics Data System (ADS)
Leadbetter, A.; Buck, J. J. H.; Stacey, P.
2015-12-01
The concept of data being "Born Semantic" has been proposed in recent years as a Semantic Web analogue to the idea of data being "born digital"[1], [2]. Within the "Born Semantic" concept, data are captured digitally and at a point close to the time of creation are annotated with markup terms from semantic web resources (controlled vocabularies, thesauri or ontologies). This allows heterogeneous data to be more easily ingested and amalgamated in near real-time due to the standards compliant annotation of the data. In taking the "Born Semantic" proposal from concept to operation, a number of difficulties have been encountered. For example, although there are recognised methods such as Header, Dictionary, Triples [3] for the compression, publication and dissemination of large volumes of triples these systems are not practical to deploy in the field on low-powered (both electrically and computationally) devices. Similarly, it is not practical for instruments to output fully formed semantically annotated data files if they are designed to be plugged into a modular system and the data to be centrally logged in the field as is the case on Argo floats and oceanographic gliders where internal bandwidth becomes an issue [2]. In light of these issues, this presentation will concentrate on pragmatic solutions being developed to the problem of generating Linked Data in near real-time systems. Specific examples from the European Commission SenseOCEAN project where Linked Data systems are being developed for autonomous underwater platforms, and from work being undertaken in the streaming of data from the Irish Galway Bay Cable Observatory initiative will be highlighted. Further, developments of a set of tools for the LogStash-ElasticSearch software ecosystem to allow the storing and retrieval of Linked Data will be introduced. References[1] A. Leadbetter & J. 
Fredericks, We have "born digital" - now what about "born semantic"?, European Geosciences Union General Assembly, 2014. [2] J. Buck & A. Leadbetter, Born semantic: linking data from sensors to users and balancing hardware limitations with data standards, European Geosciences Union General Assembly, 2015. [3] J. Fernandez et al., Binary RDF Representation for Publication and Exchange (HDT), Web Semantics 19:22-41, 2013.
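The "Born Semantic" idea of annotating a reading close to acquisition can be sketched as a JSON-LD wrapper that tags a raw value with controlled-vocabulary URIs. The vocabulary URLs below are illustrative NERC P01/P06-style identifiers, the SOSA property is assumed, and the reading itself is invented:

```python
# Sketch: tag a sensor value with vocabulary URIs at acquisition time so
# downstream systems can merge heterogeneous streams without per-
# instrument mapping tables.
import json

VOCAB = {  # illustrative controlled-vocabulary URIs, not a project mapping
    "sea_water_temperature": {
        "parameter": "http://vocab.nerc.ac.uk/collection/P01/current/TEMPPR01/",
        "units": "http://vocab.nerc.ac.uk/collection/P06/current/UPAA/",
    },
}

def born_semantic(observed_property, value, timestamp):
    uris = VOCAB[observed_property]
    return {
        "@context": {"value": "http://www.w3.org/ns/sosa/hasSimpleResult"},
        "@type": uris["parameter"],
        "unit": uris["units"],
        "value": value,
        "resultTime": timestamp,
    }

doc = born_semantic("sea_water_temperature", 11.7, "2015-06-01T12:00:00Z")
print(json.dumps(doc, indent=2))
```

Because the URIs travel with the value from the moment of capture, aggregation in near real-time needs no later re-annotation step, though on low-powered platforms even this much markup competes with bandwidth limits, as the abstract notes.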
Design and development of automatic sharia compliant wheelchair wheels cleaner
NASA Astrophysics Data System (ADS)
Shaari, Muhammad Farid; Rasli, Ibrahim Ismail Mohammad; Jamaludin, M. Z. Z. Wan; Isa, W. A. Mohamad; M., H.; Rashid, A. H. Abdul
2017-04-01
A sharia-compliant wheelchair wheel cleaner was developed in order to assist Muslim persons with disabilities (PWD) to pray in the mosque without leaving their wheelchairs because of the filthy wheels. Though there are many wheelchair wheel cleaning systems on the market, it is very rare to find a sharia-compliant cleaning system that applies the sertu concept, which is one of the cleaning and purification techniques in Islamic practice. The sertu concept is based on a 6:1 ratio, referring to six cleanings with pipe water and one cleaning with soiled water. The development process consisted of a design stage, a fabrication and system installation stage, and a testing stage. During the design stage, the proposed prototype underwent design brainstorming, operation programming and structural simulation analysis. Once fabricated, the cleaner prototype was tested. The results showed that the prototype can support loads of up to 100 kg with a shaft bending displacement of 1.31×10⁻⁶ mm. The water ejection timing varied by approximately 3% from the programmed timing.
Preliminary Assessment of a Compliant Gait Exoskeleton.
Cestari, Manuel; Sanz-Merodio, Daniel; Garcia, Elena
2017-06-01
Current commercial wearable gait exoskeletons contain joints with stiff actuators that cannot adapt to unpredictable environments. These actuators consume a significant amount of energy, and their stiffness may not be appropriate for safe human-machine interactions. Adjustable compliant actuators are being designed and implemented because of their ability to minimize large forces due to shocks, to safely interact with the user, and to store and release energy in passive elastic elements. Introduction of such compliant actuation in gait exoskeletons, however, has been limited by the larger power-to-weight and volume ratio requirement. This article presents a preliminary assessment of the first compliant exoskeleton for children. Compliant actuation systems developed by our research group were integrated into the ATLAS exoskeleton prototype. The resulting device is a compliant exoskeleton, the ATLAS-C prototype. The exoskeleton is coupled with a special standing frame to provide balance while allowing a semi-natural gait. Experiments show that when comparing the behavior of the joints under different stiffness conditions, the inherent compliance of the implemented actuators showed natural adaptability during the gait cycle and in regions of shock absorption. Torque tracking of the joint is achieved, identifying the areas of loading response. The implementation of a state machine in the control of knee motion allowed reutilization of the stored energy during deflection at the end of the support phase to partially propel the leg and achieve a more natural and free swing.
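The torque tracking and energy storage described above follow directly from the series-elastic principle: with a spring of stiffness k between motor and joint, torque is read from the deflection, τ = k·(θ_motor − θ_joint), and the deflection stores E = ½·k·d² that can be released into the swing. The stiffness and deflection values below are illustrative, not the ATLAS-C's actual parameters:

```python
# Series-elastic actuator sketch: torque sensing and energy storage via
# spring deflection (illustrative stiffness, not the ATLAS-C's).
import math

K = 300.0  # N*m/rad, assumed series stiffness

def spring_torque(theta_motor, theta_joint, k=K):
    return k * (theta_motor - theta_joint)

def stored_energy(theta_motor, theta_joint, k=K):
    d = theta_motor - theta_joint
    return 0.5 * k * d * d   # elastic energy available for swing propulsion

deflection = math.radians(4.0)   # assumed deflection at end of stance
tau = spring_torque(deflection, 0.0)
e = stored_energy(deflection, 0.0)
print(f"torque {tau:.1f} N*m, stored energy {e:.2f} J")
```

The state machine in the knee controller decides when this stored energy is released, which is how the prototype achieves a more natural, partially self-propelled swing.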
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims to provide unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model based on a common OWL-DL ontology; ii) links to related ontologies; and iii) data and algorithms available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API-compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. Datasets uploaded with chemical structures and an arbitrary set of properties become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service makes it easy to run predictions without installing any software, and to share datasets and models online. 
The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services can be used as a distributed framework for processing resource-intensive tasks and sharing data, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
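The "read data from a web address, perform processing, write to a web address" paradigm can be sketched abstractly. Here an in-memory dictionary stands in for the REST endpoints, and the URIs and the toy descriptor are invented; a real client would issue HTTP GET/POST against actual OpenTox resource addresses:

```python
# Sketch of the REST processing paradigm with pluggable readers/writers.
# STORE simulates addressable web resources for an offline demonstration.

STORE = {"/dataset/1": "c1ccccc1 benzene"}   # invented toy resource

def read(uri):
    return STORE[uri]

def write(uri, body):
    STORE[uri] = body

def process(in_uri, out_uri, transform):
    """Read from one address, transform, write to another address."""
    write(out_uri, transform(read(in_uri)))
    return out_uri

# Toy "descriptor calculation": count aromatic carbons in the SMILES token.
result = process("/dataset/1", "/dataset/1/feature",
                 lambda s: f"ring_atoms={s.split()[0].count('c')}")
print(STORE[result])
```

Swapping the in-memory `read`/`write` for HTTP calls is all that distinguishes this sketch from a networked client, which is the point of the paradigm: every resource is just an address.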
Security Implications of OPC, OLE, DCOM, and RPC in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2006-01-01
OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
Design, fabrication and control of soft robots.
Rus, Daniela; Tolley, Michael T
2015-05-28
Conventionally, engineers have employed rigid materials to fabricate precise, predictable robotic systems, which are easily modelled as rigid members connected at discrete joints. Natural systems, however, often match or exceed the performance of robotic systems with deformable bodies. Cephalopods, for example, achieve amazing feats of manipulation and locomotion without a skeleton; even vertebrates such as humans achieve dynamic gaits by storing elastic energy in their compliant bones and soft tissues. Inspired by nature, engineers have begun to explore the design and control of soft-bodied robots composed of compliant materials. This Review discusses recent developments in the emerging field of soft robotics.
Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform
NASA Technical Reports Server (NTRS)
Roche, Rigoberto J.; Shalkhauser, Mary Jo; Hickey, Joseph P.; Briones, Janette C.
2016-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The NASA Glenn Research Center (GRC) team made a software defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework for waveform development on an STRS-compliant platform to support future space communication systems for advanced exploration missions. The use of validated STRS-compliant applications provides tested code with extensive documentation to potentially reduce risk, cost and effort in the development of space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, and the test waveform and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation flight system SDRs for advanced exploration missions.
NASA Astrophysics Data System (ADS)
Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.
2005-12-01
Long term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back end using webservices. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing, and WSDL-compliant webservices for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward.
As system access requires only a standard web browser and the functionality is intuitive, stakeholders with diverse degrees of technical insight can use this system with little or no training.
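The back-end pattern described above, where the server delegates statistical analyses or model runs to external processing services, can be sketched as a simple dispatcher. The service names and data below are illustrative stand-ins; the INL system invokes real WSDL webservices from PHP rather than in-process Python callables.

```python
# Illustrative dispatcher: the back end reads archived sensor data, invokes a
# registered processing service, and returns the result for report generation.
from statistics import mean, stdev

SERVICES = {
    "summary": lambda xs: {"mean": mean(xs), "stdev": stdev(xs)},
    # e.g. count readings exceeding a regulatory limit (hypothetical service)
    "exceedance": lambda xs, limit=1.0: sum(1 for x in xs if x > limit),
}

def invoke(service: str, data, **params):
    """Dispatch a processing request to a registered service."""
    return SERVICES[service](data, **params)

readings = [0.8, 1.2, 0.9, 1.5]   # toy stand-in for archived sensor values
print(invoke("summary", readings))
print(invoke("exceedance", readings, limit=1.0))  # 2
```

Because each analysis sits behind a uniform `invoke` interface, adding a novel model is a matter of registering one more entry, which mirrors why extending the real system is described as straightforward.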
Electrophysiological assessment in patients with long term hypoxia.
Ilik, Faik; Pazarli, Ahmet C; Kayhan, Fatih; Karamanli, Harun; Ozlece, Hatice K
2016-01-01
To evaluate visual evoked potentials (VEP) patterns in chronic obstructive pulmonary disease (COPD) patients who were compliant with supplemental oxygen treatment relative to non-compliant COPD patients. This prospective study protocol was reviewed and approved by the local ethical committee of Selcuk University and the research was performed in the Department of Neurology, Elbistan State Hospital, Kahramanmaras, Turkey from May to October 2014. Blood gas measurements and pulmonary function tests were carried out in patients with advanced stage COPD. The VEP was assessed in both eyes in both compliant and non-compliant patients. The study included 43 patients; 24 (55.8%) of the patients were not in compliance with their supplemental oxygen treatment, while 19 patients (44.2%) received adequate oxygen treatment. There was no statistically significant difference between patients with regards to pulmonary function test results and blood gas measurements. The VEP latency was significantly greater in both eyes of the non-compliant patients. Previous studies have reported prolonged VEP latencies in inflammatory diseases of the central nervous system. Similar electrophysiological findings were observed in our study and we propose that this may be due to oxidative stress, and inflammation that occurs secondary to chronic ischemia.
NASA Astrophysics Data System (ADS)
Canfield, Shawn; Edinger, Ben; Frecker, Mary I.; Koopmann, Gary H.
1999-06-01
Recent advances in robotics, tele-robotics, smart material actuators, and mechatronics raise new possibilities for innovative developments in millimeter-scale robotics capable of manipulating objects only fractions of a millimeter in size. These advances can have a wide range of applications in the biomedical community. A potential application of this technology is in minimally invasive surgery (MIS). The focus of this paper is the development of a single degree of freedom prototype to demonstrate the viability of smart materials, force feedback and compliant mechanisms for minimally invasive surgery. The prototype is a compliant gripper that is 7-mm by 17-mm, made from a single piece of titanium that is designed to function as a needle driver for small scale suturing. A custom designed piezoelectric 'inchworm' actuator drives the gripper. The integrated system is computer controlled providing a user interface device capable of force feedback. The design methodology described draws from recent advances in three emerging fields in engineering: design of innovative tools for MIS, design of compliant mechanisms, and design of smart materials and actuators. The focus of this paper is on the design of a millimeter-scale inchworm actuator for use with a compliant end effector in MIS.
Software reuse example and challenges at NSIDC
NASA Astrophysics Data System (ADS)
Billingsley, B. W.; Brodzik, M.; Collins, J. A.
2009-12-01
NSIDC has created a new data discovery and access system, Searchlight, to provide users with the data they want in the format they want. NSIDC Searchlight supports discovery and access to disparate data types with on-the-fly reprojection, regridding and reformatting. Architected both to reuse open source systems and to be reused itself, Searchlight reuses GDAL and Proj4 for manipulating data and format conversions, the netCDF Java library for creating netCDF output, MapServer and OpenLayers for defining spatial criteria, and the JTS Topology Suite (JTS) in conjunction with Hibernate Spatial for database interaction and rich OGC-compliant spatial objects. The application reuses popular Java and JavaScript libraries including Struts 2, Spring, JPA (Hibernate), Sitemesh, JFreeChart, jQuery and Dojo, and a PostGIS PostgreSQL database. Future reuse of Searchlight components is supported at varying architecture levels, ranging from the database and model components to web services. We present the tools, libraries and programs that Searchlight has reused. We describe the architecture of Searchlight and explain the strategies deployed for reusing existing software and how Searchlight is built for reuse. We will discuss NSIDC's reuse of the Searchlight components to support rapid development of new data delivery systems.
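Searchlight delegates regridding to GDAL; the toy function below only illustrates the idea behind on-the-fly regridding, mapping scattered observations onto a regular lon/lat grid by nearest-neighbour lookup. All data and the `regrid` name are invented for the sketch.

```python
# Nearest-neighbour regridding sketch (pure Python, toy data only; the real
# system uses GDAL/Proj4 for reprojection and regridding).
def regrid(points, lons, lats):
    """points: list of (lon, lat, value); returns a value grid [lat][lon]
    where each cell takes the value of the nearest input point."""
    def nearest(lon, lat):
        return min(points, key=lambda p: (p[0] - lon) ** 2 + (p[1] - lat) ** 2)[2]
    return [[nearest(lon, lat) for lon in lons] for lat in lats]

obs = [(-105.0, 40.0, 1.0), (-100.0, 45.0, 2.0)]
grid = regrid(obs, lons=[-105.0, -100.0], lats=[40.0, 45.0])
print(grid)
```

A production regridder would also handle projections, cell averaging and missing data, which is precisely the complexity that reusing GDAL avoids reimplementing.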
Family of Advanced Beyond Line-of-Sight Terminals (FAB-T)
2015-12-01
Architecture (DoD IEA), excepting tactical and non-operational (OP) communications; 3) Compliant with GIG Technical Guidance (GTG), to include the Information Technology (IT) standards identified in the Standards section; and compliant with Selective Availability Anti-spoofing Module (SAASM), Spectrum, and Joint Tactical Radio System (JTRS) requirements.
Ankle rehabilitation device with two degrees of freedom and compliant joint
NASA Astrophysics Data System (ADS)
Racu (Cazacu), C.-M.; Doroftei, I.
2015-11-01
We propose a rehabilitation device that we intend to be low cost and easy to manufacture. The system will ensure functionality while keeping dimensions and mass small, considering the physiological dimensions of the foot and lower leg. To avoid injury to the ankle joint, the device is equipped with a compliant joint between the motor and the mechanical transmission. The torque of this joint is intended to be adjustable, according to the degree of ankle joint damage. To choose the material and the dimensions of this compliant joint, in this paper we perform a first stress simulation. The minimum torque is calculated, while the maximum torque is given by the preliminarily chosen actuator.
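The kind of sizing check such a stress simulation addresses can be done on the back of an envelope for a solid circular torsion element: the torque it can transmit is limited by the allowable shear stress via the polar moment of inertia. The diameter and stress limit below are illustrative numbers, not the paper's design values.

```python
from math import pi

def max_torque(d, tau_allow):
    """Largest torque (N*m) a solid circular section of diameter d (m) can
    carry without exceeding the allowable shear stress tau_allow (Pa):
    tau = T * r / J, with J = pi * d**4 / 32 for a solid round shaft."""
    J = pi * d ** 4 / 32          # polar moment of inertia
    return tau_allow * J / (d / 2)

d = 0.006                          # hypothetical 6 mm element
tau_allow = 40e6                   # hypothetical conservative shear limit (Pa)
print(round(max_torque(d, tau_allow), 2), "N*m")
```

Running the same formula over candidate materials and diameters is one way to bracket the adjustable torque range before committing to a finite-element simulation.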
Compliant cantilevered micromold
Morales, Alfredo Martin [Pleasanton, CA; Domeier, Linda A [Danville, CA; Gonzales, Marcela G [Seattle, WA; Keifer, Patrick N [Livermore, CA; Garino, Terry Joseph [Albuquerque, NM
2006-08-15
A compliant cantilevered three-dimensional micromold is provided. The compliant cantilevered micromold is suitable for use in the replication of cantilevered microparts and greatly simplifies the replication of such cantilevered parts. The compliant cantilevered micromold may be used to fabricate microparts using casting or electroforming techniques. When the compliant micromold is used to fabricate electroformed cantilevered parts, the micromold will also comprise an electrically conducting base formed by a porous metal substrate that is embedded within the compliant cantilevered micromold. Methods for fabricating the compliant cantilevered micromold as well as methods of replicating cantilevered microparts using the compliant cantilevered micromold are also provided.
Method for providing a compliant cantilevered micromold
Morales, Alfredo M.; Domeier, Linda A.; Gonzales, Marcela G.; Keifer, Patrick N.; Garino, Terry J.
2008-12-16
A compliant cantilevered three-dimensional micromold is provided. The compliant cantilevered micromold is suitable for use in the replication of cantilevered microparts and greatly simplifies the replication of such cantilevered parts. The compliant cantilevered micromold may be used to fabricate microparts using casting or electroforming techniques. When the compliant micromold is used to fabricate electroformed cantilevered parts, the micromold will also comprise an electrically conducting base formed by a porous metal substrate that is embedded within the compliant cantilevered micromold. Methods for fabricating the compliant cantilevered micromold as well as methods of replicating cantilevered microparts using the compliant cantilevered micromold are also provided.
Challenges in Visualizing Satellite Level 2 Atmospheric Data with GIS approach
NASA Astrophysics Data System (ADS)
Wei, J. C.; Yang, W.; Zhao, P.; Pham, L.; Meyer, D. J.
2017-12-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide data along with images, including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged in "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files: data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolation to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unravel a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data accessing capabilities, such as overlay and swipe of multiple variables and subset and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service. The infrastructure allows inclusion of outside data sources served in OGC compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.
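An OGC-compliant WMS backend like the one described is driven by standard request parameters, so any interoperable client can build a map request as a URL. The endpoint and layer name below are hypothetical; the parameter names follow the WMS 1.3.0 specification (note that EPSG:4326 in WMS 1.3.0 uses lat,lon axis order in BBOX).

```python
from urllib.parse import urlencode

# Hypothetical service endpoint standing in for the GES DISC WMS.
ENDPOINT = "https://disc.example.nasa.gov/wms"

def getmap_url(layer, bbox, width=512, height=512, fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL for one layer."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "", "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minlat,minlon,maxlat,maxlon
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return ENDPOINT + "?" + urlencode(params)

url = getmap_url("L2_Temperature_Example", (-90, -180, 90, 180))
print(url)
```

Because the request grammar is standardized, the same function would work against any compliant server, which is what lets ArcGIS and other clients connect to the L2 WMS without custom code.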
Challenges in Obtaining and Visualizing Satellite Level 2 Data in GIS
NASA Technical Reports Server (NTRS)
Wei, Jennifer C.; Yang, Wenli; Zhao, Peisheng; Pham, Long; Meyer, David J.
2017-01-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide data along with images, including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged in "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files: data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolation to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unravel a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data accessing capabilities, such as overlay and swipe of multiple variables and subset and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service. The infrastructure allows inclusion of outside data sources served in OGC compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.
Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.
2002-01-01
Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804
NASA Astrophysics Data System (ADS)
Whitefield, P. D.; Hagen, D. E.; Lobo, P.; Miake-Lye, R. C.
2015-12-01
The Society of Automotive Engineers (SAE) Aircraft Exhaust Emissions Measurement Committee (E-31) has published an Aerospace Information Report (AIR) 6241 detailing the sampling system for the measurement of non-volatile particulate matter (nvPM) from aircraft engines (SAE 2013). The system is designed to operate in parallel with existing International Civil Aviation Organization (ICAO) Annex 16 compliant combustion gas sampling systems used for emissions certification from aircraft engines, captured by conventional (Annex 16) gas sampling rakes (ICAO, 2008). The SAE E-31 committee is also working to ballot an Aerospace Recommended Practice (ARP) that will provide the methodology and system specification to measure nvPM from aircraft engines. The ARP is currently in preparation and is expected to be ready for ballot in 2015. A prototype AIR-compliant nvPM measurement system, the North American Reference System (NARS), has been built and evaluated at the MSTCOE under the joint sponsorship of the FAA, EPA and Transport Canada. It has been used to validate the performance characteristics of OEM AIR-compliant systems and is being used in engine certification type testing at OEM facilities to obtain data from a set of representative engines in the fleet. The data collected during these tests will be used by ICAO/CAEP/WG3/PMTG to develop a metric on which the regulation for nvPM emissions will be based. This paper will review the salient features of the NARS, including: (1) emissions sample transport from probe tip to the key diagnostic tools, (2) the mass- and number-based diagnostic tools for nvPM mass and number concentration measurement and (3) methods employed to assess the extent of nvPM loss throughout the sampling system. This paper will conclude with a discussion of the recent results from inter-comparison studies conducted with other US-based systems that gives credence to the ARP's readiness for ballot.
Compliant leg behaviour explains basic dynamics of walking and running
Geyer, Hartmut; Seyfarth, Andre; Blickhan, Reinhard
2006-01-01
The basic mechanics of human locomotion are associated with vaulting over stiff legs in walking and rebounding on compliant legs in running. However, while rebounding legs well explain the stance dynamics of running, stiff legs cannot reproduce that of walking. With a simple bipedal spring–mass model, we show that not stiff but compliant legs are essential to obtain the basic walking mechanics; incorporating the double support as an essential part of the walking motion, the model reproduces the characteristic stance dynamics that result in the observed small vertical oscillation of the body and the observed out-of-phase changes in forward kinetic and gravitational potential energies. Exploring the parameter space of this model, we further show that it not only combines the basic dynamics of walking and running in one mechanical system, but also reveals these gaits to be just two out of the many solutions to legged locomotion offered by compliant leg behaviour and accessed by energy or speed. PMID:17015312
Norm compliance and self-reported health among Swedish adolescents.
Nygren, Karina; Janlert, Urban; Nygren, Lennart
2011-02-01
This study examines the relationship between norm compliance and self-reported health in adolescents, and how this differs between genders. Our specific aim was to investigate if extremely high norm compliance revealed any particular health patterns. This empirical study used a web-based survey from 2005, which was distributed to all students (n = 5,066) in years 7-9 of compulsory school within six municipalities in northern Sweden. The respondents answered questions about their general health as well as specific health problems such as headaches, stomach ache, sleeping difficulties and stress. Compliance was measured according to different norm-related behaviour, such as truancy, crime and use of tobacco, alcohol and narcotics. The majority of respondents reported good health and norm-compliant behaviour. Girls reported more health problems than boys, a difference that increased with age. Those who were more norm compliant reported better health, fewer somatic complaints and less stress, which goes against our initial hypothesis that extremely high norm compliance and self-reported ill-health are related. There seemed to be a stronger relationship between self-reported health and norm compliance for girls than boys, in absolute terms. The results clearly show a relationship between norm compliance and health, and suggest inequalities between genders.
Barckhausen, Christina; Rice, Brent; Baila, Stefano; Sensebé, Luc; Schrezenmeier, Hubert; Nold, Philipp; Hackstein, Holger; Rojewski, Markus Thomas
2016-01-01
This chapter describes a method for GMP-compliant expansion of human mesenchymal stromal/stem cells (hMSC) from bone marrow aspirates, using the Quantum(®) Cell Expansion System from Terumo BCT. The Quantum system is a functionally closed, automated hollow fiber bioreactor system designed to reproducibly grow cells in either GMP or research laboratory environments. The chapter includes protocols for preparation of media, setup of the Quantum system, coating of the hollow fiber bioreactor, as well as loading, feeding, and harvesting of cells. We suggest a panel of quality controls for the starting material, the interim product, as well as the final product.
NASA Astrophysics Data System (ADS)
Piorkowski, Dakota; Blackledge, Todd A.
2017-08-01
The origin of viscid capture silk in orb webs, from cribellate silk-spinning ancestors, is a key innovation correlated with significant diversification of web-building spiders. Ancestral cribellate silk consists of dry nanofibrils surrounding a stiff, axial fiber that adheres to prey through van der Waals interactions, capillary forces, and physical entanglement. In contrast, viscid silk uses chemically adhesive aqueous glue coated onto a highly compliant and extensible flagelliform core silk. The extensibility of the flagelliform fiber accounts for half of the total work of adhesion for viscid silk and is enabled by water in the aqueous coating. Recent cDNA libraries revealed the expression of flagelliform silk proteins in cribellate orb-weaving spiders. We hypothesized that the presence of flagelliform proteins in cribellate silk could have allowed for a gradual shift in the mechanical performance of cribellate axial silk, whose effect was masked by the dry nature of its adhesive. We measured supercontraction and mechanical performance of cribellate axial silk, in wet and dry states, for two species of cribellate orb web-weaving spiders to see if water enabled flagelliform-silk-like performance. We found that the compliance and extensibility of wet cribellate silk increased compared to the dry state, as expected. However, when compared to other silk types, the response to water was more similar to that of other web silks, like major and minor ampullate silk, than to viscid silk. These findings support the punctuated evolution of viscid silk mechanical performance.
Reward-Modulated Hebbian Plasticity as Leverage for Partially Embodied Control in Compliant Robotics
Burms, Jeroen; Caluwaerts, Ken; Dambre, Joni
2015-01-01
In embodied computation (or morphological computation), part of the complexity of motor control is offloaded to the body dynamics. We demonstrate that a simple Hebbian-like learning rule can be used to train systems with (partial) embodiment, and can be extended outside of the scope of traditional neural networks. To this end, we apply the learning rule to optimize the connection weights of recurrent neural networks with different topologies and for various tasks. We then apply this learning rule to a simulated compliant tensegrity robot by optimizing static feedback controllers that directly exploit the dynamics of the robot body. This leads to partially embodied controllers, i.e., hybrid controllers that naturally integrate the computations that are performed by the robot body into a neural network architecture. Our results demonstrate the universal applicability of reward-modulated Hebbian learning. Furthermore, they demonstrate the robustness of systems trained with the learning rule. This study strengthens our belief that compliant robots can, and perhaps should, be seen as computational units rather than dumb hardware that needs a complex controller. This link between compliant robotics and neural networks is also the main reason for our search for simple, universal learning rules for both neural networks and robotics. PMID:26347645
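The core of a reward-modulated Hebbian rule can be shown on a single linear unit: the output explores with noise, and each weight change correlates the presynaptic activity and the output fluctuation with the reward's deviation from a running baseline. This is a generic textbook-style sketch of the rule family, not the paper's tensegrity setup; the task, rates and target are invented for illustration.

```python
import random

random.seed(0)
w = [0.0, 0.0]                        # connection weights to be learned
eta, baseline = 0.05, 0.0
x, target = [1.0, 0.5], 2.0           # fixed input pattern and goal output

for _ in range(5000):
    xi = random.gauss(0.0, 0.1)       # exploratory output perturbation
    z = sum(wi * xj for wi, xj in zip(w, x)) + xi
    r = -(z - target) ** 2            # reward: negative squared error
    # Hebbian update gated by reward relative to its running baseline
    for i in range(len(w)):
        w[i] += eta * (r - baseline) * xi * x[i]
    baseline += 0.05 * (r - baseline)

out = sum(wi * xj for wi, xj in zip(w, x))
print(round(out, 2))  # close to the target of 2.0
```

The update uses only locally available signals (input, output fluctuation, scalar reward), which is what makes the rule attractive for controllers whose "computation" is partly performed by a physical body.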
Small PACS implementation using publicly available software
NASA Astrophysics Data System (ADS)
Passadore, Diego J.; Isoardi, Roberto A.; Gonzalez Nicolini, Federico J.; Ariza, P. P.; Novas, C. V.; Omati, S. A.
1998-07-01
Building cost effective PACS solutions is a main concern in developing countries. Hardware and software components are generally much more expensive than in developed countries, and tighter financial constraints further contribute to the slow rate of PACS implementation. The extensive use of the Internet for sharing resources and information has brought a broad range of freely available software packages to an ever-increasing number of users. In the field of medical imaging it is possible to find image format conversion packages, DICOM compliant servers for all kinds of service classes, databases, web servers, image visualization, manipulation and analysis tools, etc. This paper describes a PACS implementation for review and storage built on freely available software. It currently integrates four diagnostic modalities (PET, CT, MR and NM), a Radiotherapy Treatment Planning workstation and several computers in a local area network, for image storage, database management and image review, processing and analysis. It also includes a web-based application that allows remote users to query the archive for studies from any workstation and to view the corresponding images and reports. We conclude that the advantage of using this approach is twofold. It allows a full understanding of all the issues involved in the implementation of a PACS and also contributes to keeping costs down while enabling the development of a functional system for storage, distribution and review that can prove to be helpful for radiologists and referring physicians.
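The query side of such an archive can be sketched with a small relational study index like the one a web front-end would search by patient or modality. The table layout, names and UIDs below are toy stand-ins, not the DICOM information model or the paper's actual schema.

```python
import sqlite3

# In-memory stand-in for the archive's study database.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE study (
    uid TEXT PRIMARY KEY, patient TEXT, modality TEXT, date TEXT)""")
db.executemany("INSERT INTO study VALUES (?, ?, ?, ?)", [
    ("1.2.3.1", "DOE^JOHN", "CT", "1998-05-01"),
    ("1.2.3.2", "DOE^JOHN", "MR", "1998-05-07"),
    ("1.2.3.3", "ROE^JANE", "PET", "1998-05-09"),
])

def find_studies(patient=None, modality=None):
    """Return study UIDs matching the given optional criteria."""
    clauses, args = [], []
    if patient:
        clauses.append("patient = ?"); args.append(patient)
    if modality:
        clauses.append("modality = ?"); args.append(modality)
    where = (" WHERE " + " AND ".join(clauses)) if clauses else ""
    return db.execute("SELECT uid FROM study" + where, args).fetchall()

print(find_studies(patient="DOE^JOHN"))   # both studies for this patient
print(find_studies(modality="PET"))
```

A real implementation would map these criteria onto DICOM query keys, but the pattern of optional, combinable search fields is the same one the web application exposes.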
Umeh, Gregory C; Nomhwange, Terna Ignatius; Shamang, Anthony F; Zakari, Furera; Musa, Audu I; Dogo, Paul M; Gugong, Victor; Iliyasu, Neyu
2018-02-08
Attitude and subjective well-being are important factors in mothers accepting or rejecting Oral Polio Vaccine (OPV) supplemental immunization. The purpose of the study was to determine the role of mothers' attitude and subjective well-being in non-compliance with OPV supplemental immunization in Northern Nigeria. The study utilized a cross-sectional design to assess attitude and subjective well-being of mothers using previously validated VACSATC (Vaccine Safety, Attitudes, Training and Communication; 10 items) and SUBI (Subjective Well-being Inventory; 40 items) measures. A total of 396 participants (equal numbers of non-compliant and compliant mothers) from 94 non-compliant settlements were interviewed, after informed consent. A t-test was run to assess differences in mean scores between the non-compliant and compliant mothers on the VACSATC and SUBI measures. The research showed a significant difference in mean scores between the non-compliant and compliant groups on the VACSATC measure of mothers' attitude (M = 18.9 non-compliant, compared to 26.5 compliant; p < 0.05). On subjective well-being, the study showed no significant difference in the mean scores on the SUBI measure (M = 77.4 non-compliant, compared to 78.0 compliant; p > 0.05). The research has shown that negative attitude is more commonly present in non-compliant mothers and may be a factor in vaccine refusal in Northern Nigeria.
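The two-sample comparison reported above (e.g. mean attitude scores of 18.9 versus 26.5) is the kind of test computable with Welch's t statistic. The score lists below are made-up toy data, not the study's records; the sketch only shows the statistic's construction.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Hypothetical small samples standing in for the two groups' scores.
non_compliant = [17, 19, 20, 18, 21]
compliant = [26, 27, 25, 28, 26]
print(round(welch_t(non_compliant, compliant), 2))
```

A strongly negative t here reflects the lower mean attitude score in the non-compliant group; a full analysis would also compute the Welch-Satterthwaite degrees of freedom and a p-value.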
Web-Based Specialist Support for Spinal Cord Injury Person's Care: Lessons Learned.
Della Mea, Vincenzo; Marin, Dario; Rosin, Claudio; Zampa, Agostino
2012-01-01
Persons with disability from spinal cord injury (SCI) are subject to high risk of pathological events and need regular follow-up even after discharge from the rehabilitation hospital. To help in follow-up, we developed a web portal for providing online specialist as well as GP support to SCI persons. After a feasibility study with 13 subjects, the portal was introduced into the regional healthcare network in order to make it compliant with current legal regulations on data protection, including smartcard authentication. Although a number of training courses were held to introduce SCI persons to portal use (up to 50 users), the number of accesses remained very low. Reasons for this were investigated by means of a questionnaire administered to the initial feasibility study subjects; they included the continued ease of the telephone compared with our web-based, smartcard-authenticated portal, in particular because online communications are still perceived as an unusual way of interacting with the doctor. To summarize, the overall project was appreciated by the users, but when it is time to ask the specialist for help, it is still much easier to make a phone call.
NASA Astrophysics Data System (ADS)
Lehmann, Thomas M.; Guld, Mark O.; Thies, Christian; Fischer, Benedikt; Keysers, Daniel; Kohnen, Michael; Schubert, Henning; Wein, Berthold B.
2003-05-01
Picture archiving and communication systems (PACS) aim to efficiently provide the radiologists with all images in a suitable quality for diagnosis. Modern standards for digital imaging and communication in medicine (DICOM) comprise alphanumerical descriptions of study, patient, and technical parameters. Currently, this is the only information used to select relevant images within PACS. Since textual descriptions insufficiently describe the great variety of details in medical images, content-based image retrieval (CBIR) is expected to have a strong impact when integrated into PACS. However, existing CBIR approaches usually are limited to a distinct modality, organ, or diagnostic study. In this state-of-the-art report, we present first results implementing a general approach to content-based image retrieval in medical applications (IRMA) and discuss its integration into PACS environments. Usually, a PACS consists of a DICOM image server and several DICOM-compliant workstations, which are used by radiologists for reading the images and reporting the findings. Basic IRMA components are the relational database, the scheduler, and the web server, which all may be installed on the DICOM image server, and the IRMA daemons running on distributed machines, e.g., the radiologists' workstations. These workstations can also host the web-based front-ends of IRMA applications. Integrating CBIR and PACS, a special focus is put on (a) location and access transparency for data, methods, and experiments, (b) replication transparency for methods in development, (c) concurrency transparency for job processing and feature extraction, (d) system transparency at method implementation time, and (e) job distribution transparency when issuing a query. Transparent integration will have a certain impact on diagnostic quality supporting both evidence-based medicine and case-based reasoning.
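As an illustration of the retrieval step at the heart of a CBIR system such as the one described, images are ranked by distance between precomputed feature vectors; the image IDs and feature values below are invented for the sketch, not IRMA's actual features.

```python
import math

def euclidean(u, v):
    """Distance between two global feature vectors of equal length."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def retrieve(query, index, k=3):
    """Rank indexed images by feature distance to the query; return the top-k IDs."""
    ranked = sorted(index.items(), key=lambda item: euclidean(query, item[1]))
    return [image_id for image_id, _ in ranked[:k]]

# toy feature index: image ID -> precomputed global feature vector
index = {
    "chest_pa_001": [0.10, 0.80, 0.30],
    "hand_ap_002":  [0.90, 0.20, 0.70],
    "chest_pa_003": [0.12, 0.78, 0.33],
}
top2 = retrieve([0.11, 0.79, 0.31], index, k=2)
```

In a PACS integration, the index would be filled by the distributed feature-extraction daemons and the query issued from the web-based front-end.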
The European Drought Observatory (EDO): Current State and Future Directions
NASA Astrophysics Data System (ADS)
Vogt, Jürgen; Sepulcre, Guadalupe; Magni, Diego; Valentini, Luana; Singleton, Andrew; Micale, Fabio; Barbosa, Paulo
2013-04-01
Europe has repeatedly been affected by droughts, resulting in considerable ecological and economic damage, and climate change studies indicate a trend towards increasing climate variability, most likely resulting in more frequent drought occurrences in Europe as well. Against this background, the European Commission's Joint Research Centre (JRC) is developing methods and tools for assessing, monitoring and forecasting droughts in Europe, as well as a European Drought Observatory (EDO) to complement and integrate national activities with a European view. At the core of EDO is a portal, including a map server, a metadata catalogue, a media monitor and analysis tools. The map server presents Europe-wide, up-to-date information on the occurrence and severity of droughts, complemented by more detailed information provided by regional, national and local observatories through OGC-compliant web mapping and web coverage services. In addition, time series of historical maps as well as graphs of the temporal evolution of drought indices for individual grid cells and administrative regions in Europe can be retrieved and analysed. Current work focuses on validating the available products, developing combined indicators, improving the functionalities, extending the linkage to additional national and regional drought information systems and testing options for medium-range probabilistic drought forecasting across Europe. Longer-term goals include the development of long-range drought forecasting products, the analysis of drought hazard and risk, the monitoring of drought impact and the integration of EDO into a global drought information system. The talk will provide an overview of the development and state of EDO, the different products, and the ways a wide range of stakeholders (i.e., European, national river basin, and local authorities) are included in the development of the system, as well as an outlook on future developments.
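A client retrieves a drought map from such an OGC-compliant map server via a WMS GetMap request; the endpoint and layer name below are placeholders, while the parameter names follow the WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

# hypothetical map-server endpoint (illustrative, not the real EDO URL)
BASE = "https://edo.example.org/wms"

def getmap_url(layer, bbox, width=600, height=400):
    """Build an OGC WMS 1.3.0 GetMap request for a drought-indicator layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",                        # WMS 1.3.0 uses CRS, not SRS
        "BBOX": ",".join(str(c) for c in bbox),    # min/max coordinates
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return BASE + "?" + urlencode(params)

# a hypothetical 3-month Standardized Precipitation Index layer over Europe
url = getmap_url("spi3", (34.0, -11.0, 71.0, 32.0))
```

Fetching the URL would return a rendered PNG map; a Web Coverage Service request would look similar but return the underlying gridded data.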
Handedness in shearing auxetics creates rigid and compliant structures
NASA Astrophysics Data System (ADS)
Lipton, Jeffrey Ian; MacCurdy, Robert; Manchester, Zachary; Chin, Lillian; Cellucci, Daniel; Rus, Daniela
2018-05-01
In nature, repeated base units produce handed structures that selectively bond to make rigid or compliant materials. Auxetic tilings are scale-independent frameworks made from repeated unit cells that expand under tension. We discovered how to produce handedness in auxetic unit cells that shear as they expand by changing the symmetries and alignments of auxetic tilings. Using the symmetry and alignment rules that we developed, we made handed shearing auxetics that tile planes, cylinders, and spheres. By compositing the handed shearing auxetics in a manner inspired by keratin and collagen, we produce both compliant structures that expand while twisting and deployable structures that can rigidly lock. This work opens up new possibilities in designing chemical frameworks, medical devices like stents, robotic systems, and deployable engineering structures.
Global Climate Change for Kids: Making Difficult Ideas Accessible and Exciting
NASA Astrophysics Data System (ADS)
Fisher, D. K.; Leon, N.; Greene, M. P.
2009-12-01
NASA has recently launched its Global Climate Change web site (http://climate.nasa.gov), and it has been very well received. It has now also launched in preliminary form an associated site for children and educators, with a plan for completion in the near future. The goals of the NASA Global Climate Change Education site are: To increase awareness and understanding of climate change science in upper-elementary and middle-school students, reinforcing and building upon basic concepts introduced in the formal science education curriculum for these grades; To present, insofar as possible, a holistic picture of climate change science and current evidence of climate change, describing Earth as a system of interconnected processes; To be entertaining and motivating; To be clear and easy to understand; To be easy to navigate; To address multiple learning styles; To describe and promote "green" careers; To increase awareness of NASA's contributions to climate change science; To provide valuable resources for educators; To be compliant with Section 508 of the Rehabilitation Act. The site incorporates research findings not only on climate change, but also on effective web design for children. It is envisioned that most of the content of the site will ultimately be presented in multimedia forms. These will include illustrated and narrated "slide shows," animated expositions, interactive concept-rich games and demonstrations, videos, animated fictionalized stories, and printable picture galleries. In recognition of the attention span of the audience, content is presented in short, modular form, with a suggested, but not mandatory order of access. Empathetic animal and human cartoon personalities are used to explain concepts and tell stories. Expository, fiction, game, video, text, and image modules are interlinked for reinforcement of similar ideas.
NASA's Global Climate Change Education web site addresses the vital need to impart and emphasize Earth system science concepts at or near the beginning of the education pipeline.
Radiology on handheld devices: image display, manipulation, and PACS integration issues.
Raman, Bhargav; Raman, Raghav; Raman, Lalithakala; Beaulieu, Christopher F
2004-01-01
Handheld personal digital assistants (PDAs) have undergone continuous and substantial improvements in hardware and graphics capabilities, making them a compelling platform for novel developments in teleradiology. The latest PDAs have processor speeds of up to 400 MHz and storage capacities of up to 80 Gbytes with memory expansion methods. A Digital Imaging and Communications in Medicine (DICOM)-compliant, vendor-independent handheld image access system was developed in which a PDA server acts as the gateway between a picture archiving and communication system (PACS) and PDAs. The system is compatible with most currently available PDA models. It is capable of both wired and wireless transfer of images and includes custom PDA software and World Wide Web interfaces that implement a variety of basic image manipulation functions. Implementation of this system, which is currently undergoing debugging and beta testing, required optimization of the user interface to efficiently display images on smaller PDA screens. The PDA server manages user work lists and implements compression and security features to accelerate transfer speeds, protect patient information, and regulate access. Although some limitations remain, PDA-based teleradiology has the potential to increase the efficiency of the radiologic work flow, increasing productivity and improving communication with referring physicians and patients. Copyright RSNA, 2004
Simulation study of the ROMPS robot control system
NASA Technical Reports Server (NTRS)
Nguyen, Charles C.; Liu, Hui-I.
1994-01-01
This is a report presenting the progress of a research grant funded by NASA for work performed from June 1, 1993 to August 1, 1993. The report deals with the Robot Operated Material Processing System (ROMPS). It presents results of a computer simulation study conducted to investigate the performance of the control systems controlling the azimuth, elevation, and radial axes of the ROMPS and its gripper. Four study cases are conducted. The first case investigates the control of free motion along the three axes. In the second case, the compliant motion in the elevation axis with the wrist compliant device is studied in terms of position accuracy and impact forces. The third case focuses on the behavior of the control system in controlling the robot motion along the radial axis when pulling the pallet out of the rack. In the fourth case, the compliant motion of the gripper grasping a solid object under the effect of the gripper's passive compliance is studied in terms of position accuracy and contact forces. For each of the above cases, a set of PID gains is selected to optimize the controller performance, and computer simulation results are presented and discussed.
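A controller study of this kind boils down to tuning loop gains against a plant model; a minimal sketch with a discrete PID loop driving a first-order axis model (the gains and plant parameters are illustrative, not the ROMPS values):

```python
def pid_step(state, error, dt, kp, ki, kd):
    """One discrete PID update; state carries the integral and previous error."""
    integral, prev_err = state
    integral += error * dt
    derivative = (error - prev_err) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

def simulate(setpoint=1.0, steps=1000, dt=0.01, tau=0.05,
             kp=2.0, ki=0.5, kd=0.0):
    """Drive a first-order axis model v' = (u - v) / tau toward the setpoint."""
    v, state = 0.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(state, setpoint - v, dt, kp, ki, kd)
        v += dt * (u - v) / tau
    return v

final = simulate()  # settles near the setpoint for these gains
```

Varying kp, ki and kd while observing overshoot, settling time and steady-state error is the essence of the gain-selection exercise described in each study case.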
The deegree framework - Spatial Data Infrastructure solution for end-users and developers
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Poth, Andreas
2010-05-01
The open source software framework deegree is a comprehensive implementation of standards defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDIs) and to adhere to standards as strictly as possible. Although it is open source software (GNU Lesser General Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees, with customization, consulting and tailoring offered by specialized companies. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree ships as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany, the Ministry of Housing, Spatial Planning and the Environment in the Netherlands, etc.) as well as for research and development projects as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service and Catalogue Service, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers.
The added value comes from a sophisticated set of client software, desktop and web environments. One focus lies on different client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions comprising multiple standards and enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard, as defined by the German spatial planning authorities, is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure should be usable by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. The challenges faced in customer projects underline the importance of easy-to-use software components: SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.
NASA Astrophysics Data System (ADS)
Berthier, J.; Carry, B.; Vachier, F.; Eggl, S.; Santerne, A.
2016-05-01
All the fields of the extended space mission Kepler/K2 are located within the ecliptic. Many Solar system objects thus cross the K2 stellar masks on a regular basis. We aim at providing the entire community with a simple tool to search for and identify Solar system objects serendipitously observed by Kepler. The Sky Body Tracker (SkyBoT) service hosted at the Institut de mécanique céleste et de calcul des éphémérides provides a Virtual Observatory compliant cone search that lists all Solar system objects present within a field of view at a given epoch. To generate such a list in a timely manner, ephemerides are pre-computed, updated weekly, and stored in a relational database to ensure fast access. The SkyBoT web service can now be used with Kepler. Solar system objects within a small (few arcminutes) field of view are identified and listed in less than 10 s. Generating object data for the entire K2 field of view (14°) takes about a minute. This extension of the SkyBoT service opens new possibilities for mining K2 data for Solar system science, as well as for removing Solar system objects from stellar photometric time series.
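A cone-search call against such a service can be sketched as a simple HTTP query; the endpoint path and the EPOCH and -mime parameters below are assumptions for illustration, while RA, DEC and SR follow the IVOA Simple Cone Search convention (decimal degrees).

```python
from urllib.parse import urlencode

# assumed SkyBoT cone-search endpoint (illustrative)
SKYBOT = "http://vo.imcce.fr/webservices/skybot/skybotconesearch_query.php"

def cone_search_url(ra_deg, dec_deg, radius_deg, epoch_jd):
    """Build a cone-search URL listing Solar system objects in a field of view."""
    params = {
        "RA": ra_deg,        # field centre, right ascension
        "DEC": dec_deg,      # field centre, declination
        "SR": radius_deg,    # search radius
        "EPOCH": epoch_jd,   # epoch of observation (assumed parameter name)
        "-mime": "votable",  # assumed output-format switch
    }
    return SKYBOT + "?" + urlencode(params)

url = cone_search_url(201.3, -10.2, 0.05, 2457247.5)
```

Fetching the URL would return a VOTable of known Solar system objects whose pre-computed ephemerides place them inside the field at that epoch.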
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
2000-01-01
Vixen is a collection of enabling technologies for uninhibited distributed object computing. In the Spring of 1995 when Vixen was proposed, it was an innovative idea very much ahead of its time. But today the technologies proposed in Vixen have become standard technologies for Enterprise Computing. Sun Microsystems J2EE/EJB specifications, among others, are independently proposed technologies of the Vixen type. I have brought Vixen completely under the J2EE standard in order to maximize interoperability and compatibility with other computing industry efforts. Vixen and the Enterprise JavaBean (EJB) Server technologies are now practically identical; OIL, another Vixen technology, and the Java Messaging System (JMS) are practically identical; and so on. There is no longer anything novel or patentable in the Vixen work performed under this grant. The above discussion, notwithstanding, my independent development of Vixen has significantly helped me, my university, my students and the local community. The undergraduate students who worked with me in developing Vixen have enhanced their expertise in what has become the cutting edge technology of their industry and are therefore well positioned for lucrative employment opportunities in the industry. My academic department has gained a new course: "Multi-media System Development", which provides a highly desirable expertise to our students for employment in any enterprise today. The many Outreach Programs that I conducted during this grant period have exposed local Middle School students to the contributions that NASA is making in our society as well as awakened desires in many such students for careers in Science and Technology. 
I have applied Vixen to the development of two software packages: (a) JAS (Joshua Application Server), which allows a user to configure an EJB server to serve a J2EE-compliant application over the World Wide Web; and (b) PCM (Professor Course Manager), a J2EE-compliant application for configuring a course for distance learning. These types of applications are, however, generally available in the industry today.
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2016-12-01
We are extending climate analytics-as-a-service, including: (1) A high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like the Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files. (2) A Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections that are accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib. (3) An Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following: - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services - A new reanalysis collections reference model to enable operator design and implementation - An enhanced library of sample queries to demonstrate and develop use case scenarios - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extractions of trends, climatologies, and anomalies - The ability to compute and visualize multiple reanalysis intercomparisons
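The climatology and anomaly operators mentioned above reduce, at a single grid cell, to simple series arithmetic; a minimal sketch on a toy monthly series (not the RES implementation):

```python
from statistics import mean

def climatology(series, period=12):
    """Mean seasonal cycle: the average of each calendar month across years."""
    return [mean(series[m::period]) for m in range(period)]

def anomalies(series, period=12):
    """Series minus its repeating climatology, the usual anomaly definition."""
    clim = climatology(series, period)
    return [x - clim[i % period] for i, x in enumerate(series)]

# toy 3-year monthly series; year 3 is uniformly 0.5 warmer than years 1-2
base = [0, 1, 3, 8, 13, 17, 20, 19, 15, 9, 4, 1]
series = base + base + [t + 0.5 for t in base]
anom = anomalies(series)  # year-3 anomalies positive, years 1-2 negative
```

A service-side operator would apply the same arithmetic per grid cell over the full reanalysis grid, with area averaging done beforehand or afterwards as requested.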
Development of a functional, internet-accessible department of surgery outcomes database.
Newcomb, William L; Lincourt, Amy E; Gersin, Keith; Kercher, Kent; Iannitti, David; Kuwada, Tim; Lyons, Cynthia; Sing, Ronald F; Hadzikadic, Mirsad; Heniford, B Todd; Rucho, Susan
2008-06-01
The need for surgical outcomes data is increasing due to pressure from insurance companies, patients, and the need for surgeons to keep their own "report card". Current data management systems are limited by inability to stratify outcomes based on patients, surgeons, and differences in surgical technique. Surgeons along with research and informatics personnel from an academic, hospital-based Department of Surgery and a state university's Department of Information Technology formed a partnership to develop a dynamic, internet-based, clinical data warehouse. A five-component model was used: data dictionary development, web application creation, participating center education and management, statistics applications, and data interpretation. A data dictionary was developed from a list of data elements to address needs of research, quality assurance, industry, and centers of excellence. A user-friendly web interface was developed with menu-driven check boxes, multiple electronic data entry points, direct downloads from hospital billing information, and web-based patient portals. Data were collected on a Health Insurance Portability and Accountability Act-compliant server with a secure firewall. Protected health information was de-identified. Data management strategies included automated auditing, on-site training, a trouble-shooting hotline, and Institutional Review Board oversight. Real-time, daily, monthly, and quarterly data reports were generated. Fifty-eight publications and 109 abstracts have been generated from the database during its development and implementation. Seven national academic departments now use the database to track patient outcomes. The development of a robust surgical outcomes database requires a combination of clinical, informatics, and research expertise. 
Benefits of surgeon involvement in outcomes research include: tracking individual performance, patient safety, surgical research, legal defense, and the ability to provide accurate information to patient and payers.
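The de-identification step described above can be sketched as replacing protected identifiers with a keyed hash, so records stay linkable across tables without exposing protected health information; the field names and key handling below are illustrative, not the system's actual implementation.

```python
import hashlib
import hmac

# placeholder secret; in practice the key lives outside the database,
# behind the secure firewall mentioned in the abstract
SECRET_KEY = b"site-specific-secret"

def deidentify(record, protected_fields=("name", "mrn", "dob")):
    """Return a copy of the record with protected fields replaced by keyed hashes."""
    clean = dict(record)
    for field in protected_fields:
        if field in clean:
            digest = hmac.new(SECRET_KEY, str(clean[field]).encode(), hashlib.sha256)
            clean[field] = digest.hexdigest()[:16]  # stable pseudonym
    return clean

row = {"name": "Jane Doe", "mrn": "123456", "procedure": "hernia repair"}
safe = deidentify(row)  # clinical fields survive; identifiers are pseudonymized
```

Because the hash is deterministic for a given key, the same patient maps to the same pseudonym in every table, which is what makes outcome tracking across records possible.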
Compliant Task Execution and Learning for Safe Mixed-Initiative Human-Robot Operations
NASA Technical Reports Server (NTRS)
Dong, Shuonan; Conrad, Patrick R.; Shah, Julie A.; Williams, Brian C.; Mittman, David S.; Ingham, Michel D.; Verma, Vandana
2011-01-01
We introduce a novel task execution capability that enhances the ability of in-situ crew members to function independently from Earth by enabling safe and efficient interaction with automated systems. This task execution capability provides the ability to (1) map goal-directed commands from humans into safe, compliant, automated actions, (2) quickly and safely respond to human commands and actions during task execution, and (3) specify complex motions through teaching by demonstration. Our results are applicable to future surface robotic systems, and we have demonstrated these capabilities on JPL's All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) robot.
VO-Dance: an IVOA tool to easily publish data into the VO, and its extension for planetology requests
NASA Astrophysics Data System (ADS)
Smareglia, R.; Capria, M. T.; Molinaro, M.
2012-09-01
Data publishing through self-standing portals can be joined with VO resource publishing, i.e. astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues ...), and since the data center's goal is to grow the number of hosted archives and provided services, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS- and platform-independent and consists of several main components, among them an internal DB to store resource descriptions and data model metadata, and a RESTful web application to deploy the resources to the VO community. Its extension to planetology requests is under study to make the best use of INAF software development effort and archive efficiency.
The Development of Ontology from Multiple Databases
NASA Astrophysics Data System (ADS)
Kasim, Shahreen; Aswa Omar, Nurul; Fudzee, Mohd Farhan Md; Azhar Ramli, Azizul; Aizi Salamat, Mohamad; Mahdin, Hairulnizam
2017-08-01
The halal industry is the fastest-growing global business across the world. The halal food industry is thus crucial for Muslims all over the world, as it serves to ensure that the food items they consume daily are syariah compliant. Currently, ontology is widely used in computer science areas such as heterogeneous information processing on the web, the semantic web, and information retrieval. However, ontology has still not been used widely in the halal industry. Today, the Muslim community still has problems verifying the halal status of products in the market, especially foods containing E numbers. This research tried to solve the problem of validating halal status from various halal sources. Various chemical ontologies from multiple databases were found to help this ontology development. The E numbers in these chemical ontologies are codes for chemicals that can be used as food additives. With this E-number ontology, the Muslim community could effectively identify and verify the halal status of products in the market.
Gregory, Shaun D; Schummy, Emma; Pearcy, Mark; Pauls, Jo P; Tansley, Geoff; Fraser, John F; Timms, Daniel
2015-02-01
Biventricular support with dual rotary ventricular assist devices (VADs) has been implemented clinically with restriction of the right VAD (RVAD) outflow cannula to artificially increase afterload and, therefore, operate within recommended design speed ranges. However, the low preload and high afterload sensitivity of these devices increase the susceptibility to suction events. Active control systems are prone to sensor drift or inaccurate inferred (sensor-less) data, therefore an alternative solution may be of benefit. This study presents the in vitro evaluation of a compliant outflow cannula designed to passively decrease the afterload sensitivity of rotary RVADs and minimize left-sided suction events. A one-way fluid-structure interaction model was initially used to produce a design with suitable flow dynamics and radial deformation. The resultant geometry was cast with different initial cross-sectional restrictions and concentrations of a softening diluent before evaluation in a mock circulation loop. Pulmonary vascular resistance (PVR) was increased from 50 dyne·s/cm⁵ until left-sided suction events occurred with each compliant cannula and a rigid, 4.5 mm diameter outflow cannula for comparison. Early suction events (PVR ≈ 300 dyne·s/cm⁵) were observed with the rigid outflow cannula. Addition of the compliant section with an initial 3 mm diameter restriction and 10% diluent expanded the outflow restriction as PVR increased, thus increasing RVAD flow rate and preventing left-sided suction events at PVR levels beyond 1000 dyne·s/cm⁵. Therefore, the compliant, restricted outflow cannula provided a passive control system to assist in the prevention of suction events with rotary biventricular support while maintaining pump speeds within normal ranges of operation. Copyright © 2014 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
The influence of novel compliant floors on balance control in elderly women--A biomechanical study.
Wright, Alexander D; Laing, Andrew C
2011-07-01
Novel compliant floors aim to decrease the risk for fall-related injury by providing substantial force attenuation during the impact phase of falls. Certain models of compliant flooring have been shown to have limited influence on postural sway and successful completion of dynamic balance tasks. However, the effects of these products on balance recovery mechanisms following an externally induced perturbation have yet to be quantified. We used a floor translation paradigm to induce a balance perturbation to thirteen elderly community-dwelling women. Outcome measures included the displacement rates and margins of safety for both the underfoot centre-of-pressure and whole-body centre-of-mass across two novel compliant floors (SmartCell, SofTile), two basic foam surfaces (Firm-Foam, Soft-Foam) and a standard 'Rigid' floor as a control condition. The centre-of-mass and centre-of-pressure margins of safety, and all centre-of-mass displacement rates, were not significantly lower for the two novel compliant flooring systems compared to the control floor. The centre-of-pressure displacement rates were similar to the control floor for the SmartCell floor condition. The majority of the margin of safety and displacement rate variables for the foam floors were significantly lower than the control condition. This study illustrates that the SmartCell and SofTile novel compliant floors have minimal influences on balance and balance control responses following externally induced perturbations in older community-dwelling women, and supports pilot installations of these floors to inform decisions regarding the development of clinical trials. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Graduated driver license compliant teens involved in fatal motor vehicle crashes.
Pressley, Joyce C; Addison, Diane; Dawson, Patrick; Nelson, Sharifa S
2015-09-01
Significant reductions in motor vehicle injury mortality have been reported for teen drivers after passage of graduated driver licensing (GDL), seat belt, and no tolerance alcohol and drug laws. Despite this, teen drivers remain a vulnerable population with elevated fatal crash involvement. This study examines driver, vehicle, and crash characteristics of GDL-compliant, belted, and unimpaired teen drivers with the goal of identifying areas where further improvements might be realized. The Fatality Analysis Reporting System (FARS) for 2007 to 2009 was used to examine and classify driver violations/errors in compliant teen drivers (n = 1,571) of passenger vehicles involved in a fatal collision. Teens driving unbelted, non-GDL compliant, or impaired by alcohol or drugs were excluded. Statistical analysis used χ2 tests, Fisher's exact test, and multivariable logistic regression. Odds ratios are reported with 95% confidence intervals. Significance was defined as p < 0.05. Nearly one third (n = 1,571) of teen drivers involved in a fatal motor vehicle crash were GDL compliant, unimpaired, and belted. The majority held an intermediate GDL license (90.6%). Crash-related factors were identified for 63.1% of fatal crashes. Age- and sex-adjusted odds identified overcorrecting, speeding, lane errors, school morning crashes, distractions, and driving on slippery surfaces as having increased odds of fatality for the teen driver as well as newer vehicle models and heavier vehicle weight as protective. Among compliant drivers, weekday crashes before and after school and committing a driving violation at the time of crash were associated with increased risk of driver death and higher incidence of incapacitating injury in surviving drivers. Therapeutic study, level V.
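The adjusted odds ratios above come from the logistic regression, but the unadjusted version of such an estimate reduces to a 2×2 table; a sketch using the Woolf log method with invented counts (not FARS data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and 95% CI for a 2x2 table (Woolf log method):
    a = exposed cases, b = exposed controls, c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: a risk factor among driver deaths vs. survivors
or_, ci = odds_ratio(120, 80, 300, 400)
```

A CI excluding 1.0 corresponds to a significant association at the p < 0.05 threshold used in the study; the regression-based version additionally adjusts for age and sex.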
Compliance among soft contact lens wearers.
Kuzman, Tomislav; Kutija, Marija Barisić; Masnec, Sanja; Jandroković, Sonja; Mrazovac, Danijela; Jurisić, Darija; Skegro, Ivan; Kalauz, Miro; Kordić, Rajko
2014-12-01
Contact lens compliance is proven to be crucial for preventing lens wear-related complications because of the interdependence of the steps in the lens care regime and their influence on lens system microbial contamination. Awareness of patients' lens handling compliance, as well as correct recognition of non-compliant behaviours, is the basis for creating more targeted strategies for patient education. The aim of this study was to investigate compliance among soft contact lens (SCL) wearers in different aspects of lens care handling and wearing habits. In our research, 50 asymptomatic lens wearers filled out a questionnaire containing demographic data, lens type, hygiene and wearing habits, lens and lens care system replacement schedule, and a self-evaluation of contact lens handling hygiene. We established criteria of compliance according to available manufacturers' recommendations, prior literature, and our clinical experience. Only 2 (4%) of patients were fully compliant SCL wearers. The most common non-compliant behaviours were insufficient lens solution soaking time (62%), followed by failure to exchange the lens case solution daily and showering while wearing lenses. Forty-four percent of patients reported storing lenses in saline solution. Mean lens storage case replacement was 3.6 months, with up to 78% of patients replacing the lens case at least once in 3 months. The average self-evaluated level of compliance was very good (4 ± 0.78, on a scale from 1 = poor hygiene to 5 = excellent hygiene). Lens wearers who reported excessive daily lens wear and more than 10 years of lens wearing experience were also found to be less compliant with other lens system care procedures (t = -2.99, df = 47, p < 0.0045 and t = -2.33, df = 48, p < 0.024, respectively). Our study indicates that almost all patients had some degree of non-compliance in lens system maintenance steps.
The most common non-compliant behaviours were the ones that are crucial for maintaining lens sterility and preventing infection. Despite the low objective compliance rate, self-grading was relatively high. These results therefore indicate the need for patient education and encouragement of better lens wearing habits and of all lens maintenance steps at each patient visit.
NASA Astrophysics Data System (ADS)
Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.
2008-03-01
A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory (IPILab), USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid-access-points, there is still no method to guarantee that sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure involves a private 64-byte signature that is embedded into each original DICOM image volume; on the receiving end the signature can be extracted and verified following the DICOM transmission. This digital signature method has also been developed at the IPILab. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS federates the logs of transmission and authentication events at each grid-access-point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid-access-point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without any loss of image quality.
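The paper's actual method embeds a 3-D lossless digital signature into the image volume itself; the following is only a minimal sketch of the sign-then-verify round trip it describes, using an HMAC over the pixel data to stand in for the real signature algorithm. The shared key and the pixel bytes are hypothetical.

```python
import hashlib
import hmac

# Hypothetical shared secret between grid-access-points (an assumption;
# the paper does not specify the key management scheme).
SECRET_KEY = b"grid-access-point-shared-secret"

def sign_volume(pixel_data: bytes) -> bytes:
    """Compute a 64-byte signature over an image volume's pixel data."""
    return hmac.new(SECRET_KEY, pixel_data, hashlib.sha512).digest()

def verify_volume(pixel_data: bytes, signature: bytes) -> bool:
    """Re-compute the signature after transmission and compare securely."""
    return hmac.compare_digest(sign_volume(pixel_data), signature)

volume = b"\x00\x01\x02\x03"          # stand-in for DICOM pixel data
sig = sign_volume(volume)
assert len(sig) == 64                  # matches the paper's 64-byte signature
assert verify_volume(volume, sig)      # unaltered volume verifies
assert not verify_volume(volume + b"\xff", sig)  # tampering is detected
```

The point of the round trip is that any alteration of the pixel data during transmission across the public domain changes the recomputed signature and fails verification.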
New integrated silicon-PDMS process for compliant micro-mechanisms
NASA Astrophysics Data System (ADS)
Haouas, Wissem; Dahmouche, Redwan; Agnus, Joël; Le Fort-Piat, Nadine; Laurent, Guillaume J.
2017-12-01
Polydimethylsiloxane (PDMS) elastomers are used for many applications, such as microfluidics and micro-engineering. This paper presents a new process for integrating soft elastomers into a silicon structure without any assembly steps. The novelty of this process is the use of only one deep reactive ion etching (DRIE) step instead of the two or more used in previous works. This fabrication process thus allows the use of elastomers that are usually not compatible with some fabrication processes. Compliant flexures with different interference shapes have been designed, simulated, fabricated, and characterized for generic use, notably for micro-robot joints and compliant micro-systems. The experimental results show that samples with a 400 μm × 400 μm cross-sectional area can be bent by more than 60° without delamination.
Advanced public transportation systems benefits
DOT National Transportation Integrated Search
1996-03-01
Benefits and cost savings for various Advanced Public Transportation Systems are outlined here. Operational efficiencies are given for Transit Management Systems in different locales, as well as complaint resolution and safety. Electronic Fare Paymen...
W-026, transuranic waste restricted waste management (TRU RWM) glovebox operational test report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leist, K.J.
1998-02-18
The TRU Waste/Restricted Waste Management (LLW/PWNP) Glovebox 401 is designed to accept and process waste from the Transuranic Process Glovebox 302. Waste is transferred to the glovebox via the Drath and Schraeder Bagless Transfer Port (DO-07401) on a transfer stand. The stand is removed with a hoist and the operator inspects the waste (with the aid of the Sampling and Treatment Director) to determine a course of action for each item. The waste is separated into compliant and non-compliant items. One Trip Port DO-07402A is designated as "Compliant" and One Trip Port DO-07402B is designated as "Non-Compliant". As the processing (inspection, bar coding, sampling, and treatment) of the transferred items takes place, residue is placed in the appropriate One Trip port. The status of the waste items is tracked by the Data Management System (DMS) via the Plant Control System (PCS) barcode interface. As an item is moved for sampling or storage, or its state is altered by treatment, the operator tracks the item's location using a portable barcode reader and enters any required data on the DMS console. The Operational Test Procedure (OTP) performs the evolutions described here using the Plant Operating Procedures (POP) in order to verify that they are sufficient and accurate for controlled glovebox operation.
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2005-12-01
The need for Earth science education to prepare students as a globally-trained geoscience workforce has increased tremendously with the globalization of the economy. However, current academic programs often have difficulties in providing students with world-view training or experiences in a global context, due to a lack of resources and suitable teaching technology. This paper presents a NASA-funded project with insights and solutions to this problem. The project aims to establish a geospatial data-rich learning and research environment that enables students, faculty, and researchers from institutes all over the world to easily access, analyze, and model with the huge amount of NASA EOS data, just as if they possessed those vast resources locally at their desktops. With this environment, classroom demonstration and training for students to deal with global climate and environment issues for any part of the world are possible in any classroom with an Internet connection. Globalization and mobilization of Earth science education can be truly realized through this environment. The project, named NASA EOS Higher Education Alliance: Mobilization of NASA EOS Data and Information through Web Services and Knowledge Management Technologies for Higher Education Teaching and Research, is built on solid technology and infrastructure foundations including web service technology, NASA EOS data resources, and open interoperability standards. An open, distributed, standards-compliant, interoperable web-based system, called GeoBrain, is being developed by this project to provide a data-rich online learning and research environment. The system allows users to dynamically and collaboratively develop interoperable, web-executable geospatial process and analysis modules and models, and run them online against any part of the petabyte archives to get back customized information products rather than raw data.
The system makes a data-rich, globally-capable Earth science learning and research environment, backed by NASA EOS data and computing resources previously unavailable to students and professors, available at their desktops free of charge. In order to efficiently integrate this new environment into Earth science education and research, a NASA EOS Higher Education Alliance (NEHEA) has been formed. The core members of NEHEA consist of the GeoBrain development team, led by LAITS at George Mason University, and a group of Earth science educators selected through an open RFP process. NEHEA is an open and free alliance that welcomes Earth science educators around the world to join as associate members. NEHEA promotes international research and education collaborations in Earth science. NEHEA core members will provide technical support to NEHEA associate members for incorporating the data-rich learning environment into their teaching and research activities. The responsibilities of NEHEA education members include using the system in their research and teaching, providing feedback and requirements to the development team, exchanging information on the utilization of the system capabilities, participating in the system development, and developing new curricula and research around the environment provided by GeoBrain.
MyOcean Information System : achievements and perspectives
NASA Astrophysics Data System (ADS)
Loubrieu, T.; Dorandeu, J.; Claverie, V.; Cordier, K.; Barzic, Y.; Lauret, O.; Jolibois, T.; Blower, J.
2012-04-01
The objective of the MyOcean system (http://www.myocean.eu) is to provide a Core Service for the Ocean. This means MyOcean is setting up an operational service for forecasts, analysis, and expertise on ocean currents, temperature, salinity, sea level, primary ecosystems, and ice coverage. The production of observation and forecasting data is distributed across 12 production centres. The interface with external users (including the web portal) and the coordination of the overall service are managed by a component called the service desk. In addition, a transverse component called the MIS (MyOcean Information System) aims at connecting the production centres and service desk together, managing the shared information for the overall system, and implementing a standard INSPIRE interface for the external world. 2012 is a key year for the system. The 3-year MyOcean project, which set up the first versions of the system, is ending; the 2-year MyOcean II project, which will upgrade and consolidate the system, is starting. Both projects are funded by the European Commission within the GMES Programme (7th Framework Programme). At the end of the MyOcean project, the system has been designed and the first two versions have been implemented. The system now offers an integrated service comprising 237 ocean products. The ocean products are homogeneously described in a catalogue. They can be visualized and downloaded by the user (identified with a unique login) through a seamless web interface. The discovery and viewing interfaces are INSPIRE compliant. The data production, subsystem availability, and audience are continuously monitored. The presentation will detail the implemented information system architecture and the chosen software solutions. Regarding the information system, MyOcean II mainly aims at consolidating the existing functions and promoting cost-effective operations.
In addition, a specific effort will be made so that the less common data features of the system (ocean in-situ observations, remote-sensing along-track observations) reach the same level of interoperability for the view and download functions as the gridded features. The presentation will detail the envisioned plans.
A Walk through TRIDEC's intermediate Tsunami Early Warning System
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Reißland, S.; Lendholt, M.
2012-04-01
The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC builds on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS), providing a service platform for both sensor integration and warning dissemination. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. The successive demonstrators address challenges such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools, and data fusion tools. In addition to conventional sensors, unconventional sensors and sensor networks also play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C), and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes, and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels.
The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Spatial data are utilized via the OGC Web Map Service (WMS) and Web Feature Service (WFS) to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP), together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and potential of the implemented approach are demonstrated, covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. System development is based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been, and will be, placed on leveraging open source technologies that support mature system architecture models wherever appropriate. All open source software produced is foreseen to be published in a publicly available software repository, thus allowing others to reuse the results achieved and enabling further development and collaboration with a wide community including scientists, developers, users, and stakeholders.
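Warning messages in the OASIS Common Alerting Protocol are XML documents with a small set of mandatory header elements and one or more `info` blocks. The sketch below builds a minimal CAP 1.2 alert; the identifier, sender, and area values are illustrative, not taken from the TRIDEC system.

```python
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"  # CAP 1.2 namespace

def build_alert(identifier, sender, sent, event, areas):
    """Assemble a minimal CAP 1.2 alert document as a string."""
    alert = ET.Element("{%s}alert" % CAP_NS)
    # Mandatory alert-level header elements, in schema order.
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Exercise"),
                      ("msgType", "Alert"), ("scope", "Restricted")]:
        ET.SubElement(alert, "{%s}%s" % (CAP_NS, tag)).text = text
    info = ET.SubElement(alert, "{%s}info" % CAP_NS)
    for tag, text in [("category", "Geo"), ("event", event),
                      ("urgency", "Immediate"), ("severity", "Extreme"),
                      ("certainty", "Observed")]:
        ET.SubElement(info, "{%s}%s" % (CAP_NS, tag)).text = text
    for name in areas:
        area = ET.SubElement(info, "{%s}area" % CAP_NS)
        ET.SubElement(area, "{%s}areaDesc" % CAP_NS).text = name
    return ET.tostring(alert, encoding="unicode")

xml_doc = build_alert("TRIDEC-DEMO-001", "ntwc@example.org",
                      "2012-04-01T12:00:00+00:00", "Tsunami",
                      ["Eastern Mediterranean coast"])
```

In a full system the same document would be wrapped in an EDXL-DE distribution element carrying the routing information; that layer is omitted here.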
This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
Human-like Compliance for Dexterous Robot Hands
NASA Technical Reports Server (NTRS)
Jau, Bruno M.
1995-01-01
This paper describes the Active Electromechanical Compliance (AEC) system that was developed for the Jau-JPL anthropomorphic robot. The AEC system imitates the secondary function of the human muscle, which is to control the joint's stiffness: AEC is implemented by servo-controlling the stiffness of the joint drive train. The control strategy for controlling compliant joints in teleoperation is described. It enables automatic hybrid position and force control by utilizing sensory feedback from joint and compliance sensors. This compliant control strategy is adaptable for autonomous robot control as well. Active compliance enables dual-arm manipulation and human-like soft grasping by the robot hand, and opens the way to many new robotics applications.
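A stiffness-servoed joint of this kind is often modeled as a PD-style law in which the commanded stiffness gain plays the role of the muscle's adjustable joint stiffness. The sketch below is a generic illustration of that idea, not the AEC system's actual controller; gains and angles are arbitrary.

```python
def compliant_joint_torque(theta_d, theta, theta_dot, stiffness, damping):
    """Stiffness servo: torque toward the desired angle theta_d, with the
    stiffness gain acting as the tunable joint stiffness."""
    return stiffness * (theta_d - theta) - damping * theta_dot

# Same 0.2 rad position error, two stiffness settings:
soft = compliant_joint_torque(1.0, 0.8, 0.0, stiffness=5.0, damping=0.5)
stiff = compliant_joint_torque(1.0, 0.8, 0.0, stiffness=50.0, damping=0.5)
# A low stiffness yields a gentle corrective torque (soft grasp);
# a high stiffness yields a much stronger one (rigid positioning).
```

Lowering the stiffness gain is what permits the soft, human-like grasp described above, while a high gain recovers conventional stiff position control.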
NASA Astrophysics Data System (ADS)
Knapic, C.; Zanichelli, A.; Dovgan, E.; Nanni, M.; Stagni, M.; Righini, S.; Sponza, M.; Bedosti, F.; Orlati, A.; Smareglia, R.
2016-07-01
Radio astronomical data models are becoming very complex because of the huge range of instrumental configurations available with modern radio telescopes. What were in the past the last frontiers of data formats, in terms of efficiency and flexibility, are now evolving with new strategies and methodologies that enable the persistence of very complex, hierarchical, and multi-purpose information. Such an evolution of data models and data formats requires new data archiving techniques in order to guarantee data preservation, following the directives of the Open Archival Information System and of the International Virtual Observatory Alliance for data sharing and publication. Currently, various formats (FITS, MBFITS, VLBI's XML description files, and ancillary files) of data acquired with the Medicina and Noto Radio Telescopes can be stored and handled by a common Radio Archive, which is planned to be released to the (inter)national community by the end of 2016. This state-of-the-art archiving system for radio astronomical data aims at delegating as much as possible to the software the decisions of how and where the descriptors (metadata) are saved, while users perform user-friendly queries that the web interface translates into complex interrogations of the database to retrieve data. In this way, the Archive is ready to be Virtual Observatory compliant and as user-friendly as possible.
NASA Astrophysics Data System (ADS)
Slater, Gregory L.; Schiff, David; De Pontieu, Bart; Tarbell, Theodore D.; Freeland, Samuel L.
2017-08-01
We present Cruiser, a new web tool for the precision interactive blending of image series across time and wavelength domains. Scrolling in two dimensions enables discovery and investigation of similarities and differences in structure and evolution across multiple wavelengths. Cruiser works in the latest versions of standards-compliant browsers on both desktop and iOS platforms. Co-aligned data cubes have been generated for AIA, IRIS, and Hinode SOT FG, and image data from additional instruments, both space-based and ground-based, can serve as data sources. The tool has several movie-playing and image-adjustment controls, which will be described in the poster and demonstrated on a macOS notebook and an iPad.
Weighted Description Logics Preference Formulas for Multiattribute Negotiation
NASA Astrophysics Data System (ADS)
Ragone, Azzurra; di Noia, Tommaso; Donini, Francesco M.; di Sciascio, Eugenio; Wellman, Michael P.
We propose a framework to compute the utility of an agreement with respect to a preference set in a negotiation process. In particular, we refer to preferences expressed as weighted formulas in a decidable fragment of first-order logic, and to agreements expressed as a formula. We ground our framework in Description Logics (DL) endowed with disjunction, to be compliant with Semantic Web technologies. A logic-based approach to preference representation makes it possible, when a background knowledge base is exploited, to relax the often unrealistic assumption of additive independence among attributes. We provide suitable definitions of the problem and present algorithms to compute utility in our setting. We also validate our approach through an experimental evaluation.
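The core computation is simple to state: the utility of an agreement is the sum of the weights of the preference formulas it satisfies, where satisfaction is decided by a DL reasoner over the background knowledge base. The sketch below replaces the DL reasoner with a toy propositional stand-in (an agreement is a set of atoms, and a "formula" holds when all its atoms are present); the preference names and weights are invented for illustration.

```python
def utility(agreement, weighted_prefs, entails):
    """Sum the weights of the preference formulas satisfied by the agreement.

    `entails(agreement, formula)` stands in for DL reasoning: it should
    return True when the agreement, together with the background
    knowledge base, entails the preference formula."""
    return sum(w for formula, w in weighted_prefs if entails(agreement, formula))

# Toy stand-in for entailment: subset test over sets of atoms.
entails = lambda agreement, formula: formula <= agreement

# Hypothetical car-negotiation preferences (formula, weight):
prefs = [({"leather_seats"}, 0.3), ({"diesel"}, 0.5), ({"diesel", "abs"}, 0.2)]
agreement = {"diesel", "abs"}
u = utility(agreement, prefs, entails)  # 0.5 + 0.2, leather_seats unsatisfied
```

In the actual framework, `entails` would invoke a DL reasoner, which is what lets a background ontology make one preference imply another instead of treating attributes as independent.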
NASA Astrophysics Data System (ADS)
Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.
2009-04-01
The development of new technologies aimed at enhancing web applications with dynamic data access was also the starting point for the development of geospatial web applications. By means of these technologies, web applications embed the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as web services enables interoperability among web applications, i.e. the ability to process requests from each other via a network. In the oceanographic community in particular, modern geographical information systems based on geospatial web services are now being developed, or will be developed in the near future, with capabilities of managing the information fully through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with web services built from open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centres (IOC/IODE), holds a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing over 300,000 station records of physical, chemical, and biological oceanographic information. A modern web application is now available that allows end users worldwide to explore and navigate HNODC data via an interface capable of presenting geographical representations of the geo-information.
The application is built from state-of-the-art software components and tools such as: • geospatial and non-spatial web service mechanisms; • geospatial open source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between web service systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema owing to the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS); querying and navigating geospatial and metadata information via the Web Feature Service (WFS); and, in the near future, transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating, and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the web service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial web services mechanism. These modules of the platform are still under development, but their implementation will be completed in the near future. • Finally, a new web user interface for the end user is based on an enhanced and customized version of a Mapbender GUI, a powerful web services client.
For HNODC, the interoperability of web services is the big advantage of the developed platform, since it will be able to act in the future as both a provider and a consumer of web services: • either as a data product provider for external SOA platforms, • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great example of data management integration and dissemination via such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with web services. Furthermore, when the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
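The WMS interface mentioned in the architecture above is a plain HTTP request with a fixed set of parameters. The sketch below assembles a standard WMS 1.1.1 GetMap URL; the endpoint and layer name are hypothetical, not the actual HNODC services.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600)):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical MapServer endpoint and station layer, Aegean-area bbox:
url = wms_getmap_url("http://example.org/cgi-bin/mapserv",
                     "hnodc_stations", (19.0, 30.0, 30.0, 42.0))
```

Because every OGC-compliant WMS understands the same parameters, the same client code can fetch map layers from MapServer, GeoServer, or any external SOA platform, which is exactly the interoperability argument made above.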
NASA Astrophysics Data System (ADS)
Giacometti, Paolo; Diamond, Solomon G.
2013-02-01
A noninvasive head probe that combines near-infrared spectroscopy (NIRS) and electroencephalography (EEG) for simultaneous measurement of neural dynamics and hemodynamics in the brain is presented. It is composed of a compliant expandable mechanism that accommodates a wide range of head size variation and an elastomeric web that maintains uniform sensor contact pressure on the scalp as the mechanism expands and contracts. The design is intended to help maximize optical and electrical coupling and to maintain stability during head movement. Positioning electrodes at the inion, nasion, central, and preauricular fiducial locations mechanically shapes the probe to place 64 NIRS optodes and 65 EEG electrodes following the 10-5 scalp coordinates. The placement accuracy, precision, and scalp pressure uniformity of the sensors are evaluated. A root-mean-squared (RMS) positional precision of 0.89±0.23 mm, percent arc subdivision RMS accuracy of 0.19±0.15%, and mean normal force on the scalp of 2.28±0.88 N at 5 mm displacement were found. Geometric measurements indicate that the probe will accommodate the full range of adult head sizes. The placement accuracy, precision, and uniformity of sensor contact pressure of the proposed head probe are important determinants of data quality in noninvasive brain monitoring with simultaneous NIRS-EEG.
Towards generating ECSS-compliant fault tree analysis results via ConcertoFLA
NASA Astrophysics Data System (ADS)
Gallina, B.; Haider, Z.; Carlsson, A.
2018-05-01
Attitude Control Systems (ACSs) maintain the orientation of a satellite in three-dimensional space. ACSs need to be engineered in compliance with ECSS standards and need to ensure a certain degree of dependability. Thus, dependability analysis is conducted at various levels and by using ECSS-compliant techniques. Fault Tree Analysis (FTA) is one of these techniques. FTA is being automated within various Model Driven Engineering (MDE)-based methodologies. The tool-supported CHESS methodology is one of them. This methodology incorporates ConcertoFLA, a dependability analysis technique enabling failure behaviour analysis and thus the generation of FTA results. ConcertoFLA, however, similarly to other techniques, still belongs to the academic research niche. To promote this technique within the space industry, we apply it to an ACS and discuss its multi-faceted potential in the context of ECSS-compliant engineering.
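The quantitative side of FTA reduces to propagating basic-event probabilities through AND/OR gates. The sketch below shows that standard computation (assuming independent basic events) on a hypothetical ACS fault tree; the failure probabilities are invented, not from the paper.

```python
from math import prod

def or_gate(probs):
    """Top event occurs if any input event occurs (independent events)."""
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):
    """Top event occurs only if all input events occur together."""
    return prod(probs)

# Hypothetical tree: attitude loss if the attitude sensor fails OR
# both redundant reaction wheels fail.
p_sensor = 1e-3
p_wheel = 1e-2
p_top = or_gate([p_sensor, and_gate([p_wheel, p_wheel])])
# Redundancy drives the wheel branch down to p_wheel**2, so the sensor
# dominates the top-event probability.
```

Tools like ConcertoFLA automate the qualitative step that this sketch skips: deriving the tree's gate structure from the component failure-behaviour models.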
Hurrell, M J; Monk, T G; Nicol, A; Norton, A N; Reich, D L; Walsh, J L
2012-08-01
With the increasing use of anaesthesia information management systems (AIMS), there is the opportunity for different institutions to aggregate and share information both nationally and internationally. Potential uses of such aggregated data include outcomes research, benchmarking, and improvement in clinical practice and patient safety. However, these goals can only be achieved if data contained in records from different sources are truly comparable and there is semantic interoperability. This paper describes the development of a standard terminology for anaesthesia, as well as a Domain Analysis Model and implementation guide, to facilitate a standard representation of AIMS records as Extensible Markup Language (XML) documents that are compliant with the Health Level 7 Version 3 clinical document architecture. A representation of vital signs that is compliant with the International Standards Organization 11073 standard is also discussed.
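In an HL7 V3 CDA document, a vital sign typically appears as an observation entry whose code is drawn from a standard nomenclature and whose value is a physical quantity. The sketch below builds one such observation; it is a simplified illustration, not the paper's Domain Analysis Model, and the ISO/IEEE 11073 (MDC) code and OID used should be treated as assumptions for the example.

```python
import xml.etree.ElementTree as ET

HL7_NS = "urn:hl7-org:v3"
XSI_NS = "http://www.w3.org/2001/XMLSchema-instance"

def vital_sign_observation(code, display, value, unit):
    """Sketch of a single CDA observation entry for a vital sign."""
    obs = ET.Element("{%s}observation" % HL7_NS,
                     {"classCode": "OBS", "moodCode": "EVN"})
    # codeSystem OID below is the one commonly cited for the
    # ISO/IEEE 11073-10101 (MDC) nomenclature -- illustrative here.
    ET.SubElement(obs, "{%s}code" % HL7_NS,
                  {"code": code, "codeSystem": "2.16.840.1.113883.6.24",
                   "displayName": display})
    ET.SubElement(obs, "{%s}value" % HL7_NS,
                  {"{%s}type" % XSI_NS: "PQ",
                   "value": str(value), "unit": unit})
    return ET.tostring(obs, encoding="unicode")

# Hypothetical pulse-oximetry reading:
xml_obs = vital_sign_observation("150456", "MDC_PULS_OXIM_SAT_O2", 98, "%")
```

Semantic interoperability follows from the codes, not the XML shape: two AIMS records are comparable only if both map their local terms onto the same standard nomenclature.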
GeoCSV: tabular text formatting for geoscience data
NASA Astrophysics Data System (ADS)
Stults, M.; Arko, R. A.; Davis, E.; Ertz, D. J.; Turner, M.; Trabant, C. M.; Valentine, D. W., Jr.; Ahern, T. K.; Carbotte, S. M.; Gurnis, M.; Meertens, C.; Ramamurthy, M. K.; Zaslavsky, I.; McWhirter, J.
2015-12-01
The GeoCSV design was developed within the GeoWS project as a way to provide a baseline of compatibility between tabular text data sets from various sub-domains in geoscience. Funded through NSF's EarthCube initiative, the GeoWS project aims to develop common web service interfaces for data access across hydrology, geodesy, seismology, marine geophysics, atmospheric science, and other areas. The GeoCSV format is an essential part of delivering data via simple web services for discovery and utilization by both humans and machines. As most geoscience disciplines have developed and use data formats specific to their needs, tabular text data can play a key role as a lowest common denominator for exchanging and integrating data across sub-domains. The design starts with a core definition compatible with the best practices described by the W3C CSV on the Web Working Group (CSVW). Compatibility with CSVW is intended to ensure the broadest usability of data expressed as GeoCSV. An optional, simple, but limited metadata description mechanism was added to allow the inclusion of important metadata with comma-separated data, while staying within CSVW's definition of a "dialect". The format is designed both for creating new data sets and for annotating data sets already in a tabular text format so that they are compliant with GeoCSV.
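The metadata mechanism mentioned above takes the form of '#'-prefixed keyword lines placed before an ordinary CSV header, so any plain CSV reader that skips comment lines still works. The sketch below shows a small GeoCSV-style document and one way to split it into metadata and rows; the field names and keyword values are illustrative.

```python
import csv
import io

# Minimal GeoCSV-style document: '#'-prefixed "keyword: value" lines,
# then a regular CSV header and data rows (contents illustrative).
doc = """\
# dataset: GeoCSV 2.0
# field_unit: ISO_8601,degrees_north,degrees_east,m
# field_type: datetime,float,float,float
time,latitude,longitude,depth
2015-01-01T00:00:00Z,34.05,-118.25,10.0
2015-01-01T01:00:00Z,34.06,-118.24,12.5
"""

metadata, data_lines = {}, []
for line in doc.splitlines():
    if line.startswith("#"):
        # "# keyword: value" -> metadata entry
        key, _, value = line.lstrip("# ").partition(":")
        metadata[key.strip()] = value.strip()
    else:
        data_lines.append(line)

rows = list(csv.DictReader(io.StringIO("\n".join(data_lines))))
```

Because the comment lines are optional, the same file parses as plain CSV in any spreadsheet or standard reader, which is the lowest-common-denominator property the design is after.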
Exploring NASA OMI Level 2 Data With Visualization
NASA Technical Reports Server (NTRS)
Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto
2014-01-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcano eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in the OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front end) to connect to our back-end server with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance expandability by integrating additional outside data and map sources.
Facial recognition trial: biometric identification of non-compliant subjects using CCTV
NASA Astrophysics Data System (ADS)
Best, Tim
2007-10-01
LogicaCMG were provided with an opportunity to deploy a facial recognition system in a realistic scenario. 12 cameras were installed at an international airport covering all entrances to the immigration hall. The evaluation took place over several months with numerous adjustments to both the hardware (i.e. cameras, servers and capture cards) and software. The learning curve has been very steep, but a stage has now been reached where both LogicaCMG and the client are confident that, subject to the right environmental conditions (lighting and camera location), an effective system can be defined with a high probability of successful detection of the target individual, with minimal false alarms. To the best of our knowledge, a >90% detection rate for non-compliant subjects 'at range' has not been achieved anywhere else. This puts this location at the forefront of capability in this area. The results achieved demonstrate that, given optimised conditions, it is possible to achieve long-range biometric identification of a non-compliant subject, with a high rate of success.
Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P
2002-01-01
Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
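To give a rough feel for the approach (a toy sketch, not the authors' UML model or any normative DICOM SR schema), a structured report can be viewed as a tree of typed content items, which maps naturally onto an XML serialization:

```python
import xml.etree.ElementTree as ET

def sr_item(value_type, concept, value=None, children=()):
    """One node of a (grossly simplified) SR-like content tree.

    Element and attribute names here are invented for this sketch;
    real DICOM SR uses coded concept names and defined value types.
    """
    node = ET.Element("ContentItem", ValueType=value_type, ConceptName=concept)
    if value is not None:
        ET.SubElement(node, "Value").text = value
    for child in children:
        node.append(child)
    return node

report = sr_item("CONTAINER", "Imaging Report", children=[
    sr_item("TEXT", "Finding", "No focal consolidation."),
    sr_item("CODE", "Conclusion", "Normal"),
])
xml = ET.tostring(report, encoding="unicode")
```

The object-oriented model plays the role of `sr_item` here: a high-level view from which XML-exchangeable representations can be derived mechanically.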
Urban, Trinity; Ziegler, Erik; Lewis, Rob; Hafey, Chris; Sadow, Cheryl; Van den Abbeele, Annick D; Harris, Gordon J
2017-11-01
Oncology clinical trials have become increasingly dependent upon image-based surrogate endpoints for determining patient eligibility and treatment efficacy. As therapeutics have evolved and multiplied in number, the tumor metrics criteria used to characterize therapeutic response have become progressively more varied and complex. The growing intricacies of image-based response evaluation, together with rising expectations for rapid and consistent results reporting, make it difficult for site radiologists to adequately address local and multicenter imaging demands. These challenges demonstrate the need for advanced cancer imaging informatics tools that can help ensure protocol-compliant image evaluation while simultaneously promoting reviewer efficiency. LesionTracker is a quantitative imaging package optimized for oncology clinical trial workflows. The goal of the project is to create an open source zero-footprint viewer for image analysis that is designed to be extensible as well as capable of being integrated into third-party systems for advanced imaging tools and clinical trials informatics platforms. Cancer Res; 77(21); e119-22. ©2017 American Association for Cancer Research.
Flight Testing of Novel Compliant Spines for Passive Wing Morphing on Ornithopters
NASA Technical Reports Server (NTRS)
Wissa, Aimy; Guerreiro, Nelson; Grauer, Jared; Altenbuchner, Cornelia; Hubbard, James E., Jr.; Tummala, Yashwanth; Frecker, Mary; Roberts, Richard
2013-01-01
Unmanned Aerial Vehicles (UAVs) are proliferating in both the civil and military markets. Flapping wing UAVs, or ornithopters, have the potential to combine the agility and maneuverability of rotary wing aircraft with excellent performance in low Reynolds number flight regimes. The purpose of this paper is to present new free flight experimental results for an ornithopter equipped with one degree of freedom (1DOF) compliant spines that were designed and optimized in terms of mass, maximum von-Mises stress, and desired wing bending deflections. The spines were inserted in an experimental ornithopter wing spar in order to achieve a set of desired kinematics during the up and down strokes of a flapping cycle. The ornithopter was flown at Wright Patterson Air Force Base in the Air Force Research Laboratory Small Unmanned Air Systems (SUAS) indoor flight facility. Vicon motion tracking cameras were used to track the motion of the vehicle for five different wing configurations. The effect of the presence of the compliant spine on wing kinematics and leading edge spar deflection during flight is presented. Results show that the ornithopter with the compliant spine inserted in its wing reduced the body acceleration during the upstroke which translates into overall lift gains.
Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case
NASA Astrophysics Data System (ADS)
Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.
2013-10-01
Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must have strong cross-interaction without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have a sturdy logic in order to perform all operations in coordination with the other PESSTO tools. MySQL Replication technology and triggers are used for the synchronization of new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to the overriding of different sources, formats, management fields, storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files), but can be easily updated by a planned Archiving System Configuration Interface (ASCI).
National Geothermal Data System: an Exemplar of Open Access to Data
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Blackman, H.; Anderson, A.
2013-12-01
The National Geothermal Data System's (NGDS - www.geothermaldata.org) formal launch in 2014 will provide open access to millions of datasets, sharing technical geothermal-relevant data across the geosciences to propel geothermal development and production. With information from all of the Department of Energy's sponsored development and research projects and geologic data from all 50 states, this free, interactive tool is opening new exploration opportunities and shortening project development by making data easily discoverable and accessible. We continue to populate our prototype functional data system with multiple data nodes and nationwide data online and available to the public. Data from state geological surveys and partners includes more than 5 million records online, including 1.48 million well headers (oil and gas, water, geothermal), 732,000 well logs, and 314,000 borehole temperatures and is growing rapidly. There are over 250 Web services and another 138 WMS (Web Map Services) registered in the system as of August, 2013. Companion projects run by Boise State University, Southern Methodist University, and USGS are adding millions of additional data records. The National Renewable Energy Laboratory is managing the Geothermal Data Repository which will serve as a system node and clearinghouse for data from hundreds of DOE-funded geothermal projects. NGDS is built on the US Geoscience Information Network data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG). NGDS is fully compliant with the White House Executive Order of May 2013, requiring all federal agencies to make their data holdings publicly accessible online in open source, interoperable formats with common core and extensible metadata. The National Geothermal Data System is being designed, built, deployed, and populated primarily with grants from the US Department of Energy, Geothermal Technologies Office. 
To keep this operational system sustainable after the original implementation will require four core elements: continued serving of data and applications by providers; maintenance of system operations; a governance structure; and an effective business model. Each of these presents a number of challenges currently under consideration.
A Standard-Compliant Virtual Meeting System with Active Video Object Tracking
NASA Astrophysics Data System (ADS)
Lin, Chia-Wen; Chang, Yao-Jen; Wang, Chih-Ming; Chen, Yung-Chang; Sun, Ming-Ting
2002-12-01
This paper presents an H.323 standard compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between the H.323 LAN (local-area network) and the H.324 WAN (wide-area network) users. The proposed virtual video conferencing system provides user-friendly object compositing and manipulation features including 2D video object scaling, repositioning, rotation, and dynamic bit-allocation in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for extracting and tracking foreground video objects in real time from video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.
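The chroma-key step can be sketched as a per-pixel colour-distance test. This is a minimal illustration only; the paper's actual scheme combines keying with background mosaics and active-camera tracking.

```python
def chroma_key_mask(frame, key_color=(0, 255, 0), tol=60):
    """Foreground mask: pixels far (in RGB distance) from the key colour.

    `frame` is a list of rows of (r, g, b) tuples; real systems apply
    the same per-pixel test to camera images.
    """
    kr, kg, kb = key_color
    mask = []
    for row in frame:
        mask.append([
            ((r - kr) ** 2 + (g - kg) ** 2 + (b - kb) ** 2) ** 0.5 > tol
            for (r, g, b) in row
        ])
    return mask

frame = [[(0, 255, 0), (200, 30, 40)],    # key pixel, foreground pixel
         [(10, 250, 5), (220, 210, 190)]]  # near-key pixel, foreground pixel
mask = chroma_key_mask(frame)              # [[False, True], [False, True]]
```

Pixels flagged True belong to the extracted video object and can then be composited into the 3D virtual environment.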
A review of compliant transmission mechanisms for bio-inspired flapping-wing micro air vehicles.
Zhang, C; Rossi, C
2017-02-15
Flapping-wing micro air vehicles (FWMAVs) are a class of unmanned aircraft that imitate flight characteristics of natural organisms such as birds, bats, and insects, in order to achieve maximum flight efficiency and manoeuvrability. Designing proper mechanisms for flapping transmission is an extremely important aspect for FWMAVs. Compliant transmission mechanisms have been considered as an alternative to rigid transmission systems due to their lower part count, which reduces the total weight; their lower energy loss, thanks to little or practically no friction among parts; and, at the same time, their ability to store and release mechanical power during the flapping cycle. In this paper, the state-of-the-art research in this field is reviewed, highlighting open challenges and research topics. An optimization method for designing compliant transmission mechanisms inspired by the thoraxes of insects is also introduced.
Buckling of a stiff thin film on an elastic graded compliant substrate.
Chen, Zhou; Chen, Weiqiu; Song, Jizhou
2017-12-01
The buckling of a stiff film on a compliant substrate has attracted much attention due to its wide applications such as thin-film metrology, surface patterning and stretchable electronics. An analytical model is established for the buckling of a stiff thin film on a semi-infinite elastic graded compliant substrate subjected to in-plane compression. The critical compressive strain and buckling wavelength for the sinusoidal mode are obtained analytically for the case with the substrate modulus decaying exponentially. The rigorous finite element analysis (FEA) is performed to validate the analytical model and investigate the postbuckling behaviour of the system. The critical buckling strain for the period-doubling mode is obtained numerically. The influences of various material parameters on the results are investigated. These results are helpful to provide physical insights on the buckling of elastic graded substrate-supported thin film.
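For orientation, the classical result for a stiff film on a *homogeneous* compliant substrate, which the graded analysis generalizes (the exponential modulus decay modifies these expressions), gives the critical strain and wrinkle wavelength in closed form:

```python
import math

def plane_strain_modulus(E, nu):
    """Plane-strain modulus E-bar = E / (1 - nu^2)."""
    return E / (1.0 - nu ** 2)

def wrinkle_homogeneous(Ef, nuf, Es, nus, h):
    """Classical sinusoidal wrinkling of a stiff film (thickness h) on a
    homogeneous compliant half-space:
        eps_c  = (1/4) * (3*Es_bar/Ef_bar)**(2/3)
        lambda = 2*pi*h * (Ef_bar/(3*Es_bar))**(1/3)
    The graded-substrate case in the paper modifies these results."""
    Efb = plane_strain_modulus(Ef, nuf)
    Esb = plane_strain_modulus(Es, nus)
    eps_c = 0.25 * (3.0 * Esb / Efb) ** (2.0 / 3.0)
    wavelength = 2.0 * math.pi * h * (Efb / (3.0 * Esb)) ** (1.0 / 3.0)
    return eps_c, wavelength

# Illustrative numbers: a 100 nm silicon-like film (130 GPa) on a
# PDMS-like substrate (2 MPa), typical of stretchable electronics.
eps_c, lam = wrinkle_homogeneous(130e9, 0.27, 2e6, 0.48, 100e-9)
```

For this stiffness contrast the film buckles at a strain of a few times 10⁻⁴ with a wavelength of tens of micrometres, far larger than the film thickness, which is the regime the sinusoidal-mode analysis addresses.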
Quantification of regenerative potential in primary human mammary epithelial cells
Linnemann, Jelena R.; Miura, Haruko; Meixner, Lisa K.; Irmler, Martin; Kloos, Uwe J.; Hirschi, Benjamin; Bartsch, Harald S.; Sass, Steffen; Beckers, Johannes; Theis, Fabian J.; Gabka, Christian; Sotlar, Karl; Scheel, Christina H.
2015-01-01
We present an organoid regeneration assay in which freshly isolated human mammary epithelial cells are cultured in adherent or floating collagen gels, corresponding to a rigid or compliant matrix environment. In both conditions, luminal progenitors form spheres, whereas basal cells generate branched ductal structures. In compliant but not rigid collagen gels, branching ducts form alveoli at their tips, express basal and luminal markers at correct positions, and display contractility, which is required for alveologenesis. Thereby, branched structures generated in compliant collagen gels resemble terminal ductal-lobular units (TDLUs), the functional units of the mammary gland. Using the membrane metallo-endopeptidase CD10 as a surface marker enriches for TDLU formation and reveals the presence of stromal cells within the CD49f-high/EpCAM-negative population. In summary, we describe a defined in vitro assay system to quantify cells with regenerative potential and systematically investigate their interaction with the physical environment at distinct steps of morphogenesis. PMID:26071498
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and USGIN site), and user-created tools and scripts. The user-friendly map-centric web-based application has been created to support finding, visualizing, mapping, and acquisition of data based on topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third party software applications such as GoogleEarth, UDig, and ArcGIS. 
A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
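As an illustration of what exposing data "as a node … through a CSW endpoint" means in practice, the sketch below assembles a CSW 2.0.2 GetRecords request in KVP form; the endpoint URL and search keyword are hypothetical, while the parameter names follow the OGC CSW specification.

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, keyword, max_records=10):
    """Build a CSW 2.0.2 GetRecords KVP request with a CQL text filter."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",
        "resultType": "results",
        "maxRecords": max_records,
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": "AnyText like '%{}%'".format(keyword),
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint; a GINstack node would expose a URL of this shape.
url = csw_getrecords_url("https://example.org/csw", "borehole temperature")
```

A harvester or portal issues this GET request and receives summary metadata records, which is how a GINstack node plugs into NGDS or any larger catalogue system.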
Design of 3D-Printed Titanium Compliant Mechanisms
NASA Technical Reports Server (NTRS)
Merriam, Ezekiel G.; Jones, Jonathan E.; Howell, Larry L.
2014-01-01
This paper describes 3D-printed titanium compliant mechanisms for aerospace applications. It is meant as a primer to help engineers design compliant, multi-axis, printed parts that exhibit high performance. Topics covered include brief introductions to both compliant mechanism design and 3D printing in titanium, material and geometry considerations for 3D printing, modeling techniques, and case studies of both successful and unsuccessful part geometries. Key findings include recommended flexure geometries, minimum thicknesses, and general design guidelines for compliant printed parts that may not be obvious to the first time designer.
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Toma, Daniel; Martinez, Enoc; Delory, Eric; Pearlman, Jay; Rieke, Matthes; Stasch, Christoph
2017-04-01
With the rapidly evolving technology for building Web-based (spatial) information infrastructures and Sensor Webs, there are new opportunities to improve the process by which ocean data is collected and managed. A central element in this development is the suite of Sensor Web Enablement (SWE) standards specified by the Open Geospatial Consortium (OGC). This framework of standards comprises, on the one hand, data models and formats for measurement data (ISO/OGC Observations and Measurements, O&M) and metadata describing measurement processes and sensors (OGC Sensor Model Language, SensorML). On the other hand, the SWE standards comprise (Web service) interface specifications for pull-based access to observation data (OGC Sensor Observation Service, SOS) and for controlling or configuring sensors (OGC Sensor Planning Service, SPS). Also within the European INSPIRE framework the SWE standards play an important role, as the SOS is the recommended download service interface for O&M-encoded observation data sets. In the context of the EU-funded Oceans of Tomorrow initiative, the NeXOS (Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management) project is developing a new generation of in-situ sensors that make use of the SWE standards to facilitate the data publication process and the integration into Web-based information infrastructures. This includes the development of a dedicated firmware for instruments and sensor platforms (SEISI, Smart Electronic Interface for Sensors and Instruments) maintained by the Universitat Politècnica de Catalunya (UPC). Among other features, SEISI makes use of OGC SWE standards such as OGC PUCK, to enable a plug-and-play mechanism for sensors based on SensorML-encoded metadata.
Thus, if a new instrument is attached to a SEISI-based platform, the platform automatically configures the connection to the instrument, automatically generates data files compliant with the ISO/OGC Observations and Measurements standard, and initiates the data transmission into the NeXOS Sensor Web infrastructure. Besides these platform-related developments, NeXOS has realised the full path of data transmission from the sensor to the end-user application. The conceptual architecture design is implemented by a series of open source SWE software packages provided by 52°North. This comprises, in particular, different SWE server components (i.e. the OGC Sensor Observation Service), tools for data visualisation (e.g. the 52°North Helgoland SOS viewer), and an editor for providing SensorML-based metadata (52°North smle). As a result, NeXOS has demonstrated how the SWE standards help to improve marine observation data collection. Within this presentation, we will present the experiences and findings of the NeXOS project and will provide recommendations for future work directions.
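As a concrete example of the pull-based access path, a client fetches O&M-encoded observations from an SOS server with a GetObservation request. The sketch below builds one in KVP form; the endpoint, offering, and observed-property names are placeholders, while the parameter names follow the OGC SOS 2.0 specification.

```python
from urllib.parse import urlencode

def sos_getobservation_url(endpoint, offering, observed_property, begin, end):
    """Build an SOS 2.0 GetObservation KVP request for a time window."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "temporalFilter": "om:phenomenonTime,{}/{}".format(begin, end),
        "responseFormat": "http://www.opengis.net/om/2.0",
    }
    return endpoint + "?" + urlencode(params)

# Placeholder endpoint and identifiers, for illustration only.
url = sos_getobservation_url(
    "https://example.org/sos/kvp",
    "ctd-platform-1", "sea_water_temperature",
    "2017-01-01T00:00:00Z", "2017-01-02T00:00:00Z")
```

A viewer such as the Helgoland client issues requests of this shape under the hood and renders the returned O&M observations as time series.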
Rapid prototyping compliant arterial phantoms for in-vitro studies and device testing
2013-01-01
Background Compliant vascular phantoms are desirable for in-vitro patient-specific experiments and device testing. TangoPlus FullCure 930® is a commercially available rubber-like material that can be used for PolyJet rapid prototyping. This work aims to gather preliminary data on the distensibility of this material, in order to assess the feasibility of its use in the context of experimental cardiovascular modelling. Methods The descending aorta anatomy of a volunteer was modelled in 3D from cardiovascular magnetic resonance (CMR) images and rapid prototyped using TangoPlus. The model was printed with a range of increasing wall thicknesses (0.6, 0.7, 0.8, 1.0 and 1.5 mm), keeping the lumen of the vessel constant. Models were also printed in both vertical and horizontal orientations, thus resulting in a total of ten specimens. Compliance tests were performed by monitoring pressure variations while gradually increasing and decreasing internal volume. Knowledge of distensibility was thus derived and then implemented with CMR data to test two applications. Firstly, a patient-specific compliant model of hypoplastic aorta suitable for connection in a mock circulatory loop for in-vitro tests was manufactured. Secondly, the right ventricular outflow tract (RVOT) of a patient necessitating pulmonary valve replacement was printed in order to physically test device insertion and assess patient’s suitability for percutaneous pulmonary valve intervention. Results The distensibility of the material was identified in a range from 6.5 × 10⁻³ mmHg⁻¹ for the 0.6 mm case, to 3.0 × 10⁻³ mmHg⁻¹ for the 1.5 mm case. The models printed in the vertical orientation were always more compliant than their horizontal counterpart. Rapid prototyping of a compliant hypoplastic aorta and of a RVOT anatomical model were both feasible. Device insertion in the RVOT model was successful. 
Conclusion Values of distensibility, compared with literature data, show that TangoPlus is suitable for manufacturing arterial phantoms, with the added benefit of being compatible with PolyJet printing, thus guaranteeing representative anatomical finishing, and quick and inexpensive fabrication. The appealing possibility of printing models of non-uniform wall thickness, resembling more closely certain physiological scenarios, can also be explored. However, this material appears to be too stiff for modelling the more compliant systemic venous system. PMID:23324211
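The distensibility figures quoted above follow from pressure-volume monitoring. As a minimal sketch, one common definition is D = (ΔV/V₀)/ΔP applied to a single increment; the study derives distensibility from full pressure-volume curves rather than one point, and the numbers below are invented to illustrate the arithmetic.

```python
def distensibility(v0_ml, dv_ml, dp_mmhg):
    """Distensibility D = (dV / V0) / dP, in 1/mmHg.

    Single-increment form of one common definition; the study uses
    full pressure-volume curves acquired while inflating the phantom.
    """
    return (dv_ml / v0_ml) / dp_mmhg

# Illustrative numbers only: a 10 mL segment gaining 0.65 mL of volume
# over a 10 mmHg pressure rise yields D = 6.5e-3 per mmHg, on the order
# of the 0.6 mm wall-thickness result reported above.
D = distensibility(10.0, 0.65, 10.0)
```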
Using Changes in Binding Globulins to Assess Oral Contraceptive Compliance
Westhoff, Carolyn; Petrie, K.A.; Cremers, S.
2012-01-01
Background Validity of oral contraceptive pill (OCP) clinical trial results depends on participant compliance. Ethinyl estradiol (EE2) induces increases in hepatic binding globulin (BG) levels. Measuring these BG increases may provide an effective and convenient approach to distinguishing non-compliant from compliant OCP users in research settings. This analysis evaluated the usefulness of measuring increases in corticosteroid, sex hormone and thyroxine binding globulins (CBG, SHBG, TBG) as measures of OCP compliance. Methods We used frozen serum from a trial that compared ovarian suppression between normal weight and obese women randomized to one of two OCPs containing EE2 and levonorgestrel (LNG). Based on serial LNG measurements during the trial, 17% of participants were non-compliant. We matched non-compliant participants with compliant participants by age, BMI, ethnicity and OCP formulation. We measured CBG, SHBG and TBG levels, and compared change from baseline to 3-month follow-up between the non-compliant and compliant participants. Construction of receiver operating characteristic (ROC) curves allowed comparison of various BG measures. Results Changes in CBG and TBG distinguished OCP non-compliant users from compliant users (area under the ROC curve (AUROC), 0.86 and 0.89, p < 0.01). Changes in SHBG were less discriminating (AUROC 0.69). Conclusions EE2-induced increases in CBG and TBG provide a sensitive integrated marker of compliance with an LNG-containing OCP. PMID:22795088
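The AUROC values reported above have a simple rank interpretation: the probability that a randomly chosen compliant user shows a larger BG increase than a randomly chosen non-compliant user (ties counting one half). A self-contained sketch, with fold-change numbers invented purely for illustration:

```python
def auroc(positives, negatives):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs where the positive
    scores higher, counting ties as 0.5."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Invented CBG fold-changes from baseline, for illustration:
compliant = [2.1, 1.8, 2.5, 1.9]      # EE2 exposure raises CBG
noncompliant = [1.0, 1.2, 0.9]        # little or no change
print(auroc(compliant, noncompliant))  # 1.0: perfect separation here
```

Real data overlap, of course, which is why the reported AUROCs of 0.86-0.89 for CBG and TBG fall short of 1 but still discriminate well.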
2012-06-01
Digital Motion Imagery Compression Best Practices Guide – A Motion Imagery Standards Profile (MISP) Compliant Architecture. White Sands Missile Range, Reagan Test Site, Yuma Proving Ground, Dugway Proving Ground, Aberdeen Test Center. These practices, for … delivery, and archival purposes, are based on a Motion Imagery Standards Profile (MISP) compliant architecture, which has been defined …
ERIC Educational Resources Information Center
Pineda, Ernest M.
1999-01-01
Discusses ways to help resolve the Y2K problem and avoid disruptions in school security and safety. Discusses computer software testing and validation to determine its functionality after year's end, and explores system remediation of non-compliant fire and security systems. (GR)
A behavioral rehabilitation intervention for amnestic Mild Cognitive Impairment
Greenaway, Melanie C.; Hanna, Sherrie M.; Lepore, Susan W.; Smith, Glenn E.
2010-01-01
Individuals with amnestic Mild Cognitive Impairment (MCI) currently have few treatment options for combating their memory loss. The Memory Support System (MSS) is a calendar and organization system with accompanying 6-week curriculum designed for individuals with progressive memory impairment. Ability to learn the MSS and its utility were assessed in 20 participants. Participants were significantly more likely to successfully use the calendar system after training. Ninety-five percent were compliant with the MSS at training completion, and 89% continued to be compliant at follow-up. Outcome measures revealed a medium effect size for improvement in functional ability. Subjects further reported improved independence, self-confidence, and mood. This initial examination of the MSS suggests that with appropriate training, individuals with amnestic MCI can and will use a memory notebook system to help compensate for memory loss. These results are encouraging that the MSS may help with the symptoms of memory decline in MCI. PMID:18955724
Development of a low mobility IEEE 802.15.4 compliant VANET system for urban environments.
Nazabal, Juan Antonio; Falcone, Francisco; Fernández-Valdivielso, Carlos; Matías, Ignacio Raúl
2013-05-29
The use of Vehicular Ad-Hoc Networks (VANETs) is growing nowadays, encompassing both roadside-to-vehicle communication (RVC) and inter-vehicle communication (IVC). The purpose of VANETs is to exchange useful information between vehicles and roadside infrastructure in order to make intelligent use of it. There are several possible applications for this technology, such as emergency warning systems for vehicles, cooperative adaptive cruise control, and collision avoidance, among others. The objective of this work is to develop a VANET prototype system for urban environments using IEEE 802.15.4 compliant devices. Simulation-based estimates of signal strength and radio link quality are obtained and compared with measurements in outdoor conditions to validate the implemented VANET system. The results confirm the possibility of implementing low-cost vehicular communication networks operating at moderate vehicular speeds.
Rotman, Oren Moshe; Weiss, Dar; Zaretsky, Uri; Shitzer, Avraham; Einav, Shmuel
2015-09-18
High accuracy differential pressure measurements are required in various biomedical and medical applications, such as in fluid-dynamic test systems, or in the cath-lab. Differential pressure measurements using fluid-filled catheters are relatively inexpensive, yet may be subjected to common mode pressure errors (CMP), which can significantly reduce the measurement accuracy. Recently, a novel correction method for high accuracy differential pressure measurements was presented, and was shown to effectively remove CMP distortions from measurements acquired in rigid tubes. The purpose of the present study was to test the feasibility of this correction method inside compliant tubes, which effectively simulate arteries. Two tubes with varying compliance were tested under dynamic flow and pressure conditions to cover the physiological range of radial distensibility in coronary arteries. A third, compliant model, with a 70% stenosis severity was additionally tested. Differential pressure measurements were acquired over a 3 cm tube length using a fluid-filled double-lumen catheter, and were corrected using the proposed CMP correction method. Validation of the corrected differential pressure signals was performed by comparison to differential pressure recordings taken via a direct connection to the compliant tubes, and by comparison to predicted differential pressure readings of matching fluid-structure interaction (FSI) computational simulations. The results show excellent agreement between the experimentally acquired and computationally determined differential pressure signals. This validates the application of the CMP correction method in compliant tubes of the physiological range for up to intermediate size stenosis severity of 70%. Copyright © 2015 Elsevier Ltd. All rights reserved.
Flectofold—a biomimetic compliant shading device for complex free form facades
NASA Astrophysics Data System (ADS)
Körner, A.; Born, L.; Mader, A.; Sachse, R.; Saffarian, S.; Westermeier, A. S.; Poppinga, S.; Bischoff, M.; Gresser, G. T.; Milwich, M.; Speck, T.; Knippers, J.
2018-01-01
Smart and adaptive outer façade shading systems are of high interest in modern architecture. For long-lasting and reliable systems, avoiding hinges, which often fail due to mechanical wear during repetitive use, is of particular importance. Drawing inspiration from the hinge-less motion of the underwater snap-trap of the carnivorous waterwheel plant (Aldrovanda vesiculosa), the compliant façade shading device Flectofold was developed. Based on computational simulations of the biological role model's elastic and reversible motion, the actuation principle of the plant was identified. The underlying geometric motion principle is abstracted into a simplified curved-line folding geometry with distinct flexible hinge zones. The kinematic behaviour is translated into a quantitative kinetic model using finite element simulation, which allows detailed analysis of the influence of geometric parameters, such as curved-fold-line radius, and of various pneumatically driven actuation principles on the motion behaviour, stress concentrations within the hinge zones, and actuation forces. The information regarding geometric relations and material gradients gained from these computational models was then used to develop novel material combinations for glass-fibre-reinforced plastics, which enabled the fabrication of physical prototypes of the compliant façade shading device Flectofold.
Uchida, Thomas K.; Sherman, Michael A.; Delp, Scott L.
2015-01-01
Impacts are instantaneous, computationally efficient approximations of collisions. Current impact models sacrifice important physical principles to achieve that efficiency, yielding qualitative and quantitative errors when applied to simultaneous impacts in spatial multibody systems. We present a new impact model that produces behaviour similar to that of a detailed compliant contact model, while retaining the efficiency of an instantaneous method. In our model, time and configuration are fixed, but the impact is resolved into distinct compression and expansion phases, themselves comprising sliding and rolling intervals. A constrained optimization problem is solved for each interval to compute incremental impulses while respecting physical laws and principles of contact mechanics. We present the mathematical model, algorithms for its practical implementation, and examples that demonstrate its effectiveness. In collisions involving materials of various stiffnesses, our model can be more than 20 times faster than integrating through the collision using a compliant contact model. This work extends the use of instantaneous impact models to scientific and engineering applications with strict accuracy requirements, where compliant contact models would otherwise be required. An open-source implementation is available in Simbody, a C++ multibody dynamics library widely used in biomechanical and robotic applications. PMID:27547093
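As a point of contrast with the incremental-impulse model described above, the classical single-contact special case is simple to state; this Python sketch applies Newton's restitution hypothesis to one frictionless contact (a textbook simplification, not the paper's constrained-optimization formulation):

```python
def restitution_impulse(v_rel_normal, e, m1, m2):
    """Normal impulse magnitude for one frictionless contact between two
    bodies, using Newton's restitution hypothesis: the post-impact relative
    normal velocity is -e times the pre-impact one.
    v_rel_normal < 0 means the bodies are approaching."""
    effective_inv_mass = 1.0 / m1 + 1.0 / m2
    return -(1.0 + e) * v_rel_normal / effective_inv_mass

# Two unit masses approaching at 2 m/s with restitution e = 0.5:
j = restitution_impulse(-2.0, 0.5, 1.0, 1.0)  # 1.5 N*s
```

The difficulty the paper addresses arises when several such contacts act simultaneously: impulses computed independently per contact can violate physical laws, which is why the authors solve a constrained optimization over all contacts instead.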
NCI's Distributed Geospatial Data Server
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
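Serving products through an OGC-compliant interface means clients can request them with standard query parameters; the Python sketch below builds a WMS 1.3.0 GetMap request URL (the endpoint and layer name here are hypothetical, not NCI's actual service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.
    bbox is (min, min, max, max) in the axis order of the given CRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical layer covering Australia:
url = wms_getmap_url("https://example.org/wms", "surface_temperature",
                     (-44, 112, -10, 154))
```

Any WMS-capable client (a desktop GIS, a web map library) can consume such a URL without knowing anything about the server's internal processing.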
Report on the Global Data Assembly Center (GDAC) to the 12th GHRSST Science Team Meeting
NASA Technical Reports Server (NTRS)
Armstrong, Edward M.; Bingham, Andrew; Vazquez, Jorge; Thompson, Charles; Huang, Thomas; Finch, Chris
2011-01-01
In 2010/2011 the Global Data Assembly Center (GDAC) at NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) continued its role as the primary clearinghouse and access node for operational Group for High Resolution Sea Surface Temperature (GHRSST) datastreams, as well as its collaborative role with the NOAA Long Term Stewardship and Reanalysis Facility (LTSRF) for archiving. Here we report on our data management activities and infrastructure improvements since the last science team meeting in June 2010. These include the implementation of all GHRSST datastreams in the new PO.DAAC Data Management and Archive System (DMAS) for more reliable and timely data access. GHRSST dataset metadata are now stored in a new database that has made the maintenance and quality improvement of metadata fields more straightforward. A content management system for a revised suite of PO.DAAC web pages allows dynamic access to a subset of these metadata fields for enhanced dataset description as well as discovery through a faceted search mechanism from the perspective of the user. From the discovery and metadata standpoint the GDAC has also implemented the NASA version of the OpenSearch protocol for searching for GHRSST granules and developed a web service to generate ISO 19115-2 compliant metadata records. Furthermore, the GDAC has continued to implement a new suite of tools and services for GHRSST datastreams including a Level 2 subsetter known as Dataminer, a revised POET Level 3/4 subsetter and visualization tool, a Google Earth interface to selected daily global Level 2 and Level 4 data, and experimented with a THREDDS catalog of GHRSST data collections. Finally we will summarize the expanding user and data statistics, and other metrics that we have collected over the last year demonstrating the broad user community and applications that the GHRSST project continues to serve via the GDAC distribution mechanisms. This report also serves by extension to summarize the activities of the GHRSST Data Assembly and Systems Technical Advisory Group (DAS-TAG).
The European Drought Observatory (EDO): Current State and Future Directions
NASA Astrophysics Data System (ADS)
Vogt, J.; Singleton, A.; Sepulcre, G.; Micale, F.; Barbosa, P.
2012-12-01
Europe has repeatedly been affected by droughts, resulting in considerable ecological and economic damage. Climate change studies indicate a trend towards increasing climate variability, most likely resulting in more frequent drought occurrences in Europe as well. Against this background, the European Commission's Joint Research Centre (JRC) is developing methods and tools for assessing, monitoring and forecasting droughts in Europe, and is developing a European Drought Observatory (EDO) to complement and integrate national activities with a European view. At the core of the European Drought Observatory (EDO) is a portal, including a map server, a metadata catalogue, a media monitor and analysis tools. The map server presents Europe-wide up-to-date information on the occurrence and severity of droughts, which is complemented by more detailed information provided by regional, national and local observatories through OGC compliant web mapping and web coverage services. In addition, time series of historical maps as well as graphs of the temporal evolution of drought indices for individual grid cells and administrative regions in Europe can be retrieved and analysed. Current work is focusing on validating the available products, improving the functionalities, extending the linkage to additional national and regional drought information systems and improving medium to long-range probabilistic drought forecasting products. Probabilistic forecasts are attractive in that they provide an estimate of the range of uncertainty in a particular forecast. Longer-term goals include the development of long-range drought forecasting products, the analysis of drought hazard and risk, the monitoring of drought impact and the integration of EDO in a global drought information system. The talk will provide an overview on the development and state of EDO, the different products, and the ways to include a wide range of stakeholders (i.e., European, national, river basin, and local authorities) in the development of the system, as well as an outlook on future developments.
Generalizing the Arden Syntax to a Common Clinical Application Language.
Kraus, Stefan
2018-01-01
The Arden Syntax for Medical Logic Systems is a standard for encoding and sharing knowledge in the form of Medical Logic Modules (MLMs). Although the Arden Syntax has been designed to meet the requirements of data-driven clinical event monitoring, multiple studies suggest that its language constructs may be suitable for use outside the intended application area and even as a common clinical application language. Such a broader context, however, requires reconsidering some language features. The purpose of this paper is to outline the related modifications on the basis of a generalized Arden Syntax version. The implemented prototype provides multiple adjustments to the standard, such as an option to use programming language constructs without the frame-like MLM structure, a JSON-compliant data type system, a means to use MLMs as user-defined functions, and native support for RESTful web services with integrated data mapping. This study does not aim to promote an actually new language, but rather a more generic version of the proven Arden Syntax standard. Such an easy-to-understand domain-specific language for common clinical applications might cover multiple additional medical subdomains and serve as a lingua franca for arbitrary clinical algorithms, thereby avoiding a patchwork of multiple all-purpose languages between, and even within, institutions.
Chiang, Shu-Yin; Kan, Yao-Chiang; Chen, Yun-Shan; Tu, Ying-Ching; Lin, Hsueh-Chun
2016-12-03
Ubiquitous health care (UHC) helps patients complete therapeutic exercises through self-management at home. We designed a fuzzy computing model that recognizes assigned movements in UHC while preserving privacy. The movements are measured by a self-developed body motion sensor, which combines accelerometer and gyroscope chips to make an inertial sensing node compliant with a wireless sensor network (WSN). A fuzzy logic process was developed to derive, from the sensor signals, the necessary features of static postures and dynamic motions. Combinations of the features were studied, and proper feature sets were chosen with compatible fuzzy rules. A fuzzy inference system (FIS) can then be generated to recognize the assigned movements based on the rules. We implemented both fuzzy and adaptive neuro-fuzzy inference systems in the model to distinguish static and dynamic movements. The proposed model effectively covers the recognition scope of the assigned activities. Furthermore, the model was applied to two upper-limb flexion exercises in physical therapy, for which the recognition rate can stand in for the passing rate of the assigned motions. Finally, a web-based interface was developed to help remotely measure movement in physical therapy for UHC.
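The static-versus-dynamic distinction drawn by the fuzzy inference step can be illustrated with triangular membership functions; in this minimal Python sketch the thresholds and the single rule are invented for illustration and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(omega):
    """Label a motion-magnitude reading (e.g. gyro rate, rad/s) as the
    linguistic class with the highest membership degree."""
    mu_static = tri(omega, -1.0, 0.0, 0.5)    # near-zero rotation rate
    mu_dynamic = tri(omega, 0.3, 1.5, 10.0)   # sustained rotation rate
    return "static" if mu_static >= mu_dynamic else "dynamic"
```

A full FIS would combine several such membership degrees over accelerometer and gyroscope features through a rule base; adaptive neuro-fuzzy systems additionally tune the membership parameters from training data.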
NASA Astrophysics Data System (ADS)
Zhu, Wu-Le; Zhu, Zhiwei; To, Suet; Liu, Qiang; Ju, Bing-Feng; Zhou, Xiaoqin
2016-12-01
This paper presents a novel redundantly piezo-actuated three-degree-of-freedom XYθz compliant mechanism for nano-positioning, driven by four mirror-symmetrically configured piezoelectric actuators (PEAs). By means of the differential motion principle, linearized kinematics and physically bi-directional motions in all three directions are achieved. Meanwhile, decoupled delivery of independent motions in all three directions at the output end is accessible, and the essentially parallel and mirror-symmetric configuration guarantees large output stiffness, high natural frequencies, high accuracy, and high structural compactness of the mechanism. Accurate kinematics analysis with consideration of input coupling indicates that the proposed redundantly actuated compliant mechanism can generate a three-dimensional (3D) symmetric polyhedral workspace envelope with enlarged reachable workspace, as compared with the most common parallel XYθz mechanism driven by three PEAs. In close agreement with both analytical and numerical models, the experimental results show working ranges of ±6.21 μm and ±12.41 μm in the X- and Y-directions, and of ±873.2 μrad in the θz-direction, with nano-positioning capability. These superior performances and the easily manufactured structure facilitate practical applications of the proposed XYθz compliant mechanism in nano-positioning systems.
Can compliant fault zones be used to measure absolute stresses in the upper crust?
NASA Astrophysics Data System (ADS)
Hearn, E. H.; Fialko, Y.
2009-04-01
Geodetic and seismic observations reveal long-lived zones with reduced elastic moduli along active crustal faults. These fault zones localize strain from nearby earthquakes, consistent with the response of a compliant, elastic layer. Fault zone trapped wave studies documented a small reduction in P and S wave velocities along the Johnson Valley Fault caused by the 1999 Hector Mine earthquake. This reduction presumably perturbed a permanent compliant structure associated with the fault. The inferred changes in the fault zone compliance may produce a measurable deformation in response to background (tectonic) stresses. This deformation should have the same sense as the background stress, rather than the coseismic stress change. Here we investigate how the observed deformation of compliant zones in the Mojave Desert can be used to constrain the fault zone structure and stresses in the upper crust. We find that gravitational contraction of the coseismically softened zones should cause centimeters of coseismic subsidence of both the compliant zones and the surrounding region, unless the compliant fault zones are shallow and narrow, or essentially incompressible. We prefer the latter interpretation because profiles of line of sight displacements across compliant zones cannot be fit by a narrow, shallow compliant zone. Strain of the Camp Rock and Pinto Mountain fault zones during the Hector Mine and Landers earthquakes suggests that background deviatoric stresses are broadly consistent with Mohr-Coulomb theory in the Mojave upper crust (with μ ≥ 0.7). Large uncertainties in Mojave compliant zone properties and geometry preclude more precise estimates of crustal stresses in this region. With improved imaging of the geometry and elastic properties of compliant zones, and with precise measurements of their strain in response to future earthquakes, the modeling approach we describe here may eventually provide robust estimates of absolute crustal stress.
Krisciunas, Gintas P; Castellano, Kerlly; McCulloch, Timothy M; Lazarus, Cathy L; Pauloski, Barbara R; Meyer, Tanya K; Graner, Darlene; Van Daele, Douglas J; Silbergleit, Alice K; Crujido, Lisa R; Rybin, Denis; Doros, Gheorghe; Kotz, Tamar; Langmore, Susan E
2017-04-01
A 5-year, 16-site, randomized controlled trial enrolled 170 HNC survivors into active (estim + swallow exercise) or control (sham estim + swallowing exercise) arms. Primary analyses showed that estim did not enhance swallowing exercises. This secondary analysis determined if/how patient compliance impacted outcomes. A home program, performed 2 times/day, 6 days/week, for 12 weeks included stretches and 60 swallows paired with real or sham estim. Regular clinic visits ensured proper exercise execution, and detailed therapy checklists tracked patient compliance which was defined by mean number of sessions performed per week (0-12 times) over the 12-week intervention period. "Compliant" was defined as performing 10-12 sessions/week. Outcomes were changes in PAS, HNCI, PSS, OPSE, and hyoid excursion. ANCOVA analyses determined if outcomes differed between real/sham and compliant/noncompliant groups after 12 weeks of therapy. Of the 170 patients enrolled, 153 patients had compliance data. The mean number of sessions performed was 8.57/week (median = 10.25). Fifty-four percent of patients (n = 83) were considered "compliant." After 12 weeks of therapy, compliant patients in the sham estim group realized significantly better PAS scores than compliant patients in the active estim group (p = 0.0074). When pooling all patients together, there were no significant differences in outcomes between compliant and non-compliant patients. The addition of estim to swallowing exercises resulted in worse swallowing outcomes than exercises alone, which was more pronounced in compliant patients. Since neither compliant nor non-compliant patients benefitted from swallowing exercises, the proper dose and/or efficacy of swallowing exercises must also be questioned in this patient population.
Code of Federal Regulations, 2012 CFR
2012-07-01
... FOR COMPLIANT CONDUCT AND RESPONSIBLE USE OF THE INTERSTATE IDENTIFICATION INDEX (III) SYSTEM FOR NONCRIMINAL JUSTICE PURPOSES § 907.2 Applicability. This rule applies to III System access for noncriminal... by means of the System for such purposes. The rule establishes procedures for ensuring that the FBI's...
Passive Thermal Management of Foil Bearings
NASA Technical Reports Server (NTRS)
Bruckner, Robert J. (Inventor)
2015-01-01
Systems and methods for passive thermal management of foil bearing systems are disclosed herein. The flow of the hydrodynamic film across the surface of bearing compliant foils may be disrupted to provide passive cooling and to improve the performance and reliability of the foil bearing system.
Year 2000 Computing Crisis: FAA Must Act Quickly to Prevent System Failures
DOT National Transportation Integrated Search
1998-02-01
Testimony before the House of Representatives on FAA's reliance on information processing, where the agency stood in its remediation efforts, why its systems remained at risk, and the recommendations needed to increase the likelihood that FAA systems would be Year 2000 compliant by January.
Shen, Lishuang; Diroma, Maria Angela; Gonzalez, Michael; Navarro-Gomez, Daniel; Leipzig, Jeremy; Lott, Marie T; van Oven, Mannis; Wallace, Douglas C; Muraresku, Colleen Clarke; Zolkipli-Cunningham, Zarazuela; Chinnery, Patrick F; Attimonelli, Marcella; Zuchner, Stephan; Falk, Marni J; Gai, Xiaowu
2016-06-01
MSeqDR is the Mitochondrial Disease Sequence Data Resource, a centralized and comprehensive genome and phenome bioinformatics resource built by the mitochondrial disease community to facilitate clinical diagnosis and research investigations of individual patient phenotypes, genomes, genes, and variants. A central Web portal (https://mseqdr.org) integrates community knowledge from expert-curated databases with genomic and phenotype data shared by clinicians and researchers. MSeqDR also functions as a centralized application server for Web-based tools to analyze data across both mitochondrial and nuclear DNA, including investigator-driven whole exome or genome dataset analyses through MSeqDR-Genesis. MSeqDR-GBrowse genome browser supports interactive genomic data exploration and visualization with custom tracks relevant to mtDNA variation and mitochondrial disease. MSeqDR-LSDB is a locus-specific database that currently manages 178 mitochondrial diseases, 1,363 genes associated with mitochondrial biology or disease, and 3,711 pathogenic variants in those genes. MSeqDR Disease Portal allows hierarchical tree-style disease exploration to evaluate their unique descriptions, phenotypes, and causative variants. Automated genomic data submission tools are provided that capture ClinVar compliant variant annotations. PhenoTips will be used for phenotypic data submission on deidentified patients using human phenotype ontology terminology. The development of a dynamic informed patient consent process to guide data access is underway to realize the full potential of these resources. © 2016 WILEY PERIODICALS, INC.
Drag reduction through self-texturing compliant bionic materials
Liu, Eryong; Li, Longyang; Wang, Gang; Zeng, Zhixiang; Zhao, Wenjie; Xue, Qunji
2017-01-01
Compliant fish skin is effective in reducing drag, so the design and application of compliant bionic materials may be a good choice for drag reduction. Here we consider the drag reduction of compliant bionic materials. First, ZnO and PDMS mesh modified with n-octadecane were prepared, and the drag reduction of self-texturing compliant n-octadecane was studied. The results show that the mesh modified by ZnO and PDMS possesses excellent lipophilicity and hydrophobicity; thus n-octadecane in the solid, semisolid and liquid states all adheres well to the modified mesh. The state of n-octadecane changes with temperature, so the surface contact angle and adhesive force both vary markedly between states. The contact angle decreases with temperature, and the adhesive force shows a lower value in the semisolid state. Furthermore, the drag testing results show that the compliant n-octadecane film is more effective in drag reduction than the superhydrophobic ZnO/PDMS film, indicating that the drag reduction mechanism of n-octadecane differs significantly from that of the superhydrophobic film. Further research shows that water flow leads to self-texturing of semisolid-state n-octadecane, which is similar to compliant fish skin. Therefore, the compliant bionic material of semisolid-state n-octadecane with regular bulges plays a major role in the drag reduction. PMID:28053309
Drag reduction through self-texturing compliant bionic materials
NASA Astrophysics Data System (ADS)
Liu, Eryong; Li, Longyang; Wang, Gang; Zeng, Zhixiang; Zhao, Wenjie; Xue, Qunji
2017-01-01
Wong, Kaitlyn E; Gorton, George E; Tashjian, David B; Tirabassi, Michael V; Moriarty, Kevin P
2014-06-01
The purpose of this study is to measure the effectiveness of compressive orthotic brace therapy for the treatment of pectus carinatum using an adjusted Haller Index (HI) measurement calculated from 3D body scan (BS) images. Pediatric patients with pectus carinatum were treated with either compressive orthotic bracing or observation. An adjusted BS Haller Index (HI) was calculated from serial 3D BS images obtained on all patients. Medical records were evaluated to determine treatment with bracing and brace compliance of more than 12 hours daily. Measurements from compliant patients were compared to those from the non-compliant and non-brace groups. Forty patients underwent compressive orthotic bracing, while ten were observed. Twenty-three patients were compliant with bracing, and seventeen patients were non-compliant. Compliant patients exhibited an 8.2% increase, non-compliant patients a 1.5% increase, and non-brace patients a 2.5% increase in BS HI. The change in BS HI of compliant patients was significantly different from that of non-brace patients (p=0.004) and non-compliant patients (p<0.001). Three-dimensional BS is an effective, radiation-free, and objective means to evaluate patients treated with compressive orthotic bracing. Copyright © 2014 Elsevier Inc. All rights reserved.
Ghosh, Rajesh; Lewis, David
2015-01-01
The advent of new technologies in mobile devices and software applications is driving an evolving change in the extent, geographies and modes of internet use. Today, it is used not only for information gathering but also for sharing experiences, opinions and suggestions. Web-Recognizing Adverse Drug Reactions (WEB-RADR) is a groundbreaking 3-year initiative, funded by the European Union (EU) Innovative Medicines Initiative, to recommend policies, frameworks, tools and methodologies that leverage these new developments to gain new insights in drug safety. Data were gathered from prior surveys and previous initiatives, and a review of relevant literature was performed. New technologies provide an opportunity to change the way safety information is collected, helping generate new knowledge about the safety profile of drugs as well as unique insights into the evolving pharmacovigilance system in general. It is critical that these capabilities are harnessed in a way that is ethical, compliant with regulations, respectful of data privacy and used responsibly. At the same time, the process for managing and interpreting this new information must be efficient and effective to ensure sustainability, thoughtful use of resources and a valuable return of knowledge. These approaches should complement the ongoing progress toward personalized medicine. The WEB-RADR initiative should provide some direction on what and how to use social media to further proactive pharmacovigilance and the protection of public health. It is also expected to show how a multipronged expert consortium comprising regulators, industry and academia can leverage new developments in technology and society to bring innovation in process, operations, organization and scientific approaches across its boundaries and beyond the normal realms of individual research units. These new approaches should bring insights that are faster, earlier, more specific and actionable, moving toward the target of adverse event (AE) prevention.
The possibilities of a blended, targeted pharmacovigilance (PV) approach, in which boundaries between stakeholders blur and cultures mix, point to a very different future of better, healthier and longer lives.
DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim
2010-05-01
The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective of creating a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Building on this upstream information flow, DEWS focuses on improving the downstream capacities of warning centres, especially by improving information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7], spatial data are utilized to depict the situation picture, and a simulation system is integrated via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard, together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other geological paradigms are expected to follow, e.g. volcanic eruptions or landslides. Multi-hazard functionality is therefore also conceivable in the future.
The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI, in conjunction with details of information logistics. The DEWS Wide Area Centre, which connects national centres to enable international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
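As an illustration of the OGC services listed in the abstract above, the following Python sketch assembles a WMS 1.1.1 GetMap request of the kind a situation-picture client such as the CCUI might issue. The endpoint URL, layer name and bounding box are hypothetical, not taken from DEWS.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the coordinates of `srs`.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,  # WMS 1.1.1 uses SRS; 1.3.0 renamed it to CRS
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer for a tsunami situation picture.
url = wms_getmap_url("https://example.org/wms", "tsunami:affected_areas",
                     (90.0, -12.0, 120.0, 8.0))
```

A WFS GetFeature or WPS Execute request follows the same key-value pattern, differing only in the SERVICE and REQUEST parameters.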
Compliance control with embedded neural elements
NASA Technical Reports Server (NTRS)
Venkataraman, S. T.; Gulati, S.
1992-01-01
The authors discuss a control approach that embeds the neural elements within a model-based compliant control architecture for robotic tasks that involve contact with unstructured environments. Compliance control experiments have been performed on actual robotics hardware to demonstrate the performance of contact control schemes with neural elements. System parameters were identified under the assumption that environment dynamics have a fixed nonlinear structure. A robotics research arm, placed in contact with a single degree-of-freedom electromechanical environment dynamics emulator, was commanded to move through a desired trajectory. The command was implemented by using a compliant control strategy.
A Compliant Casing for Transonic Axial Compressors
NASA Technical Reports Server (NTRS)
Bloch, Gregory S.; Hah, Chunill
2003-01-01
A viewgraph presentation on the concept of compliant casing for transonic axial compressors is shown. The topics include: 1) Concept for compliant casing; 2) Rig and facility details; 3) Experimental results; and 4) Numerical results.
Lachance, Chantelle C; Korall, Alexandra M B; Russell, Colin M; Feldman, Fabio; Robinovitch, Stephen N; Mackey, Dawn C
2018-09-01
Purpose-designed compliant flooring and carpeting have been promoted as a means for reducing fall-related injuries in high-risk environments, such as long-term care. However, it is not known whether these surfaces influence the forces that long-term care staff exert when pushing residents in wheelchairs. We studied 14 direct-care staff who pushed a loaded wheelchair instrumented with a triaxial load cell to test the effects on hand force of flooring overlay (vinyl versus carpet) and flooring subfloor (concrete versus compliant rubber [brand: SmartCells]). During straight-line pushing, carpet overlay increased initial and sustained hand forces compared to vinyl overlay by 22-49% over a concrete subfloor and by 8-20% over a compliant subfloor. Compliant subflooring increased initial and sustained hand forces compared to concrete subflooring by 18-31% when under a vinyl overlay. In contrast, compliant flooring caused no change in initial or sustained hand forces compared to concrete subflooring when under a carpet overlay. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Moore, C.
2011-12-01
The Index to Marine and Lacustrine Geological Samples is a community-designed and -maintained resource enabling researchers to locate and request sea floor and lakebed geologic samples archived by partner institutions. Conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center (NGDC) at a 1977 meeting convened by the National Science Foundation (NSF), the Index is based on core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. Form and content of underlying vocabularies and metadata continue to evolve according to the needs of the community, as do supporting technologies and access methodologies. The Curators Consortium, now international in scope, meets at partner institutions biennially to share ideas and discuss best practices. NGDC serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the herculean task of creating and contributing metadata for over 195,000 sea floor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections while others use it to increase exposure of more in-depth institutional systems. The Index is currently a geospatially-enabled relational database, publicly accessible via Web Feature and Web Map Services, and text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images: 1) at participating institutions, 2) in the NGDC archive, and 3) at sites such as the Rolling Deck to Repository (R2R) and the System for Earth Sample Registration (SESAR).
Over 34,000 International GeoSample Numbers (IGSNs) linking to SESAR are included in anticipation of opportunities for interconnectivity with Integrated Earth Data Applications (IEDA) systems. To promote interoperability and broaden exposure via the semantic web, NGDC is publishing lithologic classification schemes and terminology used in the Index as Simple Knowledge Organization System (SKOS) vocabularies, coordinating with R2R and the Consortium for Ocean Leadership for consistency. Availability in SKOS form will also facilitate use of the vocabularies in International Standards Organization (ISO) 19115-2 compliant metadata records. NGDC provides stewardship for the Index on behalf of U.S. repositories as the NSF designated "appropriate National Data Center" for data and metadata pertaining to sea floor samples as specified in the 2011 Division of Ocean Sciences Sample and Data Policy, and on behalf of international partners via a collocated World Data Center. NGDC operates on the Open Archival Information System (OAIS) reference model. Active Partners: Antarctic Marine Geology Research Facility, Florida State University; British Ocean Sediment Core Research Facility; Geological Survey of Canada; Integrated Ocean Drilling Program; Lamont-Doherty Earth Observatory; National Lacustrine Core Repository, University of Minnesota; Oregon State University; Scripps Institution of Oceanography; University of Rhode Island; U.S. Geological Survey; Woods Hole Oceanographic Institution.
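The SKOS publication of lithologic vocabularies described above can be pictured with a short sketch. The following Python function renders a minimal SKOS concept as Turtle using plain string formatting; the scheme URI and concept identifiers are invented for illustration and do not reflect the actual NGDC vocabularies.

```python
def skos_concept_ttl(scheme_uri, concept_id, pref_label, broader_id=None):
    """Render one SKOS concept as a Turtle snippet (string only, no RDF library)."""
    lines = [
        "@prefix skos: <http://www.w3.org/2004/02/skos/core#> .",
        "",
        f"<{scheme_uri}/{concept_id}> a skos:Concept ;",
        f'    skos:prefLabel "{pref_label}"@en ;',
        f"    skos:inScheme <{scheme_uri}> " + (";" if broader_id else "."),
    ]
    if broader_id:
        # Link narrower terms up the lithologic hierarchy.
        lines.append(f"    skos:broader <{scheme_uri}/{broader_id}> .")
    return "\n".join(lines)

# Hypothetical lithology scheme and terms.
ttl = skos_concept_ttl("https://example.org/lithology", "terrigenous-sand",
                       "terrigenous sand", broader_id="sand")
```

Concepts expressed this way can then be referenced from ISO 19115-2 metadata records by URI rather than by free text, which is the interoperability benefit the abstract describes.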
Lau, Annie Y S; Sintchenko, Vitali; Crimmins, Jacinta; Magrabi, Farah; Gallego, Blanca; Coiera, Enrico
2012-04-02
Online social networking and personally controlled health management systems (PCHMS) offer a new opportunity for developing innovative interventions to prevent diseases of public health concern (e.g., influenza) but there are few comparative studies about patterns of use and impact of these systems. A 2010 CONSORT-compliant randomised controlled trial with a two-group parallel design will assess the efficacy of a web-based PCHMS called Healthy.me in facilitating the uptake of influenza vaccine amongst university students and staff. Eligible participants are randomised either to obtain access to Healthy.me or a 6-month waitlist. Participants complete pre-study, post-study and monthly surveys about their health and utilisation of health services. A post-study clinical audit will be conducted to validate self-reports about influenza vaccination and visits to the university health service due to influenza-like illness (ILI) amongst a subset of participants. 600 participants older than 18 years with monthly access to the Internet and email will be recruited. Participants who (i) discontinue the online registration process; (ii) report obtaining an influenza vaccination in 2010 before the commencement of the study; or (iii) report being influenced by other participants to undertake influenza vaccination will be excluded from analysis. The primary outcome measure is the number of participants obtaining influenza vaccination during the study. Secondary outcome measures include: number of participants (i) experiencing ILI symptoms, (ii) absent from or experiencing impairment in work or study due to ILI symptoms, (iii) using health services or medications due to ILI symptoms; (iv) expressing positive or negative attitudes or experiences towards influenza vaccination, via their reasons of receiving (or not receiving) influenza vaccine; and (v) their patterns of usage of Healthy.me (e.g., frequency and timing of hits, duration of access, uptake of specific functions). 
This study will provide new insights about the utility of online social networking and PCHMS for public health and health promotion. It will help to assess whether a web-based PCHMS, with connectivity to a health service provider, containing information and self-management tools, can improve the uptake of preventive health services amongst university students and staff. ACTRN12610000386033 (Australian New Zealand Clinical Trials Registry).
802.16e System Profile for NASA Extra-Vehicular Activities
NASA Technical Reports Server (NTRS)
Foore, Lawrence R.; Chelmins, David T.; Nguyen, Hung D.; Downey, Joseph A.; Finn, Gregory G.; Cagley, Richard E.; Bakula, Casey J.
2009-01-01
This report identifies an 802.16e system profile that is applicable to a lunar surface wireless network, and specifically to meeting extra-vehicular activity (EVA) data flow requirements. EVA suit communication needs are addressed. Design-driving operational scenarios are considered. These scenarios are then used to identify a configuration of the 802.16e system (system profile) that meets EVA requirements, but also aims to make the radio realizable within EVA constraints. Limitations of this system configuration are highlighted. An overview and development status are presented by Toyon Research Corporation concerning the development of an 802.16e-compatible modem under NASA's Small Business Innovation Research (SBIR) Program. This modem is based on the recommended system profile developed as part of this report. Last, a path forward is outlined that presents an evolvable solution for the EVA radio system and lunar surface radio networks. This solution is based on a custom link layer, an 802.16e physical layer compliant with the identified system profile, and a later progression to a fully interoperable 802.16e system.
Masia, Lorenzo; Cappello, Leonardo; Morasso, Pietro; Lachenal, Xavier; Pirrera, Alberto; Weaver, Paul; Mattioni, Filippo
2013-06-01
A novel actuator is introduced that combines an elastically compliant composite structure with conventional electromechanical elements. The proposed design is analogous to that used in Series Elastic Actuators, its distinctive feature being that the compliant composite part offers different stable configurations. In other words, its elastic potential presents points of local minima that correspond to robust stable positions (multistability). This potential is known a priori as a function of the structural geometry, thus providing tremendous benefits in terms of control implementation. Such knowledge enables the complexities arising from the additional degrees of freedom associated with link deformations to be overcome and uncover challenges that extends beyond those posed by standard rigidlink robot dynamics. It is thought that integrating a multistable elastic element in a robotic transmission can provide new scenarios in the field of assistive robotics, as the system may help a subject to stand or carry a load without the need for an active control effort by the actuators.
Quantification of regenerative potential in primary human mammary epithelial cells.
Linnemann, Jelena R; Miura, Haruko; Meixner, Lisa K; Irmler, Martin; Kloos, Uwe J; Hirschi, Benjamin; Bartsch, Harald S; Sass, Steffen; Beckers, Johannes; Theis, Fabian J; Gabka, Christian; Sotlar, Karl; Scheel, Christina H
2015-09-15
We present an organoid regeneration assay in which freshly isolated human mammary epithelial cells are cultured in adherent or floating collagen gels, corresponding to a rigid or compliant matrix environment. In both conditions, luminal progenitors form spheres, whereas basal cells generate branched ductal structures. In compliant but not rigid collagen gels, branching ducts form alveoli at their tips, express basal and luminal markers at correct positions, and display contractility, which is required for alveologenesis. Thereby, branched structures generated in compliant collagen gels resemble terminal ductal-lobular units (TDLUs), the functional units of the mammary gland. Using the membrane metallo-endopeptidase CD10 as a surface marker enriches for TDLU formation and reveals the presence of stromal cells within the CD49f(hi)/EpCAM(-) population. In summary, we describe a defined in vitro assay system to quantify cells with regenerative potential and systematically investigate their interaction with the physical environment at distinct steps of morphogenesis. © 2015. Published by The Company of Biologists Ltd.
Koopmans-Compliant Spectral Functionals for Extended Systems
NASA Astrophysics Data System (ADS)
Nguyen, Ngoc Linh; Colonna, Nicola; Ferretti, Andrea; Marzari, Nicola
2018-04-01
Koopmans-compliant functionals have been shown to provide accurate spectral properties for molecular systems; this accuracy is driven by the generalized linearization condition imposed on each charged excitation, i.e., on changing the occupation of any orbital in the system, while accounting for screening and relaxation from all other electrons. In this work, we discuss the theoretical formulation and the practical implementation of this formalism to the case of extended systems, where a third condition, the localization of Koopmans's orbitals, proves crucial to reach seamlessly the thermodynamic limit. We illustrate the formalism by first studying one-dimensional molecular systems of increasing length. Then, we consider the band gaps of 30 paradigmatic solid-state test cases, for which accurate experimental and computational results are available. The results are found to be comparable with the state of the art in many-body perturbation theory, notably using just a functional formulation for spectral properties and the generalized-gradient approximation for the exchange and correlation functional.
EPA Administrative Order on Consent (AOC) with ERP Compliant Coke, LLC
This Administrative Order on Consent with ERP Compliant Coke was effective August 2016. The Walter Coke facility located in North Birmingham was purchased by ERP Compliant Coke, LLC in February 2016 out of bankruptcy proceedings.
Safety and fitness electronic records system (SAFER) : draft master test plan
DOT National Transportation Integrated Search
1995-12-31
The purpose of this plan is to establish a formal set of guidelines and activities to be adhered to and performed by JHU/APL and the developer to ensure that the SAFER System has been tested successfully and is fully compliant with the SAFER System...
Surviving an Information Systems Conversion.
ERIC Educational Resources Information Center
Neel, Don
1999-01-01
Prompted by the "millennium bug," many school districts are in the process of replacing non-Y2K-compliant information systems. Planners should establish a committee to develop performance criteria and select the winning proposal, estimate time requirements, and schedule retraining during low-activity periods. (MLH)
On the theory of compliant wall drag reduction in turbulent boundary layers
NASA Technical Reports Server (NTRS)
Ash, R. L.
1974-01-01
A theoretical model has been developed which can explain how the motion of a compliant wall reduces turbulent skin friction drag. Available experimental evidence at low speeds has been used to infer that a compliant surface selectively removes energy from the upper frequency range of the energy containing eddies and through resulting surface motions can produce locally negative Reynolds stresses at the wall. The theory establishes a preliminary amplitude and frequency criterion as the basis for designing effective drag reducing compliant surfaces.
Compliant Interfacial Layers in Thermoelectric Devices
NASA Technical Reports Server (NTRS)
Firdosy, Samad A. (Inventor); Li, Billy Chun-Yip (Inventor); Ravi, Vilupanur A. (Inventor); Fleurial, Jean-Pierre (Inventor); Caillat, Thierry (Inventor); Anjunyan, Harut (Inventor)
2017-01-01
A thermoelectric power generation device is disclosed using one or more mechanically compliant and thermally and electrically conductive layers at the thermoelectric material interfaces to accommodate high temperature differentials and stresses induced thereby. The compliant material may be metal foam or metal graphite composite (e.g. using nickel) and is particularly beneficial in high temperature thermoelectric generators employing Zintl thermoelectric materials. The compliant material may be disposed between the thermoelectric segments of the device or between a thermoelectric segment and the hot or cold side interconnect of the device.
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
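A rough sketch of the retrieval pattern described above (resolving a dataset's persistent identifier, optionally narrowed to one file and a preferred media type) might look as follows in Python. The handle, the `filename` query-parameter name and the resolution behaviour are assumptions for illustration, not the paper's actual conventions.

```python
from urllib.request import Request

def dataset_request(handle, filename=None, media_type="chemical/x-cml"):
    """Build (but do not send) an HTTP request that resolves a dataset's
    persistent identifier via the DOI resolver, asking for a preferred
    media type through HTTP content negotiation."""
    url = f"https://doi.org/{handle}"
    if filename:
        # Hypothetical convention: narrow the dataset to one member file.
        url += f"?filename={filename}"
    return Request(url, headers={"Accept": media_type})

# Hypothetical handle and filename.
req = dataset_request("10.1234/example-dataset", filename="model.cml")
```

Sending the request with `urllib.request.urlopen(req)` would follow the resolver's redirects to the repository; a visualizer such as JSmol plays the analogous role client-side in the paper's demonstrations.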
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo W.
2017-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. At the conclusion of the development, the software and hardware description language (HDL) code was delivered to JSC for use in their iPAS test bed, to gain hands-on experience with the STRS standard and to develop their own STRS waveforms on the now STRS-compliant platform. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS-compliant platform that can be used for waveform development for multiple applications. The purpose of this document is to describe the design of the HDL code for the FPGA portion of the iPAS STRS Radio, particularly the design of the FPGA wrapper and the test waveform.
Koh, Hong; Kim, Seung; Kim, Myung-Joon; Kim, Hyun Gi; Shin, Hyun Joo; Lee, Mi-Jung
2015-09-07
To evaluate the possibility of treatment effect monitoring using hepatic fat quantification magnetic resonance (MR) in pediatric nonalcoholic steatohepatitis (NASH). We retrospectively reviewed the medical records of patients who received educational recommendations and vitamin E for NASH and underwent hepatic fat quantification MR from 2011 to 2013. Hepatic fat fraction (%) was measured using dual- and triple-echo gradient-recalled-echo sequences at 3T. The compliant and non-compliant groups were compared clinically, biochemically, and radiologically. Twenty seven patients (M:F = 24:3; mean age: 12 ± 2.3 years) were included (compliant group = 22, non-compliant = 5). None of the baseline findings differed between the 2 groups, except for triglyceride level (compliant vs non-compliant, 167.7 mg/dL vs 74.2 mg/dL, P = 0.001). In the compliant group, high-density lipoprotein increased and all other parameters decreased after 1-year follow-up. However, there were various changes in the non-compliant group. Dual-echo fat fraction (-19.2% vs 4.6, P < 0.001), triple-echo fat fraction (-13.4% vs 3.5, P < 0.001), alanine aminotransferase (-110.7 IU/L vs -10.6 IU/L, P = 0.047), total cholesterol (-18.1 mg/dL vs 3.8 mg/dL, P = 0.016), and triglyceride levels (-61.3 mg/dL vs 11.2 mg/dL, P = 0.013) were significantly decreased only in the compliant group. The change in body mass index and dual-echo fat fraction showed a positive correlation (ρ = 0.418, P = 0.030). Hepatic fat quantification MR can be a non-invasive, quantitative and useful tool for monitoring treatment effects in pediatric NASH.
Summary of compliant and multi-arm control at NASA. Langley Research Center
NASA Technical Reports Server (NTRS)
Harrison, Fenton W.
1992-01-01
The topics are presented in viewgraph form and include: the single arm system, single arm axis system, single arm control systems, single arm hand controller axis system, single arm position axis system, single arm vision axis system, single arm force axis system, multi-arm system, multi-arm axis system, and the dual arm hand control axis system with control signals.
Bistable Mechanisms for Space Applications
Zirbel, Shannon A.; Tolman, Kyler A.; Trease, Brian P.
2016-01-01
Compliant bistable mechanisms are monolithic devices with two stable equilibrium positions separated by an unstable equilibrium position. They show promise in space applications as nonexplosive release mechanisms in deployment systems, thereby eliminating friction and improving the reliability and precision of those mechanical devices. This paper presents both analytical and numerical models that are used to predict bistable behavior and can be used to create bistable mechanisms in materials not previously feasible for compliant mechanisms. Materials compatible with space applications are evaluated for use as bistable mechanisms and prototypes are fabricated in three different materials. Pin-puller and cutter release mechanisms are proposed as potential space applications. PMID:28030588
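The defining property above, two stable equilibrium positions separated by an unstable one, is conveniently illustrated by the canonical double-well potential (a generic textbook illustration, not the authors' mechanism-specific model):

```python
import math

def double_well_equilibria(a, b):
    """Equilibria of the generic double-well potential V(x) = a*x**4 - b*x**2
    (with a, b > 0), a canonical illustration of bistability.
    dV/dx = 4*a*x**3 - 2*b*x = 0 gives one unstable equilibrium at x = 0
    and two stable equilibria at x = +/- sqrt(b / (2*a)).
    """
    if a <= 0 or b <= 0:
        raise ValueError("a and b must be positive for a double well")
    x_stable = math.sqrt(b / (2.0 * a))
    return (-x_stable, 0.0, x_stable)

print(double_well_equilibria(1.0, 2.0))  # → (-1.0, 0.0, 1.0)
```

In a compliant bistable mechanism, the strain energy stored in the flexing members plays the role of V(x); snapping between the two wells is what permits a nonexplosive release.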
Raul, P R; Dwivedula, R V; Pagilla, P R
2016-07-01
The problem of controlling the load speed of a mechanical transmission system consisting of a belt-pulley and gear-pair is considered. The system is modeled as two inertias (motor and load) connected by a compliant transmission. If the transmission is assumed to be rigid, then using either the motor or load speed feedback provides the same result. However, with transmission compliance, due to belts or long shafts, the stability characteristics and performance of the closed-loop system are quite different depending on whether motor or load speed feedback is employed. We investigate motor and load speed feedback schemes by utilizing the singular perturbation method. We propose and discuss a control scheme that utilizes both motor and load speed feedback, and design an adaptive feedforward action to reject load torque disturbances. The control algorithms are implemented on an experimental platform that is typically used in roll-to-roll manufacturing, and results are shown and discussed. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
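The two-inertia model described above can be sketched numerically. The following simulation is a generic two-inertia system with a compliant coupling, not the authors' controller design, and every parameter value is hypothetical:

```python
def simulate_two_inertia(t_end=2.0, dt=1e-4):
    """Generic two-inertia (motor + load) model with a compliant coupling:

        J_m * dw_m/dt = T_m - T_s
        J_l * dw_l/dt = T_s - T_load
        T_s = k * (th_m - th_l) + c * (w_m - w_l)   # shaft/belt torque

    Integrated with semi-implicit Euler. All parameter values below are
    illustrative, not taken from the paper.
    """
    J_m, J_l = 0.01, 0.02        # motor and load inertias [kg m^2]
    k, c = 50.0, 0.05            # coupling stiffness [N m/rad] and damping
    T_m, T_load = 1.0, 0.5       # constant motor and load torques [N m]
    th_m = th_l = w_m = w_l = 0.0
    for _ in range(int(t_end / dt)):
        T_s = k * (th_m - th_l) + c * (w_m - w_l)
        w_m += dt * (T_m - T_s) / J_m     # update speeds first...
        w_l += dt * (T_s - T_load) / J_l
        th_m += dt * w_m                  # ...then angles (semi-implicit)
        th_l += dt * w_l
    return w_m, w_l

w_m, w_l = simulate_two_inertia()
print(w_m, w_l)
```

With a rigid coupling the two speeds would be identical at all times; with compliance they differ transiently, which is exactly why motor-side and load-side speed feedback behave differently in closed loop.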
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
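SDA's orchestration of "all process tasks in the correct sequence" is, at its core, dependency-ordered scheduling. A generic sketch of that idea using Kahn's topological sort (the task names are hypothetical, and this is not the actual TieFlow engine):

```python
from collections import deque

def enactment_order(tasks, depends_on):
    """Return an order in which process tasks can be started, honoring
    dependencies (Kahn's topological sort). `depends_on` maps each task
    to the tasks that must finish before it may begin. Generic sketch
    of workflow enactment, not the TieFlow engine itself.
    """
    indegree = {t: 0 for t in tasks}
    dependents = {t: [] for t in tasks}
    for task, prereqs in depends_on.items():
        for p in prereqs:
            indegree[task] += 1
            dependents[p].append(task)
    ready = deque(t for t in tasks if indegree[t] == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)          # task may begin: notify its assignee here
        for d in dependents[t]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(tasks):
        raise ValueError("cyclic dependencies: process cannot be enacted")
    return order

# Hypothetical mission-software process steps:
steps = enactment_order(
    ["design", "review", "code", "test"],
    {"review": ["design"], "code": ["review"], "test": ["code"]},
)
print(steps)  # → ['design', 'review', 'code', 'test']
```

A BPM engine layers business rules, notifications, and artifact storage on top of this ordering, but the sequencing constraint itself is captured by the sort.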
41 CFR 102-33.195 - Do we need an automated system to account for aircraft costs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... FAIRS-compliant system are described in the “Common Aviation Management Information Standard” (C-AMIS... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Do we need an automated... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION PERSONAL...
NASA Astrophysics Data System (ADS)
Cattadori, M.
2013-12-01
It has been demonstrated that in Italy, Earth and Climate System Sciences (ESS) education is one of the scientific disciplines in which science teachers show the greatest need for professional support. Among the reported causes are the predominance of science teachers with degrees in biological rather than geological or physical disciplines, and the high interdisciplinarity of certain topics, in particular those related to the climate system. Furthermore, ESS topics were found to be predominant in the science curricula of those grades that report the highest student dropout rates across the whole Italian school cycle. In this context, in 2010, the MUSE, the Museum of Science of Trento (Italy), created a web-based service named I-CLEEN (Inquiring on Climate and Energy, www.icleen.muse.it). This tool aims to promote collaboration among science teachers by sharing resources and enhancing professional collaboration through participatory methods and models borrowed from the open-source and open-content world. The main instrument of the I-CLEEN project is an online repository (with metadata compliant with the DCMI and LOM international standards) of teaching resources focused on Earth and climate sciences, all published under the Creative Commons Attribution 3.0 license and therefore following the OER (Open Educational Resources) model. The service has been designed, developed, and managed by a team of highly experienced science teachers and scientists from the museum and partner research institutions. The editorial work is carried out online on a platform built with Liferay, an open-source CMS (Content Management System) that manages the database, the website, the editorial process, and several Web 2.0 services within a single Java-based environment. 
The project has undergone two distinct testing activities in collaboration with the University of Trento, addressing both the effectiveness of the service and the usability of its graphical user interface (GUI). The present work illustrates the essential features of the I-CLEEN service and the results achieved during its last three years of operation. It displays and interprets, for the first time, web traffic data, data on downloads and publication of teaching resources, and the main outcomes of the above-mentioned tests. The purpose of this contribution is to highlight the strengths and weaknesses of this experience and to provide valuable information on the role of today's web-based services and online communities in supporting teachers of Earth and climate science subjects.
Passive control of a biventricular assist device with compliant inflow cannulae.
Gregory, Shaun David; Pearcy, Mark John; Timms, Daniel
2012-08-01
Rotary ventricular assist device (VAD) support of the cardiovascular system is susceptible to suction events due to the limited preload sensitivity of these devices. This may be of particular concern with rotary biventricular support (BiVAD) where the native, flow balancing Starling response is diminished in both ventricles. The reliability of sensor and sensorless-based control systems which aim to control VAD flow based on preload has limitations, and, thus, an alternative solution is desired. This study introduces a compliant inflow cannula (CIC) which could improve the preload sensitivity of a rotary VAD by passively altering VAD flow depending on preload. To evaluate the design, both the CIC and a standard rigid inflow cannula were inserted into a mock circulation loop to enable biventricular heart failure support using configurations of atrial and ventricular inflow, and arterial outflow cannulation. A range of left (LVAD) and right VAD (RVAD) rotational speeds were tested as well as step changes in systemic/pulmonary vascular resistance to alter relative preloads, with resulting flow rates recorded. Simulated suction events were observed, particularly at higher VAD speeds, during support with the rigid inflow cannula, while the CIC prevented suction events under all circumstances. The compliant section passively restricted its internal diameter as preload was reduced, which increased the VAD circuit resistance and thus reduced VAD flow. Therefore, a CIC could potentially be used as a passive control system to prevent suction events in rotary left, right, and biventricular support. © 2012, Copyright the Authors. Artificial Organs © 2012, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation
NASA Technical Reports Server (NTRS)
Lewandowski, Edward J.; Suarez, Vicente J.; Goodnight, Thomas W.; Callahan, John
2007-01-01
The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical radioisotope power system (RPS) launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system, thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors, and heat sources was designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, test article modes, frequencies, and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.
REAL ID: LANL Impacts and Solutions. The federal government has determined that New Mexico is non-compliant with the REAL ID Act, and holders of New Mexico identification cards will also become non-compliant. Access through LANL Vehicle Access Portals is unaffected; an alternate ID is required for those coming from "non-compliant" REAL-ID states. LANS and the Field Office have
Code of Federal Regulations, 2012 CFR
2012-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Performance of a non-tapered 3D morphing wing with integrated compliant ribs
NASA Astrophysics Data System (ADS)
Previtali, F.; Ermanni, P.
2012-05-01
Morphing wings have a high potential for improving the performance and reducing the fuel consumption of modern aircraft. Thanks to its simplicity, the compliant belt-rib concept is regarded by the authors as a promising solution. Using the compliant rib designed by Hasse and Campanile as a starting point, a compliant morphing wing made of composite materials is designed. Innovative methods for optimal placing of the actuation and for the quantification of the morphing are used. The performance of the compliant morphing wing in terms of three-dimensional (3D) structural behaviour and aerodynamic properties, both two- and three-dimensional, is presented and discussed. The fundamental importance of considering 3D coupling effects in the determination of the performance of morphing aerofoils is shown.
Approaches to Linked Open Data at data.oceandrilling.org
NASA Astrophysics Data System (ADS)
Fils, D.
2012-12-01
The data.oceandrilling.org web application applies Linked Open Data (LOD) patterns to expose Deep Sea Drilling Project (DSDP), Ocean Drilling Program (ODP) and Integrated Ocean Drilling Program (IODP) data. Ocean drilling data is represented in a rich range of data formats: high-resolution images, file-based data sets and sample-based data. This richness of data types has been well met by semantic approaches, as will be demonstrated. Data has been extracted from CSV, HTML and RDBMS sources through custom software and existing packages for loading into a SPARQL 1.1 compliant triple store. Practices have been developed to streamline the maintenance of the RDF graphs and properly expose them using LOD approaches like VoID and HTML-embedded structured data. Custom and existing vocabularies are used to allow semantic relations between resources. Use of the W3C draft RDF Data Cube Vocabulary and other approaches for encoding time scales, taxonomic fossil data and other graphs will be shown. A software layer written in Google Go mediates the RDF-to-web pipeline. The approach used is general and can be applied to other similar environments like node.js or Python Twisted. To facilitate communication, user interface software libraries such as D3 and packages such as S2S and LodLive have been used. Additionally, OpenSearch APIs, structured data in HTML and SPARQL endpoints provide various access methods for applications. data.oceandrilling.org is viewed not as a web site but as an application that communicates with a range of clients. This approach helps guide the development more along software practices than along web site authoring approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clements, Samuel L.; Edgar, Thomas W.; Manz, David O.
The purpose of this workshop was to identify and discuss concerns with the use and adoption of IEC 62351 security standard for IEC 61850 compliant control system products. The industry participants discussed performance, interoperability, adoption, challenges, business cases, and future issues.
EXO-DAT: AN INFORMATION SYSTEM IN SUPPORT OF THE CoRoT/EXOPLANET SCIENCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deleuil, M.; Meunier, J. C.; Moutou, C.
2009-08-15
Exo-Dat is a database and an information system created primarily in support of the exoplanet program of the COnvection ROtation and planetary Transits (CoRoT) mission. In the directions of CoRoT pointings, it provides a unified interface to several sets of data: stellar published catalogs, photometric and spectroscopic data obtained during the mission preparation, results from the mission and from follow-up observations, and several mission-specific technical parameters. The new photometric data constitute the subcatalog Exo-Cat, and give consistent 4-color photometry of 14.0 million stars with a completeness to 19th magnitude in the r-filter. It covers several zones in the galactic plane around CoRoT pointings, with a total area of 209 deg². This Exo-Dat information system provides essential technical support to the ongoing CoRoT light-curve analyses and ground-based follow-up by supplying additional complementary information such as the prior knowledge of the star's fundamental parameters or its contamination level inside the large CoRoT photometric mask. The database is fully interfaced with VO tools and thus benefits from existing visualization and analysis tools like TOPCAT or ALADIN. It is accessible to the CoRoT community through the Web, and will be gradually opened to the public. It is the ideal tool to prepare the foreseen statistical studies of the properties of the exoplanetary systems. As a VO-compliant system, such analyses could thus benefit from the most up-to-date classifier tools.
Determining Appropriate Coupling between User Experiences and Earth Science Data Services
NASA Astrophysics Data System (ADS)
Moghaddam-Taaheri, E.; Pilone, D.; Newman, D. J.; Mitchell, A. E.; Goff, T. D.; Baynes, K.
2012-12-01
NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. Reverb exposes ECHO's capabilities through an interactive, Web 2.0 application designed around searching for Earth Science data and downloading or ordering data of interest. ECHO and Reverb have supported the concept of Earth Science data services for several years but only for discovery. Invocation of these services was not a primary capability of the user experience. As more and more Earth Science data moves online and away from the concept of data ordering, progress has been made in making on demand services available for directly accessed data. These concepts have existed through access mechanisms such as OPeNDAP but are proliferating to accommodate a wider variety of services and service providers. Recently, the EOSDIS Service Interface (ESI) was defined and integrated into the ECS system. The ESI allows data providers to expose a wide variety of service capabilities including reprojection, reformatting, spatial and band subsetting, and resampling. ECHO and Reverb were tasked with making these services available to end-users in a meaningful and usable way that integrated into its existing search and ordering workflow. This presentation discusses the challenges associated with exposing disparate service capabilities while presenting a meaningful and cohesive user experience. 
Specifically, we'll discuss: - Benefits and challenges of tightly coupling the user interface with underlying services - Approaches to generic service descriptions - Approaches to dynamic user interfaces that better describe service capabilities while minimizing application coupling - Challenges associated with traditional WSDL / UDDI style service descriptions - Walkthrough of the solution used by ECHO and Reverb to integrate and expose ESI compliant services to our users
Training Children with Autism Spectrum Disorders to Be Compliant with an Oral Assessment
ERIC Educational Resources Information Center
Cuvo, Anthony J.; Godard, Anna; Huckfeldt, Rachel; DeMattei, Ronda
2010-01-01
Little research has been conducted on teaching children with autism spectrum disorders to be compliant with dental procedures. This study evaluated a behavioral package to train children with autism spectrum disorders to be compliant with an 8-component oral assessment. After a dental hygienist performed an assessment pretest, noncompliance on…
Power bases and attribution in three cultures.
Alanazi, Falah M; Rodrigues, Aroldo
2003-06-01
The authors used a Saudi context to verify the cross-cultural generality of findings (A. Rodrigues & K. L. Lloyd, 1998) reported for U.S. and Brazilian samples, in which compliant behavior caused by reward, informational, and referent influences was perceived as more controllable and more internal than compliant behavior resulting from legitimate, expert, and coercive influences. This differential attribution led, in turn, to different affective and behavioral responses. In the present study, cognitive and affective reactions of Saudi students were measured with regard to compliant behavior (leading to a good outcome or a bad outcome) caused by each of the 6 bases of power described by B. H. Raven (1965). As expected, power bases had significant effects. However, when the outcome of the compliant behavior was bad, compliant behavior caused by a coercive influence led to the perception of more internality and controllability. Also--a finding not reported in previous studies--compliant behavior caused by an informational influence led to the perception of less internality and controllability. Findings are discussed in the light of related research and Saudi cultural characteristics.
Herd protection effect of N95 respirators in healthcare workers.
Chen, Xin; Chughtai, Abrar Ahmad; MacIntyre, Chandini Raina
2017-12-01
Objective To determine if there was herd protection conferred to unprotected healthcare workers (HCWs) by N95 respirators worn by colleagues. Methods Data were analysed from a prospective cluster randomized clinical trial conducted in Beijing, China between 1 December 2008 and 15 January 2009. A minimum compliance level (MCL) of N95 respirators for prevention of clinical respiratory illness (CRI) was set based on various compliance cut-offs. The CRI rates were compared between compliant (≥MCL) and non-compliant (
Next Generation Waste Tracking: Linking Legacy Systems with Modern Networking Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Randy M.; Resseguie, David R.; Shankar, Mallikarjun
2010-01-01
This report describes results from a preliminary analysis to satisfy the Department of Energy (DOE) objective to ensure the safe, secure, efficient packaging and transportation of materials both hazardous and non-hazardous [1, 2]. The DOE Office of Environmental Management (OEM), through Oak Ridge National Laboratory (ORNL), has embarked on a project to further this objective. OEM and ORNL have agreed to develop, demonstrate and make available modern-day cost-effective technologies for characterization, identification, tracking, monitoring and disposal of radioactive waste when transported by, or between, motor, air, rail, and water modes. During the past 8 years, ORNL has investigated and deployed Web 2.0 compliant sensors into the transportation segment of the supply chain. ORNL has recently demonstrated operational experience with the DOE Oak Ridge Operations Office (ORO) and others in national test beds and applications within this domain of the supply chain. Furthermore, in addition to DOE, these hazardous materials supply chain partners included Federal and State enforcement agencies, international ports, and commercial sector shipping operations in a hazardous/radioactive materials tracking and monitoring program called IntelligentFreight. IntelligentFreight is an ORNL initiative encompassing 5 years of research effort associated with the supply chain. The ongoing ORNL SmartFreight programs include RadSTraM [3], GRadSTraM, Trusted Corridors, SensorPedia [4], SensorNet, Southeastern Transportation Corridor Pilot (SETCP) and Trade Data Exchange [5]. The integration of multiple technologies aimed at safer, more secure conveyance has been investigated, with the core research question focused on testing distinctly different distributed supply chain information sharing systems. ORNL, with support from ORO, has demonstrated capabilities when transporting Environmental Management (EM) waste materials for disposal over an onsite haul road. 
ORNL has unified the operations of existing legacy hazardous, radioactive and related informational databases and systems using emerging Web 2.0 technologies. These capabilities were used to interoperate ORNL's waste generating, packaging, transportation and disposal operations with those of other DOE ORO waste management contractors. Importantly, the DOE EM objectives were accomplished in a cost-effective manner without altering existing information systems. A path forward is to demonstrate and share these technologies with DOE EM, contractors and stakeholders. This approach will not alter existing DOE assets, i.e., the Automated Traffic Management Systems (ATMS), the Transportation Tracking and Communications System (TRANSCOM), the Argonne National Laboratory (ANL) demonstrated package tracking system, etc.
Load Capacity Estimation of Foil Air Journal Bearings for Oil-Free Turbomachinery Applications
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Valco, Mark J.
2000-01-01
This paper introduces a simple "Rule of Thumb" (ROT) method to estimate the load capacity of foil air journal bearings, which are self-acting compliant-surface hydrodynamic bearings being considered for Oil-Free turbo-machinery applications such as gas turbine engines. The ROT is based on first principles and data available in the literature and it relates bearing load capacity to the bearing size and speed through an empirically based load capacity coefficient, D. It is shown that load capacity is a linear function of bearing surface velocity and bearing projected area. Furthermore, it was found that the load capacity coefficient, D, is related to the design features of the bearing compliant members and operating conditions (speed and ambient temperature). Early bearing designs with basic or "first generation" compliant support elements have relatively low load capacity. More advanced bearings, in which the compliance of the support structure is tailored, have load capacities up to five times those of simpler designs. The ROT enables simplified load capacity estimation for foil air journal bearings and can guide development of new Oil-Free turbomachinery systems.
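The rule of thumb described above makes load capacity linear in projected area and surface velocity through the empirically based coefficient D. A sketch of that relation (the coefficient value and bearing dimensions below are illustrative, not values from the paper, and units must be kept consistent with whatever units D is expressed in):

```python
def foil_bearing_load_capacity(coeff_D, length, diameter, speed_krpm):
    """Rule-of-thumb maximum load capacity of a foil air journal bearing:

        W = D * (L * Dia) * (Dia * N)

    i.e. linear in projected area (L * Dia) and in surface speed
    (proportional to Dia * N, with N in krpm). coeff_D is the empirically
    based load capacity coefficient; higher-generation compliant supports
    have larger D. All example values below are illustrative only.
    """
    return coeff_D * (length * diameter) * (diameter * speed_krpm)

# Illustrative: a 50 mm x 50 mm bearing at 30 krpm with an assumed D.
print(foil_bearing_load_capacity(coeff_D=0.8, length=50.0,
                                 diameter=50.0, speed_krpm=30.0))
```

The form makes the paper's key points visible: doubling speed doubles load capacity, and an advanced compliant structure raises capacity purely through a larger D at the same size and speed.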
Adaptivity in ProPer: An Adaptive SCORM Compliant LMS
ERIC Educational Resources Information Center
Kazanidis, Ioannis; Satratzemi, Maya
2009-01-01
Adaptive Educational Hypermedia Systems provide personalized educational content to learners. However most of them do not support the functionality of Learning Management Systems (LMS) and the reusability of their courses is hard work. On the other hand some LMS support SCORM specifications but do not provide adaptive features. This article…
School Accountability Systems and the Every Student Succeeds Act. Re:VISION
ERIC Educational Resources Information Center
Martin, Mike
2016-01-01
The "Every Student Succeeds Act" (ESSA) replaced the "No Child Left Behind Act of 2001" (NCLB) in December 2015, substantially changing the federal role in education and how schools across the country will be held accountable. For state policymakers, designing new ESSA-compliant accountability systems is a significant…
7 CFR 1753.6 - Standards, specifications, and general requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE TELECOMMUNICATIONS SYSTEM CONSTRUCTION POLICIES AND PROCEDURES... subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...
40 CFR 63.343 - Compliance provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operators of affected sources. (1) Composite mesh-pad systems. (i) During the initial performance test, the... with the emission limitations in § 63.342 through the use of a composite mesh-pad system shall... one performance test and accept ±2 inches of water column from this value as the compliant range. (ii...
40 CFR 63.343 - Compliance provisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... operators of affected sources. (1) Composite mesh-pad systems. (i) During the initial performance test, the... with the emission limitations in § 63.342 through the use of a composite mesh-pad system shall... one performance test and accept ±2 inches of water column from this value as the compliant range. (ii...
40 CFR 63.343 - Compliance provisions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operators of affected sources. (1) Composite mesh-pad systems. (i) During the initial performance test, the... with the emission limitations in § 63.342 through the use of a composite mesh-pad system shall... one performance test and accept ±2 inches of water column from this value as the compliant range. (ii...
Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha
2016-02-27
Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
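The cost/benefit decision between local serial execution and cloud computing mentioned above can be sketched with a simple linear cost model (all prices, job counts, node counts, and overheads below are hypothetical, and this is not the paper's actual formulae):

```python
def cheaper_option(n_jobs, t_job_hours, hourly_rate_local,
                   n_nodes, hourly_rate_node, overhead_hours):
    """Compare the total cost of local serial execution vs. cloud execution.

    Local: all jobs run serially on one machine billed per hour.
    Cloud: jobs spread evenly over `n_nodes` pay-per-hour instances,
    plus a fixed setup/data-transfer overhead per node. A hypothetical
    linear model for illustration, not the paper's actual formulae.
    """
    local_cost = n_jobs * t_job_hours * hourly_rate_local
    cloud_hours = (n_jobs / n_nodes) * t_job_hours + overhead_hours
    cloud_cost = cloud_hours * n_nodes * hourly_rate_node
    return ("cloud" if cloud_cost < local_cost else "local",
            local_cost, cloud_cost)

# Hypothetical: 100 jobs of 2 h each; local machine costed at $1/h,
# 20 cloud nodes at $0.50/h with 1 h of per-node overhead.
print(cheaper_option(100, 2.0, 1.0, 20, 0.5, 1.0))  # → ('cloud', 200.0, 110.0)
```

The break-even shifts toward local execution as per-node overhead or instance pricing grows, which is why such formulae are evaluated per pipeline rather than once for all workloads.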
NASA Astrophysics Data System (ADS)
Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha
2016-03-01
Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha
2016-01-01
Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline. PMID:27127335
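The cost/benefit decision between local serial execution and cloud computing described in this abstract can be sketched with a toy break-even estimate. The function, parameter names, and hour-rounded billing model below are illustrative assumptions, not the authors' actual formulae:

```python
import math

def cloud_vs_local(n_jobs, t_job_hr, n_nodes, node_rate_usd_hr, overhead_hr=0.25):
    """Toy break-even estimate (illustrative, not the paper's formulae):
    serial local wall time vs. parallel cloud wall time and dollar cost."""
    local_wall = n_jobs * t_job_hr                      # run everything serially
    cloud_wall = overhead_hr + t_job_hr * math.ceil(n_jobs / n_nodes)
    # EC2-style billing: every node is billed for the rounded-up run time
    cloud_cost = n_nodes * math.ceil(cloud_wall) * node_rate_usd_hr
    return local_wall, cloud_wall, cloud_cost

# 100 half-hour DTI jobs on 20 nodes at a hypothetical $0.10/node-hour
local_wall, cloud_wall, cost = cloud_vs_local(100, 0.5, 20, 0.10)
```

With these made-up numbers, 50 hours of serial local computation collapses to under 3 hours of cloud wall time for a few dollars; the interesting regime is where the overhead or billing granularity dominates.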
Mathematical circulatory system model
NASA Technical Reports Server (NTRS)
Lakin, William D. (Inventor); Stevens, Scott A. (Inventor)
2010-01-01
A system and method of modeling a circulatory system including a regulatory mechanism parameter. In one embodiment, a regulatory mechanism parameter in a lumped parameter model is represented as a logistic function. In another embodiment, the circulatory system model includes a compliant vessel, the model having a parameter representing a change in pressure due to contraction of smooth muscles of a wall of the vessel.
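The abstract's idea of representing a regulatory mechanism parameter as a logistic function might look like the following minimal sketch; the set point, gain, and saturation bounds are hypothetical, not the patent's values:

```python
import math

def regulatory_gain(p, p_set=100.0, gain=0.05, r_min=0.5, r_max=2.0):
    """Hypothetical regulatory mechanism parameter (e.g. a resistance
    scale factor in a lumped parameter model) as a logistic function of
    pressure p: it saturates between r_min and r_max around the set
    point p_set. All constants are invented for illustration."""
    return r_min + (r_max - r_min) / (1.0 + math.exp(-gain * (p - p_set)))
```

The logistic form gives a smooth, bounded response: far below the set point the parameter sits near r_min, far above it near r_max, which is what makes it attractive for modeling saturating physiological regulation.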
Four-Wheel Vehicle Suspension System
NASA Technical Reports Server (NTRS)
Bickler, Donald B.
1990-01-01
Four-wheel suspension system uses simple system of levers with no compliant components to provide three-point suspension of chassis of vehicle while maintaining four-point contact with uneven terrain. Provides stability against tipping of four-point rectangular base, without the rocking contact to which a rigid four-wheel frame is susceptible. Similar to six-wheel suspension system described in "Articulated Suspension Without Springs" (NPO-17354).
2013-09-30
fire sprinkler system during the initial construction of the RSOI facilities. The construction contract to build the RSOI... International Building Code. Compliant manual and automatic fire alarm and notification systems, portable fire extinguishers, fire sprinkler systems... automatic fire sprinkler system that was not operational, a fire department connection that was obstructed, and a fire detection system
Dos Reis, Laura L; Tuttle, R Michael; Alon, Eran; Bergman, Donald A; Bernet, Victor; Brett, Elise M; Cobin, Rhoda; Doherty, Gerard; Harris, Jeffrey R; Klopper, Joshua; Lee, Stephanie L; Lupo, Mark; Milas, Mira; Machac, Josef; Mechanick, Jeffrey I; Orloff, Lisa; Randolph, Gregory; Ross, Douglas S; Smallridge, Robert C; Terris, David James; Tufano, Ralph P; Mehra, Saral; Scherl, Sophie; Clain, Jason B; Urken, Mark L
2014-10-01
Appropriate management of well-differentiated thyroid cancer requires treating clinicians to have access to critical elements of the patient's presentation, surgical management, postoperative course, and pathologic assessment. Electronic health records (EHRs) provide an effective method for the storage and transmission of patient information, although most commercially available EHRs are not intended to be disease-specific. In addition, there are significant challenges for the sharing of relevant clinical information when providers involved in the care of a patient with thyroid cancer are not connected by a common EHR. In 2012, the American Thyroid Association (ATA) defined the critical elements for optimal interclinician communication in a position paper entitled, "The Essential Elements of Interdisciplinary Communication of Perioperative Information for Patients Undergoing Thyroid Cancer Surgery." We present a field-by-field comparison of the ATA's essential elements as applied to three contemporary electronic reporting systems: the Thyroid Surgery e-Form from Memorial Sloan-Kettering Cancer Center (MSKCC), the Alberta WebSMR from the University of Calgary, and the Thyroid Cancer Care Collaborative (TCCC). The MSKCC e-form fulfills 21 of 32 intraoperative fields and includes an additional 14 fields not specifically mentioned in the ATA's report. The Alberta WebSMR fulfills 45 of 82 preoperative and intraoperative fields outlined by the ATA and includes 13 additional fields. The TCCC fulfills 117 of 120 fields outlined by the ATA and includes 23 additional fields. Effective management of thyroid cancer is a highly collaborative, multidisciplinary effort. The patient information that factors into clinical decisions about thyroid cancer is complex. For these reasons, EHRs are particularly favorable for the management of patients with thyroid cancer. 
The MSKCC Thyroid Surgery e-Form, the Alberta WebSMR, and the TCCC each meet all of the general recommendations for effective reporting of the specific domains that they cover in the management of thyroid cancer, as recommended by the ATA. However, the TCCC format is the most comprehensive. The TCCC is a new Web-based disease-specific database to enhance communication of patient information between clinicians in a Health Insurance Portability and Accountability Act (HIPAA)-compliant manner. We believe the easy-to-use TCCC format will enhance clinician communication while providing portability of thyroid cancer information for patients.
NASA Astrophysics Data System (ADS)
Anderson, D. M.; Snowden, D. P.; Bochenek, R.; Bickel, A.
2015-12-01
In U.S. coastal waters, a network of eleven regional coastal ocean observing systems supports real-time coastal and ocean observing. The platforms supported and variables acquired are diverse, ranging from current-sensing high frequency (HF) radar to autonomous gliders. The system incorporates data produced by other networks and experimental systems, further increasing the breadth of the collection. Strategies promoted by the U.S. Integrated Ocean Observing System (IOOS) ensure these data are not lost at sea. Every data set deserves a description: ISO- and FGDC-compliant metadata enables catalog interoperability and record-sharing. Extensive use of netCDF with the Climate and Forecast convention (identifying both metadata and a structured format) is shown to be a powerful strategy to promote discovery, interoperability, and re-use of the data. To integrate specialized data which are often obscure, quality control protocols are being developed to homogenize the QC and make these data easier to integrate. Data Assembly Centers have been established to integrate some specialized streams, including gliders, animal telemetry, and HF radar. Subsets of data that are ingested into the National Data Buoy Center are also routed to the Global Telecommunications System (GTS) of the World Meteorological Organization to assure wide international distribution. From the GTS, data are assimilated into nowcast and forecast models, fed to other observing systems, and used to support observation-based decision making such as forecasts, warnings, and alerts. For a few years, apps were a popular way to deliver these real-time data streams to phones and tablets; responsive and adaptive web sites are an emerging, flexible strategy to provide access to the regional coastal ocean observations.
Environmentally Compliant Coating Remover Evaluation
2012-08-30
Total of 491 products evaluated. Background: Many DoD depainting operations currently use environmentally compliant peroxide-assisted... benzyl alcohol strippers. These strippers have acceptable coating removal rates with minimal physical damage to metallic substrates. However, several... Coatings: Environmentally compliant benzyl alcohol product. Passed corrosion testing conducted by SMI in 2011. Laboratory Testing Scope
NASA Astrophysics Data System (ADS)
Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen
2015-08-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools allows us to provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.
NASA Astrophysics Data System (ADS)
Modolo, R.; Hess, S.; Génot, V.; Leclercq, L.; Leblanc, F.; Chaufray, J.-Y.; Weill, P.; Gangloff, M.; Fedorov, A.; Budnik, E.; Bouchemit, M.; Steckiewicz, M.; André, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.; Al-Ubaidi, T.; Khodachenko, M.; Brain, D.; Curry, S.; Jakosky, B.; Holmström, M.
2018-01-01
We present the Latmos Hybrid Simulation (LatHyS) database, which is dedicated to investigations of planetary plasma environments. Simulation results for several planetary objects (Mars, Mercury, Ganymede) are available in an online catalogue. The full description of the simulations and their results is compliant with a data model developed in the framework of the FP7 IMPEx project. The catalogue is interfaced with VO-visualization tools such as AMDA, 3DView, TOPCAT, CLweb or the IMPEx portal. Web services ensure the possibility of accessing and extracting simulated quantities/data. We illustrate the interoperability between the simulation database and VO-tools using a detailed science case that focuses on a three-dimensional representation of the solar wind interaction with the Martian upper atmosphere, combining MAVEN and Mars Express observations and simulation results.
Krisciunas, Gintas P.; McCulloch, Timothy M.; Lazarus, Cathy L.; Pauloski, Barbara R.; Meyer, Tanya K.; Graner, Darlene; Van Daele, Douglas J.; Silbergleit, Alice K.; Crujido, Lisa R.; Rybin, Denis; Doros, Gheorghe; Kotz, Tamar; Langmore, Susan E.
2016-01-01
Purpose: A 5-yr, 16-site randomized controlled trial enrolled 170 HNC survivors into active (estim + swallow exercise) or control (sham estim + swallowing exercise) arms. Primary analyses showed that estim did not enhance swallowing exercises. This secondary analysis determined if/how patient compliance impacted outcomes. Methods: A home program, performed 2×/day, 6 d/wk, for 12 wks, included stretches and 60 swallows paired with real or sham estim. Regular clinic visits ensured proper exercise execution, and detailed therapy checklists tracked patient compliance, which was defined by the mean number of sessions performed per week (0-12) over the 12-wk intervention period. "Compliant" was defined as performing 10-12 sessions/wk. Outcomes were change in PAS, HNCI, PSS, OPSE, and hyoid excursion. ANCOVA analyses determined if outcomes differed between real/sham and compliant/noncompliant groups after 12 wks of therapy. Results: Of the 170 patients enrolled, 153 had compliance data. The mean number of sessions performed was 8.57/wk (median = 10.25). Fifty-four percent of patients (n = 83) were considered "compliant". After 12 wks of therapy, compliant patients in the sham estim group realized significantly better PAS scores than compliant patients in the active estim group (p = 0.0074). When pooling all patients together, there were no significant differences in outcomes between compliant and non-compliant patients. Conclusions: The addition of estim to swallowing exercises resulted in worse swallowing outcomes than exercises alone, an effect more pronounced in compliant patients. Since neither compliant nor non-compliant patients benefitted from swallowing exercises, the proper dose and/or efficacy of swallowing exercises must also be questioned in this patient population. PMID:27848021
Compliance with gastric cancer guidelines is associated with improved outcomes.
Worhunsky, David J; Ma, Yifei; Zak, Yulia; Poultsides, George A; Norton, Jeffrey A; Rhoads, Kim F; Visser, Brendan C
2015-03-01
Limited data are available on the implementation and effectiveness of NCCN Clinical Practice Guidelines in Oncology (NCCN Guidelines) for Gastric Cancer. We sought to assess rates of compliance with NCCN Guidelines, specifically stage-specific therapy during the initial episode of care, and to determine its impact on outcomes. The California Cancer Registry was used to identify cases of gastric cancer from 2001 to 2006. Logistic regression and Cox proportional hazard models were used to predict guideline compliance and the adjusted hazard ratio for mortality. Patients with TNM staging or summary stage (SS) were also analyzed separately. Compliance with NCCN Guidelines occurred in just 45.5% of patients overall. Patients older than 55 years were less likely to receive guideline-compliant care, and compliance was associated with a median survival of 20 versus 7 months for noncompliant care (P<.001). Compliant care was also associated with a 55% decreased hazard of mortality (P<.001). Further analysis revealed that 50% of patients had complete TNM staging versus an SS, and TNM-staged patients were more likely to receive compliant care (odds ratio, 1.59; P<.001). TNM-staged patients receiving compliant care had a median survival of 25.3 months compared with 15.1 months for compliant SS patients. Compliance with NCCN Guidelines and stage-specific therapy at presentation for the treatment of patients with gastric cancer was poor, which was a significant finding given that compliant care was associated with a 55% reduction in the hazard of death. Additionally, patients with TNM-staged cancer were more likely to receive compliant care, perhaps a result of having received more intensive therapy. Combined with the improved survival among compliant TNM-staged patients, these differences have meaningful implications for health services research. Copyright © 2015 by the National Comprehensive Cancer Network.
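Survival contrasts like the 20- versus 7-month medians quoted above rest on Kaplan-Meier estimation. A minimal pure-Python version (toy data, not the registry's) illustrates the idea:

```python
from collections import defaultdict

def kaplan_meier(times, events):
    """Minimal Kaplan-Meier estimator. times: follow-up in months;
    events: 1 = death observed, 0 = censored. Returns (time, survival)
    pairs at each event time."""
    deaths, leaving = defaultdict(int), defaultdict(int)
    for t, e in zip(times, events):
        leaving[t] += 1
        deaths[t] += e
    at_risk, s, curve = len(times), 1.0, []
    for t in sorted(leaving):
        if deaths[t]:
            s *= 1.0 - deaths[t] / at_risk   # multiply in this time's survival fraction
            curve.append((t, s))
        at_risk -= leaving[t]                # deaths and censored both leave the risk set
    return curve

def median_survival(curve):
    """First time at which estimated survival drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

# Toy cohort, loosely echoing the compliant-care contrast in the abstract
compliant = kaplan_meier([18, 20, 25, 30], [1, 1, 1, 1])
```

The registry analysis additionally adjusts for covariates via Cox proportional hazards; this sketch covers only the unadjusted survival curve underlying the reported medians.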
Assessing the degradation of compliant electrodes for soft actuators.
Rosset, Samuel; de Saint-Aubin, Christine; Poulin, Alexandre; Shea, Herbert R
2017-10-01
We present an automated system to measure the degradation of compliant electrodes used in dielectric elastomer actuators (DEAs) over millions of cycles. Electrodes for DEAs generally experience biaxial linear strains of more than 10%. The decrease in electrode conductivity induced by this repeated fast mechanical deformation impacts the bandwidth of the actuator and its strain homogeneity. Changes in the electrode mechanical properties lead to reduced actuation strain. Rather than using an external actuator to periodically deform the electrodes, our measurement method consists of measuring the properties of an electrode in an expanding circle DEA. A programmable high voltage power supply drives the actuator with a square signal up to 1 kHz, periodically actuating the DEA, and thus stretching the electrodes. The DEA strain is monitored with a universal serial bus camera, while the resistance of the ground electrode is measured with a multimeter. The system can be used for any type of electrode. We validated the test setup by characterising a carbon black/silicone composite that we commonly use as compliant electrode. Although the composite is well-suited for tens of millions of cycles of actuation below 5%, we observe important degradation for higher deformations. When activated at a 20% radial strain, the electrodes suffer from important damage after a few thousand cycles, and an inhomogeneous actuation is observed, with the strain localised in a sub-region of the actuator only.
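The cycling protocol described above can be mimicked with a toy simulation in which electrode resistance drifts upward with accumulated actuation cycles, faster at larger strain. The growth law and constants below are invented for illustration, not fitted to the paper's measurements:

```python
def simulate_cycling(n_cycles, strain_pct, r0=10e3):
    """Toy stand-in for the automated test bench: electrode resistance
    drifts upward with accumulated actuation cycles, faster at larger
    strain. The degradation law and constants are invented for
    illustration, not fitted to the paper's data."""
    log, r = [], r0
    step = n_cycles // 10
    for cycle in range(1, n_cycles + 1):
        r *= 1.0 + 1e-6 * strain_pct ** 2   # per-cycle degradation (toy model)
        if cycle % step == 0:
            log.append((cycle, r))           # periodic resistance readout
    return log

gentle = simulate_cycling(10_000, strain_pct=5.0)
harsh = simulate_cycling(10_000, strain_pct=20.0)
```

Even this crude model reproduces the qualitative finding: degradation is mild at small strain and accelerates sharply at 20%, which is why the authors automate millions of cycles rather than extrapolating from short runs.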
Latest developments for the IAGOS database: Interoperability and metadata
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume
2014-05-01
In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format, including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format.
We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS data. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow for graphical comparison with model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
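CF-style metadata of the kind IGAS standardizes can be pictured as attribute dictionaries attached to a dataset and its variables. The sanity check below is a sketch only: the required-attribute set is deliberately minimal, the record is hypothetical, and real compliance checking needs a full CF checker:

```python
CF_REQUIRED_VAR_ATTRS = {"units", "long_name"}

def check_cf_minimal(dataset):
    """Tiny sanity check for CF-style metadata carried as plain dicts.
    A sketch only: real compliance checking needs a full CF checker,
    and the required-attribute set here is deliberately minimal."""
    problems = []
    if not dataset.get("attrs", {}).get("Conventions", "").startswith("CF-"):
        problems.append("missing or non-CF global Conventions attribute")
    for name, var in dataset.get("variables", {}).items():
        for attr in sorted(CF_REQUIRED_VAR_ATTRS - set(var.get("attrs", {}))):
            problems.append(f"{name}: missing attribute '{attr}'")
    return problems

# Hypothetical IAGOS-like ozone record (names and values invented)
ozone_flight = {
    "attrs": {"Conventions": "CF-1.6", "title": "ozone time series (toy)"},
    "variables": {
        "ozone": {"attrs": {"units": "1e-9", "long_name": "ozone mole fraction"},
                  "data": [42.1, 43.0]},
    },
}
```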
GSKY: A scalable distributed geospatial data server on the cloud
NASA Astrophysics Data System (ADS)
Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben
2017-04-01
Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information from different geospatial collections is in increasing demand in the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be performed using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY.
This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections, either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready access to the data via Web Map Services (WMS), Web Processing Services (WPS), or raw data arrays using Web Coverage Services (WCS). The presentation will show cases where we have used this new capability to provide a significant improvement over previous approaches.
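Access through OGC interfaces of the kind GSKY exposes comes down to constructing standard requests. A WMS 1.3.0 GetMap URL builder is sketched below; the endpoint and layer name are hypothetical, not GSKY's actual service:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png", time=None):
    """Build an OGC WMS 1.3.0 GetMap request URL. The endpoint and layer
    used in the example are hypothetical."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        # WMS 1.3.0 + EPSG:4326 uses lat/long axis order: minlat,minlon,maxlat,maxlon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    if time is not None:
        params["TIME"] = time                # time-enabled layers only
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/ows", "landsat_nbar",
                     (-44.0, 112.0, -10.0, 154.0), time="2016-01-01")
```

The TIME parameter is what makes such services suitable for interactive exploration of temporal stacks like satellite archives.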
OpenSearch technology for geospatial resources discovery
NASA Astrophysics Data System (ADS)
Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo
2010-05-01
In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around this same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation; it is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both mass media and the scientific community. This explosive growth in popularity of Web 2.0 technologies like OpenSearch, together with practical applications of Service Oriented Architecture (SOA), has resulted in increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA can be considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, and compose and use them according to their current needs. The great degree of similarity between SOA and Web 2.0 may be leading to a convergence of the two paradigms, although they also expose divergent elements, such as Web 2.0's support for human interaction as opposed to SOA's typical machine-to-machine interaction. In line with these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular by taking advantage of the OpenSearch technology.
A specific GI niche is represented by the OGC Catalog Service for Web (CSW), which is part of the OGC Web Services (OWS) specifications suite and provides a set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface that allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed that provides a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch-compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mapping are required: query mapping and response element mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax.
The GI-cat internal data model is based on the ISO 19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, in order to be presented they need to be translated from the GI-cat internal data model to the above-mentioned syndication formats; the mapping process is bidirectional. When GI-cat is used to access OpenSearch-compliant services, the CSW query must be mapped to the OpenSearch query, and the response elements must be translated according to the GI-cat internal data model. As a result of such extensions, GI-cat provides a user-friendly facade to the complex CSW interface, enabling it to be queried, for example, using a browser toolbar.
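OpenSearch interfaces such as the proposed GI-cat profile advertise URL templates with {parameter} placeholders that clients fill in. Filling such a template can be sketched as follows; the template URL is hypothetical and the optional-parameter handling is simplified relative to the full specification:

```python
import re
from urllib.parse import quote

def fill_opensearch_template(template, params):
    """Substitute {name} / {name?} placeholders in an OpenSearch URL
    template. Unfilled optional parameters (trailing '?') become empty;
    missing required ones raise. Simplified vs. the full spec."""
    def sub(match):
        key, optional = match.group(1), match.group(2)
        if key in params:
            return quote(str(params[key]), safe="")
        if optional:
            return ""
        raise KeyError(f"required template parameter missing: {key}")
    return re.sub(r"\{([\w:]+)(\??)\}", sub, template)

# Hypothetical GI-cat-style OpenSearch template
tpl = ("https://example.org/gi-cat/opensearch?q={searchTerms}"
       "&bbox={geo:box?}&start={time:start?}")
url = fill_opensearch_template(tpl, {"searchTerms": "sea surface temperature",
                                     "geo:box": "-10,40,0,50"})
```

The geo and time extensions ({geo:box}, {time:start}) are exactly the mechanism the change request proposes for spatial and temporal querying of OGC catalogues.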
Generalized constitutive equations for piezo-actuated compliant mechanism
NASA Astrophysics Data System (ADS)
Cao, Junyi; Ling, Mingxiang; Inman, Daniel J.; Lin, Jin
2016-09-01
This paper formulates analytical models to describe the static displacement and force interactions between generic serial-parallel compliant mechanisms and their loads by employing the matrix method. In keeping with the familiar piezoelectric constitutive equations, the generalized constitutive equations of a compliant mechanism represent the input-output displacement and force relations in the form of a generalized Hooke's law and as analytical functions of physical parameters. Also significantly, a new model of output displacement for a compliant mechanism interacting with piezo-stacks and elastic loads is deduced based on the generalized constitutive equations. Some original findings differing from the well-known constitutive performance of piezo-stacks are also given. The feasibility of the proposed models is confirmed by finite element analysis and by experiments under various elastic loads. The analytical models can be an insightful tool for predicting and optimizing the performance of a wide class of compliant mechanisms that simultaneously considers the influence of loads and piezo-stacks.
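The "generalized Hooke's law" form described above can be pictured as a partitioned compliance relation between input/output displacements and forces, analogous to the piezoelectric constitutive equations. The notation below is assumed for illustration, not the paper's exact symbols:

```latex
% Sketch (symbols assumed): input/output displacements u and forces f of
% a serial-parallel compliant mechanism related by a partitioned
% compliance matrix C, in analogy to the piezoelectric constitutive
% equations.
\begin{pmatrix} u_{\mathrm{out}} \\ u_{\mathrm{in}} \end{pmatrix}
=
\begin{pmatrix} C_{oo} & C_{oi} \\ C_{io} & C_{ii} \end{pmatrix}
\begin{pmatrix} f_{\mathrm{out}} \\ f_{\mathrm{in}} \end{pmatrix}
```

The off-diagonal blocks are what couple the piezo-stack drive to the elastic load, which is why load stiffness feeds back into the achievable output displacement.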
Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh
2014-01-01
This review describes the adaptation of the plant virus-based transient expression system magnICON(®) for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest using this system, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat Non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.
Flexural anchorage performance at diagonal crack locations.
DOT National Transportation Integrated Search
2010-12-01
Large numbers of reinforced concrete deck girder bridges that were constructed during the interstate system expansion of the 1950s have developed diagonal cracking in the stems. Though compliant with design codes when constructed, many of these bridg...
Performance through Deformation and Instability
NASA Astrophysics Data System (ADS)
Bertoldi, Katia
2015-03-01
Materials capable of undergoing large deformations like elastomers and gels are ubiquitous in daily life and nature. An exciting field of engineering is emerging that uses these compliant materials to design active devices, such as actuators, adaptive optical systems and self-regulating fluidics. Compliant structures may significantly change their architecture in response to diverse stimuli. When excessive deformation is applied, they may eventually become unstable. Traditionally, mechanical instabilities have been viewed as an inconvenience, with research focusing on how to avoid them. Here, I will demonstrate that these instabilities can be exploited to design materials with novel, switchable functionalities. The abrupt changes introduced into the architecture of soft materials by instabilities will be used to change their shape in a sudden, but controlled manner. Possible and exciting applications include materials with unusual properties such as negative Poisson's ratio, phononic crystals with tunable low-frequency acoustic band gaps and reversible encapsulation systems.
Pulsatile flow in a compliant stenosed asymmetric model
NASA Astrophysics Data System (ADS)
Usmani, Abdullah Y.; Muralidhar, K.
2016-12-01
Time-varying velocity field in an asymmetric constricted tube is experimentally studied using a two-dimensional particle image velocimetry (PIV) system. The geometry resembles a vascular disease that is characterized by arterial narrowing due to plaque deposition. The present study compares the nature of flow patterns in rigid and compliant asymmetric constricted tubes for a range of dimensionless parameters appearing in a human artery. A blood analogue fluid is employed along with a pump that mimics cardiac flow conditions. The peak Reynolds number range is Re = 300-800, while the Womersley number range considered in the experiments is Wo = 6-8. These values are based on the peak velocity in a straight rigid tube connected to the model, over a pulsation frequency range of 1.2-2.4 Hz. The medial-plane velocity distribution is used to investigate the nature of flow patterns. Temporal distributions of stream traces and hemodynamic factors, including wall shear stress (WSS), time-averaged WSS (TAWSS), and the oscillatory shear index (OSI), at important phases of the pulsation cycle are discussed. The flow patterns obtained from PIV are compared to a limited extent against numerical simulation. Results show that the region downstream of the constriction is characterized by a high-velocity jet at the throat, while a recirculation zone, attached to the wall, evolves in time. Compliant models reveal large flow disturbances upstream during the retrograde flow. Wall shear stress values are lower in a compliant model than in a rigid one. Cross-plane flow structures normal to the main flow direction are visible at select phases of the cycle. Positive values of the largest Lyapunov exponent are realized for wall movement and are indicative of chaotic motion transferred from the flow to the wall. These exponents increase with Reynolds number as well as compliance. Period doubling is observed in the wall displacement of highly compliant models, indicating possible triggering of hemodynamic events in a real artery that may cause fissure in plaque deposits.
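The hemodynamic indicators named above, TAWSS and OSI, have standard definitions in the literature. As an illustrative sketch (not the authors' code), they can be computed from a wall shear stress time series over one pulsation cycle; the sinusoidal test signals are assumptions chosen only to exercise the two limiting cases.

```python
import numpy as np

def _trapz(y, t):
    """Trapezoidal integral of samples y over times t."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def tawss_osi(tau, t):
    """Standard definitions over one cycle of duration T:
    TAWSS = (1/T) * int |tau| dt
    OSI   = 0.5 * (1 - |int tau dt| / int |tau| dt)."""
    T = t[-1] - t[0]
    signed = _trapz(tau, t)
    magnitude = _trapz(np.abs(tau), t)
    return magnitude / T, 0.5 * (1.0 - abs(signed) / magnitude)

# Two limiting cases: unidirectional shear gives OSI = 0, while purely
# oscillatory (sign-reversing) shear drives OSI toward its maximum of 0.5.
t = np.linspace(0.0, 1.0, 2001)
tawss_uni, osi_uni = tawss_osi(1.0 + 0.5 * np.sin(2 * np.pi * t), t)
tawss_osc, osi_osc = tawss_osi(np.sin(2 * np.pi * t), t)
```

High OSI with low TAWSS is the combination usually flagged as atherogenic, which is why both numbers are reported together.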
Polat, Gökhan; Karademir, Gökhan; Akalan, Ekin; Aşık, Mehmet; Erdil, Mehmet
2017-03-20
The aim of this study was to prospectively evaluate the compliance of our patients with a touchdown weight-bearing postoperative rehabilitation protocol (bearing no weight on the affected side, with only the plantar aspect of the foot touching the ground to maintain balance and protect the affected side from mechanical loading) after treatment of talar osteochondral lesion (TOL). Fourteen patients, who had been treated with arthroscopic debridement and microfracture, were followed prospectively. The patients were evaluated for weight-bearing compliance using a stationary gait analysis and feedback system on the postoperative first day and at the first, third, and sixth postoperative weeks. The mean visual analog scale (VAS) scores of the patients preoperatively and at the postoperative first day, first week, third week, and sixth week were 5.5, 5.9, 3.6, 0.9, and 0.4, respectively. The decrease in VAS scores was statistically significant (p < 0.0001). The first postoperative day revealed a mean transmitted weight of 4.08% ± 0.8 (one non-compliant patient). The mean value was 4.34% ± 0.8 at the first postoperative week (two non-compliant patients), 6.95% ± 2.3 at the third postoperative week (eight non-compliant patients), and 10.8% ± 4.8 at the sixth postoperative week (11 non-compliant patients). In the analysis of the data, we found a negative correlation between VAS scores and transmitted weight (Kendall's tau-b = -0.445, p = 0.0228). Although patients were able to learn and adjust to the touchdown weight-bearing gait protocol during the early postoperative period, most patients became non-compliant once their pain was relieved. To prevent this non-compliance, patients should be warned to obey the weight-bearing restrictions and should be called for a follow-up at the third postoperative week.
Van Assche, Kerlijn; Nebot Giralt, Ariadna; Caudron, Jean Michel; Schiavetti, Benedetta; Pouget, Corinne; Tsoumanis, Achilleas; Meessen, Bruno; Ravinetto, Raffaella
2018-01-01
The rapid globalisation of pharmaceutical production and distribution has not been supported by harmonisation of regulatory systems worldwide. Thus, the supply systems in low-income and middle-income countries (LMICs) remain exposed to the risk of poor-quality medicines. To contribute to estimating this risk in the private sector in LMICs, we assessed the quality assurance systems of a convenience sample of local private pharmaceutical distributors. This descriptive study uses secondary data derived from the audits conducted by the QUAMED group at 60 local private pharmaceutical distributors in 13 LMICs. We assessed the distributors' compliance with good distribution practices (GDP), general quality requirements (GQR) and cold chain management (CCM), based on an evaluation tool inspired by the WHO guidelines 'Model Quality Assurance System (MQAS) for procurement agencies'. Descriptive statistics describe the compliance for the whole sample, for distributors in sub-Saharan Africa (SSA) versus those in non-SSA, and for those in low-income countries (LICs) versus middle-income countries (MICs). Local private pharmaceutical distributors in our sample were non-compliant, very-low-compliant or low-compliant for GQR (70%), GDP (60%) and CCM (41%). Only 7/60 showed good to full compliance for at least two criteria. Observed compliance varies by geographical region and by income group: maximum values are higher in non-SSA versus SSA and in MICs versus LICs, while minimum values are the same across the different groups. The poor compliance with WHO quality standards observed in our sample indicates a concrete risk that patients in LMICs are exposed to poor-quality or degraded medicines. Significant investments are needed to strengthen regulatory supervision, including of private pharmaceutical distributors. An adapted standardised evaluation tool inspired by the WHO MQAS would be helpful for self-evaluation, audit and inspection purposes.
Connections for solid oxide fuel cells
Collie, Jeffrey C.
1999-01-01
A connection for fuel cell assemblies is disclosed. The connection includes compliant members connected to individual fuel cells and a rigid member connected to the compliant members. Adjacent bundles or modules of fuel cells are connected together by mechanically joining their rigid members. The compliant/rigid connection permits construction of generator fuel cell stacks from basic modular groups of cells of any desired size. The connections can be made prior to installation of the fuel cells in a generator, thereby eliminating the need for in-situ completion of the connections. In addition to allowing pre-fabrication, the compliant/rigid connections also simplify removal and replacement of sections of a generator fuel cell stack.
NASA Astrophysics Data System (ADS)
Boiko, Andrey V.; Kulik, Victor M.; Chun, Ho-Hwan; Lee, Inwon
2011-12-01
Skin frictional drag reduction efficiency of a "stiff" compliant coating was investigated in a wind tunnel experiment. Flat-plate compliant coating inserts were installed in a wind tunnel, and measurements of skin frictional drag and velocity field were carried out. Compliant coatings with varying viscoelastic properties had been prepared using different compositions. To optimize the coating thickness, the most important design parameter, the dynamic viscoelastic properties were determined experimentally. The aging of the materials (variation of their properties) over half a year was documented as well. The design procedure proposed by Kulik et al. (2008) was applied to obtain an optimal value for the coating thickness. Along with the drag measurement using the strain balance, velocity and pressure were measured for different coatings. The compliant coatings with thickness h = 7 mm achieved 4~5% drag reduction within a velocity range of 30~40 m/s. The drag reduction mechanism of attenuation of turbulence velocity fluctuations by the compliant coating was demonstrated. It is envisioned that a larger drag reduction effect is obtainable at higher flow velocities, as for high-speed trains and subsonic aircraft.
NASA Astrophysics Data System (ADS)
Jiang, Yao; Li, Tie-Min; Wang, Li-Ping
2015-09-01
This paper investigates the stiffness modeling of compliant parallel mechanisms (CPMs) based on the matrix method. First, the general compliance matrix of a serial flexure chain is derived. The stiffness modeling of CPMs is next discussed in detail, considering the relative positions of the applied load and the selected displacement output point. The derived stiffness models have simple and explicit forms, and the input, output, and coupling stiffness matrices of a CPM can easily be obtained. The proposed analytical model is applied to the stiffness modeling and performance analysis of an XY parallel compliant stage with input and output decoupling characteristics. Then, the key geometrical parameters of the stage are optimized to obtain the minimum input decoupling degree. Finally, a prototype of the compliant stage is developed and its input axial stiffness, coupling characteristics, positioning resolution, and circular contouring performance are tested. The results demonstrate the excellent performance of the compliant stage and verify the effectiveness of the proposed theoretical model. The general stiffness models provided in this paper will be helpful for performance analysis, especially in determining coupling characteristics, and for structural optimization of CPMs.
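The matrix method referred to above composes member compliances through rigid-body transforms: for a serial flexure chain, the end compliance is C = Σ S_i C_i S_iᵀ, where S_i transfers the twist of flexure i to the output frame. A minimal planar sketch follows, assuming Euler-Bernoulli beam flexures and a twist ordering (x, y, θ); it illustrates the general technique, not the paper's specific CPM model.

```python
import numpy as np

def beam_tip_compliance(L, E, A, I):
    """Planar tip compliance of a slender Euler-Bernoulli beam flexure,
    with x along the beam axis and twist/wrench ordered as (x, y, theta)."""
    return np.array([
        [L / (E * A), 0.0,                0.0],
        [0.0,         L**3 / (3 * E * I), L**2 / (2 * E * I)],
        [0.0,         L**2 / (2 * E * I), L / (E * I)],
    ])

def twist_transfer(dx, dy):
    """Rigid-body transfer of a planar twist from a flexure's tip frame
    to the chain's output frame, offset by (dx, dy)."""
    return np.array([[1.0, 0.0, -dy],
                     [0.0, 1.0,  dx],
                     [0.0, 0.0,  1.0]])

def serial_chain_compliance(flexures):
    """Matrix-method compliance of a serial flexure chain:
    C = sum_i S_i C_i S_i^T."""
    C = np.zeros((3, 3))
    for C_i, (dx, dy) in flexures:
        S = twist_transfer(dx, dy)
        C += S @ C_i @ S.T
    return C

# Sanity check: two collinear beams of length L in series (the root beam's
# tip sits at offset (L, 0) from the output) must reproduce one beam of 2L.
E, A, I, L = 70e9, 1.0e-5, 1.0e-11, 0.02
C_chain = serial_chain_compliance([
    (beam_tip_compliance(L, E, A, I), (L, 0.0)),
    (beam_tip_compliance(L, E, A, I), (0.0, 0.0)),
])
C_single = beam_tip_compliance(2 * L, E, A, I)
```

Inverting the assembled compliance gives the stiffness matrix; for a parallel mechanism the member stiffnesses, not compliances, are summed after the analogous wrench transforms.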
A new hybrid piezo-actuated compliant mechanism with self-tuned flexure arm
NASA Astrophysics Data System (ADS)
Ling, Mingxiang; Cao, Junyi
2017-04-01
Recent interest and demand for developing video-rate atomic force microscopes, high-throughput probe-based nanofabrication, and high-frequency vibration generators for assisted machining are increasingly posing new challenges for designing high-bandwidth and large-range piezo-actuated compliant mechanisms. Previous studies mainly focused on making the trade-off between natural frequency and motion range by designing a proper topology. Differing from these works, this paper attempts to break the deadlock by employing both piezo-stacks and piezoelectric patches to actuate compliant mechanisms. In this method, piezo-stacks provide an actuating force in the traditional way, while piezoelectric patches are bonded on the surfaces of the flexure arms of the compliant mechanism. These `active' laminae are used to further actuate the hosting flexural beam by inducing strains on the interface, thereby applying additional bending moments to the flexural arms, which enlarge the output displacement of the compliant mechanism without sacrificing natural frequency. An analytical formulation is established to illustrate the new driving principle and the compound static behaviour of a specific hybrid piezo-actuated multistage compliant mechanism. An initial prototype was also manufactured, and experimental testing was conducted to verify the feasibility of the method.
A Services-Oriented Architecture for Water Observations Data
NASA Astrophysics Data System (ADS)
Maidment, D. R.; Zaslavsky, I.; Valentine, D.; Tarboton, D. G.; Whitenack, T.; Whiteaker, T.; Hooper, R.; Kirschtel, D.
2009-04-01
Water observations data are time series of measurements made at point locations of water level, flow, and quality and corresponding data for climatic observations at point locations such as gaged precipitation and weather variables. A services-oriented architecture has been built for such information for the United States that has three components: hydrologic information servers, hydrologic information clients, and a centralized metadata cataloging system. These are connected using web services for observations data and metadata defined by an XML-based language called WaterML. A Hydrologic Information Server can be built by storing observations data in a relational database schema in the CUAHSI Observations Data Model, in which case web services access to the data and metadata is automatically provided by query functions for WaterML that are wrapped around the relational database within a web server. A Hydrologic Information Server can also be constructed by custom-programming an interface to an existing water agency web site so that it responds to the same queries by producing data in WaterML, as do the CUAHSI Observations Data Model based servers. A Hydrologic Information Client is one that can interpret and ingest WaterML metadata and data. We have two client applications for Excel and ArcGIS and have shown how WaterML web services can be ingested into programming environments such as Matlab and Visual Basic. HIS Central, maintained at the San Diego Supercomputer Center, is a repository of observational metadata for WaterML web services which presently indexes 342 million data values measured at 1.75 million locations. This is the largest catalog of water observational data for the United States presently in existence.
As more observation networks join what we term the "CUAHSI Water Data Federation", and the system accommodates a growing number of sites, measured parameters, applications, and users, rapid and reliable access to large heterogeneous hydrologic data repositories becomes critical. The CUAHSI HIS solution to the scalability and heterogeneity challenges has several components. Structural differences across the data repositories are addressed by building a standard services foundation for the exchange of hydrologic data, as derived from a common information model for observational data measured at stationary points and its implementation as a relational schema (ODM) and an XML schema (WaterML). Semantic heterogeneity is managed by mapping water quantity, water quality, and other parameters collected by government agencies and academic projects to a common ontology. The WaterML-compliant web services are indexed in a community services registry called HIS Central (hiscentral.cuahsi.org). Once a web service is registered in HIS Central, its metadata (site and variable characteristics, period of record for each variable at each site, etc.) are harvested and appended to the central catalog. The catalog is further updated as the service publisher associates the variables in the published service with ontology concepts. After this, the newly published service becomes available for spatial and semantics-based queries from online and desktop client applications developed by the project. Hydrologic Information Server software is now deployed at more than a dozen locations in the United States and Australia. To provide rapid access to data summaries, in particular for several nation-wide data repositories including EPA STORET, USGS NWIS, and USDA SNOTEL, we convert the observation data catalogs and databases with harvested data values into special representations that support high-performance analysis and visualization.
The construction of OLAP (Online Analytical Processing) cubes, often called data cubes, is an approach to organizing and querying large multi-dimensional data collections. We have applied the OLAP techniques, as implemented in Microsoft SQL Server 2005/2008, to the analysis of the catalogs from several agencies. OLAP analysis results reflect geography and history of observation data availability from USGS NWIS, EPA STORET, and USDA SNOTEL repositories, and spatial and temporal dynamics of the available measurements for several key nutrient-related parameters. Our experience developing the CUAHSI HIS cyberinfrastructure demonstrated that efficient integration of hydrologic observations from multiple government and academic sources requires a range of technical approaches focused on managing different components of data heterogeneity and system scalability. While this submission addresses technical aspects of developing a national-scale information system for hydrologic observations, the challenges of explicating shared semantics of hydrologic observations and building a community of HIS users and developers remain critical in constructing a nation-wide federation of water data services.
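WaterML-compliant services return observational time series as XML that client applications must ingest. A minimal sketch of such client-side parsing follows, using an invented WaterML-style fragment; the element names are only loosely modeled on WaterML 1.x and are not an exact excerpt of the schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical WaterML-style response fragment (illustrative, not the
# actual schema): one time series with site, variable, and two values.
DOC = """\
<timeSeriesResponse>
  <timeSeries>
    <sourceInfo><siteName>Example Creek</siteName></sourceInfo>
    <variable><variableName>Discharge</variableName><unit>cfs</unit></variable>
    <values>
      <value dateTime="2009-01-01T00:00:00">12.4</value>
      <value dateTime="2009-01-01T01:00:00">13.1</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

def parse_series(xml_text):
    """Extract (site, variable, values) records from a WaterML-style reply."""
    root = ET.fromstring(xml_text)
    series = []
    for ts in root.iter("timeSeries"):
        site = ts.findtext("sourceInfo/siteName")
        var = ts.findtext("variable/variableName")
        vals = [(v.get("dateTime"), float(v.text)) for v in ts.iter("value")]
        series.append({"site": site, "variable": var, "values": vals})
    return series
```

A real client would fetch the XML from a registered service endpoint over HTTP and would also read units and the period of record from the metadata; the parsing step itself is no more complicated than this.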
Semantic Overlays in Educational Content Networks--The hylOs Approach
ERIC Educational Resources Information Center
Engelhardt, Michael; Hildebrand, Arne; Lange, Dagmar; Schmidt, Thomas C.
2006-01-01
Purpose: The paper aims to introduce an educational content management system, Hypermedia Learning Objects System (hylOs), which is fully compliant with the IEEE LOM eLearning object metadata standard. Enabled through an advanced authoring toolset, hylOs allows the definition of instructional overlays of a given eLearning object mesh.…
Summary of International Border Crossings Roundtable Meeting Held in Buffalo, New York, June 7, 1993
DOT National Transportation Integrated Search
1996-11-30
The purpose of this plan is to establish a formal set of guidelines and activities to be adhered to and performed by JHU/APL and the developer to ensure that the SAFER System has been tested successfully and is fully compliant with the SAFER System r...
Many States unprepared for 'Y2K' computer woes.
1998-12-25
Most computer systems that States use to process health benefits are not Y2K compliant, which could jeopardize Medicaid application processing. This could cause recipients to lose benefits or experience delays in payments. Only seven states have reported that their systems are ready for the year 2000. Contact information is provided.
48 CFR 352.234-3 - Full earned value management system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Federal Agency (CFA) as being compliant with the guidelines in ANSI/EIA Standard-748 (current version at... and accepted by the CFA at the time of award, see paragraph (b) of this clause. The Contractor shall..., the Contractor's EVM system has not been validated and accepted by the CFA as complying with EVMS...
40 CFR 63.343 - Compliance provisions.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (1) Composite mesh-pad systems. (i) During the initial performance test, the owner or operator of an... limitations in § 63.342 through the use of a composite mesh-pad system shall determine the outlet chromium... performance test and accept ±2 inches of water column from this value as the compliant range. (ii) On and...
40 CFR 63.343 - Compliance provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... (1) Composite mesh-pad systems. (i) During the initial performance test, the owner or operator of an... limitations in § 63.342 through the use of a composite mesh-pad system shall determine the outlet chromium... performance test and accept ±2 inches of water column from this value as the compliant range. (ii) On and...
SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation
NASA Technical Reports Server (NTRS)
Suarez, Vicente J.; Lewandowski, Edward J.; Callahan, John
2006-01-01
The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical RPS launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system, thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm the analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors, and heat sources was designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, the test article modes, frequencies, and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification-level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.
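The benefit of interface compliance described above can be illustrated with the textbook single-degree-of-freedom transmissibility formula: lowering the mount's natural frequency pushes the excitation into the isolation region (frequency ratio above √2) and reduces the transmitted response. The frequencies and damping ratio below are assumptions for illustration, not SRG110 values.

```python
import math

def transmissibility(f_exc, f_n, zeta):
    """Classical SDOF base-excitation transmissibility:
    T = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2)), r = f_exc/f_n."""
    r = f_exc / f_n
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

f_exc = 100.0  # Hz, a representative random-vibration component (assumed)
T_stiff = transmissibility(f_exc, f_n=80.0, zeta=0.05)  # hard-mounted case
T_soft = transmissibility(f_exc, f_n=30.0, zeta=0.05)   # compliant interface
```

With these assumed numbers the hard mount sits near resonance and amplifies the input, while the compliant interface attenuates it by an order of magnitude, which is the qualitative effect the SRG110 compliant-interface testing confirmed.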
Telerobotic controller development
NASA Technical Reports Server (NTRS)
Otaguro, W. S.; Kesler, L. O.; Land, Ken; Rhoades, Don
1987-01-01
To meet the needs and growth of NASA's space station, a modular and generic approach to robotic control was developed that provides near-term implementation at low development cost and the capability to grow into more autonomous systems. The method uses a vision-based robotic controller and a compliant hand integrated with the Remote Manipulator System arm on the Orbiter. A description of the hardware and its system integration is presented.
Flexural anchorage performance at diagonal crack locations : final report.
DOT National Transportation Integrated Search
2010-12-01
Large numbers of reinforced concrete deck girder bridges that were constructed during the interstate system expansion of the 1950s have developed diagonal cracking in the stems. Though compliant with design codes when constructed, many of these bridg...
Grandfathered, Grandmothered, And ACA-Compliant Health Plans Have Equivalent Premiums.
Whitmore, Heidi; Gabel, Jon R; Satorius, Jennifer L; Green, Matthew
2017-02-01
Many small employers offer employees health plans that are not fully compliant with Affordable Care Act (ACA) provisions such as covering preventive services without cost sharing. These "grandfathered" and "grandmothered" plans accounted for about 65 percent of enrollment in the small-group market in 2014. Premium costs for these and ACA-compliant plans were equivalent. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
Lucido, J. M.
2013-12-01
Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. 
Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards Organization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal and Water Quality Portal are three examples of best practices when implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
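The vocabulary-service approach used by the Water Quality Portal can be sketched as a simple lookup that tags each provider record with a shared concept, so that one semantic query spans heterogeneous repositories. The provider names and parameter codes below are illustrative assumptions, not an exact excerpt of the USGS/USEPA mapping.

```python
# Hypothetical local-vocabulary-to-common-concept mapping (illustrative).
ONTOLOGY_MAP = {
    ("usgs_nwis", "00060"): "streamflow",
    ("epa_storet", "FLOW"): "streamflow",
    ("usgs_nwis", "00010"): "water_temperature",
}

def harmonize(records):
    """Attach a shared ontology concept to each (provider, code, value)
    record; unmapped parameters are flagged rather than dropped."""
    out = []
    for provider, code, value in records:
        concept = ONTOLOGY_MAP.get((provider, code), "unmapped")
        out.append({"provider": provider, "concept": concept, "value": value})
    return out
```

In the deployed portals this table is served by a vocabulary service and curated by the data publishers, but the mediation step itself reduces to exactly this kind of lookup.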
VO for Education: Archive Prototype
NASA Astrophysics Data System (ADS)
Ramella, M.; Iafrate, G.; De Marco, M.; Molinaro, M.; Knapic, C.; Smareglia, R.; Cepparo, F.
2014-05-01
The number of remote control telescopes dedicated to education is increasing in many countries, leading to correspondingly larger amounts of stored educational data that are usually available only to local observers. Here we present the project for a new infrastructure that will allow teachers using educational telescopes to archive their data and easily publish them within the Virtual Observatory (VO), avoiding the complexity of professional tools. Students and teachers anywhere will be able to access these data, with obvious benefits for the realization of larger-scale collaborative projects. Educational VO data will also be an important resource for teachers without direct access to any educational telescope. We will use the educational telescope at our observatory in Trieste as a prototype for the future VO educational data archive resource. The publishing infrastructure will include: user authentication, content and curation validation, data validation and ingestion, and VO-compliant resource generation. All of these parts will be performed by means of server-side applications accessible through a web graphical user interface (web GUI). Apart from user registration, which will be validated by a natural person responsible for the archive (after having verified the reliability of the user and inspected one or more test files), all subsequent steps will be automated. This means that at the very first data submission through the web GUI, a complete resource including the archive and the published VO service will be generated, ready to be registered with the VO. The effort required of the registered user will consist only of providing a self-description at the registration step and submitting the data he or she selects for publishing after each observation session.
The infrastructure will be file-format independent, and the underlying data model will use a minimal set of standard VO keywords, some of which will be specific to outreach and education, possibly including VO field identification (astronomy, planetary science, solar physics). The VO published resource description will be specified so as to allow selective access to educational data by VO-aware tools, differentiating them from professional data while treating them with the same procedures, protocols and tools. The whole system will be flexible and scalable, with the objective of leaving as little work as possible to humans.
NASA Astrophysics Data System (ADS)
Dumoulin, J.; Averty, R.
2012-04-01
One of the objectives of the ISTIMES project is to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, the uncooled infrared camera is a promising technique due to its dissemination potential, given its relatively low cost on the market. Infrared thermography, when used in quantitative mode outside laboratory conditions rather than in qualitative mode (vision applied to survey), requires thermal radiative corrections to be applied to the raw data in real time to account for the evolving natural environment. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was thus studied and developed with low-cost infrared cameras available on the market. In the system developed, an infrared camera is coupled with other sensors to feed simplified radiative models running, in real time, on the GPU of a small PC. The system uses a Fast Ethernet camera FLIR A320 [1] coupled with a VAISALA WXT520 [2] weather station and a light GPS unit [3] for positioning and dating. It can be used with other Ethernet cameras (i.e. visible ones) but requires access to the measured data at raw level. In the present study, this was made possible thanks to a specific agreement signed with the FLIR Company. The prototype system is implemented on a low-cost small computer that integrates a GPU card to allow real-time parallel computing [4] of a simplified radiometric [5] heat balance using information measured by the weather station. An HMI was developed under Linux using open-source software and complementary pieces of software developed at IFSTTAR.
This new HMI, called "IrLaW", has various functionalities that make it suitable for long-term monitoring on a real site. It can be remotely controlled in wired or wireless communication mode, depending on the context of measurement and the degree of accessibility of the system when it is running on site. Finally, thanks to the development of a high-level library and the deployment of a daemon, the measurement system was tuned to be compatible with OGC standards. Complementary functionalities were also developed to allow the system to self-declare to 52North; for that, a specific plugin was developed to be installed beforehand at the 52North level. Data are also accessible by tasking the system when required, for instance by using the web portal developed in the ISTIMES framework. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.
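The simplified radiometric heat balance mentioned above typically separates object emission, reflected ambient radiation, and atmospheric emission. Below is a toy graybody, total-flux version of that balance and its inversion; it is an assumption for illustration only (real camera processing works band-limited through a calibration law, and this is not the IrLaW model).

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T):
    """Total hemispherical emissive power of a blackbody at T kelvin."""
    return SIGMA * T ** 4

def object_temperature(W_meas, emissivity, tau_atm, T_refl, T_atm):
    """Invert the simplified balance
    W_meas = eps*tau*W(T_obj) + (1-eps)*tau*W(T_refl) + (1-tau)*W(T_atm)
    for the object temperature (graybody, total-flux toy model)."""
    W_obj = (W_meas
             - (1.0 - emissivity) * tau_atm * blackbody_flux(T_refl)
             - (1.0 - tau_atm) * blackbody_flux(T_atm)) / (emissivity * tau_atm)
    return (W_obj / SIGMA) ** 0.25

# Round trip: synthesize a measurement for a 310 K object, then recover it.
eps, tau = 0.95, 0.98
W = (eps * tau * blackbody_flux(310.0)
     + (1 - eps) * tau * blackbody_flux(293.0)
     + (1 - tau) * blackbody_flux(290.0))
T_recovered = object_temperature(W, eps, tau, 293.0, 290.0)
```

The weather-station coupling in the prototype exists precisely to supply the ambient and atmospheric terms of such a balance as they drift over a long monitoring campaign.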
NASA Astrophysics Data System (ADS)
Boettcher, M. A.; Butt, B. M.; Klinkner, S.
2016-10-01
A major concern of a university satellite mission is to download the payload and the telemetry data from the satellite. While ground station antennas are in general easy and affordable to procure, the receiving unit is most certainly not. The flexible, low-cost software-defined radio (SDR) transceiver "BladeRF" is used to receive the QPSK-modulated, CCSDS-compliant coded data of a satellite in the amateur (HAM) radio S-band. The control software is based on the open-source program GNU Radio, which is also used to perform CCSDS post-processing of the binary bit stream. The test results show good performance of the receiving system.
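Downstream of the SDR front end, the hard-decision demapping of QPSK symbols to bits can be sketched in plain Python. This is a hedged illustration of what a GNU Radio receive chain does internally before CCSDS decoding; the Gray mapping convention used here (I sign gives the first bit, Q sign the second) is an assumption, not necessarily the convention of the satellite described.

```python
# Map complex baseband QPSK symbols to bit pairs by constellation quadrant.

def qpsk_demap(symbols):
    """Hard-decision demapping: each symbol yields two bits."""
    bits = []
    for s in symbols:
        bits.append(0 if s.real > 0 else 1)  # in-phase sign
        bits.append(0 if s.imag > 0 else 1)  # quadrature sign
    return bits

# Four ideal constellation points, one per quadrant:
rx = [1+1j, -1+1j, -1-1j, 1-1j]
# qpsk_demap(rx) -> [0, 0, 1, 0, 1, 1, 0, 1]
```

A real receiver would precede this step with carrier and timing recovery; GNU Radio provides those blocks ready-made.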
A strain-isolation design for stretchable electronics
NASA Astrophysics Data System (ADS)
Wu, Jian; Li, Ming; Chen, Wei-Qiu; Kim, Dae-Hyeong; Kim, Yun-Soung; Huang, Yong-Gang; Hwang, Keh-Chih; Kang, Zhan; Rogers, John A.
2010-12-01
Stretchable electronics represents a direction of recent development in next-generation semiconductor devices. Such systems have the potential to offer the performance of conventional wafer-based technologies, but they can be stretched like a rubber band, twisted like a rope, bent over a pencil, and folded like a piece of paper. Isolating the active devices from strains associated with such deformations is an important aspect of design. One strategy involves the shielding of the electronics from deformation of the substrate through insertion of a compliant adhesive layer. This paper establishes a simple, analytical model and validates the results by the finite element method. The results show that a relatively thick, compliant adhesive is effective to reduce the strain in the electronics, as is a relatively short film.
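As a rough illustration of why a compliant adhesive isolates the film, a generic Cox-type shear-lag estimate can be computed; this is not the analytical model established in the paper, and all moduli and dimensions below are assumptions.

```python
# In shear-lag theory, substrate strain is transferred into a stiff film
# over a characteristic length l_c = sqrt(E_f * t_f * t_a / G_a):
# films much shorter than l_c stay nearly strain-free, and a thicker or
# softer adhesive (larger t_a, smaller G_a) increases l_c.
import math

def transfer_length(E_film, t_film, G_adh, t_adh):
    """Characteristic strain-transfer length (m) for a film on an adhesive."""
    return math.sqrt(E_film * t_film * t_adh / G_adh)

# Hypothetical 1-um silicon film (~130 GPa) on a 100-um adhesive layer:
lc_soft = transfer_length(130e9, 1e-6, 100e3, 100e-6)   # soft 100-kPa adhesive
lc_stiff = transfer_length(130e9, 1e-6, 10e6, 100e-6)   # 100x stiffer adhesive
# lc_soft is 10x larger: the softer adhesive isolates the film better.
```

This reproduces the paper's qualitative conclusions: a thick, compliant adhesive and a short film both reduce the strain reaching the electronics.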
Enhanced Management of and Access to Hurricane Sandy Ocean and Coastal Mapping Data
NASA Astrophysics Data System (ADS)
Eakins, B.; Neufeld, D.; Varner, J. D.; McLean, S. J.
2014-12-01
NOAA's National Geophysical Data Center (NGDC) has significantly improved the discovery and delivery of its geophysical data holdings, initially targeting ocean and coastal mapping (OCM) data in the U.S. coastal region impacted by Hurricane Sandy in 2012. We have developed a browser-based, interactive interface that permits users to refine their initial map-driven data-type choices prior to bulk download (e.g., by selecting individual surveys), including the ability to choose ancillary files, such as reports or derived products. Initial OCM data types now available in a U.S. East Coast map viewer, as well as underlying web services, include: NOS hydrographic soundings and multibeam sonar bathymetry. Future releases will include trackline geophysics, airborne topographic and bathymetric-topographic lidar, bottom sample descriptions, and digital elevation models. This effort also includes working collaboratively with other NOAA offices and partners to develop automated methods to receive and verify data, stage data for archive, and notify data providers when ingest and archive are completed. We have also developed improved metadata tools to parse XML and auto-populate OCM data catalogs, support the web-based creation and editing of ISO-compliant metadata records, and register metadata in appropriate data portals. This effort supports a variety of NOAA mission requirements, from safe navigation to coastal flood forecasting and habitat characterization.
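The ISO-compliant metadata records mentioned above can be consumed programmatically with the standard library alone. The sketch below extracts a dataset title from a minimal, synthetic ISO 19115/19139 record; the gmd/gco namespace URIs are the published ISO TC211 ones, but the record itself (including the survey name) is hypothetical.

```python
# Parse a minimal ISO 19139 metadata record and pull out the citation title.
import xml.etree.ElementTree as ET

NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
}

record = """<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                             xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:identificationInfo><gmd:MD_DataIdentification><gmd:citation>
    <gmd:CI_Citation><gmd:title>
      <gco:CharacterString>NOS Hydrographic Survey H12345</gco:CharacterString>
    </gmd:title></gmd:CI_Citation>
  </gmd:citation></gmd:MD_DataIdentification></gmd:identificationInfo>
</gmd:MD_Metadata>"""

root = ET.fromstring(record)
title = root.find(".//gmd:title/gco:CharacterString", NS).text
```

A catalog auto-population tool would run this kind of extraction over every record it harvests and index the resulting fields.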
xiSPEC: web-based visualization, analysis and sharing of proteomics data.
Kolbowski, Lars; Combe, Colin; Rappsilber, Juri
2018-05-08
We present xiSPEC, a standard compliant, next-generation web-based spectrum viewer for visualizing, analyzing and sharing mass spectrometry data. Peptide-spectrum matches from standard proteomics and cross-linking experiments are supported. xiSPEC is to date the only browser-based tool supporting the standardized file formats mzML and mzIdentML defined by the proteomics standards initiative. Users can either upload data directly or select files from the PRIDE data repository as input. xiSPEC allows users to save and share their datasets publicly or password protected for providing access to collaborators or readers and reviewers of manuscripts. The identification table features advanced interaction controls and spectra are presented in three interconnected views: (i) annotated mass spectrum, (ii) peptide sequence fragmentation key and (iii) quality control error plots of matched fragments. Highlighting or selecting data points in any view is represented in all other views. Views are interactive scalable vector graphic elements, which can be exported, e.g. for use in publication. xiSPEC allows for re-annotation of spectra for easy hypothesis testing by modifying input data. xiSPEC is freely accessible at http://spectrumviewer.org and the source code is openly available on https://github.com/Rappsilber-Laboratory/xiSPEC.
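The fragment matching that underlies the annotated spectrum and error-plot views can be sketched roughly as follows. This is illustrative only, not xiSPEC's actual code; the peak list, fragment names and the 0.02 m/z tolerance are all assumptions.

```python
# Match theoretical fragment m/z values against observed peaks within a
# tolerance, recording the mass error for each match (the quantity shown
# in a QC error plot).

def match_fragments(peaks, fragments, tol=0.02):
    """Return (fragment_name, observed_mz, error) for every theoretical
    fragment lying within `tol` of an observed peak."""
    matches = []
    for name, mz in fragments.items():
        for obs in peaks:
            if abs(obs - mz) <= tol:
                matches.append((name, obs, obs - mz))
    return matches

peaks = [175.119, 304.160, 401.220]
fragments = {"y1": 175.118, "y2": 304.161, "b3": 350.100}
# y1 and y2 match within tolerance; b3 has no nearby peak.
```

Re-annotation after editing the input data, as xiSPEC supports, amounts to re-running this matching with the modified peptide or spectrum.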
Compliant mechanism road bicycle brake: a rigid-body replacement case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsen, Brian M; Howell, Larry L; Magleby, Spencer P
2011-01-19
The design of high-performance bicycle brakes is complicated by the competing design objectives of increased performance and low weight. But this challenge also provides a good case study to demonstrate the design of compliant mechanisms to replace current rigid-link mechanisms. This paper briefly reviews current road brake designs, demonstrates the use of rigid-body replacement synthesis to design a compliant mechanism, and illustrates the combination of compliant mechanism design tools. The resulting concept was generated from the modified dual-pivot brake design and is a partially compliant mechanism where one pin has the dual role of a joint and a mounting pin. The pseudo-rigid-body model, finite element analysis, and optimization algorithms are used to generate design dimensions, and designs are considered for both titanium and E-glass flexures. The resulting design has the potential of reducing the part count and overall weight while maintaining a performance similar to the benchmark.
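The pseudo-rigid-body model mentioned above replaces a cantilever flexure with a rigid link pinned at a characteristic pivot and loaded by a torsional spring. A minimal sketch is given below, assuming the commonly tabulated PRBM constants (characteristic radius factor gamma ≈ 0.85, stiffness coefficient K_Θ ≈ 2.65 for an end-force-loaded cantilever); the flexure dimensions are hypothetical, not the brake's actual design.

```python
# PRBM torsional spring constant for a rectangular cantilever flexure:
#   K = gamma * K_theta * E * I / L

def prbm_spring_stiffness(E, width, thickness, length,
                          gamma=0.85, k_theta=2.65):
    """Equivalent torsional spring constant (N*m/rad) of the flexure."""
    I = width * thickness ** 3 / 12.0  # second moment, rectangular section
    return gamma * k_theta * E * I / length

# Hypothetical flexure, 10 mm wide, 0.5 mm thick, 30 mm long:
K_ti = prbm_spring_stiffness(114e9, 0.010, 0.0005, 0.030)    # titanium
K_glass = prbm_spring_stiffness(72e9, 0.010, 0.0005, 0.030)  # E-glass
# Stiffness scales with E, so the E-glass flexure is proportionally softer.
```

In a design loop, such PRBM stiffnesses seed the optimization, and the finite element analysis then verifies the chosen dimensions, as the paper describes.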
Yang, Jiacheng; Durbin, Thomas D; Jiang, Yu; Tange, Takeshi; Karavalakis, Georgios; Cocker, David R; Johnson, Kent C
2018-05-31
The primary goal of this study was to compare emissions measurements between a 1065-compliant PEMS and the NTK Compact Emissions Meter (NCEM), which is capable of measuring NOx, PM, and solid PN. Both units were equipped on a light-duty diesel truck and tested over local, highway, and downtown driving routes. The results indicate that the NOx measurements for the NCEM were within approximately ±10% of those of the 1065-compliant PEMS, which suggests that the NCEM could be used as a screening tool for NOx emissions. The NCEM showed larger differences for PM emissions on an absolute level, but this was at PM levels well below the 1 mg/mi level. The NCEM differences ranged from -2% to +26% if the comparisons are based on a percentage of the 1.0 mg/mi standard. Larger differences were also seen for PN emissions, with the NCEM measuring higher PN emissions, which can primarily be attributed to a zero current offset that we observed for the NCEM and that has subsequently been improved in the latest generation of the NCEM system. The comparisons between the 1065-compliant PEMS and the NCEM suggest possible applications for the NCEM or other mini-PEMS, such as identification of potential issues by regulatory agencies, manufacturer evaluation and validation of emissions under in-use conditions, and potential use in inspection and maintenance (I/M) programs, especially for heavy-duty vehicles. Copyright © 2017. Published by Elsevier B.V.
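The comparison metric used above, expressing an inter-instrument difference as a percentage of the 1.0 mg/mi standard rather than of the (near-zero) absolute PM level, can be written out in a few lines; the example readings are hypothetical.

```python
# Express the difference between two instruments' readings as a
# percentage of the emissions standard, which avoids inflated relative
# errors when absolute levels sit far below that standard.

def diff_percent_of_standard(candidate, reference, standard=1.0):
    """Instrument difference as % of the standard (all in mg/mi)."""
    return (candidate - reference) / standard * 100.0

# e.g. NCEM reads 0.30 mg/mi where the PEMS reads 0.04 mg/mi:
d = diff_percent_of_standard(0.30, 0.04)  # ~ +26% of the 1.0 mg/mi standard
```

The same readings compared as a plain relative difference would exceed +600%, which illustrates why the standard-referenced metric is the fairer basis at sub-milligram PM levels.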
Study of non-compliance among chronic hemodialysis patients and its impact on patients' outcomes.
Ibrahim, Salwa; Hossam, Mohammed; Belal, Dawlat
2015-03-01
Non-adherence to prescription is common among hemodialysis (HD) patients and has been associated with significant morbidity. At least 50% of HD patients are believed to be non-adherent to some part of their treatment. We aimed to assess the prevalence of non-adherence to dialysis prescription among 100 chronic HD patients. We explored the relationship between non-adherence on one hand and socioeconomic profile, depression scores and cognitive function on the other hand. The impact of patients' non-adherence on nutritional status, quality of life and dialysis adequacy was also assessed. The mean age of the study group was 50.51 ± 12.0 years. There were 62 females and 38 males in the study. Thirty-six patients (36%) were non-compliant to their dialysis prescription. No significant differences were detected between compliant and non-compliant patients in their education level and employment status. Inter-dialytic weight gain, serum phosphorus and depression scores were significantly higher in non-compliant patients compared with compliant patients, whereas body weight, serum albumin, serum calcium, quality of life scores and nutrition scores were significantly higher in compliant patients (P <0.05). In conclusion, non-adherence is highly prevalent among chronic HD patients and is associated with poor quality of life, depression and malnutrition.
Investigation of the Mechanical Performance of Compliant Thermal Barriers
NASA Technical Reports Server (NTRS)
DeMange, Jeffrey J.; Bott, Robert J.; Dunlap, Patrick H.
2011-01-01
Compliant thermal barriers play a pivotal role in the thermal protection systems of advanced aerospace vehicles. Both the thermal properties and mechanical performance of these barriers are critical in determining their successful implementation. Due to the custom nature of many thermal barriers, designers of advanced spacecraft have little guidance as to the design, selection, and implementation of these elements. As part of an effort to develop a more fundamental understanding of the interrelationship between thermal barrier design and performance, mechanical testing of thermal barriers was conducted. Two different types of thermal barriers with several core insulation density levels ranging from 62 to 141 kg/cu m were investigated. Room-temperature compression tests were conducted on samples to determine load performance and assess thermal barrier resiliency. Results showed that the loading behavior of these thermal barriers was similar to other porous, low-density, compliant materials, such as elastomeric foams. Additionally, the insulation density level had a significant non-linear impact on the stiffness and peak loads of the thermal barriers. In contrast, neither the thermal barrier type nor the level of insulation density significantly influenced the room-temperature resiliency of the samples.
NASA Astrophysics Data System (ADS)
Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.
2017-12-01
In 1999, US Vice-President Al Gore outlined the concept of 'Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time, accessing historical and forecast data to support scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored in distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebook), desktop GIS tools and command-line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguard of cultural heritage sites; and support to farmers and (re)insurers in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.
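A client talks to such OGC-compliant interfaces with ordinary HTTP requests. The sketch below builds a WMS 1.3.0 GetCapabilities URL with the standard library; the parameter names are those of the OGC WMS specification, but the endpoint is a placeholder, not eodataservice's actual service address.

```python
# Build a standards-compliant WMS GetCapabilities request URL.
from urllib.parse import urlencode

def wms_get_capabilities_url(endpoint):
    """Return the GetCapabilities URL for a WMS 1.3.0 endpoint."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetCapabilities",
    }
    return endpoint + "?" + urlencode(params)

url = wms_get_capabilities_url("https://example.org/wms")
```

The returned capabilities document lists the layers on offer, from which a GetMap request (with layer, bounding box, CRS and format parameters) can then be composed.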
The catalogCleaner: Separating the Sheep from the Goats
NASA Astrophysics Data System (ADS)
O'Brien, K.; Hankin, S. C.; Schweitzer, R.; Koyuk, H.
2012-12-01
The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards and focusing initially on well understood data types, such as gridded data from climate models. This phased approach serves to engage data providers and users and also has a high probability of demonstrable successes. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. For example, serving five hundred individual files from a single climate model might be compliant, but enhancing the service so that those files are all aggregated together into one virtual dataset and available through a single access URL provides a much more useful service. The UAF project began showcasing the advantages of providing compliant data by manually building a master catalog generated from hand-picked THREDDS servers. 
With an understanding that educating data managers to provide standards-compliant data and metadata can take years, the UAF project wanted to continue increasing the volume of data served through the master catalog as much as possible. However, given the sheer volume of data servers available, it quickly became obvious that the manual process of building a master catalog was not scalable. Thus, the idea for the catalogCleaner tool was born. The goal of this tool is to automatically crawl a remote OPeNDAP or THREDDS server and, from the information in the server, build a "clean" catalog of data that will be: (a) served through uniform access services; (b) described by CF-compliant metadata; and (c) directly linked to common visualization tools, thereby allowing users to immediately begin exploring actual data. In addition, the UAF-generated clean catalog can then be used to drive data discovery tools such as Geoportal, GI-CAT, etc. This presentation will further explore the motivation for creating this tool, its implementation, and the myriad challenges and difficulties that were encountered along the way.
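The first step of such a crawl, parsing a THREDDS catalog and collecting its datasets plus the nested catalog references to follow, might look roughly like this. The namespace URIs are the published Unidata and W3C ones, but the catalog snippet is synthetic and this is not the catalogCleaner's actual code.

```python
# Parse a THREDDS catalog.xml: collect dataset urlPaths and the
# <catalogRef> links a crawler would recurse into.
import xml.etree.ElementTree as ET

NS = {
    "t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0",
    "xlink": "http://www.w3.org/1999/xlink",
}

catalog = """<catalog
    xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
    xmlns:xlink="http://www.w3.org/1999/xlink">
  <dataset name="Climate model run 1" urlPath="models/run1.nc"/>
  <catalogRef xlink:href="obs/catalog.xml" xlink:title="Observations"/>
</catalog>"""

root = ET.fromstring(catalog)
datasets = [d.get("urlPath") for d in root.findall("t:dataset", NS)]
refs = [r.get("{http://www.w3.org/1999/xlink}href")
        for r in root.findall("t:catalogRef", NS)]
```

A full crawler would fetch each referenced catalog recursively, check the CF compliance of each dataset's metadata, and emit only the "clean" entries into the master catalog.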
NASA Astrophysics Data System (ADS)
Banks, David; Wiley, Anthony; Catania, Nicolas; Coles, Alastair N.; Smith, Duncan; Baynham, Simon; Deliot, Eric; Chidzey, Rod
1998-02-01
In this paper we describe the work being done at HP Labs Bristol in the area of home networks and gateways. This work is based on the idea of breaking open the set-top box by physically separating the access-network-specific functions from the application-specific functions. The access-network-specific functions reside in an access network gateway that can be shared by many end-user devices. The first section of the paper presents the philosophy behind this approach. The end-user devices and the access network gateways must be interconnected by a high-bandwidth network which can offer a bounded-delay service for delay-sensitive traffic. We are advocating the use of IEEE 1394 for this network, and the next section of the paper gives a brief introduction to this technology. We then describe a prototype digital video broadcasting satellite compliant gateway that we have built. This gateway could be used, for example, by a PC for receiving a data service or by a digital TV for receiving an MPEG-2 video service. A control architecture is then presented which uses a PC application to provide a web-based user interface to the system. Finally, we provide details of our work on extending the reach of IEEE 1394 and its standardization status.
A Low-Cost and Secure Solution for e-Commerce
NASA Astrophysics Data System (ADS)
Pasquet, Marc; Vacquez, Delphine; Rosenberger, Christophe
We present in this paper a new architecture for remote banking and e-commerce applications. The proposed solution is designed to be low cost and provides good guarantees of security for a client and his issuing bank. Indeed, the main problem for an issuer is to identify and authenticate a client (a cardholder) using his personal computer through the web when this client wants to access remote banking services or to pay on an e-commerce site equipped with the 3-D Secure payment solution. The solution described in this paper is compliant with the MasterCard Chip Authentication Program and was tested in the project called SOPAS. The main contribution of this system consists in the use of a smartcard with an I2C bus that pilots a terminal equipped only with a screen and a keyboard. During the use of services, the user types his PIN code on the keyboard and all the security-critical part of the transaction is performed by the chip of the smartcard. No security information remains on the personal computer, and a dynamic token created by the card is sent to the bank and verified by the front end. We first present the methodology and then analyze the main security aspects of the proposed solution.
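The verification of a card-generated dynamic token can be illustrated generically. The sketch below is an HMAC-based one-time token in the spirit of, but not identical to, the chip-generated token described above: the real Chip Authentication Program computation is specified by MasterCard and runs on-card, and the shared secret and counter scheme here are assumptions.

```python
# Illustrative dynamic-token scheme: card and bank share a secret and a
# transaction counter; both derive the same short numeric token, so the
# bank's front end can verify what the card produced.
import hmac
import hashlib

def dynamic_token(secret: bytes, counter: int, digits: int = 8) -> str:
    """Derive a short numeric token from a secret and a counter."""
    msg = counter.to_bytes(8, "big")
    mac = hmac.new(secret, msg, hashlib.sha256).digest()
    return str(int.from_bytes(mac, "big") % 10 ** digits).zfill(digits)

# The card emits a token; the bank recomputes it with the same
# secret and counter and compares:
issued = dynamic_token(b"card-secret", 42)
assert dynamic_token(b"card-secret", 42) == issued
```

Because the counter advances with every transaction, a captured token is useless for replay, which is what keeps the scheme safe even though the PC in the middle is untrusted.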
MATISSE: A novel tool to access, visualize and analyse data from planetary exploration missions
NASA Astrophysics Data System (ADS)
Zinzi, A.; Capria, M. T.; Palomba, E.; Giommi, P.; Antonelli, L. A.
2016-04-01
The increasing number and complexity of planetary exploration space missions require new tools to access, visualize and analyse data to improve their scientific return. The ASI Science Data Center (ASDC) addresses this need with the web tool MATISSE (Multi-purpose Advanced Tool for the Instruments of the Solar System Exploration), allowing the visualization of single observations or real-time computed high-order products, directly projected on the three-dimensional model of the selected target body. Using MATISSE it is no longer necessary to download huge quantities of data or to write specific code for every instrument analysed, greatly encouraging studies based on the joint analysis of different datasets. In addition, the extremely high-resolution output, usable offline with free Python-based software, together with files readable by specific GIS software, makes it a valuable tool to further process the data at the best spatial accuracy available. MATISSE's modular structure permits the addition of new missions or tasks and, thanks to dedicated future developments, it will be possible to make it compliant with the Planetary Virtual Observatory standards currently under definition. In this context, an interface to the NASA ODE REST API, through which public repositories can be accessed, has recently been developed.
Sun, Shulei; Chen, Jing; Li, Weizhong; Altintas, Ilkay; Lin, Abel; Peltier, Steve; Stocks, Karen; Allen, Eric E.; Ellisman, Mark; Grethe, Jeffrey; Wooley, John
2011-01-01
The Community Cyberinfrastructure for Advanced Microbial Ecology Research and Analysis (CAMERA, http://camera.calit2.net/) is a database and associated computational infrastructure that provides a single system for depositing, locating, analyzing, visualizing and sharing data about microbial biology through an advanced web-based analysis portal. CAMERA collects and links metadata relevant to environmental metagenome data sets with annotation in a semantically-aware environment allowing users to write expressive semantic queries against the database. To meet the needs of the research community, users are able to query metadata categories such as habitat, sample type, time, location and other environmental physicochemical parameters. CAMERA is compliant with the standards promulgated by the Genomic Standards Consortium (GSC), and sustains a role within the GSC in extending standards for content and format of the metagenomic data and metadata and its submission to the CAMERA repository. To ensure wide, ready access to data and annotation, CAMERA also provides data submission tools to allow researchers to share and forward data to other metagenomics sites and community data archives such as GenBank. It has multiple interfaces for easy submission of large or complex data sets, and supports pre-registration of samples for sequencing. CAMERA integrates a growing list of tools and viewers for querying, analyzing, annotating and comparing metagenome and genome data. PMID:21045053
NASA Astrophysics Data System (ADS)
Dutta, Rishabh; Wang, Teng; Feng, Guangcai; Harrington, Jonathan; Vasyura-Bathke, Hannes; Jónsson, Sigurjón
2017-04-01
Strain localizations in compliant fault zones (with elastic moduli lower than the surrounding rocks) induced by nearby earthquakes have been detected using geodetic observations in a few cases in the past. Here we observe small-scale changes in interferometric Synthetic Aperture Radar (InSAR) measurements along multiple conjugate faults near the rupture of the 2013 Mw7.7 Baluchistan (Pakistan) earthquake. After removing the main coseismic deformation signal in the interferograms and correcting them for topography-related phase, we observe 2-3 cm signal along several conjugate faults that are 15-30 km from the mainshock fault rupture. These conjugate compliant faults have strikes of N30°E and N45°W. The sense of motion indicates left-lateral deformation across the N30°E faults and right-lateral deformation across the N45°W faults, which suggests the conjugate faults were subjected to extensional coseismic stresses along the WSW-ENE direction. The spacing between the different sets of faults is around 5 to 8 km. We explain the observed strain localizations as an elastic response of the compliant conjugate faults induced by the Baluchistan earthquake. Using 3D Finite Element models (FEM), we impose coseismic static displacements due to the earthquake along the boundaries of the FEM domain to reproduce the coseismic stress changes acting across the compliant faults. The InSAR measurements are used to constrain the geometry and rigidity variations of the compliant faults with respect to the surrounding rocks. The best fitting models show the compliant fault zones to have a width of 0.5 km to 2 km and a reduction of the shear modulus by a factor of 3 to 4. Our study yields similar values as were found for compliant fault zones near the 1992 Landers and the 1999 Hector Mine earthquakes in California, although here the strain localization is occurring on more complex conjugate sets of faults.
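The intuition behind the observed localization can be captured in a back-of-the-envelope estimate (not the paper's FEM): under a uniform stress change, the elastic strain inside a fault zone scales inversely with its shear modulus, so a modulus reduced by a factor of 3 to 4 localizes 3 to 4 times more strain than the host rock. The stress and modulus values below are assumptions for illustration.

```python
# Strain contrast between a compliant fault zone and its host rock
# under the same coseismic shear-stress change.

def shear_strain(stress, mu):
    """Elastic shear strain for a given stress (Pa) and modulus (Pa)."""
    return stress / mu

stress = 0.1e6    # 0.1 MPa coseismic stress change (assumed)
mu_host = 30e9    # host-rock shear modulus (assumed)

eps_host = shear_strain(stress, mu_host)
eps_fz = shear_strain(stress, mu_host / 3.5)  # modulus reduced 3.5x
# eps_fz / eps_host == 3.5: the compliant zone concentrates the strain.
```

The full 3D FEM refines this picture by resolving the finite width of each zone and the interaction between the conjugate fault sets, which is what constrains the 0.5-2 km widths quoted above.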
A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol
ERIC Educational Resources Information Center
Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.
2006-01-01
Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
... near zero. This circumstance, along with the compliant status of all other fabric and label components... 1\\1/4\\ inches. The area of the Label is insignificant with respect to the over two yards of fabric... system. Moreover, all other fabric, including other warning labels for the MyRide child restraint system...
48 CFR 352.234-4 - Partial earned value management system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Cognizant Federal Agency (CFA) as being compliant with the schedule-related guidelines in ANSI/EIA Standard... has not been validated and accepted by the CFA at the time of award, see paragraph (b) of this clause..., at the time of award, the Contractor's EVM system has not been validated and accepted by the CFA as...
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2017-12-01
NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues beyond purely technical application re-architectures. From surprising network policy limitations, billing challenges in a government-based cost model, and obtaining certificates in a NASA security-compliant manner to working with multiple applications in a shared and resource-constrained AWS account, these have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!
Comparison of load distribution for implant overdenture attachments.
Porter, Joseph A; Petropoulos, Vicki C; Brunski, John B
2002-01-01
The aim of this study was to compare the force and moment distributions that develop on different implant overdenture attachments when vertical compressive forces are applied to an implant-retained overdenture. The following attachments were examined: Nobel Biocare bar and clip (NBC), Nobel Biocare standard ball (NSB), Nobel Biocare 2.25-mm-diameter ball (NB2), Zest Anchor Advanced Generation (ZAAG), Sterngold ERA white (SEW), Sterngold ERA orange (SEO), Compliant Keeper System with titanium shims (CK-Ti), Compliant Keeper System with black nitrile 2SR90 sleeve rings (CK-70), and Compliant Keeper System with clear silicone 2SR90 sleeve rings (CK-90). The attachments were tested using custom strain-gauged abutments and 2 Brånemark System implants placed in a test model. Each attachment type had one part embedded in a denture-like housing and the other part (the abutment) screwed into the implants. Compressive static loads of 100 N were applied (1) bilaterally, over the distal midline (DM); (2) unilaterally, over the right implant (RI); (3) unilaterally, over the left implant (LI); and (4) between implants in the mid-anterior region (MA). Both the force and bending moment on each implant were recorded for each loading location and attachment type. Results were analyzed using 2-way analysis of variance and the Duncan multiple-range test. Both loading location and attachment type were statistically significant factors (P < .05). In general, the force and moment on an implant were greater when the load was applied directly over the implant or at MA. While not significant at every loading location, the largest implant forces tended to occur with ZAAG attachments; the smallest were found with the SEW, the SEO, the NSB, the CK-70, and the CK-90. Typically, higher moments existed for NBC and ZAAG, while lower moments existed for SEW, SEO, NSB, CK-90, and CK-70. 
For different loading locations, significant differences were found among the different overdenture attachment systems.
ENVIRONMENTALLY COMPLIANT CORROSION-ACTIVATED INHIBITOR SYSTEM FOR ALUMINUM ALLOYS - PHASE I
The federal government is estimated to spend $1 billion on painting/repainting aircraft annually. Aircraft have surfaces composed of aluminum alloys that are highly susceptible to corrosion and must be protected with corrosion-preventative treatments that typically conta...
Assistive and Autonomous Breast Ultrasound Screening: Improving PPV and Reducing RSI
This report details our first year of research activity on technologies that support sonographer-supervised robotic systems for breast ultrasound imaging with quantitative elastography. Major objectives achieved in this period included development of a research platform including a compliant...
77 FR 8326 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... 2010, FRA formed an Engineering Task Force (ETF) to develop crashworthiness criteria for an... system with other compliant Tier 1 equipments. The waiver petition includes documentation on the... Integrity Colliding Equipment Override Fluid Entry Inhibition End Structure Integrity of Cab End End...
75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... availability of its patent electronic filing system, Electronic Filing System--Web (EFS-Web) by providing a new contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS...
Validation of the Spanish Acne Severity Scale (Escala de Gravedad del Acné Española--EGAE).
Puig, Lluis; Guerra-Tapia, Aurora; Conejo-Mir, Julián; Toribio, Jaime; Berasategui, Carmen; Zsolt, Ilonka
2013-04-01
Several acne grading systems have been described, but consensus is lacking on which shows superiority. A standardized system would facilitate therapeutic decisions and the analysis of clinical trial data. To assess the feasibility, reliability, validity and sensitivity to change of the Spanish Acne Severity Scale (EGAE). A Spanish, multicentre, prospective, observational study was performed in patients with facial, back or chest acne assessed using EGAE, Leeds Revised Acne Grading system (LRAG) and lesion count. Clinicians answered 4 questions regarding EGAE use and time employed. Patients were evaluated at baseline and after 5±1 weeks. Four additional blinded observers, all dermatologists, evaluated patients' pictures using EGAE and LRAG. In total, 349 acne locations were assessed in 328 patients. Of the dermatologists, 95.6% (CI: 92.9-97.5%) reported that EGAE was easy to use, and 75% used it in <3 minutes. Interobserver reliability of the EGAE scale was shown by a Kendall's W of 0.773 (p<0.001). EGAE and LRAG scales showed a high correlation (Spearman's correlation >0.85; p<0.001). EGAE mean score in treatment-compliant patients was significantly lower at follow-up than at baseline (2.14 vs. 1.57, p<0.001, Cohen's d=0.35). The pre-post-treatment difference in EGAE mean score in non-compliant patients was not significant (1.44 vs. 1.32, p<0.102) and Cohen's d was lower (0.19) than in compliant patients. The use of EGAE to evaluate acne grade in daily clinical dermatological practice in Spanish centres has shown feasibility, high interobserver reliability, concurrent validity and sensitivity to detect treatment effects.
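The effect size quoted above is a standard computation: Cohen's d is the mean change divided by a pooled standard deviation. The pooled SD below is hypothetical, chosen so that the reported pre/post means (2.14 and 1.57) give roughly the reported d of 0.35; the study's actual SDs are not stated in the abstract.

```python
# Cohen's d: standardized mean difference between two measurements.

def cohens_d(mean_a, mean_b, sd_pooled):
    """Effect size = difference in means over the pooled SD."""
    return (mean_a - mean_b) / sd_pooled

# Compliant patients, baseline vs. follow-up (pooled SD assumed ~1.63):
d = cohens_d(2.14, 1.57, 1.63)  # ~ 0.35, a small-to-medium effect
```

By convention, d around 0.2 is read as a small effect and 0.5 as medium, which is why the compliant-group change (0.35) is meaningful while the non-compliant-group change (0.19) is marginal.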
The tsunami service bus, an integration platform for heterogeneous sensor systems
NASA Astrophysics Data System (ADS)
Haener, R.; Waechter, J.; Kriegel, U.; Fleischer, J.; Mueller, S.
2009-04-01
1. INTRODUCTION Early warning systems are long-lived and evolving: new sensor systems and sensor types may be developed and deployed, sensors will be replaced or redeployed at other locations, and the functionality of the analysis software will be improved. To ensure the continuous operability of such systems, their architecture must be evolution-enabled. From a computer science point of view, an evolution-enabled architecture must fulfill the following criteria: • Encapsulation of data, and of functionality on data, in standardized services; access to proprietary sensor data is possible only via these services. • Loose coupling of system constituents, which can easily be achieved by implementing standardized interfaces. • Location transparency of services, meaning that services can be provided anywhere. • Separation of concerns, i.e., breaking a system into distinct features that overlap in functionality as little as possible. A Service Oriented Architecture (SOA), as realized for example in the German Indonesian Tsunami Early Warning System (GITEWS), together with the advantages of functional integration on the basis of services described below, meets these criteria best. 2. SENSOR INTEGRATION Integration of data from (distributed) data sources is a standard task in computer science. Of the few well-known solution patterns, only functional integration should be considered here, taking into account the performance and security requirements of early warning systems. A precondition is that systems are realized in compliance with SOA patterns. Functionality is realized in the form of dedicated components communicating via a service infrastructure. These components provide their functionality as services via standardized, published interfaces, which can be used to access the data maintained in, and the functionality provided by, dedicated components. Functional integration replaces tight coupling at the data level with a dependency on loosely coupled services. 
If the interfaces of the service-providing components remain unchanged, components can be maintained and evolved independently of each other, and service functionality as a whole can be reused. In GITEWS the functional integration pattern was adopted by applying the principles of an Enterprise Service Bus (ESB) as a backbone. Four services provided by the so-called Tsunami Service Bus (TSB), which are essential for early warning systems, are realized in compliance with the services specified within the Sensor Web Enablement (SWE) initiative of the Open Geospatial Consortium (OGC). 3. ARCHITECTURE The integration platform was developed to access proprietary, heterogeneous sensor data and to provide them in a uniform manner for further use. Its core, the TSB, provides both a messaging backbone and messaging interfaces on the basis of the Java Message Service (JMS). The logical architecture of GITEWS consists of four independent layers: • A resource layer, where physical or virtual sensors as well as data or model storages provide relevant measurement, event, and analysis data. The TSB can utilize any kind of data; in addition to sensors, databases, model data, and processing applications are adopted. SWE specifies encodings both for accessing and for describing these data in a comprehensive way: 1. Sensor Model Language (SensorML): standardized description of sensors and sensor data 2. Observations and Measurements (O&M): model and encoding of sensor measurements • A service layer to collect and conduct data from heterogeneous and proprietary resources and provide them via standardized interfaces. The TSB enables interaction with sensors via the following services: 1. Sensor Observation Service (SOS): standardized access to sensor data 2. Sensor Planning Service (SPS): controlling of sensors and sensor networks 3. Sensor Alert Service (SAS): active sending of data when defined events occur 4. 
Web Notification Service (WNS): conduct of asynchronous dialogues between services • An orchestration layer, where atomic services are composed and arranged into high-level processes such as a decision support process. One of the outstanding features of service-oriented architectures is the possibility of composing new services from existing ones, either programmatically or via declaration (workflow or process design). This allows, for example, the definition of new warning processes that can easily be adapted to new requirements. • An access layer, which may contain graphical user interfaces for decision support, monitoring, or visualization systems. To visualize time series, for example, graphical user interfaces request sensor data simply via the SOS. 4. BENEFIT The integration platform is realized on top of well-known and widely used open-source software implementing industry standards. New sensors can easily be added to the infrastructure. Client components need not be adjusted when new sensor types or individual sensors are added to the system, because they access the sensors via standardized services. By implementing SWE fully compatibly with the OGC specification, it is possible to establish the "detection" and integration of sensors via the Web. It is thus feasible to realize a system of systems that combines early warning system functionality at different levels of detail (distant early warning systems, monitoring systems, and any sensor system).
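The SWE services named above are ordinary HTTP services. As an illustration, a GetObservation request to an SOS can be assembled as a key-value-pair (KVP) URL. The endpoint, offering, and phenomenon identifiers below are hypothetical placeholders, not actual GITEWS/TSB identifiers:

```python
from urllib.parse import urlencode

def build_getobservation_url(endpoint, offering, observed_property,
                             time_start, time_end):
    """Assemble an OGC SOS 1.0 GetObservation request as a KVP URL."""
    params = {
        "service": "SOS",
        "request": "GetObservation",
        "version": "1.0.0",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": f"{time_start}/{time_end}",
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and identifiers, for illustration only.
url = build_getobservation_url(
    "http://example.org/tsb/sos",
    "TIDE_GAUGES",
    "urn:ogc:def:phenomenon:OGC:1.0.30:waterlevel",
    "2009-01-01T00:00:00Z",
    "2009-01-02T00:00:00Z",
)
```

Because the interface is standardized, any client built this way keeps working when new sensors are attached behind the service.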
Wang, Z.; Ward, M. M.; Chan, L.
2014-01-01
Summary Previous studies have shown an association between duration of bisphosphonate use and atypical femur fractures. This cohort study showed an increasingly higher risk of subtrochanteric and femoral shaft fractures among those who were more adherent to oral bisphosphonates. Introduction Long-term use of oral bisphosphonates has been implicated in an increased risk of atypical femur fractures located in the subtrochanteric and femoral shaft regions. Another measure of drug exposure, medication adherence, however, has not been investigated. Methods Among all Medicare fee-for-service female beneficiaries from 2006 to 2010, we followed 522,287 new bisphosphonate users from their index prescription until being censored or having a primary diagnosis of closed subtrochanteric/femoral shaft or intertrochanteric/femoral neck fractures. Data about radiographs of fracture site and features were not available. Adherence was classified according to the medication possession ratio (MPR) as follows: MPR < 1/3 as less compliant, 1/3 ≤ MPR < 2/3 as compliant, and MPR ≥ 2/3 as highly compliant. Alternative cutoff points at 50 and 80 % were also used. Survival analysis was used to determine the cumulative incidence and hazard of subtrochanteric/femoral shaft or intertrochanteric/femoral neck fractures. Results There was a graded increase in the incidence of subtrochanteric/femoral shaft fractures as the level of adherence increased (Gray's test, P<0.001). The adjusted hazard ratio (HR) for the highly compliant vs. the less compliant was 1.23 (95 % Confidence Interval [CI] 1.06–1.43) overall, became significant after 2 years of follow-up (HR=1.51, 95 % CI 1.06–2.15) and reached the highest risk in the fifth year (HR=4.06, 95 % CI 1.47–11.19). However, age-adjusted incidence rates of intertrochanteric/femoral neck fractures were significantly lower among highly compliant beneficiaries, compared to less compliant users (HR=0.69, 95 % CI 0.66–0.73). 
Similar results were obtained when the cutoff points for being compliant and highly compliant were set at 50 and 80 %, respectively. Conclusions Subtrochanteric/femoral shaft fractures, unlike intertrochanteric/femoral neck fractures, are positively associated with higher adherence to long-term (≥3 years) oral bisphosphonates in the elderly female Medicare population. PMID:24846316
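The adherence classification described in the Methods is directly computable. A minimal sketch using the study's MPR cutoffs (the function and variable names are our own, not from the study):

```python
def medication_possession_ratio(days_supplied, days_in_period):
    """MPR: fraction of the observation period covered by dispensed drug."""
    return days_supplied / days_in_period

def adherence_category(mpr, low_cut=1/3, high_cut=2/3):
    """Classify adherence using the study's default cutoffs (1/3 and 2/3);
    the alternative 0.50/0.80 cutoffs can be passed in instead."""
    if mpr < low_cut:
        return "less compliant"
    elif mpr < high_cut:
        return "compliant"
    return "highly compliant"

# e.g. 300 days of supply over a 365-day follow-up:
category = adherence_category(medication_possession_ratio(300, 365))
```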
NASA Astrophysics Data System (ADS)
Li, Guo-Yang; Xu, Guoqiang; Zheng, Yang; Cao, Yanping
2018-03-01
Surface acoustic wave (SAW) devices have found a wide variety of technical applications, including SAW filters, SAW resonators, microfluidic actuators, biosensors, flow measurement devices, and seismic wave shields. Stretchable/flexible electronic devices, such as sensory skins for robotics, structural health monitors, and wearable communication devices, have received considerable attention across different disciplines. Flexible SAW devices are essential building blocks for these applications, wherein piezoelectric films may need to be integrated with the compliant substrates. When piezoelectric films are much stiffer than soft substrates, SAWs are usually leaky and the devices incorporating them suffer from acoustic losses. In this study, the propagation of SAWs in a wrinkled bilayer system is investigated, and our analysis shows that non-leaky modes can be achieved by engineering stress patterns through surface wrinkles in the system. Our analysis also uncovers intriguing bandgaps (BGs) related to the SAWs in a wrinkled bilayer system; these are caused by periodic deformation patterns, which indicate that diverse wrinkling patterns could be used as metasurfaces for controlling the propagation of SAWs.
Environmental Monitoring Using Sensor Networks
NASA Astrophysics Data System (ADS)
Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.
2008-12-01
Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of nature. The commoditization and proliferation of coin-to-palm sized wireless sensors will allow environmental monitoring with unprecedentedly fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called the Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as antecedent soil moisture conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimation from meteorological radar data and enhancing hydrological forecasts. Sensor data are transmitted from each monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as a Remote Field Gateway (RFG) Server, and a WSN for distributed soil moisture monitoring. 
The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field, such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component that hides the heterogeneity of different devices and supports the data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before it is inserted into the database, and periodically perform synchronization tasks. A SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through the OGC web services. The web portal, TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and remotely update software and system configuration, which greatly simplifies system debugging and maintenance tasks. We also implement Sensor Observation Services (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into existing Geographic Information Systems (GIS) web services and to exchange the data with other organizations.
Enabling Interoperability in Heliophysical Domains
NASA Astrophysics Data System (ADS)
Bentley, Robert
2013-04-01
There are many aspects of science in the Solar System that are overlapping: phenomena observed in one domain can have effects in other domains. However, there are many problems in exploiting the data in cross-disciplinary studies because of a lack of interoperability of the data and services. The CASSIS project is a Coordination Action funded under FP7 with the objective of improving the interoperability of data and services related to Solar System science. CASSIS has been investigating how the data could be made more accessible with some relatively minor changes to the observational metadata. The project has been looking at the services that are used within the domain and determining whether they are interoperable with each other and, if not, what would be required to make them so. It has also been examining all types of metadata that are used when identifying and using observations, and trying to make them more compliant with techniques and standards developed by bodies such as the International Virtual Observatory Alliance (IVOA). Many of the lessons being learnt in the study are applicable to domains beyond those directly involved in heliophysics. Adopting some simple standards for the design of the service interfaces and metadata would make it much easier to investigate interdisciplinary science topics. We will report on our findings and describe a roadmap for the future. For more information about CASSIS, please visit the project Web site at cassis-vo.eu
NASA Astrophysics Data System (ADS)
Dyer, T.; Brodie, K. L.; Spore, N.
2016-02-01
Modern LIDAR systems, while capable of providing highly accurate and dense datasets, introduce significant challenges in data processing and end-user accessibility. At the United States Army Corps of Engineers Field Research Facility in Duck, North Carolina, we have developed a stationary LIDAR tower for the continuous monitoring of the ocean, beach, and foredune, as well as an automated workflow capable of providing scientific data products from the LIDAR scanner in near real-time through an online data portal. The LIDAR performs hourly scans, taking approximately 50 minutes to complete and producing datasets on the order of 1GB. Processing of the LIDAR data includes coordinate transformations, data rectification and coregistration, filtering to remove noise and unwanted objects, gridding, and time-series analysis to generate products for use by end-users. Examples of these products include water levels and significant wave heights, virtual wave gauge time-series and FFTs, wave runup, foreshore elevations and slopes, and bare earth DEMs. Immediately after processing, data products are combined with ISO compliant metadata and stored using the NetCDF-4 file format, making them easily discoverable through a web portal which provides an interactive map that allows users to explore datasets both spatially and temporally. End-users can download datasets in user-defined time intervals, which can be used, for example, as forcing or validation parameters in numerical models. Funded by the USACE Coastal Ocean Data Systems Program.
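As an illustration of the kind of product derived from the hourly scans, significant wave height can be estimated from a de-meaned water-surface elevation time series with the standard spectral-moment estimator Hm0 ≈ 4σ. This is a generic sketch of that estimator, not the FRF's actual processing code:

```python
import math
import statistics

def significant_wave_height(eta):
    """Hm0-style estimate: Hs ~= 4 * std(eta), where eta is the
    de-meaned water-surface elevation time series in metres."""
    return 4.0 * statistics.pstdev(eta)

# Synthetic check: a sine wave of amplitude 1 m has std 1/sqrt(2),
# so the estimator returns 4/sqrt(2) ~= 2.83 m.
eta = [math.sin(2.0 * math.pi * t / 100.0) for t in range(10000)]
hs = significant_wave_height(eta)
```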
NASA Astrophysics Data System (ADS)
Joslin, R. D.
1991-04-01
The use of passive devices to obtain drag and noise reduction or transition delays in boundary layers is highly desirable. One such device that shows promise for hydrodynamic applications is the compliant coating. The present study extends the mechanical model to allow for three-dimensional waves. This study also examines the effect of compliant walls on three-dimensional secondary instabilities. For the primary and secondary instability analysis, spectral and shooting approximations are used to obtain solutions of the governing equations and boundary conditions. The spectral approximation consists of local and global methods of solution, while the shooting approach is local. The global method is used to determine the discrete spectrum of eigenvalues without any initial guess. The local method requires a sufficiently accurate initial guess to converge to an eigenvalue. Eigenvectors may be obtained with either local approach. In the initial stage of this analysis, two- and three-dimensional primary instabilities propagating over compliant coatings are considered. Results over the compliant walls are compared with the rigid-wall case. Three-dimensional instabilities are found to dominate transition over the compliant walls considered. However, transition delays are still obtained and are compared with transition delay predictions for rigid walls. The angles of wave propagation are plotted against Reynolds number and frequency. Low-frequency waves are found to be highly three-dimensional.
Development of a multistage compliant mechanism with new boundary constraint
NASA Astrophysics Data System (ADS)
Ling, Mingxiang; Cao, Junyi; Jiang, Zhou; Li, Qisheng
2018-01-01
This paper presents a piezo-actuated compliant mechanism with a new boundary constraint to provide a concurrently large workspace and high dynamic frequency for precision positioning or other flexible manipulation applications. A two-stage rhombus-type displacement amplifier with a "sliding-sliding" boundary constraint is presented to maximize the dynamic frequency while retaining a large output displacement. The vibration mode is also improved by the designed boundary constraint. A theoretical kinematic model of the compliant mechanism is established to optimize the geometric parameters, and a prototype is fabricated with a compact dimension of 60 mm × 60 mm × 12 mm. Experimental testing shows that the maximum stroke is approximately 0.6 mm and the output stiffness is 1.1 N/μm, with a fundamental frequency larger than 2.2 kHz. Lastly, the performance of the presented compliant mechanism is compared with several mechanisms from the previous literature. In conclusion, the presented boundary constraint strategy provides a new way to balance the trade-off between frequency response and stroke range that widely exists in compliant mechanisms.
DNA origami compliant nanostructures with tunable mechanical properties.
Zhou, Lifeng; Marras, Alexander E; Su, Hai-Jun; Castro, Carlos E
2014-01-28
DNA origami enables fabrication of precise nanostructures by programming the self-assembly of DNA. While this approach has been used to make a variety of complex 2D and 3D objects, the mechanical functionality of these structures is limited due to their rigid nature. We explore the fabrication of deformable, or compliant, objects to establish a framework for mechanically functional nanostructures. This compliant design approach is used in macroscopic engineering to make devices including sensors, actuators, and robots. We build compliant nanostructures by utilizing the entropic elasticity of single-stranded DNA (ssDNA) to locally bend bundles of double-stranded DNA into bent geometries whose curvature and mechanical properties can be tuned by controlling the length of ssDNA strands. We demonstrate an ability to achieve a wide range of geometries by adjusting a few strands in the nanostructure design. We further developed a mechanical model to predict both geometry and mechanical properties of our compliant nanostructures that agrees well with experiments. Our results provide a basis for the design of mechanically functional DNA origami devices and materials.
Common Badging and Access Control System (CBACS)
NASA Technical Reports Server (NTRS)
Dischinger, Portia
2005-01-01
This slide presentation presents NASA's Common Badging and Access Control System. NASA began a Smart Card implementation in January 2004. Following site surveys, it was determined that NASA's badging and access control systems required upgrades to common infrastructure in order to provide flexibility, usability, and return on investment prior to a smart card implementation. The Common Badging and Access Control System (CBACS) provides the common infrastructure from which FIPS-201-compliant processes, systems, and credentials can be developed and used.
Adapting the CUAHSI Hydrologic Information System to OGC standards
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Whitenack, T.; Zaslavsky, I.
2010-12-01
The CUAHSI Hydrologic Information System (HIS) provides web and desktop client access to hydrologic observations via water data web services using an XML schema called “WaterML”. The WaterML 1.x specification and the corresponding Water Data Services have been the backbone of the HIS service-oriented architecture (SOA) and have been adopted for serving hydrologic data by several federal agencies and many academic groups. The central discovery service, HIS Central, is based on a metadata catalog that references 4.7 billion observations, organized as 23 million data series from 1.5 million sites from 51 organizations. Observations data are published using HydroServer nodes that have been deployed at 18 organizations. Usage of HIS increased eightfold from 2008 to 2010, doubling from 1600 data series a day in 2009 to 3600 data series a day in the first half of 2010. The HIS Central metadata catalog currently harvests information from 56 Water Data Services. We collaborate on the catalog updates with two federal partners, USGS and US EPA: their data series are periodically reloaded into the HIS metadata catalog. We are pursuing two main development directions in the HIS project: cloud-based computing, and further compliance with Open Geospatial Consortium (OGC) standards. The goal of moving to cloud computing is to provide a scalable collaborative system with simpler deployment and less dependence on hardware maintenance and staff. This move requires re-architecting the information models underlying the metadata catalog and the Water Data Services to be independent of the underlying relational database model, allowing for implementation on both relational databases and cloud-based processing systems. Cloud-based HIS Central resources can be managed collaboratively; partners share responsibility for their metadata by publishing data series information into the centralized catalog. 
Publishing data series will use REST-based service interfaces, such as OData, as the basis for ingesting data series information into a cloud-hosted catalog. The future HIS services involve providing information via OGC standards, which will allow observational data access from commercial GIS applications. Use of standards will allow tools to access observational data from other projects, such as the Ocean Observatories Initiative, and will allow tools from such projects to be integrated into the HIS toolset. With international collaborators, we have been developing a water information exchange language called “WaterML 2.0”, which will be used to deliver observations data over OGC Sensor Observation Services (SOS). A software stack of OGC standard services will provide access to HIS information. In addition to SOS, Web Mapping and Web Feature Services (WMS and WFS) will provide access to location information. Catalog Services for the Web (CSW) will provide a catalog for water information that is both centralized and distributed. We intend the OGC standards to supplement the existing HIS service interfaces rather than replace them. The ultimate goal of this development is to expand access to hydrologic observations data and to create an environment where these data can be seamlessly integrated with standards-compliant data resources.
DOT National Transportation Integrated Search
2015-02-01
Through a coordinated effort among the electrical engineering research team of the Florida State : University (FSU) and key Florida Department of Transportation (FDOT) personnel, an NTCIP-based : automated testing system for NTCIP-compliant ASC has b...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2011 CFR
2011-10-01
... language. 352.239-71 Section 352.239-71 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES... Information Processing Standard (FIPS) 140-2-compliant encryption (Security Requirements for Cryptographic Module, as amended) to protect all instances of HHS sensitive information during storage and transmission...
Force-reflection and shared compliant control in operating telemanipulators with time delay
NASA Technical Reports Server (NTRS)
Kim, Won S.; Hannaford, Blake; Bejczy, Antal K.
1992-01-01
The performance of an advanced telemanipulation system in the presence of a wide range of time delays between a master control station and a slave robot is quantified. The contemplated applications include multiple satellite links to LEO, geosynchronous operation, spacecraft local area networks, and general-purpose computer-based short-distance designs. The results of high-precision peg-in-hole tasks performed by six test operators indicate that task performance decreased linearly with introduced time delays for both kinesthetic force feedback (KFF) and shared compliant control (SCC). The rate of this decrease was substantially improved with SCC compared to KFF. Task performance at delays above 1 s was not possible using KFF. SCC enabled task performance for such delays, which are realistic values for ground-controlled remote manipulation of telerobots in space.
Parameter Optimization of Pseudo-Rigid-Body Models of MRI-Actuated Catheters
Greigarn, Tipakorn; Liu, Taoming; Çavuşoğlu, M. Cenk
2016-01-01
Simulation and control of a system containing compliant mechanisms, such as cardiac catheters, often incur high computational costs. One way to reduce the costs is to approximate the mechanisms with Pseudo-Rigid-Body Models (PRBMs). A PRBM generally consists of rigid links connected by spring-loaded revolute joints. The lengths of the rigid links and the stiffnesses of the springs are usually chosen to minimize the tip deflection differences between the PRBM and the compliant mechanism. In most applications, only the relationship between end load and tip deflection is considered. This is not applicable to MRI-actuated catheters, which are actuated by coils attached to the catheter body. This paper generalizes PRBM parameter optimization to include loading and reference points along the body. PMID:28261009
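For orientation, the textbook single-joint PRBM for a cantilever with a transverse end force can be sketched in a few lines. The characteristic radius factor γ ≈ 0.85 and stiffness coefficient K_Θ ≈ 2.65 are the commonly cited approximations from the PRBM literature, not the catheter-specific values optimized in this paper:

```python
import math

# Commonly cited PRBM parameters for a cantilever with a transverse end force.
GAMMA = 0.85     # characteristic radius factor
K_THETA = 2.65   # stiffness coefficient

def prbm_tip_deflection(F, L, E, I, iters=200):
    """Transverse tip deflection of a cantilever of length L under end
    force F, approximated by a single spring-loaded revolute joint at
    (1 - GAMMA) * L from the base. The joint angle theta satisfies the
    torque balance K * theta = F * GAMMA * L * cos(theta)."""
    K = GAMMA * K_THETA * E * I / L   # equivalent torsional spring stiffness
    theta = 0.0
    for _ in range(iters):            # fixed-point iteration for theta
        theta = F * GAMMA * L * math.cos(theta) / K
    return GAMMA * L * math.sin(theta)

# For small loads the PRBM should approach the linear beam result
# F * L**3 / (3 * E * I) to within a few percent.
d = prbm_tip_deflection(F=0.001, L=1.0, E=1.0, I=1.0)
```

The paper's contribution, loads applied at points along the body rather than only at the tip, would add further joints and load terms to this torque balance.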
NASA Astrophysics Data System (ADS)
Dabiri, Arman; Butcher, Eric A.; Nazari, Morad
2017-02-01
Compliant impacts can be modeled using linear viscoelastic constitutive models. While such impact models for realistic viscoelastic materials using integer order derivatives of force and displacement usually require a large number of parameters, compliant impact models obtained using fractional calculus, however, can be advantageous since such models use fewer parameters and successfully capture the hereditary property. In this paper, we introduce the fractional Chebyshev collocation (FCC) method as an approximation tool for numerical simulation of several linear fractional viscoelastic compliant impact models in which the overall coefficient of restitution for the impact is studied as a function of the fractional model parameters for the first time. Other relevant impact characteristics such as hysteresis curves, impact force gradient, penetration and separation depths are also studied.
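For context, the classical integer-order limit of such a model is the linear Kelvin-Voigt contact, whose coefficient of restitution can be obtained from a short time-stepping simulation. This is a baseline sketch with made-up parameters; the paper's fractional models replace the integer-order damping term and are solved with the FCC method instead:

```python
def restitution_kelvin_voigt(m=1.0, k=1.0e4, c=5.0, v0=1.0, dt=1.0e-5):
    """Coefficient of restitution for a point mass hitting a linear
    Kelvin-Voigt surface: m*x'' = -(k*x + c*x'), with x the penetration
    depth. The impact ends when the mass separates (x returns to zero);
    e = separation speed / approach speed."""
    x, v = 0.0, v0                 # v0 > 0: approach (penetration) speed
    while True:
        a = -(k * x + c * v) / m   # contact acceleration
        v += a * dt                # semi-implicit Euler step
        x += v * dt
        if x <= 0.0:               # separation
            return -v / v0

e = restitution_kelvin_voigt()
```

With these parameters the damping ratio is ζ = c / (2·sqrt(k·m)) = 0.025, and linear theory predicts e = exp(-ζπ/sqrt(1-ζ²)) ≈ 0.92, which the simulation reproduces.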
Goto, Takaaki; Dobashi, Hiroki; Yoshikawa, Tsuneo; Loureiro, Rui C V; Harwin, William S; Miyamura, Yuga; Nagai, Kiyoshi
2017-07-01
This paper addresses the mechanical structure and control method of a redundant drive robot (RDR) for producing compliant motions, and shows how the design parameters of the RDR affect the produced motions and the mechanical and performance limitations of its actuators. The structure and control method of the RDR have previously been shown to be suitable for producing compliant motions, but the effect of the RDR's design parameters on the mechanical and performance limitations has not been clarified. Therefore, the feasibility of producing compliant motions with the prototype RDR is first confirmed by conducting simulations and experiments, and then the effects of the design parameters on the mechanical and performance limitations are verified by simulations.
NASA Astrophysics Data System (ADS)
Gray, Bonnie L.
2012-04-01
Microfluidics is revolutionizing laboratory methods and biomedical devices, offering new capabilities and instrumentation in multiple areas such as DNA analysis, proteomics, enzymatic analysis, single cell analysis, immunology, point-of-care medicine, personalized medicine, drug delivery, and environmental toxin and pathogen detection. For many applications (e.g., wearable and implantable health monitors, drug delivery devices, and prosthetics), mechanically flexible polymer devices and systems that can conform to the body offer benefits that cannot be achieved using systems based on conventional rigid substrate materials. However, difficulties in implementing active devices and reliable packaging technologies have limited the success of flexible microfluidics. Focusing on highly compliant materials such as PDMS, which are typically employed for prototyping, we review mechanically flexible polymer microfluidic technologies based on free-standing polymer substrates and novel electronic and microfluidic interconnection schemes. Central to these new technologies are hybrid microfabrication methods employing novel nanocomposite polymer materials and devices. We review microfabrication methods using these materials, along with demonstrations of example devices and packaging schemes that employ them. We place these recent developments in the context of the fields of flexible microfluidics and conformable systems, and discuss cross-over applications to conventional rigid-substrate microfluidics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarez, P; Molineu, A; Lowenstein, J
Purpose: IROC-H conducts external audits for output check verification of photon and electron beams. Many of these beams can meet the geometric requirements of the TG-51 calibration protocol. For those photon beams that are not TG-51 compliant, like the Elekta GammaKnife, Accuray CyberKnife and TomoTherapy, IROC-H has specific audit tools to monitor the reference calibration. Methods: IROC-H used its TLD and OSLD remote monitoring systems to verify the output of machines with TG-51 non-compliant beams. Acrylic OSLD miniphantoms are used for the CyberKnife. Special TLD phantoms are used for TomoTherapy and GammaKnife machines to accommodate the specific geometry of each machine. These remote audit tools are sent to institutions to be irradiated and returned to IROC-H for analysis. Results: The average IROC-H/institution ratios for 480 GammaKnife, 660 CyberKnife and 907 rotational TomoTherapy beams are 1.000±0.021, 1.008±0.019, and 0.974±0.023, respectively. In the particular case of TomoTherapy, the overall ratio is 0.977±0.022 for HD units. The standard deviations of all results are consistent with values determined for TG-51 compliant photon beams. These ratios have shown some changes compared to values presented in 2008. The GammaKnife results were corrected by an experimentally determined scatter factor of 1.025 in 2013. The TomoTherapy helical beam results are now from a rotational beam, whereas in 2008 the results were from a static beam. The decision to change modality was based on recommendations from the users. Conclusion: External audits of beam outputs are a valuable tool to confirm the calibrations of photon beams regardless of whether the machine is TG-51 compliant or not. The difference found for TomoTherapy units is under investigation. This investigation was supported by IROC grant CA180803 awarded by the NCI.
Towards the Goal of Modular Climate Data Services: An Overview of NCPP Applications and Software
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Cinquini, L.; Treshansky, A.; Murphy, S.; DeLuca, C.
2013-12-01
In August 2013, the National Climate Predictions and Projections Platform (NCPP) organized a workshop focusing on the quantitative evaluation of downscaled climate data products (QED-2013). The QED-2013 workshop focused on real-world application problems drawn from several sectors (e.g. hydrology, ecology, environmental health, agriculture), and required that downscaled data products be dynamically accessed, generated, manipulated, annotated, and evaluated. The cyberinfrastructure elements that were integrated to support the workshop included (1) a wiki-based project hosting environment (Earth System CoG) with an interface to data services provided by an Earth System Grid Federation (ESGF) data node; (2) metadata tools provided by the Earth System Documentation (ES-DOC) collaboration; and (3) a Python-based library, OpenClimateGIS (OCGIS), for subsetting and converting NetCDF-based climate data to GIS and tabular formats. Collectively, this toolset represents a first deployment of a 'ClimateTranslator' that enables users to access, interpret, and apply climate information at local and regional scales. This presentation will provide an overview of these components, how they were used in the workshop, and a discussion of current and potential integration. The long-term strategy for this software stack is to offer the suite of services described on a customizable, per-project basis. Additional detail on the three components is below. (1) Earth System CoG is a web-based collaboration environment that integrates data discovery and access services with tools for supporting governance and the organization of information. QED-2013 utilized these capabilities to share with workshop participants a suite of downscaled datasets, associated images derived from those datasets, and metadata files describing the downscaling techniques involved. The collaboration side of CoG was used for workshop organization, discussion, and results.
(2) The ES-DOC Questionnaire, Viewer, and Comparator are web-based tools for the creation and use of model and experiment documentation. Workshop participants used the Questionnaire to generate metadata on regional downscaling models and statistical downscaling methods, and the Viewer to display the results. A prototype Comparator was available to compare properties across dynamically downscaled models. (3) OCGIS is a Python (v2.7) package designed for geospatial manipulation, subsetting, computation, and translation of Climate and Forecasting (CF)-compliant climate datasets - either stored in local NetCDF files, or files served through THREDDS data servers.
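The core geospatial operation OCGIS performs can be illustrated with a toy bounding-box extraction over a small gridded field. This is a generic sketch, not the OCGIS API (whose actual entry points and data model differ):

```python
# Illustrative sketch (not the OCGIS API): bounding-box subsetting of a
# small gridded dataset, the kind of operation OCGIS performs on
# CF-compliant NetCDF files before converting to GIS or tabular formats.

def bbox_subset(lats, lons, grid, min_lat, max_lat, min_lon, max_lon):
    """Return the sub-grid whose coordinates fall inside the box."""
    rows = [i for i, lat in enumerate(lats) if min_lat <= lat <= max_lat]
    cols = [j for j, lon in enumerate(lons) if min_lon <= lon <= max_lon]
    return [[grid[i][j] for j in cols] for i in rows]

lats = [40.0, 41.0, 42.0, 43.0]
lons = [-105.0, -104.0, -103.0]
temps = [[r * 10 + c for c in range(3)] for r in range(4)]  # fake field

subset = bbox_subset(lats, lons, temps, 41.0, 42.0, -104.0, -103.0)
print(subset)
```

In a real workflow the subset would then be written out to a GIS or tabular format (e.g. shapefile or CSV) for downstream evaluation.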
Aeroelastic Airworthiness Assessment of the Adaptive Compliant Trailing Edge Flaps
NASA Technical Reports Server (NTRS)
Herrera, Claudia Y.; Spivey, Natalie D.; Lung, Shun-fat; Ervin, Gregory; Flick, Peter
2015-01-01
The Adaptive Compliant Trailing Edge (ACTE) demonstrator is a joint task under the National Aeronautics and Space Administration Environmentally Responsible Aviation Project in partnership with the Air Force Research Laboratory and FlexSys, Inc. (Ann Arbor, Michigan). The project goal is to develop advanced technologies that enable environmentally friendly aircraft, such as adaptive compliant technologies. The ACTE demonstrator flight-test program encompassed replacing the Fowler flaps on the SubsoniC Aircraft Testbed, a modified Gulfstream III (Gulfstream Aerospace, Savannah, Georgia) aircraft, with control surfaces developed by FlexSys. The control surfaces developed by FlexSys are a pair of uniquely-designed unconventional flaps to be used as lifting surfaces during flight-testing to validate their structural effectiveness. The unconventional flaps required a multidisciplinary airworthiness assessment to prove they could withstand the prescribed flight envelope. Several challenges were posed due to the large deflections experienced by the structure, requiring non-linear analysis methods. The aeroelastic assessment necessitated both conventional and extensive testing and analysis methods. A series of ground vibration tests (GVTs) were conducted to provide modal characteristics to validate and update finite element models (FEMs) used for the flutter analyses for a subset of the various flight configurations. Numerous FEMs were developed using data from FlexSys and the ground tests. The flap FEMs were then attached to the aircraft model to generate a combined FEM that could be analyzed for aeroelastic instabilities. The aeroelastic analysis results showed the combined system of aircraft and flaps were predicted to have the required flutter margin to successfully demonstrate the adaptive compliant technology. 
This paper documents the details of the aeroelastic airworthiness assessment described, including the ground testing and analyses, and subsequent flight-testing performed on the unconventional ACTE flaps.
Smart structure with elastomeric contact surface for prosthetic fingertip sensitivity development
NASA Astrophysics Data System (ADS)
Gu, Chunxin; Liu, Weiting; Yu, Ping; Cheng, Xiaoying; Fu, Xin
2017-09-01
Current flexible/compliant tactile sensors suffer from low sensitivity and high hysteresis introduced by the inherent viscosity of soft materials, whether used as compliant sensing elements or as flexible coverings. To overcome these disadvantages, this paper focuses on developing a tactile sensor with a smart hybrid structure to obtain comprehensive properties in terms of size, compliance, robustness and pressure sensing ability, so as to meet the requirements of limited-space applications such as prosthetic fingertips. Employing a tiny micro-fabricated silicon-based pressure die as the sensing element makes it easy to achieve both small size and good mechanical performance. To protect the die from potential damage and maintain a compliant surface, a rigid base and a soft layer form a sealed chamber and encapsulate the fixed die together with fluid. The fluid serves as a highly efficient pressure propagation medium for mechanical stimuli from the compliant skin to the pressure die, without any hazard to the vulnerable connecting wires. To explain the pressure transmission mechanism, a simplified and concise analytic model of a spring system is proposed. Using simple fabrication technologies, a prototype of a 3 × 3 sensor array with total dimensions of 14 mm × 14 mm × 6.5 mm was developed. Based on the quasi-linear relationship between fluid volume and pressure, finite element modeling was developed to analyze the chamber deformation and pressure output of the sensor cell. Experimental tests of the sensor prototype were implemented. The results showed that the sensor cell had good sensing performance, with a sensitivity of 19.9 mV/N, linearity of 0.998, repeatability error of 3.41%, and hysteresis error of 3.34%. The force sensing range was from 5 mN to 1.6 N.
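Given the reported quasi-linear response, applied force can be estimated from the cell output by inverting a linear calibration. A minimal sketch using the published sensitivity and a hypothetical zero-load offset (the paper's full calibration procedure is not reproduced here):

```python
SENSITIVITY_MV_PER_N = 19.9   # reported cell sensitivity, mV per newton
OFFSET_MV = 0.0               # zero-load output, hypothetical value

def force_from_output(v_mv):
    """Invert the linear calibration to estimate applied force in newtons."""
    return (v_mv - OFFSET_MV) / SENSITIVITY_MV_PER_N

# An output 19.9 mV above the zero-load level corresponds to about 1 N,
# which lies inside the reported 5 mN to 1.6 N sensing range.
estimated_n = force_from_output(19.9)
print(round(estimated_n, 3))
```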
NASA Astrophysics Data System (ADS)
Mytych, Joanna; Ligarski, Mariusz J.
2018-03-01
Quality management systems compliant with ISO 9001:2009 have been thoroughly researched and described in detail in the world literature. Accredited management systems used in testing laboratories and compliant with ISO/IEC 17025:2005 have mainly been described in terms of system design and implementation, and have also been investigated from the analytical point of view. Unfortunately, few studies have addressed how management systems actually function in accredited testing laboratories. The aim of the following study was to assess the functioning of management systems in accredited testing laboratories in Poland. On 8 October 2015, 1,213 accredited testing laboratories existed in Poland, covering various scientific areas and testing various substances/objects. These laboratories differ in their long-term experience and face various problems in the implementation, maintenance and improvement of their management systems. The article describes the results of an expert assessment (survey) carried out to examine the conditions for the functioning of a management system in an accredited laboratory. It also characterises the accredited research laboratories in Poland. The authors discuss the selection of the external and internal conditions that may affect an accredited management system, and show how the experts assessing the selected conditions were chosen. The survey results are also presented.
Bringing Control System User Interfaces to the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xihui; Kasemir, Kay
With the evolution of web-based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
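The abstract does not specify WebPDA's wire format; the following is a hypothetical JSON-over-WebSocket message sketch (all field names invented) showing the kind of subscribe/update exchange such a protocol implies:

```python
import json

# Hypothetical message shapes for a WebPDA-style protocol: the article
# describes JSON-friendly data access over WebSocket but not the wire
# format, so the field names below are illustrative only.

def make_subscribe(pv_name, request_id):
    """Build a subscription request for one process variable (PV)."""
    return json.dumps({"type": "subscribe", "id": request_id, "pv": pv_name})

def handle_update(raw):
    """Decode a value-update message pushed by the server."""
    msg = json.loads(raw)
    if msg.get("type") == "update":
        return msg["pv"], msg["value"]
    return None

req = make_subscribe("SR:Current", 1)
update = handle_update('{"type": "update", "pv": "SR:Current", "value": 99.5}')
print(req)
print(update)
```

In a browser client the same exchange would run over a WebSocket connection, with `handle_update` invoked from the socket's message callback.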
NASA Astrophysics Data System (ADS)
Eberle, J.; Hüttich, C.; Schmullius, C.
2014-12-01
Spatial time series data from earth observation satellites and meteorological stations have been freely available around the globe for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end-users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project that provides simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can either use the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Besides providing data access tools, users can also perform further time series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting on the time series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation value higher than x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly).
Datasets integrated in the SOS service are updated in near real time based on the linked data providers mentioned above. An alert is automatically pushed to the user if the new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
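The filter-expression monitoring described above amounts to a rule check over incoming observations. A minimal sketch of that logic; parameter names and thresholds are illustrative, not the EOM's actual schema:

```python
# Minimal sketch of an EOM-style alerting rule set: each user-registered
# filter pairs a parameter with a predicate, and new observations that
# match any filter trigger a notification. Names/values are illustrative.

filters = [
    {"parameter": "precipitation_mm", "test": lambda v: v > 50.0},
    {"parameter": "modis_fire", "test": lambda v: v is True},
]

def check_observation(obs):
    """Return the parameters whose registered filter matches this observation."""
    hits = []
    for f in filters:
        value = obs.get(f["parameter"])
        if value is not None and f["test"](value):
            hits.append(f["parameter"])
    return hits

obs = {"precipitation_mm": 62.5, "modis_fire": False}
alerts = check_observation(obs)
print(alerts)  # only the precipitation rule fires
```

In the operational system the matching step would run whenever the SOS service ingests new data, with each hit dispatched by email or push notification.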
Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems
ERIC Educational Resources Information Center
Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul
2009-01-01
Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
Standard requirements for GCP-compliant data management in multinational clinical trials.
Ohmann, Christian; Kuchinke, Wolfgang; Canham, Steve; Lauritsen, Jens; Salas, Nader; Schade-Brittinger, Carmen; Wittenberg, Michael; McPherson, Gladys; McCourt, John; Gueyffier, Francois; Lorimer, Andrea; Torres, Ferràn
2011-03-22
A recent survey has shown that data management in clinical trials performed by academic trial units still faces many difficulties (e.g. heterogeneity of software products, deficits in quality management, limited human and financial resources and the complexity of running a local computer centre). Unfortunately, no specific, practical and open standard for both GCP-compliant data management and the underlying IT infrastructure is available to improve the situation. For that reason the "Working Group on Data Centres" of the European Clinical Research Infrastructures Network (ECRIN) has developed a standard specifying the requirements for high-quality GCP-compliant data management in multinational clinical trials. International, European and national regulations and guidelines relevant to GCP, data security and IT infrastructures, as well as ECRIN documents produced previously, were evaluated to provide a starting point for the development of standard requirements. The requirements were produced by expert consensus of the ECRIN Working Group on Data Centres, using a structured and standardised process. The requirements were divided into two main parts: an IT part covering standards for the underlying IT infrastructure and computer systems in general, and a Data Management (DM) part covering requirements for data management applications in clinical trials. The standard developed includes 115 IT requirements (in 15 sections), 107 DM requirements (in 12 sections) and 13 other requirements (in 2 sections). Sections IT01 to IT05 deal with the basic IT infrastructure, while IT06 and IT07 cover validation and local software development. IT08 to IT15 concern the aspects of IT systems that directly support clinical trial management. Sections DM01 to DM03 cover the implementation of a specific clinical data management application, i.e. for a specific trial, whilst DM04 to DM12 address the data management of trials across the unit.
Section IN01 is dedicated to international aspects and ST01 to the competence of a trials unit's staff. The standard is intended to provide an open and widely used set of requirements for GCP-compliant data management, particularly in academic trial units. It is the intention that ECRIN will use these requirements as the basis for the certification of ECRIN data centres.
Caudron, Jean Michel; Schiavetti, Benedetta; Pouget, Corinne; Tsoumanis, Achilleas; Meessen, Bruno; Ravinetto, Raffaella
2018-01-01
Introduction The rapid globalisation of pharmaceutical production and distribution has not been supported by harmonisation of regulatory systems worldwide. Thus, the supply systems in low-income and middle-income countries (LMICs) remain exposed to the risk of poor-quality medicines. To contribute to estimating this risk in the private sector in LMICs, we assessed the quality assurance systems of a convenience sample of local private pharmaceutical distributors. Methods This descriptive study uses secondary data derived from audits conducted by the QUAMED group at 60 local private pharmaceutical distributors in 13 LMICs. We assessed the distributors' compliance with good distribution practices (GDP), general quality requirements (GQR) and cold chain management (CCM), based on an evaluation tool inspired by the WHO guidelines 'Model Quality Assurance System (MQAS) for procurement agencies'. Descriptive statistics describe the compliance for the whole sample, for distributors in sub-Saharan Africa (SSA) versus those in non-SSA, and for those in low-income countries (LICs) versus middle-income countries (MICs). Results Local private pharmaceutical distributors in our sample were non-compliant, very low-compliant or low-compliant for GQR (70%), GDP (60%) and CCM (41%). Only 7/60 showed good to full compliance for at least two criteria. Observed compliance varies by geographical region and by income group: maximum values are higher in non-SSA versus SSA and in MICs versus LICs, while minimum values are the same across the different groups. Conclusion The poor compliance with WHO quality standards observed in our sample indicates a concrete risk that patients in LMICs are exposed to poor-quality or degraded medicines. Significant investments are needed to strengthen regulatory supervision, including of private pharmaceutical distributors.
An adapted standardised evaluation tool inspired by the WHO MQAS would be helpful for self-evaluation, audit and inspection purposes. PMID:29915671
Direct design of an energy landscape with bistable DNA origami mechanisms.
Zhou, Lifeng; Marras, Alexander E; Su, Hai-Jun; Castro, Carlos E
2015-03-11
Structural DNA nanotechnology provides a feasible technique for the design and fabrication of complex geometries even exhibiting controllable dynamic behavior. Recently we have demonstrated the possibility of implementing macroscopic engineering design approaches to construct DNA origami mechanisms (DOM) with programmable motion and tunable flexibility. Here, we implement the design of compliant DNA origami mechanisms to extend from prescribing motion to prescribing an energy landscape. Compliant mechanisms facilitate motion via deformation of components with tunable stiffness resulting in well-defined mechanical energy stored in the structure. We design, fabricate, and characterize a DNA origami nanostructure with an energy landscape defined by two stable states (local energy minima) separated by a designed energy barrier. This nanostructure is a four-bar bistable mechanism with two undeformed states. Traversing between those states requires deformation, and hence mechanical energy storage, in a compliant arm of the linkage. The energy barrier for switching between two states was obtained from the conformational distribution based on a Boltzmann probability function and closely follows a predictive mechanical model. Furthermore, we demonstrated the ability to actuate the mechanism into one stable state via additional DNA inputs and then release the actuation via DNA strand displacement. This controllable multistate system establishes a foundation for direct design of energy landscapes that regulate conformational dynamics similar to biomolecular complexes.
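The relative occupancy of the two stable states follows the Boltzmann probability function mentioned above: each state's weight is exp(-E/kT). A minimal sketch with illustrative energies expressed in units of kT (not values from the paper):

```python
import math

# Boltzmann populations of a multistate system: states with energies
# E_i (here in units of kT) are occupied with probability proportional
# to exp(-E_i / kT). Energies below are illustrative, not measured.

def state_populations(energies_kt):
    """Normalized Boltzmann populations for the given state energies."""
    weights = [math.exp(-e) for e in energies_kt]
    total = sum(weights)
    return [w / total for w in weights]

# Two stable states: state B sits 1 kT above state A. The barrier height
# between them sets switching kinetics, not the equilibrium populations.
pops = state_populations([0.0, 1.0])
print([round(p, 3) for p in pops])
```

Inverting this relationship is how a conformational distribution measured over many particles yields the energy landscape, as done in the paper.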
Mechanically adaptive intracortical implants improve the proximity of neuronal cell bodies
Harris, J P; Capadona, J R; Miller, R H; Healy, B C; Shanmuganathan, K; Rowan, S J; Weder, C; Tyler, D J
2012-01-01
The hypothesis is that mechanical mismatch between brain tissue and microelectrodes influences the inflammatory response. Our unique, mechanically-adaptive polymer nanocomposite enabled this study within the cerebral cortex of rats. The initial tensile storage modulus of 5 GPa decreases to 12 MPa within 15 minutes under physiological conditions. The response to the nanocomposite was compared to surface-matched, stiffer implants of traditional wires (411 GPa) coated with the identical polymer substrate and implanted on the contralateral side. Both implants were tethered. Fluorescent immunohistochemistry labeling examined neurons, intermediate filaments, macrophages, microglia, and proteoglycans. We demonstrate, for the first time, a system that decouples the mechanical and surface chemistry components of the neural response. The neuronal nuclei density within 100 μm of the device at four weeks post implantation was greater for the compliant nanocomposite compared to the stiff wire. At eight weeks post implantation, the neuronal nuclei density around the nanocomposite was maintained, but the density around the wire recovered to match the nanocomposite. The glial scar response to the compliant nanocomposite was less vigorous than to the stiffer wire. The results suggest that mechanically associated factors such as proteoglycans and intermediate filaments are important modulators of the response of the compliant nanocomposite. PMID:22049097
A novel compact compliant actuator design for rehabilitation robots.
Yu, Haoyong; Huang, Sunan; Thakor, Nitish V; Chen, Gong; Toh, Siew-Lok; Sta Cruz, Manolo; Ghorbel, Yassine; Zhu, Chi
2013-06-01
Rehabilitation robots have direct physical interaction with the human body. Ideally, actuators for rehabilitation robots should be compliant, force-controllable, and backdrivable due to safety and control considerations. Various designs of Series Elastic Actuators (SEA) have been developed for these applications. However, current SEA designs face a common performance limitation due to the compromise in spring stiffness selection. This paper presents a novel compact compliant force control actuator design for portable rehabilitation robots to overcome the performance limitations of current SEAs. Our design consists of a servomotor, a ball screw, a torsional spring between the motor and the ball screw, and a set of translational springs between the ball screw nut and the external load. The soft translational springs are used to handle low-force operation and reduce output impedance, stiction, and external shock load. The torsional spring, being in the high-speed range, has high effective stiffness and improves the system bandwidth in large-force operation when the translational springs are fully compressed. This design is also more compact due to the smaller size of the springs. We explain the construction and the working principle of our new design, followed by the dynamic modeling and analysis of the actuator. We also show preliminary testing results of a prototype actuator designed for a lower limb exoskeleton for gait rehabilitation.
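While both elastic stages are active they behave approximately as springs in series, so the soft translational stage dominates the output stiffness; once it bottoms out, only the stiff torsional stage remains. A sketch of that two-regime behaviour with hypothetical stiffness values (the paper's actual parameters are not given in the abstract):

```python
# Two-regime stiffness of the described two-stage compliant actuator.
# Springs in series combine as 1/k = 1/k1 + 1/k2, so the softer spring
# dominates; when the soft stage is fully compressed, the output sees
# only the stiff stage. All numeric values are illustrative.

def series_stiffness(k1, k2):
    """Effective stiffness of two springs in series (N/m)."""
    return 1.0 / (1.0 / k1 + 1.0 / k2)

K_TRANSLATIONAL = 10_000.0   # N/m, soft stage (hypothetical)
K_TORSIONAL_EQ = 200_000.0   # N/m, stiff stage reflected to output (hypothetical)

low_force = series_stiffness(K_TRANSLATIONAL, K_TORSIONAL_EQ)
print(round(low_force, 1))   # close to the soft spring alone: low impedance
print(K_TORSIONAL_EQ)        # after the soft springs fully compress: high bandwidth
```

The design choice this illustrates: low output impedance for safe small-force interaction, and high stiffness (hence bandwidth) reserved for large forces.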
Tyl, Rochelle W.
2009-01-01
Background Myers et al. [Environ Health Perspect 117:309–315 (2009)] argued that Good Laboratory Practices (GLPs) cannot be used as a criterion for selecting data for risk assessment, using bisphenol A (BPA) as a case study. They did not discuss the role(s) of guideline-compliant studies versus basic/exploratory research studies, and they criticized both GLPs and guideline-compliant studies and their roles in formal hazard evaluation and risk assessment. They also specifically criticized our published guideline-compliant dietary studies on BPA in rats and mice and 17β-estradiol (E2) in mice. Objectives As the study director/first author of the criticized E2 and BPA studies, I discuss the uses of basic research versus guideline-compliant studies, how testing guidelines are developed and revised, how new end points are validated, and the role of GLPs. I also provide an overview of the BPA guideline-compliant and exploratory research animal studies and describe BPA pharmacokinetics in rats and humans. I present responses to specific criticisms by Myers et al. Discussion and conclusions Weight-of-evidence evaluations have consistently concluded that low-level BPA oral exposures do not adversely affect human developmental or reproductive health, and I encourage increased validation efforts for “new” end points for inclusion in guideline studies, as well as performance of robust long-term studies to follow early effects (observed in small exploratory studies) to any adverse consequences. PMID:20049112
2014-04-01
This report introduces the important data structures of RTEMS, notably Object, a critical data structure in the SCORE, and task threads. Section 3.2.2 discusses problems found in RTEMS that may cause security vulnerabilities in these system structures: example code shows that a user can delete a system thread, demonstrating the need to protect critical system objects from unprivileged access.
Przybylo, Jennifer A; Wang, Ange; Loftus, Pooja; Evans, Kambria H; Chu, Isabella; Shieh, Lisa
2014-09-01
Though current hospital paging systems are neither efficient (callbacks disrupt workflow) nor secure (pagers are not Health Insurance Portability and Accountability Act [HIPAA]-compliant), they are routinely used to communicate patient information. Smartphone-based text messaging is a potentially more convenient and efficient mobile alternative; however, commercial cellular networks are also not secure. To determine whether augmenting one-way pagers with Medigram, a secure, HIPAA-compliant group messaging (HCGM) application for smartphones, could improve hospital team communication, we conducted an eight-week prospective, cluster-randomized, controlled trial at Stanford Hospital. Three inpatient medicine teams used the HCGM application in addition to paging, while two inpatient medicine teams used paging only for intra-team communication. Baseline and post-study surveys were collected from 22 control and 41 HCGM team members. When compared with paging, HCGM was rated significantly (P < 0.05) more effective in: (1) allowing users to communicate thoughts clearly (P = 0.010) and efficiently (P = 0.009) and (2) integrating into workflow during rounds (P = 0.018) and patient discharge (P = 0.012). Overall satisfaction with HCGM was significantly higher (P = 0.003). 85% of HCGM team respondents said they would recommend using an HCGM system on the wards. Smartphone-based, HIPAA-compliant group messaging applications improve provider perception of in-hospital communication, while providing the information security that paging and commercial cellular networks do not. © 2014 The Authors Journal of Hospital Medicine published by Wiley Periodicals, Inc. on behalf of Society of Hospital Medicine.
Instrumented Compliant Wrist with Proximity and Contact Sensing for Close Robot Interaction Control.
Laferrière, Pascal; Payeur, Pierre
2017-06-14
Compliance has been exploited in various forms in robotic systems to allow rigid mechanisms to come into contact with fragile objects, or with complex shapes that cannot be accurately modeled. Force feedback control has been the classical approach for providing compliance in robotic systems. However, by integrating other forms of instrumentation with compliance into a single device, it is possible to extend close monitoring of nearby objects before and after contact occurs. As a result, safer and smoother robot control can be achieved both while approaching and while touching surfaces. This paper presents the design and extensive experimental evaluation of a versatile, lightweight, and low-cost instrumented compliant wrist mechanism which can be mounted on any rigid robotic manipulator in order to introduce a layer of compliance while providing the controller with extra sensing signals during close interaction with an object's surface. Arrays of embedded range sensors provide real-time measurements on the position and orientation of surfaces, either located in proximity or in contact with the robot's end-effector, which permits close guidance of its operation. Calibration procedures are formulated to overcome inter-sensor variability and achieve the highest available resolution. A versatile solution is created by embedding all signal processing, while wireless transmission connects the device to any industrial robot's controller to support path control. Experimental work demonstrates the device's physical compliance as well as the stability and accuracy of the device outputs. Primary applications of the proposed instrumented compliant wrist include smooth surface following in manufacturing, inspection, and safe human-robot interaction.
Information Technology and Community Restoration Studies/Task 1: Information Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upton, Jaki F.; Lesperance, Ann M.; Stein, Steven L.
2009-11-19
Executive Summary The Interagency Biological Restoration Demonstration—a program jointly funded by the Department of Defense's Defense Threat Reduction Agency and the Department of Homeland Security's (DHS's) Science and Technology Directorate—is developing policies, methods, plans, and applied technologies to restore large urban areas, critical infrastructures, and Department of Defense installations following the intentional release of a biological agent (anthrax) by terrorists. There is a perception that there should be a common system that can share information both vertically and horizontally amongst participating organizations as well as support analyses. A key question is: "How far away from this are we?" As part of this program, Pacific Northwest National Laboratory conducted research to identify the current information technology tools that would be used by organizations in the greater Seattle urban area in such a scenario, to define criteria for use in evaluating information technology tools, and to identify current gaps. Researchers interviewed 28 individuals representing 25 agencies in civilian and military organizations to identify the tools they currently use to capture data needed to support operations and decision making. The organizations can be grouped into five broad categories: defense (Department of Defense), environmental/ecological (Environmental Protection Agency/Ecology), public health and medical services, emergency management, and critical infrastructure. The types of information that would be communicated in a biological terrorism incident include critical infrastructure and resource status, safety and protection information, laboratory test results, and general emergency information.
The most commonly used tools are WebEOC (web-enabled crisis information management systems with real-time information sharing), mass notification software, resource tracking software, and NW WARN (web-based information to protect critical infrastructure systems). It appears that the current information management tools are used primarily for information gathering and sharing—not decision making. Respondents identified the following criteria for a future software system. It is easy to learn, updates information in real time, works with all agencies, is secure, uses a visualization or geographic information system feature, enables varying permission levels, flows information from one stage to another, works with other databases, feeds decision support tools, is compliant with appropriate standards, and is reasonably priced. Current tools have security issues, lack visual/mapping functions and critical infrastructure status, and do not integrate with other tools. It is clear that there is a need for an integrated, common operating system. The system would need to be accessible by all the organizations that would have a role in managing an anthrax incident to enable regional decision making. The most useful tool would feature a GIS visualization that would allow for a common operating picture that is updated in real time. To capitalize on information gained from the interviews, the following activities are recommended: • Rate emergency management decision tools against the criteria specified by the interviewees. • Identify and analyze other current activities focused on information sharing in the greater Seattle urban area. • Identify and analyze information sharing systems/tools used in other regions.
Effect of Compliant Walls on Secondary Instabilities in Boundary-Layer Transition
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Morris, Philip J.
1991-01-01
For aerodynamic and hydrodynamic vehicles, it is highly desirable to reduce drag and noise levels. A reduction in drag leads to fuel savings. In particular for submersible vehicles, a decrease in noise levels inhibits detection. A suggested means to obtain these reduction goals is by delaying the transition from laminar to turbulent flow in external boundary layers. For hydrodynamic applications, a passive device which shows promise for transition delays is the compliant coating. In previous studies with a simple mechanical model representing the compliant wall, coatings were found that provided transition delays as predicted from the semi-empirical e(sup n) method. Those studies were concerned with the linear stage of transition where the instability of concern is referred to as the primary instability. For the flat-plate boundary layer, the Tollmien-Schlichting (TS) wave is the primary instability. In one of those studies, it was shown that three-dimensional (3-D) primary instabilities, or oblique waves, could dominate transition over the coatings considered. From the primary instability, the stretching and tilting of vorticity in the shear flow leads to a secondary instability mechanism. This has been theoretically described by Herbert based on Floquet theory. In the present study, Herbert's theory is used to predict the development of secondary instabilities over isotropic and non-isotropic compliant walls. Since oblique waves may be dominant over compliant walls, an extension of the secondary theory is made to allow for these 3-D primary instabilities. The effect of variations in primary amplitude, spanwise wavenumber, and Reynolds number on the secondary instabilities is examined. As in the rigid wall case, over compliant walls the subharmonic mode of secondary instability dominates for low-amplitude primary disturbances. Both isotropic and non-isotropic compliant walls lead to reduced secondary growth rates compared to the rigid wall results. 
For high frequencies, the non-isotropic wall suppresses the amplification of the secondary instabilities, while instabilities over the isotropic wall may grow with an explosive rate similar to the rigid wall results. For the more important lower frequencies, both isotropic and non-isotropic compliant walls suppress the amplification of secondary instabilities compared to the rigid wall results. The twofold major discovery and demonstration of the present investigation are: (1) the use of passive devices, such as compliant walls, can lead to significant reductions in the secondary instability growth rates and amplification; (2) suppressing the primary growth rates and subsequent amplification enable delays in the growth of the explosive secondary instability mechanism.
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software along with the interface to the embedded system are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms but the World Wide Web standards allow them to interface without knowing about the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
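The core idea of the abstract—letting any standard browser act as the user interface for an embedded system—can be sketched with Python's built-in HTTP server. The telemetry names and values below are hypothetical illustrations, not details from the paper:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_status_page(telemetry):
    """Render embedded-system telemetry as a minimal HTML page.

    Any standards-compliant browser can display this, so no custom
    client hardware or software is needed.
    """
    rows = "".join(
        f"<tr><td>{name}</td><td>{value}</td></tr>"
        for name, value in sorted(telemetry.items())
    )
    return f"<html><body><table>{rows}</table></body></html>"

class TelemetryHandler(BaseHTTPRequestHandler):
    # Hypothetical readings; a real system would query its sensors here.
    telemetry = {"temperature_C": 21.5, "fan_rpm": 1200}

    def do_GET(self):
        body = render_status_page(self.telemetry).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve for real (left commented in this sketch):
# HTTPServer(("", 8080), TelemetryHandler).serve_forever()
```

Because the interface is ordinary HTML over HTTP, the embedded side stays small while the presentation burden moves to the commodity browser.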
eXframe: reusable framework for storage, analysis and visualization of genomics experiments
2011-01-01
Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides 1) the ability to publish structured data compliant with accepted standards 2) support for multiple data types including microarrays and next generation sequencing 3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples) and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications. 
PMID:22103807
Publication of sensor data in the long-term environmental monitoring infrastructure TERENO
NASA Astrophysics Data System (ADS)
Stender, V.; Schroeder, M.; Klump, J. F.
2014-12-01
Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany from the North German lowlands to the Bavarian Alps and is operated by six research centers of the Helmholtz Association. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved and on the availability of the captured data. Data discovery and dissemination are facilitated not only through data portals of the regional TERENO observatories but also through a common spatial data infrastructure TEODOOR (TEreno Online Data repOsitORry). TEODOOR bundles the data, provided by the different web services of the single observatories, and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes data available through standard web services. TEODOOR accesses the OGC Sensor Web Enablement (SWE) interfaces offered by the regional observatories. In addition to the SWE interface, TERENO Northeast also publishes time series of environmental sensor data through the online research data publication platform DataCite. The metadata required by DataCite are created in an automated process by extracting information from the SWE SensorML to create ISO 19115 compliant metadata. The GFZ data management tool kit panMetaDocs is used to register Digital Object Identifiers (DOI) and preserve file based datasets. In addition to DOI, the International Geo Sample Numbers (IGSN) is used to uniquely identify research specimens.
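The automated step described above—deriving the metadata DataCite requires from information extracted out of a sensor description—reduces to a field mapping plus a completeness check. The input keys below are assumptions for illustration, not the actual TERENO SensorML schema:

```python
def sensor_to_datacite(sensor):
    """Map fields extracted from a sensor description to the mandatory
    DataCite metadata properties needed for DOI registration.

    The input keys ('doi', 'operator', 'station', ...) are illustrative
    assumptions; the real pipeline extracts them from SWE SensorML.
    """
    record = {
        "identifier": sensor["doi"],
        "creator": sensor["operator"],
        "title": f"Time series from {sensor['station']} ({sensor['observed_property']})",
        "publisher": "GFZ German Research Centre for Geosciences",
        "publicationYear": sensor["deployment_year"],
    }
    # DataCite rejects records missing any mandatory property, so fail early.
    missing = [k for k, v in record.items() if not v]
    if missing:
        raise ValueError(f"incomplete metadata: {missing}")
    return record
```

A real implementation would emit this record as DataCite XML and register it through the DOI service; the dictionary stage shown here is where the SensorML-to-ISO-19115 field alignment happens.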
Dry compliant seal for phosphoric acid fuel cell
Granata, Jr., Samuel J.; Woodle, Boyd M.
1990-01-01
A dry compliant overlapping seal for a phosphoric acid fuel cell formed of non-compliant Teflon to make an anode seal frame that encircles an anode assembly, a cathode seal frame that encircles a cathode assembly and a compliant seal frame made of expanded Teflon, generally encircling a matrix assembly. Each frame has a thickness selected to accommodate various tolerances of the fuel cell elements and is either bonded to one of the other frames or to a bipolar or end plate. One of the non-compliant frames is wider than the other frames forming an overlap of the matrix over the wider seal frame, which cooperates with electrolyte permeating the matrix to form a wet seal within the fuel cell that prevents process gases from intermixing at the periphery of the fuel cell and a dry seal surrounding the cell to keep electrolyte from the periphery thereof. The frames may be made in one piece, in L-shaped portions or in strips and have an outer perimeter which registers with the outer perimeter of bipolar or end plates to form surfaces upon which flanges of pan shaped, gas manifolds can be sealed.
Klebanoff (K-) modes in boundary layers (BLs) over compliant surfaces
NASA Astrophysics Data System (ADS)
Ali, Reza; Carpenter, Peter
2002-11-01
We investigate the effect of wall compliance on K-modes. These are associated with streaks observed in the transitional BL, generated by spanwise modulation of the streamwise velocity, and are thought to be the mechanism for bypass transition. They have been widely studied over flat-plate, rigid surfaces but not compliant surfaces. A novel velocity-vorticity formulation is adopted for the numerical simulations, and a freestream spanwise body force is used to generate the streaks. We find compliant walls are less receptive than rigid walls, i.e. freestream turbulence generates weaker disturbances over compliant walls. This effect intensifies with increasing compliance. Where a compliant panel is embedded into a rigid surface, the leading and trailing edges of the panel can introduce a stabilising or destabilising disturbance on the streaks depending on the Reynolds number. It is therefore possible to optimise the wall to suppress streaks and hence bypass transition. K-modes can also act as a theoretical model for the near-wall structures that generate the high skin-friction drag in turbulent BLs. In this scenario, increasing compliance increases the spanwise spacing and weakens the streak. This explains experimental observations that wall compliance reduces skin-friction drag and turbulence levels in turbulent BLs.
Parrish, Clyde F
2003-12-01
A series of workshops were sponsored by the Physical Science Division of NASA's Office of Biological and Physical Research to address operational gravity-compliant in-situ resource utilization and life support technologies. Workshop participants explored a Mars simulation study on Devon Island, Canada; the processing of carbon dioxide in regenerative life support systems; space tourism; rocket technology; plant growth research for closed ecological systems; and propellant extraction of planetary regoliths.
Environmentally Compliant Thermoplastic Powder Coating, Phase 1
1992-10-07
TPC flame sprayed application equipment and ethylene acrylic acid (EAA) and ethylene methacrylic acid (EMAA) copolymers thermoplastic powder...have worked closely with Dow Chemical to develop and optimize their systems using Dow "Envelon" ethylene acrylic acid (EAA) thermoplastic copolymers...provide on/off control. CFS recommends the use of Dow "Envelon" ethylene acrylic acid (EAA) copolymer thermoplastic powder with this unit. The CFS system
10 CFR 2.1011 - Management of electronic information.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Management of electronic information. 2.1011 Section 2... High-Level Radioactive Waste at a Geologic Repository § 2.1011 Management of electronic information. (a... Language)-compliant (ANSI X3.135-1992/ISO 9075-1992) database management system (DBMS). Alternatively, the...
The tool extracts deep phenotypic information from the clinical narrative at the document-, episode-, and patient-level. The final output is a FHIR-compliant patient-level phenotypic summary which can be consumed by research warehouses or the DeepPhe native visualization tool.
Learning Portfolio Analysis and Mining for SCORM Compliant Environment
ERIC Educational Resources Information Center
Su, Jun-Ming; Tseng, Shian-Shyong; Wang, Wei; Weng, Jui-Feng; Yang, Jin Tan David; Tsai, Wen-Nung
2006-01-01
With vigorous development of the Internet, e-learning system has become more and more popular. Sharable Content Object Reference Model (SCORM) 2004 provides the Sequencing and Navigation (SN) Specification to define the course sequencing behavior, control the sequencing, selecting and delivering of course, and organize the content into a…
Impact of EMP on Air Operations
pulse while in flight. With a requirement to be compliant with this standard for all critical weapon systems, the C-17 remains untested. The significant...EMP event that would threaten an in-flight C-17, there was not enough evidence to force a change to the current EMP testing and hardening schedule.
78 FR 40478 - Privacy Act of 1974; Notice of an Updated System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-05
... of MyUSA's programmatic interfaces, such as notifications, tasks, or events; (3) a history of third... Technology standards and information in the database is encrypted. Records are safeguarded in accordance with... algorithms and firewalls are compliant with National Institute of Standards and Technology standards...
Resource Management Scheme Based on Ubiquitous Data Analysis
Lee, Heung Ki; Jung, Jaehee
2014-01-01
Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
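The control loop the abstract describes—predict incoming request volume, then size the pool of pregenerated web processes accordingly—can be sketched with a simple moving-average predictor. The paper mines web logs with a richer model, so the prediction rule, parameter names, and thresholds below are all illustrative assumptions:

```python
from collections import deque

class AdaptiveProcessPool:
    """Size a web-server process pool from predicted request volume.

    Prediction here is a moving average over recent log intervals; the
    proposed scheme uses web log mining, so treat this as a sketch of
    the control loop only.
    """

    def __init__(self, min_procs=4, max_procs=64, window=5, reqs_per_proc=10):
        self.history = deque(maxlen=window)  # recent per-interval request counts
        self.min_procs = min_procs
        self.max_procs = max_procs
        self.reqs_per_proc = reqs_per_proc   # capacity of one pregenerated process

    def observe(self, requests_in_interval):
        """Record the request count for the latest log interval."""
        self.history.append(requests_in_interval)

    def target_size(self):
        """Predicted demand, clamped so memory is neither wasted nor starved."""
        if not self.history:
            return self.min_procs
        predicted = sum(self.history) / len(self.history)
        needed = round(predicted / self.reqs_per_proc)
        return max(self.min_procs, min(self.max_procs, needed))
```

The clamp captures the trade-off the abstract names: too many pregenerated processes waste memory, too few delay incoming requests.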
Context indexing of digital cardiac ultrasound records in PACS
NASA Astrophysics Data System (ADS)
Lobodzinski, S. Suave; Meszaros, Georg N.
1998-07-01
Recent wide adoption of the DICOM 3.0 standard by ultrasound equipment vendors created a need for practical clinical implementations of cardiac imaging study visualization, management and archiving. DICOM 3.0 defines only a logical and physical format for exchanging image data (still images, video, patient and study demographics). All DICOM compliant imaging studies must presently be archived on a 650 MB recordable compact disk. This is a severe limitation for ultrasound applications where studies of 3 to 10 minutes long are a common practice. In addition, DICOM digital echocardiography objects require physiological signal indexing, content segmentation and characterization. Since DICOM 3.0 is an interchange standard only, it does not define how to database composite video objects. The goal of this research was therefore to address the issues of efficient storage, retrieval and management of DICOM compliant cardiac video studies in a distributed PACS environment. Our Web based implementation has the advantage of accommodating both DICOM defined entity-relation modules (equipment data, patient data, video format, etc.) in standard relational database tables and digital indexed video with its attributes in an object relational database. Object relational data model facilitates content indexing of full motion cardiac imaging studies through bi-directional hyperlink generation that ties searchable video attributes and related objects to individual video frames in the temporal domain. Benefits realized from use of bi-directionally hyperlinked data models in an object relational database include: (1) real time video indexing during image acquisition, (2) random access and frame accurate instant playback of previously recorded full motion imaging data, and (3) time savings from faster and more accurate access to data through multiple navigation mechanisms such as multidimensional queries on an index, queries on a hyperlink attribute, free search and browsing.
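The bi-directional links between searchable attributes and individual video frames can be sketched as a relational layout: entity tables for the DICOM modules plus a link table indexed in both directions. Table and column names here are illustrative, not taken from the paper's schema:

```python
import sqlite3

def build_index_schema(conn):
    """Minimal sketch of the layout: DICOM study entities in a plain
    table, plus a link table that ties frame numbers to attribute labels
    and is indexed both ways (attribute -> frames and frame -> attributes)."""
    conn.executescript("""
        CREATE TABLE study (
            study_id   INTEGER PRIMARY KEY,
            patient_id TEXT,
            modality   TEXT
        );
        CREATE TABLE frame_link (
            study_id  INTEGER REFERENCES study(study_id),
            frame_no  INTEGER,   -- temporal position in the video loop
            attribute TEXT,      -- hypothetical label, e.g. 'end-systole'
            PRIMARY KEY (study_id, frame_no, attribute)
        );
        CREATE INDEX idx_attr  ON frame_link(attribute);           -- attribute -> frames
        CREATE INDEX idx_frame ON frame_link(study_id, frame_no);  -- frame -> attributes
    """)

def frames_for(conn, attribute):
    """Frame-accurate lookup: which frames carry a given label?"""
    return [row[0] for row in conn.execute(
        "SELECT frame_no FROM frame_link WHERE attribute = ? ORDER BY frame_no",
        (attribute,))]
```

The two indexes are what make the hyperlinks bi-directional: a query on a label jumps straight to frame numbers, and a frame lookup returns its labels, supporting the instant-playback navigation the abstract describes.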
Compliant tactile sensor that delivers a force vector
NASA Technical Reports Server (NTRS)
Torres-Jara, Eduardo (Inventor)
2010-01-01
Tactile Sensor. The sensor includes a compliant convex surface disposed above a sensor array, the sensor array adapted to respond to deformation of the convex surface to generate a signal related to an applied force vector. The applied force vector has three components to establish the direction and magnitude of an applied force. The compliant convex surface defines a dome with a hollow interior and has a linear relation between displacement and load including a magnet disposed substantially at the center of the dome above a sensor array that responds to magnetic field intensity.
Compliant Electrode and Composite Material for Piezoelectric Wind and Mechanical Energy Conversions
NASA Technical Reports Server (NTRS)
Chen, Bin (Inventor)
2015-01-01
A thin film device for harvesting energy from wind. The thin film device includes one or more layers of a compliant piezoelectric material formed from a composite of a polymer and an inorganic material, such as a ceramic. Electrodes are disposed on a first side and a second side of the piezoelectric material. The electrodes are formed from a compliant material, such as carbon nanotubes or graphene. The thin film device exhibits improved resistance to structural fatigue upon application of large strains and repeated cyclic loadings.
Approach for Structurally Clearing an Adaptive Compliant Trailing Edge Flap for Flight
NASA Technical Reports Server (NTRS)
Miller, Eric J.; Lokos, William A.; Cruz, Josue; Crampton, Glen; Stephens, Craig A.; Kota, Sridhar; Ervin, Gregory; Flick, Pete
2015-01-01
The Adaptive Compliant Trailing Edge (ACTE) flap was flown on the NASA Gulfstream GIII test bed at the NASA Armstrong Flight Research Center. This smoothly curving flap replaced the existing Fowler flaps creating a seamless control surface. This compliant structure, developed by FlexSys Inc. in partnership with Air Force Research Laboratory, supported NASA objectives for airframe structural noise reduction, aerodynamic efficiency, and wing weight reduction through gust load alleviation. A thorough structures airworthiness approach was developed to move this project safely to flight.
Spacecraft Onboard Interface Services: Current Status and Roadmap
NASA Astrophysics Data System (ADS)
Prochazka, Marek; Lopez Trescastro, Jorge; Krueger, Sabine
2016-08-01
Spacecraft Onboard Interface Services (SOIS) is a set of CCSDS standards defining communication stack services to interact with hardware equipment onboard spacecraft. In 2014 ESA kicked off three parallel activities to critically review the SOIS standards, use legacy spacecraft flight software (FSW), make it compliant to a preselected subset of SOIS standards and make performance and architecture assessment. As a part of the three parallel activities, led by Airbus DS Toulouse, OHB Bremen and Thales Alenia Space Cannes respectively, it was to provide feedback back to ESA and CCSDS and also to propose a roadmap of transition towards an operational FSW system fully compliant to applicable SOIS standards. The objective of the paper is twofold: Firstly it is to summarise main results of the three parallel activities and secondly, based on the results, to propose a roadmap for the future.
NASA Astrophysics Data System (ADS)
Li, Chuanwei; Kong, Yingxiao; Jiang, Wenchong; Wang, Zhiyong; Li, Linan; Wang, Shibin
2017-06-01
The wrinkling of a silicon monoxide thin film on a compliant poly(dimethylsiloxane) (PDMS) substrate structure was experimentally investigated in this study. The self-expansion effect of PDMS during film deposition was utilized to impose a pretensile strain on the structure through a specially made fixture. A laser scanning confocal microscope (LSCM) system with an in situ heating stage was employed for the real-time measurement. The Young’s modulus of the silicon monoxide thin film as well as the PDMS substrate was measured on the basis of the elasticity theory. Moreover, the effects of temperature variations on geometric parameters in the postbuckling state, such as wavelength and amplitude, were analyzed. It was proved that wavelength is relatively immune to thermal loads, while amplitude is much more sensitive.
Aerodynamic Flight-Test Results for the Adaptive Compliant Trailing Edge
NASA Technical Reports Server (NTRS)
Cumming, Stephen B.; Smith, Mark S.; Ali, Aliyah N.; Bui, Trong T.; Ellsworth, Joel C.; Garcia, Christian A.
2016-01-01
The aerodynamic effects of compliant flaps installed onto a modified Gulfstream III airplane were investigated. Analyses were performed prior to flight to predict the aerodynamic effects of the flap installation. Flight tests were conducted to gather both structural and aerodynamic data. The airplane was instrumented to collect vehicle aerodynamic data and wing pressure data. A leading-edge stagnation detection system was also installed. The data from these flights were analyzed and compared with predictions. The predictive tools compared well with flight data for small flap deflections, but differences between predictions and flight estimates were greater at larger deflections. This paper describes the methods used to examine the aerodynamics data from the flight tests and provides a discussion of the flight-test results in the areas of vehicle aerodynamics, wing sectional pressure coefficient profiles, and air data.
New NED XML/VOtable Services and Client Interface Applications
NASA Astrophysics Data System (ADS)
Pevunova, O.; Good, J.; Mazzarella, J.; Berriman, G. B.; Madore, B.
2005-12-01
The NASA/IPAC Extragalactic Database (NED) provides data and cross-identifications for over 7 million extragalactic objects fused from thousands of survey catalogs and journal articles. The data cover all frequencies from radio through gamma rays and include positions, redshifts, photometry and spectral energy distributions (SEDs), sizes, and images. NED services have traditionally supplied data in HTML format for connections from Web browsers, and a custom ASCII data structure for connections by remote computer programs written in the C programming language. We describe new services that provide responses from NED queries in XML documents compliant with the international virtual observatory VOtable protocol. The XML/VOtable services support cone searches, all-sky searches based on object attributes (survey names, cross-IDs, redshifts, flux densities), and requests for detailed object data. Initial services have been inserted into the NVO registry, and others will follow soon. The first client application is a Style Sheet specification for rendering NED VOtable query results in Web browsers that support XML. The second prototype application is a Java applet that allows users to compare multiple SEDs. The new XML/VOtable output mode will also simplify the integration of data from NED into visualization and analysis packages, software agents, and other virtual observatory applications. We show an example SED from NED plotted using VOPlot. The NED website is: http://nedwww.ipac.caltech.edu.
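Because a VOtable response is plain XML, a client can consume such query results with the standard library alone. The fragment below is hand-written and namespace-free for brevity; real VOtable documents carry a namespace and many more FIELDs, so this is a sketch of the parsing pattern, not of NED's actual output:

```python
import xml.etree.ElementTree as ET

def parse_votable(xml_text):
    """Extract column names and data rows from a (namespace-free) VOtable.

    Real VOtable documents are namespaced; register or strip the
    namespace before using bare tag names as done here.
    """
    root = ET.fromstring(xml_text)
    fields = [f.get("name") for f in root.iter("FIELD")]
    rows = [[td.text for td in tr.iter("TD")] for tr in root.iter("TR")]
    return fields, rows

# Minimal hand-written fragment illustrating the VOtable structure.
SAMPLE = """
<VOTABLE><RESOURCE><TABLE>
  <FIELD name="object"/><FIELD name="redshift"/>
  <DATA><TABLEDATA>
    <TR><TD>MESSIER 031</TD><TD>-0.001</TD></TR>
  </TABLEDATA></DATA>
</TABLE></RESOURCE></VOTABLE>
"""
```

This uniformity is the point of the VOtable services: the same few lines of client code work against any compliant virtual-observatory endpoint.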
MR efficiency using automated MRI-desktop eProtocol
NASA Astrophysics Data System (ADS)
Gao, Fei; Xu, Yanzhe; Panda, Anshuman; Zhang, Min; Hanson, James; Su, Congzhe; Wu, Teresa; Pavlicek, William; James, Judy R.
2017-03-01
MRI protocols are instruction sheets that radiology technologists use in routine clinical practice for guidance (e.g., slice position, acquisition parameters etc.). In Mayo Clinic Arizona (MCA), there are over 900 MR protocols (ranging across neuro, body, cardiac, breast etc.) which makes maintaining and updating the protocol instructions a labor intensive effort. The task is even more challenging given different vendors (Siemens, GE etc.). This is a universal problem faced by all the hospitals and/or medical research institutions. To increase the efficiency of the MR practice, we designed and implemented a web-based platform (eProtocol) to automate the management of MRI protocols. It is built upon a database that automatically extracts protocol information from DICOM compliant images and provides a user-friendly interface to the technologists to create, edit and update the protocols. Advanced operations such as protocol migrations from scanner to scanner and capability to upload Multimedia content were also implemented. To the best of our knowledge, eProtocol is the first MR protocol automated management tool used clinically. It is expected that this platform will significantly improve the radiology operations efficiency including better image quality and exam consistency, fewer repeat examinations and less acquisition errors. These protocols instructions will be readily available to the technologists during scans. In addition, this web-based platform can be extended to other imaging modalities such as CT, Mammography, and Interventional Radiology and different vendors for imaging protocol management.
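Automatically populating protocol records from DICOM-compliant images amounts to reading a few acquisition tags out of each header. The tag keywords below are standard DICOM attribute names, but the record layout is an assumption, and a real system would decode the header with a DICOM toolkit rather than receive it as a dictionary:

```python
# Standard DICOM keywords for common MR acquisition parameters,
# mapped to hypothetical protocol-record field names.
PROTOCOL_TAGS = {
    "ProtocolName": "name",
    "RepetitionTime": "tr_ms",
    "EchoTime": "te_ms",
    "SliceThickness": "slice_mm",
    "Manufacturer": "vendor",   # lets the record stay vendor-aware
}

def extract_protocol(header):
    """Build a protocol record from a decoded DICOM header.

    The header is modeled as a keyword -> value dict for this sketch.
    Missing tags are recorded as None so technologists can review and
    complete the entry through the web interface.
    """
    return {field: header.get(keyword) for keyword, field in PROTOCOL_TAGS.items()}
```

Keeping the vendor in the record is what would let a platform like this manage Siemens and GE variants of the same protocol side by side.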
Approach for Structurally Clearing an Adaptive Compliant Trailing Edge Flap for Flight
NASA Technical Reports Server (NTRS)
Miller, Eric J.; Lokos, William A.; Cruz, Josue; Crampton, Glen; Stephens, Craig A.; Kota, Sridhar; Ervin, Gregory; Flick, Pete
2015-01-01
The Adaptive Compliant Trailing Edge (ACTE) flap was flown on the National Aeronautics and Space Administration (NASA) Gulfstream GIII testbed at the NASA Armstrong Flight Research Center. This smoothly curving flap replaced the existing Fowler flaps creating a seamless control surface. This compliant structure, developed by FlexSys Inc. in partnership with the Air Force Research Laboratory, supported NASA objectives for airframe structural noise reduction, aerodynamic efficiency, and wing weight reduction through gust load alleviation. A thorough structures airworthiness approach was developed to move this project safely to flight. A combination of industry and NASA standard practice require various structural analyses, ground testing, and health monitoring techniques for showing an airworthy structure. This paper provides an overview of compliant structures design, the structural ground testing leading up to flight, and the flight envelope expansion and monitoring strategy. Flight data will be presented, and lessons learned along the way will be highlighted.
Yao, Cheng-Hsiang; Lin, Kun-Ju; Weng, Chi-Chang; Hsiao, Ing-Tsung; Ting, Yi-Shu; Yen, Tzu-Chen; Jan, Tong-Rong; Skovronsky, Daniel; Kung, Mei-Ping; Wey, Shiaw-Pyng
2010-12-01
We report herein the Good Manufacturing Practice (GMP)-compliant automated synthesis of (18)F-labeled styrylpyridine, AV-45 (Florbetapir), a novel tracer for positron emission tomography (PET) imaging of beta-amyloid (Abeta) plaques in the brain of Alzheimer's disease patients. [(18)F]AV-45 was prepared in 105 min using a tosylate precursor with Sumitomo modules for radiosynthesis under GMP-compliant conditions. The overall yield was 25.4+/-7.7% with a final radiochemical purity of 95.3+/-2.2% (n=19). The specific activity of [(18)F]AV-45 reached as high as 470+/-135 TBq/mmol (n=19). The present studies show that [(18)F]AV-45 can be manufactured under GMP-compliant conditions and could be widely available for routine clinical use. Copyright 2010 Elsevier Ltd. All rights reserved.
Design and control of compliant tensegrity robots through simulation and hardware validation
Caluwaerts, Ken; Despraz, Jérémie; Işçen, Atıl; Sabelhaus, Andrew P.; Bruce, Jonathan; Schrauwen, Benjamin; SunSpiral, Vytas
2014-01-01
To better understand the role of tensegrity structures in biological systems and their application to robotics, the Dynamic Tensegrity Robotics Lab at NASA Ames Research Center, Moffett Field, CA, USA, has developed and validated two software environments for the analysis, simulation and design of tensegrity robots. These tools, along with new control methodologies and the modular hardware components developed to validate them, are presented as a system for the design of actuated tensegrity structures. As evidenced from their appearance in many biological systems, tensegrity (‘tensile–integrity’) structures have unique physical properties that make them ideal for interaction with uncertain environments. Yet, these characteristics make design and control of bioinspired tensegrity robots extremely challenging. This work presents the progress our tools have made in tackling the design and control challenges of spherical tensegrity structures. We focus on this shape since it lends itself to rolling locomotion. The results of our analyses include multiple novel control approaches for mobility and terrain interaction of spherical tensegrity structures that have been tested in simulation. A hardware prototype of a spherical six-bar tensegrity, the Reservoir Compliant Tensegrity Robot, is used to empirically validate the accuracy of simulation. PMID:24990292
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly-mean data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations; the operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables; this near-real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following:
- A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
- A Docker-ready RES application to deploy across platforms
- Extended capabilities that enable single- and multi-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages
- Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly)
- Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR…
- A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
- Supporting analytic services for NASA GMAO Forward Processing datasets
- Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
- The ability to compute and visualize multiple reanalyses for ease of intercomparison
- Automated tools to retrieve and prepare data collections for analytic processing
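The arithmetic operations listed above (avg, anomaly, ensemble average) can be sketched in a few lines of plain Python; the reanalysis names and monthly values below are illustrative placeholders, not output of the actual RES API.

```python
from statistics import mean

# Hypothetical monthly-mean series (deg C) for one grid cell from three
# reanalyses; product names match the abstract, values are invented.
reanalyses = {
    "MERRA-2":     [14.2, 14.8, 15.1, 14.6],
    "ERA-Interim": [14.0, 14.9, 15.3, 14.4],
    "JRA-55":      [14.5, 14.7, 15.0, 14.8],
}

def ensemble_average(series_by_product):
    """Month-by-month mean across all products."""
    return [mean(vals) for vals in zip(*series_by_product.values())]

def anomaly(series):
    """Departure of each month from the series' own climatological mean."""
    clim = mean(series)
    return [v - clim for v in series]

ens = ensemble_average(reanalyses)
merra2_anom = anomaly(reanalyses["MERRA-2"])
```

An "anomaly" operation, as used here, is simply the departure from a climatological mean; the real service presumably computes it against a reference climatology rather than the series itself.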
Cable applications in robot compliant devices
NASA Technical Reports Server (NTRS)
Kerley, James J.
1987-01-01
Robotic systems need compliance to connect the robot to the work object. The cable system illustrated offers compliance for mating but can be adjusted in space to become quite stiff. Thus the same system can do both tasks, even in environments where the work object or robot is moving at different frequencies and amplitudes. The adjustment can be made in all six degrees of freedom, translated or rotated in any plane, while still making good contact and maintaining control.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-03-28
GridAPPS-D is an open-source, open-architecture, standards-based platform for the development of advanced electric power system planning and operations applications. GridAPPS-D provides a documented data abstraction that enables developers to create applications that can run in any compliant system or platform. This supports applications that are platform- and vendor-independent and that take advantage of the data-rich, data-driven possibilities created by the deployment of smart grid devices and systems.
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo W.; Roche, Rigoberto
2017-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx(Trademark) ML605 Virtex(Trademark)-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an embedded PC (Axiomtek(Trademark) eBox 620-110-FL) running the Ubuntu 12.04 operating system. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform development for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.
Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data
NASA Technical Reports Server (NTRS)
Baxes, Gregory; Mixon, Brian; Linger, TIm
2013-01-01
Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application continually issues data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset.
The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
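The cascading idea can be illustrated with a short server-side generator sketch: each response embeds a standard KML NetworkLink whose Region/Lod pair tells a KML-compliant client when a child tile occupies enough screen pixels to warrant a follow-on request. The element names are standard KML 2.2; the endpoint URL and query parameters are hypothetical.

```python
# Minimal sketch of dynamically generated cascading KML. The <Region>/<Lod>
# pair makes the client fetch <href> only once the region spans at least
# min_pixels on screen (viewRefreshMode=onRegion).
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <NetworkLink>
      <Region>
        <LatLonAltBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonAltBox>
        <Lod><minLodPixels>{min_pixels}</minLodPixels></Lod>
      </Region>
      <Link>
        <href>{href}</href>
        <viewRefreshMode>onRegion</viewRefreshMode>
      </Link>
    </NetworkLink>
  </Document>
</kml>"""

def child_link(north, south, east, west, level, min_pixels=128):
    """Build KML directing the client to request one quadtree child
    region's higher-LOD imagery; the tile endpoint is hypothetical."""
    href = (f"https://example.org/tiles?level={level}"
            f"&bbox={west},{south},{east},{north}")
    return KML_TEMPLATE.format(north=north, south=south, east=east,
                               west=west, min_pixels=min_pixels, href=href)

doc = child_link(45.0, 40.0, -100.0, -105.0, level=3)
```

A real server would emit four such NetworkLinks per parent region (one per quadtree child), each pointing back at the same generator one level deeper.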
Development and Evaluation of an Interactive WebQuest Environment: "Web Macerasi"
ERIC Educational Resources Information Center
Gulbahar, Yasemin; Madran, R. Orcun; Kalelioglu, Filiz
2010-01-01
This study was conducted to develop a web-based interactive system, Web Macerasi, for teaching-learning and evaluation purposes, and to find out the possible effects of this system. The study has two stages. In the first stage, a WebQuest site was designed as an interactive system in which various Internet and web technologies were used for…
NASA Astrophysics Data System (ADS)
McDonald, Michael C.; Kim, H. K.; Henry, J. R.; Cunningham, I. A.
2012-03-01
The detective quantum efficiency (DQE) is widely accepted as a primary measure of x-ray detector performance in the scientific community. A standard method for measuring the DQE, based on IEC 62220-1, requires the system to have a linear response, meaning that the detector output signals are proportional to the incident x-ray exposure. However, many systems have a non-linear response due to characteristics of the detector, or post-processing of the detector signals, that cannot be disabled and may involve unknown algorithms considered proprietary by the manufacturer. For these reasons, the DQE has not been considered a practical candidate for routine quality assurance testing in a clinical setting. In this article we describe a method that can be used to measure the DQE of both linear and non-linear systems that employ only linear image-processing algorithms. The method was validated on a cesium iodide-based flat-panel system that simultaneously stores a raw (linear) and a processed (non-linear) image for each exposure. It was found that the resulting DQE was equivalent, within measurement precision, to a conventional standards-compliant DQE, and that the gray-scale inversion and linear edge enhancement did not affect the DQE result. While not IEC 62220-1 compliant, the method may be adequate for QA programs.
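Once the MTF, normalized noise power spectrum, and incident fluence are in hand, the textbook frequency-dependent relation DQE(f) = MTF(f)^2 / (q · NNPS(f)) is a one-line computation. The numbers below are illustrative, not measurements from the system described.

```python
def dqe(mtf, nnps, q):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with q the incident photon
    fluence (photons/mm^2) and NNPS the normalized noise power
    spectrum (mm^2). All inputs here are illustrative."""
    return [m ** 2 / (q * n) for m, n in zip(mtf, nnps)]

freqs = [0.5, 1.0, 2.0]            # spatial frequency, cycles/mm
mtf = [0.90, 0.75, 0.45]           # presampled MTF (unitless)
nnps = [2.0e-5, 1.8e-5, 1.5e-5]    # NNPS, mm^2 (made-up values)
q = 5.0e4                          # incident fluence, photons/mm^2

values = dqe(mtf, nnps, q)
```

The practical difficulty the abstract addresses is upstream of this formula: for a non-linear system, the measured MTF and NNPS must first be corrected back to a linear signal domain before the ratio is meaningful.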
Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks.
Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A; Montefiori, David C; Zimmermann, Heiko; von Briesen, Hagen
2018-01-01
The standardized assessments of HIV-specific immune responses are of main interest in the preclinical and clinical stages of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role in the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a decapper, a tubing pump, and a safety device consisting of ultrasound sensors for online liquid-level detection of each individual cryovial. With the aim of aliquoting the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials.
Tellurium notebooks-An environment for reproducible dynamical modeling in systems biology.
Medley, J Kyle; Choi, Kiri; König, Matthias; Smith, Lucian; Gu, Stanley; Hellerstein, Joseph; Sealfon, Stuart C; Sauro, Herbert M
2018-06-01
The considerable difficulty encountered in reproducing the results of published dynamical models limits validation, exploration and reuse of this increasingly large biomedical research resource. To address this problem, we have developed Tellurium Notebook, a software system for model authoring, simulation, and teaching that facilitates building reproducible dynamical models and reusing models by 1) providing a notebook environment which allows models, Python code, and narrative to be intermixed, 2) supporting the COMBINE archive format during model development for capturing model information in an exchangeable format, and 3) enabling users to easily simulate and edit public COMBINE-compliant models from public repositories to facilitate studying model dynamics, variants and test cases. Tellurium Notebook, a Python-based Jupyter-like environment, is designed to seamlessly interoperate with these community standards by automating conversion between COMBINE standards formulations and corresponding in-line, human-readable representations. Thus, Tellurium brings to systems biology the strategy used by other literate notebook systems such as Mathematica. These capabilities allow users to edit every aspect of the standards-compliant models and simulations, run the simulations in-line, and re-export to standard formats. We provide several use cases illustrating the advantages of our approach and how it allows development and reuse of models without requiring technical knowledge of standards. Adoption of Tellurium should accelerate model development, reproducibility and reuse.
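A minimal stand-in for the kind of dynamical model such notebooks manage is a one-reaction kinetic system integrated by forward Euler. This is a generic plain-Python illustration of "simulate a kinetic model reproducibly", not Tellurium's actual API or a COMBINE archive.

```python
def simulate(k=0.5, a0=10.0, b0=0.0, t_end=10.0, dt=0.001):
    """Forward-Euler integration of the toy reaction A -> B with rate
    k*[A]. Parameter names and values are invented for illustration."""
    a, b, t = a0, b0, 0.0
    while t < t_end:
        flux = k * a * dt  # amount converted in this step
        a -= flux
        b += flux
        t += dt
    return a, b

a_final, b_final = simulate()
```

Fixing every numeric choice (rate constant, initial conditions, step size) in one place is exactly the reproducibility discipline that exchange formats like SBML/COMBINE enforce at scale.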
NASA's Global Imagery Browse Services - Technologies for Visualizing Earth Science Data
NASA Astrophysics Data System (ADS)
Cechini, M. F.; Boller, R. A.; Baynes, K.; Schmaltz, J. E.; Thompson, C. K.; Roberts, J. T.; Rodriguez, J.; Wong, M. M.; King, B. A.; King, J.; De Luca, A. P.; Pressley, N. N.
2017-12-01
For more than 20 years, the NASA Earth Observing System (EOS) has collected earth science data for thousands of scientific parameters, now totaling nearly 15 petabytes of data. In 2013, NASA's Global Imagery Browse Services (GIBS) formed its vision to "transform how end users interact and discover [EOS] data through visualizations." This vision included leveraging scientific and community best practices and standards to provide a scalable, compliant, and authoritative source for EOS earth science data visualizations. Since that time, GIBS has grown quickly and now services millions of daily requests for over 500 imagery layers representing hundreds of earth science parameters to a broad community of users. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. The GIBS system is built upon the OnEarth and MRF open source software projects, which are provided by the GIBS team. This software facilitates standards-based access for compliance with existing GIS tools. The GIBS imagery layers are predominantly rasterized images represented in two-dimensional coordinate systems, though multiple projections are supported. The OnEarth software also supports the GIBS ingest pipeline to facilitate low-latency updates to new or updated visualizations. This presentation will focus on the following topics:
- Overview of GIBS visualizations and user community
- Current benefits and limitations of the OnEarth and MRF software projects and related standards
- GIBS access methods and their compatibilities and incompatibilities with existing GIS libraries and applications
- Considerations for visualization accuracy and understandability
- Future plans for more advanced visualization concepts, including vertical profiles and vector-based representations
- Future plans for Amazon Web Services support and deployments
A Functional Approach to Hyperspectral Image Analysis in the Cloud
NASA Astrophysics Data System (ADS)
Wilson, A.; Lindholm, D. M.; Coddington, O.; Pilewskie, P.
2017-12-01
Hyperspectral image volumes are very large. A hyperspectral image analysis (HIA) may use 100 TB of data, a huge barrier to their use. Hylatis is a new NASA project to create a toolset for HIA. Through web notebook and cloud technology, Hylatis will provide a more interactive experience for HIA by defining and implementing concepts and operations for HIA, identified and vetted by subject matter experts, and callable within a general purpose language, particularly Python. Hylatis leverages LaTiS, a data access framework developed at LASP. With an OPeNDAP-compliant interface plus additional server-side capabilities, the LaTiS API provides a uniform interface to virtually any data source, and has been applied to various storage systems, including file systems, databases, and remote servers, and in various domains including space science, systems administration, and stock quotes. In the LaTiS architecture, data 'adapters' read data into a data model, where server-side computations occur. Data 'writers' write data from the data model into the desired format. The Hylatis difference is the data model. In LaTiS, data are represented as mathematical functions of independent and dependent variables. Domain semantics are not present at this level, but are instead present in higher software layers. The benefit of a domain-agnostic, mathematical representation is having the power of math, particularly functional algebra, unconstrained by domain semantics. This agnosticism supports reusable server-side functionality applicable in any domain, such as statistical, filtering, or projection operations. Algorithms to aggregate or fuse data can be simpler because domain semantics are separated from the math. Hylatis will map the functional model onto the Spark relational interface, thereby adding a functional interface to that big data engine. This presentation will discuss Hylatis goals, strategies, and current state.
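The functional data model can be sketched as a mapping from independent-variable tuples to dependent values, with domain-agnostic operations defined on the function itself. The class, method, and variable names here are illustrative only, not LaTiS's actual API.

```python
# Sketch of the "dataset as mathematical function" idea: a dataset maps
# independent-variable tuples to dependent values, and operations like
# filtering and averaging need no domain semantics at all.
class Dataset:
    def __init__(self, samples):
        # samples: dict of independent-variable tuples -> dependent value
        self.samples = dict(samples)

    def __call__(self, *indep):
        """Evaluate the function at one independent-variable tuple."""
        return self.samples[indep]

    def select(self, predicate):
        """Domain-agnostic filtering on the independent variables."""
        return Dataset({k: v for k, v in self.samples.items()
                        if predicate(*k)})

    def mean(self):
        """Domain-agnostic statistic over all dependent values."""
        vals = list(self.samples.values())
        return sum(vals) / len(vals)

# Radiance as a function of (wavelength_nm, pixel); values invented.
cube = Dataset({(450, 0): 1.2, (450, 1): 1.4,
                (550, 0): 2.0, (550, 1): 2.2})
green = cube.select(lambda wl, px: wl == 550)
```

Note that `select` and `mean` know nothing about wavelengths or pixels; that separation is the claimed benefit of keeping domain semantics in higher layers.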
Foster, Wendy; Gilder, Jason; Love, Thomas E; Jain, Anil K
2012-01-01
Objective To demonstrate the potential of de-identified clinical data from multiple healthcare systems using different electronic health records (EHR) to be efficiently used for very large retrospective cohort studies. Materials and methods Data of 959 030 patients, pooled from multiple different healthcare systems with distinct EHR, were obtained. Data were standardized and normalized using common ontologies, searchable through a HIPAA-compliant, patient de-identified web application (Explore; Explorys Inc). Patients were 26 years or older and were seen in multiple healthcare systems from 1999 to 2011, with data from EHR. Results Comparing obese, tall subjects with normal body mass index, short subjects, the venous thromboembolic events (VTE) OR was 1.83 (95% CI 1.76 to 1.91) for women and 1.21 (1.10 to 1.32) for men. Weight had more effect than height on VTE. Compared with Caucasian subjects, Hispanic/Latino subjects had a much lower risk of VTE (female OR 0.47, 0.41 to 0.55; male OR 0.24, 0.20 to 0.28) and African-Americans a substantially higher risk (female OR 1.83, 1.76 to 1.91; male OR 1.58, 1.50 to 1.66). This 13-year retrospective study of almost one million patients was performed over approximately 125 h in 11 weeks, part time, by the five authors. Discussion As research informatics tools develop and more clinical data become available in EHR, it is important to study and understand unique opportunities for clinical research informatics to transform the scale and resources needed to perform certain types of clinical research. Conclusions With the right clinical research informatics tools and EHR data, some types of very large cohort studies can be completed with minimal resources. PMID:22759621
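The odds ratios and confidence intervals reported above follow the standard 2x2-table computation with a Woolf (logit) interval. The counts below are made up for illustration; they are not the study's data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Uses the Woolf logit interval: SE = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 240 VTE events among 10 000 exposed,
# 135 among 10 000 unexposed.
or_, lo, hi = odds_ratio_ci(240, 10000, 135, 10000)
```

With EHR cohorts this size, even modest ORs come with narrow intervals, which is why the paper's ratios are quoted to two decimal places.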
Kotb, Magd A; Elmahdy, Hesham Nabeh; Khalifa, Nour El Deen Mahmoud; El-Deen, Mohamed Hamed Nasr; Lotfi, Mohamed Amr N
2015-07-01
Evidence-based medicine (EBM) is delivered through didactic, blended learning, and mixed models. Students are supposed to construct an answerable question in the PICO (patient, intervention, comparison, and outcome) framework, acquire evidence through a search of the literature, appraise the evidence, apply it to the clinical case scenario, and assess the evidence in relation to the clinical context. Yet these teaching models have limitations, especially those related to group work: for example, handling uncooperative students, students who fail to contribute, students who domineer, and students who have personal conflicts; the impact of such students upon the progress of their groups; and inconsistent individual acquisition of required skills. At the Pediatrics Department, Faculty of Medicine, Cairo University, we designed a novel undergraduate pediatric EBM assignment online system to overcome the shortcomings of the previous didactic method, and aimed to assess its effectiveness by prospective follow-up during academic years 2012 to 2013 and 2013 to 2014. The novel web-based online interactive system was tailored to provide sequential single and group assignments for each student. The single assignment addressed a specific case-scenario question, while the group assignment was teamwork that addressed different questions of the same case scenario. Assignments comprised scholarly content and skills. We objectively analyzed students' performance by criterion-based assessment and subjectively by an anonymous student questionnaire. A total of 2879 students were enrolled in the 5th-year Pediatrics Course consecutively; of them, 2779 (96.5%) logged in and 2554 (88.7%) submitted their work. They were randomly assigned to 292 groups. A total of 2277 (89.15%) achieved ≥ 80% of the total mark (4/5); of them, 717 (28.1%) achieved a full mark. A total of 2178 (85.27%) and 2359 (92.36%) made evidence-based conclusions and recommendations in the single and group assignments, respectively (P < 0.001).
A total of 1102 (43.1%) answered the student questionnaire; of them, 898 (81.48%) found the e-educational experience satisfactory, 175 (15.88%) disagreed, and 29 (2.6%) could not decide. A total of 964 (87.47%) found the single assignment educational, 913 (82.84%) found the group assignment educational, and 794 (72.3%) enjoyed it. The web-based online interactive undergraduate EBM assignment was found effective in teaching medical students and assured individual student acquisition of the concepts and skills of pediatric EBM. It was effective in mass education, and in the data collection and storage essential for system and student assessment.
Compliant tactile sensor for generating a signal related to an applied force
NASA Technical Reports Server (NTRS)
Torres-Jara, Eduardo (Inventor)
2012-01-01
Tactile sensor. The sensor includes a compliant convex surface disposed above a sensor array, the sensor array adapted to respond to deformation of the convex surface to generate a signal related to an applied force vector.
76 FR 17429 - Buy American Exceptions Under the American Recovery and Reinvestment Act of 2009
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-29
... closets that comply with the Americans with Disabilities Act (ADA-compliant water closets) at the Orness... goods (ADA-compliant water closets) are not produced in the U.S. in sufficient and reasonably available...
The design and implementation of web mining in web sites security
NASA Astrophysics Data System (ADS)
Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li
2003-06-01
The backdoor or information leak of Web servers can be detected by using Web mining techniques on abnormal Web log and Web application log data. The security of Web servers can thereby be enhanced and the damage of illegal access avoided. First, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Second, those patterns are provided to system administrators so they can modify their code and enhance their Web site security. The following aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that a firewall and an intrusion detection system cannot find; another is to propose an operation module for the Web site to enhance its security. For clustering server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
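A toy version of the leak-pattern idea: pool per-script response sizes from the combined logs and flag responses that deviate far from that script's norm. The log entries, field layout, and threshold are invented for illustration; a production system would use the density-based session clustering the abstract describes rather than this simple z-score.

```python
from statistics import mean, pstdev

# Illustrative combined-log entries: (CGI script, response bytes).
# The 98000-byte response is the planted "leak".
access_log = [
    ("/cgi-bin/search.cgi", 2100), ("/cgi-bin/search.cgi", 1900),
    ("/cgi-bin/search.cgi", 2050), ("/cgi-bin/search.cgi", 98000),
    ("/cgi-bin/login.cgi", 800),  ("/cgi-bin/login.cgi", 820),
]

def suspicious(entries, threshold=1.5):
    """Flag responses whose size is > threshold population-SDs above
    that script's mean. The threshold is kept low because pstdev over
    tiny samples caps the attainable z-score near sqrt(n - 1)."""
    by_script = {}
    for script, size in entries:
        by_script.setdefault(script, []).append(size)
    flags = []
    for script, sizes in by_script.items():
        mu, sigma = mean(sizes), pstdev(sizes)
        for size in sizes:
            if sigma and (size - mu) / sigma > threshold:
                flags.append((script, size))
    return flags
```

Here the oversized `search.cgi` response stands out against its peers while the tightly clustered `login.cgi` responses do not.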
Pirro, Valentina; Girolami, Flavia; Spalenza, Veronica; Gardini, Giulia; Badino, Paola; Nebbia, Carlo
2015-01-01
A chemometric class modelling strategy (unequal dispersed classes - UNEQ) was applied for the first time as a possible screening method to monitor the abuse of growth promoters in veal calves. Five serum biomarkers, known to reflect exposure to classes of compounds illegally used as growth promoters, were determined from 50 untreated animals in order to design a model of controls, representing veal calves reared under good, safe and highly standardised breeding conditions. The class modelling was applied to 421 commercially bred veal calves to separate them into 'compliant' and 'non-compliant' with respect to the modelled controls. Some of the non-compliant animals underwent further histological and chemical examinations to confirm the presence of either alterations in target tissues or traces of illegal substances commonly administered for growth-promoting purposes. Overall, the congruence between the histological or chemical methods and the UNEQ non-compliant outcomes was approximately 58%, likely an underestimate due to the blinded nature of this examination. Further research is needed to confirm the validity of the UNEQ model in terms of sensitivity in recognising untreated animals as compliant with the controls, and specificity in revealing deviations from ideal breeding conditions, for example due to the abuse of growth promoters.
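The screening step can be caricatured in a few lines: fit the control class from untreated animals, then call a new animal non-compliant when any biomarker lies too far from the control distribution. UNEQ proper uses the full Mahalanobis distance over all five biomarkers; this diagonal, two-biomarker version with invented numbers is only a sketch of the idea.

```python
from statistics import mean, stdev

def fit_control_model(control_samples):
    """Per-biomarker mean and SD from untreated controls. (UNEQ uses
    the full covariance matrix; this diagonal simplification ignores
    biomarker correlations.)"""
    cols = list(zip(*control_samples))
    return [(mean(c), stdev(c)) for c in cols]

def is_compliant(model, sample, limit=3.0):
    """Compliant if every standardized biomarker distance is within limit."""
    return all(abs(x - mu) / sd <= limit
               for x, (mu, sd) in zip(sample, model))

# Two illustrative biomarkers for four control animals (invented units).
controls = [[1.0, 5.0], [1.2, 5.5], [0.9, 4.8], [1.1, 5.2]]
model = fit_control_model(controls)
```

In class modelling, only the control class is modelled; anything outside its acceptance region is flagged, which is what lets the method screen for *any* treatment effect rather than one specific substance.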
Dependence of residual displacements on the width and depth of compliant fault zones: a 3D study
NASA Astrophysics Data System (ADS)
Kang, J.; Duan, B.
2011-12-01
Compliant fault zones have been detected along active faults by seismic investigations (trapped waves and travel-time analysis) and InSAR observations. However, the width and depth extent of compliant fault zones are still under debate in the community. Numerical models of dynamic rupture build a bridge between theories and the geological and geophysical observations. Theoretical 2D plane-strain studies of the elastic and inelastic response of compliant fault zones to nearby earthquakes have been conducted by Duan [2010] and Duan et al. [2010]. In this study, we extend the experiments to 3D with a focus on elastic response. We are specifically interested in how residual displacements depend on the structure and properties of compliant fault zones, in particular on the width and depth extent. We conduct numerical experiments on various types of fault-zone models, including fault zones with a constant width along depth, with decreasing widths along depth, and with Hanning-taper profiles of velocity reduction. Our preliminary results suggest that 1) the width of anomalous horizontal residual displacement is only indicative of the width of a fault zone near the surface, and 2) the vertical residual displacement contains information about the depth extent of compliant fault zones.
Sripathi, Vangipuram Canchi; Kumar, Ramarathnam Krishna; Balakrishnan, Komarakshi R
2004-03-01
This study aims to find the fundamental differences in the mechanism of opening and closing of a normal aortic valve and a valve with a stiff root, using a dynamic finite element model. A dynamic, finite element model with time varying pressure was used in this study. Shell elements with linear elastic properties for the leaflet and root were used. Two different cases were analyzed: (1) normal leaflets inside a compliant root, and (2) normal leaflets inside a stiff root. A compliant aortic root contributes substantially to the smooth and symmetrical leaflet opening with minimal gradients. In contrast, the leaflet opening inside a stiff root is delayed, asymmetric, and wrinkled. However, this wrinkling is not associated with increased leaflet stresses. In compliant roots, the effective valve orifice area can substantially increase because of increased root pressure and transvalvular gradients. In stiff roots this effect is strikingly absent. A compliant aortic root contributes substantially to smooth and symmetrical leaflet opening with minimal gradients. The compliance also contributes much to the ability of the normal aortic valve to increase its effective valve orifice in response to physiologic demands of exercise. This effect is strikingly absent in stiff roots.
Fabrication Process of Silicone-based Dielectric Elastomer Actuators
Rosset, Samuel; Araromi, Oluwaseun A.; Schlatter, Samuel; Shea, Herbert R.
2016-01-01
This contribution demonstrates the fabrication process of dielectric elastomer transducers (DETs). DETs are stretchable capacitors consisting of an elastomeric dielectric membrane sandwiched between two compliant electrodes. The large actuation strains of these transducers when used as actuators (over 300% area strain) and their soft and compliant nature have been exploited for a wide range of applications, including electrically tunable optics, haptic feedback devices, wave-energy harvesting, deformable cell-culture devices, compliant grippers, and propulsion of a bio-inspired fish-like airship. In most cases, DETs are made with a commercial proprietary acrylic elastomer and with hand-applied electrodes of carbon powder or carbon grease. This combination leads to non-reproducible and slow actuators exhibiting viscoelastic creep and a short lifetime. We present here a complete process flow for the reproducible fabrication of DETs based on thin elastomeric silicone films, including casting of thin silicone membranes, membrane release and prestretching, patterning of robust compliant electrodes, assembly and testing. The membranes are cast on flexible polyethylene terephthalate (PET) substrates coated with a water-soluble sacrificial layer for ease of release. The electrodes consist of carbon black particles dispersed into a silicone matrix and patterned using a stamping technique, which leads to precisely defined compliant electrodes that present a high adhesion to the dielectric membrane on which they are applied. PMID:26863283
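Electrically, a DET is a parallel-plate capacitor whose geometry changes with stretch: for an incompressible membrane, an area strain s multiplies the area by (1 + s) and divides the thickness by (1 + s), so capacitance scales as (1 + s)^2. The dimensions and relative permittivity below are typical ballpark values for a silicone membrane, not measurements from this protocol.

```python
E0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2, thickness_m, eps_r):
    """Parallel-plate capacitance C = eps0 * eps_r * A / d."""
    return E0 * eps_r * area_m2 / thickness_m

def actuated_capacitance(c0, area_strain):
    """Incompressible membrane: area * (1+s), thickness / (1+s),
    hence C = C0 * (1+s)**2."""
    return c0 * (1.0 + area_strain) ** 2

# Illustrative 1 cm^2, 30 um silicone membrane (eps_r ~ 2.8).
c0 = capacitance(1e-4, 30e-6, 2.8)
c_on = actuated_capacitance(c0, 1.0)  # at 100% area strain
```

This quadratic capacitance change is also why the same structure works in reverse as a strain sensor or a wave-energy harvester.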
The Climate-G Portal: a Grid Enabled Scientifc Gateway for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni
2010-05-01
Grid portals are web gateways aiming at concealing the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous and seamless access to heterogeneous and geographically spread resources (i.e. storage, computational facilities, services, sensors, networks, databases). In short, they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways can revolutionize the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that can be provided by such a web-based environment. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) to access the entire infrastructure. The Climate-G Portal must address several important and critical challenges and satisfy key requirements; the most relevant are presented and discussed below. Transparency: the portal has to provide transparent access to the underlying infrastructure, shielding users from low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit portal functionalities. A wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment. A security infrastructure (based on X.509v3 digital certificates) is strongly needed. Pervasiveness and ubiquity: access to the system must be pervasive and ubiquitous.
This is easily true due to the nature of the needed web approach. Usability and simplicity: the portal has to provide simple, high level and user friendly interfaces to ease the access and exploitation of the entire system. Coexistence of general purpose and domain oriented services: along with general purpose services (file transfer, job submission, etc.), the portal has to provide domain based services and functionalities. Subsetting of data, visualization of 2D maps around a virtual globe, delivery of maps through OGC compliant interfaces (i.e. Web Map Service - WMS) are just some examples. Since april 2009, about 70 users (85% coming from the climate change community) got access to the portal. A key challenge of this work is the idea to provide users with an integrated working environment, that is a place where scientists can find huge amount of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure and advanced monitoring interfaces.
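The WMS map delivery mentioned in the abstract follows the standard OGC request pattern: a GetMap request is an HTTP GET whose query string names the layer, bounding box, projection and output format. The sketch below builds such a URL with only the Python standard library; the endpoint `https://example.org/wms` and the layer name `climate:tas_mean` are illustrative placeholders, not part of the Climate-G deployment described in the abstract.

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in the order
    required by WMS 1.1.1 for EPSG:4326.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",               # default style for the layer
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer; any WMS-compliant server accepts
# the same parameter set.
url = build_getmap_url("https://example.org/wms",
                       "climate:tas_mean",
                       (-180, -90, 180, 90))
print(url)
```

Because the interface is a plain URL scheme rather than a library API, any HTTP client (a browser, a virtual-globe viewer, or a portal backend) can fetch the rendered map the same way.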
2012-10-01
Report covering Mar 2010 – Apr 2012, titled "Implications of Multi-Core Architectures on the Development of..."; only front-matter and table-of-contents fragments survive extraction (e.g. "Framework for Multicore Information Flow Analysis", "A Hypothetical Reference Architecture", "Figure 2: Pentium II Block Diagram").
This paper offers suggestions on how to comply with ANSI/ASQC E4-1994 while avoiding some of the frustration involved. Options for writing a quality system document compliant with E4 are given, along with a model outline, to assist in interpreting the requirements.
Creating FGDC and NBII metadata with Metavist 2005.
David J. Rugg
2004-01-01
This report documents a computer program for creating metadata compliant with the Federal Geographic Data Committee (FGDC) 1998 metadata standard or the National Biological Information Infrastructure (NBII) 1999 Biological Data Profile for the FGDC standard. The software runs under the Microsoft Windows 2000 and XP operating systems, and requires the presence of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
...-compliant communications systems. The P25 CAP Program Manager will perform a simple administrative review to.... ADDRESSES: Interested persons are invited to submit comments, identified by docket number DHS-2011-0118, by... improving its information collection and urges all interested parties to suggest how these materials can...