Sample records for high-level application framework

  1. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  2. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directories, and intrusion detection systems (IDS) are some examples. Enforcing network security policies through low-level security mechanisms faces some essential difficulties, chiefly consistency, verification, and maintenance. One approach to overcoming these difficulties is to automate the translation of a high-level security policy into low-level security mechanisms. This paper introduces a framework for an automated process that translates a high-level security policy into low-level security mechanisms. The framework is described in terms of three phases: in the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This proposed model is based on organization-based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies, such as auditing policy. Besides, the proposed model enables matching of each rule of the high-level security policy with the corresponding rules of the low-level security policy. In the second phase of the proposed framework, the high-level security policy is mapped onto the network security model; this phase can be considered a translation of the high-level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low-level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
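
    As a sketch of the translation idea (with invented rule syntax, role names, and iptables-style output rather than the authors' actual OrBAC notation), the two mapping steps can be pictured in a few lines of Python:

      # Sketch of a two-step policy translation, loosely following the paper's
      # phases. Rule formats, role names, and the firewall-style output are
      # illustrative assumptions, not the authors' actual OrBAC syntax.

      HIGH_LEVEL_POLICY = [
          # (role, activity, view, decision)
          ("employee", "web-access", "internet", "permit"),
          ("guest",    "web-access", "intranet", "deny"),
      ]

      # Phase 1: network security model -- map abstract roles/views to assets.
      ROLES = {"employee": ["10.0.1.0/24"], "guest": ["10.0.9.0/24"]}
      VIEWS = {"internet": "0.0.0.0/0", "intranet": "10.0.0.0/16"}
      ACTIVITIES = {"web-access": [80, 443]}

      def translate(policy):
          """Phases 2-3: map each abstract rule to low-level mechanism rules."""
          for role, activity, view, decision in policy:
              for src in ROLES[role]:
                  for port in ACTIVITIES[activity]:
                      action = "ACCEPT" if decision == "permit" else "DROP"
                      yield (src, VIEWS[view], port, action)

      for rule in translate(HIGH_LEVEL_POLICY):
          print("src=%s dst=%s dport=%d -> %s" % rule)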

  3. An integrated framework for high level design of high performance signal processing circuits on FPGAs

    NASA Astrophysics Data System (ADS)

    Benkrid, K.; Belkacemi, S.; Sukhsawas, S.

    2005-06-01

    This paper proposes an integrated framework for the high-level design of high-performance signal processing algorithm implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real-time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high-performance structural hardware description languages with higher-level hardware languages in order to help satisfy the dual requirement of high-level design and high-performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language, and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has proven very useful in the rapid design and prototyping of FPGA circuits, especially control-intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated by the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned, efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware description languages.

  4. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
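
    Janus itself is a Java framework; the Python sketch below only illustrates the underlying pattern of interchangeable hardware and software modules selected during a compilation phase. All class and platform names are hypothetical:

      # Interchangeable hardware/software modules behind one interface, with a
      # "compile" step that picks an implementation per target platform.
      # Names are hypothetical; Janus works on structural Java descriptions.

      class FFTModule:
          def run(self, data): raise NotImplementedError

      class SoftwareFFT(FFTModule):
          def run(self, data):
              return "software FFT over %d samples" % len(data)

      class HardwareFFT(FFTModule):
          def __init__(self, device): self.device = device
          def run(self, data):
              return "FPGA FFT on %s over %d samples" % (self.device, len(data))

      def compile_for(platform):
          # Analysis phase: adapt the application structure to the target.
          if platform.get("fpga"):
              return HardwareFFT(platform["fpga"])
          return SoftwareFFT()

      module = compile_for({"fpga": "xcv1000"})
      print(module.run([0.0] * 1024))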

  5. Designing a robust activity recognition framework for health and exergaming using wearable sensors.

    PubMed

    Alshurafa, Nabil; Xu, Wenyao; Liu, Jason J; Huang, Ming-Chun; Mortazavi, Bobak; Roberts, Christian K; Sarrafzadeh, Majid

    2014-09-01

    Detecting human activity independent of intensity is essential in many applications, primarily in calculating metabolic equivalent rates and extracting human context awareness. Many classifiers that train on an activity at a subset of intensity levels fail to recognize the same activity at other intensity levels. This demonstrates weakness in the underlying classification method. Training a classifier for an activity at every intensity level is also not practical. In this paper, we tackle a novel intensity-independent activity recognition problem where the class labels exhibit large variability, the data are of high dimensionality, and clustering algorithms are necessary. We propose a new robust stochastic approximation framework for enhanced classification of such data. Experiments are reported using two clustering techniques, K-Means and Gaussian Mixture Models. The stochastic approximation algorithm consistently outperforms other well-known classification schemes, validating the use of our proposed clustered data representation. We verify the motivation of our framework in two applications that benefit from intensity-independent activity recognition. The first application shows how our framework can be used to enhance energy expenditure calculations. The second application is a novel exergaming environment aimed at using games to reward physical activity performed throughout the day, to encourage a healthy lifestyle.
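
    A minimal sketch of the clustered data representation, using scikit-learn's K-Means and Gaussian Mixture implementations on synthetic stand-in features (the paper's feature extraction and stochastic approximation step are not reproduced here):

      # Cluster training windows, then describe each window by its distances
      # to the cluster centers before classification. Data are synthetic.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.mixture import GaussianMixture
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 6))            # stand-in sensor feature windows
      y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in activity labels

      km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
      X_clustered = km.transform(X)            # distances to the 8 centers

      clf = SVC().fit(X_clustered, y)
      print("training accuracy: %.2f" % clf.score(X_clustered, y))

      # A Gaussian mixture can replace K-Means for soft cluster memberships:
      gm = GaussianMixture(n_components=8, random_state=0).fit(X)
      X_soft = gm.predict_proba(X)
      print("soft memberships:", X_soft.shape)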

  6. User-level framework for performance monitoring of HPC applications

    NASA Astrophysics Data System (ADS)

    Hristova, R.; Goranov, G.

    2013-10-01

    HP-SEE links the existing HPC facilities in South East Europe into a common infrastructure. Analysis of the performance monitoring of High-Performance Computing (HPC) applications in the infrastructure can serve the end user as a diagnostic for the overall performance of his applications. The existing monitoring tools for HP-SEE provide the end user with only aggregated information for all applications. Usually, the user does not have permission to select only the information relevant to him and to his applications. In this article we present a framework for performance monitoring of the HPC applications in the HP-SEE infrastructure. The framework provides standardized performance metrics, which every user can use in order to monitor his applications. Furthermore, a programming interface is developed as part of the framework. The interface allows the user to publish metrics data from his application and to read and analyze the gathered information. Publishing and reading through the framework are possible only with a grid certificate valid for the infrastructure; therefore, the user is authorized to access only the data for his own applications.
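
    A toy sketch of the publish/read interface with per-user authorization; the caller identity is assumed to come from an already-validated grid certificate, and all names are illustrative rather than the framework's real API:

      # Per-user metric publishing and reading. The certificate-derived user
      # name scopes all access; storage is an in-memory dict for illustration.
      metrics_store = {}   # (user, application) -> list of (metric, value)

      def publish(cert_user, application, metric, value):
          metrics_store.setdefault((cert_user, application), []).append((metric, value))

      def read(cert_user, application):
          # A user may read only data for his own applications.
          key = (cert_user, application)
          if key not in metrics_store:
              raise PermissionError("no data or not authorized")
          return metrics_store[key]

      publish("alice", "md-sim", "wallclock_s", 512.3)
      print(read("alice", "md-sim"))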

  7. Leverage hadoop framework for large scale clinical informatics applications.

    PubMed

    Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise

    2013-01-01

    In this manuscript, we present our experiences using the Apache Hadoop framework for high data volume and computationally intensive applications, and discuss some best practice guidelines in a clinical informatics setting. There are three main aspects in our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column-oriented features in HBase for patient-centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The Hadoop platform's intrinsic fault tolerance, high availability, and scalability make these applications readily deployable in an enterprise-level cluster environment.
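
    The MapReduce pattern behind aspect (a) can be shown in miniature with plain Python taking the place of a customized Hadoop job; record layout and field names are invented for illustration:

      # MapReduce in miniature: map, shuffle, and reduce phases standing in
      # for a Hadoop job that counts diagnosis codes across patient records.
      from collections import defaultdict

      records = [
          {"patient": "p1", "code": "E11"},   # illustrative diagnosis events
          {"patient": "p2", "code": "I10"},
          {"patient": "p1", "code": "I10"},
      ]

      def mapper(record):
          yield (record["code"], 1)

      def shuffle(pairs):
          groups = defaultdict(list)
          for key, value in pairs:
              groups[key].append(value)
          return groups

      def reducer(key, values):
          return (key, sum(values))

      pairs = (pair for rec in records for pair in mapper(rec))
      print([reducer(k, v) for k, v in shuffle(pairs).items()])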

  8. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    PubMed Central

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.

    2014-01-01

    The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging, very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

  9. A reusable rocket engine intelligent control

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Lorenzo, Carl F.

    1988-01-01

    An intelligent control system for reusable space propulsion systems for future launch vehicles is described. The system description includes a framework for the design. The framework consists of an execution level with high-speed control and diagnostics, and a coordination level which marries expert system concepts with traditional control. A comparison is made between air breathing and rocket engine control concepts to assess the relative levels of development and to determine the applicability of air breathing control concepts to future reusable rocket engine systems.

  10. A reusable rocket engine intelligent control

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Lorenzo, Carl F.

    1988-01-01

    An intelligent control system for reusable space propulsion systems for future launch vehicles is described. The system description includes a framework for the design. The framework consists of an execution level with high-speed control and diagnostics, and a coordination level which marries expert system concepts with traditional control. A comparison is made between air breathing and rocket engine control concepts to assess the relative levels of development and to determine the applicability of air breathing control concepts to future reusable rocket engine systems.

  11. Generic strategies for chemical space exploration.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2014-01-01

    The chemical universe of molecules reachable from a set of start compounds by iterative application of a finite number of reactions is usually so vast that sophisticated and efficient exploration strategies are required to cope with the combinatorial complexity. A stringent analysis of (bio)chemical reaction networks, as approximations of these complex chemical spaces, forms the foundation for the understanding of functional relations in Chemistry and Biology. Graphs and graph rewriting are natural models for molecules and reactions. Borrowing the idea of partial evaluation from functional programming, we introduce partial applications of rewrite rules. A framework for the specification of exploration strategies in graph-rewriting systems is presented. Using key examples of complex reaction networks from carbohydrate chemistry we demonstrate the feasibility of this high-level strategy framework. While being designed for chemical applications, the framework can also be used to emulate higher-level transformation models, as illustrated by a small puzzle game.
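
    A rough Python sketch of a bounded exploration strategy, with functools.partial standing in for a partially applied rewrite rule; molecules are plain strings here, whereas the actual framework rewrites graphs:

      # Breadth-first exploration over a toy reaction network. functools.partial
      # fixes one reactant of a two-reactant "rule", leaving the other open,
      # in the spirit of partially applied rewrite rules.
      from functools import partial

      def condense(a, b):            # toy two-reactant rule on strings
          return a + "-" + b

      def apply_rules(rules, pool):
          return {rule(m) for rule in rules for m in pool}

      start = {"glycolaldehyde"}
      rules = [partial(condense, "formaldehyde")]   # first reactant fixed

      space = set(start)
      frontier = start
      for generation in range(3):    # bounded exploration strategy
          frontier = apply_rules(rules, frontier) - space
          space |= frontier
      print(sorted(space))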

  12. From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation

    DOE PAGES

    Blazewicz, Marek; Hinder, Ian; Koppelman, David M.; ...

    2013-01-01

    Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.
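
    One representative ingredient in isolation: a fourth-order central finite difference of the kind such a framework generates automatically. This is a generic numerical illustration, not Chemora's generated code:

      # Fourth-order central finite difference for u''(x) on a periodic grid,
      # verified against the analytic second derivative of sin(x).
      import numpy as np

      n, L = 200, 2 * np.pi
      x = np.linspace(0.0, L, n, endpoint=False)
      h = x[1] - x[0]
      u = np.sin(x)

      # u''(x) ~ (-u[i-2] + 16 u[i-1] - 30 u[i] + 16 u[i+1] - u[i+2]) / (12 h^2)
      d2u = (-np.roll(u, 2) + 16 * np.roll(u, 1) - 30 * u
             + 16 * np.roll(u, -1) - np.roll(u, -2)) / (12 * h * h)

      print("max error vs -sin(x): %.2e" % np.abs(d2u + np.sin(x)).max())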

  13. Reusable and Extensible High Level Data Distributions

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Chamberlain, Bradford; James, Mark L.; Zima, Hans P.

    2005-01-01

    This paper presents a reusable design of a data distribution framework for data-parallel high performance applications. We are implementing the design in the context of the Chapel high productivity programming language. Distributions in Chapel are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy can be chosen by a user. At the same time, high productivity concerns require that the user is shielded from error-prone, tedious details such as communication and synchronization. We propose an approach to distributions that enables the user to refine a language-provided distribution type and adjust it to optimize the performance of the application. Additionally, we conceal from the user low-level communication and synchronization details to increase productivity. To emphasize the generality of our distribution machinery, we present its abstract design in the form of a design pattern, which is independent of a concrete implementation. To illustrate the applicability of our distribution framework design, we outline the implementation of data distributions in terms of the Chapel language.
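
    The core idea of a user-refinable distribution can be pictured as a mapping from global index to owning locale, which the user overrides to tune placement. A Python sketch with invented class names (Chapel's actual distribution interface is considerably richer):

      # A block distribution maps global indices to locales; a user refinement
      # (here, cyclic placement) overrides only the mapping policy.
      class BlockDist:
          def __init__(self, n, num_locales):
              self.n, self.p = n, num_locales
              self.block = -(-n // num_locales)       # ceiling division

          def locale_of(self, i):
              return i // self.block

      class CyclicDist(BlockDist):                    # user refinement
          def locale_of(self, i):
              return i % self.p

      dist = BlockDist(10, 4)
      print([dist.locale_of(i) for i in range(10)])   # [0,0,0,1,1,1,2,2,2,3]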

  14. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. We describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
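
    The essence of user-level tasking: many lightweight tasks multiplexed cooperatively onto one execution stream. The generator-based Python sketch below illustrates only the scheduling idea; Argobots' real interface is a C API and far richer:

      # Cooperative user-level tasks on a single OS thread. Each generator
      # yield is an explicit yield point, as in user-level threads (ULTs).
      from collections import deque

      def task(name, steps):
          for i in range(steps):
              print("%s step %d" % (name, i))
              yield                          # hand control back to the pool

      ready = deque([task("A", 2), task("B", 3)])   # the ready pool
      while ready:                           # one "execution stream"
          t = ready.popleft()
          try:
              next(t)
              ready.append(t)                # reschedule after its yield
          except StopIteration:
              pass                           # task completed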

  15. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
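
    The decoupling idea can be pictured as two update loops sharing one model at very different rates; the sketch below uses a single time-stepped loop with illustrative rates and an invented model variable:

      # Decoupled update rates: a haptics/physics "controller" stepped at
      # 1 kHz and a viewer refreshed at 30 Hz over one shared model.
      model = {"tool_depth_mm": 0.0}

      HAPTIC_HZ, RENDER_HZ, SIM_SECONDS = 1000, 30, 0.1
      haptic_dt, render_dt = 1.0 / HAPTIC_HZ, 1.0 / RENDER_HZ

      t, next_render = 0.0, 0.0
      while t < SIM_SECONDS:
          model["tool_depth_mm"] += 0.001          # 1 kHz haptic/physics update
          if t >= next_render:                     # 30 Hz viewer update
              print("render at t=%.3fs depth=%.3f" % (t, model["tool_depth_mm"]))
              next_render += render_dt
          t += haptic_dt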

  16. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  17. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  18. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, either are too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.

  19. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE PAGES

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan; ...

    2017-10-24

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  20. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  1. Nektar++: An open-source spectral/hp element framework

    NASA Astrophysics Data System (ADS)

    Cantwell, C. D.; Moxey, D.; Comerford, A.; Bolis, A.; Rocco, G.; Mengaldo, G.; De Grazia, D.; Yakovlev, S.; Lombard, J.-E.; Ekelschot, D.; Jordi, B.; Xu, H.; Mohamied, Y.; Eskilsson, C.; Nelson, B.; Vos, P.; Biotto, C.; Kirby, R. M.; Sherwin, S. J.

    2015-07-01

    Nektar++ is an open-source software framework designed to support the development of high-performance scalable solvers for partial differential equations using the spectral/hp element method. High-order methods are gaining prominence in several engineering and biomedical applications due to their improved accuracy over low-order techniques at reduced computational cost for a given number of degrees of freedom. However, their proliferation is often limited by their complexity, which makes these methods challenging to implement and use. Nektar++ is an initiative to overcome this limitation by encapsulating the mathematical complexities of the underlying method within an efficient C++ framework, making the techniques more accessible to the broader scientific and industrial communities. The software supports a variety of discretisation techniques and implementation strategies, supporting methods research as well as application-focused computation, and the multi-layered structure of the framework allows the user to embrace as much or as little of the complexity as they need. The libraries capture the mathematical constructs of spectral/hp element methods, while the associated collection of pre-written PDE solvers provides out-of-the-box application-level functionality and a template for users who wish to develop solutions for addressing questions in their own scientific domains.

  2. Development of a conceptual framework toward an integrated transportation system : final report, April 10, 2009.

    DOT National Transportation Integrated Search

    2009-04-10

    This report documents research on the conceptual framework of an integrated transportation system with a prototype application under the framework. Three levels of control are involved in this framework: at the global level (an entire transportation ...

  3. Flight Software Development for the CHEOPS Instrument with the CORDET Framework

    NASA Astrophysics Data System (ADS)

    Cechticky, V.; Ottensamer, R.; Pasetti, A.

    2015-09-01

    CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution to the communication and control services, event handling and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.

  4. Audio-based queries for video retrieval over Java enabled mobile devices

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Cheikh, Faouzi Alaya; Kiranyaz, Serkan; Gabbouj, Moncef

    2006-02-01

    In this paper we propose a generic framework for efficient retrieval of audiovisual media based on its audio content. This framework is implemented in a client-server architecture where the client application is developed in Java to be platform independent whereas the server application is implemented for the PC platform. The client application adapts to the characteristics of the mobile device where it runs, such as screen size and commands. The entire framework is designed to take advantage of the high-level segmentation and classification of audio content to improve speed and accuracy of audio-based media retrieval. Therefore, the primary objective of this framework is to provide an adaptive basis for performing efficient video retrieval operations based on the audio content and types (i.e. speech, music, fuzzy and silence). Experimental results confirm that such an audio-based video retrieval scheme can be used from mobile devices to search and retrieve video clips efficiently over wireless networks.

  5. A deep learning framework for financial time series using stacked autoencoders and long-short term memory.

    PubMed

    Bao, Wei; Yue, Jun; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework where wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, high-level denoising features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance.
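
    Stage one of the pipeline in isolation, sketched with PyWavelets: decompose the series, soft-threshold the detail coefficients, and reconstruct. The wavelet family, level, and threshold are illustrative choices, not the paper's exact settings, and the SAE and LSTM stages are omitted:

      # Wavelet-transform denoising of a synthetic price series. Assumes the
      # PyWavelets package (pywt) is installed.
      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 256)
      price = np.cumsum(rng.normal(0, 1, 256)) + 10 * np.sin(4 * np.pi * t)

      coeffs = pywt.wavedec(price, "db4", level=2)
      threshold = 0.5 * np.std(coeffs[-1])
      denoised_coeffs = [coeffs[0]] + [
          pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]
      ]
      denoised = pywt.waverec(denoised_coeffs, "db4")
      print("residual std: %.3f" % np.std(price - denoised[: len(price)]))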

  6. MODELS-3/CMAQ APPLICATIONS WHICH ILLUSTRATE CAPABILITY AND FUNCTIONALITY

    EPA Science Inventory

    The Models-3/CMAQ developed by the U.S. Environmental Protection Agency (USEPA) is a third-generation multiscale, multi-pollutant air quality modeling system within a high-level, object-oriented computer framework (Models-3). It has been available to the scientific community ...

  7. Constraint-Based Local Search for Constrained Optimum Paths Problems

    NASA Astrophysics Data System (ADS)

    Pham, Quang Dung; Deville, Yves; van Hentenryck, Pascal

    Constrained Optimum Path (COP) problems arise in many real-life applications and are ubiquitous in communication networks. They have been traditionally approached by dedicated algorithms, which are often hard to extend with side constraints and to apply widely. This paper proposes a constraint-based local search (CBLS) framework for COP applications, bringing the compositionality, reuse, and extensibility at the core of CBLS and CP systems. The modeling contribution is the ability to express compositional models for various COP applications at a high level of abstraction, while cleanly separating the model and the search procedure. The main technical contribution is a connected neighborhood based on rooted spanning trees to find high-quality solutions to COP problems. The framework, implemented in COMET, is applied to Resource Constrained Shortest Path (RCSP) problems (with and without side constraints) and to the edge-disjoint paths problem (EDP). Computational results show the potential significance of the approach.

  8. Maintenance = reuse-oriented software development

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

  9. A Conceptual and Measurement Framework to Guide Policy Development and Systems Change

    ERIC Educational Resources Information Center

    Schalock, Robert L.; Verdugo, Miguel Angel

    2012-01-01

    The authors describe a conceptual and measurement framework that provides a template for guiding policy development and systems change. The framework is built on the concepts of vertical and horizontal alignment, system-level processes, and organization-level practices. Application of the framework can structure the thinking and analytic…

  10. Implicit kernel sparse shape representation: a sparse-neighbors-based object segmentation framework.

    PubMed

    Yao, Jincao; Yu, Huimin; Hu, Roland

    2017-01-01

    This paper introduces a new implicit-kernel-sparse-shape-representation-based object segmentation framework. Given an input object whose shape is similar to some of the elements in the training set, the proposed model can automatically find a cluster of implicit kernel sparse neighbors to approximately represent the input shape and guide the segmentation. A distance-constrained probabilistic definition together with a dualization energy term is developed to connect high-level shape representation and low-level image information. We theoretically prove that our model not only derives from two projected convex sets but is also equivalent to a sparse-reconstruction-error-based representation in the Hilbert space. Finally, a "wake-sleep"-based segmentation framework is applied to drive the evolutionary curve to recover the original shape of the object. We test our model on two public datasets. Numerical experiments on both synthetic images and real applications show the superior capabilities of the proposed framework.

  11. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  12. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
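
    The multi-tiered classification idea in miniature: a first-tier SVM screens candidate regions and a second-tier SVM labels the survivors. The sketch uses scikit-learn on synthetic stand-ins for the paper's image-derived descriptors:

      # Two-tier classification: tier 1 screens candidate regions for the
      # structure of interest; tier 2 labels the regions that pass.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      X = rng.normal(size=(400, 10))                 # region descriptors
      is_head = (X[:, 0] > 0).astype(int)            # tier 1 labels
      cell_type = (X[:, 1] > 0).astype(int)          # tier 2 labels

      tier1 = SVC().fit(X, is_head)
      tier2 = SVC().fit(X[is_head == 1], cell_type[is_head == 1])

      candidates = rng.normal(size=(5, 10))
      survivors = candidates[tier1.predict(candidates) == 1]
      print(tier2.predict(survivors) if len(survivors) else "no detections")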

  13. On the Design of Smart Homes: A Framework for Activity Recognition in Home Environment.

    PubMed

    Cicirelli, Franco; Fortino, Giancarlo; Giordano, Andrea; Guerrieri, Antonio; Spezzano, Giandomenico; Vinci, Andrea

    2016-09-01

    A smart home is a home environment enriched with sensing, actuation, communication and computation capabilities, which permits it to be adapted to the inhabitants' preferences and requirements. Establishing a proper strategy of actuation on the home environment can require complex computational tasks on the sensed data. This is the case of activity recognition, which consists in retrieving high-level knowledge about what occurs in the home environment and about the behaviour of the inhabitants. The inherent complexity of this application domain calls for tools able to properly support the design and implementation phases. This paper proposes a framework for the design and implementation of smart home applications focused on activity recognition in home environments. The framework mainly relies on the Cloud-assisted Agent-based Smart home Environment (CASE) architecture, which offers basic abstraction entities that allow smart home applications to be easily designed and implemented. CASE is a three-layered architecture which exploits the distributed multi-agent paradigm and cloud technology for offering analytics services. Details about how to implement activity recognition on the CASE architecture are supplied, focusing on low-level technological issues as well as the algorithms and methodologies useful for activity recognition. The effectiveness of the framework is shown through a case study consisting of the daily activity recognition of a person in a home environment.

  14. Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.

    PubMed

    Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph

    2014-01-01

    This article provides a general ergonomic framework of cooperative guidance and control for vehicles with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for a better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the methods of iterative build-up and refinement of the framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation is mainly done on guidance and control with a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored. The application of the presented approach is every human-machine system that moves and includes high levels of assistance/automation.

  15. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    NASA Technical Reports Server (NTRS)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
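
    The shape of such an automated loop, with a cheap analytic stand-in for a VULCAN-CFD evaluation and SciPy's optimizer in place of DAKOTA; design variables and the merit function are invented for illustration:

      # An optimizer driving a black-box analysis. A real evaluation would
      # launch a CFD job on a cluster and post-process the flowfield.
      from scipy.optimize import minimize

      def nozzle_objective(x):
          throat_ratio, wall_angle = x
          # Stand-in merit function for, e.g., exit-flow uniformity.
          return (throat_ratio - 1.8) ** 2 + 0.5 * (wall_angle - 12.0) ** 2

      result = minimize(nozzle_objective, x0=[1.0, 5.0], method="Nelder-Mead")
      print("optimum design:", result.x)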

  16. Real-time tracking of visually attended objects in virtual environments and its application to LOD.

    PubMed

    Lee, Sungkil; Kim, Gerard Jounghyun; Choi, Seungmoon

    2009-01-01

    This paper presents a real-time framework for computationally tracking objects visually attended by the user while navigating in interactive virtual environments. In addition to the conventional bottom-up (stimulus-driven) saliency map, the proposed framework uses top-down (goal-directed) contexts inferred from the user's spatial and temporal behaviors, and identifies the most plausibly attended objects among candidates in the object saliency map. The computational framework was implemented using the GPU, exhibiting high computational performance adequate for interactive virtual environments. A user experiment was also conducted to evaluate the prediction accuracy of the tracking framework by comparing objects regarded as visually attended by the framework to actual human gaze collected with an eye tracker. The results indicated that the accuracy was at a level well supported by the theory of human cognition for visually identifying single and multiple attentive targets, especially owing to the addition of top-down contextual information. Finally, we demonstrate how the visual attention tracking framework can be applied to managing the level of detail in virtual environments, without any hardware for head or eye tracking.
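
    The scoring step at the heart of the framework can be sketched as a weighted combination of bottom-up saliency and top-down behavioral cues; the particular weights and features below are illustrative assumptions, not the paper's trained values:

      # Rank candidate objects by combining bottom-up saliency with top-down
      # context inferred from user behavior (all numbers illustrative).
      candidates = {
          # object: (bottom_up_saliency, gaze_proximity, on_motion_path)
          "door":   (0.40, 0.9, 1.0),
          "poster": (0.85, 0.2, 0.0),
          "statue": (0.55, 0.7, 1.0),
      }

      def attention_score(bottom_up, gaze, on_path, w_td=0.6):
          top_down = 0.7 * gaze + 0.3 * on_path
          return (1 - w_td) * bottom_up + w_td * top_down

      ranked = sorted(candidates, key=lambda o: attention_score(*candidates[o]),
                      reverse=True)
      print("predicted attended object:", ranked[0])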

  17. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with a realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessment of ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
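
    The numerical heart of such a framework is advecting particles through a velocity field; below, a Runge-Kutta step over an analytic eddy field stands in for interpolated currents from a 3D circulation model:

      # Advect a particle with a classical fourth-order Runge-Kutta step.
      import numpy as np

      def velocity(pos, t):
          x, y = pos          # analytic stand-in for model currents
          return np.array([-np.sin(x) * np.cos(y), np.cos(x) * np.sin(y)])

      def rk4_step(pos, t, dt):
          k1 = velocity(pos, t)
          k2 = velocity(pos + 0.5 * dt * k1, t + 0.5 * dt)
          k3 = velocity(pos + 0.5 * dt * k2, t + 0.5 * dt)
          k4 = velocity(pos + dt * k3, t + dt)
          return pos + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      pos, dt = np.array([1.0, 0.5]), 0.1
      for step in range(100):                 # forward simulation; a negative
          pos = rk4_step(pos, step * dt, dt)  # dt would track particles backward
      print("final position:", pos)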

  18. The combination of an Environmental Management System and Life Cycle Assessment at the territorial level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazzi, Anna; Toniolo, Sara; Catto, Stella

    A framework to include a Life Cycle Assessment in the significance evaluation of the environmental aspects of an Environmental Management System has been studied for some industrial sectors, but there is a literature gap at the territorial level, where the indirect impact assessment is crucial. To overcome this criticality, our research proposes the Life Cycle Assessment as a framework to assess environmental aspects of public administration within an Environmental Management System applied at the territorial level. This research is structured in two parts: the design of a new methodological framework and the pilot application for an Italian municipality. The methodological framework designed supports Initial Environmental Analysis at the territorial level thanks to the results derived from the impact assessment phase. The pilot application in an EMAS-registered Italian municipality demonstrates the applicability of the framework and its effectiveness in evaluating the environmental impact assessment for direct and indirect aspects. Through the discussion of the results, we underline the growing knowledge derived by this research in terms of the reproducibility and consistency of the criteria to define the significance of the direct and indirect environmental aspects for a local public administration. - Highlights: • The combination between Environmental Management System and LCA is studied. • A methodological framework is elaborated and tested at the territorial level. • Life Cycle Impact Assessment supports the evaluation of aspects significance. • The framework assures consistency of evaluation criteria on the studied territory.

  19. Bilingual parallel programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Overbeek, R.

    1990-01-01

    Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.
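
    A self-contained taste of bilingual programming, with Python as the high-level coordination language and a compiled C routine (libm's cos, loaded through ctypes) as the low-level component; loading libm this way assumes a Unix-like system:

      # High-level orchestration in Python, low-level kernel in compiled C.
      # On non-Unix platforms the math library name differs, so find_library
      # may need adjusting there.
      import ctypes, ctypes.util

      libm = ctypes.CDLL(ctypes.util.find_library("m"))
      libm.cos.restype = ctypes.c_double
      libm.cos.argtypes = [ctypes.c_double]

      # The upper level stays expressive and concise ...
      angles = [0.0, 0.5, 1.0]
      # ... while the inner numeric kernel runs as compiled C code.
      print([libm.cos(a) for a in angles])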

  20. A deep learning framework for financial time series using stacked autoencoders and long-short term memory

    PubMed Central

    Bao, Wei; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework where wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, high-level denoising features are fed into LSTM to forecast the next day’s closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance. PMID:28708865

  1. Software design and implementation concepts for an interoperable medical communication framework.

    PubMed

    Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank

    2018-02-23

    The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their independence of vendors. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.

  2. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  3. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  4. StreamExplorer: A Multi-Stage System for Visually Exploring Events in Social Streams.

    PubMed

    Wu, Yingcai; Chen, Zhutian; Sun, Guodao; Xie, Xiao; Cao, Nan; Liu, Shixia; Cui, Weiwei

    2017-10-18

    Analyzing social streams is important for many applications, such as crisis management. However, the considerable diversity, increasing volume, and high dynamics of social streams of large events continue to be significant challenges that must be overcome to ensure effective exploration. We propose a novel framework by which to handle complex social streams on a budget PC. This framework features two components: 1) an online method to detect important time periods (i.e., subevents), and 2) a tailored GPU-assisted Self-Organizing Map (SOM) method, which clusters the tweets of subevents stably and efficiently. Based on the framework, we present StreamExplorer to facilitate the visual analysis, tracking, and comparison of a social stream at three levels. At a macroscopic level, StreamExplorer uses a new glyph-based timeline visualization, which presents a quick multi-faceted overview of the ebb and flow of a social stream. At a mesoscopic level, a map visualization is employed to visually summarize the social stream from either a topical or geographical aspect. At a microscopic level, users can employ interactive lenses to visually examine and explore the social stream from different perspectives. Two case studies and a task-based evaluation are used to demonstrate the effectiveness and usefulness of StreamExplorer.
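
    The clustering core of such a pipeline can be sketched compactly; the toy CPU self-organizing map below (plain NumPy, with an invented grid size, invented hyperparameters, and random placeholder tweet vectors) is a stand-in for the paper's tailored GPU-assisted SOM, not a reproduction of it.

        # Toy CPU self-organizing map over tweet feature vectors.
        import numpy as np

        def train_som(data, rows=8, cols=8, epochs=10, lr0=0.5, sigma0=3.0):
            n, d = data.shape
            weights = np.random.rand(rows, cols, d)
            grid = np.stack(np.mgrid[0:rows, 0:cols], axis=-1)  # node coordinates
            t, t_max = 0, epochs * n
            for _ in range(epochs):
                for x in data[np.random.permutation(n)]:
                    # Best-matching unit: node whose weight vector is closest to x.
                    dists = np.linalg.norm(weights - x, axis=2)
                    bmu = np.unravel_index(np.argmin(dists), dists.shape)
                    # Decay learning rate and neighborhood radius over time.
                    frac = t / t_max
                    lr = lr0 * (1.0 - frac)
                    sigma = sigma0 * (1.0 - frac) + 1e-3
                    # Gaussian neighborhood pulls nodes near the BMU toward x.
                    sq = ((grid - np.array(bmu)) ** 2).sum(axis=2)
                    h = np.exp(-sq / (2.0 * sigma ** 2))
                    weights += lr * h[..., None] * (x - weights)
                    t += 1
            return weights

        tweets = np.random.rand(200, 50)  # placeholder feature vectors
        som = train_som(tweets)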

  5. Application of Frameworks in the Analysis and (Re)design of Interactive Visual Learning Tools

    ERIC Educational Resources Information Center

    Liang, Hai-Ning; Sedig, Kamran

    2009-01-01

    Interactive visual learning tools (IVLTs) are software environments that encode and display information visually and allow learners to interact with the visual information. This article examines the application and utility of frameworks in the analysis and design of IVLTs at the micro level. Frameworks play an important role in any design. They…

  6. Evaluation of Conceptual Frameworks Applicable to the Study of Isolation Precautions Effectiveness

    PubMed Central

    Crawford, Catherine; Shang, Jingjing

    2015-01-01

    Aims: A discussion of conceptual frameworks applicable to the study of isolation precautions effectiveness according to Fawcett and DeSanto-Madeya's (2013) evaluation technique, and of their relative merits and drawbacks for this purpose. Background: Isolation precautions are recommended to control infectious diseases with high morbidity and mortality, but effectiveness is not established due to numerous methodological challenges. These challenges, such as identifying empirical indicators and refining operational definitions, could be alleviated through use of an appropriate conceptual framework. Design: Discussion paper. Data Sources: In mid-April 2014, the primary author searched five electronic, scientific literature databases for conceptual frameworks applicable to the study of isolation precautions, without limiting searches by publication date. Implications for Nursing: By reviewing promising conceptual frameworks to support isolation precautions effectiveness research, this paper exemplifies the process of choosing an appropriate conceptual framework for empirical research. Hence, researchers may build on these analyses to improve the design of empirical research in multiple disciplines, which may lead to improved research and practice. Conclusion: Three frameworks were reviewed: the epidemiologic triad of disease, Donabedian's healthcare quality framework and the Quality Health Outcomes model. Each has been used in nursing research to evaluate health outcomes and contains concepts relevant to nursing domains. Which framework is most useful likely depends on whether the study question necessitates testing multiple interventions, concerns pathogen-specific characteristics and yields cross-sectional or longitudinal data. The Quality Health Outcomes model may be slightly preferred as it assumes reciprocal relationships and multi-level analysis and is sensitive to cultural inputs. PMID:26179813

  7. The Role of Omics in the Application of Adverse Outcome Pathways for Chemical Risk Assessment.

    PubMed

    Brockmeier, Erica K; Hodges, Geoff; Hutchinson, Thomas H; Butler, Emma; Hecker, Markus; Tollefsen, Knut Erik; Garcia-Reyero, Natalia; Kille, Peter; Becker, Dörthe; Chipman, Kevin; Colbourne, John; Collette, Timothy W; Cossins, Andrew; Cronin, Mark; Graystock, Peter; Gutsell, Steve; Knapen, Dries; Katsiadaki, Ioanna; Lange, Anke; Marshall, Stuart; Owen, Stewart F; Perkins, Edward J; Plaistow, Stewart; Schroeder, Anthony; Taylor, Daisy; Viant, Mark; Ankley, Gerald; Falciani, Francesco

    2017-08-01

    In conjunction with the second International Environmental Omics Symposium (iEOS) conference, held at the University of Liverpool (United Kingdom) in September 2014, a workshop was held to bring together experts in toxicology and regulatory science from academia, government and industry. The purpose of the workshop was to review the specific roles that high-content omics datasets (e.g., transcriptomics, metabolomics, lipidomics, and proteomics) can hold within the adverse outcome pathway (AOP) framework for supporting ecological and human health risk assessments. In light of the growing number of examples of the application of omics data in the context of ecological risk assessment, we considered how omics datasets might continue to support the AOP framework. In particular, the role of omics in identifying potential AOP molecular initiating events and providing supportive evidence of key events at different levels of biological organization and across taxonomic groups was discussed. Areas with potential for short and medium-term breakthroughs were also discussed, such as providing mechanistic evidence to support chemical read-across, providing weight of evidence information for mode of action assignment, understanding biological networks, and developing robust extrapolations of species sensitivity. Key challenges that need to be addressed were considered, including the need for a cohesive approach towards experimental design, the lack of a mutually agreed framework to quantitatively link genes and pathways to key events, and the need for better interpretation of chemically induced changes at the molecular level. This article was developed to provide an overview of the ecological risk assessment process and a perspective on how high-content molecular-level datasets can support the future of assessment procedures through the AOP framework. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology.

  8. The Role of Omics in the Application of Adverse Outcome Pathways for Chemical Risk Assessment

    PubMed Central

    Brockmeier, Erica K.; Hodges, Geoff; Hutchinson, Thomas H.; Butler, Emma; Hecker, Markus; Tollefsen, Knut Erik; Garcia-Reyero, Natalia; Kille, Peter; Becker, Dörthe; Chipman, Kevin; Colbourne, John; Collette, Timothy W.; Cossins, Andrew; Cronin, Mark; Graystock, Peter; Gutsell, Steve; Knapen, Dries; Katsiadaki, Ioanna; Lange, Anke; Marshall, Stuart; Owen, Stewart F.; Perkins, Edward J.; Plaistow, Stewart; Schroeder, Anthony; Taylor, Daisy; Viant, Mark; Ankley, Gerald; Falciani, Francesco

    2017-01-01

    In conjunction with the second International Environmental Omics Symposium (iEOS) conference, held at the University of Liverpool (United Kingdom) in September 2014, a workshop was held to bring together experts in toxicology and regulatory science from academia, government and industry. The purpose of the workshop was to review the specific roles that high-content omics datasets (e.g., transcriptomics, metabolomics, lipidomics, and proteomics) can hold within the adverse outcome pathway (AOP) framework for supporting ecological and human health risk assessments. In light of the growing number of examples of the application of omics data in the context of ecological risk assessment, we considered how omics datasets might continue to support the AOP framework. In particular, the role of omics in identifying potential AOP molecular initiating events and providing supportive evidence of key events at different levels of biological organization and across taxonomic groups was discussed. Areas with potential for short and medium-term breakthroughs were also discussed, such as providing mechanistic evidence to support chemical read-across, providing weight of evidence information for mode of action assignment, understanding biological networks, and developing robust extrapolations of species sensitivity. Key challenges that need to be addressed were considered, including the need for a cohesive approach towards experimental design, the lack of a mutually agreed framework to quantitatively link genes and pathways to key events, and the need for better interpretation of chemically induced changes at the molecular level. This article was developed to provide an overview of the ecological risk assessment process and a perspective on how high-content molecular-level datasets can support the future of assessment procedures through the AOP framework. PMID:28525648

  9. A complete categorization of multiscale models of infectious disease systems.

    PubMed

    Garira, Winston

    2017-12-01

    Modelling of infectious disease systems has entered a new era in which disease modellers are increasingly turning to multiscale modelling to extend traditional modelling frameworks into new application areas and to achieve higher levels of detail and accuracy in characterizing infectious disease systems. In this paper we present a categorization framework for categorizing multiscale models of infectious disease systems. The categorization framework consists of five integration frameworks and five criteria. We use the categorization framework to give a complete categorization of host-level immuno-epidemiological models (HL-IEMs). This categorization framework is also shown to be applicable in categorizing other types of multiscale models of infectious diseases beyond HL-IEMs through modifying the initial categorization framework presented in this study. Categorization of multiscale models of infectious disease systems in this way is useful in bringing some order to the discussion on the structure of these multiscale models.

  10. Open Technology Approaches to Geospatial Interface Design

    NASA Astrophysics Data System (ADS)

    Crevensten, B.; Simmons, D.; Alaska Satellite Facility

    2011-12-01

    What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's Dataportal interface, a project utilizing current open web application standards and technologies including HTML5, jQueryUI, Backbone.js and the Jasmine unit testing framework.

  11. A Framework for Identifying Selective Chemical Applications for IPM in Dryland Agriculture

    PubMed Central

    Umina, Paul A.; Jenkins, Sommer; McColl, Stuart; Arthur, Aston; Hoffmann, Ary A.

    2015-01-01

    Shifts to Integrated Pest Management (IPM) in agriculture are assisted by the identification of chemical applications that provide effective control of pests relative to broad-spectrum pesticides but have fewer negative effects on natural enemy (beneficial) groups that assist in pest control. Here, we outline a framework for identifying such applications and apply this framework to field trials involving the crop establishment phase of Australian dryland cropping systems. Several chemicals, which are not presently available to farmers in Australia, were identified as providing moderate levels of pest control and seedling protection, with the potential to be less harmful to beneficial groups including predatory mites, predatory beetles and ants. This framework highlights the challenges involved in chemically controlling pests while maintaining non-target populations when pest species are present at damaging levels. PMID:26694469

  12. REEF: Retainable Evaluator Execution Framework

    PubMed Central

    Weimer, Markus; Chen, Yingda; Chun, Byung-Gon; Condie, Tyson; Curino, Carlo; Douglas, Chris; Lee, Yunseong; Majestro, Tony; Malkhi, Dahlia; Matusevych, Sergiy; Myers, Brandon; Narayanamurthy, Shravan; Ramakrishnan, Raghu; Rao, Sriram; Sears, Russell; Sezgin, Beysim; Wang, Julia

    2015-01-01

    Resource Managers like Apache YARN have emerged as a critical layer in the cloud computing system stack, but the developer abstractions for leasing cluster resources and instantiating application logic are very low-level. This flexibility comes at a high cost in terms of developer effort, as each application must repeatedly tackle the same challenges (e.g., fault-tolerance, task scheduling and coordination) and re-implement common mechanisms (e.g., caching, bulk-data transfers). This paper presents REEF, a development framework that provides a control-plane for scheduling and coordinating task-level (data-plane) work on cluster resources obtained from a Resource Manager. REEF provides mechanisms that facilitate resource re-use for data caching, and state management abstractions that greatly ease the development of elastic data processing work-flows on cloud platforms that support a Resource Manager service. REEF is being used to develop several commercial offerings such as the Azure Stream Analytics service. Furthermore, we demonstrate REEF development of a distributed shell application, a machine learning algorithm, and a port of the CORFU [4] system. REEF is also currently an Apache Incubator project that has attracted contributors from several institutions. PMID:26819493

  13. Creating an outcomes framework.

    PubMed

    Doerge, J B

    2000-01-01

    Four constructs used to build a framework for outcomes management for a large midwestern tertiary hospital are described in this article. A system framework outlining a model of clinical integration and population management based on Steven Shortell's work is discussed. This framework includes key definitions of high-risk patients, target groups, populations and community. Roles for each level of population management and how they were implemented in the health care system are described. A point-of-service framework centered on seven dimensions of care is the next construct, applied on each nursing unit. The third construct outlines the framework for role development. Three roles for nursing were created to implement strategies for target groups that are strategic disease categories; two of those roles are described in depth. The philosophy of nursing practice is centered on caring and existential advocacy. The final construct is the modification of the Dartmouth model as a common framework for outcomes. System applications of the scorecard and lessons learned in the 2-year process of implementation are shared.

  14. The Qubit as Key to Quantum Physics Part II: Physical Realizations and Applications

    ERIC Educational Resources Information Center

    Dür, Wolfgang; Heusler, Stefan

    2016-01-01

    Using the simplest possible quantum system--the qubit--the fundamental concepts of quantum physics can be introduced. This highlights the common features of many different physical systems, and provides a unifying framework when teaching quantum physics at the high school or introductory level. In a previous "TPT" article and in a…

  15. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  16. NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, G.; Yang, L.

    2011-03-28

    The beam commissioning software framework of the NSLS-II project adopts a client/server based architecture to replace the more traditional monolithic high-level application approach. It is an open-structure platform, and we aim to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with little modification. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. For a new third-generation synchrotron light source with ultra-low emittance, there are new requirements and challenges in controlling and manipulating the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high-level applications (HLA) software environment. To satisfy them, an adequate system architecture for the software framework is critical for beam commissioning, study and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted a middle-layer concept to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches satisfies the requirements, so a new design has been proposed that introduces service-oriented architecture technology. As in the traditional approach, the HLA is a combination of tools for accelerator physicists and operators; in NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are their developers. With our client/server based approach, we leave how to retrieve information to the developers of the APIs, and how to compose them into a physics application to the users. For example, how channels relate to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; a chromaticity measurement routine is a user of the APIs. All users of the APIs work with magnet and instrument names in physics units, and low-level communication in current or voltage units is minimized. In this paper, we discuss our recent progress on the infrastructure and the client API.
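
    To make the physics-unit API idea concrete, here is a small, hypothetical sketch of what a client routine might look like; every name in it (MockServer, adjust_quad, the magnet names) is invented for illustration and is not the actual NSLS-II client API.

        # Hypothetical client-side view of a narrow physics-unit API.
        class MockServer:
            """Stands in for the client/server middle layer: it owns the
            mapping from magnet names to channels and converts units
            internally, so clients never see currents or voltages."""
            def __init__(self):
                self._settings = {"QH1": 1.83, "SH1": 12.4}  # physics units
            def read(self, magnet, unit="physics"):
                assert unit == "physics"
                return self._settings[magnet]
            def write(self, magnet, value, unit="physics"):
                self._settings[magnet] = value

        def adjust_quad(server, name, delta):
            # A physics application works only with names and physics units.
            server.write(name, server.read(name) + delta)

        server = MockServer()
        adjust_quad(server, "QH1", 0.05)
        print(server.read("QH1"))  # 1.88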

  17. [Informatics data quality and management].

    PubMed

    Feng, Rung-Chuang

    2009-06-01

    While the quality of data affects every aspect of business, it is frequently overlooked in terms of customer data integration, data warehousing, business intelligence and enterprise applications. Regardless of which data terms are used, a high level of data quality is a critical base condition essential to satisfying user needs and facilitating the development of effective applications. In this paper, the author introduces methods, a management framework and the major factors involved in data quality assessment. The author also integrates expert opinions to develop data quality assessment tools.

  18. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  19. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    PubMed

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  20. Study of Thread Level Parallelism in a Video Encoding Application for Chip Multiprocessor Design

    NASA Astrophysics Data System (ADS)

    Debes, Eric; Kaine, Greg

    2002-11-01

    In media applications there is a high level of available thread level parallelism (TLP). In this paper we study the intra TLP in a video encoder. We show that a well-distributed, highly optimized encoder running on a symmetric multiprocessor (SMP) system can run 3.2 times faster on a 4-way SMP machine than on a single processor. The multithreaded encoder running on an SMP system is then used to understand the requirements of a chip multiprocessor (CMP) architecture, which is one possible architectural direction for better exploiting TLP. In the framework of this study, we use a software approach to evaluate the dataflow between processors for the video encoder running on an SMP system. An estimation of the dataflow is done with L2 cache miss event counters using the Intel® VTune™ performance analyzer. The experimental measurements are compared to theoretical results.
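
    Read through Amdahl's law (an interpretation of ours, not an analysis from the paper), the 3.2x speed-up on four processors implies a parallel fraction of roughly 92 percent and a parallel efficiency of 80 percent, as the arithmetic below shows.

        # Back-of-envelope Amdahl's-law reading of the 3.2x-on-4-way result.
        def implied_parallel_fraction(speedup, n_procs):
            # Solve speedup = 1 / ((1 - p) + p / n) for the parallel fraction p.
            return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n_procs)

        p = implied_parallel_fraction(3.2, 4)
        print(f"parallel fraction ~ {p:.3f}")   # ~0.917
        print(f"efficiency = {3.2 / 4:.0%}")    # 80%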

  1. MARTe: A Multiplatform Real-Time Framework

    NASA Astrophysics Data System (ADS)

    Neto, André C.; Sartori, Filippo; Piccolo, Fabio; Vitelli, Riccardo; De Tommasi, Gianmaria; Zabeo, Luca; Barbalace, Antonio; Fernandes, Horacio; Valcarcel, Daniel F.; Batista, Antonio J. N.

    2010-04-01

    Development of real-time applications is usually associated with nonportable code targeted at specific real-time operating systems. The boundary between hardware drivers, system services, and user code is commonly not well defined, making development on the target host significantly difficult. The Multithreaded Application Real-Time executor (MARTe) is a framework built over a multiplatform library that allows the execution of the same code in different operating systems. The framework provides the high-level interfaces with hardware, external configuration programs, and user interfaces, assuring at the same time hard real-time performance. End-users of the framework are required to define and implement algorithms inside a well-defined block of software, named Generic Application Module (GAM), that is executed by the real-time scheduler. Each GAM is reconfigurable with a set of predefined configuration meta-parameters and interchanges information using a set of data pipes that are provided as inputs and required as outputs. Using these connections, different GAMs can be chained either in series or in parallel. GAMs can be developed and debugged in a non-real-time system and, only once the robustness of the code and correctness of the algorithm are verified, deployed to the real-time system. The software also supplies a large set of utilities that greatly ease the interaction with and debugging of a running system. Among the most useful are a highly efficient real-time logger, HTTP introspection of real-time objects, and HTTP remote configuration. MARTe is currently being used to successfully drive the plasma vertical stabilization controller on the largest magnetic confinement fusion device in the world, with a control loop cycle of 50 μs and a jitter under 1 μs. In this particular project, MARTe is used with the Real-Time Application Interface (RTAI)/Linux operating system, exploiting the new x86 multicore processor technology.
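
    The GAM pattern can be pictured with a toy sketch: reconfigurable blocks that read and write named data pipes and are executed in order each cycle. The Python below is purely illustrative and is not MARTe's actual C++ API.

        # Toy illustration of the GAM idea: blocks exchanging data through
        # named pipes, configured by meta-parameters, run once per cycle.
        class GAM:
            def __init__(self, name, inputs, outputs, params=None):
                self.name, self.inputs, self.outputs = name, inputs, outputs
                self.params = params or {}  # configuration meta-parameters
            def execute(self, pipes):
                raise NotImplementedError

        class Gain(GAM):
            def execute(self, pipes):
                pipes[self.outputs[0]] = self.params["k"] * pipes[self.inputs[0]]

        class Offset(GAM):
            def execute(self, pipes):
                pipes[self.outputs[0]] = pipes[self.inputs[0]] + self.params["c"]

        def run_cycle(gams, pipes):
            # The scheduler executes the chain once per real-time cycle.
            for gam in gams:
                gam.execute(pipes)
            return pipes

        chain = [Gain("g", ["adc"], ["x"], {"k": 2.0}),
                 Offset("o", ["x"], ["dac"], {"c": -1.0})]
        print(run_cycle(chain, {"adc": 0.75}))  # {'adc': 0.75, 'x': 1.5, 'dac': 0.5}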

  2. Leadership qualities framework provides a useful tool for nurses.

    PubMed

    Guelbert, Catherine

    2003-11-01

    Good leadership can be difficult to define, but it is vital to inspiring staff to improve services. A framework has been developed to enable NHS leaders at all levels to assess their strengths and identify their development needs. It is applicable to leadership roles at any level, including those held by nurses.

  3. Adjusted Framework of M-Learning in Blended Learning System for Mathematics Study Field of Junior High School Level VII

    NASA Astrophysics Data System (ADS)

    Sugiyanta, Lipur; Sukardjo, Moch.

    2018-04-01

    The 2013 curriculum requires teachers to be more productive, creative, and innovative in encouraging students to become more independent by strengthening attitudes, skills, and knowledge. Teachers are given the option to create lesson plans suited to the environment and conditions of their students. At the junior level, the Core Competences (KI) and Basic Competences (KD) have been completely designed, and guidebooks already exist both for teachers (Master's Books) and for learners (Student Books). However, the existing lesson plans and guidebooks are intended only for learning in the classroom. Educational technology has opened up many alternative classrooms and learning models, including the combination of classroom interaction with mobile learning applications. Mobile learning has evolved rapidly in education over the last ten years and many initiatives have been conducted worldwide, yet few of these efforts have produced lasting outcomes: mobile education applications are complex, and long-term sustainability remains a risk. Long-term sustainability usually results from continuous adaptation to changing conditions [4]. Frameworks are therefore required to avoid sustainability pitfalls, and implementation should start from a simple environment and gradually become more complex through adaptation steps. This paper therefore develops a framework for mobile learning (m-learning) adaptation for grade 7 (junior high school). The environment is blended mobile learning (not full mobile learning) with an emphasis on algebra, and the research follows the research and development (R&D) method. The results include requirements and adaptation steps. The adjusted m-learning framework is designed as guidance for teachers adopting m-learning to support blended learning environments. In a mock-up prototype, the adjusted framework demonstrates how to implement early blended mobile learning successfully. The social area is the focus of adaptation because participation is important for sustainability. The short mock-up practice suggests that blended mobile learning can be an effective pedagogical model for supporting students in inquiry-based learning.

  4. YAPPA: a Compiler-Based Parallelization Framework for Irregular Applications on MPSoCs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovergine, Silvia; Tumeo, Antonino; Villa, Oreste

    Modern embedded systems include hundreds of cores. Because of the difficulty in providing a fast, coherent memory architecture, these systems usually rely on non-coherent, non-uniform memory architectures with private memories for each core. However, programming these systems poses significant challenges. The developer must extract large amounts of parallelism, while orchestrating communication among cores to optimize application performance. These issues become even more significant with irregular applications, which present data sets that are difficult to partition, unpredictable memory accesses, unbalanced control flow and fine-grained communication. Hand-optimizing every single aspect is hard and time-consuming, and it often does not lead to the expected performance. There is a growing gap between such complex and highly-parallel architectures and the high-level languages used to describe the specification, which were designed for simpler systems and do not consider these new issues. In this paper we introduce YAPPA (Yet Another Parallel Programming Approach), a compilation framework for the automatic parallelization of irregular applications on modern MPSoCs based on LLVM. We start by considering an efficient parallel programming approach for irregular applications on distributed memory systems. We then propose a set of transformations that can reduce the development and optimization effort. The results of our initial prototype confirm the correctness of the proposed approach.

  5. The Cladistic Basis for the Phylogenetic Diversity (PD) Measure Links Evolutionary Features to Environmental Gradients and Supports Broad Applications of Microbial Ecology’s “Phylogenetic Beta Diversity” Framework

    PubMed Central

    Faith, Daniel P.; Lozupone, Catherine A.; Nipperess, David; Knight, Rob

    2009-01-01

    The PD measure of phylogenetic diversity interprets branch lengths cladistically to make inferences about feature diversity. PD calculations extend conventional species-level ecological indices to the features level. The “phylogenetic beta diversity” framework developed by microbial ecologists calculates PD-dissimilarities between community localities. Interpretation of these PD-dissimilarities at the feature level explains the framework’s success in producing ordinations revealing environmental gradients. An example gradient space using PD-dissimilarities illustrates how evolutionary features form unimodal response patterns along gradients. This features model supports new applications of existing species-level methods that are robust to unimodal responses, plus novel applications relating to climate change, commercial products discovery, and community assembly. PMID:20087461
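
    Faith's PD itself is simple to state: the sum of the branch lengths of the minimal subtree connecting a set of taxa. The toy sketch below assumes a small hypothetical rooted tree and the rooted-PD convention.

        # Minimal sketch of Faith's PD on a toy rooted tree, stored as
        # parent pointers with the length of each node's branch to its parent.
        parent = {"A": "n1", "B": "n1", "C": "n2", "n1": "n2", "n2": None}
        length = {"A": 1.0, "B": 1.5, "C": 2.0, "n1": 0.5, "n2": 0.0}

        def pd(taxa):
            branches = set()
            for t in taxa:
                node = t
                while node is not None:   # walk to the root, collecting edges
                    branches.add(node)
                    node = parent[node]
            return sum(length[b] for b in branches)

        print(pd({"A", "B"}))  # 1.0 + 1.5 + 0.5 = 3.0
        print(pd({"A", "C"}))  # 1.0 + 0.5 + 2.0 = 3.5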

  6. Examining statewide capacity for school health and mental health promotion: a post hoc application of a district capacity-building framework.

    PubMed

    Maras, Melissa A; Weston, Karen J; Blacksmith, Jennifer; Brophy, Chelsey

    2015-03-01

    Schools must possess a variety of capacities to effectively support comprehensive and coordinated school health promotion activities, and researchers have developed a district-level capacity-building framework specific to school health promotion. State-level school health coalitions often support such capacity-building efforts and should embed this work within a data-based, decision-making model. However, there is a lack of guidance for state school health coalitions on how they should collect and use data. This article uses a district-level capacity-building framework to interpret findings from a statewide coordinated school health needs/resource assessment in order to examine statewide capacity for school health promotion. Participants included school personnel (N = 643) from one state. Descriptive statistics were calculated for survey items, with further examination of subgroup differences among school administrators and nurses. Results were then interpreted via a post hoc application of a district-level capacity-building framework. Findings across districts revealed statewide strengths and gaps with regard to leadership and management capacities, internal and external supports, and an indicator of global capacity. Findings support the utility of using a common framework across local and state levels to align efforts and embed capacity-building activities within a data-driven, continuous improvement model. © 2014 Society for Public Health Education.

  7. Individual responsibility for healthcare financing: application of an analytical framework exploring the suitability of private financing of assistive devices.

    PubMed

    Tinghög, Gustav; Carlsson, Per

    2012-12-01

    To operationalise and apply a conceptual framework for exploring when health services contain characteristics that facilitate individuals' ability to take individual responsibility for health care through out-of-pocket payment. In addition, we investigate if the levels of out-of-pocket payment for assistive devices (ADs) in Sweden are in line with the proposed framework. Focus groups were used to operationalise the core concepts of sufficient knowledge, individual autonomy, positive externalities, sufficient demand, affordability, and lifestyle enhancement into a measurable and replicable rationing tool. A selection of 28 ADs were graded separately as having high, medium, or low suitability for private financing according to the measurement scale provided through the operationalised framework. To investigate the actual level of private financing, a questionnaire about the level of out-of-pocket payment for the specific ADs was administered to county councils in Sweden. Concepts were operationalised into three levels indicating possible suitability for private financing. Responses to the questionnaire indicate that financing of ADs in Sweden varies across county councils as regards co-payment, full payment, discretionary payment for certain healthcare consumer groups, and full reimbursement. According to the framework, ADs commonly funded privately were generally considered to be more suitable for private financing. Sufficient knowledge, individual autonomy, and sufficient demand did not appear to influence why certain ADs were financed out-of-pocket. The level of positive externalities, affordability, and lifestyle enhancement appeared to be somewhat higher for ADs that were financed out-of-pocket, but the differences were small. Affordability seemed to be the most influential concept.

  8. Examination of the regulatory frameworks applicable to biologic drugs (including stem cells and their progeny) in Europe, the U.S., and Australia: part II--a method of software documentary analysis.

    PubMed

    Ilic, Nina; Savic, Snezana; Siegel, Evan; Atkinson, Kerry; Tasic, Ljiljana

    2012-12-01

    A wide range of regulatory standards applicable to production and use of tissues, cells, and other biologics (or biologicals), as advanced therapies, indicates considerable interest in the regulation of these products. The objective of this study was to analyze and compare high-tier documents within the Australian, European, and U.S. biologic drug regulatory environments using qualitative methodology. Eighteen high-tier documents from the European Medicines Agency (EMA), U.S. Food and Drug Administration (FDA), and Therapeutic Goods Administration (TGA) regulatory frameworks were subject to automated text analysis. Selected documents were consistent with the legal requirements for manufacturing and use of biologic drugs in humans and fall into six different categories. Concepts, themes, and their co-occurrence were identified and compared. The most frequent concepts in TGA, FDA, and EMA frameworks were "biological," "product," and "medicinal," respectively. This was consistent with the previous manual terminology search. Good Manufacturing Practice documents, across frameworks, identified "quality" and "appropriate" as main concepts, whereas in Good Clinical Practice (GCP) documents it was "clinical," followed by "trial," "subjects," "sponsor," and "data." GCP documents displayed considerably higher concordance between different regulatory frameworks, as demonstrated by a smaller number of concepts, similar size, and similar distance between them. Although high-tier documents often use different terminology, they share concepts and themes. This paper may be a modest contribution to the recognition of similarities and differences between analyzed regulatory documents. It may also fill the literature gap and provide some foundation for future comparative research of biologic drug regulations on a global level.

  9. Translation from UML to Markov Model: A Performance Modeling Framework

    NASA Astrophysics Data System (ADS)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of the behavior of a system during the early phases of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication-system applications that translates high-level UML notation into a Continuous Time Markov Chain (CTMC) model and solves that model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams and deployment diagrams to generate the performance model for a communication system. The system dynamics are captured by UML collaboration and activity diagrams as reusable specification building blocks, while the deployment diagram highlights the components of the system. The collaboration and activity diagrams show how reusable building blocks compose the service components through input and output pins, highlighting the behavior of the components; a mapping between the collaborations and the system components identified by the deployment diagram is then delineated. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for the relevant performance metrics through our proposed framework. The applicability of our proposed performance modeling framework is demonstrated in the context of modeling a communication system.
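
    The final step of such a framework, solving the CTMC, reduces to linear algebra: the stationary distribution pi satisfies pi Q = 0 with the entries of pi summing to one. A toy two-state example with placeholder rates:

        # Solving a toy CTMC for its stationary distribution, the last step
        # once UML models have been mapped to a generator matrix Q.
        # The two-state rates below are arbitrary placeholders.
        import numpy as np

        Q = np.array([[-0.2,  0.2],    # state 0 (up): fails at rate 0.2
                      [ 1.0, -1.0]])   # state 1 (down): repairs at rate 1.0

        # pi @ Q = 0 with sum(pi) = 1: replace one balance equation with
        # the normalization constraint and solve the linear system.
        A = np.vstack([Q.T[:-1], np.ones(2)])
        b = np.array([0.0, 1.0])
        pi = np.linalg.solve(A, b)
        print(pi)  # [0.8333..., 0.1666...] -> availability ~83%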

  10. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative or value-based idea is realized on the basis of stakeholder requirements. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems; they are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. Compared to other methods, the proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and it helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.
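
    Although StakeMeter's published metrics are not reproduced here, the flavor of metric-based stakeholder quantification can be sketched as weighted attribute scoring; the attributes, weights, and stakeholders below are invented for illustration.

        # Hypothetical weighted-attribute stakeholder scoring and ranking.
        weights = {"interest": 0.4, "influence": 0.35, "domain_knowledge": 0.25}

        stakeholders = {
            "sponsor":   {"interest": 0.9, "influence": 0.9, "domain_knowledge": 0.5},
            "end_user":  {"interest": 0.8, "influence": 0.3, "domain_knowledge": 0.9},
            "regulator": {"interest": 0.4, "influence": 0.7, "domain_knowledge": 0.6},
        }

        def score(attrs):
            # Weighted sum over the quantification attributes.
            return sum(weights[k] * attrs[k] for k in weights)

        ranked = sorted(stakeholders, key=lambda s: score(stakeholders[s]),
                        reverse=True)
        print(ranked)  # ['sponsor', 'end_user', 'regulator']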

  11. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  12. Instructional Uses of the Lexile Framework.

    ERIC Educational Resources Information Center

    Stenner, A. Jackson

    The Lexile Framework provides teachers with tools to help them link the results of reading assessment with subsequent instruction, focuses on appropriate-level curriculum for readers at all educational levels, and is designed to be flexible enough to use alongside any type of reading program. Suggested areas for application of this system include:…

  13. Verification of Security Policy Enforcement in Enterprise Systems

    NASA Astrophysics Data System (ADS)

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.
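
    A drastically simplified version of the enforcement check can be pictured as graph reachability: model the system's components and their connectivity, then verify that no request path lets an unauthorized principal reach the protected resource. The sketch below is a toy of ours, not the paper's verification method.

        # Toy enforcement check via reachability over a component graph.
        edges = {                  # connectivity after firewalls, permissions, etc.
            "internet": ["web"],
            "web": ["app"],
            "app": ["db"],
            "db": [],
        }

        def reachable(src, dst):
            stack, seen = [src], set()
            while stack:
                node = stack.pop()
                if node == dst:
                    return True
                if node in seen:
                    continue
                seen.add(node)
                stack.extend(edges.get(node, []))
            return False

        # High-level rule: only 'app' may access 'db', directly or indirectly.
        violations = [s for s in edges
                      if s not in ("app", "db") and reachable(s, "db")]
        print(violations)  # ['internet', 'web'] -> indirect paths break the policy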

  14. GiPSi: a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.

  15. A high performance data parallel tensor contraction framework: Application to coupled electro-mechanics

    NASA Astrophysics Data System (ADS)

    Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio

    2017-07-01

    The paper presents aspects of implementation of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are as follows. First, the capability to mathematically transform complex chains of operations to simpler equivalent expressions, while potentially avoiding routes with higher levels of computational complexity and, second, to perform a compile time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over the classical low-level style programming techniques.
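
    NumPy exposes a runtime analogue of this contraction-order search: np.einsum_path looks for a pairwise contraction order that minimizes floating point operations, much as the framework described above does at compile time. A chained matrix product makes the effect visible.

        # Contraction-order search with NumPy's runtime equivalent.
        import numpy as np

        A = np.random.rand(30, 40)
        B = np.random.rand(40, 50)
        C = np.random.rand(50, 20)

        # For the chain A.B.C, the path decides which pairwise product
        # happens first; einsum_path reports naive vs optimized FLOP counts.
        path, info = np.einsum_path("ij,jk,kl->il", A, B, C, optimize="optimal")
        print(info)

        result = np.einsum("ij,jk,kl->il", A, B, C, optimize=path)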

  16. ASIC/FPGA Trust Assessment Framework

    NASA Technical Reports Server (NTRS)

    Berg, Melanie

    2018-01-01

    NASA Electronic Parts and Packaging (NEPP) is developing a process to be employed in critical applications. The framework assesses levels of Trust and assurance in microelectronic systems. The process is being created with participation from a variety of organizations. We present a synopsis of the framework that includes contributions from The Aerospace Corporation.

  17. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    DOE PAGES

    Kim, Hyunjoo; el-Khamra, Yaakoub; Rodero, Ivan; ...

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  18. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
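
    The decision such frameworks automate can be reduced to a toy cost comparison: offload a component only when transfer time plus remote execution time beats local execution time. All numbers below are invented placeholders, not measurements from the paper.

        # Toy offloading decision of the kind such frameworks automate.
        def should_offload(local_s, remote_s, data_mb, bandwidth_mbps):
            transfer_s = data_mb * 8.0 / bandwidth_mbps  # MB -> Mb, then seconds
            return transfer_s + remote_s < local_s

        # A compute-heavy, data-light component is a good candidate:
        print(should_offload(local_s=12.0, remote_s=1.5, data_mb=2.0,
                             bandwidth_mbps=20.0))  # True  (0.8s + 1.5s < 12s)
        # A data-heavy component is not:
        print(should_offload(local_s=3.0, remote_s=0.5, data_mb=40.0,
                             bandwidth_mbps=20.0))  # False (16s + 0.5s > 3s)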

  20. Integrated health monitoring and controls for rocket engines

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Musgrave, J. L.; Guo, T. H.

    1992-01-01

    Current research in intelligent control systems at the Lewis Research Center is described in the context of a functional framework. The framework is applicable to a variety of reusable space propulsion systems for existing and future launch vehicles. It provides a 'road map' for technology development to enable enhanced engine performance with increased reliability, durability, and maintainability. The framework hierarchy consists of a mission coordination level, a propulsion system coordination level, and an engine control level. Each level is described in the context of the Space Shuttle Main Engine. The concept of integrating diagnostics with control is discussed within the context of the functional framework. A distributed real-time simulation testbed is used to realize and evaluate these functionalities in closed loop.

  1. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels, from individual dynamics, to dyadic dynamics, up to the global group-level dynamics of groups of arbitrary size. The Appendix in the Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
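    The paper's reference implementation is in MATLAB; the sketch below is our own minimal Python rendering of the core step, treating the multivariate series as a trajectory whose thresholded pairwise distances form the recurrence plot, from which the recurrence rate (%REC) follows.

      # Minimal MdRQA-style computation: recurrence plot and recurrence rate of a
      # d-dimensional time series; radius and the toy signal are illustrative.
      import numpy as np

      def md_recurrence_rate(x: np.ndarray, radius: float) -> float:
          """x has shape (T, d): T time points of a d-dimensional signal."""
          dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)  # (T, T)
          rp = (dist <= radius).astype(int)     # recurrence plot
          np.fill_diagonal(rp, 0)               # exclude trivial self-recurrences
          return rp.sum() / (rp.size - len(x))  # fraction of recurrent pairs

      rng = np.random.default_rng(0)
      signal = np.cumsum(rng.normal(size=(200, 3)), axis=0)  # toy 3-D random walk
      print(md_recurrence_rate(signal, radius=1.0))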

  2. Dynamic Resource Allocation for IEEE802.16e

    NASA Astrophysics Data System (ADS)

    Nascimento, Alberto; Rodriguez, Jonathan

    Mobile communications has witnessed an exponential increase in the number of users, services, and applications. New high-bandwidth applications are targeted for B3G networks, raising more stringent requirements for Dynamic Resource Allocation (DRA) architectures and packet schedulers, which must be spectrum-efficient and deliver QoS for heterogeneous applications and services. In this paper we propose a new cross-layer architecture framework embedded in a newly designed DRA architecture for the Mobile WiMAX standard. System-level simulation results show that the proposed architecture can be considered a viable candidate solution for supporting mixed services in a cost-effective manner, in contrast to existing approaches.

  3. Social disorganization and the profile of child welfare: Explaining child welfare activity by the community-level factors.

    PubMed

    Harrikari, Timo

    2014-10-01

    This article addresses the structure of local child welfare activities in light of community-level factors. It poses the following research questions: how are different community-level factors related to child welfare client structures in communities, and to what extent do these factors explain structural differences? The applied theoretical framework is based on social disorganization and strain theories as well as the human developmental approach. The data were collected from two Finnish national databases and consist of variables covering 257 Finnish municipalities. The method of analysis is multinomial logistic regression. The results suggest that local child welfare structures are tied to social disorganization, policing, and culture, as well as to the intensity of control in the communities. In general, the more fragile the communal structures, the more last-resort child welfare there is in the community. Where fragile communal structures were combined with a weak dependency ratio and a high proportion of social workers, child welfare statistics indicated still more intense activity. The results indicate that the theoretical framework is justified for the analysis of child welfare activity, but they also suggest that it requires further development through both context-bound reflection and application. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. An Application of the Impact Evaluation Process for Designing a Performance Measurement and Evaluation Framework in K-12 Environments

    ERIC Educational Resources Information Center

    Guerra-Lopez, Ingrid; Toker, Sacip

    2012-01-01

    This article illustrates the application of the Impact Evaluation Process for the design of a performance measurement and evaluation framework for an urban high school. One of the key aims of this framework is to enhance decision-making by providing timely feedback about the effectiveness of various performance improvement interventions. The…

  5. Complex basis functions for molecular resonances: Methodology and applications

    NASA Astrophysics Data System (ADS)

    White, Alec; McCurdy, C. William; Head-Gordon, Martin

    The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound-state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions are one way to rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area, including the methodological extension from single-determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation-of-motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances, including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.
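    For context, the resonance position and width discussed above are conventionally combined into a single complex (Siegert) energy; this standard relation, not specific to the talk, reads

      E_{\mathrm{res}} = E_R - \tfrac{i}{2}\,\Gamma,

    where E_R is the resonance position and \Gamma the width, inversely proportional to the lifetime of the metastable state.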

  6. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  8. [Assessing program sustainability in public health organizations: a tool-kit application in Haiti].

    PubMed

    Ridde, V; Pluye, P; Queuille, L

    2006-10-01

    Public health stakeholders are concerned about program sustainability. However, they usually conceive of sustainability according to financial criteria, for at least one reason: no simple framework is operationally and theoretically sound enough to evaluate program sustainability globally. The present paper describes the application of a framework assessment tool used to evaluate the sustainability level and process of a Nutritional Care Unit managed by a Swiss humanitarian agency to fight severe child malnutrition in a Haitian area. The managing agency is committed to putting this Unit back into the structure of a local public hospital. The evaluation was performed within the sustainability framework proposed in a former article. Data were collected with a combination of tools: semi-structured interviews (n=33, medical and support staff from the agency and the hospital), participatory observation, and document review. Data concerned the four characteristics of organizational routines (memory, adaptation, values, and rules), enabling assessment of the level of sustainability. In addition, data were related to three types of events distinguishing routinization processes from implementation processes: specific events of routinization, routinization-implementation joint events, and specific events of implementation. Data analysis was thematic, and results were validated by actors through a feedback session and written comments. The current level of sustainability of the Nutritional Care Unit within the Hospital is weak: weak memory, high adaptation, and weak sharing of values and rules. This may be explained by the sustainability process and the absence of specific routinization events. The relevance of such processes is reasonable, although it has been strongly challenged by the troubled Haitian context: riots have been widespread over recent years, creating difficulties for the Hospital. This experience suggests the proposed framework and sustainability assessment tools are useful when the context permits scrutiny of program sustainability.

  9. Examination of Frameworks for Safe Integration of Intelligent Small UAS into the NAS

    NASA Technical Reports Server (NTRS)

    Logan, Michael J.

    2012-01-01

    This paper discusses a proposed framework for the safe integration of small unmanned aerial systems (sUAS) into the National Airspace System (NAS). The paper briefly examines the potential uses of sUAS to build an understanding of the location and frequency of potential future flight operations based on the future applications of the sUAS systems. The paper then examines the types of systems that would be required to meet the application-level demand to determine "classes" of platforms and operations. A framework for categorization of the "intelligence" level of the UAS is postulated for purposes of NAS integration. Finally, constraints on the intelligent systems are postulated to ensure their ease of integration into the NAS.

  10. A software framework for developing measurement applications under variable requirements.

    PubMed

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and rapidly varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and focused by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework as applied to a magnet-testing measurement scenario at the European Organization for Nuclear Research are reported.

  11. Modeling and Reduction With Applications to Semiconductor Processing

    DTIC Science & Technology

    1999-01-01

    [Abstract not recoverable from the source; surviving fragment:] The reduction problem, then, is one of finding a systematic methodology within a given mathematical framework to produce an efficient or optimal trade-off …

  12. Ultrabroadband photonic internet: safety aspects

    NASA Astrophysics Data System (ADS)

    Kalicki, Arkadiusz; Romaniuk, Ryszard

    2008-11-01

    Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further concern. There are several techniques to mitigate these attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe configuration of the web framework interpreter are essential. It is advised to keep a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits are logging out from services just after finishing work and even using a separate web browser for the most important sites, such as e-banking.
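    The parameterized-statement mitigation mentioned above is easy to illustrate; the sketch below uses Python's standard sqlite3 module with a made-up one-table schema.

      # Parameterized statements bind user input as data, never as SQL text.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

      user_input = "alice' OR '1'='1"   # a classic injection attempt

      # Vulnerable pattern (string concatenation lets the input rewrite the query):
      #   conn.execute("SELECT role FROM users WHERE name = '" + user_input + "'")

      # Safe pattern: the ? placeholder treats the input as a literal value.
      rows = conn.execute("SELECT role FROM users WHERE name = ?", (user_input,))
      print(rows.fetchall())   # [] -- the injection string matches no user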

  13. Development and application of a framework to assess community nutritionists' use of environmental strategies to prevent obesity.

    PubMed

    Lu, Angela H; Dickin, Katherine; Dollahite, Jamie

    2014-01-01

    To develop and apply a framework exploring the extent of involvement in promoting environmental changes to prevent obesity by a group of nutrition educators (NE). Cross-sectional, mixed methods: qualitative interviews informed framework development; a survey applied the framework to describe NE's involvement in environmental changes. Cooperative Extension in New York State. Interviewees (n = 7) were selected to vary in environmental change activities and rural/urban location. Survey response rate was 100% (n = 58). Dimensions and degree of NE's involvement in promoting environmental change. Thematic analysis of qualitative data, triangulated with descriptive analyses of NE's performance of tasks in various settings. NE's promotion of environmental changes was characterized using a framework based on settings and tasks, dimensions that emerged from the qualitative analysis. NE's actions varied across these dimensions and ranged from low to high intensity of collaboration and leadership for environmental change. Most NE surveyed reported actions limited to providing information and recommendations on healthy eating and physical activity. Few reported intensive engagement in developing, implementing, and evaluating plans to change environments for obesity prevention. The framework identifies levels of engagement in promoting environmental changes and supports future research and practice of community nutrition professionals by providing a roadmap for assessing their involvement on multiple levels to prevent obesity. Copyright © 2014 Society for Nutrition Education and Behavior. All rights reserved.

  14. A Public Health Grid (PHGrid): Architecture and value proposition for 21st century public health.

    PubMed

    Savel, T; Hall, K; Lee, B; McMullin, V; Miles, M; Stinn, J; White, P; Washington, D; Boyd, T; Lenert, L

    2010-07-01

    This manuscript describes the value of and proposal for a high-level architectural framework for a Public Health Grid (PHGrid), which the authors believe can afford the public health community a robust technology infrastructure for secure and timely data, information, and knowledge exchange, not only within the public health domain but also between public health and the overall health care system. The CDC facilitated multiple Proof-of-Concept (PoC) projects, leveraging an open-source software development methodology, to test four hypotheses regarding this high-level framework. The outcomes of the four PoCs, in combination with the use of the Federal Enterprise Architecture Framework (FEAF) and the newly emerging Federal Segment Architecture Methodology (FSAM), were used to develop and refine a high-level architectural framework for a Public Health Grid infrastructure. The authors successfully documented a robust high-level architectural framework for a PHGrid. The documentation provided the level of granularity needed to validate the proposal and included examples of both information standards and services to be implemented. Both the results of the PoCs and feedback from selected public health partners were used to develop this documentation. A robust, cohesive, high-level architectural framework for a Public Health Grid (PHGrid) has been successfully articulated, with its feasibility demonstrated via multiple PoCs. In order to successfully implement this framework, the authors recommend moving forward with a three-pronged approach focusing on interoperability and standards, streamlining the PHGrid infrastructure, and developing robust and high-impact public health services. Published by Elsevier Ireland Ltd.

  15. A COMPUTATIONAL AND EXPERIMENTAL STUDY OF METAL AND COVALENT ORGANIC FRAMEWORKS USED IN ADSORPTION COOLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenks, Jeromy WJ; TeGrotenhuis, Ward E.; Motkuri, Radha K.

    2015-07-09

    Metal-organic frameworks (MOFs) have attracted enormous interest over the past few years due to their potential applications in energy storage and gas separation. However, there have been few reports on MOFs for adsorption cooling applications. Adsorption cooling technology is an established alternative to mechanical vapor compression refrigeration systems and is an excellent option in industrial environments where waste heat is available. Applications include hybrid systems, refrigeration, power-plant dry cooling, cryogenics, vehicular systems, and building HVAC. Adsorption-based cooling and refrigeration systems have several advantages, including few moving parts and negligible power consumption. Key disadvantages include large thermal mass, bulkiness, complex controls, and low COP (0.2-0.5). We explored the use of metal-organic frameworks that have very high mass loading and relatively low heats of adsorption, with certain combinations of refrigerants, to demonstrate a new type of highly efficient adsorption chiller. An adsorption chiller based on MOFs suggests that a thermally driven COP > 1 may be possible with these materials, which would represent a fundamental breakthrough in the performance of adsorption chiller technology. Computational fluid dynamics combined with a system-level lumped-parameter model have been used to project size and performance for chillers with a cooling capacity ranging from a few kW to several thousand kW. In addition, a cost model has been developed to project the manufactured cost of entire systems. These systems rely on stacked micro/mini-scale architectures to enhance heat and mass transfer. Presented herein are computational and experimental results for hydrophilic MOFs, fluorophilic MOFs, and fluorophilic covalent-organic frameworks (COFs).

  16. GNSS-ISR data fusion: General framework with application to the high-latitude ionosphere

    NASA Astrophysics Data System (ADS)

    Semeter, Joshua; Hirsch, Michael; Lind, Frank; Coster, Anthea; Erickson, Philip; Pankratius, Victor

    2016-03-01

    A mathematical framework is presented for the fusion of electron density measured by incoherent scatter radar (ISR) and total electron content (TEC) measured using global navigation satellite systems (GNSS). Both measurements are treated as projections of an unknown density field (for GNSS-TEC the projection is tomographic; for ISR the projection is a weighted average over a local spatial region) and discrete inverse theory is applied to obtain a higher fidelity representation of the field than could be obtained from either modality individually. The specific implementation explored herein uses the interpolated ISR density field as initial guess to the combined inverse problem, which is subsequently solved using maximum entropy regularization. Simulations involving a dense meridional network of GNSS receivers near the Poker Flat ISR demonstrate the potential of this approach to resolve sub-beam structure in ISR measurements. Several future directions are outlined, including (1) data fusion using lower level (lag product) ISR data, (2) consideration of the different temporal sampling rates, (3) application of physics-based regularization, (4) consideration of nonoptimal observing geometries, and (5) use of an ISR simulation framework for optimal experiment design.
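    In generic inverse-theory notation (the symbols are ours, not the paper's), the maximum entropy step described above can be written as follows: with A the stacked ISR/GNSS projection operator, b the stacked measurements, and m the ISR-interpolated initial guess,

      \hat{n} = \arg\min_{n \ge 0} \; \lVert A n - b \rVert_2^2 \;+\; \lambda \sum_i n_i \ln\frac{n_i}{m_i},

    where \lambda sets the trade-off between fitting the data and staying close to the initial density field.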

  17. State Estimation Using Dependent Evidence Fusion: Application to Acoustic Resonance-Based Liquid Level Measurement.

    PubMed

    Xu, Xiaobin; Li, Zhenghui; Li, Guo; Zhou, Zhe

    2017-04-21

    Estimating the state of a dynamic system from noisy sensor measurements is a common problem in sensor methods and applications. Most state estimation methods assume that measurement noise and state perturbations can be modeled as random variables with known statistical properties. In some practical applications, however, engineers can only obtain the range of the noises rather than their precise statistical distributions. Hence, in the framework of Dempster-Shafer (DS) evidence theory, a novel state estimation method is presented that fuses dependent evidence generated from the state equation, the observation equation, and the actual observations of the system states under bounded noises. It can be implemented iteratively to provide state estimates computed from the fusion results at every time step. Finally, the proposed method is applied to a low-frequency acoustic resonance level gauge to obtain high-accuracy measurement results.
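    For background, classical Dempster-Shafer fusion of two independent bodies of evidence uses Dempster's rule of combination; the paper's method handles dependent evidence, so the sketch below shows only the standard rule, with invented mass functions over a two-state frame.

      # Dempster's rule of combination for two mass functions whose focal elements
      # are frozensets over a common frame of discernment. Illustrative values only.
      from itertools import product

      def dempster_combine(m1: dict, m2: dict) -> dict:
          combined, conflict = {}, 0.0
          for (a, p), (b, q) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + p * q
              else:
                  conflict += p * q              # mass assigned to the empty set
          if conflict >= 1.0:
              raise ValueError("total conflict; evidence cannot be combined")
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      low, high = frozenset({"low"}), frozenset({"high"})
      m_gauge = {low: 0.6, low | high: 0.4}               # evidence from the sensor
      m_model = {low: 0.5, high: 0.3, low | high: 0.2}    # evidence from the model
      print(dempster_combine(m_gauge, m_model))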

  18. Virtual Levels and Role Models: N-Level Structural Equations Model of Reciprocal Ratings Data.

    PubMed

    Mehta, Paras D

    2018-01-01

    A general latent variable modeling framework called n-Level Structural Equations Modeling (NL-SEM) for dependent data structures is introduced. NL-SEM is applicable to a wide range of complex multilevel data structures (e.g., cross-classified, switching membership, etc.). Reciprocal dyadic ratings obtained in a round-robin design involve a complex set of dependencies that cannot be modeled within the Multilevel Modeling (MLM) or Structural Equations Modeling (SEM) frameworks. The Social Relations Model (SRM) for round-robin data is used as an example to illustrate key aspects of the NL-SEM framework. NL-SEM introduces novel constructs such as 'virtual levels' that allow a natural specification of latent variable SRMs. An empirical application of an explanatory SRM for personality using xxM, a software package implementing NL-SEM, is presented. Results show that person perceptions are an integral aspect of personality. Methodological implications of NL-SEM for the analysis of an emerging class of contextual and relational SEMs are discussed.

  19. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure, unlike other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490

  20. Virtual machine-based simulation platform for mobile ad-hoc network-based cyber infrastructure

    DOE PAGES

    Yoginath, Srikanth B.; Perumalla, Kayla S.; Henz, Brian J.

    2015-09-29

    In modeling and simulating complex systems such as mobile ad-hoc networks (MANETs) in defense communications, it is a major challenge to reconcile multiple important considerations: the rapidity of unavoidable changes to the software (network layers and applications), the difficulty of modeling the critical, implementation-dependent behavioral effects, the need to sustain larger scale scenarios, and the desire for faster simulations. Here we present our approach in successfully reconciling them using a virtual time-synchronized virtual machine (VM)-based parallel execution framework that accurately lifts both the devices as well as the network communications to a virtual time plane while retaining full fidelity. At the core of our framework is a scheduling engine that operates at the level of a hypervisor scheduler, offering a unique ability to execute multi-core guest nodes over multi-core host nodes in an accurate, virtual time-synchronized manner. In contrast to other related approaches that suffer from either speed or accuracy issues, our framework provides MANET node-wise scalability, high fidelity of software behaviors, and time-ordering accuracy. The design and development of this framework is presented, and an actual implementation based on the widely used Xen hypervisor system is described. Benchmarks with synthetic and actual applications are used to identify the benefits of our approach. The time inaccuracy of traditional emulation methods is demonstrated, in comparison with the accurate execution of our framework verified by theoretically correct results expected from analytical models of the same scenarios. In the largest high fidelity tests, we are able to perform virtual time-synchronized simulation of 64-node VM-based full-stack, actual software behaviors of MANETs containing a mix of static and mobile (unmanned airborne vehicle) nodes, hosted on a 32-core host, with full fidelity of unmodified ad-hoc routing protocols, unmodified application executables, and user-controllable physical layer effects including inter-device wireless signal strength, reachability, and connectivity.

  2. Word-level language modeling for P300 spellers based on discriminative graphical models

    NASA Astrophysics Data System (ADS)

    Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat

    2015-04-01

    Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
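    A toy rendering of the word-level prior idea (vocabulary, prior probabilities, and the stand-in classifier below are all invented) shows how whole-word rescoring can correct a misclassified letter:

      # Combine per-letter classifier scores with a word-level prior and pick the
      # most probable vocabulary word; simulates the error-correction effect.
      import numpy as np

      vocab = {"cat": 0.5, "car": 0.3, "cap": 0.2}   # hypothetical word prior
      letters = "abcdefghijklmnopqrstuvwxyz"

      def letter_scores(observed: str) -> list:
          """Stand-in for P300 classifier output: the flagged letter gets 0.9,
          the remaining probability mass is spread over the other 25 letters."""
          out = []
          for ch in observed:
              probs = {c: 0.1 / 25 for c in letters}
              probs[ch] = 0.9
              out.append(probs)
          return out

      def decode(observed: str) -> str:
          scores = letter_scores(observed)
          posterior = {w: vocab[w] * np.prod([scores[i][w[i]] for i in range(len(w))])
                       for w in vocab}
          return max(posterior, key=posterior.get)

      # The classifier flags 'f' for the last letter, but no vocabulary word ends
      # in 'f', so the word prior recovers the most likely intended word.
      print(decode("caf"))   # -> "cat"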

  3. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research, a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework, as originally proposed, provides a single-level optimization strategy that combines engineering decisions with business decisions in a single optimization. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.

  4. The Need for V&V in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to an entire domain or product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  5. Airplane detection based on fusion framework by combining saliency model with Deep Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Dou, Hao; Sun, Xiao; Li, Bin; Deng, Qianqian; Yang, Xubo; Liu, Di; Tian, Jinwen

    2018-03-01

    Aircraft detection from very-high-resolution remote sensing images has gained increasing interest in recent years due to successful civil and military applications. However, several problems still exist: 1) how to extract the high-level features of aircraft; 2) locating objects within such large images is difficult and time-consuming; and 3) the common problem of multiple resolutions of satellite images remains. In this paper, inspired by biological visual mechanisms, a fusion detection framework is proposed that fuses a top-down visual mechanism (a deep CNN model) and a bottom-up visual mechanism (GBVS) to detect aircraft. In addition, we use a multi-scale training method for the deep CNN model to address the problem of multiple resolutions. Experimental results demonstrate that our method achieves better detection results than the other methods.

  6. Java Application Shell: A Framework for Piecing Together Java Applications

    NASA Technical Reports Server (NTRS)

    Miller, Philip; Powers, Edward I. (Technical Monitor)

    2001-01-01

    This session describes the architecture of Java Application Shell (JAS), a Swing-based framework for developing interactive Java applications. Java Application Shell is being developed by Commerce One, Inc. for NASA Goddard Space Flight Center Code 588. The purpose of JAS is to provide a framework for the development of Java applications, providing features that enable the development process to be more efficient, consistent and flexible. Fundamentally, JAS is based upon an architecture where an application is considered a collection of 'plugins'. In turn, a plug-in is a collection of Swing actions defined using XML and packaged in a jar file. Plug-ins may be local to the host platform or remotely-accessible through HTTP. Local and remote plugins are automatically discovered by JAS upon application startup; plugins may also be loaded dynamically without having to re-start the application. Using Extensible Markup Language (XML) to define actions, as opposed to hardcoding them in application logic, allows easier customization of application-specific operations by separating application logic from presentation. Through XML, a developer defines an action that may appear on any number of menus, toolbars, and buttons. Actions maintain and propagate enable/disable states and specify icons, tool-tips, titles, etc. Furthermore, JAS allows actions to be implemented using various scripting languages through the use of IBM's Bean Scripting Framework. Scripted action implementation is seamless to the end-user. In addition to action implementation, scripts may be used for application and unit-level testing. In the case of application-level testing, JAS has hooks to assist a script in simulating end-user input. JAS also provides property and user preference management, JavaHelp, Undo/Redo, Multi-Document Interface, Single-Document Interface, printing, and logging. Finally, Jini technology has also been included into the framework by means of a Jini services browser and the ability to associate services with actions. Several Java technologies have been incorporated into JAS, including Swing, Internal Frames, Java Beans, XML, JavaScript, JavaHelp, and Jini. Additional information is contained in the original extended abstract.

  7. The development of a competency framework for pharmacists providing cancer services.

    PubMed

    Carrington, Christine; Weir, Janet; Smith, Peter

    2011-09-01

    Health practitioners should possess relevant, up-to-date skills and be able to perform within their required scope of practice to ensure that they are competent. Maintaining the competency of health care professionals is a key principle of clinical governance and risk management. The aim of this project was to develop a competency framework for pharmacists providing pharmaceutical care to cancer patients. An initial draft framework was developed based on existing documentation and adapted to the needs of Queensland Health (QH) facilities. Pharmacists in QH and interstate were asked to review the framework for content and applicability. Cancer care pharmacists in QH were invited to evaluate and score the usefulness and relevance of the final framework. The framework consists of competency clusters, which describe core activities within three areas: patient care competencies, knowledge competencies, and advanced-level competencies. The characteristics of the levels of practice at foundation, advanced, and consultant level are defined. Twelve pharmacists evaluated the framework by self-assessing their own practice. Respondents reported that the framework was very to somewhat reflective of what they usually do and gave overall support for its content and applicability to practice. The framework was developed using national and international documents and the input of experienced practitioners across Australia. It represents a set of key competencies for the pharmaceutical delivery of cancer care. The next essential step is to implement and integrate the framework into practice and to develop accompanying training tools.

  8. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.

  9. Task-specific image partitioning.

    PubMed

    Kim, Sungwoong; Nowozin, Sebastian; Kohli, Pushmeet; Yoo, Chang D

    2013-02-01

    Image partitioning is an important preprocessing step for many of the state-of-the-art algorithms used for performing high-level computer vision tasks. Typically, partitioning is conducted without regard to the task at hand. We propose a task-specific image partitioning framework to produce a region-based image representation that leads to higher task performance than that reached using any task-oblivious partitioning framework or the existing supervised partitioning frameworks, which are few in number. The proposed method partitions the image by means of correlation clustering, maximizing a linear discriminant function defined over a superpixel graph. The parameters of the discriminant function that define task-specific similarity/dissimilarity among superpixels are estimated with a structured support vector machine (S-SVM) using task-specific training data. The S-SVM learning leads to better generalization ability, while the construction of the superpixel graph used to define the discriminant function allows a rich set of features to be incorporated to improve discriminability and robustness. We evaluate the learned task-aware partitioning algorithms on three benchmark datasets. Results show that task-aware partitioning leads to better labeling performance than the partitioning computed by state-of-the-art general-purpose and supervised partitioning algorithms. We believe that the task-specific image partitioning paradigm is widely applicable to improving performance in high-level image understanding tasks.
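    Schematically, and in our own notation rather than the paper's, the correlation clustering step solves, over the superpixel graph G = (V, E),

      y^{*} \;=\; \arg\max_{y} \sum_{(i,j) \in E} \langle w, \phi_{ij} \rangle \, \mathbf{1}[y_i = y_j],

    where \phi_{ij} are pairwise superpixel features, w is the weight vector learned by the S-SVM from task-specific training data, and y assigns each superpixel to a region.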

  10. 3-dimensional interconnected framework of N-doped porous carbon based on sugarcane bagasse for application in supercapacitors and lithium ion batteries

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Wang, Yunhui; Peng, Yueying; Wang, Xin; Wang, Jing; Zhao, Jinbao

    2018-06-01

    In this work, N-doped biomass-derived porous carbon (NSBDC) has been prepared from a low-cost agricultural waste, sugarcane bagasse, as the precursor, with needle-like PANI as the dopant. NSBDC possesses a special 3D interconnected framework structure, superior hierarchical pores, and a suitable heteroatom doping level, which benefit ion storage and high-rate ion transfer. Typically, NSBDC exhibits high specific capacitance (298 F g-1 at 1 A g-1) and rate capability (58.7% capacitance retention at 20 A g-1), as well as high cycle stability (5.5% loss over 5000 cycles) in three-electrode systems. A two-electrode asymmetric system has been fabricated employing NSBDC and the precursor of NSBDC (sugarcane bagasse derived carbon/PANI composite) as the negative and positive electrodes, respectively, and an energy density as high as 49.4 Wh kg-1 is verified in this asymmetric system. An NSBDC-based symmetric supercapacitor has also been assembled, and it can easily light a 1.5 V bulb due to its high energy density (27.7 Wh kg-1). In addition, to expand the application areas of NSBDC, it is also applied to lithium-ion batteries, and a high reversible capacity of 1148 mAh g-1 at 0.1 A g-1 is confirmed. Even at 5 A g-1, NSBDC can still deliver a high reversible capacity of 357 mAh g-1 after 200 cycles, indicating its superior lithium storage capability.

  11. The SMAT fiber laser for industrial applications

    NASA Astrophysics Data System (ADS)

    Ding, Jianwu; Liu, Jinghui; Wei, Xi; Xu, Jun

    2017-02-01

    With the increased adoption of high-power fiber lasers for various industrial applications, the downtime and reliability of fiber lasers become more and more important. Here we present our approach toward a more reliable and more intelligent laser source for industrial applications: the SMAT fiber laser, with an extensive sensor network and multi-level protection mechanism, a mobile connection and mobile app, and the Smart Cloud. The proposed framework, the first IoT (Internet of Things) approach integrated into an industrial laser, not only improves the reliability of the laser but also opens up enormous potential for value-adding services by gathering and analyzing the big data from connected SMAT lasers.

  12. Component-Level Tuning of Kinematic Features from Composite Therapist Impressions of Movement Quality

    PubMed Central

    Venkataraman, Vinay; Turaga, Pavan; Baran, Michael; Lehrer, Nicole; Du, Tingfang; Cheng, Long; Rikakis, Thanassis; Wolf, Steven L.

    2016-01-01

    In this paper, we propose a general framework for tuning component-level kinematic features using therapists’ overall impressions of movement quality, in the context of a Home-based Adaptive Mixed Reality Rehabilitation (HAMRR) system. We propose a linear combination of non-linear kinematic features to model wrist movement, and propose an approach to learn feature thresholds and weights using high-level labels of overall movement quality provided by a therapist. The kinematic features are chosen such that they correlate with clinical assessment scores of wrist movement quality. Further, the proposed features are designed to be reliably extracted from an inexpensive and portable motion capture system using a single reflective marker on the wrist. Using a dataset collected from ten stroke survivors, we demonstrate that the framework can be reliably used for movement quality assessment in HAMRR systems. The system is currently being deployed for large-scale evaluations, and will represent an increasingly important application area of motion capture and activity analysis. PMID:25438331
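    As a rough sketch of the tuning idea (the feature names, the plain least-squares fit, and all numbers are ours, not the paper's procedure), component-level feature weights can be recovered from overall quality ratings as follows:

      # Fit weights of a linear combination of kinematic features to therapists'
      # overall movement-quality ratings. Synthetic data, for illustration only.
      import numpy as np

      rng = np.random.default_rng(1)
      # Per-trial features, e.g. [jerkiness, trajectory_error, speed_variability]
      X = rng.uniform(0.0, 1.0, size=(40, 3))
      ratings = 5 - 3 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 0.2, 40)  # 0-5 scale

      # Least-squares fit of weights and bias from the composite impressions.
      w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], ratings, rcond=None)

      def movement_quality(features: np.ndarray) -> float:
          return float(features @ w[:3] + w[3])

      print("weights:", w[:3].round(2), "bias:", round(w[3], 2))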

  13. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection †

    PubMed Central

    Delaney, Declan T.; O’Hare, Gregory M. P.

    2016-01-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks. PMID:27916929
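    The selection step the article describes reduces to evaluating each solution's performance model in the current environment and taking the best predicted value; a minimal sketch (solution names, environment features, and the linear models are invented):

      # Pick the network solution whose performance model predicts the best value
      # of the metric of interest for the currently observed environment.
      from typing import Callable, Dict

      Env = Dict[str, float]            # e.g. {"density": 0.8}
      Model = Callable[[Env], float]    # predicted metric; higher is better

      def select_solution(models: Dict[str, Model], env: Env) -> str:
          return max(models, key=lambda name: models[name](env))

      models = {
          "rpl-etx":  lambda e: 0.9 - 0.4 * e["density"],
          "rpl-hops": lambda e: 0.8 - 0.1 * e["density"],
          "flooding": lambda e: 0.5,
      }
      print(select_solution(models, {"density": 0.8}))   # -> "rpl-hops"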

  15. A Software Rejuvenation Framework for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  16. Implementing the United Kingdom Government's 10-Year Teenage Pregnancy Strategy for England (1999-2010): Applicable Lessons for Other Countries.

    PubMed

    Hadley, Alison; Chandra-Mouli, Venkatraman; Ingham, Roger

    2016-07-01

    Teenage pregnancy is an issue of inequality affecting the health, well-being, and life chances of young women, young men, and their children. Consequently, high levels of teenage pregnancy are of concern to an increasing number of developing and developed countries. The UK Labour Government's Teenage Pregnancy Strategy for England was one of the very few examples of a nationally led, locally implemented evidence-based strategy, resourced over a long duration, with an associated reduction of 51% in the under-18 conception rate. This article seeks to identify the lessons applicable to other countries. The article focuses on the prevention program. Drawing on the detailed documentation of the 10-year strategy, it analyzes the factors that helped and hindered implementation against the World Health Organization (WHO) ExpandNet Framework. The Framework strives to improve the planning and management of the process of scaling-up of successful pilot programs with a focus on sexual and reproductive health, making it particularly suited for an analysis of England's teenage pregnancy strategy. The development and implementation of the strategy matches the Framework's key attributes for successful planning and scaling up of sexual and reproductive health programs. It also matched the attributes identified by the Centre for Global Development for scaled up approaches to complex public health issues. Although the strategy was implemented in a high-income country, analysis against the WHO-ExpandNet Framework identifies many lessons which are transferable to low- and medium-income countries seeking to address high teenage pregnancy rates. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  17. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background Combining different sources of knowledge to build improved structure activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework to interoperate between learning techniques. Most of the current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility of directly combining these sources at the knowledge level, with the aim of harvesting potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge into a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge, and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems, along with an illustrative application to the prediction of mutagenicity. Conclusion It is possible to represent knowledge in the unified form of a hypothesis network allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206
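
    A minimal sketch of the hypothesis-network idea follows, assuming a hypothesis is a set of structural features plus an association signal, and that one hypothesis subsumes another when its features are a strict subset. The class layout and insertion rule are illustrative assumptions, not the SOHN implementation.

        # Illustrative sketch only: field names and the subsumption test are
        # assumptions, not the SOHN paper's actual data structures.
        from dataclasses import dataclass, field

        @dataclass
        class Hypothesis:
            features: frozenset          # e.g. structural alerts or fragments
            signal: float                # association with the positive class
            children: list = field(default_factory=list)

        def subsumes(general: Hypothesis, specific: Hypothesis) -> bool:
            # A more general hypothesis uses a strict subset of the features.
            return general.features < specific.features

        def insert(root: Hypothesis, h: Hypothesis) -> None:
            """Place h under the most specific hypothesis that subsumes it."""
            for child in root.children:
                if subsumes(child, h):
                    insert(child, h)
                    return
            root.children.append(h)

        root = Hypothesis(frozenset(), 0.0)       # empty hypothesis at the top
        insert(root, Hypothesis(frozenset({"nitro"}), 0.8))
        insert(root, Hypothesis(frozenset({"nitro", "aromatic"}), 0.9))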

  18. Implicit mesh discontinuous Galerkin methods and interfacial gauge methods for high-order accurate interface dynamics, with applications to surface tension dynamics, rigid body fluid-structure interaction, and free surface flow: Part I

    NASA Astrophysics Data System (ADS)

    Saye, Robert

    2017-09-01

    In this two-part paper, a high-order accurate implicit mesh discontinuous Galerkin (dG) framework is developed for fluid interface dynamics, facilitating precise computation of interfacial fluid flow in evolving geometries. The framework uses implicitly defined meshes, wherein a reference quadtree or octree grid is combined with an implicit representation of evolving interfaces and moving domain boundaries, and allows physically prescribed interfacial jump conditions to be imposed or captured with high-order accuracy. Part one discusses the design of the framework, including: (i) high-order quadrature for implicitly defined elements and faces; (ii) high-order accurate discretisation of scalar and vector-valued elliptic partial differential equations with interfacial jumps in ellipticity coefficient, leading to optimal-order accuracy in the maximum norm and discrete linear systems that are symmetric positive (semi)definite; (iii) the design of incompressible fluid flow projection operators which, except for the influence of small penalty parameters, are discretely idempotent; and (iv) the design of geometric multigrid methods for elliptic interface problems on implicitly defined meshes and their use as preconditioners for the conjugate gradient method. Also discussed is a variety of aspects relating to moving interfaces, including: (v) dG discretisations of the level set method on implicitly defined meshes; (vi) transferring state between evolving implicit meshes; (vii) preserving mesh topology to accurately compute temporal derivatives; (viii) high-order accurate reinitialisation of level set functions; and (ix) the integration of adaptive mesh refinement. In part two, several applications of the implicit mesh dG framework in two and three dimensions are presented, including examples of single phase flow in nontrivial geometry, surface tension-driven two phase flow with phase-dependent fluid density and viscosity, rigid body fluid-structure interaction, and free surface flow. A class of techniques known as interfacial gauge methods is adopted to solve the corresponding incompressible Navier-Stokes equations, which, compared to archetypical projection methods, have a weaker coupling between fluid velocity, pressure, and interface position, and allow high-order accurate numerical methods to be developed more easily. Convergence analyses conducted throughout the work demonstrate high-order accuracy in the maximum norm for all of the applications considered; for example, fourth-order spatial accuracy in fluid velocity, pressure, and interface location is demonstrated for surface tension-driven two phase flow in 2D and 3D. Specific application examples include: vortex shedding in nontrivial geometry, capillary wave dynamics revealing fine-scale flow features, falling rigid bodies tumbling in unsteady flow, and free surface flow over a submersed obstacle, as well as high Reynolds number soap bubble oscillation dynamics and vortex shedding induced by a type of Plateau-Rayleigh instability in water ripple free surface flow. These last two examples compare numerical results with experimental data and serve as an additional means of validation; they also reveal physical phenomena not visible in the experiments, highlight how small-scale interfacial features develop and affect macroscopic dynamics, and demonstrate the wide range of spatial scales often at play in interfacial fluid flow.
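
    As a much-simplified illustration of item (viii), the standard reinitialisation PDE dphi/dtau = sign(phi0)(1 - |grad phi|) can be iterated to restore the signed-distance property without moving the zero contour. The first-order finite-difference sketch below, with invented grid parameters, only conveys the idea; the paper's scheme is high-order dG on implicitly defined meshes.

        # Qualitative illustration only: first-order central differences and a
        # smoothed sign function, not the paper's high-order dG reinitialisation.
        import numpy as np

        def reinitialise(phi0, h, steps=100):
            phi = phi0.copy()
            dtau = 0.5 * h
            s = phi0 / np.sqrt(phi0**2 + h**2)        # smoothed sign(phi0)
            for _ in range(steps):
                gx, gy = np.gradient(phi, h)
                grad_norm = np.sqrt(gx**2 + gy**2)
                phi += dtau * s * (1.0 - grad_norm)
            return phi

        # Example: a badly scaled level set for a circle of radius 0.5.
        n = 64
        x = np.linspace(-1, 1, n)
        X, Y = np.meshgrid(x, x)
        phi0 = 3.0 * (np.sqrt(X**2 + Y**2) - 0.5)     # right zero set, wrong slope
        phi = reinitialise(phi0, x[1] - x[0])
        gx, gy = np.gradient(phi, x[1] - x[0])
        print(np.sqrt(gx**2 + gy**2).mean())          # should approach 1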

  19. Implicit mesh discontinuous Galerkin methods and interfacial gauge methods for high-order accurate interface dynamics, with applications to surface tension dynamics, rigid body fluid-structure interaction, and free surface flow: Part II

    NASA Astrophysics Data System (ADS)

    Saye, Robert

    2017-09-01

    In this two-part paper, a high-order accurate implicit mesh discontinuous Galerkin (dG) framework is developed for fluid interface dynamics, facilitating precise computation of interfacial fluid flow in evolving geometries. The framework uses implicitly defined meshes, wherein a reference quadtree or octree grid is combined with an implicit representation of evolving interfaces and moving domain boundaries, and allows physically prescribed interfacial jump conditions to be imposed or captured with high-order accuracy. Part one discusses the design of the framework, including: (i) high-order quadrature for implicitly defined elements and faces; (ii) high-order accurate discretisation of scalar and vector-valued elliptic partial differential equations with interfacial jumps in ellipticity coefficient, leading to optimal-order accuracy in the maximum norm and discrete linear systems that are symmetric positive (semi)definite; (iii) the design of incompressible fluid flow projection operators which, except for the influence of small penalty parameters, are discretely idempotent; and (iv) the design of geometric multigrid methods for elliptic interface problems on implicitly defined meshes and their use as preconditioners for the conjugate gradient method. Also discussed is a variety of aspects relating to moving interfaces, including: (v) dG discretisations of the level set method on implicitly defined meshes; (vi) transferring state between evolving implicit meshes; (vii) preserving mesh topology to accurately compute temporal derivatives; (viii) high-order accurate reinitialisation of level set functions; and (ix) the integration of adaptive mesh refinement. In part two, several applications of the implicit mesh dG framework in two and three dimensions are presented, including examples of single phase flow in nontrivial geometry, surface tension-driven two phase flow with phase-dependent fluid density and viscosity, rigid body fluid-structure interaction, and free surface flow. A class of techniques known as interfacial gauge methods is adopted to solve the corresponding incompressible Navier-Stokes equations, which, compared to archetypical projection methods, have a weaker coupling between fluid velocity, pressure, and interface position, and allow high-order accurate numerical methods to be developed more easily. Convergence analyses conducted throughout the work demonstrate high-order accuracy in the maximum norm for all of the applications considered; for example, fourth-order spatial accuracy in fluid velocity, pressure, and interface location is demonstrated for surface tension-driven two phase flow in 2D and 3D. Specific application examples include: vortex shedding in nontrivial geometry, capillary wave dynamics revealing fine-scale flow features, falling rigid bodies tumbling in unsteady flow, and free surface flow over a submersed obstacle, as well as high Reynolds number soap bubble oscillation dynamics and vortex shedding induced by a type of Plateau-Rayleigh instability in water ripple free surface flow. These last two examples compare numerical results with experimental data and serve as an additional means of validation; they also reveal physical phenomena not visible in the experiments, highlight how small-scale interfacial features develop and affect macroscopic dynamics, and demonstrate the wide range of spatial scales often at play in interfacial fluid flow.

  20. Oasis: A high-level/high-performance open source Navier-Stokes solver

    NASA Astrophysics Data System (ADS)

    Mortensen, Mikael; Valen-Sendstad, Kristian

    2015-03-01

    Oasis is a high-level/high-performance finite element Navier-Stokes solver written from scratch in Python using building blocks from the FEniCS project (fenicsproject.org). The solver is unstructured and targets large-scale applications in complex geometries on massively parallel clusters. Oasis utilizes MPI and interfaces, through FEniCS, to the linear algebra backend PETSc. Oasis advocates a high-level, programmable user interface through the creation of highly flexible Python modules for new problems. Through the high-level Python interface the user is placed in complete control of every aspect of the solver. A version of the solver that uses piecewise linear elements for both velocity and pressure is shown to reproduce very well the classical spectral turbulent channel simulations of Moser et al. (1999). The computational speed is strongly dominated by the iterative solvers provided by the linear algebra backend, which is arguably the best performance any similar implicit solver using PETSc may hope for. Higher order accuracy is also demonstrated and new solvers may be easily added within the same framework.
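
    For flavour, the snippet below shows the high-level FEniCS (legacy dolfin) style that Oasis builds on, where a variational problem is written almost as the mathematics. It is a minimal Poisson solve for illustration only, not Oasis's actual Navier-Stokes problem-module interface.

        # Minimal legacy-FEniCS (dolfin) example of the high-level style;
        # not Oasis's problem-module API.
        from dolfin import *

        mesh = UnitSquareMesh(32, 32)
        V = FunctionSpace(mesh, "P", 1)               # piecewise linear elements

        u, v = TrialFunction(V), TestFunction(V)
        a = inner(grad(u), grad(v)) * dx              # weak form, as written math
        L = Constant(1.0) * v * dx
        bc = DirichletBC(V, Constant(0.0), "on_boundary")

        uh = Function(V)
        solve(a == L, uh, bc)                         # PETSc does the heavy lifting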

  1. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    PubMed

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as have pharmaceutical companies, which seek to understand the impact of these value frameworks on each stakeholder as they model the value and financial thresholds of innovative, high-cost drugs.

  2. Dissecting obesogenic environments: the development and application of a framework for identifying and prioritizing environmental interventions for obesity.

    PubMed

    Swinburn, B; Egger, G; Raza, F

    1999-12-01

    The "obesogenicity" of modern environments is fueling the obesity pandemic. We describe a framework, known as ANGELO (analysis grid for environments linked to obesity), which is a conceptual model for understanding the obesogenicity of environments and a practical tool for prioritizing environmental elements for research and intervention. Development of the ANGELO framework. The basic framework is a 2 x 4 grid which dissects the environment into environmental size (micro and macro) by type: physical (what is available), economic (what are the costs), political (what are the "rules"), and sociocultural (what are the attitudes and beliefs). Within this grid, the elements which influence food intake and physical activity are characterized as obesogenic or "leptogenic" (promoting leanness). Application of the ANGELO framework. The ANGELO framework has been piloted at the population level (island communities) to prioritize the settings/sectors for intervention and at the setting level (fast food outlets) to prioritize research needs and interventions. Environmental elements were prioritized by rating their validity (evidence of impact), relevance (to the local context), and potential changeability. The ANGELO framework appears to be a flexible and robust instrument for the needs analysis and problem identification stages of reducing the obesogenicity of modern environments. Copyright 1999 American Health Foundation and Academic Press.

  3. Decision-making in irrigation networks: Selecting appropriate canal structures using multi-attribute decision analysis.

    PubMed

    Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J

    2017-12-01

    The stiff competition for water between agriculture and non-agricultural production sectors makes it necessary to have effective management of irrigation networks in farms. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis for automation purposes of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed based on the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected from the survey, and was subsequently solved using the TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using the Entropy method. For check structures, results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates, where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and constant head orifice are the most and least preferred alternatives, respectively. Some advantages of the Neyrpic® gate are ease of operation and capacity to measure discharge flows. Overall, the application to the Qazvin irrigation network demonstrates the utility of the proposed DA framework in selecting appropriate structures for regulating water flows in irrigation canals. This framework systematically aids the decision process by capturing decisions made at various levels (individual farmers to high-level management). It can be applied to other cases where a new irrigation network is being designed, or where changes in irrigation structures need to be identified to improve flow control in existing networks.
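
    A compact sketch of the two numerical steps named above, entropy-based attribute weights followed by TOPSIS ranking, is given below. The toy scores are illustrative stand-ins for the survey data, not the Qazvin values, and strictly positive scores are assumed.

        # Entropy weights + TOPSIS ranking; toy decision matrix, not survey data.
        import numpy as np

        def entropy_weights(X):
            P = X / X.sum(axis=0)                    # column-wise proportions
            k = 1.0 / np.log(X.shape[0])
            E = -k * (P * np.log(P)).sum(axis=0)     # assumes all scores > 0
            d = 1.0 - E                              # degree of diversification
            return d / d.sum()

        def topsis(X, w, benefit):
            R = X / np.linalg.norm(X, axis=0)        # vector normalisation
            V = R * w
            ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
            anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
            d_pos = np.linalg.norm(V - ideal, axis=1)
            d_neg = np.linalg.norm(V - anti, axis=1)
            return d_neg / (d_pos + d_neg)           # closeness; higher = better

        # Rows: candidate structures; columns: attributes (all benefit-type here).
        X = np.array([[7.0, 6.0, 8.0],
                      [5.0, 8.0, 6.0],
                      [6.0, 5.0, 7.0]])
        w = entropy_weights(X)
        scores = topsis(X, w, benefit=np.array([True, True, True]))
        print(scores.argsort()[::-1])                # ranking, best first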

  4. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

    The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the interaction between users and doctors by providing more services within healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The proposed framework is used to improve healthcare scalability efficiency by enhancing the remote triaging and remote prioritization processes for patients. The proposed framework is also used to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Achieving a high level of accuracy in remotely triaging and prioritizing patients is our main goal. Meanwhile, the role of multi-source data fusion in telemonitoring healthcare systems has been demonstrated. In addition, we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results, for different symptoms related to different emergency levels of chronic heart diseases, demonstrate the superiority of our algorithm compared with conventional algorithms in terms of remotely classifying and prioritizing patients.
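
    The sketch below is an illustrative-only fusion rule in the spirit of the description: fused vitals and a text-derived flag map to an emergency level, and the queue is sorted by it. The thresholds, inputs, and scoring are assumptions, not the MSHA algorithm.

        # Illustrative triage rule; thresholds and scoring are invented.
        def emergency_level(ecg_abnormal: bool, spo2: float, bp_sys: float,
                            symptom_text_flag: bool) -> int:
            """Return 0 (routine) .. 3 (critical) from fused sensor/text inputs."""
            score = 0
            if ecg_abnormal:
                score += 2
            if spo2 < 90.0:                 # hypoxia threshold (illustrative)
                score += 2
            elif spo2 < 94.0:
                score += 1
            if bp_sys < 90.0 or bp_sys > 180.0:
                score += 1
            if symptom_text_flag:           # e.g. "chest pain" in the text input
                score += 1
            return min(score, 3)

        patients = [("p1", False, 97.0, 120.0, False),
                    ("p2", True, 88.0, 85.0, True),
                    ("p3", False, 92.0, 150.0, True)]
        queue = sorted(patients,
                       key=lambda p: emergency_level(*p[1:]), reverse=True)
        print([p[0] for p in queue])        # most urgent first: p2, p3, p1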

  5. Oil Spills and Marine Mammals in British Columbia, Canada: Development and Application of a Risk-Based Conceptual Framework.

    PubMed

    Jarvela Rosenberger, Adrianne L; MacDuffee, Misty; Rosenberger, Andrew G J; Ross, Peter S

    2017-07-01

    Marine mammals are inherently vulnerable to oil spills. We developed a conceptual framework to evaluate the impacts of potential oil exposure on marine mammals and applied it to 21 species inhabiting coastal British Columbia (BC), Canada. Oil spill vulnerability was determined by examining both the likelihood of species-specific (individual) oil exposure and the consequent likelihood of population-level effects. Oil exposure pathways, ecology, and physiological characteristics were first used to assign species-specific vulnerability rankings. Baleen whales were found to be highly vulnerable due to blowhole breathing, surface filter feeding, and invertebrate prey. Sea otters (Enhydra lutris) were ranked as highly vulnerable due to their time spent at the ocean surface, dense pelage, and benthic feeding techniques. Species-specific vulnerabilities were considered to estimate the likelihood of population-level effects occurring after oil exposure. Killer whale (Orcinus orca) populations were deemed at highest risk due to small population sizes, complex social structure, long lives, slow reproductive turnover, and dietary specialization. Finally, we related the species-specific and population-level vulnerabilities. In BC, vulnerability was deemed highest for Northern and Southern Resident killer whales and sea otters, followed by Bigg's killer whales and Steller sea lions (Eumetopias jubatus). Our findings challenge the typical "indicator species" approach routinely used and underscore the need to examine marine mammals at a species and population level for risk-based oil spill predictions. This conceptual framework can be combined with spill probabilities and volumes to develop more robust risk assessments and may be applied elsewhere to identify vulnerability themes for marine mammals.

  6. Multilevel microvibration test for performance predictions of a space optical load platform

    NASA Astrophysics Data System (ADS)

    Li, Shiqi; Zhang, Heng; Liu, Shiping; Wang, Yue

    2018-05-01

    This paper presents a framework for the multilevel microvibration analysis and test of a space optical load platform. The test framework is conducted on three levels: instrument, subsystem, and system. Disturbance source experimental investigations are performed to evaluate the vibration amplitude and study the vibration mechanism. Transfer characteristics of the space camera are validated by a subsystem test, which allows the calculation of transfer functions from various disturbance sources to optical performance outputs. In order to identify the influence of the source on spacecraft performance, a system-level microvibration measurement test has been performed on the ground. From the time domain analysis and spectrum analysis of the multilevel microvibration tests, we conclude that the disturbance source has a significant effect at its installation position; after transmission through mechanical links, the residual vibration reduces to a background noise level. In addition, the angular microvibration of the platform jitter is mainly concentrated in rotation about the y-axis. This work is applied to a practical application involving a high-resolution satellite camera system.
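
    The subsystem step can be pictured with a standard spectral estimate: the H1 transfer-function estimator H(f) = S_xy(f)/S_xx(f) from a measured disturbance record x to a performance output y. The sketch below uses synthetic signals and an assumed sampling rate; it illustrates the estimator, not the paper's test data.

        # H1 transfer-function estimate from synthetic records (placeholders).
        import numpy as np
        from scipy.signal import csd, welch

        fs = 2048.0                                   # Hz, assumed sampling rate
        t = np.arange(0, 10, 1 / fs)
        x = np.random.randn(t.size)                   # stand-in disturbance record
        y = np.convolve(x, np.ones(8) / 8, mode="same") \
            + 0.01 * np.random.randn(t.size)          # smoothed, noisy "output"

        f, Sxy = csd(x, y, fs=fs, nperseg=1024)       # cross spectral density
        _, Sxx = welch(x, fs=fs, nperseg=1024)        # input auto spectrum
        H = Sxy / Sxx                                 # H1 transfer function

        print(np.abs(H[:5]))                          # gain at the lowest bins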

  7. Resource Optimization Techniques and Security Levels for Wireless Sensor Networks Based on the ARSy Framework.

    PubMed

    Parenreng, Jumadi Mabe; Kitagawa, Akio

    2018-05-17

    Wireless Sensor Networks (WSNs) with limited battery, central processing units (CPUs), and memory resources are a widely implemented technology for early warning detection systems. The main advantage of WSNs is their ability to be deployed in areas that are difficult to access by humans. In such areas, regular maintenance may be impossible; therefore, WSN devices must utilize their limited resources to operate for as long as possible, but longer operations require maintenance. One method of maintenance is to apply a resource adaptation policy when a system reaches a critical threshold. This study discusses the application of a security level adaptation model, the ARSy Framework, for using resources more efficiently. A single node comprising a Raspberry Pi 3 Model B and a DS18B20 temperature sensor was tested in a laboratory under normal and stressful conditions. The result shows that under normal conditions, the system operates approximately three times longer than under stressful conditions. Maintaining the stability of the resources also enables the security level of a network's data output to stay at a high or medium level.
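
    A sketch of such a resource-adaptation policy appears below: when battery or memory crosses a critical threshold, the node steps its security level down so it can keep operating. The thresholds and level names are assumptions for illustration, not the ARSy framework's actual parameters.

        # Illustrative adaptation policy; thresholds and tiers are invented.
        LEVELS = ["high", "medium", "low"]    # e.g. cipher strength tiers

        def adapt_security_level(battery_pct: float, free_mem_pct: float) -> str:
            if battery_pct > 50 and free_mem_pct > 40:
                return LEVELS[0]              # ample resources: full protection
            if battery_pct > 20 and free_mem_pct > 15:
                return LEVELS[1]              # degraded: cheaper crypto
            return LEVELS[2]                  # critical: minimum protection

        # A node would re-evaluate this on every reporting cycle.
        print(adapt_security_level(battery_pct=63.0, free_mem_pct=55.0))  # high
        print(adapt_security_level(battery_pct=18.0, free_mem_pct=30.0))  # low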

  8. Resource Optimization Techniques and Security Levels for Wireless Sensor Networks Based on the ARSy Framework

    PubMed Central

    Kitagawa, Akio

    2018-01-01

    Wireless Sensor Networks (WSNs) with limited battery, central processing units (CPUs), and memory resources are a widely implemented technology for early warning detection systems. The main advantage of WSNs is their ability to be deployed in areas that are difficult to access by humans. In such areas, regular maintenance may be impossible; therefore, WSN devices must utilize their limited resources to operate for as long as possible, but longer operations require maintenance. One method of maintenance is to apply a resource adaptation policy when a system reaches a critical threshold. This study discusses the application of a security level adaptation model, the ARSy Framework, for using resources more efficiently. A single node comprising a Raspberry Pi 3 Model B and a DS18B20 temperature sensor was tested in a laboratory under normal and stressful conditions. The result shows that under normal conditions, the system operates approximately three times longer than under stressful conditions. Maintaining the stability of the resources also enables the security level of a network’s data output to stay at a high or medium level. PMID:29772773

  9. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
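
    The paper's implementation is in MATLAB; the bare-bones Python sketch below shows only the core computation, building a recurrence matrix from a multidimensional time series and reporting the recurrence rate, with illustrative parameters. The full method also yields further measures (e.g. determinism) from the same matrix.

        # Core MdRQA-style computation; radius and signals are illustrative.
        import numpy as np

        def recurrence_rate(X, radius):
            """X: (timesteps, dimensions) array; radius: distance threshold."""
            D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            R = D <= radius                      # recurrence matrix
            n = len(X)
            return (R.sum() - n) / (n * n - n)   # exclude the trivial diagonal

        # Example: a 3-dimensional "group" signal (e.g. three participants).
        t = np.linspace(0, 8 * np.pi, 400)
        X = np.column_stack([np.sin(t), np.sin(t + 0.3), np.sin(t + 0.6)])
        print(recurrence_rate(X, radius=0.2))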

  10. The NeuARt II system: a viewing tool for neuroanatomical data based on published neuroanatomical atlases

    PubMed Central

    Burns, Gully APC; Cheng, Wei-Cheng; Thompson, Richard H; Swanson, Larry W

    2006-01-01

    Background Anatomical studies of neural circuitry describing the basic wiring diagram of the brain produce intrinsically spatial, highly complex data of great value to the neuroscience community. Published neuroanatomical atlases provide a spatial framework for these studies. We have built an informatics framework based on these atlases for the representation of neuroanatomical knowledge. This framework not only captures current methods of anatomical data acquisition and analysis, it allows these studies to be collated, compared and synthesized within a single system. Results We have developed an atlas-viewing application ('NeuARt II') in the Java language with unique functional properties. These include the ability to use copyrighted atlases as templates within which users may view, save and retrieve data-maps and annotate them with volumetric delineations. NeuARt II also permits users to view multiple levels on multiple atlases at once. Each data-map in this system is simply a stack of vector images with one image per atlas level, so any set of accurate drawings made onto a supported atlas (in vector graphics format) could be uploaded into NeuARt II. Presently the database is populated with a corpus of high-quality neuroanatomical data from the laboratory of Dr Larry Swanson (consisting of 64 highly detailed maps of PHAL tract-tracing experiments, made up of 1039 separate drawings that were published in 27 primary research publications over 17 years). Herein we take selective examples from these data to demonstrate the features of NeuARt II. Our informatics tool permits users to browse, query and compare these maps. The NeuARt II tool operates within a bioinformatics knowledge management platform (called 'NeuroScholar') either as a standalone or a plug-in application. Conclusion Anatomical localization is fundamental to neuroscientific work and atlases provide an easily-understood framework that is widely used by neuroanatomists and non-neuroanatomists alike. NeuARt II, the neuroinformatics tool presented here, provides an accurate and powerful way of representing neuroanatomical data in the context of commonly-used brain atlases for visualization, comparison and analysis. Furthermore, it provides a framework that supports the delivery and manipulation of mapped data either as a standalone system or as a component in a larger knowledge management system. PMID:17166289

  11. A General Sparse Tensor Framework for Electronic Structure Theory

    DOE PAGES

    Manzer, Samuel; Epifanovsky, Evgeny; Krylov, Anna I.; ...

    2017-01-24

    Linear-scaling algorithms must be developed in order to extend the domain of applicability of electronic structure theory to molecules of any desired size. However, the increasing complexity of modern linear-scaling methods makes code development and maintenance a significant challenge. A major contributor to this difficulty is the lack of robust software abstractions for handling block-sparse tensor operations. We therefore report the development of a highly efficient symbolic block-sparse tensor library in order to provide access to high-level software constructs to treat such problems. Our implementation supports arbitrary multi-dimensional sparsity in all input and output tensors. We avoid cumbersome machine-generated code by implementing all functionality as a high-level symbolic C++ library and demonstrate that our implementation attains very high performance for linear-scaling sparse tensor contractions.
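
    The idea of exploiting block sparsity can be caricatured in a few lines: store only nonzero blocks keyed by block index, and contract only matching blocks so that sparsity propagates to the output. The dict-of-blocks sketch below is a toy model of the concept, not the reported C++ library.

        # Toy block-sparse contraction: only nonzero blocks are stored/combined.
        import numpy as np

        def block_contract(A, B):
            """C[i,k] += A[i,j] @ B[j,k] over matching nonzero blocks."""
            C = {}
            for (i, j), a in A.items():
                for (j2, k), b in B.items():
                    if j == j2:
                        C[(i, k)] = C.get((i, k), 0) + a @ b
            return C

        bs = 4                                       # block edge length
        A = {(0, 1): np.random.randn(bs, bs),        # only two nonzero blocks
             (2, 0): np.random.randn(bs, bs)}
        B = {(1, 3): np.random.randn(bs, bs)}
        C = block_contract(A, B)
        print(list(C))                               # [(0, 3)]: sparsity propagates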

  12. RT-18: Value of Flexibility. Phase 1

    DTIC Science & Technology

    2010-09-25

    During this period, we also explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory ... framework that is mathematically consistent, domain independent and applicable under varying information levels. This report presents our advances in ...

  13. Towards end to end technology modeling: Carbon nanotube and thermoelectric devices

    NASA Astrophysics Data System (ADS)

    Salamat, Shuaib

    The goal of this work is to demonstrate the feasibility of end-to-end ("atoms to applications") technology modeling. Two different technologies were selected to drive this work. The first technology is carbon nanotube field-effect transistors (CNTFETs), and the goal is to model device level variability and identify the origin of variations in these devices. Recently, there has been significant progress in understanding the physics of carbon nanotube electronic devices and in identifying their potential applications. For nanotubes, the carrier mobility is high, so low bias transport across several hundred nanometers is nearly ballistic, and the deposition of high-k gate dielectrics does not degrade the carrier mobility. The conduction and valence bands are symmetric (useful for complementary applications) and the bandstructure is direct (enables optical emission). Because of these striking features, carbon nanotubes (CNTs) have received much attention. Carbon nanotube field-effect transistors (CNTFETs) are one of the main potential candidates for large-area electronics. In this research, systematic simulation approaches are applied to understand the intrinsic performance variability in CNTFETs. It is shown that control over the diameter distribution is a critically important process parameter for attaining high performance transistors and circuits with characteristics rivaling those of state-of-the-art Si technology. The second technology driver concerns the development of a multi-scale framework for thermoelectric device design. An essential step in the development of new materials and devices for thermoelectrics is to develop accurate, efficient, and realistic models. The ready availability of user-friendly ab-initio codes and the ever-increasing computing power have made band structure calculations routine. Thermoelectric device design, however, is still largely done at the effective mass level. Tools that allow device designers to make use of sophisticated electronic structure and phonon dispersion calculations are needed. We have developed a proof-of-concept, integrated, multi-scale design framework for TE technology. Beginning from full electronic and phonon dispersions, the Landauer approach is used to evaluate the temperature-dependent thermoelectric transport parameters needed for device simulation. A comprehensive SPICE-based model for electro-thermal transport has also been developed to serve as a bridge between the materials and device level descriptions and the system level simulations. This prototype framework has been used to design a thermoelectric cooler for managing hot spots in integrated circuit chips. As a byproduct of this research, a suite of educational and simulation resources has been developed and deployed on the nanoHUB.org science gateway to serve as a resource for the TE community.
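
    For the thermoelectric branch, the Landauer linear-response step can be sketched numerically: given a transmission function T(E), the conductance and Seebeck coefficient follow from moments of -df/dE (the derivative of the Fermi function). The step-function transmission below is a placeholder, not a computed dispersion, and the routine is a hedged sketch of the standard formulas rather than the dissertation's framework.

        # Landauer linear response from a transmission function (illustrative).
        import numpy as np

        kB = 8.617333e-5                  # Boltzmann constant, eV/K
        G0 = 7.748092e-5                  # conductance quantum 2e^2/h, siemens

        def landauer_coefficients(E, Tr, mu, T):
            """E [eV] grid, Tr transmission, mu [eV], T [K] -> (G [S], S [V/K])."""
            x = (E - mu) / (kB * T)
            minus_dfdE = np.exp(x) / (kB * T * (1 + np.exp(x)) ** 2)   # 1/eV
            I0 = np.trapz(Tr * minus_dfdE, E)                 # dimensionless
            I1 = np.trapz(Tr * (E - mu) * minus_dfdE, E)      # eV
            G = G0 * I0
            S = -(I1 / I0) / T            # eV per elementary charge = volts
            return G, S

        E = np.linspace(-0.5, 0.5, 4001)
        Tr = (E > 0.1).astype(float)      # placeholder single-band transmission
        print(landauer_coefficients(E, Tr, mu=0.05, T=300.0))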

  14. Electronic and Ionic Conductors from Ordered Microporous Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dincă, Mircea

    The proposed work aimed to establish metal-organic frameworks (MOFs) as new classes of high-surface area microporous electronic and ionic conductors. MOFs are crystalline materials with pore sizes ranging from 0.2 to ~2 nm (or larger for the latter) defined by inorganic or organic building blocks connected by rigid organic linkers. Myriad applications have been found or proposed for these materials, yet those that require electron transport or conductivity in combination with permanent porosity still lag behind because the vast majority of known frameworks are electrical insulators. Prior to our proposal and subsequent work, there were virtually no studies exploring the possibility of electronic delocalization in these materials. Therefore, our primary goal was to understand and control, at a fundamental level, the electron and ion transport properties of this class of materials, with no specific application proposed, although myriad applications could be envisioned for high surface area conductors. Our goals directly addressed one of the DOE-identified Grand Challenges for Basic Energy Sciences: designing perfect atom- and energy-efficient syntheses of revolutionary new forms of matter with tailored properties. Indeed, the proposed work is entirely synthetic in nature; owing to the molecular nature of the building blocks in MOFs, there is the possibility of unprecedented control over the structure and properties of solid crystalline matter. The goals also tangentially addressed the Grand Challenge of controlling materials processes at the level of electrons: the scope of our program is to create new materials where charges (electrons and/or ions) move according to predefined pathways.

  15. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework generalizes well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained by other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
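
    A minimal sketch of such a two-level loop follows, with a fixed UCB1 bandit selecting among low-level heuristics and a greedy acceptance rule. The paper's dynamic bandit with extreme-value rewards and its GEP-evolved acceptance criterion are replaced here by these simpler stand-ins.

        # Two-level hyper-heuristic loop: UCB1 selection + greedy acceptance.
        import math, random

        def hyper_heuristic(initial, cost, heuristics, iters=1000):
            counts = [0] * len(heuristics)
            rewards = [0.0] * len(heuristics)
            current, best = initial, initial
            for t in range(1, iters + 1):
                # UCB1 over low-level heuristics (untried arms get priority).
                ucb = [(rewards[i] / counts[i])
                       + math.sqrt(2 * math.log(t) / counts[i])
                       if counts[i] else float("inf")
                       for i in range(len(heuristics))]
                i = ucb.index(max(ucb))
                candidate = heuristics[i](current)
                improvement = cost(current) - cost(candidate)
                counts[i] += 1
                rewards[i] += max(improvement, 0.0)   # reward = improvement
                if improvement > 0:                   # greedy acceptance
                    current = candidate
                    if cost(current) < cost(best):
                        best = current
            return best

        # Toy domain: minimise the magnitude of a vector with two move types.
        cost = lambda v: sum(abs(x) for x in v)
        dec = lambda v: [x - 1 if x > 0 else x for x in v]
        jiggle = lambda v: [x + random.choice((-1, 1)) for x in v]
        print(cost(hyper_heuristic([5, 3, 7], cost, [dec, jiggle])))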

  16. Policymaking to preserve privacy in disclosure of public health data: a suggested framework.

    PubMed

    Mizani, Mehrdad A; Baykal, Nazife

    2015-03-01

    Health organisations in Turkey gather a vast amount of valuable individual data that can be used for public health purposes. The organisations use rigid methods to remove some useful details from the data while publishing the rest of the data in a highly aggregated form, mostly because of privacy concerns and lack of standardised policies. This action leads to information loss and bias affecting public health research. Hence, organisations need dynamic policies and well-defined procedures rather than a specific algorithm to protect the privacy of individual data. To address this need, we developed a framework for the systematic application of anonymity methods while reducing and objectively reporting the information loss without leaking confidentiality. This framework acts as a roadmap for policymaking by providing high-level pseudo-policies with semitechnical guidelines in addition to some sample scenarios suitable for policymakers, public health programme managers and legislators.

  17. Developing tools and resources for the biomedical domain of the Greek language.

    PubMed

    Vagelatos, Aristides; Mantzari, Elena; Pantazara, Mavina; Tsalidis, Christos; Kalamara, Chryssoula

    2011-06-01

    This paper presents the design and implementation of terminological and specialized textual resources that were produced in the framework of the Greek research project "IATROLEXI". The aim of the project was to create the critical infrastructure for the Greek language, i.e. linguistic resources and tools for use in high level Natural Language Processing (NLP) applications in the domain of biomedicine. The project was built upon existing resources developed by the project partners and further enhanced within its framework, i.e. a Greek morphological lexicon of about 100,000 words, and language processing tools such as a lemmatiser and a morphosyntactic tagger. Additionally, it developed new assets, such as a specialized corpus of biomedical texts and an ontology of medical terminology.

  18. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    PubMed

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense, so that it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels include the Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes when information flows forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. The "Swiss cheese" model is a theoretical framework grounded in solid behavioral theory and can therefore provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
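
    One common way to quantify such a causal chain is a noisy-OR node, where each active parent independently triggers the child. The sketch below strings the five levels together with invented probabilities; the paper elicits its priors and conditionals from experts rather than using fixed numbers like these.

        # Noisy-OR propagation through the five-level chain (toy numbers).
        def noisy_or(parent_probs, link_strengths):
            """P(child) when each active parent independently triggers it."""
            p_not = 1.0
            for p, s in zip(parent_probs, link_strengths):
                p_not *= 1.0 - p * s
            return 1.0 - p_not

        # Root causes -> trigger event -> incident -> accident -> consequence.
        p_roots = [0.10, 0.05, 0.20]          # e.g. fatigue, poor procedure, ...
        p_trigger = noisy_or(p_roots, [0.6, 0.8, 0.4])
        p_incident = noisy_or([p_trigger], [0.5])
        p_accident = noisy_or([p_incident], [0.3])
        p_consequence = noisy_or([p_accident], [0.9])
        print(round(p_trigger, 4), round(p_consequence, 4))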

  19. Dynamically Reconfigurable Systolic Array Accelerators

    NASA Technical Reports Server (NTRS)

    Dasu, Aravind (Inventor); Barnes, Robert C. (Inventor)

    2014-01-01

    A polymorphic systolic array framework works in conjunction with an embedded microprocessor on an FPGA, allowing dynamic and complementary scaling of the acceleration levels of two algorithms active concurrently on the FPGA. Use is made of systolic arrays and hardware-software co-design to obtain an efficient multi-application acceleration system. The flexible and simple framework allows hosting of a broader range of algorithms and is extendable to more complex applications in the area of aerospace embedded systems.

  20. Global Framework for Climate Services (GFCS)

    NASA Astrophysics Data System (ADS)

    Lúcio, F.

    2012-04-01

    Climate information at global, regional and national levels, in timeframes ranging from past to present and future climate, is fundamental for planning and sustainable development, and helps organizations, countries and individuals adopt appropriate strategies to adapt to climate variability and change. Based on this recognition, in 2009, the Heads of States and Governments, Ministers and Heads of Delegation representing more than 150 countries, 34 United Nations Organizations and 36 Governmental and non-Governmental international organizations, and more than 2500 experts present at the Third World Climate Conference (WCC-3) unanimously agreed to develop the Global Framework for Climate Services (GFCS) to strengthen the production, availability, delivery and application of science-based climate prediction and services. They requested that a taskforce of high-level independent advisors be appointed to prepare a report, including recommendations on the proposed elements of the Framework and the next steps for its implementation. The high-level taskforce produced a report which was endorsed by the Sixteenth World Meteorological Congress in May 2011. A process for the development of the implementation plan and the governance structure of the Global Framework for Climate Services (GFCS) is well under way, led by the World Meteorological Organization within the UN system. This process involves consultations that engage a broad range of stakeholders including governments, UN and international agencies, regional organizations and specific communities of practitioners. These consultations are being conducted to facilitate discussions of key issues related to the production, availability, delivery and application of climate services in the four priority sectors of the framework (agriculture, water, health and disaster risk reduction) so that the implementation plan of the Framework is a true reflection of the aspirations of stakeholders. The GFCS is envisaged as a set of international arrangements that will coordinate the activities and build on existing efforts to provide climate services that are truly focused on meeting user needs. It will be implemented through the development of five main components: 1) User Interface Platform — to provide ways for climate service users and providers to interact and improve the effectiveness of the Framework and its climate services 2) Climate Services Information System — to produce and distribute climate data and information according to the needs of users and to agreed standards 3) Observations and Monitoring — to develop agreements and standards for collecting and generating necessary climate data 4) Research, Modeling and Prediction — to harness science capabilities and results to meet the needs of climate services 5) Capacity Building — to support the systematic development of the institutions, infrastructure and human resources needed for effective production of climate services and their application. Putting the GFCS in place will require unprecedented collaboration among agencies and across political, functional and disciplinary boundaries, and a global mobilization of effort. This communication will provide information on the benefits and the process for the development of the GFCS as well as potential entry points for stakeholders to participate. In addition, it will highlight some of the research, modelling and prediction opportunities that will require intra-disciplinary science action.

  1. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. None of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impact of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
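
    A back-of-envelope computation shows why testing alone struggles at these levels: demonstrating a failure probability of at most p with confidence C requires roughly n >= ln(1-C)/ln(1-p) consecutive failure-free tests. This is a standard statistical-testing result, and the figures below are illustrative rather than taken from the report; it motivates crediting V&V evidence to reduce the test burden.

        # Failure-free tests needed to claim failure probability <= p
        # with confidence C (standard result; figures are illustrative).
        import math

        def tests_needed(p, confidence):
            return math.ceil(math.log(1 - confidence) / math.log(1 - p))

        print(tests_needed(1e-4, 0.99))   # ~46,050 failure-free runs
        print(tests_needed(1e-6, 0.99))   # ~4.6 million: why V&V credit helps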

  2. Government Microelectronics Assessment for Trust (GOMAT)

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; LaBel, Kenneth A.

    2018-01-01

    NASA Electronic Parts and Packaging (NEPP) is developing a process to be employed in critical applications. The framework assesses levels of trust and assurance in microelectronic systems. The process is being created with participation from a variety of organizations. We present a synopsis of the framework that includes contributions from The Aerospace Corporation.

  3. Time, Space, and Mass at the Operational Level of War: The Dynamics of the Culminating Point,

    DTIC Science & Technology

    1988-04-28

    theoretical framework for operational culmination and then examining the theory as reflected in recent history. This paper focuses on the concept of...the paper first examines key definitions and provides a theoretical framework for understanding culmination. Next, it considers the application of the

  4. Biocomplexity in coupled natural–human systems: a multidimensional framework

    Treesearch

    S.T.A. Pickett; M.L. Cadenasso; J.M. Grove

    2005-01-01

    As defined by Ascher, biocomplexity results from a "multiplicity of interconnected relationships and levels." However, no integrative framework yet exists to facilitate the application of this concept to coupled human-natural systems. Indeed, the term "biocomplexity" is still used primarily as a creative and provocative metaphor. To help advance its...

  5. Reconciling nature conservation and traditional farming practices: a spatially explicit framework to assess the extent of High Nature Value farmlands in the European countryside

    PubMed Central

    Lomba, Angela; Alves, Paulo; Jongman, Rob H G; McCracken, David I

    2015-01-01

    Agriculture constitutes a dominant land cover worldwide, and rural landscapes under extensive farming practices are acknowledged for their high biodiversity levels. The High Nature Value farmland (HNVf) concept has been highlighted in EU environmental and rural policies due to its inherent potential to help characterize and direct financial support to European landscapes where high nature and/or conservation value is dependent on the continuation of specific low-intensity farming systems. Assessing the extent of HNV farmland by necessity relies on the availability of both ecological and farming systems' data, and difficulties associated with making such assessments have been widely described across Europe. A spatially explicit framework of data collection, building out from local administrative units, has recently been suggested as a means of addressing such difficulties. This manuscript tests the relevance of the proposed approach, describes the spatially explicit framework in a case study area in northern Portugal, and discusses the potential of the approach to help better inform the implementation of conservation and rural development policies. Synthesis and applications: The potential of a novel approach (combining land use/cover, farming and environmental data) to provide more accurate and efficient mapping and monitoring of HNV farmlands is tested at the local level in northern Portugal. The approach is considered to constitute a step forward toward a more precise targeting of landscapes for agri-environment schemes, as it allowed a more accurate discrimination of areas within the case study landscape that have a higher value for nature conservation. PMID:25798221

  6. A Framework for Safe Integration of Small UAS Into the NAS

    NASA Technical Reports Server (NTRS)

    Logan, Michael J.; Bland, Geoffrey; Murray, Jennifer

    2011-01-01

    This paper discusses a proposed framework for the safe integration of small unmanned aerial systems (sUAS) into the National Airspace System (NAS). The paper examines the potential uses of sUAS to build an understanding of the location and frequency of potential future flight operations based on the future applications of the sUAS systems. The paper then examines the types of systems that would be required to meet the application-level demand to determine classes of platforms and operations. Finally, a framework is proposed for both airworthiness and operations that attempts to balance safety with utility for these important systems.

  7. Molecule database framework: a framework for creating database applications with chemical structure search capability

    PubMed Central

    2013-01-01

    Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization) For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the projects web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
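
    The framework itself is Java over a PostgreSQL/Bingo cartridge; the Python sketch below, using RDKit (a different, freely available cheminformatics toolkit), only illustrates the kind of substructure search that such a framework hides behind a single method call.

        # Substructure search abstracted behind one function (RDKit, not Bingo).
        from rdkit import Chem

        def find_by_substructure(smiles_db, query_smarts):
            """Return entries whose structure contains the query pattern."""
            pattern = Chem.MolFromSmarts(query_smarts)
            hits = []
            for entry_id, smiles in smiles_db:
                mol = Chem.MolFromSmiles(smiles)
                if mol is not None and mol.HasSubstructMatch(pattern):
                    hits.append(entry_id)
            return hits

        db = [(1, "c1ccccc1O"),       # phenol
              (2, "CCO"),             # ethanol
              (3, "c1ccccc1C(=O)O")]  # benzoic acid
        print(find_by_substructure(db, "c1ccccc1"))   # aromatic ring -> [1, 3]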

  8. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    PubMed

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the projects web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.

  9. An Efficient and Adaptive Mutual Authentication Framework for Heterogeneous Wireless Sensor Network-Based Applications

    PubMed Central

    Kumar, Pardeep; Ylianttila, Mika; Gurtov, Andrei; Lee, Sang-Gon; Lee, Hoon-Jae

    2014-01-01

    Robust security is highly coveted in real wireless sensor network (WSN) applications since wireless sensors sense critical data from the application environment. This article presents an efficient and adaptive mutual authentication framework that suits real heterogeneous WSN-based applications (such as smart homes, industrial environments, smart grids, and healthcare monitoring). The proposed framework offers: (i) key initialization; (ii) secure network (cluster) formation (i.e., mutual authentication and dynamic key establishment); (iii) key revocation; and (iv) new node addition into the network. The correctness of the proposed scheme is formally verified. An extensive analysis shows that the proposed scheme provides message confidentiality, mutual authentication, dynamic session key establishment, node privacy, and message freshness. Moreover, the preliminary study also reveals that the proposed framework is secure against popular types of attacks, such as impersonation attacks, man-in-the-middle attacks, replay attacks, and information-leakage attacks. As a result, we believe the proposed framework achieves efficiency at reasonable computation and communication costs and can serve as a safeguard for real heterogeneous WSN applications. PMID:24521942

  10. An efficient and adaptive mutual authentication framework for heterogeneous wireless sensor network-based applications.

    PubMed

    Kumar, Pardeep; Ylianttila, Mika; Gurtov, Andrei; Lee, Sang-Gon; Lee, Hoon-Jae

    2014-02-11

    Robust security is highly coveted in real wireless sensor network (WSN) applications since wireless sensors sense critical data from the application environment. This article presents an efficient and adaptive mutual authentication framework that suits real heterogeneous WSN-based applications (such as smart homes, industrial environments, smart grids, and healthcare monitoring). The proposed framework offers: (i) key initialization; (ii) secure network (cluster) formation (i.e., mutual authentication and dynamic key establishment); (iii) key revocation; and (iv) new node addition into the network. The correctness of the proposed scheme is formally verified. An extensive analysis shows that the proposed scheme provides message confidentiality, mutual authentication, dynamic session key establishment, node privacy, and message freshness. Moreover, the preliminary study also reveals that the proposed framework is secure against popular types of attacks, such as impersonation attacks, man-in-the-middle attacks, replay attacks, and information-leakage attacks. As a result, we believe the proposed framework achieves efficiency at reasonable computation and communication costs and can serve as a safeguard for real heterogeneous WSN applications.
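
    The abstract names the framework's ingredients: pre-distributed keys, mutual authentication, and per-session dynamic keys. A minimal sketch of that general pattern, an HMAC-based challenge-response between two parties sharing a key, is shown below; it illustrates the idea only and is not the authors' actual protocol.

```python
# HMAC-based challenge-response mutual authentication plus session-key
# derivation between a sensor node and a cluster head sharing a
# pre-distributed key k. Generic pattern, not the paper's protocol.
import hmac, hashlib, os

def prove(key: bytes, nonce: bytes, identity: bytes) -> bytes:
    return hmac.new(key, identity + nonce, hashlib.sha256).digest()

# Key initialization: both parties hold k before deployment.
k = os.urandom(32)

# Mutual authentication: each side challenges the other with a fresh nonce.
n_a, n_b = os.urandom(16), os.urandom(16)
tag_b = prove(k, n_a, b"node-B")      # B answers A's challenge
tag_a = prove(k, n_b, b"node-A")      # A answers B's challenge
assert hmac.compare_digest(tag_b, prove(k, n_a, b"node-B"))
assert hmac.compare_digest(tag_a, prove(k, n_b, b"node-A"))

# Dynamic session key: derived from both nonces, hence fresh per session.
session_key = hmac.new(k, n_a + n_b, hashlib.sha256).digest()
```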

  11. Semantics of data and service registration to advance interdisciplinary information and data access.

    NASA Astrophysics Data System (ADS)

    Fox, P. P.; McGuinness, D. L.; Raskin, R.; Sinha, A. K.

    2008-12-01

    In developing an application of semantic web methods and technologies to address the integration of heterogeneous and interdisciplinary earth-science datasets, we have developed methodologies for creating rich semantic descriptions (ontologies) of the application domains. We have leveraged and extended where possible existing ontology frameworks such as SWEET. As a result of this semantic approach, we have also utilized ontological descriptions of key enabling elements of the application, such as the registration of datasets with ontologies at several levels of granularity. This has enabled the location and usage of the data across disciplines. We are also realizing the need to develop similar semantic registration of web service data holdings as well as those provided with community and/or standard markup languages (e.g. GeoSciML). This level of semantic enablement, extending beyond domain terms and relations, significantly enhances our ability to provide a coherent semantic data framework for data and information systems. Much of this work is on the frontier of technology development, and we will present the current and near-future capabilities we are developing. This work arises from the Semantically-Enabled Science Data Integration (SESDI) project, which is a NASA/ESTO/ACCESS-funded project involving the High Altitude Observatory at the National Center for Atmospheric Research (NCAR), McGuinness Associates Consulting, NASA/JPL and Virginia Polytechnic University.

  12. Magnetic hybrid magnetite/metal organic framework nanoparticles: facile preparation, post-synthetic biofunctionalization and tracking in vivo with magnetic methods

    NASA Astrophysics Data System (ADS)

    Tregubov, A. A.; Sokolov, I. L.; Babenyshev, A. V.; Nikitin, P. I.; Cherkasov, V. R.; Nikitin, M. P.

    2018-03-01

    Multifunctional hybrid nanocomposites remain of great interest in biomedicine as a universal tool in a number of applications. As a promising example, nanoparticles with a magnetic core and a porous shell have potential as theranostic agents combining the properties of a diagnostic probe and a drug delivery vehicle. However, reported methods of preparing such nanostructures are complex and include tedious, time-consuming growth of the porous shell by a layer-by-layer assembly technique. In this study, we develop a new way of fabricating superparamagnetic magnetite core @ porous metal organic framework shell nanoparticles and demonstrate their application as both multimodal (MRI contrasting, magnetometric and optical labeling) and multifunctional (in vivo bioimaging, biotargeting by coupled receptors, lateral flow assay) agents. The ease of fabrication, controllable bioconjugation properties and low level of non-specific binding indicate the high potential of these nanoparticles to be employed as multifunctional agents in theranostics, advanced biosensing and bioimaging.

  13. Integration of chemical-specific exposure and pharmacokinetic information with the chemical-agnostic AOP framework to support high throughput risk assessment

    EPA Science Inventory

    Application of the Adverse Outcome Pathway (AOP) framework and high throughput toxicity testing in chemical-specific risk assessment requires reconciliation of chemical concentrations sufficient to trigger a molecular initiating event measured in vitro and at the relevant target ...

  14. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  15. Dynamically Reconfigurable Systolic Array Accelerator

    NASA Technical Reports Server (NTRS)

    Dasu, Aravind; Barnes, Robert

    2012-01-01

    A polymorphic systolic array framework has been developed that works in conjunction with an embedded microprocessor on a field-programmable gate array (FPGA), and that allows for dynamic and complementary scaling of the acceleration levels of two algorithms active concurrently on the FPGA. Use is made of systolic arrays and hardware-software co-design to obtain an efficient multi-application acceleration system. The flexible and simple framework allows hosting of a broader range of algorithms and is extendable to more complex applications in the area of aerospace embedded systems. FPGA chips can be responsive to real-time demands for changing application needs, but only if the electronic fabric can respond fast enough. This systolic array framework allows for rapid partial and dynamic reconfiguration of the chip in response to real-time needs of scalability and adaptability of executables.

  16. Model and algorithmic framework for detection and correction of cognitive errors.

    PubMed

    Feki, Mohamed Ali; Biswas, Jit; Tolstikov, Andrei

    2009-01-01

    This paper outlines an approach that we are taking for elder-care applications in the smart home, involving cognitive errors and their compensation. Our approach involves high level modeling of daily activities of the elderly by breaking down these activities into smaller units, which can then be automatically recognized at a low level by collections of sensors placed in the homes of the elderly. This separation allows us to employ plan recognition algorithms and systems at a high level, while developing stand-alone activity recognition algorithms and systems at a low level. It also allows the mixing and matching of multi-modality sensors of various kinds that go to support the same high level requirement. Currently our plan recognition algorithms are still at a conceptual stage, whereas a number of low level activity recognition algorithms and systems have been developed. Herein we present our model for plan recognition, providing a brief survey of the background literature. We also present some concrete results that we have achieved for activity recognition, emphasizing how these results are incorporated into the overall plan recognition system.

  17. High-level expression of Camelid nanobodies in Nicotiana benthamiana.

    PubMed

    Teh, Yi-Hui Audrey; Kavanagh, Tony A

    2010-08-01

    Nanobodies (or VHHs) are single-domain antigen-binding fragments derived from Camelid heavy chain-only antibodies. Their small size, monomeric behaviour, high stability and solubility, and ability to bind epitopes not accessible to conventional antibodies make them especially suitable for many therapeutic and biotechnological applications. Here we describe high-level expression, in Nicotiana benthamiana, of three versions of an anti-hen egg white lysozyme (HEWL) nanobody which include the original VHH from an immunized library (cAbLys3), a codon-optimized derivative, and a codon-optimized hybrid nanobody comprising the CDRs of cAbLys3 grafted onto an alternative 'universal' nanobody framework. His6- and StrepII-tagged derivatives of each nanobody were targeted for accumulation in the cytoplasm, chloroplast and apoplast using different pre-sequences. When targeted to the apoplast, intact functional nanobodies accumulated at an exceptionally high level (up to 30% total leaf protein), demonstrating the great potential of plants as a nanobody production system.

  18. EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan

    2016-09-01

    The motion control, data acquisition, and analysis system for the APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS is a framework of software tools and applications providing the infrastructure for building distributed control systems that operate devices such as particle accelerators, large experiments, and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single-purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high-precision X-ray mirror measurement. These built-in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera), and temperature sensors. Scanning the mirror and taking measurements was accomplished with an EPICS feature (the sscan record) that synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were configured automatically using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
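
    The move-trigger-read cycle that the sscan record automates on the IOC side can be sketched from the client side with the real pyepics Channel Access bindings. The PV names below (stage, autocollimator) are hypothetical placeholders, not the actual APS process variables.

```python
# Client-side sketch of the move/trigger/read pattern that the EPICS sscan
# record performs on the IOC. Uses pyepics; PV names are hypothetical.
from epics import caget, caput
import numpy as np

positions = np.linspace(0, 100, 51)   # mm along the mirror
slopes = []
for x in positions:
    caput("SMP:stage:X", x, wait=True)         # move gantry, block until done
    caput("SMP:autocoll:trig", 1, wait=True)   # trigger one measurement
    slopes.append(caget("SMP:autocoll:angle")) # read back the slope value
```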

  19. An application framework of three-dimensional reconstruction and measurement for endodontic research.

    PubMed

    Gao, Yuan; Peters, Ove A; Wu, Hongkun; Zhou, Xuedong

    2009-02-01

    The purpose of this study was to customize an application framework by using the MeVisLab image processing and visualization platform for three-dimensional reconstruction and assessment of tooth and root canal morphology. One maxillary first molar was scanned before and after preparation with ProTaper by using micro-computed tomography. With a customized application framework based on MeVisLab, internal and external anatomy was reconstructed. Furthermore, the dimensions of root canal and radicular dentin were quantified, and effects of canal preparation were assessed. Finally, a virtual preparation with risk analysis was performed to simulate the removal of a broken instrument. This application framework provided an economical platform and met current requirements of endodontic research. The broad-based use of high-quality free software and the resulting exchange of experience might help to improve the quality of endodontic research with micro-computed tomography.

  20. Mobile Applications and 4G Wireless Networks: A Framework for Analysis

    ERIC Educational Resources Information Center

    Yang, Samuel C.

    2012-01-01

    Purpose: The use of mobile wireless data services continues to increase worldwide. New fourth-generation (4G) wireless networks can deliver data rates exceeding 2 Mbps. The purpose of this paper is to develop a framework of 4G mobile applications that utilize such high data rates and run on small form-factor devices. Design/methodology/approach:…

  1. Earth Science Computational Architecture for Multi-disciplinary Investigations

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Blom, R.; Gurrola, E.; Katz, D.; Lyzenga, G.; Norton, C.

    2005-12-01

    Understanding the processes underlying Earth's deformation and mass transport requires a non-traditional, integrated, interdisciplinary, approach dependent on multiple space and ground based data sets, modeling, and computational tools. Currently, details of geophysical data acquisition, analysis, and modeling largely limit research to discipline domain experts. Interdisciplinary research requires a new computational architecture that is optimized to perform complex data processing of multiple solid Earth science data types in a user-friendly environment. A web-based computational framework is being developed and integrated with applications for automatic interferometric radar processing, and models for high-resolution deformation & gravity, forward models of viscoelastic mass loading over short wavelengths & complex time histories, forward-inverse codes for characterizing surface loading-response over time scales of days to tens of thousands of years, and inversion of combined space magnetic & gravity fields to constrain deep crustal and mantle properties. This framework combines an adaptation of the QuakeSim distributed services methodology with the Pyre framework for multiphysics development. The system uses a three-tier architecture, with a middle tier server that manages user projects, available resources, and security. This ensures scalability to very large networks of collaborators. Users log into a web page and have a personal project area, persistently maintained between connections, for each application. Upon selection of an application and host from a list of available entities, inputs may be uploaded or constructed from web forms and available data archives, including gravity, GPS and imaging radar data. The user is notified of job completion and directed to results posted via URLs. Interdisciplinary work is supported through easy availability of all applications via common browsers, application tutorials and reference guides, and worked examples with visual response. At the platform level, multi-physics application development and workflow are available in the enriched environment of the Pyre framework. Advantages for combining separate expert domains include: multiple application components efficiently interact through Python shared libraries, investigators may nimbly swap models and try new parameter values, and a rich array of common tools are inherent in the Pyre system. The first four specific investigations to use this framework are: Gulf Coast subsidence: understanding of partitioning between compaction, subsidence and growth faulting; Gravity & deformation of a layered spherical earth model due to large earthquakes; Rift setting of Lake Vostok, Antarctica; and global ice mass changes.

  2. Linking English-Language Test Scores onto the Common European Framework of Reference: An Application of Standard-Setting Methodology. TOEFL iBT Research Report TOEFL iBt-06. ETS RR-08-34

    ERIC Educational Resources Information Center

    Tannenbaum, Richard J.; Wylie, E. Caroline

    2008-01-01

    The Common European Framework of Reference (CEFR) describes language proficiency in reading, writing, speaking, and listening on a 6-level scale. In this study, English-language experts from across Europe linked CEFR levels to scores on three tests: the TOEFL® iBT test, the TOEIC® assessment, and the TOEIC "Bridge"™ test.…

  3. eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.

    PubMed

    Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre

    2016-11-01

    Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high-throughput technologies, schema or model variability induced by large-scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variability in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variability and propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomics platform.

  4. Cloud Computing Services for Seismic Networks

    NASA Astrophysics Data System (ADS)

    Olson, Michael

    This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. The thesis describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed by the framework are (1) the CSN---the Community Seismic Network---which uses relatively low-cost sensors deployed by members of the community, and (2) SAF---the Situation Awareness Framework---which integrates data from multiple sources, including the CSN, CISN---the California Integrated Seismic Network, a network consisting of high-quality seismometers deployed carefully by professionals in the CISN organization and spread across Southern California---and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust and radiation sensors.

  5. Framework for near-field-communication-based geo-localization and personalization for Android-based smartphones--application in hospital environments.

    PubMed

    Meng, Philipp; Fehre, Karsten; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2014-01-01

    Various applications using near field communication (NFC) have been developed for the medical sector. As a method of short-range, wireless, contact-driven data transfer, NFC is a useful tool in medicine. It can be used to transfer data such as blood pressure, to monitor adherence to medication, or to transmit in vivo data. The first proposed general framework uses NFC as a mechanism for indoor geo-localization in hospitals. NFC geo-localization is economical compared to classical concepts using indoor GPS or WLAN triangulation, and the granularity of location retrieval can be defined at the tag level. Using this framework, we facilitate the development of medical applications that require exact indoor geo-localization. Multi-user Android systems are addressed in the second framework. Using private NFC tags, users are able to carry their personal settings with them for enabled applications. This eliminates the need for multiple user accounts on common Android devices, improves usability, and eases technical administration. Based on the prototypes presented here, we show a novel concept for using NFC-enabled Android devices in hospital environments.

  6. A Smartphone-Based Driver Safety Monitoring System Using Data Fusion

    PubMed Central

    Lee, Boon-Giin; Chung, Wan-Young

    2012-01-01

    This paper proposes a method for monitoring driver safety levels using a data fusion approach based on several discrete data types: eye features, bio-signal variation, in-vehicle temperature, and vehicle speed. The driver safety monitoring system was developed in practice in the form of an application for an Android-based smartphone device, where measuring safety-related data requires no extra monetary expenditure or equipment. Moreover, the system provides high resolution and flexibility. The safety monitoring process involves the fusion of attributes gathered from different sensors, including video, electrocardiography, photoplethysmography, temperature, and a three-axis accelerometer, that are assigned as input variables to an inference analysis framework. A Fuzzy Bayesian framework is designed to indicate the driver’s capability level and is updated continuously in real time. The sensory data are transmitted via Bluetooth communication to the smartphone device. A fake incoming call warning service alerts the driver if his or her safety level is suspiciously compromised. Realistic testing of the system demonstrates the practical benefits of multiple features and their fusion in providing more authentic and effective driver safety monitoring. PMID:23247416
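
    The fusion step can be sketched as a sequential Bayesian update over a discrete safety state: each new sensor observation reweights the belief. The states, evidence variables, and probabilities below are invented for illustration; the paper's actual model is a fuzzy Bayesian framework, not this plain naive-Bayes toy.

```python
# Toy sequential Bayesian fusion of discrete sensor evidence into a belief
# over the driver's state. All numbers are illustrative assumptions.
states = ["alert", "drowsy"]
prior = {"alert": 0.9, "drowsy": 0.1}

# P(observation | state) for two of the paper's input types (invented).
likelihood = {
    ("eyes_closing", "alert"): 0.05, ("eyes_closing", "drowsy"): 0.70,
    ("hr_variability_low", "alert"): 0.20, ("hr_variability_low", "drowsy"): 0.60,
}

def update(belief, observation):
    post = {s: belief[s] * likelihood[(observation, s)] for s in states}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

belief = update(prior, "eyes_closing")
belief = update(belief, "hr_variability_low")  # belief evolves in real time
print(belief)  # a rising P(drowsy) would trigger the fake-call warning
```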

  7. A capacity-building conceptual framework for public health nutrition practice.

    PubMed

    Baillie, Elizabeth; Bjarnholt, Christel; Gruber, Marlies; Hughes, Roger

    2009-08-01

    To describe a conceptual framework to assist in the application of capacity-building principles to public health nutrition practice. A review of the literature and consideration of the determinants of effective public health nutrition practice have been used to inform the development of a conceptual framework for capacity building in the context of public health nutrition practice. The limited literature supports a greater integration and application of capacity-building strategies and principles in public health nutrition practice, and suggests that this application should be overt and strategic. A framework is proposed that identifies a number of determinants of capacity for effective public health nutrition action. The framework represents the key foundations for building capacity, including leadership, resourcing and intelligence. Five key strategic domains supported by these foundation elements are proposed: partnerships, organisational development, project management quality, workforce development and community development. This framework can be used to assist the systematic assessment, development and evaluation of capacity-building activity within public health nutrition practice. Capacity building is a strategy within public health nutrition practice that needs to be central to public health nutrition intervention management. The present paper defines, contextualises and outlines a framework for integrating and making explicit the importance of capacity building within public health nutrition practice at many levels.

  8. RN jurisdiction over nursing care systems in nursing homes: application of latent class analysis

    PubMed Central

    Corazzini, Kirsten N.; Anderson, Ruth A.; Mueller, Christine; Thorpe, Joshua M.; McConnell, Eleanor S.

    2015-01-01

    Background In the context of declining registered nurse (RN) staffing levels in nursing homes, professional nursing jurisdiction over nursing care systems may erode. Objectives The purpose of this study is to develop a typology of professional nursing jurisdiction in nursing homes in relation to characteristics of RN staffing, drawing upon Abbott's (1988) tasks and jurisdictions framework. Method The study was a cross-sectional, observational study using the 2004 National Nursing Home Survey (N=1,120 nursing homes). Latent class analysis tested whether RN staffing indicators differentiated facilities in a typology of RN jurisdiction, and compared classes on key organizational environment characteristics. Multiple logistic regression analysis related the emergent classes to presence or absence of specialty care programs in 8 clinical areas. Results Three classes of capacity for jurisdiction were identified, including ‘low capacity’ (41% of homes) with low probabilities of having any indicators of RN jurisdiction, ‘mixed capacity’ (26% of homes) with moderate to high probabilities of having higher RN education and staffing levels, and ‘high capacity’ (32% of homes) with moderate to high probabilities of having almost all indicators of RN jurisdiction. ‘High capacity’ homes were more likely to have specialty care programs relative to ‘low capacity’ homes; such homes were less likely to be chain-owned, and more likely to be larger, provide higher technical levels of patient care, have unionized nursing assistants, have a lower ratio of LPNs to RNs, and a higher education level of the administrator. Discussion Findings provide preliminary support for the theoretical framework as a starting point to move beyond extensive reliance on staffing levels and mix as indicators of quality. Further, findings indicate the importance of RN specialty certification. PMID:22166907

  9. Finger vein recognition based on the hyperinformation feature

    NASA Astrophysics Data System (ADS)

    Xi, Xiaoming; Yang, Gongping; Yin, Yilong; Yang, Lu

    2014-01-01

    The finger vein is a promising biometric pattern for personal identification due to its advantages over other existing biometrics. In finger vein recognition, feature extraction is a critical step, and many feature extraction methods have been proposed to extract the gray, texture, or shape of the finger vein. We treat these as low-level features and present a high-level feature extraction framework. Under this framework, base attributes are first defined to represent the characteristics of certain subcategories of a subject. Then, for an image, correlation coefficients are used to construct the high-level feature, which reflects the correlation between this image and all base attributes. Since the high-level feature can reveal characteristics of more subcategories and contains more discriminative information, we call it the hyperinformation feature (HIF). Compared with low-level features, which only represent the characteristics of one subcategory, HIF is more powerful and robust. To demonstrate the potential of the proposed framework, we provide a case study of extracting HIF. We conduct comprehensive experiments to show the generality of the proposed framework and the efficiency of HIF on our databases. Experimental results show that HIF significantly outperforms the low-level features.
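
    The construction described, representing an image by its correlation with every base attribute, reduces to a simple computation. The sketch below uses random vectors as stand-ins for the low-level features; real base attributes would come from gray/texture/shape features of finger vein images.

```python
# High-level feature as correlations with base attributes (per the abstract).
# Data here is random; dimensions and counts are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
base_attributes = rng.standard_normal((50, 256))  # 50 subcategory prototypes
image_feature = rng.standard_normal(256)          # low-level feature of one image

def hif(x, bases):
    # One correlation coefficient per base attribute -> 50-dim HIF vector.
    return np.array([np.corrcoef(x, b)[0, 1] for b in bases])

h = hif(image_feature, base_attributes)
print(h.shape)  # (50,)
```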

  10. Effect of Wind Farm Noise on Local Residents' Decision to Adopt Mitigation Measures.

    PubMed

    Botelho, Anabela; Arezes, Pedro; Bernardo, Carlos; Dias, Hernâni; Pinto, Lígia M Costa

    2017-07-11

    Wind turbine noise is frequently pointed out as the reason for local communities' objection to the installation of wind farms. The literature suggests that local residents feel annoyed by such noise and that, in many instances, the annoyance is significant enough to make them adopt noise-abatement interventions in their homes. Aiming to characterize the relationship between wind turbine noise, annoyance, and mitigating actions, we propose a novel conceptual framework. The proposed framework posits that actual sound pressure levels of wind turbines determine individual homes' noise-abatement decisions; in addition, the framework analyzes the role that self-reported annoyance, and perception of noise levels, plays in the relationship between actual noise pressure levels and those decisions. The application of this framework to a particular case study shows that noise perception and annoyance constitute a link between actual noise levels and mitigation decisions. Importantly, however, noise also directly affects people's decision to adopt mitigating measures, independently of the reported annoyance.

  11. Technical Guidance from the International Safety Framework for Nuclear Power Source Applications in Outer Space for Design and Development Phases

    NASA Astrophysics Data System (ADS)

    Summerer, Leopold

    2014-08-01

    In 2009, the International Safety Framework for Nuclear Power Source Applications in Outer Space [1] was adopted, following a multi-year process that involved all major spacefaring nations in the frame of the International Atomic Energy Agency and the UN Committee on the Peaceful Uses of Outer Space. The safety framework reflects an international consensus on best practices. After the older 1992 Principles Relevant to the Use of Nuclear Power Sources in Outer Space, it is the second document at UN level dedicated entirely to space nuclear power sources. This paper analyses aspects of the safety framework relevant to the design and development phases of space nuclear power sources. While early publications have begun analysing the legal aspects of the safety framework, its technical guidance has not yet been the subject of scholarly analysis. The present paper therefore focuses on the technical guidance provided in the safety framework, in an attempt to help engineers and practitioners benefit from it.

  12. Highly selective luminescent sensing of picric acid based on a water-stable europium metal-organic framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Tifeng; Zhu, Fengliang; Cui, Yuanjing, E-mail: cuiyj@zju.edu.cn

    A water-stable metal-organic framework (MOF), EuNDC, has been synthesized for selective detection of the well-known contaminant and toxicant picric acid (PA) in aqueous solution. Owing to the photo-induced electron transfer and self-absorption mechanisms, EuNDC displayed rapid, selective and sensitive detection of PA with a detection limit of 37.6 ppb. Recyclability experiments revealed that EuNDC retains its initial luminescent intensity and quenching efficiency in each cycle, suggesting high photostability and reusability for long-term sensing applications. The excellent detection performance of EuNDC makes it a promising PA sensing material for practical applications. - Graphical abstract: A water-stable europium-based metal-organic framework has been reported for highly selective sensing of picric acid (PA) with a detection limit of 37.6 ppb in aqueous solution. - Highlights: • A water-stable metal-organic framework (MOF) EuNDC was synthesized. • Highly selective detection of picric acid with a detection limit of 37.6 ppb was realized. • The detection mechanism is also presented and discussed.

  13. Effective Recovery of Vanadium from Oil Refinery Waste into Vanadium-Based Metal-Organic Frameworks.

    PubMed

    Zhan, Guowu; Ng, Wei Cheng; Lin, Wenlin Yvonne; Koh, Shin Nuo; Wang, Chi-Hwa

    2018-03-06

    Carbon black waste, an oil refinery waste, contains a high concentration of vanadium (V) left over from the processing of crude oil. For the sake of environmental sustainability, it is therefore of interest to recover the vanadium as useful products instead of disposing of it. In this work, V was recovered in the form of vanadium-based metal-organic frameworks (V-MOFs) via a novel pathway, by using the leaching solution of carbon black waste instead of commercially available vanadium chemicals. Two different types of V-MOFs with high levels of crystallinity and phase purity were fabricated in very high yields (>98%) based on a coordination modulation method. The V-MOFs exhibited well-defined and controlled shapes such as nanofibers (length: >10 μm) and nanorods (length: ∼270 nm). Furthermore, the V-MOFs showed high catalytic activities for the oxidation of benzyl alcohol to benzaldehyde, indicating the strong potential of the waste-derived V-MOFs in catalysis applications. Overall, our work offers a green synthesis pathway for the preparation of V-MOFs by using heavy metals from industrial waste as the metal source.

  14. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference for assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistent surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  15. Ecoregions of the conterminous United States: evolution of a hierarchical spatial framework

    USGS Publications Warehouse

    Omernik, James M.; Griffith, Glenn E.

    2014-01-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  16. Ecoregions of the Conterminous United States: Evolution of a Hierarchical Spatial Framework

    NASA Astrophysics Data System (ADS)

    Omernik, James M.; Griffith, Glenn E.

    2014-12-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  17. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    NASA Astrophysics Data System (ADS)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high-performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open-source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. The connection of ParaView to the Fermilab art framework is described and the capabilities it brings are discussed.

  18. Just-in-Time Training of the Evidence-Based Public Health Framework, Oklahoma, 2016-2017.

    PubMed

    Douglas, Malinda R; Lowry, Jon P; Morgan, Latricia A

    2018-03-07

    Training practitioners in evidence-based public health has been shown to be beneficial, yet overwhelming. Chunking information and proximate practical application are effective techniques for increasing retention in adult learning. Evidence-based public health training for practitioners from African American and Hispanic/Latino community agencies and tribes/tribal nations incorporated these 2 techniques. The community-level practitioners alternated between attending training and implementing the steps of the evidence-based public health framework as they planned state-funded programs. One year later, survey results showed that participants reported increased confidence, compared with posttraining survey results, in skills that had been reinforced by practical and practiced application. In addition, at 1 year, reported confidence in skills that were not fortified by proximate application had decreased compared with posttraining confidence levels. All 7 community programs successfully created individualized evidence-based action plans that included evidence-based practices and policies across socioecological levels and that fit the unique culture and climate of their own community.

  19. Study of Electrocatalytic Properties of Metal–Organic Framework PCN-223 for the Oxygen Reduction Reaction

    DOE PAGES

    Usov, Pavel M.; Huffman, Brittany; Epley, Charity C.; ...

    2017-03-27

    Here, a highly robust metal–organic framework (MOF) constructed from Zr6 oxo clusters and Fe(III) porphyrin linkers, PCN-223-Fe, was investigated as a heterogeneous catalyst for the oxygen reduction reaction (ORR). Films of the framework were grown on a conductive FTO substrate, showed a high catalytic current upon application of cathodic potentials, and achieved high H2O/H2O2 selectivity. In addition, the effect of the proton source on the catalytic performance was also investigated.

  20. An application of extreme value theory to the management of a hydroelectric dam.

    PubMed

    Minkah, Richard

    2016-01-01

    Assessing the probability of very low or high water levels is an important issue in the management of hydroelectric dams. In the case of the Akosombo dam, very low and high water levels result in load shedding of electrical power and flooding in communities downstream, respectively. In this paper, we use extreme value theory to estimate the probability and return period of very low water levels that can result in load shedding or a complete shutdown of the dam's operations. In addition, we assess the probability and return period of high water levels near the height of the dam and beyond. This provides a framework for a possible extension of the dam to sustain the generation of electrical power and reduce the frequency of spillage that causes flooding in communities downstream. The results show that an extension of the dam can reduce the probability and prolong the return period of a flood. In addition, we found a negligible probability of a complete shutdown of the dam due to inadequate water levels.
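
    The return-period computation at the heart of such an analysis is compact: fit an extreme value distribution to block maxima, then take the reciprocal of the exceedance probability. The sketch below uses the real scipy.stats.genextreme distribution on synthetic yearly maxima; the study itself fits actual Akosombo reservoir levels, and the threshold and parameters here are invented.

```python
# Return period of a high water level via a GEV fit to annual maxima.
# Synthetic data; threshold and distribution parameters are illustrative.
from scipy.stats import genextreme

annual_max = genextreme.rvs(c=-0.1, loc=84.0, scale=1.5, size=60,
                            random_state=1)     # synthetic yearly maxima (m)

params = genextreme.fit(annual_max)             # fit GEV to the block maxima
p_exceed = genextreme.sf(87.0, *params)         # P(annual max > 87 m)
print("return period of an 87 m level: %.0f years" % (1.0 / p_exceed))
```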

  1. Programming Hierarchical Self-Assembly of Patchy Particles into Colloidal Crystals via Colloidal Molecules.

    PubMed

    Morphew, Daniel; Shaw, James; Avins, Christopher; Chakrabarti, Dwaipayan

    2018-03-27

    Colloidal self-assembly is a promising bottom-up route to a wide variety of three-dimensional structures, from clusters to crystals. Programming hierarchical self-assembly of colloidal building blocks, which can give rise to structures ordered at multiple levels to rival biological complexity, poses a multiscale design problem. Here we explore a generic design principle that exploits a hierarchy of interaction strengths and employ this design principle in computer simulations to demonstrate the hierarchical self-assembly of triblock patchy colloidal particles into two distinct colloidal crystals. We obtain cubic diamond and body-centered cubic crystals via distinct clusters of uniform size and shape, namely, tetrahedra and octahedra, respectively. Such a conceptual design framework has the potential to reliably encode hierarchical self-assembly of colloidal particles into a high level of sophistication. Moreover, the design framework underpins a bottom-up route to cubic diamond colloidal crystals, which have remained elusive despite being much sought after for their attractive photonic applications.

  2. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  3. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
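
    The recognition building block named above, Fourier descriptors of a contour fed to a support vector machine, can be illustrated in a few lines. The contours and labels below are synthetic stand-ins for segmented parenchyma-part outlines, and the descriptor normalization is one common choice among several.

```python
# Translation- and scale-normalized Fourier descriptors of closed 2D
# contours, classified with an SVM. Shapes and labels are synthetic.
import numpy as np
from sklearn.svm import SVC

def fourier_descriptors(contour, n=16):
    z = contour[:, 0] + 1j * contour[:, 1]    # complex encoding of (x, y)
    f = np.fft.fft(z - z.mean())              # subtract mean: translation invariance
    mag = np.abs(f)
    return mag[1:n + 1] / mag[1]              # divide by |f1|: scale invariance

t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circles = [np.c_[np.cos(t), np.sin(t)] * r for r in (1.0, 1.1, 1.2)]
blobs = [np.c_[np.cos(t) * (1 + 0.3 * np.cos(3 * t)),
               np.sin(t) * (1 + 0.3 * np.cos(3 * t))] for _ in range(3)]

X = np.array([fourier_descriptors(c) for c in circles + blobs])
y = [0, 0, 0, 1, 1, 1]                        # two shape classes
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X))
```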

  4. A critical review of recent US market level health care strategy literature.

    PubMed

    Wells, R; Banaszak-Holl, J

    2000-09-01

    In this review, we argue that it would be profitable if the neoclassical economic theories that have dominated recent US market level health care strategy research could be complemented by greater use of sociological frameworks. Sociological theory can address three central questions that neoclassical economic theories have tended to slight: (1) how decision-makers' preferences are determined; (2) who the decision-makers are; and (3) how decision-makers' plans are translated into organizational action. We suggest five sociological frameworks that would enable researchers to address these issues better relative to market level strategy in health care. The frameworks are (1) institutional theory, (2) organizational ecology, (3) social movements, (4) social networks, and (5) internal organizational change. A recent global trend toward privatization of health care provision makes US market level strategy research increasingly applicable to non-US readers.

  5. Content-level deduplication on mobile internet datasets

    NASA Astrophysics Data System (ADS)

    Hou, Ziyu; Chen, Xunxun; Wang, Yang

    2017-06-01

    Various systems and applications involve a large volume of duplicate items. Given the high data redundancy in real-world datasets, data deduplication can reduce storage capacity requirements and improve the utilization of network bandwidth. However, the chunks used by existing deduplication systems range in size from 4KB to over 16KB, so these systems are not applicable to datasets consisting of short records. In this paper, we propose a new framework called SF-Dedup which is able to deduplicate large sets of Mobile Internet records whose size can be smaller than 100B, or even smaller than 10B. SF-Dedup is a short-fingerprint, in-line deduplication scheme that resolves hash collisions. Results of experimental applications illustrate that SF-Dedup is able to reduce storage capacity and shorten query time on relational databases.
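
    The core idea, indexing records by a deliberately short fingerprint to save memory and resolving the resulting collisions by comparing full record content, fits in a short sketch. SF-Dedup itself is not a public API, so everything below (fingerprint width, index layout) is an illustrative assumption.

```python
# Short-fingerprint deduplication with collision resolution by full-content
# comparison, in the spirit of the SF-Dedup description. Illustrative only.
import hashlib
from collections import defaultdict

def short_fp(record: bytes, nbytes: int = 4) -> bytes:
    return hashlib.sha1(record).digest()[:nbytes]   # deliberately short

index = defaultdict(list)   # fingerprint -> list of stored unique records

def deduplicate(records):
    unique = []
    for rec in records:
        fp = short_fp(rec)
        if rec in index[fp]:        # collision-safe: compare full content
            continue                # true duplicate, drop it
        index[fp].append(rec)       # new record (or a mere fp collision)
        unique.append(rec)
    return unique

print(deduplicate([b"GET /a", b"GET /a", b"GET /b"]))  # [b'GET /a', b'GET /b']
```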

  6. Evidence-Based Evaluation of Practice and Innovation in Physical Therapy Using the IDEAL-Physio Framework.

    PubMed

    Beard, David; Hamilton, David; Davies, Loretta; Cook, Jonathan; Hirst, Allison; McCulloch, Peter; Paez, Arsenio

    2018-02-01

    The IDEAL framework is an established method for initial and ongoing evaluations of innovation and practice for complex health care interventions. First derived for surgical sciences and embedded at a global level for evaluating surgery/surgical devices, the IDEAL framework is based on the principle that innovation and evaluation in clinical practice can, and should, evolve together in an ordered manner: from conception to development and then to validation by appropriate clinical studies and, finally, longer-term follow-up. This framework is highly suited to other complex, nonpharmacological interventions, such as physical therapist interventions. This perspective outlines the application of IDEAL to physical therapy in the new IDEAL-Physio framework. The IDEAL-Physio framework comprises 5 stages. In stage 1, the idea phase, formal data collection should begin. Stage 2a is the phase for iterative improvement and adjustment with thorough data recording. Stage 2b involves the onset of formal evaluation using systematically collected group or cohort data. Stage 3 is the phase for formal comparative assessment of treatment, usually involving randomized studies. Stage 4 involves long-term follow-up. The IDEAL-Physio framework is recommended as a method for guiding and evaluating both innovation and practice in physical therapy, with the overall goal of providing better evidence-based care. © 2017 American Physical Therapy Association.

  7. Trainable Nonlinear Reaction Diffusion: A Flexible Framework for Fast and Effective Image Restoration.

    PubMed

    Chen, Yunjin; Pock, Thomas

    2017-06-01

    Image restoration is a long-standing problem in low-level computer vision with many interesting applications. We describe a flexible learning framework based on the concept of nonlinear reaction diffusion models for various image restoration problems. By embodying recent improvements in nonlinear diffusion models, we propose a dynamic nonlinear reaction diffusion model with time-dependent parameters (i.e., linear filters and influence functions). In contrast to previous nonlinear diffusion models, all the parameters, including the filters and the influence functions, are simultaneously learned from training data through a loss-based approach. We call this approach TNRD: Trainable Nonlinear Reaction Diffusion. The TNRD approach is applicable to a variety of image restoration tasks by incorporating an appropriate reaction force. We demonstrate its capabilities with three representative applications: Gaussian image denoising, single-image super-resolution, and JPEG deblocking. Experiments show that our trained nonlinear diffusion models benefit greatly from the training of the parameters and lead to the best reported performance on common test datasets for the tested applications. Our trained models preserve the structural simplicity of diffusion models and take only a small number of diffusion steps, and thus are highly efficient. Moreover, they are also well-suited for parallel computation on GPUs, which makes the inference procedure extremely fast.
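
    One inference step of such a model applies learned filters, passes the responses through nonlinear influence functions, diffuses back, and adds a data-fidelity reaction term. The sketch below uses fixed toy filters and tanh as the influence function purely for illustration; in TNRD both are learned per step from training data.

```python
# One reaction-diffusion step in the spirit of TNRD. Filters and the
# influence function here are toy choices, not learned parameters.
import numpy as np
from scipy.ndimage import convolve

def tnrd_step(u, f, filters, lam=0.1, tau=0.1):
    diffusion = np.zeros_like(u)
    for k in filters:
        response = convolve(u, k, mode="reflect")     # K_k u
        phi = np.tanh(response)                       # toy influence function
        diffusion += convolve(phi, k[::-1, ::-1],     # K_k^T as flipped kernel
                              mode="reflect")
    return u - tau * (diffusion + lam * (u - f))      # reaction: stay near f

f = np.random.rand(64, 64)                            # noisy input image
u = f.copy()
filters = [np.array([[0, 0, 0], [-1, 1, 0], [0, 0, 0]], float),  # toy d/dx
           np.array([[0, -1, 0], [0, 1, 0], [0, 0, 0]], float)]  # toy d/dy
for _ in range(5):                                    # TNRD also uses few steps
    u = tnrd_step(u, f, filters)
```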

  8. Protein loop modeling using a new hybrid energy function and its application to modeling in inaccurate structural environments.

    PubMed

    Park, Hahnbeom; Lee, Gyu Rie; Heo, Lim; Seok, Chaok

    2014-01-01

    Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.
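
    The hybrid-energy idea reduces to scoring each candidate conformation with a weighted combination of the two component terms and optimizing over candidates. The sketch below is a bare-bones stand-in: both component scorers and the weights are invented for illustration, and the paper's actual terms and global optimizer differ.

```python
# Minimal sketch of a hybrid energy: weighted sum of a physics-based and a
# knowledge-based term over candidate loop conformations. All placeholders.
def physics_energy(conf):       # stand-in for a force-field energy
    return sum(x * x for x in conf)

def knowledge_energy(conf):     # stand-in for a statistical potential
    return -sum(abs(x) for x in conf)

def hybrid_energy(conf, w_phys=0.5, w_know=0.5):
    return w_phys * physics_energy(conf) + w_know * knowledge_energy(conf)

candidates = [[0.1, 0.2], [0.5, -0.3], [0.0, 0.05]]   # toy "conformations"
best = min(candidates, key=hybrid_energy)             # global-optimizer stand-in
print(best)
```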

  9. Rotation covariant image processing for biomedical applications.

    PubMed

    Skibbe, Henrik; Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general framework based on concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences.

  10. On extracting design principles from biology: I. Method-General answers to high-level design questions for bioinspired robots.

    PubMed

    Haberland, M; Kim, S

    2015-02-02

    When millions of years of evolution suggest a particular design solution, we may be tempted to abandon traditional design methods and copy the biological example. However, biological solutions do not often translate directly into the engineering domain, and even when they do, copying eliminates the opportunity to improve. A better approach is to extract design principles relevant to the task of interest, incorporate them in engineering designs, and vet these candidates against others. This paper presents the first general framework for determining whether biologically inspired relationships between design input variables and output objectives and constraints are applicable to a variety of engineering systems. Using optimization and statistics to generalize the results beyond a particular system, the framework overcomes shortcomings observed of ad hoc methods, particularly those used in the challenging study of legged locomotion. The utility of the framework is demonstrated in a case study of the relative running efficiency of rotary-kneed and telescoping-legged robots.

  11. Towards a conceptual framework of OSH risk management in smart working environments based on smart PPE, ambient intelligence and the Internet of Things technologies.

    PubMed

    Podgórski, Daniel; Majchrzycka, Katarzyna; Dąbrowska, Anna; Gralewicz, Grzegorz; Okrasa, Małgorzata

    2017-03-01

    Recent developments in domains of ambient intelligence (AmI), the Internet of Things, cyber-physical systems (CPS), ubiquitous/pervasive computing, etc., have led to numerous attempts to apply ICT solutions in the occupational safety and health (OSH) area. A literature review reveals a wide range of examples of smart materials, smart personal protective equipment and other AmI applications that have been developed to improve workers' safety and health. Because the use of these solutions modifies work methods, increases the complexity of production processes and introduces high dynamism into the smart working environments (SWE) thus created, a new conceptual framework for dynamic OSH management in SWE is called for. The proposed framework is based on a new paradigm of OSH risk management consisting of real-time risk assessment and the capacity to monitor the risk level of each worker individually. A rationale for context-based reasoning in SWE and a respective model of the SWE-dedicated CPS are also proposed.

  12. High School Improvement: Indicators of Effectiveness and School-Level Benchmarks

    ERIC Educational Resources Information Center

    National High School Center, 2012

    2012-01-01

    The National High School Center's "Eight Elements of High School Improvement: A Mapping Framework" provides a cohesive high school improvement framework comprised of eight elements and related indicators of effectiveness. These indicators of effectiveness allow states, districts, and schools to identify strengths and weaknesses of their current…

  13. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  14. Evidence-Based Practices: Applications of Concrete Representational Abstract Framework across Math Concepts for Students with Mathematics Disabilities

    ERIC Educational Resources Information Center

    Agrawal, Jugnu; Morin, Lisa L.

    2016-01-01

    Students with mathematics disabilities (MD) experience difficulties with both conceptual and procedural knowledge of different math concepts across grade levels. Research shows that concrete representational abstract framework of instruction helps to bridge this gap for students with MD. In this article, we provide an overview of this strategy…

  15. Resource Management for Real-Time Adaptive Agents

    NASA Technical Reports Server (NTRS)

    Welch, Lonnie; Chelberg, David; Pfarr, Barbara; Fleeman, David; Parrott, David; Tan, Zhen-Yu; Jain, Shikha; Drews, Frank; Bruggeman, Carl; Shuler, Chris

    2003-01-01

    Increased autonomy and automation in onboard flight systems offer numerous potential benefits, including cost reduction and greater flexibility. The existence of generic mechanisms for automation is critical for handling unanticipated science events and anomalies, where limitations in traditional control software with fixed, predetermined algorithms can mean loss of science data and missed opportunities for observing important terrestrial events. We have developed such a mechanism by adding a Hierarchical Agent-based ReaLTime technology (HART) extension to our Dynamic Resource Management (DRM) middleware. Traditional DRM provides mechanisms to monitor the real-time performance of distributed applications and to move applications among processors to improve real-time performance. In the HART project we have designed and implemented a performance adaptation mechanism to improve real-time performance. To use this mechanism, applications are developed that can run at various levels of quality. The DRM can choose a setting for the quality level of an application dynamically at run-time in order to manage satellite resource usage more effectively. A ground-based prototype of a satellite system that captures and processes images has also been developed as part of this project to be used as a benchmark for evaluating the resource management framework. A significant enhancement of this generic mission-independent framework allows scientists to specify the utility, or "scientific benefit," of science observations under various conditions like cloud cover and compression method. The resource manager then uses these benefit tables to determine in real-time how to set the quality levels for applications to maximize overall system utility as defined by the scientists running the mission. We also show how maintenance functions like health and safety data can be integrated into the utility framework. Once this framework has been certified for missions and successfully flight-tested, it can be reused with little development overhead for other missions. In contrast, current space missions like Swift manage similar types of resource trade-offs entirely within the scientific application code itself, and such code must be re-certified and tested for each mission even if a large portion of the code base is shared. This final report discusses some of the major issues motivating this research effort, provides a literature review of the related work, discusses the resource management framework and ground-based satellite system prototype that has been developed, indicates what work is yet to be performed, and provides a list of publications resulting from this work.
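
    The utility-driven quality-level selection described above can be sketched as a small constrained search; the benefit tables, application names, and budget below are hypothetical, and a flight implementation would use a more scalable solver than exhaustive enumeration:

    ```python
    # Sketch: choose one quality level per application, maximizing total
    # scientist-specified utility subject to a shared resource budget.
    from itertools import product

    def choose_quality_levels(apps, budget):
        """apps: {name: [(resource_cost, utility), ...]}; returns best levels."""
        best_utility, best_choice = float("-inf"), None
        names = list(apps)
        for choice in product(*(range(len(apps[n])) for n in names)):
            cost = sum(apps[n][i][0] for n, i in zip(names, choice))
            util = sum(apps[n][i][1] for n, i in zip(names, choice))
            if cost <= budget and util > best_utility:
                best_utility, best_choice = util, dict(zip(names, choice))
        return best_choice, best_utility

    levels = {"imager": [(2, 1.0), (4, 2.5), (8, 3.0)],
              "compressor": [(1, 0.5), (3, 1.5)]}
    print(choose_quality_levels(levels, budget=7))
    # -> ({'imager': 1, 'compressor': 1}, 4.0)
    ```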

  16. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of applicability of the first-order frameworks.
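
    One common way to realize first-order consistency between models, sketched below under the assumption of a simple additive correction (the specific AMMO correction scheme may differ), is to shift the low-fidelity model so that its value and gradient match the high-fidelity ones at the current iterate:

    ```python
    # Sketch of a first-order-consistent surrogate: f_lo plus a linear
    # correction so that m(xk) = f_hi(xk) and grad m(xk) = grad f_hi(xk).
    # f_lo/g_lo are cheap model callables; f_hi_xk/g_hi_xk come from one
    # expensive high-fidelity evaluation at xk (all numpy arrays/scalars).
    import numpy as np

    def corrected_model(f_lo, g_lo, f_hi_xk, g_hi_xk, xk):
        f_lo_xk, g_lo_xk = f_lo(xk), g_lo(xk)
        def m(x):
            return (f_lo(x)
                    + (f_hi_xk - f_lo_xk)                 # zero-order match
                    + (g_hi_xk - g_lo_xk) @ (x - xk))     # first-order match
        return m
    ```

    The optimizer then works mostly on m(x), touching the high-fidelity model only when the surrogate is rebuilt.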

  17. OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

    PubMed

    Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; Van De Ville, Dimitri

    2017-08-01

    Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open-source and one of the most widely used fMRI data analysis software packages. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
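
    The activity-based protocol reduces, at its core, to computing a feedback value from each incoming volume. A minimal sketch of that computation (illustrative only; this is not OpenNFT's actual API, and real pipelines add realignment, filtering, and baseline modeling):

    ```python
    # Sketch of an activity-level feedback signal: percent signal change of
    # an ROI mean relative to a baseline. Inputs are assumed numpy arrays.
    import numpy as np

    def feedback_signal(volume: np.ndarray, roi_mask: np.ndarray,
                        baseline: float) -> float:
        roi_mean = volume[roi_mask].mean()        # mean over ROI voxels
        return (roi_mean - baseline) / baseline   # percent signal change
    ```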

  18. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.

  19. Child Safety Reference Frameworks: a Policy Tool for Child Injury Prevention at the Sub-national Level.

    PubMed

    Scholtes, Beatrice; Schröder-Bäck, Peter; Mackay, Morag; Vincenten, Joanne; Brand, Helmut

    2017-06-01

    The aim of this paper is to present the Child Safety Reference Frameworks (CSRF), a policy advice tool that places evidence-based child safety interventions, applicable at the sub-national level, into a framework resembling the Haddon Matrix. The CSRF is based on work done in previous EU funded projects, which we have adapted to the field of child safety. The CSRF were populated following a literature review. Four CSRF were developed for four domains of child safety: road, water and home safety, and intentional injury prevention. The CSRF can be used as a reference, assessment and comparative tool by child safety practitioners and policy makers working at the sub-national level. Copyright© by the National Institute of Public Health, Prague 2017

  20. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  1. Framing ethical acceptability: a problem with nuclear waste in Canada.

    PubMed

    Wilding, Ethan T

    2012-06-01

    Ethical frameworks are often used in professional fields as a means of providing explicit ethical guidance for individuals and institutions when confronted with ethically important decisions. The notion of an ethical framework has received little critical attention, however, and the concept subsequently lends itself easily to misuse and ambiguous application. This is the case with the 'ethical framework' offered by Canada's Nuclear Waste Management Organization (NWMO), the crown corporation which owns and is responsible for the long-term management of Canada's high-level nuclear fuel waste. It makes a very specific claim, namely that it is managing Canada's long-lived radioactive nuclear fuel waste in an ethically responsible manner. According to this organization, what it means to behave in an ethically responsible manner is to act and develop policy in accordance with its ethical framework. What, then, is its ethical framework, and can it be satisfied? In this paper I will show that the NWMO's ethical and social framework is deeply flawed in two respects: (a) it fails to meet the minimum requirements of a code of ethics or ethical framework by offering only questions, and no principles or rules of conduct; and (b) if posed as principles or rules of conduct, some of its questions are unsatisfiable. In particular, I will show that one of its claims, namely that it seeks informed consent from individuals exposed to risk of harm from nuclear waste, cannot be satisfied as formulated. The result is that the NWMO's ethical framework is not, at present, ethically acceptable.

  2. Application of the BRAFO-tiered approach for benefit-risk assessment to case studies on natural foods.

    PubMed

    Watzl, Bernhard; Gelencsér, Eva; Hoekstra, Jeljer; Kulling, Sabine; Lydeking-Olsen, Eva; Rowland, Ian; Schilter, Benoît; van Klaveren, Jakob; Chiodini, Alessandro

    2012-11-01

    There is evidence that consumption of fish, especially oily fish, has substantial beneficial effects on health. In particular, an inverse relationship of oily fish intake to coronary heart disease incidence has been established. These beneficial effects are ascribed to fish oil components including long chain ω-3 polyunsaturated fatty acids. On the other hand, it should be noted that oily fish also contains hazardous substances such as dioxins, PCBs and methylmercury. Soy consumption has been associated with potential beneficial and adverse effects. The claimed benefits include reduced risk of cardiovascular disease, osteoporosis, and breast and prostate cancer, whereas potential adverse effects include impaired thyroid function, disruption of sex hormone levels, changes in reproductive function and increased breast cancer risk. These two cases of natural foods highlight the need to consider both risks and benefits in order to establish the net health impact associated with the consumption of specific food products. Within the Sixth Framework programme of the European Commission, the BRAFO project was funded to develop a framework that allows for the quantitative comparison of human health risks and benefits in relation to foods and food compounds. This paper describes the application of the developed framework to two natural foods, farmed salmon and soy protein. We conclude that the BRAFO methodology is highly applicable to natural foods. It will help benefit-risk managers in selecting the appropriate dietary recommendations for the population. Copyright © 2011 ILSI Europe. Published by Elsevier Ltd. All rights reserved.

  3. Sea Level Station Metadata for Tsunami Detection, Warning and Research

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Marra, J.; Kari, U. S.; Weinstein, S. A.; Kong, L.

    2007-12-01

    The devastating earthquake and tsunami of December 26, 2004 has greatly increased recognition of the need for water level data both from the coasts and the deep-ocean. In 2006, the National Oceanic and Atmospheric Administration (NOAA) completed a Tsunami Data Management Report describing the management of data required to minimize the impact of tsunamis in the United States. One of the major gaps defined in this report is the access to global coastal water level data. NOAA's National Geophysical Data Center (NGDC) and National Climatic Data Center (NCDC) are working cooperatively to bridge this gap. NOAA relies on a network of global data, acquired and processed in real-time to support tsunami detection and warning, as well as high-quality global databases of archived data to support research and advanced scientific modeling. In 2005, parties interested in enhancing the access and use of sea level station data united under the NOAA NCDC's Integrated Data and Environmental Applications (IDEA) Center's Pacific Region Integrated Data Enterprise (PRIDE) program to develop a distributed metadata system describing sea level stations (Kari et al., 2006; Marra et al., in press). This effort started with pilot activities in a regional framework and is targeted at tsunami detection and warning systems being developed by various agencies. It includes development of the components of a prototype sea level station metadata web service and accompanying Google Earth-based client application, which use an XML-based schema to expose, at a minimum, information in the NOAA National Weather Service (NWS) Pacific Tsunami Warning Center (PTWC) station database needed to use the PTWC's Tide Tool application. As identified in the Tsunami Data Management Report, the need also exists for long-term retention of the sea level station data. NOAA envisions that the retrospective water level data and metadata will also be available through web services, using an XML-based schema. Five high-priority metadata requirements identified at a water level workshop held at the XXIV IUGG Meeting in Perugia will be addressed: consistent, validated, and well defined numbers (e.g. amplitude); exact location of sea level stations; a complete record of sea level data stored in the archive; identifying high-priority sea level stations; and consistent definitions. NOAA's National Geophysical Data Center (NGDC) and co-located World Data Center for Solid Earth Geophysics (including tsunamis) would hold the archive of the sea level station data and distribute the standard metadata. Currently, NGDC is also archiving and distributing the DART buoy deep-ocean water level data and metadata in standards-based formats. Kari, Uday S., John J. Marra, and Stuart A. Weinstein, 2006. A Tsunami Focused Data Sharing Framework for Integration of Databases that Describe Water Level Station Specifications. AGU Fall Meeting, 2006. San Francisco, California. Marra, John J., Uday S. Kari, and Stuart A. Weinstein (in press). A Tsunami Detection and Warning-focused Sea Level Station Metadata Web Service. IUGG XXIV, July 2-13, 2007. Perugia, Italy.

  4. Vulnerability assessment of water resources - Translating a theoretical concept to an operational framework using systems thinking approach in a changing climate: Case study in Ogallala Aquifer

    NASA Astrophysics Data System (ADS)

    Anandhi, Aavudai; Kannan, Narayanan

    2018-02-01

    Water is an essential natural resource. Among many stressors, altered climate is exerting pressure on water resource systems, increasing its demand and creating a need for vulnerability assessments. The overall objective of this study was to develop a novel tool that can translate a theoretical concept (vulnerability of water resources (VWR)) to an operational framework, mainly under altered temperature and precipitation and, to a smaller extent, population change. The developed tool had three stages and utilized a novel systems thinking approach. Stage-1: Translating theoretical concept to characteristics identified from studies; Stage-2: Operationalizing characteristics to methodology in VWR; Stage-3: Utilizing the methodology for development of a conceptual modeling tool for VWR: WR-VISTA (Water Resource Vulnerability assessment conceptual model using Indicators selected by System's Thinking Approach). The specific novelties were: 1) The important characteristics in VWR were identified in Stage-1 (target system, system components, scale, level of detail, data source, frameworks, and indicator); 2) WR-VISTA combined two vulnerability assessment frameworks: the European Driver-Pressure-State-Impact-Response (DPSIR) framework and the Intergovernmental Panel on Climate Change (IPCC) framework; and 3) systems thinking approaches were used in VWR for indicator selection. The developed tool was demonstrated in Kansas (overlying the High Plains region/Ogallala Aquifer, considered the "breadbasket of the world"), using 26 indicators with intermediate level of detail. Our results indicate that the western part of the state is vulnerable from agricultural water use and the eastern part from urban water use. The developed tool can be easily replicated in other regions within and outside the US.

  5. Public Higher Education Performance Accountability Framework Report: Goal--College Readiness Measure: Levels in English and Mathematics. Commission Report 07-24

    ERIC Educational Resources Information Center

    California Postsecondary Education Commission, 2007

    2007-01-01

    As part of its work in developing a performance accountability framework for higher education, the Commission conducted an analysis of student performance on standardized tests at the high school and middle school levels. National test results show that California is behind most other states in giving its students a high school education of the…

  6. Titanium-based Organic Frameworks for Chemical Transformations

    EPA Science Inventory

    Metal–organic frameworks (MOFs) based on organic bridging ligands are a promising class of highly ordered porous materials with potential applications in catalysis, gas storage and photoelectric devices. The availability of external surface of the solid-state catalysts plays an ...

  7. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies of heart rate variability analysis and physiological interpretation as a marker of autonomic nervous system condition have been largely published at rest, but not so much during exercise. A methodological framework for heart rate variability (HRV) analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates change markedly between the standard fixed band and the proposed methodology. For medium and high levels of exercise and recovery, HF power increases by 20 to 40%. When cycling, HF power increases around 40% with respect to running, while CC power is around 20% stronger in running.
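
    The fixed-band versus respiration-guided contrast can be sketched as follows; the sampling rate, band widths, and variable names are illustrative assumptions, not the paper's exact parameters:

    ```python
    # Sketch: HF power from a Welch PSD, once with the standard fixed band
    # (0.15-0.40 Hz) and once with a band centred on the measured
    # respiratory frequency. `hrv` is assumed evenly resampled at `fs` Hz.
    import numpy as np
    from scipy.signal import welch

    def hf_power(hrv, fs, band):
        f, pxx = welch(hrv, fs=fs, nperseg=min(256, len(hrv)))
        sel = (f >= band[0]) & (f <= band[1])
        return np.trapz(pxx[sel], f[sel])   # integrate PSD over the band

    # fixed   = hf_power(hrv, 4.0, (0.15, 0.40))
    # centred = hf_power(hrv, 4.0, (f_resp - 0.125, f_resp + 0.125))
    ```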

  8. 2D-pattern matching image and video compression: theory, algorithms, and experiments.

    PubMed

    Alzina, Marc; Szpankowski, Wojciech; Grama, Ananth

    2002-01-01

    In this paper, we propose a lossy data compression framework based on an approximate two-dimensional (2D) pattern matching (2D-PMC) extension of the Lempel-Ziv (1977, 1978) lossless scheme. This framework forms the basis upon which higher level schemes relying on differential coding, frequency domain techniques, prediction, and other methods can be built. We apply our pattern matching framework to image and video compression and report on theoretical and experimental results. Theoretically, we show that the fixed database model used for video compression leads to suboptimal but computationally efficient performance. The compression ratio of this model is shown to tend to the generalized entropy. For image compression, we use a growing database model for which we provide an approximate analysis. The implementation of 2D-PMC is a challenging problem from the algorithmic point of view. We use a range of techniques and data structures such as k-d trees, generalized run length coding, adaptive arithmetic coding, and variable and adaptive maximum distortion level to achieve good compression ratios at high compression speeds. We demonstrate bit rates in the range of 0.25-0.5 bpp for high-quality images and data rates in the range of 0.15-0.5 Mbps for a baseline video compression scheme that does not use any prediction or interpolation. We also demonstrate that this asymmetric compression scheme is capable of extremely fast decompression making it particularly suitable for networked multimedia applications.
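
    The central primitive, approximate 2D block matching against a decoded "database" region, can be sketched in a few lines; the brute-force search below stands in for the paper's k-d-tree-accelerated search, and the block/database layout is an assumption:

    ```python
    # Sketch of approximate 2D pattern matching in the spirit of 2D-PMC:
    # find the database block whose MSE to the target block is within the
    # maximum distortion level, returning its offset for the encoder.
    import numpy as np

    def best_match(block, database, max_distortion):
        """Return ((row, col), mse) of the best matching block, or None."""
        bh, bw = block.shape
        best = None
        for r in range(database.shape[0] - bh + 1):
            for c in range(database.shape[1] - bw + 1):
                candidate = database[r:r+bh, c:c+bw].astype(float)
                mse = np.mean((candidate - block) ** 2)
                if mse <= max_distortion and (best is None or mse < best[1]):
                    best = ((r, c), mse)
        return best
    ```

    An encoder would emit the offset (and residual, if any) instead of the raw block whenever a match within the distortion level exists.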

  9. Standardized mappings--a framework to combine different semantic mappers into a standardized web-API.

    PubMed

    Neuhaus, Philipp; Doods, Justin; Dugas, Martin

    2015-01-01

    Automatic coding of medical terms is an important but highly complicated and laborious task. To compare and evaluate different strategies, a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized web-API is feasible. This framework can be easily enhanced due to its modular design.
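
    A client of such a standardized web-API might look like the following sketch; the endpoint path, parameter names, and response shape are hypothetical, since the abstract does not publish the actual query schema:

    ```python
    # Sketch of querying a standardized mapping web-API over HTTP/JSON.
    # URL, parameters, and response fields are hypothetical placeholders.
    import json
    import urllib.parse
    import urllib.request

    def map_term(base_url, term, mapper):
        query = urllib.parse.urlencode({"term": term, "mapper": mapper})
        with urllib.request.urlopen(f"{base_url}/map?{query}") as resp:
            return json.load(resp)  # e.g. [{"code": "...", "score": 0.93}, ...]

    # codes = map_term("http://localhost:8080/mapper", "hypertension", "similarity")
    ```

    Standardizing on HTTP and JSON is what lets differently implemented mappers be swapped behind the same client code.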

  10. A Novel Framework for Remote Sensing Image Scene Classification

    NASA Astrophysics Data System (ADS)

    Jiang, S.; Zhao, H.; Wu, W.; Tan, Q.

    2018-04-01

    High resolution remote sensing (HRRS) image scene classification aims to label an image with a specific semantic category. HRRS images contain more details of the ground objects and their spatial distribution patterns than low spatial resolution images. Scene classification can bridge the gap between low-level features and high-level semantics. It can be applied in urban planning, target detection and other fields. This paper proposes a novel framework for HRRS image scene classification that combines a convolutional neural network (CNN) and XGBoost, utilizing the CNN as feature extractor and XGBoost as classifier. The framework is then evaluated on two different HRRS image datasets: the UC-Merced dataset and the NWPU-RESISC45 dataset. It achieved satisfactory accuracies of 95.57% and 83.35%, respectively. These experimental results show that the framework is effective for remote sensing image classification. Furthermore, we believe this framework will be more practical for further HRRS scene classification, since it costs less time in the training stage.
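
    The CNN-as-extractor, XGBoost-as-classifier pipeline is straightforward to sketch; a pretrained Keras VGG16 stands in here for the paper's (unspecified) network, and images are assumed preloaded as a (N, 224, 224, 3) array with integer labels:

    ```python
    # Sketch of the two-stage pipeline: frozen CNN computes features,
    # gradient-boosted trees do the classification on top of them.
    import numpy as np
    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.applications.vgg16 import preprocess_input
    from xgboost import XGBClassifier

    def train_scene_classifier(images, labels):
        extractor = VGG16(weights="imagenet", include_top=False, pooling="avg")
        features = extractor.predict(preprocess_input(images.astype("float32")))
        clf = XGBClassifier()       # boosted trees train quickly on features
        clf.fit(features, labels)   # labels: 0-based integer class ids
        return extractor, clf
    ```

    Because only the tree ensemble is trained, the training stage is far cheaper than fine-tuning the whole CNN, which is the speed advantage the abstract points to.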

  11. A framework for activity detection in wide-area motion imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, Reid B; Ruggiero, Christy E; Morrison, Jack D

    2009-01-01

    Wide-area persistent imaging systems are becoming increasingly cost effective and now large areas of the earth can be imaged at relatively high frame rates (1-2 fps). The efficient exploitation of the large geo-spatial-temporal datasets produced by these systems poses significant technical challenges for image and video analysis and data mining. In recent years there has been significant progress made on stabilization, moving object detection and tracking, and automated systems now generate hundreds to thousands of vehicle tracks from raw data, with little human intervention. However, the tracking performance at this scale is unreliable and average track length is much smaller than the average vehicle route. This is a limiting factor for applications which depend heavily on track identity, i.e. tracking vehicles from their points of origin to their final destination. In this paper we propose and investigate a framework for wide-area motion imagery (WAMI) exploitation that minimizes the dependence on track identity. In its current form this framework takes noisy, incomplete moving object detection tracks as input, and produces a small set of activities (e.g. multi-vehicle meetings) as output. The framework can be used to focus and direct human users and additional computation, and suggests a path towards high-level content extraction by learning from the human-in-the-loop.
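
    A deliberately identity-light detector for the multi-vehicle-meeting example might look like the sketch below; the track format, radius, and thresholds are hypothetical, not the paper's:

    ```python
    # Sketch: flag a "meeting" whenever several distinct track ids are
    # simultaneously within a small radius, without needing long-lived
    # track identities. Input: (frame, track_id, x, y) detections.
    from collections import defaultdict
    from math import hypot

    def detect_meetings(points, radius=15.0, min_tracks=3):
        by_frame = defaultdict(list)
        for frame, tid, x, y in points:
            by_frame[frame].append((tid, x, y))
        meetings = []
        for frame, pts in sorted(by_frame.items()):
            for tid, x, y in pts:
                near = {t for t, px, py in pts if hypot(px - x, py - y) <= radius}
                if len(near) >= min_tracks:
                    meetings.append((frame, (x, y), sorted(near)))
                    break  # one report per frame is enough for triage
        return meetings
    ```

    Because the rule only needs co-located detections within a single frame, broken or swapped tracks degrade it far less than origin-to-destination tracking.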

  12. Adapting existing natural language processing resources for cardiovascular risk factors identification in clinical notes.

    PubMed

    Khalifa, Abdulrahman; Meystre, Stéphane

    2015-12-01

    The 2014 i2b2 natural language processing shared task focused on identifying cardiovascular risk factors such as high blood pressure, high cholesterol levels, obesity and smoking status among other factors found in health records of diabetic patients. In addition, the task involved detecting medications and time information associated with the extracted data. This paper presents the development and evaluation of a natural language processing (NLP) application conceived for this i2b2 shared task. For increased efficiency, the application's main components were adapted from two existing NLP tools implemented in the Apache UIMA framework: Textractor (for dictionary-based lookup) and cTAKES (for preprocessing and smoking status detection). The application achieved a final (micro-averaged) F1-measure of 87.5% on the final evaluation test set. Our attempt was mostly based on existing tools adapted with minimal changes and allowed for satisfactory performance with limited development efforts. Copyright © 2015 Elsevier Inc. All rights reserved.
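
    The dictionary-based lookup component can be sketched very simply; the term lists below are illustrative, not the i2b2 task's actual dictionaries, and a UIMA pipeline would wrap this logic in annotators:

    ```python
    # Sketch of dictionary-based risk-factor lookup over a clinical note.
    import re

    RISK_TERMS = {
        "hypertension": ["hypertension", "high blood pressure", "htn"],
        "hyperlipidemia": ["hyperlipidemia", "high cholesterol"],
        "obesity": ["obesity", "obese"],
    }

    def find_risk_factors(note: str) -> set:
        found = set()
        for factor, terms in RISK_TERMS.items():
            # Whole-word, case-insensitive match for each dictionary entry.
            if any(re.search(rf"\b{re.escape(t)}\b", note, re.I) for t in terms):
                found.add(factor)
        return found

    print(find_risk_factors("Pt with HTN and obesity; denies smoking."))
    # -> {'hypertension', 'obesity'}
    ```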

  13. HIV/AIDS Interventions in Bangladesh: What Can Application of a Social Exclusion Framework Tell Us?

    PubMed Central

    2009-01-01

    Bangladesh has maintained a low HIV prevalence (of less than 1%) despite multiple risk factors. However, recent serological surveillance data have reported very high levels of HIV infection among a subgroup of male injecting drug-users (IDUs). This suggests that an HIV/AIDS epidemic could be imminent in Bangladesh. Although biomedical and behavioural change projects are important, they do not address the root causes of observed risky behaviours among ‘high-risk’ groups. In Bangladesh, these groups include sex workers, IDUs, males who have sex with males, and the transgender population—hijra—who are all excluded groups. Using a social exclusion framework, this paper analyzed existing literature on HIV in Bangladesh to identify social, economic and legal forces that heighten the vulnerability of such excluded groups to HIV/AIDS. It found that poverty and bias against women are major exclusionary factors. The paper presents areas for research and for policy action so that the social exclusion of high-risk groups can be reduced, their rights protected, and an HIV epidemic averted. PMID:19761091

  14. HIV/AIDS interventions in Bangladesh: what can application of a social exclusion framework tell us?

    PubMed

    Khosla, Nidhi

    2009-08-01

    Bangladesh has maintained a low HIV prevalence (of less than 1%) despite multiple risk factors. However, recent serological surveillance data have reported very high levels of HIV infection among a subgroup of male injecting drug-users (IDUs). This suggests that an HIV/AIDS epidemic could be imminent in Bangladesh. Although biomedical and behavioural change projects are important, they do not address the root causes of observed risky behaviours among 'high-risk' groups. In Bangladesh, these groups include sex workers, IDUs, males who have sex with males, and the transgender population-hijra-who are all excluded groups. Using a social exclusion framework, this paper analyzed existing literature on HIV in Bangladesh to identify social, economic and legal forces that heighten the vulnerability of such excluded groups to HIV/AIDS. It found that poverty and bias against women are major exclusionary factors. The paper presents areas for research and for policy action so that the social exclusion of high-risk groups can be reduced, their rights protected, and an HIV epidemic averted.

  15. PROGame: A process framework for serious game development for motor rehabilitation therapy

    PubMed Central

    Jaume-i-Capó, Antoni; Moyà-Alcover, Biel

    2018-01-01

    Serious game development for rehabilitation therapy is becoming increasingly popular because of the motivational advantages that these types of applications provide. Consequently, the need for a common process framework for this category of software development has become increasingly evident. The goal is to guarantee that products are developed and validated by following a coherent and systematic method that leads to high-quality serious games. This paper introduces a new process framework for the development of serious games for motor rehabilitation therapy. We introduce the new model and demonstrate its application for the development of a serious game for the improvement of the balance and postural control of adults with cerebral palsy. The development of this application has been facilitated by two technological transfer contracts and is being exploited by two different organizations. According to clinical measurements, patients using the application improved from high fall risk to moderate fall risk. We believe that our development strategy can be useful not only for motor rehabilitation therapy, but also for the development of serious games in many other rehabilitation areas. PMID:29768472

  16. PROGame: A process framework for serious game development for motor rehabilitation therapy.

    PubMed

    Amengual Alcover, Esperança; Jaume-I-Capó, Antoni; Moyà-Alcover, Biel

    2018-01-01

    Serious game development for rehabilitation therapy is becoming increasingly popular because of the motivational advantages that these types of applications provide. Consequently, the need for a common process framework for this category of software development has become increasingly evident. The goal is to guarantee that products are developed and validated by following a coherent and systematic method that leads to high-quality serious games. This paper introduces a new process framework for the development of serious games for motor rehabilitation therapy. We introduce the new model and demonstrate its application for the development of a serious game for the improvement of the balance and postural control of adults with cerebral palsy. The development of this application has been facilitated by two technological transfer contracts and is being exploited by two different organizations. According to clinical measurements, patients using the application improved from high fall risk to moderate fall risk. We believe that our development strategy can be useful not only for motor rehabilitation therapy, but also for the development of serious games in many other rehabilitation areas.

  17. Solving Geometric Problems by Using Algebraic Representation for Junior High School Level 3 in Van Hiele at Geometric Thinking Level

    ERIC Educational Resources Information Center

    Suwito, Abi; Yuwono, Ipung; Parta, I. Nengah; Irawati, Santi; Oktavianingtyas, Ervin

    2016-01-01

    This study aims to determine the algebra ability of students at level 3 of van Hiele's geometric thinking levels, following Dindyal's framework (2007). Students were required to answer 10 multiple-choice algebra questions and then 15 multiple-choice geometry questions based on the van Hiele levels. The questions have been tested across levels…

  18. Nanoarchitectures for Metal-Organic Framework-Derived Nanoporous Carbons toward Supercapacitor Applications.

    PubMed

    Salunkhe, Rahul R; Kaneti, Yusuf Valentino; Kim, Jeonghun; Kim, Jung Ho; Yamauchi, Yusuke

    2016-12-20

    The future advances of supercapacitors depend on the development of novel carbon materials with optimized porous structures, high surface area, high conductivity, and high electrochemical stability. Traditionally, nanoporous carbons (NPCs) have been prepared by a variety of methods, such as templated synthesis, carbonization of polymer precursors, and physical and chemical activation. Inorganic solid materials such as mesoporous silica and zeolites have been successfully utilized as templates to prepare NPCs. However, the hard-templating methods typically involve several synthetic steps, such as preparation of the original templates, formation of carbon frameworks, and removal of the original templates. Therefore, these methods are not favorable for large-scale production. Metal-organic frameworks (MOFs) with high surface areas and large pore volumes have been studied over the years, and recently, enormous efforts have been made to utilize MOFs for electrochemical applications. However, their low conductivity and poor stability still present major challenges toward their practical applications in supercapacitors. MOFs can be used as precursors for the preparation of NPCs with high porosity. Their parent MOFs can be prepared with endless combinations of organic and inorganic constituents by simple coordination chemistry, and it is possible to control their porous architectures, pore volumes, surface areas, etc. These unique properties of MOF-derived NPCs make them highly attractive for many technological applications. Compared with carbonaceous materials prepared using conventional precursors, MOF-derived carbons have significant advantages in terms of a simple synthesis with inherent diversity affording precise control over porous architectures, pore volumes, and surface areas. In this Account, we will summarize our recent research developments on the preparation of three-dimensional (3-D) MOF-derived carbons for supercapacitor applications. This Account will be divided into three main sections: (1) useful background on carbon materials for supercapacitor applications, (2) the importance of MOF-derived carbons, and (3) potential future developments of MOF-derived carbons for supercapacitors. This Account focuses mostly on carbons derived from two types of MOFs, namely, zeolite imidazolate framework-8 (ZIF-8) and ZIF-67. By using examples from our previous works, we will show the uniqueness of these carbons for achieving high performance by control of the chemical reactions/conditions as well as proper utilization in asymmetric/symmetric supercapacitor configurations. This Account will promote further developments of MOF-derived multifunctional carbon materials with controlled porous architectures for optimization of their electrochemical performance toward supercapacitor applications.

  19. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. As a result, the security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform those of the in-house infrastructure.
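
    One standard way to build a confidentiality-preserving linear program, sketched below under the assumption of an affine change of variables (the paper's exact transformation may differ, and any x >= 0 constraints are assumed folded into A_ub):

    ```python
    # Sketch: mask min c.x s.t. A_ub x <= b_ub via x = M y with a random
    # invertible M. The cloud sees only (c M, A_ub M, b_ub) and returns y*;
    # the owner unmasks x* = M y* locally.
    import numpy as np
    from scipy.optimize import linprog

    def mask_and_solve(c, A_ub, b_ub, seed=0):
        c, A_ub = np.asarray(c, float), np.asarray(A_ub, float)
        M = np.random.default_rng(seed).standard_normal((len(c), len(c)))
        res = linprog(c @ M, A_ub=A_ub @ M, b_ub=b_ub,
                      bounds=(None, None))   # y is free; bounds live in A_ub
        return M @ res.x                     # unmask: x* = M y*
    ```

    Since min (cᵀM)y subject to (A M)y <= b with x = M y is equivalent to the original program, optimality is preserved while the raw cost vector and constraint matrix are never revealed.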

  20. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE PAGES

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...

    2017-04-24

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. As a result, the security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform those of the in-house infrastructure.

  1. A Framework for Assessing High School Students' Statistical Reasoning.

    PubMed

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  2. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    PubMed Central

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces aimed at programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents, as its main contributions, the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts. PMID:23666131

  3. An analysis of the concept of competence in individuals and social systems.

    PubMed

    Adler, P T

    1982-01-01

    This paper has attempted to present a unified conceptual model of positive mental health or competence from the perspective of individuals and from the perspective of social systems of varying degrees of complexity, such as families, organizations, and entire communities. It has provided a taxonomy of the elements of competence which allows the application of a common framework to the analysis of competence and to the planning and evaluation of competence building interventions at any level of social organization. Community Mental Health Centers can apply the model which has been presented in a number of different ways. At whatever level(s) the CMHCs' efforts are directed, the competence model presents a framework for analysis, intervention, and evaluation which enriches and expands upon more typical disorder-based formulations. By providing a framework which encompasses all levels of social organization, the model provides the conceptual tools for going beyond the individual and microsystem levels which have often constituted the boundaries of CMHC concern, and allows the CMHC to approach the organizational and community levels which must be encompassed by a competently comprehensive center. Application of the concept of competence to social organizations and to communities allows the CMHC to analyze and intervene at these levels. Finally, the concept of organizational competence separated into its various elements provides the CMHC with a tool for analyzing and evaluating its own environment and the competence of various aspects of its own functioning within that environment.

  4. Effect of Wind Farm Noise on Local Residents’ Decision to Adopt Mitigation Measures

    PubMed Central

    Botelho, Anabela; Bernardo, Carlos; Dias, Hernâni; Pinto, Lígia M. Costa

    2017-01-01

    Wind turbines’ noise is frequently pointed out as the reason for local communities’ objection to the installation of wind farms. The literature suggests that local residents feel annoyed by such noise and that, in many instances, this is significant enough to make them adopt noise-abatement interventions on their homes. Aiming at characterizing the relationship between wind turbine noise, annoyance, and mitigating actions, we propose a novel conceptual framework. The proposed framework posits that actual sound pressure levels of wind turbines determine individual homes’ noise-abatement decisions; in addition, the framework analyzes the role that self-reported annoyance, and perception of noise levels, plays on the relationship between actual noise pressure levels and those decisions. The application of this framework to a particular case study shows that noise perception and annoyance constitutes a link between the two. Importantly, however, noise also directly affects people’s decision to adopt mitigating measures, independently of the reported annoyance. PMID:28696404

  5. Manycast routing, modulation level and spectrum assignment over elastic optical networks

    NASA Astrophysics Data System (ADS)

    Luo, Xiao; Zhao, Yang; Chen, Xue; Wang, Lei; Zhang, Min; Zhang, Jie; Ji, Yuefeng; Wang, Huitao; Wang, Taili

    2017-07-01

    Manycast is a point to multi-point transmission framework that requires only a subset of destination nodes to be successfully reached. It is particularly applicable for dealing with large amounts of data simultaneously in bandwidth-hungry, dynamic and cloud-based applications. As traffic in these applications increases rapidly, elastic optical networks (EONs) can be relied on to achieve high-throughput manycast. With finer spectrum granularity, EONs allow flexible access to network spectrum and can efficiently provide exactly the spectrum resources that demands require. In this paper, we focus on the manycast routing, modulation level and spectrum assignment (MA-RMLSA) problem in EONs. Both EONs planning with static manycast traffic and EONs provisioning with dynamic manycast traffic are investigated. An integer linear programming (ILP) model is formulated to solve the MA-RMLSA problem in the static manycast scenario. A corresponding heuristic, the manycast routing, modulation level and spectrum assignment genetic algorithm (MA-RMLSA-GA), is then proposed to handle both static and dynamic manycast scenarios. The MA-RMLSA-GA jointly optimizes destination node selection, routing light-tree constitution, modulation level allocation and spectrum resource assignment, achieving an effective improvement in network performance. Simulation results reveal that the MA-RMLSA strategies offered by MA-RMLSA-GA differ only slightly from the optimal solutions provided by the ILP model in the static scenario. Moreover, the results demonstrate that MA-RMLSA-GA realizes a highly efficient MA-RMLSA strategy with the lowest blocking probability in the dynamic scenario compared with benchmark algorithms.

  6. Heat-Passing Framework for Robust Interpretation of Data in Networks

    PubMed Central

    Fang, Yi; Sun, Mengtian; Ramani, Karthik

    2015-01-01

    Researchers are regularly interested in interpreting the multipartite structure of data entities according to their functional relationships. Data is often heterogeneous with intricately hidden inner structure. With limited prior knowledge, researchers are likely to confront the problem of transforming this data into knowledge. We develop a new framework, called heat-passing, which exploits intrinsic similarity relationships within noisy and incomplete raw data, and constructs a meaningful map of the data. The proposed framework is able to rank, cluster, and visualize the data all at once. The novelty of this framework is derived from an analogy between the process of data interpretation and that of heat transfer, in which all data points contribute simultaneously and globally to reveal intrinsic similarities between regions of data, meaningful coordinates for embedding the data, and exemplar data points that lie at optimal positions for heat transfer. We demonstrate the effectiveness of the heat-passing framework for robustly partitioning the complex networks, analyzing the globin family of proteins and determining conformational states of macromolecules in the presence of high levels of noise. The results indicate that the methodology is able to reveal functionally consistent relationships in a robust fashion with no reference to prior knowledge. The heat-passing framework is very general and has the potential for applications to a broad range of research fields, for example, biological networks, social networks and semantic analysis of documents. PMID:25668316
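
    The physical analogy rests on a standard primitive: heat diffusion on a graph via the heat kernel exp(-tL). A minimal sketch of that primitive (the adjacency matrix and diffusion time are illustrative; the full framework builds ranking, clustering and embedding on top of this):

    ```python
    # Sketch: heat kernel on a small graph. Row i of H gives the heat each
    # node receives from a unit source at node i after time t, which encodes
    # intrinsic similarity between regions of the data.
    import numpy as np
    from scipy.linalg import expm

    def heat_kernel(adjacency, t=1.0):
        L = np.diag(adjacency.sum(axis=1)) - adjacency   # graph Laplacian
        return expm(-t * L)

    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    H = heat_kernel(A, t=0.5)
    print(H[0])  # heat received by each node from a source at node 0
    ```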

  7. Uniformity on the grid via a configuration framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terekhov, Igor V.; et al.

    2003-03-11

    As Grid permeates modern computing, Grid solutions continue to emerge and take shape. The actual Grid development projects continue to provide higher-level services that evolve in functionality and operate with application-level concepts which are often specific to the virtual organizations that use them. Physically, however, grids are comprised of sites whose resources are diverse and seldom project readily onto a grid's set of concepts. In practice, this also creates problems for site administrators who actually instantiate grid services. In this paper, we present a flexible, uniform framework to configure a grid site and its facilities, and otherwise describe the resources and services it offers. We start from a site configuration and instantiate services for resource advertisement, monitoring and data handling; we also apply our framework to hosting environment creation. We use our ideas in the Information Management part of the SAM-Grid project, a grid system which will deliver petabyte-scale data to hundreds of users. Our users are High Energy Physics experimenters who are scattered worldwide across dozens of institutions and always use facilities that are shared with other experiments as well as other grids. Our implementation represents information in the XML format and includes tools written in XQuery and XSLT.
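
    The configuration-driven instantiation pattern can be sketched as follows; the XML element and attribute names are hypothetical, since the SAM-Grid schema is not reproduced in the abstract:

    ```python
    # Sketch: drive service instantiation from an XML site configuration.
    import xml.etree.ElementTree as ET

    SITE_XML = """<site name="example-site">
      <service type="monitoring" host="mon01" port="8649"/>
      <service type="advertisement" host="adv01" port="2135"/>
    </site>"""

    def instantiate_services(xml_text):
        root = ET.fromstring(xml_text)
        for svc in root.findall("service"):
            # A real framework would launch/configure each service here.
            print(f"{root.get('name')}: start {svc.get('type')} "
                  f"on {svc.get('host')}:{svc.get('port')}")

    instantiate_services(SITE_XML)
    ```

    Keeping the site description declarative is what lets the same configuration feed advertisement, monitoring, and data-handling services uniformly.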

  8. A Research Study Using the Delphi Method to Define Essential Competencies for a High School Game Art and Design Course Framework at the National Level

    ERIC Educational Resources Information Center

    Mack, Nayo Corenus-Geneva

    2011-01-01

    This research study reports the findings of a Delphi study conducted to determine the essential competencies and objectives for a high school Game Art and Design course framework at the national level. The Delphi panel consisted of gaming, industry and educational experts from all over the world who were members of the International Game…

  9. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    NASA Astrophysics Data System (ADS)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. At the end of the study, the researcher illustrates that the application of the proposed framework resulted in an improved version of the framework. The improved version of the proposed framework is more connected to the topic of science learning, and is able to measure the change of discourse in higher resolution.

  10. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware that involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. Connecting ParaView to the Fermilab art framework is described and the capabilities it brings are discussed.

  11. Flexible real-time magnetic resonance imaging framework.

    PubMed

    Santos, Juan M; Wright, Graham A; Pauly, John M

    2004-01-01

    The extension of MR imaging to new applications has demonstrated the limitations of the architecture of current real-time systems. Traditional real-time implementations provide continuous acquisition of data and modification of basic sequence parameters on the fly. We have extended the concept of real-time MRI by designing a system that drives the examination from a real-time localizer and is then reconfigured for different imaging modes. Upon operator request or automatic feedback, the system can immediately generate a new pulse sequence or change fundamental aspects of the acquisition such as gradient waveforms, excitation pulses and scan planes. This framework has been implemented by connecting a data processing and control workstation to a conventional clinical scanner. Key components in the design of this framework are the data communication and control mechanisms, reconstruction algorithms optimized for real-time performance and adaptability, a flexible user interface and extensible user interaction. In this paper we describe the various components that comprise this system. Applications implemented in this framework include real-time catheter tracking embedded in high-frame-rate real-time imaging, and immediate switching between a real-time localizer and high-resolution volume imaging for coronary angiography.

  12. Effect of framework material and vertical misfit on stress distribution in implant-supported partial prosthesis under load application: 3-D finite element analysis.

    PubMed

    Bacchi, Ataís; Consani, Rafael Leonardo Xediek; Mesquita, Marcelo Ferraz; Dos Santos, Mateus Bertolini Fernandes

    2013-09-01

    This study evaluated the influence of framework material and vertical misfit on the stress created in an implant-supported partial prosthesis under load application. The posterior part of a severely resorbed jaw with a fixed partial prosthesis above two osseointegrated titanium implants, at the positions of the second premolar and second molar, was modeled using SolidWorks 2010 software. Finite element models were obtained by importing the solid model into ANSYS Workbench 11. The models were divided into 15 groups according to their prosthetic framework material (type IV gold alloy, silver-palladium alloy, commercially pure titanium, cobalt-chromium alloy or zirconia) and vertical misfit level (10 µm, 50 µm and 100 µm). After settlement of the prosthesis with closure of the misfit, simultaneous loads of 110 N vertical and 15 N horizontal were applied on the occlusal and lingual faces of each tooth, respectively. The data were evaluated using the maximum principal stress (framework, porcelain veneer and bone tissue) and von Mises stress (retention screw) provided by the software. Stiffer frameworks presented higher stress concentrations; however, these frameworks led to lower stresses in the porcelain veneer, the retention screw (for the 10 µm and 50 µm misfits) and the peri-implant bone tissues. Increasing the vertical misfit increased the stress values in all of the prosthetic structures and peri-implant bone tissues. The framework material and vertical misfit level had a relevant influence on the stresses in all of the structures evaluated.

  13. Refining the aggregate exposure pathway.

    PubMed

    Tan, Yu-Mei; Leonard, Jeremy A; Edwards, Stephen; Teeguarden, Justin; Egeghy, Peter

    2018-03-01

    Advancements in measurement technologies and modeling capabilities continue to result in an abundance of exposure information, adding to that currently in existence. However, fragmentation within the exposure science community acts as an obstacle to realizing the vision set forth in the National Research Council's report on exposure science in the 21st century: to consider exposures from source to dose, on multiple levels of integration, and to multiple stressors. The concept of an Aggregate Exposure Pathway (AEP) was proposed as a framework for organizing and integrating diverse exposure information that exists across numerous repositories and among multiple scientific fields. A workshop held in May 2016 followed the introduction of the AEP concept, allowing members of the exposure science community to provide extensive evaluation and feedback regarding the framework's structure, key components, and applications. The current work briefly introduces the topics discussed at the workshop and attempts to address key challenges involved in refining the framework. The resulting evolution of the AEP framework facilitates the acquisition, integration, organization, and transparent application and communication of exposure knowledge in a manner that is independent of its ultimate use, thereby enabling reuse of such information in many applications.

  14. InDEx: Open Source iOS and Android Software for Self-Reporting and Monitoring of Alcohol Consumption.

    PubMed

    Leightley, Daniel; Puddephatt, Jo-Anne; Goodwin, Laura; Rona, Roberto; Fear, Nicola T

    2018-03-23

    InDEx is a software package for reporting and monitoring alcohol consumption via a smartphone application. Consumption of alcohol is self-reported by the user, and the app provides a visual representation of drinking behaviour and offers feedback on consumption levels compared to the general population. InDEx is intended as an exemplar app; it operates as a standalone smartphone application and is highly customisable for a variety of research domains. InDEx is written in JavaScript using the cross-platform Ionic framework and is available under the liberal GNU General Public License (v3). The software is available from GitHub (https://github.com/DrDanL/index-app-public).

  15. InDEx: Open Source iOS and Android Software for Self-Reporting and Monitoring of Alcohol Consumption

    PubMed Central

    Leightley, Daniel; Puddephatt, Jo-Anne; Goodwin, Laura; Rona, Roberto; Fear, Nicola T.

    2018-01-01

    InDEx is a software package for reporting and monitoring alcohol consumption via a smartphone application. Consumption of alcohol is self-reported by the user, and the app provides a visual representation of drinking behaviour and offers feedback on consumption levels compared to the general population. InDEx is intended as an exemplar app; it operates as a standalone smartphone application and is highly customisable for a variety of research domains. InDEx is written in JavaScript using the cross-platform Ionic framework and is available under the liberal GNU General Public License (v3). The software is available from GitHub (https://github.com/DrDanL/index-app-public). PMID:29795769

  16. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, WebGL frameworks (e.g. Cesium.js, three.js) allow animation and motion effects to be attributed to 3D models. However, major GIS-based functionalities combined with the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

  17. Methane storage in metal-organic frameworks.

    PubMed

    He, Yabing; Zhou, Wei; Qian, Guodong; Chen, Banglin

    2014-08-21

    Natural gas (NG), whose main component is methane, is an attractive fuel for vehicular applications. Realization of safe, cheap and convenient means and materials for high-capacity methane storage can significantly facilitate the implementation of natural gas fuelled vehicles. The physisorption-based process involving porous materials offers an efficient storage methodology, and the emerging porous metal-organic frameworks have been explored as potential candidates because of their extraordinarily high porosities, tunable pore/cage sizes and easily immobilized functional sites. In this review, we provide an overview of the current status of metal-organic frameworks for methane storage.

  18. High-fidelity simulations of unsteady civil aircraft aerodynamics: stakes and perspectives. Application of zonal detached eddy simulation

    PubMed Central

    Deck, Sébastien; Gand, Fabien; Brunet, Vincent; Ben Khelil, Saloua

    2014-01-01

    This paper provides an up-to-date survey of the use of zonal detached eddy simulations (ZDES) for unsteady civil aircraft applications as a reflection on the stakes and perspectives of the use of hybrid methods in the framework of industrial aerodynamics. The issue of zonal or non-zonal treatment of turbulent flows for engineering applications is discussed. The ZDES method used in this article and based on a fluid problem-dependent zonalization is briefly presented. Some recent landmark achievements for conditions all over the flight envelope are presented, including low-speed (aeroacoustics of high-lift devices and landing gear), cruising (engine–airframe interactions), propulsive jets and off-design (transonic buffet and dive manoeuvres) applications. The implications of such results and remaining challenges in a more global framework are further discussed. PMID:25024411

  19. Cyberinfrastructure and Scientific Collaboration: Application of a Virtual Team Performance Framework with Potential Relevance to Education. WCER Working Paper No. 2010-12

    ERIC Educational Resources Information Center

    Kraemer, Sara; Thorn, Christopher A.

    2010-01-01

    The purpose of this exploratory study was to identify and describe some of the dimensions of scientific collaborations using high throughput computing (HTC) through the lens of a virtual team performance framework. A secondary purpose was to assess the viability of using a virtual team performance framework to study scientific collaborations using…

  20. Multimodal characterization of the semantic N400 response within a rapid evaluation brain vital sign framework.

    PubMed

    Ghosh Hajra, Sujoy; Liu, Careesa C; Song, Xiaowei; Fickling, Shaun D; Cheung, Teresa P L; D'Arcy, Ryan C N

    2018-06-04

    For nearly four decades, the N400 has been an important brainwave marker of semantic processing. It can be recorded non-invasively from the scalp using electrical and/or magnetic sensors, but largely within the restricted domain of research laboratories specialized to run specific N400 experiments. However, there is increasing evidence of significant clinical utility for the N400 in neurological evaluation, particularly at the individual level. To enable clinical applications, we recently reported a rapid evaluation framework known as "brain vital signs" that successfully incorporated the N400 response as one of the core components for cognitive function evaluation. The current study characterized the rapidly evoked N400 response to demonstrate that it shares consistent features with traditional N400 responses acquired in research laboratory settings, thereby enabling its translation into brain vital signs applications. Data were collected from 17 healthy individuals using magnetoencephalography (MEG) and electroencephalography (EEG), with analysis of sensor-level effects as well as evaluation of brain sources. Individual-level N400 responses were classified using machine learning to determine the percentage of participants in whom the response was successfully detected. The N400 response was observed in both M/EEG modalities, showing significant differences between the incongruent and congruent conditions in the expected time range (p < 0.05). Also as expected, N400-related brain activity was observed in the temporal and inferior frontal cortical regions, with typical left-hemispheric asymmetry. Classification robustly confirmed the N400 effect at the individual level with high accuracy (89%), sensitivity (0.88) and specificity (0.90). The brain vital sign N400 characteristics were highly consistent with features of previously reported N400 responses acquired in traditional laboratory-based experiments. These results provide important evidence supporting clinical translation of the rapidly acquired N400 response as a potential tool for assessments of higher cognitive functions.
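
    As a rough illustration of the machine-learning step reported above, the sketch below trains a linear discriminant on synthetic per-trial ERP features and cross-validates it. This is a minimal sketch with invented data, feature definitions and effect sizes, not the study's actual pipeline.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_features = 80, 16     # e.g. mean N400-window amplitude per sensor
        congruent = rng.normal(0.0, 1.0, (n_trials, n_features))
        incongruent = rng.normal(-0.6, 1.0, (n_trials, n_features))  # more negative N400

        X = np.vstack([congruent, incongruent])
        y = np.r_[np.zeros(n_trials), np.ones(n_trials)]   # 0 = congruent, 1 = incongruent

        scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
        print(f"cross-validated single-trial accuracy: {scores.mean():.2f}")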

  1. Ultrahigh Surface Area Three-Dimensional Porous Graphitic Carbon from Conjugated Polymeric Molecular Framework

    PubMed Central

    2015-01-01

    Porous graphitic carbon is essential for many applications such as energy storage devices, catalysts, and sorbents. However, current graphitic carbons are limited by low conductivity, low surface area, and ineffective pore structure. Here we report a scalable synthesis of porous graphitic carbons using a conjugated polymeric molecular framework as precursor. The multivalent cross-linker and rigid conjugated framework help to maintain micro- and mesoporous structures, while promoting graphitization during carbonization and chemical activation. This unique design results in a class of highly graphitic carbons at temperatures as low as 800 °C with record-high surface area (4073 m2 g-1), large pore volume (2.26 cm3 g-1), and hierarchical pore architecture. Such carbons simultaneously exhibit electrical conductivity more than 3 times that of activated carbons, very high electrochemical activity at high mass loading, and high stability, as demonstrated by supercapacitors and lithium-sulfur batteries with excellent performance. Moreover, the synthesis can be readily tuned to make a broad range of graphitic carbons with desired structures and compositions for many applications. PMID:27162953

  2. Ultrahigh Surface Area Three-Dimensional Porous Graphitic Carbon from Conjugated Polymeric Molecular Framework

    DOE PAGES

    To, John W. F.; Chen, Zheng; Yao, Hongbin; ...

    2015-05-18

    Porous graphitic carbon is essential for many applications such as energy storage devices, catalysts, and sorbents. However, current graphitic carbons are limited by low conductivity, low surface area, and ineffective pore structure. Here we report a scalable synthesis of porous graphitic carbons using a conjugated polymeric molecular framework as precursor. The multivalent cross-linker and rigid conjugated framework help to maintain micro- and mesoporous structures, while promoting graphitization during carbonization and chemical activation. This unique design results in a class of highly graphitic carbons at temperatures as low as 800 °C with record-high surface area (4073 m2 g-1), large pore volume (2.26 cm3 g-1), and hierarchical pore architecture. Such carbons simultaneously exhibit electrical conductivity more than 3 times that of activated carbons, very high electrochemical activity at high mass loading, and high stability, as demonstrated by supercapacitors and lithium-sulfur batteries with excellent performance. Moreover, the synthesis can be readily tuned to make a broad range of graphitic carbons with desired structures and compositions for many applications.

  3. A Framework for Assessing High School Students' Statistical Reasoning

    PubMed Central

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter consisted of describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework constitutes a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from the initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in a second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091

  4. Regional Management Units for Marine Turtles: A Novel Framework for Prioritizing Conservation and Research across Multiple Scales

    PubMed Central

    Wallace, Bryan P.; DiMatteo, Andrew D.; Hurley, Brendan J.; Finkbeiner, Elena M.; Bolten, Alan B.; Chaloupka, Milani Y.; Hutchinson, Brian J.; Abreu-Grobois, F. Alberto; Amorocho, Diego; Bjorndal, Karen A.; Bourjea, Jerome; Bowen, Brian W.; Dueñas, Raquel Briseño; Casale, Paolo; Choudhury, B. C.; Costa, Alice; Dutton, Peter H.; Fallabrino, Alejandro; Girard, Alexandre; Girondot, Marc; Godfrey, Matthew H.; Hamann, Mark; López-Mendilaharsu, Milagros; Marcovaldi, Maria Angela; Mortimer, Jeanne A.; Musick, John A.; Nel, Ronel; Pilcher, Nicolas J.; Seminoff, Jeffrey A.; Troëng, Sebastian; Witherington, Blair; Mast, Roderic B.

    2010-01-01

    Background Resolving threats to widely distributed marine megafauna requires definition of the geographic distributions of both the threats as well as the population unit(s) of interest. In turn, because individual threats can operate on varying spatial scales, their impacts can affect different segments of a population of the same species. Therefore, integration of multiple tools and techniques — including site-based monitoring, genetic analyses, mark-recapture studies and telemetry — can facilitate robust definitions of population segments at multiple biological and spatial scales to address different management and research challenges. Methodology/Principal Findings To address these issues for marine turtles, we collated all available studies on marine turtle biogeography, including nesting sites, population abundances and trends, population genetics, and satellite telemetry. We georeferenced this information to generate separate layers for nesting sites, genetic stocks, and core distributions of population segments of all marine turtle species. We then spatially integrated this information from fine- to coarse-spatial scales to develop nested envelope models, or Regional Management Units (RMUs), for marine turtles globally. Conclusions/Significance The RMU framework is a solution to the challenge of how to organize marine turtles into units of protection above the level of nesting populations, but below the level of species, within regional entities that might be on independent evolutionary trajectories. Among many potential applications, RMUs provide a framework for identifying data gaps, assessing high diversity areas for multiple species and genetic stocks, and evaluating conservation status of marine turtles. Furthermore, RMUs allow for identification of geographic barriers to gene flow, and can provide valuable guidance to marine spatial planning initiatives that integrate spatial distributions of protected species and human activities. In addition, the RMU framework — including maps and supporting metadata — will be an iterative, user-driven tool made publicly available in an online application for comments, improvements, download and analysis. PMID:21253007

  5. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications, or (c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins for Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation, and a simple interface for extension to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
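
    For readers unfamiliar with the exact stochastic simulation that FERN provides, the standalone sketch below implements Gillespie's direct method for a toy reversible reaction A + B <-> C. It does not use FERN's Java API, and the rate constants and initial counts are arbitrary.

        import numpy as np

        def gillespie(x, propensities, stoich, t_end, rng):
            """x: species counts; propensities(x) -> reaction rates; stoich: count changes."""
            t, times, states = 0.0, [0.0], [x.copy()]
            while t < t_end:
                a = propensities(x)
                a0 = a.sum()
                if a0 == 0.0:          # no reaction can fire any more
                    break
                t += rng.exponential(1.0 / a0)                # waiting time to next event
                x = x + stoich[rng.choice(len(a), p=a / a0)]  # pick which reaction fires
                times.append(t)
                states.append(x.copy())
            return np.array(times), np.array(states)

        k_f, k_r = 0.001, 0.1                                # arbitrary rate constants
        propensities = lambda x: np.array([k_f * x[0] * x[1], k_r * x[2]])
        stoich = np.array([[-1, -1, +1],                     # A + B -> C
                           [+1, +1, -1]])                    # C -> A + B
        t, s = gillespie(np.array([300, 200, 0]), propensities, stoich,
                         50.0, np.random.default_rng(1))
        print(f"{len(t)} events simulated; final counts A, B, C = {s[-1]}")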

  6. Grassland biodiversity can pay.

    PubMed

    Binder, Seth; Isbell, Forest; Polasky, Stephen; Catford, Jane A; Tilman, David

    2018-04-10

    The biodiversity-ecosystem functioning (BEF) literature provides strong evidence of the biophysical basis for the potential profitability of greater diversity but does not address questions of optimal management. BEF studies typically focus on the ecosystem outputs produced by randomly assembled communities that differ only in their biodiversity levels, measured by indices such as species richness. Landholders, however, do not randomly select species to plant; they choose particular species that collectively maximize profits. As such, their interest is not in comparing the average performance of randomly assembled communities at each level of biodiversity but rather in comparing the best-performing communities at each diversity level. Assessing the best-performing mixture requires detailed accounting of species' identities and relative abundances. It also requires accounting for the financial cost of individual species' seeds, and the economic value of changes in the quality, quantity, and variability of the species' collective output, something that existing multifunctionality indices fail to do. This study presents an assessment approach that integrates the relevant factors into a single, coherent framework. It uses ecological production functions to inform an economic model consistent with the utility-maximizing decisions of a potentially risk-averse private landowner. We demonstrate the salience and applicability of the framework using data from an experimental grassland to estimate production relationships for hay and carbon storage. For that case, our results suggest that even a risk-neutral, profit-maximizing landowner would favor a highly diverse mix of species, with optimal species richness falling between the low levels currently found in commercial grasslands and the high levels found in natural grasslands.
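
    The paper's central contrast, the best-performing rather than the average randomly assembled mixture at each richness level, can be sketched in a few lines. All monoculture values, seed costs and the complementarity bonus below are invented numbers, not the study's estimated production functions.

        from itertools import combinations

        # (monoculture hay value, seed cost) per species, arbitrary units
        species = {"A": (3.0, 1.0), "B": (2.5, 0.6), "C": (2.0, 0.4), "D": (1.5, 0.2)}

        def profit(names):
            values = [species[n][0] for n in names]
            costs = [species[n][1] for n in names]
            yield_value = max(values) * (1.0 + 0.15 * (len(names) - 1))  # assumed bonus
            return yield_value - sum(costs)

        for k in range(1, len(species) + 1):
            best = max(combinations(species, k), key=profit)
            print(f"richness {k}: best mixture {best}, profit {profit(best):.2f}")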

  7. Metal-organic frameworks in chromatography.

    PubMed

    Yusuf, Kareem; Aqel, Ahmad; ALOthman, Zeid

    2014-06-27

    Metal-organic frameworks (MOFs) emerged approximately two decades ago and are the youngest class of porous materials. Despite their short existence, MOFs are finding applications in a variety of fields because of their outstanding chemical and physical properties. This review article focuses on the applications of MOFs in chromatography, including high-performance liquid chromatography (HPLC), gas chromatography (GC), and other chromatographic techniques. The use of MOFs in chromatography has already had a significant impact; however, MOFs remain less commonly used in chromatography than in other applications, and the number of MOF materials explored in chromatographic applications is limited.

  8. Quantum Accelerators for High-performance Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Britt, Keith A.; Mohiyaddin, Fahd A.

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems in developing architectures for hybrid high-performance computing systems and realizing software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  9. A framework for the selection and ensemble development of flood vulnerability models

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Schröter, Kai; Kreibich, Heidi; Martina, Mario

    2017-04-01

    Effective understanding and management of flood risk requires comprehensive risk assessment studies that consider not only the hazard component, but also the impacts that the phenomena may have on the built environment, economy and society. This integrated approach has gained importance over recent decades, and with it so has the scientific attention given to flood vulnerability models describing the relationships between flood intensity metrics and damage to physical assets, also known as flood loss models. Despite considerable progress in this field, many challenges persist. Flood damage mechanisms are complex and depend on multiple variables, which can have different degrees of importance depending on the application setting. In addition, data required for the development and validation of such models tend to be scarce, particularly in data poor regions. These issues are reflected in the large amount of flood vulnerability models that are available in the literature today, as well as in their high heterogeneity: they are built with different modelling approaches, in different geographic contexts, utilizing different explanatory variables, and with varying levels of complexity. Notwithstanding recent developments in this area, uncertainty remains high, and large disparities exist among models. For these reasons, identifying which model or models, given their properties, are appropriate for a given context is not straightforward. In the present study, we propose a framework that guides the structured selection of flood vulnerability models and enables ranking them according to their suitability for a certain application, based on expert judgement. The approach takes advantage of current state of the art and most up-to-date knowledge on flood vulnerability processes. Given the heterogeneity and uncertainty currently present in flood vulnerability models, we propose the use of a model ensemble. With this in mind, the proposed approach is based on a weighting scheme within a logic-tree framework that enables the generation of such ensembles in a logically consistent manner. We test and discuss the results by applying the framework to the case study of the 2002 floods along the Mulde River in Germany. Applications of individual models and model ensembles are compared and discussed.
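
    A minimal sketch of the weighted-ensemble idea discussed above: candidate depth-damage curves receive expert weights on the branches of a logic tree and are combined into one ensemble curve. The three candidate curves and the weights are invented for illustration.

        import numpy as np

        depth = np.linspace(0.0, 3.0, 7)                      # inundation depth [m]
        models = {                                            # damage fraction in [0, 1]
            "linear":  np.clip(depth / 3.0, 0.0, 1.0),
            "root":    np.clip(np.sqrt(depth / 3.0), 0.0, 1.0),
            "sigmoid": 1.0 / (1.0 + np.exp(-3.0 * (depth - 1.5))),
        }
        weights = {"linear": 0.2, "root": 0.5, "sigmoid": 0.3}  # expert weights, sum to 1

        ensemble = sum(weights[name] * damage for name, damage in models.items())
        for d, frac in zip(depth, ensemble):
            print(f"depth {d:3.1f} m -> ensemble damage fraction {frac:.2f}")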

  10. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  11. Modeling the Impact of Alternative Immunization Strategies: Using Matrices as Memory Lanes

    PubMed Central

    Alonso, Wladimir J.; Rabaa, Maia A.; Giglio, Ricardo; Miller, Mark A.; Schuck-Paim, Cynthia

    2015-01-01

    Existing modeling approaches are divided between a focus on the constitutive (micro) elements of systems or on higher (macro) organization levels. Micro-level models enable consideration of individual histories and interactions, but can be unstable and subject to cumulative errors. Macro-level models focus on average population properties, but may hide relevant heterogeneity at the micro-scale. We present a framework that integrates both approaches through the use of temporally structured matrices that can take large numbers of variables into account. Matrices are composed of several bidimensional (time×age) grids, each representing a state (e.g. physiological, immunological, socio-demographic). Time and age are primary indices linking grids. These matrices preserve the entire history of all population strata and enable the use of historical events, parameters and states dynamically in the modeling process. This framework is applicable across fields, but particularly suitable to simulate the impact of alternative immunization policies. We demonstrate the framework by examining alternative strategies to accelerate measles elimination in 15 developing countries. The model recaptured long-endorsed policies in measles control, showing that where a single routine measles-containing vaccine is employed with low coverage, any improvement in coverage is more effective than a second dose. It also identified an opportunity to save thousands of lives in India at attractively low costs through the implementation of supplementary immunization campaigns. The flexibility of the approach presented enables estimating the effectiveness of different immunization policies in highly complex contexts involving multiple and historical influences from different hierarchical levels. PMID:26509976
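
    The time-by-age bookkeeping described above can be sketched with one two-dimensional array per state, so every cohort's history stays addressable. The cohort sizes, vaccination age and coverage below are invented, and a real model would add mortality, waning immunity and transmission.

        import numpy as np

        T, A = 10, 60                        # time steps and age classes (months)
        susceptible = np.zeros((T, A))
        immune = np.zeros((T, A))
        susceptible[0, :] = 1000.0           # invented initial cohort sizes

        vaccination_age, coverage = 9, 0.8   # assumed routine dose at 9 months
        for t in range(1, T):
            susceptible[t, 1:] = susceptible[t - 1, :-1]   # each cohort ages one class
            immune[t, 1:] = immune[t - 1, :-1]
            moved = coverage * susceptible[t, vaccination_age]
            susceptible[t, vaccination_age] -= moved       # routine dose immunizes
            immune[t, vaccination_age] += moved

        # The full (time x age) history remains available, e.g. immune[t, a] for any t, a.
        print("immune population at final step:", immune[-1].sum())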

  12. Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications

    PubMed Central

    Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui

    2017-01-01

    Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to biological or clinical problems that fail to be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require the development of new analytical frameworks to extract new biology. In this review article, we highlight a few biological and clinical applications in which microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interactions by creating well-controlled on-chip microenvironments, capturing high-resolution snapshots of immune system functions in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819

  13. Experimental demonstration of OpenFlow-enabled media ecosystem architecture for high-end applications over metro and core networks.

    PubMed

    Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra

    2013-02-25

    In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be met with high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.

  14. A Multifactorial Approach to Sport-Related Concussion Prevention and Education: Application of the Socioecological Framework.

    PubMed

    Register-Mihalik, Johna; Baugh, Christine; Kroshus, Emily; Y Kerr, Zachary; Valovich McLeod, Tamara C

    2017-03-01

    To offer an overview of sport-related concussion (SRC) prevention and education strategies in the context of the socioecological framework (SEF). Athletic trainers (ATs) will understand the many factors that interact to influence SRC prevention and the implications of these interactions for effective SRC education. Concussion is a complex injury that is challenging to identify and manage, particularly when athletes fail to disclose symptoms to their health care providers. Education is one strategy for increasing disclosure. However, limited information addresses how ATs can integrate the many factors that may influence the effectiveness of SRC education into their specific settings. Public health models provide an example through the SEF, which highlights the interplay among various levels of society and sport that can facilitate SRC prevention strategies, including education. For ATs to develop appropriate SRC prevention strategies, a framework for application is needed. A growing body of information concerning SRC prevention indicates that knowledge alone is insufficient to change concussion-related behaviors. The SEF allows this information to be considered at levels such as policy and societal, community, interpersonal (relationships), and intrapersonal (athlete). The use of such a framework will facilitate more comprehensive SRC prevention efforts that can be applied in all athletic training practice settings. Clinical applications: ATs can use this information as they plan SRC prevention strategies in their specific settings. This approach will aid in addressing the layers of complexity that exist when developing a concussion-management policy and plan.

  15. Three-dimensional polypyrrole-derived carbon nanotube framework for dye adsorption and electrochemical supercapacitor

    NASA Astrophysics Data System (ADS)

    Xin, Shengchang; Yang, Na; Gao, Fei; Zhao, Jing; Li, Liang; Teng, Chao

    2017-08-01

    Three-dimensional carbon nanotube frameworks have been prepared via pyrolysis of polypyrrole nanotube aerogels that were synthesized by simultaneous self-degraded template synthesis and hydrogel assembly, followed by freeze-drying. The microstructure and composition of the materials were investigated by thermal gravimetric analysis, Raman spectroscopy, X-ray photoelectron spectroscopy, transmission electron microscopy, and specific surface area analysis. The results confirm the formation of three-dimensional carbon nanotube frameworks with low density, good mechanical properties, and high specific surface area. Compared with the polypyrrole aerogel precursor, the as-prepared three-dimensional carbon nanotube frameworks exhibit outstanding adsorption capacity towards organic dyes. Moreover, electrochemical tests show that the products possess high specific capacitance, good rate capability and excellent cycling performance, with no capacitance loss over 1000 cycles. These characteristics collectively indicate the potential of the three-dimensional polypyrrole-derived carbon nanotube framework as a promising macroscopic material for applications in environmental remediation and energy storage.

  16. Smart City Energy Interconnection Technology Framework Preliminary Research

    NASA Astrophysics Data System (ADS)

    Zheng, Guotai; Zhao, Baoguo; Zhao, Xin; Li, Hao; Huo, Xianxu; Li, Wen; Xia, Yu

    2018-01-01

    To improve urban energy efficiency, raise the absorptive ratio of new and renewable energy sources, and reduce environmental pollution, an energy supply and consumption technology framework matched to future energy constraints and to the applicable technology level needs to be studied. Relative to a traditional energy supply system, an "Energy Internet" technical framework based on advanced information technology can exploit the advantages of integrated energy application and load-side interaction, optimize energy supply and consumption as a whole, and improve the overall efficiency of energy utilization.

  17. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
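
    A minimal ProjectQ program, using the framework's standard public API, illustrates the high-level style described above; the compiler lowers this circuit to the gate set of whichever backend the MainEngine is constructed with (the built-in simulator by default).

        from projectq import MainEngine
        from projectq.ops import All, CNOT, H, Measure

        eng = MainEngine()                  # default: compile for the simulator
        qureg = eng.allocate_qureg(2)
        H | qureg[0]                        # put qubit 0 into superposition
        CNOT | (qureg[0], qureg[1])         # entangle -> Bell state
        All(Measure) | qureg
        eng.flush()                         # push the circuit through the compiler
        print("measured:", [int(q) for q in qureg])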

  18. Rotation Covariant Image Processing for Biomedical Applications

    PubMed Central

    Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences. PMID:23710255

  19. Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities

    NASA Astrophysics Data System (ADS)

    Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi

    2017-04-01

    Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years. Recently, much relevant research has focused on the development of integrated solutions: collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM systems as information technology (IT) enablers to support enterprise integrated product development and concurrent engineering principles. This article draws on three main organisation theory applications in positioning its assumptions. It proposes a feasible industry-specific framework not currently included within the SCOR model's level four (4) implementation level, nor in other existing SCM integration reference models such as the MIT Process Handbook's Process Interchange Format (PIF) or the TOVE project, which could also be replicated in other supply chains. The wider contribution of this paper, however, is a complementary framework proposed as an extension to the SCC's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to the main data collected from a qualitative empirical real-life industrial pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. This research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure. Furthermore, social network theory analysis is employed, in a triangulation approach with statistical correlation analysis, to assess the frequency, importance, level of collaboration, mutual trust, and roles and responsibilities within the enterprise SCM network for systems product development (PD) design teams' technical communication, alongside extensive literature reviews.

  20. Spatial rule-based assessment of habitat potential to predict impact of land use changes on biodiversity at municipal scale.

    PubMed

    Scolozzi, Rocco; Geneletti, Davide

    2011-03-01

    In human-dominated landscapes, ecosystems are under increasing pressure from urbanization and infrastructure development. In Alpine valleys, remnant natural areas are increasingly affected by habitat fragmentation and loss. In these contexts, there is a growing risk of local extinction for wildlife populations; hence, assessing the consequences of proposed land use changes on biodiversity is extremely important. The article presents a methodology to assess the impacts of land use changes on target species at a local scale. The approach relies on the application of ecological profiles of target species for habitat potential (HP) assessment, using high-resolution GIS data within a multiple-level framework. The HP, in this framework, is based on a species-specific assessment of the suitability of a site, as well as of the surrounding areas. This assessment is performed through spatial rules, structured as sets of queries on landscape objects. We show that by considering spatial dependencies in habitat assessment it is possible to better quantify the impacts of local-level land use changes on habitats.
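
    A toy version of such a rule set is sketched below: the habitat potential of a landscape object combines a site-level rule with neighbourhood rules. The attributes, land cover classes and thresholds are invented and stand in for the species-specific ecological profiles used in the study.

        # Invented landscape objects with site and neighbourhood attributes.
        patches = [
            {"id": 1, "cover": "forest", "area_ha": 12.0, "dist_to_forest_m": 0},
            {"id": 2, "cover": "meadow", "area_ha": 3.5, "dist_to_forest_m": 150},
            {"id": 3, "cover": "urban", "area_ha": 8.0, "dist_to_forest_m": 600},
        ]

        def habitat_potential(patch):
            if patch["cover"] not in ("forest", "meadow"):   # site-level rule
                return 0
            score = 1
            if patch["area_ha"] >= 10.0:                     # minimum-area rule
                score += 1
            if patch["dist_to_forest_m"] <= 200:             # neighbourhood rule
                score += 1
            return score                                     # 0 (unsuitable) .. 3 (high)

        for patch in patches:
            print(f"patch {patch['id']}: habitat potential {habitat_potential(patch)}")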

  1. Chemical principles underpinning the performance of the metal-organic framework HKUST-1.

    PubMed

    Hendon, Christopher H; Walsh, Aron

    2015-07-15

    A common feature of multi-functional metal-organic frameworks is a metal dimer in the form of a paddlewheel, as found in the structure of Cu3(btc)2 (HKUST-1). The HKUST-1 framework demonstrates exceptional gas storage, sensing and separation, catalytic activity and, in recent studies, unprecedented ionic and electrical conductivity. These results are a promising step towards the real-world application of metal-organic materials. In this perspective, we discuss progress in the understanding of the electronic, magnetic and physical properties of HKUST-1, representative of the larger family of Cu···Cu containing metal-organic frameworks. We highlight the chemical interactions that give rise to its favourable properties, and which make this material well suited to a range of technological applications. From this analysis, we postulate key design principles for tailoring novel high-performance hybrid frameworks.

  2. Chemical principles underpinning the performance of the metal–organic framework HKUST-1

    PubMed Central

    Hendon, Christopher H.

    2015-01-01

    A common feature of multi-functional metal–organic frameworks is a metal dimer in the form of a paddlewheel, as found in the structure of Cu3(btc)2 (HKUST-1). The HKUST-1 framework demonstrates exceptional gas storage, sensing and separation, catalytic activity and, in recent studies, unprecedented ionic and electrical conductivity. These results are a promising step towards the real-world application of metal–organic materials. In this perspective, we discuss progress in the understanding of the electronic, magnetic and physical properties of HKUST-1, representative of the larger family of Cu···Cu containing metal–organic frameworks. We highlight the chemical interactions that give rise to its favourable properties, and which make this material well suited to a range of technological applications. From this analysis, we postulate key design principles for tailoring novel high-performance hybrid frameworks. PMID:28706713

  3. 2007 Mississippi Curriculum Framework: Secondary Technology Applications. (Program CIP: 21.0101 - Technology Applications)

    ERIC Educational Resources Information Center

    Fava, David; Gunkel, Andy; Hood, Jennifer; Mason, Debra; Walker, Jim

    2007-01-01

    Secondary vocational-technical education programs in Mississippi are faced with many challenges resulting from sweeping educational reforms at the national and state levels. Schools and teachers are increasingly being held accountable for providing true learning activities to every student in the classroom. This accountability is measured through…

  4. Integrating Real-World Numeracy Applications and Modelling into Vocational Courses

    ERIC Educational Resources Information Center

    Hall, Graham

    2014-01-01

    Practitioner research is in progress at a Further Education college to improve the motivation of vocational students for numeracy and problem solving. A framework proposed by Tang, Sui, & Wang (2003) has been adapted for use in courses. Five levels are identified for embedding numeracy applications and modelling into vocational studies:…

  5. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs, and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. Highlights: sustainability assessment frameworks (SAFs) were analyzed for resilience coverage; hazard resistance and mitigation do not figure prominently in the intent of SAFs; approximately 75% of the SAFs analyzed address three or fewer hazards; the lack of economic measures within SAFs could impact resilience and sustainability; resilience measures for flood hazards are not consistently included in SAFs.

  6. Semiconductive 3-D haloplumbate framework hybrids with high color rendering index white-light emission.

    PubMed

    Wang, Guan-E; Xu, Gang; Wang, Ming-Sheng; Cai, Li-Zhen; Li, Wen-Hua; Guo, Guo-Cong

    2015-12-01

    Single-component white-light materials may create great opportunities for novel lighting applications and display systems; however, their reported color rendering index (CRI) values, one of the key parameters for lighting, are less than 90, which does not satisfy the demands of color-critical upmarket applications such as photography, cinematography, and art galleries. In this work, two semiconductive chloroplumbate (lead(II) chloride anion) hybrids, obtained using a new inorganic-organic hybrid strategy, show unprecedented 3-D inorganic framework structures and white-light-emitting properties with high CRI values around 90, one of which shows the highest value to date.

  7. Highly selective luminescent sensing of picric acid based on a water-stable europium metal-organic framework

    NASA Astrophysics Data System (ADS)

    Xia, Tifeng; Zhu, Fengliang; Cui, Yuanjing; Yang, Yu; Wang, Zhiyu; Qian, Guodong

    2017-01-01

    A water-stable metal-organic framework (MOF) EuNDC has been synthesized for selective detection of the well-known contaminant and toxicant picric acid (PA) in aqueous solution. Due to the photo-induced electron transfer and self-absorption mechanism, EuNDC displayed rapid, selective and sensitive detection of PA with a detection limit of 37.6 ppb. Recyclability experiments revealed that EuNDC retains its initial luminescent intensity and same quenching efficiency in each cycle, suggesting high photostability and reusability for long-term sensing applications. The excellent detection performance of EuNDC makes it a promising PA sensing material for practical applications.

  8. Quantifying the role of climate variability on extreme total water level impacts: An application of a full simulation model to Ocean Beach, California

    NASA Astrophysics Data System (ADS)

    Serafin, K.; Ruggiero, P.; Stockdon, H. F.; Barnard, P.; Long, J.

    2014-12-01

    Many coastal communities worldwide are vulnerable to flooding and erosion driven by extreme total water levels (TWL), potentially dangerous events produced by the combination of large waves, high tides, and high non-tidal residuals. The West coast of the United States provides an especially challenging environment to model these processes due to its complex geological setting combined with uncertain forecasts for sea level rise (SLR), changes in storminess, and possible changes in the frequency of major El Niños. Our research therefore aims to develop an appropriate methodology to assess present-day and future storm-induced coastal hazards along the entire U.S. West coast, filling this information gap. We present the application of this framework in a pilot study at Ocean Beach, California, a National Park site within the Golden Gate National Recreation Area where existing event-scale coastal change data can be used for model calibration and verification. We use a probabilistic, full simulation TWL model (TWL-FSM; Serafin and Ruggiero, in press) that captures the seasonal and interannual climatic variability in extremes using functions of regional climate indices, such as the Multivariate ENSO index (MEI), to represent atmospheric patterns related to the El Niño-Southern Oscillation (ENSO). In order to characterize the effect of climate variability on TWL components, we refine the TWL-FSM by splitting non-tidal residuals into low (monthly mean sea level anomalies) and high frequency (storm surge) components. We also develop synthetic climate indices using Markov sequences to reproduce the autocorrelated nature of ENSO behavior. With the refined TWL-FSM, we simulate each TWL component, resulting in synthetic TWL records providing robust estimates of extreme return level events (e.g., the 100-yr event) and the ability to examine the relative contribution of each TWL component to these extreme events. Extreme return levels are then used to drive storm impact models to examine the probability of coastal change (Stockdon et al., 2013) and thus, the vulnerability to storm-induced coastal hazards that Ocean Beach faces. Future climate variability is easily incorporated into this framework, allowing us to quantify how an evolving climate will alter future extreme TWLs and their related coastal impacts.
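
    Stripped of the climate-index conditioning, the full-simulation idea can be sketched by drawing each TWL component separately, summing them, and reading return levels off the simulated annual maxima. Every distribution and parameter below is invented for illustration and does not reproduce the TWL-FSM.

        import numpy as np

        rng = np.random.default_rng(42)
        n_years, hours = 2000, 365 * 24
        annual_max = np.empty(n_years)
        for year in range(n_years):
            tide = 0.8 * np.sin(2 * np.pi * np.arange(hours) / 12.42)    # M2-like tide
            mmsla = rng.normal(0.0, 0.08)                # low-frequency sea level anomaly
            surge = rng.gumbel(0.05, 0.10, hours)        # high-frequency residual (surge)
            runup = 0.3 + 0.5 * rng.weibull(2.0, hours)  # wave-driven runup proxy
            annual_max[year] = (tide + mmsla + surge + runup).max()

        # Empirical 100-yr return level from the simulated annual maxima.
        print("100-yr TWL ~", round(np.quantile(annual_max, 1.0 - 1.0 / 100.0), 2), "m")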

  9. Individual heterogeneity in life histories and eco-evolutionary dynamics

    PubMed Central

    Vindenes, Yngvild; Langangen, Øystein

    2015-01-01

    Individual heterogeneity in life history shapes eco-evolutionary processes, and unobserved heterogeneity can affect demographic outputs characterising life history and population dynamical properties. Demographic frameworks like matrix models or integral projection models represent powerful approaches to disentangle mechanisms linking individual life histories and population-level processes. Recent developments have provided important steps towards their application to study eco-evolutionary dynamics, but so far individual heterogeneity has largely been ignored. Here, we present a general demographic framework that incorporates individual heterogeneity in a flexible way, by separating static and dynamic traits (discrete or continuous). First, we apply the framework to derive the consequences of ignoring heterogeneity for a range of widely used demographic outputs. A general conclusion is that besides the long-term growth rate lambda, all parameters can be affected. Second, we discuss how the framework can help advance current demographic models of eco-evolutionary dynamics, by incorporating individual heterogeneity. For both applications numerical examples are provided, including an empirical example for pike. For instance, we demonstrate that predicted demographic responses to climate warming can be reversed by increased heritability. We discuss how applications of this demographic framework incorporating individual heterogeneity can help answer key biological questions that require a detailed understanding of eco-evolutionary dynamics. PMID:25807980
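
    For orientation, the demographic machinery referenced here can be summarized by the standard matrix projection model (the notation below is ours, not the authors'):

      \[ \mathbf{n}(t+1) = \mathbf{A}\,\mathbf{n}(t), \qquad \lambda = \text{dominant eigenvalue of } \mathbf{A} \]

    where n(t) is the vector of abundances across (st)age classes and the long-term growth rate lambda is the dominant eigenvalue of the projection matrix A. Individual heterogeneity enters by expanding the state space, e.g. crossing each stage with a static trait class, which is why outputs other than lambda (variances, generation time, and so on) can respond to heterogeneity even when lambda itself does not.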

  10. IMHOTEP: virtual reality framework for surgical applications.

    PubMed

    Pfeiffer, Micha; Kenngott, Hannes; Preukschas, Anas; Huber, Matthias; Bettscheider, Lisa; Müller-Stich, Beat; Speidel, Stefanie

    2018-05-01

    The data which is available to surgeons before, during and after surgery is steadily increasing in quantity as well as diversity. When planning a patient's treatment, this large amount of information can be difficult to interpret. To aid in processing the information, new methods need to be found to present multimodal patient data, ideally combining textual, image, temporal and 3D data in a holistic and context-aware system. We present an open-source framework which allows handling of patient data in a virtual reality (VR) environment. VR technology maximizes the workspace available to the surgeon, and 3D patient data is rendered in stereo, which increases depth perception. The framework organizes the data into workspaces and contains tools which allow users to control, manipulate and enhance the data. Due to the framework's modular design, it can easily be adapted and extended for various clinical applications. The framework was evaluated by clinical personnel (77 participants). The majority of the group stated that a complex surgical situation is easier to comprehend by using the framework, and that it is very well suited for education. Furthermore, the application to various clinical scenarios, including the simulation of excitation propagation in the human atrium, demonstrated the framework's adaptability. As a feasibility study, the framework was used during the planning phase of the surgical removal of a large central carcinoma from a patient's liver. The clinical evaluation showed a large potential and high acceptance for the VR environment in a medical context. The various applications confirmed that the framework is easily extended and can be used in real-time simulation as well as for the manipulation of complex anatomical structures.

  11. A study of IEEE 802.15.4 security framework for wireless body area networks.

    PubMed

    Saleem, Shahnaz; Ullah, Sana; Kwak, Kyung Sup

    2011-01-01

    A Wireless Body Area Network (WBAN) is a collection of low-power and lightweight wireless sensor nodes that are used to monitor the human body functions and the surrounding environment. It supports a number of innovative and interesting applications, including ubiquitous healthcare and Consumer Electronics (CE) applications. Since WBAN nodes are used to collect sensitive (life-critical) information and may operate in hostile environments, they require strict security mechanisms to prevent malicious interaction with the system. In this paper, we first highlight major security requirements and Denial of Service (DoS) attacks in WBAN at the Physical, Medium Access Control (MAC), Network, and Transport layers. Then we discuss the IEEE 802.15.4 security framework and identify the security vulnerabilities and major attacks in the context of WBAN. Different types of attacks on the Contention Access Period (CAP) and Contention Free Period (CFP) parts of the superframe are analyzed and discussed. It is observed that a smart attacker can successfully corrupt an increasing number of GTS slots in the CFP and can considerably affect the Quality of Service (QoS) in WBAN (since most of the data is carried in the CFP). As the number of smart attackers increases, the number of corrupted GTS slots grows, preventing legitimate nodes from utilizing the bandwidth efficiently. This means that the direct adaptation of the IEEE 802.15.4 security framework for WBAN is not totally secure for certain WBAN applications. New solutions are required to integrate high-level security in WBAN.

  12. A Study of IEEE 802.15.4 Security Framework for Wireless Body Area Networks

    PubMed Central

    Saleem, Shahnaz; Ullah, Sana; Kwak, Kyung Sup

    2011-01-01

    A Wireless Body Area Network (WBAN) is a collection of low-power and lightweight wireless sensor nodes that are used to monitor the human body functions and the surrounding environment. It supports a number of innovative and interesting applications, including ubiquitous healthcare and Consumer Electronics (CE) applications. Since WBAN nodes are used to collect sensitive (life-critical) information and may operate in hostile environments, they require strict security mechanisms to prevent malicious interaction with the system. In this paper, we first highlight major security requirements and Denial of Service (DoS) attacks in WBAN at the Physical, Medium Access Control (MAC), Network, and Transport layers. Then we discuss the IEEE 802.15.4 security framework and identify the security vulnerabilities and major attacks in the context of WBAN. Different types of attacks on the Contention Access Period (CAP) and Contention Free Period (CFP) parts of the superframe are analyzed and discussed. It is observed that a smart attacker can successfully corrupt an increasing number of GTS slots in the CFP and can considerably affect the Quality of Service (QoS) in WBAN (since most of the data is carried in the CFP). As the number of smart attackers increases, the number of corrupted GTS slots grows, preventing legitimate nodes from utilizing the bandwidth efficiently. This means that the direct adaptation of the IEEE 802.15.4 security framework for WBAN is not totally secure for certain WBAN applications. New solutions are required to integrate high-level security in WBAN. PMID:22319358
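
    To make the attack scenario concrete, the toy Monte Carlo below estimates the fraction of corrupted GTS slots as the number of attackers grows (IEEE 802.15.4 allows at most seven GTS slots per superframe; the jamming model and all parameters here are our simplification, not the authors' analysis):

      import random

      def corrupted_gts_fraction(n_gts=7, n_attackers=1, n_superframes=10000):
          """Each attacker jams one randomly chosen GTS slot per superframe;
          returns the mean fraction of GTS slots corrupted."""
          corrupted = 0
          for _ in range(n_superframes):
              jammed = {random.randrange(n_gts) for _ in range(n_attackers)}
              corrupted += len(jammed)
          return corrupted / (n_gts * n_superframes)

      for k in (1, 2, 4, 7):
          print(k, "attacker(s):", round(corrupted_gts_fraction(n_attackers=k), 3))

    Even this crude model reproduces the qualitative trend reported above: the corrupted share of the CFP grows quickly with the number of attackers, degrading QoS for legitimate GTS traffic.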

  13. Framework to model neutral particle flux in convex high aspect ratio structures using one-dimensional radiosity

    NASA Astrophysics Data System (ADS)

    Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried

    2017-02-01

    We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures, as well as of the particle sticking probability, on the neutral particle flux.
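
    The one-dimensional approach specializes the classical radiosity balance, in which the flux leaving a surface element combines direct source flux and re-emitted flux arriving from all visible elements (standard form shown below; the paper's contribution is reducing it to one dimension for convex symmetric geometries):

      \[ B_i = E_i + \rho_i \sum_j F_{ij}\, B_j \]

    where B_i is the total outgoing flux of element i, E_i its direct (source) flux, \rho_i its re-emission probability (one minus the sticking probability), and F_{ij} the view factor from element j to element i.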

  14. Making sense of complexity in context and implementation: the Context and Implementation of Complex Interventions (CICI) framework.

    PubMed

    Pfadenhauer, Lisa M; Gerhardus, Ansgar; Mozygemba, Kati; Lysdahl, Kristin Bakke; Booth, Andrew; Hofmann, Bjørn; Wahlster, Philip; Polus, Stephanie; Burns, Jacob; Brereton, Louise; Rehfuess, Eva

    2017-02-15

    The effectiveness of complex interventions, as well as their success in reaching relevant populations, is critically influenced by their implementation in a given context. Current conceptual frameworks often fail to address context and implementation in an integrated way and, where addressed, they tend to focus on organisational context and are mostly concerned with specific health fields. Our objective was to develop a framework to facilitate the structured and comprehensive conceptualisation and assessment of context and implementation of complex interventions. The Context and Implementation of Complex Interventions (CICI) framework was developed in an iterative manner and underwent extensive application. An initial framework based on a scoping review was tested in rapid assessments, revealing inconsistencies with respect to the underlying concepts. Thus, pragmatic utility concept analysis was undertaken to advance the concepts of context and implementation. Based on these findings, the framework was revised and applied in several systematic reviews, one health technology assessment (HTA) and one applicability assessment of very different complex interventions. Lessons learnt from these applications and from peer review were incorporated, resulting in the CICI framework. The CICI framework comprises three dimensions (context, implementation and setting), which interact with one another and with the intervention dimension. Context comprises seven domains (i.e., geographical, epidemiological, socio-cultural, socio-economic, ethical, legal, political); implementation consists of five domains (i.e., implementation theory, process, strategies, agents and outcomes); setting refers to the specific physical location in which the intervention is put into practice. The intervention, and the way it is implemented in a given setting and context, can occur on a micro, meso and macro level. Tools to operationalise the framework comprise a checklist, data extraction tools for qualitative and quantitative reviews, and a consultation guide for applicability assessments. The CICI framework addresses and graphically presents context, implementation and setting in an integrated way. It aims at simplifying and structuring complexity in order to advance our understanding of whether and how interventions work. The framework can be applied in systematic reviews and HTA as well as primary research, and can facilitate communication among teams of researchers and with various stakeholders.

  15. XAL Application Framework and Bricks GUI Builder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelaia II, Tom

    2007-01-01

    The XAL [1] Application Framework is a framework for rapidly developing document-based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC)-compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.

  16. Coping with domestic violence: control attributions, dysphoria, and hopelessness.

    PubMed

    Clements, C M; Sawhney, D K

    2000-04-01

    We investigated the influence of control judgments and coping style on emotional reactions to domestic violence utilizing the framework of hopelessness theory. We assessed abuse severity, control attributions, coping, dysphoric symptoms, and hopelessness in 70 battered women recruited from 12 domestic violence agencies. Respondents reported dysphoria but not hopelessness. Increased reports of dysphoria were associated with higher levels of self-blame and avoidance coping and lower levels of problem-focused coping. Increased problem-focused coping was associated with decreased hopelessness. Perceived control over current abuse was not related to dysphoria. High expectations for control over future events were associated with decreased dysphoria. We discuss our results in terms of their application to attributional accounts of emotional reactions to battering.

  17. Individual-based approach to epidemic processes on arbitrary dynamic contact networks

    NASA Astrophysics Data System (ADS)

    Rocha, Luis E. C.; Masuda, Naoki

    2016-08-01

    The dynamics of contact networks and epidemics of infectious diseases often occur on comparable time scales. Ignoring one of these time scales may provide an incomplete understanding of the population dynamics of the infection process. We develop an individual-based approximation for the susceptible-infected-recovered epidemic model applicable to arbitrary dynamic networks. Our framework provides, at the individual level, the probability flow over time associated with the infection dynamics. This computationally efficient framework discards the correlation between the states of different nodes, yet provides accurate results in approximating direct numerical simulations. It naturally captures the temporal heterogeneities and correlations of contact sequences, fundamental ingredients regulating the timing and size of an epidemic outbreak and the number of secondary infections. The high accuracy of our approximation further allows us to detect the index individual of an epidemic outbreak in real-life network data.
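
    A minimal sketch of such an individual-based approximation, assuming discrete time steps with one adjacency matrix per step (the variable names and the discretization are ours):

      import numpy as np

      def individual_based_sir(A_seq, beta, mu, s, x, r):
          """Track per-node probabilities of being S (s), I (x), R (r) on a
          temporal network, discarding correlations between node states."""
          s, x, r = s.copy(), x.copy(), r.copy()
          for A in A_seq:
              # probability that node i escapes infection from all neighbours
              escape = np.prod(1.0 - beta * A * x[None, :], axis=1)
              s, x, r = (s * escape,
                         x * (1.0 - mu) + s * (1.0 - escape),
                         r + mu * x)
          return s, x, r

      # toy usage: 3-node chain, node 2 initially infected, 10 time steps
      A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
      s, x, r = individual_based_sir([A] * 10, beta=0.3, mu=0.1,
                                     s=np.array([1.0, 1.0, 0.9]),
                                     x=np.array([0.0, 0.0, 0.1]),
                                     r=np.zeros(3))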

  18. The Bayesian boom: good thing or bad?

    PubMed Central

    Hahn, Ulrike

    2014-01-01

    A series of high-profile critiques of Bayesian models of cognition have recently sparked controversy. These critiques question the contribution of rational, normative considerations in the study of cognition. The present article takes central claims from these critiques and evaluates them in light of specific models. Closer consideration of actual examples of Bayesian treatments of different cognitive phenomena allows one to defuse these critiques showing that they cannot be sustained across the diversity of applications of the Bayesian framework for cognitive modeling. More generally, there is nothing in the Bayesian framework that would inherently give rise to the deficits that these critiques perceive, suggesting they have been framed at the wrong level of generality. At the same time, the examples are used to demonstrate the different ways in which consideration of rationality uniquely benefits both theory and practice in the study of cognition. PMID:25152738

  19. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  20. Recognition of complex human behaviours using 3D imaging for intelligent surveillance applications

    NASA Astrophysics Data System (ADS)

    Yao, Bo; Lepley, Jason J.; Peall, Robert; Butler, Michael; Hagras, Hani

    2016-10-01

    We introduce a system that exploits 3-D imaging technology as an enabler for the robust recognition of the human form. We combine this with pose and feature recognition capabilities from which we can recognise high-level human behaviours. We propose a hierarchical methodology for the recognition of complex human behaviours based on the identification of a set of atomic behaviours, individual and sequential poses (e.g. standing, sitting, walking, drinking and eating), which provides a framework from which we adopt time-based machine learning techniques to recognise complex behaviour patterns, as sketched below.
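
    As a toy illustration of the hierarchical idea (the paper uses time-based machine learning; the order-preserving pattern match below is a deliberate simplification, and all names are hypothetical):

      # complex behaviours defined as ordered sequences of atomic poses
      COMPLEX = {
          "having_a_meal": ["sitting", "eating", "drinking"],
          "getting_up_and_leaving": ["sitting", "standing", "walking"],
      }

      def contains_in_order(observed, pattern):
          """True if pattern occurs in observed as an ordered subsequence."""
          it = iter(observed)
          return all(pose in it for pose in pattern)

      observed = ["standing", "sitting", "eating", "eating", "drinking"]
      print([b for b, p in COMPLEX.items() if contains_in_order(observed, p)])
      # -> ['having_a_meal']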

  1. Geological repository for nuclear high level waste in France from feasibility to design within a legal framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voizard, Patrice; Mayer, Stefan; Ouzounian, Gerald

    Over the past 15 years, the French program on deep geologic disposal of high-level and long-lived radioactive waste has benefited from a clear legal framework as the result of the December 30, 1991 French Waste Act. To fulfil its obligations stipulated in this law, ANDRA has submitted the 'Dossier 2005 Argile' (clay) and 'Dossier 2005 Granite' to the French Government. The first of those reports presents a concept for the underground disposal of nuclear waste at a specific clay site and focuses on a feasibility study. Knowledge of the host rock characteristics is based on the investigations carried out at the Meuse/Haute Marne Underground Research Laboratory. The repository concept addresses various issues, the most important of which relate to the large amount of waste, the clay host rock and the reversibility requirement. This phase ended with the review and evaluation of the 'Dossier 2005' by different organisations including the National Review Board, the National Safety Authority and the NEA International Review Team. By passing the new Planning Act of June 28, 2006 on the sustainable management of radioactive materials and waste, the French parliament has further defined a clear legal framework for future work. This Planning Act thus sets a schedule and defines the objectives for the next phase of repository design, requesting the submission of a construction authorization application by 2015. The law calls for the repository program to be in a position to commission disposal installations by 2025. (authors)

  2. High-fidelity simulations of unsteady civil aircraft aerodynamics: stakes and perspectives. Application of zonal detached eddy simulation.

    PubMed

    Deck, Sébastien; Gand, Fabien; Brunet, Vincent; Ben Khelil, Saloua

    2014-08-13

    This paper provides an up-to-date survey of the use of zonal detached eddy simulations (ZDES) for unsteady civil aircraft applications as a reflection on the stakes and perspectives of the use of hybrid methods in the framework of industrial aerodynamics. The issue of zonal or non-zonal treatment of turbulent flows for engineering applications is discussed. The ZDES method used in this article and based on a fluid problem-dependent zonalization is briefly presented. Some recent landmark achievements for conditions all over the flight envelope are presented, including low-speed (aeroacoustics of high-lift devices and landing gear), cruising (engine-airframe interactions), propulsive jets and off-design (transonic buffet and dive manoeuvres) applications. The implications of such results and remaining challenges in a more global framework are further discussed. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  3. Horizon: The Portable, Scalable, and Reusable Framework for Developing Automated Data Management and Product Generation Systems

    NASA Astrophysics Data System (ADS)

    Huang, T.; Alarcon, C.; Quach, N. T.

    2014-12-01

    Capture, curation, and analysis are the typical activities performed at any given Earth Science data center. Modern data management systems must be adaptable to heterogeneous science data formats, scalable to meet the mission's quality of service requirements, and able to manage the life-cycle of any given science data product. Designing a scalable data management system doesn't happen overnight. It takes countless hours of refining, refactoring, retesting, and re-architecting. The Horizon data management and workflow framework, developed at the Jet Propulsion Laboratory, is a portable, scalable, and reusable framework for developing high-performance data management and product generation workflow systems to automate data capturing, data curation, and data analysis activities. The Data Management and Archive System (DMAS) of NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) is its core data infrastructure, handling the capture and distribution of hundreds of thousands of satellite observations each day around the clock. DMAS is an application of the Horizon framework. The NASA Global Imagery Browse Services (GIBS) is the Earth Observing System Data and Information System (EOSDIS) solution for making high-resolution global imagery available to the science communities. The Imagery Exchange (TIE), an application of the Horizon framework, is a core subsystem of GIBS responsible for automating data capture and imagery generation to support EOSDIS' 12 distributed active archive centers and 17 Science Investigator-led Processing Systems (SIPS). This presentation discusses our ongoing effort in refining, refactoring, retesting, and re-architecting the Horizon framework to enable data-intensive science and its applications.

  4. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    In recent years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, based mainly on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  5. Preparation Methods of Metal Organic Frameworks and Their Capture of CO2

    NASA Astrophysics Data System (ADS)

    Zhang, Linjian; Liand, Fangqin; Luo, Liangfei

    2018-01-01

    The increasingly serious greenhouse effect has drawn growing attention to CO2 capture and storage technology. Metal organic frameworks (MOFs) offer high specific surface area, porous structure and structural tunability, and have become a research focus of CO2 emission reduction technology in recent years. In this paper, the characteristics, preparation methods and application of MOFs in the field of CO2 adsorption and separation are discussed, with particular attention to application in the flue gas environment of power plants.

  6. Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Rehder, Joe

    2000-01-01

    Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle installed on a network of heterogeneous computers with a computational framework built using Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, a total of 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. A validation program was defined and implemented to ensure that codes integrated into the framework produced the same results as their standalone counterparts. Finally, a Commercial Off the Shelf (COTS) configuration management system was used to organize the software development. A computational environment, CJOpt, based on the Common Object Request Broker Architecture (CORBA) and the Java programming language, has been developed as a framework for multidisciplinary analysis and optimization. The environment exploits the parallelism inherent in the application and distributes the constituent disciplines on machines best suited to their needs. In CJOpt, a discipline code is "wrapped" as an object. An interface to the object identifies the functionality (services) provided by the discipline, defined in Interface Definition Language (IDL) and implemented using Java. The results of using the HSCT4.0 capability are described. A summary of lessons learned is also presented. The use of some of the processes, codes, and techniques by industry is highlighted. The application of the methodology developed in this research to other aircraft is described. Finally, we show how the experience gained is being applied to entirely new vehicles, such as the Reusable Space Transportation System. Additional information is contained in the original.

  7. Abstracting application deployment on Cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.

    2017-10-01

    Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework, “INDIGO-DataCloud” [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds, and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We opted to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration one. Using the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without having knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances of a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
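
    A sketch of what the abstraction layer amounts to, assuming the python-heatclient API (Client, stacks.create); the function name, stack name and template parameters below are illustrative, not the project's actual code:

      from heatclient.client import Client

      def launch_platform(heat_endpoint, token, template_body,
                          fs_size_gb=100, db_cluster_size=3):
          """Create a platform instance from a few user-facing choices,
          hiding the underlying OpenStack/Heat details from the user."""
          heat = Client('1', endpoint=heat_endpoint, token=token)
          return heat.stacks.create(
              stack_name='platform-instance',
              template=template_body,
              parameters={'fs_size': fs_size_gb,
                          'db_cluster_size': db_cluster_size})

    The web interface described above plays exactly this role: it collects the application-level parameters and delegates instantiation and scaling to the orchestration service.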

  8. Foundational Tools for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  9. An Ontology-Based Reasoning Framework for Querying Satellite Images for Disaster Monitoring.

    PubMed

    Alirezaie, Marjan; Kiselev, Andrey; Längkvist, Martin; Klügl, Franziska; Loutfi, Amy

    2017-11-05

    This paper presents a framework in which satellite images are classified and augmented with additional semantic information to enable queries about what can be found on the map at a particular location, but also about paths that can be taken. This is achieved by a reasoning framework based on qualitative spatial reasoning that is able to find answers to high-level queries that may depend on the current situation. This framework, called SemCityMap, provides the full pipeline from enriching the raw image data with rudimentary labels, to the integration of knowledge representation and reasoning methods, and on to user interfaces for high-level querying. To illustrate the utility of SemCityMap in a disaster scenario, we use an urban environment (central Stockholm) in combination with a flood simulation. We show that the system provides useful answers to high-level queries, also with respect to the current flood status. Examples of such queries concern path planning for vehicles or retrieval of safe regions such as "find all regions close to schools and far from the flooded area". The particular advantage of our approach lies in the fact that ontological information and reasoning are explicitly integrated, so that queries can be formulated in a natural way using concepts on an appropriate level of abstraction, including additional constraints.

  10. An Ontology-Based Reasoning Framework for Querying Satellite Images for Disaster Monitoring

    PubMed Central

    Alirezaie, Marjan; Klügl, Franziska; Loutfi, Amy

    2017-01-01

    This paper presents a framework in which satellite images are classified and augmented with additional semantic information to enable queries about what can be found on the map at a particular location, but also about paths that can be taken. This is achieved by a reasoning framework based on qualitative spatial reasoning that is able to find answers to high-level queries that may depend on the current situation. This framework, called SemCityMap, provides the full pipeline from enriching the raw image data with rudimentary labels, to the integration of knowledge representation and reasoning methods, and on to user interfaces for high-level querying. To illustrate the utility of SemCityMap in a disaster scenario, we use an urban environment (central Stockholm) in combination with a flood simulation. We show that the system provides useful answers to high-level queries, also with respect to the current flood status. Examples of such queries concern path planning for vehicles or retrieval of safe regions such as "find all regions close to schools and far from the flooded area". The particular advantage of our approach lies in the fact that ontological information and reasoning are explicitly integrated, so that queries can be formulated in a natural way using concepts on an appropriate level of abstraction, including additional constraints. PMID:29113073
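
    Once the enriched map is materialized as triples, a high-level query like the one quoted above reduces to ordinary SPARQL; a sketch using rdflib with a hypothetical vocabulary (ex:Region, ex:near, ex:farFrom, etc. are ours, not SemCityMap's actual ontology):

      import rdflib

      g = rdflib.Graph()
      g.parse("citymap.ttl", format="turtle")  # hypothetical enriched map data

      q = """
      PREFIX ex: <http://example.org/citymap#>
      SELECT ?region WHERE {
          ?region a ex:Region ;
                  ex:near    ?school ;
                  ex:farFrom ?flood .
          ?school a ex:School .
          ?flood  a ex:FloodedArea .
      }
      """
      for row in g.query(q):
          print(row.region)

    The qualitative spatial reasoner's job is precisely to derive relations such as ex:near and ex:farFrom from the classified imagery, so that queries can stay at this level of abstraction.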

  11. Field-widened Michelson interferometer for spectral discrimination in high-spectral-resolution lidar: theoretical framework.

    PubMed

    Cheng, Zhongtao; Liu, Dong; Luo, Jing; Yang, Yongying; Zhou, Yudi; Zhang, Yupeng; Duan, Lulin; Su, Lin; Yang, Liming; Shen, Yibing; Wang, Kaiwei; Bai, Jian

    2015-05-04

    A field-widened Michelson interferometer (FWMI) is developed to act as the spectral discriminator in high-spectral-resolution lidar (HSRL). This realization is motivated by the wide-angle Michelson interferometer (WAMI), which has been used broadly in atmospheric wind and temperature detection. This paper describes, for the first time, an independent theoretical framework for the application of the FWMI in HSRL. In the framework, the operation principles and application requirements of the FWMI are discussed in comparison with those of the WAMI. Theoretical foundations for designing this type of interferometer are introduced based on these comparisons. Moreover, a general performance estimation model for the FWMI is established, which can provide common guidelines for the performance budget and evaluation of the FWMI in both the design and operation stages. Examples incorporating many practical imperfections or conditions that may degrade the performance of the FWMI are given to illustrate the implementation of the modeling. This theoretical framework presents a complete and powerful tool for solving most of the theoretical and engineering problems encountered in FWMI applications, including design, parameter calibration, prior performance budgeting, posterior performance estimation, and so on. It will be a valuable contribution to the lidar community toward developing a new generation of HSRLs based on the FWMI spectroscopic filter.
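
    The spectral discrimination rests on the periodic transmission of a two-beam interferometer; for an ideal Michelson with fixed optical path difference \Delta, the transmission at optical frequency \nu is (idealized textbook form; the paper's framework adds the practical imperfections mentioned above):

      \[ T(\nu) = \frac{1}{2}\left[ 1 + \cos\!\left( \frac{2\pi\,\nu\,\Delta}{c} \right) \right] \]

    Because the narrowband aerosol return and the Doppler-broadened molecular return sample this periodic curve very differently, a suitable choice of \Delta separates the two contributions.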

  12. A framework for operationalization of strategic plans and metrics for corporate performance measurement in transportation asset management

    NASA Astrophysics Data System (ADS)

    Mteri, Hassan H.

    This thesis investigated the business processes required to translate corporate-level strategic plans into tactical and operational plans in the context of transportation asset management. The study also developed a framework for effective performance measurement for departments of transportation. The thesis was based on a case study of transportation agencies in the U.S.A. and Canada. The scope is therefore limited to, or most directly applicable to, transportation assets such as pavements, bridges and culverts. The goal was to address the problem of translating or managing strategic plans, especially in the context of the public sector responsible for operating transportation infrastructure. It was observed that many agencies have been successful in formulating good strategic plans, but they have performed relatively poorly in translating such corporate-level strategic plans into operational activities. A questionnaire survey was designed and targeted about 30 state agencies that are currently active in transportation asset management. Twenty-one (21) transportation agencies in the USA and Canada responded to the questionnaire. The analysis of the questionnaire data showed that there is a lack of a standard approach to managing corporate strategic plans in transportation agencies. The results also indicated that most transportation agencies operate at three organizational levels, but there was no systematic approach for translating goals and objectives from higher to lower levels. Approaches to performance measurement were found to vary from agency to agency. A number of limitations were identified in existing performance measurement practice. Key weaknesses include the large number of measures in use (as many as 25 or more) and the disconnection between the measures used and the corporate goals and objectives. Lessons from the private sector were thoroughly reviewed in order to build the groundwork for adapting existing tools to the public sector. The existing literature, assumptions and characteristics that make Balanced Scorecards and strategy maps work effectively in the private sector were identified. Gaps in the implementation of strategic plans and the use of the Balanced Scorecard in the public sector were derived. Although Balanced Scorecards have previously been used to a limited extent in transportation agencies, the combined use of Balanced Scorecards and strategy maps for the much broader purpose of translating strategic plans into tactical and operational activities for transportation asset management is yet to be established. The thesis presents a framework to operationalize strategic plans through the combined application of Balanced Scorecards and strategy maps. The proposed framework aligns overarching objectives across all organizational levels (corporate, tactical, and operational), with detailed information delegated from the top level to lower levels. Furthermore, the thesis presents a proposed framework for developing and using effective corporate performance measures. The framework for performance measures provides a key tool for tracking progress and ensuring overall operationalization of strategic plans in transportation agencies. The thesis presents a methodology to assess existing performance measures so that agencies can reduce the number of measures to be more effective and manageable. It was found that, among other good characteristics, corporate performance measures must be tied to an agency's goals and objectives and must be sensitive or responsive to program delivery activities and to the impacts of decisions about resource allocation.

  13. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  14. The applicability of the UK Public Health Skills and Knowledge Framework to the practitioner workforce: lessons for competency framework development.

    PubMed

    Shickle, Darren; Stroud, Laura; Day, Matthew; Smith, Kevin

    2018-06-05

    Many countries have developed competency frameworks for public health practice. While the number of competencies varies, frameworks cover similar knowledge and skills, although they are not explicitly based on competency theory. A total of 15 qualitative group interviews (of up to six people) were conducted with 51 public health practitioners in 8 local authorities to assess the extent to which practitioners utilize competencies defined within the UK Public Health Skills and Knowledge Framework (PHSKF). Framework analysis was applied to the transcribed interviews. The overall framework was seen positively, although no participants had previously read or utilized the PHSKF. Most could provide evidence, although some PHSKF competencies required creative thinking to fit the expectations of practitioners and to reflect variation across the domains of practice, which are affected by job role and level of seniority. Evidence from previous NHS jobs or education may be needed, as some competencies were not regularly utilized within current local authority roles. Further development of the PHSKF is required to provide guidance on how it should be used by practitioners and other members of the public health workforce. Empirical research can help benchmark knowledge/skills for workforce levels, thereby improving the utility of competency frameworks.

  15. Evaluating the Impact of Educational Interventions on Patients and Communities: A Conceptual Framework.

    PubMed

    Bzowyckyj, Andrew S; Dow, Alan; Knab, Mary S

    2017-11-01

    Health professions education programs can have direct effects on patients and communities as well as on learners. However, few studies have examined the patient and community outcomes of educational interventions. To better integrate education and health care delivery, educators and researchers would benefit from a unifying framework to guide the planning of educational interventions and evaluation of their impact on patients. The authors of this Perspective mirrored approaches from Miller's pyramid of educational assessment and Moore and colleagues' framework for evaluating continuing professional development to propose a conceptual framework for evaluating the impact of educational interventions on patients and communities. This proposed framework, which complements these existing frameworks for evaluating the impact of educational interventions on learners, includes four levels: (1) interaction; (2) acceptability; (3) individual outcomes (i.e., knowledge, skills, activation, behaviors, and individual health indicators); and (4) population outcomes (i.e., community health indicators, capacity, and disparities). The authors describe measures and outcomes at each level and provide an example of the application of their new conceptual framework. The authors encourage educators and researchers to use this conceptual framework to evaluate the impact of educational interventions on patients and to more clearly identify and define which educational interventions strengthen communities and enhance overall health outcomes.

  16. Jagged Tiling for Intra-tile Parallelism and Fine-Grain Multithreading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Manzano Franco, Joseph B.; Marquez, Andres

    In this paper, we have developed a novel methodology that takes into consideration multithreaded many-core designs to better utilize memory/processing resources and improve memory residence on tileable applications. It takes advantage of polyhedral analysis and transformation in the form of PLUTO, combined with a highly optimized fine-grain tile runtime, to exploit parallelism at all levels. The main contributions of this paper include the introduction of multi-hierarchical tiling techniques that increase intra-tile parallelism, and a data-flow inspired runtime library that allows the expression of parallel tiles with an efficient synchronization registry. Our current implementation shows performance improvements on an Intel Xeon Phi board of up to 32.25% against instances produced by state-of-the-art compiler frameworks for selected stencil applications.
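
    For intuition, plain two-level (hierarchical) tiling of a 1D stencil looks like the sketch below; the paper's jagged tiles additionally skew the inner tiles so they can start executing in parallel (this sketch shows only the tiling hierarchy, not the jagged schedule):

      import numpy as np

      def stencil_two_level_tiled(a, T1=64, T2=16):
          """3-point averaging stencil with outer tiles of size T1 split
          into inner tiles of size T2 (better cache/scratchpad residence)."""
          out = a.copy()
          n = len(a)
          for o1 in range(1, n - 1, T1):                      # outer tiles
              for o2 in range(o1, min(o1 + T1, n - 1), T2):   # inner tiles
                  for i in range(o2, min(o2 + T2, n - 1)):    # elements
                      out[i] = (a[i - 1] + a[i] + a[i + 1]) / 3.0
          return out

      print(stencil_two_level_tiled(np.arange(10.0)))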

  17. A Subdivision-Based Representation for Vector Image Editing.

    PubMed

    Liao, Zicheng; Hoppe, Hugues; Forsyth, David; Yu, Yizhou

    2012-11-01

    Vector graphics has been employed in a wide variety of applications due to its scalability and editability. Editability is a high priority for artists and designers who wish to produce vector-based graphical content with user interaction. In this paper, we introduce a new vector image representation based on piecewise smooth subdivision surfaces, which is a simple, unified and flexible framework that supports a variety of operations, including shape editing, color editing, image stylization, and vector image processing. These operations effectively create novel vector graphics by reusing and altering existing image vectorization results. Because image vectorization yields an abstraction of the original raster image, controlling the level of detail of this abstraction is highly desirable. To this end, we design a feature-oriented vector image pyramid that offers multiple levels of abstraction simultaneously. Our new vector image representation can be rasterized efficiently using GPU-accelerated subdivision. Experiments indicate that our vector image representation achieves high visual quality and better supports editing operations than existing representations.

  18. Conceptual framework of public health surveillance and action and its application in health sector reform.

    PubMed

    McNabb, Scott J N; Chungong, Stella; Ryan, Mike; Wuhib, Tadesse; Nsubuga, Peter; Alemu, Wondi; Carande-Kulis, Vilma; Rodier, Guenael

    2002-01-01

    Because both public health surveillance and action are crucial, the authors initiated meetings at regional and national levels to assess and reform surveillance and action systems. These meetings emphasized improved epidemic preparedness and response, and highlighted standardized assessment and reform. To standardize assessments, the authors designed a conceptual framework for surveillance and action comprising eight core and four support activities, measured with indicators. In application, country-level reformers measure both the presence and performance of the six core activities comprising public health surveillance (detection, registration, reporting, confirmation, analyses, and feedback) and of the acute (epidemic-type) and planned (management-type) responses composing the two core activities of public health action. Four support activities (communications, supervision, training, and resource provision) enable these eight core processes. Multiple national systems can then be assessed concurrently at each level for effectiveness, technical efficiency, and cost. This approach permits a cost analysis, highlights areas amenable to integration, and provides focused intervention. The final public health model becomes a district-focused, action-oriented integration of core and support activities with enhanced effectiveness, technical efficiency, and cost savings. This reform approach leads to sustained capacity development through an empowerment strategy defined as facilitated, process-oriented action steps transforming staff and the system.

  19. A Framework for Performing V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  20. A Framework for Assessing High School Students' Intercultural Communicative Competence in a Computer-Mediated Language Learning Project

    ERIC Educational Resources Information Center

    Peng, Hsinyi; Lu, Wei-Hsin; Wang, Chao-I

    2009-01-01

    The purposes of this study were to identify the essential dimensions of intercultural communicative competence (ICC) and to establish a framework for assessing the ICC level of high school students that included a self-report inventory and scoring rubrics for online interaction in intercultural contexts. A total of 472 high school students from…

  1. INL Generic Robot Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2005-03-30

    The INL Generic Robot Architecture is a generic, extensible software framework that can be applied across a variety of different robot geometries, sensor suites and low-level proprietary control application programming interfaces (e.g. mobility, aria, aware, player).

  2. Tensions of Teaching Media Literacy in Teacher Education

    ERIC Educational Resources Information Center

    Ngomba-Westbrook, Nalova Elaine

    2013-01-01

    This study investigates the tensions a teacher educator faces in facilitating a media literacy teacher education course at the university level. Teaching tensions are conceptualized as a three-tier framework. At the first level, tensions may arise in the selection and application of pedagogies associated with critical and new/21st century…

  3. Thread mapping using system-level model for shared memory multicores

    NASA Astrophysics Data System (ADS)

    Mitra, Reshmi

    Exploring thread-to-core mapping options for a parallel application on a multicore architecture is computationally very expensive. For the same algorithm, the mapping strategy (MS) with the best response time may change with data size and thread counts. The primary challenge is to design a fast, accurate and automatic framework for exploring these MSs for large data-intensive applications. This is to ensure that users can explore the design space within reasonable machine hours, without a thorough understanding of how the code interacts with the platform. Response time is related to the cycles per instruction retired (CPI), taking into account both active and sleep states of the pipeline. This work establishes a hybrid approach, based on a Markov Chain Model (MCM) and a Model Tree (MT), for system-level steady-state CPI prediction. It is designed for shared memory multicore processors with coarse-grained multithreading. The thread status is represented by the MCM states. The program characteristics are modeled as the transition probabilities, representing the system moving between active and suspended thread states. The MT model extrapolates these probabilities for the actual application size (AS) from smaller-AS performance. This aspect of the framework, along with the use of mathematical expressions for the actual AS performance information, results in a tremendous reduction in CPI prediction time. The framework is validated using an electromagnetics application. The average performance prediction error for steady-state CPI results with 12 different MSs is less than 1%. The total run time of the model is of the order of minutes, whereas the actual application execution time is in terms of days.
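
    A two-state illustration of the MCM idea (active vs. suspended threads); in the actual framework the transition probabilities would come from the Model Tree extrapolation, and the numbers below are invented:

      import numpy as np

      def steady_state_cpi(p_as, p_sa, cpi_active, cpi_suspended):
          """CPI as a stationary-distribution-weighted mix of per-state CPIs.
          p_as: P(active -> suspended); p_sa: P(suspended -> active)."""
          P = np.array([[1.0 - p_as, p_as],
                        [p_sa, 1.0 - p_sa]])
          w, v = np.linalg.eig(P.T)                 # left eigenvectors of P
          pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
          pi /= pi.sum()                            # stationary distribution
          return pi[0] * cpi_active + pi[1] * cpi_suspended

      print(steady_state_cpi(p_as=0.1, p_sa=0.4, cpi_active=1.2,
                             cpi_suspended=4.0))    # -> 1.76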

  4. Trends and Lessons Learned in Interdisciplinary and Non-Business Case Method Application.

    ERIC Educational Resources Information Center

    Anyansi-Archibong, Chi; Czuchry, Andrew J.; House, Claudia S.; Cicirello, Tony

    2000-01-01

    Presents results of a survey designed to test the level of development and application of cases in non-business courses such as sciences, mathematics, engineering, health, and technology. Findings support the growing popularity of the case method of teaching and learning outside the business domain. Suggests a framework for establishing win-win…

  5. Integrated Spreadsheets as a Paradigm of Type II Technology Applications in Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Abramovich, Sergei

    2016-01-01

    The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…

  6. Variational optical flow estimation for images with spectral and photometric sensor diversity

    NASA Astrophysics Data System (ADS)

    Bengtsson, Tomas; McKelvey, Tomas; Lindström, Konstantin

    2015-03-01

    Motion estimation of objects in image sequences is an essential computer vision task. To this end, optical flow methods compute pixel-level motion, with the purpose of providing low-level input to higher-level algorithms and applications. Robust flow estimation is crucial for the success of applications, which in turn depends on the quality of the captured image data. This work explores the use of sensor diversity in the image data within a framework for variational optical flow. In particular, a custom image sensor setup intended for vehicle applications is tested. Experimental results demonstrate the improved flow estimation performance when IR sensitivity or flash illumination is added to the system.
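
    The variational machinery referenced here descends from the classical Horn-Schunck energy (shown in its single-channel form; a natural way to exploit sensor diversity is to sum the data term over the available channels):

      \[ E(u,v) = \int_\Omega \left( I_x u + I_y v + I_t \right)^2 + \alpha \left( \lVert \nabla u \rVert^2 + \lVert \nabla v \rVert^2 \right) d\mathbf{x} \]

    where (u, v) is the flow field, I_x, I_y, I_t are the spatial and temporal image derivatives, and \alpha weights the smoothness term; minimizing E yields the dense pixel-level motion used by the higher-level applications mentioned above.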

  7. A Component-based Programming Model for Composite, Distributed Applications

    NASA Technical Reports Server (NTRS)

    Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.

  8. Examining household asthma management behavior through a microeconomic framework.

    PubMed

    Magzamen, Sheryl; Brandt, Sylvia J; Tager, Ira B

    2014-12-01

    National guidelines on the effective management of pediatric asthma have been promoted for over 20 years, yet asthma-related morbidity among low-income children remains disproportionately high. To date, household and clinical interventions designed to remediate these differences have been informed largely by a health behavior framework. However, these programs have not resulted in consistent sustained improvements in targeted populations. The continued funding and implementation of programs based on the health behavior framework leads us to question if traditional behavioral models are sufficient to understand and promote adaptation of positive health management behaviors. We introduce the application of the microeconomic framework to investigate potential mechanisms that can lead to positive management behaviors to improve asthma-related morbidity. We provide examples from the literature on health production, preferences, trade-offs and time horizons to illustrate how economic constructs can potentially add to understanding of disease management. The economic framework, which can be empirically observed, tested, and quantified, can explicate the engagement in household-level activities that would affect health and well-being. The inclusion of a microeconomic perspective in intervention research may lead to identification of mechanisms that lead to household decisions with regard to asthma management strategies and behavior. The inclusion of the microeconomic framework to understand the production of health may provide a novel theoretical framework to investigate the underlying causal behavioral mechanisms related to asthma management and control. Adaptation of an economic perspective may provide new insight into the design and implementation of interventions to improve asthma-related morbidity in susceptible populations. © 2014 Society for Public Health Education.

  9. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  10. The application of 3D Zernike moments for the description of "model-free" molecular structure, functional motion, and structural reliability.

    PubMed

    Grandison, Scott; Roberts, Carl; Morris, Richard J

    2009-03-01

    Protein structures are not static entities consisting of equally well-determined atomic coordinates. Proteins undergo continuous motion, and as catalytic machines, these movements can be of high relevance for understanding function. In addition to this strong biological motivation for considering shape changes is the necessity to correctly capture different levels of detail and error in protein structures. Some parts of a structural model are often poorly defined, and the atomic displacement parameters provide an excellent means to characterize the confidence in an atom's spatial coordinates. A mathematical framework for studying these shape changes, and handling positional variance is therefore of high importance. We present an approach for capturing various protein structure properties in a concise mathematical framework that allows us to compare features in a highly efficient manner. We demonstrate how three-dimensional Zernike moments can be employed to describe functions, not only on the surface of a protein but throughout the entire molecule. A number of proof-of-principle examples are given which demonstrate how this approach may be used in practice for the representation of movement and uncertainty.
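
    For reference, the 3D Zernike machinery the authors build on has a standard form (the usual Novotni-Klein construction; the notation below is generic, not necessarily the paper's): a function f on the unit ball, which may encode surface shape, interior density, or an uncertainty-weighted property, is projected onto the 3D Zernike polynomials Z_nl^m, and rotation-invariant norms are taken per (n, l) pair:

    ```latex
    \Omega_{nl}^{m} \;=\; \frac{3}{4\pi} \int_{|\mathbf{x}| \le 1} f(\mathbf{x}) \, \overline{Z_{nl}^{m}(\mathbf{x})} \, \mathrm{d}\mathbf{x},
    \qquad
    F_{nl} \;=\; \Bigg( \sum_{m=-l}^{l} \big| \Omega_{nl}^{m} \big|^{2} \Bigg)^{1/2}
    ```

    The invariants F_nl form a concise descriptor vector per molecule, which is what makes highly efficient comparison of features possible.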

  11. A domain-specific compiler for a parallel multiresolution adaptive numerical simulation environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    This paper describes the design and implementation of a layered domain-specific compiler to support MADNESS---Multiresolution ADaptive Numerical Environment for Scientific Simulation. MADNESS is a high-level software environment for the solution of integral and differential equations in many dimensions, using adaptive and fast harmonic analysis methods with guaranteed precision. MADNESS uses k-d trees to represent spatial functions and implements operators like addition, multiplication, differentiation, and integration on the numerical representation of functions. The MADNESS runtime system provides global namespace support and a task-based execution model including futures. MADNESS is currently deployed on massively parallel supercomputers and has enabled many science advances. Due to the highly irregular and statically unpredictable structure of the k-d trees representing the spatial functions encountered in MADNESS applications, only purely runtime approaches to optimization have previously been implemented in the MADNESS framework. This paper describes a layered domain-specific compiler developed to address some performance bottlenecks in MADNESS. The newly developed static compile-time optimizations, in conjunction with the MADNESS runtime support, enable significant performance improvement for the MADNESS framework.

  12. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    USGS Publications Warehouse

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.
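
    As context, Bayesian geostatistical inversion of this kind is usually posed as a penalized fit in which the prior covariance, built from the geostatistical autocorrelation function, regularizes the distributed parameters. A generic statement of the objective is given below; the bgaPEST documentation should be consulted for the exact formulation it uses:

    ```latex
    \hat{\mathbf{s}}, \hat{\boldsymbol{\beta}} \;=\; \arg\min_{\mathbf{s},\, \boldsymbol{\beta}} \;
    \tfrac{1}{2} \left( \mathbf{y} - \mathbf{h}(\mathbf{s}) \right)^{\!\top} \mathbf{R}^{-1} \left( \mathbf{y} - \mathbf{h}(\mathbf{s}) \right)
    \; + \;
    \tfrac{1}{2} \left( \mathbf{s} - \mathbf{X}\boldsymbol{\beta} \right)^{\!\top} \mathbf{Q}^{-1} \left( \mathbf{s} - \mathbf{X}\boldsymbol{\beta} \right)
    ```

    Here y is the calibration dataset, h(s) the model prediction for parameters s, R the error covariance, Q the prior covariance whose structure discourages overfitting, and Xβ the prior mean. The balance between the two terms is what the text describes as trading fit against model complexity.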

  13. Hierarchical pictorial structures for simultaneously localizing multiple organs in volumetric pre-scan CT

    NASA Astrophysics Data System (ADS)

    Montillo, Albert; Song, Qi; Das, Bipul; Yin, Zhye

    2015-03-01

    Parsing volumetric computed tomography (CT) into 10 or more salient organs simultaneously is a challenging task with many applications, such as personalized scan planning and dose reporting. In the clinic, pre-scan data can come in the form of very low dose volumes acquired just prior to the primary scan or from an existing primary scan. To localize organs in such diverse data, we propose a new learning-based framework that we call hierarchical pictorial structures (HPS), which builds multiple levels of models in a tree-like hierarchy that mirrors the natural decomposition of human anatomy from gross structures to finer structures. Each node of our hierarchical model learns (1) the local appearance and shape of structures and (2) a generative global model of probabilistic structural arrangement. Our main contribution is twofold. First, we embed the pictorial structures approach in a hierarchical framework, which reduces test-time image interpretation and allows for the incorporation of additional geometric constraints that robustly guide model fitting in the presence of noise. Second, we guide our HPS framework with probabilistic cost maps extracted by random decision forests trained on volumetric 3D HOG features, which makes our model fast to train, fast to apply to novel test data, and highly invariant to shape distortion and imaging artifacts. All steps require approximately 3 minutes to compute, and all organs are located with suitably high accuracy for our clinical applications, such as personalized scan planning for radiation dose reduction. We assess our method using a database of volumetric CT scans from 81 subjects with widely varying age and pathology, and with simulated ultra-low dose cadaver pre-scan data.
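
    The underlying pictorial structures matching problem that each level of such a hierarchy solves has the classical form below (generic notation; the HPS-specific hierarchy and extra geometric constraints are additions on top of this):

    ```latex
    L^{*} \;=\; \arg\min_{L = (l_{1}, \ldots, l_{n})} \; \sum_{i=1}^{n} m_{i}(l_{i}) \; + \sum_{(i,j) \in E} d_{ij}(l_{i}, l_{j})
    ```

    where l_i is the location of organ i, m_i is its appearance cost (here derived from the random-forest cost maps), d_ij penalizes implausible relative placements, and E is the edge set of the tree, so the minimization can be carried out exactly and efficiently by dynamic programming.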

  14. Framework for the design and delivery of organized physical activity sessions for children and adolescents: rationale and description of the 'SAAFE' teaching principles.

    PubMed

    Lubans, David R; Lonsdale, Chris; Cohen, Kristen; Eather, Narelle; Beauchamp, Mark R; Morgan, Philip J; Sylvester, Benjamin D; Smith, Jordan J

    2017-02-23

    The economic burden of inactivity is substantial, with conservative estimates suggesting the global cost to health care systems is more than US$50 billion. School-based programs, including physical education and school sport, have been recommended as important components of a multi-sector, multi-system approach to address physical inactivity. Additionally, community sporting clubs and after-school programs (ASPs) offer further opportunities for young people to be physically active outside of school. Despite demonstrating promise, current evidence suggests school-based physical activity programs, community sporting clubs and ASPs are not achieving their full potential. For example, physical activity levels in physical education (PE) and ASP sessions are typically much lower than recommended. For these sessions to have the strongest effects on young people's physical activity levels and their on-going physical literacy, they need to improve in quality and should be highly active and engaging. This paper presents the Supportive, Active, Autonomous, Fair, Enjoyable (SAAFE) principles, which represent an evidence-based framework designed to guide the planning, delivery and evaluation of organized physical activity sessions in school, community sport and ASPs. In this paper we provide a narrative and integrative review of the conceptual and empirical bases that underpin this framework and highlight implications for knowledge translation and application.

  15. Analytical Overview of the European and Russian Qualifications Frameworks with a Focus on Doctoral Degree Level

    ERIC Educational Resources Information Center

    Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena

    2017-01-01

    The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…

  16. Argonne Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-01-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...

  17. An inverse modeling approach for semilunar heart valve leaflet mechanics: exploitation of tissue structure.

    PubMed

    Aggarwal, Ankush; Sacks, Michael S

    2016-08-01

    Determining the biomechanical behavior of heart valve leaflet tissues in a noninvasive manner remains an important clinical goal. While advances in 3D imaging modalities have made in vivo valve geometric data available, optimal methods to exploit such information in order to obtain functional information remain to be established. Herein we present and evaluate a novel leaflet shape-based framework to estimate the biomechanical behavior of heart valves from surface deformations by exploiting tissue structure. We determined accuracy levels using an "ideal" in vitro dataset, in which the leaflet geometry, strains, mechanical behavior, and fibrous structure were known to a high level of precision. By utilizing a simplified structural model for the leaflet mechanical behavior, we were able to limit the number of parameters to be determined per leaflet to only two. This approach allowed us to dramatically reduce the computational time and easily visualize the cost function to guide the minimization process. We determined that the image resolution and the number of available imaging frames were important components in the accuracy of our framework. Furthermore, our results suggest that it is possible to detect differences in fiber structure using our framework, thus allowing an opportunity to diagnose asymptomatic valve diseases and begin treatment at their early stages. Lastly, we observed good agreement of the final resulting stress-strain response when an averaged fiber architecture was used. This suggests that population-averaged fiber structural data may be sufficient for the application of the present framework to in vivo studies, although clearly much work remains to extend the present approach to in vivo problems.
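
    A minimal sketch of the shape-based inverse loop, under stated assumptions: the hypothetical two-parameter forward model below stands in for the paper's simplified structural constitutive model and its prediction of surface deformation, and the two parameters are recovered by least squares.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def forward_model(params, stretches):
        """Hypothetical 2-parameter tissue response (illustrative stand-in only)."""
        mu, k = params  # e.g. matrix stiffness and fibre-recruitment exponent
        return mu * (stretches - 1.0) + np.expm1(k * (stretches - 1.0))

    def cost(params, stretches, observed):
        return float(np.sum((forward_model(params, stretches) - observed) ** 2))

    # Synthetic "observations" standing in for image-derived leaflet deformations.
    stretches = np.linspace(1.00, 1.15, 20)
    rng = np.random.default_rng(0)
    observed = forward_model((10.0, 25.0), stretches) + rng.normal(0.0, 0.05, stretches.size)

    fit = minimize(cost, x0=(5.0, 20.0), args=(stretches, observed), method="Nelder-Mead")
    print("recovered parameters:", fit.x)
    ```

    With only two parameters, the cost function can be plotted directly over the parameter plane, which is the property the authors exploit to visualize and guide the minimization.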

  18. A generic framework for internet-based interactive applications of high-resolution 3-D medical image data.

    PubMed

    Liu, Danzhou; Hua, Kien A; Sugaya, Kiminobu

    2008-09-01

    With the advances in medical imaging devices, large volumes of high-resolution 3-D medical image data have been produced. These high-resolution 3-D data are very large in size, and severely stress storage systems and networks. Most existing Internet-based 3-D medical image interactive applications therefore deal with only low- or medium-resolution image data. While it is possible to download the whole 3-D high-resolution image data from the server and perform the image visualization and analysis at the client site, such an alternative is infeasible when the high-resolution data are very large, and many users concurrently access the server. In this paper, we propose a novel framework for Internet-based interactive applications of high-resolution 3-D medical image data. Specifically, we first partition the whole 3-D data into buckets, remove the duplicate buckets, and then, compress each bucket separately. We also propose an index structure for these buckets to efficiently support typical queries such as 3-D slicer and region of interest, and only the relevant buckets are transmitted instead of the whole high-resolution 3-D medical image data. Furthermore, in order to better support concurrent accesses and to improve the average response time, we also propose techniques for efficient query processing, incremental transmission, and client sharing. Our experimental study in simulated and realistic environments indicates that the proposed framework can significantly reduce storage and communication requirements, and can enable real-time interaction with remote high-resolution 3-D medical image data for many concurrent users.
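
    A schematic sketch of the server-side preprocessing and query path described above, with hypothetical bucket sizes and helper names; the paper's actual index structure and compression scheme are not reproduced here.

    ```python
    import hashlib
    import zlib

    import numpy as np

    def partition_dedup_compress(volume, bucket_shape):
        """Split a 3-D volume into buckets, deduplicate identical ones, compress each once.

        Returns (index, store): index maps bucket grid coordinates to a content hash,
        store maps each unique hash to its zlib-compressed bytes.
        """
        index, store = {}, {}
        bz, by, bx = bucket_shape
        for z in range(0, volume.shape[0], bz):
            for y in range(0, volume.shape[1], by):
                for x in range(0, volume.shape[2], bx):
                    bucket = np.ascontiguousarray(volume[z:z+bz, y:y+by, x:x+bx])
                    key = hashlib.sha1(bucket.tobytes()).hexdigest()
                    if key not in store:              # duplicate buckets stored once
                        store[key] = zlib.compress(bucket.tobytes())
                    index[(z, y, x)] = key
        return index, store

    def buckets_for_roi(index, bucket_shape, lo, hi):
        """A region-of-interest query transmits only the buckets it overlaps."""
        bz, by, bx = bucket_shape
        return {key for (z, y, x), key in index.items()
                if z + bz > lo[0] and z < hi[0]
                and y + by > lo[1] and y < hi[1]
                and x + bx > lo[2] and x < hi[2]}
    ```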

  19. Bridging Human Reliability Analysis and Psychology, Part 2: A Cognitive Framework to Support HRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring

    This is the second of two papers that discuss the literature review conducted as part of the U.S. Nuclear Regulatory Commission (NRC) effort to develop a hybrid human reliability analysis (HRA) method in response to Staff Requirements Memorandum (SRM) SRM-M061020. This review was conducted with the goal of strengthening the technical basis within psychology, cognitive science and human factors for the hybrid HRA method being proposed. An overview of the literature review approach and high-level structure is provided in the first paper, whereas this paper presents the results of the review. The psychological literature review encompassed research spanning the entirety of human cognition and performance, and consequently produced an extensive list of psychological processes, mechanisms, and factors that contribute to human performance. To make sense of this large amount of information, the results of the literature review were organized into a cognitive framework that identifies causes of failure of macrocognition in humans, and connects those proximate causes to psychological mechanisms and performance influencing factors (PIFs) that can lead to the failure. This cognitive framework can serve as a tool to inform HRA. Beyond this, however, the cognitive framework has the potential to also support addressing human performance issues identified in Human Factors applications.

  20. An integrative and applicable phylogenetic footprinting framework for cis-regulatory motifs identification in prokaryotic genomes.

    PubMed

    Liu, Bingqiang; Zhang, Hanyuan; Zhou, Chuan; Li, Guojun; Fennell, Anne; Wang, Guanghui; Kang, Yu; Liu, Qi; Ma, Qin

    2016-08-09

    Phylogenetic footprinting is an important computational technique for identifying cis-regulatory motifs in orthologous regulatory regions from multiple genomes, as motifs tend to evolve more slowly than their surrounding non-functional sequences. Its application, however, faces several difficulties in optimizing the selection of orthologous data and in reducing false positives in motif prediction. Here we present an integrative phylogenetic footprinting framework for accurate motif prediction in prokaryotic genomes (MP(3)). The framework includes a new orthologous data preparation procedure, an additional promoter scoring and pruning method, and an integration of six existing motif finding algorithms as basic motif search engines. Specifically, we collected orthologous genes from available prokaryotic genomes and built the orthologous regulatory regions based on sequence similarity of promoter regions. This procedure made full use of the large-scale genomic data and taxonomy information and filtered out promoters with limited contribution to produce a high-quality orthologous promoter set. The promoter scoring and pruning is implemented through motif voting by a set of complementary prediction tools that mine as many motif candidates as possible while eliminating the effect of random noise. We applied the framework to the Escherichia coli K-12 genome and evaluated the prediction performance through comparison with seven existing programs. This evaluation was systematically carried out at the nucleotide and binding-site levels, and the results showed that MP(3) consistently outperformed other popular motif finding tools. We have integrated MP(3) into our motif identification and analysis server DMINDA, allowing users to efficiently identify and analyze motifs in 2,072 completely sequenced prokaryotic genomes. The performance evaluation indicated that MP(3) is effective for predicting regulatory motifs in prokaryotic genomes. Its application may enhance progress in elucidating transcription regulation mechanisms, thus benefiting the genomic research community and prokaryotic genome researchers in particular.
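
    The motif-voting step can be pictured as in the toy sketch below, where candidate sites from several finders are kept only when enough tools agree on a position; the tool names, tolerance, and support threshold are illustrative placeholders, not MP(3)'s actual parameters.

    ```python
    from collections import defaultdict

    def vote_motif_sites(predictions, min_support=3, tolerance=5):
        """Merge motif-site predictions from several finders by position voting.

        predictions: dict mapping tool name -> list of (promoter_id, start) site calls.
        A call is kept when at least `min_support` tools report a start position
        within `tolerance` bp of it on the same promoter.
        """
        calls = defaultdict(list)  # promoter -> list of (start, tool)
        for tool, sites in predictions.items():
            for promoter, start in sites:
                calls[promoter].append((start, tool))

        consensus = []
        for promoter, items in calls.items():
            for start, _ in items:
                support = {tool for s, tool in items if abs(s - start) <= tolerance}
                if len(support) >= min_support:
                    consensus.append((promoter, start, len(support)))
        return consensus

    example = {
        "toolA": [("pA", 102)], "toolB": [("pA", 100)], "toolC": [("pA", 99), ("pB", 50)],
    }
    print(vote_motif_sites(example))  # calls near pA:100 survive, pB:50 does not
    ```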

  1. New developments in the evolution and application of the WHO/IPCS framework on mode of action/species concordance analysis.

    PubMed

    Meek, M E; Boobis, A; Cote, I; Dellarco, V; Fotakis, G; Munn, S; Seed, J; Vickers, C

    2014-01-01

    The World Health Organization/International Programme on Chemical Safety mode of action/human relevance framework has been updated to reflect the experience acquired in its application and extend its utility to emerging areas in toxicity testing and non-testing methods. The underlying principles have not changed, but the framework's scope has been extended to enable integration of information at different levels of biological organization and reflect evolving experience in a much broader range of potential applications. Mode of action/species concordance analysis can also inform hypothesis-based data generation and research priorities in support of risk assessment. The modified framework is incorporated within a roadmap, with feedback loops encouraging continuous refinement of fit-for-purpose testing strategies and risk assessment. Important in this construct is consideration of dose-response relationships and species concordance analysis in weight of evidence. The modified Bradford Hill considerations have been updated and additionally articulated to reflect increasing experience in application for cases where the toxicological outcome of chemical exposure is known. The modified framework can be used as originally intended, where the toxicological effects of chemical exposure are known, or in hypothesizing effects resulting from chemical exposure, using information on putative key events in established modes of action from appropriate in vitro or in silico systems and other lines of evidence. This modified mode of action framework and accompanying roadmap and case examples are expected to contribute to improving transparency in explicitly addressing weight of evidence considerations in mode of action/species concordance analysis based on both conventional data sources and evolving methods. Copyright © 2013 John Wiley & Sons, Ltd. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.

  2. Comprehensive mitigation framework for concurrent application of multiple clinical practice guidelines.

    PubMed

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Rosu, Daniela; Carrier, Marc; Kezadri-Hamiaz, Mounira

    2017-02-01

    In this work we propose a comprehensive framework based on first-order logic (FOL) for mitigating (identifying and addressing) interactions between multiple clinical practice guidelines (CPGs) applied to a multi-morbid patient while also considering patient preferences related to the prescribed treatment. With this framework we respond to two fundamental challenges associated with clinical decision support: (1) concurrent application of multiple CPGs and (2) incorporation of patient preferences into the decision making process. We significantly expand our earlier research by (1) proposing a revised and improved mitigation-oriented representation of CPGs and secondary medical knowledge for addressing adverse interactions and incorporating patient preferences and (2) introducing a new mitigation algorithm. Specifically, actionable graphs representing CPGs allow for parallel and temporal activities (decisions and actions). Revision operators representing secondary medical knowledge support temporal interactions and complex revisions across multiple actionable graphs. The mitigation algorithm uses the actionable graphs, revision operators and available (and possibly incomplete) patient information represented in FOL. It relies on a depth-first search strategy to find a valid sequence of revisions and uses theorem proving and model finding techniques to identify applicable revision operators and to establish a management scenario for a given patient if one exists. The management scenario defines a safe (interaction-free) and preferred set of activities together with possible patient states. We illustrate the use of our framework with a clinical case study describing two patients who suffer from chronic kidney disease, hypertension, and atrial fibrillation, and who are managed according to CPGs for these diseases. While in this paper we are primarily concerned with the methodological aspects of mitigation, we also briefly discuss a high-level proof of concept implementation of the proposed framework in the form of a clinical decision support system (CDSS). The proposed mitigation CDSS "insulates" clinicians from the complexities of the FOL representations and provides semantically meaningful summaries of mitigation results. Ultimately we plan to implement the mitigation CDSS within our MET (Mobile Emergency Triage) decision support environment. Copyright © 2016 Elsevier Inc. All rights reserved.
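
    A highly simplified sketch of the search strategy, assuming opaque helper callbacks in place of the framework's FOL machinery: where the real algorithm uses theorem proving and model finding to identify applicable revision operators and validate states, the toy version below simply backtracks depth-first over candidate operators until no adverse interactions remain.

    ```python
    def mitigate(graph_state, find_interactions, applicable_operators, depth=0, max_depth=10):
        """Depth-first search for a sequence of revision operators removing all interactions.

        graph_state          -- combined actionable-graph state for all guidelines (opaque here)
        find_interactions    -- callback returning the adverse interactions still present
        applicable_operators -- callback returning (name, apply_fn) pairs for one interaction

        Returns the list of operator names forming a valid management scenario, or None.
        """
        interactions = find_interactions(graph_state)
        if not interactions:
            return []                  # interaction-free: a valid scenario exists
        if depth >= max_depth:
            return None
        for name, apply_fn in applicable_operators(graph_state, interactions[0]):
            revised = apply_fn(graph_state)   # revision operators never mutate in place
            rest = mitigate(revised, find_interactions, applicable_operators,
                            depth + 1, max_depth)
            if rest is not None:
                return [name] + rest
        return None                    # dead end: backtrack
    ```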

  3. TECHNOLOGY ASSESSMENT IN HOSPITALS: LESSONS LEARNED FROM AN EMPIRICAL EXPERIMENT.

    PubMed

    Foglia, Emanuela; Lettieri, Emanuele; Ferrario, Lucrezia; Porazzi, Emanuele; Garagiola, Elisabetta; Pagani, Roberta; Bonfanti, Marzia; Lazzarotti, Valentina; Manzini, Raffaella; Masella, Cristina; Croce, Davide

    2017-01-01

    Hospital Based Health Technology Assessment (HBHTA) practices, which inform decision making at the hospital level, have emerged as an urgent priority for policy makers, hospital managers, and professionals. The present study crystallizes the results achieved by testing an original framework for HBHTA developed within the Lombardy Region: the IMPlementation of A Quick hospital-based HTA (IMPAQHTA). The study tested (i) the efficiency of the HBHTA framework, (ii) its feasibility, and (iii) the utility and completeness of the tool, considering its dimensions and sub-dimensions. The IMPAQHTA framework deployed the regional HTA program, activated in 2008 in Lombardy, at the hospital level. The relevance and feasibility of the framework were tested over a 3-year period through a large-scale empirical experiment involving seventy-four healthcare professionals, organized in different HBHTA teams, assessing thirty-two different technologies within twenty-two different hospitals. Semi-structured interviews and self-reported questionnaires were used to collect data regarding the relevance and feasibility of the IMPAQHTA framework. The proposed HBHTA framework proved suitable for application at the hospital level in the Italian context, permitting a quick assessment (11 working days) and providing hospital decision makers with relevant and quantitative information. Performance in terms of feasibility, utility, completeness, and ease of use proved satisfactory. IMPAQHTA was considered a complete and feasible HBHTA framework, replicable to different technologies within any hospital setting, thus demonstrating the capability of a hospital to develop a complete HTA if supported by adequate, well-defined tools and quantitative metrics.

  4. PKK-man: A System to Manage PKK Activities in Indonesia

    NASA Astrophysics Data System (ADS)

    Anggraini, R. N. E.; Rochimah, S.; Soedjono, A. R.

    2016-01-01

    The PKK community is a well-known women's community in Indonesia. PKK is organized from the national level down to the neighbourhood-association level and usually holds regular events and several activities. For this reason, PKK is expected to play an active role in national development, starting from the families of PKK members. This research develops a system to manage PKK activities and to support information sharing within the PKK community. The application was built using a client-server architecture. The web version of PKK-man was developed using a PHP framework and is intended for PKK board members; since they deal with larger amounts of data, the wider interface of the web version is more suitable for them. Ordinary PKK members access PKK-man through a mobile application, built using the PhoneGap framework, which supports different mobile operating systems.

  5. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  6. Ensemble framework based real-time respiratory motion prediction for adaptive radiotherapy applications.

    PubMed

    Tatinati, Sivanagaraja; Nazarpour, Kianoush; Tech Ang, Wei; Veluvolu, Kalyana C

    2016-08-01

    Successful treatment of tumors with motion-adaptive radiotherapy requires accurate prediction of respiratory motion, ideally with a prediction horizon larger than the latency in radiotherapy system. Accurate prediction of respiratory motion is however a non-trivial task due to the presence of irregularities and intra-trace variabilities, such as baseline drift and temporal changes in fundamental frequency pattern. In this paper, to enhance the accuracy of the respiratory motion prediction, we propose a stacked regression ensemble framework that integrates heterogeneous respiratory motion prediction algorithms. We further address two crucial issues for developing a successful ensemble framework: (1) selection of appropriate prediction methods to ensemble (level-0 methods) among the best existing prediction methods; and (2) finding a suitable generalization approach that can successfully exploit the relative advantages of the chosen level-0 methods. The efficacy of the developed ensemble framework is assessed with real respiratory motion traces acquired from 31 patients undergoing treatment. Results show that the developed ensemble framework improves the prediction performance significantly compared to the best existing methods. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
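
    A compact sketch of the stacked-ensemble idea on a toy breathing trace; the level-0 regressors and the level-1 generalizer below (an SVR, a small MLP, and ridge regression) are illustrative stand-ins, not the respiratory-motion predictors chosen in the paper.

    ```python
    import numpy as np
    from sklearn.ensemble import StackingRegressor
    from sklearn.linear_model import Ridge
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR

    def make_lagged(trace, lags, horizon):
        """Turn a 1-D trace into (X, y) pairs for `horizon`-step-ahead prediction."""
        X = np.array([trace[i - lags:i] for i in range(lags, len(trace) - horizon)])
        y = trace[lags + horizon:]
        return X, y

    rng = np.random.default_rng(1)
    t = np.arange(4000) / 25.0                     # 25 Hz sampling, illustrative
    trace = np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.normal(size=t.size)  # toy breathing

    X, y = make_lagged(trace, lags=50, horizon=10)  # ~400 ms horizon at 25 Hz
    split = int(0.8 * len(X))

    # Heterogeneous level-0 predictors combined by a linear level-1 generalizer.
    ensemble = StackingRegressor(
        estimators=[("svr", SVR(C=10.0)),
                    ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000,
                                         random_state=0))],
        final_estimator=Ridge(alpha=1.0),
    )
    ensemble.fit(X[:split], y[:split])
    rmse = np.sqrt(np.mean((ensemble.predict(X[split:]) - y[split:]) ** 2))
    print(f"test RMSE: {rmse:.4f}")
    ```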

  7. Development of an integrated economic and ecological framework for ecosystem-based fisheries management in New England

    NASA Astrophysics Data System (ADS)

    Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.

    2012-09-01

    We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of and modifications and limitations to the framework.

  8. A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.

    PubMed

    Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing

    2017-08-23

    Security analysis is a very important issue for digital watermarking. Several years ago, following Kerckhoffs' principle, the well-known four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of watermarked-only attack (WOA). However, up to now there has been little application of these security-level definitions to the theoretical analysis of the security of SS embedding schemes, owing to the difficulty of the analysis. In this paper, based on the security definitions, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes: the classical SS, the improved SS (ISS), the circular extension of ISS, and the nonrobust and robust natural watermarking. The theoretical analysis of these typical SS schemes is successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.
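
    The convolution device mentioned above can be stated compactly. For additive SS embedding with host x, secret-key-dependent carrier w(K), and embedding strength α, and assuming host and carrier are independent, the watermarked signal's density is the convolution of their densities:

    ```latex
    \mathbf{y} = \mathbf{x} + \alpha\, \mathbf{w}(K)
    \quad \Longrightarrow \quad
    p_{\mathbf{y} \mid K} \;=\; p_{\mathbf{x}} \,*\, p_{\alpha \mathbf{w}(K)}
    ```

    Loosely speaking, the security levels are then read off from how p_{y|K} depends on K: the less the observable distribution reveals about the key, the higher the level, with stego-security reached when the watermarked distribution is indistinguishable from the host distribution.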

  9. Sustainability in Health care by Allocating Resources Effectively (SHARE) 10: operationalising disinvestment in a conceptual framework for resource allocation.

    PubMed

    Harris, Claire; Green, Sally; Elshaug, Adam G

    2017-09-08

    This is the tenth in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. After more than a decade of research, there is little published evidence of active and successful disinvestment. The paucity of frameworks, methods and tools is reported to be a factor in the lack of success. However there are clear and consistent messages in the literature that can be used to inform development of a framework for operationalising disinvestment. This paper, along with the conceptual review of disinvestment in Paper 9 of this series, aims to integrate the findings of the SHARE Program with the existing disinvestment literature to address the lack of information regarding systematic organisation-wide approaches to disinvestment at the local health service level. A framework for disinvestment in a local healthcare setting is proposed. Definitions for essential terms and key concepts underpinning the framework have been made explicit to address the lack of consistent terminology. Given the negative connotations of the word 'disinvestment' and the problems inherent in considering disinvestment in isolation, the basis for the proposed framework is 'resource allocation' to address the spectrum of decision-making from investment to disinvestment. The focus is positive: optimising healthcare, improving health outcomes, using resources effectively. The framework is based on three components: a program for decision-making, projects to implement decisions and evaluate outcomes, and research to understand and improve the program and project activities. The program consists of principles for decision-making and settings that provide opportunities to introduce systematic prompts and triggers to initiate disinvestment. The projects follow the steps in the disinvestment process. Potential methods and tools are presented, however the framework does not stipulate project design or conduct; allowing application of any theories, methods or tools at each step. Barriers are discussed and examples illustrating constituent elements are provided. The framework can be employed at network, institutional, departmental, ward or committee level. It is proposed as an organisation-wide application, embedded within existing systems and processes, which can be responsive to needs and priorities at the level of implementation. It can be used in policy, management or clinical contexts.

  10. Self-perceived competency of infection control nurses based on Benner's framework: a nationwide survey in Korea.

    PubMed

    Kim, Kyung Mi; Choi, Jeong Sil

    2015-05-01

    The aim of this study was to evaluate the competency level of Korean infection control nurses (ICNs) by comparing the self-perceived competency level based on Benner's framework and the core competency proposed by the Certification Board of Infection Control. Study subjects included 90 ICNs working in Korean hospitals with more than 300 beds. A questionnaire was used to measure self-perceived competency level and core competency level. Using descriptive analysis, the core competency level of ICNs was found to differ significantly according to self-perceived competency level, and core competency level showed a significant increase with the increase of self-perceived competency level. Self-perceived competency level could be useful in classifying the competency level of nursing specialties. These results illustrate the competency levels of Korean ICNs and could serve as a reference to evaluate and expand the application of competency measurement not only for ICNs but also other groups of nurse specialists. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Smoothed Particle Hydrodynamic Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-10-05

    This code is a highly modular framework for developing smoothed particle hydrodynamic (SPH) simulations running on parallel platforms. The compartmentalization of the code allows for rapid development of new SPH applications and modifications of existing algorithms. The compartmentalization also allows changes in one part of the code used by many applications to instantly be made available to all applications.
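
    As an illustration of the kind of kernel routine such a framework compartmentalizes, here is a minimal serial SPH density summation using the common cubic-spline kernel; this is a generic textbook sketch, not code from the simulator.

    ```python
    import numpy as np

    def cubic_spline_W(r, h):
        """Standard 3-D cubic-spline smoothing kernel (one common SPH choice)."""
        q = r / h
        sigma = 1.0 / (np.pi * h**3)
        w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
            np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
        return sigma * w

    def densities(positions, masses, h):
        """SPH density estimate: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
        diff = positions[:, None, :] - positions[None, :, :]   # pairwise displacements
        r = np.linalg.norm(diff, axis=-1)
        return (masses[None, :] * cubic_spline_W(r, h)).sum(axis=1)

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 1.0, size=(200, 3))
    rho = densities(pos, masses=np.full(200, 1.0 / 200), h=0.15)
    print(rho.mean())
    ```

    In a modular design, the kernel choice, the neighbour search, and the density and force loops sit behind separate interfaces, so a change to any one of them is immediately available to every application built on the framework.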

  12. MARVIN: a medical research application framework based on open source software.

    PubMed

    Rudolph, Tobias; Puls, Marc; Anderegg, Christoph; Ebert, Lars; Broehan, Martina; Rudin, Adrian; Kowal, Jens

    2008-08-01

    This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows.

  13. A Concise and Practical Framework for the Development and Usability Evaluation of Patient Information Websites.

    PubMed

    Peute, L W; Knijnenburg, S L; Kremer, L C; Jaspers, M W M

    2015-01-01

    The Website Developmental Model for the Healthcare Consumer (WDMHC) is an extensive and successfully evaluated framework that incorporates user-centered design principles. However, due to its extensiveness its application is limited. In the current study we apply a subset of the WDMHC framework in a case study concerning the development and evaluation of a website aimed at childhood cancer survivors (CCS). To assess whether the implementation of a limited subset of the WDMHC-framework is sufficient to deliver a high-quality website with few usability problems, aimed at a specific patient population. The website was developed using a six-step approach divided into three phases derived from the WDMHC: 1) information needs analysis, mock-up creation and focus group discussion; 2) website prototype development; and 3) heuristic evaluation (HE) and think aloud analysis (TA). The HE was performed by three double experts (knowledgeable both in usability engineering and childhood cancer survivorship), who assessed the site using the Nielsen heuristics. Eight end-users were invited to complete three scenarios covering all functionality of the website by TA. The HE and TA were performed concurrently on the website prototype. The HE resulted in 29 unique usability issues; the end-users performing the TA encountered eleven unique problems. Four issues specifically revealed by HE concerned cosmetic design flaws, whereas two problems revealed by TA were related to website content. Based on the subset of the WDMHC framework we were able to deliver a website that closely matched the expectancy of the end-users and resulted in relatively few usability problems during end-user testing. With the successful application of this subset of the WDMHC, we provide developers with a clear and easily applicable framework for the development of healthcare websites with high usability aimed at specific medical populations.

  14. Self-Compassion: A Mentorship Framework for Counselor Educator Mothers

    ERIC Educational Resources Information Center

    Solomon, Coralis; Barden, Sejal Mehta

    2016-01-01

    Counselor educators experience high levels of stress. Mothers in academia face an additional set of emotional stressors. The authors offer a self-compassion framework for mentors to increase emotional resilience of mothers in counselor education.

  15. Fluorescence enhancement through the formation of a single-layer two-dimensional supramolecular organic framework and its application in highly selective recognition of picric acid.

    PubMed

    Zhang, Ying; Zhan, Tian-Guang; Zhou, Tian-You; Qi, Qiao-Yan; Xu, Xiao-Na; Zhao, Xin

    2016-06-18

    A two-dimensional (2D) supramolecular organic framework (SOF) has been constructed through the co-assembly of a triphenylamine-based building block and cucurbit[8]uril (CB[8]). Fluorescence turn-on of the non-emissive building block was observed upon the formation of the 2D SOF, which displayed highly selective and sensitive recognition of picric acid over a variety of nitroaromatics.

  16. Molecular simulations for energy, environmental and pharmaceutical applications of nanoporous materials: from zeolites, metal-organic frameworks to protein crystals.

    PubMed

    Jiang, Jianwen; Babarao, Ravichandar; Hu, Zhongqiao

    2011-07-01

    Nanoporous materials have widespread applications in chemical industry, but the pathway from laboratory synthesis and testing to practical utilization of nanoporous materials is substantially challenging and requires fundamental understanding from the bottom up. With ever-growing computational resources, molecular simulations have become an indispensable tool for material characterization, screening and design. This tutorial review summarizes the recent simulation studies in zeolites, metal-organic frameworks and protein crystals, and provides a molecular overview for energy, environmental and pharmaceutical applications of nanoporous materials with increasing degree of complexity in building blocks. It is demonstrated that molecular-level studies can bridge the gap between physical and engineering sciences, unravel microscopic insights that are otherwise experimentally inaccessible, and assist in the rational design of new materials. The review is concluded with major challenges in future simulation exploration of novel nanoporous materials for emerging applications.

  17. Application of STORMTOOLS's simplified flood inundation model with sea level rise to assess impacts to RI coastal areas

    NASA Astrophysics Data System (ADS)

    Spaulding, M. L.

    2015-12-01

    The vision for STORMTOOLS is to provide access to a suite of coastal planning tools (numerical models and others), available as a web service, that allows widespread accessibility and applicability at high resolution for user-selected coastal areas of interest. The first products developed under this framework were flood inundation maps, with and without sea level rise, for varying return periods for RI coastal waters. The flood mapping methodology starts from the water level versus return period relationship at a primary NOAA water level gauging station and then spatially scales the values, based on the predictions of high-resolution storm and wave simulations performed on an unstructured grid by the US Army Corps of Engineers North Atlantic Comprehensive Coastal Study (NACCS) for tropical and extratropical storms, to estimate inundation levels for varying return periods. The scaling for the RI application used Newport, RI water levels as the reference point. Predictions are provided for 25, 50, and 100 yr return periods (at the upper 95% confidence level), with sea level rises of 1, 2, 3, and 5 ft. Simulations have also been performed for historical hurricane events, including 1938, Carol (1954), Bob (1991), and Sandy (2012), and for nuisance flooding events with return periods of 1, 3, 5, and 10 yr. Access to the flooding maps is via a web-based map viewer that seamlessly covers all coastal waters of the state at one meter resolution. The GIS structure of the map viewer allows overlays of additional relevant data sets (roads and highways, wastewater treatment facilities, schools, hospitals, emergency evacuation routes, etc.) as desired by the user. The simplified flooding maps are publicly available and are now being implemented for state and community resilience planning and vulnerability assessment activities in response to climate change impacts.
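
    The spatial-scaling arithmetic can be illustrated with a small sketch; the function name and all numbers below are hypothetical, and the operational STORMTOOLS procedure should be taken from its documentation.

    ```python
    def scaled_inundation_level(gauge_level, naccs_site_level, naccs_gauge_level, slr=0.0):
        """Scale a reference-gauge return-period water level to another site.

        gauge_level       -- statistical water level (ft) at the reference NOAA gauge
                             (Newport, RI in this application) for the chosen return period
        naccs_site_level  -- NACCS-simulated level at the site of interest
        naccs_gauge_level -- NACCS-simulated level at the reference gauge, same storm set
        slr               -- optional sea level rise offset (ft)
        """
        return gauge_level * (naccs_site_level / naccs_gauge_level) + slr

    # Illustrative (not measured) numbers: a site where NACCS predicts 10% higher surge
    # than at Newport, a 100 yr gauge level of 9.5 ft, and 2 ft of sea level rise.
    print(scaled_inundation_level(9.5, 11.0, 10.0, slr=2.0))  # about 12.45 ft
    ```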

  18. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
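
    For context, TMCMC, the framework's core sampler, works through a sequence of tempered posteriors that bridge the prior and the posterior:

    ```latex
    p_{j}(\boldsymbol{\theta}) \;\propto\; p(\boldsymbol{\theta}) \, \big[ p(\mathcal{D} \mid \boldsymbol{\theta}) \big]^{q_{j}},
    \qquad 0 = q_{0} < q_{1} < \cdots < q_{m} = 1
    ```

    Each stage reweights and locally perturbs the previous stage's samples, with the exponent increment commonly chosen adaptively to keep the variability of the stage weights near a target. Because the per-sample likelihood evaluations at each stage are independent, the method is embarrassingly parallel, which is what makes it a good match for the task-based scheduling described above.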

  19. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors, and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrating five different water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python applications for the user interface and fuzzy logic for decision making. Introducing multiple sensors into a water distribution network generates a huge number of data matrices, which can be highly complex, difficult to understand, and too convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of the obtained sensor data matrices and to support decision making by water engineers through the soft computing framework. The target of this research is to provide a simple and efficient method to detect the presence of contamination in a water distribution network using applications of CPS.
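
    As a toy illustration of the fuzzy decision layer, the sketch below flags contamination risk from three of the monitored parameters; the membership breakpoints, the rule, and the parameter choices are illustrative placeholders, not the thresholds used by the authors.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def contamination_risk(turbidity_ntu, free_chlorine_mg_l, ph):
        """Toy Mamdani-style rule: high turbidity AND low chlorine AND abnormal pH -> risk.

        Membership breakpoints are illustrative placeholders, not regulatory limits.
        """
        high_turbidity = tri(turbidity_ntu, 1.0, 5.0, 9.0)
        low_chlorine = tri(free_chlorine_mg_l, -0.1, 0.0, 0.5)
        abnormal_ph = max(tri(ph, 4.0, 5.5, 6.8), tri(ph, 8.2, 9.5, 11.0))
        # min() as the fuzzy AND; the crisp score would feed a defuzzification step.
        return min(high_turbidity, low_chlorine, abnormal_ph)

    print(contamination_risk(turbidity_ntu=6.0, free_chlorine_mg_l=0.1, ph=9.0))
    ```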

  20. Application Level Protocol Development for Library and Information Science Applications. Volume 1: Service Definition. Volume 2: Protocol Specification. Report No. TG.1.5; TG.50.

    ERIC Educational Resources Information Center

    Aagaard, James S.; And Others

    This two-volume document specifies a protocol that was developed using the Reference Model for Open Systems Interconnection (OSI), which provides a framework for communications within a heterogeneous network environment. The protocol implements the features necessary for bibliographic searching, record maintenance, and mail transfer between…

  1. Contributions of low- and high-level properties to neural processing of visual scenes in the human brain.

    PubMed

    Groen, Iris I A; Silson, Edward H; Baker, Chris I

    2017-02-19

    Visual scene analysis in humans has been characterized by the presence of regions in extrastriate cortex that are selectively responsive to scenes compared with objects or faces. While these regions have often been interpreted as representing high-level properties of scenes (e.g. category), they also exhibit substantial sensitivity to low-level (e.g. spatial frequency) and mid-level (e.g. spatial layout) properties, and it is unclear how these disparate findings can be united in a single framework. In this opinion piece, we suggest that this problem can be resolved by questioning the utility of the classical low- to high-level framework of visual perception for scene processing, and discuss why low- and mid-level properties may be particularly diagnostic for the behavioural goals specific to scene perception as compared to object recognition. In particular, we highlight the contributions of low-level vision to scene representation by reviewing (i) retinotopic biases and receptive field properties of scene-selective regions and (ii) the temporal dynamics of scene perception that demonstrate overlap of low- and mid-level feature representations with those of scene category. We discuss the relevance of these findings for scene perception and suggest a more expansive framework for visual scene analysis. This article is part of the themed issue 'Auditory and visual scene analysis'. © 2017 The Author(s).

  3. Enterprise and system of systems capability development life-cycle processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  4. Optimal bioprocess design through a gene regulatory network - growth kinetic hybrid model: Towards Replacing Monod kinetics.

    PubMed

    Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios

    2018-05-02

    Currently, design and optimisation of biotechnological bioprocesses is performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. Although elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional expression kinetics of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability and potential as a systematic optimal bioprocess design tool were demonstrated by effectively predicting bioprocess performance in agreement with experimental values, whereas four commonly used models deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
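
    For reference, the empirical baseline being replaced is Monod growth kinetics, shown below in its single-substrate form together with one common additive two-substrate variant:

    ```latex
    \mu = \mu_{\max} \, \frac{S}{K_{s} + S},
    \qquad
    \mu = \mu_{\max,1} \frac{S_{1}}{K_{1} + S_{1}} \; + \; \mu_{\max,2} \frac{S_{2}}{K_{2} + S_{2}}
    ```

    Such expressions treat the cell as a black box with fixed parameters; in the hybrid framework, the growth-kinetic terms are instead informed by the GRN model's promoter expression dynamics, which is what allows mixed-substrate effects such as the m-xylene/toluene interaction to be captured.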

  5. Modular System for Shelves and Coasts (MOSSCO v1.0) - a flexible and multi-component framework for coupled coastal ocean ecosystem modelling

    NASA Astrophysics Data System (ADS)

    Lemmen, Carsten; Hofmeister, Richard; Klingbeil, Knut; Hassan Nasermoaddeli, M.; Kerimoglu, Onur; Burchard, Hans; Kösters, Frank; Wirtz, Kai W.

    2018-03-01

    Shelf and coastal sea processes extend from the atmosphere through the water column and into the seabed. These processes reflect intimate interactions between physical, chemical, and biological states on multiple scales. As a consequence, coastal system modelling requires a high and flexible degree of process and domain integration; this has so far hardly been achieved by current model systems. The lack of modularity and flexibility in integrated models hinders the exchange of data and model components and has historically imposed the supremacy of specific physical driver models. We present the Modular System for Shelves and Coasts (MOSSCO; http://www.mossco.de), a novel domain and process coupling system tailored to, though not limited to, the coupling challenges and applications of the coastal ocean. MOSSCO builds on the Earth System Modeling Framework (ESMF) and on the Framework for Aquatic Biogeochemical Models (FABM). It goes beyond existing technologies by creating a unique level of modularity in both domain and process coupling, including a clear separation of component and basic model interfaces, flexible scheduling of several tens of models, and facilitation of iterative development at the laboratory, station, and coastal ocean scales. MOSSCO is rich in metadata and its concepts are also applicable outside the coastal domain. For coastal modelling, it contains dozens of example coupling configurations and tested set-ups for coupled applications. Thus, MOSSCO addresses the technology needs of a growing marine coastal Earth system community that encompasses very different disciplines, numerical tools, and research questions.

  6. Local linear discriminant analysis framework using sample neighbors.

    PubMed

    Fan, Zizhu; Xu, Yong; Zhang, David

    2011-07-01

    The linear discriminant analysis (LDA) is a very popular linear feature extraction approach. The algorithms of LDA usually perform well under the following two assumptions. The first assumption is that the global data structure is consistent with the local data structure. The second assumption is that the input data classes are Gaussian distributions. However, in real-world applications, these assumptions are not always satisfied. In this paper, we propose an improved LDA framework, the local LDA (LLDA), which can perform well without needing to satisfy the above two assumptions. Our LLDA framework can effectively capture the local structure of samples. According to different types of local data structure, our LLDA framework incorporates several different forms of linear feature extraction approaches, such as the classical LDA and principal component analysis. The proposed framework includes two LLDA algorithms: a vector-based LLDA algorithm and a matrix-based LLDA (MLLDA) algorithm. MLLDA is directly applicable to image recognition, such as face recognition. Our algorithms need to train only a small portion of the whole training set before testing a sample. They are suitable for learning large-scale databases especially when the input data dimensions are very high and can achieve high classification accuracy. Extensive experiments show that the proposed algorithms can obtain good classification results.
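
    To make the neighborhood idea concrete, a minimal sketch of a local LDA classifier is given below: each test sample is classified by a standard LDA trained only on its k nearest training samples. This illustrates the general local-training principle under our own simplifying assumptions; it is not the exact LLDA/MLLDA algorithms of the paper.

        # Minimal local-LDA sketch: fit a standard LDA on each test sample's
        # k-nearest-neighbor subset of the training data (illustrative only).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neighbors import NearestNeighbors

        def local_lda_predict(X_train, y_train, X_test, k=50):
            nn = NearestNeighbors(n_neighbors=k).fit(X_train)
            preds = []
            for x in X_test:
                _, idx = nn.kneighbors(x.reshape(1, -1))
                Xn, yn = X_train[idx[0]], y_train[idx[0]]
                if len(np.unique(yn)) == 1:      # single-class neighborhood
                    preds.append(yn[0])
                    continue
                lda = LinearDiscriminantAnalysis().fit(Xn, yn)
                preds.append(lda.predict(x.reshape(1, -1))[0])
            return np.array(preds)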

  7. A Computational Framework for High-Throughput Isotopic Natural Abundance Correction of Omics-Level Ultra-High Resolution FT-MS Datasets

    PubMed Central

    Carreer, William J.; Flight, Robert M.; Moseley, Hunter N. B.

    2013-01-01

    New metabolomics applications of ultra-high resolution and accuracy mass spectrometry can provide thousands of detectable isotopologues, with the number of potentially detectable isotopologues increasing exponentially with the number of stable isotopes used in newer isotope tracing methods like stable isotope-resolved metabolomics (SIRM) experiments. This huge increase in usable data requires software capable of correcting the large number of isotopologue peaks resulting from SIRM experiments in a timely manner. We describe the design of a new algorithm and software system capable of handling these high volumes of data, while including quality control methods for maintaining data quality. We validate this new algorithm against a previous single isotope correction algorithm in a two-step cross-validation. Next, we demonstrate the algorithm and correct for the effects of natural abundance for both 13C and 15N isotopes on a set of raw isotopologue intensities of UDP-N-acetyl-D-glucosamine derived from a 13C/15N-tracing experiment. Finally, we demonstrate the algorithm on a full omics-level dataset. PMID:24404440
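
    The single-isotope case underlying such correction algorithms reduces to a triangular linear system: the observed intensity of isotopologue M+i is a binomial mixture of the true enrichment states M+j (j ≤ i) with naturally occurring heavy isotopes on the remaining atoms. A minimal sketch of that idea, not the authors' high-throughput implementation:

        # Minimal sketch of single-isotope (here 13C) natural abundance correction:
        # observed[i] = sum_j C(N-j, i-j) * p^(i-j) * (1-p)^(N-i) * corrected[j].
        import numpy as np
        from scipy.special import comb

        def correction_matrix(n_atoms, p=0.0107):      # p = natural 13C abundance
            M = np.zeros((n_atoms + 1, n_atoms + 1))
            for j in range(n_atoms + 1):               # j = truly labeled atoms
                for i in range(j, n_atoms + 1):        # i = observed heavy atoms
                    M[i, j] = comb(n_atoms - j, i - j) * p**(i - j) * (1 - p)**(n_atoms - i)
            return M

        observed = np.array([0.78, 0.16, 0.06])        # hypothetical M0..M2 intensities
        corrected = np.linalg.solve(correction_matrix(2), observed)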

  8. The coupling of the neutron transport application RATTLESNAKE to the nuclear fuels performance application BISON under the MOOSE framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gleicher, Frederick N.; Williamson, Richard L.; Ortensi, Javier

    The MOOSE neutron transport application RATTLESNAKE was coupled to the fuels performance application BISON to provide a higher fidelity tool for fuel performance simulation. This project is motivated by the desire to couple a high fidelity core analysis program (based on the self-adjoint angular flux equations) to a high fidelity fuel performance program, both of which can simulate on unstructured meshes. RATTLESNAKE solves the self-adjoint angular flux transport equation and provides sub-pin level resolution of the multigroup neutron flux with resonance treatment during burnup or a fast transient. BISON solves the coupled thermomechanical equations for the fuel on a sub-millimeter scale. Both applications are able to solve their respective systems on aligned and unaligned unstructured finite element meshes. The power density and local burnup were transferred from RATTLESNAKE to BISON with the MOOSE MultiApp transfer system. Multiple depletion cases were run with one-way data transfer from RATTLESNAKE to BISON. The eigenvalues are shown to agree well with values obtained from the lattice physics code DRAGON. The one-way data transfer of power density is shown to agree with the power density obtained from an internal Lassman-style model in BISON.

  9. The Diamond Beamline Controls and Data Acquisition Software Architecture

    NASA Astrophysics Data System (ADS)

    Rees, N.

    2010-06-01

    The software for the Diamond Light Source beamlines[1] is based on two complementary software frameworks: low-level control is provided by the Experimental Physics and Industrial Control System (EPICS) framework[2][3], and the high-level user interface is provided by the Java-based Generic Data Acquisition system, or GDA[4][5]. EPICS provides a widely used, robust, generic interface across a wide range of hardware, with user interfaces focused on giving engineers and beamline scientists detailed low-level views of all aspects of the beamline control systems. The GDA system provides a high-level environment that combines an understanding of scientific concepts, such as reciprocal lattice coordinates, a flexible Python-syntax scripting interface through which scientific users control their data acquisition, and graphical user interfaces where necessary. This paper describes the beamline software architecture in more detail, highlighting how these complementary frameworks provide a flexible system that can accommodate a wide range of requirements.
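
    The division of labour between the two frameworks can be illustrated at the EPICS end: a client reads and writes named process variables (PVs), and everything scientific is layered above that. A hedged sketch using the pyepics client library, with hypothetical PV names rather than Diamond's actual ones:

        # Low-level EPICS channel access from Python via pyepics.
        # The PV names below are hypothetical, for illustration only.
        from epics import caget, caput

        energy = caget('BL99:DCM:ENERGY_RBV')   # read a monochromator energy readback
        caput('BL99:DCM:ENERGY_SP', 12.4)       # request a 12.4 keV setpoint

    GDA-style tooling then composes such channels into scans expressed in scientific terms (e.g., reciprocal lattice coordinates) rather than individual PVs.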

  10. Deep Packet/Flow Analysis using GPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong, Qian; Wu, Wenji; DeMar, Phil

    Deep packet inspection (DPI) faces severe performance challenges in high-speed networks (40/100 GE), as it requires a large amount of raw computing power and high I/O throughput. Recently, researchers have tentatively used GPUs to address these issues and boost the performance of DPI. Typically, DPI applications involve highly complex operations at both the per-packet and per-flow data level, often in real time. The parallel architecture of GPUs fits exceptionally well for per-packet network traffic processing. However, for stateful network protocols such as TCP, the data stream needs to be reconstructed at the per-flow level to deliver a consistent content analysis. Since the flow-centric operations are naturally antiparallel and often require large memory space for buffering out-of-sequence packets, they can be problematic for GPUs, whose memory is normally limited to several gigabytes. In this work, we present a highly efficient GPU-based deep packet/flow analysis framework. The proposed design includes purely GPU-implemented flow tracking and TCP stream reassembly. Instead of buffering and waiting for TCP packets to become in-sequence, our framework processes the packets in batches and uses a deterministic finite automaton (DFA) with a prefix-/suffix-tree method to detect patterns across out-of-sequence packets that happen to be located in different batches. Evaluation shows that our code can reassemble and forward tens of millions of packets per second and conduct stateful signature-based deep packet inspection at 55 Gbit/s using an NVIDIA K40 GPU.
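
    The cross-batch matching idea rests on the fact that a DFA's state is a single integer that can be carried over packet boundaries, so a signature split across out-of-sequence batches is still detected once every byte has passed through the automaton. A minimal single-pattern, CPU-side sketch of that principle (the paper's system implements it on the GPU with prefix-/suffix-tree support):

        # Minimal sketch: a KMP-style DFA whose state survives packet boundaries,
        # so a signature split across packets is still detected.
        def build_dfa(pattern):
            fail = [0] * (len(pattern) + 1)
            k = 0
            for i in range(1, len(pattern)):
                while k and pattern[i] != pattern[k]:
                    k = fail[k]
                k += pattern[i] == pattern[k]
                fail[i + 1] = k
            def step(state, byte):
                while state and byte != pattern[state]:
                    state = fail[state]
                return state + (byte == pattern[state])
            return step

        pattern = b'attack'
        step, state = build_dfa(pattern), 0
        for packet in (b'...att', b'ack...'):   # signature spans two packets
            for byte in packet:
                state = step(state, byte)
                if state == len(pattern):
                    print('signature matched')
                    state = 0                   # reset and keep scanning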

  11. A Review on Breathing Behaviors of Metal-Organic-Frameworks (MOFs) for Gas Adsorption

    PubMed Central

    Alhamami, Mays; Doan, Huu; Cheng, Chil-Hung

    2014-01-01

    Metal-organic frameworks (MOFs) are a new class of microporous materials that possess framework flexibility, large surface areas, “tailor-made” framework functionalities, and tunable pore sizes. These features give MOFs superior performance and broader application spectra than those of zeolites and phosphine-based molecular sieves. In parallel with the design of new structures and new chemistry of MOFs, the observation of unique breathing behaviors upon adsorption of gases or solvents has stimulated their potential application as host materials in gas storage for renewable energy. This has attracted intense research effort to understand the causes at the atomic level, using in situ X-ray diffraction, calorimetry, Fourier transform infrared spectroscopy, and molecular dynamics simulations. This article is developed in the following order: first, the definition of MOFs and the observation of their framework flexibility are introduced. Second, synthesis routes of MOFs are summarized, with emphasis on hydrothermal synthesis owing to the environmental benignity and economic availability of water. Third, MOFs exhibiting breathing behaviors are summarized, followed by rationales from a thermodynamic viewpoint. Subsequently, the effects of various functionalities on breathing behaviors are appraised, including the use of post-synthetic modification routes. Finally, possible framework spatial requirements of MOFs for yielding breathing behaviors are highlighted as design strategies for new syntheses. PMID:28788614

  12. Parallelization and checkpointing of GPU applications through program transformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solano-Quinde, Lizandro Damian

    2012-01-01

    GPUs have emerged as a powerful tool for accelerating general-purpose applications. The availability of programming languages that make writing general-purpose applications for GPUs tractable has consolidated GPUs as an alternative for accelerating general-purpose applications. Among the areas that have benefited from GPU acceleration are signal and image processing, computational fluid dynamics, quantum chemistry, and, in general, the High Performance Computing (HPC) industry. In order to continue to exploit higher levels of parallelism with GPUs, multi-GPU systems are gaining popularity. In this context, single-GPU applications are parallelized for running on multi-GPU systems. Furthermore, multi-GPU systems help to solve the GPU memory limitation for applications with a large application memory footprint. Parallelizing single-GPU applications has been approached with libraries that distribute the workload at runtime; however, these impose execution overhead and are not portable. On traditional CPU systems, by contrast, parallelization has been approached through application transformation at pre-compile time, which enhances the application to distribute the workload at the application level and does not share the issues of library-based approaches. Hence, a parallelization scheme for GPU systems based on application transformation is needed. Like any computing engine of today, GPUs also raise reliability concerns: they are vulnerable to transient and permanent failures, and current checkpoint/restart techniques are not suitable for systems with GPUs. Checkpointing for GPU systems presents new and interesting challenges, primarily due to the natural differences imposed by the hardware design, the memory subsystem architecture, the massive number of threads, and the limited amount of synchronization among threads. Therefore, a checkpoint/restart technique suitable for GPU systems is needed. The goal of this work is to exploit higher levels of parallelism and to develop support for application-level fault tolerance in applications using multiple GPUs. Our techniques reduce the burden of enhancing single-GPU applications to support these features. To achieve our goal, this work designs and implements a framework for enhancing a single-GPU OpenCL application through application transformation.

  13. Compensation for Lithography Induced Process Variations during Physical Design

    NASA Astrophysics Data System (ADS)

    Chin, Eric Yiow-Bing

    This dissertation addresses the challenge of designing robust integrated circuits in the deep-submicron regime in the presence of lithography process variability. By extending and combining existing process and circuit analysis techniques, flexible software frameworks are developed to provide detailed studies of circuit performance in the presence of lithography variations such as focus and exposure. Applications of these software frameworks to select circuits demonstrate the electrical impact of these variations and provide insight into variability-aware compact models that capture the process-dependent circuit behavior. These variability-aware timing models abstract lithography variability from the process level to the circuit level and are used to estimate path-level circuit performance with high accuracy and very little runtime overhead. The Interconnect Variability Characterization (IVC) framework maps lithography-induced geometrical variations at the interconnect level to electrical delay variations. This framework is applied to one-dimensional repeater circuits patterned with both 90nm single patterning and 32nm double patterning technologies, under the presence of focus, exposure, and overlay variability. Studies indicate that single and double patterning layouts generally exhibit small variations in delay (between 1--3%) due to self-compensating RC effects associated with dense layouts, and overlay errors for layouts without self-compensating RC effects. The delay response of each double-patterned interconnect structure is fit with a second-order polynomial model in focus, exposure, and misalignment parameters, with 12 coefficients and residuals of less than 0.1 ps. The IVC framework is also applied to a repeater circuit with cascaded interconnect structures to emulate more complex layout scenarios, and it is observed that the variations on each segment average out to reduce the overall delay variation. The Standard Cell Variability Characterization (SCVC) framework advances existing layout-level lithography-aware circuit analysis by extending it to cell-level applications, utilizing a physically accurate approach that integrates process simulation, compact transistor models, and circuit simulation to characterize electrical cell behavior. This framework is applied to combinational and sequential cells in the Nangate 45nm Open Cell Library, and the timing response of these cells to lithography focus and exposure variations demonstrates Bossung-like behavior. This behavior permits the process-parameter-dependent response to be captured in a nine-term variability-aware compact model based on Bossung fitting equations. For a two-input NAND gate, the variability-aware compact model captures the simulated response to an accuracy of 0.3%. The SCVC framework is also applied to investigate advanced process effects, including misalignment and layout proximity. The abstraction of process variability from the layout level to the cell level opens up an entire new realm of circuit analysis and optimization and provides a foundation for path-level variability analysis without the computationally expensive costs associated with joint process and circuit simulation. The SCVC framework is used with slight modification to illustrate the speedup and accuracy tradeoffs of using compact models. With variability-aware compact models, the process-dependent performance of a three-stage logic circuit can be estimated to an accuracy of 0.7% with a speedup of over 50,000.
    Path-level variability analysis also provides an accurate estimate (within 1%) of ring oscillator period in well under a second. Another significant advantage of variability-aware compact models is that they can be easily incorporated into existing design methodologies for design optimization. This is demonstrated by applying cell swapping to a logic circuit to reduce the overall delay variability along a circuit path. By including these variability-aware compact models in cell characterization libraries, design metrics such as circuit timing, power, area, and delay variability can be quickly assessed to optimize for the correct balance of all design metrics, including delay variability. Deterministic lithography variations can be easily captured using the variability-aware compact models described in this dissertation. However, another prominent source of variability is random dopant fluctuation, which affects transistor threshold voltage and in turn circuit performance. The SCVC framework is utilized to investigate the interactions between deterministic lithography variations and random dopant fluctuations. Monte Carlo studies show that the output delay distribution in the presence of random dopant fluctuations is dependent on lithography focus and exposure conditions, with a 3.6 ps change in standard deviation across the focus-exposure process window. This indicates that the electrical impact of random variations is dependent on systematic lithography variations, and this dependency should be included for precise analysis.
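
    Although the abstract does not spell out the fitting form, a nine-term "Bossung-style" response surface is most naturally read as a full quadratic in focus $F$ and exposure $E$,

        $d(F, E) = \sum_{i=0}^{2} \sum_{j=0}^{2} c_{ij} \, F^{i} E^{j}$,

    whose nine coefficients $c_{ij}$ would be fit per cell and per timing arc; this reading is our assumption, offered only to make the term count concrete.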

  14. Ultrathin Two-Dimensional Covalent Organic Framework Nanosheets: Preparation and Application in Highly Sensitive and Selective DNA Detection.

    PubMed

    Peng, Yongwu; Huang, Ying; Zhu, Yihan; Chen, Bo; Wang, Liying; Lai, Zhuangchai; Zhang, Zhicheng; Zhao, Meiting; Tan, Chaoliang; Yang, Nailiang; Shao, Fangwei; Han, Yu; Zhang, Hua

    2017-06-28

    The ability to prepare ultrathin two-dimensional (2D) covalent organic framework (COF) nanosheets (NSs) in high yield is of great importance for the further exploration of their unique properties and potential applications. Herein, by elaborately designing and choosing two flexible molecules with C3v molecular symmetry as building units, a novel imine-linked COF, namely, TPA-COF, with a hexagonal layered structure and sheet-like morphology, is synthesized. Since the flexible building units are integrated into the COF skeletons, the interlayer stacking becomes weak, resulting in the easy exfoliation of TPA-COF into ultrathin 2D NSs. Impressively, for the first time, the detailed structural information, i.e., the pore channels and individual building units in the NSs, is clearly visualized by using the recently developed low-dose imaging technique of transmission electron microscopy (TEM). As a proof-of-concept application, the obtained ultrathin COF NSs are used as a novel fluorescence sensing platform for the highly sensitive and selective detection of DNA.

  15. Hazard Analysis and Safety Requirements for Small Drone Operations: To What Extent Do Popular Drones Embed Safety?

    PubMed

    Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela

    2018-03-01

    Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.

  16. Transforming guided waves with metamaterial waveguide cores

    NASA Astrophysics Data System (ADS)

    Viaene, S.; Ginis, V.; Danckaert, J.; Tassin, P.

    2016-04-01

    Metamaterials make use of subwavelength building blocks to enhance our control of the propagation of light. To determine the required material properties for a given functionality, i.e., a set of desired light flows inside a metamaterial device, metamaterial designs often rely on a geometrical design tool known as transformation optics. In recent years, applications in integrated photonics motivated several research groups to develop two-dimensional versions of transformation optics capable of routing surface waves along graphene-dielectric and metal-dielectric interfaces. Although guided electromagnetic waves are highly relevant to applications in integrated optics, no consistent transformation-optical framework has so far been developed for slab waveguides. Indeed, the conventional application of transformation optics to dielectric slab waveguides leads to bulky three-dimensional devices with metamaterial implementations both inside and outside of the waveguide's core. In this contribution, we develop a transformation-optical framework that still results in thin metamaterial waveguide devices consisting of a nonmagnetic metamaterial core of varying thickness [Phys. Rev. B 93.8, 085429 (2016)]. We numerically demonstrate the effectiveness and versatility of our equivalence relations with three crucial functionalities: a beam bender, a beam splitter, and a conformal lens. Our devices perform well on a qualitative (comparison of fields) and quantitative (comparison of transmitted power) level compared to their bulky counterparts. As a result, the geometrical toolbox of transformation optics may lead to a plethora of integrated metamaterial devices to route guided waves along optical chips.

  17. Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing.

    PubMed

    Kriegeskorte, Nikolaus

    2015-11-24

    Recent advances in neural network modeling have enabled major strides in computer vision and other artificial intelligence applications. Human-level visual recognition abilities are coming within reach of artificial systems. Artificial neural networks are inspired by the brain, and their computations could be implemented in biological neurons. Convolutional feedforward networks, which now dominate computer vision, take further inspiration from the architecture of the primate visual hierarchy. However, the current models are designed with engineering goals, not to model brain computations. Nevertheless, initial studies comparing internal representations between these models and primate brains find surprisingly similar representational spaces. With human-level performance no longer out of reach, we are entering an exciting new era, in which we will be able to build biologically faithful feedforward and recurrent computational models of how biological brains perform high-level feats of intelligence, including vision.

  18. Beamline Insertions Manager at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Michael C.

    2015-09-01

    The beam viewer system at Jefferson Lab provides operators and beam physicists with qualitative and quantitative information on the transverse electron beam properties. There are over 140 beam viewers installed on the 12 GeV CEBAF accelerator. This paper describes an upgrade consisting of replacing the EPICS-based system tasked with managing all viewers with a mixed system utilizing EPICS and high-level software. Most devices, particularly the beam viewers, cannot be safely inserted into the beam line during high-current beam operations. Software is partly responsible for protecting the machine from untimely insertions. The multiplicity of beam-blocking and beam-vulnerable devices motivates us to try a data-driven approach. The beamline insertions application components are centrally managed and configured through an object-oriented software framework created for this purpose. A rules-based engine tracks the configuration and status of every device, along with the beam status of the machine segment containing the device. The application uses this information to decide which device actions are allowed at any given time.

  19. Bed II Sequence Stratigraphic context of EF-HR and HWK EE archaeological sites, and the Oldowan/Acheulean succession at Olduvai Gorge, Tanzania.

    PubMed

    Stanistreet, Ian G; McHenry, Lindsay J; Stollhofen, Harald; de la Torre, Ignacio

    2018-04-20

    Archaeological excavations at EF-HR and HWK EE allow reassessment of Bed II stratigraphy within the Junction Area and eastern Olduvai Gorge. Application of Sequence Stratigraphic methods provides a time-stratigraphic framework enabling correlation of sedimentary units across facies boundaries, applicable even in those areas where conventional timelines, such as tephrostratigraphic markers, are absent, eroded, or reworked. Sequence Stratigraphically, Bed II subdivides into five major Sequences 1 to 5, all floored by major disconformities that incise deeply into the underlying succession, proving that simple "layer cake" stratigraphy is inappropriate. Previous establishment of the Lemuta Member has invalidated the use of Tuff IIA as the boundary between Lower and Middle Bed II, now redefined at the disconformity between Sequences 2 and 3, a lithostratigraphic contact underlying the succession containing the Lower, Middle, and Upper Augitic Sandstones. HWK EE site records Oldowan technology in the Lower Augitic Sandstone at the base of Sequence 3, within Middle Bed II. We suggest placement of recently reported Acheulean levels at FLK W within the Middle Augitic Sandstone, thus emphasizing that handaxes are yet to be found in earlier stratigraphic units of the Olduvai sequence. This would place a boundary between the Oldowan and Acheulean technologies at Olduvai in the Tuff IIB zone or earliest Middle Augitic Sandstone. A major disconformity between Sequences 3 and 4 at and near EF-HR cuts through the level of Tuff IIC, placing the main Acheulean EF-HR assemblage at the base of Sequence 4, within Upper rather than Middle Bed II. Sequence stratigraphic methods also yield a more highly resolved Bed II stratigraphic framework. Backwall and sidewall surveying of archaeological trenches at EF-HR and HWK EE permits definition of "Lake-parasequences" nested within the major Sequences that record downcutting of disconformities associated with lake regression, then sedimentation associated with lake transgression, capped finally by another erosional disconformity or hiatal paraconformity caused by the next lake withdrawal. On a relative time-scale rather than a vertical metre scale, the resulting Wheeler diagram framework provides a basis for recognizing time-equivalent depositional episodes and the position of time gaps at various scales. Relative timing of archaeological assemblage levels can then be differentiated at a millennial scale within this framework. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Earth Science Mobile App Development for Non-Programmers

    NASA Astrophysics Data System (ADS)

    Oostra, D.; Crecelius, S.; Lewis, P.; Chambers, L. H.

    2012-08-01

    A number of cloud based visual development tools have emerged that provide methods for developing mobile applications quickly and without previous programming experience. The MY NASA DATA (MND) team would like to begin a discussion on how we can best leverage current mobile app technologies and available Earth science datasets. The MY NASA DATA team is developing an approach based on two main ideas. The first is to teach our constituents how to create mobile applications that interact with NASA datasets; the second is to provide web services or Application Programming Interfaces (APIs) that create sources of data that educators, students and scientists can use in their own mobile app development. This framework allows data providers to foster mobile application development and interaction while not becoming a software clearing house. MY NASA DATA's research has included meetings with local data providers, educators, libraries and individuals. A high level of interest has been identified from initial discussions and interviews. This overt interest combined with the marked popularity of mobile applications in our societies has created a new channel for outreach and communications with and between the science and educational communities.

  1. Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bawej, Tomasz; et al.

    2014-01-01

    TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running on a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability without the software being adequately aware of the IRQ (Interrupt Request), CPU, and memory affinities. During the first long shutdown of the LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards, and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software wraps the low-level socket library to ease higher-level programming, with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved, and the performance measurements of the software in the context of the CMS distributed event building.
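
    The affinity problem can be made concrete on Linux: the thread that services a socket should run on a core belonging to the NIC's NUMA node, so that receive buffers live in nearby memory. A minimal hedged sketch (the core set below is hypothetical, and this is not the CMS online framework's API):

        # Pin this process to cores on the NIC's NUMA node (IDs are hypothetical),
        # then open a receiving socket with large buffers for 40 GbE traffic.
        import os
        import socket

        os.sched_setaffinity(0, {0, 1, 2, 3})   # Linux-only CPU affinity call

        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 16 * 1024 * 1024)
        srv.bind(('', 9000))
        srv.listen()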

  2. A top-level ontology of functions and its application in the Open Biomedical Ontologies.

    PubMed

    Burek, Patryk; Hoehndorf, Robert; Loebe, Frank; Visagie, Johann; Herre, Heinrich; Kelso, Janet

    2006-07-15

    A clear understanding of functions in biology is a key component in accurate modelling of molecular, cellular and organismal biology. Using the existing biomedical ontologies it has been impossible to capture the complexity of the community's knowledge about biological functions. We present here a top-level ontological framework for representing knowledge about biological functions. This framework lends greater accuracy, power and expressiveness to biomedical ontologies by providing a means to capture existing functional knowledge in a more formal manner. An initial major application of the ontology of functions is the provision of a principled way in which to curate functional knowledge and annotations in biomedical ontologies. Further potential applications include the facilitation of ontology interoperability and automated reasoning. A major advantage of the proposed implementation is that it is an extension to existing biomedical ontologies, and can be applied without substantial changes to these domain ontologies. The Ontology of Functions (OF) can be downloaded in OWL format from http://onto.eva.mpg.de/. Additionally, a UML profile and supplementary information and guides for using the OF can be accessed from the same website.

  3. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring

    PubMed Central

    Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit

    2016-01-01

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916

  4. Unmasking feigned sanity: a neurobiological model of emotion processing in primary psychopathy.

    PubMed

    van Honk, Jack; Schutter, Dennis J L G

    2006-05-01

    The neurobiological basis of primary psychopathy, an emotional disorder characterised by a lack of fear and empathy, on the one hand, and extremely violent, antisocial tendencies, on the other, is relatively unknown. Nevertheless, theoretical models that emphasise the role of fearlessness, imbalanced motivation, defective somatic markers, and dysfunctional violence inhibition mechanisms have complementary proposals regarding motivations and brain mechanisms involved. Presently, incorporating the heuristic value of these models and further theorising on the basis of recent data from neuropsychology, neuroendocrinology, neuroimaging, and repetitive transcranial magnetic stimulation (rTMS), an attempt is made to construct a neurobiological framework of emotion processing in primary psychopathy with clinical applicability. According to this framework, defective emotional processing in primary psychopathy results from bottom-up hormone-mediated imbalances at: (1) the subcortical level; (2) in subcortico-cortical "cross-talk"; that end up in an instrumental stance at the cortical level (3). An endocrine dual-system approach for the fine-tuned restoration of these hormone-mediated imbalances is proposed as a possible clinical application. This application may be capable of laying a neurobiological foundation for more successful sociotherapeutic interventions in primary psychopathy.

  5. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring.

    PubMed

    Fisher, Michael B; Mann, Benjamin H; Cronk, Ryan D; Shields, Katherine F; Klug, Tori L; Ramaswamy, Rohit

    2016-08-23

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs.

  6. Fluorene-Based Two-Dimensional Covalent Organic Framework with Thermoelectric Properties through Doping.

    PubMed

    Wang, Liangying; Dong, Bin; Ge, Rile; Jiang, Fengxing; Xu, Jingkun

    2017-03-01

    Organic semiconductors have great potential as flexible thermoelectric materials. A fluorene-based covalent organic framework (FL-COF-1) was designed with the aim of creating an enhanced π-π interaction among the crystalline backbones. By the introduction of fluorene units into the frameworks, FL-COF-1 had high thermal stability with a BET surface area over 1300 m² g⁻¹. The open frameworks were favorable for doping with iodine, which improved the charge-carrier mobility. The compressed pellet of I₂@FL-COF-1 exhibited a high Seebeck coefficient of 2450 μV K⁻¹ and a power factor of 0.063 μW m⁻¹ K⁻² at room temperature, giving the first example of COFs' potential application as thermoelectric materials.

  7. A Framework for Integrating Environmental and Occupational Health and Primary Care in a Postdisaster Context.

    PubMed

    Kirkland, Katherine; Sherman, Mya; Covert, Hannah; Barlet, Grace; Lichtveld, Maureen

    Integration of environmental and occupational health (EOH) into primary care settings is a critical step to addressing the EOH concerns of a community, particularly in a postdisaster context. Several barriers to EOH integration exist at the physician, patient, and health care system levels. This article presents a framework for improving the health system's capacity to address EOH after the Deepwater Horizon oil spill and illustrates its application in the Environmental and Occupational Health Education and Referral (EOHER) program. This program worked with 11 Federally Qualified Health Center systems in the Gulf Coast region to try to address the EOH concerns of community members and to assist primary care providers to better understand the impact of EOH factors on their patients' health. The framework uses a 3-pronged approach to (1) foster coordination between primary care and EOH facilities through a referral network and peer consultations, (2) increase physician capacity in EOH issues through continuing education and training, and (3) conduct outreach to community members about EOH issues. The EOHER program highlighted the importance of building strong partnerships with community members and other relevant organizations, as well as high organizational capacity and effective leadership to enable EOH integration into primary care settings. Physicians in the EOHER program were constrained in their ability to engage with EOH issues due to competing patient needs and time constraints, indicating the need to improve physicians' ability to assess which patients are at high risk for EOH exposures and to efficiently take environmental and occupational histories. This article highlights the importance of addressing EOH barriers at multiple levels and provides a model that can be applied to promote community health, particularly in the context of future natural or technological disasters.

  8. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo. PMID:24501592
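
    The container-based design can be sketched generically: the right-hand side of the level-set PDE is a weighted sum over interchangeable term objects, so terms are added or removed without touching the solver loop. An illustrative sketch of the pattern, not the actual ITK v4 classes:

        # Illustrative term-container pattern for level-set evolution:
        # d(phi)/dt = sum_i w_i * T_i(phi). Not the actual ITK v4 API.
        import numpy as np

        class TermContainer:
            def __init__(self):
                self.terms = []                  # container of (weight, term) pairs

            def add(self, weight, term):
                self.terms.append((weight, term))

            def rhs(self, phi):
                return sum(w * t(phi) for w, t in self.terms)

        def laplacian_term(phi):                 # toy curvature-like smoothing term
            return (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                    + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi)

        eq = TermContainer()
        eq.add(0.2, laplacian_term)              # terms can be added or removed at will

        phi = np.random.rand(64, 64)             # toy level-set function
        for _ in range(100):                     # explicit Euler time stepping
            phi += 0.1 * eq.rhs(phi)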

  9. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs.

    PubMed

    Mosaliganti, Kishore R; Gelas, Arnaud; Megason, Sean G

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo.

  10. Research and Design of the Three-tier Distributed Network Management System Based on COM / COM + and DNA

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Bi, Yushen

    Considering the distributed network management system's demands for distribution, extensibility, and reusability, a framework model of a three-tier distributed network management system based on COM/COM+ and DNA is proposed, adopting software component technology and the N-tier application software framework design idea. We also give the concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Jun Hyung; Lee, Soo bin; Hodge, Bri-Mathias

    The energy systems of the process industry face a new and unprecedented challenge: renewable energies should be incorporated, yet no single source can meet a demand of such magnitude and variability. This paper investigates a simulation framework to compute the capacity of multiple energy sources, including solar, wind power, diesel, and batteries. The framework involves generation of actual renewable energy supply and demand profiles, followed by supply-demand matching. Eight configurations of different supply options are evaluated to illustrate the applicability of the proposed framework, with some concluding remarks.

  12. NIRP Core Software Suite v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitener, Dustin Heath; Folz, Wesley; Vo, Duong

    The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematical equations, and user interface components; and the framework includes several fully developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: An application to manage contact information for people (analysts) that use the software products. This information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: An application for viewing the DCFPAK radiological data. Complements the Mixture Manager tool. Mixture Manager: An application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: An application to manage explosives and their properties. Chart Viewer: An application to view charts of data (e.g., meteorology charts). Other applications may use this framework to create charts specific to their data needs.

  13. The South Dakota cooperative land use effort: A state level remote sensing demonstration project

    NASA Technical Reports Server (NTRS)

    Tessar, P. A.; Hood, D. R.; Todd, W. J.

    1975-01-01

    Remote sensing technology can satisfy or make significant contributions toward satisfying many of the information needs of governmental natural resource planners and policy makers. Recognizing this potential, the South Dakota State Planning Bureau and the EROS Data Center together formulated the framework for an ongoing Land Use and Natural Resource Inventory and Information System Program. Statewide land use/land cover information is generated from LANDSAT digital data and high altitude photography. Many applications of the system are anticipated as it evolves and data are added from more conventional sources. The conceptualization, design, and implementation of the program are discussed.

  14. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  15. From Coordination Cages to a Stable Crystalline Porous Hydrogen-Bonded Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ju, Zhanfeng; Liu, Guoliang; Chen, Yu-Sheng

    2017-03-20

    A stable framework has been constructed through multiple charge-assisted H-bonds between cationic coordination cages and chloride ions. The framework maintained its original structure upon desolvation, which has been established by single-crystal structure analysis. This is the first fully characterized stable porous framework based on coordination cages after desolvation, with a moderately high Brunauer–Emmett–Teller (BET) surface area of 1201 m² g⁻¹. This work will not only shed light on the construction of stable porous frameworks based on coordination cages, thus broadening their applications, but will also provide a new avenue to the assembly of other porous materials, such as porous organic cages and hydrogen-bonded organic frameworks (HOFs), through noncovalent bonds.

  16. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org) is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++ and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
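
    The JFNK method mentioned above hinges on the fact that Krylov solvers need only Jacobian-vector products, which can be approximated by finite-differencing the nonlinear residual $F$, so the Jacobian of the coupled system is never assembled:

        $J(u)\,v \approx \dfrac{F(u + \epsilon v) - F(u)}{\epsilon}$,

    where $\epsilon$ is a small perturbation chosen relative to the magnitudes of $u$ and $v$.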

  17. Integrated Multidisciplinary Optimization Objects

    NASA Technical Reports Server (NTRS)

    Alston, Katherine

    2014-01-01

    OpenMDAO is an open-source MDAO framework. It is used to develop an integrated analysis and design environment for engineering challenges. This Phase II project integrated additional modules and design tools into OpenMDAO to perform discipline-specific analysis across multiple flight regimes at varying levels of fidelity. It also showcased a refined system architecture that allows the system to be less customized to a specific configuration (i.e., system and configuration separation). By delivering a capable and validated MDAO system along with a set of example applications to be used as a template for future users, this work greatly expands NASA's high-fidelity, physics-based MDAO capabilities and enables the design of revolutionary vehicles in a cost-effective manner. This proposed work complements M4 Engineering's expertise in developing modeling and simulation toolsets that solve relevant subsonic, supersonic, and hypersonic demonstration applications.
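
    For readers unfamiliar with the framework, a discipline in OpenMDAO is wrapped as a component with declared inputs, outputs, and partial derivatives. A hedged sketch against the present-day (v3.x) Python API, which postdates the project described above:

        # Minimal OpenMDAO component and model: the classic paraboloid example.
        import openmdao.api as om

        class Paraboloid(om.ExplicitComponent):
            def setup(self):
                self.add_input('x', val=0.0)
                self.add_input('y', val=0.0)
                self.add_output('f', val=0.0)
                self.declare_partials('f', ['x', 'y'], method='fd')

            def compute(self, inputs, outputs):
                x, y = inputs['x'], inputs['y']
                outputs['f'] = (x - 3.0)**2 + x * y + (y + 4.0)**2 - 3.0

        prob = om.Problem()
        prob.model.add_subsystem('parab', Paraboloid(), promotes=['*'])
        prob.setup()
        prob.set_val('x', 3.0)
        prob.set_val('y', -4.0)
        prob.run_model()
        print(prob.get_val('f'))                 # -> [-15.]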

  18. Development and Application of the Collaborative Optimization Architecture in a Multidisciplinary Design Environment

    NASA Technical Reports Server (NTRS)

    Braun, R. D.; Kroo, I. M.

    1995-01-01

    Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies: it reduces the amount of information transferred between disciplines, removes large iteration loops, allows the use of different subspace optimizers among the various analysis groups, yields an analysis framework that is easily parallelized and can operate on heterogeneous equipment, and provides a structural framework that is well suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.
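
    In its standard mathematical statement, consistent with the description above, the system level optimizes shared target variables $z$ subject to interdisciplinary-compatibility constraints, while each discipline $i$ independently minimizes its discrepancy from those targets:

        $\min_{z} \; f(z) \quad \text{s.t.} \quad J_i^{*}(z) = 0 \;\; \forall i, \qquad J_i^{*}(z) = \min_{x_i \in \mathcal{X}_i} \, \lVert z - c_i(x_i) \rVert_2^2$,

    where $\mathcal{X}_i$ encodes discipline $i$'s local analysis constraints and $c_i$ maps its local variables back to the shared targets.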

  19. A human-oriented framework for developing assistive service robots.

    PubMed

    McGinn, Conor; Cullinan, Michael F; Culleton, Mark; Kelly, Kevin

    2018-04-01

    Multipurpose robots that can perform a range of useful tasks have the potential to increase the quality of life for many people living with disabilities. Owing to factors such as high system complexity, as-yet unresolved research questions and current technology limitations, there is a need for effective strategies to coordinate the development process. Integrating established methodologies based on human-centred design and universal design, a framework was formulated to coordinate the robot design process over successive iterations of prototype development. An account is given of how the framework was practically applied to the problem of developing a personal service robot. Application of the framework led to the formation of several design goals which addressed a wide range of identified user needs. The resultant prototype solution, which consisted of several component elements, succeeded in demonstrating the performance stipulated by all of the proposed metrics. Application of the framework resulted in the development of a complex prototype that addressed many aspects of the functional and usability requirements of a personal service robot. Following the process led to several important insights which directly benefit the development of subsequent prototypes. Implications for Rehabilitation This research shows how universal design might be used to formulate usability requirements for assistive service robots. A framework is presented that guides the process of designing service robots in a human-centred way. Through practical application of the framework, a prototype robot system that addressed a range of identified user needs was developed.

  20. Driving CO2 to a Quasi-Condensed Phase at the Interface between a Nanoparticle Surface and a Metal-Organic Framework at 1 bar and 298 K.

    PubMed

    Lee, Hiang Kwee; Lee, Yih Hong; Morabito, Joseph V; Liu, Yejing; Koh, Charlynn Sher Lin; Phang, In Yee; Pedireddy, Srikanth; Han, Xuemei; Chou, Lien-Yang; Tsung, Chia-Kuang; Ling, Xing Yi

    2017-08-23

    We demonstrate a molecular-level observation of driving CO2 molecules into a quasi-condensed phase on the solid surface of metal nanoparticles (NP) under ambient conditions of 1 bar and 298 K. This is achieved via a CO2 accumulation in the interface between a metal-organic framework (MOF) and a metal NP surface formed by coating NPs with a MOF. Using real-time surface-enhanced Raman scattering spectroscopy, a >18-fold enhancement of surface coverage of CO2 is observed at the interface. The high surface concentration leads CO2 molecules to be in close proximity with the probe molecules on the metal surface (4-methylbenzenethiol), and transforms CO2 molecules into a bent conformation without the formation of chemical bonds. Such linear-to-bent transition of CO2 is unprecedented at ambient conditions in the absence of chemical bond formation, and is commonly observed only in pressurized systems (>10^5 bar). The molecular-level observation of a quasi-condensed phase induced by MOF coating could impact the future design of hybrid materials in diverse applications, including catalytic CO2 conversion and ambient solid-gas operation.

  1. A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme

    NASA Astrophysics Data System (ADS)

    Ghoman, Satyajit S.

    The main objective of this research is to develop an innovative multi-fidelity, multi-disciplinary design, analysis, and optimization suite that integrates existing solution-generation codes and newly developed tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable-fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3DOE), in the context of aircraft wing optimization. M3DOE gives the user the capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) a single-step or multi-step optimization strategy, and (iii) the combination of a series of structural and aerodynamic analyses. The modularity of M3DOE allows it to be a part of other, more inclusive optimization frameworks. M3DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, namely dry-weight minimization and cruise-range maximization, are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3DOE by employing a proper orthogonal decomposition (POD)-based design-space order-reduction scheme combined with an evolutionary algorithm. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of the candidate population is updated iteratively using the evolutionary-algorithm technique of fitness-driven retention. This strategy capitalizes on the advantages of evolutionary algorithms as well as POD-based reduced-order modeling, while overcoming the shortcomings inherent in these techniques. When linked with M3DOE, this strategy offers a computationally efficient methodology for problems with a high level of complexity and a challenging design space. This newly developed framework is demonstrated for its robustness on a nonconventional supersonic tailless air vehicle wing shape optimization problem.
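
    A minimal sketch of the POD-based order-reduction step (illustrative, with random stand-in data, not the dissertation's code): dominant modes are extracted from an ensemble of candidate configurations via the singular value decomposition, and each design is then represented by a handful of modal coefficients, so the evolutionary search operates in the reduced space.

    import numpy as np

    rng = np.random.default_rng(0)
    snapshots = rng.normal(size=(200, 40))   # 40 candidate designs, 200 DOFs each

    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

    # Keep the dominant modes carrying 99% of the ensemble "energy".
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.99)) + 1
    basis = U[:, :k]

    # A design x reduces to k modal coefficients a = basis^T (x - mean).
    x = snapshots[:, [0]]
    a = basis.T @ (x - mean)
    x_rec = mean + basis @ a
    print(k, np.linalg.norm(x - x_rec))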

  2. An efficient synthesis strategy for metal-organic frameworks: Dry-gel synthesis of MOF-74 framework with high yield and improved performance

    DOE PAGES

    Das, Atanu Kumar; Vemuri, Rama Sesha; Kutnyakov, Igor; ...

    2016-06-16

    Here, vapor-assisted dry-gel synthesis of the MOF-74 structure, specifically NiMOF-74, from its synthetic precursors was conducted with high yield and improved performance, showing promise for gas (CO2) and water adsorption applications. Unlike conventional synthesis, which takes 72 h, this kinetic study showed that NiMOF-74 forms within 12 h under dry-gel conditions with similar performance characteristics, and exhibits the best performance characteristics after 48 h of heating.

  3. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.

    PubMed

    Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E

    2009-08-25

    Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  4. Conformal Prediction Based on K-Nearest Neighbors for Discrimination of Ginsengs by a Home-Made Electronic Nose

    PubMed Central

    Sun, Xiyang; Miao, Jiacheng; Wang, You; Luo, Zhiyuan; Li, Guang

    2017-01-01

    An estimate of the reliability of predictions is essential in electronic-nose applications, but it has not received enough attention. An algorithm framework called conformal prediction is introduced in this work for discriminating different kinds of ginsengs with a home-made electronic nose instrument. A nonconformity measure based on k-nearest neighbors (KNN) is implemented as the underlying algorithm of conformal prediction. In offline mode, the conformal predictor achieves a classification rate of 84.44% based on 1NN and 80.63% based on 3NN, which is better than that of simple KNN. In addition, it provides an estimate of reliability for each prediction. In online mode, the validity of predictions is guaranteed, which means that the error rate of region predictions never exceeds the significance level set by a user. The potential of this framework for detecting borderline examples and outliers in the application of E-nose is also investigated. The result shows that conformal prediction is a promising framework for the application of electronic nose to make predictions with reliability and validity. PMID:28805721
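
    The following sketch shows the core of a 1-NN conformal predictor of the kind described (an illustration, not the authors' code): the nonconformity score is the distance to the nearest neighbor of the same class divided by the distance to the nearest neighbor of a different class, and a label is kept in the region prediction when its p-value exceeds the user-set significance level.

    import numpy as np

    def score(X, y, x, label):
        d = np.linalg.norm(X - x, axis=1)
        return d[y == label].min() / d[y != label].min()  # large => nonconforming

    def predict_region(X, y, x_new, labels, eps=0.1):
        region = []
        for lab in labels:
            s_new = score(X, y, x_new, lab)
            # Leave-one-out nonconformity scores of the calibration examples.
            s_cal = np.array([score(np.delete(X, i, 0), np.delete(y, i),
                                    X[i], y[i]) for i in range(len(y))])
            p = (np.sum(s_cal >= s_new) + 1) / (len(y) + 1)
            if p > eps:            # keep labels that are not too nonconforming
                region.append(lab)
        return region

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
    y = np.array([0] * 20 + [1] * 20)
    print(predict_region(X, y, rng.normal(0, 1, 4), labels=[0, 1]))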

  5. Application of the Carolina Framework for Cervical Cancer Prevention.

    PubMed

    Moss, Jennifer L; McCarthy, Schatzi H; Gilkey, Melissa B; Brewer, Noel T

    2014-03-01

    The Carolina Framework for Cervical Cancer Prevention describes 4 main causes of cervical cancer incidence: human papillomavirus (HPV) infection, lack of screening, screening errors, and not receiving follow-up care. We present 2 applications of the Carolina Framework in which we identify high-need counties in North Carolina and generate recommendations for improving prevention efforts. We created a cervical cancer prevention need index (CCPNI) that ranked counties on cervical cancer mortality, HPV vaccine initiation and completion, Pap smear screening, and provision of Pap tests to rarely- or never-screened women. In addition, we conducted in-depth interviews with 19 key informants from programs and agencies involved in cervical cancer prevention in North Carolina. North Carolina's 100 counties varied widely on individual CCPNI components, including annual cervical cancer mortality (median 2.7/100,000 women; range 0.0-8.0), adolescent girls' HPV vaccine initiation (median 42%; range 15%-62%), and Pap testing in the previous 3 years among Medicaid-insured adult women (median 59%; range 40%-83%). Counties with the greatest prevention needs formed 2 distinct clusters in the northeast and south-central regions of the state. Interviews generated 9 recommendations to improve cervical cancer prevention in North Carolina, identifying applications to specific programs and policies in the state. This study found striking geographic disparities in cervical cancer prevention need in North Carolina. Future prevention efforts in the state should prioritize high-need regions as well as recommended strategies and applications in existing programs. Other states can use the Carolina Framework to increase the impact of their cervical cancer prevention efforts. Copyright © 2013 Elsevier Inc. All rights reserved.
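
    As a toy illustration of how such a need index can be assembled (hypothetical data and equal weights, not the study's actual CCPNI method): counties are ranked on each component so that higher mortality and lower vaccination or screening contribute to higher need, and the ranks are summed into a composite score.

    import pandas as pd

    counties = pd.DataFrame({
        "mortality_per_100k": [2.7, 0.0, 8.0, 4.1],
        "hpv_initiation_pct": [42, 62, 15, 38],
        "pap_3yr_pct":        [59, 83, 40, 55],
    }, index=["County A", "County B", "County C", "County D"])

    # Higher rank number = greater need on that component.
    need = (counties["mortality_per_100k"].rank(ascending=True)
            + counties["hpv_initiation_pct"].rank(ascending=False)
            + counties["pap_3yr_pct"].rank(ascending=False))
    print(need.sort_values(ascending=False))   # highest-need counties first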

  6. An Approach for Autonomy: A Collaborative Communication Framework for Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren Russell, Jr.

    2005-01-01

    Research done during the last three years has studied the emergent properties of Complex Adaptive Systems (CAS). The application of Artificial Intelligence (AI) techniques to remote Unmanned Aerial Vehicles has led the author to investigate applications of CAS within the field of Autonomous Multi-Agent Systems. The core objective of current research efforts is focused on the simplicity of Intelligent Agents (IA) and the modeling of these agents within complex systems. This research effort looks at the communication, interaction, and adaptability of multi-agents as applied to complex systems control. The embodiment concept applied to robotics has application possibilities within multi-agent frameworks. A new framework for agent awareness within a virtual 3D world concept is possible, where the vehicle is composed of collaborative agents. This approach has many possibilities for applications to complex systems. This paper describes the development of an approach to apply this virtual framework to the NASA Goddard Space Flight Center (GSFC) tetrahedron structure developed under the Autonomous Nano Technology Swarm (ANTS) program and the Super Miniaturized Addressable Reconfigurable Technology (SMART) architecture program. These projects represent an innovative set of novel concepts deploying adaptable, self-organizing structures composed of many tetrahedrons. This technology is pushing current applied agent concepts to new levels of requirements and adaptability.

  7. Multiple scales of patchiness and patch structure: a hierarchical framework for the study of heterogeneity

    USGS Publications Warehouse

    Kotliar, Natasha B.; Wiens, John A.

    1990-01-01

    We develop a hierarchical model of heterogeneity that provides a framework for classifying patch structure across a range of scales. Patches at lower levels in the hierarchy are simpler and correspond to the traditional view of patches. At levels approaching the upper bounds of the hierarchy, the internal structure becomes more heterogeneous and boundaries more ambiguous. At each level in the hierarchy, patch structure will be influenced both by contrast among patches and by the degree of aggregation of patches at lower levels in the hierarchy. We apply this model to foraging theory, but it has wider applications, such as the study of habitat selection, population dynamics, and habitat fragmentation. It may also be useful in expanding the realm of landscape ecology beyond the current focus on anthropocentric scales.

  8. An Efficient Bifunctional Electrocatalyst for a Zinc-Air Battery Derived from Fe/N/C and Bimetallic Metal-Organic Framework Composites.

    PubMed

    Wang, Mengfan; Qian, Tao; Zhou, Jinqiu; Yan, Chenglin

    2017-02-15

    Efficient bifunctional electrocatalysts with desirable oxygen activities are closely related to practical applications of renewable energy systems including metal-air batteries, fuel cells, and water splitting. Here a composite material derived from a combination of bimetallic zeolitic imidazolate frameworks (denoted as BMZIFs) and an Fe/N/C framework was reported as an efficient bifunctional catalyst. Although BMZIF or Fe/N/C alone exhibits undesirable oxygen reaction activity, a combination of these materials shows unprecedented ORR activity (half-wave potential of 0.85 V) as well as comparatively superior OER activity (potential at 10 mA cm^-2 of 1.64 V), outperforming not only a commercial Pt/C electrocatalyst but also most reported bifunctional electrocatalysts. We then tested its practical application in Zn-air batteries. The primary batteries exhibit a high peak power density of 235 mW cm^-2, and the batteries are able to be operated smoothly for 100 cycles at a current density of 10 mA cm^-2. The unprecedented catalytic activity can be attributed to chemical coupling effects between Fe/N/C and BMZIF and will aid the development of highly active electrocatalysts and applications for electrochemical energy devices.

  9. A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level

    DOT National Transportation Integrated Search

    2017-03-01

    Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...

  10. The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro

    2010-05-01

    The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application for the high-level development of a set of multi-level concept maps in the framework of Space Meteorology to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.

  11. An Engineering Technology Skills Framework that Reflects Workforce Needs on Maui and the Big Island of Hawai'i

    NASA Astrophysics Data System (ADS)

    Seagroves, S.; Hunter, L.

    2010-12-01

    The Akamai Workforce Initiative (AWI) is an interdisciplinary effort to improve science/engineering education in the state of Hawai'i, and to train a diverse population of local students in the skills needed for a high-tech economy. In 2009, the AWI undertook a survey of industry partners on Maui and the Big Island of Hawai'i to develop an engineering technology skills framework that will guide curriculum development at the U. of Hawai'i - Maui (formerly Maui Community College). This engineering skills framework builds directly on past engineering-education developments within the Center for Adaptive Optics Professional Development Program, and draws on curriculum development frameworks and engineering skills standards from the literature. Coupling that previous work with reviews of past Akamai Internship projects and information from previous conversations with the local high-tech community led to a structured-interview format through which engineers and managers could contribute meaningful commentary to this framework. Incorporating these local high-tech companies' needs for entry-level engineers and technicians yields a skills framework that is unique and illuminating. Two surprising features arise in this framework: (1) "technician-like" skills of making existing technology work are on a similar footing with "engineer-like" skills of creating new technology; in fact, both engineers and technicians at these workplaces use both sets of skills; and (2) project management skills are emphasized by employers even for entry-level positions.

  12. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    PubMed Central

    2011-01-01

    Background: Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for a better understanding of cellular physiology. Results: We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions: Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis. PMID:22784571

  13. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    PubMed

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for a better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.
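
    A small sketch of the modularization step (an illustration with random stand-in fluxes, not the authors' FMB implementation): reactions whose flux variation patterns are positively correlated across conditions are grouped by hierarchical clustering of a correlation-based distance.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    fluxes = rng.normal(size=(30, 12))   # 30 reactions x 12 flux conditions

    corr = np.corrcoef(fluxes)           # reaction-by-reaction correlation
    dist = 1.0 - corr                    # positively correlated => close
    iu = np.triu_indices_from(dist, k=1)
    Z = linkage(dist[iu], method="average")   # condensed distance vector
    modules = fcluster(Z, t=0.8, criterion="distance")
    print(modules)                       # module label for each reaction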

  14. Predictive assimilation framework to support contaminated site understanding and remediation

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.

    2014-12-01

    Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence), and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality-controls, and stores near real-time environmental data, and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Access to and interaction with PAF is done through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We will present an implementation of PAF which uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado) for which PAF automatically ingests hydrological data and forward-models groundwater flow in the saturated zone.

  15. Healthcare4VideoStorm: Making Smart Decisions Based on Storm Metrics.

    PubMed

    Zhang, Weishan; Duan, Pengcheng; Chen, Xiufeng; Lu, Qinghua

    2016-04-23

    Storm-based stream processing is widely used for real-time large-scale distributed processing. Knowing the run-time status and ensuring performance is critical to providing expected dependability for some applications, e.g., continuous video processing for security surveillance. Existing scheduling strategies are too coarse-grained to achieve good performance, and they consider only network resources, not computing resources, when scheduling. In this paper, we propose Healthcare4Storm, a framework that finds Storm insights based on Storm metrics to gain knowledge from the health status of an application, ending up with smart scheduling decisions. It takes into account both network and computing resources and conducts scheduling at a fine-grained level using tuples instead of topologies. A comprehensive evaluation shows that the proposed framework has good performance and can improve the dependability of Storm-based applications.

  16. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE PAGES

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul; ...

    2017-12-20

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.
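
    A minimal pyomo.dae model looks like the sketch below (a hedged example built from the documented Pyomo/pyomo.dae interface; the ODE and objective are arbitrary): the differential equation is declared on a ContinuousSet and then automatically transformed into a finite-dimensional algebraic problem.

    from pyomo.environ import (ConcreteModel, Var, Constraint, Objective,
                               TransformationFactory, minimize)
    from pyomo.dae import ContinuousSet, DerivativeVar

    m = ConcreteModel()
    m.t = ContinuousSet(bounds=(0, 1))
    m.x = Var(m.t)
    m.u = Var(m.t, bounds=(-1, 1))
    m.dxdt = DerivativeVar(m.x, wrt=m.t)

    def _ode(m, t):
        if t == 0:
            return Constraint.Skip       # initial condition handled below
        return m.dxdt[t] == -m.x[t] + m.u[t]
    m.ode = Constraint(m.t, rule=_ode)
    m.ic = Constraint(expr=m.x[0] == 1.0)
    m.obj = Objective(expr=m.x[1]**2, sense=minimize)

    # Automatic discretization to a finite-dimensional algebraic problem.
    TransformationFactory("dae.finite_difference").apply_to(
        m, nfe=20, scheme="BACKWARD")
    # SolverFactory("ipopt").solve(m)    # then solve with any NLP solver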

  17. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.

  18. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    NASA Astrophysics Data System (ADS)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  19. An open-source framework for large-scale, flexible evaluation of biomedical text mining systems.

    PubMed

    Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence

    2008-01-29

    Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net.

  20. An open-source framework for large-scale, flexible evaluation of biomedical text mining systems

    PubMed Central

    Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence

    2008-01-01

    Background Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Results Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. Conclusion The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net. PMID:18230184

  1. Flourishing across Europe: Application of a New Conceptual Framework for Defining Well-Being

    ERIC Educational Resources Information Center

    Huppert, Felicia A.; So, Timothy T. C.

    2013-01-01

    Governments around the world are recognising the importance of measuring subjective well-being as an indicator of progress. But how should well-being be measured? A conceptual framework is offered which equates high well-being with positive mental health. Well-being is seen as lying at the opposite end of a spectrum to the common mental disorders…

  2. Preparation and Analysis of Cyclodextrin-Based Metal-Organic Frameworks: Laboratory Experiments Adaptable for High School through Advanced Undergraduate Students

    ERIC Educational Resources Information Center

    Smith, Merry K.; Angle, Samantha R.; Northrop, Brian H.

    2015-01-01

    γ-Cyclodextrin can assemble in the presence of KOH or RbOH into metal-organic frameworks (CD-MOFs) with applications in gas adsorption and environmental remediation. Crystalline CD-MOFs are grown by vapor diffusion and their reversible adsorption of CO2(g) is analyzed both qualitatively and quantitatively. The experiment can be…

  3. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. A comparison of the framework against density-based topology optimization approaches is studied with regards to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs that agree well with the results from previous 2D and density-based studies.
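
    A toy illustration of the level set idea (not the thesis code): a scalar function phi describes the geometry implicitly, with material where phi > 0, void where phi < 0, and the zero contour as the structural boundary that the XFEM resolves; the plate-with-hole field below is a hypothetical example.

    import numpy as np

    x, y = np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))
    # Signed distance to a circle of radius 0.2: negative inside the hole.
    phi = np.sqrt((x - 0.5)**2 + (y - 0.5)**2) - 0.2

    material = phi > 0.0                 # pointwise material indicator
    print(f"solid volume fraction: {material.mean():.3f}")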

  4. Vulnerable Populations in Hospital and Health Care Emergency Preparedness Planning: A Comprehensive Framework for Inclusion.

    PubMed

    Kreisberg, Debra; Thomas, Deborah S K; Valley, Morgan; Newell, Shannon; Janes, Enessa; Little, Charles

    2016-04-01

    As attention to emergency preparedness becomes a critical element of health care facility operations planning, efforts to recognize and integrate the needs of vulnerable populations in a comprehensive manner have lagged. This not only results in decreased levels of equitable service, but also affects the functioning of the health care system in disasters. While this report emphasizes the United States context, the concepts and approaches apply beyond this setting. This report: (1) describes a conceptual framework that provides a model for the inclusion of vulnerable populations into integrated health care and public health preparedness; and (2) applies this model to a pilot study. The framework is derived from literature, hospital regulatory policy, and health care standards, laying out the communication and relational interfaces that must occur at the systems, organizational, and community levels for a successful multi-level health care systems response that is explicitly inclusive of diverse populations. The pilot study illustrates the application of key elements of the framework, using a four-pronged approach that incorporates both quantitative and qualitative methods for deriving information that can inform hospital and health facility preparedness planning. The conceptual framework and model, applied to a pilot project, guide expanded work that ultimately can result in methodologically robust approaches to comprehensively incorporating vulnerable populations into the fabric of hospital disaster preparedness at levels from local to national, thus supporting best practices for a community resilience approach to disaster preparedness.

  5. CICS Region Virtualization for Cost Effective Application Development

    ERIC Educational Resources Information Center

    Khan, Kamal Waris

    2012-01-01

    Mainframe is used for hosting large commercial databases, transaction servers and applications that require a greater degree of reliability, scalability and security. Customer Information Control System (CICS) is a mainframe software framework for implementing transaction services. It is designed for rapid, high-volume online processing. In order…

  6. The Influence of the Pedagogical Content Knowledge Framework on Research in Mathematics Education: A Review across Grade Bands

    ERIC Educational Resources Information Center

    Matthews, Mary Elizabeth

    2013-01-01

    This literature review examines the models, theories, and research in mathematics education that are informed by Lee S. Shulman's construct, Pedagogical Content Knowledge. The application of the concept differs in nature and volume across levels of schooling. The research includes substantial work at the elementary level, fewer studies at the…

  7. Features of anti-inflammatory effects of modulated extremely high-frequency electromagnetic radiation.

    PubMed

    Gapeyev, Andrew B; Mikhailik, Elena N; Chemeris, Nikolay K

    2009-09-01

    Using a model of acute zymosan-induced paw edema in NMRI mice, we tested the hypothesis that anti-inflammatory effects of extremely high-frequency electromagnetic radiation (EHF EMR) can be essentially modified by application of pulse modulation with certain frequencies. It was revealed that a single exposure of animals to continuous EHF EMR for 20 min reduced the exudative edema of the inflamed paw on average by 19% at intensities of 0.1-0.7 mW/cm(2) and frequencies from the range of 42.2-42.6 GHz. At the fixed effective carrier frequency of 42.2 GHz, the anti-inflammatory effect of EHF EMR did not depend on modulation frequencies; that is, application of different modulation frequencies from the range of 0.03-100 Hz did not lead to considerable changes in the effect level. On the contrary, at "ineffective" carrier frequencies of 43.0 and 61.22 GHz, the use of modulation frequencies of 0.07-0.1 and 20-30 Hz allowed us to restore the effect up to a maximal level. The results obtained show the critical dependence of the anti-inflammatory action of low-intensity EHF EMR on carrier and modulation frequencies. Within the framework of this study, the possibility of changing the level of the expected biological effect of modulated EMR by a special selection of the combination of carrier and modulation frequencies is confirmed.

  8. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort; therefore, research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
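
    The component pattern described above can be sketched schematically as follows (a hypothetical illustration; the class and method names are invented for this sketch and are not the actual IPS API): physics codes are wrapped as components with a common interface, and a high-level driver sequences them through the simulation loop.

    class Component:
        """Common interface a framework imposes on wrapped physics codes."""
        def step(self, t): ...

    class EquilibriumSolver(Component):
        def step(self, t):
            print(f"t={t}: update plasma equilibrium")

    class RFHeating(Component):
        def step(self, t):
            print(f"t={t}: compute RF power deposition")

    class Driver:
        """High-level driver that coordinates the coupled components."""
        def __init__(self, components):
            self.components = components
        def run(self, steps):
            for t in range(steps):
                for c in self.components:   # could also run concurrently
                    c.step(t)

    Driver([EquilibriumSolver(), RFHeating()]).run(steps=3)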

  9. Tuning the Adsorption-Induced Phase Change in the Flexible Metal–Organic Framework Co(bdp)

    DOE PAGES

    Taylor, Mercedes K.; Runčevski, Tomče; Oktawiec, Julia; ...

    2016-11-02

    Metal–organic frameworks that flex to undergo structural phase changes upon gas adsorption are promising materials for gas storage and separations, and achieving synthetic control over the pressure at which these changes occur is crucial to the design of such materials for specific applications. To this end, a new family of materials based on the flexible metal–organic framework Co(bdp) (bdp 2– = 1,4-benzenedipyrazolate) has been prepared via the introduction of fluorine, deuterium, and methyl functional groups on the bdp 2– ligand, namely, Co(F-bdp), Co(p-F 2-bdp), Co(o-F 2-bdp), Co(D 4-bdp), and Co(p-Me 2-bdp). These frameworks are isoreticular to the parent framework andmore » exhibit similar structural flexibility, transitioning from a low-porosity, collapsed phase to high-porosity, expanded phases with increasing gas pressure. Powder X-ray diffraction studies reveal that fluorination of the aryl ring disrupts edge-to-face π–π interactions, which work to stabilize the collapsed phase at low gas pressures, while deuteration preserves these interactions and methylation strengthens them. In agreement with these observations, high-pressure CH 4 adsorption isotherms show that the pressure of the CH 4-induced framework expansion can be systematically controlled by ligand functionalization, as materials without edge-to-face interactions in the collapsed phase expand at lower CH 4 pressures, while frameworks with strengthened edge-to-face interactions expand at higher pressures. This work puts forth a general design strategy relevant to many other families of flexible metal–organic frameworks, which will be a powerful tool in optimizing these phase-change materials for industrial applications.« less

  10. Application of the risk assessment paradigm to the induction of allergic contact dermatitis.

    PubMed

    Felter, Susan P; Ryan, Cindy A; Basketter, David A; Gilmour, Nicola J; Gerberick, G Frank

    2003-02-01

    The National Academy of Science (NAS) risk assessment paradigm has been widely accepted as a framework for estimating risk from exposure to environmental chemicals (NAS, 1983). Within this framework, quantitative risk assessments (QRAs) serve as the cornerstone of health-based exposure limits, and have been used routinely for both cancer and noncancer endpoints. These methods have focused primarily on the extrapolation of data from laboratory animals to establish acceptable levels of exposure for humans. For health effects associated with a threshold, uncertainty and variability inherent in the extrapolation process is generally dealt with by the application of "uncertainty factors (UFs)." The adaptation of QRA methods to address skin sensitization is a natural and desirable extension of current practices. Based on our chemical, cellular and molecular understanding of the induction of allergic contact dermatitis, one can conduct a QRA using established methods of identifying a NOAEL (No Observed Adverse Effect Level) or other point of departure, and applying appropriate UFs. This paper describes the application of the NAS paradigm to characterize risks from human exposure to skin sensitizers; consequently, this method can also be used to establish an exposure level for skin allergens that does not present an appreciable risk of sensitization. Copyright 2003 Elsevier Science (USA)
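
    The quantitative core of this approach reduces to a simple calculation, sketched below with hypothetical numbers (the NOAEL and uncertainty factors shown are placeholders, not values from the paper): an acceptable exposure level is the point of departure divided by the product of the applicable uncertainty factors.

    import math

    noael_ug_per_cm2 = 250.0   # hypothetical sensitization NOAEL
    uncertainty_factors = {
        "interindividual variability": 10.0,   # placeholder values
        "vehicle/matrix effects": 3.0,
        "use considerations": 1.0,
    }

    ael = noael_ug_per_cm2 / math.prod(uncertainty_factors.values())
    print(f"acceptable exposure level = {ael:.1f} ug/cm^2")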

  11. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation have been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  12. Development of International Terminology and Definitions for Texture-Modified Foods and Thickened Fluids Used in Dysphagia Management: The IDDSI Framework.

    PubMed

    Cichero, Julie A Y; Lam, Peter; Steele, Catriona M; Hanson, Ben; Chen, Jianshe; Dantas, Roberto O; Duivestein, Janice; Kayashita, Jun; Lecko, Caroline; Murray, Joseph; Pillay, Mershen; Riquelme, Luis; Stanschus, Soenke

    2017-04-01

    Dysphagia is estimated to affect ~8% of the world's population (~590 million people). Texture-modified foods and thickened drinks are commonly used to reduce the risks of choking and aspiration. The International Dysphagia Diet Standardisation Initiative (IDDSI) was founded with the goal of developing globally standardized terminology and definitions for texture-modified foods and liquids applicable to individuals with dysphagia of all ages, in all care settings, and all cultures. A multi-professional volunteer committee developed a dysphagia diet framework through systematic review and stakeholder consultation. First, a survey of existing national terminologies and current practice was conducted, receiving 2050 responses from 33 countries. Respondents included individuals with dysphagia; their caregivers; organizations supporting individuals with dysphagia; healthcare professionals; food service providers; researchers; and industry. The results revealed common use of 3-4 levels of food texture (54 different names) and ≥3 levels of liquid thickness (27 different names). Substantial support was expressed for international standardization. Next, a systematic review regarding the impact of food texture and liquid consistency on swallowing was completed. A meeting was then convened to review data from previous phases, and develop a draft framework. A further international stakeholder survey sought feedback to guide framework refinement; 3190 responses were received from 57 countries. The IDDSI Framework (released in November, 2015) involves a continuum of 8 levels (0-7) identified by numbers, text labels, color codes, definitions, and measurement methods. The IDDSI Framework is recommended for implementation throughout the world.

  13. Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework

    NASA Astrophysics Data System (ADS)

    Becker, B.; Chattopadhyay, S.; Cicalo, C.; Cleymans, J.; de Vaux, G.; Fearick, R. W.; Lindenstruth, V.; Richter, M.; Rohrich, D.; Staley, F.; Steinbeck, T. M.; Szostak, A.; Tilsner, H.; Weis, R.; Vilakazi, Z. Z.

    2008-04-01

    The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster, implementing several hundred nodes. The transport of data in the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle, designed to be fully pipelined with minimal processing overhead and communication latency in the cluster. In this paper, we report the latest measurements where this framework has been operated on five different sites over a global north-south link extending more than 10,000 km, processing a "real-time" data flow.
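
    The publisher-subscriber data flow can be sketched generically as below (an illustrative toy, not the ALICE HLT framework): each processing stage subscribes to its input queue and publishes results downstream, so all stages run concurrently in a pipeline.

    import queue, threading

    def stage(name, inbox, outbox):
        while True:
            event = inbox.get()
            if event is None:          # sentinel shuts the pipeline down
                if outbox is not None:
                    outbox.put(None)
                break
            result = f"{name}({event})"
            if outbox is not None:
                outbox.put(result)     # publish to the next subscriber
            else:
                print("output:", result)

    q1, q2 = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=stage, args=("cluster", q1, q2)),
               threading.Thread(target=stage, args=("trigger", q2, None))]
    for th in threads:
        th.start()
    for ev in ("event-0", "event-1"):
        q1.put(ev)
    q1.put(None)
    for th in threads:
        th.join()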

  14. Measurement and calculation of levitation forces between magnets and granular superconductors

    NASA Technical Reports Server (NTRS)

    Johansen, T. H.; Bratsberg, H.; Baziljevich, M.; Hetland, P. O.; Riise, A. B.

    1995-01-01

    Recent developments indicate that exploitation of the phenomenon of magnetic levitation may become one of the most important near-term applications of high-Tc superconductivity. Because of this, the interaction between a strong permanent magnet (PM) and a bulk high-Tc superconductor (HTSC) is currently a subject of much interest. We have studied central features of the mechanics of PM-HTSC systems of simple geometries. Here we report experimental results for the components of the levitation force, their associated stiffness, and mechanical ac-loss. To analyze the observed behavior, a theoretical framework based on critical-state considerations is developed. It will be shown that all the mechanical properties can be explained consistently at a quantitative level using a minimum of model parameters.

  15. Machine learning bandgaps of double perovskites

    PubMed Central

    Pilania, G.; Mannodi-Kanakkithodi, A.; Uberuaga, B. P.; Ramprasad, R.; Gubernatis, J. E.; Lookman, T.

    2016-01-01

    The ability to make rapid and accurate predictions on bandgaps of double perovskites is of much practical interest for a range of applications. While quantum mechanical computations for high-fidelity bandgaps are enormously computation-time intensive and thus impractical in high throughput studies, informatics-based statistical learning approaches can be a promising alternative. Here we demonstrate a systematic feature-engineering approach and a robust learning framework for efficient and accurate predictions of electronic bandgaps of double perovskites. After evaluating a set of more than 1.2 million features, we identify lowest occupied Kohn-Sham levels and elemental electronegativities of the constituent atomic species as the most crucial and relevant predictors. The developed models are validated and tested using the best practices of data science and further analyzed to rationalize their prediction performance. PMID:26783247
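
    The statistical-learning step can be sketched as a standard regression workflow (an illustration with random stand-in data, not the authors' dataset or final model): simple elemental features such as electronegativities and lowest occupied Kohn-Sham levels are mapped to bandgaps, here with kernel ridge regression and cross-validation.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 8))        # stand-in elemental descriptors
    y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)  # stand-in gaps

    model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.1)
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_absolute_error")
    print("cross-validated MAE:", -scores.mean())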

  16. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  17. A framework for evaluating national space activity

    NASA Astrophysics Data System (ADS)

    Wood, Danielle; Weigel, Annalisa

    2012-04-01

    Space technology and resources are used around the world to address societal challenges. Space provides valuable satellite services, unique scientific discoveries, surprising technology applications and new economic opportunities. Many developing countries formally recognize the advantages of space resources and pursue national level activity to harness them. There is limited data or documentation on the space activities of developing countries. Meanwhile, traditional approaches to summarize national space activity do not necessarily capture the types of activity that developing countries pursue in space. This is especially true if they do not have a formal national space program or office. Developing countries pursue national space activity through activities of many types—from national satellite programs to commercial use of satellite services to involvement with international space institutions. This research aims to understand and analyze these trends. This paper introduces two analytical frameworks for evaluating space activity at the national level. The frameworks are specifically designed to capture the activity of countries that have traditionally been less involved in space. They take a broad view of space related activity across multiple societal sectors and disciplines. The discussion explains the approach for using the frameworks as well as illustrative examples of how they can be applied as part of a research process. The first framework is called the Mission and Management Ladders. This framework considers specific space projects within countries and ranks them on "Ladders" that measure technical challenge and managerial autonomy. This first method is at a micro level of analysis. The second framework is called the Space Participation Metric (SPM). The SPM can be used to assign a Space Participation score to countries based on their involvement in various space related activities. This second method uses a macro level of analysis. The authors developed both frameworks as part of a long term research program about the space activities of developing countries. This aspect of the research focuses on harnessing multiple techniques to summarize complex, multi-disciplinary information about global space activity.

  18. Benchmarking high performance computing architectures with CMS’ skeleton framework

    NASA Astrophysics Data System (ADS)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured memory and CPU overheads of the different technologies benchmarked. In 2016 CMS will get access to high-performance computing resources that use new many-core architectures, such as Cori Phase 1 and 2, Theta, and Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.
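
    Stripped to its essence, the skeleton-benchmark approach times a fixed set of dummy tasks under a given concurrency backend and reports per-event overhead. A minimal Python analogue using only the standard library (not the CMSSW C++ suite itself) might look like this; note the comment about CPython's GIL.

      import time
      from concurrent.futures import ThreadPoolExecutor

      def dummy_module(n=20000):
          # Stand-in for the per-event work of one framework module.
          s = 0
          for i in range(n):
              s += i * i
          return s

      def time_events(pool, n_events=500):
          t0 = time.perf_counter()
          list(pool.map(lambda _: dummy_module(), range(n_events)))
          return time.perf_counter() - t0

      # NOTE: CPython's GIL prevents CPU-bound threads from scaling, so this
      # harness illustrates the measurement methodology, not TBB-like speedup.
      for workers in (1, 2, 4):
          with ThreadPoolExecutor(max_workers=workers) as pool:
              dt = time_events(pool)
              print(f"{workers} thread(s): {dt:.3f} s total, {1e3 * dt / 500:.3f} ms/event")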

  19. MIA - A free and open source software for gray scale medical image analysis

    PubMed Central

    2013-01-01

    Background Gray scale images make up the bulk of data in bio-medical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is using command-line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task-specific, and they don't provide a clear path when one wants to shape a new command-line tool from a prototype shell script. Results The proposed framework, MIA, provides a combination of command-line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the shell's scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command-line tools makes it easy to extend MIA, usually without the need to touch or recompile existing code. Conclusion In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command-line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed. PMID:24119305

  20. MIA - A free and open source software for gray scale medical image analysis.

    PubMed

    Wollny, Gert; Kellman, Peter; Ledesma-Carbayo, María-Jesus; Skinner, Matthew M; Hublin, Jean-Jaques; Hierl, Thomas

    2013-10-11

    Gray scale images make up the bulk of data in bio-medical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is using command-line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task-specific, and they don't provide a clear path when one wants to shape a new command-line tool from a prototype shell script. The proposed framework, MIA, provides a combination of command-line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the shell's scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command-line tools makes it easy to extend MIA, usually without the need to touch or recompile existing code. In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command-line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
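
    The prototyping style described here (small single-task command-line tools chained through the file system, with filters given as string descriptions) can be driven from Python as well as from a shell. The tool name and filter strings below follow MIA's general pattern but should be read as illustrative placeholders:

      import subprocess

      def run(cmd):
          # Each stage is a single-task command-line tool; intermediate results
          # live on disk, so working-memory management is a non-issue.
          subprocess.run(cmd, check=True)

      # Two-stage prototype: smooth, then edge-filter. Tool and filter-string
      # names are illustrative placeholders in MIA's general style.
      run(["mia-2dimagefilter", "-i", "input.png", "-o", "smooth.png", "gauss:w=2"])
      run(["mia-2dimagefilter", "-i", "smooth.png", "-o", "edges.png", "sobel:dir=x"])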

  1. The ontology model of FrontCRM framework

    NASA Astrophysics Data System (ADS)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes are possible. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM was developed as a framework to guide the identification of CRM-related business processes, based on the concept of the strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  2. Prima Platform: A Scheme for Managing Equipment-Dependent Onboard Functions and Impacts on the Avionics Software Production Process

    NASA Astrophysics Data System (ADS)

    Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario

    2010-08-01

    The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) that provide the on-board control functions. During recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions to promote abstraction and obtain a more efficient and safe ASW production, with positive implications also for the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it changes after the introduction of the new design features.

  3. Sensor fusion display evaluation using information integration models in enhanced/synthetic vision applications

    NASA Technical Reports Server (NTRS)

    Foyle, David C.

    1993-01-01

    Based on existing integration models in the psychological literature, an evaluation framework is developed to assess sensor fusion displays as they might be implemented in an enhanced/synthetic vision system. The proposed framework for evaluating the operator's ability to use such systems takes a normative approach: the pilot's performance with the sensor fusion image is compared to the models' predictions based on the pilot's performance when viewing the original component sensor images prior to fusion. This allows one to determine when a sensor fusion system leads to: poorer performance than one of the original sensor displays (clearly undesirable, indicating that fusion introduces distortion or interference); better performance than either single-sensor display alone, but sub-optimal relative to model predictions; optimal performance as predicted by the models; or super-optimal performance, which may occur if the operator can exploit highly diagnostic 'emergent features' in the fused display that were unavailable in the original sensor displays.
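
    One classical integration model usable in such a normative comparison is independent probability summation, which predicts fused-display detection from the component hit rates. The sketch below is a generic illustration of the comparison logic, not the specific models of the paper:

      def probability_summation(p_a, p_b):
          # Predicted detection rate if the two sensor images contribute
          # statistically independent detection opportunities.
          return 1.0 - (1.0 - p_a) * (1.0 - p_b)

      def classify(p_fused, p_a, p_b, p_model, tol=0.02):
          # Normative comparison: where does fused performance fall?
          if p_fused < max(p_a, p_b) - tol:
              return "poorer than best single sensor (interference)"
          if p_fused < p_model - tol:
              return "better than single sensors, but sub-optimal"
          if p_fused <= p_model + tol:
              return "optimal (matches model prediction)"
          return "super-optimal (emergent features?)"

      p_a, p_b = 0.70, 0.60                        # component-display hit rates
      p_model = probability_summation(p_a, p_b)    # 0.88
      print(classify(0.93, p_a, p_b, p_model))     # -> super-optimal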

  4. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
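
    For instance, the Python bindings mentioned above expose the C++ API; a minimal C3D read along the following lines should work with the BTK 0.x wrapper (method names from its documented API; treat this as a sketch if your version differs, and the file name is a placeholder):

      import btk  # Biomechanical ToolKit Python bindings

      reader = btk.btkAcquisitionFileReader()
      reader.SetFilename("walk_trial.c3d")   # hypothetical input file
      reader.Update()
      acq = reader.GetOutput()

      print("points:", acq.GetPointNumber(),
            "frames:", acq.GetPointFrameNumber(),
            "rate:", acq.GetPointFrequency(), "Hz")
      marker = acq.GetPoint(0)               # first labeled marker
      print(marker.GetLabel(), marker.GetValues().shape)  # (n_frames, 3)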

  5. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed across different administrative domains. However, efficient job submission and management remain far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results on the behavior of our framework on the TRGP testbed.

  6. Enriched Video Semantic Metadata: Authorization, Integration, and Presentation.

    ERIC Educational Resources Information Center

    Mu, Xiangming; Marchionini, Gary

    2003-01-01

    Presents an enriched video metadata framework including video authorization using the Video Annotation and Summarization Tool (VAST), a video metadata authorization system that integrates both semantic and visual metadata; metadata integration; and user-level applications. Results demonstrated that the enriched metadata were seamlessly…

  7. Proof of Concept for the Trajectory-Level Validation Framework for Traffic Simulation Models

    DOT National Transportation Integrated Search

    2017-10-30

    Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...

  8. A post-Bertalanffy Systemics Healthcare Competitive Framework Proposal.

    PubMed

    Fiorini, Rodolfo A; Santacroce, Giulia F

    2014-01-01

    The health information community can take advantage of a new evolutive categorization cybernetic framework. A systemic concept of principles organizing nature is proposed. It can be used as a multiscaling reference framework to develop successful and competitive antifragile systems and new HRO information management strategies in advanced healthcare organizations (HO) and high reliability organizations (HRO). Expected impacts are multifarious and articulated at different system scale levels; the major one is that, for the first time, Biomedical Engineering ideal system categorization levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and no information loss.

  9. Middleware enabling computational self-reflection: exploring the need for and some costs of selfreflecting networks with application to homeland defense

    NASA Astrophysics Data System (ADS)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper reviews and examines the definitions of Self-Reflection and Active Middleware. It then illustrates a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, and reviews some applications of Self-Reflection and Active Middleware to simulations. Finally, it considers the application, and the additional kinds of costs, of applying Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense.

  10. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
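
    As a schematic of the decomposition and analysis steps, the sketch below extracts a band-limited component, computes its Hilbert envelope, and collects the lengths of sub-threshold ("off") phases whose distribution would be tested for algebraic scaling; the band and threshold are placeholders, and the paper's empirical mode decomposition step is elided:

      import numpy as np
      from scipy.signal import hilbert, butter, filtfilt

      fs = 2000.0                                  # sampling rate (Hz)
      t = np.arange(0, 10, 1 / fs)
      x = np.random.randn(t.size)                  # stand-in for an EEG trace

      b, a = butter(4, [100 / (fs / 2), 500 / (fs / 2)], btype="band")
      hfo_band = filtfilt(b, a, x)                 # crude HFO-band component
      envelope = np.abs(hilbert(hfo_band))         # instantaneous amplitude

      thresh = 3 * np.median(envelope)             # placeholder threshold
      off = envelope < thresh                      # sub-threshold ("off") phases
      change = np.flatnonzero(np.diff(off.astype(int))) + 1
      segments = np.split(off, change)
      off_runs = np.array([len(s) for s in segments if s[0]])
      # An algebraic (power-law) distribution of these off-phase lengths is
      # the on-off intermittency signature quantified in the paper.
      print("off phases:", off_runs.size, "mean length:", off_runs.mean())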

  11. Scalable large format 3D displays

    NASA Astrophysics Data System (ADS)

    Chang, Nelson L.; Damera-Venkata, Niranjan

    2010-02-01

    We present a general framework for the modeling and optimization of scalable large format 3-D displays using multiple projectors. Based on this framework, we derive algorithms that can robustly optimize the visual quality of an arbitrary combination of projectors (e.g. tiled, superimposed, combinations of the two) without manual adjustment. The framework creates for the first time a new unified paradigm that is agnostic to a particular configuration of projectors yet robustly optimizes for the brightness, contrast, and resolution of that configuration. In addition, we demonstrate that our algorithms support high resolution stereoscopic video at real-time interactive frame rates achieved on commodity graphics hardware. Through complementary polarization, the framework creates high quality multi-projector 3-D displays at low hardware and operational cost for a variety of applications including digital cinema, visualization, and command-and-control walls.

  12. Linking Satellite Remote Sensing Based Environmental Predictors to Disease: AN Application to the Spatiotemporal Modelling of Schistosomiasis in Ghana

    NASA Astrophysics Data System (ADS)

    Wrable, M.; Liss, A.; Kulinkina, A.; Koch, M.; Biritwum, N. K.; Ofosu, A.; Kosinski, K. C.; Gute, D. M.; Naumova, E. N.

    2016-06-01

    90% of the worldwide schistosomiasis burden falls on sub-Saharan Africa. Control efforts are often based on infrequent, small-scale health surveys, which are expensive and logistically difficult to conduct. Use of satellite imagery to predictively model infectious disease transmission has great potential for public health applications. Transmission of schistosomiasis requires specific environmental conditions to sustain freshwater snails; however, it has unknown seasonality and is difficult to study due to the long lag between infection and clinical symptoms. To overcome this, we employed a comprehensive 8-year time series built from remote sensing feeds. The purely environmental predictor variables (accumulated precipitation, land surface temperature, vegetative growth indices, and climate zones created from a novel climate regionalization technique) were regressed against 8 years of national surveillance data in Ghana. All data were aggregated temporally into monthly observations and spatially at the level of administrative districts. An initial mixed-effects model explained 41% of the variance overall. Stratification by climate zone brought the R2 as high as 50% for major zones and as high as 59% for minor zones. This can lead to a predictive risk model used to develop a decision support framework to design treatment schemes and direct scarce resources to areas with the highest risk of infection. This framework can be applied to diseases sensitive to climate or to locations where remote sensing would be better suited than health surveys.
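
    A mixed-effects regression of the kind described, with district as the grouping factor and environmental predictors as fixed effects, can be sketched with statsmodels; the synthetic data below merely stands in for the monthly, district-level aggregates:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 600  # placeholder: monthly, district-level observations
      df = pd.DataFrame({
          "district": rng.integers(0, 30, n).astype(str),
          "precip": rng.gamma(2.0, 50.0, n),       # accumulated precipitation
          "lst": rng.normal(30.0, 3.0, n),         # land surface temperature
          "ndvi": rng.uniform(0.1, 0.8, n),        # vegetative growth index
      })
      df["cases"] = (0.01 * df["precip"] + 0.5 * df["lst"] + 10 * df["ndvi"]
                     + rng.normal(0, 2, n))        # synthetic response

      # Random intercept per district; environmental predictors as fixed effects.
      model = smf.mixedlm("cases ~ precip + lst + ndvi", data=df,
                          groups=df["district"])
      print(model.fit().summary())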

  13. EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos.

    PubMed

    Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes

    2018-04-01

    State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights to large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.
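
    Typical trajectory features of the kind such a framework extracts (mean speed, net displacement, straightness) are simple functions of the tracked positions; the sketch below is a generic illustration, not EmbryoMiner's own code:

      import numpy as np

      def trajectory_features(track, dt=1.0):
          """track: (T, 3) array of one cell's positions over T time points."""
          steps = np.diff(track, axis=0)
          step_len = np.linalg.norm(steps, axis=1)
          path_len = step_len.sum()
          net_disp = np.linalg.norm(track[-1] - track[0])
          return {
              "mean_speed": step_len.mean() / dt,
              "net_displacement": net_disp,
              "straightness": net_disp / path_len if path_len > 0 else 0.0,
          }

      rng = np.random.default_rng(1)
      track = np.cumsum(rng.normal(size=(100, 3)), axis=0)  # toy migration path
      print(trajectory_features(track))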

  14. A Framework for Multi-Stakeholder Decision-Making and ...

    EPA Pesticide Factsheets

    This contribution describes the implementation of the conditional-value-at-risk (CVaR) metric to create a general multi-stakeholder decision-making framework. It is observed that stakeholder dissatisfactions (distances to their individual ideal solutions) can be interpreted as random variables. We thus shape the dissatisfaction distribution and find an optimal compromise solution by solving a CVaR minimization problem parameterized in the probability level. This enables us to generalize multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework. We demonstrate the framework in a bio-waste processing facility location case study, where we seek compromise solutions (facility locations) that balance stakeholder priorities on transportation, safety, water quality, and capital costs. This conference presentation abstract explains a new decision-making framework that computes compromise solution alternatives (reaches consensus) by mitigating dissatisfactions among stakeholders, as needed for the SHC Decision Science and Support Tools project.
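
    For a finite sample of stakeholder dissatisfactions, the CVaR minimization at the core of this framework reduces to the standard Rockafellar-Uryasev linear program; the sketch below evaluates that program with scipy as a generic illustration, not the paper's facility-location formulation:

      import numpy as np
      from scipy.optimize import linprog

      def cvar(d, alpha):
          """Rockafellar-Uryasev: CVaR_a(d) = min_t t + E[(d - t)+] / (1 - a)."""
          n = len(d)
          c = np.r_[1.0, np.full(n, 1.0 / ((1.0 - alpha) * n))]   # x = [t, u]
          A = np.hstack([-np.ones((n, 1)), -np.eye(n)])           # u_i >= d_i - t
          b = -np.asarray(d, dtype=float)
          bounds = [(None, None)] + [(0.0, None)] * n             # t free, u >= 0
          return linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs").fun

      dissat = np.random.default_rng(0).uniform(size=200)  # sampled dissatisfactions
      print(cvar(dissat, alpha=0.5), cvar(dissat, alpha=0.95))
      # As alpha -> 1 the value approaches the worst-case dissatisfaction.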

  15. Persuasive Technology in Mobile Applications Promoting Physical Activity: a Systematic Review.

    PubMed

    Matthews, John; Win, Khin Than; Oinas-Kukkonen, Harri; Freeman, Mark

    2016-03-01

    Persuasive technology in mobile applications can be used to influence the behaviour of users. A framework known as the Persuasive Systems Design model has been developed for designing and evaluating systems that influence the attitudes or behaviours of users. This paper reviews the current state of mobile applications for health behavioural change with an emphasis on applications that promote physical activity. The inbuilt persuasive features of mobile applications were evaluated using the Persuasive Systems Design model. A database search was conducted to identify relevant articles. Articles were then reviewed using the Persuasive Systems Design model as a framework for analysis. Primary task support, dialogue support, and social support were found to be moderately represented in the selected articles. However, system credibility support was found to have only low levels of representation as a persuasive systems design feature in mobile applications for supporting physical activity. To ensure that available mobile technology resources are best used to improve the wellbeing of people, it is important that the design principles that influence the effectiveness of persuasive technology be understood.

  16. Resource management and scheduling policy based on grid for AIoT

    NASA Astrophysics Data System (ADS)

    Zou, Yiqin; Quan, Li

    2017-07-01

    This paper presents research on resource management and scheduling policy based on grid technology for the Agricultural Internet of Things (AIoT). AIoT involves a variety of complex and heterogeneous agricultural resources that are difficult to represent in a unified way, but from an abstract perspective there are common models which can express their characteristics and features. Based on this, we propose a high-level model called the Agricultural Resource Hierarchy Model (ARHM), which can be used for modeling various resources, and we introduce an agricultural resource modeling method based on it. Compared with the traditional application-oriented three-layer model, ARHM can hide the differences between applications and give all applications a unified interface layer, allowing them to be implemented without distinction. Furthermore, the paper proposes a Web Service Resource Framework (WSRF)-based resource management method and the encapsulation structure for it. Finally, it focuses on the discussion of a multi-agent-based AG resource scheduler, which is a collaborative service provider pattern spanning multiple agricultural production domains.

  17. A deep semantic mobile application for thyroid cytopathology

    NASA Astrophysics Data System (ADS)

    Kim, Edward; Corte-Real, Miguel; Baloch, Zubair

    2016-03-01

    Cytopathology is the study of disease at the cellular level and often used as a screening tool for cancer. Thyroid cytopathology is a branch of pathology that studies the diagnosis of thyroid lesions and diseases. A pathologist views cell images that may have high visual variance due to different anatomical structures and pathological characteristics. To assist the physician with identifying and searching through images, we propose a deep semantic mobile application. Our work augments recent advances in the digitization of pathology and machine learning techniques, where there are transformative opportunities for computers to assist pathologists. Our system uses a custom thyroid ontology that can be augmented with multimedia metadata extracted from images using deep machine learning techniques. We describe the utilization of a particular methodology, deep convolutional neural networks, to the application of cytopathology classification. Our method is able to leverage networks that have been trained on millions of generic images, to medical scenarios where only hundreds or thousands of images exist. We demonstrate the benefits of our framework through both quantitative and qualitative results.

  18. Monte Carlo Simulation of Massive Absorbers for Cryogenic Calorimeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, D.; Asai, M.; Brink, P.L.

    There is a growing interest in cryogenic calorimeters with macroscopic absorbers for applications such as dark matter direct detection and rare event search experiments. The physics of energy transport in calorimeters with absorber masses exceeding several grams is made complex by the anisotropic nature of the absorber crystals as well as the changing mean free paths as phonons decay to progressively lower energies. We present a Monte Carlo model capable of simulating anisotropic phonon transport in cryogenic crystals. We have initiated the validation process and discuss the level of agreement between our simulation and experimental results reported in the literature, focusing on heat pulse propagation in germanium. The simulation framework is implemented using Geant4, a toolkit originally developed for high-energy physics Monte Carlo simulations. Geant4 has also been used for nuclear and accelerator physics, and applications in medical and space sciences. We believe that our current work may open up new avenues for applications in material science and condensed matter physics.

  19. Payments for Environmental Services in a Policymix: Spatial and Temporal Articulation in Mexico.

    PubMed

    Ezzine-de-Blas, Driss; Dutilly, Céline; Lara-Pulido, José-Alberto; Le Velly, Gwenolé; Guevara-Sanginés, Alejando

    2016-01-01

    Government-based Payments for Ecosystem Services (PES) have been criticized for not maximizing environmental effectiveness through appropriate targeting, while instead prioritizing social side-objectives. In Mexico, existing literature on how the Payments for Ecosystem Services-Hydrological program (PSA-H) has targeted deforestation and forest degradation shows that both the process of identifying the eligible areas and the choice of the selection criteria for enrolling forest parcels have been under the influence of competing agendas. In the present paper we study the influence of the PSA-H multi-level governance on the environmental effectiveness of the program (the degree to which forest at high risk of deforestation is enrolled), building on a "policyscape" framework. In particular, we combine governance analysis with two distinct applications of the policyscape framework. First, at the national level, we assess the functional overlap between the PSA-H and other environmental and rural programs with regard to the risk of deforestation. Second, at the regional level in the states of Chiapas and Yucatan, we describe the changing policy agenda and the role of technical intermediaries in defining the temporal spatialization of the PSA-H eligible and enrolled areas with regard to key socio-economic criteria. We find that, although at the national level the PSA-H program has been described as addressing both social and environmental indicators thanks to successful adaptive management, our analysis shows that PSA-H is mainly found in communities where deforestation risk is low and in combination with other environmental programs (protected areas and forest management programs). Such inertia is reinforced at the regional level as a result of the eligible areas' characteristics and the behaviour of technical intermediaries, which seek to minimise transaction costs and sources of uncertainty. Our project-specific analysis shows the importance of integrating the governance of a program into the policyscape framework as a way to better systematize complex interactions, at different spatial and institutional scales, between policies and landscape characteristics.

  20. Marginal Shape Deep Learning: Applications to Pediatric Lung Field Segmentation.

    PubMed

    Mansoor, Awais; Cerrolaza, Juan J; Perez, Geovanny; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-02-11

    Representation learning through deep learning (DL) architectures has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to the segmentation of objects, especially deformable objects, are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameter estimation approach of classical shape models, which often gets trapped in local minima, the proposed framework is robust to local minima and illumination changes. Furthermore, since the direct application of DL to a multi-parameter estimation problem results in very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with the classical ASM [1] (p-value=0.01) using the same configuration). To the best of our knowledge this is the first demonstration of using a DL framework for parametrized shape learning for the delineation of deformable objects.
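
    The marginal eigenspace construction can be sketched as follows: center the training shapes, compute variance-ordered PCA modes, and obtain the per-shape mode coefficients that MaShDL's deep classifiers would estimate one mode at a time; the landmark data here are random placeholders:

      import numpy as np

      shapes = np.random.randn(314, 2 * 64)   # placeholder: 64 (x, y) landmarks
      mean = shapes.mean(axis=0)
      U, S, Vt = np.linalg.svd(shapes - mean, full_matrices=False)

      n_modes = 4                              # four highest modes of variation
      modes = Vt[:n_modes]                     # eigenvectors, variance-ordered
      params = (shapes - mean) @ modes.T       # per-shape mode coefficients

      # MaShDL would now train one deep classifier per mode, highest-variance
      # first, each conditioned on the modes already estimated.
      recon = mean + params @ modes            # shapes rebuilt from 4 modes
      err = np.linalg.norm(recon - shapes, axis=1).mean()
      print(f"mean reconstruction error with {n_modes} modes: {err:.3f}")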

  1. Marginal shape deep learning: applications to pediatric lung field segmentation

    NASA Astrophysics Data System (ADS)

    Mansoor, Awais; Cerrolaza, Juan J.; Perez, Geovany; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-02-01

    Representation learning through deep learning (DL) architectures has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to the segmentation of objects, especially deformable objects, are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameter estimation approach of classical shape models, which often gets trapped in local minima, the proposed framework is robust to local minima and illumination changes. Furthermore, since the direct application of DL to a multi-parameter estimation problem results in very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with the classical ASM [1] (p-value=0.01) using the same configuration). To the best of our knowledge this is the first demonstration of using a DL framework for parametrized shape learning for the delineation of deformable objects.

  2. Marginal Shape Deep Learning: Applications to Pediatric Lung Field Segmentation

    PubMed Central

    Mansoor, Awais; Cerrolaza, Juan J.; Perez, Geovanny; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-01-01

    Representation learning through deep learning (DL) architectures has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to the segmentation of objects, especially deformable objects, are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameter estimation approach of classical shape models, which often gets trapped in local minima, the proposed framework is robust to local minima and illumination changes. Furthermore, since the direct application of DL to a multi-parameter estimation problem results in very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with the classical ASM [1] (p-value=0.01) using the same configuration). To the best of our knowledge this is the first demonstration of using a DL framework for parametrized shape learning for the delineation of deformable objects. PMID:28592911

  3. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    PubMed Central

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describe a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging. PMID:29084216

  4. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    PubMed

    Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T

    2017-10-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describe a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging.
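
    A common mechanistic building block for individual-to-individual models of this kind is a distance-decaying infection hazard; the sketch below shows a generic one-step transmission draw under an exponential spatial kernel, not the authors' exact likelihood (all parameter values are placeholders):

      import numpy as np

      def infection_prob(dist, beta=0.02, phi=5.0, dt=1.0):
          """P(infection over dt) from one infective at distance dist (km),
          with exponential spatial kernel exp(-dist / phi)."""
          hazard = beta * np.exp(-dist / phi)
          return 1.0 - np.exp(-hazard * dt)

      rng = np.random.default_rng(3)
      sus = rng.uniform(0, 50, size=(200, 2))     # susceptible coordinates
      inf = rng.uniform(0, 50, size=(5, 2))       # current infectives
      d = np.linalg.norm(sus[:, None, :] - inf[None, :, :], axis=2)
      p_any = 1.0 - np.prod(1.0 - infection_prob(d), axis=1)
      new_cases = rng.uniform(size=200) < p_any   # stochastic transmission draw
      print("new infections this step:", int(new_cases.sum()))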

  5. Method for the capture and storage of waste

    DOEpatents

    None

    2017-01-24

    Systems and methods for capturing waste are disclosed. The systems and methods provide a high level of confinement and long-term stability. They include adsorbing waste into a metal-organic framework (MOF) and applying pressure to the MOF to crystallize it or render it amorphous, thereby changing the MOF's pore structure and sorption characteristics without collapsing the framework.

  6. Graph-based Data Modeling and Analysis for Data Fusion in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Fan, Lei

    Hyperspectral imaging provides increased sensitivity and discrimination over traditional imaging methods by combining standard digital imaging with spectroscopic methods. For each individual pixel in a hyperspectral image (HSI), a continuous spectrum is sampled as the spectral reflectance/radiance signature to facilitate identification of ground cover and surface material. The abundant spectral knowledge allows all available information in the data to be mined. These qualities give hyperspectral imaging wide applications such as mineral exploration, agriculture monitoring, and ecological surveillance. The processing of massive high-dimensional HSI datasets is a challenge, since many data processing techniques have a computational complexity that grows exponentially with the dimension. Moreover, an HSI dataset may contain a limited number of degrees of freedom due to the high correlations between data points and among the spectra. On the other hand, merely taking advantage of the sampled spectrum of an individual HSI data point may produce inaccurate results due to the mixed nature of raw HSI data, such as mixed pixels and optical interference. Fusion strategies are widely adopted in data processing to achieve better performance, especially in the fields of classification and clustering. There are mainly three types of fusion strategies, namely low-level data fusion, intermediate-level feature fusion, and high-level decision fusion. Low-level data fusion combines multi-source data that are expected to be complementary or cooperative. Intermediate-level feature fusion aims at the selection and combination of features to remove redundant information. Decision-level fusion exploits a set of classifiers to provide more accurate results. These fusion strategies have wide applications, including HSI data processing. With the fast development of multiple remote sensing modalities, e.g. Very High Resolution (VHR) optical sensors and LiDAR, fusion of multi-source data can in principle produce more detailed information than any single source. On the other hand, besides the abundant spectral information contained in HSI data, features such as texture and shape may be employed to represent data points from a spatial perspective. Furthermore, feature fusion also includes the strategy of removing redundant and noisy features from the dataset. One of the major problems in machine learning and pattern recognition is to develop appropriate representations for complex nonlinear data. In HSI processing, a particular data point is usually described as a vector with coordinates corresponding to the intensities measured in the spectral bands. This vector representation permits the application of linear and nonlinear transformations from linear algebra to find an alternative representation of the data. More generally, HSI is multi-dimensional in nature and the vector representation may lose the contextual correlations. Tensor representation provides a more sophisticated modeling technique and a higher-order generalization of linear subspace analysis. In graph theory, data points can be generalized as nodes with connectivities measured from the proximity of a local neighborhood. The graph-based framework efficiently characterizes the relationships among the data and allows for convenient mathematical manipulation in many applications, such as data clustering, feature extraction, feature selection and data alignment.
    In this thesis, graph-based approaches applied in the field of multi-source feature and data fusion in the remote sensing area are explored. We mainly investigate the fusion of spatial, spectral and LiDAR information with linear and multilinear algebra under a graph-based framework for data clustering and classification problems.
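
    At the heart of such graph-based frameworks is an affinity graph over pixels or spectra; a minimal Gaussian-weighted k-nearest-neighbour construction in numpy (sizes and parameters arbitrary):

      import numpy as np

      def knn_affinity(X, k=10, sigma=1.0):
          """X: (n, d) pixel spectra; returns a symmetric (n, n) affinity matrix."""
          sq = (X ** 2).sum(axis=1)
          d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise sq. distances
          np.fill_diagonal(d2, np.inf)                     # exclude self-edges
          W = np.zeros_like(d2)
          idx = np.argsort(d2, axis=1)[:, :k]              # k nearest neighbours
          rows = np.repeat(np.arange(len(X)), k)
          W[rows, idx.ravel()] = np.exp(-d2[rows, idx.ravel()] / (2 * sigma ** 2))
          return np.maximum(W, W.T)                        # symmetrize

      spectra = np.random.rand(300, 120)    # 300 pixels x 120 spectral bands
      W = knn_affinity(spectra, k=10, sigma=0.5)
      L = np.diag(W.sum(axis=1)) - W        # graph Laplacian, e.g. for clustering
      print(W.shape, (W > 0).sum(axis=1).mean())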

  7. Toward compositional design of reticular type porous films by mixing and coating titania-based frameworks with silica

    NASA Astrophysics Data System (ADS)

    Kimura, T.

    2015-12-01

    A recently developed reticular type porous structure, which can be fabricated as a film through soft colloidal block copolymer (e.g., PS-b-PEO) templating, is very promising as a porous platform: it offers high performance based on its high surface area, high diffusivity of targeted organic molecules, and effective accommodation of bulky molecules. However, the compositional design of the oxide frameworks has not been developed far enough to date. Here, I report reliable synthetic methods for extending the reticular type porous structure to simple compositional variations. Owing to the reproducibility of reticular type porous titania films from titanium alkoxide (e.g., TTIP; titanium tetraisopropoxide), a titania-silica film having a similar porous structure was obtained by mixing silicon alkoxide (e.g., tetraethoxysilane) with TTIP followed by their pre-hydrolysis, and a Ti:Si mixing ratio of 1.0 was easily reached. For further compositional design, a concept of surface coating was widely applicable; the reticular type porous titania surfaces can be coated with other oxides such as silica. Here, silica coating was successfully achieved by simple chemical vapor deposition of silicon alkoxide (e.g., tetramethoxysilane) without water (or with water at the humidity level), which was also utilized for pore filling with silica by a similar process with water.

  8. Mapping morphological shape as a high-dimensional functional curve

    PubMed Central

    Fu, Guifang; Huang, Mian; Bo, Wenhao; Hao, Han; Wu, Rongling

    2018-01-01

    Abstract Detecting how genes regulate biological shape has become a multidisciplinary research interest because of its wide application in many disciplines. Despite its fundamental importance, the challenges of accurately extracting information from an image, statistically modeling the high-dimensional shape and meticulously locating shape quantitative trait loci (QTL) affect the progress of this research. In this article, we propose a novel integrated framework that incorporates shape analysis, statistical curve modeling and genetic mapping to detect significant QTLs regulating variation of biological shape traits. After quantifying morphological shape via a radius centroid contour approach, each shape, as a phenotype, was characterized as a high-dimensional curve, varying as angle θ runs clockwise with the first point starting from angle zero. We then modeled the dynamic trajectories of three mean curves and variation patterns as functions of θ. Our framework led to the detection of a few significant QTLs regulating the variation of leaf shape collected from a natural population of poplar, Populus szechuanica var tibetica. This population, distributed at altitudes 2000–4500 m above sea level, is an evolutionarily important plant species. This is the first work in the quantitative genetic shape mapping area that emphasizes a sense of ‘function’ instead of decomposing the shape into a few discrete principal components, as the majority of shape studies do. PMID:28062411
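
    The radius centroid contour extraction itself is straightforward: take the boundary of a binary shape mask, measure each boundary pixel's distance from the centroid, and resample against angle to obtain the fixed-length functional curve. A minimal sketch under those assumptions (scipy used only for boundary extraction):

      import numpy as np
      from scipy.ndimage import binary_erosion

      def radius_contour(mask, n_angles=360):
          """mask: 2-D boolean array of the shape; returns radius(theta)."""
          boundary = mask & ~binary_erosion(mask)
          ys, xs = np.nonzero(boundary)
          cy, cx = np.nonzero(mask)[0].mean(), np.nonzero(mask)[1].mean()
          theta = np.arctan2(ys - cy, xs - cx) % (2 * np.pi)
          r = np.hypot(ys - cy, xs - cx)
          # Bin boundary pixels by angle; keep the outermost radius per bin.
          bins = (theta / (2 * np.pi) * n_angles).astype(int)
          curve = np.zeros(n_angles)
          np.maximum.at(curve, bins, r)
          return curve                    # the high-dimensional functional phenotype

      mask = np.zeros((64, 64), bool)
      yy, xx = np.mgrid[:64, :64]
      mask[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = True   # toy "leaf"
      print(radius_contour(mask).round(1)[:8])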

  9. Graduate Attributes for Master's Programs in Health Services and Policy Research: Results of a National Consultation

    PubMed Central

    Morgan, Steve; Orr, Karen; Mah, Catherine

    2010-01-01

    Objective: Our objective was to identify desirable attributes to be developed through graduate training in health services and policy research (HSPR) by identifying the knowledge, skills and abilities thought to be keys to success in HSPR-related careers. We aimed for a framework clear enough to serve as a touchstone for HSPR training programs across Canada yet flexible enough to permit diversity of specialization across and within those programs. Methods: Our approach involved several stages of data collection and analysis: a review of literature; telephone interviews with opinion leaders; online surveys of HSPR students, recent graduates and employers; an invitational workshop; and an interactive panel at a national conference. Our final framework was arrived at through an iterative process of thematic analysis, reflection on invited feedback from consultation participants and triangulation with existing competency frameworks. Results: Our final result was a framework that identifies traits, knowledge and abilities of master's-level graduates who are capable of fostering health system improvement through planning, management, analysis or monitoring that is informed by credible evidence and relevant theory. These attributes are organized into three levels: generic graduate attributes, knowledge related to health and health systems and, finally, attributes related to the application of knowledge for health system improvement. The HSPR-specific attributes include not only an understanding of HSPR theories and methods but also the skills related to the practical application of knowledge in the complex environments of health system decision-making and healthcare policy. Conclusion: Master's-level HSPR training programs should prepare students to pose and seek answers to important questions and provide them with the skills necessary to apply their knowledge within complex decision-making environments. PMID:21804839

  10. Approach for describing spatial dynamics of quantum light-matter interaction in dispersive dissipative media

    NASA Astrophysics Data System (ADS)

    Zyablovsky, A. A.; Andrianov, E. S.; Nechepurenko, I. A.; Dorofeenko, A. V.; Pukhov, A. A.; Vinogradov, A. P.

    2017-05-01

    Solving the challenging problem of the amplification and generation of an electromagnetic field in nanostructures would enable many properties of the electromagnetic field at the nanoscale to be implemented in practical applications. A first-principles quantum-mechanical treatment of such a problem is severely restricted by the exponentially large number of degrees of freedom and does not allow the electromagnetic-field dynamics to be described when a large number of interacting atoms and electromagnetic-field modes is involved. Conversely, the classical description of electromagnetic fields is incorrect at the nanoscale due to the high level of quantum fluctuations connected to high dissipation and noise levels. In this paper, we develop a framework with a significantly reduced number of degrees of freedom, which describes the quantum spatial dynamics of electromagnetic fields interacting with atoms. As an example, we consider the interaction between atoms placed in a metallic subwavelength groove and demonstrate that a spontaneously excited electromagnetic pulse propagates with the group velocity. The developed approach may be exploited to describe nonuniform amplification and propagation of electromagnetic fields in arbitrary dispersive dissipative systems.

  11. Ultrathin Hierarchical Porous Carbon Nanosheets for High-Performance Supercapacitors and Redox Electrolyte Energy Storage.

    PubMed

    Jayaramulu, Kolleboyina; Dubal, Deepak P; Nagar, Bhawna; Ranc, Vaclav; Tomanec, Ondrej; Petr, Martin; Datta, Kasibhatta Kumara Ramanatha; Zboril, Radek; Gómez-Romero, Pedro; Fischer, Roland A

    2018-04-01

    The design of advanced high-energy-density supercapacitors requires unique materials that combine hierarchical nanoporous structures with high surface area to facilitate ion transport and excellent electrolyte permeability. Here, shape-controlled 2D nanoporous carbon sheets (NPSs) with graphitic wall structure are developed through the pyrolysis of metal-organic frameworks (MOFs). As a proof-of-concept application, the obtained NPSs are used as the electrode material for a supercapacitor. The carbon-sheet-based symmetric cell shows an ultrahigh Brunauer-Emmett-Teller (BET)-area-normalized capacitance of 21.4 µF cm-2 (233 F g-1), exceeding other carbon-based supercapacitors. The addition of potassium iodide as a redox-active species in sulfuric acid (the supporting electrolyte) leads to a ground-breaking enhancement in the energy density, up to 90 Wh kg-1, which is higher than commercial aqueous rechargeable batteries, while maintaining superior power density. Thus, the new material offers a twofold benefit: battery-level energy density and capacitor-level power density. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework

    NASA Astrophysics Data System (ADS)

    Hermawan; Hastarista, Fika

    2016-01-01

    The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, the Colaborated Architechture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts drawn from the various UML models and from Software Requirement Specification (SRS) documents. This modeling is used to develop Enterprise Resource Planning (ERP) software. Because an ERP covers a large number of applications with complex relations, it is necessary to use an Agile Model Driven Design (AMDD) approach as an advanced method to transform the MDA into application modules efficiently and accurately. Finally, the use of the CAF achieved good fulfilment of the needs of all stakeholders involved in the overall Rational Unified Process (RUP), and high satisfaction with the functionality of the ERP software at PT. Iglas (Persero) Gresik.

  13. Management of gastric cancer in Asia: resource-stratified guidelines.

    PubMed

    Shen, Lin; Shan, Yan-Shen; Hu, Huang-Ming; Price, Timothy J; Sirohi, Bhawna; Yeh, Kun-Huei; Yang, Yi-Hsin; Sano, Takeshi; Yang, Han-Kwang; Zhang, Xiaotian; Park, Sook Ryun; Fujii, Masashi; Kang, Yoon-Koo; Chen, Li-Tzong

    2013-11-01

    Gastric cancer is the fourth most common cancer globally, and is the second most common cause of death from cancer worldwide. About three-quarters of newly diagnosed cases in 2008 were from Asian countries. With a high mortality-to-incidence ratio, management of gastric cancer is challenging. We discuss evidence for optimum management of gastric cancer in aspects of screening and early detection, diagnosis, and staging; endoscopic and surgical intervention; and the concepts of perioperative, postoperative, and palliative chemotherapy and use of molecularly targeted therapy. Recommendations are formulated on the basis of the framework provided by the Breast Health Global Initiative, using the categories of basic, limited, enhanced, and maximum level. We aim to provide a stepwise strategy for management of gastric cancer applicable to different levels of health-care resources in Asian countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. An evolving systems-based methodology for healthcare planning.

    PubMed

    Warwick, Jon; Bell, Gary

    2007-01-01

    Healthcare planning seems beset with problems at all hierarchical levels. These are caused by the 'soft' nature of many of the issues present in healthcare planning and the high levels of complexity inherent in healthcare services. There has, in recent years, been a move to utilize systems thinking ideas in an effort to gain a better understanding of the forces at work within the healthcare environment, and these have had some success. This paper argues that systems-based methodologies can be further enhanced by metrication and modeling, which assist in exploring the changed emergent behavior of a system resulting from management intervention. The paper describes the Holon Framework as an evolving systems-based approach that has been used to help clients understand complex systems (in the education domain) and that would have application in the analysis of healthcare problems.

  15. Mathematical Fundamentals of Probabilistic Semantics for High-Level Fusion

    DTIC Science & Technology

    2013-12-02

    understanding of the fundamental aspects of uncertainty representation and reasoning that a theory of hard and soft high-level fusion must encompass. Successful completion requires an unbiased, in-depth ... and soft information is the lack of a fundamental HLIF theory, backed by a consistent mathematical framework and supporting algorithms. Although there ...

  16. A tunable azine covalent organic framework platform for visible light-induced hydrogen generation

    PubMed Central

    Vyas, Vijay S.; Haase, Frederik; Stegbauer, Linus; Savasci, Gökcen; Podjaski, Filip; Ochsenfeld, Christian; Lotsch, Bettina V.

    2015-01-01

    Hydrogen evolution from photocatalytic reduction of water holds promise as a sustainable source of carbon-free energy. Covalent organic frameworks (COFs) present an interesting new class of photoactive materials, which combine three key features relevant to the photocatalytic process, namely crystallinity, porosity and tunability. Here we synthesize a series of water- and photostable 2D azine-linked COFs from hydrazine and triphenylarene aldehydes with varying numbers of nitrogen atoms. The electronic and steric variations in the precursors are transferred to the resulting frameworks, thus leading to progressively enhanced light-induced hydrogen evolution with increasing nitrogen content in the frameworks. Our results demonstrate that by the rational design of COFs on a molecular level, it is possible to precisely adjust their structural and optoelectronic properties, thus resulting in enhanced photocatalytic activities. This is expected to spur further interest in these photofunctional frameworks, where rational supramolecular engineering may lead to new material applications. PMID:26419805

  17. Knowledge mapping as a technique to support knowledge translation.

    PubMed Central

    Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.

    2006-01-01

    This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651

  18. Spatio-structural granularity of biological material entities

    PubMed Central

    2010-01-01

    Background With the continuously increasing demands on knowledge- and data-management that databases have to meet, ontologies and the theories of granularity they use become more and more important. Unfortunately, currently used theories and schemes of granularity unnecessarily limit the performance of ontologies due to two shortcomings: (i) they do not allow the integration of multiple granularity perspectives into one granularity framework; (ii) they are not applicable to cumulative-constitutively organized material entities, which cover most of the biomedical material entities. Results The above-mentioned shortcomings are responsible for the major inconsistencies in currently used spatio-structural granularity schemes. By using the Basic Formal Ontology (BFO) as a top-level ontology and Keet's general theory of granularity, a granularity framework is presented that is applicable to cumulative-constitutively organized material entities. It provides a scheme for granulating complex material entities into their constitutive and regional parts by integrating various compositional and spatial granularity perspectives. Within a scale-dependent resolution perspective, it even allows distinguishing different types of representations of the same material entity. Within other scale-dependent perspectives, which are based on specific types of measurements (e.g. weight, volume, etc.), the possibility of organizing instances of material entities independent of their parthood relations and only according to increasing measures is provided as well. All granularity perspectives are connected to one another through overcrossing granularity levels, together forming an integrated whole that uses the compositional object perspective as an integrating backbone. This granularity framework allows structural granularity values to be assigned consistently to all types of material entities. Conclusions The framework presented here provides a spatio-structural granularity framework for all domain reference ontologies that model cumulative-constitutively organized material entities. With its multi-perspective approach, it allows an ontology stored in a database to be queried at any desired level of detail: the contents of a database can be organized according to diverse granularity perspectives, which in turn provide different views on its content (i.e. data, knowledge), each organized into different levels of detail. PMID:20509878

  19. Convolutional Neural Network-Based Robot Navigation Using Uncalibrated Spherical Images †

    PubMed Central

    Ran, Lingyan; Zhang, Yanning; Zhang, Qilin; Yang, Tao

    2017-01-01

    Vision-based mobile robot navigation is a vibrant area of research with numerous algorithms having been developed, the vast majority of which either belong to the scene-oriented simultaneous localization and mapping (SLAM) or fall into the category of robot-oriented lane-detection/trajectory tracking. These methods suffer from high computational cost and require stringent labelling and calibration efforts. To address these challenges, this paper proposes a lightweight robot navigation framework based purely on uncalibrated spherical images. To simplify the orientation estimation, path prediction and improve computational efficiency, the navigation problem is decomposed into a series of classification tasks. To mitigate the adverse effects of insufficient negative samples in the “navigation via classification” task, we introduce the spherical camera for scene capturing, which enables 360° fisheye panorama as training samples and generation of sufficient positive and negative heading directions. The classification is implemented as an end-to-end Convolutional Neural Network (CNN), trained on our proposed Spherical-Navi image dataset, whose category labels can be efficiently collected. This CNN is capable of predicting potential path directions with high confidence levels based on a single, uncalibrated spherical image. Experimental results demonstrate that the proposed framework outperforms competing ones in realistic applications. PMID:28604624
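
    Since the paper recasts navigation as classification over a single panorama, a minimal PyTorch sketch of that idea follows; the layer sizes and the five-way heading split are illustrative assumptions, not the authors' published architecture.

        import torch
        import torch.nn as nn

        class HeadingCNN(nn.Module):
            """Map an equirectangular panorama to discrete heading classes."""
            def __init__(self, num_headings=5):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.classifier = nn.Linear(64, num_headings)

            def forward(self, x):
                return self.classifier(self.features(x).flatten(1))

        model = HeadingCNN()
        panorama = torch.randn(1, 3, 224, 448)         # dummy spherical frame
        probs = torch.softmax(model(panorama), dim=1)  # confidence per heading
        heading = probs.argmax(dim=1)

    Decomposing navigation into such a classifier sidesteps metric calibration entirely: the network only has to rank candidate headings, not reconstruct geometry.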

  20. DIRAC3 - the new generation of the LHCb grid software

    NASA Astrophysics Data System (ADS)

    Tsaregorodtsev, A.; Brook, N.; Casajus Ramo, A.; Charpentier, Ph; Closier, J.; Cowan, G.; Graciani Diaz, R.; Lanciotti, E.; Mathe, Z.; Nandakumar, R.; Paterson, S.; Romanovsky, V.; Santinelli, R.; Sapunov, M.; Smith, A. C.; Seco Miguelez, M.; Zhelezov, A.

    2010-04-01

    DIRAC, the LHCb community Grid solution, was considerably reengineered in order to meet all the requirements for processing the data coming from the LHCb experiment. It covers all tasks, from raw data transportation from the experiment area to grid storage, through data processing, up to final user analysis. The reengineered DIRAC3 version of the system includes a fully grid-security-compliant framework for building service-oriented distributed systems; a complete Pilot Job framework for creating efficient workload management systems; and several subsystems to manage high-level operations such as data production and distribution management. The user interfaces of the DIRAC3 system, providing rich command line and scripting tools, are complemented by a full-featured Web portal giving users secure access to all the details of the system status and ongoing activities. We will present an overview of the DIRAC3 architecture, its new innovative features and the achieved performance. Extending DIRAC3 to manage computing resources beyond the WLCG grid will be discussed. Experience using DIRAC3 in user communities other than LHCb and in application domains other than High Energy Physics will be shown to demonstrate the general-purpose nature of the system.

  1. Convolutional Neural Network-Based Robot Navigation Using Uncalibrated Spherical Images.

    PubMed

    Ran, Lingyan; Zhang, Yanning; Zhang, Qilin; Yang, Tao

    2017-06-12

    Vision-based mobile robot navigation is a vibrant area of research with numerous algorithms having been developed, the vast majority of which either belong to the scene-oriented simultaneous localization and mapping (SLAM) or fall into the category of robot-oriented lane-detection/trajectory tracking. These methods suffer from high computational cost and require stringent labelling and calibration efforts. To address these challenges, this paper proposes a lightweight robot navigation framework based purely on uncalibrated spherical images. To simplify the orientation estimation, path prediction and improve computational efficiency, the navigation problem is decomposed into a series of classification tasks. To mitigate the adverse effects of insufficient negative samples in the "navigation via classification" task, we introduce the spherical camera for scene capturing, which enables 360° fisheye panorama as training samples and generation of sufficient positive and negative heading directions. The classification is implemented as an end-to-end Convolutional Neural Network (CNN), trained on our proposed Spherical-Navi image dataset, whose category labels can be efficiently collected. This CNN is capable of predicting potential path directions with high confidence levels based on a single, uncalibrated spherical image. Experimental results demonstrate that the proposed framework outperforms competing ones in realistic applications.

  2. Simulation and Experimental Study of Metal Organic Frameworks Used in Adsorption Cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenks, Jeromy J.; Motkuri, Radha K.; TeGrotenhuis, Ward

    2016-10-11

    Metal-organic frameworks (MOFs) have attracted enormous interest over the past few years for energy storage and gas separation, yet there have been few reports on adsorption cooling applications. Adsorption cooling technology is an established alternative to mechanical vapor compression refrigeration systems and is an excellent option in industrial environments where waste heat is available. We explored the use of MOFs that have very high mass loading and relatively low heats of adsorption, with certain combinations of refrigerants, to demonstrate a new type of highly efficient adsorption chiller. Computational fluid dynamics combined with a system-level lumped-parameter model have been used to project size and performance for chillers with a cooling capacity ranging from a few kW to several thousand kW. These systems rely on stacked micro/mini-scale architectures to enhance heat and mass transfer. Recent computational studies of an adsorption chiller based on MOFs suggest that a thermally driven coefficient of performance greater than one may be possible, which would represent a fundamental breakthrough in the performance of adsorption chiller technology. Presented herein are computational and experimental results for hydrophilic and fluorophilic MOFs.

  3. Breaking Down Chemical Weapons by Metal-Organic Frameworks.

    PubMed

    Mondal, Suvendu Sekhar; Holdt, Hans-Jürgen

    2016-01-04

    Seek and destroy: Filtration schemes and self-detoxifying protective fabrics based on the Zr(IV)-containing metal-organic frameworks (MOFs) MOF-808 and UiO-66 doped with LiOtBu have been developed that capture and hydrolytically detoxify simulants of nerve agents and mustard gas. Both MOFs function as highly catalytic elements in these applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. A framework for analysis of sentinel events in medical student education.

    PubMed

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes and whether it is universally applicable to other institutions.

  5. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.
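
    To make the function-behavior-state idea concrete, here is a minimal sketch of how an interface could link expected states back to plant functions for fault diagnosis; the class layout, the example function, and the 10% tolerance are illustrative assumptions, not the paper's formal model.

        from dataclasses import dataclass, field

        @dataclass
        class State:
            name: str
            value: float          # expected value of the physical state

        @dataclass
        class Behavior:
            name: str
            inputs: list
            outputs: list         # states this behavior should produce

        @dataclass
        class Function:
            name: str             # e.g. "maintain coolant flow"
            behaviors: list = field(default_factory=list)

        def diagnose(functions, observed):
            """Flag functions whose expected output states deviate from observations."""
            suspects = []
            for f in functions:
                for b in f.behaviors:
                    for s in b.outputs:
                        if abs(observed.get(s.name, s.value) - s.value) > 0.1 * abs(s.value) + 1e-9:
                            suspects.append((f.name, b.name, s.name))
            return suspects

        pump = Function("maintain coolant flow",
                        [Behavior("pumping", inputs=["power"],
                                  outputs=[State("flow_rate", 100.0)])])
        print(diagnose([pump], {"flow_rate": 20.0}))  # flags the pump function

    Traversing the state-behavior-function chain in this way is what lets an interface explain a deviation in functional terms rather than as a bare alarm.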

  6. Detecting Adaptation in Protein-Coding Genes Using a Bayesian Site-Heterogeneous Mutation-Selection Codon Substitution Model.

    PubMed

    Rodrigue, Nicolas; Lartillot, Nicolas

    2017-01-01

    Codon substitution models have traditionally attempted to uncover signatures of adaptation within protein-coding genes by contrasting the rates of synonymous and non-synonymous substitutions. Another modeling approach, known as the mutation-selection framework, attempts to explicitly account for selective patterns at the amino acid level, with some approaches allowing for heterogeneity in these patterns across codon sites. Under such a model, substitutions at a given position occur at the neutral or nearly neutral rate when they are synonymous, or when they correspond to replacements between amino acids of similar fitness; substitutions from high to low (low to high) fitness amino acids have comparatively low (high) rates. Here, we study the use of such a mutation-selection framework as a null model for the detection of adaptation. Following previous works in this direction, we include a deviation parameter that has the effect of capturing the surplus, or deficit, in non-synonymous rates, relative to what would be expected under a mutation-selection modeling framework that includes a Dirichlet process approach to account for across-codon-site variation in amino acid fitness profiles. We use simulations, along with a few real data sets, to study the behavior of the approach, and find it to have good power with a low false-positive rate. Altogether, we emphasize the potential of recent mutation-selection models in the detection of adaptation, calling for further model refinements as well as large-scale applications. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
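
    The rate logic described above can be written down compactly. The sketch below uses the standard mutation-selection scaling (Halpern-Bruno), in which the fixation factor S/(1 − e⁻ˢ) inflates or deflates a neutral mutation rate; it illustrates the general framework, not the authors' Dirichlet-process implementation.

        import math

        def fixation_factor(s):
            """S / (1 - exp(-S)): >1 for beneficial, <1 for deleterious, ->1 as S->0."""
            if abs(s) < 1e-8:
                return 1.0
            return s / (1.0 - math.exp(-s))

        def substitution_rate(mu, fitness_from, fitness_to):
            """Non-synonymous rate under mutation-selection balance.

            mu is the neutral (mutation-level) rate; fitnesses enter through
            the scaled selection coefficient S = log f_to - log f_from.
            """
            s = math.log(fitness_to) - math.log(fitness_from)
            return mu * fixation_factor(s)

        # A replacement toward a fitter amino acid outpaces the reverse one:
        print(substitution_rate(1.0, 0.01, 0.10))  # high rate (S > 0)
        print(substitution_rate(1.0, 0.10, 0.01))  # low rate  (S < 0)

    The deviation parameter discussed in the abstract can be thought of as a multiplier on such non-synonymous rates, so a fitted value above one signals a surplus of replacements beyond what mutation-selection balance predicts.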

  7. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings.

    PubMed

    Dreibelbis, Robert; Winch, Peter J; Leontsini, Elli; Hulland, Kristyna R S; Ram, Pavani K; Unicomb, Leanne; Luby, Stephen P

    2013-10-26

    Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature, and from feedback through concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions, to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and had largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five levels (structural, community, household, individual, and habitual). A number of WASH-specific models and frameworks exist, yet with some limitations. The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices.

  8. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

    Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature, and from feedback through concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions, to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and had largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five levels (structural, community, household, individual, and habitual). Conclusions A number of WASH-specific models and frameworks exist, yet with some limitations. The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices. PMID:24160869

  9. Film grain synthesis and its application to re-graining

    NASA Astrophysics Data System (ADS)

    Schallauer, Peter; Mörzinger, Roland

    2006-01-01

    Digital film restoration and special effects compositing require more and more automatic procedures for movie re-graining, since missing or inhomogeneous grain decreases perceived quality. For the purpose of grain synthesis, an existing texture synthesis algorithm has been evaluated and optimized. We show that this algorithm can produce synthetic grain that is perceptually similar to a given grain template, has high spatial and temporal variation, and can be applied to multi-spectral images. Furthermore, a re-graining application framework is proposed, which synthesizes artificial grain from an input grain template and composites it with the original image content. Due to its modular approach, this framework supports manual as well as automatic re-graining applications. Two example applications are presented, one for re-graining an entire movie and one for fully automatic re-graining of image regions produced by restoration algorithms. The low computational cost of the proposed algorithms allows application in industrial-grade software.
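
    The template-then-composite pipeline can be illustrated with a simple spectral-matching approach: shape white noise by the template's Fourier magnitude so the result is perceptually similar to the grain but spatially uncorrelated with it. This is an illustrative stand-in, not the authors' optimized texture synthesis algorithm.

        import numpy as np

        def synthesize_grain(template, shape, rng=np.random.default_rng()):
            """Generate grain with the same power spectrum as a template patch."""
            spectrum = np.abs(np.fft.fft2(template - template.mean(), s=shape))
            noise = rng.standard_normal(shape)
            shaped = np.real(np.fft.ifft2(np.fft.fft2(noise) * spectrum))
            shaped *= template.std() / (shaped.std() + 1e-12)  # match grain strength
            return shaped

        def regrain(image, template, strength=1.0):
            """Composite freshly synthesized grain onto a (de-grained) image."""
            return image + strength * synthesize_grain(template, image.shape)

    Because each frame draws fresh noise, the synthesized grain varies temporally, which is the property the paper identifies as necessary for convincing re-graining.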

  10. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
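
    Since the library is aimed at ImageJ scripting, a minimal Jython-style usage sketch follows. The class and method names (Image3DUniverse, addVoltex) follow the 3D Viewer's published examples, but treat the exact signatures as assumptions that may differ across versions; the file path is hypothetical.

        # ImageJ Jython script; run inside ImageJ with the 3D Viewer installed.
        from ij import IJ
        from ij3d import Image3DUniverse

        imp = IJ.openImage("/path/to/stack.tif")  # hypothetical image stack
        univ = Image3DUniverse()
        univ.show()                               # open the 3D window
        univ.addVoltex(imp)                       # volume-render the stack

    A handful of lines like these replace the low-level Java 3D scene-graph code the framework encapsulates, which is exactly the reduction in complexity the abstract claims.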

  11. High nitrogen-containing cotton derived 3D porous carbon frameworks for high-performance supercapacitors

    NASA Astrophysics Data System (ADS)

    Fan, Li-Zhen; Chen, Tian-Tian; Song, Wei-Li; Li, Xiaogang; Zhang, Shichao

    2015-10-01

    Supercapacitors fabricated by 3D porous carbon frameworks, such as graphene- and carbon nanotube (CNT)-based aerogels, have been highly attractive due to their various advantages. However, their high cost along with insufficient yield has inhibited their large-scale applications. Here we have demonstrated a facile and easily scalable approach for large-scale preparing novel 3D nitrogen-containing porous carbon frameworks using ultralow-cost commercial cotton. Electrochemical performance suggests that the optimal nitrogen-containing cotton-derived carbon frameworks with a high nitrogen content (12.1 mol%) along with low surface area 285 m2 g-1 present high specific capacities of the 308 and 200 F g-1 in KOH electrolyte at current densities of 0.1 and 10 A g-1, respectively, with very limited capacitance loss upon 10,000 cycles in both aqueous and gel electrolytes. Moreover, the electrode exhibits the highest capacitance up to 220 F g-1 at 0.1 A g-1 and excellent flexibility (with negligible capacitance loss under different bending angles) in the polyvinyl alcohol/KOH gel electrolyte. The observed excellent performance competes well with that found in the electrodes of similar 3D frameworks formed by graphene or CNTs. Therefore, the ultralow-cost and simply strategy here demonstrates great potential for scalable producing high-performance carbon-based supercapacitors in the industry.

  12. High nitrogen-containing cotton derived 3D porous carbon frameworks for high-performance supercapacitors.

    PubMed

    Fan, Li-Zhen; Chen, Tian-Tian; Song, Wei-Li; Li, Xiaogang; Zhang, Shichao

    2015-10-16

    Supercapacitors fabricated by 3D porous carbon frameworks, such as graphene- and carbon nanotube (CNT)-based aerogels, have been highly attractive due to their various advantages. However, their high cost along with insufficient yield has inhibited their large-scale applications. Here we have demonstrated a facile and easily scalable approach for large-scale preparing novel 3D nitrogen-containing porous carbon frameworks using ultralow-cost commercial cotton. Electrochemical performance suggests that the optimal nitrogen-containing cotton-derived carbon frameworks with a high nitrogen content (12.1 mol%) along with low surface area 285 m(2) g(-1) present high specific capacities of the 308 and 200 F g(-1) in KOH electrolyte at current densities of 0.1 and 10 A g(-1), respectively, with very limited capacitance loss upon 10,000 cycles in both aqueous and gel electrolytes. Moreover, the electrode exhibits the highest capacitance up to 220 F g(-1) at 0.1 A g(-1) and excellent flexibility (with negligible capacitance loss under different bending angles) in the polyvinyl alcohol/KOH gel electrolyte. The observed excellent performance competes well with that found in the electrodes of similar 3D frameworks formed by graphene or CNTs. Therefore, the ultralow-cost and simply strategy here demonstrates great potential for scalable producing high-performance carbon-based supercapacitors in the industry.

  13. High nitrogen-containing cotton derived 3D porous carbon frameworks for high-performance supercapacitors

    PubMed Central

    Fan, Li-Zhen; Chen, Tian-Tian; Song, Wei-Li; Li, Xiaogang; Zhang, Shichao

    2015-01-01

    Supercapacitors fabricated by 3D porous carbon frameworks, such as graphene- and carbon nanotube (CNT)-based aerogels, have been highly attractive due to their various advantages. However, their high cost along with insufficient yield has inhibited their large-scale applications. Here we have demonstrated a facile and easily scalable approach for large-scale preparing novel 3D nitrogen-containing porous carbon frameworks using ultralow-cost commercial cotton. Electrochemical performance suggests that the optimal nitrogen-containing cotton-derived carbon frameworks with a high nitrogen content (12.1 mol%) along with low surface area 285 m2 g−1 present high specific capacities of the 308 and 200 F g−1 in KOH electrolyte at current densities of 0.1 and 10 A g−1, respectively, with very limited capacitance loss upon 10,000 cycles in both aqueous and gel electrolytes. Moreover, the electrode exhibits the highest capacitance up to 220 F g−1 at 0.1 A g−1 and excellent flexibility (with negligible capacitance loss under different bending angles) in the polyvinyl alcohol/KOH gel electrolyte. The observed excellent performance competes well with that found in the electrodes of similar 3D frameworks formed by graphene or CNTs. Therefore, the ultralow-cost and simply strategy here demonstrates great potential for scalable producing high-performance carbon-based supercapacitors in the industry. PMID:26472144

  14. Pharmaceuticals in tap water: human health risk assessment and proposed monitoring framework in China.

    PubMed

    Leung, Ho Wing; Jin, Ling; Wei, Si; Tsui, Mirabelle Mei Po; Zhou, Bingsheng; Jiao, Liping; Cheung, Pak Chuen; Chun, Yiu Kan; Murphy, Margaret Burkhardt; Lam, Paul Kwan Sing

    2013-07-01

    Pharmaceuticals are known to contaminate tap water worldwide, but the relevant human health risks have not been assessed in China. We monitored 32 pharmaceuticals in Chinese tap water and evaluated the life-long human health risks of exposure in order to provide information for future prioritization and risk management. We analyzed samples (n = 113) from 13 cities and compared detected concentrations with existing or newly-derived safety levels for assessing risk quotients (RQs) at different life stages, excluding the prenatal stage. We detected 17 pharmaceuticals in 89% of samples, with most detectable concentrations (92%) at < 50 ng/L. Caffeine (median-maximum, nanograms per liter: 24.4-564), metronidazole (1.8-19.3), salicylic acid (16.6-41.2), clofibric acid (1.2-3.3), carbamazepine (1.3-6.7), and dimetridazole (6.9-14.7) were found in ≥ 20% of samples. Cities within the Yangtze River region and Guangzhou were regarded as contamination hot spots because of elevated levels and frequent positive detections. Of the 17 pharmaceuticals detected, 13 showed very low risk levels, but 4 (i.e., dimetridazole, thiamphenicol, sulfamethazine, and clarithromycin) were found to have at least one life-stage RQ ≥ 0.01, especially for the infant and child life stages, and should be considered of high priority for management. We propose an indicator-based monitoring framework for providing information for source identification, water treatment effectiveness, and water safety management in China. Chinese tap water is an additional route of human exposure to pharmaceuticals, particularly for dimetridazole, although the risk to human health is low based on current toxicity data. Pharmaceutical detection and application of the proposed monitoring framework can be used for water source protection and risk management in China and elsewhere.
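
    A risk quotient of this kind is simple arithmetic: measured concentration divided by a life-stage safety level. The sketch below shows a generic derivation of such a safety level from an acceptable daily intake; the ADI, body weight, and intake values are placeholders, not the paper's inputs.

        def safety_level_ng_per_L(adi_ng_per_kg_day, body_weight_kg, intake_L_per_day):
            """Drinking-water safety level from an acceptable daily intake
            (a generic derivation, not necessarily the authors' exact procedure)."""
            return adi_ng_per_kg_day * body_weight_kg / intake_L_per_day

        def risk_quotient(conc_ng_L, safety_ng_L):
            return conc_ng_L / safety_ng_L

        # Example: infant life stage (assumed 5 kg, 0.75 L/day), hypothetical ADI,
        # with the paper's maximum dimetridazole concentration of 14.7 ng/L.
        rq = risk_quotient(14.7, safety_level_ng_per_L(1000.0, 5.0, 0.75))
        print(rq, rq >= 0.01)  # the paper flags life-stage RQ >= 0.01 as high priority

    Because body weight and water intake differ by life stage, the same concentration yields different RQs for infants, children, and adults, which is why the assessment is stratified.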

  15. Impact of metal and anion substitutions on the hydrogen storage properties of M-BTT metal-organic frameworks.

    PubMed

    Sumida, Kenji; Stück, David; Mino, Lorenzo; Chai, Jeng-Da; Bloch, Eric D; Zavorotynska, Olena; Murray, Leslie J; Dincă, Mircea; Chavan, Sachin; Bordiga, Silvia; Head-Gordon, Martin; Long, Jeffrey R

    2013-01-23

    Microporous metal-organic frameworks are a class of materials being vigorously investigated for mobile hydrogen storage applications. For high-pressure storage at ambient temperatures, the M(3)[(M(4)Cl)(3)(BTT)(8)](2) (M-BTT; BTT(3-) = 1,3,5-benzenetristetrazolate) series of frameworks are of particular interest due to the high density of exposed metal cation sites on the pore surface. These sites give enhanced zero-coverage isosteric heats of adsorption (Q(st)) approaching the optimal value for ambient storage applications. However, the Q(st) parameter provides only a limited insight into the thermodynamics of the individual adsorption sites, the tuning of which is paramount for optimizing the storage performance. Here, we begin by performing variable-temperature infrared spectroscopy studies of Mn-, Fe-, and Cu-BTT, allowing the thermodynamics of H(2) adsorption to be probed experimentally. This is complemented by a detailed DFT study, in which molecular fragments representing the metal clusters within the extended solid are simulated to obtain a more thorough description of the structural and thermodynamic aspects of H(2) adsorption at the strongest binding sites. Then, the effect of substitutions at the metal cluster (metal ion and anion within the tetranuclear cluster) is discussed, showing that the configuration of this unit indeed plays an important role in determining the affinity of the framework toward H(2). Interestingly, the theoretical study has identified that the Zn-based analogs would be expected to facilitate enhanced adsorption profiles over the compounds synthesized experimentally, highlighting the importance of a combined experimental and theoretical approach to the design and synthesis of new frameworks for H(2) storage applications.

  16. Optimized effective potential method and application to static RPA correlation

    NASA Astrophysics Data System (ADS)

    Fukazawa, Taro; Akai, Hisazumi

    2015-03-01

    The optimized effective potential (OEP) method is a promising technique for calculating the ground state properties of a system within the density functional theory. However, it is not widely used as its computational cost is rather high and, also, some ambiguity remains in the theoretical framework. In order to overcome these problems, we first introduced a method that accelerates the OEP scheme in a static RPA-level correlation functional. Second, the Krieger-Li-Iafrate (KLI) approximation is exploited to solve the OEP equation. Although seemingly too crude, this approximation did not reduce the accuracy of the description of the magnetic transition metals (Fe, Co, and Ni) examined here, the magnetic properties of which are rather sensitive to correlation effects. Finally, we reformulated the OEP method to render it applicable to the direct RPA correlation functional and other, more precise, functionals. Emphasis is placed on the following three points of the discussion: (i) level-crossing at the Fermi surface is taken into account; (ii) eigenvalue variations in a Kohn-Sham functional are correctly treated; and (iii) the resultant OEP equation is different from those reported to date.

  17. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and applications

    EPA Science Inventory

    Historically, risk assessment has relied upon toxicological data to obtain hazard-based reference levels, which are subsequently compared to exposure estimates to determine whether an unacceptable risk to public health may exist. Recent advances in analytical methods, biomarker ...

  18. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data across spatio-temporal scales. Multiplatform infrastructures like PLOCAN operate a variety of autonomous Lagrangian and Eulerian devices designed to collect information that is then transferred to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, which offers a deeper understanding of the ocean but requires a greater effort in the data management process. There are general data management requirements in this kind of environment, such as processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to final users according to their specific needs. Managing large amounts of data can certainly be tedious and complex without the right tools and operational procedures; hence, automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to assimilate different sources for comparison and validation, and the use of web applications has boosted the necessary scientific dissemination. Within this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open-source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent, quality-controlled datasets, as well as fostering open access to glider data.
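
    Since the portal is built on Django, a minimal sketch of the kind of model such an application might define follows; the field names and the model itself are assumptions for illustration, not PLOCAN's actual schema.

        # models.py of a hypothetical Django app for glider observations.
        from django.db import models

        class GliderProfile(models.Model):
            mission = models.CharField(max_length=64)
            timestamp = models.DateTimeField()
            latitude = models.FloatField()
            longitude = models.FloatField()
            depth_m = models.FloatField()
            temperature_c = models.FloatField(null=True)
            salinity_psu = models.FloatField(null=True)
            qc_level = models.PositiveSmallIntegerField(default=0)  # processing level

    Declaring the record structure once in the model layer is what lets the same data feed the processing, storage, and visualization applications the abstract describes.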

  19. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
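
    The key property the abstract emphasizes, constant state and per-item time for a key-partitioned quantitative summary, can be shown in a few lines of plain Python. This is the pattern only, not the StreamQRE Java API.

        from collections import defaultdict

        class RunningMean:
            """Incremental aggregator: O(1) state and O(1) per-item update."""
            def __init__(self):
                self.n = 0
                self.total = 0.0
            def update(self, x):
                self.n += 1
                self.total += x
            def value(self):
                return self.total / self.n if self.n else float("nan")

        def process(stream):
            """Key-partitioned streaming query, in the spirit of StreamQRE's
            relational constructs (illustrative Python, not the library itself)."""
            per_key = defaultdict(RunningMean)
            for key, value in stream:
                per_key[key].update(value)
            return {k: agg.value() for k, agg in per_key.items()}

        print(process([("sensorA", 1.0), ("sensorB", 4.0), ("sensorA", 3.0)]))

    What StreamQRE adds beyond this sketch is the regular-expression layer: matching the logical hierarchy of the stream decides which aggregator each item feeds, with the compiler guaranteeing the complexity bounds.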

  20. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821

  1. Application of Construal Level and Value-Belief Norm Theories to Undergraduate Decision-Making on a Wildlife Socio-Scientific Issue

    ERIC Educational Resources Information Center

    Sutter, A. McKinzie; Dauer, Jenny M.; Forbes, Cory T.

    2018-01-01

    One aim of science education is to develop scientific literacy for decision-making in daily life. Socio-scientific issues (SSI) and structured decision-making frameworks can help students reach these objectives. This research uses value belief norm (VBN) theory and construal level theory (CLT) to explore students' use of personal values in their…

  2. The Contribution of Local Experiments and Negotiation Processes to Field-Level Learning in Emerging (Niche) Technologies: Meta-Analysis of 27 New Energy Projects in Europe

    ERIC Educational Resources Information Center

    Raven, Rob P. J. M.; Heiskanen, Eva; Lovio, Raimo; Hodson, Mike; Brohmann, Bettina

    2008-01-01

    This article examines how local experiments and negotiation processes contribute to social and field-level learning. The analysis is framed within the niche development literature, which offers a framework for analyzing the relation between projects in local contexts and the transfer of local experiences into generally applicable rules. The…

  3. Polycrystalline CVD diamond device level modeling for particle detection applications

    NASA Astrophysics Data System (ADS)

    Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.

    2016-12-01

    Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly desirable for the study, optimization and predictive analysis of sensing devices. Because of the novelty of using diamond in electronics, this material is not included in the libraries of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond sensors conceived for particle detection. The model focuses on a physically based description of the pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definitive picture of the polycrystalline diamond bandgap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be investigated in depth with this simulation approach. The charge collection efficiency under β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for model validation. The good agreement between measurements and simulation findings, with the trap density as the only fitting parameter, establishes the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.
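
    For context on how deep-level defects acting as recombination centers are typically parameterized in TCAD tools, the sketch below evaluates the standard Shockley-Read-Hall rate for a single trap level; the midgap-trap default is an illustrative assumption, and the paper's actual bandgap model may differ in detail.

        def srh_recombination(n, p, ni, tau_n, tau_p, n1=None, p1=None):
            """Shockley-Read-Hall rate for trap-assisted recombination.

            n, p: carrier densities; ni: intrinsic density; tau_n, tau_p:
            carrier lifetimes (set by trap density and capture cross-section).
            n1, p1 encode the trap energy; a midgap trap gives n1 = p1 = ni.
            """
            n1 = ni if n1 is None else n1
            p1 = ni if p1 is None else p1
            return (n * p - ni ** 2) / (tau_p * (n + n1) + tau_n * (p + p1))

    Since the lifetimes scale inversely with trap density, fitting the trap density (as the paper does) directly tunes how strongly this term suppresses the collected charge.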

  4. Creating a FIESTA (Framework for Integrated Earth Science and Technology Applications) with MagIC

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.

    2017-12-01

    The Magnetics Information Consortium (https://earthref.org/MagIC) has recently developed a containerized web application to considerably reduce the friction in contributing, exploring and combining valuable and complex datasets for the paleo-, geo- and rock magnetic scientific community. The data produced in this scientific domain are inherently hierarchical, and the community's evolving approaches to this scientific workflow, from sampling to taking measurements to multiple levels of interpretation, require a large and flexible data model to adequately annotate the results and ensure reproducibility. Historically, contributing such detail in a consistent format has been prohibitively time consuming and often resulted in only publishing the highly derived interpretations. The new open-source (https://github.com/earthref/MagIC) application provides a flexible upload tool integrated with the data model to easily create a validated contribution, and a powerful search interface for discovering datasets and combining them to enable transformative science. MagIC is hosted at EarthRef.org along with several interdisciplinary geoscience databases. A FIESTA (Framework for Integrated Earth Science and Technology Applications) is being created by generalizing MagIC's web application for reuse in other domains. The application relies on a single configuration document that describes the routing, data model, component settings and external services integrations. The container hosts an isomorphic Meteor JavaScript application, MongoDB database and ElasticSearch search engine. Multiple containers can be configured as microservices to serve portions of the application, or can rely on externally hosted MongoDB, ElasticSearch, or third-party services to efficiently scale computational demands. FIESTA is particularly well suited for many Earth Science disciplines with its flexible data model, mapping, account management, upload tool with private workspaces, reference metadata, image galleries, full-text searches and detailed filters. EarthRef's Seamount Catalog of bathymetry and morphology data, EarthRef's Geochemical Earth Reference Model (GERM) databases, and Oregon State University's Marine and Geology Repository (http://osu-mgr.org) will benefit from custom adaptations of FIESTA.
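
    To illustrate what a single driving configuration document might contain, here is a hypothetical sketch written as a Python dict; every key and value is an assumption for illustration, since the actual FIESTA schema lives in the project's repository.

        # Hypothetical FIESTA-style configuration (illustrative only; see the
        # project repository for the real schema).
        fiesta_config = {
            "routes": {"/": "Home", "/search": "SearchInterface", "/upload": "UploadTool"},
            "data_model": {
                "contribution": {"tables": ["locations", "sites", "samples", "measurements"]},
            },
            "components": {"SearchInterface": {"filters": ["age", "method", "reference"]}},
            "services": {"search": "elasticsearch", "storage": "mongodb"},
        }

    Concentrating routing, schema, and service bindings in one document is what lets a new domain adopt the framework by editing configuration rather than application code.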

  5. Evaluation of CHO Benchmarks on the Arria 10 FPGA using Intel FPGA SDK for OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C-based programming language for developing portable code across platforms such as CPUs, graphics processing units (GPUs), Digital Signal Processors (DSPs) and Field Programmable Gate Arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow in favor of a high-level software development flow. Users can focus on the design of hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. This approach makes FPGA-based development more accessible to software users as the need for hybrid computing using CPUs and FPGAs increases. It can also significantly reduce hardware development time, as users can evaluate different ideas in a high-level language without deep FPGA domain knowledge. Benchmarking of an OpenCL-based framework is an effective way to analyze system performance by studying the execution of benchmark applications. CHO is a suite of benchmark applications that provides support for OpenCL [1]. The authors presented CHO as an OpenCL port of the CHStone benchmark. Using the Altera OpenCL (AOCL) compiler to synthesize the benchmark applications, they listed the resource usage and performance of each kernel that could be successfully synthesized by the compiler. In this report, we evaluate the resource usage and performance of the CHO benchmark applications using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board that features an Arria 10 FPGA device. The focus of the report is to gain a better understanding of the resource usage and performance of kernel implementations on Arria 10 FPGA devices compared to Stratix V FPGA devices. In addition, we also examine the limitations of the current compiler when it fails to synthesize a benchmark application.
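
    The host/kernel split that the SDK abstracts can be seen in a minimal portable OpenCL example. The sketch below uses PyOpenCL with a trivial vector-add kernel and targets whatever device is available, rather than an Arria 10 board specifically (FPGA targets additionally require offline kernel compilation with the vendor tools).

        import numpy as np
        import pyopencl as cl

        # Trivial OpenCL kernel: each work-item adds one pair of elements.
        SRC = """
        __kernel void vadd(__global const float *a, __global const float *b,
                           __global float *c) {
            int i = get_global_id(0);
            c[i] = a[i] + b[i];
        }
        """

        ctx = cl.create_some_context()      # picks any available OpenCL device
        queue = cl.CommandQueue(ctx)
        prg = cl.Program(ctx, SRC).build()

        a = np.random.rand(1024).astype(np.float32)
        b = np.random.rand(1024).astype(np.float32)
        c = np.empty_like(a)

        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, c.nbytes)

        prg.vadd(queue, a.shape, None, a_buf, b_buf, c_buf)
        cl.enqueue_copy(queue, c, c_buf)
        assert np.allclose(c, a + b)

    On an FPGA the same kernel source is what the offline compiler turns into a hardware pipeline, which is why resource usage, not just runtime, becomes the figure of merit in the report.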

  6. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  7. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
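
    As a language-agnostic illustration of the publish-subscribe pattern the testbed builds on, the sketch below passes a FIPA-ACL-style message over an in-process topic bus; it stands in for the DDS middleware and is neither the DDS nor the FIPA API, and all field and topic names are assumptions.

        from collections import defaultdict

        class TopicBus:
            """In-process stand-in for a DDS-style publish-subscribe middleware."""
            def __init__(self):
                self.subscribers = defaultdict(list)
            def subscribe(self, topic, handler):
                self.subscribers[topic].append(handler)
            def publish(self, topic, message):
                for handler in self.subscribers[topic]:
                    handler(message)

        def acl_message(performative, sender, receiver, content):
            """Minimal FIPA-ACL-like envelope (fields are illustrative)."""
            return {"performative": performative, "sender": sender,
                    "receiver": receiver, "content": content}

        bus = TopicBus()
        bus.subscribe("microgrid/dispatch",
                      lambda m: print(m["receiver"], "received", m["content"]))
        bus.publish("microgrid/dispatch",
                    acl_message("request", "tertiary-controller", "DER-3",
                                {"setpoint_kw": 42.0}))

    Decoupling senders from receivers through topics is what allows new agents (or a new microgrid) to be plugged in without reconfiguring the existing ones.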

  8. Kinetic Stability of MOF-5 in Humid Environments: Impact of Powder Densification, Humidity Level, and Exposure Time.

    PubMed

    Ming, Yang; Purewal, Justin; Yang, Jun; Xu, Chunchuan; Soltis, Rick; Warner, James; Veenstra, Mike; Gaab, Manuela; Müller, Ulrich; Siegel, Donald J

    2015-05-05

    Metal-organic frameworks (MOFs) are an emerging class of microporous, crystalline materials with potential applications in the capture, storage, and separation of gases. Of the many known MOFs, MOF-5 has attracted considerable attention because of its ability to store gaseous fuels at low pressure with high densities. Nevertheless, MOF-5 and several other MOFs exhibit limited stability upon exposure to reactive species such as water. The present study quantifies the impact of humid air exposure on the properties of MOF-5 as a function of exposure time, humidity level, and morphology (i.e., powders vs pellets). Properties examined include hydrogen storage capacity, surface area, and crystallinity. Water adsorption/desorption isotherms are measured using a gravimetric technique; the first uptake exhibits a type V isotherm with a sudden increase in uptake at ∼50% relative humidity. For humidity levels below this threshold only minor degradation is observed for exposure times up to several hours, suggesting that MOF-5 is more stable than generally assumed under moderately humid conditions. In contrast, irreversible degradation occurs in a matter of minutes for exposures above the 50% threshold. Fourier transform infrared spectroscopy indicates that molecular and/or dissociated water is inserted into the skeletal framework after long exposure times. Densification into pellets can slow the degradation of MOF-5 significantly, and may present a pathway to enhance the stability of some MOFs.

  9. An application framework for computer-aided patient positioning in radiation therapy.

    PubMed

    Liebler, T; Hub, M; Sanner, C; Schlegel, W

    2003-09-01

    The importance of exact patient positioning in radiation therapy increases with the ongoing improvements in irradiation planning and treatment. Therefore, new ways must be found to overcome the precision limitations of current positioning methods in fractionated treatment. The Department of Medical Physics at the German Cancer Research Centre (DKFZ) follows different video-based approaches to increase repositioning precision. In this context, the modular software framework FIVE (Fast Integrated Video-based Environment) has been designed and implemented. It is both hardware- and platform-independent and supports merging position data by integrating various computer-aided patient positioning methods. A highly precise optical tracking system and several subtraction imaging techniques have been realized as modules to supply basic video-based repositioning techniques. This paper describes the common framework architecture, the main software modules, and their interfaces. An object-oriented software engineering process has been applied using UML, C++, and the Qt library. The significance of the current framework prototype for application in patient positioning, as well as its extension to further application areas, is discussed. Particularly in experimental research, where special system adjustments are often necessary, the open design of the software allows problem-oriented extensions and adaptations.
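    The paper's source code is not reproduced in the abstract; purely to illustrate the subtraction-imaging idea that FIVE packages as modules (the function name, threshold, and simulated shift below are invented, assuming grayscale camera frames as numpy arrays), a minimal sketch:

```python
import numpy as np

def misalignment_fraction(reference: np.ndarray, current: np.ndarray,
                          threshold: int = 25) -> float:
    """Fraction of pixels whose grayscale difference exceeds a threshold;
    a crude stand-in for the subtraction-imaging modules described for FIVE."""
    diff = np.abs(reference.astype(np.int16) - current.astype(np.int16))
    return float((diff > threshold).mean())

# Hypothetical usage: frames would come from the positioning cameras.
ref = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
cur = np.roll(ref, 5, axis=0)  # simulated vertical patient shift of 5 pixels
print(f"misaligned pixel fraction: {misalignment_fraction(ref, cur):.3f}")
```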

  10. [A framework to support action in population mental health].

    PubMed

    Mantoura, Pascale; Roberge, Marie-Claude; Fournier, Louise

    In Quebec, like elsewhere in the world, we are witnessing a growing concern for the population's mental health and for the importance of concentrating efforts on prevention and promotion. In this context, public health actors are invited to adopt a leadership role in advancing mental health promotion and mental disorder prevention goals, and to establish the required partnerships with actors from the health and social services and from other sectors who are indispensable to the population mental health agenda. In Canada, public health actors are not yet sufficiently supported in this role. They express the need for access to structuring frameworks that can clarify their action in mental health. This article first presents the momentum for change at the policy level within the field of mental health. A framework to support population mental health action is then presented. The framework identifies the various dimensions underlying the promotion of population mental health as well as the reduction of mental health inequalities. The article finally illustrates how the application of a populational responsibility perspective, as it is defined in the context of Quebec, facilitates the implementation of the various elements of this framework. In the end, public health actors are better equipped to situate their practice in favour of the population's mental health.

  11. Summary of a Modeling and Simulation Framework for High-Fidelity Weapon Models in Joint Semi-Automated Forces (JSAF) and Other Mission-Simulation Software

    DTIC Science & Technology

    2008-05-01

    … communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost-effective M&S laboratory plan.

  12. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust, and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics, where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers, and preconditioners. Making progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high-level language for describing the weak forms of coupled systems of equations and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application-neutral options system that provides both human- and machine-readable interfaces based on a single XML schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients, and solver options. Custom compiled applications are generated from this file but share an infrastructure for services common to all models, e.g., diagnostics, checkpointing, and global non-linear convergence monitoring. This maximizes code reusability, reliability, and longevity, ensuring that scientific results and the methods used to acquire them are transparent and reproducible. TerraFERMA has been tested against many published geodynamic benchmarks, including 2D/3D thermal convection problems, the subduction zone benchmarks, and benchmarks for magmatic solitary waves. It is currently being used in the investigation of reactive cracking phenomena with applications to carbon sequestration, but we will principally discuss its use in modeling the migration of fluids in subduction zones. Subduction zones require an understanding of the highly nonlinear interactions of fluids with solids and thus provide an excellent scientific driver for the development of multi-physics software.
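    TerraFERMA itself is driven by an options file rather than handwritten code, but the flavor of the high-level weak-form language it inherits from FEniCS can be seen in the classic Poisson example below (plain legacy FEniCS/DOLFIN, independent of TerraFERMA; mesh size and forcing term are arbitrary choices):

```python
from fenics import *  # legacy FEniCS/DOLFIN; not TerraFERMA's own interface

# Poisson problem: -div(grad(u)) = f on the unit square, u = 0 on the boundary.
mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

# The weak form is written essentially as it appears in the mathematics.
a = dot(grad(u), grad(v)) * dx
L = f * v * dx

bc = DirichletBC(V, Constant(0.0), "on_boundary")
u_h = Function(V)
solve(a == L, u_h, bc)
```

    The assembly code is generated automatically from the symbolic forms, and the linear solve is delegated to PETSc; this separation of problem description from solution machinery is the foundation TerraFERMA builds on.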

  13. Application of the Carolina Framework for Cervical Cancer Prevention

    PubMed Central

    Moss, Jennifer L.; McCarthy, Schatzi H.; Gilkey, Melissa B.; Brewer, Noel T.

    2014-01-01

    Objective: The Carolina Framework for Cervical Cancer Prevention describes 4 main causes of cervical cancer incidence: human papillomavirus (HPV) infection, lack of screening, screening errors, and not receiving follow-up care. We present 2 applications of the Carolina Framework in which we identify high-need counties in North Carolina and generate recommendations for improving prevention efforts. Methods: We created a cervical cancer prevention need index (CCPNI) that ranked counties on cervical cancer mortality, HPV vaccine initiation and completion, Pap smear screening, and provision of Pap tests to rarely- or never-screened women. In addition, we conducted in-depth interviews with 19 key informants from programs and agencies involved in cervical cancer prevention in North Carolina. Results: North Carolina’s 100 counties varied widely on individual CCPNI components, including annual cervical cancer mortality (median 2.7/100,000 women; range 0.0–8.0), adolescent girls’ HPV vaccine initiation (median 42%; range 15%–62%), and Pap testing in the previous 3 years among Medicaid-insured adult women (median 59%; range 40%–83%). Counties with the greatest prevention needs formed 2 distinct clusters in the northeast and south-central regions of the state. Interviews generated 9 recommendations to improve cervical cancer prevention in North Carolina, identifying applications to specific programs and policies in the state. Conclusions: This study found striking geographic disparities in cervical cancer prevention need in North Carolina. Future prevention efforts in the state should prioritize high-need regions as well as recommended strategies and applications in existing programs. Other states can use the Carolina Framework to increase the impact of their cervical cancer prevention efforts. PMID:24333357

  14. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
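    The abstract gives no code for the chunked workflow. As a schematic of why chunking bounds memory use, the generic numpy/scipy sketch below regrids a coarse field onto a fine grid one latitude band at a time (simple bilinear interpolation, not the conservative scheme ESMF implements; the grids and band size are invented):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse source field on a 1-degree global grid (stand-in for model output).
src_lat = np.linspace(-90, 90, 181)
src_lon = np.linspace(-180, 180, 361)
field = np.cos(np.radians(src_lat))[:, None] * np.ones((181, 361))
interp = RegularGridInterpolator((src_lat, src_lon), field)

# Fine destination grid, processed in latitude bands so only one band's
# worth of destination points is resident in memory at a time.
dst_lat = np.linspace(-89.95, 89.95, 1800)
dst_lon = np.linspace(-179.95, 179.95, 3600)
out = np.empty((dst_lat.size, dst_lon.size))

band = 200  # rows per chunk; tune to available memory
for i in range(0, dst_lat.size, band):
    la = dst_lat[i:i + band]
    pts_la, pts_lo = np.meshgrid(la, dst_lon, indexing="ij")
    pts = np.column_stack([pts_la.ravel(), pts_lo.ravel()])
    out[i:i + band, :] = interp(pts).reshape(la.size, dst_lon.size)
```

    Because each band is independent, the same decomposition also parallelizes naturally, which is the property the ESMF/OCGIS application exploits at much larger scales.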

  15. Data Service: Distributed Data Capture and Replication

    NASA Astrophysics Data System (ADS)

    Warner, P. B.; Pietrowicz, S. R.

    2007-10-01

    Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.

  16. Optimizing Interactive Development of Data-Intensive Applications

    PubMed Central

    Interlandi, Matteo; Tetali, Sai Deep; Gulzar, Muhammad Ali; Noor, Joseph; Condie, Tyson; Kim, Miryung; Millstein, Todd

    2017-01-01

    Modern Data-Intensive Scalable Computing (DISC) systems are designed to process data through batch jobs that execute programs (e.g., queries) compiled from a high-level language. These programs are often developed interactively by posing ad-hoc queries over the base data until a desired result is generated. We observe that there can be significant overlap in the structure of these queries used to derive the final program. Yet, each successive execution of a slightly modified query is performed anew, which can significantly increase the development cycle. Vega is an Apache Spark framework that we have implemented for optimizing a series of similar Spark programs, likely originating from a development or exploratory data analysis session. Spark developers (e.g., data scientists) can leverage Vega to significantly reduce the amount of time it takes to re-execute a modified Spark program, reducing the overall time to market for their Big Data applications. PMID:28405637
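    Vega's own API is not shown in the abstract. The manual analogue of what it automates, reusing the unchanged prefix of a query across iterations, looks roughly like this in standard PySpark (the input path and column names are made up for illustration):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iterative-analysis").getOrCreate()

# Shared prefix of the analysis: parse and filter the base data once.
# Vega's contribution is detecting and reusing this overlap automatically;
# here we cache it by hand.
events = (spark.read.json("events.json")            # hypothetical input
               .filter(F.col("status") == "ok")
               .cache())

# Iteration 1 of an ad-hoc session:
events.groupBy("region").count().show()

# Iteration 2, a slightly modified query: only the new suffix is
# recomputed; the cached prefix is reused.
events.groupBy("region").agg(F.avg("latency_ms")).show()
```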

  17. Atomic sites and stability of Cs+ captured within zeolitic nanocavities

    PubMed Central

    Yoshida, Kaname; Toyoura, Kazuaki; Matsunaga, Katsuyuki; Nakahira, Atsushi; Kurata, Hiroki; Ikuhara, Yumi H.; Sasaki, Yukichi

    2013-01-01

    Zeolites have potential application as ion-exchangers, catalysts and molecular sieves. Zeolites are once again drawing attention in Japan as stable adsorbents and solidification materials of fission products, such as 137Cs+ from damaged nuclear-power plants. Although there is a long history of scientific studies on the crystal structures and ion-exchange properties of zeolites for practical application, there are still open questions, at the atomic-level, on the physical and chemical origins of selective ion-exchange abilities of different cations and detailed atomic structures of exchanged cations inside the nanoscale cavities of zeolites. Here, the precise locations of Cs+ ions captured within A-type zeolite were analyzed using high-resolution electron microscopy. Together with theoretical calculations, the stable positions of absorbed Cs+ ions in the nanocavities are identified, and the bonding environment within the zeolitic framework is revealed to be a key factor that influences the locations of absorbed cations. PMID:23949184

  18. Spreading DIRT with Web Services

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.; Plante, R. L.

    2005-12-01

    Most of the systems currently used to analyze astronomical data were designed and implemented more than a decade ago. Although they are still very useful for analysis, one often would like a better interface to newer concepts such as archives, Virtual Observatories, and the GRID. Further, incompatibilities between most of the current systems with respect to control language and semantics make it cumbersome to mix applications from different origins. An OPTICON Network, funded by the Sixth Framework Programme of the European Commission, started this year to discuss high-level needs for an astronomical data analysis environment that could provide flexible access to both legacy applications and new astronomical resources. The main objective of the Network is to establish widely accepted requirements and basic design recommendations for such an environment. The hope is that this effort will help other projects that are considering implementing such systems to collaborate and achieve a common environment.

  19. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce the costly human operations support requirements of complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications into unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  20. Applying network theory to animal movements to identify properties of landscape space use.

    PubMed

    Bastille-Rousseau, Guillaume; Douglas-Hamilton, Iain; Blake, Stephen; Northrup, Joseph M; Wittemyer, George

    2018-04-01

    Network (graph) theory is a popular analytical framework for characterizing the structure and dynamics among discrete objects and is particularly effective at identifying critical hubs and patterns of connectivity. The identification of such attributes is a fundamental objective of animal movement research, yet network theory has rarely been applied directly to animal relocation data. We develop an approach that allows the analysis of movement data using network theory by defining occupied pixels as nodes and connections among these pixels as edges. We first quantify node-level (local) metrics and graph-level (system) metrics on simulated movement trajectories to assess the ability of these metrics to recover known properties of movement paths. We then apply our framework to empirical data from African elephants (Loxodonta africana), giant Galapagos tortoises (Chelonoidis spp.), and mule deer (Odocoileus hemionus). Our results indicate that certain node-level metrics, namely degree, weight, and betweenness, perform well in capturing local patterns of space use, such as the definition of core areas and paths used for inter-patch movement. These metrics were generally applicable across data sets, indicating their robustness to assumptions structuring analysis or strategies of movement. Other metrics capture local patterns effectively but were sensitive to specified graph properties, indicating case-specific applications. Our analysis indicates that graph-level metrics are unlikely to outperform other approaches for the categorization of general movement strategies (central place foraging, migration, nomadism). By identifying critical nodes, our approach provides a robust quantitative framework to identify local properties of space use that can be used to evaluate the effect of the loss of specific nodes on range-wide connectivity. Our network approach is intuitive and can be implemented across imperfectly sampled or large-scale data sets efficiently, providing a framework for conservationists to analyze movement data. Functions created for the analyses are available within the R package moveNT. © 2018 by the Ecological Society of America.
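    The paper's functions ship as the R package moveNT. As a language-neutral illustration of the core construction (occupied pixels as nodes, consecutive relocations as weighted edges; the toy trajectory below is invented), a minimal sketch with Python's networkx:

```python
import networkx as nx

# Toy relocation trajectory, already discretized to pixel (row, col) IDs.
trajectory = [(0, 0), (0, 1), (1, 1), (1, 2), (0, 1), (0, 0), (0, 1), (1, 1)]

# Occupied pixels become nodes; consecutive fixes become weighted edges.
G = nx.Graph()
for a, b in zip(trajectory, trajectory[1:]):
    if G.has_edge(a, b):
        G[a][b]["weight"] += 1
    else:
        G.add_edge(a, b, weight=1)

# Node-level metrics the paper found informative: degree, weight (strength),
# and betweenness, which flag core areas and inter-patch corridors.
# Betweenness is computed unweighted here, since edge weights in a movement
# graph measure connection strength rather than distance.
degree = dict(G.degree())
strength = dict(G.degree(weight="weight"))
betweenness = nx.betweenness_centrality(G)

for node in G.nodes:
    print(node, degree[node], strength[node], round(betweenness[node], 3))
```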
