Sample records for "proposed framework consists"

  1. A framework for evaluating proposals for scientific activities in wilderness

    Treesearch

    Peter Landres

    2000-01-01

    This paper presents a structured framework for evaluating proposals for scientific activities in wilderness. Wilderness managers receive proposals for scientific activities ranging from unobtrusive inventorying of plants and animals to the use of chainsaws and helicopters for collecting information. Currently, there is no consistent process for evaluating proposals,...

  2. A proposed framework for consistent regulation of public exposures to radionuclides and other carcinogens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kocher, D.C.; Hoffman, F.O.

    1991-12-31

    This paper discusses a proposed framework for consistent regulation of carcinogenic risks to the public based on establishing de manifestis (i.e., unacceptable) and de minimis (i.e., trivial) lifetime risks from exposure to any carcinogens at levels of about 10⁻¹–10⁻³ and 10⁻⁴–10⁻⁶, respectively, and reduction of risks above de minimis levels as low as reasonably achievable (ALARA). We then discuss certain differences in the way risks from exposure to radionuclides and other carcinogens currently are regulated or assessed, which would need to be considered in implementing the proposed regulatory framework for all carcinogens.

  3. A proposed framework for consistent regulation of public exposures to radionuclides and other carcinogens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kocher, D.C.; Hoffman, F.O.

    1991-01-01

    This paper discusses a proposed framework for consistent regulation of carcinogenic risks to the public based on establishing de manifestis (i.e., unacceptable) and de minimis (i.e., trivial) lifetime risks from exposure to any carcinogens at levels of about 10⁻¹–10⁻³ and 10⁻⁴–10⁻⁶, respectively, and reduction of risks above de minimis levels as low as reasonably achievable (ALARA). We then discuss certain differences in the way risks from exposure to radionuclides and other carcinogens currently are regulated or assessed, which would need to be considered in implementing the proposed regulatory framework for all carcinogens.
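
    The band structure described above maps directly onto simple decision logic. Below is a minimal sketch assuming single cutoff values at the conservative ends of the ranges quoted in the abstract (the paper itself gives ranges, not fixed cutoffs); the function name and exact thresholds are illustrative only.

    ```python
    def classify_lifetime_risk(risk: float) -> str:
        """Sort a lifetime cancer risk into the three regulatory bands the
        abstract describes.  The paper quotes ranges (1e-1 to 1e-3 for de
        manifestis, 1e-4 to 1e-6 for de minimis); fixed cutoffs at the
        conservative ends are used here for illustration."""
        DE_MANIFESTIS_CUTOFF = 1e-3  # at or above: plainly unacceptable
        DE_MINIMIS_CUTOFF = 1e-6     # at or below: trivially small
        if risk >= DE_MANIFESTIS_CUTOFF:
            return "de manifestis: unacceptable, reduction required"
        if risk <= DE_MINIMIS_CUTOFF:
            return "de minimis: trivial, no further action"
        return "intermediate: reduce as low as reasonably achievable (ALARA)"

    print(classify_lifetime_risk(5e-2))  # de manifestis
    print(classify_lifetime_risk(1e-5))  # ALARA region
    ```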

  4. A framework to evaluate proposals for scientific activities in wilderness

    Treesearch

    Peter Landres

    2010-01-01

    Every year, the four Federal wilderness management agencies - U.S. DOI Bureau of Land Management, Fish and Wildlife Service, National Park Service, and the USDA Forest Service - receive hundreds of proposals to conduct scientific studies within wilderness. There is no consistent and comprehensive framework for evaluating such proposals that accounts for the unique...

  5. A Proposed Framework for Collaborative Design in a Virtual Environment

    NASA Astrophysics Data System (ADS)

    Breland, Jason S.; Shiratuddin, Mohd Fairuz

    This paper describes a proposed framework for collaborative design in a virtual environment. The framework consists of components that support true collaborative design in a real-time 3D virtual environment. In support of the proposed framework, a prototype application is being developed. The authors envision that the framework will include, but not be limited to, the following features: (1) real-time manipulation of 3D objects across the network, (2) support for multi-designer activities and information access, and (3) co-existence within the same virtual space. This paper also discusses proposed testing to determine the possible benefits of collaborative design in a virtual environment over other forms of collaboration, and reports results from a pilot test.

  6. Promoting Teachers' Learning and Knowledge Building in a Socio-Technical System

    ERIC Educational Resources Information Center

    Tammets, Kairit; Pata, Kai; Laanpere, Mart

    2013-01-01

    The study proposes a way in which the learning and knowledge building (LKB) framework, which is consistent with the knowledge conversion phases proposed by Nonaka and Takeuchi, supports teachers' informal and self-directed workplace learning. An LKB framework in a socio-technical system was developed to support professional development in an…

  7. A methodological survey identified eight proposed frameworks for the adaptation of health related guidelines.

    PubMed

    Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A

    2017-06-01

    Our objective was to identify and describe published frameworks for the adaptation of clinical, public health, and health services guidelines. We included reports describing methods of guideline adaptation in sufficient detail to allow their reproduction. We searched the Medline and EMBASE databases. We also searched personal files, as well as manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guideline adaptation: ADAPTE, Adapted ADAPTE, the Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, Royal College of Nursing (RCN), and Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context, taking into consideration needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones; it also proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain to a local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks, applied in 22 guidelines in the context of a national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach, in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, derived retrospectively from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica. The RCN outlines a strategy of five key steps for adapting guidelines to the local context. The SGR method consists of nine steps and takes into consideration both methodological gaps and context-specific normative issues in source guidelines. Through searching personal files, we identified two abandoned methods. In sum, we identified and described eight proposed frameworks for the adaptation of health-related guidelines. There is a need to evaluate these frameworks to assess the rigor, efficiency, and transparency of their proposed processes.

  8. Academic Libraries and Quality: An Analysis and Evaluation Framework

    ERIC Educational Resources Information Center

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  9. An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities

    ERIC Educational Resources Information Center

    Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin

    2013-01-01

    The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…

  10. Development of agent-based on-line adaptive signal control (ASC) framework using connected vehicle (CV) technology.

    DOT National Transportation Integrated Search

    2016-04-01

    In this study, we developed an adaptive signal control (ASC) framework for connected vehicles (CVs) using an agent-based modeling technique. The proposed framework consists of two types of agents: (1) vehicle agents (VAs) and (2) signal controller agen...

  11. A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.

    2018-06-01

    We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition); the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.

  12. A Framework For Analysis Of Coastal Infrastructure Vulnerability To Global Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Obrien, P. S.; White, K. D.; Veatch, W.; Marzion, R.; Moritz, H.; Moritz, H. R.

    2017-12-01

    Recorded impacts of global sea level rise on coastal water levels have been documented over the past 100 to 150 years. In the past 40 years, the assumption of hydrologic stationarity has been recognized as invalid. New coastal infrastructure designs must recognize the paradigm shift from hydrologic stationarity to non-stationarity in coastal hydrology. A framework for the evaluation of existing coastal infrastructure is proposed to effectively assess design vulnerability. Two data sets developed from existing structures were chosen to test the proposed framework for vulnerability to global sea level rise, named the Climate Preparedness and Resilience Register (CPRR). The CPRR framework consists of four major elements: Datum Adjustment, Coastal Water Levels, Scenario Projections, and Performance Thresholds.

  13. A framework for automatic information quality ranking of diabetes websites.

    PubMed

    Belen Sağlam, Rahime; Taskaya Temizel, Tugba

    2015-01-01

    Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality based on website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance, and evidence-based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while, at the same time, ensuring content relevance. Measurement: The evaluation measures used were Pearson correlation, true positives, false positives, and accuracy. We tested the framework on a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results that are comparable with the non-automated information quality measuring approaches in the literature. The correlation between the results of the proposed automated framework and the ground truth is 0.68 on average (p < 0.001), which is higher than that of other automated methods proposed in the literature (average r of 0.33).

  14. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    ERIC Educational Resources Information Center

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  15. A modified belief entropy in Dempster-Shafer framework.

    PubMed

    Zhou, Deyun; Tang, Yongchuan; Jiang, Wen

    2017-01-01

    How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, existing studies mainly focus on the mass function itself, ignoring the available information represented by the scale of the frame of discernment (FOD) in the body of evidence. Without taking full advantage of the information in the body of evidence, the existing methods are not fully efficient. In this paper, a modified belief entropy is proposed that considers the scale of the FOD and the relative scale of a focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.

  16. A modified belief entropy in Dempster-Shafer framework

    PubMed Central

    Zhou, Deyun; Jiang, Wen

    2017-01-01

    How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, existing studies mainly focus on the mass function itself, ignoring the available information represented by the scale of the frame of discernment (FOD) in the body of evidence. Without taking full advantage of the information in the body of evidence, the existing methods are not fully efficient. In this paper, a modified belief entropy is proposed that considers the scale of the FOD and the relative scale of a focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method. PMID:28481914
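
    Since both records build on Deng entropy, a minimal sketch of that baseline measure may help; per the abstract, the modified measure additionally weights each term by the relative scale of the focal element with respect to the FOD, which is only noted in a comment here. The mass function and names below are illustrative.

    ```python
    import math

    def deng_entropy(mass):
        """Deng entropy of a body of evidence: each focal element A with
        mass m(A) contributes -m(A) * log2(m(A) / (2**|A| - 1)); the
        denominator credits the non-specificity of multi-element focal
        sets.  The modified entropy of Zhou et al. further weights each
        term by the relative scale of A with respect to the FOD (not
        reproduced here)."""
        return -sum(m * math.log2(m / (2 ** len(A) - 1))
                    for A, m in mass.items() if m > 0)

    # Body of evidence on the frame of discernment {a, b, c}.
    m = {frozenset("a"): 0.6, frozenset("ab"): 0.3, frozenset("abc"): 0.1}
    print(round(deng_entropy(m), 4))
    ```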

  17. Brain tumor classification and segmentation using sparse coding and dictionary learning.

    PubMed

    Salman Al-Shaikhli, Saif Dawood; Yang, Michael Ying; Rosenhahn, Bodo

    2016-08-01

    This paper presents a novel fully automatic framework for multi-class brain tumor classification and segmentation using a sparse coding and dictionary learning method. The proposed framework consists of two steps: classification and segmentation. The classification of the brain tumors is based on brain topology and texture; the segmentation is based on voxel values of the image data. Using K-SVD, two types of dictionaries are learned from the training data and their associated ground-truth segmentations: a feature dictionary and voxel-wise coupled dictionaries. The feature dictionary consists of global image features (topological and texture features). The coupled dictionaries consist of coupled information: gray-scale voxel values of the training image data and the associated label voxel values of the ground-truth segmentation. The proposed framework is evaluated quantitatively using several metrics. The segmentation results on the brain tumor segmentation (MICCAI-BraTS-2013) database are evaluated using five different metric scores, computed with the online evaluation tool provided by the BraTS-2013 challenge organizers. Experimental results demonstrate that the proposed approach achieves accurate brain tumor classification and segmentation and outperforms state-of-the-art methods.
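
    The classification step lends itself to a compact sketch: learn one dictionary per class and assign a test sample to the class whose dictionary reconstructs it best under a sparse code. This is a generic sketch, not the authors' pipeline: scikit-learn's DictionaryLearning stands in for K-SVD, and the feature vectors, class labels, and parameter values are invented.

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning, sparse_encode

    def train_class_dictionaries(features_by_class, n_atoms=16):
        """Learn one dictionary per class (DictionaryLearning stands in
        for the K-SVD algorithm used in the paper)."""
        return {label: DictionaryLearning(n_components=n_atoms).fit(X).components_
                for label, X in features_by_class.items()}

    def classify(x, dicts, k=5):
        """Assign the class whose dictionary gives the smallest sparse
        reconstruction error for feature vector x."""
        errors = {}
        for label, D in dicts.items():
            code = sparse_encode(x[None, :], D, algorithm="omp",
                                 n_nonzero_coefs=k)
            errors[label] = np.linalg.norm(x - code @ D)
        return min(errors, key=errors.get)

    # Toy usage: two classes of synthetic 32-D global image features.
    rng = np.random.default_rng(0)
    data = {c: rng.normal(loc=3 * c, size=(40, 32)) for c in (0, 1)}
    dicts = train_class_dictionaries(data)
    print(classify(rng.normal(loc=3, size=32), dicts))  # expected: class 1
    ```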

  18. SoMIR framework for designing high-NDBP photonic crystal waveguides.

    PubMed

    Mirjalili, Seyed Mohammad

    2014-06-20

    This work proposes a modularized framework for designing the structure of photonic crystal waveguides (PCWs) and reducing human involvement during the design process. The proposed framework consists of three main modules: parameters module, constraints module, and optimizer module. The first module is responsible for defining the structural parameters of a given PCW. The second module defines various limitations in order to achieve desirable optimum designs. The third module is the optimizer, in which a numerical optimization method is employed to perform optimization. As case studies, two new structures called Ellipse PCW (EPCW) and Hypoellipse PCW (HPCW) with different shape of holes in each row are proposed and optimized by the framework. The calculation results show that the proposed framework is able to successfully optimize the structures of the new EPCW and HPCW. In addition, the results demonstrate the applicability of the proposed framework for optimizing different PCWs. The results of the comparative study show that the optimized EPCW and HPCW provide 18% and 9% significant improvements in normalized delay-bandwidth product (NDBP), respectively, compared to the ring-shape-hole PCW, which has the highest NDBP in the literature. Finally, the simulations of pulse propagation confirm the manufacturing feasibility of both optimized structures.

  19. A framework using cluster-based hybrid network architecture for collaborative virtual surgery.

    PubMed

    Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann

    2009-12-01

    Research on collaborative virtual environments (CVEs) opens the opportunity for simulating the cooperative work in surgical operations. It is however a challenging task to implement a high performance collaborative surgical simulation system because of the difficulty in maintaining state consistency with minimum network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and sliding window technique. The robustness of the framework is guaranteed by the failure detection chain which enables smooth transition when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments have been conducted to evaluate the system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.

  20. A framework for small infrared target real-time visual enhancement

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoliang; Long, Gucan; Shang, Yang; Liu, Xiaolin

    2015-03-01

    This paper proposes a framework for real-time visual enhancement of small infrared targets. The framework consists of three parts: energy accumulation for small infrared target enhancement, noise suppression, and weighted fusion. A dynamic-programming-based track-before-detect algorithm is adopted in the energy accumulation to detect the target accurately and enhance the target's intensity notably. In the noise suppression, the target region is weighted by a Gaussian mask according to the target's Gaussian shape. In order to fuse the processed target region and the unprocessed background smoothly, the intensity in the target region is treated as the weight in the fusion. Experiments on real small-infrared-target images indicate that the proposed framework enhances the small infrared target markedly and improves the image's visual quality notably. The proposed framework outperforms traditional algorithms in enhancing the small infrared target, especially for images in which the target is hardly visible.
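
    The noise-suppression and fusion steps reduce to a Gaussian-weighted blend of the enhanced target patch back into the unprocessed frame. A minimal numpy sketch follows; the energy-accumulation (track-before-detect) stage is omitted, and all function names and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def gaussian_mask(shape, sigma=2.0):
        """2-D Gaussian weights centred on the target patch, matching the
        roughly Gaussian intensity profile of a small infrared target."""
        h, w = shape
        y, x = np.mgrid[:h, :w]
        return np.exp(-(((y - (h - 1) / 2) ** 2 + (x - (w - 1) / 2) ** 2)
                        / (2 * sigma ** 2)))

    def fuse_target_region(frame, enhanced_patch, top_left, sigma=2.0):
        """Blend the enhanced target patch into the unprocessed frame,
        using Gaussian-weighted intensity as the fusion weight so the
        seam between processed and unprocessed pixels stays smooth."""
        out = frame.astype(float).copy()
        r, c = top_left
        h, w = enhanced_patch.shape
        wts = gaussian_mask((h, w), sigma)
        out[r:r + h, c:c + w] = (wts * enhanced_patch
                                 + (1 - wts) * out[r:r + h, c:c + w])
        return out
    ```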

  1. Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.

    PubMed

    Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha

    2017-08-01

    The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.

  2. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low-rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all use low-rankness or sparsity in the input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework that allows nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to the kernel setting with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low-rank formulation. The algorithm consists of manifold learning using a kernel, low-rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experiment results show that the proposed method surpasses conventional low-rank-modeled approaches for dMRI.

  3. An Information Technology Framework for Strengthening Telehealthcare Service Delivery

    PubMed Central

    Chen, Chi-Wen; Weng, Yung-Ching; Shang, Rung-Ji; Yu, Hui-Chu; Chung, Yufang; Lai, Feipei

    2012-01-01

    Objective: Telehealthcare has been used to provide healthcare services, and an information technology infrastructure appears to be essential for providing telehealthcare services. Insufficiencies have been identified, such as a lack of integration, the need to accommodate diverse biometric sensors, and access to diverse networks, as different houses have varying facilities; these insufficiencies challenge the promotion of telehealthcare. This study designs an information technology framework to strengthen telehealthcare delivery. Materials and Methods: The proposed framework consists of a system architecture design and a network transmission design. The aim of the framework is to integrate data from existing information systems, to adopt medical informatics standards, to integrate diverse biometric sensors, and to provide different data transmission networks to support a patient's home network regardless of the facilities. The proposed framework has been evaluated with a case study of two telehealthcare programs, with and without adoption of the framework. Results: The proposed framework facilitates the functionality of the program and enables steady patient enrollment. Overall patient participation increased, and the patient outcomes appear positive. Attitudes toward the service and self-improvement also are positive. Conclusions: The findings of this study add up to the construction of a telehealthcare system. Implementing the proposed framework further assists the functionality of the service and enhances the availability of the service and patient acceptance. PMID:23061641

  4. An information technology framework for strengthening telehealthcare service delivery.

    PubMed

    Chen, Li-Chin; Chen, Chi-Wen; Weng, Yung-Ching; Shang, Rung-Ji; Yu, Hui-Chu; Chung, Yufang; Lai, Feipei

    2012-10-01

    Telehealthcare has been used to provide healthcare services, and an information technology infrastructure appears to be essential for providing telehealthcare services. Insufficiencies have been identified, such as a lack of integration, the need to accommodate diverse biometric sensors, and access to diverse networks, as different houses have varying facilities; these insufficiencies challenge the promotion of telehealthcare. This study designs an information technology framework to strengthen telehealthcare delivery. The proposed framework consists of a system architecture design and a network transmission design. The aim of the framework is to integrate data from existing information systems, to adopt medical informatics standards, to integrate diverse biometric sensors, and to provide different data transmission networks to support a patient's home network regardless of the facilities. The proposed framework has been evaluated with a case study of two telehealthcare programs, with and without adoption of the framework. The proposed framework facilitates the functionality of the program and enables steady patient enrollment. Overall patient participation increased, and the patient outcomes appear positive. Attitudes toward the service and self-improvement also are positive. The findings of this study add up to the construction of a telehealthcare system. Implementing the proposed framework further assists the functionality of the service and enhances the availability of the service and patient acceptance.

  5. A framework for medical image retrieval using machine learning and statistical similarity matching techniques with relevance feedback.

    PubMed

    Rahman, Md Mahmudur; Bhattacharya, Prabir; Desai, Bipin C

    2007-01-01

    A content-based image retrieval (CBIR) framework is proposed for a diverse collection of medical images of different imaging modalities, anatomic regions with different orientations, and biological systems. The organization of images in such a database (DB) is well defined with predefined semantic categories; hence, it can be useful for category-specific searching. The proposed framework consists of machine learning methods for image prefiltering, similarity matching using statistical distance measures, and a relevance feedback (RF) scheme. To narrow the semantic gap and increase retrieval efficiency, we investigate both supervised and unsupervised learning techniques to associate low-level global image features (e.g., color, texture, and edge) in the projected PCA-based eigenspace with their high-level semantic and visual categories. Specifically, we explore the use of a probabilistic multiclass support vector machine (SVM) and fuzzy c-means (FCM) clustering for categorization and prefiltering of images to reduce the search space. Category-specific statistical similarity matching is then performed at a finer level on the prefiltered images. To better incorporate perceptual subjectivity, an RF mechanism is also added to update the query parameters dynamically and adjust the proposed matching functions. Experiments are based on a ground-truth DB consisting of 5000 diverse medical images in 20 predefined categories. An analysis of results based on cross-validation (CV) accuracy and precision-recall for image categorization and retrieval is reported, demonstrating the improvement, effectiveness, and efficiency achieved by the proposed framework.
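
    The prefilter-then-match flow can be sketched compactly: a probabilistic SVM narrows the search to the most likely semantic categories, and the surviving candidates are ranked by a distance measure. Plain Euclidean distance stands in for the paper's category-specific statistical measures, and all data, names, and parameter values below are invented for illustration.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def retrieve(query, db_features, db_labels, svm, n_categories=2, k=5):
        """Prefilter the database to the query's most probable semantic
        categories (probabilistic SVM), then rank the survivors by
        Euclidean distance, a stand-in for the paper's category-specific
        statistical similarity measures."""
        probs = svm.predict_proba(query[None, :])[0]
        likely = set(svm.classes_[np.argsort(probs)[::-1][:n_categories]])
        idx = np.array([i for i, lbl in enumerate(db_labels) if lbl in likely])
        order = np.argsort(np.linalg.norm(db_features[idx] - query, axis=1))
        return idx[order[:k]]

    # Toy usage: 100 database images, 4 semantic categories, 16-D features.
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(100, 16))
    labels = rng.integers(0, 4, size=100)
    svm = SVC(probability=True).fit(feats, labels)
    print(retrieve(feats[0], feats, labels, svm))
    ```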

  6. Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors.

    PubMed

    Qu, Chen; Bi, Du-Yan; Sui, Ping; Chao, Ai-Nong; Wang, Yun-Fei

    2017-09-22

    The CMOS (complementary metal-oxide-semiconductor) sensor is a type of solid-state image sensor widely used in object tracking, object recognition, intelligent navigation, and related fields. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing reduced image contrast, color distortion, and other problems. In view of this, we propose a novel dehazing approach based on a locally consistent Markov random field (MRF) framework. The neighboring clique in a traditional MRF is extended to a non-neighboring clique, defined on locally consistent blocks based on two clues: both the atmospheric light and the transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction over the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power, thus effectively addressing inadequate detail recovery and alleviating color distortion. Moreover, the locally consistent MRF framework can recover details while maintaining better dehazing results, which effectively improves the quality of images captured by CMOS image sensors. Experimental results verified that the proposed method has the combined advantages of detail recovery and color preservation.

  7. A conceptual framework for evaluation of public health and primary care system performance in Iran.

    PubMed

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Akbari Sari, Ali; Mesdaghinia, Alireza

    2015-01-26

    The main objective of this study was to design a conceptual framework, according to the policies and priorities of the ministry of health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts on the community. We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. The PubMed, Scopus, Web of Science, and Google Scholar databases and two specialized databases of Persian-language literature (IranMedex and SID) were searched using the main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 obtained indicators in a four-stage process for monitoring and evaluation of the Health Deputies. We proposed a conceptual framework by identifying the performance area of the Health Deputies among other determinants of health, as well as introducing a chain of results for performance consisting of input, process, output, and outcome indicators. We also proposed five dimensions for measuring the performance of the Health Deputies: efficiency, effectiveness, equity, access, and improvement of health status. The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and consequences in the country. The relative commitment of the ministry of health and the Health Deputies at the Universities of Medical Sciences is essential for full implementation of this framework and for providing the annual performance report.

  8. ICADx: interpretable computer aided diagnosis of breast masses

    NASA Astrophysics Data System (ADS)

    Kim, Seong Tae; Lee, Hakmin; Kim, Hak Gu; Ro, Yong Man

    2018-02-01

    In this study, a novel computer-aided diagnosis (CADx) framework is devised to investigate interpretability in classifying breast masses. Recently, deep learning technology has been successfully applied to medical image analysis, including CADx. Existing deep-learning-based CADx approaches, however, have a limitation in explaining the diagnostic decision. In real clinical practice, clinical decisions need to be made with reasonable explanation, so current deep learning approaches in CADx are limited for real-world deployment. In this paper, we investigate interpretability in CADx with the proposed interpretable CADx (ICADx) framework. The proposed framework is devised with a generative adversarial network, which consists of an interpretable diagnosis network and a synthetic lesion generative network, to learn the relationship between malignancy and a standardized description (BI-RADS). The lesion generative network and the interpretable diagnosis network compete in adversarial learning so that both networks improve. The effectiveness of the proposed method was validated on a public mammogram database. Experimental results showed that the proposed ICADx framework could provide interpretability of masses as well as mass classification. This was mainly attributed to the fact that the proposed method was effectively trained to find the relationship between malignancy and interpretations via adversarial learning. These results imply that the proposed ICADx framework could be a promising approach to developing CADx systems.

  9. Hierarchical control framework for integrated coordination between distributed energy resources and demand response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Di; Lian, Jianming; Sun, Yannan

    Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.

  10. A framework for supervising lifestyle diseases using long-term activity monitoring.

    PubMed

    Han, Yongkoo; Han, Manhyung; Lee, Sungyoung; Sarkar, A M Jehad; Lee, Young-Koo

    2012-01-01

    Long-term activity monitoring of a person would be helpful for controlling lifestyle-associated diseases. Such diseases are often linked with the way a person lives; an unhealthy and irregular standard of living influences the risk of such diseases in the later part of one's life. The symptoms and initial signs of these diseases are common among people with irregular lifestyles. In this paper, we propose a novel healthcare framework to manage lifestyle diseases using long-term activity monitoring. The framework recognizes the user's activities from the sensed data at runtime and reports irregular and unhealthy activity patterns to a doctor and a caregiver. The proposed framework is a hierarchical structure that consists of three modules: activity recognition, activity pattern generation, and lifestyle disease prediction. We show that it is possible to assess the risk of lifestyle diseases from the sensor data, and we demonstrate the viability of the proposed framework.

  11. Digital Rights Management Implemented by RDF Graph Approach

    ERIC Educational Resources Information Center

    Yang, Jin Tan; Horng, Huai-Chien

    2006-01-01

    This paper proposes a design framework for constructing Digital Rights Management (DRM) that enables legal usage of learning objects. The central theme of this framework is that any design of a DRM must have theories as foundations to make maintenance, extension, and interoperability easy. While a learning objective consists of learning…

  12. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
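
    For the static case, the least-squares link can be sketched in a few lines: regress the KPI variables on the process variables, then flag samples whose prediction residual exceeds a limit. The 3-sigma limit below is an illustrative stand-in for the paper's test statistic, and all data and names are synthetic.

    ```python
    import numpy as np

    def fit_kpi_model(X, Y):
        """Least-squares map from process variables X (n x p) to KPI
        variables Y (n x q): the static PM-FD case; the paper reduces
        the dynamic case to this form via a kernel representation."""
        Theta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        limit = 3 * (Y - X @ Theta).std(axis=0)  # illustrative 3-sigma limit
        return Theta, limit

    def is_faulty(x, y, Theta, limit):
        """Flag a fault when any KPI residual exceeds its limit."""
        return bool(np.any(np.abs(y - x @ Theta) > limit))

    # Toy usage: 500 samples, 6 process variables, 2 KPIs.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))
    Y = X @ rng.normal(size=(6, 2)) + 0.1 * rng.normal(size=(500, 2))
    Theta, limit = fit_kpi_model(X, Y)
    print(is_faulty(X[0], Y[0] + 1.0, Theta, limit))  # injected offset -> True
    ```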

  13. A Group Decision Framework with Intuitionistic Preference Relations and Its Application to Low Carbon Supplier Selection.

    PubMed

    Tong, Xiayu; Wang, Zhou-Jing

    2016-09-19

    This article develops a group decision framework with intuitionistic preference relations. An approach is first devised to rectify an inconsistent intuitionistic preference relation to derive an additive consistent one. A new aggregation operator, the so-called induced intuitionistic ordered weighted averaging (IIOWA) operator, is proposed to aggregate individual intuitionistic fuzzy judgments. By using the mean absolute deviation between the original and rectified intuitionistic preference relations as an order inducing variable, the rectified consistent intuitionistic preference relations are aggregated into a collective preference relation. This treatment is presumably able to assign different weights to different decision-makers' judgments based on the quality of their inputs (in terms of consistency of their original judgments). A solution procedure is then developed for tackling group decision problems with intuitionistic preference relations. A low carbon supplier selection case study is developed to illustrate how to apply the proposed decision model in practice.

  14. A Group Decision Framework with Intuitionistic Preference Relations and Its Application to Low Carbon Supplier Selection

    PubMed Central

    Tong, Xiayu; Wang, Zhou-Jing

    2016-01-01

    This article develops a group decision framework with intuitionistic preference relations. An approach is first devised to rectify an inconsistent intuitionistic preference relation to derive an additive consistent one. A new aggregation operator, the so-called induced intuitionistic ordered weighted averaging (IIOWA) operator, is proposed to aggregate individual intuitionistic fuzzy judgments. By using the mean absolute deviation between the original and rectified intuitionistic preference relations as an order inducing variable, the rectified consistent intuitionistic preference relations are aggregated into a collective preference relation. This treatment is presumably able to assign different weights to different decision-makers’ judgments based on the quality of their inputs (in terms of consistency of their original judgments). A solution procedure is then developed for tackling group decision problems with intuitionistic preference relations. A low carbon supplier selection case study is developed to illustrate how to apply the proposed decision model in practice. PMID:27657097
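
    The aggregation step can be illustrated with a plain induced OWA on scalars; the IIOWA operator in the paper acts on intuitionistic fuzzy values (membership/non-membership pairs), which is omitted here for brevity. All numbers and names are invented.

    ```python
    def induced_owa(pairs, weights):
        """Induced OWA: order the arguments by their order-inducing
        variable, then apply the OWA weights to the reordered arguments.
        `pairs` is a list of (inducing_value, argument) tuples."""
        ranked = sorted(pairs, key=lambda p: p[0], reverse=True)
        return sum(w * arg for w, (_, arg) in zip(weights, ranked))

    # Three experts; the inducing variable is the negated mean absolute
    # deviation between each expert's original and rectified preference
    # relations, so more consistent judgments receive larger weights.
    experts = [(-0.05, 0.70), (-0.20, 0.55), (-0.10, 0.62)]
    print(induced_owa(experts, weights=[0.5, 0.3, 0.2]))  # -> 0.646
    ```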

  15. Understanding the relations between different forms of racial prejudice: a cognitive consistency perspective.

    PubMed

    Gawronski, Bertram; Peters, Kurt R; Brochu, Paula M; Strack, Fritz

    2008-05-01

    Research on racial prejudice is currently characterized by the existence of diverse concepts (e.g., implicit prejudice, old-fashioned racism, modern racism, aversive racism) that are not well integrated from a general perspective. The present article proposes an integrative framework for these concepts employing a cognitive consistency perspective. Specifically, it is argued that the reliance on immediate affective reactions toward racial minority groups in evaluative judgments about these groups depends on the consistency of this evaluation with other relevant beliefs pertaining to central components of old-fashioned, modern, and aversive forms of prejudice. A central prediction of the proposed framework is that the relation between "implicit" and "explicit" prejudice should be moderated by the interaction of egalitarianism-related, nonprejudicial goals and perceptions of discrimination. This prediction was confirmed in a series of three studies. Implications for research on prejudice are discussed.

  16. A Conceptual Framework for Evaluation of Public Health and Primary Care System Performance in Iran

    PubMed Central

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Sari, Ali Akbari; Mesdaghinia, Alireza

    2015-01-01

    Introduction: The main objective of this study was to design a conceptual framework, according to the policies and priorities of the ministry of health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts on the community. Methods: We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. The PubMed, Scopus, Web of Science, and Google Scholar databases and two specialized databases of Persian-language literature (IranMedex and SID) were searched using the main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 obtained indicators in a four-stage process for monitoring and evaluation of the Health Deputies. Results: We proposed a conceptual framework by identifying the performance area of the Health Deputies among other determinants of health, as well as introducing a chain of results for performance consisting of input, process, output, and outcome indicators. We also proposed five dimensions for measuring the performance of the Health Deputies: efficiency, effectiveness, equity, access, and improvement of health status. Conclusion: The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and consequences in the country. The relative commitment of the ministry of health and the Health Deputies at the Universities of Medical Sciences is essential for full implementation of this framework and for providing the annual performance report. PMID:25946937

  17. Localizing text in scene images by boundary clustering, stroke segmentation, and string fragment classification.

    PubMed

    Yi, Chucai; Tian, Yingli

    2012-09-01

    In this paper, we propose a novel framework to extract text regions from scene images with complex backgrounds and multiple text appearances. This framework consists of three main steps: boundary clustering (BC), stroke segmentation, and string fragment classification. In BC, we propose a new bigram-color-uniformity-based method to model both text and attachment surface, and cluster edge pixels based on color pairs and spatial positions into boundary layers. Then, stroke segmentation is performed at each boundary layer by color assignment to extract character candidates. We propose two algorithms to combine the structural analysis of text stroke with color assignment and filter out background interferences. Further, we design a robust string fragment classification based on Gabor-based text features. The features are obtained from feature maps of gradient, stroke distribution, and stroke width. The proposed framework of text localization is evaluated on scene images, born-digital images, broadcast video images, and images of handheld objects captured by blind persons. Experimental results on respective datasets demonstrate that the framework outperforms state-of-the-art localization algorithms.

  18. Sustainability in Health care by Allocating Resources Effectively (SHARE) 10: operationalising disinvestment in a conceptual framework for resource allocation.

    PubMed

    Harris, Claire; Green, Sally; Elshaug, Adam G

    2017-09-08

    This is the tenth in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. After more than a decade of research, there is little published evidence of active and successful disinvestment. The paucity of frameworks, methods and tools is reported to be a factor in the lack of success. However there are clear and consistent messages in the literature that can be used to inform development of a framework for operationalising disinvestment. This paper, along with the conceptual review of disinvestment in Paper 9 of this series, aims to integrate the findings of the SHARE Program with the existing disinvestment literature to address the lack of information regarding systematic organisation-wide approaches to disinvestment at the local health service level. A framework for disinvestment in a local healthcare setting is proposed. Definitions for essential terms and key concepts underpinning the framework have been made explicit to address the lack of consistent terminology. Given the negative connotations of the word 'disinvestment' and the problems inherent in considering disinvestment in isolation, the basis for the proposed framework is 'resource allocation' to address the spectrum of decision-making from investment to disinvestment. The focus is positive: optimising healthcare, improving health outcomes, using resources effectively. The framework is based on three components: a program for decision-making, projects to implement decisions and evaluate outcomes, and research to understand and improve the program and project activities. The program consists of principles for decision-making and settings that provide opportunities to introduce systematic prompts and triggers to initiate disinvestment. The projects follow the steps in the disinvestment process. Potential methods and tools are presented, however the framework does not stipulate project design or conduct; allowing application of any theories, methods or tools at each step. Barriers are discussed and examples illustrating constituent elements are provided. The framework can be employed at network, institutional, departmental, ward or committee level. It is proposed as an organisation-wide application, embedded within existing systems and processes, which can be responsive to needs and priorities at the level of implementation. It can be used in policy, management or clinical contexts.

  19. Iterative deep convolutional encoder-decoder network for medical image segmentation.

    PubMed

    Jung Uk Kim; Hak Gu Kim; Yong Man Ro

    2017-07-01

    In this paper, we propose a novel medical image segmentation method using an iterative deep learning framework. We combine an iterative learning approach and an encoder-decoder network to improve segmentation results, which enables precise localization of regions of interest (ROIs), including complex shapes or detailed textures of medical images, in an iterative manner. The proposed iterative deep convolutional encoder-decoder network consists of two main paths: a convolutional encoder path and a convolutional decoder path with iterative learning. Experimental results show that the proposed iterative deep learning framework yields excellent segmentation performance on various medical images. The effectiveness of the proposed method is demonstrated by comparison with other state-of-the-art medical image segmentation methods.

  20. Fast image interpolation via random forests.

    PubMed

    Huang, Jun-Jie; Siu, Wan-Chi; Liu, Tian-Rui

    2015-10-01

    This paper proposes a two-stage framework for fast image interpolation via random forests (FIRF). The proposed FIRF method gives high accuracy while requiring low computation. The underlying idea of this work is to apply random forests to classify the natural image patch space into numerous subspaces and to learn a linear regression model for each subspace that maps a low-resolution image patch to a high-resolution image patch. The FIRF framework consists of two stages: Stage 1 removes most of the ringing and aliasing artifacts in the initial bicubic interpolated image, while Stage 2 further refines the Stage 1 interpolated image. By varying the number of decision trees in the random forests and the number of stages applied, the proposed FIRF method can realize computationally scalable image interpolation. Extensive experimental results show that the proposed FIRF(3, 2) method achieves more than 0.3 dB improvement in peak signal-to-noise ratio over the state-of-the-art nonlocal autoregressive modeling (NARM) method. Moreover, the proposed FIRF(1, 1) obtains results similar to or better than NARM while taking only 0.3% of its computation time.
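
    One stage of the scheme can be sketched with scikit-learn: a random forest maps bicubic-interpolated patches toward their high-resolution counterparts. Note that sklearn's regression forests use constant leaf values rather than the per-leaf linear regressors the paper describes, so this is a simplified stand-in on synthetic patches; all names and sizes are illustrative.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def train_stage(lr_patches, hr_patches, n_trees=10):
        """One FIRF-style stage: the forest partitions the patch space
        and regresses low-resolution (bicubic) patches onto their
        high-resolution counterparts.  sklearn uses constant leaves, not
        the paper's per-leaf linear models, so this is simplified."""
        return RandomForestRegressor(n_estimators=n_trees).fit(lr_patches,
                                                               hr_patches)

    # Toy usage: 5x5 patches flattened to 25-D vectors.
    rng = np.random.default_rng(0)
    lr = rng.normal(size=(300, 25))                   # stand-in bicubic patches
    hr = lr + 0.1 * (lr @ rng.normal(size=(25, 25)))  # synthetic HR targets
    stage1 = train_stage(lr, hr)
    refined = stage1.predict(lr[:1])  # a second stage would refine this output
    print(refined.shape)
    ```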

  1. Spatio-temporal Granger causality: a new framework

    PubMed Central

    Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng

    2015-01-01

    That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
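
    As a baseline for the spatio-temporal extension the abstract proposes, classical time-domain Granger causality can be computed from two autoregressions. The index below is the standard log-variance ratio; the signals are synthetic and the lag order is an arbitrary illustrative choice.

    ```python
    import numpy as np

    def granger_causality(x, y, order=2):
        """Classical Granger causality from x to y: log ratio of y's
        residual variance when predicted from its own past only versus
        from the past of both y and x.  Positive values mean x's past
        helps predict y."""
        rows = range(order, len(y))
        Y = y[order:]
        own = np.array([y[t - order:t] for t in rows])
        joint = np.array([np.r_[y[t - order:t], x[t - order:t]] for t in rows])
        var_own = np.var(Y - own @ np.linalg.lstsq(own, Y, rcond=None)[0])
        var_joint = np.var(Y - joint @ np.linalg.lstsq(joint, Y, rcond=None)[0])
        return float(np.log(var_own / var_joint))

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = np.r_[0.0, 0.8 * x[:-1]] + 0.1 * rng.normal(size=500)  # y lags x
    print(granger_causality(x, y))  # large: x drives y
    print(granger_causality(y, x))  # near zero: y does not drive x
    ```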

  2. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    PubMed

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions, and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  3. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    PubMed

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-04-19

    Sign language recognition (SLR) can provide a helpful tool for communication between the deaf and the external world. This paper proposes a component-based, vocabulary-extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word is considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification is implemented based on the recognition of these five components. The proposed SLR framework consists of two major parts. The first part obtains the component-based form of sign gestures and establishes the code table of the target sign gesture set using data from a reference subject. The second part, designed for new users, trains component classifiers using a training set suggested by the reference subject and classifies unknown gestures with a code matching method, as sketched below. Five subjects participated in this study, and recognition experiments under different training set sizes were conducted on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures in the target set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, respectively, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50-60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.
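
    The code-matching step referenced above is easy to picture: each sign word is stored as a five-component code (hand shape, axis, orientation, rotation, trajectory), and an unknown gesture is assigned to the word whose code agrees on the most components. The word codes below are invented for illustration.

    ```python
    def match_gesture(observed, code_table):
        """Return the sign word whose stored five-component code (hand
        shape, axis, orientation, rotation, trajectory) agrees with the
        observed component classifications on the most positions."""
        return max(code_table,
                   key=lambda word: sum(a == b for a, b in
                                        zip(observed, code_table[word])))

    # Invented component codes for two CSL words.
    code_table = {"THANKS": (3, 1, 0, 2, 5), "GOOD": (3, 1, 1, 2, 4)}
    print(match_gesture((3, 1, 0, 2, 5), code_table))  # -> THANKS (5/5 agree)
    ```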

  4. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

    The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the interaction between users and doctors by providing more services inside healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The proposed framework is used to improve healthcare scalability and efficiency by enhancing the remote triage and remote prioritization processes for patients. It is also used to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station, and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Our main goal is to achieve a high level of accuracy in remotely prioritizing and triaging patients. Meanwhile, the role of multi-source data fusion in telemonitoring healthcare systems is demonstrated, and we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results, for different symptoms related to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm compared with conventional algorithms in terms of classifying and prioritizing patients remotely.
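
    A toy illustration of how multi-source vitals and text-based inputs might be fused into a triage priority; the thresholds and scoring below are illustrative assumptions, not MSHA's actual rules:

```python
def triage_priority(hr_bpm: float, spo2_pct: float, sys_bp_mmhg: float,
                    text_flags: set) -> int:
    """Fuse sensor readings and text flags into a priority level:
    3 = urgent, 2 = elevated, 1 = routine (illustrative scoring)."""
    score = 0
    if hr_bpm < 40 or hr_bpm > 130:        # ECG-derived heart rate
        score += 2
    elif hr_bpm < 50 or hr_bpm > 110:
        score += 1
    if spo2_pct < 90:                      # pulse oximetry
        score += 2
    elif spo2_pct < 94:
        score += 1
    if sys_bp_mmhg < 90 or sys_bp_mmhg > 180:  # blood pressure
        score += 2
    if "chest_pain" in text_flags:         # text-based patient input
        score += 2
    return 3 if score >= 4 else 2 if score >= 2 else 1

print(triage_priority(125, 89, 150, {"chest_pain"}))  # -> 3 (urgent)
```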

  5. Conceptual measurement framework for help-seeking for mental health problems

    PubMed Central

    Rickwood, Debra; Thomas, Kerry

    2012-01-01

    Background: Despite a high level of research, policy, and practice interest in help-seeking for mental health problems and mental disorders, there is currently no agreed and commonly used definition or conceptual measurement framework for help-seeking. Methods: A systematic review of research activity in the field was undertaken to investigate how help-seeking has been conceptualized and measured. Common elements were used to develop a proposed conceptual measurement framework. Results: The database search revealed a very high level of research activity and confirmed that there is no commonly applied definition of help-seeking and no psychometrically sound measures that are routinely used. The most common element in the help-seeking research was a focus on formal help-seeking sources, rather than informal sources, although studies did not assess a consistent set of professional sources; rather, each study addressed an idiosyncratic range of sources of professional health and community care. Similarly, the studies considered help-seeking for a range of mental health problems and no consistent terminology was applied. The most common mental health problem investigated was depression, followed by use of generic terms, such as mental health problem, psychological distress, or emotional problem. Major gaps in the consistent measurement of help-seeking were identified. Conclusion: It is evident that an agreed definition that supports the comparable measurement of help-seeking is lacking. Therefore, a conceptual measurement framework is proposed to fill this gap. The framework maintains that the essential elements for measurement are: the part of the help-seeking process to be investigated and respective time frame, the source and type of assistance, and the type of mental health concern. It is argued that adopting this framework will facilitate progress in the field by providing much needed conceptual consistency. Results will then be able to be compared across studies and population groups, and this will significantly benefit understanding of policy and practice initiatives aimed at improving access to and engagement with services for people with mental health concerns. PMID:23248576

  6. Decoding the "CoDe": A Framework for Conceptualizing and Designing Help Options in Computer-Based Second Language Listening

    ERIC Educational Resources Information Center

    Cardenas-Claros, Monica Stella; Gruba, Paul A.

    2013-01-01

    This paper proposes a theoretical framework for the conceptualization and design of help options in computer-based second language (L2) listening. Based on four empirical studies, it aims at clarifying both conceptualization and design (CoDe) components. The elements of conceptualization consist of a novel four-part classification of help options:…

  7. Periodic Pulay method for robust and efficient convergence acceleration of self-consistent field iterations

    DOE PAGES

    Banerjee, Amartya S.; Suryanarayana, Phanish; Pask, John E.

    2016-01-21

    Pulay's Direct Inversion in the Iterative Subspace (DIIS) method is one of the most widely used mixing schemes for accelerating the self-consistent solution of electronic structure problems. In this work, we propose a simple generalization of DIIS in which Pulay extrapolation is performed at periodic intervals rather than on every self-consistent field iteration, and linear mixing is performed on all other iterations. Lastly, we demonstrate through numerical tests on a wide variety of materials systems in the framework of density functional theory that the proposed generalization of Pulay's method significantly improves its robustness and efficiency.
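
    The following numpy sketch implements the periodic Pulay idea as stated: Pulay (Anderson-type) extrapolation every k-th iteration and plain linear mixing otherwise. The parameter names and the toy fixed-point problem are ours, not the authors':

```python
import numpy as np

def periodic_pulay(g, x0, m=4, k=3, beta=0.1, tol=1e-10, maxit=500):
    """Solve the fixed point x = g(x) with periodic Pulay mixing:
    linear mixing on every iteration, Pulay extrapolation every k-th one,
    using up to m stored iterate/residual pairs."""
    x = x0.copy()
    X, F = [], []                          # iterate and residual histories
    for it in range(1, maxit + 1):
        f = g(x) - x                       # residual
        if np.linalg.norm(f) < tol:
            return x, it
        X.append(x.copy()); F.append(f.copy())
        if len(X) > m:
            X.pop(0); F.pop(0)
        if it % k == 0 and len(F) > 1:
            # Pulay step: minimize the norm of the extrapolated residual.
            dF = np.array([F[i+1] - F[i] for i in range(len(F)-1)]).T
            dX = np.array([X[i+1] - X[i] for i in range(len(X)-1)]).T
            gamma = np.linalg.lstsq(dF, f, rcond=None)[0]
            x = x + beta * f - (dX + beta * dF) @ gamma
        else:
            x = x + beta * f               # plain linear mixing
    return x, maxit

# Toy fixed-point problem: x = cos(x), componentwise.
x, iters = periodic_pulay(np.cos, np.ones(3))
print(iters, x)
```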

  8. Binding the Electronic Book: Design Features for Bibliophiles

    ERIC Educational Resources Information Center

    Ruecker, Stan; Uszkalo, Kirsten C.

    2007-01-01

    This paper proposes a design for the electronic book based on discussions with frequent book readers. We adopted a conceptual framework for this project consisting of a spectrum of possible designs, with the conventional bound book at one difference pole, and the laptop computer at the other; the design activity then consisted of appropriately…

  9. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    PubMed

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the internet of things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantages and novelties of the proposed system is the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal that consists in monitoring footballers' heart rates during a football match. The real-time data acquired by these devices presents a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  10. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    PubMed Central

    Szymanski, Julian

    2017-01-01

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the internet of things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantages and novelties of the proposed system is the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal that consists in monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices presents a clear social objective of being able to predict not only situations of sudden death but also possible injuries. PMID:28994743

  11. Stepwise and stagewise approaches for spatial cluster detection

    PubMed Central

    Xu, Jiale

    2016-01-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performance of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power of detection. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. PMID:27246273

  12. Stepwise and stagewise approaches for spatial cluster detection.

    PubMed

    Xu, Jiale; Gangnon, Ronald E

    2016-05-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performance of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power of detection. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data.
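
    A toy sketch of the forward stepwise idea on aggregated count data: greedily select the candidate cluster with the largest Poisson likelihood-ratio statistic, then adjust expected counts inside the selected cluster before searching again. The statistic and the adjustment rule here are simplified assumptions, not the paper's exact method:

```python
import numpy as np

def lr_stat(obs_in, exp_in, obs_tot, exp_tot):
    """Poisson likelihood-ratio statistic for one candidate cluster
    (elevated-risk clusters only; clamped at zero)."""
    if obs_in <= exp_in:
        return 0.0
    o_out, e_out = obs_tot - obs_in, exp_tot - exp_in
    term_out = o_out * np.log(o_out / e_out) if o_out > 0 else 0.0
    return max(0.0, obs_in * np.log(obs_in / exp_in) + term_out)

def forward_stepwise(obs, exp, candidates, n_clusters=2):
    """obs/exp: per-region counts; candidates: list of index arrays."""
    exp = exp.copy()
    found = []
    for _ in range(n_clusters):
        stats = [lr_stat(obs[c].sum(), exp[c].sum(), obs.sum(), exp.sum())
                 for c in candidates]
        best = int(np.argmax(stats))
        if stats[best] == 0.0:
            break
        c = candidates[best]
        found.append(c)
        exp[c] *= obs[c].sum() / exp[c].sum()   # adjust for found cluster
    return found

obs = np.array([12, 3, 2, 15, 4]); exp = np.full(5, 5.0)
cands = [np.array([0]), np.array([3]), np.array([0, 3])]
print(forward_stepwise(obs, exp, cands))
```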

  13. A Canadian framework for applying the precautionary principle to public health issues.

    PubMed

    Weir, Erica; Schabas, Richard; Wilson, Kumanan; Mackie, Chris

    2010-01-01

    The precautionary principle has influenced environmental and public health policy. It essentially states that complete evidence of a potential risk is not required before action is taken to mitigate its effects. The application of precaution to public health issues is not straightforward and could paradoxically cause harm to the public's health when applied inappropriately. To avoid this, we propose a framework for applying the precautionary principle to potential public health risks. The framework consists of ten guiding questions to help establish whether a proposed application of the precautionary principle on a public health matter is based on adequacy of the evidence of causation, severity of harm and acceptability of the precautionary measures.

  14. A video coding scheme based on joint spatiotemporal and adaptive prediction.

    PubMed

    Jiang, Wenfei; Latecki, Longin Jan; Liu, Wenyu; Liang, Hui; Gorman, Ken

    2009-05-01

    We propose a video coding scheme that departs from traditional motion estimation/DCT frameworks and instead uses a Karhunen-Loeve transform (KLT)/joint spatiotemporal prediction framework. In particular, a novel approach that performs joint spatial and temporal prediction simultaneously is introduced. It bypasses the complex H.26x interframe techniques and is less computationally intensive. Because of the effective joint prediction and the image-dependent color space transformation (KLT), the proposed approach is demonstrated experimentally to consistently lead to improved video quality, and in many cases to better compression rates and improved computational speed.
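
    A minimal sketch of what an image-dependent color KLT looks like: the transform is the eigenbasis of one frame's own RGB covariance, so it adapts to each image. This is a generic PCA-based illustration, not the authors' full coding scheme:

```python
import numpy as np

def color_klt(frame_rgb: np.ndarray):
    """frame_rgb: (H, W, 3) array. Returns the decorrelated channels,
    the KLT basis (principal component first), and the color mean."""
    pixels = frame_rgb.reshape(-1, 3).astype(np.float64)
    mean = pixels.mean(axis=0)
    cov = np.cov((pixels - mean).T)      # 3x3 color covariance
    _, vecs = np.linalg.eigh(cov)        # eigenvectors in ascending order
    basis = vecs[:, ::-1]                # largest eigenvalue first
    transformed = (pixels - mean) @ basis
    return transformed.reshape(frame_rgb.shape), basis, mean

frame = np.random.randint(0, 256, (4, 4, 3))
klt_frame, basis, mean = color_klt(frame)
```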

  15. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    PubMed Central

    Brodic, Darko; Milivojevic, Dragan R.; Milivojevic, Zoran N.

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of the mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error type classification, are proposed. The first is based on the segmentation line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedure. PMID:22164106

  16. An approach to a comprehensive test framework for analysis and evaluation of text line segmentation algorithms.

    PubMed

    Brodic, Darko; Milivojevic, Dragan R; Milivojevic, Zoran N

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of the mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error type classification, are proposed. The first is based on the segmentation line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedure.

  17. A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN

    NASA Astrophysics Data System (ADS)

    Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.

    2017-10-01

    Time series data in practical applications often contain missing values due to sensor malfunction, network failure, outliers, etc. To handle missing values in time series, and to address the lack of temporal modeling in conventional machine learning models, we propose a spatiotemporal prediction framework based on missing value processing algorithms and a deep recurrent neural network (DRNN). By using a missing tag and a missing interval to represent time series patterns, we implement three different missing value fixing algorithms, which are further incorporated into a deep neural network that consists of LSTM (Long Short-Term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feedforward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. The performance of the three missing value fixing algorithms, as well as the different machine learning models, is evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, validating the capacity of the proposed framework. Our results also provide useful insights for better understanding of different strategies for handling missing values.
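
    A small numpy sketch of the missing tag / missing interval representation described above: for every time step and variable we record whether the value is missing and how long it has been missing, and apply a simple last-observation fixing rule (one of several possible fixing algorithms; names are ours). The resulting arrays can be concatenated and fed to the LSTM layers:

```python
import numpy as np

def missing_features(series: np.ndarray):
    """series: (T, D) array with NaN for missing values.
    Returns (filled, tag, interval), each of shape (T, D)."""
    T, D = series.shape
    tag = np.isnan(series).astype(float)      # 1 = missing
    interval = np.zeros((T, D))               # steps since last observation
    filled = series.copy()
    last = np.full(D, np.nan)
    for t in range(T):
        for d in range(D):
            if tag[t, d]:
                interval[t, d] = interval[t - 1, d] + 1 if t > 0 else 1
                filled[t, d] = last[d]        # carry last observation forward
            else:
                last[d] = series[t, d]
    # Values missing from the very start fall back to column means.
    col_mean = np.nanmean(series, axis=0)
    filled = np.where(np.isnan(filled), col_mean, filled)
    return filled, tag, interval

x = np.array([[1.0, np.nan], [np.nan, 4.0], [3.0, np.nan]])
filled, tag, interval = missing_features(x)
```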

  18. Digital contract approach for consistent and predictable multimedia information delivery in electronic commerce

    NASA Astrophysics Data System (ADS)

    Konana, Prabhudev; Gupta, Alok; Whinston, Andrew B.

    1997-01-01

    A pure 'technological' solution to network quality problems is incomplete, since any benefits from new technologies are offset by the demand from exponentially growing electronic commerce and data-intensive applications. Since an economic paradigm is implicit in electronic commerce, we propose a 'market-system' approach to improve quality of service. Quality of service for digital products takes on a different meaning, since users view quality of service differently and value information differently. We propose a framework for electronic commerce that is based on an economic paradigm and mass-customization, and works as a wide-area distributed management system. In our framework, surrogate servers act as intermediaries between information providers and end-users, and arrange for consistent and predictable information delivery through 'digital contracts.' These contracts are negotiated and priced based on economic principles. Surrogate servers pre-fetch, through replication, information from many different servers and consolidate it based on demand expectations. In order to recognize users' requirements and process requests accordingly, real-time databases are central to our framework. We also propose that multimedia information be separated into slowly changing and rapidly changing data streams to improve response time. Surrogate servers perform the integration of these data streams in a way that is transparent to end-users.

  19. A superpixel-based framework for automatic tumor segmentation on breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Yu, Ning; Wu, Jia; Weinstein, Susan P.; Gaonkar, Bilwaj; Keller, Brad M.; Ashraf, Ahmed B.; Jiang, YunQing; Davatzikos, Christos; Conant, Emily F.; Kontos, Despina

    2015-03-01

    Accurate and efficient automated tumor segmentation in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is highly desirable for computer-aided tumor diagnosis. We propose a novel automatic segmentation framework which incorporates mean-shift smoothing, superpixel-wise classification, pixel-wise graph-cuts partitioning, and morphological refinement. A set of 15 breast DCE-MR images, obtained from the American College of Radiology Imaging Network (ACRIN) 6657 I-SPY trial, were manually segmented to generate tumor masks (as ground truth) and breast masks (as regions of interest). Four state-of-the-art segmentation approaches based on diverse models were also utilized for comparison. Based on five standard evaluation metrics for segmentation, the proposed framework consistently outperformed all other approaches. The performance of the proposed framework was: 1) 0.83 for Dice similarity coefficient, 2) 0.96 for pixel-wise accuracy, 3) 0.72 for VOC score, 4) 0.79 mm for mean absolute difference, and 5) 11.71 mm for maximum Hausdorff distance, which surpassed the second best method (i.e., adaptive geodesic transformation), a semi-automatic algorithm depending on precise initialization. Our results suggest promising potential applications of our segmentation framework in assisting analysis of breast carcinomas.

  20. Computer-aided pulmonary image analysis in small animal models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average Dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using the publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
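
    A minimal sketch of the severe-pathology trigger: fit a regression between rib cage volume and lung volume on normal scans, then flag scans whose initial segmentation falls well below the expected volume. The linear form, threshold, and toy values are illustrative assumptions:

```python
import numpy as np

def fit_lung_model(ribcage_vols, lung_vols):
    """Least-squares fit: lung volume ~ a * ribcage volume + b."""
    a, b = np.polyfit(ribcage_vols, lung_vols, deg=1)
    return a, b

def severe_pathology(model, ribcage_vol, segmented_vol, rel_tol=0.25):
    """True if the segmented volume is more than rel_tol below the
    expected volume, suggesting dense pathology excluded by the
    initial segmentation (threshold illustrative)."""
    a, b = model
    expected = a * ribcage_vol + b
    return (expected - segmented_vol) / expected > rel_tol

model = fit_lung_model(np.array([30.0, 35, 40, 45]),   # rib cage volumes
                       np.array([12.0, 14, 16, 18]))   # normal lung volumes
print(severe_pathology(model, 38.0, 9.0))  # -> True: invoke pattern detector
```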

  1. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.

  2. Finger Vein Recognition Based on a Personalized Best Bit Map

    PubMed Central

    Yang, Gongping; Xi, Xiaoming; Yin, Yilong

    2012-01-01

    Finger vein patterns have recently been recognized as an effective biometric identifier. In this paper, we propose a finger vein recognition method based on a personalized best bit map (PBBM). Our method is rooted in a local binary pattern based method and uses only the best bits for matching. We first present the concept of PBBM and the generating algorithm. Then we propose the finger vein recognition framework, which consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PBBM achieves not only better performance but also high robustness and reliability. In addition, PBBM can be used as a general framework for binary pattern based recognition. PMID:22438735

  3. Finger vein recognition based on a personalized best bit map.

    PubMed

    Yang, Gongping; Xi, Xiaoming; Yin, Yilong

    2012-01-01

    Finger vein patterns have recently been recognized as an effective biometric identifier. In this paper, we propose a finger vein recognition method based on a personalized best bit map (PBBM). Our method is rooted in a local binary pattern based method and uses only the best bits for matching. We first present the concept of PBBM and the generating algorithm. Then we propose the finger vein recognition framework, which consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PBBM achieves not only better performance but also high robustness and reliability. In addition, PBBM can be used as a general framework for binary pattern based recognition.
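
    A small numpy sketch of the best-bit idea: keep only the bits of a binary feature code that are consistent across a user's enrollment samples, and match with a Hamming distance restricted to those bits. The data and the bit-selection rule are illustrative; in the paper the codes come from local binary patterns on vein images:

```python
import numpy as np

def best_bit_map(enroll_codes: np.ndarray, stability: float = 1.0):
    """enroll_codes: (n_samples, n_bits) binary array for one finger.
    Returns (template, mask): majority-vote template and the mask of
    bits that are sufficiently consistent across samples."""
    vote = enroll_codes.mean(axis=0)
    mask = (vote >= stability) | (vote <= 1 - stability)
    template = (vote >= 0.5).astype(np.uint8)
    return template, mask

def pbbm_distance(template, mask, probe_code):
    """Normalized Hamming distance computed over the best bits only."""
    return np.count_nonzero((template != probe_code) & mask) / mask.sum()

enroll = np.array([[1, 0, 1, 1, 0, 1],
                   [1, 0, 0, 1, 0, 1],
                   [1, 0, 1, 1, 0, 1]])
tmpl, mask = best_bit_map(enroll)
print(pbbm_distance(tmpl, mask, np.array([1, 0, 1, 1, 1, 1])))  # -> 0.2
```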

  4. Her2Net: A Deep Framework for Semantic Segmentation and Classification of Cell Membranes and Nuclei in Breast Cancer Evaluation.

    PubMed

    Saha, Monjoy; Chakraborty, Chandan

    2018-05-01

    We present an efficient deep learning framework for identifying, segmenting, and classifying cell membranes and nuclei from human epidermal growth factor receptor-2 (HER2)-stained breast cancer images with minimal user intervention. This is a long-standing issue for pathologists because the manual quantification of HER2 is error-prone, costly, and time-consuming. Hence, we propose a deep learning-based HER2 deep neural network (Her2Net) to solve this issue. The convolutional and deconvolutional parts of the proposed Her2Net framework consisted mainly of multiple convolution layers, max-pooling layers, spatial pyramid pooling layers, deconvolution layers, up-sampling layers, and trapezoidal long short-term memory (TLSTM). A fully connected layer and a softmax layer were also used for classification and error estimation. Finally, HER2 scores were calculated based on the classification results. The main contributions of our proposed Her2Net framework include the implementation of TLSTM and a deep learning framework for cell membrane and nucleus detection, segmentation, classification, and HER2 scoring. Our proposed Her2Net achieved 96.64% precision, 96.79% recall, 96.71% F-score, 93.08% negative predictive value, 98.33% accuracy, and a 6.84% false-positive rate. Our results demonstrate the high accuracy and wide applicability of the proposed Her2Net in the context of HER2 scoring for breast cancer evaluation.

  5. Characterizing behavioural ‘characters’: an evolutionary framework

    PubMed Central

    Araya-Ajoy, Yimen G.; Dingemanse, Niels J.

    2014-01-01

    Biologists often study phenotypic evolution assuming that phenotypes consist of a set of quasi-independent units that have been shaped by selection to accomplish a particular function. In the evolutionary literature, such quasi-independent functional units are called ‘evolutionary characters’, and a framework based on evolutionary principles has been developed to characterize them. This framework mainly focuses on ‘fixed’ characters, i.e. those that vary exclusively between individuals. In this paper, we introduce multi-level variation and thereby expand the framework to labile characters, focusing on behaviour as a worked example. We first propose a concept of ‘behavioural characters’ based on the original evolutionary character concept. We then detail how integration of variation between individuals (cf. ‘personality’) and within individuals (cf. ‘individual plasticity’) into the framework gives rise to a whole suite of novel testable predictions about the evolutionary character concept. We further propose a corresponding statistical methodology to test whether observed behaviours should be considered expressions of a hypothesized evolutionary character. We illustrate the application of our framework by characterizing the behavioural character ‘aggressiveness’ in wild great tits, Parus major. PMID:24335984

  6. Network Community Detection based on the Physarum-inspired Computational Framework.

    PubMed

    Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili

    2016-12-13

    Community detection is a crucial and essential problem in the structural analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving this problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with higher accuracy and lower computational cost still remains an open problem. Inspired by the computational capability and positive feedback mechanism in the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be distinguished from the intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to evaluate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.

  7. Derivation of Multiple Covarying Material and Process Parameters Using Physics-Based Modeling of X-ray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaira, Gurdaman; Doxastakis, Manolis; Bowen, Alec

    There is considerable interest in developing multimodal characterization frameworks capable of probing critical properties of complex materials by relying on distinct, complementary methods or tools. Any such framework should maximize the amount of information that is extracted from any given experiment and should be sufficiently powerful and efficient to enable on-the-fly analysis of multiple measurements in a self-consistent manner. Such a framework is demonstrated in this work in the context of self-assembling polymeric materials, where theory and simulations provide the language to seamlessly mesh experimental data from two different scattering measurements. Specifically, the samples considered here consist of diblock copolymers (BCP) that are self-assembled on chemically nanopatterned surfaces. The copolymers microphase separate into ordered lamellae with characteristic dimensions on the scale of tens of nanometers that are perfectly aligned by the substrate over macroscopic areas. These aligned lamellar samples provide ideal standards with which to develop the formalism introduced in this work and, more generally, the concept of high-information-content, multimodal experimentation. The outcomes of the proposed analysis are then compared to images generated by 3D scanning electron microscopy tomography, serving to validate the merit of the framework and ideas proposed here.

  8. General flat four-dimensional world pictures and clock systems

    NASA Technical Reports Server (NTRS)

    Hsu, J. P.; Underwood, J. A.

    1978-01-01

    We explore the mathematical structure and the physical implications of a general four-dimensional symmetry framework which is consistent with the Poincare-Einstein principle of relativity for physical laws and with experiments. In particular, we discuss a four-dimensional framework in which all observers in different frames use one and the same grid of clocks. The general framework includes special relativity and a recently proposed new four-dimensional symmetry with a nonuniversal light speed as two special simple cases. The connection between the properties of light propagation and the convention concerning clock systems is also discussed, and is seen to be nonunique within the four-dimensional framework.

  9. The Architecture of Personality

    ERIC Educational Resources Information Center

    Cervone, Daniel

    2004-01-01

    This article presents a theoretical framework for analyzing psychological systems that contribute to the variability, consistency, and cross-situational coherence of personality functioning. In the proposed knowledge-and-appraisal personality architecture (KAPA), personality structures and processes are delineated by combining 2 principles:…

  10. Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems

    NASA Astrophysics Data System (ADS)

    Igaki, Hiroshi; Nakamura, Masahide

    This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
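
    A toy sketch of the two interaction types: model each method invocation as a write to an appliance property or to a shared environment variable, and flag conflicting writes between two services. The service definitions below are invented for illustration:

```python
def detect_interactions(service_a, service_b):
    """Each service is a list of (target, property, value) writes, where
    target is an appliance name or the pseudo-appliance 'environment'.
    A conflict on an appliance property is an appliance interaction;
    a conflict on an environment variable is an environment interaction."""
    conflicts = []
    for tgt_a, prop_a, val_a in service_a:
        for tgt_b, prop_b, val_b in service_b:
            if (tgt_a, prop_a) == (tgt_b, prop_b) and val_a != val_b:
                kind = ("environment interaction" if tgt_a == "environment"
                        else "appliance interaction")
                conflicts.append((kind, tgt_a, prop_a, val_a, val_b))
    return conflicts

air_cooling = [("aircon", "mode", "cool"),
               ("environment", "temperature", "low")]
bath_heating = [("heater", "power", "on"),
                ("environment", "temperature", "high")]
print(detect_interactions(air_cooling, bath_heating))
# -> [('environment interaction', 'environment', 'temperature', 'low', 'high')]
```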

  11. New theoretical framework for designing nonionic surfactant mixtures that exhibit a desired adsorption kinetics behavior.

    PubMed

    Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel

    2010-12-21

    How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C10E4, C12E5, C12E6, and C10E8) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem, as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.

  12. Noise-aware dictionary-learning-based sparse representation framework for detection and removal of single and combined noises from ECG signal

    PubMed Central

    Ramkumar, Barathram; Sabarimalai Manikandan, M.

    2017-01-01

    Automatic electrocardiogram (ECG) signal enhancement has become a crucial pre-processing step in most ECG signal analysis applications. In this Letter, the authors propose an automated noise-aware dictionary learning-based generalised ECG signal enhancement framework which can automatically learn the dictionaries based on the ECG noise type for effective representation of ECG signal and noises, and can reduce the computational load of sparse representation-based ECG enhancement systems. The proposed framework consists of noise detection and identification, noise-aware dictionary learning, and sparse signal decomposition and reconstruction. The noise detection and identification is performed based on the moving average filter, first-order difference, and temporal features such as number of turning points, maximum absolute amplitude, zero-crossings, and autocorrelation features. The representation dictionary is learned based on the type of noise identified in the previous stage. The proposed framework is evaluated using noise-free and noisy ECG signals. Results demonstrate that the proposed method can significantly reduce computational load as compared with conventional dictionary learning-based ECG denoising approaches. Further, comparative results show that the method outperforms existing methods in automatically removing noises such as baseline wander, power-line interference, muscle artefacts and their combinations without distorting the morphological content of local waves of the ECG signal. PMID:28529758

  13. Noise-aware dictionary-learning-based sparse representation framework for detection and removal of single and combined noises from ECG signal.

    PubMed

    Satija, Udit; Ramkumar, Barathram; Sabarimalai Manikandan, M

    2017-02-01

    Automatic electrocardiogram (ECG) signal enhancement has become a crucial pre-processing step in most ECG signal analysis applications. In this Letter, the authors propose an automated noise-aware dictionary learning-based generalised ECG signal enhancement framework which can automatically learn the dictionaries based on the ECG noise type for effective representation of ECG signal and noises, and can reduce the computational load of sparse representation-based ECG enhancement systems. The proposed framework consists of noise detection and identification, noise-aware dictionary learning, and sparse signal decomposition and reconstruction. The noise detection and identification is performed based on the moving average filter, first-order difference, and temporal features such as number of turning points, maximum absolute amplitude, zero-crossings, and autocorrelation features. The representation dictionary is learned based on the type of noise identified in the previous stage. The proposed framework is evaluated using noise-free and noisy ECG signals. Results demonstrate that the proposed method can significantly reduce computational load as compared with conventional dictionary learning-based ECG denoising approaches. Further, comparative results show that the method outperforms existing methods in automatically removing noises such as baseline wander, power-line interference, muscle artefacts and their combinations without distorting the morphological content of local waves of the ECG signal.
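
    A short numpy sketch of the temporal features named above (turning points, zero-crossings, maximum absolute amplitude, autocorrelation), computed on a moving-average-detrended signal; the window length and any downstream decision thresholds are our assumptions:

```python
import numpy as np

def noise_features(ecg: np.ndarray, ma_len: int = 8):
    """Return temporal features of the detrended ECG segment used for
    noise detection and identification (feature set from the abstract)."""
    baseline = np.convolve(ecg, np.ones(ma_len) / ma_len, mode="same")
    x = ecg - baseline                     # moving-average filtered signal
    d = np.diff(x)                         # first-order difference
    turning = np.count_nonzero(d[1:] * d[:-1] < 0)      # slope sign changes
    zero_cross = np.count_nonzero(x[1:] * x[:-1] < 0)   # signal sign changes
    max_abs = np.abs(x).max()
    ac1 = np.corrcoef(x[1:], x[:-1])[0, 1]              # lag-1 autocorrelation
    return {"turning_points": turning, "zero_crossings": zero_cross,
            "max_abs_amplitude": max_abs, "autocorr_lag1": ac1}

rng = np.random.default_rng(0)
segment = np.sin(np.linspace(0, 6, 500)) + 0.3 * rng.standard_normal(500)
print(noise_features(segment))
```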

  14. Framework for the mapping of the monthly average daily solar radiation using an advanced case-based reasoning and a geostatistical technique.

    PubMed

    Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon

    2014-04-15

    For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. Therefore, this study aimed to develop a framework for mapping the MADSR using advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data in the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to a geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. The MADSR map developed through the proposed framework showed improved accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for PV system installation.
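
    A minimal sketch of the case-retrieval step behind a CBR estimate: predict the MADSR at an unmeasured site as a similarity-weighted average over the most similar measured cases. The attributes, weighting rule, and toy values are illustrative assumptions, not the paper's advanced CBR model:

```python
import numpy as np

def estimate_madsr(case_attrs, case_madsr, query_attrs, k=3):
    """case_attrs: (n, d) meteorological attributes of measured sites;
    case_madsr: (n,) measured MADSR values; query_attrs: (d,) new site."""
    # Normalize attributes so no single unit dominates the distance.
    mu, sd = case_attrs.mean(axis=0), case_attrs.std(axis=0)
    ca, q = (case_attrs - mu) / sd, (query_attrs - mu) / sd
    dist = np.linalg.norm(ca - q, axis=1)
    nearest = np.argsort(dist)[:k]
    w = 1.0 / (dist[nearest] + 1e-9)       # inverse-distance case weights
    return float(np.sum(w * case_madsr[nearest]) / w.sum())

attrs = np.array([[12.1, 60.0], [15.3, 55.0], [9.8, 70.0], [14.0, 52.0]])
madsr = np.array([3.9, 4.4, 3.2, 4.3])     # toy values, kWh/m^2/day
print(estimate_madsr(attrs, madsr, np.array([13.5, 58.0])))
```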

  15. Local linear discriminant analysis framework using sample neighbors.

    PubMed

    Fan, Zizhu; Xu, Yong; Zhang, David

    2011-07-01

    The linear discriminant analysis (LDA) is a very popular linear feature extraction approach. The algorithms of LDA usually perform well under the following two assumptions. The first assumption is that the global data structure is consistent with the local data structure. The second assumption is that the input data classes are Gaussian distributions. However, in real-world applications, these assumptions are not always satisfied. In this paper, we propose an improved LDA framework, the local LDA (LLDA), which can perform well without needing to satisfy the above two assumptions. Our LLDA framework can effectively capture the local structure of samples. According to different types of local data structure, our LLDA framework incorporates several different forms of linear feature extraction approaches, such as the classical LDA and principal component analysis. The proposed framework includes two LLDA algorithms: a vector-based LLDA algorithm and a matrix-based LLDA (MLLDA) algorithm. MLLDA is directly applicable to image recognition, such as face recognition. Our algorithms need to train only a small portion of the whole training set before testing a sample. They are suitable for learning large-scale databases especially when the input data dimensions are very high and can achieve high classification accuracy. Extensive experiments show that the proposed algorithms can obtain good classification results.

  16. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883

  17. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high-stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to the identification and categorization of different workflow tasks and activities into value-added and non-value-added categories, in an effort to provide more valuable and higher-quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency.

  18. Built environment change: a framework to support health-enhancing behaviour through environmental policy and health research.

    PubMed

    Berke, Ethan M; Vernez-Moudon, Anne

    2014-06-01

    As research examining the effect of the built environment on health accelerates, it is critical for health and planning researchers to conduct studies and make recommendations in the context of a robust theoretical framework. We propose a framework for built environment change (BEC) related to improving health. BEC consists of elements of the built environment, how people are exposed to and interact with them perceptually and functionally, and how this exposure may affect health-related behaviours. Integrated into this framework are the legal and regulatory mechanisms and instruments that are commonly used to effect change in the built environment. This framework would be applicable to medical research as well as to issues of policy and community planning.

  19. Fully Decentralized Semi-supervised Learning via Privacy-preserving Matrix Completion.

    PubMed

    Fierimonte, Roberto; Scardapane, Simone; Uncini, Aurelio; Panella, Massimo

    2016-08-26

    Distributed learning refers to the problem of inferring a function when the training data are distributed among different nodes. While significant work has been done in the contexts of supervised and unsupervised learning, the intermediate case of semi-supervised learning in the distributed setting has received less attention. In this paper, we propose an algorithm for this class of problems by extending the framework of manifold regularization. The main component of the proposed algorithm consists of a fully distributed computation of the adjacency matrix of the training patterns. To this end, we propose a novel algorithm for low-rank distributed matrix completion, based on the framework of diffusion adaptation. Overall, the distributed semi-supervised algorithm is efficient and scalable, and it can preserve privacy through the inclusion of flexible privacy-preserving mechanisms for similarity computation. The experimental results and comparison on a wide range of standard semi-supervised benchmarks validate our proposal.

  20. 78 FR 48920 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... amendment is consistent with Section 6(b)(1) of the Act in that it simply clarifies the framework under... the purpose of, and basis for, the proposed rule change and discussed any comments it received on the... consisting of a contract to purchase equity and/or debt securities at a specified time. It is the Exchange's...

  1. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  2. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  3. A validation framework for brain tumor segmentation.

    PubMed

    Archip, Neculai; Jolesz, Ferenc A; Warfield, Simon K

    2007-10-01

    We introduce a validation framework for the segmentation of brain tumors from magnetic resonance (MR) images. A novel unsupervised semiautomatic brain tumor segmentation algorithm is also presented. The proposed framework consists of 1) T1-weighted MR images of patients with brain tumors, 2) segmentation of brain tumors performed by four independent experts, 3) segmentation of brain tumors generated by a semiautomatic algorithm, and 4) a software tool that estimates the performance of segmentation algorithms. We demonstrate the validation of the novel segmentation algorithm within the proposed framework. We show its performance and compare it with existing segmentation methods. The image datasets and software are available at http://www.brain-tumor-repository.org/. We present an Internet resource that provides access to MR brain tumor image data and segmentations that can be openly used by the research community. Its purpose is to encourage the development and evaluation of segmentation methods by providing raw test and image data, human expert segmentation results, and methods for comparing segmentation results.

  4. Steps toward improving ethical evaluation in health technology assessment: a proposed framework.

    PubMed

    Assasi, Nazila; Tarride, Jean-Eric; O'Reilly, Daria; Schwartz, Lisa

    2016-06-06

    While evaluation of ethical aspects in health technology assessment (HTA) has gained much attention during the past years, the integration of ethics in HTA practice still presents many challenges. In response to the increasing demand for expansion of health technology assessment (HTA) methodology to include ethical issues more systematically, this article reports on a multi-stage study that aimed at construction of a framework for improving the integration of ethics in HTA. The framework was developed through the following phases: 1) a systematic review and content analysis of guidance documents for ethics in HTA; 2) identification of factors influencing the integration of ethical considerations in HTA; 3) preparation of an action-oriented framework based on the key elements of the existing guidance documents and identified barriers to and facilitators of their implementation; and 4) expert consultation and revision of the framework. The proposed framework consists of three main components: an algorithmic flowchart, which exhibits the different steps of an ethical inquiry throughout the HTA process, including: defining the objectives and scope of the evaluation, stakeholder analysis, assessing organizational capacity, framing ethical evaluation questions, ethical analysis, deliberation, and knowledge translation; a stepwise guide, which focuses on the task objectives and potential questions that are required to be addressed at each step; and a list of some commonly recommended or used tools to help facilitate the evaluation process. The proposed framework can be used to support and promote good practice in integration of ethics into HTA. However, further validation of the framework through case studies and expert consultation is required to establish its utility for HTA practice.

  5. A Hierarchical Framework for Demand-Side Frequency Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moya, Christian; Zhang, Wei; Lian, Jianming

    2014-06-02

    With large-scale plans to integrate renewable generation, more resources will be needed to compensate for the uncertainty associated with intermittent generation resources. Under such conditions, performing frequency control using only supply-side resources becomes not only prohibitively expensive but also technically difficult. It is therefore important to explore how a sufficient proportion of the loads could assume a routine role in frequency control to maintain the stability of the system at an acceptable cost. In this paper, a novel hierarchical decentralized framework for frequency-based load control is proposed. The framework involves two decision layers. The top decision layer determines the optimal droop gain required from the aggregated load response on each bus using a robust decentralized control approach. The second layer consists of a large number of devices, which switch probabilistically during contingencies so that the aggregated power change matches the desired droop amount according to the updated gains. The proposed framework is based on the classical nonlinear multi-machine power system model, and can deal with time-varying system operating conditions while respecting the physical constraints of individual devices. Realistic simulation results based on a 68-bus system are provided to demonstrate the effectiveness of the proposed strategy.
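
    A minimal sketch of the second layer's idea, assuming each device switches off with a common probability chosen so the expected aggregate power reduction matches the droop-requested amount; the device powers and target value are invented for illustration.

    ```python
    import random

    def probabilistic_shed(devices_kw, target_shed_kw):
        """Switch devices off independently with a probability chosen so the
        *expected* aggregate power reduction equals the requested droop amount."""
        total = sum(devices_kw)
        p_off = min(1.0, target_shed_kw / total)    # common switching probability
        return sum(p for p in devices_kw if random.random() < p_off)

    random.seed(0)
    loads = [1.2, 0.8, 2.0, 1.5, 0.9] * 200          # a large device population (kW)
    print("requested 300 kW, shed %.1f kW" % probabilistic_shed(loads, 300.0))
    ```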

  6. Definition and constituents of maltreatment in sport: establishing a conceptual framework for research practitioners.

    PubMed

    Stirling, A E

    2009-12-01

    There has recently been an increased emergence of research on the maltreatment of athletes in sport. It is suggested that research may play a particularly salient role with respect to athlete protection initiatives. However, as it stands, current research in this area is limited by a lack of consistency in definitions. The purpose of the paper, therefore, is to propose a conceptual framework of maltreatment in sport to be used among research practitioners. More specifically, a conceptual model of the different categories, constructs and constituents of maltreatment in sport is proposed. Sport-specific examples of the various maltreatments are outlined. Current literature is reviewed, and recommendations are made for future research.

  7. International validation of quality indicators for evaluating priority setting in low income countries: process and key lessons.

    PubMed

    Kapiriri, Lydia

    2017-06-19

    While there have been efforts to develop frameworks to guide healthcare priority setting, there has been limited focus on evaluation frameworks. Moreover, while the few frameworks identify quality indicators for successful priority setting, they do not provide the users with strategies to verify these indicators. Kapiriri and Martin (Health Care Anal 18:129-147, 2010) developed a framework for evaluating priority setting in low and middle income countries. This framework provides both parameters for successful priority setting and proposed means of their verification. This paper presents results from a validation process of the framework conducted before its use in real-life contexts. The framework validation involved 53 policy makers and priority setting researchers at the global, national and sub-national levels (in Uganda). They were requested to indicate the relative importance of the proposed parameters as well as the feasibility of obtaining the related information. We also pilot tested the proposed means of verification. Almost all the respondents evaluated all the parameters, including the contextual factors, as 'very important'. However, some respondents at the global level rated 'presence of incentives to comply', 'reduced disagreements', 'increased public understanding', 'improved institutional accountability' and 'meeting the ministry of health objectives' as less important, which could be a reflection of their levels of decision making. All the proposed means of verification were assessed as feasible, with the exception of meeting observations, which would require an insider. These findings were consistent with those obtained from the pilot testing. These findings are relevant to policy makers and researchers involved in priority setting in low and middle income countries. To the best of our knowledge, this is one of the few initiatives that has involved potential users of a framework (at the global level and in a low income country) in its validation. The favorable validation of all the parameters at the national and sub-national levels implies that the framework has potential usefulness at those levels, as is. The parameters that were disputed at the global level necessitate further discussion when using the framework at that level. The next step is to use the validated framework in evaluating actual priority setting at the different levels.

  8. An efficient depth map preprocessing method based on structure-aided domain transform smoothing for 3D view generation

    PubMed Central

    Ma, Liyan; Qiu, Bo; Cui, Mingyue; Ding, Jianwei

    2017-01-01

    Depth image-based rendering (DIBR), which is used to render virtual views with a color image and the corresponding depth map, is one of the key techniques in the 2D to 3D conversion process. Due to the absence of knowledge about the 3D structure of a scene and its corresponding texture, DIBR in the 2D to 3D conversion process inevitably leads to holes in the resulting 3D image as a result of newly-exposed areas. In this paper, we propose a structure-aided depth map preprocessing framework in the transformed domain, inspired by the recently proposed domain transform for its low complexity and high efficiency. First, our framework integrates hybrid constraints including scene structure, edge consistency and visual saliency information in the transformed domain to improve the performance of depth map preprocessing in an implicit way. Then, adaptive smooth localization is incorporated into the proposed framework to further reduce over-smoothing and enhance optimization in the non-hole regions. Unlike other similar methods, the proposed method can simultaneously achieve the effects of hole filling, edge correction and local smoothing for typical depth maps in a unified framework. Thanks to these advantages, it can yield visually satisfactory results with less computational complexity for high quality 2D to 3D conversion. Numerical experimental results demonstrate the excellent performance of the proposed method. PMID:28407027
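
    A simplified one-dimensional sketch of the recursive domain transform filtering the framework builds on (after Gastal and Oliveira): neighbor distances grow across edges in the guide signal, so smoothing does not cross them. The parameters and the use of the depth signal as its own guide are illustrative assumptions, not the paper's configuration.

    ```python
    import numpy as np

    def domain_transform_smooth(signal, guide, sigma_s=20.0, sigma_r=0.1, iterations=3):
        """Edge-aware 1-D recursive smoothing in the transformed domain."""
        # Transformed-domain distance between neighbors: large jumps in the
        # guide (edges) make the feedback weight vanish there.
        d = 1.0 + (sigma_s / sigma_r) * np.abs(np.diff(guide, prepend=guide[0]))
        out = signal.astype(float).copy()
        for i in range(iterations):
            # Halve the effective sigma per iteration, as in the original scheme.
            sigma_i = sigma_s * np.sqrt(3.0) * 2 ** (iterations - i - 1) \
                      / np.sqrt(4 ** iterations - 1)
            a = np.exp(-np.sqrt(2.0) / sigma_i) ** d   # per-sample feedback weight
            for k in range(1, len(out)):               # left-to-right pass
                out[k] += a[k] * (out[k - 1] - out[k])
            for k in range(len(out) - 2, -1, -1):      # right-to-left pass
                out[k] += a[k + 1] * (out[k + 1] - out[k])
        return out

    depth = np.r_[np.zeros(50), np.ones(50)] + 0.05 * np.random.randn(100)
    print(domain_transform_smooth(depth, guide=depth)[:5])  # noise smoothed, step kept
    ```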

  9. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, which will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
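
    A toy sketch of the parallel contingency analysis component, with the power-flow solve stubbed out; the contingency list, loadings, and alert threshold below are invented for illustration.

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def run_contingency(case):
        """Stand-in for one N-1 analysis (a real system would re-solve the
        power flow with the outaged element removed)."""
        branch, loading = case
        return branch, loading, loading > 1.0     # flag post-contingency overload

    if __name__ == "__main__":
        # Hypothetical post-contingency loadings (per unit) for a screened list.
        cases = [("branch-%d" % i, 0.6 + 0.01 * i) for i in range(60)]
        with ProcessPoolExecutor() as pool:        # HPC-style parallel fan-out
            results = list(pool.map(run_contingency, cases))
        for branch, loading, overload in results:
            if overload:
                print("ALERT: %s at %.2f p.u." % (branch, loading))
    ```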

  10. Skin Lesion Analysis towards Melanoma Detection Using Deep Learning Network.

    PubMed

    Li, Yuexiang; Shen, Linlin

    2018-02-11

    Skin lesions are a severe global health problem. Early detection of melanoma in dermoscopy images significantly increases the survival rate. However, accurate recognition of melanoma is extremely challenging due to low contrast between lesions and skin, visual similarity between melanoma and non-melanoma lesions, etc. Hence, reliable automatic detection of skin tumors is very useful for increasing the accuracy and efficiency of pathologists. In this paper, we propose two deep learning methods to address the three main tasks emerging in the area of skin lesion image processing, i.e., lesion segmentation (task 1), lesion dermoscopic feature extraction (task 2) and lesion classification (task 3). A deep learning framework consisting of two fully convolutional residual networks (FCRN) is proposed to simultaneously produce the segmentation result and the coarse classification result. A lesion index calculation unit (LICU) is developed to refine the coarse classification results by calculating the distance heat-map. A straightforward CNN is proposed for the dermoscopic feature extraction task. The proposed deep learning frameworks were evaluated on the ISIC 2017 dataset. Experimental results show the promising accuracies of our frameworks: 0.753 for task 1, 0.848 for task 2 and 0.912 for task 3.
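
    A simplified take on the distance heat-map idea behind the LICU: re-weight coarse per-pixel class probabilities so pixels far from the segmented lesion contribute less to the image-level decision. The weighting function, shapes, and random data are assumptions, not the paper's exact unit.

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def refine_with_distance_map(coarse_probs, lesion_mask):
        """Down-weight class evidence from pixels far from the lesion."""
        # Distance of every pixel to the nearest lesion pixel (0 inside it).
        dist = distance_transform_edt(~lesion_mask.astype(bool))
        weights = np.exp(-dist / (dist.max() + 1e-8))     # 1 inside the lesion
        weighted = coarse_probs * weights[..., None]
        return weighted.sum(axis=(0, 1)) / weights.sum()  # refined class scores

    probs = np.random.dirichlet([1, 1, 1], size=(64, 64))  # 3-class coarse map
    mask = np.zeros((64, 64), dtype=bool); mask[20:40, 20:40] = True
    print("refined scores:", refine_with_distance_map(probs, mask).round(3))
    ```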

  11. A Framework for Monitoring Progress Using Summary Measures of Health.

    PubMed

    Madans, Jennifer H; Weeks, Julie D

    2016-10-01

    Initiatives designed to monitor health typically incorporate numerous specific measures of health and the health system to assess improvements, or lack thereof, for policy and program purposes. The addition of summary measures provides overarching information that is essential for determining whether the goals of such initiatives are met. Summary measures are identified that relate to the individual indicators but also reflect movement in the various parts of the system. We propose a hierarchical framework that is conceptually consistent and utilizes a succinct number of summary measures incorporating indicators of functioning and participation. While a large set of individual indicators can be useful for monitoring progress, these individual indicators do not provide an overall evaluation of health, defined broadly, at the population level. A hierarchical framework consisting of summary measures is important for monitoring the success of health improvement initiatives. © The Author(s) 2016.

  12. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  13. Teaching Gil to Lead

    ERIC Educational Resources Information Center

    Davis, Stephen H.; Leon, Ronald J.

    2009-01-01

    The complexities of public education today require new, distributed models of school leadership in which teachers play a central role. The most effective teachers assume leadership roles as instructors and professional colleagues. In this article, we propose a framework for developing teacher leadership that consists of four intersecting domains:…

  14. An evaluation of a new instrument to measure organisational safety culture values and practices.

    PubMed

    Díaz-Cabrera, D; Hernández-Fernaud, E; Isla-Díaz, R

    2007-11-01

    The main aim of this research is to evaluate a safety culture measuring instrument centred upon relevant organisational values and practices related to the safety management system. Seven dimensions that reflect underlying safety meanings are proposed. A second objective is to explore the four cultural orientations in the field of safety arising from the competing values framework. The study sample consisted of 299 participants from five companies in different sectors. The results show six dimensions of organisational values and practices and different company profiles in the organisations studied. The four cultural orientations proposed by the competing values framework are not confirmed. Nevertheless, a coexistence of diverse cultural orientations or paradoxes in the companies is observed.

  15. A framework provided an outline toward the proper evaluation of potential screening strategies.

    PubMed

    Adriaensen, Wim J; Matheï, Cathy; Buntinx, Frank J; Arbyn, Marc

    2013-06-01

    Screening tests are often introduced into clinical practice without proper evaluation, despite the increasing awareness that screening is a double-edged sword that can lead to either net benefits or harms. Our objective was to develop a comprehensive framework for the evaluation of new screening strategies. Elaborating on the existing concepts proposed by experts, a stepwise framework is proposed to evaluate whether a potential screening test can be introduced as a screening strategy into clinical practice. The principle of screening strategy evaluation is illustrated for cervical cancer, which is a template for screening because of the existence of an easily detectable and treatable precursor lesion. The evaluation procedure consists of six consecutive steps. In steps 1-4, the technical accuracy, place of the test in the screening pathway, diagnostic accuracy, and longitudinal sensitivity and specificity of the screening test are assessed. In steps 5 and 6, the impact of the screening strategy on the patient and population levels, respectively, is evaluated. The framework incorporates a harm and benefit trade-off and cost-effectiveness analysis. Our framework provides an outline toward the proper evaluation of potential screening strategies before considering implementation. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and a BPEL execution engine runs the workflow. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and the influence of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
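
    Reduced to its essentials, a GPW chains service nodes so each node's output feeds the next; the sketch below collapses the BPEL engine's role into a loop over callables. The node names and the fire-classification payload are purely illustrative, not the project's actual interfaces.

    ```python
    # Each node is a callable; a workflow is an ordered list of nodes.
    def data_service(query):             # fetch raw sensor data
        return {"scene": query, "bands": [0.2, 0.7, 0.9]}

    def processing_service(product):     # derive a thematic product
        product["fire_risk"] = max(product["bands"])
        return product

    def presentation_service(product):   # render / publish the result
        return "scene %(scene)s -> fire risk %(fire_risk).1f" % product

    workflow = [data_service, processing_service, presentation_service]

    result = "EO-1/2008-10-22"
    for node in workflow:                # the BPEL engine's role, reduced to a loop
        result = node(result)
    print(result)
    ```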

  17. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and a biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
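
    A minimal sketch of probability-based motion synthesis via Gaussian process regression, assuming scikit-learn and an invented two-variable training set (stance width, external load) mapped to a single joint angle; the real system synthesizes full squat motion trajectories.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Hypothetical training data: (stance width [m], load [kg]) -> knee flexion (deg)
    X = np.array([[0.3, 20], [0.3, 60], [0.5, 20], [0.5, 60], [0.4, 40]])
    y = np.array([95.0, 102.0, 90.0, 98.0, 96.0])

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[0.1, 20.0]),
                                   normalize_y=True).fit(X, y)
    angle, std = gpr.predict(np.array([[0.45, 50]]), return_std=True)
    print("predicted knee flexion: %.1f deg (+/- %.1f)" % (angle[0], std[0]))
    ```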

  18. An Integrated Framework for Human-Robot Collaborative Manipulation.

    PubMed

    Sheng, Weihua; Thobbi, Anand; Gu, Ye

    2015-10-01

    This paper presents an integrated learning framework that enables humanoid robots to perform human-robot collaborative manipulation tasks. Specifically, a table-lifting task performed jointly by a human and a humanoid robot is chosen for validation purposes. The proposed framework is split into two phases: 1) phase I-learning to grasp the table and 2) phase II-learning to perform the manipulation task. An imitation learning approach is proposed for phase I. In phase II, the behavior of the robot is controlled by a combination of two types of controllers: 1) reactive and 2) proactive. The reactive controller lets the robot take a reactive control action to make the table horizontal. The proactive controller lets the robot take proactive actions based on human motion prediction. A measure of confidence of the prediction is also generated by the motion predictor. This confidence measure determines the leader/follower behavior of the robot. Hence, the robot can autonomously switch between the behaviors during the task. Finally, the performance of the human-robot team carrying out the collaborative manipulation task is experimentally evaluated on a platform consisting of a Nao humanoid robot and a Vicon motion capture system. Results show that the proposed framework can enable the robot to carry out the collaborative manipulation task successfully.
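
    The confidence-based role switching can be sketched as a simple threshold rule; the threshold value and labels below are assumptions for illustration, not the paper's tuned parameters.

    ```python
    def select_behavior(confidence, threshold=0.7):
        """Switch between proactive (leader-like) and reactive (follower)
        control based on the motion predictor's confidence."""
        return "proactive" if confidence >= threshold else "reactive"

    # As prediction confidence evolves during the lift, the robot switches roles.
    for c in (0.30, 0.55, 0.75, 0.90, 0.60):
        print("confidence %.2f -> %s controller" % (c, select_behavior(c)))
    ```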

  19. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and a biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  20. A novel water quality data analysis framework based on time-series data mining.

    PubMed

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly dissolved oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and a tributary, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
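
    A simplified stand-in for the granulation-and-similarity part of the framework: each window is summarized by a (mean, dispersion) pair approximating a normal cloud's numerical characteristics, and a granule whose best similarity to all others is low is flagged as anomalous. The window width, similarity kernel, and synthetic data are illustrative assumptions.

    ```python
    import numpy as np

    def granulate(series, width):
        """Summarize each window by (Ex, En) - a simplified stand-in for the
        paper's two-dimensional normal cloud granules."""
        windows = series[: len(series) // width * width].reshape(-1, width)
        ex = windows.mean(axis=1)
        en = np.sqrt(np.pi / 2.0) * np.abs(windows - ex[:, None]).mean(axis=1)
        return np.c_[ex, en]

    def similarity_matrix(granules):
        d = np.linalg.norm(granules[:, None, :] - granules[None, :, :], axis=-1)
        return np.exp(-d)                       # similarity decays with distance

    rng = np.random.default_rng(1)
    do_series = 8.0 + 0.3 * rng.standard_normal(208)   # synthetic weekly DO values
    granules = granulate(do_series, width=4)
    sim = similarity_matrix(granules)
    np.fill_diagonal(sim, 0.0)
    # Anomaly detection: the granule with no close neighbor is suspicious.
    print("most anomalous window:", int(np.argmin(sim.max(axis=1))))
    ```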

  1. A Psychometric Framework for the Evaluation of Instructional Sensitivity

    ERIC Educational Resources Information Center

    Naumann, Alexander; Hochweber, Jan; Klieme, Eckhard

    2016-01-01

    Although there is a common understanding of instructional sensitivity, it lacks a common operationalization. Various approaches have been proposed, some focusing on item responses, others on test scores. As approaches often do not produce consistent results, previous research has created the impression that approaches to instructional sensitivity…

  2. Dynamical dark matter: A new framework for dark-matter physics

    NASA Astrophysics Data System (ADS)

    Dienes, Keith R.; Thomas, Brooks

    2013-05-01

    Although much remains unknown about the dark matter of the universe, one property is normally considered sacrosanct: dark matter must be stable well beyond cosmological time scales. However, a new framework for dark-matter physics has recently been proposed which challenges this assumption. In the "dynamical dark matter" (DDM) framework, the dark sector consists of a vast ensemble of individual dark-matter components with differing masses, lifetimes, and cosmological abundances. Moreover, the usual requirement of stability is replaced by a delicate balancing between lifetimes and cosmological abundances across the ensemble as a whole. As a result, it is possible for the DDM ensemble to remain consistent with all experimental and observational bounds on dark matter while nevertheless giving rise to collective behaviors which transcend those normally associated with traditional dark-matter candidates. These include a new, non-trivial dark-matter equation of state as well as potentially distinctive signatures in collider and direct-detection experiments. In this review article, we provide a self-contained introduction to the DDM framework and summarize some of the work which has recently been done in this area. We also present an explicit model within the DDM framework, and outline a number of ideas for future investigation.
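
    The "non-trivial equation of state" can be illustrated with the standard multicomponent definition: the ensemble's effective equation-of-state parameter is the pressure-weighted (equivalently, abundance-weighted) average over its components, with densities ρ_i, pressures p_i, and individual parameters w_i. This is a generic illustration of the idea, not a quotation from the review.

    ```latex
    % Effective equation of state of an ensemble of components with energy
    % densities \rho_i and individual parameters w_i = p_i/\rho_i:
    w_{\mathrm{eff}}(t) = \frac{\sum_i p_i(t)}{\sum_i \rho_i(t)}
                        = \frac{\sum_i w_i\,\rho_i(t)}{\sum_i \rho_i(t)},
    % generically time-dependent, since decays reshuffle the \rho_i(t).
    ```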

  3. A framework for assessing outcomes from newborn screening: on the road to measuring its promise

    PubMed Central

    Hinton, Cynthia F.; Homer, Charles J.; Thompson, Alexis A.; Williams, Andrea; Hassell, Kathryn L.; Feuchtbaum, Lisa; Berry, Susan A.; Comeau, Anne Marie; Therrell, Bradford L.; Brower, Amy; Harris, Katharine B.; Brown, Christine; Monaco, Jana; Ostrander, Robert J.; Zuckerman, Alan E.; Kaye, Celia; Dougherty, Denise; Greene, Carol; Green, Nancy S.

    2016-01-01

    Newborn screening (NBS) is intended to identify congenital conditions prior to the onset of symptoms in order to provide early intervention that leads to improved outcomes. NBS is a public health success, providing reduction in mortality and improved developmental outcomes for screened conditions. However, it is less clear to what extent newborn screening achieves the long-term goals relating to improved health, growth, development and function. We propose a framework for assessing outcomes for the health and well-being of children identified through NBS programs. The framework proposed here, and this manuscript, were approved for publication by the Secretary of Health and Human Services' Advisory Committee on Heritable Disorders in Newborns and Children (ACHDNC). This framework can be applied to each screened condition within the Recommended Uniform Screening Panel (RUSP), recognizing that the data elements and measures will vary by condition. As an example, we applied the framework to sickle cell disease and phenylketonuria (PKU), two diverse conditions with different outcome measures and potential sources of data. Widespread and consistent application of this framework across state NBS and child health systems is envisioned as useful to standardize approaches to assessment of outcomes and for continuous improvement of the NBS and child health systems. PMID:27268406

  4. An operational structured decision making framework for ...

    EPA Pesticide Factsheets

    Pressure to develop an operational framework for decision makers to employ the concepts of ecosystem goods and services for assessing changes to human well-being has been increasing since these concepts gained widespread notoriety after the Millennium Ecosystem Assessment Report. Many conceptual frameworks have been proposed, but most do not propose methodologies and tools to make this approach to decision making implementable. Building on common components of existing conceptual frameworks for ecosystem services and human well-being assessment, we apply a structured decision making approach to develop a standardized operational framework and suggest tools and methods for completing each step. The structured decision making approach consists of six steps: 1) clarify the decision context; 2) define objectives and evaluation criteria; 3) develop alternatives; 4) estimate consequences; 5) evaluate trade-offs and select; and 6) implement and monitor. These six steps include the following activities, and suggested tools, when applied to ecosystem goods and services and human well-being conceptual frameworks: 1) characterization of decision-specific human beneficiaries using the Final Ecosystem Goods and Services (FEGS) approach and Classification System (FEGS-CS); 2) determination of beneficiaries' relative priorities for human well-being domains in the Human Well-Being Index (HWBI) through stakeholder engagement and identification of beneficiary-relevant metrics of FEGS using the Nat

  5. Analysing arrangements for cross-border mobility of patients in the European Union: a proposal for a framework.

    PubMed

    Legido-Quigley, Helena; Glinos, Irene A; Baeten, Rita; McKee, Martin; Busse, Reinhard

    2012-11-01

    This paper proposes a framework for analyzing arrangements set up to facilitate cross-border mobility of patients in the European Union. Exploiting both conceptual analysis and data from a range of case studies carried out in a number of European projects, and building on Walt and Gilson's model of policy analysis, the framework consists of five major components, each with a subset of categories or issues: (1) The actors directly and indirectly involved in setting up and promoting arrangements, (2) the content of the arrangements, classified into four categories (e.g. purchaser-provider and provider-provider or joint cross-border providers), (3) the institutional framework of the arrangements (including the underlying European and national legal frameworks, health systems' characteristics and payment mechanisms), (4) the processes that have led to the initiation and continuation, or cessation, of arrangements, (5) contextual factors (e.g. political or cultural) that impact on cross-border patient mobility and thus arrangements to facilitate them. The framework responds to what is a clearly identifiable demand for a means to analyse these interrelated concepts and dimensions. We believe that it will be useful to researchers studying cross-border collaborations and policy makers engaging in them. Crown Copyright © 2012. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Devising a consensus definition and framework for non-technical skills in healthcare to support educational design: A modified Delphi study.

    PubMed

    Gordon, Morris; Baker, Paul; Catchpole, Ken; Darbyshire, Daniel; Schocken, Dawn

    2015-01-01

    Non-technical skills are a subset of human factors that focus on the individual and promote safety through teamwork and awareness. There is no widely adopted competency- or outcome-based framework for non-technical skills training in healthcare. The authors set out to devise such a framework using a modified Delphi approach. An exhaustive list of published and team-suggested items was presented to the expert panel for ranking and to propose a definition. In the second round, a focused list was presented, as well as the proposed definition elements. The finalised framework was sent to the panel for review. Sixteen experts participated. The final framework consists of 16 competencies for all and eight specific competencies for team leaders. The consensus definition describes non-technical skills as "a set of social (communication and team work) and cognitive (analytical and personal behaviour) skills that support high quality, safe, effective and efficient inter-professional care within the complex healthcare system". Through the work of an international expert panel, the authors have produced a new competency framework that is not discipline-specific and can be used by curriculum developers, educational innovators and clinical teachers to support developments in the field.

  7. Reliability-based design optimization of reinforced concrete structures including soil-structure interaction using a discrete gravitational search algorithm and a proposed metamodel

    NASA Astrophysics Data System (ADS)

    Khatibinia, M.; Salajegheh, E.; Salajegheh, J.; Fadaee, M. J.

    2013-10-01

    A new discrete gravitational search algorithm (DGSA) and a metamodelling framework are introduced for reliability-based design optimization (RBDO) of reinforced concrete structures. The RBDO of structures with soil-structure interaction (SSI) effects is investigated in accordance with performance-based design. The proposed DGSA is based on the standard gravitational search algorithm (GSA) and optimizes the structural cost under deterministic and probabilistic constraints. The Monte-Carlo simulation (MCS) method is considered the most reliable method for estimating reliability probabilities. In order to reduce the computational time of MCS, the proposed metamodelling framework is employed to predict the responses of the SSI system in the RBDO procedure. The metamodel consists of a weighted least squares support vector machine (WLS-SVM) with a wavelet kernel function, which is called WWLS-SVM. Numerical results demonstrate the efficiency and computational advantages of DGSA and the proposed metamodel for RBDO of reinforced concrete structures.
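
    A compact sketch of the metamodel idea: a least-squares SVM fit reduces to a single linear solve, shown here with a Morlet-type wavelet kernel. The kernel form, hyperparameters, and toy data are assumptions; the paper's WWLS-SVM details may differ.

    ```python
    import numpy as np

    def wavelet_kernel(X, Z, a=1.0):
        """Morlet-type wavelet kernel, a common choice for wavelet LS-SVMs."""
        diff = X[:, None, :] - Z[None, :, :]
        return np.prod(np.cos(1.75 * diff / a) * np.exp(-diff**2 / (2 * a**2)),
                       axis=-1)

    def lssvm_fit(X, y, gamma=10.0, a=1.0):
        """LS-SVM regression: one linear system instead of a QP."""
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0; A[1:, 0] = 1.0
        A[1:, 1:] = wavelet_kernel(X, X, a) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.r_[0.0, y])
        return sol[0], sol[1:]                    # bias b, coefficients alpha

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, (40, 2))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)   # toy structural response
    b, alpha = lssvm_fit(X, y)
    pred = wavelet_kernel(np.array([[0.5, 0.0]]), X) @ alpha + b
    print("surrogate prediction at (0.5, 0): %.3f" % pred[0])  # ~ sin(0.5)
    ```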

  8. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter, we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised without any prior assumptions. Experimental results on airborne hyperspectral data demonstrate that the proposed method adapts to anomalies over a large range of sizes and is well suited for parallel processing.
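
    The collaborative-coding core can be sketched as a ridge-regularized representation of a test pixel over its local background, with the reconstruction residual as the anomaly score; the dimensions, regularization weight, and synthetic data below are invented for illustration.

    ```python
    import numpy as np

    def cr_anomaly_score(pixel, background, lam=0.01):
        """Collaborative representation: approximate the test pixel as a
        regularized combination of its local background; a large residual
        means the pixel cannot be explained by the background."""
        A = background.T                              # (bands, neighbors)
        alpha = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ pixel)
        return np.linalg.norm(pixel - A @ alpha)

    rng = np.random.default_rng(2)
    bg = rng.normal(0.5, 0.02, (50, 100))             # 50 neighbors x 100 bands
    normal_px = rng.normal(0.5, 0.02, 100)
    anomaly_px = rng.normal(0.9, 0.02, 100)
    print("normal:  %.3f" % cr_anomaly_score(normal_px, bg))
    print("anomaly: %.3f" % cr_anomaly_score(anomaly_px, bg))
    ```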

  9. Multi-sparse dictionary colorization algorithm based on the feature classification and detail enhancement

    NASA Astrophysics Data System (ADS)

    Yan, Dan; Bai, Lianfa; Zhang, Yi; Han, Jing

    2018-02-01

    To address the missing-detail and performance problems of colorization based on sparse representation, we propose a conceptual model framework for colorizing gray-scale images, and then a multi-sparse dictionary colorization algorithm based on feature classification and detail enhancement (CEMDC) is proposed based on this framework. The algorithm achieves a natural colorized effect for a gray-scale image that is consistent with human vision. First, the algorithm establishes a multi-sparse dictionary classification colorization model. Then, to improve the accuracy rate of the classification, a corresponding local constraint algorithm is proposed. Finally, we propose a detail enhancement method based on the Laplacian pyramid, which is effective in solving the problem of missing details and improves the speed of image colorization. In addition, the algorithm not only colorizes visible gray-scale images, but can also be applied to other areas, such as color transfer between color images, colorizing gray fusion images, and infrared images.
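
    A generic sketch of Laplacian-pyramid detail enhancement (here a resolution-preserving variant): band-pass detail layers are extracted by repeated Gaussian blurring and amplified on re-synthesis. The level count and boost factor are illustrative, not the paper's parameterization.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def enhance_details(image, levels=3, boost=1.5):
        """Boost the band-pass (detail) layers of a Laplacian-style decomposition."""
        current, laplacians = image.astype(float), []
        for _ in range(levels):
            blurred = gaussian_filter(current, sigma=2.0)
            laplacians.append(current - blurred)    # detail (band-pass) layer
            current = blurred                       # coarser approximation
        out = current
        for lap in reversed(laplacians):
            out = out + boost * lap                 # amplified detail on re-synthesis
        return out

    img = np.random.rand(64, 64)
    print("variance before/after: %.3f / %.3f"
          % (np.var(img), np.var(enhance_details(img))))
    ```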

  10. A national framework for disaster health education in Australia.

    PubMed

    FitzGerald, Gerard J; Aitken, Peter; Arbon, Paul; Archer, Frank; Cooper, David; Leggat, Peter; Myers, Colin; Robertson, Andrew; Tarrant, Michael; Davis, Elinor R

    2010-01-01

    Recent events have heightened awareness of disaster health issues and the need to prepare the health workforce to plan for and respond to major incidents. This has been reinforced at an international level by the World Association for Disaster and Emergency Medicine, which has proposed an international educational framework. The aim of this paper is to outline the development of a national educational framework for disaster health in Australia. The framework was developed on the basis of the literature and the previous experience of members of a National Collaborative for Disaster Health Education and Research. The Collaborative was brought together in a series of workshops and teleconferences, utilizing a modified Delphi technique to finalize the content at each level of the framework and to assign a value to the inclusion of that content at the various levels. The framework identifies seven educational levels along with educational outcomes for each level. The framework also identifies the recommended contents at each level and assigns a rating of depth for each component. The framework is not intended as a detailed curriculum, but rather as a guide for educationalists to develop specific programs at each level. This educational framework will provide an infrastructure around which future educational programs in Disaster Health in Australia may be designed and delivered. It will permit improved articulation for students between the various levels and greater consistency between programs so that operational responders may have a consistent language and operational approach to the management of major events.

  11. Embedded sparse representation of fMRI data via group-wise dictionary optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Dajiang; Lin, Binbin; Faskowitz, Joshua; Ye, Jieping; Thompson, Paul M.

    2016-03-01

    Sparse learning enables dimension reduction and efficient modeling of high dimensional signals and images, but it may need to be tailored to best suit specific applications and datasets. Here we used sparse learning to efficiently represent functional magnetic resonance imaging (fMRI) data from the human brain. We propose a novel embedded sparse representation (ESR), to identify the most consistent dictionary atoms across different brain datasets via an iterative group-wise dictionary optimization procedure. In this framework, we introduced additional criteria to make the learned dictionary atoms more consistent across different subjects. We successfully identified four common dictionary atoms that follow the external task stimuli with very high accuracy. After projecting the corresponding coefficient vectors back into the 3-D brain volume space, the spatial patterns are also consistent with traditional fMRI analysis results. Our framework reveals common features of brain activation in a population, as a new, efficient fMRI analysis method.
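
    A simplified stand-in for the group-wise step, assuming scikit-learn: learn a sparse dictionary per subject and match atoms across subjects by correlating their time courses. The iterative consistency criteria of the actual ESR procedure are omitted, and the data are random placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.default_rng(3)
    # Hypothetical fMRI data per subject: time points x voxels (tiny for the sketch).
    subjects = [rng.standard_normal((40, 200)) for _ in range(3)]

    dicts = []
    for data in subjects:
        model = MiniBatchDictionaryLearning(n_components=10, alpha=1.0,
                                            random_state=0)
        model.fit(data.T)                   # atoms live in time-course space
        dicts.append(model.components_)     # (10 atoms, 40 time points)

    # Match atoms across subjects by absolute correlation of time courses.
    ref = dicts[0]
    for k, other in enumerate(dicts[1:], start=2):
        corr = np.abs(np.corrcoef(ref, other)[:10, 10:])   # atom-to-atom |r|
        print("subject 1 vs %d, best matches:" % k, corr.max(axis=1).round(2))
    ```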

  12. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directory, and IDS are some examples. Enforcement of network security policies through low level security mechanisms faces some essential difficulties; consistency, verification, and maintenance are the major ones. One approach to overcome these difficulties is to automate the process of translating a high level security policy into low level security mechanisms. This paper introduces a framework for an automation process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This proposed model is based on organization-based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies, such as auditing policy. In addition, the proposed model enables matching of each rule of the high level security policy with the corresponding ones of the low level security policy. In the second phase of the proposed framework, the high level security policy is mapped into the network security model; this phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
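
    A toy version of the final translation step, mapping one abstract OrBAC-style rule to concrete firewall rules; the rule schema, inventory, and generated iptables strings are illustrative only, not the paper's formal model.

    ```python
    # One high-level, OrBAC-style rule: abstract role, activity, and decision.
    rule = {
        "role": "web-servers",        # abstract subject (resolved to addresses)
        "activity": "serve-http",     # abstract action (resolved to proto/port)
        "decision": "permit",
    }
    inventory = {"web-servers": ["10.0.1.10", "10.0.1.11"],
                 "serve-http": "tcp/80"}

    proto, port = inventory[rule["activity"]].split("/")
    target = "ACCEPT" if rule["decision"] == "permit" else "DROP"
    for addr in inventory[rule["role"]]:      # concretize the abstract role
        print("iptables -A FORWARD -p %s -d %s --dport %s -j %s"
              % (proto, addr, port, target))
    ```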

  13. Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.

    PubMed

    Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank

    2017-10-01

    Introduction: The frequency of disasters is increasing around the world, with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported, with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events, and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem: While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting, with a view to better understanding the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping review of the peer-reviewed and "grey" literature to identify current evaluation frameworks and typologies used in the disaster setting; and (2) development of a structure that includes the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected. The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation in general and in disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.

  14. Evaluation of the causal framework used for setting national ambient air quality standards.

    PubMed

    Goodman, Julie E; Prueitt, Robyn L; Sax, Sonja N; Bailey, Lisa A; Rhomberg, Lorenz R

    2013-11-01

    A scientifically sound assessment of the potential hazards associated with a substance requires a systematic, objective and transparent evaluation of the weight of evidence (WoE) for causality of health effects. We critically evaluated the current WoE framework for causal determination used in the United States Environmental Protection Agency's (EPA's) assessments of the scientific data on air pollutants for the National Ambient Air Quality Standards (NAAQS) review process, including its methods for literature searches; study selection, evaluation and integration; and causal judgments. The causal framework used in recent NAAQS evaluations has many valuable features, but it could be more explicit in some cases, and some features are missing that should be included in every WoE evaluation. Because of this, it has not always been applied consistently in evaluations of causality, leading to conclusions that are not always supported by the overall WoE, as we demonstrate using EPA's ozone Integrated Science Assessment as a case study. We propose additions to the NAAQS causal framework based on best practices gleaned from a previously conducted survey of available WoE frameworks. A revision of the NAAQS causal framework so that it more closely aligns with these best practices and the full and consistent application of the framework will improve future assessments of the potential health effects of criteria air pollutants by making the assessments more thorough, transparent, and scientifically sound.

  15. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems

    PubMed Central

    Hou, Kun-Mean; Zhang, Zhan

    2017-01-01

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare their reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against decision complexity and random failures. This paper also shows a way to simplify the evaluation of dynamic systems by improving the composability and compositionality of the subsystems. PMID:29120357
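
    A minimal illustration of an FSM-based actor with a mailbox whose events are replayed in (relative) timestamp order, so that every replica observes the same event order for causal reasoning. The interface below is a didactic reduction of the paper's formal model, with invented states and events.

    ```python
    class Actor:
        """FSM-based actor: states, guarded transitions, and a mailbox."""
        def __init__(self, name, transitions, state="idle"):
            self.name, self.state = name, state
            self.transitions = transitions       # (state, event) -> next state
            self.mailbox = []

        def send(self, event, timestamp):
            self.mailbox.append((timestamp, event))

        def step(self):
            # Process events in timestamp order so all observers agree
            # on the event order, regardless of arrival order.
            for _, event in sorted(self.mailbox):
                self.state = self.transitions.get((self.state, event), self.state)
            self.mailbox.clear()
            return self.state

    cooler = Actor("cooler", {("idle", "overheat"): "cooling",
                              ("cooling", "normal"): "idle"})
    cooler.send("normal", 7); cooler.send("overheat", 3)   # out-of-order arrival
    print(cooler.step())    # replayed in time order -> ends in 'idle'
    ```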

  16. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems.

    PubMed

    Zhou, Peng; Zuo, Decheng; Hou, Kun-Mean; Zhang, Zhan

    2017-11-09

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare their reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against decision complexity and random failures. This paper also shows a way to simplify the evaluation of dynamic systems by improving the composability and compositionality of the subsystems.

  17. Skin Lesion Analysis towards Melanoma Detection Using Deep Learning Network

    PubMed Central

    2018-01-01

    Skin lesions are a severe global health problem. Early detection of melanoma in dermoscopy images significantly increases the survival rate. However, accurate recognition of melanoma is extremely challenging due to low contrast between lesions and skin, visual similarity between melanoma and non-melanoma lesions, etc. Hence, reliable automatic detection of skin tumors is very useful for increasing the accuracy and efficiency of pathologists. In this paper, we propose two deep learning methods to address the three main tasks emerging in the area of skin lesion image processing, i.e., lesion segmentation (task 1), lesion dermoscopic feature extraction (task 2) and lesion classification (task 3). A deep learning framework consisting of two fully convolutional residual networks (FCRN) is proposed to simultaneously produce the segmentation result and the coarse classification result. A lesion index calculation unit (LICU) is developed to refine the coarse classification results by calculating the distance heat-map. A straightforward CNN is proposed for the dermoscopic feature extraction task. The proposed deep learning frameworks were evaluated on the ISIC 2017 dataset. Experimental results show the promising accuracies of our frameworks: 0.753 for task 1, 0.848 for task 2 and 0.912 for task 3. PMID:29439500

  18. Vehicle logo recognition using multi-level fusion model

    NASA Astrophysics Data System (ADS)

    Ming, Wei; Xiao, Jianli

    2018-04-01

    Vehicle logo recognition plays an important role in manufacturer identification and vehicle recognition. This paper proposes a new vehicle logo recognition algorithm. It has a hierarchical framework, which consists of two fusion levels. At the first level, a feature fusion model is employed to map the original features to a higher dimension feature space. In this space, the vehicle logos become more recognizable. At the second level, a weighted voting strategy is proposed to promote the accuracy and the robustness of the recognition results. To evaluate the performance of the proposed algorithm, extensive experiments are performed, which demonstrate that the proposed algorithm can achieve high recognition accuracy and work robustly.
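
    A compact sketch of the two fusion levels, assuming scikit-learn: features are concatenated at the first level, and classifier probabilities are combined by weighted voting at the second. The descriptors, weights, and classifiers below are invented for illustration; the paper's fusion model maps features to a higher-dimensional space rather than simply concatenating them.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(4)
    # Hypothetical logo descriptors: two feature types per sample, 5 logo classes.
    hog, lbp = rng.standard_normal((200, 32)), rng.standard_normal((200, 16))
    labels = rng.integers(0, 5, 200)

    # Level 1: feature fusion (here, concatenation into a richer space).
    fused = np.hstack([hog, lbp])

    # Level 2: weighted voting across classifiers trained on the fused features.
    clfs = [LogisticRegression(max_iter=500).fit(fused, labels),
            KNeighborsClassifier(5).fit(fused, labels)]
    weights = [0.6, 0.4]                   # e.g., validation-accuracy weights
    votes = sum(w * clf.predict_proba(fused[:1]) for w, clf in zip(weights, clfs))
    print("fused decision:", int(np.argmax(votes)))
    ```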

  19. Learning, Behaviour and Reaction Framework: A Model for Training Raters to Improve Assessment Quality

    ERIC Educational Resources Information Center

    Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji

    2017-01-01

    This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…

  20. Toward the Development of Socio-Metacognitive Expertise: An Approach to Developing Collaborative Competence

    ERIC Educational Resources Information Center

    Borge, Marcela; White, Barbara

    2016-01-01

    We proposed and evaluated an instructional framework for increasing students' ability to understand and regulate collaborative interactions called Co-Regulated Collaborative Learning (CRCL). In this instantiation of CRCL, models of collaborative competence were articulated through a set of socio-metacognitive roles. Our population consisted of 28…

  1. Legacies in material flux: Structural changes before long-term studies

    Treesearch

    D.J. Bain; M.B. Green; J. Campbell; J. Chamblee; S. Chaoka; J. Fraterrigo; S. Kaushal; S. Martin; T. Jordan; T. Parolari; B. Sobczak; D. Weller; W. M. Wollheim; E. Boose; J. Duncan; G. Gettel; B. Hall; P. Kumar; J. Thompson; J. Vose; E. Elliott; D. Leigh

    2012-01-01

    Legacy effects of past land use and disturbance are increasingly recognized, yet consistent definitions of and criteria for defining them do not exist. To address this gap in biological- and ecosystem-assessment frameworks, we propose a general metric for evaluating potential legacy effects, which is computed by normalizing altered system function persistence with...

  2. A thermo-chemo-mechanically coupled constitutive model for curing of glassy polymers

    NASA Astrophysics Data System (ADS)

    Sain, Trisha; Loeffel, Kaspar; Chester, Shawn

    2018-07-01

    Curing of a polymer is the process through which a liquid polymer transitions into a solid capable of bearing mechanical loads. The curing process is a coupled thermo-chemo-mechanical conversion process that requires a thorough understanding of the system behavior to predict the cure-dependent mechanical behavior of the solid polymer. In this paper, a thermodynamically consistent, frame-indifferent, thermo-chemo-mechanically coupled continuum-level constitutive framework is proposed for thermally cured glassy polymers. The constitutive framework considers the thermodynamics of chemical reactions as well as the material behavior of a glassy polymer. A stress-free intermediate configuration is introduced within a finite deformation setting to capture the formation of the network in a stress-free configuration. This work considers a definition of the degree of cure based on the chemistry of the curing reactions. A simplified version of the proposed model has been numerically implemented, and simulations are used to understand the capabilities of the model and framework.
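
    One common way to write a chemistry-based degree of cure, shown purely as an illustration of the thermo-chemical coupling (the paper's reaction model may differ): an Arrhenius-rate, nth-order evolution law for the degree of cure α ∈ [0, 1].

    ```latex
    % Illustrative n-th order thermal cure kinetics: \alpha is the degree of
    % cure, A a pre-exponential factor, E_a the activation energy, R the gas
    % constant, and T the absolute temperature.
    \frac{d\alpha}{dt} = A\,e^{-E_a/(RT)}\,(1-\alpha)^n
    ```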

  3. Towards a conceptual framework of OSH risk management in smart working environments based on smart PPE, ambient intelligence and the Internet of Things technologies.

    PubMed

    Podgórski, Daniel; Majchrzycka, Katarzyna; Dąbrowska, Anna; Gralewicz, Grzegorz; Okrasa, Małgorzata

    2017-03-01

    Recent developments in the domains of ambient intelligence (AmI), the Internet of Things, cyber-physical systems (CPS), ubiquitous/pervasive computing, etc., have led to numerous attempts to apply ICT solutions in the occupational safety and health (OSH) area. A literature review reveals a wide range of examples of smart materials, smart personal protective equipment and other AmI applications that have been developed to improve workers' safety and health. Because the use of these solutions modifies work methods, increases the complexity of production processes and introduces high dynamism into the resulting smart working environments (SWE), a new conceptual framework for dynamic OSH management in SWE is called for. The proposed framework is based on a new paradigm of OSH risk management consisting of real-time risk assessment and the capacity to monitor the risk level of each worker individually. A rationale for context-based reasoning in SWE and a corresponding model of the SWE-dedicated CPS are also proposed.

  4. A World Health Organization field trial assessing a proposed ICD-11 framework for classifying patient safety events.

    PubMed

    Forster, Alan J; Bernard, Burnand; Drösler, Saskia E; Gurevich, Yana; Harrison, James; Januel, Jean-Marie; Romano, Patrick S; Southern, Danielle A; Sundararajan, Vijaya; Quan, Hude; Vanderloo, Saskia E; Pincus, Harold A; Ghali, William A

    2017-08-01

    To assess the utility of the World Health Organization's (WHO) proposed International Classification of Diseases (ICD) framework for classifying patient safety events. Independent classification of 45 clinical vignettes using a web-based platform. The WHO's multi-disciplinary Quality and Safety Topic Advisory Group. The framework consists of three concepts: harm, cause and mode. We defined a concept as 'classifiable' if more than half of the raters could assign an ICD-11 code for the case. We evaluated reasons why cases were nonclassifiable using a qualitative approach. Harm was classifiable in 31 of 45 cases (69%). Of these, only 20 could be classified according to cause and mode. Classifiable cases were those in which a clear cause-and-effect relationship existed (e.g. medication administration error). Nonclassifiable cases were those without clear causal attribution (e.g. pressure ulcer). Of the 14 cases in which harm was not evident (31%), only 5 could be classified according to cause and mode and represented potential adverse events. Overall, nine cases (20%) were nonclassifiable using the three-part patient safety framework and contained significant ambiguity in the relationship between healthcare outcome and putative cause. The proposed framework enabled classification of the majority of patient safety events. Cases in which potentially harmful events did not cause harm were not classifiable; additional code categories within the ICD-11 are one proposal to address this concern. Cases with ambiguity in the cause-and-effect relationship between healthcare processes and outcomes remain difficult to classify. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  5. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, and thus differ significantly from traditional all-chemical satellites in orbit-raising, station-keeping, radiation damage protection, power budget, etc. The design optimization task for an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem poses major challenges in disciplinary modeling and efficient optimization strategy. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules: MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then devoted to multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem at moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. The proposed surrogate assisted MDO framework can also provide a valuable reference for the design of other all-electric spacecraft.

  6. Policy Framework for Covering Preventive Services Without Cost Sharing: Saving Lives and Saving Money?

    PubMed

    Chen, Stephanie C; Pearson, Steven D

    2016-08-01

    The US Affordable Care Act mandates that private insurers cover a list of preventive services without cost sharing. The list is determined by 4 expert committees that evaluate the overall health effect of preventive services. We analyzed the process by which the expert committees develop their recommendations. Each committee uses different criteria to evaluate preventive services, and none of them considers cost systematically. We propose that the existing committees adopt consistent evidence review methodologies and expand the scope of preventive services reviewed, and that a separate advisory committee be established to integrate economic considerations into the final selection of free preventive services. The comprehensive framework and associated criteria are intended to help policy makers develop a more evidence-based, consistent, and ethically sound approach in the future.

  7. Rating counselor-client behavior in online counseling: development and preliminary psychometric properties of the Counseling Progress and Depth Rating Instrument.

    PubMed

    Bagraith, Karl; Chardon, Lydia; King, Robert John

    2010-11-01

    Although there are widely accepted and utilized models and frameworks for nondirective counseling (NDC), there is little in the way of tools or instruments designed to assist in determining whether or not a specific episode of counseling is consistent with the stated model or framework. The Counseling Progress and Depth Rating Instrument (CPDRI) was developed to evaluate counselor integrity in the use of Egan's skilled helper model in online counseling. The instrument was found to have sound internal consistency, good interrater reliability, and good face and convergent validity. The CPDRI is, therefore, proposed as a useful tool to facilitate investigation of the degree to which counselors adhere to and apply a widely used approach to NDC.

  8. Probability-Based Recognition Framework for Underwater Landmarks Using Sonar Images †.

    PubMed

    Lee, Yeongjun; Choi, Jinwoo; Ko, Nak Yong; Choi, Hyun-Taek

    2017-08-24

    This paper proposes a probability-based framework for recognizing underwater landmarks using sonar images. Current recognition methods use a single image, which does not provide reliable results because of weaknesses of sonar images such as an unstable acoustic source, heavy speckle noise, low resolution, and a single channel. If, instead, the status of an object (its existence and identity, or name) is continuously evaluated over consecutive sonar images by a stochastic method, the recognition result comes with an uncertainty estimate and is more suitable for various applications. Our proposed framework consists of three steps: (1) candidate selection, (2) continuity evaluation, and (3) Bayesian feature estimation. Two probability methods, particle filtering and Bayesian feature estimation, are used to repeatedly estimate the continuity and features of objects in consecutive images. Thus, the status of the object is repeatedly predicted and updated by a stochastic method. Furthermore, we develop an artificial landmark with increased detectability by an imaging sonar, designed around the characteristics of acoustic waves, such as their instability and the dependence of reflection on the roughness of the reflector surface. The proposed method is verified in basin experiments, and the results are presented.
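
    A minimal sketch of the repeated stochastic status update described above, assuming a toy likelihood table and class names that are not from the paper; the full method additionally uses particle filtering for the continuity-evaluation step.

      import numpy as np

      # Recursive Bayesian update of a candidate object's identity over
      # consecutive sonar frames. The classes and likelihoods are assumed.
      CLASSES = ["landmark_A", "landmark_B", "clutter"]

      # P(observed feature | true class); rows: true class, cols: feature index
      LIKELIHOOD = np.array([
          [0.70, 0.20, 0.10],   # landmark_A
          [0.20, 0.70, 0.10],   # landmark_B
          [0.34, 0.33, 0.33],   # clutter: features are uninformative
      ])

      def update_belief(belief, observed_feature):
          # Bayes rule: posterior is proportional to likelihood times prior.
          posterior = LIKELIHOOD[:, observed_feature] * belief
          return posterior / posterior.sum()

      belief = np.full(len(CLASSES), 1.0 / len(CLASSES))   # uniform prior
      for feature in [0, 0, 1, 0, 0]:   # features detected in consecutive images
          belief = update_belief(belief, feature)
      print(dict(zip(CLASSES, np.round(belief, 3))))   # belief concentrates on landmark_A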

  9. Tiered Approach to Resilience Assessment.

    PubMed

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and available resources, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts in resilience quantification method development, but instead to provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and policies.

  10. An Exemplar-Based Multi-View Domain Generalization Framework for Visual Recognition.

    PubMed

    Niu, Li; Li, Wen; Xu, Dong; Cai, Jianfei

    2018-02-01

    In this paper, we propose a new exemplar-based multi-view domain generalization (EMVDG) framework for visual recognition by learning robust classifiers that generalize well to an arbitrary target domain based on training samples with multiple types of features (i.e., multi-view features). In this framework, we aim to address two issues simultaneously. First, the distribution of training samples (i.e., the source domain) is often considerably different from that of testing samples (i.e., the target domain), so the performance of classifiers learnt on the source domain may drop significantly on the target domain. Moreover, the testing data are often unseen during the training procedure. Second, when the training data are associated with multi-view features, the recognition performance can be further improved by exploiting the relation among multiple types of features. To address the first issue, considering that fusing multiple SVM classifiers has been shown to enhance domain generalization ability, we build our EMVDG framework upon exemplar SVMs (ESVMs), in which a set of ESVM classifiers is learnt, each trained on one positive training sample and all the negative training samples. When the source domain contains multiple latent domains, the learnt ESVM classifiers are expected to be grouped into multiple clusters. To address the second issue, we propose two approaches under the EMVDG framework based on the consensus principle and the complementary principle, respectively. Specifically, we propose an EMVDG_CO method that adds a co-regularizer to enforce consistency between the cluster structures of ESVM classifiers on different views, based on the consensus principle. Inspired by multiple kernel learning, we also propose an EMVDG_MK method that fuses the ESVM classifiers from different views based on the complementary principle. In addition, we extend our EMVDG framework to an exemplar-based multi-view domain adaptation (EMVDA) framework for the case where unlabeled target domain data are available during the training procedure. The effectiveness of our EMVDG and EMVDA frameworks for visual recognition is clearly demonstrated by comprehensive experiments on three benchmark data sets.

  11. From Cues to Nudge: A Knowledge-Based Framework for Surveillance of Healthcare-Associated Infections.

    PubMed

    Shaban-Nejad, Arash; Mamiya, Hiroshi; Riazanov, Alexandre; Forster, Alan J; Baker, Christopher J O; Tamblyn, Robyn; Buckeridge, David L

    2016-01-01

    We propose an integrated semantic web framework consisting of formal ontologies, web services, a reasoner and a rule engine that together recommend the appropriate level of patient care based on defined semantic rules and guidelines. The classification of healthcare-associated infections within the HAIKU (Hospital Acquired Infections - Knowledge in Use) framework enables hospitals to consistently follow the standards along with their routine clinical practice and diagnosis coding, to improve quality of care and patient safety. The HAI ontology (HAIO) groups thousands of codes into a consistent hierarchy of concepts, along with relationships and axioms that capture knowledge on hospital-associated infections and complications, with a focus on the four major types: surgical site infection (SSI), catheter-associated urinary tract infection (CAUTI), hospital-acquired pneumonia, and bloodstream infection. Employing statistical inference, we use a set of heuristics to define rule axioms that improve SSI case detection. We also demonstrate how the occurrence of an SSI is identified using semantic e-triggers. The e-triggers will be used to improve our risk assessment of post-operative SSIs for patients undergoing certain types of surgery (e.g., coronary artery bypass graft (CABG) surgery).

  12. A segmentation editing framework based on shape change statistics

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Vicory, Jared; Styner, Martin; Pizer, Stephen

    2017-02-01

    Segmentation is a key task in medical image analysis because its accuracy significantly affects successive steps. Automatic segmentation methods often produce inadequate segmentations, which require the user to manually edit the produced segmentation slice by slice. Because such editing is time-consuming, an editing tool that enables the user to produce accurate segmentations by drawing only a sparse set of contours is needed. This paper describes such a framework as applied to a single object. Constrained by the additional information provided by the manually drawn contours, the proposed framework utilizes object shape statistics to transform the failed automatic segmentation into a more accurate version. Instead of modeling the object shape itself, the proposed framework utilizes shape change statistics, generated to capture the object deformation from the failed automatic segmentation to its corresponding correct segmentation. An optimization procedure is used to minimize an energy function that consists of two terms: an external contour match term and an internal shape change regularity term. The high accuracy of the proposed segmentation editing approach was confirmed by testing it on a simulated data set based on 10 in-vivo infant magnetic resonance brain data sets, using four similarity metrics. Segmentation results indicated that our method can provide efficient and adequately accurate segmentations (a Dice segmentation accuracy increase of 10%) from very sparse contours (only 10%), which promises to greatly decrease the work required from the user.
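
    The optimization step can be illustrated with a toy two-term energy over shape-change mode coefficients. The synthetic data, shape-change modes and weight lam below are assumptions for illustration, not the paper's learned statistics.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)
      n_points, n_modes, lam = 40, 5, 0.1
      base = rng.random((n_points, 2))                      # failed automatic segmentation
      modes = rng.normal(0, 0.05, (n_modes, n_points, 2))   # assumed shape-change statistics
      edited_idx = np.arange(0, n_points, 8)                # sparse user-drawn contour points
      target = base[edited_idx] + 0.1                       # where the user moved them

      def energy(w):
          shape = base + np.tensordot(w, modes, axes=1)         # deformed segmentation
          match = np.sum((shape[edited_idx] - target) ** 2)     # external contour match term
          regularity = lam * np.sum(w ** 2)                     # internal regularity term
          return match + regularity

      res = minimize(energy, np.zeros(n_modes))   # optimize the mode coefficients
      print("residual contour mismatch:", round(float(res.fun), 4))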

  13. Extendable supervised dictionary learning for exploring diverse and concurrent brain activities in task-based fMRI.

    PubMed

    Zhao, Shijie; Han, Junwei; Hu, Xintao; Jiang, Xi; Lv, Jinglei; Zhang, Tuo; Zhang, Shu; Guo, Lei; Liu, Tianming

    2018-06-01

    Recently, a growing body of studies has demonstrated the simultaneous existence of diverse brain activities, e.g., task-evoked dominant response activities, delayed response activities and intrinsic brain activities, under specific task conditions. However, the currently dominant task-based functional magnetic resonance imaging (tfMRI) analysis approach, i.e., the general linear model (GLM), may have difficulty discovering these diverse and concurrent brain responses sufficiently. This subtraction-based, model-driven approach focuses on the brain activities evoked directly by the task paradigm and thus likely overlooks other concurrent brain activities evoked during information processing. To deal with this problem, in this paper, we propose a novel hybrid framework, called extendable supervised dictionary learning (E-SDL), to explore diverse and concurrent brain activities under task conditions. A critical difference between the E-SDL framework and previous methods is that we systematically extend the basic task paradigm regressor into meaningful regressor groups to account for possible regressor variation during the information processing procedure in the brain. Applications of the proposed framework on five independent and publicly available tfMRI datasets from the Human Connectome Project (HCP) simultaneously revealed more meaningful group-wise consistent task-evoked networks and common intrinsic connectivity networks (ICNs). These results demonstrate the advantage of the proposed framework in identifying the diversity of concurrent brain activities in tfMRI datasets.

  14. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
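
    As a flavor of performance-model-based fusion (without the hierarchy), the sketch below fuses two raters' labels for one voxel using assumed confusion matrices; the actual method estimates these performance parameters from the data and organizes the labels hierarchically.

      import numpy as np

      # Each rater r has an assumed confusion matrix theta[r]:
      # theta[r][t, o] = P(rater r reports label o | true label t).
      LABELS = 3
      theta = np.array([
          [[0.9, 0.05, 0.05], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]],   # reliable rater
          [[0.6, 0.2, 0.2],   [0.2, 0.6, 0.2], [0.2, 0.2, 0.6]],   # mediocre rater
      ])
      prior = np.full(LABELS, 1.0 / LABELS)

      def fuse(observations):
          """observations[r] = label assigned by rater r to this voxel."""
          posterior = prior.copy()
          for r, obs in enumerate(observations):
              posterior *= theta[r][:, obs]   # likelihood of obs for each true label
          return posterior / posterior.sum()

      print(fuse([0, 1]))   # the reliable rater's vote dominates the posterior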

  15. Floodplain Mapping for the Continental United States Using Machine Learning Techniques and Watershed Characteristics

    NASA Astrophysics Data System (ADS)

    Jafarzadegan, K.; Merwade, V.; Saksena, S.

    2017-12-01

    Using conventional hydrodynamic methods for floodplain mapping in large-scale and data-scarce regions is problematic due to the high cost of these methods, the lack of reliable data and uncertainty propagation. In this study a new framework is proposed to generate 100-year floodplains for any gauged or ungauged watershed across the United States (U.S.). This framework uses Flood Insurance Rate Maps (FIRMs) together with topographic, climatic and land use data, which are freely available for the entire U.S., for floodplain mapping. The framework consists of three components: a Random Forest classifier for watershed classification, a Probabilistic Threshold Binary Classifier (PTBC) for generating the floodplains, and a lookup table for linking the Random Forest classifier to the PTBC. The effectiveness and reliability of the proposed framework are tested on 145 watersheds from various geographical locations in the U.S. The validation results show that around 80 percent of the watersheds are predicted well, 14 percent have an acceptable fit, and fewer than five percent are predicted poorly compared to FIRMs. Another advantage of this framework is its ability to generate floodplains for all small rivers and tributaries. Due to its high accuracy and efficiency, the framework can be used as a preliminary decision-making tool to generate 100-year floodplain maps for data-scarce regions and for all tributaries where hydrodynamic methods are difficult to use.
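
    A hedged sketch of the pipeline: a Random Forest assigns a watershed class, a lookup table maps the class to a probability threshold, and cells at or above the threshold are labeled floodplain. The feature set, thresholds and synthetic training data are assumptions for illustration only.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(145, 5))      # assumed watershed descriptors (slope, relief, ...)
      y_train = rng.integers(0, 4, size=145)   # watershed class labels

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_train, y_train)

      # Lookup table: watershed class -> probability threshold (assumed values).
      CLASS_TO_THRESHOLD = {0: 0.35, 1: 0.45, 2: 0.50, 3: 0.60}

      def map_floodplain(watershed_features, cell_flood_probabilities):
          """Label raster cells as floodplain (1) or not (0)."""
          wclass = int(clf.predict(watershed_features.reshape(1, -1))[0])
          threshold = CLASS_TO_THRESHOLD[wclass]
          return (cell_flood_probabilities >= threshold).astype(int)

      cells = rng.random(10)   # per-cell flood probabilities, e.g. from terrain indices
      print(map_floodplain(rng.normal(size=5), cells))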

  16. A framework for automatic feature extraction from airborne light detection and ranging data

    NASA Astrophysics Data System (ADS)

    Yan, Jianhua

    Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometric objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted increasing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometric information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometric objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed for this adjustment. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm that finds the minimum energy of the 2D snake problem. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrated that the proposed framework achieves very good performance.
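
    A simplified version of the progressive morphological filter, applied to a rasterized elevation grid for clarity; the window sizes and elevation thresholds are illustrative assumptions, and the original algorithm operates on irregularly spaced points.

      import numpy as np
      from scipy.ndimage import grey_opening

      def progressive_morphological_filter(z, windows=(3, 9, 21), thresholds=(0.5, 1.5, 3.0)):
          """Return a boolean mask of ground cells in elevation grid z."""
          ground = np.ones(z.shape, dtype=bool)
          surface = z.copy()
          for w, t in zip(windows, thresholds):
              opened = grey_opening(surface, size=(w, w))   # erosion then dilation
              # Cells rising more than the threshold above the opened surface
              # are non-ground (vehicles, vegetation, buildings).
              ground &= (surface - opened) <= t
              surface = opened
          return ground

      z = np.zeros((50, 50))
      z[20:30, 20:30] = 8.0             # a building block on flat terrain
      mask = progressive_morphological_filter(z)
      print(mask[25, 25], mask[0, 0])   # False (building), True (ground)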

  17. The combination of an Environmental Management System and Life Cycle Assessment at the territorial level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazzi, Anna; Toniolo, Sara; Catto, Stella

    A framework to include a Life Cycle Assessment in the significance evaluation of the environmental aspects of an Environmental Management System has been studied for some industrial sectors, but there is a literature gap at the territorial level, where indirect impact assessment is crucial. To address this gap, our research proposes Life Cycle Assessment as a framework to assess the environmental aspects of public administration within an Environmental Management System applied at the territorial level. This research is structured in two parts: the design of a new methodological framework and a pilot application for an Italian municipality. The methodological framework designed supports the Initial Environmental Analysis at the territorial level thanks to the results derived from the impact assessment phase. The pilot application in an EMAS-registered Italian municipality demonstrates the applicability of the framework and its effectiveness in evaluating the environmental impact assessment for direct and indirect aspects. Through the discussion of the results, we underline the growing knowledge derived from this research in terms of the reproducibility and consistency of the criteria used to define the significance of the direct and indirect environmental aspects for a local public administration. - Highlights: • The combination of an Environmental Management System and LCA is studied. • A methodological framework is elaborated and tested at the territorial level. • Life Cycle Impact Assessment supports the evaluation of aspect significance. • The framework assures consistency of evaluation criteria on the studied territory.

  18. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, which can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
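
    The two steps can be illustrated with a toy particle filter: weight prior particles by a measurement likelihood, resample, then propagate each particle to a failure threshold to obtain a remaining-useful-life (RUL) distribution. The linear degradation model, noise levels and threshold are assumptions, not the rover test-bed's model.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 2000
      FAILURE_LEVEL = 10.0   # assumed damage level at which the system fails

      # Step 1: estimate the current state from a noisy measurement (Bayesian tracking).
      particles_state = rng.normal(4.0, 0.5, N)    # prior over current damage
      particles_rate = rng.normal(0.20, 0.05, N)   # prior over damage growth per cycle
      measurement = 4.2
      weights = np.exp(-0.5 * ((particles_state - measurement) / 0.3) ** 2)
      weights /= weights.sum()
      idx = rng.choice(N, size=N, p=weights)       # resample posterior particles
      state, rate = particles_state[idx], particles_rate[idx]

      # Step 2: propagate each particle until it crosses the failure threshold.
      rul = (FAILURE_LEVEL - state) / np.maximum(rate, 1e-6)
      print(f"RUL median: {np.median(rul):.1f} cycles, "
            f"90% interval: [{np.percentile(rul, 5):.1f}, {np.percentile(rul, 95):.1f}]")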

  19. A proposal for a computer-based framework of support for public health in the management of biological incidents: the Czech Republic experience.

    PubMed

    Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel

    2012-11-01

    Biological incidents jeopardising public health require decision-making with one dominant feature: complexity. Public health decision-makers therefore need appropriate support. Based on an analogy with business intelligence (BI) principles, a contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, an analysis of potential inputs to the framework is conducted, and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Three prototypes, selected on the basis of the overall framework scheme, were then developed, tested and evaluated by a group of experts: an ontology prototype linked with an inference engine, a multi-agent-based model focusing on the simulation of an environment, and an expert-system prototype. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public-health-focused researchers and practitioners.

  20. Designing a robust activity recognition framework for health and exergaming using wearable sensors.

    PubMed

    Alshurafa, Nabil; Xu, Wenyao; Liu, Jason J; Huang, Ming-Chun; Mortazavi, Bobak; Roberts, Christian K; Sarrafzadeh, Majid

    2014-09-01

    Detecting human activity independent of intensity is essential in many applications, primarily in calculating metabolic equivalent rates and extracting human context awareness. Many classifiers that train on an activity at a subset of intensity levels fail to recognize the same activity at other intensity levels. This demonstrates weakness in the underlying classification method. Training a classifier for an activity at every intensity level is also not practical. In this paper, we tackle a novel intensity-independent activity recognition problem where the class labels exhibit large variability, the data are of high dimensionality, and clustering algorithms are necessary. We propose a new robust stochastic approximation framework for enhanced classification of such data. Experiments are reported using two clustering techniques, K-Means and Gaussian Mixture Models. The stochastic approximation algorithm consistently outperforms other well-known classification schemes which validate the use of our proposed clustered data representation. We verify the motivation of our framework in two applications that benefit from intensity-independent activity recognition. The first application shows how our framework can be used to enhance energy expenditure calculations. The second application is a novel exergaming environment aimed at using games to reward physical activity performed throughout the day, to encourage a healthy lifestyle.
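
    A hedged sketch of a clustered data representation in this spirit: sensor windows spanning several intensity levels are clustered, samples are re-expressed as distances to the cluster centers, and a standard classifier is trained on that representation. The synthetic data and the choice of SVC stand in for the paper's stochastic approximation algorithm.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      # Two activities, each scaled by a random intensity factor per sample.
      walk = rng.normal(0.0, 1.0, (300, 6)) * rng.uniform(0.5, 3.0, (300, 1))
      run = rng.normal(3.0, 1.0, (300, 6)) * rng.uniform(0.5, 3.0, (300, 1))
      X = np.vstack([walk, run])
      y = np.array([0] * 300 + [1] * 300)

      kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
      X_rep = kmeans.transform(X)   # clustered representation: distances to centers

      clf = SVC().fit(X_rep, y)
      print("training accuracy:", clf.score(X_rep, y))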

  1. Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.

    ERIC Educational Resources Information Center

    Wang, Peiling; Hawk, William B.; Tenopir, Carol

    2000-01-01

    Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…

  2. Legacy effects in material flux: structural catchment changes predate long-term studies

    Treesearch

    Daniel Bain; Mark B. Green; John L. Campbell; John F. Chamblee; Sayo Chaoka; Jennifer M. Fraterrigo; Sujay S. Kaushal; Sujay S. Kaushal; Sherry L. Martin; Thomas E. Jordan; Anthony J. Parolari; William V. Sobczak; Donald E. Weller; Wilfred M. Wolheim; Emery R. Boose; Jonathan M. Duncan; Gretchen M. Gettel; Brian R. Hall; Praveen Kumar; Jonathan R. Thompson; James M. Vose; Emily M. Elliott; David S. Leigh

    2012-01-01

    Legacy effects of past land use and disturbance are increasingly recognized, yet consistent definitions of and criteria for defining them do not exist. To address this gap in biological- and ecosystem-assessment frameworks, we propose a general metric for evaluating potential legacy effects, which are computed by normalizing altered system function persistence with...

  3. Construing Mathematics-Containing Activities in Adults' Workplace Competences: Analysis of Institutional and Multimodal Aspects

    ERIC Educational Resources Information Center

    Björklund Boistrup, Lisa; Gustafsson, Lars

    2014-01-01

    In this paper we propose and discuss a framework for analysing adults' work competences while construing mathematics-containing "themes" in two workplace settings: road haulage and nursing. The data consist of videos and transcribed interviews from the work of two lorryloaders, and a nurses' aide at an orthopaedic department. In the…

  4. New agreement measures based on survival processes

    PubMed Central

    Guo, Ying; Li, Ruosha; Peng, Limin; Manatunga, Amita K.

    2013-01-01

    The need to assess agreement arises in many scenarios in the biomedical sciences when measurements are taken by different methods on the same subjects. When the endpoints are survival outcomes, the study of agreement becomes more challenging given the special characteristics of time-to-event data. In this paper, we propose a new framework for assessing agreement based on survival processes, which can be viewed as a natural representation of time-to-event outcomes. Our new agreement measure is formulated as the chance-corrected concordance between survival processes. It provides a new perspective for studying the relationship between correlated survival outcomes and offers an appealing interpretation as the agreement between survival times on the absolute distance scale. We provide a multivariate extension of the proposed agreement measure for multiple methods. Furthermore, the new framework enables a natural extension to evaluate time-dependent agreement structure. We develop nonparametric estimation of the proposed agreement measures. Our estimators are shown to be strongly consistent and asymptotically normal. We evaluate the performance of the proposed estimators through simulation studies and then illustrate the methods using a prostate cancer data example. PMID:23844617

  5. Scalable High Performance Image Registration Framework by Unsupervised Deep Feature Representations Learning

    PubMed Central

    Wu, Guorong; Kim, Minjeong; Wang, Qian; Munsell, Brent C.

    2015-01-01

    Feature selection is a critical step in deformable image registration. In particular, selecting the most discriminative features that accurately and concisely describe complex morphological patterns in image patches improves correspondence detection, which in turn improves image registration accuracy. Furthermore, since more and more imaging modalities are being invented to better identify morphological changes in medical imaging data, the development of a deformable image registration method that scales well to new image modalities or new image applications with little to no human intervention would have a significant impact on the medical image analysis community. To address these concerns, a learning-based image registration framework is proposed that uses deep learning to discover compact and highly discriminative features from the observed imaging data. Specifically, the proposed feature selection method uses a convolutional stacked auto-encoder to identify intrinsic deep feature representations in image patches. Since deep learning is an unsupervised learning method, no ground truth label knowledge is required. This makes the proposed feature selection method more flexible for new imaging modalities, since feature representations can be directly learned from the observed imaging data in a very short amount of time. Using the LONI and ADNI imaging datasets, image registration performance was compared to two existing state-of-the-art deformable image registration methods that use handcrafted features. To demonstrate the scalability of the proposed image registration framework, image registration experiments were conducted on 7.0-tesla brain MR images. In all experiments, the results showed that the new image registration framework consistently demonstrated more accurate registration results when compared to the state of the art. PMID:26552069

  6. Scalable High-Performance Image Registration Framework by Unsupervised Deep Feature Representations Learning.

    PubMed

    Wu, Guorong; Kim, Minjeong; Wang, Qian; Munsell, Brent C; Shen, Dinggang

    2016-07-01

    Feature selection is a critical step in deformable image registration. In particular, selecting the most discriminative features that accurately and concisely describe complex morphological patterns in image patches improves correspondence detection, which in turn improves image registration accuracy. Furthermore, since more and more imaging modalities are being invented to better identify morphological changes in medical imaging data, the development of a deformable image registration method that scales well to new image modalities or new image applications with little to no human intervention would have a significant impact on the medical image analysis community. To address these concerns, a learning-based image registration framework is proposed that uses deep learning to discover compact and highly discriminative features from the observed imaging data. Specifically, the proposed feature selection method uses a convolutional stacked autoencoder to identify intrinsic deep feature representations in image patches. Since deep learning is an unsupervised learning method, no ground truth label knowledge is required. This makes the proposed feature selection method more flexible for new imaging modalities, since feature representations can be directly learned from the observed imaging data in a very short amount of time. Using the LONI and ADNI imaging datasets, image registration performance was compared to two existing state-of-the-art deformable image registration methods that use handcrafted features. To demonstrate the scalability of the proposed image registration framework, image registration experiments were conducted on 7.0-T brain MR images. In all experiments, the results showed that the new image registration framework consistently demonstrated more accurate registration results when compared to the state of the art.

  7. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    PubMed

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases and to provide a conceptual framework to operationalize these definitions consistently across studies. We reviewed recent literature to identify definitions in electronic database studies of prescription-filling patterns for chronic oral medications. We then develop a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompasses the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
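
    As an illustration of gap-based adherence estimation from dispensing records, the sketch below counts uncovered days between fills; it is a simplified stand-in under assumed data, not the exact New Prescription Medication Gap measure defined in the paper.

      from datetime import date, timedelta

      # Hypothetical dispensing records for one patient and one drug:
      # (fill date, days supplied).
      fills = [
          (date(2012, 1, 1), 30),
          (date(2012, 2, 5), 30),   # filled 5 days after the prior supply ran out
          (date(2012, 3, 6), 30),
      ]

      covered_until = fills[0][0]   # first day not yet covered by any supply
      gap_days = 0
      for fill_date, supply in fills:
          if fill_date > covered_until:
              gap_days += (fill_date - covered_until).days   # uncovered days
              covered_until = fill_date
          covered_until += timedelta(days=supply)

      observation_days = (covered_until - fills[0][0]).days
      print(f"gap days: {gap_days}, proportion of days covered: "
            f"{1 - gap_days / observation_days:.2f}")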

  8. Theory and Practice of Pediatric Bioethics.

    PubMed

    Ross, Lainie Friedman

    2016-01-01

    This article examines two typical bioethics frameworks: the "Four Principles" by Beauchamp and Childress, and the "Four Boxes" by Jonsen, Siegler, and Winslade. I show how they are inadequate to address the ethical issues raised by pediatrics, in part because they do not pay adequate attention to families. I then consider an alternate framework proposed by Buchanan and Brock that focuses on four questions that must be addressed for the patient who lacks decisional capacity. This model also does not give adequate respect for the family, particularly the intimate family. I then describe my own framework, which provides answers to Buchanan and Brock's four questions in a way that is consistent with the intimate family and its need for protection from state intervention.

  9. Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.

    PubMed

    Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N

    2015-04-01

    Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources from which the images were acquired. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses a single statistical atlas registration to initialize a robust deformable model and obtain a fine segmentation. The fat-fraction map is computed using a chemical-shift-based method in the delineated liver region. The proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. Qualitative and quantitative comparisons show that our proposed method achieves better segmentation accuracy with less variance compared with two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework.

  10. Security Event Recognition for Visual Surveillance

    NASA Astrophysics Data System (ADS)

    Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.

    2017-05-01

    With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are demanded by many practical applications. This paper proposes a novel, effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time as well. If anyone moves an object, the system verifies whether this person is its owner. If not, the event is further analyzed and classified into one of two scenarios: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for these more complex tasks. For comparison purposes, experiments are also carried out on benchmark databases related to abandoned luggage detection. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.

  11. Modelling electro-active polymers with a dispersion-type anisotropy

    NASA Astrophysics Data System (ADS)

    Hossain, Mokarram; Steinmann, Paul

    2018-02-01

    We propose a novel constitutive framework for electro-active polymers (EAPs) that can take into account anisotropy with chain dispersion. Particle-filled EAPs have become promising candidates for enhancing actuation behaviour. Recent studies suggest that particle-filled EAPs, which can be cured under an electric field during manufacturing, do not necessarily form perfectly anisotropic composites; rather, they create composites with dispersed chains. Hence, in this contribution, an electro-mechanically coupled constitutive model is devised that accounts for chain dispersion through a probability distribution function in an integral form. To obtain the relevant quantities in discrete form, numerical integration over the unit sphere is utilized. The necessary constitutive equations are derived by exploiting the basic laws of thermodynamics, resulting in a thermodynamically consistent formulation. To demonstrate the performance of the proposed electro-mechanically coupled framework, we analytically solve a non-homogeneous boundary value problem: the extension and inflation of an axisymmetric cylindrical tube under electro-mechanically coupled load. The results capture various electro-mechanical couplings with the formulation proposed for EAP composites.

  12. Game theoretic approach for cooperative feature extraction in camera networks

    NASA Astrophysics Data System (ADS)

    Redondi, Alessandro E. C.; Baroffio, Luca; Cesana, Matteo; Tagliasacchi, Marco

    2016-07-01

    Visual sensor networks (VSNs) consist of several camera nodes with wireless communication capabilities that can perform visual analysis tasks such as object identification, recognition, and tracking. Often, VSN deployments result in many camera nodes with overlapping fields of view. In the past, such redundancy has been exploited in two different ways: (1) to improve the accuracy/quality of the visual analysis task by exploiting multiview information or (2) to reduce the energy consumed in performing the visual task, by applying temporal scheduling techniques among the cameras. We propose a game theoretic framework based on the Nash bargaining solution to bridge the gap between the two aforementioned approaches. The key tenet of the proposed framework is for cameras to reduce the energy consumed in the analysis process by exploiting the redundancy in their overlapping fields of view. Experimental results in both simulated and real-life scenarios confirm that the proposed scheme is able to increase the network lifetime, with a negligible loss in terms of visual analysis accuracy.
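
    A toy Nash bargaining computation for two cameras splitting one analysis task; the energy costs and disagreement utilities are assumed numbers, and the paper's formulation over full networks is richer. The Nash bargaining solution maximizes the product of each player's utility gain over the disagreement point.

      import numpy as np
      from scipy.optimize import minimize_scalar

      E1, E2 = 1.0, 2.0   # assumed energy each camera would spend doing the whole task
      d1, d2 = 0.0, 0.0   # disagreement utilities: no energy saved

      def neg_nash_product(x):
          """x = share of the task carried by camera 1."""
          u1 = E1 * (1 - x) - d1   # energy camera 1 saves
          u2 = E2 * x - d2         # energy camera 2 saves
          if u1 <= 0 or u2 <= 0:   # outside the bargaining set: no agreement
              return 0.0
          return -(u1 * u2)

      res = minimize_scalar(neg_nash_product, bounds=(0.0, 1.0), method="bounded")
      print(f"camera 1 handles {res.x:.2f} of the task")   # 0.50 maximizes the product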

  13. Retinal artery-vein classification via topology estimation

    PubMed Central

    Estrada, Rolando; Allingham, Michael J.; Mettu, Priyatham S.; Cousins, Scott W.; Tomasi, Carlo; Farsiu, Sina

    2015-01-01

    We propose a novel, graph-theoretic framework for distinguishing arteries from veins in a fundus image. We make use of the underlying vessel topology to better classify small and midsized vessels. We extend our previously proposed tree topology estimation framework by incorporating expert, domain-specific features to construct a simple, yet powerful global likelihood model. We efficiently maximize this model by iteratively exploring the space of possible solutions consistent with the projected vessels. We tested our method on four retinal datasets and achieved classification accuracies of 91.0%, 93.5%, 91.7%, and 90.9%, outperforming existing methods. Our results show the effectiveness of our approach, which is capable of analyzing the entire vasculature, including peripheral vessels, in wide field-of-view fundus photographs. This topology-based method is a potentially important tool for diagnosing diseases with retinal vascular manifestation. PMID:26068204

  14. Communication between Brain Areas Based on Nested Oscillations

    PubMed Central

    Kastner, Sabine

    2017-01-01

    Unraveling how brain regions communicate is crucial for understanding how the brain processes external and internal information. Neuronal oscillations within and across brain regions have been proposed to play a crucial role in this process. Two main hypotheses have been suggested for routing of information based on oscillations, namely communication through coherence and gating by inhibition. Here, we propose a framework unifying these two hypotheses that is based on recent empirical findings. We discuss a theory in which communication between two regions is established by phase synchronization of oscillations at lower frequencies (<25 Hz), which serve as temporal reference frame for information carried by high-frequency activity (>40 Hz). Our framework, consistent with numerous recent empirical findings, posits that cross-frequency interactions are essential for understanding how large-scale cognitive and perceptual networks operate. PMID:28374013

  15. A preliminary examination of patient loyalty: an application of the customer loyalty classification framework in the health care industry.

    PubMed

    Heiens, R A; Pleshko, L P

    1997-01-01

    The present article applies the customer loyalty classification framework developed by Dick and Basu (1994) to the health care industry. Based on a two-factor classification consisting of repeat patronage and relative attitude, four categories of patient loyalty are proposed and examined: true loyalty, latent loyalty, spurious loyalty, and no loyalty. Data are collected, and the four patient loyalty categories are profiled and compared on the basis of perceived risk, product class importance, provider decision importance, provider awareness, provider consideration, number of providers visited, and self-reported loyalty.

  16. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in such efficiency. This paper hence proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  17. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    PubMed

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A method to specify and fit structural equation models in the Redundancy Analysis framework, based on so-called Extended Redundancy Analysis (ERA), has recently been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or of concomitant indicators, the composite scores are estimated by ignoring the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), which allows us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method with a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA, we present a simulation study with small samples. Moreover, we present an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  18. Female employment and fertility in Peninsular Malaysia: the maternal role incompatibility hypothesis reconsidered.

    PubMed

    Mason, K O; Palan, V T

    1981-11-01

    Multivariate analysis of the 1974 Malaysian Fertility and Family Survey tests the hypothesis that an inverse relationship between women's work and fertility occurs only when there are serious conflicts between working and caring for children. The results are only partly consistent with the hypothesis and suggest that normative conflicts between working and mothering affect the employment-fertility relationship in Malaysia more than spatio-temporal conflicts do. The lack of consistent evidence for the hypothesis, as well as some conceptual problems, leads us to propose an alternative framework for understanding variation in the employment-fertility relationship, both in Malaysia and elsewhere. This framework incorporates ideas from the role incompatibility hypothesis but views the employment-fertility relationship as dependent not just on role conflicts but more generally on the structure of the household's socioeconomic opportunities.

  19. Self-consistent description of a system of interacting phonons

    NASA Astrophysics Data System (ADS)

    Poluektov, Yu. M.

    2015-11-01

    A method for the self-consistent description of phonon systems is proposed. This method generalizes the Debye model to account for phonon-phonon interaction. The idea of "self-consistent" phonons is introduced; their speed depends on the temperature and is determined by solving a non-linear equation. Within the proposed approach, the Debye energy is also a function of the temperature. The thermodynamics of the "self-consistent" phonon gas is then constructed. It is shown that at low temperatures the cubic temperature dependence of the specific heat acquires an additional term proportional to the seventh power of the temperature. This appears to explain why the cubic law for specific heat is observed only at relatively low temperatures. At high temperatures, the theory predicts a linear-in-temperature deviation from the Dulong-Petit law, which is observed experimentally. A modification of the melting criterion to account for phonon-phonon interaction is also considered.
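
    The self-consistency idea can be illustrated by a fixed-point iteration for a temperature-dependent phonon speed. The closure c(T) = c0 * (1 - a*T/c(T)^2) below is an assumed toy equation chosen for illustration, not the non-linear equation derived in the paper.

      def self_consistent_speed(T, c0=5000.0, a=50.0, tol=1e-10):
          """Iterate c -> c0 * (1 - a*T/c**2) until the speed stops changing."""
          c = c0
          for _ in range(200):
              c_new = c0 * (1.0 - a * T / c**2)
              if abs(c_new - c) < tol:
                  break
              c = c_new
          return c

      for T in (10.0, 100.0, 300.0):
          # Speed decreases with temperature under this assumed closure.
          print(T, round(self_consistent_speed(T), 2))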

  20. A framework for the evaluation of new interventional procedures.

    PubMed

    Lourenco, Tania; Grant, Adrian M; Burr, Jennifer M; Vale, Luke

    2012-03-01

    The introduction of new interventional procedures is less regulated than that of other health technologies such as pharmaceuticals. Decisions are often taken on evidence of efficacy and short-term safety from small-scale, usually observational, studies. This reflects the particular challenges of evaluating interventional procedures: the extra facets of skill and training, and the difficulty of defining a 'new' technology. Currently, unlike for new pharmaceuticals, there is no framework to evaluate new interventional procedures before they become available in clinical practice. This paper proposes a framework to guide the evaluation of a new interventional procedure. The framework consists of a four-stage progressive evaluation: Stage 1: Development; Stage 2: Efficacy and short-term safety; Stage 3: Effectiveness and cost-effectiveness; and Stage 4: Implementation. The framework also suggests the types of studies and data collection methods that can be used to satisfy each stage. This paper takes a first step toward a framework for generating evidence on new interventional procedures. The difficulties and limitations of applying such a framework are discussed.

  1. Dictionary-based fiber orientation estimation with improved spatial consistency.

    PubMed

    Ye, Chuyang; Prince, Jerry L

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs into the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and are not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs can be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of a data fidelity term between the observed signals and the signals represented by the dictionary, a pairwise FO dissimilarity term that encourages FO smoothness, and weighted ℓ1-norm terms that ensure consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that FORNI+ produces FOs of better quality compared with competing methods.

  2. Co-adapting societal and ecological interactions following large disturbances in urban park woodlands

    Treesearch

    Margaret Carreiro; Wayne Zipperer

    2011-01-01

    The responses of urban park woodlands to large disturbances provide the opportunity to identify and examine linkages in social-ecological systems in urban landscapes. We propose that the Panarchy model consisting of hierarchically nested adaptive cycles provides a useful framework to evaluate those linkages. We use two case studies as examples – Cherokee Park in...

  3. Proposing a New Pedagogy-Based Website Design: A Usability Test with Lifelong Learners

    ERIC Educational Resources Information Center

    Khlaisang, Jintavee

    2017-01-01

    This study aimed to create a new pedagogy-based website based on the analysis of the needs of 7147 website users who visited the Thailand Cyber University (TCU) project website during 2011-2013. The study consisted of 4 stages: (1) examining learners' needs and literature related to developing a lifelong learning framework, (2) designing a site…

  4. Possibility of designing catalysts beyond the traditional volcano curve: a theoretical framework for multi-phase surfaces.

    PubMed

    Wang, Ziyun; Wang, Hai-Feng; Hu, P

    2015-10-01

    The current theory of catalyst activity in heterogeneous catalysis is mainly derived from the study of mono-phase catalysts, while most catalysts in real systems consist of multiple phases, the understanding of which falls far short of chemists' expectations. Density functional theory (DFT) and micro-kinetics simulations are used to investigate the activities of six mono-phase and nine bi-phase catalysts for CO hydrogenation, arguably the most typical reaction in heterogeneous catalysis. Excellent activities beyond the activity peak of the traditional mono-phase volcano curve are found on some bi-phase surfaces. By analyzing these results, a new framework to understand the unexpected activities of bi-phase surfaces is proposed. Based on the framework, several principles for the design of multi-phase catalysts are suggested. The theoretical framework extends traditional catalysis theory to the understanding of more complex systems.
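    The mono-phase volcano behavior that the paper moves beyond can be illustrated with a toy Sabatier-type model, where activity is limited by the slower of two competing exponential rates in the binding energy; every number below is illustrative and not taken from the paper.

    ```python
    import numpy as np

    # Toy volcano curve: activity = min(adsorption-limited, desorption-limited).
    E = np.linspace(-2.0, 0.0, 201)                  # binding energy (eV)
    activity = np.minimum(np.exp(E / 0.1),           # too-weak binding branch
                          np.exp(-(E + 1.2) / 0.1))  # too-strong binding branch
    print("optimal binding energy:", E[np.argmax(activity)])  # peak near -0.6 eV
    ```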

  5. Identifying seasonal mobility profiles from anonymized and aggregated mobile phone data. Application in food security.

    PubMed

    Zufiria, Pedro J; Pastor-Escuredo, David; Úbeda-Medina, Luis; Hernandez-Medina, Miguel A; Barriales-Valbuena, Iker; Morales, Alfredo J; Jacques, Damien C; Nkwambi, Wilfred; Diop, M Bamba; Quinn, John; Hidalgo-Sanchís, Paula; Luengo-Oroz, Miguel

    2018-01-01

    We propose a framework for the systematic analysis of mobile phone data to identify relevant mobility profiles in a population. The proposed framework allows finding distinct human mobility profiles based on the digital trace of mobile phone users characterized by a Matrix of Individual Trajectories (IT-Matrix). This matrix gathers a consistent and regularized description of individual trajectories that enables multi-scale representations along time and space, which can be used to extract aggregated indicators such as a dynamic multi-scale population count. Unsupervised clustering of individual trajectories generates mobility profiles (clusters of similar individual trajectories) which characterize relevant group behaviors while preserving optimal aggregation levels for detailed and privacy-secured mobility characterization. The application of the proposed framework is illustrated by analyzing fully anonymized data on human mobility from mobile phones in Senegal at the arrondissement level over a calendar year. The analysis of monthly mobility patterns at the livelihood zone resolution resulted in the discovery and characterization of seasonal mobility profiles related to economic activities, agricultural calendars and rainfall. The use of these mobility profiles could support the timely identification of mobility changes in vulnerable populations in response to external shocks (such as natural disasters, civil conflicts or sudden increases of food prices) to monitor food security.
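    A minimal sketch of the clustering step, under assumed data: each row of a hypothetical IT-Matrix records the dominant region per user per month, rows are one-hot encoded, and k-means groups them into mobility profiles. The sizes and parameters are illustrative.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical IT-Matrix: one row per anonymized user, one column per month,
    # entries give the region where the user spent most time that month.
    rng = np.random.default_rng(0)
    it_matrix = rng.integers(0, 5, size=(1000, 12))   # 1000 users, 5 regions

    # One-hot encode region labels so Euclidean distance is meaningful, then
    # cluster the trajectories into a handful of mobility profiles.
    onehot = np.eye(5)[it_matrix].reshape(len(it_matrix), -1)
    profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(onehot)
    for k in range(4):
        print(f"profile {k}: {np.sum(profiles == k)} users")
    ```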

  6. Probability-Based Recognition Framework for Underwater Landmarks Using Sonar Images †

    PubMed Central

    Choi, Jinwoo; Choi, Hyun-Taek

    2017-01-01

    This paper proposes a probability-based framework for recognizing underwater landmarks using sonar images. Current recognition methods use a single image, which does not provide reliable results because of weaknesses of sonar imagery such as an unstable acoustic source, heavy speckle noise, low resolution, and single-channel data. However, if consecutive sonar images are used and the status of an object—i.e., its existence and identity (or name)—is continuously evaluated by a stochastic method, the recognition result can be accompanied by an uncertainty estimate, making it more suitable for various applications. Our proposed framework consists of three steps: (1) candidate selection, (2) continuity evaluation, and (3) Bayesian feature estimation. Two probability methods—particle filtering and Bayesian feature estimation—are used to repeatedly estimate the continuity and features of objects in consecutive images. Thus, the status of the object is repeatedly predicted and updated by a stochastic method. Furthermore, we develop an artificial landmark with increased detectability by an imaging sonar, exploiting the characteristics of acoustic waves such as instability and reflection depending on the roughness of the reflector surface. The proposed method is verified by basin experiments, and the results are presented. PMID:28837068
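    The continuity-evaluation idea can be illustrated with a toy recursive Bayesian update of an object's existence probability across consecutive sonar frames; the detector characteristics are assumed, and the paper's particle filter is collapsed to a scalar update for clarity.

    ```python
    # Toy recursive Bayesian update of P(object exists) from repeated detections.
    P_DETECT, P_FALSE = 0.8, 0.2   # assumed hit rates given existence / absence
    belief = 0.5                   # uninformative prior

    for detected in [True, True, False, True, True]:
        like_exist = P_DETECT if detected else 1 - P_DETECT
        like_none = P_FALSE if detected else 1 - P_FALSE
        belief = (like_exist * belief
                  / (like_exist * belief + like_none * (1 - belief)))
        print(f"detected={detected} -> P(exists)={belief:.3f}")
    ```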

  7. An epidemiological modeling and data integration framework.

    PubMed

    Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C

    2010-01-01

    In this work, a cellular automaton software package for simulating different infectious diseases, storing the simulation results in a data warehouse system, and analyzing the obtained results to generate prediction models as well as contingency plans is proposed. The Brisbane H3N2 flu virus, which spread during the 2009 winter season, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton. The cellular automaton model is parameterized by known disease parameters, and geographical as well as demographic conditions are included when simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using the tool termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used to generate prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy-to-handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
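    A minimal cellular-automaton sketch of the simulation core, assuming a probabilistic susceptible-infected-recovered update on a grid; the grid size and disease parameters are illustrative placeholders rather than the calibrated Brisbane H3N2 values.

    ```python
    import numpy as np

    # SIR cellular automaton: 0 = susceptible, 1 = infected, 2 = recovered.
    rng = np.random.default_rng(1)
    grid = np.zeros((100, 100), dtype=int)
    grid[50, 50] = 1                      # seed a single infected cell
    p_infect, p_recover = 0.25, 0.1       # assumed disease parameters

    for step in range(60):
        infected = grid == 1
        # Number of infected von Neumann neighbors for every cell.
        neigh = (np.roll(infected, 1, 0) + np.roll(infected, -1, 0)
                 + np.roll(infected, 1, 1) + np.roll(infected, -1, 1))
        # Independent infection chance per infected neighbor; memoryless recovery.
        new_inf = (grid == 0) & (rng.random(grid.shape) < 1 - (1 - p_infect) ** neigh)
        new_rec = infected & (rng.random(grid.shape) < p_recover)
        grid[new_inf], grid[new_rec] = 1, 2

    print("infected:", np.sum(grid == 1), "recovered:", np.sum(grid == 2))
    ```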

  8. A natural language processing and geospatial clustering framework for harvesting local place names from geotagged housing advertisements

    DOE PAGES

    Hu, Yingjie; Mao, Huina; Mckenzie, Grant

    2018-04-13

    We report that local place names are frequently used by residents living in a geographic region. Such place names may not be recorded in existing gazetteers, due to their vernacular nature, relative insignificance to a gazetteer covering a large area (e.g. the entire world), recent establishment (e.g. the name of a newly-opened shopping center) or other reasons. While not always recorded, local place names play important roles in many applications, from supporting public participation in urban planning to locating victims in disaster response. In this paper, we propose a computational framework for harvesting local place names from geotagged housing advertisements. We make use of those advertisements posted on local-oriented websites, such as Craigslist, where local place names are often mentioned. The proposed framework consists of two stages: natural language processing (NLP) and geospatial clustering. The NLP stage examines the textual content of housing advertisements and extracts place name candidates. The geospatial stage focuses on the coordinates associated with the extracted place name candidates and performs multiscale geospatial clustering to filter out the non-place names. We evaluate our framework by comparing its performance with those of six baselines. Finally, we also compare our result with four existing gazetteers to demonstrate the not-yet-recorded local place names discovered by our framework.
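    A toy sketch of the two-stage pipeline under stated assumptions: capitalized phrases stand in for the NLP stage's place-name candidates, and DBSCAN keeps candidates whose mention coordinates form a dense cluster, standing in for the multiscale geospatial stage. The ads, names, and thresholds are hypothetical.

    ```python
    import re
    import numpy as np
    from sklearn.cluster import DBSCAN

    ads = [
        ("Cozy condo near Green Lake Plaza", (47.61, -122.33)),
        ("2BR by Green Lake Plaza, great views", (47.62, -122.34)),
        ("Studio apartment, quiet street", (48.90, -120.00)),
    ]

    # Stage 1 (toy NLP): multi-word capitalized phrases as candidates.
    candidates = {}
    for text, coord in ads:
        for name in re.findall(r"(?:[A-Z][a-z]+ ){1,3}[A-Z][a-z]+", text):
            candidates.setdefault(name, []).append(coord)

    # Stage 2 (toy geospatial clustering): keep candidates whose coordinates
    # form at least one dense cluster; scattered mentions are discarded.
    for name, coords in candidates.items():
        labels = DBSCAN(eps=0.05, min_samples=2).fit_predict(np.array(coords))
        if (labels >= 0).any():
            print("local place name:", name)
    ```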

  9. A natural language processing and geospatial clustering framework for harvesting local place names from geotagged housing advertisements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Yingjie; Mao, Huina; Mckenzie, Grant

    We report that local place names are frequently used by residents living in a geographic region. Such place names may not be recorded in existing gazetteers, due to their vernacular nature, relative insignificance to a gazetteer covering a large area (e.g. the entire world), recent establishment (e.g. the name of a newly-opened shopping center) or other reasons. While not always recorded, local place names play important roles in many applications, from supporting public participation in urban planning to locating victims in disaster response. In this paper, we propose a computational framework for harvesting local place names from geotagged housing advertisements. We make use of those advertisements posted on local-oriented websites, such as Craigslist, where local place names are often mentioned. The proposed framework consists of two stages: natural language processing (NLP) and geospatial clustering. The NLP stage examines the textual content of housing advertisements and extracts place name candidates. The geospatial stage focuses on the coordinates associated with the extracted place name candidates and performs multiscale geospatial clustering to filter out the non-place names. We evaluate our framework by comparing its performance with those of six baselines. Finally, we also compare our result with four existing gazetteers to demonstrate the not-yet-recorded local place names discovered by our framework.

  10. Deep Spatial-Temporal Joint Feature Representation for Video Object Detection.

    PubMed

    Zhao, Baojun; Zhao, Boya; Tang, Linbo; Han, Yuqi; Wang, Wenzheng

    2018-03-04

    With the development of deep neural networks, many object detection frameworks have shown great success in the fields of smart surveillance, self-driving cars, and facial recognition. However, the data sources are usually videos, and the object detection frameworks are mostly established on still images and only use spatial information, which means that feature consistency cannot be ensured because the training procedure loses temporal information. To address these problems, we propose a single, fully-convolutional neural network-based object detection framework that incorporates temporal information by using Siamese networks. In the training procedure, first, the prediction network combines the multiscale feature map to handle objects of various sizes. Second, we introduce a correlation loss by using the Siamese network, which provides neighboring frame features. This correlation loss represents object co-occurrences across time to aid consistent feature generation. Since the correlation loss requires track ID and detection label information, our video object detection network was evaluated on the large-scale ImageNet VID dataset, where it achieves a 69.5% mean average precision (mAP).
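    A minimal numpy sketch of a correlation-style loss between neighboring-frame features of the same tracked objects; this is a simplified stand-in for the paper's Siamese-branch loss, and the shapes and name are assumptions.

    ```python
    import numpy as np

    def correlation_loss(feat_t, feat_t1):
        """Encourage features of the same object to agree across frames.

        feat_t, feat_t1 : (n_objects, dim) features of matched track IDs in
        two neighboring frames, one from each Siamese branch.
        """
        a = feat_t / np.linalg.norm(feat_t, axis=1, keepdims=True)
        b = feat_t1 / np.linalg.norm(feat_t1, axis=1, keepdims=True)
        # 1 - cosine similarity, averaged over objects: zero when features
        # are perfectly consistent across time.
        return np.mean(1.0 - np.sum(a * b, axis=1))
    ```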

  11. Sparsity-aware tight frame learning with adaptive subspace recognition for multiple fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Yang, Boyuan

    2017-09-01

    It is a challenging problem to design excellent dictionaries that sparsely represent diverse fault information and simultaneously discriminate different fault sources. This paper therefore describes and analyzes a novel multiple feature recognition framework that incorporates the tight frame learning technique with an adaptive subspace recognition strategy. The proposed framework consists of four stages. First, by introducing the tight frame constraint into the popular dictionary learning model, the proposed tight frame learning model can be formulated as a nonconvex optimization problem, which is solved by alternately applying a hard thresholding operation and singular value decomposition. Second, noise is effectively eliminated through transform sparse coding techniques. Third, the denoised signal is decoupled into discriminative feature subspaces by each tight frame filter. Finally, guided by elaborately designed fault-related sensitive indexes, latent fault feature subspaces can be adaptively recognized and multiple faults diagnosed simultaneously. Extensive numerical experiments are subsequently conducted to investigate the sparsifying capability of the learned tight frame as well as its comprehensive denoising performance. Most importantly, the feasibility and superiority of the proposed framework are verified through multiple fault diagnosis of motor bearings. Compared with state-of-the-art fault detection techniques, some important advantages are observed: first, the proposed framework combines physical priors with a data-driven strategy, so multiple fault features with similar oscillation morphology can be naturally and adaptively decoupled. Second, the tight frame dictionary learned directly from the noisy observation can significantly promote the sparsity of fault features compared to analytical tight frames. Third, a satisfactory complete signal space description property is guaranteed, and thus the weak-feature leakage problem of typical learning methods is avoided.
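    The alternating scheme described above (hard thresholding for the sparse codes, then an SVD step enforcing the tight-frame constraint) can be sketched as follows; the data layout, iteration counts, and names are assumptions, and this is an illustrative reading rather than the authors' exact algorithm.

    ```python
    import numpy as np

    def hard_threshold(C, k):
        """Keep the k largest-magnitude coefficients per column, zero the rest."""
        out = np.zeros_like(C)
        idx = np.argsort(-np.abs(C), axis=0)[:k]
        np.put_along_axis(out, idx, np.take_along_axis(C, idx, axis=0), axis=0)
        return out

    def learn_tight_frame(X, n_atoms, k, n_iter=20, seed=0):
        """Alternate sparse coding and an orthogonal-Procrustes (SVD) update
        so that W.T @ W = I holds (requires n_atoms >= signal dimension).
        X : (dim, n_samples) training signals."""
        rng = np.random.default_rng(seed)
        W = np.linalg.qr(rng.standard_normal((n_atoms, X.shape[0])))[0]
        for _ in range(n_iter):
            C = hard_threshold(W @ X, k)              # sparse analysis codes
            U, _, Vt = np.linalg.svd(C @ X.T, full_matrices=False)
            W = U @ Vt                                # closest tight frame
        return W

    X = np.random.default_rng(1).standard_normal((8, 200))
    W = learn_tight_frame(X, n_atoms=16, k=3)
    print(np.allclose(W.T @ W, np.eye(8)))            # tight-frame property
    ```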

  12. Validating a new methodology for strain estimation from cardiac cine MRI

    NASA Astrophysics Data System (ADS)

    Elnakib, Ahmed; Beache, Garth M.; Gimel'farb, Georgy; Inanc, Tamer; El-Baz, Ayman

    2013-10-01

    This paper focuses on validating a novel framework for estimating the functional strain from cine cardiac magnetic resonance imaging (CMRI). The framework consists of three processing steps. First, the left ventricle (LV) wall borders are segmented using a level-set based deformable model. Second, the points on the wall borders are tracked during the cardiac cycle based on solving the Laplace equation between the LV edges. Finally, the circumferential and radial strains are estimated at the inner, mid-wall, and outer borders of the LV wall. The proposed framework is validated using synthetic phantoms of the material strains that account for the physiological features and the LV response during the cardiac cycle. Experimental results on simulated phantom images confirm the accuracy and robustness of our method.

  13. Enhancing clinical decision making: development of a contiguous definition and conceptual framework.

    PubMed

    Tiffen, Jennifer; Corbridge, Susan J; Slimmer, Lynda

    2014-01-01

    Clinical decision making is a term frequently used to describe the fundamental role of the nurse practitioner; however, other terms have been used interchangeably. The purpose of this article is to begin the process of developing a definition and framework of clinical decision making. The developed definition was "Clinical decision making is a contextual, continuous, and evolving process, where data are gathered, interpreted, and evaluated in order to select an evidence-based choice of action." A contiguous framework for clinical decision making specific for nurse practitioners is also proposed. Having a clear and unique understanding of clinical decision making will allow for consistent use of the term, which is relevant given the changing educational requirements for nurse practitioners and broadening scope of practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. On nonlinear thermo-electro-elasticity.

    PubMed

    Mehnert, Markus; Hossain, Mokarram; Steinmann, Paul

    2016-06-01

    Electro-active polymers (EAPs) for large actuations are nowadays well-known and promising candidates for producing sensors, actuators and generators. In general, polymeric materials are sensitive to differential temperature histories. During experimental characterizations of EAPs under electro-mechanically coupled loads, it is difficult to maintain constant temperature not only because of an external differential temperature history but also because of the changes in internal temperature caused by the application of high electric loads. In this contribution, a thermo-electro-mechanically coupled constitutive framework is proposed based on the total energy approach. Departing from relevant laws of thermodynamics, thermodynamically consistent constitutive equations are formulated. To demonstrate the performance of the proposed thermo-electro-mechanically coupled framework, a frequently used non-homogeneous boundary-value problem, i.e. the extension and inflation of a cylindrical tube, is solved analytically. The results illustrate the influence of various thermo-electro-mechanical couplings.

  15. On nonlinear thermo-electro-elasticity

    PubMed Central

    Mehnert, Markus; Hossain, Mokarram

    2016-01-01

    Electro-active polymers (EAPs) for large actuations are nowadays well-known and promising candidates for producing sensors, actuators and generators. In general, polymeric materials are sensitive to differential temperature histories. During experimental characterizations of EAPs under electro-mechanically coupled loads, it is difficult to maintain constant temperature not only because of an external differential temperature history but also because of the changes in internal temperature caused by the application of high electric loads. In this contribution, a thermo-electro-mechanically coupled constitutive framework is proposed based on the total energy approach. Departing from relevant laws of thermodynamics, thermodynamically consistent constitutive equations are formulated. To demonstrate the performance of the proposed thermo-electro-mechanically coupled framework, a frequently used non-homogeneous boundary-value problem, i.e. the extension and inflation of a cylindrical tube, is solved analytically. The results illustrate the influence of various thermo-electro-mechanical couplings. PMID:27436985

  16. Antiviral Information Management System (AIMS): a prototype for operational innovation in drug development.

    PubMed

    Jadhav, Pravin R; Neal, Lauren; Florian, Jeff; Chen, Ying; Naeger, Lisa; Robertson, Sarah; Soon, Guoxing; Birnkrant, Debra

    2010-09-01

    This article presents a prototype for an operational innovation in knowledge management (KM). These operational innovations are geared toward managing knowledge efficiently and accessing all available information by embracing advances in bioinformatics and allied fields. The specific components of the proposed KM system are (1) a database to archive hepatitis C virus (HCV) treatment data in a structured format and retrieve information in a query-capable manner and (2) an automated analysis tool to inform trial design elements for HCV drug development. The proposed framework is intended to benefit drug development by increasing efficiency of dose selection and improving the consistency of advice from US Food and Drug Administration (FDA). It is also hoped that the framework will encourage collaboration among FDA, industry, and academic scientists to guide the HCV drug development process using model-based quantitative analysis techniques.

  17. Automatic Microaneurysms Detection Based on Multifeature Fusion Dictionary Learning

    PubMed Central

    Wang, Zhenzhu; Du, Wenyou

    2017-01-01

    Recently, microaneurysm (MA) detection has attracted a lot of attention in the medical image processing community. Since MAs can be seen as the earliest lesions in diabetic retinopathy, their detection plays a critical role in diabetic retinopathy diagnosis. In this paper, we propose a novel MA detection approach named multifeature fusion dictionary learning (MFFDL). The proposed method consists of four steps: preprocessing, candidate extraction, multifeature dictionary learning, and classification. The novelty of our proposed approach lies in incorporating the semantic relationships among multifeatures and dictionary learning into a unified framework for automatic detection of MAs. We evaluate the proposed algorithm by comparing it with the state-of-the-art approaches and the experimental results validate the effectiveness of our algorithm. PMID:28421125

  18. Automatic Microaneurysms Detection Based on Multifeature Fusion Dictionary Learning.

    PubMed

    Zhou, Wei; Wu, Chengdong; Chen, Dali; Wang, Zhenzhu; Yi, Yugen; Du, Wenyou

    2017-01-01

    Recently, microaneurysm (MA) detection has attracted a lot of attention in the medical image processing community. Since MAs can be seen as the earliest lesions in diabetic retinopathy, their detection plays a critical role in diabetic retinopathy diagnosis. In this paper, we propose a novel MA detection approach named multifeature fusion dictionary learning (MFFDL). The proposed method consists of four steps: preprocessing, candidate extraction, multifeature dictionary learning, and classification. The novelty of our proposed approach lies in incorporating the semantic relationships among multifeatures and dictionary learning into a unified framework for automatic detection of MAs. We evaluate the proposed algorithm by comparing it with the state-of-the-art approaches and the experimental results validate the effectiveness of our algorithm.

  19. CUFID-query: accurate network querying through random walk based network flow estimation.

    PubMed

    Jeong, Hyundoo; Qian, Xiaoning; Yoon, Byung-Jun

    2017-12-28

    Functional modules in biological networks consist of numerous biomolecules and their complicated interactions. Recent studies have shown that biomolecules in a functional module tend to have similar interaction patterns and that such modules are often conserved across biological networks of different species. As a result, such conserved functional modules can be identified through comparative analysis of biological networks. In this work, we propose a novel network querying algorithm based on the CUFID (Comparative network analysis Using the steady-state network Flow to IDentify orthologous proteins) framework combined with an efficient seed-and-extension approach. The proposed algorithm, CUFID-query, can accurately detect conserved functional modules as small subnetworks in the target network that are expected to perform similar functions to the given query functional module. The CUFID framework was recently developed for probabilistic pairwise global comparison of biological networks, and it has been applied to pairwise global network alignment, where the framework was shown to yield accurate network alignment results. In the proposed CUFID-query algorithm, we adopt the CUFID framework and extend it for local network alignment, specifically to solve network querying problems. First, in the seed selection phase, the proposed method utilizes the CUFID framework to compare the query and the target networks and to predict the probabilistic node-to-node correspondence between the networks. Next, the algorithm selects and greedily extends the seed in the target network by iteratively adding nodes that have frequent interactions with other nodes in the seed network, in a way that the conductance of the extended network is maximally reduced. Finally, CUFID-query removes irrelevant nodes from the querying results based on the personalized PageRank vector for the induced network that includes the fully extended network and its neighboring nodes. Through extensive performance evaluation based on biological networks with known functional modules, we show that CUFID-query outperforms the existing state-of-the-art algorithms in terms of prediction accuracy and biological significance of the predictions.
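    A toy networkx sketch of the seed-and-extension step alone (the CUFID flow estimation and the PageRank-based pruning are omitted): the seed grows greedily by whichever neighbor most reduces the conductance of the subnetwork. The function and parameters are hypothetical.

    ```python
    import networkx as nx

    def greedy_extend(G, seed, max_size=10):
        """Greedily extend a seed set, adding the neighbor whose inclusion
        most reduces the conductance of the growing subnetwork."""
        sub = set(seed)
        while len(sub) < max_size:
            frontier = {n for v in sub for n in G.neighbors(v)} - sub
            if not frontier:
                break
            best = min(frontier, key=lambda n: nx.conductance(G, sub | {n}))
            if nx.conductance(G, sub | {best}) >= nx.conductance(G, sub):
                break                   # no further improvement: stop extending
            sub.add(best)
        return sub

    G = nx.karate_club_graph()
    print(sorted(greedy_extend(G, {0, 1})))
    ```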

  20. A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties

    DOE PAGES

    Tuo, Rui; Jeff Wu, C. F.

    2016-07-19

    Calibration parameters in deterministic computer experiments are those attributes that cannot be measured or are not available in physical experiments. Here, an approach is proposed to estimate them by using data from physical experiments and computer simulations. A theoretical framework is given which allows us to study the issues of parameter identifiability and estimation. We define L2-consistency for calibration as a justification for calibration methods. It is shown that a simplified version of the original KO method leads to asymptotically L2-inconsistent calibration. This L2-inconsistency can be remedied by modifying the original estimation procedure. A novel calibration method, called the L2 calibration, is proposed and proven to be L2-consistent and to enjoy an optimal convergence rate. Furthermore, a numerical example and some mathematical analysis are used to illustrate the source of the L2-inconsistency problem.
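    In the notation commonly used for this problem (assumed here, since the abstract does not fix symbols), with ζ the true physical response and y^s(·, θ) the computer model, the L2 calibration target and the consistency notion can be written as:

    ```latex
    % Hedged sketch of the L2-calibration idea; notation assumed, not quoted.
    \[
      \theta^{*} \;=\; \arg\min_{\theta}\,
        \bigl\lVert \zeta(\cdot) - y^{s}(\cdot,\theta) \bigr\rVert_{L_2(\Omega)},
      \qquad
      \widehat{\theta}\ \text{is } L_2\text{-consistent if}\
      \widehat{\theta} \xrightarrow{\,p\,} \theta^{*}.
    \]
    ```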

  1. Framework for Structural Online Health Monitoring of Aging and Degradation of Secondary Systems due to some Aspects of Erosion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gribok, Andrei; Patnaik, Sobhan; Williams, Christian

    This report describes the current state of research related to critical aspects of erosion and selected aspects of degradation of secondary components in nuclear power plants. The report also proposes a framework for online health monitoring of aging and degradation of secondary components. The framework consists of an integrated multi-sensor modality system which can be used to monitor different piping configurations under different degradation conditions. The report analyzes the currently known degradation mechanisms and available predictive models. Based on this analysis, the structural health monitoring framework is proposed. The Light Water Reactor Sustainability Program began to evaluate technologies that could be used to perform online monitoring of piping and other secondary system structural components in commercial NPPs. These online monitoring systems have the potential to identify when a more detailed inspection is needed using real-time measurements, rather than at a pre-determined inspection interval. This transition to condition-based, risk-informed automated maintenance will contribute to a significant reduction of operations and maintenance costs, which account for the majority of nuclear power generation costs. There is unanimous agreement between industry experts and academic researchers that identifying and prioritizing inspection locations in secondary piping systems (for example, in raw water piping or diesel piping) would eliminate many excessive in-service inspections. The proposed structural health monitoring framework takes aim at this challenge by combining long-range guided wave technologies with other monitoring techniques, which can significantly increase the inspection length and pinpoint the locations that have degraded the most. More broadly, the report suggests research efforts aimed at developing, validating, and deploying online corrosion monitoring techniques for complex geometries, which are pervasive in NPPs.

  2. A research framework for pharmacovigilance in health social media: Identification and evaluation of patient adverse drug event reports.

    PubMed

    Liu, Xiao; Chen, Hsinchun

    2015-12-01

    Social media offer insights into patients' medical problems such as drug side effects and treatment failures. Patient reports of adverse drug events from social media have great potential to improve current practice of pharmacovigilance. However, extracting patient adverse drug event reports from social media continues to be an important challenge for health informatics research. In this study, we develop a research framework with advanced natural language processing techniques for integrated and high-performance extraction of patient-reported adverse drug events. The framework consists of medical entity extraction for recognizing patient discussions of drugs and events, adverse drug event extraction with a shortest-dependency-path kernel based statistical learning method and semantic filtering with information from medical knowledge bases, and report source classification to tease out noise. To evaluate the proposed framework, a series of experiments were conducted on a test bed encompassing postings from major diabetes and heart disease forums in the United States. The results reveal that each component of the framework significantly contributes to its overall effectiveness. Our framework significantly outperforms prior work. Published by Elsevier Inc.
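    A toy sketch of the framework's stages, with hypothetical lexicons and rules standing in for the paper's statistical NLP models and knowledge-base filtering:

    ```python
    import re

    # Hypothetical stand-ins for the learned models and knowledge bases.
    DRUGS = {"metformin", "lisinopril"}
    EVENTS = {"nausea", "dizziness"}

    def extract_entities(post):
        """Entity extraction (toy): recognize drug and event mentions by lookup."""
        tokens = set(re.findall(r"[a-z]+", post.lower()))
        return tokens & DRUGS, tokens & EVENTS

    def is_patient_report(post):
        """Report source classification (toy): keep first-person reports."""
        return bool(re.search(r"\b(i|my|me)\b", post.lower()))

    post = "After my doctor started me on metformin I had constant nausea."
    drugs, events = extract_entities(post)
    if drugs and events and is_patient_report(post):
        print("candidate ADE report:", drugs, "->", events)
    ```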

  3. Joint Multi-Fiber NODDI Parameter Estimation and Tractography Using the Unscented Information Filter

    PubMed Central

    Reddy, Chinthala P.; Rathi, Yogesh

    2016-01-01

    Tracing white matter fiber bundles is an integral part of analyzing brain connectivity. An accurate estimate of the underlying tissue parameters is also paramount in several neuroscience applications. In this work, we propose to use a joint fiber model estimation and tractography algorithm that uses the NODDI (neurite orientation dispersion diffusion imaging) model to estimate fiber orientation dispersion consistently and smoothly along the fiber tracts along with estimating the intracellular and extracellular volume fractions from the diffusion signal. While the NODDI model has been used in earlier works to estimate the microstructural parameters at each voxel independently, for the first time, we propose to integrate it into a tractography framework. We extend this framework to estimate the NODDI parameters for two crossing fibers, which is imperative to trace fiber bundles through crossings as well as to estimate the microstructural parameters for each fiber bundle separately. We propose to use the unscented information filter (UIF) to accurately estimate the model parameters and perform tractography. The proposed approach has significant computational performance improvements as well as numerical robustness over the unscented Kalman filter (UKF). Our method not only estimates the confidence in the estimated parameters via the covariance matrix, but also provides the Fisher-information matrix of the state variables (model parameters), which can be quite useful to measure model complexity. Results from in-vivo human brain data sets demonstrate the ability of our algorithm to trace through crossing fiber regions, while estimating orientation dispersion and other biophysical model parameters in a consistent manner along the tracts. PMID:27147956

  4. Joint Multi-Fiber NODDI Parameter Estimation and Tractography Using the Unscented Information Filter.

    PubMed

    Reddy, Chinthala P; Rathi, Yogesh

    2016-01-01

    Tracing white matter fiber bundles is an integral part of analyzing brain connectivity. An accurate estimate of the underlying tissue parameters is also paramount in several neuroscience applications. In this work, we propose to use a joint fiber model estimation and tractography algorithm that uses the NODDI (neurite orientation dispersion diffusion imaging) model to estimate fiber orientation dispersion consistently and smoothly along the fiber tracts along with estimating the intracellular and extracellular volume fractions from the diffusion signal. While the NODDI model has been used in earlier works to estimate the microstructural parameters at each voxel independently, for the first time, we propose to integrate it into a tractography framework. We extend this framework to estimate the NODDI parameters for two crossing fibers, which is imperative to trace fiber bundles through crossings as well as to estimate the microstructural parameters for each fiber bundle separately. We propose to use the unscented information filter (UIF) to accurately estimate the model parameters and perform tractography. The proposed approach has significant computational performance improvements as well as numerical robustness over the unscented Kalman filter (UKF). Our method not only estimates the confidence in the estimated parameters via the covariance matrix, but also provides the Fisher-information matrix of the state variables (model parameters), which can be quite useful to measure model complexity. Results from in-vivo human brain data sets demonstrate the ability of our algorithm to trace through crossing fiber regions, while estimating orientation dispersion and other biophysical model parameters in a consistent manner along the tracts.

  5. Proposal of a New Adverse Event Classification by the Society of Interventional Radiology Standards of Practice Committee.

    PubMed

    Khalilzadeh, Omid; Baerlocher, Mark O; Shyn, Paul B; Connolly, Bairbre L; Devane, A Michael; Morris, Christopher S; Cohen, Alan M; Midia, Mehran; Thornton, Raymond H; Gross, Kathleen; Caplin, Drew M; Aeron, Gunjan; Misra, Sanjay; Patel, Nilesh H; Walker, T Gregory; Martinez-Salazar, Gloria; Silberzweig, James E; Nikolic, Boris

    2017-10-01

    To develop a new adverse event (AE) classification for interventional radiology (IR) procedures and evaluate its clinical, research, and educational value compared with the existing Society of Interventional Radiology (SIR) classification via an SIR member survey. A new AE classification was developed by members of the Standards of Practice Committee of the SIR. Subsequently, a survey was created by a group of 18 members from the SIR Standards of Practice Committee and Service Lines. Twelve clinical AE case scenarios were generated that encompassed a broad spectrum of IR procedures and potential AEs. Survey questions were designed to evaluate the following domains: educational and research value, accountability for intraprocedural challenges, consistency of AE reporting, unambiguity, and potential for incorporation into the existing quality-assurance framework. For each AE scenario, the survey participants were instructed to answer questions about the proposed and existing SIR classifications. SIR members were invited via online survey links, and 68 of the 140 surveyed members participated. Answers on the new and existing classifications were evaluated and compared statistically; the overall comparison between the two surveys was performed by generalized linear modeling. The proposed AE classification received superior evaluations in terms of consistency of reporting (P < .05) and potential for incorporation into the existing quality-assurance framework (P < .05). Respondents gave a higher overall rating to the educational and research value of the new classification compared with the existing one (P < .05). This study proposed an AE classification system that outperformed the existing SIR classification in the studied domains. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.

  6. Generating action descriptions from statistically integrated representations of human motions and sentences.

    PubMed

    Takano, Wataru; Kusajima, Ikuo; Nakamura, Yoshihiko

    2016-08-01

    It is desirable for robots to be able to linguistically understand human actions during human-robot interactions. Previous research has developed frameworks for encoding human full-body motion into model parameters and for classifying motion into specific categories. For full understanding, the motion categories need to be connected to natural language such that robots can interpret human motions as linguistic expressions. This paper proposes a novel framework for integrating observation of human motion with that of natural language. The framework consists of two models: the first statistically learns the relations between motions and their relevant words, and the second statistically learns sentence structures as word n-grams. Integrating these two models allows robots to generate sentences from human motions by searching for words relevant to the motion using the first model and then arranging these words in appropriate order using the second model. This yields the sentences most likely to be generated from the motion. The proposed framework was tested on human full-body motion measured by an optical motion capture system; descriptive sentences were manually attached to the motions, and the validity of the system was demonstrated. Copyright © 2016 Elsevier Ltd. All rights reserved.
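    A toy sketch of how the two statistical models combine: motion-to-word relevance scores (the first model) weight candidate words, and a bigram language model (the second) orders them; every probability below is made up for illustration.

    ```python
    import itertools

    word_given_motion = {"person": 0.9, "walks": 0.8, "forward": 0.6}  # model 1
    bigram = {("<s>", "person"): 0.5, ("person", "walks"): 0.6,        # model 2
              ("walks", "forward"): 0.4, ("forward", "</s>"): 0.5}

    def sentence_score(words):
        rel = 1.0
        for w in words:                      # how relevant each word is
            rel *= word_given_motion[w]      # to the observed motion
        lm, seq = 1.0, ["<s>", *words, "</s>"]
        for a, b in zip(seq, seq[1:]):       # how plausible the word order is
            lm *= bigram.get((a, b), 1e-6)
        return rel * lm

    best = max(itertools.permutations(word_given_motion), key=sentence_score)
    print(" ".join(best))                    # -> "person walks forward"
    ```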

  7. Phase-amplitude reduction of transient dynamics far from attractors for limit-cycling systems

    NASA Astrophysics Data System (ADS)

    Shirasaka, Sho; Kurebayashi, Wataru; Nakao, Hiroya

    2017-02-01

    The phase reduction framework for limit-cycling systems, based on isochrons, has been used as a powerful tool for analyzing rhythmic phenomena. Recently, the notion of isostables, which complements the isochrons by characterizing amplitudes of the system state, i.e., deviations from the limit-cycle attractor, has been introduced to describe the transient dynamics around the limit cycle [Wilson and Moehlis, Phys. Rev. E 94, 052213 (2016)]. In this study, we introduce a framework for a reduced phase-amplitude description of transient dynamics of stable limit-cycling systems. In contrast to the preceding study, the isostables are treated in a way fully consistent with the Koopman operator analysis, which enables us to avoid discontinuities of the isostables and to apply the framework to system states far from the limit cycle. We also propose a new, convenient bi-orthogonalization method to obtain the response functions of the amplitudes, which can be interpreted as an extension of the adjoint covariant Lyapunov vector to transient dynamics in limit-cycling systems. We illustrate the utility of the proposed reduction framework by estimating the optimal injection timing of external input that efficiently suppresses deviations of the system state from the limit cycle in a model of a biochemical oscillator.

  8. Design and develop a video conferencing framework for real-time telemedicine applications using secure group-based communication architecture.

    PubMed

    Mat Kiah, M L; Al-Bakri, S H; Zaidan, A A; Zaidan, B B; Hussain, Muzammil

    2014-10-01

    One of the applications of modern technology in telemedicine is video conferencing. An alternative to traveling to attend a conference or meeting, video conferencing is becoming increasingly popular among hospitals. By using this technology, doctors can help patients who are unable to physically visit hospitals. Video conferencing particularly benefits patients from rural areas, where good doctors are not always available. Telemedicine has proven to be a blessing to patients who have no access to the best treatment. A telemedicine system consists of customized hardware and software at two locations, namely, at the patient's and the doctor's end. In such cases, the video streams of the conferencing parties may contain highly sensitive information. Thus, real-time data security is one of the most important requirements when designing video conferencing systems. This study proposes a secure framework for video conferencing systems and a complete management solution for secure video conferencing groups. Java Media Framework Application Programming Interface classes are used to design and test the proposed secure framework. Real-time Transport Protocol over User Datagram Protocol is used to transmit the encrypted audio and video streams, and RSA and AES algorithms are used to provide the required security services. Results show that the encryption algorithm insignificantly increases the video conferencing computation time.
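    The RSA-plus-AES pattern the system relies on can be sketched in Python with the cryptography package (the paper itself uses the Java Media Framework and RTP transport, which this sketch omits; only the key-wrapping and stream-encryption pattern is shown):

    ```python
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    # Receiver's RSA key pair; the public half would be shared with senders.
    receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Fresh AES-256 session key and CTR nonce for this conference stream.
    session_key, nonce = os.urandom(32), os.urandom(16)
    wrapped = receiver_key.public_key().encrypt(      # RSA-wrap the session key
        session_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))

    frame = b"\x00" * 1024                            # stand-in for a video frame
    enc = Cipher(algorithms.AES(session_key), modes.CTR(nonce)).encryptor()
    ciphertext = enc.update(frame) + enc.finalize()
    print(len(wrapped), len(ciphertext))
    ```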

  9. Comprehensive and Practical Vision System for Self-Driving Vehicle Lane-Level Localization.

    PubMed

    Du, Xinxin; Tan, Kok Kiong

    2016-05-01

    Vehicle lane-level localization is a fundamental technology in autonomous driving. To achieve accurate and consistent performance, a common approach is to use LIDAR technology. However, LIDAR is expensive and computationally demanding, and thus not a practical solution in many situations. This paper proposes a stereovision system which is of low cost, yet also able to achieve high accuracy and consistency. It integrates a new lane line detection algorithm with other lane marking detectors to effectively identify the correct lane line markings. It also fits multiple road models to improve accuracy. An effective stereo 3D reconstruction method is proposed to estimate vehicle localization. The estimation consistency is further guaranteed by a new particle filter framework which takes vehicle dynamics into account. Experimental results based on image sequences taken under different visual conditions showed that the proposed system can identify the lane line markings with 98.6% accuracy. The maximum estimation error of the vehicle distance to lane lines is 16 cm in daytime and 26 cm at night, and the maximum estimation error of its moving direction with respect to the road tangent is 0.06 rad in daytime and 0.12 rad at night. Due to its high accuracy and consistency, the proposed system can be implemented in autonomous driving vehicles as a practical solution to vehicle lane-level localization.
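    A toy one-dimensional particle filter for the lateral (lane-level) position illustrates the filtering idea; the motion and measurement models, noise levels, and lane geometry below are illustrative stand-ins for the paper's framework.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    particles = rng.uniform(-1.75, 1.75, 500)   # lateral offset in lane (m)
    true_pos = 0.4

    for t in range(10):
        particles += rng.normal(0.0, 0.05, particles.size)  # dynamics noise
        z = true_pos + rng.normal(0.0, 0.1)                 # vision measurement
        w = np.exp(-0.5 * ((z - particles) / 0.1) ** 2)     # likelihood weights
        w /= w.sum()
        particles = rng.choice(particles, particles.size, p=w)  # resample

    print("estimated lateral offset:", particles.mean())
    ```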

  10. Telescopic multi-resolution augmented reality

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold

    2014-05-01

    To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known physics, biology, and chemistry principles. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system that is driven by real-world measurements. Such an interdisciplinary approach enables us not only to achieve multi-scale, telescopic visualization of real and virtual information but also to conduct thought experiments through AR. As a result of this consistency, the framework allows us to explore a large-dimensionality parameter space of measured and unmeasured regions. Toward this end, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interactions requires identification of observable image features that can indicate the presence of information in multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training as well as 'make believe' entertainment industries in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in zoom-in-and-out, consistent, multi-scale approximations.

  11. A multimedia retrieval framework based on semi-supervised ranking and relevance feedback.

    PubMed

    Yang, Yi; Nie, Feiping; Xu, Dong; Luo, Jiebo; Zhuang, Yueting; Pan, Yunhe

    2012-04-01

    We present a new framework for multimedia content analysis and retrieval which consists of two independent algorithms. First, we propose a new semi-supervised algorithm called ranking with Local Regression and Global Alignment (LRGA) to learn a robust Laplacian matrix for data ranking. In LRGA, for each data point, a local linear regression model is used to predict the ranking scores of its neighboring points. A unified objective function is then proposed to globally align the local models from all the data points so that an optimal ranking score can be assigned to each data point. Second, we propose a semi-supervised long-term Relevance Feedback (RF) algorithm to refine the multimedia data representation. The proposed long-term RF algorithm utilizes both the multimedia data distribution in multimedia feature space and the history RF information provided by users. A trace ratio optimization problem is then formulated and solved by an efficient algorithm. The algorithms have been applied to several content-based multimedia retrieval applications, including cross-media retrieval, image retrieval, and 3D motion/pose data retrieval. Comprehensive experiments on four data sets have demonstrated its advantages in precision, robustness, scalability, and computational efficiency.

  12. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning (KCL) has been successfully used to achieve robust clustering. However, KCL is not scalable for large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large-scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large-scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision than related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer's Disease.

    PubMed

    Cheng, Bo; Liu, Mingxia; Shen, Dinggang; Li, Zuoyong; Zhang, Daoqiang

    2017-04-01

    Recently, transfer learning has been successfully applied in early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most of existing methods only use data from a single auxiliary domain, and thus cannot utilize the intrinsic useful correlation information from multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multi-auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary domain data for improving the learning performance in the target domain, compared with several state-of-the-art methods.

  14. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  15. Vision Sensor-Based Road Detection for Field Robot Navigation

    PubMed Central

    Lu, Keyu; Li, Jian; An, Xiangjing; He, Hangen

    2015-01-01

    Road detection is an essential component of field robot navigation systems. Vision sensors play an important role in road detection for their great potential in environmental perception. In this paper, we propose a hierarchical vision sensor-based method for robust road detection in challenging road scenes. More specifically, for a given road image captured by an on-board vision sensor, we introduce a multiple population genetic algorithm (MPGA)-based approach for efficient road vanishing point detection. Superpixel-level seeds are then selected in an unsupervised way using a clustering strategy. Then, according to the GrowCut framework, the seeds proliferate and iteratively try to occupy their neighbors. After convergence, the initial road segment is obtained. Finally, in order to achieve a globally-consistent road segment, the initial road segment is refined using the conditional random field (CRF) framework, which integrates high-level information into road detection. We perform several experiments to evaluate the common performance, scale sensitivity and noise sensitivity of the proposed method. The experimental results demonstrate that the proposed method exhibits high robustness compared to the state of the art. PMID:26610514

  16. Real-time solution of linear computational problems using databases of parametric reduced-order models with arbitrary underlying meshes

    NASA Astrophysics Data System (ADS)

    Amsallem, David; Tezaur, Radek; Farhat, Charbel

    2016-12-01

    A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase of this approach, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds and therefore computing an interpolated linear ROM. The proposed overall model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.

  17. Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer’s Disease

    PubMed Central

    Cheng, Bo; Liu, Mingxia; Li, Zuoyong

    2017-01-01

    Recently, transfer learning has been successfully applied in early diagnosis of Alzheimer’s Disease (AD) based on multi-domain data. However, most of existing methods only use data from a single auxiliary domain, and thus cannot utilize the intrinsic useful correlation information from multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multi-auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary domain data for improving the learning performance in the target domain, compared with several state-of-the-art methods. PMID:27928657

  18. Finger vein recognition based on personalized weight maps.

    PubMed

    Yang, Gongping; Xiao, Rongyang; Yin, Yilong; Yang, Lu

    2013-09-10

    Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs), where different bits have different weight values according to their stabilities in a certain number of training samples from an individual. We first present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that the PWM achieves not only better performance, but also high robustness and reliability. In addition, the PWM can be used as a general framework for binary pattern based recognition.
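    A minimal sketch of the personalized-weight-map idea as described: each bit is weighted by its stability across an individual's training codes, and matching uses a weighted Hamming distance. The code length and the exact weighting rule are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    train = rng.integers(0, 2, size=(10, 256))   # 10 binary codes of one enrollee

    p = train.mean(axis=0)                       # fraction of 1s per bit position
    stability = np.abs(p - 0.5) * 2              # 1 = perfectly stable bit
    weights = stability / stability.sum()        # the personalized weight map

    template = (p > 0.5).astype(int)             # majority-vote enrollment code

    def weighted_distance(code):
        """Weighted Hamming distance: stable bits count more than noisy ones."""
        return float(np.sum(weights * (code != template)))

    print("match score:", weighted_distance(train[0]))
    ```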

  19. Finger Vein Recognition Based on Personalized Weight Maps

    PubMed Central

    Yang, Gongping; Xiao, Rongyang; Yin, Yilong; Yang, Lu

    2013-01-01

    Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs), where different bits have different weight values according to their stabilities in a certain number of training samples from an individual. We first present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that the PWM achieves not only better performance, but also high robustness and reliability. In addition, the PWM can be used as a general framework for binary pattern based recognition. PMID:24025556

  20. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of the art consists of collaborating multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.

  1. Rationally Designed 2D Covalent Organic Framework with a Brick-Wall Topology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Song-Liang; Zhang, Kai; Tan, Jing-Bo

In this paper, we report the design and synthesis of an imine-based two-dimensional covalent organic framework (2D COF) with a novel brick-wall topology, obtained by judiciously choosing a tritopic T-shaped building block and a ditopic linear linker. Unlike the main body of COFs reported to date, which consists of higher-symmetry 2D topologies, the unconventional layered brick-wall topology has only been proposed but never realized experimentally. The brick-wall structure was characterized by powder X-ray diffraction analysis, FT-IR, solid-state 13C NMR spectroscopy, and nitrogen and carbon dioxide adsorption-desorption measurements, as well as theoretical simulations. Our present work opens the door to the design of novel 2D COFs and will broaden the scope of emerging COF materials.

  2. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture, which was assessed to be consistent and repeatable in application and not to lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  3. A physics-based crystallographic modeling framework for describing the thermal creep behavior of Fe-Cr alloys

    DOE PAGES

    Wen, Wei; Capolungo, Laurent; Patra, Anirban; ...

    2017-02-23

In this work, a physics-based thermal creep model is developed based on an understanding of the microstructure of Fe-Cr alloys. The model is associated with a transition-state-theory-based framework that considers the distribution of internal stresses at the sub-material-point level. The thermally activated dislocation glide and climb mechanisms are coupled in the obstacle-bypass processes for both dislocation- and precipitate-type barriers. A kinetic law is proposed to track the evolution of dislocation densities in the subgrain interior and in the cell wall. The predicted results show that this model, embedded in the visco-plastic self-consistent (VPSC) framework, captures the creep behavior well for the primary and steady-state stages under various loading conditions. We also discuss the roles of the mechanisms involved.

  4. Rationally Designed 2D Covalent Organic Framework with a Brick-Wall Topology

    DOE PAGES

    Cai, Song-Liang; Zhang, Kai; Tan, Jing-Bo; ...

    2016-11-23

In this paper, we report the design and synthesis of an imine-based two-dimensional covalent organic framework (2D COF) with a novel brick-wall topology, obtained by judiciously choosing a tritopic T-shaped building block and a ditopic linear linker. Unlike the main body of COFs reported to date, which consists of higher-symmetry 2D topologies, the unconventional layered brick-wall topology has only been proposed but never realized experimentally. The brick-wall structure was characterized by powder X-ray diffraction analysis, FT-IR, solid-state 13C NMR spectroscopy, and nitrogen and carbon dioxide adsorption-desorption measurements, as well as theoretical simulations. Our present work opens the door to the design of novel 2D COFs and will broaden the scope of emerging COF materials.

  5. State-Based Implicit Coordination and Applications

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.

    2011-01-01

    In air traffic management, pairwise coordination is the ability to achieve separation requirements when conflicting aircraft simultaneously maneuver to solve a conflict. Resolution algorithms are implicitly coordinated if they provide coordinated resolution maneuvers to conflicting aircraft when only surveillance data, e.g., position and velocity vectors, is periodically broadcast by the aircraft. This paper proposes an abstract framework for reasoning about state-based implicit coordination. The framework consists of a formalized mathematical development that enables and simplifies the design and verification of implicitly coordinated state-based resolution algorithms. The use of the framework is illustrated with several examples of algorithms and formal proofs of their coordination properties. The work presented here supports the safety case for a distributed self-separation air traffic management concept where different aircraft may use different conflict resolution algorithms and be assured that separation will be maintained.
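
    To make the notion of implicit coordination concrete, the sketch below shows one simple horizontal coordination discriminant computed identically by both aircraft from broadcast position and velocity data. The function name and the specific sign rule are illustrative assumptions, not the paper's verified algorithms.

```python
import numpy as np

def coordination_sign(own_pos, own_vel, traffic_pos, traffic_vel):
    """Shared discriminant epsilon = sign(s x v) from broadcast state data."""
    s = np.asarray(traffic_pos, float) - np.asarray(own_pos, float)  # rel. position
    v = np.asarray(traffic_vel, float) - np.asarray(own_vel, float)  # rel. velocity
    cross = s[0] * v[1] - s[1] * v[0]                                # 2D cross product
    return 1 if cross >= 0 else -1

# When the roles are swapped, both s and v flip sign, so the discriminant is
# unchanged: the aircraft agree on epsilon without negotiation, and each uses
# it to pick its resolution maneuver from the compatible half-plane.
a = coordination_sign([0, 0], [1, 0], [10, 2], [-1, 0])
b = coordination_sign([10, 2], [-1, 0], [0, 0], [1, 0])
assert a == b  # both aircraft derive the same coordination parameter
```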

  6. A paradigm shift toward a consistent modeling framework to assess climate impacts

    NASA Astrophysics Data System (ADS)

    Monier, E.; Paltsev, S.; Sokolov, A. P.; Fant, C.; Chen, H.; Gao, X.; Schlosser, C. A.; Scott, J. R.; Dutkiewicz, S.; Ejaz, Q.; Couzo, E. A.; Prinn, R. G.; Haigh, M.

    2017-12-01

    Estimates of physical and economic impacts of future climate change are subject to substantial challenges. To enrich the currently popular approaches of assessing climate impacts by evaluating a damage function or by multi-model comparisons based on the Representative Concentration Pathways (RCPs), we focus here on integrating impacts into a self-consistent coupled human and Earth system modeling framework that includes modules that represent multiple physical impacts. In a sample application we show that this framework is capable of investigating the physical impacts of climate change and socio-economic stressors. The projected climate impacts vary dramatically across the globe in a set of scenarios with global mean warming ranging between 2.4°C and 3.6°C above pre-industrial by 2100. Unabated emissions lead to substantial sea level rise, acidification that impacts the base of the oceanic food chain, air pollution that exceeds health standards by tenfold, water stress that impacts an additional 1 to 2 billion people globally and agricultural productivity that decreases substantially in many parts of the world. We compare the outcomes from these forward-looking scenarios against the common goal described by the target-driven scenario of 2°C, which results in much smaller impacts. It is challenging for large internationally coordinated exercises to respond quickly to new policy targets. We propose that a paradigm shift toward a self-consistent modeling framework to assess climate impacts is needed to produce information relevant to evolving global climate policy and mitigation strategies in a timely way.

  7. Mitigation for one & all: An integrated framework for mitigation of development impacts on biodiversity and ecosystem services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallis, Heather, E-mail: htallis@tnc.org; Kennedy, Christina M., E-mail: ckennedy@tnc.org; Ruckelshaus, Mary

Emerging development policies and lending standards call for consideration of ecosystem services when mitigating impacts from development, yet little guidance exists to inform this process. Here we propose a comprehensive framework for advancing both biodiversity and ecosystem service mitigation. We clarify a means for choosing representative ecosystem service targets alongside biodiversity targets, identify servicesheds as a useful spatial unit for assessing ecosystem service avoidance, impact, and offset options, and discuss methods for consistent calculation of biodiversity and ecosystem service mitigation ratios. We emphasize the need to move away from area- and habitat-based assessment methods for both biodiversity and ecosystem services towards functional assessments at landscape or seascape scales. Such comprehensive assessments more accurately reflect cumulative impacts and variation in environmental quality, social needs and value preferences. The integrated framework builds on the experience of biodiversity mitigation while addressing the unique opportunities and challenges presented by ecosystem service mitigation. These advances contribute to the growing potential for economic development planning and execution that will minimize impacts on nature and maximize human wellbeing. - Highlights: • This is the first framework for biodiversity and ecosystem service mitigation. • Functional, landscape-scale assessments are ideal for avoidance and offsets. • Servicesheds define the appropriate spatial extent for ecosystem service mitigation. • Mitigation ratios should be calculated consistently and based on standard factors. • Our framework meets the needs of integrated mitigation assessment requirements.

  8. Conceptualizing and assessing improvement capability: a review

    PubMed Central

    Boaden, Ruth; Walshe, Kieran

    2017-01-01

Purpose: The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources: An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction: The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis: The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion: The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146

  9. A Social-Attributional Analysis of Alcohol Response

    PubMed Central

    Fairbairn, Catharine E.; Sayette, Michael A.

    2014-01-01

    Conventional wisdom and survey data indicate that alcohol is a social lubricant and is consumed for its social effects. In contrast, the experimental literature examining alcohol’s effects within a social context reveals that alcohol does not consistently enhance social-emotional experience. We identify a methodological factor that might explain inconsistent alcohol-administration findings, distinguishing between studies featuring unscripted interactions among naïve participants (k = 18) and those featuring scripted social interactions with individuals identified as study confederates (k = 18). While 89% of naïve-participant studies find positive effects of alcohol on mood (d = 0.5), only 11% of confederate studies find evidence of significant alcohol-related mood enhancement (d = −0.01). The naïve-participant versus confederate distinction remains robust after controlling for various moderators including stress manipulations, gender, group size, anxiety outcome measure, and within-group consistency of beverage assignment. Based on the findings of our review, we propose a multidimensional, social-attributional framework for understanding alcohol-related reward. Borrowing organizing principles from attribution theory, the social-attributional approach predicts that alcohol will enhance mood when negative outcomes are perceived to be unstable and/or self-relevant. Our framework proposes that alcohol’s effects within a social context are largely explained by its tendency to free individuals from preoccupation with social rejection, allowing them to access social rewards. The social-attributional approach represents a novel framework for integrating distinct, well-validated concepts derived from several theories of alcohol’s effects. It further presents promising lines of inquiry for future research examining the role of social factors in alcohol reward and addiction susceptibility. PMID:25180806

  10. Higher-order finite-difference formulation of periodic Orbital-free Density Functional Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Swarnava; Suryanarayana, Phanish, E-mail: phanish.suryanarayana@ce.gatech.edu

    2016-02-15

We present a real-space formulation and higher-order finite-difference implementation of periodic Orbital-free Density Functional Theory (OF-DFT). Specifically, utilizing a local reformulation of the electrostatic and kernel terms, we develop a generalized framework for performing OF-DFT simulations with different variants of the electronic kinetic energy. In particular, we propose a self-consistent field (SCF) type fixed-point method for calculations involving linear-response kinetic energy functionals. In this framework, evaluation of both the electronic ground-state and forces on the nuclei are amenable to computations that scale linearly with the number of atoms. We develop a parallel implementation of this formulation using the finite-difference discretization. We demonstrate that higher-order finite-differences can achieve relatively large convergence rates with respect to mesh-size in both the energies and forces. Additionally, we establish that the fixed-point iteration converges rapidly, and that it can be further accelerated using extrapolation techniques like Anderson's mixing. We validate the accuracy of the results by comparing the energies and forces with plane-wave methods for selected examples, including the vacancy formation energy in Aluminum. Overall, the suitability of the proposed formulation for scalable high performance computing makes it an attractive choice for large-scale OF-DFT calculations consisting of thousands of atoms.
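
    To illustrate the fixed-point view of the SCF cycle and its acceleration, here is a toy sketch of Anderson mixing applied to a generic fixed-point map x = g(x); g() stands in for one SCF cycle (density in, density out), and all names and parameters are assumptions rather than the paper's implementation.

```python
import numpy as np

def anderson(g, x0, m=5, beta=0.5, tol=1e-10, maxit=200):
    """Solve x = g(x) by damped iteration with Anderson acceleration."""
    xs, fs = [], []                          # history of iterates and residuals
    x = np.atleast_1d(np.asarray(x0, float))
    for it in range(maxit):
        fx = g(x) - x                        # fixed-point residual
        if np.linalg.norm(fx) < tol:
            return x, it
        xs.append(x); fs.append(fx)
        xs, fs = xs[-m:], fs[-m:]            # keep at most m previous pairs
        if len(fs) > 1:
            dF = np.stack([fs[k+1] - fs[k] for k in range(len(fs)-1)], 1)
            dX = np.stack([xs[k+1] - xs[k] for k in range(len(xs)-1)], 1)
            gamma, *_ = np.linalg.lstsq(dF, fx, rcond=None)
            x = x + beta*fx - (dX + beta*dF) @ gamma   # Anderson update
        else:
            x = x + beta*fx                  # plain damped (linear) mixing
    return x, maxit

# Example: scalar fixed point of g(x) = cos(x); converges to ~0.739085 in
# far fewer iterations than plain damped mixing.
x_star, iters = anderson(np.cos, 1.0)
print(x_star, iters)
```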

  11. Dynamic experiment design regularization approach to adaptive imaging with array radar/SAR sensor systems.

    PubMed

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics of this solution space. Next, the "model-free" variational analysis (VA)-based image enhancement approach and the "model-based" descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate kernel adaptive anisotropic windowing with projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations.

  12. Smoothed Particle Hydrodynamics: A consistent model for interfacial multiphase fluid flow simulations

    NASA Astrophysics Data System (ADS)

    Krimi, Abdelkader; Rezoug, Mehdi; Khelladi, Sofiane; Nogueira, Xesús; Deligant, Michael; Ramírez, Luis

    2018-04-01

In this work, a consistent Smoothed Particle Hydrodynamics (SPH) model for the simulation of interfacial multiphase fluid flows is proposed. A modification of the Continuum Surface Stress (CSS) formulation [1] to enhance stability near the fluid interface is developed in the framework of the SPH method. A non-conservative first-order consistent operator is used to compute the divergence of the surface stress tensor. This formulation retains all the advantages of the one proposed by Adami et al. [2] and, in addition, can be applied to simulations with more than two fluid phases. Moreover, the generalized wall boundary conditions [3] are modified to be well adapted to multiphase flows with different densities and viscosities, which allows the application of the technique to wall-bounded multiphase flows. We also present a particle redistribution strategy, as an extension of the damping technique presented in [3], to smooth the initial transient phase of gravitational multiphase flow simulations. Several computational tests are investigated to show the accuracy, convergence and applicability of the proposed SPH interfacial multiphase model.
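
    The "first-order consistency" of the gradient/divergence operator can be illustrated with a minimal renormalized SPH gradient, which is exact for linear fields on arbitrary particle distributions. The kernel choice, particle layout, and names below are assumptions, not the paper's code.

```python
import numpy as np

def cubic_spline_grad(rij, h):
    """Gradient of the 2D cubic-spline kernel with respect to particle i."""
    r = np.linalg.norm(rij)
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)           # 2D normalization constant
    if q < 1.0:
        dw = sigma * (-3.0*q + 2.25*q**2) / h     # dW/dr on the inner branch
    elif q < 2.0:
        dw = -sigma * 0.75 * (2.0 - q)**2 / h     # dW/dr on the outer branch
    else:
        return np.zeros(2)
    return dw * rij / max(r, 1e-12)               # unit vector times dW/dr

def consistent_gradient(i, pos, f, vol, h):
    """Renormalized gradient: B_i * sum_j V_j (f_j - f_i) grad W_ij."""
    grad = np.zeros(2)
    B = np.zeros((2, 2))                          # correction (renormalization) matrix
    for j in range(len(pos)):
        if j == i:
            continue
        gw = cubic_spline_grad(pos[i] - pos[j], h)
        grad += vol[j] * (f[j] - f[i]) * gw
        B += vol[j] * np.outer(pos[j] - pos[i], gw)
    return np.linalg.solve(B, grad)               # exact for linear fields

# On a uniform particle patch, the linear field f = 2x + 3y gives (2, 3).
xs = np.linspace(0, 1, 11)
pos = np.array([[x, y] for x in xs for y in xs])
f = 2*pos[:, 0] + 3*pos[:, 1]
vol = np.full(len(pos), 0.1**2)
print(consistent_gradient(60, pos, f, vol, h=0.15))
```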

  13. A conceptual framework for clutch-size evolution in songbirds.

    PubMed

    Martin, Thomas E

    2014-03-01

    Causes of evolved differences in clutch size among songbird species remain debated. I propose a new conceptual framework that integrates aspects of traditional life-history theory while including novel elements to explain evolution of clutch size among songbirds. I review evidence that selection by nest predation on length of time that offspring develop in the nest creates a gradient in offspring characteristics at nest leaving (fledging), including flight mobility, spatial dispersion, and self-feeding rate. I postulate that this gradient has consequences for offspring mortality rates and parental energy expenditure per offspring. These consequences then determine how reproductive effort is partitioned among offspring, while reproductive effort evolves from age-specific mortality effects. Using data from a long-term site in Arizona, as well as from the literature, I provide support for hypothesized relationships. Nestling development period consistently explains fledgling mortality, energy expenditure per offspring, and clutch size while accounting for reproductive effort (i.e., total energy expenditure) to thereby support the framework. Tests in this article are not definitive, but they document previously unrecognized relationships and address diverse traits (developmental strategies, parental care strategies, energy requirements per offspring, evolution of reproductive effort, clutch size) that justify further investigations of hypotheses proposed here.

  14. Patch forest: a hybrid framework of random forest and patch-based segmentation

    NASA Astrophysics Data System (ADS)

    Xie, Zhongliu; Gillies, Duncan

    2016-03-01

The development of an accurate, robust and fast segmentation algorithm has long been a research focus in medical computer vision. State-of-the-art practices often involve non-rigidly registering a target image with a set of training atlases for label propagation over the target space to perform segmentation, a.k.a. multi-atlas label propagation (MALP). In recent years, the patch-based segmentation (PBS) framework has gained wide attention due to its advantage of relaxing the strict voxel-to-voxel correspondence to a series of pair-wise patch comparisons for contextual pattern matching. Despite the high accuracy reported in many scenarios, computational efficiency has consistently been a major obstacle for both approaches. Inspired by recent work on random forests, in this paper we propose a patch forest approach which, by equipping the conventional PBS with a fast patch search engine, is able to boost segmentation speed significantly while retaining an equal level of accuracy. In addition, a fast forest training mechanism is also proposed, with the use of a dynamic grid framework to efficiently approximate data-compactness computation and a 3D integral image technique for fast box-feature retrieval.
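
    The 3D integral image mentioned for fast box-feature retrieval is easy to sketch: after one precomputation pass, any axis-aligned box sum costs eight lookups regardless of box size. The padding convention and names below are my assumptions.

```python
import numpy as np

def integral_image_3d(vol):
    # Cumulative sums along each axis; pad with a zero layer so that boxes
    # starting at index 0 need no special casing.
    s = vol.cumsum(0).cumsum(1).cumsum(2)
    return np.pad(s, ((1, 0), (1, 0), (1, 0)))

def box_sum(S, z0, y0, x0, z1, y1, x1):
    """Sum of vol[z0:z1, y0:y1, x0:x1] by 3D inclusion-exclusion."""
    return (S[z1, y1, x1] - S[z0, y1, x1] - S[z1, y0, x1] - S[z1, y1, x0]
            + S[z0, y0, x1] + S[z0, y1, x0] + S[z1, y0, x0] - S[z0, y0, x0])

vol = np.random.rand(32, 32, 32)
S = integral_image_3d(vol)
# The eight-lookup box sum matches the brute-force sum:
assert np.isclose(box_sum(S, 2, 3, 4, 10, 12, 14), vol[2:10, 3:12, 4:14].sum())
```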

  15. Development of software for computing forming information using a component based approach

    NASA Astrophysics Data System (ADS)

    Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye

    2009-12-01

In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and productivity has accordingly increased drastically. Despite such improvements in manufacturing technology, however, the development of an automatic system for fabricating curved hull plates remains at an early stage, since the hardware and software for automating the curved hull fabrication process must be developed differently depending on the dimensions of the plates, the forming methods and the manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for the automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimal effort. Examples of the proposed framework applied to medium and large shipyards are presented.
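
    A minimal sketch of the kind of "plug-in" component contract the paper argues for might look as follows; the class and method names are assumptions, not the ooCBD interfaces actually used.

```python
from abc import ABC, abstractmethod

class FormingInfoComponent(ABC):
    """Contract for a component that computes plate-forming information."""

    @abstractmethod
    def compute_forming_info(self, plate_geometry: dict) -> dict:
        ...

class LineHeatingComponent(FormingInfoComponent):
    def compute_forming_info(self, plate_geometry: dict) -> dict:
        # A real component would derive heating lines from curvature data.
        return {"method": "line-heating", "paths": [], "input": plate_geometry}

REGISTRY: dict[str, type[FormingInfoComponent]] = {}

def register(name: str, cls: type[FormingInfoComponent]) -> None:
    REGISTRY[name] = cls          # each shipyard plugs in its own variant

register("line-heating", LineHeatingComponent)
info = REGISTRY["line-heating"]().compute_forming_info({"width_m": 3.2})
print(info["method"])
```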

  16. A conceptual framework for clutch size evolution in songbirds

    USGS Publications Warehouse

    Martin, Thomas E.

    2014-01-01

    Causes of evolved differences in clutch size among songbird species remain debated. I propose a new conceptual framework that integrates aspects of traditional life history theory, while including novel elements, to explain evolution of clutch size among songbirds. I review evidence that selection by nest predation on length of time that offspring develop in the nest creates a gradient in offspring characteristics at nest-leaving (fledging), including flight mobility, spatial dispersion, and self-feeding rate. I postulate that this gradient has consequences for offspring mortality rates and parental energy expenditure per offspring. These consequences then determine how reproductive effort is partitioned among offspring, while reproductive effort evolves from age-specific mortality effects. Using data from a long-term site in Arizona, as well as from the literature, I provide support for hypothesized relationships. Nestling development period consistently explains fledgling mortality, energy expenditure per offspring, and clutch size while accounting for reproductive effort (i.e., total energy expenditure) to thereby support the framework. Tests in this paper are not definitive, but they document previously unrecognized relationships and address diverse traits (developmental strategies, parental care strategies, energy requirements per offspring, evolution of reproductive effort, clutch size) that justify further investigations of hypotheses proposed here.

  17. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with performance similar to that of traditional methods. The event is defined to reflect the need for global information: an update is triggered when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-wheel mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  18. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with performance similar to that of traditional methods. The event is defined to reflect the need for global information: an update is triggered when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-wheel mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.
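
    A minimal 1D sketch of the event-based fusion idea in both records above: predict with the IMU at every step, but apply the global-sensor correction only when the error covariance exceeds a predefined limit. The models, threshold, and names are illustrative assumptions.

```python
import numpy as np

def event_based_kf(accel, gps, dt=0.01, q=0.05, r_gps=0.04, p_max=0.5):
    x = np.zeros(2)                    # state: [position, velocity]
    P = np.eye(2)
    F = np.array([[1, dt], [0, 1]])
    B = np.array([0.5*dt**2, dt])
    H = np.array([[1.0, 0.0]])         # global sensor measures position only
    updates = 0
    for k, a in enumerate(accel):
        x = F @ x + B * a              # IMU-driven prediction, every step
        P = F @ P @ F.T + q * np.eye(2)
        if np.trace(P) > p_max:        # event: uncertainty exceeds the limit
            y = gps[k] - H @ x
            S = H @ P @ H.T + r_gps
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
            updates += 1
    return x, updates                  # far fewer updates than time steps

x, n = event_based_kf(np.zeros(1000), np.zeros(1000))
print(x, n)
```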

  19. A competency framework for colonoscopy training derived from cognitive task analysis techniques and expert review.

    PubMed

    Zupanc, Christine M; Burgess-Limerick, Robin; Hill, Andrew; Riek, Stephan; Wallis, Guy M; Plooy, Annaliese M; Horswill, Mark S; Watson, Marcus O; Hewett, David G

    2015-12-01

    Colonoscopy is a difficult cognitive-perceptual-motor task. Designing an appropriate instructional program for such a task requires an understanding of the knowledge, skills and attitudes underpinning the competency required to perform the task. Cognitive task analysis techniques provide an empirical means of deriving this information. Video recording and a think-aloud protocol were conducted while 20 experienced endoscopists performed colonoscopy procedures. "Cued-recall" interviews were also carried out post-procedure with nine of the endoscopists. Analysis of the resulting transcripts employed the constant comparative coding method within a grounded theory framework. The resulting draft competency framework was modified after review during semi-structured interviews conducted with six expert endoscopists. The proposed colonoscopy competency framework consists of twenty-seven skill, knowledge and attitude components, grouped into six categories (clinical knowledge; colonoscope handling; situation awareness; heuristics and strategies; clinical reasoning; and intra- and inter-personal). The colonoscopy competency framework provides a principled basis for the design of a training program, and for the design of formative assessment to gauge progress towards attaining the knowledge, skills and attitudes underpinning the achievement of colonoscopy competence.

  20. A technology selection framework for supporting delivery of patient-oriented health interventions in developing countries

    PubMed Central

    Chan, Connie V.; Kaufman, David R.

    2009-01-01

Health information technologies (HIT) have great potential to advance health care globally. In particular, HIT can provide innovative approaches and methodologies to overcome the range of access and resource barriers specific to developing countries. However, there is a paucity of models and empirical evidence informing the technology selection process in these settings. We propose a framework for selecting patient-oriented technologies in developing countries. The selection guidance process is structured by a set of filters that impose particular constraints and serve to narrow the space of possible decisions. The framework consists of three levels of factors: 1) situational factors, 2) the technology and its relationship with health interventions and with target patients, and 3) empirical evidence. We demonstrate the utility of the framework in the context of mobile phones for behavioral health interventions to reduce risk factors for cardiovascular disease. The framework can be applied to health interventions across health domains to explore how and whether available technologies can support delivery of the associated types of interventions with the target populations. PMID:19796709

  1. TROIKA: a general framework for heart rate monitoring using wrist-type photoplethysmographic signals during intensive physical exercise.

    PubMed

    Zhang, Zhilin; Pi, Zhouyue; Liu, Benyuan

    2015-02-01

Heart rate monitoring using wrist-type photoplethysmographic (PPG) signals during subjects' intensive exercise is a difficult problem, since the signals are contaminated by extremely strong motion artifacts caused by the subjects' hand movements. So far, few works have studied this problem. In this study, a general framework, termed TROIKA, is proposed, which consists of signal decomposiTion for denoising, sparse signal RecOnstructIon for high-resolution spectrum estimation, and spectral peaK trAcking with verification. The TROIKA framework has high estimation accuracy and is robust to strong motion artifacts. Many variants can be straightforwardly derived from this framework. Experimental results on datasets recorded from 12 subjects during fast running at a peak speed of 15 km/h showed that the average absolute error of heart rate estimation was 2.34 beats per minute, and the Pearson correlation between the estimates and the ground truth of heart rate was 0.992. This framework is of great value to wearable devices, such as smartwatches, which use PPG signals to monitor heart rate for fitness.
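
    The spectral peak tracking step can be sketched as follows: estimate the dominant spectral peak in each signal window, constrained to lie near the previous heart-rate estimate. TROIKA's decomposition and sparse-reconstruction stages are not shown; the window length, search band, and names are assumptions.

```python
import numpy as np

def track_hr(windows, fs=125.0, prev_bpm=80.0, max_jump_bpm=10.0):
    estimates = []
    for w in windows:
        w = np.asarray(w, float) - np.mean(w)
        n = 8 * len(w)                                   # zero-pad for finer bins
        spec = np.abs(np.fft.rfft(w * np.hanning(len(w)), n=n))**2
        freqs = np.fft.rfftfreq(n, 1.0/fs) * 60.0        # frequency axis in BPM
        band = np.abs(freqs - prev_bpm) <= max_jump_bpm  # search near last estimate
        idx = np.flatnonzero(band)
        prev_bpm = freqs[idx[np.argmax(spec[idx])]]      # strongest in-band peak
        estimates.append(prev_bpm)
    return estimates

# An 8-second window of a synthetic 78-BPM "PPG" signal:
t = np.arange(0, 8, 1/125.0)
sig = np.sin(2*np.pi*(78/60.0)*t)
print(track_hr([sig]))   # peak near 78 BPM
```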

  2. The Genome-based Knowledge Management in Cycles model: a complex adaptive systems framework for implementation of genomic applications.

    PubMed

    Arar, Nedal; Knight, Sara J; Modell, Stephen M; Issa, Amalia M

    2011-03-01

    The main mission of the Genomic Applications in Practice and Prevention Network™ is to advance collaborative efforts involving partners from across the public health sector to realize the promise of genomics in healthcare and disease prevention. We introduce a new framework that supports the Genomic Applications in Practice and Prevention Network mission and leverages the characteristics of the complex adaptive systems approach. We call this framework the Genome-based Knowledge Management in Cycles model (G-KNOMIC). G-KNOMIC proposes that the collaborative work of multidisciplinary teams utilizing genome-based applications will enhance translating evidence-based genomic findings by creating ongoing knowledge management cycles. Each cycle consists of knowledge synthesis, knowledge evaluation, knowledge implementation and knowledge utilization. Our framework acknowledges that all the elements in the knowledge translation process are interconnected and continuously changing. It also recognizes the importance of feedback loops, and the ability of teams to self-organize within a dynamic system. We demonstrate how this framework can be used to improve the adoption of genomic technologies into practice using two case studies of genomic uptake.

  3. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: Multi-center molecular Ornstein-Zernike self-consistent field approach

    NASA Astrophysics Data System (ADS)

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-01

In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl- + CH3Cl → ClCH3 + Cl-) in aqueous solution. The results for solute molecular properties and solvation structures obtained with the present approach are in reasonable agreement with those obtained by other hybrid frameworks and by experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  4. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: multi-center molecular Ornstein-Zernike self-consistent field approach.

    PubMed

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-07

In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl(-) + CH3Cl → ClCH3 + Cl(-)) in aqueous solution. The results for solute molecular properties and solvation structures obtained with the present approach are in reasonable agreement with those obtained by other hybrid frameworks and by experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  5. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  6. Accountable care around the world: a framework to guide reform strategies.

    PubMed

    McClellan, Mark; Kent, James; Beales, Stephen J; Cohen, Samuel I A; Macdonnell, Michael; Thoumi, Andrea; Abdulmalik, Mariam; Darzi, Ara

    2014-09-01

    Accountable care--a way to align health care payments with patient-focused reform goals--is currently being pursued in the United States, but its principles are also being applied in many other countries. In this article we review experiences with such reforms to offer a globally applicable definition of an accountable care system and propose a conceptual framework for characterizing and assessing accountable care reforms. The framework consists of five components: population, outcomes, metrics and learning, payments and incentives, and coordinated delivery. We describe how the framework applies to accountable care reforms that are already being implemented in Spain and Singapore. We also describe how it can be used to map progress through increasingly sophisticated levels of reforms. We recommend that policy makers pursuing accountable care reforms emphasize the following steps: highlight population health and wellness instead of just treating illness; pay for outcomes instead of activities; create a more favorable environment for collaboration and coordinated care; and promote interoperable data systems. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Equivalent formulations of “the equation of life”

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2014-07-01

    Motivated by progress in theoretical biology a recent proposal on a general and quantitative dynamical framework for nonequilibrium processes and dynamics of complex systems is briefly reviewed. It is nothing but the evolutionary process discovered by Charles Darwin and Alfred Wallace. Such general and structured dynamics may be tentatively named “the equation of life”. Three equivalent formulations are discussed, and it is also pointed out that such a quantitative dynamical framework leads naturally to the powerful Boltzmann-Gibbs distribution and the second law in physics. In this way, the equation of life provides a logically consistent foundation for thermodynamics. This view clarifies a particular outstanding problem and further suggests a unifying principle for physics and biology.

  8. A unified framework for gesture recognition and spatiotemporal gesture segmentation.

    PubMed

    Alon, Jonathan; Athitsos, Vassilis; Yuan, Quan; Sclaroff, Stan

    2009-09-01

    Within the context of hand gesture recognition, spatiotemporal gesture segmentation is the task of determining, in a video sequence, where the gesturing hand is located and when the gesture starts and ends. Existing gesture recognition methods typically assume either known spatial segmentation or known temporal segmentation, or both. This paper introduces a unified framework for simultaneously performing spatial segmentation, temporal segmentation, and recognition. In the proposed framework, information flows both bottom-up and top-down. A gesture can be recognized even when the hand location is highly ambiguous and when information about when the gesture begins and ends is unavailable. Thus, the method can be applied to continuous image streams where gestures are performed in front of moving, cluttered backgrounds. The proposed method consists of three novel contributions: a spatiotemporal matching algorithm that can accommodate multiple candidate hand detections in every frame, a classifier-based pruning framework that enables accurate and early rejection of poor matches to gesture models, and a subgesture reasoning algorithm that learns which gesture models can falsely match parts of other longer gestures. The performance of the approach is evaluated on two challenging applications: recognition of hand-signed digits gestured by users wearing short-sleeved shirts, in front of a cluttered background, and retrieval of occurrences of signs of interest in a video database containing continuous, unsegmented signing in American Sign Language (ASL).

  9. A complete-pelvis segmentation framework for image-free total hip arthroplasty (THA): methodology and clinical study.

    PubMed

    Xie, Weiguo; Franke, Jochen; Chen, Cheng; Grützner, Paul A; Schumann, Steffen; Nolte, Lutz-P; Zheng, Guoyan

    2015-06-01

Complete-pelvis segmentation in antero-posterior pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty. A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemi-pelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As the input to the second module, the delineated left hemi-pelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right hemi-pelvis segmentation, and the right hemi-pelvis was segmented by the same three-stage method. Two experiments conducted on 143 and 40 AP radiographs, respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study investigating the postoperative assessment of acetabular cup orientation based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph costs less than one minute. Although further validation is needed, the preliminary results imply the clinical applicability of the proposed framework for image-free THA. Copyright © 2014 John Wiley & Sons, Ltd.

  10. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  11. Activity recognition using dynamic multiple sensor fusion in body sensor networks.

    PubMed

    Gao, Lei; Bourke, Alan K; Nelson, John

    2012-01-01

Multiple-sensor fusion is a main research direction for activity recognition. However, there are two challenges in such systems: the energy consumed by wireless transmission, and classifier design in the presence of a dynamic feature vector. This paper proposes a multi-sensor fusion framework which consists of a sensor selection module and a hierarchical classifier. The sensor selection module uses convex optimization to select the sensor subset in real time. The hierarchical classifier combines a Decision Tree classifier with a Naïve Bayes classifier. A dataset collected from 8 subjects, who performed 8 scenario activities, was used to evaluate the proposed system. The results show that the proposed system can substantially reduce energy consumption while maintaining recognition accuracy.
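
    One plausible reading of the hierarchical classifier is sketched below: a fast decision tree answers first, and a Naïve Bayes model arbitrates only when the tree is not confident. The confidence rule and threshold are assumptions, not the paper's exact design.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

class HierarchicalClassifier:
    def __init__(self, threshold=0.8):
        self.tree = DecisionTreeClassifier(max_depth=5, random_state=0)
        self.nb = GaussianNB()
        self.threshold = threshold

    def fit(self, X, y):
        self.tree.fit(X, y)
        self.nb.fit(X, y)
        return self

    def predict(self, X):
        proba = self.tree.predict_proba(X)
        confident = proba.max(axis=1) >= self.threshold
        out = self.tree.classes_[proba.argmax(axis=1)]
        if (~confident).any():                 # defer to Naive Bayes
            out[~confident] = self.nb.predict(X[~confident])
        return out

# Synthetic two-class activity features:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(2, 1, (100, 6))])
y = np.repeat([0, 1], 100)
clf = HierarchicalClassifier().fit(X, y)
print((clf.predict(X) == y).mean())
```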

  12. On the Design of Smart Homes: A Framework for Activity Recognition in Home Environment.

    PubMed

    Cicirelli, Franco; Fortino, Giancarlo; Giordano, Andrea; Guerrieri, Antonio; Spezzano, Giandomenico; Vinci, Andrea

    2016-09-01

A smart home is a home environment enriched with sensing, actuation, communication and computation capabilities which permit adapting it to the inhabitants' preferences and requirements. Establishing a proper strategy of actuation on the home environment can require complex computational tasks on the sensed data. This is the case for activity recognition, which consists in retrieving high-level knowledge about what occurs in the home environment and about the behaviour of the inhabitants. The inherent complexity of this application domain calls for tools able to properly support the design and implementation phases. This paper proposes a framework for the design and implementation of smart home applications focused on activity recognition in home environments. The framework mainly relies on the Cloud-assisted Agent-based Smart home Environment (CASE) architecture, which offers basic abstraction entities that allow designers to easily design and implement smart home applications. CASE is a three-layered architecture which exploits the distributed multi-agent paradigm and cloud technology for offering analytics services. Details about how to implement activity recognition on the CASE architecture are supplied, focusing on the low-level technological issues as well as the algorithms and methodologies useful for activity recognition. The effectiveness of the framework is shown through a case study consisting of daily activity recognition of a person in a home environment.

  13. On the thermomechanical coupling in dissipative materials: A variational approach for generalized standard materials

    NASA Astrophysics Data System (ADS)

    Bartels, A.; Bartel, T.; Canadija, M.; Mosler, J.

    2015-09-01

This paper deals with the thermomechanical coupling in dissipative materials. The focus lies on finite strain plasticity theory and the temperature increase resulting from plastic deformation. For this type of problem, two fundamentally different modeling approaches can be found in the literature: (a) models based on thermodynamical considerations and (b) models based on the so-called Taylor-Quinney factor. While a naive straightforward implementation of thermodynamically consistent approaches usually leads to an over-prediction of the temperature increase due to plastic deformation, models relying on the Taylor-Quinney factor often violate fundamental physical principles such as the first and the second law of thermodynamics. In this paper, a thermodynamically consistent framework is elaborated which indeed allows a realistic prediction of the temperature evolution. In contrast to previously proposed frameworks, it is based on a fully three-dimensional, finite strain setting and it naturally covers coupled isotropic and kinematic hardening, also based on non-associative evolution equations. Considering a variationally consistent description based on incremental energy minimization, it is shown that the aforementioned problem (thermodynamic consistency and a realistic temperature prediction) is essentially equivalent to correctly defining the decomposition of the total energy into stored and dissipative parts. Interestingly, this decomposition shows strong analogies to the Taylor-Quinney factor. In this respect, the Taylor-Quinney factor can be well motivated from a physical point of view. Furthermore, certain intervals for this factor can be derived in order to guarantee that fundamental physical principles are fulfilled a priori. Representative examples demonstrate the predictive capabilities of the final constitutive modeling framework.
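
    For reference, the standard heat-source relation behind the Taylor-Quinney discussion reads as follows; this is the textbook form, not the paper's more general variational setting:

```latex
% Temperature evolution with plastic dissipation scaled by the
% Taylor--Quinney factor \beta (density \rho, heat capacity c_p,
% conductivity k, stress \sigma, plastic strain rate \dot{\varepsilon}^p):
\rho\, c_p\, \dot{T}
  = \beta\, \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}^{p}
  + \nabla \cdot \left( k\, \nabla T \right)
% Thermodynamic consistency requires the dissipation to be non-negative,
% which is what bounds the admissible interval for \beta.
```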

  14. Developing Urban Environment Indicators for Neighborhood Sustainability Assessment in Tripoli-Libya

    NASA Astrophysics Data System (ADS)

    Elgadi, Ahmed. A.; Hakim Ismail, Lokman; Abass, Fatma; Ali, Abdelmuniem

    2016-11-01

Sustainability assessment frameworks are becoming increasingly important in assisting the transition towards a sustainable urban environment. The urban environment is a complex system and requires regular monitoring and evaluation through a set of relevant indicators. An indicator provides information about the state of the environment through a quantitative value, and sustainability assessment requires indicators to be considered at all spatial scales in order to provide useful information on urban environmental sustainability in Tripoli, Libya. Detailed data are necessary to assess environmental change in the urban environment at a local scale and to ease the transfer of this information to the national and global stages. This paper proposes a set of key indicators to monitor the urban environmental sustainability of Libyan residential neighborhoods. The proposed environmental indicator framework measures the sustainability performance of an urban environment through 13 sub-categories consisting of 21 indicators. The paper also explains the theoretical foundations for the selection of all indicators with reference to previous studies.

  15. What is adaptive about adaptive decision making? A parallel constraint satisfaction account.

    PubMed

    Glöckner, Andreas; Hilbig, Benjamin E; Jekel, Marc

    2014-12-01

There is broad consensus that human cognition is adaptive. However, the vital question of how exactly this adaptivity is achieved has remained largely open. Herein, we contrast two frameworks which account for adaptive decision making, namely broad and general single-mechanism accounts vs. multi-strategy accounts. We propose and fully specify a single-mechanism model for decision making based on parallel constraint satisfaction processes (PCS-DM) and contrast it theoretically and empirically against a multi-strategy account. To achieve sufficiently sensitive tests, we rely on a multiple-measure methodology including choice, reaction time, and confidence data as well as eye-tracking. Results show that manipulating the environmental structure produces clear adaptive shifts in choice patterns - as both frameworks would predict. However, results on the process level (reaction time, confidence), in information acquisition (eye-tracking), and from cross-predicting choice consistently corroborate single-mechanism accounts in general, and the proposed parallel constraint satisfaction model for decision making in particular. Copyright © 2014 Elsevier B.V. All rights reserved.
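
    A toy parallel constraint satisfaction network for a binary choice can be sketched as follows: cue nodes excite the option they favor, the options inhibit each other, and activations are iterated until the network settles. The weights, clamping, and update rule below are illustrative assumptions, not the exact PCS-DM specification.

```python
import numpy as np

def pcs_choice(validities, evidence, rate=0.1, decay=0.05, floor=-1.0,
               ceil=1.0, steps=500):
    # Nodes: [cue_1..cue_n, option_A, option_B]; evidence[i] = +1 if cue i
    # favors option A, -1 if it favors option B.
    n = len(validities)
    W = np.zeros((n + 2, n + 2))
    for i, (v, e) in enumerate(zip(validities, evidence)):
        W[i, n] = v if e > 0 else 0.0      # cue -> option A
        W[i, n + 1] = v if e < 0 else 0.0  # cue -> option B
    W[n, n + 1] = -0.2                     # mutual inhibition between options
    W = W + W.T                            # symmetric constraint network
    a = np.zeros(n + 2)
    a[:n] = 1.0                            # cue inputs clamped on
    for _ in range(steps):
        net = W @ a
        grow = np.where(net > 0, ceil - a, a - floor)
        a = (1 - decay) * a + rate * net * grow
        a[:n] = 1.0                        # keep cues clamped
    return ("A" if a[n] > a[n + 1] else "B", round(a[n], 3), round(a[n + 1], 3))

# Three cues: two moderately valid cues favor A, one strong cue favors B.
print(pcs_choice([0.6, 0.6, 0.8], [+1, +1, -1]))
```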

  16. Thinking as the control of imagination: a conceptual framework for goal-directed systems.

    PubMed

    Pezzulo, Giovanni; Castelfranchi, Cristiano

    2009-07-01

    This paper offers a conceptual framework which (re)integrates goal-directed control, motivational processes, and executive functions, and suggests a developmental pathway from situated action to higher level cognition. We first illustrate a basic computational (control-theoretic) model of goal-directed action that makes use of internal modeling. We then show that by adding the problem of selection among multiple action alternatives motivation enters the scene, and that the basic mechanisms of executive functions such as inhibition, the monitoring of progresses, and working memory, are required for this system to work. Further, we elaborate on the idea that the off-line re-enactment of anticipatory mechanisms used for action control gives rise to (embodied) mental simulations, and propose that thinking consists essentially in controlling mental simulations rather than directly controlling behavior and perceptions. We conclude by sketching an evolutionary perspective of this process, proposing that anticipation leveraged cognition, and by highlighting specific predictions of our model.

  17. Determination of water environment standards based on water quality criteria in China: Limitations and feasibilities.

    PubMed

    Wang, Tieyu; Zhou, Yunqiao; Bi, Cencen; Lu, Yonglong; He, Guizhen; Giesy, John P

    2017-07-01

    There is a need to formulate water environment standards (WESs) from the current water quality criteria (WQC) in China. To this end, we briefly summarize typical mechanisms applied in several countries with longer histories of developing WESs, and three limitations to formulating WESs in China were identified. After analyzing the feasibility factors including economic development, scientific support capability and environmental policies, we realized that China is still not ready for a complete change from its current nation-wide unified WES system to a local-standard-based system. Thus, we proposed a framework for transformation from WQC to WESs in China. The framework consists of three parts, including responsibilities, processes and policies. The responsibilities include research authorization, development of guidelines, and collection of information, at both national and local levels; the processes include four steps and an impact factor system to establish water quality standards; and the policies include seven specific proposals. Copyright © 2016. Published by Elsevier B.V.

  18. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties.
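
    A capability profile of the kind described (unit name, manufacturing functions, other class properties) can be sketched as a simple data structure with a toy matching rule; the concrete ISO 16100 templates are XML-based and richer, so everything below is an assumption for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityProfile:
    unit_name: str
    manufacturing_functions: set[str]
    properties: dict[str, str] = field(default_factory=dict)

    def supports(self, required: "CapabilityProfile") -> bool:
        """A unit interoperates if it covers the required functions."""
        return required.manufacturing_functions <= self.manufacturing_functions

scheduler = CapabilityProfile(
    "CellScheduler-X", {"dispatching", "sequencing", "status-reporting"},
    {"vendor": "ACME", "interface": "OPC-UA"})
requirement = CapabilityProfile("line-requirement", {"dispatching", "sequencing"})
print(scheduler.supports(requirement))   # True
```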

  19. Framework for cognitive analysis of dynamic perfusion computed tomography with visualization of large volumetric data

    NASA Astrophysics Data System (ADS)

    Hachaj, Tomasz; Ogiela, Marek R.

    2012-10-01

    The proposed framework for cognitive analysis of perfusion computed tomography images is a fusion of image processing, pattern recognition, and image analysis procedures. The output data of the algorithm consist of: regions of perfusion abnormalities, anatomy-atlas descriptions of brain tissues, measures of perfusion parameters, and a prognosis for infarcted tissues. That information is superimposed onto volumetric computed tomography data and displayed to radiologists. Our rendering algorithm enables the rendering of large volumes on off-the-shelf hardware. This portability of the rendering solution is important because the framework can run without expensive dedicated hardware. Other important factors are the theoretically unlimited size of the rendered volume and the possibility of trading off image quality for rendering speed. Such high-quality visualizations may be further used for intelligent identification of brain perfusion abnormalities and computer-aided diagnosis of selected types of pathologies.

  20. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  1. A viable logarithmic f(R) model for inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amin, M.; Khalil, S.; Salah, M.

    2016-08-18

    Inflation in the framework of f(R) modified gravity is revisited. We study the conditions that f(R) should satisfy in order to lead to a viable inflationary model in the original form and in the Einstein frame. Based on these criteria we propose a new logarithmic model as a potential candidate for f(R) theories aiming to describe inflation consistent with observations from Planck satellite (2015). The model predicts scalar spectral index 0.9615

  2. Computer-assisted framework for machine-learning-based delineation of GTV regions on datasets of planning CT and PET/CT images.

    PubMed

    Ikushima, Koujiro; Arimura, Hidetaka; Jin, Ze; Yabu-Uchi, Hidetake; Kuwazuru, Jumpei; Shioyama, Yoshiyuki; Sasaki, Tomonari; Honda, Hiroshi; Sasaki, Masayuki

    2017-01-01

    We have proposed a computer-assisted framework for machine-learning-based delineation of gross tumor volumes (GTVs) following an optimum contour selection (OCS) method. The key idea of the proposed framework was to feed image features around GTV contours (determined based on the knowledge of radiation oncologists) into a machine-learning classifier during the training step, after which the classifier produces the 'degree of GTV' for each voxel in the testing step. Initial GTV regions were extracted using a support vector machine (SVM) that learned the image features inside and outside each tumor region (determined by radiation oncologists). The leave-one-out-by-patient test was employed for the training and testing steps of the proposed framework. The final GTV regions were determined using the OCS method, which can select a global optimum object contour based on multiple active delineations with a LSM around the GTV. The efficacy of the proposed framework was evaluated in 14 lung cancer cases [solid: 6, ground-glass opacity (GGO): 4, mixed GGO: 4] using the 3D Dice similarity coefficient (DSC), which denotes the degree of region similarity between the GTVs contoured by radiation oncologists and those determined using the proposed framework. The proposed framework achieved an average DSC of 0.777 for the 14 cases, whereas the OCS-based framework produced an average DSC of 0.507. The average DSCs for GGO and mixed GGO obtained by the proposed framework were 0.763 and 0.701, respectively. The proposed framework can be employed as a tool to assist radiation oncologists in delineating various GTV regions. © The Author 2016. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
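
    A minimal sketch of the classification step, with scikit-learn standing in for whatever implementation the authors used and the voxel features reduced to toy two-dimensional vectors: an SVM trained on features inside and outside the oncologist-drawn contour can emit a per-voxel 'degree of GTV' as a class probability.

      import numpy as np
      from sklearn.svm import SVC

      # Toy voxel features (e.g., intensity, texture) labeled from the
      # oncologist-determined contour: 1 = inside GTV, 0 = outside.
      X_train = np.array([[0.9, 0.8], [0.8, 0.7], [0.7, 0.9], [0.9, 0.6],
                          [0.8, 0.9], [0.1, 0.2], [0.2, 0.1], [0.3, 0.2],
                          [0.1, 0.3], [0.2, 0.3]])
      y_train = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])

      svm = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

      # In the testing step, the classifier yields a 'degree of GTV' per voxel.
      X_test = np.array([[0.85, 0.75], [0.15, 0.15]])
      degree_of_gtv = svm.predict_proba(X_test)[:, 1]
      print(degree_of_gtv)   # high for tumor-like voxels, low otherwise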

  3. Data-Driven Information Extraction from Chinese Electronic Medical Records

    PubMed Central

    Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q.

    2015-01-01

    Objective: This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Materials and Methods: Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. Results: The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. Discussion: In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). Conclusions: The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica. PMID:26295801
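
    For reference, the Normalized Google Distance between two terms x and y is conventionally defined from their occurrence counts f(x) and f(y), their co-occurrence count f(x, y), and the corpus size N; a direct transcription follows (how the resulting distances are fed into the SVM features is not shown here):

      from math import log

      def ngd(fx, fy, fxy, n):
          # fx, fy: documents containing x, respectively y
          # fxy:    documents containing both; n: total number of documents
          return ((max(log(fx), log(fy)) - log(fxy)) /
                  (log(n) - min(log(fx), log(fy))))

      # A small distance suggests the event and description are correlated.
      print(ngd(fx=1000, fy=800, fxy=600, n=10**7))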

  4. Data-Driven Information Extraction from Chinese Electronic Medical Records.

    PubMed

    Xu, Dong; Zhang, Meizhuo; Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q

    2015-01-01

    This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.

  5. Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness

    NASA Astrophysics Data System (ADS)

    Kaushik, Anshul; Ramani, Anand

    2014-04-01

    Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.

  6. An adaptive simplex cut-cell method for high-order discontinuous Galerkin discretizations of elliptic interface problems and conjugate heat transfer problems

    NASA Astrophysics Data System (ADS)

    Sun, Huafei; Darmofal, David L.

    2014-12-01

    In this paper we propose a new high-order solution framework for interface problems on non-interface-conforming meshes. The framework consists of a discontinuous Galerkin (DG) discretization, a simplex cut-cell technique, and an output-based adaptive scheme. We first present a DG discretization with a dual-consistent output evaluation for elliptic interface problems on interface-conforming meshes, and then extend the method to handle multi-physics interface problems, in particular conjugate heat transfer (CHT) problems. The method is then applied to non-interface-conforming meshes using a cut-cell technique, where the interface definition is completely separate from the mesh generation process. No assumption is made on the interface shape (other than Lipschitz continuity). We then equip our strategy with an output-based adaptive scheme for an accurate output prediction. Through numerical examples, we demonstrate high-order convergence for elliptic interface problems and CHT problems with both smooth and non-smooth interface shapes.

  7. A generic biokinetic model for noble gases with application to radon.

    PubMed

    Leggett, Rich; Marsh, James; Gregoratto, Demetrio; Blanchardon, Eric

    2013-06-01

    To facilitate the estimation of radiation doses from intake of radionuclides, the International Commission on Radiological Protection (ICRP) publishes dose coefficients (dose per unit intake) based on reference biokinetic and dosimetric models. The ICRP generally has not provided biokinetic models or dose coefficients for intake of noble gases, but plans to provide such information for (222)Rn and other important radioisotopes of noble gases in a forthcoming series of reports on occupational intake of radionuclides (OIR). This paper proposes a generic biokinetic model framework for noble gases and develops parameter values for radon. The framework is tailored to applications in radiation protection and is consistent with a physiologically based biokinetic modelling scheme adopted for the OIR series. Parameter values for a noble gas are based largely on a blood flow model and physical laws governing transfer of a non-reactive and soluble gas between materials. Model predictions for radon are shown to be consistent with results of controlled studies of its biokinetics in human subjects.

  8. Biodiversity impact assessment (BIA+) - methodological framework for screening biodiversity.

    PubMed

    Winter, Lisa; Pflugmacher, Stephan; Berger, Markus; Finkbeiner, Matthias

    2018-03-01

    For the past 20 years, the life cycle assessment (LCA) community has sought to integrate impacts on biodiversity into the LCA framework. However, existing impact assessment methods still fail to do so comprehensively because they quantify only a few impacts related to specific species and regions. This paper proposes a methodological framework that will allow LCA practitioners to assess currently missing impacts on biodiversity on a global scale. Building on existing models that seek to quantify the impacts of human activities on biodiversity, the proposed methodological framework consists of 2 components: a habitat factor for 14 major habitat types and the impact on the biodiversity status in those major habitat types. The habitat factor is calculated by means of indicators that characterize each habitat. The biodiversity status depends on parameters from impact categories. The impact functions, relating these different parameters to a given response in the biodiversity status, rely on expert judgments. To ensure applicability for LCA practitioners, the components of the framework can be regionalized on a country scale, for which LCA inventory data is more readily available. The weighting factors for the 14 major habitat types range from 0.63 to 1.82. By means of area weighting of the major habitat types in a country, country-specific weighting factors are calculated. To demonstrate the main part of the framework, examples of impact functions are given for the categories "freshwater eutrophication" and "freshwater ecotoxicity" in 1 major habitat type. The results confirm the suitability of the methodological framework. The major advantages are the framework's user-friendliness, given that data can be used from LCA databases directly, and the complete inclusion of all levels of biodiversity (genetic, species, and ecosystem). It is applicable for the whole world and a wide range of impact categories. Integr Environ Assess Manag 2018;14:282-297. © 2017 SETAC.
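
    To illustrate the regionalization step (the numbers below are invented; only the 0.63-1.82 range comes from the abstract), a country-specific weighting factor can be computed as the area-weighted mean of the habitat factors of the major habitat types present in that country:

      # Habitat factors for major habitat types present in a hypothetical
      # country, with the share of the country's area that each covers.
      habitat_factor = {"temperate_forest": 1.25, "grassland": 0.84, "desert": 0.63}
      area_share     = {"temperate_forest": 0.50, "grassland": 0.30, "desert": 0.20}

      country_factor = sum(habitat_factor[h] * area_share[h] for h in habitat_factor)
      print(round(country_factor, 3))   # area-weighted country-specific factor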

  9. Framework for non-coherent interface models at finite displacement jumps and finite strains

    NASA Astrophysics Data System (ADS)

    Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn

    2016-05-01

    This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, excludes anisotropic effects. A novel, extended constitutive framework which is consistent with the above-mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration, or equivalently, the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
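
    The fiber-model restriction criticized above can be stated in one line: with displacement jump [[u]], a fiber model forces the traction t to be collinear with the jump, whereas the proposed framework lets the Helmholtz energy depend on the jump and the surface deformation gradient (notation ours, not quoted from the paper):

      \mathbf{t} \;=\; f\!\left(\lVert \llbracket \mathbf{u} \rrbracket \rVert\right)\,
                 \frac{\llbracket \mathbf{u} \rrbracket}{\lVert \llbracket \mathbf{u} \rrbracket \rVert},
      \qquad\text{versus}\qquad
      \Psi \;=\; \Psi\!\left(\llbracket \mathbf{u} \rrbracket,\ \mathbf{F}_{\mathrm{s}}\right)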

  10. Automated framework for estimation of lung tumor locations in kV-CBCT images for tumor-based patient positioning in stereotactic lung body radiotherapy

    NASA Astrophysics Data System (ADS)

    Yoshidome, Satoshi; Arimura, Hidetaka; Terashima, Koutarou; Hirakawa, Masakazu; Hirose, Taka-aki; Fukunaga, Junichi; Nakamura, Yasuhiko

    2017-03-01

    Recently, image-guided radiotherapy (IGRT) systems using kilovolt cone-beam computed tomography (kV-CBCT) images have become more common for highly accurate patient positioning in stereotactic lung body radiotherapy (SLBRT). However, current IGRT procedures are based on bone structures and subjective correction. Therefore, the aim of this study was to evaluate the proposed framework for automated estimation of lung tumor locations in kV-CBCT images for tumor-based patient positioning in SLBRT. Twenty clinical cases were considered, involving solid, pure ground-glass opacity (GGO), mixed GGO, solitary, and non-solitary tumor types. The proposed framework consists of four steps: (1) determination of a search region for tumor location detection in a kV-CBCT image; (2) extraction of a tumor template from a planning CT image; (3) preprocessing for tumor region enhancement (edge and tumor enhancement using a Sobel filter and a blob structure enhancement (BSE) filter, respectively); and (4) tumor location estimation based on a template-matching technique. The location errors in the original, edge-enhanced, and tumor-enhanced images were found to be 1.2 ± 0.7 mm, 4.2 ± 8.0 mm, and 2.7 ± 4.6 mm, respectively. The location errors in the original images of solid, pure GGO, mixed GGO, solitary, and non-solitary types of tumors were 1.2 ± 0.7 mm, 1.3 ± 0.9 mm, 0.4 ± 0.6 mm, 1.1 ± 0.8 mm and 1.0 ± 0.7 mm, respectively. These results suggest that the proposed framework is robust with regard to the automatic estimation of several types of tumor locations in kV-CBCT images for tumor-based patient positioning in SLBRT.
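
    A minimal sketch of step (4) using OpenCV (the search-region logic and the enhancement filters of steps (1)-(3) are simplified away, and the arrays below are placeholders): the peak of the normalized cross-correlation map between the planning-CT template and the kV-CBCT image gives the estimated tumor location.

      import cv2
      import numpy as np

      cbct_region = np.random.rand(256, 256).astype(np.float32)  # kV-CBCT search region
      template    = np.random.rand(32, 32).astype(np.float32)    # from the planning CT

      # Template matching: the maximum of the correlation map is the
      # estimated tumor location within the search region.
      response = cv2.matchTemplate(cbct_region, template, cv2.TM_CCOEFF_NORMED)
      _, _, _, max_loc = cv2.minMaxLoc(response)
      print("estimated tumor location (x, y):", max_loc)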

  11. Developing a water market readiness assessment framework

    NASA Astrophysics Data System (ADS)

    Wheeler, Sarah Ann; Loch, Adam; Crase, Lin; Young, Mike; Grafton, R. Quentin

    2017-09-01

    Water markets are increasingly proposed as a demand-management strategy to deal with water scarcity. Water trading arrangements, on their own, are not about setting bio-physical limits to water use. Nevertheless, water trading that mitigates scarcity constraints can assist regulators of water resources to keep water use within limits at the lowest possible cost, and may reduce the cost of restoring water system health. While theoretically attractive, many practitioners have, at best, only a limited understanding of the practical usefulness of markets and how they might be most appropriately deployed. Using lessons learned from jurisdictions around the world where water markets have been implemented, this study attempts to fill the existing water market development gap and provide an initial framework (the water market readiness assessment, WMRA) describing the policy and administrative conditions/reforms necessary to enable governments/jurisdictions to develop water trading arrangements that are efficient, equitable, and within sustainable limits. Our proposed framework consists of three key steps: 1) an assessment of hydrological and institutional needs; 2) a market evaluation, including assessment of development and implementation issues; and 3) monitoring, continuous review, and assessment of future needs; with a variety of questions needing assessment at each stage. We apply the framework to three examples: regions in Australia, the United States, and Spain. These applications indicate that the WMRA can provide key information for water planners on the usefulness of water trading processes to better manage water scarcity, but further practical applications and tests of the framework are required to fully evaluate its effectiveness.

  12. A framework and a measurement instrument for sustainability of work practices in long-term care

    PubMed Central

    2011-01-01

    Background: In health care, many organizations are working on quality improvement and/or innovation of their care practices. Although the effectiveness of improvement processes has been studied extensively, little attention has been given to the sustainability of the changed work practices after implementation. The objective of this study is to develop a theoretical framework and measurement instrument for sustainability. To this end, sustainability is conceptualized with two dimensions: routinization and institutionalization. Methods: The exploratory methodological design consisted of three phases: a) framework development; b) instrument development; and c) field testing in former improvement teams in a quality improvement program for health care (N teams = 63, N individuals = 112). Data were not collected until at least one year had passed after implementation. Underlying constructs and their interrelations were explored using Structural Equation Modeling and Principal Component Analyses. Internal consistency was computed with Cronbach's alpha coefficient. A long and a short version of the instrument are proposed. Results: The χ²-difference test of the −2 log-likelihood estimates demonstrated that the hierarchical two-factor model, with routinization and institutionalization as separate constructs, showed a better fit than the one-factor model (p < .01). Secondly, the construct validity of the instrument was strong, as indicated by the high factor loadings of the items. Finally, the internal consistency of the subscales was good. Conclusions: The theoretical framework offers a valuable starting point for the analysis of sustainability at the level of actual changed work practices. Even though the two dimensions routinization and institutionalization are related, they are clearly distinguishable and each has distinct value in the discussion of sustainability. Finally, the subscales conformed to psychometric properties defined in the literature. The instrument can be used in the evaluation of improvement projects. PMID:22087884

  13. Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis

    PubMed Central

    Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F.; Mt-Isa, Shahrul; Luo, Sheng

    2018-01-01

    Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend producing a periodic benefit-risk evaluation report (PBRER) throughout the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) are combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively, and accounts for the uncertainty of clinical measurements and the imprecise or incomplete preference information of decision makers. We apply our approach to two real examples in a post-hoc way for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and thus be an important complement to the current structured benefit-risk assessment (sBRA) framework to improve the transparency and consistency of the decision-making process. PMID:29505866
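
    A toy Monte Carlo version of the SMAA step (the performance values and the uniform weight prior are invented for illustration): sample criterion weights, score each drug on benefit against risk, and record how often each drug ranks first, yielding its first-rank acceptability index.

      import numpy as np

      rng = np.random.default_rng(0)
      # Rows: drugs A and B; columns: (benefit, risk) on a common 0-1 scale.
      perf = np.array([[0.80, 0.30],
                       [0.65, 0.10]])

      wins = np.zeros(len(perf))
      for _ in range(10_000):
          w = rng.uniform()                          # imprecise preference information
          scores = perf @ np.array([w, -(1.0 - w)])  # risk weighs negatively
          wins[np.argmax(scores)] += 1

      print(wins / wins.sum())   # first-rank acceptability index per drug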

  14. Random Forest-Based Recognition of Isolated Sign Language Subwords Using Data from Accelerometers and Surface Electromyographic Sensors.

    PubMed

    Su, Ruiliang; Chen, Xiang; Cao, Shuai; Zhang, Xu

    2016-01-14

    Sign language recognition (SLR) has been widely used for communication amongst the hearing-impaired and non-verbal community. This paper proposes an accurate and robust SLR framework using an improved decision tree as the base classifier of random forests. This framework was used to recognize Chinese sign language (CSL) subwords using recordings from a pair of portable devices worn on both arms, consisting of accelerometers (ACC) and surface electromyography (sEMG) sensors. The experimental results demonstrated the validity of the proposed random forest-based method for the recognition of CSL subwords. With the proposed method, 98.25% average accuracy was obtained for the classification of a list of 121 frequently used CSL subwords. Moreover, the random forests method demonstrated a superior performance in resisting the impact of bad training samples. When the proportion of bad samples in the training set reached 50%, the recognition error rate of the random forest-based method was only 10.67%, while that of a single decision tree adopted in our previous work was almost 27.5%. Our study offers a practical way of realizing a robust and wearable EMG-ACC-based SLR system.
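
    A condensed sketch of the classification stage using scikit-learn's stock random forest (the features are stand-ins, and the paper's improved base decision tree is not reproduced):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(42)
      # Toy feature vectors extracted from ACC and sEMG windows; 3 subword classes.
      X = rng.normal(size=(300, 24))
      y = rng.integers(0, 3, size=300)

      forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(forest.predict(X[:5]))   # predicted subword labels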

  15. Moving vehicles segmentation based on Gaussian motion model

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Fang, Xiang Z.; Lin, Wei Y.

    2005-07-01

    Moving object segmentation is a challenge in computer vision. This paper focuses on the segmentation of moving vehicles in dynamic scenes. We analyze the psychology of human vision and present a framework for segmenting moving vehicles on the highway. The proposed framework consists of two parts. First, we propose an adaptive background update method in which the background is updated according to changes in illumination conditions and thus can adapt sensitively to those changes. Second, we construct a Gaussian motion model to segment moving vehicles, in which the motion vectors of the moving pixels are modeled as a Gaussian distribution and an on-line EM algorithm is used to update the model. The Gaussian distribution of the adaptive model is evaluated to determine which motion vectors result from moving vehicles and which from other moving objects such as waving trees. Finally, the pixels whose motion vectors result from moving vehicles are segmented. Experimental results on several typical scenes show that the proposed model can detect moving vehicles correctly and is immune to the influence of moving objects caused by waving trees and camera vibration.
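
    A single-Gaussian caricature of the motion model with an on-line update (the learning rate and threshold are arbitrary; the paper's on-line EM over the full Gaussian model is richer):

      import numpy as np

      class MotionModel:
          # Single-Gaussian stand-in for the on-line EM motion model.
          def __init__(self, lr=0.05):
              self.mean, self.var, self.lr = np.zeros(2), np.ones(2), lr

          def update_and_classify(self, v, k=2.5):
              # A motion vector within k standard deviations is attributed to
              # vehicle motion; only those vectors adapt the model.
              is_vehicle = bool(np.all(np.abs(v - self.mean)
                                       <= k * np.sqrt(self.var)))
              if is_vehicle:
                  d = v - self.mean
                  self.mean += self.lr * d
                  self.var = (1 - self.lr) * self.var + self.lr * d ** 2
              return is_vehicle

      model = MotionModel()
      print(model.update_and_classify(np.array([0.3, -0.2])))   # True: vehicle-like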

  16. Securing While Sampling in Wireless Body Area Networks With Application to Electrocardiography.

    PubMed

    Dautov, Ruslan; Tsouri, Gill R

    2016-01-01

    Stringent resource constraints and broadcast transmission in wireless body area networks raise serious security concerns when such networks are employed in biomedical applications. Protecting data transmission, where any minor alteration is potentially harmful, is of significant importance in healthcare. Traditional security methods based on public or private key infrastructure require considerable memory and computational resources, and present an implementation obstacle in compact sensor nodes. This paper proposes a lightweight encryption framework augmenting compressed sensing with wireless physical layer security. Augmenting compressed sensing to secure information is based on the use of the measurement matrix as an encryption key, and allows for incorporating security, in addition to compression, at the time of sampling an analog signal. The proposed approach eliminates the need for a separate encryption algorithm, as well as the predeployment of a key, thereby conserving the sensor node's limited resources. The proposed framework is evaluated using analysis, simulation, and experimentation applied to a wireless electrocardiogram setup consisting of a sensor node, an access point, and an eavesdropper performing a proximity attack. Results show that legitimate communication is reliable and secure, given that the eavesdropper is located at a reasonable distance from the sensor node and the access point.
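
    The keyed-sampling idea can be sketched in a few lines (the Gaussian matrix and the seed-as-key convention are our simplifications; reconstruction would use any standard sparse-recovery solver):

      import numpy as np

      def measurement_matrix(key, m, n):
          # The shared secret is the PRNG seed: the sensor and the access point
          # derive the same measurement matrix, while the eavesdropper cannot.
          return np.random.default_rng(key).normal(size=(m, n)) / np.sqrt(m)

      n, m, key = 256, 64, 0xC0FFEE
      x = np.zeros(n)
      x[[10, 97, 200]] = [1.0, -0.5, 0.8]   # sparse ECG-like signal

      phi = measurement_matrix(key, m, n)
      y = phi @ x     # sampling compresses and encrypts in a single step

      # A receiver holding the key rebuilds phi and runs sparse recovery on y;
      # without the key, y alone reveals little about x.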

  17. Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow

    NASA Astrophysics Data System (ADS)

    Gao, Zheng

    A computational framework that combines the Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, the particle systems, such as spring-mass systems and cloud droplets, are modeled using ordinary differential systems, which are stiff and hence pose a challenge to the stability of the entire system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in experiment. Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation have been investigated, with the conclusion that the standard and Re-Normalisation Group (RNG) models may overestimate the turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For the second application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical results suggest a new way to parameterize the cloud mixing degree using dynamical measures. The numerical experiments also verify the negative relationship between the droplet number concentration and the vorticity field. The results imply that gravity has less impact on forced turbulence than on decaying turbulence. In summary, the proposed framework can be used to solve physics problems that involve a turbulence field and a point-mass system, and therefore has broad applications.

  18. Assessing secondary soil salinization risk based on the PSR sustainability framework.

    PubMed

    Zhou, De; Lin, Zhulu; Liu, Liming; Zimmermann, David

    2013-10-15

    Risk assessment of secondary soil salinization, which is caused in part by the way people manage the land, is an essential challenge to agricultural sustainability. The objective of our study was to develop a soil salinity risk assessment methodology by selecting a consistent set of risk factors based on the conceptual Pressure-State-Response (PSR) sustainability framework and incorporating the grey relational analysis and Analytic Hierarchy Process (AHP) methods. The proposed salinity risk assessment methodology was demonstrated through a case study of developing composite risk index maps for the Yinchuan Plain, a major irrigation agriculture district in northwest China. Fourteen risk factors were selected in terms of the three PSR criteria: pressure, state, and response. The results showed that the salinity risk in the Yinchuan Plain was strongly influenced by the subsoil and groundwater salinity, land use, distance to irrigation canals, and depth to groundwater. To maintain agricultural sustainability in the Yinchuan Plain, a suite of remedial and preventative actions was proposed to manage soil salinity risk in the regions that are affected by salinity at different levels and by different salinization processes. The weight sensitivity analysis also showed that the overall salinity risk of the Yinchuan Plain would increase or decrease as the weights for pressure or response risk factors increased, signifying the influence of human activities on secondary soil salinization. Ideally, the proposed methodology will help develop more consistent management tools for the assessment, management, and control of secondary soil salinization. Copyright © 2013 Elsevier Ltd. All rights reserved.
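
    The AHP part of the weighting can be sketched as the principal eigenvector of a pairwise comparison matrix (the 3 x 3 matrix below is invented; the study weighted fourteen factors):

      import numpy as np

      # Pairwise comparisons among three illustrative risk factors, e.g.,
      # groundwater salinity, land use, and depth to groundwater.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      w = w / w.sum()
      print(np.round(w, 3))   # AHP weights of the risk factors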

  19. Improving resolution of MR images with an adversarial network incorporating images with different contrast.

    PubMed

    Kim, Ki Hwan; Do, Won-Joon; Park, Sung-Hong

    2018-05-04

    The routine MRI scan protocol consists of multiple pulse sequences that acquire images of varying contrast. Since high frequency contents such as edges are not significantly affected by image contrast, down-sampled images in one contrast may be improved by high resolution (HR) images acquired in another contrast, reducing the total scan time. In this study, we propose a new deep learning framework that uses HR MR images in one contrast to generate HR MR images from highly down-sampled MR images in another contrast. The proposed convolutional neural network (CNN) framework consists of two CNNs: (a) a reconstruction CNN for generating HR images from the down-sampled images using HR images acquired with a different MRI sequence and (b) a discriminator CNN for improving the perceptual quality of the generated HR images. The proposed method was evaluated using a public brain tumor database and in vivo datasets. The performance of the proposed method was assessed in tumor and no-tumor cases separately, with perceptual image quality being judged by a radiologist. To overcome the challenge of training the network with a small number of available in vivo datasets, the network was pretrained using the public database and then fine-tuned using the small number of in vivo datasets. The performance of the proposed method was also compared to that of several compressed sensing (CS) algorithms. Incorporating HR images of another contrast improved the quantitative assessments of the generated HR image in reference to ground truth. Also, incorporating a discriminator CNN yielded perceptually higher image quality. These results were verified in regions of normal tissue as well as tumors for various MRI sequences from pseudo k-space data generated from the public database. The combination of pretraining with the public database and fine-tuning with the small number of real k-space datasets enhanced the performance of CNNs in in vivo application compared to training CNNs from scratch. The proposed method outperformed the compressed sensing methods. The proposed method can be a good strategy for accelerating routine MRI scanning. © 2018 American Association of Physicists in Medicine.

  20. A numerical framework for bubble transport in a subcooled fluid flow

    NASA Astrophysics Data System (ADS)

    Jareteg, Klas; Sasic, Srdjan; Vinai, Paolo; Demazière, Christophe

    2017-09-01

    In this paper we present a framework for the simulation of dispersed bubbly two-phase flows, with the specific aim of describing vapor-liquid systems with condensation. We formulate and implement a framework that consists of a population balance equation (PBE) for the bubble size distribution and an Eulerian-Eulerian two-fluid solver. The PBE is discretized using the Direct Quadrature Method of Moments (DQMOM), in which we include the condensation of the bubbles as convection in the internal phase space. We investigate the robustness of the DQMOM formulation and the numerical issues arising from the rapid shrinkage of the vapor bubbles. In contrast to a PBE method based on the multiple-size-group (MUSIG) approach, the DQMOM formulation allows us to compute a distribution with dynamic bubble sizes. Such a property is advantageous for capturing the wide range of bubble sizes associated with the condensation process. Furthermore, we compare the computational performance of the DQMOM-based framework with the MUSIG method. The results demonstrate that DQMOM is able to retrieve the bubble size distribution with good numerical precision in only a small fraction of the computational time required by MUSIG. For the two-fluid solver, we examine the implementation of the mass, momentum, and enthalpy conservation equations in relation to the coupling to the PBE. In particular, we propose a formulation of the pressure and liquid continuity equations that is shown to correctly preserve mass when computing the vapor fraction with DQMOM. The conservation of enthalpy was also demonstrated. A consistent overall framework that couples the PBE and two-fluid solvers is thereby achieved.
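
    In quadrature-based moment methods of this kind, the size distribution is carried by a small number of weighted nodes, so that (in standard DQMOM notation, ours rather than quoted from the paper):

      n(L;\mathbf{x},t) \,\approx\, \sum_{i=1}^{N} w_i(\mathbf{x},t)\,
          \delta\!\left(L - L_i(\mathbf{x},t)\right),
      \qquad
      m_k \,=\, \int_0^{\infty} L^k\, n(L)\,\mathrm{d}L
          \,\approx\, \sum_{i=1}^{N} w_i\, L_i^k .

    Transport equations are solved directly for the weights w_i and abscissas L_i, which is why the represented bubble sizes stay dynamic (unlike fixed MUSIG size classes) and why condensation can enter as a convection term in the internal coordinate L.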

  1. Scaling phenomena in the Internet: Critically examining criticality

    PubMed Central

    Willinger, Walter; Govindan, Ramesh; Jamin, Sugih; Paxson, Vern; Shenker, Scott

    2002-01-01

    Recent Internet measurements have found pervasive evidence of some surprising scaling properties. The two we focus on in this paper are self-similar scaling in the burst patterns of Internet traffic and, in some contexts, scale-free structure in the network's interconnection topology. These findings have led to a number of proposed models or “explanations” of such “emergent” phenomena. Many of these explanations invoke concepts such as fractals, chaos, or self-organized criticality, mainly because these concepts are closely associated with scale invariance and power laws. We examine these criticality-based explanations of self-similar scaling behavior—of both traffic flows through the Internet and the Internet's topology—to see whether they indeed explain the observed phenomena. To do so, we bring to bear a simple validation framework that aims at testing whether a proposed model is merely evocative, in that it can reproduce the phenomenon of interest but does not necessarily capture and incorporate the true underlying cause, or indeed explanatory, in that it also captures the causal mechanisms (why and how, in addition to what). We argue that the framework can provide a basis for developing a useful, consistent, and verifiable theory of large networks such as the Internet. Applying the framework, we find that, whereas the proposed criticality-based models are able to produce the observed “emergent” phenomena, they unfortunately fail as sound explanations of why such scaling behavior arises in the Internet. PMID:11875212

  2. An Ethical (Descriptive) Framework for Judgment of Actions and Decisions in the Construction Industry and Engineering-Part I.

    PubMed

    Alkhatib, Omar J; Abdou, Alaa

    2018-04-01

    The construction industry is usually characterized as a fragmented system of multiple organizational entities in which members from different technical backgrounds and moral values join together to develop a particular business or project. The greatest challenge in the construction process for achieving a successful practice is the development of an outstanding reputation, which is built on identifying and applying an ethical framework. This framework should reflect a common ethical ground for the myriad people involved in this process to survive and compete ethically in today's turbulent construction market. This study establishes a framework for the ethical judgment of behavior and actions conducted in the construction process. The framework was primarily developed based on the essential attributes of business management identified in the literature review, and subsequently incorporates additional attributes identified to prevent breaches in the construction industry, as well as common ethical values related to professional engineering. The proposed judgment framework is based primarily on the ethical dimension of professional responsibility. The Ethical Judgment Framework consists of descriptive approaches involving technical, professional, administrative, and miscellaneous terms. The framework provides the basis for judging actions as either ethical or unethical. Furthermore, the framework can be implemented as a form of preventive ethics, which would help avoid ethical dilemmas and moral allegations. The framework can be considered a decision-making model to guide actions and improve the ethical reasoning process, helping individuals think through the possible implications and consequences of ethical dilemmas in the construction industry.

  3. Endurance with partnership: a preliminary conceptual framework for couples undergoing in vitro fertilisation treatment.

    PubMed

    Ying, Liying; Wu, Lai Har; Wu, Xiangli; Shu, Jing; Loke, Alice Yuen

    2018-04-01

    Infertility affects both women and men in the physical, emotional, existential, and interpersonal realms. When couples seek in vitro fertilisation (IVF) treatment, they further suffer from the difficulties of the treatment and the uncertainty of its outcome. The aim of this study was to develop a preliminary conceptual framework for couples undergoing IVF treatment to give health professionals a better understanding of the experiences of such couples, and to guide the development of an intervention. The process of identifying frameworks adopted in intervention studies confirmed that there is no established framework for infertile couples undergoing IVF treatment. A skeletal framework identified from previous studies provides an internal structure for the proposed framework for couples undergoing IVF treatment, filled out with concepts drawn from a concept analysis and a qualitative study, knitting the structure together. This preliminary framework is the Endurance with Partnership Conceptual Framework (P-EPCF). It consists of four domains: the impacts of infertility and stressors, dyadic mediators, dyadic moderators and dyadic outcomes. According to the P-EPCF, the impacts of infertility and IVF treatment can be mediated by the couples' partnership and dyadic coping. Improvements in the psychological well-being and marital functioning of IVF couples can then be expected. The P-EPCF would be potentially valuable in guiding the development of a complex, couple-based intervention, which could focus on enhancing the partnership of couples and their coping strategies.

  4. 75 FR 46916 - Proposal for Minor Adjustments to Optional Alternative Site Framework

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... DEPARTMENT OF COMMERCE Foreign-Trade Zones Board Proposal for Minor Adjustments to Optional Alternative Site Framework The Foreign-Trade Zones (FTZ) Board is inviting public comment on a staff proposal to make minor adjustments to the Board's practice regarding the alternative site framework (ASF...

  5. Novel application of red-light runner proneness theory within traffic microsimulation to an actual signal junction.

    PubMed

    Bell, Margaret Carol; Galatioto, Fabio; Giuffrè, Tullio; Tesoriere, Giovanni

    2012-05-01

    Building on previous research, a conceptual framework based on potential conflicts analysis has provided a quantitative evaluation of 'proneness' to red-light running behaviour at urban signalised intersections of different geometric, flow and driver characteristics. The results provided evidence that commonly used violation rates could cause inappropriate evaluation of the extent of the red-light running phenomenon. Initially, an in-depth investigation of the functional form of the mathematical relationship between potential and actual red-light runners was carried out. The application of the conceptual framework was tested on a signalised intersection in order to quantify proneness to red-light running. For the particular junction studied, daytime proneness was found to be 0.17 north and 0.16 south for the opposing main road approaches, and 0.42 east and 0.59 west for the secondary approaches. Further investigations were carried out using a traffic microsimulation model to explore the geometric features and traffic volumes (arrival patterns at the stop-line) that significantly affect red-light running. In this way the prediction capability of the proposed potential conflict model was improved. A degree of consistency between the measured and simulated red-light running was observed, and the conceptual framework was tested through a sensitivity analysis applied to different stop-line positions and traffic volume variations. The microsimulation, although at an early stage of development, has shown promise in its ability to model unintentional red-light running behaviour and, following further work applying it to other junctions, could provide a tool for evaluating the effect of alternative junction designs on proneness. In brief, this paper proposes and applies a novel approach to modelling red-light running using microsimulation and demonstrates consistency between the observed and theoretical results. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Repulsive particles under a general external potential: Thermodynamics by neglecting thermal noise.

    PubMed

    Ribeiro, Mauricio S; Nobre, Fernando D

    2016-08-01

    A recent proposal of an effective temperature θ, conjugated to a generalized entropy s_q typical of nonextensive statistical mechanics, has led to a consistent thermodynamic framework in the case q = 2. The proposal was explored for repulsively interacting vortices, currently used for modeling type-II superconductors. In these systems, the variable θ presents values much higher than those of typical room temperatures T, so that the thermal noise can be neglected (T/θ ≃ 0). The whole procedure was developed for an equilibrium state obtained after a sufficiently long-time evolution, associated with a nonlinear Fokker-Planck equation and approached due to a confining external harmonic potential, φ(x) = αx²/2 (α > 0). Herein, the thermodynamic framework is extended to a quite general confining potential, namely φ(x) = α|x|^z/z (z > 1). It is shown that the main results of the previous analyses hold for any z > 1: (i) the definition of the effective temperature θ conjugated to the entropy s_2; (ii) the construction of a Carnot cycle, whose efficiency is shown to be η = 1 − θ₂/θ₁, where θ₁ and θ₂ are the effective temperatures associated with two isothermal transformations, with θ₁ > θ₂ (the special character of the Carnot cycle is indicated by analyzing another cycle that presents an efficiency depending on z); (iii) applying Legendre transformations for a distinct pair of variables, different thermodynamic potentials are obtained, and furthermore, Maxwell relations and response functions are derived. The present approach shows a consistent thermodynamic framework, suggesting that these results should hold for a general confining potential φ(x), increasing the possibility of experimental verifications.

  7. Neural systems language: a formal modeling language for the systematic description, unambiguous communication, and automated digital curation of neural connectivity.

    PubMed

    Brown, Ramsay A; Swanson, Larry W

    2013-09-01

    Systematic description and the unambiguous communication of findings and models remain among the unresolved fundamental challenges in systems neuroscience. No common descriptive frameworks exist to describe systematically the connective architecture of the nervous system, even at the grossest level of observation. Furthermore, the accelerating volume of novel data generated on neural connectivity outpaces the rate at which this data is curated into neuroinformatics databases to synthesize digitally systems-level insights from disjointed reports and observations. To help address these challenges, we propose the Neural Systems Language (NSyL). NSyL is a modeling language to be used by investigators to encode and communicate systematically reports of neural connectivity from neuroanatomy and brain imaging. NSyL engenders systematic description and communication of connectivity irrespective of the animal taxon described, experimental or observational technique implemented, or nomenclature referenced. As a language, NSyL is internally consistent, concise, and comprehensible to both humans and computers. NSyL is a promising development for systematizing the representation of neural architecture, effectively managing the increasing volume of data on neural connectivity and streamlining systems neuroscience research. Here we present similar precedent systems, how NSyL extends existing frameworks, and the reasoning behind NSyL's development. We explore NSyL's potential for balancing robustness and consistency in representation by encoding previously reported assertions of connectivity from the literature as examples. Finally, we propose and discuss the implications of a framework for how NSyL will be digitally implemented in the future to streamline curation of experimental results and bridge the gaps among anatomists, imagers, and neuroinformatics databases. Copyright © 2013 Wiley Periodicals, Inc.

  8. Effective Vehicle-Based Kangaroo Detection for Collision Warning Systems Using Region-Based Convolutional Networks.

    PubMed

    Saleh, Khaled; Hossny, Mohammed; Nahavandi, Saeid

    2018-06-12

    Traffic collisions between kangaroos and motorists are on the rise on Australian roads. According to a recent report, it was estimated that more than 20,000 kangaroo-vehicle collisions occurred in the year 2015 alone in Australia. In this work, we propose a vehicle-based framework for kangaroo detection in urban and highway traffic environments that could be used for collision warning systems. Our proposed framework is based on region-based convolutional neural networks (RCNN). Given the scarcity of labeled data of kangaroos in traffic environments, we utilized our state-of-the-art data generation pipeline to generate 17,000 synthetic depth images of traffic scenes with kangaroo instances annotated in them. We trained our proposed RCNN-based framework on a subset of the generated synthetic depth images dataset. The proposed framework achieved a high average precision (AP) score of 92% over all the testing synthetic depth image datasets. We compared our proposed framework against baseline approaches and outperformed them by more than 37% in AP score over all the testing datasets. Additionally, we evaluated the generalization performance of the proposed framework on real live data and achieved resilient detection accuracy without any further fine-tuning of our proposed RCNN-based framework.

  9. Crack layer theory

    NASA Technical Reports Server (NTRS)

    Chudnovsky, A.

    1984-01-01

    A damage parameter is introduced in addition to the conventional parameters of continuum mechanics, and a crack surrounded by an array of microdefects is considered within the continuum mechanics framework. A system consisting of the main crack and the surrounding damage is called a crack layer (CL). Crack layer propagation is an irreversible process. The general framework of the thermodynamics of irreversible processes is employed to identify the driving forces (causes) and to derive the constitutive equation of CL propagation, that is, the relationship between the rates of crack growth and damage dissemination on one side and the conjugated thermodynamic forces on the other. The proposed law of CL propagation is in good agreement with experimental data on fatigue CL propagation in various materials. The theory also elaborates material toughness characterization.

  10. Crack layer theory

    NASA Technical Reports Server (NTRS)

    Chudnovsky, A.

    1987-01-01

    A damage parameter is introduced in addition to the conventional parameters of continuum mechanics, and a crack surrounded by an array of microdefects is considered within the continuum mechanics framework. A system consisting of the main crack and the surrounding damage is called a crack layer (CL). Crack layer propagation is an irreversible process. The general framework of the thermodynamics of irreversible processes is employed to identify the driving forces (causes) and to derive the constitutive equation of CL propagation, that is, the relationship between the rates of crack growth and damage dissemination on one side and the conjugated thermodynamic forces on the other. The proposed law of CL propagation is in good agreement with experimental data on fatigue CL propagation in various materials. The theory also elaborates material toughness characterization.

  11. A new neural framework for visuospatial processing.

    PubMed

    Kravitz, Dwight J; Saleem, Kadharbatcha S; Baker, Chris I; Mishkin, Mortimer

    2011-04-01

    The division of cortical visual processing into distinct dorsal and ventral streams is a key framework that has guided visual neuroscience. The characterization of the ventral stream as a 'What' pathway is relatively uncontroversial, but the nature of dorsal stream processing is less clear. Originally proposed as mediating spatial perception ('Where'), more recent accounts suggest it primarily serves non-conscious visually guided action ('How'). Here, we identify three pathways emerging from the dorsal stream that consist of projections to the prefrontal and premotor cortices, and a major projection to the medial temporal lobe that courses both directly and indirectly through the posterior cingulate and retrosplenial cortices. These three pathways support both conscious and non-conscious visuospatial processing, including spatial working memory, visually guided action and navigation, respectively.

  12. Threat driven modeling framework using petri nets for e-learning system.

    PubMed

    Khamparia, Aditya; Pandey, Babita

    2016-01-01

    Vulnerabilities at various levels are the main cause of security risks in e-learning systems. This paper presents a modified threat-driven modeling framework to identify, after risk assessment, the threats that require mitigation and how to mitigate them. Aspect-oriented stochastic Petri nets are used to model those threat mitigations. The paper includes security metrics based on vulnerabilities present in e-learning systems. The Common Vulnerability Scoring System, designed to provide a normalized method for rating vulnerabilities, is used as the basis for metric definitions and calculations. A case study is also presented which shows the need for, and feasibility of, using aspect-oriented stochastic Petri net models for threat modeling, which improves the reliability, consistency, and robustness of the e-learning system.

  13. SHINE: Strategic Health Informatics Networks for Europe.

    PubMed

    Kruit, D; Cooper, P A

    1994-10-01

    The mission of SHINE is to construct an open systems framework for the development of regional community healthcare telematic services that support and add to the strategic business objectives of European healthcare providers and purchasers. This framework will contain a Methodology, which identifies healthcare business processes and develops a supporting IT strategy, and the Open Health Environment. The latter consists of an architecture and information standards that are 'open' and will be available to any organisation wishing to construct SHINE-conformant regional healthcare telematic services. Results are: generic models, e.g., regional healthcare business networks, IT strategies; demonstrable results, e.g., pilot demonstrators, application and service prototypes; reports, e.g., the SHINE Methodology, pilot specifications & evaluations; and proposals, e.g., service/interface specifications, standards conformance.

  14. Crystal plasticity modeling of irradiation growth in Zircaloy-2

    DOE PAGES

    Patra, Anirban; Tome, Carlos; Golubov, Stanislav I.

    2017-05-10

    A reaction-diffusion based mean field rate theory model is implemented in the viscoplastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. A novel scheme is proposed to model the evolution (both number density and radius) of irradiation-induced dislocation loops that can be informed directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behavior of cold-worked Zircaloy-2, and the predicted trends are compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture, and external stress on the coupled irradiation growth and creep behavior are also studied.

  15. Crystal plasticity modeling of irradiation growth in Zircaloy-2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patra, Anirban; Tome, Carlos; Golubov, Stanislav I.

    A reaction-diffusion based mean field rate theory model is implemented in the viscoplastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. A novel scheme is proposed to model the evolution (both number density and radius) of irradiation-induced dislocation loops that can be informed directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behavior of cold-worked Zircaloy-2, and the predicted trends are compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture, and external stress on the coupled irradiation growth and creep behavior are also studied.

  16. Guided color consistency optimization for image mosaicking

    NASA Astrophysics Data System (ADS)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among them under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color differences between images are large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids convergence to a meaningless integrated solution. First, to obtain reliable intensity correspondences in the overlapping regions between image pairs, we propose a histogram extreme point matching algorithm that is robust to image geometrical misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by selecting an image subset as the reference, whose color characteristics are transferred to the others via the paths of a graph analysis. Thus, the final results of the global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones demonstrate that the proposed approach achieves results as good as, or even better than, the state-of-the-art approaches.
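
    As a much-simplified stand-in for the correction step, the sketch below transfers per-channel mean and standard deviation between aligned overlapping regions; the paper's histogram-extreme-point matching is more robust to misalignment, and the function name and linear channel-wise color model here are assumptions for illustration.

    ```python
    import numpy as np

    def match_color_stats(src, ref):
        """Shift src's per-channel mean/std to match ref.

        A simplified stand-in for intensity-correspondence-based
        correction: assumes a roughly linear channel-wise color model
        and ignores geometric misalignment."""
        src = src.astype(np.float64)
        ref = ref.astype(np.float64)
        out = np.empty_like(src)
        for c in range(src.shape[2]):
            s_mu, s_sigma = src[..., c].mean(), src[..., c].std() + 1e-8
            r_mu, r_sigma = ref[..., c].mean(), ref[..., c].std()
            out[..., c] = (src[..., c] - s_mu) * (r_sigma / s_sigma) + r_mu
        return np.clip(out, 0, 255).astype(np.uint8)

    # Usage: correct image B's overlap region toward image A's
    # a_overlap, b_overlap = ...  # aligned overlapping crops
    # b_corrected = match_color_stats(b_overlap, a_overlap)
    ```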

  17. Consistent lattice Boltzmann methods for incompressible axisymmetric flows

    NASA Astrophysics Data System (ADS)

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Yin, Linmao; Zhao, Ya; Chew, Jia Wei

    2016-08-01

    In this work, consistent lattice Boltzmann (LB) methods for incompressible axisymmetric flows are developed based on two efficient axisymmetric LB models available in the literature. In accord with their respective original models, the proposed axisymmetric models evolve within the framework of the standard LB method, and the source terms contain no gradient calculations. Moreover, the incompressibility conditions are realized with the Hermite expansion; thus, the compressibility errors arising in the existing models are expected to be reduced by the proposed incompressible models. In addition, an extra relaxation parameter is added to the Bhatnagar-Gross-Krook collision operator to suppress the effect of the ghost variable, and thus the numerical stability of the present models is significantly improved. Theoretical analyses, based on the Chapman-Enskog expansion and the equivalent moment system, are performed to derive the macroscopic equations from the LB models, and the resulting truncation terms (i.e., the compressibility errors) are investigated. In addition, numerical validations are carried out on four well-established benchmark tests, and the accuracy and applicability of the proposed incompressible axisymmetric LB models are verified.
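
    As a concrete reference point, the sketch below implements the standard D2Q9 BGK building block (single relaxation time) within which such models evolve; the axisymmetric source terms and the extra ghost-mode relaxation parameter described above are not included, so this is only the common core, not the proposed models themselves.

    ```python
    import numpy as np

    # D2Q9 lattice: discrete velocities and weights
    E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    W = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def equilibrium(rho, u):
        """Second-order (Hermite) Maxwellian equilibrium on D2Q9."""
        eu = np.einsum('id,xyd->xyi', E, u)              # e_i . u
        usq = np.einsum('xyd,xyd->xy', u, u)[..., None]  # |u|^2
        return W * rho[..., None] * (1 + 3*eu + 4.5*eu**2 - 1.5*usq)

    def bgk_collide(f, tau):
        """One BGK relaxation step: f <- f - (f - f_eq)/tau.

        f   : distributions, shape (nx, ny, 9)
        tau : relaxation time (tau > 0.5 for stability)"""
        rho = f.sum(axis=-1)
        u = np.einsum('xyi,id->xyd', f, E) / rho[..., None]
        return f - (f - equilibrium(rho, u)) / tau
    ```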

  18. The role of evidence, context, and facilitation in an implementation trial: implications for the development of the PARIHS framework.

    PubMed

    Rycroft-Malone, Jo; Seers, Kate; Chandler, Jackie; Hawkes, Claire A; Crichton, Nicola; Allen, Claire; Bullock, Ian; Strunin, Leo

    2013-03-09

    The case has been made for more and better theory-informed process evaluations within trials in an effort to facilitate insightful understandings of how interventions work. In this paper, we provide an explanation of implementation processes from one of the first national implementation research randomized controlled trials with embedded process evaluation conducted within acute care, and a proposed extension to the Promoting Action on Research Implementation in Health Services (PARIHS) framework. The PARIHS framework was prospectively applied to guide decisions about intervention design, data collection, and analysis processes in a trial focussed on reducing peri-operative fasting times. In order to capture a holistic picture of implementation processes, the same data were collected across 19 participating hospitals irrespective of allocation to intervention. This paper reports on findings from data collected from a purposive sample of 151 staff and patients pre- and post-intervention. Data were analysed using content analysis within, and then across, data sets. A robust and uncontested evidence base was a necessary, but not sufficient, condition for practice change, in that individual staff and patient responses such as caution influenced decision making. The implementation context was challenging: individuals and teams were bounded by professional issues, communication challenges, power, and a lack of clarity about the authority and responsibility for practice change. Progress was made in sites where processes were aligned with existing initiatives. Additionally, facilitators reported engaging in many intervention implementation activities, some of which resulted in practice changes but not in significant improvements to outcomes. This study provided an opportunity for reflection on the comprehensiveness of the PARIHS framework. Consistent with the underlying tenet of PARIHS, a multi-faceted and dynamic story of implementation was evident. However, the prominent role that individuals played as part of the interaction between evidence and context is not currently explicit within the framework. We propose that successful implementation of evidence into practice is a planned facilitated process involving an interplay between individuals, evidence, and context to promote evidence-informed practice. This proposal will enhance the potential of the PARIHS framework for explanation, and ensure theoretical development both informs and responds to the evidence base for implementation.

  19. Dynamic Experiment Design Regularization Approach to Adaptive Imaging with Array Radar/SAR Sensor Systems

    PubMed Central

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene, observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with reproducing kernel structures adapted to the metric of this solution space. Next, the “model-free” variational analysis (VA)-based image enhancement approach and the “model-based” descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate kernel adaptive anisotropic windowing with projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations. PMID:22163859
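
    For context, the abstract positions DYED as a generalization of classical nonparametric adaptive estimators such as MVDR; for reference, the standard MVDR spatial spectrum (a textbook form, not taken from this paper) is:

    ```latex
    \hat{P}_{\mathrm{MVDR}}(\theta) \;=\; \frac{1}{\mathbf{a}^{H}(\theta)\, \hat{\mathbf{R}}^{-1}\, \mathbf{a}(\theta)}
    ```

    where a(θ) is the array steering vector toward direction θ and R̂ is the sample covariance matrix of the received snapshots.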

  20. YoTube: Searching Action Proposal Via Recurrent and Static Regression Networks

    NASA Astrophysics Data System (ADS)

    Zhu, Hongyuan; Vial, Romain; Lu, Shijian; Peng, Xi; Fu, Huazhu; Tian, Yonghong; Cao, Xianbin

    2018-06-01

    In this paper, we present YoTube, a novel network fusion framework for searching action proposals in untrimmed videos, where each action proposal corresponds to a spatio-temporal video tube that potentially locates one human action. Our method consists of a recurrent YoTube detector and a static YoTube detector: the recurrent YoTube explores the regression capability of RNNs for candidate bounding box prediction using learnt temporal dynamics, while the static YoTube produces the bounding boxes using rich appearance cues in a single frame. Both networks are trained using RGB and optical flow in order to fully exploit the rich appearance, motion and temporal context, and their outputs are fused to produce accurate and robust proposal boxes. Action proposals are finally constructed by linking these boxes using dynamic programming with a novel trimming method to handle untrimmed videos effectively and efficiently. Extensive experiments on the challenging UCF-101 and UCF-Sports datasets show that our proposed technique obtains superior performance compared with the state of the art.

  1. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  2. Acquisition and production of skilled behavior in dynamic decision-making tasks. Semiannual Status Report M.S. Thesis - Georgia Inst. of Tech., Nov. 1992

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex; Kossack, Merrick Frank

    1993-01-01

    This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for the purpose of display interface design or enhancement has run the risk of failing to improve user performance, because the analysis results in only a sequential listing of user tasks. Adopting an ecological approach to the task analysis, however, may provide the modeling of an unpredictable and variable task domain required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework designed for this purpose. The purpose of this research is to measure the framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance on different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance compared to that achieved using the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided, since the analysis was performed on only this one particular system.

  3. Simultaneous-Fault Diagnosis of Gearboxes Using Probabilistic Committee Machine

    PubMed Central

    Zhong, Jian-Hua; Wong, Pak Kin; Yang, Zhi-Xin

    2016-01-01

    This study combines signal de-noising, feature extraction, two pairwise-coupled relevance vector machines (PCRVMs) and particle swarm optimization (PSO) for parameter optimization to form an intelligent diagnostic framework for gearbox fault detection. Firstly, the sensor signals are de-noised using the wavelet threshold method to lower the noise level. Then, the Hilbert-Huang transform (HHT) and energy pattern calculation are applied to extract the fault features from the de-noised signals. After that, an eleven-dimensional vector, which consists of the energies of nine intrinsic mode functions (IMFs) and the maximum value of the HHT marginal spectrum together with its corresponding frequency component, is obtained to represent the features of each gearbox fault. The two PCRVMs serve as two different fault detection committee members, trained using vibration and sound signals, respectively. The individual diagnostic result from each committee member is then combined by applying a new probabilistic ensemble method, which can improve the overall diagnostic accuracy and increase the number of detectable faults as compared to individual classifiers acting alone. The effectiveness of the proposed framework is experimentally verified using test cases. The experimental results show the proposed framework is superior to existing single classifiers in terms of diagnostic accuracy for both single and simultaneous faults in the gearbox. PMID:26848665
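
    A minimal sketch of assembling the eleven-dimensional feature vector described above, assuming the IMFs and the HHT marginal spectrum have already been computed by the decomposition step; the function name and the final normalization are hypothetical additions.

    ```python
    import numpy as np

    def gearbox_feature_vector(imfs, marg_spec, freqs):
        """Build the eleven-dimensional feature vector: energies of
        nine IMFs, plus the peak of the HHT marginal spectrum and its
        corresponding frequency.

        imfs      : array (9, n_samples), assumed precomputed by HHT/EMD
        marg_spec : array (n_freqs,), HHT marginal spectrum
        freqs     : array (n_freqs,), corresponding frequency axis
        """
        energies = np.sum(imfs**2, axis=1)    # energy of each IMF
        k = int(np.argmax(marg_spec))         # spectral peak location
        feat = np.concatenate([energies, [marg_spec[k], freqs[k]]])
        return feat / np.linalg.norm(feat)    # hypothetical normalization
    ```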

  4. Reconsideration of Plant Morphological Traits: From a Structure-Based Perspective to a Function-Based Evolutionary Perspective

    PubMed Central

    Bai, Shu-Nong

    2017-01-01

    This opinion article proposes a novel alignment of traits in plant morphogenesis from a function-based evolutionary perspective. As a member species of the ecosystem on Earth, we human beings view our neighbor organisms through our own sensing systems. We tend to distinguish forms and structures (i.e., “morphological traits”) mainly through vision. Traditionally, a plant was considered to consist of three parts, i.e., the shoot, the leaves, and the root. Based on such a “structure-based perspective,” evolutionary analyses or comparisons across species were made on particular parts or their derived structures. So far, no conceptual framework has been established to incorporate the morphological traits of all three land plant phyta, i.e., bryophyta, pteridophyta, and spermatophyta, for evolutionary developmental analysis. Using the tenets of the recently proposed concept of the sexual reproduction cycle, the major morphological traits of land plants can be aligned into five categories from a function-based evolutionary perspective. From this perspective, and the resulting alignment, a new conceptual framework emerges, called “Plant Morphogenesis 123.” This framework views a plant as a colony of integrated plant developmental units that are each produced via one life cycle. This view provides an alternative perspective for evolutionary developmental investigation in plants. PMID:28360919

  5. Simultaneous-Fault Diagnosis of Gearboxes Using Probabilistic Committee Machine.

    PubMed

    Zhong, Jian-Hua; Wong, Pak Kin; Yang, Zhi-Xin

    2016-02-02

    This study combines signal de-noising, feature extraction, two pairwise-coupled relevance vector machines (PCRVMs) and particle swarm optimization (PSO) for parameter optimization to form an intelligent diagnostic framework for gearbox fault detection. Firstly, the noises of sensor signals are de-noised by using the wavelet threshold method to lower the noise level. Then, the Hilbert-Huang transform (HHT) and energy pattern calculation are applied to extract the fault features from de-noised signals. After that, an eleven-dimension vector, which consists of the energies of nine intrinsic mode functions (IMFs), maximum value of HHT marginal spectrum and its corresponding frequency component, is obtained to represent the features of each gearbox fault. The two PCRVMs serve as two different fault detection committee members, and they are trained by using vibration and sound signals, respectively. The individual diagnostic result from each committee member is then combined by applying a new probabilistic ensemble method, which can improve the overall diagnostic accuracy and increase the number of detectable faults as compared to individual classifiers acting alone. The effectiveness of the proposed framework is experimentally verified by using test cases. The experimental results show the proposed framework is superior to existing single classifiers in terms of diagnostic accuracies for both single- and simultaneous-faults in the gearbox.

  6. Infinite slope stability under steady unsaturated seepage conditions

    USGS Publications Warehouse

    Lu, Ning; Godt, Jonathan W.

    2008-01-01

    We present a generalized framework for the stability of infinite slopes under steady unsaturated seepage conditions. The analytical framework allows the water table to be located at any depth below the ground surface and variation of soil suction and moisture content above the water table under steady infiltration conditions. The framework also explicitly considers the effect of weathering and porosity increase near the ground surface on changes in the friction angle of the soil. The factor of safety is conceptualized as a function of the depth within the vadose zone and can be reduced to the classical analytical solution for subaerial infinite slopes in the saturated zone. Slope stability analyses with hypothetical sandy and silty soils are conducted to illustrate the effectiveness of the framework. These analyses indicate that for hillslopes of both sandy and silty soils, failure can occur above the water table under steady infiltration conditions, which is consistent with some field observations that cannot be predicted by the classical infinite slope theory. A case study of shallow slope failures of sandy colluvium on steep coastal hillslopes near Seattle, Washington, is presented to examine the predictive utility of the proposed framework.
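
    For reference, the classical saturated-zone expression to which the proposed factor of safety reduces can be written in one common textbook form (symbols here are standard soil-mechanics notation, not taken verbatim from the paper):

    ```latex
    FS \;=\; \frac{c' + \left(\gamma z \cos^{2}\beta - u_w\right)\tan\phi'}{\gamma z \,\sin\beta \cos\beta}
    ```

    where β is the slope angle, z the vertical depth of the failure plane, γ the soil unit weight, u_w the pore-water pressure, and c' and φ' the effective cohesion and friction angle. The framework's generalization lets the suction-dependent term vary above the water table under steady infiltration and allows φ' to vary with depth to reflect weathering.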

  7. Expanding the landscape of N = 2 rank 1 SCFTs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argyres, Philip C.; Lotito, Matteo; Lu, Yongchao

    Here, we refine our previous proposal [1-3] for systematically classifying 4d rank-1 N = 2 SCFTs by constructing their possible Coulomb branch geometries. Four new recently discussed rank-1 theories [4, 5], including novel N = 3 SCFTs, sit beautifully in our refined classification framework. By arguing for the consistency of their RG flows we can make a strong case for the existence of at least four additional rank-1 SCFTs, nearly doubling the number of known rank-1 SCFTs. The refinement consists of relaxing the assumption that the flavor symmetries of the SCFTs have no discrete factors. This results in an enlarged (but finite) set of possible rank-1 SCFTs. Their existence can be further constrained using consistency of their central charges and RG flows.

  8. Expanding the landscape of N = 2 rank 1 SCFTs

    DOE PAGES

    Argyres, Philip C.; Lotito, Matteo; Lu, Yongchao; ...

    2016-05-16

    Here, we refine our previous proposal [1-3] for systematically classifying 4d rank-1 N = 2 SCFTs by constructing their possible Coulomb branch geometries. Four new recently discussed rank-1 theories [4, 5], including novel N = 3 SCFTs, sit beautifully in our refined classification framework. By arguing for the consistency of their RG flows we can make a strong case for the existence of at least four additional rank-1 SCFTs, nearly doubling the number of known rank-1 SCFTs. The refinement consists of relaxing the assumption that the flavor symmetries of the SCFTs have no discrete factors. This results in an enlarged (but finite) set of possible rank-1 SCFTs. Their existence can be further constrained using consistency of their central charges and RG flows.

  9. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot a first investigation of how changing framework variables alters perceptions of integration. The framework was constructed using an iterative process informed by Soft Systems Methodology. The Virtual Physiological Human (VPH) initiative was used as a source of new computational models. The technical challenges associated with the development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element comprises three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model. The third element is a description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.

  10. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot a first investigation of how changing framework variables alters perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft Systems Methodology. The Virtual Physiological Human (VPH) initiative was used as a source of new computational models. The technical challenges associated with the development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element comprises three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model. The third element is a description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466

  11. A framework for longitudinal data analysis via shape regression

    NASA Astrophysics Data System (ADS)

    Fishbaugh, James; Durrleman, Stanley; Piven, Joseph; Gerig, Guido

    2012-02-01

    Traditional longitudinal analysis begins by extracting desired clinical measurements, such as volume or head circumference, from discrete imaging data. Typically, the continuous evolution of a scalar measurement is estimated by choosing a 1D regression model, such as kernel regression or fitting a polynomial of fixed degree. This type of analysis not only leads to separate models for each measurement, but there is no clear anatomical or biological interpretation to aid in the selection of the appropriate paradigm. In this paper, we propose a consistent framework for the analysis of longitudinal data by estimating the continuous evolution of shape over time as twice differentiable flows of deformations. In contrast to 1D regression models, one model is chosen to realistically capture the growth of anatomical structures. From the continuous evolution of shape, we can simply extract any clinical measurements of interest. We demonstrate on real anatomical surfaces that volume extracted from a continuous shape evolution is consistent with a 1D regression performed on the discrete measurements. We further show how the visualization of shape progression can aid in the search for significant measurements. Finally, we present an example on a shape complex of the brain (left hemisphere, right hemisphere, cerebellum) that demonstrates a potential clinical application for our framework.

  12. A consistent conceptual framework for applying climate metrics in technology life cycle assessment

    NASA Astrophysics Data System (ADS)

    Mallapragada, Dharik; Mignone, Bryan K.

    2017-07-01

    Comparing the potential climate impacts of different technologies is challenging for several reasons, including the fact that any given technology may be associated with emissions of multiple greenhouse gases when evaluated on a life cycle basis. In general, analysts must decide how to aggregate the climatic effects of different technologies, taking into account differences in the properties of the gases (differences in atmospheric lifetimes and instantaneous radiative efficiencies) as well as different technology characteristics (differences in emission factors and technology lifetimes). Available metrics proposed in the literature have incorporated these features in different ways and have arrived at different conclusions. In this paper, we develop a general framework for classifying metrics based on whether they measure: (a) cumulative or end point impacts, (b) impacts over a fixed time horizon or up to a fixed end year, and (c) impacts from a single emissions pulse or from a stream of pulses over multiple years. We then use the comparison between compressed natural gas and gasoline-fueled vehicles to illustrate how the choice of metric can affect conclusions about technologies. Finally, we consider tradeoffs involved in selecting a metric, show how the choice of metric depends on the framework that is assumed for climate change mitigation, and suggest which subset of metrics are likely to be most analytically self-consistent.
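
    A minimal sketch of the cumulative-versus-end-point distinction for a single emission pulse over a fixed time horizon, the first two axes of the classification above. The single-exponential gas model, parameter values, and function names are illustrative assumptions, not literature values (real gases, notably CO2, need multi-term impulse-response functions).

    ```python
    import numpy as np

    def pulse_forcing(t, lifetime, efficiency):
        """Forcing at time t from a unit emission pulse at t = 0,
        idealized as a single exponential decay."""
        return efficiency * np.exp(-t / lifetime)

    def endpoint_metric(lifetime, efficiency, horizon):
        """End-point metric: forcing remaining at a fixed horizon."""
        return pulse_forcing(horizon, lifetime, efficiency)

    def cumulative_metric(lifetime, efficiency, horizon, dt=0.01):
        """Cumulative metric: forcing integrated over [0, horizon],
        i.e., the structure of an absolute GWP for a single pulse."""
        t = np.arange(0.0, horizon, dt)
        return float(np.sum(pulse_forcing(t, lifetime, efficiency)) * dt)

    # Illustrative placeholders only: a short-lived, potent gas versus
    # a long-lived, weak one, compared over a 100-year horizon.
    H = 100.0
    ratio = cumulative_metric(12.0, 1.0, H) / cumulative_metric(500.0, 0.01, H)
    print(f"cumulative-metric ratio over {H:.0f} yr: {ratio:.1f}")
    ```

    Extending the same functions to a stream of pulses over multiple years, or to a fixed end year rather than a fixed horizon, reproduces the remaining branches of the classification.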

  13. Ergonomics action research II: a framework for integrating HF into work system design.

    PubMed

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  14. Total Variation with Overlapping Group Sparsity for Image Deblurring under Impulse Noise

    PubMed Central

    Liu, Gang; Huang, Ting-Zhu; Liu, Jun; Lv, Xiao-Guang

    2015-01-01

    The total variation (TV) regularization method is an effective method for image deblurring that preserves edges. However, TV-based solutions usually exhibit staircase effects. In order to alleviate these staircase effects, we propose a new model for restoring blurred images under impulse noise. The model consists of an ℓ1-fidelity term and a TV with overlapping group sparsity (OGS) regularization term. Moreover, we impose a box constraint on the proposed model to obtain more accurate solutions. Our solution algorithm is developed under the framework of the alternating direction method of multipliers (ADMM), with an inner loop nested inside a majorization-minimization (MM) iteration for the subproblem of the proposed method. Compared with other TV-based methods, numerical results illustrate that the proposed method can significantly improve the restoration quality, both in terms of peak signal-to-noise ratio (PSNR) and relative error (ReE). PMID:25874860
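
    Schematically, the restoration model described above can be written as the constrained minimization below, with K the blurring operator, f the observed image, and Φ_OGS the overlapping-group-sparsity functional applied to the discrete gradient (the notation is mine, not the paper's):

    ```latex
    \min_{\mathbf{u}\,\in\,[0,1]^{n}} \;\; \|K\mathbf{u} - \mathbf{f}\|_{1} \;+\; \lambda\,\Phi_{\mathrm{OGS}}(\nabla \mathbf{u})
    ```

    The ℓ1 fidelity term suits impulse noise, the OGS penalty suppresses staircase artifacts, and the box constraint keeps intensities in range; ADMM splits these three ingredients into separate subproblems.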

  15. Exploiting salient semantic analysis for information retrieval

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are mostly unknown. In this paper, we study how to efficiently use SSA to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard text retrieval conference (TREC) collections. Experimental results show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
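
    Under the language-model framework, one natural way to combine the two representations is a linear interpolation of the bag-of-words and concept-based document models; the abstract does not give the exact estimator, so the mixture below is an illustrative assumption:

    ```latex
    P(w \mid d) \;=\; \alpha\, P_{\mathrm{BOW}}(w \mid d) \;+\; (1 - \alpha)\, P_{\mathrm{SSA}}(w \mid d), \qquad 0 \le \alpha \le 1
    ```

    where the mixing weight α trades off exact term matching against conceptual matching and would be tuned on held-out queries.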

  16. eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.

    PubMed

    Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre

    2016-11-01

    Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high-throughput technologies, schema or model variability induced by large-scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variability in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variability and propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work, through a dynamic integration process; 2) variability among studies, using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomics platform.

  17. A continuum mechanics constitutive framework for transverse isotropic soft tissues

    NASA Astrophysics Data System (ADS)

    Garcia-Gonzalez, D.; Jérusalem, A.; Garzon-Hernandez, S.; Zaera, R.; Arias, A.

    2018-03-01

    In this work, a continuum constitutive framework for the mechanical modelling of soft tissues is developed that incorporates strain-rate and temperature dependencies as well as the transverse isotropy arising from fibres embedded in a soft matrix. The constitutive formulation is based on a Helmholtz free energy function decoupled into the contribution of a viscous-hyperelastic matrix and the contribution of fibres introducing dispersion-dependent transverse isotropy. The proposed framework considers finite-deformation kinematics, is thermodynamically consistent, and allows for the particularisation of the energy potentials and flow equations of each constitutive branch. In this regard, the approach developed herein provides the basis on which specific constitutive models can be formulated for a wide variety of soft tissues. To illustrate this versatility, the constitutive framework is particularised here for animal and human white matter and skin, for which constitutive models are provided. In both cases, different energy functions are considered: Neo-Hookean, Gent, and Ogden. Finally, the ability of the approach to capture the experimental behaviour of the two soft tissues is confirmed.
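
    The decoupling described above can be sketched as an additive split of the Helmholtz free energy; the argument lists below are indicative only, and the concrete functional forms are the Neo-Hookean, Gent, and Ogden particularisations mentioned in the abstract:

    ```latex
    \Psi \;=\; \Psi_{\mathrm{matrix}}\!\left(\mathbf{C}, \boldsymbol{\xi}, T\right) \;+\; \Psi_{\mathrm{fibre}}\!\left(\mathbf{C}, \mathbf{a}_0 \otimes \mathbf{a}_0, \kappa\right)
    ```

    with C the right Cauchy-Green deformation tensor, ξ internal variables carrying the viscous (rate-dependent) response, T the temperature, a_0 the mean fibre direction, and κ a fibre-dispersion parameter.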

  18. A Strategic Framework for Responding to Coral Bleaching Events in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Maynard, J. A.; Johnson, J. E.; Marshall, P. A.; Eakin, C. M.; Goby, G.; Schuttenberg, H.; Spillman, C. M.

    2009-07-01

    The frequency and severity of mass coral bleaching events are predicted to increase as ocean temperatures continue to rise. Bleaching events can be disastrous for coral reef ecosystems and, given the number of other stressors to reefs that result from human activities, there is widespread concern about their future. This article provides a strategic framework, developed for the Great Barrier Reef, for preparing for and responding to mass bleaching events. The framework has two main inter-related components: an early warning system, and assessment and monitoring. Both include the need to proactively and consistently communicate information on environmental conditions and the level of bleaching severity to senior decision-makers, stakeholders, and the public. Managers, being the most timely and credible source of information on bleaching events, can facilitate the implementation of strategies that give reefs the best chance to recover from bleaching and to withstand future disturbances. The proposed framework is readily transferable to other coral reef regions and can easily be adapted by managers to local financial, technical, and human resources.

  19. Development of intuitive rules: evaluating the application of the dual-system framework to understanding children's intuitive reasoning.

    PubMed

    Osman, Magda; Stavy, Ruth

    2006-12-01

    Theories of adult reasoning propose that reasoning consists of two functionally distinct systems that operate under entirely different mechanisms. This theoretical framework has been used to account for a wide range of phenomena, which now encompasses developmental research on reasoning and problem solving. We begin this review by contrasting three main dual-system theories of adult reasoning (Evans & Over, 1996; Sloman, 1996; Stanovich & West, 2000) with a well-established developmental account that also incorporates a dual-system framework (Brainerd & Reyna, 2001). We use developmental studies of the formation and application of intuitive rules in science and mathematics to evaluate the claims that these theories make. Overall, the evidence reviewed suggests that what is crucial to understanding how children reason is the saliency of the features that are presented within a task. By highlighting the importance of saliency as a way of understanding reasoning, we aim to provide clarity concerning the benefits and limitations of adopting a dual-system framework to account for evidence from developmental studies of intuitive reasoning.

  20. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is reduced by 81%, and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  1. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is reduced by 81%, and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  2. Knowledge Management in Role Based Agents

    NASA Astrophysics Data System (ADS)

    Kır, Hüseyin; Ekinci, Erdem Eser; Dikenelli, Oguz

    In the multi-agent system literature, the role concept is increasingly researched to provide an abstraction to scope the beliefs, norms and goals of agents and to shape the relationships of the agents in the organization. In this research, we propose a knowledgebase architecture to increase the applicability of roles in the MAS domain by drawing inspiration from the self concept in the role theory of sociology. The proposed knowledgebase architecture has a granulated structure that is dynamically organized according to the agent's identification in a social environment. Thanks to this dynamic structure, agents are enabled to work on consistent knowledge in spite of inevitable conflicts between roles and the agent. The knowledgebase architecture has also been implemented and incorporated into the SEAGENT multi-agent system development framework.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malikopoulos, Andreas; Djouadi, Seddik M; Kuruganti, Teja

    We consider the optimal stochastic control problem for home energy systems with solar and energy storage devices when the demand is met from the grid. The demand is subject to Brownian motion with both drift and variance parameters modulated by a continuous-time Markov chain that represents the regime of electricity price. We model the system as a pure stochastic differential equation model and then follow the completing-the-square technique to solve the stochastic home energy management problem. The effectiveness of the proposed approach is validated through a simulation example. For practical situations with constraints consistent with those studied here, our results imply that the proposed framework could reduce the electricity cost of short-term purchases in the peak-hour market.
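
    The demand model stated above can be written compactly as a regime-switching diffusion; this is a direct transcription of the abstract's description, with the symbols chosen here:

    ```latex
    dD_t \;=\; \mu(\theta_t)\, dt \;+\; \sigma(\theta_t)\, dW_t
    ```

    where D_t is the demand drawn from the grid, W_t a standard Brownian motion, and θ_t the continuous-time Markov chain encoding the electricity-price regime. With a quadratic cost, this linear structure is what makes the completing-the-square solution tractable.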

  4. Robust High-Resolution Cloth Using Parallelism, History-Based Collisions and Accurate Friction

    PubMed Central

    Selle, Andrew; Su, Jonathan; Irving, Geoffrey; Fedkiw, Ronald

    2015-01-01

    In this paper we simulate high-resolution cloth consisting of up to 2 million triangles, which allows us to achieve highly detailed folds and wrinkles. Since the level of detail is also influenced by object collision and self-collision, we propose a more accurate model for cloth-object friction. We also propose a robust history-based repulsion/collision framework where repulsions are treated accurately and efficiently on a per-time-step basis. Distributed-memory parallelism is used for both time evolution and collisions, and we specifically address Gauss-Seidel ordering of the repulsion/collision response. The algorithm is demonstrated by several high-resolution and high-fidelity simulations. PMID:19147895

  5. Visually defining and querying consistent multi-granular clinical temporal abstractions.

    PubMed

    Combi, Carlo; Oliboni, Barbara

    2012-02-01

    The main goal of this work is to propose a framework for the visual specification and query of consistent multi-granular clinical temporal abstractions. We focus on the issue of querying patient clinical information by visually defining and composing temporal abstractions, i.e., high-level patterns derived from several time-stamped raw data. In particular, we focus on the visual specification of consistent temporal abstractions with different granularities and on the visual composition of different temporal abstractions for querying clinical databases. Temporal abstractions on clinical data provide a concise, high-level description of temporal raw data and a suitable way to support decision making. Granularities define partitions on the time line and allow one to represent time, and thus temporal clinical information, at different levels of detail, according to the requirements of the represented clinical domain. The visual representation of temporal information has been studied for several years in clinical domains. Proposed visualization techniques must be easy and quick to understand, and can benefit from visual metaphors that do not lead to ambiguous interpretations. Recently, physical metaphors such as strips, springs, weights, and wires have been proposed and evaluated on clinical users for the specification of temporal clinical abstractions. Visual approaches to boolean queries have been considered in recent years and have confirmed that visual support for the specification of complex boolean queries is both an important and difficult research topic. We propose and describe a visual language for the definition of temporal abstractions based on a set of intuitive metaphors (striped wall, plastered wall, brick wall), allowing the clinician to use different granularities. A new algorithm, underlying the visual language, allows the physician to specify only consistent abstractions, i.e., abstractions not containing contradictory conditions on the component abstractions. Moreover, we propose a visual query language in which different temporal abstractions can be composed to build complex queries: temporal abstractions are visually connected through the usual logical connectives AND, OR, and NOT. The proposed visual language allows one to define temporal abstractions simply, using intuitive metaphors, and to specify temporal intervals related to abstractions using different temporal granularities. The physician can interact with the designed and implemented tool by point-and-click selections, and can visually compose queries involving several temporal abstractions. The evaluation of the proposed granularity-related metaphors consisted of two parts: (i) solving 30 interpretation exercises by choosing the correct interpretation of a given screenshot representing a possible scenario, and (ii) solving a complex exercise by visually specifying through the interface a scenario described only in natural language. The exercises were completed by 13 subjects. The percentage of correct answers to the interpretation exercises differed slightly across the considered metaphors (54.4--striped wall, 73.3--plastered wall, 61--brick wall, and 61--no wall), but post hoc statistical analysis of the means confirmed that the differences were not statistically significant. The user-satisfaction questionnaire related to the evaluation of the proposed granularity-related metaphors confirmed that there was no preference for any one of them.
    The evaluation of the proposed logical notation consisted of two parts: (i) solving five interpretation exercises, each based on a screenshot representing a possible scenario and three different possible interpretations, of which only one was correct, and (ii) solving five exercises by visually defining through the interface a scenario described only in natural language. The exercises had increasing difficulty. The evaluation involved a total of 31 subjects. Results from this evaluation phase confirmed the soundness of the proposed solution, even in comparison with a well-known proposal based on a tabular query form (the only significant difference being that our proposal requires more time for the training phase: 21 min versus 14 min). In this work we have considered the issue of visually composing and querying temporal clinical patient data. In this context we have proposed a visual framework for the specification of consistent temporal abstractions with different granularities and for the visual composition of different temporal abstractions to build (possibly) complex queries on clinical databases. A new algorithm has been proposed to check the consistency of the specified granular abstractions. The evaluation of the proposed metaphors and interfaces, and the comparison of the visual query language with a well-known visual method for boolean queries, confirmed the soundness of the overall system; moreover, pros and cons and possible improvements emerged from the comparison of the different visual metaphors and solutions.

  6. Self-organizing network services with evolutionary adaptation.

    PubMed

    Nakano, Tadashi; Suda, Tatsuya

    2005-09-01

    This paper proposes a novel framework for developing adaptive and scalable network services. In the proposed framework, a network service is implemented as a group of autonomous agents that interact in the network environment. Agents in the proposed framework are autonomous and capable of simple behaviors (e.g., replication, migration, and death). In this paper, an evolutionary adaptation mechanism is designed using genetic algorithms (GAs) for agents to evolve their behaviors and improve their fitness values (e.g., response time to a service request) to the environment. The proposed framework is evaluated through simulations, and the simulation results demonstrate the ability of autonomous agents to adapt to the network environment. The proposed framework may be suitable for disseminating network services in dynamic and large-scale networks where a large number of data and services need to be replicated, moved, and deleted in a decentralized manner.
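
    A minimal sketch of the genetic-algorithm layer, under stated assumptions: the genome encoding, fitness function, and generational selection scheme below are hypothetical stand-ins for the paper's mechanism, in which fitness reflects quantities measured in the network (e.g., response time to a service request) and evolution emerges from the agents' own replication and death.

    ```python
    import random

    # Each genome encodes hypothetical behavior propensities:
    # [replication_rate, migration_rate, death_rate]
    def random_genome():
        return [random.random() for _ in range(3)]

    def fitness(genome):
        """Placeholder fitness rewarding a hypothetical balance of
        behaviors; a real deployment would measure service quality."""
        rep, mig, die = genome
        return rep * 0.5 + mig * 0.3 - die * 0.2

    def evolve(pop, n_gen=50, mut=0.1):
        for _ in range(n_gen):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:len(pop) // 2]          # truncation selection
            children = []
            while len(children) < len(pop) - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, 3)       # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < mut:          # point mutation
                    i = random.randrange(3)
                    child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve([random_genome() for _ in range(20)])
    print("best behavior propensities:", [round(g, 2) for g in best])
    ```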

  7. 77 FR 52791 - Regulatory Capital Rules: Regulatory Capital, Implementation of Basel III, Minimum Regulatory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and the Federal Deposit Insurance Corporation (FDIC) (collectively, the agencies) are seeking comment on three Notices of Proposed Rulemaking (NPR) that would revise and replace the agencies' current capital rules. In this NPR, the agencies are proposing to revise their risk-based and leverage capital requirements consistent with agreements reached by the Basel Committee on Banking Supervision (BCBS) in ``Basel III: A Global Regulatory Framework for More Resilient Banks and Banking Systems'' (Basel III). The proposed revisions would include implementation of a new common equity tier 1 minimum capital requirement, a higher minimum tier 1 capital requirement, and, for banking organizations subject to the advanced approaches capital rules, a supplementary leverage ratio that incorporates a broader set of exposures in the denominator measure. Additionally, consistent with Basel III, the agencies are proposing to apply limits on a banking organization's capital distributions and certain discretionary bonus payments if the banking organization does not hold a specified amount of common equity tier 1 capital in addition to the amount necessary to meet its minimum risk- based capital requirements. This NPR also would establish more conservative standards for including an instrument in regulatory capital. As discussed in the proposal, the revisions set forth in this NPR are consistent with section 171 of the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act), which requires the agencies to establish minimum risk-based and leverage capital requirements. In connection with the proposed changes to the agencies' capital rules in this NPR, the agencies are also seeking comment on the two related NPRs published elsewhere in today's Federal Register. The two related NPRs are discussed further in the SUPPLEMENTARY INFORMATION.

  8. The CRISP theory of hippocampal function in episodic memory

    PubMed Central

    Cheng, Sen

    2013-01-01

Over the past four decades, a “standard framework” has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by the dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. As in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results, and suggest testable predictions. It appears that CRISP accounts not only for those experimental results that are consistent with the standard framework, but also for results that are at odds with it. We therefore suggest that CRISP is a viable, and perhaps superior, theory of hippocampal function in episodic memory. PMID:23653597

  9. A framework for delineating the regional boundaries of PM2.5 pollution: A case study of China.

    PubMed

    Liu, Jianzheng; Li, Weifeng; Wu, Jiansheng

    2018-04-01

Fine particulate matter (PM2.5) pollution has been a major issue in many countries. Numerous studies have demonstrated that PM2.5 pollution is a regional issue, but little research has been done to investigate the regional extent of PM2.5 pollution or to define areas in which PM2.5 pollutants interact. To allow for a better understanding of the regional nature and spatial patterns of PM2.5 pollution, this study proposes a novel framework for delineating regional boundaries of PM2.5 pollution. The framework consists of four steps: cross-correlation analysis, time-series clustering, generation of Voronoi polygons, and polygon smoothing using the polynomial approximation with exponential kernel method. Using the framework, the regional PM2.5 boundaries for China are produced; these boundaries define areas where the monthly PM2.5 time series of any two cities show, on average, more than 50% similarity with each other. These areas demonstrate straightforwardly that PM2.5 pollution is not limited to a single city or a single province. We also found that the PM2.5 areas in China tend to be larger in cold months but more fragmented in warm months, suggesting that, in cold months, the interactions between PM2.5 concentrations in adjacent cities are stronger than in warmer months. The proposed framework provides a tool to delineate PM2.5 boundaries and identify areas where PM2.5 pollutants interact. It can help define air pollution management zones and assess impacts related to PM2.5 pollution. It can also be used in analyses of other air pollutants. Copyright © 2017 Elsevier Ltd. All rights reserved.
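
    As a rough illustration of the first two steps (cross-correlation and time-series clustering), the sketch below groups cities whose monthly series correlate, on average, above the 50% similarity level. The toy data, function names, and clustering choices are assumptions for illustration, not the paper's code.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def similarity_matrix(series):
        # series: (n_cities, n_months); pairwise Pearson correlations.
        z = (series - series.mean(axis=1, keepdims=True)) \
            / series.std(axis=1, keepdims=True)
        return np.clip(z @ z.T / series.shape[1], -1.0, 1.0)

    def cluster_cities(series, threshold=0.5):
        # Average-linkage clustering on distance = 1 - correlation; cities
        # whose mean similarity exceeds `threshold` end up in one cluster.
        sim = similarity_matrix(series)
        dist = 1.0 - sim[np.triu_indices_from(sim, k=1)]  # condensed form
        tree = linkage(dist, method='average')
        return fcluster(tree, t=1.0 - threshold, criterion='distance')

    pm25 = np.random.rand(20, 36)   # toy data: 20 cities, 36 monthly means
    labels = cluster_cities(pm25)   # cluster label per city
    ```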

  10. Trade Services System Adaptation for Sustainable Development

    NASA Astrophysics Data System (ADS)

    Khrichenkov, A.; Shaufler, V.; Bannikova, L.

    2017-11-01

Under market conditions, the trade services system in post-Soviet Russia, one of the most important city infrastructures, loses its systematic and hierarchical consistency, thereby provoking the degradation of the connected transport systems and of the urban planning framework. This article describes the results of research carried out to identify the objects and object parameters that influence the functioning of a locally significant trade services system. Based on the revealed consumer behaviour patterns, we propose methods to determine the optimal parameters of objects within a locally significant trade services system.

  11. Assessing the formability of metallic sheets by means of localized and diffuse necking models

    NASA Astrophysics Data System (ADS)

    Comşa, Dan-Sorin; Lǎzǎrescu, Lucian; Banabic, Dorel

    2016-10-01

The main objective of the paper is to elaborate a unified framework for the theoretical assessment of sheet metal formability. Hill's localized necking model and the Extended Maximum Force Criterion proposed by Mattiasson, Sigvant, and Larsson have been selected for this purpose. Both models are thoroughly described together with their solution procedures. A comparison of the theoretical predictions with experimental data on the formability of a DP600 steel sheet is also presented.
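
    For reference, Hill's localized necking model is usually written as below, in its standard textbook form for a power-law hardening sheet; the paper's exact formulation may differ.

    ```latex
    \varepsilon_{1}^{*} \;=\; \frac{n}{1+\beta},
    \qquad
    \beta \;=\; \frac{\mathrm{d}\varepsilon_{2}}{\mathrm{d}\varepsilon_{1}} \le 0 ,
    ```

    where \(\varepsilon_{1}^{*}\) is the limit major strain, \(n\) the hardening exponent of \(\bar{\sigma} = K\,\bar{\varepsilon}^{\,n}\), and \(\beta\) the strain-path ratio; the criterion applies to the left-hand side of the forming limit diagram.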

  12. From virtual clustering analysis to self-consistent clustering analysis: a mathematical study

    NASA Astrophysics Data System (ADS)

    Tang, Shaoqiang; Zhang, Lei; Liu, Wing Kam

    2018-03-01

In this paper, we propose a new homogenization algorithm, virtual clustering analysis (VCA), and provide a mathematical framework for the recently proposed self-consistent clustering analysis (SCA) (Liu et al. in Comput Methods Appl Mech Eng 306:319-341, 2016). In the mathematical theory, we clarify the key assumptions and ideas of VCA and SCA, and derive the continuous and discrete Lippmann-Schwinger equations. Based on a key postulation of "once response similarly, always response similarly", clustering is performed in an offline stage by machine learning techniques (k-means and SOM), and facilitates a substantial reduction of computational complexity in an online predictive stage. The clear mathematical setup allows, for the first time, a convergence study of clustering refinement in one space dimension. Convergence is proved rigorously and found, from numerical investigations, to be of second order. Furthermore, we propose to suitably enlarge the domain in VCA, such that the boundary terms in the Lippmann-Schwinger equation may be neglected by virtue of Saint-Venant's principle. These terms were not obtained in the original SCA paper, and we find that they may well be responsible for the numerical dependency on the choice of reference material property. Since VCA enhances accuracy by overcoming the modeling error and reduces the numerical cost by avoiding an outer-loop iteration for attaining material property consistency in SCA, its efficiency is expected to be even higher than that of the recently proposed SCA algorithm.
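
    For orientation, the continuous Lippmann-Schwinger equation of homogenization that both VCA and SCA discretize can be written in its standard form; the notation below is assumed here, with \(\mathbb{C}^{0}\) a reference stiffness and \(\Gamma^{0}\) its Green operator.

    ```latex
    \varepsilon(\mathbf{x})
    \;=\;
    \bar{\varepsilon}
    \;-\;
    \int_{\Omega}
    \Gamma^{0}(\mathbf{x},\mathbf{x}') :
    \bigl(\mathbb{C}(\mathbf{x}') - \mathbb{C}^{0}\bigr) :
    \varepsilon(\mathbf{x}')\,
    \mathrm{d}\mathbf{x}' ,
    ```

    where \(\bar{\varepsilon}\) is the prescribed average strain; clustering reduces this integral equation to a small algebraic system over cluster-wise constant strains.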

  13. Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter.

    PubMed

    Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei

    2016-11-02

Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., the Strap-down Inertial Navigation System (SINS), Global Positioning System (GPS), Bei-Dou2 (BD2) and Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, by using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities and by the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two-state-propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. With the combination of the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and, thereby, significantly increase the overall reliability and accuracy of the integrated navigation system. Simulation results indicate that the proposed fault-tolerant fusion framework provides superior performance over its traditional counterparts.
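
    The core of a chi-square residual test of the kind the paper builds on can be sketched as follows, in its generic textbook form; the two-state-propagator variant, with its second measurement-free predictor, is not reproduced here.

    ```python
    import numpy as np

    def chi_square_test(innovation, S, threshold):
        """Return True if the measurement is declared faulty.

        innovation: Kalman-type innovation vector nu = z - H x_pred
        S:          innovation covariance
        threshold:  chi-square quantile for dim(nu) degrees of freedom
        """
        nu = np.asarray(innovation).reshape(-1, 1)
        statistic = float(nu.T @ np.linalg.inv(S) @ nu)  # ~ chi^2, dim(nu) dof
        return statistic > threshold

    nu = np.array([0.8, -1.1])                  # toy innovation
    S = np.array([[0.5, 0.1], [0.1, 0.4]])      # toy innovation covariance
    faulty = chi_square_test(nu, S, threshold=5.99)  # 95% point, 2 dof
    ```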

  14. Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter

    PubMed Central

    Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei

    2016-01-01

Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., the Strap-down Inertial Navigation System (SINS), Global Positioning System (GPS), Bei-Dou2 (BD2) and Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system’s error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, by using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities and by the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two-state-propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. With the combination of the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and, thereby, significantly increase the overall reliability and accuracy of the integrated navigation system. Simulation results indicate that the proposed fault-tolerant fusion framework provides superior performance over its traditional counterparts. PMID:27827832

  15. An overview of ethical frameworks in public health: can they be supportive in the evaluation of programs to prevent overweight?

    PubMed Central

    2010-01-01

Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate the extent to which they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in PubMed. Thirdly, we checked the literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761

  16. A tuned mesh-generation strategy for image representation based on data-dependent triangulation.

    PubMed

    Li, Ping; Adams, Michael D

    2013-05-01

    A mesh-generation framework for image representation based on data-dependent triangulation is proposed. The proposed framework is a modified version of the frameworks of Rippa and Garland and Heckbert that facilitates the development of more effective mesh-generation methods. As the proposed framework has several free parameters, the effects of different choices of these parameters on mesh quality are studied, leading to the recommendation of a particular set of choices for these parameters. A mesh-generation method is then introduced that employs the proposed framework with these best parameter choices. This method is demonstrated to produce meshes of higher quality (both in terms of squared error and subjectively) than those generated by several competing approaches, at a relatively modest computational and memory cost.

  17. A hybrid gene selection approach for microarray data classification using cellular learning automata and ant colony optimization.

    PubMed

    Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein

    2016-06-01

This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter stage using the Fisher criterion, which reduces the initial genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features that improves the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using the ROC curve, and the smallest, most effective feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on four microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
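
    A minimal sketch of the primary filter stage, assuming the usual two-class form of the Fisher criterion; the exact variant and cutoff used in the paper are not specified here, and the data are toy.

    ```python
    import numpy as np

    def fisher_scores(X, y):
        # X: (samples, genes); y: binary class labels.
        # Between-class separation over within-class scatter, per gene.
        a, b = X[y == 0], X[y == 1]
        num = (a.mean(axis=0) - b.mean(axis=0)) ** 2
        den = a.var(axis=0) + b.var(axis=0) + 1e-12
        return num / den

    def filter_genes(X, y, k=100):
        # Indices of the top-k genes, passed on to the wrapper stage.
        return np.argsort(fisher_scores(X, y))[::-1][:k]

    X = np.random.rand(60, 2000)              # toy expression matrix
    y = np.array([0] * 30 + [1] * 30)         # toy class labels
    top = filter_genes(X, y, k=100)
    ```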

  18. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    PubMed

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

The integration of heterogeneous electronic health records systems by building an interoperable nationwide electronic health record system provides undisputable benefits in health care, such as superior health information quality, prevention of medical errors, and cost savings. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system, along with the national public administration data communication network infrastructure, to achieve EHR integration at an acceptable implementation cost.

  19. A Framework for Enhancing the Value of Research for Dissemination and Implementation

    PubMed Central

    Glasgow, Russell E.; Carpenter, Christopher R.; Grimshaw, Jeremy M.; Rabin, Borsika A.; Fernandez, Maria E.; Brownson, Ross C.

    2015-01-01

    A comprehensive guide that identifies critical evaluation and reporting elements necessary to move research into practice is needed. We propose a framework that highlights the domains required to enhance the value of dissemination and implementation research for end users. We emphasize the importance of transparent reporting on the planning phase of research in addition to delivery, evaluation, and long-term outcomes. We highlight key topics for which well-established reporting and assessment tools are underused (e.g., cost of intervention, implementation strategy, adoption) and where such tools are inadequate or lacking (e.g., context, sustainability, evolution) within the context of existing reporting guidelines. Consistent evaluation of and reporting on these issues with standardized approaches would enhance the value of research for practitioners and decision-makers. PMID:25393182

  20. A new neural framework for visuospatial processing

    PubMed Central

    Kravitz, Dwight J.; Saleem, Kadharbatcha S.; Baker, Chris I.; Mishkin, Mortimer

    2012-01-01

    The division of cortical visual processing into distinct dorsal and ventral streams is a key framework that has guided visual neuroscience. The characterization of the ventral stream as a ‘What’ pathway is relatively uncontroversial, but the nature of dorsal stream processing is less clear. Originally proposed as mediating spatial perception (‘Where’), more recent accounts suggest it primarily serves non-conscious visually guided action (‘How’). Here, we identify three pathways emerging from the dorsal stream that consist of projections to the prefrontal and premotor cortices, and a major projection to the medial temporal lobe that courses both directly and indirectly through the posterior cingulate and retrosplenial cortices. These three pathways support both conscious and non-conscious visuospatial processing, including spatial working memory, visually guided action and navigation, respectively. PMID:21415848

  1. The domain interface method: a general-purpose non-intrusive technique for non-conforming domain decomposition problems

    NASA Astrophysics Data System (ADS)

    Cafiero, M.; Lloberas-Valls, O.; Cante, J.; Oliver, J.

    2016-04-01

    A domain decomposition technique is proposed which is capable of properly connecting arbitrary non-conforming interfaces. The strategy essentially consists in considering a fictitious zero-width interface between the non-matching meshes which is discretized using a Delaunay triangulation. Continuity is satisfied across domains through normal and tangential stresses provided by the discretized interface and inserted in the formulation in the form of Lagrange multipliers. The final structure of the global system of equations resembles the dual assembly of substructures where the Lagrange multipliers are employed to nullify the gap between domains. A new approach to handle floating subdomains is outlined which can be implemented without significantly altering the structure of standard industrial finite element codes. The effectiveness of the developed algorithm is demonstrated through a patch test example and a number of tests that highlight the accuracy of the methodology and independence of the results with respect to the framework parameters. Considering its high degree of flexibility and non-intrusive character, the proposed domain decomposition framework is regarded as an attractive alternative to other established techniques such as the mortar approach.
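
    The "dual assembly of substructures" the abstract refers to has the familiar saddle-point structure shown below, written in standard FETI-style notation; this notation is assumed here rather than taken from the paper.

    ```latex
    \begin{bmatrix} K & B^{\mathsf{T}} \\ B & 0 \end{bmatrix}
    \begin{bmatrix} u \\ \lambda \end{bmatrix}
    =
    \begin{bmatrix} f \\ 0 \end{bmatrix},
    \qquad
    K = \operatorname{diag}(K_1,\dots,K_s),
    \quad
    B u = \sum_i B_i u_i = 0 ,
    ```

    where \(K\) collects the subdomain stiffness matrices, \(B\) enforces the zero-gap condition across the discretized interface, and \(\lambda\) are the Lagrange multipliers carrying the normal and tangential interface stresses.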

  2. Validation of a finite element method framework for cardiac mechanics applications

    NASA Astrophysics Data System (ADS)

    Danan, David; Le Rolle, Virginie; Hubert, Arnaud; Galli, Elena; Bernard, Anne; Donal, Erwan; Hernández, Alfredo I.

    2017-11-01

Modeling cardiac mechanics is a particularly challenging task, mainly because of the poor understanding of the underlying physiology, the lack of observability and the complexity of the mechanical properties of myocardial tissues. The choice of cardiac mechanics solvers, in particular, entails several difficulties, notably due to the potential instability arising from the nonlinearities inherent to the large deformation framework. Furthermore, the verification of the obtained simulations is a difficult task because there are no analytic solutions for these kinds of problems. Hence, the objective of this work is to provide a quantitative verification of a cardiac mechanics implementation based on two published benchmark problems. The first problem consists of deforming a bar, whereas the second problem concerns the inflation of a truncated ellipsoid-shaped ventricle, both in the steady-state case. Simulations were obtained using the finite element software GETFEM++. Results were compared to the consensus solution published by 11 groups, and the proposed solutions were indistinguishable from it. The validation of the proposed mechanical model implementation is an important step toward the proposal of a global model of cardiac electro-mechanical activity.

  3. Visual Control for Multirobot Organized Rendezvous.

    PubMed

    Lopez-Nicolas, G; Aranda, M; Mezouar, Y; Sagues, C

    2012-08-01

    This paper addresses the problem of visual control of a set of mobile robots. In our framework, the perception system consists of an uncalibrated flying camera performing an unknown general motion. The robots are assumed to undergo planar motion considering nonholonomic constraints. The goal of the control task is to drive the multirobot system to a desired rendezvous configuration relying solely on visual information given by the flying camera. The desired multirobot configuration is defined with an image of the set of robots in that configuration without any additional information. We propose a homography-based framework relying on the homography induced by the multirobot system that gives a desired homography to be used to define the reference target, and a new image-based control law that drives the robots to the desired configuration by imposing a rigidity constraint. This paper extends our previous work, and the main contributions are that the motion constraints on the flying camera are removed, the control law is improved by reducing the number of required steps, the stability of the new control law is proved, and real experiments are provided to validate the proposal.

  4. Data-driven simultaneous fault diagnosis for solid oxide fuel cell system using multi-label pattern identification

    NASA Astrophysics Data System (ADS)

    Li, Shuanghong; Cao, Hongliang; Yang, Yupu

    2018-02-01

Fault diagnosis is a key process for the reliability and safety of solid oxide fuel cell (SOFC) systems. However, it is difficult to rapidly and accurately identify faults in complicated SOFC systems, especially when simultaneous faults appear. In this research, a data-driven Multi-Label (ML) pattern identification approach is proposed to address the simultaneous fault diagnosis of SOFC systems. The framework of the simultaneous-fault diagnosis primarily includes two components: feature extraction and an ML-SVM classifier. The approach can be trained to diagnose simultaneous SOFC faults, such as fuel leakage and air leakage at different positions in the SOFC system, using simple training data sets consisting of single-fault samples only, without requiring simultaneous-fault data. The experimental results show that the proposed framework can diagnose simultaneous SOFC system faults with high accuracy while requiring few training data and a low computational burden. In addition, Fault Inference Tree Analysis (FITA) is employed to identify the correlations among possible faults and their corresponding symptoms at the system component level.
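
    A toy sketch of the ML-SVM idea, with hypothetical feature vectors and fault labels: one binary SVM per fault is trained on single-fault samples only, and a simultaneous fault then appears as several labels predicted at once. The names and data below are illustrative, not the paper's.

    ```python
    import numpy as np
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.preprocessing import MultiLabelBinarizer
    from sklearn.svm import SVC

    # Toy training set: extracted features with single-fault labels only.
    X_train = np.random.rand(40, 8)
    y_train = [('fuel_leak',)] * 20 + [('air_leak',)] * 20

    mlb = MultiLabelBinarizer()
    Y = mlb.fit_transform(y_train)               # multilabel indicator matrix

    # One binary SVM per fault label (one-vs-rest decomposition).
    clf = OneVsRestClassifier(SVC(kernel='rbf')).fit(X_train, Y)

    x_new = np.random.rand(1, 8)                 # toy test sample
    pred = mlb.inverse_transform(clf.predict(x_new))
    # A simultaneous fault would show up as both labels in `pred`.
    ```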

  5. Sign language indexation within the MPEG-7 framework

    NASA Astrophysics Data System (ADS)

    Zaharia, Titus; Preda, Marius; Preteux, Francoise J.

    1999-06-01

In this paper, we address the issue of sign language indexation/recognition. The existing tools, such as on-line Web dictionaries or other educational applications, make exclusive use of textual annotations. However, keyword indexing schemes have strong limitations due to the ambiguity of natural language and to the huge effort needed to manually annotate a large amount of data. In order to overcome these drawbacks, we tackle the sign language indexation issue within the MPEG-7 framework and propose an approach based on the linguistic properties and characteristics of sign language. The method developed introduces the concept of an over-time-stable hand configuration instantiated on natural or synthetic prototypes. The prototypes are indexed by means of a shape descriptor defined as a translation-, rotation- and scale-invariant Hough transform. A very compact representation is obtained by considering the Fourier transform of the Hough coefficients. The approach has been applied to two data sets consisting of 'Letters' and 'Words', respectively. The accuracy and robustness of the results are discussed, and a complete sign language description schema is proposed.

  6. A postprocessing method in the HMC framework for predicting gene function based on biological instrumental data

    NASA Astrophysics Data System (ADS)

    Feng, Shou; Fu, Ping; Zheng, Wenbin

    2018-03-01

Predicting gene function based on biological instrumental data is a complicated and challenging hierarchical multi-label classification (HMC) problem. When local approach methods are used to solve this problem, a method for processing the preliminary results is usually needed. This paper proposes a novel preliminary-results processing method called the nodes interaction method. The nodes interaction method revises the preliminary results and guarantees that the predictions are consistent with the hierarchy constraint. In its first phase, the method exploits label dependency and considers the hierarchical interaction between nodes when making decisions based on a Bayesian network. In the second phase, the method further adjusts the results according to the hierarchy constraint. Implementing the nodes interaction method in the HMC framework also enhances HMC performance for gene function prediction based on the Gene Ontology (GO), whose hierarchy is a directed acyclic graph and therefore more difficult to tackle. The experimental results validate the promising performance of the proposed method compared to state-of-the-art methods on eight benchmark yeast data sets annotated with the GO.
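
    A minimal sketch of the second-phase hierarchy adjustment, under the common "true-path" rule that a node's score may not exceed any parent's score in the DAG; the paper's Bayesian-network first phase is not reproduced, and all names are illustrative.

    ```python
    def enforce_hierarchy(scores, parents, topo_order):
        """Clip each node's score to the minimum of its parents' scores.

        scores:     {node: probability} preliminary predictions
        parents:    {node: [parent nodes]} in the GO-style DAG
        topo_order: nodes sorted so parents precede children
        """
        fixed = dict(scores)
        for node in topo_order:
            for p in parents.get(node, []):
                fixed[node] = min(fixed[node], fixed[p])
        return fixed

    scores = {'root': 0.9, 'a': 0.95, 'b': 0.4, 'leaf': 0.7}
    parents = {'a': ['root'], 'b': ['root'], 'leaf': ['a', 'b']}
    out = enforce_hierarchy(scores, parents, ['root', 'a', 'b', 'leaf'])
    # out['a'] -> 0.9, out['leaf'] -> 0.4: consistent with the hierarchy.
    ```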

  7. Efficient Mining and Detection of Sequential Intrusion Patterns for Network Intrusion Detection Systems

    NASA Astrophysics Data System (ADS)

    Shyu, Mei-Ling; Huang, Zifang; Luo, Hongli

    In recent years, pervasive computing infrastructures have greatly improved the interaction between human and system. As we put more reliance on these computing infrastructures, we also face threats of network intrusion and/or any new forms of undesirable IT-based activities. Hence, network security has become an extremely important issue, which is closely connected with homeland security, business transactions, and people's daily life. Accurate and efficient intrusion detection technologies are required to safeguard the network systems and the critical information transmitted in the network systems. In this chapter, a novel network intrusion detection framework for mining and detecting sequential intrusion patterns is proposed. The proposed framework consists of a Collateral Representative Subspace Projection Modeling (C-RSPM) component for supervised classification, and an inter-transactional association rule mining method based on Layer Divided Modeling (LDM) for temporal pattern analysis. Experiments on the KDD99 data set and the traffic data set generated by a private LAN testbed show promising results with high detection rates, low processing time, and low false alarm rates in mining and detecting sequential intrusion detections.

  8. Resistance vs resilience to Alzheimer disease: Clarifying terminology for preclinical studies.

    PubMed

    Arenaza-Urquijo, Eider M; Vemuri, Prashanthi

    2018-04-10

    Preventing or delaying Alzheimer disease (AD) through lifestyle interventions will come from a better understanding of the mechanistic underpinnings of (1) why a significant proportion of elderly remain cognitively normal with AD pathologies (ADP), i.e., amyloid or tau; and (2) why some elderly individuals do not have significant ADP. In the last decades, concepts such as brain reserve, cognitive reserve, and more recently brain maintenance have been proposed along with more general notions such as (neuro)protection and compensation. It is currently unclear how to effectively apply these concepts in the new field of preclinical AD specifically separating the 2 distinct mechanisms of coping with pathology vs avoiding pathology. We propose a simplistic conceptual framework that builds on existing concepts using the nomenclature of resistance in the context of avoiding pathology, i.e., remaining cognitively normal without significant ADP, and resilience in the context of coping with pathology, i.e., remaining cognitively normal despite significant ADP. In the context of preclinical AD studies, we (1) define these concepts and provide recommendations (and common scenarios) for their use; (2) discuss how to employ this terminology in the context of investigating mechanisms and factors; (3) highlight the complementarity and clarity they provide to existing concepts; and (4) discuss different study designs and methodologies. The application of the proposed framework for framing hypotheses, study design, and interpretation of results and mechanisms can provide a consistent framework and nomenclature for researchers to reach consensus on identifying factors that may prevent ADP or delay the onset of cognitive impairment. © 2018 American Academy of Neurology.

  9. Predictive brain networks for major depression in a semi-multimodal fusion hierarchical feature reduction framework.

    PubMed

    Yang, Jie; Yin, Yingying; Zhang, Zuping; Long, Jun; Dong, Jian; Zhang, Yuqun; Xu, Zhi; Li, Lei; Liu, Jie; Yuan, Yonggui

    2018-02-05

    Major depressive disorder (MDD) is characterized by dysregulation of distributed structural and functional networks. It is now recognized that structural and functional networks are related at multiple temporal scales. The recent emergence of multimodal fusion methods has made it possible to comprehensively and systematically investigate brain networks and thereby provide essential information for influencing disease diagnosis and prognosis. However, such investigations are hampered by the inconsistent dimensionality features between structural and functional networks. Thus, a semi-multimodal fusion hierarchical feature reduction framework is proposed. Feature reduction is a vital procedure in classification that can be used to eliminate irrelevant and redundant information and thereby improve the accuracy of disease diagnosis. Our proposed framework primarily consists of two steps. The first step considers the connection distances in both structural and functional networks between MDD and healthy control (HC) groups. By adding a constraint based on sparsity regularization, the second step fully utilizes the inter-relationship between the two modalities. However, in contrast to conventional multi-modality multi-task methods, the structural networks were considered to play only a subsidiary role in feature reduction and were not included in the following classification. The proposed method achieved a classification accuracy, specificity, sensitivity, and area under the curve of 84.91%, 88.6%, 81.29%, and 0.91, respectively. Moreover, the frontal-limbic system contributed the most to disease diagnosis. Importantly, by taking full advantage of the complementary information from multimodal neuroimaging data, the selected consensus connections may be highly reliable biomarkers of MDD. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. [The debate concerning performance-based financing in Africa South of the Sahara: analysis of the nature].

    PubMed

    Manitu, Serge Mayaka; Meessen, Bruno; Lushimba, Michel Muvudi; Macq, Jean

    2015-01-01

Performance-based financing (PBF) is a strategy designed to link the funding of health services to predetermined results. Payment by an independent strategic purchaser is subject to verification of the effective achievement of health outcomes in terms of quantity and quality. This article investigates the complex tensions observed in relation to PBF and identifies some reasons for disagreement on this approach. This study was essentially qualitative. Interviews were conducted with a panel of experts on PBF, drawing on their ability to reflect on the various arguments and positions concerning this financing mechanism. To enhance our analyses, we proposed a framework based on the main reasons for scientific or political controversies and the factors involved in their emergence. Analysis of the information collected therefore consisted of matching the experts' verbatim reports with the corresponding factors of controversy in our framework. Graphic representations of the differences were also established. Tensions concerning PBF are based on facts (experts' interpretation of PBF), principles and values (around each expert's conceptual framework), balances of power between experts, but also inappropriate behavior in the discussion process. Viewpoints remain isolated, individual experience and an overview are lacking, which can interfere with decision-making and perpetuate the health system reform crisis. Potential solutions to reduce these tensions are proposed. Our study shows that experts have difficulty agreeing on a theoretical priority approach to PBF. A good understanding of the nature of the tensions and an improvement in the quality of dialogue will promote a real dynamic of change and the proposal of an agenda of PBF actions.

  11. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    ERIC Educational Resources Information Center

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  12. Dual tree fractional quaternion wavelet transform for disparity estimation.

    PubMed

    Kumar, Sanoj; Kumar, Sanjeev; Sukavanam, Nagarajan; Raman, Balasubramanian

    2014-03-01

This paper proposes a novel phase-based approach for computing disparity as the optical flow from a given pair of consecutive images. A new dual tree fractional quaternion wavelet transform (FrQWT) is proposed by defining the 2D Fourier spectrum up to a single quadrant. In the proposed FrQWT, each quaternion wavelet consists of a real part (a real DWT wavelet) and three imaginary parts that are organized according to the quaternion algebra. The first two FrQWT phases encode the shifts of image features in the absolute horizontal and vertical coordinate system, while the third phase carries the texture information. The FrQWT allows a multi-scale framework for calculating and adjusting local disparities and executing phase unwrapping from coarse to fine scales with linear computational efficiency. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Methodology of ecooriented assessment of constructive schemes of cast in-situ RC framework in civil engineering

    NASA Astrophysics Data System (ADS)

    Avilova, I. P.; Krutilova, M. O.

    2018-01-01

Economic growth is the main determinant of the trend toward increased greenhouse gas (GHG) emissions. Therefore, the reduction of emissions and the stabilization of GHG levels in the atmosphere have become an urgent task for avoiding the worst predicted consequences of climate change. GHG emissions in the construction industry account for a significant share of industrial GHG emissions and are expected to increase steadily. The problem could be successfully solved with the help of both economic and organizational restrictions, based on enhanced algorithms for calculating and penalizing environmental harm in the building industry. This study aims to quantify the GHG emissions caused by different constructive schemes of cast in-situ RC frameworks in concrete casting. The results show that the proposed methodology enables a comparative analysis of alternative residential housing projects, taking into account the environmental damage caused by the construction process. The study was carried out in the framework of the Program of flagship university development on the basis of Belgorod State Technological University named after V.G. Shoukhov.

  14. Pareto frontier analyses based decision making tool for transportation of hazardous waste.

    PubMed

    Das, Arup; Mazumder, T N; Gupta, A K

    2012-08-15

Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional hazardous waste management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that accounts for the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of the transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology. Copyright © 2012 Elsevier B.V. All rights reserved.
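
    The a posteriori screening step reduces, at its core, to keeping the non-dominated routes; below is a minimal sketch with illustrative (risk, cost) objectives, both minimized. Route names and values are hypothetical.

    ```python
    def non_dominated(solutions):
        """solutions: list of (route_id, objectives), all objectives minimized.
        Returns the Pareto front: routes no other route improves in every
        objective while strictly improving in at least one."""
        front = []
        for rid, obj in solutions:
            dominated = any(
                all(o2 <= o1 for o1, o2 in zip(obj, other))
                and any(o2 < o1 for o1, o2 in zip(obj, other))
                for _, other in solutions)
            if not dominated:
                front.append((rid, obj))
        return front

    routes = [('r1', (3.0, 10.0)),   # (risk, cost), toy values
              ('r2', (2.0, 12.0)),
              ('r3', (4.0, 11.0))]
    print(non_dominated(routes))     # r3 is dominated by r1 and drops out
    ```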

  15. Crystal plasticity modeling of irradiation growth in Zircaloy-2

    NASA Astrophysics Data System (ADS)

    Patra, Anirban; Tomé, Carlos N.; Golubov, Stanislav I.

    2017-08-01

A physically based reaction-diffusion model is implemented in the visco-plastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. The reaction-diffusion model accounts for the defects produced by the cascade of displaced atoms, their diffusion to lattice sinks, and their contribution to crystallographic strain at the level of single crystals. The VPSC framework accounts for intergranular interactions and irradiation creep, and calculates the strain in the polycrystalline ensemble. A novel scheme is proposed to model the simultaneous evolution of both the number density and the radius of irradiation-induced dislocation loops directly from experimental data on dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behaviour of cold-worked Zircaloy-2, and the predicted trends are compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. The effects of grain size, texture and external stress on the coupled irradiation growth and creep behaviour are also studied and compared with available experimental data.

  16. Automatic liver tumor segmentation on computed tomography for patient treatment planning and monitoring

    PubMed Central

    Moghbel, Mehrdad; Mashohor, Syamsiah; Mahmud, Rozi; Saripan, M. Iqbal Bin

    2016-01-01

Segmentation of liver tumors from Computed Tomography (CT) and tumor burden analysis play an important role in the choice of therapeutic strategies for liver diseases and in treatment monitoring. In this paper, a new segmentation method for liver tumors from contrast-enhanced CT imaging is proposed. As manual segmentation of tumors for liver treatment planning is both labor intensive and time-consuming, a highly accurate automatic tumor segmentation is desired. The proposed framework is fully automatic, requiring no user interaction. The proposed segmentation, evaluated on real-world clinical data from patients, is based on a hybrid method integrating cuckoo optimization and the fuzzy c-means algorithm with the random walkers algorithm. The accuracy of the proposed method was validated using a clinical liver dataset containing one of the highest numbers of tumors utilized for liver tumor segmentation, 127 tumors in total, with further validation of the results by a consultant radiologist. The proposed method achieved one of the highest accuracies reported in the literature for liver tumor segmentation, with a mean overlap error of 22.78% and a Dice similarity coefficient of 0.75 on the 3Dircadb dataset, and a mean overlap error of 15.61% and a Dice similarity coefficient of 0.81 on the MIDAS dataset. The proposed method outperformed most other tumor segmentation methods reported in the literature, representing an overlap error improvement of 6% compared to one of the best performing automatic methods in the literature. The proposed framework provided consistently accurate results across the number of tumors and the variations in tumor contrast enhancement and tumor appearance, while the tumor burden was estimated with a mean error of 0.84% on the 3Dircadb dataset. PMID:27540353
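
    The two reported accuracy measures follow their standard definitions; a short sketch with toy binary masks (not the clinical data) is given below.

    ```python
    import numpy as np

    def dice(pred, ref):
        # Dice similarity coefficient: 2|A n B| / (|A| + |B|).
        inter = np.logical_and(pred, ref).sum()
        return 2.0 * inter / (pred.sum() + ref.sum())

    def overlap_error(pred, ref):
        # Volumetric overlap error, in percent: 100 * (1 - |A n B| / |A u B|).
        inter = np.logical_and(pred, ref).sum()
        union = np.logical_or(pred, ref).sum()
        return 100.0 * (1.0 - inter / union)

    pred = np.zeros((10, 10), bool); pred[2:7, 2:7] = True   # toy prediction
    ref  = np.zeros((10, 10), bool); ref[3:8, 3:8] = True    # toy reference
    print(dice(pred, ref), overlap_error(pred, ref))
    ```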

  17. A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors

    PubMed Central

    Cheng, Juan; Chen, Xun; Liu, Aiping; Peng, Hu

    2015-01-01

Sign language recognition (SLR) is an important communication tool between the deaf and the external world. It is highly necessary to develop a worldwide, continuous and large-vocabulary-scale SLR system for practical usage. In this paper, we propose a novel phonology- and radical-coded Chinese SLR framework to demonstrate the feasibility of continuous SLR using accelerometer (ACC) and surface electromyography (sEMG) sensors. The continuous Chinese characters, consisting of coded sign gestures, are first segmented into active segments using EMG signals by means of a moving average algorithm. Then, features of each component are extracted from both ACC and sEMG signals of the active segments (i.e., palm orientation represented by the mean and variance of ACC signals, hand movement represented by the fixed-point ACC sequence, and hand shape represented by both the mean absolute value (MAV) and autoregressive model coefficients (ARs)). Afterwards, palm orientation is first classified, distinguishing “Palm Downward” sign gestures from “Palm Inward” ones. Only the “Palm Inward” gestures are sent for further hand movement and hand shape recognition by the dynamic time warping (DTW) algorithm and hidden Markov models (HMM), respectively. Finally, the component recognition results are integrated to identify one certain coded gesture. Experimental results demonstrate that the proposed SLR framework with a vocabulary scale of 223 characters can achieve an averaged recognition accuracy of 96.01% ± 0.83% for coded gesture recognition tasks and 92.73% ± 1.47% for character recognition tasks. Besides, the results demonstrate that sEMG signals are rather consistent for a given hand shape, independent of hand movements. Hence, the number of training samples will not increase significantly when the vocabulary scale increases, since not only is the number of the proposed coded gestures constant and limited, but the transition movement connecting successive signs also needs no training samples to model, even when the same coded gesture is performed in different characters. This work opens up a possible new way to realize a practical Chinese SLR system. PMID:26389907

  18. A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors.

    PubMed

    Cheng, Juan; Chen, Xun; Liu, Aiping; Peng, Hu

    2015-09-15

Sign language recognition (SLR) is an important communication tool between the deaf and the external world. It is highly necessary to develop a worldwide, continuous and large-vocabulary-scale SLR system for practical usage. In this paper, we propose a novel phonology- and radical-coded Chinese SLR framework to demonstrate the feasibility of continuous SLR using accelerometer (ACC) and surface electromyography (sEMG) sensors. The continuous Chinese characters, consisting of coded sign gestures, are first segmented into active segments using EMG signals by means of a moving average algorithm. Then, features of each component are extracted from both ACC and sEMG signals of the active segments (i.e., palm orientation represented by the mean and variance of ACC signals, hand movement represented by the fixed-point ACC sequence, and hand shape represented by both the mean absolute value (MAV) and autoregressive model coefficients (ARs)). Afterwards, palm orientation is first classified, distinguishing "Palm Downward" sign gestures from "Palm Inward" ones. Only the "Palm Inward" gestures are sent for further hand movement and hand shape recognition by the dynamic time warping (DTW) algorithm and hidden Markov models (HMM), respectively. Finally, the component recognition results are integrated to identify one certain coded gesture. Experimental results demonstrate that the proposed SLR framework with a vocabulary scale of 223 characters can achieve an averaged recognition accuracy of 96.01% ± 0.83% for coded gesture recognition tasks and 92.73% ± 1.47% for character recognition tasks. Besides, the results demonstrate that sEMG signals are rather consistent for a given hand shape, independent of hand movements. Hence, the number of training samples will not increase significantly when the vocabulary scale increases, since not only is the number of the proposed coded gestures constant and limited, but the transition movement connecting successive signs also needs no training samples to model, even when the same coded gesture is performed in different characters. This work opens up a possible new way to realize a practical Chinese SLR system.
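
    Of the component classifiers described above, the dynamic time warping step is the easiest to sketch; below is the textbook DTW recursion on a one-dimensional toy sequence. The paper's fixed-point ACC sequences are multi-dimensional, and its exact DTW variant is not specified here.

    ```python
    import numpy as np

    def dtw(a, b):
        # Classic DTW distance between two 1-D sequences.
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j],      # insertion
                                     D[i, j - 1],      # deletion
                                     D[i - 1, j - 1])  # match
        return D[n, m]

    template = np.sin(np.linspace(0, 3, 50))       # stored sequence for one sign
    query = np.sin(np.linspace(0, 3, 64)) + 0.05   # new, time-warped observation
    print(dtw(template, query))                    # small distance -> same movement
    ```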

  19. Formulating accident occurrence as a survival process.

    PubMed

    Chang, H L; Jovanis, P P

    1990-10-01

    A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
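
    In the survival-theory formulation, exposure enters through the standard hazard-survival relationship; the generic form is shown below, with a proportional-hazards specification given as one common choice rather than the paper's own.

    ```latex
    S(t) \;=\; \Pr(T > t)
    \;=\; \exp\!\left(-\int_{0}^{t} h(u;\mathbf{x})\,\mathrm{d}u\right),
    \qquad
    h(t;\mathbf{x}) \;=\; h_{0}(t)\,\exp\bigl(\boldsymbol{\beta}^{\mathsf{T}}\mathbf{x}\bigr),
    ```

    where \(T\) is the time (or exposure) until an accident, \(h\) the hazard, \(h_{0}\) a baseline hazard, and \(\mathbf{x}\) covariates such as driver or carrier characteristics.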

  20. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
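
    The AHP half of such an analysis can be sketched in a few lines: priority weights are the normalized principal eigenvector of a pairwise-comparison matrix, with a consistency index as a sanity check. The comparison values below are illustrative, not the paper's criteria.

    ```python
    import numpy as np

    # Toy pairwise comparisons of three framework criteria (Saaty scale).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # priority weights, sum to 1

    CI = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
    print(w, CI)   # small CI indicates consistent judgments
    ```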

  1. Some Behavioral and Neurobiological Constraints on Theories of Audiovisual Speech Integration: A Review and Suggestions for New Directions

    PubMed Central

    Altieri, Nicholas; Pisoni, David B.; Townsend, James T.

    2012-01-01

    Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield’s feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration. PMID:21968081

  2. Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis.

    PubMed

    Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F; Mt-Isa, Shahrul; Luo, Sheng

    2018-04-01

Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend producing a periodic benefit-risk evaluation report (PBRER) throughout the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) are combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively, and accounts for the uncertainty of clinical measurements and the imprecise or incomplete preference information of decision makers. We apply our approach to two real examples in a post-hoc way for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and can thus be an important complement to the current structured benefit-risk assessment (sBRA) framework, improving the transparency and consistency of the decision-making process. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

    PubMed

    Altieri, Nicholas; Pisoni, David B; Townsend, James T

    2011-01-01

    Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield's feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration.

  4. Process-to-Panel Modeling and Multiprobe Characterization of Silicon Heterojunction Solar Cell Technology

    NASA Astrophysics Data System (ADS)

    Chavali, Raghu Vamsi Krishna

    The large-scale deployment of PV technology is very sensitive to material and process costs. There are several potential candidates among p-n heterojunction (HJ) solar cells competing for higher efficiencies at lower material and process costs. These systems are, however, generally complex, involve diverse materials, and are not well understood. The direct translation of classical p-n homojunction theory to p-n HJ cells may not always be self-consistent and can therefore lead to misinterpretation of experimental results. Ultimately, this translation may not be useful for modeling and characterization of these solar cells. Hence, there is a strong need to redefine/reinterpret the modeling/characterization methodologies for HJ solar cells to produce a self-consistent framework for optimizing HJ solar cell designs. Towards this goal, we explore the physics and interpret characterization experiments of p-n HJs using Silicon HJ (HIT) solar cells. We will: (1) identify the key HJ properties that affect the cell efficiency; (2) analyze the dependence of key HJ properties on the carrier transport under light and dark conditions; (3) provide a self-consistent multi-probe approach to extract the HJ parameters using several characterization techniques including dark I-V, light I-V, C-V, impedance spectroscopy, and Suns-Voc; (4) propose design guidelines to address the HJ bottlenecks of HIT cells; and (5) develop a process-to-module modeling framework to establish the module performance limits. The guidelines resulting from this multi-scale and self-consistent framework can be used to improve the performance of HIT cells as well as other HJ-based solar cells.

  5. Portable implementation model for CFD simulations. Application to hybrid CPU/GPU supercomputers

    NASA Astrophysics Data System (ADS)

    Oyarzun, Guillermo; Borrell, Ricard; Gorobets, Andrey; Oliva, Assensi

    2017-10-01

    Nowadays, high performance computing (HPC) systems are experiencing a disruptive moment, with a variety of novel architectures and frameworks and no clarity as to which one will prevail. In this context, the portability of codes across different architectures is of major importance. This paper presents a portable implementation model based on an algebraic operational approach for direct numerical simulation (DNS) and large eddy simulation (LES) of incompressible turbulent flows using unstructured hybrid meshes. The proposed strategy consists in representing the whole time-integration algorithm using only three basic algebraic operations: the sparse matrix-vector product (SpMV), linear combinations of vectors, and the dot product. The main idea is based on decomposing the nonlinear operators into a concatenation of two SpMV operations. This provides high modularity and portability. An exhaustive analysis of the proposed implementation for hybrid CPU/GPU supercomputers has been conducted, with tests using up to 128 GPUs. The main objective is to understand the challenges of implementing CFD codes on new architectures.
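
    As an illustration of the three-kernel idea, the toy sketch below advances a 1D diffusion problem using only a sparse matrix-vector product, a vector linear combination (axpy), and a dot product. The 1D operator is my own simplification standing in for the paper's unstructured-mesh operators.

    ```python
    # One explicit time step expressed purely in terms of the three kernels.
    import numpy as np
    import scipy.sparse as sp

    n, dt = 1000, 1e-4
    main = -2.0 * np.ones(n)
    off = np.ones(n - 1)
    L = sp.diags([off, main, off], [-1, 0, 1], format='csr')  # discrete operator

    u = np.sin(np.linspace(0, np.pi, n))
    for _ in range(100):
        Lu = L @ u             # kernel 1: sparse matrix-vector product (SpMV)
        u = u + dt * Lu        # kernel 2: linear combination of vectors (axpy)
    norm = np.sqrt(u @ u)      # kernel 3: dot product (norm, e.g. for monitoring)
    print(norm)
    ```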

  6. Text extraction via an edge-bounded averaging and a parametric character model

    NASA Astrophysics Data System (ADS)

    Fan, Jian

    2003-01-01

    We present a deterministic text extraction algorithm that relies on three basic assumptions: color/luminance uniformity of the interior region, closed boundaries of sharp edges and the consistency of local contrast. The algorithm is basically independent of the character alphabet, text layout, font size and orientation. The heart of this algorithm is an edge-bounded averaging for the classification of smooth regions that enhances robustness against noise without sacrificing boundary accuracy. We have also developed a verification process to clean up the residue of incoherent segmentation. Our framework provides a symmetric treatment for both regular and inverse text. We have proposed three heuristics for identifying the type of text from a cluster consisting of two types of pixel aggregates. Finally, we have demonstrated the advantages of the proposed algorithm over adaptive thresholding and block-based clustering methods in terms of boundary accuracy, segmentation coherency, and capability to identify inverse text and separate characters from background patches.
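
    One plausible reading of the edge-bounded averaging step (a hedged sketch, not the author's code): average pixels only within connected regions bounded by sharp edges, so smoothing never crosses a character boundary. The gradient threshold and the toy image below are illustrative assumptions.

    ```python
    # Smooth each edge-bounded region toward its mean without crossing edges.
    import numpy as np
    from scipy import ndimage

    def edge_bounded_average(img, edge_thresh=0.2):
        g0, g1 = np.gradient(img.astype(float))
        edges = np.hypot(g0, g1) > edge_thresh       # sharp-edge mask
        labels, n = ndimage.label(~edges)            # regions bounded by edges
        means = ndimage.mean(img, labels=labels, index=np.arange(1, n + 1))
        out = img.astype(float).copy()
        for i in range(1, n + 1):                    # replace each region by its mean
            out[labels == i] = means[i - 1]
        return out

    img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0
    img += 0.05 * np.random.default_rng(0).standard_normal(img.shape)
    smoothed = edge_bounded_average(img)
    ```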

  7. Consistent second-order boundary implementations for convection-diffusion lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chew, Jia Wei

    2018-02-01

    In this study, an alternative second-order boundary scheme is proposed under the framework of the convection-diffusion lattice Boltzmann (LB) method for both straight and curved geometries. With the proposed scheme, boundary implementations are developed for the Dirichlet, Neumann and linear Robin conditions in a consistent way. The Chapman-Enskog analysis and the Hermite polynomial expansion technique are first applied to derive the explicit expression for the general distribution function with second-order accuracy. Then, the macroscopic variables involved in the expression for the distribution function are determined by the prescribed macroscopic constraints and the known distribution functions after streaming [see the paragraph after Eq. (29) for a discussion of the "streaming step" in the LB method]. After that, the unknown distribution functions are obtained from the derived macroscopic information at the boundary nodes. For straight boundaries, boundary nodes are directly placed at the physical boundary surface, and the present scheme is applied directly. When extending the present scheme to curved geometries, a local curvilinear coordinate system and first-order Taylor expansion are introduced to relate the macroscopic variables at the boundary nodes to the physical constraints at the curved boundary surface. In essence, the unknown distribution functions at the boundary node are derived from the known distribution functions at the same node in accordance with the macroscopic boundary conditions at the surface. Therefore, the advantages of the present boundary implementations are (i) the locality, i.e., no information from neighboring fluid nodes is required; (ii) the consistency, i.e., the physical boundary constraints are directly applied when determining the macroscopic variables at the boundary nodes, so the three kinds of conditions are realized in a consistent way. It should be noted that the present focus is on two-dimensional cases, and theoretical derivations as well as the numerical validations are performed in the framework of the two-dimensional five-velocity lattice model.
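
    For reference, the linear Robin condition is the unifying boundary form behind the consistency claim; the parametrization below is standard notation rather than the paper's, with a, b, c prescribed coefficients and the normal derivative taken at the boundary:

    ```latex
    \[
      a\,\phi + b\,\frac{\partial \phi}{\partial n} = c
      \qquad\Longrightarrow\qquad
      \begin{cases}
        \phi = c, & (a,b) = (1,0) \ \text{(Dirichlet)}\\
        \partial_n \phi = c, & (a,b) = (0,1) \ \text{(Neumann)}
      \end{cases}
    \]
    ```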

  8. Physical Therapy Residency and Fellowship Education: Reflections on the Past, Present, and Future.

    PubMed

    Furze, Jennifer A; Tichenor, Carol Jo; Fisher, Beth E; Jensen, Gail M; Rapport, Mary Jane

    2016-07-01

    The physical therapy profession continues to respond to the complex and changing landscape of health care to meet the needs of patients and the demands of patient care. Consistent with this evolution is the rapid development and expansion of residency and fellowship postprofessional programs. With the number of interested applicants exceeding the number of residency and fellowship slots available, a "critical period" in the educational process is emerging. The purposes of this perspective article are: (1) to analyze the state of residency and fellowship education within the profession, (2) to identify best practice elements from other health professions that are applicable to physical therapy residency and fellowship education, and (3) to propose a working framework grounded in common domains of competence to be used as a platform for dialogue, consistency, and quality across all residency and fellowship programs. Seven domains of competence are proposed to theoretically ground residency and fellowship programs and facilitate a more consistent approach to curricular development and assessment. Although the recent proliferation of residency and fellowship programs attempts to meet the demand of physical therapists seeking advanced educational opportunities, it is imperative that these programs consistently deliver high-quality education with a common focus on delivering health care in the context of societal needs. © 2016 American Physical Therapy Association.

  9. Saliency detection using mutual consistency-guided spatial cues combination

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Ning, Chen; Xu, Lizhong

    2015-09-01

    Saliency detection has received extensive interest due to its remarkable contribution to a wide range of computer vision and pattern recognition applications. However, most existing computational models are designed for detecting saliency in visible images or videos. When applied to infrared images, they may suffer from limitations in saliency detection accuracy and robustness. In this paper, we propose a novel algorithm to detect visual saliency in infrared images by mutual consistency-guided spatial cues combination. First, based on the luminance contrast and contour characteristics of infrared images, two effective saliency maps, i.e., the luminance contrast saliency map and the contour saliency map, are constructed. Afterwards, an adaptive combination scheme guided by mutual consistency is exploited to integrate these two maps to generate the spatial saliency map. This idea is motivated by the observation that different maps are actually related to each other, and the fusion scheme should present a logically consistent view of these maps. Finally, an enhancement technique is adopted to incorporate spatial saliency maps at various scales into a unified multi-scale framework to improve the reliability of the final saliency map. Comprehensive evaluations on real-life infrared images and comparisons with many state-of-the-art saliency models demonstrate the effectiveness and superiority of the proposed method for saliency detection in infrared images.
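
    The abstract does not spell out the combination rule, so the sketch below is one plausible reading: weight each pixel by the agreement between the two maps, favoring the product where they are mutually consistent and falling back to the average where they disagree. The normalization and weighting scheme are my assumptions, not the authors' exact method.

    ```python
    # Consistency-guided fusion of two saliency maps (hedged sketch).
    import numpy as np

    def fuse(luminance_map, contour_map, eps=1e-8):
        a = (luminance_map - luminance_map.min()) / (np.ptp(luminance_map) + eps)
        b = (contour_map - contour_map.min()) / (np.ptp(contour_map) + eps)
        consistency = 1.0 - np.abs(a - b)          # pixel-wise mutual agreement
        w = consistency / (consistency.max() + eps)
        # Product dominates where the maps agree; average where they disagree.
        return w * (a * b) + (1.0 - w) * 0.5 * (a + b)

    rng = np.random.default_rng(1)
    fused = fuse(rng.random((32, 32)), rng.random((32, 32)))
    ```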

  10. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques, notably Monte Carlo simulations, and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and to allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform the analysis over long-term horizons, as space and time scales are somewhat linked; (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with scale, which results in a higher sensitivity of the results to the analytic framework; (iv) as the perimeter of analysis (time, theme and space-wise) is widened, intrinsic parameters of the project tend to weigh less heavily. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as they relate to the scientific knowledge about the hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at the global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues, consistent with one's objective and risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can range from simple elicitation of data from a subject matter expert to calibrate a probability distribution, to more advanced stochastic modelling. This approach can be described as proficiency in the language of uncertainty rather than modelling per se, in the sense that it allows for greater flexibility to adapt to a given context. In a real decision-making context, one seldom has the time or budget to investigate all of these variables thoroughly, hence the importance of being able to prioritize the level of effort among them. Under the proposed framework, this can be done in an optimised fashion. The point here consists in applying probabilistic sensitivity analysis together with the fundamentals of the economic value of information; the framework as built is well suited to such considerations, and variables can be ranked according to their contribution to risk understanding. Efforts to deal with second-order uncertainties on variables prove to be valuable when dealing with the economic value of sample information.
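
    A minimal sketch of the kind of Monte Carlo economics described, assuming hypothetical Poisson event frequencies, lognormal severities, a 60% loss-reduction effect, and an upfront investment cost; every number is invented for illustration, not taken from the paper.

    ```python
    # Monte Carlo comparison of disaster losses with and without a DRM investment.
    import numpy as np

    rng = np.random.default_rng(42)
    years, draws, rate = 50, 20_000, 0.04
    invest_cost = 8.0  # hypothetical upfront cost of the DRM measure

    def simulated_losses(mitigated):
        n_events = rng.poisson(0.3, size=(draws, years))          # event frequency
        severity = rng.lognormal(mean=1.0, sigma=1.2, size=(draws, years))
        losses = n_events * severity
        return losses * (0.4 if mitigated else 1.0)               # assumed 60% reduction

    disc = (1 + rate) ** -np.arange(years)                        # discount factors
    npv_gain = ((simulated_losses(False) - simulated_losses(True)) * disc).sum(axis=1)
    net = npv_gain - invest_cost
    print(f"P(net benefit > 0) = {(net > 0).mean():.2f}, "
          f"5th/95th percentiles = {np.percentile(net, [5, 95])}")
    ```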

  11. Knowledge, instruction and behavioural change: building a framework for effective eczema education in clinical practice.

    PubMed

    Thompson, Deryn Lee; Thompson, Murray John

    2014-11-01

    A discussion of the reasons why nurse-led educational interventions about eczema are successful, with the subsequent development of a theoretical framework to guide nurses in becoming effective patient educators. Effective child and parent education is the key to successful self-management of eczema. When diagnosed, children and parents should learn to understand the condition through clear explanations, see treatment demonstrations, and have ongoing support to learn the practical skills needed to control eczema. Dermatology nurses provide these services, but no one has proposed a framework of the concepts underpinning their successful eczema educational interventions. A discussion paper. A literature search of online databases was undertaken utilizing the terms 'eczema OR atopic dermatitis', 'education', 'parent', 'nurs*', 'framework', 'knowledge', 'motivation', in Scopus, CINAHL, Web of Science, Medline and PubMed. Limits were English language and 2003-2013. The framework can inform discussion on child and parent education, provide a scaffold for future research and guide non-specialist nurses, internationally, in providing consistent patient education about eczema. Founded on an understanding of knowledge, the framework utilizes essential elements of cognitive psychology and social cognitive theory leading to successful self-management of eczema. This framework may prove useful as a basis for future research in child and parent education, globally, in the healthcare community. A framework has been created to help nurses understand the essential elements of the learning processes at the foundation of effective child and parent education. The framework serves to explain the improved outcomes reported in previous nurse-led eczema educational interventions. © 2014 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  12. 78 FR 70354 - Conceptual Example of a Proposed Risk Management Regulatory Framework Policy Statement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... NUCLEAR REGULATORY COMMISSION [NRC-2013-0254] Conceptual Example of a Proposed Risk Management... issuing a document entitled: ``White Paper on a Conceptual Example of a Proposed Risk Management... ``openness,'' a white paper on a Conceptual Example of a Proposed Risk Management Regulatory Framework (RMRF...

  13. Towards a Cloud Based Smart Traffic Management Framework

    NASA Astrophysics Data System (ADS)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the data import speed of this framework exceeds 8000 records per second when the dataset size is near 5 million records. We also evaluated the performance of data retrieval in the proposed framework; the data retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  14. Interleaved EPI diffusion imaging using SPIRiT-based reconstruction with virtual coil compression.

    PubMed

    Dong, Zijing; Wang, Fuyixue; Ma, Xiaodong; Zhang, Zhe; Dai, Erpeng; Yuan, Chun; Guo, Hua

    2018-03-01

    To develop a novel diffusion imaging reconstruction framework based on iterative self-consistent parallel imaging reconstruction (SPIRiT) for multishot interleaved echo planar imaging (iEPI), with computation acceleration by virtual coil compression. As a general approach for autocalibrating parallel imaging, SPIRiT improves the performance of traditional generalized autocalibrating partially parallel acquisitions (GRAPPA) methods in that the formulation with self-consistency is better conditioned, suggesting SPIRiT to be a better candidate in k-space-based reconstruction. In this study, a general SPIRiT framework is adopted to incorporate both coil sensitivity and phase variation information as virtual coils and then is applied to 2D navigated iEPI diffusion imaging. To reduce the reconstruction time when using a large number of coils and shots, a novel shot-coil compression method is proposed for computation acceleration in Cartesian sampling. Simulations and in vivo experiments were conducted to evaluate the performance of the proposed method. Compared with the conventional coil compression, the shot-coil compression achieved higher compression rates with reduced errors. The simulation and in vivo experiments demonstrate that the SPIRiT-based reconstruction outperformed the existing method, realigned GRAPPA, and provided superior images with reduced artifacts. The SPIRiT-based reconstruction with virtual coil compression is a reliable method for high-resolution iEPI diffusion imaging. Magn Reson Med 79:1525-1531, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  15. A practical salient region feature based 3D multi-modality registration method for medical images

    NASA Astrophysics Data System (ADS)

    Hahn, Dieter A.; Wolz, Gabriele; Sun, Yiyong; Hornegger, Joachim; Sauer, Frank; Kuwert, Torsten; Xu, Chenyang

    2006-03-01

    We present a novel representation of 3D salient region features and its integration into a hybrid rigid-body registration framework. We adopt scale, translation and rotation invariance properties of those intrinsic 3D features to estimate a transform between underlying mono- or multi-modal 3D medical images. Our method combines advantageous aspects of both feature- and intensity-based approaches and consists of three steps: an automatic extraction of a set of 3D salient region features on each image, a robust estimation of correspondences, and their sub-pixel accurate refinement with outlier elimination. We propose a region-growing based approach for the extraction of 3D salient region features, a solution to the problem of feature clustering and a reduction of the correspondence search space complexity. Results of the developed algorithm are presented for both mono- and multi-modal intra-patient 3D image pairs (CT, PET and SPECT) that have been acquired for change detection, tumor localization, and time-based intra-person studies. The accuracy of the method is clinically evaluated by a medical expert with an approach that measures the distance between a set of selected corresponding points consisting of both anatomical and functional structures or lesion sites. This demonstrates the robustness of the proposed method to image overlap, missing information and artefacts. We conclude by discussing potential medical applications and possibilities for integration into a non-rigid registration framework.

  16. Temporal Dynamics Assessment of Spatial Overlap Pattern of Functional Brain Networks Reveals Novel Functional Architecture of Cerebral Cortex.

    PubMed

    Jiang, Xi; Li, Xiang; Lv, Jinglei; Zhao, Shijie; Zhang, Shu; Zhang, Wei; Zhang, Tuo; Han, Junwei; Guo, Lei; Liu, Tianming

    2018-06-01

    Various studies in the brain mapping field have demonstrated that there exist multiple concurrent functional networks that are spatially overlapping and interact with each other during specific task performance to jointly realize the total brain function. Assessing such spatial overlap patterns of functional networks (SOPFNs) based on functional magnetic resonance imaging (fMRI) has thus received increasing interest for brain function studies. However, there are still two crucial issues to be addressed. First, the SOPFNs are assessed over the entire fMRI scan assuming temporal stationarity, while the possibly time-dependent dynamics of the SOPFNs are not sufficiently explored. Second, the SOPFNs are assessed within individual subjects, while the group-wise consistency of the SOPFNs is largely unknown. To address the two issues, we propose a novel computational framework of group-wise sparse representation of whole-brain fMRI temporal segments to assess the temporal dynamic spatial patterns of SOPFNs that are consistent across different subjects. Experimental results based on the recently publicly released Human Connectome Project grayordinate task fMRI data demonstrate that meaningful SOPFNs exhibiting dynamic spatial patterns across different time periods are effectively and robustly identified based on the reconstructed concurrent functional networks via the proposed framework. Specifically, those SOPFNs are located significantly more on gyral regions than on sulcal regions across different time periods. These results reveal novel functional architecture of cortical gyri and sulci, and help toward a better understanding of the functional dynamics mechanisms of the cerebral cortex.

  17. Model-Based Policymaking: A Framework to Promote Ethical "Good Practice" in Mathematical Modeling for Public Health Policymaking.

    PubMed

    Boden, Lisa A; McKendrick, Iain J

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical "good practice" and are thus "fit for purpose" as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science-policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy.

  18. A smoothed particle hydrodynamics framework for modelling multiphase interactions at meso-scale

    NASA Astrophysics Data System (ADS)

    Li, Ling; Shen, Luming; Nguyen, Giang D.; El-Zein, Abbas; Maggi, Federico

    2018-01-01

    A smoothed particle hydrodynamics (SPH) framework is developed for modelling multiphase interactions at meso-scale, including the liquid-solid interaction induced deformation of the solid phase. With an inter-particle force formulation that mimics the inter-atomic force in molecular dynamics, the proposed framework includes the long-range attractions between particles, and more importantly, the short-range repulsive forces to avoid particle clustering and instability problems. Three-dimensional numerical studies have been conducted to demonstrate the capabilities of the proposed framework to quantitatively replicate the surface tension of water, to model the interactions between immiscible liquids and solid, and more importantly, to simultaneously model the deformation of solid and liquid induced by the multiphase interaction. By varying inter-particle potential magnitude, the proposed SPH framework has successfully simulated various wetting properties ranging from hydrophobic to hydrophilic surfaces. The simulation results demonstrate the potential of the proposed framework to genuinely study complex multiphase interactions in wet granular media.
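
    As a concrete illustration of the long-range-attraction/short-range-repulsion formulation, here is a hedged sketch of a molecular-dynamics-style pair force. The power-law form and the constants A, B, r0 are illustrative assumptions, not the paper's calibrated potential.

    ```python
    # Pair force with short-range repulsion and longer-range attraction.
    import numpy as np

    def pair_force(r, A=1.0, B=1.0, r0=1.0):
        """Force magnitude between two SPH particles at distance r (positive = repulsion).

        The (r0/r)**4 term dominates at short range and prevents particle
        clustering; the (r0/r)**2 term provides cohesion (surface-tension-like
        attraction) at longer range.
        """
        return A * (r0 / r) ** 4 - B * (r0 / r) ** 2

    r = np.linspace(0.5, 3.0, 200)
    f = pair_force(r)
    # Equilibrium spacing is where the force changes sign (here r = r0 = 1).
    print(r[np.argmin(np.abs(f))])
    ```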

  19. Evolutionary game based control for biological systems with applications in drug delivery.

    PubMed

    Li, Xiaobo; Lenaghan, Scott C; Zhang, Mingjun

    2013-06-07

    Control engineering and analysis of biological systems have become increasingly important for systems and synthetic biology. Unfortunately, no widely accepted control framework is currently available for these systems, especially at the cell and molecular levels. This is partially due to the lack of appropriate mathematical models to describe the unique dynamics of biological systems, and the lack of implementation techniques, such as ultra-fast and ultra-small devices and corresponding control algorithms. This paper proposes a control framework for biological systems subject to dynamics that exhibit adaptive behavior under evolutionary pressures. The control framework was formulated based on evolutionary game based modeling, which integrates both the internal dynamics and the population dynamics. In the proposed control framework, the adaptive behavior was characterized as an internal dynamic, and the external environment was regarded as an external control input. The proposed open-interface control framework can be integrated with additional control algorithms for control of biological systems. To demonstrate the effectiveness of the proposed framework, an optimal control strategy was developed and validated for drug delivery using the pathogen Giardia lamblia as a test case. In principle, the proposed control framework can be applied to any biological system exhibiting adaptive behavior under evolutionary pressures. Copyright © 2013 Elsevier Ltd. All rights reserved.
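
    A minimal sketch of the modeling idea, with replicator dynamics standing in for the population dynamics and a pulsed drug dose acting as the external control input; the two-strategy game, payoffs, and dosing schedule are hypothetical, and the drug-delivery reading merely echoes the test case named in the abstract.

    ```python
    # Replicator dynamics with an environmental control input u (drug dose).
    import numpy as np

    def replicator_step(x, payoff, dt=0.01):
        f = payoff @ x                      # fitness of each strategy
        avg = x @ f                         # population-average fitness
        return x + dt * x * (f - avg)       # replicator update (keeps sum(x) = 1)

    x = np.array([0.9, 0.1])                # fractions: drug-susceptible, resistant
    for t in range(5000):
        u = 1.0 if (t // 1000) % 2 == 0 else 0.0   # pulsed dose as control input
        payoff = np.array([[1.0 - u, 1.0 - u],     # drug suppresses susceptible type
                           [0.6, 0.6]])            # resistant type pays a fitness cost
        x = replicator_step(x, payoff)
    print(x)
    ```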

  20. Cancer drug development and the evolving regulatory framework for companion diagnostics in the European union.

    PubMed

    Pignatti, Francesco; Ehmann, Falk; Hemmings, Robert; Jonsson, Bertil; Nuebling, Micha; Papaluca-Amati, Marisa; Posch, Martin; Rasi, Guido

    2014-03-15

    The European Union (EU) legal framework for medical device regulation is currently under revision. The European Commission has proposed a new framework to ensure that medical devices serve the needs and ensure the safety of European citizens, aiming for a framework that is fit for purpose, more transparent, and better adapted to scientific and technological progress. The proposed new framework is described as an evolution of the current regime keeping the same legal approach. An important proposed change is that companion diagnostics will no longer be considered as low risk and subject to self-certification by the manufacturer. According to the new proposal, companion diagnostics will be classified as high individual risk or moderate public health risk (category C) and require conformity assessment by a notified body. It has also been proposed that evidence of the clinical utility of the device for the intended purpose should be required for companion diagnostics. In this article, we review the EU legal framework relevant for companion diagnostics, describe the proposed changes, and summarize the available scientific guidance from the European Medicines Agency and its regulatory experience with cancer drug development including companion diagnostics. See all articles in this CCR Focus section, "The Precision Medicine Conundrum: Approaches to Companion Diagnostic Co-development." ©2014 AACR.

  1. Utilizing the National Research Council's (NRC) Conceptual Framework for the Next Generation Science Standards (NGSS): A Self-Study in My Science, Engineering, and Mathematics Classroom

    NASA Astrophysics Data System (ADS)

    Corvo, Arthur Francis

    Given the reality that active and competitive participation in the 21st century requires American students to deepen their scientific and mathematical knowledge base, the National Research Council (NRC) proposed a new conceptual framework for K--12 science education. The framework consists of an integration of what the NRC report refers to as the three dimensions: scientific and engineering practices, crosscutting concepts, and core ideas in four disciplinary areas (physical, life, and earth/space sciences, and engineering/technology). The Next Generation Science Standards (NGSS), which are derived from this new framework, were released in April 2013 and have implications for teacher learning and development in Science, Technology, Engineering, and Mathematics (STEM). Given the NGSS's recent introduction, there is little research on how teachers can prepare for its release. To meet this research need, I implemented a self-study aimed at examining my teaching practices and classroom outcomes through the lens of the NRC's conceptual framework and the NGSS. The self-study employed design-based research (DBR) methods to investigate what happened in my secondary classroom when I designed, enacted, and reflected on units of study for my science, engineering, and mathematics classes. I utilized various best practices including Learning for Use (LfU) and Understanding by Design (UbD) models for instructional design, talk moves as a tool for promoting discourse, and modeling instruction for these designed units of study. The DBR strategy was chosen to promote reflective cycles, which are consistent with and in support of the self-study framework. A multiple case, mixed-methods approach was used for data collection and analysis. The findings in the study are reported by study phase in terms of unit planning, unit enactment, and unit reflection. The findings have implications for science teaching, teacher professional development, and teacher education.

  2. Expanding on Successful Concepts, Models, and Organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    In her letter to the editor1 regarding our recent Feature Article “Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework” 2, Dr. von Göetz expressed several concerns about terminology, and the perception that we propose the replacement of successful approaches and models for exposure assessment with a concept. We are glad to have the opportunity to address these issues here. If the goal of the AEP framework was to replace existing exposure models or databases for organizing exposure data with a concept, we would share Dr. von Göetz concerns. Instead, the outcome we promote is broader use of an organizational framework for exposure science. The framework would support improved generation, organization, and interpretation of data as well as modeling and prediction, not replacement of models. The field of toxicology has seen the benefits of wide use of one or more organizational frameworks (e.g., mode and mechanism of action, adverse outcome pathway). These frameworks influence how experiments are designed, data are collected, curated, stored and interpreted and ultimately how data are used in risk assessment. Exposure science is poised to similarly benefit from broader use of a parallel organizational framework, which Dr. von Göetz correctly points out, is currently used in the exposure modeling community. In our view, the concepts used so effectively in the exposure modeling community, expanded upon in the AEP framework, could see wider adoption by the field as a whole. The value of such a framework was recognized by the National Academy of Sciences.3 Replacement of models, databases, or any application with the AEP framework was not proposed in our article. The positive role broader more consistent use of such a framework might have in enabling and advancing “general activities such as data acquisition, organization…,” and exposure modeling was discussed in some detail. Like Dr. von Göetz, we recognized the challenges associated with acceptance of the terminology, definitions, and structure proposed in the paper. To address these challenges, an expert workshop was held in May, 2016 to consider and revise the “basic elements” outlined in the paper. The attendees produced revisions to the terminology (e.g., key events) that align with terminology currently in use in the field. We were also careful in our paper to acknowledge a point raised by Dr. von Göetz, that the term AEP implies aggregation, providing these clarifications: “The simplest form of an AEP represents a single source and a single pathway and may more commonly be referred to as an exposure pathway,”; and “An aggregate exposure pathway may represent multiple sources and transfer through single pathways to the TSE, single sources and transfer through multiple pathways to the target site exposure (TSE), or any combination of these.” These clarifications address the concern that the AEP term is not accurate or logical, and further expands upon the word “aggregate” in a broader context. Our use of AEP is consistent with the definition for “aggregate exposure”, which refers to the combined exposures to a single chemical across multiple routes and pathways.3 The AEP framework embraces existing methods for collection, prediction, organization, and interpretation of human and ecological exposure data cited by Dr. von Göetz.
We remain hopeful that wider recognition and use of an organizing concept for exposure information across the exposure science, toxicology and epidemiology communities advances the development of the kind of infrastructure and models Dr. von Göetz discusses. This outcome would be a step forward, rather than a step backward.

  3. A Framework for Enhancing Real-time Social Media Data to Improve Disaster Management Process

    NASA Astrophysics Data System (ADS)

    Attique Shah, Syed; Zafer Şeker, Dursun; Demirel, Hande

    2018-05-01

    Social media datasets play a vital role in providing information that can support decision making in nearly all domains of technology, because social media is a quick and economical approach for collecting data from the public through methods like crowdsourcing. Existing research has already shown that, in case of any disaster (natural or man-made), the information extracted from social media sites is critical for disaster management systems during response and reconstruction. This study comprises two components: the first proposes a framework that provides updated and filtered real-time input data for the disaster management system through social media, and the second consists of a web user API designed for a structured and defined real-time data input process. This study contributes to the discipline of design science for the information systems domain. The aim of this study is to propose a framework that can filter and organize data from unstructured social media sources through recognized methods, and to bring this retrieved data to the same level as data taken through the structured and predefined mechanism of a web API. Both components are designed such that they can collaborate and produce updated information for a disaster management system to carry out an accurate and effective response.

  4. Comparative-effectiveness research to aid population decision making by relating clinical outcomes and quality-adjusted life years.

    PubMed

    Campbell, Jonathan D; Zerzan, Judy; Garrison, Louis P; Libby, Anne M

    2013-04-01

    Comparative-effectiveness research (CER) at the population level is missing standardized approaches to quantify and weigh interventions in terms of their clinical risks, benefits, and uncertainty. We proposed an adapted CER framework for population decision making, provided example displays of the outputs, and discussed the implications for population decision makers. Building on decision-analytical modeling but excluding cost, we proposed a 2-step approach to CER that explicitly compared interventions in terms of clinical risks and benefits and linked this evidence to the quality-adjusted life year (QALY). The first step was a traditional intervention-specific evidence synthesis of risks and benefits. The second step was a decision-analytical model to simulate intervention-specific progression of disease over an appropriate time. The output was the ability to compare and quantitatively link clinical outcomes with QALYs. The outputs from these CER models include clinical risks, benefits, and QALYs over flexible and relevant time horizons. This approach yields an explicit, structured, and consistent quantitative framework to weigh all relevant clinical measures. Population decision makers can use this modeling framework and QALYs to aid in their judgment of the individual and collective risks and benefits of the alternatives over time. Future research should study effective communication of these domains for stakeholders. Copyright © 2013 Elsevier HS Journals, Inc. All rights reserved.
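
    The second step (a decision-analytical model linking synthesized clinical evidence to QALYs) can be illustrated with a small Markov cohort model. The states, transition probabilities, utilities, and discount rate below are hypothetical placeholders, not values from the paper.

    ```python
    # Markov cohort model turning risk/benefit evidence into discounted QALYs.
    import numpy as np

    utilities = np.array([1.0, 0.6, 0.0])           # well, sick, dead

    def qalys(p_progress, p_die, cycles=40, disc=0.03):
        P = np.array([[1 - p_progress, p_progress, 0.0],
                      [0.0, 1 - p_die, p_die],
                      [0.0, 0.0, 1.0]])             # yearly transition matrix
        cohort = np.array([1.0, 0.0, 0.0])          # everyone starts well
        total = 0.0
        for t in range(cycles):                     # one cycle = one year
            total += (cohort @ utilities) / (1 + disc) ** t
            cohort = cohort @ P
        return total

    # Intervention lowers progression risk relative to the comparator:
    print(qalys(0.05, 0.10) - qalys(0.10, 0.10))    # incremental QALYs
    ```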

  5. Recommendations on chemicals management policy and legislation in the framework of the Egyptian-German twinning project on hazardous substances and waste management.

    PubMed

    Wagner, Burkhard O; Aziz, Elham Refaat Abdel; Schwetje, Anja; Shouk, Fatma Abou; Koch-Jugl, Juliane; Braedt, Michael; Choudhury, Keya; Weber, Roland

    2013-04-01

    The sustainable management of chemicals and their associated wastes, especially legacy stockpiles, is always challenging. Developing countries face particular difficulties as they often have insufficient treatment and disposal capacity, have limited resources and many lack an appropriate and effective regulatory framework. This paper describes the objectives and the approach of the Egyptian-German Twinning Project under the European Neighbourhood Policy to improve the strategy of managing hazardous substances in the Egyptian Environmental Affairs Agency (EEAA) between November 2008 and May 2011. It also provides an introduction to the Republic of Egypt's legal and administrative system regarding chemical controls. Subsequently, options for a new chemical management strategy consistent with the recommendations of the United Nations Chemicals Conventions are proposed. The Egyptian legal and administrative system is discussed in relation to the United Nations' recommendations and current European Union legislation for the sound management of chemicals. We also discuss a strategy for the EEAA to use the existing Egyptian legal system to implement the United Nations' Globally Harmonized System of Classification and Labelling of Chemicals, the Stockholm Convention and other proposed regulatory frameworks. The analysis, the results, and the recommendations presented may be useful for other developing countries in a comparable position to Egypt aspiring to update their legislation and administration to the international standards of sound management of chemicals.

  6. A service-based framework for pharmacogenomics data integration

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Bai, Xiaoying; Li, Jing; Ding, Cong

    2010-08-01

    Data are central to scientific research and practices. The advance of experimental methods and information retrieval technologies has led to explosive growth of scientific data and databases. However, due to heterogeneity problems in data formats, structures and semantics, it is hard to integrate the diversified data that grow explosively and analyse them comprehensively. As more and more public databases become accessible through standard protocols like programmable interfaces and Web portals, Web-based data integration has become a major trend in managing and synthesising data that are stored in distributed locations. Mashup, a Web 2.0 technique, presents a new way to compose content and software from multiple resources. The paper proposes a layered framework for integrating pharmacogenomics data in a service-oriented approach using mashup technology. The framework separates the integration concerns from three perspectives, including data, process and Web-based user interface. Each layer encapsulates the heterogeneity issues of one aspect. To facilitate the mapping and convergence of data, an ontology mechanism is introduced to provide consistent conceptual models across different databases and experiment platforms. To support user-interactive and iterative service orchestration, a context model is defined to capture information about users, tasks and services, which can be used for service selection and recommendation during a dynamic service composition process. A prototype system is implemented and case studies are presented to illustrate the promising capabilities of the proposed approach.

  7. A Novel Design Framework for Structures/Materials with Enhanced Mechanical Performance

    PubMed Central

    Liu, Jie; Fan, Xiaonan; Wen, Guilin; Qing, Qixiang; Wang, Hongxin; Zhao, Gang

    2018-01-01

    Structure/material requires simultaneous consideration of both its design and manufacturing processes to dramatically enhance its manufacturability, assembly and maintainability. In this work, a novel design framework for structural/material with a desired mechanical performance and compelling topological design properties achieved using origami techniques is presented. The framework comprises four procedures, including topological design, unfold, reduction manufacturing, and fold. The topological design method, i.e., the solid isotropic material penalization (SIMP) method, serves to optimize the structure in order to achieve the preferred mechanical characteristics, and the origami technique is exploited to allow the structure to be rapidly and easily fabricated. Topological design and unfold procedures can be conveniently completed in a computer; then, reduction manufacturing, i.e., cutting, is performed to remove materials from the unfolded flat plate; the final structure is obtained by folding out the plate from the previous procedure. A series of cantilevers, consisting of origami parallel creases and Miura-ori (usually regarded as a metamaterial) and made of paperboard, are designed with the least weight and the required stiffness by using the proposed framework. The findings here furnish an alternative design framework for engineering structures that could be better than the 3D-printing technique, especially for large structures made of thin metal materials. PMID:29642555

  8. Diversity training for the community aged care workers: A conceptual framework for evaluation.

    PubMed

    Appannah, Arti; Meyer, Claudia; Ogrin, Rajna; McMillan, Sally; Barrett, Elizabeth; Browning, Colette

    2017-08-01

    Older Australians are an increasingly diverse population, with variable characteristics such as culture, sexual orientation, socioeconomic status, and physical capabilities potentially influencing their participation in healthcare. In response, community aged care workers may need to increase their skills and translate knowledge about diversity into practice through appropriate training interventions. Diversity training (DT) programs have traditionally existed in the realm of business, with little research attention devoted to scientifically evaluating the outcomes of training directed at community aged care workers. A DT workshop has been developed for community aged care workers, and this paper focuses on the construction of a formative evaluative framework for the workshop. Key evaluation concepts and measures relating to DT have been identified in the literature and integrated into the framework, focusing on five categories: training needs analysis, reactions, learning outcomes, behavioural outcomes, and results. The use of a mixed-methods approach in the framework provides an additional strength by evaluating long-term behavioural change and improvements in service delivery. As little is known about the effectiveness of DT programs for community aged care workers, the proposed framework will provide an empirical and consistent method of evaluation to assess their impact on enhancing older people's experience of healthcare. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A Novel Design Framework for Structures/Materials with Enhanced Mechanical Performance.

    PubMed

    Liu, Jie; Fan, Xiaonan; Wen, Guilin; Qing, Qixiang; Wang, Hongxin; Zhao, Gang

    2018-04-09

    Structure/material requires simultaneous consideration of both its design and manufacturing processes to dramatically enhance its manufacturability, assembly and maintainability. In this work, a novel design framework for structural/material with a desired mechanical performance and compelling topological design properties achieved using origami techniques is presented. The framework comprises four procedures, including topological design, unfold, reduction manufacturing, and fold. The topological design method, i.e., the solid isotropic material penalization (SIMP) method, serves to optimize the structure in order to achieve the preferred mechanical characteristics, and the origami technique is exploited to allow the structure to be rapidly and easily fabricated. Topological design and unfold procedures can be conveniently completed in a computer; then, reduction manufacturing, i.e., cutting, is performed to remove materials from the unfolded flat plate; the final structure is obtained by folding out the plate from the previous procedure. A series of cantilevers, consisting of origami parallel creases and Miura-ori (usually regarded as a metamaterial) and made of paperboard, are designed with the least weight and the required stiffness by using the proposed framework. The findings here furnish an alternative design framework for engineering structures that could be better than the 3D-printing technique, especially for large structures made of thin metal materials.

  10. The Development of a Telemedicine Planning Framework Based on Needs Assessment.

    PubMed

    AlDossary, Sharifah; Martin-Khan, Melinda G; Bradford, Natalie K; Armfield, Nigel R; Smith, Anthony C

    2017-05-01

    Providing equitable access to healthcare services in rural and remote communities is an ongoing challenge that faces most governments. By increasing access to specialty expertise, telemedicine may be a potential solution to this problem. Despite its potential, many telemedicine initiatives do not progress beyond the research phase, and are not implemented into mainstream practice. One reason may be that some telemedicine services are developed without the appropriate planning to ascertain community needs and clinical requirements. The aim of this paper is to report the development of a planning framework for telemedicine services based on needs assessment. The presented framework is based on the key processes in needs assessment, Penchansky and Thomas's dimensions of access, and Bradshaw's types of need. The proposed planning framework consists of two phases. Phase one comprises data collection and needs assessment, and includes assessment of availability and expressed needs, accessibility, perception, and affordability. Phase two involves prioritising the demand for health services, balanced against the known limitations of supply, and the implementation of an appropriate telemedicine service that reflects and meets the needs of the community. Using a structured framework for the planning of telemedicine services, based on needs assessment, may help with the identification and prioritisation of community health needs.

  11. Heat-Passing Framework for Robust Interpretation of Data in Networks

    PubMed Central

    Fang, Yi; Sun, Mengtian; Ramani, Karthik

    2015-01-01

    Researchers are regularly interested in interpreting the multipartite structure of data entities according to their functional relationships. Data is often heterogeneous with intricately hidden inner structure. With limited prior knowledge, researchers are likely to confront the problem of transforming this data into knowledge. We develop a new framework, called heat-passing, which exploits intrinsic similarity relationships within noisy and incomplete raw data, and constructs a meaningful map of the data. The proposed framework is able to rank, cluster, and visualize the data all at once. The novelty of this framework is derived from an analogy between the process of data interpretation and that of heat transfer, in which all data points contribute simultaneously and globally to reveal intrinsic similarities between regions of data, meaningful coordinates for embedding the data, and exemplar data points that lie at optimal positions for heat transfer. We demonstrate the effectiveness of the heat-passing framework for robustly partitioning the complex networks, analyzing the globin family of proteins and determining conformational states of macromolecules in the presence of high levels of noise. The results indicate that the methodology is able to reveal functionally consistent relationships in a robust fashion with no reference to prior knowledge. The heat-passing framework is very general and has the potential for applications to a broad range of research fields, for example, biological networks, social networks and semantic analysis of documents. PMID:25668316
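
    A toy sketch of the heat analogy, assuming the core ingredient is diffusion on a similarity graph (my minimal reading, not the authors' full ranking/clustering/visualization pipeline); the 5-node adjacency matrix and diffusion time are invented for illustration.

    ```python
    # Heat kernel on a similarity graph as an intrinsic-similarity measure.
    import numpy as np
    from scipy.linalg import expm

    # Symmetric adjacency (similarity) matrix for a tiny 5-point data set.
    W = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    L = np.diag(W.sum(axis=1)) - W         # graph Laplacian
    H = expm(-0.5 * L)                     # heat kernel at diffusion time t = 0.5
    # H[i, j] is the heat received at j from a unit source at i: an intrinsic
    # similarity usable for ranking, clustering, and embedding the data.
    print(np.round(H, 3))
    ```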

  12. A range/depth modulation transfer function (RMTF) framework for characterizing 3D imaging LADAR performance

    NASA Astrophysics Data System (ADS)

    Staple, Bevan; Earhart, R. P.; Slaymaker, Philip A.; Drouillard, Thomas F., II; Mahony, Thomas

    2005-05-01

    3D imaging LADARs have emerged as the key technology for producing high-resolution imagery of targets in 3-dimensions (X and Y spatial, and Z in the range/depth dimension). Ball Aerospace & Technologies Corp. continues to make significant investments in this technology to enable critical NASA, Department of Defense, and national security missions. As a consequence of rapid technology developments, two issues have emerged that need resolution. First, the terminology used to rate LADAR performance (e.g., range resolution) is inconsistently defined, is improperly used, and thus has become misleading. Second, the terminology does not include a metric of the system's ability to resolve the 3D depth features of targets. These two issues create confusion when translating customer requirements into hardware. This paper presents a candidate framework for addressing these issues. To address the consistency issue, the framework utilizes only those terminologies proposed and tested by leading LADAR research and standards institutions. We also provide suggestions for strengthening these definitions by linking them to the well-known Rayleigh criterion extended into the range dimension. To address the inadequate 3D image quality metrics, the framework introduces the concept of a Range/Depth Modulation Transfer Function (RMTF). The RMTF measures the impact of the spatial frequencies of a 3D target on its measured modulation in range/depth. It is determined using a new, Range-Based, Slanted Knife-Edge test. We present simulated results for two LADAR pulse detection techniques and compare them to a baseline centroid technique. Consistency in terminology plus a 3D image quality metric enable improved system standardization.
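
    For intuition, here is the classic knife-edge computation that the range-based test extends: differentiate an edge-spread function to obtain the line-spread function, then take its normalized Fourier magnitude as the MTF. The synthetic tanh range edge and the units below are illustrative, not Ball's test data.

    ```python
    # Knife-edge MTF: ESF -> LSF (derivative) -> MTF (Fourier magnitude).
    import numpy as np

    x = np.linspace(-5, 5, 512)                 # range axis (arbitrary units)
    esf = 0.5 * (1 + np.tanh(x / 0.8))          # blurred depth step (edge spread)
    lsf = np.gradient(esf, x)                   # line-spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                               # normalize to unity at DC
    freqs = np.fft.rfftfreq(lsf.size, d=x[1] - x[0])
    # First frequency where modulation falls below 50%:
    print(freqs[np.argmax(mtf < 0.5)])
    ```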

  13. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    PubMed

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

    The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved the way for distance education across the globe. This article analyzes the policy documentation of a major distance education initiative in Pakistan, assessing its validity and the utility of its policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of the distance learning policy framework was made. For this purpose, two documents were accessed, titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against criteria set by Gellman-Danley and Fetzner and by Berge. The distance education initiative in Pakistan is promising but needs to be assessed regularly. This study has made an initial attempt to assess the policy documents against criteria identified from the literature. The analysis shows that the current policy documents do offer some strengths at this initial level; however, they cannot be considered a comprehensive policy guide. The inclusion or correction of the missing or vague areas identified in this study would make this policy guideline document a valuable tool for the Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study, in the light of two different policy framework measures, highlight certain opportunities that can help strengthen distance education policies. The criteria and findings are useful for reviewers of policy proposals to identify gaps where policy documents can be improved to bring about the desired outcomes. © The Author(s) 2015.

  14. FRAMEWORK FOR STRUCTURAL ONLINE HEALTH MONITORING OF AGING AND DEGRADATION OF SECONDARY PIPING SYSTEMS DUE TO SOME ASPECTS OF EROSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gribok, Andrei V.; Agarwal, Vivek

    This paper describes the current state of research related to critical aspects of erosion and selected aspects of degradation of secondary components in nuclear power plants (NPPs). The paper also proposes a framework for online health monitoring of aging and degradation of secondary components. The framework consists of an integrated multi-sensor modality system, which can be used to monitor different piping configurations under different degradation conditions. The paper analyzes the currently known degradation mechanisms and available predictive models. Based on this analysis, the structural health monitoring framework is proposed. The Light Water Reactor Sustainability Program began to evaluate technologies that could be used to perform online monitoring of piping and other secondary system structural components in commercial NPPs. These online monitoring systems have the potential to identify when a more detailed inspection is needed using real-time measurements, rather than at a pre-determined inspection interval. This transition to condition-based, risk-informed automated maintenance will contribute to a significant reduction of operations and maintenance costs, which account for the majority of nuclear power generation costs. Furthermore, of the operations and maintenance costs in U.S. plants, approximately 80% are labor costs. To address the issue of rising operating costs and economic viability, in 2017, companies that operate the national nuclear energy fleet started the Delivering the Nuclear Promise Initiative, a 3-year program aimed at maintaining operational focus, increasing value, and improving efficiency. There is unanimous agreement among industry experts and academic researchers that identifying and prioritizing inspection locations in secondary piping systems (for example, in raw water piping or diesel piping) would eliminate many excessive in-service inspections. The proposed structural health monitoring framework aims to answer this challenge by combining long-range guided wave technologies with other monitoring techniques, which can significantly increase the inspection length and pinpoint the locations that have degraded the most. More broadly, the paper suggests research efforts aimed at developing, validating, and deploying online corrosion monitoring techniques for complex geometries, which are pervasive in NPPs.

  15. Developing a Shuffled Complex-Self Adaptive Hybrid Evolution (SC-SAHEL) Framework for Water Resources Management and Water-Energy System Optimization

    NASA Astrophysics Data System (ADS)

    Rahnamay Naeini, M.; Sadegh, M.; AghaKouchak, A.; Hsu, K. L.; Sorooshian, S.; Yang, T.

    2017-12-01

    Meta-heuristic optimization algorithms have gained a great deal of attention in a wide variety of fields. The simplicity and flexibility of these algorithms, along with their robustness, make them attractive tools for solving optimization problems. Different optimization methods, however, hold algorithm-specific strengths and limitations. The performance of each individual algorithm is subject to the "No-Free-Lunch" theorem: no single algorithm can consistently outperform all others across the full variety of optimization problems. From a user's perspective, it is a tedious process to compare, validate, and select the best-performing algorithm for a specific problem or a set of test cases. In this study, we introduce a new hybrid optimization framework, entitled Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL), which combines the strengths of different evolutionary algorithms (EAs) in a parallel computing scheme and allows users to select the most suitable algorithm tailored to the problem at hand. The concept of SC-SAHEL is to execute different EAs as separate parallel search cores and let all participating EAs compete during the course of the search. The newly developed SC-SAHEL algorithm is designed to automatically select the best-performing algorithm for the given optimization problem. It is effective in finding the global optimum for several strenuous benchmark test functions, and computationally efficient compared to individual EAs. We benchmark the proposed SC-SAHEL algorithm over 29 conceptual test functions and two real-world case studies - one hydropower reservoir model and one hydrological model (SAC-SMA). Results show that the proposed framework outperforms individual EAs in an absolute majority of the test problems, and provides results competitive with the fittest EA together with more comprehensive information during the search. The proposed framework is also flexible enough to incorporate additional EAs, boundary-handling techniques, and sampling schemes, and has good potential to be used in the optimal operation and management of water-energy systems.
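
    To make the competition mechanic concrete, here is a minimal sketch of complexes of a shared population being searched by two simple cores (a differential-evolution-style mutation and a CCE-style reflection), with improvement credited to each core. The operators, complex sizes, and sphere objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sphere(x):
    """Benchmark objective with global minimum 0 at the origin."""
    return np.sum(x ** 2, axis=-1)

def de_core(cx, f, rng):
    """Differential-evolution-style core: rand/1 mutation with greedy selection."""
    n = len(cx)
    for i in range(n):
        a, b, c = cx[rng.choice(n, 3, replace=False)]
        trial = a + 0.7 * (b - c)
        if f(trial) < f(cx[i]):
            cx[i] = trial
    return cx

def reflect_core(cx, f, rng):
    """CCE-style core: reflect the worst member through the centroid of the rest."""
    order = np.argsort(f(cx))
    centroid = cx[order[:-1]].mean(axis=0)
    trial = 2.0 * centroid - cx[order[-1]]
    if f(trial) < f(cx[order[-1]]):
        cx[order[-1]] = trial
    return cx

def sc_sahel_sketch(f, dim=5, n_complexes=4, size=10, cycles=30, seed=0):
    rng = np.random.default_rng(seed)
    cores = [de_core, reflect_core]
    gains = np.zeros(len(cores))                  # improvement credited per core
    pop = rng.uniform(-5.0, 5.0, (n_complexes * size, dim))
    for _ in range(cycles):
        pop = pop[np.argsort(f(pop))]             # rank, then deal into complexes
        complexes = [pop[i::n_complexes].copy() for i in range(n_complexes)]
        for i, cx in enumerate(complexes):
            k = i % len(cores)                    # each complex gets one search core
            best_before = f(cx).min()
            for _ in range(5):                    # a few evolution steps per cycle
                cx = cores[k](cx, f, rng)
            gains[k] += best_before - f(cx).min()
            complexes[i] = cx
        pop = np.vstack(complexes)                # shuffle complexes back together
    return pop[np.argmin(f(pop))], gains

best, gains = sc_sahel_sketch(sphere)
print("best objective:", sphere(best))
print("improvement credited to each core:", gains)
```

    In the full SC-SAHEL scheme, this per-core performance accounting is what drives the competition, so that the framework converges on the core best suited to the problem at hand.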

  16. Consistent thermodynamic framework for interacting particles by neglecting thermal noise.

    PubMed

    Nobre, Fernando D; Curado, Evaldo M F; Souza, Andre M C; Andrade, Roberto F S

    2015-02-01

    An effective temperature θ, conjugated to a generalized entropy s_q, was introduced recently for a system of interacting particles. Since θ takes values much higher than typical room temperatures (T ≪ θ), the thermal noise can be neglected (T/θ ≃ 0) in these systems. Moreover, the consistency of this definition, as well as of a form analogous to the first law of thermodynamics, du = θ ds_q + δW, was recently verified by means of a Carnot cycle, whose efficiency was shown to present the usual form, η = 1 − θ₂/θ₁. Herein we explore further the heat contribution δQ = θ ds_q by proposing a way for a heat exchange between two such systems, as well as its associated thermal equilibrium. As a consequence, the zeroth principle is also established. Moreover, we consolidate the first-law proposal by following the usual procedure for obtaining different potentials, i.e., applying Legendre transformations for distinct pairs of independent variables. From these potentials we derive the equation of state and Maxwell relations, and define response functions. All results presented are shown to be consistent with those of standard thermodynamics for T > 0.
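
    In this notation, the potential structure follows the standard Legendre-transform pattern. The block below is a schematic restatement under the usual conventions (taking δW = −p dv for concreteness); it illustrates the structure rather than quoting a derivation from the paper.

```latex
% First law with effective temperature \theta and generalized entropy s_q:
du = \theta\, ds_q + \delta W , \qquad \delta Q = \theta\, ds_q
% Legendre transform to a free-energy-like potential in (\theta, v):
f(\theta, v) = u - \theta s_q , \qquad df = -s_q\, d\theta - p\, dv
% from which a Maxwell-type relation follows:
\left( \frac{\partial s_q}{\partial v} \right)_{\theta}
  = \left( \frac{\partial p}{\partial \theta} \right)_{v}
```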

  17. Development and validation of a Database Forensic Metamodel (DBFM)

    PubMed Central

    Al-dhaqm, Arafat; Razak, Shukor; Othman, Siti Hajar; Ngadi, Asri; Ahmed, Mohammed Nazir; Ali Mohammed, Abdulalem

    2017-01-01

    Database Forensics (DBF) is a widespread area of knowledge. It has many complex features and is well known amongst database investigators and practitioners. Several models and frameworks have been created specifically to allow knowledge-sharing and effective DBF activities. However, these are often narrow in focus and address specific database incident types. We have analysed 60 such models in an attempt to uncover the commonality underlying numerous DBF activities, even when the specific actions vary, and then generated a unified abstract view of DBF in the form of a metamodel. We identified and extracted common concepts, reconciled their definitions, and proposed a metamodel. We applied a metamodelling process to guarantee that this metamodel is comprehensive and consistent. PMID:28146585

  18. College-"Conocimiento": Toward an Interdisciplinary College Choice Framework for Latinx Students

    ERIC Educational Resources Information Center

    Acevedo-Gil, Nancy

    2017-01-01

    This paper builds upon Perna's college choice model by integrating Anzaldúa's theory of "conocimiento" to propose an interdisciplinary college choice framework for Latinx students. Using previous literature, this paper proposes college-"conocimiento" as a framework that contextualizes Latinx student college choices within the…

  19. 75 FR 20980 - Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-22

    ..., evaluate, and enforce fishery regulations. Framework Adjustment 1 (FW1) to the Atlantic Surf Clam and Ocean... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework Adjustment I AGENCY: National...

  20. 75 FR 19356 - Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-14

    ..., evaluate, and enforce fishery regulations. Framework Adjustment 1 (FW1) to the Atlantic Surf Clam and Ocean... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Atlantic Surfclam and Ocean Quahog Framework Adjustment I AGENCY: National...

  1. Defining a competency framework: the first step toward competency-based medical education.

    PubMed

    Mirzazadeh, Azim; Mortaz Hejri, Sara; Jalili, Mohammad; Asghari, Fariba; Labaf, Ali; Sedaghat Siyahkal, Mojtaba; Afshari, Ali; Saleh, Narges

    2014-01-01

    Despite the existence of a large variety of competency frameworks for medical graduates, there is no agreement on a single set of outcomes. Different countries have attempted to define their own sets of competencies to respond to their local situations. This article reports the process of developing a medical graduates' competency framework as the first step of the curriculum reform in Tehran University of Medical Sciences (TUMS). A participatory approach was applied to develop the competency framework. Following a literature review, nominal group meetings with students and faculty members were held to generate the initial list of expectations, and 9 domains were proposed. The domains were then reviewed, and one of them was removed. The competency framework was sent to the Curriculum Reform Committee for consideration and approval, where it was decided to distribute electronic and paper forms among all faculty members and ask them for their comments. After incorporating some of the modifications, the document was approved by the committee. The TUMS competency framework consists of 8 domains: Clinical skills; Communication skills; Patient management; Health promotion and disease prevention; Personal development; Professionalism, medical ethics and law; Decision making, reasoning and problem-solving; and Health system and the corresponding role of physicians. Development of a competency framework through a participatory approach was the first step towards curriculum reform in TUMS, aligned with local needs and conditions. The lessons learned through the process may be useful for similar projects in the future.

  2. A human-oriented framework for developing assistive service robots.

    PubMed

    McGinn, Conor; Cullinan, Michael F; Culleton, Mark; Kelly, Kevin

    2018-04-01

    Multipurpose robots that can perform a range of useful tasks have the potential to increase the quality of life for many people living with disabilities. Owing to factors such as high system complexity, as-yet unresolved research questions and current technology limitations, there is a need for effective strategies to coordinate the development process. Integrating established methodologies based on human-centred design and universal design, a framework was formulated to coordinate the robot design process over successive iterations of prototype development. An account is given of how the framework was practically applied to the problem of developing a personal service robot. Application of the framework led to the formation of several design goals which addressed a wide range of identified user needs. The resultant prototype solution, which consisted of several component elements, succeeded in demonstrating the performance stipulated by all of the proposed metrics. Application of the framework resulted in the development of a complex prototype that addressed many aspects of the functional and usability requirements of a personal service robot. Following the process led to several important insights which directly benefit the development of subsequent prototypes. Implications for Rehabilitation This research shows how universal design might be used to formulate usability requirements for assistive service robots. A framework is presented that guides the process of designing service robots in a human-centred way. Through practical application of the framework, a prototype robot system that addressed a range of identified user needs was developed.

  3. Truth-Valued-Flow Inference (TVFI) and its applications in approximate reasoning

    NASA Technical Reports Server (NTRS)

    Wang, Pei-Zhuang; Zhang, Hongmin; Xu, Wei

    1993-01-01

    The framework of the theory of Truth-Valued-Flow Inference (TVFI) is introduced. Although dozens of papers on fuzzy reasoning have been presented, we believe a unified fuzzy reasoning theory is still needed, one with the following two features: (1) it is simple enough to be executed feasibly and easily; and (2) it is sufficiently well structured and consistent that it can be built into a strict mathematical theory, consistent with the theory proposed by L.A. Zadeh. TVFI is one of the fuzzy reasoning theories that satisfies the above two features. It presents inference in the form of networks, naturally viewing inference as a process of truth values flowing among propositions.

  4. SOMA: A Proposed Framework for Trend Mining in Large UK Diabetic Retinopathy Temporal Databases

    NASA Astrophysics Data System (ADS)

    Somaraki, Vassiliki; Harding, Simon; Broadbent, Deborah; Coenen, Frans

    In this paper, we present SOMA, a new trend mining framework, and Aretaeus, the associated trend mining algorithm. The proposed framework is able to detect different kinds of trends within longitudinal datasets. The prototype trends are defined mathematically so that they can be mapped onto the temporal patterns. Trends are defined and generated in terms of the frequency of occurrence of pattern changes over time. To evaluate the proposed framework, the process was applied to a large collection of medical records forming part of the diabetic retinopathy screening programme at the Royal Liverpool University Hospital.

  5. The role of evidence, context, and facilitation in an implementation trial: implications for the development of the PARIHS framework

    PubMed Central

    2013-01-01

    Background: The case has been made for more and better theory-informed process evaluations within trials in an effort to facilitate insightful understandings of how interventions work. In this paper, we provide an explanation of implementation processes from one of the first national implementation research randomized controlled trials with embedded process evaluation conducted within acute care, and a proposed extension to the Promoting Action on Research Implementation in Health Services (PARIHS) framework. Methods: The PARIHS framework was prospectively applied to guide decisions about intervention design, data collection, and analysis processes in a trial focussed on reducing peri-operative fasting times. In order to capture a holistic picture of implementation processes, the same data were collected across 19 participating hospitals irrespective of allocation to intervention. This paper reports on findings from data collected from a purposive sample of 151 staff and patients pre- and post-intervention. Data were analysed using content analysis within, and then across, data sets. Results: A robust and uncontested evidence base was a necessary but not sufficient condition for practice change, in that individual staff and patient responses such as caution influenced decision making. The implementation context was challenging, in which individuals and teams were bounded by professional issues, communication challenges, power, and a lack of clarity for the authority and responsibility for practice change. Progress was made in sites where processes were aligned with existing initiatives. Additionally, facilitators reported engaging in many intervention implementation activities, some of which resulted in practice changes but not in significant improvements to outcomes. Conclusions: This study provided an opportunity for reflection on the comprehensiveness of the PARIHS framework. Consistent with the underlying tenet of PARIHS, a multi-faceted and dynamic story of implementation was evident. However, the prominent role that individuals played as part of the interaction between evidence and context is not currently explicit within the framework. We propose that successful implementation of evidence into practice is a planned facilitated process involving an interplay between individuals, evidence, and context to promote evidence-informed practice. This proposal will enhance the potential of the PARIHS framework for explanation, and ensure theoretical development both informs and responds to the evidence base for implementation. Trial registration: ISRCTN18046709 - Peri-operative Implementation Study Evaluation (PoISE). PMID:23497438

  6. Surrogate Endpoint Evaluation: Principal Stratification Criteria and the Prentice Definition.

    PubMed

    Gilbert, Peter B; Gabriel, Erin E; Huang, Ying; Chan, Ivan S F

    2015-09-01

    A common problem of interest within a randomized clinical trial is the evaluation of an inexpensive response endpoint as a valid surrogate endpoint for a clinical endpoint, where a chief purpose of a valid surrogate is to provide a way to make correct inferences on clinical treatment effects in future studies without needing to collect the clinical endpoint data. Within the principal stratification framework for addressing this problem based on data from a single randomized clinical efficacy trial, a variety of definitions and criteria for a good surrogate endpoint have been proposed, all based on or closely related to the "principal effects" or "causal effect predictiveness (CEP)" surface. We discuss CEP-based criteria for a useful surrogate endpoint, including (1) the meaning and relative importance of proposed criteria including average causal necessity (ACN), average causal sufficiency (ACS), and large clinical effect modification; (2) the relationship between these criteria and the Prentice definition of a valid surrogate endpoint; and (3) the relationship between these criteria and the consistency criterion (i.e., assurance against the "surrogate paradox"). This includes the result that ACN plus a strong version of ACS generally do not imply the Prentice definition nor the consistency criterion, but they do have these implications in special cases. Moreover, the converse does not hold except in a special case with a binary candidate surrogate. The results highlight that assumptions about the treatment effect on the clinical endpoint before the candidate surrogate is measured are influential for the ability to draw conclusions about the Prentice definition or consistency. In addition, we emphasize that in some scenarios that occur commonly in practice, the principal strata sub-populations for inference are identifiable from the observable data, in which cases the principal stratification framework has relatively high utility for the purpose of effect modification analysis, and is closely connected to the treatment marker selection problem. The results are illustrated with application to a vaccine efficacy trial, where ACN and ACS for an antibody marker are found to be consistent with the data and hence support the Prentice definition and consistency.
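
    For orientation, the CEP surface referred to above is conventionally written in potential-outcomes notation. The block below uses one common formulation (difference scale, clinical endpoint Y, candidate surrogate S); the notation is assumed for illustration rather than quoted from the paper.

```latex
% Causal effect predictiveness surface on the difference scale:
\mathrm{CEP}(s_1, s_0) = E\left[ Y(1) - Y(0) \mid S(1) = s_1, \; S(0) = s_0 \right]
% Average causal necessity (ACN): no effect on the surrogate implies
% no effect on the clinical endpoint,
\mathrm{CEP}(s, s) = 0 \quad \text{for all } s .
% Average causal sufficiency (ACS), one version: a large enough effect on the
% surrogate implies a nonzero clinical effect,
\mathrm{CEP}(s_1, s_0) > 0 \quad \text{whenever } s_1 - s_0 > \delta .
```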

  7. Surrogate Endpoint Evaluation: Principal Stratification Criteria and the Prentice Definition

    PubMed Central

    Gilbert, Peter B.; Gabriel, Erin E.; Huang, Ying; Chan, Ivan S.F.

    2015-01-01

    A common problem of interest within a randomized clinical trial is the evaluation of an inexpensive response endpoint as a valid surrogate endpoint for a clinical endpoint, where a chief purpose of a valid surrogate is to provide a way to make correct inferences on clinical treatment effects in future studies without needing to collect the clinical endpoint data. Within the principal stratification framework for addressing this problem based on data from a single randomized clinical efficacy trial, a variety of definitions and criteria for a good surrogate endpoint have been proposed, all based on or closely related to the “principal effects” or “causal effect predictiveness (CEP)” surface. We discuss CEP-based criteria for a useful surrogate endpoint, including (1) the meaning and relative importance of proposed criteria including average causal necessity (ACN), average causal sufficiency (ACS), and large clinical effect modification; (2) the relationship between these criteria and the Prentice definition of a valid surrogate endpoint; and (3) the relationship between these criteria and the consistency criterion (i.e., assurance against the “surrogate paradox”). This includes the result that ACN plus a strong version of ACS generally do not imply the Prentice definition nor the consistency criterion, but they do have these implications in special cases. Moreover, the converse does not hold except in a special case with a binary candidate surrogate. The results highlight that assumptions about the treatment effect on the clinical endpoint before the candidate surrogate is measured are influential for the ability to draw conclusions about the Prentice definition or consistency. In addition, we emphasize that in some scenarios that occur commonly in practice, the principal strata sub-populations for inference are identifiable from the observable data, in which cases the principal stratification framework has relatively high utility for the purpose of effect modification analysis, and is closely connected to the treatment marker selection problem. The results are illustrated with application to a vaccine efficacy trial, where ACN and ACS for an antibody marker are found to be consistent with the data and hence support the Prentice definition and consistency. PMID:26722639

  8. Innovation adoption: a review of theories and constructs.

    PubMed

    Wisdom, Jennifer P; Chor, Ka Ho Brian; Hoagwood, Kimberly E; Horwitz, Sarah M

    2014-07-01

    Many theoretical frameworks seek to describe the dynamic process of the implementation of innovations. Little is known, however, about factors related to decisions to adopt innovations and how the likelihood of adoption of innovations can be increased. Using a narrative synthesis approach, this paper compared constructs theorized to be related to adoption of innovations proposed in existing theoretical frameworks in order to identify characteristics likely to increase adoption of innovations. The overall goal was to identify elements across adoption frameworks that are potentially modifiable and, thus, might be employed to improve the adoption of evidence-based practices. The review identified 20 theoretical frameworks that could be grouped into two broad categories: theories that mainly address the adoption process (N = 10) and theories that address adoption within the context of implementation, diffusion, dissemination, and/or sustainability (N = 10). Constructs of leadership, operational size and structure, innovation fit with norms and values, and attitudes/motivation toward innovations each are mentioned in at least half of the theories, though there were no consistent definitions or measures for these constructs. A lack of precise definitions and measurement of constructs suggests further work is needed to increase our understanding of adoption of innovations.

  9. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control.

    PubMed

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-10-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way.

  10. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control

    PubMed Central

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-01-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way. PMID:29019317

  11. Innovation Adoption: A Review of Theories and Constructs

    PubMed Central

    Chor, Ka Ho Brian; Hoagwood, Kimberly E.; Horwitz, Sarah M.

    2013-01-01

    Many theoretical frameworks seek to describe the dynamic process of the implementation of innovations. Little is known, however, about factors related to decisions to adopt innovations and how the likelihood of adoption of innovations can be increased. Using a narrative synthesis approach, this paper compared constructs theorized to be related to adoption of innovations proposed in existing theoretical frameworks in order to identify characteristics likely to increase adoption of innovations. The overall goal was to identify elements across adoption frameworks that are potentially modifiable and, thus, might be employed to improve the adoption of evidence-based practices. The review identified 20 theoretical frameworks that could be grouped into two broad categories: theories that mainly address the adoption process (N = 10) and theories that address adoption within the context of implementation, diffusion, dissemination, and/or sustainability (N = 10). Constructs of leadership, operational size and structure, innovation fit with norms and values, and attitudes/motivation toward innovations each are mentioned in at least half of the theories, though there were no consistent definitions or measures for these constructs. A lack of precise definitions and measurement of constructs suggests further work is needed to increase our understanding of adoption of innovations. PMID:23549911

  12. A new pressure ulcer conceptual framework.

    PubMed

    Coleman, Susanne; Nixon, Jane; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Farrin, Amanda; Dowding, Dawn; Schols, Jos M G A; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Schoonhoven, Lisette; Bader, Dan L; Gefen, Amit; Oomens, Cees W J; Nelson, E Andrea

    2014-10-01

    This paper discusses the critical determinants of pressure ulcer development and proposes a new pressure ulcer conceptual framework. Recent work to develop and validate a new evidence-based pressure ulcer risk assessment framework was undertaken. This formed part of a Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research. The foundation for the risk assessment component incorporated a systematic review and a consensus study that highlighted the need to propose a new conceptual framework. Design: Discussion Paper. The new conceptual framework links biomechanical, physiological and epidemiological evidence, through use of data from a systematic review (search conducted March 2010), a consensus study (conducted December 2010-2011) and an international expert group meeting (conducted December 2011). A new pressure ulcer conceptual framework incorporating key physiological and biomechanical components and their impact on internal strains, stresses and damage thresholds is proposed. Direct and key indirect causal factors suggested in a theoretical causal pathway are mapped to the physiological and biomechanical components of the framework. The new proposed conceptual framework provides the basis for understanding the critical determinants of pressure ulcer development and has the potential to influence risk assessment guidance and practice. It could also be used to underpin future research to explore the role of individual risk factors conceptually and operationally. By integrating existing knowledge from epidemiological, physiological and biomechanical evidence, a theoretical causal pathway and new conceptual framework are proposed with potential implications for practice and research. © 2014 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  13. A new pressure ulcer conceptual framework

    PubMed Central

    Coleman, Susanne; Nixon, Jane; Keen, Justin; Wilson, Lyn; McGinnis, Elizabeth; Dealey, Carol; Stubbs, Nikki; Farrin, Amanda; Dowding, Dawn; Schols, Jos MGA; Cuddigan, Janet; Berlowitz, Dan; Jude, Edward; Vowden, Peter; Schoonhoven, Lisette; Bader, Dan L; Gefen, Amit; Oomens, Cees WJ; Nelson, E Andrea

    2014-01-01

    Aim: This paper discusses the critical determinants of pressure ulcer development and proposes a new pressure ulcer conceptual framework. Background: Recent work to develop and validate a new evidence-based pressure ulcer risk assessment framework was undertaken. This formed part of a Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research. The foundation for the risk assessment component incorporated a systematic review and a consensus study that highlighted the need to propose a new conceptual framework. Design: Discussion Paper. Data Sources: The new conceptual framework links biomechanical, physiological and epidemiological evidence, through use of data from a systematic review (search conducted March 2010), a consensus study (conducted December 2010–2011) and an international expert group meeting (conducted December 2011). Implications for Nursing: A new pressure ulcer conceptual framework incorporating key physiological and biomechanical components and their impact on internal strains, stresses and damage thresholds is proposed. Direct and key indirect causal factors suggested in a theoretical causal pathway are mapped to the physiological and biomechanical components of the framework. The new proposed conceptual framework provides the basis for understanding the critical determinants of pressure ulcer development and has the potential to influence risk assessment guidance and practice. It could also be used to underpin future research to explore the role of individual risk factors conceptually and operationally. Conclusion: By integrating existing knowledge from epidemiological, physiological and biomechanical evidence, a theoretical causal pathway and new conceptual framework are proposed with potential implications for practice and research. PMID:24684197

  14. Education for Sustainable Development: A Framework for Nigeria

    ERIC Educational Resources Information Center

    Oni, Adesoji A.; Adetoro, J. A.

    2012-01-01

    This paper proposed a framework for conceptualizing, planning for and implementing an education agenda for sustainable development within the Nigerian context. The strategic questions informing this framework are: What is the context within which sustainable development is being proposed? What are the educational needs that arise within the given…

  15. A Contingency Framework for Listening to the Dying

    ERIC Educational Resources Information Center

    Vora, Erika; Vora, Ariana

    2008-01-01

    Listening to the dying poses special challenges. This paper proposes a contingency framework for describing and assessing various circumstances when listening to the dying. It identifies current approaches to listening, applies the contingency framework toward effectively listening to the dying, and proposes a new type of listening called…

  16. A College and Career Readiness Framework for Secondary Students with Disabilities

    ERIC Educational Resources Information Center

    Morningstar, Mary E.; Lombardi, Allison; Fowler, Catherine H.; Test, David W.

    2017-01-01

    In this qualitative study, a proposed organizing framework of college and career readiness for secondary students with disabilities was developed based on a synthesis of extant research articulating student success. The original proposed framework included six domains representing academic and nonacademic skills associated with college and career…

  17. A Deep Learning Approach to on-Node Sensor Data Analytics for Mobile or Wearable Devices.

    PubMed

    Ravi, Daniele; Wong, Charence; Lo, Benny; Yang, Guang-Zhong

    2017-01-01

    The increasing popularity of wearable devices in recent years means that a diverse range of physiological and functional data can now be captured continuously for applications in sports, wellbeing, and healthcare. This wealth of information requires efficient methods of classification and analysis, where deep learning is a promising technique for large-scale data analytics. While deep learning has been successful in implementations that utilize high-performance computing platforms, its use on low-power wearable devices is limited by resource constraints. In this paper, we propose a deep learning methodology that combines features learned from inertial sensor data with complementary information from a set of shallow features to enable accurate and real-time activity classification. The design of this combined method aims to overcome some of the limitations present in a typical deep learning framework where on-node computation is required. To optimize the proposed method for real-time on-node computation, spectral domain preprocessing is used before the data are passed to the deep learning framework. The classification accuracy of our proposed deep learning approach is evaluated against state-of-the-art methods using both laboratory and real-world activity datasets. Our results show the validity of the approach on different human activity datasets, outperforming other methods, including the two methods used within our combined pipeline. We also demonstrate that the computation times for the proposed method are consistent with the constraints of real-time on-node processing on smartphones and a wearable sensor platform.
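
    As a rough illustration of the pipeline shape described above (spectral preprocessing feeding a compact learned branch, with hand-crafted shallow features concatenated before classification), the sketch below uses assumed tensor shapes, layer sizes, and feature counts; it is not the authors' architecture.

```python
import torch
import torch.nn as nn

class OnNodeActivityNet(nn.Module):
    """Compact classifier fusing a learned spectral branch with shallow features."""
    def __init__(self, n_freq_bins=64, n_channels=3, n_shallow=12, n_classes=6):
        super().__init__()
        self.n_freq_bins = n_freq_bins
        # learned branch: 1-D convolutions over the magnitude spectrum per axis
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(4),
        )
        # fused head: learned features + shallow statistics (means, variances, ...)
        self.head = nn.Sequential(
            nn.Linear(32 * 4 + n_shallow, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, window, shallow):
        # window: (batch, channels, samples) raw inertial window
        spectrum = torch.fft.rfft(window, dim=-1).abs()[..., :self.n_freq_bins]
        z = self.conv(spectrum).flatten(1)
        return self.head(torch.cat([z, shallow], dim=1))

# One forward pass on fake data: 8 windows of 128 samples from 3 axes,
# each paired with 12 precomputed shallow features.
net = OnNodeActivityNet()
logits = net(torch.randn(8, 3, 128), torch.randn(8, 12))
print(logits.shape)  # torch.Size([8, 6])
```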

  18. A Framework for Measuring the Progress in Exoskeleton Skills in People with Complete Spinal Cord Injury

    PubMed Central

    van Dijsseldonk, Rosanne B.; Rijken, Hennie; van Nes, Ilse J. W.; van de Meent, Henk; Keijsers, Noel L. W.

    2017-01-01

    For safe application of exoskeletons in people with spinal cord injury at home or in the community, users are required to complete an exoskeleton training program in which they learn to perform basic and advanced skills. So far, a framework to test exoskeleton skills has been lacking. The aim of this study was to develop and test the hierarchy and reliability of a framework for measuring the progress in the ability to perform basic and advanced skills. Twelve participants with paraplegia were given twenty-four training sessions in 8 weeks with the ReWalk exoskeleton. During the 2nd, 4th, and 6th training week the Intermediate-skills-test was performed, consisting of 27 skills measured in a hierarchical order of difficulty, until two skills were not achieved. When participants could walk independently, the Final-skills-test, consisting of 20 skills, was performed in the last training session. Each skill was performed at least two times, with a maximum of three attempts. As a reliability measure, consistency was used: the number of skills performed identically in the first two attempts relative to the total number. Ten participants completed the training program. Their number of achieved intermediate skills differed significantly between the measurements (χ²F(2) = 12.36, p = 0.001). Post-hoc analysis revealed a significant increase in the median number of achieved intermediate skills, from 4 [1–7] at the first to 10.5 [5–26] at the third Intermediate-skills-test. The rate of participants who achieved the intermediate skills decreased, and the coefficient of reproducibility was 0.98. Eight participants met the criteria to perform the Final-skills-test. Their median number of successfully performed final skills was 16.5 [13–20] and 17 [14–19] skills on the first and second attempts. The overall consistency threshold of >70% was achieved in both the Intermediate-skills-test (73%) and the Final-skills-test (81%). Eight of the twelve participants experienced skin damage during the training; in four participants this resulted in missed training sessions. The framework proposed in this study measured the progress in performing basic and advanced exoskeleton skills during a training program. The hierarchically ordered skills-test could discriminate across participants' skill levels, and the overall consistency was considered acceptable. PMID:29311780

  19. A Framework for Measuring the Progress in Exoskeleton Skills in People with Complete Spinal Cord Injury.

    PubMed

    van Dijsseldonk, Rosanne B; Rijken, Hennie; van Nes, Ilse J W; van de Meent, Henk; Keijsers, Noel L W

    2017-01-01

    For safe application of exoskeletons in people with spinal cord injury at home or in the community, users are required to complete an exoskeleton training program in which they learn to perform basic and advanced skills. So far, a framework to test exoskeleton skills has been lacking. The aim of this study was to develop and test the hierarchy and reliability of a framework for measuring the progress in the ability to perform basic and advanced skills. Twelve participants with paraplegia were given twenty-four training sessions in 8 weeks with the ReWalk exoskeleton. During the 2nd, 4th, and 6th training week the Intermediate-skills-test was performed, consisting of 27 skills measured in a hierarchical order of difficulty, until two skills were not achieved. When participants could walk independently, the Final-skills-test, consisting of 20 skills, was performed in the last training session. Each skill was performed at least two times, with a maximum of three attempts. As a reliability measure, consistency was used: the number of skills performed identically in the first two attempts relative to the total number. Ten participants completed the training program. Their number of achieved intermediate skills differed significantly between the measurements (χ²F(2) = 12.36, p = 0.001). Post-hoc analysis revealed a significant increase in the median number of achieved intermediate skills, from 4 [1-7] at the first to 10.5 [5-26] at the third Intermediate-skills-test. The rate of participants who achieved the intermediate skills decreased, and the coefficient of reproducibility was 0.98. Eight participants met the criteria to perform the Final-skills-test. Their median number of successfully performed final skills was 16.5 [13-20] and 17 [14-19] skills on the first and second attempts. The overall consistency threshold of >70% was achieved in both the Intermediate-skills-test (73%) and the Final-skills-test (81%). Eight of the twelve participants experienced skin damage during the training; in four participants this resulted in missed training sessions. The framework proposed in this study measured the progress in performing basic and advanced exoskeleton skills during a training program. The hierarchically ordered skills-test could discriminate across participants' skill levels, and the overall consistency was considered acceptable.

  20. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, the common approach used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and to test the significance of such terms. The narrow scope of this procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in the interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply as a test of a predictor-by-moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
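
    The "analytic interaction" convention that the paper moves beyond is the familiar cross-product term; written out in generic linear-model notation (an illustrative assumption, not the paper's own formulas):

```latex
% Conventional moderation test: add a predictor-by-moderator product term
Y = \beta_0 + \beta_1 X + \beta_2 M + \beta_3 (X \times M) + \varepsilon
% and test H_0 : \beta_3 = 0. Under this model the effect of X,
\frac{\partial E[Y]}{\partial X} = \beta_1 + \beta_3 M ,
% can only vary linearly with M, which is the narrowness the paper criticizes.
```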

  1. Individual psychotherapy for schizophrenia: trends and developments in the wake of the recovery movement.

    PubMed

    Hamm, Jay A; Hasson-Ohayon, Ilanit; Kukla, Marina; Lysaker, Paul H

    2013-01-01

    Although the role and relative prominence of psychotherapy in the treatment of schizophrenia have fluctuated over time, an analysis of the history of psychotherapy for schizophrenia, focusing on findings from the recovery movement, reveals recent trends, including the emergence of integrative psychotherapy approaches. The authors suggest that the recovery movement has revealed limitations in traditional approaches to psychotherapy and has provided opportunities for integrative approaches to emerge as a mechanism for promoting recovery in persons with schizophrenia. Five approaches to integrative psychotherapy for persons with schizophrenia are presented, and a shared conceptual framework that allows these five approaches to be compatible with one another is proposed. The conceptual framework is consistent with theories of recovery and emphasizes interpersonal attachment, personal narrative, and metacognitive processes. Implications for future research on integrative psychotherapy are considered.

  2. Theory of the Origin, Evolution, and Nature of Life

    PubMed Central

    Andrulis, Erik D.

    2011-01-01

    Life is an inordinately complex unsolved puzzle. Despite significant theoretical progress, experimental anomalies, paradoxes, and enigmas have revealed paradigmatic limitations. Thus, the advancement of scientific understanding requires new models that resolve fundamental problems. Here, I present a theoretical framework that economically fits evidence accumulated from examinations of life. This theory is based upon a straightforward and non-mathematical core model and proposes unique yet empirically consistent explanations for major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations. The theoretical framework unifies the macrocosmic and microcosmic realms, validates predicted laws of nature, and solves the puzzle of the origin and evolution of cellular life in the universe. PMID:25382118

  3. A study of microindentation hardness tests by mechanism-based strain gradient plasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Y.; Xue, Z.; Gao, H.

    2000-08-01

    We recently proposed a theory of mechanism-based strain gradient (MSG) plasticity to account for the size dependence of plastic deformation at micron- and submicron-length scales. The MSG plasticity theory connects micron-scale plasticity to dislocation theories via a multiscale, hierarchical framework linking Taylor's dislocation hardening model to strain gradient plasticity. Here we show that the theory of MSG plasticity, when used to study microindentation, indeed reproduces the linear dependence observed in experiments, thus providing an important self-consistent check of the theory. The effects of pileup, sink-in, and the radius of the indenter tip have been taken into account in the indentation model. In accomplishing this objective, we have generalized the MSG plasticity theory to include the elastic deformation in the hierarchical framework. (c) 2000 Materials Research Society.
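
    Two textbook relations supply useful context here (standard forms, not equations quoted from the paper): Taylor hardening, which links flow stress to dislocation density, and the Nix-Gao depth dependence of hardness, whose linearity of H² in 1/h is the linear dependence referred to above.

```latex
% Taylor dislocation hardening: flow stress from the total dislocation density
\tau = \alpha \mu b \sqrt{\rho} , \qquad \rho = \rho_S + \rho_G ,
% where the geometrically necessary density \rho_G scales with the effective
% plastic strain gradient \eta, \rho_G \propto \eta / b.
% Nix-Gao microindentation size effect:
\left( \frac{H}{H_0} \right)^{2} = 1 + \frac{h^{*}}{h} ,
% i.e. H^2 is linear in 1/h, with H_0 the macroscopic hardness and h^* a
% characteristic length scale.
```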

  4. Information processing capacity in psychopathy: Effects of anomalous attention.

    PubMed

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Cluster Correspondence Analysis.

    PubMed

    van de Velden, M; D'Enza, A Iodice; Palumbo, F

    2017-03-01

    A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories, in such a way that a single between-cluster variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided, and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
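
    For contrast with the joint method, the sketch below implements the tandem baseline mentioned above (one-hot coding, then dimension reduction, then clustering) on toy categorical data containing one variable unrelated to the cluster structure; function names and parameter values are illustrative. The joint method would instead alternate the scaling and clustering steps under a single between-variance objective.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import TruncatedSVD
from sklearn.preprocessing import OneHotEncoder

def tandem_cluster(values, n_dims=2, n_clusters=3, seed=0):
    """Tandem approach: reduce one-hot-coded categories first, then cluster."""
    Z = OneHotEncoder(sparse_output=False).fit_transform(values)
    scores = TruncatedSVD(n_components=n_dims, random_state=seed).fit_transform(Z)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(scores)

# Toy data: 300 individuals in 3 latent groups, two informative categorical
# variables (80% faithful to the group) and one pure-noise variable.
rng = np.random.default_rng(0)
group = rng.integers(0, 3, size=300)
noisy = lambda: np.where(rng.random(300) < 0.8, group, rng.integers(0, 3, 300))
data = np.column_stack([noisy(), noisy(), rng.integers(0, 4, 300)]).astype(str)

labels = tandem_cluster(data)
print("recovered cluster sizes:", np.bincount(labels))
```

    Adding more variables of the noise kind is precisely what the abstract conjectures will hurt the tandem approach, which the simulation study confirms.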

  6. Maximum margin multiple instance clustering with applications to image and text clustering.

    PubMed

    Zhang, Dan; Wang, Fei; Si, Luo; Li, Tao

    2011-05-01

    In multiple instance learning problems, patterns are often given as bags, and each bag consists of some instances. Most existing research in the area focuses on multiple instance classification and multiple instance regression, while very limited work has been conducted on multiple instance clustering (MIC). This paper formulates a novel framework, maximum margin multiple instance clustering (M³IC), for MIC. However, it is impractical to directly solve the optimization problem of M³IC. Therefore, M³IC is relaxed in this paper to enable an efficient optimization solution with a combination of the constrained concave-convex procedure and the cutting plane method. Furthermore, this paper presents some important properties of the proposed method and discusses the relationship between the proposed method and some other related ones. An extensive set of empirical results is presented to demonstrate the advantages of the proposed method against existing research in terms of both effectiveness and efficiency.

  7. An approach to the interpretation of Cole-Davidson and Cole-Cole dielectric functions

    NASA Astrophysics Data System (ADS)

    Iglesias, T. P.; Vilão, G.; Reis, João Carlos R.

    2017-08-01

    Assuming that a dielectric sample can be described by Debye's model at each frequency, a method based on Cole's treatment is proposed for the direct estimation, at experimental frequencies, of relaxation times and the corresponding static and infinite-frequency permittivities. These quantities, and the link between dielectric strength and mean molecular dipole moment at each frequency, could be useful for analyzing dielectric relaxation processes. The method is applied to samples that follow a Cole-Cole or a Cole-Davidson dielectric function, and a physical interpretation of these dielectric functions is proposed. The behavior of relaxation time with frequency distinguishes the two dielectric functions. The proposed method can also be applied to samples following a Havriliak-Negami or any other dielectric function. The dielectric relaxation of a nanofluid consisting of graphene nanoparticles dispersed in the oil squalane is reported and discussed within the novel framework.
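
    For reference, the empirical dielectric functions named above have the following standard forms; in this convention, Havriliak-Negami reduces to Cole-Cole for β = 1 and to Cole-Davidson for α = 0.

```latex
\begin{align*}
\text{Debye:} \quad
  & \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{1 + i\omega\tau} \\
\text{Cole-Cole:} \quad
  & \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{1 + (i\omega\tau)^{1-\alpha}} \\
\text{Cole-Davidson:} \quad
  & \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{\left(1 + i\omega\tau\right)^{\beta}} \\
\text{Havriliak-Negami:} \quad
  & \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{\left(1 + (i\omega\tau)^{1-\alpha}\right)^{\beta}}
\end{align*}
```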

  8. Image denoising in mixed Poisson-Gaussian noise.

    PubMed

    Luisier, Florian; Blu, Thierry; Unser, Michael

    2011-03-01

    We propose a general methodology (PURE-LET) to design and optimize a wide class of transform-domain thresholding algorithms for denoising images corrupted by mixed Poisson-Gaussian noise. We express the denoising process as a linear expansion of thresholds (LET) that we optimize by relying on a purely data-adaptive unbiased estimate of the mean-squared error (MSE), derived in a non-Bayesian framework (PURE: Poisson-Gaussian unbiased risk estimate). We provide a practical approximation of this theoretical MSE estimate for the tractable optimization of arbitrary transform-domain thresholding. We then propose a pointwise estimator for undecimated filterbank transforms, which consists of subband-adaptive thresholding functions with signal-dependent thresholds that are globally optimized in the image domain. We finally demonstrate the potential of the proposed approach through extensive comparisons with state-of-the-art techniques that are specifically tailored to the estimation of Poisson intensities. We also present denoising results obtained on real images of low-count fluorescence microscopy.
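
    The "linear expansion of thresholds" is what keeps the optimization tractable: the estimator is linear in the unknown weights, so minimizing a quadratic unbiased risk estimate reduces to a small linear system. Schematically, with notation assumed for illustration:

```latex
% LET: the denoised image is a linear combination of K fixed processing maps
\hat{x} = \sum_{k=1}^{K} a_k\, \theta_k(y) .
% Substituting into the unbiased MSE (PURE) estimate gives a quadratic in the
% weights a, so the optimal weights solve the K-by-K normal equations
\mathbf{M}\,\mathbf{a} = \mathbf{c} , \qquad
M_{kl} = \left\langle \theta_k(y), \theta_l(y) \right\rangle ,
% where c_k is a data-only unbiased estimate of \langle \theta_k(y), x \rangle.
```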

  9. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although the most recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components: a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  10. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although the most recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components: a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
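
    A minimal sketch of the two-component design (a dynamic regression model for the expected load plus an adaptive threshold on its residuals) might look as follows. The lag order, window length, and threshold multiplier are illustrative assumptions rather than the paper's settings, and the model is fit in batch here rather than recursively, for brevity.

```python
import numpy as np

def detect_load_anomalies(load, n_lags=4, window=168, k=4.0):
    """Flag observations whose regression residual exceeds an adaptive threshold.

    load   : 1-D array of load observations
    n_lags : autoregressive order of the dynamic regression model
    window : trailing number of residuals used for the adaptive threshold
    k      : threshold multiplier on the rolling residual spread
    """
    X = np.column_stack([load[i:len(load) - n_lags + i] for i in range(n_lags)])
    X = np.column_stack([np.ones(len(X)), X])        # intercept + lagged loads
    y = load[n_lags:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # fit the dynamic regression
    resid = y - X @ beta
    flags = np.zeros(len(y), dtype=bool)
    for t in range(window, len(y)):
        sigma = resid[t - window:t].std()            # adaptive: spread of recent residuals
        flags[t] = abs(resid[t]) > k * sigma
    return flags

rng = np.random.default_rng(1)
t = np.arange(2000)
load = 100 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 1, t.size)
load[1500] += 25                                     # inject one bad telemetry value
print("flagged indices:", np.flatnonzero(detect_load_anomalies(load)))
```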

  11. A Bayesian approach to traffic light detection and mapping

    NASA Astrophysics Data System (ADS)

    Hosseinyalamdary, Siavash; Yilmaz, Alper

    2017-03-01

    Automatic traffic light detection and mapping is an open research problem. Traffic lights vary in color, shape, geolocation, activation pattern, and installation, which complicates their automated detection. In addition, images of traffic lights may be noisy, overexposed, underexposed, or occluded. In order to address this problem, we propose a Bayesian inference framework to detect and map traffic lights. In addition to the spatio-temporal consistency constraint, traffic light characteristics such as color, shape and height are shown to further improve the accuracy of the proposed approach. The proposed approach has been evaluated on two benchmark datasets and has been shown to outperform earlier studies. The results show that the precision and recall rates for the KITTI benchmark are 95.78% and 92.95%, respectively, and the precision and recall rates for the LARA benchmark are 98.66% and 94.65%.
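
    One generic way to realize such a Bayesian detect-and-map loop is a recursive log-odds update over candidate map locations, accumulating per-frame detector evidence. The sensor-model probabilities below are assumed numbers, and the paper's formulation, which also fuses color, shape, and height cues, is richer than this sketch.

```python
import numpy as np

# Assumed sensor model: detector hit probability given that a traffic light
# is (or is not) present at a candidate map location.
P_HIT_LIGHT, P_HIT_EMPTY = 0.9, 0.1

def update_map(log_odds, detected):
    """Recursive Bayesian (log-odds) update over candidate light locations."""
    hit_evidence = np.log(P_HIT_LIGHT / P_HIT_EMPTY)                # hit supports a light
    miss_evidence = np.log((1 - P_HIT_LIGHT) / (1 - P_HIT_EMPTY))   # miss counts against
    return log_odds + np.where(detected, hit_evidence, miss_evidence)

def posterior(log_odds):
    return 1.0 / (1.0 + np.exp(-log_odds))

# Simulate 30 frames over 5 candidate locations; only location 2 holds a light.
rng = np.random.default_rng(0)
truth = np.array([False, False, True, False, False])
log_odds = np.zeros(5)                               # uninformative prior, p = 0.5
for _ in range(30):
    detected = np.where(truth, rng.random(5) < P_HIT_LIGHT,
                        rng.random(5) < P_HIT_EMPTY)
    log_odds = update_map(log_odds, detected)

print(np.round(posterior(log_odds), 3))              # mass concentrates on location 2
```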

  12. Adopting the sensemaking perspective for chronic disease self-management.

    PubMed

    Mamykina, Lena; Smaldone, Arlene M; Bakken, Suzanne R

    2015-08-01

    Self-monitoring is an integral component of the management of many chronic diseases; however, few theoretical frameworks address how individuals understand self-monitoring data and use them to guide self-management. The objective of this work was to articulate a theoretical framework of sensemaking in diabetes self-management that integrates existing scholarship with empirical data. The proposed framework is grounded in theories of sensemaking adopted from organizational behavior, education, and human-computer interaction. To empirically validate the framework, the researchers reviewed and analyzed reports on qualitative studies of diabetes self-management practices published in peer-reviewed journals from 2000 to 2015. The proposed framework distinguishes between sensemaking and habitual modes of self-management and identifies three essential sensemaking activities: perception of new information related to health and wellness, development of inferences that inform the selection of actions, and carrying out daily activities in response to new information. The analysis of qualitative findings from 50 published reports provided ample empirical evidence for the proposed framework; however, it also identified a number of barriers to engaging in sensemaking in diabetes self-management. The proposed framework suggests new directions for research in diabetes self-management and for the design of new informatics interventions for data-driven self-management. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Geometrothermodynamic model for the evolution of the Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruber, Christine; Quevedo, Hernando, E-mail: christine.gruber@correo.nucleares.unam.mx, E-mail: quevedo@nucleares.unam.mx

    Using the formalism of geometrothermodynamics to derive a fundamental thermodynamic equation, we construct a cosmological model in the framework of relativistic cosmology. In a first step, we describe a system without thermodynamic interaction, and show it to be equivalent to the standard ΛCDM paradigm. The second step includes thermodynamic interaction and produces a model consistent with the main features of inflation. With the proposed fundamental equation we are thus able to describe all the known epochs in the evolution of our Universe, starting from the inflationary phase.

  14. Holographic scalar fields in Kaluza-Klein framework

    NASA Astrophysics Data System (ADS)

    Erkan, Sevda; Pirinccioglu, Nurettin; Salti, Mustafa; Aydogdu, Oktay

    2017-12-01

    Making use of the Friedmann-Robertson-Walker (FRW) type Kaluza-Klein universe (KKU), we discuss the holographic dark energy density (HDED) in order to develop its correspondence with some scalar field descriptions such as the tachyon, quintessence, DBI-essence, dilaton and the k-essence. It is concluded that the Kaluza-Klein-type HDED proposal becomes stable throughout the history of our universe and is consistent with the current status of the universe. Next, we obtain the exact solutions of self-interacting potential and scalar field function for the selected models.

  15. NASA Astrophysics Data System (ADS)

    2018-05-01

    Eigenvalues and eigenvectors together constitute the eigenstructure of a system. The design of vibrating systems aimed at satisfying specifications on eigenvalues and eigenvectors, commonly known as eigenstructure assignment, has drawn increasing interest in recent years. The most natural mathematical framework for such problems is that of inverse eigenproblems, which consist in determining the system model that features a desired set of eigenvalues and eigenvectors. Although such a problem is intrinsically challenging, several solutions have been proposed in the literature. Approaches to eigenstructure assignment can be broadly divided into passive control and active control.
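
    As a minimal, illustrative instance of such an inverse eigenproblem (ignoring the symmetry and mass/stiffness structure a real vibrating system imposes), a first-order system matrix realising a desired eigenstructure can be reconstructed directly from the spectral decomposition; all values below are made up for the example.

    ```python
    import numpy as np

    # Toy inverse eigenproblem: given desired eigenvalues and linearly
    # independent eigenvectors, build a matrix A with A v_i = lambda_i v_i.

    desired_eigenvalues = np.array([-1.0, -3.0, -5.0])   # desired spectrum
    V = np.array([[1.0, 1.0, 0.0],                       # desired eigenvectors
                  [0.0, 1.0, 1.0],                       # (columns of V)
                  [1.0, 0.0, 1.0]])

    A = V @ np.diag(desired_eigenvalues) @ np.linalg.inv(V)

    # Check: the recovered eigenvalues match the assigned ones.
    print(np.sort(np.linalg.eigvals(A).real))            # -> [-5., -3., -1.]
    ```

    A practical assignment method must additionally respect symmetry, definiteness, and the second-order mass/stiffness structure, which is what makes the problem challenging.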

  16. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and dangerous situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  17. Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas

    PubMed Central

    Meynard, Jean-Baptiste; Chaudet, Herve; Green, Andrew D; Jefferson, Henry L; Texier, Gaetan; Webber, Daniel; Dupuy, Bruce; Boutin, Jean-Paul

    2008-01-01

    Background: In recent years a wide variety of epidemiological surveillance systems have been developed to provide early identification of outbreaks of infectious disease. Each system has had its own strengths and weaknesses. In 2002 a Working Group of the Centers for Disease Control and Prevention (CDC) produced a framework for evaluation, which proved suitable for many public health surveillance systems. However, this did not easily adapt to the military setting, where by necessity a variety of different parameters are assessed, different constraints are placed on the systems, and different objectives are required. This paper describes a proposed framework for the evaluation of military syndromic surveillance systems designed to detect outbreaks of disease on operational deployments. Methods: The new framework described in this paper was developed from the cumulative experience of British and French military syndromic surveillance systems. The methods included a general assessment framework (CDC), followed by more specific methods of conducting evaluation. These included knowledge/attitude/practice (KAP) surveys, technical audits, ergonomic studies, simulations and multi-national exercises. A variety of military constraints required integration into the evaluation. Examples include the variability of geographical conditions in the field, deployment to areas without prior knowledge of naturally-occurring disease patterns, differences in field sanitation between locations and over the length of deployment, the mobility of military forces, turnover of personnel, continuity of surveillance across different locations, integration with surveillance systems from other nations working alongside each other, compatibility with non-medical information systems, and security. Results: A framework for evaluation has been developed that can be used for military surveillance systems in a staged manner consisting of initial, intermediate and final evaluations. For each stage of the process, parameters for assessment have been defined and methods identified. Conclusion: The combined experiences of French and British syndromic surveillance systems developed for use in deployed military forces have allowed the development of a specific evaluation framework. The tool is suitable for use by all nations that wish to evaluate syndromic surveillance in their own military forces. It could also be useful for civilian mobile systems or for national security surveillance systems. PMID:18447944

  18. Ecosystem Services: a Framework for Environmental Management of the Deep Sea

    NASA Astrophysics Data System (ADS)

    Le, J. T.; Levin, L. A.; Carson, R. T.

    2016-02-01

    As demand for deep-sea resources rapidly expands in the food, energy, mineral, and pharmaceutical sectors, it has become increasingly clear that a regulatory structure for extracting these resources is not yet in place. There are jurisdictional gaps and a lack of regulatory consistency regarding what aspects of the deep sea need protection and what requirements might help guarantee that protection. Given the mining sector's intent to exploit seafloor massive sulphides, Mn nodules, cobalt crusts, and phosphorites in the coming years, there is an urgent need for deep-ocean environmental management. Here, we propose an ecosystem services-based framework to inform decisions and best practices regarding resource exploitation, and to guide baseline studies, preventative actions, monitoring, and remediation. With policy in early stages of development, an ecosystem services approach has the potential to serve as an overarching framework that takes protection of natural capital provided by the environment into account during the decision-making process. We show how an ecosystem services approach combined with economic tools, such as benefit transfer techniques, should help illuminate issues where there are direct conflicts among different industries, and between industry and conservation. We argue for baseline and monitoring measurements and metrics that inform about deep-sea ecosystem services that would be impaired by mining, and discuss ways to incorporate the value of those losses into decision making, mitigation measures, and ultimately product costs. This proposal is considered relative to current International Seabed Authority recommendations and contractor practices, and new actions are proposed. An ecosystem services-based understanding of how these systems work and their value to society can improve sustainability and stewardship of the deep ocean.

  19. Psychological theory and pedagogical effectiveness: the learning promotion potential framework.

    PubMed

    Tomlinson, Peter

    2008-12-01

    After a century of educational psychology, eminent commentators are still lamenting problems besetting the appropriate relating of psychological insights to teaching design, a situation not helped by the persistence of crude assumptions concerning the nature of pedagogical effectiveness. The aim is to propose an analytical or meta-theoretical framework based on the concept of learning promotion potential (LPP) as a basis for understanding the basic relationship between psychological insights and teaching strategies, and to draw out implications for psychology-based pedagogical design, development and research. This is a theoretical and meta-theoretical paper relying mainly on conceptual analysis, though also calling on psychological theory and research. Since teaching consists essentially in activity designed to promote learning, it follows that a teaching strategy has the potential in principle to achieve particular kinds of learning gains (its LPP) to the extent that it embodies or stimulates the relevant learning processes on the part of learners and enables the teacher's functions of on-line monitoring and assistance for such learning processes. Whether a teaching strategy actually realizes its LPP by way of achieving its intended learning goals depends also on the quality of its implementation, in conjunction with other factors in the situated interaction that teaching always involves. The core role of psychology is to provide a well-grounded indication of the nature of such learning processes and the teaching functions that support them, rather than to directly generate particular ways of teaching. A critically eclectic stance towards potential sources of psychological insight is argued for. Applying this framework, the paper proposes five kinds of issue to be attended to in the design and evaluation of psychology-based pedagogy. Other work proposing comparable ideas is briefly reviewed, with particular attention to similarities and a key difference with the ideas of Oser and Baeriswyl (2001).

  20. Conditional Stochastic Models in Reduced Space: Towards Efficient Simulation of Tropical Cyclone Precipitation Patterns

    NASA Astrophysics Data System (ADS)

    Dodov, B.

    2017-12-01

    Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (downscaled reanalysis, 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs are then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure, and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models, chosen according to the TC characteristics at a given moment in time, is concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10,000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with the observed regional climate and visually indistinguishable from high-resolution NWP output. The framework is used to simulate a catalog of 10,000 typhoon seasons implemented in a flood risk model for Japan.
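
    The core sampling step can be made concrete in a few lines. Under assumed shapes and a made-up group covariance (the paper's grouping, EOFs, and non-stationary transition models are not reproduced), drawing one reduced-space sample and mapping it back to a spatial pattern looks like this:

    ```python
    import numpy as np

    # Hedged sketch of reduced-space sampling: draw principal-component
    # coefficients from a group-specific covariance model, then map back
    # through the leading EOFs to a precipitation pattern. All shapes and
    # values are illustrative assumptions.

    rng = np.random.default_rng(0)
    n_grid, n_pc = 500, 4                       # grid cells, retained components

    eofs = rng.standard_normal((n_grid, n_pc))  # stand-in for leading EOFs
    mean_field = np.zeros(n_grid)

    # Group-specific covariance of PC coefficients (e.g. "mature stage").
    L = rng.standard_normal((n_pc, n_pc))
    group_cov = L @ L.T + n_pc * np.eye(n_pc)   # symmetric positive definite

    def sample_pattern():
        z = np.linalg.cholesky(group_cov) @ rng.standard_normal(n_pc)
        return mean_field + eofs @ z            # back to physical space

    pattern = sample_pattern()
    print(pattern.shape)                        # (500,)
    ```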

  1. An Efficient and Adaptive Mutual Authentication Framework for Heterogeneous Wireless Sensor Network-Based Applications

    PubMed Central

    Kumar, Pardeep; Ylianttila, Mika; Gurtov, Andrei; Lee, Sang-Gon; Lee, Hoon-Jae

    2014-01-01

    Robust security is highly coveted in real wireless sensor network (WSN) applications, since wireless sensors sense critical data from the application environment. This article presents an efficient and adaptive mutual authentication framework that suits real heterogeneous WSN-based applications (such as smart homes, industrial environments, smart grids, and healthcare monitoring). The proposed framework offers: (i) key initialization; (ii) secure network (cluster) formation (i.e., mutual authentication and dynamic key establishment); (iii) key revocation; and (iv) new node addition into the network. The correctness of the proposed scheme is formally verified. An extensive analysis shows that the proposed scheme provides message confidentiality, mutual authentication, dynamic session key establishment, node privacy, and message freshness. Moreover, the preliminary study also reveals that the proposed framework is secure against popular types of attacks, such as impersonation attacks, man-in-the-middle attacks, replay attacks, and information-leakage attacks. As a result, we believe the proposed framework achieves efficiency at reasonable computation and communication costs and can be a safeguard for real heterogeneous WSN applications. PMID:24521942

  2. An efficient and adaptive mutual authentication framework for heterogeneous wireless sensor network-based applications.

    PubMed

    Kumar, Pardeep; Ylianttila, Mika; Gurtov, Andrei; Lee, Sang-Gon; Lee, Hoon-Jae

    2014-02-11

    Robust security is highly coveted in real wireless sensor network (WSN) applications, since wireless sensors sense critical data from the application environment. This article presents an efficient and adaptive mutual authentication framework that suits real heterogeneous WSN-based applications (such as smart homes, industrial environments, smart grids, and healthcare monitoring). The proposed framework offers: (i) key initialization; (ii) secure network (cluster) formation (i.e., mutual authentication and dynamic key establishment); (iii) key revocation; and (iv) new node addition into the network. The correctness of the proposed scheme is formally verified. An extensive analysis shows that the proposed scheme provides message confidentiality, mutual authentication, dynamic session key establishment, node privacy, and message freshness. Moreover, the preliminary study also reveals that the proposed framework is secure against popular types of attacks, such as impersonation attacks, man-in-the-middle attacks, replay attacks, and information-leakage attacks. As a result, we believe the proposed framework achieves efficiency at reasonable computation and communication costs and can be a safeguard for real heterogeneous WSN applications.
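
    Neither record details the handshake itself; the sketch below shows one generic way the mutual authentication and dynamic session key steps listed above can be realised with a pre-shared key and HMAC challenge-response. Key sizes, nonce layout, and labels are illustrative assumptions, not the authors' protocol.

    ```python
    import hmac, hashlib, os

    # Generic HMAC challenge-response between a node and a cluster head,
    # followed by derivation of a per-session key. Illustration only.

    PSK = os.urandom(32)             # pre-shared key from key initialization

    def respond(key, challenge):
        return hmac.new(key, challenge, hashlib.sha256).digest()

    # Each side issues a fresh nonce and proves knowledge of the PSK.
    n_node, n_head = os.urandom(16), os.urandom(16)
    tag_node = respond(PSK, n_head + b"node")   # node answers head's nonce
    tag_head = respond(PSK, n_node + b"head")   # head answers node's nonce

    assert hmac.compare_digest(tag_node, respond(PSK, n_head + b"node"))
    assert hmac.compare_digest(tag_head, respond(PSK, n_node + b"head"))

    # Both sides can now derive the same dynamic session key from the nonces.
    session_key = hmac.new(PSK, n_node + n_head, hashlib.sha256).digest()
    ```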

  3. Tetraquark mixing framework for isoscalar resonances in light mesons

    NASA Astrophysics Data System (ADS)

    Kim, Hungchong; Kim, K. S.; Cheoun, Myung-Ki; Oka, Makoto

    2018-05-01

    Recently, a tetraquark mixing framework has been proposed for light mesons and applied more or less successfully to the isovector resonances a0(980), a0(1450), as well as to the isodoublet resonances K0*(800), K0*(1430). In this work, we present a more extensive view of the mixing framework and apply it to the isoscalar resonances f0(500), f0(980), f0(1370), f0(1500). Tetraquarks in this framework can have two spin configurations, containing either a spin-0 diquark or a spin-1 diquark, and each configuration forms a nonet in flavor space. The two spin configurations are found to mix strongly through the color-spin interactions. Their mixtures, which diagonalize the hyperfine masses, can generate the physical resonances constituting two nonets, which in fact coincide roughly with experimental observation. We identify f0(500), f0(980) as the isoscalar members of the light nonet, and f0(1370), f0(1500) as the corresponding members of the heavy nonet. This means that the spin configuration mixing, as it relates the corresponding members of the two nonets, generates the pairs f0(500), f0(1370) and f0(980), f0(1500). A complication arises because the isoscalar members of each nonet are subject to an additional flavor mixing, known as the Okubo-Zweig-Iizuka rule, so that f0(500), f0(980), and similarly f0(1370), f0(1500), are mixtures of two isoscalar members belonging to an octet and a singlet in SUf(3). The tetraquark mixing framework including the flavor mixing is tested for the isoscalar resonances in terms of the mass splitting and the fall-apart decay modes. The mass splitting among the isoscalar resonances is found to be qualitatively consistent with their hyperfine mass splitting, strongly driven by the spin configuration mixing, which suggests that the tetraquark mixing framework works. The fall-apart modes from our tetraquarks also seem to be consistent with the experimental modes. We also discuss the possible existence of spin-1 tetraquarks that can be constructed from the spin-1 diquark.

  4. Distance Learning Courses on the Web: The Authoring Approach.

    ERIC Educational Resources Information Center

    Santos, Neide; Diaz, Alicia; Bibbo, Luis Mariano

    This paper proposes a framework for supporting the authoring process of distance learning courses. An overview of distance learning courses and the World Wide Web is presented. The proposed framework is then described, including: (1) components of the framework--a hypermedia design methodology for authoring the course, links to related Web sites,…

  5. 77 FR 33683 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security, U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...

  6. A Framework for Spatial Interaction Analysis Based on Large-Scale Mobile Phone Data

    PubMed Central

    Li, Weifeng; Cheng, Xiaoyun; Guo, Gaohua

    2014-01-01

    An overall understanding of spatial interaction and exact knowledge of its dynamic evolution are required in urban and transportation planning. This study aimed to analyze spatial interaction based on large-scale mobile phone data. This newly arisen mass dataset required a new methodology compatible with its peculiar characteristics. A three-stage framework was proposed in this paper, comprising data preprocessing, critical activity identification, and spatial interaction measurement. The proposed framework introduced frequent pattern mining and measured spatial interaction by the obtained associations. A case study of three communities in Shanghai was carried out as verification of the proposed method and a demonstration of its practical application. The spatial interaction patterns and the representative features demonstrated the rationality of the proposed framework. PMID:25435865

  7. Disease Management, Case Management, Care Management, and Care Coordination: A Framework and a Brief Manual for Care Programs and Staff.

    PubMed

    Ahmed, Osman I

    2016-01-01

    With the changing landscape of health care delivery in the United States since the passage of the Patient Protection and Affordable Care Act in 2010, health care organizations have struggled to keep pace with the evolving paradigm, particularly as it pertains to population health management. New nomenclature emerged to describe components of the new environment, and familiar words were put to use in an entirely different context. This article proposes a working framework for activities performed in case management, disease management, care management, and care coordination. The author offers standard working definitions for some of the most frequently used words in the health care industry, with the goal of increasing consistency in their use, especially against the backdrop of the Centers for Medicare & Medicaid Services offering a "chronic care management fee" to primary care providers for managing the sickest, high-cost Medicare patients. The framework is intended for health care organizations performing case management, care management, disease management, and care coordination, and offers a road map for consistency among users, in reporting and comparison, and for the success of care management/coordination programs. This article offers a working framework for disease managers, case and care managers, and care coordinators, and suggests standard definitions for disease management, case management, care management, and care coordination. Moreover, the use of clear terminology will facilitate comparing, contrasting, and evaluating all care programs and increase consistency. The article can improve understanding of care program components and success factors, help estimate program value and effectiveness, heighten awareness of consumer engagement tools, clarify the current state of and challenges for care programs, explain the role of health information technology solutions in care programs, and help readers use the information and knowledge gained to assess and improve care programs and design the "next generation" of programs.

  8. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems, and modeling them requires broader considerations than modeling clinical interventions. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and of existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting of a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources.

  9. Evaluation Framework for NASA's Educational Outreach Programs

    NASA Technical Reports Server (NTRS)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to perform a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  10. Three Sets of Case Studies Suggest Logic and Consistency Challenges with Value Frameworks.

    PubMed

    Cohen, Joshua T; Anderson, Jordan E; Neumann, Peter J

    2017-02-01

    To assess the logic and consistency of three prominent value frameworks, we reviewed the value frameworks from three organizations: the Memorial Sloan Kettering Cancer Center (DrugAbacus), the American Society of Clinical Oncology, and the Institute for Clinical and Economic Review. For each framework, we developed case studies to explore the degree to which the frameworks have face validity, in the sense that they are consistent with four important principles: value should be proportional to a therapy's benefit; components of value should matter to framework users (patients and payers); attribute weights should reflect user preferences; and value estimates used to inform therapy prices should reflect per-person benefit. All three frameworks can aid decision making by elucidating factors not explicitly addressed by conventional evaluation techniques (in particular, cost-effectiveness analyses). Our case studies identified four challenges: 1) value is not always proportional to benefit; 2) value reflects factors that may not be relevant to framework users (patients or payers); 3) attribute weights do not necessarily reflect user preferences or relate to value in ways that are transparent; and 4) value does not reflect per-person benefit. Although the value frameworks we reviewed capture value in ways that are important to various audiences, they are not always logical or consistent. Because these frameworks may have a growing influence on therapy access, it is imperative that these analytic challenges be further explored.

  11. A New Framework of Removing Salt and Pepper Impulse Noise for the Noisy Image Including Many Noise-Free White and Black Pixels

    NASA Astrophysics Data System (ADS)

    Li, Song; Wang, Caizhu; Li, Yeqiu; Wang, Ling; Sakata, Shiro; Sekiya, Hiroo; Kuroiwa, Shingo

    In this paper, we propose a new framework for removing salt and pepper impulse noise. In our proposed framework, the most important point is that the number of noise-free white and black pixels in a noisy image can be determined using the noise rates estimated by the Fuzzy Impulse Noise Detection and Reduction Method (FINDRM) and the Efficient Detail-Preserving Approach (EDPA). When the noisy image includes many noise-free white and black pixels, the pixels detected as noisy by the FINDRM are re-checked using the alpha-trimmed mean. Finally, the impulse noise filtering phase of the FINDRM is used to restore the image. Simulation results show that for noisy images including many noise-free white and black pixels, the proposed framework decreases the False Hit Rate (FHR) effectively compared with the FINDRM. Therefore, the proposed framework can be used more widely than the FINDRM.
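
    The alpha-trimmed mean re-check is straightforward to state concretely. The sketch below is an illustrative version assuming a 3x3 window, alpha = 2, and an arbitrary acceptance threshold; the paper's exact parameters are not reproduced here.

    ```python
    import numpy as np

    # Sketch: a pixel flagged as impulse noise is kept as "noise-free" if
    # its value is close to the alpha-trimmed mean of its neighbourhood.

    def alpha_trimmed_mean(window, alpha=2):
        """Mean of the sorted window after dropping `alpha` lows and highs."""
        v = np.sort(window.ravel())
        return v[alpha:len(v) - alpha].mean()

    def recheck_pixel(image, r, c, threshold=40):
        win = image[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        return abs(float(image[r, c]) - alpha_trimmed_mean(win)) < threshold

    img = np.array([[250, 252, 249],
                    [251, 255, 250],   # centre pixel: white, like its neighbours
                    [248, 250, 253]], dtype=np.uint8)
    print(recheck_pixel(img, 1, 1))    # True -> treat as noise-free white pixel
    ```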

  12. Back to first principles: a new model for the regulation of drug promotion

    PubMed Central

    Bennett, Alan; Jiménez, Freddy; Fields, Larry Eugene; Oyster, Joshua

    2015-01-01

    The US Food and Drug Administration's (‘FDA’ or the ‘Agency’) current regulatory framework for drug promotion, by significantly restricting the ability of drug manufacturers to communicate important, accurate, up-to-date scientific information about their products that is truthful and non-misleading, runs afoul of the First Amendment and actually runs counter to the Agency's public health mission. Our article proposes a New Model that represents an initial proposal for a modern, sustainable regulatory framework that comprehensively addresses drug promotion while protecting the public health, protecting manufacturers’ First Amendment rights, establishing clear and understandable rules, and maintaining the integrity of the FDA approval process. The New Model would create three categories of manufacturer communications—(1) Scientific Exchange and Other Exempt Communications, (2) Non-Core Communications, and (3) Core Communications—that would be regulated consistent with the First Amendment and according to the strength of the government's interest in regulating the specific communications included within each category. The New Model should address the FDA's concerns related to off-label speech while protecting drug manufacturers’ freedom to engage in truthful and non-misleading communications about their products. PMID:27774195

  13. Integrated chassis control for a three-axle electric bus with distributed driving motors and active rear steering system

    NASA Astrophysics Data System (ADS)

    Liu, Wei; He, Hongwen; Sun, Fengchun; Lv, Jiangyi

    2017-05-01

    This paper describes an integrated chassis control framework for a novel three-axle electric bus with an active rear steering (ARS) axle and four motors at the middle and rear wheels. The proposed integrated framework consists of four parts: (1) an active speed-limiting controller designed to prevent body slip and rollover; (2) an ARS controller designed to coordinate tyre wear between the driving wheels; (3) an inter-axle torque distribution controller designed for optimal torque distribution between the axles, considering anti-wheel-slip and battery power limitations; and (4) a data acquisition and estimation module for collecting the measured and estimated vehicle states. To verify the performance, a simulation platform is established in TruckSim combined with Simulink. Three test cases are designed to demonstrate the performance, and the proposed algorithm is compared with a simple even-distribution control algorithm. The test results show satisfactory lateral stability and rollover prevention performance under severe steering conditions. The desired tyre-wear coordination is also realised, and the wheel slip ratios are restricted within the stable region during intensive driving and emergency braking under complicated road conditions.

  14. Using Individual GPS Trajectories to Explore Foodscape Exposure: A Case Study in Beijing Metropolitan Area

    PubMed Central

    Zhang, Shuhua; Ma, Jinsong

    2018-01-01

    With the growing interest in studying the characteristics of people’s access to the food environment and its influence upon individual health, there has been a focus on assessing individual food exposure based on GPS trajectories. However, existing studies have largely focused on the overall activity space using short-period trajectories, which ignores the complexity of human movements and the heterogeneity of the spaces that are experienced by the individual over daily life schedules. In this study, we propose a novel framework to extract the exposure areas consisting of the localized activity spaces around daily life centers and non-motorized commuting routes from long-term GPS trajectories. The newly proposed framework is individual-specific and can incorporate the internal heterogeneity of individual activities (spatial extent, stay duration, and timing) in different places as well as the dynamics of the context. A pilot study of the GeoLife dataset suggests that there are significant variations in the magnitude as well as the composition of the food environment in different parts of the individual exposure area, and residential environment is not representative of the overall foodscape exposure. PMID:29495449

  15. Design of a SIP device cooperation system on OSGi service platforms

    NASA Astrophysics Data System (ADS)

    Takayama, Youji; Koita, Takahiro; Sato, Kenya

    2007-12-01

    Home networks feature various technologies such as protocols, specifications, and middleware, including HTTP, UPnP, and Jini. A service platform is required to handle such technologies and enable different devices to cooperate. The OSGi service platform, which meets these requirements based on a service-oriented architecture, is designed and standardized by the OSGi Alliance and consists of two parts: the OSGi Framework and bundles. On the OSGi service platform, APIs that can handle these technologies are defined as services and implemented in bundles. By using the OSGi Framework with bundles, various technologies can cooperate with each other. On the other hand, in IP networks, the Session Initiation Protocol (SIP) is often used in device cooperation services to resolve an IP address, control a session between two or more devices, and easily exchange the statuses of devices. However, many existing devices do not support SIP, so SIP alone cannot provide device cooperation services for them. We call a device that does not support SIP an unSIP device. This paper proposes and implements a prototype system that enables unSIP devices to participate in SIP-based device cooperation services.

  16. On learning navigation behaviors for small mobile robots with reservoir computing architectures.

    PubMed

    Antonelo, Eric Aislan; Schrauwen, Benjamin

    2015-04-01

    This paper proposes a general reservoir computing (RC) learning framework that can be used to learn navigation behaviors for mobile robots in simple and complex unknown partially observable environments. RC provides an efficient way to train recurrent neural networks by letting the recurrent part of the network (called reservoir) be fixed while only a linear readout output layer is trained. The proposed RC framework builds upon the notion of navigation attractor or behavior that can be embedded in the high-dimensional space of the reservoir after learning. The learning of multiple behaviors is possible because the dynamic robot behavior, consisting of a sensory-motor sequence, can be linearly discriminated in the high-dimensional nonlinear space of the dynamic reservoir. Three learning approaches for navigation behaviors are shown in this paper. The first approach learns multiple behaviors based on the examples of navigation behaviors generated by a supervisor, while the second approach learns goal-directed navigation behaviors based only on rewards. The third approach learns complex goal-directed behaviors, in a supervised way, using a hierarchical architecture whose internal predictions of contextual switches guide the sequence of basic navigation behaviors toward the goal.
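
    The defining trait of RC, training only a linear readout on top of a fixed random recurrent network, fits in a few lines. The following minimal echo-state-network sketch uses assumed sizes, scalings, and a toy delay task for illustration; it does not reproduce the paper's robot navigation setup.

    ```python
    import numpy as np

    # Minimal echo state network: the reservoir (W_in, W) stays fixed and
    # random; only the linear readout W_out is fit, here by ridge regression.

    rng = np.random.default_rng(1)
    n_in, n_res, T = 1, 100, 1000

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.standard_normal((n_res, n_res))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # spectral radius below 1

    u = rng.uniform(-1, 1, (T, n_in))            # input stream
    y = np.roll(u[:, 0], 3)                      # toy target: input delayed 3 steps

    x = np.zeros(n_res)
    states = np.empty((T, n_res))
    for t in range(T):                           # run the fixed reservoir
        x = np.tanh(W_in @ u[t] + W @ x)
        states[t] = x

    ridge = 1e-6                                 # train the linear readout only
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                            states.T @ y)
    print(np.mean((states[100:] @ W_out - y[100:]) ** 2))  # readout MSE after washout
    ```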

  17. Evolving effective behaviours to interact with tag-based populations

    NASA Astrophysics Data System (ADS)

    Yucel, Osman; Crawford, Chad; Sen, Sandip

    2015-07-01

    Tags and other externally perceptible characteristics that are consistent among groups of animals or humans can be used by others to determine appropriate response strategies in societies. This usage of tags can be extended to artificial environments, where agents can significantly reduce the cognitive effort spent on strategy choice and behaviour selection by reusing strategies for interacting with new partners based on their tags. Strategy selection mechanisms developed from this idea have successfully evolved stable cooperation in games such as the Prisoner's Dilemma, but rely upon payoff sharing and matching methods that limit the applicability of the tag framework. Our goal is to develop a general classification and behaviour selection approach based on the tag framework. We propose and evaluate alternative tag matching and adaptation schemes for a new, incoming individual to select appropriate behaviour against any population member of an existing, stable society. Our proposed approach allows agents to evolve both the optimal tag for the environment and appropriate strategies for existing agent groups. We show that these mechanisms allow for robust selection of optimal strategies by agents entering a stable society and analyse the various environments where this approach is effective.

  18. 78 FR 4382 - Proposed Foreign-Trade Zone-Northwest Iowa; Under Alternative Site Framework

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    ... DEPARTMENT OF COMMERCE Foreign-Trade Zones Board [B-4-2013] Proposed Foreign-Trade Zone--Northwest Iowa; Under Alternative Site Framework An application has been submitted to the Foreign-Trade Zones... alternative site framework (ASF) adopted by the Board (15 CFR 400.2(c)). The ASF is an option for grantees for...

  19. Teacher Identity and Numeracy: Developing an Analytic Lens for Understanding Numeracy Teacher Identity

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2013-01-01

    This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…

  20. Model-Based Policymaking: A Framework to Promote Ethical “Good Practice” in Mathematical Modeling for Public Health Policymaking

    PubMed Central

    Boden, Lisa A.; McKendrick, Iain J.

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical “good practice” and are thus “fit for purpose” as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science–policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy. PMID:28424768

  1. A cyber-event correlation framework and metrics

    NASA Astrophysics Data System (ADS)

    Kang, Myong H.; Mayfield, Terry

    2003-08-01

    In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of existing cyber-defense capabilities and to help evaluate the state of cyber-event correlation research and where future research should focus. The framework, based on cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps these richer contexts are achieved through analytical activities (situation assessment, and threat analysis and prediction). Category 1 correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 correlation clusters the same or similar events from multiple detectors located in close proximity and prioritizes them. Category 3 correlates events from different time periods and from event sources at different locations/regions to recognize the relationships among them; this is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber-asset damage and analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.

  2. Dynamic and scalable audio classification by collective network of binary classifiers framework: an evolutionary approach.

    PubMed

    Kiranyaz, Serkan; Mäkinen, Toni; Gabbouj, Moncef

    2012-10-01

    In this paper, we propose a novel framework based on a collective network of evolutionary binary classifiers (CNBC) to address the problems of feature and class scalability. The main goal of the proposed framework is to achieve high classification performance over dynamic audio and video repositories. The proposed framework adopts a "divide and conquer" approach in which an individual network of binary classifiers (NBC) is allocated to discriminate each audio class. An evolutionary search is applied to find the best binary classifier in each NBC with respect to a given criterion. Through incremental evolution sessions, the CNBC framework can dynamically adapt to each new incoming class or feature set without resorting to full-scale re-training or re-configuration. The CNBC framework is therefore particularly suited to dynamically varying databases, where no conventional static classifier can adapt to such changes. In short, it is an entirely novel topology and an unprecedented approach for dynamic, content/data-adaptive and scalable audio classification. A large set of audio features can be effectively used in the framework, where the CNBCs make appropriate selections and combinations so as to achieve the highest discrimination among individual audio classes. Experiments demonstrate a high classification accuracy (above 90%) and the efficiency of the proposed framework over large and dynamic audio databases.

  3. Principles for consistent value assessment and sustainable funding of orphan drugs in Europe.

    PubMed

    Gutierrez, Laura; Patris, Julien; Hutchings, Adam; Cowell, Warren

    2015-05-03

    The European Orphan Medicinal Products (OMP) Regulation has successfully encouraged research to develop treatments for rare diseases resulting in the authorisation of new OMPs in Europe. While decisions on OMP designation and marketing authorisation are made at the European Union level, reimbursement decisions are made at the national level. OMP value and affordability are high priority issues for policymakers and decisions regarding their pricing and funding are highly complex. There is currently no European consensus on how OMP value should be assessed and inequalities of access to OMPs have previously been observed. Against this background, policy makers in many countries are considering reforms to improve access to OMPs. This paper proposes ten principles to be considered when undertaking such reforms, from the perspective of an OMP manufacturer. We recommend the continued prioritisation of rare diseases by policymakers, an increased alignment between payer and regulatory frameworks, pricing centred on OMP value, and mechanisms to ensure long-term financial sustainability allowing a continuous and virtuous development of OMPs. Our recommendations support the development of more consistent frameworks and encourage collaboration between all stakeholders, including research-based industry, payers, clinicians, and patients.

  4. Development by Design in Colombia: Making Mitigation Decisions Consistent with Conservation Outcomes

    PubMed Central

    Saenz, Shirley; Walschburger, Tomas; González, Juan Carlos; León, Jorge; McKenney, Bruce; Kiesecker, Joseph

    2013-01-01

    Mitigation policy and regulatory frameworks are consistent in their strong support for the mitigation hierarchy of: (1) avoiding impacts, (2) minimizing impacts, and then (3) offsetting/compensating for residual impacts. While mitigation frameworks require developers to avoid, minimize and restore biodiversity on-site before considering an offset for residual impacts, there is a lack of quantitative guidance for this decision-making process. What are the criteria for requiring impacts be avoided altogether? Here we examine how conservation planning can guide the application of the mitigation hierarchy to address this issue. In support of the Colombian government's aim to improve siting and mitigation practices for planned development, we examined five pilot projects in landscapes expected to experience significant increases in mining, petroleum and/or infrastructure development. By blending landscape-level conservation planning with application of the mitigation hierarchy, we can proactively identify where proposed development and conservation priorities would be in conflict and where impacts should be avoided. The approach we outline here has been adopted by the Colombian Ministry of Environment and Sustainable Development to guide licensing decisions, avoid piecemeal licensing, and promote mitigation decisions that maintain landscape condition. PMID:24339972

  5. Development by design in Colombia: making mitigation decisions consistent with conservation outcomes.

    PubMed

    Saenz, Shirley; Walschburger, Tomas; González, Juan Carlos; León, Jorge; McKenney, Bruce; Kiesecker, Joseph

    2013-01-01

    Mitigation policy and regulatory frameworks are consistent in their strong support for the mitigation hierarchy of: (1) avoiding impacts, (2) minimizing impacts, and then (3) offsetting/compensating for residual impacts. While mitigation frameworks require developers to avoid, minimize and restore biodiversity on-site before considering an offset for residual impacts, there is a lack of quantitative guidance for this decision-making process. What are the criteria for requiring impacts be avoided altogether? Here we examine how conservation planning can guide the application of the mitigation hierarchy to address this issue. In support of the Colombian government's aim to improve siting and mitigation practices for planned development, we examined five pilot projects in landscapes expected to experience significant increases in mining, petroleum and/or infrastructure development. By blending landscape-level conservation planning with application of the mitigation hierarchy, we can proactively identify where proposed development and conservation priorities would be in conflict and where impacts should be avoided. The approach we outline here has been adopted by the Colombian Ministry of Environment and Sustainable Development to guide licensing decisions, avoid piecemeal licensing, and promote mitigation decisions that maintain landscape condition.

  6. Managing changes in the enterprise architecture modelling context

    NASA Astrophysics Data System (ADS)

    Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya

    2016-07-01

    Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has been however limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.

  7. An evaluation of bias in propensity score-adjusted non-linear regression models.

    PubMed

    Wan, Fei; Mitra, Nandita

    2018-03-01

    Propensity score methods are commonly used to adjust for observed confounding when estimating the conditional treatment effect in observational studies. One popular method, covariate adjustment of the propensity score in a regression model, has been empirically shown to be biased in non-linear models. However, no compelling underlying theoretical reason has been presented. We propose a new framework to investigate bias and consistency of propensity score-adjusted treatment effects in non-linear models that uses a simple geometric approach to forge a link between the consistency of the propensity score estimator and the collapsibility of non-linear models. Under this framework, we demonstrate that adjustment of the propensity score in an outcome model results in the decomposition of observed covariates into the propensity score and a remainder term. Omission of this remainder term from a non-collapsible regression model leads to biased estimates of the conditional odds ratio and conditional hazard ratio, but not for the conditional rate ratio. We further show, via simulation studies, that the bias in these propensity score-adjusted estimators increases with larger treatment effect size, larger covariate effects, and increasing dissimilarity between the coefficients of the covariates in the treatment model versus the outcome model.
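
    The non-collapsibility point is easy to demonstrate by simulation. In the sketch below (arbitrary assumed coefficients and sample size), a logistic outcome model adjusted for the confounder itself recovers the conditional log odds ratio, while the model adjusted linearly for the true propensity score typically does not; this illustrates the phenomenon, not the authors' geometric framework.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Simulate a confounder x, treatment t ~ Bernoulli(propensity(x)),
    # and a binary outcome with true conditional log odds ratio beta_t.

    rng = np.random.default_rng(42)
    n, beta_t, beta_x = 100_000, 1.0, 2.0

    x = rng.standard_normal(n)
    ps = 1 / (1 + np.exp(-x))                   # true propensity score
    t = rng.binomial(1, ps)
    y = rng.binomial(1, 1 / (1 + np.exp(-(beta_t * t + beta_x * x))))

    def treatment_coef(features):
        m = LogisticRegression(C=1e6, max_iter=1000).fit(features, y)
        return m.coef_[0][0]                    # coefficient on treatment

    print(treatment_coef(np.column_stack([t, x])))   # close to beta_t = 1.0
    print(treatment_coef(np.column_stack([t, ps])))  # typically drifts from 1.0
    ```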

  8. A framework for telehealth program evaluation.

    PubMed

    Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila

    2014-04-01

    Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? What are the health services elements to select based on the application needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties, but also to define new properties to account for a wider range of context of use and evaluation outcomes. This article presents a comprehensive framework for delivery design, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. Six key dimensions of the proposed framework include health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used our framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have been also mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are as follows: (a) loosely coupled and hence easy to use, (b) provides a basis for describing a wide range of telehealth programs, and (c) extensible to future developments and needs.

  9. Predicting Player Position for Talent Identification in Association Football

    NASA Astrophysics Data System (ADS)

    Razali, Nazim; Mustapha, Aida; Yatim, Faiz Ahmad; Aziz, Ruhaya Ab

    2017-08-01

    This paper introduces a new framework, from the perspective of computer science, for identifying talent in the sport of football based on players' individual qualities: physical, mental, and technical. The combination of qualities as assessed by coaches is then used to predict the position in a match that suits the player best in a particular team formation. Evaluation of the proposed framework is two-fold: quantitatively, via classification experiments to predict player position, and qualitatively, via a Talent Identification Site developed to achieve the same goal. Results from the classification experiments using Bayesian Networks, Decision Trees, and K-Nearest Neighbor show an average of 98% accuracy, which will promote consistency in decision-making through the elimination of personal bias in team selection. The positive reviews of the Talent Identification Site based on user acceptance evaluation also indicate that the framework is sufficient to serve as the basis for developing an intelligent team management system in different sports, whereby the growth and performance of players can be monitored and identified.
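
    As a toy illustration of the quantitative arm of the evaluation, the sketch below trains a k-nearest-neighbor classifier on fabricated coach ratings and predicts a best-fit position; the features, labeling rule, and scores are invented and do not reproduce the paper's data or feature set.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(7)
    # columns: pace, stamina, composure, tackling, passing, finishing (1-10)
    X = rng.integers(1, 11, size=(200, 6)).astype(float)
    positions = np.array(["defender", "midfielder", "forward"])
    # fabricated rule: the best-rated key quality decides the label
    y = positions[np.argmax([X[:, 3], X[:, 4], X[:, 5]], axis=0)]

    clf = KNeighborsClassifier(n_neighbors=5).fit(X[:150], y[:150])
    print((clf.predict(X[150:]) == y[150:]).mean())   # held-out accuracy
    ```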

  10. Basic test framework for the evaluation of text line segmentation and text parameter extraction.

    PubMed

    Brodić, Darko; Milivojević, Dragan R; Milivojević, Zoran

    2010-01-01

    Text line segmentation is an essential stage in off-line optical character recognition (OCR) systems. It is key because inaccurately segmented text lines will lead to OCR failure. Text line segmentation of handwritten documents is a complex and diverse problem, complicated by the nature of handwriting. Hence, text line segmentation is a leading challenge in handwritten document image processing. Due to inconsistencies in the measurement and evaluation of text segmentation algorithm quality, a basic set of measurement methods is required. Currently, there is no commonly accepted one, and all algorithm evaluation is custom oriented. In this paper, a basic test framework for the evaluation of text feature extraction algorithms is proposed. This test framework consists of a few experiments primarily linked to text line segmentation, skew rate, and reference text line evaluation. Although these experiments are mutually independent, the results obtained are strongly cross-linked. In the end, its suitability for different types of letters and languages, as well as its adaptability, are its main advantages. Thus, the paper presents an efficient evaluation method for text analysis algorithms.
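
    Skew rate is one of the quantities such a framework measures. A common baseline estimator, shown below as an illustrative sketch rather than the authors' method, rotates the binarized page over candidate angles and keeps the angle that maximizes the variance of the horizontal projection profile.

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    # Projection-profile skew estimate: the profile is "sharpest" (highest
    # variance) when text lines are horizontal. Angle grid is an assumption.

    def estimate_skew(binary_img, angles=np.arange(-5, 5.25, 0.25)):
        def sharpness(angle):
            rotated = rotate(binary_img, angle, reshape=False, order=0)
            return np.var(rotated.sum(axis=1))   # row projection profile
        return max(angles, key=sharpness)

    # Usage: skew = estimate_skew(img > 128); deskewed = rotate(img, -skew)
    ```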

  11. From Planetary Boundaries to national fair shares of the global safe operating space - How can the scales be bridged?

    NASA Astrophysics Data System (ADS)

    Häyhä, Tiina; Cornell, Sarah; Lucas, Paul; van Vuuren, Detlef; Hoff, Holger

    2016-04-01

    The planetary boundaries framework proposes precautionary quantitative global limits to the anthropogenic perturbation of crucial Earth system processes. In this way, it marks out a planetary 'safe operating space' for human activities. However, decisions regarding resource use and emissions are mostly made at much smaller scales, largely by (sub-)national and regional governments, businesses, and other local actors. To operationalize the planetary boundaries, they need to be translated into, and aligned with, targets that are relevant at these smaller scales. In this paper, we develop a framework that addresses the three dimensions of bridging across scales: biophysical, socio-economic, and ethical, to provide a consistent, universally applicable approach for translating the planetary boundaries into national-level, context-specific and fair shares of the safe operating space. We discuss our findings in the context of previous studies and their implications for future analyses and policymaking. In this way, we help link the planetary boundaries framework to widely-applied operational and policy concepts for more robust strong sustainability decision-making.

  12. Basic Test Framework for the Evaluation of Text Line Segmentation and Text Parameter Extraction

    PubMed Central

    Brodić, Darko; Milivojević, Dragan R.; Milivojević, Zoran

    2010-01-01

    Text line segmentation is an essential stage in off-line optical character recognition (OCR) systems. It is key because inaccurately segmented text lines will lead to OCR failure. Text line segmentation of handwritten documents is a complex and diverse problem, complicated by the nature of handwriting. Hence, text line segmentation is a leading challenge in handwritten document image processing. Due to inconsistencies in the measurement and evaluation of text segmentation algorithm quality, a basic set of measurement methods is required. Currently, there is no commonly accepted one, and all algorithm evaluation is custom oriented. In this paper, a basic test framework for the evaluation of text feature extraction algorithms is proposed. This test framework consists of a few experiments primarily linked to text line segmentation, skew rate, and reference text line evaluation. Although these experiments are mutually independent, the results obtained are strongly cross-linked. In the end, its suitability for different types of letters and languages, as well as its adaptability, are its main advantages. Thus, the paper presents an efficient evaluation method for text analysis algorithms. PMID:22399932

  13. Defining interdisciplinary competencies for audiological rehabilitation: findings from a modified Delphi study.

    PubMed

    Xue, Lina; Le Bot, Gaëlle; Van Petegem, Wim; van Wieringen, Astrid

    2018-02-01

    The aim of this study is to derive a consensus on an interdisciplinary competency framework regarding a holistic approach for audiological rehabilitation (AR), which includes disciplines from medicine, engineering, social sciences and humanities. We employed a modified Delphi method. In the first round survey, experts were asked to rate an initial list of 28 generic interdisciplinary competencies and to propose specific knowledge areas for AR. In the second round, experts were asked to reconsider their answers in light of the group answers of the first round. An international panel of 27 experts from different disciplines in AR completed the first round. Twenty-two of them completed the second round. We developed a competency framework consisting of 21 generic interdisciplinary competencies grouped in five domains and nine specific competencies (knowledge areas) in three clusters. Suggestions for the implementation of the generic competencies in interdisciplinary programmes were identified. This study reveals insights into the interdisciplinary competencies that are unique for AR. The framework will be useful for educators in developing interdisciplinary programmes as well as for professionals in considering their lifelong training needs in AR.

  14. Pair-Starved Pulsar Magnetospheres

    NASA Technical Reports Server (NTRS)

    Muslimov, Alex G.; Harding, Alice K.

    2009-01-01

    We propose a simple analytic model for the innermost (within the light cylinder of canonical radius, approx. c/Omega) structure of open magnetic field lines of a rotating neutron star (NS) with relativistic outflow of charged particles (electrons/positrons) and arbitrary angle between the NS spin and magnetic axes. We present the self-consistent solution of Maxwell's equations for the magnetic field and electric current in the pair-starved regime, where the density of electron-positron plasma generated above the pulsar polar cap is not sufficient to completely screen the accelerating electric field and thus establish the E·B = 0 condition above the pair-formation front up to very high altitudes within the light cylinder. The proposed model may provide a theoretical framework for developing a refined model of the global pair-starved pulsar magnetosphere.

  15. An Alternative Lattice Field Theory Formulation Inspired by Lattice Supersymmetry-Summary of the Formulation-

    NASA Astrophysics Data System (ADS)

    D'Adda, Alessandro; Kawamoto, Noboru; Saito, Jun

    2018-03-01

    We propose a lattice field theory formulation which overcomes some fundamental difficulties in realizing exact supersymmetry on the lattice. The Leibniz rule for the difference operator can be recovered by defining a new product on the lattice, the star product, and the species-doubler degrees of freedom of chiral fermions can be avoided consistently. This framework is general enough to formulate non-supersymmetric lattice field theory without the chiral fermion problem. The lattice formulation has a nonlocal nature and is essentially equivalent to the corresponding continuum theory. We show that the locality of the star product is recovered exponentially in the continuum limit. Possible regularization procedures are proposed. The associativity of the product and the lattice translational invariance of the formulation are also discussed.

  16. Relaxion cosmology and the price of fine-tuning

    NASA Astrophysics Data System (ADS)

    Di Chiara, Stefano; Kannike, Kristjan; Marzola, Luca; Racioppi, Antonio; Raidal, Martti; Spethmann, Christian

    2016-05-01

    The relaxion scenario presents an intriguing extension of the standard model in which the particle introduced to solve the strong CP problem, the axion, also achieves the dynamical relaxation of the Higgs boson mass term. In this work we complete this framework by proposing a scenario of inflationary cosmology that is consistent with all the observational constraints: relaxion hybrid inflation with an asymmetric waterfall. In our scheme, the vacuum energy of the inflaton drives inflation in a natural way while the relaxion slow-rolls. The constraints on the present inflationary observables are then matched through a subsequent inflationary epoch driven by the inflaton. We quantify the amount of fine-tuning of the proposed inflation scenario, concluding that the inflaton sector severely decreases the naturalness of the theory.

  17. An index-based robust decision making framework for watershed management in a changing climate.

    PubMed

    Kim, Yeonjoo; Chung, Eun-Sung

    2014-03-01

    This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model, 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) quantify the residents' preferences on water management demands and calculate the watershed evaluation index, a weighted combination of PSD and PWQD, 4) set the quantitative targets for water quantity and quality, 5) develop a list of feasible alternatives and 6) eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) quantify the alternative evaluation index, which includes social and hydrologic criteria, utilizing multi-criteria decision analysis methods and 9) prioritize all options based on a minimax regret strategy for robust decisions. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy for deep uncertainty, and thus derives a robust prioritization based on the multiple utilities of alternatives across scenarios, as sketched below. In this study, the proposed procedure was applied to a Korean urban watershed which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
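
    The minimax-regret ranking in step 9 can be illustrated in a few lines. This is a minimal sketch with made-up numbers; the alternatives, scenario count, and utility values are hypothetical, not data from the study.

    ```python
    import numpy as np

    # Hypothetical utility matrix: rows = management alternatives,
    # columns = climate-change scenarios (higher utility is better).
    utilities = np.array([
        [0.62, 0.48, 0.55],   # alternative A
        [0.58, 0.57, 0.50],   # alternative B
        [0.70, 0.35, 0.60],   # alternative C
    ])

    # Regret under a scenario: gap to the best achievable utility there.
    regret = utilities.max(axis=0) - utilities

    # Minimax-regret rule: rank alternatives by their worst-case regret.
    worst_case = regret.max(axis=1)
    for idx in np.argsort(worst_case):
        print(f"alternative {idx}: worst-case regret = {worst_case[idx]:.2f}")
    ```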

  18. A generic biogeochemical module for earth system models

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.

    2013-06-01

    Physical and biogeochemical processes regulate soil carbon dynamics and the CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into earth system models (e.g., community land models such as CLM), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is generated, (2) the computational cost of simulating biogeochemical processes in land models is prohibitive due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module with a generic algorithm and reaction database, so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystem in plants, litter and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into the CLM model. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
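
    The core idea, assembling the ODE right-hand side from a reaction database rather than hand-coding it, can be sketched as follows. The pool names, stoichiometries, and rate constants below are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical reaction database: each entry maps pools to stoichiometric
    # coefficients and carries a rate law over the current pool vector y.
    pools = ["litter_C", "soil_C", "CO2"]
    reactions = [
        # litter decomposition: litter_C -> soil_C + CO2
        {"stoich": {"litter_C": -1.0, "soil_C": 0.55, "CO2": 0.45},
         "rate": lambda y: 0.02 * y[0]},
        # soil respiration: soil_C -> CO2
        {"stoich": {"soil_C": -1.0, "CO2": 1.0},
         "rate": lambda y: 0.005 * y[1]},
    ]
    index = {p: i for i, p in enumerate(pools)}

    def rhs(t, y):
        """Assemble dy/dt from the database, so adding a process is a
        data change rather than a hand-written ODE change."""
        dydt = np.zeros_like(y)
        for rxn in reactions:
            r = rxn["rate"](y)
            for pool, coeff in rxn["stoich"].items():
                dydt[index[pool]] += coeff * r
        return dydt

    sol = solve_ivp(rhs, (0.0, 365.0), [100.0, 500.0, 0.0], max_step=1.0)
    print(sol.y[:, -1])  # pool sizes after one year
    ```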

  19. Consistent Simulation Framework for Efficient Mass Discharge and Source Depletion Time Predictions of DNAPL Contaminants in Heterogeneous Aquifers Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Koch, J.

    2014-12-01

    Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both sensitive and indispensable. We investigate two questions: what is a meaningful level of model complexity, and how can an efficient model framework be obtained that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase (sketched below), and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to which degree the desired model predictions are sensitive to simplifications often found in the literature. With this we identify that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically-based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
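
    A random walk particle tracking step is simple at its core: advect each particle with the local velocity and add a Gaussian step scaled by the dispersion tensor. The sketch below uses a uniform velocity field and made-up dispersivities, far simpler than the heterogeneous fields of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 2-D setting: uniform mean velocity plus Fickian dispersion.
    v = np.array([1.0, 0.0])          # mean groundwater velocity [m/d]
    D_l, D_t = 0.1, 0.01              # longitudinal/transverse dispersion [m^2/d]
    dt, n_steps, n_particles = 0.5, 200, 5000

    x = np.zeros((n_particles, 2))    # particles released at the source
    for _ in range(n_steps):
        # advection step plus a Gaussian random-walk dispersion step
        noise = rng.standard_normal((n_particles, 2))
        noise[:, 0] *= np.sqrt(2.0 * D_l * dt)
        noise[:, 1] *= np.sqrt(2.0 * D_t * dt)
        x += v * dt + noise

    print("mean plume position:", x.mean(axis=0))
    print("plume spread (std):", x.std(axis=0))
    ```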

  20. A Simple Demonstration of Concrete Structural Health Monitoring Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University's Laboratory for Systems Integrity and Reliability is used to research an effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.
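
    Bayesian calibration of a damage-model parameter, the role measurement data play in the framework above, can be illustrated with a toy random-walk Metropolis sampler. The linear damage model, rate constant, and noise level below are invented for illustration and are not the report's actual model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical damage model: damage grows linearly with freeze-thaw
    # cycles at unknown rate k; synthetic measurements are noisy.
    cycles = np.arange(0, 101, 10)
    k_true, sigma = 0.004, 0.02
    data = k_true * cycles + rng.normal(0.0, sigma, cycles.size)

    def log_post(k):
        if k < 0:                      # flat prior on k >= 0
            return -np.inf
        resid = data - k * cycles
        return -0.5 * np.sum(resid**2) / sigma**2

    # Random-walk Metropolis sampler for the posterior of k.
    k, samples = 0.01, []
    lp = log_post(k)
    for _ in range(20000):
        k_new = k + rng.normal(0.0, 0.001)
        lp_new = log_post(k_new)
        if np.log(rng.uniform()) < lp_new - lp:
            k, lp = k_new, lp_new
        samples.append(k)

    post = np.array(samples[5000:])    # discard burn-in
    print(f"posterior k: {post.mean():.4f} +/- {post.std():.4f}")
    ```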

  1. Organizational Context and Capabilities for Integrating Care: A Framework for Improvement.

    PubMed

    Evans, Jenna M; Grudniewicz, Agnes; Baker, G Ross; Wodchis, Walter P

    2016-08-31

    Interventions aimed at integrating care have become widespread in healthcare; however, there is significant variability in their success. Differences in organizational contexts and associated capabilities may be responsible for some of this variability. This study develops and validates a conceptual framework of organizational capabilities for integrating care, identifies which of these capabilities may be most important, and explores the mechanisms by which they influence integrated care efforts. The Context and Capabilities for Integrating Care (CCIC) Framework was developed through a literature review, and revised and validated through interviews with leaders and care providers engaged in integrated care networks in Ontario, Canada. Interviews involved open-ended questions and graphic elicitation. Quantitative content analysis was used to summarize the data. The CCIC Framework consists of eighteen organizational factors in three categories: Basic Structures, People and Values, and Key Processes. The three most important capabilities shaping the capacity of organizations to implement integrated care interventions include Leadership Approach, Clinician Engagement and Leadership, and Readiness for Change. The majority of hypothesized relationships among organizational capabilities involved Readiness for Change and Partnering, emphasizing the complexity, interrelatedness and importance of these two factors to integrated care efforts. Organizational leaders can use the framework to determine readiness to integrate care, develop targeted change management strategies, and select appropriate partners with overlapping or complementary profiles on key capabilities. Researchers may use the results to test and refine the proposed framework, with a focus on the hypothesized relationships among organizational capabilities and between organizational capabilities and performance outcomes.

  2. A software tool for ecosystem services assessments

    NASA Astrophysics Data System (ADS)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts:
    • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined.
    • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a proposed project can be estimated to determine whether the project affects drivers, pressures, states or a combination of these.
    • In part III, information about impacts on drivers, pressures, and states is used to identify ESS impacted by a proposed project. Potential beneficiaries of impacted ESS are also identified.
    • In part IV, changes in ESS are estimated. These estimates include changes in the provision of ESS, the use of ESS, and the value of ESS.
    • A sustainability assessment in part V estimates the broader impact of a proposed project according to social, environmental, governance and other criteria.
    The ESS evaluation software tool is designed to assist an evaluation or study leader carrying out an ESS assessment. The tool helps users move through the logic of the ESS evaluation and make sense of relationships between elements of the DPSIR framework, the CICES classification scheme, and the FEGS approach. The tool also provides links to useful indicators and assessment methods in order to help users quantify changes in ESS and ESS values. The software tool is developed in collaboration with the DESSIN user group, who will use the software to estimate changes in ESS resulting from the implementation of green technologies addressing water quality and water scarcity issues. Although the software is targeted to this user group, it will be made available for free to the public after the conclusion of the project.

  3. A log-normal distribution model for the molecular weight of aquatic fulvic acids

    USGS Publications Warehouse

    Cabaniss, S.E.; Zhou, Q.; Maurice, P.A.; Chin, Y.-P.; Aiken, G.R.

    2000-01-01

    The molecular weight of humic substances influences their proton and metal binding, organic pollutant partitioning, adsorption onto minerals and activated carbon, and behavior during water treatment. We propose a log-normal model for the molecular weight distribution in aquatic fulvic acids to provide a conceptual framework for studying these size effects. The normal curve mean and standard deviation are readily calculated from measured Mn and Mw and vary from 2.7 to 3 for the means and from 0.28 to 0.37 for the standard deviations for typical aquatic fulvic acids. The model is consistent with several types of molecular weight data, including the shapes of high-pressure size-exclusion chromatography (HP-SEC) peaks. Applications of the model to electrostatic interactions, pollutant solubilization, and adsorption are explored in illustrative calculations.
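
    The moment relations behind this calculation can be checked numerically. Assuming a log-normal number distribution of molecular weight, Mn and Mw fix the distribution parameters via Mn = exp(mu + sigma^2/2) and Mw = exp(mu + 3*sigma^2/2); the Mn and Mw values below are hypothetical but chosen to land in the typical fulvic-acid range quoted in the abstract.

    ```python
    import numpy as np

    # Hypothetical number- and weight-average molecular weights [Da].
    Mn, Mw = 800.0, 1200.0

    # Log-normal moment relations give sigma^2 = ln(Mw/Mn).
    sigma2 = np.log(Mw / Mn)
    mu = np.log(Mn) - 0.5 * sigma2

    # Express in log10 units, matching the means (~2.7-3) and standard
    # deviations (~0.28-0.37) reported for typical aquatic fulvic acids.
    mean_log10 = mu / np.log(10.0)
    std_log10 = np.sqrt(sigma2) / np.log(10.0)
    print(f"log10-mean = {mean_log10:.2f}, log10-std = {std_log10:.2f}")
    ```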

  4. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for the dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be represented by one or more surrogate elements that replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into an MB model. The framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, together with a comparison against a popular data-driven model (the Kriging model). The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
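
    The surrogate-element idea can be sketched with a Legendre polynomial fit: learn a cheap input-output map from FE-generated training data, then evaluate it inside the MB loop. The force-deflection curve and noise level below are invented stand-ins for actual FE runs.

    ```python
    import numpy as np
    from numpy.polynomial import legendre as L

    rng = np.random.default_rng(2)

    # Hypothetical training data standing in for FE runs: a nonlinear
    # force-deflection response sampled with a little numerical noise.
    x = np.linspace(-1.0, 1.0, 40)                 # normalized deflection
    y = 2.0 * x + 0.8 * x**3 + rng.normal(0, 0.02, x.size)

    # Fit a Legendre-polynomial surrogate element to the training data.
    coeffs = L.legfit(x, y, deg=5)

    # The surrogate replaces the expensive FE element inside the MB loop:
    x_query = np.array([-0.5, 0.0, 0.7])
    print(L.legval(x_query, coeffs))               # surrogate force estimates
    ```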

  5. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    PubMed Central

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describe a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging. PMID:29084216

  6. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    PubMed

    Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T

    2017-10-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describe a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging.
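
    A common mechanistic core of such individual-to-individual models is a spatial transmission kernel: the hazard a susceptible receives decays with distance from each infected individual. The sketch below simulates one time step under an exponential kernel; the population size, kernel parameters, and locations are all invented, not the paper's fitted values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical individual-level setup: locations in km, one index case.
    xy = rng.uniform(0.0, 10.0, size=(200, 2))
    infected = np.zeros(200, dtype=bool)
    infected[0] = True

    beta, phi, dt = 0.3, 1.5, 1.0   # transmissibility, kernel range, time step

    # One time step of transmission with an exponentially decaying kernel.
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    rate = beta * np.exp(-d / phi)                  # pairwise hazard
    total = (rate * infected[None, :]).sum(axis=1)  # hazard on each individual
    p_inf = 1.0 - np.exp(-total * dt)
    new = (~infected) & (rng.uniform(size=200) < p_inf)
    infected |= new
    print("new infections this step:", new.sum())
    ```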

  7. A mathematical framework for combining decisions of multiple experts toward accurate and remote diagnosis of malaria using tele-microscopy.

    PubMed

    Mavandadi, Sam; Feng, Steve; Yu, Frank; Dimitrov, Stoyan; Nielsen-Saines, Karin; Prescott, William R; Ozcan, Aydogan

    2012-01-01

    We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve the accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria infected cells. The results of this experiment reveal that even highly trained medical experts are not always self-consistent in their diagnostic decisions and that there exists a fair level of disagreement among experts, even for binary decisions (i.e., infected vs. uninfected). To tackle this general medical diagnosis problem, we propose a probabilistic algorithm to fuse the decisions made by trained medical experts to robustly achieve higher levels of accuracy when compared to individual experts making such decisions. By modelling the decisions of experts as a three component mixture model and solving for the underlying parameters using the Expectation Maximisation algorithm, we demonstrate the efficacy of our approach, which significantly improves the overall diagnostic accuracy of malaria infected cells. Additionally, we present a mathematical framework for performing 'slide-level' diagnosis by using individual 'cell-level' diagnosis data, shedding more light on the statistical rules that should govern routine practice in the examination of, e.g., thin blood smear samples. This framework could be generalized to various other tele-pathology needs and can be used by trained experts within an efficient tele-medicine platform.
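
    The flavor of EM-based decision fusion can be shown with a simplified two-class latent-class model (a cousin of the paper's three-component mixture, not its exact formulation): alternate a posterior over the unknown true label with per-expert accuracy estimates. The vote data below are simulated for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Simulated votes: rows = cells, columns = experts (1 = "infected"),
    # generated from experts with different accuracies.
    truth = rng.uniform(size=300) < 0.2
    acc_true = np.array([0.95, 0.85, 0.75, 0.90])
    votes = np.where(truth[:, None],
                     rng.uniform(size=(300, 4)) < acc_true,
                     rng.uniform(size=(300, 4)) > acc_true).astype(int)

    # EM: E-step computes the posterior that each cell is infected given
    # all votes; M-step updates prevalence and expert accuracies.
    pi, acc = 0.5, np.full(4, 0.8)
    for _ in range(50):
        l1 = pi * np.prod(np.where(votes, acc, 1 - acc), axis=1)
        l0 = (1 - pi) * np.prod(np.where(votes, 1 - acc, acc), axis=1)
        w = l1 / (l1 + l0)
        pi = w.mean()
        acc = (w[:, None] * votes + (1 - w[:, None]) * (1 - votes)).mean(axis=0)

    print("estimated prevalence:", round(float(pi), 3))
    print("estimated expert accuracies:", acc.round(3))
    ```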

  8. A Text-Mining Framework for Supporting Systematic Reviews.

    PubMed

    Li, Dingcheng; Wang, Zhen; Wang, Liwei; Sohn, Sunghwan; Shen, Feichen; Murad, Mohammad Hassan; Liu, Hongfang

    2016-11-01

    Systematic reviews (SRs) involve the identification, appraisal, and synthesis of all relevant studies for focused questions in a structured, reproducible manner. High-quality SRs follow strict procedures and require significant resources and time. We investigated advanced text-mining approaches to reduce the burden associated with abstract screening in SRs and to provide a high-level information summary. A text-mining SR support framework consisting of three self-defined semantics-based ranking metrics was proposed: keyword relevance, indexed-term relevance and topic relevance. Keyword relevance is based on the user-defined keyword list used in the search strategy. Indexed-term relevance is derived from the indexed vocabulary developed by domain experts for indexing journal articles and books. Topic relevance is defined as the semantic similarity among retrieved abstracts in terms of topics generated by latent Dirichlet allocation, a Bayesian model for discovering topics. We tested the proposed framework using three published SRs addressing a variety of topics (Mass Media Interventions, Rectal Cancer and Influenza Vaccine). The results showed that when 91.8%, 85.7%, and 49.3% of the abstract screening labor was saved, the recalls were as high as 100% for the three cases, respectively. Relevant studies identified manually showed strong topic similarity through topic analysis, which supported the inclusion of topic analysis as a relevance metric. It was demonstrated that advanced text-mining approaches can significantly reduce the abstract screening labor of SRs and provide an informative summary of relevant studies.
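
    A topic-relevance metric along these lines can be sketched with scikit-learn: fit LDA on the corpus and rank candidate abstracts by the similarity of their topic mixtures to known-relevant seed abstracts. The tiny corpus, topic count, and ranking rule below are our illustrative choices, not the paper's exact pipeline.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical corpus: seed abstracts known to be relevant plus
    # candidate abstracts awaiting screening.
    seeds = ["influenza vaccine efficacy randomized trial adults",
             "vaccination coverage influenza season cohort"]
    candidates = ["influenza vaccine trial outcomes elderly",
                  "rectal cancer surgical margins survival",
                  "media campaign smoking cessation effects"]

    docs = seeds + candidates
    X = CountVectorizer().fit_transform(docs)

    # Topic relevance: similarity of each candidate's LDA topic mixture
    # to the seeds' mixtures, used to rank the screening queue.
    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    theta = lda.fit_transform(X)
    sims = cosine_similarity(theta[len(seeds):], theta[:len(seeds)]).max(axis=1)
    for doc, s in sorted(zip(candidates, sims), key=lambda t: -t[1]):
        print(f"{s:.2f}  {doc}")
    ```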

  9. A hybrid simulation approach for integrating safety behavior into construction planning: An earthmoving case study.

    PubMed

    Goh, Yang Miang; Askar Ali, Mohamed Jawad

    2016-08-01

    One of the key challenges in improving construction safety and health is the management of safety behavior. From a systems point of view, workers work unsafely due to system-level issues such as poor safety culture, excessive production pressure, inadequate allocation of resources and time, and lack of training. These systemic issues should be eradicated or minimized during planning. However, there is a lack of detailed planning tools to help managers assess the impact of their upstream decisions on worker safety behavior. Even though simulation has been used in construction planning, the review conducted in this study showed that construction safety management research has not exploited the potential of simulation techniques. Thus, a hybrid simulation framework is proposed to facilitate the integration of safety management considerations into construction activity simulation. The hybrid framework consists of discrete event simulation (DES) as the core, but heterogeneous, interactive and intelligent (able to make decisions) agents replace traditional entities and resources. In addition, some of the cognitive processes and physiological aspects of agents are captured using a system dynamics (SD) approach. The combination of DES, agent-based simulation (ABS) and SD allows a more "natural" representation of the complex dynamics in construction activities. The proposed hybrid framework was demonstrated using a hypothetical case study. In addition, due to the lack of application of the factorial experiment approach in safety management simulation, the case study demonstrated sensitivity analysis and factorial experiments to guide future research. Copyright © 2015 Elsevier Ltd. All rights reserved.
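
    The DES-plus-agents combination can be miniaturized with the third-party SimPy package: each worker is a process (the DES part) carrying a continuously updated risk state driven by production pressure (a crude stand-in for the SD sub-model). Everything here, task arrival rates, the pressure-to-risk rule, the thresholds, is invented for illustration.

    ```python
    import random
    import simpy

    random.seed(5)

    def worker(env, name, pressure):
        """Agent with an internal state: perceived production pressure
        raises the chance of skipping a safety step on each task."""
        risk = 0.05
        while True:
            yield env.timeout(random.expovariate(1.0))   # next task arrives
            risk = min(0.5, risk + 0.01 * pressure)      # pressure accumulates
            if random.random() < risk:
                print(f"{env.now:6.2f}  {name}: unsafe shortcut taken")
            else:
                risk = max(0.05, risk - 0.02)            # safe act relieves it

    env = simpy.Environment()
    for i, p in enumerate([0.5, 2.0]):                   # low vs high pressure
        env.process(worker(env, f"worker-{i}", p))
    env.run(until=20)
    ```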

  10. Essential levels of health information in Europe: an action plan for a coherent and sustainable infrastructure.

    PubMed

    Carinci, Fabrizio

    2015-04-01

    The European Union needs a common health information infrastructure to support policy and governance on a routine basis. A stream of initiatives conducted in Europe during the last decade resulted in several success stories, but did not specify a unified framework that could be broadly implemented at a continental level. The recent debate raised a potential controversy over the different roles and responsibilities of policy makers versus the public health community in the construction of such a pan-European health information system. While institutional bodies shall clarify the statutory conditions under which such an endeavour is to be carried out, researchers should define a common framework for optimal cross-border information exchange. This paper conceptualizes a general solution emerging from past experiences, introducing a governance structure and overarching framework that can be realized through four main action lines, underpinned by the key principle of "Essential Levels of Health Information" for Europe. The proposed information model can be applied in a consistent manner at both the national and EU level. If realized, the four action lines outlined here will allow the development of an EU health information infrastructure that effectively integrates best practices emerging from EU public health initiatives, including projects and joint actions carried out during the last ten years. The proposed approach adds new content to the ongoing debate on the future activity of the European Commission in the area of health information. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
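
    The inlier/outlier segmentation at the heart of this framework can be miniaturized as follows: model each feature subsequence with simple statistics, build an affinity matrix between the models, and read the background/outlier split off the dominant eigenvector (background windows load heavily; outliers do not). The feature series, window length, and threshold below are invented for illustration, using sample moments in place of the paper's full statistical models.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical mid-level audio feature: one scalar per frame, mostly a
    # stationary background with one burst (a "highlight" candidate).
    series = rng.normal(0.0, 1.0, 200)
    series[80:85] += 6.0
    windows = series.reshape(40, 5)               # non-overlapping subsequences

    # Summarize each subsequence, build an affinity matrix from model
    # distances, then inspect the dominant eigenvector.
    stats = np.column_stack([windows.mean(axis=1), windows.var(axis=1)])
    d2 = ((stats[:, None, :] - stats[None, :, :]) ** 2).sum(axis=-1)
    A = np.exp(-d2 / d2.mean())
    vals, vecs = np.linalg.eigh(A)
    v = np.abs(vecs[:, -1])                       # dominant eigenvector
    outliers = np.where(v < 0.5 * np.median(v))[0]
    print("outlier subsequences:", outliers)
    ```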

  12. Toward wideband steerable acoustic metasurfaces with arrays of active electroacoustic resonators

    NASA Astrophysics Data System (ADS)

    Lissek, Hervé; Rivet, Etienne; Laurence, Thomas; Fleury, Romain

    2018-03-01

    We introduce an active concept for achieving acoustic metasurfaces with steerable reflection properties, effective over a wide frequency band. The proposed active acoustic metasurface consists of a surface array of subwavelength loudspeaker diaphragms, each with programmable individual active acoustic impedances allowing for local control over the different reflection phases over the metasurface. The active control framework used for controlling the reflection phase over the metasurface is derived from the Active Electroacoustic Resonator concept. Each unit-cell simply consists of a current-driven electrodynamic loudspeaker in a closed box, whose acoustic impedance at the diaphragm is judiciously adjusted by connecting an active electrical control circuit. The control is known to achieve a wide variety of acoustic impedances on a single loudspeaker diaphragm used as an acoustic resonator, with the possibility to shift its resonance frequency by more than one octave. This paper presents a methodology for designing such active metasurface elements. An experimental validation of the achieved individual reflection coefficients is presented, and full wave simulations present a few examples of achievable reflection properties, with a focus on the bandwidth of operation of the proposed control concept.
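
    The link between a controlled surface impedance and the reflection phase can be seen from the normal-incidence relation R = (Z_s - rho*c)/(Z_s + rho*c) for a locally reacting surface. The sketch below sweeps a purely reactive impedance, an idealized stand-in for the tunable impedance of an active electroacoustic resonator; it is not the paper's control law, just the underlying acoustics.

    ```python
    import numpy as np

    rho_c = 413.0                 # characteristic impedance of air [Pa*s/m]

    def reflection(Z_s):
        """Pressure reflection coefficient of a locally reacting surface
        at normal incidence."""
        return (Z_s - rho_c) / (Z_s + rho_c)

    # A purely reactive surface impedance keeps |R| = 1 while steering the
    # reflection phase, the mechanism exploited by phase-gradient surfaces.
    for X in [-4.0, -1.0, 0.0, 1.0, 4.0]:
        R = reflection(1j * X * rho_c)
        print(f"X/rho_c = {X:+.1f}: |R| = {abs(R):.2f}, "
              f"phase = {np.degrees(np.angle(R)):+6.1f} deg")
    ```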

  13. Grand Canonical adaptive resolution simulation for molecules with electrons: A theoretical framework based on physical consistency

    NASA Astrophysics Data System (ADS)

    Delle Site, Luigi

    2018-01-01

    A theoretical scheme for the treatment of an open molecular system with electrons and nuclei is proposed. The idea is based on the Grand Canonical description of a quantum region embedded in a classical reservoir of molecules. Electronic properties of the quantum region are calculated at constant electronic chemical potential equal to that of the corresponding (large) bulk system treated at full quantum level. Instead, the exchange of molecules between the quantum region and the classical environment occurs at the chemical potential of the macroscopic thermodynamic conditions. The Grand Canonical Adaptive Resolution Scheme is proposed for the treatment of the classical environment; such an approach can treat the exchange of molecules according to first principles of statistical mechanics and thermodynamic. The overall scheme is build on the basis of physical consistency, with the corresponding definition of numerical criteria of control of the approximations implied by the coupling. Given the wide range of expertise required, this work has the intention of providing guiding principles for the construction of a well founded computational protocol for actual multiscale simulations from the electronic to the mesoscopic scale.

  14. Hybrid particle-field molecular dynamics simulation for polyelectrolyte systems.

    PubMed

    Zhu, You-Liang; Lu, Zhong-Yuan; Milano, Giuseppe; Shi, An-Chang; Sun, Zhao-Yan

    2016-04-14

    To achieve simulations on large spatial and temporal scales with high molecular chemical specificity, a hybrid particle-field method was proposed recently. This method is developed by combining molecular dynamics and self-consistent field theory (MD-SCF). The MD-SCF method has been validated by successfully predicting the experimentally observable properties of several systems. Here we propose an efficient scheme for the inclusion of electrostatic interactions in the MD-SCF framework. In this scheme, charged molecules are interacting with the external fields that are self-consistently determined from the charge densities. This method is validated by comparing the structural properties of polyelectrolytes in solution obtained from the MD-SCF and particle-based simulations. Moreover, taking PMMA-b-PEO and LiCF3SO3 as examples, the enhancement of immiscibility between the ion-dissolving block and the inert block by doping lithium salts into the copolymer is examined by using the MD-SCF method. By employing GPU-acceleration, the high performance of the MD-SCF method with explicit treatment of electrostatics facilitates the simulation study of many problems involving polyelectrolytes.

  15. An Active Learning Framework for Hyperspectral Image Classification Using Hierarchical Segmentation

    NASA Technical Reports Server (NTRS)

    Zhang, Zhou; Pasolli, Edoardo; Crawford, Melba M.; Tilton, James C.

    2015-01-01

    Augmenting spectral data with spatial information for image classification has recently gained significant attention, as classification accuracy can often be improved by extracting spatial information from neighboring pixels. In this paper, we propose a new framework in which active learning (AL) and hierarchical segmentation (HSeg) are combined for spectral-spatial classification of hyperspectral images. The spatial information is extracted from a best segmentation obtained by pruning the HSeg tree using a new supervised strategy. The best segmentation is updated at each iteration of the AL process, thus taking advantage of informative labeled samples provided by the user. The proposed strategy incorporates spatial information in two ways: 1) concatenating the extracted spatial features and the original spectral features into a stacked vector and 2) extending the training set using a self-learning-based semi-supervised learning (SSL) approach. Finally, the two strategies are combined within an AL framework. The proposed framework is validated with two benchmark hyperspectral datasets. Higher classification accuracies are obtained by the proposed framework with respect to five other state-of-the-art spectral-spatial classification approaches. Moreover, the effectiveness of the proposed pruning strategy is also demonstrated relative to the approaches based on a fixed segmentation.
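
    The AL component can be reduced to a minimal uncertainty-sampling loop: train on the labeled pool, query the sample the model is least sure about, and repeat. The synthetic features below stand in for stacked spectral-spatial vectors, and the classifier is our illustrative choice, not the paper's; in the full framework the best segmentation would also be re-pruned at each iteration.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Hypothetical stand-in for stacked spectral+spatial feature vectors.
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    # Seed the labeled pool with a few samples from each class.
    labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])

    # Uncertainty sampling: query the point closest to the decision boundary.
    for it in range(20):
        clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
        proba = clf.predict_proba(X)[:, 1]
        margin = np.abs(proba - 0.5)
        margin[labeled] = np.inf                 # never re-query labeled data
        labeled.append(int(np.argmin(margin)))  # simulated user supplies label

    print("final training-set size:", len(labeled))
    print(f"accuracy: {clf.score(X, y):.3f}")
    ```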

  16. Individual psychotherapy for schizophrenia: trends and developments in the wake of the recovery movement

    PubMed Central

    Hamm, Jay A; Hasson-Ohayon, Ilanit; Kukla, Marina; Lysaker, Paul H

    2013-01-01

    Although the role and relative prominence of psychotherapy in the treatment of schizophrenia has fluctuated over time, an analysis of the history of psychotherapy for schizophrenia, focusing on findings from the recovery movement, reveals recent trends including the emergence of the development of integrative psychotherapy approaches. The authors suggest that the recovery movement has revealed limitations in traditional approaches to psychotherapy, and has provided opportunities for integrative approaches to emerge as a mechanism for promoting recovery in persons with schizophrenia. Five approaches to integrative psychotherapy for persons with schizophrenia are presented, and a shared conceptual framework that allows these five approaches to be compatible with one another is proposed. The conceptual framework is consistent with theories of recovery and emphasizes interpersonal attachment, personal narrative, and metacognitive processes. Implications for future research on integrative psychotherapy are considered. PMID:23950665

  17. An Ontology-based Context-aware System for Smart Homes: E-care@home.

    PubMed

    Alirezaie, Marjan; Renoux, Jennifer; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Tsiftes, Nicolas; Voigt, Thiemo; Loutfi, Amy

    2017-07-06

    Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility to age at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home.

  18. Simple framework for understanding the universality of the maximum drag reduction asymptote in turbulent flow of polymer solutions

    NASA Astrophysics Data System (ADS)

    Li, Chang-Feng; Sureshkumar, Radhakrishna; Khomami, Bamin

    2015-10-01

    Self-consistent direct numerical simulations of turbulent channel flows of dilute polymer solutions exhibiting friction drag reduction (DR) show that an effective Deborah number defined as the ratio of polymer relaxation time to the time scale of fluctuations in the vorticity in the mean flow direction remains O(1) from the onset of DR to the maximum drag reduction (MDR) asymptote. However, the ratio of the convective time scale associated with streamwise vorticity fluctuations to the vortex rotation time decreases with increasing DR, and the maximum drag reduction asymptote is achieved when these two time scales become nearly equal. Based on these observations, a simple framework is proposed that adequately describes the influence of polymer additives on the extent of DR from the onset of DR to MDR as well as the universality of the MDR in wall-bounded turbulent flows with polymer additives.

  19. Simple framework for understanding the universality of the maximum drag reduction asymptote in turbulent flow of polymer solutions.

    PubMed

    Li, Chang-Feng; Sureshkumar, Radhakrishna; Khomami, Bamin

    2015-10-01

    Self-consistent direct numerical simulations of turbulent channel flows of dilute polymer solutions exhibiting friction drag reduction (DR) show that an effective Deborah number defined as the ratio of polymer relaxation time to the time scale of fluctuations in the vorticity in the mean flow direction remains O(1) from the onset of DR to the maximum drag reduction (MDR) asymptote. However, the ratio of the convective time scale associated with streamwise vorticity fluctuations to the vortex rotation time decreases with increasing DR, and the maximum drag reduction asymptote is achieved when these two time scales become nearly equal. Based on these observations, a simple framework is proposed that adequately describes the influence of polymer additives on the extent of DR from the onset of DR to MDR as well as the universality of the MDR in wall-bounded turbulent flows with polymer additives.

  20. Cultural adaptation of preschool PATHS (Promoting Alternative Thinking Strategies) curriculum for Pakistani children.

    PubMed

    Inam, Ayesha; Tariq, Pervaiz N; Zaman, Sahira

    2015-06-01

    Cultural adaptation of evidence-based programmes has gained importance primarily owing to its perceived impact on the established effectiveness of a programme. To date, many researchers have proposed different frameworks for a systematic adaptation process. This article presents the cultural adaptation of the preschool Promoting Alternative Thinking Strategies (PATHS) curriculum for Pakistani children using the heuristic framework of adaptation (Barrera & Castro, 2006). The study was completed in four steps: information gathering, preliminary adaptation design, preliminary adaptation test and adaptation refinement. Feedback on programme content suggested the universality of the core programme components. Suggested changes were mostly surface-structure: language, presentation of materials, conceptual equivalence of concepts, training needs of implementation staff and frequency of programme delivery. In-depth analysis was done to acquire cultural equivalence. Pilot testing of the outcome measures showed strong internal consistency. The results are further discussed with reference to similar work undertaken in other cultures. © 2014 International Union of Psychological Science.

  1. Distributed attitude synchronization of formation flying via consensus-based virtual structure

    NASA Astrophysics Data System (ADS)

    Cong, Bing-Long; Liu, Xiang-Dong; Chen, Zhen

    2011-06-01

    This paper presents a general framework for synchronized multiple-spacecraft rotations via a consensus-based virtual structure. In this framework, attitude control systems for the formation spacecraft and the virtual structure are designed separately. Both parametric uncertainty and external disturbance are taken into account. A time-varying sliding mode control (TVSMC) algorithm is designed to improve the robustness of the actual attitude control system. As for the virtual attitude control system, a behavioral consensus algorithm is presented to accomplish the attitude maneuver of the entire formation and guarantee a consistent attitude among the local virtual structure counterparts during the maneuver (a minimal consensus update is sketched below). A multiple virtual sub-structures (MVSSs) system is introduced to enhance the current virtual structure scheme when large numbers of spacecraft are involved in the formation. The attitude of each spacecraft is represented by modified Rodrigues parameters (MRPs) for their non-redundancy. Finally, a numerical simulation with three synchronization situations is employed to illustrate the effectiveness of the proposed strategy.
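
    The consensus mechanism underneath such schemes is the first-order update x_i <- x_i + eps * sum_j a_ij (x_j - x_i) over a communication graph. The sketch below runs it on a scalar stand-in for one MRP component of each local virtual frame; the ring topology, gain, and initial values are invented, and this is not the paper's behavioral consensus law.

    ```python
    import numpy as np

    # Hypothetical ring communication topology among 4 virtual sub-structures.
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)

    # First-order consensus: x_i += eps * sum_j a_ij (x_j - x_i).
    x = np.array([0.4, -0.2, 0.1, 0.7])
    eps = 0.2                     # stable here since eps < 1/(max degree)
    for _ in range(30):
        x = x + eps * (A @ x - A.sum(axis=1) * x)

    print("agreed attitude value:", x.round(4))
    ```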

  2. An Ontology-based Context-aware System for Smart Homes: E-care@home

    PubMed Central

    Alirezaie, Marjan; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Voigt, Thiemo; Loutfi, Amy

    2017-01-01

    Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility to age at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home. PMID:28684686

  3. Fostering integrity in postgraduate research: an evidence-based policy and support framework.

    PubMed

    Mahmud, Saadia; Bretag, Tracey

    2014-01-01

    Postgraduate research students have a unique position in the debate on integrity in research, as both students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity, maintained across the university at all levels of scholarship and for all stakeholders.

  4. MO-D-213-06: Quantitative Image Quality Metrics Are for Physicists, Not Radiologists: How to Communicate to Your Radiologists Using Their Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szczykutowicz, T; Rubert, N; Ranallo, F

    Purpose: A framework for explaining differences in image quality to non-technical audiences in medical imaging is needed. Currently, this task is something that is learned "on the job." The lack of a formal methodology for communicating optimal acquisition parameters into the clinic effectively mitigates many technological advances. As a community, medical physicists need to be held responsible for not only advancing image science, but also for ensuring its proper use in the clinic. This work outlines a framework that bridges the gap between the results from quantitative image quality metrics like detectability, MTF, and NPS and their effect on specific anatomical structures present in diagnostic imaging tasks. Methods: Specific structures of clinical importance were identified for a body, an extremity, a chest, and a temporal bone protocol. Using these structures, quantitative metrics were used to identify the parameter space that should yield optimal image quality, constrained within the confines of clinical logistics and dose considerations. The reading room workflow for presenting the proposed changes for imaging each of these structures is presented. The workflow consists of displaying images for physician review consisting of different combinations of acquisition parameters guided by quantitative metrics. Examples of using detectability index, MTF, NPS, noise and noise non-uniformity are provided. During review, the physician was forced to judge the image quality solely on those features they need for diagnosis, not on the overall "look" of the image. Results: We found that in many cases, use of this framework settled disagreements between physicians. Once forced to judge images on the ability to detect specific structures, inter-reader agreement was obtained. Conclusion: This framework will provide consulting, research/industrial, or in-house physicists with clinically relevant imaging tasks to guide reading room image review. This framework avoids use of the overall "look" or "feel" to dictate acquisition parameter selection. Equipment grants GE Healthcare.

  5. Integration of electro-anatomical and imaging data of the left ventricle: An evaluation framework.

    PubMed

    Soto-Iglesias, David; Butakoff, Constantine; Andreu, David; Fernández-Armenta, Juan; Berruezo, Antonio; Camara, Oscar

    2016-08-01

    Integration of electrical and structural information for scar characterization in the left ventricle (LV) is a crucial step to better guide radio-frequency ablation therapies, which are usually performed in complex ventricular tachycardia (VT) cases. This integration requires finding a common representation in which to map the electrical information from the electro-anatomical map (EAM) surfaces and the tissue viability information from delay-enhancement magnetic resonance images (DE-MRI). However, the development of a consistent integration method is still an open problem due to the lack of a proper evaluation framework to assess its accuracy. In this paper we present both: (i) an evaluation framework to assess the accuracy of EAM and imaging integration strategies with simulated EAM data and a set of global and local measures; and (ii) a new integration methodology based on a planar disk representation onto which the LV surface meshes are quasi-conformally mapped (QCM) by flattening, allowing for simultaneous visualization and joint analysis of the multi-modal data. The developed evaluation framework was applied to estimate the accuracy of the QCM-based integration strategy on a benchmark dataset of 128 synthetically generated ground-truth cases presenting different scar configurations and EAM characteristics. The obtained results demonstrate a significant reduction in global overlap errors (50-100%) with respect to state-of-the-art integration techniques, while better preserving the local topology of small structures such as conduction channels in scars. Data from seventeen VT patients were also used to study the feasibility of the QCM technique in a clinical setting, where it consistently outperformed the alternative integration techniques in the presence of sparse and noisy clinical data. The proposed evaluation framework has allowed a rigorous comparison of different EAM and imaging data integration strategies, providing useful information to better guide clinical practice in complex cardiac interventions. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. A proposed ITS evaluation framework for Texas

    DOT National Transportation Integrated Search

    1999-03-01

    This report presents a proposed intelligent transportation system (ITS) evaluation framework that can be used by the Texas Department of Transportation (TxDOT) in developing evaluation plans for specific ITS applications and deployments in Texas. The...

  7. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    NASA Astrophysics Data System (ADS)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  8. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

    ERIC Educational Resources Information Center

    Barnhardt, Bradford; Ginns, Paul

    2014-01-01

    This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

  9. A Framework for Action To Make Private Housing Lead-Safe: A Proposal To Focus National Attention.

    ERIC Educational Resources Information Center

    Alliance to End Childhood Lead Poisoning, Washington, DC.

    This framework sets forth detailed proposals that are crucial to eliminating the epidemic of childhood lead poisoning in the United States. Private housing units can and must be made lead-safe, and this framework is designed to achieve that goal through specific requirements for property owners, a workable schedule, and mechanisms that reinforce…

  10. Proposal of a Framework for Internet Based Licensing of Learning Objects

    ERIC Educational Resources Information Center

    Santos, Osvaldo A.; Ramos, Fernando M. S.

    2004-01-01

    This paper presents a proposal of a framework whose main objective is to manage the delivery and rendering of learning objects in a digital rights controlled environment. The framework is based on a digital licensing scheme that requires each learning object to have the proper license in order to be rendered by a trusted player. A conceptual model…

  11. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrated qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can then perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
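
    The quantitative refinement stage can be sketched in a few lines: simulated annealing perturbs a kinetic rate and accepts worse candidates with a temperature-dependent probability. This toy example, under assumed names and a one-reaction model (A -> B), is not the paper's algorithm, only the generic technique it employs; the qualitative stage that proposes model structure is not shown.

      import numpy as np

      def simulate(k, y0=1.0, dt=0.05, steps=100):
          """Euler integration of the toy reaction A -> B: dA/dt = -k*A."""
          y = np.empty(steps)
          y[0] = y0
          for i in range(1, steps):
              y[i] = y[i - 1] - k * y[i - 1] * dt
          return y

      def anneal_rate(target, k0=0.1, t0=1.0, cooling=0.995, iters=2000, seed=0):
          """Simulated annealing over a single kinetic rate."""
          rng = np.random.default_rng(seed)
          k, cost = k0, np.sum((simulate(k0) - target) ** 2)
          temp = t0
          for _ in range(iters):
              cand = abs(k + rng.normal(scale=0.05))       # perturb the rate
              c = np.sum((simulate(cand) - target) ** 2)
              # Accept improvements always; worse moves with prob exp(-dC/T).
              if c < cost or rng.random() < np.exp((cost - c) / temp):
                  k, cost = cand, c
              temp *= cooling
          return k

      target = simulate(0.8)                 # pretend this is observed behaviour
      print(anneal_rate(target))             # recovers a rate close to 0.8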

  12. Knowledge, instruction and behavioural change: building a framework for effective eczema education in clinical practice

    PubMed Central

    Thompson, Deryn Lee; Thompson, Murray John

    2014-01-01

    Aims: A discussion of the reasons educational interventions about eczema delivered by nurses are successful, with the subsequent development of a theoretical framework to guide nurses to become effective patient educators. Background: Effective child and parent education is the key to successful self-management of eczema. When diagnosed, children and parents should learn to understand the condition through clear explanations, seeing treatment demonstrations and having ongoing support to learn practical skills to control eczema. Dermatology nurses provide these services, but no one has proposed a framework of the concepts underpinning their successful eczema educational interventions. Design: A discussion paper. Data Sources: A literature search of online databases was undertaken utilizing the terms ‘eczema OR atopic dermatitis’, ‘education’, ‘parent’, ‘nurs*’, ‘framework’, ‘knowledge’ and ‘motivation’ in Scopus, CINAHL, Web of Science, Medline and PubMed. Limits were English language and 2003–2013. Implications for Nursing: The framework can inform discussion on child and parent education, provide a scaffold for future research and guide non-specialist nurses, internationally, in providing consistent patient education about eczema. Conclusion: Founded on an understanding of knowledge, the framework utilizes essential elements of cognitive psychology and social cognitive theory leading to successful self-management of eczema. It may prove useful as a basis for future research in child and parent education, globally, in the healthcare community. A framework has been created to help nurses understand the essential elements of the learning processes at the foundation of effective child and parent education; it serves to explain the improved outcomes reported in previous nurse-led eczema educational interventions. PMID:25312442

  13. Developing a financial framework for academic service partnerships: models of the United States and Europe.

    PubMed

    De Geest, Sabina; Sullivan Marx, Eileen M; Rich, Victoria; Spichiger, Elisabeth; Schwendimann, Rene; Spirig, Rebecca; Van Malderen, Greet

    2010-09-01

    Academic service partnerships (ASPs) are structured linkages between academe and service which have demonstrated higher levels of innovation. In the absence of descriptions in the literature of financial frameworks to support ASPs, the purpose of this paper is to present the supporting financial frameworks of a Swiss and a U.S. ASP. This paper used a case study approach, and two frameworks are presented. The U.S. model consists of a variety of ASPs, all linked to the School of Nursing of the University of Pennsylvania; its structural integration and governance system is elucidated. Each ASP has its own source of revenue or grant support, with the goal of being fiscally in the black, and joint appointments are used as an instrument to realize these ASPs. The Swiss ASP entails a detailed description of the financial framework of one ASP between the Institute of Nursing Science at the University of Basel and the Inselspital Bern University Hospital. Balance in the partnership, in terms of both benefit and cost to both partners, was a main principle that guided the development of the financial framework and the translation of the ASP into budgetary terms. The model builds on a number of assumptions and provides the partnership management with a simple framework for monitoring and evaluating the progress of the partnership. In operationalizing an ASP, careful budgetary planning should be an integral part of the preparation and evaluation of the collaboration; the proposed Swiss and U.S. financial frameworks allow doing so. Outcomes of care can be improved with strong nursing service and academic partnerships, and sustaining such partnerships requires attention to financial and contractual arrangements.

  14. Consensus proposals for classification of the family Hepeviridae.

    PubMed

    Smith, Donald B; Simmonds, Peter; Jameel, Shahid; Emerson, Suzanne U; Harrison, Tim J; Meng, Xiang-Jin; Okamoto, Hiroaki; Van der Poel, Wim H M; Purdy, Michael A

    2014-10-01

    The family Hepeviridae consists of positive-stranded RNA viruses that infect a wide range of mammalian species, as well as chickens and trout. A subset of these viruses infects humans and can cause a self-limiting acute hepatitis that may become chronic in immunosuppressed individuals. Current published descriptions of the taxonomical divisions within the family Hepeviridae are contradictory in relation to the assignment of species and genotypes. Through analysis of existing sequence information, we propose a taxonomic scheme in which the family is divided into the genera Orthohepevirus (all mammalian and avian hepatitis E virus (HEV) isolates) and Piscihepevirus (cutthroat trout virus). Species within the genus Orthohepevirus are designated Orthohepevirus A (isolates from human, pig, wild boar, deer, mongoose, rabbit and camel), Orthohepevirus B (isolates from chicken), Orthohepevirus C (isolates from rat, greater bandicoot, Asian musk shrew, ferret and mink) and Orthohepevirus D (isolates from bat). Proposals are also made for the designation of genotypes within the human and rat HEVs. This hierarchical system is congruent with hepevirus phylogeny, and the three classification levels (genus, species and genotype) are consistent with, and reflect, discontinuities in the ranges of pairwise distances between amino acid sequences. Adoption of this system would avoid the use of host names in taxonomic identifiers and provide a logical framework for the assignment of novel variants.
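
    The distance-based reasoning behind such a scheme can be sketched simply: compute pairwise p-distances between aligned sequences and assign ranks using cut-offs placed in the observed distance discontinuities. The cut-off values and function names below are illustrative assumptions only; the real thresholds come from the distributions analysed in the paper.

      import numpy as np

      def p_distance(a, b):
          """Proportion of differing positions between two aligned sequences,
          ignoring alignment gaps ('-')."""
          pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
          return sum(x != y for x, y in pairs) / len(pairs)

      def pairwise_matrix(seqs):
          n = len(seqs)
          d = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  d[i, j] = d[j, i] = p_distance(seqs[i], seqs[j])
          return d

      # Illustrative cut-offs only; real values reflect observed discontinuities.
      def rank(dist, species_cut=0.3, genotype_cut=0.1):
          if dist > species_cut:
              return "different species"
          return "different genotypes" if dist > genotype_cut else "same genotype"

      print(rank(p_distance("MKVLAAT", "MKVLGAT")))   # small distance: same genotype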

  15. Self-consistent field model for strong electrostatic correlations and inhomogeneous dielectric media.

    PubMed

    Ma, Manman; Xu, Zhenli

    2014-12-28

    Electrostatic correlations and variable permittivity of electrolytes are essential for exploring many chemical and physical properties of interfaces in aqueous solutions. We propose a continuum electrostatic model for the treatment of these effects in the framework of the self-consistent field theory. The model incorporates a space- or field-dependent dielectric permittivity and an excluded ion-size effect for the correlation energy. This results in a self-energy-modified Poisson-Nernst-Planck or Poisson-Boltzmann equation together with state equations for the self-energy and the dielectric function. We show that the ionic size is of significant importance in predicting a finite self-energy for an ion in an inhomogeneous medium. An asymptotic approximation is proposed for the solution of a generalized Debye-Hückel equation, which has been shown to capture the ionic correlation and dielectric self-energy. Through simulating the ionic distribution surrounding a macroion, the modified self-consistent field model is shown to agree with particle-based Monte Carlo simulations. Numerical results for symmetric and asymmetric electrolytes demonstrate that the model is able to predict charge inversion in the high-correlation regime in the presence of multivalent interfacial ions, which is beyond the mean-field theory, and also show a strong effect on double-layer structure due to the space- or field-dependent dielectric permittivity.
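
    As background for the kind of solver such a model modifies, the sketch below solves the classical mean-field 1:1 Poisson-Boltzmann equation near a charged plate by damped Picard iteration on a finite-difference grid. This is explicitly not the authors' correlation-corrected model; their self-energy and variable-permittivity terms would enter as additional coupled equations on top of a solver of this shape. All parameter values are illustrative and in dimensionless units.

      import numpy as np

      def poisson_boltzmann_1d(sigma=0.05, kappa=1.0, L=10.0, n=400, tol=1e-10):
          """Dimensionless PB equation phi'' = kappa^2 sinh(phi) near a plate
          at x=0 with surface-charge condition phi'(0) = -sigma and phi -> 0
          in the bulk, solved by damped Picard (lagged-nonlinearity) iteration."""
          x = np.linspace(0.0, L, n)
          h = x[1] - x[0]
          phi = np.zeros(n)
          for _ in range(20000):
              rhs = kappa**2 * np.sinh(phi)        # lagged nonlinear source
              new = phi.copy()
              # Second-order stencil: (phi[i-1] - 2 phi[i] + phi[i+1])/h^2 = rhs
              new[1:-1] = 0.5 * (phi[:-2] + phi[2:] - h * h * rhs[1:-1])
              new[0] = new[1] + sigma * h          # Neumann: phi'(0) = -sigma
              new[-1] = 0.0                        # bulk boundary
              if np.max(np.abs(new - phi)) < tol:
                  break
              phi = 0.5 * phi + 0.5 * new          # damping for stability
          return x, phi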

  16. A multi-species synthesis of physiological mechanisms in drought-induced tree mortality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Henry D.; Zeppel, Melanie J. B.; Anderegg, William R. L.

    Widespread tree mortality associated with drought has been observed on all forested continents, and global change is expected to exacerbate vegetation vulnerability. Forest mortality has implications for future biosphere-atmosphere interactions of carbon, water, and energy balance, and is poorly represented in dynamic vegetation models. Reducing uncertainty requires improved mortality projections founded on robust physiological processes. However, the proposed mechanisms of drought-induced mortality, including hydraulic failure and carbon starvation, are unresolved. A growing number of empirical studies have investigated these mechanisms, but data have not been consistently analyzed across species and biomes using a standardized physiological framework. Here we show that xylem hydraulic failure was ubiquitous across multiple tree taxa at drought-induced mortality. All species assessed had 60% or greater loss of xylem hydraulic conductivity, consistent with proposed theoretical and modelled survival thresholds. We found diverse responses in non-structural carbohydrates at mortality, indicating that evidence supporting carbon starvation was not universal. Reduced non-structural carbohydrates were more common for gymnosperms than angiosperms, associated with xylem hydraulic vulnerability, and may have a role in hydraulic deterioration. Our finding that hydraulic failure at drought-induced mortality was consistent across species indicates that substantial improvement in vegetation modelling can be achieved using thresholds in hydraulic function.
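
    The 60% threshold refers to percent loss of conductivity (PLC), the standard index of xylem hydraulic failure. A minimal sketch of the calculation, with illustrative numbers:

      def percent_loss_conductivity(k_measured, k_max):
          """PLC = 100 * (1 - K/K_max); the synthesis finds PLC >= 60
          across species at the point of drought-induced mortality."""
          return 100.0 * (1.0 - k_measured / k_max)

      print(percent_loss_conductivity(0.35, 1.0))   # 65.0 -> above the 60% threshold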

  17. Sustainability in Health care by Allocating Resources Effectively (SHARE) 8: developing, implementing and evaluating an evidence dissemination service in a local healthcare setting.

    PubMed

    Harris, Claire; Garrubba, Marie; Melder, Angela; Voutier, Catherine; Waller, Cara; King, Richard; Ramsey, Wayne

    2018-03-02

    This is the eighth in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE Program was a systematic, integrated, evidence-based program for disinvestment within a large Australian health service. One of its aims was to explore methods of delivering existing high-quality synthesised evidence directly to decision-makers to drive decision-making proactively. An Evidence Dissemination Service (EDS) was proposed. While this was conceived as a method to identify disinvestment opportunities, it became clear that it could also be a way to review all practices for consistency with current evidence. This paper reports the development, implementation and evaluation of two models of an in-house EDS. Frameworks for the development of complex interventions, implementation of evidence-based change, and evaluation and explication of processes and outcomes were adapted and/or applied. Mixed methods including a literature review, surveys, interviews, workshops, audits, document analysis and action research were used to capture barriers, enablers and local needs; identify effective strategies; develop and refine proposals; ascertain feedback and measure outcomes. Methods to identify, capture, classify, store, repackage, disseminate and facilitate use of synthesised research evidence were investigated. In Model 1, emails containing links to multiple publications were sent to all self-selected participants, who were asked to determine whether they were the relevant decision-maker for any of the topics presented, whether change was required, and to take the relevant action. This voluntary framework did not achieve the aim of ensuring practice was consistent with current evidence. In Model 2, the need for change was established prior to dissemination; a summary of the evidence was then sent to the decision-maker responsible for practice in the relevant area, who was required to take appropriate action and report the outcome. This mandatory governance framework was successful. The factors influencing decisions, processes and outcomes were identified. An in-house EDS holds promise as a method of identifying disinvestment opportunities and/or reviewing local practice for consistency with current evidence, though the resource-intensive nature of delivering the EDS is a potential barrier. The findings from this study will inform further exploration.

  18. Discovering biclusters in gene expression data based on high-dimensional linear geometries

    PubMed Central

    Gan, Xiangchao; Liew, Alan Wee-Chung; Yan, Hong

    2008-01-01

    Background: In DNA microarray experiments, discovering groups of genes that share similar transcriptional characteristics is instrumental in functional annotation, tissue classification and motif identification. However, in many situations a subset of genes only exhibits a consistent pattern over a subset of conditions. Conventional clustering algorithms that deal with the entire row or column in an expression matrix would therefore fail to detect these useful patterns in the data. Recently, biclustering has been proposed to detect a subset of genes exhibiting a consistent pattern over a subset of conditions. However, most existing biclustering algorithms are based on searching for sub-matrices within a data matrix by optimizing certain heuristically defined merit functions, and most of them can only detect a restricted set of bicluster patterns. Results: In this paper, we present a novel geometric perspective on the biclustering problem. The biclustering process is interpreted as the detection of linear geometries in a high-dimensional data space. This perspective views biclusters with different patterns as hyperplanes in a high-dimensional space, and allows us to handle different types of linear patterns simultaneously by matching a specific set of linear geometries. This geometric viewpoint also inspires us to propose a generic bicluster pattern, i.e., the linear coherent model, that unifies the seemingly incompatible additive and multiplicative bicluster models. As a particular realization of our framework, we have implemented a Hough transform-based hyperplane detection algorithm. The experimental results on a human lymphoma gene expression dataset show that our algorithm can find biologically significant subsets of genes. Conclusion: We have proposed a novel geometric interpretation of the biclustering problem. We have shown that many common types of bicluster are just different spatial arrangements of hyperplanes in a high-dimensional data space. An implementation of the geometric framework using the Fast Hough transform for hyperplane detection can be used to discover biologically significant subsets of genes under subsets of conditions for microarray data analysis. PMID:18433477
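
    The geometric idea is easiest to see with two conditions: rows of an additive bicluster plotted as (x, y) points fall on the line y = x + c, which a Hough transform finds as a peak in (theta, rho) space. The sketch below is a generic 2D Hough line detector on synthetic data, a simplification assuming two conditions rather than the paper's high-dimensional hyperplane detector; all data values are invented.

      import numpy as np

      def hough_lines(points, n_theta=180, n_rho=200):
          """Standard Hough transform: each (x, y) votes for all lines
          x*cos(theta) + y*sin(theta) = rho passing through it."""
          theta = np.linspace(0.0, np.pi, n_theta, endpoint=False)
          rmax = np.abs(points).max() * np.sqrt(2)
          rho_bins = np.linspace(-rmax, rmax, n_rho)
          acc = np.zeros((n_theta, n_rho), dtype=int)
          for x, y in points:
              rho = x * np.cos(theta) + y * np.sin(theta)
              idx = np.clip(np.digitize(rho, rho_bins) - 1, 0, n_rho - 1)
              acc[np.arange(n_theta), idx] += 1
          return acc, theta, rho_bins

      # Additive bicluster rows lie on y = x + 2; noise rows scatter elsewhere.
      rng = np.random.default_rng(1)
      bic = np.column_stack([rng.uniform(0, 5, 30)] * 2)
      bic[:, 1] = bic[:, 0] + 2.0
      noise = rng.uniform(-5, 8, size=(30, 2))
      acc, theta, rho = hough_lines(np.vstack([bic, noise]))
      i, j = np.unravel_index(acc.argmax(), acc.shape)
      print(np.degrees(theta[i]), rho[j])    # peak near theta = 135 deg, rho ~ sqrt(2)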

  19. A population-based tissue probability map-driven level set method for fully automated mammographic density estimations.

    PubMed

    Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo

    2014-07-01

    A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary in the hazy transition zone from adipose to glandular tissue. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimations. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) designed to capture the classification behaviour of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammogram experts extracted regions of dense and fatty tissue on these digital mammograms, an independent subset that was used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and incorporated into a level set framework as an additional term to control the evolution, which followed an energy surface designed to reflect the experts' knowledge as well as the regional statistics inside and outside of the evolving contour. A subset of 100 digital mammograms, which was not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues, as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability. This result suggests that the proposed method successfully incorporated the learned knowledge of the experts' visual systems and has the potential to be used as an automated and quantitative tool for estimating mammographic breast density.
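
    A schematic of how a probability prior can enter a region-based level-set evolution is sketched below: a Chan-Vese-style update with an added log-likelihood-ratio force from the prior map. This is a hedged simplification, not the paper's method; the actual PTPM construction and energy design are richer, curvature regularization is omitted for brevity, and all parameter values are illustrative.

      import numpy as np

      def segment_with_prior(img, p_dense, iters=300, dt=0.25, lam=1.0, beta=0.5, eps=1.0):
          """Chan-Vese-style level-set evolution with a tissue-probability prior.

          img: 2D intensity image; p_dense: 2D prior probability of dense
          tissue (playing the PTPM role). The prior enters as a force
          pushing the contour toward high-probability dense regions.
          """
          h, w = img.shape
          yy, xx = np.mgrid[:h, :w]
          phi = min(h, w) / 4 - np.hypot(xx - w / 2, yy - h / 2)  # + inside circle
          p = np.clip(p_dense, 1e-6, 1 - 1e-6)
          prior_force = np.log(p / (1 - p))                       # log-odds of "dense"
          for _ in range(iters):
              inside = phi > 0
              c1 = img[inside].mean() if inside.any() else 0.0    # mean inside
              c2 = img[~inside].mean() if (~inside).any() else 0.0
              delta = eps / (np.pi * (eps**2 + phi**2))           # smoothed delta
              force = lam * (-(img - c1) ** 2 + (img - c2) ** 2) + beta * prior_force
              phi += dt * delta * force
          return phi > 0   # boolean mask of the estimated dense region

    The beta weight controls how strongly the population prior overrides per-image statistics, which mirrors the trade-off the abstract describes between expert knowledge and regional statistics.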

  20. ECG Denoising Using Marginalized Particle Extended Kalman Filter With an Automatic Particle Weighting Strategy.

    PubMed

    Hesar, Hamed Danandeh; Mohebbi, Maryam

    2017-05-01

    In this paper, a model-based Bayesian filtering framework called the "marginalized particle-extended Kalman filter (MP-EKF) algorithm" is proposed for electrocardiogram (ECG) denoising. Because of its nonlinear framework, this algorithm does not share the extended Kalman filter (EKF) shortcoming in handling non-Gaussian nonstationary situations. In addition, it has lower computational complexity than the particle filter. The filter improves ECG denoising performance by implementing a marginalized particle filter framework while reducing its computational complexity using the EKF framework. An automatic particle weighting strategy is also proposed that controls the reliance of the framework on the acquired measurements. We evaluated the proposed filter on several normal ECGs selected from the MIT-BIH normal sinus rhythm database. To do so, artificial white Gaussian and colored noises as well as nonstationary real muscle artifact (MA) noise over a range of low SNRs from 10 to -5 dB were added to these normal ECG segments. The benchmark methods were the EKF and extended Kalman smoother (EKS) algorithms, the first model-based Bayesian algorithms introduced in the field of ECG denoising. From an SNR viewpoint, the experiments showed that in the presence of Gaussian white noise, the proposed framework outperforms the EKF and EKS algorithms at lower input SNRs, where the measurements and state model are not reliable. Owing to its nonlinear framework and particle weighting strategy, the proposed algorithm attained better results at all input SNRs in non-Gaussian nonstationary situations (such as the presence of pink noise, brown noise, and real MA). In addition, the impact of the proposed filtering method on the distortion of diagnostic features of the ECG was investigated and compared with the EKF/EKS methods using an ECG diagnostic distortion measure called the "Multi-Scale Entropy Based Weighted Distortion Measure" (MSEWPRD). The results revealed that the proposed algorithm had the lowest MSEWPRD for all noise types at low input SNRs. Therefore, the morphology and diagnostic information of ECG signals were much better conserved than with the EKF/EKS frameworks, especially in non-Gaussian nonstationary situations.
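
    The EKF building block that both the benchmarks and the MP-EKF rely on is shown below as a generic predict/update skeleton on a toy nonlinear model, with a "reliance" weight scaling the measurement covariance to mimic the role played by the paper's automatic particle weighting. This is a sketch under assumed names and a toy sinusoid model, not the authors' full MP-EKF or their ECG dynamic model.

      import numpy as np

      def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R, reliance=1.0):
          """One extended-Kalman-filter predict/update step.

          reliance in (0, 1]: values below 1 inflate R, telling the filter
          to trust the acquired measurement less (cf. adaptive weighting)."""
          x_pred = f(x)                              # predict state
          F = F_jac(x)
          P_pred = F @ P @ F.T + Q                   # predict covariance
          H = H_jac(x_pred)
          S = H @ P_pred @ H.T + R / reliance        # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
          x_new = x_pred + K @ (z - h(x_pred))       # measurement update
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      # Toy example: track the phase of a noisy sinusoid.
      f = lambda x: np.array([x[0] + 0.1])           # phase advances each step
      F_jac = lambda x: np.array([[1.0]])
      h = lambda x: np.array([np.sin(x[0])])         # observe sin(phase)
      H_jac = lambda x: np.array([[np.cos(x[0])]])
      x, P = np.array([0.0]), np.eye(1)
      for k in range(50):
          z = np.array([np.sin(0.1 * (k + 1))]) + 0.1 * np.random.randn(1)
          x, P = ekf_step(x, P, z, f, F_jac, h, H_jac, np.eye(1) * 1e-4, np.eye(1) * 0.01)
      print(x)   # phase estimate near the true value of 5.0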
