Sample records for key analysis approaches

  1. Numerical approach for unstructured quantum key distribution

    PubMed Central

    Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert

    2016-01-01

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study ‘unstructured’ protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739
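The "analytical formulas for protocols with symmetries" mentioned above include the textbook asymptotic BB84 rate r = 1 − 2h(Q), with h the binary entropy and Q the quantum bit error rate. A minimal sketch of that standard symmetric-case formula (an illustration only, not the authors' dual-optimization method):

```python
from math import log2

def binary_entropy(q: float) -> float:
    """Shannon binary entropy h(q) in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * log2(q) - (1 - q) * log2(1 - q)

def bb84_asymptotic_rate(qber: float) -> float:
    """Asymptotic BB84 secret key rate r = 1 - 2*h(Q).

    Positive only below the well-known ~11% QBER threshold.
    """
    return 1.0 - 2.0 * binary_entropy(qber)

for q in (0.01, 0.05, 0.11, 0.12):
    print(f"QBER {q:.2f}: rate {bb84_asymptotic_rate(q):+.4f}")
```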

  2. Identification of key regulators of pancreatic cancer progression through multidimensional systems-level analysis.

    PubMed

    Rajamani, Deepa; Bhasin, Manoj K

    2016-05-03

    Pancreatic cancer is an aggressive cancer with dismal prognosis, urgently necessitating better biomarkers to improve therapeutic options and early diagnosis. Traditional approaches of biomarker detection that consider only one aspect of the biological continuum like gene expression alone are limited in their scope and lack robustness in identifying the key regulators of the disease. We have adopted a multidimensional approach involving the cross-talk between the omics spaces to identify key regulators of disease progression. Multidimensional domain-specific disease signatures were obtained using rank-based meta-analysis of individual omics profiles (mRNA, miRNA, DNA methylation) related to pancreatic ductal adenocarcinoma (PDAC). These domain-specific PDAC signatures were integrated to identify genes that were affected across multiple dimensions of omics space in PDAC (genes under multiple regulatory controls, GMCs). To further pin down the regulators of PDAC pathophysiology, a systems-level network was generated from knowledge-based interaction information applied to the above identified GMCs. Key regulators were identified from the GMC network based on network statistics and their functional importance was validated using gene set enrichment analysis and survival analysis. Rank-based meta-analysis identified 5391 genes, 109 miRNAs and 2081 methylation-sites significantly differentially expressed in PDAC (false discovery rate ≤ 0.05). Bimodal integration of meta-analysis signatures revealed 1150 and 715 genes regulated by miRNAs and methylation, respectively. Further analysis identified 189 altered genes that are commonly regulated by miRNA and methylation, hence considered GMCs. Systems-level analysis of the scale-free GMCs network identified eight potential key regulator hubs, namely E2F3, HMGA2, RASA1, IRS1, NUAK1, ACTN1, SKI and DLL1, associated with important pathways driving cancer progression. Survival analysis on individual key regulators revealed
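The false discovery rate ≤ 0.05 threshold reported above is conventionally enforced with the Benjamini-Hochberg procedure; a minimal sketch of that standard step (the paper's full rank-based meta-analysis pipeline is not reproduced here, and the p-values below are hypothetical):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha
    by the Benjamini-Hochberg step-up procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    # Largest rank k with p_(k) <= k * alpha / m; reject the k smallest.
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k_max = rank
    return sorted(order[:k_max])

# Toy example: small p-values survive the FDR cut-off, large ones do not.
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(benjamini_hochberg(pvals, alpha=0.05))  # → [0, 1]
```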

  3. Finite key analysis for symmetric attacks in quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Tim; Kampermann, Hermann; Kleinmann, Matthias

    2006-10-15

    We introduce a constructive method to calculate the achievable secret key rate for a generic class of quantum key distribution protocols, when only a finite number n of signals is given. Our approach is applicable to all scenarios in which the quantum state shared by Alice and Bob is known. In particular, we consider the six state protocol with symmetric eavesdropping attacks, and show that for a small number of signals, i.e., below n ≈ 10⁴, the finite key rate differs significantly from the asymptotic value for n → ∞. However, for larger n, a good approximation of the asymptotic value is found. We also study secret key rates for protocols using higher-dimensional quantum systems.

  4. Finite-key analysis for measurement-device-independent quantum key distribution.

    PubMed

    Curty, Marcos; Xu, Feihu; Cui, Wei; Lim, Charles Ci Wen; Tamaki, Kiyoshi; Lo, Hoi-Kwong

    2014-04-29

    Quantum key distribution promises unconditionally secure communications. However, as practical devices tend to deviate from their specifications, the security of some practical systems is no longer valid. In particular, an adversary can exploit imperfect detectors to learn a large part of the secret key, even though the security proof claims otherwise. Recently, a practical approach--measurement-device-independent quantum key distribution--has been proposed to solve this problem. However, so far its security has only been fully proven under the assumption that the legitimate users of the system have unlimited resources. Here we fill this gap and provide a rigorous security proof against general attacks in the finite-key regime. This is obtained by applying large deviation theory, specifically the Chernoff bound, to perform parameter estimation. For the first time we demonstrate the feasibility of long-distance implementations of measurement-device-independent quantum key distribution within a reasonable time frame of signal transmission.
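The parameter-estimation step described above rests on a concentration inequality relating observed finite statistics to worst-case true values. As a simplified stand-in for the Chernoff bound used in the paper, the looser but simpler Hoeffding inequality gives the flavor (function name and numbers are illustrative):

```python
from math import log, sqrt

def hoeffding_upper_bound(e_obs: float, n: int, eps: float) -> float:
    """Worst-case error rate consistent with an observed rate e_obs
    over n trials, except with failure probability eps (Hoeffding bound).
    """
    return e_obs + sqrt(log(1.0 / eps) / (2.0 * n))

# The finite-size penalty shrinks as n grows, which is why short sessions
# give key rates far below the asymptotic value.
for n in (10**3, 10**4, 10**6):
    print(n, round(hoeffding_upper_bound(0.02, n, 1e-10), 4))
```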

  5. Exploring the 12-Key Approach: Perceptions and Experiences of Improvising Jazz Vocalists

    ERIC Educational Resources Information Center

    Hargreaves, Wendy

    2016-01-01

    The 12-key approach is considered a foundational practice strategy for jazz instrumentalists. Its relevance to vocalists, however, seems less clear. This article investigates improvising jazz vocalists' perceptions and experiences of using the 12-key approach as distinguished from instrumentalists'. It uses data from a two-phase, mixed methods…

  6. A novel key-frame extraction approach for both video summary and video index.

    PubMed

    Lei, Shaoshuai; Xie, Gang; Yan, Gaowei

    2014-01-01

    Existing key-frame extraction methods are basically video-summary oriented, while the index task of key-frames is ignored. This paper presents a novel key-frame extraction approach that serves both video summary and video index. First a dynamic distance separability algorithm is proposed to divide a shot into subshots based on semantic structure, and then appropriate key-frames are extracted in each subshot by singular value decomposition (SVD). Finally, three evaluation indicators are proposed to evaluate the performance of the new approach. Experimental results show that the proposed approach achieves good semantic structure for semantics-based video index and meanwhile produces video summary consistent with human perception.
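One generic variant of SVD-based key-frame selection can be sketched as follows: represent a subshot as a frame-by-feature matrix and pick the frame that projects most strongly onto the dominant singular direction. This is an illustrative reading, not the paper's exact algorithm, and the feature matrix is synthetic:

```python
import numpy as np

def pick_key_frame(features: np.ndarray) -> int:
    """Pick the frame whose feature vector projects most strongly onto
    the dominant singular direction of the subshot.

    features: (n_frames, n_features) matrix for one subshot.
    Returns the index of the selected key-frame.
    """
    # Economy SVD: rows of U give per-frame weights along each direction.
    u, s, vt = np.linalg.svd(features, full_matrices=False)
    return int(np.argmax(np.abs(u[:, 0])))

rng = np.random.default_rng(0)
subshot = rng.random((12, 64))  # 12 synthetic frames, 64 features each
print(pick_key_frame(subshot))
```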

  7. Experimental quantum key distribution with finite-key security analysis for noisy channels.

    PubMed

    Bacco, Davide; Canale, Matteo; Laurenti, Nicola; Vallone, Giuseppe; Villoresi, Paolo

    2013-01-01

    In quantum key distribution implementations, each session is typically chosen long enough so that the secret key rate approaches its asymptotic limit. However, this choice may be constrained by the physical scenario, as in the perspective use with satellites, where the passage of one terminal over the other is restricted to a few minutes. Here we demonstrate experimentally the extraction of secure keys leveraging an optimal design of the prepare-and-measure scheme, according to recent finite-key theoretical tight bounds. The experiment is performed in different channel conditions, and assuming two distinct attack models: individual attacks or general quantum attacks. The request on the number of exchanged qubits is then obtained as a function of the key size and of the ambient quantum bit error rate. The results indicate that viable conditions for effective symmetric, and even one-time-pad, cryptography are achievable.

  8. Enhancing LoRaWAN Security through a Lightweight and Authenticated Key Management Approach.

    PubMed

    Sanchez-Iborra, Ramon; Sánchez-Gómez, Jesús; Pérez, Salvador; Fernández, Pedro J; Santa, José; Hernández-Ramos, José L; Skarmeta, Antonio F

    2018-06-05

    Luckily, new communication technologies and protocols are nowadays designed considering security issues. A clear example of this can be found in the Internet of Things (IoT) field, a quite recent area where communication technologies such as ZigBee or IPv6 over Low power Wireless Personal Area Networks (6LoWPAN) already include security features to guarantee authentication, confidentiality and integrity. More recent technologies are Low-Power Wide-Area Networks (LP-WAN), which also consider security, but present initial approaches that can be further improved. An example of this can be found in Long Range (LoRa) and its layer-two supporter LoRa Wide Area Network (LoRaWAN), which include a security scheme based on pre-shared cryptographic material lacking flexibility when a key update is necessary. Because of this, in this work, we evaluate the security vulnerabilities of LoRaWAN in the area of key management and propose different alternative schemes. Concretely, the application of an approach based on the recently specified Ephemeral Diffie-Hellman Over COSE (EDHOC) is found as a convenient solution, given its flexibility in the update of session keys, its low computational cost and the limited message exchanges needed. A comparative conceptual analysis considering the overhead of different security schemes for LoRaWAN is carried out in order to evaluate their benefits in the challenging area of LP-WAN.
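The session-key refresh that ephemeral Diffie-Hellman enables can be illustrated with a toy exchange; real LoRaWAN/EDHOC proposals use authenticated elliptic-curve DH with COSE objects, not the bare modular-exponentiation group sketched here:

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman parameters; illustration only, not the
# authenticated elliptic-curve groups EDHOC actually specifies.
P = 2**255 - 19  # a convenient large prime
G = 5

def ephemeral_keypair():
    """Fresh (private, public) pair for one rekeying round."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(priv: int, peer_pub: int) -> bytes:
    """Derive a 32-byte session key from the shared DH secret."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(shared.to_bytes(32, "big")).digest()

# Each rekeying round uses fresh ephemeral values, giving the flexible
# key updates (and forward secrecy) that static pre-shared keys lack.
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
assert session_key(a_priv, b_pub) == session_key(b_priv, a_pub)
print(session_key(a_priv, b_pub).hex()[:16])
```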

  9. Improved statistical fluctuation analysis for measurement-device-independent quantum key distribution with four-intensity decoy-state method.

    PubMed

    Mao, Chen-Chen; Zhou, Xing-Yu; Zhu, Jian-Rong; Zhang, Chun-Hui; Zhang, Chun-Mei; Wang, Qin

    2018-05-14

    Recently Zhang et al [Phys. Rev. A 95, 012333 (2017)] developed a new approach to estimate the failure probability for the decoy-state BB84 QKD system when taking the finite-size key effect into account, which offers security comparable to the Chernoff bound while resulting in an improved key rate and transmission distance. Based on Zhang et al's work, we now extend this approach to measurement-device-independent quantum key distribution (MDI-QKD) and, for the first time, implement it on the four-intensity decoy-state MDI-QKD system. Moreover, by utilizing joint constraints and collective error-estimation techniques, we can markedly increase the performance of practical MDI-QKD systems compared with either three- or four-intensity decoy-state MDI-QKD using Chernoff bound analysis, and achieve a much higher security level than analyses applying the Gaussian approximation.

  10. A case study detailing key considerations for implementing a telehealth approach to office ergonomics.

    PubMed

    Ritchie, Catherine L W; Miller, Linda L; Antle, David M

    2017-01-01

    Telehealth approaches to delivering ergonomics assessment hold great potential to improve service delivery in rural and remote settings. This case study describes a telehealth-based ergonomics service delivery process, and compares in-person and telehealth-based ergonomics approaches at an Alberta-based non-profit advocacy group. This project demonstrates that telehealth approaches to ergonomics do not lead to significantly different scoring outcomes for assessment of ergonomics issues, when compared to in-person assessments. This project also outlines the importance of live real-time video conferencing to improving communication, attaining key assessment information, and demonstrating ergonomic adjustments. However, some key considerations of bandwidth and hardware capabilities need to be taken into account. Key communication strategies are outlined to improve rapport, maintain employee confidentiality, and reduce client anxiety around telehealth ergonomics assessments. This project provides further support for telehealth approaches to office ergonomics, and outlines some key implementation strategies and barriers that should be considered.

  11. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    PubMed

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
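The standardize-then-regress procedure described can be sketched as follows, assuming per-year shares of the overall total as the standardization (the paper does not spell out its exact formula here) and hypothetical counts:

```python
import numpy as np

def field_position(counts_by_year: np.ndarray, totals_by_year: np.ndarray):
    """Return (relative_amount, trend_slope) for one field.

    counts_by_year: patent applications in the field, one value per year.
    totals_by_year: all oncology patent applications, same years.
    """
    share = counts_by_year / totals_by_year   # standardize year by year
    relative_amount = share.mean()            # mean of the standardized values
    years = np.arange(len(share))
    slope = np.polyfit(years, share, 1)[0]    # linear regression trend
    return relative_amount, slope

# Hypothetical field: a growing share of a fixed yearly total (2006-2012).
counts = np.array([100, 130, 170, 210, 260, 320, 390], dtype=float)
totals = np.array([10_000] * 7, dtype=float)
amount, slope = field_position(counts, totals)
print(f"relative amount {amount:.4f}, trend slope {slope:+.4f}")
# High relative amount or a positive slope places a field in a "key" quadrant.
```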

  12. Minimally Invasive Supraorbital Key-hole Approach for the Treatment of Anterior Cranial Fossa Meningiomas

    PubMed Central

    IACOANGELI, Maurizio; NOCCHI, Niccolò; NASI, Davide; DI RIENZO, Alessandro; DOBRAN, Mauro; GLADI, Maurizio; COLASANTI, Roberto; ALVARO, Lorenzo; POLONARA, Gabriele; SCERRATI, Massimo

    2016-01-01

    The most important target of minimally invasive surgery is to obtain the best therapeutic effect with the least iatrogenic injury. In this context, a pivotal role in contemporary neurosurgery is played by the supraorbital key-hole approach proposed by Perneczky for anterior cranial base surgery. In this article, it is presented as a possible valid alternative to traditional craniotomies for the removal of anterior cranial fossa meningiomas. From January 2008 to January 2012, 56 patients underwent anterior cranial base meningioma removal at our department. Thirty-three patients underwent traditional approaches and 23 the supraorbital key-hole technique. A clinical and neuroradiological pre- and postoperative evaluation was performed, with attention to eventual complications, length of surgical procedure, and hospitalization. Compared to traditional approaches, the supraorbital key-hole approach was associated neither with a greater range of postoperative complications nor with a longer surgical procedure and hospitalization, while permitting the same lesion control. With this technique, minimization of brain exposure and manipulation with reduction of unwanted iatrogenic injuries, preservation of neurovascular structures, and a better aesthetic result are possible. The supraorbital key-hole approach according to Perneczky could represent a valid alternative to traditional approaches in anterior cranial base meningioma surgery. PMID:26804334

  13. Key components of financial-analysis education for clinical nurses.

    PubMed

    Lim, Ji Young; Noh, Wonjung

    2015-09-01

    In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were performed; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, nurse executive, nurse managers, and an accountant, participated in the content validity index. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.

  14. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis

    PubMed Central

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    Background This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Methodology/Principal Findings Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006–2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006–2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including “natural products and polymers” with nine key technical points, “fermentation industry” with twelve ones, “electrical medical equipment” with four ones, and “diagnosis, surgery” with four ones. Conclusions/Significance The results of this study could provide guidance on the development

  15. Protein analysis: key to the future.

    PubMed

    Boodhun, Nawsheen

    2018-05-01

    Protein analysis is crucial to elucidating the function of proteins and understanding the impact of their presence, absence and alteration. This is key to advancing knowledge about diseases, providing the opportunity for biomarker discovery and development of therapeutics. In this issue of Tech News, Nawsheen Boodhun explores the various means of protein analysis.

  16. Interstage Flammability Analysis Approach

    NASA Technical Reports Server (NTRS)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore, potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.
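A mixture-ratio flammability indicator of the kind described can be as simple as a limit check on the local hydrogen fraction. The limits below are the commonly cited ~4-75 vol% range for hydrogen in air; the study's actual CFD indicator is derived from local mixture ratios and is not public:

```python
def flammable_h2(mole_fraction_h2: float,
                 lfl: float = 0.04, ufl: float = 0.75) -> bool:
    """Crude flammability flag: is the local hydrogen mole fraction
    between the lower (LFL) and upper (UFL) flammability limits?

    Defaults are the commonly cited ~4-75 vol% range for hydrogen in
    air; illustration only, not the report's proprietary indicator.
    """
    return lfl <= mole_fraction_h2 <= ufl

for x in (0.01, 0.10, 0.80):
    print(x, flammable_h2(x))
```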

  17. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to the software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. Complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrology evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with support of the powerful computing capability of the PC. Another concern is evaluation of software features such as the correctness, reliability, stability, security, and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.
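The simulation-based evaluation of measurement uncertainty mentioned above is commonly done by Monte Carlo propagation in the spirit of GUM Supplement 1: draw the inputs from their assumed distributions and push each draw through the VI's measurement function. A sketch with a hypothetical measurement function:

```python
import random
import statistics

def monte_carlo_uncertainty(measure, n_trials=20_000, seed=42):
    """Estimate the mean and standard uncertainty of a measurement
    function by propagating simulated input noise through it
    (Monte Carlo method, GUM Supplement 1 style)."""
    rng = random.Random(seed)
    samples = [measure(rng) for _ in range(n_trials)]
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical VI measurement: V = I * R with noisy current and
# resistance readings (names and noise levels are illustrative).
def voltage(rng):
    current = rng.gauss(2.0, 0.01)      # amperes
    resistance = rng.gauss(50.0, 0.1)   # ohms
    return current * resistance

mean_v, u_v = monte_carlo_uncertainty(voltage)
print(f"V = {mean_v:.2f} V, standard uncertainty {u_v:.3f} V")
```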

  18. Metabolomic approach for determination of key volatile compounds related to beef flavor in glutathione-Maillard reaction products.

    PubMed

    Lee, Sang Mi; Kwon, Goo Young; Kim, Kwang-Ok; Kim, Young-Suk

    2011-10-10

    The non-targeted analysis, combining gas chromatography coupled with time-of-flight mass spectrometry (GC-TOF/MS) and sensory evaluation, was applied to investigate the relationship between volatile compounds and the sensory attributes of glutathione-Maillard reaction products (GSH-MRPs) prepared under different reaction conditions. Volatile compounds in GSH-MRPs correlating to the sensory attributes were determined using partial least-squares (PLS) regression. Volatile compounds such as 2-methylfuran-3-thiol, 3-sulfanylpentan-2-one, furan-2-ylmethanethiol, 2-propylpyrazine, 1-furan-2-ylpropan-2-one, 1H-pyrrole, 2-methylthiophene, and 2-(furan-2-ylmethyldisulfanylmethyl)furan could be identified as possible key contributors to the beef-related attributes of GSH-MRPs. In this study, we demonstrated that the unbiased non-targeted analysis based on metabolomic approach allows the identification of key volatile compounds related to beef flavor in GSH-MRPs. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Continuous variable quantum key distribution: finite-key analysis of composable security against coherent attacks.

    PubMed

    Furrer, F; Franz, T; Berta, M; Leverrier, A; Scholz, V B; Tomamichel, M; Werner, R F

    2012-09-07

    We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of runs of the protocol. This bound is valid under general coherent attacks, and gives rise to keys which are composably secure. For comparison, we also give a lower bound valid under the assumption of collective attacks. For both scenarios, we find positive key rates using experimental parameters reachable today.
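Schematically, the recipe the abstract describes combines leftover hashing with a smooth-entropy uncertainty relation; the sketch below keeps only the structure, omitting constants, smoothing corrections, and the paper's exact parameters:

```latex
% Extractable key length via leftover hashing (schematic):
\ell \;\lesssim\; H_{\min}^{\varepsilon}(X \mid E)
  \;-\; \mathrm{leak}_{\mathrm{EC}}
  \;-\; O\!\left(\log \tfrac{1}{\varepsilon}\right)

% Smooth-entropy uncertainty relation bounding Eve's information:
H_{\min}^{\varepsilon}(X \mid E) \;\ge\;
  n \log \tfrac{1}{c} \;-\; H_{\max}^{\varepsilon}(X \mid B)
```

Here leak_EC is the error-correction leakage and c quantifies the overlap (incompatibility) of the two homodyne measurement settings; the max-entropy term is estimated from the observed correlations between Alice and Bob.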

  20. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach

    PubMed Central

    Si, Sheng-Li; You, Xiao-Yue; Huang, Jia

    2017-01-01

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to the limited resources. A feasible way is to identify the central and influential indicators to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that “accidents/adverse events”, “nosocomial infection”, “incidents/errors”, “number of operations/procedures” are significant influential indicators. Also, the indicators of “length of stay”, “bed occupancy” and “financial measures” play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions. PMID:28825613
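The DEMATEL computation at the core of this approach is standard and short: normalize the expert direct-influence matrix, form the total-relation matrix T = N(I − N)⁻¹, then read off prominence (D + R, how central an indicator is) and relation (D − R, whether it is a cause or an effect). A sketch with hypothetical influence scores, not the study's data:

```python
import numpy as np

def dematel(direct: np.ndarray):
    """Classic DEMATEL: direct-influence matrix -> (prominence, relation).

    prominence D+R flags central indicators; relation D-R > 0 marks causes.
    """
    # Normalize by the largest row/column sum so the series converges.
    n = direct / max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    t = n @ np.linalg.inv(np.eye(direct.shape[0]) - n)  # total relations
    d = t.sum(axis=1)  # influence each indicator dispatches
    r = t.sum(axis=0)  # influence each indicator receives
    return d + r, d - r

# Hypothetical 3-indicator direct-influence scores (0 = none .. 4 = very high).
A = np.array([[0, 3, 2],
              [1, 0, 3],
              [2, 1, 0]], dtype=float)
prominence, relation = dematel(A)
print(prominence.round(3), relation.round(3))
```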

  1. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    PubMed

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to the limited resources. A feasible way is to identify the central and influential indicators to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors", "number of operations/procedures" are significant influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions.

  2. Finite-key security analysis of quantum key distribution with imperfect light sources

    DOE PAGES

    Mizutani, Akihiro; Curty, Marcos; Lim, Charles Ci Wen; ...

    2015-09-09

    In recent years, the gap between theory and practice in quantum key distribution (QKD) has been significantly narrowed, particularly for QKD systems with arbitrarily flawed optical receivers. The status for QKD systems with imperfect light sources is however less satisfactory, in the sense that the resulting secure key rates are often overly dependent on the quality of state preparation. This is especially the case when the channel loss is high. Very recently, to overcome this limitation, Tamaki et al proposed a QKD protocol based on the so-called 'rejected data analysis', and showed that its security in the limit of infinitely long keys is almost independent of any encoding flaw in the qubit space, this protocol being compatible with the decoy-state method. Here, as a step towards practical QKD, we show that a similar conclusion is reached in the finite-key regime, even when the intensity of the light source is unstable. More concretely, we derive security bounds for a wide class of realistic light sources and show that the bounds are also efficient in the presence of high channel loss. Our results strongly suggest the feasibility of long-distance provably secure communication with imperfect light sources.

  3. Active Rehabilitation-a community peer-based approach for persons with spinal cord injury: international utilisation of key elements.

    PubMed

    Divanoglou, A; Tasiemski, T; Augutis, M; Trok, K

    2017-06-01

    Active Rehabilitation (AR) is a community peer-based approach that started in Sweden in 1976. As a key component of the approach, AR training camps provide intensive, goal-oriented, intentional, group-based, customised training and peer-support opportunities in a community environment for individuals with spinal cord injury. Prospective cross-sectional study. To describe the profile of the organisations that use components of the AR approach, and to explore the characteristics and the international variations of the approach. Twenty-two organisations from 21 countries from Europe, Asia and Africa reported using components of the AR approach during the past 10 years. An electronic survey was developed and distributed through a personalised email. Sampling involved a prospective identification of organisations that met the inclusion criteria and snowball strategies. While there were many collaborating links between the organisations, RG Active Rehabilitation from Sweden and Motivation Charitable Trust from the United Kingdom were identified as key supporting organisations. The 10 key elements of the AR approach were found to be used uniformly across the participating organisations. Small variations were associated with variations in country income and key supporting organisation. This is the first study to describe the key elements and international variations of the AR approach. This will provide the basis for further studies exploring the effectiveness of the approach, it will likely facilitate international collaboration on research and operational aspects and it could potentially support higher integration in the health-care system and long-term funding of these programmes.

  4. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
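
    The INN's core machinery, latent semantic analysis, can be sketched in a few lines. The snippet below is an illustrative toy, not the authors' implementation: the context strings, construct names and rank are hypothetical, but the pipeline (term-document counts, truncated SVD, cosine similarity between constructs) is the standard LSA recipe the record describes.

    ```python
    import numpy as np

    # Toy "published contexts": each string stands in for the text surrounding
    # one construct term in the literature (hypothetical data).
    contexts = {
        "transition":      "patient change illness management change adjust",
        "self-management": "patient illness management skills daily care",
        "adaptation":      "patient adjust change coping illness",
        "coping":          "stress adjust emotion strategies",
    }

    # Build a term-document count matrix (one row per construct).
    vocab = sorted({w for doc in contexts.values() for w in doc.split()})
    terms = list(contexts)
    X = np.array([[doc.split().count(w) for w in vocab]
                  for doc in contexts.values()], dtype=float)

    # Latent semantic analysis: a truncated SVD projects each construct's
    # context vector into a low-rank "semantic" space.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = 2
    Z = U[:, :k] * s[:k]          # construct coordinates in semantic space

    def similarity(a, b):
        """Cosine similarity between two constructs in the latent space."""
        va, vb = Z[terms.index(a)], Z[terms.index(b)]
        return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

    # Constructs used in similar contexts score higher, mirroring how the
    # INN ranks candidate synonyms.
    print(similarity("transition", "self-management"))
    print(similarity("transition", "coping"))
    ```

    In the toy data, "transition" shares most of its context words with "self-management" and few with "coping", so the former pair scores higher, which is exactly the kind of ranking the INN uses to pick the best synonym.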

  5. Device-independent secret-key-rate analysis for quantum repeaters

    NASA Astrophysics Data System (ADS)

    Holz, Timo; Kampermann, Hermann; Bruß, Dagmar

    2018-01-01

    The device-independent approach to quantum key distribution (QKD) aims to establish a secret key between two or more parties with untrusted devices, potentially under full control of a quantum adversary. The performance of a QKD protocol can be quantified by the secret key rate, which can be lower bounded via the violation of an appropriate Bell inequality in a setup with untrusted devices. We study secret key rates in the device-independent scenario for different quantum repeater setups and compare them to their device-dependent analogs. The quantum repeater setups under consideration are the original protocol by Briegel et al. [Phys. Rev. Lett. 81, 5932 (1998), 10.1103/PhysRevLett.81.5932] and the hybrid quantum repeater protocol by van Loock et al. [Phys. Rev. Lett. 96, 240501 (2006), 10.1103/PhysRevLett.96.240501]. For a given repeater scheme and a given QKD protocol, the secret key rate depends on a variety of parameters, such as the gate quality or the detector efficiency. We systematically analyze the impact of these parameters and suggest optimized strategies.
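
    A common way to lower-bound a device-independent key rate from a Bell violation is the CHSH-based bound of Acín et al. [Phys. Rev. Lett. 98, 230501 (2007)]. The sketch below implements that generic bound as an illustration; it is not the repeater-specific rate analysis of this record. `S` is the CHSH value and `Q` the quantum bit error rate.

    ```python
    from math import log2, sqrt

    def h(p):
        """Binary entropy in bits (input clamped against floating-point drift)."""
        p = min(max(p, 0.0), 1.0)
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def diqkd_rate(S, Q):
        """Asymptotic lower bound on the device-independent key rate for a
        CHSH-based protocol (Acin et al. 2007): the first entropy term bounds
        Eve's knowledge via the CHSH value S; h(Q) is the error-correction cost."""
        if S <= 2.0:                # no Bell violation, no DI security
            return 0.0
        eve = h((1 + sqrt((S / 2) ** 2 - 1)) / 2)
        return max(0.0, 1.0 - eve - h(Q))

    # An ideal singlet state gives S = 2*sqrt(2) and Q = 0: one key bit per round.
    print(diqkd_rate(2 * sqrt(2), 0.0))   # -> 1.0
    # Noise lowers both the violation and the achievable rate.
    print(diqkd_rate(2.6, 0.05))
    ```

    Gate quality and detector efficiency enter such an analysis precisely by dragging `S` down towards the classical bound of 2 and pushing `Q` up, which is why the record's parameter study matters.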

  6. Cities and health: history, approaches, and key questions.

    PubMed

    Vlahov, David; Gibble, Emily; Freudenberg, Nicholas; Galea, Sandro

    2004-12-01

    The majority of the world's population will live in cities in the next few years, and the pace of urbanization worldwide will continue to accelerate over the coming decades. Such a dramatic demographic shift can be expected to have an impact on population health. Although there has been historic interest in how city living is associated with health, this interest has waxed and waned and a cogent framework has yet to evolve that encompasses key issues in urban health. In this article, the authors discuss three alternate approaches to the study of urban health today; these include considering urban health from the perspective of a presumed urban health penalty, from an urban sprawl perspective, and more comprehensively, considering how urban living conditions may be associated with health. The authors also propose three key questions that may help guide the study and practice of urban health in coming decades. These include considering what specific features of cities are causally related to health, the extent to which these features are unique to a particular city or are different between cities, and ultimately, to what extent these features of cities are modifiable in order to allow interventions that can improve the health of urban populations.

  7. A practical guide to assessing clinical decision-making skills using the key features approach.

    PubMed

    Farmer, Elizabeth A; Page, Gordon

    2005-12-01

    This paper in the series on professional assessment provides a practical guide to writing key features problems (KFPs). Key features problems test clinical decision-making skills in written or computer-based formats. They are based on the concept of critical steps or 'key features' in decision making and represent an advance on the older, less reliable patient management problem (PMP) formats. The practical steps in writing these problems are discussed and illustrated by examples. Steps include assembling problem-writing groups, selecting a suitable clinical scenario or problem and defining its key features, writing the questions, selecting question response formats, preparing scoring keys, reviewing item quality and item banking. The KFP format provides educators with a flexible approach to testing clinical decision-making skills with demonstrated validity and reliability when constructed according to the guidelines provided.

  8. Data warehouse model for monitoring key performance indicators (KPIs) using goal oriented approach

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohammed Thajeel; Ta'a, Azman; Bakar, Muhamad Shahbani Abu

    2016-08-01

    The growth and development of universities, just as other organizations, depend on their abilities to strategically plan and implement development blueprints which are in line with their vision and mission statements. The actualization of these statements, which are often designed into goals and sub-goals and linked to their respective actors, is better measured by defining key performance indicators (KPIs) of the university. This study proposes ReGADaK, an extension of the GRAnD approach, which highlights the facts, dimensions, attributes, measures and KPIs of the organization. The measures from the goal analysis of this unit serve as the basis for developing the related university's KPIs. The proposed data warehouse schema is evaluated through expert review, prototyping and usability evaluation. The findings from the evaluation processes suggest that the proposed data warehouse schema is suitable for monitoring the university's KPIs.

  9. The comparison and analysis of extracting video key frame

    NASA Astrophysics Data System (ADS)

    Ouyang, S. Z.; Zhong, L.; Luo, R. Q.

    2018-05-01

    Video key frame extraction is an important part of large-scale data processing. Building on previous work on key frame extraction, we summarize four important key frame extraction algorithms, which are largely based on comparing the difference between pairs of frames: if the difference exceeds a threshold value, the corresponding frames are taken as distinct key frames. We then propose a key frame extraction method based on mutual information, introducing information entropy: appropriate threshold values are selected to form initial classes, and frames whose mutual information is close to the class mean are taken as candidate key frames. In this paper, these algorithms are used to extract key frames from tunnel traffic videos. Through analysis of the experimental results and comparison of the pros and cons of these algorithms, a basis for practical applications is provided.
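
    A minimal sketch of the difference-plus-threshold scheme the record summarises (the histogram metric, threshold value and synthetic frames below are illustrative assumptions, not the paper's exact algorithm):

    ```python
    import numpy as np

    def histogram_diff(f1, f2, bins=16):
        """Normalised distance between the grey-level histograms of two frames."""
        h1, _ = np.histogram(f1, bins=bins, range=(0, 256))
        h2, _ = np.histogram(f2, bins=bins, range=(0, 256))
        return np.abs(h1 - h2).sum() / f1.size

    def extract_keyframes(frames, threshold=0.4):
        """Difference-based key frame extraction: keep the first frame, then
        any frame whose histogram differs from the last key frame by more
        than the threshold."""
        keys = [0]
        for i in range(1, len(frames)):
            if histogram_diff(frames[keys[-1]], frames[i]) > threshold:
                keys.append(i)
        return keys

    # Synthetic "video": a dark scene, then a bright scene (hypothetical data).
    rng = np.random.default_rng(0)
    dark = [rng.integers(0, 60, (32, 32)) for _ in range(5)]
    bright = [rng.integers(180, 255, (32, 32)) for _ in range(5)]
    frames = dark + bright
    print(extract_keyframes(frames))   # the scene cut at frame 5 is detected
    ```

    The mutual-information variant the record proposes replaces the histogram distance with the mutual information between frames, which is less sensitive to uniform illumination changes than a raw pixel or histogram difference.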

  10. Selection of key financial indicators: a literature, panel and survey approach.

    PubMed

    Pink, George H; Daniel, Imtiaz; Hall, Linda McGillis; McKillop, Ian

    2007-01-01

    Since 1998, most hospitals in Ontario have voluntarily participated in one of the largest and most ambitious publicly available performance-reporting initiatives in the world. This article describes the method used to select key financial indicators for inclusion in the report including the literature review, panel and survey approaches that were used. The results for five years of recent data for Ontario hospitals are also presented.

  11. Implementing recovery: an analysis of the key technologies in Scotland

    PubMed Central

    2011-01-01

    Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English-speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems. PMID:21569633

  12. Profiling conserved biological pathways in Autosomal Dominant Polycystic Kidney Disorder (ADPKD) to elucidate key transcriptomic alterations regulating cystogenesis: A cross-species meta-analysis approach.

    PubMed

    Chatterjee, Shatakshee; Verma, Srikant Prasad; Pandey, Priyanka

    2017-09-05

    Initiation and progression of fluid-filled cysts mark Autosomal Dominant Polycystic Kidney Disease (ADPKD). Thus, improved therapeutics targeting cystogenesis remain a constant challenge. Microarray studies in single ADPKD animal model species with limited sample sizes tend to provide scattered views of the underlying ADPKD pathogenesis. We therefore aim to perform a cross-species meta-analysis to profile conserved biological pathways that might be key targets for therapy. Nine ADPKD microarray datasets on rat, mouse and human fulfilled our study criteria and were chosen. Intra-species combined analysis was performed after removal of batch effects. Significantly enriched GO biological processes and KEGG pathways were computed and their overlap was observed. For the conserved pathways, biological modules and gene regulatory networks were observed. Additionally, Gene Set Enrichment Analysis (GSEA) using the Molecular Signature Database (MSigDB) was performed for genes found in conserved pathways. We obtained 28 modules of significantly enriched GO processes and 5 major functional categories from significantly enriched KEGG pathways conserved in human, mice and rats that in turn suggest a global transcriptomic perturbation affecting cyst formation, growth and progression. Significantly enriched pathways obtained from up-regulated genes such as Genomic instability, Protein localization in ER and Insulin Resistance were found to regulate cyst formation and growth whereas cyst progression due to increased cell adhesion and inflammation was suggested by perturbations in Angiogenesis, TGF-beta, CAMs, and Infection related pathways. Additionally, networks revealed shared genes among pathways e.g. SMAD2 and SMAD7 in Endocytosis and TGF-beta. Our study suggests cyst formation and progression to be an outcome of interplay between a set of several key deregulated pathways. Thus, further translational research is warranted focusing on developing a combinatorial therapeutic
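
    The record's "false discovery rate ≤ 0.05" cut-off is typically applied with the Benjamini-Hochberg step-up procedure. A minimal sketch (the p-values are hypothetical):

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        """Return a boolean mask of p-values significant at FDR <= alpha.
        Standard Benjamini-Hochberg step-up procedure, as commonly used to
        call differential expression in meta-analyses."""
        p = np.asarray(pvals, dtype=float)
        n = p.size
        order = np.argsort(p)
        ranked = p[order]
        # Find the largest k with p_(k) <= (k/n) * alpha; every p-value up
        # to and including rank k is then declared significant.
        passed = ranked <= (np.arange(1, n + 1) / n) * alpha
        mask = np.zeros(n, dtype=bool)
        if passed.any():
            k = np.max(np.nonzero(passed)[0])
            mask[order[: k + 1]] = True
        return mask

    # Hypothetical meta-analysis p-values for six genes.
    pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
    print(benjamini_hochberg(pvals, alpha=0.05))
    ```

    Note the step-up character: a gene can pass even if its own p-value exceeds its rank threshold, as long as some larger-ranked p-value passes.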

  13. An approach to market analysis for lighter than air transportation of freight

    NASA Technical Reports Server (NTRS)

    Roberts, P. O.; Marcus, H. S.; Pollock, J. H.

    1975-01-01

    An approach is presented to marketing analysis for lighter than air vehicles in a commercial freight market. After a discussion of key characteristics of supply and demand factors, a three-phase approach to marketing analysis is described. The existing transportation systems are quantitatively defined and possible roles for lighter than air vehicles within this framework are postulated. The marketing analysis views the situation from the perspective of both the shipper and the carrier. A demand for freight service is assumed and the resulting supply characteristics are determined. Then, these supply characteristics are used to establish the demand for competing modes. The process is then iterated to arrive at the market solution.

  14. Use of a scenario-neutral approach to identify the key hydro-meteorological attributes that impact runoff from a natural catchment

    NASA Astrophysics Data System (ADS)

    Guo, Danlu; Westra, Seth; Maier, Holger R.

    2017-11-01

    Scenario-neutral approaches are being used increasingly for assessing the potential impact of climate change on water resource systems, as these approaches allow the performance of these systems to be evaluated independently of climate change projections. However, practical implementations of these approaches are still scarce, with a key limitation being the difficulty of generating a range of plausible future time series of hydro-meteorological data. In this study we apply a recently developed inverse stochastic generation approach to support the scenario-neutral analysis, and thus identify the key hydro-meteorological variables to which the system is most sensitive. The stochastic generator simulates synthetic hydro-meteorological time series that represent plausible future changes in (1) the average, extremes and seasonal patterns of rainfall; and (2) the average values of temperature (Ta), relative humidity (RH) and wind speed (uz) as variables that drive PET. These hydro-meteorological time series are then fed through a conceptual rainfall-runoff model to simulate the potential changes in runoff as a function of changes in the hydro-meteorological variables, and runoff sensitivity is assessed with both correlation and Sobol' sensitivity analyses. The method was applied to a case study catchment in South Australia, and the results showed that the most important hydro-meteorological attributes for runoff were winter rainfall followed by the annual average rainfall, while the PET-related meteorological variables had comparatively little impact. The high importance of winter rainfall can be related to the winter-dominated nature of both the rainfall and runoff regimes in this catchment. The approach illustrated in this study can greatly enhance our understanding of the key hydro-meteorological attributes and processes that are likely to drive catchment runoff under a changing climate, thus enabling the design of tailored climate impact assessments to specific
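
    Of the two sensitivity measures the record mentions, the correlation-based one is easy to sketch. The toy model below is hypothetical (the coefficients, ranges and linear form are assumptions, not the study's calibrated rainfall-runoff model), but it shows how a squared correlation separates a dominant driver such as winter rainfall from a weak one such as PET:

    ```python
    import numpy as np

    # Toy "rainfall-runoff" response: runoff depends strongly on winter
    # rainfall and only weakly on PET (hypothetical coefficients).
    rng = np.random.default_rng(1)
    n = 2000
    winter_rain = rng.uniform(100, 400, n)   # mm, sampled scenario values
    pet = rng.uniform(600, 900, n)           # mm, sampled scenario values
    runoff = 0.8 * winter_rain - 0.05 * pet + rng.normal(0, 5, n)

    def sensitivity(x, y):
        """Correlation-based sensitivity index: squared Pearson correlation
        between one input attribute and the model output."""
        return float(np.corrcoef(x, y)[0, 1] ** 2)

    # Rainfall attributes dominate; PET-related drivers matter little.
    print(sensitivity(winter_rain, runoff))
    print(sensitivity(pet, runoff))
    ```

    Sobol' indices generalise this idea to variance contributions that remain meaningful for nonlinear, interacting inputs, which is why the study reports both.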

  15. Microbial genome analysis: the COG approach.

    PubMed

    Galperin, Michael Y; Kristensen, David M; Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V

    2017-09-14

    For the past 20 years, the Clusters of Orthologous Genes (COG) database has been a popular tool for microbial genome annotation and comparative genomics. Initially created for the purpose of evolutionary classification of protein families, the COGs have been used, apart from straightforward functional annotation of sequenced genomes, for such tasks as (i) unification of genome annotation in groups of related organisms; (ii) identification of missing and/or undetected genes in complete microbial genomes; (iii) analysis of genomic neighborhoods, in many cases allowing prediction of novel functional systems; (iv) analysis of metabolic pathways and prediction of alternative forms of enzymes; (v) comparison of organisms by COG functional categories; and (vi) prioritization of targets for structural and functional characterization. Here we review the principles of the COG approach and discuss its key advantages and drawbacks in microbial genome analysis. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  16. Finite-key analysis for the 1-decoy state QKD protocol

    NASA Astrophysics Data System (ADS)

    Rusca, Davide; Boaron, Alberto; Grünenfelder, Fadri; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    It has been shown that in the asymptotic case of infinite-key length, the 2-decoy state Quantum Key Distribution (QKD) protocol outperforms the 1-decoy state protocol. Here, we present a finite-key analysis of the 1-decoy method. Interestingly, we find that for practical block sizes of up to 10^8 bits, the 1-decoy protocol achieves for almost all experimental settings higher secret key rates than the 2-decoy protocol. Since using only one decoy is also easier to implement, we conclude that it is the best choice for QKD, in most common practical scenarios.

  17. An Integrated Approach to Life Cycle Analysis

    NASA Technical Reports Server (NTRS)

    Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.

    2006-01-01

    Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle to grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms; from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing the managers to make risk-informed decisions, and increase the likelihood of meeting mission success criteria.

  18. Meta‐analysis using individual participant data: one‐stage and two‐stage approaches, and why they may differ

    PubMed Central

    Ensor, Joie; Riley, Richard D.

    2016-01-01

    Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
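
    A minimal sketch of the two-stage approach described above, using simulated IPD (the data, effect size and fixed-effect pooling are illustrative assumptions): stage 1 estimates each study's effect and variance separately; stage 2 combines them by inverse-variance weighting.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical IPD from three trials: a treatment indicator and a
    # continuous outcome per participant, with a true effect of 0.5.
    studies = []
    true_effect = 0.5
    for n in (80, 120, 200):
        treat = rng.integers(0, 2, n)
        y = true_effect * treat + rng.normal(0, 1, n)
        studies.append((treat, y))

    # Stage 1: estimate the treatment effect (mean difference) and its
    # variance within each study separately.
    effects, variances = [], []
    for treat, y in studies:
        t, c = y[treat == 1], y[treat == 0]
        effects.append(t.mean() - c.mean())
        variances.append(t.var(ddof=1) / t.size + c.var(ddof=1) / c.size)

    # Stage 2: pool the aggregate data with a fixed-effect inverse-variance
    # meta-analysis, exactly as a traditional aggregate-data analysis would.
    w = 1 / np.array(variances)
    pooled = float(np.sum(w * effects) / w.sum())
    se = float(1 / np.sqrt(w.sum()))
    print(pooled, se)
    ```

    A one-stage analysis would instead fit a single hierarchical regression to all 400 participants at once; as the tutorial explains, differences between the two usually trace back to modelling assumptions (e.g. how variances and study effects are handled), not to the staging itself.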

  19. Passage Key Inlet, Florida; CMS Modeling and Borrow Site Impact Analysis

    DTIC Science & Technology

    2016-06-01

    Impact Analysis by Kelly R. Legault and Sirisha Rayaprolu PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the...use of a nested Coastal Modeling System (CMS) model for Passage Key Inlet, which is one of the connections between the Gulf of Mexico and Tampa Bay...driven sediment transport at Passage Key Inlet. This analysis resulted in issuing a new Florida Department of Environmental Protection (FDEP) permit to

  20. A Statewide Key Informant Survey and Social Indicators Analysis.

    ERIC Educational Resources Information Center

    Fleischer, Mitchell

    This needs assessment study of mental health needs of the elderly in Pennsylvania used a three-part approach. These parts were a review of existing data sources, an extensive key informant study, and a review of service delivery models. A recent study found a prevalence rate for mental illness in the elderly of 12.8%, more than 5% lower than the…

  1. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach

    PubMed Central

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing a suitable model of how creative behaviour emerges that is based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating achievements of this work and directions for further research. PMID:27706185

  2. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach.

    PubMed

    Jordanous, Anna; Keller, Bill

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing a suitable model of how creative behaviour emerges that is based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating achievements of this work and directions for further research.

  3. Identifying key performance indicators for nursing and midwifery care using a consensus approach.

    PubMed

    McCance, Tanya; Telford, Lorna; Wilson, Julie; Macleod, Olive; Dowd, Audrey

    2012-04-01

    The aim of this study was to gain consensus on key performance indicators that are appropriate and relevant for nursing and midwifery practice in the current policy context. There is continuing demand to demonstrate effectiveness and efficiency in health and social care and to communicate this at boardroom level. Whilst there is substantial literature on the use of clinical indicators and nursing metrics, there is less evidence relating to indicators that reflect the patient experience. A consensus approach was used to identify relevant key performance indicators. A nominal group technique was used comprising two stages: a workshop involving all grades of nursing and midwifery staff in two HSC trusts in Northern Ireland (n = 50), followed by a regional Consensus Conference (n = 80). During the workshop, potential key performance indicators were identified. This was used as the basis for the Consensus Conference, which involved two rounds of consensus. Analysis was based on aggregated scores that were then ranked. Stage one identified 38 potential indicators and stage two prioritised the eight top-ranked indicators as a core set for nursing and midwifery. The relevance and appropriateness of these indicators were confirmed with nurses and midwives working in a range of settings and from the perspective of service users. The eight indicators identified do not conform to the majority of other nursing metrics generally reported in the literature. Furthermore, they are strategically aligned to work on the patient experience and are reflective of the fundamentals of nursing and midwifery practice, with the focus on person-centred care. Nurses and midwives have a significant contribution to make in determining the extent to which these indicators are achieved in practice. Furthermore, measurement of such indicators provides an opportunity to evidence the unique impact of nursing/midwifery care on the patient experience. © 2011 Blackwell Publishing Ltd.
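
    The scoring step ("aggregated scores that were then ranked") can be sketched as follows; the indicator names and ratings are hypothetical, and the real study kept the top eight rather than two:

    ```python
    # Hypothetical consensus-round ratings: each participant scores each
    # candidate indicator; scores are aggregated and ranked to pick the
    # core set.
    scores = {
        "patient experience of care": [5, 4, 5, 5],
        "medication safety":          [4, 4, 5, 3],
        "documentation audit":        [2, 3, 2, 3],
        "staffing levels":            [3, 4, 3, 4],
    }

    aggregated = {ind: sum(s) for ind, s in scores.items()}
    ranked = sorted(aggregated, key=aggregated.get, reverse=True)
    top = ranked[:2]   # the study prioritised the top eight as its core set
    print(top)
    ```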

  4. Microarray analysis identifies candidate genes for key roles in coral development

    PubMed Central

    Grasso, Lauretta C; Maindonald, John; Rudd, Stephen; Hayward, David C; Saint, Robert; Miller, David J; Ball, Eldon E

    2008-01-01

    Background Anthozoan cnidarians are amongst the simplest animals at the tissue level of organization, but are surprisingly complex and vertebrate-like in terms of gene repertoire. As major components of tropical reef ecosystems, the stony corals are anthozoans of particular ecological significance. To better understand the molecular bases of both cnidarian development in general and coral-specific processes such as skeletogenesis and symbiont acquisition, microarray analysis was carried out through the period of early development – when skeletogenesis is initiated, and symbionts are first acquired. Results Of 5081 unique peptide coding genes, 1084 were differentially expressed (P ≤ 0.05) in comparisons between four different stages of coral development, spanning key developmental transitions. Genes of likely relevance to the processes of settlement, metamorphosis, calcification and interaction with symbionts were characterised further and their spatial expression patterns investigated using whole-mount in situ hybridization. Conclusion This study is the first large-scale investigation of developmental gene expression for any cnidarian, and has provided candidate genes for key roles in many aspects of coral biology, including calcification, metamorphosis and symbiont uptake. One surprising finding is that some of these genes have clear counterparts in higher animals but are not present in the closely-related sea anemone Nematostella. Secondly, coral-specific processes (i.e. traits which distinguish corals from their close relatives) may be analogous to similar processes in distantly related organisms. This first large-scale application of microarray analysis demonstrates the potential of this approach for investigating many aspects of coral biology, including the effects of stress and disease. PMID:19014561

  5. Key Elements of a Family Intervention for Schizophrenia: A Qualitative Analysis of an RCT.

    PubMed

    Grácio, Jaime; Gonçalves-Pereira, Manuel; Leff, Julian

    2018-03-01

    Schizophrenia is a complex biopsychosocial condition in which expressed emotion in family members is a robust predictor of relapse. Not surprisingly, family interventions are remarkably effective and thus recommended in current treatment guidelines. Their key elements seem to be common therapeutic factors, followed by education and coping skills training. However, few studies have explored these key elements and the process of the intervention itself. We conducted a qualitative and quantitative analysis of the records from a pioneering family intervention trial addressing expressed emotion, published by Leff and colleagues four decades ago. Records were analyzed into categories and data explored using descriptive statistics. This was complemented by a narrative evaluation using an inductive approach based on emotional markers and markers of change. The most used strategies in the intervention were addressing needs, followed by coping skills enhancement, advice, and emotional support. Dealing with overinvolvement and reframing were the next most frequent. Single-family home sessions seemed to augment the therapeutic work conducted in family groups. Overall the intervention seemed to promote cognitive and emotional change in the participants, and therapists were sensitive to the emotional trajectory of each subject. On the basis of our findings, we developed a longitudinal framework for better understanding the process of this treatment approach. © 2016 Family Process Institute.

  6. Reflections on Practical Approaches to Involving Children and Young People in the Data Analysis Process

    ERIC Educational Resources Information Center

    Coad, Jane; Evans, Ruth

    2008-01-01

    This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…

  7. Toxicogenomics and cancer risk assessment: a framework for key event analysis and dose-response assessment for nongenotoxic carcinogens.

    PubMed

    Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L

    2010-12-01

    In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure with two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s) which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Identifying Key Words in 9-1-1 Calls for Stroke: A Mixed Methods Approach.

    PubMed

    Richards, Christopher T; Wang, Baiyang; Markul, Eddie; Albarran, Frank; Rottman, Doreen; Aggarwal, Neelum T; Lindeman, Patricia; Stein-Spencer, Leslee; Weber, Joseph M; Pearlman, Kenneth S; Tataris, Katie L; Holl, Jane L; Klabjan, Diego; Prabhakaran, Shyam

    2017-01-01

    Identifying stroke during a 9-1-1 call is critical to timely prehospital care. However, emergency medical dispatchers (EMDs) recognize stroke in less than half of 9-1-1 calls, potentially due to the words used by callers to communicate stroke signs and symptoms. We hypothesized that callers do not typically use words and phrases considered to be classical descriptors of stroke, such as focal neurologic deficits, but that a mixed-methods approach can identify words and phrases commonly used by 9-1-1 callers to describe acute stroke victims. We performed a mixed-method, retrospective study of 9-1-1 call audio recordings for adult patients with confirmed stroke who were transported by ambulance in a large urban city. Content analysis, a qualitative methodology, and computational linguistics, a quantitative methodology, were used to identify key words and phrases used by 9-1-1 callers to describe acute stroke victims. Because a caller's level of emotional distress contributes to the communication during a 9-1-1 call, the Emotional Content and Cooperation Score was scored by a multidisciplinary team. A total of 110 9-1-1 calls, received between June and September 2013, were analyzed. EMDs recognized stroke in 48% of calls, and the emotional state of most callers (95%) was calm. In 77% of calls in which EMDs recognized stroke, callers specifically used the word "stroke"; however, the word "stroke" was used in only 38% of calls. Vague, non-specific words and phrases were used to describe stroke victims' symptoms in 55% of calls, and 45% of callers used distractor words and phrases suggestive of non-stroke emergencies. Focal neurologic symptoms were described in 39% of calls. Computational linguistics identified 9 key words that were more commonly used in calls where the EMD identified stroke. These words were concordant with terms identified through qualitative content analysis. 
Most 9-1-1 callers used vague, non-specific, or distractor words and phrases and infrequently

  9. A systems biology approach to detect key pathways and interaction networks in gastric cancer on the basis of microarray analysis.

    PubMed

    Guo, Leilei; Song, Chunhua; Wang, Peng; Dai, Liping; Zhang, Jianying; Wang, Kaijuan

    2015-11-01

    The aim of the present study was to explore key molecular pathways contributing to gastric cancer (GC) and to construct an interaction network between significant pathways and potential biomarkers. Publicly available gene expression profiles of GSE29272 for GC, and data for the corresponding normal tissue, were downloaded from Gene Expression Omnibus. Pre‑processing and differential analysis were performed with R statistical software packages, and a number of differentially expressed genes (DEGs) were obtained. A functional enrichment analysis was performed for all the DEGs with a BiNGO plug‑in in Cytoscape. Their correlation was analyzed in order to construct a network. The modularity analysis and pathway identification operations were used to identify graph clusters and associated pathways. The underlying molecular mechanisms involving these DEGs were also assessed by data mining. A total of 249 DEGs, which were markedly upregulated and downregulated, were identified. The extracellular region contained the most significantly over‑represented functional terms, with respect to upregulated and downregulated genes, and the closest topological matches were identified for taste transduction and regulation of autophagy. In addition, extracellular matrix‑receptor interactions were identified as the most relevant pathway associated with the progression of GC. The genes for fibronectin 1, secreted phosphoprotein 1, collagen type 4 variant α‑1/2 and thrombospondin 1, which are involved in the pathways, may be considered as potential therapeutic targets for GC. A series of associations between candidate genes and key pathways were also identified for GC, and their correlation may provide novel insights into the pathogenesis of GC.

  10. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

...challenging as the complexity of actual implementation specifics are considered. Two components common to most quantum key distribution

  11. Systems Analysis Approach for the NASA Environmentally Responsible Aviation Project

    NASA Technical Reports Server (NTRS)

    Kimmel, William M.

    2011-01-01

This conference paper describes the current systems analysis approach being implemented for the Environmentally Responsible Aviation Project within the Integrated Systems Research Program under the NASA Aeronautics Research Mission Directorate. The scope and purpose of these systems studies are introduced, followed by a methodology overview. The approach involves both top-down and bottom-up components to provide NASA's stakeholders with a rationale for the prioritization and tracking of a portfolio of technologies which enable the future fleet of aircraft to operate with a simultaneous reduction of aviation noise, emissions and fuel-burn impacts to our environment. Examples of key current results and relevant decision-support conclusions are presented, along with a forecast of the planned analyses to follow.

  12. Integrative Analysis of DNA Methylation and Gene Expression Data Identifies EPAS1 as a Key Regulator of COPD

    PubMed Central

Yoo, Seungyeul; Takikawa, Sachiko; Geraghty, Patrick; Argmann, Carmen; Campbell, Joshua; Lin, Luan; Huang, Tao; Tu, Zhidong; Foronjy, Robert F; Spira, Avrum; Schadt, Eric E.; Powell, Charles A.; Zhu, Jun

    2015-01-01

    Chronic Obstructive Pulmonary Disease (COPD) is a complex disease. Genetic, epigenetic, and environmental factors are known to contribute to COPD risk and disease progression. Therefore we developed a systematic approach to identify key regulators of COPD that integrates genome-wide DNA methylation, gene expression, and phenotype data in lung tissue from COPD and control samples. Our integrative analysis identified 126 key regulators of COPD. We identified EPAS1 as the only key regulator whose downstream genes significantly overlapped with multiple genes sets associated with COPD disease severity. EPAS1 is distinct in comparison with other key regulators in terms of methylation profile and downstream target genes. Genes predicted to be regulated by EPAS1 were enriched for biological processes including signaling, cell communications, and system development. We confirmed that EPAS1 protein levels are lower in human COPD lung tissue compared to non-disease controls and that Epas1 gene expression is reduced in mice chronically exposed to cigarette smoke. As EPAS1 downstream genes were significantly enriched for hypoxia responsive genes in endothelial cells, we tested EPAS1 function in human endothelial cells. EPAS1 knockdown by siRNA in endothelial cells impacted genes that significantly overlapped with EPAS1 downstream genes in lung tissue including hypoxia responsive genes, and genes associated with emphysema severity. Our first integrative analysis of genome-wide DNA methylation and gene expression profiles illustrates that not only does DNA methylation play a ‘causal’ role in the molecular pathophysiology of COPD, but it can be leveraged to directly identify novel key mediators of this pathophysiology. PMID:25569234

  13. Integrative analysis of DNA methylation and gene expression data identifies EPAS1 as a key regulator of COPD.

    PubMed

Yoo, Seungyeul; Takikawa, Sachiko; Geraghty, Patrick; Argmann, Carmen; Campbell, Joshua; Lin, Luan; Huang, Tao; Tu, Zhidong; Foronjy, Robert F; Spira, Avrum; Schadt, Eric E; Powell, Charles A; Zhu, Jun

    2015-01-01

    Chronic Obstructive Pulmonary Disease (COPD) is a complex disease. Genetic, epigenetic, and environmental factors are known to contribute to COPD risk and disease progression. Therefore we developed a systematic approach to identify key regulators of COPD that integrates genome-wide DNA methylation, gene expression, and phenotype data in lung tissue from COPD and control samples. Our integrative analysis identified 126 key regulators of COPD. We identified EPAS1 as the only key regulator whose downstream genes significantly overlapped with multiple genes sets associated with COPD disease severity. EPAS1 is distinct in comparison with other key regulators in terms of methylation profile and downstream target genes. Genes predicted to be regulated by EPAS1 were enriched for biological processes including signaling, cell communications, and system development. We confirmed that EPAS1 protein levels are lower in human COPD lung tissue compared to non-disease controls and that Epas1 gene expression is reduced in mice chronically exposed to cigarette smoke. As EPAS1 downstream genes were significantly enriched for hypoxia responsive genes in endothelial cells, we tested EPAS1 function in human endothelial cells. EPAS1 knockdown by siRNA in endothelial cells impacted genes that significantly overlapped with EPAS1 downstream genes in lung tissue including hypoxia responsive genes, and genes associated with emphysema severity. Our first integrative analysis of genome-wide DNA methylation and gene expression profiles illustrates that not only does DNA methylation play a 'causal' role in the molecular pathophysiology of COPD, but it can be leveraged to directly identify novel key mediators of this pathophysiology.

  14. Finite-size analysis of a continuous-variable quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leverrier, Anthony; Grosshans, Frederic; Grangier, Philippe

    2010-06-15

The goal of this paper is to extend the framework of finite-size analysis recently developed for quantum key distribution to continuous-variable protocols. We do not solve this problem completely here, and we mainly consider the finite-size effects on the parameter estimation procedure. Despite the fact that some questions are left open, we are able to give an estimation of the secret key rate for protocols which do not contain a postselection procedure. As expected, these results are significantly more pessimistic than those obtained in the asymptotic regime. However, we show that recent continuous-variable protocols are able to provide fully secure secret keys in the finite-size scenario, over distances larger than 50 km.

  15. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.

  16. Analysis of counterfactual quantum key distribution using error-correcting theory

    NASA Astrophysics Data System (ADS)

    Li, Yan-Bing

    2014-10-01

Counterfactual quantum key distribution is an interesting direction in quantum cryptography and has been realized by some researchers. However, it has been pointed out that it is insecure, in the information-theoretic sense, when used over a highly lossy channel. In this paper, we revisit its security from an error-correcting theory point of view. The analysis indicates that the security flaw stems from the fact that the error rate in the users' raw key pair is as high as that under Eve's attack when the loss rate exceeds 50%.

  17. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  18. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  19. Cancer care management through a mobile phone health approach: key considerations.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin

    2013-01-01

Greater use of mobile phone devices seems inevitable, because the health industry and cancer care are facing challenges such as resource constraints, rising care costs, the need for immediate access to healthcare data of types such as audio, video and text for early detection and treatment of patients, and increasing remote aids in telemedicine. Physicians, in order to study the causes of cancer, detect cancer earlier, act on prevention measures, determine the effectiveness of treatment and specify the reasons for treatment ineffectiveness, need access to accurate, comprehensive and timely cancer data. Mobile devices provide opportunities and can play an important role in consulting, diagnosis, treatment, and quick access to health information. Their easy portability makes them perfect tools for healthcare providers in cancer care management. Key factors in cancer care management systems through a mobile phone health approach must be considered, such as human resources, confidentiality and privacy, legal and ethical issues, appropriate ICT and provider infrastructure, and costs in general aspects; and interoperability, human relationships, types of mobile devices and telecommunication-related points in specific aspects. The successful implementation of mobile-based systems in cancer care management will constantly face many challenges. Hence, in applying mobile cancer care, involvement of users and consideration of their needs in all phases of the project, providing adequate bandwidth, preparation of standard tools that provide maximum mobility and flexibility for users, decreasing obstacles that interrupt network communications, and using suitable communication protocols are essential. It is obvious that identifying and reducing barriers and strengthening the positive points will play a significant role in appropriate planning and promoting the achievements of mobile cancer care systems. The aim of this article is to explain key points which should be considered in designing

  20. An imprecise probability approach for squeal instability analysis based on evidence theory

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-01-01

An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. These uncertain parameters typically involve imprecise data, such as incomplete or conflicting information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and a surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples, and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, acting as a first step toward reducing squeal noise of uncertain brakes with imprecise information.
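The belief and plausibility measures mentioned in the abstract come from Dempster-Shafer evidence theory. A minimal sketch of how they are computed from interval-valued evidence is shown below; the focal intervals, masses, and the instability region are hypothetical illustrations, not values from the paper.

```python
# Sketch: belief (Bel) and plausibility (Pl) measures from evidence theory,
# used to bound the likelihood of an event given imprecise interval data.
# All numeric values below are hypothetical.

def belief(focal, target):
    """Bel(A): total mass of focal intervals fully contained in target."""
    lo, hi = target
    return sum(m for (a, b), m in focal if lo <= a and b <= hi)

def plausibility(focal, target):
    """Pl(A): total mass of focal intervals that overlap target."""
    lo, hi = target
    return sum(m for (a, b), m in focal if a <= hi and b >= lo)

# Hypothetical evidence on a friction coefficient from conflicting sources:
# (interval, basic probability mass) pairs, masses summing to 1.
focal = [((0.30, 0.40), 0.5), ((0.35, 0.50), 0.3), ((0.45, 0.60), 0.2)]

# Suppose squeal instability occurs when the coefficient lies in [0.35, 0.60].
unstable = (0.35, 0.60)
print(belief(focal, unstable))        # 0.5: only the last two intervals lie inside
print(plausibility(focal, unstable))  # 1.0: all three intervals overlap the region
```

The gap between Bel and Pl (here 0.5 vs 1.0) expresses the epistemic uncertainty that a single precise probability would hide, which is the motivation for the imprecise-probability treatment in the abstract.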

  1. Community Landscapes: An Integrative Approach to Determine Overlapping Network Module Hierarchy, Identify Key Nodes and Predict Network Dynamics

    PubMed Central

    Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter

    2010-01-01

    Background Network communities help the functional organization and evolution of complex networks. However, the development of a method, which is both fast and accurate, provides modular overlaps and partitions of a heterogeneous network, has proven to be rather difficult. Methodology/Principal Findings Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms, which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes and (4) help to predict network dynamics. Conclusions/Significance The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084

  2. Designing Integrated Approaches to Support People with Multimorbidity: Key Messages from Systematic Reviews, Health System Leaders and Citizens.

    PubMed

    Wilson, Michael G; Lavis, John N; Gauvin, Francois-Pierre

    2016-11-01

    Living with multiple chronic conditions (multimorbidity) - and facing complex, uncoordinated and fragmented care - is part of the daily life of a growing number of Canadians. We undertook: a knowledge synthesis; a "gap analysis" of existing systematic reviews; an issue brief that synthesized the available evidence about the problem, three options for addressing it and implementation considerations; a stakeholder dialogue involving key health-system leaders; and a citizen panel. We identified several recommendations for actions that can be taken, including: developing evidence-based guidance that providers can use to help achieve goals set by patients; embracing approaches to supporting self-management; supporting greater communication and collaboration across healthcare providers as well as between healthcare providers and patients; and investing more efforts in health promotion and disease prevention. Our results point to the need for health system decision-makers to support bottom-up, person-centred approaches to developing models of care that are tailored for people with multimorbidity and support a research agenda to address the identified priorities. Copyright © 2016 Longwoods Publishing.

  3. Structured-Exercise-Program (SEP): An Effective Training Approach to Key Healthcare Professionals

    ERIC Educational Resources Information Center

    Miazi, Mosharaf H.; Hossain, Taleb; Tiroyakgosi, C.

    2014-01-01

A structured exercise program is an effective approach to professional training in technology-dependent, resource-limited healthcare settings, as revealed by a recently conducted data analysis. The aim of the study is to assess the effectiveness of the applied approach, which was designed to observe the level of adherence to newly adopted…

  4. Device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Hänggi, Esther

    2010-12-01

    In this thesis, we study two approaches to achieve device-independent quantum key distribution: in the first approach, the adversary can distribute any system to the honest parties that cannot be used to communicate between the three of them, i.e., it must be non-signalling. In the second approach, we limit the adversary to strategies which can be implemented using quantum physics. For both approaches, we show how device-independent quantum key distribution can be achieved when imposing an additional condition. In the non-signalling case this additional requirement is that communication is impossible between all pairwise subsystems of the honest parties, while, in the quantum case, we demand that measurements on different subsystems must commute. We give a generic security proof for device-independent quantum key distribution in these cases and apply it to an existing quantum key distribution protocol, thus proving its security even in this setting. We also show that, without any additional such restriction there always exists a successful joint attack by a non-signalling adversary.

  5. Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach

    NASA Astrophysics Data System (ADS)

    Jain, Vineet; Raj, Tilak

    2014-09-01

Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyze the productivity variables of FMS. The study was performed using several approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), graph theory and matrix approach (GTMA), and a cross-sectional survey of manufacturing firms in India. ISM has been used to develop a model of productivity variables, which is then analyzed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out by SEM. EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables are identified through the literature and four factors extracted which involve the productivity of FMS: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors which affect FMS productivity.

  6. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    PubMed

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

Surveillance video service (SVS) is one of the most important services provided in a smart city, and efficient surveillance video analysis techniques are essential to its utilization. Key frame extraction is a simple yet effective technique to achieve this goal: in surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a particularly salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time; it is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently than several other methods.
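The selection step described in the abstract (smooth the motion signal, keep frames at local maxima) can be sketched as follows. This is a minimal CPU illustration, not the paper's GPU implementation; the motion values are synthetic, standing in for per-frame motion energy that would in practice come from frame differencing.

```python
# Sketch of key-frame selection: smooth a per-frame motion signal with a
# moving average, then keep frame indices at strict local maxima.
# The motion values below are synthetic, for illustration only.

def smooth(signal, k=3):
    """Moving-average smoothing to suppress noise in the motion signal."""
    half = k // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def key_frames(motion, k=3):
    """Indices whose smoothed motion is a strict local maximum."""
    s = smooth(motion, k)
    return [i for i in range(1, len(s) - 1) if s[i - 1] < s[i] > s[i + 1]]

# Two bursts of motion around frames 2 and 7 yield two key frames.
motion = [0.1, 0.3, 0.9, 0.3, 0.1, 0.2, 0.7, 1.0, 0.6, 0.2]
print(key_frames(motion))  # → [2, 7]
```

Smoothing before peak-picking matters: on a noisy signal, raw local maxima would flag many spurious frames, while the moving average keeps only sustained motion events.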

  7. The key-features approach to assess clinical decisions: validity evidence to date.

    PubMed

    Bordage, G; Page, G

    2018-05-17

    The key-features (KFs) approach to assessment was initially proposed during the First Cambridge Conference on Medical Education in 1984 as a more efficient and effective means of assessing clinical decision-making skills. Over three decades later, we conducted a comprehensive, systematic review of the validity evidence gathered since then. The evidence was compiled according to the Standards for Educational and Psychological Testing's five sources of validity evidence, namely, Content, Response process, Internal structure, Relations to other variables, and Consequences, to which we added two other types related to Cost-feasibility and Acceptability. Of the 457 publications that referred to the KFs approach between 1984 and October 2017, 164 are cited here; the remaining 293 were either redundant or the authors simply mentioned the KFs concept in relation to their work. While one set of articles reported meeting the validity standards, another set examined KFs test development choices and score interpretation. The accumulated validity evidence for the KFs approach since its inception supports the decision-making construct measured and its use to assess clinical decision-making skills at all levels of training and practice and with various types of exam formats. Recognizing that gathering validity evidence is an ongoing process, areas with limited evidence, such as item factor analyses or consequences of testing, are identified as well as new topics needing further clarification, such as the use of the KFs approach for formative assessment and its place within a program of assessment.

  8. Differential Fault Analysis on CLEFIA with 128, 192, and 256-Bit Keys

    NASA Astrophysics Data System (ADS)

    Takahashi, Junko; Fukunaga, Toshinori

    This paper describes a differential fault analysis (DFA) attack against CLEFIA. The proposed attack can be applied to CLEFIA with all supported keys: 128, 192, and 256-bit keys. DFA is a type of side-channel attack. This attack enables the recovery of secret keys by injecting faults into a secure device during its computation of the cryptographic algorithm and comparing the correct ciphertext with the faulty one. CLEFIA is a 128-bit blockcipher with 128, 192, and 256-bit keys developed by the Sony Corporation in 2007. CLEFIA employs a generalized Feistel structure with four data lines. We developed a new attack method that uses this characteristic structure of the CLEFIA algorithm. On the basis of the proposed attack, only 2 pairs of correct and faulty ciphertexts are needed to retrieve the 128-bit key, and 10.78 pairs on average are needed to retrieve the 192 and 256-bit keys. The proposed attack is more efficient than any previously reported. In order to verify the proposed attack and estimate the calculation time to recover the secret key, we conducted an attack simulation using a PC. The simulation results show that we can obtain each secret key within three minutes on average. This result shows that we can obtain the entire key within a feasible computational time.

  9. A Delphi-Based Approach for Detecting Key E-Learning Trends in Postgraduate Education: The Spanish Case

    ERIC Educational Resources Information Center

    Lopez-Catalan, Blanca; Bañuls, Victor A.

    2017-01-01

    Purpose: The purpose of this paper is to present the results of national level Delphi study carried out in Spain aimed at providing inputs for higher education administrators and decision makers about key e-learning trends for supporting postgraduate courses. Design/methodology/approach: The ranking of the e-learning trends is based on a…

  10. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    PubMed

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis aimed at detecting disease symptoms on processed images. First, the paper presents a new method for filtering gallbladder contours from USG images. A major stage in this filtration is to segment and section off the areas occupied by the organ; in most cases this procedure is based on filtration that plays a key role in diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyse, owing to the echogenic inconsistency of the structures under observation. The paper provides an inventive algorithm for the holistic extraction of gallbladder image contours, based on rank filtration and on the analysis of histogram sections of the examined organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms that analyze the object of diagnosis and verify the occurrence of symptoms related to a given affection; usually the final stage is to make a diagnosis based on the detected symptoms. This last stage can be carried out either through dedicated expert systems or through a more classic pattern analysis approach, such as rules that determine the illness from the detected symptoms. The paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ.

  11. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    PubMed

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Risk assessment for enterprise resource planning (ERP) system implementations: a fault tree analysis approach

    NASA Astrophysics Data System (ADS)

    Zeng, Yajun; Skibniewski, Miroslaw J.

    2013-08-01

    Enterprise resource planning (ERP) system implementations are often characterised with large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is the key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches that have been mostly focused on meeting project budget and schedule objectives, the proposed approach intends to address the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system implementation usage failure and quantify the impact of critical component failures or critical risk events in the implementation process.
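    The fault-tree evaluation the paper describes can be sketched in a few lines. The tree below is invented (the paper builds its tree from ERP system components and implementation risk factors); it only shows the standard gate arithmetic under an independence assumption: an AND gate multiplies basic-event probabilities, an OR gate combines them as 1 − Π(1 − p).

```python
"""Minimal fault tree evaluation sketch (hypothetical tree and probabilities)."""
from math import prod

def gate_and(*ps):
    # All inputs must fail (independent events).
    return prod(ps)

def gate_or(*ps):
    # At least one input fails (independent events).
    return 1 - prod(1 - p for p in ps)

# Hypothetical top event: ERP usage failure occurs if (data migration
# fails AND user training is inadequate) OR customization is defective.
p_migration, p_training, p_customization = 0.10, 0.20, 0.05
p_top = gate_or(gate_and(p_migration, p_training), p_customization)
print(round(p_top, 4))  # → 0.069
```

    Quantifying the tree this way also supports the paper's root-cause use: the basic event whose removal most reduces `p_top` is the most critical component.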

  13. Alexithymia and psychosocial problems among Italian preadolescents. A latent class analysis approach.

    PubMed

    Mannarini, Stefania; Balottin, Laura; Toldo, Irene; Gatta, Michela

    2016-10-01

    The study, conducted on Italian preadolescents aged 11 to 13 belonging to the general population, aims to investigate the relationship between emotional functioning, namely alexithymia, and the risk of developing behavioral and emotional problems measured using the Strengths and Difficulties Questionnaire. The latent class analysis approach allowed the identification of two latent variables, one accounting for internalizing problems (emotional symptoms and difficulties in emotional awareness) and one for externalizing problems (conduct problems and hyperactivity, problematic relationships with peers, poor prosocial behaviors and externally oriented thinking). The two latent variables featured two latent classes: difficulty in dealing with problems, and the strength to face problems, the latter representative of most of the healthy participants, with specific gender differences. Along with the analysis of psychopathological behaviors, the study of resilience and strengths can prove to be a key step in developing valuable preventive approaches to tackle psychiatric disorders. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  14. KEY COMPARISON: Final report on CCQM-K69 key comparison: Testosterone glucuronide in human urine

    NASA Astrophysics Data System (ADS)

    Liu, Fong-Ha; Mackay, Lindsey; Murby, John

    2010-01-01

    The CCQM-K69 key comparison of testosterone glucuronide in human urine was organized under the auspices of the CCQM Organic Analysis Working Group (OAWG). The National Measurement Institute Australia (NMIA) acted as the coordinating laboratory for the comparison. The samples distributed for the key comparison were prepared at NMIA with funding from the World Anti-Doping Agency (WADA). WADA granted the approval for this material to be used for the intercomparison provided the distribution and handling of the material were strictly controlled. Three national metrology institutes (NMIs)/designated institutes (DIs) developed reference methods and submitted data for the key comparison along with two other laboratories who participated in the parallel pilot study. A good selection of analytical methods and sample workup procedures was displayed in the results submitted considering the complexities of the matrix involved. The comparability of measurement results was successfully demonstrated by the participating NMIs. Only the key comparison data were used to estimate the key comparison reference value (KCRV), using the arithmetic mean approach. The reported expanded uncertainties for results ranged from 3.7% to 6.7% at the 95% level of confidence and all results agreed within the expanded uncertainty of the KCRV. This text appears in Appendix B of the BIPM key comparison database, kcdb.bipm.org. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
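    The arithmetic-mean KCRV computation mentioned above can be sketched as follows. The participant values and standard uncertainties are invented (CCQM-K69's actual results are not reproduced here); the expansion factor k = 2 for a 95% level of confidence matches common metrology practice.

```python
"""Sketch of a KCRV as the arithmetic mean of participant results (values invented)."""
from math import sqrt

values = [498.2, 502.5, 500.9]   # hypothetical participant results, ng/mL
u_std  = [10.0, 12.5, 11.0]      # hypothetical standard uncertainties

kcrv = sum(values) / len(values)
# Uncertainty of the unweighted mean of independent results, expanded with k = 2.
u_kcrv = sqrt(sum(u**2 for u in u_std)) / len(values)
U_kcrv = 2 * u_kcrv

# Degrees of equivalence: each participant's deviation from the KCRV.
doe = [v - kcrv for v in values]
print(round(kcrv, 2), round(U_kcrv, 1))  # → 500.53 12.9
```

    A result "agrees within the expanded uncertainty of the KCRV" when its degree of equivalence is smaller in magnitude than the combined expanded uncertainty.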

  15. Formal Analysis of Key Integrity in PKCS#11

    NASA Astrophysics Data System (ADS)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  16. A Positive Deviance Approach to Understanding Key Features to Improving Diabetes Care in the Medical Home

    PubMed Central

    Gabbay, Robert A.; Friedberg, Mark W.; Miller-Day, Michelle; Cronholm, Peter F.; Adelman, Alan; Schneider, Eric C.

    2013-01-01

    PURPOSE The medical home has gained national attention as a model to reorganize primary care to improve health outcomes. Pennsylvania has undertaken one of the largest state-based, multipayer medical home pilot projects. We used a positive deviance approach to identify and compare factors driving the care models of practices showing the greatest and least improvement in diabetes care in a sample of 25 primary care practices in southeast Pennsylvania. METHODS We ranked practices into improvement quintiles on the basis of the average absolute percentage point improvement from baseline to 18 months in 3 registry-based measures of performance related to diabetes care: glycated hemoglobin concentration, blood pressure, and low-density lipoprotein cholesterol level. We then conducted surveys and key informant interviews with leaders and staff in the 5 most and least improved practices, and compared their responses. RESULTS The most improved/higher-performing practices tended to have greater structural capabilities (eg, electronic health records) than the least improved/lower-performing practices at baseline. Interviews revealed striking differences between the groups in terms of leadership styles and shared vision; sense, use, and development of teams; processes for monitoring progress and obtaining feedback; and presence of technologic and financial distractions. CONCLUSIONS Positive deviance analysis suggests that primary care practices’ baseline structural capabilities and abilities to buffer the stresses of change may be key facilitators of performance improvement in medical home transformations. Attention to the practices’ structural capabilities and factors shaping successful change, especially early in the process, will be necessary to improve the likelihood of successful medical home transformation and better care. PMID:23690393

  17. [Key informers. When and How?].

    PubMed

    Martín González, R

    2009-03-01

    When information obtained through duly designed and developed studies is not available, the solution to certain problems that affect the population or that respond to certain questions may be approached by using the information and experience provided by the so-called key informer. The key informer is defined as a person who is in contact with the community or with the problem to be studied, who is considered to have good knowledge of the situation and therefore who is considered an expert. The search for consensus is the basis to obtain information through the key informers. The techniques used have different characteristics based on whether the experts chosen meet together or not, whether they are guided or not, whether they interact with each other or not. These techniques include the survey, the Delphi technique, the nominal group technique, brainwriting, brainstorming, the Phillips 66 technique, the 6-3-5 technique, the community forum and the community impressions technique. Information provided by key informers through the search for consensus is relevant when this is not available or cannot be obtained by other methods. It has permitted the analysis of the existing neurological care model, elaboration of recommendations on visit times for the out-patient neurological care, and the elaboration of guidelines and recommendations for the management of prevalent neurological problems.

  18. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
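    The gap the abstract refers to comes from how the statistical deviation term is bounded. This sketch is NOT the paper's improved bound; it only illustrates, on invented numbers, why swapping a Gaussian tail for a generic concentration inequality (here Hoeffding, a Chernoff-type bound) changes the finite-size deviation allowance on an observed count. N, p, the failure probability eps, and the Gaussian quantile z are all illustrative assumptions.

```python
"""Gaussian vs Chernoff-Hoeffding fluctuation half-widths (illustrative sketch)."""
from math import sqrt, log

N, p, eps = 10**6, 0.5, 1e-10   # hypothetical trials, rate, failure probability

# Gaussian approximation: half-width z * sqrt(N p (1-p)),
# with z chosen so the two-sided tail mass is about eps (z ≈ 6.5 for 1e-10).
z = 6.5
gauss = z * sqrt(N * p * (1 - p))

# Hoeffding bound: P(|X - Np| >= d) <= 2 exp(-2 d^2 / N), solved for d at eps.
hoeffding = sqrt(N * log(2 / eps) / 2)

print(round(gauss), round(hoeffding))  # → 3250 3444
```

    The rigorous bound concedes a larger deviation at the same failure probability, which is exactly the kind of gap the paper's tighter analysis works to close.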

  19. Outline of a new approach to the analysis of complex systems and decision processes.

    NASA Technical Reports Server (NTRS)

    Zadeh, L. A.

    1973-01-01

    Development of a conceptual framework for dealing with systems which are too complex or too ill-defined to admit of precise quantitative analysis. The approach outlined is based on the premise that the key elements in human thinking are not numbers, but labels of fuzzy sets - i.e., classes of objects in which the transition from membership to nonmembership is gradual rather than abrupt. The approach in question has three main distinguishing features - namely, the use of so-called 'linguistic' variables in place of or in addition to numerical variables, the characterization of simple relations between variables by conditional fuzzy statements, and the characterization of complex relations by fuzzy algorithms.
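    The gradual membership described above is easy to make concrete. The linguistic variable, its labels, and the triangular membership functions below are invented for illustration; the point is only that membership is a degree in [0, 1] rather than a yes/no classification.

```python
"""Sketch of a linguistic variable with fuzzy (gradual) membership."""

def triangular(a, b, c):
    # Membership rises from a to a peak at b, then falls to c:
    # the transition from membership to nonmembership is gradual.
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

cool = triangular(5, 15, 25)   # hypothetical labels for "temperature"
warm = triangular(15, 25, 35)

# A 20-degree reading belongs partially to both fuzzy classes.
print(cool(20), warm(20))  # → 0.5 0.5
```

    Conditional fuzzy statements ("if temperature is warm then ...") then operate on these degrees instead of on crisp numerical thresholds.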

  20. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a Landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998-December 2008, whilst the Landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A Landscape-ecological map was generated using a combination of digitized homogenous map units in the Landsat-7 ETM+ imagery, a Land use map 2005 of the Mekong delta, and supplementary datasets on the region's terrain, geo-morphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use areas estimated from the maps and those reported in district crop statistics.
The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three
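    The "explained 74% of the variability" figures above are coefficients of determination from regressing reported crop statistics on map-estimated areas. A minimal sketch of that computation, on invented area pairs (the study's actual district data are not reproduced here):

```python
"""R-squared of map-estimated vs reported crop areas (hypothetical data)."""

def r_squared(x, y):
    # Squared Pearson correlation = R^2 of the simple linear regression.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

map_area      = [12.0, 25.0, 31.0, 44.0, 52.0]   # hypothetical estimates, kha
reported_area = [14.0, 22.0, 35.0, 41.0, 55.0]   # hypothetical statistics, kha
print(round(r_squared(map_area, reported_area), 2))  # → 0.96
```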

  1. A machine learning approach for the identification of key markers involved in brain development from single-cell transcriptomic data.

    PubMed

    Hu, Yongli; Hase, Takeshi; Li, Hui Peng; Prabhakar, Shyam; Kitano, Hiroaki; Ng, See Kiong; Ghosh, Samik; Wee, Lawrence Jin Kiat

    2016-12-22

    The ability to sequence the transcriptomes of single cells using single-cell RNA-seq technologies represents a shift in the scientific paradigm: scientists can now concurrently investigate the complex biology of a heterogeneous population of cells, one at a time. To date, however, there has been no suitable computational methodology for the analysis of such an intricate deluge of data, in particular techniques that aid the identification of the transcriptomic differences between cellular subtypes. In this paper, we describe a novel methodology for the analysis of single-cell RNA-seq data, obtained from neocortical cells and neural progenitor cells, using machine learning algorithms (support vector machine (SVM) and random forest (RF)). Thirty-eight key transcripts were identified, using the SVM-based recursive feature elimination (SVM-RFE) method of feature selection, to best differentiate developing neocortical cells from neural progenitor cells in the SVM and RF classifiers built. These genes also possessed higher discriminative power (enhanced prediction accuracy) compared with commonly used statistical techniques or geneset-based approaches. Further downstream network reconstruction analysis was carried out to unravel hidden general regulatory networks in which novel interactions could be further validated in wet-lab experimentation and could be useful candidate targets for the treatment of neuronal developmental diseases. The approach reported here is able to identify transcripts, with reported neuronal involvement, that optimally differentiate neocortical cells and neural progenitor cells. It is believed to be extensible and applicable to other single-cell RNA-seq expression profiles, such as the study of cancer progression and treatment within a highly heterogeneous tumour.
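    The recursive feature elimination loop at the heart of SVM-RFE can be sketched without any ML library. The paper trains an SVM; here a plain perceptron stands in (stdlib-only substitution, not the paper's method), but the elimination schedule is the same: train a linear classifier, drop the feature with the smallest absolute weight, retrain on the survivors. The tiny dataset is invented, with feature 0 informative and feature 1 uncorrelated noise.

```python
"""Recursive feature elimination sketch with a perceptron in place of an SVM."""

def train_perceptron(X, y, epochs=20):
    # Classic perceptron updates; labels must be in {-1, +1}.
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * sum(wj * xj for wj, xj in zip(w, xi)) <= 0:
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
    return w

def rfe(X, y, keep=1):
    active = list(range(len(X[0])))
    while len(active) > keep:
        Xa = [[row[j] for j in active] for row in X]
        w = train_perceptron(Xa, y)
        worst = min(range(len(active)), key=lambda j: abs(w[j]))
        del active[worst]   # eliminate the least informative feature
    return active

# Labels follow the sign of feature 0; feature 1 is near-random noise.
X = [[2.0, 0.1], [1.5, -0.2], [-2.0, 0.3], [-1.0, -0.1], [3.0, 0.0], [-2.5, 0.2]]
y = [1, 1, -1, -1, 1, -1]
print(rfe(X, y))  # → [0]
```

    With thousands of genes, the same loop typically removes features in batches rather than one at a time, for speed.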

  2. A call for differentiated approaches to delivering HIV services to key populations.

    PubMed

    Macdonald, Virginia; Verster, Annette; Baggaley, Rachel

    2017-07-21

    Key populations (KPs) are disproportionately affected by HIV and have low rates of access to HIV testing and treatment services compared to the broader population. WHO promotes the use of differentiated approaches for reaching and recruiting KPs into the HIV services continuum. These approaches may help increase access for KPs, who are often criminalized or stigmatized. By catering to the specific needs of each KP individual, differentiated approaches may increase service acceptability, quality and coverage, reduce costs and support KP members in leading the HIV response among their communities. WHO recommends the implementation of community-based and lay-provider-administered HIV testing services. Together, these approaches reduce barriers and costs associated with other testing strategies, allow greater ownership in HIV programmes for KP members and reach more people than do facility-based services. Despite this evidence, availability of and support for these approaches remain limited. Peer-driven interventions have been shown to be effective in engaging, recruiting and supporting clients. Some programmes employ HIV-positive or non-PLHIV "peer navigators" and other staff to provide case management, enrolment and/or re-enrolment in care and treatment services. However, a better understanding of the impact, cost effectiveness and potential burden on peer volunteers is required. Task shifting and non-facility-based service locations for antiretroviral therapy (ART) initiation and maintenance and antiretroviral (ARV) distribution are recommended in both the consolidated HIV treatment and KP guidelines of WHO. These approaches are accepted in generalized epidemics and for the general population where successful models exist; however, few organizations provide or initiate ART at KP community-based services.
The application of a differentiated service approach for KP could increase the number of people who know their status and receive effective and sustained prevention and treatment for HIV

  3. The Power of Key: Celebrating 20 Years of Innovation at the Key Learning Community

    ERIC Educational Resources Information Center

    Kunkel, Christine

    2007-01-01

    The Key Learning Community in Indianapolis was the first school in the world to base its approach on the theory of multiple intelligences. Ms. Kunkel, Key's principal, reflects on the school's continuing growth and success--even in the face of pressures to standardize--and shares the history of its founding. (Contains 5 endnotes.)

  4. A fast image matching algorithm based on key points

    NASA Astrophysics Data System (ADS)

    Wang, Huilin; Wang, Ying; An, Ru; Yan, Peng

    2014-05-01

    Image matching is a very important technique in image processing. It has been widely used for object recognition and tracking, image retrieval, three-dimensional vision, change detection, aircraft position estimation, and multi-image registration. Based on the requirements of matching algorithms for craft navigation, such as speed, accuracy and adaptability, a fast key point image matching method is investigated and developed. The main research tasks include: (1) Developing an improved fast key point detection approach using a self-adapting threshold for Features from Accelerated Segment Test (FAST). A method of calculating the self-adapting threshold was introduced for images with different contrast, and the Hessian matrix was adopted to eliminate unstable edge points in order to obtain key points with higher stability. This approach to detecting key points requires little computation and offers high positioning accuracy and strong anti-noise ability. (2) PCA-SIFT is utilized to describe key points. A 128-dimensional vector is formed for each extracted key point based on the SIFT method. A low-dimensional feature space was established from the eigenvectors of all the key points, and each descriptor was projected onto this space to form a low-dimensional vector; the key points were then re-described by these dimension-reduced descriptors. After PCA, the descriptor was reduced from the original 128 dimensions to 20. This reduces the dimensionality of the approximate nearest-neighbour search, thereby increasing overall speed. (3) The distance ratio between the nearest and second-nearest neighbours is regarded as the measurement criterion for initial matching points, from which the initial matched point pairs are obtained. Based on an analysis of common methods for eliminating false matching point pairs (e.g. RANSAC (random sample consensus) and Hough transform clustering), a heuristic local geometric restriction
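    The abstract does not give the paper's self-adapting threshold formula; a common sketch of the idea, shown here under that assumption, scales the FAST threshold with image contrast (k times the intensity standard deviation, floored at a minimum). The tiny "images", the factor k, and the floor t_min are all invented.

```python
"""Sketch of a contrast-adaptive corner-detector threshold (formula assumed)."""
from statistics import pstdev

def adaptive_threshold(pixels, k=0.5, t_min=5.0):
    # Low-contrast images get the floor threshold so enough key points
    # survive; high-contrast images get a higher one to suppress noise.
    return max(t_min, k * pstdev(pixels))

low_contrast  = [100, 102, 101, 99, 100, 98, 101, 100]
high_contrast = [10, 200, 40, 180, 20, 220, 60, 150]
print(adaptive_threshold(low_contrast), round(adaptive_threshold(high_contrast), 1))  # → 5.0 40.4
```

    A fixed threshold tuned on one image tends to fire everywhere on a high-contrast image and nowhere on a flat one; adapting it to contrast is what keeps the detector usable across scenes.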

  5. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian
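    The additive laboratory-effects route can be made concrete with the DerSimonian-Laird procedure named above. The measured values and standard uncertainties below are invented; the procedure estimates a dark-uncertainty variance tau^2 from the over-dispersion statistic Q and then re-weights each laboratory by 1/(u_i^2 + tau^2).

```python
"""DerSimonian-Laird consensus value sketch (hypothetical lab data)."""
from math import sqrt

x = [10.1, 10.5, 9.8, 10.9]    # hypothetical measured values
u = [0.10, 0.15, 0.12, 0.20]   # hypothetical standard uncertainties

w = [1 / ui**2 for ui in u]
xbar = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
Q = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
k = len(x)
# Method-of-moments estimate of the between-laboratory variance tau^2.
tau2 = max(0.0, (Q - (k - 1)) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))

w_star = [1 / (ui**2 + tau2) for ui in u]
consensus = sum(wi * xi for wi, xi in zip(w_star, x)) / sum(w_star)
u_consensus = sqrt(1 / sum(w_star))
print(round(consensus, 3), round(u_consensus, 3))  # → 10.301 0.208
```

    When the stated uncertainties fully explain the spread (Q ≤ k − 1), tau^2 collapses to zero and the procedure reduces to the ordinary weighted mean; here the over-dispersion forces tau^2 > 0 and widens the consensus uncertainty accordingly.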

  6. Asymmetry of perceived key movement in chorale sequences: converging evidence from a probe-tone analysis.

    PubMed

    Cuddy, L L; Thompson, W F

    1992-01-01

    In a probe-tone experiment, two groups of listeners--one trained, the other untrained, in traditional music theory--rated the goodness of fit of each of the 12 notes of the chromatic scale to four-voice harmonic sequences. Sequences were 12 simplified excerpts from Bach chorales, 4 nonmodulating, and 8 modulating. Modulations occurred either one or two steps in either the clockwise or the counterclockwise direction on the cycle of fifths. A consistent pattern of probe-tone ratings was obtained for each sequence, with no significant differences between listener groups. Two methods of analysis (Fourier analysis and regression analysis) revealed a directional asymmetry in the perceived key movement conveyed by modulating sequences. For a given modulation distance, modulations in the counterclockwise direction effected a clearer shift in tonal organization toward the final key than did clockwise modulations. The nature of the directional asymmetry was consistent with results reported for identification and rating of key change in the sequences (Thompson & Cuddy, 1989a). Further, according to the multiple-regression analysis, probe-tone ratings did not merely reflect the distribution of tones in the sequence. Rather, ratings were sensitive to the temporal structure of the tonal organization in the sequence.

  7. Integrative Data Analysis of Multi-Platform Cancer Data with a Multimodal Deep Learning Approach.

    PubMed

    Liang, Muxuan; Li, Zhizhong; Chen, Ting; Zeng, Jianyang

    2015-01-01

    Identification of cancer subtypes plays an important role in revealing useful insights into disease pathogenesis and advancing personalized therapy. The recent development of high-throughput sequencing technologies has enabled the rapid collection of multi-platform genomic data (e.g., gene expression, miRNA expression, and DNA methylation) for the same set of tumor samples. Although numerous integrative clustering approaches have been developed to analyze cancer data, few of them are particularly designed to exploit both deep intrinsic statistical properties of each input modality and complex cross-modality correlations among multi-platform input data. In this paper, we propose a new machine learning model, called multimodal deep belief network (DBN), to cluster cancer patients from multi-platform observation data. In our integrative clustering framework, relationships among inherent features of each single modality are first encoded into multiple layers of hidden variables, and then a joint latent model is employed to fuse common features derived from multiple input modalities. A practical learning algorithm, called contrastive divergence (CD), is applied to infer the parameters of our multimodal DBN model in an unsupervised manner. Tests on two available cancer datasets show that our integrative data analysis approach can effectively extract a unified representation of latent features to capture both intra- and cross-modality correlations, and identify meaningful disease subtypes from multi-platform cancer data. In addition, our approach can identify key genes and miRNAs that may play distinct roles in the pathogenesis of different cancer subtypes. Among those key miRNAs, we found that the expression level of miR-29a is highly correlated with survival time in ovarian cancer patients. These results indicate that our multimodal DBN based data analysis approach may have practical applications in cancer pathogenesis studies and provide useful guidelines for

  8. Efficient bit sifting scheme of post-processing in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Qiong; Le, Dan; Wu, Xianyan; Niu, Xiamu; Guo, Hong

    2015-10-01

Bit sifting is an important step in the post-processing of quantum key distribution (QKD). Its function is to sift out the undetected original keys. The communication traffic of bit sifting has an essential impact on the net secure key rate of a practical QKD system. In this paper, an efficient bit sifting scheme is presented, the core of which is a lossless source coding algorithm. Both theoretical analysis and experimental results demonstrate that the performance of the scheme approaches the Shannon limit. The proposed scheme can greatly decrease the communication traffic of the post-processing of a QKD system, thereby decreasing the secure key consumption for classical channel authentication and increasing the net secure key rate, as demonstrated by an analysis of the improvement in the net secure key rate. Meanwhile, some recommendations on applying the proposed scheme to some representative practical QKD systems are also provided.
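The Shannon-limit argument can be made concrete with a back-of-the-envelope comparison. The sketch below (hypothetical function names, not the paper's algorithm) contrasts two naive sifting encodings, a one-flag-per-pulse bitmap and a per-detection index list, with the lossless source-coding lower bound n·H(p) for detection probability p.

```python
import math

def binary_entropy(p):
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits per symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def sifting_traffic_bits(n, p):
    """Communication cost (bits) of telling the peer which of n pulses
    were detected, when each pulse is detected with probability p."""
    raw_bitmap = n                                    # one flag bit per pulse
    index_list = p * n * math.ceil(math.log2(n))      # send each detected index
    shannon = n * binary_entropy(p)                   # lossless coding lower bound
    return raw_bitmap, index_list, shannon
```

For a typical sparse detection regime (say n = 10^6 pulses, p = 0.01), the entropy bound is roughly 0.08 bits per pulse, far below either naive encoding, which is the headroom a near-Shannon-limit sifting code exploits.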

  9. A Holistic Approach to Networked Information Systems Design and Analysis

    DTIC Science & Technology

    2016-04-15

attain quite substantial savings. 11. Optimal algorithms for energy harvesting in wireless networks. We use a Markov-decision-process (MDP) based...approach to obtain optimal policies for transmissions. The key advantage of our approach is that it holistically considers information and energy in a...Coding technique to minimize delays and the number of transmissions in Wireless Systems. As we approach an era of ubiquitous computing with information

  10. Analysis of the secrecy of the running key in quantum encryption channels using coherent states of light

    NASA Astrophysics Data System (ADS)

    Nikulin, Vladimir V.; Hughes, David H.; Malowicki, John; Bedi, Vijit

    2015-05-01

    Free-space optical communication channels offer secure links with low probability of interception and detection. Despite their point-to-point topology, additional security features may be required in privacy-critical applications. Encryption can be achieved at the physical layer by using quantized values of photons, which makes exploitation of such quantum communication links extremely difficult. One example of such technology is keyed communication in quantum noise, a novel quantum modulation protocol that offers ultra-secure communication with competitive performance characteristics. Its utilization relies on specific coherent measurements to decrypt the signal. The process of measurements is complicated by the inherent and irreducible quantum noise of coherent states. This problem is different from traditional laser communication with coherent detection; therefore continuous efforts are being made to improve the measurement techniques. Quantum-based encryption systems that use the phase of the signal as the information carrier impose aggressive requirements on the accuracy of the measurements when an unauthorized party attempts intercepting the data stream. Therefore, analysis of the secrecy of the data becomes extremely important. In this paper, we present the results of a study that had a goal of assessment of potential vulnerability of the running key. Basic results of the laboratory measurements are combined with simulation studies and statistical analysis that can be used for both conceptual improvement of the encryption approach and for quantitative comparison of secrecy of different quantum communication protocols.

  11. Key Frame Extraction in the Summary Space.

    PubMed

Li, Xuelong; Zhao, Bin; Lu, Xiaoqiang

    2018-06-01

Key frame extraction is an efficient way to create a video summary, which helps users obtain a quick comprehension of the video content. Generally, the key frames should be representative of the video content and, at the same time, diverse enough to reduce redundancy. Based on the assumption that the video data lie near a subspace of a high-dimensional space, a new approach, key frame extraction in the summary space, is proposed in this paper. The proposed approach aims to find the representative frames of the video and filter out similar frames from the representative frame set. First, the video data are mapped to a high-dimensional space, named the summary space. Then, a new representation is learned for each frame by analyzing the intrinsic structure of the summary space. Specifically, the learned representation reflects the representativeness of the frame and is utilized to select representative frames. Next, the perceptual hash algorithm is employed to measure the similarity of representative frames. As a result, the key frame set is obtained after filtering out similar frames from the representative frame set. Finally, the video summary is constructed by arranging the key frames in temporal order. Additionally, the ground truth, created by filtering out similar frames from human-created summaries, is utilized to evaluate the quality of the video summary. Compared with several traditional approaches, the experimental results on 80 videos from two datasets indicate the superior performance of our approach.
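The perceptual-hash filtering step can be sketched as follows, using a simple average hash on toy grayscale frames. This is illustrative: the paper does not specify this exact hash, and `filter_similar` and its Hamming-distance threshold are assumptions.

```python
def average_hash(frame):
    """Perceptual hash: threshold each pixel against the frame's mean intensity."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two hashes of equal length."""
    return sum(a != b for a, b in zip(h1, h2))

def filter_similar(frames, max_dist=2):
    """Keep a frame's index only if its hash differs from every kept frame
    by more than max_dist bits, i.e. drop near-duplicates."""
    kept, hashes = [], []
    for idx, frame in enumerate(frames):
        h = average_hash(frame)
        if all(hamming(h, k) > max_dist for k in hashes):
            kept.append(idx)
            hashes.append(h)
    return kept
```

In practice the hash is computed on a downsampled (e.g. 8x8) version of each representative frame, so that the comparison is cheap and robust to small pixel-level noise.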

  12. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    EPA Pesticide Factsheets

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  13. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
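To convey the flavor of finite-size parameter estimation with Bernoulli sampling, the sketch below bounds the underlying error probability from the errors observed on independently sampled test bits, using a Hoeffding-type tail bound for the binomial distribution. This is a simplification; the paper's actual bounds are tighter and protocol-specific.

```python
import math

def hoeffding_upper_bound(k_err, n_samp, eps=1e-10):
    """Upper bound on the underlying error probability when k_err errors were
    observed among n_samp Bernoulli-sampled test bits, valid except with
    probability at most eps (Hoeffding's inequality for binomial tails)."""
    observed = k_err / n_samp
    delta = math.sqrt(math.log(1 / eps) / (2 * n_samp))
    return min(1.0, observed + delta)
```

The finite-key penalty is the additive term delta, which shrinks as 1/sqrt(n_samp); in the asymptotic regime it vanishes and the observed rate can be used directly.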

  14. Novel secret key generation techniques using memristor devices

    NASA Astrophysics Data System (ADS)

    Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi

    2016-02-01

This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost effective and power efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based Matlab model. It is shown that the generated keys can have dynamic sizes, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms Triple Data Encryption Standard (3DES) and Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and study the impacts of process variations. The work proposed in this paper is a milestone towards System-on-Chip (SOC) memristor-based security.
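The idea of deriving variable-length session keys from a master key plus session-specific parameters can be sketched with a standard HKDF-style expansion. This stands in for the memristor-specific algorithm, which the abstract does not detail; the function name and the use of HMAC-SHA256 are assumptions.

```python
import hashlib
import hmac

def derive_session_key(master_key: bytes, session_info: bytes, length: int) -> bytes:
    """HKDF-expand-style derivation of a session key of arbitrary length from a
    master key (here, the stand-in for the memristor's initial profile) and
    session parameters. Each block chains the previous one plus a counter."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(master_key,
                         block + session_info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]
```

Because the derivation is deterministic, both endpoints holding the same master key and session parameters obtain the same session key without transmitting it.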

  15. Approaches to a global quantum key distribution network

    NASA Astrophysics Data System (ADS)

    Islam, Tanvirul; Bedington, Robert; Ling, Alexander

    2017-10-01

Progress in realising quantum computers threatens to weaken existing public key encryption infrastructure. A global quantum key distribution (QKD) network can play a role in computational attack-resistant encryption. Such a network could use a constellation of high altitude platforms, such as airships and satellites, as trusted nodes to facilitate QKD between any two points on the globe on demand. This requires both space-to-ground and inter-platform links. However, the prohibitive cost of traditional satellite-based development limits the experimental work demonstrating relevant technologies. To accelerate progress towards a global network, we use an emerging class of shoe-box sized spacecraft known as CubeSats. We have designed a polarization entangled photon pair source that can operate on board CubeSats. The robustness and miniature form factor of our entanglement source make it especially suitable for pathfinder missions that study QKD between two high altitude platforms. The technological outcomes of such a mission would be the essential building blocks for a global QKD network.

  16. Practical challenges in quantum key distribution

    DOE PAGES

    Diamanti, Eleni; Lo, Hoi -Kwong; Qi, Bing; ...

    2016-11-08

    Here, quantum key distribution (QKD) promises unconditional security in data communication and is currently being deployed in commercial applications. Nonetheless, before QKD can be widely adopted, it faces a number of important challenges such as secret key rate, distance, size, cost and practical security. Here, we survey those key challenges and the approaches that are currently being taken to address them.

  17. Practical challenges in quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diamanti, Eleni; Lo, Hoi -Kwong; Qi, Bing

    Here, quantum key distribution (QKD) promises unconditional security in data communication and is currently being deployed in commercial applications. Nonetheless, before QKD can be widely adopted, it faces a number of important challenges such as secret key rate, distance, size, cost and practical security. Here, we survey those key challenges and the approaches that are currently being taken to address them.

  18. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
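The Monte-Carlo uncertainty-propagation methodology described above can be sketched generically: draw uncertain model inputs from assumed distributions, push each sample through the model, and read statistics off the output distribution. This is illustrative only; the aerothermal and material-response models referenced in the abstract are far more involved than the toy `model` callable below, and the Gaussian priors are an assumption.

```python
import random
import statistics

def monte_carlo_uq(model, priors, n=20000, seed=1):
    """Propagate input uncertainty through `model`.
    `priors` is a list of (mean, std) pairs, one per model input, each sampled
    as an independent Gaussian. Returns the output mean and 95th percentile,
    the latter being a typical margin used for risk-based design."""
    rng = random.Random(seed)
    outs = [model(*(rng.gauss(mu, sd) for mu, sd in priors)) for _ in range(n)]
    outs.sort()
    return statistics.fmean(outs), outs[int(0.95 * n)]
```

Sensitivity analysis follows the same pattern: vary one input's prior at a time and observe how much the output percentile moves, which identifies the component uncertainties that drive testing priorities.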

  19. Key rate for calibration robust entanglement based BB84 quantum key distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gittsovich, O.; Moroder, T.

    2014-12-04

We apply the approach of verifying entanglement, which is based on the sole knowledge of the dimension of the underlying physical system, to the entanglement based version of the BB84 quantum key distribution protocol. We show that the familiar one-way key rate formula holds already under the single assumption that one of the parties is measuring a qubit; no further assumptions about the measurement are needed.

  20. College and Career Readiness Assessment: Validation of the Key Cognitive Strategies Framework

    ERIC Educational Resources Information Center

    Lombardi, Allison R.; Conley, David T.; Seburn, Mary A.; Downs, Andrew M.

    2013-01-01

    In this study, the authors examined the psychometric properties of the key cognitive strategies (KCS) within the CollegeCareerReady[TM] School Diagnostic, a self-report measure of critical thinking skills intended for high school students. Using a cross-validation approach, an exploratory factor analysis was conducted with a randomly selected…

  1. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    PubMed

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  2. A Phenomenological Approach to Wear Debris Analysis

    DTIC Science & Technology

    1996-04-01

Ferrography; oil analysis; wear debris analysis Introduction: Wear debris analysis is an important subject in maintenance, especially condition...diagnostic tool can be traced to Ferrography developed in the early 1970's. Westcott and Seifert [1] state the heart and soul of Ferrography, or optical debris...monitoring, as follows. The key to Ferrography or optical examination of wear debris is to find marks or features on wear debris which suggest likely

  3. Screening key candidate genes and pathways involved in insulinoma by microarray analysis.

    PubMed

    Zhou, Wuhua; Gong, Li; Li, Xuefeng; Wan, Yunyan; Wang, Xiangfei; Li, Huili; Jiang, Bin

    2018-06-01

Insulinoma is a rare type of tumor and its genetic features remain largely unknown. This study aimed to search for potential key genes and relevant enriched pathways of insulinoma. The gene expression data from GSE73338 were downloaded from the Gene Expression Omnibus database. Differentially expressed genes (DEGs) were identified between insulinoma tissues and normal pancreas tissues, followed by pathway enrichment analysis, protein-protein interaction (PPI) network construction, and module analysis. The expressions of candidate key genes were validated by quantitative real-time polymerase chain reaction (RT-PCR) in insulinoma tissues. A total of 1632 DEGs were obtained, including 1117 upregulated genes and 514 downregulated genes. Pathway enrichment results showed that upregulated DEGs were significantly implicated in insulin secretion, and downregulated DEGs were mainly enriched in pancreatic secretion. PPI network analysis revealed 7 hub genes with degrees greater than 10, including GCG (glucagon), GCGR (glucagon receptor), PLCB1 (phospholipase C, beta 1), CASR (calcium sensing receptor), F2R (coagulation factor II thrombin receptor), GRM1 (glutamate metabotropic receptor 1), and GRM5 (glutamate metabotropic receptor 5). DEGs involved in the significant modules were enriched in the calcium signaling pathway, protein ubiquitination, and platelet degranulation. Quantitative RT-PCR data confirmed that the expression trends of these hub genes were similar to the results of the bioinformatic analysis. The present study demonstrated that the candidate DEGs and enriched pathways are potential critical molecular events involved in the development of insulinoma, and these findings are useful for a better understanding of insulinoma genesis.
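DEG screens of this kind typically control the false discovery rate when calling genes significant. Below is a minimal sketch of the Benjamini-Hochberg step-up procedure, the standard FDR correction in microarray analysis (the abstract does not state which correction this particular study applied).

```python
def benjamini_hochberg(pvals, fdr=0.05):
    """Benjamini-Hochberg step-up procedure.
    Returns the (sorted) indices of hypotheses rejected at the given FDR:
    find the largest rank k with p_(k) <= k * fdr / m, reject the k smallest."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank * fdr / m:
            k_max = rank
    return sorted(order[:k_max])
```

Applied genome-wide, the per-gene p-values come from a differential-expression test (e.g. a moderated t-statistic), and the rejected set is the DEG list carried into pathway enrichment.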

  4. A random-key encoded harmony search approach for energy-efficient production scheduling with shared resources

    NASA Astrophysics Data System (ADS)

    Garcia-Santiago, C. A.; Del Ser, J.; Upton, C.; Quilligan, F.; Gil-Lopez, S.; Salcedo-Sanz, S.

    2015-11-01

    When seeking near-optimal solutions for complex scheduling problems, meta-heuristics demonstrate good performance with affordable computational effort. This has resulted in a gravitation towards these approaches when researching industrial use-cases such as energy-efficient production planning. However, much of the previous research makes assumptions about softer constraints that affect planning strategies and about how human planners interact with the algorithm in a live production environment. This article describes a job-shop problem that focuses on minimizing energy consumption across a production facility of shared resources. The application scenario is based on real facilities made available by the Irish Center for Manufacturing Research. The formulated problem is tackled via harmony search heuristics with random keys encoding. Simulation results are compared to a genetic algorithm, a simulated annealing approach and a first-come-first-served scheduling. The superior performance obtained by the proposed scheduler paves the way towards its practical implementation over industrial production chains.
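Random-key encoding is what lets a continuous meta-heuristic like harmony search produce valid job orderings: each candidate solution is a vector in [0,1]^n, decoded into a permutation by sorting. A minimal sketch follows; the parameter names hmcr, par, and bw follow common harmony-search conventions and are not necessarily the article's settings.

```python
import random

def decode_random_keys(keys):
    """Decode a vector of continuous random keys into a job permutation:
    jobs are scheduled in ascending order of their key values."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def harmony_search_step(memory, hmcr=0.9, par=0.3, bw=0.05):
    """Improvise one new random-key harmony from the harmony memory.
    hmcr: memory-consideration rate; par: pitch-adjustment rate; bw: bandwidth."""
    dim = len(memory[0])
    new = []
    for d in range(dim):
        if random.random() < hmcr:                  # reuse a value from memory
            v = random.choice(memory)[d]
            if random.random() < par:               # locally perturb it
                v = min(1.0, max(0.0, v + random.uniform(-bw, bw)))
        else:                                       # fresh random value
            v = random.random()
        new.append(v)
    return new
```

The full algorithm repeatedly improvises harmonies, evaluates the decoded schedule (here, its energy cost), and replaces the worst memory member whenever the new harmony improves on it.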

  5. A Systematic Approach to Time-series Metabolite Profiling and RNA-seq Analysis of Chinese Hamster Ovary Cell Culture.

    PubMed

    Hsu, Han-Hsiu; Araki, Michihiro; Mochizuki, Masao; Hori, Yoshimi; Murata, Masahiro; Kahar, Prihardi; Yoshida, Takanobu; Hasunuma, Tomohisa; Kondo, Akihiko

    2017-03-02

    Chinese hamster ovary (CHO) cells are the primary host used for biopharmaceutical protein production. The engineering of CHO cells to produce higher amounts of biopharmaceuticals has been highly dependent on empirical approaches, but recent high-throughput "omics" methods are changing the situation in a rational manner. Omics data analyses using gene expression or metabolite profiling make it possible to identify key genes and metabolites in antibody production. Systematic omics approaches using different types of time-series data are expected to further enhance understanding of cellular behaviours and molecular networks for rational design of CHO cells. This study developed a systematic method for obtaining and analysing time-dependent intracellular and extracellular metabolite profiles, RNA-seq data (enzymatic mRNA levels) and cell counts from CHO cell cultures to capture an overall view of the CHO central metabolic pathway (CMP). We then calculated correlation coefficients among all the profiles and visualised the whole CMP by heatmap analysis and metabolic pathway mapping, to classify genes and metabolites together. This approach provides an efficient platform to identify key genes and metabolites in CHO cell culture.
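The core of the correlation step described above is an all-against-all Pearson computation across the time-series profiles. A minimal stdlib sketch (the profile names are hypothetical, and the study's heatmap visualisation and pathway mapping are not reproduced here):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_matrix(profiles):
    """All-against-all correlations among named profiles (genes, metabolites,
    cell counts), the table that a heatmap would then visualise."""
    names = list(profiles)
    return {(a, b): pearson(profiles[a], profiles[b])
            for a in names for b in names}
```

Clustering the rows of this matrix groups genes and metabolites with co-varying time courses, which is what allows enzymatic mRNA levels and pathway intermediates to be classified together.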

  6. Biochemometrics for Natural Products Research: Comparison of Data Analysis Approaches and Application to Identification of Bioactive Compounds.

    PubMed

    Kellogg, Joshua J; Todd, Daniel A; Egan, Joseph M; Raja, Huzefa A; Oberlies, Nicholas H; Kvalheim, Olav M; Cech, Nadja B

    2016-02-26

    A central challenge of natural products research is assigning bioactive compounds from complex mixtures. The gold standard approach to address this challenge, bioassay-guided fractionation, is often biased toward abundant, rather than bioactive, mixture components. This study evaluated the combination of bioassay-guided fractionation with untargeted metabolite profiling to improve active component identification early in the fractionation process. Key to this methodology was statistical modeling of the integrated biological and chemical data sets (biochemometric analysis). Three data analysis approaches for biochemometric analysis were compared, namely, partial least-squares loading vectors, S-plots, and the selectivity ratio. Extracts from the endophytic fungi Alternaria sp. and Pyrenochaeta sp. with antimicrobial activity against Staphylococcus aureus served as test cases. Biochemometric analysis incorporating the selectivity ratio performed best in identifying bioactive ions from these extracts early in the fractionation process, yielding altersetin (3, MIC 0.23 μg/mL) and macrosphelide A (4, MIC 75 μg/mL) as antibacterial constituents from Alternaria sp. and Pyrenochaeta sp., respectively. This study demonstrates the potential of biochemometrics coupled with bioassay-guided fractionation to identify bioactive mixture components. A benefit of this approach is the ability to integrate multiple stages of fractionation and bioassay data into a single analysis.
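The selectivity ratio compares, for each chemical feature, the variance explained by projection onto the bioactivity axis against the residual variance. The published method uses a PLS target projection; the simplified one-component version below only conveys the idea and is not the authors' implementation.

```python
def selectivity_ratio(feature, activity):
    """Simplified one-component selectivity ratio: project a feature's
    intensity profile onto the bioactivity profile and return
    explained variance / residual variance. Higher values flag ions whose
    abundance tracks the bioassay, i.e. candidate active constituents."""
    scale = (sum(f * a for f, a in zip(feature, activity))
             / sum(a * a for a in activity))
    explained = [scale * a for a in activity]
    residual = [f - e for f, e in zip(feature, explained)]
    ev = sum(e * e for e in explained)
    rv = sum(r * r for r in residual)
    return ev / rv if rv else float("inf")
```

Ranking all detected ions by this ratio, rather than by raw abundance, is what counteracts the bias of bioassay-guided fractionation toward major mixture components.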

  7. Public-key quantum digital signature scheme with one-time pad private-key

    NASA Astrophysics Data System (ADS)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

A quantum digital signature scheme is proposed based on a public-key quantum cryptosystem. In the scheme, the verification public-key is derived from the signer's identity information (such as e-mail) on the foundation of identity-based encryption, and the signature private-key is generated by a one-time pad (OTP) protocol. The public-key and private-key pair consists of classical bits, but the signature cipher consists of quantum qubits. After the signer announces the public-key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid with the public-key and the quantum digital digest. Analysis results show that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Based on the public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.
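The classical one-time pad component is simply an XOR with a truly random key of message length. A minimal sketch of that classical piece follows (the quantum parts of the scheme are not modeled, and the function names are illustrative):

```python
import secrets

def otp_keygen(n: int) -> bytes:
    """Generate a fresh, uniformly random one-time pad of n bytes.
    Each pad must be used for exactly one message."""
    return secrets.token_bytes(n)

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad encryption/decryption: XOR is its own inverse."""
    assert len(key) >= len(data), "pad must cover the whole message"
    return bytes(d ^ k for d, k in zip(data, key))
```

Because the pad is uniform and never reused, the ciphertext is statistically independent of the message, which is the information-theoretic guarantee the scheme leans on for its private-key generation.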

  8. An Empirical Analysis of the Cascade Secret Key Reconciliation Protocol for Quantum Key Distribution

    DTIC Science & Technology

    2011-09-01

    performance with the parity checks within each pass increasing and as a result, the processing time is expected to increase as well. A conclusion is drawn... timely manner has driven efforts to develop new key distribution methods. The most promising method is Quantum Key Distribution (QKD) and is...thank the QKD Project Team for all of the insight and support they provided in such a short time period. Thanks are especially in order for my

  9. Testing key predictions of the associative account of mirror neurons in humans using multivariate pattern analysis.

    PubMed

    Oosterhof, Nikolaas N; Wiggett, Alison J; Cross, Emily S

    2014-04-01

Cook et al. overstate the evidence supporting their associative account of mirror neurons in humans: most studies do not address a key property, namely action-specificity that generalizes across the visual and motor domains. Multivariate pattern analysis (MVPA) of neuroimaging data can address this concern, and we illustrate how MVPA can be used to test key predictions of their account.

  10. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis

    PubMed Central

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    Background When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers. PMID:29546155

  11. In-Silico Integration Approach to Identify a Key miRNA Regulating a Gene Network in Aggressive Prostate Cancer

    PubMed Central

    Colaprico, Antonio; Bontempi, Gianluca; Castiglioni, Isabella

    2018-01-01

    Like other cancer diseases, prostate cancer (PC) is caused by the accumulation of genetic alterations in the cells that drives malignant growth. These alterations are revealed by gene profiling and copy number alteration (CNA) analysis. Moreover, recent evidence suggests that also microRNAs have an important role in PC development. Despite efforts to profile PC, the alterations (gene, CNA, and miRNA) and biological processes that correlate with disease development and progression remain partially elusive. Many gene signatures proposed as diagnostic or prognostic tools in cancer poorly overlap. The identification of co-expressed genes, that are functionally related, can identify a core network of genes associated with PC with a better reproducibility. By combining different approaches, including the integration of mRNA expression profiles, CNAs, and miRNA expression levels, we identified a gene signature of four genes overlapping with other published gene signatures and able to distinguish, in silico, high Gleason-scored PC from normal human tissue, which was further enriched to 19 genes by gene co-expression analysis. From the analysis of miRNAs possibly regulating this network, we found that hsa-miR-153 was highly connected to the genes in the network. Our results identify a four-gene signature with diagnostic and prognostic value in PC and suggest an interesting gene network that could play a key regulatory role in PC development and progression. Furthermore, hsa-miR-153, controlling this network, could be a potential biomarker for theranostics in high Gleason-scored PC. PMID:29562723

  12. Simultaneous detection of landmarks and key-frame in cardiac perfusion MRI using a joint spatial-temporal context model

    NASA Astrophysics Data System (ADS)

    Lu, Xiaoguang; Xue, Hui; Jolly, Marie-Pierre; Guetter, Christoph; Kellman, Peter; Hsu, Li-Yueh; Arai, Andrew; Zuehlsdorff, Sven; Littmann, Arne; Georgescu, Bogdan; Guehring, Jens

    2011-03-01

    Cardiac perfusion magnetic resonance imaging (MRI) has proven clinical significance in diagnosis of heart diseases. However, analysis of perfusion data is time-consuming, where automatic detection of anatomic landmarks and key-frames from perfusion MR sequences is helpful for anchoring structures and functional analysis of the heart, leading toward fully automated perfusion analysis. Learning-based object detection methods have demonstrated their capabilities to handle large variations of the object by exploring a local region, i.e., context. Conventional 2D approaches take into account spatial context only. Temporal signals in perfusion data present a strong cue for anchoring. We propose a joint context model to encode both spatial and temporal evidence. In addition, our spatial context is constructed not only based on the landmark of interest, but also the landmarks that are correlated in the neighboring anatomies. A discriminative model is learned through a probabilistic boosting tree. A marginal space learning strategy is applied to efficiently learn and search in a high dimensional parameter space. A fully automatic system is developed to simultaneously detect anatomic landmarks and key frames in both RV and LV from perfusion sequences. The proposed approach was evaluated on a database of 373 cardiac perfusion MRI sequences from 77 patients. Experimental results of a 4-fold cross validation show superior landmark detection accuracies of the proposed joint spatial-temporal approach to the 2D approach that is based on spatial context only. The key-frame identification results are promising.

  13. Bi-directional gene set enrichment and canonical correlation analysis identify key diet-sensitive pathways and biomarkers of metabolic syndrome.

    PubMed

    Morine, Melissa J; McMonagle, Jolene; Toomey, Sinead; Reynolds, Clare M; Moloney, Aidan P; Gormley, Isobel C; Gaora, Peadar O; Roche, Helen M

    2010-10-07

    Currently, a number of bioinformatics methods are available to generate appropriate lists of genes from a microarray experiment. While these lists represent an accurate primary analysis of the data, fewer options exist to contextualise those lists. The development and validation of such methods is crucial to the wider application of microarray technology in the clinical setting. Two key challenges in clinical bioinformatics involve appropriate statistical modelling of dynamic transcriptomic changes, and extraction of clinically relevant meaning from very large datasets. Here, we apply an approach to gene set enrichment analysis that allows for detection of bi-directional enrichment within a gene set. Furthermore, we apply canonical correlation analysis and Fisher's exact test, using plasma marker data with known clinical relevance to aid identification of the most important gene and pathway changes in our transcriptomic dataset. After a 28-day dietary intervention with high-CLA beef, a range of plasma markers indicated a marked improvement in the metabolic health of genetically obese mice. Tissue transcriptomic profiles indicated that the effects were most dramatic in liver (1270 genes significantly changed; p < 0.05), followed by muscle (601 genes) and adipose (16 genes). Results from modified GSEA showed that the high-CLA beef diet affected diverse biological processes across the three tissues, and that the majority of pathway changes reached significance only with the bi-directional test. Combining the liver tissue microarray results with plasma marker data revealed 110 CLA-sensitive genes showing strong canonical correlation with one or more plasma markers of metabolic health, and 9 significantly overrepresented pathways among this set; each of these pathways was also significantly changed by the high-CLA diet. Closer inspection of two of these pathways--selenoamino acid metabolism and steroid biosynthesis--illustrated clear diet-sensitive changes in
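    A canonical correlation analysis like the one applied above can be sketched numerically. The QR/SVD route below and the synthetic "gene" and "marker" data are illustrative assumptions, not the authors' actual pipeline:

    ```python
    import numpy as np

    def canonical_correlations(X, Y):
        """Canonical correlations between data matrices X (n x p) and Y (n x q).

        Computed as the singular values of Qx.T @ Qy, where Qx and Qy come from
        QR decompositions of the column-centered data matrices.
        """
        Xc = X - X.mean(axis=0)
        Yc = Y - Y.mean(axis=0)
        Qx, _ = np.linalg.qr(Xc)
        Qy, _ = np.linalg.qr(Yc)
        s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
        return np.clip(s, 0.0, 1.0)

    # Synthetic example: expression of 3 "genes" vs. 2 "plasma markers"
    rng = np.random.default_rng(0)
    genes = rng.normal(size=(50, 3))
    markers = np.column_stack([
        genes[:, 0] + 0.1 * rng.normal(size=50),  # marker 1 tracks gene 1 closely
        rng.normal(size=50),                      # marker 2 is unrelated noise
    ])
    rho = canonical_correlations(genes, markers)
    print(rho)  # leading correlation near 1, second one much smaller
    ```

    Genes whose expression correlates strongly with a clinically meaningful marker, as in the 110 CLA-sensitive genes above, would show up through a leading canonical correlation close to 1.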

  14. Bi-directional gene set enrichment and canonical correlation analysis identify key diet-sensitive pathways and biomarkers of metabolic syndrome

    PubMed Central

    2010-01-01

    Background Currently, a number of bioinformatics methods are available to generate appropriate lists of genes from a microarray experiment. While these lists represent an accurate primary analysis of the data, fewer options exist to contextualise those lists. The development and validation of such methods is crucial to the wider application of microarray technology in the clinical setting. Two key challenges in clinical bioinformatics involve appropriate statistical modelling of dynamic transcriptomic changes, and extraction of clinically relevant meaning from very large datasets. Results Here, we apply an approach to gene set enrichment analysis that allows for detection of bi-directional enrichment within a gene set. Furthermore, we apply canonical correlation analysis and Fisher's exact test, using plasma marker data with known clinical relevance to aid identification of the most important gene and pathway changes in our transcriptomic dataset. After a 28-day dietary intervention with high-CLA beef, a range of plasma markers indicated a marked improvement in the metabolic health of genetically obese mice. Tissue transcriptomic profiles indicated that the effects were most dramatic in liver (1270 genes significantly changed; p < 0.05), followed by muscle (601 genes) and adipose (16 genes). Results from modified GSEA showed that the high-CLA beef diet affected diverse biological processes across the three tissues, and that the majority of pathway changes reached significance only with the bi-directional test. Combining the liver tissue microarray results with plasma marker data revealed 110 CLA-sensitive genes showing strong canonical correlation with one or more plasma markers of metabolic health, and 9 significantly overrepresented pathways among this set; each of these pathways was also significantly changed by the high-CLA diet. Closer inspection of two of these pathways - selenoamino acid metabolism and steroid biosynthesis - illustrated clear diet

  15. [Causal analysis approaches in epidemiology].

    PubMed

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper presents an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with special emphasis on marginal structural models. Graphical approaches were developed to allow synthetic representations of the supposed causal relationships in a given problem and serve as qualitative support in the study of causal relationships. The sufficient-component cause model was developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible sources of confounding in a study. Structural equation models, the main extension of path analysis, combine a system of equations with a path diagram representing a set of possible causal relationships. They allow direct and indirect effects to be quantified in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event with the counterfactual event (the event that would have been observed if, contrary to fact, the subject had received a different exposure than the one actually received). This theoretical approach has revealed the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies with time-varying confounding, classical methods (regressions) may be biased. Marginal structural models were developed to address this issue. In conclusion, "causal models", though developed partly independently, rest on equivalent logical foundations. A crucial step in the application of these models is the

  16. Approaches for the Analysis of Chlorinated Lipids

    PubMed Central

    Wang, Wen-yi; Albert, Carolyn J.; Ford, David A.

    2013-01-01

Leukocytes are key cellular mediators of human diseases through their role in inflammation. Identifying unique molecules produced by leukocytes may provide new biomarkers and mechanistic insights into the role of leukocytes in disease. Chlorinated lipids are generated as a result of myeloperoxidase-containing leukocyte-derived hypochlorous acid targeting the vinyl ether bond of plasmalogens. The initial product of this reaction is α-chlorofatty aldehyde. α-Chlorofatty aldehyde is both oxidized to α-chlorofatty acid and reduced to α-chlorofatty alcohol by cellular metabolism. This review focuses on the separation techniques and quantitative analysis for these chlorinated lipids. For α-chlorofatty acid the negative charge of carboxylic acids is exploited to detect the chlorinated lipid species of these acids by electrospray ionization mass spectrometry in the negative ion mode. In contrast, α-chlorofatty aldehyde and α-chlorofatty alcohol are converted to pentafluorobenzyl oxime and pentafluorobenzoyl ester derivatives, which are detected by negative ion-chemical ionization mass spectrometry. These two detection methods coupled with the use of stable isotope internal standards and either liquid chromatography or gas chromatography provide highly sensitive analytical approaches to measure these novel lipids. PMID:24056259

  17. Continuous variable quantum key distribution with a real local oscillator using simultaneous pilot signals.

    PubMed

    Kleis, Sebastian; Rueckmann, Max; Schaeffer, Christian G

    2017-04-15

In this Letter, we propose a novel implementation of continuous variable quantum key distribution that operates with a real local oscillator placed at the receiver site. In addition, pulsing of the continuous wave laser sources is not required, leading to an exceptionally practical and secure setup. It is suitable for arbitrary schemes based on modulated coherent states and heterodyne detection. The results shown include transmission experiments as well as an excess noise analysis applying a discrete 8-state phase modulation. Achievable key rates under collective attacks are estimated. The results demonstrate the high potential of the approach to achieve high secret key rates at relatively low effort and cost.

  18. An improved scheme on decoy-state method for measurement-device-independent quantum key distribution.

    PubMed

    Wang, Dong; Li, Mo; Guo, Guang-Can; Wang, Qin

    2015-10-14

Quantum key distribution involving decoy-states is a significant application of quantum information. By using three-intensity decoy-states of single-photon-added coherent sources, we propose a practically realizable quantum key distribution scheme that closely approaches the ideal asymptotic case of an infinite number of decoy-states. We make a comparative study between this scheme and two other existing ones, i.e., two-intensity decoy-states with single-photon-added coherent sources, and three-intensity decoy-states with weak coherent sources. Through numerical analysis, we demonstrate the advantages of our scheme in secure transmission distance and final key generation rate.

  19. An improved scheme on decoy-state method for measurement-device-independent quantum key distribution

    PubMed Central

    Wang, Dong; Li, Mo; Guo, Guang-Can; Wang, Qin

    2015-01-01

Quantum key distribution involving decoy-states is a significant application of quantum information. By using three-intensity decoy-states of single-photon-added coherent sources, we propose a practically realizable quantum key distribution scheme that closely approaches the ideal asymptotic case of an infinite number of decoy-states. We make a comparative study between this scheme and two other existing ones, i.e., two-intensity decoy-states with single-photon-added coherent sources, and three-intensity decoy-states with weak coherent sources. Through numerical analysis, we demonstrate the advantages of our scheme in secure transmission distance and final key generation rate. PMID:26463580

  20. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    PubMed

    Lin, Ying-Ting

    2013-04-30

A tandem hardware technique is often used for the chemical analysis of a single cell: the first step separates the chemicals of interest from the bulk of the cell; the second step detects the important identities. To identify key structural modifications around ligand binding, the present study develops a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as the computational separation step. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, i.e., the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications as well as their likely impact on ligand binding. The key structural modifications around ligand binding are effectively extracted and characterized from cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.

  1. Textual analysis of tobacco editorials: how are key media gatekeepers framing the issues?

    PubMed

    Smith, Katherine Clegg; Wakefield, Melanie

    2005-01-01

The news media's potential to promote awareness of health issues is established, and media advocacy is now an important tool in combating tobacco use. This study examines newspaper editors' perspectives on tobacco-related issues through a qualitative textual analysis of tobacco-related editorials. The data consist of editorials on tobacco from a sample of 310 U.S. daily newspapers over the course of one year (2001). Data were sampled from a random one-third of the days per month, yielding 162 editorials for analysis. Each editorial was coded for theme, position, and frame. We analyzed the topics gaining editorial attention and the arguments made to support various perspectives. Editorials discussed a variety of both positive and negative news events, largely conveying support for tobacco-control objectives. Various organizing frames were used; supporting policy interventions, condemning the industry, highlighting individual rights, and expressing general cynicism were the most prevalent. Editors largely promoted tobacco-control efforts, particularly policy advances. There was, however, little coverage of key issues such as health effects and addiction, perhaps because they are no longer perceived to be contentious. Advocates should seek to address this area and minimize the cynicism of key media gatekeepers to avoid undermining policy and individual change efforts.

  2. Social Media as a Practical Approach in Engaging Key Stakeholders in School Crisis Communication Plans: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Agozzino, Alisa; Kaiser, Candace

    2014-01-01

    The current study examined how public relations specialists within school systems are developing, implementing, and revising their communication crisis plans in an effort to fully engage all key stakeholders. Four research questions and two hypotheses were posed. Members from a state public relations association for schools were asked to…

  3. Key Trends in Institutional Changes Under Sustainable Development

    NASA Astrophysics Data System (ADS)

    Karpova, Olga; Pevneva, Inna; Dymova, Irina; Kostina, Tatiana; Li, Sergey

    2017-11-01

The article considers the essential problems of the formation of the accounting institution under the sustainable development of the country and the region. The research builds on key work in the field of institutional economics and considers the trends of institutional change, including incremental, evolutionary and revolutionary change. Approaches to the analysis of institutions are presented as well. The first approach states that economic efficiency is guaranteed by newly emerging institutions. The second approach involves certain internal and external incentives for changing institutions. The third approach treats institutional change as the relation of individual economic entities to institutional innovations in terms of the net benefit from their implementation. The conclusion highlights the leading role of the state in the emergence and further development of newly created institutions, noting that not every change leads to greater efficiency. It is therefore crucial to consider the prior development of institutions when implementing changes in accounting and control.

  4. An Approach to Knowledge-Directed Image Analysis,

    DTIC Science & Technology

    1977-09-01

AN APPROACH TO KNOWLEDGE-DIRECTED IMAGE ANALYSIS. D.H. Ballard, C.M. Brown, J.A. Feldman, Computer Science Department, The University of Rochester, Rochester, New York 14627. … semantic network model and a distributed control structure to accomplish the image analysis process. The process of "understanding an image" leads to

  5. Key Working for Families with Young Disabled Children

    PubMed Central

    Carter, Bernie; Thomas, Megan

    2011-01-01

    For families with a disabled child, the usual challenges of family life can be further complicated by the need to access a wide range of services provided by a plethora of professionals and agencies. Key working aims to support children and their families in navigating these complexities ensuring easy access to relevant, high quality, and coordinated care. The aim of this paper is to explore the key worker role in relation to “being a key worker” and “having a key worker”. The data within this paper draw on a larger evaluation study of the Blackpool Early Support Pilot Programme. The qualitative study used an appreciative and narrative approach and utilised mixed methods (interviews, surveys and a nominal group workshop). Data were collected from 43 participants (parents, key workers, and other stakeholders). All stakeholders who had been involved with the service were invited to participate. In the paper we present and discuss the ways in which key working made a difference to the lives of children and their families. We also consider how key working transformed the perspectives of the key workers creating a deeper and richer understanding of family lives and the ways in which other disciplines and agencies worked. Key working contributed to the shift to a much more family-centred approach, and enhanced communication and information sharing between professionals and agencies improved. This resulted in families feeling more informed. Key workers acted in an entrepreneurial fashion, forging new relationships with families and between families and other stakeholders. Parents of young disabled children and their service providers benefited from key working. Much of the benefit accrued came from strong, relational, and social-professional networking which facilitated the embedding of new ways of working into everyday practice. Using an appreciative inquiry approach provided an effective and relevant way of engaging with parents, professionals, and other

  6. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of the cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure by using SimProcess and MatLab Software. The simulation is from top level all the way down to the computation needed for encryption, decryption, digital signature, and secure web server. The application of secure web server could be utilized in wireless communications. The results of the simulation are analyzed and confirmed by using queueing theory.
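    The abstract above mentions confirming the simulation results with queueing theory. As one hedged illustration (not necessarily the model used in the NASA study), the steady-state metrics of a single-server M/M/1 queue, a common first model for a server handling cryptographic requests, can be computed directly; the arrival and service rates below are hypothetical:

    ```python
    def mm1_metrics(arrival_rate, service_rate):
        """Steady-state metrics of an M/M/1 queue (Poisson arrivals, exponential service)."""
        if arrival_rate >= service_rate:
            raise ValueError("queue is unstable: arrival rate must be below service rate")
        rho = arrival_rate / service_rate          # server utilization
        L = rho / (1.0 - rho)                      # mean number of requests in system
        W = 1.0 / (service_rate - arrival_rate)    # mean time in system (via Little's law)
        Wq = rho / (service_rate - arrival_rate)   # mean time waiting in queue
        return {"utilization": rho, "mean_in_system": L, "mean_time": W, "mean_wait": Wq}

    # Hypothetical secure web server: 80 TLS requests/s arriving, capacity 100 requests/s
    m = mm1_metrics(80.0, 100.0)
    print(m)  # utilization 0.8, mean_in_system 4.0, mean_time 0.05 s
    ```

    Comparing such closed-form predictions against simulated encryption, decryption, and signing workloads is one way simulation output can be "analyzed and confirmed by using queueing theory."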

  7. Towards secure quantum key distribution protocol for wireless LANs: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Naik, R. Lalu; Reddy, P. Chenna

    2015-12-01

The primary goals of security such as authentication, confidentiality, integrity and non-repudiation in communication networks can be achieved with secure key distribution. Quantum mechanisms are highly secure means of distributing secret keys as they are unconditionally secure. Quantum key distribution protocols can effectively prevent various attacks in the quantum channel, while classical cryptography is efficient in authentication and verification of secret keys. By combining quantum cryptography and classical cryptography, the security of communications over networks can be enhanced. Hwang, Lee and Li exploited the merits of both cryptographic paradigms for provably secure communications that prevent replay, man-in-the-middle, and passive attacks. In this paper, we propose a new scheme combining quantum cryptography and classical cryptography for 802.11i wireless LANs. Since quantum cryptography is still immature in wireless networks, our work is a significant step toward securing communications in wireless networks. Our scheme is known as the hybrid quantum key distribution protocol. Our analytical results reveal that the proposed scheme is provably secure for wireless networks.

  8. Employing Simulation to Evaluate Designs: The APEX Approach

    NASA Technical Reports Server (NTRS)

    Freed, Michael A.; Shafto, Michael G.; Remington, Roger W.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    The key innovations of APEX are its integrated approaches to task analysis, procedure definition, and intelligent, resource-constrained multi-tasking. This paper presents a step-by-step description of how APEX is used, from scenario development through trace analysis.

  9. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven to be an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historical and potential further use for air transportation analysis.

  10. A Futures Approach to Policy Analysis.

    ERIC Educational Resources Information Center

    Morrison, James L.

    An approach to policy analysis for college officials is described that is based on evaluating and using information about the external environment to consider policy options for the future. The futures approach involves the following tasks: establishing an environmental scanning system to identify critical trends and emerging issues, identifying…

  11. Social Network Analysis Identifies Key Participants in Conservation Development.

    PubMed

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-05-01

Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified the actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections present in the network, varied from less than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not strongly correlated with any county characteristic. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
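    The density and centralization measures described above can be sketched in plain Python. The actor names and ties below are hypothetical, not drawn from the Colorado dataset:

    ```python
    def density(nodes, edges):
        """Proportion of possible undirected ties that are actually present."""
        n = len(nodes)
        possible = n * (n - 1) / 2
        return len(edges) / possible if possible else 0.0

    def degree_centralization(nodes, edges):
        """Freeman degree centralization: 1.0 for a star graph (one actor holds
        all ties), 0.0 when every actor has the same number of ties."""
        deg = {v: 0 for v in nodes}
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        n = len(nodes)
        dmax = max(deg.values())
        # Normalize by the maximum attainable sum, which a star graph achieves
        return sum(dmax - d for d in deg.values()) / ((n - 1) * (n - 2)) if n > 2 else 0.0

    # Hypothetical CD collaboration network: actors linked if they worked on a CD together
    nodes = ["consultant", "surveyor", "engineer", "planner", "developer"]
    edges = [("consultant", "surveyor"), ("consultant", "engineer"),
             ("consultant", "planner"), ("surveyor", "engineer")]
    print(density(nodes, edges))                # 4 of 10 possible ties -> 0.4
    print(degree_centralization(nodes, edges))  # consultant holds most ties
    ```

    In this toy network the consultant is the most connected actor, mirroring the finding above that consultants, surveyors, and engineers are the natural targets for outreach.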

  12. Fundamental finite key limits for one-way information reconciliation in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Martinez-Mateo, Jesus; Pacher, Christoph; Elkouss, David

    2017-11-01

The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation for the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes finite key effects into account and test it numerically against codes for two probability distributions, which we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.
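    For context, the commonly used leakage approximation that the abstract says is not generally valid is leak ≈ f · n · h(Q), with block length n, quantum bit error rate Q, reconciliation efficiency f ≥ 1, and binary entropy h. The sketch below shows only this standard asymptotic estimate (the paper's improved finite-key correction is not reproduced here, and the f value is an illustrative assumption):

    ```python
    from math import log2

    def binary_entropy(q):
        """Binary entropy h(q) in bits."""
        if q in (0.0, 1.0):
            return 0.0
        return -q * log2(q) - (1 - q) * log2(1 - q)

    def leakage_asymptotic(n, qber, f=1.16):
        """Often-used estimate of one-way information reconciliation leakage (bits):
        leak ~ f * n * h(Q). The paper argues this can be inaccurate at finite
        block lengths, motivating a corrected approximation."""
        return f * n * binary_entropy(qber)

    # Example: block of 10^4 sifted bits at 2% QBER with assumed efficiency f = 1.16
    leak = leakage_asymptotic(10_000, 0.02)
    print(leak)
    ```

    The secret key length is then roughly the sifted key length minus this leakage and privacy-amplification terms, which is why underestimating the leakage overstates the extractable key.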

  13. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
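    The box-counting estimate of fractal dimension used above can be sketched as the slope of log N(s) versus log(1/s), where N(s) is the number of boxes of side s that contain foreground pixels. The grid size and test patterns below are illustrative, not the study's confocal data:

    ```python
    from math import log

    def box_count(points, box):
        """Number of boxes of side `box` containing at least one foreground point."""
        return len({(x // box, y // box) for x, y in points})

    def fractal_dimension(points, size):
        """Box-counting fractal dimension: least-squares slope of log N(s)
        against log(1/s) over dyadic box sizes."""
        boxes = []
        s = size // 2
        while s >= 1:
            boxes.append(s)
            s //= 2
        xs = [log(size / b) for b in boxes]   # log(1/s), with s in grid units
        ys = [log(box_count(points, b)) for b in boxes]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                / sum((x - mx) ** 2 for x in xs))

    # Sanity checks: a filled square should be close to dimension 2, a line to 1
    size = 64
    square = [(x, y) for x in range(size) for y in range(size)]
    print(round(fractal_dimension(square, size), 2))  # 2.0
    ```

    Applied to a binarized collagen image, higher fractal dimension reflects more space-filling fiber architecture; lacunarity (not sketched here) additionally captures the gappiness of the pattern.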

  14. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  15. Key Health Information Technologies and Related Issues for Iran: A Qualitative Study.

    PubMed

    Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammadreza; Saghafi, Fatemeh

    2018-01-01

Planning for the future of Health Information Technology (HIT) requires applying a systematic approach when conducting foresight studies. The aim of this study was to identify key health information technologies and related issues for Iran until 2025. This was a qualitative study and the participants included experts and policy makers in the field of health information technology. In-depth semi-structured interviews were conducted and data were analyzed by using framework analysis and MAXQDA software. The findings revealed that the development of a national health information network, electronic health records, patient health records, a cloud-based service center, interoperability standards, patient monitoring technologies, telehealth, mhealth, clinical decision support systems, and health information technology and mhealth infrastructure were the key technologies for the future. These technologies could influence the economic, organizational and individual levels. To achieve them, the economic and organizational obstacles need to be overcome. In this study, a number of key technologies and related issues were identified. This approach can help to focus on the most important technologies in the future and to prioritize these technologies for better resource allocation and policy making.

  16. Key Health Information Technologies and Related Issues for Iran: A Qualitative Study

    PubMed Central

    Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammadreza; Saghafi, Fatemeh

    2018-01-01

Background and Objective: Planning for the future of Health Information Technology (HIT) requires applying a systematic approach when conducting foresight studies. The aim of this study was to identify key health information technologies and related issues for Iran until 2025. Methods: This was a qualitative study and the participants included experts and policy makers in the field of health information technology. In-depth semi-structured interviews were conducted and data were analyzed by using framework analysis and MAXQDA software. Results: The findings revealed that the development of a national health information network, electronic health records, patient health records, a cloud-based service center, interoperability standards, patient monitoring technologies, telehealth, mhealth, clinical decision support systems, and health information technology and mhealth infrastructure were the key technologies for the future. These technologies could influence the economic, organizational and individual levels. To achieve them, the economic and organizational obstacles need to be overcome. Conclusion: In this study, a number of key technologies and related issues were identified. This approach can help to focus on the most important technologies in the future and to prioritize these technologies for better resource allocation and policy making. PMID:29854016

  17. Virtual lock-and-key approach: the in silico revival of Fischer model by means of molecular descriptors.

    PubMed

    Lauria, Antonino; Tutone, Marco; Almerico, Anna Maria

    2011-09-01

In recent years, the application of computational methodologies in medicinal chemistry has developed remarkably. Efforts have focused on the search for new leads with high affinity for a specific biological target, employing different molecular modeling approaches to simulate molecular behavior toward that target. Despite the increasing reliability of computational methodologies, designed leads, once synthesized and screened, are not always suitable for the chosen biological target. To give such compounds another chance, this work revives the old Fischer lock-and-key concept. The same can be done for the "re-purposing" of old drugs: drugs may have many physiological targets, and it is useful to identify them. This aspect, called "polypharmacology", is known to be therapeutically essential in different treatments. The proposed protocol, the virtual lock-and-key approach (VLKA), "virtualizes" biological targets through their known inhibitors. To open a real lock, the key must fit the pins of the lock; molecular descriptors can be regarded as the pins. A tested compound can be considered a potential inhibitor of a biological target if the values of its molecular descriptors fall within the range of values calculated for the set of known inhibitors. The protocol thus transforms a biological target into a "lock model" starting from its known inhibitors. Whereas a real lock requires all pins to fit, here it is assumed that the more pins that fit, the higher the affinity for the considered biological target. Therefore, each biological target was converted into a sequence of "weighted" molecular descriptor range values (locks) using the structural features of the known inhibitors. Each biological target lock was tested by performing a
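    The lock-and-key scoring described above can be sketched as follows. The descriptor names and values are hypothetical, and the scoring is a deliberately unweighted simplification of the protocol (the paper weights the descriptor "pins"):

    ```python
    def build_lock(known_inhibitors):
        """A 'lock' is, per molecular descriptor, the [min, max] range observed
        among known inhibitors of the target; each descriptor range is one pin."""
        descriptors = known_inhibitors[0].keys()
        return {d: (min(m[d] for m in known_inhibitors),
                    max(m[d] for m in known_inhibitors))
                for d in descriptors}

    def key_score(lock, compound):
        """Fraction of pins the compound fits, i.e. descriptors whose value falls
        inside the lock's range. A higher score suggests higher affinity."""
        fits = sum(lo <= compound[d] <= hi for d, (lo, hi) in lock.items())
        return fits / len(lock)

    # Hypothetical descriptor values (logP, molecular weight, H-bond donors)
    inhibitors = [
        {"logP": 2.1, "MW": 310.0, "HBD": 1},
        {"logP": 3.4, "MW": 355.0, "HBD": 2},
        {"logP": 2.8, "MW": 330.0, "HBD": 1},
    ]
    lock = build_lock(inhibitors)
    candidate = {"logP": 3.0, "MW": 340.0, "HBD": 4}  # the HBD pin does not fit
    print(key_score(lock, candidate))  # 2 of 3 pins fit
    ```

    Ranking compounds by such scores across many target locks is what allows the same screening library to be reassessed against alternative targets, the "second chance" motivating the approach.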

  18. Repeated unit cell (RUC) approach for pure bending analysis of coronary stents.

    PubMed

    Ju, Feng; Xia, Zihui; Zhou, Chuwei

    2008-08-01

    Flexibility is one of the key properties of coronary stents. The objective of this paper is to characterize the bending behaviour of stents through finite element analysis with repeated unit cell (RUC) models. General periodic boundary conditions for the RUC under the pure bending condition are formulated. It is found that the proposed RUC approach can provide accurate numerical results for the bending behaviour of stents at much lower computational cost. Bending stiffness, post-yield bending behaviour and the relationship between moment and bending curvature are investigated for Palmaz-Schatz stents and stents with V- and S-shaped links. It is found that the effect of link geometry on the bending behaviour of stents is significant. The behaviour of stents subjected to cyclic bending is also investigated.

  19. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use medium- to high-fidelity tools early in the design stages, when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing

  20. Identification of Key Odorants in Withering-Flavored Green Tea by Aroma Extract Dilution Analysis

    NASA Astrophysics Data System (ADS)

    Mizukami, Yuzo; Yamaguchi, Yuichi

    This research aims to identify key odorants in withering-flavored green tea. Application of aroma extract dilution analysis to the volatile fractions of green tea and withering-flavored green tea revealed 25 and 35 odor-active peaks, respectively, with flavor dilution factors of ≥4. 4-Mercapto-4-methylpentan-2-one, (E)-2-nonenal, linalool, (E,Z)-2,6-nonadienal and 3-methylnonane-2,4-dione were key odorants in green tea with flavor dilution factors of ≥16. In addition to these five odorants, 1-octen-3-one, β-damascenone, geraniol, β-ionone, (Z)-methyljasmonate, indole and coumarin contributed to the withering flavor of green tea.

  1. Integrated Analysis of Mutation Data from Various Sources Identifies Key Genes and Signaling Pathways in Hepatocellular Carcinoma

    PubMed Central

    Wei, Lin; Tang, Ruqi; Lian, Baofeng; Zhao, Yingjun; He, Xianghuo; Xie, Lu

    2014-01-01

    Background: Recently, a number of studies have performed genome or exome sequencing of hepatocellular carcinoma (HCC) and identified hundreds or even thousands of mutations in protein-coding genes. However, these studies have only focused on a limited number of candidate genes, and many important mutation resources remain to be explored. Principal Findings: In this study, we integrated mutation data obtained from various sources and performed pathway and network analysis. We identified 113 pathways that were significantly mutated in HCC samples and found that the mutated genes included in these pathways contained high percentages of known cancer genes and damaging genes, and also demonstrated high conservation scores, indicating their important roles in liver tumorigenesis. Five classes of pathways that were mutated most frequently included (a) proliferation and apoptosis related pathways, (b) tumor microenvironment related pathways, (c) neural signaling related pathways, (d) metabolic related pathways, and (e) circadian related pathways. Network analysis further revealed that the mutated genes with the highest betweenness coefficients, such as the well-known cancer genes TP53, CTNNB1 and the recently identified novel mutated genes GNAL and the ADCY family, may play key roles in these significantly mutated pathways. Finally, we highlight several key genes (e.g., RPS6KA3 and PCLO) and pathways (e.g., axon guidance) in which the mutations were associated with clinical features. Conclusions: Our workflow illustrates the increased statistical power of integrating multiple studies of the same subject, which can provide biological insights that would otherwise be masked under individual sample sets. This type of bioinformatics approach is consistent with the necessity of making the best use of the ever increasing data provided in valuable databases, such as TCGA, to enhance the speed of deciphering human cancers. PMID:24988079
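The betweenness ranking used above to flag hub genes can be illustrated on a toy network; the graph, node names, and the Brandes-style computation below are illustrative stand-ins, not the study's actual network or software:

```python
from collections import deque

def betweenness(graph):
    """Unnormalized betweenness centrality via Brandes' algorithm
    (undirected, unweighted). `graph` maps node -> list of neighbours."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        stack, preds = [], {v: [] for v in graph}
        sigma = {v: 0 for v in graph}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in graph}; dist[s] = 0
        queue = deque([s])
        while queue:                                   # BFS from s
            v = queue.popleft(); stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        delta = {v: 0.0 for v in graph}
        while stack:                                   # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    for v in bc:
        bc[v] /= 2  # each undirected pair was counted from both endpoints
    return bc

# Toy interaction network: "HUB" bridges two modules, so it scores highest.
g = {"A": ["B", "HUB"], "B": ["A", "HUB"], "HUB": ["A", "B", "C", "D"],
     "C": ["HUB", "D"], "D": ["HUB", "C"]}
scores = betweenness(g)
print(max(scores, key=scores.get))  # HUB
```

In the study's setting, nodes would be mutated genes and edges knowledge-based interactions; genes like TP53 surface because many shortest paths route through them.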

  2. Integrated analysis of mutation data from various sources identifies key genes and signaling pathways in hepatocellular carcinoma.

    PubMed

    Zhang, Yuannv; Qiu, Zhaoping; Wei, Lin; Tang, Ruqi; Lian, Baofeng; Zhao, Yingjun; He, Xianghuo; Xie, Lu

    2014-01-01

    Recently, a number of studies have performed genome or exome sequencing of hepatocellular carcinoma (HCC) and identified hundreds or even thousands of mutations in protein-coding genes. However, these studies have only focused on a limited number of candidate genes, and many important mutation resources remain to be explored. In this study, we integrated mutation data obtained from various sources and performed pathway and network analysis. We identified 113 pathways that were significantly mutated in HCC samples and found that the mutated genes included in these pathways contained high percentages of known cancer genes and damaging genes, and also demonstrated high conservation scores, indicating their important roles in liver tumorigenesis. Five classes of pathways that were mutated most frequently included (a) proliferation and apoptosis related pathways, (b) tumor microenvironment related pathways, (c) neural signaling related pathways, (d) metabolic related pathways, and (e) circadian related pathways. Network analysis further revealed that the mutated genes with the highest betweenness coefficients, such as the well-known cancer genes TP53, CTNNB1 and the recently identified novel mutated genes GNAL and the ADCY family, may play key roles in these significantly mutated pathways. Finally, we highlight several key genes (e.g., RPS6KA3 and PCLO) and pathways (e.g., axon guidance) in which the mutations were associated with clinical features. Our workflow illustrates the increased statistical power of integrating multiple studies of the same subject, which can provide biological insights that would otherwise be masked under individual sample sets. This type of bioinformatics approach is consistent with the necessity of making the best use of the ever increasing data provided in valuable databases, such as TCGA, to enhance the speed of deciphering human cancers.

  3. BARI+: a biometric based distributed key management approach for wireless body area networks.

    PubMed

    Muhammad, Khaliq-ur-Rahman Raazi Syed; Lee, Heejo; Lee, Sungyoung; Lee, Young-Koo

    2010-01-01

    Wireless body area networks (WBAN) consist of resource constrained sensing devices just like other wireless sensor networks (WSN). However, they differ from WSN in topology, scale and security requirements. Due to these differences, key management schemes designed for WSN are inefficient and unnecessarily complex when applied to WBAN. Considering the key management issue, WBAN are also different from WPAN because WBAN can use random biometric measurements as keys. We highlight the differences between WSN and WBAN and propose an efficient key management scheme, which makes use of biometrics and is specifically designed for the WBAN domain.

  4. Tight finite-key analysis for quantum cryptography

    PubMed Central

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-01

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies. PMID:22252558

  5. Tight finite-key analysis for quantum cryptography.

    PubMed

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  6. On Robust Key Agreement Based on Public Key Authentication

    NASA Astrophysics Data System (ADS)

    Hao, Feng

    We describe two new attacks on the HMQV protocol. The first attack raises a serious question on the basic definition of "authentication" in HMQV, while the second attack is generally applicable to many other protocols. In addition, we present a new authenticated key agreement protocol called YAK. Our approach is to depend on well-established techniques such as Schnorr's signature. Among all the related protocols, YAK appears to be the simplest so far. We believe simplicity is an important engineering principle.

  7. Self-referenced continuous-variable quantum key distribution protocol

    DOE PAGES

    Soh, Daniel Beom Soo; Sarovar, Mohan; Brif, Constantin; ...

    2015-10-21

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice’s and Bob’s measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. Furthermore, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.

  8. Self-referenced continuous-variable quantum key distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soh, Daniel Beom Soo; Sarovar, Mohan; Brif, Constantin

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice’s and Bob’s measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. Furthermore, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.

  9. Self-Referenced Continuous-Variable Quantum Key Distribution Protocol

    NASA Astrophysics Data System (ADS)

    Soh, Daniel B. S.; Brif, Constantin; Coles, Patrick J.; Lütkenhaus, Norbert; Camacho, Ryan M.; Urayama, Junji; Sarovar, Mohan

    2015-10-01

    We introduce a new continuous-variable quantum key distribution (CV-QKD) protocol, self-referenced CV-QKD, that eliminates the need for transmission of a high-power local oscillator between the communicating parties. In this protocol, each signal pulse is accompanied by a reference pulse (or a pair of twin reference pulses), used to align Alice's and Bob's measurement bases. The method of phase estimation and compensation based on the reference pulse measurement can be viewed as a quantum analog of intradyne detection used in classical coherent communication, which extracts the phase information from the modulated signal. We present a proof-of-principle, fiber-based experimental demonstration of the protocol and quantify the expected secret key rates by expressing them in terms of experimental parameters. Our analysis of the secret key rate fully takes into account the inherent uncertainty associated with the quantum nature of the reference pulse(s) and quantifies the limit at which the theoretical key rate approaches that of the respective conventional protocol that requires local oscillator transmission. The self-referenced protocol greatly simplifies the hardware required for CV-QKD, especially for potential integrated photonics implementations of transmitters and receivers, with minimum sacrifice of performance. As such, it provides a pathway towards scalable integrated CV-QKD transceivers, a vital step towards large-scale QKD networks.
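The reference-pulse phase estimation and compensation described above can be illustrated numerically; the amplitudes, noise level, and drift value below are invented, and this toy model ignores shot noise, channel loss, and all security aspects:

```python
import math, random

# Toy illustration (not the paper's protocol): a bright reference pulse with
# known phase 0 travels with each signal pulse; the receiver estimates the
# channel's phase drift from the reference quadratures and de-rotates the
# signal quadratures accordingly.
random.seed(1)
theta = 0.7  # unknown channel phase drift (radians), invented value

def rotate(x, p, phi):
    """Rotate quadratures (x, p) by angle phi."""
    return (x * math.cos(phi) - p * math.sin(phi),
            x * math.sin(phi) + p * math.cos(phi))

# Reference pulse prepared at (x, p) = (A, 0); small measurement noise added.
A = 50.0
rx, rp = rotate(A, 0.0, theta)
rx += random.gauss(0, 0.1)
rp += random.gauss(0, 0.1)
theta_hat = math.atan2(rp, rx)       # phase estimate from the reference pulse

sig = rotate(3.0, 1.0, theta)        # signal pulse sees the same drift
corrected = rotate(sig[0], sig[1], -theta_hat)
print(round(corrected[0], 2), round(corrected[1], 2))  # recovers ≈ (3.0, 1.0)
```

Because the reference pulse is much brighter than the signal, its quantum uncertainty contributes only a small residual phase error, which the paper's key-rate analysis accounts for rigorously.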

  10. BARI+: A Biometric Based Distributed Key Management Approach for Wireless Body Area Networks

    PubMed Central

    Muhammad, Khaliq-ur-Rahman Raazi Syed; Lee, Heejo; Lee, Sungyoung; Lee, Young-Koo

    2010-01-01

    Wireless body area networks (WBAN) consist of resource constrained sensing devices just like other wireless sensor networks (WSN). However, they differ from WSN in topology, scale and security requirements. Due to these differences, key management schemes designed for WSN are inefficient and unnecessarily complex when applied to WBAN. Considering the key management issue, WBAN are also different from WPAN because WBAN can use random biometric measurements as keys. We highlight the differences between WSN and WBAN and propose an efficient key management scheme, which makes use of biometrics and is specifically designed for WBAN domain. PMID:22319333

  11. Learning from Monet: A Fundamentally New Approach to Image Analysis

    NASA Astrophysics Data System (ADS)

    Falco, Charles M.

    2009-03-01

    The hands and minds of artists are intimately involved in the creative process, intrinsically making paintings complex images to analyze. In spite of this difficulty, several years ago the painter David Hockney and I identified optical evidence within a number of paintings demonstrating that artists as early as Jan van Eyck (c. 1425) used optical projections as aids for producing portions of their images. In the course of making those discoveries, Hockney and I developed new insights that are now being applied in a fundamentally new approach to image analysis. Very recent results from this new approach include identifying, from Impressionist paintings by Monet, Pissarro, Renoir and others, the precise locations where the artists stood when making a number of their paintings. The specific deviations we find when accurately comparing these examples with photographs taken from the same locations provide us with key insights into what the artists' visual skills informed them were the ways to represent these two-dimensional images of three-dimensional scenes to viewers. As will be discussed, these results also have implications for improving the representation of certain scientific data. Acknowledgment: I am grateful to David Hockney for the many invaluable insights into imaging gained from him in our collaboration.

  12. A Bayesian approach to meta-analysis of plant pathology studies.

    PubMed

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies.
    We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework.
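For the noninformative-prior, normal-effects case the abstract describes, the Bayesian posterior for the common effect has a closed form that coincides with the classical inverse-variance estimate; a minimal sketch with invented effect sizes (not the fire blight data):

```python
import math

# Hypothetical per-study log response ratios and their variances.
# With a flat prior and Normal(mu, v_i) likelihoods, the posterior for mu
# is Normal(weighted mean, pooled variance) -- matching the classical
# fixed-effect meta-analytic estimate, as the abstract notes for the
# noninformative-prior case.
effects   = [-0.42, -0.31, -0.55, -0.20]
variances = [0.04, 0.09, 0.05, 0.12]

w = [1 / v for v in variances]                       # inverse-variance weights
post_mean = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
post_sd = math.sqrt(1 / sum(w))

# 95% credibility interval under the normal posterior
lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
print(f"posterior mean {post_mean:.3f}, 95% CRI [{lo:.3f}, {hi:.3f}]")
```

Swapping the normal likelihood for a Student's t, as the paper does for Actigard, removes this closed form and typically requires MCMC sampling.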

  13. Key Parameters Evaluation for Hip Prosthesis with Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Guo, Hongqiang; Li, Dichen; Lian, Qin; Li, Xiang; Jin, Zhongmin

    2007-09-01

    Stem length and cross section are two key parameters that influence the stability and longevity of metallic hip prostheses in total hip arthroplasty (THA). In order to assess their influence on the stress and fatigue behavior of hip prostheses, a series of models with round-shaped or drum-shaped cross sections and with different stem lengths was created. These models were analyzed under both static and dynamic loading conditions with finite element analysis; the dynamic loading used in the dynamic analysis represents normal walking. The stresses on the metallic stem, cement, and adjacent bone were obtained, as was the micromotion on the cement-metal interface. Safety factors for the fatigue life of the hip prosthesis were calculated based on data obtained from the dynamic analysis. The static analysis shows that a drum-shaped cross section can decrease the displacement of the stem; that stress on a drum-shaped stem concentrates at the corner of the femoral neck and the distal part of the prosthesis, whereas stress on a round-shaped stem distributes evenly over most of the stem, with the maximum stem stress fluctuating with stem length and bottoming out for stem lengths from 80 mm to 110 mm; and that drum-shaped stems with a drum height of 8 mm generate more stress at the distal part of the stem than drum-shaped stems with a drum height of 10 mm and round stems do. The dynamic and fatigue analysis shows that the drum-shaped stem with a drum height of 10 mm and a stem length of 90 mm has the greatest safety factor and therefore the longest fatigue life.

  14. Key Impact Factors on Dam Break Fatalities

    NASA Astrophysics Data System (ADS)

    Huang, D.; Yu, Z.; Song, Y.; Han, D.; Li, Y.

    2016-12-01

    Dam failures can lead to catastrophes for human society. However, there is a lack of research on dam break fatalities, especially on the key factors that affect them. Based on the analysis of historical dam break cases, most studies have used regression analysis to explore the correlation between those factors and fatalities, but without implementing optimization to find the dominant factors. In order to understand and reduce the risk of fatalities, this study proposes a new method to select the impact factors on the fatality. It employs an improved ANN (Artificial Neural Network) combined with an LOOCV (leave-one-out cross-validation) and SFS (stepwise forward selection) approach to explore the nonlinear relationship between impact factors and life losses. It not only considers the factors that have been widely used in the literature but also introduces new factors closely involved with fatalities. Dam break cases that occurred in China from 1954 to 2013 are summarized, from which twenty-five cases are selected with comprehensive coverage of geographic position and temporal variation. Twelve impact factors are taken into account as the inputs, i.e., severity of dam break flood (SF), population at risk (PR), public understanding of dam break (UB), warning time (TW), evacuation condition (EC), weather condition during dam break (WB), dam break mode (MB), water storage (SW), building vulnerability (VB), dam break time (TB), average distance from the affected area to the dam (DD) and preventive measures by government (PG). From these, three key factors, SF, MB and TB, are chosen. The proposed method is able to extract the key factors, and the derived fatality model performs well in various types of dam break conditions.
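The SFS-plus-LOOCV selection loop can be sketched as follows; a 1-nearest-neighbour regressor stands in for the paper's ANN, and the toy data are invented (two factors, of which only the first actually drives the outcome):

```python
# Hedged sketch of stepwise forward selection with leave-one-out
# cross-validation: at each step, add the factor that most reduces LOOCV
# error, and stop when no remaining factor improves it.

def loocv_error(X, y, cols):
    """Mean squared LOOCV error of a 1-NN regressor on the chosen columns."""
    err = 0.0
    for i in range(len(X)):
        others = [j for j in range(len(X)) if j != i]
        # predict sample i from its nearest neighbour on the chosen columns
        nn = min(others, key=lambda j: sum((X[i][c] - X[j][c]) ** 2 for c in cols))
        err += (y[i] - y[nn]) ** 2
    return err / len(X)

def forward_select(X, y):
    chosen, remaining = [], list(range(len(X[0])))
    best = float("inf")
    while remaining:
        trial = min(remaining, key=lambda c: loocv_error(X, y, chosen + [c]))
        e = loocv_error(X, y, chosen + [trial])
        if e >= best:           # no improvement: stop adding factors
            break
        best, chosen = e, chosen + [trial]
        remaining.remove(trial)
    return chosen

# Toy data: y depends on factor 0 only; factor 1 is noise.
X = [[0.0, 5.1], [0.1, 1.2], [0.9, 4.8], [1.0, 0.9], [0.5, 3.3], [0.45, 0.2]]
y = [0.0, 0.1, 0.9, 1.0, 0.5, 0.45]
print(forward_select(X, y))  # expect only factor 0 to be selected
```

In the study, the candidate columns would be the twelve impact factors (SF, PR, UB, ...) and the predictor an ANN retrained at each step.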

  15. Unconditional security of time-energy entanglement quantum key distribution using dual-basis interferometry.

    PubMed

    Zhang, Zheshen; Mower, Jacob; Englund, Dirk; Wong, Franco N C; Shapiro, Jeffrey H

    2014-03-28

    High-dimensional quantum key distribution (HDQKD) offers the possibility of a high secure-key rate with high photon-information efficiency. We consider HDQKD based on the time-energy entanglement produced by spontaneous parametric down-conversion and show that it is secure against collective attacks. Its security rests upon visibility data, obtained from Franson and conjugate-Franson interferometers, that probe photon-pair frequency correlations and arrival-time correlations. From these measurements, an upper bound can be established on the eavesdropper's Holevo information by translating the Gaussian-state security analysis for continuous-variable quantum key distribution so that it applies to our protocol. We show that visibility data from just the Franson interferometer provides a weaker, but nonetheless useful, secure-key rate lower bound. To handle multiple-pair emissions, we incorporate the decoy-state approach into our protocol. Our results show that over a 200-km transmission distance in optical fiber, time-energy entanglement HDQKD could permit a 700-bit/sec secure-key rate and a photon information efficiency of 2 secure-key bits per photon coincidence in the key-generation phase using receivers with a 15% system efficiency.

  16. Elucidating the Key Role of a Lewis Base Solvent in the Formation of Perovskite Films Fabricated from the Lewis Adduct Approach.

    PubMed

    Cao, Xiaobing; Zhi, Lili; Li, Yahui; Fang, Fei; Cui, Xian; Yao, Youwei; Ci, Lijie; Ding, Kongxian; Wei, Jinquan

    2017-09-27

    High-quality perovskite films can be fabricated from Lewis acid-base adducts through molecule exchange. Substantial work is needed to fully understand the formation mechanism of the perovskite films, which helps to further improve their quality. Here, we study the formation of CH3NH3PbI3 perovskite films by introducing some dimethylacetamide into the PbI2/N,N-dimethylformamide solution. We reveal that there are three key processes during the formation of perovskite films through the Lewis acid-base adduct approach: molecule intercalation of solvent into the PbI2 lattice, molecule exchange between the solvent and CH3NH3I, and dissolution-recrystallization of the perovskite grains during annealing. The Lewis base solvents play multiple functions in the above processes. The properties of the solvent, including Lewis basicity and boiling point, play key roles in forming smooth perovskite films with large grains. We also provide some rules for choosing Lewis base additives to prepare high-quality perovskite films through the Lewis adduct approach.

  17. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
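The factor-screening step can be illustrated with a one-way ANOVA on a single molding parameter; the parameter name, its levels, and the peak areas below are invented, and the study's actual designed experiments covered several parameters jointly:

```python
import math

# Hypothetical data: extractable peak area at three barrel-temperature
# settings of an injection-molding process. A large F statistic indicates
# the parameter level shifts the extractable level.
groups = {
    "low":  [1.2, 1.4, 1.1, 1.3],
    "mid":  [1.8, 2.0, 1.9, 2.1],
    "high": [2.9, 3.1, 2.8, 3.2],
}
n = sum(len(g) for g in groups.values())
grand = sum(sum(g) for g in groups.values()) / n

# Between-group and within-group sums of squares
ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups.values())
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups.values() for x in g)

df_b, df_w = len(groups) - 1, n - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
print(round(F, 1))  # large F => this parameter correlates with the peak level
```

Extending this to several crossed molding parameters gives the multi-factor ANOVA models the paper uses to pick optimal process settings.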

  18. Requirements Analysis is Key to Realizing Increased Value from Remote Sensing

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Alexander, Timothy M.

    2001-01-01

    This note explores requirements analysis, one of the critical and very often overlooked activities that enable satellites to measure what is important and to translate observations into effective and affordable information. Recent experience at the Stennis Space Center is used to illuminate some approaches for improving requirements practice.

  19. Multi-agent system as a new approach to effective chronic heart failure management: key considerations.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin

    2013-09-01

    Given the importance of the follow-up of chronic heart failure (CHF) patients to reduce common causes of re-admission and deterioration of their status, which impose emotional and physical costs on patients and society, modern technology tools should be used to the best advantage. The aim of this article is to explain key points which should be considered in designing an appropriate multi-agent system to improve CHF management. In this literature review, articles were searched with keywords like multi-agent system, heart failure, and chronic disease management in the Science Direct, Google Scholar and PubMed databases, without regard to year of publication. Agents are an innovation in the field of artificial intelligence. Because agents are capable of solving complex and dynamic health problems, the healthcare system must take steps to make use of this technology in order to take full advantage of e-Health. Key factors in CHF management through a multi-agent system approach must be considered, such as organization and confidentiality in general, and design and architecture points in particular. Note that using agent systems from a purely technical viewpoint is associated with many problems. Hence, in delivering healthcare to CHF patients, considering social and human aspects is essential. It is obvious that identifying and resolving technical and non-technical challenges is vital to the successful implementation of this technology.

  20. Tag Content Access Control with Identity-based Key Exchange

    NASA Astrophysics Data System (ADS)

    Yan, Liang; Rong, Chunming

    2010-09-01

Radio Frequency Identification (RFID) technology, used to identify objects and users, has recently been applied to many applications such as retail and supply chains. How to prevent tag content from unauthorized readout is a core problem of RFID privacy. The hash-lock access control protocol can make a tag release its content only to a reader that knows the secret key shared between them. However, in order to obtain the shared secret key required by this protocol, the reader needs to communicate with a back-end database. In this paper, we propose to use an identity-based secret key exchange approach to generate the secret key required for the hash-lock access control protocol. With this approach, not only is the back-end database connection no longer needed, but the tag cloning problem can also be eliminated at the same time.
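The hash-lock check described in the abstract can be sketched in a few lines: the tag stores only a hash of the key (the "metaID") and releases its content when a reader presents the matching secret. This is a minimal illustration, not the paper's protocol; the class and field names are hypothetical, and the shared key is assumed to come from the identity-based exchange the authors propose.

```python
import hashlib
import secrets

def h(data: bytes) -> bytes:
    """Hash used for the lock; SHA-256 stands in for the paper's hash."""
    return hashlib.sha256(data).digest()

class Tag:
    def __init__(self, content: str, key: bytes):
        self.content = content
        self.meta_id = h(key)          # the tag never stores the key itself

    def query(self, presented_key: bytes):
        # Release content only to a reader that proves knowledge of the key.
        if h(presented_key) == self.meta_id:
            return self.content
        return None

# Shared key assumed to be derived via identity-based key exchange.
shared_key = secrets.token_bytes(16)
tag = Tag("EPC-0042", shared_key)

unlocked = tag.query(shared_key)       # matching key -> content released
blocked = tag.query(b"wrong key")      # anything else -> None
```

An unauthorized reader never sees more than `meta_id`, which (for a preimage-resistant hash) reveals nothing useful about the key.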

  1. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of its results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. 2D hybrid analysis: Approach for building three-dimensional atomic model by electron microscopy image matching.

    PubMed

    Matsumoto, Atsushi; Miyazaki, Naoyuki; Takagi, Junichi; Iwasaki, Kenji

    2017-03-23

In this study, we develop an approach termed "2D hybrid analysis" for building atomic models by image matching from electron microscopy (EM) images of biological molecules. The key advantage is that it is applicable to flexible molecules, which are difficult to analyze by the 3DEM approach. In the proposed approach, many atomic models with different conformations are first built by computer simulation. Then, simulated EM images are generated from each atomic model. Finally, they are compared with the experimental EM image. Two kinds of models are used as simulated EM images: the negative stain model and the simple projection model. Although the former is more realistic, the latter is adopted to perform faster computations. The use of the negative stain model enables decomposition of the averaged EM images into multiple projection images, each of which originates from a different conformation or orientation. We apply this approach to EM images of integrin to obtain the distribution of conformations, from which the pathway of the conformational change of the protein is deduced.
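The matching step described above (score each simulated projection against the experimental image, keep the best) can be sketched with a normalized cross-correlation score. This is a toy illustration under the assumption that images are flattened pixel lists; the conformation names and pixel values are invented.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length pixel lists:
    +1 for identical patterns, negative for anti-correlated ones."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

experimental = [0.1, 0.9, 0.8, 0.2, 0.1, 0.7]

# Hypothetical simulated projections, one per candidate conformation.
simulated = {
    "bent":     [0.0, 1.0, 0.9, 0.1, 0.0, 0.8],
    "extended": [0.9, 0.1, 0.2, 0.8, 0.9, 0.3],
}

# Keep the conformation whose projection best matches the experiment.
best = max(simulated, key=lambda k: ncc(experimental, simulated[k]))
```

Repeating this over many experimental images yields the distribution of conformations the abstract refers to.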

  3. Using Key Part-of-Speech Analysis to Examine Spoken Discourse by Taiwanese EFL Learners

    ERIC Educational Resources Information Center

    Lin, Yen-Liang

    2015-01-01

    This study reports on a corpus analysis of samples of spoken discourse between a group of British and Taiwanese adolescents, with the aim of exploring the statistically significant differences in the use of grammatical categories between the two groups of participants. The key word method extended to a part-of-speech level using the web-based…

  4. Three-party authenticated key agreements for optimal communication

    PubMed Central

    Lee, Tian-Fu; Hwang, Tzonelih

    2017-01-01

    Authenticated key agreements enable users to determine session keys, and to securely communicate with others over an insecure channel via the session keys. This study investigates the lower bounds on communications for three-party authenticated key agreements and considers whether or not the sub-keys for generating a session key can be revealed in the channel. Since two clients do not share any common secret key, they require the help of the server to authenticate their identities and exchange confidential and authenticated information over insecure networks. However, if the session key security is based on asymmetric cryptosystems, then revealing the sub-keys cannot compromise the session key. The clients can directly exchange the sub-keys and reduce the transmissions. In addition, authenticated key agreements were developed by using the derived results of the lower bounds on communications. Compared with related approaches, the proposed protocols had fewer transmissions and realized the lower bounds on communications. PMID:28355253
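The abstract's point that openly exchanged sub-keys need not compromise the session key can be illustrated with a toy Diffie-Hellman exchange: the transmitted sub-keys are public values, and the session key is derivable only with a private exponent. This is not the paper's protocol; the server-assisted authentication step is omitted and the modulus is demonstration-sized, far too small for real use.

```python
import hashlib
import secrets

p = 2**127 - 1        # toy modulus (a Mersenne prime); insecure size
g = 5                 # toy generator

a = secrets.randbelow(p - 2) + 1     # client A's private value
b = secrets.randbelow(p - 2) + 1     # client B's private value

# Sub-keys exchanged directly over the insecure channel.
sub_key_a = pow(g, a, p)
sub_key_b = pow(g, b, p)

# Each client combines the peer's public sub-key with its own secret.
session_a = pow(sub_key_b, a, p)
session_b = pow(sub_key_a, b, p)

# Both sides hold g^(ab) mod p; hash it down to a usable session key.
session_key = hashlib.sha256(str(session_a).encode()).hexdigest()
```

An eavesdropper who records `sub_key_a` and `sub_key_b` still faces the discrete-logarithm problem, which is why revealing the sub-keys saves transmissions without weakening the session key.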

  5. Using Qualitative Comparative Analysis of Key Informant Interviews in Health Services Research: Enhancing a Study of Adjuvant Therapy Use in Breast Cancer Care.

    PubMed

    McAlearney, Ann Scheck; Walker, Daniel; Moss, Alexandra D; Bickell, Nina A

    2016-04-01

Qualitative comparative analysis (QCA) is a methodology created to address causal complexity in social sciences research by preserving the objectivity of quantitative data analysis without losing the detail inherent in qualitative research. However, its use in health services research (HSR) is limited, and questions remain about its application in this context. To explore the strengths and weaknesses of using QCA for HSR. Using data from semistructured interviews conducted as part of a multiple case study about adjuvant treatment underuse among underserved breast cancer patients, findings were compared using qualitative approaches with and without QCA to identify strengths, challenges, and opportunities presented by QCA. Ninety administrative and clinical key informants were interviewed across 10 NYC-area safety net hospitals. Transcribed interviews were coded by 3 investigators using an iterative and interactive approach. Codes were calibrated for QCA, as well as examined using qualitative analysis without QCA. Relative to traditional qualitative analysis, QCA strengths include: (1) addressing causal complexity, (2) presentation of results as pathways as opposed to a list, (3) identification of necessary conditions, (4) the option of fuzzy-set calibrations, and (5) QCA-specific parameters of fit that allow researchers to compare outcome pathways. Weaknesses include: (1) few guidelines and examples exist for calibrating interview data, (2) it is not designed to create predictive models, and (3) unidirectionality. Through its presentation of results as pathways, QCA can highlight the factors most important for production of an outcome. This strength can yield unique benefits for HSR not available through other methods.
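The fuzzy-set calibration mentioned in strength (4) maps a raw score onto a 0–1 set-membership value using three researcher-chosen anchors (full non-membership, crossover, full membership). The sketch below follows the common "direct method" style, where the anchors correspond to log-odds of -3, 0 and +3 before a logistic transform; the anchor values are hypothetical.

```python
import math

def calibrate(x, full_out, crossover, full_in):
    """Fuzzy-set membership score for raw value x.
    Anchors map to log-odds -3 (full non-membership), 0 (crossover)
    and +3 (full membership); a logistic transform gives the score."""
    if x <= crossover:
        log_odds = -3.0 * (crossover - x) / (crossover - full_out)
    else:
        log_odds = 3.0 * (x - crossover) / (full_in - crossover)
    return 1.0 / (1.0 + math.exp(-log_odds))

# e.g. calibrating a 0-10 interview code with anchors 2 / 5 / 8
scores = [calibrate(v, 2, 5, 8) for v in (1, 5, 9)]
```

A value at the crossover lands exactly at 0.5, while values past the outer anchors saturate toward 0 or 1, which is what lets calibrated interview codes enter Boolean minimization.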

  6. A matter of definition--key elements identified in a discourse analysis of definitions of palliative care.

    PubMed

    Pastrana, T; Jünger, S; Ostgathe, C; Elsner, F; Radbruch, L

    2008-04-01

For more than 30 years, the term "palliative care" has been used. From the outset, the term has undergone a series of transformations in its definitions and consequently in its tasks and goals. There remains a lack of consensus on a definition. The aim of this article is to analyse the definitions of palliative care in the specialist literature and to identify the key elements of palliative care using discourse analysis, a qualitative methodology. The literature search focused on definitions of the terms 'palliative medicine' and 'palliative care' on the World Wide Web and in medical reference books in English and German. A total of 37 English and 26 German definitions were identified and analysed. Our study confirmed the lack of a consistent meaning for the investigated terms, reflecting ongoing discussion about the nature of the field among palliative care practitioners. Several common key elements were identified. Four main categories emerged from the discourse analysis of the definition of palliative care: target groups, structure, tasks and expertise. In addition, the theoretical principles and goals of palliative care were discussed and found to be key elements, with relief and prevention of suffering and improvement of quality of life as the main goals. The identified key elements can contribute to the definition of the concept 'palliative care'. Our study confirms the importance of semantic and ethical influences on palliative care that should be considered in future research on semantics in different languages.

  7. Public-Key Cryptography: A Hardware Implementation and Novel Neural Network-Based Approach

    DTIC Science & Technology

    1992-09-01

…It is in the spirit of this future that this thesis is presented. It is an in-depth study of the public-key cryptosystem. First, the mathematical basis…

  8. A global optimization approach to multi-polarity sentiment analysis.

    PubMed

    Li, Xinmiao; Li, Jing; Wu, Yukeng

    2015-01-01

Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement, while the improvement of the two-polarity sentiment analysis was the smallest. We conclude that PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method.
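The core of the approach above, a particle swarm searching jointly over a feature-count and an SVM parameter, can be sketched with a minimal PSO loop. This is not the authors' PSOGO-Senti code: the objective below is a synthetic stand-in for cross-validated classifier accuracy (peaking near 300 features and C = 10), and all hyperparameters are illustrative.

```python
import random

random.seed(7)

def objective(k, log_c):
    # Synthetic stand-in for cross-validated accuracy over
    # (number of features k, log10 of the SVM penalty C).
    return -((k - 300) / 500) ** 2 - (log_c - 1.0) ** 2

n_particles, iters = 20, 60
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia, cognitive, social
pos = [[random.uniform(10, 1000), random.uniform(-3, 3)]
       for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]                    # personal bests
gbest = max(pbest, key=lambda p: objective(*p))[:]

for _ in range(iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(*pos[i]) > objective(*pbest[i]):
            pbest[i] = pos[i][:]
            if objective(*pbest[i]) > objective(*gbest):
                gbest = pbest[i][:]

best_k, best_log_c = round(gbest[0]), gbest[1]
```

In the real method the objective would train and score an IG-filtered SVM for each candidate position, which is where nearly all of the computation goes.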

  9. Key Factors in Development of Man-Made and Natural Ecosystems

    NASA Astrophysics Data System (ADS)

    Pechurkin, N. S.

    1999-01-01

Key factors of ecosystem functioning are of the same nature for artificial and natural types. A hierarchical approach gives the opportunity to estimate the quantitative behavior of both individual links and the system as a whole. At the organismic level, we can use interactions of the studied macroorganisms (man, animal, higher plant) with selected microorganisms as key indicating factors of the organism's immune status. The most informative factor at the population/community level is the age structure of populations and relationships of domination/elimination. The integrated key factors at the ecosystem level are productivity and rates of cycling of the limiting substances. The key factors approach is of great value for growth regulation and monitoring the state of any ecosystem, including the life support system (LSS) type.

  10. Multi-Agent System as a New Approach to Effective Chronic Heart Failure Management: Key Considerations

    PubMed Central

    Mohammadzadeh, Niloofar; Rahimi, Azin

    2013-01-01

Objectives Given the importance of the follow-up of chronic heart failure (CHF) patients to reduce common causes of re-admission and deterioration of their status that lead to imposing spiritual and physical costs on patients and society, modern technology tools should be used to the best advantage. The aim of this article is to explain key points which should be considered in designing an appropriate multi-agent system to improve CHF management. Methods In this literature review, articles were searched with keywords such as multi-agent system, heart failure and chronic disease management in the Science Direct, Google Scholar and PubMed databases, without regard to year of publication. Results Agents are an innovation in the field of artificial intelligence. Because agents are capable of solving complex and dynamic health problems, to take full advantage of e-Health, the healthcare system must take steps to make use of this technology. Key factors in CHF management through a multi-agent system approach must be considered, such as organization and confidentiality in general aspects, and design and architecture points in specific aspects. Conclusions Note that using agent systems from a purely technical view is associated with many problems. Hence, in delivering healthcare to CHF patients, considering social and human aspects is essential. It is obvious that identifying and resolving technical and non-technical challenges is vital to the successful implementation of this technology. PMID:24195010

  11. An Overview of Focal Approaches of Critical Discourse Analysis

    ERIC Educational Resources Information Center

    Jahedi, Maryam; Abdullah, Faiz Sathi; Mukundan, Jayakaran

    2014-01-01

This article aims to present detailed accounts of central approaches to Critical Discourse Analysis. It focuses on the work of three prominent scholars: Fairclough's critical approach, Wodak's discourse-historical approach and Van Dijk's socio-cognitive approach. This study concludes that a combination of these three approaches can be…

  12. Understanding alternative fluxes/effluxes through comparative metabolic pathway analysis of phylum actinobacteria using a simplified approach.

    PubMed

    Verma, Mansi; Lal, Devi; Saxena, Anjali; Anand, Shailly; Kaur, Jasvinder; Kaur, Jaspreet; Lal, Rup

    2013-12-01

Actinobacteria are known for their diverse metabolism and physiology. Some are dreadful human pathogens whereas some constitute the natural flora of the human gut. Therefore, the understanding of metabolic pathways is a key feature for targeting the pathogenic bacteria without disturbing the symbiotic ones. A big challenge faced today is multiple drug resistance in Mycobacterium and other pathogens that utilize alternative fluxes/effluxes. With the availability of genome sequences, it is now feasible to conduct comparative in silico analyses. Here we present a simplified approach to compare metabolic pathways so that species-specific enzymes may be traced and engineered for future therapeutics. The analyses of four key carbohydrate metabolic pathways, i.e., glycolysis, pyruvate metabolism, the tricarboxylic acid (TCA) cycle and the pentose phosphate pathway, suggest the presence of alternative fluxes. It was found that the upper pathway of glycolysis was highly variable in the actinobacterial genomes whereas the lower glycolytic pathway was highly conserved. Likewise, the pentose phosphate pathway was well conserved, in contrast to the TCA cycle, which was found to be incomplete in the majority of actinobacteria. The clustering based on presence and absence of genes of these metabolic pathways clearly revealed that members of different genera shared identical pathways and, therefore, provided an easy method to identify the metabolic similarities/differences between pathogenic and symbiotic organisms. The analyses could identify isoenzymes and some key enzymes that were found to be missing in some pathogenic actinobacteria. The present work defines a simple approach to explore the effluxes in four metabolic pathways within the phylum Actinobacteria. The analysis clearly reflects that actinobacteria exhibit diverse routes for metabolizing substrates. The pathway comparison can help in finding enzymes that can be used as drug targets for pathogens without affecting symbiotic organisms.

  13. A sequential factorial analysis approach to characterize the effects of uncertainties for supporting air quality management

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Veawab, A.

    2013-03-01

    This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
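The factorial-analysis computation that SFA builds on can be sketched for a coded two-level design: each main or interaction effect is the mean response where the product of the coded columns is +1 minus the mean where it is -1. The 2^3 design and the synthetic response function below are illustrative, not from the study.

```python
import math
from itertools import product

# Coded 2^3 full factorial design: factors A, B, C each at -1 / +1.
runs = list(product((-1, 1), repeat=3))

def response(a, b, c):
    # Synthetic response with known true effects:
    # strong A effect, modest B and C, and an A*B interaction.
    return 10 + 4 * a + b + 0.5 * c + 2 * a * b

ys = [response(*r) for r in runs]

def effect(idx):
    """Average effect of the factor (or interaction) given by factor
    indices, e.g. (0,) for A, (0, 1) for the A*B interaction."""
    signs = [math.prod(r[i] for i in idx) for r in runs]
    hi = [y for s, y in zip(signs, ys) if s > 0]
    lo = [y for s, y in zip(signs, ys) if s < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

With this response, `effect((0,))` recovers 8.0 (twice the regression coefficient of A) and `effect((0, 1))` recovers 4.0; screening out factors with small effects before the next round of analysis is the sequencing idea SFA adds on top.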

  14. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    ERIC Educational Resources Information Center

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
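The matching step of PSA can be sketched as greedy 1:1 nearest-neighbor matching with a caliper on previously estimated propensity scores. The unit names and scores below are invented, and the scores are assumed to come from an already-fitted model such as a logistic regression.

```python
# Propensity scores for treated and control units (hypothetical).
treated = {"t1": 0.61, "t2": 0.34, "t3": 0.85}
control = {"c1": 0.58, "c2": 0.40, "c3": 0.83, "c4": 0.10}
caliper = 0.1          # maximum allowed score distance for a match

matches = {}
available = dict(control)
# Match treated units from highest score down, a common greedy order.
for unit, score in sorted(treated.items(), key=lambda kv: kv[1],
                          reverse=True):
    if not available:
        break
    partner = min(available, key=lambda c: abs(available[c] - score))
    if abs(available[partner] - score) <= caliper:
        matches[unit] = partner
        del available[partner]      # 1:1 matching without replacement
```

Outcome comparisons are then run on the matched pairs only, which is how intact groups can support causal-style inference.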

  15. Medical technology as a key driver of rising health expenditure: disentangling the relationship

    PubMed Central

    Sorenson, Corinna; Drummond, Michael; Bhuiyan Khan, Beena

    2013-01-01

    Health care spending has risen steadily in most countries, becoming a concern for decision-makers worldwide. Commentators often point to new medical technology as the key driver for burgeoning expenditures. This paper critically appraises this conjecture, based on an analysis of the existing literature, with the aim of offering a more detailed and considered analysis of this relationship. Several databases were searched to identify relevant literature. Various categories of studies (eg, multivariate and cost-effectiveness analyses) were included to cover different perspectives, methodological approaches, and issues regarding the link between medical technology and costs. Selected articles were reviewed and relevant information was extracted into a standardized template and analyzed for key cross-cutting themes, ie, impact of technology on costs, factors influencing this relationship, and methodological challenges in measuring such linkages. A total of 86 studies were reviewed. The analysis suggests that the relationship between medical technology and spending is complex and often conflicting. Findings were frequently contingent on varying factors, such as the availability of other interventions, patient population, and the methodological approach employed. Moreover, the impact of technology on costs differed across technologies, in that some (eg, cancer drugs, invasive medical devices) had significant financial implications, while others were cost-neutral or cost-saving. In light of these issues, we argue that decision-makers and other commentators should extend their focus beyond costs solely to include consideration of whether medical technology results in better value in health care and broader socioeconomic benefits. PMID:23807855

  16. Trait-based approaches in the analysis of stream fish communities

    USGS Publications Warehouse

    Frimpong, Emmanuel; Angermeier, Paul

    2010-01-01

Species traits are used to study the functional organization of fish communities for a range of reasons, from simply reducing data dimensionality to providing mechanistic explanations for observed variation in communities. Ecological and life history traits have been used to understand the basic ecology of fishes and predict (1) species and community responses to habitat and climate alteration, and (2) species extinction, species invasion, and community homogenization. Many approaches in this arena have been developed during the past three decades, but they often have not been integrated with related ecological concepts or subdisciplines, which has led to confusion in terminology. We review 102 studies of species traits and then summarize patterns in traits being used and questions being addressed with trait-based approaches. Overall, studies of fish–habitat relationships that apply habitat templates and hierarchical filters dominate our sample; the most frequently used traits are related to feeding. We define and show the relationships among key terms such as fundamental and realized niches; functional traits, performance, and fitness; tactic, trait-state, syndromes, and strategies; and guilds and functional groups. We propose accelerating research to (1) quantify trait plasticity, (2) identify traits useful for testing ecological hypotheses, (3) model habitat and biotic interactions in communities while explicitly accounting for phylogenetic relationships, (4) explore how traits control community assembly, and (5) document the importance of traits in fish–community responses to anthropogenic change and in delivering ecosystem services. Further synthesis of these topics is still needed to develop concepts, models, and principles that can unify the disparate approaches taken in trait-based analysis of fish communities, link fish community ecology to general community ecology, and inform sustainable management of ecosystems.

  17. Exploring key factors in online shopping with a hybrid model.

    PubMed

    Chen, Hsiao-Ming; Wu, Chia-Huei; Tsai, Sang-Bing; Yu, Jian; Wang, Jiangtao; Zheng, Yuxiang

    2016-01-01

Nowadays, the web increasingly influences retail sales. An in-depth analysis of consumer decision-making in the context of e-business has become an important issue for internet vendors. However, factors affecting e-business are complicated and intertwined. To stimulate online sales, understanding the key influential factors and the causal relationships among them is important. To gain more insights into this issue, this paper introduces a hybrid method that combines the Decision Making Trial and Evaluation Laboratory (DEMATEL) with the analytic network process (together called the DANP method) to find the driving factors that most influence online business. The causal graph obtained with the DEMATEL approach showed that the "online service" dimension has the highest degree of direct impact on other dimensions; thus, the internet vendor is advised to make strong efforts on service quality throughout the online shopping process. In addition, the study adopted DANP to measure the importance of key factors, among which "transaction security" proved to be the most important criterion. Hence, transaction security should be treated with top priority to boost online business. With the DANP approach, comprehensive information can be visually detected so that decision makers can focus on the root causes to develop effectual actions.
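The DEMATEL core used above can be sketched in a few steps: normalize a direct-influence matrix, form the total-relation matrix T = X(I - X)^(-1), and read off each factor's dispatched (D) and received (R) influence. The sketch approximates the matrix inverse by the convergent power series X + X^2 + X^3 + …; the three factors and their influence scores are hypothetical, not the study's data.

```python
factors = ["service", "security", "usability"]
# Hypothetical direct-influence matrix (row influences column, 0-4).
D = [[0, 3, 2],
     [2, 0, 1],
     [1, 2, 0]]

s = max(sum(row) for row in D)              # classic normalization constant
X = [[v / s for v in row] for row in D]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Total-relation matrix via the series X + X^2 + ... (converges here
# because the spectral radius of the normalized X is below 1).
T = [[0.0] * 3 for _ in range(3)]
P = [row[:] for row in X]                   # current power X^k
for _ in range(200):
    T = [[t + p for t, p in zip(tr, pr)] for tr, pr in zip(T, P)]
    P = matmul(P, X)

dispatch = [sum(row) for row in T]          # D_i: influence given
receive = [sum(col) for col in zip(*T)]     # R_i: influence received
relation = [d - r for d, r in zip(dispatch, receive)]   # D - R
cause = factors[max(range(3), key=lambda i: relation[i])]
```

Factors with positive D - R are "cause" factors (here the service dimension), which is the kind of reading that led the study to single out online service as the strongest direct driver.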

  18. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  19. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  20. Unconditional optimality of Gaussian attacks against continuous-variable quantum key distribution.

    PubMed

    García-Patrón, Raúl; Cerf, Nicolas J

    2006-11-10

A fully general approach to the security analysis of continuous-variable quantum key distribution (CV-QKD) is presented. Provided that the quantum channel is estimated via the covariance matrix of the quadratures, Gaussian attacks are shown to be optimal against all collective eavesdropping strategies. The proof is made strikingly simple by combining a physical model of measurement, an entanglement-based description of CV-QKD, and a recent powerful result on the extremality of Gaussian states [M. M. Wolf, Phys. Rev. Lett. 96, 080502 (2006); doi:10.1103/PhysRevLett.96.080502].

  1. An SSH key management system: easing the pain of managing key/user/account associations

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Betts, W.; Lauret, J.; Shiryaev, A.

    2008-07-01

Cyber security requirements for secure access to computing facilities often call for access controls via gatekeepers and the use of two-factor authentication. Using SSH keys to satisfy the two-factor authentication requirement has introduced a potentially challenging task of managing the keys and their associations with individual users and user accounts. Approaches for a facility with the simple model of one remote user corresponding to one local user would not work at facilities that require a many-to-many mapping between users and accounts on multiple systems. We will present an SSH key management system we developed, tested and deployed to address the many-to-many dilemma in the environment of the STAR experiment. We will explain its use in an online computing context and explain how it makes possible the management and tracing of group account access spread over many sub-system components (data acquisition, slow controls, trigger, detector instrumentation, etc.) without the use of shared passwords for remote logins.

  2. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the type of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  3. Simulation as a preoperative planning approach in advanced heart failure patients. A retrospective clinical analysis.

    PubMed

    Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio

    2018-05-02

Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and a zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where left ventricular assist devices have played a significant role as a bridge to transplant and, more recently, as a long-term solution for ineligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predicting or guiding further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.

  4. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  5. Work-Centered Approach to Insurgency Campaign Analysis

    DTIC Science & Technology

    2007-06-01

a constructivist or sensemaking philosophy by defining data, information, situation awareness, and situation understanding in the following manner ... present paper explores a new approach to understanding transnational insurgency movements – an approach based on a fundamental analysis of the knowledge ... country or region. By focusing at the fundamental level of knowledge creation, the resulting framework allows an understanding of insurgency

  6. Meta-Analysis for Sociology – A Measure-Driven Approach

    PubMed Central

    Roelfs, David J.; Shor, Eran; Falzon, Louise; Davidson, Karina W.; Schwartz, Joseph E.

    2013-01-01

    Meta-analytic methods are becoming increasingly important in sociological research. In this article we present an approach for meta-analysis which is especially helpful for sociologists. Conventional approaches to meta-analysis often prioritize “concept-driven” literature searches. However, in disciplines with high theoretical diversity, such as sociology, this search approach might constrain the researcher’s ability to fully exploit the entire body of relevant work. We explicate a “measure-driven” approach, in which iterative searches and new computerized search techniques are used to increase the range of publications found (and thus the range of possible analyses) and to traverse time and disciplinary boundaries. We demonstrate this measure-driven search approach with two meta-analytic projects, examining the effects of various social variables on all-cause mortality. PMID:24163498

  7. Five keys to real transformation in health care.

    PubMed

    Senzon, Simon A

    2011-11-01

Transformation in health care requires a deeply holistic approach. Natural leaders of such a transformation are the complementary and alternative medicine practitioners who already share a vision of wellness, prevention, and optimal human function. Central to this shared vision is lifestyle change for patients and practitioners. Yet, to change a lifestyle is to change a self. Assisting individuals to transform their very sense of self in order to live healthier, more fulfilling lives centered on flourishing requires several important keys. Visionary and unified leaders are the first key. Structural support through coordination of health clinics locally and nationally is the second key. This can be optimized by utilizing initiatives of the new Affordable Health Care Act, because it provides a potential impetus for deep structural changes. An expanded evidence base for multifactorial approaches to wellness lifestyles is the third key. A reorganizational orientation with an emphasis on the right timing of transformation is the fourth key. The fifth key is an Integral map, which brings together the personal, behavioral, cultural, and social domains. By utilizing such a map, one ensures that no aspect of the transformative revolution at hand slips away due to any misplaced focus, such as emphasizing only the things we can see with our eyes. By embracing the essence of transformation in terms of a wholeness to all reality, an evolutionary unifying field with interior depth and exterior expression, health care is redefined more authentically. © Mary Ann Liebert, Inc.

  8. Key populations and human rights in the context of HIV services rendition in Ghana.

    PubMed

    Laar, Amos; DeBruin, Debra

    2017-08-02

In line with its half-century-old penal code, Ghana currently criminalizes and penalizes behaviors of some key populations - populations deemed to be at higher risk of acquiring or transmitting Human Immunodeficiency Virus (HIV). Men who have sex with men (MSM) and sex workers (SWs) fit into this categorization. This paper provides an analysis of how the enactment and implementation of rights-limiting laws not only limit rights, but also amplify risk and vulnerability to HIV in key and general populations. The paper derives from a project that assessed the ethics sensitivity of key documents guiding Ghana's response to its HIV epidemic. Assessment was guided by leading frameworks from public health ethics and relevant articles from the international bill of rights. Ghana's response to its HIV epidemic does not adequately address the rights and needs of key populations. Even though the national response has achieved some public health successes, palpable efforts to address rights issues remain nascent. Ghana's guiding documents for the HIV response include no advocacy for decriminalization, depenalization or harm-reduction approaches for these key populations. The impact of rights-restricting codes on the nation's HIV epidemic is real: criminalization impedes key populations' access to HIV prevention and treatment services. Given that they are bridging populations, whatever affects the Ghanaian key populations directly affects the general population indirectly. The right to the highest attainable standard of health, without qualification, is generally acknowledged as a fundamental human right. Unfortunately, this right currently eludes the Ghanaian SW and MSM. The paper endorses decriminalization as a means of promoting this right. In the face of opposition to decriminalization, the paper proposes specific harm-reduction strategies as approaches to promote health and uplift the diminished rights of key populations. Thus the authors call on Ghana to remove impediments to

  9. Final report on key comparison CCQM-K100: Analysis of copper in ethanol

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Kakoulides, Elias; Zhu, Yanbei; Jaehrling, Reinhard; Rienitz, Olaf; Saxby, David; Phukphatthanachai, Pranee; Yafa, Charun; Labarraque, Guillaume; Cankur, Oktay; Can, Süleyman Z.; Konopelko, Leonid A.; Kustikov, Yu A.; Caciano de Sena, Rodrigo; Marques Rodrigues, Janaina; Fonseca Sarmanho, Gabriel; Fortunato de Carvalho Rocha, Werickson; dos Reis, Lindomar Augusto

    2014-01-01

The increasing share of renewable sources in countries' energy matrices is an effort to reduce dependency on crude oil and the environmental impacts associated with its use. In order to help overcome the lack of widely accepted quality standards for fuel ethanol and to guarantee its competitiveness in the international trade market, the national metrology institutes (NMIs) have been working to develop certified reference materials and measurement methods for bio-fuels. Inorganic impurities such as Cu, Na and Fe may be present in fuel ethanol, and their presence is associated with corrosion and the formation of oxide deposits in some engine parts. The key comparison CCQM-K100 was carried out under the auspices of the Inorganic Analysis Working Group (IAWG) and the coordination of the National Institute of Metrology, Quality and Technology (INMETRO). The objective of this key comparison was to compare the measurement capabilities of the participants for the determination of Cu in fuel ethanol. Ten NMIs participated in this exercise and most of them used the isotope dilution method for determining the amount of Cu. The median was chosen as the key comparison reference value (KCRV). The assigned KCRV for the Cu content was 0.3589 µg/g with a combined standard uncertainty of 0.0014 µg/g. In general, there is good agreement among the participants' results. This text appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org). The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
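A minimal sketch of the median-based KCRV computation described above. The data are hypothetical stand-ins (not the CCQM-K100 results), and the uncertainty uses the common large-sample approximation u(median) ≈ 1.2533·s/√n, which may differ from the procedure actually adopted by the IAWG.

```python
import math
import statistics

def kcrv_median(results):
    """Median of participants' results as the KCRV, with an approximate
    standard uncertainty u(median) ~ 1.2533 * s / sqrt(n)."""
    n = len(results)
    kcrv = statistics.median(results)
    u_kcrv = 1.2533 * statistics.stdev(results) / math.sqrt(n)
    return kcrv, u_kcrv

def degrees_of_equivalence(results, kcrv):
    """Degree of equivalence d_i = x_i - KCRV for each participant."""
    return [x - kcrv for x in results]

# Hypothetical Cu mass fractions in ug/g -- NOT the CCQM-K100 data.
cu = [0.3571, 0.3580, 0.3585, 0.3589, 0.3590, 0.3593, 0.3598, 0.3602]
kcrv, u = kcrv_median(cu)
d = degrees_of_equivalence(cu, kcrv)
```

The median is preferred over the mean here because it is robust to a single discrepant participant result.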

  10. Transformation of Learning in Education and Training: Key Qualifications Revisited. CEDEFOP Reference Series.

    ERIC Educational Resources Information Center

    Kamarainen, Pekka, Ed.; Attwell, Graham, Ed.; Brown, Alan, Ed.

    This book contains 15 papers examining European approaches to the theme of key qualifications. The following papers are included: "Key Qualifications Revisited: An Introduction" (Pekka Kamarainen); "Exploring Key Qualifications: Context, Theory, and Practice in Europe" (Pekka Kamarainen); "Rethinking Key Qualifications:…

  11. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review.

    PubMed

    Viglizzo, E F; Jobbágy, E G; Ricard, M F; Paruelo, J M

    2016-08-15

Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case with afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition patterns. The probable implications of future land-use and climate change for the production and partition of some key ES are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. DOD Financial Management: Effect of Continuing Weaknesses on Management and Operations and Status of Key Challenges

    DTIC Science & Technology

    2014-05-13

the information needed to effectively (1) manage its assets, (2) assess program performance and make budget decisions, (3) make cost-effective ... decision making ... incorporating key elements of a comprehensive management approach, such as a complete analysis of the return on investment, quantitatively defined goals

  13. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  14. Multicomponent reactions provide key molecules for secret communication.

    PubMed

    Boukis, Andreas C; Reiter, Kevin; Frölich, Maximiliane; Hofheinz, Dennis; Meier, Michael A R

    2018-04-12

    A convenient and inherently more secure communication channel for encoding messages via specifically designed molecular keys is introduced by combining advanced encryption standard cryptography with molecular steganography. The necessary molecular keys require large structural diversity, thus suggesting the application of multicomponent reactions. Herein, the Ugi four-component reaction of perfluorinated acids is utilized to establish an exemplary database consisting of 130 commercially available components. Considering all permutations, this combinatorial approach can unambiguously provide 500,000 molecular keys in only one synthetic procedure per key. The molecular keys are transferred nondigitally and concealed by either adsorption onto paper, coffee, tea or sugar as well as by dissolution in a perfume or in blood. Re-isolation and purification from these disguises is simplified by the perfluorinated sidechains of the molecular keys. High resolution tandem mass spectrometry can unequivocally determine the molecular structure and thus the identity of the key for a subsequent decryption of an encoded message.

  15. Security of Color Image Data Designed by Public-Key Cryptosystem Associated with 2D-DWT

    NASA Astrophysics Data System (ADS)

    Mishra, D. C.; Sharma, R. K.; Kumar, Manish; Kumar, Kuldeep

    2014-08-01

The security of image data is a major contemporary concern, so we propose a novel technique for securing color image data with a public-key (asymmetric) cryptosystem. In this technique, we secure color image data using the RSA (Rivest-Shamir-Adleman) cryptosystem together with the two-dimensional discrete wavelet transform (2D-DWT). Earlier schemes for the security of color images were designed on the basis of keys alone, whereas this approach secures color images through both the keys and the correct arrangement of the RSA parameters. If an attacker knows the exact keys but has no information about the exact arrangement of the RSA parameters, the original information cannot be recovered from the encrypted data. Computer simulations based on a standard example critically examine the behavior of the proposed technique. A security analysis and a detailed comparison between earlier schemes for the security of color images and the proposed technique are also presented to demonstrate the robustness of the cryptosystem.
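The abstract gives no implementation details, so the following is a deliberately tiny sketch of the general idea only: transform the image data with a DWT, then encrypt the coefficients with RSA. It uses a one-level integer Haar transform (the simplest DWT) on a single pixel row and textbook RSA with toy primes; this is insecure and purely illustrative, not the authors' scheme.

```python
P, Q = 61, 53        # toy primes: N = 3233 (insecure, illustration only)
N = P * Q
E, D = 17, 2753      # textbook RSA exponents, E*D = 1 mod phi(N) = 3120

def haar_dwt_1d(x):
    """One-level integer Haar (S-)transform: approximation and detail."""
    a = [(x[i] + x[i + 1]) // 2 for i in range(0, len(x), 2)]
    d = [x[i] - x[i + 1] for i in range(0, len(x), 2)]
    return a, d

def haar_idwt_1d(a, d):
    """Exact integer inverse of haar_dwt_1d."""
    out = []
    for ai, di in zip(a, d):
        x0 = ai + ((di + 1) // 2)
        out += [x0, x0 - di]
    return out

def rsa_encrypt(m):
    return pow(m + 256, E, N)   # +256 shift makes negative details non-negative

def rsa_decrypt(c):
    return pow(c, D, N) - 256

def encrypt_signal(x):
    a, d = haar_dwt_1d(x)
    return [rsa_encrypt(v) for v in a], [rsa_encrypt(v) for v in d]

def decrypt_signal(ca, cd):
    return haar_idwt_1d([rsa_decrypt(c) for c in ca],
                        [rsa_decrypt(c) for c in cd])
```

A real system would use a 2-D wavelet decomposition over all color channels and RSA moduli of thousands of bits.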

  16. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    NASA Astrophysics Data System (ADS)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management helps ensure that life-saving efforts are not wasted. The aim of this article is to examine the relationship between approach, decision maker, influence factor, result, and ethic and decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were studied based on the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethic have a significant and direct effect on decision making during a disaster. The results from this study showed that decision-making during a disaster is an important element of disaster management and necessitates successful collaborative decision-making. The measurement model is accepted for further analysis, known as Structural Equation Modeling (SEM), and can be assessed in future research.

  17. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions at a target support and the prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. The suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support and variable-support data fusion problems in spatial analysis and modeling.
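A minimal one-dimensional sketch of the linear Gaussian inverse model underlying this kind of change-of-support analysis: block observations are averages of point values (y = Hz + ε), and the posterior mean downscales the blocks back to point support. The exponential covariance model, the window-free formulation, and all numbers are simplifying assumptions, not the authors' local-window implementation.

```python
import math

def solve(M, b):
    """Gaussian elimination with partial pivoting for M x = b."""
    n = len(b)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def exp_cov(dist, sill=1.0, corr_len=10.0):
    """Exponential covariance model for the point support."""
    return sill * math.exp(-dist / corr_len)

def downscale(y_blocks, blocks, xs, mu=0.0, nugget=1e-6):
    """Posterior mean of point values z given block averages y, under the
    linear Gaussian inverse model y = H z + noise:
        z_hat = mu + Q H^T (H Q H^T + R)^(-1) (y - H mu)."""
    n, m = len(xs), len(blocks)
    Q = [[exp_cov(abs(xs[i] - xs[j])) for j in range(n)] for i in range(n)]
    H = [[1.0 / len(b) if j in b else 0.0 for j in range(n)] for b in blocks]
    HQ = [[sum(H[k][i] * Q[i][j] for i in range(n)) for j in range(n)]
          for k in range(m)]
    S = [[sum(HQ[k][i] * H[l][i] for i in range(n)) + (nugget if k == l else 0.0)
          for l in range(m)] for k in range(m)]
    w = solve(S, [y_blocks[k] - mu for k in range(m)])  # H mu = mu: rows sum to 1
    return [mu + sum(HQ[k][j] * w[k] for k in range(m)) for j in range(n)]
```

Because the observation noise (nugget) is small, re-aggregating the downscaled points recovers the block averages, which is exactly the mass-balance property the change-of-support problem demands.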

  18. Using Qualitative Comparative Analysis (QCA) of Key Informant Interviews in Health Services Research: Enhancing a Study of Adjuvant Therapy Use in Breast Cancer Care

    PubMed Central

    McAlearney, Ann Scheck; Walker, Daniel; Moss, Alexandra DeNardis; Bickell, Nina A.

    2015-01-01

Background Qualitative Comparative Analysis (QCA) is a methodology created to address causal complexity in social sciences research by preserving the objectivity of quantitative data analysis without losing detail inherent in qualitative research. However, its use in health services research (HSR) is limited, and questions remain about its application in this context. Objective To explore the strengths and weaknesses of using QCA for HSR. Research Design Using data from semi-structured interviews conducted as part of a multiple case study about adjuvant treatment underuse among underserved breast cancer patients, findings were compared using qualitative approaches with and without QCA to identify strengths, challenges, and opportunities presented by QCA. Subjects Ninety administrative and clinical key informants were interviewed across ten NYC area safety net hospitals. Measures Transcribed interviews were coded by three investigators using an iterative and interactive approach. Codes were calibrated for QCA, as well as examined using qualitative analysis without QCA. Results Relative to traditional qualitative analysis, QCA strengths include: (1) addressing causal complexity, (2) presentation of results as pathways rather than a list, (3) identification of necessary conditions, (4) the option of fuzzy-set calibrations, and (5) QCA-specific parameters of fit that allow researchers to compare outcome pathways. Weaknesses include: (1) few guidelines and examples exist for calibrating interview data, (2) not designed to create predictive models, and (3) unidirectionality. Conclusions Through its presentation of results as pathways, QCA can highlight factors most important for production of an outcome. This strength can yield unique benefits for HSR not available through other methods. PMID:26908085
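The core object of crisp-set QCA is the truth table, which groups cases by their configuration of binary conditions and scores each configuration's consistency with the outcome. A minimal sketch follows; the condition names and calibrated codes are invented for illustration, not the study's actual data.

```python
def truth_table(cases, conditions, outcome):
    """Build a crisp-set QCA truth table: group cases by their configuration
    of binary conditions and compute each row's consistency with the outcome."""
    rows = {}
    for case in cases:
        config = tuple(case[c] for c in conditions)
        rows.setdefault(config, []).append(case[outcome])
    return {config: {"n": len(out), "consistency": sum(out) / len(out)}
            for config, out in rows.items()}

# Hypothetical calibrated interview codes (1 = condition present).
cases = [
    {"tumor_board": 1, "tracking": 1, "underuse": 0},
    {"tumor_board": 1, "tracking": 1, "underuse": 0},
    {"tumor_board": 1, "tracking": 0, "underuse": 1},
    {"tumor_board": 0, "tracking": 0, "underuse": 1},
    {"tumor_board": 0, "tracking": 1, "underuse": 1},
]
table = truth_table(cases, ["tumor_board", "tracking"], "underuse")
```

Rows with consistency at or above a chosen threshold are then passed to Boolean minimization to yield the solution pathways the abstract describes.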

  19. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, we use the technique to analyze the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results are compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the types of results obtained by the two methods. Additionally, to demonstrate the advantages of the power flow method and to show that power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.

  20. The identification of key genes and pathways in hepatocellular carcinoma by bioinformatics analysis of high-throughput data.

    PubMed

    Zhang, Chaoyang; Peng, Li; Zhang, Yaqin; Liu, Zhaoyang; Li, Wenling; Chen, Shilian; Li, Guancheng

    2017-06-01

Liver cancer is a serious threat to public health and has a fairly complicated pathogenesis. Therefore, the identification of key genes and pathways is of much importance for clarifying the molecular mechanisms of hepatocellular carcinoma (HCC) initiation and progression. An HCC-associated gene expression dataset was downloaded from the Gene Expression Omnibus database. The statistical software R was used for significance analysis of differentially expressed genes (DEGs) between liver cancer samples and normal samples. Gene Ontology (GO) term enrichment analysis and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis, based on R, were applied to identify pathways in which DEGs were significantly enriched. Cytoscape software was used for the construction of a protein-protein interaction (PPI) network and for module analysis to find the hub genes and key pathways. Finally, weighted correlation network analysis (WGCNA) was conducted to further screen critical gene modules with similar expression patterns and explore their biological significance. Significance analysis identified 1230 DEGs with fold change >2, including 632 significantly down-regulated DEGs and 598 significantly up-regulated DEGs. GO term enrichment analysis suggested that up-regulated DEGs were significantly enriched in immune response, cell adhesion, cell migration, type I interferon signaling pathway, and cell proliferation, and that down-regulated DEGs were mainly enriched in response to endoplasmic reticulum stress and the endoplasmic reticulum unfolded protein response. KEGG pathway analysis found DEGs significantly enriched in five pathways: complement and coagulation cascades, focal adhesion, ECM-receptor interaction, antigen processing and presentation, and protein processing in the endoplasmic reticulum. The top 10 hub genes in HCC, as determined from the PPI network, were GMPS, ACACA, ALB, TGFB1, KRAS, ERBB2, BCL2, EGFR, STAT3, and CD8A. The top 3 gene interaction modules in the PPI network enriched
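A sketch of the DEG-calling step described above: genes pass if they exceed the fold-change cutoff and survive multiple-testing correction. Benjamini-Hochberg is used here as a standard FDR procedure; the paper's exact R workflow may differ, and the gene names and values below are hypothetical.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR alpha (BH step-up)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    return set(order[:k])

def call_degs(genes, log2fc, pvals, fc_cutoff=1.0, alpha=0.05):
    """DEGs = |log2 fold change| > cutoff AND BH-adjusted significance;
    a cutoff of 1.0 on log2 scale corresponds to fold change > 2."""
    passed = benjamini_hochberg(pvals, alpha)
    up, down = [], []
    for i, g in enumerate(genes):
        if i in passed and abs(log2fc[i]) > fc_cutoff:
            (up if log2fc[i] > 0 else down).append(g)
    return up, down
```

The up- and down-regulated lists produced this way are what feed the GO and KEGG enrichment steps.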

  1. [Key physical parameters of hawthorn leaf granules by stepwise regression analysis method].

    PubMed

    Jiang, Qie-Ying; Zeng, Rong-Gui; Li, Zhe; Luo, Juan; Zhao, Guo-Wei; Lv, Dan; Liao, Zheng-Gen

    2017-05-01

The purpose of this study was to investigate the effect of key physical properties of hawthorn leaf granules on their dissolution behavior. Hawthorn leaf extract was utilized as a model drug. The extract was mixed with microcrystalline cellulose or starch in the same ratio using different methods. An appropriate amount of lubricant and disintegrating agent was added to part of the mixed powder, and the granules were then prepared using extrusion granulation and high-shear granulation. The granules' dissolution behavior was evaluated using the equilibrium dissolution quantity and the dissolution rate constant of hypericin as indicators. The effect of the physical properties on dissolution behavior was then analyzed through the stepwise regression analysis method. The equilibrium dissolution quantity of hypericin in hawthorn leaves and the adsorption heat constant were positively correlated with the monolayer adsorption capacity and negatively correlated with the moisture absorption rate constant. The dissolution rate constants decreased with increasing Hausner ratio, monolayer adsorption capacity and adsorption heat constant, and increased with increasing Carr index and specific surface area. The adsorption heat constant, monolayer adsorption capacity, moisture absorption rate constant, Carr index and specific surface area were the key physical properties of hawthorn leaf granules affecting their dissolution behavior. Copyright© by the Chinese Pharmaceutical Association.
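A forward-selection sketch of the stepwise idea: predictors are added one at a time while they still improve R². The study's stepwise procedure (and its entry/exit criteria) may differ, and the data below are hypothetical.

```python
def solve(M, b):
    """Gaussian elimination with partial pivoting for M x = b."""
    n = len(b)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def ols_r2(X, y):
    """R^2 of the least-squares fit of y on [1, X] via normal equations."""
    n, p = len(y), len(X[0]) + 1
    A = [[1.0] + list(row) for row in X]
    ata = [[sum(A[r][i] * A[r][j] for r in range(n)) for j in range(p)]
           for i in range(p)]
    aty = [sum(A[r][i] * y[r] for r in range(n)) for i in range(p)]
    beta = solve(ata, aty)
    yhat = [sum(A[r][i] * beta[i] for i in range(p)) for r in range(n)]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def forward_stepwise(predictors, y, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor that most
    improves R^2, stopping when the best gain falls below min_gain."""
    chosen, r2 = [], 0.0
    remaining = list(predictors)
    while remaining:
        scored = []
        for name in remaining:
            cols = [predictors[nm] for nm in chosen + [name]]
            X = [list(row) for row in zip(*cols)]
            scored.append((ols_r2(X, y), name))
        best_r2, best = max(scored)
        if best_r2 - r2 < min_gain:
            break
        chosen.append(best)
        remaining.remove(best)
        r2 = best_r2
    return chosen, r2

# Hypothetical granule data: dissolution rate driven by Carr index only.
carr = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45]
hausner = [1.5, 1.1, 1.4, 1.2, 1.3, 1.1, 1.2, 1.4]
k_diss = [2.0 * c for c in carr]
chosen, r2 = forward_stepwise({"carr": carr, "hausner": hausner}, k_diss)
```

A full stepwise procedure also re-tests already-entered predictors for removal; the forward pass above captures the selection logic the abstract relies on.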

  2. LACIE analyst interpretation keys

    NASA Technical Reports Server (NTRS)

    Baron, J. G.; Payne, R. W.; Palmer, W. F. (Principal Investigator)

    1979-01-01

    Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.

  3. Identification of the key regulating genes of diminished ovarian reserve (DOR) by network and gene ontology analysis.

    PubMed

    Pashaiasl, Maryam; Ebrahimi, Mansour; Ebrahimie, Esmaeil

    2016-09-01

Diminished ovarian reserve (DOR) is one of the causes of infertility, affecting not only older but also young women. Ovarian reserve assessment can be used as a new prognostic tool for infertility treatment decision making. Here, up- and down-regulated gene expression profiles of granulosa cells were analysed to generate a putative interaction map of the involved genes. In addition, gene ontology (GO) analysis was used to gain insight into the biological processes and molecular functions of the proteins involved in DOR. Eleven up-regulated genes and nine down-regulated genes were identified and assessed by constructing interaction networks based on their biological processes. PTGS2, CTGF, LHCGR, CITED, SOCS2, STAR and FSTL3 were the key nodes in the up-regulated networks, while the IGF2, AMH, GREM, and FOXC1 proteins were key in the down-regulated networks. MIRN101-1, MIRN153-1 and MIRN194-1 inhibited the expression of SOCS2, while CSH1 and BMP2 positively regulated IGF1 and IGF2. Ossification, ovarian follicle development, vasculogenesis, sequence-specific DNA-binding transcription factor activity, and the golgi apparatus are the major differential groups between up-regulated and down-regulated genes in DOR. Meta-analysis of publicly available transcriptomic data highlighted the high coexpression of CTGF, connective tissue growth factor, with the other key regulators of DOR. CTGF is involved in organ senescence and the focal adhesion pathway according to GO analysis. These findings provide a comprehensive systems-biology-based insight into the aetiology of DOR through network and gene ontology analyses.
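Identifying key nodes in such interaction networks is commonly done with simple network statistics such as degree centrality. A minimal sketch; the edges below are hypothetical stand-ins involving genes the abstract names, not the study's actual network.

```python
from collections import Counter

def hub_nodes(edges, top=3):
    """Rank nodes by degree centrality in an undirected interaction network."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return [node for node, _ in deg.most_common(top)]

# Hypothetical interaction edges among genes named in the abstract.
edges = [("CTGF", "PTGS2"), ("CTGF", "LHCGR"), ("CTGF", "STAR"),
         ("CTGF", "IGF2"), ("IGF2", "IGF1"), ("IGF2", "AMH"),
         ("SOCS2", "CITED"), ("SOCS2", "STAR")]
hubs = hub_nodes(edges, top=2)
```

Richer analyses weight this with betweenness or closeness centrality, but high degree is usually the first signal that a node such as CTGF is a candidate key regulator.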

  4. Newborn Survival Case Study in Rwanda - Bottleneck Analysis and Projections in Key Maternal and Child Mortality Rates Using Lives Saved Tool (LiST).

    PubMed

    Khurmi, Manpreet Singh; Sayinzoga, Felix; Berhe, Atakilt; Bucyana, Tatien; Mwali, Assumpta Kayinamura; Manzi, Emmanuel; Muthu, Maharajan

    2017-01-01

The Newborn Survival Case Study in Rwanda provides an analysis of the newborn health and survival situation in the country. It reviews evidence-based interventions and coverage levels already implemented in the country; identifies key issues and bottlenecks in service delivery and in the uptake of services by communities/beneficiaries; and provides key recommendations aimed at faster reduction of the newborn mortality rate. This study utilized mixed-methods research, including qualitative and quantitative analyses of various maternal and newborn health programs implemented in the country. This included interviewing key stakeholders at each level, field visits, and interviewing beneficiaries to assess the uptake of services. Monitoring systems such as the Health Management Information System (HMIS) and maternal and newborn death audits were reviewed and their data analyzed to aid these analyses. Policies, protocols, and various guidelines and tools for monitoring are already in place; however, their implementation remains a challenge, e.g. infection control practices to reduce deaths due to sepsis. Although existing staff are knowledgeable and highly motivated, the shortage of health personnel, especially doctors, is an issue. New facilities are being operationalized (e.g. at Gisenyi); however, the existing facilities need expansion. It is essential to implement high-impact evidence-based interventions, but coverage levels need to be sufficiently high in order to achieve a greater reduction in the newborn mortality rate. An equity approach should be considered in planning so that services are better implemented and the poor and needy can get the benefits of public health programs.

  5. Setting objectives for managing Key deer

    USGS Publications Warehouse

    Diefenbach, Duane R.; Wagner, Tyler; Stauffer, Glenn E.

    2014-01-01

The U.S. Fish and Wildlife Service (FWS) is responsible for the protection and management of Key deer (Odocoileus virginianus clavium) because the species is listed as Endangered under the Endangered Species Act (ESA). The purpose of the ESA is to protect and recover imperiled species and the ecosystems upon which they depend. There are a host of actions that could possibly be undertaken to recover the Key deer population, but without a clearly defined problem and stated objectives it can be difficult to compare and evaluate alternative actions. In addition, management goals and the acceptability of alternative management actions are inherently linked to stakeholders, who should be engaged throughout the process of developing a decision framework. The purpose of this project was to engage a representative group of stakeholders to develop a problem statement that captured the management problem the FWS must address with Key deer and identify objectives that, if met, would help solve the problem. In addition, the objectives were organized in a hierarchical manner (i.e., an objectives network) to show how they are linked, and measurable attributes were identified for each objective. We organized a group of people who represented stakeholders interested in and potentially affected by the management of Key deer. These stakeholders included individuals who represented local, state, and federal governments, non-governmental organizations, the general public, and local businesses. This stakeholder group met for five full days over the course of an eight-week period to identify objectives that would address the following problem: “As recovery and removal from the Endangered Species list is the purpose of the Endangered Species Act, the U.S. Fish and Wildlife Service needs a management approach that will ensure a sustainable, viable, and healthy Key deer population. Urbanization has affected the behavior and population dynamics of the Key deer and the amount and characteristics

  6. Air-to-air combat analysis - Review of differential-gaming approaches

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.

    1981-01-01

    The problem of evaluating the combat performance of fighter/attack aircraft is discussed, and the mathematical nature of the problem is examined. The following approaches to air combat analysis are reviewed: (1) differential-turning differential game and (2) coplanar differential game. Selected numerical examples of these approaches are presented. The relative advantages and disadvantages of each are analyzed, and it is concluded that air combat analysis is an extremely difficult mathematical problem and that no one method of approach is best for all purposes. The paper concludes with a discussion of how the two approaches might be used in a complementary manner.

  7. Computed tomography-guided aspiration versus key-hole craniotomy for spontaneous putaminal haemorrhage: a prospective comparison of minimally invasive procedures.

    PubMed

    Zhao, J Z; Zhou, L F; Zhou, D B; Wang, R Z; Wang, M; Wang, D J; Wang, S; Yuan, G; Kang, S; Ji, N; Zhao, Y L; Ye, X

    2009-08-01

To compare the effectiveness of two minimally invasive procedures, namely computed tomography-guided aspiration and the key-hole approach, in the neurosurgical management of spontaneous putaminal haemorrhage, and to explore the indications for the two approaches. A multicentre, single-blinded controlled trial. Hospitals taking part in this trial and the sources for patients were from China. Among others, the hospitals involved in the interventions included: the Beijing Tiantan Hospital (of the Capital University of Medical Sciences), the General Hospital of People's Liberation Army, the Peking Union Hospital, and the Shanghai Huashan Hospital (of the Fudan University medical school). From September 2001 to November 2003, data were available for analysis from a total of 841 patients with spontaneous putaminal haemorrhage from 135 hospitals all over China (except Tibet, Hong Kong, Taiwan, and Macao). All follow-up data were for at least 3 months. Mortality, Glasgow Coma Scale score, postoperative complications, Karnofsky Performance Scale score, and Barthel Index. There were 563 patients who underwent computed tomography-guided aspiration, and 165 were treated by the key-hole approach. Respective mortality rates 1 month after the operation were 17.9% and 18.3%; at 3 months they were 19.4% and 19.4%. In those undergoing computed tomography-guided aspiration, mortality rates at 3 months after the operation were 28.2% in patients with Glasgow Coma Scale scores of 8 or below, as opposed to 8.2% in those with higher scores. This amounted to a 3.4-fold difference. In those treated by the key-hole approach, the corresponding rates were 30.2% and 7.6%, which amounted to a 4-fold difference. The corresponding mortality at 3 months in patients with complications was 3.9 times as great as in those without complications. In those with haematoma volumes of 70 mL or greater, it was 2.7 times as much as in those in whom the volumes were below 30 mL. 
The postoperative complication rate

  8. The need to disentangle key concepts from ecosystem-approach jargon.

    PubMed

    Waylen, K A; Hastings, E J; Banks, E A; Holstead, K L; Irvine, R J; Blackstock, K L

    2014-10-01

The ecosystem approach, as endorsed by the Convention on Biological Diversity (CBD) in 2000, is a strategy for holistic, sustainable, and equitable natural resource management, to be implemented via the 12 Malawi Principles. These principles describe the need to manage nature in terms of dynamic ecosystems, while fully engaging with local peoples. It is an ambitious concept. Today, the term is common throughout the research and policy literature on environmental management. However, multiple meanings have been attached to the term, resulting in confusion. We reviewed references to the ecosystem approach from 1957 to 2012 and identified 3 primary uses: as an alternative to ecosystem management or ecosystem-based management; in reference to an integrated and equitable approach to resource management as per the CBD; and as a term signifying a focus on understanding and valuing ecosystem services. Although uses of this term and its variants may overlap in meaning, typically, they do not entirely reflect the ethos of the ecosystem approach as defined by the CBD. For example, there is presently an increasing emphasis on ecosystem services, but focusing on these alone does not promote decentralization of management or use of all forms of knowledge, both of which are integral to the CBD's concept. We highlight that the Malawi Principles are at risk of being forgotten. To better understand these principles, more effort to implement them is required. Such efforts should be evaluated, ideally with comparative approaches, before allowing the CBD's concept of holistic and socially engaged management to be abandoned or superseded. It is possible that attempts to implement all 12 principles together will face many challenges, but they may also offer a unique way to promote holistic and equitable governance of natural resources. Therefore, we believe that the CBD's concept of the ecosystem approach demands more attention. © 2014 Society for Conservation Biology.

  9. Experimental demonstration of subcarrier multiplexed quantum key distribution system.

    PubMed

    Mora, José; Ruiz-Alba, Antonio; Amaya, Waldimar; Martínez, Alfonso; García-Muñoz, Víctor; Calvo, David; Capmany, José

    2012-06-01

We provide, to our knowledge, the first experimental demonstration of the feasibility of sending several parallel keys by exploiting the technique of subcarrier multiplexing (SCM) widely employed in microwave photonics. This approach brings several advantages, such as high spectral efficiency compatible with actual secure key rates, the sharing of the faint optical pulse by all the quantum multiplexed channels, reducing the system complexity, and the possibility of upgrading with wavelength division multiplexing in a two-tier scheme to increase the number of parallel keys. Two independent quantum SCM channels featuring a sifted key rate of 10 kb/s per channel over a link with a quantum bit error rate <2% are reported.

  10. Microscopic saw mark analysis: an empirical approach.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The study presented here proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating potential sources of variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
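The classification-tree idea underlying the study above can be sketched with a minimal Gini-split tree. The features and saw classes below are hypothetical stand-ins (kerf width in mm and presence of tooth hop), not the study's variables or data:

```python
# Minimal sketch of a classification tree built by greedy Gini splits,
# in the spirit of the saw mark study. Features and labels are invented
# illustrations, NOT the authors' measurements.
from collections import Counter

def gini(labels):
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    """Exhaustively search (feature, threshold) minimizing weighted Gini."""
    best = None  # (score, feature index, threshold)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best

def build_tree(X, y, depth=0, max_depth=3):
    if len(set(y)) == 1 or depth == max_depth:
        return Counter(y).most_common(1)[0][0]   # leaf: majority class
    split = best_split(X, y)
    if split is None:
        return Counter(y).most_common(1)[0][0]
    _, f, t = split
    li = [i for i, row in enumerate(X) if row[f] <= t]
    ri = [i for i, row in enumerate(X) if row[f] > t]
    return (f, t,
            build_tree([X[i] for i in li], [y[i] for i in li], depth + 1, max_depth),
            build_tree([X[i] for i in ri], [y[i] for i in ri], depth + 1, max_depth))

def predict(node, row):
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if row[f] <= t else right
    return node

# Hypothetical training marks: [kerf width (mm), tooth hop (0/1)]
X = [[0.8, 0], [0.9, 0], [1.6, 1], [1.7, 1], [2.4, 0], [2.5, 0]]
y = ["hand saw", "hand saw", "crosscut", "crosscut", "circular", "circular"]
tree = build_tree(X, y)
print(predict(tree, [1.65, 1]))  # falls in the crosscut region
```

A random forest repeats this construction on bootstrap samples with random feature subsets and takes a majority vote, which is how the study's outcome error rates would be estimated.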

  11. Key Topics for High-Lift Research: A Joint Wind Tunnel/Flight Test Approach

    NASA Technical Reports Server (NTRS)

    Fisher, David; Thomas, Flint O.; Nelson, Robert C.

    1996-01-01

    Future high-lift systems must achieve improved aerodynamic performance with simpler designs that involve fewer elements and reduced maintenance costs. To expeditiously achieve this, reliable CFD design tools are required. The development of useful CFD-based design tools for high lift systems requires increased attention to unresolved flow physics issues. The complex flow field over any multi-element airfoil may be broken down into certain generic component flows which are termed high-lift building block flows. In this report a broad spectrum of key flow field physics issues relevant to the design of improved high lift systems are considered. It is demonstrated that in-flight experiments utilizing the NASA Dryden Flight Test Fixture (which is essentially an instrumented ventral fin) carried on an F-15B support aircraft can provide a novel and cost effective method by which both Reynolds and Mach number effects associated with specific high lift building block flows can be investigated. These in-flight high lift building block flow experiments are most effective when performed in conjunction with coordinated ground based wind tunnel experiments in low speed facilities. For illustrative purposes three specific examples of in-flight high lift building block flow experiments capable of yielding a high payoff are described. The report concludes with a description of a joint wind tunnel/flight test approach to high lift aerodynamics research.

  12. Slips of the Typewriter Key.

    ERIC Educational Resources Information Center

    Berg, Thomas

    2002-01-01

    Presents an analysis of 500 submorphemic slips of the typewriter key that escaped the notice of authors and other proofreaders and thereby made their way into the published records of scientific research. (Author/VWL)

  13. Variability in University Students' Use of Technology: An "Approaches to Learning" Perspective

    ERIC Educational Resources Information Center

    Mimirinis, Mike

    2016-01-01

This study reports the results of a cross-case study analysis of how students' approaches to learning are demonstrated in blended learning environments. It was initially posited that approaches to learning, as key determinants of the quality of student learning outcomes, are demonstrated specifically in how students utilise technology in…

  14. Key principles to guide development of consumer medicine information--content analysis of information design texts.

    PubMed

    Raynor, David K; Dickinson, David

    2009-04-01

Effective written consumer medicines information is essential to support safe and effective medicine taking, but the wording and layout of currently provided materials do not meet patients' needs. To identify principles from the wider discipline of information design for use by health professionals when developing or assessing written drug information for patients. Six experts in information design nominated texts on best practice in information design applicable to consumer medicines information. A content analysis identified key principles that were tabulated to bring out key themes. Six texts that met the inclusion criteria were identified, and content analysis identified 4 themes: words, type, lines, and layout. Within these main themes, there were 24 subthemes. Selected principles relating to these subthemes were: use short familiar words, short sentences, and short headings that stand out from the text; use a conversational tone of voice, addressing the reader as "you"; use a large type size while retaining sufficient white space; use bullet points to organize lists; use unjustified text (ragged right) and bold, lower-case text for emphasis. Pictures or graphics do not necessarily improve a document. Applying the good information design principles identified to written consumer medicines information could support health professionals when developing and assessing drug information for patients.

  15. Optimal attacks on qubit-based Quantum Key Recycling

    NASA Astrophysics Data System (ADS)

    Leermakers, Daan; Škorić, Boris

    2018-03-01

    Quantum Key Recycling (QKR) is a quantum cryptographic primitive that allows one to reuse keys in an unconditionally secure way. By removing the need to repeatedly generate new keys, it improves communication efficiency. Škorić and de Vries recently proposed a QKR scheme based on 8-state encoding (four bases). It does not require quantum computers for encryption/decryption but only single-qubit operations. We provide a missing ingredient in the security analysis of this scheme in the case of noisy channels: accurate upper bounds on the required amount of privacy amplification. We determine optimal attacks against the message and against the key, for 8-state encoding as well as 4-state and 6-state conjugate coding. We provide results in terms of min-entropy loss as well as accessible (Shannon) information. We show that the Shannon entropy analysis for 8-state encoding reduces to the analysis of quantum key distribution, whereas 4-state and 6-state suffer from additional leaks that make them less effective. From the optimal attacks we compute the required amount of privacy amplification and hence the achievable communication rate (useful information per qubit) of qubit-based QKR. Overall, 8-state encoding yields the highest communication rates.
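The link between min-entropy and the required amount of privacy amplification can be illustrated with the standard leftover hash lemma (not the paper's tighter protocol-specific bounds): from an n-bit raw key with min-entropy H_min from the adversary's perspective, two-universal hashing extracts roughly H_min − 2·log2(1/ε) bits that are ε-close to uniform. The numbers below are illustrative assumptions, not results from the paper:

```python
# Back-of-the-envelope privacy amplification via the leftover hash lemma.
import math

def extractable_key_length(n_bits, min_entropy_rate, eps=1e-9):
    """Bits of eps-secure key extractable from n_bits of raw key."""
    h_min = min_entropy_rate * n_bits      # total min-entropy of the raw key
    ell = h_min - 2 * math.log2(1 / eps)   # leftover hash lemma bound
    return max(0, math.floor(ell))

# Illustrative numbers: 10^5 raw bits, a min-entropy rate of 0.6 after the
# optimal attack, and security parameter eps = 1e-9.
n = 100_000
ell = extractable_key_length(n, 0.6)
print(ell, ell / n)   # key length, and rate in useful key bits per raw bit
```

The communication rate quoted in the abstract (useful information per qubit) is this kind of ratio, computed with the paper's protocol-specific min-entropy bounds instead of the generic assumption used here.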

  16. Understanding key factors affecting electronic medical record implementation: a sociotechnical approach.

    PubMed

    Cucciniello, Maria; Lapsley, Irvine; Nasi, Greta; Pagliari, Claudia

    2015-07-17

    Recent health care policies have supported the adoption of Information and Communication Technologies (ICT) but examples of failed ICT projects in this sector have highlighted the need for a greater understanding of the processes used to implement such innovations in complex organizations. This study examined the interaction of sociological and technological factors in the implementation of an Electronic Medical Record (EMR) system by a major national hospital. It aimed to obtain insights for managers planning such projects in the future and to examine the usefulness of Actor Network Theory (ANT) as a research tool in this context. Case study using documentary analysis, interviews and observations. Qualitative thematic analysis drawing on ANT. Qualitative analyses revealed a complex network of interactions between organizational stakeholders and technology that helped to shape the system and influence its acceptance and adoption. The EMR clearly emerged as a central 'actor' within this network. The results illustrate how important it is to plan innovative and complex information systems with reference to (i) the expressed needs and involvement of different actors, starting from the initial introductory phase; (ii) promoting commitment to the system and adopting a participative approach; (iii) defining and resourcing new roles within the organization capable of supporting and sustaining the change and (iv) assessing system impacts in order to mobilize the network around a common goal. The paper highlights the organizational, cultural, technological, and financial considerations that should be taken into account when planning strategies for the implementation of EMR systems in hospital settings. It also demonstrates how ANT may be usefully deployed in evaluating such projects.

  17. Photo interpretation key to Michigan land cover/use

    NASA Technical Reports Server (NTRS)

    Enslin, W. R.; Hudson, W. D.; Lusch, D. P.

    1983-01-01

A set of photo interpretation keys is presented to provide a structured approach to the identification of land cover/use categories as specified in the Michigan Resource Inventory Act. The designated categories are urban and built-up lands; agricultural lands; forest land; nonforested land; water bodies; wetlands; and barren land. The keys were developed for use with medium scale (1:20,000 to 1:24,000) color infrared aerial photography. Although each key is generalized in that it relies only upon the most distinguishing photo characteristics in separating the various land cover/use categories, additional interpretation characteristics, distinguishing features and background material are given.

  18. Investigation of Pre-Service English Language Teachers' Cognitive Structures about Some Key Concepts in Approaches and Methods in Language Teaching Course through Word Association Test

    ERIC Educational Resources Information Center

    Ersanli, Ceylan Yangin

    2016-01-01

This study aims to map the cognitive structure of pre-service English language (EL) teachers about three key concepts related to approaches and methods in language teaching so as to discover their learning process and misconceptions. The study involves both qualitative and quantitative data. The researcher administered a Word Association Test…

  19. Global Sensitivity Analysis of OnGuard Models Identifies Key Hubs for Transport Interaction in Stomatal Dynamics

    PubMed Central

    Vialet-Chabrand, Silvere; Griffiths, Howard

    2017-01-01

    The physical requirement for charge to balance across biological membranes means that the transmembrane transport of each ionic species is interrelated, and manipulating solute flux through any one transporter will affect other transporters at the same membrane, often with unforeseen consequences. The OnGuard systems modeling platform has helped to resolve the mechanics of stomatal movements, uncovering previously unexpected behaviors of stomata. To date, however, the manual approach to exploring model parameter space has captured little formal information about the emergent connections between parameters that define the most interesting properties of the system as a whole. Here, we introduce global sensitivity analysis to identify interacting parameters affecting a number of outputs commonly accessed in experiments in Arabidopsis (Arabidopsis thaliana). The analysis highlights synergies between transporters affecting the balance between Ca2+ sequestration and Ca2+ release pathways, notably those associated with internal Ca2+ stores and their turnover. Other, unexpected synergies appear, including with the plasma membrane anion channels and H+-ATPase and with the tonoplast TPK K+ channel. These emergent synergies, and the core hubs of interaction that they define, identify subsets of transporters associated with free cytosolic Ca2+ concentration that represent key targets to enhance plant performance in the future. They also highlight the importance of interactions between the voltage regulation of the plasma membrane and tonoplast in coordinating transport between the different cellular compartments. PMID:28432256
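The variance-based idea behind a global sensitivity analysis like the one above can be sketched numerically: the first-order index S_i = Var(E[Y|X_i]) / Var(Y) measures how much of the output variance each parameter explains on its own. The toy model below is NOT the OnGuard model; it is a linear stand-in whose exact indices are a_i²/Σa_j², so the estimate can be checked:

```python
# Brute-force estimate of first-order (Sobol-style) sensitivity indices
# for a toy additive model; a stand-in, not the OnGuard system.
import numpy as np

rng = np.random.default_rng(0)

def model(X):
    a = np.array([4.0, 2.0, 1.0])           # stand-in parameter weights
    return X @ a

def first_order_indices(model, n_samples=200_000, n_bins=50, n_params=3):
    X = rng.uniform(-1, 1, size=(n_samples, n_params))
    Y = model(X)
    var_y = Y.var()
    S = []
    for i in range(n_params):
        # E[Y | X_i] approximated by averaging Y within bins of X_i
        bins = np.linspace(-1, 1, n_bins + 1)
        idx = np.clip(np.digitize(X[:, i], bins) - 1, 0, n_bins - 1)
        cond_mean = np.array([Y[idx == b].mean() for b in range(n_bins)])
        counts = np.array([(idx == b).sum() for b in range(n_bins)])
        S.append(np.average((cond_mean - Y.mean()) ** 2, weights=counts) / var_y)
    return np.array(S)

S = first_order_indices(model)
print(np.round(S, 3))   # close to [16/21, 4/21, 1/21]
```

For a mechanistic model like OnGuard, interactions between transporters show up as the gap between these first-order indices and total-effect indices, which is what identifies the "hubs" discussed above.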

  20. Identifying Key Hospital Service Quality Factors in Online Health Communities

    PubMed Central

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain

    2015-01-01

Background The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. Objective As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and trends over time automatically from online health communities, especially hospital-related questions and answers. Methods We defined social media–based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea’s two biggest online portals were used to test the effectiveness of detection of social media–based key quality factors for hospitals. Results To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is

  1. Identifying key hospital service quality factors in online health communities.

    PubMed

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and trends over time automatically from online health communities, especially hospital-related questions and answers. We defined social media-based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea's two biggest online portals were used to test the effectiveness of detection of social media-based key quality factors for hospitals. To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is 78% on average. Extraction and
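The F1 scores reported above combine precision and recall into a single harmonic mean; a minimal sketch with hypothetical counts from a manual evaluation of extracted hospital names:

```python
# F1 as the harmonic mean of precision and recall; counts are invented
# for illustration, not taken from the study's evaluation.
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)            # correct extractions / all extractions
    recall = tp / (tp + fn)               # correct extractions / all true names
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical: of 100 extracted names, 80 are correct (tp) and 20 wrong (fp),
# while 25 true names in the sample were missed (fn).
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=25)
print(round(p, 3), round(r, 3), round(f1, 3))
```

Averaging such per-factor F1 scores over the annotated sample is the standard way figures like "91% and 78%" are obtained.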

  2. Drama in the Key Stage 3 English Framework. Key Stage 3: National Strategy.

    ERIC Educational Resources Information Center

    Department for Education and Skills, London (England).

Effective drama teaching improves the following student skills: speaking and listening, and reading and writing, through the development of thinking, communication skills, and critical analysis. Drama is part of young people's core curriculum entitlement in the United Kingdom. It is included in the English Curriculum Orders and in the Key Stage 3 Framework for…

  3. The Value Added National Project. Technical Report: Primary 4. Value-Added Key Stage 1 to Key Stage 2.

    ERIC Educational Resources Information Center

    Tymms, Peter

    This is the fourth in a series of technical reports that have dealt with issues surrounding the possibility of national value-added systems for primary schools in England. The main focus has been on the relative progress made by students between the ends of Key Stage 1 (KS1) and Key Stage 2 (KS2). The analysis has indicated that the strength of…

  4. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

Multivariate analysis is a statistical approach commonly used in automotive diagnosis, educational evaluation, clustering in finance, and, more recently, the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
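The correlation-coefficient-matrix idea can be sketched as follows: under normal traffic the features co-vary in a stable pattern, and an attack such as a SYN flood perturbs that structure. The traffic features, distributions, and threshold below are illustrative assumptions, not from the paper:

```python
# Anomaly detection from the correlation coefficient matrix: score each
# traffic window by how far its correlation matrix drifts from a baseline.
# Feature model (packets/s, bytes/s, SYN/s) and threshold are invented.
import numpy as np

rng = np.random.default_rng(1)

def window_corr(pkts, byts, syns):
    return np.corrcoef(np.column_stack([pkts, byts, syns]), rowvar=False)

def anomaly_score(corr, baseline_corr):
    """Frobenius distance between correlation matrices."""
    return np.linalg.norm(corr - baseline_corr)

def normal_window(n=2000):
    pkts = rng.normal(1000, 50, n)               # packets/s
    byts = pkts * 900 + rng.normal(0, 5000, n)   # bytes/s tracks packets/s
    syns = rng.normal(10, 2, n)                  # modest SYN rate
    return pkts, byts, syns

baseline = window_corr(*normal_window(20_000))

# SYN-flood-like window: tiny SYN packets inflate packets/s but not bytes/s,
# so packets/s decouples from bytes/s and couples to the SYN rate instead.
pkts, byts, syns = normal_window()
flood_syns = rng.normal(5000, 2000, 2000)
score_normal = anomaly_score(window_corr(pkts, byts, syns), baseline)
score_flood = anomaly_score(window_corr(pkts + flood_syns, byts, flood_syns), baseline)
print(score_normal < 0.2 < score_flood)   # the flood window stands out
```

The point of using correlations rather than raw volumes is that the score reacts to a change in the relationship between features, not merely to heavier traffic.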

  5. Emergency management in health: key issues and challenges in the UK.

    PubMed

    Lee, Andrew C K; Phillips, Wendy; Challen, Kirsty; Goodacre, Steve

    2012-10-19

Emergency planning in the UK has grown considerably in recent years, galvanised by the threat of terrorism. However, deficiencies in NHS emergency planning were identified and the evidence-base that underpins it is questionable. Inconsistencies in terminologies and concepts also exist. Different models of emergency management exist internationally but the optimal system is unknown. This study examines the evidence-base and evidence requirements for emergency planning in the UK health context. The study involved semi-structured interviews with key stakeholders and opinion leaders. Purposive sampling was used to obtain a breadth of views from various agencies involved in emergency planning and response. Interviews were then analysed using a grounded approach with standard framework analysis techniques. We conducted 17 key informant interviews. Interviewees identified greater gaps in operational than technical aspects of emergency planning. Social and behavioural knowledge gaps were highlighted with regards to how individuals and organisations deal with risk and behave in emergencies. Evidence-based approaches to public engagement and for developing community resilience to disasters are lacking. Other gaps included how knowledge was developed and used. Conflicting views with regards to the optimal configuration and operation of the emergency management system were voiced. Four thematic categories for future research emerged: (i) Knowledge-base for emergency management: Further exploration is needed of how knowledge is acquired, valued, disseminated, adopted and retained. (ii) Social and behavioural issues: Greater understanding of how individuals approach risk and behave in emergencies is required. (iii) Organisational issues in emergencies: Several conflicting organisational issues were identified: value of planning versus plans, flexible versus standardized procedures, top-down versus bottom-up engagement, generic versus specific planning, and reactive versus

  6. The Key Roles in the Informal Organization: A Network Analysis Perspective

    ERIC Educational Resources Information Center

    de Toni, Alberto F.; Nonino, Fabio

    2010-01-01

    Purpose: The purpose of this paper is to identify the key roles embedded in the informal organizational structure (informal networks) and to outline their contribution in the companies' performance. A major objective of the research is to find and characterize a new key informal role that synthesises problem solving, expertise, and accessibility…

  7. Crew awareness as key to optimizing habitability standards onboard naval platforms: A 'back-to-basics' approach.

    PubMed

    Neelakantan, Anand; Ilankumaran, Mookkiah; Ray, Sougat

    2017-10-01

A healthy habitable environment onboard warships is vital to operational fleet efficiency and a fit sea-warrior force. Unique man-machine-armament interface issues and consequent constraints on habitability necessitate a multi-disciplinary approach toward optimizing habitability standards. Study of the basic 'human factor', including crew awareness of what determines shipboard habitability, and its association with habitation specifications, is an essential step in such an approach. The aim of this study was to assess crew awareness of shipboard habitability and the association between awareness and maintenance of optimal habitability as per specifications. A cross-sectional descriptive study was carried out among 552 naval personnel onboard warships in Mumbai. Data on crew awareness of habitability were collected using a standardized questionnaire and correlated with basic habitability requirement specifications. Data were analyzed using Microsoft Excel, Epi Info, and SPSS version 17. Awareness level on basic habitability aspects was very good in 65.3% of crew. Area-specific awareness was maximum with respect to living area (95.3%). Knowledge levels on waste management were among the lowest (65.2%) in the category of aspect-wise awareness. A statistically significant association was found between awareness levels and habitability standards (OR = 7.27). The new benchmarks set in the form of high crew awareness of basic shipboard habitability specifications, and the significant association of awareness with standards, need to be sustained. This entails reiteration of healthy habitation essentials in training and holds the key to a fit fighting force.

  8. Materials Analysis: A Key to Unlocking the Mystery of the Columbia Tragedy

    NASA Technical Reports Server (NTRS)

    Mayeaux, Brian M.; Collins, Thomas E.; Piascik, Robert S.; Russel, Richard W.; Jerman, Gregory A.; Shah, Sandeep R.; McDanels, Steven J.

    2004-01-01

Materials analyses of key forensic evidence helped unlock the mystery of the loss of space shuttle Columbia, which disintegrated on February 1, 2003 while returning from a 16-day research mission. Following an intensive four-month recovery effort by federal, state, and local emergency management and law enforcement officials, Columbia debris was collected, catalogued, and reassembled at the Kennedy Space Center. Engineers and scientists from the Materials and Processes (M&P) team formed by NASA supported Columbia reconstruction efforts, provided factual data through analysis, and conducted experiments to validate the root cause of the accident. Fracture surfaces and thermal effects of selected airframe debris were assessed, and process flows for both nondestructive and destructive sampling and evaluation of debris were developed. The team also assessed left-hand (LH) airframe components that were believed to be associated with a structural breach of Columbia. Analytical data collected by the M&P team showed that a significant thermal event occurred at the left wing leading edge in the proximity of LH reinforced carbon-carbon (RCC) panels 8 and 9. The analysis also showed exposure to temperatures in excess of 1,649 °C, which would severely degrade the support structure, tiles, and RCC panel materials. The integrated failure analysis of wing leading edge debris and deposits strongly supported the hypothesis that a breach occurred at LH RCC panel 8.

  9. Key-value store with internal key-value storage interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Ting, Dennis P. J.

A key-value store is provided having one or more key-value storage interfaces. A key-value store on at least one compute node comprises a memory for storing a plurality of key-value pairs; and an abstract storage interface comprising a software interface module that communicates with at least one persistent storage device providing a key-value interface for persistent storage of one or more of the plurality of key-value pairs, wherein the software interface module provides the one or more key-value pairs to the at least one persistent storage device in a key-value format. The abstract storage interface optionally processes one or more batch operations on the plurality of key-value pairs. A distributed embodiment for a partitioned key-value store is also provided.
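The layering described above can be sketched in a few classes, with hypothetical names standing in for the abstract storage interface and back end: a compute-node store buffers pairs and hands them, in key-value format and optionally in batches, to whatever persistent device implements the interface.

```python
# Sketch of a key-value store with an abstract key-value storage interface.
# Class names are illustrative, not from the patent's claims.
from abc import ABC, abstractmethod

class KeyValueStorageInterface(ABC):
    """Interface a persistent key-value back end must implement."""
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...
    @abstractmethod
    def put_batch(self, pairs): ...

class InMemoryBackend(KeyValueStorageInterface):
    """Stand-in 'persistent storage device'; a real one would wrap disk."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]
    def put_batch(self, pairs):          # batch operation on many pairs
        self._data.update(pairs)

class KeyValueStore:
    """Compute-node store: buffers pairs, flushes through the interface."""
    def __init__(self, backend: KeyValueStorageInterface):
        self._backend = backend
        self._buffer = {}
    def put(self, key, value):
        self._buffer[key] = value
    def flush(self):
        self._backend.put_batch(self._buffer)   # one batched call
        self._buffer = {}
    def get(self, key):
        if key in self._buffer:
            return self._buffer[key]
        return self._backend.get(key)

store = KeyValueStore(InMemoryBackend())
store.put("rank0/field", b"checkpoint bytes")
store.flush()
print(store.get("rank0/field"))
```

Because the front end depends only on the interface, back ends can be swapped without changing the store, which is the point of the abstract storage layer.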

  10. Progress in satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bedington, Robert; Arrazola, Juan Miguel; Ling, Alexander

    2017-08-01

    Quantum key distribution (QKD) is a family of protocols for growing a private encryption key between two parties. Despite much progress, all ground-based QKD approaches have a distance limit due to atmospheric losses or in-fibre attenuation. These limitations make purely ground-based systems impractical for a global distribution network. However, the range of communication may be extended by employing satellites equipped with high-quality optical links. This manuscript summarizes research and development which is beginning to enable QKD with satellites. It includes a discussion of protocols, infrastructure, and the technical challenges involved with implementing such systems, as well as a top level summary of on-going satellite QKD initiatives around the world.
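
    The distance limit from in-fibre attenuation can be made concrete with a standard figure of roughly 0.2 dB/km for telecom fibre at 1550 nm (a textbook value, not a number from this paper):

```python
def fibre_transmittance(length_km, atten_db_per_km=0.2):
    """Photon survival probability over a fibre link; 0.2 dB/km is a
    typical attenuation for telecom fibre at 1550 nm."""
    return 10 ** (-atten_db_per_km * length_km / 10)

# Loss compounds exponentially with distance:
for km in (100, 500, 1000):
    print(f"{km:5d} km: {fibre_transmittance(km):.1e}")
# At 1000 km the budget is 200 dB, i.e. ~1 photon in 10^20 survives,
# which is why a shorter (if lossy) satellite link becomes attractive.
```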

  11. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected.
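
    The partition of performance-measure uncertainty into an explainable component and a remainder can be illustrated with a toy analytic model (the response model and names below are invented for the sketch, not taken from the paper):

```python
# Toy response: Y = 3*A + B with independent, unit-variance inputs,
# so Var(Y) = 9 + 1 = 10. A candidate set of runs that would resolve
# a parameter "explains" that parameter's variance share; the rest
# remains uncertain.
coeffs = {"A": 3.0, "B": 1.0}
var_inputs = {"A": 1.0, "B": 1.0}

total_var = sum(c ** 2 * var_inputs[p] for p, c in coeffs.items())

def explained_if_learned(params):
    """Variance component removed if candidate runs resolve `params`."""
    return sum(coeffs[p] ** 2 * var_inputs[p] for p in params)

candidates = [("A",), ("B",)]
best = max(candidates, key=explained_if_learned)
print(best, explained_if_learned(best), total_var)  # ('A',) 9.0 10.0
```

    Runs that pin down A promise a 9/10 variance reduction versus 1/10 for B, so the A-resolving candidate set would be selected.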

  12. Identification of key regulators for the migration and invasion of rheumatoid synoviocytes through a systems approach

    PubMed Central

    You, Sungyong; Yoo, Seung-Ah; Choi, Susanna; Kim, Ji-Young; Park, Su-Jung; Ji, Jong Dae; Kim, Tae-Hwan; Kim, Ki-Jo; Cho, Chul-Soo; Hwang, Daehee; Kim, Wan-Uk

    2014-01-01

    Rheumatoid synoviocytes, which consist of fibroblast-like synoviocytes (FLSs) and synovial macrophages (SMs), are crucial for the progression of rheumatoid arthritis (RA). Particularly, FLSs of RA patients (RA-FLSs) exhibit invasive characteristics reminiscent of cancer cells, destroying cartilage and bone. RA-FLSs and SMs originate differently from mesenchymal and myeloid cells, respectively, but share many pathologic functions. However, the molecular signatures and biological networks representing the distinct and shared features of the two cell types are unknown. We performed global transcriptome profiling of FLSs and SMs obtained from RA and osteoarthritis patients. By comparing the transcriptomes, we identified distinct molecular signatures and cellular processes defining invasiveness of RA-FLSs and proinflammatory properties of RA-SMs, respectively. Interestingly, under the interleukin-1β (IL-1β)–stimulated condition, the RA-FLSs newly acquired proinflammatory signature dominant in RA-SMs without losing invasive properties. We next reconstructed a network model that delineates the shared, RA-FLS–dominant (invasive), and RA-SM–dominant (inflammatory) processes. From the network model, we selected 13 genes, including periostin, osteoblast-specific factor (POSTN) and twist basic helix–loop–helix transcription factor 1 (TWIST1), as key regulator candidates responsible for FLS invasiveness. Of note, POSTN and TWIST1 expressions were elevated in independent RA-FLSs and further instigated by IL-1β. Functional assays demonstrated the requirement of POSTN and TWIST1 for migration and invasion of RA-FLSs stimulated with IL-1β. Together, our systems approach to rheumatoid synovitis provides a basis for identifying key regulators responsible for pathological features of RA-FLSs and -SMs, demonstrating how a certain type of cells acquires functional redundancy under chronic inflammatory conditions. PMID:24374632

  13. Physical and Chemical Analytical Analysis: A key component of Bioforensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P

    The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular "fingerprints" of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensic science, closely aligned with other areas of "microbial forensics". This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g. between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mis-matches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question. 
Thus, apart from certain operational issues (such as how

  14. Challenges of the science data processing, analysis and archiving approach in BepiColombo

    NASA Astrophysics Data System (ADS)

    Martinez, Santa

    BepiColombo is a joint mission of the European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) to the planet Mercury. It comprises two separate orbiters: the Mercury Planetary Orbiter (MPO) and the Mercury Magnetospheric Orbiter (MMO). After approximately 7.5 years of cruise, BepiColombo will arrive at Mercury in 2024 and will gather data during a 1-year nominal mission, with a possible 1-year extension. The approach selected for BepiColombo for the processing, analysis and archiving of the science data represents a significant change with respect to previous ESA planetary missions. Traditionally Instrument Teams are responsible for processing, analysing and preparing their science data for the long-term archive, however in BepiColombo, the Science Ground Segment (SGS), located in Madrid, Spain, will play a key role in these activities. Fundamental aspects of this approach include: the involvement of the SGS in the definition, development and operation of the instrument processing pipelines; the production of ready-to-archive science products compatible with NASA’s Planetary Data System (PDS) standards in all the processing steps; the joint development of a quick-look analysis system to monitor deviations between planned and executed observations to feed back the results into the different planning cycles when possible; and a mission archive providing access to the scientific products and to the operational data throughout the different phases of the mission (from the early development phase to the legacy phase). In order to achieve these goals, the SGS will need to overcome a number of challenges. The proposed approach requires a flexible infrastructure able to cope with a distributed data processing system, residing in different locations but designed as a single entity. For this, all aspects related to the integration of software developed by different Instrument Teams and the alignment of their development schedules will need to be

  15. Quantitative methods of identifying the key nodes in the illegal wildlife trade network

    PubMed Central

    Patel, Nikkita Gunvant; Rorres, Chris; Joly, Damien O.; Brownstein, John S.; Boston, Ray; Levy, Michael Z.; Smith, Gary

    2015-01-01

    Innovative approaches are needed to combat the illegal trade in wildlife. Here, we used network analysis and a new database, HealthMap Wildlife Trade, to identify the key nodes (countries) that support the illegal wildlife trade. We identified key exporters and importers from the number of shipments a country sent and received and from the number of connections a country had to other countries over a given time period. We used flow betweenness centrality measurements to identify key intermediary countries. We found the set of nodes whose removal from the network would cause the maximum disruption to the network. Selecting six nodes would fragment 89.5% of the network for elephants, 92.3% for rhinoceros, and 98.1% for tigers. We then found sets of nodes that would best disseminate an educational message via direct connections through the network. We would need to select 18 nodes to reach 100% of the elephant trade network, 16 nodes for rhinoceros, and 10 for tigers. Although the choice of locations for interventions should be customized for the animal and the goal of the intervention, China was the most frequently selected country for network fragmentation and information dissemination. Identification of key countries will help strategize illegal wildlife trade interventions. PMID:26080413
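
    The node-removal idea behind the fragmentation figures can be sketched with a greedy search on a toy graph. The countries and links below are invented, and the study used flow betweenness centrality on the real HealthMap Wildlife Trade network; this only illustrates the mechanics.

```python
from collections import defaultdict, deque

# Toy undirected trade network (illustrative, not HealthMap data).
edges = [("CN", "VN"), ("CN", "TH"), ("CN", "US"), ("KE", "CN"),
         ("TZ", "VN"), ("TH", "MY"), ("US", "MX")]

adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def largest_component(active):
    """Size of the largest connected component among `active` nodes."""
    seen, best = set(), 0
    for start in active:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in active and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

# Greedy fragmentation: repeatedly remove the node whose removal
# leaves the smallest largest-component.
active = {n for e in edges for n in e}
removed = []
for _ in range(2):
    target = min(sorted(active),
                 key=lambda n: largest_component(active - {n}))
    active.remove(target)
    removed.append(target)
print(removed[0])  # → CN  (the hub of this toy graph)
```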

  16. Extractive waste management: A risk analysis approach.

    PubMed

    Mehta, Neha; Dino, Giovanna Antonella; Ajmone-Marsan, Franco; Lasagna, Manuela; Romè, Chiara; De Luca, Domenico Antonio

    2018-05-01

    Abandoned mine sites continue to present serious environmental hazards because the heavy metals associated with extractive waste are continuously released into the environment, where they threaten human life and the environment. Remediating and securing extractive waste are complex, lengthy and costly processes. Thus, in most European countries, a site is considered for intervention when it poses a risk to human health and the surrounding environment. As a consequence, risk analysis presents a viable decisional approach towards the management of extractive waste. To evaluate the effects posed by extractive waste to human health and groundwater, a risk analysis approach was used for an abandoned nickel extraction site in Campello Monti in North Italy. This site is located in the Southern Italian Alps. The area consists of large and voluminous mafic rocks intruded by mantle peridotite. The mining activities in this area have generated extractive waste. A risk analysis of the site was performed using Risk Based Corrective Action (RBCA) guidelines, considering the properties of extractive waste and water for the properties of environmental matrices. The results showed the presence of carcinogenic risk due to arsenic and risks to groundwater due to nickel. The results of the risk analysis form a basic understanding of the current situation at the site, which is affected by extractive waste. Copyright © 2017 Elsevier B.V. All rights reserved.
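
    The kind of screening computation behind an RBCA-style carcinogenic-risk figure uses the standard chronic-daily-intake times slope-factor formula. Every parameter value below is an assumption chosen for illustration, not data from the Campello Monti study:

```python
# Illustrative screening calculation for ingestion of arsenic in
# water (RBCA-style). All values are assumed, not from the study.
C  = 0.05      # arsenic concentration in water, mg/L
IR = 2.0       # water ingestion rate, L/day
EF = 350       # exposure frequency, days/year
ED = 30        # exposure duration, years
BW = 70        # body weight, kg
AT = 70 * 365  # averaging time for carcinogens, days
SF = 1.5       # oral slope factor for arsenic, (mg/kg-day)^-1

cdi = (C * IR * EF * ED) / (BW * AT)  # chronic daily intake, mg/kg-day
risk = cdi * SF                       # incremental lifetime cancer risk
print(f"{risk:.2e}", risk > 1e-6)     # exceeds the 1e-6 screening level
```

    A computed risk above the common 1e-6 to 1e-4 screening range is what flags a site, like this one, for intervention.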

  17. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parry, G. W.; Forester, J. A.; Dang, V. N.

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
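
    As generic reliability arithmetic (not the IDHEAS quantification itself): once the time-line identifies critical tasks whose individual failure each produces the HFE, the HFE probability follows from the series-system formula. The task probabilities below are invented.

```python
# Illustrative only: three critical tasks on the crew response
# time-line; failure of any one yields the human failure event (HFE).
crit_task_fail_probs = [0.01, 0.005, 0.02]  # assumed values

p_no_failure = 1.0
for p in crit_task_fail_probs:
    p_no_failure *= (1.0 - p)   # all critical tasks succeed
p_hfe = 1.0 - p_no_failure      # at least one critical task fails
print(round(p_hfe, 6))  # → 0.034651
```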

  18. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, and military operations. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory. PMID:26928627
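
    The co-occurrence idea can be sketched minimally: a soft set maps each parameter (e.g. "supports issue i") to the set of agents possessing it, and counting agents shared by two parameters gives an agreement/conflict measure without decision rules. The voting data below is invented, not the paper's Indonesian Parliament dataset.

```python
# Soft set over a universe of agents: parameter -> supporting agents.
soft_set = {
    "issue1+": {"a1", "a2", "a3"},
    "issue2+": {"a1", "a3"},
    "issue3+": {"a4"},
}

def cooccurrence(e1, e2):
    """Number of agents possessing both parameters."""
    return len(soft_set[e1] & soft_set[e2])

print(cooccurrence("issue1+", "issue2+"))  # 2 shared supporters: agreement
print(cooccurrence("issue1+", "issue3+"))  # 0 shared: disjoint camps
```

    Each co-occurrence count is one set intersection, which is why this avoids the rule-induction cost of the rough-set route.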

  19. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, and military operations. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory.

  20. Nonlinear Analysis for High-temperature Composites: Turbine Blades/vanes

    NASA Technical Reports Server (NTRS)

    Hopkins, D. A.; Chamis, C. C.

    1984-01-01

    An integrated approach to nonlinear analysis of high-temperature composites in turbine blade/vane applications is presented. The overall strategy of this approach and the key elements comprising this approach are summarized. Preliminary results for a tungsten-fiber-reinforced superalloy (TFRS) composite are discussed.

  1. Extended analysis of the Trojan-horse attack in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Vinay, Scott E.; Kok, Pieter

    2018-04-01

    The discrete-variable quantum key distribution protocols based on the 1984 protocol of Bennett and Brassard (BB84) are known to be secure against an eavesdropper, Eve, intercepting the flying qubits and performing any quantum operation on them. However, these protocols may still be vulnerable to side-channel attacks. We investigate the Trojan-horse side-channel attack where Eve sends her own state into Alice's apparatus and measures the reflected state to estimate the key. We prove that the separable coherent state is optimal for Eve among the class of multimode Gaussian attack states, even in the presence of thermal noise. We then provide a bound on the secret key rate in the case where Eve may use any separable state.

  2. Florida Keys

    NASA Image and Video Library

    2002-12-13

    The Florida Keys are a chain of islands, islets and reefs extending from Virginia Key to the Dry Tortugas for about 309 kilometers (192 miles). The keys are chiefly limestone and coral formations. The larger islands of the group are Key West (with its airport), Key Largo, Sugarloaf Key, and Boca Chica Key. A causeway extends from the mainland to Key West. This image was acquired on October 28, 2001, by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra satellite. With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet. http://photojournal.jpl.nasa.gov/catalog/PIA03890

  3. Counterfactual Quantum Deterministic Key Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng; Wang, Jian; Tang, Chao-Jing

    2013-01-01

    We propose a new counterfactual quantum cryptography protocol for distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended; therefore, one can distribute both deterministic keys and random ones using our protocol. We also give a simple proof of the security of our protocol using the technique we previously applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.

  4. High speed and adaptable error correction for megabit/s rate quantum key distribution.

    PubMed

    Dixon, A R; Sato, H

    2014-12-02

    Quantum key distribution is moving from its theoretical foundation of unconditional security toward real-world installations. A significant part of this move is the orders-of-magnitude increase in the rate at which secure key bits are distributed. However, these advances have mostly been confined to the physical hardware stage of QKD, with software post-processing often being unable to support the high raw bit rates. In a complete implementation this leads to a bottleneck that unnecessarily limits the final secure key rate of the system. Here we report details of equally high-rate error correction which is further adaptable to maximise the secure key rate under a range of different operating conditions. The error correction is implemented both in CPU and GPU using a bi-directional LDPC approach and can provide 90-94% of the ideal secure key rate over all fibre distances from 0 to 80 km.
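
    Why reconciliation efficiency matters can be seen from the standard asymptotic BB84 (Shor-Preskill) bound, where error correction leaks f_ec·h2(Q) bits per raw bit (f_ec = 1 at the Shannon limit). This is textbook material, not the paper's LDPC implementation:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def key_fraction(qber, f_ec=1.0):
    """Asymptotic BB84 secret fraction: 1 - f_ec*h2(Q) - h2(Q),
    with error-correction inefficiency f_ec >= 1."""
    return max(0.0, 1.0 - f_ec * h2(qber) - h2(qber))

# Ideal reconciliation vs. ~6% overhead, at 2% QBER:
print(round(key_fraction(0.02), 3), round(key_fraction(0.02, 1.06), 3))
# → 0.717 0.709
```

    The gap between the two numbers is exactly the key-rate cost of imperfect reconciliation that the reported 90-94%-of-ideal error correction keeps small.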

  5. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  6. Key findings from the International Ovarian Tumor Analysis (IOTA) study: an approach to the optimal ultrasound based characterisation of adnexal pathology

    PubMed Central

    Bourne, Tom; De Rijdt, Sylvie; Van Holsbeke, Caroline; Sayasneh, Ahmad; Valentin, Lil; Van Calster, Ben; Timmerman, Dirk

    2015-01-01

    Abstract The principal aim of the IOTA project has been to develop approaches to the evaluation of adnexal pathology using ultrasound that can be transferred to all examiners. Creating models that use simple, easily reproducible ultrasound characteristics is one approach. PMID:28191150

  7. Theoretical and methodological approaches in discourse analysis.

    PubMed

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  8. Theoretical and methodological approaches in discourse analysis.

    PubMed

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  9. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform

    PubMed Central

    2013-01-01

    Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for

  10. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform.

    PubMed

    Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt

    2013-04-30

    Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server
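
    Features (v) and (vi) — dose-response curves for titrations and interpolation of unknown concentrations — rest on the four-parameter logistic (4PL) model. In LabKey Server the parameters come from fitting titrated standards (e.g. via an R script); the parameter values below are assumptions for the sketch, not a real fit:

```python
def fourpl(x, a, b, c, d):
    """4PL dose-response: a = response at zero dose, d = response at
    infinite dose, c = inflection point (EC50), b = slope."""
    return d + (a - d) / (1 + (x / c) ** b)

def inverse_fourpl(y, a, b, c, d):
    """Interpolate the concentration producing response y."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

params = (30.0, 1.2, 250.0, 30000.0)  # assumed fit for one analyte
y = fourpl(100.0, *params)            # simulated unknown's response
print(round(inverse_fourpl(y, *params), 6))  # → 100.0
```

    Inverting the fitted curve like this is what turns a measured bead intensity into a reported concentration for an unknown sample.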

  11. A Key Pre-Distribution Scheme Based on µ-PBIBD for Enhancing Resilience in Wireless Sensor Networks.

    PubMed

    Yuan, Qi; Ma, Chunguang; Yu, Haitao; Bian, Xuefen

    2018-05-12

    Many key pre-distribution (KPD) schemes based on combinatorial design have been proposed for secure communication in wireless sensor networks (WSNs). Due to the complexity of constructing the combinatorial design, it is infeasible to generate key rings using the corresponding combinatorial design in large-scale deployments of WSNs. In this paper, we present a definition of a new combinatorial design, termed “µ-partially balanced incomplete block design (µ-PBIBD)”, which is a refinement of partially balanced incomplete block design (PBIBD), and then describe a 2-D construction of µ-PBIBD which is mapped to KPD in WSNs. Our approach is of simple construction and provides strong key connectivity but poor network resilience. To improve the network resilience of KPD based on 2-D µ-PBIBD, we propose a KPD scheme based on 3-D Ex-µ-PBIBD, which is a construction of µ-PBIBD extended from 2-D space to 3-D space. The Ex-µ-PBIBD KPD scheme improves network scalability and resilience while achieving better key connectivity. Theoretical analysis and comparison with related schemes show that the key pre-distribution scheme based on Ex-µ-PBIBD provides high network resilience and better key scalability, while achieving a trade-off between network resilience and network connectivity.
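
    The µ-PBIBD construction is the paper's own contribution; as a generic illustration of how a combinatorial design yields key rings with guaranteed shared keys, here is the classic lines-over-a-finite-field scheme (not the paper's design): node (a, b) holds the keys on the line y = a·x + b over GF(q), and any two nodes with different slopes share exactly one key.

```python
q = 7  # prime; the key pool is the q*q grid of point-keys over GF(q)

def key_ring(a, b):
    """Key ring for node id (a, b): points of the line y = a*x + b."""
    return {(x, (a * x + b) % q) for x in range(q)}

r1, r2 = key_ring(2, 3), key_ring(5, 1)
shared = r1 & r2
print(len(r1), len(shared))  # → 7 1  (7 keys per node, 1 shared key)
```

    The appeal of design-based KPD is exactly this kind of provable connectivity; the resilience question is how many other nodes' rings an attacker learns by capturing the shared point-keys.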

  12. A Key Pre-Distribution Scheme Based on µ-PBIBD for Enhancing Resilience in Wireless Sensor Networks

    PubMed Central

    Yuan, Qi; Ma, Chunguang; Yu, Haitao; Bian, Xuefen

    2018-01-01

    Many key pre-distribution (KPD) schemes based on combinatorial design have been proposed for secure communication in wireless sensor networks (WSNs). Due to the complexity of constructing the combinatorial design, it is infeasible to generate key rings using the corresponding combinatorial design in large-scale deployments of WSNs. In this paper, we present a definition of a new combinatorial design, termed “µ-partially balanced incomplete block design (µ-PBIBD)”, which is a refinement of partially balanced incomplete block design (PBIBD), and then describe a 2-D construction of µ-PBIBD which is mapped to KPD in WSNs. Our approach is of simple construction and provides strong key connectivity but poor network resilience. To improve the network resilience of KPD based on 2-D µ-PBIBD, we propose a KPD scheme based on 3-D Ex-µ-PBIBD, which is a construction of µ-PBIBD extended from 2-D space to 3-D space. The Ex-µ-PBIBD KPD scheme improves network scalability and resilience while achieving better key connectivity. Theoretical analysis and comparison with related schemes show that the key pre-distribution scheme based on Ex-µ-PBIBD provides high network resilience and better key scalability, while achieving a trade-off between network resilience and network connectivity. PMID:29757244

  13. Beyond Repair: Conversation Analysis as an Approach to SLA

    ERIC Educational Resources Information Center

    Kasper, Gabriele

    2006-01-01

    As one of several approaches to SLA as social practice, Conversation Analysis (CA) has the capacity to examine in detail how opportunities for L2 learning arise in different interactional activities. Its particular strength, and one that distinguishes it from other social practice approaches, is its consistent focus on the orientations and…

  14. Interdisciplinary and participatory approaches: the key to effective groundwater management

    NASA Astrophysics Data System (ADS)

    Barthel, Roland; Foster, Stephen; Villholth, Karen G.

    2017-11-01

    The challenges of a changing world, which are progressively threatening sustainable use of groundwater resources, can only be rationally and effectively addressed through close collaboration between experts and practitioners from different disciplines. Furthermore, science and management need to build on stakeholder opinions and processes in order to generate useful knowledge and positive outcomes in terms of sustainable and equitable groundwater management. This essay provides a discussion of the status of and vision for participatory and inter-disciplinary approaches to groundwater evaluation and management as well as a conceptual framework and relevant research questions that will facilitate such approaches.

  15. Sensitivity analysis of key components in large-scale hydroeconomic models

    NASA Astrophysics Data System (ADS)

    Medellin-Azuara, J.; Connell, C. R.; Lund, J. R.; Howitt, R. E.

    2008-12-01

    This paper explores the likely impact of different estimation methods in key components of hydro-economic models, such as hydrology and economic costs or benefits, using the CALVIN hydro-economic optimization model for water supply in California. We perform our analysis using two climate scenarios: historical and warm-dry. The components compared were perturbed hydrology using six versus eighteen basins, highly elastic urban water demands, and different valuations of agricultural water scarcity. Results indicate that large-scale hydro-economic models are often rather robust to a variety of estimation methods for ancillary models and components. Increasing the level of detail in the hydrologic representation of this system might not greatly affect overall estimates of climate effects and adaptations for California's water supply. More price-responsive urban water demands will have a limited role in allocating water optimally among competing uses. Different estimation methods for the economic value of water and scarcity in agriculture may influence economically optimal water allocation; however, land conversion patterns may have a stronger influence on this allocation. Overall, optimization results of large-scale hydro-economic models remain useful across a wide range of assumptions in eliciting promising water management alternatives.

  16. Projecting biodiversity and wood production in future forest landscapes: 15 key modeling considerations.

    PubMed

    Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika

    2017-07-15

    A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does, however, present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture and economics requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically, we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics - biodiversity and wood biomass production - as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature stemming from the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Key Largo Limestone revisited: Pleistocene shelf-edge facies, Florida Keys, USA

    USGS Publications Warehouse

    Gray, Multer H.; Gischler, E.; Lundberg, J.; Simmons, K.R.; Shinn, E.A.

    2002-01-01

    New dates and analysis of 12 deep and 57 shallow cores allow a more detailed interpretation of the Pleistocene shelf edge of the Florida Platform as found in various facies of the Key Largo Limestone beneath the Florida Keys. In this study a three-phase evolution of the Quaternary units (Q1-Q5) of the Key Largo is presented with new subdivision of the Q5. (1) In the first phase, the Q1 and Q2 (perhaps deposited during oxygen-isotope stage 11) deep-water quartz-rich environment evolved into a shallow carbonate phase. (2) Subsequently, a Q3 (presumably corresponding to oxygen-isotope stage 9) flourishing reef and productive high-platform sediment phase developed. (3) Finally, a Q4 and Q5 (corresponding to oxygen-isotope stages 7 and 5) stabilization phase occurred with reefs and leeward productive lagoons, followed by lower sea levels presenting a sequence of younger (isotope substages 5c, 5a) shelf-margin wedges, sediment veneers and outlier reefs. The Key Largo Limestone provides an accessible model of a carbonate shelf edge with fluctuating water depth, bordering a deep seaward basin for a period of at least 300 ka. During this time, at least four onlaps/offlaps, often separated by periods of karst development with associated diagenetic alterations, took place. The story presented by this limestone not only allows a better understanding of the history of south Florida but also aids in the interpretation of similar persistent shelf-edge sites bordering deep basins in other areas.

  18. Comparison of approaches for mobile document image analysis using server supported smartphones

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcome these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from mobile phone captured images. In this study, our goal is to compare the in-phone and the remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct recognition metrics.
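
    The break-even condition the abstract describes can be made concrete with a back-of-the-envelope timing model (all numbers below are hypothetical illustrative values, not the study's measurements):

```python
def in_phone_time(ocr_phone):
    """In-phone approach: everything runs locally; cost is the slow OCR."""
    return ocr_phone

def remote_time(image_mb, bandwidth_mbps, ocr_server, compress=0.3):
    """Remote approach pays compression + upload delay, but OCR is faster."""
    upload = image_mb * 8 / bandwidth_mbps   # seconds to ship the image
    return compress + upload + ocr_server

# Hypothetical timings: server OCR much faster, network delay added on top
phone = in_phone_time(ocr_phone=12.0)
remote = remote_time(image_mb=0.4, bandwidth_mbps=8.0, ocr_server=1.5)
print(remote < phone)   # remote wins when OCR savings exceed the extra delays
```

    Compression and downscaling shrink `image_mb`, which is why they tip the comparison toward the remote-server approach in the study.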

  19. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  20. Active magnetic radiation shielding system analysis and key technologies.

    PubMed

    Washburn, S A; Blattnig, S R; Singleterry, R C; Westover, S C

    2015-01-01

    Many active magnetic shielding designs have been proposed in order to reduce the radiation exposure received by astronauts on long duration, deep space missions. While these designs are promising, they pose significant engineering challenges. This work presents a survey of the major systems required for such an unconfined magnetic field design, allowing the identification of key technologies for future development. Basic mass calculations are developed for each system and are used to determine the resulting galactic cosmic radiation exposure for a generic solenoid design, using a range of magnetic field strength and thickness values, allowing some of the basic characteristics of such a design to be observed. This study focuses on a solenoid-shaped, active magnetic shield design; however, many of the principles discussed are applicable regardless of the exact design configuration, particularly the key technologies cited. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.

  1. Fully device-independent conference key agreement

    NASA Astrophysics Data System (ADS)

    Ribeiro, Jérémy; Murta, Gláucia; Wehner, Stephanie

    2018-02-01

    We present a security analysis of conference key agreement (CKA) in the most adversarial model of device independence (DI). Our protocol can be implemented by any experimental setup that is capable of performing Bell tests [specifically, the Mermin-Ardehali-Belinskii-Klyshko (MABK) inequality], and security can in principle be obtained for any violation of the MABK inequality that detects genuine multipartite entanglement among the N parties involved in the protocol. As our main tool, we derive a direct physical connection between the N-partite MABK inequality and the Clauser-Horne-Shimony-Holt (CHSH) inequality, showing that certain violations of the MABK inequality correspond to a violation of the CHSH inequality between one of the parties and the other N-1. We compare the asymptotic key rate for device-independent conference key agreement (DICKA) to the case where the parties use N-1 device-independent quantum key distribution protocols in order to generate a common key. We show that for some regime of noise the DICKA protocol leads to better rates.
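
    The CHSH inequality anchoring the reduction reads, in standard correlator notation (a sketch of the textbook form, not the paper's derivation):

```latex
% CHSH: local-hidden-variable bound 2, quantum (Tsirelson) bound 2\sqrt{2}
\left| \langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle
     + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle \right| \le 2
```

    For N parties the MABK family generalises this two-party inequality; for the standard normalisation the quantum-to-classical violation ratio grows as $2^{(N-1)/2}$, which is what allows sufficiently large violations to certify genuine multipartite entanglement.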

  2. Employing an ethnographic approach: key characteristics.

    PubMed

    Lambert, Veronica; Glacken, Michele; McCarron, Mary

    2011-01-01

    Nurses are increasingly embracing ethnography as a useful research methodology. This paper presents an overview of some of the main characteristics we considered and the challenges encountered when using ethnography to explore the nature of communication between children and health professionals in a children's hospital. There is no consensual definition or single procedure to follow when using ethnography. This is largely attributable to the re-contextualisation of ethnography over time through diversification in and across many disciplines. Thus, it is imperative to consider some of ethnography's trademark features. To identify core trademark features of ethnography, we collated data following a scoping review of pertinent ethnographic textbooks, journal articles, attendance at ethnographic workshops and discussions with principal ethnographers. This is a methodological paper. Essentially, ethnography is a field-orientated activity that has cultural interpretations at its core, although the levels of those interpretations vary. We identified six trademark features to be considered when embracing an ethnographic approach: naturalism; context; multiple data sources; small case numbers; 'emic' and 'etic' perspectives; and ethical considerations. Ethnography has an assortment of meanings, so it is not often used in a wholly orthodox way and does not fall under the auspices of one epistemological belief. Yet, there are core criteria and trademark features that researchers should take into account alongside their particular epistemological beliefs when embracing an ethnographic inquiry. We hope this paper promotes a clearer vision of the methodological processes to consider when embarking on ethnography and creates an avenue for others to disseminate their experiences of and challenges encountered when applying ethnography's trademark features in different healthcare contexts.

  3. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    DTIC Science & Technology

    2016-04-05

    applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the… Performance analysis of hierarchical group key management integrated with adaptive intrusion detection in mobile ad hoc… Accepted 19 September 2010; available online 26 September 2010. Keywords: mobile ad hoc networks; intrusion detection; group communication systems; group…

  4. Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments

    NASA Astrophysics Data System (ADS)

    Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel

    2017-03-01

    This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. Principles and usage of this advanced evaluation method are described in details and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for the data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of a previous data set as the input parameters for the next data set and the model suitability crosscheck option of applying the procedure in ascending and descending directions of the data sets. Described fitting methodology is beneficial for checking of model validity and reliability of obtained results.
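
    The sequential seeding idea can be sketched with a toy one-parameter model (a hypothetical exponential decay, not actual NFS time spectra): each dataset's fit starts from the previous dataset's fitted parameters, so a slowly drifting parameter is tracked across hundreds of spectra without re-guessing.

```python
import numpy as np

def fit_decay(t, y, k0):
    """Least-squares fit of y = exp(-k t) by scanning a window around the seed k0."""
    ks = np.linspace(max(k0 - 0.5, 0.01), k0 + 0.5, 501)
    sse = [np.sum((y - np.exp(-k * t)) ** 2) for k in ks]
    return float(ks[int(np.argmin(sse))])

rng = np.random.default_rng(3)
t = np.linspace(0, 5, 40)
true_ks = [0.8, 0.9, 1.0, 1.1]          # slowly drifting parameter across spectra
datasets = [np.exp(-k * t) + 0.01 * rng.normal(size=t.size) for k in true_ks]

# Sequential fitting: each dataset is seeded with the previous fit's output
k = 0.8  # manual initial guess needed for the first spectrum only
fits = []
for y in datasets:
    k = fit_decay(t, y, k0=k)
    fits.append(k)
print(fits)
```

    Running the same loop in descending order over the datasets and comparing the two parameter sequences gives the model-suitability crosscheck the study describes.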

  5. Designing new institutions for implementing integrated disaster risk management: key elements and future directions.

    PubMed

    Gopalakrishnan, Chennat; Okada, Norio

    2007-12-01

    The goal of integrated disaster risk management is to promote an overall improvement in the quality of safety and security in a region, city or community at disaster risk. This paper presents the case for a thorough overhaul of the institutional component of integrated disaster risk management. A review of disaster management institutions in the United States indicates significant weaknesses in their ability to contribute effectively to the implementation of integrated disaster risk management. Our analysis and findings identify eight key elements for the design of dynamic new disaster management institutions. Six specific approaches are suggested for incorporating the identified key elements in building new institutions that would have significant potential for enhancing the effective implementation of integrated disaster risk management. We have developed a possible blueprint for effective design and construction of efficient, sustainable and functional disaster management institutions.

  6. GIS environmental information analysis of the Darro River basin as the key for the management and hydrological forest restoration.

    PubMed

    Fernandez, Paz; Delgado, Expectación; Lopez-Alonso, Mónica; Poyatos, José Manuel

    2018-02-01

    This article presents analyses of soil and environmental information for the Darro River basin (Granada, Spain) preliminary to its hydrological and forestry restoration. These analyses were carried out using a geographical information system (GIS) and employing a new procedure that adapts hydrological forest-restoration methods. The complete analysis encompasses morphological conditions, soil and climate characteristics as well as vegetation and land use. The study investigates soil erosion in the basin by using the Universal Soil Loss Equation (USLE) and by mapping erosion fragility units. The results are presented in a set of maps and their analysis, providing the starting point for river basin management and the hydrological and forestry-restoration project that was approved at the end of 2015. The presence of soft substrates (e.g. gravel and sand) indicates that the area is susceptible to erosion, particularly the areas that are dominated by human activity and have little soil protection. Finally, land use and vegetation cover were identified as key factors in the soil erosion in the basin. According to the results, river authorities have included several measures in the restoration project aimed at reducing the erosion, helping to recover the environmental value of this river basin, and expanding recreation opportunities for the community of Granada. The presented analytical approach, designed by the authors, would be useful as a tool for environmental restoration in other small Mediterranean river basins. Copyright © 2017 Elsevier B.V. All rights reserved.
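
    The USLE estimate named in the abstract is a simple product of six factors, A = R·K·L·S·C·P, with the topographic factors usually combined into LS. A minimal sketch with hypothetical factor values (not the Darro basin's):

```python
# Universal Soil Loss Equation: A = R * K * LS * C * P
# Illustrative (hypothetical) factor values, not from the Darro basin study.
R = 120.0   # rainfall erosivity (MJ mm ha^-1 h^-1 yr^-1)
K = 0.30    # soil erodibility (t ha h ha^-1 MJ^-1 mm^-1)
LS = 1.8    # combined slope length-steepness factor (dimensionless)
C = 0.25    # cover-management factor (dimensionless)
P = 1.0     # support-practice factor (1.0 = no practices applied)

A = R * K * LS * C * P  # mean annual soil loss (t ha^-1 yr^-1)
print(f"estimated soil loss: {A:.1f} t/ha/yr")
```

    In a GIS workflow each factor becomes a raster layer and the multiplication is performed cell by cell, which is how the erosion maps in the study are produced.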

  7. Identification of key micro-organisms involved in Douchi fermentation by statistical analysis and their use in an experimental fermentation.

    PubMed

    Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q

    2015-11-01

    To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and an automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by the identified strains was carried out, and the nutritional composition of the Douchi was analysed. Results showed that FAAs and the relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and that with identified probiotic micro-organisms participating in fermentation, a higher-quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.

  8. Function key and shortcut key use in airway facilities.

    DOT National Transportation Integrated Search

    2003-02-01

    This document provides information on the function keys and shortcut keys used by systems in the Federal Aviation Administration : Airway Facilities (AF) work environment. It includes a catalog of the function keys and shortcut keys used by each syst...

  9. A Recommended Set of Key Arctic Indicators

    NASA Astrophysics Data System (ADS)

    Stanitski, D.; Druckenmiller, M.; Fetterer, F. M.; Gerst, M.; Intrieri, J. M.; Kenney, M. A.; Meier, W.; Overland, J. E.; Stroeve, J.; Trainor, S.

    2017-12-01

    The Arctic is an interconnected and environmentally sensitive system of ice, ocean, land, atmosphere, ecosystems, and people. From local to pan-Arctic scales, the area has already undergone major changes in physical and societal systems and will continue at a pace that is greater than twice the global average. Key Arctic indicators can quantify these changes. Indicators serve as the bridge between complex information and policy makers, stakeholders, and the general public, revealing trends and information people need to make important socioeconomic decisions. This presentation evaluates and compiles more than 70 physical, biological, societal and economic indicators into an approachable summary that defines the changing Arctic. We divided indicators into "existing," "in development," "possible," and "aspirational". In preparing a paper on Arctic Indicators for a special issue of the journal Climatic Change, our group established a set of selection criteria to identify indicators to specifically guide decision-makers in their responses to climate change. A goal of the analysis is to select a manageable composite list of recommended indicators based on sustained, reliable data sources with known user communities. The selected list is also based on the development of a conceptual model that identifies components and processes critical to our understanding of the Arctic region. This list of key indicators is designed to inform the plans and priorities of multiple groups such as the U.S. Global Change Research Program (USGCRP), Interagency Arctic Research Policy Committee (IARPC), and the Arctic Council.

  10. Face recognition using an enhanced independent component analysis approach.

    PubMed

    Kwak, Keun-Chang; Pedrycz, Witold

    2007-03-01

    This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method with Fisher linear discriminant analysis (LDA); hence its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in a low-dimensional subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. Comprehensive experiments were completed on the facial recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
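
    FICA's LDA augmentation is the supervised step of the pipeline. A minimal two-class Fisher discriminant sketch in numpy (synthetic data, not FERET; the ICA stage is omitted here):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher LDA: w = Sw^{-1} (m1 - m2) maximises class separation."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter = sum of the two class scatter matrices
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
X1 = rng.normal([0, 0], 0.5, size=(100, 2))   # class 1 features
X2 = rng.normal([2, 1], 0.5, size=(100, 2))   # class 2 features
w = fisher_direction(X1, X2)
# Projections of the two classes onto w should be well separated
p1, p2 = X1 @ w, X2 @ w
print(p1.mean(), p2.mean())
```

    In FICA the columns of X would be ICA-derived features of face images rather than raw pixels, and the multi-class generalisation of the scatter matrices is used.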

  11. Key factors of case management interventions for frequent users of healthcare services: a thematic analysis review.

    PubMed

    Hudon, Catherine; Chouinard, Maud-Christine; Lambert, Mireille; Diadiou, Fatoumata; Bouliane, Danielle; Beaudin, Jérémie

    2017-10-22

    The aim of this paper was to identify the key factors of case management (CM) interventions among frequent users of healthcare services found in empirical studies of effectiveness. Thematic analysis review of CM studies. We built on a previously published review that aimed to report the effectiveness of CM interventions for frequent users of healthcare services, using the Medline, Scopus and CINAHL databases covering the January 2004-December 2015 period, then updated to July 2017, with the keywords 'CM' and 'frequent use'. We extracted factors of successful (n=7) and unsuccessful (n=6) CM interventions and conducted a mixed thematic analysis to synthesise findings. Chaudoir's implementation of health innovations framework was used to organise results into four broad levels of factors: (1) environmental/organisational level, (2) practitioner level, (3) patient level and (4) programme level. Access to, and close partnerships with, healthcare providers and community services resources were key factors of successful CM interventions that should target patients with the greatest needs and promote frequent contacts with the healthcare team. The selection and training of the case manager was also an important factor to foster patient engagement in CM. Coordination of care, self-management support and assistance with care navigation were key CM activities. The main issues reported by unsuccessful CM interventions were problems with case finding or lack of care integration. CM interventions for frequent users of healthcare services should ensure adequate case finding processes, rigorous selection and training of the case manager, sufficient intensity of the intervention, as well as good care integration among all partners. Other studies could further evaluate the influence of contextual factors on intervention impacts. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted

  12. Key factors of case management interventions for frequent users of healthcare services: a thematic analysis review

    PubMed Central

    Hudon, Catherine; Chouinard, Maud-Christine; Lambert, Mireille; Diadiou, Fatoumata; Bouliane, Danielle; Beaudin, Jérémie

    2017-01-01

    Objective The aim of this paper was to identify the key factors of case management (CM) interventions among frequent users of healthcare services found in empirical studies of effectiveness. Design Thematic analysis review of CM studies. Methods We built on a previously published review that aimed to report the effectiveness of CM interventions for frequent users of healthcare services, using the Medline, Scopus and CINAHL databases covering the January 2004–December 2015 period, then updated to July 2017, with the keywords ‘CM’ and ‘frequent use’. We extracted factors of successful (n=7) and unsuccessful (n=6) CM interventions and conducted a mixed thematic analysis to synthesise findings. Chaudoir’s implementation of health innovations framework was used to organise results into four broad levels of factors: (1) environmental/organisational level, (2) practitioner level, (3) patient level and (4) programme level. Results Access to, and close partnerships with, healthcare providers and community services resources were key factors of successful CM interventions that should target patients with the greatest needs and promote frequent contacts with the healthcare team. The selection and training of the case manager was also an important factor to foster patient engagement in CM. Coordination of care, self-management support and assistance with care navigation were key CM activities. The main issues reported by unsuccessful CM interventions were problems with case finding or lack of care integration. Conclusions CM interventions for frequent users of healthcare services should ensure adequate case finding processes, rigorous selection and training of the case manager, sufficient intensity of the intervention, as well as good care integration among all partners. Other studies could further evaluate the influence of contextual factors on intervention impacts. PMID:29061623

  13. High speed and adaptable error correction for megabit/s rate quantum key distribution

    PubMed Central

    Dixon, A. R.; Sato, H.

    2014-01-01

    Quantum Key Distribution is moving from its theoretical foundation of unconditional security to rapidly approaching real world installations. A significant part of this move is the orders of magnitude increases in the rate at which secure key bits are distributed. However, these advances have mostly been confined to the physical hardware stage of QKD, with software post-processing often being unable to support the high raw bit rates. In a complete implementation this leads to a bottleneck limiting the final secure key rate of the system unnecessarily. Here we report details of equally high rate error correction which is further adaptable to maximise the secure key rate under a range of different operating conditions. The error correction is implemented both in CPU and GPU using a bi-directional LDPC approach and can provide 90–94% of the ideal secure key rate over all fibre distances from 0–80 km. PMID:25450416
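
    Real QKD reconciliation uses large sparse LDPC codes with belief-propagation decoding, which is too long to sketch here. As a toy stand-in, this sketch shows the underlying syndrome-based idea with a Hamming(7,4) code: Alice reveals only a syndrome, and Bob uses it to correct a single-bit disagreement in his raw key.

```python
import numpy as np

# Hamming(7,4) parity-check matrix: column i is the binary expansion of i+1.
# A toy stand-in for the large sparse LDPC matrices used in QKD reconciliation.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(word):
    return H @ word % 2

def reconcile(bob, alice_syndrome):
    """Correct a single-bit error in Bob's word using Alice's public syndrome."""
    diff = (syndrome(bob) + alice_syndrome) % 2   # equals H @ error_vector mod 2
    if not diff.any():
        return bob.copy()
    # For a single error, diff matches the H column at the error position
    pos = int(diff[0] + 2 * diff[1] + 4 * diff[2]) - 1
    corrected = bob.copy()
    corrected[pos] ^= 1
    return corrected

alice = np.array([1, 0, 1, 1, 0, 0, 1])
bob = alice.copy()
bob[4] ^= 1                               # one bit flipped by channel noise
fixed = reconcile(bob, syndrome(alice))
print((fixed == alice).all())             # keys agree after reconciliation
```

    The bi-directional LDPC scheme in the paper plays the same game at megabit rates, with the syndrome length (and hence leaked information) adapted to the measured error rate.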

  14. Identification of gene expression profiles and key genes in subchondral bone of osteoarthritis using weighted gene coexpression network analysis.

    PubMed

    Guo, Sheng-Min; Wang, Jian-Xiong; Li, Jin; Xu, Fang-Yuan; Wei, Quan; Wang, Hai-Ming; Huang, Hou-Qiang; Zheng, Si-Lin; Xie, Yu-Jie; Zhang, Chi

    2018-06-15

    Osteoarthritis (OA) significantly influences the quality of life of people around the world. It is urgent to find an effective way to understand the genetic etiology of OA. We used weighted gene coexpression network analysis (WGCNA) to explore the key genes involved in the subchondral bone pathological process of OA. Fifty gene expression profiles of GSE51588 were downloaded from the Gene Expression Omnibus database. The OA-associated genes and gene ontologies were acquired from JuniorDoc. Weighted gene coexpression network analysis was used to find disease-related networks based on 21756 gene expression correlation coefficients, hub-genes with the highest connectivity in each module were selected, and the correlation between module eigengenes and clinical traits was calculated. The genes in the trait-related gene coexpression modules were subjected to functional annotation and pathway enrichment analysis using ClusterProfiler. A total of 73 gene modules were identified, of which 12 modules were found with high connectivity with clinical traits. Five modules were found with enriched OA-associated genes. Moreover, 310 OA-associated genes were found, and 34 of them were among the hub-genes in each module. Consequently, enrichment results indicated some key metabolic pathways, such as extracellular matrix (ECM)-receptor interaction (hsa04512), focal adhesion (hsa04510), the phosphatidylinositol 3'-kinase (PI3K)-Akt signaling pathway (PI3K-AKT) (hsa04151), the transforming growth factor beta pathway, and the Wnt pathway. We identified several core genes, collagen (COL)6A3, COL6A1, ITGA11, BAMBI, and HCK, which could influence downstream signaling pathways once activated. In this study, we identified important genes within key coexpression modules, which associate with the pathological process of subchondral bone in OA. Functional analysis results could provide important information to understand the mechanism of OA. © 2018 Wiley Periodicals, Inc.
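
    WGCNA's core step is a soft-thresholded correlation network from which node connectivity and hub genes are read off. A minimal numpy sketch on toy data (not GSE51588; β=6 is the conventional default soft-threshold power for unsigned networks):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy expression matrix: 6 genes x 20 samples; genes 0-2 form a correlated module
base = rng.normal(size=20)
expr = np.vstack([base + 0.1 * rng.normal(size=20) for _ in range(3)] +
                 [rng.normal(size=20) for _ in range(3)])

beta = 6                          # soft-threshold power, as in WGCNA
corr = np.corrcoef(expr)          # gene-gene expression correlations
adj = np.abs(corr) ** beta        # soft-thresholded adjacency matrix
np.fill_diagonal(adj, 0)

connectivity = adj.sum(axis=1)    # WGCNA node connectivity k_i
hub = int(connectivity.argmax())  # hub gene = highest intra-network connectivity
print(hub, connectivity.round(2))
```

    The full pipeline then clusters this adjacency into modules, summarises each module by its eigengene, and correlates eigengenes with clinical traits, which is how the 12 trait-related modules in the study were found.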

  15. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
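
    Two of the p-value combination schemes the abstract contrasts have simple closed forms: Fisher's method (a chi-square test on -2*sum(log p)) and the additive method (the Irwin-Hall CDF of the sum of p-values, which the paper builds its bi-level framework on). A stdlib-only sketch, exploiting the fact that the chi-square survival function has a closed form for the even degrees of freedom Fisher's method produces; the example p-values are invented:

    ```python
    import math

    def fisher_combined(pvals):
        """Fisher's method: -2*sum(ln p) ~ chi-square with 2n d.o.f.
        For 2n (even) d.o.f. the survival function is
        exp(-x/2) * sum_{i<n} (x/2)^i / i!."""
        n = len(pvals)
        half = -sum(math.log(p) for p in pvals)  # x/2
        return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(n))

    def additive_combined(pvals):
        """Additive method: combined p-value is the Irwin-Hall CDF of the
        sum S of n independent Uniform(0,1) p-values, evaluated at S."""
        n, s = len(pvals), sum(pvals)
        return sum(
            (-1) ** k * math.comb(n, k) * (s - k) ** n
            for k in range(int(math.floor(s)) + 1)
        ) / math.factorial(n)

    pvals = [0.01, 0.02, 0.04]          # hypothetical per-study p-values
    p_fisher = fisher_combined(pvals)
    p_additive = additive_combined(pvals)
    ```

    On these inputs the additive method yields the smaller combined p-value; its robustness to a single outlying p-value (which can dominate Fisher's log-sum) is the property the paper exploits.
    
    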

  16. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    PubMed

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. The second approach is regression-based, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets, discussing the relevance of these techniques over classical approaches for survival data. We compare svm-based survival models based on ranking constraints, on regression constraints, and on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that svm-based models using regression constraints perform significantly better than svm-based models based on ranking constraints. Our experiments show a comparable performance for methods
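
    The concordance index used as measure (i) above has a direct combinatorial definition: over all comparable pairs (where the earlier time is an observed event), count the fraction in which the shorter survivor received the higher prognostic index. A minimal sketch with an invented five-patient cohort:

    ```python
    def concordance_index(times, events, risk_scores):
        """Harrell's concordance index for right-censored survival data.
        A pair (i, j) is comparable when i's earlier time is an observed
        event; it is concordant when i also has the higher risk score.
        Ties in risk count as half-concordant."""
        concordant, comparable = 0.0, 0
        n = len(times)
        for i in range(n):
            for j in range(n):
                if times[i] < times[j] and events[i]:
                    comparable += 1
                    if risk_scores[i] > risk_scores[j]:
                        concordant += 1
                    elif risk_scores[i] == risk_scores[j]:
                        concordant += 0.5
        return concordant / comparable

    # Hypothetical cohort: survival time, event flag (1 = event, 0 = censored),
    # and a prognostic index that happens to rank the patients perfectly.
    times  = [2, 4, 5, 7, 9]
    events = [1, 1, 0, 1, 0]
    risk   = [5.0, 4.0, 1.0, 2.0, 0.5]
    ```

    A c-index of 1.0 means perfect discrimination, 0.5 is random; the ranking-constraint SVM in the paper optimizes (a convex surrogate of) exactly this quantity.
    
    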

  17. Examining the Reggio Emilia Approach: Keys to Understanding Why It Motivates Students

    ERIC Educational Resources Information Center

    Gardner, Alexa Fraley; Jones, Brett D.

    2016-01-01

    Because of the success of the Reggio Emilia Approach in early childhood education, it could be useful to researchers and practitioners to identify and explicate components of the approach that make it effective in motivating students. In this paper, we examine the Reggio Emilia Approach through the lens of the MUSIC® Model of Motivation, a model…

  18. Identification of key factors regulating self-renewal and differentiation in EML hematopoietic precursor cells by RNA-sequencing analysis.

    PubMed

    Zong, Shan; Deng, Shuyun; Chen, Kenian; Wu, Jia Qian

    2014-11-11

    Hematopoietic stem cells (HSCs) are used clinically for transplantation treatment to rebuild a patient's hematopoietic system in many diseases such as leukemia and lymphoma. Elucidating the mechanisms controlling HSC self-renewal and differentiation is important for application of HSCs for research and clinical uses. However, it is not possible to obtain large quantities of HSCs due to their inability to proliferate in vitro. To overcome this hurdle, we used a mouse bone marrow derived cell line, the EML (Erythroid, Myeloid, and Lymphocytic) cell line, as a model system for this study. RNA-sequencing (RNA-Seq) has been increasingly used to replace microarray for gene expression studies. We report here a detailed method of using RNA-Seq technology to investigate the potential key factors in regulation of EML cell self-renewal and differentiation. The protocol provided in this paper is divided into three parts. The first part explains how to culture EML cells and separate Lin-CD34+ and Lin-CD34- cells. The second part of the protocol offers detailed procedures for total RNA preparation and the subsequent library construction for high-throughput sequencing. The last part describes the method for RNA-Seq data analysis and explains how to use the data to identify differentially expressed transcription factors between Lin-CD34+ and Lin-CD34- cells. The most significantly differentially expressed transcription factors were identified to be the potential key regulators controlling EML cell self-renewal and differentiation. In the discussion section of this paper, we highlight the key steps for successful performance of this experiment. In summary, this paper offers a method of using RNA-Seq technology to identify potential regulators of self-renewal and differentiation in EML cells. The key factors identified are subjected to downstream functional analysis in vitro and in vivo.
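
    The final step of the protocol — flagging transcription factors that differ between Lin-CD34+ and Lin-CD34- cells — reduces, at its simplest, to ranking genes by fold change between the two populations. A toy sketch with invented normalized counts and hypothetical gene names; real RNA-Seq pipelines would add count-model statistics (e.g. negative-binomial tests) on top of this:

    ```python
    import math
    from statistics import mean

    def log2_fold_changes(expr_pos, expr_neg):
        """Mean log2 fold change (Lin-CD34+ over Lin-CD34-) per gene,
        given replicate measurements for each population."""
        return {
            gene: math.log2(mean(expr_pos[gene]) / mean(expr_neg[gene]))
            for gene in expr_pos
        }

    def candidate_regulators(expr_pos, expr_neg, min_abs_lfc=1.0):
        """Genes whose |log2 FC| exceeds the cutoff (2-fold by default)."""
        lfc = log2_fold_changes(expr_pos, expr_neg)
        return sorted(g for g, v in lfc.items() if abs(v) >= min_abs_lfc)

    # Hypothetical normalized counts, two replicates per population.
    cd34_pos = {"TF_A": [40.0, 44.0], "TF_B": [10.0, 12.0], "TF_C": [20.0, 22.0]}
    cd34_neg = {"TF_A": [10.0, 11.0], "TF_B": [11.0, 10.0], "TF_C": [21.0, 20.0]}
    candidates = candidate_regulators(cd34_pos, cd34_neg)
    ```
    
    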

  19. Identification of Key Factors Regulating Self-renewal and Differentiation in EML Hematopoietic Precursor Cells by RNA-sequencing Analysis

    PubMed Central

    Chen, Kenian; Wu, Jia Qian

    2014-01-01

    Hematopoietic stem cells (HSCs) are used clinically for transplantation treatment to rebuild a patient's hematopoietic system in many diseases such as leukemia and lymphoma. Elucidating the mechanisms controlling HSC self-renewal and differentiation is important for application of HSCs for research and clinical uses. However, it is not possible to obtain large quantities of HSCs due to their inability to proliferate in vitro. To overcome this hurdle, we used a mouse bone marrow derived cell line, the EML (Erythroid, Myeloid, and Lymphocytic) cell line, as a model system for this study. RNA-sequencing (RNA-Seq) has been increasingly used to replace microarray for gene expression studies. We report here a detailed method of using RNA-Seq technology to investigate the potential key factors in regulation of EML cell self-renewal and differentiation. The protocol provided in this paper is divided into three parts. The first part explains how to culture EML cells and separate Lin-CD34+ and Lin-CD34- cells. The second part of the protocol offers detailed procedures for total RNA preparation and the subsequent library construction for high-throughput sequencing. The last part describes the method for RNA-Seq data analysis and explains how to use the data to identify differentially expressed transcription factors between Lin-CD34+ and Lin-CD34- cells. The most significantly differentially expressed transcription factors were identified to be the potential key regulators controlling EML cell self-renewal and differentiation. In the discussion section of this paper, we highlight the key steps for successful performance of this experiment. In summary, this paper offers a method of using RNA-Seq technology to identify potential regulators of self-renewal and differentiation in EML cells. The key factors identified are subjected to downstream functional analysis in vitro and in vivo. PMID:25407807

  20. Experiences in methods to involve key players in planning protective actions in the case of a nuclear accident.

    PubMed

    Sinkko, K; Hämäläinen, R P; Hänninen, R

    2004-01-01

    A widely used method in the planning of protective actions is to establish a stakeholder network to generate a comprehensive set of generic protective actions. The aim is to increase competence and build links for communication and coordination. The approach of this work was to systematically evaluate protective action strategies in the case of a nuclear accident. This was done in a way that the concerns and issues of all key players could be transparently and equally included in the decision taken. An approach called Facilitated Decision Analysis Workshop has been developed and tested. The work builds on case studies in which it was assumed that a hypothetical accident had led to a release of considerable amounts of radionuclides and, therefore, various types of countermeasures had to be considered. Six workshops were organised in the Nordic countries where the key players were represented, i.e. authorities, expert organisations, industry and agricultural producers. Copyright 2004 Oxford University Press

  1. The RNA world in the 21st century-a systems approach to finding non-coding keys to clinical questions.

    PubMed

    Schmitz, Ulf; Naderi-Meshkin, Hojjat; Gupta, Shailendra K; Wolkenhauer, Olaf; Vera, Julio

    2016-05-01

    Evidence that RNAs are a functionally rich class of molecules did not first emerge with the arrival of next-generation sequencing technology. Non-coding RNAs (ncRNA) could be the key to accelerated diagnosis and enhanced prediction of disease and therapy outcomes as well as the design of advanced therapeutic strategies to overcome as yet unsatisfactory approaches. In this review, we discuss the state of the art in RNA systems biology with a focus on applications in the field of systems biomedicine. We propose guidelines for analysing the role of microRNAs and long non-coding RNAs in human pathologies. We introduce RNA expression profiling and network approaches for the identification of stable and effective RNomics-based biomarkers, providing insights into the role of ncRNAs in disease regulation. Towards this, we discuss ways to model the dynamics of gene regulatory networks and signalling pathways that involve ncRNAs. We also describe data resources and computational methods for finding putative mechanisms of action of ncRNAs. Finally, we discuss avenues for the computer-aided design of novel RNA-based therapeutics. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  2. Feasibility of satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bonato, C.; Tomaello, A.; Da Deppo, V.; Naletto, G.; Villoresi, P.

    2009-04-01

    In this paper, we present a novel analysis of the feasibility of quantum key distribution between a LEO satellite and a ground station. First of all, we study signal propagation through a turbulent atmosphere for uplinks and downlinks, discussing the contribution of beam spreading and beam wandering. Then we introduce a model for the background noise of the channel during night-time and day-time, calculating the signal-to-noise ratio for different configurations. We also discuss the expected error-rate due to imperfect polarization compensation in the channel. Finally, we calculate the expected key generation rate of a secure key for different configurations (uplink, downlink) and for different protocols (BB84 with and without decoy states, entanglement-based Ekert91 protocol).
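
    The key-generation-rate comparison in the record above ultimately rests on formulas of the following shape. As a minimal sketch, the textbook asymptotic BB84 secret-key fraction per sifted bit, r = 1 - 2*h2(Q), where h2 is binary entropy and Q the quantum bit error rate — assuming one-way error correction at the Shannon limit and ignoring decoy-state and finite-key corrections, which the paper's configurations would add:

    ```python
    import math

    def h2(p):
        """Binary entropy in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bb84_key_fraction(qber):
        """Asymptotic BB84 secret-key fraction per sifted bit,
        r = 1 - 2*h2(Q): one h2(Q) is leaked to error correction,
        one is removed by privacy amplification."""
        return max(0.0, 1.0 - 2.0 * h2(qber))
    ```

    At 5% QBER the fraction is about 0.43, and it hits zero near the familiar 11% threshold — which is why the background-noise and polarization-error budgets in the satellite link analysis matter so much.
    
    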

  3. Weighted Key Player Problem for Social Network Analysis

    DTIC Science & Technology

    2011-03-01

    the degree of the actor, the number of adjacent neighbors, to determine its centrality value. Introduced in its current form by Freeman, a node's...identifying individuals who are key in a number of contexts. This chapter developed the WKPP-Pos measure that allows for the inclusion of actor and...Techniques were developed for using the p-median to find optimal solutions to the WKPP-Pos measure and for using hierarchical clustering as a

  4. New national curricula guidelines that support the use of interprofessional education in the Brazilian context: An analysis of key documents.

    PubMed

    Freire Filho, José Rodrigues; Viana Da Costa, Marcelo; Forster, Aldaísa Cassanho; Reeves, Scott

    2017-11-01

    The National Curricular Guidelines (NCGs) are important documents for understanding the history of academic health professions education in Brazil. Key policies within the NCGs have helped to reorient health professions education and have stimulated curricular changes, including active learning methodologies and more integrated teaching-service environments, and, more recently, have introduced interprofessional education (IPE) in both undergraduate and postgraduate sectors. This article presents the findings of a study that examined the NCGs for nursing, dentistry, and medicine courses as juridical foundations for adopting strategies that promote IPE across higher education institutions in Brazil. We employed a comparative and exploratory documentary analysis to understand the role of IPE and collaborative practices in NCGs for the three largest professions in Brazil. Following a thematic analysis of these texts, four key themes emerged: faculty development; competencies for teamwork; curricular structure; and learning metrics. Key findings related to each of these themes are presented and discussed in relation to the wider interprofessional literature. The article goes on to argue that the statements contained in the NCGs about adoption of IPE and collaborative practices will have an important influence in shaping the future of health professions education in Brazil.

  5. RAPD analysis of the last population of a likely Florida Keys endemic cactus

    Treesearch

    D.R. Gordon; Thomas L. Kubisiak

    1998-01-01

    The semaphore cactus in the Florida Keys has until recently been considered a disjunct location of the Jamaican Opuntia spinosissima. Loss of all but one population in the Keys coupled with recent suggestions that the species should be taxonomically separated from the Jamaican cactus and is, therefore, a Florida Keys endemic, makes this population of conservation...

  6. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  7. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.

  8. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Greg; Wohlwend, Jen

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  9. Simple 2.5 GHz time-bin quantum key distribution

    NASA Astrophysics Data System (ADS)

    Boaron, Alberto; Korzh, Boris; Houlmann, Raphael; Boso, Gianluca; Rusca, Davide; Gray, Stuart; Li, Ming-Jun; Nolan, Daniel; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    We present a 2.5 GHz quantum key distribution setup with the emphasis on a simple experimental realization. It features a three-state time-bin protocol based on a pulsed diode laser and a single intensity modulator. Implementing an efficient one-decoy scheme and finite-key analysis, we achieve record-breaking secret key rates of 1.5 kbps over 200 km of standard optical fibers.

  10. Entropy Driven Self-Assembly in Charged Lock-Key Particles.

    PubMed

    Odriozola, Gerardo; Lozada-Cassou, Marcelo

    2016-07-07

    In this work we study the lock-key model successfully used in supramolecular chemistry and particle self-assembly and gain further insight into the infinite-dilution limit of the lock and key, depletant mediated, effective attraction. We discuss the depletant forces and entropy approaches to self-assembly and give details on the different contributions to the net force for a charged lock and key pair immersed in a solvent plus a primitive model electrolyte. We show a strong correlation between the behavior of the force components and the underlying processes of co-ion and solvent release from the cavity. In addition, we put into context the universal behavior observed for the energy-distance curves when changing the lock and key to solvent size ratio. Basically, we now show that this behavior is not always achieved and depends on the particular system geometry. Finally, we present qualitatively good agreement with experiments when changing the electrolyte concentration, valence, and cavity-key size ratio.

  11. A hierarchical approach to forest landscape pattern characterization.

    PubMed

    Wang, Jialing; Yang, Xiaojun

    2012-01-01

    Landscape spatial patterns have increasingly been considered to be essential for environmental planning and resources management. In this study, we proposed a hierarchical approach for landscape classification and evaluation by characterizing landscape spatial patterns across different hierarchical levels. The case study site is the Red Hills region of northern Florida and southwestern Georgia, well known for its biodiversity, historic resources, and scenic beauty. We used one Landsat Enhanced Thematic Mapper image to extract land-use/-cover information. Then, we employed principal-component analysis to help identify key class-level landscape metrics for forests at different hierarchical levels, namely, open pine, upland pine, and forest as a whole. We found that the key class-level landscape metrics varied across different hierarchical levels. Compared with forest as a whole, open pine forest is much more fragmented. A landscape metric such as CONTIG_MN, which measures whether pine patches are contiguous or not, is more important for characterizing the spatial pattern of pine forest than that of the forest as a whole. This suggests that different metric sets should be used to characterize landscape patterns at different hierarchical levels. We further used these key metrics, along with the total class area, to classify and evaluate subwatersheds through cluster analysis. This study demonstrates a promising approach that can be used to integrate spatial patterns and processes for hierarchical forest landscape planning and management.

  12. Efficient key pathway mining: combining networks and OMICS data.

    PubMed

    Alcaraz, Nicolas; Friedrich, Tobias; Kötzing, Timo; Krohmer, Anton; Müller, Joachim; Pauling, Josch; Baumbach, Jan

    2012-07-01

    Systems biology has emerged over the last decade. Driven by the advances in sophisticated measurement technology the research community generated huge molecular biology data sets. These comprise rather static data on the interplay of biological entities, for instance protein-protein interaction network data, as well as quite dynamic data collected for studying the behavior of individual cells or tissues in accordance with changing environmental conditions, such as DNA microarrays or RNA sequencing. Here we bring the two different data types together in order to gain higher level knowledge. We introduce a significantly improved version of the KeyPathwayMiner software framework. Given a biological network modelled as a graph and a set of expression studies, KeyPathwayMiner efficiently finds and visualizes connected sub-networks where most components are expressed in most cases. It finds all maximal connected sub-networks where all nodes but k exceptions are expressed in all experimental studies but at most l exceptions. We demonstrate the power of the new approach by comparing it to similar approaches with gene expression data previously used to study Huntington's disease. In addition, we demonstrate KeyPathwayMiner's flexibility and applicability to non-array data by analyzing genome-scale DNA methylation profiles from colorectal tumor cancer patients. KeyPathwayMiner release 2 is available as a Cytoscape plugin and online at http://keypathwayminer.mpi-inf.mpg.de.
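
    The extraction problem KeyPathwayMiner solves — maximal connected sub-networks in which all but k nodes are active — can be shown on a toy graph by brute force. This is an illustrative sketch of the (k, l) model with l = 0 exceptions per study (activity collapsed to a single flag); the real tool uses heuristics to scale to genome-sized networks, and the graph here is invented:

    ```python
    from itertools import combinations

    def is_connected(nodes, edges):
        """Connectivity check on the induced subgraph via iterative DFS."""
        nodes = set(nodes)
        if not nodes:
            return False
        adj = {n: set() for n in nodes}
        for a, b in edges:
            if a in nodes and b in nodes:
                adj[a].add(b)
                adj[b].add(a)
        seen, stack = set(), [next(iter(nodes))]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(adj[n] - seen)
        return seen == nodes

    def key_pathways(nodes, edges, active, k):
        """All maximal connected node sets with at most k inactive
        'exception' nodes (brute force over subsets)."""
        valid = [
            set(sub)
            for r in range(1, len(nodes) + 1)
            for sub in combinations(nodes, r)
            if sum(1 for n in sub if not active[n]) <= k
            and is_connected(sub, edges)
        ]
        return [s for s in valid if not any(s < t for t in valid)]

    # Toy interaction network; 'active' marks expression in all cases.
    nodes = ["A", "B", "C", "D", "E"]
    edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")]
    active = {"A": True, "B": True, "C": False, "D": True, "E": True}
    ```

    With k = 0 the inactive hub C splits the path into two solutions; allowing k = 1 exception bridges them into one pathway — the behavior that makes the exception parameter useful for noisy expression calls.
    
    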

  13. Issues in Moderation of National Curriculum Assessment at Key Stage 3.

    ERIC Educational Resources Information Center

    Cowling, Les

    1994-01-01

    Highlights the issues of moderation of teacher judgments for accountability and moderation for achieving consistent assessments. Discusses the meaning of moderation, the need for moderating teacher judgments, approaches to moderation, methods of moderation, and most appropriate methods of moderation at Key Stage 3. Presents an approach to…

  14. Physical Cryptography: A New Approach to Key Generation and Direct Encryption

    DTIC Science & Technology

    2009-11-18

    has been  further studied theoretically and P a g e  | 4    experimentally to only a limited extent. The second is quantum cryptography [3] based on...Std Z39-18 P a g e  | 2    Abstract: The security of key generation and direct encryption in quantum and physical cryptography have been...investigated. It is found that similar to the situation of conventional mathematics based cryptography , fundamental and meaningful security levels for either

  15. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven e.g. by machine learning methods. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest to apply the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
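
    The "simulated null data" check the authors recommend looks like this in miniature: run the identical decoding pipeline once on the real labels and then repeatedly on shuffled labels, and verify the null runs hover at chance. The classifier below is a deliberately simple 1-D nearest-centroid stand-in (data, sample sizes, and the Gaussian class means are invented), not the paper's decoder:

    ```python
    import random
    from statistics import mean

    def nearest_centroid_loo(data, labels):
        """Leave-one-out accuracy of a nearest-centroid classifier
        on scalar features -- the 'analysis method' kept fixed across
        the main analysis and every test analysis."""
        hits = 0
        for i in range(len(data)):
            train = [(x, y) for j, (x, y) in enumerate(zip(data, labels)) if j != i]
            centroids = {c: mean(x for x, y in train if y == c) for c in set(labels)}
            pred = min(centroids, key=lambda c: abs(data[i] - centroids[c]))
            hits += pred == labels[i]
        return hits / len(data)

    rng = random.Random(0)
    # Main analysis: two classes with a genuine mean difference.
    data = [rng.gauss(0, 1) for _ in range(20)] + [rng.gauss(3, 1) for _ in range(20)]
    labels = [0] * 20 + [1] * 20
    real_acc = nearest_centroid_loo(data, labels)

    # Same Analysis Approach: rerun the identical pipeline on simulated
    # null data (shuffled labels); accuracy should stay near chance (0.5).
    null_accs = []
    for _ in range(50):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        null_accs.append(nearest_centroid_loo(data, shuffled))
    ```

    If the null runs came out systematically above (or, as in the paper's crossover example, below) chance, that would expose a flaw in the pipeline itself rather than an effect in the data.
    
    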

  16. Nonnegative constraint analysis of key fluorophores within human breast cancer using native fluorescence spectroscopy excited by selective wavelength of 300 nm

    NASA Astrophysics Data System (ADS)

    Pu, Yang; Sordillo, Laura A.; Alfano, Robert R.

    2015-03-01

    Native fluorescence spectroscopy plays an important role in cancer discrimination. It is widely acknowledged that the emission spectrum of tissue is a superposition of spectra of various salient fluorophores. In this study, the native fluorescence spectra of human cancerous and normal breast tissues excited by the selected wavelength of 300 nm are used to investigate the key building-block fluorophores: tryptophan and reduced nicotinamide adenine dinucleotide (NADH). The basis spectra of these key fluorophores' contributions to the tissue emission spectra are obtained by nonnegative constraint analysis. The emission spectra of human cancerous and normal tissue samples are projected onto the fluorophore spectral subspace. Since previous studies indicate that tryptophan and NADH are key fluorophores related to tumor evolution, it is essential to extract their information from tissue fluorescence while discarding the redundancy. To evaluate the efficacy of this approach for cancer detection, a linear discriminant analysis (LDA) classifier is used to evaluate the sensitivity and specificity. This research demonstrates that native fluorescence spectroscopy measurements are effective for detecting changes in fluorophore composition in tissues due to the development of cancer.
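
    The projection step — expressing a measured emission spectrum as a nonnegative combination of basis fluorophore spectra — is a nonnegative least-squares problem. A minimal sketch using NMF-style multiplicative updates (one of several solvers for this problem; the paper does not specify this particular one, and the four-point "spectra" below are invented for illustration):

    ```python
    def nonneg_unmix(basis, spectrum, iters=500):
        """Nonnegatively constrained unmixing: find c >= 0 minimizing
        ||spectrum - sum_j c_j * basis_j|| via the multiplicative update
        c_j <- c_j * (B^T s)_j / (B^T B c)_j.  All inputs nonnegative."""
        m, n = len(basis), len(spectrum)
        c = [1.0] * m
        bts = [sum(b[i] * spectrum[i] for i in range(n)) for b in basis]
        btb = [[sum(basis[j][i] * basis[k][i] for i in range(n))
                for k in range(m)] for j in range(m)]
        for _ in range(iters):
            denom = [sum(btb[j][k] * c[k] for k in range(m)) for j in range(m)]
            c = [c[j] * bts[j] / denom[j] if denom[j] > 0 else 0.0
                 for j in range(m)]
        return c

    # Hypothetical normalized emission basis spectra on a common grid.
    tryptophan = [0.9, 0.5, 0.1, 0.0]
    nadh       = [0.0, 0.2, 0.6, 0.8]
    measured   = [2.0 * t + 0.5 * n for t, n in zip(tryptophan, nadh)]
    coeffs = nonneg_unmix([tryptophan, nadh], measured)
    ```

    The recovered coefficients (here 2.0 and 0.5 by construction) are the per-fluorophore contributions that are then fed to the LDA classifier.
    
    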

  17. A Participatory Action Research Approach To Evaluating Inclusive School Programs.

    ERIC Educational Resources Information Center

    Dymond, Stacy K.

    2001-01-01

    This article proposes a model for evaluating inclusive schools. Key elements of the model are inclusion of stakeholders in the evaluation process through a participatory action research approach, analysis of program processes and outcomes, use of multiple methods and measures, and obtaining perceptions from diverse stakeholder groups. (Contains…

  18. Analysis of cytokine release assay data using machine learning approaches.

    PubMed

    Xiong, Feiyu; Janko, Marco; Walker, Mindi; Makropoulos, Dorie; Weinstock, Daniel; Kam, Moshe; Hrebien, Leonid

    2014-10-01

    The possible onset of Cytokine Release Syndrome (CRS) is an important consideration in the development of monoclonal antibody (mAb) therapeutics. In this study, several machine learning approaches are used to analyze CRS data. The analyzed data come from a human blood in vitro assay which was used to assess the potential of mAb-based therapeutics to produce cytokine release similar to that induced by Anti-CD28 superagonistic (Anti-CD28 SA) mAbs. The data contain 7 mAbs and two negative controls, a total of 423 samples coming from 44 donors. Three (3) machine learning approaches were applied in combination to observations obtained from that assay, namely (i) Hierarchical Cluster Analysis (HCA); (ii) Principal Component Analysis (PCA) followed by K-means clustering; and (iii) Decision Tree Classification (DTC). All three approaches were able to identify the treatment that caused the most severe cytokine response. HCA was able to provide information about the expected number of clusters in the data. PCA coupled with K-means clustering allowed classification of treatments sample by sample, and visualizing clusters of treatments. DTC models showed the relative importance of various cytokines such as IFN-γ, TNF-α and IL-10 to CRS. The use of these approaches in tandem provides better selection of parameters for one method based on outcomes from another, and an overall improved analysis of the data through complementary approaches. Moreover, the DTC analysis showed in addition that IL-17 may be correlated with CRS reactions, although this correlation has not yet been corroborated in the literature. Copyright © 2014 Elsevier B.V. All rights reserved.
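
    The PCA-plus-K-means stage described above reduces, after projection, to ordinary Lloyd's algorithm on component scores. A minimal 1-D sketch (the "PC1 scores" are invented stand-ins for projected cytokine profiles, with one treatment producing a clearly stronger response):

    ```python
    from statistics import mean

    def kmeans_1d(values, centers, iters=100):
        """Plain Lloyd's algorithm on scalar features, e.g. first
        principal-component scores: assign each point to the nearest
        center, then move each center to its cluster mean."""
        for _ in range(iters):
            clusters = [[] for _ in centers]
            for v in values:
                idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
                clusters[idx].append(v)
            centers = [mean(c) if c else centers[i] for i, c in enumerate(clusters)]
        return centers, clusters

    # Hypothetical PC1 scores: mild treatments near 0, one strong responder.
    scores = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]
    centers, clusters = kmeans_1d(scores, centers=[0.0, 1.0])
    ```

    The clustering cleanly separates the high-release samples from the rest, mirroring how all three approaches in the study single out the treatment with the most severe cytokine response.
    
    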

  19. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities) to be taken into account. Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
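
    At its simplest, the multi-criteria core of such an assessment is an additive value model: each scenario is scored on normalized criteria and ranked by a weighted sum. The criteria names, weights, and scores below are entirely hypothetical illustrations, and the paper's actual framework involves multi-objective optimization beyond this sketch:

    ```python
    def weighted_score(alternative, weights):
        """Additive multi-criteria value: sum of weight * normalized score."""
        return sum(weights[c] * alternative[c] for c in weights)

    def rank_scenarios(scenarios, weights):
        """Rank scenarios by aggregate attractiveness (higher score =
        more attractive to a proliferator, i.e. higher risk)."""
        return sorted(scenarios,
                      key=lambda name: weighted_score(scenarios[name], weights),
                      reverse=True)

    # Hypothetical criteria normalized to [0, 1], 1 = most favorable
    # to a proliferator (difficulty and detectability already inverted).
    weights = {"material_quality": 0.5, "ease": 0.3, "covertness": 0.2}
    scenarios = {
        "clandestine_enrichment": {"material_quality": 0.9, "ease": 0.2, "covertness": 0.7},
        "reactor_plutonium_theft": {"material_quality": 0.6, "ease": 0.5, "covertness": 0.3},
    }
    ranking = rank_scenarios(scenarios, weights)
    ```
    
    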

  20. Small Private Key PKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but the cost of using a random number generator is much higher than that of computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and speeds up signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results at CHES2012. PMID:24651722

  1. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but the cost of using a random number generator is much higher than that of computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and speeds up signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results at CHES2012.
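    The key-size reduction idea, regenerating long key material on demand from a short stored seed, can be sketched in a few lines. The sketch below uses counter-mode SHA-256 as a generic stand-in for the paper's AES-accelerator-based pseudo-random number generator; the seed and key sizes are illustrative, not the scheme's actual parameters:

```python
import hashlib

def expand_key(seed: bytes, nbytes: int) -> bytes:
    """Expand a short stored seed into long key material by counter-mode hashing
    (a stand-in for the paper's AES-accelerator-based pseudo-random generator)."""
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:nbytes]

seed = bytes(range(16))                       # 16-byte secret: the only thing stored
private_key = expand_key(seed, 64 * 1024)     # regenerate a long private key on demand
assert expand_key(seed, 64 * 1024) == private_key  # deterministic, so nothing else is stored
print(len(seed), len(private_key))
```

    Only the short seed needs secure storage; the full private key is reproduced deterministically whenever a signature is generated, trading computation for storage in the spirit of the paper's on-the-fly computation.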

  2. Entangled-coherent-state quantum key distribution with entanglement witnessing

    NASA Astrophysics Data System (ADS)

    Simon, David S.; Jaeger, Gregg; Sergienko, Alexander V.

    2014-01-01

    An entanglement-witness approach to quantum coherent-state key distribution and a system for its practical implementation are described. In this approach, eavesdropping can be detected by a change in sign of either of two witness functions: an entanglement witness S or an eavesdropping witness W. The effects of loss and eavesdropping on system operation are evaluated as a function of distance. Although the eavesdropping witness W does not directly witness entanglement for the system, its behavior remains related to that of the true entanglement witness S. Furthermore, W is easier to implement experimentally than S. W crosses the axis at a finite distance, in a manner reminiscent of entanglement sudden death. The distance at which this occurs changes measurably when an eavesdropper is present. The distance dependence of the two witnesses due to amplitude reduction and due to increased variance resulting from both ordinary propagation losses and possible eavesdropping activity is provided. Finally, the information content and secure key rate of a continuous variable protocol using this witness approach are given.

  3. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae).

    PubMed

    Attigala, Lakshmi; De Silva, Nuwan I; Clark, Lynn G

    2016-04-01

    Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus.

  4. Screening of the key volatile organic compounds of Tuber melanosporum fermentation by aroma sensory evaluation combination with principal component analysis

    PubMed Central

    Liu, Rui-Sang; Jin, Guang-Huai; Xiao, Deng-Rong; Li, Hong-Mei; Bai, Feng-Wu; Tang, Ya-Jie

    2015-01-01

    Aroma results from the interplay of volatile organic compounds (VOCs), and the attributes of microbially produced aromas are significantly affected by fermentation conditions. Among the VOCs, only a few contribute to aroma. Thus, screening and identification of the key VOCs is critical for microbial aroma production. The traditional method is based on gas chromatography-olfactometry (GC-O), which is time-consuming and laborious. Considering the Tuber melanosporum fermentation system as an example, a new method to screen and identify the key VOCs by combining an aroma evaluation method with principal component analysis (PCA) was developed in this work. First, an aroma sensory evaluation method was developed to screen 34 potential favorite aroma samples from 504 fermentation samples. Second, PCA was employed to screen nine common key VOCs from these 34 samples. Third, seven key VOCs were identified by the traditional method. Finally, all seven key VOCs identified by the traditional method were also identified, along with four others, by the new strategy. These results indicate the reliability of the new method and demonstrate it to be a viable alternative to the traditional method. PMID:26655663
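    The PCA screening step can be illustrated with a stdlib-only sketch: compute the covariance matrix of a VOC intensity table, extract the first principal component by power iteration, and rank VOCs by the magnitude of their loadings. The VOC labels and intensities below are invented for illustration, not data from the study:

```python
def first_pc(rows, iters=200):
    """First principal component of a samples-by-features table via power
    iteration on the covariance matrix (no external libraries)."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    X = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

vocs = ["A", "B", "C", "D"]            # hypothetical VOC labels
samples = [[1.0, 0.50, 2.0, 0.30],     # rows = fermentation samples
           [2.0, 0.52, 4.1, 0.31],
           [3.0, 0.49, 6.0, 0.29],
           [4.0, 0.51, 8.1, 0.30],
           [5.0, 0.50, 9.9, 0.30]]
loadings = first_pc(samples)
ranked = sorted(vocs, key=lambda name: -abs(loadings[vocs.index(name)]))
print(ranked[:2])  # the VOCs contributing most to the dominant variance axis
```

    VOCs "A" and "C" vary strongly across samples while "B" and "D" are nearly constant, so the first component loads on them; ranking by absolute loading is one simple way PCA can flag candidate key VOCs before sensory confirmation.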

  5. Adversarial risk analysis with incomplete information: a level-k approach.

    PubMed

    Rothschild, Casey; McLay, Laura; Guikema, Seth

    2012-07-01

    This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
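    The flavor of level-k reasoning can be shown with a stripped-down, deterministic defend-attack game. This omits the paper's probabilistic revelation of the defender's move, and the target values are hypothetical: a level-0 attacker is non-strategic, and each higher level best-responds to the level below.

```python
values = {"A": 10.0, "B": 6.0}  # hypothetical target values
MITIGATION = 0.5                # a defended target suffers half the damage

def damage(defended, attacked):
    d = values[attacked]
    return d * MITIGATION if attacked == defended else d

def attacker_level(k):
    """Attack chosen after k rounds of iterated best response; a level-0
    attacker is non-strategic and simply hits the richest target."""
    if k == 0:
        return max(values, key=values.get)
    d = defender_level(k)                      # best-respond to a level-k defender
    return max(values, key=lambda t: damage(d, t))

def defender_level(k):
    a = attacker_level(k - 1)                  # best-respond to a level-(k-1) attacker
    return min(values, key=lambda t: damage(t, a))

print([attacker_level(k) for k in range(4)])  # players leapfrog between targets
```

    The alternating sequence of targets is the characteristic level-k dynamic: each side reasons one step deeper than its opponent, rather than solving for a fixed-point equilibrium.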

  6. Multivariate geometry as an approach to algal community analysis

    USGS Publications Warehouse

    Allen, T.F.H.; Skagen, S.

    1973-01-01

    Multivariate analyses are put in the context of more usual approaches to phycological investigations. The intuitive common-sense involved in methods of ordination, classification and discrimination are emphasised by simple geometric accounts which avoid jargon and matrix algebra. Warnings are given that artifacts result from technique abuses by the naive or over-enthusiastic. An analysis of a simple periphyton data set is presented as an example of the approach. Suggestions are made as to situations in phycological investigations, where the techniques could be appropriate. The discipline is reprimanded for its neglect of the multivariate approach.

  7. Analysis of key safety metrics of thorium utilization in LWRs

    DOE PAGES

    Ade, Brian J.; Bowman, Stephen M.; Worrall, Andrew; ...

    2016-04-08

    Here, thorium has great potential to stretch nuclear fuel reserves because of its natural abundance and because it is possible to breed the 232Th isotope into a fissile fuel (233U). Various scenarios exist for utilization of thorium in the nuclear fuel cycle, including use in different nuclear reactor types (e.g., light water, high-temperature gas-cooled, fast spectrum sodium, and molten salt reactors), along with use in advanced accelerator-driven systems and even in fission-fusion hybrid systems. The most likely near-term application of thorium in the United States is in currently operating light water reactors (LWRs). This use is primarily based on concepts that mix thorium with uranium (UO2 + ThO2) or that add fertile thorium (ThO2) fuel pins to typical LWR fuel assemblies. Utilization of mixed fuel assemblies (PuO2 + ThO2) is also possible. The addition of thorium to currently operating LWRs would result in a number of different phenomenological impacts to the nuclear fuel. Thorium and its irradiation products have different nuclear characteristics from those of uranium and its irradiation products. ThO2, alone or mixed with UO2 fuel, leads to different chemical and physical properties of the fuel. These key reactor safety–related issues have been studied at Oak Ridge National Laboratory and documented in “Safety and Regulatory Issues of the Thorium Fuel Cycle” (NUREG/CR-7176, U.S. Nuclear Regulatory Commission, 2014). Various reactor analyses were performed using the SCALE code system for comparison of key performance parameters of both ThO2 + UO2 and ThO2 + PuO2 against those of UO2 and typical UO2 + PuO2 mixed oxide fuels, including reactivity coefficients and power sharing between surrounding UO2 assemblies and the assembly of interest. The decay heat and radiological source terms for spent fuel after its discharge from the reactor are also presented. Based on this evaluation, potential impacts on safety requirements and

  8. How to Tackle Key Challenges in the Promotion of Physical Activity among Older Adults (65+): The AEQUIPA Network Approach

    PubMed Central

    Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R.; Voelcker-Rehage, Claudia; Zeeb, Hajo

    2017-01-01

    The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets. PMID:28375177

  9. How to Tackle Key Challenges in the Promotion of Physical Activity among Older Adults (65+): The AEQUIPA Network Approach.

    PubMed

    Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R; Voelcker-Rehage, Claudia; Zeeb, Hajo

    2017-04-04

    The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets.

  10. Personalized translational epilepsy research - Novel approaches and future perspectives: Part I: Clinical and network analysis approaches.

    PubMed

    Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. Copyright © 2017 Elsevier Inc.

  11. Quantum key distribution using gaussian-modulated coherent states

    NASA Astrophysics Data System (ADS)

    Grosshans, Frédéric; Van Assche, Gilles; Wenger, Jérôme; Brouri, Rosa; Cerf, Nicolas J.; Grangier, Philippe

    2003-01-01

    Quantum continuous variables are being explored as an alternative means to implement quantum key distribution, which is usually based on single photon counting. The former approach is potentially advantageous because it should enable higher key distribution rates. Here we propose and experimentally demonstrate a quantum key distribution protocol based on the transmission of gaussian-modulated coherent states (consisting of laser pulses containing a few hundred photons) and shot-noise-limited homodyne detection; squeezed or entangled beams are not required. Complete secret key extraction is achieved using a reverse reconciliation technique followed by privacy amplification. The reverse reconciliation technique is in principle secure for any value of the line transmission, against gaussian individual attacks based on entanglement and quantum memories. Our table-top experiment yields a net key transmission rate of about 1.7 megabits per second for a loss-free line, and 75 kilobits per second for a line with losses of 3.1 dB. We anticipate that the scheme should remain effective for lines with higher losses, particularly because the present limitations are essentially technical, so that significant margin for improvement is available on both the hardware and software.

  12. Channel analysis for single photon underwater free space quantum key distribution.

    PubMed

    Shi, Peng; Zhao, Shi-Cheng; Gu, Yong-Jian; Li, Wen-Dong

    2015-03-01

    We investigate the optical absorption and scattering properties of underwater media pertinent to our underwater free space quantum key distribution (QKD) channel model. With the vector radiative transfer theory and Monte Carlo method, we obtain the attenuation of photons, the fidelity of the scattered photons, the quantum bit error rate, and the sifted key generation rate of underwater quantum communication. It can be observed from our simulations that the most secure single photon underwater free space QKD is feasible in the clearest ocean water.
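    The dominant loss mechanism in such a channel can be approximated with the Beer-Lambert law. The sketch below treats scattering as pure loss, a far cruder picture than the paper's vector radiative transfer and Monte Carlo treatment, and uses an illustrative attenuation coefficient rather than a value from the study:

```python
import math

def transmittance(c, distance_m):
    """Fraction of photons surviving the path (Beer-Lambert law); scattering is
    treated as pure loss in this simplified single-beam picture."""
    return math.exp(-c * distance_m)

C_CLEAR = 0.05  # illustrative attenuation coefficient in 1/m, not a value from the study
for d in (10, 50, 100):
    print(d, round(transmittance(C_CLEAR, d), 4))
```

    Because attenuation is exponential in distance, the sifted key rate collapses quickly beyond a few tens of metres, which is why the paper finds secure underwater QKD feasible only in the clearest ocean water.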

  13. Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis

    NASA Astrophysics Data System (ADS)

    Chou, Hui-Yu; Yang, Jyh-Bin

    2017-10-01

    The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence. More information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As the Building Information Modeling (BIM) technique has developed rapidly, using BIM and 4D simulation techniques has been proposed and implemented. Obvious benefits have been achieved, especially in identifying and solving construction consequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by the BIM technique. The research results could serve as a foundation for developing new approaches for resolving schedule delay disputes.

  14. Key species and impact of fishery through food web analysis: A case study from Baja California Sur, Mexico

    NASA Astrophysics Data System (ADS)

    Rocchi, Marta; Scotti, Marco; Micheli, Fiorenza; Bodini, Antonio

    2017-01-01

    Ecosystem-Based Management (EBM) aims to support the protection of natural ecosystems and to improve economic activities. It requires considering all of the actors interacting in social-ecological systems (e.g., fish and fishers) in the understanding that their interplay determines the dynamic behavior of the single actors as well as that of the system as a whole. Connections are thus central to EBM. Within the ecological dimension of socio-ecological systems, interactions between species define such connections. Understanding how connections affect ecosystem and species dynamics is often impaired by a lack of data. We propose food web network analysis as a tool to help bridge the gap between EBM theory and practice in data-poor contexts, and illustrate this approach through its application to a coastal marine ecosystem in Baja California Sur, Mexico. First, we calculated centrality indices to identify which key (i.e., most central) species must be considered when designing strategies for sustainable resource management. Second, we analyzed the resilience of the system by measuring changes in food web structure due to the local extinction of vulnerable species (i.e., by mimicking the possible effect of excessive fishing pressure). The consequences of species removals were quantified in terms of impacts on global structural indices and species' centrality indices. Overall, we found that this coastal ecosystem shows high resilience to species loss. We identified species (e.g., Octopus sp. and the kelp bass, Paralabrax clathratus) whose protection could further decrease the risk of potential negative impacts of fishing activities on the Baja California Sur food web. This work introduces an approach that can be applied to other ecosystems to aid the implementation of EBM in data-poor contexts.
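    The centrality and species-removal analyses can be mimicked on a toy predator-prey graph using nothing but dictionaries. The species names and links below are invented for illustration and are not the Baja California Sur data:

```python
# Toy predator -> prey links; species names are invented, not the study's data.
web = {
    "kelp bass": ["octopus", "crab", "small fish"],
    "octopus": ["crab", "snail"],
    "small fish": ["plankton"],
    "crab": ["snail"],
    "sea otter": ["octopus", "snail"],
}

def degree_centrality(links):
    """Total (in + out) feeding links per species: the simplest centrality index."""
    deg = {}
    for pred, prey in links.items():
        deg[pred] = deg.get(pred, 0) + len(prey)
        for p in prey:
            deg[p] = deg.get(p, 0) + 1
    return deg

def remove_species(links, species):
    """Mimic local extinction: delete the node and every link touching it."""
    return {pred: [p for p in prey if p != species]
            for pred, prey in links.items() if pred != species}

deg = degree_centrality(web)
key_species = max(deg, key=deg.get)
print(key_species, deg[key_species])
```

    Ranking species by centrality flags candidates for protection, and re-running the index after `remove_species` shows how much structure is lost when a heavily fished node disappears, which is the resilience logic of the study in miniature.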

  15. Analysis of a Complex Faulted CO2 Reservoir Using a Three-dimensional Hydro-geochemical-Mechanical Approach

    DOE PAGES

    Nguyen, Ba Nghiep; Hou, Zhangshuan; Bacon, Diana H.; ...

    2017-08-18

    This work applies a three-dimensional (3D) multiscale approach recently developed to analyze a complex CO2 faulted reservoir that includes some key geological features of the San Andreas and nearby faults. The approach couples the STOMP-CO2-R code for flow and reactive transport modeling to the ABAQUS® finite element package for geomechanical analysis. The objective is to examine the coupled hydro-geochemical-mechanical impact on the risk of hydraulic fracture and fault slip in a complex and representative CO2 reservoir that contains two nearly parallel faults. STOMP-CO2-R/ABAQUS® coupled analyses of this reservoir are performed assuming extensional and compressional stress regimes to predict evolutions of fluid pressure, stress and strain distributions as well as potential fault failure and leakage of CO2 along the fault damage zones. The tendency for the faults to slip and pressure margin to fracture are examined in terms of stress regime, mineral composition, crack distributions in the fault damage zones and geomechanical properties. Here, this model in combination with a detailed description of the faults helps assess the coupled hydro-geochemical-mechanical effect.

  16. Analysis of a Complex Faulted CO2 Reservoir Using a Three-dimensional Hydro-geochemical-Mechanical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Hou, Zhangshuan; Bacon, Diana H.

    This work applies a three-dimensional (3D) multiscale approach recently developed to analyze a complex CO2 faulted reservoir that includes some key geological features of the San Andreas and nearby faults. The approach couples the STOMP-CO2-R code for flow and reactive transport modeling to the ABAQUS® finite element package for geomechanical analysis. The objective is to examine the coupled hydro-geochemical-mechanical impact on the risk of hydraulic fracture and fault slip in a complex and representative CO2 reservoir that contains two nearly parallel faults. STOMP-CO2-R/ABAQUS® coupled analyses of this reservoir are performed assuming extensional and compressional stress regimes to predict evolutions of fluid pressure, stress and strain distributions as well as potential fault failure and leakage of CO2 along the fault damage zones. The tendency for the faults to slip and pressure margin to fracture are examined in terms of stress regime, mineral composition, crack distributions in the fault damage zones and geomechanical properties. Here, this model in combination with a detailed description of the faults helps assess the coupled hydro-geochemical-mechanical effect.

  17. Teaching Public Administration: Key Themes 1996-2016

    ERIC Educational Resources Information Center

    Fenwick, John

    2018-01-01

    In this article, the aim is to explore some of the key themes to emerge in the journal during the past two decades. Each selected theme will be reviewed in the light of issues raised in particular papers. The aim of this approach is, first, to facilitate reflection upon the contribution of the journal as its subject matter has moved from a concern…

  18. Experiences of surgical continence management approaches for cloacal anomalies: a qualitative analysis based on 6 women.

    PubMed

    Liao, L-M; Baker, E; Boyle, M E; Woodhouse, C R J; Creighton, S M

    2014-10-01

    The aim of this qualitative study was to gain insight into the health care experiences of young women diagnosed with cloacal anomalies, with a special focus on continence management. Qualitative analysis of one-to-one interviews. A tertiary center for congenital anomalies of the urogenital tract in London. Six women aged 16 to 24 with cloacal anomalies. Tape-recorded one-to-one semi-structured interviews with a skilled interviewer. The taped interviews were transcribed verbatim and analyzed using interpretative phenomenological analysis according to the research question. Organizing themes across all of the accounts were identified. Two organizing themes concerning our research interests are summarized. The first theme, Personal Agency in the Hands of Experts, focuses on the interviewees' appreciation of their life-saving surgical care and their involvement in treatment decisions. The second theme, Compromises and Trade-Offs, focuses on what it was like to live with the more traditional versus the more advanced continence methods. Reliability emerged as a key priority in terms of continence treatment outcome. Gratitude may have interfered with the women's honest communications during treatment decision and evaluation consultations. A more developed approach to communication about the complex interventions proposed, founded on a nuanced understanding of users' perspectives, can enhance informed decision making about continence management approaches. Despite these specific gaps, the interviewees were appreciative of their care and optimistic about life. Copyright © 2014 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  19. Hands-on Approach to Prepare Specialists in Climate Changes Modeling and Analysis Using an Information-Computational Web-GIS Portal "Climate"

    NASA Astrophysics Data System (ADS)

    Shulgina, T. M.; Gordova, Y. E.; Martynova, Y. V.

    2014-12-01

    Making education relevant to workplace tasks is a key problem of higher education in the professional field of environmental sciences. To answer this challenge, several new courses for students of the "Climatology" and "Meteorology" specialties were developed and implemented at Tomsk State University, combining theoretical knowledge from up-to-date environmental sciences with computational tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which gave us an opportunity to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational web-GIS platform "Climate" (http://climate.scert.ru/). The platform has a set of tools and databases allowing a researcher to perform climate change analysis for a selected territory. The tools are also used for students' trainings, which contain practical tasks on climate modeling and climate change assessment and analysis. Laboratory exercises cover three topics: "Analysis of regional climate changes"; "Analysis of climate extreme indices on the regional scale"; and "Analysis of future climate". They are designed to consolidate students' knowledge of the discipline, to instill the skills to work independently with large amounts of geophysical data using the modern processing and analysis tools of the web-GIS platform "Climate", and to train them to present the results of laboratory work as reports with a statement of the problem, the results of calculations, and a logically justified conclusion. Thus, students are engaged in the use of modern tools of geophysical data analysis, which supports the dynamic of their professional learning. The approach can help us to fill in this gap because it is the only approach that offers experience, increases student involvement, and advances the use of modern

  20. Screening for Key Pathways Associated with the Development of Osteoporosis by Bioinformatics Analysis

    PubMed Central

    Liu, Yanqing; Wang, Yueqiu; Zhang, Yanxia; Liu, Zhiyong; Xiang, Hongfei; Peng, Xianbo

    2017-01-01

    Objectives. We aimed to find the key pathways associated with the development of osteoporosis. Methods. We downloaded the expression profile data of GSE35959 and analyzed the differentially expressed genes (DEGs) in 3 comparison groups (old_op versus middle, old_op versus old, and old_op versus senescent). KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway enrichment analyses were carried out. Besides, Venn diagram analysis and gene functional interaction (FI) network analysis were performed. Results. In total, 520 DEGs, 966 DEGs, and 709 DEGs were obtained in the old_op versus middle, old_op versus old, and old_op versus senescent groups, respectively. The lysosome pathway was the significantly enriched pathway among the intersection genes. The pathways enriched by the subnetwork modules suggested that mitotic metaphase and anaphase and signaling by Rho GTPases in module 1 involved more proteins from the module. Conclusions. The lysosome pathway, mitotic metaphase and anaphase, and signaling by Rho GTPases may be involved in the development of osteoporosis. Furthermore, Rho GTPases may regulate the balance of bone resorption and bone formation by controlling osteoclasts and osteoblasts. These 3 pathways may be regarded as treatment targets for osteoporosis. PMID:28466021
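    The Venn-diagram step, finding genes dysregulated in all three comparisons, reduces to a set intersection. The gene identifiers below are placeholders for illustration, not the study's actual DEG lists:

```python
# Placeholder DEG identifiers for the study's three comparison groups.
old_op_vs_middle = {"CTSK", "RHOA", "E2F3", "LAMP1", "ACTB"}
old_op_vs_old = {"CTSK", "RHOA", "LAMP1", "TP53"}
old_op_vs_senescent = {"CTSK", "LAMP1", "RHOA", "MYC"}

# The Venn-diagram core: genes dysregulated in every comparison.
intersection = old_op_vs_middle & old_op_vs_old & old_op_vs_senescent
print(sorted(intersection))  # ['CTSK', 'LAMP1', 'RHOA']
```

    Restricting downstream pathway enrichment and network analysis to this core set is what lets the study focus on genes that are consistently altered rather than specific to one comparison.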

  1. Secure SCADA communication by using a modified key management scheme.

    PubMed

    Rezai, Abdalhossein; Keshavarzi, Parviz; Moravej, Zahra

    2013-07-01

    This paper presents and evaluates a new cryptographic key management scheme which increases the efficiency and security of Supervisory Control And Data Acquisition (SCADA) communication. In the proposed key management scheme, two key update phases are used: session key update and master key update. In the session key update phase, session keys are generated in the master station. In the master key update phase, the Elliptic Curve Diffie-Hellman (ECDH) protocol is used. The Poisson process is also used to model the Security Index (SI) and Quality of Service (QoS). Our analysis shows that the proposed key management scheme not only supports the required speed in the MODBUS implementation but also has several advantages compared to other key management schemes for secure communication in SCADA networks. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
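
    The master key update step is a standard Diffie-Hellman agreement followed by key derivation. As a rough illustration of that flow, here is a finite-field toy stand-in for the ECDH exchange the paper uses, with a deliberately small prime; this is a sketch of the idea, not the paper's implementation:

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman standing in for the ECDH master key update.
# P is a Mersenne prime large enough for a demo, far too small for real use.
P = 2**127 - 1
G = 3

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def derive_master_key(priv, peer_pub):
    """Agree on a shared secret and hash it down to a 256-bit master key."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

mtu_priv, mtu_pub = keypair()   # master terminal unit
rtu_priv, rtu_pub = keypair()   # remote terminal unit
```

    Each side derives the same master key from its own private value and the peer's public value; a real SCADA deployment would use an elliptic-curve group such as P-256 and an authenticated exchange.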

  2. An application of different dioids in public key cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com

    2014-11-18

    Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, the digital signature scheme is presented.

  3. Key Strengths of an Innovative Volunteer Training Workshop

    ERIC Educational Resources Information Center

    Sellick, Angelika; Bournot-Trites, Monique; Reeder, Ken; Scales, Andrew; Smith, Mark; Zappa-Hollman, Sandra

    2011-01-01

    The study involved 14 volunteer facilitators, four UBC staff members, and the researcher as participant; the data collected were observation notes, questionnaires, results from focus groups, and interviews. The study revealed that the key strengths of the training workshop lay in its approach to training, its focus on confidence and capacity…

  4. The Digital Thread as the Key Enabler

    DTIC Science & Technology

    2016-11-01

    17 Defense AT&L: November-December 2016 The Digital Thread as the Key Enabler Col. Keith Bearden, USAF Bearden is the deputy director of...enabling you to do your job better, faster and cheaper. There is one initiative, the key enabler, to accomplish this goal—the digital thread . But let’s... process that would allow for rapid cross- domain analysis and technology transition prior to bending metal. • Re-establish a culture of “hands-on

  5. Rethinking Approaches to Exploration and Analysis of Big Data in Earth Science

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Maskey, M.

    2015-12-01

    With increasing amounts of data available for exploration and analysis, there are increasing numbers of users that need information extracted from the data for very specific purposes. Many of the specific purposes may not have even been considered yet, so how do computational and data scientists plan for this diverse and not well defined set of possible users? There are challenges to be considered in the computational architectures, as well as the organizational structures for the data to allow for the best possible exploration and analytical capabilities. Data analytics need to be a key component in thinking about the data structures and types of storage of these large amounts of data, coming from a variety of sensing platforms that may be space based, airborne, in situ and social media. How do we provide for better capabilities for exploration and analysis at the point of collection for real-time or near real-time requirements? This presentation will address some of the approaches being considered and the challenges the computational and data science communities are facing in collaboration with the Earth Science research and application communities.

  6. Fair market value: taking a proactive approach.

    PubMed

    Romero, Richard A

    2008-04-01

    A valuation report assessing the fair market value of a contractual arrangement should include: A description of the company, entity, or circumstance being valued. Analysis of general economic conditions that are expected to affect the enterprise. Evaluation of economic conditions in the medical services industry. Explanation of the various valuation approaches that were considered. Documentation of key underlying assumptions, including revenue and expense projections, projected profit, and ROI.

  7. A fast key generation method based on dynamic biometrics to secure wireless body sensor networks for p-health.

    PubMed

    Zhang, G H; Poon, Carmen C Y; Zhang, Y T

    2010-01-01

    Body sensor networks (BSNs) have emerged as a new technology for healthcare applications, but the security of communication in BSNs remains a formidable challenge yet to be resolved. The paper discusses the typical attacks faced by BSNs and proposes a fast biometric-based approach to generate keys for ensuring confidentiality and authentication in BSN communications. The approach was tested on 900 segments of electrocardiogram. Each segment was 4 seconds long and used to generate a 128-bit key. The results of the study found that the entropies of 96% of the keys were above 0.95 and 99% of the Hamming distances calculated from any two keys were above 50 bits. Based on the randomness and distinctiveness of these keys, it is concluded that the fast biometric-based approach has great potential to be used to secure communication in BSNs for health applications.
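
    The two figures of merit quoted above are easy to reproduce for any batch of candidate keys: bitwise Shannon entropy measures the randomness of a single key, and pairwise Hamming distance measures distinctiveness between keys. A minimal sketch of the evaluation side only (not the paper's ECG-based key generator):

```python
import math

def hamming_distance(k1: bytes, k2: bytes) -> int:
    """Number of differing bits between two equal-length keys."""
    return sum(bin(a ^ b).count("1") for a, b in zip(k1, k2))

def bit_entropy(key: bytes) -> float:
    """Shannon entropy (bits per bit) of the key's bit sequence."""
    bits = "".join(f"{byte:08b}" for byte in key)
    p1 = bits.count("1") / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    p0 = 1.0 - p1
    return -(p0 * math.log2(p0) + p1 * math.log2(p1))
```

    The paper's thresholds translate directly: a batch of keys passes if 96% have entropy above 0.95 and 99% of pairwise distances exceed 50 bits.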

  8. Eavesdropping on counterfactual quantum key distribution with finite resources

    NASA Astrophysics Data System (ADS)

    Liu, Xingtong; Zhang, Bo; Wang, Jian; Tang, Chaojing; Zhao, Jingjing; Zhang, Sheng

    2014-08-01

    A striking scheme called "counterfactual quantum cryptography" gives a conceptually new approach to accomplish the task of key distribution. It allows two legitimate parties to share a secret even though a particle carrying secret information is not, in fact, transmitted through the quantum channel. Since an eavesdropper cannot directly access the entire quantum system of each signal particle, the protocol seems to provide practical security advantages. However, here we propose an eavesdropping method which works on the scheme in a finite key scenario. We show that, for practical systems only generating a finite number of keys, the eavesdropper can obtain all of the secret information without being detected. We also present an improved protocol as a countermeasure against this attack.

  9. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

    Down-Bottom-Up Domain Analysis Process (1990 Version) ..... 14 Figure 8. FODA's Domain Analysis Process ............................................ 16... FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984)? 1. Introduction. Domain analysis is still immature... Analysis Process 16 2. An Overview of Some Domain Analysis Approaches 2.4.3 Examples. The FODA report illustrates the process by using the window management

  10. Evaluating Digital Health Interventions: Key Questions and Approaches.

    PubMed

    Murray, Elizabeth; Hekler, Eric B; Andersson, Gerhard; Collins, Linda M; Doherty, Aiden; Hollis, Chris; Rivera, Daniel E; West, Robert; Wyatt, Jeremy C

    2016-11-01

    Digital health interventions have enormous potential as scalable tools to improve health and healthcare delivery by improving effectiveness, efficiency, accessibility, safety, and personalization. Achieving these improvements requires a cumulative knowledge base to inform development and deployment of digital health interventions. However, evaluations of digital health interventions present special challenges. This paper aims to examine these challenges and outline an evaluation strategy in terms of the research questions needed to appraise such interventions. As they are at the intersection of biomedical, behavioral, computing, and engineering research, methods drawn from all of these disciplines are required. Relevant research questions include defining the problem and the likely benefit of the digital health intervention, which in turn requires establishing the likely reach and uptake of the intervention, the causal model describing how the intervention will achieve its intended benefit, key components, and how they interact with one another, and estimating overall benefit in terms of effectiveness, cost effectiveness, and harms. Although RCTs are important for evaluation of effectiveness and cost effectiveness, they are best undertaken only when: (1) the intervention and its delivery package are stable; (2) these can be implemented with high fidelity; and (3) there is a reasonable likelihood that the overall benefits will be clinically meaningful (improved outcomes or equivalent outcomes at lower cost). Broadening the portfolio of research questions and evaluation methods will help with developing the necessary knowledge base to inform decisions on policy, practice, and research. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  11. Improving Argumentative Writing: Effects of a Blended Learning Approach and Gamification

    ERIC Educational Resources Information Center

    Lam, Yau Wai; Hew, Khe Foon; Chiu, Kin Fung

    2018-01-01

    This study investigated the effectiveness of a blended learning approach--involving the thesis, analysis, and synthesis key (TASK) procedural strategy; online Edmodo discussions; online message labels; and writing models--on student argumentative writing in a Hong Kong secondary school. It also examined whether the application of digital game…

  12. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  13. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, and is meant to support decisions that have important social and commercial implications. Residual moveout analysis, which is an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  14. An Analysis of the Contents and Pedagogy of Al-Kashi's 1427 "Key to Arithmetic" (Miftah Al-Hisab)

    ERIC Educational Resources Information Center

    Ta'ani, Osama Hekmat

    2011-01-01

    Al-Kashi's 1427 "Key to Arithmetic" had important use over several hundred years in mathematics teaching in Medieval Islam throughout the time of the Ottoman Empire. Its pedagogical features have never been studied before. In this dissertation I have made a close pedagogical analysis of these features and discovered several teaching…

  15. Key Factors Influencing the Energy Absorption of Dual-Phase Steels: Multiscale Material Model Approach and Microstructural Optimization

    NASA Astrophysics Data System (ADS)

    Belgasam, Tarek M.; Zbib, Hussein M.

    2018-06-01

    The increase in use of dual-phase (DP) steel grades by vehicle manufacturers to enhance crash resistance and reduce car body weight requires the development of a clear understanding of the effect of various microstructural parameters on the energy absorption in these materials. Accordingly, DP steelmakers are interested in predicting the effect of various microscopic factors as well as optimizing microstructural properties for application in crash-relevant components of vehicle bodies. This study presents a microstructure-based approach using a multiscale material and structure model. In this approach, Digimat and LS-DYNA software were coupled and employed to provide a full micro-macro multiscale material model, which is then used to simulate tensile tests. Microstructures with varied ferrite grain sizes, martensite volume fractions, and carbon content in DP steels were studied. The impact of these microstructural features at different strain rates on energy absorption characteristics of DP steels is investigated numerically using an elasto-viscoplastic constitutive model. The model is implemented in a multiscale finite-element framework. A comprehensive statistical parametric study using response surface methodology is performed to determine the optimum microstructural features for a required tensile toughness at different strain rates. The simulation results are validated using experimental data found in the literature. The developed methodology proved to be effective for investigating the influence and interaction of key microscopic properties on the energy absorption characteristics of DP steels. Furthermore, it is shown that this method can be used to identify optimum microstructural conditions at different strain-rate conditions.

  17. Optimization of tribological performance of SiC embedded composite coating via Taguchi analysis approach

    NASA Astrophysics Data System (ADS)

    Maleque, M. A.; Bello, K. A.; Adebisi, A. A.; Akma, N.

    2017-03-01

    The tungsten inert gas (TIG) torch is one of the most recently used heat sources for surface modification of engineering parts, giving similar results to the more expensive high-power laser technique. In this study, a ceramic-based embedded composite coating has been produced by precoating silicon carbide (SiC) powders on an AISI 4340 low alloy steel substrate using the TIG welding torch process. A design of experiment based on the Taguchi approach has been adopted to optimize the TIG cladding process parameters. The L9 orthogonal array and the signal-to-noise ratio were used to study the effect of TIG welding parameters such as arc current, travelling speed, welding voltage and argon flow rate on the tribological response behaviour (wear rate, surface roughness and wear track width). The objective of the study was to identify the optimal design parameters that significantly minimize each of the surface quality characteristics. The analysis of the experimental results revealed that the argon flow rate was the most influential factor contributing to the minimum wear and surface roughness of the modified coating surface. On the other hand, the key factor in reducing wear scar is the welding voltage. Finally, the convenient and economical Taguchi approach used in this study was effective in finding the optimal factor settings for obtaining minimum wear rate, wear scar and surface roughness responses in TIG-coated surfaces.
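
    For "smaller is better" responses such as wear rate or surface roughness, the Taguchi signal-to-noise ratio is S/N = -10 log10(mean(y^2)), and the factor level with the highest mean S/N is preferred. A minimal sketch (the replicate values below are illustrative, not data from the study):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for a smaller-is-better response (e.g. wear rate)."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Illustrative wear-rate replicates at two hypothetical argon flow-rate levels:
sn_low_flow = sn_smaller_is_better([0.42, 0.45, 0.40])
sn_high_flow = sn_smaller_is_better([0.21, 0.19, 0.22])
best = "high flow" if sn_high_flow > sn_low_flow else "low flow"
```

    Averaging such S/N values per factor level across the L9 runs is what singles out the most influential parameter.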

  18. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae)1

    PubMed Central

    Attigala, Lakshmi; De Silva, Nuwan I.; Clark, Lynn G.

    2016-01-01

    Premise of the study: Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. Methods and Results: A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). Conclusions: WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus. PMID:27144109

  19. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
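
    The core of the approach is mechanical: scan an existing input file for tokens of the form "5.25 +/- 0.01" and replace each with a random draw inside its tolerance band, once per Monte Carlo sample. A minimal sketch of that substitution step (the regex and the uniform sampling are assumptions; the actual tooling and distributions used with LAURA, HARA, and FIAT may differ):

```python
import random
import re

# Assumed token format: "5.25 +/- 0.01", as written on an engineering drawing.
TOL = re.compile(r"(-?\d+(?:\.\d+)?)\s*\+/-\s*(\d+(?:\.\d+)?)")

def sample_inputs(text: str, rng: random.Random) -> str:
    """Replace each 'value +/- tol' token with a uniform draw from its band."""
    def draw(match):
        nominal, tol = float(match.group(1)), float(match.group(2))
        return f"{rng.uniform(nominal - tol, nominal + tol):.6g}"
    return TOL.sub(draw, text)

rng = random.Random(0)
one_sample = sample_inputs("wall_temperature = 5.25 +/- 0.01", rng)
```

    Because the substitution never parses the input file's structure, the same routine works unchanged across disparate simulation codes, which is the point of the natural-language tolerance syntax.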

  20. Synthetic data sets for the identification of key ingredients for RNA-seq differential analysis.

    PubMed

    Rigaill, Guillem; Balzergue, Sandrine; Brunaud, Véronique; Blondet, Eddy; Rau, Andrea; Rogier, Odile; Caius, José; Maugis-Rabusseau, Cathy; Soubigou-Taconnat, Ludivine; Aubourg, Sébastien; Lurin, Claire; Martin-Magniette, Marie-Laure; Delannoy, Etienne

    2018-01-01

    Numerous statistical pipelines are now available for the differential analysis of gene expression measured with RNA-sequencing technology. Most of them are based on similar statistical frameworks after normalization, differing primarily in the choice of data distribution, mean and variance estimation strategy and data filtering. We propose an evaluation of the impact of these choices when few biological replicates are available through the use of synthetic data sets. This framework is based on real data sets and allows the exploration of various scenarios differing in the proportion of non-differentially expressed genes. Hence, it provides an evaluation of the key ingredients of the differential analysis, free of the biases associated with the simulation of data using parametric models. Our results show the relevance of a proper modeling of the mean by using linear or generalized linear modeling. Once the mean is properly modeled, the impact of the other parameters on the performance of the test is much less important. Finally, we propose to use the simple visualization of the raw P-value histogram as a practical evaluation criterion of the performance of differential analysis methods on real data sets. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
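
    The paper's closing suggestion, inspecting the raw P-value histogram, rests on a simple fact: a well-calibrated test yields uniform P-values for non-differentially expressed genes, so a healthy histogram is flat with a single spike near zero from the truly differential genes. A small synthetic sketch of that diagnostic (counts only; in practice one would plot them):

```python
import random

def pvalue_histogram(pvals, bins=20):
    """Bin raw P-values into equal-width bins on [0, 1]."""
    counts = [0] * bins
    for p in pvals:
        counts[min(int(p * bins), bins - 1)] += 1
    return counts

# A well-behaved analysis: uniform null P-values plus a spike near zero
# from the differentially expressed genes.
rng = random.Random(42)
null_p = [rng.random() for _ in range(9000)]
de_p = [rng.random() * 0.01 for _ in range(1000)]
counts = pvalue_histogram(null_p + de_p)
```

    A hump in the middle or a dip near zero instead signals a misspecified mean-variance model, which is the failure mode the synthetic data sets are designed to expose.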

  1. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  2. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distribution of scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while

  3. Systems Approach to Arms Control Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, K; Neimeyer, I; Listner, C

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  4. Conversation Analysis--A Discourse Approach to Teaching Oral English Skills

    ERIC Educational Resources Information Center

    Wu, Yan

    2013-01-01

    This paper explores a pedagogical approach to teaching oral English: Conversation Analysis. First, features of spoken language are described in comparison to written language. Second, Conversation Analysis theory is elaborated in terms of adjacency pairs, turn-taking, repairs, sequences, openings and closings, and feedback. Third, under the…

  5. Foreign Languages: Key Links in the Chain of Learning.

    ERIC Educational Resources Information Center

    Mead, Robert G., Jr., Ed.

    The articles discuss the necessity of including foreign language as an integral part of the curriculum at all levels of instruction. The following chapters are included: "Elementary School Foreign Language: Key Link in the Chain of Learning" (rationale, innovations, immersion programs, and interdisciplinary approaches); "Foreign…

  6. Quantum key management

    DOEpatents

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
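
    The hash-based signature side of the patent rests on Merkle hash trees: many one-time public keys are hashed into a single root that serves as the long-lived verification key. A minimal sketch of the root construction (the leaves stand in for serialized one-time public keys, and the duplicate-last-node rule for odd levels is an assumption; real schemes fix the tree shape in advance):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash a list of leaves (e.g. serialized one-time public keys) up to a single root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # assumption: duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"otp-key-%d" % i for i in range(8)])
```

    A signature then reveals one one-time key plus the sibling hashes on its path, letting a verifier recompute the root without knowing the other leaves.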

  7. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    PubMed

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify the chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using the dissimilarity matrix as input. Thus, in principle, it can capture even the subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying a bacterial collection and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria. The second set, which is relatively more intricate for chemometric analysis, consists of different wild-type Vibrio cholerae strains and mutants having subtle differences in their peptidoglycan composition. The present work clearly proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences. Furthermore, the present work clearly suggests that PCoA can be a method of choice in any data analysis workflow. Copyright © 2018 Elsevier Inc. All rights reserved.
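
    PCoA itself is classical multidimensional scaling: square the dissimilarities, double-center the result, and keep the top eigenvectors scaled by the square roots of their eigenvalues. A minimal generic sketch of that computation (the construction of the chromatographic dissimilarity matrix is not reproduced here):

```python
import numpy as np

def pcoa(D, k=2):
    """Classical PCoA: embed samples in k dimensions from a dissimilarity matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigh returns eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]       # keep the k largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

    For Euclidean dissimilarities the embedding reproduces the pairwise distances exactly (up to rotation); for non-Euclidean ones, negative eigenvalues are clipped, which is the usual practical compromise.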

  8. Short-term effect of Keyes' approach to periodontal therapy compared with modified Widman flap surgery.

    PubMed

    Whitehead, S P; Watts, T L

    1987-11-01

    Keyes' method of non-surgical therapy was compared with modified Widman flap surgery in 9 patients with symmetrical periodontal disease. Following an initial oral hygiene programme, baseline measurements were recorded and paired contralateral areas were subjected randomly to the 2 techniques. 42 teeth receiving surgery were compared with 40 treated by Keyes' method. 6 sites per tooth were scored immediately prior to therapy and 3 months later, using a constant force probe with onlays. Consistent data were recorded for the 6 separate sites, which showed no baseline difference between treatments, slightly greater recession with surgery at 3 months, but no difference between treatments in probing depth and attachment levels. Mean data for individual patients showed similar consistency. Probing depth in deep sites was reduced slightly more with surgery, and there were no differences in bleeding on probing at 3 months. Both techniques gave marked improvements in health. Surprisingly, only 2 subjects preferred Keyes' technique of mechanical therapy, 6 preferred surgery, and 1 had no preference.

  9. Uncovering key patterns in self-harm in adolescents: Sequence analysis using the Card Sort Task for Self-harm (CaTS).

    PubMed

    Townsend, E; Wadman, R; Sayal, K; Armstrong, M; Harroe, C; Majumder, P; Vostanis, P; Clarke, D

    2016-12-01

    Self-harm is a significant clinical issue in adolescence. There is little research on the interplay of key factors in the months, weeks, days and hours leading to self-harm. We developed the Card Sort Task for Self-harm (CaTS) to investigate the pattern of thoughts, feelings, events and behaviours leading to self-harm. Forty-five young people (aged 13-21 years) with recent repeated self-harm completed the CaTS to describe their first ever/most recent self-harm episode. Lag sequential analysis determined significant transitions in factors leading to self-harm (presented in state transition diagrams). A significant sequential structure to the card sequences was observed, demonstrating similarities and important differences in antecedents to first and most recent self-harm. Life events were distal in the self-harm pathway and more heterogeneous. Of significant clinical concern was that the wish to die and hopelessness emerged as important antecedents in the most recent episode. First-ever self-harm was associated with feeling better afterward, but this effect disappeared for the most recent episode. Larger sample sizes are necessary to examine longer chains of sequences and differences in gender, age and type of self-harm. The sample was self-selected, with 53% having experience of living in care. The CaTS offers a systematic approach to understanding the dynamic interplay of factors that lead to self-harm in young people. It offers a method to target key points for intervention in the self-harm pathway. Crucially, the factors most proximal to self-harm (negative emotions, impulsivity and access to means) are modifiable with existing clinical interventions. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
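
    The lag-1 computation at the heart of such a sequential analysis can be sketched as a transition-probability estimate over categorical sequences. This is a simplified illustration with hypothetical card labels, not the CaTS scoring itself, and it omits the significance testing of transitions.

```python
from collections import Counter

def transition_probabilities(sequences):
    """Lag-1 transition probabilities across categorical sequences.

    Each sequence is an ordered list of antecedent factors; the result maps
    (from_state, to_state) to the observed P(to_state | from_state).
    """
    pair_counts, from_counts = Counter(), Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            pair_counts[(a, b)] += 1
            from_counts[a] += 1
    return {(a, b): c / from_counts[a] for (a, b), c in pair_counts.items()}
```

    In a full lag sequential analysis these observed probabilities are then tested against chance expectation (e.g. via adjusted residuals) before significant transitions are drawn in a state transition diagram.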

  10. Hearing Children's Voices through a Conversation Analysis Approach

    ERIC Educational Resources Information Center

    Bateman, Amanda

    2017-01-01

    This article introduces the methodological approach of conversation analysis (CA) and demonstrates its usefulness in presenting more authentic documentation and analysis of children's voices. Grounded in ethnomethodology, CA has recently gained interest in the area of early childhood studies due to the affordances it holds for gaining access to…

  11. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    NASA Astrophysics Data System (ADS)

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid. It is therefore beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information-theoretic methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, the MDI-QKD protocol is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results in the presence of eavesdropping is 16%. Moreover, the probability that the relay outputs successful results is higher in the presence of eavesdropping than without it. The paper demonstrates the robustness of the MDI-QKD protocol.

  12. Experimental demonstration of counterfactual quantum key distribution

    NASA Astrophysics Data System (ADS)

    Ren, M.; Wu, G.; Wu, E.; Zeng, H.

    2011-04-01

    Counterfactual quantum key distribution provides a natural advantage against eavesdropping on the actual signal particles. It can prevent the photon-number-splitting attack when a weak coherent light source is used for the practical implementation. We experimentally realized counterfactual quantum key distribution in an unbalanced Mach-Zehnder interferometer with a 12.5-km-long quantum channel and a high fringe visibility of 97.4%. According to the security analysis, the system was robust against the photon-number-splitting attack.

  13. Forensic steganalysis: determining the stego key in spatial domain steganography

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav; Soukal, David; Holotyak, Taras

    2005-03-01

    This paper is an extension of our work on stego key search for JPEG images published at EI SPIE in 2004. We provide a more general theoretical description of the methodology, apply our approach to the spatial domain, and add a method that determines the stego key from multiple images. We show that in the spatial domain the stego key search can be made significantly more efficient by working with the noise component of the image obtained using a denoising filter. The technique is tested on the LSB embedding paradigm and on a special case of embedding by noise adding (the +/-1 embedding). The stego key search can be performed for a wide class of steganographic techniques even for sizes of secret message well below those detectable using known methods. The proposed strategy may prove useful to forensic analysts and law enforcement.

  14. Hospital innovation portfolios: key determinants of size and innovativeness.

    PubMed

    Schultz, Carsten; Zippel-Schultz, Bettina; Salomo, Søren

    2012-01-01

    Health care organizations face an increasing demand for strategic change and innovation; however, there are also several barriers to innovation that impede successful implementation. We aimed to shed light on key issues of innovation management in hospitals and provide empirical evidence for controlling the size and innovativeness of a hospital's new health service and process portfolio. We show how health care managers could align the need for exploration and exploitation by applying both informal (e.g., employee encouragement) and formal (e.g., analytical orientation and reward systems) organizational mechanisms. To develop hypotheses, we integrated the innovation management literature into the hospital context. Detailed information about the innovation portfolio of 87 German hospitals was generated and combined with multirespondent survey data using ratings from management, medical, and nursing directors. Multivariate regression analysis was applied. The empirical results showed that an analytical approach increased the size of innovation portfolios. Employee encouragement amplified the degree of innovativeness of activities in the portfolio. Reward systems did not have direct effects on the composition of innovation portfolios. However, they adjusted bottom-up employee and top-down strategic initiatives to match with the existing organization, thereby decreasing the degree of innovativeness and enforcing exploitation. Hospitals should intertwine employee encouragement, analytical approaches, and formal reward systems depending on organizational goals.

  15. Novel presentational approaches were developed for reporting network meta-analysis.

    PubMed

    Tan, Sze Huey; Cooper, Nicola J; Bujkiewicz, Sylwia; Welton, Nicky J; Caldwell, Deborah M; Sutton, Alexander J

    2014-06-01

    To present graphical tools for reporting network meta-analysis (NMA) results aiming to increase the accessibility, transparency, interpretability, and acceptability of NMA analyses. The key components of NMA results were identified based on recommendations by agencies such as the National Institute for Health and Care Excellence (United Kingdom). Three novel graphs were designed to amalgamate the identified components using familiar graphical tools such as the bar, line, or pie charts and adhering to good graphical design principles. Three key components for presentation of NMA results were identified, namely relative effects and their uncertainty, probability of an intervention being best, and between-study heterogeneity. Two of the three graphs developed present results (for each pairwise comparison of interventions in the network) obtained from both NMA and standard pairwise meta-analysis for easy comparison. They also include options to display the probability of being best, ranking statistics, heterogeneity, and prediction intervals. The third graph presents rankings of interventions in terms of their effectiveness to enable clinicians to easily identify "top-ranking" interventions. The graphical tools presented can display results tailored to the research question of interest, and targeted at a whole spectrum of users from the technical analyst to the nontechnical clinician. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Assessing Instructional Reform in San Diego: A Theory-Based Approach

    ERIC Educational Resources Information Center

    O'Day, Jennifer; Quick, Heather E.

    2009-01-01

    This article provides an overview of the approach, methodology, and key findings from a theory-based evaluation of the district-led instructional reform effort in San Diego City Schools, under the leadership of Alan Bersin and Anthony Alvarado, that began in 1998. Beginning with an analysis of the achievement trends in San Diego relative to other…

  17. Intelligent Systems Approaches to Product Sound Quality Analysis

    NASA Astrophysics Data System (ADS)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. 
It will also provide a framework more amenable to an intelligent systems approach.

  18. A global sensitivity analysis approach for morphogenesis models.

    PubMed

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operating mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
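
    As a sketch of the variance-based machinery behind such a global sensitivity analysis, the following estimates first-order Sobol indices with the Saltelli pick-and-freeze scheme. This is a generic illustration, not the published workflow; inputs are assumed independent and uniform on [0, 1], and the function names are ours.

```python
import numpy as np

def sobol_first_order(model, d, n=100_000, seed=0):
    """First-order Sobol indices via the Saltelli pick-and-freeze estimator.

    model : maps an (n, d) array of inputs to an (n,) array of outputs.
    Inputs are assumed independent and uniform on [0, 1].
    """
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    yA, yB = model(A), model(B)
    var_y = np.concatenate([yA, yB]).var()
    S = np.empty(d)
    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]          # resample only input i
        yAB = model(AB)
        # Saltelli (2010): S_i ~ E[yB * (yAB - yA)] / Var(Y)
        S[i] = np.mean(yB * (yAB - yA)) / var_y
    return S
```

    For the additive model y = 2*x1 + x2 the exact indices are 0.8 and 0.2; parameter interactions would show up as a gap between these first-order indices and total-order indices (not computed here).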

  19. Key influences in the design and implementation of mental health information systems in Ghana and South Africa.

    PubMed

    Ahuja, S; Mirzoev, T; Lund, C; Ofori-Atta, A; Skeen, S; Kufuor, A

    2016-01-01

    Strengthening of mental health information systems (MHIS) is essential to monitor and evaluate mental health services in low and middle-income countries. While research exists assessing wider health management information systems, there is limited published evidence exploring the design and implementation of MHIS in these settings. This paper aims to identify and assess the key factors affecting the design and implementation of MHIS, as perceived by the key stakeholders in Ghana and South Africa. We report findings from the Mental Health and Poverty Project, a 5-year research programme implemented within four African countries. The MHIS strengthening in South Africa and Ghana included two related components: intervention and research. The intervention component aimed to strengthen MHIS in the two countries, and the research component aimed to document interventions in each country, including the key influences. Data were collected using semi-structured interviews with key stakeholders and reviews of key documents and secondary data from the improved MHIS. We analyzed the qualitative data using a framework approach. Key components of the MHIS intervention involved the introduction of a redesigned patient registration form, entry into computers for analysis every 2 months by clinical managerial staff, and utilization of data in hospital management meetings in three psychiatric hospitals in Ghana; and the introduction of a new set of mental health indicators and related forms and tally sheets at primary care clinics and district hospitals in five districts in the KwaZulu-Natal and Northern Cape provinces in South Africa. Overall, the key stakeholders perceived the MHIS strengthening as an effective intervention in both countries with an enhanced set of indicators in South Africa and introduction of a computerized system in Ghana. Influences on the design and implementation of MHIS interventions in Ghana and South Africa relate to resources, working approaches

  20. Pricing and components analysis of some key essential pediatric medicine in Odisha state

    PubMed Central

    Samal, Satyajit; Swain, Trupti Rekha

    2017-01-01

    Objective: A study highlighting the prices patients actually pay at ground level is important for interventions such as alternate procurement schemes and for expediting regulatory assessment of essential medicines for children. The present study was undertaken to analyze the pricing and price components of a few key essential medicines in Odisha state. Methodology: Six child-specific medicines of different formulations were selected based on their use in different disease conditions and their wide pricing variation. Data were collected, entered, and analyzed in the price components data collection form of the World Health Organization-Health Action International (WHO-HAI) 2007 Workbook version 5 – Part II provided as part of the WHO/HAI methodology. The analysis includes the cumulative percent markup, total cumulative percent markup, and percent contribution of individual components to the final medicine price in both the public and private sectors of Odisha state. Results: Add-on costs such as taxes, wholesale, and retail markups contribute substantially to the final price of medicines in the private sector, particularly for branded-generic products. The largest contributor to add-on costs is at the retailer level. Conclusion: Policy should be framed to achieve greater transparency and uniformity of medicine pricing across the different health sectors of Odisha. PMID:28458429

  1. Pricing and components analysis of some key essential pediatric medicine in Odisha state.

    PubMed

    Samal, Satyajit; Swain, Trupti Rekha

    2017-01-01

    A study highlighting the prices patients actually pay at ground level is important for interventions such as alternate procurement schemes and for expediting regulatory assessment of essential medicines for children. The present study was undertaken to analyze the pricing and price components of a few key essential medicines in Odisha state. Six child-specific medicines of different formulations were selected based on their use in different disease conditions and their wide pricing variation. Data were collected, entered, and analyzed in the price components data collection form of the World Health Organization-Health Action International (WHO-HAI) 2007 Workbook version 5 - Part II provided as part of the WHO/HAI methodology. The analysis includes the cumulative percent markup, total cumulative percent markup, and percent contribution of individual components to the final medicine price in both the public and private sectors of Odisha state. Add-on costs such as taxes, wholesale, and retail markups contribute substantially to the final price of medicines in the private sector, particularly for branded-generic products. The largest contributor to add-on costs is at the retailer level. Policy should be framed to achieve greater transparency and uniformity of medicine pricing across the different health sectors of Odisha.

  2. Delay and cost performance analysis of the diffie-hellman key exchange protocol in opportunistic mobile networks

    NASA Astrophysics Data System (ADS)

    Soelistijanto, B.; Muliadi, V.

    2018-03-01

    Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key, which can then be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than in typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by the total number of key copies forwarded.
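
    The key-establishment step being evaluated is textbook Diffie-Hellman; a toy sketch follows. The parameters are illustrative only (a small Mersenne-prime group); real deployments use vetted 2048-bit groups or elliptic curves.

```python
import secrets

# Toy parameters for illustration: Mersenne prime modulus, generator 3.
P = 2**127 - 1
G = 3

def dh_exchange(p=P, g=G):
    """One Diffie-Hellman exchange: each side broadcasts a single public key
    and derives the same shared secret from the other's public key."""
    a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's private exponent
    A = pow(g, a, p)                   # Alice's broadcast public key
    B = pow(g, b, p)                   # Bob's broadcast public key
    return pow(B, a, p), pow(A, b, p)  # both equal g^(a*b) mod p
```

    The single broadcast per node is what reduces the number of key copies circulating in the opportunistic network relative to pairwise RSA key distribution.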

  3. A Simplified Approach to Encephalitis and Its Mimics: Key Clinical Decision Points in the Setting of Specific Imaging Abnormalities.

    PubMed

    McKnight, Colin D; Kelly, Aine M; Petrou, Myria; Nidecker, Anna E; Lorincz, Matthew T; Altaee, Duaa K; Gebarski, Stephen S; Foerster, Bradley

    2017-06-01

    Infectious encephalitis is a relatively common cause of morbidity and mortality. Treatment of infectious encephalitis with antiviral medication can be highly effective when administered promptly. Clinical mimics of encephalitis arise from a broad range of pathologic processes, including toxic, metabolic, neoplastic, autoimmune, and cardiovascular etiologies. These mimics need to be rapidly differentiated from infectious encephalitis to appropriately manage the correct etiology; however, the many overlapping signs of these various entities present a challenge to accurate diagnosis. A systematic approach that considers both the clinical manifestations and the imaging findings of infectious encephalitis and its mimics can contribute to more accurate and timely diagnosis. Following institutional review board approval, a Health Insurance Portability and Accountability Act (HIPAA)-compliant search of our institutional imaging database (teaching files) was conducted to generate a list of adult and pediatric patients who presented between January 1, 1995 and October 10, 2013 for imaging to evaluate possible cases of encephalitis. Pertinent medical records, including clinical notes as well as surgical and pathology reports, were reviewed and correlated with imaging findings. Clinical and imaging findings were combined to generate useful flowcharts designed to assist in distinguishing infectious encephalitis from its mimics. Key imaging features were reviewed and were placed in the context of the provided flowcharts. Four flowcharts were presented based on the primary anatomic site of imaging abnormality: group 1: temporal lobe; group 2: cerebral cortex; group 3: deep gray matter; and group 4: white matter. An approach that combines features on clinical presentation was then detailed. Imaging examples were used to demonstrate similarities and key differences. 
Early recognition of infectious encephalitis is critical, but can be quite complex due to diverse pathologies and

  4. Semiotic Approach to the Analysis of Children's Drawings

    ERIC Educational Resources Information Center

    Turkcan, Burcin

    2013-01-01

    Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…

  5. Considering a Cost Analysis Project? A Planned Approach

    ERIC Educational Resources Information Center

    Parish, Mina; Teetor, Travis

    2006-01-01

    As resources become more constrained in the library community, many organizations are finding that they need to have a better understanding of their costs. To this end, this article will present one approach to conducting a cost analysis (including questions to ask yourself, project team makeup, organizational support, and data organization). We…

  6. Multiparty Quantum Key Agreement Based on Quantum Search Algorithm

    PubMed Central

    Cao, Hao; Ma, Wenping

    2017-01-01

    Quantum key agreement is an important topic in which the shared key must be negotiated equally by all participants, and no nontrivial subset of participants can fully determine the shared key. To date, the subkey embedding modes of all previously proposed quantum key agreement protocols have been based on either BB84 or entangled states; research on quantum key agreement protocols based on quantum search algorithms has been absent. In this paper, on the basis of investigating the properties of quantum search algorithms, we propose the first quantum key agreement protocol whose subkey embedding mode is based on a quantum search algorithm known as Grover's algorithm. A novel example of a five-party protocol is presented. The efficiency analysis shows that our protocol is superior to existing MQKA protocols. Furthermore, it is secure against both external and internal attacks. PMID:28332610

  7. Sensitivity analysis of monthly reference crop evapotranspiration trends in Iran: a qualitative approach

    NASA Astrophysics Data System (ADS)

    Mosaedi, Abolfazl; Ghabaei Sough, Mohammad; Sadeghi, Sayed-Hossein; Mooshakhian, Yousof; Bannayan, Mohammad

    2017-05-01

    The main objective of this study was to analyze the sensitivity of the monthly reference crop evapotranspiration (ETo) trends to key climatic factors (minimum and maximum air temperature (Tmin and Tmax), relative humidity (RH), sunshine hours (tsun), and wind speed (U2)) in Iran by applying a qualitative detrended method, rather than the historical mathematical approach. Meteorological data for the period of 1963-2007 from five synoptic stations with different climatic characteristics, including Mashhad (mountains), Tabriz (mountains), Tehran (semi-desert), Anzali (coastal wet), and Shiraz (semi-mountains) were used to address this objective. The Mann-Kendall test was employed to assess the trends of ETo and the climatic variables. The results indicated a significant increasing trend of the monthly ETo for Mashhad and Tabriz for most of the year, while the opposite conclusion was drawn for Tehran, Anzali, and Shiraz. Based on the detrended method, RH and U2 were the two main variables enhancing the negative ETo trends in Tehran and Anzali stations, whereas U2 and temperature were responsible for this observation in Shiraz. On the other hand, the main meteorological variables affecting the significant positive trend of ETo were RH and tsun in Tabriz and Tmin, RH, and U2 in Mashhad. Although a relative agreement was observed in terms of identifying one of the first two key climatic variables affecting the ETo trend, the qualitative and the quantitative sensitivity analysis solutions never coincided. Further research is needed to evaluate this interesting finding for other geographic locations, and also to search for the major causes of this discrepancy.

  8. Multifractal Approach to the Analysis of Crime Dynamics: Results for Burglary in San Francisco

    NASA Astrophysics Data System (ADS)

    Melgarejo, Miguel; Obregon, Nelson

    This paper provides evidence of fractal, multifractal and chaotic behaviors in urban crime by computing key statistical attributes over a long data register of criminal activity. Fractal and multifractal analyses based on power spectrum, Hurst exponent computation, hierarchical power law detection and multifractal spectrum are considered ways to characterize and quantify the footprint of complexity of criminal activity. Moreover, observed chaos analysis is considered a second step to pinpoint the nature of the underlying crime dynamics. This approach is carried out on a long database of burglary activity reported by 10 police districts of San Francisco city. In general, interarrival time processes of criminal activity in San Francisco exhibit fractal and multifractal patterns. The behavior of some of these processes is close to 1/f noise. Therefore, a characterization as deterministic, high-dimensional, chaotic phenomena is viable. Thus, the nature of crime dynamics can be studied from geometric and chaotic perspectives. Our findings support that crime dynamics may be understood from complex systems theories like self-organized criticality or highly optimized tolerance.
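
    The Hurst-exponent computation mentioned above can be illustrated with a minimal rescaled-range (R/S) sketch. This is our own illustration of the standard technique, not the authors' pipeline, and it omits the spectral and multifractal analyses.

```python
import numpy as np

def hurst_rs(x, n_windows=10):
    """Rescaled-range (R/S) estimate of the Hurst exponent H.

    H ~ 0.5 for uncorrelated noise; H > 0.5 indicates persistent
    (long-memory) behaviour in a time series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = np.unique(np.logspace(1, np.log10(n // 2), n_windows).astype(int))
    rs = []
    for w in sizes:
        ratios = []
        for start in range(0, n - w + 1, w):   # non-overlapping windows
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())  # cumulative deviation profile
            if seg.std() > 0:
                ratios.append((dev.max() - dev.min()) / seg.std())
        rs.append(np.mean(ratios))
    # H is the slope of log(R/S) against log(window size)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

    Applied to interarrival times of criminal events, an estimate well above 0.5 would support the persistence and 1/f-like structure reported in the abstract.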

  9. Hierarchical Bayes approach for subgroup analysis.

    PubMed

    Hsu, Yu-Yi; Zalkikar, Jyoti; Tiwari, Ram C

    2017-01-01

    In clinical data analysis, both treatment effect estimation and consistency assessment are important for a better understanding of the drug efficacy for the benefit of subjects in individual subgroups. The linear mixed-effects model has been used for subgroup analysis to describe treatment differences among subgroups with great flexibility. The hierarchical Bayes approach has been applied to linear mixed-effects model to derive the posterior distributions of overall and subgroup treatment effects. In this article, we discuss the prior selection for variance components in hierarchical Bayes, estimation and decision making of the overall treatment effect, as well as consistency assessment of the treatment effects across the subgroups based on the posterior predictive p-value. Decision procedures are suggested using either the posterior probability or the Bayes factor. These decision procedures and their properties are illustrated using a simulated example with normally distributed response and repeated measurements.
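
    The shrinkage at the heart of such a hierarchical Bayes subgroup model can be illustrated in its simplest conjugate special case: normal subgroup estimates with known standard errors and a normal prior with overall mean mu and between-subgroup standard deviation tau. This is a simplified sketch of the idea, not the full linear mixed-effects posterior, and the names are ours.

```python
import numpy as np

def shrink_subgroup_effects(ybar, se, mu, tau):
    """Posterior means of subgroup treatment effects in a normal-normal
    hierarchical model: observed effect ybar_j ~ N(theta_j, se_j^2),
    prior theta_j ~ N(mu, tau^2), with mu and tau taken as known."""
    ybar, se = np.asarray(ybar, float), np.asarray(se, float)
    w = tau**2 / (tau**2 + se**2)   # weight on the subgroup's own estimate
    return w * ybar + (1.0 - w) * mu
```

    Noisy subgroup estimates are pulled toward the overall effect mu; consistency assessment then asks whether any subgroup's observed effect is surprising under this shared-mean model.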

  10. Improving Public Perception of Behavior Analysis.

    PubMed

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  11. A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Cheng

    2016-03-12

    A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address the coincidental flood frequency analysis at the ungauged confluence of two streams based on the flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the position-plotting formula, the univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA approach agree well with the floods estimated by the plotting position and univariate flood frequency analysis based on the observation data.
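
    A minimal Monte Carlo sketch of a coincidental-flood computation at a confluence follows. It is our own illustration under an independence assumption with Gumbel-distributed tributary annual peaks; the paper's JPA need not make these assumptions, and the parameter values are hypothetical.

```python
import numpy as np

def confluence_return_period(loc1, scale1, loc2, scale2, q, n=200_000, seed=0):
    """Return period (years) of the combined peak flow exceeding q at a
    confluence, assuming independent Gumbel-distributed tributary peaks."""
    rng = np.random.default_rng(seed)
    q1 = rng.gumbel(loc1, scale1, n)      # tributary 1 annual peaks
    q2 = rng.gumbel(loc2, scale2, n)      # tributary 2 annual peaks
    p_exceed = np.mean(q1 + q2 > q)       # joint (coincidental) exceedance
    return 1.0 / p_exceed if p_exceed > 0 else np.inf
```

    In practice the marginal distributions would be fitted to the gauged upstream tributaries, and any dependence between tributary peaks would be modelled explicitly rather than assumed away.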

  12. Key aspects of coronal heating

    PubMed Central

    Klimchuk, James A.

    2015-01-01

    We highlight 10 key aspects of coronal heating that must be understood before we can consider the problem to be solved. (1) All coronal heating is impulsive. (2) The details of coronal heating matter. (3) The corona is filled with elemental magnetic strands. (4) The corona is densely populated with current sheets. (5) The strands must reconnect to prevent an infinite build-up of stress. (6) Nanoflares repeat with different frequencies. (7) What is the characteristic magnitude of energy release? (8) What causes the collective behaviour responsible for loops? (9) What are the onset conditions for energy release? (10) Chromospheric nanoflares are not a primary source of coronal plasma. Significant progress in solving the coronal heating problem will require coordination of approaches: observational studies, field-aligned hydrodynamic simulations, large-scale and localized three-dimensional magnetohydrodynamic simulations, and possibly also kinetic simulations. There is a unique value to each of these approaches, and the community must strive to coordinate better. PMID:25897094

  13. Building capacity in Australian interprofessional health education: perspectives from key health and higher education stakeholders.

    PubMed

    Matthews, Lynda R; Pockett, Rosalie B; Nisbet, Gillian; Thistlethwaite, Jill E; Dunston, Roger; Lee, Alison; White, Jill F

    2011-05-01

    A substantial literature engaging with the directions and experiences of stakeholders involved in interprofessional health education exists at the international level, yet almost nothing has been published that documents and analyses the Australian experience. Accordingly, this study aimed to scope the experiences of key stakeholders in health and higher education in relation to the development of interprofessional practice capabilities in health graduates in Australia. Twenty-seven semi-structured interviews and two focus groups of key stakeholders involved in the development and delivery of interprofessional health education in Australian higher education were undertaken. Interview data were coded to identify categories that were organised into key themes, according to principles of thematic analysis. Three themes were identified: the need for common ground between health and higher education, constraints and enablers in current practice, and the need for research to establish an evidence base. Five directions for national development were also identified. The study identified a range of interconnected changes that will be required to successfully mainstream interprofessional education within Australia, in particular, the importance of addressing issues of culture change and the need for a nationally coordinated and research informed approach. These findings reiterate those found in the international literature.

  14. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiktenko, Evgeniy O.; Trushechkin, Anton S.; Lim, Charles Ci Wen

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol that significantly increases the efficiency of the procedure and reduces its interactivity. Finally, the proposed technique is based on introducing symmetry into the operations of the parties and on considering the results of unsuccessful belief-propagation decodings.

  15. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    DOE PAGES

    Kiktenko, Evgeniy O.; Trushechkin, Anton S.; Lim, Charles Ci Wen; ...

    2017-10-27

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol that significantly increases the efficiency of the procedure and reduces its interactivity. Finally, the proposed technique is based on introducing symmetry into the operations of the parties and on considering the results of unsuccessful belief-propagation decodings.

  16. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Kiktenko, E. O.; Trushechkin, A. S.; Lim, C. C. W.; Kurochkin, Y. V.; Fedorov, A. K.

    2017-10-01

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol that significantly increases the efficiency of the procedure and reduces its interactivity. The proposed technique is based on introducing symmetry into the operations of the parties and on considering the results of unsuccessful belief-propagation decodings.
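    The syndrome step at the heart of LDPC-based reconciliation can be illustrated in a few lines. This is a toy sketch with a hypothetical 3×6 parity-check matrix, not the codes used in the paper: Alice publishes the syndrome of her sifted key, and the difference from Bob's syndrome equals the parity-check matrix applied to the error pattern, which a belief-propagation decoder (omitted here) would then resolve.

```python
# Toy illustration of syndrome-based reconciliation: Alice publishes the
# syndrome H·x mod 2 of her sifted key; Bob compares it with the syndrome
# of his noisy copy. Their difference equals H applied to the error
# pattern, which a belief-propagation decoder (omitted here) resolves.

H = [  # hypothetical small parity-check matrix (3 checks, 6 key bits)
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

alice = [1, 0, 1, 1, 0, 1]
bob   = [1, 0, 0, 1, 0, 1]   # one quantum-channel error at position 2

diff = [a ^ b for a, b in zip(syndrome(H, alice), syndrome(H, bob))]
print(diff)   # [0, 1, 1] -- the column of H at the flipped position
```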

  17. KEY COMPARISON: Key comparison CCQM-K60: Total selenium and selenomethionine in selenised wheat flour

    NASA Astrophysics Data System (ADS)

    Goenaga Infante, Heidi; Sargent, Mike

    2010-01-01

    Key comparison CCQM-K60 was performed to assess the analytical capabilities of national metrology institutes (NMIs) to accurately quantitate the mass fraction of selenomethionine (SeMet) and total selenium (at low mg kg-1 levels) in selenised wheat flour. It was organized by the Inorganic Analysis Working Group (IAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM) as a follow-up key comparison to the previous pilot study CCQM-P86 on selenised yeast tablets. LGC Limited (Teddington, UK) and the Institute for National Measurement Standards, National Research Council Canada (NRCC, Ottawa, Canada) acted as the coordinating laboratories. CCQM-K60 was organized in parallel with a pilot study (CCQM-P86.1) involving not only NMIs but also expert laboratories worldwide, thus enabling them to assess their capabilities, discover problems and learn how to modify analytical procedures accordingly. Nine results for total Se and four results for SeMet were reported by the participant NMIs. Methods used for sample preparation were microwave assisted acid digestion for total Se and multiple-step enzymatic hydrolysis and hydrolysis with methanesulfonic acid for SeMet. For total Se, detection techniques included inductively coupled plasma mass spectrometry (ICP-MS) with external calibration, standard additions or isotope dilution analysis (IDMS); instrumental neutron activation analysis (INAA); and graphite furnace atomic absorption spectrometry (GFAAS) with external calibration. For determination of SeMet in the wheat flour sample, the four NMIs relied upon measurements using species-specific IDMS (using 76Se-enriched SeMet) with HPLC-ICP-MS. Eight of the nine participating NMIs reported results for total Se within 3.5% deviation from the key comparison reference value (KCRV). For SeMet, the four participating NMIs reported results within 3.2% deviation from the KCRV. This shows that the performance of the majority of the CCQM-K60 participants was very good

  18. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparisons of the landslide susceptibility maps obtained from both MCDA techniques with known landslides show that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
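    The Monte Carlo weight-sensitivity step described above can be sketched as follows, for a single raster cell and a hypothetical set of AHP-derived criteria weights; the weighted linear combination and the ±10% perturbation range are illustrative choices, not the paper's exact settings.

```python
import random

def susceptibility(weights, criteria):
    # Weighted linear combination, the core of many GIS-MCDA overlays
    return sum(w * c for w, c in zip(weights, criteria))

def monte_carlo_sensitivity(weights, criteria, n=10_000, noise=0.1, seed=0):
    """Perturb weights uniformly by +/-noise, renormalize, collect scores."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        perturbed = [w * (1 + rng.uniform(-noise, noise)) for w in weights]
        total = sum(perturbed)
        scores.append(susceptibility([w / total for w in perturbed], criteria))
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / n
    return mean, var ** 0.5

# Hypothetical AHP-derived weights for slope, lithology, land cover, rainfall
weights = [0.4, 0.3, 0.2, 0.1]
cell = [0.8, 0.5, 0.3, 0.9]   # normalized criteria values for one raster cell
mean, std = monte_carlo_sensitivity(weights, cell)
print(f"score {mean:.3f} +/- {std:.3f}")
```

    The spread of the perturbed scores is one simple measure of how sensitive a cell's susceptibility class is to the weighting.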

  19. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparisons of the landslide susceptibility maps obtained from both MCDA techniques with known landslides show that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  20. One Step Quantum Key Distribution Based on EPR Entanglement.

    PubMed

    Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao

    2016-06-30

    A novel quantum key distribution protocol based on entanglement and dense coding is presented, allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is also proposed, which overcomes the vulnerability of the first protocol and improves its maneuverability. Moreover, a security analysis shows that a simple type of eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step.
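    As a quick arithmetic check, the quoted error rate of 46.875% is exactly the fraction 15/32 (e.g. 15 detectable outcomes out of 32 equally likely cases); the protocol's actual case counting is not reproduced here.

```python
from fractions import Fraction

# The quoted minimum eavesdropping error rate, 46.875%, as an exact fraction
rate = Fraction(46875, 100000)   # Fraction reduces to lowest terms
print(rate)         # 15/32
print(float(rate))  # 0.46875
```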

  1. Long-distance practical quantum key distribution by entanglement swapping.

    PubMed

    Scherer, Artur; Sanders, Barry C; Tittel, Wolfgang

    2011-02-14

    We develop a model for practical, entanglement-based long-distance quantum key distribution employing entanglement swapping as a key building block. Relying only on existing off-the-shelf technology, we show how to optimize resources so as to maximize secret key distribution rates. The tools comprise lossy transmission links, such as telecom optical fibers or free space, parametric down-conversion sources of entangled photon pairs, and threshold detectors that are inefficient and have dark counts. Our analysis provides the optimal trade-off between detector efficiency and dark counts, which are usually competing, as well as the optimal source brightness that maximizes the secret key rate for specified distances (i.e. loss) between sender and receiver.
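    The brightness trade-off can be illustrated with a toy model: raising the pair-generation probability increases raw coincidences but also multi-pair noise, while dark counts set a brightness-independent error floor. The functional forms and numbers below are illustrative stand-ins, not the model developed in the paper.

```python
def toy_key_rate(p, eta=0.1, dark=1e-7):
    """Toy secret-key rate: detected coincidences minus a noise penalty."""
    signal = p * eta * eta        # coincidence rate ~ p * eta^2
    noise = p * p + dark          # multi-pair events ~ p^2, plus dark counts
    return max(0.0, signal - 5.0 * noise)

# Scan pair-generation probabilities to locate the optimum source brightness
best_p = max((i / 10_000 for i in range(1, 2000)), key=toy_key_rate)
print(f"optimal pair probability ~ {best_p:.4f}, rate {toy_key_rate(best_p):.2e}")
```

    In the paper's full model the same scan is done over detector and link parameters as well, yielding the optimal trade-off between efficiency and dark counts.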

  2. Measurement errors in voice-key naming latency for Hiragana.

    PubMed

    Yamada, Jun; Tamaoka, Katsuo

    2003-12-01

    This study makes explicit the limitations and possibilities of voice-key naming latency research on single hiragana symbols (a Japanese syllabic script) by examining three sets of voice-key naming data against Sakuma, Fushimi, and Tatsumi's 1997 speech-analyzer voice-waveform data. Analysis showed that voice-key measurement errors can be substantial in standard procedures as they may conceal the true effects of significant variables involved in hiragana-naming behavior. While one can avoid voice-key measurement errors to some extent by applying Sakuma, et al.'s deltas and by excluding initial phonemes which induce measurement errors, such errors may be ignored when test items are words and other higher-level linguistic materials.

  3. A Pocock Approach to Sequential Meta-Analysis of Clinical Trials

    ERIC Educational Resources Information Center

    Shuster, Jonathan J.; Neu, Josef

    2013-01-01

    Three recent papers have provided sequential methods for meta-analysis of two-treatment randomized clinical trials. This paper provides an alternate approach that has three desirable features. First, when carried out prospectively (i.e., we only have the results up to the time of our current analysis), we do not require knowledge of the…

  4. Microfluidic approaches to malaria detection

    PubMed Central

    Gascoyne, Peter; Satayavivad, Jutamaad; Ruchirawat, Mathuros

    2009-01-01

    Microfluidic systems are under development to address a variety of medical problems. Key advantages of micrototal analysis systems based on microfluidic technology are the promise of small size and the integration of sample handling and measurement functions within a single, automated device having low mass-production costs. Here, we review the spectrum of methods currently used to detect malaria, consider their advantages and disadvantages, and discuss their adaptability towards integration into small, automated micro total analysis systems. Molecular amplification methods emerge as leading candidates for chip-based systems because they offer extremely high sensitivity, the ability to recognize malaria species and strain, and they will be adaptable to the detection of new genotypic signatures that will emerge from current genomic-based research of the disease. Current approaches to the development of chip-based molecular amplification are considered with special emphasis on flow-through PCR, and we present for the first time the method of malaria specimen preparation by dielectrophoretic field-flow-fractionation. Although many challenges must be addressed to realize a micrototal analysis system for malaria diagnosis, it is concluded that the potential benefits of the approach are well worth pursuing. PMID:14744562

  5. A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation.

    DTIC Science & Technology

    1982-11-01

    A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation, by A. Charnes, W.W. Cooper and H.D. Sherman, University of Texas at Austin, Center for Cybernetic Studies. … Graduate School of Business, 1981, entitled "Measurement of Hospital Efficiency: A Comparative Analysis of Data Envelopment Analysis and Other Approaches for…"

  6. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current nuclear power plants (NPPs). In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  7. Virtual-optical information security system based on public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong; Niu, Hanben

    2005-01-01

    A virtual-optical based encryption model with the aid of public key infrastructure (PKI) is presented in this paper. The proposed model employs a hybrid architecture in which our previously published encryption method based on virtual-optics scheme (VOS) can be used to encipher and decipher data while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). The whole information security model is run under the framework of international standard ITU-T X.509 PKI, which is on basis of public-key cryptography and digital signatures. This PKI-based VOS security approach has additional features like confidentiality, authentication, and integrity for the purpose of data encryption under the environment of network. Numerical experiments prove the effectiveness of the method. The security of proposed model is briefly analyzed by examining some possible attacks from the viewpoint of a cryptanalysis.
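    The hybrid pattern described (an asymmetric algorithm wrapping the session key, a symmetric cipher protecting the data) can be sketched as below. Textbook RSA with toy primes stands in for the PKI layer, and a hash-derived XOR keystream stands in for the virtual-optics cipher; both are insecure illustrations, not the paper's VOS scheme.

```python
import hashlib

# Textbook RSA with toy primes (illustration only, not secure) standing in
# for the PKI layer; an XOR keystream stands in for the symmetric cipher.
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                  # private exponent (Python >= 3.8)

session_key = 1234                    # toy session key as an integer < n
wrapped = pow(session_key, e, n)      # sender enciphers the session key
unwrapped = pow(wrapped, d, n)        # receiver recovers it

def stream(key: int, length: int) -> bytes:
    # Hash-derived keystream; a stand-in for the virtual-optics cipher
    return hashlib.sha256(str(key).encode()).digest()[:length]

msg = b"optical payload"
cipher = bytes(m ^ k for m, k in zip(msg, stream(session_key, len(msg))))
plain = bytes(c ^ k for c, k in zip(cipher, stream(unwrapped, len(cipher))))
print(plain)   # b'optical payload'
```

    The design choice mirrors standard hybrid cryptosystems: the slow asymmetric operation touches only the short session key, while the bulk data goes through the fast symmetric cipher.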

  8. Flow-graph approach for optical analysis of planar structures.

    PubMed

    Minkov, D

    1994-11-20

    The flow-graph approach (FGA) is applied to optical analysis of isotropic stratified planar structures (ISPS's) at inclined light incidence. Conditions for the presence of coherent and noncoherent light interaction within ISPS's are determined. Examples of the use of FGA for calculation of the transmission and the reflection of two-layer ISPS's for different types of light interaction are given. The advantages of the use of FGA for optical analysis of ISPS's are discussed.

  9. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. A single factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and forms the basis of the theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed, and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.
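    The willingness-price framing lends itself to a direct numerical sketch: demand at price p counts buyers whose willingness price is at least p, supply counts sellers whose willingness price is at most p, and the equilibrium lies where excess demand crosses zero. The uniform willingness distributions below are hypothetical.

```python
import random

random.seed(42)
# Willingness prices: each buyer trades at any price at or below their value,
# each seller at or above theirs (the "willingness price" of the paper).
buyers  = [random.uniform(5, 15) for _ in range(1000)]
sellers = [random.uniform(0, 10) for _ in range(1000)]

def demand(p):  return sum(1 for b in buyers if b >= p)
def supply(p):  return sum(1 for s in sellers if s <= p)
def excess(p):  return demand(p) - supply(p)

# Bisection on the monotonically decreasing excess-demand function
lo, hi = 0.0, 15.0
for _ in range(50):
    mid = (lo + hi) / 2
    if excess(mid) > 0:
        lo = mid
    else:
        hi = mid
print(f"equilibrium price ~ {mid:.2f}, volume ~ {min(demand(mid), supply(mid))}")
```

    With these uniform distributions the analytic equilibrium is p = 7.5, which the bisection recovers up to sampling noise.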

  10. A hybrid approach to device integration on a genetic analysis platform

    NASA Astrophysics Data System (ADS)

    Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul

    2012-10-01

    Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices, each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.

  11. Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.

    2012-04-01

    Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - and especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches especially in terms of responses of residents and involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in the year 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded on the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys were used to explore residents' and local authorities' orientations toward flood risk mitigation. Also a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches

  12. Understanding Design Tradeoffs for Health Technologies: A Mixed-Methods Approach

    PubMed Central

    O’Leary, Katie; Eschler, Jordan; Kendall, Logan; Vizer, Lisa M.; Ralston, James D.; Pratt, Wanda

    2017-01-01

    We introduce a mixed-methods approach for determining how people weigh tradeoffs in values related to health and technologies for health self-management. Our approach combines interviews with Q-methodology, a method from psychology uniquely suited to quantifying opinions. We derive the framework for structured data collection and analysis for the Q-methodology from theories of self-management of chronic illness and technology adoption. To illustrate the power of this new approach, we used it in a field study of nine older adults with type 2 diabetes, and nine mothers of children with asthma. Our mixed-methods approach provides three key advantages for health design science in HCI: (1) it provides a structured health sciences theoretical framework to guide data collection and analysis; (2) it enhances the coding of unstructured data with statistical patterns of polarizing and consensus views; and (3) it empowers participants to actively weigh competing values that are most personally significant to them. PMID:28804794

  13. KEY COMPARISON: Final report on international key comparison CCQM-K53: Oxygen in nitrogen

    NASA Astrophysics Data System (ADS)

    Lee, Jeongsoon; Bok Lee, Jin; Moon, Dong Min; Seog Kim, Jin; van der Veen, Adriaan M. H.; Besley, Laurie; Heine, Hans-Joachim; Martin, Belén; Konopelko, L. A.; Kato, Kenji; Shimosaka, Takuya; Perez Castorena, Alejandro; Macé, Tatiana; Milton, Martin J. T.; Kelley, Mike; Guenther, Franklin; Botha, Angelique

    2010-01-01

    Gravimetry is used as the primary method for the preparation of primary standard gas mixtures in most national metrology institutes, and it requires the combined abilities of purity assessment, weighing technique and analytical skills. At the CCQM GAWG meeting in October 2005, it was agreed that KRISS should coordinate a key comparison, CCQM-K53, on the gravimetric preparation of gas, at a level of 100 µmol/mol of oxygen in nitrogen. KRISS compared the gravimetric value of each cylinder with an analytical instrument. The preparation of an oxygen gas standard mixture requires particular care to be accurate, because oxygen is a major component of the atmosphere. Key issues for this comparison relate to (1) the gravimetric technique, which needs at least two dilution steps, (2) oxygen impurity in nitrogen, and (3) argon impurity in nitrogen. The key comparison reference value (KCRV) is obtained from the linear regression line (through the origin) of a selected set of participants. The members of the KCRV subset, except one, agree with each other. The standard deviation of the x-residuals of this group (which consists of NMIJ, VSL, NIST, NPL, BAM, KRISS and CENAM) is 0.056 µmol/mol, consistent with the uncertainties given to their standard mixtures. The standard deviation of the residuals of all participating laboratories is 0.182 µmol/mol. With respect to impurity analysis, the overall argon amounts of the cylinders are in the region of about 3 µmol/mol; however, four cylinders showed an argon amount fraction over 10 µmol/mol. Two of these are inconsistent with the KCRV subset. Explicit separation between the oxygen and argon peaks in the GC chromatogram is essential to maintain analytical capability. Additionally, oxygen impurity analysis in nitrogen is indispensable to ensure the preparative capability.
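    The KCRV construction mentioned in the abstract, a linear regression through the origin of analysed against gravimetric values, reduces to a one-line slope formula; the numbers below are hypothetical, chosen only to mimic the ~100 µmol/mol level.

```python
# Regression through the origin, y = k*x: slope k = sum(x*y) / sum(x*x).
# x: gravimetric amount fractions, y: analysed values (hypothetical numbers
# illustrating the KCRV construction described in the abstract).
x = [99.2, 100.1, 100.8, 99.7, 100.4]   # umol/mol, gravimetric
y = [99.0, 100.3, 100.9, 99.5, 100.6]   # umol/mol, analysed

k = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
x_resid = [a - b / k for a, b in zip(x, y)]    # residuals along x
n = len(x)
sd = (sum(r * r for r in x_resid) / (n - 1)) ** 0.5
print(f"slope {k:.5f}, x-residual sd {sd:.3f} umol/mol")
```

    The x-residual standard deviation computed this way is the quantity the abstract quotes (0.056 µmol/mol for the KCRV subset).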

  14. Microarray analysis reveals key genes and pathways in Tetralogy of Fallot

    PubMed Central

    He, Yue-E; Qiu, Hui-Xian; Jiang, Jian-Bing; Wu, Rong-Zhou; Xiang, Ru-Lian; Zhang, Yuan-Hai

    2017-01-01

    The aim of the present study was to identify key genes that may be involved in the pathogenesis of Tetralogy of Fallot (TOF) using bioinformatics methods. The GSE26125 microarray dataset, which includes cardiovascular tissue samples derived from 16 children with TOF and five healthy age-matched control infants, was downloaded from the Gene Expression Omnibus database. Differential expression analysis was performed between TOF and control samples to identify differentially expressed genes (DEGs) using Student's t-test, and the R/limma package, with a log2 fold-change of >2 and a false discovery rate of <0.01 set as thresholds. The biological functions of DEGs were analyzed using the ToppGene database. The ReactomeFIViz application was used to construct functional interaction (FI) networks, and the genes in each module were subjected to pathway enrichment analysis. The iRegulon plugin was used to identify transcription factors predicted to regulate the DEGs in the FI network, and the gene-transcription factor pairs were then visualized using Cytoscape software. A total of 878 DEGs were identified, including 848 upregulated genes and 30 downregulated genes. The gene FI network contained seven function modules, which were all comprised of upregulated genes. Genes enriched in Module 1 were enriched in the following three neurological disorder-associated signaling pathways: Parkinson's disease, Alzheimer's disease and Huntington's disease. Genes in Modules 0, 3 and 5 were dominantly enriched in pathways associated with ribosomes and protein translation. The Xbox binding protein 1 transcription factor was demonstrated to be involved in the regulation of genes encoding the subunits of cytoplasmic and mitochondrial ribosomes, as well as genes involved in neurodegenerative disorders. Therefore, dysfunction of genes involved in signaling pathways associated with neurodegenerative disorders, ribosome function and protein translation may contribute to the pathogenesis of TOF

  15. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high-performance computations. As systems generate more data, it is becoming increasingly difficult to move that data elsewhere for post-processing, as the rate of improvement in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  16. Analysis of the key enzymes of butyric and acetic acid fermentation in biogas reactors

    PubMed Central

    Gabris, Christina; Bengelsdorf, Frank R; Dürre, Peter

    2015-01-01

    This study aimed at investigating the mechanisms of acidogenesis, which is a key process during anaerobic digestion. To expose possible bottlenecks, specific activities of the key enzymes of acidification, such as acetate kinase (Ack, 0.23–0.99 U mg−1 protein), butyrate kinase (Buk, < 0.03 U mg−1 protein) and butyryl-CoA:acetate-CoA transferase (But, 3.24–7.64 U mg−1 protein), were determined in cell-free extracts of biogas reactor content from three different biogas reactors. Furthermore, the detection of Ack was successful via Western blot analysis. Quantification of the corresponding functional genes encoding Buk (buk) and But (but) was not feasible, although amplification was possible. Thus, phylogenetic trees were constructed based on the respective gene fragments. Four new clades of possible butyrate-producing bacteria were postulated, and bacteria of the genera Roseburia and Clostridium were identified. The low Buk activity was in contrast to the high specific But activity in the analysed samples. Butyrate formation via Buk activity barely occurs in the investigated biogas reactors. Specific enzyme activities (Ack, Buk and But) in samples drawn from three different biogas reactors correlated with ammonia and ammonium concentrations (NH3 and NH4+-N), and a negative dependency can be postulated. Thus, high concentrations of NH3 and NH4+-N may lead to a bottleneck in acidogenesis due to decreased specific acidogenic enzyme activities. PMID:26086956

  17. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
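    The GA step rests on conjugate normal updating, in which the variance of a parameter's posterior mean has a closed form before any data are seen. A minimal sketch of that identity, simplified to a single Gaussian parameter (not the authors' full multi-parameter metamodel):

```python
def posterior_variance(prior_var, sample_var, n):
    """Posterior variance of a normal mean after observing n draws
    (normal likelihood with known variance, conjugate normal prior):
    precisions add, so var_post = 1 / (1/var_0 + n/var_sample)."""
    precision = 1.0 / prior_var + n / sample_var
    return 1.0 / precision

def preposterior_mean_variance(prior_var, sample_var, n):
    """Variance of the pre-data (preposterior) distribution of the
    posterior mean: the prior variance minus the posterior variance."""
    return prior_var - posterior_variance(prior_var, sample_var, n)
```

As the sample size n grows, the posterior variance shrinks toward zero and the preposterior variance of the mean approaches the prior variance, which is the mechanism that bounds EVSI by the expected value of perfect information.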

  18. Exploring multifunctional agriculture. A review of conceptual approaches and prospects for an integrative transitional framework.

    PubMed

    Renting, H; Rossing, W A H; Groot, J C J; Van der Ploeg, J D; Laurent, C; Perraud, D; Stobbelaar, D J; Van Ittersum, M K

    2009-05-01

    In the last decade the multifunctional agriculture (MFA) concept has emerged as a key notion in scientific and policy debates on the future of agriculture and rural development. Broadly speaking, MFA refers to the fact that agricultural activity, beyond its role of producing food and fibre, may also have several other functions, such as renewable natural resources management, landscape and biodiversity conservation, and contribution to the socio-economic viability of rural areas. The use of the concept can be traced to a number of wider societal and political transformation processes, which have influenced scientific and policy approaches in different ways amongst countries and disciplines. This paper critically discusses various existing research approaches to MFA, from both the natural and social sciences. To this end, different strands of literature are classified according to their focus on specific governance mechanisms and levels of analysis into four main categories of research approaches (market regulation, land-use approaches, actor-oriented and public regulation approaches). For each category an overview of the state of the art of research is given and an assessment is made of its strengths and weaknesses. The review demonstrates that the multifunctionality concept has attracted a wealth of scientific contributions, which have considerably improved our understanding of key aspects of MFA. At the same time, approaches in the four categories have remained fragmented and each has limitations in understanding MFA in all its complexity, due to inherent constraints of the applied conceptualizations and associated disciplinary backgrounds. To go beyond these limitations, we contend, new meta-level frameworks of analysis are to be developed that enable a more integrated approach. The paper concludes by presenting the main lines of an integrative, transitional framework for the study of MFA, which analyses multifunctional agriculture against the background of wider societal change.

  19. Free-Space Quantum Key Distribution using Polarization Entangled Photons

    NASA Astrophysics Data System (ADS)

    Kurtsiefer, Christian

    2007-06-01

    We report on a complete experimental implementation of a quantum key distribution protocol through a free-space link using polarization-entangled photon pairs from a compact parametric down-conversion source [1]. Based on a BB84-equivalent protocol, we generated, without interruption over 10 hours, a secret key over a free-space optical link distance of 1.5 km, with a rate of up to 950 bits per second after error correction and privacy amplification. Our system is based on two time stamp units and relies on no specific hardware channel for coincidence identification besides an IP link. For that, initial clock synchronization with an accuracy of better than 2 ns is achieved, based on a conventional NTP protocol and a tiered cross-correlation of time tags on both sides. Time tags are used to servo a local clock, allowing a streamed measurement on correctly identified photon pairs. Contrary to the majority of quantum key distribution systems, this approach does not require a trusted large-bandwidth random number generator, but integrates that into the physical key generation process. We discuss our current progress in implementing key distribution via an atmospheric link during daylight conditions, and possible attack scenarios on a physical timing-information side channel to an entanglement-based key distribution system. [1] I. Marcikic, A. Lamas-Linares, C. Kurtsiefer, Appl. Phys. Lett. 89, 101122 (2006).
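    The cross-correlation of time tags can be illustrated with a brute-force search for the clock offset that maximizes coincidences between the two sides. A toy sketch with hypothetical nanosecond tags (the actual system refines this coarse search and then servos a local clock continuously):

```python
def estimate_offset(tags_a, tags_b, search_ns, bin_ns):
    """Brute-force cross-correlation of two time-tag streams: try
    candidate offsets on a bin_ns grid and return the offset (ns)
    that maximizes the number of coincident (same-bin) tags."""
    binned_a = set(int(t // bin_ns) for t in tags_a)
    best_offset, best_count = 0, -1
    for offset in range(-search_ns, search_ns + 1, bin_ns):
        count = sum(1 for t in tags_b
                    if int((t + offset) // bin_ns) in binned_a)
        if count > best_count:
            best_offset, best_count = offset, count
    return best_offset
```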

  20. Fetal ECG extraction using independent component analysis by Jade approach

    NASA Astrophysics Data System (ADS)

    Giraldo-Guzmán, Jader; Contreras-Ortiz, Sonia H.; Lasprilla, Gloria Isabel Bautista; Kotas, Marian

    2017-11-01

    Fetal ECG monitoring is a useful method to assess the health of the fetus and detect abnormal conditions. In this paper we propose an approach to extract the fetal ECG from abdominal and chest signals using independent component analysis based on the joint approximate diagonalization of eigenmatrices (JADE) approach. The JADE approach avoids redundancy, which reduces matrix dimensionality and computational cost. Signals were filtered with a high-pass filter to eliminate low-frequency noise. Several levels of decomposition were tested until the fetal ECG was recognized in one of the separated output sources. The proposed method is fast and shows good performance.

  1. Providing long-acting reversible contraception services in Seattle school-based health centers: key themes for facilitating implementation.

    PubMed

    Gilmore, Kelly; Hoopes, Andrea J; Cady, Janet; Amies Oelschlager, Anne-Marie; Prager, Sarah; Vander Stoep, Ann

    2015-06-01

    The purpose of this study was to describe the implementation of a program that provides long-acting reversible contraception (LARC) services within school-based health centers (SBHCs) and to identify barriers and facilitators to implementation as reported by SBHC clinicians and administrators, public health officials, and community partners. We conducted 14 semistructured interviews with key informants involved in the implementation of LARC services. Key informants included SBHC clinicians and administrators, public health officials, and community partners. We used a content analysis approach to analyze interview transcripts for themes. We explored barriers to and facilitators of LARC service delivery across and within key informant groups. The most frequently cited barriers across key informant groups were a perceived lack of provider procedural skills, and bias and negative attitudes about LARC methods. The most common facilitators identified across groups were clear communication strategies, contraceptive counseling practice changes, provider trainings, and stakeholder engagement. Two additional barriers emerged in specific key informant groups. Technical and logistical barriers to LARC service delivery were cited heavily by SBHC administrative staff, community partners, and public health officials. Expense and billing were a major barrier for SBHC administrative staff. LARC counseling and procedural services can be implemented in an SBHC setting to promote access to effective contraceptive options for adolescent women. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  2. Anterior approach versus posterior approach for Pipkin I and II femoral head fractures: A systemic review and meta-analysis.

    PubMed

    Wang, Chen-guang; Li, Yao-min; Zhang, Hua-feng; Li, Hui; Li, Zhi-jun

    2016-03-01

    We performed a meta-analysis, pooling the results from controlled clinical trials to compare the efficiency of anterior and posterior surgical approaches to Pipkin I and II fractures of the femoral head. Potential academic articles were identified from the Cochrane Library, Medline (1966-2015.5), PubMed (1966-2015.5), Embase (1980-2015.5) and ScienceDirect (1966-2015.5) databases. Gray studies were identified from the references of the included literature. Pooling of the data was performed and analyzed by RevMan software, version 5.1. Five case-control trials (CCTs) met the inclusion criteria. There were significant differences in the incidence of heterotopic ossification (HO) between the approaches, but no significant differences were found between the two groups regarding functional outcomes of the hip, general postoperative complications, osteonecrosis of the femoral head or post-traumatic arthritis. The present meta-analysis indicated that the posterior approach decreased the risk of heterotopic ossification compared with the anterior approach for the treatment of Pipkin I and II femoral head fractures. No other complications were related to anterior and posterior approaches. Future high-quality randomized, controlled trials (RCTs) are needed to determine the optimal surgical approach and to predict other postoperative complications. III. Copyright © 2016 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.

  3. Key variables analysis of a novel continuous biodrying process for drying mixed sludge.

    PubMed

    Navaee-Ardeh, Shahram; Bertrand, François; Stuart, Paul R

    2010-05-01

    A novel continuous biodrying process has been developed whose goal is to increase the dry solids content of sludge to economic levels, rendering it suitable for a safe and economic combustion operation in a biomass boiler. The sludge drying rates are enhanced by the metabolic bioheat produced in the matrix of mixed sludge. The goal of this study was to systematically analyze the continuous biodrying reactor. By performing a variable analysis, it was found that the outlet relative humidity profile was the key variable in the biodrying reactor. The influence of different outlet relative humidity profiles was then evaluated using a biodrying efficiency index. It was found that by maintaining the air outlet relative humidity profile at 85/85/96/96% in the four compartments of the reactor, the highest biodrying efficiency index can be achieved, while an economic dry solids level (>45% w/w) is guaranteed. Crown Copyright 2009. Published by Elsevier Ltd. All rights reserved.

  4. Exploring novel key regulators in breast cancer network.

    PubMed

    Ali, Shahnawaz; Malik, Md Zubbair; Singh, Soibam Shyamchand; Chirom, Keilash; Ishrat, Romana; Singh, R K Brojen

    2018-01-01

    The breast cancer network constructed from 70 experimentally verified genes is found to follow a hierarchical scale-free nature with heterogeneous modular organization and diverse leading hubs. The topological parameters (degree distributions, clustering coefficient, connectivity and centralities) of this network obey fractal rules, indicating the absence of the centrality-lethality rule and efficient communication among the components. From the network theoretical approach, we identified a few key regulators out of the large number of leading hubs, which are deeply rooted from the top to the bottom of the network, serve as the backbone of the network, and are possible target genes. However, p53, which is one of these key regulators, is found to be of low rank and keeps a low profile, but directly cross-talks with the important genes BRCA2 and BRCA3. The popularity of these hubs changes in an unpredictable way at various levels of organization, thus showing a disassortative nature. The local community paradigm approach in this network shows strong correlation of nodes in the majority of modules/sub-modules (fast communication among nodes) and weak correlation of nodes in only a few modules/sub-modules (slow communication among nodes) at various levels of network organization.

  5. Comparative study of key exchange and authentication methods in application, transport and network level security mechanisms

    NASA Astrophysics Data System (ADS)

    Fathirad, Iraj; Devlin, John; Jiang, Frank

    2012-09-01

    Key exchange and authentication are two crucial elements of any network security mechanism. IPsec, SSL/TLS, PGP and S/MIME are well-known security approaches providing security services to the network, transport and application layers; these protocols use different methods (based on their requirements) to establish keying material and to authenticate the key negotiation and the participating parties. This paper studies and compares the authenticated key negotiation methods in the aforementioned protocols.
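    A key-exchange primitive underlying several of these protocols is Diffie-Hellman. A toy finite-field sketch showing why both parties derive the same key (illustrative parameters only, far too small for real security, and with none of the authentication these protocols add on top):

```python
def dh_demo(p, g, a_secret, b_secret):
    """Toy finite-field Diffie-Hellman over GF(p) with generator g:
    each party exponentiates the other's public value with its own
    secret, and both arrive at the same shared key g**(a*b) mod p."""
    A = pow(g, a_secret, p)      # Alice's public value
    B = pow(g, b_secret, p)      # Bob's public value
    key_a = pow(B, a_secret, p)  # Alice's view of the shared secret
    key_b = pow(A, b_secret, p)  # Bob's view of the shared secret
    return key_a, key_b
```

Unauthenticated DH like this is vulnerable to man-in-the-middle attacks, which is exactly why the protocols compared in the paper pair key exchange with certificate- or signature-based authentication.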

  6. Direct determination approach for the multifractal detrending moving average analysis

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Chuan; Gu, Gao-Feng; Zhou, Wei-Xing

    2017-11-01

    In the canonical framework, we propose an alternative approach for the multifractal analysis based on the detrending moving average method (MF-DMA). We define a canonical measure such that the multifractal mass exponent τ (q ) is related to the partition function and the multifractal spectrum f (α ) can be directly determined. The performances of the direct determination approach and the traditional approach of the MF-DMA are compared based on three synthetic multifractal and monofractal measures generated from the one-dimensional p -model, the two-dimensional p -model, and the fractional Brownian motions. We find that both approaches have comparable performances to unveil the fractal and multifractal nature. In other words, without loss of accuracy, the multifractal spectrum f (α ) can be directly determined using the new approach with less computation cost. We also apply the new MF-DMA approach to the volatility time series of stock prices and confirm the presence of multifractality.
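    The traditional MF-DMA fluctuation function against which the direct determination approach is compared can be sketched as follows. This is a simplified backward-moving-average version for a one-dimensional series (the paper's canonical-measure construction of τ(q) and f(α) is not reproduced here):

```python
import math

def mfdma_fluctuation(x, s, q):
    """Overall fluctuation F_q(s) of backward MF-DMA at scale s:
    build the profile, detrend with a trailing moving average of
    window s, box the residuals, and average their q-th moments."""
    n = len(x)
    y, total = [], 0.0
    for v in x:                           # profile: cumulative sum of x
        total += v
        y.append(total)
    resid = []
    for i in range(s - 1, n):
        ma = sum(y[i - s + 1:i + 1]) / s  # backward (trailing) moving average
        resid.append(y[i] - ma)
    m = len(resid) // s                   # non-overlapping boxes of size s
    f2 = [sum(e * e for e in resid[k * s:(k + 1) * s]) / s for k in range(m)]
    if q == 0:                            # logarithmic average for q = 0
        return math.exp(0.5 * sum(math.log(v) for v in f2) / m)
    return (sum(v ** (q / 2.0) for v in f2) / m) ** (1.0 / q)
```

Fitting log F_q(s) against log s over a range of scales then yields the generalized Hurst exponent h(q), from which τ(q) and f(α) follow in the traditional approach.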

  7. Using object-oriented analysis to design a multi-mission ground data system

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1995-01-01

    This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.

  8. A prototype framework for models of socio-hydrology: identification of key feedback loops and parameterisation approach

    NASA Astrophysics Data System (ADS)

    Elshafei, Y.; Sivapalan, M.; Tonts, M.; Hipsey, M. R.

    2014-06-01

    It is increasingly acknowledged that, in order to sustainably manage global freshwater resources, it is critical that we better understand the nature of human-hydrology interactions at the broader catchment system scale. Yet to date, a generic conceptual framework for building models of catchment systems that include adequate representation of socioeconomic systems - and the dynamic feedbacks between human and natural systems - has remained elusive. In an attempt to work towards such a model, this paper outlines a generic framework for models of socio-hydrology applicable to agricultural catchments, made up of six key components that combine to form the coupled system dynamics: namely, catchment hydrology, population, economics, environment, socioeconomic sensitivity and collective response. The conceptual framework posits two novel constructs: (i) a composite socioeconomic driving variable, termed the Community Sensitivity state variable, which seeks to capture the perceived level of threat to a community's quality of life, and acts as a key link tying together one of the fundamental feedback loops of the coupled system, and (ii) a Behavioural Response variable as the observable feedback mechanism, which reflects land and water management decisions relevant to the hydrological context. The framework makes a further contribution through the introduction of three macro-scale parameters that enable it to normalise for differences in climate, socioeconomic and political gradients across study sites. In this way, the framework provides for both macro-scale contextual parameters, which allow for comparative studies to be undertaken, and catchment-specific conditions, by way of tailored "closure relationships", in order to ensure that site-specific and application-specific contexts of socio-hydrologic problems can be accommodated. To demonstrate how such a framework would be applied, two socio-hydrological case studies, taken from the Australian experience, are presented.
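    As a purely hypothetical illustration of the kind of coupled feedback loop the framework describes, a two-state toy model in which rising community sensitivity throttles water extraction (all state variables, equations and parameter values below are invented for illustration and are not drawn from the paper):

```python
def simulate(steps=200, dt=0.1, inflow=1.0):
    """Toy two-state socio-hydrology loop, integrated with Euler steps:
    normalised storage W falls with extraction, community sensitivity V
    rises as storage degrades, and the behavioural response (1 - V)
    throttles extraction, closing the feedback loop."""
    W, V = 1.0, 0.0
    history = []
    for _ in range(steps):
        extraction = 1.5 * (1.0 - V)       # high sensitivity curbs use
        dW = inflow - extraction * W       # water balance
        dV = 0.5 * ((1.0 - W) - V)         # sensitivity tracks degradation
        W += dt * dW
        V += dt * dV
        history.append((W, V))
    return history
```

With these invented parameters the system settles into an equilibrium where extraction balances inflow, illustrating how a coupled human-water model can self-regulate through the sensitivity-response loop.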

  9. Genome sequence analysis of dengue virus 1 isolated in Key West, Florida.

    PubMed

    Shin, Dongyoung; Richards, Stephanie L; Alto, Barry W; Bettinardi, David J; Smartt, Chelsea T

    2013-01-01

    Dengue virus (DENV) is transmitted to humans through the bite of mosquitoes. In November 2010, a dengue outbreak was reported in Monroe County in southern Florida (FL), including greater than 20 confirmed human cases. The virus collected from the human cases was verified as DENV serotype 1 (DENV-1), and one isolate was provided for sequence analysis. RNA was extracted from the DENV-1 isolate and used in reverse transcription polymerase chain reaction (RT-PCR) to amplify PCR fragments for sequencing. Nucleic acid primers were designed to generate overlapping PCR fragments that covered the entire genome. The DENV-1 isolate found in Key West (KW), FL was sequenced for whole-genome characterization. Sequence assembly, GenBank searches, and recombination analyses were performed to verify the identity of the genome sequences and to determine percent similarity to known DENV-1 sequences. We show that the KW DENV-1 strain is 99% identical to Nicaraguan and Mexican DENV-1 strains. Phylogenetic and recombination analyses suggest that the DENV-1 isolated in KW originated from Nicaragua (NI) and that the KW strain may continue to circulate in KW. Recombination analysis also detected recombination events in the KW strain relative to DENV-1 strains from Puerto Rico. We evaluate the relative growth of the KW strain of DENV-1 compared with other dengue viruses to determine whether the underlying genetics of the strain are associated with a replicative advantage, an important consideration since domestic tourism can spread DENVs and give rise to local transmission.

  10. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicators fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
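    The per-category technical-analysis measures fed to each network can be illustrated with a few standard indicators. A minimal sketch (these particular indicators are common textbook examples, not necessarily the exact set used in the paper):

```python
def sma(prices, window):
    """Simple moving average over the trailing window."""
    return sum(prices[-window:]) / window

def momentum(prices, lag):
    """Price change over the last `lag` periods."""
    return prices[-1] - prices[-1 - lag]

def rsi(prices, period=14):
    """Relative Strength Index over the last `period` price changes."""
    start = len(prices) - period - 1
    deltas = [prices[i + 1] - prices[i] for i in range(start, len(prices) - 1)]
    gains = sum(d for d in deltas if d > 0)
    losses = -sum(d for d in deltas if d < 0)
    if losses == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + gains / losses)
```

In an ensemble of the kind described, each member network would receive a feature vector drawn from one such category (trend, momentum, oscillator, ...) rather than raw lagged prices.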

  11. The Phasor Approach to Fluorescence Lifetime Imaging Analysis

    PubMed Central

    Digman, Michelle A.; Caiolfa, Valeria R.; Zamai, Moreno; Gratton, Enrico

    2008-01-01

    Changing the data representation from the classical time delay histogram to the phasor representation provides a global view of the fluorescence decay at each pixel of an image. In the phasor representation we can easily recognize the presence of different molecular species in a pixel or the occurrence of fluorescence resonance energy transfer. The analysis of fluorescence lifetime imaging microscopy (FLIM) data in phasor space is done by observing the clustering of pixel values in specific regions of the phasor plot, rather than by fitting the fluorescence decay using exponentials. The analysis is instantaneous since it is not based on calculations or nonlinear fitting. The phasor approach has the potential to simplify the way data are analyzed in FLIM, paving the way for the analysis of large data sets and, in general, making the FLIM technique accessible to nonexperts in spectroscopy and data analysis. PMID:17981902
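    For a single-exponential decay the phasor coordinates have a closed form and fall on the universal semicircle g^2 + s^2 = g, which is what makes clustering in the phasor plot directly interpretable. A minimal sketch of that relation:

```python
def phasor(tau, omega):
    """Phasor coordinates (g, s) of a single-exponential decay with
    lifetime tau at angular modulation frequency omega:
    g = 1/(1 + (w*tau)^2), s = w*tau/(1 + (w*tau)^2).
    Single exponentials satisfy g**2 + s**2 == g (the semicircle)."""
    wt = omega * tau
    denom = 1.0 + wt * wt
    return 1.0 / denom, wt / denom
```

Mixtures of species land inside the semicircle as linear combinations of their component phasors, which is the geometric fact the clustering analysis exploits.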

  12. Chronic Absenteeism: A Key Indicator of Student Success. Policy Analysis

    ERIC Educational Resources Information Center

    Rafa, Alyssa

    2017-01-01

    Research shows that chronic absenteeism can affect academic performance in later grades and is a key early warning sign that a student is more likely to drop out of high school. Several states enacted legislation to address this issue, and many states are currently discussing the utility of chronic absenteeism as an indicator of school quality or…

  13. HemoVision: An automated and virtual approach to bloodstain pattern analysis.

    PubMed

    Joris, Philip; Develter, Wim; Jenar, Els; Suetens, Paul; Vandermeulen, Dirk; Van de Voorde, Wim; Claes, Peter

    2015-06-01

    Bloodstain pattern analysis (BPA) is a subspecialty of forensic sciences, dealing with the analysis and interpretation of bloodstain patterns in crime scenes. The aim of BPA is uncovering new information about the actions that took place in a crime scene, potentially leading to a confirmation or refutation of a suspect's statement. A typical goal of BPA is to estimate the flight paths for a set of stains, followed by a directional analysis in order to estimate the area of origin for the stains. The traditional approach, referred to as stringing, consists of attaching a piece of string to each stain, and letting the string represent an approximation of the stain's flight path. Even though stringing has been used extensively, many (practical) downsides exist. We propose an automated and virtual approach, employing fiducial markers and digital images. By automatically reconstructing a single coordinate frame from several images, limited user input is required. Synthetic crime scenes were created and analysed in order to evaluate the approach. Results demonstrate the correct operation and practical advantages, suggesting that the proposed approach may become a valuable asset for practically analysing bloodstain spatter patterns. Accompanying software called HemoVision is currently provided as a demonstrator and will be further developed for practical use in forensic investigations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
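    Estimating a stain's flight path, whether by physical stringing or virtually, starts from the classic impact-angle relation alpha = arcsin(width/length) for an elliptical stain. A minimal sketch of that step (the full marker-based reconstruction pipeline is not reproduced here):

```python
import math

def impact_angle(width, length):
    """Impact angle (degrees) of an elliptical bloodstain from its
    width-to-length ratio: alpha = arcsin(width / length)."""
    if not 0 < width <= length:
        raise ValueError("require 0 < width <= length")
    return math.degrees(math.asin(width / length))
```

A circular stain (width equal to length) corresponds to a perpendicular, 90-degree impact, while increasingly elongated stains indicate shallower trajectories.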

  14. The Use of a Modified Semantic Features Analysis Approach in Aphasia

    ERIC Educational Resources Information Center

    Hashimoto, Naomi; Frome, Amber

    2011-01-01

    Several studies have reported improved naming using the semantic feature analysis (SFA) approach in individuals with aphasia. Whether the SFA can be modified and still produce naming improvements in aphasia is unknown. The present study was designed to address this question by using a modified version of the SFA approach. Three, rather than the…

  15. Efficient multiparty quantum key agreement with collective detection.

    PubMed

    Huang, Wei; Su, Qi; Liu, Bin; He, Yuan-Hang; Fan, Fan; Xu, Bing-Jie

    2017-11-10

    As a burgeoning branch of quantum cryptography, quantum key agreement is a kind of key establishing process in which the security and fairness of the established common key must be guaranteed simultaneously. However, the difficulty of designing a qualified quantum key agreement protocol increases significantly with the number of involved participants. Thus far, only a few of the existing multiparty quantum key agreement (MQKA) protocols can really achieve security and fairness. Nevertheless, these qualified MQKA protocols are either too inefficient or too impractical. In this paper, an MQKA protocol is proposed with single photons in travelling mode. Since only one eavesdropping detection is needed in the proposed protocol, its qubit efficiency and measurement efficiency are higher than those of the existing ones in theory. Compared with protocols that make use of entangled states or multi-particle measurements, the proposed protocol is more feasible with current technologies. Security and fairness analysis shows that the proposed protocol is not only immune to attacks from external eavesdroppers, but also free from attacks from internal betrayers.

  16. One Step Quantum Key Distribution Based on EPR Entanglement

    PubMed Central

    Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao

    2016-01-01

    A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves its maneuverability. Moreover, a security analysis is given: a simple type of eavesdropper’s attack would introduce an error rate of at least 46.875%. Compared with the “Ping-pong” protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step. PMID:27357865

  17. A chemical proteomics approach for global analysis of lysine monomethylome profiling.

    PubMed

    Wu, Zhixiang; Cheng, Zhongyi; Sun, Mingwei; Wan, Xuelian; Liu, Ping; He, Tieming; Tan, Minjia; Zhao, Yingming

    2015-02-01

    Methylation of lysine residues on histone proteins is known to play an important role in chromatin structure and function. However, non-histone protein substrates of this modification remain largely unknown. An effective approach for system-wide analysis of protein lysine methylation, particularly lysine monomethylation, is lacking. Here we describe a chemical proteomics approach for global screening for monomethyllysine substrates, involving chemical propionylation of monomethylated lysine, affinity enrichment of the modified monomethylated peptides, and HPLC/MS/MS analysis. Using this approach, we identified with high confidence 446 lysine monomethylation sites in 398 proteins, including three previously unknown histone monomethylation marks, representing the largest data set of protein lysine monomethylation described to date. Our data not only confirms previously discovered lysine methylation substrates in the nucleus and spliceosome, but also reveals new substrates associated with diverse biological processes. This method hence offers a powerful approach for dynamic study of protein lysine monomethylation under diverse cellular conditions and in human diseases. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  18. Building Interdisciplinary Research Capacity: a Key Challenge for Ecological Approaches in Public Health.

    PubMed

    Galway, Lindsay P; Parkes, Margot W; Allen, Diana; Takaro, Tim K

    2016-01-01

    The shortcomings of public health research informed by reductionist and fragmented biomedical approaches and the emergence of wicked problems are fueling a renewed interest in ecological approaches in public health. Despite the central role of interdisciplinarity in the context of ecological approaches in public health research, inadequate attention has been given to the specific challenge of doing interdisciplinary research in practice. As a result, important knowledge gaps exist with regards to the practice of interdisciplinary research. We argue that explicit attention towards the challenge of doing interdisciplinary research is critical in order to effectively apply ecological approaches to public health issues. This paper draws on our experiences developing and conducting an interdisciplinary research project exploring the links among climate change, water, and health to highlight five specific insights which we see as relevant to building capacity for interdisciplinary research specifically, and which have particular relevance to addressing the integrative challenges demanded by ecological approaches to address public health issues. These lessons include: (i) the need for frameworks that facilitate integration; (ii) emphasize learning-by-doing; (iii) the benefits of examining issues at multiple scales; (iv) make the implicit, explicit; and (v) the need for reflective practice. By synthesizing and sharing experiences gained by engaging in interdisciplinary inquiries using an ecological approach, this paper responds to a growing need to build interdisciplinary research capacity as a means for advancing the ecological public health agenda more broadly.

  19. Building Interdisciplinary Research Capacity: a Key Challenge for Ecological Approaches in Public Health

    PubMed Central

    Galway, Lindsay P.; Parkes, Margot W.; Allen, Diana; Takaro, Tim K.

    2016-01-01

    The shortcomings of public health research informed by reductionist and fragmented biomedical approaches, together with the emergence of wicked problems, are fueling a renewed interest in ecological approaches in public health. Despite the central role of interdisciplinarity in ecological approaches to public health research, inadequate attention has been given to the specific challenge of doing interdisciplinary research in practice. As a result, important knowledge gaps exist with regard to the practice of interdisciplinary research. We argue that explicit attention to the challenge of doing interdisciplinary research is critical in order to apply ecological approaches to public health issues effectively. This paper draws on our experiences developing and conducting an interdisciplinary research project exploring the links among climate change, water, and health to highlight five insights that we see as relevant to building capacity for interdisciplinary research, and that have particular relevance to the integrative challenges posed by ecological approaches to public health issues. These lessons are: (i) the need for frameworks that facilitate integration; (ii) the value of learning-by-doing; (iii) the benefits of examining issues at multiple scales; (iv) the importance of making the implicit explicit; and (v) the need for reflective practice. By synthesizing and sharing experiences gained by engaging in interdisciplinary inquiries using an ecological approach, this paper responds to a growing need to build interdisciplinary research capacity as a means of advancing the ecological public health agenda more broadly. PMID:29546171

  20. A Novel Real-Time Reference Key Frame Scan Matching Method

    PubMed Central

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-01-01

    Unmanned aerial vehicles (UAVs) represent an effective technology for indoor search and rescue operations. Typically, indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by simultaneous localization and mapping (SLAM), using either local or global scan matching approaches. Both suffer from accumulated errors and long processing times due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF), a hybrid scan matching technique combining feature-to-feature and point-to-point approaches. The algorithm aims to mitigate error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with those of various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computational times, indicating its potential for use in real-time systems. PMID:28481285

  1. A Novel Real-Time Reference Key Frame Scan Matching Method.

    PubMed

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-05-07

    Unmanned aerial vehicles (UAVs) represent an effective technology for indoor search and rescue operations. Typically, indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by simultaneous localization and mapping (SLAM), using either local or global scan matching approaches. Both suffer from accumulated errors and long processing times due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF), a hybrid scan matching technique combining feature-to-feature and point-to-point approaches. The algorithm aims to mitigate error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with those of various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computational times, indicating its potential for use in real-time systems.
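
    The point-to-point fallback the abstract describes is the iterative closest point (ICP) algorithm. The sketch below is a minimal 2D ICP in pure Python (the function name and iteration count are our own; the published RKF method layers feature-to-feature matching and key-frame bookkeeping on top of this, which is not shown):

```python
import math

def icp_2d(src, dst, iters=20):
    """Point-to-point ICP: rigidly align scan `src` to scan `dst`
    (both lists of (x, y) tuples); returns the transformed points."""
    cur = list(src)
    for _ in range(iters):
        # 1. Associate each point with its nearest neighbour in dst
        #    (the step the paper notes is prone to outlier associations).
        pairs = [(p, min(dst, key=lambda q: (q[0] - p[0])**2 + (q[1] - p[1])**2))
                 for p in cur]
        n = len(pairs)
        # 2. Closed-form 2D rigid alignment of the matched point sets.
        mx = sum(p[0] for p, _ in pairs) / n
        my = sum(p[1] for p, _ in pairs) / n
        nx = sum(q[0] for _, q in pairs) / n
        ny = sum(q[1] for _, q in pairs) / n
        sxx = sum((p[0]-mx)*(q[0]-nx) + (p[1]-my)*(q[1]-ny) for p, q in pairs)
        sxy = sum((p[0]-mx)*(q[1]-ny) - (p[1]-my)*(q[0]-nx) for p, q in pairs)
        th = math.atan2(sxy, sxx)
        c, s = math.cos(th), math.sin(th)
        # 3. Apply the incremental rotation about the matched centroids and iterate.
        cur = [(c*(x-mx) - s*(y-my) + nx, s*(x-mx) + c*(y-my) + ny) for x, y in cur]
    return cur
```

    For small displacements the nearest-neighbour associations are correct from the first pass and the closed-form step recovers the full transform in one iteration; large displacements or sparse overlap are exactly where the iterative cost and outlier associations criticized above appear.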

  2. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    ERIC Educational Resources Information Center

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
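
    The core of such a multimodel analysis can be sketched in a few lines: re-estimate the focal coefficient under every subset of candidate control variables and inspect the resulting distribution. This is a hypothetical pure-Python sketch, not Young and Holsteen's actual software, which also reports standard errors and specification influence:

```python
from itertools import combinations

def ols(y, X):
    """Least squares via the normal equations, solved by Gauss-Jordan
    elimination (adequate for the handful of regressors used here)."""
    n, k = len(y), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         + [sum(X[i][r] * y[i] for i in range(n))] for r in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[r][k] / A[r][r] for r in range(k)]

def modeling_distribution(y, focal, controls):
    """Re-estimate the coefficient on `focal` under every subset of the
    candidate controls; the spread of the returned coefficients shows how
    robust the estimate is to model specification."""
    names = sorted(controls)
    coefs = []
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            X = [[1.0, focal[i]] + [controls[c][i] for c in combo]
                 for i in range(len(y))]
            coefs.append(ols(y, X)[1])  # index 1 = focal's coefficient
    return coefs
```

    With k candidate controls this enumerates 2^k specifications; a focal coefficient that stays stable across all of them is robust in the sense the abstract describes.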

  3. Identifying key areas for active interprofessional learning partnerships: A facilitated dialogue.

    PubMed

    Steven, Kathryn; Angus, Allyson; Breckenridge, Jenna; Davey, Peter; Tully, Vicki; Muir, Fiona

    2016-11-01

    Student and service user involvement is recognised as an important factor in creating interprofessional education (IPE) opportunities. We used a team-based learning approach to bring together undergraduate health professional students, early career professionals (ECPs), public partners, volunteers, and carers to explore learning partnerships. Influenced by evaluative inquiry, this qualitative study used a free text response to allow participants to give their own opinion. A total of 153 participants (50 public partners and 103 students and professionals representing 11 healthcare professions) took part. Participants were divided into mixed groups of six (n = 25) and asked to identify areas where students, professionals, and public could work together to improve health professional education. Each group documented their discussions by summarising agreed areas and next steps. Responses were collected and transcribed for inductive content analysis. Seven key themes (areas for joint working) were identified: communication, public as partners, standards of conduct, IPE, quality improvement, education, and learning environments. The team-based learning format enabled undergraduate and postgraduate health professionals to achieve consensus with public partners on areas for IPE and collaboration. Some of our results may be context-specific but the approach is generalisable to other areas.

  4. In acceptance we trust? Conceptualising acceptance as a viable approach to NGO security management.

    PubMed

    Fast, Larissa A; Freeman, C Faith; O'Neill, Michael; Rowley, Elizabeth

    2013-04-01

    This paper documents current understanding of acceptance as a security management approach and explores issues and challenges non-governmental organisations (NGOs) confront when implementing an acceptance approach to security management. It argues that the failure of organisations to systematise and clearly articulate acceptance as a distinct security management approach and a lack of organisational policies and procedures concerning acceptance hinder its efficacy as a security management approach. The paper identifies key and cross-cutting components of acceptance that are critical to its effective implementation in order to advance a comprehensive and systematic concept of acceptance. The key components of acceptance illustrate how organisational and staff functions affect positively or negatively an organisation's acceptance, and include: an organisation's principles and mission, communications, negotiation, programming, relationships and networks, stakeholder and context analysis, staffing, and image. The paper contends that acceptance is linked not only to good programming, but also to overall organisational management and structures. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  5. Transperitoneal approach versus retroperitoneal approach: a meta-analysis of laparoscopic partial nephrectomy for renal cell carcinoma.

    PubMed

    Ren, Tong; Liu, Yan; Zhao, Xiaowen; Ni, Shaobin; Zhang, Cheng; Guo, Changgang; Ren, Minghua

    2014-01-01

    To compare the efficiency and safety of transperitoneal versus retroperitoneal approaches in laparoscopic partial nephrectomy for renal cell carcinoma and provide evidence-based support for clinical treatment. A systematic computer search of PUBMED, EMBASE, and the Cochrane Library was executed to identify retrospective observational and prospective randomized controlled studies comparing the outcomes of the two approaches in laparoscopic partial nephrectomy. Two reviewers independently screened, extracted, and evaluated the included studies and performed statistical analysis using STATA 12.0. Outcomes of interest included perioperative and postoperative variables, surgical complications and oncological variables. Eight studies comparing transperitoneal laparoscopic partial nephrectomy (TLPN) with retroperitoneal laparoscopic partial nephrectomy (RLPN) were included. RLPN had a shorter operating time (SMD = 1.001, 95% confidence interval [CI] 0.609-1.393, P < 0.001), a lower estimated blood loss (SMD = 0.403, 95% CI 0.015-0.791, P = 0.042) and a shorter length of hospital stay (WMD = 0.936 days, 95% CI 0.609-1.263, P < 0.001) than TLPN. There were no significant differences between the transperitoneal and retroperitoneal approaches in the other outcomes of interest. This meta-analysis indicates that, in appropriately selected patients, especially those with a history of intraperitoneal procedures or posteriorly located renal tumors, RLPN can shorten the operation time, reduce the estimated blood loss and shorten the length of hospital stay. RLPN may be equally safe and faster compared with TLPN.
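
    The pooling step behind figures such as "SMD = 1.001, 95% CI 0.609-1.393" is inverse-variance weighting of per-study effects. A generic fixed-effect sketch follows (the authors used STATA 12.0 and may well have applied random-effects models and the Hedges small-sample correction, neither of which is shown):

```python
import math

def pooled_smd(studies):
    """Fixed-effect inverse-variance pooling of standardized mean differences.
    studies: list of (smd, n1, n2) tuples; the per-study variance uses the
    usual large-sample formula var = (n1+n2)/(n1*n2) + smd^2 / (2*(n1+n2))."""
    num = den = 0.0
    for d, n1, n2 in studies:
        var = (n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2))
        w = 1.0 / var                # weight = inverse variance
        num += w * d
        den += w
    est = num / den                  # pooled effect
    se = math.sqrt(1.0 / den)        # standard error of the pooled effect
    return est, (est - 1.96 * se, est + 1.96 * se)  # 95% CI
```

    Adding studies tightens the interval without changing the estimate when the effects agree, which is the behaviour a fixed-effect pool should show.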

  6. LitMiner and WikiGene: identifying problem-related key players of gene regulation using publication abstracts.

    PubMed

    Maier, Holger; Döhr, Stefanie; Grote, Korbinian; O'Keeffe, Sean; Werner, Thomas; Hrabé de Angelis, Martin; Schneider, Ralf

    2005-07-01

    The LitMiner software is a literature data-mining tool that facilitates the identification of major gene regulation key players related to a user-defined field of interest in PubMed abstracts. The prediction of gene-regulatory relationships is based on co-occurrence analysis of key terms within the abstracts. LitMiner predicts relationships between key terms from the biomedical domain in four categories (genes, chemical compounds, diseases and tissues). Owing to the limitations (no direction, unverified automatic prediction) of the co-occurrence approach, the primary data in the LitMiner database represent postulated basic gene-gene relationships. The usefulness of the LitMiner system has been demonstrated recently in a study that reconstructed disease-related regulatory networks by promoter modelling that was initiated by a LitMiner generated primary gene list. To overcome the limitations and to verify and improve the data, we developed WikiGene, a Wiki-based curation tool that allows revision of the data by expert users over the Internet. LitMiner (http://andromeda.gsf.de/litminer) and WikiGene (http://andromeda.gsf.de/wiki) can be used unrestricted with any Internet browser.
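
    The co-occurrence idea underlying LitMiner can be sketched very compactly: count how often pairs of key terms appear in the same abstract. This is an illustrative sketch, not LitMiner's actual code; plain substring matching stands in for its term recognition across the four categories:

```python
from itertools import combinations
from collections import Counter

def cooccurrence(abstracts, terms):
    """Count how often each unordered pair of key terms appears in the same
    abstract. High joint counts suggest a postulated, direction-less
    relationship -- exactly the limitation the abstract notes."""
    pair_counts = Counter()
    for text in abstracts:
        t = text.lower()
        present = sorted(term for term in terms if term.lower() in t)
        for a, b in combinations(present, 2):
            pair_counts[(a, b)] += 1
    return pair_counts
```

    The output is the "primary gene-gene relationship" table; curation (as with WikiGene) is then needed to add direction and verification.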

  7. Information security system based on virtual-optics imaging methodology and public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong

    In this paper, we present a virtual-optics based information security system model with the aid of public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on virtual-optics imaging methodology (VOIM) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied to encipher and decipher the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key, and vice versa. The whole information security model runs under the framework of PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOIM security approach provides additional features such as confidentiality, authentication, and integrity for data encryption in a networked environment.
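
    The hybrid architecture described here, a fast symmetric cipher for the data plus an asymmetric cipher for the session key, can be sketched as follows. This is a toy illustration only: the primes are tiny and insecure, and a SHA-256 XOR keystream stands in for the VOIM symmetric stage, which is optical and not reproducible here:

```python
import hashlib

# Toy RSA with fixed small primes -- for illustration only, never for real use.
P, Q, E = 61, 53, 17
N, PHI = P * Q, (P - 1) * (Q - 1)
D = pow(E, -1, PHI)  # private exponent (modular inverse; Python 3.8+)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Stand-in symmetric cipher: XOR with a SHA-256 counter keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def encrypt(message: bytes, session_key: int):
    """Encipher the data symmetrically, then RSA-wrap the session key.
    session_key must be smaller than the RSA modulus N."""
    ct = keystream_xor(session_key.to_bytes(2, "big"), message)
    wrapped_key = pow(session_key, E, N)
    return ct, wrapped_key

def decrypt(ct: bytes, wrapped_key: int):
    """Unwrap the session key with the private exponent, then decipher."""
    session_key = pow(wrapped_key, D, N)
    return keystream_xor(session_key.to_bytes(2, "big"), ct)
```

    Only the short session key pays the cost of the slow asymmetric operation; the bulk data goes through the fast symmetric stage, which is the point of the hybrid design.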

  8. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  9. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    PubMed

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.

  10. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    PubMed Central

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899
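
    The PageRank computation applied to a document network like the DGD is the standard power iteration; a minimal sketch (the graph structure below is generic, not the DGD's actual edges):

```python
def pagerank(graph, damping=0.85, iters=50):
    """Power-iteration PageRank. graph: dict node -> list of out-neighbours,
    e.g. the directed social links between documents in a DGD-style network."""
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    nxt[w] += share
            else:  # dangling node: spread its rank uniformly
                for w in nodes:
                    nxt[w] += damping * rank[v] / n
        rank = nxt
    return rank
```

    In a reranking setup like the one described, these scores would be interpolated with the retrieval-model scores rather than used alone.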

  11. Identification of key factors affecting the water pollutant concentration in the sluice-controlled river reaches of the Shaying River in China via statistical analysis methods.

    PubMed

    Dou, Ming; Zhang, Yan; Zuo, Qiting; Mi, Qingbin

    2015-08-01

    The construction of sluices creates a strong disturbance in water environmental factors within a river. The change in water pollutant concentrations of sluice-controlled river reaches (SCRRs) is more complex than that of natural river segments. To determine the key factors affecting water pollutant concentration changes in SCRRs, river reaches near the Huaidian Sluice in the Shaying River of China were selected as a case study, and water quality monitoring experiments based on different regulating modes were implemented in 2009 and 2010. To identify the key factors affecting the change rates of the permanganate-index chemical oxygen demand (CODMn) and ammonia nitrogen (NH3-N) concentrations in the SCRRs of the Huaidian Sluice, partial correlation analysis, principal component analysis and principal factor analysis were used. The results indicate four common key factors for the CODMn and NH3-N concentration change rates: the inflow quantity from upper reaches, the opening size of the sluice gates, the water pollutant concentration from upper reaches, and the turbidity before the sluice. Moreover, the dissolved oxygen before the sluice is a key factor for the CODMn concentration change rate, and the water depth before the sluice is a key factor for the NH3-N concentration change rate. Multiple linear regressions between the water pollutant concentration change rates and the key factors were established via multiple linear regression analysis, and the quantitative relationship between the CODMn and NH3-N concentration change rates and the key affecting factors was analyzed. Finally, the mechanism of action of the key factors affecting the water pollutant concentration changes was analyzed. The results reveal that the inflow quantity from upper reaches, the opening size of the sluice gates, the CODMn concentration from upper reaches and the dissolved oxygen before the sluice have a negative influence, and the turbidity before the sluice has a positive influence.
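
    The partial correlation analysis named here, correlating two variables while holding a third fixed, has a simple closed form. A generic sketch follows (the variable names are illustrative; the authors' exact procedure and data are in the paper):

```python
import math

def pearson(x, y):
    """Ordinary Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z:
    r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

    In a setting like this study, x and y might be a pollutant change rate and the gate opening size, with the inflow quantity as the controlled variable z.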

  12. Identification of key nitrous oxide production pathways in aerobic partial nitrifying granules.

    PubMed

    Ishii, Satoshi; Song, Yanjun; Rathnayake, Lashitha; Tumendelger, Azzaya; Satoh, Hisashi; Toyoda, Sakae; Yoshida, Naohiro; Okabe, Satoshi

    2014-10-01

    The identification of the key nitrous oxide (N2O) production pathways is important to establish a strategy to mitigate N2O emission. In this study, we combined real-time gas-monitoring analysis, (15)N stable isotope analysis, denitrification functional gene transcriptome analysis and microscale N2O concentration measurements to identify the main N2O producers in a partial nitrification (PN) aerobic granule reactor, which was fed with ammonium and acetate. Our results suggest that heterotrophic denitrification was the main contributor to N2O production in our PN aerobic granule reactor. The heterotrophic denitrifiers were probably related to Rhodocyclales bacteria, although different types of bacteria were active in the initial and latter stages of the PN reaction cycles, most likely in response to the presence of acetate. Hydroxylamine oxidation and nitrifier denitrification occurred, but their contribution to N2O emission was relatively small (20-30%) compared with heterotrophic denitrification. Our approach can be useful to quantitatively examine the relative contributions of the three pathways (hydroxylamine oxidation, nitrifier denitrification and heterotrophic denitrification) to N2O emission in mixed microbial populations. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.

  13. Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling.

    PubMed

    Zhang, Jiachao; Hu, Qisong; Xu, Chuanbiao; Liu, Sixin; Li, Congfa

    2016-01-01

    Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry.

  14. Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling

    PubMed Central

    Xu, Chuanbiao; Liu, Sixin; Li, Congfa

    2016-01-01

    Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry. PMID:27768750
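
    The distance matrix fed into a principal coordinates analysis like the one above is built from pairwise community dissimilarities. UniFrac itself requires a phylogenetic tree; as a simpler, abundance-only stand-in, the Bray-Curtis dissimilarity between OTU count vectors can be sketched as:

```python
def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two OTU count vectors
    (0 = identical communities, 1 = no shared abundance)."""
    den = sum(a + b for a, b in zip(u, v))
    if den == 0:
        return 0.0
    return sum(abs(a - b) for a, b in zip(u, v)) / den

def distance_matrix(samples):
    """Pairwise dissimilarities, e.g. one OTU vector per peeling time point."""
    return [[bray_curtis(u, v) for v in samples] for u in samples]
```

    Ordinating such a matrix (by PCoA) is what reveals structural shifts such as the Selenomonas and Prevotella increases reported above.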

  15. Multiple Paths to Mathematics Practice in Al-Kashi's Key to Arithmetic

    NASA Astrophysics Data System (ADS)

    Taani, Osama

    2014-01-01

    In this paper, I discuss one of the most distinguishing features of Jamshid al-Kashi's pedagogy from his Key to Arithmetic, a well-known Arabic mathematics textbook from the fifteenth century. This feature is the multiple paths that he includes to find a desired result. In the first section light is shed on al-Kashi's life and his contributions to mathematics and astronomy. Section 2 starts with a brief discussion of the contents and pedagogy of the Key to Arithmetic. Al-Kashi's multiple approaches are discussed through four different examples of his versatility in presenting a topic from multiple perspectives. These examples are multiple definitions, multiple algorithms, multiple formulas, and multiple methods for solving word problems. Section 3 is devoted to some benefits that can be gained by implementing al-Kashi's multiple paths approach in modern curricula. For this discussion, examples from two teaching modules taken from the Key to Arithmetic and implemented in Pre-Calculus and mathematics courses for preservice teachers are discussed. Also, the conclusions are supported by some aspects of these modules. This paper is an attempt to help mathematics educators explore more benefits from reading from original sources.

  16. Are the Keys loved to death? A study of diver specialization levels and preferences in the Florida Keys

    Treesearch

    Shona Paterson; David K. Loomis

    2010-01-01

    This paper presents research conducted for the Florida Reef Resilience Program on nonresident recreational SCUBA divers in three zones of the Florida Keys. When divers were segmented into specialization subgroups for analysis, divers in different subgroups tended to use different geographic locations. These results suggest differences in user preferences; yet when...

  17. A simplified approach for slope stability analysis of uncontrolled waste dumps.

    PubMed

    Turer, Dilek; Turer, Ahmet

    2011-02-01

    Slope stability analysis of municipal solid waste has always been problematic because of the heterogeneous nature of the waste materials. The requirement for large testing equipment in order to obtain representative samples has identified the need for simplified approaches to obtain the unit weight and shear strength parameters of the waste. In the present study, two of the most recently published approaches for determining the unit weight and shear strength parameters of the waste have been incorporated into a slope stability analysis using the Bishop method to prepare slope stability charts. The slope stability charts were prepared for uncontrolled waste dumps having no liner and leachate collection systems, with pore pressure ratios of 0, 0.1, 0.2, 0.3, 0.4 and 0.5, considering the most critical slip surface passing through the toe of the slope. As the proposed slope stability charts were prepared by considering the change in unit weight as a function of height, they reflect field conditions better than a constant unit weight approach. They also streamline the selection of slope or height as a function of the desired factor of safety.
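
    The Bishop method referenced above computes the factor of safety by fixed-point iteration, since the safety factor appears on both sides of the slice equilibrium equation. The sketch below is the generic textbook form of Bishop's simplified method, not the authors' chart-generation code; the slip-circle geometry, slice weights and the height-dependent unit weight refinement are all assumed inputs:

```python
import math

def bishop_fos(slices, c, phi_deg, ru=0.0, iters=50):
    """Bishop's simplified factor of safety for a circular slip surface.
    slices: list of (W, alpha_deg, b) = slice weight, base inclination in
    degrees, base width. c: cohesion; phi_deg: friction angle;
    ru: pore pressure ratio (pore pressure as a fraction of overburden)."""
    tan_phi = math.tan(math.radians(phi_deg))
    fos = 1.0  # initial guess, refined by fixed-point iteration
    for _ in range(iters):
        num = den = 0.0
        for W, alpha_deg, b in slices:
            a = math.radians(alpha_deg)
            m = math.cos(a) * (1.0 + math.tan(a) * tan_phi / fos)
            num += (c * b + W * (1.0 - ru) * tan_phi) / m
            den += W * math.sin(a)
        fos = num / den
    return fos
```

    Raising the pore pressure ratio reduces the effective frictional resistance and hence the factor of safety, which is why the charts above are parameterized by ru.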

  18. Key Recommendations from the MedtecHTA Project.

    PubMed

    Tarricone, Rosanna; Torbica, Aleksandra; Drummond, Michael

    2017-02-01

    Medical devices (MDs) have particular characteristics, such as the device-user interaction, the incremental nature of innovation and the broader organizational impact, that lead to additional challenges for health technology assessment (HTA). The MedtecHTA project explored key aspects of the conduct and methods of HTA for MDs. Systematic reviews and original research studies were conducted to determine improvements in processes and methods that could enhance the potential for HTA and optimize the diffusion of MDs. Regulatory processes for MDs should be more closely aligned, the HTA evaluative framework should be harmonized, and processes for conditional coverage and evidence development should be used. The methods for HTA should consider MDs as complex interventions, require the establishment of high-quality registries, consider an iterative approach to evaluation over time, recognize and allow for the particular characteristics of devices, and use appropriate approaches for confounder adjustment in comparative effectiveness studies. To optimize diffusion, a common classification should be developed across countries to facilitate international comparisons, the factors driving diffusion should be explored in HTA reports, and physicians' personal goals and motivation should be better understood. The key recommendations of the MedtecHTA project should improve the conduct and use of HTA for MDs. © 2017 The Authors. Health Economics published by John Wiley & Sons, Ltd.

  19. A Huge Responsibility: Three Keys to Teaching Elementary Students

    ERIC Educational Resources Information Center

    Davison, Leslie

    2014-01-01

    Based on her 20 years of teaching Spanish, Leslie Davison strives for a holistic approach to teaching and learning that is authentic and relevant to her young language learners. Herein, she shares three keys to teaching elementary level students in a way that ensures they will have a "Can Do" attitude in terms of language proficiency and…

  20. Whole-genome CNV analysis: advances in computational approaches.

    PubMed

    Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.
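
    The simplest family of CNV detectors reviewed in such work is read-depth based: bin coverage into windows and flag windows whose depth deviates sharply from the genome-wide baseline. A deliberately simplistic z-score sketch (real callers additionally normalize for GC content and mappability and segment adjacent windows):

```python
from statistics import mean, stdev

def call_cnv(depths, z_cut=3.0):
    """Flag windows whose read depth deviates from the overall mean.
    depths: per-window mean coverage values.
    Returns a list of (window_index, 'gain' | 'loss') calls."""
    mu, sd = mean(depths), stdev(depths)
    calls = []
    for i, d in enumerate(depths):
        z = (d - mu) / sd
        if z > z_cut:
            calls.append((i, "gain"))   # candidate duplication
        elif z < -z_cut:
            calls.append((i, "loss"))   # candidate deletion
    return calls
```

    A doubled-coverage window reads as a gain and a near-zero window as a loss; the review above surveys the far more robust segmentation and normalization strategies production tools layer on top of this idea.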

  1. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk to an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to represent uncertainty with probability functions and linguistic terms. The linguistic terms add to the risk analysis the knowledge of experts on the investigated system or environment.
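
    The fuzzy-set machinery mentioned here, membership functions plus linguistic rules, can be illustrated with a minimal two-rule Mamdani-style sketch. The membership breakpoints, rule set and risk levels below are invented for illustration, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(susceptibility, exposure):
    """Two-rule sketch: min for AND, max for OR, then weighted-centroid
    defuzzification over two risk levels (low = 0.2, high = 0.8)."""
    sus_high = tri(susceptibility, 0.4, 0.7, 1.0)
    exp_high = tri(exposure, 0.4, 0.7, 1.0)
    sus_low = tri(susceptibility, 0.0, 0.3, 0.6)
    exp_low = tri(exposure, 0.0, 0.3, 0.6)
    risk_high = min(sus_high, exp_high)  # rule 1: high susceptibility AND high exposure
    risk_low = max(sus_low, exp_low)     # rule 2: low susceptibility OR low exposure
    total = risk_low + risk_high
    if total == 0:
        return 0.5  # no rule fires: neutral output
    return (0.2 * risk_low + 0.8 * risk_high) / total
```

    Linguistic expert judgments ("susceptibility is high") enter through the membership functions, which is how the paper's approach injects expert knowledge into an otherwise numerical risk analysis.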

  2. Using Formative Scenario Analysis approach for landslide risk analysis in a relatively scarce data environment: preliminary results

    NASA Astrophysics Data System (ADS)

    Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai

    2014-05-01

    It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and manage, and to help them make decisions that minimize the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform because the extensive data they require are not available. In addition, it has been shown that in scenario analysis deterministic approaches often fail to detect features of the system that reveal unexpected behaviors, resulting in the underestimation or omission of some impact factors. Here we present some preliminary results obtained by applying Formative Scenario Analysis, which can be considered a possible solution for landslide risk analysis in cases where the necessary data, even if they exist, are not available. This method is an expert-based approach that integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determine, by a rating procedure, quantitative relations between these factors; then, through mathematical operations, scenarios describing a certain state of the system are obtained. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement that can range from slide to flow. 
The material involved can be either soil, debris or a mixture of both, in Romanian

  3. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image is encrypted by the double random-phase-encoding method using the extended fractional Fourier transform. The key of the encryption process is encoded with the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm, and the encoded key is then transmitted to the receiver along with the encrypted image. In the decryption process, the encoded key is first decrypted using the secret key, and the encrypted image is then decrypted using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method alone because the problem associated with transmission of the key is eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
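The key-transport idea in this scheme can be sketched with textbook RSA (a toy illustration: the small primes, the stand-in phase-key parameter, and the function names are all invented for this sketch; a real system would use a vetted cryptographic library with OAEP padding):

```python
# Toy RSA key transport: the sender encrypts an encryption-key parameter
# with the receiver's public key; only the holder of the private key can
# recover it. Requires Python 3.8+ for the modular inverse via pow().
def rsa_keygen(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)           # private exponent: inverse of e mod phi
    return (e, n), (d, n)         # (public key, private key)

def rsa_encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)

def rsa_decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

pub, priv = rsa_keygen(61, 53)    # toy primes, far too small for real use
key_param = 42                    # stand-in for a random-phase key parameter
cipher = rsa_encrypt(key_param, pub)   # travels with the encrypted image
assert rsa_decrypt(cipher, priv) == key_param
```

The receiver first recovers `key_param` with the private key, then uses it to invert the double random-phase encoding, mirroring the two-stage decryption the abstract describes.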

  4. Weighted gene co‑expression network analysis in identification of key genes and networks for ischemic‑reperfusion remodeling myocardium.

    PubMed

    Guo, Nan; Zhang, Nan; Yan, Liqiu; Lian, Zheng; Wang, Jiawang; Lv, Fengfeng; Wang, Yunfei; Cao, Xufen

    2018-06-14

Acute myocardial infarction induces ventricular remodeling, which is implicated in dilated heart and heart failure. The pathogenic mechanism of myocardial remodeling remains to be elucidated. The aim of the present study was to identify key genes and networks for myocardial remodeling following ischemia‑reperfusion (IR). First, mRNA expression data from the National Center for Biotechnology Information database were downloaded to identify differences in mRNA expression of the IR heart at days 2 and 7. Then, weighted gene co‑expression network analysis, hierarchical clustering, protein‑protein interaction (PPI) network construction, Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis were used to identify key genes and networks for the heart remodeling process following IR. A total of 3,321 differentially expressed genes were identified during the heart remodeling process. A total of 6 modules were identified through gene co‑expression network analysis. GO and KEGG analysis results suggested that each module represented a different biological function and was associated with different pathways. Finally, hub genes of each module were identified by PPI network construction. The present study revealed that heart remodeling following IR is a complicated process, involving extracellular matrix organization, neural development, apoptosis and energy metabolism. The dysregulated genes, including SRC proto‑oncogene, non‑receptor tyrosine kinase, discs large MAGUK scaffold protein 1, ATP citrate lyase, RAN, member RAS oncogene family, tumor protein p53, and polo like kinase 2, may be essential for heart remodeling following IR and may be used as potential targets for the inhibition of heart remodeling following acute myocardial infarction.
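The weighted co-expression step underlying analyses like this one can be sketched as follows (synthetic data, not the study's: WGCNA's core move is raising the absolute gene-gene correlation to a soft-thresholding power to obtain a weighted network adjacency, from which highly connected hub genes emerge):

```python
import numpy as np

# Toy expression matrix: 20 samples x 6 genes of random data,
# standing in for the real mRNA profiles.
rng = np.random.default_rng(0)
expr = rng.normal(size=(20, 6))

corr = np.corrcoef(expr, rowvar=False)   # 6x6 gene-gene correlation
beta = 6                                  # soft-thresholding power
adjacency = np.abs(corr) ** beta          # weighted adjacency in [0, 1]
np.fill_diagonal(adjacency, 0.0)          # ignore self-connections

# Hub genes are those with the highest total connectivity.
connectivity = adjacency.sum(axis=0)
hub = int(np.argmax(connectivity))
```

Real WGCNA additionally chooses `beta` to approximate a scale-free topology and detects modules by clustering a topological-overlap measure; this sketch shows only the adjacency and hub-ranking steps.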

  5. Optical key system

    DOEpatents

    Hagans, Karla G.; Clough, Robert E.

    2000-01-01

An optical key system comprises a battery-operated optical key and an isolated lock that derives both its operating power and unlock signals from the correct optical key. A light emitting diode or laser diode is included within the optical key and is connected to transmit a bit-serial password. The key user physically enters either the code-to-transmit directly, or an index to a pseudorandom number code, in the key. Such personal identification numbers can be retained permanently or be ephemeral. When a send button is pressed, the key transmits a beam of light modulated with the password information. The modulated beam of light is received by a corresponding optical lock with a photovoltaic cell that produces enough power from the beam of light to operate password-screening digital logic. In one application, an acceptable password allows a two-watt laser diode to pump ignition and timing information over a fiberoptic cable into a sealed engine compartment. The receipt of a good password allows the fuel pump, spark, and starter systems to each operate. Therefore, bypassing the lock mechanism, as is now routine for automobile thieves, is pointless because the engine is so thoroughly disabled.

  6. Ideas for a pattern-oriented approach towards a VERA analysis ensemble

    NASA Astrophysics Data System (ADS)

    Gorgas, T.; Dorninger, M.

    2010-09-01

For many applications in meteorology, and especially for verification purposes, it is important to have information about the uncertainties of observation and analysis data. A high quality of these "reference data" is an absolute necessity, as the uncertainties are reflected in verification measures. The VERA (Vienna Enhanced Resolution Analysis) scheme includes a sophisticated quality control tool which accounts for the correction of observational data and provides an estimate of the observation uncertainty. It is crucial for meteorologically and physically reliable analysis fields. VERA is based on a variational principle and does not need any first-guess fields. It is therefore independent of NWP models and can also be used as an unbiased reference for real-time model verification. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called "fingerprint technique", which transfers information from data-rich to data-sparse regions. The enhanced joint D-PHASE and COPS data set forms the database for the analysis ensemble study. For the WWRP projects D-PHASE and COPS, a joint activity has been started to collect GTS and non-GTS data from the national and regional meteorological services in Central Europe for 2007. Data from more than 11,000 stations are available for high-resolution analyses. The use of random numbers as perturbations for ensemble experiments is a common approach in meteorology. In most implementations, as for NWP-model ensemble systems, the focus lies on error growth and propagation on the spatial and temporal scale. When defining errors in analysis fields we have to consider that analyses are not time dependent and that no perturbation method aimed at temporal evolution is possible. Further, the method applied should respect two major sources of analysis errors: observation errors AND analysis or

  7. Iterative key-residues interrogation of a phytase with thermostability increasing substitutions identified in directed evolution.

    PubMed

    Shivange, Amol V; Roccatano, Danilo; Schwaneberg, Ulrich

    2016-01-01

Bacterial phytases have attracted industrial interest as animal feed supplements due to their high activity and sufficient thermostability (required for feed pelleting). We devised an approach named KeySIDE, an iterative Key-residues interrogation of the wild type with Substitutions Identified in Directed Evolution, for improving Yersinia mollaretii phytase (Ymphytase) thermostability by combining key beneficial substitutions and elucidating their individual roles. Directed evolution yielded nine positions in Ymphytase, which were combined iteratively to identify the key ones. The "best" combination (M6: T77K, Q154H, G187S, and K289Q) resulted in significantly improved thermal resistance; residual activity improved from 35 % (wild type) to 89 % (M6) at 58 °C after 20 min of incubation. The melting temperature increased by 3 °C in M6 without loss of specific activity. Molecular dynamics simulation studies revealed reduced flexibility in the loops located next to the helices (B, F, and K) that carry substitutions (Helix-B: T77K, Helix-F: G187S, and Helix-K: K289E/Q). The reduced loop flexibility might be caused by a strengthened hydrogen-bonding network (e.g., G187S and K289E/K289Q) and a salt bridge (T77K). Our results demonstrate a promising approach to designing phytases for food research, and we hope that KeySIDE might become an attractive approach for understanding structure-function relationships of enzymes.

  8. How new concepts become universal scientific approaches: insights from citation network analysis of agent-based complex systems science.

    PubMed

    Vincenot, Christian E

    2018-03-14

Progress in understanding and managing complex systems comprised of decision-making agents, such as cells, organisms, ecosystems or societies, is, like many scientific endeavours, limited by disciplinary boundaries. These boundaries, however, are moving and can actively be made porous or even disappear. To study this process, I advanced an original bibliometric approach based on network analysis to track and understand the development of the model-based science of agent-based complex systems (ACS). I analysed research citations between the two communities devoted to ACS research, namely agent-based (ABM) and individual-based modelling (IBM). Both terms refer to the same approach, yet the former is preferred in engineering and social sciences, while the latter prevails in natural sciences. This situation provided a unique case study for grasping how a new concept evolves distinctly across scientific domains and how to foster convergence into a universal scientific approach. The present analysis based on novel hetero-citation metrics revealed the historical development of ABM and IBM, confirmed their past disjointedness, and detected their progressive merger. The separation between these synonymous disciplines had silently opposed the free flow of knowledge among ACS practitioners and thereby hindered the transfer of methodological advances and the emergence of general systems theories. A surprisingly small number of key publications sparked the ongoing fusion between ABM and IBM research. Besides reviews raising awareness of broad-spectrum issues, generic protocols for model formulation and boundary-transcending inference strategies were critical means of science integration. Accessible broad-spectrum software similarly contributed to this change. 
From the modelling viewpoint, the discovery of the unification of ABM and IBM demonstrates that a wide variety of systems substantiate the premise of ACS research that microscale behaviours of agents and system-level dynamics
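A hetero-citation measure of the kind the abstract describes can be illustrated on a toy citation graph (the paper names, community labels, and the exact metric definition below are hypothetical stand-ins for the study's novel metrics):

```python
# Toy citation graph: (citing paper, cited paper) pairs, with each paper
# assigned to one of the two communities (ABM or IBM).
citations = [
    ("abm1", "abm2"), ("abm1", "ibm1"),
    ("ibm1", "ibm2"), ("ibm2", "abm2"), ("abm2", "abm1"),
]
community = {"abm1": "ABM", "abm2": "ABM", "ibm1": "IBM", "ibm2": "IBM"}

def hetero_citation_fraction(cites, community, group):
    """Fraction of a group's outgoing citations that cross the boundary
    to the other community. 0.0 means the group only cites itself."""
    outgoing = [(a, b) for a, b in cites if community[a] == group]
    crossing = [(a, b) for a, b in outgoing if community[b] != group]
    return len(crossing) / len(outgoing) if outgoing else 0.0

abm_hetero = hetero_citation_fraction(citations, community, "ABM")  # 1/3
ibm_hetero = hetero_citation_fraction(citations, community, "IBM")  # 1/2
```

Tracking such a fraction over publication years is one simple way to detect the "progressive merger" of two literatures: disjoint communities sit near 0, while a fused field approaches the cross-citation rate expected by chance.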

  9. Mining key elements for severe convection prediction based on CNN

    NASA Astrophysics Data System (ADS)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

Severe convective weather is a type of weather disaster accompanied by heavy rainfall, gusty wind, hail, etc. With recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have accumulated that capture massive severe convective events over particular areas and time periods. With such high-volume and high-variety weather data, most existing studies and methods pursue dynamical laws, cause analysis, potential-rule study, and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on a convolutional neural network (CNN). It aims to identify the key areas and key elements in huge amounts of historical weather data, including conventional measurements, weather radar, and satellite data, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can support human forecasters in their decision-making on operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first utilizes computer vision techniques to preprocess the meteorological variables. It then uses information such as radar maps and expert knowledge to annotate all images automatically. Finally, using the CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.), identify the key areas of those critical weather elements, and thus help forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions. 
Based on the rich weather measurement and model data (up to 10 years) over Fujian province in China, where the severe convective weathers are very active during the summer months, experimental tests are conducted with
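The convolution operation at the heart of such a CNN can be sketched in a few lines (a synthetic field and a hand-made averaging kernel, not the study's trained filters):

```python
import numpy as np

def conv2d(field, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as in CNNs):
    slide the kernel over the field and record the response at each offset."""
    kh, kw = kernel.shape
    h, w = field.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(field[i:i+kh, j:j+kw] * kernel)
    return out

field = np.zeros((8, 8))
field[3:5, 3:5] = 1.0                 # a small synthetic "convective cell"
kernel = np.ones((2, 2)) / 4.0        # averaging detector for such a cell
response = conv2d(field, kernel)
peak = np.unravel_index(np.argmax(response), response.shape)  # key area
```

In a trained CNN the kernels are learned rather than hand-made, and peaks in the feature maps play the role of the "key areas" the method surfaces for forecasters.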

  10. Copyright Ownership of E-Learning and Teaching Materials: Policy Approaches Taken by UK Universities

    ERIC Educational Resources Information Center

    Gadd, Elizabeth; Weedon, Ralph

    2017-01-01

    Investigates whether and how UK university copyright policies address key copyright ownership issues relating to printed and electronic teaching materials. A content analysis of 81 UK university copyright policies is performed to understand their approach towards copyright ownership of printed and e-learning materials and performances; rights on…

  11. Models of the Behavior of People Searching the Internet: A Petri Net Approach.

    ERIC Educational Resources Information Center

    Kantor, Paul B.; Nordlie, Ragnar

    1999-01-01

    Illustrates how various key abstractions of information finding, such as document relevance, a desired number of relevant documents, discouragement, exhaustion, and satisfaction can be modeled using the Petri Net framework. Shows that this model leads naturally to a new approach to collection of user data, and to analysis of transaction logs.…
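The Petri Net mechanics behind such models can be sketched minimally (place and transition names below are invented for illustration, not taken from the article):

```python
# Minimal Petri net: tokens mark the searcher's state; a transition fires
# only when every one of its input places holds at least one token.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)        # place -> token count
        self.transitions = {}               # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        assert self.enabled(name), f"{name} is not enabled"
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A search session: the user needs two relevant documents before
# the "satisfied" transition can end the session.
net = PetriNet({"searching": 1, "relevant_docs_needed": 2})
net.add_transition("find_relevant",
                   inputs=["searching", "relevant_docs_needed"],
                   outputs=["searching"])
net.add_transition("satisfied", inputs=["searching"], outputs=["done"])

net.fire("find_relevant")   # one relevant document found
net.fire("find_relevant")   # desired number reached
net.fire("satisfied")       # session ends satisfied
```

States such as discouragement or exhaustion would be modeled the same way, as extra places whose tokens enable alternative session-ending transitions; firing sequences then map directly onto transaction-log events.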

  12. A social network analysis of alcohol-impaired drivers in Maryland : an egocentric approach.

    DOT National Transportation Integrated Search

    2011-04-01

This study examined the personal, household, and social structural attributes of alcohol-impaired drivers in Maryland. The study used an egocentric approach to social network analysis. This approach concentrated on specific actors (alcohol-impaire...

  13. ANALYSIS OF METEOROLOGICAL CONDITIONS DURING THE 1977 ANCLOTE KEYS PLUME STUDY

    EPA Science Inventory

    Meteorological conditions are described and analyzed for nine experimental observation periods of the Anclote Keys Plume Study, which was conducted near Tampa, Florida during February 1977. The primary objective of the Plume Study was to investigate both the short and long range ...

  14. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or infeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the

  15. Linear mixed-effects modeling approach to FMRI group analysis.

    PubMed

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or infeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. 
The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity
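The ICC mentioned in both abstracts can be made concrete on toy data with the classical one-way ANOVA estimator (a much simpler estimator than the authors' crossed-random-effects LME, shown here only to illustrate the quantity; all data are synthetic):

```python
import numpy as np

# Toy repeated-measures data: n subjects, k measurements per subject.
# True between-subject SD is 2.0 and within-subject SD is 1.0, so the
# population ICC is 4 / (4 + 1) = 0.8.
rng = np.random.default_rng(1)
n, k = 10, 4
subject_effect = rng.normal(0.0, 2.0, size=(n, 1))
data = subject_effect + rng.normal(0.0, 1.0, size=(n, k))

# One-way ANOVA mean squares.
grand = data.mean()
ms_between = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))

# ICC(1): share of total variance attributable to subjects.
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

An LME fit would instead estimate the two variance components directly and form the same ratio, which is what lets it also handle crossed random effects and confounding fixed effects.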

  16. Military Service Member and Veteran Reintegration: A Conceptual Analysis, Unified Definition, and Key Domains

    PubMed Central

    Elnitsky, Christine A.; Fisher, Michael P.; Blevins, Cara L.

    2017-01-01

    Returning military service members and veterans (MSMVs) may experience a variety of stress-related disorders and challenges when reintegrating from the military to the community. Facilitating the reintegration, transition, readjustment and coping, and community integration, of MSMVs is a societal priority. To date, research addressing MSMV reintegration has not identified a comprehensive definition of the term or defined the broader context within which the process of reintegration occurs although both are needed to promote valid and reliable measurement of reintegration and clarify related challenges, processes, and their impact on outcomes. Therefore, this principle-based concept analysis sought to review existing empirical reintegration measurement instruments and identify the problems and needs of MSMV reintegration to provide a unified definition of reintegration to guide future research, clinical practice, and related services. We identified 1,459 articles in the health and social sciences literature, published between 1990 and 2015, by searching multiple electronic databases. Screening of abstracts and full text review based on our inclusion/exclusion criteria, yielded 117 articles for review. Two investigators used constant conceptual comparison to evaluate relevant articles independently. We examined the term reintegration and related terms (i.e., transition, readjustment, community integration) identifying trends in their use over time, analyzed the eight reintegration survey instruments, and synthesized service member and veteran self-reported challenges and needs for reintegration. More reintegration research was published during the last 5 years (n = 373) than in the previous 10 years combined (n = 130). The research suggests coping with life stresses plays an integral role in military service member and veteran post-deployment reintegration. 
Key domains of reintegration include individual, interpersonal, community organizations, and societal factors

  17. Military Service Member and Veteran Reintegration: A Conceptual Analysis, Unified Definition, and Key Domains.

    PubMed

    Elnitsky, Christine A; Fisher, Michael P; Blevins, Cara L

    2017-01-01

Returning military service members and veterans (MSMVs) may experience a variety of stress-related disorders and challenges when reintegrating from the military to the community. Facilitating the reintegration, transition, readjustment and coping, and community integration, of MSMVs is a societal priority. To date, research addressing MSMV reintegration has not identified a comprehensive definition of the term or defined the broader context within which the process of reintegration occurs although both are needed to promote valid and reliable measurement of reintegration and clarify related challenges, processes, and their impact on outcomes. Therefore, this principle-based concept analysis sought to review existing empirical reintegration measurement instruments and identify the problems and needs of MSMV reintegration to provide a unified definition of reintegration to guide future research, clinical practice, and related services. We identified 1,459 articles in the health and social sciences literature, published between 1990 and 2015, by searching multiple electronic databases. Screening of abstracts and full text review based on our inclusion/exclusion criteria, yielded 117 articles for review. Two investigators used constant conceptual comparison to evaluate relevant articles independently. We examined the term reintegration and related terms (i.e., transition, readjustment, community integration) identifying trends in their use over time, analyzed the eight reintegration survey instruments, and synthesized service member and veteran self-reported challenges and needs for reintegration. More reintegration research was published during the last 5 years (n = 373) than in the previous 10 years combined (n = 130). The research suggests coping with life stresses plays an integral role in military service member and veteran post-deployment reintegration. 
Key domains of reintegration include individual, interpersonal, community organizations, and societal factors

  18. Nanochannel Device with Embedded Nanopore: a New Approach for Single-Molecule DNA Analysis and Manipulation

    NASA Astrophysics Data System (ADS)

    Zhang, Yuning; Reisner, Walter

    2012-02-01

Nanopore and nanochannel based devices are robust methods for biomolecular sensing and single DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with nanopore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a nanopore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We will discuss our recent progress on device fabrication and characterization. In particular, we demonstrate that we can detect, using fluorescence microscopy, successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. In particular, we show that in equilibrium DNA will not escape through an embedded sub-persistence length nanopore, suggesting that the embedded pore could be used as a nanoscale window through which to interrogate a nanochannel extended DNA molecule.

  19. Nanochannel Device with Embedded Nanopore: a New Approach for Single-Molecule DNA Analysis and Manipulation

    NASA Astrophysics Data System (ADS)

    Zhang, Yuning; Reisner, Walter

    2013-03-01

Nanopore and nanochannel based devices are robust methods for biomolecular sensing and single DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with embedded pore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a pore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We demonstrate that we can optically detect successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. In particular, we show that in equilibrium DNA will not escape through an embedded sub-persistence length nanopore, suggesting that the pore could be used as a nanoscale window through which to interrogate a nanochannel extended DNA molecule. Furthermore, electrical measurements through the nanopore are performed, indicating that DNA sensing is feasible using the nanochannel-nanopore device.

  20. A comparative analysis of numerical approaches to the mechanics of elastic sheets

    NASA Astrophysics Data System (ADS)

    Taylor, Michael; Davidovitch, Benny; Qiu, Zhanlong; Bertoldi, Katia

    2015-06-01

    Numerically simulating deformations in thin elastic sheets is a challenging problem in computational mechanics due to destabilizing compressive stresses that result in wrinkling. Determining the location, structure, and evolution of wrinkles in these problems has important implications in design and is an area of increasing interest in the fields of physics and engineering. In this work, several numerical approaches previously proposed to model equilibrium deformations in thin elastic sheets are compared. These include standard finite element-based static post-buckling approaches as well as a recently proposed method based on dynamic relaxation, which are applied to the problem of an annular sheet with opposed tractions where wrinkling is a key feature. Numerical solutions are compared to analytic predictions of the ground state, enabling a quantitative evaluation of the predictive power of the various methods. Results indicate that static finite element approaches produce local minima that are highly sensitive to initial imperfections, relying on a priori knowledge of the equilibrium wrinkling pattern to generate optimal results. In contrast, dynamic relaxation is much less sensitive to initial imperfections and can generate low-energy solutions for a wide variety of loading conditions without requiring knowledge of the equilibrium solution beforehand.
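Dynamic relaxation, the method the comparison favors, can be sketched on a far simpler system than an elastic sheet: a one-dimensional chain of masses and springs driven to static equilibrium by damped pseudo-dynamics (all parameters below are invented for illustration):

```python
import numpy as np

# Chain of 5 nodes joined by springs of rest length 0.25; the end nodes
# are pinned at x = 0 and x = 1, and the interior nodes start perturbed.
n, k_spring, damping, dt = 5, 10.0, 0.8, 0.02
x = np.linspace(0.0, 1.0, n) + np.array([0.0, 0.1, -0.1, 0.05, 0.0])
v = np.zeros(n)

for _ in range(5000):
    stretch = np.diff(x) - 0.25          # extension of each spring
    force = np.zeros(n)
    force[:-1] += k_spring * stretch     # pull from the spring on the right
    force[1:] -= k_spring * stretch      # pull from the spring on the left
    force[0] = force[-1] = 0.0           # pinned boundary nodes
    v = damping * (v + dt * force)       # damped pseudo-dynamic update
    x += dt * v

# The chain settles with the nodes evenly spaced between the fixed ends.
assert np.allclose(np.diff(x), 0.25, atol=1e-3)
```

The key idea carries over to sheets: instead of solving the stiff static equations directly, one integrates heavily damped fictitious dynamics until the kinetic energy dies out, which is why the method tolerates poor initial guesses and needs no a priori knowledge of the wrinkling pattern.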

  1. Optical key system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagans, K.G.; Clough, R.E.

    2000-04-25

An optical key system comprises a battery-operated optical key and an isolated lock that derives both its operating power and unlock signals from the correct optical key. A light emitting diode or laser diode is included within the optical key and is connected to transmit a bit-serial password. The key user physically enters either the code-to-transmit directly, or an index to a pseudorandom number code, in the key. Such personal identification numbers can be retained permanently or be ephemeral. When a send button is pressed, the key transmits a beam of light modulated with the password information. The modulated beam of light is received by a corresponding optical lock with a photovoltaic cell that produces enough power from the beam of light to operate password-screening digital logic. In one application, an acceptable password allows a two-watt laser diode to pump ignition and timing information over a fiberoptic cable into a sealed engine compartment. The receipt of a good password allows the fuel pump, spark, and starter systems to each operate. Therefore, bypassing the lock mechanism, as is now routine for automobile thieves, is pointless because the engine is so thoroughly disabled.

  2. African Primary Care Research: Qualitative data analysis and writing results

    PubMed Central

    Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437

  3. African Primary Care Research: qualitative data analysis and writing results.

    PubMed

    Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob

    2014-06-05

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.

  4. Characterization of the key aroma compounds in beef extract using aroma extract dilution analysis.

    PubMed

    Takakura, Yukiko; Sakamoto, Tomohiro; Hirai, Sachi; Masuzawa, Takuya; Wakabayashi, Hidehiko; Nishimura, Toshihide

    2014-05-01

Aroma extract dilution analysis (AEDA) of an ether extract prepared from beef extract (BE) and subsequent identification experiments led to the determination of seven aroma-active compounds in the flavor dilution (FD) factor range of 32-128. Omission experiments to select the most aroma-active compounds from the seven aroma compounds suggested that 2,3,5-trimethylpyrazine, 1-octen-3-ol, 3-methylbutanoic acid, and 4-hydroxy-2,5-dimethyl-3(2H)-furanone were the main active compounds contributing to the aroma of BE. Aroma recombination, addition, and omission experiments of the four aroma compounds in taste-reconstituted BE showed that each compound had an individual aroma profile. A comparison of the overall aroma between this recombination mixture and BE showed a high similarity, suggesting that the key aroma compounds had been identified successfully. Copyright © 2014 Elsevier Ltd. All rights reserved.
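The FD factor used in AEDA can be made concrete with a small helper (a hypothetical sketch: the extract is diluted 1:1 stepwise, and a compound's FD factor is 2 raised to the last dilution step at which its odour is still perceived at the sniffing port):

```python
def fd_factor(detected_at_steps):
    """detected_at_steps: booleans, one per 1:1 dilution step, where
    step 0 is the undiluted extract. Returns 2**(last detected step),
    or 0 if the compound is never perceived."""
    last = -1
    for step, detected in enumerate(detected_at_steps):
        if detected:
            last = step
    return 2 ** last if last >= 0 else 0

# A compound still perceived after seven 1:1 dilutions has FD = 128,
# the top of the 32-128 range reported in the abstract.
assert fd_factor([True] * 8) == 128
assert fd_factor([True, True, False]) == 2
```

Ranking compounds by FD factor is what singles out candidates such as 2,3,5-trimethylpyrazine for the subsequent omission and recombination experiments.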

  5. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include the inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  6. Mixed Methods Research: What Are the Key Issues to Consider?

    ERIC Educational Resources Information Center

    Ghosh, Rajashi

    2016-01-01

    Mixed methods research (MMR) is increasingly becoming a popular methodological approach in several fields due to the promise it holds for comprehensive understanding of complex problems being researched. However, researchers interested in MMR often lack reference to a guide that can explain the key issues pertaining to the paradigm wars…

  7. An analysis of clinical transition stresses experienced by dental students: A qualitative methods approach.

    PubMed

    Botelho, M; Gao, X; Bhuyan, S Y

    2018-04-17

    Stress in dental students is well established, with potential psychological distress, emotional exhaustion and burnout-related symptoms. Little attention has been given to the problems encountered by dental students during the transition from theoretical or paraclinical training to the clinical environment. The aim of this study was to adopt a qualitative research methods approach to understand the perceived stressors during students' clinical transition and provide insights for curriculum planners to enhance learning. This study analysed four groups of 2nd- and 3rd-year BDS students' experiences in focus group interviews relating to their pre-clinical and clinical transitions. The interviews were recorded and transcribed verbatim, and a thematic analysis was performed using an inductive qualitative approach. The key overlapping domains identified were the transition gap and stresses. The transition gap was subclassified into knowledge and skill (hard and soft), and stresses were subcategorised into internal and external stresses. On first coming to clinics, students experienced knowledge gaps relating to unfamiliar clinical treatments, with mismatches between knowledge acquisition and clinical exposure. Students felt incompetent owing to stresses attributable to curriculum design, staff and patients. This negatively affected their confidence and clinical performance. A range of challenges has been identified that will allow curriculum designers to plan a more supportive learning experience for students during their transition to clinical practice, giving them timely knowledge and confidence and better preparing them for entering clinics. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. The Need to Disentangle Key Concepts from Ecosystem-Approach Jargon

    PubMed Central

    WAYLEN, K A; HASTINGS, E J; BANKS, E A; HOLSTEAD, K L; IRVINE, R J; BLACKSTOCK, K L

    2014-01-01

    The ecosystem approach, as endorsed by the Convention on Biological Diversity (CBD) in 2000, is a strategy for holistic, sustainable, and equitable natural resource management, to be implemented via the 12 Malawi Principles. These principles describe the need to manage nature in terms of dynamic ecosystems, while fully engaging with local peoples. It is an ambitious concept. Today, the term is common throughout the research and policy literature on environmental management. However, multiple meanings have been attached to the term, resulting in confusion. We reviewed references to the ecosystem approach from 1957 to 2012 and identified 3 primary uses: as an alternative to ecosystem management or ecosystem-based management; in reference to an integrated and equitable approach to resource management as per the CBD; and as a term signifying a focus on understanding and valuing ecosystem services. Although uses of this term and its variants may overlap in meaning, typically, they do not entirely reflect the ethos of the ecosystem approach as defined by the CBD. For example, there is presently an increasing emphasis on ecosystem services, but focusing on these alone does not promote decentralization of management or use of all forms of knowledge, both of which are integral to the CBD's concept. We highlight that the Malawi Principles are at risk of being forgotten. To better understand these principles, more effort to implement them is required. Such efforts should be evaluated, ideally with comparative approaches, before allowing the CBD's concept of holistic and socially engaged management to be abandoned or superseded. It is possible that attempts to implement all 12 principles together will face many challenges, but they may also offer a unique way to promote holistic and equitable governance of natural resources. Therefore, we believe that the CBD's concept of the ecosystem approach demands more attention.

  9. Work Keys USA.

    ERIC Educational Resources Information Center

    Work Keys USA, 1998

    1998-01-01

    "Work Keys" is a comprehensive program for assessing and teaching workplace skills. This serial "special issue" features 18 first-hand reports on Work Keys projects in action in states across North America. They show how the Work Keys is helping businesses and educators solve the challenge of building a world-class work force.…

  10. Integrating care for older people with complex needs: key insights and lessons from a seven-country cross-case analysis.

    PubMed

    Wodchis, Walter P; Dixon, Anna; Anderson, Geoff M; Goodwin, Nick

    2015-01-01

    To address the challenges of caring for a growing number of older people with a mix of both health problems and functional impairment, programmes in different countries have different approaches to integrating health and social service supports. The goal of this analysis is to identify important lessons for policy makers and service providers to enable better design, implementation and spread of successful integrated care models. This paper provides a structured cross-case synthesis of seven integrated care programmes in Australia, Canada, the Netherlands, New Zealand, Sweden, the UK and the USA. All seven programmes involved bottom-up innovation driven by local needs and included: (1) a single point of entry, (2) holistic care assessments, (3) comprehensive care planning, (4) care co-ordination and (5) a well-connected provider network. The process of achieving successful integration involves collaboration and, although the specific types of collaboration varied considerably across the seven case studies, all involved a care coordinator or case manager. Most programmes were not systematically evaluated but the two with formal external evaluations showed benefit and have been expanded. Case managers or care coordinators who support patient-centred collaborative care are key to successful integration in all our cases as are policies that provide funds and support for local initiatives that allow for bottom-up innovation. However, more robust and systematic evaluation of these initiatives is needed to clarify the 'business case' for integrated health and social care and to ensure successful generalization of local successes.

  11. Recurrent seascape units identify key ecological processes along the western Antarctic Peninsula.

    PubMed

    Bowman, Jeff S; Kavanaugh, Maria T; Doney, Scott C; Ducklow, Hugh W

    2018-04-10

    The western Antarctic Peninsula (WAP) is a bellwether of global climate change and natural laboratory for identifying interactions between climate and ecosystems. The Palmer Long-Term Ecological Research (LTER) project has collected data on key ecological and environmental processes along the WAP since 1993. To better understand how key ecological parameters are changing across space and time, we developed a novel seascape classification approach based on in situ temperature, salinity, chlorophyll a, nitrate + nitrite, phosphate, and silicate. We anticipate that this approach will be broadly applicable to other geographical areas. Through the application of self-organizing maps (SOMs), we identified eight recurrent seascape units (SUs) in these data. These SUs have strong fidelity to known regional water masses but with an additional layer of biogeochemical detail, allowing us to identify multiple distinct nutrient profiles in several water masses. To identify the temporal and spatial distribution of these SUs, we mapped them across the Palmer LTER sampling grid via objective mapping of the original parameters. Analysis of the abundance and distribution of SUs since 1993 suggests two year types characterized by the partitioning of chlorophyll a into SUs with different spatial characteristics. By developing generalized linear models for correlated, time-lagged external drivers, we conclude that early spring sea ice conditions exert a strong influence on the distribution of chlorophyll a and nutrients along the WAP, but not necessarily the total chlorophyll a inventory. Because the distribution and density of phytoplankton biomass can have an impact on biomass transfer to the upper trophic levels, these results highlight anticipated links between the WAP marine ecosystem and climate. © 2018 John Wiley & Sons Ltd.
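    The seascape-unit classification above can be sketched with a minimal self-organizing map written from scratch. This is an illustrative toy, not the Palmer LTER pipeline: the grid size, decay schedules, and the two synthetic "water masses" standing in for the six standardized in situ parameters are all assumptions.

```python
import numpy as np

def train_som(data, grid=(2, 4), epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Train a tiny self-organizing map; returns unit weight vectors."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.normal(size=(rows * cols, data.shape[1]))
    # Unit coordinates on the 2-D grid, used by the neighbourhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - step / n_steps)              # decaying learning rate
            sigma = sigma0 * (1 - step / n_steps) + 0.3  # decaying neighbourhood width
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            grid_dist = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-grid_dist / (2 * sigma ** 2))    # Gaussian neighbourhood
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights

def assign_units(data, weights):
    """Map each sample to its best-matching unit (its 'seascape unit')."""
    d = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Synthetic stand-in for standardized parameters (temperature, salinity,
# chlorophyll a, nitrate+nitrite, phosphate, silicate): two separated clusters.
rng = np.random.default_rng(1)
water_mass_a = rng.normal(loc=-1.0, scale=0.3, size=(60, 6))
water_mass_b = rng.normal(loc=+1.0, scale=0.3, size=(60, 6))
samples = np.vstack([water_mass_a, water_mass_b])

weights = train_som(samples, grid=(2, 4))
units = assign_units(samples, weights)
dominant_a = np.bincount(units[:60], minlength=8).argmax()
dominant_b = np.bincount(units[60:], minlength=8).argmax()
```

    After training, the two synthetic water masses should map to different dominant units, mirroring how SUs recover known water masses in the study.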

  12. Analysis of the proteolysis of bioactive peptides using a peptidomics approach

    PubMed Central

    Kim, Yun-Gon; Lone, Anna Mari; Saghatelian, Alan

    2014-01-01

    Identifying the peptidases that inactivate bioactive peptides (e.g. peptide hormones and neuropeptides) in mammals is an important unmet challenge. This protocol describes a recent approach that combines liquid chromatography-mass spectrometry peptidomics to identify endogenous cleavage sites of a bioactive peptide, the subsequent biochemical purification of a candidate peptidase based on these cleavage sites, and validation of the candidate peptidase’s role in the physiological regulation of the bioactive peptide by examining a peptidase knockout mouse. We highlight successful application of this protocol to discover that insulin-degrading enzyme (IDE) regulates physiological calcitonin gene-related peptide (CGRP) levels and detail the key stages and steps in this approach. This protocol requires 7 days of work; however, the total time for this protocol is highly variable because of its dependence on the availability of biological reagents, namely purified enzymes and knockout mice. The protocol is valuable because it expedites the characterization of mammalian peptidases, such as IDE, which in certain instances can be used to develop novel therapeutics. PMID:23949379

  13. Integrating diverse databases into an unified analysis framework: a Galaxy approach

    PubMed Central

    Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton

    2011-01-01

    Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past a relatively small number of central repositories served genomic data, an increasing number of distinct specialized data repositories and resources have now been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983

  14. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach

    PubMed Central

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

    Transcription factors play a key role in the transcriptional regulation of genes and the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data from a common cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks built from diverse ENCODE datasets of multiple cell lines to predict a global and precise TF interaction network. This network yields 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell type TF regulatory networks and predict seven cell lineage TF interaction networks. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological functions of hub TFs, taking those of the cancer and blood lineages as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
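    BCPF itself is Bayesian, but the CP factorization underlying it can be illustrated with a plain alternating-least-squares sketch on a toy 3-way tensor. The sizes, rank, and iteration count below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def khatri_rao(a, b):
    """Column-wise Khatri-Rao product of (I, R) and (J, R) -> (I*J, R)."""
    r = a.shape[1]
    return np.einsum('ir,jr->ijr', a, b).reshape(-1, r)

def cp_als(T, rank, n_iter=500, seed=0):
    """Rank-R CP decomposition of a 3-way tensor via alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.normal(size=(I, rank))
    B = rng.normal(size=(J, rank))
    C = rng.normal(size=(K, rank))
    T0 = T.reshape(I, -1)                        # mode-0 unfolding: I x (J*K)
    T1 = T.transpose(1, 0, 2).reshape(J, -1)     # mode-1 unfolding: J x (I*K)
    T2 = T.transpose(2, 0, 1).reshape(K, -1)     # mode-2 unfolding: K x (I*J)
    for _ in range(n_iter):
        A = T0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the tensor from its factor matrices."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Build an exactly rank-2 tensor from known factors, then recover it.
rng = np.random.default_rng(1)
A0 = rng.normal(size=(5, 2))
B0 = rng.normal(size=(6, 2))
C0 = rng.normal(size=(7, 2))
T = reconstruct(A0, B0, C0)
A, B, C = cp_als(T, rank=2)
err = np.linalg.norm(T - reconstruct(A, B, C)) / np.linalg.norm(T)
```

    On an exactly low-rank tensor the relative reconstruction error should become small, which is the basic behaviour that the Bayesian machinery of BCPF regularizes and extends to noisy, incomplete network data.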

  15. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach.

    PubMed

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

    Transcription factors play a key role in the transcriptional regulation of genes and the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data from a common cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks built from diverse ENCODE datasets of multiple cell lines to predict a global and precise TF interaction network. This network yields 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell type TF regulatory networks and predict seven cell lineage TF interaction networks. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological functions of hub TFs, taking those of the cancer and blood lineages as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions.

  16. Public key infrastructure for DOE security research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R.; Foster, I.; Johnston, W.E.

    This document summarizes the Department of Energy's Second Joint Energy Research/Defense Programs Security Research Workshop. The workshop built on the results of the first Joint Workshop, which reviewed security requirements represented in a range of mission-critical ER and DP applications, discussed commonalities and differences in ER/DP requirements and approaches, and identified an integrated common set of security research priorities. One significant conclusion of the first workshop was that progress in a broad spectrum of DOE-relevant security problems and applications could best be addressed through public-key cryptography based systems, and therefore depended upon the existence of a robust, broadly deployed public-key infrastructure. Hence, public-key infrastructure ("PKI") was adopted as a primary focus for the second workshop. The Second Joint Workshop covered a range of DOE security research and deployment efforts, as well as summaries of the state of the art in various areas relating to public-key technologies. Key findings were that a broad range of DOE applications can benefit from security architectures and technologies built on a robust, flexible, widely deployed public-key infrastructure; that there exists a collection of specific requirements for missing or undeveloped PKI functionality, together with a preliminary assessment of how these requirements can be met; that, while commercial developments can be expected to provide many relevant security technologies, there are important capabilities that commercial developments will not address, due to the unique scale, performance, diversity, distributed nature, and sensitivity of DOE applications; and that DOE should encourage and support research activities intended to increase understanding of security technology requirements and to develop critical components not forthcoming from other sources in a timely manner.

  17. Global analysis of plasticity in turgor loss point, a key drought tolerance trait.

    PubMed

    Bartlett, Megan K; Zhang, Ya; Kreidler, Nissa; Sun, Shanwen; Ardy, Rico; Cao, Kunfang; Sack, Lawren

    2014-12-01

    Many species face increasing drought under climate change. Plasticity has been predicted to strongly influence species' drought responses, but broad patterns in plasticity have not been examined for key drought tolerance traits, including turgor loss or 'wilting' point (πtlp). As soil dries, plants shift πtlp by accumulating solutes (i.e. 'osmotic adjustment'). We conducted the first global analysis of plasticity in Δπtlp and related traits for 283 wild and crop species in ecosystems worldwide. Δπtlp was widely prevalent but moderate (-0.44 MPa), accounting for 16% of post-drought πtlp. Thus, pre-drought πtlp was a considerably stronger predictor of post-drought πtlp across species of wild plants. For cultivars of certain crops Δπtlp accounted for major differences in post-drought πtlp. Climate was correlated with pre- and post-drought πtlp, but not Δπtlp. Thus, despite the wide prevalence of plasticity, πtlp measured in one season can reliably characterise most species' constitutive drought tolerances and distributions relative to water supply. © 2014 John Wiley & Sons Ltd/CNRS.

  18. Network theory inspired analysis of time-resolved expression data reveals key players guiding P. patens stem cell development.

    PubMed

    Busch, Hauke; Boerries, Melanie; Bao, Jie; Hanke, Sebastian T; Hiss, Manuel; Tiko, Theodhor; Rensing, Stefan A

    2013-01-01

    Transcription factors (TFs) often trigger developmental decisions, yet their transcripts are often only moderately regulated and thus not easily detected by conventional statistics on expression data. Here we present a method that identifies such genes based on trajectory analysis of time-resolved transcriptome data. As a proof of principle, we have analysed apical stem cells of filamentous moss (P. patens) protonemata that develop from leaflets upon their detachment from the plant. By our novel correlation analysis of the post-detachment transcriptome kinetics we predict five out of 1,058 TFs to be involved in the signaling leading to the establishment of pluripotency. Among the predicted regulators is the basic helix-loop-helix TF PpRSL1, which we show to be involved in the establishment of apical stem cells in P. patens. Our methodology is expected to aid analysis of key players of developmental decisions in complex plant and animal systems.
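    One ingredient of such a trajectory analysis, correlating each gene's time course with an idealized switch-like template, can be sketched as follows. The template shape, the simulated time courses, and the noise levels are all illustrative assumptions, not the paper's full method.

```python
import numpy as np

def correlate_with_template(expr, template):
    """Pearson correlation of each gene's time course with a template profile.

    expr: (n_genes, n_timepoints) array; returns one correlation per gene."""
    ez = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
    tz = (template - template.mean()) / template.std()
    return (ez * tz).mean(axis=1)   # mean of z-score products = Pearson r

# Simulated courses over 8 time points after detachment: gene 0 switches on,
# gene 1 switches off, gene 2 fluctuates around a constant level.
template = np.array([0, 0, 0, 1, 1, 1, 1, 1], dtype=float)  # idealized induction
rng = np.random.default_rng(0)
expr = np.vstack([
    template * 2 + rng.normal(0, 0.1, 8),        # induced, candidate regulator
    (1 - template) * 2 + rng.normal(0, 0.1, 8),  # repressed
    rng.normal(1, 0.1, 8),                       # unregulated background gene
])
r = correlate_with_template(expr, template)
```

    Genes whose moderate expression changes nonetheless track the template closely rank highest, which is how weakly regulated TFs can surface despite failing conventional differential-expression statistics.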

  19. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.

    PubMed

    Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned a synonym and stored with the help of more than one native Malay speaker, and each polarity is manually allotted a score. In addition, supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate classification accuracy. In the experimental results, a wide range of comparative experiments conducted on a Malay Reviews Corpus (MRC) demonstrates that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. However, the results depend on three factors: the features, the number of features and the classification approach.
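    A miniature of the lexicon side of such a model might look as follows. The Malay entries, their scores, and the negation rule are invented for illustration; they are not drawn from the paper's 2,478-entry lexicon, and the real model additionally combines machine-learned classifiers.

```python
# A toy semantic-orientation scorer in the spirit of a senti-lexicon approach.
# Lexicon entries, scores, and the negation rule are illustrative only.
SENTI_LEXICON = {
    "bagus": 2.0,      # "good"
    "cantik": 1.5,     # "beautiful"
    "teruk": -2.0,     # "terrible"
    "lambat": -1.0,    # "slow"
}
NEGATORS = {"tidak", "bukan"}  # "not"

def score_review(tokens):
    """Sum lexicon polarities; a preceding negator flips the next hit's sign."""
    total, negate = 0.0, False
    for tok in tokens:
        if tok in NEGATORS:
            negate = True
            continue
        polarity = SENTI_LEXICON.get(tok, 0.0)
        if polarity:
            total += -polarity if negate else polarity
        negate = False
    return total

def classify(tokens):
    """Map the summed score to a sentiment class."""
    s = score_review(tokens)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"
```

    For example, `classify("perkhidmatan tidak bagus".split())` scores "bagus" as +2.0 but flips it because of the preceding "tidak", giving a negative classification.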

  20. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm

    PubMed Central

    Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned a synonym and stored with the help of more than one native Malay speaker, and each polarity is manually allotted a score. In addition, supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate classification accuracy. In the experimental results, a wide range of comparative experiments conducted on a Malay Reviews Corpus (MRC) demonstrates that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. However, the results depend on three factors: the features, the number of features and the classification approach. PMID:29684036

  1. Better Crunching: Recommendations for Multivariate Data Analysis Approaches for Program Impact Evaluations

    ERIC Educational Resources Information Center

    Braverman, Marc T.

    2016-01-01

    Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…

  2. Integrated care: wellness-oriented peer approaches: a key ingredient for integrated care.

    PubMed

    Swarbrick, Margaret A

    2013-08-01

    People with lived experience of mental illness have become leaders of an influential movement to help the mental health system embrace the notion of whole health and wellness in the areas of advocacy, policy, and care delivery. Wellness-oriented peer approaches delivered by peer-support whole-health specialists and wellness coaches can play an important role in integrated care models. This column examines the wellness definitions and peer models and some specific benefits and tensions between the peer-oriented wellness approach and the medical model. These models can work in unison to improve health and wellness among people with mental and substance use disorders.

  3. Partially Key Distribution with Public Key Cryptosystem Based on Error Control Codes

    NASA Astrophysics Data System (ADS)

    Tavallaei, Saeed Ebadi; Falahati, Abolfazl

    Due to the low level of security in public key cryptosystems based on number theory, and fundamental difficulties such as "key escrow" in Public Key Infrastructure (PKI) and the need for a secure channel in ID-based cryptography, a new key distribution cryptosystem based on Error Control Codes (ECC) is proposed. The idea is realised through a modification of the McEliece cryptosystem. The security of the ECC cryptosystem derives from the NP-completeness of general block-code decoding. Using ECC also provides the capability of generating public keys with variable lengths, suitable for different applications. As the security of cryptosystems based on number theory decreases and their key lengths grow, the use of such code-based cryptosystems seems likely to become unavoidable in future.
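    The McEliece idea referenced above, encoding with a linear code and adding an error that only a decoder for that code can remove, can be shown in a heavily simplified sketch using the Hamming(7,4) code. Real McEliece additionally hides the code behind scrambling and permutation matrices (omitted here), uses much larger Goppa codes, and corrects many errors, so this toy has no security whatsoever.

```python
import numpy as np

# Systematic generator and parity-check matrices of the Hamming(7,4) code (GF(2)).
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encrypt(msg, rng):
    """Codeword plus a random single-bit error (the 'ciphertext')."""
    codeword = msg @ G % 2
    e = np.zeros(7, dtype=int)
    e[rng.integers(7)] = 1
    return (codeword + e) % 2

def decrypt(cipher):
    """Correct the single error via syndrome decoding, then read the message."""
    syndrome = H @ cipher % 2
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        pos = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
        cipher = cipher.copy()
        cipher[pos] ^= 1
    return cipher[:4]   # systematic code: the first 4 bits are the message
```

    The legitimate receiver decodes easily because the code's structure is known; the NP-completeness result cited in the abstract concerns decoding a *general* linear code, which is what an attacker faces when the structure is hidden.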

  4. A Key Gene, PLIN1, Can Affect Porcine Intramuscular Fat Content Based on Transcriptome Analysis

    PubMed Central

    Li, Bojiang; Weng, Qiannan; Dong, Chao; Zhang, Zengkai; Li, Rongyang; Liu, Jingge; Jiang, Aiwen; Li, Qifa; Jia, Chao; Wu, Wangjun; Liu, Honglin

    2018-01-01

    Intramuscular fat (IMF) content is an important indicator for meat quality evaluation. However, the key genes and molecular regulatory mechanisms affecting IMF deposition remain unclear. In the present study, we used transcriptome analysis to identify 75 differentially expressed genes (DEGs) between pigs with higher (H) and lower (L) IMF content, of which 27 were upregulated and 48 were downregulated. Notably, Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analysis indicated that the DEG perilipin-1 (PLIN1) was significantly enriched in the fat metabolism-related peroxisome proliferator-activated receptor (PPAR) signaling pathway. Furthermore, we determined the expression patterns and functional role of porcine PLIN1. Our results indicate that PLIN1 was highly expressed in porcine adipose tissue, that its expression level was significantly higher in the H IMF content group than in the L IMF content group, and that its expression increased during adipocyte differentiation. Additionally, our results confirm that PLIN1 knockdown decreases the triglyceride (TG) level and lipid droplet (LD) size in porcine adipocytes. Overall, our data identify novel candidate genes affecting IMF content and provide new insight into PLIN1 in porcine IMF deposition and adipocyte differentiation. PMID:29617344
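    The H-versus-L comparison behind DEG calling can be illustrated with a minimal sketch (not the authors' pipeline): flag genes by log2 fold change and Welch's t statistic on simulated log2 expression values. The cutoffs, replicate counts, and simulated data are illustrative assumptions.

```python
import numpy as np

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    nx, ny = len(x), len(y)
    vx, vy = x.var(ddof=1), y.var(ddof=1)
    return (x.mean() - y.mean()) / np.sqrt(vx / nx + vy / ny)

def call_degs(expr_h, expr_l, fc_cut=1.0, t_cut=3.0):
    """Flag genes up/down in H vs L by |log2 fold change| and Welch's t.

    expr_h, expr_l: (n_genes, n_replicates) arrays of log2 expression.
    Cutoffs are illustrative, not the paper's thresholds."""
    up, down = [], []
    for g in range(expr_h.shape[0]):
        lfc = expr_h[g].mean() - expr_l[g].mean()   # difference of log2 means
        t = welch_t(expr_h[g], expr_l[g])
        if lfc >= fc_cut and t >= t_cut:
            up.append(g)
        elif lfc <= -fc_cut and t <= -t_cut:
            down.append(g)
    return up, down

# Simulated log2 expression, 6 replicates per group:
# gene 0 is up in H (PLIN1-like), gene 1 is down in H, gene 2 is unchanged.
rng = np.random.default_rng(0)
expr_l = rng.normal(8.0, 0.2, size=(3, 6))
expr_h = rng.normal(8.0, 0.2, size=(3, 6))
expr_h[0] += 2.0
expr_h[1] -= 2.0
up, down = call_degs(expr_h, expr_l)
```

    A real RNA-seq analysis would instead use count-based models with multiple-testing correction (e.g. DESeq2 or edgeR); this sketch only shows the fold-change/statistic logic behind the 27-up/48-down style of result.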

  5. A Key Gene, PLIN1, Can Affect Porcine Intramuscular Fat Content Based on Transcriptome Analysis.

    PubMed

    Li, Bojiang; Weng, Qiannan; Dong, Chao; Zhang, Zengkai; Li, Rongyang; Liu, Jingge; Jiang, Aiwen; Li, Qifa; Jia, Chao; Wu, Wangjun; Liu, Honglin

    2018-04-04

    Intramuscular fat (IMF) content is an important indicator for meat quality evaluation. However, the key genes and molecular regulatory mechanisms affecting IMF deposition remain unclear. In the present study, we used transcriptome analysis to identify 75 differentially expressed genes (DEGs) between pigs with higher (H) and lower (L) IMF content, of which 27 were upregulated and 48 were downregulated. Notably, Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analysis indicated that the DEG perilipin-1 (PLIN1) was significantly enriched in the fat metabolism-related peroxisome proliferator-activated receptor (PPAR) signaling pathway. Furthermore, we determined the expression patterns and functional role of porcine PLIN1. Our results indicate that PLIN1 was highly expressed in porcine adipose tissue, that its expression level was significantly higher in the H IMF content group than in the L IMF content group, and that its expression increased during adipocyte differentiation. Additionally, our results confirm that PLIN1 knockdown decreases the triglyceride (TG) level and lipid droplet (LD) size in porcine adipocytes. Overall, our data identify novel candidate genes affecting IMF content and provide new insight into PLIN1 in porcine IMF deposition and adipocyte differentiation.

  6. Practical decoy state for quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma Xiongfeng; Qi Bing; Zhao Yi

    2005-07-15

    Decoy states have recently been proposed as a useful method for substantially improving the performance of quantum key distribution (QKD). Here, we present a general theory of the decoy state protocol based on only two decoy states and one signal state. We perform optimization on the choice of intensities of the two decoy states and the signal state. Our result shows that a decoy state protocol with only two types of decoy states - the vacuum and a weak decoy state - asymptotically approaches the theoretical limit of the most general type of decoy state protocol (with an infinite number of decoy states). We also present a one-decoy-state protocol. Moreover, we provide estimations on the effects of statistical fluctuations and suggest that, even for long-distance (larger than 100 km) QKD, our two-decoy-state protocol can be implemented with only a few hours of experimental data. In conclusion, decoy state quantum key distribution is highly practical.
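    The vacuum-plus-weak-decoy estimation in this line of work can be sketched numerically. The code below uses what I understand to be the standard two-decoy lower bound on the single-photon yield Y1 together with a conventional channel model; the intensities, loss, and background yield are illustrative assumptions, and a real implementation would also bound the single-photon error rate and account for finite statistics.

```python
import math

def gain(mu, eta, y0):
    """Overall gain Q_mu of a Poissonian source of mean photon number mu,
    assuming the standard yield model Y_n = 1 - (1 - y0) * (1 - eta)**n
    for channel transmittance eta and background yield y0."""
    return 1 - (1 - y0) * math.exp(-eta * mu)

def y1_lower_bound(mu, nu, q_mu, q_nu, y0):
    """Vacuum + weak-decoy lower bound on the single-photon yield Y1,
    as given (to my understanding) in this decoy-state analysis."""
    return (mu / (mu * nu - nu ** 2)) * (
        q_nu * math.exp(nu)
        - q_mu * math.exp(mu) * nu ** 2 / mu ** 2
        - (mu ** 2 - nu ** 2) / mu ** 2 * y0
    )

# Illustrative parameters: signal mu = 0.5, weak decoy nu = 0.1,
# 20 dB channel loss (eta = 0.01), background yield y0 = 1e-5.
mu, nu, eta, y0 = 0.5, 0.1, 0.01, 1e-5
q_mu, q_nu = gain(mu, eta, y0), gain(nu, eta, y0)
y1_true = 1 - (1 - y0) * (1 - eta)   # actual single-photon yield in this model
y1_bound = y1_lower_bound(mu, nu, q_mu, q_nu, y0)
```

    With these numbers the bound sits just below the model's true Y1, illustrating why two intensities already recover most of the key-rate benefit of infinitely many decoy states.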

  7. Key principles of community-based natural resource management: a synthesis and interpretation of identified effective approaches for managing the commons.

    PubMed

    Gruber, James S

    2010-01-01

    This article examines recent research on approaches to community-based environmental and natural resource management and reviews the commonalities and differences between these interdisciplinary and multistakeholder initiatives. To identify the most effective characteristics of Community-based natural resource management (CBNRM), I collected a multiplicity of perspectives from research teams and then grouped findings into a matrix of organizational principles and key characteristics. The matrix was initially vetted (or "field tested") by applying numerous case studies that were previously submitted to the World Bank International Workshop on CBNRM. These practitioner case studies were then compared and contrasted with the findings of the research teams. It is hoped that the developed matrix may be useful to researchers in further focusing research, understanding core characteristics of effective and sustainable CBNRM, providing practitioners with a framework for developing new CBNRM initiatives for managing the commons, and providing a potential resource for academic institutions during their evaluation of their practitioner-focused environmental management and leadership curriculum.

  8. Keys to Scholarship

    ERIC Educational Resources Information Center

    Hebert, Terri

    2011-01-01

    Up ahead, a foreboding wooden door showing wear from passage of earlier travelers is spotted. As the old porch light emits a pale yellow glow, a key ring emerges from deep inside the coat pocket. Searching for just the right key, the voyager settles on one that also shows age. As the key enters its receptacle and begins to turn, a clicking noise…

  9. Myopic Loss Aversion: Demystifying the Key Factors Influencing Decision Problem Framing

    ERIC Educational Resources Information Center

    Hardin, Andrew M.; Looney, Clayton Arlen

    2012-01-01

    Advancement of myopic loss aversion theory has been hamstrung by conflicting results, methodological inconsistencies, and a piecemeal approach toward understanding the key factors influencing decision problem framing. A series of controlled experiments provides a more holistic view of the variables promoting myopia. Extending the information…

  10. [Approaches to medical training among physicians who teach; analysis of two different educational strategies].

    PubMed

    Loría-Castellanos, Jorge; Rivera-Ibarra, Doris Beatriz; Márquez-Avila, Guadalupe

    2009-01-01

    To compare the outreach of a promotional educational strategy that focuses on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee, and we compared the outreach of the two approaches. We administered a validated instrument of 72 items that analyzes statements used to measure educational tasks, in the form of duplets, through 3 indicators. A group of seven physicians actively participating in teaching activities was stratified according to teaching approach: one approach was traditional and the other included a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences among the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of the subjects who had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in the 3 main indicators compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more worthwhile approach than traditional teaching methods.

  11. An object-based approach to weather analysis and its applications

    NASA Astrophysics Data System (ADS)

    Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew

    2013-04-01

    The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying convective events and following their development over the course of their lifetime. Prerequisites of the object-based analysis are a highly resolved observational database and a tracking algorithm. A near real-time radar- and satellite-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for validating meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. So-called proxies of the precipitation process are, e.g., the temporal change of the bright band, vertically extensive columns of enhanced differential reflectivity (ZDR), and the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. Analyses of the informative content of ZDR columns as precursors of storm evolution, for example, will be presented to demonstrate…

  12. Authenticated multi-user quantum key distribution with single particles

    NASA Astrophysics Data System (ADS)

    Lin, Song; Wang, Hui; Guo, Gong-De; Ye, Guo-Hua; Du, Hong-Zhen; Liu, Xiao-Fen

    2016-03-01

    Quantum key distribution (QKD) has grown rapidly in recent years and has become one of the hottest topics in quantum information science. In the implementation of QKD on a network, identity authentication has been a main problem. In this paper, an efficient authenticated multi-user quantum key distribution (MQKD) protocol with single particles is proposed. In this protocol, any two users on a quantum network can perform mutual authentication and share a secure session key with the assistance of a semi-honest center. Meanwhile, the particles used as quantum information carriers are not required to be stored; therefore, the proposed protocol is feasible with current technology. Finally, security analysis shows that this protocol is secure in theory.

  13. Environment and Development--Key Concepts for a New Approach to Education.

    ERIC Educational Resources Information Center

    Sachs, Ignacy

    1978-01-01

    Development and environment are two interconnected concepts and solutions to the problems of both can be found through "ecodevelopment." Inherent in this planning approach is an effort to meet the basic needs of a community while establishing true symbiosis between man and the planet. Educational effort at every level is necessary to make this…

  14. Holistic Understanding in Geography Education (HUGE)--An Alternative Approach to Curriculum Development and Learning at Key Stage 3

    ERIC Educational Resources Information Center

    Renshaw, Simon; Wood, Phil

    2011-01-01

    This article reports the results of a small-scale curriculum development project focusing on two of the seven "key concepts" identified in the revised Key Stage 3 (KS3) National Curriculum programme of study for geography, introduced into schools in 2007. The study used "interdependence" and "physical processes" as…

  15. Coherent attacking continuous-variable quantum key distribution with entanglement in the middle

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaoyuan; Shi, Ronghua; Zeng, Guihua; Guo, Ying

    2018-06-01

    We suggest an approach to coherent attacks on continuous-variable quantum key distribution (CVQKD) with an untrusted entangled source in the middle. The coherent attack strategy can be performed on both links of the quantum system, enabling the eavesdropper to steal more information from the proposed scheme by exploiting the entanglement correlation. Numerical simulation results show the improved performance of the attacked CVQKD system in terms of the derived secret key rate, with the controllable parameters maximizing the stolen information.

  16. Chemical Discrimination of Cortex Phellodendri amurensis and Cortex Phellodendri chinensis by Multivariate Analysis Approach.

    PubMed

    Sun, Hui; Wang, Huiyu; Zhang, Aihua; Yan, Guangli; Han, Ying; Li, Yuan; Wu, Xiuhong; Meng, Xiangcai; Wang, Xijun

    2016-01-01

    As herbal medicines have an important position in health care systems worldwide, their assessment and quality control are a major bottleneck. Cortex Phellodendri chinensis (CPC) and Cortex Phellodendri amurensis (CPA) are widely used in China; however, how to distinguish CPA from CPC has become an urgent question. In this study, a multivariate analysis approach was applied to the chemical discrimination of CPA and CPC. Principal component analysis showed that the two herbs could be separated clearly. Chemical markers such as berberine, palmatine, phellodendrine, magnoflorine, obacunone, and obaculactone were identified through orthogonal partial least squares discriminant analysis and were tentatively assigned by the accurate masses from quadrupole time-of-flight mass spectrometry. A total of 29 components can be used as chemical markers for the discrimination of CPA and CPC. Of them, phellodendrine is significantly higher in CPC than in CPA, whereas obacunone and obaculactone are significantly higher in CPA than in CPC. The present study shows that chemical analysis based on a multivariate analysis approach contributes greatly to the investigation of CPA and CPC, that the identified chemical markers should be used as a whole to discriminate the two herbal medicines, and that the results also provide chemical information for their quality assessment. Abbreviations used: CPC: Cortex Phellodendri chinensis, CPA: Cortex Phellodendri amurensis, PCA: Principal component analysis, OPLS-DA: Orthogonal partial least squares discriminant analysis, BPI: Base peak ion intensity.
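    The PCA step above can be sketched on synthetic data. The feature values below are simulated placeholders, not actual CPA/CPC peak intensities, and PCA is computed directly via SVD rather than the authors' software:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated peak-intensity matrices for two herb groups
# (placeholder data, not real CPA/CPC measurements)
cpa = rng.normal(loc=0.0, scale=1.0, size=(20, 10))
cpc = rng.normal(loc=5.0, scale=1.0, size=(20, 10))
X = np.vstack([cpa, cpc])

# PCA via SVD on the mean-centred matrix
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T           # project onto the first two components

# Well-separated groups split cleanly along PC1
gap = abs(scores[:20, 0].mean() - scores[20:, 0].mean())
print(round(gap, 2))             # large gap -> clear class separation
```

    In a real workflow the rows would be UPLC-Q/TOF-MS intensity profiles per sample, and loadings on the discriminating components would point at candidate chemical markers.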

  17. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of

  18. Characterizing the structure and content of nurse handoffs: A Sequential Conversational Analysis approach.

    PubMed

    Abraham, Joanna; Kannampallil, Thomas; Brenner, Corinne; Lopez, Karen D; Almoosa, Khalid F; Patel, Bela; Patel, Vimla L

    2016-02-01

    Effective communication during nurse handoffs is instrumental in ensuring safe and quality patient care. Much of the prior research on nurse handoffs has utilized retrospective methods such as interviews, surveys and questionnaires. While extremely useful, an in-depth understanding of the structure and content of conversations, and the inherent relationships within the content is paramount to designing effective nurse handoff interventions. In this paper, we present a methodological framework-Sequential Conversational Analysis (SCA)-a mixed-method approach that integrates qualitative conversational analysis with quantitative sequential pattern analysis. We describe the SCA approach and provide a detailed example as a proof of concept of its use for the analysis of nurse handoff communication in a medical intensive care unit. This novel approach allows us to characterize the conversational structure, clinical content, disruptions in the conversation, and the inherently phasic nature of nurse handoff communication. The characterization of communication patterns highlights the relationships underlying the verbal content of nurse handoffs with specific emphasis on: the interactive nature of conversation, relevance of role-based (incoming, outgoing) communication requirements, clinical content focus on critical patient-related events, and discussion of pending patient management tasks. We also discuss the applicability of the SCA approach as a method for providing in-depth understanding of the dynamics of communication in other settings and domains. Copyright © 2015 Elsevier Inc. All rights reserved.
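    The sequential half of the SCA method, finding recurring orderings of coded conversational moves, can be illustrated with a simple bigram count over a coded handoff transcript. The code labels below are invented for illustration and are not the authors' actual coding scheme:

```python
from collections import Counter

# Hypothetical sequence of coded handoff moves (invented labels)
turns = ["patient_id", "history", "events", "question", "events",
         "question", "events", "pending_tasks", "question", "pending_tasks"]

# Count adjacent pairs: which move tends to follow which
bigrams = Counter(zip(turns, turns[1:]))
for pair, n in bigrams.most_common(3):
    print(pair, n)
```

    Frequent pairs such as a question followed by an events update would surface the interactive, phasic structure that the qualitative conversational analysis then interprets.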

  19. Identification of key genes associated with the effect of estrogen on ovarian cancer using microarray analysis.

    PubMed

    Zhang, Shi-tao; Zuo, Chao; Li, Wan-nan; Fu, Xue-qi; Xing, Shu; Zhang, Xiao-ping

    2016-02-01

    To identify key genes related to the effect of estrogen on ovarian cancer, microarray data (GSE22600) were downloaded from Gene Expression Omnibus. Eight estrogen and seven placebo treatment samples were obtained using a 2 × 2 factorial design, which contained 2 cell lines (PEO4 and 2008) and 2 treatments (estrogen and placebo). Differentially expressed genes (DEGs) were identified by Bayesian methods, with P < 0.05 and |log2FC (fold change)| ≥ 0.5 as the cut-off criteria. Differentially co-expressed genes (DCGs) and differentially regulated genes (DRGs) were identified by the DCe function and the DRsort function, respectively, in the DCGL package. Topological structure analysis was performed on the important transcription factors (TFs) and genes in the transcriptional regulatory network using tYNA. Functional enrichment analysis was performed for the DEGs and for the important genes using the Gene Ontology and KEGG databases. In total, 465 DEGs were identified. Functional enrichment analysis of the DEGs indicated that ACVR2B, LTBP1, BMP7 and MYC are involved in the TGF-beta signaling pathway. In addition, 2285 DCG pairs and 357 DRGs were identified. Topological structure analysis identified 52 important TFs and 65 important genes. Functional enrichment analysis of the important genes showed that TP53 and MLH1 participate in the DNA damage response and that ACVR2B, LTBP1, BMP7 and MYC are involved in the TGF-beta signaling pathway. TP53, MLH1, ACVR2B, LTBP1 and BMP7 might participate in the pathogenesis of ovarian cancer.
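    The cut-off step above (P < 0.05 and |log2FC| ≥ 0.5) amounts to a simple filter over per-gene test results. The statistics below are made-up placeholders attached to gene names from the abstract, not values from the study:

```python
# Hypothetical per-gene differential-expression results (placeholder values)
results = [
    {"gene": "ACVR2B", "p": 0.003, "log2fc": 1.2},
    {"gene": "LTBP1",  "p": 0.010, "log2fc": -0.8},
    {"gene": "GENE_X", "p": 0.200, "log2fc": 2.0},   # fails the P cut-off
    {"gene": "GENE_Y", "p": 0.001, "log2fc": 0.1},   # fails the fold-change cut-off
]

# Keep only genes passing both thresholds used in the study
degs = [r["gene"] for r in results
        if r["p"] < 0.05 and abs(r["log2fc"]) >= 0.5]
print(degs)   # ['ACVR2B', 'LTBP1']
```

    In practice P-values would come from the Bayesian moderated test and be corrected for multiple testing before filtering.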

  20. Identification of key microRNAs and genes in preeclampsia by bioinformatics analysis

    PubMed Central

    Luo, Shouling; Cao, Nannan; Tang, Yao; Gu, Weirong

    2017-01-01

    Preeclampsia is a leading cause of perinatal maternal–foetal mortality and morbidity. The aim of this study was to identify the key microRNAs and genes in preeclampsia and uncover their potential functions. We downloaded the miRNA expression profile GSE84260 and the gene expression profile GSE73374 from the Gene Expression Omnibus database. Differentially expressed miRNAs and genes were identified and compared with miRNA-target information from MiRWalk 2.0; a total of 65 differentially expressed miRNAs (DEMIs), including 32 up-regulated and 33 down-regulated miRNAs, and 91 differentially expressed genes (DEGs), including 83 up-regulated and 8 down-regulated genes, were identified. Pathway enrichment analyses of the DEMIs showed that the up-regulated DEMIs were enriched in the Hippo signalling pathway and the MAPK signalling pathway, and the down-regulated DEMIs were enriched in HTLV-I infection and miRNAs in cancers. Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway enrichment analyses of the DEGs were performed using the Multifaceted Analysis Tool for Human Transcriptome. The up-regulated DEGs were enriched in biological processes (BPs) including the response to cAMP, the response to hydrogen peroxide and cell-cell adhesion mediated by integrin; no enrichment of down-regulated DEGs was identified. KEGG analysis showed that the up-regulated DEGs were enriched in the Hippo signalling pathway and in pathways in cancer. A PPI network of the DEGs was constructed using Cytoscape software, and FOS, STAT1, MMP14, ITGB1, VCAN, DUSP1, LDHA, MCL1, MET, and ZFP36 were identified as the hub genes. The current study illustrates characteristic microRNA and gene profiles in preeclampsia, which may contribute to the interpretation of the progression of preeclampsia and provide novel biomarkers and therapeutic targets. PMID:28594854
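    Hub-gene selection in a PPI network typically ranks nodes by connectivity. A degree-count sketch over a toy edge list illustrates the idea; the edges below are invented, not the study's actual Cytoscape network:

```python
from collections import Counter

# Toy protein-protein interaction edges using hub-gene names from the
# abstract (the connections themselves are illustrative only)
edges = [("FOS", "STAT1"), ("FOS", "MMP14"), ("FOS", "ITGB1"),
         ("FOS", "DUSP1"), ("STAT1", "ITGB1"), ("MET", "STAT1"),
         ("MCL1", "ZFP36")]

# Node degree = number of interactions touching each gene
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

hubs = [g for g, _ in degree.most_common(3)]   # most-connected genes
print(hubs)   # ['FOS', 'STAT1', 'ITGB1']
```

    Real analyses usually combine degree with other centrality measures (betweenness, closeness) before declaring a node a hub.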