Sample records for qualitative predictions based

  1. Viewing Knowledge Bases as Qualitative Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model provides a unifying perspective for understanding how expert systems differ from conventional programs. Knowledge bases contain qualitative models of systems in the world, that is, primarily non-numeric descriptions that provide a basis for explaining and predicting behavior and formulating action plans. The…

  2. Hyperspectral face recognition using improved inter-channel alignment based on qualitative prediction models.

    PubMed

    Cho, Woon; Jang, Jinbeum; Koschan, Andreas; Abidi, Mongi A; Paik, Joonki

    2016-11-28

    A fundamental limitation of hyperspectral imaging is the inter-band misalignment correlated with subject motion during data acquisition. One way of resolving this problem is to assess the alignment quality of hyperspectral image cubes derived from the state-of-the-art alignment methods. In this paper, we present an automatic selection framework for the optimal alignment method to improve the performance of face recognition. Specifically, we develop two qualitative prediction models based on: 1) a principal curvature map for evaluating the similarity index between sequential target bands and a reference band in the hyperspectral image cube as a full-reference metric; and 2) the cumulative probability of target colors in the HSV color space for evaluating the alignment index of a single sRGB image rendered using all of the bands of the hyperspectral image cube as a no-reference metric. We verify the efficacy of the proposed metrics on a new large-scale database, demonstrating a higher prediction accuracy in determining improved alignment compared to two full-reference and five no-reference image quality metrics. We also validate the ability of the proposed framework to improve hyperspectral face recognition.
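
    The second, no-reference metric above can be pictured as a pixel count inside a target-colour region of HSV space. The sketch below is only a minimal illustration of that idea, not the authors' implementation: it converts a rendered sRGB image to HSV with matplotlib and reports the fraction of pixels falling inside an assumed hue/saturation window; the window bounds and the function name are hypothetical.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def target_color_fraction(srgb, hue_range=(0.0, 0.1), sat_range=(0.15, 0.7)):
    """Fraction of pixels whose HSV values fall inside an assumed target-colour window.

    srgb: float array of shape (H, W, 3) with values in [0, 1].
    The hue/saturation bounds are illustrative stand-ins for the paper's
    "target colors"; they are not taken from the source.
    """
    hsv = rgb_to_hsv(np.clip(srgb, 0.0, 1.0))
    hue, sat = hsv[..., 0], hsv[..., 1]
    in_window = ((hue >= hue_range[0]) & (hue <= hue_range[1]) &
                 (sat >= sat_range[0]) & (sat <= sat_range[1]))
    return float(in_window.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rendered = rng.random((64, 64, 3))   # stand-in for an sRGB image rendered from all bands
    print(f"target-colour fraction: {target_color_fraction(rendered):.3f}")
```

    In the paper's framework a higher score of this kind would indicate better inter-band alignment of the rendered image; calibrating the window and validating the score against recognition accuracy is the substantive work.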

  3. Predicting Dissertation Methodology Choice among Doctoral Candidates at a Faith-Based University

    ERIC Educational Resources Information Center

    Lunde, Rebecca

    2017-01-01

    Limited research has investigated dissertation methodology choice and the factors that contribute to this choice. Quantitative research is based in mathematics and scientific positivism, and qualitative research is based in constructivism. These underlying philosophical differences posit the question if certain factors predict dissertation…

  4. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  5. A Qualitative Simulation Framework in Smalltalk Based on Fuzzy Arithmetic

    Treesearch

    Richard L. Olson; Daniel L. Schmoldt; David L. Peterson

    1996-01-01

    For many systems, it is not practical to collect and correlate empirical data necessary to formulate a mathematical model. However, it is often sufficient to predict qualitative dynamics effects (as opposed to system quantities), especially for research purposes. In this effort, an object-oriented application framework (AF) was developed for the qualitative modeling of...

  6. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced, and lessons learned, from implementing the approach.

  7. Integrating qualitative research into evidence based practice.

    PubMed

    Greenhalgh, Trisha

    2002-09-01

    This article attempts to provide an overview of qualitative tools and methods using mainly examples from diabetes research. The other articles in this issue of the Endocrinology and Metabolism Clinics of North America have demonstrated the enormous contribution made in the past 15 years or so by rigorous quantitative studies of prevalence, diagnosis, prognosis, and therapy to clinical decision-making in endocrinology. In the early 21st century, the state of qualitative research into such topics as the illness experience of diabetes; the barriers to effective self care and positive health choices; the design of complex educational interventions; the design of appropriate, acceptable and responsive health services; and the decision-making behavior of health professionals, is such that there remain many more questions than answers. But qualitative research is increasingly recognized as an important, legitimate and expanding dimension of evidence-based health care (18;19). It is highly likely that the major landmark studies in diabetes care over the next decade will build on an exploratory qualitative study or incorporate an explanatory or evaluative dimension based on qualitative methods.

  8. Predicting Protein Function by Genomic Context: Quantitative Evaluation and Qualitative Inferences

    PubMed Central

    Huynen, Martijn; Snel, Berend; Lathe, Warren; Bork, Peer

    2000-01-01

    Various new methods have been proposed to predict functional interactions between proteins based on the genomic context of their genes. The types of genomic context that they use are Type I: the fusion of genes; Type II: the conservation of gene-order or co-occurrence of genes in potential operons; and Type III: the co-occurrence of genes across genomes (phylogenetic profiles). Here we compare these types for their coverage, their correlations with various types of functional interaction, and their overlap with homology-based function assignment. We apply the methods to Mycoplasma genitalium, the standard benchmarking genome in computational and experimental genomics. Quantitatively, conservation of gene order is the technique with the highest coverage, applying to 37% of the genes. By combining gene order conservation with gene fusion (6%), the co-occurrence of genes in operons in absence of gene order conservation (8%), and the co-occurrence of genes across genomes (11%), significant context information can be obtained for 50% of the genes (the categories overlap). Qualitatively, we observe that the functional interactions between genes are stronger as the requirements for physical neighborhood on the genome are more stringent, while the fraction of potential false positives decreases. Moreover, only in cases in which gene order is conserved in a substantial fraction of the genomes, in this case six out of twenty-five, does a single type of functional interaction (physical interaction) clearly dominate (>80%). In other cases, complementary function information from homology searches, which is available for most of the genes with significant genomic context, is essential to predict the type of interaction. Using a combination of genomic context and homology searches, new functional features can be predicted for 10% of M. genitalium genes. PMID:10958638

  9. Enhance your team-based qualitative research.

    PubMed

    Fernald, Douglas H; Duclos, Christine W

    2005-01-01

    Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems from differing skills and styles, and how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.

  10. Centrifugal compressor fault diagnosis based on qualitative simulation and thermal parameters

    NASA Astrophysics Data System (ADS)

    Lu, Yunsong; Wang, Fuli; Jia, Mingxing; Qi, Yuanchen

    2016-12-01

    This paper concerns fault diagnosis of a centrifugal compressor based on thermal parameters. An improved qualitative simulation (QSIM) based fault diagnosis method is proposed to diagnose faults of a centrifugal compressor in a gas-steam combined-cycle power plant (CCPP). The qualitative models under normal and two faulty conditions have been built through analysis of the operating principle of the centrifugal compressor. To solve the problem of qualitatively describing the observations of system variables, a qualitative trend extraction algorithm is applied to extract the trends of the observations. For qualitative state matching, a sliding-window matching strategy is proposed that combines constraints on the variables' operating ranges with qualitative constraints. The matching results are used to determine which QSIM model is more consistent with the running state of the system. The correct diagnosis of two typical faults, seal leakage and a stuck valve, in the centrifugal compressor has validated the targeted performance of the proposed method, showing the advantage of the fault-root information contained in thermal parameters.
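
    The qualitative trend extraction step can be pictured as reducing each measured signal to a string of signs. The sketch below is a generic illustration of that idea; the window length, tolerance and least-squares slope test are assumptions, not the algorithm used in the paper.

```python
import numpy as np

def qualitative_trends(signal, window=20, tol=1e-3):
    """Reduce a measured signal to a sequence of qualitative trends.

    Each non-overlapping window is fit with a straight line; the sign of the
    slope (within a tolerance band) gives '+', '0' or '-'.
    """
    trends = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = np.asarray(signal[start:start + window], dtype=float)
        slope = np.polyfit(np.arange(window), chunk, 1)[0]
        trends.append('+' if slope > tol else '-' if slope < -tol else '0')
    return trends

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # A flat segment followed by a slow decline, e.g. a pressure reading during a seal leak.
    pressure = np.concatenate([np.full(100, 5.0), 5.0 - 0.02 * np.arange(100)])
    print(qualitative_trends(pressure + 0.001 * rng.standard_normal(200)))
```

    The resulting trend sequence is the kind of qualitative description that can then be matched, window by window, against the behaviours predicted by each QSIM model.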

  11. System monitoring and diagnosis with qualitative models

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1991-01-01

    A substantial foundation of tools for model-based reasoning with incomplete knowledge was developed: QSIM (a qualitative simulation program) and its extensions for qualitative simulation; Q2, Q3 and their successors for quantitative reasoning on a qualitative framework; and the CC (component-connection) and QPC (Qualitative Process Theory) model compilers for building QSIM QDE (qualitative differential equation) models starting from different ontological assumptions. Other model-compilers for QDE's, e.g., using bond graphs or compartmental models, have been developed elsewhere. These model-building tools will support automatic construction of qualitative models from physical specifications, and further research into selection of appropriate modeling viewpoints. For monitoring and diagnosis, plausible hypotheses are unified against observations to strengthen or refute the predicted behaviors. In MIMIC (Model Integration via Mesh Interpolation Coefficients), multiple hypothesized models of the system are tracked in parallel in order to reduce the 'missing model' problem. Each model begins as a qualitative model, and is unified with a priori quantitative knowledge and with the stream of incoming observational data. When the model/data unification yields a contradiction, the model is refuted. When there is no contradiction, the predictions of the model are progressively strengthened, for use in procedure planning and differential diagnosis. Only under a qualitative level of description can a finite set of models guarantee the complete coverage necessary for this performance. The results of this research are presented in several publications. Abstracts of these published papers are presented along with abstracts of papers representing work that was synergistic with the NASA grant but funded otherwise. These 28 papers include but are not limited to: 'Combined qualitative and numerical simulation with Q3'; 'Comparative analysis and qualitative integral representations

  12. Qualitative-Modeling-Based Silicon Neurons and Their Networks

    PubMed Central

    Kohno, Takashi; Sekikawa, Munehisa; Li, Jing; Nanami, Takuya; Aihara, Kazuyuki

    2016-01-01

    The ionic conductance models of neuronal cells can finely reproduce a wide variety of complex neuronal activities. However, the complexity of these models has prompted the development of qualitative neuron models. They are described by differential equations with a reduced number of variables and their low-dimensional polynomials, which retain the core mathematical structures. Such simple models form the foundation of a bottom-up approach in computational and theoretical neuroscience. We proposed a qualitative-modeling-based approach for designing silicon neuron circuits, in which the mathematical structures in the polynomial-based qualitative models are reproduced by differential equations with silicon-native expressions. This approach can realize low-power-consuming circuits that can be configured to realize various classes of neuronal cells. In this article, our qualitative-modeling-based silicon neuron circuits for analog and digital implementations are quickly reviewed. One of our CMOS analog silicon neuron circuits can realize a variety of neuronal activities with a power consumption less than 72 nW. The square-wave bursting mode of this circuit is explained. Another circuit can realize Class I and II neuronal activities with about 3 nW. Our digital silicon neuron circuit can also realize these classes. An auto-associative memory realized on an all-to-all connected network of these silicon neurons is also reviewed, in which the neuron class plays important roles in its performance. PMID:27378842
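
    For orientation, the low-dimensional polynomial models referred to above are the territory of equations such as FitzHugh-Nagumo. The sketch below integrates that classic two-variable polynomial model with forward Euler purely as an illustration of a qualitative neuron model; it is not the silicon-neuron circuit model described in the article.

```python
import numpy as np

def fitzhugh_nagumo(i_ext=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, steps=20000):
    """Forward-Euler integration of the FitzHugh-Nagumo model:
        dv/dt = v - v**3/3 - w + i_ext
        dw/dt = (v + a - b*w) / tau
    Returns the membrane-like variable v over time."""
    v, w = -1.0, -0.5
    trace = np.empty(steps)
    for k in range(steps):
        dv = v - v**3 / 3.0 - w + i_ext
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        trace[k] = v
    return trace

if __name__ == "__main__":
    v = fitzhugh_nagumo()
    print(f"v ranges from {v.min():.2f} to {v.max():.2f}")  # sustained oscillation for this i_ext
```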

  13. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  14. Metrology of human-based and other qualitative measurements

    NASA Astrophysics Data System (ADS)

    Pendrill, Leslie; Petersson, Niclas

    2016-09-01

    The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory, the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for knowledge challenge, for example, a particular concept in understanding physics or for product quality of a certain health care service. Multivariate methods, such as Principal Component
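
    For concreteness, the dichotomous Rasch model mentioned above ties the probability of a successful response to the difference between a person (probe) ability and an item difficulty on a shared logit scale. The few lines below are a generic illustration of that formula, not the authors' analysis; the numbers are made up.

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: P(success) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

if __name__ == "__main__":
    # Illustrative values only: a probe of ability 1.2 logits meets an item of difficulty 0.5 logits.
    print(f"P(success) = {rasch_probability(1.2, 0.5):.3f}")  # about 0.67
```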

  15. Qualitative assessment of awake nasopharyngoscopy for prediction of oral appliance treatment response in obstructive sleep apnoea.

    PubMed

    Sutherland, Kate; Chan, Andrew S L; Ngiam, Joachim; Darendeliler, M Ali; Cistulli, Peter A

    2018-01-23

    Clinical methods to identify responders to oral appliance (OA) therapy for obstructive sleep apnoea (OSA) are needed. Awake nasopharyngoscopy during mandibular advancement, with image capture and subsequent processing and analysis, may predict treatment response. A qualitative assessment of awake nasopharyngoscopy would be simpler for clinical practice. We aimed to determine if a qualitative classification system of nasopharyngoscopic observations reflects treatment response. OSA patients were recruited for treatment with a customised two-piece OA. A custom scoring sheet was used to record observations of the pharyngeal airway (velopharynx, oropharynx, hypopharynx) during supine nasopharyngoscopy in response to mandibular advancement and performance of the Müller manoeuvre. Qualitative scores for degree (< 25%, 25-50%, 50-75%, > 75%), collapse pattern (concentric, anteroposterior, lateral) and diameter change (uniform, anteroposterior, lateral) were recorded. Treatment outcome was confirmed by polysomnography after a titration period of 14.6 ± 9.8 weeks. Treatment response was defined as (1) Treatment AHI < 5, (2) Treatment AHI < 10 plus > 50% AHI reduction and (3) > 50% AHI reduction. Eighty OSA patients (53.8% male) underwent nasopharyngoscopy. The most common nasopharyngoscopic observation with mandibular advancement was a small (< 50%) increase in velopharyngeal lateral diameter (37.5%). The majority of subjects (72.5%) were recorded as having > 75% velopharyngeal collapse on performance of the Müller manoeuvre. Mandibular advancement reduced the observed level of pharyngeal collapse at all three pharyngeal regions (p < 0.001). None of the nasopharyngoscopic qualitative scores differed between responder and non-responder groups. Qualitative assessment of awake nasopharyngoscopy appears useful for assessing the effect of mandibular advancement on upper airway collapsibility. However, it is not sensitive enough to predict oral

  16. Qualitative-Based Methodology to Teaching Qualitative Methodology in Higher Education

    ERIC Educational Resources Information Center

    Katz, Sara

    2015-01-01

    There is no defined theory for teaching Qualitative Inquiry, and very few studies have focused on the topic. This study is a qualitative case study focused on the Qualitative Methods course that I teach at a college of education in Israel. The aim of the study is to explore and describe the course, to provide a true picture of my pedagogy, and to…

  17. The mathematical bases for qualitative reasoning

    NASA Technical Reports Server (NTRS)

    Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi

    1991-01-01

    The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.
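
    A small example of the ordinal machinery involved (a generic sketch, not taken from the paper): combining the signs of two quantities under addition is only sometimes determinate, and that ambiguity is exactly what qualitative reasoning has to manage.

```python
def qualitative_add(a, b):
    """Sign algebra for x + y: returns '+', '-', '0' or '?' (ambiguous).
    Arguments are the qualitative values (signs) of the two addends."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == b:
        return a
    return '?'  # opposite signs: the result cannot be determined qualitatively

if __name__ == "__main__":
    for x, y in [('+', '+'), ('+', '0'), ('+', '-'), ('-', '-')]:
        print(f"sign(x + y) for signs ({x}, {y}) -> {qualitative_add(x, y)}")
```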

  18. Synthesis of qualitative research and evidence-based nursing.

    PubMed

    Flemming, Kate

    Evidence-based nursing is central to nursing practice. Systematic reviews have played a key part in providing evidence for decision making in nursing. Traditionally, these have consisted of syntheses of randomised controlled trials. New approaches to combining research include the synthesis of qualitative research. This article discusses the development of research synthesis as a method for creating evidence of effectiveness identified in quantitative research, making more effective use of primary data, enhancing the generalizability of qualitative research, and identifying future nursing research topics.

  19. Qualitative and quantitative prediction of volatile compounds from initial amino acid profiles in Korean rice wine (makgeolli) model.

    PubMed

    Kang, Bo-Sik; Lee, Jang-Eun; Park, Hyun-Jin

    2014-06-01

    In Korean rice wine (makgeolli) model, we tried to develop a prediction model capable of eliciting a quantitative relationship between initial amino acids in makgeolli mash and major aromatic compounds, such as fusel alcohols, their acetate esters, and ethyl esters of fatty acids, in makgeolli brewed. Mass-spectrometry-based electronic nose (MS-EN) was used to qualitatively discriminate between makgeollis made from makgeolli mashes with different amino acid compositions. Following this measurement, headspace solid-phase microextraction coupled to gas chromatography-mass spectrometry (GC-MS) combined with partial least-squares regression (PLSR) method was employed to quantitatively correlate amino acid composition of makgeolli mash with major aromatic compounds evolved during makgeolli fermentation. In qualitative prediction with MS-EN analysis, the makgeollis were well discriminated according to the volatile compounds derived from amino acids of makgeolli mash. Twenty-seven ion fragments with mass-to-charge ratio (m/z) of 55 to 98 amu were responsible for the discrimination. In GC-MS combined with PLSR method, a quantitative approach between the initial amino acids of makgeolli mash and the fusel compounds of makgeolli demonstrated that coefficient of determination (R(2)) of most of the fusel compounds ranged from 0.77 to 0.94 in good correlation, except for 2-phenylethanol (R(2) = 0.21), whereas R(2) for ethyl esters of MCFAs including ethyl caproate, ethyl caprylate, and ethyl caprate was 0.17 to 0.40 in poor correlation. The amino acids have been known to affect the aroma in alcoholic beverages. In this study, we demonstrated that an electronic nose qualitatively differentiated Korean rice wines (makgeollis) by their volatile compounds evolved from amino acids with rapidity and reproducibility and successively, a quantitative correlation with acceptable R2 between amino acids and fusel compounds could be established via HS-SPME GC-MS combined with partial least
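
    The GC-MS/PLSR step described above amounts to regressing the measured volatile-compound levels on the initial amino-acid profile. The snippet below is a minimal sketch of that kind of fit using scikit-learn on synthetic numbers; the array shapes, component count and noise level are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)

# Synthetic stand-ins: 30 mashes x 17 amino acids (X), 30 mashes x 5 volatile compounds (Y).
X = rng.normal(size=(30, 17))
Y = X @ rng.normal(size=(17, 5)) + 0.1 * rng.normal(size=(30, 5))

pls = PLSRegression(n_components=5)   # component count is an illustrative choice
pls.fit(X, Y)
Y_hat = pls.predict(X)

for j in range(Y.shape[1]):
    print(f"volatile {j}: R^2 = {r2_score(Y[:, j], Y_hat[:, j]):.2f}")
```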

  20. Data-Based Predictive Control with Multirate Prediction Step

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
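
    The core idea of deriving a predictive model directly from input-output data can be sketched with an ARX-style least-squares fit followed by multi-step-ahead prediction. This is a generic illustration of data-based prediction under simplifying assumptions, not the multirate receding-horizon controller developed in the paper.

```python
import numpy as np

def fit_arx(u, y):
    """Least-squares fit of y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]."""
    rows = [[y[k-1], y[k-2], u[k-1], u[k-2]] for k in range(2, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[2:], rcond=None)
    return theta

def predict_ahead(theta, u, y, start, steps):
    """Predict y[start .. start+steps-1] recursively, feeding predictions back
    in place of measurements that are not yet available."""
    a1, a2, b1, b2 = theta
    hist = list(y[:start])
    for k in range(start, start + steps):
        hist.append(a1*hist[k-1] + a2*hist[k-2] + b1*u[k-1] + b2*u[k-2])
    return hist[start:]

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    u = rng.standard_normal(300)
    y = np.zeros(300)
    for k in range(2, 300):   # the "true" plant that generates the identification data
        y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 0.5*u[k-1] + 0.2*u[k-2] + 0.01*rng.standard_normal()
    theta = fit_arx(u, y)
    print("identified parameters:", np.round(theta, 3))
    print("5-step-ahead prediction:", np.round(predict_ahead(theta, u, y, start=200, steps=5), 3))
```

    A receding-horizon controller would repeat such a prediction at every sample and choose the inputs that minimize a cost over the window; the paper's contribution is to sample that window at multiple rates to keep the computation tractable.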

  21. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  22. [Predictive value of qualitative assessment of general movements for adverse outcomes at 24 months of age in infants with asphyxia].

    PubMed

    Chen, Nan; Wen, Xiao-Hong; Huang, Jin-Hua; Wang, Shui-Yun; Zhu, Yue-E

    2015-12-01

    To investigate the predictive value of the qualitative assessment of general movements (GMs) for adverse outcomes at 24 months of age in full-term infants with asphyxia. A total of 114 full-term asphyxiated infants, who were admitted to the neonatal intensive care unit between 2009 and 2012 and took part in follow-ups after discharge were included in the study. All of them received the qualitative assessment of GMs within 3 months after birth. The development quotient was determined with the Bayley Scales of Infant Development at 24 months of age. The results of the qualitative assessment of GMs within 3 months after birth showed that among 114 infants, 20 (17.5%) had poor repertoire movements and 7 (6.1%) had cramped-synchronized movements during the writhing movements period; 8 infants (7.0%) had the absence of fidgety movements during the fidgety movements period. The results of development quotient at 24 months of age showed that 7 infants (6.1%) had adverse developmental outcomes: 6 cases of cerebral palsy and mental retardation and 1 case of mental retardation. There was a poor consistency between poor repertoire movements during the writhing movements period and the developmental outcomes at 24 months of age (Kappa=-0.019; P>0.05). There was a high consistency between cramped-synchronized movements during the writhing movements period and the developmental outcomes at 24 months of age (Kappa=0.848; P<0.05), and the results of predictive values of cramped-synchronized movements were shown as follows: predictive validity 98.2%, sensitivity 85.7%, specificity 99.1%, positive predictive value 85.7%, and negative predictive value 99.1%. There was a high consistency between the absence of fidgety movements during the fidgety movements period and the developmental outcomes at 24 months of age (Kappa=0.786; P<0.05), and its predictive values were expressed as follows: predictive validity 97.4%, sensitivity 85.7%, specificity 98.1%, positive predictive value 75
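
    The predictive values quoted above follow from a standard 2x2 table. As a check of the arithmetic, the sketch below recomputes them from counts inferred from the reported percentages (6 true positives, 1 false positive, 1 false negative and 106 true negatives for cramped-synchronized movements); the counts are an inference, not figures stated in the record.

```python
def predictive_values(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and overall accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "positive predictive value": tp / (tp + fp),
        "negative predictive value": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

if __name__ == "__main__":
    # Inferred counts: cramped-synchronized movements vs. adverse outcome at 24 months.
    for name, value in predictive_values(tp=6, fp=1, fn=1, tn=106).items():
        print(f"{name}: {100 * value:.1f}%")   # ~85.7, 99.1, 85.7, 99.1, 98.2
```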

  23. Theoretical investigation on the microstructure of triethylene glycol based deep eutectic solvents: COSMO-RS and TURBOMOLE prediction

    NASA Astrophysics Data System (ADS)

    Aissaoui, Tayeb; Benguerba, Yacine; AlNashef, Inas M.

    2017-08-01

    The in-silico combination mechanism of triethylene glycol based DESs has been studied. COSMO-RS and the graphical user interface TmoleX software were used to predict the interaction mechanism of hydrogen bond donors (HBDs) with hydrogen bond acceptors (HBAs) to form DESs. The predicted IR results were compared with the previously reported experimental FT-IR analysis for the same studied DESs. The sigma profiles of the HBD, HBAs and formed DESs were interpreted to qualitatively identify molecular properties such as polarity and hydrogen-bond donor and acceptor abilities. The predicted physicochemical properties reported in this study were in good agreement with the experimental ones.

  24. Exploring the Sequence-based Prediction of Folding Initiation Sites in Proteins.

    PubMed

    Raimondi, Daniele; Orlando, Gabriele; Pancsa, Rita; Khan, Taushif; Vranken, Wim F

    2017-08-18

    Protein folding is a complex process that can lead to disease when it fails. Especially poorly understood are the very early stages of protein folding, which are likely defined by intrinsic local interactions between amino acids close to each other in the protein sequence. We here present EFoldMine, a method that predicts, from the primary amino acid sequence of a protein, which amino acids are likely involved in early folding events. The method is based on early folding data derived from hydrogen deuterium exchange (HDX) NMR pulsed labelling experiments, and uses backbone and sidechain dynamics as well as secondary structure propensities as features. The EFoldMine predictions give insights into the folding process, as illustrated by a qualitative comparison with independent experimental observations. Furthermore, on a quantitative proteome scale, the predicted early folding residues tend to become the residues that interact the most in the folded structure, and they are often residues that display evolutionary covariation. The connection of the EFoldMine predictions with both folding pathway data and the folded protein structure suggests that the initial statistical behavior of the protein chain with respect to local structure formation has a lasting effect on its subsequent states.

  25. Utility of qualitative research findings in evidence-based public health practice.

    PubMed

    Jack, Susan M

    2006-01-01

    Epidemiological data, derived from quantitative studies, provide important information about the causes, prevalence, risk correlates, treatment and prevention of diseases, and health issues at a population level. However, public health issues are complex in nature and quantitative research findings are insufficient to support practitioners and administrators in making evidence-informed decisions. Upshur's Synthetic Model of Evidence (2001) situates qualitative research findings as a credible source of evidence for public health practice. This article answers the following questions: (1) where does qualitative research fit within the paradigm of evidence-based practice and (2) how can qualitative research be used by public health professionals? Strategies for using qualitative research findings instrumentally, conceptually, and symbolically are identified by applying Estabrooks' (1999) conceptual structure of research utilization. Different research utilization strategies are illustrated through the use of research examples from the field of work on intimate partner violence against women. Recommendations for qualitative researchers disseminating findings and for public health practitioners/policy makers considering the use of qualitative findings as evidence to inform decisions are provided.

  26. Enhancing emotional-based target prediction

    NASA Astrophysics Data System (ADS)

    Gosnell, Michael; Woodley, Robert

    2008-04-01

    This work extends existing agent-based target movement prediction to include key ideas of behavioral inertia, steady states, and catastrophic change from existing psychological, sociological, and mathematical work. Existing target prediction work inherently assumes a single steady state for target behavior, and attempts to classify behavior based on a single emotional state set. The enhanced, emotional-based target prediction maintains up to three distinct steady states, or typical behaviors, based on a target's operating conditions and observed behaviors. Each steady state has an associated behavioral inertia, similar to the standard deviation of behaviors within that state. The enhanced prediction framework also allows steady state transitions through catastrophic change and individual steady states could be used in an offline analysis with additional modeling efforts to better predict anticipated target reactions.

  27. Classification of cassava genotypes based on qualitative and quantitative data.

    PubMed

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.

  28. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology in simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend heavily on human experience. A multiple-scale validation approach based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  29. Prediction-based dynamic load-sharing heuristics

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.

    1993-01-01

    The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace driven simulations, they are compared against random scheduling and two effective nonprediction based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction counterpart.
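
    The contrast drawn above between state-based and prediction-based scheduling can be pictured with a toy placement rule: predict each arriving process's demand from its past runs, then place it on the node with the least predicted outstanding work. This is a simplified sketch of the general idea, not the heuristics evaluated in the paper; all names and numbers are made up.

```python
def predict_demand(history, default=1.0):
    """Predict a program's resource demand as the mean of its past runs
    (a crude stand-in for the statistical pattern-recognition predictor)."""
    return sum(history) / len(history) if history else default

def place_process(program, usage_history, node_loads):
    """Assign the process to the node with the smallest predicted outstanding load."""
    demand = predict_demand(usage_history.get(program, []))
    best = min(node_loads, key=node_loads.get)
    node_loads[best] += demand
    return best, demand

if __name__ == "__main__":
    usage_history = {"compile": [4.0, 5.0, 4.5], "mailcheck": [0.2, 0.3]}
    node_loads = {"node-a": 3.0, "node-b": 1.5, "node-c": 2.2}
    for prog in ["compile", "mailcheck", "compile"]:
        node, demand = place_process(prog, usage_history, node_loads)
        print(f"{prog}: predicted demand {demand:.2f} -> {node}; loads {node_loads}")
```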

  30. Using internet-based approaches to collect qualitative data from vulnerable groups: reflections from the field.

    PubMed

    Neville, Stephen; Adams, Jeffery; Cook, Catherine

    2016-12-01

    Undertaking qualitative research with vulnerable populations is a complex and challenging process for researchers. Traditional and common modes of collecting qualitative data with these groups have been via face-to-face recorded interviews. This article reports on three internet-based data collection methods: email and synchronous online interviews, as well as an online qualitative survey. The key characteristics of using email, synchronous online interviews and an online qualitative survey, including the strengths and limitations of each, are presented. Reflections and insights on the use of these internet-based data collection methods are provided to encourage researchers to embrace technology and move away from using traditional face-to-face interviews when researching with vulnerable populations. Using the internet to collect qualitative data offers additional ways to gather qualitative data over traditional data collection methods. The use of alternative interview methods may encourage participation of vulnerable participants.

  31. The promises of qualitative inquiry.

    PubMed

    Gergen, Kenneth J; Josselson, Ruthellen; Freeman, Mark

    2015-01-01

    We address the significance and implications of the formal entry of qualitative inquiry into the American Psychological Association. In our view, the discipline is enriched in new and important ways. Most prominently, the qualitative movement brings with it a pluralist orientation to knowledge and to practices of inquiry. Adding to the traditional view of knowledge as empirically supported theory are research practices congenial with varying accounts of knowledge, including, for example, knowledge as hermeneutic understanding, social construction, and practice-based experience. Added to the goal of prediction are investments in increasing cultural understanding, challenging cultural conventions, and directly fostering social change. The qualitative movement also enriches the discipline as a whole through the special ways in which it inspires new ranges of theory, fosters minority inclusion, and invites interdisciplinary collaboration. Finally, the movement holds promise in terms of the discipline's contribution to society at large. Here we focus on the advantages of knowing with others in addition to about them, and on ways in which qualitative work enhances communication with the society and the world. Realizing these potentials will depend on developments in responsible research and reporting, academic and journal policies, along with the discipline's capacities for appreciating a more comprehensive orientation to inquiry. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  32. Synthesising quantitative and qualitative research in evidence-based patient information.

    PubMed

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-03-01

    Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and

  33. Synthesising quantitative and qualitative research in evidence‐based patient information

    PubMed Central

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-01-01

    Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions A

  34. sNebula, a network-based algorithm to predict binding between human leukocyte antigens and peptides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Heng; Ye, Hao; Ng, Hui Wen

    Understanding the binding between human leukocyte antigens (HLAs) and peptides is important to understand the functioning of the immune system. Since it is time-consuming and costly to measure the binding between large numbers of HLAs and peptides, computational methods including machine learning models and network approaches have been developed to predict HLA-peptide binding. However, there are several limitations for the existing methods. We developed a network-based algorithm called sNebula to address these limitations. We curated qualitative Class I HLA-peptide binding data and demonstrated the prediction performance of sNebula on this dataset using leave-one-out cross-validation and five-fold cross-validations. Furthermore, this algorithm can predict not only peptides of different lengths and different types of HLAs, but also the peptides or HLAs that have no existing binding data. We believe sNebula is an effective method to predict HLA-peptide binding and thus improve our understanding of the immune system.

  35. sNebula, a network-based algorithm to predict binding between human leukocyte antigens and peptides

    PubMed Central

    Luo, Heng; Ye, Hao; Ng, Hui Wen; Sakkiah, Sugunadevi; Mendrick, Donna L.; Hong, Huixiao

    2016-01-01

    Understanding the binding between human leukocyte antigens (HLAs) and peptides is important to understand the functioning of the immune system. Since it is time-consuming and costly to measure the binding between large numbers of HLAs and peptides, computational methods including machine learning models and network approaches have been developed to predict HLA-peptide binding. However, there are several limitations for the existing methods. We developed a network-based algorithm called sNebula to address these limitations. We curated qualitative Class I HLA-peptide binding data and demonstrated the prediction performance of sNebula on this dataset using leave-one-out cross-validation and five-fold cross-validations. This algorithm can predict not only peptides of different lengths and different types of HLAs, but also the peptides or HLAs that have no existing binding data. We believe sNebula is an effective method to predict HLA-peptide binding and thus improve our understanding of the immune system. PMID:27558848

  36. sNebula, a network-based algorithm to predict binding between human leukocyte antigens and peptides

    DOE PAGES

    Luo, Heng; Ye, Hao; Ng, Hui Wen; ...

    2016-08-25

    Understanding the binding between human leukocyte antigens (HLAs) and peptides is important to understand the functioning of the immune system. Since it is time-consuming and costly to measure the binding between large numbers of HLAs and peptides, computational methods including machine learning models and network approaches have been developed to predict HLA-peptide binding. However, there are several limitations for the existing methods. We developed a network-based algorithm called sNebula to address these limitations. We curated qualitative Class I HLA-peptide binding data and demonstrated the prediction performance of sNebula on this dataset using leave-one-out cross-validation and five-fold cross-validations. Furthermore, this algorithm can predict not only peptides of different lengths and different types of HLAs, but also the peptides or HLAs that have no existing binding data. We believe sNebula is an effective method to predict HLA-peptide binding and thus improve our understanding of the immune system.

  37. Utilizing Problem-Based Learning in Qualitative Analysis Lab Experiments

    ERIC Educational Resources Information Center

    Hicks, Randall W.; Bevsek, Holly M.

    2012-01-01

    A series of qualitative analysis (QA) laboratory experiments utilizing a problem-based learning (PBL) module has been designed and implemented. The module guided students through the experiments under the guise of cleaning up a potentially contaminated water site as employees of an environmental chemistry laboratory. The main goal was the…

  38. A Qualitative Study of College-Based Peace Education Programs

    ERIC Educational Resources Information Center

    Boudreau, Will

    2017-01-01

    The purpose of this exploratory research study was to examine the perceptions of seven northeast United States, college-based, Peace Education program directors regarding their respective programs' characteristics and the challenges they face. This qualitative study was designed to fill a gap in the literature by examining the perceptions of…

  39. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region for the Blackfoot field, Alberta, Canada, an offshore oil field. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (> 15%) zone, interpreted from the inverted impedance and porosity sections respectively between the 1060 and 1075 ms time interval, is characterized as the reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient for predicting porosity in the inter-well region.
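
    Of the three approaches named above, multi-attribute analysis is the easiest to sketch: porosity at the wells is regressed on several co-located seismic attributes, and the fitted weights are then applied away from the wells. The snippet below is a generic illustration on synthetic numbers, not the study's workflow; the attribute names and values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-ins: 3 seismic attributes sampled at 40 well locations.
attributes = rng.normal(size=(40, 3))          # e.g. inverted impedance, amplitude, frequency
porosity = 15 + attributes @ np.array([-2.0, 0.8, 0.5]) + rng.normal(0, 0.5, size=40)

# Multi-attribute linear regression with an intercept column.
A = np.column_stack([np.ones(len(attributes)), attributes])
weights, *_ = np.linalg.lstsq(A, porosity, rcond=None)

# Apply the fitted weights to attribute samples away from the wells.
new_attributes = rng.normal(size=(5, 3))
predicted = np.column_stack([np.ones(5), new_attributes]) @ weights
print("weights:", np.round(weights, 2))
print("predicted porosity (%):", np.round(predicted, 1))
```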

  40. Rheumatoid arthritis patient perceptions on the value of predictive testing for treatments: a qualitative study.

    PubMed

    Kumar, Kanta; Peters, Sarah; Barton, Anne

    2016-11-08

    Rheumatoid arthritis (RA) is a long term condition that requires early treatment to control symptoms and improve long-term outcomes. Lack of response to RA treatments is not only a waste of healthcare resources, but also causes disability and distress to patients. Identifying biomarkers predictive of treatment response offers an opportunity to improve clinical decisions about which treatment to recommend in patients and could ultimately lead to better patient outcomes. The aim of this study was to explore the understanding of and factors affecting Rheumatoid Arthritis (RA) patients' decisions around predictive treatment testing. A qualitative study was conducted with a purposive sample of 16 patients with RA from three major UK cities. Four focus groups explored patient perceptions of the use of biomarker tests to predict response to treatments. Interviews were audio-recorded, transcribed verbatim and analysed using thematic analysis by three researchers. Data were organised within three interlinking themes: [1] Perceptions of predictive tests and patient preference of tests; [2] Utility of the test to manage expectations; [3] The influence of the disease duration on take up of predictive testing. During consultations for predictive testing, patients felt they would need, first, careful explanations detailing the consequences of untreated RA and delayed treatment response and, second, support to balance the risks of tests, which might be invasive and/or only moderately accurate, with the potential benefits of better management of symptoms. This study provides important insights into predictive testing. Besides supporting clinical decision making, the development of predictive testing in RA is largely supported by patients. Developing strategies which communicate risk information about predictive testing effectively while reducing the psychological burden associated with this information will be essential to maximise uptake.

  41. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    PubMed Central

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  42. Model-based predictions for dopamine.

    PubMed

    Langdon, Angela J; Sharpe, Melissa J; Schoenbaum, Geoffrey; Niv, Yael

    2018-04-01

    Phasic dopamine responses are thought to encode a prediction-error signal consistent with model-free reinforcement learning theories. However, a number of recent findings highlight the influence of model-based computations on dopamine responses, and suggest that dopamine prediction errors reflect more dimensions of an expected outcome than scalar reward value. Here, we review a selection of these recent results and discuss the implications and complications of model-based predictions for computational theories of dopamine and learning. Copyright © 2017. Published by Elsevier Ltd.
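
    The prediction-error signal referred to above is usually formalized as a temporal-difference error. A minimal model-free sketch is given below purely for orientation; the review's point is precisely that dopamine also reflects model-based quantities beyond this scalar.

```python
def td_prediction_error(reward, value_next, value_current, gamma=1.0):
    """Temporal-difference error: delta = r + gamma * V(next state) - V(current state)."""
    return reward + gamma * value_next - value_current

if __name__ == "__main__":
    v_cue, alpha = 0.0, 0.1            # cue value and learning rate (illustrative numbers)
    for _ in range(50):                # repeated cue -> reward pairings, terminal next state
        delta = td_prediction_error(reward=1.0, value_next=0.0, value_current=v_cue)
        v_cue += alpha * delta
    print(f"learned cue value: {v_cue:.3f}")   # approaches the reward magnitude of 1.0
```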

  43. A framework for qualitative reasoning about solid objects

    NASA Technical Reports Server (NTRS)

    Davis, E.

    1987-01-01

    Predicting the behavior of a qualitatively described system of solid objects requires a combination of geometrical, temporal, and physical reasoning. Methods based upon formulating and solving differential equations are not adequate for robust prediction, since the behavior of a system over extended time may be much simpler than its behavior over local time. A first-order logic, in which one can state simple physical problems and derive their solution deductively, without recourse to solving the differential equations, is discussed. This logic is substantially more expressive and powerful than any previous AI representational system in this domain.

  4. The Mathematical Bases for Qualitative Reasoning

    DTIC Science & Technology

    1990-01-01

    Much qualitative reasoning is carried out without use of mathematical formalisms, but solely in terms of ordinary language. A good deal of such qualitative reasoning makes implicit use of the properties of ordinal variables (for example, Fahrenheit or Celsius temperature) and of equations connecting two variables, y = f(x).

  5. Students' Understanding of Acid, Base and Salt Reactions in Qualitative Analysis.

    ERIC Educational Resources Information Center

    Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.

    2003-01-01

    Uses a two-tier, multiple-choice diagnostic instrument to determine (n=915) grade 10 students' understanding of the acid, base, and salt reactions involved in basic qualitative analysis. Reports that many students did not understand the formation of precipitates and the complex salts, acid/salt-base reactions, and thermal decomposition involved in…

  6. A Physiologically Based Pharmacokinetic Model for Pregnant Women to Predict the Pharmacokinetics of Drugs Metabolized Via Several Enzymatic Pathways.

    PubMed

    Dallmann, André; Ince, Ibrahim; Coboeken, Katrin; Eissing, Thomas; Hempel, Georg

    2017-09-18

    Physiologically based pharmacokinetic modeling is considered a valuable tool for predicting pharmacokinetic changes in pregnancy to subsequently guide in-vivo pharmacokinetic trials in pregnant women. The objective of this study was to extend and verify a previously developed physiologically based pharmacokinetic model for pregnant women for the prediction of pharmacokinetics of drugs metabolized via several cytochrome P450 enzymes. Quantitative information on gestation-specific changes in enzyme activity available in the literature was incorporated in a pregnancy physiologically based pharmacokinetic model and the pharmacokinetics of eight drugs metabolized via one or multiple cytochrome P450 enzymes were predicted. The tested drugs were caffeine, midazolam, nifedipine, metoprolol, ondansetron, granisetron, diazepam, and metronidazole. Pharmacokinetic predictions were evaluated by comparison with in-vivo pharmacokinetic data obtained from the literature. The pregnancy physiologically based pharmacokinetic model successfully predicted the pharmacokinetics of all tested drugs. The observed pregnancy-induced pharmacokinetic changes were qualitatively and quantitatively reasonably well predicted for all drugs. Ninety-seven percent of the mean plasma concentrations predicted in pregnant women fell within a twofold error range and 63% within a 1.25-fold error range. For all drugs, the predicted area under the concentration-time curve was within a 1.25-fold error range. The presented pregnancy physiologically based pharmacokinetic model can quantitatively predict the pharmacokinetics of drugs that are metabolized via one or multiple cytochrome P450 enzymes by integrating prior knowledge of the pregnancy-related effect on these enzymes. This pregnancy physiologically based pharmacokinetic model may thus be used to identify potential exposure changes in pregnant women a priori and to eventually support informed decision making when clinical trials are designed in this population.
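
    The twofold and 1.25-fold error statistics quoted above follow directly from the ratio of predicted to observed values. A minimal sketch of that bookkeeping, using invented prediction/observation pairs rather than the study's data:

```python
import numpy as np

def fold_errors(predicted, observed):
    """Return the fold error max(pred/obs, obs/pred) for each pair."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    ratio = predicted / observed
    return np.maximum(ratio, 1.0 / ratio)

# Hypothetical mean plasma concentrations (mg/L) in pregnant subjects.
pred = [1.2, 0.8, 3.5, 0.05, 2.1]
obs = [1.0, 1.0, 3.0, 0.04, 2.0]

fe = fold_errors(pred, obs)
print("fraction within 2-fold:", np.mean(fe <= 2.0))
print("fraction within 1.25-fold:", np.mean(fe <= 1.25))
```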

  7. Base Rates, Contingencies, and Prediction Behavior

    ERIC Educational Resources Information Center

    Kareev, Yaakov; Fiedler, Klaus; Avrahami, Judith

    2009-01-01

    A skew in the base rate of upcoming events can often provide a better cue for accurate predictions than a contingency between signals and events. The authors study prediction behavior and test people's sensitivity to both base rate and contingency; they also examine people's ability to compare the benefits of both for prediction. They formalize…

  8. Electromigration model for the prediction of lifetime based on the failure unit statistics in aluminum metallization

    NASA Astrophysics Data System (ADS)

    Park, Jong Ho; Ahn, Byung Tae

    2003-01-01

    A failure model for electromigration based on the "failure unit model" was presented for the prediction of lifetime in metal lines. The failure unit model, which consists of failure units in parallel and series, can predict both the median time to failure (MTTF) and the deviation in the time to failure (DTTF) in Al metal lines, but it can describe them only qualitatively. In our model, the probability functions of the failure unit in both single-grain segments and polygrain segments are considered, instead of in polygrain segments alone. Based on our model, we calculated MTTF, DTTF, and activation energy for different median grain sizes, grain size distributions, linewidths, line lengths, current densities, and temperatures. Comparisons between our results and published experimental data showed good agreement, and our model could explain previously unexplained phenomena. Our advanced failure unit model might be further applied to other electromigration characteristics of metal lines.
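
    The record does not give the model's equations, but the weakest-link logic of a series chain of failure units can be illustrated with a small Monte Carlo sketch. The lognormal unit lifetimes, medians, and spread below are purely illustrative assumptions, with single-grain segments assumed longer-lived than polygrain segments.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_line_ttf(n_poly, n_single, med_poly=10.0, med_single=40.0, sigma=0.5):
    """Time to failure of one line as the weakest of its series failure units.

    Unit lifetimes are drawn from lognormal distributions with made-up medians;
    the line fails as soon as any one failure unit fails (series behaviour).
    """
    poly = rng.lognormal(np.log(med_poly), sigma, n_poly)
    single = rng.lognormal(np.log(med_single), sigma, n_single)
    return np.min(np.concatenate([poly, single]))

ttfs = np.array([simulate_line_ttf(n_poly=20, n_single=5) for _ in range(2000)])
mttf = np.median(ttfs)          # median time to failure
dttf = np.std(np.log(ttfs))     # lognormal shape parameter as a spread measure
print(f"MTTF ~ {mttf:.2f}, DTTF (log-std) ~ {dttf:.2f}")
```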

  9. The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures

    ERIC Educational Resources Information Center

    O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.

    2007-01-01

    The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…

  10. The Uses of Qualitative Research: Powerful Methods to Inform Evidence-Based Practice in Education

    ERIC Educational Resources Information Center

    Kozleski, Elizabeth B.

    2017-01-01

    This article offers a rationale for the contributions of qualitative research to evidence-based practice in special education. In it, I make the argument that qualitative research encompasses the ability to study significant problems of practice, engage with practitioners in the conduct of research studies, learn and change processes during a…

  11. Numerical and Qualitative Contrasts of Two Statistical Models ...

    EPA Pesticide Factsheets

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and products. This study provided an empirical and qualitative comparison of both models using 29 years of data for two discrete time series of chlorophyll-a (chl-a) in the Patuxent River estuary. Empirical descriptions of each model were based on predictive performance against the observed data, ability to reproduce flow-normalized trends with simulated data, and comparisons of performance with validation datasets. Between-model differences were apparent but minor and both models had comparable abilities to remove flow effects from simulated time series. Both models similarly predicted observations for missing data with different characteristics. Trends from each model revealed distinct mainstem influences of the Chesapeake Bay with both models predicting a roughly 65% increase in chl-a over time in the lower estuary, whereas flow-normalized predictions for the upper estuary showed a more dynamic pattern, with a nearly 100% increase in chl-a in the last 10 years. Qualitative comparisons highlighted important differences in the statistical structure, available products, and characteristics of the data and desired analysis. This manuscript describes a quantitative comparison of two recently-

  12. Ensemble-based prediction of RNA secondary structures.

    PubMed

    Aghaeepour, Nima; Hoos, Holger H

    2013-04-24

    Accurate structure prediction methods play an important role in the understanding of RNA function. Energy-based, pseudoknot-free secondary structure prediction is one of the most widely used and versatile approaches, and improved methods for this task have received much attention over the past five years. Despite the impressive progress that has been achieved in this area, existing evaluations of the prediction accuracy achieved by various algorithms do not provide a comprehensive, statistically sound assessment. Furthermore, while there is increasing evidence that no prediction algorithm consistently outperforms all others, no work has been done to exploit the complementary strengths of multiple approaches. In this work, we present two contributions to the area of RNA secondary structure prediction. Firstly, we use state-of-the-art, resampling-based statistical methods together with a previously published and increasingly widely used dataset of high-quality RNA structures to conduct a comprehensive evaluation of existing RNA secondary structure prediction procedures. The results from this evaluation clarify the performance relationship between ten well-known existing energy-based pseudoknot-free RNA secondary structure prediction methods and clearly demonstrate the progress that has been achieved in recent years. Secondly, we introduce AveRNA, a generic and powerful method for combining a set of existing secondary structure prediction procedures into an ensemble-based method that achieves significantly higher prediction accuracies than obtained from any of its component procedures. Our new, ensemble-based method, AveRNA, improves the state of the art for energy-based, pseudoknot-free RNA secondary structure prediction by exploiting the complementary strengths of multiple existing prediction procedures, as demonstrated using a state-of-the-art statistical resampling approach. In addition, AveRNA allows an intuitive and effective control of the trade-off between
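
    AveRNA's actual combination scheme is not described in this record. One simple way to combine several predicted structures, sketched below under that caveat, is to keep the base pairs proposed by a majority of the component methods, resolving conflicts greedily by vote count; the dot-bracket inputs are toy examples.

```python
from collections import Counter

def pairs_from_dotbracket(structure):
    """Return the set of (i, j) base pairs encoded in a dot-bracket string."""
    stack, pairs = [], set()
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            pairs.add((stack.pop(), i))
    return pairs

def consensus_pairs(structures, min_votes=None):
    """Keep pairs predicted by at least min_votes methods, skipping conflicts."""
    if min_votes is None:
        min_votes = len(structures) // 2 + 1
    votes = Counter(p for s in structures for p in pairs_from_dotbracket(s))
    used, consensus = set(), set()
    for (i, j), v in votes.most_common():
        if v >= min_votes and i not in used and j not in used:
            consensus.add((i, j))
            used.update((i, j))
    return consensus

# Hypothetical predictions from three methods for the same short sequence.
preds = ["((....))..", "((.....)).", "((....)).."]
print(sorted(consensus_pairs(preds)))
```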

  13. Qualitative simulation for process modeling and control

    NASA Technical Reports Server (NTRS)

    Dalle Molle, D. T.; Edgar, T. F.

    1989-01-01

    A qualitative model is developed for a first-order system with a proportional-integral controller without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space behavior of the oscillatory behavior is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.
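
    A quantitative companion to the qualitative result above: for a first-order plant 1/(tau*s + 1) under proportional-integral control, the closed-loop characteristic polynomial is tau*s^2 + (1 + Kp)*s + Ki, and oscillatory behaviour corresponds to complex roots (negative discriminant). The sketch below, with illustrative parameter values, checks that condition numerically; it is an independent illustration, not the paper's qualitative simulation procedure.

```python
import numpy as np

def closed_loop_poles(tau, kp, ki, plant_gain=1.0):
    """Poles of a first-order plant plant_gain/(tau*s + 1) under PI control.

    Characteristic polynomial: tau*s^2 + (1 + plant_gain*kp)*s + plant_gain*ki.
    Complex poles (negative discriminant) correspond to oscillatory behaviour.
    """
    return np.roots([tau, 1.0 + plant_gain * kp, plant_gain * ki])

for kp, ki in [(0.5, 0.1), (0.5, 5.0)]:
    poles = closed_loop_poles(tau=2.0, kp=kp, ki=ki)
    oscillatory = bool(np.any(np.abs(poles.imag) > 1e-12))
    print(f"Kp={kp}, Ki={ki}: poles={np.round(poles, 3)}, oscillatory={oscillatory}")
```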

  14. Learning linear transformations between counting-based and prediction-based word embeddings

    PubMed Central

    Hayashi, Kohei; Kawarabayashi, Ken-ichi

    2017-01-01

    Despite the growing interest in prediction-based word embedding learning methods, it remains unclear as to how the vector spaces learnt by the prediction-based methods differ from that of the counting-based methods, or whether one can be transformed into the other. To study the relationship between counting-based and prediction-based embeddings, we propose a method for learning a linear transformation between two given sets of word embeddings. Our proposal contributes to the word embedding learning research in three ways: (a) we propose an efficient method to learn a linear transformation between two sets of word embeddings, (b) using the transformation learnt in (a), we empirically show that it is possible to predict distributed word embeddings for novel unseen words, and (c) empirically it is possible to linearly transform counting-based embeddings to prediction-based embeddings, for frequent words, different POS categories, and varying degrees of ambiguities. PMID:28926629
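
    The transformation described above can be learned with an ordinary least-squares solve. The sketch below uses synthetic matrices standing in for counting-based and prediction-based embedding sets aligned over the same vocabulary.

```python
import numpy as np

def learn_linear_map(x_count, y_pred):
    """Least-squares W such that x_count @ W approximates y_pred.

    Rows of x_count and y_pred are the counting-based and prediction-based
    embeddings of the same words, in the same order.
    """
    w, *_ = np.linalg.lstsq(x_count, y_pred, rcond=None)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 300))                       # counting-based vectors (toy)
true_w = rng.normal(size=(300, 100))
Y = X @ true_w + 0.01 * rng.normal(size=(5000, 100))   # prediction-based vectors (toy)

w = learn_linear_map(X, Y)

x_novel = rng.normal(size=(1, 300))                    # embedding of a novel word
y_hat = x_novel @ w                                    # predicted embedding
rel_err = np.linalg.norm(y_hat - x_novel @ true_w) / np.linalg.norm(x_novel @ true_w)
print(f"relative error on a held-out word: {rel_err:.3f}")
```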

  15. A novel logic-based approach for quantitative toxicology prediction.

    PubMed

    Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E

    2007-01-01

    There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R2CV) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R2 values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. The SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics including in silico drug design.
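
    The SVILP kernel itself is built from ILP-derived rules and is not reproduced here. As a simplified stand-in, the sketch below treats each rule as a binary "does this rule cover the molecule" feature and fits a kernelized support-vector regressor on those features; the coverage matrix and toxicity values are synthetic.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# Hypothetical rule-coverage matrix: rows = molecules, columns = ILP rules,
# entry 1 if the rule's substructure pattern covers the molecule.
n_molecules, n_rules = 200, 40
coverage = rng.integers(0, 2, size=(n_molecules, n_rules)).astype(float)

# Toy toxicity values loosely driven by a few "alert" rules plus noise.
alert_weights = np.zeros(n_rules)
alert_weights[:5] = [1.5, 1.0, 0.8, 0.6, 0.5]
toxicity = coverage @ alert_weights + 0.3 * rng.normal(size=n_molecules)

# RBF-kernel support-vector regression over the rule-coverage features.
model = SVR(kernel="rbf", C=10.0, gamma="scale")
r2 = cross_val_score(model, coverage, toxicity, cv=5, scoring="r2")
print("cross-validated R^2:", round(float(r2.mean()), 2))
```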

  16. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  17. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE PAGES

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    2016-06-20

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  18. Probabilistic self-localisation on a qualitative map based on occlusions

    NASA Astrophysics Data System (ADS)

    Santos, Paulo E.; Martins, Murilo F.; Fenelon, Valquiria; Cozman, Fabio G.; Dee, Hannah M.

    2016-09-01

    Spatial knowledge plays an essential role in human reasoning, permitting tasks such as locating objects in the world (including oneself), reasoning about everyday actions and describing perceptual information. This is also the case in the field of mobile robotics, where one of the most basic (and essential) tasks is the autonomous determination of the pose of a robot with respect to a map, given its perception of the environment. This is the problem of robot self-localisation (or simply the localisation problem). This paper presents a probabilistic algorithm for robot self-localisation that is based on a topological map constructed from the observation of spatial occlusion. Distinct locations on the map are defined by means of a classical formalism for qualitative spatial reasoning, whose base definitions are closer to the human categorisation of space than traditional, numerical, localisation procedures. The approach herein proposed was systematically evaluated through experiments using a mobile robot equipped with a RGB-D sensor. The results obtained show that the localisation algorithm is successful in locating the robot in qualitatively distinct regions.

  19. Using framework-based synthesis for conducting reviews of qualitative studies.

    PubMed

    Dixon-Woods, Mary

    2011-04-14

    Framework analysis is a technique used for data analysis in primary qualitative research. Recent years have seen its being adapted to conduct syntheses of qualitative studies. Framework-based synthesis shows considerable promise in addressing applied policy questions. An innovation in the approach, known as 'best fit' framework synthesis, has been published in BMC Medical Research Methodology this month. It involves reviewers in choosing a conceptual model likely to be suitable for the question of the review, and using it as the basis of their initial coding framework. This framework is then modified in response to the evidence reported in the studies in the reviews, so that the final product is a revised framework that may include both modified factors and new factors that were not anticipated in the original model. 'Best fit' framework-based synthesis may be especially suitable in addressing urgent policy questions where the need for a more fully developed synthesis is balanced by the need for a quick answer. Please see related article: http://www.biomedcentral.com/1471-2288/11/29.

  20. An evidential link prediction method and link predictability based on Shannon entropy

    NASA Astrophysics Data System (ADS)

    Yin, Likang; Zheng, Haoyang; Bian, Tian; Deng, Yong

    2017-09-01

    Predicting missing links is of both theoretical value and practical interest in network science. In this paper, we empirically investigate a new link prediction method based on similarity and compare nine well-known local similarity measures on nine real networks. Most previous studies focus on accuracy; however, it is crucial to consider link predictability as an intrinsic property of the network itself. Hence, this paper proposes a new link prediction approach called the evidential measure (EM), based on Dempster-Shafer theory. Moreover, this paper proposes a new method to measure link predictability via local information and Shannon entropy.
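
    The evidential measure itself is not specified in this record. The sketch below shows the familiar common-neighbours similarity baseline on a toy graph, together with a Shannon-entropy summary of the normalized similarity scores as a crude predictability indicator; it illustrates the ingredients, not the paper's EM method.

```python
import math
from itertools import combinations

edges = {(1, 2), (1, 3), (2, 3), (3, 4), (4, 5), (2, 5)}
nodes = {u for e in edges for u in e}
adj = {u: set() for u in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Common-neighbours similarity for every non-adjacent node pair.
scores = {}
for u, v in combinations(sorted(nodes), 2):
    if v not in adj[u]:
        scores[(u, v)] = len(adj[u] & adj[v])

ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print("top candidate links:", ranked[:3])

# Shannon entropy of the normalised score distribution: lower entropy means the
# similarity measure singles out a few pairs, a crude proxy for predictability.
total = sum(scores.values())
probs = [s / total for s in scores.values() if s > 0]
entropy = -sum(p * math.log2(p) for p in probs)
print("score entropy (bits):", round(entropy, 3))
```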

  1. Qualitative Research and Community-Based Participatory Research: Considerations for Effective Dissemination in the Peer-Reviewed Literature.

    PubMed

    Grieb, Suzanne Dolwick; Eder, Milton Mickey; Smith, Katherine C; Calhoun, Karen; Tandon, Darius

    2015-01-01

    Qualitative research is appearing with increasing frequency in the public health and medical literature. Qualitative research in combination with a community-based participatory research (CBPR) approach can be powerful. However little guidance is available on how to present qualitative research within a CBPR framework for peer-review publications. This article provides a brief overview of how qualitative research can advance CBPR partnerships and outlines practical guidelines for writing for publication about qualitative research within a CBPR framework to (1) guide partners with little experience publishing in peer-reviewed journals and/or (2) facilitate effective preparation of manuscripts grounded in qualitative research for peer-reviewed journals. We provide information regarding the specific benefits of qualitative inquiry in CBPR, tips for organizing the manuscript, questions to consider in preparing the manuscript, common mistakes in the presentation of qualitative research, and examples of peer-reviewed manuscripts presenting qualitative research conducted within a CBPR framework. Qualitative research approaches have tremendous potential to integrate community and researcher perspectives to inform community health research findings. Effective dissemination of CBPR informed qualitative research findings is crucial to advancing health disparities research.

  2. Inquiry-Based Stress Reduction Meditation Technique for Teacher Burnout: A Qualitative Study

    ERIC Educational Resources Information Center

    Schnaider-Levi, Lia; Mitnik, Inbal; Zafrani, Keren; Goldman, Zehavit; Lev-Ari, Shahar

    2017-01-01

    An inquiry-based intervention has been found to have a positive effect on burnout and mental well-being parameters among teachers. The aim of the current study was to qualitatively evaluate the effect of the inquiry-based stress reduction (IBSR) meditation technique on the participants. Semi-structured interviews were conducted before and after…

  3. Knowledge-based fragment binding prediction.

    PubMed

    Tang, Grace W; Altman, Russ B

    2014-04-01

    Target-based drug discovery must assess many drug-like compounds for potential activity. Focusing on low-molecular-weight compounds (fragments) can dramatically reduce the chemical search space. However, approaches for determining protein-fragment interactions have limitations. Experimental assays are time-consuming, expensive, and not always applicable. At the same time, computational approaches using physics-based methods have limited accuracy. With increasing high-resolution structural data for protein-ligand complexes, there is now an opportunity for data-driven approaches to fragment binding prediction. We present FragFEATURE, a machine learning approach to predict small molecule fragments preferred by a target protein structure. We first create a knowledge base of protein structural environments annotated with the small molecule substructures they bind. These substructures have low-molecular weight and serve as a proxy for fragments. FragFEATURE then compares the structural environments within a target protein to those in the knowledge base to retrieve statistically preferred fragments. It merges information across diverse ligands with shared substructures to generate predictions. Our results demonstrate FragFEATURE's ability to rediscover fragments corresponding to the ligand bound with 74% precision and 82% recall on average. For many protein targets, it identifies high scoring fragments that are substructures of known inhibitors. FragFEATURE thus predicts fragments that can serve as inputs to fragment-based drug design or serve as refinement criteria for creating target-specific compound libraries for experimental or computational screening.

  4. Knowledge-based Fragment Binding Prediction

    PubMed Central

    Tang, Grace W.; Altman, Russ B.

    2014-01-01

    Target-based drug discovery must assess many drug-like compounds for potential activity. Focusing on low-molecular-weight compounds (fragments) can dramatically reduce the chemical search space. However, approaches for determining protein-fragment interactions have limitations. Experimental assays are time-consuming, expensive, and not always applicable. At the same time, computational approaches using physics-based methods have limited accuracy. With increasing high-resolution structural data for protein-ligand complexes, there is now an opportunity for data-driven approaches to fragment binding prediction. We present FragFEATURE, a machine learning approach to predict small molecule fragments preferred by a target protein structure. We first create a knowledge base of protein structural environments annotated with the small molecule substructures they bind. These substructures have low-molecular weight and serve as a proxy for fragments. FragFEATURE then compares the structural environments within a target protein to those in the knowledge base to retrieve statistically preferred fragments. It merges information across diverse ligands with shared substructures to generate predictions. Our results demonstrate FragFEATURE's ability to rediscover fragments corresponding to the ligand bound with 74% precision and 82% recall on average. For many protein targets, it identifies high scoring fragments that are substructures of known inhibitors. FragFEATURE thus predicts fragments that can serve as inputs to fragment-based drug design or serve as refinement criteria for creating target-specific compound libraries for experimental or computational screening. PMID:24762971

  5. Qualitative radiology assessment of tumor response: does it measure up?

    PubMed

    Gottlieb, Ronald H; Litwin, Alan; Gupta, Bhavna; Taylor, John; Raczyk, Cheryl; Mashtare, Terry; Wilding, Gregory; Fakih, Marwan

    2008-01-01

    Our purpose was to assess whether a simpler qualitative evaluation of tumor response by computed tomography is as reproducible and predictive of clinical outcome as the Response Evaluation Criteria in Solid Tumors (RECIST) and World Health Organization (WHO) methods. This study was a two-reader retrospective evaluation of 23 patients with metastatic colorectal carcinoma in which hepatic metastases were qualitatively classified as increased, decreased, or unchanged. Qualitative assessment resulted in reader agreement in 21 of 23 patients (91.3%; kappa=0.78; 95% CI, 0.51-1.00), compared with agreement in 20 of 23 patients (87.0%) for the RECIST (kappa=0.62; 95% CI, 0.23-1.00) and WHO (kappa=0.67; 95% CI, 0.34-1.00) methods. Patients were placed into partial response, stable disease, and disease progression categories. Time to progression of disease was better predicted qualitatively than by RECIST or WHO. Our pilot data suggest that our qualitative scoring system is more reproducible and predictive of patient clinical outcome than the RECIST and WHO methods.
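
    Reader-agreement statistics like those above come from the standard Cohen's kappa calculation. A small worked sketch on an invented two-reader table (not the study's data):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square inter-reader confusion matrix."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    p_expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical two-reader table over categories increased/decreased/unchanged
# (rows = reader 1, columns = reader 2); 21 of 23 patients on the diagonal.
table = [[8, 1, 0],
         [0, 7, 0],
         [0, 1, 6]]
print(round(cohens_kappa(table), 2))
```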

  6. Earthquake prediction in seismogenic areas of the Iberian Peninsula based on computational intelligence

    NASA Astrophysics Data System (ADS)

    Morales-Esteban, A.; Martínez-Álvarez, F.; Reyes, J.

    2013-05-01

    A method to predict earthquakes in two of the seismogenic areas of the Iberian Peninsula, based on Artificial Neural Networks (ANNs), is presented in this paper. ANNs have been widely used in many fields, but only very few and very recent studies have been conducted on earthquake prediction. Two kinds of predictions are provided in this study: a) the probability that an earthquake of magnitude equal to or larger than a preset threshold will occur within the next 7 days; and b) the probability that an earthquake within a limited magnitude interval will occur during the next 7 days. First, the physical fundamentals related to earthquake occurrence are explained. Second, the mathematical model underlying ANNs is explained and the configuration chosen is justified. Then, the ANNs have been trained in both areas: the Alborán Sea and the Western Azores-Gibraltar fault. Later, the ANNs have been tested in both areas for a period of time immediately subsequent to the training period. Statistical tests are provided showing meaningful results. Finally, ANNs were compared to other well-known classifiers, showing quantitatively and qualitatively better results. The authors expect that the results obtained will encourage researchers to conduct further research on this topic. Highlights: development of a system capable of predicting earthquakes for the next seven days; application of ANNs to earthquake prediction; use of geophysical information modelling the soil behaviour as the ANN's input data; successful analysis of one region with large seismic activity.
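
    The paper's ANN inputs and architecture are only summarised above; the sketch below is a generic stand-in showing the shape of the task, a binary classifier returning the probability of an above-threshold event in the next 7 days, with made-up seismicity indicators and scikit-learn's MLPClassifier in place of the authors' network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Hypothetical seismicity indicators per 7-day window (e.g. b-value, its change,
# elapsed time since the last event, released energy); not the paper's real inputs.
X = rng.normal(size=(600, 4))
# Toy label: 1 if an event above the threshold magnitude occurs in the next week.
y = (X[:, 0] - 0.8 * X[:, 1] + 0.3 * rng.normal(size=600) > 0.5).astype(int)

X_train, X_test = X[:500], X[500:]
y_train, y_test = y[:500], y[500:]

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(15,), max_iter=2000,
                                  random_state=0))
clf.fit(X_train, y_train)
print("hold-out accuracy:", round(clf.score(X_test, y_test), 2))
print("P(M >= threshold in next 7 days):", round(clf.predict_proba(X_test[:1])[0, 1], 2))
```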

  7. Simulating boundary layer transition with low-Reynolds-number k-epsilon turbulence models. I - An evaluation of prediction characteristics. II - An approach to improving the predictions

    NASA Technical Reports Server (NTRS)

    Schmidt, R. C.; Patankar, S. V.

    1991-01-01

    The capability of two k-epsilon low-Reynolds number (LRN) turbulence models, those of Jones and Launder (1972) and Lam and Bremhorst (1981), to predict transition in external boundary-layer flows subject to free-stream turbulence is analyzed. Both models correctly predict the basic qualitative aspects of boundary-layer transition with free stream turbulence, but for calculations started at low values of certain defined Reynolds numbers, the transition is generally predicted at unrealistically early locations. Also, the methods predict transition lengths significantly shorter than those found experimentally. An approach to overcoming these deficiencies without abandoning the basic LRN k-epsilon framework is developed. This approach limits the production term in the turbulent kinetic energy equation and is based on a simple stability criterion. It is correlated to the free-stream turbulence value. The modification is shown to improve the qualitative and quantitative characteristics of the transition predictions.

  8. Critiquing qualitative research.

    PubMed

    Beck, Cheryl Tatano

    2009-10-01

    The ability to critique research is a valuable skill that is fundamental to a perioperative nurse's ability to base his or her clinical practice on evidence derived from research. Criteria differ for critiquing a quantitative versus a qualitative study (ie, statistics are evaluated in a quantitative study, but not in a qualitative study). This article provides guidelines for assessing qualitative research. Excerpts from a published qualitative research report are summarized and then critiqued. Questions are provided that help evaluate different sections of a research study (eg, sample, data collection methods, data analysis).

  9. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
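
    The PAUSE system's actual endpoint, graph layout, and predicates are not described in this record. The sketch below only illustrates the general pattern of issuing a risk-oriented query against a SPARQL endpoint from Python; the endpoint URL and vocabulary are hypothetical.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint and vocabulary, standing in for a medical knowledge base
# that joins quantitative lab values with qualitative report text.
sparql = SPARQLWrapper("http://example.org/medical-kb/sparql")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
PREFIX ex: <http://example.org/medical#>
SELECT ?patient ?troponin ?reportText WHERE {
  ?patient ex:hasLabResult ?lab .
  ?lab ex:analyte "troponin" ; ex:value ?troponin .
  ?patient ex:hasRadiologyReport ?report .
  ?report ex:text ?reportText .
  FILTER(?troponin > 0.04)
}
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["patient"]["value"], row["troponin"]["value"])
```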

  10. Acceptability of the Predicting Abusive Head Trauma (PredAHT) clinical prediction tool: A qualitative study with child protection professionals.

    PubMed

    Cowley, Laura E; Maguire, Sabine; Farewell, Daniel M; Quinn-Scoggins, Harriet D; Flynn, Matthew O; Kemp, Alison M

    2018-05-09

    The validated Predicting Abusive Head Trauma (PredAHT) tool estimates the probability of abusive head trauma (AHT) based on combinations of six clinical features: head/neck bruising; apnea; seizures; rib fractures; long-bone fractures; and retinal hemorrhages. We aimed to determine the acceptability of PredAHT to child protection professionals. We conducted qualitative semi-structured interviews with 56 participants: clinicians (25), child protection social workers (10), legal practitioners (9, including 4 judges), police officers (8), and pathologists (4), purposively sampled across southwest United Kingdom. Interviews were recorded, transcribed and imported into NVivo for thematic analysis (38% double-coded). We explored participants' evaluations of PredAHT, their opinions about the optimal way to present the calculated probabilities, and their interpretation of probabilities in the context of suspected AHT. Clinicians, child protection social workers and police thought PredAHT would be beneficial as an objective adjunct to their professional judgment, to give them greater confidence in their decisions. Lawyers and pathologists appreciated its value for prompting multidisciplinary investigations, but were uncertain of its usefulness in court. Perceived disadvantages included possible over-reliance and false reassurance from a low score. Interpretations regarding which percentages equate to 'low', 'medium' or 'high' likelihood of AHT varied; participants preferred a precise % probability over these general terms. Participants would use PredAHT with provisos: if they received multi-agency training to define accepted risk thresholds for consistent interpretation; with knowledge of its development; and if it was accepted by colleagues. PredAHT may therefore increase professionals' confidence in their decision-making when investigating suspected AHT, but may be of less value in court. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Predictive equation of state method for heavy materials based on the Dirac equation and density functional theory

    NASA Astrophysics Data System (ADS)

    Wills, John M.; Mattsson, Ann E.

    2012-02-01

    Density functional theory (DFT) provides a formally predictive base for equation of state properties. Available approximations to the exchange/correlation functional provide accurate predictions for many materials in the periodic table. For heavy materials, however, DFT calculations, using available functionals, fail to provide quantitative predictions, and often fail to be even qualitative. This deficiency is due both to the lack of the appropriate confinement physics in the exchange/correlation functional and to approximations used to evaluate the underlying equations. In order to assess and develop accurate functionals, it is essential to eliminate all other sources of error. In this talk we describe an efficient first-principles electronic structure method based on the Dirac equation and compare the results obtained with this method with other methods generally used. Implications for high-pressure equation of state of relativistic materials are demonstrated in application to Ce and the light actinides. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological analysis.

  13. Interview-based Qualitative Research in Emergency Care Part II: Data Collection, Analysis and Results Reporting.

    PubMed

    Ranney, Megan L; Meisel, Zachary F; Choo, Esther K; Garro, Aris C; Sasson, Comilla; Morrow Guthrie, Kate

    2015-09-01

    Qualitative methods are increasingly being used in emergency care research. Rigorous qualitative methods can play a critical role in advancing the emergency care research agenda by allowing investigators to generate hypotheses, gain an in-depth understanding of health problems or specific populations, create expert consensus, and develop new intervention and dissemination strategies. In Part I of this two-article series, we provided an introduction to general principles of applied qualitative health research and examples of its common use in emergency care research, describing study designs and data collection methods most relevant to our field (observation, individual interviews, and focus groups). Here in Part II of this series, we outline the specific steps necessary to conduct a valid and reliable qualitative research project, with a focus on interview-based studies. These elements include building the research team, preparing data collection guides, defining and obtaining an adequate sample, collecting and organizing qualitative data, and coding and analyzing the data. We also discuss potential ethical considerations unique to qualitative research as it relates to emergency care research. © 2015 by the Society for Academic Emergency Medicine.

  14. Interview-Based Qualitative Research in Emergency Care Part II: Data Collection, Analysis and Results Reporting

    PubMed Central

    Ranney, Megan L.; Meisel, Zachary; Choo, Esther K.; Garro, Aris; Sasson, Comilla; Morrow, Kathleen

    2015-01-01

    Qualitative methods are increasingly being used in emergency care research. Rigorous qualitative methods can play a critical role in advancing the emergency care research agenda by allowing investigators to generate hypotheses, gain an in-depth understanding of health problems or specific populations, create expert consensus, and develop new intervention and dissemination strategies. In Part I of this two-article series, we provided an introduction to general principles of applied qualitative health research and examples of its common use in emergency care research, describing study designs and data collection methods most relevant to our field (observation, individual interviews, and focus groups). Here in Part II of this series, we outline the specific steps necessary to conduct a valid and reliable qualitative research project, with a focus on interview-based studies. These elements include building the research team, preparing data collection guides, defining and obtaining an adequate sample, collecting and organizing qualitative data, and coding and analyzing the data. We also discuss potential ethical considerations unique to qualitative research as it relates to emergency care research. PMID:26284572

  15. Single well productivity prediction of carbonate reservoir

    NASA Astrophysics Data System (ADS)

    Le, Xu

    2018-06-01

    It is very important to predict single-well productivity for the development of oilfields. The fracture structure of carbonate fractured-cavity reservoirs is complex, and single-well productivity does not vary in the same way as in sandstone reservoirs, so establishing productivity prediction methods for carbonate oil wells is essential. Based on reservoir conditions, several methods for predicting the productivity of carbonate reservoir wells were established for different reservoir types: (1) qualitatively analyse the single-well productivity relations corresponding to different reservoir types and predict productivity according to the reservoir type encountered by each well; (2) predict the productivity of carbonate reservoir wells using numerical simulation technology; (3) fit a productivity formula to the historical production data of a well and use it for single-well productivity prediction; (4) predict productivity using an analytical productivity formula for carbonate reservoir wells.

  16. Prediction of Air Pollutants Concentration Based on an Extreme Learning Machine: The Case of Hong Kong

    PubMed Central

    Zhang, Jiangshe; Ding, Weifu

    2017-01-01

    With the development of the economy and society all over the world, most metropolitan cities are experiencing elevated concentrations of ground-level air pollutants. It is urgent for local environmental and health agencies to predict and evaluate the concentrations of air pollutants. Feed-forward artificial neural networks have been widely used in the prediction of air pollutant concentrations. However, they have some drawbacks, such as a low convergence rate and convergence to local minima. The extreme learning machine for single-hidden-layer feed-forward neural networks tends to provide good generalization performance at an extremely fast learning speed. The major sources of air pollutants in Hong Kong are mobile, stationary, and trans-boundary sources. We propose predicting the concentration of air pollutants using extreme learning machines trained on data from eight air quality parameters at two monitoring stations in Hong Kong, Sham Shui Po and Tap Mun, over six years. The experimental results show that our proposed algorithm performs better on the Hong Kong data both quantitatively and qualitatively. In particular, our algorithm shows better predictive ability, with increased R2 values and decreased root mean square error values. PMID:28125034
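
    The extreme learning machine itself reduces to a few lines: random hidden-layer weights followed by a single least-squares solve for the output weights. The sketch below uses synthetic data standing in for the eight air-quality parameters; it is not the paper's tuned configuration.

```python
import numpy as np

class ELMRegressor:
    """Single-hidden-layer extreme learning machine (random features + lstsq)."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                     # random hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Toy stand-in for eight air-quality parameters predicting a pollutant level.
rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 8))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=1000)

model = ELMRegressor(n_hidden=100).fit(X[:800], y[:800])
pred = model.predict(X[800:])
rmse = np.sqrt(np.mean((pred - y[800:]) ** 2))
print("RMSE:", round(float(rmse), 3))
```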

  17. A Bayesian network model for predicting type 2 diabetes risk based on electronic health records

    NASA Astrophysics Data System (ADS)

    Xie, Jiang; Liu, Yan; Zeng, Xu; Zhang, Wu; Mei, Zhen

    2017-07-01

    An extensive, in-depth study of diabetes risk factors (DBRF) is of crucial importance to prevent (or reduce) the chance of suffering from type 2 diabetes (T2D). Accumulation of electronic health records (EHRs) makes it possible to build nonlinear relationships between risk factors and diabetes. However, current DBRF research mainly focuses on qualitative analyses, and the inconsistency of physical examination items means that risk factors are easily lost, which motivates the study of novel machine learning approaches for risk model development. In this paper, we use Bayesian networks (BNs) to analyze the relationship between physical examination information and T2D, and to quantify the link between risk factors and T2D. Furthermore, with the quantitative analyses of DBRF, we adopt EHRs and propose a machine learning approach based on BNs to predict the risk of T2D. The experiments demonstrate that our approach can lead to better predictive performance than the classical risk model.
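
    The network structure and conditional probabilities in the study are learned from EHR data and are not given in this record. The toy sketch below hand-specifies a two-risk-factor network and computes the posterior T2D risk by brute-force enumeration, just to show how a discrete BN turns risk-factor evidence into a quantified risk; all probabilities are invented.

```python
from itertools import product

# Toy, hand-specified network: Obesity -> T2D <- FamilyHistory.
# All probabilities are illustrative, not estimates from EHR data.
p_obese = {True: 0.3, False: 0.7}
p_famhx = {True: 0.2, False: 0.8}
p_t2d = {  # P(T2D = True | Obesity, FamilyHistory)
    (True, True): 0.40, (True, False): 0.20,
    (False, True): 0.15, (False, False): 0.05,
}

def posterior_t2d(obese=None, famhx=None):
    """P(T2D = True | evidence) by brute-force enumeration over the toy net."""
    num = den = 0.0
    for o, f in product([True, False], repeat=2):
        if obese is not None and o != obese:
            continue
        if famhx is not None and f != famhx:
            continue
        joint = p_obese[o] * p_famhx[f]
        num += joint * p_t2d[(o, f)]
        den += joint
    return num / den

print(round(posterior_t2d(), 3))                        # prior risk
print(round(posterior_t2d(obese=True, famhx=True), 3))  # risk given both factors
```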

  18. A human factors systems approach to understanding team-based primary care: a qualitative analysis

    PubMed Central

    Mundt, Marlon P.; Swedlund, Matthew P.

    2016-01-01

    Background. Research shows that high-functioning teams improve patient outcomes in primary care. However, there is no consensus on a conceptual model of team-based primary care that can be used to guide measurement and performance evaluation of teams. Objective. To qualitatively understand whether the Systems Engineering Initiative for Patient Safety (SEIPS) model could serve as a framework for creating and evaluating team-based primary care. Methods. We evaluated qualitative interview data from 19 clinicians and staff members from 6 primary care clinics associated with a large Midwestern university. All health care clinicians and staff in the study clinics completed a survey of their communication connections to team members. Social network analysis identified key informants for interviews by selecting the respondents with the highest frequency of communication ties as reported by their teammates. Semi-structured interviews focused on communication patterns, team climate and teamwork. Results. Themes derived from the interviews lent support to the SEIPS model components, such as the work system (Team, Tools and Technology, Physical Environment, Tasks and Organization), team processes and team outcomes. Conclusions. Our qualitative data support the SEIPS model as a promising conceptual framework for creating and evaluating primary care teams. Future studies of team-based care may benefit from using the SEIPS model to shift clinical practice to high functioning team-based primary care. PMID:27578837

  19. Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn

    ERIC Educational Resources Information Center

    Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi

    2012-01-01

    A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…

  20. Predicting pathogen growth during short-term temperature abuse of raw pork, beef, and poultry products: use of an isothermal-based predictive tool.

    PubMed

    Ingham, Steven C; Fanslau, Melody A; Burnham, Greg M; Ingham, Barbara H; Norback, John P; Schaffner, Donald W

    2007-06-01

    A computer-based tool (available at: www.wisc.edu/foodsafety/meatresearch) was developed for predicting pathogen growth in raw pork, beef, and poultry meat. The tool, THERM (temperature history evaluation for raw meats), predicts the growth of pathogens in pork and beef (Escherichia coli O157:H7, Salmonella serovars, and Staphylococcus aureus) and on poultry (Salmonella serovars and S. aureus) during short-term temperature abuse. The model was developed as follows: 25-g samples of raw ground pork, beef, and turkey were inoculated with a five-strain cocktail of the target pathogen(s) and held at isothermal temperatures from 10 to 43.3 degrees C. Log CFU per sample data were obtained for each pathogen and used to determine lag-phase duration (LPD) and growth rate (GR) by DMFit software. The LPD and GR were used to develop the THERM predictive tool, into which chronological time and temperature data for raw meat processing and storage are entered. The THERM tool then predicts a delta log CFU value for the desired pathogen-product combination. The accuracy of THERM was tested in 20 different inoculation experiments that involved multiple products (coarse-ground beef, skinless chicken breast meat, turkey scapula meat, and ground turkey) and temperature-abuse scenarios. With the time-temperature data from each experiment, THERM accurately predicted the pathogen growth and no growth (with growth defined as delta log CFU > 0.3) in 67, 85, and 95% of the experiments with E. coli O157:H7, Salmonella serovars, and S. aureus, respectively, and yielded fail-safe predictions in the remaining experiments. We conclude that THERM is a useful tool for qualitatively predicting pathogen behavior (growth and no growth) in raw meats. Potential applications include evaluating process deviations and critical limits under the HACCP (hazard analysis critical control point) system.
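
    THERM's fitted LPD and GR values are not reproduced here, but the integration step can be sketched: interpolate LPD and GR against temperature, "consume" the lag phase proportionally at each step of the chronological time-temperature history, then accumulate growth and compare the predicted delta log CFU with the 0.3-log growth threshold. All parameter values below are illustrative, not THERM's.

```python
import numpy as np

# Illustrative (not THERM's) isothermal parameters: lag-phase duration (h) and
# growth rate (log CFU/h) at a few temperatures, linearly interpolated between.
temps = np.array([10.0, 20.0, 30.0, 43.3])
lpd_hours = np.array([30.0, 8.0, 3.0, 2.0])
gr_log_per_h = np.array([0.01, 0.10, 0.35, 0.50])

def predict_delta_log(times_h, temps_c):
    """Integrate growth over a chronological time-temperature history.

    Lag is consumed proportionally at each step (sum of dt/LPD(T) reaching 1);
    after that, growth accumulates at GR(T) * dt. A deliberately simple sketch.
    """
    lag_used, delta_log = 0.0, 0.0
    for k in range(1, len(times_h)):
        dt = times_h[k] - times_h[k - 1]
        T = 0.5 * (temps_c[k] + temps_c[k - 1])
        if lag_used < 1.0:
            lag_used += dt / np.interp(T, temps, lpd_hours)
        else:
            delta_log += dt * np.interp(T, temps, gr_log_per_h)
    return delta_log

# Hypothetical 12-hour temperature-abuse history (hours, deg C).
history_t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)
history_T = np.array([4, 10, 18, 25, 25, 15, 6], dtype=float)

d = predict_delta_log(history_t, history_T)
print(f"predicted delta log CFU = {d:.2f} -> {'growth' if d > 0.3 else 'no growth'}")
```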

  1. Qualitative and quantitative descriptions of glenohumeral motion.

    PubMed

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as predicting outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of intention is required in order to achieve a required application. Clinical applicability of a model requires both descriptive and predictive output potentials, and as such, a high level of validation is required. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation to a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.

  2. Smoking Beliefs Among Chinese Secondary School Students: A Theory-Based Qualitative Study.

    PubMed

    Zhao, Xiang; White, Katherine M; Young, Ross McD; Obst, Patricia L

    2018-02-07

    China has the world's greatest number of smokers, but theory-based smoking interventions are rare. To develop an effective intervention, understanding the determinants of Chinese adolescent smoking is crucial. The Theory of Planned Behavior (TPB) is empirically supported to predict and assist in informing intervention strategies to change health-related behaviors. Based on the TPB, the elicitation of shared smoking beliefs among adolescents can inform future intervention designs among this at-risk population. We investigated the beliefs from six focus groups (N = 30) of one senior secondary school in Kunming, Yunnan Province, China. We used semi-structured questions based on the TPB framework, including prompts about behavioral (advantages and disadvantages), normative (important referents), and control (barriers and facilitators) beliefs. Following the Consensual Qualitative Research (CQR) methodology, data were discussed until consensus was reached. Auditing was undertaken by an external researcher. Seven domains (advantages, disadvantages, approvers, disapprovers, facilitators, barriers, and smoker images) were examined. Smoking as a gendered behavior, smoking as influenced by cultural and environmental contexts, smoking as a strategy to cope with stress, and awareness of the harm of smoking, are highlighted themes across domains. Data suggested an extended-TPB framework as an appropriate approach to adopt when addressing smoking beliefs among the target population. These beliefs can be utilized to inform future school-based interventions and public health campaigns targeting smoking among Chinese adolescents. A modified TPB approach has potential for future smoking interventions among Chinese adolescents. Beliefs elicited in this study form a strong basis for designing a location- and population-specific antismoking programme. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved.

  3. Integrating Buprenorphine Treatment into Office-based Practice: a Qualitative Study

    PubMed Central

    Irwin, Kevin S.; Jones, Emlyn S.; Becker, William C.; Tetrault, Jeanette M.; Sullivan, Lynn E.; Hansen, Helena; O’Connor, Patrick G.; Schottenfeld, Richard S.; Fiellin, David A.

    2008-01-01

    BACKGROUND Despite the availability and demonstrated effectiveness of office-based buprenorphine maintenance treatment (BMT), the systematic examination of physicians’ attitudes towards this new medical practice has been largely neglected. OBJECTIVE To identify facilitators and barriers to the potential or actual implementation of BMT by office-based medical providers. DESIGN Qualitative study using individual and group semi-structured interviews. PARTICIPANTS Twenty-three practicing office-based physicians in New England. APPROACH Interviews were audiotaped, transcribed, and entered into a qualitative software program. The transcripts were thematically coded using the constant comparative method by a multidisciplinary team. RESULTS Eighty percent of the physicians were white; 55% were women. The mean number of years since graduating medical school was 14 (SD = 10). The primary areas of clinical specialization were internal medicine (50%), infectious disease (20%), and addiction medicine (15%). Physicians identified physician, patient, and logistical factors that would either facilitate or serve as a barrier to their integration of BMT into clinical practice. Physician facilitators included promoting continuity of patient care, positive perceptions of BMT, and viewing BMT as a positive alternative to methadone maintenance. Physician barriers included competing activities, lack of interest, and lack of expertise in addiction treatment. Physicians’ perceptions of patient-related barriers included concerns about confidentiality and cost, and low motivation for treatment. Perceived logistical barriers included lack of remuneration for BMT, limited ancillary support for physicians, not enough time, and a perceived low prevalence of opioid dependence in physicians’ practices. CONCLUSIONS Addressing physicians’ perceptions of facilitators and barriers to BMT is crucial to supporting the further expansion of BMT into primary care and office-based practices

  4. A human factors systems approach to understanding team-based primary care: a qualitative analysis.

    PubMed

    Mundt, Marlon P; Swedlund, Matthew P

    2016-12-01

    Research shows that high-functioning teams improve patient outcomes in primary care. However, there is no consensus on a conceptual model of team-based primary care that can be used to guide measurement and performance evaluation of teams. To qualitatively understand whether the Systems Engineering Initiative for Patient Safety (SEIPS) model could serve as a framework for creating and evaluating team-based primary care. We evaluated qualitative interview data from 19 clinicians and staff members from 6 primary care clinics associated with a large Midwestern university. All health care clinicians and staff in the study clinics completed a survey of their communication connections to team members. Social network analysis identified key informants for interviews by selecting the respondents with the highest frequency of communication ties as reported by their teammates. Semi-structured interviews focused on communication patterns, team climate and teamwork. Themes derived from the interviews lent support to the SEIPS model components, such as the work system (Team, Tools and Technology, Physical Environment, Tasks and Organization), team processes and team outcomes. Our qualitative data support the SEIPS model as a promising conceptual framework for creating and evaluating primary care teams. Future studies of team-based care may benefit from using the SEIPS model to shift clinical practice to high functioning team-based primary care. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Logic-based models in systems biology: a predictive and parameter-free network analysis method†

    PubMed Central

    Wynn, Michelle L.; Consul, Nikita; Merajver, Sofia D.

    2012-01-01

    Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network’s dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples. PMID:23072820

  6. The salt marsh vegetation spread dynamics simulation and prediction based on conditions optimized CA

    NASA Astrophysics Data System (ADS)

    Guan, Yujuan; Zhang, Liquan

    2006-10-01

    The biodiversity conservation and management of salt marsh vegetation rely on processing its spatial information. At present, more attention is paid to classification surveys and to qualitative descriptions of vegetation dynamics based on interpreted remote sensing images than to simulating and predicting those dynamics quantitatively, which is of greater importance for managing and planning salt marsh vegetation. The aim of this paper is to build a large-scale dynamic model and to provide a virtual laboratory that researchers can run according to their requirements. First, the characteristics of cellular automata were analysed, leading to the conclusion that a CA model must be extended geographically, under varying space-time conditions, for its results to match reality accurately. Based on the conventional cellular automata model, the authors introduced several new conditions to optimize it for simulating the vegetation objectively, such as elevation, growth speed, invading ability, variation and inheritance. In this way CA cells were unified with remote sensing image pixels, cell neighbours with pixel neighbours, and cell rules with the nature of the plants. JiuDuanSha was taken as the test site; it mainly holds Phragmites australis (P. australis), Scirpus mariqueter (S. mariqueter) and Spartina alterniflora (S. alterniflora) communities. The paper explores the process of simulating and predicting changes in these salt marsh vegetation communities with the conditions-optimized CA (COCA) model, and examines the links among data, statistical models, and ecological predictions. This study exploited the potential of applying the conditions-optimized CA modelling technique to this problem.
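
    The abstract names the optimizing conditions (elevation, growth speed, invading ability, variation, inheritance) but not the transition rules themselves, so the following Python sketch is only a hypothetical illustration of a conditions-weighted cellular automaton for vegetation spread; the species constants, elevation window and colonisation rule are invented for illustration and are not the COCA model.

        # Minimal illustrative cellular automaton for vegetation spread (not the COCA model).
        import numpy as np

        EMPTY, PHRAGMITES, SCIRPUS, SPARTINA = 0, 1, 2, 3
        INVADE = {PHRAGMITES: 0.6, SCIRPUS: 0.3, SPARTINA: 0.8}   # assumed invading abilities

        def step(grid, elevation, rng):
            """One synchronous update: a cell may be colonised by the neighbouring species
            with the highest weighted presence, provided the elevation suits it."""
            new = grid.copy()
            rows, cols = grid.shape
            for r in range(rows):
                for c in range(cols):
                    scores = {}
                    for dr in (-1, 0, 1):                  # Moore neighbourhood counts per species
                        for dc in (-1, 0, 1):
                            if dr == dc == 0:
                                continue
                            rr, cc = (r + dr) % rows, (c + dc) % cols
                            s = grid[rr, cc]
                            if s != EMPTY:
                                scores[s] = scores.get(s, 0.0) + INVADE[s]
                    if not scores:
                        continue
                    winner, strength = max(scores.items(), key=lambda kv: kv[1])
                    suitable = 0.5 < elevation[r, c] < 2.5  # assumed elevation window (m)
                    if suitable and rng.random() < strength / 8.0:
                        new[r, c] = winner
            return new

        rng = np.random.default_rng(0)
        grid = rng.choice([EMPTY, PHRAGMITES, SCIRPUS, SPARTINA], size=(50, 50), p=[0.7, 0.1, 0.1, 0.1])
        elevation = rng.uniform(0.0, 3.0, size=(50, 50))
        for _ in range(10):
            grid = step(grid, elevation, rng)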

  7. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking-based predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking-based predictions and 5 of 19 using expert panel predictions. Both the expert panel predictions and the smoking-based predictions predicted effect sizes for the diet and sun protection constructs poorly. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the need to strengthen and revise theory with empirical data.

  8. Reasoning about energy in qualitative simulation

    NASA Technical Reports Server (NTRS)

    Fouche, Pierre; Kuipers, Benjamin J.

    1992-01-01

    While the possible behaviors of a mechanism that are consistent with an incomplete state of knowledge can be predicted through qualitative modeling and simulation, spurious behaviors corresponding to no solution of any ordinary differential equation consistent with the model may also be generated. The present method for energy-related reasoning eliminates an important source of spurious behaviors, as demonstrated by its application to a nonlinear, proportional-integral controlled system. It is shown that qualitative properties of such a system, such as stability and zero-offset control, are captured by the simulation.

  9. Highly predictive and interpretable models for PAMPA permeability.

    PubMed

    Sun, Hongmao; Nguyen, Kimloan; Kerns, Edward; Yan, Zhengyin; Yu, Kyeong Ri; Shah, Pranav; Jadhav, Ajit; Xu, Xin

    2017-02-01

    Cell membrane permeability is an important determinant for oral absorption and bioavailability of a drug molecule. An in silico model predicting drug permeability is described, which is built based on a large permeability dataset of 7488 compound entries or 5435 structurally unique molecules measured by the same lab using parallel artificial membrane permeability assay (PAMPA). On the basis of customized molecular descriptors, the support vector regression (SVR) model trained with 4071 compounds with quantitative data is able to predict the remaining 1364 compounds with the qualitative data with an area under the curve of receiver operating characteristic (AUC-ROC) of 0.90. The support vector classification (SVC) model trained with half of the whole dataset comprised of both the quantitative and the qualitative data produced accurate predictions to the remaining data with the AUC-ROC of 0.88. The results suggest that the developed SVR model is highly predictive and provides medicinal chemists a useful in silico tool to facilitate design and synthesis of novel compounds with optimal drug-like properties, and thus accelerate the lead optimization in drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
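
    The descriptors behind the published model are customized and not listed in the abstract, so the sketch below only illustrates the reported workflow, training an SVR on quantitatively measured compounds and scoring the qualitatively labelled remainder, using scikit-learn on placeholder data; the features, kernel and hyper-parameters are assumptions, not the published settings.

        # Illustrative SVR/SVC workflow for a permeability dataset (descriptors are placeholders).
        import numpy as np
        from sklearn.svm import SVR, SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        X_quant = rng.normal(size=(4071, 20))          # placeholder descriptors, compounds with quantitative Pe
        y_quant = X_quant[:, 0] - 0.5 * X_quant[:, 1] + rng.normal(0, 0.3, 4071)   # synthetic "log Pe"
        X_qual = rng.normal(size=(1364, 20))           # placeholder descriptors, compounds with qualitative labels
        y_qual = (X_qual[:, 0] - 0.5 * X_qual[:, 1] > 0).astype(int)               # 1 = permeable, 0 = poor

        # Regression model trained on the quantitative data...
        reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        reg.fit(X_quant, y_quant)

        # ...evaluated as a ranker against the qualitative labels (higher predicted Pe = permeable).
        scores = reg.predict(X_qual)
        print("AUC-ROC of SVR scores vs qualitative labels:", round(roc_auc_score(y_qual, scores), 2))

        # Classification model trained on one half of the qualitatively labelled set.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        clf.fit(X_qual[:682], y_qual[:682])
        proba = clf.predict_proba(X_qual[682:])[:, 1]
        print("AUC-ROC of SVC on held-out half:", round(roc_auc_score(y_qual[682:], proba), 2))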

  10. Iranian Nursing Students' Experiences of Case-Based Learning: A Qualitative Study.

    PubMed

    Gholami, Mohammad; Saki, Mandana; Toulabi, Tahereh; Kordestani Moghadam, Parastou; Hossein Pour, Amir Hossein; Dostizadeh, Reza

    The purpose of this study was to explore the experiences of undergraduate nursing students of the implementation of case-based learning in an emergency nursing course. The present qualitative study was conducted using the qualitative content analysis method. Participants consisted of 18 third year undergraduate nursing students selected through purposive sampling, which continued until the saturation of the data. Data were collected using semistructured interviews and were analyzed concurrently with their collection through the constant comparison method. The process of data analysis led to the emergence of 4 main themes, including "the continuum of knowledge from production to transfer competence," "a positive atmosphere of interaction," "the process of stress relieving," "the sense of role-playing in professional life," and the emergence of 12 subthemes signifying participants' experiences and perceptions with regard to the implementation of case-based learning (CBL) in teaching the emergency nursing course. The results of the present study showed that CBL is a stressful but pleasant and empowering experience for Iranian nursing students that develops critical thinking and stress management skills, reinforces peers' potentials, improves diagnostic abilities, and helps acquire professional competencies for use in future practices through the creation of a positive environment. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    PubMed

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has existed for decades, very few qualitative health researchers report using it. This may be due to the difficulties one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are basically data management tools that support the researcher during analysis.

  12. Predicting links based on knowledge dissemination in complex network

    NASA Astrophysics Data System (ADS)

    Zhou, Wen; Jia, Yifan

    2017-04-01

    Link prediction is the task of mining the missing links in networks or predicting the next vertex pair to be connected by a link. Many link prediction methods have been inspired by the evolutionary processes of networks. In this paper, a new mechanism for the formation of complex networks, called knowledge dissemination (KD), is proposed under the assumption that knowledge disseminates through the paths of a network. Accordingly, a new link prediction method, knowledge dissemination based link prediction (KDLP), is proposed to test KD. KDLP characterizes vertex similarity based on knowledge quantity (KQ), which measures the importance of a vertex through the H-index. Extensive numerical simulations on six real-world networks demonstrate that KDLP is a strong link prediction method which achieves higher prediction accuracy than four well-known similarity measures, including common neighbors, the local path index, average commute time and the matrix forest index. Furthermore, based on the common conclusion that an excellent link prediction method reveals a good evolving mechanism, the experimental results suggest that KD is a plausible network evolving mechanism for the formation of complex networks.
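
    The abstract defines knowledge quantity through the vertex H-index but does not give the full KDLP similarity formula, so the sketch below shows only the H-index part (the largest h such that a node has at least h neighbours of degree at least h) plus one naive, assumed way of turning it into a pairwise score over common neighbours; the combination rule is illustrative, not the published index.

        # Vertex H-index ("lobby index") and a naive common-neighbour score built on it.
        import networkx as nx

        def vertex_h_index(G, v):
            """Largest h such that v has at least h neighbours with degree >= h."""
            degs = sorted((G.degree(u) for u in G.neighbors(v)), reverse=True)
            h = 0
            for i, d in enumerate(degs, start=1):
                if d >= i:
                    h = i
                else:
                    break
            return h

        def naive_kq_similarity(G, x, y):
            """Assumed illustrative score: sum of H-indices of the common neighbours of x and y."""
            common = set(G.neighbors(x)) & set(G.neighbors(y))
            return sum(vertex_h_index(G, z) for z in common)

        G = nx.karate_club_graph()
        pairs = [(u, v) for u in G for v in G if u < v and not G.has_edge(u, v)]
        ranked = sorted(pairs, key=lambda p: naive_kq_similarity(G, *p), reverse=True)
        print("top candidate links:", ranked[:5])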

  13. Neural mechanisms of rhythm-based temporal prediction: Delta phase-locking reflects temporal predictability but not rhythmic entrainment.

    PubMed

    Breska, Assaf; Deouell, Leon Y

    2017-02-01

    Predicting the timing of upcoming events enables efficient resource allocation and action preparation. Rhythmic streams, such as music, speech, and biological motion, constitute a pervasive source for temporal predictions. Widely accepted entrainment theories postulate that rhythm-based predictions are mediated by synchronizing low-frequency neural oscillations to the rhythm, as indicated by increased phase concentration (PC) of low-frequency neural activity for rhythmic compared to random streams. However, we show here that PC enhancement in scalp recordings is not specific to rhythms but is observed to the same extent in less periodic streams if they enable memory-based prediction. This is inconsistent with the predictions of a computational entrainment model of stronger PC for rhythmic streams. Anticipatory change in alpha activity and facilitation of electroencephalogram (EEG) manifestations of response selection are also comparable between rhythm- and memory-based predictions. However, rhythmic sequences uniquely result in obligatory depression of preparation-related premotor brain activity when an on-beat event is omitted, even when it is strategically beneficial to maintain preparation, leading to larger behavioral costs for violation of prediction. Thus, while our findings undermine the validity of PC as a sign of rhythmic entrainment, they constitute the first electrophysiological dissociation, to our knowledge, between mechanisms of rhythmic predictions and of memory-based predictions: the former obligatorily lead to resonance-like preparation patterns (that are in line with entrainment), while the latter allow flexible resource allocation in time regardless of periodicity in the input. Taken together, they delineate the neural mechanisms of three distinct modes of preparation: continuous vigilance, interval-timing-based prediction and rhythm-based prediction.

  14. Neural mechanisms of rhythm-based temporal prediction: Delta phase-locking reflects temporal predictability but not rhythmic entrainment

    PubMed Central

    Deouell, Leon Y.

    2017-01-01

    Predicting the timing of upcoming events enables efficient resource allocation and action preparation. Rhythmic streams, such as music, speech, and biological motion, constitute a pervasive source for temporal predictions. Widely accepted entrainment theories postulate that rhythm-based predictions are mediated by synchronizing low-frequency neural oscillations to the rhythm, as indicated by increased phase concentration (PC) of low-frequency neural activity for rhythmic compared to random streams. However, we show here that PC enhancement in scalp recordings is not specific to rhythms but is observed to the same extent in less periodic streams if they enable memory-based prediction. This is inconsistent with the predictions of a computational entrainment model of stronger PC for rhythmic streams. Anticipatory change in alpha activity and facilitation of electroencephalogram (EEG) manifestations of response selection are also comparable between rhythm- and memory-based predictions. However, rhythmic sequences uniquely result in obligatory depression of preparation-related premotor brain activity when an on-beat event is omitted, even when it is strategically beneficial to maintain preparation, leading to larger behavioral costs for violation of prediction. Thus, while our findings undermine the validity of PC as a sign of rhythmic entrainment, they constitute the first electrophysiological dissociation, to our knowledge, between mechanisms of rhythmic predictions and of memory-based predictions: the former obligatorily lead to resonance-like preparation patterns (that are in line with entrainment), while the latter allow flexible resource allocation in time regardless of periodicity in the input. Taken together, they delineate the neural mechanisms of three distinct modes of preparation: continuous vigilance, interval-timing-based prediction and rhythm-based prediction. PMID:28187128

  15. Model-based prediction of myelosuppression and recovery based on frequent neutrophil monitoring.

    PubMed

    Netterberg, Ida; Nielsen, Elisabet I; Friberg, Lena E; Karlsson, Mats O

    2017-08-01

    To investigate whether a more frequent monitoring of the absolute neutrophil counts (ANC) during myelosuppressive chemotherapy, together with model-based predictions, can improve therapy management, compared to the limited clinical monitoring typically applied today. Daily ANC in chemotherapy-treated cancer patients were simulated from a previously published population model describing docetaxel-induced myelosuppression. The simulated values were used to generate predictions of the individual ANC time-courses, given the myelosuppression model. The accuracy of the predicted ANC was evaluated under a range of conditions with reduced amount of ANC measurements. The predictions were most accurate when more data were available for generating the predictions and when making short forecasts. The inaccuracy of ANC predictions was highest around nadir, although a high sensitivity (≥90%) was demonstrated to forecast Grade 4 neutropenia before it occurred. The time for a patient to recover to baseline could be well forecasted 6 days (±1 day) before the typical value occurred on day 17. Daily monitoring of the ANC, together with model-based predictions, could improve anticancer drug treatment by identifying patients at risk for severe neutropenia and predicting when the next cycle could be initiated.
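
    The underlying population model is published elsewhere; as orientation, the widely used semi-mechanistic structure for chemotherapy-induced myelosuppression is a proliferating pool feeding a chain of transit compartments into the circulating neutrophil pool, with feedback on proliferation. The sketch below integrates such a generic transit-compartment model with scipy; the drug-effect function and all parameter values are illustrative assumptions, not the published estimates.

        # Generic transit-compartment (Friberg-type) myelosuppression model; parameters are illustrative.
        import numpy as np
        from scipy.integrate import solve_ivp

        ktr = 4.0 / 100.0      # 1/h, approx. number of transit compartments / mean transit time (assumed)
        gamma = 0.16           # feedback exponent (assumed)
        circ0 = 5.0            # baseline circulating ANC, 1e9 cells/L (assumed)

        def drug_effect(t):
            """Assumed exposure-driven inhibition of proliferation, bounded below 1."""
            conc = 2.0 * np.exp(-0.1 * t)          # hypothetical mono-exponential concentration
            return min(0.99, 0.3 * conc)

        def rhs(t, y):
            prol, t1, t2, t3, circ = y
            feedback = (circ0 / max(circ, 1e-6)) ** gamma
            dprol = ktr * prol * (1.0 - drug_effect(t)) * feedback - ktr * prol
            return [dprol,
                    ktr * (prol - t1),
                    ktr * (t1 - t2),
                    ktr * (t2 - t3),
                    ktr * t3 - ktr * circ]

        sol = solve_ivp(rhs, (0.0, 21 * 24.0), [circ0] * 5, max_step=1.0)
        anc = sol.y[4]
        print("predicted ANC nadir:", round(anc.min(), 2), "on day", round(sol.t[anc.argmin()] / 24.0, 1))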

  16. The qualitative research proposal.

    PubMed

    Klopper, H

    2008-12-01

    Qualitative research in the health sciences has had to overcome many prejudices and a number of misunderstandings, but today qualitative research is as acceptable as quantitative research designs and is widely funded and published. Writing the proposal for a qualitative study, however, can be a challenging feat, due to the emergent nature of the qualitative research design and the description of the methodology as a process. Even today, many sub-standard proposals are still submitted to post-graduate evaluation committees and for funding consideration. This problem led the researcher to develop a framework to guide the qualitative researcher in writing the proposal of a qualitative study, based on the following research questions: (i) What is the process of writing a qualitative research proposal? and (ii) What do the structure and layout of a qualitative proposal look like? The purpose of this article is to discuss the process of writing the qualitative research proposal and to describe the structure and layout of a qualitative research proposal. The process of writing a qualitative research proposal is discussed with regard to the most important questions that need to be answered in the proposal, with consideration of the guidelines of being practical, being persuasive, making broader links, aiming for crystal clarity and planning before you write. The structure of the qualitative research proposal is discussed with regard to its key sections, namely the cover page, abstract, introduction, review of the literature, research problem and research questions, research purpose and objectives, research paradigm, research design, research method, ethical considerations, dissemination plan, budget and appendices.

  17. Difficulties Faced in Social Club Activities: A Qualitative Study Based on Teacher Opinions

    ERIC Educational Resources Information Center

    Keçe, Murat

    2015-01-01

    The purpose of this study is to scrutinize the problems encountered in social club activities based on opinions of club advisors. This study was conducted in line with qualitative research methods using the interview technique to collect data. Therefore, interviews were held with 21 club advisors included in the study group. A category analysis, a…

  18. Quantitative and qualitative trophectoderm grading allows for prediction of live birth and gender.

    PubMed

    Ebner, Thomas; Tritscher, Katja; Mayer, Richard B; Oppelt, Peter; Duba, Hans-Christoph; Maurer, Maria; Schappacher-Tilp, Gudrun; Petek, Erwin; Shebl, Omar

    2016-01-01

    Prolonged in vitro culture is thought to affect pre- and postnatal development of the embryo. This prospective study was set up to determine whether the quality/size of the inner cell mass (ICM) (from which the fetus ultimately develops) and of the trophectoderm (TE) (from which the placenta ultimately develops) are reflected in birth and placental weight, healthy live-birth rate, and gender after fresh and frozen single blastocyst transfer. In 225 patients, qualitative scoring of blastocysts was done according to the criteria of expansion, ICM, and TE appearance. In parallel, all three parameters were quantified semi-automatically. TE quality and cell number were the only parameters that predicted treatment outcome. In detail, pregnancies that continued on to a live birth could be distinguished from those that aborted on the basis of TE grade and cell number. Male blastocysts had a 2.53-fold higher chance of showing TE of quality A compared to female ones. There was no correlation between the appearance of either cell lineage and birth or placental weight, respectively. The presented correlation of TE with outcome indicates that TE scoring could replace ICM scoring in terms of priority. This would automatically require a rethinking process in terms of blastocyst selection and cryopreservation strategy.

  19. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  20. Model-free and model-based reward prediction errors in EEG.

    PubMed

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Predicting solar radiation based on available weather indicators

    NASA Astrophysics Data System (ADS)

    Sauer, Frank Joseph

    Solar radiation prediction models are complex and require software that is not available to the household investor, even though the processing power of a normal desktop or laptop computer is sufficient to run comparable models. This barrier to entry for the average consumer can be removed by a model simple enough to be calculated by hand if necessary. Solar radiation has historically been difficult to predict, and accurate models carry significant assumptions and restrictions on their use. Previous methods have been limited to linear relationships, to particular locations, or to input data describing a single atmospheric condition. This research takes a novel approach by combining two techniques within the computational limits of a household computer: clustering and Hidden Markov Models (HMMs). Clustering helps limit the large observation space that restricts the use of HMMs. Instead of using continuous data, which would require significantly more computation, the cluster can be used as a qualitative descriptor of each observation. HMMs incorporate a level of uncertainty and take into account the indirect relationship between meteorological indicators and solar radiation. This reduces the complexity of the model enough for it to be simply understood and accessible to the average household investor. The solar radiation is treated as an unobservable state that each household is unable to measure. The high temperature and the sky coverage are already available through the local or preferred source of weather information. By using the next day's prediction for high temperature and sky coverage, the model groups the data and then predicts the most likely range of radiation. This model uses simple techniques and calculations to give a broad estimate of the solar radiation where no other universal model exists for the average household.
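
    The dissertation pairs clustering with a hidden Markov model, but the exact formulation is not given in the abstract; the sketch below shows only the simpler front half of such a pipeline, K-means clustering of (high temperature, sky cover) observations followed by an empirical lookup of the most frequent radiation bin per cluster, on synthetic data. It stands in for, and is not, the HMM used in the work.

        # Discretise weather observations with K-means, then map clusters to radiation bins.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        n = 365
        high_temp = rng.normal(20, 8, n)                       # daily high temperature, deg C (synthetic)
        sky_cover = rng.uniform(0, 1, n)                       # fraction of sky covered (synthetic)
        radiation = 25 * (1 - 0.7 * sky_cover) + 0.4 * high_temp + rng.normal(0, 2, n)  # MJ/m^2, synthetic
        rad_bin = np.digitize(radiation, bins=[10, 20, 30])    # four qualitative radiation ranges

        X = np.column_stack([high_temp, sky_cover])
        km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(X)

        # For each weather cluster, record the most frequent radiation bin seen historically.
        lookup = {c: np.bincount(rad_bin[km.labels_ == c]).argmax() for c in range(6)}

        # Predict tomorrow's radiation range from the forecast high temperature and sky cover.
        forecast = np.array([[24.0, 0.2]])
        print("predicted radiation bin:", lookup[int(km.predict(forecast)[0])])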

  2. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Gregory M; Key, Brian P; Zerkle, David K

    2009-01-01

    The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available for conducting security risk assessments of these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
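
    The report's entropy measures are extensions of previous work and are not restated in the abstract; for orientation only, a standard non-specificity measure for a basic probability assignment m in generalized information theory is N(m) = sum over focal sets A of m(A)·log2|A|. The toy computation below evaluates that quantity; the conflict measure used in the report is not reproduced here.

        # Hartley-based non-specificity of a basic probability assignment (toy example).
        from math import log2

        # Focal sets over a frame {low, medium, high} with their masses (must sum to 1).
        bpa = {
            frozenset({"low"}): 0.2,
            frozenset({"medium", "high"}): 0.5,
            frozenset({"low", "medium", "high"}): 0.3,
        }

        def non_specificity(m):
            """N(m) = sum_A m(A) * log2(|A|); zero for a precise (singleton-only) assignment."""
            return sum(mass * log2(len(A)) for A, mass in m.items())

        print("non-specificity (bits):", non_specificity(bpa))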

  3. Conceptual bases of Christian, faith-based substance abuse rehabilitation programs: qualitative analysis of staff interviews.

    PubMed

    McCoy, Lisa K; Hermos, John A; Bokhour, Barbara G; Frayne, Susan M

    2004-09-01

    Faith-based substance abuse rehabilitation programs provide residential treatment for many substance abusers. To determine the key governing concepts of such programs, we conducted semi-structured interviews with a sample of eleven clinical and administrative staff referred to us by program directors at six Evangelical Christian, faith-based, residential rehabilitation programs representing two large, nationwide networks. Qualitative analysis using grounded theory methods examined how spirituality is incorporated into treatment and elicited key theories of addiction and recovery. Although the programs contain comprehensive secular components, their core activities are strongly rooted in a Christian belief system that informs their understanding of addiction and recovery and drives the treatment format. These governing conceptions, that addiction stems from attempts to fill a spiritual void through substance use and that recovery comes through salvation and a long-term relationship with God, provide an explicit, theory-driven model upon which they base their core treatment activities. Knowledge of these core concepts and practices should be helpful to clinicians in considering referrals to faith-based recovery programs.

  4. Alignment-Based Prediction of Sites of Metabolism.

    PubMed

    de Bruyn Kops, Christina; Friedrich, Nils-Ole; Kirchmair, Johannes

    2017-06-26

    Prediction of metabolically labile atom positions in a molecule (sites of metabolism) is a key component of the simulation of xenobiotic metabolism as a whole, providing crucial information for the development of safe and effective drugs. In 2008, an exploratory study was published in which sites of metabolism were derived based on molecular shape- and chemical feature-based alignment to a molecule whose site of metabolism (SoM) had been determined by experiments. We present a detailed analysis of the breadth of applicability of alignment-based SoM prediction, including transfer of the approach from a structure- to ligand-based method and extension of the applicability of the models from cytochrome P450 2C9 to all cytochrome P450 isozymes involved in drug metabolism. We evaluate the effect of molecular similarity of the query and reference molecules on the ability of this approach to accurately predict SoMs. In addition, we combine the alignment-based method with a leading chemical reactivity model to take reactivity into account. The combined model yielded superior performance in comparison to the alignment-based approach and the reactivity models with an average area under the receiver operating characteristic curve of 0.85 in cross-validation experiments. In particular, early enrichment was improved, as evidenced by higher BEDROC scores (mean BEDROC = 0.59 for α = 20.0, mean BEDROC = 0.73 for α = 80.5).

  5. Qualitative and temporal reasoning in engine behavior analysis

    NASA Technical Reports Server (NTRS)

    Dietz, W. E.; Stamps, M. E.; Ali, M.

    1987-01-01

    Numerical simulation models, engine experts, and experimental data are used to generate qualitative and temporal representations of abnormal engine behavior. Engine parameters monitored during operation are used to generate qualitative and temporal representations of actual engine behavior. Similarities between the representations of failure scenarios and the actual engine behavior are used to diagnose fault conditions which have already occurred, or are about to occur; to increase the surveillance by the monitoring system of relevant engine parameters; and to predict likely future engine behavior.

  6. The development of a qualitative dynamic attribute value model for healthcare institutes.

    PubMed

    Lee, Wan-I

    2010-01-01

    Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model which provides insight into customers' values for healthcare institute managers. An initial open-ended questionnaire survey was conducted to select participants purposefully. A total of 427 questionnaires were administered in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds), and 419 questionnaires were received within nine weeks. Qualitative in-depth interviews were then used to explore customers' perspectives of value and to build a model of partial differential equations. This study identifies nine categories of value, including cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value, to construct an objective network for customer value and a qualitative dynamic attribute value model, where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty is based on trust and involves buzz marketing, brand and image. The customer value of the current instance is useful for traversing original customer attributes and identifying customers on different service shares.

  7. A Guide to Writing a Qualitative Systematic Review Protocol to Enhance Evidence-Based Practice in Nursing and Health Care.

    PubMed

    Butler, Ashleigh; Hall, Helen; Copnell, Beverley

    2016-06-01

    The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. To guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: Development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process. Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the

  8. Entropy-based link prediction in weighted networks

    NASA Astrophysics Data System (ADS)

    Xu, Zhongqi; Pu, Cunlai; Ramiz Sharafat, Rajput; Li, Lunbo; Yang, Jian

    2017-01-01

    Information entropy has been proved to be an effective tool to quantify the structural importance of complex networks. In previous work (Xu et al., 2016), we measured the contribution of a path in link prediction with information entropy. In this paper, we further quantify the contribution of a path with both path entropy and path weight, and propose a weighted prediction index based on the contributions of paths, namely Weighted Path Entropy (WPE), to improve prediction accuracy in weighted networks. Empirical experiments on six weighted real-world networks show that WPE achieves higher prediction accuracy than three typical weighted indices.
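
    The WPE formula itself is only given in the paper, so the sketch below is a generic weighted path-count similarity over simple paths up to length three, intended only to show the kind of path-based, weight-aware index being improved upon; the damping constant and combination rule are assumptions, not WPE.

        # Generic weighted path-count similarity between non-adjacent node pairs (not the WPE index).
        import networkx as nx

        def weighted_path_score(G, x, y, cutoff=3, damping=0.1):
            """Sum over simple x-y paths (<= cutoff edges) of damping^(extra length) * product of edge weights."""
            score = 0.0
            for path in nx.all_simple_paths(G, x, y, cutoff=cutoff):
                w = 1.0
                for u, v in zip(path, path[1:]):
                    w *= G[u][v].get("weight", 1.0)
                score += (damping ** (len(path) - 2)) * w
            return score

        G = nx.Graph()
        G.add_weighted_edges_from([(1, 2, 0.9), (2, 3, 0.4), (1, 4, 0.7), (4, 3, 0.8), (3, 5, 0.5)])
        candidates = [(1, 3), (1, 5), (2, 4)]
        ranked = sorted(candidates, key=lambda p: weighted_path_score(G, *p), reverse=True)
        print("candidate pairs ranked by score:", ranked)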

  9. Experiences of patient-centredness with specialized community-based care: a systematic review and qualitative meta-synthesis.

    PubMed

    Winsor, S; Smith, A; Vanstone, M; Giacomini, M; Brundisini, F K; DeJean, D

    2013-01-01

    Specialized community-based care (SCBC) endeavours to help patients manage chronic diseases by formalizing the link between primary care providers and other community providers with specialized training. Many types of health care providers and community-based programs are employed in SCBC. Patient-centred care focuses on patients' psychosocial experience of health and illness to ensure that patients' care plans are modelled on their individual values, preferences, spirituality, and expressed needs. To synthesize qualitative research on patient and provider experiences of SCBC interventions and health care delivery models, using the core principles of patient-centredness. This report synthesizes 29 primary qualitative studies on the topic of SCBC interventions for patients with chronic conditions. Included studies were published between 2002 and 2012, and followed adult patients in North America, Europe, Australia, and New Zealand. Qualitative meta-synthesis was used to integrate findings across primary research studies. Three core themes emerged from the analysis: patients' health beliefs affect their participation in SCBC interventions; patients' experiences with community-based care differ from their experiences with hospital-based care; patients and providers value the role of nurses differently in community-based chronic disease care. Qualitative research findings are not intended to generalize directly to populations, although meta-synthesis across several qualitative studies builds an increasingly robust understanding that is more likely to be transferable. The diversity of interventions that fall under SCBC and the cross-interventional focus of many of the studies mean that findings might not be generalizable to all forms of SCBC or its specific components. Patients with chronic diseases who participated in SCBC interventions reported greater satisfaction when SCBC helped them better understand their diagnosis, facilitated increased socialization, provided

  10. Sequence-based predictive modeling to identify cancerlectins

    PubMed Central

    Lai, Hong-Yan; Chen, Xin-Xin; Chen, Wei; Tang, Hua; Lin, Hao

    2017-01-01

    Lectins are a diverse type of glycoproteins or carbohydrate-binding proteins that have a wide distribution to various species. They can specially identify and exclusively bind to a certain kind of saccharide groups. Cancerlectins are a group of lectins that are closely related to cancer and play a major role in the initiation, survival, growth, metastasis and spread of tumor. Several computational methods have emerged to discriminate cancerlectins from non-cancerlectins, which promote the study on pathogenic mechanisms and clinical treatment of cancer. However, the predictive accuracies of most of these techniques are very limited. In this work, by constructing a benchmark dataset based on the CancerLectinDB database, a new amino acid sequence-based strategy for feature description was developed, and then the binomial distribution was applied to screen the optimal feature set. Ultimately, an SVM-based predictor was performed to distinguish cancerlectins from non-cancerlectins, and achieved an accuracy of 77.48% with AUC of 85.52% in jackknife cross-validation. The results revealed that our prediction model could perform better comparing with published predictive tools. PMID:28423655

  11. Investigation of Inquiry-Based Science Pedagogy among Middle Level Science Teachers: A Qualitative Study

    ERIC Educational Resources Information Center

    Weiland, Sunny Minelli

    2012-01-01

    This study implemented a qualitative approach to examine the phenomenon of "inquiry-based science pedagogy or inquiry instruction" as it has been experienced by individuals. Data was collected through online open-ended surveys, focus groups, and teacher reported self-reflections to answer the research questions: 1) How do middle level…

  12. Predicting volume of distribution with decision tree-based regression methods using predicted tissue:plasma partition coefficients.

    PubMed

    Freitas, Alex A; Limbu, Kriti; Ghafourian, Taravat

    2015-01-01

    Volume of distribution is an important pharmacokinetic property that indicates the extent of a drug's distribution in the body tissues. This paper addresses the problem of how to estimate the apparent volume of distribution at steady state (Vss) of chemical compounds in the human body using decision tree-based regression methods from the area of data mining (or machine learning). Hence, the pros and cons of several different types of decision tree-based regression methods have been discussed. The regression methods predict Vss using, as predictive features, both the compounds' molecular descriptors and the compounds' tissue:plasma partition coefficients (Kt:p) - often used in physiologically-based pharmacokinetics. Therefore, this work has assessed whether the data mining-based prediction of Vss can be made more accurate by using as input not only the compounds' molecular descriptors but also (a subset of) their predicted Kt:p values. Comparison of the models that used only molecular descriptors, in particular, the Bagging decision tree (mean fold error of 2.33), with those employing predicted Kt:p values in addition to the molecular descriptors, such as the Bagging decision tree using adipose Kt:p (mean fold error of 2.29), indicated that the use of predicted Kt:p values as descriptors may be beneficial for accurate prediction of Vss using decision trees if prior feature selection is applied. Decision tree based models presented in this work have an accuracy that is reasonable and similar to the accuracy of reported Vss inter-species extrapolations in the literature. The estimation of Vss for new compounds in drug discovery will benefit from methods that are able to integrate large and varied sources of data and flexible non-linear data mining methods such as decision trees, which can produce interpretable models. Graphical Abstract: Decision trees for the prediction of tissue partition coefficient and volume of distribution of drugs.
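
    The descriptor set and the exact Bagging configuration are not given in the abstract, so the sketch below reproduces the comparison only in outline: bagged regression trees trained on placeholder descriptors with and without an added (placeholder) adipose Kt:p feature, evaluated with a mean-fold-error helper. Data, features and parameter values are synthetic assumptions.

        # Bagged regression trees for log10(Vss) with placeholder descriptors and an optional Kt:p feature.
        import numpy as np
        from sklearn.ensemble import BaggingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n, d = 600, 30
        descriptors = rng.normal(size=(n, d))                           # placeholder molecular descriptors
        adipose_ktp = rng.lognormal(mean=0.0, sigma=0.8, size=(n, 1))   # placeholder predicted adipose Kt:p
        log_vss = 0.4 * descriptors[:, 0] + 0.6 * np.log(adipose_ktp[:, 0]) + rng.normal(0, 0.3, n)

        def mean_fold_error(obs_log, pred_log):
            """MFE = 10 ** mean(|log10 ratio|); inputs are already on a log10-like scale."""
            return 10 ** np.mean(np.abs(obs_log - pred_log))

        for label, X in [("descriptors only", descriptors),
                         ("descriptors + adipose Kt:p", np.hstack([descriptors, np.log(adipose_ktp)]))]:
            X_tr, X_te, y_tr, y_te = train_test_split(X, log_vss, test_size=0.3, random_state=0)
            model = BaggingRegressor(n_estimators=100, random_state=0)  # default base learner is a decision tree
            model.fit(X_tr, y_tr)
            print(label, "MFE:", round(mean_fold_error(y_te, model.predict(X_te)), 2))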

  13. Meta-path based heterogeneous combat network link prediction

    NASA Astrophysics Data System (ADS)

    Li, Jichao; Ge, Bingfeng; Yang, Kewei; Chen, Yingwu; Tan, Yuejin

    2017-09-01

    The combat system-of-systems in high-tech informative warfare, composed of many interconnected combat systems of different types, can be regarded as a type of complex heterogeneous network. Link prediction for heterogeneous combat networks (HCNs) is of significant military value, as it facilitates reconfiguring combat networks to represent the complex real-world network topology as appropriate with observed information. This paper proposes a novel integrated methodology framework called HCNMP (HCN link prediction based on meta-path) to predict multiple types of links simultaneously for an HCN. More specifically, the concept of HCN meta-paths is introduced, through which the HCNMP can accumulate information by extracting different features of HCN links for all the six defined types. Next, an HCN link prediction model, based on meta-path features, is built to predict all types of links of the HCN simultaneously. Then, the solution algorithm for the HCN link prediction model is proposed, in which the prediction results are obtained by iteratively updating with the newly predicted results until the results in the HCN converge or reach a certain maximum iteration number. Finally, numerical experiments on the dataset of a real HCN are conducted to demonstrate the feasibility and effectiveness of the proposed HCNMP, in comparison with 30 baseline methods. The results show that the performance of the HCNMP is superior to those of the baseline methods.

  14. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs

    PubMed Central

    2017-01-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package. PMID:29107980

  15. Qualitative interviews in medical research.

    PubMed Central

    Britten, N.

    1995-01-01

    Much qualitative research is interview based, and this paper provides an outline of qualitative interview techniques and their application in medical settings. It explains the rationale for these techniques and shows how they can be used to research kinds of questions that are different from those dealt with by quantitative methods. Different types of qualitative interviews are described, and the way in which they differ from clinical consultations is emphasised. Practical guidance for conducting such interviews is given. PMID:7627048

  16. Machine learning-based methods for prediction of linear B-cell epitopes.

    PubMed

    Wang, Hsin-Wei; Pai, Tun-Wen

    2014-01-01

    B-cell epitope prediction facilitates immunologists in designing peptide-based vaccines, diagnostic tests, disease prevention and treatment strategies, and antibody production. In comparison with T-cell epitope prediction, the performance of variable-length B-cell epitope prediction is still unsatisfactory. Fortunately, owing to increasingly available verified epitope databases, bioinformaticians can apply machine learning-based algorithms to all curated data to design improved prediction tools for biomedical researchers. Here, we have reviewed related epitope prediction papers, especially those for linear B-cell epitope prediction. It should be noted that a combination of selected propensity scales and statistics of epitope residues with machine learning-based tools forms a general way of constructing linear B-cell epitope prediction systems. It is also observed from most of the comparison results that the kernel method of the support vector machine (SVM) classifier outperformed other machine learning-based approaches. Hence, in this chapter, in addition to reviewing recently published papers, we introduce the fundamentals of B-cell epitopes and SVM techniques. In addition, an example of a linear B-cell epitope prediction system based on physicochemical features and amino acid combinations is illustrated in detail.
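
    As a minimal stand-in for the propensity-scale-plus-statistics features the chapter describes, the sketch below featurises peptides by amino-acid composition alone and trains an RBF-kernel SVM on synthetic labels; it illustrates the SVM workflow only, not the reviewed systems.

        # Amino-acid-composition features + SVM for (synthetic) linear B-cell epitope classification.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        AA = "ACDEFGHIKLMNPQRSTVWY"

        def composition(peptide):
            """20-dimensional amino-acid composition vector of a peptide."""
            counts = np.array([peptide.count(a) for a in AA], dtype=float)
            return counts / max(len(peptide), 1)

        rng = np.random.default_rng(4)
        peptides = ["".join(rng.choice(list(AA), size=rng.integers(8, 20))) for _ in range(300)]
        labels = rng.integers(0, 2, size=300)            # synthetic epitope / non-epitope labels

        X = np.array([composition(p) for p in peptides])
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        print("CV accuracy:", round(cross_val_score(clf, X, labels, cv=5).mean(), 2))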

  17. Predicting outcome of Internet-based treatment for depressive symptoms.

    PubMed

    Warmerdam, Lisanne; Van Straten, Annemieke; Twisk, Jos; Cuijpers, Pim

    2013-01-01

    In this study we explored predictors and moderators of response to Internet-based cognitive behavioral therapy (CBT) and Internet-based problem-solving therapy (PST) for depressive symptoms. The sample consisted of 263 participants with moderate to severe depressive symptoms. Of those, 88 were randomized to CBT, 88 to PST and 87 to a waiting list control condition. Outcomes were improvement and clinically significant change in depressive symptoms after 8 weeks. Higher baseline depression and higher education predicted improvement, while higher education, less avoidance behavior and decreased rational problem-solving skills predicted clinically significant change across all groups. No variables were found that differentially predicted outcome between Internet-based CBT and Internet-based PST. More research is needed with sufficient power to investigate predictors and moderators of response to reveal for whom Internet-based therapy is best suited.

  18. ShinyGPAS: interactive genomic prediction accuracy simulator based on deterministic formulas.

    PubMed

    Morota, Gota

    2017-12-20

    Deterministic formulas for the accuracy of genomic predictions highlight the relationships among prediction accuracy and potential factors influencing prediction accuracy prior to performing computationally intensive cross-validation. Visualizing such deterministic formulas in an interactive manner may lead to a better understanding of how genetic factors control prediction accuracy. The software to simulate deterministic formulas for genomic prediction accuracy was implemented in R and encapsulated as a web-based Shiny application. Shiny genomic prediction accuracy simulator (ShinyGPAS) simulates various deterministic formulas and delivers dynamic scatter plots of prediction accuracy versus genetic factors impacting prediction accuracy, while requiring only mouse navigation in a web browser. ShinyGPAS is available at: https://chikudaisei.shinyapps.io/shinygpas/ . ShinyGPAS is a shiny-based interactive genomic prediction accuracy simulator using deterministic formulas. It can be used for interactively exploring potential factors that influence prediction accuracy in genome-enabled prediction, simulating achievable prediction accuracy prior to genotyping individuals, or supporting in-class teaching. ShinyGPAS is open source software and it is hosted online as a freely available web-based resource with an intuitive graphical user interface.
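
    The abstract does not list the deterministic formulas implemented; one widely cited formula of this kind (attributed to Daetwyler and colleagues) gives the expected accuracy as r = sqrt(N·h² / (N·h² + Me)) for training size N, heritability h² and Me independent chromosome segments. The helper below simply evaluates that expression; whether this exact formula is among those in ShinyGPAS is an assumption.

        # Deterministic expected accuracy of genomic prediction (Daetwyler-type formula).
        from math import sqrt

        def expected_accuracy(n_train, h2, m_e):
            """r = sqrt(N * h2 / (N * h2 + Me)) for training size N, heritability h2, Me segments."""
            return sqrt(n_train * h2 / (n_train * h2 + m_e))

        for n in (1000, 5000, 20000):
            print(n, "training individuals ->", round(expected_accuracy(n, h2=0.3, m_e=2500), 3))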

  19. Knowledge representation and qualitative simulation of salmon redd functioning. Part I: qualitative modeling and simulation.

    PubMed

    Guerrin, F; Dumas, J

    2001-02-01

    This work aims at representing empirical knowledge of freshwater ecologists on the functioning of salmon redds (spawning areas of salmon) and its impact on mortality of early stages. For this, we use Qsim, a qualitative simulator. In this first part, we provide unfamiliar readers with the underlying qualitative differential equation (QDE) ontology of Qsim: representing quantities, qualitative variables, qualitative constraints, QDE structure. Based on a very simple example taken of the salmon redd application, we show how informal biological knowledge may be represented and simulated using an approach that was first intended to analyze qualitatively ordinary differential equations systems. A companion paper (Part II) gives the full description and simulation of the salmon redd qualitative model. This work was part of a project aimed at assessing the impact of the environment on salmon populations dynamics by the use of models of processes acting at different levels: catchment, river, and redds. Only the latter level is dealt with in this paper.

  20. Research on Improved Depth Belief Network-Based Prediction of Cardiovascular Diseases

    PubMed Central

    Zhang, Hongpo

    2018-01-01

    Quantitative analysis and prediction can help to reduce the risk of cardiovascular disease. Quantitative prediction based on traditional models has low accuracy, and the variance of predictions from shallow neural networks is large. In this paper, a cardiovascular disease prediction model based on an improved deep belief network (DBN) is proposed. Using the reconstruction error, the network depth is determined independently, and unsupervised training and supervised optimization are combined. This ensures the accuracy of model prediction while guaranteeing stability. Thirty experiments were performed independently on the Statlog (Heart) and Heart Disease Database data sets in the UCI database. Experimental results showed that the mean prediction accuracy was 91.26% and 89.78%, respectively, and the variance of prediction accuracy was 5.78 and 4.46, respectively. PMID:29854369

  1. [Qualitative research methodology in health care].

    PubMed

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs, non-random purposive sampling, a circular process of knowledge construction, and methodological rigor throughout the research process, from the quality of the design to the consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  2. The effect of using genealogy-based haplotypes for genomic prediction.

    PubMed

    Edriss, Vahid; Fernando, Rohan L; Su, Guosheng; Lund, Mogens S; Guldbrandtsen, Bernt

    2013-03-06

    Genomic prediction uses two sources of information: linkage disequilibrium between markers and quantitative trait loci, and additive genetic relationships between individuals. One way to increase the accuracy of genomic prediction is to capture more linkage disequilibrium by regression on haplotypes instead of regression on individual markers. The aim of this study was to investigate the accuracy of genomic prediction using haplotypes based on local genealogy information. A total of 4429 Danish Holstein bulls were genotyped with the 50K SNP chip. Haplotypes were constructed using local genealogical trees. Effects of haplotype covariates were estimated with two types of prediction models: (1) assuming that effects had the same distribution for all haplotype covariates, i.e. the GBLUP method, and (2) assuming that a large proportion (π) of the haplotype covariates had zero effect, i.e. a Bayesian mixture method. About 7.5 times more covariate effects were estimated when fitting haplotypes based on local genealogical trees compared to fitting individual markers. Genealogy-based haplotype clustering slightly increased the accuracy of genomic prediction and, in some cases, decreased the bias of prediction. With the Bayesian method, the accuracy of prediction was less sensitive to the parameter π when fitting haplotypes than when fitting markers. Use of haplotypes based on genealogy can slightly increase the accuracy of genomic prediction. Improved methods to cluster the haplotypes constructed from local genealogy could lead to additional gains in accuracy.
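
    The haplotype-clustering step is specific to the paper, but the GBLUP part corresponds to the standard mixed model y = 1μ + g + e with g ~ N(0, Gσg²); the sketch below builds a VanRaden-style genomic relationship matrix from simulated markers and solves the mixed-model equations. It is a generic GBLUP illustration on synthetic data, not the genealogy-based haplotype method.

        # Generic GBLUP: genomic relationship matrix from markers + mixed-model equations.
        import numpy as np

        rng = np.random.default_rng(5)
        n, m = 200, 1000
        M = rng.binomial(2, 0.3, size=(n, m)).astype(float)    # simulated SNP genotypes coded 0/1/2
        p = M.mean(axis=0) / 2.0
        Z = M - 2.0 * p                                        # centred genotypes
        G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))            # VanRaden genomic relationship matrix
        G += np.eye(n) * 1e-6                                  # keep G invertible

        true_g = rng.multivariate_normal(np.zeros(n), 0.3 * G) # simulated breeding values
        y = 1.0 + true_g + rng.normal(0, np.sqrt(0.7), n)      # phenotypes with residual variance 0.7

        # Mixed-model equations for y = 1*mu + g + e, var(g) = sigma_g^2 * G, lambda = sigma_e^2 / sigma_g^2
        lam = 0.7 / 0.3
        X = np.ones((n, 1))
        lhs = np.block([[X.T @ X, X.T],
                        [X,       np.eye(n) + lam * np.linalg.inv(G)]])
        rhs = np.concatenate([X.T @ y, y])
        sol = np.linalg.solve(lhs, rhs)
        mu_hat, g_hat = sol[0], sol[1:]
        print("correlation of true and estimated breeding values:", round(np.corrcoef(true_g, g_hat)[0, 1], 2))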

  3. Pharmacological mechanism-based drug safety assessment and prediction.

    PubMed

    Abernethy, D R; Woodcock, J; Lesko, L J

    2011-06-01

    Advances in cheminformatics, bioinformatics, and pharmacology in the context of biological systems are now at a point that these tools can be applied to mechanism-based drug safety assessment and prediction. The development of such predictive tools at the US Food and Drug Administration (FDA) will complement ongoing efforts in drug safety that are focused on spontaneous adverse event reporting and active surveillance to monitor drug safety. This effort will require the active collaboration of scientists in the pharmaceutical industry, academe, and the National Institutes of Health, as well as those at the FDA, to reach its full potential. Here, we describe the approaches and goals for the mechanism-based drug safety assessment and prediction program.

  4. Genetic-based prediction of disease traits: prediction is very difficult, especially about the future†

    PubMed Central

    Schrodi, Steven J.; Mukherjee, Shubhabrata; Shan, Ying; Tromp, Gerard; Sninsky, John J.; Callear, Amy P.; Carter, Tonia C.; Ye, Zhan; Haines, Jonathan L.; Brilliant, Murray H.; Crane, Paul K.; Smelser, Diane T.; Elston, Robert C.; Weeks, Daniel E.

    2014-01-01

    Translation of results from genetic findings to inform medical practice is a highly anticipated goal of human genetics. The aim of this paper is to review and discuss the role of genetics in medically-relevant prediction. Germline genetics presages disease onset and therefore can contribute prognostic signals that augment laboratory tests and clinical features. As such, the impact of genetic-based predictive models on clinical decisions and therapy choice could be profound. However, given that (i) medical traits result from a complex interplay between genetic and environmental factors, (ii) the underlying genetic architectures for susceptibility to common diseases are not well-understood, and (iii) replicable susceptibility alleles, in combination, account for only a moderate amount of disease heritability, there are substantial challenges to constructing and implementing genetic risk prediction models with high utility. In spite of these challenges, concerted progress has continued in this area with an ongoing accumulation of studies that identify disease predisposing genotypes. Several statistical approaches with the aim of predicting disease have been published. Here we summarize the current state of disease susceptibility mapping and pharmacogenetics efforts for risk prediction, describe methods used to construct and evaluate genetic-based predictive models, and discuss applications. PMID:24917882

  5. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    PubMed

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
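
    As a rough illustration of the kind of model the NAPR service hosts (not the NAPR code itself), the sketch below fits a Gaussian process regressor to synthetic cortical-thickness features and predicts age with an uncertainty estimate; the data dimensions and kernel choice are assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel
        from sklearn.model_selection import train_test_split

        # Toy stand-in for vertex-wise cortical thickness maps; thickness thins with age.
        rng = np.random.default_rng(0)
        n_subjects, n_vertices = 300, 50
        age = rng.uniform(6, 89, n_subjects)
        thickness = 3.0 - 0.01 * age[:, None] + rng.normal(0, 0.1, (n_subjects, n_vertices))

        X_train, X_test, y_train, y_test = train_test_split(thickness, age, random_state=0)
        gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gpr.fit(X_train, y_train)
        pred, std = gpr.predict(X_test, return_std=True)   # predicted age and its uncertainty
        print("mean absolute error (years):", round(float(np.abs(pred - y_test).mean()), 2))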

  6. Qualitative methods: beyond the cookbook.

    PubMed

    Harding, G; Gantley, M

    1998-02-01

    Qualitative methods appear increasingly in vogue in health services research (HSR). Such research, however, has utilized, often uncritically, a 'cookbook' of methods for data collection, and common-sense principles for data analysis. This paper argues that qualitative HSR benefits from recognizing and drawing upon theoretical principles underlying qualitative data collection and analysis. A distinction is drawn between problem-orientated and theory-orientated research, in order to illustrate how problem-orientated research would benefit from the introduction of theoretical perspectives in order to develop the knowledge base of health services research.

  7. Evidence-based health information from the users' perspective--a qualitative analysis.

    PubMed

    Hirschberg, Irene; Seidel, Gabriele; Strech, Daniel; Bastian, Hilda; Dierks, Marie-Luise

    2013-10-10

    Evidence-based information is a precondition for informed decision-making and participation in health. There are several recommendations and definitions available on the generation and assessment of so called evidence-based health information for patients and consumers (EBHI). They stress the importance of objectively informing people about benefits and harms and any uncertainties in health-related procedures. There are also studies on the comprehensibility, relevance and user-friendliness of these informational materials. But to date there has been little research on the perceptions and cognitive reactions of users or lay people towards EBHI. The aim of our study is to define the spectrum of consumers' reaction patterns to written EBHI in order to gain a deeper understanding of their comprehension and assumptions, as well as their informational needs and expectations. This study is based on an external user evaluation of EBHI produced by the German Institute for Quality and Efficiency in Health Care (IQWiG), commissioned by the IQWiG. The EBHI were examined within guided group discussions, carried out with lay people. The test readers' first impressions and their appraisal of the informational content, presentation, structure, comprehensibility and effect were gathered. Then a qualitative text analysis of 25 discussion transcripts involving 94 test readers was performed. Based on the qualitative text analysis a framework for reaction patterns was developed, comprising eight main categories: (i) interest, (ii) satisfaction, (iii) reassurance and trust, (iv) activation, (v) disinterest, (vi) dissatisfaction and disappointment, (vii) anxiety and worry, (viii) doubt. Many lay people are unfamiliar with core characteristics of this special information type. Two particularly critical issues are the description of insufficient evidence and the attendant absence of clear-cut recommendations. Further research is needed to examine strategies to explain the specific

  8. Feared consequences of panic attacks in panic disorder: a qualitative and quantitative analysis.

    PubMed

    Raffa, Susan D; White, Kamila S; Barlow, David H

    2004-01-01

    Cognitions are hypothesized to play a central role in panic disorder (PD). Previous studies have used questionnaires to assess cognitive content, focusing on prototypical cognitions associated with PD; however, few studies have qualitatively examined cognitions associated with the feared consequences of panic attacks. The purpose of this study was to conduct a qualitative and quantitative analysis of feared consequences of panic attacks. The initial, qualitative analysis resulted in the development of 32 categories of feared consequences. The categories were derived from participant responses to a standardized, semi-structured question (n = 207). Five expert-derived categories were then utilized to quantitatively examine the relationship between cognitions and indicators of PD severity. Cognitions did not predict PD severity; however, correlational analyses indicated some predictive validity to the expert-derived categories. The qualitative analysis identified additional areas of patient-reported concern not included in previous research that may be important in the assessment and treatment of PD.

  9. Link prediction based on local community properties

    NASA Astrophysics Data System (ADS)

    Yang, Xu-Hua; Zhang, Hai-Feng; Ling, Fei; Cheng, Zhi; Weng, Guo-Qing; Huang, Yu-Jiao

    2016-09-01

    The link prediction algorithm is one of the key technologies to reveal the inherent rules of network evolution. This paper proposes a novel link prediction algorithm based on the properties of the local community, which is composed of the common neighbor nodes of any two nodes in the network and the links between these nodes. By referring to the node degree and the condition of assortativity or disassortativity in a network, we comprehensively consider the effect of the shortest path and edge clustering coefficient within the local community on node similarity. We numerically show that the proposed method provides good link prediction results.
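
    The sketch below illustrates the general idea of a local-community link predictor (a simplified score, not the exact formula of the paper): the local community of a candidate pair is the subgraph induced by their common neighbours, and the score rewards pairs whose community is large and densely connected.

        import networkx as nx

        def local_community_score(G, x, y):
            """Simplified local-community similarity for a candidate link (x, y)."""
            common = set(G[x]) & set(G[y])          # common neighbours define the local community
            if not common:
                return 0.0
            density = nx.density(G.subgraph(common)) if len(common) > 1 else 0.0
            return len(common) * (1.0 + density)    # boost tightly knit communities

        G = nx.karate_club_graph()
        candidates = [(u, v) for u in G for v in G if u < v and not G.has_edge(u, v)]
        ranked = sorted(candidates, key=lambda e: local_community_score(G, *e), reverse=True)
        print("top predicted links:", ranked[:5])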

  10. Predicting the response of populations to environmental change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ives, A.R.

    1995-04-01

    When subject to long-term directional environmental perturbations, changes in population densities depend on the positive and negative feedbacks operating through interactions within and among species in a community. This paper develops techniques to predict the long-term responses of population densities to environmental changes using data on short-term population fluctuations driven by short-term environmental variability. In addition to giving quantitative predictions, the techniques also reveal how different qualitative patterns of species interactions either buffer or accentuate population responses to environmental trends. All of the predictions are based on regression coefficients extracted from time series data, and they can therefore be applied with a minimum of mathematical and statistical gymnastics. 48 refs., 10 figs., 4 tabs.

  11. Local-search based prediction of medical image registration error

    NASA Astrophysics Data System (ADS)

    Saygili, Görkem

    2018-03-01

    Medical image registration is a crucial task in many different medical imaging applications. Hence, a considerable amount of work has been published recently that aims to predict the error in a registration without any human effort. If provided, these error predictions can be used as feedback to the registration algorithm to further improve its performance. Recent methods generally start with extracting image-based and deformation-based features, then apply feature pooling and finally train a Random Forest (RF) regressor to predict the real registration error. Image-based features can be calculated after applying a single registration but provide limited accuracy, whereas deformation-based features such as the variation of the deformation vector field may require up to 20 registrations, which is a considerably time-consuming task. This paper proposes to use features extracted from a local search algorithm as image-based features to estimate the error of a registration. The proposed method comprises a local search algorithm to find corresponding voxels between registered image pairs and, based on the amount of shifts and stereo confidence measures, it predicts the amount of registration error in millimetres densely using an RF regressor. Compared to other algorithms in the literature, the proposed algorithm does not require multiple registrations, can be efficiently implemented on a Graphical Processing Unit (GPU) and can still provide highly accurate error predictions in the presence of large registration errors. Experimental results with real registrations on a public dataset indicate a substantially high accuracy achieved by using features from the local search algorithm.

  12. Salient Public Beliefs Underlying Disaster Preparedness Behaviors: A Theory-Based Qualitative Study.

    PubMed

    Najafi, Mehdi; Ardalan, Ali; Akbarisari, Ali; Noorbala, Ahmad Ali; Elmi, Helen

    2017-04-01

    Introduction Given the increasing importance of disaster preparedness in Tehran, the capital of Iran, interventions encouraging disaster preparedness behavior (DPB) are needed. This study was conducted to show how an elicitation method can be used to identify salient consequences, referents, and circumstances about DPB and provide recommendations for interventions and quantitative research. A theory-based qualitative study using a semi-structured elicitation questionnaire was conducted with 132 heads of households from 22 districts in Tehran, Iran. Following the Theory of Planned Behavior (TPB), six open-ended questions were used to record the opinion of people about DPB: advantages of engaging in DPB; disadvantages of doing so; people who approve; people who disapprove; things that make it easy; and things that make it difficult. Content analysis showed the categories of salient consequences, reference groups, and circumstances. The three most frequently mentioned advantages obtained from inhabitants of Tehran were health outcomes (eg, it helps us to save our lives, it provides basic needs, and it protects us until relief workers arrive); other salient advantages were mentioned (eg, helps family reunification). The main disadvantage was preparedness anxiety. Family members were the most frequently mentioned social referent when people were asked who might approve or disapprove of their DPB. The two main circumstances perceived to obstruct DPB included not having enough knowledge or enough time. The results of this qualitative study suggest that interventions to encourage DPB among Tehran inhabitants should address: perceived consequences of DPB on health and other factors beyond health; barriers of not having enough knowledge and time perceived to hinder DPB; and social approval. More accurate research on salient beliefs with close-ended items developed from these open-ended data and with larger sample sizes of Tehran inhabitants is necessary. Research with other

  13. Small RNA-based prediction of hybrid performance in maize.

    PubMed

    Seifert, Felix; Thiemann, Alexander; Schrag, Tobias A; Rybka, Dominika; Melchinger, Albrecht E; Frisch, Matthias; Scholten, Stefan

    2018-05-21

    Small RNA (sRNA) sequences are known to have a broad impact on gene regulation by various mechanisms. Their performance for the prediction of hybrid traits has not yet been analyzed. Our objective was to analyze the relation of parental sRNA expression with the performance of their hybrids, to develop a sRNA-based prediction approach, and to compare it to more common SNP and mRNA transcript based predictions using a factorial mating scheme of a maize hybrid breeding program. Correlation of genomic differences and messenger RNA (mRNA) or sRNA expression differences between parental lines with the performance of their hybrids revealed that sRNAs showed an inverse relationship in contrast to the other two data types. We associated differences for SNPs, mRNA and sRNA expression between parental inbred lines with the performance of their hybrid combinations and developed two prediction approaches using distance measures based on associated markers. Cross-validations revealed parental differences in sRNA expression to be strong predictors for hybrid performance for grain yield in maize, comparable to genomic and mRNA data. The integration of both positively and negatively associated markers in the prediction approaches enhanced the prediction accuracy. The associated sRNAs belong predominantly to the canonical size classes of 22- and 24-nt that show specific genomic mapping characteristics. Expression profiles of sRNA are a promising alternative to SNPs or mRNA expression profiles for hybrid prediction, especially for plant species without reference genome or transcriptome information. The characteristics of the sRNAs we identified suggest that association studies based on breeding populations facilitate the identification of sRNAs involved in hybrid performance.

  14. The effect of using genealogy-based haplotypes for genomic prediction

    PubMed Central

    2013-01-01

    Background Genomic prediction uses two sources of information: linkage disequilibrium between markers and quantitative trait loci, and additive genetic relationships between individuals. One way to increase the accuracy of genomic prediction is to capture more linkage disequilibrium by regression on haplotypes instead of regression on individual markers. The aim of this study was to investigate the accuracy of genomic prediction using haplotypes based on local genealogy information. Methods A total of 4429 Danish Holstein bulls were genotyped with the 50K SNP chip. Haplotypes were constructed using local genealogical trees. Effects of haplotype covariates were estimated with two types of prediction models: (1) assuming that effects had the same distribution for all haplotype covariates, i.e. the GBLUP method and (2) assuming that a large proportion (π) of the haplotype covariates had zero effect, i.e. a Bayesian mixture method. Results About 7.5 times more covariate effects were estimated when fitting haplotypes based on local genealogical trees compared to fitting individual markers. Genealogy-based haplotype clustering slightly increased the accuracy of genomic prediction and, in some cases, decreased the bias of prediction. With the Bayesian method, accuracy of prediction was less sensitive to parameter π when fitting haplotypes compared to fitting markers. Conclusions Use of haplotypes based on genealogy can slightly increase the accuracy of genomic prediction. Improved methods to cluster the haplotypes constructed from local genealogy could lead to additional gains in accuracy. PMID:23496971
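
    A toy stand-in for the GBLUP-type model described above is ridge regression on haplotype covariates, which is equivalent to BLUP with a common effect variance across covariates; the sketch below uses simulated data and an illustrative shrinkage parameter, and is not the study's software.

        import numpy as np
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(1)
        n_animals, n_covariates = 500, 2000
        Z = rng.integers(0, 3, size=(n_animals, n_covariates)).astype(float)   # haplotype counts
        true_effects = rng.normal(0, 0.05, n_covariates)
        y = Z @ true_effects + rng.normal(0, 1.0, n_animals)                   # phenotype = genetics + noise

        gblup_like = Ridge(alpha=n_covariates * 0.5)     # shrinkage plays the role of the variance ratio
        gblup_like.fit(Z[:400], y[:400])
        accuracy = np.corrcoef(gblup_like.predict(Z[400:]), y[400:])[0, 1]
        print("predictive correlation in the validation set:", round(float(accuracy), 3))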

  15. The Development of a Qualitative Dynamic Attribute Value Model for Healthcare Institutes

    PubMed Central

    Lee, Wan-I

    2010-01-01

    Background: Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model that provides insight into customers' value for healthcare institute managers, starting with an initial open-ended questionnaire survey to select participants purposefully. Methods: A total of 427 questionnaires were distributed in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds) and 419 questionnaires were received within nine weeks. Qualitative in-depth interviews were then applied to explore customers' perspectives of value for building a model of partial differential equations. Results: This study identifies nine categories of value, including cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value, to construct an objective network for customer value and a qualitative dynamic attribute value model, where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. Conclusion: One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty, based on trust, involves buzz marketing, brand and image. Customer value of the current instance is useful for traversing original customer attributes and identifying customers on different service shares. PMID:23113034

  16. Copula based prediction models: an application to an aortic regurgitation study

    PubMed Central

    Kumar, Pranesh; Shoukri, Mohamed M

    2007-01-01

    Background: An important issue in prediction modeling of multivariate data is the measure of dependence structure. The use of Pearson's correlation as a dependence measure has several pitfalls and hence application of regression prediction models based on this correlation may not be an appropriate methodology. As an alternative, a copula based methodology for prediction modeling and an algorithm to simulate data are proposed. Methods: The method consists of introducing copulas as an alternative to the correlation coefficient commonly used as a measure of dependence. An algorithm based on the marginal distributions of random variables is applied to construct the Archimedean copulas. Monte Carlo simulations are carried out to replicate datasets, estimate prediction model parameters and validate them using Lin's concordance measure. Results: We have carried out a correlation-based regression analysis on data from 20 patients aged 17–82 years on pre-operative and post-operative ejection fractions after surgery and estimated the prediction model: Post-operative ejection fraction = - 0.0658 + 0.8403 (Pre-operative ejection fraction); p = 0.0008; 95% confidence interval of the slope coefficient (0.3998, 1.2808). From the exploratory data analysis, it is noted that both the pre-operative and post-operative ejection fractions measurements have slight departures from symmetry and are skewed to the left. It is also noted that the measurements tend to be widely spread and have shorter tails compared to normal distribution. Therefore predictions made from the correlation-based model corresponding to the pre-operative ejection fraction measurements in the lower range may not be accurate. Further it is found that the best approximated marginal distributions of pre-operative and post-operative ejection fractions (using q-q plots) are gamma distributions. The copula based prediction model is estimated as: Post-operative ejection fraction = - 0.0933 + 0.8907 × (Pre

  17. Conducting qualitative research in audiology: a tutorial.

    PubMed

    Knudsen, Line V; Laplante-Lévesque, Ariane; Jones, Lesley; Preminger, Jill E; Nielsen, Claus; Lunner, Thomas; Hickson, Louise; Naylor, Graham; Kramer, Sophia E

    2012-02-01

    Qualitative research methodologies are being used more frequently in audiology as it allows for a better understanding of the perspectives of people with hearing impairment. This article describes why and how international interdisciplinary qualitative research can be conducted. This paper is based on a literature review and our recent experience with the conduction of an international interdisciplinary qualitative study in audiology. We describe some available qualitative methods for sampling, data collection, and analysis and we discuss the rationale for choosing particular methods. The focus is on four approaches which have all previously been applied to audiologic research: grounded theory, interpretative phenomenological analysis, conversational analysis, and qualitative content analysis. This article provides a review of methodological issues useful for those designing qualitative research projects in audiology or needing assistance in the interpretation of qualitative literature.

  18. An object programming based environment for protein secondary structure prediction.

    PubMed

    Giacomini, M; Ruggiero, C; Sacile, R

    1996-01-01

    The most frequently used methods for protein secondary structure prediction are empirical statistical methods and rule based methods. A consensus system based on object-oriented programming is presented, which integrates the two approaches with the aim of improving the prediction quality. This system uses an object-oriented knowledge representation based on the concepts of conformation, residue and protein, where the conformation class is the basis, the residue class derives from it and the protein class derives from the residue class. The system has been tested with satisfactory results on several proteins of the Brookhaven Protein Data Bank. Its results have been compared with the results of the most widely used prediction methods, and they show a higher prediction capability and greater stability. Moreover, the system itself provides an index of the reliability of its current prediction. This system can also be regarded as a basis structure for programs of this kind.
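
    The class hierarchy described above (conformation as the base class, residue derived from it, protein derived from residue) can be sketched as below; the attributes and the simple majority-vote consensus are illustrative assumptions, not the original system's code.

        class Conformation:
            """Base class: a secondary-structure state with a reliability index."""
            def __init__(self, state="C", reliability=0.0):
                self.state = state                  # "H" (helix), "E" (sheet) or "C" (coil)
                self.reliability = reliability

        class Residue(Conformation):
            """A residue adds an amino-acid identity to its predicted conformation."""
            def __init__(self, aa, state="C", reliability=0.0):
                super().__init__(state, reliability)
                self.aa = aa

        class Protein(Residue):
            """A protein aggregates residues and builds a consensus over several predictors."""
            def __init__(self, sequence):
                super().__init__(aa=None)
                self.residues = [Residue(aa) for aa in sequence]

            def consensus(self, predictions):
                # predictions: one predicted state string per method, aligned with the sequence
                for i, res in enumerate(self.residues):
                    votes = [p[i] for p in predictions]
                    res.state = max(set(votes), key=votes.count)
                    res.reliability = votes.count(res.state) / len(votes)
                return "".join(r.state for r in self.residues)

        protein = Protein("MKTAYIAKQR")
        print(protein.consensus(["HHHHCCCEEE", "HHHHHCCEEE", "CHHHCCCEEE"]))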

  19. Structure-Based Predictions of Activity Cliffs

    PubMed Central

    Husby, Jarmila; Bottegoni, Giovanni; Kufareva, Irina; Abagyan, Ruben; Cavalli, Andrea

    2015-01-01

    In drug discovery, it is generally accepted that neighboring molecules in a given descriptors' space display similar activities. However, even in regions that provide strong predictability, structurally similar molecules can occasionally display large differences in potency. In QSAR jargon, these discontinuities in the activity landscape are known as ‘activity cliffs’. In this study, we assessed the reliability of ligand docking and virtual ligand screening schemes in predicting activity cliffs. We performed our calculations on a diverse, independently collected database of cliff-forming co-crystals. Starting from ideal situations, which allowed us to establish our baseline, we progressively moved toward simulating more realistic scenarios. Ensemble- and template-docking achieved a significant level of accuracy, suggesting that, despite the well-known limitations of empirical scoring schemes, activity cliffs can be accurately predicted by advanced structure-based methods. PMID:25918827

  20. Qualitative Research Designs: Selection and Implementation

    ERIC Educational Resources Information Center

    Creswell, John W.; Hanson, William E.; Plano Clark, Vicki L.; Morales, Alejandro

    2007-01-01

    Counseling psychologists face many approaches from which to choose when they conduct a qualitative research study. This article focuses on the processes of selecting, contrasting, and implementing five different qualitative approaches. Based on an extended example related to test interpretation by counselors, clients, and communities, this article…

  1. Vesicular stomatitis forecasting based on Google Trends

    PubMed Central

    Lu, Yi; Zhou, GuangYa; Chen, Qin

    2018-01-01

    Background Vesicular stomatitis (VS) is an important viral disease of livestock. The main feature of VS is irregular blisters that occur on the lips, tongue, oral mucosa, hoof crown and nipple. Humans can also be infected with vesicular stomatitis and develop meningitis. This study analyses the 2014 American VS outbreaks in order to accurately predict vesicular stomatitis outbreak trends. Methods American VS outbreak data were collected from the OIE. The data for VS keywords were obtained by inputting 24 disease-related keywords into Google Trends. After calculating the Pearson and Spearman correlation coefficients, it was found that there was a relationship between outbreaks and keywords derived from Google Trends. Finally, the predictive model was constructed based on qualitative classification and quantitative regression. Results For the regression model, the Pearson correlation coefficients between the predicted outbreaks and actual outbreaks are 0.953 and 0.948, respectively. For the qualitative classification model, we constructed five classification predictive models and chose the best classification predictive model as the result. The results showed that the SN (sensitivity), SP (specificity) and ACC (prediction accuracy) values of the best classification predictive model are 78.52%, 72.5% and 77.14%, respectively. Conclusion This study applied Google search data to construct a qualitative classification model and a quantitative regression model. The results show that the method is effective and that these two models obtain accurate forecasts. PMID:29385198
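
    A minimal sketch of the two-part approach, on invented toy data, is given below: keyword series are screened with Pearson and Spearman correlations against weekly outbreak counts and the retained keywords feed a regression model; the keyword names, thresholds and data are assumptions, not the study's material.

        import numpy as np
        from scipy.stats import pearsonr, spearmanr
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        weeks = 52
        outbreaks = np.clip(rng.poisson(5, weeks) + np.round(3 * np.sin(np.arange(weeks) / 4)), 0, None)
        keywords = {
            "vesicular stomatitis": outbreaks * 2.0 + rng.normal(0, 2, weeks),
            "livestock blisters": outbreaks * 1.5 + rng.normal(0, 4, weeks),
            "unrelated term": rng.normal(10, 3, weeks),
        }

        # Screen keywords by their correlation with the outbreak series.
        selected = [name for name, series in keywords.items()
                    if max(abs(pearsonr(series, outbreaks)[0]),
                           abs(spearmanr(series, outbreaks)[0])) > 0.5]

        X = np.column_stack([keywords[name] for name in selected])
        reg = LinearRegression().fit(X[:40], outbreaks[:40])           # train on the first 40 weeks
        pred = reg.predict(X[40:])
        print("selected keywords:", selected)
        print("correlation of predicted vs actual outbreaks:",
              round(float(pearsonr(pred, outbreaks[40:])[0]), 3))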

  2. [Qualitative Research in Health Services Research - Discussion Paper, Part 3: Quality of Qualitative Research].

    PubMed

    Stamer, M; Güthlin, C; Holmberg, C; Karbach, U; Patzelt, C; Meyer, T

    2015-12-01

    The third and final discussion paper of the German Network of Health Services Research's (DNVF) "Qualitative Methods Working Group" presents approaches for evaluating the quality of qualitative research in health services research. In this paper we discuss approaches described for evaluating qualitative studies, including: an orientation to the general principles of empirical research, an approach-specific course of action, as well as procedures based on the research-process and criteria-oriented approaches. Divided into general and specific aspects to be considered in evaluating the quality of a qualitative study, the central focus of the discussion paper is an extensive examination of the process- and criteria-oriented approaches. The general aspects include the participation of relevant groups in the research process as well as ethical aspects of the research and data protection issues. The more specific aspects in evaluating the quality of qualitative research include considerations about the research interest, research questions, and the selection of data collection methods and types of analyses. The formulated questions are intended to guide reviewers and researchers in evaluating and developing qualitative research projects appropriately. The intention of this discussion paper is to ensure a transparent research culture, and to reflect on and discuss the methodological and research approach of qualitative studies in health services research. With this paper we aim to initiate a discussion on high-quality evaluation of qualitative health services research. © Georg Thieme Verlag KG Stuttgart · New York.

  3. Selecting the minimum prediction base of historical data to perform 5-year predictions of the cancer burden: The GoF-optimal method.

    PubMed

    Valls, Joan; Castellà, Gerard; Dyba, Tadeusz; Clèries, Ramon

    2015-06-01

    Predicting the future burden of cancer is a key issue for health services planning, where a method for selecting the predictive model and the prediction base is a challenge. A method, named here Goodness-of-Fit optimal (GoF-optimal), is presented to determine the minimum prediction base of historical data to perform 5-year predictions of the number of new cancer cases or deaths. An empirical ex-post evaluation exercise for cancer mortality data in Spain and cancer incidence in Finland using simple linear and log-linear Poisson models was performed. Prediction bases were considered within the time periods 1951-2006 in Spain and 1975-2007 in Finland, and then predictions were made for 37 and 33 single years in these periods, respectively. The performance of three fixed prediction bases (last 5, 10, and 20 years of historical data) was compared to that of the prediction base determined by the GoF-optimal method. The coverage (COV) of the 95% prediction interval and the discrepancy ratio (DR) were calculated to assess the success of the prediction. The results showed that (i) models using the prediction base selected through the GoF-optimal method reached the highest COV and the lowest DR and (ii) the best alternative strategy to GoF-optimal was the one using a prediction base of 5 years. The GoF-optimal approach can be used as a selection criterion in order to find an adequate base of prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
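
    The idea behind the GoF-optimal selection can be paraphrased as below: for each candidate prediction base (number of most recent years), fit a log-linear Poisson trend, score its goodness of fit, keep the best-scoring base and project five years ahead; the toy data and the deviance-based score are assumptions, not the authors' implementation.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        years = np.arange(1975, 2008)
        counts = rng.poisson(np.exp(5.0 + 0.02 * (years - years[0])))      # toy annual cancer counts

        best = None
        for base in range(5, len(years) + 1):                  # candidate prediction bases
            t = np.arange(base, dtype=float)                   # time index within the base
            fit = sm.GLM(counts[-base:], sm.add_constant(t),
                         family=sm.families.Poisson()).fit()
            score = fit.deviance / fit.df_resid                # simple goodness-of-fit score
            if best is None or score < best[0]:
                best = (score, base, fit)

        score, base, fit = best
        future_t = np.arange(base, base + 5, dtype=float)      # five years beyond the base
        pred = fit.predict(sm.add_constant(future_t, has_constant='add'))
        print(f"selected base: last {base} years")
        print("5-year-ahead predictions:", np.round(pred).astype(int))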

  4. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system was created. The qualitative model describes the effects of seal failures on the system steady state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  5. A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm

    PubMed Central

    Chen, Jui-Le; Yang, Chu-Sing

    2013-01-01

    The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although the computer science technologies can be used to reduce the costs of the pharmaceutical research, the computation time of the structure-based protein-ligand docking prediction is still unsatisfied until now. Hence, in this paper, a novel docking prediction algorithm, named fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate the docking prediction algorithm. The proposed algorithm works by leveraging two high-performance operators: (1) the novel migration (information exchange) operator is designed specially for cloud-based environments to reduce the computation time; (2) the efficient operator is aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both the computation time and the quality of the end result. PMID:23762864

  6. A general framework for multivariate multi-index drought prediction based on Multivariate Ensemble Streamflow Prediction (MESP)

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.

    2016-08-01

    Drought is among the costliest natural hazards worldwide and extreme drought events in recent years have caused huge losses to various sectors. Drought prediction is therefore critically important for providing early warning information to aid decision making to cope with drought. Due to the complicated nature of drought, it has been recognized that the univariate drought indicator may not be sufficient for drought characterization and hence multivariate drought indices have been developed for drought monitoring. Alongside the substantial effort in drought monitoring with multivariate drought indices, it is of equal importance to develop a drought prediction method with multivariate drought indices to integrate drought information from various sources. This study proposes a general framework for multivariate multi-index drought prediction that is capable of integrating complementary prediction skills from multiple drought indices. The Multivariate Ensemble Streamflow Prediction (MESP) is employed to sample from historical records for obtaining statistical prediction of multiple variables, which is then used as inputs to achieve multivariate prediction. The framework is illustrated with a linearly combined drought index (LDI), which is a commonly used multivariate drought index, based on climate division data in California and New York in the United States with different seasonality of precipitation. The predictive skill of LDI (represented with persistence) is assessed by comparison with the univariate drought index and results show that the LDI prediction skill is less affected by seasonality than the meteorological drought prediction based on SPI. Prediction results from the case study show that the proposed multivariate drought prediction outperforms the persistence prediction, implying a satisfactory performance of multivariate drought prediction. The proposed method would be useful for drought prediction to integrate drought information from various sources

  7. Prediction-based Dynamic Energy Management in Wireless Sensor Networks

    PubMed Central

    Wang, Xue; Ma, Jun-Jie; Wang, Sheng; Bi, Dao-Wei

    2007-01-01

    Energy consumption is a critical constraint in wireless sensor networks. Focusing on the energy efficiency problem of wireless sensor networks, this paper proposes a method of prediction-based dynamic energy management. A particle filter was introduced to predict a target state, which was adopted to awaken wireless sensor nodes so that their sleep time was prolonged. With the distributed computing capability of nodes, an optimization approach of distributed genetic algorithm and simulated annealing was proposed to minimize the energy consumption of measurement. Considering the application of target tracking, we implemented target position prediction, node sleep scheduling and optimal sensing node selection. Moreover, a routing scheme of forwarding nodes was presented to achieve extra energy conservation. Experimental results of target tracking verified that energy-efficiency is enhanced by prediction-based dynamic energy management.
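
    A bare-bones particle filter for one-dimensional target tracking is sketched below to illustrate the prediction step used to decide which nodes to awaken; the motion model, measurement noise and all parameters are assumptions for the sketch.

        import numpy as np

        rng = np.random.default_rng(4)
        n_particles, steps = 500, 30
        true_pos, velocity = 0.0, 1.0

        particles = rng.normal(0.0, 1.0, n_particles)
        weights = np.ones(n_particles) / n_particles

        for _ in range(steps):
            true_pos += velocity
            measurement = true_pos + rng.normal(0, 0.5)

            particles += velocity + rng.normal(0, 0.3, n_particles)              # predict (motion model)
            weights *= np.exp(-0.5 * ((measurement - particles) / 0.5) ** 2)     # weight by likelihood
            weights /= weights.sum()

            idx = rng.choice(n_particles, n_particles, p=weights)                # resample
            particles = particles[idx]
            weights = np.ones(n_particles) / n_particles

        predicted_next = particles.mean() + velocity      # where to awaken sensor nodes next
        print("true position:", round(true_pos, 2),
              "| predicted next position:", round(float(predicted_next), 2))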

  8. A micromechanics-based strength prediction methodology for notched metal matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1992-01-01

    An analytical micromechanics based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post fatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus, explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.

  9. A micromechanics-based strength prediction methodology for notched metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1993-01-01

    An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus, explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.

  10. Intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care: a descriptive qualitative study.

    PubMed

    Ballangrud, Randi; Hall-Lord, Marie Louise; Persenius, Mona; Hedelin, Birgitta

    2014-08-01

    To describe intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care. Failures in team processes are found to be contributory factors to incidents in an intensive care environment. Simulation-based training is recommended as a method to make health-care personnel aware of the importance of team working and to improve their competencies. The study uses a qualitative descriptive design. Individual qualitative interviews were conducted with 18 intensive care nurses from May to December 2009, all of which had attended a simulation-based team training programme. The interviews were analysed by qualitative content analysis. One main category emerged to illuminate the intensive care nurse perception: "training increases awareness of clinical practice and acknowledges the importance of structured work in teams". Three generic categories were found: "realistic training contributes to safe care", "reflection and openness motivates learning" and "finding a common understanding of team performance". Simulation-based team training makes intensive care nurses more prepared to care for severely ill patients. Team training creates a common understanding of how to work in teams with regard to patient safety. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system has been created. The qualitative model describes the effects of seal failures on the system steady-state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  12. Rate-Based Model Predictive Control of Turbofan Engine Clearance

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan A.

    2006-01-01

    An innovative model predictive control strategy is developed for control of nonlinear aircraft propulsion systems and sub-systems. At the heart of the controller is a rate-based linear parameter-varying model that propagates the state derivatives across the prediction horizon, extending prediction fidelity to transient regimes where conventional models begin to lose validity. The new control law is applied to a demanding active clearance control application, where the objectives are to tightly regulate blade tip clearances and also anticipate and avoid detrimental blade-shroud rub occurrences by optimally maintaining a predefined minimum clearance. Simulation results verify that the rate-based controller is capable of satisfying the objectives during realistic flight scenarios where both a conventional Jacobian-based model predictive control law and an unconstrained linear-quadratic optimal controller are incapable of doing so. The controller is evaluated using a variety of different actuators, illustrating the efficacy and versatility of the control approach. It is concluded that the new strategy has promise for this and other nonlinear aerospace applications that place high importance on the attainment of control objectives during transient regimes.

  13. Drug-target interaction prediction from PSSM based evolutionary information.

    PubMed

    Mousavian, Zaynab; Khakabimamaghani, Sahand; Kavousi, Kaveh; Masoudi-Nejad, Ali

    2016-01-01

    The labor-intensive and expensive experimental process of drug-target interaction prediction has motivated many researchers to focus on in silico prediction, which provides helpful information to support the experimental interaction data. Therefore, they have proposed several computational approaches for discovering new drug-target interactions. Several learning-based methods have been increasingly developed, which can be categorized into two main groups: similarity-based and feature-based. In this paper, we firstly use the bi-gram features extracted from the Position Specific Scoring Matrix (PSSM) of proteins in predicting drug-target interactions. Our results demonstrate the high-confidence prediction ability of the Bigram-PSSM model in terms of several performance indicators, specifically for enzymes and ion channels. Moreover, we investigate the impact of the negative selection strategy on the performance of the prediction, which is not widely taken into account in other relevant studies. This is important, as the number of non-interacting drug-target pairs is usually extremely large in comparison with the number of interacting ones in existing drug-target interaction data. An interesting observation is that different levels of performance reduction were attained for four datasets when we changed the sampling method from random sampling to balanced sampling. Copyright © 2015 Elsevier Inc. All rights reserved.
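
    The bi-gram representation referenced above is commonly computed from a PSSM as a 20 x 20 matrix that sums products of scores at consecutive positions; the sketch below uses a random matrix in place of a real PSI-BLAST profile and should be read as an illustration of the feature construction, not of the paper's full pipeline.

        import numpy as np

        def pssm_bigram_features(pssm):
            """pssm: (L, 20) position-specific scoring matrix for a protein of length L."""
            L = pssm.shape[0]
            bigram = np.zeros((20, 20))
            for k in range(L - 1):
                bigram += np.outer(pssm[k], pssm[k + 1])    # products of consecutive-position scores
            return (bigram / (L - 1)).ravel()               # 400-dimensional feature vector

        rng = np.random.default_rng(5)
        toy_pssm = rng.normal(size=(120, 20))               # stand-in for a real profile
        print(pssm_bigram_features(toy_pssm).shape)         # (400,)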

  14. Riometer based Neural Network Prediction of Kp

    NASA Astrophysics Data System (ADS)

    Arnason, K. M.; Spanswick, E.; Chaddock, D.; Tabrizi, A. F.; Behjat, L.

    2017-12-01

    The Canadian Geospace Observatory Riometer Array is a network of 11 wide-beam riometers deployed across Central and Northern Canada. The geographic coverage of the network affords a near continent-scale view of high energy (>30keV) electron precipitation at a very coarse spatial resolution. In this paper we present the first results from a neural network based analysis of riometer data. Trained on decades of riometer data, the neural network is tuned to predict a simple index of global geomagnetic activity (Kp) based solely on the information provided by the high energy electron precipitation over Canada. We present results from various configurations of training and discuss the applicability of this technique for short term prediction of geomagnetic activity.
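
    A minimal sketch of the mapping described above, with synthetic data standing in for the riometer archive, is a small feed-forward network from 11 station absorption values to Kp; the architecture and data are illustrative assumptions.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        n_samples, n_riometers = 5000, 11
        absorption = rng.gamma(2.0, 0.5, size=(n_samples, n_riometers))          # toy absorption (dB)
        kp = np.clip(2 * absorption.mean(axis=1) + rng.normal(0, 0.5, n_samples), 0, 9)

        X_train, X_test, y_train, y_test = train_test_split(absorption, kp, random_state=0)
        net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
        net.fit(X_train, y_train)
        print("R^2 on held-out data:", round(net.score(X_test, y_test), 3))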

  15. Qualitative Beam Profiling of Light Curing Units for Resin Based Composites.

    PubMed

    Haenel, Thomas; Hausnerová, Berenika; Steinhaus, Johannes; Moeginger, Ing Bernhard

    2016-12-01

    This study investigates two technically simple methods to determine the irradiance distribution of light curing units, which governs the performance of visible-light-cured resin-based composites. Insufficient light irradiation leads to under-cured composites with poor mechanical properties and elution of residual monomers. The unknown irradiance distribution and its effect on the final restoration are the main critical issues requiring highly sophisticated experimental equipment. The study shows that irradiance distributions of LCUs can easily be determined qualitatively with generally available equipment. This significantly helps dentists in practice to stay informed about the homogeneity of the curing lights. Copyright© 2016 Dennis Barber Ltd.

  16. Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 4. Operational Description and Qualitative Assessment.

    DOT National Transportation Integrated Search

    1974-02-01

    The volume presents a description of how the Satellite-Based Advanced Air Traffic Management System (SAATMS) operates and a qualitative assessment of the system. The operational description includes the services, functions, and tasks performed by the...

  17. Validity, reliability, and generalizability in qualitative research

    PubMed Central

    Leung, Lawrence

    2015-01-01

    In general practice, qualitative research contributes as significantly as quantitative research, in particular regarding psycho-social aspects of patient-care, health services provision, policy setting, and health administrations. In contrast to quantitative research, qualitative research as a whole has been constantly critiqued, if not disparaged, by the lack of consensus for assessing its quality and robustness. This article illustrates with five published studies how qualitative research can impact and reshape the discipline of primary care, spiraling out from clinic-based health screening to community-based disease monitoring, evaluation of out-of-hours triage services to provincial psychiatric care pathways model and finally, national legislation of core measures for children's healthcare insurance. Fundamental concepts of validity, reliability, and generalizability as applicable to qualitative research are then addressed with an update on the current views and controversies. PMID:26288766

  18. A burnout prediction model based around char morphology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao Wu; Edward Lester; Michael Cloke

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.

  19. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  20. Deep-Learning-Based Drug-Target Interaction Prediction.

    PubMed

    Wen, Ming; Zhang, Zhimin; Niu, Shaoyu; Sha, Haozhi; Yang, Ruihan; Yun, Yonghuan; Lu, Hongmei

    2017-04-07

    Identifying interactions between known drugs and targets is a major challenge in drug repositioning. In silico prediction of drug-target interaction (DTI) can speed up the expensive and time-consuming experimental work by providing the most potent DTIs. In silico prediction of DTI can also provide insights about potential drug-drug interactions and promote the exploration of drug side effects. Traditionally, the performance of DTI prediction depends heavily on the descriptors used to represent the drugs and the target proteins. In this paper, to accurately predict new DTIs between approved drugs and targets without separating the targets into different classes, we developed a deep-learning-based algorithmic framework named DeepDTIs. It first abstracts representations from raw input descriptors using unsupervised pretraining and then applies known label pairs of interaction to build a classification model. Compared with other methods, it is found that DeepDTIs reaches or outperforms other state-of-the-art methods. The DeepDTIs framework can be further used to predict whether a new drug targets some existing targets or whether a new target interacts with some existing drugs.

  1. Theory and interpretation in qualitative studies from general practice: Why and how?

    PubMed

    Malterud, Kirsti

    2016-03-01

    In this article, I want to promote theoretical awareness and commitment among qualitative researchers in general practice and suggest adequate and feasible theoretical approaches. I discuss different theoretical aspects of qualitative research and present the basic foundations of the interpretative paradigm. Associations between paradigms, philosophies, methodologies and methods are examined and different strategies for theoretical commitment presented. Finally, I discuss the impact of theory for interpretation and the development of general practice knowledge. A scientific theory is a consistent and soundly based set of assumptions about a specific aspect of the world, predicting or explaining a phenomenon. Qualitative research is situated in an interpretative paradigm where notions about particular human experiences in context are recognized from different subject positions. Basic theoretical features from the philosophy of science explain why and how this is different from positivism. Reflexivity, including theoretical awareness and consistency, demonstrates interpretative assumptions, accounting for situated knowledge. Different types of theoretical commitment in qualitative analysis are presented, emphasizing substantive theories to sharpen the interpretative focus. Such approaches are clearly within reach for a general practice researcher contributing to clinical practice by doing more than summarizing what the participants talked about, without trying to become a philosopher. Qualitative studies from general practice deserve stronger theoretical awareness and commitment than what is currently established. Persistent attention to and respect for the distinctive domain of knowledge and practice where the research deliveries are targeted is necessary to choose adequate theoretical endeavours. © 2015 the Nordic Societies of Public Health.

  2. A Qualitative Exploration of Community-Based Organization Programs, Resources, and Training to Promote Adolescent Sexual Health

    ERIC Educational Resources Information Center

    McCarthy, Molly A.; Fisher, Christopher M.; Zhou, Junmin; Zhu, He; Pelster, Aja Kneip; Schober, Daniel J.; Baldwin, Kathleen; Fortenberry, J. Dennis; Goldsworthy, Richard

    2015-01-01

    Youth development professionals (YDPs) working at community-based organizations (CBOs) can promote adolescent sexual health through programs. This study explored the programs and resources that youth access at CBOs and training YDPs receive. Twenty-one semi-structured interviews were conducted with YDPs. Qualitative content analyses were conducted…

  3. Predicting Learned Helplessness Based on Personality

    ERIC Educational Resources Information Center

    Maadikhah, Elham; Erfani, Nasrollah

    2014-01-01

    Learned helplessness as a negative motivational state can latently underlie repeated failures and create negative feelings toward the education as well as depression in students and other members of a society. The purpose of this paper is to predict learned helplessness based on students' personality traits. The research is a predictive…

  4. Anthropometric measures in cardiovascular disease prediction: comparison of laboratory-based versus non-laboratory-based model.

    PubMed

    Dhana, Klodian; Ikram, M Arfan; Hofman, Albert; Franco, Oscar H; Kavousi, Maryam

    2015-03-01

    Body mass index (BMI) has been used to simplify cardiovascular risk prediction models by substituting total cholesterol and high-density lipoprotein cholesterol. In the elderly, the ability of BMI as a predictor of cardiovascular disease (CVD) declines. We aimed to find the most predictive anthropometric measure for CVD risk to construct a non-laboratory-based model and to compare it with the model including laboratory measurements. The study included 2675 women and 1902 men aged 55-79 years from the prospective population-based Rotterdam Study. We used Cox proportional hazard regression analysis to evaluate the association of BMI, waist circumference, waist-to-hip ratio and a body shape index (ABSI) with CVD, including coronary heart disease and stroke. The performance of the laboratory-based and non-laboratory-based models was evaluated by studying the discrimination, calibration, correlation and risk agreement. Among men, ABSI was the most informative measure associated with CVD, therefore ABSI was used to construct the non-laboratory-based model. Discrimination of the non-laboratory-based model was not different than laboratory-based model (c-statistic: 0.680-vs-0.683, p=0.71); both models were well calibrated (15.3% observed CVD risk vs 16.9% and 17.0% predicted CVD risks by the non-laboratory-based and laboratory-based models, respectively) and Spearman rank correlation and the agreement between non-laboratory-based and laboratory-based models were 0.89 and 91.7%, respectively. Among women, none of the anthropometric measures were independently associated with CVD. Among middle-aged and elderly where the ability of BMI to predict CVD declines, the non-laboratory-based model, based on ABSI, could predict CVD risk as accurately as the laboratory-based model among men. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
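
    An illustrative Cox proportional-hazards fit for a non-laboratory-based model built around ABSI, analogous in spirit to the analysis above, is sketched below with simulated data; the covariate set, follow-up model and the use of the lifelines library are assumptions of the sketch.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(7)
        n = 2000
        age = rng.uniform(55, 79, n)
        absi = rng.normal(0.08, 0.005, n)
        sbp = rng.normal(140, 20, n)                                   # systolic blood pressure
        risk = 0.03 * (age - 55) + 200 * (absi - 0.08) + 0.01 * (sbp - 140)
        time_to_event = rng.exponential(np.exp(2.5 - risk))            # toy follow-up times (years)
        df = pd.DataFrame({"age": age, "absi": absi, "sbp": sbp,
                           "time": np.minimum(time_to_event, 10),      # censor at 10 years
                           "event": (time_to_event < 10).astype(int)})

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        print(cph.summary[["coef", "exp(coef)", "p"]])
        print("concordance (c-statistic):", round(cph.concordance_index_, 3))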

  5. Teacher Implementation of Reform-Based Mathematics and Implications for Algebra Readiness: A Qualitative Study of 4th Grade Classrooms

    ERIC Educational Resources Information Center

    Sher, Stephen Korb

    2011-01-01

    This study looked at 4th grade classrooms to see "how" teachers implement NCTM standards-based or reform-based mathematics instruction and then analyzed it for the capacity to improve students' "algebra readiness." The qualitative study was based on classroom observations, teacher and administrator interviews, and teacher surveys. The study took…

  6. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, currently only limited published literature discusses which approach is more accurate for risk prediction model development.

  7. Blind test of physics-based prediction of protein structures.

    PubMed

    Shell, M Scott; Ozkan, S Banu; Voelz, Vincent; Wu, Guohong Albert; Dill, Ken A

    2009-02-01

    We report here a multiprotein blind test of a computer method to predict native protein structures based solely on an all-atom physics-based force field. We use the AMBER 96 potential function with an implicit (GB/SA) model of solvation, combined with replica-exchange molecular-dynamics simulations. Coarse conformational sampling is performed using the zipping and assembly method (ZAM), an approach that is designed to mimic the putative physical routes of protein folding. ZAM was applied to the folding of six proteins, from 76 to 112 monomers in length, in CASP7, a community-wide blind test of protein structure prediction. Because these predictions have about the same level of accuracy as typical bioinformatics methods, and do not utilize information from databases of known native structures, this work opens up the possibility of predicting the structures of membrane proteins, synthetic peptides, or other foldable polymers, for which there is little prior knowledge of native structures. This approach may also be useful for predicting physical protein folding routes, non-native conformations, and other physical properties from amino acid sequences.

  8. Blind Test of Physics-Based Prediction of Protein Structures

    PubMed Central

    Shell, M. Scott; Ozkan, S. Banu; Voelz, Vincent; Wu, Guohong Albert; Dill, Ken A.

    2009-01-01

    We report here a multiprotein blind test of a computer method to predict native protein structures based solely on an all-atom physics-based force field. We use the AMBER 96 potential function with an implicit (GB/SA) model of solvation, combined with replica-exchange molecular-dynamics simulations. Coarse conformational sampling is performed using the zipping and assembly method (ZAM), an approach that is designed to mimic the putative physical routes of protein folding. ZAM was applied to the folding of six proteins, from 76 to 112 monomers in length, in CASP7, a community-wide blind test of protein structure prediction. Because these predictions have about the same level of accuracy as typical bioinformatics methods, and do not utilize information from databases of known native structures, this work opens up the possibility of predicting the structures of membrane proteins, synthetic peptides, or other foldable polymers, for which there is little prior knowledge of native structures. This approach may also be useful for predicting physical protein folding routes, non-native conformations, and other physical properties from amino acid sequences. PMID:19186130

  9. Using connectome-based predictive modeling to predict individual behavior from brain connectivity

    PubMed Central

    Shen, Xilin; Finn, Emily S.; Scheinost, Dustin; Rosenberg, Monica D.; Chun, Marvin M.; Papademetris, Xenophon; Constable, R Todd

    2017-01-01

    Neuroimaging is a fast developing research area where anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale datasets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. This protocol includes the following steps: 1) feature selection, 2) feature summarization, 3) model building, and 4) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a significant amount of the variance in these measures. It has been demonstrated that the CPM protocol performs equivalently or better than most of the existing approaches in brain-behavior prediction. However, because CPM focuses on linear modeling and a purely data-driven driven approach, neuroscientists with limited or no experience in machine learning or optimization would find it easy to implement the protocols. Depending on the volume of data to be processed, the protocol can take 10–100 minutes for model building, 1–48 hours for permutation testing, and 10–20 minutes for visualization of results. PMID:28182017
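
    The four CPM steps listed above can be illustrated in a few lines of Python on synthetic connectivity data. This is not the authors' released code: the edge matrix, the behavioral score, and the correlation threshold used for feature selection are all placeholders (the published protocol selects edges by p-value).

        import numpy as np

        rng = np.random.default_rng(0)
        n_subj, n_edges = 100, 300
        edges = rng.normal(size=(n_subj, n_edges))              # one row of edge strengths per subject
        behavior = edges[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=n_subj)

        def edge_behavior_corr(E, b):
            """Pearson r of every edge (column of E) with the behavioral score b."""
            Ec, bc = E - E.mean(axis=0), b - b.mean()
            return Ec.T @ bc / (np.linalg.norm(Ec, axis=0) * np.linalg.norm(bc))

        preds = np.zeros(n_subj)
        for i in range(n_subj):                                 # leave-one-out cross-validation
            train = np.delete(np.arange(n_subj), i)
            # 1) feature selection: edges correlated with behavior in the training subjects
            sel = edge_behavior_corr(edges[train], behavior[train]) > 0.25
            # 2) feature summarization: sum the selected edges into one score per subject
            score_train = edges[train][:, sel].sum(axis=1)
            # 3) model building: linear fit of behavior on the summary score
            slope, intercept = np.polyfit(score_train, behavior[train], 1)
            preds[i] = slope * edges[i, sel].sum() + intercept

        r_obs_pred = np.corrcoef(preds, behavior)[0, 1]
        print("observed vs predicted behavior, r =", round(r_obs_pred, 3))
        # 4) prediction significance: repeat the whole procedure with permuted behavior
        #    scores and compare r_obs_pred with the resulting null distribution.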

  10. CD-Based Indices for Link Prediction in Complex Network.

    PubMed

    Wang, Tao; Wang, Hongjue; Wang, Xiaoxia

    2016-01-01

    Many similarity-based algorithms have been designed to deal with the problem of link prediction in the past decade. In order to improve prediction accuracy, a novel cosine similarity index CD, based on the distance between nodes and the cosine value between vectors, is proposed in this paper. First, a node coordinate matrix is obtained from node distances (distinct from the distance matrix), and the row vectors of this matrix are regarded as the coordinates of the nodes. The cosine value between node coordinates is then used as their similarity index. A local community density index LD is also proposed. A series of CD-based indices, including CD-LD-k, CD*LD-k, CD-k and CDI, are then presented and applied to ten real networks. Experimental results demonstrate the effectiveness of the CD-based indices. The effects of the network clustering coefficient and assortativity coefficient on the prediction accuracy of the indices are analyzed. CD-LD-k and CD*LD-k can improve prediction accuracy regardless of whether the network's assortativity coefficient is negative or positive. According to the analysis of the relative precision of each method on each network, the CD-LD-k and CD*LD-k indices have excellent average performance and robustness. The CD and CD-k indices perform better on positive assortative networks than on negative assortative networks. For negative assortative networks, we improve and refine the CD index into the CDI index, combining the advantages of the CD index and the evolutionary mechanism of the BA network model. Experimental results reveal that the CDI index increases the prediction accuracy of CD on negative assortative networks.
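
    A small Python sketch of the core CD idea, assuming the networkx library and a toy graph: node "coordinates" are taken here as rows of the shortest-path distance matrix (a simplification of the coordinate construction described above), and candidate links are ranked by the cosine of the angle between coordinate vectors. The LD, CD-LD-k and CDI refinements are not reproduced.

        import networkx as nx
        import numpy as np

        G = nx.karate_club_graph()                       # small example network
        nodes = list(G.nodes())
        # Node "coordinate" vectors: rows of the shortest-path distance matrix.
        dist = dict(nx.all_pairs_shortest_path_length(G))
        coords = np.array([[dist[u][v] for v in nodes] for u in nodes], dtype=float)

        def cd_score(i, j):
            """Cosine similarity between the coordinate vectors of nodes i and j."""
            a, b = coords[i], coords[j]
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Rank currently unconnected pairs by CD score (higher = more likely future link).
        candidates = [(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
                      if not G.has_edge(nodes[i], nodes[j])]
        top = sorted(candidates, key=lambda p: cd_score(*p), reverse=True)[:5]
        print("top predicted links:", [(nodes[i], nodes[j]) for i, j in top])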

  11. CD-Based Indices for Link Prediction in Complex Network

    PubMed Central

    Wang, Tao; Wang, Hongjue; Wang, Xiaoxia

    2016-01-01

    Many similarity-based algorithms have been designed to deal with the problem of link prediction in the past decade. In order to improve prediction accuracy, a novel cosine similarity index CD, based on the distance between nodes and the cosine value between vectors, is proposed in this paper. First, a node coordinate matrix is obtained from node distances (distinct from the distance matrix), and the row vectors of this matrix are regarded as the coordinates of the nodes. The cosine value between node coordinates is then used as their similarity index. A local community density index LD is also proposed. A series of CD-based indices, including CD-LD-k, CD*LD-k, CD-k and CDI, are then presented and applied to ten real networks. Experimental results demonstrate the effectiveness of the CD-based indices. The effects of the network clustering coefficient and assortativity coefficient on the prediction accuracy of the indices are analyzed. CD-LD-k and CD*LD-k can improve prediction accuracy regardless of whether the network's assortativity coefficient is negative or positive. According to the analysis of the relative precision of each method on each network, the CD-LD-k and CD*LD-k indices have excellent average performance and robustness. The CD and CD-k indices perform better on positive assortative networks than on negative assortative networks. For negative assortative networks, we improve and refine the CD index into the CDI index, combining the advantages of the CD index and the evolutionary mechanism of the BA network model. Experimental results reveal that the CDI index increases the prediction accuracy of CD on negative assortative networks. PMID:26752405

  12. Microarray-based cancer prediction using soft computing approach.

    PubMed

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or ten thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involved in single genes or gene pairs on the basis of soft computing approach and rough set theory. Accurate cancerous prediction is obtained when we apply the simple prediction models for four cancerous gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable for they are based on decision rules. Our results demonstrate that very simple models may perform well on cancerous molecular prediction and important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.

  13. Evaluation of predictive capacities of biomarkers based on research synthesis.

    PubMed

    Hattori, Satoshi; Zhou, Xiao-Hua

    2016-11-10

    The objective of diagnostic or prognostic studies is to evaluate and compare the predictive capacities of biomarkers. Suppose we are interested in the evaluation and comparison of the predictive capacities of continuous biomarkers for a binary outcome based on research synthesis. In the analysis of each study, subjects are often classified into high-expression and low-expression groups according to a cut-off value, and statistical analysis is based on a 2 × 2 table defined by the response and the high or low expression of the biomarker. Because the cut-off is study specific, it is difficult to interpret a combined summary measure such as an odds ratio based on standard meta-analysis techniques. The summary receiver operating characteristic curve is a useful method for meta-analysis of diagnostic studies in the presence of heterogeneity of cut-off values to examine the discriminative capacities of biomarkers. We develop a method to estimate positive or negative predictive curves, which are alternatives to the receiver operating characteristic curve, based on information reported in the published papers of each study. These predictive curves provide a useful graphical presentation of pairs of positive and negative predictive values and allow us to compare the predictive capacities of biomarkers of different scales in the presence of heterogeneity in cut-off values among studies. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency.

    PubMed

    LaRue, Nicole; Kahn, Maria; Murray, Marjorie; Leader, Brandon T; Bansil, Pooja; McGray, Sarah; Kalnoky, Michael; Zhang, Hao; Huang, Huiqiang; Jiang, Hui; Domingo, Gonzalo J

    2014-10-01

    A barrier to eliminating Plasmodium vivax malaria is inadequate treatment of infected patients. 8-Aminoquinoline-based drugs clear the parasite; however, people with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk for hemolysis from these drugs. Understanding the performance of G6PD deficiency tests is critical for patient safety. Two quantitative assays and two qualitative tests were evaluated. The comparison of quantitative assays gave a Pearson correlation coefficient of 0.7585 with significant difference in mean G6PD activity, highlighting the need to adhere to a single reference assay. Both qualitative tests had high sensitivity and negative predictive value at a cutoff G6PD value of 40% of normal activity if interpreted conservatively and performed under laboratory conditions. The performance of both tests dropped at a cutoff level of 45%. Cytochemical staining of specimens confirmed that heterozygous females with > 50% G6PD-deficient cells can seem normal by phenotypic tests. © The American Society of Tropical Medicine and Hygiene.

  15. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing the quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from the integration of different theoretical and methodological paradigms.

  16. Predicting quantitative and qualitative values of recreation participation

    Treesearch

    Elwood L., Jr. Shafer; George Moeller

    1971-01-01

    If future recreation consumption and associated intangible values can be predicted, the problem of rapid decision making in recreation-resource management can be reduced, and the problems of implementing those decisions can be anticipated. Management and research responsibilities for meeting recreation demand are discussed, and proved methods for forecasting recreation...

  17. Prediction of hot regions in protein-protein interaction by combining density-based incremental clustering with feature-based classification.

    PubMed

    Hu, Jing; Zhang, Xiaolong; Liu, Xiaoming; Tang, Jinshan

    2015-06-01

    Discovering hot regions in protein-protein interaction is important for drug and protein design, while experimental identification of hot regions is a time-consuming and labor-intensive effort; thus, the development of predictive models can be very helpful. In hot region prediction research, some models are based on structure information, and others are based on a protein interaction network. However, the prediction accuracy of these methods can still be improved. In this paper, a new method is proposed for hot region prediction, which combines density-based incremental clustering with feature-based classification. The method uses density-based incremental clustering to obtain rough hot regions, and uses feature-based classification to remove the non-hot spot residues from the rough hot regions. Experimental results show that the proposed method significantly improves the prediction performance of hot regions. Copyright © 2015 Elsevier Ltd. All rights reserved.
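
    The two-stage pipeline described above can be sketched in Python with scikit-learn. This is an illustrative stand-in, not the authors' method: DBSCAN replaces the paper's density-based incremental clustering, a random forest stands in for the feature-based classifier, and the coordinates, features, and labels are random placeholders.

        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        coords = rng.uniform(0, 50, size=(200, 3))      # placeholder interface-residue coordinates (A)
        features = rng.normal(size=(200, 6))            # placeholder residue features (ASA, conservation, ...)
        is_hot = rng.integers(0, 2, size=200)           # placeholder hot-spot training labels

        # Step 1: density-based clustering of interface residues -> rough hot regions.
        labels = DBSCAN(eps=6.0, min_samples=4).fit_predict(coords)

        # Step 2: feature-based classification to drop non-hot-spot residues from each rough region.
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(features, is_hot)
        keep = clf.predict(features).astype(bool)

        hot_regions = {}
        for cluster_id in set(labels) - {-1}:            # -1 marks DBSCAN noise points
            members = np.where((labels == cluster_id) & keep)[0]
            if len(members) >= 3:                        # keep regions retaining at least a few hot spots
                hot_regions[cluster_id] = members.tolist()
        print("predicted hot regions:", hot_regions)

    Because the labels here are random, the output is only a demonstration of the pipeline's structure, not a meaningful prediction.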

  18. Effects of urban microcellular environments on ray-tracing-based coverage predictions.

    PubMed

    Liu, Zhongyu; Guo, Lixin; Guan, Xiaowei; Sun, Jiejing

    2016-09-01

    The ray-tracing (RT) algorithm, which is based on geometrical optics and the uniform theory of diffraction, has become a typical deterministic approach of studying wave-propagation characteristics. Under urban microcellular environments, the RT method highly depends on detailed environmental information. The aim of this paper is to provide help in selecting the appropriate level of accuracy required in building databases to achieve good tradeoffs between database costs and prediction accuracy. After familiarization with the operating procedures of the RT-based prediction model, this study focuses on the effect of errors in environmental information on prediction results. The environmental information consists of two parts, namely, geometric and electrical parameters. The geometric information can be obtained from a digital map of a city. To study the effects of inaccuracies in geometry information (building layout) on RT-based coverage prediction, two different artificial erroneous maps are generated based on the original digital map, and systematic analysis is performed by comparing the predictions with the erroneous maps and measurements or the predictions with the original digital map. To make the conclusion more persuasive, the influence of random errors on RMS delay spread results is investigated. Furthermore, given the electrical parameters' effect on the accuracy of the predicted results of the RT model, the dielectric constant and conductivity of building materials are set with different values. The path loss and RMS delay spread under the same circumstances are simulated by the RT prediction model.

  19. Streamflow Prediction based on Chaos Theory

    NASA Astrophysics Data System (ADS)

    Li, X.; Wang, X.; Babovic, V. M.

    2015-12-01

    Chaos theory is a popular method in hydrologic time series prediction. The local model (LM) based on this theory uses time-delay embedding to reconstruct the phase-space diagram. The efficacy of this method depends on the embedding parameters, i.e., the embedding dimension, time lag, and number of nearest neighbors, so optimal estimation of these parameters is critical to the application of the local model. However, these embedding parameters are conventionally estimated separately, using Average Mutual Information (AMI) and False Nearest Neighbors (FNN). This may lead to local optimization and thus limits prediction accuracy. Considering these limitations, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters. It is also compared with another global optimization approach, the Genetic Algorithm (GA). These proposed hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization allows the local model to provide more accurate predictions than local optimization. The LM combined with SA shows additional advantages in terms of computational efficiency. The proposed scheme can also be applied to other fields such as prediction of hydro-climatic time series, error correction, etc.
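
    A compact Python sketch of the idea, under stated assumptions: a synthetic series stands in for streamflow, the local model is a nearest-neighbour predictor in the time-delay embedded phase space, and a small hand-written simulated-annealing loop searches jointly over the embedding dimension, time lag, and neighbour count instead of estimating them separately with AMI/FNN.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.sin(0.3 * np.arange(600)) + 0.1 * rng.normal(size=600)   # synthetic "streamflow" series

        def local_model_error(series, m, tau, k, horizon=1):
            """One-step local (nearest-neighbour) prediction error for embedding (m, tau, k)."""
            n = len(series) - (m - 1) * tau - horizon
            if n < 50 or k < 1:
                return np.inf
            emb = np.array([series[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
            target = series[(m - 1) * tau + horizon:(m - 1) * tau + horizon + n]
            split = int(0.7 * n)
            errs = []
            for i in range(split, n):
                d = np.linalg.norm(emb[:split] - emb[i], axis=1)
                nbrs = np.argsort(d)[:k]                 # k nearest neighbours in the training segment
                errs.append((target[nbrs].mean() - target[i]) ** 2)
            return float(np.mean(errs))

        # Simulated annealing over (m, tau, k) to minimise prediction error globally.
        state = (3, 2, 5)
        energy = local_model_error(x, *state)
        best = (energy, state)
        T = 1.0
        for step in range(300):
            cand = tuple(max(1, s + rng.integers(-1, 2)) for s in state)   # small random move
            e = local_model_error(x, *cand)
            if e < energy or rng.random() < np.exp((energy - e) / T):      # accept downhill, or uphill with prob.
                state, energy = cand, e
            best = min(best, (energy, state))
            T *= 0.98                                                      # cooling schedule
        print("best (m, tau, k):", best[1], "one-step MSE:", round(best[0], 5))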

  20. "Phenomenology" and qualitative research methods.

    PubMed

    Nakayama, Y

    1994-01-01

    Phenomenology is generally based on phenomenological tradition from Husserl to Heidegger and Merleau-Ponty. As philosophical stances provide the assumptions in research methods, different philosophical stances produce different methods. However, the term "phenomenology" is used in various ways without the definition being given, such as phenomenological approach, phenomenological method, phenomenological research, etc. The term "phenomenology" is sometimes used as a paradigm and it is sometimes even viewed as synonymous with qualitative methods. As a result, the term "phenomenology" leads to conceptual confusions in qualitative research methods. The purpose of this paper is to examine the term "phenomenology" and explore philosophical assumptions, and discuss the relationship between philosophical stance and phenomenology as a qualitative research method in nursing.

  1. Route Prediction on Tracking Data to Location-Based Services

    NASA Astrophysics Data System (ADS)

    Petróczi, Attila István; Gáspár-Papanek, Csaba

    Wireless networks have become so widespread that it is worthwhile to determine the ability of cellular networks to support localization. This property enables the development of location-based services that provide useful information to users. These services can be improved by route prediction, provided that simple algorithms are used because of the limited capabilities of mobile stations. This study gives alternative solutions to the route prediction problem based on a specific graph model. Our models provide the opportunity to reach destinations with less effort.

  2. A state-based probabilistic model for tumor respiratory motion prediction

    NASA Astrophysics Data System (ADS)

    Kalet, Alan; Sandison, George; Wu, Huanmei; Schmitz, Ruth

    2010-12-01

    This work proposes a new probabilistic mathematical model for predicting tumor motion and position based on a finite state representation using the natural breathing states of exhale, inhale and end of exhale. Tumor motion was broken down into linear breathing states and sequences of states. Breathing state sequences and the observables representing those sequences were analyzed using a hidden Markov model (HMM) to predict the future sequences and new observables. Velocities and other parameters were clustered using a k-means clustering algorithm to associate each state with a set of observables such that a prediction of state also enables a prediction of tumor velocity. A time average model with predictions based on average past state lengths was also computed. State sequences which are known a priori to fit the data were fed into the HMM algorithm to set a theoretical limit of the predictive power of the model. The effectiveness of the presented probabilistic model has been evaluated for gated radiation therapy based on previously tracked tumor motion in four lung cancer patients. Positional prediction accuracy is compared with actual position in terms of the overall RMS errors. Various system delays, ranging from 33 to 1000 ms, were tested. Previous studies have shown duty cycles for latencies of 33 and 200 ms at around 90% and 80%, respectively, for linear, no prediction, Kalman filter and ANN methods as averaged over multiple patients. At 1000 ms, the previously reported duty cycles range from approximately 62% (ANN) down to 34% (no prediction). Average duty cycle for the HMM method was found to be 100% and 91 ± 3% for 33 and 200 ms latency and around 40% for 1000 ms latency in three out of four breathing motion traces. RMS errors were found to be lower than linear and no prediction methods at latencies of 1000 ms. The results show that for system latencies longer than 400 ms, the time average HMM prediction outperforms linear, no prediction, and the more

  3. Using a Play-Based Methodology in Qualitative Research: A Case of Using Social Board to Examine School Climate

    ERIC Educational Resources Information Center

    Mankowska, Anna

    2016-01-01

    Little, if any, examination of using play-based tools to examine children's opinions in research exists in the current literature. Therefore, this paper is meant to address that gap within the literature and showcase the study about the use of a specific play-based methodological tool in qualitative research. This specific tool called social board…

  4. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  5. Dialog on a country path: the qualitative research journey.

    PubMed

    Sorrell, Jeanne M; Cangelosi, Pamela R; Dinkins, Christine S

    2014-03-01

    There is little information in the literature describing how students learn qualitative research. This article describes an approach to learning that is based on the pedagogical approach of Dinkins' Socratic-Hermeneutic Shared Inquiry. This approach integrates shared dialog as an essential aspect of learning. The qualitative pedagogy described in this article focused on three questions: What is knowing in qualitative research? How do we come to know qualitative research? What can we do with qualitative research? Students learned the basics of qualitative research within a context that fostered interpretive inquiry. In this way, the course framework mirrored the combination of interviewing, storytelling, and journeying toward understanding that constitute qualitative research. © 2013.

  6. Literature-based condition-specific miRNA-mRNA target prediction.

    PubMed

    Oh, Minsik; Rhee, Sungmin; Moon, Ji Hwan; Chae, Heejoon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun

    2017-01-01

    miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target predictions and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target predictions. A typical strategy to utilize expression data is to leverage the negative control roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but in this case, these methods tend to reject many true target relationships, i.e., false negatives. To overcome these limitations, additional information should be utilized. The literature is probably the best resource that we can utilize. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA-targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test on whether prediction methods can re-produce experimentally validated target relationships, Context-MMIA outperformed the four existing target prediction methods. In summary

  7. Connecting clinical and actuarial prediction with rule-based methods.

    PubMed

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved.
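
    The contrast drawn above can be illustrated with scikit-learn on synthetic data. This is not RuleFit itself: a depth-2 decision tree, printed as if-then rules, stands in for a compact rule-based model that evaluates only a few cues, and it is compared against a logistic regression using all 20 predictors.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import accuracy_score

        X, y = make_classification(n_samples=1000, n_features=20, n_informative=4, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        # Compact rule-based model: a depth-2 tree, i.e. at most 2 cues per prediction path.
        rules = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_tr, y_tr)
        # Traditional actuarial model: logistic regression on all 20 predictors.
        logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

        print(export_text(rules))   # the decision rules, readable as a fast-and-frugal tree
        print("rule-based accuracy:", round(accuracy_score(y_te, rules.predict(X_te)), 3))
        print("logistic accuracy:  ", round(accuracy_score(y_te, logit.predict(X_te)), 3))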

  8. Prediction of blast-induced air overpressure: a hybrid AI-based predictive model.

    PubMed

    Jahed Armaghani, Danial; Hajihassani, Mohsen; Marto, Aminaton; Shirani Faradonbeh, Roohollah; Mohamad, Edy Tonnizam

    2015-11-01

    Blast operations in the vicinity of residential areas usually produce significant environmental problems which may cause severe damage to the nearby areas. Blast-induced air overpressure (AOp) is one of the most important environmental impacts of blast operations and needs to be predicted to minimize the potential risk of damage. This paper presents an artificial neural network (ANN) optimized by the imperialist competitive algorithm (ICA) for the prediction of AOp induced by quarry blasting. For this purpose, 95 blasting operations were precisely monitored in a granite quarry site in Malaysia and AOp values were recorded in each operation. Furthermore, the most influential parameters on AOp, namely the maximum charge per delay and the distance between the blast face and the monitoring point, were measured and used to train the ICA-ANN model. Based on the generalized predictor equation and considering the measured data from the granite quarry site, a new empirical equation was developed to predict AOp. For comparison purposes, conventional ANN models were developed and compared with the ICA-ANN results. The results demonstrated that the proposed ICA-ANN model is able to predict blast-induced AOp more accurately than the other presented techniques.
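
    A hedged sketch of the ANN part only (the imperialist competitive algorithm used to tune the network is omitted): a small multilayer perceptron maps the two inputs named above to AOp, using scikit-learn on synthetic data whose values are placeholders, not the Malaysian quarry measurements.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        charge = rng.uniform(50, 500, 95)              # max charge per delay (kg), synthetic
        distance = rng.uniform(100, 1000, 95)          # blast face to monitor distance (m), synthetic
        # Synthetic AOp (dB) decreasing with scaled distance D / Q^(1/3), plus noise.
        aop = 180 - 25 * np.log10(distance / charge ** (1 / 3)) + rng.normal(0, 1.5, 95)

        X = np.column_stack([charge, distance])
        X_tr, X_te, y_tr, y_te = train_test_split(X, aop, test_size=0.25, random_state=0)

        # Standard gradient-based training stands in here for the ICA weight optimization.
        ann = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0))
        ann.fit(X_tr, y_tr)
        print("R^2 on held-out blasts:", round(ann.score(X_te, y_te), 3))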

  9. SeqRate: sequence-based protein folding type classification and rates prediction

    PubMed Central

    2010-01-01

    Background Protein folding rate is an important property of a protein. Predicting protein folding rate is useful for understanding protein folding process and guiding protein design. Most previous methods of predicting protein folding rate require the tertiary structure of a protein as an input. And most methods do not distinguish the different kinetic nature (two-state folding or multi-state folding) of the proteins. Here we developed a method, SeqRate, to predict both protein folding kinetic type (two-state versus multi-state) and real-value folding rate using sequence length, amino acid composition, contact order, contact number, and secondary structure information predicted from only protein sequence with support vector machines. Results We systematically studied the contributions of individual features to folding rate prediction. On a standard benchmark dataset, the accuracy of folding kinetic type classification is 80%. The Pearson correlation coefficient and the mean absolute difference between predicted and experimental folding rates (sec^-1) in the base-10 logarithmic scale are 0.81 and 0.79 for two-state protein folders, and 0.80 and 0.68 for three-state protein folders. SeqRate is the first sequence-based method for protein folding type classification and its accuracy of fold rate prediction is improved over previous sequence-based methods. Its performance can be further enhanced with additional information, such as structure-based geometric contacts, as inputs. Conclusions Both the web server and software of predicting folding rate are publicly available at http://casp.rnet.missouri.edu/fold_rate/index.html. PMID:20438647
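
    The two tasks described, a support vector classifier for folding kinetic type and support vector regression for the log folding rate, can be sketched with scikit-learn. The feature matrix and labels below are random placeholders for the sequence-derived features named above, so the cross-validated scores are near chance; with real features they would correspond to the accuracy and correlation reported in the abstract.

        import numpy as np
        from sklearn.svm import SVC, SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Placeholder sequence-derived features: length, composition summaries,
        # predicted contact order, contact number, secondary-structure fractions.
        X = rng.normal(size=(80, 6))
        kinetic_type = rng.integers(0, 2, size=80)               # 0 = two-state, 1 = multi-state (placeholder)
        log10_rate = rng.normal(loc=2.0, scale=1.5, size=80)     # placeholder log10 folding rates

        clf = SVC(kernel="rbf", C=1.0)
        acc = cross_val_score(clf, X, kinetic_type, cv=5).mean()
        print("folding-type classification accuracy (CV):", round(acc, 3))

        reg = SVR(kernel="rbf", C=1.0)
        r2 = cross_val_score(reg, X, log10_rate, cv=5, scoring="r2").mean()
        print("folding-rate regression R^2 (CV):", round(r2, 3))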

  10. Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.

    ERIC Educational Resources Information Center

    Schiano, Diane J.; And Others

    Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…

  11. Quantitative computed tomography-based predictions of vertebral strength in anterior bending.

    PubMed

    Buckley, Jenni M; Cheng, Liu; Loo, Kenneth; Slyfield, Craig; Xu, Zheng

    2007-04-20

    This study examined the ability of QCT-based structural assessment techniques to predict vertebral strength in anterior bending. The purpose of this study was to compare the abilities of QCT-based bone mineral density (BMD), mechanics of solids models (MOS), e.g., bending rigidity, and finite element analyses (FE) to predict the strength of isolated vertebral bodies under anterior bending boundary conditions. Although the relative performance of QCT-based structural measures is well established for uniform compression, the ability of these techniques to predict vertebral strength under nonuniform loading conditions has not yet been established. Thirty human thoracic vertebrae from 30 donors (T9-T10, 20 female, 10 male; 87 +/- 5 years of age) were QCT scanned and destructively tested in anterior bending using an industrial robot arm. The QCT scans were processed to generate specimen-specific FE models as well as trabecular bone mineral density (tBMD), integral bone mineral density (iBMD), and MOS measures, such as axial and bending rigidities. Vertebral strength in anterior bending was poorly to moderately predicted by QCT-based BMD and MOS measures (R2 = 0.14-0.22). QCT-based FE models were better strength predictors (R2 = 0.34-0.40); however, their predictive performance was not statistically different from MOS bending rigidity (P > 0.05). Our results suggest that the poor clinical performance of noninvasive structural measures may be due to their inability to predict vertebral strength under bending loads. While their performance was not statistically better than MOS bending rigidities, QCT-based FE models were moderate predictors of both compressive and bending loads at failure, suggesting that this technique has the potential for strength prediction under nonuniform loads. The current FE modeling strategy is insufficient, however, and significant modifications must be made to better mimic whole bone elastic and inelastic material behavior.

  12. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know exactly the state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, accumulating additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we are not only able to reduce the computational load but also estimate the bounds of uncertainty in a deterministic manner, which can be useful to consider during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
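
    A minimal sketch of the core mechanism, the unscented transformation: a small, deterministic set of sigma points is propagated through a nonlinear EOL function instead of drawing Monte Carlo samples, recovering the mean and spread of the predicted EOL. The toy degradation model and state values below are hypothetical, not the paper's battery model.

        import numpy as np

        def sigma_points(mean, cov, alpha=0.1, beta=2.0, kappa=0.0):
            """Standard unscented-transform sigma points and weights for an n-dim Gaussian."""
            n = len(mean)
            lam = alpha ** 2 * (n + kappa) - n
            S = np.linalg.cholesky((n + lam) * cov)          # matrix square root, scaled
            pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
            wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
            wc = wm.copy()
            wm[0] = lam / (n + lam)
            wc[0] = lam / (n + lam) + (1 - alpha ** 2 + beta)
            return np.array(pts), wm, wc

        def end_of_life(state):
            """Toy degradation model: cycles until capacity decays below a threshold (hypothetical)."""
            capacity, fade_per_cycle = state
            return (capacity - 0.7) / max(fade_per_cycle, 1e-6)

        mean = np.array([1.0, 0.002])                        # current capacity and fade rate estimates
        cov = np.diag([0.01 ** 2, 0.0005 ** 2])              # uncertainty in that state estimate
        pts, wm, wc = sigma_points(mean, cov)
        eol = np.array([end_of_life(p) for p in pts])        # propagate each sigma point
        eol_mean = wm @ eol
        eol_std = np.sqrt(wc @ (eol - eol_mean) ** 2)
        print("predicted EOL:", round(eol_mean, 1), "cycles +/-", round(eol_std, 1))

    Only 2n+1 function evaluations are needed for an n-dimensional state, which is where the reduction in computational load relative to sampling-based uncertainty propagation comes from.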

  13. Predicting flight delay based on multiple linear regression

    NASA Astrophysics Data System (ADS)

    Ding, Yi

    2017-08-01

    Flight delay has been regarded as one of the toughest difficulties in aviation control, and establishing an effective model to handle the delay prediction problem is significant work. To address the difficulty of predicting flight delay, this study proposes a method to model arriving flights and a multiple linear regression algorithm to predict delay, compared with the Naive-Bayes and C4.5 approaches. Experiments based on a realistic dataset of domestic airports show that the accuracy of the proposed model approximates 80%, an improvement over the Naive-Bayes and C4.5 approaches. Testing also shows that the method is computationally convenient and can predict flight delays effectively. It can provide a decision basis for airport authorities.
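
    A minimal sketch of the multiple-linear-regression step with scikit-learn; the flight features and the synthetic delay below are placeholders, not the domestic-airport dataset used in the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        # Placeholder features for arriving flights (all synthetic).
        X = np.column_stack([
            rng.integers(0, 24, n),            # scheduled arrival hour
            rng.integers(0, 10, n),            # airline (encoded)
            rng.uniform(200, 2000, n),         # route distance (km)
            rng.uniform(0, 1, n),              # weather severity index
            rng.exponential(10, n),            # previous-leg delay (min)
        ])
        delay = 0.8 * X[:, 4] + 15 * X[:, 3] + rng.normal(0, 5, n)   # synthetic delay (min)

        X_tr, X_te, y_tr, y_te = train_test_split(X, delay, test_size=0.3, random_state=0)
        model = LinearRegression().fit(X_tr, y_tr)
        print("R^2 on held-out flights:", round(model.score(X_te, y_te), 3))
        print("coefficients:", np.round(model.coef_, 2))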

  14. Prediction of aerodynamic noise in a ring fan based on wake characteristics

    NASA Astrophysics Data System (ADS)

    Sasaki, Soichi; Fukuda, Masaharu; Tsujino, Masao; Tsubota, Haruhiro

    2011-06-01

    A ring fan is a propeller fan that applies an axial-flow impeller with a ring-shaped shroud on the blade tip side. In this study, the entire flow field of the ring fan is simulated using computational fluid dynamics (CFD); the accuracy of the CFD is verified through a comparison with the aerodynamic characteristics of a current-model propeller fan. Moreover, the aerodynamic noise generated by the fan is predicted on the basis of the wake characteristics. The aerodynamic characteristics of the ring fan computed with CFD qualitatively reproduce the variation in the measured values. The main flow domain of the ring fan is formed at the tip side of the blade because a blade tip vortex is not formed at that location. Therefore, the relative velocity of the ring fan is increased by the circumferential velocity. The sound pressure levels of the ring fan within the frequency band below 200 Hz are larger than those of the propeller fan. The analysis of the wake characteristics revealed that Karman vortex shedding occurred in the main flow domain at frequencies below 200 Hz; the aerodynamic noise of the ring fan at the vortex shedding frequency is enlarged by the increases in relative velocity and velocity fluctuation.

  15. Brain Lesions among Orally Fed and Gastrostomy-Fed Dysphagic Preterm Infants: Can Routine Qualitative or Volumetric Quantitative Magnetic Resonance Imaging Predict Feeding Outcomes?

    PubMed

    Kashou, Nasser H; Dar, Irfaan A; El-Mahdy, Mohamed A; Pluto, Charles; Smith, Mark; Gulati, Ish K; Lo, Warren; Jadcherla, Sudarshan R

    2017-01-01

    The usefulness of qualitative or quantitative volumetric magnetic resonance imaging (MRI) in the early detection of brain structural changes and prediction of adverse outcomes in neonatal illnesses warrants further investigation. Our aim was to correlate certain brain injuries and the brain volume of feeding-related cortical and subcortical regions with feeding method at discharge among preterm dysphagic infants. Using a retrospective observational study design, we examined MRI data from 43 (22 male; born at 31.5 ± 0.8 weeks gestation) infants who went home on oral feeding or gastrostomy feeding (G-tube). MRI scans were segmented, volumes of the brainstem, cerebellum, cerebrum, basal ganglia, thalamus, and vermis were quantified, and correlations were made with discharge feeding outcomes. Chi-squared tests were used to evaluate MRI findings vs. feeding outcomes. ANCOVA was performed on the regression model to measure the association of maturity and brain volume between groups. Of the 43 infants, 44% were orally fed and 56% were G-tube fed at hospital discharge (but not at the time of the study). There was no relationship between qualitative brain lesions and feeding outcomes. Volumetric analysis revealed that cerebellar volume was greater (p < 0.05) in G-tube-fed infants, whereas cerebral volume was greater (p < 0.05) in orally fed infants. Other brain regions did not show volumetric differences between groups. This study concludes that neither qualitative nor quantitative volumetric MRI findings correlate with feeding outcomes. Understanding the complexity of swallowing and feeding difficulties in infants warrants a comprehensive and in-depth functional neurological assessment.

  16. Patient Similarity in Prediction Models Based on Health Data: A Scoping Review

    PubMed Central

    Sharafoddini, Anis; Dubin, Joel A

    2017-01-01

    Background Physicians and health policy makers are required to make predictions during their decision making in various medical problems. Many advances have been made in predictive modeling toward outcome prediction, but these innovations target an average patient and are insufficiently adjustable for individual patients. One developing idea in this field is individualized predictive analytics based on patient similarity. The goal of this approach is to identify patients who are similar to an index patient and derive insights from the records of similar patients to provide personalized predictions.. Objective The aim is to summarize and review published studies describing computer-based approaches for predicting patients’ future health status based on health data and patient similarity, identify gaps, and provide a starting point for related future research. Methods The method involved (1) conducting the review by performing automated searches in Scopus, PubMed, and ISI Web of Science, selecting relevant studies by first screening titles and abstracts then analyzing full-texts, and (2) documenting by extracting publication details and information on context, predictors, missing data, modeling algorithm, outcome, and evaluation methods into a matrix table, synthesizing data, and reporting results. Results After duplicate removal, 1339 articles were screened in abstracts and titles and 67 were selected for full-text review. In total, 22 articles met the inclusion criteria. Within included articles, hospitals were the main source of data (n=10). Cardiovascular disease (n=7) and diabetes (n=4) were the dominant patient diseases. Most studies (n=18) used neighborhood-based approaches in devising prediction models. Two studies showed that patient similarity-based modeling outperformed population-based predictive methods. Conclusions Interest in patient similarity-based predictive modeling for diagnosis and prognosis has been growing. In addition to raw/coded health

  17. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  18. Barriers to Implementing Evidence-Based Intrapartum Care: A Descriptive Exploratory Qualitative Study

    PubMed Central

    Iravani, Mina; Janghorbani, Mohsen; Zarean, Ellahe; Bahrami, Masod

    2016-01-01

    Background: Evidence based practice is an effective strategy to improve the quality of obstetric care. Identification of barriers to the adaptation of evidence-based intrapartum care is necessary and crucial for delivering high quality care to parturient women. Objectives: The current study aimed to explore barriers to the adaptation of evidence-based intrapartum care from the perspective of the clinical groups that provide obstetric care in Iran. Materials and Methods: This descriptive exploratory qualitative research was conducted from 2013 to 2014 in fourteen state medical training centers in Iran. Participants were selected from midwives, specialists, and residents of obstetrics and gynecology, using purposive sampling and the snowball method. Data were collected through face-to-face semi-structured in-depth interviews and analyzed according to conventional content analysis. Results: Data analysis identified twenty subcategories and four main categories. The main categories of barriers related to laboring women, care providers, the organizational environment, and the health system. Conclusions: The adoption of evidence-based intrapartum care is a complex process. In this regard, identifying potential barriers is the first step in determining and applying effective strategies to encourage compliance with evidence-based obstetric care and improve maternity care quality. PMID:27175303

  19. Barriers to Implementing Evidence-Based Intrapartum Care: A Descriptive Exploratory Qualitative Study.

    PubMed

    Iravani, Mina; Janghorbani, Mohsen; Zarean, Ellahe; Bahrami, Masod

    2016-02-01

    Evidence based practice is an effective strategy to improve the quality of obstetric care. Identification of barriers to the adaptation of evidence-based intrapartum care is necessary and crucial for delivering high quality care to parturient women. The current study aimed to explore barriers to the adaptation of evidence-based intrapartum care from the perspective of the clinical groups that provide obstetric care in Iran. This descriptive exploratory qualitative research was conducted from 2013 to 2014 in fourteen state medical training centers in Iran. Participants were selected from midwives, specialists, and residents of obstetrics and gynecology, using purposive sampling and the snowball method. Data were collected through face-to-face semi-structured in-depth interviews and analyzed according to conventional content analysis. Data analysis identified twenty subcategories and four main categories. The main categories of barriers related to laboring women, care providers, the organizational environment, and the health system. The adoption of evidence-based intrapartum care is a complex process. In this regard, identifying potential barriers is the first step in determining and applying effective strategies to encourage compliance with evidence-based obstetric care and improve maternity care quality.

  20. [Obstacles perceived by nurses for evidence-based practice: a qualitative study].

    PubMed

    Sánchez-García, Inmaculada; López-Medina, Isabel M; Pancorbo-Hidalgo, Pedro L

    2013-01-01

    To examine the obstacles perceived by nurses to implementing evidence-based clinical practice. A qualitative study through semi-structured interviews conducted in 2010-2011, including 11 nurses purposively selected from public hospitals and community centres in Jaén and Córdoba (Spain). A content analysis was performed, using Miles and Huberman as a reference and comprising the following steps: data reduction, data presentation, and conclusion drawing/verification. Data saturation was reached for these categories (obstacles). The obstacles perceived by nurses to introducing evidence-based clinical practice (EBCP) were grouped into 3 major categories: obstacles related to professionals (routine-based practice, unwilling and stagnant attitudes, and lack of training in EBCP), obstacles related to the social context (reluctance from other professionals and from patients or families), and obstacles related to the organization (obsolete cultures that do not promote innovation in nursing care). This study highlights the persistence of various factors that hinder the use of research findings in clinical practice. The results underline the need to change the culture of healthcare organizations, to motivate professionals, and to break down some of the resistant attitudes that hinder the implementation of evidence-based practice. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  1. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The regression equation generated from the CEUS qualitative indicators contained three indicators, namely enhancement homogeneity, diameter line expansion and peak intensity grading, and demonstrated a prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the regression equation generated from the quantitative indicators contained only one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for the qualitative and quantitative analyses were 91.3% and 75.7%, respectively, a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.
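
    The analysis pattern described, logistic regressions on qualitative versus quantitative indicators followed by an ROC comparison, can be sketched with scikit-learn. The indicator values and pathology labels below are placeholders, and the Z test between the two AUCs (e.g., DeLong's method) is only indicated, not implemented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 73
        malignant = rng.integers(0, 2, size=n)                        # placeholder pathology labels
        # Qualitative indicators: enhancement homogeneity, diameter-line expansion, peak-intensity grade.
        X_qual = np.column_stack([rng.integers(0, 2, n), rng.integers(0, 2, n), rng.integers(0, 3, n)])
        # Quantitative indicator: relative peak intensity.
        X_quant = rng.normal(size=(n, 1))

        auc = {}
        for name, X in [("qualitative", X_qual), ("quantitative", X_quant)]:
            m = LogisticRegression(max_iter=1000).fit(X, malignant)
            auc[name] = round(roc_auc_score(malignant, m.predict_proba(X)[:, 1]), 3)
        print(auc)
        # A DeLong-style Z test would then compare the two AUCs, as in the abstract.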

  2. A Comparative Study of Spectral Auroral Intensity Predictions From Multiple Electron Transport Models

    NASA Astrophysics Data System (ADS)

    Grubbs, Guy; Michell, Robert; Samara, Marilia; Hampton, Donald; Hecht, James; Solomon, Stanley; Jahn, Jorg-Micha

    2018-01-01

    It is important to routinely examine and update models used to predict auroral emissions resulting from precipitating electrons in Earth's magnetotail. These models are commonly used to invert spectral auroral ground-based images to infer characteristics about incident electron populations when in situ measurements are unavailable. In this work, we examine and compare auroral emission intensities predicted by three commonly used electron transport models using varying electron population characteristics. We then compare model predictions to same-volume in situ electron measurements and ground-based imaging to qualitatively examine modeling prediction error. Initial comparisons showed differences in predictions by the GLobal airglOW (GLOW) model and the other transport models examined. Chemical reaction rates and radiative rates in GLOW were updated using recent publications, and predictions showed better agreement with the other models and the same-volume data, stressing that these rates are important to consider when modeling auroral processes. Predictions by each model exhibit similar behavior for varying atmospheric constants, energies, and energy fluxes. Same-volume electron data and images are highly correlated with predictions by each model, showing that these models can be used to accurately derive electron characteristics and ionospheric parameters based solely on multispectral optical imaging data.

  3. Protein asparagine deamidation prediction based on structures with machine learning methods.

    PubMed

    Jia, Lei; Sun, Yaxiong

    2017-01-01

    Chemical stability is a major concern in the development of protein therapeutics due to its impact on both efficacy and safety. Protein "hotspots" are amino acid residues that are subject to various chemical modifications, including deamidation, isomerization, glycosylation, oxidation, etc. A more accurate prediction method for potential hotspot residues would allow their elimination or reduction as early as possible in the drug discovery process. In this work, we focus on prediction models for asparagine (Asn) deamidation. The sequence-based prediction method simply identifies the NG motif (an asparagine followed by a glycine) as liable to deamidation. It still dominates the deamidation evaluation process in most pharmaceutical settings due to its convenience. However, the simple sequence-based method is less accurate and often leads to over-engineering of a protein. We introduce structure-based prediction models by mining available experimental and structural data on deamidated proteins. Our training set contains 194 Asn residues from 25 proteins that all have high-resolution crystal structures available. Experimentally measured deamidation half-lives of Asn in penta-peptides as well as 3D structure-based properties, such as solvent exposure, crystallographic B-factors, local secondary structure and dihedral angles, were used to train prediction models with several machine learning algorithms. The prediction tools were cross-validated as well as tested with an external test data set. The random forest model showed high enrichment in ranking deamidated residues higher than non-deamidated residues while effectively eliminating false positive predictions. It is possible that such quantitative protein structure-function relationship tools can also be applied to other protein hotspot predictions. In addition, we extensively discuss the metrics used to evaluate the performance of predictions on unbalanced data sets such as the deamidation case.
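
    A hedged sketch of a structure-based random-forest classifier of the kind described, using scikit-learn: the per-residue features are modeled on those named above but filled with random placeholders, and the labels are synthetic, so the cross-validated score is only illustrative.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_asn = 194
        # Placeholder per-Asn features: penta-peptide half-life, solvent exposure,
        # B-factor, a backbone dihedral, and an NG-motif flag (following residue is Gly).
        X = np.column_stack([
            rng.exponential(100, n_asn),       # penta-peptide deamidation half-life (days)
            rng.uniform(0, 1, n_asn),          # relative solvent exposure
            rng.normal(30, 10, n_asn),         # crystallographic B-factor
            rng.uniform(-180, 180, n_asn),     # psi dihedral angle (degrees)
            rng.integers(0, 2, n_asn),         # NG-motif flag
        ])
        deamidated = rng.integers(0, 2, size=n_asn)    # placeholder experimental labels

        rf = RandomForestClassifier(n_estimators=300, random_state=0)
        print("cross-validated ROC AUC:",
              round(cross_val_score(rf, X, deamidated, cv=5, scoring="roc_auc").mean(), 3))
        # The sequence-only baseline would predict deamidation whenever the NG flag is 1.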

  4. Highway traffic noise prediction based on GIS

    NASA Astrophysics Data System (ADS)

    Zhao, Jianghua; Qin, Qiming

    2014-05-01

    Before building a new road, we need to predict the traffic noise generated by vehicles. Traditional traffic noise prediction methods are tied to specific locations; they are time-consuming and costly, and their results cannot easily be visualized. A Geographical Information System (GIS) can not only solve the problem of manual data processing, but also provide noise values at any point. The paper selected a road segment from Wenxi to Heyang. According to the geographical overview of the study area and a comparison between several models, we combine the JTG B03-2006 model and the HJ2.4-2009 model to predict the traffic noise depending on the circumstances. Finally, we interpolate the noise values at each prediction point and then generate noise contours. By overlaying the village data on the noise contour layer, we obtain the thematic maps. The use of GIS for road traffic noise prediction greatly assists decision-makers because of its spatial analysis functions and visualization capabilities. We can clearly see the districts where noise is excessive, which makes it convenient to optimize the road alignment and take noise reduction measures such as installing sound barriers and relocating villages.

  5. A qualitative study of programs for parents with serious mental illness and their children: building practice-based evidence.

    PubMed

    Nicholson, Joanne; Hinden, Beth R; Biebel, Kathleen; Henry, Alexis D; Katz-Leavy, Judith

    2007-10-01

    The rationale for the development of effective programs for parents with serious mental illness and their children is compelling. Using qualitative methods and a grounded theory approach with data obtained in site visits, seven existing programs for parents with mental illness and their children in the United States are described and compared across core components: target population, theory and assumptions, funding, community and agency contexts, essential services and intervention strategies, moderators, and outcomes. The diversity across programs is strongly complemented by shared characteristics, the identification of which provides the foundation for future testing and the development of an evidence base. Challenges in program implementation and sustainability are identified. Qualitative methods are useful, particularly when studying existing programs, in taking steps toward building the evidence base for effective programs for parents with serious mental illness and their children.

  6. Exploring experiences, barriers, and enablers to home- and class-based exercise in rotator cuff tendinopathy: A qualitative study.

    PubMed

    Sandford, Fiona M; Sanders, Thomas A B; Lewis, Jeremy S

    Adherence is paramount to the successful outcome of exercise-based treatment. The barriers and enablers to adherence to a home- and class-based exercise program were explored in this qualitative study. Semi-structured interviews were carried out to establish common themes relating to the participants' experiences during a year-long randomized controlled trial. Twelve participants were interviewed. The main enablers to exercise were equipment, perceived benefit from the exercises, and longer and more intensive monitoring. Barriers included lack of motivation, lack of equipment, and pain. Implications for practice include incorporating enablers and addressing barriers such as self-discharge from classes; longer-term follow-up and embedding exercise into a well-established routine may also provide benefits. Copyright © 2017 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.

  7. Comparison of RNA-seq and microarray-based models for clinical endpoint prediction.

    PubMed

    Zhang, Wenqian; Yu, Ying; Hertwig, Falk; Thierry-Mieg, Jean; Zhang, Wenwei; Thierry-Mieg, Danielle; Wang, Jian; Furlanello, Cesare; Devanarayan, Viswanath; Cheng, Jie; Deng, Youping; Hero, Barbara; Hong, Huixiao; Jia, Meiwen; Li, Li; Lin, Simon M; Nikolsky, Yuri; Oberthuer, André; Qing, Tao; Su, Zhenqiang; Volland, Ruth; Wang, Charles; Wang, May D; Ai, Junmei; Albanese, Davide; Asgharzadeh, Shahab; Avigad, Smadar; Bao, Wenjun; Bessarabova, Marina; Brilliant, Murray H; Brors, Benedikt; Chierici, Marco; Chu, Tzu-Ming; Zhang, Jibin; Grundy, Richard G; He, Min Max; Hebbring, Scott; Kaufman, Howard L; Lababidi, Samir; Lancashire, Lee J; Li, Yan; Lu, Xin X; Luo, Heng; Ma, Xiwen; Ning, Baitang; Noguera, Rosa; Peifer, Martin; Phan, John H; Roels, Frederik; Rosswog, Carolina; Shao, Susan; Shen, Jie; Theissen, Jessica; Tonini, Gian Paolo; Vandesompele, Jo; Wu, Po-Yen; Xiao, Wenzhong; Xu, Joshua; Xu, Weihong; Xuan, Jiekun; Yang, Yong; Ye, Zhan; Dong, Zirui; Zhang, Ke K; Yin, Ye; Zhao, Chen; Zheng, Yuanting; Wolfinger, Russell D; Shi, Tieliu; Malkas, Linda H; Berthold, Frank; Wang, Jun; Tong, Weida; Shi, Leming; Peng, Zhiyu; Fischer, Matthias

    2015-06-25

    Gene expression profiling is being widely applied in cancer research to identify biomarkers for clinical endpoint prediction. Since RNA-seq provides a powerful tool for transcriptome-based applications beyond the limitations of microarrays, we sought to systematically evaluate the performance of RNA-seq-based and microarray-based classifiers in this MAQC-III/SEQC study for clinical endpoint prediction using neuroblastoma as a model. We generate gene expression profiles from 498 primary neuroblastomas using both RNA-seq and 44 k microarrays. Characterization of the neuroblastoma transcriptome by RNA-seq reveals that more than 48,000 genes and 200,000 transcripts are being expressed in this malignancy. We also find that RNA-seq provides much more detailed information on specific transcript expression patterns in clinico-genetic neuroblastoma subgroups than microarrays. To systematically compare the power of RNA-seq and microarray-based models in predicting clinical endpoints, we divide the cohort randomly into training and validation sets and develop 360 predictive models on six clinical endpoints of varying predictability. Evaluation of factors potentially affecting model performances reveals that prediction accuracies are most strongly influenced by the nature of the clinical endpoint, whereas technological platforms (RNA-seq vs. microarrays), RNA-seq data analysis pipelines, and feature levels (gene vs. transcript vs. exon-junction level) do not significantly affect performances of the models. We demonstrate that RNA-seq outperforms microarrays in determining the transcriptomic characteristics of cancer, while RNA-seq and microarray-based models perform similarly in clinical endpoint prediction. Our findings may be valuable to guide future studies on the development of gene expression-based predictive models and their implementation in clinical practice.

  8. Predicting epileptic seizures from scalp EEG based on attractor state analysis.

    PubMed

    Chu, Hyunho; Chung, Chun Kee; Jeong, Woorim; Cho, Kwang-Hyun

    2017-05-01

    Epilepsy is the second most common disease of the brain. Epilepsy makes it difficult for patients to live a normal life because it is hard to predict when seizures will occur. In this regard, if seizures could be predicted a reasonable period of time before their occurrence, epilepsy patients could take precautions against them and improve their safety and quality of life. In this paper, we investigate a novel seizure precursor based on attractor state analysis for seizure prediction. We analyze the transition process from the normal to the seizure attractor state and investigate a precursor phenomenon seen before the seizure attractor state is reached. From the results of this analysis, we define a quantified spectral measure in scalp EEG for seizure prediction. From scalp EEG recordings, the Fourier coefficients of six EEG frequency bands are extracted, and the defined spectral measure is computed from the coefficients for each half-overlapped 20-second-long window. The computed spectral measure is applied to seizure prediction using a low-complexity methodology. Within scalp EEG, we identified an early-warning indicator before an epileptic seizure occurs. As the system approaches the bifurcation point that triggers the transition from the normal to the seizure state, the power spectral density of the low-frequency bands of the attractor perturbation in the EEG shows a relative increase. A low-complexity seizure prediction algorithm using this feature was evaluated using ~583 h of scalp EEG in which 143 seizures from 16 patients were recorded. With the test dataset, the proposed method showed high sensitivity (86.67%) with a false prediction rate of 0.367 h⁻¹ and an average prediction time of 45.3 min. A novel seizure prediction method using scalp EEG, based on attractor state analysis, shows potential for application with real epilepsy patients. This is the first study in which the seizure-precursor phenomenon of an epileptic seizure is investigated based on attractor state analysis.
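    The sketch below illustrates one plausible form of the windowed spectral measure described above (relative low-frequency power in half-overlapped 20-s windows); the sampling rate, band edges, and the exact definition of the measure are assumptions, and the EEG trace is a random placeholder.

    ```python
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    fs = 256                 # sampling rate in Hz (assumed)
    win = 20 * fs            # 20-second window
    step = win // 2          # half-overlapped windows

    def low_freq_ratio(segment):
        """Relative power of low-frequency bands within the broadband EEG spectrum."""
        f, pxx = welch(segment, fs=fs, nperseg=4 * fs)
        broad = (f >= 0.5) & (f < 40)
        low = (f >= 0.5) & (f < 8)       # illustrative low-frequency band edges
        return np.trapz(pxx[low], f[low]) / np.trapz(pxx[broad], f[broad])

    eeg = rng.standard_normal(fs * 600)  # placeholder single-channel scalp EEG
    measure = [low_freq_ratio(eeg[i:i + win])
               for i in range(0, len(eeg) - win, step)]
    print(len(measure), "windowed values; a relative rise would flag a precursor")
    ```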

  9. Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers

    PubMed Central

    Jiang, Yong; Schmidt, Renate H.; Reif, Jochen C.

    2018-01-01

    Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, since, due to the tight linkage, a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles that of additive effects. PMID:29549092

  10. Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers.

    PubMed

    Jiang, Yong; Schmidt, Renate H; Reif, Jochen C

    2018-05-04

    Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, since, due to the tight linkage, a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles that of additive effects. Copyright © 2018 Jiang et al.

  11. Assessing Online Textual Feedback to Support Student Intrinsic Motivation Using a Collaborative Text-Based Dialogue System: A Qualitative Study

    ERIC Educational Resources Information Center

    Shroff, Ronnie H.; Deneen, Christopher

    2011-01-01

    This paper assesses textual feedback to support student intrinsic motivation using a collaborative text-based dialogue system. A research model is presented based on research into intrinsic motivation, and the specific construct of feedback provides a framework for the model. A qualitative research methodology is used to validate the model.…

  12. The Issue of Quality in Qualitative Research

    ERIC Educational Resources Information Center

    Hammersley, Martyn

    2007-01-01

    This article addresses the perennial issue of the criteria by which qualitative research should be evaluated. At the present time, there is a sharp conflict between demands for explicit criteria, for example in order to serve systematic reviewing and evidence-based practice, and arguments on the part of some qualitative researchers that such…

  13. Model-based influences on humans’ choices and striatal prediction errors

    PubMed Central

    Daw, Nathaniel D.; Gershman, Samuel J.; Seymour, Ben; Dayan, Peter; Dolan, Raymond J.

    2011-01-01

    Summary The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making. PMID:21435563

  14. Model-based influences on humans' choices and striatal prediction errors.

    PubMed

    Daw, Nathaniel D; Gershman, Samuel J; Seymour, Ben; Dayan, Peter; Dolan, Raymond J

    2011-03-24

    The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors, and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. Genome-based prediction of test cross performance in two subsequent breeding cycles.

    PubMed

    Hofheinz, Nina; Borchardt, Dietrich; Weissleder, Knuth; Frisch, Matthias

    2012-12-01

    Genome-based prediction of genetic values is expected to overcome shortcomings that limit the application of QTL mapping and marker-assisted selection in plant breeding. Our goal was to study the genome-based prediction of test cross performance with genetic effects that were estimated using genotypes from the preceding breeding cycle. In particular, our objectives were to employ a ridge regression approach that approximates best linear unbiased prediction of genetic effects, compare cross validation with validation using genetic material of the subsequent breeding cycle, and investigate the prospects of genome-based prediction in sugar beet breeding. We focused on the traits sugar content and standard molasses loss (ML) and used a set of 310 sugar beet lines to estimate genetic effects at 384 SNP markers. In cross validation, correlations >0.8 between observed and predicted test cross performance were observed for both traits. However, in validation with 56 lines from the next breeding cycle, a correlation of 0.8 could only be observed for sugar content; for standard ML the correlation dropped to 0.4. We found that ridge regression based on preliminary estimates of the heritability provided a very good approximation of best linear unbiased prediction and was not accompanied by a loss in prediction accuracy. We conclude that prediction accuracy assessed with cross validation within one cycle of a breeding program cannot be used as an indicator for the accuracy of predicting lines of the next cycle. Prediction of lines of the next cycle seems promising for traits with high heritabilities.
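    A minimal sketch of ridge-regression marker prediction with the shrinkage parameter set from a preliminary heritability estimate, in the spirit of the approach described above; the genotypes, phenotypes, heritability value, and the lambda = m(1 - h2)/h2 approximation are all assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_markers, h2 = 310, 384, 0.7   # line and marker counts from the abstract; h2 assumed

    # Placeholder 0/1/2 marker genotypes and test-cross phenotypes.
    X = rng.binomial(2, 0.5, (n_train, n_markers)).astype(float)
    col_mean = X.mean(axis=0)
    Xc = X - col_mean
    y = Xc @ rng.normal(0, 0.1, n_markers) + rng.normal(0, 1.0, n_train)
    yc = y - y.mean()

    # RR-BLUP-style shrinkage derived from a preliminary heritability estimate
    # (an approximation used for illustration only).
    lam = n_markers * (1 - h2) / h2
    beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(n_markers), Xc.T @ yc)

    # Predict test-cross performance for lines of the next breeding cycle.
    X_next = rng.binomial(2, 0.5, (56, n_markers)).astype(float)
    pred = (X_next - col_mean) @ beta + y.mean()
    print(pred[:5])
    ```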

  16. PrePhyloPro: phylogenetic profile-based prediction of whole proteome linkages

    PubMed Central

    Niu, Yulong; Liu, Chengcheng; Moghimyfiroozabad, Shayan; Yang, Yi

    2017-01-01

    Direct and indirect functional links between proteins as well as their interactions as part of larger protein complexes or common signaling pathways may be predicted by analyzing the correlation of their evolutionary patterns. Based on phylogenetic profiling, here we present a highly scalable and time-efficient computational framework for predicting linkages within the whole human proteome. We have validated this method through analysis of 3,697 human pathways and molecular complexes and a comparison of our results with the prediction outcomes of previously published co-occurrency model-based and normalization methods. Here we also introduce PrePhyloPro, a web-based software that uses our method for accurately predicting proteome-wide linkages. We present data on interactions of human mitochondrial proteins, verifying the performance of this software. PrePhyloPro is freely available at http://prephylopro.org/phyloprofile/. PMID:28875072
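    A toy sketch of the basic phylogenetic-profiling idea behind the method: proteins whose presence/absence patterns across genomes correlate strongly are candidate functional linkages. The profiles are random placeholders and the correlation ranking is a generic stand-in, not PrePhyloPro's algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_prot, n_genomes = 50, 200
    # Binary phylogenetic profiles: 1 if a homolog is present in a genome.
    profiles = rng.integers(0, 2, size=(n_prot, n_genomes)).astype(float)

    # Correlate every pair of profiles and rank pairs by similarity.
    corr = np.corrcoef(profiles)
    i, j = np.triu_indices(n_prot, k=1)
    order = np.argsort(-corr[i, j])
    top_pairs = list(zip(i[order[:10]], j[order[:10]]))  # top predicted linkages
    print(top_pairs)
    ```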

  17. Improving Foster Parent Engagement: Using Qualitative Methods to Guide Tailoring of Evidence-based Engagement Strategies

    PubMed Central

    Conover, Kate L.; Cox, Julia Revillion

    2014-01-01

    Objective This qualitative study examined applicability and need for tailoring of an evidence-based engagement intervention, combined with Trauma-focused Cognitive Behavioral Therapy, for foster parents. Method Qualitative methods were used, including individual interviews with participating foster parents (N = 7), review of interview findings with an independent group of foster parents (N = 5), and review of the combined foster parent findings by child welfare caseworkers (N = 5), an important stakeholder group. Results The engagement intervention, with its primary focus on perceptual barriers (e.g., past experiences with mental health), was relevant for the foster care population. However, the study identified areas for tailoring to better recognize and address the unique needs and situation of foster parents as substitute caregivers. Conclusions Perceptually-focused engagement interventions may have broad applicability to a range of populations, including foster parents, with the potential for improving caregiver participation in children’s mental health services. PMID:24611600

  18. Probabilistic prediction of barrier-island response to hurricanes

    USGS Publications Warehouse

    Plant, Nathaniel G.; Stockdon, Hilary F.

    2012-01-01

    Prediction of barrier-island response to hurricane attack is important for assessing the vulnerability of communities, infrastructure, habitat, and recreational assets to the impacts of storm surge, waves, and erosion. We have demonstrated that a conceptual model intended to make qualitative predictions of the type of beach response to storms (e.g., beach erosion, dune erosion, dune overwash, inundation) can be reformulated in a Bayesian network to make quantitative predictions of the morphologic response. In an application of this approach at Santa Rosa Island, FL, predicted dune-crest elevation changes in response to Hurricane Ivan explained about 20% to 30% of the observed variance. An extended Bayesian network based on the original conceptual model, which included dune elevations, storm surge, and swash, but with the addition of beach and dune widths as input variables, showed improved skill compared to the original model, explaining 70% of dune elevation change variance and about 60% of dune and shoreline position change variance. This probabilistic approach accurately represented prediction uncertainty (measured with the log likelihood ratio), and it outperformed the baseline prediction (i.e., the prior distribution based on the observations). Finally, sensitivity studies demonstrated that degrading the resolution of the Bayesian network or removing data from the calibration process reduced the skill of the predictions by 30% to 40%. The reduction in skill did not change conclusions regarding the relative importance of the input variables, and the extended model's skill always outperformed the original model.

  19. The growth of a culture of evidence-based obstetrics in South Africa: a qualitative case study

    PubMed Central

    2011-01-01

    Background While the past two decades have seen a shift towards evidence-based obstetrics and midwifery, the process through which a culture of evidence-based practice develops and is sustained within particular fields of clinical practice has not been well documented, particularly in LMICs (low- and middle-income countries). Forming part of a broader qualitative study of evidence-based policy making, this paper describes the development of a culture of evidence-based practice amongst maternal health policy makers and senior academic obstetricians in South Africa. Methods A qualitative case-study approach was used. This included a literature review, a policy document review, a timeline of key events and the collection and analysis of 15 interviews with policy makers and academic clinicians involved in these policy processes and sampled using a purposive approach. The data were analysed thematically. Results The concept of evidence-based medicine became embedded in South African academic obstetrics at a very early stage in relation to the development of the concept internationally. The diffusion of this concept into local academic obstetrics was facilitated by contact and exchange between local academic obstetricians, opinion leaders in international research and structures promoting evidence-based practice. Furthermore, the growing acceptance of the concept was stimulated locally through the use of existing professional networks and meetings to share ideas and the contribution of local researchers to building the evidence base for obstetrics both locally and internationally. As a testimony to the extent of the diffusion of evidence-based medicine, South Africa has strongly evidence-based policies for maternal health. Conclusion This case study shows that the combined efforts of local and international researchers can create a culture of evidence-based medicine within one country. It also shows that doing so required time and perseverance from international researchers.

  20. The Unification Space implemented as a localist neural net: predictions and error-tolerance in a constraint-based parser.

    PubMed

    Vosse, Theo; Kempen, Gerard

    2009-12-01

    We introduce a novel computer implementation of the Unification-Space parser (Vosse and Kempen in Cognition 75:105-143, 2000) in the form of a localist neural network whose dynamics is based on interactive activation and inhibition. The wiring of the network is determined by Performance Grammar (Kempen and Harbusch in Verb constructions in German and Dutch. Benjamins, Amsterdam, 2003), a lexicalist formalism with feature unification as binding operation. While the network is processing input word strings incrementally, the evolving shape of parse trees is represented in the form of changing patterns of activation in nodes that code for syntactic properties of words and phrases, and for the grammatical functions they fulfill. The system is capable, at least qualitatively and rudimentarily, of simulating several important dynamic aspects of human syntactic parsing, including garden-path phenomena and reanalysis, effects of complexity (various types of clause embeddings), fault-tolerance in case of unification failures and unknown words, and predictive parsing (expectation-based analysis, surprisal effects). English is the target language of the parser described.

  1. Geomorphically based predictive mapping of soil thickness in upland watersheds

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.; Rasmussen, Craig

    2009-09-01

    The hydrologic response of upland watersheds is strongly controlled by soil (regolith) thickness. Despite the need to quantify soil thickness for input into hydrologic models, there is currently no widely used, geomorphically based method for doing so. In this paper we describe and illustrate a new method for predictive mapping of soil thicknesses using high-resolution topographic data, numerical modeling, and field-based calibration. The model framework works directly with input digital elevation model data to predict soil thicknesses assuming a long-term balance between soil production and erosion. Erosion rates in the model are quantified using one of three geomorphically based sediment transport models: nonlinear slope-dependent transport, nonlinear area- and slope-dependent transport, and nonlinear depth- and slope-dependent transport. The model balances soil production and erosion locally to predict a family of solutions corresponding to a range of values of two unconstrained model parameters. A small number of field-based soil thickness measurements can then be used to calibrate the local value of those unconstrained parameters, thereby constraining which solution is applicable at a particular study site. As an illustration, the model is used to predictively map soil thicknesses in two small, ~0.1 km2, drainage basins in the Marshall Gulch watershed, a semiarid drainage basin in the Santa Catalina Mountains of Pima County, Arizona. Field observations and calibration data indicate that the nonlinear depth- and slope-dependent sediment transport model is the most appropriate transport model for this site. The resulting framework provides a generally applicable, geomorphically based tool for predictive mapping of soil thickness using high-resolution topographic data sets.
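    A highly simplified sketch of the steady-state balance underlying the model described above: depth-dependent soil production set equal to a local erosion rate and solved cell by cell for thickness. The diffusive erosion term, parameter values, and elevation grid are illustrative assumptions, not the paper's calibrated transport models.

    ```python
    import numpy as np

    P0, h0 = 1.0e-4, 0.5          # max production rate (m/yr) and depth scale (m), assumed
    K, dx = 2.0e-5, 1.0           # illustrative diffusivity and DEM cell size
    rng = np.random.default_rng(0)
    z = rng.random((100, 100))    # placeholder elevation grid (m)

    # Erosion approximated here as diffusive transport: E = -K * laplacian(z).
    lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
           np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z) / dx**2
    E = np.clip(-K * lap, 1e-9, None)   # keep the erosion rate positive for the balance

    # Steady state: P0 * exp(-h / h0) = E  ->  h = h0 * ln(P0 / E), floored at zero.
    h = np.clip(h0 * np.log(P0 / E), 0.0, None)
    print("mean predicted soil thickness (m):", h.mean())
    ```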

  2. Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction

    PubMed Central

    Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng

    2015-01-01

    The predictive modeling process is time consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute Cloud (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We conduct a larger scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs’ Synthetic Public Use File dataset of 2 million patients, which achieves over 25-fold speedup compared to sequential execution. PMID:26958172

  3. Practical Guidelines for Qualitative Research Using Online Forums

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2012-01-01

    With the growth of Internet research in general, the number of qualitative Internet studies has recently increased. Online forums are one of the most frequently used qualitative Internet research methods. Despite an increasing number of online forum studies, very few articles have been written to provide practical guidelines for conducting an online forum as a qualitative research method. In this paper, practical guidelines for using an online forum as a qualitative research method are proposed based on three previous online forum studies. First, the three studies are concisely described. Practical guidelines are proposed based on nine idea categories related to issues in the three studies: (a) a fit with research purpose and questions; (b) logistics; (c) electronic versus conventional informed consent process; (d) structure and functionality of online forums; (e) interdisciplinary team; (f) screening methods; (g) languages; (h) data analysis methods; and (i) getting participants’ feedback. PMID:22918135

  4. Practical guidelines for qualitative research using online forums.

    PubMed

    Im, Eun-Ok; Chee, Wonshik

    2012-11-01

    With the growth of Internet research in general, the number of qualitative Internet studies has recently increased. Online forums are one of the most frequently used qualitative Internet research methods. Despite an increasing number of online forum studies, very few articles have been written to provide practical guidelines for conducting an online forum as a qualitative research method. In this article, practical guidelines for using an online forum as a qualitative research method are proposed based on three previous online forum studies. First, the three studies are concisely described. Practical guidelines are proposed based on nine idea categories related to issues in the three studies: (a) a fit with research purpose and questions, (b) logistics, (c) electronic versus conventional informed consent process, (d) structure and functionality of online forums, (e) interdisciplinary team, (f) screening methods, (g) languages, (h) data analysis methods, and (i) getting participants' feedback.

  5. A class-based link prediction using Distance Dependent Chinese Restaurant Process

    NASA Astrophysics Data System (ADS)

    Andalib, Azam; Babamir, Seyed Morteza

    2016-08-01

    One of the important tasks in relational data analysis is link prediction, which has been successfully applied in many domains such as bioinformatics, information retrieval, etc. Link prediction is defined as predicting the existence or absence of edges between nodes of a network. In this paper, we propose a novel method for link prediction based on the Distance Dependent Chinese Restaurant Process (DDCRP) model, which enables us to utilize information about the topological structure of the network such as shortest paths and connectivity of the nodes. We also propose a new Gibbs sampling algorithm for computing the posterior distribution of the hidden variables based on the training data. Experimental results on three real-world datasets show the superiority of the proposed method over other probabilistic models for the link prediction problem.
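    The sketch below draws one sample from a distance-dependent CRP prior, the building block of the model described above: each node links to another with probability proportional to a decaying function of their distance, or to itself with mass alpha, and clusters are the connected components of the resulting link graph. The distance matrix and decay function are illustrative assumptions; the full link-prediction model and its Gibbs sampler are not reproduced here.

    ```python
    import numpy as np
    import networkx as nx

    def ddcrp_sample(D, alpha=1.0, a=1.0, rng=np.random.default_rng(0)):
        """Sample customer links under a distance-dependent CRP and return clusters."""
        n = D.shape[0]
        links = np.empty(n, dtype=int)
        for i in range(n):
            w = np.exp(-D[i] / a)      # decay (window) function f(d), assumed exponential
            w[i] = alpha               # self-link mass
            links[i] = rng.choice(n, p=w / w.sum())
        g = nx.Graph([(i, links[i]) for i in range(n)])
        return [sorted(c) for c in nx.connected_components(g)]

    # Toy distance matrix over 10 nodes (e.g. shortest-path distances).
    D = np.abs(np.subtract.outer(np.arange(10), np.arange(10))).astype(float)
    print(ddcrp_sample(D))
    ```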

  6. The wind power prediction research based on mind evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Zhuang, Ling; Zhao, Xinjian; Ji, Tianming; Miao, Jingwen; Cui, Haina

    2018-04-01

    When wind power is connected to the power grid, its fluctuating, intermittent, and random character affects the stability of the power system. Wind power prediction can help guarantee power quality and reduce the operating cost of the power system. Traditional wind power prediction methods have several limitations. On this basis, a wind power prediction method based on the Mind Evolutionary Algorithm (MEA) is put forward and a prediction model is provided. The experimental results demonstrate that MEA performs efficiently in terms of wind power prediction. The MEA method has broad prospects for engineering application.

  7. Buffered Qualitative Stability explains the robustness and evolvability of transcriptional networks

    PubMed Central

    Albergante, Luca; Blow, J Julian; Newman, Timothy J

    2014-01-01

    The gene regulatory network (GRN) is the central decision‐making module of the cell. We have developed a theory called Buffered Qualitative Stability (BQS) based on the hypothesis that GRNs are organised so that they remain robust in the face of unpredictable environmental and evolutionary changes. BQS makes strong and diverse predictions about the network features that allow stable responses under arbitrary perturbations, including the random addition of new connections. We show that the GRNs of E. coli, M. tuberculosis, P. aeruginosa, yeast, mouse, and human all verify the predictions of BQS. BQS explains many of the small- and large‐scale properties of GRNs, provides conditions for evolvable robustness, and highlights general features of transcriptional response. BQS is severely compromised in a human cancer cell line, suggesting that loss of BQS might underlie the phenotypic plasticity of cancer cells, and highlighting a possible sequence of GRN alterations concomitant with cancer initiation. DOI: http://dx.doi.org/10.7554/eLife.02863.001 PMID:25182846

  8. Buffered Qualitative Stability explains the robustness and evolvability of transcriptional networks.

    PubMed

    Albergante, Luca; Blow, J Julian; Newman, Timothy J

    2014-09-02

    The gene regulatory network (GRN) is the central decision-making module of the cell. We have developed a theory called Buffered Qualitative Stability (BQS) based on the hypothesis that GRNs are organised so that they remain robust in the face of unpredictable environmental and evolutionary changes. BQS makes strong and diverse predictions about the network features that allow stable responses under arbitrary perturbations, including the random addition of new connections. We show that the GRNs of E. coli, M. tuberculosis, P. aeruginosa, yeast, mouse, and human all verify the predictions of BQS. BQS explains many of the small- and large-scale properties of GRNs, provides conditions for evolvable robustness, and highlights general features of transcriptional response. BQS is severely compromised in a human cancer cell line, suggesting that loss of BQS might underlie the phenotypic plasticity of cancer cells, and highlighting a possible sequence of GRN alterations concomitant with cancer initiation. Copyright © 2014, Albergante et al.

  9. A polynomial based model for cell fate prediction in human diseases.

    PubMed

    Ma, Lichun; Zheng, Jie

    2017-12-21

    Cell fate regulation directly affects tissue homeostasis and human health. Research on cell fate decisions sheds light on key regulators, facilitates understanding of the mechanisms, and suggests novel strategies to treat human diseases that are related to abnormal cell development. In this study, we proposed a polynomial-based model to predict cell fate. This model was derived from the Taylor series. As a case study, gene expression data of pancreatic cells were adopted to test and verify the model. As numerous features (genes) are available, we employed two kinds of feature selection methods, i.e., correlation-based and apoptosis-pathway-based. Then polynomials of different degrees were used to refine the cell fate prediction function. 10-fold cross-validation was carried out to evaluate the performance of our model. In addition, we analyzed the stability of the resultant cell fate prediction model by evaluating the ranges of the parameters, as well as assessing the variances of the predicted values at randomly selected points. Results show that, for both gene selection methods, the prediction accuracies of polynomials of different degrees differ little. Interestingly, the linear polynomial (degree 1 polynomial) is more stable than the others. Comparing the linear polynomials based on the two gene selection methods shows that, although the accuracy of the linear polynomial that uses the correlation analysis outcomes is slightly higher (86.62%), the one based on genes of the apoptosis pathway is much more stable. Considering both the prediction accuracy and the stability of polynomial models of different degrees, the linear model is a preferred choice for cell fate prediction with gene expression data of pancreatic cells. The presented cell fate prediction model can be extended to other cells, which may be important for basic research as well as clinical studies of cell development related diseases.
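    As a loose illustration of comparing polynomial models of different degrees under 10-fold cross-validation, the sketch below fits logistic regression on polynomial feature expansions of a synthetic expression matrix; the data, the use of logistic regression as the fitting step, and the binary fate label are stand-in assumptions rather than the authors' Taylor-series formulation.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures, StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))   # selected genes (e.g. an apoptosis-pathway subset), synthetic
    y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)  # toy fate label

    for degree in (1, 2, 3):
        model = make_pipeline(PolynomialFeatures(degree), StandardScaler(),
                              LogisticRegression(max_iter=1000))
        acc = cross_val_score(model, X, y, cv=10).mean()
        print(f"degree {degree}: 10-fold accuracy {acc:.3f}")
    ```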

  10. Analysis of energy-based algorithms for RNA secondary structure prediction

    PubMed Central

    2012-01-01

    Background RNA molecules play critical roles in the cells of organisms, including roles in gene regulation, catalysis, and synthesis of proteins. Since RNA function depends in large part on its folded structures, much effort has been invested in developing accurate methods for prediction of RNA secondary structure from the base sequence. Minimum free energy (MFE) predictions are widely used, based on nearest neighbor thermodynamic parameters of Mathews, Turner et al. or those of Andronescu et al. Some recently proposed alternatives that leverage partition function calculations find the structure with maximum expected accuracy (MEA) or pseudo-expected accuracy (pseudo-MEA) methods. Advances in prediction methods are typically benchmarked using sensitivity, positive predictive value and their harmonic mean, namely F-measure, on datasets of known reference structures. Since such benchmarks document progress in improving accuracy of computational prediction methods, it is important to understand how measures of accuracy vary as a function of the reference datasets and whether advances in algorithms or thermodynamic parameters yield statistically significant improvements. Our work advances such understanding for the MFE and (pseudo-)MEA-based methods, with respect to the latest datasets and energy parameters. Results We present three main findings. First, using the bootstrap percentile method, we show that the average F-measure accuracy of the MFE and (pseudo-)MEA-based algorithms, as measured on our largest datasets with over 2000 RNAs from diverse families, is a reliable estimate (within a 2% range with high confidence) of the accuracy of a population of RNA molecules represented by this set. However, average accuracy on smaller classes of RNAs such as a class of 89 Group I introns used previously in benchmarking algorithm accuracy is not reliable enough to draw meaningful conclusions about the relative merits of the MFE and MEA-based algorithms. Second, on our large

  11. Analysis of energy-based algorithms for RNA secondary structure prediction.

    PubMed

    Hajiaghayi, Monir; Condon, Anne; Hoos, Holger H

    2012-02-01

    RNA molecules play critical roles in the cells of organisms, including roles in gene regulation, catalysis, and synthesis of proteins. Since RNA function depends in large part on its folded structures, much effort has been invested in developing accurate methods for prediction of RNA secondary structure from the base sequence. Minimum free energy (MFE) predictions are widely used, based on nearest neighbor thermodynamic parameters of Mathews, Turner et al. or those of Andronescu et al. Some recently proposed alternatives that leverage partition function calculations find the structure with maximum expected accuracy (MEA) or pseudo-expected accuracy (pseudo-MEA) methods. Advances in prediction methods are typically benchmarked using sensitivity, positive predictive value and their harmonic mean, namely F-measure, on datasets of known reference structures. Since such benchmarks document progress in improving accuracy of computational prediction methods, it is important to understand how measures of accuracy vary as a function of the reference datasets and whether advances in algorithms or thermodynamic parameters yield statistically significant improvements. Our work advances such understanding for the MFE and (pseudo-)MEA-based methods, with respect to the latest datasets and energy parameters. We present three main findings. First, using the bootstrap percentile method, we show that the average F-measure accuracy of the MFE and (pseudo-)MEA-based algorithms, as measured on our largest datasets with over 2000 RNAs from diverse families, is a reliable estimate (within a 2% range with high confidence) of the accuracy of a population of RNA molecules represented by this set. However, average accuracy on smaller classes of RNAs such as a class of 89 Group I introns used previously in benchmarking algorithm accuracy is not reliable enough to draw meaningful conclusions about the relative merits of the MFE and MEA-based algorithms. Second, on our large datasets, the
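    A minimal sketch of the bootstrap percentile check described in the first finding above: resample a set of per-RNA F-measures with replacement and read off a percentile interval for the mean. The per-RNA scores here are synthetic placeholders, not the benchmark data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    f_scores = rng.beta(6, 3, size=2000)   # hypothetical per-RNA F-measures for one algorithm

    # Bootstrap percentile interval for the average F-measure over the dataset.
    boot_means = [rng.choice(f_scores, size=f_scores.size, replace=True).mean()
                  for _ in range(10_000)]
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean F = {f_scores.mean():.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
    ```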

  12. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
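    The sketch below shows the unscented-transform step described above: a small set of sigma points drawn from the current state distribution is propagated through an end-of-life simulation, and their weighted statistics approximate the mean and variance of the EOL distribution. The two-dimensional wear-state model and the eol() function are toy assumptions, not the paper's solenoid-valve model.

    ```python
    import numpy as np

    def unscented_transform(mean, cov, f, kappa=1.0):
        """Approximate mean and variance of f(x) for x ~ N(mean, cov) via sigma points."""
        n = mean.size
        S = np.linalg.cholesky((n + kappa) * cov)
        sigma = [mean] + [mean + S[:, i] for i in range(n)] + \
                [mean - S[:, i] for i in range(n)]
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        y = np.array([f(s) for s in sigma])     # one EOL simulation per sigma point
        m = w @ y
        v = w @ (y - m) ** 2
        return m, v

    def eol(x):
        """Toy wear model: time until accumulated damage reaches 1."""
        wear0, rate = x
        return (1.0 - wear0) / rate

    mean = np.array([0.2, 1e-3])                # current wear and wear rate (assumed)
    cov = np.diag([0.01**2, (2e-4)**2])
    print(unscented_transform(mean, cov, eol))  # approximate EOL mean and variance
    ```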

  13. A Qualitative Exploration of Implementation Factors in a School-Based Mindfulness and Yoga Program: Lessons Learned from Students and Teachers

    PubMed Central

    Dariotis, Jacinda K.; Mirabal-Beltran, Roxanne; Cluxton-Keller, Fallon; Gould, Laura Feagans; Greenberg, Mark T.; Mendelson, Tamar

    2016-01-01

    Identifying factors relevant for successful implementation of school-based interventions is essential to ensure that programs are provided in an effective and engaging manner. The perspectives of two key stakeholders critical for identifying implementation barriers and facilitators – students and their classroom teachers – merit attention in this context and have rarely been explored using qualitative methods. This study reports qualitative perspectives of fifth and sixth grade participants and their teachers of a 16-week school-based mindfulness and yoga program in three public schools serving low-income urban communities. Four themes related to program implementation barriers and facilitators emerged: program delivery factors, program buy-in, implementer communication with teachers, and instructor qualities. Feedback from students and teachers is discussed in the context of informing implementation, adaptation, and future development of school-based mindfulness and yoga programming in urban settings. PMID:28670007

  14. A Qualitative Exploration of Implementation Factors in a School-Based Mindfulness and Yoga Program: Lessons Learned from Students and Teachers.

    PubMed

    Dariotis, Jacinda K; Mirabal-Beltran, Roxanne; Cluxton-Keller, Fallon; Gould, Laura Feagans; Greenberg, Mark T; Mendelson, Tamar

    2017-01-01

    Identifying factors relevant for successful implementation of school-based interventions is essential to ensure that programs are provided in an effective and engaging manner. The perspectives of two key stakeholders critical for identifying implementation barriers and facilitators - students and their classroom teachers - merit attention in this context and have rarely been explored using qualitative methods. This study reports qualitative perspectives of fifth and sixth grade participants and their teachers of a 16-week school-based mindfulness and yoga program in three public schools serving low-income urban communities. Four themes related to program implementation barriers and facilitators emerged: program delivery factors, program buy-in, implementer communication with teachers, and instructor qualities. Feedback from students and teachers is discussed in the context of informing implementation, adaptation, and future development of school-based mindfulness and yoga programming in urban settings.

  15. User's manual for the ALS base heating prediction code, volume 2

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Fulton, Michael S.

    1992-01-01

    The Advanced Launch System (ALS) Base Heating Prediction Code is based on a generalization of first principles in the prediction of plume-induced base convective heating and plume radiation. It should be considered an approximate method for evaluating trends as a function of configuration variables because the processes being modeled are too complex to allow an accurate generalization. The convective methodology is based upon generalizing trends from four nozzle configurations, so extensions of the code to strap-on boosters, multiple nozzle sizes, and variations in the propellants and chamber pressure histories cannot be treated precisely. The plume radiation is more amenable to precise computer prediction, but simplified assumptions are required to model the various aspects of the candidate configurations. Perhaps the most difficult area to characterize is the variation of radiation with altitude. The theory behind the radiation predictions is described in more detail. This report is intended to familiarize a user with the interface operation and options, to summarize the limitations and restrictions of the code, and to provide information to assist in installing the code.

  16. Life prediction modeling based on cyclic damage accumulation

    NASA Technical Reports Server (NTRS)

    Nelson, Richard S.

    1988-01-01

    A high temperature, low cycle fatigue life prediction method was developed. This method, Cyclic Damage Accumulation (CDA), was developed for use in predicting the crack initiation lifetime of gas turbine engine materials, where initiation was defined as a 0.030 inch surface length crack. A principal engineering feature of the CDA method is the minimum data base required for implementation. Model constants can be evaluated through a few simple specimen tests such as monotonic loading and rapid cycle fatigue. The method was expanded to account for the effects on creep-fatigue life of complex loadings such as thermomechanical fatigue, hold periods, waveshapes, mean stresses, multiaxiality, cumulative damage, coatings, and environmental attack. A significant data base was generated on the behavior of the cast nickel-base superalloy B1900+Hf, including hundreds of specimen tests under such loading conditions. This information is being used to refine and extend the CDA life prediction model, which is now nearing completion. The model is also being verified using additional specimen tests on wrought INCO 718, and the final version of the model is expected to be adaptable to most any high-temperature alloy. The model is currently available in the form of equations and related constants. A proposed contract addition will make the model available in the near future in the form of a computer code to potential users.

  17. Connectome-based predictive modeling of attention: Comparing different functional connectivity features and prediction methods across datasets.

    PubMed

    Yoo, Kwangsun; Rosenberg, Monica D; Hsu, Wei-Ting; Zhang, Sheng; Li, Chiang-Shan R; Scheinost, Dustin; Constable, R Todd; Chun, Marvin M

    2018-02-15

    Connectome-based predictive modeling (CPM; Finn et al., 2015; Shen et al., 2017) was recently developed to predict individual differences in traits and behaviors, including fluid intelligence (Finn et al., 2015) and sustained attention (Rosenberg et al., 2016a), from functional brain connectivity (FC) measured with fMRI. Here, using the CPM framework, we compared the predictive power of three different measures of FC (Pearson's correlation, accordance, and discordance) and two different prediction algorithms (linear and partial least square [PLS] regression) for attention function. Accordance and discordance are recently proposed FC measures that respectively track in-phase synchronization and out-of-phase anti-correlation (Meskaldji et al., 2015). We defined connectome-based models using task-based or resting-state FC data, and tested the effects of (1) functional connectivity measure and (2) feature-selection/prediction algorithm on individualized attention predictions. Models were internally validated in a training dataset using leave-one-subject-out cross-validation, and externally validated with three independent datasets. The training dataset included fMRI data collected while participants performed a sustained attention task and rested (N = 25; Rosenberg et al., 2016a). The validation datasets included: 1) data collected during performance of a stop-signal task and at rest (N = 83, including 19 participants who were administered methylphenidate prior to scanning; Farr et al., 2014a; Rosenberg et al., 2016b), 2) data collected during Attention Network Task performance and rest (N = 41, Rosenberg et al., in press), and 3) resting-state data and ADHD symptom severity from the ADHD-200 Consortium (N = 113; Rosenberg et al., 2016a). Models defined using all combinations of functional connectivity measure (Pearson's correlation, accordance, and discordance) and prediction algorithm (linear and PLS regression) predicted attentional abilities, with
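    A minimal sketch of the CPM-style loop referenced above: within each leave-one-subject-out fold, edges whose correlation with the behavioral score exceeds a threshold are summed into a network-strength feature and a linear model is fitted. The connectivity matrices, scores, and the 0.2 threshold are placeholders/assumptions; the accordance, discordance, and PLS variants compared in the study are not shown.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_sub, n_edge = 25, 500
    fc = rng.normal(size=(n_sub, n_edge))    # vectorized functional connectivity (placeholder)
    beh = rng.normal(size=n_sub)             # behavioral / attention scores (placeholder)

    preds = np.zeros(n_sub)
    for test in range(n_sub):
        train = np.setdiff1d(np.arange(n_sub), [test])
        # Feature selection: correlate each edge with behavior in the training fold.
        r = np.array([pearsonr(fc[train, e], beh[train])[0] for e in range(n_edge)])
        pos = r > 0.2                        # positive-network edges (threshold assumed)
        strength = fc[:, pos].sum(axis=1)    # summed edge strength per subject
        a, b = np.polyfit(strength[train], beh[train], 1)
        preds[test] = a * strength[test] + b

    print(pearsonr(preds, beh))              # predicted vs. observed scores
    ```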

  18. Template‐based field map prediction for rapid whole brain B0 shimming

    PubMed Central

    Shi, Yuhang; Vannesjo, S. Johanna; Miller, Karla L.

    2017-01-01

    Purpose In typical MRI protocols, time is spent acquiring a field map to calculate the shim settings for best image quality. We propose a fast template‐based field map prediction method that yields near‐optimal shims without measuring the field. Methods The template‐based prediction method uses prior knowledge of the B0 distribution in the human brain, based on a large database of field maps acquired from different subjects, together with subject‐specific structural information from a quick localizer scan. The shimming performance of using the template‐based prediction is evaluated in comparison to a range of potential fast shimming methods. Results Static B0 shimming based on predicted field maps performed almost as well as shimming based on individually measured field maps. In experimental evaluations at 7 T, the proposed approach yielded a residual field standard deviation in the brain of on average 59 Hz, compared with 50 Hz using measured field maps and 176 Hz using no subject‐specific shim. Conclusions This work demonstrates that shimming based on predicted field maps is feasible. The field map prediction accuracy could potentially be further improved by generating the template from a subset of subjects, based on parameters such as head rotation and body mass index. Magn Reson Med 80:171–180, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. PMID:29193340

  19. Toward Hypertension Prediction Based on PPG-Derived HRV Signals: a Feasibility Study.

    PubMed

    Lan, Kun-Chan; Raknim, Paweeya; Kao, Wei-Fong; Huang, Jyh-How

    2018-04-21

    Heart rate variability (HRV) is often used to assess the risk of cardiovascular disease, and data on this can be obtained via electrocardiography (ECG). However, collecting heart rate data via photoplethysmography (PPG) is now a lot easier. We investigate the feasibility of using the PPG-based heart rate to estimate HRV and predict diseases. We obtain three months of PPG-based heart rate data from subjects with and without hypertension, and calculate the HRV based on various forms of time and frequency domain analysis. We then apply a data mining technique to this estimated HRV data, to see if it is possible to correctly identify patients with hypertension. We use six HRV parameters to predict hypertension, and find SDNN has the best predictive power. We show that early disease prediction is possible through collecting one's PPG-based heart rate information.
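    A small sketch of time-domain HRV parameters computed from inter-beat intervals, including SDNN, which the study reports as the most predictive; the RR series here is a synthetic placeholder for PPG-derived beat intervals.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    rr = 800 + 40 * rng.standard_normal(300)        # 300 RR intervals in ms (synthetic)

    sdnn = rr.std(ddof=1)                           # overall variability (SDNN)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))      # beat-to-beat variability (RMSSD)
    pnn50 = 100 * np.mean(np.abs(np.diff(rr)) > 50) # % successive differences > 50 ms

    print(f"SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms, pNN50={pnn50:.1f}%")
    ```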

  20. Qualitative and quantitative feedback in the context of competency-based education.

    PubMed

    Tekian, Ara; Watling, Christopher J; Roberts, Trudie E; Steinert, Yvonne; Norcini, John

    2017-12-01

    Research indicates the importance and usefulness of feedback, yet with the shift of medical curricula toward competencies, feedback is not well understood in this context. This paper attempts to identify how feedback fits within a competency-based curriculum. After careful consideration of the literature, the following conclusions are drawn: (1) Because feedback is predicated on assessment, the assessment should be designed to optimize and prevent inaccuracies in feedback; (2) Giving qualitative feedback in the form of a conversation would lend credibility to the feedback, address emotional obstacles and create a context in which feedback is comfortable; (3) Quantitative feedback in the form of individualized data could fulfill the demand for more feedback, help students devise strategies on how to improve, allow students to compare themselves to their peers, recognizing that big data have limitations; and (4) Faculty development needs to incorporate and promote cultural and systems changes with regard to feedback. A better understanding of the role of feedback in competency-based education could result in more efficient learning for students.

  1. Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil

    2016-01-01

    Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode in the system. The great advantage of structural model decomposition is that (i) it allows residuals to be designed that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals needs to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.

  2. Challenges of teacher-based clinical evaluation from nursing students' point of view: Qualitative content analysis.

    PubMed

    Sadeghi, Tabandeh; Seyed Bagheri, Seyed Hamid

    2017-01-01

    Clinical evaluation is very important in the educational system of nursing. One of the most common methods of clinical evaluation is evaluation by the teacher, but the challenges that students face with this evaluation method have not been examined. Thus, this study aimed to explore the experiences and views of nursing students about the challenges of teacher-based clinical evaluation. This study was a descriptive qualitative study with a qualitative content analysis approach. Data were gathered through semi-structured focus group sessions with undergraduate nursing students who were in their 8th semester at Rafsanjan University of Medical Sciences. Data were analyzed using Graneheim and Lundman's proposed method. Data collection and analysis were concurrent. According to the findings, "factitious evaluation" was the main theme of the study and consisted of three categories: "personal preferences," "unfairness" and "shirking responsibility." These categories are explained using quotes derived from the data. According to the results of this study, teacher-based clinical evaluation can lead to factitious evaluation. Thus, changing this approach toward modern methods of evaluation is suggested. The findings can help nursing instructors gain a better understanding of the nursing students' point of view on this evaluation approach and, as a result, plan for changing it.

  3. Aggregation, Validation, and Generalization of Qualitative Data - Methodological and Practical Research Strategies Illustrated by the Research Process of an empirically Based Typology.

    PubMed

    Weis, Daniel; Willems, Helmut

    2017-06-01

    The article deals with the question of how aggregated data which allow for generalizable insights can be generated from single-case based qualitative investigations. Thereby, two central challenges of qualitative social research are outlined: First, researchers must ensure that the single-case data can be aggregated and condensed so that new collective structures can be detected. Second, they must apply methods and practices to allow for the generalization of the results beyond the specific study. In the following, we demonstrate how and under what conditions these challenges can be addressed in research practice. To this end, the research process of the construction of an empirically based typology is described. A qualitative study, conducted within the framework of the Luxembourg Youth Report, is used to illustrate this process. Specifically, strategies are presented which increase the likelihood of generalizability or transferability of the results, while also highlighting their limitations.

  4. Gene function prediction based on Gene Ontology Hierarchy Preserving Hashing.

    PubMed

    Zhao, Yingwen; Fu, Guangyuan; Wang, Jun; Guo, Maozu; Yu, Guoxian

    2018-02-23

    Gene Ontology (GO) uses structured vocabularies (or terms) to describe the molecular functions, biological roles, and cellular locations of gene products in a hierarchical ontology. GO annotations associate genes with GO terms and indicate that the given gene products carry out the biological functions described by the relevant terms. However, predicting correct GO annotations for genes from the massive set of terms defined by GO is a difficult challenge. To address this challenge, we introduce a Gene Ontology Hierarchy Preserving Hashing (HPHash) based semantic method for gene function prediction. HPHash first measures the taxonomic similarity between GO terms. It then uses a hierarchy preserving hashing technique to keep the hierarchical order between GO terms and to optimize a series of hashing functions to encode massive GO terms via compact binary codes. After that, HPHash utilizes these hashing functions to project the gene-term association matrix into a low-dimensional one and performs semantic similarity based gene function prediction in the low-dimensional space. Experimental results on three model species (Homo sapiens, Mus musculus and Rattus norvegicus) for interspecies gene function prediction show that HPHash performs better than other related approaches and is robust to the number of hash functions. In addition, we also use HPHash as a plugin for BLAST based gene function prediction. From the experimental results, HPHash again significantly improves the prediction performance. The code for HPHash is available at: http://mlda.swu.edu.cn/codes.php?name=HPHash. Copyright © 2018 Elsevier Inc. All rights reserved.
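
    The general idea of encoding GO terms as compact binary codes and comparing them by Hamming distance can be sketched as below. The random-projection hashing here is only a stand-in for the learned, hierarchy-preserving hash functions of HPHash, and the toy similarity matrix is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy term-term taxonomic similarity matrix for 5 GO terms (assumed given).
S = rng.random((5, 5))
S = (S + S.T) / 2
np.fill_diagonal(S, 1.0)

# Stand-in hashing: random projections of the similarity rows into 8-bit codes.
# (HPHash instead *learns* projections that preserve the GO hierarchy.)
W = rng.standard_normal((5, 8))
codes = (S @ W > 0).astype(np.uint8)          # one 8-bit code per GO term

def hamming(a, b):
    return int(np.sum(a != b))

# Terms with similar codes are treated as semantically close, so a gene's
# annotations can be propagated to nearby terms in the code space.
print(hamming(codes[0], codes[1]), hamming(codes[0], codes[2]))
```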

  5. Exploring the Position of Community-Based Nursing in Iran: A Qualitative Study.

    PubMed

    Heydari, Heshmatolah; Rahnavard, Zahra; Ghaffari, Fatemeh

    2017-10-01

    Community-based nursing focuses on providing health services to families and communities at the second and third levels of prevention; it can improve the quality of life of individuals, families, and communities and reduce healthcare costs. The aim of this study was to explore the status of community-based nursing in Iran. This qualitative study was conducted from March to November 2015, in Tehran, Iran, using the content analysis approach. The study setting consisted of the Iran and Tehran Faculties of Nursing and Midwifery, Tehran, Iran. The purposive sampling method was used. Twenty faculty members and Master's and PhD students were interviewed using the face-to-face semi-structured interview method. Moreover, two focus groups were conducted to complement and enrich the study data. The data were analyzed using Graneheim and Lundman's approach to content analysis. The trustworthiness of the study findings was maintained by employing Lincoln and Guba's criteria of credibility, dependability, and confirmability. In total, 580 codes were generated and categorized into three main categories: conventional services, the necessity of creating infrastructures, and multidimensional outcomes of community-based nursing. Introducing community-based nursing into nursing education curricula and creating ample job opportunities for community-based nurses seem clearly essential.

  6. Affective-cognitive meta-bases versus structural bases of attitudes predict processing interest versus efficiency.

    PubMed

    See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R

    2013-08-01

    We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.

  7. Short-term solar flare prediction using image-case-based reasoning

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Fu; Li, Fei; Zhang, Huai-Peng; Yu, Da-Ren

    2017-10-01

    Solar flares strongly influence space weather and human activities, and their prediction is highly complex. Existing solutions, such as data-based and model-based approaches, share a common shortcoming: the lack of human engagement in the forecasting process. An image-case-based reasoning method is introduced to address this. The image case library is composed of SOHO/MDI longitudinal magnetograms, from which the maximum horizontal gradient, the length of the neutral line and the number of singular points are extracted for retrieving similar image cases. Genetic optimization algorithms are employed to optimize the weight assignment for image features and the number of similar image cases retrieved. Similar image cases, together with prediction results derived by majority voting over those cases, are output and shown to the forecaster so that his/her experience can be integrated into the final prediction. Experimental results demonstrate that the case-based reasoning approach performs slightly better than other methods and is more efficient, with forecasts further improved by human input.
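
    A minimal sketch of the retrieval-and-vote step follows: stored cases are described by the three magnetogram features named above, similar cases are retrieved by a weighted distance, and the prediction is a majority vote. The feature values, weights, and case labels are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Each stored case: (features, flared_within_24h). Features correspond to the
# three magnetogram descriptors above; values and weights are illustrative.
cases = [
    (np.array([0.82, 140.0, 7.0]), True),
    (np.array([0.31,  55.0, 2.0]), False),
    (np.array([0.74, 120.0, 6.0]), True),
    (np.array([0.20,  40.0, 1.0]), False),
]
weights = np.array([1.0, 0.01, 0.5])   # would be tuned by the genetic algorithm
k = 3

def predict(query):
    dists = [np.sum(weights * np.abs(query - f)) for f, _ in cases]
    nearest = np.argsort(dists)[:k]
    votes = [cases[i][1] for i in nearest]
    return sum(votes) > len(votes) / 2, [int(i) for i in nearest]

# Prediction plus the retrieved cases, which a forecaster could inspect.
flare, retrieved = predict(np.array([0.78, 130.0, 6.5]))
print(flare, retrieved)
```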

  8. Predicting MHC-II binding affinity using multiple instance regression

    PubMed Central

    EL-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2011-01-01

    Reliably predicting the ability of antigen peptides to bind to major histocompatibility complex class II (MHC-II) molecules is an essential step in developing new vaccines. Uncovering the amino acid sequence correlates of the binding affinity of MHC-II binding peptides is important for understanding pathogenesis and immune response. The task of predicting MHC-II binding peptides is complicated by the significant variability in their length. Most existing computational methods for predicting MHC-II binding peptides focus on identifying a nine-amino-acid core region in each binding peptide. We formulate the problems of qualitatively and quantitatively predicting flexible-length MHC-II peptides as multiple instance learning and multiple instance regression problems, respectively. Based on this formulation, we introduce MHCMIR, a novel method for predicting MHC-II binding affinity using multiple instance regression. We present results of experiments using several benchmark datasets that show that MHCMIR is competitive with the state-of-the-art methods for predicting MHC-II binding peptides. An online web server that implements the MHCMIR method for MHC-II binding affinity prediction is freely accessible at http://ailab.cs.iastate.edu/mhcmir. PMID:20855923

  9. Predicting performance and safety based on driver fatigue.

    PubMed

    Mollicone, Daniel; Kan, Kevin; Mott, Chris; Bartels, Rachel; Bruneau, Steve; van Wollen, Matthew; Sparrow, Amy R; Van Dongen, Hans P A

    2018-04-02

    Fatigue causes decrements in vigilant attention and reaction time and is a major safety hazard in the trucking industry. There is a need to quantify the relationship between driver fatigue and safety in terms of operationally relevant measures. Hard-braking events are a suitable measure for this purpose as they are relatively easily observed and are correlated with collisions and near-crashes. We developed an analytic approach that predicts driver fatigue based on a biomathematical model and then estimates hard-braking events as a function of predicted fatigue, controlling for time of day to account for systematic variations in exposure (traffic density). The analysis used de-identified data from a previously published, naturalistic field study of 106 U.S. commercial motor vehicle (CMV) drivers. Data analyzed included drivers' official duty logs, sleep patterns measured around the clock using wrist actigraphy, and continuous recording of vehicle data to capture hard-braking events. The curve relating predicted fatigue to hard-braking events showed that the frequency of hard-braking events increased as predicted fatigue levels worsened. For each increment on the fatigue scale, the frequency of hard-braking events increased by 7.8%. The results provide proof of concept for a novel approach that predicts fatigue based on drivers' sleep patterns and estimates driving performance in terms of an operational metric related to safety. The approach can be translated to practice by CMV operators to achieve a fatigue risk profile specific to their own settings, in order to support data-driven decisions about fatigue countermeasures that cost-effectively deliver quantifiable operational benefits. Copyright © 2018 Elsevier Ltd. All rights reserved.
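
    If the reported 7.8% increase per fatigue increment is read as multiplicative, the relationship between predicted fatigue and hard-braking frequency can be sketched as below; the baseline rate is an arbitrary placeholder, not a study value.

```python
def hard_braking_rate(fatigue_level, baseline_rate=1.0):
    """Hard-braking events per unit of driving time, assuming each one-step
    increase in predicted fatigue multiplies the rate by 1.078 (+7.8%).
    The baseline rate is a placeholder, not a value from the study."""
    return baseline_rate * 1.078 ** fatigue_level

for level in range(5):
    print(level, round(hard_braking_rate(level), 3))
```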

  10. The establishment and external validation of NIR qualitative analysis model for waste polyester-cotton blend fabrics.

    PubMed

    Li, Feng; Li, Wen-Xia; Zhao, Guo-Liang; Tang, Shi-Jun; Li, Xue-Jiao; Wu, Hong-Mei

    2014-10-01

    A series of 354 polyester-cotton blend fabrics was studied by near-infrared spectroscopy (NIRS), and a NIR qualitative analysis model for different spectral characteristics was established by the partial least squares (PLS) method combined with a qualitative identification coefficient. Dyed polyester-cotton blend fabrics showed two types of spectrum: a normal spectrum and a slash spectrum. The slash spectrum loses its spectral characteristics, which are affected by the samples' dyes, pigments, matting agents and other chemical additives. The recognition rate was low when the model was built from the total sample set, so the samples were divided into a normal-spectrum set and a slash-spectrum set, and two NIR qualitative analysis models were established separately. After the models were established, the spectral region, pretreatment methods and number of factors were optimized based on the validation results, improving the robustness and reliability of the models. The results showed that the recognition rate improved greatly when the models were established separately, reaching 99% when the two models were checked by internal validation. RC (correlation coefficient of calibration) values of the normal-spectrum model and slash-spectrum model were 0.991 and 0.991 respectively, RP (correlation coefficient of prediction) values were 0.983 and 0.984 respectively, SEC (standard error of calibration) values were 0.887 and 0.453 respectively, and SEP (standard error of prediction) values were 1.131 and 0.573 respectively. A further series of 150 samples was used for external validation of the normal-spectrum and slash-spectrum models, with recognition rates of 91.33% and 88.00% respectively. This shows that the NIR qualitative analysis model can be used for identification of polyester-cotton blend fabrics at recycling sites.

  11. Predicting online ratings based on the opinion spreading process

    NASA Astrophysics Data System (ADS)

    He, Xing-Sheng; Zhou, Ming-Yang; Zhuo, Zhao; Fu, Zhong-Qian; Liu, Jian-Guo

    2015-10-01

    Predicting users' online ratings is a challenging issue and has drawn much attention. In this paper, we present a rating prediction method that combines the user opinion spreading process with the collaborative filtering algorithm, where user similarity is defined by measuring the amount of opinion a user transfers to another based on the primitive user-item rating matrix. The proposed method produces a more precise rating prediction for each unrated user-item pair. In addition, we introduce a tunable parameter λ to regulate the preferential diffusion relevant to the degree of both opinion sender and receiver. The numerical results for the Movielens and Netflix data sets show that this algorithm has better accuracy than the standard user-based collaborative filtering algorithm using Cosine and Pearson correlation, without increasing computational complexity. By tuning λ, our method can further boost the prediction accuracy when using Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) as measurements. In the optimal cases, the MAE and RMSE are improved by 11.26% and 8.84% on Movielens and by 13.49% and 10.52% on Netflix, respectively, compared to the item average method.
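
    A minimal sketch of the idea follows: user similarity is taken from a simple opinion-transfer (diffusion-style) process on the rating matrix and then plugged into user-based collaborative filtering. The normalization used here is a simplified stand-in for the paper's λ-tuned preferential diffusion, and the ratings are toy data.

```python
import numpy as np

R = np.array([[5, 3, 0, 1],      # user-item rating matrix, 0 = unrated
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
rated = (R > 0).astype(float)

# Opinion spreading: each user spreads unit "opinion" over the items they
# rated; items pass it on to the users who rated them. The opinion user u
# transfers to user v is used as an (asymmetric) similarity.
item_deg = rated.sum(axis=0)
user_deg = rated.sum(axis=1)
S = (rated / user_deg[:, None]) @ (rated / np.where(item_deg > 0, item_deg, 1)).T

def predict(u, i):
    """Similarity-weighted average of other users' ratings on item i."""
    others = [v for v in range(R.shape[0]) if v != u and R[v, i] > 0]
    if not others:
        return float("nan")
    w = np.array([S[u, v] for v in others])
    return float(w @ R[others, i] / w.sum())

print(round(predict(0, 2), 2))   # predicted rating of user 0 on item 2
```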

  12. Polar body based aneuploidy screening is poorly predictive of embryo ploidy and reproductive potential.

    PubMed

    Salvaggio, C N; Forman, E J; Garnsey, H M; Treff, N R; Scott, R T

    2014-09-01

    Polar body biopsy represents one possible approach to comprehensive chromosome screening (CCS). This study adds to what is known about the predictive value of polar body based testing for the genetic status of the resulting embryo, but more importantly, provides the first evaluation of the predictive value for actual clinical outcomes after embryo transfer. SNP array analysis was performed on the first polar body, the second polar body, and either a blastomere or trophectoderm biopsy, or the entire arrested embryo. Concordance of the polar body-based prediction with the observed diagnoses in the embryos was assessed. In addition, the predictive value of the polar body-based diagnosis for the specific clinical outcome of transferred embryos was evaluated through the use of DNA fingerprinting to track individual embryos. There were 459 embryos analyzed from 96 patients with a mean maternal age of 35.3 years. The polar body-based predictive value for the embryo-based diagnosis was 70.3%. The blastocyst implantation predictive value of a euploid trophectoderm was higher than that of euploid polar bodies (51% versus 40%). The cleavage stage embryo implantation predictive value of a euploid blastomere was also higher than that of euploid polar bodies (31% versus 22%). Polar body based aneuploidy screening results were less predictive of actual clinical outcomes than direct embryo assessment and may not be adequate to improve sustained implantation rates. In nearly one-third of cases the polar body based analysis failed to predict the ploidy of the embryo. This imprecision may hinder efforts for polar body based CCS to improve IVF clinical outcomes.

  13. Appraising Qualitative Research in Health Education: Guidelines for Public Health Educators

    PubMed Central

    Jeanfreau, Scharalda G.; Jack, Leonard

    2010-01-01

    Research studies, including qualitative studies, form the basis for evidence-based practice among health professionals. However, many practicing health educators do not feel fully confident in their ability to critically appraise qualitative research studies. This publication presents an overview of qualitative research approaches, defines key terminology used in qualitative research, and provides guidelines for appraising the strengths and weaknesses of published qualitative research. On reading, health educators will be better equipped to evaluate the quality of the evidence through critical appraisals of qualitative research publications. PMID:20817630

  14. Predicting subcontractor performance using web-based Evolutionary Fuzzy Neural Networks.

    PubMed

    Ko, Chien-Ho

    2013-01-01

    Subcontractor performance directly affects project success. The use of inappropriate subcontractors may result in individual work delays, cost overruns, and quality defects throughout the project. This study develops web-based Evolutionary Fuzzy Neural Networks (EFNNs) to predict subcontractor performance. EFNNs are a fusion of Genetic Algorithms (GAs), Fuzzy Logic (FL), and Neural Networks (NNs). FL is primarily used to mimic high-level decision-making processes and deal with uncertainty in the construction industry. NNs are used to identify the association between previous performance and future status when predicting subcontractor performance. GAs optimize the parameters required by the FL and NN components. EFNNs encode FL and NNs using floating-point numbers to shorten the length of a string. A multi-cut-point crossover operator is used to explore the parameter space and retain solution legality. Finally, the applicability of the proposed EFNNs is validated using real subcontractors. The EFNNs are evolved using 22 historical patterns and tested using 12 unseen cases. Application results show that the proposed EFNNs surpass FL and NNs in predicting subcontractor performance. The proposed approach improves prediction accuracy and reduces the effort required to predict subcontractor performance, providing field operators with web-based remote access to a reliable, scientific prediction mechanism.

  15. Predicting Subcontractor Performance Using Web-Based Evolutionary Fuzzy Neural Networks

    PubMed Central

    2013-01-01

    Subcontractor performance directly affects project success. The use of inappropriate subcontractors may result in individual work delays, cost overruns, and quality defects throughout the project. This study develops web-based Evolutionary Fuzzy Neural Networks (EFNNs) to predict subcontractor performance. EFNNs are a fusion of Genetic Algorithms (GAs), Fuzzy Logic (FL), and Neural Networks (NNs). FL is primarily used to mimic high-level decision-making processes and deal with uncertainty in the construction industry. NNs are used to identify the association between previous performance and future status when predicting subcontractor performance. GAs optimize the parameters required by the FL and NN components. EFNNs encode FL and NNs using floating-point numbers to shorten the length of a string. A multi-cut-point crossover operator is used to explore the parameter space and retain solution legality. Finally, the applicability of the proposed EFNNs is validated using real subcontractors. The EFNNs are evolved using 22 historical patterns and tested using 12 unseen cases. Application results show that the proposed EFNNs surpass FL and NNs in predicting subcontractor performance. The proposed approach improves prediction accuracy and reduces the effort required to predict subcontractor performance, providing field operators with web-based remote access to a reliable, scientific prediction mechanism. PMID:23864830

  16. Data base for the prediction of inlet external drag

    NASA Technical Reports Server (NTRS)

    Mcmillan, O. J.; Perkins, E. W.; Perkins, S. C., Jr.

    1980-01-01

    Results are presented from a study to define and evaluate the data base for predicting an airframe/propulsion system interference effect shown to be of considerable importance, inlet external drag. The study is focused on supersonic tactical aircraft with highly integrated jet propulsion systems, although some information is included for supersonic strategic aircraft and for transport aircraft designed for high subsonic or low supersonic cruise. The data base for inlet external drag is considered to consist of the theoretical and empirical prediction methods as well as the experimental data identified in an extensive literature search. The state of the art in the subsonic and transonic speed regimes is evaluated. The experimental data base is organized and presented in a series of tables in which the test article, the quantities measured and the ranges of test conditions covered are described for each set of data; in this way, the breadth of coverage and gaps in the existing experimental data are evident. Prediction methods are categorized by method of solution, type of inlet and speed range to which they apply, major features are given, and their accuracy is assessed by means of comparison to experimental data.

  17. Systematic Braiding of Two Evidence-Based Parent Training Programs: Qualitative Results from the Pilot Phase

    PubMed Central

    Guastaferro, Kate; Miller, Katy; Shanley Chatham, Jenelle R.; Whitaker, Daniel J.; McGilly, Kate; Lutzker, John R.

    2017-01-01

    An effective approach in early intervention for children and families, including child maltreatment prevention, is home-based services. Though several evidence-based programs exist, they are often grouped together, despite having different foci. This paper describes an ongoing cluster randomized trial systematically braiding two evidence-based home-based models, SafeCare® and Parents as Teachers (PAT)®, to better meet the needs of families at-risk. We describe the methodology for braiding model implementation and curriculum, specifically focusing on how structured qualitative feedback from pilot families and providers was used to create the braided curriculum and implementation. Systematic braiding of two models at the implementation and curriculum levels is a mechanism that has the potential to meet the more comprehensive needs of families at-risk for maltreatment. PMID:27870760

  18. Prediction of trabecular bone qualitative properties using scanning quantitative ultrasound

    NASA Astrophysics Data System (ADS)

    Qin, Yi-Xian; Lin, Wei; Mittra, Erik; Xia, Yi; Cheng, Jiqi; Judex, Stefan; Rubin, Clint; Müller, Ralph

    2013-11-01

    Microgravity-induced bone loss represents a critical health problem for astronauts, particularly in the weight-bearing skeleton, where it leads to osteopenia and an increased risk of fracture. The lack of a suitable evaluation modality makes it difficult to monitor skeletal status during long-term space missions and increases the potential risk of complications. Such disuse osteopenia and osteoporosis compromise trabecular bone density as well as architectural and mechanical properties. While X-ray based imaging would not be practical in space, quantitative ultrasound may provide advantages for characterizing bone density and strength through wave propagation in the complex trabecular structure. This study used a scanning confocal acoustic diagnostic and navigation system (SCAN) to evaluate trabecular bone quality in 60 cubic trabecular samples harvested from adult sheep. Ultrasound image based SCAN measurements of structural and strength properties were validated against μCT and compressive mechanical testing. The results indicated a moderately strong negative correlation between broadband ultrasonic attenuation (BUA) and μCT-determined bone volume fraction (BV/TV, R2=0.53). Strong correlations were observed between ultrasound velocity (UV) and the bone's mechanical strength and structural parameters, i.e., bulk Young's modulus (R2=0.67) and BV/TV (R2=0.85). The predictions of bone density and mechanical strength were significantly improved by using a linear combination of both BUA and UV, yielding R2=0.92 for BV/TV and R2=0.71 for bulk Young's modulus. These results imply that quantitative ultrasound can characterize trabecular structural and mechanical properties through measurements of particular ultrasound parameters, and can potentially provide an excellent estimation of bone's structural integrity.
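
    The "linear combination of both BUA and UV" can be sketched as an ordinary least-squares fit; the measurements below are synthetic placeholders, not the sheep-sample data.

```python
import numpy as np

# Synthetic (BUA, UV) measurements and BV/TV values, standing in for the
# 60 trabecular samples measured with the SCAN system.
bua  = np.array([18.0, 25.0, 31.0, 40.0, 52.0])
uv   = np.array([1620., 1585., 1560., 1530., 1500.])
bvtv = np.array([0.32, 0.27, 0.24, 0.19, 0.14])

# Least-squares fit of BV/TV = a*BUA + b*UV + c.
X = np.column_stack([bua, uv, np.ones_like(bua)])
coef, *_ = np.linalg.lstsq(X, bvtv, rcond=None)

pred = X @ coef
ss_res = np.sum((bvtv - pred) ** 2)
ss_tot = np.sum((bvtv - bvtv.mean()) ** 2)
print(coef, round(1 - ss_res / ss_tot, 3))   # fitted coefficients and R^2
```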

  19. Thai nursing students' adaption to problem-based learning: a qualitative study.

    PubMed

    Klunklin, Areewan; Subpaiboongid, Pornpun; Keitlertnapha, Pongsri; Viseskul, Nongkran; Turale, Sue

    2011-11-01

    Student-centred forms of learning have gained favour internationally over the last few decades, including problem-based learning, an approach now incorporated in medicine, nursing and other disciplines' education in many countries. However, it is still new in Thailand, where it is being piloted to offset traditional didactic, teacher-centred forms of teaching. In this qualitative study, 25 undergraduate nursing students in northern Thailand were interviewed about their experiences with problem-based learning in a health promotion subject. Content analysis was used to interrogate the interview data, which revealed four categories: adapting, seeking assistance, self-development, and thinking process development. Initially participants had mixed emotions of confusion, negativity or boredom during the adaptation process, but expressed satisfaction with creativity in learning, group work, and leadership development. They described increased abilities to problem solve and think critically, but struggled to develop questioning behaviours in learning. Socio-culturally, in Thai education students have great respect for teachers, but rarely question or challenge them or their learning. We conclude that problem-based learning has great potential in Thai nursing education, but educators and systems need to systematically prepare appropriate learning environments, their staff and students, to incorporate this within curricula. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  1. A hadoop-based method to predict potential effective drug combination.

    PubMed

    Sun, Yifan; Xiong, Yi; Xu, Qian; Wei, Dongqing

    2014-01-01

    Combination drugs that impact multiple targets simultaneously are promising candidates for combating complex diseases due to their improved efficacy and reduced side effects. However, exhaustive screening of all possible drug combinations is extremely time-consuming and impractical. Here, we present a novel Hadoop-based approach to predict drug combinations by taking advantage of the MapReduce programming model, which improves the scalability of the prediction algorithm. By integrating the gene expression data of multiple drugs, we implemented the data preprocessing and the support vector machine and naïve Bayes classifiers on Hadoop for the prediction of drug combinations. The experimental results suggest that our Hadoop-based model achieves much higher efficiency in the big data processing steps with satisfactory performance. We believe that our proposed approach can help accelerate the prediction of potentially effective drug combinations as the number of candidate combinations grows exponentially in the future. The source code and datasets are available upon request.
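
    The MapReduce pattern itself can be illustrated without a Hadoop cluster: a map step scores candidate drug pairs and a reduce step aggregates the results. The signatures and the toy complementarity score below are hypothetical; they stand in for the SVM and naïve Bayes classifiers trained in the paper.

```python
from functools import reduce
from itertools import combinations

# Placeholder per-drug gene-expression signatures (gene -> up/down score).
signatures = {
    "drugA": {"g1": 1.2, "g2": -0.4, "g3": 0.8},
    "drugB": {"g1": -1.0, "g2": 0.9, "g3": 0.1},
    "drugC": {"g1": 0.3, "g2": -0.2, "g3": -0.9},
}

def map_pair(pair):
    """Map step: emit (pair, score); here a toy anti-correlation score."""
    a, b = pair
    genes = signatures[a].keys() & signatures[b].keys()
    score = -sum(signatures[a][g] * signatures[b][g] for g in genes)
    return pair, score

def reduce_best(best, item):
    """Reduce step: keep the highest-scoring combination seen so far."""
    return item if item[1] > best[1] else best

mapped = map(map_pair, combinations(signatures, 2))
print(reduce(reduce_best, mapped, (None, float("-inf"))))
```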

  2. A Hadoop-Based Method to Predict Potential Effective Drug Combination

    PubMed Central

    Xiong, Yi; Xu, Qian; Wei, Dongqing

    2014-01-01

    Combination drugs that impact multiple targets simultaneously are promising candidates for combating complex diseases due to their improved efficacy and reduced side effects. However, exhaustive screening of all possible drug combinations is extremely time-consuming and impractical. Here, we present a novel Hadoop-based approach to predict drug combinations by taking advantage of the MapReduce programming model, which improves the scalability of the prediction algorithm. By integrating the gene expression data of multiple drugs, we implemented the data preprocessing and the support vector machine and naïve Bayes classifiers on Hadoop for the prediction of drug combinations. The experimental results suggest that our Hadoop-based model achieves much higher efficiency in the big data processing steps with satisfactory performance. We believe that our proposed approach can help accelerate the prediction of potentially effective drug combinations as the number of candidate combinations grows exponentially in the future. The source code and datasets are available upon request. PMID:25147789

  3. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    DTIC Science & Technology

    2012-09-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due... future evolution of the system, accumulating additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of... uncertainty are misrepresenting the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in

  4. A qualitative study exploring adolescents' experiences with a school-based mental health program.

    PubMed

    Garmy, Pernilla; Berg, Agneta; Clausson, Eva K

    2015-10-21

    Supporting positive mental health development in adolescents is a major public health concern worldwide. Although several school-based programs aimed at preventing depression have been launched, it is crucial to evaluate these programs and to obtain feedback from participating adolescents. This study aimed to explore adolescents' experiences with a school-based cognitive-behavioral depression prevention program. Eighty-nine adolescents aged 13-15 years were divided into 12 focus groups. The focus group interviews were analyzed using qualitative content analysis. Three categories and eight subcategories were found to be related to the experience of the school-based program. The first category, intrapersonal strategies, consisted of the subcategories of directed thinking, improved self-confidence, stress management, and positive activities. The second category, interpersonal awareness, consisted of the subcategories of trusting the group and considering others. The third category, structural constraints, consisted of the subcategories of negative framing and emphasis on performance. The school-based mental health program was perceived as beneficial and meaningful on both individual and group levels, but students expressed a desire for a more health-promoting approach.

  5. DNA methylation-based age prediction from various tissues and body fluids

    PubMed Central

    Jung, Sang-Eun; Shin, Kyoung-Jin; Lee, Hwan Young

    2017-01-01

    Aging is a natural and gradual process in human life. It is influenced by heredity, environment, lifestyle, and disease. DNA methylation varies with age, and the ability to predict the age of a donor using DNA from evidence materials at a crime scene is of considerable value in forensic investigations. Recently, many studies have reported age prediction models based on DNA methylation from various tissues and body fluids. Those models seem very promising because of their high prediction accuracies. In this review, the changes in age-associated DNA methylation and the age prediction models for various tissues and body fluids are examined, and the applicability of DNA methylation-based age prediction to forensic investigations is discussed. This will improve understanding of DNA methylation markers and their potential to be used as biomarkers in the forensic field, as well as the clinical field. PMID:28946940
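
    Published age prediction models of this kind are typically linear combinations of methylation beta values at selected CpG sites. The sketch below shows that form only; the site names, coefficients, and intercept are invented, whereas real models use panels fitted to training cohorts.

```python
import numpy as np

# Hypothetical panel: age = intercept + sum(coef_i * beta_i) over CpG sites.
CPG_SITES = ["cg_site_1", "cg_site_2", "cg_site_3"]
COEFS = np.array([65.0, -40.0, 28.0])     # made-up coefficients
INTERCEPT = 20.0                          # made-up intercept

def predict_age(betas):
    """betas: methylation beta values (0..1) in the order of CPG_SITES."""
    return float(INTERCEPT + COEFS @ np.asarray(betas, dtype=float))

print(round(predict_age([0.55, 0.30, 0.42]), 1))   # predicted age in years
```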

  6. The implementation of two stages clustering (k-means clustering and adaptive neuro fuzzy inference system) for prediction of medicine need based on medical data

    NASA Astrophysics Data System (ADS)

    Husein, A. M.; Harahap, M.; Aisyah, S.; Purba, W.; Muhazir, A.

    2018-03-01

    Medication planning aims to determine the types and amounts of medicine according to need and to avoid medicine shortages based on patterns of disease. In practice, medicine planning still relies on the ability and experience of leadership; it takes a long time and considerable skill, definite disease data are difficult to obtain, good record keeping and reporting are required, and dependence on the budget means that planning often does not go well, leading to frequent shortages and surpluses of medicines. In this research, we propose an Adaptive Neuro Fuzzy Inference System (ANFIS) method to predict medicine needs in 2016 and 2017 based on medical data from 2015 and 2016 from two hospital sources. The analysis framework uses two approaches: in the first, ANFIS is applied to the data source directly; in the second, ANFIS is applied after clustering with the K-Means algorithm. For both approaches, Root Mean Square Error (RMSE) values are calculated for training and testing. The testing results show that the proposed method achieves better prediction rates than existing systems based on quantitative and qualitative evaluation; applying the K-Means algorithm before ANFIS, however, affects the duration of the training process while providing significantly better classification accuracy than ANFIS without clustering.
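
    The two-stage idea (cluster first, then fit a predictor per cluster) can be sketched as follows. An ordinary linear regression is used here as a simple stand-in for ANFIS, and the monthly usage data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic monthly records: [cases of disease A, cases of disease B] and
# the corresponding medicine demand (stand-ins for the hospital data).
X = rng.integers(10, 200, size=(60, 2)).astype(float)
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 5, 60)

# Stage 1: cluster the records; Stage 2: fit one model per cluster
# (LinearRegression replaces ANFIS in this sketch).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in range(3)}

def predict(record):
    r = np.asarray(record, dtype=float).reshape(1, -1)
    c = int(km.predict(r)[0])
    return float(models[c].predict(r)[0])

print(round(predict([120, 40]), 1))   # predicted medicine need for one month
```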

  7. A vertical handoff decision algorithm based on ARMA prediction model

    NASA Astrophysics Data System (ADS)

    Li, Ru; Shen, Jiao; Chen, Jun; Liu, Qiuhuan

    2012-01-01

    With the development of computer technology and the increasing demand for mobile communications, next generation wireless networks will be composed of various wireless networks (e.g., WiMAX and WiFi). Vertical handoff is a key technology of next generation wireless networks, and during the vertical handoff procedure the handoff decision is a crucial issue for efficient mobility. Based on an autoregressive moving average (ARMA) prediction model, we propose a vertical handoff decision algorithm that aims to improve handoff performance and avoid unnecessary handoffs. Based on the current received signal strength (RSS) and the previous RSS, the proposed approach adopts an ARMA model to predict the next RSS, and then uses the predicted RSS to determine whether to trigger the link layer triggering event and complete the vertical handoff. The simulation results indicate that the proposed algorithm outperforms the threshold-based RSS scheme in handoff performance and the number of handoffs.
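
    A minimal sketch of the prediction step follows, fitting an ARMA model (via statsmodels' ARIMA with d = 0) to an RSS trace and comparing the one-step forecast to a handoff threshold. The model order, threshold, and synthetic trace are assumptions for illustration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Synthetic RSS trace (dBm) from the serving network, drifting downward.
rss = -60.0 - 0.4 * np.arange(80) + rng.normal(0, 1.0, 80)

# ARMA(2,1) expressed as ARIMA(p=2, d=0, q=1); order is illustrative.
fit = ARIMA(rss, order=(2, 0, 1)).fit()
next_rss = float(fit.forecast(steps=1)[0])

HANDOFF_THRESHOLD = -85.0   # dBm, illustrative
print(round(next_rss, 1),
      "trigger handoff" if next_rss < HANDOFF_THRESHOLD else "stay")
```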

  8. Stringent DDI-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions.

    PubMed

    Zhou, Hufeng; Rezaei, Javad; Hugo, Willy; Gao, Shangzhi; Jin, Jingjing; Fan, Mengyuan; Yong, Chern-Han; Wozniak, Michal; Wong, Limsoon

    2013-01-01

    H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data provide very important information for illuminating the infection mechanism of M. tuberculosis H37Rv. But current H. sapiens-M. tuberculosis H37Rv PPI data are very scarce. This seriously limits the study of the interaction between this important pathogen and its host H. sapiens. Computational prediction of H. sapiens-M. tuberculosis H37Rv PPIs is an important strategy to fill in the gap. Domain-domain interaction (DDI) based prediction is one of the frequently used computational approaches in predicting both intra-species and inter-species PPIs. However, the performance of DDI-based host-pathogen PPI prediction has been rather limited. We develop a stringent DDI-based prediction approach with emphasis on (i) differences between the specific domain sequences on annotated regions of proteins under the same domain ID and (ii) calculation of the interaction strength of predicted PPIs based on the interacting residues in their interaction interfaces. We compare our stringent DDI-based approach to a conventional DDI-based approach for predicting PPIs based on gold standard intra-species PPIs and coherent informative Gene Ontology terms assessment. The assessment results show that our stringent DDI-based approach achieves much better performance in predicting PPIs than the conventional approach. Using our stringent DDI-based approach, we have predicted a small set of reliable H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. We also analyze the H. sapiens-M. tuberculosis H37Rv PPIs predicted by our stringent DDI-based approach using cellular compartment distribution analysis, functional category enrichment analysis and pathway enrichment analysis. The analyses support the validity of our prediction result. Also, based on an analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent DDI-based approach, we have discovered some
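
    The core of conventional DDI-based PPI prediction can be sketched as scoring a host-pathogen protein pair by its interacting domain pairs. The domain assignments, DDI scores, and threshold below are hypothetical, and the sequence-level and interface-residue refinements of the stringent approach are deliberately omitted.

```python
# Hypothetical domain annotations (protein -> Pfam-like domain IDs) and
# domain-domain interaction scores; real pipelines take these from curated
# DDI databases and per-protein domain hits.
domains = {
    "human_P1": {"PF0001", "PF0007"},
    "mtb_RvX":  {"PF0003", "PF0010"},
}
ddi_scores = {("PF0001", "PF0003"): 0.8, ("PF0007", "PF0010"): 0.3}

def ppi_score(prot_a, prot_b, threshold=0.5):
    """Score a host-pathogen pair by summing the scores of its interacting
    domain pairs; predict an interaction if the total passes the threshold."""
    total = 0.0
    for da in domains[prot_a]:
        for db in domains[prot_b]:
            total += ddi_scores.get((da, db), 0.0) + ddi_scores.get((db, da), 0.0)
    return total, total >= threshold

print(ppi_score("human_P1", "mtb_RvX"))   # (1.1, True)
```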

  9. Stringent DDI-based Prediction of H. sapiens-M. tuberculosis H37Rv Protein-Protein Interactions

    PubMed Central

    2013-01-01

    Background: H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data provide very important information for illuminating the infection mechanism of M. tuberculosis H37Rv. But current H. sapiens-M. tuberculosis H37Rv PPI data are very scarce. This seriously limits the study of the interaction between this important pathogen and its host H. sapiens. Computational prediction of H. sapiens-M. tuberculosis H37Rv PPIs is an important strategy to fill in the gap. Domain-domain interaction (DDI) based prediction is one of the frequently used computational approaches in predicting both intra-species and inter-species PPIs. However, the performance of DDI-based host-pathogen PPI prediction has been rather limited. Results: We develop a stringent DDI-based prediction approach with emphasis on (i) differences between the specific domain sequences on annotated regions of proteins under the same domain ID and (ii) calculation of the interaction strength of predicted PPIs based on the interacting residues in their interaction interfaces. We compare our stringent DDI-based approach to a conventional DDI-based approach for predicting PPIs based on gold standard intra-species PPIs and coherent informative Gene Ontology terms assessment. The assessment results show that our stringent DDI-based approach achieves much better performance in predicting PPIs than the conventional approach. Using our stringent DDI-based approach, we have predicted a small set of reliable H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. We also analyze the H. sapiens-M. tuberculosis H37Rv PPIs predicted by our stringent DDI-based approach using cellular compartment distribution analysis, functional category enrichment analysis and pathway enrichment analysis. The analyses support the validity of our prediction result. Also, based on an analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent DDI-based approach, we have

  10. Prediction of pump cavitation performance

    NASA Technical Reports Server (NTRS)

    Moore, R. D.

    1974-01-01

    A method for predicting pump cavitation performance with various liquids, liquid temperatures, and rotative speeds is presented. Use of the method requires that two sets of test data be available for the pump of interest. Good agreement between predicted and experimental results of cavitation performance was obtained for several pumps operated in liquids which exhibit a wide range of properties. Two cavitation parameters which qualitatively evaluate pump cavitation performance are also presented.

  11. Accurate prediction of energy expenditure using a shoe-based activity monitor.

    PubMed

    Sazonova, Nadezhda; Browning, Raymond C; Sazonov, Edward

    2011-07-01

    The aim of this study was to develop and validate a method for predicting energy expenditure (EE) using a footwear-based system with integrated accelerometer and pressure sensors. We developed a footwear-based device with an embedded accelerometer and insole pressure sensors for the prediction of EE. The data from the device can be used to perform accurate recognition of major postures and activities and to estimate EE using the acceleration, pressure, and posture/activity classification information in a branched algorithm without the need for individual calibration. We measured EE via indirect calorimetry as 16 adults (body mass index = 19-39 kg·m-2) performed various low- to moderate-intensity activities and compared measured versus predicted EE using several models based on the acceleration and pressure signals. Inclusion of pressure data resulted in better accuracy of EE prediction during static postures such as sitting and standing. The activity-based branched model that included predictors from accelerometer and pressure sensors (BACC-PS) achieved the lowest error (e.g., root mean squared error (RMSE) = 0.69 METs) compared with the accelerometer-only branched model BACC (RMSE = 0.77 METs) and nonbranched models (RMSE = 0.94-0.99 METs). Comparison of EE prediction models using data from both legs versus models using data from a single leg indicates that only one shoe needs to be equipped with sensors. These results suggest that foot acceleration combined with insole pressure measurement, when used in an activity-specific branched model, can accurately estimate the EE associated with common daily postures and activities. The accuracy and unobtrusiveness of a footwear-based device may make it an effective physical activity monitoring tool.
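
    A branched (activity-specific) model of this kind can be sketched as a posture classifier followed by a per-activity equation. The thresholds, coefficients, and feature definitions below are placeholders, not the study's fitted models.

```python
def classify_posture(mean_pressure, accel_var):
    """Toy posture/activity classifier from insole pressure and acceleration
    variance; thresholds are placeholders, not the study's classifier."""
    if accel_var > 0.5:
        return "walk"
    return "stand" if mean_pressure > 0.6 else "sit"

# Per-activity linear EE models (METs); coefficients are illustrative.
BRANCH_MODELS = {
    "sit":   lambda p, a: 1.0 + 0.1 * a,
    "stand": lambda p, a: 1.2 + 0.2 * a,
    "walk":  lambda p, a: 2.0 + 1.5 * a + 0.3 * p,
}

def predict_mets(mean_pressure, accel_var):
    posture = classify_posture(mean_pressure, accel_var)
    return posture, BRANCH_MODELS[posture](mean_pressure, accel_var)

print(predict_mets(0.7, 0.05))   # ('stand', 1.21)
print(predict_mets(0.8, 1.2))    # ('walk', 4.04)
```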

  12. Non-Fourier based thermal-mechanical tissue damage prediction for thermal ablation.

    PubMed

    Li, Xin; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-01-02

    Prediction of tissue damage under thermal loads plays an important role in thermal ablation planning. A new methodology is presented in this paper by combining non-Fourier bio-heat transfer, constitutive elastic mechanics, and non-rigid motion dynamics to predict and analyze the thermal distribution, thermally induced mechanical deformation, and thermal-mechanical damage of soft tissues under thermal loads. Simulations and comparison analysis demonstrate that the proposed methodology, based on non-Fourier bio-heat transfer, can account for the thermally induced mechanical behaviors of soft tissues and predict tissue thermal damage more accurately than the classical Fourier bio-heat transfer based model.

  13. Laser-Based Trespassing Prediction in Restrictive Environments: A Linear Approach

    PubMed Central

    Cheein, Fernando Auat; Scaglia, Gustavo

    2012-01-01

    Stationary range laser sensors for intruder monitoring, restricted space violation detection and workspace determination are extensively used in risky environments. In this work we present a linear approach for predicting the presence of moving agents before they trespass into a laser-defined restricted space. Our approach is based on a Taylor series expansion of the detected objects' movements, which makes our proposal suitable for embedded applications. In the experimental results (carried out in different scenarios) presented herein, our proposal shows 100% effectiveness in predicting trespassing situations. Several implementation results and statistical analyses showing the performance of our proposal are included in this work.
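
    The Taylor-expansion idea can be sketched as follows: estimate velocity and acceleration from the last few laser detections by finite differences, extrapolate the position a short horizon ahead, and test whether it falls inside the restricted zone. The zone geometry, sampling period, and track values are assumptions for illustration.

```python
def predict_position(p_prev2, p_prev1, p_now, dt, horizon):
    """Second-order Taylor extrapolation of a single coordinate."""
    v = (p_now - p_prev1) / dt                      # finite-difference velocity
    a = (p_now - 2 * p_prev1 + p_prev2) / dt ** 2   # finite-difference acceleration
    return p_now + v * horizon + 0.5 * a * horizon ** 2

def will_trespass(track_x, track_y, dt, horizon, zone):
    """zone = (xmin, xmax, ymin, ymax) of the laser-defined restricted space."""
    x = predict_position(*track_x[-3:], dt, horizon)
    y = predict_position(*track_y[-3:], dt, horizon)
    xmin, xmax, ymin, ymax = zone
    return xmin <= x <= xmax and ymin <= y <= ymax

# A walker approaching a restricted strip around x in [5, 7], y in [0, 10].
print(will_trespass([2.0, 3.0, 4.1], [1.0, 1.1, 1.2], dt=0.5, horizon=1.0,
                    zone=(5.0, 7.0, 0.0, 10.0)))   # True
```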

  14. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies, and using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  15. Aced Out: Censorship of Qualitative Research in the Age of "Scientifically Based Research"

    ERIC Educational Resources Information Center

    Ceglowski, Deborah; Bacigalupa, Chiara; Peck, Emery

    2011-01-01

    In this manuscript, we examine three layers of censorship related to the publication of qualitative research studies: (a) the global level of federal legislation and the definition of the "gold standard" of educational research, (b) the decline in the number of qualitative studies published in a top-tiered early childhood educational…

  16. Two approaches to improving mental health care: positivist/quantitative versus skill-based/qualitative.

    PubMed

    Luchins, Daniel

    2012-01-01

    The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.

  17. Phosphate-based glasses: Prediction of acoustical properties

    NASA Astrophysics Data System (ADS)

    El-Moneim, Amin Abd

    2016-04-01

    In this work, a comprehensive study has been carried out to predict the composition dependence of the bulk modulus and ultrasonic attenuation coefficient in the phosphate-based glass systems PbO-P2O5, Li2O-TeO2-B2O3-P2O5, TiO2-Na2O-CaO-P2O5 and Cr2O3-doped Na2O-ZnO-P2O5 at room temperature. The prediction is based on (i) the Makishima-Mackenzie theory, which correlates the bulk modulus with packing density and dissociation energy per unit volume, and (ii) our recently presented semi-empirical formulas, which correlate the ultrasonic attenuation coefficient with the oxygen density, mean atomic ring size, first-order stretching force constant and experimental bulk modulus. Results revealed that our recently presented semi-empirical formulas can be applied successfully to predict changes in the ultrasonic attenuation coefficient in binary PbO-P2O5 glasses at 10 MHz frequency and in quaternary Li2O-TeO2-B2O3-P2O5, TiO2-Na2O-CaO-P2O5 and Cr2O3-Na2O-ZnO-P2O5 glasses at 5 MHz frequency. Also, the Makishima-Mackenzie theory appears to be valid for the studied glasses if the effect of the basic structural units present in the glass network is taken into account.
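
    For reference, the Makishima-Mackenzie relations linking elastic moduli to packing density and dissociation energy are commonly quoted in the form below; this is the standard textbook form stated as background (and worth checking against the original papers), not a formula taken from the abstract.

```latex
% V_t : packing density,  G_t : dissociation energy per unit volume
% (commonly quoted Makishima-Mackenzie form; background, not from the abstract)
E = 2\,V_t\,G_t, \qquad K = 1.2\,V_t\,E = 2.4\,V_t^{2}\,G_t
```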

  18. Predicting neuroblastoma using developmental signals and a logic-based model.

    PubMed

    Kasemeier-Kulesa, Jennifer C; Schnell, Santiago; Woolley, Thomas; Spengler, Jennifer A; Morrison, Jason A; McKinney, Mary C; Pushel, Irina; Wolfe, Lauren A; Kulesa, Paul M

    2018-07-01

    Genomic information from human patient samples of pediatric neuroblastoma cancers and known outcomes has led to specific gene lists put forward as high risk for disease progression. However, the reliance on gene expression correlations rather than mechanistic insight has shown limited potential and suggests a critical need for molecular network models that better predict neuroblastoma progression. In this study, we construct and simulate a molecular network of developmental genes and downstream signals in a 6-gene input logic model that predicts a favorable/unfavorable outcome based on the outcomes of four cell states: differentiation, proliferation, apoptosis, and angiogenesis. We simulate the mis-expression of the receptor tyrosine kinases TrkA and TrkB, two prognostic indicators of neuroblastoma, and find differences in the number and probability distribution of steady-state outcomes. We validate the mechanistic model assumptions using RNAseq of the SH-SY5Y human neuroblastoma cell line to define the input states and confirm the predicted outcome with antibody staining. Lastly, we apply input gene signatures from 77 published human patient samples and show that our model makes more accurate disease outcome predictions for early stage disease than any current neuroblastoma gene list. These findings highlight the predictive strength of a logic-based model based on developmental genes and offer a better understanding of the molecular network interactions during neuroblastoma disease progression. Copyright © 2018. Published by Elsevier B.V.
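
    The flavor of a logic-based model can be shown with a tiny Boolean network that maps input genes to the four cell states and then to an outcome. The rules, gene choices, and outcome criterion below are invented for illustration; they are not the published 6-gene model.

```python
from itertools import product

def cell_states(trkA, trkB, mycn):
    """Toy Boolean rules mapping three input genes to four cell states.
    These rules are illustrative only, not the paper's network."""
    differentiation = trkA and not mycn
    proliferation   = mycn or trkB
    apoptosis       = trkA and not trkB
    angiogenesis    = trkB and mycn
    return differentiation, proliferation, apoptosis, angiogenesis

def outcome(states):
    diff, prolif, apop, angio = states
    favorable = (diff or apop) and not (prolif and angio)
    return "favorable" if favorable else "unfavorable"

# Enumerate all input combinations, as a logic model's steady-state analysis
# would, and report the outcome for each.
for inputs in product([False, True], repeat=3):
    print(inputs, outcome(cell_states(*inputs)))
```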

  19. City traffic flow breakdown prediction based on fuzzy rough set

    NASA Astrophysics Data System (ADS)

    Yang, Xu; Da-wei, Hu; Bing, Su; Duo-jia, Zhang

    2017-05-01

    In city traffic management, traffic breakdown is a very important issue; it is defined as a speed drop of a certain amount within a dense traffic situation. In order to predict city traffic flow breakdown accurately, in this paper we propose a novel city traffic flow breakdown prediction algorithm based on fuzzy rough sets. Firstly, we illustrate the city traffic flow breakdown problem, for which three definitions are given: 1) the pre-breakdown flow rate; 2) the rate, density, and speed of the traffic flow breakdown; and 3) the duration of the traffic flow breakdown. Moreover, we define a hazard function to represent the probability of the breakdown ending at a given time point. Secondly, as there are many redundant and irrelevant attributes in city traffic flow breakdown prediction, we propose an attribute reduction algorithm using the fuzzy rough set. Thirdly, we discuss how to predict the city traffic flow breakdown based on attribute reduction and an SVM classifier. Finally, experiments are conducted on data collected from the I-405 freeway in Irvine, California. Experimental results demonstrate that the proposed algorithm is able to achieve a lower average error rate of city traffic flow breakdown prediction.

  20. Image-Based Predictive Modeling of Heart Mechanics.

    PubMed

    Wang, V Y; Nielsen, P M F; Nash, M P

    2015-01-01

    Personalized biophysical modeling of the heart is a useful approach for noninvasively analyzing and predicting in vivo cardiac mechanics. Three main developments support this style of analysis: state-of-the-art cardiac imaging technologies, modern computational infrastructure, and advanced mathematical modeling techniques. In vivo measurements of cardiac structure and function can be integrated using sophisticated computational methods to investigate mechanisms of myocardial function and dysfunction, and can aid in clinical diagnosis and developing personalized treatment. In this article, we review the state-of-the-art in cardiac imaging modalities, model-based interpretation of 3D images of cardiac structure and function, and recent advances in modeling that allow personalized predictions of heart mechanics. We discuss how using such image-based modeling frameworks can increase the understanding of the fundamental biophysics behind cardiac mechanics, and assist with diagnosis, surgical guidance, and treatment planning. Addressing the challenges in this field will require a coordinated effort from both the clinical-imaging and modeling communities. We also discuss future directions that can be taken to bridge the gap between basic science and clinical translation.

  1. Writing usable qualitative health research findings.

    PubMed

    Sandelowski, Margarete; Leeman, Jennifer

    2012-10-01

    Scholars in diverse health-related disciplines and specialty fields of practice routinely promote qualitative research as an essential component of intervention and implementation programs of research and of a comprehensive evidence base for practice. Remarkably little attention, however, has been paid to the most important element of qualitative studies--the findings in reports of those studies--and specifically to enhancing the accessibility and utilization value of these findings for diverse audiences of users. The findings in reports of qualitative health research are too often difficult to understand and even to find owing to the way they are presented. A basic strategy for enhancing the presentation of these findings is to translate them into thematic statements, which can then in turn be translated into the language of intervention and implementation. Writers of qualitative health research reports might consider these strategies better to showcase the significance and actionability of findings to a wider audience.

  2. Position of pelvis in the 3rd month of life predicts further motor development.

    PubMed

    Gajewska, Ewa; Sobieska, Magdalena; Moczko, Jerzy

    2018-06-01

    The aim of the study was to select the elements of motor skills assessed at 3 months that best predict motor development at 9 months. In all children, a physiotherapeutic assessment of quantitative and qualitative development was performed at the age of 3 months in the prone and supine positions, using the quantitative and qualitative assessment sheet of motor development presented in previous papers. The neurological examination at the age of 9 months was based on the Denver Developmental Screening Test II and the evaluation of reflexes, muscle tone (hypotonia and hypertonia), and symmetry. Particular elements of the motor performance assessment were shown to have distinct predictive value for further motor development (as assessed at 9 months), with the pelvis position being the strongest predictive element. Irrespective of symptomatic and anamnestic factors, inappropriate motor performance can already be detected in the 3rd month of life and is predictive of further motor development. The assessment of motor performance should be performed in both supine and prone positions. The proper position of the pelvis reflects the proper positioning of the whole spine and ensures proper further motor development. To our knowledge, the presented motor development assessment sheet allows the earliest prediction of motor disturbances. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Qualitative research.

    PubMed

    Gelling, Leslie

    2015-03-25

    Qualitative research has an important role in helping nurses and other healthcare professionals understand patient experiences of health and illness. Qualitative researchers have a large number of methodological options and therefore should take care in planning and conducting their research. This article offers a brief overview of some of the key issues qualitative researchers should consider.

  4. A Mobile Health App-Based Postnatal Educational Program (Home-but not Alone): Descriptive Qualitative Study.

    PubMed

    Shorey, Shefaly; Yang, Yen Yen; Dennis, Cindy-Lee

    2018-04-19

    The postnatal period poses numerous challenges for new parents. Various educational programs are available to support new parents during this stressful period. However, the usefulness of educational programs must be evaluated to ascertain their credibility. The aim of this descriptive, qualitative study was to explore the views of parents of newborns with regard to the content and delivery of a mobile health (mHealth) app-based postnatal educational program. A qualitative semistructured interview guide was used to collect data from 17 participants who belonged to the intervention group of a randomized controlled trial. The intervention, a 4-week-long access to a mHealth app-based educational program, was evaluated. The interviews were conducted in English and at the participants' homes. Thematic analysis was used to analyze the data. The Consolidated Criteria for Reporting Qualitative Research checklist was used to report the findings. The interviews revealed 4 main themes: (1) positive features of the mHealth app, (2) advice from midwives, (3) experiences gained from using the mHealth app, and (4) recommendations for the future. The participants evaluated the educational program to be a good source of information that was tailored to the local context. The different modes of delivery, including audio and video, accentuated the accessibility of information. The parents evaluated that the facilitator of the featured communication platform, a midwife, provided trustworthy advice. Belongingness to a virtual community beyond the hospital endowed the parents the confidence that they were not alone and were supported by other parents and health care professionals. According to the parents, the mHealth app-based educational program was helpful in supporting a multi-ethnic sample of parents during the postnatal period. This insight indicates that the program could be implemented in a wide community of parents in the postnatal period. The helpfulness of the educational

  5. A Mobile Health App–Based Postnatal Educational Program (Home-but not Alone): Descriptive Qualitative Study

    PubMed Central

    Yang, Yen Yen; Dennis, Cindy-Lee

    2018-01-01

    Background The postnatal period poses numerous challenges for new parents. Various educational programs are available to support new parents during this stressful period. However, the usefulness of educational programs must be evaluated to ascertain their credibility. Objective The aim of this descriptive, qualitative study was to explore the views of parents of newborns with regard to the content and delivery of a mobile health (mHealth) app–based postnatal educational program. Methods A qualitative semistructured interview guide was used to collect data from 17 participants who belonged to the intervention group of a randomized controlled trial. The intervention, a 4-week-long access to a mHealth app–based educational program, was evaluated. The interviews were conducted in English and at the participants’ homes. Thematic analysis was used to analyze the data. The Consolidated Criteria for Reporting Qualitative Research checklist was used to report the findings. Results The interviews revealed 4 main themes: (1) positive features of the mHealth app, (2) advice from midwives, (3) experiences gained from using the mHealth app, and (4) recommendations for the future. The participants evaluated the educational program to be a good source of information that was tailored to the local context. The different modes of delivery, including audio and video, accentuated the accessibility of information. The parents evaluated that the facilitator of the featured communication platform, a midwife, provided trustworthy advice. Belongingness to a virtual community beyond the hospital endowed the parents the confidence that they were not alone and were supported by other parents and health care professionals. Conclusions According to the parents, the mHealth app–based educational program was helpful in supporting a multi-ethnic sample of parents during the postnatal period. This insight indicates that the program could be implemented in a wide community of parents in the

  6. Base drag prediction on missile configurations

    NASA Technical Reports Server (NTRS)

    Moore, F. G.; Hymer, T.; Wilcox, F.

    1993-01-01

    New wind tunnel data have been taken, and a new empirical model has been developed for predicting base drag on missile configurations. The new wind tunnel data were taken at NASA-Langley in the Unitary Wind Tunnel at Mach numbers from 2.0 to 4.5, angles of attack to 16 deg, fin control deflections up to 20 deg, fin thickness/chord of 0.05 to 0.15, and fin locations from 'flush with the base' to two chord-lengths upstream of the base. The empirical model uses these data along with previous wind tunnel data, estimating base drag as a function of all these variables as well as boat-tail and power-on/power-off effects. The new model yields improved accuracy, compared to wind tunnel data. The new model also is more robust due to inclusion of additional variables. On the other hand, additional wind tunnel data are needed to validate or modify the current empirical model in areas where data are not available.

  7. A meta-ethnography of interview-based qualitative research studies on medical students' views and experiences of empathy.

    PubMed

    Jeffrey, David

    2016-12-01

    Quantitative research suggests that medical students' empathy declines during their training. This meta-ethnography asks: What new understanding may be gained by a synthesis of interview-based qualitative research on medical students' views and experiences of empathy? How can such a synthesis be undertaken? A meta-ethnography synthesizes individual qualitative studies to generate knowledge increasing understanding and informing debate. A literature search yielded eight qualitative studies which met the inclusion criteria. These were analyzed from a phenomenological and interpretative perspective. The meta-ethnography revealed a conceptual confusion around empathy and a tension in medical education between distancing and connecting with patients. Barriers to empathy included a lack of patient contact and a strong emphasis on the biomedical over the psycho-social aspects of the curriculum. A number of influences discussed in the paper lead students to adopt less overt ways of showing their empathy. These insights deepen our understanding of the apparent decline in empathy in medical students. The lessons from these studies suggest that future curriculum development should include earlier patient contact, more emphasis on psycho-social aspects of care and address the barriers to empathy to ensure that tomorrow's doctors are empathetic as well as competent.

  8. MATERNAL PERCEPTIONS OF PARENTING FOLLOWING AN EVIDENCE-BASED PARENTING PROGRAM: A QUALITATIVE STUDY OF LEGACY FOR CHILDRENTM.

    PubMed

    Hartwig, Sophie A; Robinson, Lara R; Comeau, Dawn L; Claussen, Angelika H; Perou, Ruth

    2017-07-01

    This article presents the findings of a qualitative study of maternal perceptions of parenting following participation in Legacy for Children TM (Legacy), an evidence-based parenting program for low-income mothers of young children and infants. To further examine previous findings and better understand participant experiences, we analyzed semistructured focus-group discussions with predominantly Hispanic and Black, non-Hispanic Legacy mothers at two sites (n = 166) using thematic analysis and grounded theory techniques. The qualitative study presented here investigated how mothers view their parenting following participation in Legacy, allowing participants to describe their experience with the program in their own words, thus capturing an "insider" perspective. Mothers at both sites communicated knowledge and use of positive parenting practices targeted by the goals of Legacy; some site-specific differences emerged related to these parenting practices. These findings align with the interpretation of quantitative results from the randomized controlled trials and further demonstrate the significance of the Legacy program in promoting positive parenting for mothers living in poverty. This study emphasizes the importance of understanding real-world context regarding program efficacy and the benefit of using qualitative research to understand participant experiences. © 2017 Michigan Association for Infant Mental Health.

  9. Thematic and spatial resolutions affect model-based predictions of tree species distribution.

    PubMed

    Liang, Yu; He, Hong S; Fraser, Jacob S; Wu, ZhiWei

    2013-01-01

    Subjective decisions of thematic and spatial resolutions in characterizing environmental heterogeneity may affect the characterizations of spatial pattern and the simulation of occurrence and rate of ecological processes, and in turn, model-based tree species distribution. Thus, this study quantified the importance of thematic and spatial resolutions, and their interaction in predictions of tree species distribution (quantified by species abundance). We investigated how model-predicted species abundances changed and whether tree species with different ecological traits (e.g., seed dispersal distance, competitive capacity) had different responses to varying thematic and spatial resolutions. We used the LANDIS forest landscape model to predict tree species distribution at the landscape scale and designed a series of scenarios with different thematic (different numbers of land types) and spatial resolutions combinations, and then statistically examined the differences of species abundance among these scenarios. Results showed that both thematic and spatial resolutions affected model-based predictions of species distribution, but thematic resolution had a greater effect. Species ecological traits affected the predictions. For species with moderate dispersal distance and relatively abundant seed sources, predicted abundance increased as thematic resolution increased. However, for species with long seeding distance or high shade tolerance, thematic resolution had an inverse effect on predicted abundance. When seed sources and dispersal distance were not limiting, the predicted species abundance increased with spatial resolution and vice versa. Results from this study may provide insights into the choice of thematic and spatial resolutions for model-based predictions of tree species distribution.

  10. CRAFFT: An Activity Prediction Model based on Bayesian Networks

    PubMed Central

    Nazerfard, Ehsan; Cook, Diane J.

    2014-01-01

    Recent advances in the areas of pervasive computing, data mining, and machine learning offer unique opportunities to provide health monitoring and assistance for individuals who face difficulties living independently in their homes. Several components have to work together to provide health monitoring for smart home residents, including, but not limited to, activity recognition, activity discovery, activity prediction, and prompting systems. Compared to the significant research done to discover and recognize activities, less attention has been given to predicting the future activities that the resident is likely to perform. Activity prediction components can play a major role in the design of a smart home. For instance, by taking advantage of an activity prediction module, a smart home can learn context-aware rules to prompt individuals to initiate important activities. In this paper, we propose an activity prediction model using Bayesian networks together with a novel two-step inference process to predict both the next activity features and the next activity label. We also propose an approach to predict the start time of the next activity, which is based on modeling the relative start time of the predicted activity using the continuous normal distribution and outlier detection. To validate our proposed models, we used real data collected from physical smart environments. PMID:25937847

  11. CRAFFT: An Activity Prediction Model based on Bayesian Networks.

    PubMed

    Nazerfard, Ehsan; Cook, Diane J

    2015-04-01

    Recent advances in the areas of pervasive computing, data mining, and machine learning offer unique opportunities to provide health monitoring and assistance for individuals who face difficulties living independently in their homes. Several components have to work together to provide health monitoring for smart home residents, including, but not limited to, activity recognition, activity discovery, activity prediction, and prompting systems. Compared to the significant research done to discover and recognize activities, less attention has been given to predicting the future activities that the resident is likely to perform. Activity prediction components can play a major role in the design of a smart home. For instance, by taking advantage of an activity prediction module, a smart home can learn context-aware rules to prompt individuals to initiate important activities. In this paper, we propose an activity prediction model using Bayesian networks together with a novel two-step inference process to predict both the next activity features and the next activity label. We also propose an approach to predict the start time of the next activity, which is based on modeling the relative start time of the predicted activity using the continuous normal distribution and outlier detection. To validate our proposed models, we used real data collected from physical smart environments.
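
    The abstract describes predicting the next activity label and modeling its relative start time with a normal distribution. A minimal sketch of that idea is shown below, using a simple conditional probability table as a stand-in for the Bayesian network; the class name, the data, and the simplification to a single conditioning variable are all assumptions, not the CRAFFT model itself.

```python
import numpy as np
from collections import Counter, defaultdict

class SimpleActivityPredictor:
    """Toy stand-in for a Bayesian-network activity predictor: a conditional
    probability table P(next activity | current activity) plus a normal model
    of the relative start time of the predicted next activity."""

    def fit(self, events):
        # events: list of (activity_label, start_time) tuples in temporal order
        self.transitions = defaultdict(Counter)
        gaps = defaultdict(list)
        for (a, t), (b, t_next) in zip(events, events[1:]):
            self.transitions[a][b] += 1
            gaps[(a, b)].append(t_next - t)
        self.gap_stats = {k: (np.mean(v), np.std(v) + 1e-6) for k, v in gaps.items()}
        return self

    def predict_next(self, current_activity, current_time):
        # assumes current_activity was seen during training
        counts = self.transitions[current_activity]
        label, n = counts.most_common(1)[0]
        mu, sigma = self.gap_stats[(current_activity, label)]
        return label, n / sum(counts.values()), current_time + mu, sigma

events = [("sleep", 0.0), ("breakfast", 8.0), ("work", 9.0), ("lunch", 13.0),
          ("work", 14.0), ("dinner", 19.0), ("sleep", 23.0)]
model = SimpleActivityPredictor().fit(events)
print(model.predict_next("work", 9.5))  # (label, probability, predicted start time, spread)
```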

  12. Adaptive Data-based Predictive Control for Short Take-off and Landing (STOL) Aircraft

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan Spencer; Acosta, Diana Michelle; Phan, Minh Q.

    2010-01-01

    Data-based Predictive Control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. The characteristics of adaptive data-based predictive control are particularly appropriate for the control of nonlinear and time-varying systems, such as Short Take-off and Landing (STOL) aircraft. STOL is a capability of interest to NASA because conceptual Cruise Efficient Short Take-off and Landing (CESTOL) transport aircraft offer the ability to reduce congestion in the terminal area by utilizing existing shorter runways at airports, as well as to lower community noise by flying steep approach and climb-out patterns that reduce the noise footprint of the aircraft. In this study, adaptive data-based predictive control is implemented as an integrated flight-propulsion controller for the outer-loop control of a CESTOL-type aircraft. Results show that the controller successfully tracks velocity while attempting to maintain a constant flight path angle, using longitudinal command, thrust and flap setting as the control inputs.
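
    The key idea above is deriving a predictive model directly from input-output data rather than from an explicit physical model. A minimal sketch of that step is given below: a SISO ARX model identified by least squares, whose one-step prediction a predictive controller would roll forward over a horizon when evaluating candidate inputs. The model orders, the synthetic data, and the function names are assumptions for illustration only; the paper's adaptive, multivariable formulation is considerably more involved.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Identify a SISO ARX model
        y[k] = a1*y[k-1] + ... + a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]
    by least squares from recorded input-output data (the 'data-based' step)."""
    rows, targets = [], []
    for k in range(max(na, nb), len(y)):
        past_y = [y[k - i] for i in range(1, na + 1)]
        past_u = [u[k - i] for i in range(1, nb + 1)]
        rows.append(past_y + past_u)
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta  # coefficients [a1..a_na, b1..b_nb]

# hypothetical recorded input-output data from a stable second-order system
t = np.arange(200)
u = np.sin(0.05 * t)
y = np.zeros(200)
for k in range(2, 200):
    y[k] = 1.2 * y[k - 1] - 0.4 * y[k - 2] + 0.5 * u[k - 1]

theta = fit_arx(u, y)
print(theta)  # should roughly recover [1.2, -0.4, 0.5, 0.0]

# one-step-ahead prediction for the last sample, as a predictive controller would use
x_last = [y[-2], y[-3], u[-2], u[-3]]
print(np.dot(theta, x_last), "vs actual", y[-1])
```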

  13. A practical approach to evidence-based dentistry: VIII: How to appraise an article based on a qualitative study.

    PubMed

    Sale, Joanna E M; Amin, Maryam; Carrasco-Labra, Alonso; Brignardello-Petersen, Romina; Glick, Michael; Guyatt, Gordon H; Azarpazhooh, Amir

    2015-08-01

    Because of qualitative researchers' abilities to explore social problems and to understand the perspective of patients, qualitative research studies are useful to provide insight about patients' fears, worries, goals, and expectations related to dental care. To benefit fully from such studies, clinicians should be aware of some relevant principles of critical appraisal. In this article, the authors present one approach to critically appraise the evidence from a qualitative research study. Critical appraisal involves assessing whether the results are credible (the selection of participants, research ethics, data collection, data analysis), what are these results, and how they can be applied in clinical practice. The authors also examined how the results could be applied to patient care in terms of offering theory, understanding the context of clinical practice, and helping clinicians understand social interactions in clinical care. By applying these principles, clinicians can consider qualitative studies when trying to achieve the best possible results for their own practices. Copyright © 2015 American Dental Association. Published by Elsevier Inc. All rights reserved.

  14. RKNNMDA: Ranking-based KNN for MiRNA-Disease Association prediction.

    PubMed

    Chen, Xing; Wu, Qiao-Feng; Yan, Gui-Ying

    2017-07-03

    A growing body of verified experimental studies has demonstrated that microRNAs (miRNAs) can be closely related to the development and progression of human complex diseases. Based on the assumption that functionally similar miRNAs have a strong correlation with phenotypically similar diseases and vice versa, researchers have developed various effective computational models that combine heterogeneous biological data sets, including a disease similarity network, a miRNA similarity network, and a known disease-miRNA association network, to identify potential relationships between miRNAs and diseases in biomedical research. Considering the limitations of previous computational studies, we introduce a novel computational method, Ranking-based KNN for miRNA-Disease Association prediction (RKNNMDA), to predict potentially related miRNAs for diseases; our method obtained an AUC of 0.8221 based on leave-one-out cross validation. In addition, RKNNMDA was applied to 3 kinds of important human cancers for further performance evaluation. The results showed that 96%, 80% and 94% of the predicted top 50 potentially related miRNAs for Colon Neoplasms, Esophageal Neoplasms, and Prostate Neoplasms, respectively, have been confirmed by the experimental literature. Moreover, RKNNMDA can be used to predict potential miRNAs for diseases without any known miRNAs, and it is anticipated that RKNNMDA will be of great use for novel miRNA-disease association identification.
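
    As a hedged sketch of the KNN flavour of this idea, the snippet below scores candidate miRNAs for a disease by a similarity-weighted vote of each miRNA's K most similar neighbours that are already associated with the disease. The matrices, the parameter k, and the scoring formula are assumptions for illustration; the actual RKNNMDA combines several similarity sources and a ranking step not shown here.

```python
import numpy as np

def knn_scores(mirna_sim, assoc, disease_idx, k=5):
    """Score each miRNA for a disease by a similarity-weighted KNN vote.

    mirna_sim : (n_mirna, n_mirna) functional similarity matrix
    assoc     : (n_mirna, n_disease) binary known association matrix
    """
    known = assoc[:, disease_idx]            # 1 for miRNAs already linked to the disease
    scores = np.zeros(len(known))
    for i in range(len(known)):
        order = np.argsort(mirna_sim[i])[::-1]          # most similar first
        neighbours = [j for j in order if j != i][:k]   # K nearest other miRNAs
        weights = mirna_sim[i, neighbours]
        scores[i] = np.dot(weights, known[neighbours]) / (weights.sum() + 1e-12)
    return scores  # rank unassociated miRNAs (known[i] == 0) by descending score
```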

  15. The prediction of surface temperature in the new seasonal prediction system based on the MPI-ESM coupled climate model

    NASA Astrophysics Data System (ADS)

    Baehr, J.; Fröhlich, K.; Botzet, M.; Domeisen, D. I. V.; Kornblueh, L.; Notz, D.; Piontek, R.; Pohlmann, H.; Tietsche, S.; Müller, W. A.

    2015-05-01

    A seasonal forecast system is presented, based on the global coupled climate model MPI-ESM as used for CMIP5 simulations. We describe the initialisation of the system and analyse its predictive skill for surface temperature. The presented system is initialised in the atmospheric, oceanic, and sea ice component of the model from reanalysis/observations with full field nudging in all three components. For the initialisation of the ensemble, bred vectors with a vertically varying norm are implemented in the ocean component to generate initial perturbations. In a set of ensemble hindcast simulations, starting each May and November between 1982 and 2010, we analyse the predictive skill. Bias-corrected ensemble forecasts for each start date reproduce the observed surface temperature anomalies at 2-4 months lead time, particularly in the tropics. Niño3.4 sea surface temperature anomalies show a small root-mean-square error and predictive skill up to 6 months. Away from the tropics, predictive skill is mostly limited to the ocean, and to regions which are strongly influenced by ENSO teleconnections. In summary, the presented seasonal prediction system based on a coupled climate model shows predictive skill for surface temperature at seasonal time scales comparable to other seasonal prediction systems using different underlying models and initialisation strategies. As the same model underlying our seasonal prediction system—with a different initialisation—is presently also used for decadal predictions, this is an important step towards seamless seasonal-to-decadal climate predictions.

  16. Research on cross - Project software defect prediction based on transfer learning

    NASA Astrophysics Data System (ADS)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address two challenges in cross-project software defect prediction, namely the distribution differences between the source and target project datasets and the class imbalance within the datasets, this paper proposes a cross-project software defect prediction method based on transfer learning, named NTrA. Firstly, the class imbalance of the source project data is addressed using the Augmented Neighborhood Cleaning Algorithm. Secondly, the data gravity method is used to assign different weights to instances based on the attribute similarity between the source and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using NASA and SOFTLAB data from the published PROMISE repository. The results show that the method achieves good recall and F-measure values and good prediction results.
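
    A minimal sketch of the instance-weighting idea is shown below: each source-project instance receives a weight reflecting how close it is to the target-project data, which could then feed a boosting learner such as TrAdaBoost. The inverse-distance formula, the function name, and the synthetic metric data are assumptions standing in for the paper's data gravity method, not its actual definition.

```python
import numpy as np

def gravity_weights(source_X, target_X, eps=1e-12):
    """Weight each source instance by its closeness to the target data,
    a simplified stand-in for attribute-similarity ('data gravity') weighting."""
    target_centre = target_X.mean(axis=0)
    # inverse squared Euclidean distance to the target centroid
    d2 = ((source_X - target_centre) ** 2).sum(axis=1)
    w = 1.0 / (d2 + eps)
    return w / w.sum()

# hypothetical standardized software metrics (e.g. size, complexity, churn)
src = np.random.default_rng(0).normal(size=(100, 5))
tgt = np.random.default_rng(1).normal(loc=0.3, size=(40, 5))
print(gravity_weights(src, tgt)[:5])  # per-instance weights for the boosted learner
```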

  17. Anisotropic connectivity implements motion-based prediction in a spiking neural network.

    PubMed

    Kaplan, Bernhard A; Lansner, Anders; Masson, Guillaume S; Perrinet, Laurent U

    2013-01-01

    Predictive coding hypothesizes that the brain explicitly infers upcoming sensory input to establish a coherent representation of the world. Although it is becoming generally accepted, it is not clear on which level spiking neural networks may implement predictive coding and what function their connectivity may have. We present a network model of conductance-based integrate-and-fire neurons inspired by the architecture of retinotopic cortical areas that assumes predictive coding is implemented through network connectivity, namely in the connection delays and in selectiveness for the tuning properties of source and target cells. We show that the applied connection pattern leads to motion-based prediction in an experiment tracking a moving dot. In contrast to our proposed model, a network with random or isotropic connectivity fails to predict the path when the moving dot disappears. Furthermore, we show that a simple linear decoding approach is sufficient to transform neuronal spiking activity into a probabilistic estimate for reading out the target trajectory.

  18. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data are extremely voluminous and, to be useful, must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machines (SVM) to predict air quality one day in advance. In order to meet the computational requirements of large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and the Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time show that the presented approach is well suited to large-scale air pollution prediction problems.
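
    Leaving the distributed Hadoop/MapReduce machinery aside, the single-node learning step could look like the sketch below: an SVM regressor trained on lagged daily features to forecast the next day's pollutant level. The use of scikit-learn's SVR, the lag length, the kernel settings, and the synthetic series are assumptions for illustration, not the system described in the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def make_lagged(series, n_lags=3):
    """Build (features, target) pairs: predict tomorrow from the last n_lags days."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# hypothetical daily pollutant index
rng = np.random.default_rng(0)
pollution = 50 + 10 * np.sin(np.arange(400) / 7) + rng.normal(0, 3, 400)

X, y = make_lagged(pollution, n_lags=3)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X[:-30], y[:-30])            # train on all but the last month
print(model.predict(X[-30:])[:5])      # one-day-ahead forecasts for held-out days
```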

  19. Ways of providing the patient with a prognosis: a terminology of employed strategies based on qualitative data.

    PubMed

    Graugaard, Peter Kjær; Rogg, Lotte; Eide, Hilde; Uhlig, Till; Loge, Jon Håvard

    2011-04-01

    To identify, denote, and structure strategies applied by physicians and patients when communicating information about prognosis. A descriptive qualitative study based on audiotaped physician-patient encounters between 23 haematologists and rheumatologists, and 89 patients in Oslo. Classification of identified prognostic sequences was based on consensus. Physicians seldom initiated communication with patients explicitly to find out their overall preferences for prognostic information (metacommunication). Instead, they used sounding and implicit strategies such as invitations, implicatures, and non-specific information that might result in further disclosure of information if requested by the patients. In order to balance the obligation to promote hope and provide (true) information, they used strategies such as bad news/good news spirals, authentications, safeguardings, and softenings. Identified strategies applied by the patients to adjust the physician-initiated prognostic information to their needs were requests for specification, requests for optimism, and emotional warnings. The study presents an empirically derived terminology so that clinicians and educators involved in medical communication can increase their awareness of prognostic communication. Based on qualitative data obtained from communication excerpts, we suggest that individual clinicians and researchers evaluate the possible benefits of more frequent use of metacommunication and explicit prognostic information. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. An auxiliary optimization method for complex public transit route network based on link prediction

    NASA Astrophysics Data System (ADS)

    Zhang, Lin; Lu, Jian; Yue, Xianfei; Zhou, Jialin; Li, Yunxuan; Wan, Qian

    2018-02-01

    Inspired by the missing (new) link prediction and the spurious existing link identification in link prediction theory, this paper establishes an auxiliary optimization method for public transit route network (PTRN) based on link prediction. First, link prediction applied to PTRN is described, and based on reviewing the previous studies, the summary indices set and its algorithms set are collected for the link prediction experiment. Second, through analyzing the topological properties of Jinan’s PTRN established by the Space R method, we found that this is a typical small-world network with a relatively large average clustering coefficient. This phenomenon indicates that the structural similarity-based link prediction will show a good performance in this network. Then, based on the link prediction experiment of the summary indices set, three indices with maximum accuracy are selected for auxiliary optimization of Jinan’s PTRN. Furthermore, these link prediction results show that the overall layout of Jinan’s PTRN is stable and orderly, except for a partial area that requires optimization and reconstruction. The above pattern conforms to the general pattern of the optimal development stage of PTRN in China. Finally, based on the missing (new) link prediction and the spurious existing link identification, we propose optimization schemes that can be used not only to optimize current PTRN but also to evaluate PTRN planning.
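
    The abstract does not name the specific similarity indices used, so the sketch below simply illustrates structural-similarity link prediction on a small Space-R-style graph using three standard indices available in networkx (resource allocation, Jaccard, Adamic-Adar); the toy edge list is hypothetical.

```python
import networkx as nx

# Hypothetical Space-R-style graph: nodes are stops, edges link stops served by a common route
G = nx.Graph()
G.add_edges_from([(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5), (5, 6), (4, 6)])

candidates = list(nx.non_edges(G))  # potential missing (new) links

# Score candidate links with three structural-similarity indices
for name, index in [("resource_allocation", nx.resource_allocation_index),
                    ("jaccard", nx.jaccard_coefficient),
                    ("adamic_adar", nx.adamic_adar_index)]:
    ranked = sorted(index(G, candidates), key=lambda t: t[2], reverse=True)
    print(name, ranked[:3])  # top-scored pairs suggest where new connections are most plausible
```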

  1. Moving beyond qualitative evaluations of Bayesian models of cognition.

    PubMed

    Hemmer, Pernille; Tauber, Sean; Steyvers, Mark

    2015-06-01

    Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.

  2. MicroRNAfold: pre-microRNA secondary structure prediction based on modified NCM model with thermodynamics-based scoring strategy.

    PubMed

    Han, Dianwei; Zhang, Jun; Tang, Guiliang

    2012-01-01

    An accurate prediction of the pre-microRNA secondary structure is important in miRNA informatics. Based on a recently proposed model, nucleotide cyclic motifs (NCM), to predict RNA secondary structure, we propose and implement a Modified NCM (MNCM) model with a physics-based scoring strategy to tackle the problem of pre-microRNA folding. Our microRNAfold is implemented using a global optimal algorithm based on the bottom-up local optimal solutions. Our experimental results show that microRNAfold outperforms the current leading prediction tools in terms of True Negative rate, False Negative rate, Specificity, and Matthews coefficient ratio.

  3. Qualitative analysis of a nurse's responses to stroke caregivers on a web-based supportive intervention.

    PubMed

    Pierce, Linda L; Steiner, Victoria; de Dios, Ann Margaret Vergel; Vollmer, Megan; Govoni, Amy L; Thompson, Teresa L Cervantez

    2015-04-01

    Approximately 800 000 people experience a stroke every year; most are cared for by unpaid family members in home settings. Web-based interventions provide 24/7 access to education/support services and have been explored in the literature with family caregivers dealing with chronic conditions. Current research into nurses' web-based interactions with caregivers in these interventions is lacking. The aim of this qualitative secondary data analysis was to examine a nurse specialist's responses and advice that she gave in a web-based supportive intervention for stroke family caregivers used in a randomized controlled trial for 1 year. Using a qualitative research design, caregivers were recruited from rehabilitation facilities in Ohio and Michigan (n = 36). They accessed the intervention's email forum and discussion group facilitated by the nurse. These email message data were examined using rigorous content analysis. The caregivers were primarily white women caring for a spouse, with an average age of 54 years. From the 2148 email messages between the nurse and caregivers, five themes emerged and were drawn to Friedemann's Framework. These themes included: getting to know the situation (Friedemann's coherence and individuation), validating emotions (individuation), promoting self-care (individuation), assisting in role adaptation (system maintenance and individuation), and providing healthcare information (system maintenance and individuation). These caregivers of stroke survivors were asking for advice, seeking support, and looking for information from an advanced practice nurse. Nurses, and others, in supportive roles can use these findings to promote informed care and directed interventions for caregivers dealing with stroke and its outcomes.

  4. Factors affecting planned return to work after trauma: A prospective descriptive qualitative and quantitative study.

    PubMed

    Folkard, S S; Bloomfield, T D; Page, P R J; Wilson, D; Ricketts, D M; Rogers, B A

    2016-12-01

    The use of patient-reported outcome measures (PROMs) in trauma is limited. The aim of this pilot study is to evaluate qualitative responses and factors affecting planned return to work following significant trauma, for which there is currently a poor evidence base. National ethical approval was obtained for routine prospective PROMs data collection, including EQ-5D, between September 2013 and March 2015 for trauma patients admitted to the Sussex Major Trauma Centre (n=92). 84 trauma patients disclosed their intended return to work at discharge. Additional open questions asked about 'things done well' and 'things to be improved'. EQ-5D responses were valued using the time trade-off method. Statistical analysis across multiple variables was completed by ANOVA, and for categorical variables by chi-squared analysis. Only 18/68 of patients working at admission anticipated returning to work within 14 days post-discharge. The injury severity scores (ISS) of those predicting return to work within two weeks and those predicting return to work after more than two weeks were 14.17 and 13.59, respectively. Increased physicality of work showed a trend towards poorer return-to-work outcomes, although this was non-significant in the chi-squared test comparing groups predicting return within versus beyond two weeks (4.621, p=0.2017, ns). No significant difference was demonstrated in the comparative incomes of patients with different estimated return-to-work outcomes (ANOVA r2=0.001, p=0.9590, ns). EQ-5D scores were higher in those predicting return to work within two weeks compared with greater than two weeks. Qualitative thematic content analysis of open responses was possible for 66/92 of respondents. Prominent positive themes were: care, staff, professionalism, and communication. Prominent negative themes were: food, ward response time, and communication. This pilot study highlights the importance of qualitative PROMs analysis in leading patient-driven improvements in trauma care. We provide standard

  5. Prediction of trabecular bone qualitative properties using scanning quantitative ultrasound

    PubMed Central

    Qin, Yi-Xian; Lin, Wei; Mittra, Erik; Xia, Yi; Cheng, Jiqi; Judex, Stefan; Rubin, Clint; Müller, Ralph

    2012-01-01

    Microgravity-induced bone loss represents a critical health problem for astronauts, particularly in the weight-supporting skeleton, where it leads to osteopenia and increased fracture risk. The lack of a suitable evaluation modality makes it difficult to monitor skeletal status during long-term space missions and increases the potential risk of complications. Such disuse osteopenia and osteoporosis compromise trabecular bone density as well as architectural and mechanical properties. While X-ray based imaging would not be practical in space, quantitative ultrasound may provide advantages for characterizing bone density and strength through wave propagation in the complex trabecular structure. This study used a scanning confocal acoustic diagnostic and navigation system (SCAN) to evaluate trabecular bone quality in 60 cubic trabecular samples harvested from adult sheep. Ultrasound image-based SCAN measurements of structural and strength properties were validated against μCT and compressive mechanical testing. The results indicated a moderately strong negative correlation between broadband ultrasonic attenuation (BUA) and μCT-determined bone volume fraction (BV/TV, R2=0.53). Strong correlations were observed between ultrasound velocity (UV) and the bone’s mechanical strength and structural parameters, i.e., bulk Young’s modulus (R2=0.67) and BV/TV (R2=0.85). The predictions of bone density and mechanical strength were significantly improved by using a linear combination of both BUA and UV, yielding R2=0.92 for BV/TV and R2=0.71 for bulk Young’s modulus. These results imply that quantitative ultrasound can characterize trabecular structural and mechanical properties through measurements of particular ultrasound parameters and can potentially provide an excellent estimate of the bone’s structural integrity. PMID:23976803
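
    The reported improvement comes from regressing bone properties on BUA and UV jointly. A minimal sketch of that linear combination is shown below with ordinary least squares; the data are synthetic stand-ins, and the coefficients and noise levels are assumptions, not values from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 60                                  # 60 trabecular cubes, mirroring the study design
uv = rng.normal(1600, 80, n)            # ultrasound velocity (m/s), hypothetical
bua = rng.normal(20, 5, n)              # broadband ultrasonic attenuation (dB/MHz), hypothetical
bvtv = 0.002 * uv - 0.004 * bua + rng.normal(0, 0.02, n)   # synthetic BV/TV

X = np.column_stack([bua, uv])          # linear combination of both predictors
model = LinearRegression().fit(X, bvtv)
print("R^2 (BUA + UV -> BV/TV):", model.score(X, bvtv))
```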

  6. The Satellite Clock Bias Prediction Method Based on Takagi-Sugeno Fuzzy Neural Network

    NASA Astrophysics Data System (ADS)

    Cai, C. L.; Yu, H. G.; Wei, Z. C.; Pan, J. D.

    2017-05-01

    The continuous improvement of the prediction accuracy of Satellite Clock Bias (SCB) is a key problem in precision navigation. In order to improve the precision of SCB prediction and better reflect the change characteristics of SCB, this paper proposes an SCB prediction method based on the Takagi-Sugeno fuzzy neural network. Firstly, the SCB values are pre-processed based on their characteristics. Then, an accurate Takagi-Sugeno fuzzy neural network model is established based on the preprocessed data to predict SCB. This paper uses precise SCB data with different sampling intervals provided by the IGS (International Global Navigation Satellite System Service) to carry out the short-term prediction experiment, and the results are compared with the ARIMA (Auto-Regressive Integrated Moving Average) model, the GM(1,1) model, and the quadratic polynomial model. The results show that the Takagi-Sugeno fuzzy neural network model is feasible and effective for short-term SCB prediction and performs well for different types of clocks. The prediction results of the proposed method are clearly better than those of the conventional methods.
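
    For orientation, a minimal first-order Takagi-Sugeno inference step is sketched below: Gaussian membership functions on the input, rule consequents that are linear functions of the input, and an output given by the membership-weighted average. In the paper these parameters are learned by the fuzzy neural network; here the two rules, their parameters, and the clock-bias units are fixed hypothetical values for illustration only.

```python
import numpy as np

def ts_predict(x, centres, widths, consequents):
    """First-order Takagi-Sugeno inference for a scalar input x.

    centres, widths : parameters of Gaussian membership functions (one per rule)
    consequents     : (slope, intercept) of each rule's linear consequent
    """
    w = np.exp(-0.5 * ((x - centres) / widths) ** 2)         # rule firing strengths
    y_rules = np.array([a * x + b for a, b in consequents])  # per-rule linear outputs
    return float(np.dot(w, y_rules) / (w.sum() + 1e-12))     # weighted average output

# hypothetical 2-rule model mapping epoch index to clock bias (ns)
centres = np.array([0.0, 10.0])
widths = np.array([4.0, 4.0])
consequents = [(0.8, 1.0), (1.1, -2.0)]
print(ts_predict(5.0, centres, widths, consequents))
```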

  7. Exploring the Position of Community-Based Nursing in Iran: A Qualitative Study

    PubMed Central

    Heydari, Heshmatolah; Rahnavard, Zahra; Ghaffari, Fatemeh

    2017-01-01

    ABSTRACT Background: Community-based nursing focuses on providing health services to families and communities in the second and third levels of prevention and this can improve the individuals, families and communities’ quality of life, and reduce the healthcare costs. The aim of this study was to explore the status of community-based nursing in Iran. Methods: This qualitative study was conducted from March to November 2015, in Tehran, Iran, using the content analysis approach. The study setting consisted of Iran and Tehran Faculties of Nursing and Midwifery, Tehran, Iran. The purposive sampling method was used. Twenty faculty members and Master’s and PhD students were interviewed by using the face-to-face semi-structured interview method. Moreover, two focus groups were conducted for complementing and enriching the study data. The data were analyzed using the Graneheim and Lundman’s approach to content analysis. The trustworthiness of the study findings was maintained by employing the Lincoln and Guba’s criteria of credibility, dependability, and confirmability. Results: In total, 580 codes were generated and categorized into three main categories of conventional services, the necessity for creating infrastructures, and multidimensional outcomes of community-based nursing. Conclusion: Introducing community-based nursing into nursing education curricula and creating ample job opportunities for community-based nurses seem clearly essential. PMID:29043284

  8. Non-Fourier based thermal-mechanical tissue damage prediction for thermal ablation

    PubMed Central

    Li, Xin; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-01-01

    ABSTRACT Prediction of tissue damage under thermal loads plays an important role in thermal ablation planning. A new methodology is presented in this paper that combines non-Fourier bio-heat transfer, constitutive elastic mechanics, and non-rigid motion dynamics to predict and analyze the thermal distribution, thermally induced mechanical deformation, and thermal-mechanical damage of soft tissues under thermal loads. Simulations and comparative analysis demonstrate that the proposed methodology based on non-Fourier bio-heat transfer can account for the thermally induced mechanical behavior of soft tissues and predict tissue thermal damage more accurately than models based on classical Fourier bio-heat transfer. PMID:27690290
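
    As a hedged reference point for the Fourier versus non-Fourier contrast drawn above, the classical Pennes bio-heat equation, a single-phase-lag (Cattaneo-Vernotte-type) non-Fourier generalization with relaxation time \tau_q, and the Arrhenius damage integral commonly used for thermal damage can be written as below; the paper's exact formulation and parameters may differ.

```latex
% Classical (Fourier/Pennes) bio-heat equation
\rho c \,\frac{\partial T}{\partial t}
  = k \nabla^2 T + \omega_b \rho_b c_b (T_a - T) + Q_m + Q_{ext}

% Single-phase-lag (Cattaneo-Vernotte-type) non-Fourier generalization
\tau_q \rho c \,\frac{\partial^2 T}{\partial t^2} + \rho c \,\frac{\partial T}{\partial t}
  = k \nabla^2 T
  + \left(1 + \tau_q \frac{\partial}{\partial t}\right)
    \bigl[\omega_b \rho_b c_b (T_a - T) + Q_m + Q_{ext}\bigr]

% Arrhenius thermal damage integral
\Omega(t) = \int_0^t A \, e^{-E_a / (R\,T(\tau))}\, d\tau
```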

  9. Trust from the past: Bayesian Personalized Ranking based Link Prediction in Knowledge Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Baichuan; Choudhury, Sutanay; Al-Hasan, Mohammad

    2016-02-01

    Estimating the confidence for a link is a critical task in Knowledge Graph construction. Link prediction, i.e., predicting the likelihood of a link in a knowledge graph based on its prior state, is a key research direction within this area. We propose a Latent Feature Embedding based link recommendation model for the prediction task and utilize a Bayesian Personalized Ranking based optimization technique for learning models for each predicate. Experimental results on large-scale knowledge bases such as YAGO2 show that our approach achieves substantially higher performance than several state-of-the-art approaches. Furthermore, we also study the performance of the link prediction algorithm in terms of topological properties of the Knowledge Graph and present a linear regression model to reason about its expected level of accuracy.

  10. Clients' experiences of a community based lifestyle modification program: a qualitative study.

    PubMed

    Chan, Ruth S M; Lok, Kris Y W; Sea, Mandy M M; Woo, Jean

    2009-10-01

    There is little information about how clients attending lifestyle modification programs view the outcomes. This qualitative study examined clients' experience of a community-based lifestyle modification program in Hong Kong. Semi-structured interviews were conducted with 25 clients attending the program. Clients perceived that the program had positive impacts on their health and nutrition knowledge. They experienced frustration, negative emotions, lack of motivation, and pressure from others during the program. The working environment and the lack of healthy food choices in restaurants were the major perceived environmental barriers to lifestyle modification. Clients valued the nutritionists' capability to provide professional information and psychological support in the program. Our results suggest that nutritionists' capability to provide quality consultations and patient-centered care is important for empowering clients to achieve lifestyle modification.

  11. The Role of Qualitative Research in Science Education

    ERIC Educational Resources Information Center

    Devetak, Iztok; Glazar, Sasa A.; Vogrinc, Janez

    2010-01-01

    The paper presents qualitative research in which the researcher was directly involved, personally examining the research phenomenon in the studied environment. The aim of this qualitative study is to gather data in the form of rich content-based descriptions of people, events, and situations by using different,…

  12. Ensemble-based docking: From hit discovery to metabolism and toxicity predictions

    DOE PAGES

    Evangelista, Wilfredo; Weir, Rebecca; Ellingson, Sally; ...

    2016-07-29

    The use of ensemble-based docking for the exploration of biochemical pathways and toxicity prediction of drug candidates is described. We describe the computational engineering work necessary to enable large ensemble docking campaigns on supercomputers. We show examples where ensemble-based docking has significantly increased the number and the diversity of validated drug candidates. Finally, we illustrate how ensemble-based docking can be extended beyond hit discovery and toward providing a structural basis for the prediction of metabolism and off-target binding relevant to pre-clinical and clinical trials.

  13. Estimation of genetic connectedness diagnostics based on prediction errors without the prediction error variance-covariance matrix.

    PubMed

    Holmes, John B; Dodds, Ken G; Lee, Michael A

    2017-03-02

    An important issue in genetic evaluation is the comparability of random effects (breeding values), particularly between pairs of animals in different contemporary groups. This is usually referred to as genetic connectedness. While various measures of connectedness have been proposed in the literature, there is general agreement that the most appropriate measure is some function of the prediction error variance-covariance matrix. However, obtaining the prediction error variance-covariance matrix is computationally demanding for large-scale genetic evaluations. Many alternative statistics have been proposed that avoid the computational cost of obtaining the prediction error variance-covariance matrix, such as counts of genetic links between contemporary groups, gene flow matrices, and functions of the variance-covariance matrix of estimated contemporary group fixed effects. In this paper, we show that a correction to the variance-covariance matrix of estimated contemporary group fixed effects will produce the exact prediction error variance-covariance matrix averaged by contemporary group for univariate models in the presence of single or multiple fixed effects and one random effect. We demonstrate the correction for a series of models and show that approximations to the prediction error matrix based solely on the variance-covariance matrix of estimated contemporary group fixed effects are inappropriate in certain circumstances. Our method allows for the calculation of a connectedness measure based on the prediction error variance-covariance matrix by calculating only the variance-covariance matrix of estimated fixed effects. Since the number of fixed effects in genetic evaluation is usually orders of magnitudes smaller than the number of random effect levels, the computational requirements for our method should be reduced.

  14. Thematic and Spatial Resolutions Affect Model-Based Predictions of Tree Species Distribution

    PubMed Central

    Liang, Yu; He, Hong S.; Fraser, Jacob S.; Wu, ZhiWei

    2013-01-01

    Subjective decisions of thematic and spatial resolutions in characterizing environmental heterogeneity may affect the characterizations of spatial pattern and the simulation of occurrence and rate of ecological processes, and in turn, model-based tree species distribution. Thus, this study quantified the importance of thematic and spatial resolutions, and their interaction in predictions of tree species distribution (quantified by species abundance). We investigated how model-predicted species abundances changed and whether tree species with different ecological traits (e.g., seed dispersal distance, competitive capacity) had different responses to varying thematic and spatial resolutions. We used the LANDIS forest landscape model to predict tree species distribution at the landscape scale and designed a series of scenarios with different thematic (different numbers of land types) and spatial resolutions combinations, and then statistically examined the differences of species abundance among these scenarios. Results showed that both thematic and spatial resolutions affected model-based predictions of species distribution, but thematic resolution had a greater effect. Species ecological traits affected the predictions. For species with moderate dispersal distance and relatively abundant seed sources, predicted abundance increased as thematic resolution increased. However, for species with long seeding distance or high shade tolerance, thematic resolution had an inverse effect on predicted abundance. When seed sources and dispersal distance were not limiting, the predicted species abundance increased with spatial resolution and vice versa. Results from this study may provide insights into the choice of thematic and spatial resolutions for model-based predictions of tree species distribution. PMID:23861828

  15. Workplace-based assessment and students' approaches to learning: a qualitative inquiry.

    PubMed

    Al-Kadri, Hanan M; Al-Kadi, Mohammed T; Van Der Vleuten, Cees P M

    2013-01-01

    We performed this research to assess the effect of workplace-based assessment (WBA) practice on medical students' learning approaches. The research was conducted at the King Saud bin Abdulaziz University for Health Sciences, College of Medicine, from 1 March to 31 July 2012. We conducted a qualitative, phenomenological study utilizing semi-structured individual interviews with medical students exposed to WBA. The audio-taped interviews were transcribed verbatim and analyzed, and themes were identified. We performed investigator triangulation and member checking with clinical supervisors, and we triangulated the data with similar research performed prior to the implementation of WBA. WBA results in variable learning approaches. Depending on several factors, namely clinical supervisors, faculty-given feedback, and the function of the assessment, students may swing between surface, deep, and effort-and-achievement learning approaches. Students' and supervisors' orientations on the process of WBA, the utilization of peer feedback, and formative rather than summative assessment facilitate successful implementation of WBA and lead to deeper approaches to learning. Interestingly, students and their supervisors have contradictory perceptions of WBA. A change in culture to unify students' and supervisors' perceptions of WBA, greater accommodation of formative assessment, and feedback may result in deeper approaches to learning.

  16. What can acute medicine learn from qualitative methods?

    PubMed

    Heasman, Brett; Reader, Tom W

    2015-10-01

    The contribution of qualitative methods to evidence-based medicine is growing, with qualitative studies increasingly used to examine patient experience and unsafe organizational cultures. The present review considers qualitative research recently conducted on teamwork and organizational culture in the ICU and also other acute domains. Qualitative studies have highlighted the importance of interpersonal and social aspects of healthcare on managing and responding to patient care needs. Clear/consistent communication, compassion, and trust underpin successful patient-physician interactions, with improved patient experiences linked to patient safety and clinical effectiveness across a wide range of measures and outcomes. Across multidisciplinary teams, good communication facilitates shared understanding, decision-making and coordinated action, reducing patient risk in the process. Qualitative methods highlight the complex nature of risk management in hospital wards, which is highly contextualized to the demands and resources available, and influenced by multilayered social contexts. In addition to augmenting quantitative research, qualitative investigations enable the investigation of questions on social behaviour that are beyond the scope of quantitative assessment alone. To develop improved patient-centred care, health professionals should therefore consider integrating qualitative procedures into their existing assessments of patient/staff satisfaction.

  17. Prediction of Drug-Target Interactions and Drug Repositioning via Network-Based Inference

    PubMed Central

    Jiang, Jing; Lu, Weiqiang; Li, Weihua; Liu, Guixia; Zhou, Weixing; Huang, Jin; Tang, Yun

    2012-01-01

    Drug-target interaction (DTI) is the basis of drug discovery and design. It is time-consuming and costly to determine DTIs experimentally. Hence, it is necessary to develop computational methods for the prediction of potential DTIs. Based on complex network theory, three supervised inference methods were developed here to predict DTIs and used for drug repositioning, namely drug-based similarity inference (DBSI), target-based similarity inference (TBSI) and network-based inference (NBI). Among them, NBI performed best on four benchmark data sets. Then a drug-target network was created with NBI based on 12,483 FDA-approved and experimental drug-target binary links, and some new DTIs were further predicted. In vitro assays confirmed that five old drugs, namely montelukast, diclofenac, simvastatin, ketoconazole, and itraconazole, showed polypharmacological features on estrogen receptors or dipeptidyl peptidase-IV, with half maximal inhibitory or effective concentrations ranging from 0.2 to 10 µM. Moreover, simvastatin and ketoconazole showed potent antiproliferative activities on the human MDA-MB-231 breast cancer cell line in MTT assays. The results indicated that these methods could be powerful tools for the prediction of DTIs and for drug repositioning. PMID:22589709
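
    Network-based inference of this kind is commonly described as a two-phase resource diffusion on the bipartite drug-target network. The sketch below implements that ProbS-style diffusion on a binary adjacency matrix; the exact weighting used in the paper may differ, and the tiny example matrix is hypothetical.

```python
import numpy as np

def nbi_scores(A):
    """Two-phase resource diffusion on a binary drug-target matrix A (drugs x targets).

    Phase 1: each target shares its resource equally among the drugs that bind it.
    Phase 2: each drug redistributes the collected resource equally among its targets.
    High scores on unlinked (i, j) pairs suggest candidate new interactions.
    """
    drug_deg = np.maximum(A.sum(axis=1, keepdims=True), 1)    # k(drug)
    target_deg = np.maximum(A.sum(axis=0, keepdims=True), 1)  # k(target)
    W = (A / drug_deg).T @ (A / target_deg)   # target-to-target transfer matrix
    return A @ W.T                            # scores[i, j] for drug i, target j

A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]])
print(np.round(nbi_scores(A), 3))
```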

  18. Prediction on carbon dioxide emissions based on fuzzy rules

    NASA Astrophysics Data System (ADS)

    Pauzi, Herrini; Abdullah, Lazim

    2014-06-01

    There are several ways to predict air quality, ranging from simple regression to models based on artificial intelligence. Most conventional methods are unable to provide good forecasting performance due to the non-linearity, uncertainty, and complexity of the data. Artificial intelligence techniques are successfully used in modeling air quality to cope with these problems. This paper describes a fuzzy inference system (FIS) to predict CO2 emissions in Malaysia. Furthermore, an adaptive neuro-fuzzy inference system (ANFIS) is used to compare prediction performance. Data on five variables (energy use, gross domestic product per capita, population density, combustible renewables and waste, and CO2 intensity) are employed in this comparative study. The results from the two proposed models are compared, and it is clearly shown that ANFIS outperforms FIS in CO2 prediction.

  19. Current thinking in qualitative research: evidence-based practice, moral philosophies, and political struggle.

    PubMed

    Papadimitriou, Christina; Magasi, Susan; Frank, Gelya

    2012-01-01

    In this introduction to the special issue on current thinking in qualitative research and occupational therapy and science, the authors focus on the importance of rigorous qualitative research to inform occupational therapy practice. The authors chosen for this special issue reflect a "second generation of qualitative researchers" who are critical, theoretically sophisticated, methodologically productive, and politically relevant to show that working with disabled clients is political work. Three themes emerged across the articles included in this special issue: (1) recognizing and addressing social justice issues; (2) learning from clients' experiences; and (3) critically reframing occupational therapy's role. These themes can inform occupational therapy practice, research, and education to reflect a more client-centered and politically engaging approach. Copyright 2012, SLACK Incorporated.

  20. Entropy Is Simple, Qualitatively.

    ERIC Educational Resources Information Center

    Lambert, Frank L.

    2002-01-01

    Suggests that qualitatively, entropy is simple. Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. (MM)

  1. The Maudsley Model of Family-Based Treatment for Anorexia Nervosa: A Qualitative Evaluation of Parent-to-Parent Consultation

    ERIC Educational Resources Information Center

    Rhodes, Paul; Brown, Jac; Madden, Sloane

    2009-01-01

    This article describes the qualitative analysis of a randomized control trial that explores the use of parent-to-parent consultations as an augmentation to the Maudsley model of family-based treatment for anorexia. Twenty families were randomized into two groups, 10 receiving standard treatment and 10 receiving an additional parent-to-parent…

  2. Prediction of Protein Structure by Template-Based Modeling Combined with the UNRES Force Field.

    PubMed

    Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Czaplewski, Cezary; Liwo, Adam

    2015-06-22

    A new approach to the prediction of protein structures that uses distance and backbone virtual-bond dihedral angle restraints derived from template-based models and simulations with the united residue (UNRES) force field is proposed. The approach combines the accuracy and reliability of template-based methods for the segments of the target sequence with high similarity to those having known structures with the ability of UNRES to pack the domains correctly. Multiplexed replica-exchange molecular dynamics with restraints derived from template-based models of a given target, in which each restraint is weighted according to the accuracy of the prediction of the corresponding section of the molecule, is used to search the conformational space, and the weighted histogram analysis method and cluster analysis are applied to determine the families of the most probable conformations, from which candidate predictions are selected. To test the capability of the method to recover template-based models from restraints, five single-domain proteins with structures that have been well-predicted by template-based methods were used; it was found that the resulting structures were of the same quality as the best of the original models. To assess whether the new approach can improve template-based predictions with incorrectly predicted domain packing, four such targets were selected from the CASP10 targets; for three of them the new approach resulted in significantly better predictions compared with the original template-based models. The new approach can be used to predict the structures of proteins for which good templates can be found for sections of the sequence or an overall good template can be found for the entire sequence but the prediction quality is remarkably weaker in putative domain-linker regions.

  3. RKNNMDA: Ranking-based KNN for MiRNA-Disease Association prediction

    PubMed Central

    Chen, Xing; Yan, Gui-Ying

    2017-01-01

    Accumulating experimental studies have demonstrated that microRNAs (miRNAs) are closely related to the development and progression of complex human diseases. Based on the assumption that functionally similar miRNAs may have a strong correlation with phenotypically similar diseases and vice versa, researchers developed various effective computational models which combine heterogeneous biologic data sets including the disease similarity network, the miRNA similarity network, and the known disease-miRNA association network to identify potential relationships between miRNAs and diseases in biomedical research. Considering the limitations of previous computational studies, we introduced a novel computational method of Ranking-based KNN for miRNA-Disease Association prediction (RKNNMDA) to predict potential related miRNAs for diseases, and our method obtained an AUC of 0.8221 based on leave-one-out cross validation. In addition, RKNNMDA was applied to 3 kinds of important human cancers for further performance evaluation. The results showed that 96%, 80% and 94% of the predicted top 50 potential related miRNAs for Colon Neoplasms, Esophageal Neoplasms, and Prostate Neoplasms have been confirmed by the experimental literature, respectively. Moreover, RKNNMDA could be used to predict potential miRNAs for diseases without any known miRNAs, and it is anticipated that RKNNMDA would be of great use for novel miRNA-disease association identification. PMID:28421868
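
    The core KNN step can be illustrated as below: candidate miRNAs for a disease are scored by a similarity-weighted vote over the disease's k most similar diseases. This is a simplified sketch of the neighbourhood idea only; RKNNMDA's ranking and re-ranking machinery is not reproduced, and the matrices here are invented placeholders.

      import numpy as np

      def knn_association_scores(A, S_d, k=2):
          """Score candidate miRNAs for each disease from its k most similar diseases.

          A:   binary disease-by-miRNA association matrix
          S_d: disease-disease similarity matrix (diagonal ignored)
          """
          scores = np.zeros_like(A, dtype=float)
          for d in range(A.shape[0]):
              sim = S_d[d].copy()
              sim[d] = -np.inf                   # exclude the disease itself
              nbrs = np.argsort(sim)[::-1][:k]   # k most similar diseases
              w = S_d[d, nbrs]
              scores[d] = w @ A[nbrs] / w.sum()  # similarity-weighted vote
          return scores

      A = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]], dtype=float)
      S_d = np.array([[1.0, 0.6, 0.2], [0.6, 1.0, 0.4], [0.2, 0.4, 1.0]])
      print(knn_association_scores(A, S_d, k=2).round(2))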

  4. Participant experiences in a smartphone-based health coaching intervention for type 2 diabetes: A qualitative inquiry.

    PubMed

    Pludwinski, Sarah; Ahmad, Farah; Wayne, Noah; Ritvo, Paul

    2016-04-01

    We investigated the experience of individuals diagnosed with type 2 diabetes mellitus (T2DM) who participated in an intervention in which the key elements were the provision of a smartphone and self-monitoring software. The interviews focused on use of a smartphone and the effects on motivation for health behavior change. This was a qualitative evaluation of participants in a larger T2DM self-management randomized controlled trial (RCT) conducted at the Black Creek Community Health Centre (BCCHC) in Toronto, Canada (ClinicalTrials.gov Identifier: NCT02036892). The study is based on semi-structured interviews (n = 11) that were audio taped and analyzed with a thematic analytic approach. The RCT compared the effectiveness of six months of smartphone-based self-monitoring and health coaching with a control group who received health coaching without internet or smartphone-based assistance. Qualitative data analyses resulted in derivation of four major themes that describe participant experience: (a) 'smartphone and software', describes smartphone use in relation to health behavior change; (b) 'health coach' describes how client/health coach relationships were assisted by smartphone use; (c) 'overall experience' describes perceptions of the overall intervention; and (d) 'frustrations in managing chronic conditions' describes difficulties with the complexities of T2DM management from a patient perspective. Findings suggest that interventions with T2DM assisted by smartphone software and health coaches actively engage individuals in improved hemoglobin A1c (HbA1c) control. © The Author(s) 2015.

  5. Kernel-based whole-genome prediction of complex traits: a review.

    PubMed

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.
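
    A minimal example of kernel-based whole-genome regression is RKHS/kernel ridge regression with a Gaussian kernel over marker genotypes, sketched below on simulated data. The bandwidth, penalty and simulated genetic architecture are arbitrary choices for illustration, not values from the review.

      import numpy as np

      def gaussian_kernel(X, Z, bandwidth):
          """RBF kernel matrix between marker matrices X (n x p) and Z (m x p)."""
          d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2.0 * bandwidth ** 2))

      rng = np.random.default_rng(0)
      X = rng.integers(0, 3, size=(100, 50)).astype(float)   # toy SNP genotypes (0/1/2)
      beta = rng.normal(size=50)
      y = X @ beta + 0.5 * X[:, 0] * X[:, 1] + rng.normal(size=100)  # additive + interaction

      # Kernel ridge regression: alpha = (K + lambda I)^-1 y
      K = gaussian_kernel(X, X, bandwidth=5.0)
      alpha = np.linalg.solve(K + 1.0 * np.eye(len(y)), y)

      X_new = rng.integers(0, 3, size=(10, 50)).astype(float)
      y_hat = gaussian_kernel(X_new, X, bandwidth=5.0) @ alpha  # predicted genetic values
      print(y_hat.round(2))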

  6. A range-based predictive localization algorithm for WSID networks

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Chen, Junjie; Li, Gang

    2017-11-01

    Most studies on localization algorithms are conducted on the sensor networks with densely distributed nodes. However, the non-localizable problems are prone to occur in the network with sparsely distributed sensor nodes. To solve this problem, a range-based predictive localization algorithm (RPLA) is proposed in this paper for the wireless sensor networks syncretizing the RFID (WSID) networks. The Gaussian mixture model is established to predict the trajectory of a mobile target. Then, the received signal strength indication is used to reduce the residence area of the target location based on the approximate point-in-triangulation test algorithm. In addition, collaborative localization schemes are introduced to locate the target in the non-localizable situations. Simulation results verify that the RPLA achieves accurate localization for the network with sparsely distributed sensor nodes. The localization accuracy of the RPLA is 48.7% higher than that of the APIT algorithm, 16.8% higher than that of the single Gaussian model-based algorithm and 10.5% higher than that of the Kalman filtering-based algorithm.

  7. Evaluation of three statistical prediction models for forensic age prediction based on DNA methylation.

    PubMed

    Smeers, Inge; Decorte, Ronny; Van de Voorde, Wim; Bekaert, Bram

    2018-05-01

    DNA methylation is a promising biomarker for forensic age prediction. A challenge that has emerged in recent studies is the fact that prediction errors become larger with increasing age due to interindividual differences in epigenetic ageing rates. This phenomenon of non-constant variance or heteroscedasticity violates an assumption of the often used method of ordinary least squares (OLS) regression. The aim of this study was to evaluate alternative statistical methods that do take heteroscedasticity into account in order to provide more accurate, age-dependent prediction intervals. A weighted least squares (WLS) regression is proposed as well as a quantile regression model. Their performances were compared against an OLS regression model based on the same dataset. Both models provided age-dependent prediction intervals which account for the increasing variance with age, but WLS regression performed better in terms of success rate in the current dataset. However, quantile regression might be a preferred method when dealing with a variance that is not only non-constant, but also not normally distributed. Ultimately the choice of which model to use should depend on the observed characteristics of the data. Copyright © 2018 Elsevier B.V. All rights reserved.
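
    The three model families compared in the study can be sketched with standard statsmodels calls, as below, on simulated heteroscedastic methylation/age data. The variance model behind the WLS weights, the single-marker regression and the simulated data are all assumptions made for the example; the study fitted its own weights and multi-marker models.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Simulated data: methylation beta-value vs. age, with error growing with age,
      # mimicking the heteroscedasticity reported for epigenetic age predictors.
      rng = np.random.default_rng(1)
      age = rng.uniform(18, 80, 300)
      meth = 0.2 + 0.006 * age + rng.normal(0, 0.01 + 0.0008 * age)
      df = pd.DataFrame({"age": age, "meth": meth})

      X = sm.add_constant(df["meth"])
      ols = sm.OLS(df["age"], X).fit()              # assumes constant variance

      # WLS down-weights observations with larger assumed variance.
      w = 1.0 / (0.01 + df["meth"]) ** 2
      wls = sm.WLS(df["age"], X, weights=w).fit()

      # Quantile regression gives age-dependent prediction intervals without normality.
      q05 = smf.quantreg("age ~ meth", df).fit(q=0.05)
      q95 = smf.quantreg("age ~ meth", df).fit(q=0.95)
      print(ols.params, wls.params, q05.params, q95.params, sep="\n")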

  8. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    PubMed Central

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2008-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
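
    The two effect sizes named above are simple proportions, which the toy sketch below computes from a hypothetical report-by-finding extraction: a frequency effect size is the share of reports containing a finding, and an intensity effect size is the share of all distinct findings contained in a report. Report names and findings are invented for illustration.

      # Toy extraction table: each report mapped to the abstracted findings it contains.
      reports = {
          "report_1": {"stigma", "disclosure", "social_support"},
          "report_2": {"stigma", "side_effects"},
          "report_3": {"disclosure", "social_support", "side_effects"},
          "report_4": {"stigma"},
      }

      all_findings = set().union(*reports.values())
      n_reports = len(reports)

      # Frequency effect size: share of reports that contain a given finding.
      frequency = {f: sum(f in r for r in reports.values()) / n_reports
                   for f in sorted(all_findings)}

      # Intensity effect size: share of all distinct findings contained in a given report.
      intensity = {name: len(found) / len(all_findings)
                   for name, found in reports.items()}

      print(frequency)
      print(intensity)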

  9. Engaging men with penile cancer in qualitative research: reflections from an interview-based study.

    PubMed

    Witty, Karl; Branney, Peter; Bullen, Kate; White, Alan; Evans, Julie; Eardley, Ian

    2014-01-01

    To explore the challenges of engaging men with penile cancer in qualitative interview research. Qualitative interviewing offers an ideal tool for exploring men's experiences of illness, complementing and providing context to gendered health inequalities identified in epidemiological research on men. But conducting interviews with men can be challenging and embarking on a qualitative interview study with males can feel like a daunting task, given the limited amount of practical, gender-sensitive guidance for researchers. Reflecting on a researcher's experience of conducting qualitative research on men with penile cancer, this paper explores the potential challenges of interviewing this group, but also documents how engagement and data collection were achieved. This is a reflective paper, informed by the experiences of a male researcher (KW) with no nurse training, who conducted 28 interviews with men who had been treated for penile cancer. The researcher's experiences are reported in chronological order, from the methodological challenges of recruitment to those of conducting the interview. The paper offers a resource for the novice researcher, highlighting some advantages and disadvantages of conducting qualitative interview research as a nurse researcher, as well as recommendations on how to overcome challenges. Engaging men with penile cancer in qualitative interview raises practical, methodological, ethical and emotional challenges for the researcher. However, when these challenges are met, men will talk about their health. Methodological procedures must enable an open and ongoing dialogue with clinical gatekeepers and potential participants to promote engagement. Support from colleagues is essential for any interviewer, no matter how experienced the researcher is.

  10. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on the combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with similar accuracy as that of the combination of cervical length and qualitative fibronectin. Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk

  11. Link prediction based on nonequilibrium cooperation effect

    NASA Astrophysics Data System (ADS)

    Li, Lanxi; Zhu, Xuzhen; Tian, Hui

    2018-04-01

    Link prediction in complex networks has become a common focus of many researchers. But most existing methods concentrate on neighbors and rarely consider the degree heterogeneity of the two endpoints. Node degree represents the importance or status of endpoints. We describe large degree heterogeneity as a nonequilibrium between nodes. This nonequilibrium facilitates a stable cooperation between endpoints, so that two endpoints with large degree heterogeneity tend to connect stably. We name this phenomenon the nonequilibrium cooperation effect. Therefore, this paper proposes a link prediction method based on the nonequilibrium cooperation effect to improve accuracy. Theoretical analysis is presented first, and experiments on 12 real-world networks then compare mainstream methods with our indices through numerical analysis.
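
    The abstract does not give the index's formula, so the sketch below only illustrates, on a standard test graph, how a degree-heterogeneity term might be folded into a neighbourhood-based link-prediction score; the weighting and the combination rule are assumptions for illustration, not the authors' method.

      import itertools
      import networkx as nx

      def heterogeneity_score(G, x, y, alpha=0.5):
          """Illustrative index: common neighbours boosted by degree heterogeneity.

          This is not the paper's formula; it only shows how the degree difference
          between endpoints can be combined with a neighbourhood-based score.
          """
          cn = len(set(G[x]) & set(G[y]))          # common neighbours
          hetero = abs(G.degree(x) - G.degree(y))  # degree heterogeneity of endpoints
          return cn + alpha * hetero

      G = nx.karate_club_graph()
      non_edges = [(u, v) for u, v in itertools.combinations(G, 2) if not G.has_edge(u, v)]
      ranked = sorted(non_edges, key=lambda e: heterogeneity_score(G, *e), reverse=True)
      print(ranked[:5])  # top-ranked candidate links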

  12. Compressed sensing based missing nodes prediction in temporal communication network

    NASA Astrophysics Data System (ADS)

    Cheng, Guangquan; Ma, Yang; Liu, Zhong; Xie, Fuli

    2018-02-01

    The reconstruction of complex network topology is of great theoretical and practical significance. Most research so far focuses on the prediction of missing links. There are many mature algorithms for link prediction which have achieved good results, but research on the prediction of missing nodes has just begun. In this paper, we propose an algorithm for missing node prediction in complex networks. We detect the position of missing nodes based on their neighbor nodes under the theory of compressed sensing, and extend the algorithm to the case of multiple missing nodes using spectral clustering. Experiments on real public network datasets and simulated datasets show that our algorithm can detect the locations of hidden nodes effectively with high precision.
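
    The sparse-recovery machinery underlying compressed sensing can be illustrated with generic basis pursuit (L1 minimisation written as a linear program), as below. This shows the general technique only; the authors' neighbour-based node-detection algorithm and its spectral-clustering extension are not reproduced here, and the measurement setup is synthetic.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(2)
      n, m, k = 60, 25, 4                  # signal length, measurements, sparsity
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
      Phi = rng.normal(size=(m, n))        # random measurement matrix
      y = Phi @ x_true

      # min ||x||_1  s.t.  Phi x = y, with the split x = u - v, u, v >= 0.
      c = np.ones(2 * n)
      A_eq = np.hstack([Phi, -Phi])
      res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n), method="highs")
      x_hat = res.x[:n] - res.x[n:]
      print("max recovery error:", np.abs(x_hat - x_true).max())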

  13. How do clinical clerkship students experience simulator-based teaching? A qualitative analysis.

    PubMed

    Takayesu, James K; Farrell, Susan E; Evans, Adelaide J; Sullivan, John E; Pawlowski, John B; Gordon, James A

    2006-01-01

    To critically analyze the experience of clinical clerkship students exposed to simulator-based teaching, in order to better understand student perspectives on its utility. A convenience sample of clinical students (n = 95) rotating through an emergency medicine, surgery, or longitudinal patient-doctor clerkship voluntarily participated in a 2-hour simulator-based teaching session. Groups of 3-5 students managed acute scenarios including respiratory failure, myocardial infarction, or multisystem trauma. After the session, students completed a brief written evaluation asking for free text commentary on the strengths and weaknesses of the experience; they also provided simple satisfaction ratings. Using a qualitative research approach, the textual commentary was transcribed and parsed into fragments, coded for emergent themes, and tested for inter-rater agreement. Six major thematic categories emerged from the qualitative analysis: The "Knowledge & Curriculum" domain was described by 35% of respondents, who commented on the opportunity for self-assessment, recall and memory, basic and clinical science learning, and motivation. "Applied Cognition and Critical Thought" was highlighted by 53% of respondents, who commented on the value of decision-making, active thought, clinical integration, and the uniqueness of learning-by-doing. "Teamwork and Communication" and "Procedural/Hands-On Skills" were each mentioned by 12% of subjects. Observations on the "Teaching/Learning Environment" were offered by 80% of students, who commented on the realism, interactivity, safety, and emotionality of the experience; here they also offered feedback on format, logistics, and instructors. Finally, "Suggestions for Use/Place in Undergraduate Medical Education" were provided by 22% of subjects, who primarily recommended more exposure. On a simple rating scale, 94% of students rated the quality of the simulator session as "excellent," whereas 91% felt the exercises should be "mandatory

  14. A Micromechanics-Based Method for Multiscale Fatigue Prediction

    NASA Astrophysics Data System (ADS)

    Moore, John Allan

    An estimated 80% of all structural failures are due to mechanical fatigue, often resulting in catastrophic, dangerous and costly failure events. However, an accurate model to predict fatigue remains an elusive goal. One of the major challenges is that fatigue is intrinsically a multiscale process, which is dependent on a structure's geometric design as well as its material's microscale morphology. The following work begins with a microscale study of fatigue nucleation around non-metallic inclusions. Based on this analysis, a novel multiscale method for fatigue predictions is developed. This method simulates macroscale geometries explicitly while concurrently calculating the simplified response of microscale inclusions, thus providing adequate detail on multiple scales for accurate fatigue life predictions. The methods herein provide insight into the multiscale nature of fatigue, while also developing a tool to aid in geometric design and material optimization for fatigue-critical devices such as biomedical stents and artificial heart valves.

  15. Analyst-to-Analyst Variability in Simulation-Based Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glickman, Matthew R.; Romero, Vicente J.

    This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  16. A qualitative exploration of the experiences of patients with breast cancer receiving outpatient-based chemotherapy.

    PubMed

    Lai, Xiao Bin; Ching, Shirley Siu Yin; Wong, Frances Kam Yuet

    2017-10-01

    The aim of this study was to understand the experiences of patients with breast cancer and their involvement during outpatient-based chemotherapy in Hong Kong. The outcome evaluation using a mixed-methods approach is not common in interventional studies of nurse-led chemotherapy care programmes. A qualitative approach could provide a deep understanding of the experiences of patients. A qualitative study was conducted. This is part of a randomized controlled trial of a nurse-led care programme (NCT02228200). Individual interviews were conducted in 2013 with 10 patients with breast cancer after they had completed the chemotherapy. Qualitative content analysis was adopted to analyse the interviews. Chemotherapy affected the patients in different ways. Some participants completed the chemotherapy treatment smoothly with minimum side effects, while others encountered many problems during the treatment, which had a great impact on their lives. Guided by their coping attitudes, which were affected by the Chinese culture, most participants adopted behavioural, social, cognitive and emotional strategies to actively cope with the chemotherapy. A few tolerated the treatment passively. Some thought that the process of undergoing chemotherapy was physically bearable, while some equated it with suffering. Others regarded it as a chance to get a new start. The experience of patients with breast cancer during chemotherapy can be likened to that of going on a hike. They reach the peak through different paths and bear different burdens. Yet, they have to go through until the end, regardless of how much of a burden they bear and how they achieve the peak. © 2017 John Wiley & Sons Ltd.

  17. Driving forces for home-based reablement; a qualitative study of older adults' experiences.

    PubMed

    Hjelle, Kari Margrete; Tuntland, Hanne; Førland, Oddvar; Alvsvåg, Herdis

    2017-09-01

    As a result of the ageing population worldwide, there has been a growing international interest in a new intervention termed 'reablement'. Reablement is an early and time-limited home-based intervention with emphasis on intensive, goal-oriented and interdisciplinary rehabilitation for older adults in need of rehabilitation or at risk of functional decline. The aim of this qualitative study was to describe how older adults experienced participation in reablement. Eight older adults participated in semi-structured interviews. A qualitative content analysis was used as the analysis strategy. Four main themes emerged from the participants' experiences of participating in reablement: 'My willpower is needed', 'Being with my stuff and my people', 'The home-trainers are essential', and 'Training is physical exercises, not everyday activities'. The first three themes in particular reflected the participants' driving forces in the reablement process. Driving forces are intrinsic motivation in interaction with extrinsic motivation. Intrinsic motivation was based on the person's willpower and responsibility, and extrinsic motivation was expressed to be strengthened by being in one's home environment with 'own' people, as well as by the co-operation with the reablement team. The reablement team encouraged and supported the older adults to regain confidence in performing everyday activities as well as participating in the society. Our findings have practical significance for politicians, healthcare providers and healthcare professionals by contributing to an understanding of how intrinsic and extrinsic motivation influence reablement. Some persons need apparently more extrinsic motivational support also after the time-limited reablement period is completed. The municipal health and care services need to consider individualised follow-up programmes after the intensive reablement period in order to maintain the achieved skills to perform everyday activities and participate in

  18. NASA space cancer risk model-2014: Uncertainties due to qualitative differences in biological effects of HZE particles

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis

    Uncertainties in estimating health risks from exposures to galactic cosmic rays (GCR), which comprise protons and high-energy and charge (HZE) nuclei, are an important limitation to long-duration space travel. HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, leading to large uncertainties in predicting risks to humans. Our NASA Space Cancer Risk Model-2012 (NSCR-2012) for estimating lifetime cancer risks from space radiation included several new features compared to earlier models from the National Council on Radiation Protection and Measurements (NCRP) used at NASA. New features of NSCR-2012 included the introduction of NASA-defined radiation quality factors based on track structure concepts, a Bayesian analysis of the dose and dose-rate reduction effectiveness factor (DDREF) and its uncertainty, and the use of a never-smoker population to represent astronauts. However, NSCR-2012 did not include estimates of the role of qualitative differences between HZE particles and low LET radiation. In this report we discuss evidence for non-targeted effects increasing cancer risks at space-relevant HZE particle absorbed doses in tissue (<0.2 Gy), and for increased tumor lethality due to the propensity for higher rates of metastatic tumors from high LET radiation suggested by animal experiments. The NSCR-2014 model considers how these qualitative differences modify the overall probability distribution functions (PDF) for cancer mortality risk estimates from space radiation. Predictions of NSCR-2014 for International Space Station missions and Mars exploration will be described, and compared to those of our earlier NSCR-2012 model.

  19. Student and educator experiences of maternal-child simulation-based learning: a systematic review of qualitative evidence.

    PubMed

    MacKinnon, Karen; Marcellus, Lenora; Rivers, Julie; Gordon, Carol; Ryan, Maureen; Butcher, Diane

    2017-11-01

    Although maternal-child care is a pillar of primary health care, there is a global shortage of maternal-child health care providers. Nurse educators experience difficulties providing undergraduate students with maternal-child learning experiences for a number of reasons. Simulation has the potential to complement learning in clinical and classroom settings. Although systematic reviews of simulation are available, no systematic reviews of qualitative evidence related to maternal-child simulation-based learning (SBL) for undergraduate nursing students and/or educators have been located. The aim of this systematic review was to identify the appropriateness and meaningfulness of maternal-child simulation-based learning for undergraduate nursing students and nursing educators in educational settings to inform curriculum decision-making. Inclusion criteria were as follows. Types of participants: pre-registration, pre-licensure or undergraduate nursing or health professional students and educators. Phenomena of interest: experiences of simulation in an educational setting with a focus relevant to maternal-child nursing. Types of studies: qualitative research and educational evaluation using qualitative methods. Context: North America, Europe, Australia and New Zealand. A three-step search strategy identified published studies in the English language from 2000 until April 2016. Identified studies that met the inclusion criteria were retrieved and critically appraised using the Joanna Briggs Institute Qualitative Assessment and Review Instrument (JBI-QARI) by at least two independent reviewers. Overall the methodological quality of the included studies was low. Qualitative findings were extracted by two independent reviewers using JBI-QARI data extraction tools. Findings were aggregated and categorized on the basis of similarity in meaning. Categories were subjected to a meta-synthesis to produce a single comprehensive set of synthesized findings. Twenty-two articles from 19 studies were included in the review

  20. Detecting determinism with improved sensitivity in time series: rank-based nonlinear predictability score.

    PubMed

    Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G

    2014-09-01

    The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
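
    A simplified sketch of nearest-neighbour predictability in a delay embedding is given below, with an optional rank transform of the signal standing in for the rank-based variant; the published score involves additional normalisation and surrogate testing not reproduced here. The logistic-map test signal is an illustrative stand-in for the noisy Lorenz signals and EEG used in the paper.

      import numpy as np

      def prediction_error(x, dim=3, lag=1, horizon=1, rank_based=False):
          """Nearest-neighbour prediction error in a delay embedding (simplified)."""
          if rank_based:
              x = np.argsort(np.argsort(x)).astype(float)   # replace amplitudes by ranks
          n = len(x) - (dim - 1) * lag - horizon
          emb = np.array([x[i:i + dim * lag:lag] for i in range(n)])
          future = x[(dim - 1) * lag + horizon:(dim - 1) * lag + horizon + n]
          errs = []
          for i in range(n):
              d = np.abs(emb - emb[i]).max(axis=1)
              d[i] = np.inf                       # exclude the self-match
              j = int(np.argmin(d))               # nearest neighbour in state space
              errs.append((future[j] - future[i]) ** 2)
          return float(np.sqrt(np.mean(errs)))

      # Deterministic (logistic map) signal vs. a shuffled surrogate of the same values.
      x = np.empty(500)
      x[0] = 0.3
      for t in range(499):
          x[t + 1] = 3.9 * x[t] * (1 - x[t])
      print(prediction_error(x), prediction_error(np.random.permutation(x)))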

  1. Detecting determinism with improved sensitivity in time series: Rank-based nonlinear predictability score

    NASA Astrophysics Data System (ADS)

    Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G.

    2014-09-01

    The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).

  2. Qualitative "trial-sibling" studies and "unrelated" qualitative studies contributed to complex intervention reviews.

    PubMed

    Noyes, Jane; Hendry, Margaret; Lewin, Simon; Glenton, Claire; Chandler, Jackie; Rashidian, Arash

    2016-06-01

    To compare the contribution of "trial-sibling" and "unrelated" qualitative studies in complex intervention reviews. Researchers are using qualitative "trial-sibling" studies undertaken alongside trials to provide explanations to understand complex interventions. In the absence of qualitative "trial-sibling" studies, it is not known if qualitative studies "unrelated" to trials are helpful. Trials, "trial-sibling," and "unrelated" qualitative studies looking at three health system interventions were identified. We looked for similarities and differences between the two types of qualitative studies, such as participants, intervention delivery, context, study quality and reporting, and contribution to understanding trial results. Reporting was generally poor in both qualitative study types. We detected no substantial differences in participant characteristics. Interventions in qualitative "trial-sibling" studies were delivered using standardized protocols, whereas interventions in "unrelated" qualitative studies were delivered in routine care. Qualitative "trial-sibling" studies alone provided insufficient data to develop meaningful transferrable explanations beyond the trial context, and their limited focus on immediate implementation did not address all phenomena of interest. Together, "trial-sibling" and "unrelated" qualitative studies provided larger, richer data sets across contexts to better understand the phenomena of interest. Findings support inclusion of "trial-sibling" and "unrelated" qualitative studies to explore complexity in complex intervention reviews. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. A data base approach for prediction of deforestation-induced mass wasting events

    NASA Technical Reports Server (NTRS)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The present investigation examines the raster-based information system as a tool for predicting which clear-cut mountain slopes are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
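
    The raster overlay modelling described above can be sketched as a weighted combination of co-registered hazard layers, as below. The layers, rescaling and weights are invented for illustration; an operational model would be calibrated against mapped slide inventories.

      import numpy as np

      # Toy raster layers on the same grid: slope in degrees, a vegetation-removal flag
      # and distance to the nearest road in metres. Weights are illustrative only.
      rng = np.random.default_rng(3)
      slope = rng.uniform(0, 45, size=(50, 50))
      clear_cut = rng.integers(0, 2, size=(50, 50))
      road_dist = rng.uniform(0, 500, size=(50, 50))

      # Simple weighted overlay: each layer is rescaled to 0..1 and combined.
      hazard = (0.5 * (slope / 45.0)
                + 0.3 * clear_cut
                + 0.2 * (1.0 - road_dist / 500.0))

      high_risk = hazard > 0.7   # cells flagged as likely debris-avalanche sites
      print(int(high_risk.sum()), "high-risk cells")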

  4. Assessment and prediction of drying shrinkage cracking in bonded mortar overlays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beushausen, Hans, E-mail: hans.beushausen@uct.ac.za; Chilwesa, Masuzyo

    2013-11-15

    Restrained drying shrinkage cracking was investigated on composite beams consisting of substrate concrete and bonded mortar overlays, and compared to the performance of the same mortars when subjected to the ring test. Stress development and cracking in the composite specimens were analytically modeled and predicted based on the measurement of relevant time-dependent material properties such as drying shrinkage, elastic modulus, tensile relaxation and tensile strength. Overlay cracking in the composite beams could be very well predicted with the analytical model. The ring test provided a useful qualitative comparison of the cracking performance of the mortars. The duration of curing was found to have only a minor influence on crack development. This was ascribed to the fact that prolonged curing has a beneficial effect on tensile strength at the onset of stress development, but at the same time is not beneficial to the values of tensile relaxation and elastic modulus. -- Highlights: •Parameter study on material characteristics influencing overlay cracking. •Analytical model gives good quantitative indication of overlay cracking. •Ring test presents good qualitative indication of overlay cracking. •Curing duration has little effect on overlay cracking.

  5. Stringent homology-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions.

    PubMed

    Zhou, Hufeng; Gao, Shangzhi; Nguyen, Nam Ninh; Fan, Mengyuan; Jin, Jingjing; Liu, Bing; Zhao, Liang; Xiong, Geng; Tan, Min; Li, Shijun; Wong, Limsoon

    2014-04-08

    H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are essential for understanding the infection mechanism of the formidable pathogen M. tuberculosis H37Rv. Computational prediction is an important strategy to fill the gap in experimental H. sapiens-M. tuberculosis H37Rv PPI data. Homology-based prediction is frequently used in predicting both intra-species and inter-species PPIs. However, some limitations are not properly resolved in several published works that predict eukaryote-prokaryote inter-species PPIs using intra-species template PPIs. We develop a stringent homology-based prediction approach by taking into account (i) differences between eukaryotic and prokaryotic proteins and (ii) differences between inter-species and intra-species PPI interfaces. We compare our stringent homology-based approach to a conventional homology-based approach for predicting host-pathogen PPIs, based on cellular compartment distribution analysis, disease gene list enrichment analysis, pathway enrichment analysis and functional category enrichment analysis. These analyses support the validity of our prediction result, and clearly show that our approach has better performance in predicting H. sapiens-M. tuberculosis H37Rv PPIs. Using our stringent homology-based approach, we have predicted a set of highly plausible H. sapiens-M. tuberculosis H37Rv PPIs which might be useful for many of related studies. Based on our analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent homology-based approach, we have discovered several interesting properties which are reported here for the first time. We find that both host proteins and pathogen proteins involved in the host-pathogen PPIs tend to be hubs in their own intra-species PPI network. Also, both host and pathogen proteins involved in host-pathogen PPIs tend to have longer primary sequence, tend to have more domains, tend to be more hydrophilic, etc. And the protein domains from both
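
    A bare-bones interolog-style transfer, the general idea behind homology-based PPI prediction, is sketched below with invented identifiers; the paper's stringency filters (eukaryote/prokaryote differences, inter- versus intra-species interface differences) are not modelled here.

      # Interolog-style transfer: if human protein H maps to template protein H' and
      # pathogen protein P maps to template protein P', and H'-P' interact in a
      # template PPI set, predict H-P. All identifiers below are invented.
      template_ppis = {("tmplA", "tmplB"), ("tmplC", "tmplD")}

      human_homologs = {"HS_P1": "tmplA", "HS_P2": "tmplC"}       # human -> template
      pathogen_homologs = {"Rv0001": "tmplB", "Rv1234": "tmplD"}  # M. tb -> template

      predicted = {
          (h, p)
          for h, th in human_homologs.items()
          for p, tp in pathogen_homologs.items()
          if (th, tp) in template_ppis or (tp, th) in template_ppis
      }
      print(predicted)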

  6. Predicting motor development in very preterm infants at 12 months' corrected age: the role of qualitative magnetic resonance imaging and general movements assessments.

    PubMed

    Spittle, Alicia J; Boyd, Roslyn N; Inder, Terrie E; Doyle, Lex W

    2009-02-01

    The objective of this study was to compare the predictive value of qualitative MRI of brain structure at term and general movements assessments at 1 and 3 months' corrected age for motor outcome at 1 year's corrected age in very preterm infants. Eighty-six very preterm infants (<30 weeks' gestation) underwent MRI at term-equivalent age, were evaluated for white matter abnormality, and had general movements assessed at 1 and 3 months' corrected age. Motor outcome at 1 year's corrected age was evaluated with the Alberta Infant Motor Scale, the Neuro-Sensory Motor Development Assessment, and the diagnosis of cerebral palsy by the child's pediatrician. At 1 year of age, the Alberta Infant Motor Scale categorized 30 (35%) infants as suspicious/abnormal; the Neuro-Sensory Motor Development Assessment categorized 16 (18%) infants with mild-to-severe motor dysfunction, and 5 (6%) infants were classified with cerebral palsy. White matter abnormality at term and general movements at 1 and 3 months significantly correlated with Alberta Infant Motor Scale and Neuro-Sensory Motor Development Assessment scores at 1 year. White matter abnormality and general movements at 3 months were the only assessments that correlated with cerebral palsy. All assessments had 100% sensitivity in predicting cerebral palsy. White matter abnormality demonstrated the greatest accuracy in predicting combined motor outcomes, with excellent levels of specificity (>90%); however, the sensitivity was low. On the other hand, general movements assessments at 1 month had the highest sensitivity (>80%); however, the overall accuracy was relatively low. Neuroimaging (MRI) and functional (general movements) examinations have important complementary roles in predicting motor development of very preterm infants.

  7. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    NASA Astrophysics Data System (ADS)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists, owing to their potential for more accurate prediction of flood flows compared to conceptual or physics-based hydrologic models. The ANN approximates the non-linear functional relationship between complex hydrologic variables in arriving at the river flow forecast values. Despite a large number of applications, there is still some criticism that ANN point predictions lack reliability, since the uncertainty of the predictions is not quantified, and this limits their use in practical applications. A major concern in applying traditional uncertainty analysis techniques to the neural network framework is its parallel computing architecture with large degrees of freedom, which makes the uncertainty assessment a challenging task. Very limited studies have considered assessment of the predictive uncertainty of ANN-based hydrologic models. In this study, a novel method is proposed that helps construct the prediction interval of an ANN flood forecasting model during calibration itself. The method is designed to have two stages of optimization during calibration: at stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain the optimal set of weights and biases, and during stage 2, the optimal variability of the ANN parameters (obtained in stage 1) is identified so as to create an ensemble of predictions. During the second stage, the optimization is performed with multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) maximum number of measured data points falling within the estimated prediction interval and (iii) minimum width of the prediction interval. The method is illustrated using a real-world case study of an Indian basin. The method was able to produce an ensemble that has an average prediction interval width of 23.03 m3/s, with 97.17% of the total validation data points (measured) lying within the interval. The derived
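
    Once an ensemble of forecasts exists, the two interval-quality measures reported above (percentage of observations inside the interval and average interval width) are straightforward to compute, as in the sketch below on toy data. Generating the ensemble itself via the two-stage GA calibration is not reproduced here; the series and ensemble spread are invented.

      import numpy as np

      # Toy ensemble of flow forecasts: rows = ensemble members, columns = time steps.
      rng = np.random.default_rng(4)
      observed = 100 + 20 * np.sin(np.linspace(0, 6, 200))
      ensemble = observed + rng.normal(0, 8, size=(50, 200))

      lower = np.percentile(ensemble, 2.5, axis=0)
      upper = np.percentile(ensemble, 97.5, axis=0)

      coverage = np.mean((observed >= lower) & (observed <= upper)) * 100  # % inside PI
      avg_width = np.mean(upper - lower)                                   # e.g. m3/s
      print(f"coverage = {coverage:.1f}%, mean width = {avg_width:.1f}")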

  8. "There Are No Known Benefits . . .": Considering the Risk/Benefit Ratio of Qualitative Research.

    PubMed

    Opsal, Tara; Wolgemuth, Jennifer; Cross, Jennifer; Kaanta, Tanya; Dickmann, Ellyn; Colomer, Soria; Erdil-Moody, Zeynep

    2016-07-01

    Institutional review boards (IRBs) are responsible for weighing the risks and benefits of research participation. Qualitative researchers note numerous instances where IRB ethical frameworks fail to align with the ethics of their research projects and point out that IRB understandings of the benefits and risks of research often differ from those of the participants they seek to protect. This qualitative cross-case research investigates participants' interview experiences in six qualitative studies that differed in their methods, subject of focus, and populations. Our findings indicate that contemporary IRBs' use of population "vulnerability" and topic "sensitivity" to assess project risk does not adequately determine the benefits, risks, or ethicality of research. We recommend that IRBs treat as real the evidence for benefits in qualitative research, recognize that sensitivity and vulnerability do not predict risk, and encourage researchers to attend to relationships in their projects. © The Author(s) 2015.

  9. Bias-adjusted satellite-based rainfall estimates for predicting floods: Narayani Basin

    USGS Publications Warehouse

    Artan, Guleid A.; Tokar, S.A.; Gautam, D.K.; Bajracharya, S.R.; Shrestha, M.S.

    2011-01-01

    In Nepal, as the spatial distribution of rain gauges is not sufficient to provide a detailed perspective on the highly varied spatial nature of rainfall, satellite-based rainfall estimates provide the opportunity for timely estimation. This paper presents flood prediction for the Narayani Basin at the Devghat hydrometric station (32 000 km2) using bias-adjusted satellite rainfall estimates and the Geospatial Stream Flow Model (GeoSFM), a spatially distributed, physically based hydrologic model. GeoSFM was calibrated on 2003 and validated on 2004 with gridded gauge-observed rainfall inputs derived by kriging interpolation, simulating stream flow with a Nash-Sutcliffe Efficiency above 0.7 in both periods. With the National Oceanic and Atmospheric Administration Climate Prediction Centre's rainfall estimates (CPC_RFE2.0) and the same calibrated parameters, model performance for 2003 deteriorated but improved after recalibration with CPC_RFE2.0, indicating the need to recalibrate the model with satellite-based rainfall estimates. Adjusting the CPC_RFE2.0 by seasonal, monthly and 7-day moving-average ratios further improved model performance. Furthermore, a new gauge-satellite merged rainfall estimate obtained by ingesting local rain gauge data resulted in significant improvement in flood predictability. The results indicate the applicability of satellite-based rainfall estimates in flood prediction with appropriate bias correction.
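
    One of the adjustment schemes mentioned, a 7-day moving-average gauge/satellite ratio, can be sketched as below on toy daily series, together with the Nash-Sutcliffe efficiency used to judge performance. The synthetic bias, the epsilon guard against division by zero and the series themselves are assumptions for illustration; the operational CPC_RFE2.0 adjustment differs in detail.

      import numpy as np
      import pandas as pd

      # Toy daily series: gauge rainfall and a satellite estimate with systematic bias.
      rng = np.random.default_rng(5)
      dates = pd.date_range("2003-01-01", periods=365, freq="D")
      gauge = pd.Series(rng.gamma(0.6, 8.0, 365), index=dates)
      satellite = 0.7 * gauge + rng.normal(0, 1.0, 365).clip(min=0)

      # 7-day moving-average ratio (gauge / satellite) applied as a bias correction.
      eps = 0.1
      ratio = (gauge.rolling(7, min_periods=1).mean() + eps) / \
              (satellite.rolling(7, min_periods=1).mean() + eps)
      adjusted = satellite * ratio

      def nse(obs, sim):
          """Nash-Sutcliffe efficiency used to judge model performance."""
          return 1 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

      print(nse(gauge, satellite), nse(gauge, adjusted))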

  10. Prediction of beta-turns at over 80% accuracy based on an ensemble of predicted secondary structures and multiple alignments.

    PubMed

    Zheng, Ce; Kurgan, Lukasz

    2008-10-10

    The beta-turn is a secondary protein structure type that plays a significant role in protein folding, stability, and molecular recognition. To date, several methods for the prediction of beta-turns from protein sequences have been developed, but they are characterized by relatively poor prediction quality. The novelty of the proposed sequence-based beta-turn predictor stems from the use of window-based information extracted from four predicted three-state secondary structures, which, together with a selected set of position-specific scoring matrix (PSSM) values, serves as the input to a support vector machine (SVM) predictor. We show that (1) all four predicted secondary structures are useful; (2) the most useful information extracted from the predicted secondary structure includes the structure of the predicted residue, secondary structure content in a window around the predicted residue, and features that indicate whether the predicted residue is inside a secondary structure segment; (3) the PSSM values of Asn, Asp, Gly, Ile, Leu, Met, Pro, and Val were among the top-ranked features, which corroborates recent studies. Asn, Asp, Gly, and Pro indicate potential beta-turns, while the remaining four amino acids are useful to predict non-beta-turns. Empirical evaluation using three nonredundant datasets shows favorable Q total, Q predicted and MCC values when compared with over a dozen modern competing methods. Our method is the first to break the 80% Q total barrier and achieves Q total = 80.9%, MCC = 0.47, and Q predicted higher by over 6% when compared with the second-best method. We use feature selection to reduce the dimensionality of the feature vector used as the input for the proposed prediction method. The applied feature set is smaller by 86, 62 and 37% when compared with the second-best and two third-best (with respect to MCC) competing methods, respectively. Experiments show that the proposed method constitutes an improvement over the competing prediction
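
    The window-based feature construction feeding the SVM can be sketched as below with randomly generated stand-ins for predicted secondary-structure states and PSSM columns. Because the labels are random, the printed accuracy is meaningless; the snippet only shows the shape of such a pipeline, not the published predictor or its selected features.

      import numpy as np
      from sklearn.svm import SVC

      # Toy per-residue features: a window of predicted secondary-structure states
      # (H/E/C one-hot) plus a few PSSM columns. Real inputs would come from
      # secondary-structure predictors and PSI-BLAST PSSMs; here everything is random.
      rng = np.random.default_rng(6)
      n_residues, window, n_pssm = 2000, 7, 8
      ss_onehot = rng.multinomial(
          1, [0.35, 0.25, 0.40], size=(n_residues, window)
      ).reshape(n_residues, window * 3)
      pssm = rng.normal(size=(n_residues, n_pssm))
      X = np.hstack([ss_onehot, pssm])
      y = rng.integers(0, 2, n_residues)   # 1 = beta-turn (random labels here)

      clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X[:1500], y[:1500])
      print("held-out accuracy:", clf.score(X[1500:], y[1500:]))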

  11. Trajectory-Based Complexity (TBX): A Modified Aircraft Count to Predict Sector Complexity During Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Lee, Paul U.

    2011-01-01

    In this paper we introduce a new complexity metric to predict, in real time, sector complexity for trajectory-based operations (TBO). TBO will be implemented in the Next Generation Air Transportation System (NextGen). Trajectory-Based Complexity (TBX) is a modified aircraft count that can easily be computed and communicated in a TBO environment based upon predictions of aircraft and weather trajectories. TBX is scaled to aircraft count and represents an alternate and additional means to manage air traffic demand and capacity with more consideration of dynamic factors such as weather, aircraft equipage or predicted separation violations, as well as static factors such as sector size. We have developed and evaluated TBX in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center during human-in-the-loop studies of trajectory-based concepts since 2009. In this paper we will describe the TBX computation in detail and present the underlying algorithm. Next, we will describe the specific TBX used in an experiment at NASA's AOL. We will evaluate the performance of this metric using data collected during a controller-in-the-loop study on trajectory-based operations at different equipage levels. In this study controllers were prompted at regular intervals to rate their current workload on a numeric scale. When comparing this real-time workload rating to the TBX values predicted for these time periods we demonstrate that TBX is a better predictor of workload than aircraft count. Furthermore we demonstrate that TBX is well suited to be used for complexity management in TBO and can easily be adjusted to future operational concepts.

  12. Link Prediction in Evolving Networks Based on Popularity of Nodes.

    PubMed

    Wang, Tong; He, Xing-Sheng; Zhou, Ming-Yang; Fu, Zhong-Qian

    2017-08-02

    Link prediction aims to uncover the underlying relationships behind networks, which can be utilized to predict missing edges or identify spurious edges. The key issue of link prediction is to estimate the likelihood of potential links in networks. Most classical static-structure-based methods ignore the temporal aspects of networks; limited by time-varying features, such approaches perform poorly in evolving networks. In this paper, we propose the hypothesis that the ability of each node to attract links depends not only on its structural importance, but also on its current popularity (activeness), since active nodes have a much higher probability of attracting future links. Then a novel approach named the popularity-based structural perturbation method (PBSPM) and its fast algorithm are proposed to characterize the likelihood of an edge from both the existing connectivity structure and the current popularity of its two endpoints. Experiments on six evolving networks show that the proposed methods outperform state-of-the-art methods in accuracy and robustness. Besides, visual results and statistical analysis reveal that the proposed methods are inclined to predict future edges between active nodes, rather than edges between inactive nodes.

  13. Deep learning predictions of survival based on MRI in amyotrophic lateral sclerosis.

    PubMed

    van der Burgh, Hannelore K; Schmidt, Ruben; Westeneng, Henk-Jan; de Reus, Marcel A; van den Berg, Leonard H; van den Heuvel, Martijn P

    2017-01-01

    Amyotrophic lateral sclerosis (ALS) is a progressive neuromuscular disease, with large variation in survival between patients. Currently, it remains rather difficult to predict survival based on clinical parameters alone. Here, we set out to use clinical characteristics in combination with MRI data to predict survival of ALS patients using deep learning, a machine learning technique highly effective in a broad range of big-data analyses. A group of 135 ALS patients was included from whom high-resolution diffusion-weighted and T1-weighted images were acquired at the first visit to the outpatient clinic. Next, each of the patients was monitored carefully and survival time to death was recorded. Patients were labeled as short, medium or long survivors, based on their recorded time to death as measured from the time of disease onset. In the deep learning procedure, the total group of 135 patients was split into a training set for deep learning (n = 83 patients), a validation set (n = 20) and an independent evaluation set (n = 32) to evaluate the performance of the obtained deep learning networks. Deep learning based on clinical characteristics predicted survival category correctly in 68.8% of the cases. Deep learning based on MRI predicted 62.5% correctly using structural connectivity and 62.5% using brain morphology data. Notably, when we combined the three sources of information, deep learning prediction accuracy increased to 84.4%. Taken together, our findings show the added value of MRI with respect to predicting survival in ALS, demonstrating the advantage of deep learning in disease prognostication.
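
    The value of combining feature sources can be sketched, very loosely, by concatenating clinical, connectivity and morphology features and fitting a small multilayer network to a three-class survival label, as below on random data. This is not the authors' deep learning architecture, and with random features the printed accuracy is not meaningful; the split sizes merely echo the study design.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler

      # Toy stand-ins for the three feature sources, concatenated per patient.
      rng = np.random.default_rng(7)
      n = 135
      clinical = rng.normal(size=(n, 10))
      connectivity = rng.normal(size=(n, 50))
      morphology = rng.normal(size=(n, 30))
      X = np.hstack([clinical, connectivity, morphology])
      y = rng.integers(0, 3, n)            # short / medium / long survivor label

      X = StandardScaler().fit_transform(X)
      clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
      clf.fit(X[:83], y[:83])              # training split as in the study design
      print("evaluation accuracy:", clf.score(X[103:], y[103:]))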

  14. Enhancing the quality and credibility of qualitative analysis.

    PubMed

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  15. Enhancing the quality and credibility of qualitative analysis.

    PubMed Central

    Patton, M Q

    1999-01-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems. PMID:10591279

  16. Deep learning based tissue analysis predicts outcome in colorectal cancer.

    PubMed

    Bychkov, Dmitrii; Linder, Nina; Turkki, Riku; Nordling, Stig; Kovanen, Panu E; Verrill, Clare; Walliander, Margarita; Lundin, Mikael; Haglund, Caj; Lundin, Johan

    2018-02-21

    Image-based machine learning, and deep learning in particular, has recently shown expert-level accuracy in medical image classification. In this study, we combine convolutional and recurrent architectures to train a deep network to predict colorectal cancer outcome based on images of tumour tissue samples. The novelty of our approach is that we directly predict patient outcome, without any intermediate tissue classification. We evaluate a set of digitized haematoxylin-eosin-stained tumour tissue microarray (TMA) samples from 420 colorectal cancer patients with clinicopathological and outcome data available. The results show that deep learning-based outcome prediction with only small tissue areas as input outperforms (hazard ratio 2.3; CI 95% 1.79-3.03; AUC 0.69) visual histological assessment performed by human experts on both TMA spot (HR 1.67; CI 95% 1.28-2.19; AUC 0.58) and whole-slide level (HR 1.65; CI 95% 1.30-2.15; AUC 0.57) in the stratification into low- and high-risk patients. Our results suggest that state-of-the-art deep learning techniques can extract more prognostic information from the tissue morphology of colorectal cancer than an experienced human observer.

  17. Practical quantum mechanics-based fragment methods for predicting molecular crystal properties.

    PubMed

    Wen, Shuhao; Nanda, Kaushik; Huang, Yuanhang; Beran, Gregory J O

    2012-06-07

    Significant advances in fragment-based electronic structure methods have created a real alternative to force-field and density functional techniques in condensed-phase problems such as molecular crystals. This perspective article highlights some of the important challenges in modeling molecular crystals and discusses techniques for addressing them. First, we survey recent developments in fragment-based methods for molecular crystals. Second, we use examples from our own recent research on a fragment-based QM/MM method, the hybrid many-body interaction (HMBI) model, to analyze the physical requirements for a practical and effective molecular crystal model chemistry. We demonstrate that it is possible to predict molecular crystal lattice energies to within a couple of kJ mol(-1) and lattice parameters to within a few percent in small-molecule crystals. Fragment methods provide a systematically improvable approach to making predictions in the condensed phase, which is critical to making robust predictions regarding the subtle energy differences found in molecular crystals.

  18. Seizure prediction in hippocampal and neocortical epilepsy using a model-based approach

    PubMed Central

    Aarabi, Ardalan; He, Bin

    2014-01-01

    Objectives The aim of this study is to develop a model-based seizure prediction method. Methods A neural mass model was used to simulate the macro-scale dynamics of intracranial EEG data. The model was composed of pyramidal cells and excitatory and inhibitory interneurons described through state equations. Twelve model parameters were estimated by fitting the model to the power spectral density of intracranial EEG signals and then integrated based on information obtained by investigating changes in the parameters prior to seizures. Twenty-one patients with medically intractable hippocampal and neocortical focal epilepsy were studied. Results Tuned to obtain maximum sensitivity, an average sensitivity of 87.07% and 92.6% with an average false prediction rate of 0.2 and 0.15/h were achieved using maximum seizure occurrence periods of 30 and 50 min and a minimum seizure prediction horizon of 10 s, respectively. Under maximum specificity conditions, the system sensitivity decreased to 82.9% and 90.05% and the false prediction rates were reduced to 0.16 and 0.12/h using maximum seizure occurrence periods of 30 and 50 min, respectively. Conclusions The spatio-temporal changes in the parameters demonstrated patient-specific preictal signatures that could be used for seizure prediction. Significance The present findings suggest that the model-based approach may aid prediction of seizures. PMID:24374087

  19. Novel Approach for the Recognition and Prediction of Multi-Function Radar Behaviours Based on Predictive State Representations.

    PubMed

    Ou, Jian; Chen, Yongguang; Zhao, Feng; Liu, Jin; Xiao, Shunping

    2017-03-19

    The extensive applications of multi-function radars (MFRs) have presented a great challenge to the technologies of radar countermeasures (RCMs) and electronic intelligence (ELINT). The recently proposed cognitive electronic warfare (CEW) provides a good solution, whose crux is to perceive present and future MFR behaviours, including the operating modes, waveform parameters, scheduling schemes, etc. Due to the variety and complexity of MFR waveforms, the existing approaches suffer from inefficiency and weak practicability in prediction. A novel method for MFR behaviour recognition and prediction is proposed based on predictive state representation (PSR). With the proposed approach, operating modes of the MFR are recognized by accumulating the predictive states, instead of using fixed transition probabilities that are unavailable on the battlefield. This helps to reduce the dependence on prior information about the MFR. MFR signals can also be quickly predicted by iteratively using the predicted observations, avoiding the very large computational cost caused by the uncertainty of future observations. Simulations with a hypothetical MFR signal sequence in a typical scenario are presented, showing that the proposed methods perform well and efficiently, which attests to their validity.

  20. Predicting data saturation in qualitative surveys with mathematical models from ecological research.

    PubMed

    Tran, Viet-Thi; Porcher, Raphael; Tran, Viet-Chi; Ravaud, Philippe

    2017-02-01

    Sample size in surveys with open-ended questions relies on the principle of data saturation. Determining the point of data saturation is complex because researchers have information on only what they have found. The decision to stop data collection is solely dictated by the judgment and experience of researchers. In this article, we present how mathematical modeling may be used to describe and extrapolate the accumulation of themes during a study to help researchers determine the point of data saturation. The model considers a latent distribution of the probability of elicitation of all themes and infers the accumulation of themes as arising from a mixture of zero-truncated binomial distributions. We illustrate how the model could be used with data from a survey with open-ended questions on the burden of treatment involving 1,053 participants from 34 different countries and with various conditions. The performance of the model in predicting the number of themes to be found with the inclusion of new participants was investigated by Monte Carlo simulations. Then, we tested how the slope of the expected theme accumulation curve could be used as a stopping criterion for data collection in surveys with open-ended questions. By doubling the sample size after the inclusion of initial samples of 25 to 200 participants, the model reliably predicted the number of themes to be found. Mean estimation error ranged from 3% to 1% with simulated data and was <2% with data from the study of the burden of treatment. Sequentially calculating the slope of the expected theme accumulation curve for every five new participants included was a feasible approach to balance the benefits of including these new participants in the study. In our simulations, a stopping criterion based on a value of 0.05 for this slope allowed for identifying 97.5% of the themes while limiting the inclusion of participants eliciting nothing new in the study. Mathematical models adapted from ecological research
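
    A Monte Carlo sketch of an expected theme-accumulation curve with a slope-based stopping rule is given below. The latent elicitation probabilities and the 0.05 slope threshold are illustrative assumptions; the authors instead fit a mixture of zero-truncated binomial distributions to observed theme counts rather than simulating known probabilities.

```python
# Illustrative theme-accumulation curve with a slope-based stopping rule.
# The latent elicitation probabilities below are assumed for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
n_themes = 60
# Latent probability that a given participant elicits each theme
# (skewed: a few common themes, many rare ones).
p = rng.beta(0.5, 5.0, size=n_themes)

def expected_found_themes(sample_sizes, p):
    """Expected number of distinct themes found after n participants."""
    return [float(np.sum(1.0 - (1.0 - p) ** n)) for n in sample_sizes]

ns = np.arange(5, 301, 5)
curve = expected_found_themes(ns, p)

# Stop when the curve gains fewer than `slope_threshold` new themes per
# additional participant, evaluated every 5 participants.
slope_threshold = 0.05
for i in range(1, len(ns)):
    slope = (curve[i] - curve[i - 1]) / (ns[i] - ns[i - 1])
    if slope < slope_threshold:
        print(f"stop at n={ns[i]}: {curve[i]:.1f} of {n_themes} themes expected")
        break
```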

  1. Reflectance Prediction Modelling for Residual-Based Hyperspectral Image Coding

    PubMed Central

    Xiao, Rui; Gao, Junbin; Bossomaier, Terry

    2016-01-01

    A Hyperspectral (HS) image provides observational powers beyond human vision capability but represents more than 100 times the data compared to a traditional image. To transmit and store the huge volume of an HS image, we argue that a fundamental shift is required from the existing “original pixel intensity”-based coding approaches using traditional image coders (e.g., JPEG2000) to the “residual”-based approaches using a video coder for better compression performance. A modified video coder is required to exploit spatial-spectral redundancy using pixel-level reflectance modelling due to the different characteristics of HS images in their spectral and shape domain of panchromatic imagery compared to traditional videos. In this paper a novel coding framework using Reflectance Prediction Modelling (RPM) in the latest video coding standard High Efficiency Video Coding (HEVC) for HS images is proposed. An HS image presents a wealth of data where every pixel is considered a vector for different spectral bands. By quantitative comparison and analysis of pixel vector distribution along spectral bands, we conclude that modelling can predict the distribution and correlation of the pixel vectors for different bands. To exploit distribution of the known pixel vector, we estimate a predicted current spectral band from the previous bands using Gaussian mixture-based modelling. The predicted band is used as the additional reference band together with the immediate previous band when we apply the HEVC. Every spectral band of an HS image is treated like it is an individual frame of a video. In this paper, we compare the proposed method with mainstream encoders. The experimental results are fully justified by three types of HS dataset with different wavelength ranges. The proposed method outperforms the existing mainstream HS encoders in terms of rate-distortion performance of HS image compression. PMID:27695102

  2. Benchmark data sets for structure-based computational target prediction.

    PubMed

    Schomburg, Karen T; Rarey, Matthias

    2014-08-25

    Structure-based computational target prediction methods identify potential targets for a bioactive compound. Methods based on protein-ligand docking so far face many challenges, the greatest of which is probably the ranking of true targets in a large data set of protein structures. Currently, no standard data sets for evaluation exist, rendering comparison and demonstration of improvements of methods cumbersome. Therefore, we propose two data sets and evaluation strategies for a meaningful evaluation of new target prediction methods, i.e., a small data set consisting of three target classes for detailed proof-of-concept and selectivity studies and a large data set consisting of 7992 protein structures and 72 drug-like ligands allowing statistical evaluation with performance metrics on a drug-like chemical space. Both data sets are built from openly available resources, and any information needed to perform the described experiments is reported. We describe the composition of the data sets, the setup of screening experiments, and the evaluation strategy. Performance metrics capable of measuring the early recognition of enrichment, such as AUC, BEDROC, and NSLR, are proposed. We apply a sequence-based target prediction method to the large data set to analyze its content of nontrivial evaluation cases. The proposed data sets are used for method evaluation of our new inverse screening method iRAISE. The small data set reveals the method's capability and limitations to selectively distinguish between rather similar protein structures. The large data set simulates real target identification scenarios. iRAISE achieves excellent or good enrichment in 55% of cases, a median AUC of 0.67, and RMSDs below 2.0 Å in 74% of cases, and was able to predict the first true target within the top 2% of the protein data set of about 8000 structures in 59 out of 72 cases.

  3. Pushing the Frontier of Data-Oriented Geodynamic Modeling: from Qualitative to Quantitative to Predictive

    NASA Astrophysics Data System (ADS)

    Liu, L.; Hu, J.; Zhou, Q.

    2016-12-01

    The rapid accumulation of geophysical and geological data sets poses an increasing demand for the development of geodynamic models to better understand the evolution of the solid Earth. Consequently, the earlier qualitative physical models are no longer satisfactory. Recent efforts are focusing on more quantitative simulations and more efficient numerical algorithms. Among these, a particular line of research is on the implementation of data-oriented geodynamic modeling, with the purpose of building an observationally consistent and physically correct geodynamic framework. Such models could often catalyze new insights into the functioning mechanisms of the various aspects of plate tectonics, and their predictive nature could also guide future research in a deterministic fashion. Over the years, we have been working on constructing large-scale geodynamic models with both sequential and variational data assimilation techniques. These models act as a bridge between different observational records, and the superposition of the constraining power from different data sets helps reveal unknown processes and mechanisms of the dynamics of the mantle and lithosphere. We simulate the post-Cretaceous subduction history in South America using a forward (sequential) approach. The model is constrained using past subduction history, seafloor age evolution, tectonic architecture of continents, and the present day geophysical observations. Our results quantify the various driving forces shaping the present South American flat slabs, which we found are all internally torn. The 3-D geometry of these torn slabs further explains the abnormal seismicity pattern and enigmatic volcanic history. An inverse (variational) model simulating the late Cenozoic western U.S. mantle dynamics with similar constraints reveals a different mechanism for the formation of Yellowstone-related volcanism from traditional understanding. Furthermore, important insights on the mantle density and viscosity structures

  4. PatchSurfers: Two methods for local molecular property-based binding ligand prediction.

    PubMed

    Shin, Woong-Hee; Bures, Mark Gregory; Kihara, Daisuke

    2016-01-15

    Protein function prediction is an active area of research in computational biology. Function prediction can help biologists make hypotheses for characterization of genes and help interpret biological assays, and thus is a productive area for collaboration between experimental and computational biologists. Among various function prediction methods, predicting binding ligand molecules for a target protein is an important class because ligand binding events for a protein are usually closely intertwined with the protein's biological function, and also because predicted binding ligands can often be directly tested by biochemical assays. Binding ligand prediction methods can be classified into two types: those which are based on protein-protein (or pocket-pocket) comparison, and those that compare a target pocket directly to ligands. Recently, our group proposed two computational binding ligand prediction methods, Patch-Surfer, which is a pocket-pocket comparison method, and PL-PatchSurfer, which compares a pocket to ligand molecules. The two programs apply surface patch-based descriptions to calculate similarity or complementarity between molecules. A surface patch is characterized by physicochemical properties such as shape, hydrophobicity, and electrostatic potentials. These properties on the surface are represented using three-dimensional Zernike descriptors (3DZD), which are based on a series expansion of a 3-dimensional function. Utilizing 3DZD for describing the physicochemical properties has two main advantages: (1) rotational invariance and (2) fast comparison. Here, we introduce Patch-Surfer and PL-PatchSurfer with an emphasis on PL-PatchSurfer, which is more recently developed. Illustrative examples of PL-PatchSurfer performance on binding ligand prediction as well as virtual drug screening are also provided. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model

    PubMed Central

    Li, Xiaoqing; Wang, Yu

    2018-01-01

    Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing

  6. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model.

    PubMed

    Xin, Jingzhou; Zhou, Jianting; Yang, Simon X; Li, Xiaoqing; Wang, Yu

    2018-01-19

    Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing
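
    A minimal sketch of the first two stages of this pipeline, assuming a simple random-walk Kalman filter for denoising and an ARIMA(2,1,1) fit on simulated deformation data, is shown below; the GARCH stage fitted to the ARIMA residuals in the paper is omitted, and the noise variances are illustrative assumptions.

```python
# Minimal sketch of the denoise-then-model pipeline on a simulated deformation
# series: a scalar Kalman filter for noise reduction, then an ARIMA fit.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
t = np.arange(500)
truth = 2.0 * np.sin(2 * np.pi * t / 100) + 0.002 * t    # slow deformation trend
obs = truth + rng.normal(scale=0.8, size=t.size)          # noisy GNSS observations

def kalman_filter(z, q=1e-3, r=0.64):
    """Random-walk state model: x_k = x_{k-1} + w,  z_k = x_k + v."""
    x, P = z[0], 1.0
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        P = P + q                      # predict
        K = P / (P + r)                # Kalman gain
        x = x + K * (zk - x)           # update with the new observation
        P = (1.0 - K) * P
        out[k] = x
    return out

denoised = kalman_filter(obs)
model = ARIMA(denoised, order=(2, 1, 1)).fit()
forecast = model.forecast(steps=5)     # five-step-ahead deformation prediction
print(forecast)
# A GARCH(1,1) model on model.resid would follow as the third stage.
```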

  7. How do the features of mindfulness-based cognitive therapy contribute to positive therapeutic change? A meta-synthesis of qualitative studies.

    PubMed

    Cairns, Victoria; Murray, Craig

    2015-05-01

    The exploration of Mindfulness-based Cognitive Therapy through qualitative investigation is a growing area of interest within current literature, providing valuable understanding of the process of change experienced by those engaging in this therapeutic approach. This meta-synthesis aims to gain a deeper understanding of how the features of Mindfulness-based Cognitive Therapy contribute to positive therapeutic change. Noblit and Hare's (1988) 7-step meta-ethnography method was conducted in order to synthesize the findings of seven qualitative studies. The process of reciprocal translation identified the following five major themes: i) Taking control through understanding, awareness and acceptance; ii) The impact of the group; (iii) Taking skills into everyday life; (iv) Feelings towards the self; (v) The role of expectations. The synthesis of translation identified the higher order concept of "The Mindfulness-based Cognitive Therapy Journey to Change", which depicts the complex interaction between the five themes in relation to how they contribute to positive therapeutic change. The findings are discussed in relation to previous research, theory and their implications for clinical practice.

  8. Predicting Drug-Target Interactions Based on Small Positive Samples.

    PubMed

    Hu, Pengwei; Chan, Keith C C; Hu, Yanxing

    2018-01-01

    A basic task in drug discovery is to find new medication in the form of candidate compounds that act on a target protein. In other words, a drug has to interact with a target, and such drug-target interaction (DTI) is not expected to be random. Significant and interesting patterns are expected to be hidden in them. If these patterns can be discovered, new drugs are expected to be more easily discoverable. Currently, a number of computational methods have been proposed to predict DTIs based on their similarity. However, such an approach does not allow biochemical features to be directly considered. As a result, some methods have been proposed to try to discover patterns in physicochemical interactions. Since the number of potential negative DTIs is very high both in absolute terms and in comparison to that of the known ones, these methods are rather computationally expensive and they can only rely on subsets, rather than the full set, of negative DTIs for training and validation. As there is always a relatively high chance for negative DTIs to be falsely identified and as only a partial subset of such DTIs is considered, existing approaches can be further improved to better predict DTIs. In this paper, we present a novel approach, called ODT (one-class drug-target interaction prediction), for this purpose. One main task of ODT is to discover association patterns between interacting drugs and proteins from the chemical structure of the former and the protein sequence network of the latter. ODT does so in two phases. First, the DTI network is transformed to a representation by structural properties. Second, it applies a one-class classification algorithm to build a prediction model based only on known positive interactions. We compared the best AUROC scores of ODT with several state-of-the-art approaches on gold standard data. The prediction accuracy of ODT is superior in comparison with all the other methods on the GPCRs dataset and the Ion channels dataset. Performance
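
    A minimal sketch of the second phase, training a one-class learner on feature vectors of known positive drug-target pairs only, is shown below. The random feature vectors are placeholders for structural descriptors of drug-protein pairs, and OneClassSVM is one possible one-class classifier, not necessarily the algorithm used by ODT.

```python
# Minimal sketch of one-class learning from positive drug-target pairs only.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
# Placeholder descriptors for 200 known (positive) drug-protein interactions.
X_known_interactions = rng.normal(loc=0.0, scale=1.0, size=(200, 32))

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
clf.fit(X_known_interactions)               # trained on positives only

X_candidates = rng.normal(loc=0.0, scale=1.0, size=(5, 32))
print(clf.predict(X_candidates))            # +1 = predicted interaction, -1 = not
print(clf.decision_function(X_candidates))  # larger = more interaction-like
```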

  9. Motion-based prediction is sufficient to solve the aperture problem

    PubMed Central

    Perrinet, Laurent U; Masson, Guillaume S

    2012-01-01

    In low-level sensory systems, it is still unclear how the noisy information collected locally by neurons may give rise to a coherent global percept. This is well demonstrated for the detection of motion in the aperture problem: as luminance of an elongated line is symmetrical along its axis, tangential velocity is ambiguous when measured locally. Here, we develop the hypothesis that motion-based predictive coding is sufficient to infer global motion. Our implementation is based on a context-dependent diffusion of a probabilistic representation of motion. We observe in simulations a progressive solution to the aperture problem similar to physiology and behavior. We demonstrate that this solution is the result of two underlying mechanisms. First, we demonstrate the formation of a tracking behavior favoring temporally coherent features independently of their texture. Second, we observe that incoherent features are explained away while coherent information diffuses progressively to the global scale. Most previous models included ad-hoc mechanisms such as end-stopped cells or a selection layer to track specific luminance-based features as necessary conditions to solve the aperture problem. Here, we have proved that motion-based predictive coding, as it is implemented in this functional model, is sufficient to solve the aperture problem. This solution may give insights into the role of prediction underlying a large class of sensory computations. PMID:22734489

  10. Work-based assessment: qualitative perspectives of novice nutrition and dietetics educators.

    PubMed

    Palermo, C; Beck, E J; Chung, A; Ash, S; Capra, S; Truby, H; Jolly, B

    2014-10-01

    The assessment of competence for health professionals including nutrition and dietetics professionals in work-based settings is challenging. The present study aimed to explore the experiences of educators involved in the assessment of nutrition and dietetics students in the practice setting and to identify barriers and enablers to effective assessment. A qualitative research approach using in-depth interviews was employed with a convenience sample of inexperienced dietitian assessors. Interviews explored assessment practices and challenges. Data were analysed using a thematic approach within a phenomenological framework. Twelve relatively inexperienced practice educators were purposefully sampled to take part in the present study. Three themes emerged from these data. (i) Student learning and thus assessment is hindered by a number of barriers, including workload demands and case-mix. Some workplaces are challenged to provide appropriate learning opportunities and environment. Adequate support for placement educators from the university, managers and their peers and planning are enablers to effective assessment. (ii) The role of the assessor and their relationship with students impacts on competence assessment. (iii) There is a lack of clarity in the tasks and responsibilities of competency-based assessment. The present study provides perspectives on barriers and enablers to effective assessment. It highlights the importance of reflective practice and feedback in assessment practices that are synonymous with evidence from other disciplines, which can be used to better support a work-based competency assessment of student performance. © 2013 The British Dietetic Association Ltd.

  11. In silico free energy predictions for ionic liquid-assisted exfoliation of a graphene bilayer into individual graphene nanosheets.

    PubMed

    Kamath, Ganesh; Baker, Gary A

    2012-06-14

    Free energies for graphene exfoliation from bilayer graphene using ionic liquids based on various cations paired with the bis(trifluoromethylsulfonyl)imide anion were determined from adaptive bias force-molecular dynamics (ABF-MD) simulation and fall in excellent qualitative agreement with experiment. This method has notable potential as an a priori screening tool for performance based rank order prediction of novel ionic liquids for the dispersion and exfoliation of various nanocarbons and inorganic graphene analogues.

  12. Stringent homology-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions

    PubMed Central

    2014-01-01

    Background H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are essential for understanding the infection mechanism of the formidable pathogen M. tuberculosis H37Rv. Computational prediction is an important strategy to fill the gap in experimental H. sapiens-M. tuberculosis H37Rv PPI data. Homology-based prediction is frequently used in predicting both intra-species and inter-species PPIs. However, some limitations are not properly resolved in several published works that predict eukaryote-prokaryote inter-species PPIs using intra-species template PPIs. Results We develop a stringent homology-based prediction approach by taking into account (i) differences between eukaryotic and prokaryotic proteins and (ii) differences between inter-species and intra-species PPI interfaces. We compare our stringent homology-based approach to a conventional homology-based approach for predicting host-pathogen PPIs, based on cellular compartment distribution analysis, disease gene list enrichment analysis, pathway enrichment analysis and functional category enrichment analysis. These analyses support the validity of our prediction result, and clearly show that our approach has better performance in predicting H. sapiens-M. tuberculosis H37Rv PPIs. Using our stringent homology-based approach, we have predicted a set of highly plausible H. sapiens-M. tuberculosis H37Rv PPIs which might be useful for many related studies. Based on our analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent homology-based approach, we have discovered several interesting properties which are reported here for the first time. We find that both host proteins and pathogen proteins involved in the host-pathogen PPIs tend to be hubs in their own intra-species PPI network. Also, both host and pathogen proteins involved in host-pathogen PPIs tend to have longer primary sequences, tend to have more domains, tend to be more hydrophilic, etc. And the protein

  13. Qualitative mechanism models and the rationalization of procedures

    NASA Technical Reports Server (NTRS)

    Farley, Arthur M.

    1989-01-01

    A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.

  14. A Hybrid FPGA-Based System for EEG- and EMG-Based Online Movement Prediction

    PubMed Central

    Wöhrle, Hendrik; Tabie, Marc; Kim, Su Kyoung; Kirchner, Frank; Kirchner, Elsa Andrea

    2017-01-01

    A current trend in the development of assistive devices for rehabilitation, for example exoskeletons or active orthoses, is to utilize physiological data to enhance their functionality and usability, for example by predicting the patient's upcoming movements using electroencephalography (EEG) or electromyography (EMG). However, these modalities have different temporal properties and classification accuracies, which results in specific advantages and disadvantages. To use physiological data analysis in rehabilitation devices, the processing should be performed in real-time, guarantee close to natural movement onset support, provide high mobility, and should be performed by miniaturized systems that can be embedded into the rehabilitation device. We present a novel Field Programmable Gate Array (FPGA)-based system for real-time movement prediction using physiological data. Its parallel processing capabilities allow the combination of movement predictions based on EEG and EMG and additionally a P300 detection, which is likely evoked by instructions of the therapist. The system is evaluated in an offline and an online study with twelve healthy subjects in total. We show that it provides high computational performance and significantly lower power consumption in comparison to a standard PC. Furthermore, despite the usage of fixed-point computations, the proposed system achieves a classification accuracy similar to that of systems using double-precision floating-point arithmetic. PMID:28671632

  15. A Hybrid FPGA-Based System for EEG- and EMG-Based Online Movement Prediction.

    PubMed

    Wöhrle, Hendrik; Tabie, Marc; Kim, Su Kyoung; Kirchner, Frank; Kirchner, Elsa Andrea

    2017-07-03

    A current trend in the development of assistive devices for rehabilitation, for example exoskeletons or active orthoses, is to utilize physiological data to enhance their functionality and usability, for example by predicting the patient's upcoming movements using electroencephalography (EEG) or electromyography (EMG). However, these modalities have different temporal properties and classification accuracies, which results in specific advantages and disadvantages. To use physiological data analysis in rehabilitation devices, the processing should be performed in real-time, guarantee close to natural movement onset support, provide high mobility, and should be performed by miniaturized systems that can be embedded into the rehabilitation device. We present a novel Field Programmable Gate Array (FPGA)-based system for real-time movement prediction using physiological data. Its parallel processing capabilities allow the combination of movement predictions based on EEG and EMG and additionally a P300 detection, which is likely evoked by instructions of the therapist. The system is evaluated in an offline and an online study with twelve healthy subjects in total. We show that it provides high computational performance and significantly lower power consumption in comparison to a standard PC. Furthermore, despite the usage of fixed-point computations, the proposed system achieves a classification accuracy similar to that of systems using double-precision floating-point arithmetic.

  16. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.

  17. An Adaptive Handover Prediction Scheme for Seamless Mobility Based Wireless Networks

    PubMed Central

    Safa Sadiq, Ali; Fisal, Norsheila Binti; Ghafoor, Kayhan Zrar; Lloret, Jaime

    2014-01-01

    We propose an adaptive handover prediction (AHP) scheme for seamless mobility based wireless networks. That is, the AHP scheme incorporates fuzzy logic into the AP prediction process in order to lend cognitive capability to handover decision making. Selection metrics, including received signal strength, mobile node relative direction towards the access points in the vicinity, and access point load, are collected and considered inputs of the fuzzy decision-making system in order to select the most preferable AP among the surrounding WLANs. The obtained handover decision, which is based on the calculated quality cost using the fuzzy inference system, is also based on adaptable coefficients instead of fixed coefficients. In other words, the mean and the standard deviation of the normalized network prediction metrics of the fuzzy inference system, which are collected from the available WLANs, are obtained adaptively. Accordingly, they are applied as statistical information to adjust or adapt the coefficients of the membership functions. In addition, we propose an adjustable weight vector concept for the input metrics in order to cope with the continuous, unpredictable variation in their membership degrees. Furthermore, handover decisions are performed in each MN independently after knowing RSS, direction toward APs, and AP load. Finally, performance evaluation of the proposed scheme shows its superiority compared with representatives of the prediction approaches. PMID:25574490

  18. An adaptive handover prediction scheme for seamless mobility based wireless networks.

    PubMed

    Sadiq, Ali Safa; Fisal, Norsheila Binti; Ghafoor, Kayhan Zrar; Lloret, Jaime

    2014-01-01

    We propose an adaptive handover prediction (AHP) scheme for seamless mobility based wireless networks. That is, the AHP scheme incorporates fuzzy logic into the AP prediction process in order to lend cognitive capability to handover decision making. Selection metrics, including received signal strength, mobile node relative direction towards the access points in the vicinity, and access point load, are collected and considered inputs of the fuzzy decision-making system in order to select the most preferable AP among the surrounding WLANs. The obtained handover decision, which is based on the calculated quality cost using the fuzzy inference system, is also based on adaptable coefficients instead of fixed coefficients. In other words, the mean and the standard deviation of the normalized network prediction metrics of the fuzzy inference system, which are collected from the available WLANs, are obtained adaptively. Accordingly, they are applied as statistical information to adjust or adapt the coefficients of the membership functions. In addition, we propose an adjustable weight vector concept for the input metrics in order to cope with the continuous, unpredictable variation in their membership degrees. Furthermore, handover decisions are performed in each MN independently after knowing RSS, direction toward APs, and AP load. Finally, performance evaluation of the proposed scheme shows its superiority compared with representatives of the prediction approaches.
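
    The sketch below reduces the adaptive idea to a plain z-score-based quality score over the selection metrics (RSS, direction match, AP load) with an adjustable weight vector; it replaces the paper's fuzzy inference system and adaptive membership functions with a much simpler stand-in for illustration only, and the metric values are placeholders.

```python
# Simplified AP selection with adaptively normalised metrics: the mean and
# standard deviation are gathered from the currently visible WLANs and used to
# standardise each metric before applying an adjustable weight vector.
import numpy as np

def quality_scores(metrics, weights):
    """metrics: rows = candidate APs, columns = (RSS, direction match, load)."""
    mean = metrics.mean(axis=0)           # adaptive statistics from visible APs
    std = metrics.std(axis=0) + 1e-9
    z = (metrics - mean) / std
    z[:, 2] *= -1.0                       # lower load is better, so invert it
    return z @ weights

# Columns: received signal strength (dBm), direction match (cosine), AP load (%).
candidate_aps = np.array([
    [-62.0, 0.9, 55.0],
    [-70.0, 0.4, 20.0],
    [-58.0, 0.2, 80.0],
])
weights = np.array([0.5, 0.3, 0.2])       # adjustable weight vector
scores = quality_scores(candidate_aps, weights)
print("best AP index:", int(np.argmax(scores)))
```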

  19. Weather-based prediction of Plasmodium falciparum malaria in epidemic-prone regions of Ethiopia II. Weather-based prediction systems perform comparably to early detection systems in identifying times for interventions.

    PubMed

    Teklehaimanot, Hailay D; Schwartz, Joel; Teklehaimanot, Awash; Lipsitch, Marc

    2004-11-19

    Timely and accurate information about the onset of malaria epidemics is essential for effective control activities in epidemic-prone regions. Early warning methods that provide earlier alerts (usually by the use of weather variables) may permit control measures to interrupt transmission earlier in the epidemic, perhaps at the expense of some level of accuracy. Expected case numbers were modeled using a Poisson regression with lagged weather factors in a 4th-degree polynomial distributed lag model. For each week, the numbers of malaria cases were predicted using coefficients obtained using all years except that for which the prediction was being made. The effectiveness of alerts generated by the prediction system was compared against that of alerts based on observed cases. The usefulness of the prediction system was evaluated in cold and hot districts. The system predicts the overall pattern of cases well, yet underestimates the height of the largest peaks. Relative to alerts triggered by observed cases, the alerts triggered by the predicted number of cases performed slightly worse, within 5% of the detection system. The prediction-based alerts were able to prevent 10-25% more cases at a given sensitivity in cold districts than in hot ones. The prediction of malaria cases using lagged weather performed well in identifying periods of increased malaria cases. Weather-derived predictions identified epidemics with reasonable accuracy and better timeliness than early detection systems; therefore, the prediction of malarial epidemics using weather is a plausible alternative to early detection systems.
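
    A minimal sketch of the modelling idea, assuming a Poisson regression on a few fixed weather lags rather than the paper's fourth-degree polynomial distributed lag, and using simulated placeholder series, is shown below.

```python
# Minimal sketch of a weekly case-count model: Poisson regression on lagged
# weather covariates. All series below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
weeks = 260
df = pd.DataFrame({
    "rain": rng.gamma(2.0, 10.0, size=weeks),
    "temp": 20 + 5 * np.sin(2 * np.pi * np.arange(weeks) / 52),
})
# Simulated cases respond to rainfall ~8 weeks and temperature ~4 weeks earlier.
lam = np.exp(1.0 + 0.02 * df["rain"].shift(8).fillna(0)
                 + 0.05 * df["temp"].shift(4).fillna(20))
df["cases"] = rng.poisson(lam)

# Design matrix of simple fixed weather lags (stand-in for a distributed lag).
lagged = {f"rain_lag{l}": df["rain"].shift(l) for l in (4, 8, 12)}
lagged.update({f"temp_lag{l}": df["temp"].shift(l) for l in (2, 4, 8)})
X = pd.DataFrame(lagged).dropna()
y = df.loc[X.index, "cases"]

model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
expected = model.predict(sm.add_constant(X))    # expected weekly case counts
print(expected[:5])
```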

  20. Prediction of beta-turns at over 80% accuracy based on an ensemble of predicted secondary structures and multiple alignments

    PubMed Central

    Zheng, Ce; Kurgan, Lukasz

    2008-01-01

    Background β-turn is a secondary protein structure type that plays a significant role in protein folding, stability, and molecular recognition. To date, several methods for prediction of β-turns from protein sequences have been developed, but they are characterized by relatively poor prediction quality. The novelty of the proposed sequence-based β-turn predictor stems from the usage of window-based information extracted from four predicted three-state secondary structures, which together with a selected set of position specific scoring matrix (PSSM) values serve as an input to the support vector machine (SVM) predictor. Results We show that (1) all four predicted secondary structures are useful; (2) the most useful information extracted from the predicted secondary structure includes the structure of the predicted residue, secondary structure content in a window around the predicted residue, and features that indicate whether the predicted residue is inside a secondary structure segment; (3) the PSSM values of Asn, Asp, Gly, Ile, Leu, Met, Pro, and Val were among the top ranked features, which corroborates recent studies. The Asn, Asp, Gly, and Pro indicate potential β-turns, while the remaining four amino acids are useful to predict non-β-turns. Empirical evaluation using three nonredundant datasets shows favorable Qtotal, Qpredicted and MCC values when compared with over a dozen of modern competing methods. Our method is the first to break the 80% Qtotal barrier and achieves Qtotal = 80.9%, MCC = 0.47, and Qpredicted higher by over 6% when compared with the second best method. We use feature selection to reduce the dimensionality of the feature vector used as the input for the proposed prediction method. The applied feature set is smaller by 86, 62 and 37% when compared with the second and two third-best (with respect to MCC) competing methods, respectively. Conclusion Experiments show that the proposed method constitutes an improvement over the competing
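
    A minimal sketch of the window-based feature construction feeding an SVM is shown below; the PSSM values, predicted secondary structure, labels, and window size are random or assumed placeholders rather than the features and data sets used in the paper.

```python
# Minimal sketch of window-based beta-turn prediction: for each residue, stack
# PSSM columns and one-hot predicted secondary structure over a +/-3 window and
# feed the concatenated vector to an SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
L = 300                                    # protein length (placeholder)
pssm = rng.normal(size=(L, 20))            # position-specific scoring matrix
ss = np.eye(3)[rng.integers(0, 3, L)]      # one-hot H/E/C predicted structure
labels = rng.integers(0, 2, L)             # 1 = beta-turn residue (placeholder)

def window_features(per_residue, half_window=3):
    padded = np.pad(per_residue, ((half_window, half_window), (0, 0)))
    width = 2 * half_window + 1
    return np.array([padded[i:i + width].ravel() for i in range(len(per_residue))])

X = np.hstack([window_features(pssm), window_features(ss)])
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X[:250], labels[:250])
print("accuracy on held-out residues:", clf.score(X[250:], labels[250:]))
```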

  1. A Study Combining Criticism and Qualitative Research Techniques for Appraising Classroom Media.

    ERIC Educational Resources Information Center

    Swartz, James D.

    Qualitative criticism is a method of understanding things, actions, and events within a social framework. It is a method of acquiring knowledge to guide decision making based on local knowledge and a synthesis of principles from criticism and qualitative research. The function of qualitative criticism is centered with Richard Rorty's theoretical…

  2. Qualitative Student Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model is used as the focus of this review of qualitative student models in order to compare alternative computational models and to contrast domain requirements. The report is divided into eight sections: (1) Origins and Goals (adaptive instruction, qualitative models of processes, components of an artificial…

  3. Physics-based protein-structure prediction using a hierarchical protocol based on the UNRES force field: assessment in two blind tests.

    PubMed

    Ołdziej, S; Czaplewski, C; Liwo, A; Chinchio, M; Nanias, M; Vila, J A; Khalili, M; Arnautova, Y A; Jagielska, A; Makowski, M; Schafroth, H D; Kaźmierkiewicz, R; Ripoll, D R; Pillardy, J; Saunders, J A; Kang, Y K; Gibson, K D; Scheraga, H A

    2005-05-24

    Recent improvements in the protein-structure prediction method developed in our laboratory, based on the thermodynamic hypothesis, are described. The conformational space is searched extensively at the united-residue level by using our physics-based UNRES energy function and the conformational space annealing method of global optimization. The lowest-energy coarse-grained structures are then converted to an all-atom representation and energy-minimized with the ECEPP/3 force field. The procedure was assessed in two recent blind tests of protein-structure prediction. During the first blind test, we predicted large fragments of alpha and alpha+beta proteins [60-70 residues with C(alpha) rms deviation (rmsd) <6 A]. However, for alpha+beta proteins, significant topological errors occurred despite low rmsd values. In the second exercise, we predicted whole structures of five proteins (two alpha and three alpha+beta, with sizes of 53-235 residues) with remarkably good accuracy. In particular, for the genomic target TM0487 (a 102-residue alpha+beta protein from Thermotoga maritima), we predicted the complete, topologically correct structure with 7.3-A C(alpha) rmsd. So far this protein is the largest alpha+beta protein predicted based solely on the amino acid sequence and a physics-based potential-energy function and search procedure. For target T0198, a phosphate transport system regulator PhoU from T. maritima (a 235-residue mainly alpha-helical protein), we predicted the topology of the whole six-helix bundle correctly within 8 A rmsd, except the 32 C-terminal residues, most of which form a beta-hairpin. These and other examples described in this work demonstrate significant progress in physics-based protein-structure prediction.

  4. Bias-adjusted satellite-based rainfall estimates for predicting floods: Narayani Basin

    USGS Publications Warehouse

    Shrestha, M.S.; Artan, G.A.; Bajracharya, S.R.; Gautam, D.K.; Tokar, S.A.

    2011-01-01

    In Nepal, as the spatial distribution of rain gauges is not sufficient to provide a detailed perspective on the highly varied spatial nature of rainfall, satellite-based rainfall estimates provide the opportunity for timely estimation. This paper presents the flood prediction of the Narayani Basin at the Devghat hydrometric station (32,000 km2) using bias-adjusted satellite rainfall estimates and the Geospatial Stream Flow Model (GeoSFM), a spatially distributed, physically based hydrologic model. The GeoSFM, with gridded gauge-observed rainfall inputs obtained by kriging interpolation, was calibrated on 2003 data and validated on 2004 data to simulate stream flow, with both years having a Nash-Sutcliffe efficiency above 0.7. With the National Oceanic and Atmospheric Administration Climate Prediction Centre's rainfall estimates (CPC-RFE2.0), using the same calibrated parameters, the model performance for 2003 deteriorated but improved after recalibration with CPC-RFE2.0, indicating the need to recalibrate the model with satellite-based rainfall estimates. Adjusting the CPC-RFE2.0 by a seasonal, monthly and 7-day moving average ratio, an improvement in model performance was achieved. Furthermore, a new gauge-satellite merged rainfall estimate obtained from ingestion of local rain gauge data resulted in a significant improvement in flood predictability. The results indicate the applicability of satellite-based rainfall estimates in flood prediction with appropriate bias correction. © 2011 The Authors. Journal of Flood Risk Management © 2011 The Chartered Institution of Water and Environmental Management.
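
    A minimal sketch of the monthly bias-ratio adjustment, assuming placeholder daily gauge and satellite series, is shown below; the seasonal and 7-day moving-average ratios mentioned above would follow the same pattern.

```python
# Minimal sketch of the monthly bias-ratio adjustment: scale each daily
# satellite estimate by the ratio of gauge to satellite monthly totals.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
dates = pd.date_range("2003-01-01", "2003-12-31", freq="D")
gauge = pd.Series(rng.gamma(2.0, 5.0, len(dates)), index=dates)       # mm/day
satellite = gauge * 0.7 + rng.normal(0.0, 1.0, len(dates)).clip(0)    # biased low

# Monthly bias ratio: gauge total divided by satellite total for each month.
monthly_ratio = (gauge.groupby(gauge.index.month).sum()
                 / satellite.groupby(satellite.index.month).sum())

# Rescale each daily satellite estimate by its month's ratio.
adjusted = satellite * satellite.index.month.map(monthly_ratio).to_numpy()
print("bias before:", satellite.sum() / gauge.sum())
print("bias after: ", adjusted.sum() / gauge.sum())    # close to 1.0
```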

  5. Anesthesiologists' learning curves for bedside qualitative ultrasound assessment of gastric content: a cohort study.

    PubMed

    Arzola, Cristian; Carvalho, Jose C A; Cubillos, Javier; Ye, Xiang Y; Perlas, Anahi

    2013-08-01

    Focused assessment of the gastric antrum by ultrasound is a feasible tool to evaluate the quality of the stomach content. We aimed to determine the amount of training an anesthesiologist would need to achieve competence in the bedside ultrasound technique for qualitative assessment of gastric content. Six anesthesiologists underwent a teaching intervention followed by a formative assessment; then learning curves were constructed. Participants received didactic teaching (reading material, picture library, and lecture) and an interactive hands-on workshop on live models directed by an expert sonographer. The participants were instructed on how to perform a systematic qualitative assessment to diagnose one of three distinct categories of gastric content (empty, clear fluid, solid) in healthy volunteers. Individual learning curves were constructed using the cumulative sum method, and competence was defined as a 90% success rate in a series of ultrasound examinations. A predictive model was further developed based on the entire cohort performance to determine the number of cases required to achieve a 95% success rate. Each anesthesiologist performed 30 ultrasound examinations (a total of 180 assessments), and three of the six participants achieved competence. The average number of cases required to achieve 90% and 95% success rates was estimated to be 24 and 33, respectively. With appropriate training and supervision, it is estimated that anesthesiologists will achieve a 95% success rate in bedside qualitative ultrasound assessment after performing approximately 33 examinations.
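
    The sketch below illustrates one simple form of cumulative-sum learning chart, the running sum of observed failures minus a target failure rate over consecutive assessments; the 10% target and the simulated outcomes are placeholders and do not reproduce the study's CUSUM parameters or competence thresholds.

```python
# Illustrative cumulative-sum learning chart for a sequence of pass/fail
# gastric ultrasound assessments. A flat or falling trajectory indicates
# performance at or better than the target failure rate.
import numpy as np

rng = np.random.default_rng(6)
target_failure_rate = 0.10                # corresponds to a 90% success target
# Simulated trainee: failure probability drops from 40% to 5% over 30 exams.
p_fail = np.linspace(0.40, 0.05, 30)
failures = rng.random(30) < p_fail

cusum = np.cumsum(failures - target_failure_rate)
for i, value in enumerate(cusum, start=1):
    bar = "#" * max(int(value * 10), 0)
    print(f"exam {i:2d}  cusum {value:+.2f}  {bar}")
# Competence is suggested once the chart stops rising and trends downward.
```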

  6. Crack propagation analysis and fatigue life prediction for structural alloy steel based on metal magnetic memory testing

    NASA Astrophysics Data System (ADS)

    Ni, Chen; Hua, Lin; Wang, Xiaokai

    2018-09-01

    To monitor crack propagation and predict the fatigue life of ferromagnetic material, metal magnetic memory (MMM) testing was applied in this paper to a single-edge-notched specimen made from structural alloy steel in a three-point bending fatigue experiment. The variation of the magnetic memory signal Hp(y) during fatigue crack propagation was investigated. The gradient K of Hp(y) was examined and compared with the stress of the specimen obtained by finite element analysis, indicating that the gradient K can qualitatively reflect the distribution and variation of stress. The maximum gradient Kmax and the crack size showed a good linear relationship, which indicated that crack propagation can be estimated by MMM testing. Furthermore, a damage model represented by the magnetic memory characteristic was created and a fatigue life prediction method was developed. The fatigue life can be evaluated from the relationship between the damage parameter and the normalized life. The method was also verified on another specimen. MMM testing thus provides a new approach for predicting fatigue life.
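
    A minimal sketch of the reported linear relationship between the maximum gradient Kmax and crack size, and its inversion to estimate crack length from a new MMM measurement, is shown below with placeholder numbers.

```python
# Minimal sketch of the linear Kmax-vs-crack-length relation and its inversion.
# All numbers are placeholders, not the measured data from the paper.
import numpy as np

crack_length_mm = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
k_max = np.array([12.0, 18.5, 25.1, 31.4, 38.2, 44.9])   # gradient of Hp(y)

slope, intercept = np.polyfit(crack_length_mm, k_max, deg=1)
print(f"Kmax ~ {slope:.2f} * a + {intercept:.2f}")

def crack_from_kmax(k):
    """Invert the fitted linear relation to estimate crack length from Kmax."""
    return (k - intercept) / slope

print(f"estimated crack length at Kmax=30: {crack_from_kmax(30.0):.2f} mm")
```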

  7. EP-DNN: A Deep Neural Network-Based Global Enhancer Prediction Algorithm.

    PubMed

    Kim, Seong Gon; Harwani, Mrudul; Grama, Ananth; Chaterji, Somali

    2016-12-08

    We present EP-DNN, a protocol for predicting enhancers based on chromatin features, in different cell types. Specifically, we use a deep neural network (DNN)-based architecture to extract enhancer signatures in a representative human embryonic stem cell type (H1) and a differentiated lung cell type (IMR90). We train EP-DNN using p300 binding sites, as enhancers, and TSS and random non-DHS sites, as non-enhancers. We perform same-cell and cross-cell predictions to quantify the validation rate and compare against two state-of-the-art methods, DEEP-ENCODE and RFECS. We find that EP-DNN has superior accuracy with a validation rate of 91.6%, relative to 85.3% for DEEP-ENCODE and 85.5% for RFECS, for a given number of enhancer predictions and also scales better for a larger number of enhancer predictions. Moreover, our H1 → IMR90 predictions turn out to be more accurate than IMR90 → IMR90, potentially because H1 exhibits a richer signature set and our EP-DNN model is expressive enough to extract these subtleties. Our work shows how to leverage the full expressivity of deep learning models, using multiple hidden layers, while avoiding overfitting on the training data. We also lay the foundation for exploration of cross-cell enhancer predictions, potentially reducing the need for expensive experimentation.

  8. EP-DNN: A Deep Neural Network-Based Global Enhancer Prediction Algorithm

    NASA Astrophysics Data System (ADS)

    Kim, Seong Gon; Harwani, Mrudul; Grama, Ananth; Chaterji, Somali

    2016-12-01

    We present EP-DNN, a protocol for predicting enhancers based on chromatin features, in different cell types. Specifically, we use a deep neural network (DNN)-based architecture to extract enhancer signatures in a representative human embryonic stem cell type (H1) and a differentiated lung cell type (IMR90). We train EP-DNN using p300 binding sites, as enhancers, and TSS and random non-DHS sites, as non-enhancers. We perform same-cell and cross-cell predictions to quantify the validation rate and compare against two state-of-the-art methods, DEEP-ENCODE and RFECS. We find that EP-DNN has superior accuracy with a validation rate of 91.6%, relative to 85.3% for DEEP-ENCODE and 85.5% for RFECS, for a given number of enhancer predictions and also scales better for a larger number of enhancer predictions. Moreover, our H1 → IMR90 predictions turn out to be more accurate than IMR90 → IMR90, potentially because H1 exhibits a richer signature set and our EP-DNN model is expressive enough to extract these subtleties. Our work shows how to leverage the full expressivity of deep learning models, using multiple hidden layers, while avoiding overfitting on the training data. We also lay the foundation for exploration of cross-cell enhancer predictions, potentially reducing the need for expensive experimentation.

  9. Network-based ranking methods for prediction of novel disease associated microRNAs.

    PubMed

    Le, Duc-Hau

    2015-10-01

    Many studies have shown roles of microRNAs in human disease, and a number of computational methods have been proposed to predict such associations by ranking candidate microRNAs according to their relevance to a disease. Among them, machine learning-based methods usually have a limitation in specifying non-disease microRNAs as negative training samples. Meanwhile, network-based methods are becoming dominant since they exploit the "disease module" principle in microRNA functional similarity networks well. Among these, the random walk with restart (RWR) algorithm is currently the state of the art. The use of this algorithm was inspired by its success in predicting disease genes, because the "disease module" principle also exists in protein interaction networks. In addition, many algorithms designed for webpage ranking have been successfully applied to ranking disease candidate genes, because web networks share topological properties with protein interaction networks. However, these algorithms have not yet been utilized for disease microRNA prediction. We constructed microRNA functional similarity networks based on shared targets of microRNAs, and then integrated them with a recently identified microRNA functional synergistic network. After analyzing the topological properties of these networks, in addition to RWR, we assessed the performance of (i) PRINCE (PRIoritizatioN and Complex Elucidation), which was proposed for disease gene prediction; (ii) PageRank with Priors (PRP) and K-Step Markov (KSM), which were used for studying web networks; and (iii) a neighborhood-based algorithm. Analyses of topological properties showed that all microRNA functional similarity networks are small-world and scale-free. The performance of each algorithm was assessed based on average AUC values on 35 disease phenotypes and average rankings of newly discovered disease microRNAs. As a result, the performance on the integrated network was better than that on the individual ones.
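    The core RWR ranking step can be written in a few lines. The sketch below is a generic random-walk-with-restart over a toy similarity matrix, with the restart probability and network chosen arbitrarily for illustration; it is not the authors' implementation.

    ```python
    import numpy as np

    def random_walk_with_restart(W, seed_idx, restart=0.3, tol=1e-8, max_iter=1000):
        """Rank candidate nodes (e.g. microRNAs) by proximity to disease seed nodes.

        W        : symmetric adjacency/similarity matrix of the functional network
        seed_idx : indices of microRNAs already known to be associated with the disease
        """
        col_sums = W.sum(axis=0)
        col_sums[col_sums == 0] = 1.0
        M = W / col_sums                          # column-normalised transition matrix

        p0 = np.zeros(W.shape[0])
        p0[seed_idx] = 1.0 / len(seed_idx)        # restart distribution over the seeds
        p = p0.copy()
        for _ in range(max_iter):
            p_next = (1 - restart) * M @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                break
            p = p_next
        return p                                  # steady-state visiting probabilities = ranking scores

    # Toy similarity network with 5 microRNAs, seed set = {0}
    W = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    print(random_walk_with_restart(W, seed_idx=[0]).round(3))
    ```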

  10. HomPPI: a class of sequence homology based protein-protein interface prediction methods

    PubMed Central

    2011-01-01

    Background Although homology-based methods are among the most widely used methods for predicting the structure and function of proteins, the question as to whether interface sequence conservation can be effectively exploited in predicting protein-protein interfaces has been a subject of debate. Results We studied more than 300,000 pair-wise alignments of protein sequences from structurally characterized protein complexes, including both obligate and transient complexes. We identified sequence similarity criteria required for accurate homology-based inference of interface residues in a query protein sequence. Based on these analyses, we developed HomPPI, a class of sequence homology-based methods for predicting protein-protein interface residues. We present two variants of HomPPI: (i) NPS-HomPPI (Non partner-specific HomPPI), which can be used to predict interface residues of a query protein in the absence of knowledge of the interaction partner; and (ii) PS-HomPPI (Partner-specific HomPPI), which can be used to predict the interface residues of a query protein with a specific target protein. Our experiments on a benchmark dataset of obligate homodimeric complexes show that NPS-HomPPI can reliably predict protein-protein interface residues in a given protein, with an average correlation coefficient (CC) of 0.76, sensitivity of 0.83, and specificity of 0.78, when sequence homologs of the query protein can be reliably identified. NPS-HomPPI also reliably predicts the interface residues of intrinsically disordered proteins. Our experiments suggest that NPS-HomPPI is competitive with several state-of-the-art interface prediction servers including those that exploit the structure of the query proteins. The partner-specific classifier, PS-HomPPI can, on a large dataset of transient complexes, predict the interface residues of a query protein with a specific target, with a CC of 0.65, sensitivity of 0.69, and specificity of 0.70, when homologs of both the query and the

  11. Improving local clustering based top-L link prediction methods via asymmetric link clustering information

    NASA Astrophysics Data System (ADS)

    Wu, Zhihao; Lin, Youfang; Zhao, Yiji; Yan, Hongyan

    2018-02-01

    Networks can represent a wide range of complex systems, such as social, biological and technological systems. Link prediction is one of the most important problems in network analysis, and has attracted much research interest recently. Many link prediction methods have been proposed to solve this problem with various techniques, and clustering information plays an important role in solving it. In the previous literature, the node clustering coefficient appears frequently in many link prediction methods. However, the node clustering coefficient is limited in describing the role of a common neighbor in different local networks, because it cannot distinguish a node's different clustering abilities with respect to different node pairs. In this paper, we shift our focus from nodes to links, and propose the concept of the asymmetric link clustering (ALC) coefficient. Further, we improve three node-clustering-based link prediction methods via the concept of ALC. The experimental results demonstrate that ALC-based methods outperform node-clustering-based methods, achieving especially remarkable improvements on food web, hamster friendship and Internet networks. Moreover, compared with other methods, the performance of ALC-based methods is very stable in both globalized and personalized top-L link prediction tasks.
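    For context, the node-clustering baseline that the ALC coefficient refines can be sketched as a common-neighbour score weighted by each neighbour's clustering coefficient; the graph and the function name below are illustrative, and the ALC refinement itself is not reproduced here.

    ```python
    import networkx as nx

    G = nx.karate_club_graph()
    cc = nx.clustering(G)                      # node clustering coefficients, computed once

    def cclp_score(u, v):
        """Common-neighbour score weighted by each neighbour's clustering coefficient
        (the node-clustering baseline; the paper's ALC coefficient refines this per pair)."""
        return sum(cc[z] for z in nx.common_neighbors(G, u, v))

    # Globalized top-L prediction: rank all currently unconnected pairs by the score
    ranked = sorted(nx.non_edges(G), key=lambda e: cclp_score(*e), reverse=True)
    print(ranked[:5])
    ```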

  12. A group evolving-based framework with perturbations for link prediction

    NASA Astrophysics Data System (ADS)

    Si, Cuiqi; Jiao, Licheng; Wu, Jianshe; Zhao, Jin

    2017-06-01

    Link prediction is a ubiquitous application in many fields which uses partially observed information to predict the absence or presence of links between node pairs. The study of group evolution provides reasonable explanations for the behaviors of nodes, the relations between nodes, and community formation in a network. Possible events in group evolution include continuing, growing, splitting, forming and so on, and the changes discovered in networks are to some extent the result of these events. In this work, we present a group-evolution-based characterization of nodes' behavioral patterns, via which we can estimate the probability that they will interact. The primary aim of this paper is to offer a minimal toy model that detects missing links based on the evolution of groups and to give a simple explanation of the rationality of the model. We first introduce perturbations into networks to obtain stable cluster structures, and the stable clusters determine the stability of each node. Then fluctuations, another node behavior, are estimated from the participation of each node in its own group. Finally, we demonstrate that such characteristics allow us to predict link existence and propose a model for link prediction which outperforms many classical methods with lower computational time at large scales. Encouraging experimental results obtained on real networks show that our approach can effectively predict missing links in networks, and even when nearly 40% of the edges are missing, it retains stable performance.

  13. A Qualitative Phenomenology of Christian Middle School Implementation of Inquiry-based Science Instruction

    NASA Astrophysics Data System (ADS)

    Ferrin, Patricia Ann

    The purpose of this qualitative phenomenological study is to explore curriculum coordinators', teachers', and principals' implementation of Inquiry-Based Instruction (IBI) in Christian middle school science classes in the central Virginia area. IBI is defined as "a teaching method that combines the curiosity of students and the scientific method to enhance the development of critical thinking skills while learning science" (Warner & Myers, 2008, p. 3). The study considers the requirements and implementation of IBI in Christian middle schools as compared to the requirements and implementation of IBI in the National Science Education Standards (NSES). Curriculum coordinators, teachers, and principals from five Christian middle schools in the central Virginia area participated in this study. The guiding theories include John Dewey's (1948) Constructivism, Lev Vygotsky's (1998) Social Constructivism, and William Glasser's (2005) Choice Theory as they relate to the beliefs curriculum coordinators, teachers, and principals hold regarding the implementation of IBI. A primary research question for this study is, "If research supports successful outcomes of IBI, then how and why do Christian CMSST, principals, and curriculum coordinators implement or not implement IBI?" Interviews, classroom observations, and document reviews were used for triangulation and data collection. Data analysis was completed using Moustakas' (1994) seven-step thematic coding applied to the observations, interview transcriptions, and school documents in the form of lesson plans and objectives (Merriam, 2009; Moustakas, 1994).

  14. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    NASA Astrophysics Data System (ADS)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association rule-based predictive model for machine failure in industrial Internet of things (IIoT), which can accurately predict the machine failure in real manufacturing environment by investigating the relationship between the cause and type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, 3) visualization. The binarization step translates item values in a dataset into one or zero, then the rule creation step creates association rules as IF-THEN structures using the Lattice model and Apriori algorithm. Finally, the created rules are visualized in various ways for users’ understanding. An experimental implementation was conducted using R Studio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
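    The binarization and rule-creation steps map directly onto standard Apriori tooling. The sketch below uses Python's mlxtend as a stand-in for the authors' R workflow; the event columns and the support and confidence thresholds are hypothetical.

    ```python
    import pandas as pd
    from mlxtend.frequent_patterns import apriori, association_rules

    # Step 1 (binarization): each row is a machine-event record, columns are one/zero indicators.
    # The indicators below are hypothetical, not the paper's dataset.
    records = pd.DataFrame({
        "high_vibration": [1, 1, 0, 1, 0, 1],
        "overheating":    [1, 0, 0, 1, 1, 1],
        "bearing_fault":  [1, 0, 0, 1, 0, 1],
        "belt_failure":   [0, 0, 1, 0, 1, 0],
    }).astype(bool)

    # Step 2 (rule creation): frequent itemsets via Apriori, then IF-THEN association rules.
    itemsets = apriori(records, min_support=0.3, use_colnames=True)
    rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)

    # Step 3 (visualization) would plot these; here we just inspect the strongest rules.
    print(rules[["antecedents", "consequents", "support", "confidence"]])
    ```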

  15. Structural protein descriptors in 1-dimension and their sequence-based predictions.

    PubMed

    Kurgan, Lukasz; Disfani, Fatemeh Miri

    2011-09-01

    The last few decades observed an increasing interest in the development and application of 1-dimensional (1D) descriptors of protein structure. These descriptors project 3D structural features onto 1D strings of residue-wise structural assignments. They cover a wide range of structural aspects including conformation of the backbone, burying depth/solvent exposure and flexibility of residues, and inter-chain residue-residue contacts. We perform a first-of-its-kind comprehensive comparative review of the existing 1D structural descriptors. We define, review and categorize ten structural descriptors, and we also describe, summarize and contrast over eighty computational models that are used to predict these descriptors from protein sequences. We show that the majority of the recent sequence-based predictors utilize machine learning models, with the most popular being neural networks, support vector machines, hidden Markov models, and support vector and linear regressions. These methods provide high-throughput predictions and most of them are accessible to a non-expert user via web servers and/or stand-alone software packages. We empirically evaluate several recent sequence-based predictors of secondary structure, disorder, and solvent accessibility descriptors using a benchmark set based on CASP8 targets. Our analysis shows that secondary structure can be predicted with over 80% accuracy and segment overlap (SOV), disorder with over 0.9 AUC, 0.6 Matthews Correlation Coefficient (MCC), and 75% SOV, and relative solvent accessibility with a PCC of 0.7 and an MCC of 0.6 (0.86 when homology is used). We demonstrate that the secondary structure predicted from sequence without the use of homology modeling is as good as the structure extracted from the 3D folds predicted by top-performing template-based methods.

  16. SVM-Based System for Prediction of Epileptic Seizures from iEEG Signal

    PubMed Central

    Cherkassky, Vladimir; Lee, Jieun; Veber, Brandon; Patterson, Edward E.; Brinkmann, Benjamin H.; Worrell, Gregory A.

    2017-01-01

    Objective This paper describes a data-analytic modeling approach for prediction of epileptic seizures from intracranial electroencephalogram (iEEG) recording of brain activity. Even though it is widely accepted that statistical characteristics of iEEG signal change prior to seizures, robust seizure prediction remains a challenging problem due to subject-specific nature of data-analytic modeling. Methods Our work emphasizes understanding of clinical considerations important for iEEG-based seizure prediction, and proper translation of these clinical considerations into data-analytic modeling assumptions. Several design choices during pre-processing and post-processing are considered and investigated for their effect on seizure prediction accuracy. Results Our empirical results show that the proposed SVM-based seizure prediction system can achieve robust prediction of preictal and interictal iEEG segments from dogs with epilepsy. The sensitivity is about 90–100%, and the false-positive rate is about 0–0.3 times per day. The results also suggest good prediction is subject-specific (dog or human), in agreement with earlier studies. Conclusion Good prediction performance is possible only if the training data contain sufficiently many seizure episodes, i.e., at least 5–7 seizures. Significance The proposed system uses subject-specific modeling and unbalanced training data. This system also utilizes three different time scales during training and testing stages. PMID:27362758
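    A bare-bones version of such a preictal/interictal classifier is sketched below; the feature matrix, class imbalance and SVM hyperparameters are placeholders, and the subject-specific, multi-time-scale design of the actual system is not reproduced.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    # Hypothetical features from iEEG segments (e.g. spectral power per channel);
    # labels: 1 = preictal, 0 = interictal. Preictal segments are rare, so the
    # classifier uses class_weight='balanced' to mimic unbalanced training data.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(600, 48))
    y = (rng.random(600) < 0.1).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    clf = make_pipeline(StandardScaler(),
                        SVC(kernel="rbf", C=1.0, gamma="scale", class_weight="balanced"))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```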

  17. Predicting seizure by modeling synaptic plasticity based on EEG signals - a case study of inherited epilepsy

    NASA Astrophysics Data System (ADS)

    Zhang, Honghui; Su, Jianzhong; Wang, Qingyun; Liu, Yueming; Good, Levi; Pascual, Juan M.

    2018-03-01

    This paper explores the internal dynamical mechanisms of epileptic seizures through quantitative modeling based on full brain electroencephalogram (EEG) signals. Our goal is to provide seizure prediction and facilitate treatment for epileptic patients. Motivated by an earlier mathematical model with incorporated synaptic plasticity, we studied the nonlinear dynamics of inherited seizures through a differential equation model. First, driven by a set of clinical inherited electroencephalogram data recorded from a patient with diagnosed Glucose Transporter Deficiency, we developed a dynamic seizure model on a system of ordinary differential equations. The model was reduced in complexity after considering and removing redundancy of each EEG channel. Then we verified that the proposed model produces qualitatively relevant behavior which matches the basic experimental observations of inherited seizure, including synchronization index and frequency. Meanwhile, the rationality of the connectivity structure hypothesis in the modeling process was verified. Further, through varying the threshold condition and excitation strength of synaptic plasticity, we elucidated the effect of synaptic plasticity to our seizure model. Results suggest that synaptic plasticity has great effect on the duration of seizure activities, which support the plausibility of therapeutic interventions for seizure control.

  18. Managing Everyday Life: A Qualitative Study of Patients’ Experiences of a Web-Based Ulcer Record for Home-Based Treatment

    PubMed Central

    Trondsen, Marianne V.

    2014-01-01

    Chronic skin ulcers are a significant challenge for patients and health service resources, and ulcer treatment often requires the competence of a specialist. Although e-health interventions are increasingly valued for ulcer care by giving access to specialists at a distance, there is limited research on patients’ use of e-health services for home-based ulcer treatment. This article reports an exploratory qualitative study of the first Norwegian web-based counselling service for home-based ulcer treatment, established in 2011 by the University Hospital of North Norway (UNN). Community nurses, general practitioners (GPs) and patients are offered access to a web-based record system to optimize ulcer care. The web-based ulcer record enables the exchange and storage of digital photos and clinical information, by the use of which, an ulcer team at UNN, consisting of specialized nurses and dermatologists, is accessible within 24 h. This article explores patients’ experiences of using the web-based record for their home-based ulcer treatment without assistance from community nurses. Semi-structured interviews were conducted with a total of four patients who had used the record. The main outcomes identified were: autonomy and flexibility; safety and trust; involvement and control; and motivation and hope. These aspects improved the patients’ everyday life during long-term ulcer care and can be understood as stimulating patient empowerment. PMID:27429289

  19. Predicting Fluctuations in Cryptocurrency Transactions Based on User Comments and Replies.

    PubMed

    Kim, Young Bin; Kim, Jun Gi; Kim, Wook; Im, Jae Ho; Kim, Tae Hyeong; Kang, Shin Jin; Kim, Chang Hun

    2016-01-01

    This paper proposes a method to predict fluctuations in the prices of cryptocurrencies, which are increasingly used for online transactions worldwide. Little research has been conducted on predicting fluctuations in the price and number of transactions of a variety of cryptocurrencies. Moreover, the few methods proposed to predict fluctuation in currency prices are inefficient because they fail to take into account the differences in attributes between real currencies and cryptocurrencies. This paper analyzes user comments in online cryptocurrency communities to predict fluctuations in the prices of cryptocurrencies and the number of transactions. By focusing on three cryptocurrencies, each with a large market size and user base, this paper attempts to predict such fluctuations by using a simple and efficient method.

  20. Predicting Fluctuations in Cryptocurrency Transactions Based on User Comments and Replies

    PubMed Central

    Kim, Young Bin; Kim, Jun Gi; Kim, Wook; Im, Jae Ho; Kim, Tae Hyeong; Kang, Shin Jin; Kim, Chang Hun

    2016-01-01

    This paper proposes a method to predict fluctuations in the prices of cryptocurrencies, which are increasingly used for online transactions worldwide. Little research has been conducted on predicting fluctuations in the price and number of transactions of a variety of cryptocurrencies. Moreover, the few methods proposed to predict fluctuation in currency prices are inefficient because they fail to take into account the differences in attributes between real currencies and cryptocurrencies. This paper analyzes user comments in online cryptocurrency communities to predict fluctuations in the prices of cryptocurrencies and the number of transactions. By focusing on three cryptocurrencies, each with a large market size and user base, this paper attempts to predict such fluctuations by using a simple and efficient method. PMID:27533113

  1. Predicting Social Anxiety Treatment Outcome Based on Therapeutic Email Conversations.

    PubMed

    Hoogendoorn, Mark; Berger, Thomas; Schulz, Ava; Stolz, Timo; Szolovits, Peter

    2017-09-01

    Predicting therapeutic outcome in the mental health domain is of utmost importance to enable therapists to provide the most effective treatment to a patient. Using information from the writings of a patient can potentially be a valuable source of information, especially now that more and more treatments involve computer-based exercises or electronic conversations between patient and therapist. In this paper, we study predictive modeling using writings of patients under treatment for a social anxiety disorder. We extract a wealth of information from the text written by patients including their usage of words, the topics they talk about, the sentiment of the messages, and the style of writing. In addition, we study trends over time with respect to those measures. We then apply machine learning algorithms to generate the predictive models. Based on a dataset of 69 patients, we are able to show that we can predict therapy outcome with an area under the curve of 0.83 halfway through the therapy and with a precision of 0.78 when using the full data (i.e., the entire treatment period). Due to the limited number of participants, it is hard to generalize the results, but they do show great potential in this type of information.
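    A minimal text-to-outcome pipeline along these lines is sketched below; the example emails, labels and the choice of TF-IDF plus logistic regression are illustrative stand-ins for the richer feature set (topics, sentiment, style, trends) used in the study.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical patient messages and outcomes (1 = improved, 0 = not improved)
    emails = ["I managed to speak at the meeting and felt okay afterwards",
              "I avoided the party again, the anxiety was overwhelming",
              "talking to my colleague was easier this week",
              "I keep cancelling appointments because I feel judged"]
    outcome = [1, 0, 1, 0]

    # Word-usage features feeding a simple classifier
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
    model.fit(emails, outcome)
    print(model.predict_proba(["I joined the conversation without panicking"])[:, 1])
    ```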

  2. Ensemble-based docking: From hit discovery to metabolism and toxicity predictions.

    PubMed

    Evangelista, Wilfredo; Weir, Rebecca L; Ellingson, Sally R; Harris, Jason B; Kapoor, Karan; Smith, Jeremy C; Baudry, Jerome

    2016-10-15

    This paper describes and illustrates the use of ensemble-based docking, i.e., using a collection of protein structures in docking calculations for hit discovery, the exploration of biochemical pathways and toxicity prediction of drug candidates. We describe the computational engineering work necessary to enable large ensemble docking campaigns on supercomputers. We show examples where ensemble-based docking has significantly increased the number and the diversity of validated drug candidates. Finally, we illustrate how ensemble-based docking can be extended beyond hit discovery and toward providing a structural basis for the prediction of metabolism and off-target binding relevant to pre-clinical and clinical trials. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Next Place Prediction Based on Spatiotemporal Pattern Mining of Mobile Device Logs.

    PubMed

    Lee, Sungjun; Lim, Junseok; Park, Jonghun; Kim, Kwanho

    2016-01-23

    Due to the recent explosive growth of location-aware services based on mobile devices, predicting the next places of a user is of increasing importance to enable proactive information services. In this paper, we introduce a data-driven framework that aims to predict the user's next places using his/her past visiting patterns analyzed from mobile device logs. Specifically, the notion of the spatiotemporal-periodic (STP) pattern is proposed to capture the visits with spatiotemporal periodicity by focusing on a detail level of location for each individual. Subsequently, we present algorithms that extract the STP patterns from a user's past visiting behaviors and predict the next places based on the patterns. The experiment results obtained by using a real-world dataset show that the proposed methods are more effective in predicting the user's next places than the previous approaches considered in most cases.
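    The spirit of the STP pattern can be captured with a simple counter keyed by a spatiotemporal slot. The sketch below uses a (weekday, 4-hour slot) key and a toy visit log; the slot definition and the data are assumptions for illustration, not the paper's exact pattern definition.

    ```python
    from collections import Counter, defaultdict
    from datetime import datetime

    # Hypothetical visit log: (timestamp, place)
    visits = [("2024-05-06 08:10", "office"), ("2024-05-07 08:05", "office"),
              ("2024-05-08 08:20", "office"), ("2024-05-06 19:00", "gym"),
              ("2024-05-08 19:10", "gym"),    ("2024-05-11 10:00", "market")]

    # Count visits per (weekday?, 4-hour slot) to approximate spatiotemporal periodicity
    patterns = defaultdict(Counter)
    for ts, place in visits:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        slot = (t.weekday() < 5, t.hour // 4)
        patterns[slot][place] += 1

    def predict_next_place(now):
        """Return the most frequently visited place for the current spatiotemporal slot."""
        slot = (now.weekday() < 5, now.hour // 4)
        ranked = patterns[slot].most_common()
        return ranked[0][0] if ranked else None

    print(predict_next_place(datetime(2024, 5, 9, 8, 30)))   # -> 'office'
    ```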

  4. A new method for enhancer prediction based on deep belief network.

    PubMed

    Bu, Hongda; Gan, Yanglan; Wang, Yang; Zhou, Shuigeng; Guan, Jihong

    2017-10-16

    Studies have shown that enhancers are significant regulatory elements that play crucial roles in regulating gene expression. Because enhancer activity does not depend on the orientation of, or distance to, their target genes, accurately predicting distal enhancers remains challenging. In recent years, with the development of high-throughput ChIP-seq technologies, several computational techniques have emerged to predict enhancers using epigenetic or genomic features. Nevertheless, the inconsistency of computational models across different cell lines and their unsatisfactory prediction performance call for further research in this area. Here, we propose a new Deep Belief Network (DBN)-based computational method for enhancer prediction, called EnhancerDBN. This method combines diverse features, including DNA sequence compositional features, DNA methylation and histone modifications. Our computational results indicate that 1) EnhancerDBN outperforms 13 existing methods in prediction, and 2) GC content and DNA methylation can serve as relevant features for enhancer prediction. Deep learning is effective in boosting the performance of enhancer prediction.

  5. A microRNA-based prediction model for lymph node metastasis in hepatocellular carcinoma.

    PubMed

    Zhang, Li; Xiang, Zuo-Lin; Zeng, Zhao-Chong; Fan, Jia; Tang, Zhao-You; Zhao, Xiao-Mei

    2016-01-19

    We developed an efficient microRNA (miRNA)-based model that could predict the risk of lymph node metastasis (LNM) in hepatocellular carcinoma (HCC). We first evaluated a training cohort of 192 HCC patients after hepatectomy and found five LNM-associated predictive factors: vascular invasion, Barcelona Clinic Liver Cancer stage, miR-145, miR-31, and miR-92a. The five statistically independent factors were used to develop a predictive model. The predictive value of the miRNA-based model was confirmed in a validation cohort of 209 consecutive HCC patients. The prediction model scored LNM risk from 0 to 8, and a cutoff value of 4 was used to distinguish high-risk and low-risk groups. The model's sensitivity and specificity were 69.6% and 80.2%, respectively, at 5 years in the validation cohort, and the area under the curve (AUC) for the miRNA-based prognostic model was 0.860. The 5-year positive and negative predictive values of the model in the validation cohort were 30.3% and 95.5%, respectively. Cox regression analysis revealed that the LNM hazard ratio of the high-risk versus low-risk groups was 11.751 (95% CI, 5.110-27.021; P < 0.001) in the validation cohort. In conclusion, the miRNA-based model is reliable and accurate for the early prediction of LNM in patients with HCC.
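    A points-based score with the reported 0-8 range and cutoff of 4 can be sketched as follows; the per-factor point values are hypothetical and are not the published model weights.

    ```python
    # Hypothetical per-factor points; the real model assigns its own weights to the
    # five independent factors, and only the 0-8 range and cutoff of 4 come from the abstract.
    FACTOR_POINTS = {
        "vascular_invasion": 2,
        "advanced_bclc_stage": 2,
        "low_miR145": 1,
        "high_miR31": 2,
        "high_miR92a": 1,
    }

    def lnm_risk(patient):
        """patient: dict of factor name -> bool; returns (score, risk group)."""
        score = sum(pts for factor, pts in FACTOR_POINTS.items() if patient.get(factor))
        return score, ("high risk" if score >= 4 else "low risk")

    print(lnm_risk({"vascular_invasion": True, "high_miR31": True, "low_miR145": True}))
    ```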

  6. Geometry-based pressure drop prediction in mildly diseased human coronary arteries.

    PubMed

    Schrauwen, J T C; Wentzel, J J; van der Steen, A F W; Gijsen, F J H

    2014-06-03

    Pressure drop (△p) estimations in human coronary arteries have several important applications, including determination of appropriate boundary conditions for CFD and estimation of fractional flow reserve (FFR). In this study a △p prediction was made based on geometrical features derived from patient-specific imaging data. Twenty-two mildly diseased human coronary arteries were imaged with computed tomography and intravascular ultrasound. Each artery was modelled in three consecutive steps: from straight to tapered, to stenosed, to curved model. CFD was performed to compute the additional △p in each model under steady flow for a wide range of Reynolds numbers. The correlations between the added geometrical complexity and additional △p were used to compute a predicted △p. This predicted △p based on geometry was compared to CFD results. The mean △p calculated with CFD was 855±666Pa. Tapering and curvature added significantly to the total △p, accounting for 31.4±19.0% and 18.0±10.9% respectively at Re=250. Using tapering angle, maximum area stenosis and angularity of the centerline, we were able to generate a good estimate for the predicted △p with a low mean but high standard deviation: average error of 41.1±287.8Pa at Re=250. Furthermore, the predicted △p was used to accurately estimate FFR (r=0.93). The effect of the geometric features was determined and the pressure drop in mildly diseased human coronary arteries was predicted quickly based solely on geometry. This pressure drop estimation could serve as a boundary condition in CFD to model the impact of distal epicardial vessels. Copyright © 2014 Elsevier Ltd. All rights reserved.
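    One way to read the result is as an additive, geometry-only estimate of the pressure drop. The sketch below assumes a simple linear contribution from each geometric feature; the functional form and every coefficient are placeholders, not the correlations fitted in the paper.

    ```python
    # Minimal sketch: pressure drop built up additively from a straight-vessel baseline
    # plus contributions attributed to tapering, stenosis and curvature.
    # All coefficients are hypothetical placeholders.
    def predicted_dp(tapering_angle_deg, max_area_stenosis_pct, angularity_deg_per_mm, re=250):
        dp_straight = 1.2 * re                                   # baseline term (placeholder)
        dp_taper    = 18.0 * tapering_angle_deg                  # added by tapering
        dp_stenosis = 9.0 * max_area_stenosis_pct                # added by area stenosis
        dp_curve    = 6.0 * angularity_deg_per_mm * re / 250     # added by centerline curvature
        return dp_straight + dp_taper + dp_stenosis + dp_curve   # Pa

    print(f"predicted pressure drop ≈ {predicted_dp(1.5, 35, 4.0):.0f} Pa")
    ```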

  7. Novel Approach for the Recognition and Prediction of Multi-Function Radar Behaviours Based on Predictive State Representations

    PubMed Central

    Ou, Jian; Chen, Yongguang; Zhao, Feng; Liu, Jin; Xiao, Shunping

    2017-01-01

    The extensive applications of multi-function radars (MFRs) have presented a great challenge to the technologies of radar countermeasures (RCMs) and electronic intelligence (ELINT). The recently proposed cognitive electronic warfare (CEW) provides a good solution, whose crux is to perceive present and future MFR behaviours, including the operating modes, waveform parameters, scheduling schemes, etc. Due to the variety and complexity of MFR waveforms, the existing approaches have the drawbacks of inefficiency and weak practicability in prediction. A novel method for MFR behaviour recognition and prediction is proposed based on predictive state representation (PSR). With the proposed approach, operating modes of MFR are recognized by accumulating the predictive states, instead of using fixed transition probabilities that are unavailable in the battlefield. It helps to reduce the dependence of MFR on prior information. And MFR signals can be quickly predicted by iteratively using the predicted observation, avoiding the very large computation brought by the uncertainty of future observations. Simulations with a hypothetical MFR signal sequence in a typical scenario are presented, showing that the proposed methods perform well and efficiently, which attests to their validity. PMID:28335492

  8. Making predictions of mangrove deforestation: a comparison of two methods in Kenya.

    PubMed

    Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A

    2013-11-01

    Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk; a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.

  9. Field-scale prediction of enhanced DNAPL dissolution based on partitioning tracers.

    PubMed

    Wang, Fang; Annable, Michael D; Jawitz, James W

    2013-09-01

    The equilibrium streamtube model (EST) has demonstrated the ability to accurately predict dense nonaqueous phase liquid (DNAPL) dissolution in laboratory experiments and numerical simulations. Here the model is applied to predict DNAPL dissolution at a tetrachloroethylene (PCE)-contaminated dry cleaner site, located in Jacksonville, Florida. The EST model is an analytical solution with field-measurable input parameters. Measured data from a field-scale partitioning tracer test were used to parameterize the EST model and the predicted PCE dissolution was compared to measured data from an in-situ ethanol flood. In addition, a simulated partitioning tracer test from a calibrated, three-dimensional, spatially explicit multiphase flow model (UTCHEM) was also used to parameterize the EST analytical solution. The EST ethanol prediction based on both the field partitioning tracer test and the simulation closely matched the total recovery well field ethanol data with Nash-Sutcliffe efficiency E=0.96 and 0.90, respectively. The EST PCE predictions showed a peak shift to earlier arrival times for models based on either field-measured or simulated partitioning tracer tests, resulting in poorer matches to the field PCE data in both cases. The peak shifts were concluded to be caused by well screen interval differences between the field tracer test and ethanol flood. Both the EST model and UTCHEM were also used to predict PCE aqueous dissolution under natural gradient conditions, which has a much less complex flow pattern than the forced-gradient double five spot used for the ethanol flood. The natural gradient EST predictions based on parameters determined from tracer tests conducted with a complex flow pattern underestimated the UTCHEM-simulated natural gradient total mass removal by 12% after 170 pore volumes of water flushing indicating that some mass was not detected by the tracers likely due to stagnation zones in the flow field. These findings highlight the important

  10. Field-scale prediction of enhanced DNAPL dissolution based on partitioning tracers

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Annable, Michael D.; Jawitz, James W.

    2013-09-01

    The equilibrium streamtube model (EST) has demonstrated the ability to accurately predict dense nonaqueous phase liquid (DNAPL) dissolution in laboratory experiments and numerical simulations. Here the model is applied to predict DNAPL dissolution at a tetrachloroethylene (PCE)-contaminated dry cleaner site, located in Jacksonville, Florida. The EST model is an analytical solution with field-measurable input parameters. Measured data from a field-scale partitioning tracer test were used to parameterize the EST model and the predicted PCE dissolution was compared to measured data from an in-situ ethanol flood. In addition, a simulated partitioning tracer test from a calibrated, three-dimensional, spatially explicit multiphase flow model (UTCHEM) was also used to parameterize the EST analytical solution. The EST ethanol prediction based on both the field partitioning tracer test and the simulation closely matched the total recovery well field ethanol data with Nash-Sutcliffe efficiency E = 0.96 and 0.90, respectively. The EST PCE predictions showed a peak shift to earlier arrival times for models based on either field-measured or simulated partitioning tracer tests, resulting in poorer matches to the field PCE data in both cases. The peak shifts were concluded to be caused by well screen interval differences between the field tracer test and ethanol flood. Both the EST model and UTCHEM were also used to predict PCE aqueous dissolution under natural gradient conditions, which has a much less complex flow pattern than the forced-gradient double five spot used for the ethanol flood. The natural gradient EST predictions based on parameters determined from tracer tests conducted with a complex flow pattern underestimated the UTCHEM-simulated natural gradient total mass removal by 12% after 170 pore volumes of water flushing indicating that some mass was not detected by the tracers likely due to stagnation zones in the flow field. These findings highlight the important

  11. Qualitative simulation of bathymetric changes due to reservoir sedimentation: A Japanese case study

    PubMed Central

    Dai, Wenhong; Larson, Magnus; Beebo, Qaid Naamo; Xie, Qiancheng

    2017-01-01

    Sediment-dynamics modeling is a useful tool for estimating a dam’s lifespan and its cost–benefit analysis. Collecting real data for sediment-dynamics analysis from conventional field survey methods is both tedious and expensive. Therefore, for most rivers, the historical record of data is either missing or not very detailed. Available data and existing tools have much potential and may be used for qualitative prediction of future bathymetric change trend. This study shows that proxy approaches may be used to increase the spatiotemporal resolution of flow data, and hypothesize the river cross-sections and sediment data. Sediment-dynamics analysis of the reach of the Tenryu River upstream of Sakuma Dam in Japan was performed to predict its future bathymetric changes using a 1D numerical model (HEC-RAS). In this case study, only annually-averaged flow data and the river’s longitudinal bed profile at 5-year intervals were available. Therefore, the other required data, including river cross-section and geometry and sediment inflow grain sizes, had to be hypothesized or assimilated indirectly. The model yielded a good qualitative agreement, with an R2 (coefficient of determination) of 0.8 for the observed and simulated bed profiles. A predictive simulation demonstrated that the useful life of the dam would end after the year 2035 (±5 years), which is in conformity with initial detailed estimates. The study indicates that a sediment-dynamic analysis can be performed even with a limited amount of data. However, such studies may only assess the qualitative trends of sediment dynamics. PMID:28384361

  12. Qualitative simulation of bathymetric changes due to reservoir sedimentation: A Japanese case study.

    PubMed

    Bilal, Ahmed; Dai, Wenhong; Larson, Magnus; Beebo, Qaid Naamo; Xie, Qiancheng

    2017-01-01

    Sediment-dynamics modeling is a useful tool for estimating a dam's lifespan and its cost-benefit analysis. Collecting real data for sediment-dynamics analysis from conventional field survey methods is both tedious and expensive. Therefore, for most rivers, the historical record of data is either missing or not very detailed. Available data and existing tools have much potential and may be used for qualitative prediction of future bathymetric change trend. This study shows that proxy approaches may be used to increase the spatiotemporal resolution of flow data, and hypothesize the river cross-sections and sediment data. Sediment-dynamics analysis of the reach of the Tenryu River upstream of Sakuma Dam in Japan was performed to predict its future bathymetric changes using a 1D numerical model (HEC-RAS). In this case study, only annually-averaged flow data and the river's longitudinal bed profile at 5-year intervals were available. Therefore, the other required data, including river cross-section and geometry and sediment inflow grain sizes, had to be hypothesized or assimilated indirectly. The model yielded a good qualitative agreement, with an R2 (coefficient of determination) of 0.8 for the observed and simulated bed profiles. A predictive simulation demonstrated that the useful life of the dam would end after the year 2035 (±5 years), which is in conformity with initial detailed estimates. The study indicates that a sediment-dynamic analysis can be performed even with a limited amount of data. However, such studies may only assess the qualitative trends of sediment dynamics.

  13. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    NASA Astrophysics Data System (ADS)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures. We propose an approach where the likelihood
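    The FDC signature itself is straightforward to compute, and a signature misfit can replace a point-by-point hydrograph comparison inside the calibration objective. The sketch below computes FDCs from synthetic flow series and an RMSE-type misfit; the data and the misfit measure are illustrative, not the Bayesian likelihood proposed by the authors.

    ```python
    import numpy as np

    def flow_duration_curve(q):
        """Return exceedance probabilities and sorted flows for a streamflow series."""
        q_sorted = np.sort(q)[::-1]                            # flows in descending order
        exceedance = np.arange(1, len(q) + 1) / (len(q) + 1)   # Weibull plotting positions
        return exceedance, q_sorted

    # Hypothetical daily streamflow series (m^3/s); in signature-based calibration the FDC of
    # the simulated series is compared with the observed (or regionalised) FDC.
    rng = np.random.default_rng(3)
    q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=365)
    q_sim = rng.lognormal(mean=1.1, sigma=0.7, size=365)

    p, fdc_obs = flow_duration_curve(q_obs)
    _, fdc_sim = flow_duration_curve(q_sim)
    rmse = np.sqrt(np.mean((fdc_sim - fdc_obs) ** 2))          # simple signature misfit
    print(f"FDC misfit (RMSE): {rmse:.2f} m^3/s")
    ```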

  14. Qualitative research and the profound grasp of the obvious.

    PubMed Central

    Hurley, R E

    1999-01-01

    OBJECTIVE: To discuss the value of promoting coexistent and complementary relationships between qualitative and quantitative research methods as illustrated by presentations made by four respected health services researchers who described their experiences in multi-method projects. DATA SOURCES: Presentations and publications related to the four research projects, which described key substantive and methodological areas that had been addressed with qualitative techniques. PRINCIPAL FINDINGS: Sponsor interest in timely, insightful, and reality-anchored evidence has provided a strong base of support for the incorporation of qualitative methods into major contemporary policy research studies. In addition, many issues may be suitable for study only with qualitative methods because of their complexity, their emergent nature, or because of the need to revisit and reexamine previously untested assumptions. CONCLUSION: Experiences from the four projects, as well as from other recent health services studies with major qualitative components, support the assertion that the interests of sponsors in the policy realm and pressure from them suppress some of the traditional tensions and antagonisms between qualitative and quantitative methods. PMID:10591276

  15. A Question-Answer System for Mobile Devices in Lecture-Based Instruction: A Qualitative Analysis of Student Engagement and Learning

    ERIC Educational Resources Information Center

    Hatun Atas, Amine; Delialioglu, Ömer

    2018-01-01

    The aim of this study was to explore the opinions, perceptions and evaluations of students about their experiences with a question-answer system used on mobile devices in a lecture-based course. Basic qualitative research method was employed in this study to understand how students made sense of their experiences during the instruction. The…

  16. Relative Packing Groups in Template-Based Structure Prediction: Cooperative Effects of True Positive Constraints

    PubMed Central

    Day, Ryan; Qu, Xiaotao; Swanson, Rosemarie; Bohannan, Zach; Bliss, Robert

    2011-01-01

    Most current template-based structure prediction methods concentrate on finding the correct backbone conformation and then packing sidechains within that backbone. Our packing-based method derives distance constraints from conserved relative packing groups (RPGs). In our refinement approach, the RPGs provide a level of resolution that restrains global topology while allowing conformational sampling. In this study, we test our template-based structure prediction method using 51 prediction units from CASP7 experiments. RPG-based constraints are able to substantially improve approximately two-thirds of starting templates. Upon deeper investigation, we find that true positive spatial constraints, especially those non-local in sequence, derived from the RPGs were important to building nearer native models. Surprisingly, the fraction of incorrect or false positive constraints does not strongly influence the quality of the final candidate. This result indicates that our RPG-based true positive constraints sample the self-consistent, cooperative interactions of the native structure. The lack of such reinforcing cooperativity explains the weaker effect of false positive constraints. Generally, these findings are encouraging indications that RPGs will improve template-based structure prediction. PMID:21210729

  17. Generating qualitative data by design: the Australian Longitudinal Study on Women's Health qualitative data collection.

    PubMed

    Tavener, Meredith; Chojenta, Catherine; Loxton, Deborah

    2016-07-15

    Objectives and importance of study: The purpose of this study was to illustrate how qualitative free-text comments, collected within the context of a health survey, represent a rich data source for understanding specific phenomena. Work conducted with data from the Australian Longitudinal Study on Women's Health (ALSWH) was used to demonstrate the breadth and depth of qualitative information that can be collected. The ALSWH has been collecting data on women's health since 1996, and represents a unique opportunity for understanding lived experiences across the lifecourse. A multiple case study design was used to demonstrate the techniques that researchers have used to manage free-text qualitative comments collected by the ALSWH. Eleven projects conducted using free-text comments are discussed according to the method of analysis. These methods include coding (both inductively and deductively), longitudinal analyses and software-based analyses. This work shows that free-text comments are a data resource in their own right, and have the potential to provide rich and valuable information about a wide variety of topics.

  18. The Experience of Psychiatric Care of Adolescents with Anxiety-based School Refusal and of their Parents: A Qualitative Study.

    PubMed

    Sibeoni, Jordan; Orri, Massimiliano; Podlipski, Marc-Antoine; Labey, Mathilde; Campredon, Sophie; Gerardin, Priscille; Revah-Levy, Anne

    2018-01-01

    Anxiety-based school refusal in adolescence is a complex, sometimes difficult-to-treat disorder that can have serious academic and psychiatric consequences. The objective of this qualitative study was to explore how teens with this problem and their parents experience the psychiatric care received. This qualitative multicenter study took place in France, where we conducted semi-structured interviews with adolescents receiving psychiatric care for anxiety-based school refusal and with their parents. Data collection by purposive sampling continued until we reached theoretical sufficiency. Data analysis was thematic. This study included 20 adolescents aged 12 to 18 years and 21 parents. Two themes emerged from the analysis: (1) the goals of psychiatric care, with two sub-themes, "self-transformation" and "problem solving"; and (2) the therapeutic levers identified as effective, with two sub-themes, "time and space" and "relationships". Our results show a divergence between parents and teens in their representations of care and especially of its goals. Therapeutic and research implications regarding the terms of return to school within psychiatric care and the temporality of care are discussed.

  19. Jenner-predict server: prediction of protein vaccine candidates (PVCs) in bacteria based on host-pathogen interactions

    PubMed Central

    2013-01-01

    Background Subunit vaccines based on recombinant proteins have been effective in preventing infectious diseases and are expected to meet the demands of future vaccine development. Computational approaches, especially the reverse vaccinology (RV) method, have enormous potential for identification of protein vaccine candidates (PVCs) from a proteome. The existing protective antigen prediction software and web servers have low prediction accuracy, leading to limited applications for vaccine development. Besides machine learning techniques, those software and web servers have considered only a protein's adhesin-likeness as the criterion for identification of PVCs. Several non-adhesin functional classes of proteins involved in host-pathogen interactions and pathogenesis are known to provide protection against bacterial infections. Therefore, knowledge of bacterial pathogenesis has potential to identify PVCs. Results A web server, Jenner-Predict, has been developed for prediction of PVCs from proteomes of bacterial pathogens. The web server targets host-pathogen interactions and pathogenesis by considering known functional domains from protein classes such as adhesin, virulence, invasin, porin, flagellin, colonization, toxin, choline-binding, penicillin-binding, transferrin-binding, fibronectin-binding and solute-binding. It predicts non-cytosolic proteins containing the above domains as PVCs. It also provides the vaccine potential of PVCs in terms of their possible immunogenicity by comparing with experimentally known IEDB epitopes, absence of autoimmunity and conservation in different strains. Predicted PVCs are prioritized so that only a few prospective PVCs need be validated experimentally. The performance of the web server was evaluated against known protective antigens from diverse classes of bacteria reported in the Protegen database and datasets used for VaxiJen server development. The web server efficiently predicted known vaccine candidates reported from Streptococcus pneumoniae and

  20. Qualitative Research in PBL in Health Sciences Education: A Review

    ERIC Educational Resources Information Center

    Jin, Jun; Bridges, Susan

    2016-01-01

    Context: Qualitative methodologies are relatively new in health sciences education research, especially in the area of problem-based learning (PBL). A key advantage of qualitative approaches is the ability to gain in-depth, textured insights into educational phenomena. Key methodological issues arise, however, in terms of the strategies of…

  1. Attention, awareness of contingencies, and control in spatial localization: a qualitative difference approach.

    PubMed

    Vaquero, Joaquín M M; Fiacconi, Chris; Milliken, Bruce

    2010-12-01

    The qualitative difference method for distinguishing between aware and unaware processes was applied here to a spatial priming task. Participants were asked simply to locate a target stimulus that appeared in one of four locations, and this target stimulus was preceded by a prime in one of the same four locations. The prime location predicted the location of the target with high probability (p = .75), but prime and target mismatched on a task-relevant feature (identity, color). Across 5 experiments, we observed repetition costs in the absence of awareness of the contingency, and repetition benefits in the presence of awareness of the contingency. These results were particularly clear-cut in Experiment 4, in which awareness was defined by reference to self-reported strategy use. Finally, Experiment 5 showed that frequency-based implicit learning effects were present in our experiments but that these implicit learning effects were not strong enough to override repetition costs that pushed performance in the opposite direction. The results of these experiments constitute a novel application of the qualitative difference method to the study of awareness, learning of contingencies, and strategic control.

  2. Incorporating information on predicted solvent accessibility to the co-evolution-based study of protein interactions.

    PubMed

    Ochoa, David; García-Gutiérrez, Ponciano; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2013-01-27

    A widespread family of methods for studying and predicting protein interactions using sequence information is based on co-evolution, quantified as similarity of phylogenetic trees. Part of the co-evolution observed between interacting proteins could be due to co-adaptation caused by inter-protein contacts. In this case, the co-evolution is expected to be more evident when evaluated on the surface of the proteins or the internal layers close to it. In this work we study the effect of incorporating information on predicted solvent accessibility to three methods for predicting protein interactions based on similarity of phylogenetic trees. We evaluate the performance of these methods in predicting different types of protein associations when trees based on positions with different characteristics of predicted accessibility are used as input. We found that predicted accessibility improves the results of two recent versions of the mirrortree methodology in predicting direct binary physical interactions, while it neither improves these methods, nor the original mirrortree method, in predicting other types of interactions. That improvement comes at no cost in terms of applicability since accessibility can be predicted for any sequence. We also found that predictions of protein-protein interactions are improved when multiple sequence alignments with a richer representation of sequences (including paralogs) are incorporated in the accessibility prediction.
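    The underlying tree-similarity signal can be illustrated as a correlation between inter-species distance matrices, here restricted (hypothetically) to positions predicted as solvent-exposed; the matrices below are synthetic and the function is a generic mirrortree-style score, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def mirrortree_score(d1, d2):
        """Co-evolution signal as the correlation between two proteins'
        inter-species distance matrices (the tree-similarity idea behind mirrortree)."""
        iu = np.triu_indices_from(d1, k=1)
        r, _ = pearsonr(d1[iu], d2[iu])
        return r

    # Hypothetical distance matrices over the same 6 species, e.g. computed from
    # alignment columns restricted to positions predicted to be solvent-exposed.
    rng = np.random.default_rng(4)
    base = rng.random((6, 6)); base = (base + base.T) / 2; np.fill_diagonal(base, 0)
    noise = rng.normal(scale=0.05, size=(6, 6)); noise = (noise + noise.T) / 2; np.fill_diagonal(noise, 0)
    protein_a, protein_b = base, base + noise

    print(f"mirrortree correlation: {mirrortree_score(protein_a, protein_b):.2f}")
    ```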

  3. Can contaminant transport models predict breakthrough?

    USGS Publications Warehouse

    Peng, Wei-Shyuan; Hampton, Duane R.; Konikow, Leonard F.; Kambham, Kiran; Benegar, Jeffery J.

    2000-01-01

    A solute breakthrough curve measured during a two-well tracer test was successfully predicted in 1986 using specialized contaminant transport models. Water was injected into a confined, unconsolidated sand aquifer and pumped out 125 feet (38.3 m) away at the same steady rate. The injected water was spiked with bromide for over three days; the outflow concentration was monitored for a month. Based on previous tests, the horizontal hydraulic conductivity of the thick aquifer varied by a factor of seven among 12 layers. Assuming stratified flow with small dispersivities, two research groups accurately predicted breakthrough with three-dimensional (12-layer) models using curvilinear elements following the arc-shaped flowlines in this test. Can contaminant transport models commonly used in industry, that use rectangular blocks, also reproduce this breakthrough curve? The two-well test was simulated with four MODFLOW-based models, MT3D (FD and HMOC options), MODFLOWT, MOC3D, and MODFLOW-SURFACT. Using the same 12 layers and small dispersivity used in the successful 1986 simulations, these models fit almost as accurately as the models using curvilinear blocks. Subtle variations in the curves illustrate differences among the codes. Sensitivities of the results to number and size of grid blocks, number of layers, boundary conditions, and values of dispersivity and porosity are briefly presented. The fit between calculated and measured breakthrough curves degenerated as the number of layers and/or grid blocks decreased, reflecting a loss of model predictive power as the level of characterization lessened. Therefore, the breakthrough curve for most field sites can be predicted only qualitatively due to limited characterization of the hydrogeology and contaminant source strength.

  4. Longitudinal Study-Based Dementia Prediction for Public Health

    PubMed Central

    Kim, HeeChel; Chun, Hong-Woo; Kim, Seonho; Coh, Byoung-Youl; Kwon, Oh-Jin; Moon, Yeong-Ho

    2017-01-01

    The issue of public health in Korea has attracted significant attention given the aging of the country’s population, which has created many types of social problems. The approach proposed in this article aims to address dementia, one of the most significant symptoms of aging and a public health care issue in Korea. The Korean National Health Insurance Service Senior Cohort Database contains personal medical data of every citizen in Korea. Medical history patterns differ considerably between individuals with dementia and normal controls. The approach used in this study involved examination of personal medical history features from personal disease history, sociodemographic data, and personal health examinations to develop a prediction model. The prediction model used a support-vector machine learning technique to perform a 10-fold cross-validation analysis. The experimental results demonstrated promising performance (80.9% F-measure). The proposed approach supported the significant influence of personal medical history features during an optimal observation period. It is anticipated that a biomedical “big data”-based disease prediction model may assist in diagnosing diseases more accurately. PMID:28867810
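
    A minimal sketch of the evaluation protocol named in the abstract (support-vector machine, 10-fold cross-validation, F-measure), using scikit-learn; the features below are synthetic stand-ins, since the cohort database itself is not public.

    ```python
    # Hedged sketch: SVM evaluated by 10-fold cross-validation with the
    # F-measure, as described in the abstract. Synthetic stand-in features
    # replace the personal medical-history variables of the real study.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))               # stand-in medical-history features
    y = (X[:, :5].sum(axis=1) > 0).astype(int)   # stand-in dementia / control labels

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    f_scores = cross_val_score(model, X, y, cv=10, scoring="f1")
    print(f"mean F-measure over 10 folds: {f_scores.mean():.3f}")
    ```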

  5. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    PubMed

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only a slight increase in encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
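
    As an illustration only, the two ideas behind BMAP, classifying blocks against a modelled background and coding hybrid blocks in the background-difference domain, can be sketched as follows. The thresholds, the mean-absolute-difference classifier, and the toy background model are assumptions; the actual method is integrated into an AVC-style encoder.

    ```python
    # Illustrative sketch (not the paper's codec): classify a block relative
    # to a modelled background and form the background-difference residual.
    import numpy as np

    def classify_block(block, bg_block, t_bg=2.0, t_fg=20.0):
        mad = np.mean(np.abs(block.astype(float) - bg_block.astype(float)))
        if mad < t_bg:
            return "background"    # candidate for background reference prediction (BRP)
        if mad > t_fg:
            return "foreground"
        return "hybrid"            # candidate for background difference prediction (BDP)

    def background_difference(block, bg_block):
        # residual coded instead of the raw pixels for hybrid blocks
        return block.astype(int) - bg_block.astype(int)

    rng = np.random.default_rng(0)
    background = rng.integers(0, 256, size=(16, 16))
    current = background + rng.integers(-1, 2, size=(16, 16))   # nearly static block
    print(classify_block(current, background))
    print(background_difference(current, background)[:2, :2])
    ```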

  6. Predicting missing links in complex networks based on common neighbors and distance

    PubMed Central

    Yang, Jinxuan; Zhang, Xiao-Dong

    2016-01-01

    Algorithms that use the common-neighbors metric to predict missing links in complex networks are very popular, but most of them do not account for missing links between nodes that share no common neighbors. Reconstructing networks with these methods is therefore not accurate enough in some cases, especially when node pairs have few common neighbors. In this paper we propose a new algorithm based on common neighbors and distance to improve the accuracy of link prediction. The proposed algorithm is remarkably effective at predicting missing links between nodes with no common neighbors and performs better than most currently used methods on a variety of real-world networks without increasing complexity. PMID:27905526
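
    The abstract does not state the exact scoring function, but the general idea, counting common neighbors and falling back on a distance-based score when there are none, can be sketched with networkx; the distance-decay rule used here is an assumption, not the paper's formula.

    ```python
    # Illustrative sketch: score a candidate link by its common-neighbor
    # count, and fall back on a penalty that decays with shortest-path
    # distance when there are no common neighbors.
    import itertools
    import networkx as nx

    def link_score(G, u, v, alpha=0.5):
        cn = len(list(nx.common_neighbors(G, u, v)))
        if cn > 0:
            return cn
        try:
            d = nx.shortest_path_length(G, u, v)
        except nx.NetworkXNoPath:
            return 0.0
        return alpha ** d      # assumed decay; the paper defines its own combination

    G = nx.karate_club_graph()
    candidates = [(u, v) for u, v in itertools.combinations(G.nodes, 2) if not G.has_edge(u, v)]
    top = sorted(candidates, key=lambda e: link_score(G, *e), reverse=True)[:5]
    print("highest-scoring missing links:", top)
    ```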

  7. Predicting infant cortical surface development using a 4D varifold-based learning framework and local topography-based shape morphing.

    PubMed

    Rekik, Islem; Li, Gang; Lin, Weili; Shen, Dinggang

    2016-02-01

    Longitudinal neuroimaging analysis methods have remarkably advanced our understanding of early postnatal brain development. However, learning predictive models to trace forth the evolution trajectories of both normal and abnormal cortical shapes remains broadly absent. To fill this critical gap, we pioneered the first prediction model for longitudinal developing cortical surfaces in infants using a spatiotemporal current-based learning framework solely from the baseline cortical surface. In this paper, we detail this prediction model and even further improve its performance by introducing two key variants. First, we use the varifold metric to overcome the limitations of the current metric for surface registration that was used in our preliminary study. We also extend the conventional varifold-based surface registration model for pairwise registration to a spatiotemporal surface regression model. Second, we propose a morphing process of the baseline surface using its topographic attributes such as normal direction and principal curvature sign. Specifically, our method learns from longitudinal data both the geometric (vertices positions) and dynamic (temporal evolution trajectories) features of the infant cortical surface, comprising a training stage and a prediction stage. In the training stage, we use the proposed varifold-based shape regression model to estimate geodesic cortical shape evolution trajectories for each training subject. We then build an empirical mean spatiotemporal surface atlas. In the prediction stage, given an infant, we select the best learnt features from training subjects to simultaneously predict the cortical surface shapes at all later timepoints, based on similarity metrics between this baseline surface and the learnt baseline population average surface atlas. We used a leave-one-out cross validation method to predict the inner cortical surface shape at 3, 6, 9 and 12 months of age from the baseline cortical surface shape at birth. Our

  8. Qualitative studies. Their role in medical research.

    PubMed Central

    Huston, P.; Rowan, M.

    1998-01-01

    OBJECTIVE: To define qualitative research in terms of its philosophical roots, the questions it addresses, its methods and analyses, and the type of results it can offer. DATA SOURCES: MEDLINE and CINAHL (Cumulative Index to Nursing and Allied Health Literature) databases were searched for the years January 1985 to April 1998. The search strategy consisted of "textword" terms that searched in the "title" field of both databases. Qualitative research and evaluation textbooks in health and the social sciences were also used. QUALITY OF EVIDENCE: The information on qualitative research is based on the most recent and valid evidence from the health and social science fields. MAIN MESSAGE: Qualitative research seeks to understand and interpret personal experience to explain social phenomena, including those related to health. It can address questions that quantitative research cannot, such as why people do not adhere to a treatment regimen or why a certain health care intervention is successful. It uses many methods of data collection, including participant observation, case studies, and interviews, and numerous approaches to data analysis that range from the quasistatistical to the intuitive and inductive. CONCLUSIONS: Qualitative research, a form of research completely different from quantitative research, can provide important insights into health-related phenomena and can enrich further research inquiries. PMID:9839063

  9. QUAGOL: a guide for qualitative data analysis.

    PubMed

    Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne

    2012-03-01

    Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide that was developed in order to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges encountered and lessons learned from our own extensive experience with qualitative data analysis within the Grounded Theory Approach, as well as from those of other researchers (as described in the literature), were discussed and recommendations were presented. Strengths and pitfalls of the proposed method were discussed in detail. The Qualitative Analysis Guide of Leuven (QUAGOL) offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as optimally as possible. The QUAGOL guide is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis. Additionally, the importance of constantly moving between the various stages

  10. Predicting Player Position for Talent Identification in Association Football

    NASA Astrophysics Data System (ADS)

    Razali, Nazim; Mustapha, Aida; Yatim, Faiz Ahmad; Aziz, Ruhaya Ab

    2017-08-01

    This paper introduces a new framework from the perspective of Computer Science for identifying talents in the sport of football based on the players’ individual qualities: physical, mental, and technical. The combination of qualities as assessed by coaches is then used to predict the position in a match that suits the player best in a particular team formation. Evaluation of the proposed framework is two-fold; quantitatively via classification experiments to predict player position, and qualitatively via a Talent Identification Site developed to achieve the same goal. Results from the classification experiments using Bayesian Networks, Decision Trees, and K-Nearest Neighbor have shown an average of 98% accuracy, which will promote consistency in decision-making through elimination of personal bias in team selection. The positive reviews of the Football Identification Site based on user acceptance evaluation also indicate that the framework is sufficient to serve as the basis for developing an intelligent team management system in different sports, whereby the growth and performance of sport players can be monitored and identified.

  11. Prediction of essential proteins based on gene expression programming.

    PubMed

    Zhong, Jiancheng; Wang, Jianxin; Peng, Wei; Zhang, Zhen; Pan, Yi

    2013-01-01

    Essential proteins are indispensable for cell survival. Identifying essential proteins is very important for improving our understanding of how a cell works. There are various types of features related to the essentiality of proteins. Many methods have been proposed to combine some of them to predict essential proteins. However, designing an effective method that predicts essential proteins by integrating different features, and that explains how the selected features determine protein essentiality, remains a major challenge. Gene expression programming (GEP) is a learning algorithm that learns relationships between variables in data sets and builds models to explain these relationships. In this work, we propose a GEP-based method to predict essential proteins by combining biological features and topological features. We carry out experiments on S. cerevisiae data. The experimental results show that our method achieves better prediction performance than methods using individual features. Moreover, our method outperforms some machine learning methods and performs as well as a method obtained by combining the outputs of eight machine learning methods. The accuracy of predicting essential proteins can be improved by using the GEP method to combine topological and biological features.

  12. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic success. Different drug-drug interactions can be examined through the drug synergy score. Efficient regression-based machine learning approaches are needed to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide sufficient accuracy for the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved through a biased weighted aggregation of the selected models' predictions (i.e., assigning larger weights to models with higher prediction scores). The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
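
    The biased weighted aggregation described above can be sketched in a few lines; normalising validation scores into weights is an assumed rule, and the accuracies and predictions below are placeholders, not the paper's data.

    ```python
    # Sketch of biased weighted aggregation of regression predictions:
    # models with higher validation scores receive proportionally larger
    # weights. The weighting rule is an assumption for illustration.
    import numpy as np

    def ensemble_predict(predictions, scores):
        """predictions: (n_models, n_samples); scores: accuracy of each model."""
        w = np.asarray(scores, dtype=float)
        w = w / w.sum()                     # more weight to higher-scoring models
        return w @ np.asarray(predictions)

    preds = np.array([[0.8, 0.3, 0.5],      # e.g. Random Forest
                      [0.7, 0.4, 0.6],      # GFS.GCCL
                      [0.9, 0.2, 0.5],      # ANFIS
                      [0.6, 0.5, 0.4]])     # DENFIS
    scores = [0.82, 0.78, 0.85, 0.75]       # hypothetical validation accuracies
    print(ensemble_predict(preds, scores))
    ```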

  13. Search strategies for identifying qualitative studies in CINAHL.

    PubMed

    Wilczynski, Nancy L; Marks, Susan; Haynes, R Brian

    2007-05-01

    Nurses, allied health professionals, clinicians, and researchers increasingly use online access to evidence in the course of patient care or when conducting reviews on a particular topic. Qualitative research has an important role in evidence-based health care. Online searching for qualitative studies can be difficult, however, resulting in the need to develop search filters. The objective of this study was to develop optimal search strategies to retrieve qualitative studies in CINAHL for the 2000 publishing year. The authors conducted an analytic survey comparing hand searches of journals with retrievals from CINAHL for candidate search terms and combinations. Combinations of search terms reached peak sensitivities of 98.9% and peak specificities of 99.5%. Combining search terms optimized both sensitivity and specificity at 94.2%. Empirically derived search strategies combining indexing terms and textwords can achieve high sensitivity and high specificity for retrieving qualitative studies from CINAHL.

  14. A Knowledge-Base for a Personalized Infectious Disease Risk Prediction System.

    PubMed

    Vinarti, Retno; Hederman, Lucy

    2018-01-01

    We present a knowledge-base to represent collated infectious disease risk (IDR) knowledge. The knowledge is about personal and contextual risk of contracting an infectious disease obtained from declarative sources (e.g. Atlas of Human Infectious Diseases). Automated prediction requires encoding this knowledge in a form that can produce risk probabilities (e.g. Bayesian Network - BN). The knowledge-base presented in this paper feeds an algorithm that can auto-generate the BN. The knowledge from 234 infectious diseases was compiled. From this compilation, we designed an ontology and five rule types for modelling IDR knowledge in general. The evaluation aims to assess whether the knowledge-base structure, and its application to three disease-country contexts, meets the needs of personalized IDR prediction system. From the evaluation results, the knowledge-base conforms to the system's purpose: personalization of infectious disease risk.

  15. NLLSS: Predicting Synergistic Drug Combinations Based on Semi-supervised Learning

    PubMed Central

    Chen, Ming; Wang, Quanxin; Zhang, Lixin; Yan, Guiying

    2016-01-01

    Fungal infection has become one of the leading causes of hospital-acquired infections with high mortality rates. Furthermore, drug resistance is common for fungus-causing diseases. Synergistic drug combinations could provide an effective strategy to overcome drug resistance. Meanwhile, synergistic drug combinations can increase treatment efficacy and decrease drug dosage to avoid toxicity. Therefore, computational prediction of synergistic drug combinations for fungus-causing diseases becomes attractive. In this study, we proposed a similarity principle for drug combinations: principal drugs that achieve a synergistic effect with similar adjuvant drugs are often themselves similar, and vice versa. Furthermore, we developed a novel algorithm termed Network-based Laplacian regularized Least Square Synergistic drug combination prediction (NLLSS) to predict potential synergistic drug combinations by integrating different kinds of information such as known synergistic drug combinations, drug-target interactions, and drug chemical structures. We applied NLLSS to predict antifungal synergistic drug combinations and showed that it achieved excellent performance both in terms of cross validation and independent prediction. Finally, we performed biological experiments for the fungal pathogen Candida albicans to confirm 7 out of 13 predicted antifungal synergistic drug combinations. NLLSS provides an efficient strategy to identify potential synergistic antifungal combinations. PMID:27415801
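
    NLLSS itself integrates several data sources, but the underlying Laplacian-regularised least-squares step can be sketched on a toy similarity graph; the objective shown here is the generic LapRLS closed form, not the paper's exact formulation, and the similarity matrix and labels are random placeholders.

    ```python
    # Minimal sketch of Laplacian-regularised least squares on a drug
    # similarity graph: known labels Y are smoothed over the graph by
    # solving (I + beta*L) F = Y.
    import numpy as np

    def lap_rls(S, Y, beta=1.0):
        D = np.diag(S.sum(axis=1))
        L = D - S                                  # (unnormalised) graph Laplacian
        return np.linalg.solve(np.eye(len(S)) + beta * L, Y)

    rng = np.random.default_rng(1)
    S = rng.random((6, 6)); S = (S + S.T) / 2; np.fill_diagonal(S, 0)  # symmetric similarities
    Y = np.array([1., 0., 0., 1., 0., 0.])         # known synergistic partners for one principal drug
    print(lap_rls(S, Y, beta=0.5))                 # smoothed synergy scores for all candidates
    ```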

  16. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    NASA Astrophysics Data System (ADS)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    Given China’s need to improve its earthquake disaster prevention capability, this paper puts forward an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Building on a GIS spatial database, coordinate transformation technology, GIS spatial analysis technology and PHP development technology, a seismic damage factor algorithm is used to predict damage to the city under earthquake disasters of different intensities. The system adopts a B/S (browser/server) architecture and provides mapping of damage degree and spatial distribution, two-dimensional visualization, comprehensive query and analysis, and auxiliary decision-making functions to identify seismically weak areas of the city and support rapid warning. The system has shifted the city’s earthquake disaster reduction work from static planning to dynamic management, and improved the city’s earthquake and disaster prevention capability.

  17. Predicting diffusion paths and interface motion in gamma/gamma + beta, Ni-Cr-Al diffusion couples

    NASA Technical Reports Server (NTRS)

    Nesbitt, J. A.; Heckel, R. W.

    1987-01-01

    A simplified model has been developed to predict Beta recession and diffusion paths in ternary gamma/gamma + beta diffusion couples (gamma:fcc, beta: NiAl structure). The model was tested by predicting beta recession and diffusion paths for four gamma/gamma + beta, Ni-Cr-Al couples annealed for 100 hours at 1200 C. The model predicted beta recession within 20 percent of that measured for each of the couples. The model also predicted shifts in the concentration of the gamma phase at the gamma/gamma + beta interface within 2 at. pct Al and 6 at. pct Cr of that measured in each of the couples. A qualitative explanation based on simple kinetic and mass balance arguments has been given which demonstrates the necessity for diffusion in the two-phase region of certain gamma/gamma + beta, Ni-Cr-Al couples.

  18. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed-up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures to models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  19. Not just the norm: exemplar-based models also predict face aftereffects.

    PubMed

    Ross, David A; Deroche, Mickael; Palmeri, Thomas J

    2014-02-01

    The face recognition literature has considered two competing accounts of how faces are represented within the visual system: Exemplar-based models assume that faces are represented via their similarity to exemplars of previously experienced faces, while norm-based models assume that faces are represented with respect to their deviation from an average face, or norm. Face identity aftereffects have been taken as compelling evidence in favor of a norm-based account over an exemplar-based account. After a relatively brief period of adaptation to an adaptor face, the perceived identity of a test face is shifted toward a face with attributes opposite to those of the adaptor, suggesting an explicit psychological representation of the norm. Surprisingly, despite near universal recognition that face identity aftereffects imply norm-based coding, there have been no published attempts to simulate the predictions of norm- and exemplar-based models in face adaptation paradigms. Here, we implemented and tested variations of norm and exemplar models. Contrary to common claims, our simulations revealed that both an exemplar-based model and a version of a two-pool norm-based model, but not a traditional norm-based model, predict face identity aftereffects following face adaptation.

  20. Not Just the Norm: Exemplar-Based Models also Predict Face Aftereffects

    PubMed Central

    Ross, David A.; Deroche, Mickael; Palmeri, Thomas J.

    2014-01-01

    The face recognition literature has considered two competing accounts of how faces are represented within the visual system: Exemplar-based models assume that faces are represented via their similarity to exemplars of previously experienced faces, while norm-based models assume that faces are represented with respect to their deviation from an average face, or norm. Face identity aftereffects have been taken as compelling evidence in favor of a norm-based account over an exemplar-based account. After a relatively brief period of adaptation to an adaptor face, the perceived identity of a test face is shifted towards a face with opposite attributes to the adaptor, suggesting an explicit psychological representation of the norm. Surprisingly, despite near universal recognition that face identity aftereffects imply norm-based coding, there have been no published attempts to simulate the predictions of norm- and exemplar-based models in face adaptation paradigms. Here we implemented and tested variations of norm and exemplar models. Contrary to common claims, our simulations revealed that both an exemplar-based model and a version of a two-pool norm-based model, but not a traditional norm-based model, predict face identity aftereffects following face adaptation. PMID:23690282

  1. Soil-pipe interaction modeling for pipe behavior prediction with super learning based methods

    NASA Astrophysics Data System (ADS)

    Shi, Fang; Peng, Xiang; Liu, Huan; Hu, Yafei; Liu, Zheng; Li, Eric

    2018-03-01

    Underground pipelines are subject to severe distress from the surrounding expansive soil. To investigate the structural response of water mains to varying soil movements, field data, including pipe wall strains, in situ soil water content, soil pressure and temperature, were collected. Research on monitoring data analysis has been reported, but the relationship between soil properties and pipe deformation has not been well interpreted. To characterize the relationship between soil properties and pipe deformation, this paper presents a super learning based approach combined with feature selection algorithms to predict the structural behavior of water mains in different soil environments. Furthermore, an automatic variable selection method, i.e. the recursive feature elimination algorithm, was used to identify the critical predictors contributing to the pipe deformations. To investigate the adaptability of super learning to different predictive models, this research applied super learning based methods to three different datasets. The predictive performance was evaluated by R-squared, root-mean-square error and mean absolute error. Based on the prediction performance evaluation, the superiority of super learning was validated and demonstrated by predicting three types of pipe deformations accurately. In addition, a comprehensive understanding of the water mains' working environments becomes possible.
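
    A hedged sketch of the overall recipe, recursive feature elimination followed by a super learner (stacked regressors) evaluated with R-squared, RMSE and MAE, using scikit-learn; the base learners chosen here and the synthetic data are assumptions, not the study's configuration.

    ```python
    # Sketch: RFE feature selection feeding a stacked "super learner",
    # scored with R^2, RMSE and MAE as in the abstract. Synthetic data
    # stand in for the soil/pipe monitoring variables.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor, StackingRegressor
    from sklearn.feature_selection import RFE
    from sklearn.linear_model import LinearRegression, Ridge
    from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVR

    X, y = make_regression(n_samples=300, n_features=20, noise=5.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    base = [("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
            ("svr", SVR()),
            ("ridge", Ridge())]
    super_learner = make_pipeline(
        RFE(LinearRegression(), n_features_to_select=10),   # keep the critical predictors
        StackingRegressor(estimators=base, final_estimator=LinearRegression()))

    super_learner.fit(X_tr, y_tr)
    pred = super_learner.predict(X_te)
    print("R2  :", r2_score(y_te, pred))
    print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
    print("MAE :", mean_absolute_error(y_te, pred))
    ```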

  2. A CBR-Based and MAHP-Based Customer Value Prediction Model for New Product Development

    PubMed Central

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In a fierce market environment, an enterprise that wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, their model still has some deficiencies with respect to better meeting customer needs, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case-based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use the actual but averaged effectiveness of influencing factors in the simulation; at the same time, C&M-CVPM uses dynamic customer transition probabilities, which are closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment. PMID:25162050

  3. A CBR-based and MAHP-based customer value prediction model for new product development.

    PubMed

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In a fierce market environment, an enterprise that wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, their model still has some deficiencies with respect to better meeting customer needs, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case-based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use the actual but averaged effectiveness of influencing factors in the simulation; at the same time, C&M-CVPM uses dynamic customer transition probabilities, which are closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment.

  4. Binary similarity measures for fingerprint analysis of qualitative metabolomic profiles.

    PubMed

    Rácz, Anita; Andrić, Filip; Bajusz, Dávid; Héberger, Károly

    2018-01-01

    Contemporary metabolomic fingerprinting is based on multiple spectrometric and chromatographic signals, used either alone or combined with structural and chemical information of metabolic markers at the qualitative and semiquantitative level. However, signal shifting, convolution, and matrix effects may compromise metabolomic patterns. Recent increase in the use of qualitative metabolomic data, described by the presence (1) or absence (0) of particular metabolites, demonstrates great potential in the field of metabolomic profiling and fingerprint analysis. The aim of this study is a comprehensive evaluation of binary similarity measures for the elucidation of patterns among samples of different botanical origin and various metabolomic profiles. Nine qualitative metabolomic data sets covering a wide range of natural products and metabolomic profiles were applied to assess 44 binary similarity measures for the fingerprinting of plant extracts and natural products. The measures were analyzed by the novel sum of ranking differences method (SRD), searching for the most promising candidates. Baroni-Urbani-Buser (BUB) and Hawkins-Dotson (HD) similarity coefficients were selected as the best measures by SRD and analysis of variance (ANOVA), while Dice (Di1), Yule, Russel-Rao, and Consonni-Todeschini 3 ranked the worst. ANOVA revealed that concordantly and intermediately symmetric similarity coefficients are better candidates for metabolomic fingerprinting than the asymmetric and correlation-based ones. The fingerprint analysis based on the BUB and HD coefficients and qualitative metabolomic data performed as well as the quantitative metabolomic profile analysis. Fingerprint analysis based on the qualitative metabolomic profiles and binary similarity measures proved to be a reliable way of finding the same or similar patterns in metabolomic data as those extracted from quantitative data.
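
    For illustration, the two best-ranked coefficients can be computed directly from presence/absence profiles; the formulas below follow the common literature definitions of the Baroni-Urbani-Buser and Hawkins-Dotson coefficients and should be checked against the paper's exact conventions. The toy profiles are placeholders.

    ```python
    # Sketch of two binary similarity coefficients on qualitative (1/0)
    # metabolite profiles. a = present in both samples, d = absent in both,
    # b and c = present in only one of them.
    import numpy as np

    def _counts(x, y):
        x, y = np.asarray(x, bool), np.asarray(y, bool)
        return np.sum(x & y), np.sum(x & ~y), np.sum(~x & y), np.sum(~x & ~y)

    def baroni_urbani_buser(x, y):
        a, b, c, d = _counts(x, y)
        root = np.sqrt(a * d)
        return (root + a) / (root + a + b + c)

    def hawkins_dotson(x, y):
        a, b, c, d = _counts(x, y)
        return 0.5 * (a / (a + b + c) + d / (b + c + d))

    p1 = [1, 1, 0, 1, 0, 0, 1, 0]   # qualitative profile of sample 1
    p2 = [1, 0, 0, 1, 0, 1, 1, 0]   # qualitative profile of sample 2
    print(baroni_urbani_buser(p1, p2), hawkins_dotson(p1, p2))
    ```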

  5. Evaluation of the approach based on the concept of hyperbolicity breaking for prediction of flooding velocity of both room temperature and cryogenic fluids

    NASA Astrophysics Data System (ADS)

    Zhou, Rui; Yu, Liu; Xie, Huangjun; Qiu, Limin; Zhi, Xiaoqin; Zhang, Xiaobin

    2018-07-01

    The theoretical approach for the prediction of flooding velocity based on the concept of hyperbolicity breaking was evaluated in counter-current two-phase flow. Detailed mathematical derivations of the neutral stability condition together with the correlation of the void fraction are presented. The flooding velocity is obtained by assuming that the wavelength at flooding is proportional to the wavelength of the fastest-growing wave at Helmholtz instability. Available experimental data for different fluid-pair flows in inclined tubes are adopted for comparison with the theoretical calculations, including data for water/air, aqueous oleic acid natrium solution/air, Aq. butanol 2%/air and kerosene/air from the published papers, as well as liquid nitrogen/vapor nitrogen from the present authors. The comparison of flooding velocity shows that the approach can predict the flooding velocity with acceptable accuracy for the water/air and liquid nitrogen/vapor nitrogen flows if the tube diameter is greater than 9 mm. When the diameter is smaller than 9 mm, regardless of inclination and fluid pair, the error becomes larger than in the cases with diameters greater than 9 mm. The calculations for small-diameter cases also fail to predict the critical liquid velocity at which the flooding velocity of gas reaches its maximum value, as revealed by the experiments. The reasons for the increased errors were qualitatively explained.

  6. Qualitative Versus Quantitative Social Support as a Predictor of Depression in the Elderly.

    ERIC Educational Resources Information Center

    Chwalisz, Kathleen D.; And Others

    This study examined the relationship between qualitative and quantitative indicators of social support in the prediction of depression. Quantitative indicators were examined with regard to their direct effects on depression as well as their indirect effects through their relationship to perceived social support. Subjects were 301…

  7. Debris-flow runout predictions based on the average channel slope (ACS)

    USGS Publications Warehouse

    Prochaska, A.B.; Santi, P.M.; Higgins, J.D.; Cannon, S.H.

    2008-01-01

    Prediction of the runout distance of a debris flow is an important element in the delineation of potentially hazardous areas on alluvial fans and for the siting of mitigation structures. Existing runout estimation methods rely on input parameters that are often difficult to estimate, including volume, velocity, and frictional factors. In order to provide a simple method for preliminary estimates of debris-flow runout distances, we developed a model that provides runout predictions based on the average channel slope (ACS model) for non-volcanic debris flows that emanate from confined channels and deposit on well-defined alluvial fans. This model was developed from 20 debris-flow events in the western United States and British Columbia. Based on a runout estimation method developed for snow avalanches, this model predicts debris-flow runout as an angle of reach from a fixed point in the drainage channel to the end of the runout zone. The best fixed point was found to be the mid-point elevation of the drainage channel, measured from the apex of the alluvial fan to the top of the drainage basin. Predicted runout lengths were more consistent than those obtained from existing angle-of-reach estimation methods. Results of the model compared well with those of laboratory flume tests performed using the same range of channel slopes. The robustness of this model was tested by applying it to three debris-flow events not used in its development: predicted runout ranged from 82 to 131% of the actual runout for these three events. Prediction interval multipliers were also developed so that the user may calculate predicted runout within specified confidence limits. © 2008 Elsevier B.V. All rights reserved.

  8. A qualitative synthesis of the positive and negative impacts related to delivery of peer-based health interventions in prison settings.

    PubMed

    South, Jane; Woodall, James; Kinsella, Karina; Bagnall, Anne-Marie

    2016-09-29

    Peer interventions involving prisoners in delivering peer education and peer support in a prison setting can address health need and add capacity for health services operating in this setting. This paper reports on a qualitative synthesis conducted as part of a systematic review of prison-based peer interventions. One of the review questions aimed to investigate the positive and negative impacts of delivering peer interventions within prison settings. This covered organisational and process issues relating to peer interventions, including prisoner and staff views. A qualitative synthesis of qualitative and mixed method studies was undertaken. The overall study design comprised a systematic review involving searching, study selection, data extraction and validity assessment. Studies reporting interventions with prisoners or ex-prisoners delivering education or support to prisoners resident in any type of prison or young offender institution, all ages, male and female, were included. A thematic synthesis was undertaken with a subset of studies reporting qualitative data (n = 33). This involved free coding of text reporting qualitative findings to develop a set of codes, which were then grouped into thematic categories and mapped back to the review question. Themes on process issues and wider impacts were grouped into four thematic categories: peer recruitment training and support; organisational support; prisoner relationships; prison life. There was consistent qualitative evidence on the need for organisational support within the prison to ensure smooth implementation and on managing security risks when prisoners were involved in service delivery. A suite of factors affecting the delivery of peer interventions and the wider organisation of prison life were identified. Alongside reported benefits of peer delivery, some reasons for non-utilisation of services by other prisoners were found. There was weak qualitative evidence on wider impacts on the prison system

  9. Sequence Based Prediction of Antioxidant Proteins Using a Classifier Selection Strategy

    PubMed Central

    Zhang, Lina; Zhang, Chengjin; Gao, Rui; Yang, Runtao; Song, Qing

    2016-01-01

    Antioxidant proteins perform significant functions in maintaining oxidation/antioxidation balance and have potential therapeutic value for some diseases. Accurate identification of antioxidant proteins could contribute to revealing physiological processes of oxidation/antioxidation balance and developing novel antioxidation-based drugs. In this study, an ensemble method is presented to predict antioxidant proteins with hybrid features, incorporating SSI (Secondary Structure Information), PSSM (Position Specific Scoring Matrix), RSA (Relative Solvent Accessibility), and CTD (Composition, Transition, Distribution). The prediction results of the ensemble predictor are determined by an average of the prediction results of multiple base classifiers. Based on a classifier selection strategy, we obtain an optimal ensemble classifier composed of RF (Random Forest), SMO (Sequential Minimal Optimization), NNA (Nearest Neighbor Algorithm), and J48 with an accuracy of 0.925. A Relief combined with IFS (Incremental Feature Selection) method is adopted to obtain optimal features from the hybrid features. With the optimal features, the ensemble method achieves improved performance with a sensitivity of 0.95, a specificity of 0.93, an accuracy of 0.94, and an MCC (Matthews Correlation Coefficient) of 0.880, far better than the existing method. To evaluate the prediction performance objectively, the proposed method is compared with existing methods on the same independent testing dataset. Encouragingly, our method performs better than previous studies. In addition, our method achieves more balanced performance with a sensitivity of 0.878 and a specificity of 0.860. These results suggest that the proposed ensemble method can be a potential candidate for antioxidant protein prediction. For public access, we develop a user-friendly web server for antioxidant protein identification that is freely accessible at http://antioxidant.weka.cc. PMID:27662651
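
    A rough sketch of such an ensemble in scikit-learn, with synthetic stand-in features and stand-ins for the WEKA learners named in the abstract (an SVC for SMO, a decision tree for J48); the voting rule and all parameters are assumptions.

    ```python
    # Hedged sketch of a classifier-selection-style ensemble: RF, SVM,
    # nearest neighbours and a decision tree combined by soft voting.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=400, n_features=30, random_state=0)

    ensemble = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(random_state=0)),
                    ("svm", SVC(probability=True, random_state=0)),    # stands in for SMO
                    ("knn", KNeighborsClassifier()),                    # nearest-neighbour algorithm
                    ("tree", DecisionTreeClassifier(random_state=0))],  # stands in for J48
        voting="soft")                                                  # averages predicted probabilities

    print("CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
    ```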

  10. Researches of fruit quality prediction model based on near infrared spectrum

    NASA Astrophysics Data System (ADS)

    Shen, Yulin; Li, Lian

    2018-04-01

    With the improvement in standards for food quality and safety, people pay more attention to the internal quality of fruit, so measuring fruit internal quality is increasingly imperative. Nondestructive analysis of soluble solid content (SSC) and total acid content (TAC) is vital and effective for quality measurement in global fresh-produce markets, so this paper aims to establish a novel fruit internal quality prediction model based on SSC and TAC for near-infrared spectra. First, fruit quality prediction models based on PCA + BP neural network, PCA + GRNN, PCA + BP AdaBoost strong classifier, PCA + ELM and PCA + LS-SVM classifiers are designed and implemented. Then, in the NSCT domain, median and Savitzky-Golay filters are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select the training and test samples. Third, the optimal models are obtained by comparing 15 kinds of prediction models under a multi-classifier competition mechanism; specifically, nonparametric estimation is introduced to measure the effectiveness of each model, with the reliability and variance of the nonparametric evaluation used to assess the prediction results and the estimated value and confidence interval taken as a reference. The experimental results demonstrate that this approach achieves an optimal evaluation of the internal quality of fruit. Finally, cat swarm optimization is employed to optimize the two best models identified by the nonparametric estimation, and empirical testing indicates that the proposed method provides more accurate and effective results than other forecasting methods.
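
    One of the model families mentioned above (PCA followed by a neural-network regressor on Savitzky-Golay-preprocessed spectra) can be sketched as follows; the synthetic spectra, the model settings, and the use of scikit-learn's MLPRegressor as a stand-in for the BP network are all assumptions for illustration.

    ```python
    # Sketch: Savitzky-Golay smoothing, PCA, and a neural-network regressor
    # predicting soluble solid content from stand-in NIR spectra.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(120, 256))                                  # stand-in NIR spectra
    ssc = spectra[:, 40:60].mean(axis=1) + rng.normal(scale=0.05, size=120)  # stand-in SSC values

    smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)  # Savitzky-Golay preprocessing
    model = make_pipeline(PCA(n_components=10),
                          MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
    print("CV R^2:", cross_val_score(model, smoothed, ssc, cv=5, scoring="r2").mean())
    ```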

  11. MOST: most-similar ligand based approach to target prediction.

    PubMed

    Huang, Tao; Mi, Hong; Lin, Cheng-Yuan; Zhao, Ling; Zhong, Linda L D; Liu, Feng-Bin; Zhang, Ge; Lu, Ai-Ping; Bian, Zhao-Xiang

    2017-03-11

    Many computational approaches have been used for target prediction, including machine learning, reverse docking, bioactivity spectra analysis, and chemical similarity searching. Recent studies have suggested that chemical similarity searching may be driven by the most-similar ligand. However, the extent of bioactivity of most-similar ligands has been oversimplified or even neglected in these studies, and this has impaired the prediction power. Here we propose the MOst-Similar ligand-based Target inference approach, namely MOST, which uses fingerprint similarity and explicit bioactivity of the most-similar ligands to predict targets of the query compound. Performance of MOST was evaluated by using combinations of different fingerprint schemes, machine learning methods, and bioactivity representations. In sevenfold cross-validation with a benchmark Ki dataset from CHEMBL release 19 containing 61,937 bioactivity data points for 173 human targets, MOST achieved high average prediction accuracy (0.95 for pKi ≥ 5, and 0.87 for pKi ≥ 6). Morgan fingerprint was shown to be slightly better than FP2. Logistic Regression and Random Forest methods performed better than Naïve Bayes. In a temporal validation, the Ki dataset from CHEMBL19 was used to train models and predict the bioactivity of newly deposited ligands in CHEMBL20. MOST also performed well with high accuracy (0.90 for pKi ≥ 5, and 0.76 for pKi ≥ 6), when Logistic Regression and Morgan fingerprint were employed. Furthermore, the p values associated with explicit bioactivity were found to be a robust index for removing false positive predictions. Implicit bioactivity did not offer this capability. Finally, p values generated with Logistic Regression, Morgan fingerprint and explicit activity were integrated with a false discovery rate (FDR) control procedure to reduce false positives in the multiple-target prediction scenario, and the success of this strategy was demonstrated with a case of fluanisone
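
    The most-similar-ligand step can be sketched with RDKit: Morgan fingerprints, Tanimoto similarity, and inheritance of the nearest known ligand's target together with its explicit bioactivity. The reference ligands and pKi values below are made-up placeholders, and MOST's statistical machinery (p values, FDR control) is not reproduced.

    ```python
    # Hedged sketch of the "most-similar ligand" idea with RDKit.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    known = [  # (SMILES, target, pKi) -- hypothetical reference ligands
        ("CCOc1ccccc1C(=O)N", "TARGET_A", 6.8),
        ("CCN(CC)CCNC(=O)c1ccc(N)cc1", "TARGET_B", 7.4),
        ("c1ccc2c(c1)cccn2", "TARGET_C", 5.1),
    ]

    def fp(smiles):
        return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

    def most_similar_target(query_smiles):
        qfp = fp(query_smiles)
        best = max(known, key=lambda rec: DataStructs.TanimotoSimilarity(qfp, fp(rec[0])))
        sim = DataStructs.TanimotoSimilarity(qfp, fp(best[0]))
        return best[1], best[2], sim   # predicted target, its explicit pKi, similarity

    print(most_similar_target("CCOc1ccccc1C(=O)NC"))
    ```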

  12. Qualitative Study of Functional Groups and Antioxidant Properties of Soy-Based Beverages Compared to Cow Milk

    PubMed Central

    Durazzo, Alessandra; Gabrielli, Paolo; Manzi, Pamela

    2015-01-01

    Soy-based beverages are a source of high quality proteins and balanced nutrients; they thus represent an alternative to milk in cases of allergy to cow milk proteins or intolerance to lactose. In this research, antioxidant properties of soy-based beverages and UHT cow milk were studied. In addition, color parameters were studied by a quick and non-destructive methodology in order to verify a possible correlation with antioxidant properties, and a qualitative analysis of the major functional groups was carried out by Fourier Transform Infrared Spectroscopy (FTIR) in Attenuated Total Reflectance (ATR) mode. Extractable and hydrolysable polyphenols were studied in soy-based beverages. However, only the extractable fraction was studied in UHT milk, which was characterized by a small amount of polyphenols. All color parameters showed highly significant differences among soy-based beverages and between soy-based beverages and cow milk. FTIR-ATR spectra of soy-based beverages and cow milk showed several differences in the various regions depending on both the specific contribution of molecular groups and different food items. PMID:26783841

  13. Prediction-Market-Based Quantification of Climate Change Consensus and Uncertainty

    NASA Astrophysics Data System (ADS)

    Boslough, M.

    2012-12-01

    Intrade is an online trading exchange that includes climate prediction markets. One such family of contracts can be described as "Global temperature anomaly for 2012 to be greater than x °C or more," where the figure x ranges in increments of .05 from .30 to 1.10 (relative to the 1951-1980 base period), based on data published by NASA GISS. Each market will settle at 10.00 if the published global temperature anomaly for 2012 is equal to or greater than x, and will otherwise settle at 0.00. Similar contracts will be available for 2013. Global warming hypotheses can be cast as probabilistic predictions for future temperatures. The first modern such climate prediction is that of Broecker (1975), whose temperatures are easily separable from his CO2 growth scenario—which he overestimated—by interpolating his table of temperature as a function of CO2 concentration and projecting the current trend into the near future. For the current concentration of 395 ppm, Broecker's equilibrium temperature anomaly prediction relative to pre-industrial is 1.05 °C, or about 0.75 °C relative to the GISS base period. His neglect of lag in response to the changes in radiative forcing was partially compensated by his low sensitivity of 2.4 °C, leading to a slight overestimate. Simple linear extrapolation of the current trend since 1975 yields an estimate of .65 ± .09 °C (net warming of .95 °C) for anthropogenic global warming with a normal distribution of random natural variability. To evaluate an extreme case, we can estimate the prediction Broecker would have made if he had used the Lindzen & Choi (2009) climate sensitivity of 0.5 °C. The net post-industrial warming by 2012 would have been 0.21 °C, for an expected change of -0.09 from the GISS base period. This is the temperature to which the Earth would be expected to revert if the observed warming since the 19th century was merely due to random natural variability that coincidentally mimicked Broecker's anthropogenic
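
    The 0.21 °C figure quoted for the Lindzen & Choi sensitivity can be approximately reproduced under the assumption of the standard logarithmic CO2-forcing relation and a pre-industrial baseline near 295 ppm; Broecker's own 1.05 °C value came from interpolating his table, so the formula only approximates it.

    ```python
    # Hedged arithmetic check of the abstract's equilibrium-warming figures.
    # The logarithmic relation and the ~295 ppm baseline are assumptions.
    import math

    def equilibrium_warming(sensitivity_per_doubling, c_now, c_preindustrial=295.0):
        return sensitivity_per_doubling * math.log(c_now / c_preindustrial) / math.log(2)

    print(round(equilibrium_warming(2.4, 395.0), 2))   # Broecker-style sensitivity -> about 1.0 degC
    print(round(equilibrium_warming(0.5, 395.0), 2))   # Lindzen & Choi sensitivity -> about 0.21 degC
    ```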

  14. Prediction of redox-sensitive cysteines using sequential distance and other sequence-based features.

    PubMed

    Sun, Ming-An; Zhang, Qing; Wang, Yejun; Ge, Wei; Guo, Dianjing

    2016-08-24

    Reactive oxygen species can modify the structure and function of proteins and may also act as important signaling molecules in various cellular processes. Cysteine thiol groups of proteins are particularly susceptible to oxidation. Meanwhile, their reversible oxidation plays critical roles in redox regulation and signaling. Recently, several computational tools have been developed for predicting redox-sensitive cysteines; however, those methods either only focus on catalytic redox-sensitive cysteines in thiol oxidoreductases, or heavily depend on protein structural data, and thus cannot be widely used. In this study, we analyzed various sequence-based features potentially related to cysteine redox-sensitivity, and identified three types of features for efficient computational prediction of redox-sensitive cysteines. These features are: sequential distance to the nearby cysteines, PSSM profile and predicted secondary structure of flanking residues. After further feature selection using SVM-RFE, we developed Redox-Sensitive Cysteine Predictor (RSCP), an SVM-based classifier for redox-sensitive cysteine prediction using primary sequence only. Using 10-fold cross-validation on the RSC758 dataset, the accuracy, sensitivity, specificity, MCC and AUC were estimated as 0.679, 0.602, 0.756, 0.362 and 0.727, respectively. When evaluated using 10-fold cross-validation with the BALOSCTdb dataset, which has structure information, the model achieved performance comparable to the current structure-based method. Further validation using an independent dataset indicates it is robust and of relatively better accuracy for predicting redox-sensitive cysteines from non-enzyme proteins. In this study, we developed a sequence-based classifier for predicting redox-sensitive cysteines. The major advantage of this method is that it does not rely on protein structure data, which ensures more extensive application compared to other current implementations. Accurate prediction of redox-sensitive cysteines not
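
    The SVM-RFE selection step followed by an SVM classifier can be sketched with scikit-learn; the synthetic features below stand in for the PSSM, secondary-structure and cysteine-distance descriptors, and the number of features retained is an assumption.

    ```python
    # Sketch of SVM-RFE feature selection feeding an SVM classifier,
    # evaluated by 10-fold cross-validation as in the abstract.
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=60, n_informative=12, random_state=0)

    selector = RFE(SVC(kernel="linear"), n_features_to_select=20, step=5)  # SVM-RFE
    model = make_pipeline(selector, SVC(kernel="rbf"))
    print("10-fold CV accuracy:", cross_val_score(model, X, y, cv=10).mean())
    ```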

  15. Dissimilarity based Partial Least Squares (DPLS) for genomic prediction from SNPs.

    PubMed

    Singh, Priyanka; Engel, Jasper; Jansen, Jeroen; de Haan, Jorn; Buydens, Lutgarde Maria Celina

    2016-05-04

    Genomic prediction (GP) allows breeders to select plants and animals based on their breeding potential for desirable traits, without lengthy and expensive field trials or progeny testing. We have proposed to use Dissimilarity-based Partial Least Squares (DPLS) for GP. As a case study, we use the DPLS approach to predict Bacterial wilt (BW) in tomatoes using SNPs as predictors. The DPLS approach was compared with the Genomic Best-Linear Unbiased Prediction (GBLUP) and single-SNP regression with SNP as a fixed effect to assess the performance of DPLS. Eight genomic distance measures were used to quantify relationships between the tomato accessions from the SNPs. Subsequently, each of these distance measures was used to predict the BW using the DPLS prediction model. The DPLS model was found to be robust to the choice of distance measures; similar prediction performances were obtained for each distance measure. DPLS greatly outperformed the single-SNP regression approach, showing that BW is a comprehensive trait dependent on several loci. Next, the performance of the DPLS model was compared to that of GBLUP. Although GBLUP and DPLS are conceptually very different, the prediction quality (PQ) measured by DPLS models was similar to the prediction statistics obtained from GBLUP. A considerable advantage of DPLS is that the genotype-phenotype relationship can easily be visualized in a 2-D scatter plot. This so-called score-plot provides breeders with insight for selecting candidates for their future breeding program. DPLS is a highly appropriate method for GP. The model prediction performance was similar to the GBLUP and far better than the single-SNP approach. The proposed method can be used in combination with a wide range of genomic dissimilarity measures and genotype representations such as allele-count, haplotypes or allele-intensity values. Additionally, the data can be insightfully visualized by the DPLS model, allowing for selection of desirable candidates from the
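
    A minimal sketch of the DPLS idea: a genotype dissimilarity matrix is used as the predictor block of a PLS regression on the phenotype. The Hamming-type distance and synthetic SNP data below are stand-ins; the paper evaluates eight genomic distance measures and a real bacterial-wilt phenotype.

    ```python
    # Sketch of dissimilarity-based PLS for genomic prediction on toy data.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    snps = rng.integers(0, 3, size=(150, 500))          # allele counts 0/1/2 for 150 accessions
    phenotype = snps[:, :25].sum(axis=1) + rng.normal(scale=2.0, size=150)  # stand-in trait

    D = squareform(pdist(snps, metric="hamming"))       # accession-by-accession dissimilarity matrix
    pls = PLSRegression(n_components=5)
    print("CV R^2:", cross_val_score(pls, D, phenotype, cv=5).mean())
    ```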

  16. Predictive control strategies for wind turbine system based on permanent magnet synchronous generator.

    PubMed

    Maaoui-Ben Hassine, Ikram; Naouar, Mohamed Wissem; Mrabet-Bellaaj, Najiba

    2016-05-01

    In this paper, Model Predictive Control and Dead-beat predictive control strategies are proposed for the control of a PMSG based wind energy system. The proposed MPC considers the model of the converter-based system to forecast the possible future behavior of the controlled variables. It allows selecting the voltage vector to be applied that leads to a minimum error by minimizing a predefined cost function. The main features of the MPC are low current THD and robustness against parameter variations. The Dead-beat predictive control is based on the system model to compute the optimum voltage vector that ensures zero steady-state error. The optimum voltage vector is then applied through the Space Vector Modulation (SVM) technique. The main advantages of the Dead-beat predictive control are low current THD and constant switching frequency. The proposed control techniques are presented and detailed for the control of the back-to-back converter in a wind turbine system based on a PMSG. Simulation results (obtained in the Matlab-Simulink environment) and experimental results (obtained on a developed prototyping platform) are presented in order to show the performance of the considered control strategies. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
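
    A conceptual sketch of the finite-control-set MPC loop described above: each candidate converter voltage vector is fed through a simple one-step current model and the vector minimising a tracking cost is selected. The first-order machine model and all parameter values are illustrative assumptions, not the paper's converter model.

    ```python
    # Conceptual finite-control-set MPC sketch: predict next-step current
    # for each discrete voltage vector and pick the one with lowest cost.
    import numpy as np

    R, L, Ts = 0.5, 2e-3, 1e-4               # assumed resistance, inductance, sampling period
    voltage_vectors = [np.array([np.cos(k * np.pi / 3), np.sin(k * np.pi / 3)]) * 311
                       for k in range(6)] + [np.zeros(2)]   # six active + one zero vector

    def predict_current(i_now, v, e_back):
        """Euler discretisation of L di/dt = v - R i - e (alpha-beta frame)."""
        return i_now + Ts / L * (v - R * i_now - e_back)

    def best_vector(i_now, i_ref, e_back):
        costs = [np.sum((i_ref - predict_current(i_now, v, e_back)) ** 2)
                 for v in voltage_vectors]
        return voltage_vectors[int(np.argmin(costs))]

    i_now = np.array([2.0, -1.0])     # measured current
    i_ref = np.array([5.0, 0.0])      # reference current
    e_back = np.array([100.0, 50.0])  # back-EMF estimate
    print("selected voltage vector:", best_vector(i_now, i_ref, e_back))
    ```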

  17. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    PubMed Central

    Vukićević, Milan

    2014-01-01

    Rapid growth and storage of biomedical data enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101

  18. Requirements for Workflow-Based EHR Systems - Results of a Qualitative Study.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2016-01-01

    Today's high quality healthcare delivery strongly relies on efficient electronic health records (EHR). These EHR systems, or healthcare IT systems in general, are usually developed in a static manner according to a given workflow. Hence, they are not flexible enough to enable access to EHR data and to execute individual actions within a consultation. This paper reports on requirements identified by experts in the domain of diabetes mellitus for designing a system that supports dynamic workflows to serve personalization within a medical activity. Requirements were collected by means of expert interviews. These interviews completed a triangulation approach aimed at gathering requirements for workflow-based EHR interactions. The data from the interviews was analyzed through a qualitative approach, resulting in a set of requirements enhancing EHR functionality from the user's perspective. Requirements were classified according to four different categories: (1) process-related requirements, (2) information needs, (3) required functions, (4) non-functional requirements. Workflow-related requirements were identified which should be considered when developing and deploying EHR systems.

  19. Driver's mental workload prediction model based on physiological indices.

    PubMed

    Yan, Shengyuan; Tran, Cong Chi; Wei, Yingying; Habiyaremye, Jean Luc

    2017-09-15

    Developing an early warning model to predict the driver's mental workload (MWL) is critical and helpful, especially for new or less experienced drivers. The present study aims to investigate the correlation between new drivers' MWL and their work performance, regarding the number of errors. Additionally, the group method of data handling is used to establish the driver's MWL predictive model based on subjective rating (NASA task load index [NASA-TLX]) and six physiological indices. The results indicate that the NASA-TLX and the number of errors are positively correlated, and the predictive model demonstrates its validity with an R² value of 0.745. The proposed model is expected to provide new drivers with a reference value for their MWL from the physiological indices, so that driving lesson plans can be proposed to sustain an appropriate MWL as well as improve the driver's work performance.

  20. Sequence-Based Prediction of RNA-Binding Residues in Proteins.

    PubMed

    Walia, Rasna R; El-Manzalawy, Yasser; Honavar, Vasant G; Dobbs, Drena

    2017-01-01

    Identifying individual residues in the interfaces of protein-RNA complexes is important for understanding the molecular determinants of protein-RNA recognition and has many potential applications. Recent technical advances have led to several high-throughput experimental methods for identifying partners in protein-RNA complexes, but determining RNA-binding residues in proteins is still expensive and time-consuming. This chapter focuses on available computational methods for identifying which amino acids in an RNA-binding protein participate directly in contacting RNA. Step-by-step protocols for using three different web-based servers to predict RNA-binding residues are described. In addition, currently available web servers and software tools for predicting RNA-binding sites, as well as databases that contain valuable information about known protein-RNA complexes, RNA-binding motifs in proteins, and protein-binding recognition sites in RNA are provided. We emphasize sequence-based methods that can reliably identify interfacial residues without the requirement for structural information regarding either the RNA-binding protein or its RNA partner.

  1. Sequence-Based Prediction of RNA-Binding Residues in Proteins

    PubMed Central

    Walia, Rasna R.; EL-Manzalawy, Yasser; Honavar, Vasant G.; Dobbs, Drena

    2017-01-01

    Identifying individual residues in the interfaces of protein–RNA complexes is important for understanding the molecular determinants of protein–RNA recognition and has many potential applications. Recent technical advances have led to several high-throughput experimental methods for identifying partners in protein–RNA complexes, but determining RNA-binding residues in proteins is still expensive and time-consuming. This chapter focuses on available computational methods for identifying which amino acids in an RNA-binding protein participate directly in contacting RNA. Step-by-step protocols for using three different web-based servers to predict RNA-binding residues are described. In addition, currently available web servers and software tools for predicting RNA-binding sites, as well as databases that contain valuable information about known protein–RNA complexes, RNA-binding motifs in proteins, and protein-binding recognition sites in RNA are provided. We emphasize sequence-based methods that can reliably identify interfacial residues without the requirement for structural information regarding either the RNA-binding protein or its RNA partner. PMID:27787829

  2. A dynamic multi-scale Markov model based methodology for remaining life prediction

    NASA Astrophysics Data System (ADS)

    Yan, Jihong; Guo, Chaozhong; Wang, Xing

    2011-05-01

    The ability to accurately predict the remaining life of partially degraded components is crucial in prognostics. In this paper, a performance degradation index is designed using multi-feature fusion techniques to represent deterioration severities of facilities. Based on this indicator, an improved Markov model is proposed for remaining life prediction. The Fuzzy C-Means (FCM) algorithm is employed to perform state division for the Markov model in order to avoid the uncertainty of state division caused by the hard division approach. Considering the influence of both historical and real time data, a dynamic prediction method is introduced into the Markov model by a weighted coefficient. Multi-scale theory is employed to solve the state division problem of multi-sample prediction. Consequently, a dynamic multi-scale Markov model is constructed. An experiment is designed based on a Bently-RK4 rotor testbed to validate the dynamic multi-scale Markov model; experimental results illustrate the effectiveness of the methodology.
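
    The core of the approach, dividing a degradation index into states and using a Markov chain to estimate the time to the most-degraded state, can be sketched as below. Plain k-means stands in for the paper's Fuzzy C-Means division, the chain is single-scale and unweighted, and the synthetic index, the number of states and the absorbing-state assumption are all illustrative.

    ```python
    # Markov-chain remaining-life sketch from a synthetic degradation index.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    index = np.cumsum(rng.random(300) * 0.01)          # monotone degradation index

    n_states = 4
    labels = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(index.reshape(-1, 1))
    order = np.argsort([index[labels == k].mean() for k in range(n_states)])
    labels = np.argsort(order)[labels]                 # state 0 = healthiest ... n-1 = most degraded

    # Empirical one-step transition matrix
    P = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        P[a, b] += 1
    P = P / np.maximum(P.sum(axis=1, keepdims=True), 1)

    # Treat the most-degraded state as absorbing and solve (I - Q) t = 1 for the
    # expected number of steps to reach it from each healthier state.
    Q = P[:-1, :-1]
    t = np.linalg.solve(np.eye(n_states - 1) - Q, np.ones(n_states - 1))
    print("expected steps to the most-degraded state, per starting state:", np.round(t, 1))
    ```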

  3. Health services for survivors of gender-based violence in northern Uganda: a qualitative study.

    PubMed

    Henttonen, Mirkka; Watts, Charlotte; Roberts, Bayard; Kaducu, Felix; Borchert, Matthias

    2008-05-01

    The 20-year war in northern Uganda has resulted in up to 1.7 million people being internally displaced, and impoverishment and vulnerability to violence amongst the civilian population. This qualitative study examined the status of health services available for the survivors of gender-based violence in the Gulu district, northern Uganda. Semi-structured interviews were carried out in 2006 with 26 experts on gender-based violence and general health providers, and availability of medical supplies was reviewed. The Inter-Agency Standing Committee (IASC) guidelines on gender-based violence interventions in humanitarian settings were used to prepare the interview guides and analyse the findings. Some legislation and programmes do exist on gender-based violence. However, health facilities lacked sufficiently qualified staff and medical supplies to adequately detect and manage survivors, and confidential treatment and counselling could not be ensured. There was inter-sectoral collaboration, but greater resources are required to increase coverage and effectiveness of services. Intimate partner violence, sexual abuse of girls aged under 18, sexual harassment and early and forced marriage may be more common than rape by strangers. As the IASC guidelines focus on sexual violence by strangers and do not address other forms of gender-based violence, we suggest the need to explore this issue further to determine whether a broader concept of gender-based violence should be incorporated into the guidelines.

  4. Risk Prediction Models in Psychiatry: Toward a New Frontier for the Prevention of Mental Illnesses.

    PubMed

    Bernardini, Francesco; Attademo, Luigi; Cleary, Sean D; Luther, Charles; Shim, Ruth S; Quartesan, Roberto; Compton, Michael T

    2017-05-01

    We conducted a systematic, qualitative review of risk prediction models designed and tested for depression, bipolar disorder, generalized anxiety disorder, posttraumatic stress disorder, and psychotic disorders. Our aim was to understand the current state of research on risk prediction models for these 5 disorders and thus future directions as our field moves toward embracing prediction and prevention. Systematic searches of the entire MEDLINE electronic database were conducted independently by 2 of the authors (from 1960 through 2013) in July 2014 using defined search criteria. Search terms included risk prediction, predictive model, or prediction model combined with depression, bipolar, manic depressive, generalized anxiety, posttraumatic, PTSD, schizophrenia, or psychosis. We identified 268 articles based on the search terms and 3 criteria: published in English, provided empirical data (as opposed to review articles), and presented results pertaining to developing or validating a risk prediction model in which the outcome was the diagnosis of 1 of the 5 aforementioned mental illnesses. We selected 43 original research reports as a final set of articles to be qualitatively reviewed. The 2 independent reviewers abstracted 3 types of data (sample characteristics, variables included in the model, and reported model statistics) and reached consensus regarding any discrepant abstracted information. Twelve reports described models developed for prediction of major depressive disorder, 1 for bipolar disorder, 2 for generalized anxiety disorder, 4 for posttraumatic stress disorder, and 24 for psychotic disorders. Most studies reported on sensitivity, specificity, positive predictive value, negative predictive value, and area under the (receiver operating characteristic) curve. Recent studies demonstrate the feasibility of developing risk prediction models for psychiatric disorders (especially psychotic disorders). The field must now advance by (1) conducting more large
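
    Most of the reviewed models are summarized with the metrics named above. The short sketch below computes sensitivity, specificity, PPV, NPV and the area under the ROC curve for a toy vector of predicted risks; the threshold and data are illustrative assumptions.

    ```python
    # Standard classification metrics for a toy set of predicted risks.
    import numpy as np
    from sklearn.metrics import confusion_matrix, roc_auc_score

    y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 0, 1])       # observed diagnoses
    risk = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.65, 0.3, 0.9, 0.15, 0.7])
    y_pred = (risk >= 0.5).astype(int)                       # assumed decision threshold

    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    print("sensitivity:", tp / (tp + fn))
    print("specificity:", tn / (tn + fp))
    print("PPV:", tp / (tp + fp))
    print("NPV:", tn / (tn + fn))
    print("AUC:", roc_auc_score(y_true, risk))
    ```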

  5. A multi-valued neutrosophic qualitative flexible approach based on likelihood for multi-criteria decision-making problems

    NASA Astrophysics Data System (ADS)

    Peng, Juan-juan; Wang, Jian-qiang; Yang, Wu-E.

    2017-01-01

    In this paper, multi-criteria decision-making (MCDM) problems based on the qualitative flexible multiple criteria method (QUALIFLEX), in which the criteria values are expressed by multi-valued neutrosophic information, are investigated. First, multi-valued neutrosophic sets (MVNSs), which allow the truth-membership function, indeterminacy-membership function and falsity-membership function to have a set of crisp values between zero and one, are introduced. Then the likelihood of multi-valued neutrosophic number (MVNN) preference relations is defined and the corresponding properties are also discussed. Finally, an extended QUALIFLEX approach based on likelihood is explored to solve MCDM problems where the assessments of alternatives are in the form of MVNNs; furthermore an example is provided to illustrate the application of the proposed method, together with a comparison analysis.

  6. Ab Initio-Based Predictions of Hydrocarbon Combustion Chemistry

    DTIC Science & Technology

    2015-07-15

    There are two prime objectives of the research. One is to develop and apply efficient methods for using ab initio potential energy surfaces (PESs)... Final Report: Ab Initio-Based Predictions of Hydrocarbon Combustion Chemistry (approved for public release; distribution unlimited). Keywords: hydrocarbon combustion, ab initio quantum chemistry, potential energy surfaces, chemical...

  7. A deep learning-based multi-model ensemble method for cancer prediction.

    PubMed

    Xiao, Yawen; Wu, Jun; Lin, Zongli; Zhao, Xiaodong

    2018-01-01

    Cancer is a complex worldwide health problem associated with high mortality. With the rapid development of the high-throughput sequencing technology and the application of various machine learning methods that have emerged in recent years, progress in cancer prediction has been increasingly made based on gene expression, providing insight into effective and accurate treatment decision making. Thus, developing machine learning methods, which can successfully distinguish cancer patients from healthy persons, is of great current interest. However, among the classification methods applied to cancer prediction so far, no one method outperforms all the others. In this paper, we demonstrate a new strategy, which applies deep learning to an ensemble approach that incorporates multiple different machine learning models. We supply informative gene data selected by differential gene expression analysis to five different classification models. Then, a deep learning method is employed to ensemble the outputs of the five classifiers. The proposed deep learning-based multi-model ensemble method was tested on three public RNA-seq data sets of three kinds of cancers, Lung Adenocarcinoma, Stomach Adenocarcinoma and Breast Invasive Carcinoma. The test results indicate that it increases the prediction accuracy of cancer for all the tested RNA-seq data sets as compared to using a single classifier or the majority voting algorithm. By taking full advantage of different classifiers, the proposed deep learning-based multi-model ensemble method is shown to be accurate and effective for cancer prediction. Copyright © 2017 Elsevier B.V. All rights reserved.
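
    The strategy described above, several heterogeneous classifiers whose outputs are combined by a neural network, can be sketched as a stacking ensemble. The synthetic data, the five base learners and the shallow MLP standing in for the paper's deep-learning ensembler are illustrative assumptions.

    ```python
    # Stacking sketch: five base classifiers ensembled by a neural-network meta-learner.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=600, n_features=50, n_informative=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    base_models = [
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
        ("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
    ]
    ensemble = StackingClassifier(
        estimators=base_models,
        final_estimator=MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
        stack_method="predict_proba",              # feed class probabilities to the meta-learner
    )
    ensemble.fit(X_tr, y_tr)
    print("ensemble accuracy:", round(ensemble.score(X_te, y_te), 3))
    ```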

  8. Predictability of depression severity based on posterior alpha oscillations.

    PubMed

    Jiang, H; Popov, T; Jylänki, P; Bi, K; Yao, Z; Lu, Q; Jensen, O; van Gerven, M A J

    2016-04-01

    We aimed to integrate neural data and an advanced machine learning technique to predict individual major depressive disorder (MDD) patient severity. MEG data was acquired from 22 MDD patients and 22 healthy controls (HC) resting awake with eyes closed. Individual power spectra were calculated by a Fourier transform. Sources were reconstructed via beamforming technique. Bayesian linear regression was applied to predict depression severity based on the spatial distribution of oscillatory power. In MDD patients, decreased theta (4-8 Hz) and alpha (8-14 Hz) power was observed in fronto-central and posterior areas respectively, whereas increased beta (14-30 Hz) power was observed in fronto-central regions. In particular, posterior alpha power was negatively related to depression severity. The Bayesian linear regression model showed significant depression severity prediction performance based on the spatial distribution of both alpha (r=0.68, p=0.0005) and beta power (r=0.56, p=0.007) respectively. Our findings point to a specific alteration of oscillatory brain activity in MDD patients during rest as characterized from MEG data in terms of spectral and spatial distribution. The proposed model yielded a quantitative and objective estimation for the depression severity, which in turn has a potential for diagnosis and monitoring of the recovery process. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
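
    A minimal sketch of the final modelling step, Bayesian linear regression from source-level band power to a severity score, evaluated by the correlation between predicted and observed values. The simulated 22-subject data and the leave-one-out scheme are illustrative assumptions, not the study's data.

    ```python
    # Bayesian linear regression of a severity score on band-power features.
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.linear_model import BayesianRidge
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(1)
    n_subjects, n_sources = 22, 40
    alpha_power = rng.normal(size=(n_subjects, n_sources))    # simulated source-level alpha power
    severity = -alpha_power[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_subjects)

    pred = np.zeros(n_subjects)
    for train, test in LeaveOneOut().split(alpha_power):
        model = BayesianRidge().fit(alpha_power[train], severity[train])
        pred[test] = model.predict(alpha_power[test])

    r, p = pearsonr(pred, severity)
    print(f"predicted vs. observed severity: r = {r:.2f}, p = {p:.3g}")
    ```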

  9. Factors influencing protein tyrosine nitration--structure-based predictive models.

    PubMed

    Bayden, Alexander S; Yakovlev, Vasily A; Graves, Paul R; Mikkelsen, Ross B; Kellogg, Glen E

    2011-03-15

    Models for exploring tyrosine nitration in proteins have been created based on 3D structural features of 20 proteins for which high-resolution X-ray crystallographic or NMR data are available and for which nitration of 35 total tyrosines has been experimentally proven under oxidative stress. Factors suggested in previous work to enhance nitration were examined with quantitative structural descriptors. The role of neighboring acidic and basic residues is complex: for the majority of tyrosines that are nitrated the distance to the heteroatom of the closest charged side chain corresponds to the distance needed for suspected nitrating species to form hydrogen bond bridges between the tyrosine and that charged amino acid. This suggests that such bridges play a very important role in tyrosine nitration. Nitration is generally hindered for tyrosines that are buried and for those tyrosines for which there is insufficient space for the nitro group. For in vitro nitration, closed environments with nearby heteroatoms or unsaturated centers that can stabilize radicals are somewhat favored. Four quantitative structure-based models, depending on the conditions of nitration, have been developed for predicting site-specific tyrosine nitration. The best model, relevant for both in vitro and in vivo cases, predicts 30 of 35 tyrosine nitrations (positive predictive value) and has a sensitivity of 60/71 (11 false positives). Copyright © 2010 Elsevier Inc. All rights reserved.

  10. A link prediction method for heterogeneous networks based on BP neural network

    NASA Astrophysics Data System (ADS)

    Li, Ji-chao; Zhao, Dan-ling; Ge, Bing-Feng; Yang, Ke-Wei; Chen, Ying-Wu

    2018-04-01

    Most real-world systems, composed of different types of objects connected via many interconnections, can be abstracted as various complex heterogeneous networks. Link prediction for heterogeneous networks is of great significance for mining missing links and reconfiguring networks according to observed information, with considerable applications in, for example, friend and location recommendations and disease-gene candidate detection. In this paper, we put forward a novel integrated framework, called MPBP (Meta-Path feature-based BP neural network model), to predict multiple types of links for heterogeneous networks. More specifically, the concept of meta-path is introduced, followed by the extraction of meta-path features for heterogeneous networks. Next, based on the extracted meta-path features, a supervised link prediction model is built with a three-layer BP neural network. Then, the solution algorithm of the proposed link prediction model is put forward to obtain predicted results by iteratively training the network. Last, numerical experiments on example datasets of a gene-disease network and a combat network are conducted to verify the effectiveness and feasibility of the proposed MPBP. The results show that the MPBP performs very well and is superior to the baseline methods.
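
    A minimal sketch of the supervised step, a small neural network with one hidden layer (i.e. a three-layer BP-style network) trained on per-pair meta-path features to predict link existence. The synthetic meta-path counts and the network sizes are illustrative assumptions.

    ```python
    # Link prediction from meta-path features with a three-layer neural network.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n_pairs, n_meta_paths = 2000, 6
    X = rng.poisson(lam=2.0, size=(n_pairs, n_meta_paths)).astype(float)   # meta-path counts per node pair
    logits = X @ np.array([0.8, -0.3, 0.5, 0.0, 0.2, -0.1]) - 2.0
    y = (rng.random(n_pairs) < 1.0 / (1.0 + np.exp(-logits))).astype(int)  # observed links

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
    mlp.fit(X_tr, y_tr)
    print("link-prediction AUC:", round(roc_auc_score(y_te, mlp.predict_proba(X_te)[:, 1]), 3))
    ```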

  11. Junior High School Physics: Using a Qualitative Strategy for Successful Problem Solving

    ERIC Educational Resources Information Center

    Mualem, Roni; Eylon, Bat Sheva

    2010-01-01

    Students at the junior high school (JHS) level often cannot use their knowledge of physics for explaining and predicting phenomena. We claim that this difficulty stems from the fact that explanations are multi-step reasoning tasks, and students often lack the qualitative problem-solving strategies needed to guide them. This article describes a new…

  12. The experience and management of emotions on an inpatient setting for people with anorexia nervosa: a qualitative study.

    PubMed

    Pemberton, Kathryn; Fox, John R E

    2013-01-01

    Research has identified how people with anorexia nervosa (AN) have problematic relationships with their own emotions, which can impact recovery. The aim of this study was to understand factors that were important in the care and emotional management of people with eating disorders on an inpatient unit. Semi-structured interviews were conducted with eight participants with AN. Interview transcripts were analysed using a qualitative approach that was based upon interpretative phenomenological analysis, but also incorporated a theoretical component. From the qualitative analysis, two overarching and related themes were developed: 'difficulty with emotion' and 'predictability and care'. These were underpinned by a number of theoretically important constructs, such as 'staff factors', 'understanding of emotion', 'validity of emotion' and 'looking for ideal care'. Results suggested that the management strategies employed by some staff could serve to maintain eating disorder symptoms, whilst patient factors were also important as they negatively affected staff's capacity to care for this patient group. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Challenges in conducting qualitative research in health: A conceptual paper.

    PubMed

    Khankeh, Hamidreza; Ranjbar, Maryam; Khorasani-Zavareh, Davoud; Zargham-Boroujeni, Ali; Johansson, Eva

    2015-01-01

    Qualitative research focuses on social world and provides the tools to study health phenomena from the perspective of those experiencing them. Identifying the problem, forming the question, and selecting an appropriate methodology and design are some of the initial challenges that researchers encounter in the early stages of any research project. These problems are particularly common for novices. This article describes the practical challenges of using qualitative inquiry in the field of health and the challenges of performing an interpretive research based on professional experience as a qualitative researcher and on available literature. One of the main topics discussed is the nature of qualitative research, its inherent challenges, and how to overcome them. Some of those highlighted here include: identification of the research problem, formation of the research question/aim, and selecting an appropriate methodology and research design, which are the main concerns of qualitative researchers and need to be handled properly. Insights from real-life experiences in conducting qualitative research in health reveal these issues. The paper provides personal comments on the experiences of a researcher in conducting pure qualitative research in the field of health. It offers insights into the practical difficulties encountered when performing qualitative studies and offers solutions and alternatives applied by these authors, which may be of use to others.

  14. YouTube as a Qualitative Research Asset: Reviewing User Generated Videos as Learning Resources

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2011-01-01

    YouTube, the video hosting service, offers students, teachers, and practitioners of qualitative researchers a unique reservoir of video clips introducing basic qualitative research concepts, sharing qualitative data from interviews and field observations, and presenting completed research studies. This web-based site also affords qualitative…

  15. Internal Medicine Residents' Perceptions of Team-Based Care and its Educational Value in the Continuity Clinic: A Qualitative Study.

    PubMed

    Soones, Tacara N; O'Brien, Bridget C; Julian, Katherine A

    2015-09-01

    In order to teach residents how to work in interprofessional teams, educators in graduate medical education are implementing team-based care models in resident continuity clinics. However, little is known about the impact of interprofessional teams on residents' education in the ambulatory setting. To identify factors affecting residents' experience of team-based care within continuity clinics and the impact of these teams on residents' education. This was a qualitative study of focus groups with internal medicine residents. Seventy-seven internal medicine residents at the University of California San Francisco at three continuity clinic sites participated in the study. Qualitative interviews were audiotaped and transcribed. The authors used a general inductive approach with sensitizing concepts in four frames (structural, human resources, political and symbolic) to develop codes and identify themes. Residents believed that team-based care improves continuity and quality of care. Factors in four frames affected their ability to achieve these goals. Structural factors included communication through the electronic medical record, consistent schedules and regular team meetings. Human resources factors included the presence of stable teams and clear roles. Political and symbolic factors negatively impacted team-based care, and included low staffing ratios and a culture of ultimate resident responsibility, respectively. Regardless of the presence of these factors or resident perceptions of their teams, residents did not see the practice of interprofessional team-based care as intrinsically educational. Residents' experiences practicing team-based care are influenced by many principles described in the interprofessional teamwork literature, including understanding team members' roles, good communication and sufficient staffing. However, these attributes are not correlated with residents' perceptions of the educational value of team-based care. Including residents in

  16. Prediction of Marital Satisfaction Based on Emotional Intelligence in Postmenopausal Women.

    PubMed

    Heidari, Mohammad; Shahbazi, Sara; Ghafourifard, Mansour; Ali Sheikhi, Rahim

    2017-12-01

    This study was conducted with the aim of predicting marital satisfaction based on emotional intelligence in postmenopausal women. This cross-sectional, descriptive-correlational study, with a sample size of 134 people, was conducted in the city of Borujen to predict marital satisfaction based on emotional intelligence in postmenopausal women. The subjects were selected by convenience sampling. Data collection tools included an emotional intelligence questionnaire (Bar-On) and the Enrich marital satisfaction questionnaire. The results of this study showed a significant positive relationship between marital satisfaction and emotional intelligence ( P < 0.05, r = 0.25). Also, regression analysis showed that emotional intelligence ( β = 0.31) can positively and significantly predict marital satisfaction. Given the positive relationship between emotional intelligence and marital satisfaction, adequate emotional intelligence appears to be an important structural element of marital satisfaction. It therefore seems that measuring emotional intelligence during menopause could guide appropriate actions to reinforce marital satisfaction.

  17. Model-based prediction of nephropathia epidemica outbreaks based on climatological and vegetation data and bank vole population dynamics.

    PubMed

    Haredasht, S Amirpour; Taylor, C J; Maes, P; Verstraeten, W W; Clement, J; Barrios, M; Lagrou, K; Van Ranst, M; Coppin, P; Berckmans, D; Aerts, J-M

    2013-11-01

    Wildlife-originated zoonotic diseases in general are a major contributor to emerging infectious diseases. Hantaviruses more specifically cause thousands of human disease cases annually worldwide, while understanding and predicting human hantavirus epidemics pose numerous unsolved challenges. Nephropathia epidemica (NE) is a human infection caused by Puumala virus, which is naturally carried and shed by bank voles (Myodes glareolus). The objective of this study was to develop a method that allows model-based prediction of the occurrence of NE epidemics 3 months ahead. Two data sets were utilized to develop and test the models. These data sets were concerned with NE cases in Finland and Belgium. In this study, we selected the most relevant inputs from all the available data for use in a dynamic linear regression (DLR) model. The number of NE cases in Finland was modelled using data from 1996 to 2008. The NE cases were predicted based on the time series data of average monthly air temperature (°C) and the bank voles' trapping index using a DLR model. The bank voles' trapping index data were interpolated using a related dynamic harmonic regression (DHR) model. Here, the DLR and DHR models used time-varying parameters. Both the DHR and DLR models were based on a unified state-space estimation framework. For the Belgian case, no time series of the bank voles' population dynamics were available. Several studies, however, have suggested that the population of bank voles is related to the variation in seed production of beech and oak trees in Northern Europe. Therefore, the NE occurrence pattern in Belgium was predicted based on a DLR model by using remotely sensed phenology parameters of broad-leaved forests, together with the oak and beech seed categories and average monthly air temperature (°C), using data from 2001 to 2009. Our results suggest that even without any knowledge about hantavirus dynamics in the host population, the time variation in NE outbreaks in Finland
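
    A minimal sketch of a dynamic linear regression with time-varying coefficients, here estimated by a random-walk Kalman filter, relating monthly case counts to covariates observed three months earlier. All data, lags and noise settings are illustrative assumptions; the study itself uses a unified state-space framework with temperature and vole-index (or phenology) inputs.

    ```python
    # Dynamic linear regression sketch with time-varying coefficients (Kalman filter).
    import numpy as np

    rng = np.random.default_rng(3)
    T = 120                                            # months of synthetic data
    temp = 10 + 10 * np.sin(2 * np.pi * np.arange(T) / 12) + rng.normal(0, 1, T)
    voles = np.clip(rng.normal(5, 2, T), 0, None)      # stand-in for the trapping index
    cases = np.zeros(T)
    for t in range(3, T):                              # cases depend on inputs 3 months earlier
        cases[t] = 0.3 * voles[t - 3] - 0.05 * temp[t - 3] + rng.normal(0, 0.3)

    beta = np.zeros(2)                                 # time-varying regression coefficients (state)
    P = np.eye(2)                                      # state covariance
    Q, Rv = 1e-4 * np.eye(2), 0.3 ** 2                 # random-walk and observation noise (assumed)
    preds = np.zeros(T)
    for t in range(3, T):
        x = np.array([voles[t - 3], temp[t - 3]])      # 3-month-ahead regressors
        P = P + Q                                      # random-walk evolution of the coefficients
        preds[t] = x @ beta                            # prediction made before seeing cases[t]
        K = P @ x / (x @ P @ x + Rv)                   # Kalman gain
        beta = beta + K * (cases[t] - preds[t])        # coefficient update
        P = P - np.outer(K, x) @ P
    print("correlation of 3-month-ahead predictions with cases:",
          round(float(np.corrcoef(preds[3:], cases[3:])[0, 1]), 2))
    ```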

  18. A Genomics-Based Model for Prediction of Severe Bioprosthetic Mitral Valve Calcification.

    PubMed

    Ponasenko, Anastasia V; Khutornaya, Maria V; Kutikhin, Anton G; Rutkovskaya, Natalia V; Tsepokina, Anna V; Kondyukova, Natalia V; Yuzhalin, Arseniy E; Barbarash, Leonid S

    2016-08-31

    Severe bioprosthetic mitral valve calcification is a significant problem in cardiovascular surgery. Unfortunately, clinical markers did not demonstrate efficacy in prediction of severe bioprosthetic mitral valve calcification. Here, we examined whether a genomics-based approach is efficient in predicting the risk of severe bioprosthetic mitral valve calcification. A total of 124 consecutive Russian patients who underwent mitral valve replacement surgery were recruited. We investigated the associations of the inherited variation in innate immunity, lipid metabolism and calcium metabolism genes with severe bioprosthetic mitral valve calcification. Genotyping was conducted utilizing the TaqMan assay. Eight gene polymorphisms were significantly associated with severe bioprosthetic mitral valve calcification and were therefore included into stepwise logistic regression which identified male gender, the T/T genotype of the rs3775073 polymorphism within the TLR6 gene, the C/T genotype of the rs2229238 polymorphism within the IL6R gene, and the A/A genotype of the rs10455872 polymorphism within the LPA gene as independent predictors of severe bioprosthetic mitral valve calcification. The developed genomics-based model had fair predictive value with area under the receiver operating characteristic (ROC) curve of 0.73. In conclusion, our genomics-based approach is efficient for the prediction of severe bioprosthetic mitral valve calcification.
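
    A minimal sketch of the final modelling step, a logistic regression on gender plus a few genotype indicators evaluated by the area under the ROC curve. The data are simulated; the column comments mirror the predictors named above but carry no real genotype information.

    ```python
    # Logistic-regression risk model with cross-validated ROC AUC on simulated data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    n = 124
    X = np.column_stack([
        rng.integers(0, 2, n),    # male gender
        rng.integers(0, 2, n),    # TLR6 rs3775073 T/T genotype (indicator)
        rng.integers(0, 2, n),    # IL6R rs2229238 C/T genotype (indicator)
        rng.integers(0, 2, n),    # LPA rs10455872 A/A genotype (indicator)
    ]).astype(float)
    risk = 1.0 / (1.0 + np.exp(-(X @ np.array([0.8, 0.9, 0.7, 1.1]) - 1.5)))
    y = (rng.random(n) < risk).astype(int)             # simulated severe-calcification outcome

    probs = cross_val_predict(LogisticRegression(), X, y, cv=5, method="predict_proba")[:, 1]
    print("cross-validated ROC AUC:", round(roc_auc_score(y, probs), 2))
    ```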

  19. A Genomics-Based Model for Prediction of Severe Bioprosthetic Mitral Valve Calcification

    PubMed Central

    Ponasenko, Anastasia V.; Khutornaya, Maria V.; Kutikhin, Anton G.; Rutkovskaya, Natalia V.; Tsepokina, Anna V.; Kondyukova, Natalia V.; Yuzhalin, Arseniy E.; Barbarash, Leonid S.

    2016-01-01

    Severe bioprosthetic mitral valve calcification is a significant problem in cardiovascular surgery. Unfortunately, clinical markers did not demonstrate efficacy in prediction of severe bioprosthetic mitral valve calcification. Here, we examined whether a genomics-based approach is efficient in predicting the risk of severe bioprosthetic mitral valve calcification. A total of 124 consecutive Russian patients who underwent mitral valve replacement surgery were recruited. We investigated the associations of the inherited variation in innate immunity, lipid metabolism and calcium metabolism genes with severe bioprosthetic mitral valve calcification. Genotyping was conducted utilizing the TaqMan assay. Eight gene polymorphisms were significantly associated with severe bioprosthetic mitral valve calcification and were therefore included into stepwise logistic regression which identified male gender, the T/T genotype of the rs3775073 polymorphism within the TLR6 gene, the C/T genotype of the rs2229238 polymorphism within the IL6R gene, and the A/A genotype of the rs10455872 polymorphism within the LPA gene as independent predictors of severe bioprosthetic mitral valve calcification. The developed genomics-based model had fair predictive value with area under the receiver operating characteristic (ROC) curve of 0.73. In conclusion, our genomics-based approach is efficient for the prediction of severe bioprosthetic mitral valve calcification. PMID:27589735

  20. Adaptive Granulation-Based Prediction for Energy System of Steel Industry.

    PubMed

    Wang, Tianyu; Han, Zhongyang; Zhao, Jun; Wang, Wei

    2018-01-01

    The flow variation tendency of byproduct gas plays a crucial role for energy scheduling in the steel industry. An accurate prediction of its future trends will be significantly beneficial for the economic profits of the steel enterprise. In this paper, a long-term prediction model for the energy system is proposed by providing an adaptive granulation-based method that considers the production semantics involved in the fluctuation tendency of the energy data, and partitions them into a series of information granules. To fully reflect the corresponding data characteristics of the formed unequal-length temporal granules, a 3-D feature space consisting of the timespan, the amplitude and the linetype is designed as linguistic descriptors. In particular, a collaborative-conditional fuzzy clustering method is proposed to granularize the tendency-based feature descriptors and specifically measure the amplitude variation of industrial data, which plays a dominant role in the feature space. To quantify the performance of the proposed method, a series of real-world industrial data coming from the energy data center of a steel plant is employed to conduct the comparative experiments. The experimental results demonstrate that the proposed method successfully satisfies the requirements of practically viable prediction.

  1. Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information.

    PubMed

    Wang, Jingbin; Wang, Xiaohong; Wang, Lizhi

    2017-09-15

    Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system.
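
    The paper builds a full Bayesian network; the sketch below only illustrates the simpler underlying idea of combining a degradation (wear-out) failure mode with an accidental (random) failure mode per unit and propagating the result to a series system. The distributions and parameters are illustrative assumptions, not the paper's model.

    ```python
    # Competing-risks sketch: degradation + accidental failure per unit, series system.
    import numpy as np

    t = np.linspace(0, 5000, 200)                      # hours

    def survival(weibull_scale, weibull_shape, accidental_rate):
        degradation = np.exp(-(t / weibull_scale) ** weibull_shape)   # wear-out mode
        accidental = np.exp(-accidental_rate * t)                     # random-failure mode
        return degradation * accidental                               # independent competing risks

    units = [survival(4000, 2.5, 1e-5), survival(6000, 3.0, 2e-5), survival(5000, 2.0, 1e-5)]
    system = np.prod(units, axis=0)                    # series system: every unit must survive
    mean_life = np.sum(system) * (t[1] - t[0])         # area under the survival curve
    print(f"predicted system mean life ~ {mean_life:.0f} h")
    ```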

  2. Context-sensitive network-based disease genetics prediction and its implications in drug discovery.

    PubMed

    Chen, Yang; Xu, Rong

    2017-04-01

    Disease phenotype networks play an important role in computational approaches to identifying new disease-gene associations. Current disease phenotype networks often model disease relationships based on pairwise similarities and therefore ignore the specific context of how two diseases are connected. In this study, we propose a new strategy to model disease associations using context-sensitive networks (CSNs). We developed a CSN-based phenome-driven approach for disease genetics prediction, and investigated the translational potential of the predicted genes in drug discovery. We constructed CSNs by directly connecting diseases with associated phenotypes. Here, we constructed two CSNs using different data sources; the two networks contain 26 790 and 13 822 nodes respectively. We integrated the CSNs with a genetic functional relationship network and predicted disease genes using a network-based ranking algorithm. For comparison, we built Similarity-Based disease Networks (SBN) using the same disease phenotype data. In a de novo cross validation for 3324 diseases, the CSN-based approach significantly increased the average rank from top 12.6% to top 8.8% for all tested genes compared with the SBN-based approach (p < …). We predicted genes for Parkinson's disease using CSNs, and demonstrated that the top-ranked genes are highly relevant to PD pathogenesis. We pinpointed a top-ranked drug target gene for PD, and found its association with neurodegeneration supported by literature. In summary, CSNs significantly improve disease genetics prediction compared with SBNs and provide leads for potential drug targets. Availability: nlp.case.edu/public/data/. Contact: rxx@case.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
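
    The abstract does not specify the network-based ranking algorithm; the sketch below uses random walk with restart, a common choice for ranking candidate genes in a functional network, on a toy adjacency matrix and seed set that are purely illustrative.

    ```python
    # Gene ranking by random walk with restart on a toy functional network.
    import numpy as np

    A = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0],
                  [1, 1, 0, 0, 1],
                  [0, 1, 0, 0, 1],
                  [0, 0, 1, 1, 0]], dtype=float)       # toy gene functional network
    W = A / A.sum(axis=0)                              # column-normalised transition matrix
    seed = np.array([1, 0, 0, 0, 0], dtype=float)      # phenotype-associated seed gene(s)

    r, restart = seed.copy(), 0.3
    for _ in range(100):                               # iterate toward the stationary distribution
        r = (1 - restart) * W @ r + restart * seed
    print("ranking scores:", np.round(r, 3))
    print("genes ranked best-first:", np.argsort(-r))
    ```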

  3. Context-sensitive network-based disease genetics prediction and its implications in drug discovery

    PubMed Central

    Chen, Yang; Xu, Rong

    2017-01-01

    Abstract Motivation: Disease phenotype networks play an important role in computational approaches to identifying new disease-gene associations. Current disease phenotype networks often model disease relationships based on pairwise similarities and therefore ignore the specific context of how two diseases are connected. In this study, we propose a new strategy to model disease associations using context-sensitive networks (CSNs). We developed a CSN-based phenome-driven approach for disease genetics prediction, and investigated the translational potential of the predicted genes in drug discovery. Results: We constructed CSNs by directly connecting diseases with associated phenotypes. Here, we constructed two CSNs using different data sources; the two networks contain 26 790 and 13 822 nodes respectively. We integrated the CSNs with a genetic functional relationship network and predicted disease genes using a network-based ranking algorithm. For comparison, we built Similarity-Based disease Networks (SBN) using the same disease phenotype data. In a de novo cross validation for 3324 diseases, the CSN-based approach significantly increased the average rank from top 12.6% to top 8.8% for all tested genes compared with the SBN-based approach (p < …). We predicted genes for Parkinson’s disease using CSNs, and demonstrated that the top-ranked genes are highly relevant to PD pathogenesis. We pinpointed a top-ranked drug target gene for PD, and found its association with neurodegeneration supported by literature. In summary, CSNs significantly improve disease genetics prediction compared with SBNs and provide leads for potential drug targets. Availability and Implementation: nlp.case.edu/public/data/ Contact: rxx@case.edu PMID:28062449

  4. Impact of Participation in TimeSlips, a Creative Group-Based Storytelling Program, on Medical Student Attitudes Toward Persons With Dementia: A Qualitative Study

    PubMed Central

    George, Daniel R.; Stuckey, Heather L.; Dillon, Caroline F.; Whitehead, Megan M.

    2011-01-01

    Purpose: To evaluate whether medical student participation in TimeSlips (TS), a creative group-based storytelling program, with persons affected by dementia would improve student attitudes toward this patient population. Design and Methods: Fifteen fourth-year medical students from Penn State College of Medicine participated in a month-long regimen of TS sessions at a retirement community. Student course evaluations were analyzed at the conclusion of the program to examine perceived qualitative changes in attitude. Findings: Qualitative data revealed insights into the manner in which student attitudes toward a geriatric patient population became more positive. Implications: This is the first known pilot study to suggest that participation in a creative group-based storytelling program might improve medical student attitudes toward persons with dementia. PMID:21665958

  5. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
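
    The sensitivity to the numerical time step can be illustrated even on a single overdamped "edge" relaxing to its rest length under forward Euler, the same kind of update used in many vertex-model implementations; the toy force law and parameter values below are illustrative assumptions, not the authors' model.

    ```python
    # Forward-Euler relaxation of one edge: large time steps change (and break) the prediction.
    def relax(dt, steps, k=1.0, rest_length=1.0, x0=2.0):
        """Overdamped dynamics dx/dt = -k (x - rest_length), integrated with forward Euler."""
        x = x0
        for _ in range(steps):
            x += -dt * k * (x - rest_length)
        return x

    for dt in (0.1, 1.0, 2.5):
        print(f"dt = {dt}: edge length after t = 10 is {relax(dt, int(10 / dt)):.3f}")
    ```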

  6. Getting added value from using qualitative research with randomized controlled trials: a qualitative interview study

    PubMed Central

    2014-01-01

    Background Qualitative research is undertaken with randomized controlled trials of health interventions. Our aim was to explore the perceptions of researchers with experience of this endeavour to understand the added value of qualitative research to the trial in practice. Methods A telephone semi-structured interview study with 18 researchers with experience of undertaking the trial and/or the qualitative research. Results Interviewees described the added value of qualitative research for the trial, explaining how it solved problems at the pretrial stage, explained findings, and helped to increase the utility of the evidence generated by the trial. From the interviews, we identified three models of relationship of the qualitative research to the trial. In ‘the peripheral’ model, the trial was an opportunity to undertake qualitative research, with no intention that it would add value to the trial. In ‘the add-on’ model, the qualitative researcher understood the potential value of the qualitative research but it was viewed as a separate and complementary endeavour by the trial lead investigator and wider team. Interviewees described how this could limit the value of the qualitative research to the trial. Finally ‘the integral’ model played out in two ways. In ‘integral-in-theory’ studies, the lead investigator viewed the qualitative research as essential to the trial. However, in practice the qualitative research was under-resourced relative to the trial, potentially limiting its ability to add value to the trial. In ‘integral-in-practice’ studies, interviewees described how the qualitative research was planned from the beginning of the study, senior qualitative expertise was on the team from beginning to end, and staff and time were dedicated to the qualitative research. In these studies interviewees described the qualitative research adding value to the trial although this value was not necessarily visible beyond the original research team due

  7. Getting added value from using qualitative research with randomized controlled trials: a qualitative interview study.

    PubMed

    O'Cathain, Alicia; Goode, Jackie; Drabble, Sarah J; Thomas, Kate J; Rudolph, Anne; Hewison, Jenny

    2014-06-09

    Qualitative research is undertaken with randomized controlled trials of health interventions. Our aim was to explore the perceptions of researchers with experience of this endeavour to understand the added value of qualitative research to the trial in practice. A telephone semi-structured interview study with 18 researchers with experience of undertaking the trial and/or the qualitative research. Interviewees described the added value of qualitative research for the trial, explaining how it solved problems at the pretrial stage, explained findings, and helped to increase the utility of the evidence generated by the trial. From the interviews, we identified three models of relationship of the qualitative research to the trial. In 'the peripheral' model, the trial was an opportunity to undertake qualitative research, with no intention that it would add value to the trial. In 'the add-on' model, the qualitative researcher understood the potential value of the qualitative research but it was viewed as a separate and complementary endeavour by the trial lead investigator and wider team. Interviewees described how this could limit the value of the qualitative research to the trial. Finally 'the integral' model played out in two ways. In 'integral-in-theory' studies, the lead investigator viewed the qualitative research as essential to the trial. However, in practice the qualitative research was under-resourced relative to the trial, potentially limiting its ability to add value to the trial. In 'integral-in-practice' studies, interviewees described how the qualitative research was planned from the beginning of the study, senior qualitative expertise was on the team from beginning to end, and staff and time were dedicated to the qualitative research. In these studies interviewees described the qualitative research adding value to the trial although this value was not necessarily visible beyond the original research team due to the challenges of publishing this research

  8. Ontology-based prediction of surgical events in laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. To this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of followup events, we are able to make sensible predictions of upcoming events in real-time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
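
    The n-gram step can be sketched with a small bigram model over a toy vocabulary of surgical events; the event names and training sequences below are invented for illustration and are not the recorded workflows used in the paper.

    ```python
    # Bigram prediction of the next surgical event from the current one.
    from collections import Counter, defaultdict

    training_surgeries = [
        ["incision", "dissection", "clipping", "cutting", "coagulation", "suturing"],
        ["incision", "dissection", "coagulation", "clipping", "cutting", "suturing"],
        ["incision", "dissection", "clipping", "cutting", "suturing"],
    ]

    bigrams = defaultdict(Counter)
    for seq in training_surgeries:
        for prev, nxt in zip(seq[:-1], seq[1:]):
            bigrams[prev][nxt] += 1

    def predict_next(event):
        """Return the most probable next event and its estimated probability."""
        counts = bigrams[event]
        total = sum(counts.values())
        if total == 0:
            return None, 0.0
        nxt, c = counts.most_common(1)[0]
        return nxt, c / total

    print(predict_next("clipping"))      # ('cutting', 1.0) on this toy data
    print(predict_next("dissection"))
    ```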

  9. Value of supervised learning events in predicting doctors in difficulty.

    PubMed

    Patel, Mumtaz; Agius, Steven; Wilkinson, Jack; Patel, Leena; Baker, Paul

    2016-07-01

    In the UK, supervised learning events (SLE) replaced traditional workplace-based assessments for foundation-year trainees in 2012. A key element of SLEs was to incorporate trainee reflection and assessor feedback in order to drive learning and identify training issues early. Few studies, however, have investigated the value of SLEs in predicting doctors in difficulty. This study aimed to identify principles that would inform understanding about how and why SLEs work or not in identifying doctors in difficulty (DiD). A retrospective case-control study of North West Foundation School trainees' electronic portfolios was conducted. Cases comprised all known DiD. Controls were randomly selected from the same cohort. Free-text supervisor comments from each SLE were assessed for the four domains defined in the General Medical Council's Good Medical Practice Guidelines and each scored blindly for level of concern using a three-point ordinal scale. Cumulative scores for each SLE were then analysed quantitatively for their predictive value of actual DiD. A qualitative thematic analysis was also conducted. The prevalence of DiD in this sample was 6.5%. Receiver operating characteristic (ROC) curve analysis showed that Team Assessment of Behaviour (TAB) was the only SLE strongly predictive of actual DiD status. The Educational Supervisor Report (ESR) was also strongly predictive of DiD status. Fisher's test showed significant associations of TAB and ESR for both predicted and actual DiD status and also the health and performance subtypes. None of the other SLEs showed significant associations. Qualitative data analysis revealed inadequate completion and lack of constructive, particularly negative, feedback. This indicated that SLEs were not used to their full potential. TAB and the ESR are strongly predictive of DiD. However, SLEs are not being used to their full potential, and the quality of completion of reports on SLEs and feedback needs to be improved in order to better identify

  10. Women Veterans’ Experience With a Web-Based Diabetes Prevention Program: A Qualitative Study to Inform Future Practice

    PubMed Central

    Ertl, Kristyn; Schneider, Jessica; Vasti, Elena; Makki, Fatima; Richardson, Caroline; Havens, Kathryn; Damschroder, Laura

    2015-01-01

    Background Diabetes prevention is a national goal and particularly important in the Veterans Health Administration (VHA) where 1 in 4 veterans has diabetes. There is growing evidence to support the use of Web-based diabetes prevention program (DPP) interventions, shown to be as effective and often more feasible than in-person interventions. Objective Our primary objective was to qualitatively explore women veterans’ early experiences with a Web-based DPP intervention. Our secondary objective was to estimate weight loss, participation, and engagement to provide context for our qualitative findings. Methods We conducted and analyzed semistructured interviews and collected data on weight change, participation, and engagement. A total of 17 women veterans with prediabetes from a Midwest VA Women’s Health Clinic were eligible to participate; 15 completed interviews. Results Participants perceived the DPP program as an appealing way of initiating lifestyle changes and made them feel accountable in achieving their daily goals. The online program was convenient because it could be accessed at any time, and many found that it integrated well into daily life. However, some did not like the logging aspect and some found it to be too impersonal. Participants logged in a mean 76 times, posted a mean 46 group messages, and sent a mean 20.5 private messages to the health coach over 16 weeks. Participants lost 5.24% of baseline weight, and 82% (14/17) of participants completed at least 9 of 16 core modules. Conclusions Women veterans’ early experiences with a Web-based DPP intervention were generally positive. Accountability and convenience were key enabling factors for participation and engagement. A Web-based DPP intervention appears to be a promising means of translating the DPP for women veterans with prediabetes. PMID:26006697

  11. The Experience of Psychiatric Care of Adolescents with Anxiety-based School Refusal and of their Parents: A Qualitative Study

    PubMed Central

    Sibeoni, Jordan; Orri, Massimiliano; Podlipski, Marc-Antoine; Labey, Mathilde; Campredon, Sophie; Gerardin, Priscille; Revah-Levy, Anne

    2018-01-01

    Objective Anxiety-based school refusal in adolescence is a complex, sometimes difficult to treat disorder that can have serious academic and psychiatric consequences. The objective of this qualitative study was to explore how teens with this problem and their parents experience the psychiatric care received. Methods This qualitative multicenter study took place in France, where we conducted semi-structured interviews with adolescents receiving psychiatric care for anxiety-based school refusal and with their parents. Data collection by purposive sampling continued until we reached theoretical sufficiency. Data analysis was thematic. Results This study included 20 adolescents aged 12 to 18 years and 21 parents. Two themes emerged from the analysis: (1) the goals of psychiatric care with two sub-themes, “self-transformation” and problem solving; and, (2) the therapeutic levers identified as effective with two sub-themes: time and space and relationships. Conclusion Our results show a divergence between parents and teens in their representations of care and especially of its goals. Therapeutic and research implications about the terms of return to school within psychiatric care and also the temporality of care are discussed. PMID:29375632

  12. Peer Helpers in Hungary: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Racz, Jozsef; Lacko, Zsuzsa

    2008-01-01

    Hungary is a country in transition that has no real tradition of peer helping. A qualitative study was carried out involving 13 peer helpers of two kinds (a) age-based peers, and (b) way-of-life-based peers (fellow helpers). The motivations for and the processes of becoming a peer helper were analyzed. Results showed the largest difference being…

  13. SNBRFinder: A Sequence-Based Hybrid Algorithm for Enhanced Prediction of Nucleic Acid-Binding Residues.

    PubMed

    Yang, Xiaoxia; Wang, Jia; Sun, Jun; Liu, Rong

    2015-01-01

    Protein-nucleic acid interactions are central to various fundamental biological processes. Automated methods capable of reliably identifying DNA- and RNA-binding residues in protein sequence are assuming ever-increasing importance. The majority of current algorithms rely on feature-based prediction, but their accuracy remains to be further improved. Here we propose a sequence-based hybrid algorithm SNBRFinder (Sequence-based Nucleic acid-Binding Residue Finder) by merging a feature predictor SNBRFinderF and a template predictor SNBRFinderT. SNBRFinderF was established using the support vector machine whose inputs include sequence profile and other complementary sequence descriptors, while SNBRFinderT was implemented with the sequence alignment algorithm based on profile hidden Markov models to capture the weakly homologous template of query sequence. Experimental results show that SNBRFinderF was clearly superior to the commonly used sequence profile-based predictor and SNBRFinderT can achieve comparable performance to the structure-based template methods. Leveraging the complementary relationship between these two predictors, SNBRFinder reasonably improved the performance of both DNA- and RNA-binding residue predictions. More importantly, the sequence-based hybrid prediction reached competitive performance relative to our previous structure-based counterpart. Our extensive and stringent comparisons show that SNBRFinder has obvious advantages over the existing sequence-based prediction algorithms. The value of our algorithm is highlighted by establishing an easy-to-use web server that is freely accessible at http://ibi.hzau.edu.cn/SNBRFinder.
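
    The hybrid step, merging a feature-based score with a template-based score per residue and falling back to features alone when no usable template is found, can be sketched as a weighted average. The scores, the weight and the decision threshold are illustrative assumptions, not SNBRFinder's actual combination rule.

    ```python
    # Hybrid per-residue prediction: weighted merge of feature and template scores.
    import numpy as np

    feature_scores = np.array([0.9, 0.2, 0.6, 0.1, 0.7])    # SVM-style probabilities per residue
    template_scores = np.array([0.8, 0.1, 0.4, 0.3, 0.9])   # template-transfer scores per residue
    template_found = True                                   # whether a usable homologous template exists

    w = 0.5 if template_found else 1.0                      # fall back to features alone if no template
    hybrid = w * feature_scores + (1 - w) * template_scores
    print("predicted binding residues:", np.where(hybrid >= 0.5)[0])
    ```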

  14. Research on reverse logistics location under uncertainty environment based on grey prediction

    NASA Astrophysics Data System (ADS)

    Zhenqiang, Bao; Congwei, Zhu; Yuqin, Zhao; Quanke, Pan

    This article constructs a reverse logistics network under an uncertain environment, integrates the reverse logistics network with the distribution network, and forms a closed-loop network. An optimization model based on cost is established to help the intermediate centers, manufacturing centers and remanufacturing centers make location decisions. A grey model GM(1,1) is used to predict the product holdings of the collection points; the prediction results are then fed into the cost optimization model and a solution is obtained. Finally, an example is given to verify the effectiveness and feasibility of the model.
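
    The GM(1,1) forecast of product holdings mentioned above follows a standard recipe: accumulate the series, fit the development coefficient and grey input by least squares, and invert the accumulation to forecast. The short holdings series below is an illustrative assumption.

    ```python
    # GM(1,1) grey forecast of a short holdings series.
    import numpy as np

    x0 = np.array([120.0, 130.0, 128.0, 140.0, 152.0])   # observed holdings (assumed)
    x1 = np.cumsum(x0)                                    # accumulated generating operation (AGO)

    z1 = 0.5 * (x1[1:] + x1[:-1])                         # mean generating sequence
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]      # development coefficient a, grey input b

    def x1_hat(k):                                        # fitted AGO value at 0-based time index k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    n_forecast = 3
    fitted_ago = x1_hat(np.arange(len(x0) + n_forecast))
    forecast = np.diff(fitted_ago)                        # inverse AGO restores the original scale
    print("next holdings forecasts:", np.round(forecast[len(x0) - 1:], 1))
    ```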

  15. A New Method for the Evaluation and Prediction of Base Stealing Performance.

    PubMed

    Bricker, Joshua C; Bailey, Christopher A; Driggers, Austin R; McInnis, Timothy C; Alami, Arya

    2016-11-01

    Bricker, JC, Bailey, CA, Driggers, AR, McInnis, TC, and Alami, A. A new method for the evaluation and prediction of base stealing performance. J Strength Cond Res 30(11): 3044-3050, 2016-The purposes of this study were to evaluate a new method using electronic timing gates to monitor base stealing performance in terms of reliability, differences between it and traditional stopwatch-collected times, and its ability to predict base stealing performance. Twenty-five healthy collegiate baseball players performed maximal effort base stealing trials with a right and left-handed pitcher. An infrared electronic timing system was used to calculate the reaction time (RT) and total time (TT), whereas coaches' times (CT) were recorded with digital stopwatches. Reliability of the timing gate method (TGM) was evaluated with intraclass correlation coefficients (ICCs) and the coefficient of variation (CV). Differences between the TGM and traditional CT were calculated with paired samples t tests and Cohen's d effect size estimates. Base stealing performance predictability of the TGM was evaluated with Pearson's bivariate correlations. Acceptable relative reliability was observed (ICCs 0.74-0.84). Absolute reliability measures were acceptable for TT (CVs = 4.4-4.8%), but measures were elevated for RT (CVs = 32.3-35.5%). Statistical and practical differences were found between TT and CT (right p = 0.00, d = 1.28 and left p = 0.00, d = 1.49). The TGM TT seems to be a decent predictor of base stealing performance (r = -0.49 to -0.61). The authors recommend the TGM used in this investigation for athlete monitoring because it was found to be reliable, seems to be more precise than traditional CT measured with a stopwatch, provides an additional variable of value (RT), and may predict future performance.

  16. TWT transmitter fault prediction based on ANFIS

    NASA Astrophysics Data System (ADS)

    Li, Mengyan; Li, Junshan; Li, Shuangshuang; Wang, Wenqing; Li, Fen

    2017-11-01

    Fault prediction is an important component of health management and plays an important role in guaranteeing the reliability of complex electronic equipment. The transmitter is a unit with a high failure rate, and degradation of the TWT cathode performance is a common transmitter fault. In this paper, a model based on a set of key TWT parameters is proposed. By choosing proper parameters and training an adaptive neuro-fuzzy inference system (ANFIS) model, the method, combined with the analytic hierarchy process (AHP), provides a useful reference for assessing the overall health of TWT transmitters.
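
    To make the AHP step concrete, the standard priority-vector calculation is sketched below (a generic AHP computation, not the authors' code); the pairwise comparison matrix, parameter names and health scores are assumed values.

    # Illustrative AHP weighting: derive importance weights for hypothetical
    # key TWT parameters from a pairwise comparison matrix and combine
    # per-parameter health scores into an overall judgment.
    import numpy as np

    # Assumed pairwise comparisons of three parameters
    # (e.g., cathode current, helix voltage, output power).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                      # normalized priority vector

    # Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
    ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
    cr = ci / 0.58

    # Overall health score as the weighted sum of assumed per-parameter scores in [0, 1].
    param_scores = np.array([0.9, 0.7, 0.8])
    overall_health = float(weights @ param_scores)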

  17. Implementing Internet-Based Self-Care Programs in Primary Care: Qualitative Analysis of Determinants of Practice for Patients and Providers.

    PubMed

    Hermes, Eric; Burrone, Laura; Perez, Elliottnell; Martino, Steve; Rowe, Michael

    2018-05-18

    Access to evidence-based interventions for common mental health conditions is limited due to geographic distance, scheduling, stigma, and provider availability. Internet-based self-care programs may mitigate these barriers. However, little is known about internet-based self-care program implementation in US health care systems. The objective of this study was to identify determinants of practice for internet-based self-care program use in primary care by eliciting provider and administrator perspectives on internet-based self-care program implementation. The objective was explored through qualitative analysis of semistructured interviews with primary care providers and administrators from the Veterans Health Administration. Participants were identified using a reputation-based snowball design. Interviews focused on identifying determinants of practice for the use of internet-based self-care programs at the point of care in Veterans Health Administration primary care. Qualitative analysis of transcripts was performed using thematic coding. A total of 20 physicians, psychologists, social workers, and nurses participated in interviews. Among this group, internet-based self-care program use was relatively low, but support for the platform was assessed as relatively high. Themes were organized into determinants active at patient and provider levels. Perceived patient-level determinants included literacy, age, internet access, patient expectations, internet-based self-care program fit with patient experiences, interest and motivation, and face-to-face human contact. Perceived provider-level determinants included familiarity with internet-based self-care programs, changes to traditional care delivery, face-to-face human contact, competing demands, and age. This exploration of perspectives on internet-based self-care program implementation among Veterans Health Administration providers and administrators revealed key determinants of practice, which can be used to develop

  18. Facilitators and barriers to the delivery of school-based smoking prevention interventions for children and young people: a protocol for a systematic review of qualitative studies.

    PubMed

    Dobbie, Fiona; Angus, Kathryn; Littlecott, Hannah; Allum, Karen; Wells, Valerie; Amos, Amanda; Haw, Sally; Bauld, Linda

    2018-04-06

    Despite a decline in child and adult smoking prevalence, young people who smoke (even occasionally) can rapidly become addicted to nicotine, and most adult smokers initiate smoking before they are 18. Schools have long been a popular setting for delivering youth smoking prevention interventions, but evidence of the effectiveness of school-based prevention programmes is mixed, and outcomes vary by the type of programme delivered. Existing systematic reviews that explore the factors contributing to the success or failure of school-based smoking prevention programmes often exclude qualitative studies, because they focus on questions of intervention effectiveness that qualitative research cannot answer; qualitative research instead addresses the experiences and perceptions of those involved in the programmes. This systematic review will address this gap by updating a 2009 review to examine qualitative studies. The aim is to generate deeper insight to help target resources that have the potential to save lives by preventing smoking initiation among children and young people. This systematic review will search the following databases: the Cochrane Library, MEDLINE, EMBASE, PsycINFO, HMIC, ERIC, ASSIA, Web of Science and CINAHL. To identify additional references, we will consult the reference lists of a sample of systematic reviews and search relevant organizational websites for appropriate grey literature. The search strategy will include keywords and database-specific subject headings relating to smoking, children and young people, health promotion and schools. Authors will independently screen studies, assess data quality and extract data for synthesis. Study findings will be synthesised thematically using a 'best-fit framework synthesis', which allows an existing set of themes to be used as a starting point to map or code the included studies; these themes are then adapted as coding proceeds to accommodate newly emerging themes. This review will focus on

  19. Degradation Prediction Model Based on a Neural Network with Dynamic Windows

    PubMed Central

    Zhang, Xinghui; Xiao, Lei; Kang, Jianshe

    2015-01-01

    Tracking the degradation of mechanical components is critical for effective maintenance decision making. Remaining useful life (RUL) estimation is a widely used form of degradation prediction. RUL prediction methods that can draw on abundant run-to-failure condition monitoring data have been thoroughly researched, but for some high-reliability components it is very difficult to collect run-to-failure data, i.e., data covering the progression from normal operation to failure; only a limited number of condition indicators collected over a certain period can be used to estimate RUL. In addition, some existing prediction methods suffer from poor extrapolation ability, which blocks RUL estimation: the predicted value converges to a constant or fluctuates within a certain range. Moreover, fluctuating condition features also degrade prediction quality. To resolve these difficulties, this paper proposes an RUL prediction model based on a neural network with dynamic windows. The model consists of three main steps: window size determination based on the rate of increase, change point detection, and rolling prediction. The proposed method has two dominant strengths. One is that it does not need to assume the degradation trajectory follows a particular distribution. The other is that it can adapt to variation in degradation indicators, which greatly benefits RUL prediction. Finally, the performance of the proposed RUL prediction model is validated with real field data and simulation data. PMID:25806873
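
    The rolling-prediction step can be illustrated, in simplified form, by repeatedly refitting a small neural network on the most recent window of the degradation indicator and forecasting one step ahead; the fixed window size, network settings and failure threshold below are assumptions, not the paper's exact procedure.

    # Simplified rolling prediction with a sliding window: refit on the latest
    # window, forecast one step, append the forecast, and repeat until an
    # assumed failure threshold is crossed.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def rolling_rul_estimate(series, window, threshold, max_steps=200):
        history = list(series)
        for step in range(1, max_steps + 1):
            recent = np.array(history[-window:])
            X = np.arange(len(recent)).reshape(-1, 1)        # time index within the window
            model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
            model.fit(X, recent)
            next_value = float(model.predict([[len(recent)]])[0])
            history.append(next_value)                       # roll the window forward
            if next_value >= threshold:
                return step                                  # estimated RUL in time steps
        return None                                          # threshold not reached within horizon

    # Example with a synthetic, slowly increasing degradation indicator.
    t = np.arange(60)
    indicator = 0.02 * t + 0.05 * np.random.default_rng(0).standard_normal(60)
    print(rolling_rul_estimate(indicator, window=30, threshold=1.5))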

  20. Consumer Neuroscience-Based Metrics Predict Recall, Liking and Viewing Rates in Online Advertising.

    PubMed

    Guixeres, Jaime; Bigné, Enrique; Ausín Azofra, Jose M; Alcañiz Raya, Mariano; Colomer Granero, Adrián; Fuentes Hurtado, Félix; Naranjo Ornedo, Valery

    2017-01-01

    The purpose of the present study is to investigate whether the effectiveness of a new ad on digital channels (YouTube) can be predicted using neural networks and neuroscience-based metrics (brain response, heart rate variability and eye tracking). Neurophysiological responses were recorded from 35 participants who were exposed to 8 relevant TV Super Bowl commercials. Correlations between the neurophysiological metrics, ad recall, ad liking, the ACE metrix score and the number of views on YouTube over one year were investigated. Our findings suggest a significant correlation between the neuroscience metrics, self-reported measures of ad effectiveness and the number of views on the YouTube channel. In addition, an artificial neural network based on the neuroscience metrics classifies the ads (82.9% average accuracy) and estimates the number of online views (mean error of 0.199). The results highlight the validity of neuromarketing-based techniques for predicting the success of advertising responses. Practitioners can consider the proposed methodology at the design stages of advertising content, thus enhancing advertising effectiveness. The study pioneers the use of neurophysiological methods in predicting advertising success in a digital context and is the first article to examine whether these measures could actually be used for predicting views of advertising on YouTube.
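
    A generic version of this kind of pipeline, with invented feature names and synthetic data rather than the study's dataset or network architecture, could look like the following: a small neural network trained on neuroscience-derived features both to classify ad recall and to estimate (log-scaled) view counts.

    # Illustrative sketch only: synthetic per-ad features standing in for
    # neuroscience metrics, a classifier for recall and a regressor for views.
    import numpy as np
    from sklearn.neural_network import MLPClassifier, MLPRegressor

    rng = np.random.default_rng(1)
    # Assumed columns: e.g., frontal asymmetry, heart-rate variability,
    # fixation time on the brand.
    X = rng.normal(size=(200, 3))
    recalled = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200)) > 0
    log_views = 10 + X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.3, size=200)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X[:150], recalled[:150])
    reg = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X[:150], log_views[:150])

    print("recall classification accuracy:", clf.score(X[150:], recalled[150:]))
    print("mean absolute error of log views:", np.mean(np.abs(reg.predict(X[150:]) - log_views[150:])))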