Sample records for recursive partitioning analysis-based

  1. Scoring and staging systems using Cox linear regression modeling and recursive partitioning.

    PubMed

    Lee, J W; Um, S H; Lee, J B; Mun, J; Cho, H

    2006-01-01

    Scoring and staging systems are used to determine the order and class of data according to predictors. Systems used for medical data, such as the Child-Turcotte-Pugh scoring and staging systems for ordering and classifying patients with liver disease, are often derived strictly from physicians' experience and intuition. We construct objective and data-based scoring/staging systems using statistical methods. We consider Cox linear regression modeling and recursive partitioning techniques for censored survival data. In particular, to obtain a target number of stages we propose cross-validation and amalgamation algorithms. We also propose an algorithm for constructing scoring and staging systems by integrating local Cox linear regression models into recursive partitioning, so that we can retain the merits of both methods such as superior predictive accuracy, ease of use, and detection of interactions between predictors. The staging system construction algorithms are compared by cross-validation evaluation of real data. The data-based cross-validation comparison shows that Cox linear regression modeling is somewhat better than recursive partitioning when there are only continuous predictors, while recursive partitioning is better when there are significant categorical predictors. The proposed local Cox linear recursive partitioning has better predictive accuracy than Cox linear modeling and simple recursive partitioning. This study indicates that integrating local linear modeling into recursive partitioning can significantly improve prediction accuracy in constructing scoring and staging systems.
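
    As a rough illustration of the general workflow described above (not the authors' algorithm), the R sketch below grows a survival tree with rpart on the lung dataset shipped with the survival package and then fits Cox models globally and within the tree-defined stages; the variable choices are illustrative only.

      ## Minimal sketch: grow a survival tree to define candidate stages, then fit
      ## Cox models globally and within the tree-defined stages.
      library(survival)   # Surv(), coxph(), and the example 'lung' dataset
      library(rpart)      # recursive partitioning; accepts Surv() responses

      dat <- na.omit(lung[, c("time", "status", "age", "ph.ecog", "wt.loss")])

      ## 1. Recursive partitioning on censored survival data
      tree <- rpart(Surv(time, status) ~ age + ph.ecog + wt.loss, data = dat)
      dat$stage <- factor(tree$where)          # terminal-node membership = candidate stage

      ## 2. Global Cox linear regression model for comparison
      cox_global <- coxph(Surv(time, status) ~ age + ph.ecog + wt.loss, data = dat)

      ## 3. Local Cox modeling: a common age effect with stage-specific baseline hazards
      cox_local <- coxph(Surv(time, status) ~ age + strata(stage), data = dat)

      print(tree)
      summary(cox_global)
      summary(cox_local)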

  2. Model-based recursive partitioning to identify risk clusters for metabolic syndrome and its components: findings from the International Mobility in Aging Study

    PubMed Central

    Pirkle, Catherine M; Wu, Yan Yan; Zunzunegui, Maria-Victoria; Gómez, José Fernando

    2018-01-01

    Objective Conceptual models underpinning much epidemiological research on ageing acknowledge that environmental, social and biological systems interact to influence health outcomes. Recursive partitioning is a data-driven approach that allows for concurrent exploration of distinct mixtures, or clusters, of individuals that have a particular outcome. Our aim is to use recursive partitioning to examine risk clusters for metabolic syndrome (MetS) and its components, in order to identify vulnerable populations. Study design Cross-sectional analysis of baseline data from a prospective longitudinal cohort called the International Mobility in Aging Study (IMIAS). Setting IMIAS includes sites from three middle-income countries—Tirana (Albania), Natal (Brazil) and Manizales (Colombia)—and two from Canada—Kingston (Ontario) and Saint-Hyacinthe (Quebec). Participants Community-dwelling male and female adults, aged 64–75 years (n=2002). Primary and secondary outcome measures We apply recursive partitioning to investigate social and behavioural risk factors for MetS and its components. Model-based recursive partitioning (MOB) was used to cluster participants into age-adjusted risk groups based on variabilities in: study site, sex, education, living arrangements, childhood adversities, adult occupation, current employment status, income, perceived income sufficiency, smoking status and weekly minutes of physical activity. Results 43% of participants had MetS. Using MOB, the primary partitioning variable was participant sex. Among women from middle-income sites, the predicted proportion with MetS ranged from 58% to 68%. Canadian women with limited physical activity had elevated predicted proportions of MetS (49%, 95% CI 39% to 58%). Among men, MetS ranged from 26% to 41% depending on childhood social adversity and education. Clustering for MetS components differed from the syndrome and across components. Study site was a primary partitioning variable for all components except HDL cholesterol. Sex was important for most components. Conclusion MOB is a promising technique for identifying disease risk clusters (eg, vulnerable populations) in modestly sized samples. PMID:29500203
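
    A hedged sketch of model-based recursive partitioning (MOB) for a binary outcome with partykit::glmtree follows; the data frame imias and its columns (mets, age, sex, site, education, income, phys_act) are hypothetical stand-ins for the study variables, not the IMIAS data.

      ## Sketch of MOB for a binary outcome: an age-adjusted logistic model is fitted
      ## in every node, and sex, site, education, income and physical activity are
      ## candidate partitioning variables. 'imias' and its columns are hypothetical.
      library(partykit)

      mob_fit <- glmtree(mets ~ age | sex + site + education + income + phys_act,
                         data = imias, family = binomial,
                         minsize = 50, alpha = 0.05)

      plot(mob_fit)                       # tree with a fitted model in every terminal node
      predict(mob_fit, type = "response") # predicted proportion with MetS per subgroup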

  3. Decision tree modeling using R.

    PubMed

    Zhang, Zhongheng

    2016-08-01

    In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are the random sampling of observations and the restricted set of input variables considered at each split. Finally, I introduce R functions to perform model-based recursive partitioning, which incorporates recursive partitioning into conventional parametric model building.
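
    The following sketch illustrates the three tools the tutorial covers — a conditional inference tree, a random forest, and model-based recursive partitioning — on the built-in airquality data; it mirrors the general workflow rather than reproducing the article's code.

      ## Conditional inference tree, random forest, and model-based recursive
      ## partitioning on the built-in airquality data (illustrative only).
      library(partykit)       # ctree(), lmtree()
      library(randomForest)   # randomForest()

      aq <- na.omit(airquality)

      ## Conditional inference tree: association tests choose the split variable
      ct <- ctree(Ozone ~ Solar.R + Wind + Temp, data = aq)
      plot(ct)

      ## Random forest: many trees on bootstrap samples with random variable subsets
      rf <- randomForest(Ozone ~ Solar.R + Wind + Temp, data = aq,
                         ntree = 500, importance = TRUE)
      importance(rf)

      ## Model-based recursive partitioning: a linear model of Ozone on Wind,
      ## partitioned by Temp and Solar.R
      mt <- lmtree(Ozone ~ Wind | Temp + Solar.R, data = aq)
      plot(mt)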

  4. Application of Recursive Partitioning to Derive and Validate a Claims-Based Algorithm for Identifying Keratinocyte Carcinoma (Nonmelanoma Skin Cancer).

    PubMed

    Chan, An-Wen; Fung, Kinwah; Tran, Jennifer M; Kitchen, Jessica; Austin, Peter C; Weinstock, Martin A; Rochon, Paula A

    2016-10-01

    Keratinocyte carcinoma (nonmelanoma skin cancer) accounts for substantial burden in terms of high incidence and health care costs but is excluded by most cancer registries in North America. Administrative health insurance claims databases offer an opportunity to identify these cancers using diagnosis and procedural codes submitted for reimbursement purposes. To apply recursive partitioning to derive and validate a claims-based algorithm for identifying keratinocyte carcinoma with high sensitivity and specificity. Retrospective study using population-based administrative databases linked to 602 371 pathology episodes from a community laboratory for adults residing in Ontario, Canada, from January 1, 1992, to December 31, 2009. The final analysis was completed in January 2016. We used recursive partitioning (classification trees) to derive an algorithm based on health insurance claims. The performance of the derived algorithm was compared with 5 prespecified algorithms and validated using an independent academic hospital clinic data set of 2082 patients seen in May and June 2011. Sensitivity, specificity, positive predictive value, and negative predictive value using the histopathological diagnosis as the criterion standard. We aimed to achieve maximal specificity, while maintaining greater than 80% sensitivity. Among 602 371 pathology episodes, 131 562 (21.8%) had a diagnosis of keratinocyte carcinoma. Our final derived algorithm outperformed the 5 simple prespecified algorithms and performed well in both community and hospital data sets in terms of sensitivity (82.6% and 84.9%, respectively), specificity (93.0% and 99.0%, respectively), positive predictive value (76.7% and 69.2%, respectively), and negative predictive value (95.0% and 99.6%, respectively). Algorithm performance did not vary substantially during the 18-year period. This algorithm offers a reliable mechanism for ascertaining keratinocyte carcinoma for epidemiological research in the absence of cancer registry data. Our findings also demonstrate the value of recursive partitioning in deriving valid claims-based algorithms.

  5. A Recursive Method for Calculating Certain Partition Functions.

    ERIC Educational Resources Information Center

    Woodrum, Luther; And Others

    1978-01-01

    Describes a simple recursive method for calculating the partition function and average energy of a system consisting of N electrons and L energy levels. Also, presents an efficient APL computer program to utilize the recursion relation. (Author/GA)
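
    One textbook recursion of this kind (whether it matches the article's exact relation is an assumption) builds the canonical partition function for N electrons on L non-degenerate, singly occupied levels by conditioning on whether the highest level is occupied; a small R sketch:

      ## Recursion for the canonical partition function of N electrons on L levels,
      ## at most one electron per level (spin can be treated as separate levels):
      ##   Z(n, l) = Z(n, l-1) + exp(-beta * eps[l]) * Z(n-1, l-1)
      ## Shown for illustration; it may differ from the article's exact relation.
      partition_Z <- function(N, eps, beta) {
        L <- length(eps)
        Z <- matrix(0, nrow = N + 1, ncol = L + 1)   # rows: 0..N electrons, cols: 0..L levels
        Z[1, ] <- 1                                  # zero electrons: empty product = 1
        for (l in 1:L) {
          for (n in 1:N) {
            Z[n + 1, l + 1] <- Z[n + 1, l] + exp(-beta * eps[l]) * Z[n, l]
          }
        }
        Z[N + 1, L + 1]
      }

      eps  <- seq(0, 2, length.out = 10)   # 10 equally spaced energy levels (arbitrary units)
      beta <- 1.5
      Z    <- partition_Z(4, eps, beta)

      ## Average energy from U = -d ln Z / d beta, here via a central difference
      h <- 1e-5
      U <- -(log(partition_Z(4, eps, beta + h)) - log(partition_Z(4, eps, beta - h))) / (2 * h)
      c(Z = Z, U = U)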

  6. Predicting cannabis abuse screening test (CAST) scores: a recursive partitioning analysis using survey data from Czech Republic, Italy, the Netherlands and Sweden.

    PubMed

    Blankers, Matthijs; Frijns, Tom; Belackova, Vendula; Rossi, Carla; Svensson, Bengt; Trautmann, Franz; van Laar, Margriet

    2014-01-01

    Cannabis is Europe's most commonly used illicit drug. Some users do not develop dependence or other problems, whereas others do. Many factors are associated with the occurrence of cannabis-related disorders. This makes it difficult to identify key risk factors and markers to profile at-risk cannabis users using traditional hypothesis-driven approaches. Therefore, the use of a data-mining technique called binary recursive partitioning is demonstrated in this study by creating a classification tree to profile at-risk users. 59 variables on cannabis use and drug market experiences were extracted from an internet-based survey dataset collected in four European countries (Czech Republic, Italy, Netherlands and Sweden), n = 2617. These 59 potential predictors of problematic cannabis use were used to partition individual respondents into subgroups with low and high risk of having a cannabis use disorder, based on their responses on the Cannabis Abuse Screening Test. Both a generic model for the four countries combined and four country-specific models were constructed. Of the 59 variables included in the first analysis step, only three variables were required to construct a generic partitioning model to classify high risk cannabis users with 65-73% accuracy. Based on the generic model for the four countries combined, the highest risk for cannabis use disorder is seen in participants reporting cannabis use on more than 200 days in the last 12 months. In comparison to the generic model, the country-specific models led to modest, non-significant improvements in classification accuracy, with an exception for Italy (p = 0.01). Using recursive partitioning, it is feasible to construct classification trees based on only a few variables with acceptable performance to classify cannabis users into groups with low or high risk of meeting criteria for cannabis use disorder. The number of cannabis use days in the last 12 months is the most relevant variable. The identified variables may be considered for use in future screeners for cannabis use disorders.

  7. Identifying risk profiles for childhood obesity using recursive partitioning based on individual, familial, and neighborhood environment factors.

    PubMed

    Van Hulst, Andraea; Roy-Gagnon, Marie-Hélène; Gauvin, Lise; Kestens, Yan; Henderson, Mélanie; Barnett, Tracie A

    2015-02-15

    Few studies consider how risk factors within multiple levels of influence operate synergistically to determine childhood obesity. We used recursive partitioning analysis to identify unique combinations of individual, familial, and neighborhood factors that best predict obesity in children, and tested whether these predict 2-year changes in body mass index (BMI). Data were collected in 2005-2008 and in 2008-2011 for 512 Quebec youth (8-10 years at baseline) with a history of parental obesity (QUALITY study). CDC age- and sex-specific BMI percentiles were computed and children were considered obese if their BMI was ≥95th percentile. Individual (physical activity and sugar-sweetened beverage intake), familial (household socioeconomic status and measures of parental obesity including both BMI and waist circumference), and neighborhood (disadvantage, prestige, and presence of parks, convenience stores, and fast food restaurants) factors were examined. Recursive partitioning, a method that generates a classification tree predicting obesity based on combined exposure to a series of variables, was used. Associations between resulting varying risk group membership and BMI percentile at baseline and 2-year follow up were examined using linear regression. Recursive partitioning yielded 7 subgroups with a prevalence of obesity equal to 8%, 11%, 26%, 28%, 41%, 60%, and 63%, respectively. The 2 highest risk subgroups comprised i) children not meeting physical activity guidelines, with at least one BMI-defined obese parent and 2 abdominally obese parents, living in disadvantaged neighborhoods without parks and, ii) children with these characteristics, except with access to ≥1 park and with access to ≥1 convenience store. Group membership was strongly associated with BMI at baseline, but did not systematically predict change in BMI. Findings support the notion that obesity is predicted by multiple factors in different settings and provide some indications of potentially obesogenic environments. Alternate group definitions as well as longer duration of follow up should be investigated to predict change in obesity.

  8. Discovery of novel SERCA inhibitors by virtual screening of a large compound library.

    PubMed

    Elam, Christopher; Lape, Michael; Deye, Joel; Zultowsky, Jodie; Stanton, David T; Paula, Stefan

    2011-05-01

    Two screening protocols based on recursive partitioning and computational ligand docking methodologies, respectively, were employed for virtual screens of a compound library with 345,000 entries for novel inhibitors of the enzyme sarco/endoplasmic reticulum calcium ATPase (SERCA), a potential target for cancer chemotherapy. A total of 72 compounds that were predicted to be potential inhibitors of SERCA were tested in bioassays and 17 displayed inhibitory potencies at concentrations below 100 μM. The majority of these inhibitors were composed of two phenyl rings tethered to each other by a short link of one to three atoms. Putative interactions between SERCA and the inhibitors were identified by inspection of docking-predicted poses and some of the structural features required for effective SERCA inhibition were determined by analysis of the classification pattern employed by the recursive partitioning models. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  9. Stockholder projector analysis: A Hilbert-space partitioning of the molecular one-electron density matrix with orthogonal projectors

    NASA Astrophysics Data System (ADS)

    Vanfleteren, Diederik; Van Neck, Dimitri; Bultinck, Patrick; Ayers, Paul W.; Waroquier, Michel

    2012-01-01

    A previously introduced partitioning of the molecular one-electron density matrix over atoms and bonds [D. Vanfleteren et al., J. Chem. Phys. 133, 231103 (2010)] is investigated in detail. Orthogonal projection operators are used to define atomic subspaces, as in Natural Population Analysis. The orthogonal projection operators are constructed with a recursive scheme. These operators are chemically relevant and obey a stockholder principle, familiar from the Hirshfeld-I partitioning of the electron density. The stockholder principle is extended to density matrices, where the orthogonal projectors are considered to be atomic fractions of the summed contributions. All calculations are performed as matrix manipulations in one-electron Hilbert space. Mathematical proofs and numerical evidence concerning this recursive scheme are provided in the present paper. The advantages associated with the use of these stockholder projection operators are examined with respect to covalent bond orders, bond polarization, and transferability.

  10. An Element-Based Concurrent Partitioner for Unstructured Finite Element Meshes

    NASA Technical Reports Server (NTRS)

    Ding, Hong Q.; Ferraro, Robert D.

    1996-01-01

    A concurrent partitioner for partitioning unstructured finite element meshes on distributed memory architectures is developed. The partitioner uses an element-based partitioning strategy. Its main advantage over the more conventional node-based partitioning strategy is its modular programming approach to the development of parallel applications. The partitioner first partitions element centroids using a recursive inertial bisection algorithm. Elements and nodes then migrate according to the partitioned centroids, using a data request communication template for unpredictable incoming messages. Our scalable implementation is contrasted to a non-scalable implementation which is a straightforward parallelization of a sequential partitioner.
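
    A minimal R sketch of the recursive inertial bisection step applied to element centroids is given below; it is a serial toy version with illustrative parameters, not the concurrent partitioner itself.

      ## Recursive inertial bisection of element centroids (2-D, illustrative):
      ## project centroids onto their principal axis and split at the median, recursively.
      recursive_inertial_bisection <- function(xy, depth) {
        if (depth == 0) return(rep(1L, nrow(xy)))
        axis  <- prcomp(xy, center = TRUE, scale. = FALSE)$rotation[, 1]  # principal (inertial) axis
        proj  <- as.matrix(scale(xy, scale = FALSE)) %*% axis
        left  <- proj <= median(proj)
        part  <- integer(nrow(xy))
        part[left]  <- recursive_inertial_bisection(xy[left, , drop = FALSE],  depth - 1)
        part[!left] <- recursive_inertial_bisection(xy[!left, , drop = FALSE], depth - 1) +
                       max(part[left])
        part
      }

      set.seed(1)
      centroids <- cbind(runif(1000), runif(1000))                  # stand-in for element centroids
      parts <- recursive_inertial_bisection(centroids, depth = 3)   # 2^3 = 8 partitions
      table(parts)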

  11. Binary recursive partitioning: background, methods, and application to psychology.

    PubMed

    Merkle, Edgar C; Shaffer, Victoria A

    2011-02-01

    Binary recursive partitioning (BRP) is a computationally intensive statistical method that can be used in situations where linear models are often used. Instead of imposing many assumptions to arrive at a tractable statistical model, BRP simply seeks to accurately predict a response variable based on values of predictor variables. The method outputs a decision tree depicting the predictor variables that were related to the response variable, along with the nature of the variables' relationships. No significance tests are involved, and the tree's 'goodness' is judged based on its predictive accuracy. In this paper, we describe BRP methods in a detailed manner and illustrate their use in psychological research. We also provide R code for carrying out the methods.
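
    In the same spirit as the R code the authors provide, a brief rpart sketch (on the built-in iris data, for illustration only) shows a typical BRP workflow: grow a tree, inspect the cross-validated error, and prune by predictive accuracy rather than significance tests.

      ## Illustrative BRP workflow with rpart: grow a tree, inspect the cross-validated
      ## error, and prune at the complexity parameter with the lowest xerror.
      library(rpart)

      fit <- rpart(Species ~ ., data = iris, method = "class",
                   control = rpart.control(cp = 0.001, xval = 10))
      printcp(fit)                                   # cross-validated error by tree size

      best_cp <- fit$cptable[which.min(fit$cptable[, "xerror"]), "CP"]
      pruned  <- prune(fit, cp = best_cp)

      plot(pruned); text(pruned, use.n = TRUE)       # the decision tree output
      mean(predict(pruned, iris, type = "class") == iris$Species)  # resubstitution accuracy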

  12. Recursive inverse factorization.

    PubMed

    Rubensson, Emanuel H; Bock, Nicolas; Holmström, Erik; Niklasson, Anders M N

    2008-03-14

    A recursive algorithm for the inverse factorization S^(-1) = ZZ^* of Hermitian positive definite matrices S is proposed. The inverse factorization is based on iterative refinement [A.M.N. Niklasson, Phys. Rev. B 70, 193102 (2004)] combined with a recursive decomposition of S. As the computational kernel is matrix-matrix multiplication, the algorithm can be parallelized and the computational effort increases linearly with system size for systems with sufficiently sparse matrices. Recent advances in network theory are used to find appropriate recursive decompositions. We show that optimization of the so-called network modularity results in an improved partitioning compared to other approaches, in particular when the recursive inverse factorization is applied to overlap matrices of irregularly structured three-dimensional molecules.
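
    A toy dense-matrix sketch of the kind of iterative refinement referred to above is given below, using the first-order update Z <- Z(I + delta/2) with delta = I - Z^T S Z; this form of the refinement step is an assumption, and the sketch ignores the recursive, sparse decomposition that the actual algorithm exploits.

      ## Toy dense-matrix sketch of inverse factorization S^{-1} = Z Z^T by iterative
      ## refinement: repeatedly apply Z <- Z (I + delta/2) with delta = I - Z^T S Z.
      ## (The paper applies such refinement within a recursive, sparse decomposition of S;
      ## this dense loop only illustrates the refinement idea.)
      set.seed(42)
      n <- 50
      A <- matrix(rnorm(n * n), n)
      S <- crossprod(A) + diag(n)              # real symmetric positive definite test matrix

      Z <- diag(n) / sqrt(norm(S, "2"))        # scaled-identity starting guess (guarantees convergence)
      for (k in 1:30) {
        delta <- diag(n) - t(Z) %*% S %*% Z
        Z     <- Z %*% (diag(n) + 0.5 * delta)
        if (norm(delta, "F") < 1e-12) break
      }

      norm(Z %*% t(Z) %*% S - diag(n), "F")    # should be small (near machine precision)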

  13. Recursions for the exchangeable partition function of the seedbank coalescent.

    PubMed

    Kurt, Noemi; Rafler, Mathias

    2017-04-01

    For the seedbank coalescent with mutation under the infinite alleles assumption, which describes the gene genealogy of a population with a strong seedbank effect subject to mutations, we study the distribution of the final partition with mutation. This generalizes the coalescent with freeze by Dong et al. (2007) to coalescents where ancestral lineages are blocked from coalescing. We derive an implicit recursion which we show to have a unique solution and give an interpretation in terms of absorption problems of a random walk. Moreover, we derive recursions for the distribution of the number of blocks in the final partition. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Fragment-based prediction of skin sensitization using recursive partitioning

    NASA Astrophysics Data System (ADS)

    Lu, Jing; Zheng, Mingyue; Wang, Yong; Shen, Qiancheng; Luo, Xiaomin; Jiang, Hualiang; Chen, Kaixian

    2011-09-01

    Skin sensitization is an important toxic endpoint in the risk assessment of chemicals. In this paper, structure-activity relationship analysis was performed on the skin sensitization potential of 357 compounds with local lymph node assay data. Structural fragments were extracted by GASTON (GrAph/Sequence/Tree extractiON) from the training set. Eight fragments with accuracy significantly higher than 0.73 (p < 0.1) were retained to make up an indicator descriptor fragment. The fragment descriptor and eight other physicochemical descriptors closely related to the endpoint were calculated to construct the recursive partitioning tree (RP tree) for classification. The balanced accuracy of the training set, test set I, and test set II in the leave-one-out model were 0.846, 0.800, and 0.809, respectively. The results highlight that the fragment-based RP tree is a preferable method for identifying skin sensitizers. Moreover, the selected fragments provide useful structural information for exploring sensitization mechanisms, and the RP tree creates a graphic tree to identify the most important properties associated with skin sensitization. They can provide some guidance for the design of drugs with a lower sensitization level.

  15. Multi-jagged: A scalable parallel spatial partitioning algorithm

    DOE PAGES

    Deveci, Mehmet; Rajamanickam, Sivasankaran; Devine, Karen D.; ...

    2015-03-18

    Geometric partitioning is fast and effective for load-balancing dynamic applications, particularly those requiring geometric locality of data (particle methods, crash simulations). We present, to our knowledge, the first parallel implementation of a multidimensional-jagged geometric partitioner. In contrast to the traditional recursive coordinate bisection algorithm (RCB), which recursively bisects subdomains perpendicular to their longest dimension until the desired number of parts is obtained, our algorithm does recursive multi-section with a given number of parts in each dimension. By computing multiple cut lines concurrently and intelligently deciding when to migrate data while computing the partition, we minimize data movement compared to efficient implementations of recursive bisection. We demonstrate the algorithm's scalability and quality relative to the RCB implementation in Zoltan on both real and synthetic datasets. Our experiments show that the proposed algorithm performs and scales better than RCB in terms of run-time without degrading the load balance. Lastly, our implementation partitions 24 billion points into 65,536 parts within a few seconds and exhibits near perfect weak scaling up to 6K cores.
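
    A two-dimensional toy sketch of one level of multi-section ("jagged" partitioning) in R follows: cut the x-axis into px slabs at quantiles, then cut each slab independently along y. Function and parameter names are illustrative; the real implementation is parallel and weighted.

      ## One level of 2-D multi-section: px slabs along x, then py parts along y per slab.
      multijagged_2d <- function(x, y, px, py) {
        xs   <- cut(x, breaks = quantile(x, probs = seq(0, 1, length.out = px + 1)),
                    include.lowest = TRUE, labels = FALSE)
        part <- integer(length(x))
        for (s in seq_len(px)) {
          idx <- which(xs == s)
          ys  <- cut(y[idx], breaks = quantile(y[idx], probs = seq(0, 1, length.out = py + 1)),
                     include.lowest = TRUE, labels = FALSE)
          part[idx] <- (s - 1) * py + ys
        }
        part
      }

      set.seed(7)
      pts   <- cbind(runif(1e5), runif(1e5))
      parts <- multijagged_2d(pts[, 1], pts[, 2], px = 4, py = 4)   # 16 parts in one pass
      range(table(parts))   # near-balanced part sizes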

  16. An Introduction to Recursive Partitioning: Rationale, Application, and Characteristics of Classification and Regression Trees, Bagging, and Random Forests

    ERIC Educational Resources Information Center

    Strobl, Carolin; Malley, James; Tutz, Gerhard

    2009-01-01

    Recursive partitioning methods have become popular and widely used tools for nonparametric regression and classification in many scientific fields. Especially random forests, which can deal with large numbers of predictor variables even in the presence of complex interactions, have been applied successfully in genetics, clinical medicine, and…

  17. Detecting treatment-subgroup interactions in clustered data with generalized linear mixed-effects model trees.

    PubMed

    Fokkema, M; Smits, N; Zeileis, A; Hothorn, T; Kelderman, H

    2017-10-25

    Identification of subgroups of patients for whom treatment A is more effective than treatment B, and vice versa, is of key importance to the development of personalized medicine. Tree-based algorithms are helpful tools for the detection of such interactions, but none of the available algorithms allow for taking into account clustered or nested dataset structures, which are particularly common in psychological research. Therefore, we propose the generalized linear mixed-effects model tree (GLMM tree) algorithm, which allows for the detection of treatment-subgroup interactions, while accounting for the clustered structure of a dataset. The algorithm uses model-based recursive partitioning to detect treatment-subgroup interactions, and a GLMM to estimate the random-effects parameters. In a simulation study, GLMM trees show higher accuracy in recovering treatment-subgroup interactions, higher predictive accuracy, and lower type II error rates than linear-model-based recursive partitioning and mixed-effects regression trees. Also, GLMM trees show somewhat higher predictive accuracy than linear mixed-effects models with pre-specified interaction effects, on average. We illustrate the application of GLMM trees on an individual patient-level data meta-analysis on treatments for depression. We conclude that GLMM trees are a promising exploratory tool for the detection of treatment-subgroup interactions in clustered datasets.
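
    A brief sketch with the glmertree package (which implements GLMM trees) is shown below, using the DepressionDemo example data shipped with the package; the key idea is the formula layout: response ~ node model | random effects | partitioning variables.

      ## GLMM tree sketch: a treatment effect is estimated in each terminal node while a
      ## random intercept accounts for the clustered structure of the data.
      library(glmertree)

      data("DepressionDemo", package = "glmertree")
      lt <- lmertree(depression ~ treatment | cluster | age + duration + anxiety,
                     data = DepressionDemo)

      plot(lt)    # subgroups with node-specific treatment effects
      coef(lt)    # fixed-effect estimates per terminal node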

  18. Differential diagnosis of jaw pain using informatics technology.

    PubMed

    Nam, Y; Kim, H-G; Kho, H-S

    2018-05-21

    This study aimed to deduce evidence-based clinical clues that differentiate temporomandibular disorders (TMD)-mimicking conditions from genuine TMD by text mining using natural language processing (NLP) and recursive partitioning. We compared the medical records of 29 patients diagnosed with TMD-mimicking conditions and 290 patients diagnosed with genuine TMD. Chief complaints and medical histories were preprocessed via NLP to compare the frequency of word usage. In addition, recursive partitioning was used to deduce the optimal size of mouth opening, which could differentiate TMD-mimicking from genuine TMD groups. The prevalence of TMD-mimicking conditions was more evenly distributed across all age groups and showed a nearly equal gender ratio, which was significantly different from genuine TMD. TMD-mimicking conditions were caused by inflammation, infection, hereditary disease and neoplasm. Patients with TMD-mimicking conditions frequently used "mouth opening limitation" (P < .001), but less commonly used words such as "noise" (P < .001) and "temporomandibular joint" (P < .001) than patients with genuine TMD. A diagnostic classification tree on the basis of recursive partitioning suggested that 12.0 mm of comfortable mouth opening and 26.5 mm of maximum mouth opening were deduced as the most optimal mouth-opening cutoff sizes. When the combined analyses were performed based on both the text mining and clinical examination data, the predictive performance of the model was 96.6% with 69.0% sensitivity and 99.3% specificity in predicting TMD-mimicking conditions. In conclusion, this study showed that AI technology-based methods could be applied in the field of differential diagnosis of orofacial pain disorders. © 2018 John Wiley & Sons Ltd.

  19. Establishing Long-Term Efficacy in Chronic Disease: Use of Recursive Partitioning and Propensity Score Adjustment to Estimate Outcome in MS

    PubMed Central

    Goodin, Douglas S.; Jones, Jason; Li, David; Traboulsee, Anthony; Reder, Anthony T.; Beckmann, Karola; Konieczny, Andreas; Knappertz, Volker

    2011-01-01

    Context Establishing the long-term benefit of therapy in chronic diseases has been challenging. Long-term studies require non-randomized designs and, thus, are often confounded by biases. For example, although disease-modifying therapy in MS has a convincing benefit on several short-term outcome-measures in randomized trials, its impact on long-term function remains uncertain. Objective Data from the 16-year Long-Term Follow-up study of interferon-beta-1b is used to assess the relationship between drug-exposure and long-term disability in MS patients. Design/Setting To mitigate the bias of outcome-dependent exposure variation in non-randomized long-term studies, drug-exposure was measured as the medication-possession-ratio, adjusted up or down according to multiple different weighting-schemes based on MS severity and MS duration at treatment initiation. A recursive-partitioning algorithm assessed whether exposure (using any weighing scheme) affected long-term outcome. The optimal cut-point that was used to define “high” or “low” exposure-groups was chosen by the algorithm. Subsequent to verification of an exposure-impact that included all predictor variables, the two groups were compared using a weighted propensity-stratified analysis in order to mitigate any treatment-selection bias that may have been present. Finally, multiple sensitivity-analyses were undertaken using different definitions of long-term outcome and different assumptions about the data. Main Outcome Measure Long-Term Disability. Results In these analyses, the same weighting-scheme was consistently selected by the recursive-partitioning algorithm. This scheme reduced (down-weighted) the effectiveness of drug exposure as either disease duration or disability at treatment-onset increased. Applying this scheme and using propensity-stratification to further mitigate bias, high-exposure had a consistently better clinical outcome compared to low-exposure (Cox proportional hazard ratio = 0.30–0.42; p<0.0001). Conclusions Early initiation and sustained use of interferon-beta-1b has a beneficial impact on long-term outcome in MS. Our analysis strategy provides a methodological framework for bias-mitigation in the analysis of non-randomized clinical data. Trial Registration Clinicaltrials.gov NCT00206635 PMID:22140424
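
    A compressed sketch of the propensity-stratification step (not the study's exposure-weighting scheme) in R: estimate a propensity model for high exposure, stratify on its quintiles, and compare exposure groups in a stratified Cox model. The data frame ms and its columns are hypothetical stand-ins.

      ## Propensity-stratified comparison of high vs. low exposure (illustrative only).
      library(survival)

      ps  <- glm(high_exposure ~ baseline_edss + ms_duration + age + sex,
                 data = ms, family = binomial)
      ms$ps_strata <- cut(fitted(ps), quantile(fitted(ps), probs = seq(0, 1, 0.2)),
                          include.lowest = TRUE)       # propensity-score quintiles

      cox <- coxph(Surv(time_to_disability, event) ~ high_exposure + strata(ps_strata),
                   data = ms)
      summary(cox)   # hazard ratio for high vs. low exposure, propensity-stratified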

  20. Recursive Partitioning Analysis for New Classification of Patients With Esophageal Cancer Treated by Chemoradiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nomura, Motoo, E-mail: excell@hkg.odn.ne.jp; Department of Clinical Oncology, Aichi Cancer Center Hospital, Nagoya; Department of Radiation Oncology, Aichi Cancer Center Hospital, Nagoya

    2012-11-01

    Background: The 7th edition of the American Joint Committee on Cancer staging system does not include lymph node size in the guidelines for staging patients with esophageal cancer. The objectives of this study were to determine the prognostic impact of the maximum metastatic lymph node diameter (ND) on survival and to develop and validate a new staging system for patients with esophageal squamous cell cancer who were treated with definitive chemoradiotherapy (CRT). Methods: Information on 402 patients with esophageal cancer undergoing CRT at two institutions was reviewed. Univariate and multivariate analyses of data from one institution were used to assess the impact of clinical factors on survival, and recursive partitioning analysis was performed to develop the new staging classification. To assess its clinical utility, the new classification was validated using data from the second institution. Results: By multivariate analysis, gender, T, N, and ND stages were independently and significantly associated with survival (p < 0.05). The resulting new staging classification was based on the T and ND. The four new stages led to good separation of survival curves in both the developmental and validation datasets (p < 0.05). Conclusions: Our results showed that lymph node size is a strong independent prognostic factor and that the new staging system, which incorporated lymph node size, provided good prognostic power, and discriminated effectively for patients with esophageal cancer undergoing CRT.

  1. Adjuvant treatment may benefit patients with high-risk upper rectal cancer: A nomogram and recursive partitioning analysis of 547 patients.

    PubMed

    Wang, Xin; Jin, Jing; Yang, Yong; Liu, Wen-Yang; Ren, Hua; Feng, Yan-Ru; Xiao, Qin; Li, Ning; Deng, Lei; Fang, Hui; Jing, Hao; Lu, Ning-Ning; Tang, Yu; Wang, Jian-Yang; Wang, Shu-Lian; Wang, Wei-Hu; Song, Yong-Wen; Liu, Yue-Ping; Li, Ye-Xiong

    2016-10-04

    The role of adjuvant chemoradiotherapy (ACRT) or adjuvant chemotherapy (ACT) in treating patients with locally advanced upper rectal cancer (URC) after total mesorectal excision (TME) surgery remains unclear. We developed a clinical nomogram and a recursive partitioning analysis (RPA)-based risk stratification system for predicting 5-year cancer-specific survival (CSS) to determine whether these individuals require ACRT or ACT. This retrospective analysis included 547 patients with primary URC. A nomogram was developed based on the Cox regression model. The performance of the model was assessed by concordance index (C-index) and calibration curve in internal validation with bootstrapping. RPA stratified patients into risk groups based on their tumor characteristics. Five independent prognostic factors (age, preoperative increased carcinoembryonic antigen and carcinoma antigen 19-9, positive lymph node [PLN] number, tumor deposit [TD], pathological T classification) were identified and entered into the predictive nomogram. The bootstrap-corrected C-index was 0.757. RPA stratification of the three prognostic groups showed obviously different prognosis. Only the high-risk group (patients with PLN ≤ 6 and TD, or PLN > 6) benefited from ACRT plus ACT when compared with surgery followed by ACRT or ACT, and surgery alone (5-year CSS: 70.8% vs. 57.8% vs. 15.6%, P < 0.001). Our nomogram predicts 5-year CSS after TME surgery for locally advanced rectal cancer and RPA-based stratification indicates that ACRT plus ACT post-surgery may be an important treatment plan with potentially significant survival advantages in high-risk URC. This may help to select candidates for adjuvant treatment in prospective studies.

  2. TREAT (TREe-based Association Test)

    Cancer.gov

    TREAT is an R package for detecting complex joint effects in case-control studies. The test statistic is derived from a tree-structure model by recursively partitioning the data. An ultra-fast algorithm is designed to evaluate the significance of the association between a candidate gene and disease outcome.

  3. Recursive partition analysis of peritoneal and systemic recurrence in patients with gastric cancer who underwent D2 gastrectomy: Implications for neoadjuvant therapy consideration.

    PubMed

    Chang, Jee Suk; Kim, Kyung Hwan; Keum, Ki Chang; Noh, Sung Hoon; Lim, Joon Seok; Kim, Hyo Song; Rha, Sun Young; Lee, Yong Chan; Hyung, Woo Jin; Koom, Woong Sub

    2016-12-01

    To classify patients with nonmetastatic advanced gastric cancer who underwent D2-gastrectomy into prognostic groups based on peritoneal and systemic recurrence risks. Between 2004 and 2007, 1,090 patients with T3-4 or N+ gastric cancer were identified from our registry. Recurrence rates were estimated using a competing-risk analysis. Different prognostic groups were defined using recursive partitioning analysis (RPA). Median follow-up was 7 years. In the RPA-model for peritoneal recurrence risk, the initial node was split by T stage, indicating that differences between patients with T1-3 and T4 cancer were the greatest. The 5-year peritoneal recurrence rates for patients with T4 (n = 627) and T1-3 (n = 463) disease were 34.3% and 9.1%, respectively. N stage and neural invasion had an additive impact on high-risk patients. The RPA model for systemic relapse incorporated N stage alone and gave two terminal nodes: N0-2 (n = 721) and N3 (n = 369). The 5-year cumulative incidences were 7.7% and 24.5%, respectively. We proposed risk stratification models of peritoneal and systemic recurrence in patients undergoing D2-gastrectomy. This classification could be used for stratification protocols in future studies evaluating adjuvant therapies such as preoperative chemoradiotherapy. J. Surg. Oncol. 2016;114:859-864. © 2016 Wiley Periodicals, Inc.

  4. High Performance Computing Based Parallel Hierarchical Modal Association Clustering (HPAR HMAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patlolla, Dilip R; Surendran Nair, Sujithkumar; Graves, Daniel A.

    For many applications, clustering is a crucial step in order to gain insight into the makeup of a dataset. The best approach to a given problem often depends on a variety of factors, such as the size of the dataset, time restrictions, and soft clustering requirements. The HMAC algorithm seeks to combine the strengths of 2 particular clustering approaches: model-based and linkage-based clustering. One particular weakness of HMAC is its computational complexity. HMAC is not practical for mega-scale data clustering. For high-definition imagery, a user would have to wait months or years for a result; for a 16-megapixel image, the estimated runtime skyrockets to over a decade! To improve the execution time of HMAC, it is reasonable to consider a multi-core implementation that utilizes available system resources. An existing implementation (Ray and Cheng 2014) divides the dataset into N partitions - one for each thread prior to executing the HMAC algorithm. This implementation benefits from 2 types of optimization: parallelization and divide-and-conquer. By running each partition in parallel, the program is able to accelerate computation by utilizing more system resources. Although the parallel implementation provides considerable improvement over the serial HMAC, it still suffers from poor computational complexity, O(N²). Once the maximum number of cores on a system is exhausted, the program exhibits slower behavior. We now consider a modification to HMAC that involves a recursive partitioning scheme. Our modification aims to exploit divide-and-conquer benefits seen by the parallel HMAC implementation. At each level in the recursion tree, partitions are divided into 2 sub-partitions until a threshold size is reached. When the partition can no longer be divided without falling below the threshold size, the base HMAC algorithm is applied. This results in a significant speedup over the parallel HMAC.
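
    A small R sketch of the recursive partition-then-cluster pattern described above: split recursively until a threshold size is reached, then run a base clustering step on each leaf in parallel. kmeans stands in for the HMAC base step and all names are illustrative.

      ## Recursive divide-and-conquer followed by a parallel base clustering step.
      library(parallel)

      recursive_split <- function(x, threshold) {
        if (nrow(x) <= threshold) return(list(x))
        d    <- which.max(apply(x, 2, function(col) diff(range(col))))  # widest dimension
        left <- x[, d] <= median(x[, d])
        c(recursive_split(x[left, , drop = FALSE], threshold),
          recursive_split(x[!left, , drop = FALSE], threshold))
      }

      set.seed(11)
      x <- matrix(rnorm(2e5 * 2), ncol = 2)
      leaves  <- recursive_split(x, threshold = 5e4)
      ## mclapply forks on Unix-alikes; on Windows use parLapply with a cluster instead
      results <- mclapply(leaves, function(part) kmeans(part, centers = 3),
                          mc.cores = 2)
      sapply(results, function(r) r$tot.withinss)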

  5. Detection of Problem Gambler Subgroups Using Recursive Partitioning

    ERIC Educational Resources Information Center

    Markham, Francis; Young, Martin; Doran, Bruce

    2013-01-01

    The multivariate socio-demographic risk factors for problem gambling have been well documented. While this body of research is valuable in determining risk factors aggregated across various populations, the majority of studies tend not to specifically identify particular subgroups of problem gamblers based on the interaction between variables. The…

  6. Statistically extracted fundamental watershed variables for estimating the loads of total nitrogen in small streams

    USGS Publications Warehouse

    Kronholm, Scott C.; Capel, Paul D.; Terziotti, Silvia

    2016-01-01

    Accurate estimation of total nitrogen loads is essential for evaluating conditions in the aquatic environment. Extrapolation of estimates beyond measured streams will greatly expand our understanding of total nitrogen loading to streams. Recursive partitioning and random forest regression were used to assess 85 geospatial, environmental, and watershed variables across 636 small (<585 km2) watersheds to determine which variables are fundamentally important to the estimation of annual loads of total nitrogen. Initial analysis led to the splitting of watersheds into three groups based on predominant land use (agricultural, developed, and undeveloped). Nitrogen application, agricultural and developed land area, and impervious or developed land in the 100-m stream buffer were commonly extracted variables by both recursive partitioning and random forest regression. A series of multiple linear regression equations utilizing the extracted variables were created and applied to the watersheds. As few as three variables explained as much as 76 % of the variability in total nitrogen loads for watersheds with predominantly agricultural land use. Catchment-scale national maps were generated to visualize the total nitrogen loads and yields across the USA. The estimates provided by these models can inform water managers and help identify areas where more in-depth monitoring may be beneficial.
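
    A hedged sketch of the variable-screening idea in R: rank candidate watershed variables with a random forest, then build a small multiple linear regression from the top predictors. The data frame sheds and its tn_load column are hypothetical stand-ins.

      ## Rank candidate variables by random-forest importance, then fit a small
      ## multiple linear regression from the top three.
      library(randomForest)

      rf <- randomForest(tn_load ~ ., data = sheds, ntree = 1000, importance = TRUE)
      imp  <- importance(rf, type = 1)                     # %IncMSE importance
      top3 <- rownames(imp)[order(imp[, 1], decreasing = TRUE)][1:3]

      lm_fit <- lm(reformulate(top3, response = "tn_load"), data = sheds)
      summary(lm_fit)$r.squared                            # variance explained by 3 variables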

  7. Pathways to Early Coital Debut for Adolescent Girls: A Recursive Partitioning Analysis

    PubMed Central

    Pearson, Matthew R.; Kholodkov, Tatyana; Henson, James M.; Impett, Emily A.

    2011-01-01

    The current study examined pathways to early coital debut among early to middle adolescent girls in the United States. In a two-year longitudinal study of 104 adolescent girls, we conducted Recursive Partitioning (RP) analyses to examine the specific factors that were related to engaging in first intercourse by the 10th grade among adolescent girls who had not yet engaged in sexual intercourse by the 8th grade. RP analyses identified subsamples of girls who had low, medium, and high likelihoods of engaging in early coital debut based on six variables (i.e., school aspirations, early physical intimacy experiences, depression, body objectification, body image, and relationship inauthenticity). For example, girls in the lowest likelihood group (3% had engaged in sex by the 10th grade) reported no prior experiences with being touched under their clothes, low body objectification, high aspirations to complete graduate education, and low depressive symptoms; girls in the highest likelihood group (75% had engaged in sex by the 10th grade) also reported no prior experiences with being touched under their clothes but had high levels of body objectification. The implications of these analyses for the development of female adolescent sexuality as well as for advances in quantitative methods are discussed. PMID:21512947

  8. Recursive time-varying filter banks for subband image coding

    NASA Technical Reports Server (NTRS)

    Smith, Mark J. T.; Chung, Wilson C.

    1992-01-01

    Filter banks and wavelet decompositions that employ recursive filters have been considered previously and are recognized for their efficiency in partitioning the frequency spectrum. This paper presents an analysis of a new infinite impulse response (IIR) filter bank in which these computationally efficient filters may be changed adaptively in response to the input. The filter bank is presented and discussed in the context of finite-support signals with the intended application in subband image coding. In the absence of quantization errors, exact reconstruction can be achieved and by the proper choice of an adaptation scheme, it is shown that IIR time-varying filter banks can yield improvement over conventional ones.

  9. CD process control through machine learning

    NASA Astrophysics Data System (ADS)

    Utzny, Clemens

    2016-10-01

    For the specific requirements of the 14nm and 20nm site applications, a new CD map approach was developed at the AMTC. This approach relies on a well-established machine learning technique called recursive partitioning. Recursive partitioning is a powerful technique which creates a decision tree by successively testing whether the quantity of interest can be explained by one of the supplied covariates. The test performed is generally a statistical test with a pre-supplied significance level. Once the test indicates a significant association between the variable of interest and a covariate, a split is performed at a threshold value that minimizes the variation within the newly attained groups. This partitioning is repeated until either no significant association can be detected or the resulting subgroup size falls below a pre-supplied level.

  10. Major depressive disorder subtypes to predict long-term course

    PubMed Central

    van Loo, Hanna M.; Cai, Tianxi; Gruber, Michael J.; Li, Junlong; de Jonge, Peter; Petukhova, Maria; Rose, Sherri; Sampson, Nancy A.; Schoevers, Robert A.; Wardenaar, Klaas J.; Wilcox, Marsha A.; Al-Hamzawi, Ali Obaid; Andrade, Laura Helena; Bromet, Evelyn J.; Bunting, Brendan; Fayyad, John; Florescu, Silvia E.; Gureje, Oye; Hu, Chiyi; Huang, Yueqin; Levinson, Daphna; Medina-Mora, Maria Elena; Nakane, Yoshibumi; Posada-Villa, Jose; Scott, Kate M.; Xavier, Miguel; Zarkov, Zahari; Kessler, Ronald C.

    2016-01-01

    Background Variation in course of major depressive disorder (MDD) is not strongly predicted by existing subtype distinctions. A new subtyping approach is considered here. Methods Two data mining techniques, ensemble recursive partitioning and Lasso generalized linear models (GLMs) followed by k-means cluster analysis, are used to search for subtypes based on index episode symptoms predicting subsequent MDD course in the World Mental Health (WMH) Surveys. The WMH surveys are community surveys in 16 countries. Lifetime DSM-IV MDD was reported by 8,261 respondents. Retrospectively reported outcomes included measures of persistence (number of years with an episode; number of years with an episode lasting most of the year) and severity (hospitalization for MDD; disability due to MDD). Results Recursive partitioning found significant clusters defined by the conjunctions of early onset, suicidality, and anxiety (irritability, panic, nervousness-worry-anxiety) during the index episode. GLMs found additional associations involving a number of individual symptoms. Predicted values of the four outcomes were strongly correlated. Cluster analysis of these predicted values found three clusters having consistently high, intermediate, or low predicted scores across all outcomes. The high-risk cluster (30.0% of respondents) accounted for 52.9-69.7% of high persistence and severity and was most strongly predicted by index episode severe dysphoria, suicidality, anxiety, and early onset. A total symptom count, in comparison, was not a significant predictor. Conclusions Despite being based on retrospective reports, results suggest that useful MDD subtyping distinctions can be made using data mining methods. Further studies are needed to test and expand these results with prospective data. PMID:24425049
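
    A schematic sketch of the two-stage approach in R: Lasso GLMs (via glmnet) predict each course outcome from index-episode symptoms, and k-means then clusters the predicted risk profiles. Here symptoms (a numeric matrix) and the outcome vectors y_persist and y_hosp are hypothetical stand-ins, and the ensemble recursive-partitioning stage is omitted.

      ## Lasso GLMs predict each outcome, then k-means clusters the predicted profiles.
      library(glmnet)

      outcomes <- list(persistence = y_persist, hospitalization = y_hosp)  # hypothetical
      pred <- sapply(outcomes, function(y) {
        cv <- cv.glmnet(symptoms, y, family = "binomial", alpha = 1)       # Lasso GLM
        as.numeric(predict(cv, symptoms, s = "lambda.min", type = "response"))
      })

      km <- kmeans(scale(pred), centers = 3, nstart = 25)   # low / intermediate / high risk
      table(km$cluster)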

  11. Cooperating reduction machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kluge, W.E.

    1983-11-01

    This paper presents a concept and a system architecture for the concurrent execution of program expressions of a concrete reduction language based on lambda-expressions. If formulated appropriately, these expressions are well-suited for concurrent execution, following a demand-driven model of computation. In particular, recursive program expressions with nonlinear expansion may, at run time, recursively be partitioned into a hierarchy of independent subexpressions which can be reduced by a corresponding hierarchy of virtual reduction machines. This hierarchy unfolds and collapses dynamically, with virtual machines recursively assuming the role of masters that create and eventually terminate, or synchronize with, slaves. The paper also proposes a nonhierarchically organized system of reduction machines, each featuring a stack architecture, that effectively supports the allocation of virtual machines to the real machines of the system in compliance with their hierarchical order of creation and termination. 25 references.

  12. Assessing the relative importance of correlates of loneliness in later life. Gaining insight using recursive partitioning.

    PubMed

    Ejlskov, Linda; Wulff, Jesper; Bøggild, Henrik; Kuh, Diana; Stafford, Mai

    2017-09-08

    Improving the design and targeting of interventions is important for alleviating loneliness among older adults. This requires identifying which correlates are the most important predictors of loneliness. This study demonstrates the use of recursive partitioning in exploring the characteristics and assessing the relative importance of correlates of loneliness in older adults. Using exploratory regression trees and random forests, we examined combinations and the relative importance of 42 correlates in relation to loneliness at age 68 among 2453 participants from the birth cohort study the MRC National Survey of Health and Development. Positive mental well-being, personal mastery, identifying the spouse as the closest confidant, being extrovert and informal social contact were the most important correlates of lower loneliness levels. Participation in organised groups and demographic correlates were poor identifiers of loneliness. The regression tree suggested that loneliness was not raised among those with poor mental wellbeing if they identified their partner as closest confidante and had frequent social contact. Recursive partitioning can identify which combinations of experiences and circumstances characterise high-risk groups. Poor mental wellbeing and sparse social contact emerged as especially important and classical demographic factors as insufficient in identifying high loneliness levels among older adults.

  13. A Random Walk Approach to Query Informative Constraints for Clustering.

    PubMed

    Abin, Ahmad Ali

    2017-08-09

    This paper presents a random walk approach to the problem of querying informative constraints for clustering. The proposed method is based on the properties of the commute time, that is, the expected time taken for a random walk to travel between two nodes and return, on the adjacency graph of the data. Commute time has the nice property that the more short paths connect two given nodes in a graph, the more similar those nodes are. Since computing the commute time takes the Laplacian eigenspectrum into account, we use this property in a recursive fashion to query informative constraints for clustering. At each recursion, the proposed method constructs the adjacency graph of the data and utilizes the spectral properties of the commute time matrix to bipartition the adjacency graph. Thereafter, the proposed method uses the commute-time distance on the graph to query informative constraints between partitions. This process iterates for each partition until the stop condition becomes true. Experiments on real-world data show the efficiency of the proposed method for constraint selection.
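
    For reference, commute times can be computed from the Moore-Penrose pseudoinverse of the graph Laplacian as C(i, j) = vol(G) (L+_ii + L+_jj - 2 L+_ij); the short R sketch below does this for a small hand-built graph (the constraint-querying logic of the paper is not reproduced).

      ## Commute times from the pseudoinverse of the graph Laplacian.
      library(MASS)   # ginv()

      n <- 10
      A <- matrix(0, n, n)
      A[cbind(1:n, c(2:n, 1))] <- 1
      A <- A + t(A)                                  # ring graph: connected by construction
      A[1, 5] <- A[5, 1] <- 1; A[3, 8] <- A[8, 3] <- 1   # two extra chords

      L   <- diag(rowSums(A)) - A                    # graph Laplacian
      Lp  <- ginv(L)                                 # Moore-Penrose pseudoinverse
      vol <- sum(A)                                  # graph volume (sum of degrees)

      commute <- vol * (outer(diag(Lp), diag(Lp), "+") - 2 * Lp)
      commute[1:4, 1:4]   # small commute times = many short connecting paths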

  14. Item-focussed Trees for the Identification of Items in Differential Item Functioning.

    PubMed

    Tutz, Gerhard; Berger, Moritz

    2016-09-01

    A novel method for the identification of differential item functioning (DIF) by means of recursive partitioning techniques is proposed. We assume an extension of the Rasch model that allows for DIF being induced by an arbitrary number of covariates for each item. Recursive partitioning on the item level results in one tree for each item and leads to simultaneous selection of items and variables that induce DIF. For each item, it is possible to detect groups of subjects with different item difficulties, defined by combinations of characteristics that are not pre-specified. The way a DIF item is determined by covariates is visualized in a small tree and therefore easily accessible. An algorithm is proposed that is based on permutation tests. Various simulation studies, including the comparison with traditional approaches to identify items with DIF, show the applicability and the competitive performance of the method. Two applications illustrate the usefulness and the advantages of the new method.

  15. Prognostic Indexes for Brain Metastases: Which Is the Most Powerful?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arruda Viani, Gustavo, E-mail: gusviani@gmail.com; Bernardes da Silva, Lucas Godoi; Stefano, Eduardo Jose

    Purpose: The purpose of the present study was to compare the prognostic indexes (PIs) of patients with brain metastases (BMs) treated with whole brain radiotherapy (WBRT) using an artificial neural network. This analysis is important because it evaluates the prognostic power of each PI to guide clinical decision-making and outcomes research. Methods and Materials: A retrospective prognostic study was conducted of 412 patients with BMs who underwent WBRT between April 1998 and March 2010. The eligibility criteria for patients included having undergone WBRT or WBRT plus neurosurgery. The data were analyzed using the artificial neural network. The input neural data consisted of all prognostic factors included in the 5 PIs (recursive partitioning analysis, graded prognostic assessment [GPA], basic score for BMs, Rotterdam score, and Germany score). The data set was randomly divided into 300 training and 112 testing examples for survival prediction. All 5 PIs were compared using our database of 412 patients with BMs. The sensitivity of the 5 indexes to predict survival according to their input variables was determined statistically using receiver operating characteristic curves. The importance of each variable from each PI was subsequently evaluated. Results: The overall 1-, 2-, and 3-year survival rate was 22%, 10.2%, and 5.1%, respectively. All classes of PIs were significantly associated with survival (recursive partitioning analysis, P < .0001; GPA, P < .0001; basic score for BMs, P = .002; Rotterdam score, P = .001; and Germany score, P < .0001). Comparing the areas under the curves, the GPA was statistically most sensitive in predicting survival (GPA, 86%; recursive partitioning analysis, 81%; basic score for BMs, 79%; Rotterdam, 73%; and Germany score, 77%; P < .001). Among the variables included in each PI, the performance status and presence of extracranial metastases were the most important factors. Conclusion: A variety of prognostic models describe the survival of patients with BMs to a more or less satisfactory degree. Among the 5 PIs evaluated in the present study, GPA was the most powerful in predicting survival. Additional studies should include emerging biologic prognostic factors to improve the sensitivity of these PIs.

  16. Condensate statistics and thermodynamics of weakly interacting Bose gas: Recursion relation approach

    NASA Astrophysics Data System (ADS)

    Dorfman, K. E.; Kim, M.; Svidzinsky, A. A.

    2011-03-01

    We study condensate statistics and thermodynamics of a weakly interacting Bose gas with a fixed total number N of particles in a cubic box. We find the exact recursion relation for the canonical ensemble partition function. Using this relation, we calculate the distribution function of condensate particles for N=200. We also calculate the distribution function based on a multinomial expansion of the characteristic function. Similar to the ideal gas, both approaches give exact statistical moments for all temperatures in the framework of the Bogoliubov model. We compare them with the results of the unconstrained canonical ensemble quasiparticle formalism and the hybrid master equation approach. The present recursion relation can be used for any external potential and boundary conditions. We investigate the temperature dependence of the first few statistical moments of condensate fluctuations as well as thermodynamic potentials and heat capacity analytically and numerically in the whole temperature range.
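
    For orientation, the well-known recursion for the canonical partition function of the ideal Bose gas, Z_N = (1/N) sum_{k=1..N} z(k beta) Z_{N-k} with z the single-particle partition function, is sketched in R below for particle-in-a-box levels; the paper's relation for the interacting (Bogoliubov) gas is more involved, so this only illustrates the recursion-relation approach itself.

      ## Ideal-Bose-gas canonical recursion: Z_N = (1/N) * sum_k z(k*beta) * Z_{N-k}, Z_0 = 1,
      ## with z(beta) = sum_j exp(-beta * eps_j) the single-particle partition function.
      ideal_bose_Z <- function(N, beta, nmax = 20) {
        n2  <- as.matrix(expand.grid(1:nmax, 1:nmax, 1:nmax))
        eps <- rowSums(n2^2) - 3                      # cubic-box levels, ground state shifted to 0
        zk  <- sapply(1:N, function(k) sum(exp(-k * beta * eps)))  # z(k*beta), k = 1..N
        Z   <- numeric(N + 1); Z[1] <- 1              # Z[1] holds Z_0 = 1
        for (n in 1:N) {
          Z[n + 1] <- sum(zk[1:n] * Z[n:1]) / n       # pairs z(k*beta) with Z_{n-k}
        }
        Z
      }

      Z <- ideal_bose_Z(N = 200, beta = 0.5)
      ## condensate occupation statistics can then be built from ratios of such Z's
      log(Z[201])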

  17. Systemic treatment after whole-brain radiotherapy may improve survival in RPA class II/III breast cancer patients with brain metastasis.

    PubMed

    Zhang, Qian; Chen, Jian; Yu, Xiaoli; Ma, Jinli; Cai, Gang; Yang, Zhaozhi; Cao, Lu; Chen, Xingxing; Guo, Xiaomao; Chen, Jiayi

    2013-09-01

    Whole brain radiotherapy (WBRT) is the most widely used treatment for brain metastasis (BM), especially for patients with multiple intracranial lesions. The purpose of this study was to examine the efficacy of systemic treatments following WBRT in breast cancer patients with BM who had different clinical characteristics, based on the classification of the Radiation Therapy Oncology Group recursive partitioning analysis (RPA) and the breast cancer-specific Graded Prognostic Assessment (Breast-GPA). One hundred and one breast cancer patients with BM treated between 2006 and 2010 were analyzed. The median interval between breast cancer diagnosis and identification of BM in the triple-negative patients was shorter than in the luminal A subtype (26 vs. 36 months, respectively; P = 0.021). Univariate analysis indicated that age at BM diagnosis, Karnofsky performance status/recursive partitioning analysis (KPS/RPA) classes, number of BMs, primary tumor control, extracranial metastases and systemic treatment following WBRT were significant prognostic factors for overall survival (OS) (P < 0.05). Multivariate analysis revealed that KPS/RPA classes and systemic treatments following WBRT remained the significant prognostic factors for OS. For RPA class I, the median survival with and without systemic treatments following WBRT was 25 and 22 months, respectively (P = 0.819), while for RPA class II/III systemic treatments significantly improved OS from 7 and 2 months to 11 and 5 months, respectively (P < 0.05). Our results suggested that triple-negative patients had a shorter interval between initial diagnosis and the development of BM than luminal A patients. Systemic treatments following WBRT improved the survival of RPA class II/III patients.

  18. New Breast Cancer Recursive Partitioning Analysis Prognostic Index in Patients With Newly Diagnosed Brain Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niwinska, Anna, E-mail: alphaonetau@poczta.onet.pl; Murawska, Magdalena

    2012-04-01

    Purpose: The aim of the study was to present a new breast cancer recursive partitioning analysis (RPA) prognostic index for patients with newly diagnosed brain metastases as a guide in clinical decision making. Methods and Materials: A prospectively collected group of 441 consecutive patients with breast cancer and brain metastases treated between the years 2003 and 2009 was assessed. Prognostic factors significant for univariate analysis were included into RPA. Results: Three prognostic classes of a new breast cancer RPA prognostic index were selected. The median survival of patients within prognostic Classes I, II, and III was 29, 9, and 2.4 months, respectively (p < 0.0001). Class I included patients with one or two brain metastases, without extracranial disease or with controlled extracranial disease, and with Karnofsky performance status (KPS) of 100. Class III included patients with multiple brain metastases with KPS of ≤60. Class II included all other cases. Conclusions: The breast cancer RPA prognostic index is an easy and valuable tool for use in clinical practice. It can select patients who require aggressive treatment and those in whom whole-brain radiotherapy or symptomatic therapy is the most reasonable option. An individual approach is required for patients from prognostic Class II.

  19. Prediction of mutagenic toxicity by combination of Recursive Partitioning and Support Vector Machines.

    PubMed

    Liao, Quan; Yao, Jianhua; Yuan, Shengang

    2007-05-01

    The study of prediction of toxicity is very important and necessary because measurement of toxicity is typically time-consuming and expensive. In this paper, the Recursive Partitioning (RP) method was used to select descriptors. RP and Support Vector Machines (SVM) were then used to construct structure-toxicity relationship models (an RP model and an SVM model, respectively). The performances of the two models differ. The prediction accuracies of the RP model are 80.2% for mutagenic compounds in MDL's toxicity database, 83.4% for compounds in CMC, and 84.9% for agrochemicals in an in-house database, respectively. Those of the SVM model are 81.4%, 87.0%, and 87.3%, respectively.
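
    As a concrete illustration of the two-stage scheme (recursive partitioning to select descriptors, then an SVM on the retained descriptors), the following sketch uses scikit-learn on synthetic data; the MDL, CMC, and in-house datasets are not reproduced here, and the tree depth, descriptor count, and SVM settings are assumptions.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Toy stand-ins for molecular descriptors (X) and mutagenicity labels (y).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 50))
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=300) > 0).astype(int)

        # Stage 1: recursive partitioning ranks the descriptors.
        tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)
        selected = np.argsort(tree.feature_importances_)[::-1][:10]   # keep the top 10

        # Stage 2: an SVM is trained on the selected descriptors only.
        svm = SVC(kernel="rbf", C=1.0, gamma="scale")
        accuracy = cross_val_score(svm, X[:, selected], y, cv=5).mean()
        print(f"5-fold CV accuracy on the selected descriptors: {accuracy:.3f}")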

  20. Perceived Organizational Support for Enhancing Welfare at Work: A Regression Tree Model

    PubMed Central

    Giorgi, Gabriele; Dubin, David; Perez, Javier Fiz

    2016-01-01

    When trying to examine outcomes such as welfare and well-being, research tends to focus on main effects and to take into account only a limited number of variables at a time. There are a number of techniques that may help address this problem. For example, many statistical packages available in R provide easy-to-use methods for modeling complicated analyses such as classification and regression trees (i.e., recursive partitioning). The present research illustrates the value of recursive partitioning in the prediction of perceived organizational support in a sample of more than 6000 Italian bankers. Utilizing the tree-fitting functions of the party package in R, we estimated a regression tree model predicting perceived organizational support from a multitude of job characteristics including job demand, lack of job control, lack of supervisor support, training, etc. The resulting model appears particularly helpful in pointing out several interactions in the prediction of perceived organizational support. In particular, training is the dominant factor. Another dimension that seems to influence organizational support is reporting (perceived communication about safety and stress concerns). Results are discussed from a theoretical and methodological point of view. PMID:28082924

  1. P-HARP: A parallel dynamic spectral partitioner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sohn, A.; Biswas, R.; Simon, H.D.

    1997-05-01

    Partitioning unstructured graphs is central to the parallel solution of problems in computational science and engineering. The authors have introduced earlier the sequential version of an inertial spectral partitioner called HARP which maintains the quality of recursive spectral bisection (RSB) while forming the partitions an order of magnitude faster than RSB. The serial HARP is known to be the fastest spectral partitioner to date, three to four times faster than similar partitioners on a variety of meshes. This paper presents a parallel version of HARP, called P-HARP. Two types of parallelism have been exploited: loop level parallelism and recursive parallelism. P-HARP has been implemented in MPI on the SGI/Cray T3E and the IBM SP2. Experimental results demonstrate that P-HARP can partition a mesh of over 100,000 vertices into 256 partitions in 0.25 seconds on a 64-processor T3E. Experimental results further show that P-HARP can give nearly a 20-fold speedup on 64 processors. These results indicate that graph partitioning is no longer a major bottleneck that hinders the advancement of computational science and engineering for dynamically-changing real-world applications.
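
    For readers unfamiliar with the RSB baseline mentioned above, the sketch below implements plain recursive spectral bisection (split on the Fiedler vector of the graph Laplacian, then recurse) on a small grid graph. It is not HARP's faster inertial-spectral scheme, and a dense eigensolver is used only because the example graph is tiny.

        import numpy as np

        def fiedler_bisect(sub_adj, nodes):
            """Split `nodes` into two halves by sorting along the Fiedler vector."""
            laplacian = np.diag(sub_adj.sum(axis=1)) - sub_adj
            _, vecs = np.linalg.eigh(laplacian)
            order = np.argsort(vecs[:, 1])   # eigenvector of the 2nd smallest eigenvalue
            half = len(nodes) // 2
            return nodes[order[:half]], nodes[order[half:]]

        def recursive_spectral_bisection(adj, nodes, parts):
            """Recursively bisect until the requested (power-of-two) number of parts."""
            if parts == 1:
                return [nodes]
            left, right = fiedler_bisect(adj[np.ix_(nodes, nodes)], nodes)
            return (recursive_spectral_bisection(adj, left, parts // 2) +
                    recursive_spectral_bisection(adj, right, parts // 2))

        # Example: a 4x4 grid graph partitioned into 4 parts.
        n = 4
        A = np.zeros((n * n, n * n))
        for i in range(n):
            for j in range(n):
                k = i * n + j
                if i + 1 < n:
                    A[k, k + n] = A[k + n, k] = 1
                if j + 1 < n:
                    A[k, k + 1] = A[k + 1, k] = 1
        print(recursive_spectral_bisection(A, np.arange(n * n), parts=4))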

  2. Censored quantile regression with recursive partitioning-based weights

    PubMed Central

    Wey, Andrew; Wang, Lan; Rudser, Kyle

    2014-01-01

    Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800

  3. Greedy feature selection for glycan chromatography data with the generalized Dirichlet distribution

    PubMed Central

    2013-01-01

    Background Glycoproteins are involved in a diverse range of biochemical and biological processes. Changes in protein glycosylation are believed to occur in many diseases, particularly during cancer initiation and progression. The identification of biomarkers for human disease states is becoming increasingly important, as early detection is key to improving survival and recovery rates. To this end, the serum glycome has been proposed as a potential source of biomarkers for different types of cancers. High-throughput hydrophilic interaction liquid chromatography (HILIC) technology for glycan analysis allows for the detailed quantification of the glycan content in human serum. However, the experimental data from this analysis is compositional by nature. Compositional data are subject to a constant-sum constraint, which restricts the sample space to a simplex. Statistical analysis of glycan chromatography datasets should account for their unusual mathematical properties. As the volume of glycan HILIC data being produced increases, there is a considerable need for a framework to support appropriate statistical analysis. Proposed here is a methodology for feature selection in compositional data. The principal objective is to provide a template for the analysis of glycan chromatography data that may be used to identify potential glycan biomarkers. Results A greedy search algorithm, based on the generalized Dirichlet distribution, is carried out over the feature space to search for the set of “grouping variables” that best discriminate between known group structures in the data, modelling the compositional variables using beta distributions. The algorithm is applied to two glycan chromatography datasets. Statistical classification methods are used to test the ability of the selected features to differentiate between known groups in the data. Two well-known methods are used for comparison: correlation-based feature selection (CFS) and recursive partitioning (rpart). CFS is a feature selection method, while recursive partitioning is a learning tree algorithm that has been used for feature selection in the past. Conclusions The proposed feature selection method performs well for both glycan chromatography datasets. It is computationally slower, but results in a lower misclassification rate and a higher sensitivity rate than both correlation-based feature selection and the classification tree method. PMID:23651459

  4. Impact of triple-negative phenotype on prognosis of patients with breast cancer brain metastases.

    PubMed

    Xu, Zhiyuan; Schlesinger, David; Toulmin, Sushila; Rich, Tyvin; Sheehan, Jason

    2012-11-01

    To elucidate survival times and identify potential prognostic factors in patients with triple-negative (TN) phenotype who harbored brain metastases arising from breast cancer and who underwent stereotactic radiosurgery (SRS). A total of 103 breast cancer patients with brain metastases were treated with SRS and then studied retrospectively. Twenty-four patients (23.3%) were TN. Survival times were estimated using the Kaplan-Meier method, with a log-rank test computing the survival time difference between groups. Univariate and multivariate analyses to predict potential prognostic factors were performed using a Cox proportional hazard regression model. The presence of TN phenotype was associated with worse survival times, including overall survival after the diagnosis of primary breast cancer (43 months vs. 82 months), neurologic survival after the diagnosis of intracranial metastases, and radiosurgical survival after SRS, with median survival times being 13 months vs. 25 months and 6 months vs. 16 months, respectively (p < 0.002 in all three comparisons). On multivariate analysis, radiosurgical survival benefit was associated with non-TN status and lower recursive partitioning analysis class at the initial SRS. The TN phenotype represents a significant adverse prognostic factor with respect to overall survival, neurologic survival, and radiosurgical survival in breast cancer patients with intracranial metastasis. Recursive partitioning analysis class also served as an important and independent prognostic factor. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Gender, Race, and Survival: A Study in Non-Small-Cell Lung Cancer Brain Metastases Patients Utilizing the Radiation Therapy Oncology Group Recursive Partitioning Analysis Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Videtic, Gregory M.M., E-mail: videtig@ccf.or; Reddy, Chandana A.; Chao, Samuel T.

    Purpose: To explore whether gender and race influence survival in non-small-cell lung cancer (NSCLC) in patients with brain metastases, using our large single-institution brain tumor database and the Radiation Therapy Oncology Group recursive partitioning analysis (RPA) brain metastases classification. Methods and materials: A retrospective review of a single-institution brain metastasis database for the interval January 1982 to September 2004 yielded 835 NSCLC patients with brain metastases for analysis. Patient subsets based on combinations of gender, race, and RPA class were then analyzed for survival differences. Results: Median follow-up was 5.4 months (range, 0-122.9 months). There were 485 male patients (M) (58.4%) and 346 female patients (F) (41.6%). Of the 828 evaluable patients (99%), 143 (17%) were black/African American (B) and 685 (83%) were white/Caucasian (W). Median survival time (MST) from time of brain metastasis diagnosis for all patients was 5.8 months. Median survival time by gender (F vs. M) and race (W vs. B) was 6.3 months vs. 5.5 months (p = 0.013) and 6.0 months vs. 5.2 months (p = 0.08), respectively. For patients stratified by RPA class, gender, and race, MST significantly favored BFs over BMs in Class II: 11.2 months vs. 4.6 months (p = 0.021). On multivariable analysis, significant variables were gender (p = 0.041, relative risk [RR] 0.83) and RPA class (p < 0.0001, RR 0.28 for I vs. III; p < 0.0001, RR 0.51 for II vs. III) but not race. Conclusions: Gender significantly influences NSCLC brain metastasis survival. Race trended to significance in overall survival but was not significant on multivariable analysis. Multivariable analysis identified gender and RPA classification as significant variables with respect to survival.

  6. Whole brain radiotherapy and stereotactic radiosurgery for patients with recursive partitioning analysis I and lesions <5 cm(3): A matched pair analysis.

    PubMed

    Viani, Gustavo Arruda; Godoi da Silva, Lucas Bernardes; Viana, Bruno Silveira; Rossi, Bruno Tiago; Suguikawa, Elton; Zuliani, Gisele

    2016-01-01

    The intention of this study is to compare whole brain radiotherapy and stereotactic radiosurgery (WBRT + SRS) with WBRT alone in patients with 1-4 brain metastases, to identify a subgroup of patients that derives a large benefit from aggressive treatment. Between December 2002 and December 2013, 60 patients with 1-4 brain metastases were treated by WBRT + SRS. In this period, 60 patients treated with WBRT were matched with the patients treated with WBRT + SRS. The median survival for the entire cohort was 8.3 months. In the univariate analysis, WBRT + SRS (P = 0.031), the presence of extracranial disease (P = 0.02), Karnofsky performance score <70 (P = 0.0001), and age >65 years (P = 0.001) were significant factors for survival. In the entire cohort, the median survival for recursive partitioning analysis (RPA) classes I, II, and III was 11, 7, and 3 months, respectively (P = 0.0001). In a stratified analysis, only RPA class I achieved statistical significance for 1-year survival between the groups (WBRT + SRS = 51% and WBRT = 23%, P = 0.03). Cox regression analysis revealed WBRT + SRS, age >65 years, and extracranial disease as independent prognostic factors. In the univariate analysis, lesion volume ≤5 cm(3) (P = 0.002) and WBRT + SRS (P = 0.003) were the significant factors associated with better brain control. WBRT plus SRS was an independent prognostic factor for survival. However, the combined treatment appears to be justified only in patients with RPA class I and lesion volume ≤5 cm(3), independently of the number of lesions.

  7. Recursive partitioning identifies greater than 4 U of packed red blood cells per hour as an improved massive transfusion definition.

    PubMed

    Moren, Alexis Marika; Hamptom, David; Diggs, Brian; Kiraly, Laszlo; Fox, Erin E; Holcomb, John B; Rahbar, Mohammad Hossein; Brasel, Karen J; Cohen, Mitchell Jay; Bulger, Eileen M; Schreiber, Martin A

    2015-12-01

    Massive transfusion (MT) is classically defined as greater than 10 U of packed red blood cells (PRBCs) in 24 hours. This fails to capture the most severely injured patients. Extending the previous work of Savage and Rahbar, a rolling hourly rate-based definition of MT may more accurately define critically injured patients requiring early, aggressive resuscitation. The Prospective Observational Multicenter Major Trauma Transfusion (PROMMTT) trial collected data from 10 Level 1 trauma centers. Patients were placed into rate-based transfusion groups by the maximal number of PRBCs transfused in any hour within the first 6 hours. A nonparametric analysis using classification trees partitioned data according to mortality at 24 hours using the maximum number of PRBC units transfused in an hour as the predictor variable. Dichotomous variables significant in previous scores and models as predictors of MT were used to identify critically ill patients: a positive finding on Focused Assessment with Sonography in Trauma (FAST) examination, Glasgow Coma Scale (GCS) score less than 8, heart rate greater than 120 beats/min, systolic blood pressure less than 90 mm Hg, penetrating mechanism of injury, international normalized ratio greater than 1.5, hemoglobin less than 11, and base deficit greater than 5. These critical indicators were then compared among the nodes of the classification tree. Patients omitted included those who did not receive PRBCs (n = 24) and those who did not have all eight critical indicators reported (n = 449). In a population of 1,245 patients, the classification tree included 772 patients. Analysis by recursive partitioning showed increased mortality among patients receiving greater than 13 U/h (73.9%, p < 0.01). In those patients receiving less than or equal to 13 U/h, mortality was greater in patients who received more than 4 U/h (16.7% vs. 6.0%, p < 0.01). Nodal analysis showed that the median number of critical indicators for each node was 3 (2-4) (≤4 U/h), 4 (3-5) (>4 U/h and ≤13 U/h), and 5 (4-5.5) (>13 U/h). A rate-based transfusion definition identifies a difference in mortality in patients who receive greater than 4 U/h of PRBCs. Redefining MT to greater than 4 U/h allows early identification of patients with a significant mortality risk who may be missed by the current definition. Prognostic/epidemiologic study, level III.
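
    The rate-based definition above is straightforward to operationalize once transfusion timestamps are available. A minimal sketch with hypothetical field names, using the first-6-hour window and the greater than 4 U/h threshold reported in the abstract:

        from collections import Counter
        from datetime import datetime, timedelta

        def max_units_per_hour(transfusion_times, admit_time, window_hours=6):
            """Largest number of PRBC units given in any single hour of the first `window_hours`."""
            counts = Counter()
            for t in transfusion_times:
                hour = int((t - admit_time) / timedelta(hours=1))
                if 0 <= hour < window_hours:
                    counts[hour] += 1
            return max(counts.values(), default=0)

        def rate_based_massive_transfusion(transfusion_times, admit_time, threshold=4):
            """Flag MT when more than `threshold` units are given in any hour of the first 6 hours."""
            return max_units_per_hour(transfusion_times, admit_time) > threshold

        admit = datetime(2015, 1, 1, 12, 0)
        unit_times = [admit + timedelta(minutes=m) for m in (5, 15, 25, 35, 50, 70)]
        print(rate_based_massive_transfusion(unit_times, admit))   # True: 5 units in the first hour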

  8. Prognostic Classification Factors Associated With Development of Multiple Autoantibodies, Dysglycemia, and Type 1 Diabetes—A Recursive Partitioning Analysis

    PubMed Central

    Krischer, Jeffrey P.

    2016-01-01

    OBJECTIVE To define prognostic classification factors associated with the progression from single to multiple autoantibodies, multiple autoantibodies to dysglycemia, and dysglycemia to type 1 diabetes onset in relatives of individuals with type 1 diabetes. RESEARCH DESIGN AND METHODS Three distinct cohorts of subjects from the Type 1 Diabetes TrialNet Pathway to Prevention Study were investigated separately. A recursive partitioning analysis (RPA) was used to determine the risk classes. Clinical characteristics, including genotype, antibody titers, and metabolic markers were analyzed. RESULTS Age and GAD65 autoantibody (GAD65Ab) titers defined three risk classes for progression from single to multiple autoantibodies. The 5-year risk was 11% for those subjects >16 years of age with low GAD65Ab titers, 29% for those ≤16 years of age with low GAD65Ab titers, and 45% for those subjects with high GAD65Ab titers regardless of age. Progression to dysglycemia was associated with islet antigen 2 Ab titers, and 2-h glucose and fasting C-peptide levels. The 5-year risk is 28%, 39%, and 51% for respective risk classes defined by the three predictors. Progression to type 1 diabetes was associated with the number of positive autoantibodies, peak C-peptide level, HbA1c level, and age. Four risk classes defined by RPA had a 5-year risk of 9%, 33%, 62%, and 80%, respectively. CONCLUSIONS The use of RPA offered a new classification approach that could predict the timing of transitions from one preclinical stage to the next in the development of type 1 diabetes. Using these RPA classes, new prevention techniques can be tailored based on the individual prognostic risk characteristics at different preclinical stages. PMID:27208341

  9. A Novel Space Partitioning Algorithm to Improve Current Practices in Facility Placement

    PubMed Central

    Jimenez, Tamara; Mikler, Armin R; Tiwari, Chetan

    2012-01-01

    In the presence of naturally occurring and man-made public health threats, the feasibility of regional bio-emergency contingency plans plays a crucial role in the mitigation of such emergencies. While the analysis of in-place response scenarios provides a measure of quality for a given plan, it involves human judgment to identify improvements in plans that are otherwise likely to fail. Since resource constraints and government mandates limit the availability of service provided in case of an emergency, computational techniques can determine optimal locations for providing emergency response assuming that the uniform distribution of demand across homogeneous resources will yield an optimal service outcome. This paper presents an algorithm that recursively partitions the geographic space into sub-regions while equally distributing the population across the partitions. For this method, we have proven the existence of an upper bound on the deviation from the optimal population size for sub-regions. PMID:23853502
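
    The paper's algorithm and its deviation bound are not reproduced here, but the core idea of recursively splitting a region while equalizing population can be sketched as a median cut along the wider coordinate axis, assuming planar coordinates and a power-of-two number of sub-regions:

        import numpy as np

        def partition_population(points, depth):
            """Recursively split points into 2**depth sub-regions of near-equal population.

            Each region is cut at the median of its wider coordinate axis, so both
            children receive half of the remaining population.
            """
            if depth == 0:
                return [points]
            axis = int(np.argmax(points.max(axis=0) - points.min(axis=0)))
            order = np.argsort(points[:, axis])
            half = len(points) // 2
            left, right = points[order[:half]], points[order[half:]]
            return partition_population(left, depth - 1) + partition_population(right, depth - 1)

        # Example: 10,000 synthetic residence locations split into 8 service regions.
        rng = np.random.default_rng(1)
        population = rng.uniform(size=(10_000, 2))
        regions = partition_population(population, depth=3)
        print([len(r) for r in regions])   # eight counts, equal to within one person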

  10. Spectroscopic diagnosis of laryngeal carcinoma using near-infrared Raman spectroscopy and random recursive partitioning ensemble techniques.

    PubMed

    Teh, Seng Khoon; Zheng, Wei; Lau, David P; Huang, Zhiwei

    2009-06-01

    In this work, we evaluated the diagnostic ability of near-infrared (NIR) Raman spectroscopy associated with the ensemble recursive partitioning algorithm based on random forests for identifying cancer from normal tissue in the larynx. A rapid-acquisition NIR Raman system was utilized for tissue Raman measurements at 785 nm excitation, and 50 human laryngeal tissue specimens (20 normal; 30 malignant tumors) were used for NIR Raman studies. The random forests method was introduced to develop effective diagnostic algorithms for classification of Raman spectra of different laryngeal tissues. High-quality Raman spectra in the range of 800-1800 cm(-1) can be acquired from laryngeal tissue within 5 seconds. Raman spectra differed significantly between normal and malignant laryngeal tissues. Classification results obtained from the random forests algorithm on tissue Raman spectra yielded a diagnostic sensitivity of 88.0% and specificity of 91.4% for laryngeal malignancy identification. The random forests technique also provided variable importance measures that facilitate correlation of significant Raman spectral features with cancer transformation. This study shows that NIR Raman spectroscopy in conjunction with the random forests algorithm has great potential for the rapid diagnosis and detection of malignant tumors in the larynx.
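
    A minimal sketch of the classification step (a random forest on spectra, reporting sensitivity, specificity, and variable importance) using scikit-learn, with synthetic spectra standing in for the clinical Raman measurements:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import confusion_matrix

        # Synthetic stand-in for the data: 50 tissues x 500 wavenumber bins.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 500))
        y = np.array([0] * 20 + [1] * 30)          # 20 normal, 30 malignant
        X[y == 1, 120] += 1.0                      # pretend one band shifts with malignancy

        forest = RandomForestClassifier(n_estimators=500, random_state=0)
        pred = cross_val_predict(forest, X, y, cv=5)
        tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
        print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")

        forest.fit(X, y)
        top_bins = np.argsort(forest.feature_importances_)[::-1][:5]
        print("most informative spectral bins:", top_bins)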

  11. The use of recursive partitioning analysis grouping in patients with brain metastases from non-small-cell lung cancer.

    PubMed

    Gülbaş, Hülya; Erkal, Haldun Sükrü; Serin, Meltem

    2006-04-01

    This study evaluates the use of recursive partitioning analysis (RPA) grouping in an attempt to predict the survival probabilities in patients with brain metastases from non-small-cell lung cancer (NSCLC). Seventy-two patients with brain metastases from NSCLC treated with radiation therapy were included in the study. Sixty-three patients were male and nine patients were female. Their median age was 57 years and their median Karnofsky performance status was 70. At the time of brain metastases, there was no evidence of intrathoracic disease in 27 patients, and extrathoracic disease was limited to the intracranial lesions in 42 patients. In accordance with RPA grouping, 12 patients were in Group 1, 24 patients were in Group 2, and 36 patients were in Group 3. Radiation therapy was delivered to the whole brain at a dose of 30 Gy in 10 fractions in most of the patients. The median survival time was 7 months for Group 1, 5 months for Group 2 and 3 months for Group 3. The survival probability at 1 year was 50% for Group 1, 26% for Group 2 and 14% for Group 3. This study presents evidence supporting the use of RPA grouping in an attempt to predict the survival probabilities in patients with brain metastases from NSCLC.

  12. Race and acute abdominal pain in a pediatric emergency department.

    PubMed

    Caperell, Kerry; Pitetti, Raymond; Cross, Keith P

    2013-06-01

    To investigate the demographic and clinical factors of children who present to the pediatric emergency department (ED) with abdominal pain and their outcomes. A review of the electronic medical record of patients 1 to 18 years old, who presented to the Children's Hospital of Pittsburgh ED with a complaint of abdominal pain over the course of 2 years, was conducted. Demographic and clinical characteristics, as well as visit outcomes, were reviewed. Subjects were grouped by age, race, and gender. Results of evaluation, treatment, and clinical outcomes were compared between groups by using multivariate analysis and recursive partitioning. There were 9424 patient visits during the study period that met inclusion and exclusion criteria. Female gender comprised 61% of African American children compared with 52% of white children. Insurance was characterized as private for 75% of white and 37% of African American children. A diagnosis of appendicitis was present in 1.9% of African American children and 5.1% of white children. Older children were more likely to be admitted and have an operation associated with their ED visit. Appendicitis was uncommon in younger children. Constipation was commonly diagnosed. Multivariate analysis by diagnosis as well as recursive partitioning analysis did not reflect any racial differences in evaluation, treatment, or outcome. Constipation is the most common diagnosis in children presenting with abdominal pain. Our data demonstrate that no racial differences exist in the evaluation, treatment, and disposition of children with abdominal pain.

  13. Efficient method for computing the electronic transport properties of a multiterminal system

    NASA Astrophysics Data System (ADS)

    Lima, Leandro R. F.; Dusko, Amintor; Lewenkopf, Caio

    2018-04-01

    We present a multiprobe recursive Green's function method to compute the transport properties of mesoscopic systems using the Landauer-Büttiker approach. By introducing an adaptive partition scheme, we map the multiprobe problem into the standard two-probe recursive Green's function method. We apply the method to compute the longitudinal and Hall resistances of a disordered graphene sample, a system of current interest. We show that the performance and accuracy of our method compares very well with other state-of-the-art schemes.

  14. Introduction in IND and recursive partitioning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray; Caruana, Rich

    1991-01-01

    This manual describes the IND package for learning tree classifiers from data. The package is an integrated C and C shell re-implementation of tree learning routines such as CART, C4, and various MDL and Bayesian variations. The package includes routines for experiment control, interactive operation, and analysis of tree building. The manual introduces the system and its many options, gives a basic review of tree learning, contains a guide to the literature and a glossary, and lists the manual pages for the routines and instructions on installation.

  15. Prognostic factors in children and adolescents with acute myeloid leukemia (excluding children with Down syndrome and acute promyelocytic leukemia): univariate and recursive partitioning analysis of patients treated on Pediatric Oncology Group (POG) Study 8821.

    PubMed

    Chang, M; Raimondi, S C; Ravindranath, Y; Carroll, A J; Camitta, B; Gresik, M V; Steuber, C P; Weinstein, H

    2000-07-01

    The purpose of the paper was to define clinical or biological features associated with the risk for treatment failure for children with acute myeloid leukemia. Data from 560 children and adolescents with newly diagnosed acute myeloid leukemia who entered the Pediatric Oncology Group Study 8821 from June 1988 to March 1993 were analyzed by univariate and recursive partitioning methods. Children with Down syndrome or acute promyelocytic leukemia were excluded from the study. Factors examined included age, number of leukocytes, sex, FAB morphologic subtype, cytogenetic findings, and extramedullary disease at the time of diagnosis. The overall event-free survival (EFS) rate at 4 years was 32.7% (s.e. = 2.2%). Age ≥2 years, fewer than 50 x 10(9)/L leukocytes, t(8;21) or inv(16), and normal chromosomes were associated with higher rates of EFS (P value = 0.003, 0.049, 0.0003, 0.031, respectively), whereas the M5 subtype of AML (P value = 0.0003) and chromosome abnormalities other than t(8;21) and inv(16) were associated with lower rates of EFS (P value = 0.0001). Recursive partitioning analysis defined three groups of patients with widely varied prognoses: female patients with t(8;21), inv(16), or a normal karyotype (n = 89) had the best prognosis (4-year EFS = 55.1%, s.e. = 5.7%); male patients with t(8;21), inv(16) or normal chromosomes (n = 106) had an intermediate prognosis (4-year EFS = 38.1%, s.e. = 5.3%); patients with chromosome abnormalities other than t(8;21) and inv(16) (n = 233) had the worst prognosis (4-year EFS = 27.0%, s.e. = 3.2%). One hundred and thirty-two patients (24%) could not be grouped because of missing cytogenetic data, mainly due to inadequate marrow samples. The results suggest that pediatric patients with acute myeloid leukemia can be categorized into three potential risk groups for prognosis and that differences in sex and chromosomal abnormalities are associated with differences in estimates of EFS. These results are tentative and must be confirmed by a large prospective clinical trial.

  16. Genetic Variation in the Raptor Gene Is Associated With Overweight But Not Hypertension in American Men of Japanese Ancestry

    PubMed Central

    Carnes, Bruce A.; Chen, Randi; Donlon, Timothy A.; He, Qimei; Grove, John S.; Masaki, Kamal H.; Elliott, Ayako; Willcox, Donald C.; Allsopp, Richard; Willcox, Bradley J.

    2015-01-01

    BACKGROUND The mechanistic target of rapamycin (mTOR) pathway is pivotal for cell growth. Regulatory associated protein of mTOR complex I (Raptor) is a unique component of this pro-growth complex. The present study tested whether variation across the raptor gene (RPTOR) is associated with overweight and hypertension. METHODS We tested 61 common (allele frequency ≥ 0.1) tagging single nucleotide polymorphisms (SNPs) that captured most of the genetic variation across RPTOR in 374 subjects of normal lifespan and 439 subjects with a lifespan exceeding 95 years for association with overweight/obesity, essential hypertension, and isolated systolic hypertension. Subjects were drawn from the Honolulu Heart Program, a homogeneous population of American men of Japanese ancestry, well characterized for phenotypes relevant to conditions of aging. Hypertension status was ascertained when subjects were 45–68 years old. Statistical evaluation involved contingency table analysis, logistic regression, and the powerful method of recursive partitioning. RESULTS After analysis of RPTOR genotypes by each statistical approach, we found no significant association between genetic variation in RPTOR and either essential hypertension or isolated systolic hypertension. Models generated by recursive partitioning analysis showed that RPTOR SNPs significantly enhanced the ability of the model to accurately assign individuals to either the overweight/obese or the non-overweight/obese groups (P = 0.008 by 1-tailed Z test). CONCLUSION Common genetic variation in RPTOR is associated with overweight/obesity but does not discernibly contribute to either essential hypertension or isolated systolic hypertension in the population studied. PMID:25249372

  17. What contributes to perceived stress in later life? A recursive partitioning approach.

    PubMed

    Scott, Stacey B; Jackson, Brenda R; Bergeman, C S

    2011-12-01

    One possible explanation for the individual differences in outcomes of stress is the diversity of inputs that produce perceptions of being stressed. The current study examines how combinations of contextual features (e.g., social isolation, neighborhood quality, health problems, age discrimination, financial concerns, and recent life events) of later life contribute to overall feelings of stress. Recursive partitioning techniques (regression trees and random forests) were used to examine unique interrelations between predictors of perceived stress in a sample of 282 community-dwelling adults. Trees provided possible examples of equifinality (i.e., subsets of people with similar levels of perceived stress but different predictors) as well as identification of contextual combinations that separated participants with very high and very low perceived stress. Random forest analyses aggregated across many trees based on permuted versions of the data and predictors; loneliness, financial strain, neighborhood strain, ageism, and to some extent life events emerged as important predictors. Interviews with a subsample of participants provided both thick description of the complex relationships identified in the trees and additional risks not appearing in the survey results. Together, the analyses highlight what may be missed when stress is used as a simple unidimensional construct and can guide differential intervention efforts.

  18. What contributes to perceived stress in later life? A recursive partitioning approach

    PubMed Central

    Scott, Stacey B.; Jackson, Brenda R.; Bergeman, C. S.

    2011-01-01

    One possible explanation for the individual differences in outcomes of stress is the diversity of inputs that produce perceptions of being stressed. The current study examines how combinations of contextual features (e.g., social isolation, neighborhood quality, health problems, age discrimination, financial concerns, and recent life events) of later life contribute to overall feelings of stress. Recursive partitioning techniques (regression trees and random forests) were used to examine unique interrelations between predictors of perceived stress in a sample of 282 community-dwelling adults. Trees provided possible examples of equifinality (i.e., subsets of people with similar levels of perceived stress but different predictors) as well as identification of contextual combinations that separated participants with very high and very low perceived stress. Random forest analyses aggregated across many trees based on permuted versions of the data and predictors; loneliness, financial strain, neighborhood strain, ageism, and to some extent life events emerged as important predictors. Interviews with a subsample of participants provided both thick description of the complex relationships identified in the trees and additional risks not appearing in the survey results. Together, the analyses highlight what may be missed when stress is used as a simple unidimensional construct and can guide differential intervention efforts. PMID:21604885

  19. Recursive partitioning analysis of 1999 Radiation Therapy Oncology Group (RTOG) patients with locally-advanced non-small-cell lung cancer (LA-NSCLC): identification of five groups with different survival.

    PubMed

    Werner-Wasik, M; Scott, C; Cox, J D; Sause, W T; Byhardt, R W; Asbell, S; Russell, A; Komaki, R; Lee, J S

    2000-12-01

    Survival of patients with locally-advanced non-small-cell lung cancer (LA-NSCLC) is predicted by the stage of the disease and other characteristics. This analysis was undertaken to identify these characteristics in a large cooperative group patient population, as well as to define subgroups of the population with differing outcomes. Analysis included 1,999 patients treated in 9 RTOG trials between 1983 and 1994 with thoracic irradiation (RT) with (n = 355) or without chemotherapy (CT). In univariate analysis, the following characteristics were significantly associated with an improved survival: use of CT, CT delivered without major deviation, abnormal pulmonary function tests, normal hemoglobin, protein, LDH and BUN, presence of dyspnea, hemoptysis, cough or hoarseness, uninvolved lymph nodes, T1 or T2 stage, no malignant pleural effusion (PE), weight loss of < 8%, Karnofsky performance status (KPS) of at least 90, adenocarcinoma histology, female gender, and age less than 70 years. Recursive partitioning analysis (RPA) was subsequently applied to identify 5 patient subgroups with significantly different median survival times (MST): Group I, KPS of ≥90, who received chemotherapy (MST 16.2 months); Group II, KPS of ≥90, who received no CT, but had no PE (MST 11.9 months); Group III, KPS <90, younger than 70 years, with non-large cell histology (MST 9.6 months); Group IV, KPS ≥90, but with PE, or KPS <90, younger than 70 years, and with large cell histology, or older than 70 years, but without PE (MST 5.6-6.4 months); Group V, older than 70, with PE (MST 2.9 months). Cisplatinum-based CT improves survival over RT alone in LA-NSCLC patients with an excellent prognosis. The presence of a malignant pleural effusion is a major negative prognostic factor for survival. The identification of RPA prognostic groups among patients with LA-NSCLC provides prognostic information and may serve as a basis of stratification in future trials.
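
    The five groups amount to a literal rule cascade. The sketch below transcribes the published criteria; the rules are applied in the order listed, overlapping cases are resolved by that order, and age exactly 70 is treated as "not older than 70", all of which are assumptions of this sketch rather than statements from the paper.

        def rtog_rpa_group(kps, received_chemo, pleural_effusion, age, large_cell_histology):
            """Assign the RPA survival group described above (literal transcription)."""
            if kps >= 90 and received_chemo:
                return "I"      # MST 16.2 months
            if kps >= 90 and not received_chemo and not pleural_effusion:
                return "II"     # MST 11.9 months
            if kps < 90 and age < 70 and not large_cell_histology:
                return "III"    # MST 9.6 months
            if age > 70 and pleural_effusion:
                return "V"      # MST 2.9 months
            return "IV"         # MST 5.6-6.4 months

        print(rtog_rpa_group(kps=90, received_chemo=True, pleural_effusion=False,
                             age=60, large_cell_histology=False))   # -> "I"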

  20. Recursive sequences in first-year calculus

    NASA Astrophysics Data System (ADS)

    Krainer, Thomas

    2016-02-01

    This article provides ready-to-use supplementary material on recursive sequences for a second-semester calculus class. It equips first-year calculus students with a basic methodical procedure based on which they can conduct a rigorous convergence or divergence analysis of many simple recursive sequences on their own without the need to invoke inductive arguments as is typically required in calculus textbooks. The sequences that are accessible to this kind of analysis are predominantly (eventually) monotonic, but also certain recursive sequences that alternate around their limit point as they converge can be considered.
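
    A worked example of the kind of non-inductive analysis described above (this particular sequence is a standard textbook choice and is not necessarily one treated in the article):

        % Recursive sequence: a_{n+1} = f(a_n) with f(x) = \sqrt{2 + x} and a_1 = 1.
        \[
          a_{n+1} = \sqrt{2 + a_n}, \qquad a_1 = 1 .
        \]
        % Fixed points: f(L) = L gives L^2 - L - 2 = 0, so L = 2 (L = -1 is rejected).
        % On the interval [1, 2], f is increasing, maps [1, 2] into itself, and
        % satisfies f(x) > x for x in [1, 2); hence (a_n) is increasing and bounded
        % above by 2, so it converges, and the limit must be the fixed point:
        \[
          \lim_{n \to \infty} a_n = 2 .
        \]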

  1. Time-based partitioning model for predicting neurologically favorable outcome among adults with witnessed bystander out-of-hospital CPA.

    PubMed

    Abe, Toshikazu; Tokuda, Yasuharu; Cook, E Francis

    2011-01-01

    Optimal acceptable time intervals from collapse to bystander cardiopulmonary resuscitation (CPR) for neurologically favorable outcome among adults with witnessed out-of-hospital cardiopulmonary arrest (CPA) have been unclear. Our aim was to assess the optimal acceptable thresholds of the time intervals of CPR for neurologically favorable outcome and survival using a recursive partitioning model. From January 1, 2005 through December 31, 2009, we conducted a prospective population-based observational study across Japan involving consecutive out-of-hospital CPA patients (N = 69,648) who received witnessed bystander CPR. Of 69,648 patients, 34,605 were assigned to the derivation data set and 35,043 to the validation data set. The outcomes of interest were survival and neurologically favorable outcome at one month, defined as category one (good cerebral performance) or two (moderate cerebral disability) of the cerebral performance categories. Based on the recursive partitioning model from the derivation dataset (n = 34,605) to predict the neurologically favorable outcome at one month, a 5-min threshold was the acceptable time interval from collapse to CPR initiation; 11 min from collapse to ambulance arrival; 18 min from collapse to return of spontaneous circulation (ROSC); and 19 min from collapse to hospital arrival. Among the validation dataset (n = 35,043), 209/2,292 (9.1%) of all patients with the acceptable time intervals and 1,388/2,706 (52.1%) in the subgroup with the acceptable time intervals and pre-hospital ROSC showed neurologically favorable outcome. Initiation of CPR should be within 5 min for obtaining neurologically favorable outcome among adults with witnessed out-of-hospital CPA. Patients with the acceptable time intervals of bystander CPR and pre-hospital ROSC within 18 min had an approximately 50% chance of neurologically favorable outcome.

  2. Introduction to IND and recursive partitioning, version 1.0

    NASA Technical Reports Server (NTRS)

    Buntine, Wray; Caruana, Rich

    1991-01-01

    This manual describes the IND package for learning tree classifiers from data. The package is an integrated C and C shell re-implementation of tree learning routines such as CART, C4, and various MDL and Bayesian variations. The package includes routines for experiment control, interactive operation, and analysis of tree building. The manual introduces the system and its many options, gives a basic review of tree learning, contains a guide to the literature and a glossary, lists the manual pages for the routines, and instructions on installation.

  3. Prognostic Classification Factors Associated With Development of Multiple Autoantibodies, Dysglycemia, and Type 1 Diabetes-A Recursive Partitioning Analysis.

    PubMed

    Xu, Ping; Krischer, Jeffrey P

    2016-06-01

    To define prognostic classification factors associated with the progression from single to multiple autoantibodies, multiple autoantibodies to dysglycemia, and dysglycemia to type 1 diabetes onset in relatives of individuals with type 1 diabetes. Three distinct cohorts of subjects from the Type 1 Diabetes TrialNet Pathway to Prevention Study were investigated separately. A recursive partitioning analysis (RPA) was used to determine the risk classes. Clinical characteristics, including genotype, antibody titers, and metabolic markers were analyzed. Age and GAD65 autoantibody (GAD65Ab) titers defined three risk classes for progression from single to multiple autoantibodies. The 5-year risk was 11% for those subjects >16 years of age with low GAD65Ab titers, 29% for those ≤16 years of age with low GAD65Ab titers, and 45% for those subjects with high GAD65Ab titers regardless of age. Progression to dysglycemia was associated with islet antigen 2 Ab titers, and 2-h glucose and fasting C-peptide levels. The 5-year risk is 28%, 39%, and 51% for respective risk classes defined by the three predictors. Progression to type 1 diabetes was associated with the number of positive autoantibodies, peak C-peptide level, HbA1c level, and age. Four risk classes defined by RPA had a 5-year risk of 9%, 33%, 62%, and 80%, respectively. The use of RPA offered a new classification approach that could predict the timing of transitions from one preclinical stage to the next in the development of type 1 diabetes. Using these RPA classes, new prevention techniques can be tailored based on the individual prognostic risk characteristics at different preclinical stages. © 2016 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  4. Recurrence relations in one-dimensional Ising models.

    PubMed

    da Conceição, C M Silva; Maia, R N P

    2017-09-01

    The exact finite-size partition function for the nonhomogeneous one-dimensional (1D) Ising model is found through an approach using algebra operators. Specifically, in this paper we show that the partition function can be computed through a trace from a linear second-order recurrence relation with nonconstant coefficients in matrix form. A relation between the finite-size partition function and the generalized Lucas polynomials is found for the simple homogeneous model, thus establishing a recursive formula for the partition function. This is an important property and it might indicate the possible existence of recurrence relations in higher-dimensional Ising models. Moreover, assuming quenched disorder for the interactions within the model, the quenched averaged magnetic susceptibility displays a nontrivial behavior due to changes in the ferromagnetic concentration probability.
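
    The matrix recurrence referred to above can be made concrete in a few lines. The sketch below assumes zero external field and periodic boundary conditions, and checks the homogeneous chain against its closed form; it is not the algebra-operator formulation of the paper.

        import numpy as np

        def ising_partition_function(couplings, beta=1.0):
            """Z for a 1D Ising ring with nonhomogeneous couplings J_i and zero field,
            accumulated through the matrix recurrence M_k = M_{k-1} T_k, Z = Tr(M_N)."""
            M = np.eye(2)
            for J in couplings:
                T = np.array([[np.exp(beta * J), np.exp(-beta * J)],
                              [np.exp(-beta * J), np.exp(beta * J)]])
                M = M @ T
            return np.trace(M)

        # Homogeneous check: Z = (2 cosh(beta*J))**N + (2 sinh(beta*J))**N.
        N, J0, beta = 8, 1.0, 0.7
        print(ising_partition_function([J0] * N, beta))
        print((2 * np.cosh(beta * J0)) ** N + (2 * np.sinh(beta * J0)) ** N)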

  5. Validation and Development of a Modified Breast Graded Prognostic Assessment As a Tool for Survival in Patients With Breast Cancer and Brain Metastases.

    PubMed

    Subbiah, Ishwaria M; Lei, Xiudong; Weinberg, Jeffrey S; Sulman, Erik P; Chavez-MacGregor, Mariana; Tripathy, Debu; Gupta, Rohan; Varma, Ankur; Chouhan, Jay; Guevarra, Richard P; Valero, Vicente; Gilbert, Mark R; Gonzalez-Angulo, Ana M

    2015-07-10

    Several indices have been developed to predict overall survival (OS) in patients with breast cancer with brain metastases, including the breast graded prognostic assessment (breast-GPA), comprising age, tumor subtype, and Karnofsky performance score. However, number of brain metastases-a highly relevant clinical variable-is less often incorporated into the final model. We sought to validate the existing breast-GPA in an independent larger cohort and refine it integrating number of brain metastases. Data were retrospectively gathered from a prospectively maintained institutional database. Patients with newly diagnosed brain metastases from 1996 to 2013 were identified. After validating the breast-GPA, multivariable Cox regression and recursive partitioning analysis led to the development of the modified breast-GPA. The performances of the breast-GPA and modified breast-GPA were compared using the concordance index. In our cohort of 1,552 patients, the breast-GPA was validated as a prognostic tool for OS (P < .001). In multivariable analysis of the breast-GPA and number of brain metastases (> three v ≤ three), both were independent predictors of OS. We therefore developed the modified breast-GPA integrating a fourth clinical parameter. Recursive partitioning analysis reinforced the prognostic significance of these four factors. Concordance indices were 0.78 (95% CI, 0.77 to 0.80) and 0.84 (95% CI, 0.83 to 0.85) for the breast-GPA and modified breast-GPA, respectively (P < .001). The modified breast-GPA incorporates four simple clinical parameters of high prognostic significance. This index has an immediate role in the clinic as a formative part of the clinician's discussion of prognosis and direction of care and as a potential patient selection tool for clinical trials. © 2015 by American Society of Clinical Oncology.

  6. Genetic variation in the raptor gene is associated with overweight but not hypertension in American men of Japanese ancestry.

    PubMed

    Morris, Brian J; Carnes, Bruce A; Chen, Randi; Donlon, Timothy A; He, Qimei; Grove, John S; Masaki, Kamal H; Elliott, Ayako; Willcox, Donald C; Allsopp, Richard; Willcox, Bradley J

    2015-04-01

    The mechanistic target of rapamycin (mTOR) pathway is pivotal for cell growth. Regulatory associated protein of mTOR complex I (Raptor) is a unique component of this pro-growth complex. The present study tested whether variation across the raptor gene (RPTOR) is associated with overweight and hypertension. We tested 61 common (allele frequency ≥ 0.1) tagging single nucleotide polymorphisms (SNPs) that captured most of the genetic variation across RPTOR in 374 subjects of normal lifespan and 439 subjects with a lifespan exceeding 95 years for association with overweight/obesity, essential hypertension, and isolated systolic hypertension. Subjects were drawn from the Honolulu Heart Program, a homogeneous population of American men of Japanese ancestry, well characterized for phenotypes relevant to conditions of aging. Hypertension status was ascertained when subjects were 45-68 years old. Statistical evaluation involved contingency table analysis, logistic regression, and the powerful method of recursive partitioning. After analysis of RPTOR genotypes by each statistical approach, we found no significant association between genetic variation in RPTOR and either essential hypertension or isolated systolic hypertension. Models generated by recursive partitioning analysis showed that RPTOR SNPs significantly enhanced the ability of the model to accurately assign individuals to either the overweight/obese or the non-overweight/obese groups (P = 0.008 by 1-tailed Z test). Common genetic variation in RPTOR is associated with overweight/obesity but does not discernibly contribute to either essential hypertension or isolated systolic hypertension in the population studied. © American Journal of Hypertension, Ltd 2014. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Recursive partitioning analysis (RPA) classification predicts survival in patients with brain metastases from sarcoma.

    PubMed

    Grossman, Rachel; Ram, Zvi

    2014-12-01

    Sarcoma rarely metastasizes to the brain, and there are no specific treatment guidelines for these tumors. The recursive partitioning analysis (RPA) classification is a well-established prognostic scale used in many malignancies. In this study we assessed the clinical characteristics of metastatic sarcoma to the brain and the validity of the RPA classification system in a subset of 21 patients who underwent surgical resection of metastatic sarcoma to the brain. We retrospectively analyzed the medical, radiological, surgical, pathological, and follow-up clinical records of 21 patients who were operated on for metastatic sarcoma to the brain between 1996 and 2012. Gliosarcomas, sarcomas of the head and neck with local extension into the brain, and metastatic sarcomas to the spine were excluded from this reported series. The patients' mean age was 49.6 ± 14.2 years (range, 25-75 years) at the time of diagnosis. Sixteen patients had a known history of systemic sarcoma, mostly in the extremities, and had previously received systemic chemotherapy and radiation therapy for their primary tumor. The mean maximal tumor diameter in the brain was 4.9 ± 1.7 cm (range 1.7-7.2 cm). The group's median preoperative Karnofsky Performance Scale score was 80, with 14 patients presenting with a Karnofsky Performance Scale score of 70 or greater. The median overall survival was 7 months (range 0.2-204 months). The median survival times stratified by the Radiation Therapy Oncology Group RPA classes were 31, 7, and 2 months for RPA classes I, II, and III, respectively (P = 0.0001). This analysis is the first to support the prognostic utility of the Radiation Therapy Oncology Group RPA classification for sarcoma brain metastases and may be used as a treatment guideline tool in this rare disease. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Relationship between financial impact and coverage of drugs in Australia.

    PubMed

    Mauskopf, Josephine; Chirila, Costel; Masaquel, Catherine; Boye, Kristina S; Bowman, Lee; Birt, Julie; Grainger, David

    2013-01-01

    The aim of this study was to estimate the relationship between the financial impact of a new drug and the recommendation for reimbursement by the Australian Pharmaceutical Benefits Advisory Committee (PBAC). Data in the PBAC summary database were abstracted for decisions made between July 2005 and November 2009. Financial impact-the upper bound of the values presented in the PBAC summary database-was categorized as ≤A$0, >A$0 up to A$10 million, A$10 million up to A$30 million, and >A$30 million per year. Descriptive, logistic, survival, and recursive partitioning decision analyses were used to estimate the relationship between the financial impact of a new drug indication and the recommendation for reimbursement. Multivariable analyses controlled for other clinical and economic variables, including cost per quality-adjusted life-year gained. Financial impact was a significant predictor of the recommendation for reimbursement. In the logistic analysis, the odds ratios of reimbursement for drug submissions with financial impacts ≥A$10 million to ≥A$30 million or >A$0 to

  9. Diagnostic performance and safety of a three-dimensional 14-core systematic biopsy method.

    PubMed

    Takeshita, Hideki; Kawakami, Satoru; Numao, Noboru; Sakura, Mizuaki; Tatokoro, Manabu; Yamamoto, Shinya; Kijima, Toshiki; Komai, Yoshinobu; Saito, Kazutaka; Koga, Fumitaka; Fujii, Yasuhisa; Fukui, Iwao; Kihara, Kazunori

    2015-03-01

    To investigate the diagnostic performance and safety of a three-dimensional 14-core biopsy (3D14PBx) method, which is a combination of the transrectal six-core and transperineal eight-core biopsy methods. Between December 2005 and August 2010, 1103 men underwent 3D14PBx at our institutions and were analysed prospectively. Biopsy criteria included a PSA level of 2.5-20 ng/mL or abnormal digital rectal examination (DRE) findings, or both. The primary endpoint of the study was diagnostic performance and the secondary endpoint was safety. We applied recursive partitioning to the entire study cohort to delineate the unique contribution of each sampling site to overall and clinically significant cancer detection. Prostate cancer was detected in 503 of the 1103 patients (45.6%). Age, family history of prostate cancer, DRE, PSA, percentage of free PSA and prostate volume were associated with the positive biopsy results significantly and independently. Of the 503 cancers detected, 39 (7.8%) were clinically locally advanced (≥cT3a), 348 (69%) had a biopsy Gleason score (GS) of ≥7, and 463 (92%) met the definition of biopsy-based significant cancer. Recursive partitioning analysis showed that each sampling site contributed uniquely to both the overall and the biopsy-based significant cancer detection rate of the 3D14PBx method. The overall cancer-positive rate of each sampling site ranged from 14.5% in the transrectal far lateral base to 22.8% in the transrectal far lateral apex. As of August 2010, 210 patients (42%) had undergone radical prostatectomy, of whom 55 (26%) were found to have pathologically non-organ-confined disease, 174 (83%) had prostatectomy GS ≥7 and 185 (88%) met the definition of prostatectomy-based significant cancer. This is the first prospective analysis of the diagnostic performance of an extended biopsy method, which is a simplified version of the somewhat redundant super-extended three-dimensional 26-core biopsy. As expected, each sampling site uniquely contributed not only to overall cancer detection, but also to significant cancer detection. 3D14PBx is a feasible systematic biopsy method in men with PSA <20 ng/mL. © 2014 The Authors. BJU International © 2014 BJU International.

  10. Tear fluid proteomics multimarkers for diabetic retinopathy screening

    PubMed Central

    2013-01-01

    Background The aim of the project was to develop a novel method for diabetic retinopathy screening based on the examination of tear fluid biomarker changes. In order to evaluate the usability of protein biomarkers for pre-screening purposes several different approaches were used, including machine learning algorithms. Methods All persons involved in the study had diabetes. Diabetic retinopathy (DR) was diagnosed by capturing 7-field fundus images, evaluated by two independent ophthalmologists. 165 eyes were examined (from 119 patients), 55 were diagnosed healthy and 110 images showed signs of DR. Tear samples were taken from all eyes and state-of-the-art nano-HPLC coupled ESI-MS/MS mass spectrometry protein identification was performed on all samples. Applicability of protein biomarkers was evaluated by six different optimally parameterized machine learning algorithms: Support Vector Machine, Recursive Partitioning, Random Forest, Naive Bayes, Logistic Regression, K-Nearest Neighbor. Results Out of the six investigated machine learning algorithms the result of Recursive Partitioning proved to be the most accurate. The performance of the system realizing the above algorithm reached 74% sensitivity and 48% specificity. Conclusions Protein biomarkers selected and classified with machine learning algorithms alone are at present not recommended for screening purposes because of low specificity and sensitivity values. This tool can be potentially used to improve the results of image processing methods as a complementary tool in automatic or semiautomatic systems. PMID:23919537

  11. Venous tree separation in the liver: graph partitioning using a non-ising model.

    PubMed

    O'Donnell, Thomas; Kaftan, Jens N; Schuh, Andreas; Tietjen, Christian; Soza, Grzegorz; Aach, Til

    2011-01-01

    Entangled tree-like vascular systems are commonly found in the body (e.g., in the peripheries and lungs). Separation of these systems in medical images may be formulated as a graph partitioning problem given an imperfect segmentation and specification of the tree roots. In this work, we show that the ubiquitous Ising-model approaches (e.g., Graph Cuts, Random Walker) are not appropriate for tackling this problem and propose a novel method based on recursive minimal paths for doing so. To motivate our method, we focus on the intertwined portal and hepatic venous systems in the liver. Separation of these systems is critical for liver intervention planning, in particular when resection is involved. We apply our method to 34 clinical datasets, each containing well over a hundred vessel branches, demonstrating its effectiveness.
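
    The full recursive minimal-path algorithm is not spelled out in the abstract, but its starting point, assigning each vessel-graph node to the root it can reach at lower path cost, can be sketched with standard shortest paths; the graph, node names, and edge weights below are synthetic.

        import networkx as nx

        def separate_trees(G, roots):
            """Label every node with the root ('portal' or 'hepatic' here) that is
            closest along edge-weighted shortest paths."""
            dist = {name: nx.single_source_dijkstra_path_length(G, node, weight="weight")
                    for name, node in roots.items()}
            return {v: min(dist, key=lambda name: dist[name].get(v, float("inf")))
                    for v in G.nodes}

        # Tiny example: two branches joined by a spurious segmentation bridge.
        G = nx.Graph()
        G.add_weighted_edges_from([("p0", "p1", 1), ("p1", "p2", 1),
                                   ("h0", "h1", 1), ("h1", "h2", 1),
                                   ("p2", "h2", 5)])
        print(separate_trees(G, {"portal": "p0", "hepatic": "h0"}))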

  12. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    NASA Astrophysics Data System (ADS)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm. Each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
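
    A minimal sketch of the two decomposition steps described above (affinity propagation to group the controlled variables into subsystems, then canonical correlation to rank candidate inputs per subsystem), using scikit-learn on toy data. Using the output correlation matrix as the similarity and ranking inputs one at a time are assumptions of this sketch, and the online recursive model update is omitted.

        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        U = rng.normal(size=(500, 10))                                # candidate inputs
        Y = np.hstack([U[:, [0]] + 0.1 * rng.normal(size=(500, 3)),   # outputs driven by u0
                       U[:, [1]] + 0.1 * rng.normal(size=(500, 3))])  # outputs driven by u1

        # Step 1: cluster the controlled variables into subsystems.
        labels = AffinityPropagation(affinity="precomputed",
                                     random_state=0).fit(np.corrcoef(Y.T)).labels_

        # Step 2: rank candidate inputs for each subsystem by canonical correlation.
        for k in np.unique(labels):
            Yk = Y[:, labels == k]
            scores = []
            for j in range(U.shape[1]):
                u_c, y_c = CCA(n_components=1).fit(U[:, [j]], Yk).transform(U[:, [j]], Yk)
                scores.append(abs(np.corrcoef(u_c[:, 0], y_c[:, 0])[0, 1]))
            print(f"subsystem {k}: strongest inputs {np.argsort(scores)[::-1][:2]}")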

  13. Recursive partitioned inversion of large (1500 x 1500) symmetric matrices

    NASA Technical Reports Server (NTRS)

    Putney, B. H.; Brownd, J. E.; Gomez, R. A.

    1976-01-01

    A recursive algorithm was designed to invert large, dense, symmetric, positive definite matrices using small amounts of computer core, i.e., a small fraction of the core needed to store the complete matrix. The described algorithm is a generalized Gaussian elimination technique. Other algorithms are also discussed for the Cholesky decomposition and step inversion techniques. The purpose of the inversion algorithm is to solve large linear systems of normal equations generated by working geodetic problems. The algorithm was incorporated into a computer program called SOLVE. In the past the SOLVE program has been used in obtaining solutions published as the Goddard earth models.
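    The record describes inverting a large symmetric positive definite matrix by working on partitions of it rather than the whole array at once. A compact sketch of the underlying block identity, recursive inversion via Schur complements, is given below; it illustrates the idea only and is not the SOLVE program's generalized Gaussian elimination.

```python
# Sketch: recursive block (Schur-complement) inversion of a symmetric
# positive definite matrix; illustrates the partitioned-inversion idea.
import numpy as np

def spd_inverse(A, block=64):
    n = A.shape[0]
    if n <= block:                       # small base case: direct inversion
        return np.linalg.inv(A)
    k = n // 2
    A11, A12, A22 = A[:k, :k], A[:k, k:], A[k:, k:]
    inv11 = spd_inverse(A11, block)      # recurse on the leading block
    S = A22 - A12.T @ inv11 @ A12        # Schur complement of A11
    invS = spd_inverse(S, block)
    B12 = -inv11 @ A12 @ invS
    B11 = inv11 + inv11 @ A12 @ invS @ A12.T @ inv11
    return np.block([[B11, B12], [B12.T, invS]])

# Quick check on a random, well-conditioned SPD matrix.
rng = np.random.default_rng(2)
M = rng.normal(size=(300, 300))
A = M @ M.T + 300 * np.eye(300)
assert np.allclose(spd_inverse(A) @ A, np.eye(300), atol=1e-6)
```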

  14. Three quantitative approaches to the diagnosis of abdominal pain in children: practical applications of decision theory.

    PubMed

    Klein, M D; Rabbani, A B; Rood, K D; Durham, T; Rosenberg, N M; Bahr, M J; Thomas, R L; Langenburg, S E; Kuhns, L R

    2001-09-01

    The authors compared 3 quantitative methods for assisting clinicians in the differential diagnosis of abdominal pain in children, where the most common important endpoint is whether the patient has appendicitis. Pretest probabilities in different age and sex groups were determined to perform Bayesian analysis, binary logistic regression was used to determine which variables were statistically significantly likely to contribute to a diagnosis, and recursive partitioning was used to build decision trees with quantitative endpoints. The records of all children (1,208) seen at a large urban emergency department (ED) with a chief complaint of abdominal pain were reviewed retrospectively 24 to 72 hours after the encounter. Attempts were made to contact all the patients' families to determine an accurate final diagnosis. A total of 1,008 (83%) families were contacted. Data were analyzed by calculation of the posttest probability, recursive partitioning, and binary logistic regression. In all groups the most common diagnosis was abdominal pain (ICD-9 Code 789). After this, however, the order of the most common final diagnoses for abdominal pain varied significantly. The entire group had a pretest probability of appendicitis of 0.06. This varied with age and sex from 0.02 in boys 2 to 5 years old to 0.16 in boys older than 12 years. In boys aged 5 to 12, recursive partitioning and binary logistic regression agreed on guarding and anorexia as important variables. Guarding and tenderness were important in girls aged 5 to 12. In boys older than 12, both agreed on guarding and anorexia. Using sensitivities and specificities from the literature, computed tomography improved the posttest probability for the group from 0.06 to 0.33; ultrasound improved it from 0.06 to 0.48; and barium enema improved it from 0.06 to 0.58. Knowing the pretest probabilities in a specific population allows the physician to evaluate the likely diagnoses first. Other quantitative methods can help judge how much importance a certain criterion should have in the decision making and how much a particular test is likely to influence the probability of a correct diagnosis. It now should be possible to make these sophisticated quantitative methods readily available to clinicians via the computer. Copyright 2001 by W.B. Saunders Company.
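    The posttest probabilities quoted above follow from Bayes' theorem applied to a pretest probability together with a test's sensitivity and specificity. A small worked sketch follows; the sensitivity and specificity numbers are illustrative placeholders, not the literature values the authors used.

```python
# Sketch: Bayes' theorem for post-test probability of appendicitis
# given a positive imaging result.
def posttest_probability(pretest, sensitivity, specificity):
    """P(disease | positive test) for a given pretest probability."""
    true_pos = sensitivity * pretest
    false_pos = (1 - specificity) * (1 - pretest)
    return true_pos / (true_pos + false_pos)

pretest = 0.06  # overall pretest probability of appendicitis in the cohort
# Illustrative sensitivity/specificity values only.
for name, sens, spec in [("CT", 0.94, 0.90), ("Ultrasound", 0.85, 0.94)]:
    print(f"{name}: post-test probability = "
          f"{posttest_probability(pretest, sens, spec):.2f}")
```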

  15. Beating the Odds: Trees to Success in Different Countries

    ERIC Educational Resources Information Center

    Finch, W. Holmes; Marchant, Gregory J.

    2017-01-01

    A recursive partitioning model approach in the form of classification and regression trees (CART) was used with 2012 PISA data for five countries (Canada, Finland, Germany, Singapore-China, and the United States). The objective of the study was to determine demographic and educational variables that differentiated between low-SES students who were…

  16. Predictive Value of Morphological Features in Patients with Autism versus Normal Controls

    ERIC Educational Resources Information Center

    Ozgen, H.; Hellemann, G. S.; de Jonge, M. V.; Beemer, F. A.; van Engeland, H.

    2013-01-01

    We investigated the predictive power of morphological features in 224 autistic patients and 224 matched-pair controls. To assess the relationship between the morphological features and autism, we used receiver operating characteristic (ROC) curves. In addition, we used recursive partitioning (RP) to determine a specific pattern of abnormalities that is…

  17. Efficiently Exploring Multilevel Data with Recursive Partitioning

    ERIC Educational Resources Information Center

    Martin, Daniel P.; von Oertzen, Timo; Rimm-Kaufman, Sara E.

    2015-01-01

    There is an increasing number of datasets with many participants, variables, or both, in education and other fields that often deal with large, multilevel data structures. Once initial confirmatory hypotheses are exhausted, it can be difficult to determine how best to explore the dataset to discover hidden relationships that could help to inform…

  18. Recursive Partitioning to Identify Potential Causes of Differential Item Functioning in Cross-National Data

    ERIC Educational Resources Information Center

    Finch, W. Holmes; Hernández Finch, Maria E.; French, Brian F.

    2016-01-01

    Differential item functioning (DIF) assessment is key in score validation. When DIF is present scores may not accurately reflect the construct of interest for some groups of examinees, leading to incorrect conclusions from the scores. Given rising immigration, and the increased reliance of educational policymakers on cross-national assessments…

  19. Predictive Parameters of Symptomatic Hematochezia Following 5-Fraction Gantry-Based SABR in Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musunuru, Hima Bindu; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Davidson, Melanie

    2016-04-01

    Purpose: This study identified predictors of high-grade late hematochezia (HH) following 5-fraction gantry-based stereotactic ablative radiation therapy (SABR). Methods and Materials: Hematochezia data for 258 patients who received 35 to 40 Gy SABR in 5 fractions as part of sequential phase 2 prospective trials were retrieved. Grade 2 or higher late rectal bleeding was labeled HH. Hematochezia needing steroid suppositories, 4% formalin, or 1 to 2 sessions of argon plasma coagulation (APC) was labeled grade 2. More than 2 sessions of APC, blood transfusion, or a course of hyperbaric oxygen was grade 3 and development of visceral fistula, grade 4. Various dosimetric and clinical factors were analyzed using univariate and multivariate analyses. Receiver operating characteristic (ROC) curve analysis and recursive partitioning analysis were used to determine clinically valid cut-off points and identify risk groups, respectively. Results: HH was observed in 19.4%, grade ≥3 toxicity in 3.1%. Median follow-up was 29.7 months (interquartile range [IQR]: 20.6-61.7). Median time to develop HH was 11.7 months (IQR: 9.0-15.2) from the start of radiation. At 2 years, cumulative HH was 4.9%, 27.2%, and 42.1% in patients who received 35 Gy to prostate (4-mm planning target volume [PTV] margin), 40 Gy to prostate (5-mm PTV margin), and 40 Gy to prostate/seminal vesicles (5-mm PTV margin), respectively (P<.0001). In the ROC analysis, volume of rectum receiving a radiation dose of 38 Gy (V38) was a strong predictor of HH with an area under the curve of 0.65. In multivariate analysis, rectal V38 (≥2.0 cm³; odds ratio [OR]: 4.7), use of anticoagulants in the follow-up period (OR: 6.5), and presence of hemorrhoids (OR: 2.7) were the strongest predictors. Recursive partitioning analysis showed that rectal V38 < 2.0 cm³ combined with anticoagulant use, or rectal V38 ≥ 2.0 cm³ plus 1 other risk factor, resulted in an HH risk of >30%. Conclusions: Rectal V38 and 2 clinical factors were strong predictors of HH following 5-fraction SABR. Planning constraints should keep rectal V38 below 2.0 cm³.
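    The ROC step above searches for a clinically usable cut-off on rectal V38. One common way to pick such a threshold is to maximize Youden's J along the ROC curve, sketched below on simulated placeholder data; the study's actual cut-point procedure is not detailed in the record.

```python
# Sketch: choose a dosimetric cut-off by maximizing Youden's J on an ROC curve.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
# Placeholder data: rectal V38 (cm^3) and high-grade hematochezia labels.
v38 = np.concatenate([rng.gamma(2.0, 0.6, 208), rng.gamma(3.5, 0.8, 50)])
hh = np.concatenate([np.zeros(208), np.ones(50)])

fpr, tpr, thresholds = roc_curve(hh, v38)
j = tpr - fpr                          # Youden's J at each candidate threshold
best = thresholds[np.argmax(j)]
print(f"AUC = {roc_auc_score(hh, v38):.2f}, optimal V38 cut-off ≈ {best:.2f} cm^3")
```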

  20. Adaptive semi-supervised recursive tree partitioning: The ART towards large scale patient indexing in personalized healthcare.

    PubMed

    Wang, Fei

    2015-06-01

    With the rapid development of information technologies, tremendous amount of data became readily available in various application domains. This big data era presents challenges to many conventional data analytics research directions including data capture, storage, search, sharing, analysis, and visualization. It is no surprise to see that the success of next-generation healthcare systems heavily relies on the effective utilization of gigantic amounts of medical data. The ability of analyzing big data in modern healthcare systems plays a vital role in the improvement of the quality of care delivery. Specifically, patient similarity evaluation aims at estimating the clinical affinity and diagnostic proximity of patients. As one of the successful data driven techniques adopted in healthcare systems, patient similarity evaluation plays a fundamental role in many healthcare research areas such as prognosis, risk assessment, and comparative effectiveness analysis. However, existing algorithms for patient similarity evaluation are inefficient in handling massive patient data. In this paper, we propose an Adaptive Semi-Supervised Recursive Tree Partitioning (ART) framework for large scale patient indexing such that the patients with similar clinical or diagnostic patterns can be correctly and efficiently retrieved. The framework is designed for semi-supervised settings since it is crucial to leverage experts' supervision knowledge in medical scenario, which are fairly limited compared to the available data. Starting from the proposed ART framework, we will discuss several specific instantiations and validate them on both benchmark and real world healthcare data. Our results show that with the ART framework, the patients can be efficiently and effectively indexed in the sense that (1) similarity patients can be retrieved in a very short time; (2) the retrieval performance can beat the state-of-the art indexing methods. Copyright © 2015. Published by Elsevier Inc.
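    The framework above recursively partitions patients into a tree so that similar patients land in the same leaf and can be retrieved quickly. The sketch below is a plain, unsupervised random-projection partition tree that conveys the indexing idea; it is not the semi-supervised ART instantiations described in the paper.

```python
# Sketch: a recursive random-projection partition tree for fast retrieval of
# similar patients; an unsupervised stand-in for the indexing idea, not ART.
import numpy as np

class PartitionTree:
    def __init__(self, X, ids, leaf_size=20, rng=None):
        self.rng = rng or np.random.default_rng(0)
        self.root = self._build(X, np.asarray(ids), leaf_size)

    def _build(self, X, ids, leaf_size):
        if len(ids) <= leaf_size:
            return {"leaf": ids}
        w = self.rng.normal(size=X.shape[1])     # random projection direction
        proj = X @ w
        split = np.median(proj)                  # balanced binary split
        left = proj <= split
        return {"w": w, "split": split,
                "left": self._build(X[left], ids[left], leaf_size),
                "right": self._build(X[~left], ids[~left], leaf_size)}

    def query(self, x):
        node = self.root
        while "leaf" not in node:
            side = "left" if x @ node["w"] <= node["split"] else "right"
            node = node[side]
        return node["leaf"]                      # candidate similar patients

rng = np.random.default_rng(4)
X = rng.normal(size=(10_000, 50))                # placeholder patient features
tree = PartitionTree(X, ids=np.arange(len(X)))
print(tree.query(X[0])[:10])                     # candidates for patient 0
```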

  1. Impact of Age and Antibody Type on Progression From Single to Multiple Autoantibodies in Type 1 Diabetes Relatives.

    PubMed

    Bosi, Emanuele; Boulware, David C; Becker, Dorothy J; Buckner, Jane H; Geyer, Susan; Gottlieb, Peter A; Henderson, Courtney; Kinderman, Amanda; Sosenko, Jay M; Steck, Andrea K; Bingley, Polly J

    2017-08-01

    Islet autoantibodies are markers of type 1 diabetes, and an increase in the number of autoantibodies detected during the preclinical phase predicts progression to overt disease. The aim was to refine the effect of age in relation to islet antibody type on progression from single to multiple autoantibodies in relatives of people with type 1 diabetes. We examined 994 relatives with normal glucose tolerance who were positive for a single autoantibody, followed prospectively in the TrialNet Pathway to Prevention. Antibodies to glutamic acid decarboxylase (GADA), insulin (IAA), insulinoma-associated antigen 2, and zinc transporter 8 and islet cell antibodies were tested every 6 to 12 months. The primary outcome was confirmed development of multiple autoantibodies. Age was categorized as <8 years, 8 to 11 years, 12 to 17 years, and ≥18 years, and optimal age breakpoints were identified by recursive partitioning analysis. After a median follow-up of 2 years, 141 relatives had developed at least one additional autoantibody. Five-year risk was inversely related to age, but the pattern differed by antibody type: Relatives with GADA showed a gradual decrease in risk over the four age groups, whereas relatives with IAA showed a sharp decrease above age 8 years. Recursive partitioning analysis identified age breakpoints at 14 years in relatives with GADA and at 4 years in relatives with IAA. In relatives with IAA, spread of islet autoimmunity is largely limited to early childhood, whereas immune responses initially directed at glutamic acid decarboxylase can mature over a longer period. These differences have important implications for monitoring these patients and for designing prevention trials. Copyright © 2017 Endocrine Society

  2. A predictive model of inflammatory markers and patient-reported symptoms for cachexia in newly diagnosed pancreatic cancer patients.

    PubMed

    Fogelman, David R; Morris, J; Xiao, L; Hassan, M; Vadhan, S; Overman, M; Javle, S; Shroff, R; Varadhachary, G; Wolff, R; Vence, L; Maitra, A; Cleeland, C; Wang, X S

    2017-06-01

    Cachexia is a frequent manifestation of pancreatic cancer, can limit a patient's ability to take chemotherapy, and is associated with shortened survival. We developed a model to predict the early onset of cachexia in advanced pancreatic cancer patients. Patients with newly diagnosed, untreated metastatic or locally advanced pancreatic cancer were included. Serum cytokines were drawn prior to therapy. Patient symptoms were recorded using the M.D. Anderson Symptom Inventory (MDASI). Our primary endpoint was either 10% weight loss or death within 60 days of the start of therapy. Twenty-seven of 89 patients met the primary endpoint (either having lost 10% of body weight or having died within 60 days of the start of treatment). In a univariate analysis, smoking, a history of pain and difficulty swallowing, high levels of MK, CXCL-16, IL-6, and TNF-a, and low IL-1b all correlated with this endpoint. We used recursive partitioning to fit a regression tree model, selecting four of 26 variables (CXCL-16, IL-1b, pain, swallowing difficulty) as important in predicting cachexia. From these, a model of two cytokines (CXCL-16 > 5.135 ng/ml and IL-1b < 0.08 ng/ml) demonstrated better sensitivity and specificity for this outcome (0.70 and 0.86, respectively) than any individual cytokine or tumor marker. Cachexia is frequent in pancreatic cancer; one in three patients met our endpoint of 10% weight loss or death within 60 days. Inflammatory cytokines are better than conventional tumor markers at predicting this outcome. Recursive partitioning analysis suggests that a model of CXCL-16 and IL-1b may offer a better ability than individual cytokines to predict this outcome.

  3. A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science.

    PubMed

    Waltman, Ludo; Yan, Erjia; van Eck, Nees Jan

    2011-10-01

    Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighing (like in PageRank-inspired indicators). We combine these two ideas in a single indicator, referred to as the recursive mean normalized citation score indicator, and we study the validity of this indicator. Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme. Based on these observations, we advise against the use of indicators in which the idea of normalization based on a field classification scheme and the idea of recursive citation weighing are combined.

  4. An introduction to tree-structured modeling with application to quality of life data.

    PubMed

    Su, Xiaogang; Azuero, Andres; Cho, June; Kvale, Elizabeth; Meneses, Karen M; McNees, M Patrick

    2011-01-01

    Investigators addressing nursing research are faced increasingly with the need to analyze data that involve variables of mixed types and are characterized by complex nonlinearity and interactions. Tree-based methods, also called recursive partitioning, are gaining popularity in various fields. In addition to efficiency and flexibility in handling multifaceted data, tree-based methods offer ease of interpretation. The aims of this study were to introduce tree-based methods, discuss their advantages and pitfalls in application, and describe their potential use in nursing research. In this article, (a) an introduction to tree-structured methods is presented, (b) the technique is illustrated via quality of life (QOL) data collected in the Breast Cancer Education Intervention study, and (c) implications for their potential use in nursing research are discussed. As illustrated by the QOL analysis example, tree methods generate interesting and easily understood findings that cannot be uncovered via traditional linear regression analysis. The expanding breadth and complexity of nursing research may entail the use of new tools to improve efficiency and gain new insights. In certain situations, tree-based methods offer an attractive approach that help address such needs.
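    For readers who want to try the tree-structured approach described above, the sketch below fits and prints a small regression tree on made-up quality-of-life data with mixed predictor types; the BCEI study variables are not reproduced here.

```python
# Sketch: fitting and inspecting a small regression tree (recursive partitioning)
# on made-up quality-of-life data with mixed predictor types.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

df = pd.DataFrame({
    "age": [45, 62, 38, 55, 70, 49, 66, 41, 58, 73],
    "fatigue": [3, 7, 2, 5, 8, 4, 6, 1, 5, 9],
    "employed": [1, 0, 1, 1, 0, 1, 0, 1, 0, 0],   # categorical coded 0/1
    "qol": [82, 55, 90, 70, 48, 78, 60, 92, 68, 40],
})

X, y = df[["age", "fatigue", "employed"]], df["qol"]
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=2).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```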

  5. Stereotactic Body Radiotherapy for Early-stage Non-small-cell Lung Cancer in Patients 80 Years and Older: A Multi-center Analysis.

    PubMed

    Cassidy, Richard J; Patel, Pretesh R; Zhang, Xinyan; Press, Robert H; Switchenko, Jeffrey M; Pillai, Rathi N; Owonikoko, Taofeek K; Ramalingam, Suresh S; Fernandez, Felix G; Force, Seth D; Curran, Walter J; Higgins, Kristin A

    2017-09-01

    Stereotactic body radiotherapy (SBRT) is the standard of care for medically inoperable early-stage non-small-cell lung cancer. Despite the limited number of octogenarians and nonagenarians in trials of SBRT, it is increasingly offered to these patients, given the aging cancer population, medical fragility, or patient preference. Our purpose was to investigate the efficacy, safety, and survival of patients ≥ 80 years old treated with definitive lung SBRT. Patients who underwent SBRT were reviewed from 2009 to 2015 at 4 academic centers. Patients diagnosed at ≥ 80 years old were included. Kaplan-Meier, multivariate logistic regression, and Cox proportional hazards regression analyses were performed. Recursive partitioning analysis was done to determine a subgroup of patients most likely to benefit from therapy. A total of 58 patients were included, with a median age of 84.9 years (range, 80.1-95.2 years), a median follow-up time of 19.9 months (range, 6.9-64.9 months), a median fraction size of 10.0 Gy (range, 7.0-20.0 Gy), and a median number of fractions of 5.0 (range, 3.0-8.0 fractions). On multivariate analysis, higher Karnofsky performance status (KPS) was associated with higher local recurrence-free survival (hazard ratio [HR], 0.92; P < .01), regional recurrence-free survival (HR, 0.94; P < .01), and overall survival (HR, 0.91; P < .01). On recursive partitioning analysis, patients with KPS ≥ 75 had improved 3-year cancer-specific and overall survival (99.4% and 91.9%, respectively) compared with patients with KPS < 75 (47.8% and 23.6%, respectively; P < .01). Definitive lung SBRT for early-stage non-small-cell lung cancer was efficacious and safe in patients ≥ 80 years old. Patients with a KPS of ≥ 75 derived the most benefit from therapy. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A Matched-Pair Analysis Comparing Whole-Brain Radiotherapy Plus Stereotactic Radiosurgery Versus Surgery Plus Whole-Brain Radiotherapy and a Boost to the Metastatic Site for One or Two Brain Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rades, Dirk; Department of Radiation Oncology, University Medical Center, Hamburg; Kueter, Jan-Dirk

    2009-03-15

    Purpose: To compare the results of whole-brain radiotherapy plus stereotactic radiosurgery (WBRT+SRS) with those of surgery plus whole-brain radiotherapy and a boost to the metastatic site (OP+WBRT+boost) for patients with one or two brain metastases. Methods and Materials: Survival, intracerebral control, and local control of the treated metastases were retrospectively evaluated. To reduce the risk of selection bias, a matched-pair analysis was performed. The outcomes of 47 patients who received WBRT+SRS were compared with those of a second cohort of 47 patients who received OP+WBRT+boost. The two treatment groups were matched for the following potential prognostic factors: WBRT schedule, age, gender, performance status, tumor type, number of brain metastases, extracerebral metastases, recursive partitioning analysis class, and interval from tumor diagnosis to WBRT. Results: The 1-year survival rates were 65% after WBRT+SRS and 63% after OP+WBRT+boost (p = 0.19). The 1-year intracerebral control rates were 70% and 78% (p = 0.39), respectively. The 1-year local control rates were 84% and 83% (p = 0.87), respectively. On multivariate analyses, improved survival was significantly associated with better performance status (p = 0.009), no extracerebral metastases (p = 0.004), recursive partitioning analysis Class 1 (p = 0.004), and interval from tumor diagnosis to WBRT (p = 0.001). Intracerebral control was not significantly associated with any of the potential prognostic factors. Improved local control was significantly associated with no extracerebral metastases (p = 0.037). Conclusions: Treatment outcomes were not significantly different after WBRT+SRS compared with OP+WBRT+boost. However, WBRT+SRS is less invasive than OP+WBRT+boost and may be preferable for patients with one or two brain metastases. The results should be confirmed by randomized trials.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castle, Katherine O., E-mail: kocastle@mdanderson.org; Hoffman, Karen E.; Levy, Lawrence B.

    Purpose: The benefit of adding androgen deprivation therapy (ADT) to dose-escalated radiation therapy (RT) for men with intermediate-risk prostate cancer is unclear; therefore, we assessed the impact of adding ADT to dose-escalated RT on freedom from failure (FFF). Methods: Three groups of men treated with intensity modulated RT or 3-dimensional conformal RT (75.6-78 Gy) from 1993-2008 for prostate cancer were categorized as (1) 326 intermediate-risk patients treated with RT alone, (2) 218 intermediate-risk patients treated with RT and ≤6 months of ADT, and (3) 274 low-risk patients treated with definitive RT. Median follow-up was 58 months. Recursive partitioning analysis based on FFF using Gleason score (GS), T stage, and pretreatment PSA concentration was applied to the intermediate-risk patients treated with RT alone. The Kaplan-Meier method was used to estimate 5-year FFF. Results: Based on recursive partitioning analysis, intermediate-risk patients treated with RT alone were divided into 3 prognostic groups: (1) 188 favorable patients: GS 6, ≤T2b or GS 3+4, ≤T1c; (2) 71 marginal patients: GS 3+4, T2a-b; and (3) 68 unfavorable patients: GS 4+3 or T2c disease. Hazard ratios (HR) for recurrence in each group were 1.0, 2.1, and 4.6, respectively. When intermediate-risk patients treated with RT alone were compared to intermediate-risk patients treated with RT and ADT, the greatest benefit from ADT was seen for the unfavorable intermediate-risk patients (FFF, 74% vs 94%, respectively; P=.005). Favorable intermediate-risk patients had no significant benefit from the addition of ADT to RT (FFF, 94% vs 95%, respectively; P=.85), and FFF for favorable intermediate-risk patients treated with RT alone approached that of low-risk patients treated with RT alone (98%). Conclusions: Patients with favorable intermediate-risk prostate cancer did not benefit from the addition of ADT to dose-escalated RT, and their FFF was nearly as good as patients with low-risk disease. In patients with GS 4+3 or T2c disease, the addition of ADT to dose-escalated RT did improve FFF.

  8. Evaluation of funnel traps for estimating tree mortality and associated population phase of spruce beetle in Utah

    Treesearch

    E. Matthew Hansen; Barbara J. Bentz; A. Steven Munson; James C. Vandygriff; David L. Turner

    2006-01-01

    Although funnel traps are routinely used to manage bark beetles, little is known regarding the relationship between trap captures of spruce beetle (Dendroctonus rufipennis Kirby) and mortality of Engelmann spruce (Picea engelmannii Parry ex Engelm.) within a 10 ha block of the trap. Using recursive partitioning tree analyses, rules...

  9. Predicting forest attributes from climate data using a recursive partitioning and regression tree algorithm

    Treesearch

    Greg C. Liknes; Christopher W. Woodall; Charles H. Perry

    2009-01-01

    Climate information frequently is included in geospatial modeling efforts to improve the predictive capability of other data sources. The selection of an appropriate climate data source requires consideration given the number of choices available. With regard to climate data, there are a variety of parameters (e.g., temperature, humidity, precipitation), time intervals...

  10. The nondeterministic divide

    NASA Technical Reports Server (NTRS)

    Charlesworth, Arthur

    1990-01-01

    The nondeterministic divide partitions a vector into two non-empty slices by allowing the point of division to be chosen nondeterministically. Support for high-level divide-and-conquer programming provided by the nondeterministic divide is investigated. A diva algorithm is a recursive divide-and-conquer sequential algorithm on one or more vectors of the same range, whose division point for a new pair of recursive calls is chosen nondeterministically before any computation is performed and whose recursive calls are made immediately after the choice of division point; also, access to vector components is only permitted during activations in which the vector parameters have unit length. The notion of diva algorithm is formulated precisely as a diva call, a restricted call on a sequential procedure. Diva calls are proven to be intimately related to associativity. Numerous applications of diva calls are given and strategies are described for translating a diva call into code for a variety of parallel computers. Thus diva algorithms separate logical correctness concerns from implementation concerns.
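    A diva-style computation can be written directly as a recursive function whose division point is an arbitrary interior index; because the combining operation is associative, every choice of division point yields the same result. The sketch below randomizes the split to stand in for nondeterministic choice.

```python
# Sketch: a diva-style divide-and-conquer reduction whose division point is
# chosen arbitrarily (here, at random) before any computation on the slice.
import random

def diva_sum(v, lo, hi):
    """Sum v[lo:hi] by nondeterministic divide-and-conquer."""
    if hi - lo == 1:                 # unit-length slice: component access allowed
        return v[lo]
    mid = random.randint(lo + 1, hi - 1)   # nondeterministic division point
    return diva_sum(v, lo, mid) + diva_sum(v, mid, hi)

data = list(range(1, 101))
# Associativity of + guarantees the same answer for every division choice.
assert all(diva_sum(data, 0, len(data)) == 5050 for _ in range(5))
```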

  11. Adventures in Topological Field Theory

    NASA Astrophysics Data System (ADS)

    Horne, James H.

    1990-01-01

    This thesis consists of 5 parts. In part I, the topological Yang-Mills theory and the topological sigma model are presented in a superspace formulation. This greatly simplifies the field content of the theories, and makes the Q-invariance more obvious. The Feynman rules for the topological Yang-Mills theory are derived. We calculate the one-loop beta-functions of the topological sigma model in superspace. The lattice version of these theories is presented. The self-duality constraints of both models lead to spectrum doubling. In part II, we show that conformally invariant gravity in three dimensions is equivalent to the Yang-Mills gauge theory of the conformal group in three dimensions, with a Chern-Simons action. This means that conformal gravity is finite and exactly soluble. In part III, we derive the skein relations for the fundamental representations of SO(N), Sp(2n), SU(m|n), and OSp(m|2n). These relations can be used recursively to calculate the expectation values of Wilson lines in three-dimensional Chern-Simons gauge theory with these gauge groups. A combination of braiding and tying of Wilson lines completely describes the skein relations. In part IV, we show that the k = 1 two dimensional gravity amplitudes at genus 3 agree precisely with the results from intersection theory on moduli space. Predictions for the genus 4 intersection numbers follow from the two dimensional gravity theory. In part V, we discuss the partition function in two dimensional gravity. For the one matrix model at genus 2, we use the partition function to derive a recursion relation. We show that the k = 1 amplitudes completely determine the partition function at arbitrary genus. We present a conjecture for the partition function for the arbitrary topological field theory coupled to topological gravity.

  12. Overlapping communities detection based on spectral analysis of line graphs

    NASA Astrophysics Data System (ADS)

    Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan

    2018-05-01

    Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, such that communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization while detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlap and hierarchical structure of the communities. In order to avoid the limitations of absolute distances such as the Euclidean distance, SAoLG employs angular distance to compute the similarity between vertices. Furthermore, we make a small improvement to partition density to evaluate the quality of community structure and use it to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community numbers than Ahn's algorithm, the classical CPM, and GN.
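    The central move in SAoLG, working on the line graph so that clustering edges yields overlapping vertex communities, can be sketched with standard tools. The example below builds a line graph and spectrally clusters its nodes (the original edges); it illustrates the idea only and omits SAoLG's angular distance and partition-density steps.

```python
# Sketch: overlapping communities via clustering of the line graph.
# Each original edge gets a community label; a vertex inherits the labels of
# its incident edges and can therefore belong to several communities.
from collections import defaultdict
import networkx as nx
from sklearn.cluster import SpectralClustering

G = nx.karate_club_graph()
L = nx.line_graph(G)                       # nodes of L are edges of G
edges = list(L.nodes())
A = nx.to_numpy_array(L, nodelist=edges)   # adjacency of the line graph

labels = SpectralClustering(n_clusters=4, affinity="precomputed",
                            random_state=0).fit_predict(A)

membership = defaultdict(set)
for (u, v), c in zip(edges, labels):
    membership[u].add(c)
    membership[v].add(c)

overlapping = [v for v, cs in membership.items() if len(cs) > 1]
print(f"{len(overlapping)} of {G.number_of_nodes()} vertices are in >1 community")
```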

  13. Early symptom burden predicts recovery after sport-related concussion

    PubMed Central

    Mannix, Rebekah; Monuteaux, Michael C.; Stein, Cynthia J.; Bachur, Richard G.

    2014-01-01

    Objective: To identify independent predictors of and use recursive partitioning to develop a multivariate regression tree predicting symptom duration greater than 28 days after a sport-related concussion. Methods: We conducted a prospective cohort study of patients in a sports concussion clinic. Participants completed questionnaires that included the Post-Concussion Symptom Scale (PCSS). Participants were asked to record the date on which they last experienced symptoms. Potential predictor variables included age, sex, score on symptom inventories, history of prior concussions, performance on computerized neurocognitive assessments, loss of consciousness and amnesia at the time of injury, history of prior medical treatment for headaches, history of migraines, and family history of concussion. We used recursive partitioning analysis to develop a multivariate prediction model for identifying athletes at risk for a prolonged recovery from concussion. Results: A total of 531 patients ranged in age from 7 to 26 years (mean 14.6 ± 2.9 years). The mean PCSS score at the initial visit was 26 ± 26; mean time to presentation was 12 ± 5 days. Only total score on symptom inventory was independently associated with symptoms lasting longer than 28 days (adjusted odds ratio 1.044; 95% confidence interval [CI] 1.034, 1.054 for PCSS). No other potential predictor variables were independently associated with symptom duration or useful in developing the optimal regression decision tree. Most participants (86%; 95% CI 80%, 90%) with an initial PCSS score of <13 had resolution of their symptoms within 28 days of injury. Conclusions: The only independent predictor of prolonged symptoms after sport-related concussion is overall symptom burden. PMID:25381296

  14. Early symptom burden predicts recovery after sport-related concussion.

    PubMed

    Meehan, William P; Mannix, Rebekah; Monuteaux, Michael C; Stein, Cynthia J; Bachur, Richard G

    2014-12-09

    To identify independent predictors of and use recursive partitioning to develop a multivariate regression tree predicting symptom duration greater than 28 days after a sport-related concussion. We conducted a prospective cohort study of patients in a sports concussion clinic. Participants completed questionnaires that included the Post-Concussion Symptom Scale (PCSS). Participants were asked to record the date on which they last experienced symptoms. Potential predictor variables included age, sex, score on symptom inventories, history of prior concussions, performance on computerized neurocognitive assessments, loss of consciousness and amnesia at the time of injury, history of prior medical treatment for headaches, history of migraines, and family history of concussion. We used recursive partitioning analysis to develop a multivariate prediction model for identifying athletes at risk for a prolonged recovery from concussion. A total of 531 patients ranged in age from 7 to 26 years (mean 14.6 ± 2.9 years). The mean PCSS score at the initial visit was 26 ± 26; mean time to presentation was 12 ± 5 days. Only total score on symptom inventory was independently associated with symptoms lasting longer than 28 days (adjusted odds ratio 1.044; 95% confidence interval [CI] 1.034, 1.054 for PCSS). No other potential predictor variables were independently associated with symptom duration or useful in developing the optimal regression decision tree. Most participants (86%; 95% CI 80%, 90%) with an initial PCSS score of <13 had resolution of their symptoms within 28 days of injury. The only independent predictor of prolonged symptoms after sport-related concussion is overall symptom burden. © 2014 American Academy of Neurology.

  15. Efficiency of International Classification of Diseases, Ninth Revision, Billing Code Searches to Identify Emergency Department Visits for Blood or Body Fluid Exposures through a Statewide Multicenter Database

    PubMed Central

    Rosen, Lisa M.; Liu, Tao; Merchant, Roland C.

    2016-01-01

    BACKGROUND Blood and body fluid exposures are frequently evaluated in emergency departments (EDs). However, efficient and effective methods for estimating their incidence are not yet established. OBJECTIVE Evaluate the efficiency and accuracy of estimating statewide ED visits for blood or body fluid exposures using International Classification of Diseases, Ninth Revision (ICD-9), code searches. DESIGN Secondary analysis of a database of ED visits for blood or body fluid exposure. SETTING EDs of 11 civilian hospitals throughout Rhode Island from January 1, 1995, through June 30, 2001. PATIENTS Patients presenting to the ED for possible blood or body fluid exposure were included, as determined by prespecified ICD-9 codes. METHODS Positive predictive values (PPVs) were estimated to determine the ability of 10 ICD-9 codes to distinguish ED visits for blood or body fluid exposure from ED visits that were not for blood or body fluid exposure. Recursive partitioning was used to identify an optimal subset of ICD-9 codes for this purpose. Random-effects logistic regression modeling was used to examine variations in ICD-9 coding practices and styles across hospitals. Cluster analysis was used to assess whether the choice of ICD-9 codes was similar across hospitals. RESULTS The PPV for the original 10 ICD-9 codes was 74.4% (95% confidence interval [CI], 73.2%–75.7%), whereas the recursive partitioning analysis identified a subset of 5 ICD-9 codes with a PPV of 89.9% (95% CI, 88.9%–90.8%) and a misclassification rate of 10.1%. The ability, efficiency, and use of the ICD-9 codes to distinguish types of ED visits varied across hospitals. CONCLUSIONS Although an accurate subset of ICD-9 codes could be identified, variations across hospitals related to hospital coding style, efficiency, and accuracy greatly affected estimates of the number of ED visits for blood or body fluid exposure. PMID:22561713

  16. Virasoro constraints and polynomial recursion for the linear Hodge integrals

    NASA Astrophysics Data System (ADS)

    Guo, Shuai; Wang, Gehao

    2017-04-01

    The Hodge tau-function is a generating function for the linear Hodge integrals. It is also a tau-function of the KP hierarchy. In this paper, we first present the Virasoro constraints for the Hodge tau-function in the explicit form of the Virasoro equations. The expression of our Virasoro constraints is simply a linear combination of the Virasoro operators, where the coefficients are restored from a power series for the Lambert W function. Then, using this result, we deduce a simple version of the Virasoro constraints for the linear Hodge partition function, where the coefficients are restored from the Gamma function. Finally, we establish the equivalence relation between the Virasoro constraints and polynomial recursion formula for the linear Hodge integrals.

  17. Accounting for Individual Differences in Bradley-Terry Models by Means of Recursive Partitioning

    ERIC Educational Resources Information Center

    Strobl, Carolin; Wickelmaier, Florian; Zeileis, Achim

    2011-01-01

    The preference scaling of a group of subjects may not be homogeneous, but different groups of subjects with certain characteristics may show different preference scalings, each of which can be derived from paired comparisons by means of the Bradley-Terry model. Usually, either different models are fit in predefined subsets of the sample or the…

  18. ReHypar: A Recursive Hybrid Chunk Partitioning Method Using NAND-Flash Memory SSD

    PubMed Central

    Park, Sung-Soon; Lim, Cheol-Su

    2014-01-01

    Due to the rapid development of flash memory, SSD is considered to be the replacement of HDD in the storage market. Although SSD retains several promising characteristics, such as high random I/O performance and nonvolatility, its high cost per capacity is the main obstacle to replacing HDD in all storage solutions. An alternative is to provide a hybrid structure where a small portion of SSD address space is combined with the much larger HDD address space. In such a structure, maximizing the space utilization of SSD in a cost-effective way is extremely important to generate high I/O performance. We developed ReHypar (recursive hybrid chunk partitioning), which improves the space utilization of SSD in the hybrid structure. The first objective of ReHypar is to mitigate the fragmentation overhead of SSD address space by reusing the remaining free space of I/O units as much as possible. Furthermore, ReHypar allows defining several logical data sections in SSD address space, with each of those sections configured with a different I/O unit. We integrated ReHypar with ext2 and ext4 and evaluated it using two public benchmarks, IOzone and Postmark. PMID:24987741

  19. An application of change-point recursive models to the relationship between litter size and number of stillborns in pigs.

    PubMed

    Ibáñez-Escriche, N; López de Maturana, E; Noguera, J L; Varona, L

    2010-11-01

    We developed and implemented change-point recursive models and compared them with a linear recursive model and a standard mixed model (SMM), in the context of the relationship between litter size (LS) and number of stillborns (NSB) in pigs. The proposed approach allows us to estimate the point of change in multiple-segment modeling of a nonlinear relationship between phenotypes. We applied the procedure to a data set provided by a commercial Large White selection nucleus. The data file consisted of LS and NSB records of 4,462 parities. The results of the analysis clearly identified the location of the change points between different structural regression coefficients. The magnitude of these coefficients increased with LS, indicating an increasing influence of LS on NSB. However, posterior distributions of correlations were similar across subpopulations (defined by the change points on LS), except for those between residuals. The heritability estimates of NSB did not differ between the recursive models. Nevertheless, these heritabilities were greater than those obtained for the SMM (0.05), with a posterior probability of 85%. These results suggest a nonlinear relationship between LS and NSB, which supports the adequacy of a change-point recursive model for its analysis. Furthermore, the results from model comparisons support the use of recursive models. However, the adequacy of the different recursive models depended on the criteria used: the linear recursive model was preferred on account of its smaller deviance value, whereas the nonlinear recursive models provided a better fit and predictive ability based on the cross-validation approach.
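    The core idea, a regression of NSB on LS whose slope changes at an estimated breakpoint, can be illustrated with a simple least-squares segmented fit on simulated data, as sketched below; the paper's models are Bayesian recursive models with genetic effects, so this shows only the change-point estimation step.

```python
# Sketch: grid-search estimation of a single change point in the slope of
# NSB on LS, using simulated piecewise-linear (hinge) data.
import numpy as np

rng = np.random.default_rng(5)
ls = rng.integers(5, 21, size=2000).astype(float)       # litter size
true_cp = 14.0
nsb = (0.05 * (ls - 5) + 0.30 * np.clip(ls - true_cp, 0, None)
       + rng.normal(0, 0.5, size=ls.size))              # noisy stillborn counts

def sse(cp):
    # Piecewise-linear design: intercept, LS, and hinge term (LS - cp)+
    X = np.column_stack([np.ones_like(ls), ls, np.clip(ls - cp, 0, None)])
    beta, *_ = np.linalg.lstsq(X, nsb, rcond=None)
    return np.sum((nsb - X @ beta) ** 2)

candidates = np.arange(7, 19)
best = candidates[np.argmin([sse(c) for c in candidates])]
print(f"estimated change point in litter size: {best}")
```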

  20. Developing the surveillance algorithm for detection of failure to recognize and treat severe sepsis.

    PubMed

    Harrison, Andrew M; Thongprayoon, Charat; Kashyap, Rahul; Chute, Christopher G; Gajic, Ognjen; Pickering, Brian W; Herasevich, Vitaly

    2015-02-01

    To develop and test an automated surveillance algorithm (sepsis "sniffer") for the detection of severe sepsis and monitoring failure to recognize and treat severe sepsis in a timely manner. We conducted an observational diagnostic performance study using independent derivation and validation cohorts from an electronic medical record database of the medical intensive care unit (ICU) of a tertiary referral center. All patients aged 18 years and older who were admitted to the medical ICU from January 1 through March 31, 2013 (N=587), were included. The criterion standard for severe sepsis/septic shock was manual review by 2 trained reviewers with a third superreviewer for cases of interobserver disagreement. Critical appraisal of false-positive and false-negative alerts, along with recursive data partitioning, was performed for algorithm optimization. An algorithm based on criteria for suspicion of infection, systemic inflammatory response syndrome, organ hypoperfusion and dysfunction, and shock had a sensitivity of 80% and a specificity of 96% when applied to the validation cohort. In order, low systolic blood pressure, systemic inflammatory response syndrome positivity, and suspicion of infection were determined through recursive data partitioning to be of greatest predictive value. Lastly, 117 alert-positive patients (68% of the 171 patients with severe sepsis) had a delay in recognition and treatment, defined as no lactate and central venous pressure measurement within 2 hours of the alert. The optimized sniffer accurately identified patients with severe sepsis that bedside clinicians failed to recognize and treat in a timely manner. Copyright © 2015 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
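    A surveillance rule of the kind described, suspected infection plus systemic inflammatory response plus organ hypoperfusion, can be expressed as a simple boolean screen over charted variables. The thresholds below are generic textbook criteria used for illustration; the study's optimized criteria are not reproduced in the abstract.

```python
# Sketch: a rule-based severe-sepsis screen over charted variables.
# Thresholds are generic illustrative values, not the study's tuned criteria.
from dataclasses import dataclass

@dataclass
class Observation:
    temp_c: float
    heart_rate: float
    resp_rate: float
    wbc: float            # x10^9 / L
    systolic_bp: float
    lactate: float        # mmol/L
    suspected_infection: bool

def sirs_count(o: Observation) -> int:
    return sum([
        o.temp_c > 38.0 or o.temp_c < 36.0,
        o.heart_rate > 90,
        o.resp_rate > 20,
        o.wbc > 12 or o.wbc < 4,
    ])

def severe_sepsis_alert(o: Observation) -> bool:
    sirs = sirs_count(o) >= 2
    hypoperfusion = o.systolic_bp < 90 or o.lactate >= 4.0
    return o.suspected_infection and sirs and hypoperfusion

obs = Observation(temp_c=38.9, heart_rate=118, resp_rate=26, wbc=15.2,
                  systolic_bp=84, lactate=4.6, suspected_infection=True)
print(severe_sepsis_alert(obs))   # True -> fire the "sniffer" alert
```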

  1. Personalized Risk Prediction in Clinical Oncology Research: Applications and Practical Issues Using Survival Trees and Random Forests.

    PubMed

    Hu, Chen; Steingrimsson, Jon Arni

    2018-01-01

    A crucial component of making individualized treatment decisions is to accurately predict each patient's disease risk. In clinical oncology, disease risks are often measured through time-to-event data, such as overall survival and progression/recurrence-free survival, and are often subject to censoring. Risk prediction models based on recursive partitioning methods are becoming increasingly popular largely due to their ability to handle nonlinear relationships, higher-order interactions, and/or high-dimensional covariates. The most popular recursive partitioning methods are versions of the Classification and Regression Tree (CART) algorithm, which builds a simple interpretable tree structured model. With the aim of increasing prediction accuracy, the random forest algorithm averages multiple CART trees, creating a flexible risk prediction model. Risk prediction models used in clinical oncology commonly use both traditional demographic and tumor pathological factors as well as high-dimensional genetic markers and treatment parameters from multimodality treatments. In this article, we describe the most commonly used extensions of the CART and random forest algorithms to right-censored outcomes. We focus on how they differ from the methods for noncensored outcomes, and how the different splitting rules and methods for cost-complexity pruning impact these algorithms. We demonstrate these algorithms by analyzing a randomized Phase III clinical trial of breast cancer. We also conduct Monte Carlo simulations to compare the prediction accuracy of survival forests with more commonly used regression models under various scenarios. These simulation studies aim to evaluate how sensitive the prediction accuracy is to the underlying model specifications, the choice of tuning parameters, and the degrees of missing covariates.
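    Cost-complexity pruning, mentioned above as a key tuning step for CART-style trees, is exposed directly in common libraries. The brief sketch below scans the pruning path on a non-censored toy outcome; the survival-specific splitting rules discussed in the article are not shown.

```python
# Sketch: cost-complexity pruning of a classification tree; illustrates the
# pruning step only, on a non-censored toy outcome.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)
for alpha in path.ccp_alphas[::5]:          # scan a subset of candidate alphas
    tree = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f}  leaves={tree.get_n_leaves()}  "
          f"test accuracy={tree.score(X_te, y_te):.3f}")
```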

  2. Algorithms for the automatic generation of 2-D structured multi-block grids

    NASA Technical Reports Server (NTRS)

    Schoenfeld, Thilo; Weinerfelt, Per; Jenssen, Carl B.

    1995-01-01

    Two different approaches to the fully automatic generation of structured multi-block grids in two dimensions are presented. The work aims to simplify the user interactivity necessary for the definition of a multiple block grid topology. The first approach is based on an advancing front method commonly used for the generation of unstructured grids. The original algorithm has been modified toward the generation of large quadrilateral elements. The second method is based on the divide-and-conquer paradigm with the global domain recursively partitioned into sub-domains. For either method each of the resulting blocks is then meshed using transfinite interpolation and elliptic smoothing. The applicability of these methods to practical problems is demonstrated for typical geometries of fluid dynamics.

  3. Dosimetric Predictors of Duodenal Toxicity After Intensity Modulated Radiation Therapy for Treatment of the Para-aortic Nodes in Gynecologic Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, Jonathan; Sulman, Erik P.; Jhingran, Anuja

    Purpose: To determine the incidence of duodenal toxicity in patients receiving intensity modulated radiation therapy (IMRT) for treatment of para-aortic nodes and to identify dosimetric parameters predictive of late duodenal toxicity. Methods and Materials: We identified 105 eligible patients with gynecologic malignancies who were treated with IMRT for gross metastatic disease in the para-aortic nodes from January 1, 2005, through December 31, 2009. Patients were treated to a nodal clinical target volume to 45 to 50.4 Gy with a boost to 60 to 66 Gy. The duodenum was contoured, and dosimetric data were exported for analysis. Duodenal toxicity was scored according to Radiation Therapy Oncology Group criteria. Univariate Cox proportional hazards analysis and recursive partitioning analysis were used to determine associations between dosimetric variables and time to toxicity and to identify the optimal threshold that separated patients according to risk of toxicity. Results: Nine of the 105 patients experienced grade 2 to grade 5 duodenal toxicity, confirmed by endoscopy in all cases. The 3-year actuarial rate of any duodenal toxicity was 11.7%. A larger volume of the duodenum receiving 55 Gy (V55) was associated with higher rates of duodenal toxicity. The 3-year actuarial rates of duodenal toxicity with V55 above and below 15 cm³ were 48.6% and 7.4%, respectively (P<.01). In Cox univariate analysis of dosimetric variables, V55 was associated with duodenal toxicity (P=.029). In recursive partitioning analysis, V55 less than 13.94% segregated all patients with duodenal toxicity. Conclusions: Dose-escalated IMRT can safely and effectively treat para-aortic nodal disease in gynecologic malignancies, provided that care is taken to limit the dose to the duodenum to reduce the risk of late duodenal toxicity. Limiting V55 to below 15 cm³ may reduce the risk of duodenal complications. In cases where the treatment cannot be delivered within these constraints, consideration should be given to other treatment approaches such as resection or initial chemotherapy.

  4. S-HARP: A parallel dynamic spectral partitioner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sohn, A.; Simon, H.

    1998-01-01

    Computational science problems with adaptive meshes involve dynamic load balancing when implemented on parallel machines. This dynamic load balancing requires fast partitioning of computational meshes at run time. The authors present in this report a fast parallel dynamic partitioner, called S-HARP. The underlying principles of S-HARP are the fast feature of inertial partitioning and the quality feature of spectral partitioning. S-HARP partitions a graph from scratch, requiring no partition information from previous iterations. Two types of parallelism have been exploited in S-HARP, fine grain loop level parallelism and coarse grain recursive parallelism. The parallel partitioner has been implemented in Message Passing Interface on Cray T3E and IBM SP2 for portability. Experimental results indicate that S-HARP can partition a mesh of over 100,000 vertices into 256 partitions in 0.2 seconds on a 64 processor Cray T3E. S-HARP is much more scalable than other dynamic partitioners, giving over 15 fold speedup on 64 processors while ParaMeTiS1.0 gives a few fold speedup. Experimental results demonstrate that S-HARP is three to 10 times faster than the dynamic partitioners ParaMeTiS and Jostle on six computational meshes of size over 100,000 vertices.
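    S-HARP combines an inertial ingredient with a spectral one. The spectral ingredient, recursive bisection by the values of the Fiedler vector, can be sketched serially as below; this is a plain recursive spectral bisection on a mesh-like graph, not the parallel S-HARP implementation.

```python
# Sketch: serial recursive spectral bisection of a mesh-like graph using the
# Fiedler vector; illustrates the spectral ingredient of dynamic partitioners.
import numpy as np
import networkx as nx

def fiedler_bisect(G, nodes):
    sub = G.subgraph(nodes)
    fv = nx.fiedler_vector(sub, seed=0)          # second Laplacian eigenvector
    value = dict(zip(sub.nodes(), fv))
    cut = np.median(fv)
    left = [n for n in nodes if value[n] < cut]
    right = [n for n in nodes if value[n] >= cut]
    return left, right

def recursive_partition(G, nodes, parts):
    if parts == 1 or len(nodes) <= 1:
        return [list(nodes)]
    left, right = fiedler_bisect(G, nodes)
    return (recursive_partition(G, left, parts // 2) +
            recursive_partition(G, right, parts - parts // 2))

G = nx.grid_2d_graph(30, 20)                     # stand-in for a computational mesh
partitions = recursive_partition(G, list(G.nodes()), parts=8)
print([len(p) for p in partitions])              # roughly balanced block sizes
```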

  5. Beam Path Toxicities to Non-Target Structures During Intensity-Modulated Radiation Therapy for Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenthal, David I.; Chambers, Mark S.; Fuller, Clifton D.

    2008-11-01

    Background: Intensity-modulated radiation therapy (IMRT) beams traverse nontarget normal structures not irradiated during three-dimensional conformal RT (3D-CRT) for head and neck cancer (HNC). This study estimates the doses and toxicities to nontarget structures during IMRT. Materials and Methods: Oropharyngeal cancer IMRT and 3D-CRT cases were reviewed. Dose-volume histograms (DVH) were used to evaluate radiation dose to the lip, cochlea, brainstem, occipital scalp, and segments of the mandible. Toxicity rates were compared for 3D-CRT, IMRT alone, or IMRT with concurrent cisplatin. Descriptive statistics and exploratory recursive partitioning analysis were used to estimate dose 'breakpoints' associated with observed toxicities. Results: A total of 160 patients were evaluated for toxicity; 60 had detailed DVH evaluation and 15 had 3D-CRT plan comparison. Comparing IMRT with 3D-CRT, there was significant (p ≤ 0.002) nonparametric differential dose to all clinically significant structures of interest. Thirty percent of IMRT patients had headaches and 40% had occipital scalp alopecia. A total of 76% and 38% of patients treated with IMRT alone had nausea and vomiting, compared with 99% and 68%, respectively, of those with concurrent cisplatin. IMRT had a markedly distinct toxicity profile from 3D-CRT. In recursive partitioning analysis, National Cancer Institute's Common Toxicity Criteria adverse effects 3.0 nausea and vomiting, scalp alopecia, and anterior mucositis were associated with reconstructed mean brainstem dose >36 Gy, occipital scalp dose >30 Gy, and anterior mandible dose >34 Gy, respectively. Conclusions: Dose reduction to specified structures during IMRT implies an increased beam path dose to alternate nontarget structures that may result in clinical toxicities that were uncommon with previous, less conformal approaches. These findings have implications for IMRT treatment planning and research, toxicity assessment, and multidisciplinary patient management.

  6. PROSPECT: Profiling of Resistance Patterns & Oncogenic Signaling Pathways in Evaluation of Cancers of the Thorax and Therapeutic Target Identification

    DTIC Science & Technology

    2012-06-01

    neoadjuvant therapies on disease-free, progression-free, and overall survival will vary across prognostically distinct groups. 3. Specific molecular... prognostically distinct subpopulations of patients with resectable NSCLC, and to assess the extent to which these molecular profiles correlate with tumor...overall survival, and will use Cox proportional hazards models and recursive partitioning methods to identify important biomarkers and prognostically

  7. Subarachnoid hemorrhage admissions retrospectively identified using a prediction model

    PubMed Central

    McIntyre, Lauralyn; Fergusson, Dean; Turgeon, Alexis; dos Santos, Marlise P.; Lum, Cheemun; Chassé, Michaël; Sinclair, John; Forster, Alan; van Walraven, Carl

    2016-01-01

    Objective: To create an accurate prediction model using variables collected in widely available health administrative data records to identify hospitalizations for primary subarachnoid hemorrhage (SAH). Methods: A previously established complete cohort of consecutive primary SAH patients was combined with a random sample of control hospitalizations. Chi-square recursive partitioning was used to derive and internally validate a model to predict the probability that a patient had primary SAH (due to aneurysm or arteriovenous malformation) using health administrative data. Results: A total of 10,322 hospitalizations with 631 having primary SAH (6.1%) were included in the study (5,122 derivation, 5,200 validation). In the validation patients, our recursive partitioning algorithm had a sensitivity of 96.5% (95% confidence interval [CI] 93.9–98.0), a specificity of 99.8% (95% CI 99.6–99.9), and a positive likelihood ratio of 483 (95% CI 254–879). In this population, patients meeting criteria for the algorithm had a probability of 45% of truly having primary SAH. Conclusions: Routinely collected health administrative data can be used to accurately identify hospitalized patients with a high probability of having a primary SAH. This algorithm may allow, upon validation, an easy and accurate method to create validated cohorts of primary SAH from either ruptured aneurysm or arteriovenous malformation. PMID:27629096

  8. Demographic predictors of active tuberculosis in people migrating to British Columbia, Canada: a retrospective cohort study.

    PubMed

    Ronald, Lisa A; Campbell, Jonathon R; Balshaw, Robert F; Romanowski, Kamila; Roth, David Z; Marra, Fawziah; Cook, Victoria J; Johnston, James C

    2018-02-26

    Canadian tuberculosis (TB) guidelines recommend targeting postlanding screening for and treatment of latent tuberculosis infection (LTBI) in people migrating to Canada who are at increased risk for TB reactivation. Our objectives were to calculate robust longitudinal estimates of TB incidence in a cohort of people migrating to British Columbia, Canada, over a 29-year period, and to identify groups at highest risk of developing TB based on demographic characteristics at time of landing. We included all individuals ( n = 1 080 908) who became permanent residents of Canada between Jan. 1, 1985, and Dec. 31, 2012, and were resident in BC at any time between 1985 and 2013. Multiple administrative databases were linked to the provincial TB registry. We used recursive partitioning models to identify populations with high TB yield. Active TB was diagnosed in 2814 individuals (incidence rate 24.2/100 000 person-years). Demographic factors (live-in caregiver, family, refugee immigration classes; higher TB incidence in country of birth; and older age) were strong predictors of TB incidence in BC, with elevated rates continuing many years after entry into the cohort. Recursive partitioning identified refugees 18-64 years of age from countries with a TB incidence greater than 224/100 000 population as a high-yield group, with 1% developing TB within the first 10 years. These findings support recommendations in Canadian guidelines to target postlanding screening for and treatment of LTBI in adult refugees from high-incidence countries. Because high-yield populations can be identified at entry via demographic data, screening at this point may be practical and high-impact, particularly if the LTBI care cascade can be optimized. © 2018 Joule Inc. or its licensors.

  9. Demographic predictors of active tuberculosis in people migrating to British Columbia, Canada: a retrospective cohort study

    PubMed Central

    Ronald, Lisa A.; Campbell, Jonathon R.; Balshaw, Robert F.; Romanowski, Kamila; Roth, David Z.; Marra, Fawziah; Cook, Victoria J.; Johnston, James C.

    2018-01-01

    BACKGROUND: Canadian tuberculosis (TB) guidelines recommend targeting postlanding screening for and treatment of latent tuberculosis infection (LTBI) in people migrating to Canada who are at increased risk for TB reactivation. Our objectives were to calculate robust longitudinal estimates of TB incidence in a cohort of people migrating to British Columbia, Canada, over a 29-year period, and to identify groups at highest risk of developing TB based on demographic characteristics at time of landing. METHODS: We included all individuals (n = 1 080 908) who became permanent residents of Canada between Jan. 1, 1985, and Dec. 31, 2012, and were resident in BC at any time between 1985 and 2013. Multiple administrative databases were linked to the provincial TB registry. We used recursive partitioning models to identify populations with high TB yield. RESULTS: Active TB was diagnosed in 2814 individuals (incidence rate 24.2/100 000 person-years). Demographic factors (live-in caregiver, family, refugee immigration classes; higher TB incidence in country of birth; and older age) were strong predictors of TB incidence in BC, with elevated rates continuing many years after entry into the cohort. Recursive partitioning identified refugees 18–64 years of age from countries with a TB incidence greater than 224/100 000 population as a high-yield group, with 1% developing TB within the first 10 years. INTERPRETATION: These findings support recommendations in Canadian guidelines to target postlanding screening for and treatment of LTBI in adult refugees from high-incidence countries. Because high-yield populations can be identified at entry via demographic data, screening at this point may be practical and high-impact, particularly if the LTBI care cascade can be optimized. PMID:29483329

  10. Tumor Volume and Patient Weight as Predictors of Outcome in Children with Intermediate Risk Rhabdomyosarcoma (RMS): A Report from the Children’s Oncology Group

    PubMed Central

    Rodeberg, David A.; Stoner, Julie A.; Garcia-Henriquez, Norbert; Randall, R. Lor; Spunt, Sheri L.; Arndt, Carola A.; Kao, Simon; Paidas, Charles N.; Million, Lynn; Hawkins, Douglas S.

    2010-01-01

Background To compare tumor volume and patient weight vs. traditional factors of tumor diameter and patient age, to determine which parameters best discriminate outcome among intermediate risk RMS patients. Methods Complete patient information for non-metastatic RMS patients enrolled in the Children’s Oncology Group (COG) intermediate risk study D9803 (1999–2005) was available for 370 patients. The Kaplan-Meier method was used to estimate survival distributions. A recursive partitioning model was used to identify prognostic factors associated with event-free survival (EFS). Cox proportional hazards regression models were used to estimate the association between patient characteristics and the risk of failure or death. Results For all intermediate risk patients with RMS, a recursive partitioning algorithm for EFS suggests that prognostic groups should optimally be defined by tumor volume (transition point 20 cm³), weight (transition point 50 kg), and embryonal histology. Tumor volume and patient weight added significant outcome information to the standard prognostic factors including tumor diameter and age (p=0.02). The ability to resect the tumor completely was not significantly associated with the size of the patient, and patient weight did not significantly modify the association between tumor volume and EFS after adjustment for standard risk factors (p=0.2). Conclusion The factors most strongly associated with EFS were tumor volume, patient weight, and histology. Based on regression modeling, volume and weight are superior predictors of outcome compared to tumor diameter and patient age in children with intermediate risk RMS. Prognostic performance of tumor volume and patient weight should be assessed in an independent prospective study. PMID:24048802

  11. Identifying Emergency Department Patients at Low Risk for a Variceal Source of Upper Gastrointestinal Hemorrhage.

    PubMed

    Klein, Lauren R; Money, Joel; Maharaj, Kaveesh; Robinson, Aaron; Lai, Tarissa; Driver, Brian E

    2017-11-01

Assessing the likelihood of a variceal versus nonvariceal source of upper gastrointestinal bleeding (UGIB) guides therapy, but can be difficult to determine on clinical grounds. The objective of this study was to determine if there are easily ascertainable clinical and laboratory findings that can identify a patient as low risk for a variceal source of hemorrhage. This was a retrospective cohort study of adult ED patients with UGIB between January 2008 and December 2014 who had upper endoscopy performed during hospitalization. Clinical and laboratory data were abstracted from the medical record. The source of the UGIB was defined as variceal or nonvariceal based on endoscopic reports. Binary recursive partitioning was utilized to create a clinical decision rule. The rule was internally validated and test characteristics were calculated with 1,000 bootstrap replications. A total of 719 patients were identified; mean age was 55 years and 61% were male. There were 71 (10%) patients with a variceal UGIB identified on endoscopy. Binary recursive partitioning yielded a two-step decision rule (platelet count > 200 × 10⁹/L and an international normalized ratio [INR] < 1.3), which identified patients who were low risk for a variceal source of hemorrhage. For the bootstrapped samples, the rule performed with 97% sensitivity (95% confidence interval [CI] = 91%-100%) and 49% specificity (95% CI = 44%-53%). Although this derivation study must be externally validated before widespread use, patients presenting to the ED with an acute UGIB with platelet count of >200 × 10⁹/L and an INR of <1.3 may be at very low risk for a variceal source of their upper gastrointestinal hemorrhage. © 2017 by the Society for Academic Emergency Medicine.
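
    The two-variable rule above is exactly the kind of output a shallow classification tree produces. As a rough illustration (not the study's data or code), the sketch below fits a depth-2 CART tree to synthetic platelet/INR values; the column names, synthetic distributions, and class weighting are assumptions for demonstration only.

```python
# Illustrative sketch only: a depth-2 CART tree (binary recursive partitioning)
# recovering a two-variable clinical rule from synthetic platelet/INR data.
# Column names, data distributions and labels are assumptions, not study data.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 719
df = pd.DataFrame({
    "platelets": rng.normal(180, 80, n).clip(10, 600),  # x10^9/L
    "inr": rng.lognormal(0.1, 0.3, n).clip(0.8, 5.0),
})
# Hypothetical labels: variceal bleeds tend to have low platelets and high INR.
p = 1.0 / (1.0 + np.exp(0.03 * (df["platelets"] - 150) - 2.0 * (df["inr"] - 1.4)))
df["variceal"] = rng.binomial(1, (0.25 * p).to_numpy())

# max_depth=2 mirrors a two-step rule; class_weight favours sensitivity.
tree = DecisionTreeClassifier(max_depth=2, class_weight="balanced", random_state=0)
tree.fit(df[["platelets", "inr"]], df["variceal"])
print(export_text(tree, feature_names=["platelets", "inr"]))
```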

  12. Recursive grid partitioning on a cortical surface model: an optimized technique for the localization of implanted subdural electrodes.

    PubMed

    Pieters, Thomas A; Conner, Christopher R; Tandon, Nitin

    2013-05-01

Precise localization of subdural electrodes (SDEs) is essential for the interpretation of data from intracranial electrocorticography recordings. Blood and fluid accumulation underneath the craniotomy flap leads to a nonlinear deformation of the brain surface and of the SDE array on postoperative CT scans and adversely impacts the accurate localization of electrodes located underneath the craniotomy. Older methods that localize electrodes based on their identification on a postimplantation CT scan with coregistration to a preimplantation MR image can result in significant problems with accuracy of the electrode localization. The authors report 3 novel methods that rely on the creation of a set of 3D mesh models to depict the pial surface and a smoothed pial envelope. Two of these new methods are designed to localize electrodes, and they are compared with 6 methods currently in use to determine their relative accuracy and reliability. The first method involves manually localizing each electrode using digital photographs obtained at surgery. This is highly accurate, but requires time-intensive, operator-dependent input. The second uses 4 electrodes localized manually in conjunction with an automated, recursive partitioning technique to localize the entire electrode array. The authors evaluated the accuracy of previously published methods by applying the methods to their data and comparing them against the photograph-based localization. Finally, the authors further enhanced the usability of these methods by using automatic parcellation techniques to assign anatomical labels to individual electrodes as well as by generating an inflated cortical surface model while still preserving electrode locations relative to the cortical anatomy. The recursive grid partitioning had the least error compared with older methods (672 electrodes, 6.4-mm maximum electrode error, 2.0-mm mean error, p < 10⁻¹⁸). The maximum errors derived using prior methods of localization ranged from 8.2 to 11.7 mm for an individual electrode, with mean errors ranging between 2.9 and 4.1 mm depending on the method used. The authors also noted a larger error in all methods that used CT scans alone to localize electrodes compared with those that used both postoperative CT and postoperative MRI. The large mean errors reported with these methods are liable to affect intermodal data comparisons (for example, with functional mapping techniques) and may impact surgical decision making. The authors have presented several aspects of using new techniques to visualize electrodes implanted for localizing epilepsy. The ability to use automated labeling schemas to denote which gyrus a particular electrode overlies is potentially of great utility in planning resections and in corroborating the results of extraoperative stimulation mapping. Dilation of the pial mesh model provides, for the first time, a sense of the cortical surface not sampled by the electrode, and the potential roles this "electrophysiologically hidden" cortex may play in both eloquent function and seizure onset.

  13. Stereotactic radiosurgery alone versus resection plus whole-brain radiotherapy for 1 or 2 brain metastases in recursive partitioning analysis class 1 and 2 patients.

    PubMed

    Rades, Dirk; Bohlen, Guenther; Pluemer, Andre; Veninga, Theo; Hanssens, Patrick; Dunst, Juergen; Schild, Steven E

    2007-06-15

The objective of this study was to compare stereotactic radiosurgery (SRS) alone with resection plus whole-brain radiotherapy (WBRT) for the treatment of patients in recursive partitioning analysis (RPA) class 1 and 2 who had 1 or 2 brain metastases. Two hundred six patients in RPA class 1 and 2 who had 1 or 2 brain metastases were analyzed retrospectively. Patients in Group A (n = 94) received from 18 grays (Gy) to 25 Gy SRS, and patients in Group B (n = 112) underwent resection of their metastases and received 10 x 3 Gy/20 x 2 Gy WBRT. Eight other potential prognostic factors were evaluated regarding overall survival (OS), brain control (BC), and local control (LC) of treated metastases: age, sex, performance status, tumor type, number of brain metastases, extracranial metastases, RPA class, and interval from tumor diagnosis to treatment of brain metastases. A comparison of the 2 treatment groups did not reveal significantly different OS (P = .19), BC (P = .52), or LC (P = .25). In RPA subgroup analyses, outcome also did not differ significantly for either RPA class of patients (P values from .21 to .83). On multivariate analysis, improved OS was associated with age ≤60 years (relative risk [RR], 1.75; P = .002), better performance status (RR, 1.67; P = .015), no extracranial metastases (RR, 2.84; P < .001), interval from tumor diagnosis to treatment >12 months (RR, 1.70; P = .003), and RPA class 1 (RR, 1.51; P = .016). Improved BC was associated with a single metastasis (RR, 1.54; P = .034) and an interval from tumor diagnosis to treatment >12 months (RR, 1.58; P = .019), and improved LC was associated with an interval from tumor diagnosis to treatment >12 months (RR, 1.59; P = .047). SRS alone appeared to be as effective as resection plus WBRT in the treatment of 1 or 2 brain metastases for patients in RPA class 1 and 2. Patient outcomes were associated with age, Karnofsky performance status, number of brain metastases, extracranial metastases, RPA class, and interval from tumor diagnosis to treatment. Copyright 2007 American Cancer Society.

  14. Patellar segmentation from 3D magnetic resonance images using guided recursive ray-tracing for edge pattern detection

    NASA Astrophysics Data System (ADS)

    Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.

    2016-03-01

The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search space outlines are incorporated into recursive ray-tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to between an outer and inner search region. A section-based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than the current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) agreement between the auto-segmentation and ground truth surfaces.

  15. Generalized Path Analysis and Generalized Simultaneous Equations Model for Recursive Systems with Responses of Mixed Types

    ERIC Educational Resources Information Center

    Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang

    2006-01-01

    This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…

  16. Essential energy space random walks to accelerate molecular dynamics simulations: Convergence improvements via an adaptive-length self-healing strategy

    NASA Astrophysics Data System (ADS)

    Zheng, Lianqing; Yang, Wei

    2008-07-01

Recently, the accelerated molecular dynamics (AMD) technique was generalized to realize essential energy space random walks so that further sampling enhancement and effective localized enhanced sampling could be achieved. This method is especially meaningful when the essential coordinates of the target events are not known a priori; moreover, the energy space metadynamics method was also introduced so that biasing free energy functions can be robustly generated. Despite the promising features of this method, due to the nonequilibrium nature of the metadynamics recursion, it is challenging to rigorously use the data obtained at the recursion stage to perform equilibrium analysis, such as free energy surface mapping; therefore, a large amount of data has to be discarded. To resolve this problem and further improve simulation convergence, as promised in our original paper, we are reporting an alternate approach: the adaptive-length self-healing (ALSH) strategy for AMD simulations; this development is based on a recent self-healing umbrella sampling method. Here, the unit simulation length for each self-healing recursion is increasingly updated based on the Wang-Landau flattening judgment. When the unit simulation length for each update is long enough, all the following unit simulations naturally run into the equilibrium regime. Thereafter, these unit simulations can serve the dual purposes of recursion and equilibrium analysis. As demonstrated in our model studies, applying ALSH strikes a compromise between fast recursion and little waste of nonequilibrium data. As a result, combining all the data obtained from all the unit simulations that are in the equilibrium regime via the weighted histogram analysis method, efficient convergence can be robustly ensured, especially for the purpose of free energy surface mapping.

  17. A Framework for Parallel Unstructured Grid Generation for Complex Aerodynamic Simulations

    NASA Technical Reports Server (NTRS)

    Zagaris, George; Pirzadeh, Shahyar Z.; Chrisochoides, Nikos

    2009-01-01

A framework for parallel unstructured grid generation targeting both shared memory multi-processors and distributed memory architectures is presented. The two fundamental building-blocks of the framework consist of: (1) the Advancing-Partition (AP) method used for domain decomposition and (2) the Advancing Front (AF) method used for mesh generation. Starting from the surface mesh of the computational domain, the AP method is applied recursively to generate a set of sub-domains. Next, the sub-domains are meshed in parallel using the AF method. The recursive nature of domain decomposition naturally maps to a divide-and-conquer algorithm which exhibits inherent parallelism. For the parallel implementation, the Master/Worker pattern is employed to dynamically balance the varying workloads of each task on the set of available CPUs. Performance results obtained with this approach are presented and discussed in detail, along with future work and improvements.

  18. A clinical analysis of brain metastasis in gynecologic cancer: a retrospective multi-institute analysis.

    PubMed

    Kim, Young Zoon; Kwon, Jae Hyun; Lim, Soyi

    2015-01-01

This study analyzes the clinical characteristics of brain metastasis (BM) from gynecologic cancer based on the type of cancer. In addition, the study examines the factors influencing survival. A total of 61 BM patients with gynecologic cancer were analyzed retrospectively from January 2000 to December 2012 in terms of clinical and radiological characteristics by using medical and radiological records from three university hospitals. There were 19 (31.1%) uterine cancers, 32 (52.5%) ovarian cancers, and 10 (16.4%) cervical cancers. The mean interval to BM was 25.4 months (21.6 months in ovarian cancer, 27.8 months in uterine cancer, and 33.1 months in cervical cancer). The mean survival from BM was 16.7 months (14.1 months in ovarian cancer, 23.3 months in uterine cancer, and 8.8 months in cervical cancer). According to a multivariate analysis of factors influencing survival, type of primary cancer, Karnofsky performance score, status of primary cancer, recursive partitioning analysis class, and treatment modality, particularly combined therapies, were significantly related to overall survival. These results suggest that, in addition to traditional prognostic factors in BM, multiple treatment methods such as neurosurgery and combined chemoradiotherapy may play an important role in prolonging survival for BM patients with gynecologic cancer.

  19. Development of a recursion RNG-based turbulence model

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Vahala, George; Thangam, S.

    1993-01-01

Reynolds stress closure models based on the recursion renormalization group theory are developed for the prediction of turbulent separated flows. The proposed model uses a finite wavenumber truncation scheme to account for the spectral distribution of energy. In particular, the model incorporates effects of both local and nonlocal interactions. The nonlocal interactions are shown to yield a contribution identical to that from the epsilon-renormalization group (RNG), while the local interactions introduce higher order dispersive effects. A formal analysis of the model is presented and its ability to accurately predict separated flows is analyzed from a combined theoretical and computational standpoint. Turbulent flow past a backward-facing step is chosen as a test case and the results obtained from detailed computations demonstrate that the proposed recursion-RNG model with finite cut-off wavenumber can yield very good predictions for the backstep problem.

  20. Methods for assessing movement path recursion with application to African buffalo in South Africa

    USGS Publications Warehouse

    Bar-David, S.; Bar-David, I.; Cross, P.C.; Ryan, S.J.; Knechtel, C.U.; Getz, W.M.

    2009-01-01

Recent developments of automated methods for monitoring animal movement, e.g., global positioning systems (GPS) technology, yield high-resolution spatiotemporal data. To gain insights into the processes creating movement patterns, we present two new techniques for extracting information from these data on repeated visits to a particular site or patch ("recursions"). Identification of such patches and quantification of recursion pathways, when combined with patch-related ecological data, should contribute to our understanding of the habitat requirements of large herbivores, of factors governing their space-use patterns, and their interactions with the ecosystem. We begin by presenting output from a simple spatial model that simulates movements of large-herbivore groups based on minimal parameters: resource availability and rates of resource recovery after a local depletion. We then present the details of our new techniques of analyses (recursion analysis and circle analysis) and apply them to data generated by our model, as well as two sets of empirical data on movements of African buffalo (Syncerus caffer): the first collected in Klaserie Private Nature Reserve and the second in Kruger National Park, South Africa. Our recursion analyses of model outputs provide us with a basis for inferring aspects of the processes governing the production of buffalo recursion patterns, particularly the potential influence of resource recovery rate. Although the focus of our simulations was a comparison of movement patterns produced by different resource recovery rates, we conclude our paper with a comprehensive discussion of how recursion analyses can be used when appropriate ecological data are available to elucidate various factors influencing movement. Inter alia, these include the various limiting and preferred resources, parasites, and topographical and landscape factors. © 2009 by the Ecological Society of America.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kishi, Takahiro; Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.jp; Ueki, Nami

Purpose: This study aimed to evaluate the prognostic significance of the modified Glasgow Prognostic Score (mGPS) in patients with non-small cell lung cancer (NSCLC) who received stereotactic body radiation therapy (SBRT). Methods and Materials: Data from 165 patients who underwent SBRT for stage I NSCLC with histologic confirmation from January 1999 to September 2010 were collected retrospectively. Factors, including age, performance status, histology, Charlson comorbidity index, mGPS, and recursive partitioning analysis (RPA) class based on sex and T stage, were evaluated with regard to overall survival (OS) using the Cox proportional hazards model. The impact of the mGPS on cause of death and failure patterns was also analyzed. Results: The 3-year OS was 57.9%, with a median follow-up time of 3.5 years. A higher mGPS correlated significantly with poor OS (P<.001). The 3-year OS of lower mGPS patients was 66.4%, whereas that of higher mGPS patients was 44.5%. On multivariate analysis, mGPS and RPA class were significant factors for OS. A higher mGPS correlated significantly with lung cancer death (P=.019) and distant metastasis (P=.013). Conclusions: The mGPS was a significant predictor of clinical outcomes for SBRT in NSCLC patients.
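
    As a hedged illustration of the type of analysis described (Cox proportional hazards for OS with mGPS and RPA class as covariates), the sketch below uses the lifelines package on synthetic data; the column names, covariate coding, and survival-time generation are assumptions, not the study's records.

```python
# Hedged sketch: Cox proportional hazards for overall survival with mGPS and
# RPA class as covariates, using the lifelines package on synthetic data.
# All column names, codings and survival times are illustrative assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 165
df = pd.DataFrame({
    "mgps": rng.integers(0, 3, n),       # modified Glasgow Prognostic Score (0-2)
    "rpa_class": rng.integers(1, 3, n),  # RPA class based on sex and T stage
    "age": rng.normal(76, 7, n),
})
# Hypothetical survival times: higher mGPS and RPA class shorten survival.
hazard = 0.2 * np.exp(0.5 * df["mgps"] + 0.3 * (df["rpa_class"] - 1))
df["os_years"] = rng.exponential((1.0 / hazard).to_numpy())
df["event"] = rng.binomial(1, 0.7, n)    # 1 = death observed, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="os_years", event_col="event")
cph.print_summary()
```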

  2. Scalable detection of statistically significant communities and hierarchies, using message passing for modularity

    PubMed Central

    Zhang, Pan; Moore, Cristopher

    2014-01-01

Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions, with almost the same modularity, that are poorly correlated with each other. It can also produce illusory "communities" in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian at finite temperature and using an efficient belief propagation algorithm to obtain the consensus of many partitions with high modularity, rather than looking for a single partition that maximizes it. We show analytically and numerically that the proposed algorithm works all of the way down to the detectability transition in networks generated by the stochastic block model. It also performs well on real-world networks, revealing large communities in some networks where previous work has claimed no communities exist. Finally we show that by applying our algorithm recursively, subdividing communities until no statistically significant subcommunities can be found, we can detect hierarchical structure in real-world networks more efficiently than previous methods. PMID:25489096

  3. Recursion Removal as an Instructional Method to Enhance the Understanding of Recursion Tracing

    ERIC Educational Resources Information Center

    Velázquez-Iturbide, J. Ángel; Castellanos, M. Eugenia; Hijón-Neira, Raquel

    2016-01-01

    Recursion is one of the most difficult programming topics for students. In this paper, an instructional method is proposed to enhance students' understanding of recursion tracing. The proposal is based on the use of rules to translate linear recursion algorithms into equivalent, iterative ones. The paper has two main contributions: the…

  4. A hybrid video codec based on extended block sizes, recursive integer transforms, improved interpolation, and flexible motion representation

    NASA Astrophysics Data System (ADS)

    Karczewicz, Marta; Chen, Peisong; Joshi, Rajan; Wang, Xianglin; Chien, Wei-Jung; Panchal, Rahul; Coban, Muhammed; Chong, In Suk; Reznik, Yuriy A.

    2011-01-01

This paper describes the video coding technology proposal submitted by Qualcomm Inc. in response to a joint call for proposals (CfP) issued by ITU-T SG16 Q.6 (VCEG) and ISO/IEC JTC1/SC29/WG11 (MPEG) in January 2010. The proposed video codec follows a hybrid coding approach based on temporal prediction, followed by transform, quantization, and entropy coding of the residual. Some of its key features are extended block sizes (up to 64x64), recursive integer transforms, single-pass switched interpolation filters with offsets (single-pass SIFO), mode-dependent directional transform (MDDT) for intra-coding, luma and chroma high-precision filtering, geometry motion partitioning, and adaptive motion vector resolution. It also incorporates internal bit-depth increase (IBDI) and modified quadtree-based adaptive loop filtering (QALF). Simulation results are presented for a variety of bit rates, resolutions, and coding configurations to demonstrate the high compression efficiency achieved by the proposed video codec at a moderate level of encoding and decoding complexity. For the random access hierarchical B configuration (HierB), the proposed video codec achieves an average BD-rate reduction of 30.88% compared to the H.264/AVC alpha anchor. For the low-delay hierarchical P (HierP) configuration, the proposed video codec achieves average BD-rate reductions of 32.96% and 48.57%, compared to the H.264/AVC beta and gamma anchors, respectively.

  5. HARP: A Dynamic Inertial Spectral Partitioner

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; Sohn, Andrew; Biswas, Rupak

    1997-01-01

Partitioning unstructured graphs is central to the parallel solution of computational science and engineering problems. Spectral partitioners, such as recursive spectral bisection (RSB), have proven effective in generating high-quality partitions of realistically sized meshes. The major problem that hindered their widespread use was their long execution times. This paper presents a new inertial spectral partitioner, called HARP. The main objective of the proposed approach is to quickly partition the meshes at runtime in a manner that works efficiently for real applications in the context of distributed-memory machines. The underlying principle of HARP is to find the eigenvectors of the unpartitioned vertices and then project them onto the eigenvectors of the original mesh. Results for various meshes ranging in size from 1000 to 100,000 vertices indicate that HARP can indeed partition meshes rapidly at runtime. Experimental results show that our largest mesh can be partitioned sequentially in only a few seconds on an SP2, which is several times faster than other spectral partitioners, while maintaining the solution quality of the proven RSB method. A parallel version of HARP has also been implemented on the IBM SP2 and Cray T3E. Parallel HARP, running on 64 processors of the SP2 and T3E, can partition a mesh containing more than 100,000 vertices into 64 subgrids in about half a second. These results indicate that graph partitioning can now be truly embedded in dynamically changing real-world applications.
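
    For readers unfamiliar with the baseline method, the following is a minimal sketch of plain recursive spectral bisection (the RSB approach HARP is compared against), not of HARP itself; it uses a dense eigendecomposition on a toy ring graph, whereas production partitioners rely on sparse Lanczos-type solvers.

```python
# Toy sketch of classic recursive spectral bisection (RSB), the baseline method
# HARP is compared against. Dense eigendecomposition only; real partitioners
# use sparse Lanczos-type solvers. The ring graph is an illustrative example.
import numpy as np

def fiedler_split(nodes, adj):
    """Split `nodes` into two halves ordered by the Fiedler vector."""
    A = adj[np.ix_(nodes, nodes)]
    L = np.diag(A.sum(axis=1)) - A            # graph Laplacian of the subgraph
    _, vecs = np.linalg.eigh(L)
    order = np.argsort(vecs[:, 1])            # second-smallest eigenvector
    half = len(nodes) // 2
    return [nodes[i] for i in order[:half]], [nodes[i] for i in order[half:]]

def recursive_spectral_bisection(nodes, adj, levels):
    if levels == 0 or len(nodes) <= 1:
        return [nodes]
    left, right = fiedler_split(nodes, adj)
    return (recursive_spectral_bisection(left, adj, levels - 1)
            + recursive_spectral_bisection(right, adj, levels - 1))

# Example: a ring of 8 vertices partitioned into 4 subdomains.
n = 8
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
print(recursive_spectral_bisection(list(range(n)), adj, levels=2))
```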

  6. Is recursion language-specific? Evidence of recursive mechanisms in the structure of intentional action.

    PubMed

    Vicari, Giuseppe; Adenzato, Mauro

    2014-05-01

    In their 2002 seminal paper Hauser, Chomsky and Fitch hypothesize that recursion is the only human-specific and language-specific mechanism of the faculty of language. While debate focused primarily on the meaning of recursion in the hypothesis and on the human-specific and syntax-specific character of recursion, the present work focuses on the claim that recursion is language-specific. We argue that there are recursive structures in the domain of motor intentionality by way of extending John R. Searle's analysis of intentional action. We then discuss evidence from cognitive science and neuroscience supporting the claim that motor-intentional recursion is language-independent and suggest some explanatory hypotheses: (1) linguistic recursion is embodied in sensory-motor processing; (2) linguistic and motor-intentional recursions are distinct and mutually independent mechanisms. Finally, we propose some reflections about the epistemic status of HCF as presenting an empirically falsifiable hypothesis, and on the possibility of testing recursion in different cognitive domains. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Towards rigorous analysis of the Levitov-Mirlin-Evers recursion

    NASA Astrophysics Data System (ADS)

    Fyodorov, Y. V.; Kupiainen, A.; Webb, C.

    2016-12-01

This paper aims to develop a rigorous asymptotic analysis of an approximate renormalization group recursion for inverse participation ratios P_q of critical power-law random band matrices. The recursion goes back to the work by Mirlin and Evers (2000 Phys. Rev. B 62 7920) and earlier works by Levitov (1990 Phys. Rev. Lett. 64 547, 1999 Ann. Phys. 8 697-706) and is aimed at describing the ensuing multifractality of the eigenvectors of such matrices. We point out both similarities and dissimilarities between the LME recursion and those appearing in the theory of multiplicative cascades and branching random walks and show that the methods developed in those fields can be adapted to the present case. In particular the LME recursion is shown to exhibit a phase transition, which we expect is a freezing transition, where the role of temperature is played by the exponent q. However, the LME recursion has features that make its rigorous analysis considerably harder and we point out several open problems for further study.

  8. Three list scheduling temporal partitioning algorithm of time space characteristic analysis and compare for dynamic reconfigurable computing

    NASA Astrophysics Data System (ADS)

    Chen, Naijin

    2013-03-01

In this paper, the Level Based Partitioning (LBP), Cluster Based Partitioning (CBP), and Enhanced Static List (ESL) temporal partitioning algorithms, based on adjacency matrices and adjacency tables, are designed and implemented. The partitioning time and memory occupation of the three algorithms are also compared. Experimental results show that the LBP algorithm has the shortest partitioning time and better parallelism; with respect to memory occupation and partitioning time, the algorithms based on adjacency tables require less partitioning time and less memory.

  9. Improved Accuracy Using Recursive Bayesian Estimation Based Language Model Fusion in ERP-Based BCI Typing Systems

    PubMed Central

    Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.

    2013-01-01

RSVP Keyboard™ is an electroencephalography (EEG) based brain computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye gaze control. Existing BCI typing systems that use event-related potentials (ERPs) in EEG suffer from low accuracy due to a low signal-to-noise ratio. Hence, RSVP Keyboard™ utilizes context-based decision making, incorporating a language model to improve the accuracy of letter decisions. To further improve the contributions of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis, which compares it with the existing naïve Bayesian fusion approach. The results indicate the superiority of the recursive Bayesian fusion, and in the next generation of RSVP Keyboard™ we plan to incorporate this new approach. PMID:23366432

  10. Orthogonal recursive bisection as data decomposition strategy for massively parallel cardiac simulations.

    PubMed

    Reumann, Matthias; Fitch, Blake G; Rayshubskiy, Aleksandr; Pitman, Michael C; Rice, John J

    2011-06-01

We present the orthogonal recursive bisection algorithm that hierarchically segments the anatomical model structure into subvolumes that are distributed to cores. The anatomy is derived from the Visible Human Project, with electrophysiology based on the FitzHugh-Nagumo (FHN) and ten Tusscher (TT04) models with monodomain diffusion. Benchmark simulations with up to 16,384 and 32,768 cores on IBM Blue Gene/P and L supercomputers for both the FHN and TT04 models show good load balancing with almost perfect speedup factors that are close to linear with the number of cores. Hence, strong scaling is demonstrated. With 32,768 cores, a 1000 ms simulation of a full heart beat requires about 6.5 min of wall clock time for a simulation of the FHN model. For the largest machine partitions, the simulations execute at a rate of 0.548 s (BG/P) and 0.394 s (BG/L) of wall clock time per 1 ms of simulation time. To our knowledge, these simulations show strong scaling to substantially higher numbers of cores than reported previously for organ-level simulation of the heart, thus significantly reducing run times. The ability to reduce run times could play a critical role in enabling wider use of cardiac models in research and clinical applications.
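
    A minimal geometric sketch of orthogonal recursive bisection is given below: a 3-D point set stands in for the anatomical voxels, and each level splits along the longest axis at the median so the resulting subvolumes carry roughly equal load. The point data and load metric are illustrative assumptions; the paper's decomposition operates on the full anatomical model.

```python
# Minimal sketch of orthogonal recursive bisection (ORB): split a 3-D point set
# into 2**depth subvolumes, cutting along the longest axis at the median so
# each part holds a near-equal share. Points stand in for anatomical voxels.
import numpy as np

def orb(points, depth):
    """Return a list of point arrays, one per subvolume."""
    if depth == 0 or len(points) <= 1:
        return [points]
    extents = points.max(axis=0) - points.min(axis=0)
    axis = int(np.argmax(extents))            # cut along the longest axis
    order = np.argsort(points[:, axis])
    half = len(points) // 2                   # median cut balances the load
    return orb(points[order[:half]], depth - 1) + orb(points[order[half:]], depth - 1)

rng = np.random.default_rng(0)
voxels = rng.random((10_000, 3))              # synthetic stand-in "anatomy"
parts = orb(voxels, depth=4)                  # 16 subvolumes for 16 "cores"
print([len(p) for p in parts])                # near-equal counts per subvolume
```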

  11. A mean field neural network for hierarchical module placement

    NASA Technical Reports Server (NTRS)

    Unaltuna, M. Kemal; Pitchumani, Vijay

    1992-01-01

    This paper proposes a mean field neural network for the two-dimensional module placement problem. An efficient coding scheme with only O(N log N) neurons is employed where N is the number of modules. The neurons are evolved in groups of N in log N iteration steps such that the circuit is recursively partitioned in alternating vertical and horizontal directions. In our simulations, the network was able to find optimal solutions to all test problems with up to 128 modules.

  12. A CONCISE PANEL OF BIOMARKERS IDENTIFIES NEUROCOGNITIVE FUNCTIONING CHANGES IN HIV-INFECTED INDIVIDUALS

    PubMed Central

    Marcotte, Thomas D.; Deutsch, Reena; Michael, Benedict Daniel; Franklin, Donald; Cookson, Debra Rosario; Bharti, Ajay R.; Grant, Igor; Letendre, Scott L.

    2013-01-01

    Background Neurocognitive (NC) impairment (NCI) occurs commonly in people living with HIV. Despite substantial effort, no biomarkers have been sufficiently validated for diagnosis and prognosis of NCI in the clinic. The goal of this project was to identify diagnostic or prognostic biomarkers for NCI in a comprehensively characterized HIV cohort. Methods Multidisciplinary case review selected 98 HIV-infected individuals and categorized them into four NC groups using normative data: stably normal (SN), stably impaired (SI), worsening (Wo), or improving (Im). All subjects underwent comprehensive NC testing, phlebotomy, and lumbar puncture at two timepoints separated by a median of 6.2 months. Eight biomarkers were measured in CSF and blood by immunoassay. Results were analyzed using mixed model linear regression and staged recursive partitioning. Results At the first visit, subjects were mostly middle-aged (median 45) white (58%) men (84%) who had AIDS (70%). Of the 73% who took antiretroviral therapy (ART), 54% had HIV RNA levels below 50 c/mL in plasma. Mixed model linear regression identified that only MCP-1 in CSF was associated with neurocognitive change group. Recursive partitioning models aimed at diagnosis (i.e., correctly classifying neurocognitive status at the first visit) were complex and required most biomarkers to achieve misclassification limits. In contrast, prognostic models were more efficient. A combination of three biomarkers (sCD14, MCP-1, SDF-1α) correctly classified 82% of Wo and SN subjects, including 88% of SN subjects. A combination of two biomarkers (MCP-1, TNF-α) correctly classified 81% of Im and SI subjects, including 100% of SI subjects. Conclusions This analysis of well-characterized individuals identified concise panels of biomarkers associated with NC change. Across all analyses, the two most frequently identified biomarkers were sCD14 and MCP-1, indicators of monocyte/macrophage activation. While the panels differed depending on the outcome and on the degree of misclassification, nearly all stable patients were correctly classified. PMID:24101401

  13. Automatic Tortuosity-Based Retinopathy of Prematurity Screening System

    NASA Astrophysics Data System (ADS)

    Sukkaew, Lassada; Uyyanonvara, Bunyarit; Makhanov, Stanislav S.; Barman, Sarah; Pangputhipong, Pannet

    Retinopathy of Prematurity (ROP) is an infant disease characterized by increased dilation and tortuosity of the retinal blood vessels. Automatic tortuosity evaluation from retinal digital images is very useful to facilitate an ophthalmologist in the ROP screening and to prevent childhood blindness. This paper proposes a method to automatically classify the image into tortuous and non-tortuous. The process imitates expert ophthalmologists' screening by searching for clearly tortuous vessel segments. First, a skeleton of the retinal blood vessels is extracted from the original infant retinal image using a series of morphological operators. Next, we propose to partition the blood vessels recursively using an adaptive linear interpolation scheme. Finally, the tortuosity is calculated based on the curvature of the resulting vessel segments. The retinal images are then classified into two classes using segments characterized by the highest tortuosity. For an optimal set of training parameters the prediction is as high as 100%.
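
    The sketch below is a rough, simplified illustration of the recursive-partitioning step only: a centerline is split wherever it deviates from a straight chord, and tortuosity is then scored per segment as arc length over chord length. The paper's adaptive linear interpolation and curvature-based measure are more elaborate; the synthetic centerline and tolerance are assumptions.

```python
# Rough sketch: recursively partition a vessel centerline wherever it deviates
# from a straight chord, then score tortuosity per segment as arc length over
# chord length. The synthetic centerline and tolerance are assumptions; the
# paper's adaptive interpolation and curvature measure are more elaborate.
import numpy as np

def split_recursively(pts, tol):
    if len(pts) <= 2:
        return [pts]
    chord = pts[-1] - pts[0]
    norm = np.linalg.norm(chord)
    if norm == 0:
        return [pts]
    dx, dy = chord / norm
    rel = pts - pts[0]
    dist = np.abs(rel[:, 0] * dy - rel[:, 1] * dx)  # perpendicular distance to chord
    k = int(np.argmax(dist))
    if dist[k] < tol:
        return [pts]                                # segment is nearly straight
    return split_recursively(pts[:k + 1], tol) + split_recursively(pts[k:], tol)

def tortuosity(seg):
    arc = np.sum(np.linalg.norm(np.diff(seg, axis=0), axis=1))
    return arc / max(np.linalg.norm(seg[-1] - seg[0]), 1e-9)

t = np.linspace(0, 4 * np.pi, 200)
centerline = np.column_stack([10 * t, 5 * np.sin(t)])   # synthetic wavy vessel
segments = split_recursively(centerline, tol=2.0)
print(max(tortuosity(s) for s in segments))             # highest segment tortuosity
```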

  14. Structural Group-based Auditing of Missing Hierarchical Relationships in UMLS

    PubMed Central

    Chen, Yan; Gu, Huanying(Helen); Perl, Yehoshua; Geller, James

    2009-01-01

    The Metathesaurus of the UMLS was created by integrating various source terminologies. The inter-concept relationships were either integrated into the UMLS from the source terminologies or specially generated. Due to the extensive size and inherent complexity of the Metathesaurus, the accidental omission of some hierarchical relationships was inevitable. We present a recursive procedure which allows a human expert, with the support of an algorithm, to locate missing hierarchical relationships. The procedure starts with a group of concepts with exactly the same (correct) semantic type assignments. It then partitions the concepts, based on child-of hierarchical relationships, into smaller, singly rooted, hierarchically connected subgroups. The auditor only needs to focus on the subgroups with very few concepts and their concepts with semantic type reassignments. The procedure was evaluated by comparing it with a comprehensive manual audit and it exhibits a perfect error recall. PMID:18824248

  15. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system and compares a steady-state analysis-based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.

  16. Deep Learning Accurately Predicts Estrogen Receptor Status in Breast Cancer Metabolomics Data.

    PubMed

    Alakwaa, Fadhl M; Chaudhary, Kumardeep; Garmire, Lana X

    2018-01-05

Metabolomics holds promise as a new technology to diagnose highly heterogeneous diseases. Conventionally, metabolomics data analysis for diagnosis is done using various statistical and machine learning based classification methods. However, it remains unknown whether deep neural networks, a class of increasingly popular machine learning methods, are suitable for classifying metabolomics data. Here we use a cohort of 271 breast cancer tissues, 204 estrogen receptor positive (ER+) and 67 estrogen receptor negative (ER-), to test the accuracies of feed-forward networks, a deep learning (DL) framework, as well as six widely used machine learning models, namely random forest (RF), support vector machines (SVM), recursive partitioning and regression trees (RPART), linear discriminant analysis (LDA), prediction analysis for microarrays (PAM), and generalized boosted models (GBM). The DL framework has the highest area under the curve (AUC) of 0.93 in classifying ER+/ER- patients, compared to the other six machine learning algorithms. Furthermore, the biological interpretation of the first hidden layer reveals eight commonly enriched significant metabolomics pathways (adjusted P-value <0.05) that cannot be discovered by the other machine learning methods. Among them, the protein digestion and absorption and ATP-binding cassette (ABC) transporter pathways are also confirmed in an integrated analysis between metabolomics and gene expression data in these samples. In summary, the deep learning method shows advantages for metabolomics-based breast cancer ER status classification, with both the highest prediction accuracy (AUC = 0.93) and better revelation of disease biology. We encourage the adoption of feed-forward network based deep learning methods in the metabolomics research community for classification.
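
    A small sketch of the model-comparison workflow (a recursive-partitioning tree versus random forest and SVM, scored by cross-validated AUC) is shown below on synthetic stand-ins for the metabolomics features; it does not reproduce the paper's deep learning model, data, or reported AUC.

```python
# Sketch of the comparison workflow only: recursive-partitioning tree vs. RF vs.
# SVM, scored by cross-validated AUC on synthetic stand-ins for the metabolomics
# features. The deep learning model and the real data are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=271, n_features=100, n_informative=15,
                           weights=[0.75, 0.25], random_state=0)
models = {
    "RPART-like tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "SVM (RBF)": SVC(probability=True, random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```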

  17. A Method to Predict the Structure and Stability of RNA/RNA Complexes.

    PubMed

    Xu, Xiaojun; Chen, Shi-Jie

    2016-01-01

    RNA/RNA interactions are essential for genomic RNA dimerization and regulation of gene expression. Intermolecular loop-loop base pairing is a widespread and functionally important tertiary structure motif in RNA machinery. However, computational prediction of intermolecular loop-loop base pairing is challenged by the entropy and free energy calculation due to the conformational constraint and the intermolecular interactions. In this chapter, we describe a recently developed statistical mechanics-based method for the prediction of RNA/RNA complex structures and stabilities. The method is based on the virtual bond RNA folding model (Vfold). The main emphasis in the method is placed on the evaluation of the entropy and free energy for the loops, especially tertiary kissing loops. The method also uses recursive partition function calculations and two-step screening algorithm for large, complicated structures of RNA/RNA complexes. As case studies, we use the HIV-1 Mal dimer and the siRNA/HIV-1 mutant (T4) to illustrate the method.

  18. On recursion.

    PubMed

    Watumull, Jeffrey; Hauser, Marc D; Roberts, Ian G; Hornstein, Norbert

    2014-01-08

It is a truism that conceptual understanding of a hypothesis is required for its empirical investigation. However, the concept of recursion as articulated in the context of linguistic analysis has been perennially confused. Nowhere has this been more evident than in attempts to critique and extend Hauser et al.'s (2002) articulation. These authors put forward the hypothesis that what is uniquely human and unique to the faculty of language, the faculty of language in the narrow sense (FLN), is a recursive system that generates and maps syntactic objects to conceptual-intentional and sensory-motor systems. This thesis was based on the standard mathematical definition of recursion as understood by Gödel and Turing, and yet has commonly been interpreted in other ways, most notably and incorrectly as a thesis about the capacity for syntactic embedding. As we explain, the recursiveness of a function is defined independent of such output, whether infinite or finite, embedded or unembedded, existent or non-existent. And to the extent that embedding is a sufficient, though not necessary, diagnostic of recursion, it has not been established that the apparent restriction on embedding in some languages is of any theoretical import. Misunderstanding of these facts has generated research that is often irrelevant to the FLN thesis as well as to other theories of language competence that focus on its generative power of expression. This essay is an attempt to bring conceptual clarity to such discussions as well as to future empirical investigations by explaining three criterial properties of recursion: computability (i.e., rules in intension rather than lists in extension); definition by induction (i.e., rules strongly generative of structure); and mathematical induction (i.e., rules for the principled, and potentially unbounded, expansion of strongly generated structure). By these necessary and sufficient criteria, the grammars of all natural languages are recursive.

  19. On recursion

    PubMed Central

    Watumull, Jeffrey; Hauser, Marc D.; Roberts, Ian G.; Hornstein, Norbert

    2014-01-01

It is a truism that conceptual understanding of a hypothesis is required for its empirical investigation. However, the concept of recursion as articulated in the context of linguistic analysis has been perennially confused. Nowhere has this been more evident than in attempts to critique and extend Hauser et al.'s (2002) articulation. These authors put forward the hypothesis that what is uniquely human and unique to the faculty of language—the faculty of language in the narrow sense (FLN)—is a recursive system that generates and maps syntactic objects to conceptual-intentional and sensory-motor systems. This thesis was based on the standard mathematical definition of recursion as understood by Gödel and Turing, and yet has commonly been interpreted in other ways, most notably and incorrectly as a thesis about the capacity for syntactic embedding. As we explain, the recursiveness of a function is defined independent of such output, whether infinite or finite, embedded or unembedded—existent or non-existent. And to the extent that embedding is a sufficient, though not necessary, diagnostic of recursion, it has not been established that the apparent restriction on embedding in some languages is of any theoretical import. Misunderstanding of these facts has generated research that is often irrelevant to the FLN thesis as well as to other theories of language competence that focus on its generative power of expression. This essay is an attempt to bring conceptual clarity to such discussions as well as to future empirical investigations by explaining three criterial properties of recursion: computability (i.e., rules in intension rather than lists in extension); definition by induction (i.e., rules strongly generative of structure); and mathematical induction (i.e., rules for the principled—and potentially unbounded—expansion of strongly generated structure). By these necessary and sufficient criteria, the grammars of all natural languages are recursive. PMID:24409164

  20. Online damage detection using recursive principal component analysis and recursive condition indicators

    NASA Astrophysics Data System (ADS)

    Krishnan, M.; Bhowmik, B.; Tiwari, A. K.; Hazra, B.

    2017-08-01

In this paper, a novel baseline-free approach for continuous online damage detection of multi-degree-of-freedom vibrating structures using recursive principal component analysis (RPCA) in conjunction with online damage indicators is proposed. In this method, the acceleration data are used to obtain recursive proper orthogonal modes online using the rank-one perturbation method, and subsequently utilized to detect the change in the dynamic behavior of the vibrating system from its pristine state to contiguous linear/nonlinear states that indicate damage. The RPCA algorithm iterates the eigenvector and eigenvalue estimates for the sample covariance matrix and the new data point at each successive time instant, using the rank-one perturbation method. An online condition indicator (CI) based on the L2 norm of the error between the actual response and the response projected using recursive eigenvector matrix updates over successive iterations is proposed. This eliminates the need for offline post-processing and facilitates online damage detection, especially when applied to streaming data. The proposed CI, named the recursive residual error, is also adopted for simultaneous spatio-temporal damage detection. Numerical simulations performed on a five-degree-of-freedom nonlinear system under white noise and El Centro excitations, with different levels of nonlinearity simulating the damage scenarios, demonstrate the robustness of the proposed algorithm. Successful results obtained from practical case studies involving experiments performed on a cantilever beam subjected to earthquake excitation, for full-sensor and underdetermined cases, and data from recorded responses of the UCLA Factor building (full data and its subset) demonstrate the efficacy of the proposed methodology as an ideal candidate for real-time, reference-free structural health monitoring.
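
    The sketch below illustrates the flavor of such a condition indicator: a covariance estimate is updated recursively with each new sample, the leading proper orthogonal modes are tracked, and the projection residual serves as the indicator. For simplicity it re-diagonalizes the updated covariance instead of applying the paper's rank-one eigen-perturbation update, and the five-channel signal is synthetic.

```python
# Simplified sketch of an RPCA-style condition indicator: update a covariance
# estimate recursively with each new sample, track the leading proper orthogonal
# modes, and use the projection residual as the damage indicator. The paper
# updates the eigenpairs directly via rank-one perturbation; for clarity this
# sketch simply re-diagonalizes the recursively updated covariance.
import numpy as np

def rpca_residuals(data, n_modes=2, forget=0.995):
    n_channels = data.shape[1]
    cov = 1e-6 * np.eye(n_channels)
    residuals = []
    for x in data:
        cov = forget * cov + (1 - forget) * np.outer(x, x)   # recursive update
        _, vecs = np.linalg.eigh(cov)
        basis = vecs[:, -n_modes:]                           # leading modes
        recon = basis @ (basis.T @ x)
        residuals.append(np.linalg.norm(x - recon))          # condition indicator
    return np.array(residuals)

# Synthetic 5-channel response whose mode shape changes halfway ("damage").
rng = np.random.default_rng(0)
t = np.arange(4000)
shape = np.where(t[:, None] < 2000, [1, 1, 1, 1, 1], [1, 1, 0.3, 1, 1])
data = shape * np.sin(0.05 * t)[:, None] + 0.05 * rng.standard_normal((4000, 5))
ci = rpca_residuals(data)
print(ci[:2000].mean(), ci[2000:].mean())   # compare residual levels pre/post change
```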

  1. CREATIVE COMPUTATION.

    DTIC Science & Technology

    ARTIFICIAL INTELLIGENCE , RECURSIVE FUNCTIONS), (*RECURSIVE FUNCTIONS, ARTIFICIAL INTELLIGENCE ), (*MATHEMATICAL LOGIC, ARTIFICIAL INTELLIGENCE ), METAMATHEMATICS, AUTOMATA, NUMBER THEORY, INFORMATION THEORY, COMBINATORIAL ANALYSIS

  2. Recursive least-squares learning algorithms for neural networks

    NASA Astrophysics Data System (ADS)

    Lewis, Paul S.; Hwang, Jenq N.

    1990-11-01

This paper presents the development of a pair of recursive least squares (RLS) algorithms for online training of multilayer perceptrons, which are a class of feedforward artificial neural networks. These algorithms incorporate second order information about the training error surface in order to achieve faster learning rates than are possible using first order gradient descent algorithms such as the generalized delta rule. A least squares formulation is derived from a linearization of the training error function. Individual training pattern errors are linearized about the network parameters that were in effect when the pattern was presented. This permits the recursive solution of the least squares approximation either via conventional RLS recursions or by recursive QR decomposition-based techniques. The computational complexity of the update is O(N²), where N is the number of network parameters. This is due to the estimation of the N × N inverse Hessian matrix. Less computationally intensive approximations of the RLS algorithms can be easily derived by using only block diagonal elements of this matrix, thereby partitioning the learning into independent sets. A simulation example is presented in which a neural network is trained to approximate a two-dimensional Gaussian bump. In this example RLS training required an order of magnitude fewer iterations on average (527) than did training with the generalized delta rule (6…). BACKGROUND: Artificial neural networks (ANNs) offer an interesting and potentially useful paradigm for signal processing and pattern recognition. The majority of ANN applications employ the feed-forward multilayer perceptron (MLP) network architecture, in which network parameters are "trained" by a supervised learning algorithm employing the generalized delta rule (GDR) [1, 2]. The GDR algorithm approximates a fixed-step steepest descent algorithm using derivatives computed by error backpropagation. The GDR algorithm is sometimes referred to as the backpropagation algorithm. However, in this paper we will use the term backpropagation to refer only to the process of computing error derivatives. While multilayer perceptrons provide a very powerful nonlinear modeling capability, GDR training can be very slow and inefficient. In linear adaptive filtering the analog of the GDR algorithm is the least-mean-squares (LMS) algorithm. Steepest descent-based algorithms such as GDR or LMS are first order because they use only first derivative or gradient information about the training error to be minimized. To speed up the training process, second order algorithms may be employed that take advantage of second derivative or Hessian matrix information. Second order information can be incorporated into MLP training in different ways. In many applications, especially in the area of pattern recognition, the training set is finite. In these cases block learning can be applied using standard nonlinear optimization techniques [3, 4, 5].
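
    For concreteness, the block below sketches a generic recursive least-squares update for a linear-in-parameters model, showing the O(N^2) covariance recursion the abstract refers to; it does not reproduce the paper's MLP linearization or block-diagonal approximations, and the regression data are synthetic.

```python
# Generic recursive least-squares (RLS) update for a linear-in-parameters model,
# showing the O(N^2) covariance recursion mentioned above. This is not the
# paper's MLP-specific derivation; the regression data below are synthetic.
import numpy as np

def rls(phi_stream, y_stream, n_params, lam=0.99, delta=1e3):
    w = np.zeros(n_params)                    # parameter estimate
    P = delta * np.eye(n_params)              # scaled inverse-Hessian estimate
    for phi, y in zip(phi_stream, y_stream):
        k = P @ phi / (lam + phi @ P @ phi)   # gain vector
        e = y - w @ phi                       # a priori prediction error
        w = w + k * e                         # parameter update
        P = (P - np.outer(k, phi @ P)) / lam  # O(N^2) covariance update
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])
Phi = rng.standard_normal((500, 3))
y = Phi @ true_w + 0.01 * rng.standard_normal(500)
print(rls(Phi, y, n_params=3))                # converges toward true_w
```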

  3. Strong monogamy of bipartite and genuine multipartite entanglement: the Gaussian case.

    PubMed

    Adesso, Gerardo; Illuminati, Fabrizio

    2007-10-12

    We demonstrate the existence of general constraints on distributed quantum correlations, which impose a trade-off on bipartite and multipartite entanglement at once. For all N-mode Gaussian states under permutation invariance, we establish exactly a monogamy inequality, stronger than the traditional one, that by recursion defines a proper measure of genuine N-partite entanglement. Strong monogamy holds as well for subsystems of arbitrary size, and the emerging multipartite entanglement measure is found to be scale invariant. We unveil its operational connection with the optimal fidelity of continuous variable teleportation networks.

  4. Image Segmentation Analysis for NASA Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2010-01-01

NASA collects large volumes of imagery data from satellite-based Earth remote sensing sensors. Nearly all of the computerized image analysis of this data is performed pixel-by-pixel, in which an algorithm is applied directly to individual image pixels. While this analysis approach is satisfactory in many cases, it is usually not fully effective in extracting the full information content from the high spatial resolution image data that is now becoming increasingly available from these sensors. The field of object-based image analysis (OBIA) has arisen in recent years to address the need to move beyond pixel-based analysis. The Recursive Hierarchical Segmentation (RHSEG) software developed by the author is being used to facilitate moving from pixel-based image analysis to OBIA. The key unique aspect of RHSEG is that it tightly intertwines region growing segmentation, which produces spatially connected region objects, with region object classification, which groups sets of region objects together into region classes. No other practical, operational image segmentation approach has this tight integration of region growing object finding with region classification. This integration is made possible by the recursive, divide-and-conquer implementation utilized by RHSEG, in which the input image data is recursively subdivided until the image data sections are small enough to successfully mitigate the combinatorial explosion caused by the need to compute the dissimilarity between each pair of image pixels. RHSEG's tight integration of region growing object finding and region classification is what enables the high spatial fidelity of the image segmentations produced by RHSEG. This presentation will provide an overview of the RHSEG algorithm and describe how it is currently being used to support OBIA for Earth Science applications such as snow/ice mapping and finding archaeological sites from remotely sensed data.

  5. A Recursive Partitioning Method for the Prediction of Preference Rankings Based Upon Kemeny Distances.

    PubMed

    D'Ambrosio, Antonio; Heiser, Willem J

    2016-09-01

    Preference rankings usually depend on the characteristics of both the individuals judging a set of objects and the objects being judged. This topic has been handled in the literature with log-linear representations of the generalized Bradley-Terry model and, recently, with distance-based tree models for rankings. A limitation of these approaches is that they only work with full rankings or with a pre-specified pattern governing the presence of ties, and/or they are based on quite strict distributional assumptions. To overcome these limitations, we propose a new prediction tree method for ranking data that is totally distribution-free. It combines Kemeny's axiomatic approach to define a unique distance between rankings with the CART approach to find a stable prediction tree. Furthermore, our method is not limited by any particular design of the pattern of ties. The method is evaluated in an extensive full-factorial Monte Carlo study with a new simulation design.
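
    As a small illustration of the building block, the snippet below computes the Kemeny distance between two rankings by counting pairwise disagreements, with ties handled by a simple half-point convention; the paper's full method embeds this distance in a CART-style recursive partitioning of the ranking data, which is not reproduced here.

```python
# Kemeny distance between two rankings: count object pairs on which the rankings
# disagree, scoring 1 for an opposite order and 0.5 when exactly one ranking has
# a tie. This is only the distance building block, not the full tree method.
from itertools import combinations

def kemeny_distance(r1, r2):
    """r1, r2: dicts mapping object -> rank (smaller = more preferred)."""
    d = 0.0
    for a, b in combinations(r1.keys(), 2):
        s1 = (r1[a] > r1[b]) - (r1[a] < r1[b])   # -1, 0 (tie) or +1
        s2 = (r2[a] > r2[b]) - (r2[a] < r2[b])
        if s1 != s2:
            d += 0.5 if 0 in (s1, s2) else 1.0
    return d

print(kemeny_distance({"A": 1, "B": 2, "C": 3}, {"A": 2, "B": 1, "C": 3}))  # 1.0
print(kemeny_distance({"A": 1, "B": 1, "C": 2}, {"A": 1, "B": 2, "C": 3}))  # 0.5
```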

  6. Mutual Information between Discrete Variables with Many Categories using Recursive Adaptive Partitioning

    PubMed Central

    Seok, Junhee; Seon Kang, Yeong

    2015-01-01

    Mutual information, a general measure of the relatedness between two random variables, has been actively used in the analysis of biomedical data. The mutual information between two discrete variables is conventionally calculated by their joint probabilities estimated from the frequency of observed samples in each combination of variable categories. However, this conventional approach is no longer efficient for discrete variables with many categories, which can be easily found in large-scale biomedical data such as diagnosis codes, drug compounds, and genotypes. Here, we propose a method to provide stable estimations for the mutual information between discrete variables with many categories. Simulation studies showed that the proposed method reduced the estimation errors by 45-fold and improved the correlation coefficients with true values by 99-fold, compared with the conventional calculation of mutual information. The proposed method was also demonstrated through a case study for diagnostic data in electronic health records. This method is expected to be useful in the analysis of various biomedical data with discrete variables. PMID:26046461
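
    For reference, the conventional plug-in estimate that the proposed method is compared against can be written in a few lines; the sketch below implements only this baseline (joint frequencies from observed counts), not the recursive adaptive partitioning estimator.

        import numpy as np

        def plugin_mutual_information(x, y):
            """Conventional plug-in MI (natural-log units) between two discrete
            variables, estimated from empirical joint frequencies. This is the
            baseline estimator, not the proposed adaptive-partitioning method."""
            xs, xi = np.unique(x, return_inverse=True)
            ys, yi = np.unique(y, return_inverse=True)
            joint = np.zeros((len(xs), len(ys)))
            np.add.at(joint, (xi, yi), 1)
            joint /= joint.sum()
            px = joint.sum(axis=1, keepdims=True)
            py = joint.sum(axis=0, keepdims=True)
            nz = joint > 0
            return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

        rng = np.random.default_rng(0)
        x = rng.integers(0, 50, size=1000)                    # 50-category variable
        y = (x + rng.integers(0, 50, size=1000)) % 50         # weakly dependent on x
        print(plugin_mutual_information(x, y))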

  7. Prognostic value of lymphovascular invasion of the primary tumor in hypopharyngeal carcinoma after total laryngopharyngectomy.

    PubMed

    Saito, Yuki; Omura, Go; Yasuhara, Kazuo; Rikitake, Ryoko; Akashi, Ken; Fukuoka, Osamu; Yoshida, Masafumi; Ando, Mizuo; Asakage, Takahiro; Yamasoba, Tatsuya

    2017-08-01

    We aimed to determine the prognostic value of lymphovascular invasion in the specimens resected during total laryngopharyngectomy for hypopharyngeal carcinoma. Patients who underwent total laryngopharyngectomy at our institution between 2004 and 2014 were included in this study and retrospectively analyzed. We then discriminated for vascular invasion and lymphatic invasion of the primary tumor in all cases. We reviewed 135 records (120 men and 15 women; age range, 36-84 years). Tumors with lymphatic invasion tended to be associated with more metastatic lymph nodes and extracapsular spread (ECS) of metastatic lymph nodes. Tumors with vascular invasion tended to be associated with nonpyriform sinus locations. In a multivariate analysis, nonpyriform sinus locations, >3 metastatic lymph nodes, and vascular invasion remained significant prognostic factors for overall survival (OS); in recursive partitioning analysis, ECS and vascular invasion remained important categorical variables for OS. Vascular invasion is a strong prognostic biomarker for advanced hypopharyngeal carcinoma. © 2017 Wiley Periodicals, Inc. Head Neck 39: 1535-1543, 2017.

  8. Identification of individuals with ADHD using the Dean-Woodcock sensory motor battery and a boosted tree algorithm.

    PubMed

    Finch, Holmes W; Davis, Andrew; Dean, Raymond S

    2015-03-01

    The accurate and early identification of individuals with pervasive conditions such as attention deficit hyperactivity disorder (ADHD) is crucial to ensuring that they receive appropriate and timely assistance and treatment. Heretofore, identification of such individuals has proven somewhat difficult, typically involving clinical decision making based on descriptions and observations of behavior, in conjunction with the administration of cognitive assessments. The present study reports on the use of a sensory motor battery in conjunction with a recursive partitioning computer algorithm, boosted trees, to develop a prediction heuristic for identifying individuals with ADHD. Results of the study demonstrate that this method is able to do so with accuracy rates of over 95%, much higher than the popular logistic regression model against which it was compared. Implications of these results for practice are provided.
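
    A generic boosted-tree versus logistic-regression comparison of the kind described can be sketched with scikit-learn; the data, features, and tuning below are synthetic placeholders, not the study's sensory-motor battery.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        X = rng.normal(size=(300, 20))                     # placeholder battery scores
        y = (X[:, :3].sum(axis=1) + rng.normal(size=300) > 0).astype(int)

        boosted = GradientBoostingClassifier(n_estimators=200, max_depth=3)
        logistic = LogisticRegression(max_iter=1000)

        print("boosted trees:", cross_val_score(boosted, X, y, cv=5).mean())
        print("logistic reg.:", cross_val_score(logistic, X, y, cv=5).mean())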

  9. Partitioning-based mechanisms under personalized differential privacy.

    PubMed

    Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian

    2017-05-01

    Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms.
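
    The sketch below illustrates the general flavor of partitioning under personalized differential privacy for a simple counting query: records are grouped by their personal epsilon and each group's count is noised at that group's smallest budget. This is a didactic simplification, not the privacy-aware or utility-based mechanisms proposed in the paper.

        import numpy as np

        def partitioned_private_count(epsilons, bins, seed=0):
            """Group records by personal epsilon into the given bins, release one
            Laplace-noised count per group calibrated to that group's smallest
            epsilon (so no record exceeds its own budget), and sum the results."""
            rng = np.random.default_rng(seed)
            epsilons = np.asarray(epsilons, dtype=float)
            total = 0.0
            for lo, hi in bins:
                mask = (epsilons >= lo) & (epsilons < hi)
                if not mask.any():
                    continue
                eps_group = epsilons[mask].min()
                noise = rng.laplace(scale=1.0 / eps_group)   # sensitivity 1 for a count
                total += mask.sum() + noise
            return total

        # Seven records with heterogeneous privacy preferences, two budget bins.
        print(partitioned_private_count([0.1, 0.1, 0.5, 0.5, 1.0, 1.0, 1.0],
                                        bins=[(0.0, 0.3), (0.3, 2.0)]))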

  10. Partitioning-based mechanisms under personalized differential privacy

    PubMed Central

    Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian

    2017-01-01

    Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms. PMID:28932827

  11. Entanglement distillation protocols and number theory

    NASA Astrophysics Data System (ADS)

    Bombin, H.; Martin-Delgado, M. A.

    2005-09-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.

  12. Disparities in pediatric leukemia early survival in Argentina: a population-based study.

    PubMed

    Garibotti, Gilda; Moreno, Florencia; Dussel, Veronica; Orellana, Liliana

    2014-10-01

    To identify disparities, using recursive partitioning (RP), in early survival for children with leukemias treated in Argentina, and to depict the main characteristics of the most vulnerable groups. This secondary data analysis evaluated 12-month survival (12-ms) in 3,987 children diagnosed between 2000 and 2008 with lymphoid leukemia (LL) and myeloid leukemia (ML) and registered in Argentina's population-based oncopediatric registry. Prognostic groups based on age at diagnosis, gender, socioeconomic index of the province of residence, and migration to a different province to receive health care were identified using the RP method. Overall 12-ms for LL and ML cases was 83.7% and 59.9% respectively. RP detected major gaps in 12-ms. Among 1-10-year-old LL patients from poorer provinces, 12-ms for those who did and did not migrate was 87.0% and 78.2% respectively. Survival of ML patients < 2 years old from provinces with a low/medium socioeconomic index was 38.9% compared to 62.1% for those in the same age group from richer provinces. For 2-14-year-old ML patients living in poor provinces, patient migration was associated with a 30% increase in 12-ms. Major disparities in leukemia survival among Argentine children were found. Patient migration and socioeconomic index of residence province were associated with survival. The RP method was instrumental in identifying and characterizing vulnerable groups.

  13. An Investigation of Document Partitions.

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1986-01-01

    Empirical significance of document partitions is investigated as a function of index term-weight and similarity thresholds. Results show the same empirically preferred partitions can be detected by two independent strategies: an analysis of cluster-based retrieval and an analysis of regularities in the underlying structure of the document…

  14. Binary tree eigen solver in finite element analysis

    NASA Technical Reports Server (NTRS)

    Akl, F. A.; Janetzke, D. C.; Kiraly, L. J.

    1993-01-01

    This paper presents a transputer-based binary tree eigensolver for the solution of the generalized eigenproblem in linear elastic finite element analysis. The algorithm is based on the method of recursive doubling, in which the parallel implementation of a number of associative operations on an arbitrary set having N elements is of the order of O(log2 N), compared to (N-1) steps if implemented sequentially. The hardware used in the implementation of the binary tree consists of 32 transputers. The algorithm is written in OCCAM, which is a high-level language developed with the transputers to address parallel programming constructs and to provide the communications between processors. The algorithm can be replicated to match the size of the binary tree transputer network. Parallel and sequential finite element analysis programs have been developed to solve for the set of the least-order eigenpairs using the modified subspace method. The speed-up obtained for a typical analysis problem indicates close agreement with the theoretical prediction given by the method of recursive doubling.
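
    The recursive-doubling pattern itself is compact; the sketch below emulates its data flow serially in Python (the original work runs on transputers in OCCAM) for an associative reduction, completing in O(log2 N) combining steps rather than N-1.

        import numpy as np

        def recursive_doubling_reduce(values, op=np.add):
            """Serial emulation of the recursive-doubling data flow: at step k every
            element combines with the element 2**k positions ahead, so an associative
            reduction finishes in O(log2 N) combining steps instead of N-1."""
            x = np.array(values, dtype=float)
            n, step = len(x), 1
            while step < n:
                idx = np.arange(n - step)
                x[idx] = op(x[idx], x[idx + step])   # all pairs combine "in parallel"
                step *= 2
            return x[0]

        print(recursive_doubling_reduce(range(1, 9)))   # 36.0 = 1 + 2 + ... + 8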

  15. Faces of matrix models

    NASA Astrophysics Data System (ADS)

    Morozov, A.

    2012-08-01

    Partition functions of eigenvalue matrix models possess a number of very different descriptions: as matrix integrals, as solutions to linear and nonlinear equations, as τ-functions of integrable hierarchies and as special-geometry prepotentials, as a result of the action of W-operators and of various recursions on elementary input data, and as the gluing of certain elementary building blocks. All this explains the central role of such matrix models in modern mathematical physics: they provide the basic "special functions" to express the answers and relations between them, and they serve as a dream model of what one should try to achieve in any other field.

  16. Linear-algebraic bath transformation for simulating complex open quantum systems

    DOE PAGES

    Huh, Joonsuk; Mostame, Sarah; Fujita, Takatoshi; ...

    2014-12-02

    In studying open quantum systems, the environment is often approximated as a collection of non-interacting harmonic oscillators, a configuration also known as the star-bath model. It is also well known that the star-bath can be transformed into a nearest-neighbor interacting chain of oscillators. The chain-bath model has been widely used in renormalization group approaches. The transformation can be obtained by recursion relations or orthogonal polynomials. Based on a simple linear algebraic approach, we propose a bath partition strategy to reduce the system-bath coupling strength. As a result, the non-interacting star-bath is transformed into a set of weakly coupled multiple parallel chains. Furthermore, the transformed bath model allows complex problems to be practically implemented on quantum simulators, and it can also be employed in various numerical simulations of open quantum dynamics.

  17. A decoupled recursive approach for constrained flexible multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Lai, Hao-Jan; Kim, Sung-Soo; Haug, Edward J.; Bae, Dae-Sung

    1989-01-01

    A variational-vector calculus approach is employed to derive a recursive formulation for dynamic analysis of flexible multibody systems. Kinematic relationships for adjacent flexible bodies are derived in a companion paper, using a state vector notation that represents translational and rotational components simultaneously. Cartesian generalized coordinates are assigned for all body and joint reference frames to explicitly formulate deformation kinematics under the small-deformation assumption, and an efficient recursive algorithm for flexible dynamics is developed. Dynamic analysis of a closed loop robot is performed to illustrate the efficiency of the algorithm.

  18. Incorporating Multi-criteria Optimization and Uncertainty Analysis in the Model-Based Systems Engineering of an Autonomous Surface Craft

    DTIC Science & Technology

    2009-09-01

    … management occurs (OSD 2002). The Systems Engineering Process (SEP), displayed in Figure 2, is a comprehensive, iterative and recursive problem…

  19. Recursive Feature Extraction in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    ReFeX extracts recursive topological features from graph data. The input is a graph as a csv file and the output is a csv file containing feature values for each node in the graph. The features are based on topological counts in the neighborhood of each node, as well as recursive summaries of neighbors' features.

  20. Recursive feature selection with significant variables of support vectors.

    PubMed

    Tsai, Chen-An; Huang, Chien-Hsun; Chang, Ching-Wei; Chen, Chun-Houh

    2012-01-01

    The development of DNA microarrays allows researchers to screen thousands of genes simultaneously and helps determine high- and low-expression level genes in normal and disease tissues. Selecting relevant genes for cancer classification is an important issue. Most of the gene selection methods use univariate ranking criteria and arbitrarily choose a threshold to select genes. However, the parameter setting may not be compatible with the selected classification algorithms. In this paper, we propose a new gene selection method (SVM-t) based on the use of t-statistics embedded in support vector machine. We compared the performance to two similar SVM-based methods: SVM recursive feature elimination (SVMRFE) and recursive support vector machine (RSVM). The three methods were compared based on extensive simulation experiments and analyses of two published microarray datasets. In the simulation experiments, we found that the proposed method is more robust in selecting informative genes than SVMRFE and RSVM and capable of attaining good classification performance when the variations of informative and noninformative genes are different. In the analysis of two microarray datasets, the proposed method yields better performance in identifying fewer genes with good prediction accuracy, compared to SVMRFE and RSVM.
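
    The SVM-RFE baseline referred to above can be reproduced in outline with scikit-learn's RFE wrapper around a linear SVM; the data below are synthetic and the proposed SVM-t method is not shown.

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.feature_selection import RFE

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 500))                    # 500 "genes", 100 samples
        y = (X[:, 0] - X[:, 1] + 0.5 * rng.normal(size=100) > 0).astype(int)

        # Recursively drop the 10% of features with the smallest linear-SVM weights
        # until 10 genes remain.
        selector = RFE(LinearSVC(C=1.0, max_iter=5000), n_features_to_select=10, step=0.1)
        selector.fit(X, y)
        print("selected gene indices:", np.flatnonzero(selector.support_))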

  1. Multidisciplinary treatment of brain metastases derived from clear cell renal cancer incorporating stereotactic radiosurgery.

    PubMed

    Samlowski, Wolfram E; Majer, Martin; Boucher, Kenneth M; Shrieve, Annabelle F; Dechet, Christopher; Jensen, Randy L; Shrieve, Dennis C

    2008-11-01

    Brain metastases are a frequent complication in patients with metastatic clear cell renal cancer. Survival after whole-brain radiotherapy (WBRT) is disappointing. A retrospective analysis of multimodality treatment was performed in patients who had received linear accelerator (LINAC)-based stereotactic radiosurgery (SRS). Thirty-two patients underwent SRS-based treatment for 71 metastatic foci between 2000 and 2006. All patients had a Karnofsky performance status ≥70 and all 32 patients had extracranial metastatic disease (Radiation Therapy Oncology Group recursive partitioning analysis [RPA] Class 2). Survival was calculated from the time of diagnosis of brain metastases. The minimum potential follow-up was 1 year after SRS. Univariate and multivariate analysis of potential prognostic factors affecting survival was performed. Twenty-six patients required only 1 SRS treatment (84%) to achieve central nervous system (CNS) control, whereas 5 patients received 2 to 3 treatments (16%). The median survival of renal cancer patients from the diagnosis of brain metastases was 10.1 months (95% confidence interval, 6.4-14.8 months). One-year and 3-year survival rates were 43% and 16%, respectively. The addition of surgery or WBRT did not appear to prolong survival. Immunotherapy after control of brain metastases with SRS appeared to result in significantly improved survival. Survival was also found to be strongly influenced by prognostic stratification of metastatic disease using Motzer or modified risk criteria. The results of the current study demonstrated that SRS-based treatment of patients with up to 5 brain metastases from clear cell renal cancer is feasible and results in excellent CNS control. Survival beyond 3 years from the time of diagnosis of brain metastases was achievable in 16% of patients and was associated with the use of systemic immunotherapy with interleukin-2 and interferon but not antiangiogenic agents.

  2. Performance analysis of a dual-tree algorithm for computing spatial distance histograms

    PubMed Central

    Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni

    2011-01-01

    Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytics, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time when compared to the brute-force approach where all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
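
    For context, the O(N^2) brute-force computation that the dual-tree SDH algorithm improves on is a single pairwise-distance histogram; the sketch below shows only that baseline, with illustrative bucket settings.

        import numpy as np

        def brute_force_sdh(points, bucket_width, n_buckets):
            """O(N^2) baseline spatial distance histogram: compute every pairwise
            distance once and count it into fixed-width buckets. The dual-tree
            algorithm avoids materializing most of these distances."""
            diff = points[:, None, :] - points[None, :, :]
            dists = np.sqrt((diff ** 2).sum(axis=-1))
            iu = np.triu_indices(len(points), k=1)           # each pair once
            hist, _ = np.histogram(dists[iu], bins=n_buckets,
                                   range=(0.0, bucket_width * n_buckets))
            return hist

        atoms = np.random.default_rng(0).uniform(size=(500, 3))
        print(brute_force_sdh(atoms, bucket_width=0.1, n_buckets=20))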

  3. Categorial Compositionality III: F-(co)algebras and the Systematicity of Recursive Capacities in Human Cognition

    PubMed Central

    Phillips, Steven; Wilson, William H.

    2012-01-01

    Human cognitive capacity includes recursively definable concepts, which are prevalent in domains involving lists, numbers, and languages. Cognitive science currently lacks a satisfactory explanation for the systematic nature of such capacities (i.e., why the capacity for some recursive cognitive abilities–e.g., finding the smallest number in a list–implies the capacity for certain others–finding the largest number, given knowledge of number order). The category-theoretic constructs of initial F-algebra, catamorphism, and their duals, final coalgebra and anamorphism provide a formal, systematic treatment of recursion in computer science. Here, we use this formalism to explain the systematicity of recursive cognitive capacities without ad hoc assumptions (i.e., to the same explanatory standard used in our account of systematicity for non-recursive capacities). The presence of an initial algebra/final coalgebra explains systematicity because all recursive cognitive capacities, in the domain of interest, factor through (are composed of) the same component process. Moreover, this factorization is unique, hence no further (ad hoc) assumptions are required to establish the intrinsic connection between members of a group of systematically-related capacities. This formulation also provides a new perspective on the relationship between recursive cognitive capacities. In particular, the link between number and language does not depend on recursion, as such, but on the underlying functor on which the group of recursive capacities is based. Thus, many species (and infants) can employ recursive processes without having a full-blown capacity for number and language. PMID:22514704

  4. Are species photosynthetic characteristics good predictors of seedling post-hurricane demographic patterns and species spatiotemporal distribution in a hurricane impacted wet montane forest?

    NASA Astrophysics Data System (ADS)

    Luke, Denneko; McLaren, Kurt

    2018-05-01

    In situ measurements of leaf level photosynthetic response to light were collected from seedlings of ten tree species from a tropical montane wet forest, the John Crow Mountains, Jamaica. A model-based recursive partitioning ('mob') algorithm was then used to identify species associations based on their fitted photosynthetic response curves. Leaf area dark respiration (RD) and light saturated maximum photosynthetic (Amax) rates were also used as 'mob' partitioning variables, to identify species associations based on seedling demographic patterns (from June 2007 to May 2010) following a hurricane (Aug. 2007) and the spatiotemporal distribution patterns of stems in 2006 and 2012. RD and Amax rates ranged from 1.14 to 2.02 μmol (CO2) m⁻² s⁻¹ and 2.97 to 5.87 μmol (CO2) m⁻² s⁻¹, respectively, placing the ten species in the range of intermediate shade tolerance. Several parsimonious species 'mob' groups were formed based on 1) interspecific differences among species response curves, 2) variations in post-hurricane seedling demographic trends and 3) RD rates and species spatiotemporal distribution patterns at aspects that are more or less exposed to hurricanes. The composition of parsimonious groupings based on photosynthetic curves was not concordant with the groups based on demographic trends but was partially concordant with the RD - species spatiotemporal distribution groups. Our results indicated that the influence of photosynthetic characteristics on demographic traits and species distributions was not straightforward. Rather, there was a complex pattern of interaction between ecophysiological and demographic traits, which determined species successional status, post-hurricane response and ultimately, species distribution at our study site.

  5. Exact partition functions for deformed N=2 theories with N_f=4 flavours

    NASA Astrophysics Data System (ADS)

    Beccaria, Matteo; Fachechi, Alberto; Macorini, Guido; Martina, Luigi

    2016-12-01

    We consider the Ω-deformed N=2 SU(2) gauge theory in four dimensions with Nf = 4 massive fundamental hypermultiplets. The low energy effective action depends on the deformation parameters ε1, ε2, the scalar field expectation value a, and the hypermultiplet masses m = (m1, m2, m3, m4). Motivated by recent findings in the N=2* theory, we explore the theories that are characterized by special fixed ratios ε2/ε1 and m/ε1 and propose a simple condition on the structure of the multi-instanton contributions to the prepotential determining the effective action. This condition determines a finite set ΠN of special points such that the prepotential has N poles at fixed positions independent of the instanton number. In analogy with what happens in the N=2* gauge theory, the full prepotential of the ΠN theories may be given in closed form as an explicit function of a and the modular parameter q appearing in special combinations of Eisenstein series and Jacobi theta functions with well defined modular properties. The resulting finite pole partition functions are related by AGT correspondence to special 4-point spherical conformal blocks of the Virasoro algebra. We examine in full detail special cases where the closed expression of the block is known and confirm our Ansatz. We systematically study the special features of Zamolodchikov's recursion for the ΠN conformal blocks. As a result, we provide a novel effective recursion relation that can be exactly solved and allows us to prove the conjectured closed expressions analytically in the case of the Π1 and Π2 conformal blocks.

  6. Factors affecting exits from homelessness among persons with serious mental illness and substance use disorders

    PubMed Central

    Gabrielian, Sonya; Bromley, Elizabeth; Hellemann, Gerhard S.; Kern, Robert S.; Goldenson, Nicholas I.; Danley, Megan E.; Young, Alexander S.

    2015-01-01

    Objective We sought to understand the housing trajectories of homeless consumers with serious mental illness (SMI) and co-occurring substance use disorders (SUD) and to identify factors that best predicted achievement of independent housing. Methods Using administrative data, we identified homeless persons with SMI and SUD admitted to a residential rehabilitation program from 12/2008-11/2011. On a random sample (n=36), we assessed a range of potential predictors of housing outcomes, including symptoms, cognition, and social/community supports. We used the Residential Time-Line Follow-Back (TLFB) Inventory to gather housing histories since exiting rehabilitation and identify housing outcomes. We used recursive partitioning to identify variables that best differentiated participants by these outcomes. Results We identified three housing trajectories: stable housing (n=14); unstable housing (n=15); and continuously engaged in housing services (n=7). Using recursive partitioning, two variables (the Symbol Digit Modalities Test (SDMT), a neurocognitive measure of processing speed, and the Behavior and Symptom Identification Scale (BASIS) relationships subscale, which quantifies symptoms affecting relationships) were sufficient to capture the information provided by 26 predictors to classify participants by housing outcome. Participants predicted to continuously engage in services had impaired processing speeds (SDMT score < 32.5). Among consumers with SDMT score ≥ 32.5, those predicted to achieve stable housing had fewer interpersonal symptoms (BASIS relationships score < 0.81) than those predicted to have unstable housing. This model explains 57% of this sample's variability and 14% of this population's variability in housing outcomes. Conclusion As cognition and symptoms influencing relationships predicted housing outcomes for homeless adults with SMI and SUD, cognitive and social skills trainings may be useful for this population. PMID:25919839

  7. Experiments to Determine Whether Recursive Partitioning (CART) or an Artificial Neural Network Overcomes Theoretical Limitations of Cox Proportional Hazards Regression

    NASA Technical Reports Server (NTRS)

    Kattan, Michael W.; Hess, Kenneth R.; Kattan, Michael W.

    1998-01-01

    New computationally intensive tools for medical survival analyses include recursive partitioning (also called CART) and artificial neural networks. A challenge that remains is to better understand the behavior of these techniques in an effort to know when they will be effective tools. Theoretically they may overcome limitations of the traditional multivariable survival technique, the Cox proportional hazards regression model. Experiments were designed to test whether the new tools would, in practice, overcome these limitations. Two datasets in which theory suggests CART and the neural network should outperform the Cox model were selected. The first was a published leukemia dataset manipulated to have a strong interaction that CART should detect. The second was a published cirrhosis dataset with pronounced nonlinear effects that a neural network should fit. Repeated sampling of 50 training and testing subsets was applied to each technique. The concordance index C was calculated as a measure of predictive accuracy by each technique on the testing dataset. In the interaction dataset, CART outperformed Cox (P < 0.05) with a C improvement of 0.1 (95% CI, 0.08 to 0.12). In the nonlinear dataset, the neural network outperformed the Cox model (P < 0.05), but by a very slight amount (0.015). As predicted by theory, CART and the neural network were able to overcome limitations of the Cox model. Experiments like these are important to increase our understanding of when one of these new techniques will outperform the standard Cox model. Further research is necessary to predict which technique will do best a priori and to assess the magnitude of superiority.
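
    The concordance index C used to compare the techniques can be computed directly as the fraction of usable pairs whose predicted risks are ordered consistently with the observed survival times; below is a minimal sketch for right-censored data, with ties in risk counted as one half.

        import itertools

        def concordance_index(times, events, risk_scores):
            """Fraction of usable pairs (the earlier time is an observed event) in
            which the shorter survival carries the higher predicted risk; tied
            risks count 0.5."""
            concordant, usable = 0.0, 0
            for i, j in itertools.combinations(range(len(times)), 2):
                if times[j] < times[i]:
                    i, j = j, i                  # make i the earlier time
                if not events[i]:
                    continue                     # earlier time censored: unusable pair
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
            return concordant / usable

        # Perfectly concordant toy data: higher risk goes with shorter survival.
        print(concordance_index(times=[2, 5, 7, 9], events=[1, 1, 0, 1],
                                risk_scores=[0.9, 0.6, 0.4, 0.1]))   # -> 1.0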

  8. Global-to-local, shape-based, real and virtual landmarks for shape modeling by recursive boundary subdivision

    NASA Astrophysics Data System (ADS)

    Rueda, Sylvia; Udupa, Jayaram K.

    2011-03-01

    Landmark based statistical object modeling techniques, such as Active Shape Model (ASM), have proven useful in medical image analysis. Identification of the same homologous set of points in a training set of object shapes is the most crucial step in ASM, which has encountered challenges such as (C1) defining and characterizing landmarks; (C2) ensuring homology; (C3) generalizing to n > 2 dimensions; (C4) achieving practical computations. In this paper, we propose a novel global-to-local strategy that attempts to address C3 and C4 directly and works in R^n. The 2D version starts from two initial corresponding points determined in all training shapes via a method α, and subsequently subdivides the shapes into connected boundary segments by a line determined by these points. A shape analysis method β is applied on each segment to determine a landmark on the segment. This point introduces more pairs of points, the lines defined by which are used to further subdivide the boundary segments. This recursive boundary subdivision (RBS) process continues simultaneously on all training shapes, maintaining synchrony of the level of recursion, and thereby keeping correspondence among generated points automatically by the correspondence of the homologous shape segments in all training shapes. The process terminates when no subdividing lines are left to be considered that indicate (as per method β) that a point can be selected on the associated segment. Examples of α and β are presented based on (a) distance; (b) Principal Component Analysis (PCA); and (c) the novel concept of virtual landmarks.

  9. Overlapped Partitioning for Ensemble Classifiers of P300-Based Brain-Computer Interfaces

    PubMed Central

    Onishi, Akinari; Natsume, Kiyohisa

    2014-01-01

    A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance. PMID:24695550
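
    The overlapped-partitioning idea can be sketched as follows: the training data are cut into consecutive chunks that are extended so neighbouring partitions share samples, one LDA is fit per chunk, and decision scores are averaged. The window sizes and overlap below are illustrative assumptions, not the values used in the study.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def overlapped_partitions(n_samples, n_parts, overlap):
            """Consecutive chunks of the training indices, each extended by
            `overlap` samples on both sides so neighbouring partitions share data."""
            size = n_samples // n_parts
            return [np.arange(max(0, i * size - overlap),
                              min(n_samples, (i + 1) * size + overlap))
                    for i in range(n_parts)]

        rng = np.random.default_rng(3)
        X = rng.normal(size=(900, 30))                       # 900 training epochs
        y = (X[:, 0] + 0.7 * rng.normal(size=900) > 0).astype(int)

        models = [LinearDiscriminantAnalysis().fit(X[idx], y[idx])
                  for idx in overlapped_partitions(len(X), n_parts=5, overlap=60)]

        X_test = rng.normal(size=(10, 30))
        avg_score = np.mean([m.decision_function(X_test) for m in models], axis=0)
        print((avg_score > 0).astype(int))                   # ensemble decision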

  10. Overlapped partitioning for ensemble classifiers of P300-based brain-computer interfaces.

    PubMed

    Onishi, Akinari; Natsume, Kiyohisa

    2014-01-01

    A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance.

  11. Comparison of modeling approaches for carbon partitioning: Impact on estimates of global net primary production and equilibrium biomass of woody vegetation from MODIS GPP

    NASA Astrophysics Data System (ADS)

    Ise, Takeshi; Litton, Creighton M.; Giardina, Christian P.; Ito, Akihiko

    2010-12-01

    Partitioning of gross primary production (GPP) to aboveground versus belowground, to growth versus respiration, and to short versus long-lived tissues exerts a strong influence on ecosystem structure and function, with potentially large implications for the global carbon budget. A recent meta-analysis of forest ecosystems suggests that carbon partitioning to leaves, stems, and roots varies consistently with GPP and that the ratio of net primary production (NPP) to GPP is conservative across environmental gradients. To examine influences of carbon partitioning schemes employed by global ecosystem models, we used this meta-analysis-based model and a satellite-based (MODIS) terrestrial GPP data set to estimate global woody NPP and equilibrium biomass, and then compared it to two process-based ecosystem models (Biome-BGC and VISIT) using the same GPP data set. We hypothesized that different carbon partitioning schemes would result in large differences in global estimates of woody NPP and equilibrium biomass. Woody NPP estimated by Biome-BGC and VISIT was 25% and 29% higher than the meta-analysis-based model for boreal forests, with smaller differences in temperate and tropics. Global equilibrium woody biomass, calculated from model-specific NPP estimates and a single set of tissue turnover rates, was 48 and 226 Pg C higher for Biome-BGC and VISIT compared to the meta-analysis-based model, reflecting differences in carbon partitioning to structural versus metabolically active tissues. In summary, we found that different carbon partitioning schemes resulted in large variations in estimates of global woody carbon flux and storage, indicating that stand-level controls on carbon partitioning are not yet accurately represented in ecosystem models.

  12. Topological string, supersymmetric gauge theory and BPS counting

    NASA Astrophysics Data System (ADS)

    Pan, Guang

    In this thesis we study the Donaldson-Thomas theory on the local curve geometry, which arises in the context of geometric engineering of supersymmetric gauge theory from type IIA string compactification. The topological A-model amplitude gives the F-term interaction of the compactified theory. In particular, it is related to the instanton partition function via the Nekrasov conjecture. We will introduce ADHM sheaves on curves as an alternative description of local Donaldson-Thomas theory. We derive the wallcrossing of ADHM invariants and their refinements. We show that it is equivalent to the semi-primitive wallcrossing from supergravity and the Kontsevich-Soibelman wallcrossing formula. As an application, we discuss the connection between the ADHM moduli space and the Hitchin system. In particular, we give a recursive formula for the Poincaré polynomial of the Hitchin system in terms of the instanton partition function, from refined wallcrossing. We also introduce a higher rank generalization of the Donaldson-Thomas invariant in the context of ADHM sheaves. We study their wallcrossing and discuss their physical interpretation via string duality.

  13. Lord-Wingersky Algorithm Version 2.0 for Hierarchical Item Factor Models with Applications in Test Scoring, Scale Alignment, and Model Fit Testing. CRESST Report 830

    ERIC Educational Resources Information Center

    Cai, Li

    2013-01-01

    Lord and Wingersky's (1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined…
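
    The unidimensional recursion itself is short: given each dichotomous item's correct-response probability at a fixed ability level, the summed-score distribution is built one item at a time, as in the sketch below (the multidimensional extension discussed in the report is not shown).

        def lord_wingersky(p_correct):
            """Summed-score distribution for dichotomous items at a fixed ability:
            p_correct[i] is the probability of answering item i correctly; the
            return value gives P(score = 0), ..., P(score = n)."""
            scores = [1.0]                       # before any item, P(score 0) = 1
            for p in p_correct:
                new = [0.0] * (len(scores) + 1)
                for s, prob in enumerate(scores):
                    new[s] += prob * (1.0 - p)   # item answered incorrectly
                    new[s + 1] += prob * p       # item answered correctly
                scores = new
            return scores

        # Three items at one quadrature point (illustrative probabilities).
        print(lord_wingersky([0.8, 0.6, 0.4]))   # probabilities of scores 0..3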

  14. Accelerating calculations of RNA secondary structure partition functions using GPUs

    PubMed Central

    2013-01-01

    Background RNA performs many diverse functions in the cell in addition to its role as a messenger of genetic information. These functions depend on its ability to fold to a unique three-dimensional structure determined by the sequence. The conformation of RNA is in part determined by its secondary structure, or the particular set of contacts between pairs of complementary bases. Prediction of the secondary structure of RNA from its sequence is therefore of great interest, but can be computationally expensive. In this work we accelerate computations of base-pair probabilities using parallel graphics processing units (GPUs). Results Calculation of the probabilities of base pairs in RNA secondary structures using nearest-neighbor standard free energy change parameters has been implemented using CUDA to run on hardware with multiprocessor GPUs. A modified set of recursions was introduced, which reduces memory usage by about 25%. GPUs are fastest in single precision, and for some hardware, restricted to single precision. This may introduce significant roundoff error. However, deviations in base-pair probabilities calculated using single precision were found to be negligible compared to those resulting from shifting the nearest-neighbor parameters by a random amount of magnitude similar to their experimental uncertainties. For large sequences running on our particular hardware, the GPU implementation reduces execution time by a factor of close to 60 compared with an optimized serial implementation, and by a factor of 116 compared with the original code. Conclusions Using GPUs can greatly accelerate computation of RNA secondary structure partition functions, allowing calculation of base-pair probabilities for large sequences in a reasonable amount of time, with a negligible compromise in accuracy due to working in single precision. The source code is integrated into the RNAstructure software package and available for download at http://rna.urmc.rochester.edu. PMID:24180434

  15. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    PubMed

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
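
    The mechanics of recursive peeling can be illustrated on a continuous response with a simple mean-inside-the-box criterion, as in the PRIM-style sketch below; the paper's method instead uses survival-specific criteria such as the hazard ratio or log-rank statistic, which are not implemented here.

        import numpy as np

        def peel(X, y, alpha=0.1, min_support=0.05):
            """Repeatedly remove ('peel') the alpha-fraction edge of one variable
            that most increases the mean response inside the remaining box, until
            the box gets too small or no peel helps. Returns the box membership."""
            inside = np.ones(len(y), dtype=bool)
            while inside.mean() > min_support:
                best_gain, best_mask = 0.0, None
                for j in range(X.shape[1]):
                    lo = np.quantile(X[inside, j], alpha)
                    hi = np.quantile(X[inside, j], 1 - alpha)
                    for keep in (X[:, j] >= lo, X[:, j] <= hi):
                        cand = inside & keep
                        if 0 < cand.sum() < inside.sum():
                            gain = y[cand].mean() - y[inside].mean()
                            if gain > best_gain:
                                best_gain, best_mask = gain, cand
                if best_mask is None:
                    break
                inside = best_mask
            return inside

        rng = np.random.default_rng(7)
        X = rng.uniform(size=(500, 3))
        y = (X[:, 0] > 0.7).astype(float) + 0.1 * rng.normal(size=500)
        box = peel(X, y)
        print("box members:", box.sum(), "mean response inside:", round(y[box].mean(), 2))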

  16. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922

  17. Recursive Directional Ligation Approach for Cloning Recombinant Spider Silks.

    PubMed

    Dinjaski, Nina; Huang, Wenwen; Kaplan, David L

    2018-01-01

    Recent advances in genetic engineering have provided a route to produce various types of recombinant spider silks. Different cloning strategies have been applied to achieve this goal (e.g., concatemerization, step-by-step ligation, recursive directional ligation). Here we describe recursive directional ligation as an approach that allows for facile modularity and control over the size of the genetic cassettes. This approach is based on sequential ligation of genetic cassettes (monomers) where the junctions between them are formed without interrupting key gene sequences with additional base pairs.

  18. [Comparison of Discriminant Analysis and Decision Trees for the Detection of Subclinical Keratoconus].

    PubMed

    Kleinhans, Sonja; Herrmann, Eva; Kohnen, Thomas; Bühren, Jens

    2017-08-15

    Background Iatrogenic keratectasia is one of the most dreaded complications of refractive surgery. In most cases, keratectasia develops after refractive surgery of eyes suffering from subclinical stages of keratoconus with few or no signs. Unfortunately, there has been no reliable procedure for the early detection of keratoconus. In this study, we used binary decision trees (recursive partitioning) to assess their suitability for discrimination between normal eyes and eyes with subclinical keratoconus. Patients and Methods The method of decision tree analysis was compared with discriminant analysis which has shown good results in previous studies. Input data were 32 eyes of 32 patients with newly diagnosed keratoconus in the contralateral eye and preoperative data of 10 eyes of 5 patients with keratectasia after laser in-situ keratomileusis (LASIK). The control group was made up of 245 normal eyes after LASIK and 12-month follow-up without any signs of iatrogenic keratectasia. Results Decision trees gave better accuracy and specificity than did discriminant analysis. The sensitivity of decision trees was lower than the sensitivity of discriminant analysis. Conclusion On the basis of the patient population of this study, decision trees did not prove to be superior to linear discriminant analysis for the detection of subclinical keratoconus. Georg Thieme Verlag KG Stuttgart · New York.

  19. A basic recursion concept inventory

    NASA Astrophysics Data System (ADS)

    Hamouda, Sally; Edwards, Stephen H.; Elmongui, Hicham G.; Ernst, Jeremy V.; Shaffer, Clifford A.

    2017-04-01

    Recursion is both an important and a difficult topic for introductory Computer Science students. Students often develop misconceptions about the topic that need to be diagnosed and corrected. In this paper, we report on our initial attempts to develop a concept inventory that measures student misconceptions on basic recursion topics. We present a collection of misconceptions and difficulties encountered by students when learning introductory recursion as presented in a typical CS2 course. Based on this collection, a draft concept inventory in the form of a series of questions was developed and evaluated, with the question rubric tagged to the list of misconceptions and difficulties.

  20. Some practicable applications of quadtree data structures/representation in astronomy

    NASA Technical Reports Server (NTRS)

    Pasztor, L.

    1992-01-01

    Development of the quadtree as a hierarchical data structuring technique for representing spatial data (like points, regions, surfaces, lines, curves, volumes, etc.) has been motivated to a large extent by the storage requirements of images, maps, and other multidimensional (spatially structured) data. For many spatial algorithms, the time-efficiency of quadtrees in terms of execution may be as important as their space-efficiency in terms of storage. Briefly, the quadtree is a class of hierarchical data structures based on the recursive partition of a square region into quadrants and sub-quadrants until a predefined limit is reached. Beyond the wide applicability of quadtrees in image processing, spatial information analysis, and building digital databases (processes becoming ordinary for the astronomical community), there may be numerous further applications in astronomy. Some of these practicable applications based on quadtree representation of astronomical data are presented and suggested for further consideration. Examples are shown for the use of point as well as region quadtrees. Statistics of different leaf and non-leaf nodes (homogeneous and heterogeneous sub-quadrants, respectively) at different levels may provide useful information on the spatial structure of the astronomical data in question. By altering the principle guiding the decomposition process, different types of spatial data may be focused on. Finally, a sampling method based on quadtree representation of an image is proposed which may prove to be efficient in the elaboration of a sampling strategy in a region where observations were carried out previously either with different resolution and/or in different bands.
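
    A minimal region-quadtree decomposition of point data illustrates the recursive partition described above; the capacity and depth limits below are arbitrary illustrative choices.

        import random

        def build_quadtree(points, x0, y0, size, capacity=4, depth=0, max_depth=8):
            """Minimal region quadtree: recursively split a square into quadrants
            until each holds at most `capacity` points or the depth limit is hit."""
            node = {"bounds": (x0, y0, size), "points": points, "children": []}
            if len(points) <= capacity or depth >= max_depth:
                return node
            half = size / 2.0
            for dx in (0, 1):
                for dy in (0, 1):
                    qx, qy = x0 + dx * half, y0 + dy * half
                    inside = [(px, py) for px, py in points
                              if qx <= px < qx + half and qy <= py < qy + half]
                    node["children"].append(build_quadtree(
                        inside, qx, qy, half, capacity, depth + 1, max_depth))
            node["points"] = []                  # points live only in the leaves
            return node

        random.seed(0)
        pts = [(random.random(), random.random()) for _ in range(2000)]
        tree = build_quadtree(pts, 0.0, 0.0, 1.0)
        print(len(tree["children"]), "quadrants at the root")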

  1. Distribution-Preserving Stratified Sampling for Learning Problems.

    PubMed

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.
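
    A simplified sketch of the general idea, not the paper's algorithm or its error bounds: recursively bisect the data along the widest dimension at the median until a target number of cells is reached, then draw one point per cell so the selected subset roughly follows the original distribution.

        import numpy as np

        def recursive_partition_sample(X, n_cells, seed=0):
            """Recursively bisect the data along the widest dimension at the median
            until roughly n_cells cells exist, then draw one point per cell so the
            selected subset approximately follows the original distribution."""
            rng = np.random.default_rng(seed)
            cells = [np.arange(len(X))]
            while len(cells) < n_cells:
                cells.sort(key=len, reverse=True)
                idx = cells.pop(0)                           # split the largest cell
                spans = X[idx].max(axis=0) - X[idx].min(axis=0)
                j = int(np.argmax(spans))                    # widest dimension
                cut = np.median(X[idx, j])
                left, right = idx[X[idx, j] <= cut], idx[X[idx, j] > cut]
                if len(left) == 0 or len(right) == 0:        # degenerate split
                    cells.insert(0, idx)
                    break
                cells += [left, right]
            return np.array([rng.choice(c) for c in cells])

        X = np.random.default_rng(5).normal(size=(10000, 2))
        sample = X[recursive_partition_sample(X, n_cells=64)]
        print(sample.mean(axis=0), X.mean(axis=0))           # similar statistics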

  2. Real-time Adaptive EEG Source Separation using Online Recursive Independent Component Analysis

    PubMed Central

    Hsu, Sheng-Hsiou; Mullen, Tim; Jung, Tzyy-Ping; Cauwenberghs, Gert

    2016-01-01

    Independent Component Analysis (ICA) has been widely applied to electroencephalographic (EEG) biosignal processing and brain-computer interfaces. The practical use of ICA, however, is limited by its computational complexity, data requirements for convergence, and assumption of data stationarity, especially for high-density data. Here we study and validate an optimized online recursive ICA algorithm (ORICA) with online recursive least squares (RLS) whitening for blind source separation of high-density EEG data, which offers instantaneous incremental convergence upon presentation of new data. Empirical results of this study demonstrate the algorithm's: (a) suitability for accurate and efficient source identification in high-density (64-channel) realistically-simulated EEG data; (b) capability to detect and adapt to non-stationarity in 64-ch simulated EEG data; and (c) utility for rapidly extracting principal brain and artifact sources in real 61-channel EEG data recorded by a dry and wearable EEG system in a cognitive experiment. ORICA was implemented as functions in BCILAB and EEGLAB and was integrated in an open-source Real-time EEG Source-mapping Toolbox (REST), supporting applications in ICA-based online artifact rejection, feature extraction for real-time biosignal monitoring in clinical environments, and adaptable classifications in brain-computer interfaces. PMID:26685257

  3. Implementation of 3-D isoparametric finite elements on supercomputer for the formulation of recursive dynamical equations of multi-body systems

    NASA Technical Reports Server (NTRS)

    Shareef, N. H.; Amirouche, F. M. L.

    1991-01-01

    A computational algorithmic procedure is developed and implemented for the dynamic analysis of a multibody system with rigid/flexible interconnected bodies. The algorithm takes into consideration the large rotations/translations associated with the rigid-body degrees of freedom and the small elastic deformations associated with the flexibility of the bodies in the system. Versatile three-dimensional isoparametric brick elements are employed for the modeling of the geometric configurations of the bodies. The formulation of the recursive dynamical equations of motion is based on the recursive Kane's equations, strain energy concepts, and the techniques of component mode synthesis. In order to minimize CPU-intensive matrix multiplication operations and speed up the execution process, the concept of indexed arrays is utilized in the formulation of the equations of motion. A spin-up maneuver of a space robot with three flexible links carrying a solar panel is used as an illustrative example.

  4. Classification and regression tree (CART) analysis of endometrial carcinoma: Seeing the forest for the trees.

    PubMed

    Barlin, Joyce N; Zhou, Qin; St Clair, Caryn M; Iasonos, Alexia; Soslow, Robert A; Alektiar, Kaled M; Hensley, Martee L; Leitao, Mario M; Barakat, Richard R; Abu-Rustum, Nadeem R

    2013-09-01

    The objectives of the study are to evaluate which clinicopathologic factors influenced overall survival (OS) in endometrial carcinoma and to determine if the surgical effort to assess para-aortic (PA) lymph nodes (LNs) at initial staging surgery impacts OS. All patients diagnosed with endometrial cancer from 1/1993-12/2011 who had LNs excised were included. PALN assessment was defined by the identification of one or more PALNs on final pathology. A multivariate analysis was performed to assess the effect of PALNs on OS. A form of recursive partitioning called classification and regression tree (CART) analysis was implemented. Variables included: age, stage, tumor subtype, grade, myometrial invasion, total LNs removed, evaluation of PALNs, and adjuvant chemotherapy. The cohort included 1920 patients, with a median age of 62 years. The median number of LNs removed was 16 (range, 1-99). The removal of PALNs was not associated with OS (P=0.450). Using the CART hierarchically, stage I vs. stages II-IV and grades 1-2 vs. grade 3 emerged as predictors of OS. If the tree was allowed to grow, further branching was based on age and myometrial invasion. Total number of LNs removed and assessment of PALNs as defined in this study were not predictive of OS. This innovative CART analysis emphasized the importance of proper stage assignment and a binary grading system in impacting OS. Notably, the total number of LNs removed and specific evaluation of PALNs as defined in this study were not important predictors of OS. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Binary Decision Trees for Preoperative Periapical Cyst Screening Using Cone-beam Computed Tomography.

    PubMed

    Pitcher, Brandon; Alaqla, Ali; Noujeim, Marcel; Wealleans, James A; Kotsakis, Georgios; Chrepa, Vanessa

    2017-03-01

    Cone-beam computed tomographic (CBCT) analysis allows for 3-dimensional assessment of periradicular lesions and may facilitate preoperative periapical cyst screening. The purpose of this study was to develop and assess the predictive validity of a cyst screening method based on CBCT volumetric analysis alone or combined with designated radiologic criteria. Three independent examiners evaluated 118 presurgical CBCT scans from cases that underwent apicoectomies and had an accompanying gold standard histopathological diagnosis of either a cyst or granuloma. Lesion volume, density, and specific radiologic characteristics were assessed using specialized software. Logistic regression models with histopathological diagnosis as the dependent variable were constructed for cyst prediction, and receiver operating characteristic curves were used to assess the predictive validity of the models. A conditional inference binary decision tree based on a recursive partitioning algorithm was constructed to facilitate preoperative screening. Interobserver agreement was excellent for volume and density, but it varied from poor to good for the radiologic criteria. Volume and root displacement were strong predictors for cyst screening in all analyses. The binary decision tree classifier determined that if the volume of the lesion was >247 mm³, there was 80% probability of a cyst. If volume was <247 mm³ and root displacement was present, cyst probability was 60% (78% accuracy). The good accuracy and high specificity of the decision tree classifier renders it a useful preoperative cyst screening tool that can aid in clinical decision making but not a substitute for definitive histopathological diagnosis after biopsy. Confirmatory studies are required to validate the present findings. Published by Elsevier Inc.
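
    The reported decision rule is simple enough to transcribe directly; the thresholds and probabilities below are taken from the abstract and the function is didactic only, not a validated clinical tool.

        def cyst_probability(volume_mm3, root_displacement):
            """Transcription of the reported decision rule (didactic only):
            volume > 247 mm^3 -> ~80% cyst probability; otherwise the presence of
            root displacement -> ~60%; no probability reported for the last branch."""
            if volume_mm3 > 247:
                return 0.80
            if root_displacement:
                return 0.60
            return None

        print(cyst_probability(300, False))   # 0.8
        print(cyst_probability(150, True))    # 0.6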

  6. The Partition of Multi-Resolution LOD Based on Qtm

    NASA Astrophysics Data System (ADS)

    Hou, M.-L.; Xing, H.-Q.; Zhao, X.-S.; Chen, J.

    2011-08-01

    The partition hierarchy of the Quaternary Triangular Mesh (QTM) determines the accuracy of spatial analysis and applications based on QTM. To resolve the problem that the partition hierarchy of QTM is limited by the capability of the computer hardware, a new method of multi-resolution LOD (Level of Detail) based on QTM is discussed in this paper. The method lets the resolution of the cells vary with the viewpoint position by partitioning the QTM cells, selecting the area of interest according to the viewpoint, and handling the cracks caused by different subdivision levels; in this way it satisfies the requirement of locally unlimited partitioning.

  7. The language faculty that wasn't: a usage-based account of natural language recursion

    PubMed Central

    Christiansen, Morten H.; Chater, Nick

    2015-01-01

    In the generative tradition, the language faculty has been shrinking—perhaps to include only the mechanism of recursion. This paper argues that even this view of the language faculty is too expansive. We first argue that a language faculty is difficult to reconcile with evolutionary considerations. We then focus on recursion as a detailed case study, arguing that our ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on domain-general sequence learning abilities. Evidence from genetics, comparative work on non-human primates, and cognitive neuroscience suggests that humans have evolved complex sequence learning skills, which were subsequently pressed into service to accommodate language. Constraints on sequence learning therefore have played an important role in shaping the cultural evolution of linguistic structure, including our limited abilities for processing recursive structure. Finally, we re-evaluate some of the key considerations that have often been taken to require the postulation of a language faculty. PMID:26379567

  9. Heterogeneity of (18)F-FDG PET combined with expression of EGFR may improve the prognostic stratification of advanced oropharyngeal carcinoma.

    PubMed

    Wang, Hung-Ming; Cheng, Nai-Ming; Lee, Li-Yu; Fang, Yu-Hua Dean; Chang, Joseph Tung-Chieh; Tsan, Din-Li; Ng, Shu-Hang; Liao, Chun-Ta; Yang, Lan-Yan; Yen, Tzu-Chen

    2016-02-01

    Ang's risk profile (based on p16, smoking and cancer stage) is a well-known prognostic factor in oropharyngeal squamous cell carcinoma (OPSCC). Whether heterogeneity in (18)F-fluorodeoxyglucose (FDG) positron emission tomographic (PET) images and epidermal growth factor receptor (EGFR) expression could provide additional information on clinical outcomes in advanced-stage OPSCC was investigated. Patients with stage III-IV OPSCC who completed primary therapy were eligible. Zone-size nonuniformity (ZSNU) extracted from pretreatment FDG PET scans was used as an index of image heterogeneity. EGFR and p16 expression were examined by immunohistochemistry. Disease-specific survival (DSS) and overall survival (OS) served as outcome measures. Kaplan-Meier estimates and Cox proportional hazards regression models were used for survival analysis. A bootstrap resampling technique was applied to investigate the stability of outcomes. Finally, a recursive partitioning analysis (RPA)-based model was constructed. A total of 113 patients were included, of whom 28 were p16-positive. Multivariate analysis identified Ang's risk profile, EGFR and ZSNU as independent predictors of both DSS and OS. Using RPA, the three risk factors were used to devise a prognostic scoring system that successfully predicted DSS in both p16-positive and -negative cases. The c-statistic of the prognostic index for DSS was 0.81, a value which was significantly superior to both AJCC stage (0.60) and Ang's risk profile (0.68). In patients showing Ang's high-risk profile (N = 77), the use of our scoring system clearly identified three distinct prognostic subgroups. It was concluded that this novel index may improve the prognostic stratification of patients with advanced-stage OPSCC. © 2015 UICC.

  10. Pseudobulbar affect as a negative prognostic indicator in amyotrophic lateral sclerosis.

    PubMed

    Tortelli, R; Arcuti, S; Copetti, M; Barone, R; Zecca, C; Capozzo, R; Barulli, M R; Simone, I L; Logroscino, G

    2018-07-01

    To evaluate whether the presence of pseudobulbar affect (PBA) in an early stage of the disease influences survival in a population-based incident cohort of amyotrophic lateral sclerosis (ALS). Incident ALS cases, diagnosed according to El Escorial criteria, were enrolled from a prospective population-based registry in Puglia, Southern Italy. The Center for Neurologic Study-Lability Scale (CNS-LS), a self-administered questionnaire, was used to evaluate PBA. Total scores range from 7 to 35. A score ≥13 was used to identify PBA. Cox proportional hazard models were used for survival analysis. The modified C-statistic for censored survival data was used to assess model discrimination. RECursive Partitioning and AMalgamation (RECPAM) analysis was used to identify subgroups of patients with different patterns of risk, depending on baseline characteristics. We enrolled 94 patients with sporadic ALS (median age: 64 years; range: 26-80). At the censoring date, 65 of 94 (69.2%), 39 of 60 (65.0%), and 26 of 34 (76.5%) patients reached the outcome (tracheotomy/death) in the whole cohort, the non-PBA group, and the PBA group, respectively. Kaplan-Meier survival curves for the two subgroups were not significantly different (log-rank test: 1.3, P = .25). The discrimination ability of a multivariable model with demographic and clinical variables of interest was not improved by adding PBA. In the RECPAM analysis, ALSFRSr and the total score of CNS-LS scale (

  11. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    NASA Astrophysics Data System (ADS)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To execute variance-based global sensitivity analysis efficiently, the law of total variance over successive non-overlapping intervals is first proved, and an efficient space-partition sampling-based approach is then built on it in this paper. By partitioning the output sample points into different subsets according to each input, the proposed approach can evaluate all the main effects concurrently from a single group of sample points. In addition, there is no need to optimize the partition scheme. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which ensures that the convergence condition of the space-partition approach is met. Furthermore, a new interpretation of the partitioning idea is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
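
    The abstract describes the method only at a high level, but the underlying idea, estimating Var(E[Y|X_i]) from non-overlapping partitions of the samples sorted by each input, can be sketched generically. The following Python snippet is a minimal binning estimator of first-order sensitivity indices in that spirit, not the authors' exact algorithm; the test function and bin count are arbitrary choices.

    ```python
    import numpy as np

    def first_order_indices(x, y, n_bins=20):
        """Estimate first-order (main-effect) sensitivity indices S_i by partitioning
        the samples of each input into successive equal-count bins and applying the
        law of total variance: Var(Y) = E[Var(Y | bin)] + Var(E[Y | bin])."""
        x = np.asarray(x)            # shape (n_samples, n_inputs)
        y = np.asarray(y)            # shape (n_samples,)
        var_y = y.var()
        indices = []
        for i in range(x.shape[1]):
            order = np.argsort(x[:, i])               # sort samples by the i-th input
            y_sorted = y[order]
            bins = np.array_split(y_sorted, n_bins)   # successive, non-overlapping subsets
            bin_means = np.array([b.mean() for b in bins])
            bin_sizes = np.array([b.size for b in bins])
            # Weighted variance of the conditional means approximates Var(E[Y | X_i])
            overall = np.average(bin_means, weights=bin_sizes)
            var_cond_mean = np.average((bin_means - overall) ** 2, weights=bin_sizes)
            indices.append(var_cond_mean / var_y)
        return np.array(indices)

    # Toy check on y = sin(x1) + 5*sin(x2)^2 with three independent inputs
    rng = np.random.default_rng(0)
    x = rng.uniform(-np.pi, np.pi, size=(20000, 3))
    y = np.sin(x[:, 0]) + 5.0 * np.sin(x[:, 1]) ** 2
    print(first_order_indices(x, y))   # x3 has no effect, so its index is near 0
    ```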

  12. Generalized recursive solutions to Ornstein-Zernike integral equations

    NASA Astrophysics Data System (ADS)

    Rossky, Peter J.; Dale, William D. T.

    1980-09-01

    Recursive procedures for the solution of a class of integral equations based on the Ornstein-Zernike equation are developed; the hypernetted chain and Percus-Yevick equations are two special cases of the class considered. It is shown that certain variants of the new procedures developed here are formally equivalent to those recently developed by Dale and Friedman, if the new recursive expressions are initialized in the same way as theirs. However, the computational solution of the new equations is significantly more efficient. Further, the present analysis leads to the identification of various graphical quantities arising in the earlier study with more familiar quantities related to pair correlation functions. The analysis is greatly facilitated by the use of several identities relating simple chain sums whose graphical elements can be written as a sum of two or more parts. In particular, the use of these identities permits renormalization of the equivalent series solution to the integral equation to be directly incorporated into the recursive solution in a straightforward manner. Formulas appropriate to renormalization with respect to long and short range parts of the pair potential, as well as more general components of the direct correlation function, are obtained. To further illustrate the utility of this approach, we show that a simple generalization of the hypernetted chain closure relation for the direct correlation function leads directly to the reference hypernetted chain (RHNC) equation due to Lado. The form of the correlation function used in the exponential approximation of Andersen and Chandler is then seen to be equivalent to the first estimate obtained from a renormalized RHNC equation.

  13. N/om, Change, and Social Work: A Recursive Frame Analysis of the Transformative Rituals of the Ju/'hoan Bushmen

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford

    2013-01-01

    The Ju/'hoan Bushman origin myth is depicted as a contextual frame for their healing and transformative ways. Using Recursive Frame Analysis, these performances are shown to be an enactment of the border crossing between First and Second Creation, that is, pre-linguistic and linguistic domains of experience. Here n/om, or the presumed creative…

  14. Anonymizing and Sharing Medical Text Records

    PubMed Central

    Li, Xiao-Bai; Qin, Jialun

    2017-01-01

    Health information technology has increased accessibility of health and medical data and benefited medical research and healthcare management. However, there are rising concerns about patient privacy in sharing medical and healthcare data. A large amount of these data are in free text form. Existing techniques for privacy-preserving data sharing deal largely with structured data. Current privacy approaches for medical text data focus on detection and removal of patient identifiers from the data, which may be inadequate for protecting privacy or preserving data quality. We propose a new systematic approach to extract, cluster, and anonymize medical text records. Our approach integrates methods developed in both data privacy and health informatics fields. The key novel elements of our approach include a recursive partitioning method to cluster medical text records based on the similarity of the health and medical information and a value-enumeration method to anonymize potentially identifying information in the text data. An experimental study is conducted using real-world medical documents. The results of the experiments demonstrate the effectiveness of the proposed approach. PMID:29569650

  15. Proceedings of the second SISAL users' conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feo, J T; Frerking, C; Miller, P J

    1992-12-01

    This report contains papers on the following topics: a SISAL code for computing the Fourier transform on S_N; five ways to fill your knapsack; simulating material dislocation motion in SISAL; candis as an interface for SISAL; parallelisation and performance of the Burg algorithm on a shared-memory multiprocessor; use of genetic algorithm in SISAL to solve the file design problem; implementing FFTs in SISAL; programming and evaluating the performance of signal processing applications in the SISAL programming environment; SISAL and Von Neumann-based languages: translation and intercommunication; an IF2 code generator for ADAM architecture; program partitioning for NUMA multiprocessor computer systems; mapping functional parallelism on distributed memory machines; implicit array copying: prevention is better than cure; mathematical syntax for SISAL; an approach for optimizing recursive functions; implementing arrays in SISAL 2.0; Fol: an object oriented extension to the SISAL language; twine: a portable, extensible SISAL execution kernel; and investigating the memory performance of the optimizing SISAL compiler.

  16. Association Analysis of the Extended MHC Region in Celiac Disease Implicates Multiple Independent Susceptibility Loci

    PubMed Central

    Ahn, Richard; Ding, Yuan Chun; Murray, Joseph; Fasano, Alessio; Green, Peter H. R.; Neuhausen, Susan L.; Garner, Chad

    2012-01-01

    Celiac disease is a common autoimmune disease caused by sensitivity to the dietary protein gluten. Forty loci have been implicated in the disease. All disease loci have been characterized as low-penetrance, with the exception of the high-risk genotypes in the HLA-DQA1 and HLA-DQB1 genes, which are necessary but not sufficient to cause the disease. The very strong effects from the known HLA loci and the genetically complex nature of the major histocompatibility complex (MHC) have precluded a thorough investigation of the region. The purpose of this study was to test the hypothesis that additional celiac disease loci exist within the extended MHC (xMHC). A set of 1898 SNPs was analyzed for association across the 7.6 Mb xMHC region in 1668 confirmed celiac disease cases and 517 unaffected controls. Conditional recursive partitioning was used to create an informative indicator of the known HLA-DQA1 and HLA-DQB1 high-risk genotypes that was included in the association analysis to account for their effects. A linkage disequilibrium-based grouping procedure was utilized to estimate the number of independent celiac disease loci present in the xMHC after accounting for the known effects. There was significant statistical evidence for four new independent celiac disease loci within the classic MHC region. This study is the first comprehensive association analysis of the xMHC in celiac disease that specifically accounts for the known HLA disease genotypes and the genetic complexity of the region. PMID:22615847

  17. A nonparametric clustering technique which estimates the number of clusters

    NASA Technical Reports Server (NTRS)

    Ramey, D. B.

    1983-01-01

    In applications of cluster analysis, one usually needs to determine the number of clusters, K, and the assignment of observations to each cluster. A clustering technique based on recursive application of a multivariate test of bimodality which automatically estimates both K and the cluster assignments is presented.
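
    The abstract does not specify the multivariate bimodality test, so the sketch below substitutes a simple proxy (Sarle's bimodality coefficient on the first principal component) to show the recursive structure: keep splitting while the test indicates two modes, and let the number of clusters K fall out of the recursion. All thresholds, helper names, and the split rule are illustrative assumptions, not the published technique.

    ```python
    import numpy as np

    def bimodality_coefficient(z):
        """Sarle's bimodality coefficient; values above ~0.555 suggest bimodality."""
        n = z.size
        z = z - z.mean()
        m2 = np.mean(z ** 2)
        skew = np.mean(z ** 3) / m2 ** 1.5
        ex_kurt = np.mean(z ** 4) / m2 ** 2 - 3.0
        return (skew ** 2 + 1.0) / (ex_kurt + 3.0 * (n - 1) ** 2 / ((n - 2) * (n - 3)))

    def recursive_cluster(x, min_size=20, threshold=0.555):
        """Recursively split the data while a bimodality proxy indicates two modes.
        The number of clusters K is whatever the recursion produces."""
        if x.shape[0] < 2 * min_size:
            return [x]
        # Project onto the first principal direction (via SVD of the centered data).
        centered = x - x.mean(axis=0)
        direction = np.linalg.svd(centered, full_matrices=False)[2][0]
        proj = centered @ direction
        if bimodality_coefficient(proj) <= threshold:
            return [x]                       # unimodal enough: stop splitting
        cut = np.median(proj)                # crude split point between the two modes
        left, right = x[proj <= cut], x[proj > cut]
        if left.shape[0] == 0 or right.shape[0] == 0:
            return [x]
        return recursive_cluster(left, min_size, threshold) + \
               recursive_cluster(right, min_size, threshold)

    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(6, 1, (200, 2))])
    clusters = recursive_cluster(data)
    print(len(clusters))    # estimated K; 2 for this well-separated example
    ```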

  18. Insight: demographic differences and associations with one-year outcome in schizophrenia and schizoaffective disorder.

    PubMed

    Wiffen, Benjamin D R; Rabinowitz, Jonathan; Fleischhacker, W Wolfgang; David, Anthony S

    2010-10-01

    Insight is increasingly seen as an important variable for study in psychotic illness, particularly in relation to treatment adherence. This study aims to quantify the association of insight with outcome, sociodemographic variables and diagnosis in a large stable patient sample. Data are from a one-year, open-label, international, multicenter trial (n=670) of long-acting risperidone in adult symptomatically stable patients with schizophrenia or schizoaffective disorder. Psychopathology and insight were quantified using the Positive and Negative Syndrome Scale (PANSS). Patients were assessed at four time points over the year of the study. 31.2% of the sample showed clinically significant deficits in insight at baseline. There were no differences based on sex, but significant differences in age and diagnosis, with oldest patients and schizophrenia patients (cf., schizoaffective disorder) showing more deficits. Baseline insight impairment was correlated with change in PANSS score at one year (r=-0.243, p<0.001). Recursive partitioning showed that, of those whose symptoms improved, those whose insight also improved were more likely to complete the trial. Insight is important above and beyond the effects of symptoms for predicting continuation in drug trials. This may have implications for the design and analysis of such trials, as well as suggesting the importance of targeting insight in treatment to increase likelihood of adherence to treatment. There also appear to be small but significant differences in insight based on age and diagnosis within the schizophrenia spectrum.

  19. Reduction of artifacts in computer simulation of breast Cooper's ligaments

    NASA Astrophysics Data System (ADS)

    Pokrajac, David D.; Kuperavage, Adam; Maidment, Andrew D. A.; Bakic, Predrag R.

    2016-03-01

    Anthropomorphic software breast phantoms have been introduced as a tool for quantitative validation of breast imaging systems. Efficacy of the validation results depends on the realism of phantom images. The recursive partitioning algorithm based upon the octree simulation has been demonstrated to be versatile and capable of efficiently generating a large number of phantoms to support virtual clinical trials of breast imaging. Previously, we observed specific artifacts (here labeled "dents") on the boundaries of simulated Cooper's ligaments. In this work, we have demonstrated that these "dents" result from the approximate determination of the closest simulated ligament to an examined subvolume (i.e., octree node) of the phantom. We propose a modification of the algorithm that determines the closest ligament by considering a pre-specified number of neighboring ligaments selected based upon the functions that govern the shape of ligaments simulated in the subvolume. We have qualitatively and quantitatively demonstrated that the modified algorithm can lead to elimination or reduction of dent artifacts in software phantoms. In a proof-of-concept example, we simulated a 450 ml phantom with 333 compartments at 100 micrometer resolution. After the proposed modification, we corrected 148,105 dents, with an average size of 5.27 voxels (5.27 nl). We have also qualitatively analyzed the corresponding improvement in the appearance of simulated mammographic images. The proposed algorithm leads to a reduction of linear and star-like artifacts in simulated phantom projections, which can be attributed to dents. Analysis of a larger number of phantoms is ongoing.

  20. Orthogonal recursive bisection data decomposition for high performance computing in cardiac model simulations: dependence on anatomical geometry.

    PubMed

    Reumann, Matthias; Fitch, Blake G; Rayshubskiy, Aleksandr; Keller, David U J; Seemann, Gunnar; Dossel, Olaf; Pitman, Michael C; Rice, John J

    2009-01-01

    The orthogonal recursive bisection (ORB) algorithm can be used as a data decomposition strategy to distribute a large data set of a cardiac model to a distributed memory supercomputer. It has been shown previously that good scaling results can be achieved using the ORB algorithm for data decomposition. However, the ORB algorithm depends on the distribution of computational load of each element in the data set. In this work we investigated how data decomposition and load balancing depend on different rotations of the anatomical data set, in order to optimize load balancing. The anatomical data set was given by both ventricles of the Visible Female data set in a 0.2 mm resolution. Fiber orientation was included. The data set was rotated by 90 degrees around the x, y and z axes, respectively. By either translating or simply taking the magnitude of the resulting negative coordinates we were able to create 14 data sets of the same anatomy with different orientation and position in the overall volume. Computation load ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50 and 1:100 to investigate the effect of different load ratios on the data decomposition. The ten Tusscher et al. (2004) electrophysiological cell model was used in monodomain simulations of 1 ms simulation time to compare performance using the different data sets and orientations. The simulations were carried out for load ratios 1:10, 1:25 and 1:38.85 on a 512 processor partition of the IBM Blue Gene/L supercomputer. The results show that the data decomposition does depend on the orientation and position of the anatomy in the global volume. The difference in total run time between the data sets is 10 s for a simulation time of 1 ms. This yields a difference of about 28 h for a simulation of 10 s simulation time. However, given larger processor partitions, the difference in run time decreases and becomes less significant. Depending on the processor partition size, future work will have to consider the orientation of the anatomy in the global volume for longer simulation runs.
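
    A generic ORB decomposition can be sketched as follows: recursively cut the element set with axis-aligned, weight-balanced planes, cycling through the coordinate axes. The Python sketch below is a simplified illustration of that idea with an assumed 1:10 non-tissue:tissue load weighting; it is not the Blue Gene/L implementation used in the study, and the function and parameter names are hypothetical.

    ```python
    import numpy as np

    def orb_partition(coords, weights, n_parts, axis=0):
        """Orthogonal recursive bisection: recursively cut the point set with
        axis-aligned planes so that both halves carry roughly equal total weight.
        Returns a list of index arrays, one per partition (n_parts must be 2**k)."""
        idx = np.arange(coords.shape[0])

        def split(indices, parts, ax):
            if parts == 1:
                return [indices]
            order = indices[np.argsort(coords[indices, ax])]   # sort along the cut axis
            cum = np.cumsum(weights[order])
            cut = np.searchsorted(cum, cum[-1] / 2.0)          # weight-balanced cut point
            nxt = (ax + 1) % coords.shape[1]                   # cycle x -> y -> z
            return split(order[:cut + 1], parts // 2, nxt) + \
                   split(order[cut + 1:], parts // 2, nxt)

        return split(idx, n_parts, axis)

    # Toy example: tissue voxels weighted 10x more than non-tissue (cf. the 1:10 ratio)
    rng = np.random.default_rng(0)
    coords = rng.random((10000, 3))
    weights = np.where(rng.random(10000) < 0.3, 10.0, 1.0)
    parts = orb_partition(coords, weights, n_parts=8)
    print([round(weights[p].sum(), 1) for p in parts])   # roughly equal loads per partition
    ```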

  1. Experimental evaluation of a recursive model identification technique for type 1 diabetes.

    PubMed

    Finan, Daniel A; Doyle, Francis J; Palerm, Cesar C; Bevier, Wendy C; Zisser, Howard C; Jovanovic, Lois; Seborg, Dale E

    2009-09-01

    A model-based controller for an artificial beta cell requires an accurate model of the glucose-insulin dynamics in type 1 diabetes subjects. To ensure the robustness of the controller for changing conditions (e.g., changes in insulin sensitivity due to illnesses, changes in exercise habits, or changes in stress levels), the model should be able to adapt to the new conditions by means of a recursive parameter estimation technique. Such an adaptive strategy will ensure that the most accurate model is used for the current conditions, and thus the most accurate model predictions are used in model-based control calculations. In a retrospective analysis, empirical dynamic autoregressive exogenous input (ARX) models were identified from glucose-insulin data for nine type 1 diabetes subjects in ambulatory conditions. Data sets consisted of continuous (5-minute) glucose concentration measurements obtained from a continuous glucose monitor, basal insulin infusion rates and times and amounts of insulin boluses obtained from the subjects' insulin pumps, and subject-reported estimates of the times and carbohydrate content of meals. Two identification techniques were investigated: nonrecursive, or batch methods, and recursive methods. Batch models were identified from a set of training data, whereas recursively identified models were updated at each sampling instant. Both types of models were used to make predictions of new test data. For the purpose of comparison, model predictions were compared to zero-order hold (ZOH) predictions, which were made by simply holding the current glucose value constant for p steps into the future, where p is the prediction horizon. Thus, the ZOH predictions are model free and provide a base case for the prediction metrics used to quantify the accuracy of the model predictions. In theory, recursive identification techniques are needed only when there are changing conditions in the subject that require model adaptation. Thus, the identification and validation techniques were performed with both "normal" data and data collected during conditions of reduced insulin sensitivity. The latter were achieved by having the subjects self-administer a medication, prednisone, for 3 consecutive days. The recursive models were allowed to adapt to this condition of reduced insulin sensitivity, while the batch models were only identified from normal data. Data from nine type 1 diabetes subjects in ambulatory conditions were analyzed; six of these subjects also participated in the prednisone portion of the study. For normal test data, the batch ARX models produced 30-, 45-, and 60-minute-ahead predictions that had average root mean square error (RMSE) values of 26, 34, and 40 mg/dl, respectively. For test data characterized by reduced insulin sensitivity, the batch ARX models produced 30-, 60-, and 90-minute-ahead predictions with average RMSE values of 27, 46, and 59 mg/dl, respectively; the recursive ARX models demonstrated similar performance with corresponding values of 27, 45, and 61 mg/dl, respectively. The identified ARX models (batch and recursive) produced more accurate predictions than the model-free ZOH predictions, but only marginally. For test data characterized by reduced insulin sensitivity, RMSE values for the predictions of the batch ARX models were 9, 5, and 5% more accurate than the ZOH predictions for prediction horizons of 30, 60, and 90 minutes, respectively. 
In terms of RMSE values, the 30-, 60-, and 90-minute predictions of the recursive models were more accurate than the ZOH predictions, by 10, 5, and 2%, respectively. In this experimental study, the recursively identified ARX models resulted in predictions of test data that were similar, but not superior, to the batch models. Even for the test data characteristic of reduced insulin sensitivity, the batch and recursive models demonstrated similar prediction accuracy. The predictions of the identified ARX models were only marginally more accurate than the model-free ZOH predictions. Given the simplicity of the ARX models and the computational ease with which they are identified, however, even modest improvements may justify the use of these models in a model-based controller for an artificial beta cell. 2009 Diabetes Technology Society.
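
    The zero-order-hold baseline used here is simple enough to state in a few lines: hold the current glucose value for p sampling steps and score the result with RMSE. The sketch below assumes 5-minute sampling (so a 30-minute horizon is p = 6) and uses made-up glucose values; it is meant only to make the baseline and the metric concrete, not to reproduce the study's pipeline.

    ```python
    import numpy as np

    def zoh_predictions(glucose, horizon_steps):
        """Zero-order-hold baseline: predict that glucose stays at its current value
        for `horizon_steps` samples into the future."""
        return glucose[:-horizon_steps]          # prediction made at time k for time k + p

    def rmse(pred, actual):
        return float(np.sqrt(np.mean((pred - actual) ** 2)))

    # With 5-minute sampling, a 30-minute horizon is p = 6 steps (assumed example data).
    glucose = np.array([110, 118, 131, 140, 152, 158, 149, 137, 128, 121, 117, 115.0])
    p = 6
    preds = zoh_predictions(glucose, p)
    print(rmse(preds, glucose[p:]))   # baseline that an identified ARX model should beat
    ```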

  2. Recursive regularization step for high-order lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Coreixas, Christophe; Wissocq, Gauthier; Puigt, Guillaume; Boussuge, Jean-François; Sagaut, Pierre

    2017-09-01

    A lattice Boltzmann method (LBM) with enhanced stability and accuracy is presented for various Hermite tensor-based lattice structures. The collision operator relies on a regularization step, which is here improved through a recursive computation of nonequilibrium Hermite polynomial coefficients. In addition to the reduced computational cost of this procedure with respect to the standard one, the recursive step considerably enhances the stability and accuracy of the numerical scheme by properly filtering out second- (and higher-) order nonhydrodynamic contributions in under-resolved conditions. This is first shown in the isothermal case where the simulation of the doubly periodic shear layer is performed with a Reynolds number ranging from 10⁴ to 10⁶, and where a thorough analysis of the case at Re = 3×10⁴ is conducted. In the latter, results obtained using both regularization steps are compared against the Bhatnagar-Gross-Krook LBM for standard (D2Q9) and high-order (D2V17 and D2V37) lattice structures, confirming the tremendous increase in the stability range of the proposed approach. Further comparisons on thermal and fully compressible flows, using the general extension of this procedure, are then conducted through the numerical simulation of Sod shock tubes with the D2V37 lattice. They confirm the stability increase induced by the recursive approach as compared with the standard one.

  3. Recursive computer architecture for VLSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treleaven, P.C.; Hopkins, R.P.

    1982-01-01

    A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is restricted to simple, identical microcomputers, each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth-generation computers by the Japanese. 30 references.

  4. Microarray-based SNP genotyping to identify genetic risk factors of triple-negative breast cancer (TNBC) in South Indian population.

    PubMed

    Aravind Kumar, M; Singh, Vineeta; Naushad, Shaik Mohammad; Shanker, Uday; Lakshmi Narasu, M

    2018-05-01

    In view of the aggressive nature of triple-negative breast cancer (TNBC), which lacks the ER, PR and HER2 receptors and shows a high incidence of drug resistance, a case-control association study was conducted to identify the contributing genetic risk factors for TNBC. A total of 30 TNBC patients and 50 age- and gender-matched controls of Indian origin were screened for 900,000 SNP markers using a microarray-based SNP genotyping approach. The initial PLINK association analysis (p < 0.01, MAF 0.14-0.44, OR 10-24) identified 28 non-synonymous SNPs and one stop-gain mutation in the exonic region as possible determinants of TNBC risk. All 29 SNPs were annotated using ANNOVAR. The interactions between these markers were evaluated using multifactor dimensionality reduction (MDR) analysis. The interactions were in the following order: exm408776 > exm1278309 > rs316389 > rs1651654 > rs635538 > exm1292477. Recursive partitioning analysis (RPA) was performed to construct a decision tree useful in predicting TNBC risk. As shown in this analysis, the rs1651654 and exm585172 SNPs are found to be determinants of TNBC risk. An artificial neural network model was used to generate receiver operating characteristic (ROC) curves, which showed high sensitivity and specificity (AUC = 0.94) for these markers. To conclude, among the 900,000 SNPs tested, CCDC42 exm1292477, ANXA3 exm408776 and SASH1 exm585172 are found to be the most significant genetic predicting factors for TNBC. The interactions among the exm408776, exm1278309, rs316389, rs1651654, rs635538 and exm1292477 SNPs further inflate the risk for TNBC. Targeted analysis of these SNPs and genes alone will also have similar clinical utility in predicting TNBC.

  5. A Diagnostic Model for Impending Death in Cancer Patients: Preliminary Report

    PubMed Central

    Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo

    2015-01-01

    Background We recently identified several highly specific bedside physical signs associated with impending death within 3 days among patients with advanced cancer. In this study, we developed and assessed a diagnostic model for impending death based on these physical signs. Methods We systematically documented 62 physical signs every 12 hours from admission to death or discharge in 357 patients with advanced cancer admitted to acute palliative care units (APCUs) at two tertiary care cancer centers. We used recursive partitioning analysis (RPA) to develop a prediction model for impending death in 3 days using admission data. We validated the model with 5 iterations of 10-fold cross-validation, and also applied the model to APCU days 2/3/4/5/6. Results Among 322/357 (90%) patients with complete data for all signs, the 3-day mortality was 24% on admission. The final model was based on 2 variables (palliative performance scale [PPS] and drooping of nasolabial fold) and had 4 terminal leaves: PPS≤20% and drooping of nasolabial fold present, PPS≤20% and drooping of nasolabial fold absent, PPS 30–60% and PPS ≥ 70%, with 3-day mortality of 94%, 42%, 16% and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79%–84% for subsequent APCU days. Conclusion(s) We developed a diagnostic model for impending death within 3 days based on 2 objective bedside physical signs. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. PMID:26218612
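
    Translated into code, the reported two-variable tree is a pair of nested conditions. The sketch below reproduces the four leaves and their 3-day mortality rates from the abstract; the function name and the exact handling of PPS cut-points are assumptions for illustration only, not the authors' software.

    ```python
    def three_day_mortality(pps_percent: int, nasolabial_drooping: bool) -> float:
        """Approximate 3-day mortality from the reported two-variable tree
        (PPS = palliative performance scale)."""
        if pps_percent <= 20:
            return 0.94 if nasolabial_drooping else 0.42
        if pps_percent <= 60:          # PPS 30-60% leaf
            return 0.16
        return 0.03                    # PPS >= 70% leaf

    print(three_day_mortality(20, nasolabial_drooping=True))   # -> 0.94
    ```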

  6. Recursive computation of mutual potential between two polyhedra

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Masatoshi; Scheeres, Daniel J.

    2013-11-01

    Recursive computation of mutual potential, force, and torque between two polyhedra is studied. Based on formulations by Werner and Scheeres (Celest Mech Dyn Astron 91:337-349, 2005) and Fahnestock and Scheeres (Celest Mech Dyn Astron 96:317-339, 2006) who applied the Legendre polynomial expansion to gravity interactions and expressed each order term by a shape-dependent part and a shape-independent part, this paper generalizes the computation of each order term, giving recursive relations of the shape-dependent part. To consider the potential, force, and torque, we introduce three tensors. This method is applicable to any multi-body systems. Finally, we implement this recursive computation to simulate the dynamics of a two rigid-body system that consists of two equal-sized parallelepipeds.

  7. [On the partition of acupuncture academic schools].

    PubMed

    Yang, Pengyan; Luo, Xi; Xia, Youbing

    2016-05-01

    Extensive attention has been paid to research on acupuncture academic schools; however, a widely accepted method for partitioning acupuncture academic schools is still lacking. In this paper, the historical methods of partitioning acupuncture academic schools are reviewed, and three typical methods, "partition of five schools", "partition of eighteen schools" and "two-stage based partition", are summarized. After a deep analysis of the advantages and disadvantages of these three methods, a new method of partitioning acupuncture academic schools, called "three-stage based partition", is proposed. In this method, the overall body of acupuncture academic schools is first divided into an ancient stage, a modern stage and a contemporary stage, and each school is then divided into its sub-school category. It is believed that this method of partition can not only remedy the weaknesses of current methods, but also explore a new model of inheritance and development, from a different aspect, through the differentiation and interaction of acupuncture academic schools across the three stages.

  8. A Clinical Decision Tree to Predict Whether a Bacteremic Patient Is Infected With an Extended-Spectrum β-Lactamase-Producing Organism.

    PubMed

    Goodman, Katherine E; Lessler, Justin; Cosgrove, Sara E; Harris, Anthony D; Lautenbach, Ebbing; Han, Jennifer H; Milstone, Aaron M; Massey, Colin J; Tamma, Pranita D

    2016-10-01

    Timely identification of extended-spectrum β-lactamase (ESBL) bacteremia can improve clinical outcomes while minimizing unnecessary use of broad-spectrum antibiotics, including carbapenems. However, most clinical microbiology laboratories currently require at least 24 additional hours from the time of microbial genus and species identification to confirm ESBL production. Our objective was to develop a user-friendly decision tree to predict which organisms are ESBL producing, to guide appropriate antibiotic therapy. We included patients ≥18 years of age with bacteremia due to Escherichia coli or Klebsiella species from October 2008 to March 2015 at Johns Hopkins Hospital. Isolates with ceftriaxone minimum inhibitory concentrations ≥2 µg/mL underwent ESBL confirmatory testing. Recursive partitioning was used to generate a decision tree to determine the likelihood that a bacteremic patient was infected with an ESBL producer. Discrimination of the original and cross-validated models was evaluated using receiver operating characteristic curves and by calculation of C-statistics. A total of 1288 patients with bacteremia met eligibility criteria. For 194 patients (15%), bacteremia was due to a confirmed ESBL producer. The final classification tree for predicting ESBL-positive bacteremia included 5 predictors: history of ESBL colonization/infection, chronic indwelling vascular hardware, age ≥43 years, recent hospitalization in an ESBL high-burden region, and ≥6 days of antibiotic exposure in the prior 6 months. The decision tree's positive and negative predictive values were 90.8% and 91.9%, respectively. Our findings suggest that a clinical decision tree can be used to estimate a bacteremic patient's likelihood of infection with ESBL-producing bacteria. Recursive partitioning offers a practical, user-friendly approach for addressing important diagnostic questions. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  9. Recursion and the Competence/Performance Distinction in AGL Tasks

    ERIC Educational Resources Information Center

    Lobina, David J.

    2011-01-01

    The term "recursion" is used in at least four distinct theoretical senses within cognitive science. Some of these senses in turn relate to the different levels of analysis described by David Marr some 20 years ago; namely, the underlying competence capacity (the "computational" level), the performance operations used in real-time processing (the…

  10. Collagen analysis by second-harmonic generation microscopy predicts outcome of luminal breast cancer.

    PubMed

    Natal, Rodrigo A; Vassallo, José; Paiva, Geisilene R; Pelegati, Vitor B; Barbosa, Guilherme O; Mendonça, Guilherme R; Bondarik, Caroline; Derchain, Sophie F; Carvalho, Hernandes F; Lima, Carmen S; Cesar, Carlos L; Sarian, Luís Otávio

    2018-04-01

    Second-harmonic generation microscopy represents an important tool to evaluate extracellular matrix collagen structure, which undergoes changes during cancer progression. Thus, it is potentially relevant to assess breast cancer development. We propose the use of second-harmonic generation images of tumor stroma selected on hematoxylin and eosin-stained slides to evaluate the prognostic value of collagen fibers analyses in peri and intratumoral areas in patients diagnosed with invasive ductal breast carcinoma. Quantitative analyses of collagen parameters were performed using ImageJ software. These parameters presented significantly higher values in peri than in intratumoral areas. Higher intratumoral collagen uniformity was associated with high pathological stages and with the presence of axillary lymph node metastasis. In patients with immunohistochemistry-based luminal subtype, higher intratumoral collagen uniformity and quantity were independently associated with poorer relapse-free and overall survival, respectively. A multivariate response recursive partitioning model determined 12.857 and 11.894 as the best cut-offs for intratumoral collagen quantity and uniformity, respectively. These values have shown high sensitivity and specificity to differentiate distinct outcomes. Values of intratumoral collagen quantity and uniformity exceeding the cut-offs were strongly associated with poorer relapse-free and overall survival. Our findings support a promising prognostic value of quantitative evaluation of intratumoral collagen by second-harmonic generation imaging mainly in the luminal subtype breast cancer.

  11. Bayesian Weibull tree models for survival analysis of clinico-genomic data

    PubMed Central

    Clarke, Jennifer; West, Mike

    2008-01-01

    An important goal of research involving gene expression data for outcome prediction is to establish the ability of genomic data to define clinically relevant risk factors. Recent studies have demonstrated that microarray data can successfully cluster patients into low- and high-risk categories. However, the need exists for models which examine how genomic predictors interact with existing clinical factors and provide personalized outcome predictions. We have developed clinico-genomic tree models for survival outcomes which use recursive partitioning to subdivide the current data set into homogeneous subgroups of patients, each with a specific Weibull survival distribution. These trees can provide personalized predictive distributions of the probability of survival for individuals of interest. Our strategy is to fit multiple models; within each model we adopt a prior on the Weibull scale parameter and update this prior via Empirical Bayes whenever the sample is split at a given node. The decision to split is based on a Bayes factor criterion. The resulting trees are weighted according to their relative likelihood values and predictions are made by averaging over models. In a pilot study of survival in advanced stage ovarian cancer we demonstrate that clinical and genomic data are complementary sources of information relevant to survival, and we use the exploratory nature of the trees to identify potential genomic biomarkers worthy of further study. PMID:18618012

  12. Reoperation and readmission after clipping of an unruptured intracranial aneurysm: a National Surgical Quality Improvement Program analysis.

    PubMed

    Dasenbrock, Hormuzdiyar H; Smith, Timothy R; Rudy, Robert F; Gormley, William B; Aziz-Sultan, M Ali; Du, Rose

    2018-03-01

    OBJECTIVE: Although reoperation and readmission have been used as quality metrics, there are limited data evaluating the rate of, reasons for, and predictors of reoperation and readmission after microsurgical clipping of unruptured aneurysms. METHODS: Adult patients who underwent craniotomy for clipping of an unruptured aneurysm electively were extracted from the prospective National Surgical Quality Improvement Program registry (2011-2014). Multivariable logistic regression and recursive partitioning analysis evaluated the independent predictors of nonroutine hospital discharge, unplanned 30-day reoperation, and readmission. Predictors screened included patient age, sex, comorbidities, American Society of Anesthesiologists (ASA) classification, functional status, aneurysm location, preoperative laboratory values, operative time, and postoperative complications. RESULTS: Among the 460 patients evaluated, 4.2% underwent any reoperation at a median of 7 days (interquartile range [IQR] 2-17 days) postoperatively, and 1.1% required a cranial reoperation. The most common reoperation was ventricular shunt placement (23.5%); other reoperations were tracheostomy, craniotomy for hematoma evacuation, and decompressive hemicraniectomy. Independent predictors of any unplanned reoperation were age greater than 51 years and longer operative time (p ≤ 0.04). Readmission occurred in 6.3% of patients at a median of 6 days (IQR 5-13 days) after discharge from the surgical hospitalization; 59.1% of patients were readmitted within 1 week and 86.4% within 2 weeks of discharge. The most common reason for readmission was seizure (26.7%); other causes of readmission included hydrocephalus, cerebrovascular accidents, and headache. Unplanned readmission was independently associated with age greater than 65 years, Class II or III obesity (body mass index > 35 kg/m²), preoperative hyponatremia, and preoperative anemia (p ≤ 0.04). Readmission was not associated with operative time, complications during the surgical hospitalization, length of stay, or discharge disposition. Recursive partitioning analysis identified the same 4 variables, as well as ASA classification, as associated with unplanned readmission. The most potent predictors of nonroutine hospital discharge (16.7%) were postoperative neurological and cardiopulmonary complications; other predictors were age greater than 51 years, preoperative hyponatremia, African American and Asian race, and a complex vertebrobasilar circulation aneurysm. CONCLUSIONS: In this national analysis, patient age greater than 65 years, Class II or III obesity, preoperative hyponatremia, and anemia were associated with adverse events, highlighting patients who may be at risk for complications after clipping of unruptured cerebral aneurysms. The preponderance of early readmissions highlights the importance of early surveillance and follow-up after discharge; the frequency of readmission for seizure emphasizes the need for additional data evaluating the utility and duration of postcraniotomy seizure prophylaxis. Moreover, readmission was primarily associated with preoperative characteristics rather than metrics of perioperative care, suggesting that readmission may be a suboptimal indicator of the quality of care received during the surgical hospitalization in this patient population.

  13. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

    The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  14. Syntactic Recursion Facilitates and Working Memory Predicts Recursive Theory of Mind

    PubMed Central

    Arslan, Burcu; Hohenberger, Annette; Verbrugge, Rineke

    2017-01-01

    In this study, we focus on the possible roles of second-order syntactic recursion and working memory in terms of simple and complex span tasks in the development of second-order false belief reasoning. We tested 89 Turkish children in two age groups, one younger (4;6–6;5 years) and one older (6;7–8;10 years). Although second-order syntactic recursion is significantly correlated with the second-order false belief task, results of ordinal logistic regressions revealed that the main predictor of second-order false belief reasoning is complex working memory span. Unlike simple working memory and second-order syntactic recursion tasks, the complex working memory task required processing information serially with additional reasoning demands that require complex working memory strategies. Based on our results, we propose that children’s second-order theory of mind develops when they have efficient reasoning rules to process embedded beliefs serially, thus overcoming a possible serial processing bottleneck. PMID:28072823

  15. Model parameter estimation approach based on incremental analysis for lithium-ion batteries without using open circuit voltage

    NASA Astrophysics Data System (ADS)

    Wu, Hongjie; Yuan, Shifei; Zhang, Xi; Yin, Chengliang; Ma, Xuerui

    2015-08-01

    To improve the suitability of a lithium-ion battery model under varying scenarios, such as fluctuating temperature and SoC variation, a dynamic model with parameters updated in real time should be developed. In this paper, an incremental analysis-based auto-regressive exogenous (I-ARX) modeling method is proposed to eliminate the modeling error caused by the OCV effect and improve the accuracy of parameter estimation. Then, its numerical stability, modeling error, and parametric sensitivity are analyzed at different sampling rates (0.02, 0.1, 0.5 and 1 s). To identify the model parameters recursively, a bias-correction recursive least squares (CRLS) algorithm is applied. Finally, pseudo-random binary sequence (PRBS) and urban dynamic driving sequences (UDDSs) profiles are applied to verify the real-time performance and robustness of the newly proposed model and algorithm. Different sampling rates (1 Hz and 10 Hz) and multiple temperature points (5, 25, and 45 °C) are covered in our experiments. The experimental and simulation results indicate that the proposed I-ARX model can provide high accuracy and suitability for parameter identification without using open circuit voltage.
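
    The bias-correction details of the CRLS algorithm are not given in the abstract, but the recursive core is standard recursive least squares with a forgetting factor. The Python sketch below shows that core on a toy first-order ARX system; the class, the forgetting factor, and the toy system are assumptions for illustration, not the paper's implementation.

    ```python
    import numpy as np

    class RecursiveLeastSquares:
        """Standard RLS with exponential forgetting for an ARX model
        y[k] = phi[k]^T theta + e[k]."""
        def __init__(self, n_params, forgetting=0.99, p0=1e3):
            self.theta = np.zeros(n_params)          # parameter estimate
            self.P = np.eye(n_params) * p0           # covariance of the estimate
            self.lam = forgetting

        def update(self, phi, y):
            # phi: 1-D regressor vector, y: scalar measurement
            Pphi = self.P @ phi
            gain = Pphi / (self.lam + phi @ Pphi)            # RLS gain vector
            err = y - phi @ self.theta                       # a-priori prediction error
            self.theta = self.theta + gain * err
            self.P = (self.P - np.outer(gain, phi) @ self.P) / self.lam
            return self.theta

    # Toy first-order ARX example: y[k] = a*y[k-1] + b*u[k-1] + noise
    rng = np.random.default_rng(0)
    a_true, b_true = 0.9, 0.5
    y, u = [0.0], rng.standard_normal(500)
    for k in range(1, 500):
        y.append(a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal())

    rls = RecursiveLeastSquares(n_params=2)
    for k in range(1, 500):
        theta = rls.update(np.array([y[k - 1], u[k - 1]]), y[k])
    print(theta)   # converges towards [0.9, 0.5]
    ```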

  16. Experiments with recursive estimation in astronomical image processing

    NASA Technical Reports Server (NTRS)

    Busko, I.

    1992-01-01

    Recursive estimation concepts have been applied to image enhancement problems since the 1970s. However, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for 2-dimensional images, from the well-known theory of Kalman filtering in one dimension. The historic reasons for application of these techniques to digital images are related to the images' scanned nature, in which the temporal output of a scanner device can be processed on-line by techniques borrowed directly from 1-dimensional recursive signal analysis. However, recursive estimation has particular properties that make it attractive even in modern days, when big computer memories make the full scanned image available to the processor at any given time. One particularly important aspect is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena which have their statistical properties variable in time (or position in a 2-D image). Many image processing methods make underlying stationarity assumptions either for the stochastic field being imaged, for the imaging system properties, or both. They will underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, on the contrary, make it feasible to perform adaptive processing, that is, to process the image by a processor with properties tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by such phenomena as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio and the autocorrelation function. Software was developed under IRAF, and as such will be made available to interested users.
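
    A minimal way to convey the adaptive, recursive idea is a scalar Kalman-style filter run along a scanned line of pixels, with the measurement-noise variance re-estimated from a local window so that the gain adapts to non-stationary noise. The Python sketch below is such a toy, with assumed parameter values and a synthetic 1-D signal; it is not the IRAF software described in the abstract.

    ```python
    import numpy as np

    def recursive_denoise(signal, process_var=1e-2, window=15):
        """Scalar Kalman-style recursive estimate of a scanned line of pixels.
        The measurement-noise variance is re-estimated from a sliding window, so the
        gain adapts to locally varying (non-stationary) statistics."""
        x = signal[0]          # state estimate
        p = 1.0                # estimate variance
        out = np.empty_like(signal, dtype=float)
        out[0] = x
        for k in range(1, len(signal)):
            lo = max(0, k - window)
            meas_var = max(np.var(signal[lo:k + 1]), 1e-6)   # crude local noise estimate
            p = p + process_var                              # predict
            gain = p / (p + meas_var)                        # Kalman gain
            x = x + gain * (signal[k] - x)                   # update with pixel k
            p = (1.0 - gain) * p
            out[k] = x
        return out

    rng = np.random.default_rng(0)
    truth = np.sin(np.linspace(0, 4 * np.pi, 400))
    noisy = truth + rng.normal(0, 0.3, truth.size)
    filtered = recursive_denoise(noisy)
    # Filtered MSE is typically well below the raw-noise MSE for this toy signal.
    print(np.mean((filtered - truth) ** 2), np.mean((noisy - truth) ** 2))
    ```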

  17. Aneurysmal subarachnoid hemorrhage prognostic decision-making algorithm using classification and regression tree analysis.

    PubMed

    Lo, Benjamin W Y; Fukuda, Hitoshi; Angle, Mark; Teitelbaum, Jeanne; Macdonald, R Loch; Farrokhyar, Forough; Thabane, Lehana; Levine, Mitchell A H

    2016-01-01

    Classification and regression tree analysis involves the creation of a decision tree by recursive partitioning of a dataset into more homogeneous subgroups. Thus far, there is scarce literature on using this technique to create clinical prediction tools for aneurysmal subarachnoid hemorrhage (SAH). The classification and regression tree analysis technique was applied to the multicenter Tirilazad database (3551 patients) in order to create the decision-making algorithm. In order to elucidate prognostic subgroups in aneurysmal SAH, neurologic, systemic, and demographic factors were taken into account. The dependent variable used for analysis was the dichotomized Glasgow Outcome Score at 3 months. Classification and regression tree analysis revealed seven prognostic subgroups. Neurological grade, occurrence of post-admission stroke, occurrence of post-admission fever, and age represented the explanatory nodes of this decision tree. Split sample validation revealed classification accuracy of 79% for the training dataset and 77% for the testing dataset. In addition, the occurrence of fever at 1-week post-aneurysmal SAH is associated with increased odds of post-admission stroke (odds ratio: 1.83, 95% confidence interval: 1.56-2.45, P < 0.01). A clinically useful classification tree was generated, which serves as a prediction tool to guide bedside prognostication and clinical treatment decision making. This prognostic decision-making algorithm also shed light on the complex interactions between a number of risk factors in determining outcome after aneurysmal SAH.

  18. Teaching Non-Recursive Binary Searching: Establishing a Conceptual Framework.

    ERIC Educational Resources Information Center

    Magel, E. Terry

    1989-01-01

    Discusses problems associated with teaching non-recursive binary searching in computer language classes, and describes a teacher-directed dialog based on dictionary use that helps students use their previous searching experiences to conceptualize the binary search process. Algorithmic development is discussed and appropriate classroom discussion…
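
    For reference, the non-recursive formulation discussed here is the standard iterative binary search, shown below in Python with the dictionary-lookup framing carried in the comments.

    ```python
    def binary_search(sorted_list, target):
        """Non-recursive (iterative) binary search, mirroring the dictionary analogy:
        repeatedly halve the remaining page range until the word is found."""
        low, high = 0, len(sorted_list) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_list[mid] == target:
                return mid                 # found: return its index
            elif sorted_list[mid] < target:
                low = mid + 1              # discard the lower half
            else:
                high = mid - 1             # discard the upper half
        return -1                          # not present

    words = ["apple", "binary", "dictionary", "search", "zebra"]
    print(binary_search(words, "search"))   # -> 3
    print(binary_search(words, "missing"))  # -> -1
    ```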

  19. A fast ellipse extended target PHD filter using box-particle implementation

    NASA Astrophysics Data System (ADS)

    Zhang, Yongquan; Ji, Hongbing; Hu, Qi

    2018-01-01

    This paper presents a box-particle implementation of the ellipse extended target probability hypothesis density (ET-PHD) filter, called the ellipse extended target box particle PHD (EET-BP-PHD) filter, where the extended targets are described as a Poisson model developed by Gilholm et al. and the term "box" is here equivalent to the term "interval" used in interval analysis. The proposed EET-BP-PHD filter is capable of dynamically tracking multiple ellipse extended targets and estimating the target states and the number of targets, in the presence of clutter measurements, false alarms and missed detections. To derive the PHD recursion of the EET-BP-PHD filter, a suitable measurement likelihood is defined for a given partitioning cell, and the main implementation steps are presented along with the necessary box approximations and manipulations. The limitations and capabilities of the proposed EET-BP-PHD filter are illustrated by simulation examples. The simulation results show that a box-particle implementation of the ET-PHD filter can avoid the high number of particles and reduce computational burden, compared to a particle implementation of that for extended target tracking.

  20. SH_c realization of minimal model CFT: triality, poset and Burge condition

    NASA Astrophysics Data System (ADS)

    Fukuda, M.; Nakamura, S.; Matsuo, Y.; Zhu, R.-D.

    2015-11-01

    Recently an orthogonal basis of the W_N-algebra (AFLT basis) labeled by N-tuple Young diagrams was found in the context of 4D/2D duality. Recursion relations among the basis are summarized in the form of an algebra SH_c which is universal for any N. We show that it has an S_3 automorphism which is referred to as triality. We study the level-rank duality between minimal models, which is a special example of the automorphism. It is shown that the nonvanishing states in both systems are described by N or M Young diagrams with the rows of boxes appropriately shuffled. The reshuffling of rows implies there exists partial ordering of the set which labels them. For the simplest example, one can compute the partition functions for the partially ordered set (poset) explicitly, which reproduces the Rogers-Ramanujan identities. We also study the description of minimal models by SH_c. Simple analysis reproduces some known properties of minimal models, the structure of singular vectors and the N-Burge condition in the Hilbert space.
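
    As a hedged numerical aside, the first Rogers-Ramanujan identity mentioned above can be checked directly by counting: partitions of n into parts that differ pairwise by at least 2 are equinumerous with partitions of n into parts congruent to 1 or 4 modulo 5. The Python sketch below verifies this for small n; it illustrates only the identity, not the SH_c construction itself.

    ```python
    def count_gap2(n, smallest=1):
        """Partitions of n into parts that mutually differ by at least 2
        (parts chosen in increasing order; the next part must be >= previous + 2)."""
        if n == 0:
            return 1
        return sum(count_gap2(n - k, k + 2) for k in range(smallest, n + 1))

    def count_mod5(n):
        """Partitions of n into parts congruent to 1 or 4 modulo 5 (coin-change DP)."""
        parts = [p for p in range(1, n + 1) if p % 5 in (1, 4)]
        counts = [1] + [0] * n
        for p in parts:
            for total in range(p, n + 1):
                counts[total] += counts[total - p]
        return counts[n]

    # First Rogers-Ramanujan identity: both counts agree for every n.
    print(all(count_gap2(n) == count_mod5(n) for n in range(1, 25)))   # -> True
    ```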

  1. Framework for Detection and Localization of Extreme Climate Event with Pixel Recursive Super Resolution

    NASA Astrophysics Data System (ADS)

    Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.

    2017-12-01

    Deep learning techniques have been successfully applied to many problems in climate science and geoscience using massive observed and modeled datasets. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain performance that overshadows previous handcrafted, expert-based methods. The remaining issue is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two deep neural network models: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model enhances the resolution of the input to the localization CNNs. We present the best-performing pixel recursive super resolution networks, which synthesize the details of tropical cyclones in ground-truth data while enhancing their resolution. This approach not only dramatically reduces human effort, but also suggests the possibility of reducing the computing cost required for downscaling to increase data resolution.
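
    As a rough illustration of the detection half of such a framework (not the authors' architecture), a minimal PyTorch-style binary-detection CNN might look like the following; the input size, layer widths, and class count are assumptions.

      # Minimal binary-detection CNN sketch (illustrative only).
      # Assumes a single-channel 2-D climate field, e.g. a 64x64 reanalysis patch.
      import torch
      import torch.nn as nn

      class EventDetector(nn.Module):
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.classifier = nn.Linear(32 * 16 * 16, 2)  # event vs. no event

          def forward(self, x):
              x = self.features(x)
              return self.classifier(x.flatten(1))

      model = EventDetector()
      dummy = torch.randn(4, 1, 64, 64)      # batch of 4 synthetic patches
      print(model(dummy).shape)              # torch.Size([4, 2])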

  2. Using Recursive Regression to Explore Nonlinear Relationships and Interactions: A Tutorial Applied to a Multicultural Education Study

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2009-01-01

    This paper discusses how a seldom-used statistical procedure, recursive regression (RR), can numerically and graphically illustrate data-driven nonlinear relationships and interactions of variables. This routine falls into the family of exploratory techniques, yet a few interesting features make it a valuable complement to factor analysis and…

  3. Comparison of Modeling Approaches for Carbon Partitioning: Impact on Estimates of Global Net Primary Production and Equilibrium Biomass of Woody Vegetation from MODIS GPP

    NASA Astrophysics Data System (ADS)

    Ise, T.; Litton, C. M.; Giardina, C. P.; Ito, A.

    2009-12-01

    Plant partitioning of carbon (C) to above- vs. belowground, to growth vs. respiration, and to short- vs. long-lived tissues exerts a large influence on ecosystem structure and function, with implications for the global C budget. Importantly, outcomes of process-based terrestrial vegetation models are likely to vary substantially with different C partitioning algorithms. However, controls on C partitioning patterns remain poorly quantified, and studies have yielded variable, and at times contradictory, results. A recent meta-analysis of forest studies suggests that the ratio of net primary production (NPP) to gross primary production (GPP) is fairly conservative across large scales. To illustrate the effect of this meta-analysis-based partitioning scheme (MPS), we applied MPS to satellite-based (MODIS) GPP to estimate NPP and compared the result with two global process-based vegetation models (Biome-BGC and VISIT), in order to examine the influence of C partitioning on the C budgets of woody plants. Due to the temperature dependence of maintenance respiration, NPP/GPP predicted by the process-based models increased with latitude, while the ratio remained constant with MPS. Overall, global NPP estimated with MPS was 17 and 27% lower than the process-based models for temperate and boreal biomes, respectively, with smaller differences in the tropics. Global equilibrium biomass of woody plants was then calculated from the NPP estimates and tissue turnover rates from VISIT. Since turnover rates differed greatly across tissue types (i.e., metabolically active vs. structural), global equilibrium biomass estimates were sensitive to the partitioning scheme employed. The MPS estimate of global woody biomass was 7-21% lower than that of the process-based models. In summary, we found that model output for NPP and equilibrium biomass was quite sensitive to the choice of C partitioning scheme. (Figure caption: carbon use efficiency (CUE; NPP/GPP) by forest biome and for the globe; values are means for 2001-2006.)
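
    The contrast between the two partitioning schemes amounts to how the NPP/GPP ratio is chosen; the toy calculation below illustrates this with placeholder values for the constant ratio and for the temperature response, which are not the numbers used in the study.

      import numpy as np

      gpp = np.array([2400.0, 1200.0, 600.0])    # illustrative GPP, g C m-2 yr-1 (tropical, temperate, boreal)
      mean_temp = np.array([26.0, 10.0, -2.0])   # illustrative mean annual temperature, deg C

      # Meta-analysis-based partitioning scheme (MPS): a constant NPP/GPP ratio everywhere.
      cue_constant = 0.50                        # placeholder ratio
      npp_mps = cue_constant * gpp

      # Process-based flavour: maintenance respiration falls with temperature,
      # so the realized NPP/GPP ratio rises toward cold biomes (toy response).
      cue_process = 0.45 + 0.004 * (15.0 - mean_temp)
      npp_process = cue_process * gpp

      print(npp_mps)       # NPP under the constant-ratio scheme
      print(npp_process)   # NPP under the temperature-dependent scheme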

  4. Event-based recursive filtering for a class of nonlinear stochastic parameter systems over fading channels

    NASA Astrophysics Data System (ADS)

    Shen, Yuxuan; Wang, Zidong; Shen, Bo; Alsaadi, Fuad E.

    2018-07-01

    In this paper, the recursive filtering problem is studied for a class of time-varying nonlinear systems with stochastic parameter matrices. The measurement transmission between the sensor and the filter is conducted through a fading channel characterized by the Rice fading model. An event-based transmission mechanism is adopted to decide whether the sensor measurement should be transmitted to the filter. A recursive filter is designed such that, in the simultaneous presence of the stochastic parameter matrices and fading channels, the filtering error covariance is guaranteed to have an upper bound, and such an upper bound is then minimized by appropriately choosing the filter gain matrix. Finally, a simulation example is presented to demonstrate the effectiveness of the proposed filtering scheme.
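
    A toy scalar sketch of the event-based idea is given below: the sensor transmits a measurement only when it deviates sufficiently from the last transmitted value, and the recursive filter propagates its estimate either way. The send-on-delta rule and noise parameters are simplifications for illustration, not the paper's Rice-fading design.

      import numpy as np

      rng = np.random.default_rng(1)
      a, q, r = 0.95, 0.04, 0.25      # state transition, process noise, measurement noise
      delta = 0.5                     # event threshold on the transmitted measurement

      x_true, x_hat, p = 0.0, 0.0, 1.0
      last_sent = 0.0
      for k in range(50):
          x_true = a * x_true + rng.normal(0, np.sqrt(q))
          y = x_true + rng.normal(0, np.sqrt(r))

          # Prediction step (always runs, whether or not a measurement arrives).
          x_hat, p = a * x_hat, a * a * p + q

          # Event-based transmission: send only if the measurement moved enough.
          if abs(y - last_sent) >= delta:
              last_sent = y
              gain = p / (p + r)
              x_hat = x_hat + gain * (y - x_hat)
              p = (1 - gain) * p

      print("final estimate:", round(x_hat, 3), "true state:", round(x_true, 3))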

  5. A prediction model of drug-induced ototoxicity developed by an optimal support vector machine (SVM) method.

    PubMed

    Zhou, Shu; Li, Guo-Bo; Huang, Lu-Yi; Xie, Huan-Zhang; Zhao, Ying-Lan; Chen, Yu-Zong; Li, Lin-Li; Yang, Sheng-Yong

    2014-08-01

    Drug-induced ototoxicity, as a toxic side effect, is an important issue that needs to be considered in drug discovery. Nevertheless, current experimental methods used to evaluate drug-induced ototoxicity are often time-consuming and expensive, indicating that they are not suitable for large-scale evaluation of drug-induced ototoxicity in the early stage of drug discovery. In this investigation, we therefore established an effective computational prediction model of drug-induced ototoxicity using an optimal support vector machine (SVM) method, GA-CG-SVM. Three GA-CG-SVM models were developed based on three training sets containing agents with different risk levels of drug-induced ototoxicity. For comparison, models based on naïve Bayesian (NB) and recursive partitioning (RP) methods were also built on the same training sets. Among all the prediction models, the GA-CG-SVM model II showed the best performance, offering prediction accuracies of 85.33% and 83.05% for two independent test sets, respectively. Overall, the good performance of the GA-CG-SVM model II indicates that it could be used for the prediction of drug-induced ototoxicity in the early stage of drug discovery. Copyright © 2014 Elsevier Ltd. All rights reserved.
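
    The three model families compared above map onto standard scikit-learn estimators; a generic comparison of the same kind, with synthetic descriptors standing in for the ototoxicity training sets and without the GA-based parameter search, might look like this sketch.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC
      from sklearn.naive_bayes import GaussianNB
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 20))                     # synthetic molecular descriptors
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 300) > 0).astype(int)

      models = {
          "SVM (RBF)": SVC(C=10.0, gamma=0.05),          # C and gamma would normally be tuned, e.g. by GA/grid search
          "naive Bayes": GaussianNB(),
          "recursive partitioning": DecisionTreeClassifier(max_depth=4, random_state=0),
      }
      for name, model in models.items():
          acc = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name}: {acc:.3f}")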

  6. Institutional Gender Equity Salary Analysis and Recursive Impact of Career and Life Choices

    ERIC Educational Resources Information Center

    Peterson, Teri S.

    2013-01-01

    This study employed mixed methods, engaging both quantitative and qualitative inquiries. In terms of the quantitative inquiry, the purpose of this study was to explore and assess gender-based salary inequities at a Carnegie Classified Research High university in the Intermountain West. Qualitative inquiry was used to follow up and contextually…

  7. An observational study identifying obese subgroups among older adults at increased risk of mobility disability: do perceptions of the neighborhood environment matter?

    PubMed

    King, Abby C; Salvo, Deborah; Banda, Jorge A; Ahn, David K; Gill, Thomas M; Miller, Michael; Newman, Anne B; Fielding, Roger A; Siordia, Carlos; Moore, Spencer; Folta, Sara; Spring, Bonnie; Manini, Todd; Pahor, Marco

    2015-12-18

    Obesity is an increasingly prevalent condition among older adults, yet relatively little is known about how built environment variables may be associated with obesity in older age groups. This is particularly the case for more vulnerable older adults already showing functional limitations associated with subsequent disability. The Lifestyle Interventions and Independence for Elders (LIFE) trial dataset (n = 1600) was used to explore the associations between perceived built environment variables and baseline obesity levels. Age-stratified recursive partitioning methods were applied to identify distinct subgroups with varying obesity prevalence. Among participants aged 70-78 years, four distinct subgroups, defined by combinations of perceived environment and race-ethnicity variables, were identified. The subgroups with the lowest obesity prevalence (45.5-59.4%) consisted of participants who reported living in neighborhoods with higher residential density. Among participants aged 79-89 years, the subgroup (of three distinct subgroups identified) with the lowest obesity prevalence (19.4%) consisted of non-African American/Black participants who reported living in neighborhoods with friends or acquaintances similar in demographic characteristics to themselves. Overall support for the partitioned subgroupings was obtained using mixed model regression analysis. The results suggest that, in combination with race/ethnicity, features of the perceived neighborhood built and social environments differentiated distinct groups of vulnerable older adults from different age strata that differed in obesity prevalence. Pending further verification, the results may help to inform subsequent targeting of such subgroups for further investigation. Clinicaltrials.gov Identifier =  NCT01072500.

  8. Predictors of trying to lose weight among overweight and obese Mexican-Americans: a signal detection analysis.

    PubMed

    Bersamin, Andrea; Hanni, Krista D; Winkleby, Marilyn A

    2009-01-01

    Signal detection analysis, a form of recursive partitioning, was used to identify combinations of sociodemographic and acculturation factors that predict trying to lose weight in a community-based sample of 957 overweight and obese Mexican-American adults (ages 18-69 years). Data were pooled from the 2004 and 2006 Behavioral Risk Factor Surveillance System conducted in a low-income, semi-rural community in California. Overall, 59 % of the population reported trying to lose weight. The proportion of adults who were trying to lose weight was highly variable across the seven mutually exclusive groups identified by signal detection (range 30-79 %). Significant predictors of trying to lose weight included BMI, gender, age and income. Women who were very overweight (BMI > 28.5 kg/m2) were most likely to be trying to lose weight (79 %), followed by very overweight higher-income men and moderately overweight (BMI = 25.0-28.5 kg/m2) higher-income women (72 % and 70 %, respectively). Moderately overweight men, aged 28-69 years, were the least likely to be trying to lose weight (30 %), followed by moderately overweight lower-income women (47 %) and very overweight lower-income men (49 %). The latter group is of particular concern since they have characteristics associated with medical complications of obesity (low education and poor access to medical care). Our findings highlight opportunities and challenges for public health professionals working with overweight Mexican-American adults - particularly lower-income adults who were born in Mexico - who are not trying to lose weight and are therefore at high risk for obesity-related co-morbidities.

  9. Citation-based Estimation of Scholarly Activity Among Domestic Academic Radiation Oncologists: Five-Year Update.

    PubMed

    Choi, Mehee; Holliday, Emma B; Jagsi, Reshma; Wilson, Lynn D; Fuller, Clifton D; Thomas, Charles R

    2014-03-01

    To analyze up-to-date Hirsch index (h-index) data to estimate the scholarly productivity of academic radiation oncology faculty. Bibliometric citation database searches were performed for radiation oncology faculty at domestic residency-training institutions. Outcomes analyzed included the number of manuscripts, number of citations, and h-index between 1996 and 2012. Analyses of overall h-index rankings with stratification by academic ranking, gender, and departmental faculty size were performed. One thousand thirty-seven radiation oncologists from 87 programs were included. Overall, the mean h-index was 10.8. Among the top 10% by h-index, 38% were chairpersons, all were senior faculty, and 11% were women. As expected, a higher h-index was associated with higher academic ranking and senior faculty status. Recursive partitioning analysis revealed an h-index threshold of 20 (p < 0.001) as an identified breakpoint between senior and junior faculty. Furthermore, h-index breakpoints of 12 (p < 0.001) and 25 (p < 0.001) were identified between the assistant professor vs. associate professor, and associate professor vs. professor levels, respectively. Multivariate analysis identified higher academic ranking, male gender, and larger departmental faculty size as independent variables associated with a higher h-index. The current results suggest an overall rise in scholarly citation metrics among domestic academic radiation oncologists, with a current mean h-index of 10.8 vs. 8.5 in 2008. Significant relationships exist between the h-index and academic rank, gender, and departmental size. The results offer up-to-date benchmarks for comparing academic radiation oncologists with the national average and potentially have utility in appointment and promotion decisions.
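
    The h-index itself is straightforward to compute from per-publication citation counts; a minimal helper is shown below.

      def h_index(citations):
          """Largest h such that the author has h papers with at least h citations each."""
          counts = sorted(citations, reverse=True)
          h = 0
          for i, c in enumerate(counts, start=1):
              if c >= i:
                  h = i
              else:
                  break
          return h

      print(h_index([25, 18, 12, 9, 6, 3, 1]))  # -> 5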

  10. Simple recursion relations for general field theories

    DOE PAGES

    Cheung, Clifford; Shen, Chia -Hsien; Trnka, Jaroslav

    2015-06-17

    On-shell methods offer an alternative definition of quantum field theory at tree-level, replacing Feynman diagrams with recursion relations and interaction vertices with a handful of seed scattering amplitudes. In this paper we determine the simplest recursion relations needed to construct a general four-dimensional quantum field theory of massless particles. For this purpose we define a covering space of recursion relations which naturally generalizes all existing constructions, including those of BCFW and Risager. The validity of each recursion relation hinges on the large momentum behavior of an n-point scattering amplitude under an m-line momentum shift, which we determine solely from dimensional analysis, Lorentz invariance, and locality. We show that all amplitudes in a renormalizable theory are 5-line constructible. Amplitudes are 3-line constructible if an external particle carries spin or if the scalars in the theory carry equal charge under a global or gauge symmetry. Remarkably, this implies the 3-line constructibility of all gauge theories with fermions and complex scalars in arbitrary representations, all supersymmetric theories, and the standard model. Moreover, all amplitudes in non-renormalizable theories without derivative interactions are constructible; with derivative interactions, a subset of amplitudes is constructible. We illustrate our results with examples from both renormalizable and non-renormalizable theories. In conclusion, our study demonstrates both the power and limitations of recursion relations as a self-contained formulation of quantum field theory.

  11. On the Hosoya index of a family of deterministic recursive trees

    NASA Astrophysics Data System (ADS)

    Chen, Xufeng; Zhang, Jingyuan; Sun, Weigang

    2017-01-01

    In this paper, we calculate the Hosoya index of a family of deterministic recursive trees in which new nodes are attached to existing nodes according to a prescribed rule. We then obtain a recursive solution for the Hosoya index based on determinant operations. The computational complexity of our proposed algorithm is O(log₂ n), with n being the network size, which is lower than that of existing numerical methods. Finally, we give a weighted tree shrinking method as a graphical interpretation of the recurrence formula for the Hosoya index.
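
    For trees, the Hosoya index (the number of matchings, including the empty one) can also be computed by a direct recursion over the tree. The sketch below uses the standard two-state dynamic programme on a rooted tree, not the determinant-based recursion proposed in the paper.

      def hosoya_index_of_tree(children, root=0):
          """Number of matchings of a rooted tree given as {node: [children]}."""
          def count(v):
              # unmatched = matchings of v's subtree with v not matched
              # matched   = matchings of v's subtree with v matched to one of its children
              unmatched, matched = 1, 0
              for c in children.get(v, []):
                  cu, cm = count(c)
                  matched = matched * (cu + cm) + unmatched * cu
                  unmatched *= cu + cm
              return unmatched, matched

          u, m = count(root)
          return u + m

      # Path on 4 vertices 0-1-2-3: matchings are {}, {01}, {12}, {23}, {01,23} -> 5
      print(hosoya_index_of_tree({0: [1], 1: [2], 2: [3]}))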

  12. An iterative network partition algorithm for accurate identification of dense network modules

    PubMed Central

    Sun, Siqi; Dong, Xinran; Fu, Yao; Tian, Weidong

    2012-01-01

    A key step in network analysis is to partition a complex network into dense modules. Currently, modularity is one of the most popular benefit functions used to partition network modules. However, recent studies suggested that it has an inherent limitation in detecting dense network modules. In this study, we observed that despite the limitation, modularity has the advantage of preserving the primary network structure of the undetected modules. Thus, we have developed a simple iterative Network Partition (iNP) algorithm to partition a network. The iNP algorithm provides a general framework in which any modularity-based algorithm can be implemented in the network partition step. Here, we tested iNP with three modularity-based algorithms: multi-step greedy (MSG), spectral clustering and Qcut. Compared with the original three methods, iNP achieved a significant improvement in the quality of network partition in a benchmark study with simulated networks, identified more modules with significantly better enrichment of functionally related genes in both yeast protein complex network and breast cancer gene co-expression network, and discovered more cancer-specific modules in the cancer gene co-expression network. As such, iNP should have a broad application as a general method to assist in the analysis of biological networks. PMID:22121225
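
    One plausible reading of the iterative idea, sketched with NetworkX's modularity-based community detection, is shown below; the stopping rule, the minimum module size, and the helper name are assumptions for illustration, not the published iNP implementation.

      # Sketch of an iterative modularity-based partition (an interpretation of the
      # iNP idea, not the authors' code): partition the network, then recursively
      # re-partition each module that still splits into more than one community.
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      def iterative_partition(graph, min_size=4):
          modules = []
          for community in greedy_modularity_communities(graph):
              sub = graph.subgraph(community)
              if len(sub) <= min_size:
                  modules.append(set(community))
                  continue
              inner = greedy_modularity_communities(sub)
              if len(inner) <= 1:
                  modules.append(set(community))       # module is already cohesive
              else:
                  modules.extend(iterative_partition(sub, min_size))
          return modules

      G = nx.ring_of_cliques(6, 5)                     # toy benchmark: 6 cliques of size 5
      print([sorted(m) for m in iterative_partition(G)])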

  13. Estimation of citation-based scholarly activity among radiation oncology faculty at domestic residency-training institutions: 1996-2007.

    PubMed

    Choi, Mehee; Fuller, Clifton D; Thomas, Charles R

    2009-05-01

    Advancement in academic radiation oncology is largely contingent on research productivity and the perceived external influence of an individual's scholarly work. The purpose of this study was to use the Hirsch index (h-index) to estimate the research productivity of current radiation oncology faculty at U.S. academic institutions between 1996 and 2007. We performed bibliometric citation database searches for available radiation oncology faculty at domestic residency-training institutions (n = 826). The outcomes analyzed included the total number of manuscripts, total number of citations, and the h-index between 1996 and 2007. Analysis of overall h-index rankings with stratification by academic ranking, junior vs. senior faculty status, and gender was performed. Of the 826 radiation oncologists, the mean h-index was 8.5. Of the individuals in the top 10% by the h-index, 34% were chairpersons, 88% were senior faculty, and 13% were women. A greater h-index was associated with a higher academic ranking and senior faculty status. Recursive partitioning analysis revealed an h-index threshold of 15 (p <0.0001) as an identified breakpoint between the senior and junior faculty. Overall, women had lower h-indexes compared with men (mean, 6.4 vs. 9.4); however, when stratified by academic ranking, the gender differential all but disappeared. Using the h-index as a partial surrogate for research productivity, it appears that radiation oncologists in academia today comprise a prolific group, however, with a highly skewed distribution. According to the present analysis, the h-index correlated with academic ranking. Thus, it potentially has utility in the process of promotion decisions. Overall, women in radiation oncology were less academically productive than men; the possible reasons for the gender differential are discussed.

  14. Can texture analysis of tooth microwear detect within guild niche partitioning in extinct species?

    NASA Astrophysics Data System (ADS)

    Purnell, Mark; Nedza, Christopher; Rychlik, Leszek

    2017-04-01

    Recent work shows that tooth microwear analysis can be applied further back in time and deeper into the phylogenetic history of vertebrate clades than previously thought (e.g. niche partitioning in early Jurassic insectivorous mammals; Gill et al., 2014, Nature). Furthermore, quantitative approaches to analysis based on parameterization of surface roughness are increasing the robustness and repeatability of this widely used dietary proxy. Discriminating between taxa within dietary guilds has the potential to significantly increase our ability to determine resource use and partitioning in fossil vertebrates, but how sensitive is the technique? To address this question we analysed tooth microwear texture in sympatric populations of shrew species (Neomys fodiens, Neomys anomalus, Sorex araneus, Sorex minutus) from Białowieża Forest, Poland. These populations are known to exhibit varying degrees of niche partitioning (Churchfield & Rychlik, 2006, J. Zool.) with greatest overlap between the Neomys species. Sorex araneus also exhibits some niche overlap with N. anomalus, while S. minutus is the most specialised. Multivariate analysis based only on tooth microwear textures recovers the same pattern of niche partitioning. Our results also suggest that tooth textures track seasonal differences in diet. Projecting data from fossils into the multivariate dietary space defined using microwear from extant taxa demonstrates that the technique is capable of subtle dietary discrimination in extinct insectivores.

  15. [Formula: see text]-regularized recursive total least squares based sparse system identification for the error-in-variables.

    PubMed

    Lim, Jun-Seok; Pang, Hee-Suk

    2016-01-01

    In this paper an [Formula: see text]-regularized recursive total least squares (RTLS) algorithm is considered for sparse system identification. Although recursive least squares (RLS) has been successfully applied in sparse system identification, the estimation performance of RLS-based algorithms degrades when both input and output are contaminated by noise (the error-in-variables problem). We propose an algorithm to handle the error-in-variables problem. The proposed [Formula: see text]-RTLS algorithm is an RLS-like iteration using the [Formula: see text] regularization. The proposed algorithm not only gives excellent performance but also reduces the required complexity through effective handling of the inverse matrix. Simulations demonstrate the superiority of the proposed [Formula: see text]-regularized RTLS in the sparse system identification setting.
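
    For orientation, a plain recursive least squares identifier, without the total-least-squares correction or the regularization discussed above, can be written in a few lines; this is the textbook RLS update, not the authors' algorithm.

      import numpy as np

      def rls_identify(x, d, order=4, lam=0.99, delta=100.0):
          """Textbook RLS FIR identification: returns the estimated coefficient vector."""
          w = np.zeros(order)
          P = delta * np.eye(order)
          buf = np.zeros(order)
          for n in range(len(x)):
              buf = np.roll(buf, 1)
              buf[0] = x[n]                         # regressor of current and past inputs
              k = P @ buf / (lam + buf @ P @ buf)   # gain vector
              e = d[n] - w @ buf                    # a priori error
              w = w + k * e
              P = (P - np.outer(k, buf @ P)) / lam
          return w

      rng = np.random.default_rng(0)
      h_true = np.array([0.8, -0.4, 0.2, 0.1])
      x = rng.normal(size=2000)
      d = np.convolve(x, h_true)[: len(x)] + 0.01 * rng.normal(size=len(x))
      print(rls_identify(x, d))                     # should approach h_true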

  16. Method for implementation of recursive hierarchical segmentation on parallel computers

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Inventor)

    2005-01-01

    A method, computer readable storage, and apparatus for implementing a recursive hierarchical segmentation algorithm on a parallel computing platform. The method includes setting a bottom level of recursion that defines where a recursive division of an image into sections stops dividing, and setting an intermediate level of recursion where the recursive division changes from a parallel implementation into a serial implementation. The segmentation algorithm is implemented according to the set levels. The method can also include setting a convergence check level of recursion with which the first level of recursion communicates with when performing a convergence check.
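
    The level-controlled recursion described here (divide in parallel down to an intermediate level, then serially down to a bottom level) can be sketched as follows; the quadrant split, thread pool, and per-section placeholder are illustrative choices, not the patented implementation.

      # Sketch of level-controlled recursive image division (illustrative only).
      from concurrent.futures import ThreadPoolExecutor
      import numpy as np

      BOTTOM_LEVEL = 3        # recursion stops dividing at this level
      INTERMEDIATE_LEVEL = 1  # switch from parallel to serial recursion here

      def segment_section(section):
          """Stand-in for per-section segmentation work (e.g., region merging)."""
          return [float(section.mean())]

      def recurse(section, level=0):
          if level >= BOTTOM_LEVEL or min(section.shape) < 2:
              return segment_section(section)
          h, w = section.shape
          quads = [section[:h // 2, :w // 2], section[:h // 2, w // 2:],
                   section[h // 2:, :w // 2], section[h // 2:, w // 2:]]
          if level < INTERMEDIATE_LEVEL:
              with ThreadPoolExecutor(max_workers=4) as pool:   # parallel recursion
                  parts = list(pool.map(lambda q: recurse(q, level + 1), quads))
          else:                                                 # serial recursion
              parts = [recurse(q, level + 1) for q in quads]
          results = []
          for p in parts:
              results.extend(p)
          return results

      image = np.arange(64 * 64, dtype=float).reshape(64, 64)
      print(len(recurse(image)))   # 4**BOTTOM_LEVEL leaf sections = 64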

  17. Partitioning Strategy Using Static Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Seo, Yongjin; Soo Kim, Hyeon

    2016-08-01

    Flight software is the software used in satellites' on-board computers. It has requirements such as real-time operation and reliability, and the IMA (Integrated Modular Avionics) architecture is used to satisfy them. The IMA architecture introduces the concept of partitions, which affects the configuration of flight software: software that was previously loaded on a single system must now be divided into many partitions when loaded. To address this issue, existing studies have relied on experience-based partitioning methods, but such methods cannot be reused. In this respect, this paper proposes a partitioning method, based on static analysis techniques, that is reusable and consistent.

  18. The Massachusetts abscess rule: a clinical decision rule using ultrasound to identify methicillin-resistant Staphylococcus aureus in skin abscesses.

    PubMed

    Gaspari, Romolo J; Blehar, David; Polan, David; Montoya, Anthony; Alsulaibikh, Amal; Liteplo, Andrew

    2014-05-01

    Treatment failure rates for incision and drainage (I&D) of skin abscesses have increased in recent years and may be attributable to an increased prevalence of community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA). Previous authors have described sonographic features of abscesses, such as the presence of interstitial fluid, characteristics of abscess debris, and depth of abscess cavity. It is possible that the sonographic features are associated with MRSA and can be used to predict the presence of MRSA. The authors describe a potential clinical decision rule (CDR) using sonographic images to predict the presence of CA-MRSA. This was a pilot CDR derivation study using databases from two emergency departments (EDs) of patients presenting to the ED with uncomplicated skin abscesses who underwent I&D and culture of the abscess contents. Patients underwent ultrasound (US) imaging of the abscesses prior to I&D. Abscess contents were sent for culture and sensitivity. Two independent physicians experienced in soft tissue US blinded to the culture results and clinical data reviewed the images in a standardized fashion for the presence or absence of the predetermined image characteristics. In the instance of a disagreement between the initial two investigators, a third reviewer adjudicated the findings prior to analysis. The association between the primary outcome (presence of MRSA) and each sonographic feature was assessed using univariate and multivariate analysis. The reliability of each sonographic feature was measured by calculating the kappa (κ) coefficient of interobserver agreement. The decision tree model for the CDR was created with recursive partitioning using variables that were both reliable and strongly associated with MRSA. Of the total of 2,167 patients who presented with skin and soft tissue infections during the study period, 605 patients met inclusion criteria with US imaging and culture and sensitivity of purulence. Among the pathogenic organisms, MRSA was the most frequently isolated, representing 50.1% of all patients. Six of the sonographic features were associated with the presence of MRSA, but only four of these features were reliable using the kappa analysis. Recursive partitioning identified three independent variables that were both associated with MRSA and reliable: 1) the lack of a well-defined edge, 2) small volume, and 3) irregular or indistinct shape. This decision rule demonstrates a sensitivity of 89.2% (95% confidence interval [CI] = 84.7% to 92.7%), a specificity of 44.7% (95% CI = 40.9% to 47.8%), a positive predictive value of 57.9 (95% CI = 55.0 to 60.2), a negative predictive value of 82.9 (95% CI = 75.9 to 88.5), and an odds ratio (OR) of 7.0 (95% CI = 4.0 to 12.2). According to our putative CDR, patients with skin abscesses that are small, irregularly shaped, or indistinct, with ill-defined edges, are seven times more likely to demonstrate MRSA on culture. © 2014 by the Society for Academic Emergency Medicine.

  19. Cytologic separation of branchial cleft cyst from metastatic cystic squamous cell carcinoma: A multivariate analysis of nineteen cytomorphologic features.

    PubMed

    Layfield, Lester J; Esebua, Magda; Schmidt, Robert L

    2016-07-01

    The separation of branchial cleft cysts from metastatic cystic squamous cell carcinomas in adults can be clinically and cytologically challenging. Diagnostic accuracy for separation is reported to be as low as 75% prompting some authors to recommend frozen section evaluation of suspected branchial cleft cysts before resection. We evaluated 19 cytologic features to determine which were useful in this distinction. Thirty-three cases (21 squamous carcinoma and 12 branchial cysts) of histologically confirmed cystic lesions of the lateral neck were graded for the presence or absence of 19 cytologic features by two cytopathologists. The cytologic features were analyzed for agreement between observers and underwent multivariate analysis for correlation with the diagnosis of carcinoma. Interobserver agreement was greatest for increased nuclear/cytoplasmic (N/C) ratio, pyknotic nuclei, and irregular nuclear membranes. Recursive partitioning analysis showed increased N/C ratio, small clusters of cells, and irregular nuclear membranes were the best discriminators. The distinction of branchial cleft cysts from cystic squamous cell carcinoma is cytologically difficult. Both digital image analysis and p16 testing have been suggested as aids in this separation, but analysis of cytologic features remains the main method for diagnosis. In an analysis of 19 cytologic features, we found that high nuclear cytoplasmic ratio, irregular nuclear membranes, and small cell clusters were most helpful in their distinction. Diagn. Cytopathol. 2016;44:561-567. © 2016 Wiley Periodicals, Inc.

  20. The process and utility of classification and regression tree methodology in nursing research

    PubMed Central

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-01-01

    Aim This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Background Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Design Discussion paper. Data sources English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984–2013. Discussion Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Implications for Nursing Research Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Conclusion Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. PMID:24237048

  1. The process and utility of classification and regression tree methodology in nursing research.

    PubMed

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-06-01

    This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Discussion paper. English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984-2013. Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. © 2013 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  2. Use of genomic recursions and algorithm for proven and young animals for single-step genomic BLUP analyses--a simulation study.

    PubMed

    Fragomeni, B O; Lourenco, D A L; Tsuruta, S; Masuda, Y; Aguilar, I; Misztal, I

    2015-10-01

    The purpose of this study was to examine the accuracy of genomic selection via single-step genomic BLUP (ssGBLUP) when the direct inverse of the genomic relationship matrix (G) is replaced by an approximation of G(-1) based on recursions for young genotyped animals conditioned on a subset of proven animals, termed the algorithm for proven and young animals (APY). With an efficient implementation, this algorithm has a computational cost that is cubic in the number of proven animals and linear in the number of young animals. Ten duplicate data sets mimicking a dairy cattle population were simulated. In the first scenario, genomic information for 20k genotyped bulls, divided into 7k proven and 13k young bulls, was generated for each replicate. In the second scenario, 5k genotyped cows with phenotypes were included in the analysis as young animals. Accuracies (averaged over the 10 replicates) of regular EBV were 0.72 and 0.34 for proven and young animals, respectively. When genomic information was included, they increased to 0.75 and 0.50. No differences were observed between genomic EBV (GEBV) obtained with the regular G(-1) and the approximated G(-1) via the recursive method. In the second scenario, accuracies of GEBV (0.76, 0.51 and 0.59 for proven bulls, young males and young females, respectively) were also higher than those of EBV (0.72, 0.35 and 0.49). Again, no differences between GEBV with the regular G(-1) and with recursions were observed. With the recursive algorithm, the number of iterations to achieve convergence was reduced from 227 to 206 in the first scenario and from 232 to 209 in the second scenario. Cows can be treated as young animals in APY without reducing accuracy. The proposed algorithm can be implemented to reduce computing costs and to overcome current limitations on the number of genotyped animals in the ssGBLUP method. © 2015 Blackwell Verlag GmbH.
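
    For reference, the APY approximation of G(-1) is commonly written in block form with a dense core of proven (genotyped) animals and a diagonal correction for the young animals conditioned on that core; the numpy sketch below follows that common description and is our reading of the published formula, not code from this study.

      import numpy as np

      def apy_g_inverse(G, core_idx, young_idx):
          """Block APY approximation of G^{-1}; result is ordered core animals first, then young."""
          Gcc = G[np.ix_(core_idx, core_idx)]
          Gcn = G[np.ix_(core_idx, young_idx)]
          Gcc_inv = np.linalg.inv(Gcc)               # the only cubic step, and only in the core size
          # Diagonal correction for each young animal conditioned on the core (linear in young animals).
          m = np.array([G[j, j] - G[j, core_idx] @ Gcc_inv @ G[core_idx, j] for j in young_idx])
          Minv = np.diag(1.0 / m)
          top_left = Gcc_inv + Gcc_inv @ Gcn @ Minv @ Gcn.T @ Gcc_inv
          top_right = -Gcc_inv @ Gcn @ Minv
          return np.block([[top_left, top_right],
                           [top_right.T, Minv]])

      rng = np.random.default_rng(0)
      A = rng.normal(size=(8, 5))
      G = A @ A.T + 0.1 * np.eye(8)                  # toy positive-definite stand-in for G
      Ginv_apy = apy_g_inverse(G, core_idx=[0, 1, 2, 3], young_idx=[4, 5, 6, 7])
      print(Ginv_apy.shape)                          # (8, 8)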

  3. Recursive flexible multibody system dynamics using spatial operators

    NASA Technical Reports Server (NTRS)

    Jain, A.; Rodriguez, G.

    1992-01-01

    This paper uses spatial operators to develop new spatially recursive dynamics algorithms for flexible multibody systems. The operator description of the dynamics is identical to that for rigid multibody systems. Assumed-mode models are used for the deformation of each individual body. The algorithms are based on two spatial operator factorizations of the system mass matrix. The first (Newton-Euler) factorization of the mass matrix leads to recursive algorithms for the inverse dynamics, mass matrix evaluation, and composite-body forward dynamics for the systems. The second (innovations) factorization of the mass matrix, leads to an operator expression for the mass matrix inverse and to a recursive articulated-body forward dynamics algorithm. The primary focus is on serial chains, but extensions to general topologies are also described. A comparison of computational costs shows that the articulated-body, forward dynamics algorithm is much more efficient than the composite-body algorithm for most flexible multibody systems.

  4. A diagnostic model for impending death in cancer patients: Preliminary report.

    PubMed

    Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo

    2015-11-01

    Several highly specific bedside physical signs associated with impending death within 3 days for patients with advanced cancer were recently identified. A diagnostic model for impending death based on these physical signs was developed and assessed. Sixty-two physical signs were systematically documented every 12 hours from admission to death or discharge for 357 patients with advanced cancer who were admitted to acute palliative care units (APCUs) at 2 tertiary care cancer centers. Recursive partitioning analysis was used to develop a prediction model for impending death within 3 days with admission data. The model was validated with 5 iterations of 10-fold cross-validation, and the model was also applied to APCU days 2 to 6. For the 322 of 357 patients (90%) with complete data for all signs, the 3-day mortality rate was 24% on admission. The final model was based on 2 variables (Palliative Performance Scale [PPS] and drooping of nasolabial folds) and had 4 terminal leaves: PPS score ≤ 20% and drooping of nasolabial folds present, PPS score ≤ 20% and drooping of nasolabial folds absent, PPS score of 30% to 60%, and PPS score ≥ 70%. The 3-day mortality rates were 94%, 42%, 16%, and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79% to 84% for subsequent APCU days. Based on 2 objective bedside physical signs, a diagnostic model was developed for impending death within 3 days. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. © 2015 American Cancer Society.
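
    The final tree is simple enough to write out directly; the helper below merely encodes the four terminal leaves and the 3-day mortality rates reported in the abstract.

      def impending_death_risk(pps_percent, nasolabial_drooping):
          """Return the reported 3-day mortality rate (%) for the four terminal leaves."""
          if pps_percent <= 20:
              return 94 if nasolabial_drooping else 42
          if pps_percent <= 60:     # PPS 30%-60%
              return 16
          return 3                  # PPS >= 70%

      print(impending_death_risk(20, True))    # 94
      print(impending_death_risk(50, False))   # 16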

  5. Metabolic flux ratio analysis and cell staining suggest the existence of C4 photosynthesis in Phaeodactylum tricornutum.

    PubMed

    Huang, A; Liu, L; Zhao, P; Yang, C; Wang, G C

    2016-03-01

    Mechanisms for carbon fixation via photosynthesis in the diatom Phaeodactylum tricornutum Bohlin were studied recently but there remains a long-standing debate concerning the occurrence of C4 photosynthesis in this species. A thorough investigation of carbon metabolism and the evidence for C4 photosynthesis based on organelle partitioning was needed. In this study, we identified the flux ratios between C3 and C4 compounds in P. tricornutum using (13)C-labelling metabolic flux ratio analysis, and stained cells with various cell-permeant fluorescent probes to investigate the likely organelle partitioning required for single-cell C4 photosynthesis. Metabolic flux ratio analysis indicated the C3/C4 exchange ratios were high. Cell staining indicated organelle partitioning required for single-cell C4 photosynthesis might exist in P. tricornutum. The results of (13)C-labelling metabolic flux ratio analysis and cell staining suggest single-cell C4 photosynthesis exists in P. tricornutum. This study provides insights into photosynthesis patterns of P. tricornutum and the evidence for C4 photosynthesis based on (13)C-labelling metabolic flux ratio analysis and organelle partitioning. © 2015 The Society for Applied Microbiology.

  6. Vehicle Sprung Mass Estimation for Rough Terrain

    DTIC Science & Technology

    2011-03-01

    distributions are greater than zero. The multivariate polynomials are functions of the Legendre polynomials (Poularikas (1999...developed methods based on polynomial chaos theory and on the maximum likelihood approach to estimate the most likely value of the vehicle sprung...mass. The polynomial chaos estimator is compared to benchmark algorithms including recursive least squares, recursive total least squares, extended

  7. Becoming with Data: Developing Self-Assessing Recursive Pedagogies in Schools and Using Second-Order Cybernetics as a Thinking Tool

    ERIC Educational Resources Information Center

    Reinertsen, Anne Beate

    2014-01-01

    This article is about developing school-based self-assessing recursive pedagogies and case/action research practices and/or approaches in schools, and teachers, teacher researchers and researchers simultaneously producing and theorising their own practices using second-order cybernetics as a thinking tool. It is a move towards pragmatic…

  8. Moderate deviations-based importance sampling for stochastic recursive equations

    DOE PAGES

    Dupuis, Paul; Johnson, Dane

    2017-11-17

    Subsolutions to the Hamilton–Jacobi–Bellman equation associated with a moderate deviations approximation are used to design importance sampling changes of measure for stochastic recursive equations. Analogous to what has been done for large deviations subsolution-based importance sampling, these schemes are shown to be asymptotically optimal under the moderate deviations scaling. We present various implementations and numerical results to contrast their performance, and also discuss the circumstances under which a moderate deviation scaling might be appropriate.

  9. Moderate deviations-based importance sampling for stochastic recursive equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupuis, Paul; Johnson, Dane

    Subsolutions to the Hamilton–Jacobi–Bellman equation associated with a moderate deviations approximation are used to design importance sampling changes of measure for stochastic recursive equations. Analogous to what has been done for large deviations subsolution-based importance sampling, these schemes are shown to be asymptotically optimal under the moderate deviations scaling. We present various implementations and numerical results to contrast their performance, and also discuss the circumstances under which a moderate deviation scaling might be appropriate.

  10. Critic: a new program for the topological analysis of solid-state electron densities

    NASA Astrophysics Data System (ADS)

    Otero-de-la-Roza, A.; Blanco, M. A.; Pendás, A. Martín; Luaña, Víctor

    2009-01-01

    In this paper we introduce CRITIC, a new program for the topological analysis of the electron densities of crystalline solids. Two different versions of the code are provided, one adapted to the LAPW (Linear Augmented Plane Wave) density calculated by the WIEN2K package and the other to the ab initio Perturbed Ion (aiPI) density calculated with the PI7 code. Using the converged ground state densities, CRITIC can locate their critical points, determine atomic basins and integrate properties within them, and generate several graphical representations which include topological atomic basins and primary bundles, contour maps of ρ and ∇ρ, vector maps of ∇ρ, chemical graphs, etc.
    Program summary:
      Program title: CRITIC
      Catalogue identifier: AECB_v1_0
      Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECB_v1_0.html
      Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
      Licensing provisions: GPL, version 3
      No. of lines in distributed program, including test data, etc.: 1 206 843
      No. of bytes in distributed program, including test data, etc.: 12 648 065
      Distribution format: tar.gz
      Programming language: FORTRAN 77 and 90
      Computer: Any computer capable of compiling Fortran
      Operating system: Unix, GNU/Linux
      Classification: 7.3
      Nature of problem: Topological analysis of the electron density in periodic solids.
      Solution method: The automatic localization of the electron density critical points is based on a recursive partitioning of the Wigner-Seitz cell into tetrahedra, followed by a Newton search from significant points in each tetrahedron. Plotting of and integration over the atomic basins is currently based on a new implementation of Keith's promega algorithm.
      Running time: Variable, depending on the task: seconds to a few minutes for the localization of critical points; hours to days for the determination of the shape and properties of the atomic basins. Times correspond to a typical 2007 PC.

  11. A pipeline VLSI design of fast singular value decomposition processor for real-time EEG system based on on-line recursive independent component analysis.

    PubMed

    Huang, Kuan-Ju; Shih, Wei-Yeh; Chang, Jui Chung; Feng, Chih Wei; Fang, Wai-Chi

    2013-01-01

    This paper presents a pipeline VLSI design of a fast singular value decomposition (SVD) processor for a real-time electroencephalography (EEG) system based on on-line recursive independent component analysis (ORICA). Since SVD is used frequently in computations of the real-time EEG system, a low-latency and high-accuracy SVD processor is essential. During the EEG system process, the proposed SVD processor solves the diagonal, inverse, and inverse square root matrices of the target matrices in real time. In general, SVD requires a large amount of computation in hardware implementations. Therefore, this work proposes a novel data-flow updating concept to support the pipelined VLSI implementation. The SVD processor can greatly improve the feasibility of real-time EEG system applications such as brain-computer interfaces (BCIs). The proposed architecture is implemented using TSMC 90 nm CMOS technology. The EEG raw data sample rate is 128 Hz. The core size of the SVD processor is 580×580 μm², and the operating frequency is 20 MHz. It consumes 0.774 mW of power per execution in the 8-channel EEG system.

  12. SAMPLING AND ANALYSIS OF SEMIVOLATILE AEROSOLS

    EPA Science Inventory

    Denuder based samplers can effectively separate semivolatile gases from particles and 'freeze' the partitioning in time. Conversely, samples collected on filters partition mass according to the conditions of the influent airstream, which may change over time. As a result thes...

  13. Adaptable Iterative and Recursive Kalman Filter Schemes

    NASA Technical Reports Server (NTRS)

    Zanetti, Renato

    2014-01-01

    Nonlinear filters are often very computationally expensive and usually not suitable for real-time applications. Real-time navigation algorithms are typically based on linear estimators, such as the extended Kalman filter (EKF) and, to a much lesser extent, the unscented Kalman filter. The iterated Kalman filter (IKF) and the recursive update filter (RUF) are two algorithms that reduce the consequences of the linearization assumption of the EKF by performing N updates for each new measurement, where N, the number of recursions, is a tuning parameter. This paper introduces an adaptable RUF algorithm to calculate N on the fly; a similar technique can be used for the IKF as well.

  14. A novel recursive Fourier transform for nonuniform sampled signals: application to heart rate variability spectrum estimation.

    PubMed

    Holland, Alexander; Aboy, Mateo

    2009-07-01

    We present a novel method to iteratively calculate discrete Fourier transforms for discrete time signals with sample time intervals that may be widely nonuniform. The proposed recursive Fourier transform (RFT) does not require interpolation of the samples to uniform time intervals, and each iterative transform update of N frequencies has computational order N. Because of the inherent non-uniformity in the time between successive heart beats, an application particularly well suited for this transform is power spectral density (PSD) estimation for heart rate variability. We compare RFT based spectrum estimation with Lomb-Scargle Transform (LST) based estimation. PSD estimation based on the LST also does not require uniform time samples, but the LST has a computational order greater than N log(N). We conducted an assessment study involving the analysis of quasi-stationary signals with various levels of randomly missing heart beats. Our results indicate that the RFT leads to comparable estimation performance to the LST with significantly less computational overhead and complexity for applications requiring iterative spectrum estimations.
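
    The core idea, updating a set of Fourier coefficients one nonuniformly spaced sample at a time, can be sketched as below; this is a generic incremental Fourier sum with assumed frequencies and an assumed test signal, not the authors' exact recursion or its update bookkeeping.

      import numpy as np

      class IncrementalFourier:
          """Accumulate Fourier coefficients sample-by-sample at arbitrary sample times."""
          def __init__(self, freqs_hz):
              self.freqs = np.asarray(freqs_hz, dtype=float)
              self.coeffs = np.zeros(len(self.freqs), dtype=complex)
              self.n = 0

          def update(self, t, x):
              # O(N) work per new sample for N tracked frequencies, no interpolation needed.
              self.coeffs += x * np.exp(-2j * np.pi * self.freqs * t)
              self.n += 1

          def power(self):
              return np.abs(self.coeffs / max(self.n, 1)) ** 2

      # Irregular "beat" times with jitter; the signal is modulated at 0.1 Hz.
      rng = np.random.default_rng(0)
      t = np.cumsum(0.8 + 0.4 * rng.random(600))
      x = np.sin(2 * np.pi * 0.1 * t)

      est = IncrementalFourier(freqs_hz=np.linspace(0.02, 0.4, 20))
      for ti, xi in zip(t, x):
          est.update(ti, xi)
      print(est.freqs[np.argmax(est.power())])   # peaks at the 0.1 Hz modulation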

  15. Lord-Wingersky Algorithm Version 2.0 for Hierarchical Item Factor Models with Applications in Test Scoring, Scale Alignment, and Model Fit Testing.

    PubMed

    Cai, Li

    2015-06-01

    Lord and Wingersky's (Appl Psychol Meas 8:453-461, 1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined on a grid formed by direct products of quadrature points. However, the increase in computational burden remains exponential in the number of dimensions, making the implementation of the recursive algorithm cumbersome for truly high-dimensional models. In this paper, a dimension reduction method that is specific to the Lord-Wingersky recursions is developed. This method can take advantage of the restrictions implied by hierarchical item factor models, e.g., the bifactor model, the testlet model, or the two-tier model, such that a version of the Lord-Wingersky recursive algorithm can operate on a dramatically reduced set of quadrature points. For instance, in a bifactor model, the dimension of integration is always equal to 2, regardless of the number of factors. The new algorithm not only provides an effective mechanism to produce summed score to IRT scaled score translation tables properly adjusted for residual dependence, but leads to new applications in test scoring, linking, and model fit checking as well. Simulated and empirical examples are used to illustrate the new applications.
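
    For the unidimensional dichotomous case, the original recursion the paper extends is short: the summed-score distribution at a fixed ability value is built up one item at a time. The sketch below is that textbook version, not the dimension-reduced algorithm proposed in the paper.

      def summed_score_distribution(p_correct):
          """Lord-Wingersky recursion for dichotomous items at a fixed ability value.

          p_correct[i] is P(item i correct | theta); returns a list whose entry s is
          P(summed score = s | theta).
          """
          dist = [1.0]                           # before any item, score 0 with certainty
          for p in p_correct:
              new = [0.0] * (len(dist) + 1)
              for s, prob in enumerate(dist):
                  new[s] += prob * (1.0 - p)     # item answered incorrectly
                  new[s + 1] += prob * p         # item answered correctly
              dist = new
          return dist

      # Three items with correctness probabilities 0.8, 0.6, 0.4 at some theta.
      print(summed_score_distribution([0.8, 0.6, 0.4]))   # probabilities sum to 1.0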

  16. ESTimating plant phylogeny: lessons from partitioning

    PubMed Central

    de la Torre, Jose EB; Egan, Mary G; Katari, Manpreet S; Brenner, Eric D; Stevenson, Dennis W; Coruzzi, Gloria M; DeSalle, Rob

    2006-01-01

    Background While Expressed Sequence Tags (ESTs) have proven a viable and efficient way to sample genomes, particularly those for which whole-genome sequencing is impractical, phylogenetic analysis using ESTs remains difficult. Sequencing errors and orthology determination are the major problems when using ESTs as a source of characters for systematics. Here we develop methods to incorporate EST sequence information in a simultaneous analysis framework to address controversial phylogenetic questions regarding the relationships among the major groups of seed plants. We use an automated, phylogenetically derived approach to orthology determination called OrthologID to generate a phylogeny based on 43 process partitions, many of which are derived from ESTs, and examine several measures of support to assess the utility of EST data for phylogenies. Results A maximum parsimony (MP) analysis resulted in a single tree with relatively high support at all nodes in the tree despite rampant conflict among trees generated from the separate analysis of individual partitions. In a comparison of broader-scale groupings based on cellular compartment (i.e., chloroplast, mitochondrial or nuclear) or function, only the nuclear partition tree (based largely on EST data) was found to be topologically identical to the tree based on the simultaneous analysis of all data. Despite topological conflict among the broader-scale groupings examined, only the tree based on morphological data showed statistically significant differences. Conclusion Based on the amount of character support contributed by EST data, which make up a majority of the nuclear data set, and the lack of conflict of the nuclear data set with the simultaneous analysis tree, we conclude that the inclusion of EST data does provide a viable and efficient approach to address phylogenetic questions within a parsimony framework on a genomic scale, if problems of orthology determination and potential sequencing errors can be overcome. In addition, approaches that examine conflict and support in a simultaneous analysis framework allow for a more precise understanding of the evolutionary history of individual process partitions and may be a novel way to understand functional aspects of different kinds of cellular classes of gene products. PMID:16776834

  17. Proposed Staging System for Patients With HPV-Related Oropharyngeal Cancer Based on Nasopharyngeal Cancer N Categories

    PubMed Central

    Dahlstrom, Kristina R.; Garden, Adam S.; William, William N.; Lim, Ming Yann

    2016-01-01

    Purpose Patients with human papillomavirus (HPV)–related oropharyngeal cancer (OPC) generally present with more advanced disease but have better survival than patients with HPV-unrelated OPC. The current American Joint Committee on Cancer (AJCC)/Union for International Cancer Control (UICC) TNM staging system for OPC was developed for HPV-unrelated OPC. A new staging system is needed to adequately predict outcomes of patients with HPV-related OPC. Patients and Methods Patients with newly diagnosed HPV-positive OPC (by p16 immunohistochemistry or in situ hybridization) treated at our institution from January 2003 through December 2012 were included. By using recursive partitioning analysis (RPA), we developed new stage groupings with both traditional OPC regional lymph node (N) categories and nasopharyngeal carcinoma (NPC) N categories. Survival was estimated by the Kaplan-Meier method, and the relationship between stage and survival was examined by using Cox proportional hazards regression analysis. Results A total of 661 patients with HPV-positive OPC met the inclusion criteria. With the traditional TNM staging system, there was no difference in survival between stages (P = .141). RPA with NPC N categories resulted in more balanced stage groups and better separation between groups for 5-year survival than RPA with traditional OPC N categories. With the stage groupings that were based in part on NPC N categories, the risk of death increased with increasing stage (P for trend < .001), and patients with stage III disease had five times the risk of death versus patients with stage IA disease. Conclusion New stage groupings that are based on primary tumor (T) categories and NPC N categories better separate patients with HPV-positive OPC with respect to survival than does the current AJCC/UICC TNM staging system. Although confirmation of our findings in other patient populations is needed, we propose consideration of NPC N categories as an alternative to the traditional OPC N categories in the new AJCC/UICC TNM staging system that is currently being developed. PMID:26884553

  18. Health monitoring system for transmission shafts based on adaptive parameter identification

    NASA Astrophysics Data System (ADS)

    Souflas, I.; Pezouvanis, A.; Ebrahimi, K. M.

    2018-05-01

    A health monitoring system for a transmission shaft is proposed. The solution is based on real-time identification of the physical characteristics of the transmission shaft, i.e. its stiffness and damping coefficients, using a physically oriented model and linear recursive identification. The efficacy of the suggested condition monitoring system is demonstrated on a prototype transient engine testing facility equipped with a transmission shaft capable of varying its physical properties. Simulation studies reveal that coupling shaft faults can be detected and isolated using the proposed condition monitoring system. In addition, the performance of various recursive identification algorithms is addressed. The results of this work suggest that the health status of engine dynamometer shafts can be monitored using a simple lumped-parameter shaft model and a linear recursive identification algorithm, which makes the concept practically viable.

  19. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.

  20. Brain Network Regional Synchrony Analysis in Deafness

    PubMed Central

    Xu, Lei; Liang, Mao-Jin

    2018-01-01

    Deafness is the most common auditory disease, and its major treatment is cochlear implantation (CI). However, there is still no objective and precise indicator for evaluating the effectiveness of cochlear implantation. The goal of this EEG-based study is to effectively distinguish CI children from prelingually deafened children without cochlear implantation. The proposed method is based on functional connectivity analysis, focusing on brain network regional synchrony. Specifically, we first compute the functional connectivity between each channel pair. Then, we quantify the brain network synchrony among regions of interest (ROIs), computing both intraregional and interregional synchrony. Finally, the synchrony values are concatenated to form the feature vector for an SVM classifier. In addition, we develop a new ROI partition method for the 128-channel EEG recording system; both the existing ROI partition method and the proposed ROI partition method are used in the experiments. Compared with existing EEG signal classification methods, the proposed method achieves significant improvements, reaching 87.20% and 86.30% when the existing and the proposed ROI partition methods are used, respectively. This further demonstrates that the new ROI partition method is comparable to the existing one. PMID:29854776
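
    A minimal sketch of the pipeline outlined in the abstract, assuming Pearson correlation as the channel-pair connectivity measure and synthetic data with hypothetical ROI assignments; the actual connectivity metric, ROI partition, and recordings are not specified here.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def roi_synchrony_features(eeg, roi_labels):
        """eeg: (n_channels, n_samples); roi_labels: ROI index per channel.
        Returns intra- and inter-regional mean connectivity, concatenated."""
        conn = np.corrcoef(eeg)                        # channel-pair functional connectivity
        rois = np.unique(roi_labels)
        feats = []
        for i, a in enumerate(rois):
            for b in rois[i:]:
                block = conn[np.ix_(roi_labels == a, roi_labels == b)]
                if a == b:                             # intraregional synchrony: off-diagonal mean
                    iu = np.triu_indices_from(block, k=1)
                    feats.append(block[iu].mean())
                else:                                  # interregional synchrony
                    feats.append(block.mean())
        return np.array(feats)

    # Hypothetical data: 40 subjects, 128 channels, 500 samples, 8 ROIs
    rng = np.random.default_rng(1)
    roi_labels = rng.integers(0, 8, 128)
    X = np.array([roi_synchrony_features(rng.normal(size=(128, 500)), roi_labels)
                  for _ in range(40)])
    y = rng.integers(0, 2, 40)                         # synthetic CI vs. non-implanted labels
    clf = SVC(kernel="rbf").fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```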

  1. Toward using games to teach fundamental computer science concepts

    NASA Astrophysics Data System (ADS)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, history, and geography, to raise social awareness, and to train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college-level introductory computer science course. The recursion game was evaluated with students in a college-level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.

  2. Electrocardiographic predictors of adverse cardiovascular events in suspected poisoning.

    PubMed

    Manini, Alex F; Nelson, Lewis S; Skolnick, Adam H; Slater, William; Hoffman, Robert S

    2010-06-01

    Poisoning is the second leading cause of injury-related fatality in the USA and the leading cause of cardiac arrest in victims under 40 years of age. The study objective was to define the electrocardiographic (ECG) predictors of adverse cardiovascular events (ACVE) complicating suspected acute poisoning (SAP). This was a case-control study in adults at three tertiary-care hospitals and one regional Poison Control Center. We compared 34 cases of SAP complicated by ACVE to 101 consecutive control patients with uncomplicated SAP. The initial ECG was analyzed for rhythm, intervals, QT dispersion, ischemia, and infarction. ECGs were interpreted by a cardiologist, blinded to study hypothesis and case data. Subjects were 48% male, with mean age 42 +/- 19 years. In addition to clinical suspicion of poisoning in 100% of patients, routine toxicology screens were positive in 77%, most commonly for benzodiazepines, opioids, and/or acetaminophen. Neither the ventricular rate, the QRS duration, nor the presence of infarction predicted the risk of ACVE. However, the rhythm, QTc, QT dispersion, and presence of ischemia correlated with the risk of ACVE. Independent predictors of ACVE based on multivariable logistic regression were prolonged QTc, any non-sinus rhythm, ventricular ectopy, and ischemia. Recursive partitioning analysis identified very low risk criteria (94.1% sensitivity, 96.2% NPV) and high risk criteria (95% specificity). Among patients with SAP, the presence of QTc prolongation, QT dispersion, ventricular ectopy, any non-sinus rhythm, and evidence of ischemia on the initial ECG are strongly associated with ACVE.

  3. Association of Streptomyces community composition determined by PCR-denaturing gradient gel electrophoresis with indoor mold status

    PubMed Central

    Johansson, Elisabet; Reponen, Tiina; Meller, Jarek; Vesper, Stephen; Yadav, Jagjit

    2014-01-01

    Both Streptomyces species and mold species have previously been isolated from moisture-damaged building materials; however, an association between these two groups of microorganisms in indoor environments is not clear. In this study we used a culture-independent method, PCR denaturing gradient gel electrophoresis (PCR-DGGE), to investigate the composition of the Streptomyces community in house dust. Twenty-three dust samples each from two sets of homes categorized as high-mold and low-mold based on mold-specific quantitative PCR analysis were used in the study. Taxonomic identification of prominent bands was performed by cloning and sequencing. Associations between DGGE amplicon band intensities and home mold status were assessed using univariate analyses, as well as multivariate recursive partitioning (decision trees) to test the predictive value of combinations of band intensities. In the final classification tree, a combination of two bands was significantly associated with mold status of the home (p = 0.001). The sequence corresponding to one of the bands in the final decision tree matched a group of Streptomyces species that included S. coelicolor and S. sampsonii, both of which have been isolated from moisture-damaged buildings previously. The closest match for the majority of sequences corresponding to a second band consisted of a group of Streptomyces species that included S. hygroscopicus, an important producer of antibiotics and immunosuppressors. Taken together, the study showed that DGGE can be a useful tool for identifying bacterial species that may be more prevalent in mold-damaged buildings. PMID:25331035

  4. Recursion equations in predicting band width under gradient elution.

    PubMed

    Liang, Heng; Liu, Ying

    2004-06-18

    The evolution of a solute zone under gradient elution is a typical non-linear continuity-equation problem, since the local diffusion coefficient and local migration velocity of the mass cells of solute zones are functions of position and time due to the space- and time-variable mobile phase composition. In this paper, based on mesoscopic approaches (a Lagrangian description, continuity theory and the local equilibrium assumption), the evolution of solute zones in space- and time-dependent fields is described by the iterative addition of the local probability density of the mass cells of solute zones. Furthermore, on the macroscopic level, recursion equations are proposed to simulate zone migration and spreading in reversed-phase high-performance liquid chromatography (RP-HPLC) by directly relating the local retention factor and local diffusion coefficient to the local mobile phase concentration. This approach differs entirely from traditional plate-concept theories with an Eulerian description, since the band width recursion equation is effectively an accumulation of the local diffusion coefficients of the solute zones over discrete time slices. The recursion equations and literature equations were applied to the same experimental RP-HPLC data, and the comparison shows that the recursion equations accurately predict band width under gradient elution.
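
    To make the idea of a band-width recursion concrete, the following sketch advances the zone position with a local migration velocity and accumulates the zone variance from local diffusion coefficients over discrete time slices. The local retention and diffusion relations and all numerical values are invented for illustration and are not the calibrated forms used in the paper.

    ```python
    import numpy as np

    # Hypothetical local relations (illustrative only): retention factor falls with
    # organic fraction phi, diffusion coefficient rises with phi.
    def local_k(phi):
        return 10.0 * np.exp(-4.0 * phi)

    def local_D(phi):
        return 1.0e-9 * (1.0 + 2.0 * phi)        # m^2/s

    u0 = 1.0e-3                  # mobile-phase linear velocity, m/s
    L = 0.10                     # column length, m
    dt = 0.05                    # discrete time slice, s
    phi0, slope = 0.05, 0.001    # linear gradient: phi(t) = phi0 + slope*t

    z, var, t = 0.0, 0.0, 0.0
    while z < L:
        phi = phi0 + slope * t                   # local mobile-phase composition at the zone
        v = u0 / (1.0 + local_k(phi))            # local zone migration velocity
        z += v * dt                              # recursion for zone position
        var += 2.0 * local_D(phi) * dt           # recursion for zone variance (band width^2)
        t += dt

    print(f"elution time ~ {t:.0f} s, band standard deviation ~ {np.sqrt(var) * 1e3:.3f} mm")
    ```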

  5. Recursive inversion of externally defined linear systems

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.; Baram, Yoram

    1988-01-01

    The approximate inversion of an internally unknown linear system, given by its impulse response sequence, by an inverse system having a finite impulse response, is considered. The recursive least squares procedure is shown to have an exact initialization, based on the triangular Toeplitz structure of the matrix involved. The proposed approach also suggests solutions to the problems of system identification and compensation.
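
    A minimal numerical sketch of the problem setup (not the authors' recursive procedure): the lower-triangular Toeplitz convolution matrix built from the impulse response is used to solve, in a least-squares sense, for an FIR inverse whose cascade with the system approximates a unit impulse. The impulse response and tap count are hypothetical.

    ```python
    import numpy as np
    from scipy.linalg import toeplitz

    def fir_inverse(h, n_taps):
        """Least-squares FIR approximation g to the inverse of a system with impulse
        response h, so that the convolution h * g approximates a unit impulse."""
        n_out = len(h) + n_taps - 1
        # Lower-triangular Toeplitz convolution matrix: column k is h delayed by k samples.
        col = np.concatenate([h, np.zeros(n_taps - 1)])
        H = toeplitz(col, np.zeros(n_taps))
        d = np.zeros(n_out)
        d[0] = 1.0                                   # desired overall response: unit impulse
        g, *_ = np.linalg.lstsq(H, d, rcond=None)
        return g

    h = np.array([1.0, 0.6, 0.25, 0.1])              # hypothetical impulse response sequence
    g = fir_inverse(h, n_taps=32)
    print("h*g (first taps):", np.round(np.convolve(h, g)[:5], 3))   # ~ [1, 0, 0, 0, 0]
    ```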

  6. Multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.

  7. Application of recursive approaches to differential orbit correction of near Earth asteroids

    NASA Astrophysics Data System (ADS)

    Dmitriev, Vasily; Lupovka, Valery; Gritsevich, Maria

    2016-10-01

    A comparison of three approaches to the differential orbit correction of celestial bodies was performed: batch least squares fitting, Kalman filtering, and recursive least squares filtering. The first two techniques are well known and widely used (Montenbruck, O. & Gill, E., 2000). Most attention is paid to the algorithm and the details of the program implementation of the recursive least squares filter. The filter's algorithm was derived from recursive least squares techniques that are widely used in data processing applications (Simon, D., 2006). Using a recursive least squares filter makes it possible to process a new set of observational data without reprocessing data that have already been processed. A specific feature of this approach is that the number of observations in a data set may be variable. This makes the recursive least squares filter a more flexible approach than batch least squares (which processes the complete set of observations in each iteration) and Kalman filtering (which updates the state vector at each epoch with new measurements). The advantages of the proposed approach are demonstrated by processing real astrometric observations of near Earth asteroids. The case of 2008 TC3, which was discovered just before its impact with Earth, was studied. There are many closely spaced observations of 2008 TC3 in the interval between discovery and impact, which creates favorable conditions for the use of recursive approaches. All approaches achieve very similar precision in the case of 2008 TC3. At the same time, the recursive least squares approach has much higher performance. Thus, this approach is more favorable for orbit fitting of a celestial body detected shortly before a collision with, or close approach to, the Earth. This work was carried out at MIIGAiK and supported by the Russian Science Foundation, Project no. 14-22-00197. References: O. Montenbruck and E. Gill, "Satellite Orbits, Models, Methods and Applications," Springer-Verlag, 2000, pp. 1-369. D. Simon, "Optimal State Estimation: Kalman, H Infinity, and Nonlinear Approaches," 1st edition. Hoboken, N.J.: Wiley-Interscience, 2006.
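
    The practical point, that new observation batches can be incorporated without reprocessing earlier data, can be illustrated with a simple sketch that accumulates the normal-equation terms batch by batch and recovers the batch least-squares solution. This is only a schematic of sequential estimation, not the authors' orbit-determination code, and all quantities are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x_true = np.array([1.5, -0.7, 2.0])          # hypothetical state-correction vector

    def make_batch(n):
        """Simulated design matrix (partials) and observations for one tracking pass."""
        A = rng.normal(size=(n, 3))
        y = A @ x_true + rng.normal(scale=0.01, size=n)
        return A, y

    # Accumulate normal equations over successive observation batches of varying size.
    N = np.zeros((3, 3))                         # running sum of A^T A
    b = np.zeros(3)                              # running sum of A^T y
    all_A, all_y = [], []
    for n_obs in (5, 12, 7, 20):                 # variable number of observations per batch
        A, y = make_batch(n_obs)
        N += A.T @ A
        b += A.T @ y
        all_A.append(A)
        all_y.append(y)

    x_sequential = np.linalg.solve(N, b)
    x_batch, *_ = np.linalg.lstsq(np.vstack(all_A), np.concatenate(all_y), rcond=None)
    print(np.allclose(x_sequential, x_batch))    # True: same solution, no reprocessing needed
    ```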

  8. A Note on Local Stability Conditions for Two Types of Monetary Models with Recursive Utility

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kenji; Utsunomiya, Hitoshi

    2009-09-01

    This note explores local stability conditions for money-in-utility-function (MIUF) and transaction-costs (TC) models with recursive utility. Although Chen et al. [Chen, B.-L., M. Hsu, and C.-H. Lin, 2008, Inflation and growth: impatience and a qualitative equivalent, Journal of Money, Credit, and Banking, Vol. 40, No. 6, 1310-1323] investigated the relationship between inflation and growth in MIUF and TC models with recursive utility, they conducted only a comparative static analysis in a steady state. By establishing sufficient conditions for local stability, this note proves that impatience should be increasing in consumption and real balances. Increasing impatience, although less plausible from an empirical point of view, receives more support from a theoretical viewpoint.

  9. Optimal hydrograph separation using a recursive digital filter constrained by chemical mass balance, with application to selected Chesapeake Bay watersheds

    USGS Publications Warehouse

    Raffensperger, Jeff P.; Baker, Anna C.; Blomquist, Joel D.; Hopple, Jessica A.

    2017-06-26

    Quantitative estimates of base flow are necessary to address questions concerning the vulnerability and response of the Nation’s water supply to natural and human-induced change in environmental conditions. An objective of the U.S. Geological Survey National Water-Quality Assessment Project is to determine how hydrologic systems are affected by watershed characteristics, including land use, land cover, water use, climate, and natural characteristics (geology, soil type, and topography). An important component of any hydrologic system is base flow, generally described as the part of streamflow that is sustained between precipitation events, fed to stream channels by delayed (usually subsurface) pathways, and more specifically as the volumetric discharge of water, estimated at a measurement site or gage at the watershed scale, which represents groundwater that discharges directly or indirectly to stream reaches and is then routed to the measurement point.Hydrograph separation using a recursive digital filter was applied to 225 sites in the Chesapeake Bay watershed. The recursive digital filter was chosen for the following reasons: it is based in part on the assumption that groundwater acts as a linear reservoir, and so has a physical basis; it has only two adjustable parameters (alpha, obtained directly from recession analysis, and beta, the maximum value of the base-flow index that can be modeled by the filter), which can be determined objectively and with the same physical basis of groundwater reservoir linearity, or that can be optimized by applying a chemical-mass-balance constraint. Base-flow estimates from the recursive digital filter were compared with those from five other hydrograph-separation methods with respect to two metrics: the long-term average fraction of streamflow that is base flow, or base-flow index, and the fraction of days where streamflow is entirely base flow. There was generally good correlation between the methods, with some biased slightly high and some biased slightly low compared to the recursive digital filter. There were notable differences between the days at base flow estimated by the different methods, with the recursive digital filter having a smaller range of values. This was attributed to how the different methods determine cessation of quickflow (the part of streamflow which is not base flow).For 109 Chesapeake Bay watershed sites with available specific conductance data, the parameters of the filter were optimized using a chemical-mass-balance constraint and two different models for the time-dependence of base-flow specific conductance. Sixty-seven models were deemed acceptable and the results compared well with non-optimized results. There are a number of limitations to the optimal hydrograph-separation approach resulting from the assumptions implicit in the conceptual model, the mathematical model, and the approach taken to impose chemical mass balance (including tracer choice). These limitations may be evidenced by poor model results; conversely, poor model fit may provide an indication that two-component separation does not adequately describe the hydrologic system’s runoff response.The results of this study may be used to address a number of questions regarding the role of groundwater in understanding past changes in stream-water quality and forecasting possible future changes, such as the timing and magnitude of land-use and management practice effects on stream and groundwater quality. 
Ongoing and future modeling efforts may benefit from the estimates of base flow as calibration targets or as a means to filter chemical data to model base-flow loads and trends. Ultimately, base-flow estimation might provide the basis for future work aimed at improving the ability to quantify groundwater discharge, not only at the scale of a gaged watershed, but at the scale of individual reaches as well.
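
    As an illustration of a two-parameter recursive digital filter of the kind described (alpha from recession analysis and a maximum base-flow index), the sketch below assumes the Eckhardt form of such a filter; the report's exact filter, initialization, and parameter values may differ, and the streamflow record here is synthetic.

    ```python
    import numpy as np

    def recursive_baseflow_filter(Q, alpha, bfi_max):
        """Two-parameter recursive digital filter for base flow (Eckhardt-type form,
        assumed here): b[t] = ((1-bfi_max)*alpha*b[t-1] + (1-alpha)*bfi_max*Q[t])
                              / (1 - alpha*bfi_max), constrained so that b[t] <= Q[t]."""
        b = np.zeros_like(Q, dtype=float)
        b[0] = Q[0] * bfi_max                    # simple initialization assumption
        for t in range(1, len(Q)):
            b[t] = ((1 - bfi_max) * alpha * b[t - 1] + (1 - alpha) * bfi_max * Q[t]) \
                   / (1 - alpha * bfi_max)
            b[t] = min(b[t], Q[t])               # base flow cannot exceed streamflow
        return b

    # Hypothetical daily streamflow record: a slow recession plus random storm peaks
    rng = np.random.default_rng(3)
    Q = 10 * np.exp(-np.arange(365) / 200.0) + rng.gamma(0.3, 8.0, 365)
    b = recursive_baseflow_filter(Q, alpha=0.98, bfi_max=0.6)
    print("base-flow index:", b.sum() / Q.sum())
    print("fraction of days entirely base flow:", np.mean(np.isclose(b, Q)))
    ```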

  10. Plant interspecies competition for sunlight: a mathematical model of canopy partitioning.

    PubMed

    Nevai, Andrew L; Vance, Richard R

    2007-07-01

    We examine the influence of canopy partitioning on the outcome of competition between two plant species that interact only by mutually shading each other. This analysis is based on a Kolmogorov-type canopy partitioning model for plant species with clonal growth form and fixed vertical leaf profiles (Vance and Nevai in J. Theor. Biol., 2007, to appear). We show that canopy partitioning is necessary for the stable coexistence of the two competing plant species. We also use implicit methods to show that, under certain conditions, the species' nullclines can intersect at most once. We use nullcline endpoint analysis to show that when the nullclines do intersect, and in such a way that they cross, then the resulting equilibrium point is always stable. We also construct surfaces that divide parameter space into regions within which the various outcomes of competition occur, and then study parameter dependence in the locations of these surfaces. The analysis presented here and in a companion paper (Nevai and Vance, The role of leaf height in plant competition for sunlight: analysis of a canopy partitioning model, in review) together shows that canopy partitioning is both necessary and, under appropriate parameter values, sufficient for the stable coexistence of two hypothetical plant species whose structure and growth are described by our model.

  11. A combination of spatial and recursive temporal filtering for noise reduction when using region of interest (ROI) fluoroscopy for patient dose reduction in image guided vascular interventions with significant anatomical motion

    NASA Astrophysics Data System (ADS)

    Setlur Nagesh, S. V.; Khobragade, P.; Ionita, C.; Bednarek, D. R.; Rudin, S.

    2015-03-01

    Because x-ray based image-guided vascular interventions are minimally invasive, they are currently the preferred method of treating disorders such as stroke, arterial stenosis, and aneurysms; however, the x-ray exposure to the patient during long image-guided interventional procedures could cause harmful effects such as cancer in the long run and even tissue damage in the short term. ROI fluoroscopy reduces patient dose by differentially attenuating the incident x-rays outside the region of interest. To reduce the noise in the dose-reduced regions, recursive temporal filtering was previously demonstrated successfully for neurovascular interventions. However, in cardiac interventions, anatomical motion is significant and excessive recursive filtering could cause blur. In this work the effects of three noise-reduction schemes, namely recursive temporal filtering, spatial mean filtering, and a combination of spatial and recursive temporal filtering, were investigated in a simulated ROI dose-reduced cardiac intervention. First, a model to simulate the aortic arch and its movement was built. A coronary stent was used to simulate a bioprosthetic valve used in TAVR procedures and was deployed under dose-reduced ROI fluoroscopy during the simulated heart motion. The images were then retrospectively processed for noise reduction in the periphery using recursive temporal filtering, spatial filtering, and a combination of both. Quantitative metrics for all three noise-reduction schemes are calculated and presented as results. From these it can be concluded that, with significant anatomical motion, a combined spatial and recursive temporal filtering scheme is best suited for reducing the excess quantum noise in the periphery. This new noise-reduction technique in combination with ROI fluoroscopy has the potential for substantial patient-dose savings in cardiac interventions.
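
    A simplified sketch of combining a spatial mean filter with a first-order recursive temporal filter in the dose-reduced periphery while leaving the ROI untouched; the filter strength, ROI geometry, and image data are illustrative assumptions, not the study's processing chain.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def filter_periphery(frames, roi_mask, alpha=0.7, spatial_size=3):
        """Reduce quantum noise outside the ROI of each fluoroscopic frame using a
        spatial mean filter followed by a first-order recursive temporal filter.
        frames: (n_frames, H, W); roi_mask: True inside the full-dose ROI."""
        out = np.empty(frames.shape, dtype=float)
        prev = None
        for i, f in enumerate(frames.astype(float)):
            smoothed = uniform_filter(f, size=spatial_size)      # spatial mean filter
            if prev is not None:                                 # recursive temporal filter
                smoothed = alpha * prev + (1 - alpha) * smoothed
            prev = smoothed
            out[i] = np.where(roi_mask, f, smoothed)             # leave the ROI untouched
        return out

    # Hypothetical noisy sequence with a circular full-dose ROI in the centre
    rng = np.random.default_rng(4)
    frames = rng.poisson(50, size=(30, 128, 128))
    yy, xx = np.mgrid[:128, :128]
    roi = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
    filtered = filter_periphery(frames, roi)
    print("periphery noise std before/after:",
          frames[:, ~roi].std(), filtered[:, ~roi].std())
    ```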

  12. Teaching and learning recursive programming: a review of the research literature

    NASA Astrophysics Data System (ADS)

    McCauley, Renée; Grissom, Scott; Fitzgerald, Sue; Murphy, Laurie

    2015-01-01

    Hundreds of articles have been published on the topics of teaching and learning recursion, yet fewer than 50 of them have published research results. This article surveys the computing education research literature and presents findings on challenges students encounter in learning recursion, mental models students develop as they learn recursion, and best practices in introducing recursion. Effective strategies for introducing the topic include using different contexts such as recurrence relations, programming examples, fractal images, and a description of how recursive methods are processed using a call stack. Several studies compared the efficacy of introducing iteration before recursion and vice versa. The paper concludes with suggestions for future research into how students learn and understand recursion, including a look at the possible impact of instructor attitude and newer pedagogies.

  13. Multi-fidelity Gaussian process regression for prediction of random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parussini, L.; Venturi, D., E-mail: venturi@ucsc.edu; Perdikaris, P.

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.
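
    A minimal scalar sketch of recursive co-kriging in the spirit described above: a Gaussian process is fitted to plentiful low-fidelity data, and a second Gaussian process is then trained on high-fidelity data whose inputs are augmented with the low-fidelity prediction. The kernels, sample sizes, and test functions are assumptions, and the authors' vector-valued, non-separable-covariance framework is not reproduced here.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    # Hypothetical low- and high-fidelity models of the same scalar field
    def f_low(x):
        return 0.5 * np.sin(8 * x) + 0.4 * x

    def f_high(x):
        return np.sin(8 * x) + x ** 2

    rng = np.random.default_rng(5)
    X_lo = rng.uniform(0, 1, (40, 1))            # plentiful cheap surrogate data
    X_hi = rng.uniform(0, 1, (8, 1))             # scarce expensive data
    y_lo, y_hi = f_low(X_lo).ravel(), f_high(X_hi).ravel()

    gp_lo = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                                     normalize_y=True).fit(X_lo, y_lo)

    # Recursive step: augment high-fidelity inputs with the low-fidelity prediction,
    # so the second GP learns the correction between fidelities.
    X_hi_aug = np.hstack([X_hi, gp_lo.predict(X_hi).reshape(-1, 1)])
    gp_hi = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 1.0]),
                                     normalize_y=True).fit(X_hi_aug, y_hi)

    X_test = np.linspace(0, 1, 200).reshape(-1, 1)
    X_test_aug = np.hstack([X_test, gp_lo.predict(X_test).reshape(-1, 1)])
    pred, std = gp_hi.predict(X_test_aug, return_std=True)
    print("max abs error vs. true high-fidelity:",
          np.abs(pred - f_high(X_test).ravel()).max())
    ```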

  14. The Power Plant Operating Data Based on Real-time Digital Filtration Technology

    NASA Astrophysics Data System (ADS)

    Zhao, Ning; Chen, Ya-mi; Wang, Hui-jie

    2018-03-01

    Real-time monitoring of thermal power plant data is the basis for accurately analyzing thermal economy and reconstructing the operating state. Because noise interference is inevitable, real-time filtering of the monitored data is needed to obtain accurate information on the operation of the units and equipment of the thermal power plant. A real-time filtering algorithm cannot use future data to correct the current data, which, compared with traditional filtering algorithms, imposes many constraints. The first-order lag filtering method and the weighted recursive average filtering method can both be used for real-time filtering. This paper analyzes the characteristics of the two filtering methods and their application to real-time processing of simulated data and of thermal power plant operating data. The analysis reveals that the weighted recursive average filtering method achieves very good results when applied to both the simulated and the real-time plant data.
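
    For concreteness, a sketch of the two real-time filters compared in the abstract, a first-order lag filter and a weighted recursive (moving) average with linearly increasing weights, applied to a synthetic noisy plant measurement; the weighting scheme and signal are illustrative assumptions.

    ```python
    import numpy as np

    def first_order_lag(x, a=0.8):
        """y[n] = a*y[n-1] + (1-a)*x[n]; only past data is used, so it runs in real time."""
        y = np.empty(len(x), dtype=float)
        y[0] = x[0]
        for n in range(1, len(x)):
            y[n] = a * y[n - 1] + (1 - a) * x[n]
        return y

    def weighted_recursive_average(x, window=8):
        """Average over the last `window` samples with linearly increasing weights
        (newer samples count more), updated recursively sample by sample."""
        w = np.arange(1, window + 1, dtype=float)
        w /= w.sum()
        y = np.empty(len(x), dtype=float)
        buf = np.full(window, x[0], dtype=float)
        for n in range(len(x)):
            buf = np.roll(buf, -1)
            buf[-1] = x[n]
            y[n] = buf @ w
        return y

    # Hypothetical noisy plant measurement (e.g. a slowly varying temperature)
    rng = np.random.default_rng(6)
    t = np.linspace(0, 60, 600)
    signal = 540 + 2 * np.sin(0.1 * t)
    noisy = signal + rng.normal(0, 1.0, t.size)
    print("RMS error, first-order lag:   ",
          np.sqrt(np.mean((first_order_lag(noisy) - signal) ** 2)))
    print("RMS error, weighted recursive:",
          np.sqrt(np.mean((weighted_recursive_average(noisy) - signal) ** 2)))
    ```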

  15. Recursive analytical solution describing artificial satellite motion perturbed by an arbitrary number of zonal terms

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical first order solution has been developed which describes the motion of an artificial satellite perturbed by an arbitrary number of zonal harmonics of the geopotential. A set of recursive relations for the solution, which was deduced from recursive relations of the geopotential, was derived. The method of solution is based on Von-Zeipel's technique applied to a canonical set of two-body elements in the extended phase space which incorporates the true anomaly as a canonical element. The elements are of Poincare type, that is, they are regular for vanishing eccentricities and inclinations. Numerical results show that this solution is accurate to within a few meters after 500 revolutions.

  16. A fast recursive algorithm for molecular dynamics simulation

    NASA Technical Reports Server (NTRS)

    Jain, A.; Vaidehi, N.; Rodriguez, G.

    1993-01-01

    The present recursive algorithm for solving molecular systems' dynamical equations of motion employs internal variable models that reduce such simulations' computation time by an order of magnitude, relative to Cartesian models. Extensive use is made of spatial operator methods recently developed for analysis and simulation of the dynamics of multibody systems. A factor-of-450 speedup over the conventional O(N-cubed) algorithm is demonstrated for the case of a polypeptide molecule with 400 residues.

  17. A recursive vesicle-based model protocell with a primitive model cell cycle

    NASA Astrophysics Data System (ADS)

    Kurihara, Kensuke; Okura, Yusaku; Matsuo, Muneyuki; Toyota, Taro; Suzuki, Kentaro; Sugawara, Tadashi

    2015-09-01

    Self-organized lipid structures (protocells) have been proposed as an intermediate between nonliving material and cellular life. Synthetic production of model protocells can demonstrate the potential processes by which living cells first arose. While we have previously described a giant vesicle (GV)-based model protocell in which amplification of DNA was linked to self-reproduction, the ability of a protocell to recursively self-proliferate for multiple generations has not been demonstrated. Here we show that newborn daughter GVs can be restored to the status of their parental GVs by pH-induced vesicular fusion of daughter GVs with conveyer GVs filled with depleted substrates. We describe a primitive model cell cycle comprising four discrete phases (ingestion, replication, maturity and division), each of which is selectively activated by a specific external stimulus. The production of recursive self-proliferating model protocells represents a step towards eventual production of model protocells that are able to mimic evolution.

  18. Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers.

    PubMed

    Alonso, Roberto; Monroy, Raúl; Trejo, Luis A

    2016-08-17

    The Domain Name System (DNS) is a critical infrastructure of any network and, not surprisingly, a common target of cybercrime. There are numerous works that analyse higher-level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies in a simple observation: recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes in the DNS social structure. We have built an anomaly-based detection mechanism which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting), and with it the suitability of our abstraction for detecting flooding attacks. To the best of our knowledge, this is the first work to successfully use this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions for this new abstraction, having designed and tested two additional experiments which show promising results for detecting other types of anomalies in recursive DNS servers.
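
    A toy sketch of the kind of time-window "social structure" features the abstract describes: recursive-DNS queries in a window are grouped by source IP, simple group-composition statistics are computed, and a drastic change relative to the previous window flags a possible flood. The feature set and threshold rule are hypothetical, not the paper's detection mechanism.

    ```python
    from collections import defaultdict

    def window_features(queries):
        """queries: iterable of (src_ip, domain) pairs seen in one time window.
        Returns simple group-composition features of the IP -> domain interactions."""
        groups = defaultdict(set)                # each source IP and the domains it queried
        for ip, domain in queries:
            groups[ip].add(domain)
        sizes = [len(d) for d in groups.values()]
        return {
            "n_ips": len(groups),
            "n_unique_domains": len(set().union(*groups.values())) if groups else 0,
            "mean_group_size": sum(sizes) / len(sizes) if sizes else 0.0,
            "max_group_size": max(sizes) if sizes else 0,
        }

    def looks_like_flood(prev, curr, ratio=5.0):
        """Hypothetical rule: flag a window whose query-group structure changes drastically,
        e.g. the number of active IPs or the largest group explodes versus the last window."""
        return (curr["n_ips"] > ratio * max(prev["n_ips"], 1) or
                curr["max_group_size"] > ratio * max(prev["max_group_size"], 1))

    normal = [("10.0.0.%d" % i, "example.org") for i in range(20)]
    attack = [("10.1.%d.%d" % (i // 250, i % 250), "victim%d.example" % i) for i in range(5000)]
    prev, curr = window_features(normal), window_features(attack)
    print(curr, looks_like_flood(prev, curr))
    ```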

  19. Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers

    PubMed Central

    Alonso, Roberto; Monroy, Raúl; Trejo, Luis A.

    2016-01-01

    The Domain Name System (DNS) is a critical infrastructure of any network and, not surprisingly, a common target of cybercrime. There are numerous works that analyse higher-level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies in a simple observation: recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes in the DNS social structure. We have built an anomaly-based detection mechanism which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting), and with it the suitability of our abstraction for detecting flooding attacks. To the best of our knowledge, this is the first work to successfully use this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions for this new abstraction, having designed and tested two additional experiments which show promising results for detecting other types of anomalies in recursive DNS servers. PMID:27548169

  20. Recursive inversion of externally defined linear systems by FIR filters

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.; Baram, Yoram

    1989-01-01

    The approximate inversion of an internally unknown linear system, given by its impulse response sequence, by an inverse system having a finite impulse response, is considered. The recursive least-squares procedure is shown to have an exact initialization, based on the triangular Toeplitz structure of the matrix involved. The proposed approach also suggests solutions to the problem of system identification and compensation.

  1. Propensity score method: a non-parametric technique to reduce model dependence

    PubMed Central

    2017-01-01

    Propensity score analysis (PSA) is a powerful technique that balances pretreatment covariates, making causal inference from observational data as reliable as possible. The use of PSA in the medical literature has increased exponentially in recent years, and the trend continues to rise. This article introduces the rationale behind PSA and then illustrates how to perform PSA in R with the MatchIt package. A variety of methods are available for PS matching, such as nearest-neighbor, full, exact, and genetic matching. The task can be done by simply assigning a string value to the method argument in the matchit() function. The generic summary() and plot() functions can be applied to an object of class matchit to check covariate balance after matching. Furthermore, the package PSAgraphics contains several graphical functions to check covariate balance between treatment groups across strata. If covariate balance is not achieved, one can modify the model specification or use other techniques, such as random forests and recursive partitioning, to better represent the underlying structure between pretreatment covariates and treatment assignment. The process can be repeated until the desired covariate balance is achieved. PMID:28164092

  2. Preliminary clinical outcomes of image-guided 3-dimensional conformal radiotherapy for limited brain metastases instead of stereotactic irradiation referral.

    PubMed

    Ohtakara, Kazuhiro; Hoshi, Hiroaki

    2014-06-01

    To determine the preliminary clinical outcomes of image-guided 3-dimensional conformal radiotherapy (IG-3DCRT) for limited but variably-sized brain metastases (BM). Sixty-two lesions in 24 patients were retrospectively evaluated; out of these patients 75% were ≥ 65 years of age, and 37.5% were categorized into recursive partitioning analysis (RPA) class 3. The median value for the maximum diameter of the lesions was 19 mm (range=4-72 mm). The median sole treatment dose was 36 Gy in 10 fractions. The median survival durations after IG-3DCRT were 12.0 months and 3.2 months for patients categorized into RPA classes ≤ 2 and 3, respectively. Local recurrences occurred in two lesions with a 6-month local control probability of 93.0%. Major toxicities included radiation necrosis in two patients. IG-3DCRT is feasible even for patients with limited BM who are categorized into RPA class 3, and confers clinical outcomes comparable to those of stereotactic radiosurgery, including excellent local control and minimal toxicity even for large tumors. Copyright© 2014 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  3. WE-E-17A-06: Assessing the Scale of Tumor Heterogeneity by Complete Hierarchical Segmentation On MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gensheimer, M; Trister, A; Ermoian, R

    2014-06-15

    Purpose: In many cancers, intratumoral heterogeneity exists in vascular and genetic structure. We developed an algorithm which uses clinical imaging to interrogate different scales of heterogeneity. We hypothesize that heterogeneity of perfusion at large distance scales may correlate with propensity for disease recurrence. We applied the algorithm to initial diagnosis MRI of rhabdomyosarcoma patients to predict recurrence. Methods: The Spatial Heterogeneity Analysis by Recursive Partitioning (SHARP) algorithm recursively segments the tumor image. The tumor is repeatedly subdivided, with each dividing line chosen to maximize signal intensity difference between the two subregions. This process continues to the voxel level, producing segments at multiple scales. Heterogeneity is measured by comparing signal intensity histograms between each segmented region and the adjacent region. We measured the scales of contrast enhancement heterogeneity of the primary tumor in 18 rhabdomyosarcoma patients. Using Cox proportional hazards regression, we explored the influence of heterogeneity parameters on relapse-free survival (RFS). To compare with existing methods, fractal and Haralick texture features were also calculated. Results: The complete segmentation produced by SHARP allows extraction of diverse features, including the amount of heterogeneity at various distance scales, the area of the tumor with the most heterogeneity at each scale, and for a given point in the tumor, the heterogeneity at different scales. 10/18 rhabdomyosarcoma patients suffered disease recurrence. On contrast-enhanced MRI, larger scale of maximum signal intensity heterogeneity, relative to tumor diameter, predicted for shorter RFS (p=0.05). Fractal dimension, fractal fit, and three Haralick features did not predict RFS (p=0.09-0.90). Conclusion: SHARP produces an automatic segmentation of tumor regions and reports the amount of heterogeneity at various distance scales. In rhabdomyosarcoma, RFS was shorter when the primary tumor exhibited larger scale of heterogeneity on contrast-enhanced MRI. If validated on a larger dataset, this imaging biomarker could be useful to help personalize treatment.
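
    A simplified one-dimensional sketch of the recursive-splitting idea behind SHARP: each segment is bisected at the position that maximizes the mean-intensity difference between the two resulting subregions, and that difference is recorded as a heterogeneity measure at each recursion depth (scale). This is an illustration only, not the SHARP implementation, and the intensity profile is synthetic.

    ```python
    import numpy as np
    from collections import defaultdict

    def recursive_split(signal, depth=0, max_depth=4, min_len=4, record=None):
        """Recursively bisect a 1-D intensity profile at the cut that maximizes the
        mean-intensity difference between the two segments, recording that difference
        as a heterogeneity measure at each scale (recursion depth)."""
        if record is None:
            record = defaultdict(list)
        n = len(signal)
        if depth >= max_depth or n < 2 * min_len:
            return record
        cuts = range(min_len, n - min_len + 1)
        diffs = [abs(signal[:c].mean() - signal[c:].mean()) for c in cuts]
        best = list(cuts)[int(np.argmax(diffs))]
        record[depth].append(max(diffs))                  # heterogeneity at this scale
        recursive_split(signal[:best], depth + 1, max_depth, min_len, record)
        recursive_split(signal[best:], depth + 1, max_depth, min_len, record)
        return record

    # Hypothetical contrast-enhancement profile: a tumor with a poorly perfused core
    rng = np.random.default_rng(7)
    profile = np.concatenate([rng.normal(100, 5, 40), rng.normal(60, 5, 30),
                              rng.normal(95, 5, 50)])
    scales = recursive_split(profile)
    for depth, values in sorted(scales.items()):
        print(f"scale {depth}: mean heterogeneity {np.mean(values):.1f}")
    ```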

  4. Research in interactive scene analysis

    NASA Technical Reports Server (NTRS)

    Tenenbaum, J. M.; Garvey, T. D.; Weyl, S. A.; Wolf, H. C.

    1975-01-01

    An interactive scene interpretation system (ISIS) was developed as a tool for constructing and experimenting with man-machine and automatic scene analysis methods tailored for particular image domains. A recently developed region analysis subsystem based on the paradigm of Brice and Fennema is described. Using this subsystem a series of experiments was conducted to determine good criteria for initially partitioning a scene into atomic regions and for merging these regions into a final partition of the scene along object boundaries. Semantic (problem-dependent) knowledge is essential for complete, correct partitions of complex real-world scenes. An interactive approach to semantic scene segmentation was developed and demonstrated on both landscape and indoor scenes. This approach provides a reasonable methodology for segmenting scenes that cannot be processed completely automatically, and is a promising basis for a future automatic system. A program is described that can automatically generate strategies for finding specific objects in a scene based on manually designated pictorial examples.

  5. Fermionic Approach to Weighted Hurwitz Numbers and Topological Recursion

    NASA Astrophysics Data System (ADS)

    Alexandrov, A.; Chapuy, G.; Eynard, B.; Harnad, J.

    2017-12-01

    A fermionic representation is given for all the quantities entering in the generating function approach to weighted Hurwitz numbers and topological recursion. This includes: KP and 2D Toda {τ} -functions of hypergeometric type, which serve as generating functions for weighted single and double Hurwitz numbers; the Baker function, which is expanded in an adapted basis obtained by applying the same dressing transformation to all vacuum basis elements; the multipair correlators and the multicurrent correlators. Multiplicative recursion relations and a linear differential system are deduced for the adapted bases and their duals, and a Christoffel-Darboux type formula is derived for the pair correlator. The quantum and classical spectral curves linking this theory with the topological recursion program are derived, as well as the generalized cut-and-join equations. The results are detailed for four special cases: the simple single and double Hurwitz numbers, the weakly monotone case, corresponding to signed enumeration of coverings, the strongly monotone case, corresponding to Belyi curves and the simplest version of quantum weighted Hurwitz numbers.

  6. Fermionic Approach to Weighted Hurwitz Numbers and Topological Recursion

    NASA Astrophysics Data System (ADS)

    Alexandrov, A.; Chapuy, G.; Eynard, B.; Harnad, J.

    2018-06-01

    A fermionic representation is given for all the quantities entering in the generating function approach to weighted Hurwitz numbers and topological recursion. This includes: KP and 2 D Toda {τ} -functions of hypergeometric type, which serve as generating functions for weighted single and double Hurwitz numbers; the Baker function, which is expanded in an adapted basis obtained by applying the same dressing transformation to all vacuum basis elements; the multipair correlators and the multicurrent correlators. Multiplicative recursion relations and a linear differential system are deduced for the adapted bases and their duals, and a Christoffel-Darboux type formula is derived for the pair correlator. The quantum and classical spectral curves linking this theory with the topological recursion program are derived, as well as the generalized cut-and-join equations. The results are detailed for four special cases: the simple single and double Hurwitz numbers, the weakly monotone case, corresponding to signed enumeration of coverings, the strongly monotone case, corresponding to Belyi curves and the simplest version of quantum weighted Hurwitz numbers.

  7. Predictors of trying to lose weight among overweight and obese Mexican-Americans: a signal detection analysis

    PubMed Central

    Bersamin, Andrea; Hanni, Krista D; Winkleby, Marilyn A

    2017-01-01

    Objective Signal detection analysis, a form of recursive partitioning, was used to identify combinations of sociodemographic and acculturation factors that predict trying to lose weight in a community-based sample of 957 overweight and obese Mexican-American adults (ages 18–69 years). Design Data were pooled from the 2004 and 2006 Behavioral Risk Factor Surveillance System conducted in a low-income, semi-rural community in California. Results Overall, 59 % of the population reported trying to lose weight. The proportion of adults who were trying to lose weight was highly variable across the seven mutually exclusive groups identified by signal detection (range 30–79 %). Significant predictors of trying to lose weight included BMI, gender, age and income. Women who were very overweight (BMI > 28·5 kg/m2) were most likely to be trying to lose weight (79 %), followed by very overweight higher-income men and moderately overweight (BMI = 25·0–28·5 kg/m2) higher-income women (72 % and 70 %, respectively). Moderately overweight men, aged 28–69 years, were the least likely to be trying to lose weight (30 %), followed by moderately overweight lower-income women (47 %) and very overweight lower-income men (49 %). The latter group is of particular concern since they have characteristics associated with medical complications of obesity (low education and poor access to medical care). Conclusions Our findings highlight opportunities and challenges for public health professionals working with overweight Mexican-American adults – particularly lower-income adults who were born in Mexico – who are not trying to lose weight and are therefore at high risk for obesity-related co-morbidities. PMID:18339224

  8. Intratumoral heterogeneity characterized by pretreatment PET in non-small cell lung cancer patients predicts progression-free survival on EGFR tyrosine kinase inhibitor

    PubMed Central

    Paeng, Jin Chul; Keam, Bhumsuk; Kim, Tae Min; Kim, Dong-Wan; Heo, Dae Seog

    2018-01-01

    Intratumoral heterogeneity has been suggested to be an important resistance mechanism leading to treatment failure. We hypothesized that radiologic images could be an alternative method for identification of tumor heterogeneity. We tested heterogeneity textural parameters on pretreatment FDG-PET/CT in order to assess the predictive value of targeted therapy. Subjects with recurrent or metastatic non-small cell lung cancer (NSCLC) harboring an activating EGFR mutation and treated with either gefitinib or erlotinib were reviewed. An exploratory data set (n = 161) and a validation data set (n = 21) were evaluated, and eight parameters were selected for survival analysis. The optimal cutoff value was determined by the recursive partitioning method, and the predictive value was calculated using Harrell’s C-index. Univariate analysis revealed that all eight parameters showed an increased hazard ratio (HR) for progression-free survival (PFS). The highest HR was 6.41 (P<0.01) with co-occurrence (Co) entropy. Increased risk remained present after adjusting for initial stage, performance status (PS), and metabolic volume (MV) (aHR: 4.86, P<0.01). Textural parameters were found to have an incremental predictive value for early EGFR tyrosine kinase inhibitor (TKI) failure compared with the base model of stage and PS (C-index 0.596 vs. 0.662, P = 0.02, by Co entropy). Heterogeneity textural parameters acquired from pretreatment FDG-PET/CT are highly predictive factors for PFS of EGFR TKI in EGFR-mutated NSCLC patients. These parameters are easily applicable to the identification of a subpopulation at increased risk of early EGFR TKI failure. Correlation with genomic alterations should be determined in future studies. PMID:29385152
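
    As an illustration of a co-occurrence (Co) entropy texture parameter of the kind used above, the sketch below quantizes a 2-D image, builds a grey-level co-occurrence matrix for one offset, and returns the entropy of the normalized matrix. The number of grey levels, the offset, and the synthetic "uptake" maps are assumptions, and PET-specific preprocessing is omitted.

    ```python
    import numpy as np

    def cooccurrence_entropy(img, levels=32, offset=(0, 1)):
        """Co-occurrence entropy of a 2-D image: quantize to `levels` grey levels,
        count value pairs separated by `offset`, normalize to a joint probability
        matrix, and return -sum(p*log(p)) over non-zero entries."""
        q = np.digitize(img, np.linspace(img.min(), img.max(), levels + 1)[1:-1])
        dy, dx = offset
        a = q[:q.shape[0] - dy, :q.shape[1] - dx]
        b = q[dy:, dx:]
        glcm = np.zeros((levels, levels))
        np.add.at(glcm, (a.ravel(), b.ravel()), 1)       # accumulate co-occurrence counts
        p = glcm / glcm.sum()
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    # Hypothetical tumor uptake maps: homogeneous vs. heterogeneous
    rng = np.random.default_rng(8)
    homogeneous = rng.normal(5.0, 0.2, (64, 64))
    heterogeneous = homogeneous + 3.0 * (rng.random((64, 64)) > 0.7)
    print("Co entropy (homogeneous):  ", cooccurrence_entropy(homogeneous))
    print("Co entropy (heterogeneous):", cooccurrence_entropy(heterogeneous))
    ```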

  9. Efficient block processing of long duration biotelemetric brain data for health care monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soumya, I.; Zia Ur Rahman, M., E-mail: mdzr-5@ieee.org; Rama Koti Reddy, D. V.

    In a real-time clinical environment, the brain signals which doctors need to analyze are usually very long. Such a scenario can be simplified by partitioning the input signal into several blocks and applying signal conditioning. This paper presents various block-based adaptive filter structures for obtaining high-resolution electroencephalogram (EEG) signals, which estimate the deterministic components of the EEG signal by removing noise. To process these long-duration signals, we propose the Time-domain Block Least Mean Square (TDBLMS) algorithm for brain signal enhancement. To improve filtering capability, we introduce normalization in the weight-update recursion of TDBLMS, which results in the TD-B-normalized-LMS. To increase accuracy and resolution in the proposed noise cancelers, we implement the time-domain cancelers in the frequency domain, which results in the frequency-domain TDBLMS and FD-B-normalized-LMS. Finally, we have applied these algorithms to real EEG signals obtained from humans using the Emotiv EPOC EEG recorder and compared their performance with the conventional LMS algorithm. The results show that the performance of the block-based algorithms is superior to their LMS counterparts in terms of signal-to-noise ratio, convergence rate, excess mean square error, misadjustment, and coherence.
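
    A minimal sketch of a time-domain block (normalized) LMS noise canceller in the spirit of the structures described: the weights are updated once per block from the averaged normalized gradient, and the enhanced signal is the primary channel minus the filter output. The block size, step size, and the synthetic EEG/reference signals are assumptions, not the paper's configuration.

    ```python
    import numpy as np

    def block_nlms(x, d, n_taps=16, block=32, mu=0.5, eps=1e-8):
        """Time-domain block normalized LMS: the filter weights are updated once per
        block using the average of the per-sample normalized gradient estimates.
        x: reference noise input, d: primary signal (desired = clean + correlated noise)."""
        w = np.zeros(n_taps)
        y = np.zeros(len(d), dtype=float)
        for start in range(0, len(d) - block, block):
            grad = np.zeros(n_taps)
            for n in range(start, start + block):
                if n < n_taps - 1:
                    continue
                u = x[n - n_taps + 1:n + 1][::-1]          # regressor (most recent first)
                y[n] = w @ u
                e = d[n] - y[n]
                grad += e * u / (eps + u @ u)              # normalized gradient term
            w += mu * grad / block                         # one weight update per block
        return d - y                                       # enhanced (noise-cancelled) output

    # Hypothetical EEG corrupted by noise correlated with a measurable reference channel
    rng = np.random.default_rng(9)
    n = 4000
    clean = np.sin(2 * np.pi * 10 * np.arange(n) / 250)    # 10 Hz rhythm at 250 Hz sampling
    ref = rng.normal(size=n)                               # reference noise channel
    noise = np.convolve(ref, [0.8, -0.4, 0.2])[:n]         # noise reaching the primary channel
    primary = clean + noise
    out = block_nlms(ref, primary)
    print("SNR before (dB):", 10 * np.log10(clean.var() / noise.var()),
          "after:", 10 * np.log10(clean.var() / (out - clean).var()))
    ```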

  10. Entropy-based gene ranking without selection bias for the predictive classification of microarray data.

    PubMed

    Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe

    2003-11-06

    We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as the selection bias, in which the estimated predictive errors are too optimistic due to testing on samples already considered in the feature selection process). With E-RFE, we speed up the recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weights distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Without a decrease of classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias, and providing additional diagnostic indicators of gene importance.
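
    A simplified sketch of entropy-guided recursive feature elimination in the spirit of E-RFE: at each step a linear SVM is fitted, the entropy of the |weight| histogram is computed, and a chunk of the lowest-weight features is discarded; the chunk is large when the weight distribution is highly concentrated and small when it is spread out. The chunk-size rule, histogram binning, and synthetic data are assumptions rather than the published algorithm.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    def entropy_rfe(X, y, n_keep=20, n_bins=10, C=1.0):
        """Entropy-guided recursive feature elimination (simplified sketch)."""
        idx = np.arange(X.shape[1])
        while len(idx) > n_keep:
            w = np.abs(LinearSVC(C=C, dual=False).fit(X[:, idx], y).coef_).ravel()
            hist, _ = np.histogram(w, bins=n_bins)
            p = hist / hist.sum()
            p = p[p > 0]
            entropy = -(p * np.log(p)).sum() / np.log(n_bins)     # normalized to [0, 1]
            chunk = max(1, int((1.0 - entropy) * (len(idx) - n_keep) * 0.5))
            order = np.argsort(w)                                 # ascending |weight|
            idx = idx[order[chunk:]]                              # drop the lowest-weight chunk
        return idx

    # Hypothetical microarray-like data: 500 genes, the first 10 informative
    rng = np.random.default_rng(10)
    X = rng.normal(size=(80, 500))
    y = (X[:, :10].sum(axis=1) > 0).astype(int)
    selected = entropy_rfe(X, y)
    print("selected genes:", sorted(selected),
          "-> informative genes recovered:", len(set(selected) & set(range(10))))
    ```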

  11. A Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    Hsu, L.; Min, L. J.; Favretto, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  12. Analysis of red blood cell partitioning at bifurcations in simulated microvascular networks

    NASA Astrophysics Data System (ADS)

    Balogh, Peter; Bagchi, Prosenjit

    2018-05-01

    Partitioning of red blood cells (RBCs) at vascular bifurcations has been studied over many decades using in vivo, in vitro, and theoretical models. These studies have shown that RBCs usually do not distribute to the daughter vessels with the same proportion as the blood flow. Such disproportionality occurs, whereby the cell distribution fractions are either higher or lower than the flow fractions and have been referred to as classical partitioning and reverse partitioning, respectively. The current work presents a study of RBC partitioning based on, for the first time, a direct numerical simulation (DNS) of a flowing cell suspension through modeled vascular networks that are comprised of multiple bifurcations and have topological similarity to microvasculature in vivo. The flow of deformable RBCs at physiological hematocrits is considered through the networks, and the 3D dynamics of each individual cell are accurately resolved. The focus is on the detailed analysis of the partitioning, based on the DNS data, as it develops naturally in successive bifurcations, and the underlying mechanisms. We find that while the time-averaged partitioning at a bifurcation manifests in one of two ways, namely, the classical or reverse partitioning, the time-dependent behavior can cycle between these two types. We identify and analyze four different cellular-scale mechanisms underlying the time-dependent partitioning. These mechanisms arise, in general, either due to an asymmetry in the RBC distribution in the feeding vessels caused by the events at an upstream bifurcation or due to a temporary increase in cell concentration near capillary bifurcations. Using the DNS results, we show that a positive skewness in the hematocrit profile in the feeding vessel is associated with the classical partitioning, while a negative skewness is associated with the reverse one. We then present a detailed analysis of the two components of disproportionate partitioning as identified in prior studies, namely, plasma skimming and cell screening. The plasma skimming component is shown to under-predict the disproportionality, leaving the cell screening component to make up for the difference. The crossing of the separation surface by the cells is observed to be a dominant mechanism underlying the cell screening, which is shown to mitigate extreme heterogeneity in RBC distribution across the networks.

  13. Genuine multipartite entanglement of symmetric Gaussian states: Strong monogamy, unitary localization, scaling behavior, and molecular sharing structure

    NASA Astrophysics Data System (ADS)

    Adesso, Gerardo; Illuminati, Fabrizio

    2008-10-01

    We investigate the structural aspects of genuine multipartite entanglement in Gaussian states of continuous variable systems. Generalizing the results of Adesso and Illuminati [Phys. Rev. Lett. 99, 150501 (2007)], we analyze whether the entanglement shared by blocks of modes distributes according to a strong monogamy law. This property, once established, allows us to quantify the genuine N -partite entanglement not encoded into 2,…,K,…,(N-1) -partite quantum correlations. Strong monogamy is numerically verified, and the explicit expression of the measure of residual genuine multipartite entanglement is analytically derived, by a recursive formula, for a subclass of Gaussian states. These are fully symmetric (permutation-invariant) states that are multipartitioned into blocks, each consisting of an arbitrarily assigned number of modes. We compute the genuine multipartite entanglement shared by the blocks of modes and investigate its scaling properties with the number and size of the blocks, the total number of modes, the global mixedness of the state, and the squeezed resources needed for state engineering. To achieve the exact computation of the block entanglement, we introduce and prove a general result of symplectic analysis: Correlations among K blocks in N -mode multisymmetric and multipartite Gaussian states, which are locally invariant under permutation of modes within each block, can be transformed by a local (with respect to the partition) unitary operation into correlations shared by K single modes, one per block, in effective nonsymmetric states where N-K modes are completely uncorrelated. Due to this theorem, the above results, such as the derivation of the explicit expression for the residual multipartite entanglement, its nonnegativity, and its scaling properties, extend to the subclass of non-symmetric Gaussian states that are obtained by the unitary localization of the multipartite entanglement of symmetric states. These findings provide strong numerical evidence that the distributed Gaussian entanglement is strongly monogamous under and possibly beyond specific symmetry constraints, and that the residual continuous-variable tangle is a proper measure of genuine multipartite entanglement for permutation-invariant Gaussian states under any multipartition of the modes.

  14. Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System

    NASA Astrophysics Data System (ADS)

    Demirdjian, L.; Zhou, Y.; Huffman, G. J.

    2016-12-01

    This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peak-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized-Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
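
    A sketch of the peak-over-threshold step for a single cluster: daily precipitation exceedances over a high quantile are fitted with a Generalized Pareto distribution (scipy.stats.genpareto), and the fitted tail and exceedance rate convert a given daily amount into an average recurrence interval. The threshold choice, pooling, and synthetic record are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    def ari_from_pot(daily_precip, threshold_q=0.95, amount=None):
        """Peak-over-threshold fit for one cluster: excesses over a high quantile are
        fitted with a Generalized Pareto distribution, and the average recurrence
        interval (in years) of a given daily amount is computed from the exceedance
        rate and the fitted tail."""
        u = np.quantile(daily_precip, threshold_q)
        excess = daily_precip[daily_precip > u] - u
        # Fit the GPD to the excesses; fix the location at 0 as is standard for POT.
        shape, loc, scale = stats.genpareto.fit(excess, floc=0)
        rate = len(excess) / (len(daily_precip) / 365.25)          # exceedances per year
        if amount is None:
            amount = daily_precip.max()
        p_exceed = stats.genpareto.sf(amount - u, shape, loc=loc, scale=scale)
        return float("inf") if p_exceed == 0 else 1.0 / (rate * p_exceed)

    # Hypothetical 17-year daily record for one cluster of grid cells
    rng = np.random.default_rng(11)
    daily = rng.gamma(0.4, 6.0, size=17 * 365)
    print("ARI of a 50 mm day: %.1f years" % ari_from_pot(daily, amount=50.0))
    ```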

  15. Selective Perioperative Administration of Pasireotide is More Cost-Effective Than Routine Administration for Pancreatic Fistula Prophylaxis.

    PubMed

    Denbo, Jason W; Slack, Rebecca S; Bruno, Morgan; Cloyd, Jordan M; Prakash, Laura; Fleming, Jason B; Kim, Michael P; Aloia, Thomas A; Vauthey, Jean-Nicolas; Lee, Jeffrey E; Katz, Matthew H G

    2017-04-01

    In a randomized trial, pasireotide significantly decreased the incidence and severity of postoperative pancreatic fistula (POPF). Subsequent analyses concluded that its routine use is cost-effective. We hypothesized that selective administration of the drug to patients at high risk for POPF would be more cost-effective. Consecutive patients who did not receive pasireotide and underwent pancreatoduodenectomy (PD) or distal pancreatectomy (DP) between July 2011 and January 2014 were distributed into groups based on their risk of POPF using a multivariate recursive partitioning regression tree analysis (RPA) of preoperative clinical factors. The costs of treating hypothetical patients in each risk group were then computed based upon actual institutional hospital costs and previously published relative risk values associated with pasireotide. Among 315 patients who underwent pancreatectomy, grade B/C POPF occurred in 64 (20%). RPA allocated patients who underwent PD into four groups with a risk for grade B/C POPF of 0, 10, 29, or 60% (P < 0.001) on the basis of diagnosis, pancreatic duct diameter, and body mass index. Patients who underwent DP were allocated to three groups with a grade B/C POPF risk of 14, 26, or 44% (P = 0.05) on the basis of pancreatic duct diameter alone. Although the routine administration of pasireotide to all 315 patients would have theoretically saved $30,892 over standard care, restriction of pasireotide to only patients at high risk for POPF would have led to a cost savings of $831,916. Preoperative clinical characteristics can be used to characterize patients' risk for POPF following pancreatectomy. Selective administration of pasireotide only to patients at high risk for grade B/C POPF may maximize the cost-efficacy of prophylactic pasireotide.
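    For readers unfamiliar with recursive partitioning analysis (RPA), the sketch below builds the same kind of risk grouping on synthetic data: a shallow decision tree on hypothetical duct-diameter and BMI values, with the observed fistula rate reported per terminal node. The variables, cut-points, and rates are illustrative assumptions, not the study's cohort or its published RPA.

      # Illustrative recursive-partitioning risk grouping on synthetic data:
      # a shallow tree on pancreatic duct diameter and BMI assigns each patient
      # to a leaf, and each leaf is read off as a POPF risk group.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(1)
      n = 300
      duct_mm = rng.uniform(1, 8, n)                  # hypothetical duct diameter (mm)
      bmi = rng.normal(27, 4, n)                      # hypothetical BMI
      p_popf = 1 / (1 + np.exp(0.9 * duct_mm - 0.12 * bmi - 0.5))  # smaller duct, higher BMI -> higher risk
      popf = rng.binomial(1, p_popf)

      X = np.column_stack([duct_mm, bmi])
      tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=30, random_state=0)
      tree.fit(X, popf)

      leaf = tree.apply(X)                            # leaf id = risk group
      for g in np.unique(leaf):
          print(f"group {g}: observed POPF rate {popf[leaf == g].mean():.0%}")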

  16. Age as an independent prognostic factor in patients with glioblastoma: a Radiation Therapy Oncology Group and American College of Surgeons National Cancer Data Base comparison.

    PubMed

    Siker, Malika L; Wang, Meihua; Porter, Kimberly; Nelson, Diana F; Curran, Walter J; Michalski, Jeff M; Souhami, Luis; Chakravarti, Arnab; Yung, W K Alfred; Delrowe, John; Coughlin, Christopher T; Mehta, Minesh P

    2011-08-01

    Glioblastoma (GBM) is rare in early adulthood and little information is available on this subgroup. We investigated whether young age (18-30 years) had an independent effect on survival. We retrospectively reviewed patients from two large databases: Radiation Therapy Oncology Group (RTOG) and American College of Surgeons National Cancer Data Base (NCDB). In the RTOG evaluation, we analyzed all eligible GBM cases from 17 RTOG studies from 1974 to 2002. All patients with GBM during 1985-1998 in the NCDB were examined for comparison. Patients were divided into three cohorts: ages 18-30, 31-49, and ≥50. Overall survival, as a function of age (discrete and continuous), was assessed. The RTOG review included 3,136 patients: 112 (3.6%) were 18-30, 780 (24.9%) were 31-49, and 2,244 (71.6%) were ≥50. The median survival times of the three groups were 21.0, 13.5, and 9.1 months (P < 0.0001). Significant improvement in survival for younger patients was demonstrated with adjustment for recursive partitioning analysis (RPA) class. Of the 37,260 patients analyzed in the NCDB, 796 (2.1%) were 18-30, 5,711 (15.3%) were 31-49, and 30,753 (82.5%) were ≥50. The median survival times of the three groups were 18.0, 12.8, and 6.3 months (P < 0.0001). Data were not available for RPA class from this series. GBM is rare in young adulthood, comprising 2.1-3.6% of our patients. These patients have superior survival, even when adjusted for RPA class. More investigations of the unique biologic and clinical characteristics of tumors in this population are needed.

  17. Discriminating modes of toxic action in mice using toxicity in BALB/c mouse fibroblast (3T3) cells.

    PubMed

    Huang, Tao; Yan, Lichen; Zheng, Shanshan; Wang, Yue; Wang, Xiaohong; Fan, Lingyun; Li, Chao; Zhao, Yuanhui; Martyniuk, Christopher J

    2017-12-01

    The objective of this study was to determine whether toxicity in mouse fibroblast cells (3T3 cells) could predict toxicity in mice. Synthesized toxicity data were subjected to regression analysis, and the relationship between toxicity in mice and toxicity in 3T3 cells was not strong (R² = 0.41). Inclusion of molecular descriptors (e.g. ionization, pKa) improved the regression to R² = 0.56, indicating that this relationship is influenced by kinetic processes of chemicals or specific toxic mechanisms associated with the compounds. However, to determine whether modes of action (MOAs) in mice could be discriminated using the toxicities generated from 3T3 cells, compounds were first classified into "baseline" and "reactive" guided by the toxic ratio (TR) for each compound in mice. Sequence, binomial and recursive partitioning analyses provided strong predictions of MOAs in mice based upon toxicities in 3T3 cells. The correct classification of MOAs based on these methods was 86%. Nearly all the baseline compounds predicted from toxicities in 3T3 cells were identified as baseline compounds from the TR in mice. The incorrect assignment of MOAs for some compounds is hypothesized to be due to experimental uncertainty that exists in toxicity assays for both mice and 3T3 cells. Conversely, lack of assignment can also arise because some reactive compounds have MOAs that differ in mice compared with 3T3 cells. The methods developed here are novel and contribute to efforts to reduce animal numbers in toxicity tests used to evaluate risks associated with organic pollutants in the environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Recursive Hierarchical Image Segmentation by Region Growing and Constrained Spectral Clustering

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2002-01-01

    This paper describes an algorithm for hierarchical image segmentation (referred to as HSEG) and its recursive formulation (referred to as RHSEG). The HSEG algorithm is a hybrid of region growing and constrained spectral clustering that produces a hierarchical set of image segmentations based on detected convergence points. In the main, HSEG employs the hierarchical stepwise optimization (HSWO) approach to region growing, which seeks to produce segmentations that are more optimized than those produced by more classic approaches to region growing. In addition, HSEG optionally interjects, between HSWO region-growing iterations, merges between spatially non-adjacent regions (i.e., spectrally based merging or clustering), constrained by a threshold derived from the previous HSWO region-growing iteration. While the addition of constrained spectral clustering improves the segmentation results, especially for larger images, it also significantly increases HSEG's computational requirements. To counteract this, a computationally efficient recursive, divide-and-conquer implementation of HSEG (RHSEG) has been devised and is described herein. Included in this description is special code that is required to avoid processing artifacts caused by RHSEG's recursive subdivision of the image data. Implementations for single-processor and for multiple-processor computer systems are described. Results with Landsat TM data are included comparing HSEG with classic region growing. Finally, an application to image information mining and knowledge discovery is discussed.

  19. New recursive-least-squares algorithms for nonlinear active control of sound and vibration using neural networks.

    PubMed

    Bouchard, M

    2001-01-01

    In recent years, a few articles describing the use of neural networks for nonlinear active control of sound and vibration have been published. Using a control structure with two multilayer feedforward neural networks (one as a nonlinear controller and one as a nonlinear plant model), steepest descent algorithms based on two distinct gradient approaches were introduced for the training of the controller network. The two gradient approaches were sometimes called the filtered-x approach and the adjoint approach. Some recursive-least-squares algorithms were also introduced, using the adjoint approach. In this paper, a heuristic procedure is introduced for the development of recursive-least-squares algorithms based on the filtered-x and the adjoint gradient approaches. This leads to the development of new recursive-least-squares algorithms for the training of the controller neural network in the two-network structure. These new algorithms produce better convergence performance than previously published algorithms. Differences in the performance of algorithms using the filtered-x and the adjoint gradient approaches are discussed in the paper. The computational load of the algorithms discussed in the paper is evaluated for multichannel systems of nonlinear active control. Simulation results are presented to compare the convergence performance of the algorithms, showing the convergence gain provided by the new algorithms.
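    The record's algorithms apply recursive-least-squares updates to neural-network controller weights through filtered-x or adjoint gradients; the sketch below shows only the generic RLS recursion itself, on a linear FIR identification task with synthetic data, as a reference for the update equations.

      # Generic recursive-least-squares (RLS) update, shown for linear FIR
      # system identification; the filtered-x / adjoint neural-network variants
      # replace this regressor with filtered or back-propagated signals.
      import numpy as np

      def rls_identify(x, d, order=8, lam=0.99, delta=100.0):
          """Estimate FIR weights w so that w @ [x[n], ..., x[n-order+1]] tracks d[n]."""
          w = np.zeros(order)
          P = delta * np.eye(order)              # inverse correlation matrix estimate
          for n in range(order - 1, len(x)):
              u = x[n - order + 1:n + 1][::-1]   # regressor, newest sample first
              k = P @ u / (lam + u @ P @ u)      # gain vector
              e = d[n] - w @ u                   # a priori error
              w = w + k * e
              P = (P - np.outer(k, u @ P)) / lam
          return w

      rng = np.random.default_rng(0)
      x = rng.standard_normal(2000)
      true_w = np.array([0.5, -0.3, 0.2, 0.1, 0.0, 0.05, 0.0, 0.0])
      d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.standard_normal(len(x))
      print(np.round(rls_identify(x, d), 2))     # approaches true_w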

  20. Models for liquid-liquid partition in the system dimethyl sulfoxide-organic solvent and their use for estimating descriptors for organic compounds.

    PubMed

    Karunasekara, Thushara; Poole, Colin F

    2011-07-15

    Partition coefficients for varied compounds were determined for the organic solvent-dimethyl sulfoxide biphasic partition system where the organic solvent is n-heptane or isopentyl ether. These partition coefficient databases are analyzed using the solvation parameter model facilitating a quantitative comparison of the dimethyl sulfoxide-based partition systems with other totally organic partition systems. Dimethyl sulfoxide is a moderately cohesive solvent, reasonably dipolar/polarizable and strongly hydrogen-bond basic. Although generally considered to be non-hydrogen-bond acidic, analysis of the partition coefficient database strongly supports reclassification as a weak hydrogen-bond acid in agreement with recent literature. The system constants for the n-heptane-dimethyl sulfoxide biphasic system provide an explanation of the mechanism for the selective isolation of polycyclic aromatic compounds from mixtures containing low-polarity hydrocarbons based on the capability of the polar interactions (dipolarity/polarizability and hydrogen-bonding) to overcome the opposing cohesive forces in dimethyl sulfoxide that are absent for the interactions with hydrocarbons of low polarity. In addition, dimethyl sulfoxide-organic solvent systems afford a complementary approach to other totally organic biphasic partition systems for descriptor measurements of compounds virtually insoluble in water. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Attitude determination and calibration using a recursive maximum likelihood-based adaptive Kalman filter

    NASA Technical Reports Server (NTRS)

    Kelly, D. A.; Fermelia, A.; Lee, G. K. F.

    1990-01-01

    An adaptive Kalman filter design that utilizes recursive maximum likelihood parameter identification is discussed. At the center of this design is the Kalman filter itself, which has the responsibility for attitude determination. At the same time, the identification algorithm is continually identifying the system parameters. The approach is applicable to nonlinear, as well as linear systems. This adaptive Kalman filter design has much potential for real time implementation, especially considering the fast clock speeds, cache memory and internal RAM available today. The recursive maximum likelihood algorithm is discussed in detail, with special attention directed towards its unique matrix formulation. The procedure for using the algorithm is described along with comments on how this algorithm interacts with the Kalman filter.
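    A deliberately simplified stand-in for the adaptive filter described above: a one-dimensional Kalman filter whose measurement-noise variance is re-estimated from recent innovations. The record's recursive maximum-likelihood identification of full system parameters is more elaborate; this only illustrates the predict/update/adapt loop.

      # 1-D random-walk state with an adapted measurement-noise variance R_hat.
      import numpy as np

      rng = np.random.default_rng(2)
      true_R = 0.5                       # true measurement-noise variance (unknown to filter)
      Q = 0.01                           # process-noise variance (assumed known)
      x_true, z = 0.0, []
      for _ in range(500):
          x_true += rng.normal(0, np.sqrt(Q))
          z.append(x_true + rng.normal(0, np.sqrt(true_R)))

      x_hat, P, R_hat = 0.0, 1.0, 1.0
      innovations = []
      for zk in z:
          P += Q                                   # predict
          nu = zk - x_hat                          # innovation
          K = P / (P + R_hat)                      # Kalman gain
          x_hat += K * nu                          # update state estimate
          P *= (1 - K)
          innovations.append(nu)
          if len(innovations) >= 50:               # adapt R from the innovation variance
              R_hat = max(np.var(innovations[-50:]) - P, 1e-3)

      print(f"estimated measurement variance: {R_hat:.2f} (true {true_R})")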

  2. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criterion has been criticized for its lack of automation. In an earlier paper we have shown how to construct an automated termination criterion if the recursion is aligned with the rewrite relation. We have demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.

  3. Hierarchical image segmentation via recursive superpixel with adaptive regularity

    NASA Astrophysics Data System (ADS)

    Nakamura, Kensuke; Hong, Byung-Woo

    2017-11-01

    A fast and accurate hierarchical segmentation algorithm based on a recursive superpixel technique is presented. We propose a superpixel energy formulation in which the trade-off between data fidelity and regularization is dynamically determined based on the local residual in the energy optimization procedure. We also present an energy optimization algorithm that allows a pixel to be shared by multiple regions to improve the accuracy and yield an appropriate number of segments. The qualitative and quantitative evaluations demonstrate that our algorithm, combining the proposed energy and optimization, outperforms the conventional k-means algorithm by up to 29.10% in F-measure. We also perform comparative analysis with state-of-the-art algorithms in hierarchical segmentation. Our algorithm yields smooth regions throughout the hierarchy, as opposed to the others, which include insignificant details. Our algorithm also surpasses the other algorithms in the balance between accuracy and computational time. Specifically, our method runs 36.48% faster than the region-merging approach, which is the fastest of the compared algorithms, while achieving comparable accuracy.

  4. Multilevel Green's function interpolation method for scattering from composite metallic and dielectric objects.

    PubMed

    Shi, Yan; Wang, Hao Gang; Li, Long; Chan, Chi Hou

    2008-10-01

    A multilevel Green's function interpolation method based on two kinds of multilevel partitioning schemes--the quasi-2D and the hybrid partitioning scheme--is proposed for analyzing electromagnetic scattering from objects comprising both conducting and dielectric parts. The problem is formulated using the surface integral equation for homogeneous dielectric and conducting bodies. A quasi-2D multilevel partitioning scheme is devised to improve the efficiency of the Green's function interpolation. In contrast to previous multilevel partitioning schemes, noncubic groups are introduced to discretize the whole EM structure in this quasi-2D multilevel partitioning scheme. Based on the detailed analysis of the dimension of the group in this partitioning scheme, a hybrid quasi-2D/3D multilevel partitioning scheme is proposed to effectively handle objects with fine local structures. Selection criteria for some key parameters relating to the interpolation technique are given. The proposed algorithm is ideal for the solution of problems involving objects such as missiles, microstrip antenna arrays, photonic bandgap structures, etc. Numerical examples are presented to show that CPU time is between O(N) and O(N log N) while the computer memory requirement is O(N).

  5. Predicting DPP-IV inhibitors with machine learning approaches

    NASA Astrophysics Data System (ADS)

    Cai, Jie; Li, Chanjuan; Liu, Zhihong; Du, Jiewen; Ye, Jiming; Gu, Qiong; Xu, Jun

    2017-04-01

    Dipeptidyl peptidase IV (DPP-IV) is a promising Type 2 diabetes mellitus (T2DM) drug target. DPP-IV inhibitors prolong the action of glucagon-like peptide-1 (GLP-1) and gastric inhibitory peptide (GIP), and improve glucose homeostasis without weight gain, edema, or hypoglycemia. However, the marketed DPP-IV inhibitors have adverse effects such as nasopharyngitis, headache, nausea, hypersensitivity, skin reactions and pancreatitis. Therefore, novel DPP-IV inhibitors with minimal adverse effects are still needed. The scaffolds of existing DPP-IV inhibitors are structurally diversified. This makes it difficult to build virtual screening models based upon the known DPP-IV inhibitor libraries using conventional QSAR approaches. In this paper, we report a new strategy to predict DPP-IV inhibitors with machine learning approaches involving naïve Bayesian (NB) and recursive partitioning (RP) methods. We built 247 machine learning models based on 1307 known DPP-IV inhibitors with optimized molecular properties and topological fingerprints as descriptors. The overall predictive accuracies of the optimized models were greater than 80%. An external test set, composed of 65 recently reported compounds, was employed to validate the optimized models. The results demonstrated that both NB and RP models have a good predictive ability based on different combinations of descriptors. Twenty "good" and twenty "bad" structural fragments for DPP-IV inhibitors can also be derived from these models to inspire new DPP-IV inhibitor scaffold design.
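    The modeling step can be pictured with the sketch below: a naive Bayesian classifier and a decision tree (standing in for recursive partitioning) trained on synthetic binary fingerprints. The fingerprint length, labels, and accuracies are illustrative assumptions with no relation to the 1307-compound data set or the 247 reported models.

      # Naive Bayes vs. recursive partitioning on synthetic binary fingerprints.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import BernoulliNB
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(3)
      X = rng.integers(0, 2, size=(1000, 256))              # hypothetical 256-bit fingerprints
      w = rng.normal(0, 1, 256)
      y = (X @ w + rng.normal(0, 2, 1000) > 0).astype(int)  # synthetic active/inactive labels

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
      for name, model in [("NB", BernoulliNB()), ("RP", DecisionTreeClassifier(max_depth=6))]:
          model.fit(X_tr, y_tr)
          print(name, f"test accuracy: {model.score(X_te, y_te):.2f}")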

  6. What's special about human language? The contents of the "narrow language faculty" revisited.

    PubMed

    Traxler, Matthew J; Boudewyn, Megan; Loudermilk, Jessica

    2012-10-01

    In this review we re-evaluate the recursion-only hypothesis, advocated by Fitch, Hauser and Chomsky (Hauser, Chomsky & Fitch, 2002; Fitch, Hauser & Chomsky, 2005). According to the recursion-only hypothesis, the property that distinguishes human language from animal communication systems is recursion, which refers to the potentially infinite embedding of one linguistic representation within another of the same type. This hypothesis predicts (1) that non-human primates and other animals lack the ability to learn recursive grammar, and (2) that recursive grammar is the sole cognitive mechanism that is unique to human language. We first review animal studies of recursive grammar, before turning to the claim that recursion is a property of all human languages. Finally, we discuss other views on what abilities may be unique to human language.

  7. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  8. Evaluation of a prognostic scoring system based on the systemic inflammatory and nutritional status of patients with locally advanced non-small-cell lung cancer treated with chemoradiotherapy.

    PubMed

    Mitsuyoshi, Takamasa; Matsuo, Yukinori; Itou, Hitoshi; Shintani, Takashi; Iizuka, Yusuke; Kim, Young Hak; Mizowaki, Takashi

    2018-01-01

    Systemic inflammation and poor nutritional status have a negative effect on the outcomes of cancer. Here, we analyzed the effects of the pretreatment inflammatory and nutritional status on clinical outcomes of locally advanced non-small-cell lung cancer (NSCLC) patients treated with chemoradiotherapy. We retrospectively reviewed 89 patients with locally advanced NSCLC treated with chemoradiotherapy between July 2006 and June 2013. Serum C-reactive protein (CRP) was assessed as an inflammatory marker, and serum albumin, body mass index (BMI) and skeletal mass index were assessed as nutritional status markers. The relationships between these markers and overall survival (OS) were assessed. The median OS was 24.6 months [95% confidence interval (CI): 19.4-39.3 months]. During follow-up, 58 patients (65%) had disease recurrence and 52 patients (58%) died. In multivariate Cox hazard analysis, CRP levels and BMI approached but did not achieve a significant association with OS (P = 0.062 and 0.094, respectively). Recursive partitioning analysis identified three prognostic groups based on hazard similarity (CRP-BMI scores): 0 = CRP < 0.3 mg/dl, 1 = CRP ≥ 0.3 mg/dl and BMI ≥ 18.5 kg/m2, and 2 = CRP ≥ 0.3 mg/dl and BMI < 18.5 kg/m2. The CRP-BMI score was significantly associated with OS (P = 0.023). Patients with scores of 0, 1 and 2 had median OS of 39.3, 24.5 and 14.5 months, respectively, and the scores also predicted the probability of receiving salvage treatment after recurrence. The CRP-BMI score is thus a simple and useful prognostic marker of clinical outcome for patients with locally advanced NSCLC treated with chemoradiotherapy. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
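    The CRP-BMI score defined in this record translates directly into a small decision rule; the sketch below encodes it as stated (cut-points 0.3 mg/dl and 18.5 kg/m2), with the example values chosen arbitrarily.

      # CRP-BMI score as defined in the record:
      # 0 = CRP < 0.3 mg/dl; 1 = CRP >= 0.3 and BMI >= 18.5; 2 = CRP >= 0.3 and BMI < 18.5.
      def crp_bmi_score(crp_mg_dl: float, bmi_kg_m2: float) -> int:
          if crp_mg_dl < 0.3:
              return 0
          return 1 if bmi_kg_m2 >= 18.5 else 2

      print(crp_bmi_score(0.1, 22.0))  # 0
      print(crp_bmi_score(0.8, 20.0))  # 1
      print(crp_bmi_score(0.8, 17.5))  # 2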

  9. Classification of visible and infrared hyperspectral images based on image segmentation and edge-preserving filtering

    NASA Astrophysics Data System (ADS)

    Cui, Binge; Ma, Xiudan; Xie, Xiaoyun; Ren, Guangbo; Ma, Yi

    2017-03-01

    The classification of hyperspectral images with a few labeled samples is a major challenge which is difficult to meet unless some spatial characteristics can be exploited. In this study, we proposed a novel spectral-spatial hyperspectral image classification method that exploited spatial autocorrelation of hyperspectral images. First, image segmentation is performed on the hyperspectral image to assign each pixel to a homogeneous region. Second, the visible and infrared bands of hyperspectral image are partitioned into multiple subsets of adjacent bands, and each subset is merged into one band. Recursive edge-preserving filtering is performed on each merged band which utilizes the spectral information of neighborhood pixels. Third, the resulting spectral and spatial feature band set is classified using the SVM classifier. Finally, bilateral filtering is performed to remove "salt-and-pepper" noise in the classification result. To preserve the spatial structure of hyperspectral image, edge-preserving filtering is applied independently before and after the classification process. Experimental results on different hyperspectral images prove that the proposed spectral-spatial classification approach is robust and offers more classification accuracy than state-of-the-art methods when the number of labeled samples is small.

  10. The association between neighborhood characteristics and body size and physical activity in the California teachers study cohort.

    PubMed

    Keegan, Theresa H M; Hurley, Susan; Goldberg, Debbie; Nelson, David O; Reynolds, Peggy; Bernstein, Leslie; Horn-Ross, Pam L; Gomez, Scarlett L

    2012-04-01

    We considered interactions between physical activity and body mass index (BMI) and neighborhood factors. We used recursive partitioning to identify predictors of low recreational physical activity (< 2.5 hours/week) and overweight and obesity (BMI ≥ 25.0 kg/m(2)) among 118,315 women in the California Teachers Study. Neighborhood characteristics were based on 2000 US Census data and Reference US business listings. Low physical activity and being overweight or obese were associated with individual sociodemographic characteristics, including race/ethnicity and age. Among White women aged 36 to 75 years, living in neighborhoods with more household crowding was associated with a higher probability of low physical activity (54% vs 45% to 51%). In less crowded neighborhoods where more people worked outside the home, the existence of fewer neighborhood amenities was associated with a higher probability of low physical activity (51% vs 46%). Among non-African American middle-aged women, living in neighborhoods with a lower socioeconomic status was associated with a higher probability of being overweight or obese (46% to 59% vs 38% in high-socioeconomic status neighborhoods). Associations between physical activity, overweight and obesity, and the built environment varied by sociodemographic characteristics in this educated population.

  11. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes. Part 3; A Recursive Maximum Likelihood Decoding

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Fossorier, Marc

    1998-01-01

    The Viterbi algorithm is a very simple and efficient method of implementing maximum likelihood decoding. However, if we take advantage of the structural properties in a trellis section, other efficient trellis-based decoding algorithms can be devised. Recently, an efficient trellis-based recursive maximum likelihood decoding (RMLD) algorithm for linear block codes has been proposed. This algorithm is more efficient than the conventional Viterbi algorithm in both computation and hardware requirements. Most importantly, the implementation of this algorithm does not require the construction of the entire code trellis; only some special one-section trellises of relatively small state and branch complexities are needed for constructing path (or branch) metric tables recursively. At the end, there is a single table containing only the most likely codeword and its metric for a given received sequence r = (r_1, r_2, ..., r_n). This algorithm basically uses the divide-and-conquer strategy. Furthermore, it allows parallel/pipeline processing of received sequences to speed up decoding.

  12. Use of HCA in Subproteome-immunization and Screening of Hybridoma Supernatants to Define Distinct Antibody Binding Patterns

    PubMed Central

    Szafran, Adam T.; Mancini, Maureen G.; Nickerson, Jeffrey A.; Edwards, Dean P.; Mancini, Michael A.

    2016-01-01

    Understanding the properties and functions of complex biological systems depends upon knowing the proteins present and the interactions between them. Recent advances in mass spectrometry have given us greater insights into the participating proteomes; however, monoclonal antibodies remain key to understanding the structures, functions, locations and macromolecular interactions of the involved proteins. The traditional single-immunogen method of producing monoclonal antibodies using hybridoma technology is time-, resource- and cost-intensive, limiting the number of reagents that are available. Using a high content analysis screening approach, we have developed a method in which a complex mixture of proteins (e.g., subproteome) is used to generate a panel of monoclonal antibodies specific to a subproteome located in a defined subcellular compartment such as the nucleus. The immunofluorescent images in the primary hybridoma screen are analyzed using an automated processing approach and classified using a recursive partitioning forest classification model derived from images obtained from the Human Protein Atlas. Using an ammonium sulfate purified nuclear matrix fraction as an example of reverse proteomics, we identified 866 hybridoma supernatants with a positive immunofluorescent signal. Of those, 402 produced a nuclear signal from which patterns similar to known nuclear matrix associated proteins were identified. Detailed here are our method, the analysis techniques, and a discussion of the application to further in vivo antibody production. PMID:26521976

  13. Use of HCA in subproteome-immunization and screening of hybridoma supernatants to define distinct antibody binding patterns.

    PubMed

    Szafran, Adam T; Mancini, Maureen G; Nickerson, Jeffrey A; Edwards, Dean P; Mancini, Michael A

    2016-03-01

    Understanding the properties and functions of complex biological systems depends upon knowing the proteins present and the interactions between them. Recent advances in mass spectrometry have given us greater insights into the participating proteomes; however, monoclonal antibodies remain key to understanding the structures, functions, locations and macromolecular interactions of the involved proteins. The traditional single-immunogen method of producing monoclonal antibodies using hybridoma technology is time-, resource- and cost-intensive, limiting the number of reagents that are available. Using a high content analysis screening approach, we have developed a method in which a complex mixture of proteins (e.g., subproteome) is used to generate a panel of monoclonal antibodies specific to a subproteome located in a defined subcellular compartment such as the nucleus. The immunofluorescent images in the primary hybridoma screen are analyzed using an automated processing approach and classified using a recursive partitioning forest classification model derived from images obtained from the Human Protein Atlas. Using an ammonium sulfate purified nuclear matrix fraction as an example of reverse proteomics, we identified 866 hybridoma supernatants with a positive immunofluorescent signal. Of those, 402 produced a nuclear signal from which patterns similar to known nuclear matrix associated proteins were identified. Detailed here are our method, the analysis techniques, and a discussion of the application to further in vivo antibody production. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. What's special about human language? The contents of the "narrow language faculty" revisited

    PubMed Central

    Traxler, Matthew J.; Boudewyn, Megan; Loudermilk, Jessica

    2012-01-01

    In this review we re-evaluate the recursion-only hypothesis, advocated by Fitch, Hauser and Chomsky (Hauser, Chomsky & Fitch, 2002; Fitch, Hauser & Chomsky, 2005). According to the recursion-only hypothesis, the property that distinguishes human language from animal communication systems is recursion, which refers to the potentially infinite embedding of one linguistic representation within another of the same type. This hypothesis predicts (1) that non-human primates and other animals lack the ability to learn recursive grammar, and (2) that recursive grammar is the sole cognitive mechanism that is unique to human language. We first review animal studies of recursive grammar, before turning to the claim that recursion is a property of all human languages. Finally, we discuss other views on what abilities may be unique to human language. PMID:23105948

  15. A recursive Bayesian approach for fatigue damage prognosis: An experimental validation at the reliability component level

    NASA Astrophysics Data System (ADS)

    Gobbato, Maurizio; Kosmatka, John B.; Conte, Joel P.

    2014-04-01

    Fatigue-induced damage is one of the most uncertain and highly unpredictable failure mechanisms for a large variety of mechanical and structural systems subjected to cyclic and random loads during their service life. A health monitoring system capable of (i) monitoring the critical components of these systems through non-destructive evaluation (NDE) techniques, (ii) assessing their structural integrity, (iii) recursively predicting their remaining fatigue life (RFL), and (iv) providing a cost-efficient reliability-based inspection and maintenance plan (RBIM) is therefore ultimately needed. In support of these objectives, the first part of the paper provides an overview and extension of a comprehensive reliability-based fatigue damage prognosis methodology, previously developed by the authors, for recursively predicting and updating the RFL of critical structural components and/or sub-components in aerospace structures. In the second part of the paper, a set of experimental fatigue test data, available in the literature, is used to provide a numerical verification and an experimental validation of the proposed framework at the reliability component level (i.e., single damage mechanism evolving at a single damage location). The results obtained from this study demonstrate (i) the importance and the benefits of a nearly continuous NDE monitoring system, (ii) the efficiency of the recursive Bayesian updating scheme, and (iii) the robustness of the proposed framework in recursively updating and improving the RFL estimations. This study also demonstrates that the proposed methodology can lead either to an extension of the RFL (with a consequent economic gain and without compromising the minimum safety requirements) or to an increase in safety by detecting a premature fault and thereby avoiding a very costly catastrophic failure.
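    As a toy illustration of the recursive Bayesian updating step emphasized in this record, the sketch below updates a grid posterior over an exponential crack-growth-rate parameter after each simulated inspection and converts the MAP estimate into a remaining-fatigue-life figure; the growth law, noise level, and critical crack length are assumptions, not the authors' damage model.

      # Grid posterior over a crack-growth rate c (da/dN = c * a), updated after
      # each noisy inspection, followed by a remaining-fatigue-life estimate.
      import numpy as np

      rng = np.random.default_rng(4)
      true_c, a0, sigma = 5e-5, 1.0, 0.05          # growth rate, initial crack (mm), noise
      cycles = np.arange(0, 20001, 2000)
      a_true = a0 * np.exp(true_c * cycles)
      meas = a_true + rng.normal(0, sigma, cycles.size)

      c_grid = np.linspace(1e-5, 1e-4, 501)        # prior support for the unknown rate
      post = np.full(c_grid.size, 1.0 / c_grid.size)
      for n, a_obs in zip(cycles, meas):
          pred = a0 * np.exp(c_grid * n)
          post *= np.exp(-0.5 * ((a_obs - pred) / sigma) ** 2)   # Gaussian likelihood
          post /= post.sum()                       # recursive update after each inspection

      a_crit = 5.0                                 # assumed critical crack length (mm)
      c_map = c_grid[np.argmax(post)]
      rfl = np.log(a_crit / meas[-1]) / c_map      # cycles remaining after the last inspection
      print(f"MAP growth rate {c_map:.2e} per cycle, estimated RFL {rfl:,.0f} cycles")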

  16. A comparison of latent class, K-means, and K-median methods for clustering dichotomous data.

    PubMed

    Brusco, Michael J; Shireman, Emilie; Steinley, Douglas

    2017-09-01

    The problem of partitioning a collection of objects based on their measurements on a set of dichotomous variables is a well-established problem in psychological research, with applications including clinical diagnosis, educational testing, cognitive categorization, and choice analysis. Latent class analysis and K-means clustering are popular methods for partitioning objects based on dichotomous measures in the psychological literature. The K-median clustering method has recently been touted as a potentially useful tool for psychological data and might be preferable to its close neighbor, K-means, when the variable measures are dichotomous. We conducted simulation-based comparisons of the latent class, K-means, and K-median approaches for partitioning dichotomous data. Although all 3 methods proved capable of recovering cluster structure, K-median clustering yielded the best average performance, followed closely by latent class analysis. We also report results for the 3 methods within the context of an application to transitive reasoning data, in which it was found that the 3 approaches can exhibit profound differences when applied to real data. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
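    The K-median variant discussed above differs from K-means mainly in the distance (L1 rather than squared Euclidean) and the center update (coordinate-wise median, which for 0/1 data is a within-cluster majority vote). The sketch below is a bare-bones implementation on synthetic dichotomous responses; it is not the simulation design or software used in the study.

      # Minimal K-median clustering for dichotomous (0/1) data.
      import numpy as np

      def k_median_binary(X, k, iters=50, seed=0):
          rng = np.random.default_rng(seed)
          centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
          for _ in range(iters):
              d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)  # L1 distances
              labels = d.argmin(axis=1)
              new_centers = np.array([np.median(X[labels == j], axis=0) if np.any(labels == j)
                                      else centers[j] for j in range(k)])
              if np.allclose(new_centers, centers):
                  break
              centers = new_centers
          return labels, centers

      rng = np.random.default_rng(1)
      proto = rng.integers(0, 2, size=(3, 12))                   # three latent response profiles
      X = np.vstack([np.where(rng.random((60, 12)) < 0.15, 1 - p, p) for p in proto])  # noisy copies
      labels, centers = k_median_binary(X, k=3)
      print(np.bincount(labels))                                 # recovered cluster sizes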

  17. A Survey on Teaching and Learning Recursive Programming

    ERIC Educational Resources Information Center

    Rinderknecht, Christian

    2014-01-01

    We survey the literature about the teaching and learning of recursive programming. After a short history of the advent of recursion in programming languages and its adoption by programmers, we present curricular approaches to recursion, including a review of textbooks and some programming methodology, as well as the functional and imperative…

  18. Cluster formation and drag reduction-proposed mechanism of particle recirculation within the partition column of the bottom spray fluid-bed coater.

    PubMed

    Wang, Li Kun; Heng, Paul Wan Sia; Liew, Celine Valeria

    2015-04-01

    Bottom spray fluid-bed coating is a common technique for coating multiparticulates. Under the quality-by-design framework, particle recirculation within the partition column is one of the main variability sources affecting particle coating and coat uniformity. However, the occurrence and mechanism of particle recirculation within the partition column of the coater are not well understood. The purpose of this study was to visualize and define particle recirculation within the partition column. Based on different combinations of partition gap setting, air accelerator insert diameter, and particle size fraction, particle movements within the partition column were captured using a high-speed video camera. The particle recirculation probability and voidage information were mapped using a visiometric process analyzer. High-speed images showed that particles contributing to the recirculation phenomenon were behaving as clustered colonies. Fluid dynamics analysis indicated that particle recirculation within the partition column may be attributed to the combined effect of cluster formation and drag reduction. Both visiometric process analysis and particle coating experiments showed that smaller particles had greater propensity toward cluster formation than larger particles. The influence of cluster formation on coating performance and possible solutions to cluster formation were further discussed. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  19. An Accelerated Recursive Doubling Algorithm for Block Tridiagonal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seal, Sudip K

    2014-01-01

    Block tridiagonal systems of linear equations arise in a wide variety of scientific and engineering applications. The recursive doubling algorithm is a well-known prefix computation-based numerical algorithm that requires O(M^3(N/P + log P)) work to compute the solution of a block tridiagonal system with N block rows and block size M on P processors. In real-world applications, solutions of tridiagonal systems are most often sought with multiple, often hundreds and thousands, of different right hand sides but with the same tridiagonal matrix. Here, we show that a recursive doubling algorithm is sub-optimal when computing solutions of block tridiagonal systems with multiple right hand sides and present a novel algorithm, called the accelerated recursive doubling algorithm, that delivers O(R) improvement when solving block tridiagonal systems with R distinct right hand sides. Since R is typically about 100-1000, this improvement translates to very significant speedups in practice. Detailed complexity analyses of the new algorithm with empirical confirmation of runtime improvements are presented. To the best of our knowledge, this algorithm has not been reported before in the literature.
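    The speedup described above comes from reusing work across right-hand sides. The serial sketch below makes the same point in miniature: a block Thomas solve that LU-factorizes each pivot block once and then processes all R right-hand sides together, so the cubic factorization cost is not repeated per right-hand side. It is not the parallel recursive doubling algorithm of the record.

      # Block Thomas solve for a block tridiagonal system with R right-hand sides.
      import numpy as np
      from scipy.linalg import lu_factor, lu_solve

      def block_thomas_multi_rhs(A, B, C, D):
          """A, B, C: lists of MxM sub-, main- and super-diagonal blocks
          (A[0] and C[-1] unused); D: right-hand sides of shape (N, M, R)."""
          N = len(B)
          Bw = [b.copy() for b in B]
          Dw = D.copy()
          factors = []
          for i in range(N):
              if i > 0:                                    # forward elimination
                  Bw[i] = Bw[i] - A[i] @ lu_solve(factors[i - 1], C[i - 1])
                  Dw[i] = Dw[i] - A[i] @ lu_solve(factors[i - 1], Dw[i - 1])
              factors.append(lu_factor(Bw[i]))             # factor each pivot block once
          X = np.empty_like(Dw)
          X[N - 1] = lu_solve(factors[N - 1], Dw[N - 1])
          for i in range(N - 2, -1, -1):                   # back substitution, all R at once
              X[i] = lu_solve(factors[i], Dw[i] - C[i] @ X[i + 1])
          return X

      rng = np.random.default_rng(0)
      N, M, R = 4, 3, 5
      A = [rng.standard_normal((M, M)) for _ in range(N)]
      C = [rng.standard_normal((M, M)) for _ in range(N)]
      B = [10 * np.eye(M) + rng.standard_normal((M, M)) for _ in range(N)]
      D = rng.standard_normal((N, M, R))
      X = block_thomas_multi_rhs(A, B, C, D)
      print(np.max(np.abs(A[1] @ X[0] + B[1] @ X[1] + C[1] @ X[2] - D[1])))  # residual near zero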

  20. Short-course whole-brain radiotherapy (WBRT) for brain metastases due to small-cell lung cancer (SCLC).

    PubMed

    Bohlen, Guenther; Meyners, Thekla; Kieckebusch, Susanne; Lohynska, Radka; Veninga, Theo; Stalpers, Lukas J A; Schild, Steven E; Rades, Dirk

    2010-04-01

    Many patients with brain metastases due to SCLC have a poor survival prognosis. The most common treatment is whole-brain radiotherapy (WBRT). This retrospective study compares short-course WBRT with 5x4Gy in 1 week to standard WBRT with 10x3Gy in 2 weeks. Forty-four SCLC patients receiving WBRT with 5x4Gy were compared to 102 patients receiving 10x3Gy for survival (OS) and local (intracerebral) control (LC). Seven further potential prognostic factors were investigated: age, gender, Karnofsky Performance Score (KPS), number of brain metastases, extracerebral metastases, interval from tumor diagnosis to WBRT, RPA (Recursive Partitioning Analysis) class. After 5x4Gy, 12-month OS was 15%, versus 22% after 10x3Gy (p=0.69). On multivariate analysis, improved OS was associated with age or=70 (p<0.001), <4 brain metastases (p=0.011), and RPA class 1 (p<0.001). 12-month LC was 34% after 5x4Gy versus 25% after 10x3Gy (p=0.32). On multivariate analysis, improved LC was associated with KPS >or=70 (p<0.001), <4 brain metastases (p=0.027), and RPA class 1 (p<0.001). In patients with brain metastases due to SCLC, short-course WBRT with 5x4Gy provided similar outcomes as 10x3Gy and appears preferable, particularly for patients with poor estimated survival.

  1. A recursive vesicle-based model protocell with a primitive model cell cycle

    PubMed Central

    Kurihara, Kensuke; Okura, Yusaku; Matsuo, Muneyuki; Toyota, Taro; Suzuki, Kentaro; Sugawara, Tadashi

    2015-01-01

    Self-organized lipid structures (protocells) have been proposed as an intermediate between nonliving material and cellular life. Synthetic production of model protocells can demonstrate the potential processes by which living cells first arose. While we have previously described a giant vesicle (GV)-based model protocell in which amplification of DNA was linked to self-reproduction, the ability of a protocell to recursively self-proliferate for multiple generations has not been demonstrated. Here we show that newborn daughter GVs can be restored to the status of their parental GVs by pH-induced vesicular fusion of daughter GVs with conveyer GVs filled with depleted substrates. We describe a primitive model cell cycle comprising four discrete phases (ingestion, replication, maturity and division), each of which is selectively activated by a specific external stimulus. The production of recursive self-proliferating model protocells represents a step towards eventual production of model protocells that are able to mimic evolution. PMID:26418735

  2. Rotational-path decomposition based recursive planning for spacecraft attitude reorientation

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Wang, Hui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying

    2018-02-01

    Spacecraft reorientation is a common task in many space missions. With multiple pointing constraints, the constrained spacecraft reorientation planning problem is very difficult to solve. To deal with this problem, an efficient rotational-path decomposition based recursive planning (RDRP) method is proposed in this paper. A uniform planning process first solves all attitude rotations without considering pointing constraints. Then the whole path is checked node by node. If any pointing constraint is violated, the nearest critical increment approach is used to generate feasible alternative nodes in the process of rotational-path decomposition. As the planning path of each subdivision may still violate pointing constraints, multiple decompositions may be needed, and the reorientation planning is therefore designed in a recursive manner. Simulation results demonstrate the effectiveness of the proposed method. The proposed method has been successfully applied to the onboard constrained attitude reorientation planning problem of two SPARK microsatellites, which were developed by the Shanghai Engineering Center for Microsatellites and launched on 22 December 2016.

  3. Survival Outcomes of Whole-Pelvic Versus Prostate-Only Radiation Therapy for High-Risk Prostate Cancer Patients With Use of the National Cancer Data Base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amini, Arya; Jones, Bernard L.; Yeh, Norman

    Purpose/Objectives: The addition of whole pelvic (WP) compared with prostate-only (PO) radiation therapy (RT) for clinically node-negative prostate cancer remains controversial. The purpose of our study was to evaluate the survival benefit of adding WPRT versus PO-RT for high-risk, node-negative prostate cancer, using the National Cancer Data Base (NCDB). Methods and Materials: Patients with high-risk prostate cancer treated from 2004 to 2006, with available data for RT volume, coded as prostate and pelvis (WPRT) or prostate alone (PO-RT), were included. Multivariate analysis (MVA) and propensity-score matched analysis (PSM) were performed. Recursive partitioning analysis (RPA) based on overall survival (OS) using Gleason score (GS), T stage, and pretreatment prostate-specific antigen (PSA) was also conducted. Results: A total of 14,817 patients were included: 7606 (51.3%) received WPRT, and 7211 (48.7%) received PO-RT. The median follow-up time was 81 months (range, 2-122 months). Under MVA, the addition of WPRT for high-risk patients had no OS benefit compared with PO-RT (HR 1.05; P=.100). On subset analysis, patients receiving dose-escalated RT also did not benefit from WPRT (HR 1.01; P=.908). PSM confirmed no survival benefit with the addition of WPRT for high-risk patients (HR 1.05; P=.141). In addition, RPA was unable to demonstrate a survival benefit of WPRT for any subset. Other prognostic factors for inferior OS under MVA included older age (HR 1.25; P<.001), increasing comorbidity scores (HR 1.46; P<.001), higher T stage (HR 1.17; P<.001), PSA (HR 1.81; P<.001), and GS (HR 1.29; P<.001), and decreasing median county household income (HR 1.15; P=.011). Factors improving OS included the addition of androgen deprivation therapy (HR 0.92; P=.033), combination external beam RT plus brachytherapy boost (HR 0.71; P<.001), and treatment at an academic/research institution (HR 0.84; P=.002). Conclusion: In the largest reported analysis of WPRT for patients with high-risk prostate cancer treated in the dose-escalated era, the addition of WPRT demonstrated no survival advantage compared with PO-RT.

  4. Recursion in aphasia.

    PubMed

    Bánréti, Zoltán

    2010-11-01

    This study investigates how aphasic impairment impinges on syntactic and/or semantic recursivity of human language. A series of tests has been conducted with the participation of five Hungarian speaking aphasic subjects and 10 control subjects. Photographs representing simple situations were presented to subjects and questions were asked about them. The responses are supposed to involve formal structural recursion, but they contain semantic-pragmatic operations instead, with 'theory of mind' type embeddings. Aphasic individuals tend to exploit the parallel between 'theory of mind' embeddings and syntactic-structural embeddings in order to avoid formal structural recursion. Formal structural recursion may be more impaired in Broca's aphasia and semantic recursivity may remain selectively unimpaired in this type of aphasia.

  5. ADMET Evaluation in Drug Discovery. 16. Predicting hERG Blockers by Combining Multiple Pharmacophores and Machine Learning Approaches.

    PubMed

    Wang, Shuangquan; Sun, Huiyong; Liu, Hui; Li, Dan; Li, Youyong; Hou, Tingjun

    2016-08-01

    Blockade of human ether-à-go-go related gene (hERG) channel by compounds may lead to drug-induced QT prolongation, arrhythmia, and Torsades de Pointes (TdP), and therefore reliable prediction of hERG liability in the early stages of drug design is quite important to reduce the risk of cardiotoxicity-related attritions in the later development stages. In this study, pharmacophore modeling and machine learning approaches were combined to construct classification models to distinguish hERG active from inactive compounds based on a diverse data set. First, an optimal ensemble of pharmacophore hypotheses that had good capability to differentiate hERG active from inactive compounds was identified by the recursive partitioning (RP) approach. Then, the naive Bayesian classification (NBC) and support vector machine (SVM) approaches were employed to construct classification models by integrating multiple important pharmacophore hypotheses. The integrated classification models showed improved predictive capability over any single pharmacophore hypothesis, suggesting that the broad binding polyspecificity of hERG can only be well characterized by multiple pharmacophores. The best SVM model achieved the prediction accuracies of 84.7% for the training set and 82.1% for the external test set. Notably, the accuracies for the hERG blockers and nonblockers in the test set reached 83.6% and 78.2%, respectively. Analysis of significant pharmacophores helps to understand the multimechanisms of action of hERG blockers. We believe that the combination of pharmacophore modeling and SVM is a powerful strategy to develop reliable theoretical models for the prediction of potential hERG liability.

  6. Teaching and Learning Recursive Programming: A Review of the Research Literature

    ERIC Educational Resources Information Center

    McCauley, Renée; Grissom, Scott; Fitzgerald, Sue; Murphy, Laurie

    2015-01-01

    Hundreds of articles have been published on the topics of teaching and learning recursion, yet fewer than 50 of them have published research results. This article surveys the computing education research literature and presents findings on challenges students encounter in learning recursion, mental models students develop as they learn recursion,…

  7. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…

  8. Recursive Objects--An Object Oriented Presentation of Recursion

    ERIC Educational Resources Information Center

    Sher, David B.

    2004-01-01

    Generally, when recursion is introduced to students the concept is illustrated with a toy (Towers of Hanoi) and some abstract mathematical functions (factorial, power, Fibonacci). These illustrate recursion in the same sense that counting to 10 can be used to illustrate a for loop. These are all good illustrations, but do not represent serious…

  9. How children perceive fractals: Hierarchical self-similarity and cognitive development

    PubMed Central

    Martins, Maurício Dias; Laaha, Sabine; Freiberger, Eva Maria; Choi, Soonja; Fitch, W. Tecumseh

    2014-01-01

    The ability to understand and generate hierarchical structures is a crucial component of human cognition, available in language, music, mathematics and problem solving. Recursion is a particularly useful mechanism for generating complex hierarchies by means of self-embedding rules. In the visual domain, fractals are recursive structures in which simple transformation rules generate hierarchies of infinite depth. Research on how children acquire these rules can provide valuable insight into the cognitive requirements and learning constraints of recursion. Here, we used fractals to investigate the acquisition of recursion in the visual domain, and probed for correlations with grammar comprehension and general intelligence. We compared second (n = 26) and fourth graders (n = 26) in their ability to represent two types of rules for generating hierarchical structures: Recursive rules, on the one hand, which generate new hierarchical levels; and iterative rules, on the other hand, which merely insert items within hierarchies without generating new levels. We found that the majority of fourth graders, but not second graders, were able to represent both recursive and iterative rules. This difference was partially accounted for by second graders' impairment in detecting hierarchical mistakes, and correlated with between-grade differences in grammar comprehension tasks. Empirically, recursion and iteration also differed in at least one crucial aspect: While the ability to learn recursive rules seemed to depend on the previous acquisition of simple iterative representations, the opposite was not true, i.e., children were able to acquire iterative rules before they acquired recursive representations. These results suggest that the acquisition of recursion in vision follows learning constraints similar to the acquisition of recursion in language, and that both domains share cognitive resources involved in hierarchical processing. PMID:24955884

  10. Physicochemical properties/descriptors governing the solubility and partitioning of chemicals in water-solvent-gas systems. Part 1. Partitioning between octanol and air.

    PubMed

    Raevsky, O A; Grigor'ev, V J; Raevskaja, O E; Schaper, K-J

    2006-06-01

    QSPR analyses of a data set containing experimental partition coefficients in the three systems octanol-water, water-gas, and octanol-gas for 98 chemicals have shown that it is possible to calculate any partition coefficient in the system 'gas phase/octanol/water' by three different approaches: (1) from experimental partition coefficients obtained in the corresponding two other subsystems; however, in many cases these data may not be available. A solution may then be approach (2), a traditional QSPR analysis based on, e.g., HYBOT descriptors (hydrogen bond acceptor and donor factors, SigmaCa and SigmaCd, together with polarisability alpha, a steric bulk effect descriptor) supplemented with substructural indicator variables. (3) A very promising approach combines the similarity concept with QSPR based on HYBOT descriptors. In this approach, observed partition coefficients of structurally nearest neighbours of a compound-of-interest are used. In addition, contributions arising from differences in alpha, SigmaCa, and SigmaCd values between the compound-of-interest and its nearest neighbour(s) are considered. In this investigation, highly significant relationships were obtained by approaches (1) and (3) for the octanol/gas phase partition coefficient (log Log).

  11. Parallelizing flow-accumulation calculations on graphics processing units—From iterative DEM preprocessing algorithm to recursive multiple-flow-direction algorithm

    NASA Astrophysics Data System (ADS)

    Qin, Cheng-Zhi; Zhan, Lijun

    2012-06-01

    As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU-based algorithms based on existing parallelization strategies.
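    A serial reference sketch of the recursive multiple-flow-direction accumulation described above is given below for a small, depression-free DEM; the slope weighting omits the neighbour-distance factor of the full MFD-md scheme, and the GPU parallelization that is the record's contribution is not shown.

      # Recursive MFD flow accumulation on a small depression-free DEM grid.
      import numpy as np

      def mfd_accumulation(dem, p=1.1):
          rows, cols = dem.shape
          acc = np.full(dem.shape, -1.0)
          nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

          def upslope(r, c):
              """Accumulation at (r, c): own unit area plus weighted upslope contributions."""
              if acc[r, c] >= 0:
                  return acc[r, c]
              total = 1.0
              for dr, dc in nbrs:
                  ur, uc = r + dr, c + dc
                  if not (0 <= ur < rows and 0 <= uc < cols) or dem[ur, uc] <= dem[r, c]:
                      continue
                  # fraction of the upslope cell's flow routed here (drop-weighted MFD,
                  # distance factor omitted for brevity)
                  drops = [(dem[ur, uc] - dem[ur + dr2, uc + dc2]) ** p
                           for dr2, dc2 in nbrs
                           if 0 <= ur + dr2 < rows and 0 <= uc + dc2 < cols
                           and dem[ur + dr2, uc + dc2] < dem[ur, uc]]
                  frac = (dem[ur, uc] - dem[r, c]) ** p / sum(drops)
                  total += frac * upslope(ur, uc)
              acc[r, c] = total
              return total

          for r in range(rows):
              for c in range(cols):
                  upslope(r, c)
          return acc

      dem = np.add.outer(np.arange(6), np.arange(6)).astype(float)  # tilted plane, no pits
      print(mfd_accumulation(dem))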

  12. Implications of Middle School Behavior Problems for High School Graduation and Employment Outcomes of Young Adults: Estimation of a Recursive Model.

    PubMed

    Karakus, Mustafa C; Salkever, David S; Slade, Eric P; Ialongo, Nicholas; Stuart, Elizabeth

    2012-01-01

    The potentially serious adverse impacts of behavior problems during adolescence on employment outcomes in adulthood provide a key economic rationale for early intervention programs. However, the extent to which lower educational attainment accounts for the total impact of adolescent behavior problems on later employment remains unclear. As an initial step in exploring this issue, we specify and estimate a recursive bivariate probit model that 1) relates middle school behavior problems to high school graduation and 2) models later employment in young adulthood as a function of these behavior problems and of high school graduation. Our model thus allows for both a direct effect of behavior problems on later employment as well as an indirect effect that operates via graduation from high school. Our empirical results, based on analysis of data from the NELS, suggest that the direct effects of externalizing behavior problems on later employment are not significant but that these problems have important indirect effects operating through high school graduation.
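    The recursive structure described above can be pictured as two linked probit equations: behavior problems enter the graduation equation, and both behavior problems and graduation enter the employment equation. The sketch below fits the two equations separately on synthetic data with statsmodels, which ignores the cross-equation error correlation that the full recursive bivariate probit estimates jointly; the coefficients are invented for illustration and are unrelated to the NELS results.

      # Two linked probits on synthetic data (a simplification of the recursive
      # bivariate probit, which estimates both equations jointly).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 5000
      behavior = rng.binomial(1, 0.2, n).astype(float)        # middle-school behavior problems
      graduate = (0.8 - 1.0 * behavior + rng.normal(0, 1, n) > 0).astype(float)
      employed = (0.3 - 0.2 * behavior + 0.9 * graduate + rng.normal(0, 1, n) > 0).astype(float)

      eq1 = sm.Probit(graduate, sm.add_constant(behavior)).fit(disp=0)
      exog2 = sm.add_constant(np.column_stack([behavior, graduate]))
      eq2 = sm.Probit(employed, exog2).fit(disp=0)
      print(eq1.params)   # effect of behavior problems on graduation
      print(eq2.params)   # direct employment effect plus the graduation channel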

  13. Adherence and Recursive Perception Among Young Adults with Cystic Fibrosis.

    PubMed

    Oddleifson, D August; Sawicki, Gregory S

    2017-04-01

    Adherence to prescribed treatment is a pressing issue for adolescents and young adults with cystic fibrosis (CF). This paper presents two narratives from the thematic analysis of unstructured interviews with 14 adolescents, young adults, and older adults living with CF. Through a new identity-based framework termed recursive perception that draws focus on how an individual perceives how others view them, it explores the social context of adherence and self-care among young adults with CF. It demonstrates that an individual's understanding of self and desire to maintain a certain image for peers can be deeply embedded in adherence and self-care patterns, leading individuals to feel they need to choose between tending to their health needs and living their lives. This suggests that current biomedical innovation in CF care must be complemented with renewed efforts to find effective means to empower young adults with CF to successfully navigate the social challenges of their illness and avoid the pitfalls of nonadherence that can lead to a permanent worsening of their health condition.

  14. Model evaluation of the phytoextraction potential of heavy metal hyperaccumulators and non-hyperaccumulators.

    PubMed

    Liang, Hong-Ming; Lin, Ting-Hsiang; Chiou, Jeng-Min; Yeh, Kuo-Chen

    2009-06-01

    Evaluation of the remediation ability of zinc/cadmium in hyper- and non-hyperaccumulator plant species through greenhouse studies is limited. To bridge the gap between greenhouse studies and field applications for phytoextraction, we used published data to examine the partitioning of heavy metals between plants and soil (defined as the bioconcentration factor). We compared the remediation ability of the Zn/Cd hyperaccumulators Thlaspi caerulescens and Arabidopsis halleri and the non-hyperaccumulators Nicotiana tabacum and Brassica juncea using a hierarchical linear model (HLM). A recursive algorithm was then used to evaluate how many harvest cycles were required to clean a contaminated site to meet Taiwan Environmental Protection Agency regulations. Despite the high bioconcentration factor of both hyperaccumulators, metal removal was still limited because of the plants' small biomass. Simulation with N. tabacum and the Cadmium model suggests further study and development of plants with high biomass and improved phytoextraction potential for use in environmental cleanup.

  15. Excess BMI in Childhood: A Modifiable Risk Factor for Type 1 Diabetes Development?

    PubMed

    Ferrara, Christine Therese; Geyer, Susan Michelle; Liu, Yuk-Fun; Evans-Molina, Carmella; Libman, Ingrid M; Besser, Rachel; Becker, Dorothy J; Rodriguez, Henry; Moran, Antoinette; Gitelman, Stephen E; Redondo, Maria J

    2017-05-01

    We aimed to determine the effect of elevated BMI over time on the progression to type 1 diabetes in youth. We studied 1,117 children in the TrialNet Pathway to Prevention cohort (autoantibody-positive relatives of patients with type 1 diabetes). Longitudinally accumulated BMI above the 85th age- and sex-adjusted percentile generated a cumulative excess BMI (ceBMI) index. Recursive partitioning and multivariate analyses yielded sex- and age-specific ceBMI thresholds for greatest type 1 diabetes risk. Higher ceBMI conferred significantly greater risk of progressing to type 1 diabetes. The increased diabetes risk occurred at lower ceBMI values in children <12 years of age compared with older subjects and in females versus males. Elevated BMI is associated with increased risk of diabetes progression in pediatric autoantibody-positive relatives, but the effect varies by sex and age. © 2017 by the American Diabetes Association.

  16. Excess BMI in Childhood: A Modifiable Risk Factor for Type 1 Diabetes Development?

    PubMed Central

    Liu, Yuk-Fun; Evans-Molina, Carmella; Libman, Ingrid M.; Besser, Rachel; Becker, Dorothy J.; Rodriguez, Henry; Moran, Antoinette; Gitelman, Stephen E.; Redondo, Maria J.

    2017-01-01

    OBJECTIVE We aimed to determine the effect of elevated BMI over time on the progression to type 1 diabetes in youth. RESEARCH DESIGN AND METHODS We studied 1,117 children in the TrialNet Pathway to Prevention cohort (autoantibody-positive relatives of patients with type 1 diabetes). Longitudinally accumulated BMI above the 85th age- and sex-adjusted percentile generated a cumulative excess BMI (ceBMI) index. Recursive partitioning and multivariate analyses yielded sex- and age-specific ceBMI thresholds for greatest type 1 diabetes risk. RESULTS Higher ceBMI conferred significantly greater risk of progressing to type 1 diabetes. The increased diabetes risk occurred at lower ceBMI values in children <12 years of age compared with older subjects and in females versus males. CONCLUSIONS Elevated BMI is associated with increased risk of diabetes progression in pediatric autoantibody-positive relatives, but the effect varies by sex and age. PMID:28202550

  17. A recursive algorithm for Zernike polynomials

    NASA Technical Reports Server (NTRS)

    Davenport, J. W.

    1982-01-01

    The analysis of a function defined on a rotationally symmetric system, with either a circular or an annular pupil, is discussed. To numerically analyze such systems, it is typical to expand the given function in terms of a class of orthogonal polynomials. Because of their particular properties, the Zernike polynomials are especially suited for numerical calculations. A recursive algorithm is developed that can be used to generate the Zernike polynomials up to a given order. The algorithm is recursively defined over J, where R(J,N) is the Zernike polynomial of degree N obtained by orthogonalizing the sequence R(J), R(J+2), ..., R(J+2N) over (epsilon, 1). The terms in the preceding row - the (J-1) row - up to the (N+1)th term are needed to generate the (J,N)th term; the algorithm therefore generates an upper left-triangular table. The algorithm was implemented on a computer, together with the necessary support program.
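
    A minimal Python sketch of recursive generation of the radial Zernike polynomials is given below; it uses the standard Prata-Rusch-type recurrence over the full unit disk, which is an assumption on our part and not necessarily the report's table-based scheme for the annular interval (epsilon, 1).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho) via the recurrence
    R_n^m = (2n/(n+m)) * rho * R_{n-1}^{|m-1|} + ((m-n)/(n+m)) * R_{n-2}^m,
    with R_m^m = rho**m and R_n^m = 0 when n - m is odd."""
    m = abs(m)
    if n < m or (n - m) % 2:
        return 0.0
    if n == m:
        return rho ** m
    c1 = 2.0 * n / (n + m)
    c2 = (m - n) / (n + m)
    return c1 * rho * zernike_radial(n - 1, abs(m - 1), rho) \
        + c2 * zernike_radial(n - 2, m, rho)

if __name__ == "__main__":
    # check against the closed form R_4^0(rho) = 6 rho^4 - 6 rho^2 + 1
    rho = 0.5
    print(zernike_radial(4, 0, rho), 6 * rho**4 - 6 * rho**2 + 1)
```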

  18. EEG and MEG source localization using recursively applied (RAP) MUSIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J.C.; Leahy, R.M.

    1996-12-31

    The multiple signal characterization (MUSIC) algorithm locates multiple asynchronous dipolar sources from electroencephalography (EEG) and magnetoencephalography (MEG) data. A signal subspace is estimated from the data, then the algorithm scans a single dipole model through a three-dimensional head volume and computes projections onto this subspace. To locate the sources, the user must search the head volume for local peaks in the projection metric. Here we describe a novel extension of this approach which we refer to as RAP (Recursively APplied) MUSIC. This new procedure automatically extracts the locations of the sources through a recursive use of subspace projections, which uses the metric of principal correlations as a multidimensional form of correlation analysis between the model subspace and the data subspace. The dipolar orientations, a form of 'diverse polarization,' are easily extracted using the associated principal vectors.
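
    As a rough illustration of the recursive subspace-projection idea, the numpy sketch below handles only fixed-orientation sources with a precomputed lead-field matrix; the inputs are placeholders, and the full RAP-MUSIC algorithm treats dipole orientation through principal angles rather than this simplified correlation.

```python
import numpy as np

def rap_music(data, leadfields, n_sources, n_signal):
    """Simplified RAP-MUSIC for fixed-orientation sources.

    data        : channels x samples measurement matrix
    leadfields  : channels x n_grid matrix, one gain column per candidate location
    n_sources   : number of sources to extract
    n_signal    : assumed signal-subspace dimension
    """
    found = []
    proj = np.eye(data.shape[0])                 # out-projector of found topographies
    for _ in range(n_sources):
        # signal subspace of the (projected) data
        U, _, _ = np.linalg.svd(proj @ data, full_matrices=False)
        Us = U[:, :n_signal]
        # subspace correlation of every (projected) candidate gain vector
        G = proj @ leadfields
        corr = np.linalg.norm(Us.T @ G, axis=0) / (np.linalg.norm(G, axis=0) + 1e-12)
        best = int(np.argmax(corr))
        found.append(best)
        # project the selected topographies out before the next pass
        g = leadfields[:, found]
        proj = np.eye(data.shape[0]) - g @ np.linalg.pinv(g)
    return found
```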

  19. Characterization of groups using composite kernels and multi-source fMRI analysis data: application to schizophrenia

    PubMed Central

    Castro, Eduardo; Martínez-Ramón, Manel; Pearlson, Godfrey; Sui, Jing; Calhoun, Vince D.

    2011-01-01

    Pattern classification of brain imaging data can enable the automatic detection of differences in cognitive processes of specific groups of interest. Furthermore, it can also give neuroanatomical information related to the regions of the brain that are most relevant to detect these differences by means of feature selection procedures, which are also well-suited to deal with the high dimensionality of brain imaging data. This work proposes the application of recursive feature elimination using a machine learning algorithm based on composite kernels to the classification of healthy controls and patients with schizophrenia. This framework, which evaluates nonlinear relationships between voxels, analyzes whole-brain fMRI data from an auditory task experiment that is segmented into anatomical regions and recursively eliminates the uninformative ones based on their relevance estimates, thus yielding the set of most discriminative brain areas for group classification. The collected data was processed using two analysis methods: the general linear model (GLM) and independent component analysis (ICA). GLM spatial maps as well as ICA temporal lobe and default mode component maps were then input to the classifier. A mean classification accuracy of up to 95% estimated with a leave-two-out cross-validation procedure was achieved by doing multi-source data classification. In addition, it is shown that the classification accuracy rate obtained by using multi-source data surpasses that reached by using single-source data, hence showing that this algorithm takes advantage of the complementary nature of GLM and ICA. PMID:21723948
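
    A generic scikit-learn sketch of recursive feature elimination with a linear SVM on synthetic data is shown below; the composite-kernel machine, region-wise features and leave-two-out validation of the paper are not reproduced, so all data and parameters here are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# synthetic stand-in for region-averaged fMRI features (placeholder data)
X, y = make_classification(n_samples=100, n_features=50, n_informative=8,
                           random_state=0)

# recursively eliminate the least informative half of the features per step
selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10, step=0.5)
model = make_pipeline(selector, SVC(kernel="linear", C=1.0))

scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy: %.2f" % scores.mean())
```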

  20. Cognitive representation of "musical fractals": Processing hierarchy and recursion in the auditory domain.

    PubMed

    Martins, Mauricio Dias; Gingras, Bruno; Puig-Waldmueller, Estela; Fitch, W Tecumseh

    2017-04-01

    The human ability to process hierarchical structures has been a longstanding research topic. However, the nature of the cognitive machinery underlying this faculty remains controversial. Recursion, the ability to embed structures within structures of the same kind, has been proposed as a key component of our ability to parse and generate complex hierarchies. Here, we investigated the cognitive representation of both recursive and iterative processes in the auditory domain. The experiment used a two-alternative forced-choice paradigm: participants were exposed to three-step processes in which pure-tone sequences were built either through recursive or iterative processes, and had to choose the correct completion. Foils were constructed according to generative processes that did not match the previous steps. Both musicians and non-musicians were able to represent recursion in the auditory domain, although musicians performed better. We also observed that general 'musical' aptitudes played a role in both recursion and iteration, although the influence of musical training was somehow independent from melodic memory. Moreover, unlike iteration, recursion in audition was well correlated with its non-auditory (recursive) analogues in the visual and action sequencing domains. These results suggest that the cognitive machinery involved in establishing recursive representations is domain-general, even though this machinery requires access to information resulting from domain-specific processes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  1. The rid-redundant procedure in C-Prolog

    NASA Technical Reports Server (NTRS)

    Chen, Huo-Yan; Wah, Benjamin W.

    1987-01-01

    C-Prolog can conveniently be used for logical inference on knowledge bases. However, as with many search methods that use backward chaining, a large amount of redundant computation may be produced in recursive calls. To overcome this problem, the 'rid-redundant' procedure was designed to eliminate all redundant computations when running multi-recursive procedures. Experimental results obtained for C-Prolog on the VAX 11/780 computer show an order-of-magnitude improvement in running time and solvable problem size.
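
    The redundancy problem described here is analogous to repeatedly re-deriving the same subgoal in a doubly recursive definition. The Python sketch below (a generic illustration with memoization, not the C-Prolog 'rid-redundant' procedure itself) shows how tabling previously derived results removes that redundancy.

```python
from functools import lru_cache

def calls_naive(n, counter):
    """Naive doubly recursive definition: the same subgoals are re-derived many times."""
    counter[0] += 1
    if n < 2:
        return 1
    return calls_naive(n - 1, counter) + calls_naive(n - 2, counter)

@lru_cache(maxsize=None)
def calls_tabled(n):
    """Same relation with memoization: each subgoal is derived only once."""
    if n < 2:
        return 1
    return calls_tabled(n - 1) + calls_tabled(n - 2)

counter = [0]
calls_naive(25, counter)
print("naive derivations:", counter[0])       # exponential in n
print("tabled result:", calls_tabled(25))     # linear number of distinct subgoals
```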

  2. Dynamically heterogenous partitions and phylogenetic inference: an evaluation of analytical strategies with cytochrome b and ND6 gene sequences in cranes.

    PubMed

    Krajewski, C; Fain, M G; Buckley, L; King, D G

    1999-11-01

    Debates over whether molecular sequence data should be partitioned for phylogenetic analysis often confound two types of heterogeneity among partitions. We distinguish historical heterogeneity (i.e., different partitions have different evolutionary relationships) from dynamic heterogeneity (i.e., different partitions show different patterns of sequence evolution) and explore the impact of the latter on phylogenetic accuracy and precision with a two-gene, mitochondrial data set for cranes. The well-established phylogeny of cranes allows us to contrast tree-based estimates of relevant parameter values with estimates based on pairwise comparisons and to ascertain the effects of incorporating different amounts of process information into phylogenetic estimates. We show that codon positions in the cytochrome b and NADH dehydrogenase subunit 6 genes are dynamically heterogenous under both Poisson and invariable-sites + gamma-rates versions of the F84 model and that heterogeneity includes variation in base composition and transition bias as well as substitution rate. Estimates of transition-bias and relative-rate parameters from pairwise sequence comparisons were comparable to those obtained as tree-based maximum likelihood estimates. Neither rate-category nor mixed-model partitioning strategies resulted in a loss of phylogenetic precision relative to unpartitioned analyses. We suggest that weighted-average distances provide a computationally feasible alternative to direct maximum likelihood estimates of phylogeny for mixed-model analyses of large, dynamically heterogenous data sets. Copyright 1999 Academic Press.

  3. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimation and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms onto parallel processing architectures such as systolic arrays with efficient fault-tolerant schemes are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to the recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. The fault diagnosis, order degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.

  4. Improved parallel data partitioning by nested dissection with applications to information retrieval.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Michael M.; Chevalier, Cedric; Boman, Erik Gunnar

    The computational work in many information retrieval and analysis algorithms is based on sparse linear algebra. Sparse matrix-vector multiplication is a common kernel in many of these computations. Thus, an important related combinatorial problem in parallel computing is how to distribute the matrix and the vectors among processors so as to minimize the communication cost. We focus on minimizing the total communication volume while keeping the computation balanced across processes. In [1], the first two authors presented a new 2D partitioning method, the nested dissection partitioning algorithm. In this paper, we improve on that algorithm and show that it is a good option for data partitioning in information retrieval. We also show partitioning time can be substantially reduced by using the SCOTCH software, and quality improves in some cases, too.

  5. Variants of the MTHFR gene and susceptibility to acute lymphoblastic leukemia in children: a synthesis of genetic association studies.

    PubMed

    Zintzaras, Elias; Doxani, Chrysoula; Rodopoulou, Paraskevi; Bakalos, Georgios; Ziogas, Dimitris C; Ziakas, Panayiotis; Voulgarelis, Michael

    2012-04-01

    Acute lymphoblastic leukemia (ALL) is a complex disease with a genetic background. The genetic association studies (GAS) that investigated the association between ALL and the MTHFR C677T and A1298C gene variants have produced contradictory or inconclusive results. In order to decrease the uncertainty of estimated genetic risk effects, a meticulous meta-analysis of published GAS relating the variants in the MTHFR gene to susceptibility to ALL was conducted. The risk effects were estimated based on the odds ratio (OR) of the allele contrast and the generalized odds ratio (OR(G)). Cumulative and recursive cumulative meta-analyses were also performed. The analysis showed a marginally significant association for the C677T variant, overall [OR=0.91 (0.82-1.00) and OR(G)=0.89 (0.79-1.01)], and in Whites [OR=0.88 (0.77-0.99) and OR(G)=0.85 (0.73-0.99)]. The A1298C variant produced non-significant results. For both variants, the cumulative meta-analysis did not show a trend of association as evidence accumulated, and the recursive cumulative meta-analysis indicated a lack of sufficient evidence for denying or claiming an association. The current evidence is not sufficient to draw definite conclusions regarding the association of MTHFR variants and development of ALL. Copyright © 2011 Elsevier Ltd. All rights reserved.
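
    For readers unfamiliar with the pooling step, the Python sketch below computes a fixed-effect (inverse-variance) pooled odds ratio from hypothetical per-study 2x2 allele counts; it is a generic illustration and does not reproduce the random-effects, generalized odds ratio, or cumulative analyses of the published meta-analysis.

```python
import numpy as np

# hypothetical per-study allele counts: (cases_T, cases_C, controls_T, controls_C)
studies = [(120, 180, 150, 150),
           (60, 140, 70, 130),
           (200, 300, 220, 280)]

log_or, weights = [], []
for a, b, c, d in studies:
    log_or.append(np.log((a * d) / (b * c)))
    var = 1 / a + 1 / b + 1 / c + 1 / d          # variance of the log odds ratio
    weights.append(1 / var)

log_or, weights = np.array(log_or), np.array(weights)
pooled = np.sum(weights * log_or) / np.sum(weights)
se = np.sqrt(1 / np.sum(weights))
print("pooled OR = %.2f (95%% CI %.2f-%.2f)"
      % (np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)))
```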

  6. Revisiting the immune microenvironment of diffuse large B-cell lymphoma using a tissue microarray and immunohistochemistry: robust semi-automated analysis reveals CD3 and FoxP3 as potential predictors of response to R-CHOP

    PubMed Central

    Coutinho, Rita; Clear, Andrew J.; Mazzola, Emanuele; Owen, Andrew; Greaves, Paul; Wilson, Andrew; Matthews, Janet; Lee, Abigail; Alvarez, Rute; da Silva, Maria Gomes; Cabeçadas, José; Neuberg, Donna; Calaminici, Maria; Gribben, John G.

    2015-01-01

    Gene expression studies have identified the microenvironment as a prognostic player in diffuse large B-cell lymphoma. However, there is a lack of simple immune biomarkers that can be applied in the clinical setting and could be helpful in stratifying patients. Immunohistochemistry has been used for this purpose but the results are inconsistent. We decided to reinvestigate the immune microenvironment and its impact using immunohistochemistry, with two systems of image analysis, in a large set of patients with diffuse large B-cell lymphoma. Diagnostic tissue from 309 patients was arrayed onto tissue microarrays. Results from 161 chemoimmunotherapy-treated patients were used for outcome prediction. Positive cells, percentage stained area and numbers of pixels/area were quantified and results were compared with the purpose of inferring consistency between the two semi-automated systems. Measurement cutpoints were assessed using a recursive partitioning algorithm classifying results according to survival. Kaplan-Meier estimators and Fisher exact tests were evaluated to check for significant differences between measurement classes, and for dependence between pairs of measurements, respectively. Results were validated by multivariate analysis incorporating the International Prognostic Index. The concordance between the two systems of image analysis was surprisingly high, supporting their applicability for immunohistochemistry studies. Patients with a high density of CD3 and FoxP3 by both methods had a better outcome. Automated analysis should be the preferred method for immunohistochemistry studies. Following the use of two methods of semi-automated analysis we suggest that CD3 and FoxP3 play a role in predicting response to chemoimmunotherapy in diffuse large B-cell lymphoma. PMID:25425693

  7. Revisiting the immune microenvironment of diffuse large B-cell lymphoma using a tissue microarray and immunohistochemistry: robust semi-automated analysis reveals CD3 and FoxP3 as potential predictors of response to R-CHOP.

    PubMed

    Coutinho, Rita; Clear, Andrew J; Mazzola, Emanuele; Owen, Andrew; Greaves, Paul; Wilson, Andrew; Matthews, Janet; Lee, Abigail; Alvarez, Rute; da Silva, Maria Gomes; Cabeçadas, José; Neuberg, Donna; Calaminici, Maria; Gribben, John G

    2015-03-01

    Gene expression studies have identified the microenvironment as a prognostic player in diffuse large B-cell lymphoma. However, there is a lack of simple immune biomarkers that can be applied in the clinical setting and could be helpful in stratifying patients. Immunohistochemistry has been used for this purpose but the results are inconsistent. We decided to reinvestigate the immune microenvironment and its impact using immunohistochemistry, with two systems of image analysis, in a large set of patients with diffuse large B-cell lymphoma. Diagnostic tissue from 309 patients was arrayed onto tissue microarrays. Results from 161 chemoimmunotherapy-treated patients were used for outcome prediction. Positive cells, percentage stained area and numbers of pixels/area were quantified and results were compared with the purpose of inferring consistency between the two semi-automated systems. Measurement cutpoints were assessed using a recursive partitioning algorithm classifying results according to survival. Kaplan-Meier estimators and Fisher exact tests were evaluated to check for significant differences between measurement classes, and for dependence between pairs of measurements, respectively. Results were validated by multivariate analysis incorporating the International Prognostic Index. The concordance between the two systems of image analysis was surprisingly high, supporting their applicability for immunohistochemistry studies. Patients with a high density of CD3 and FoxP3 by both methods had a better outcome. Automated analysis should be the preferred method for immunohistochemistry studies. Following the use of two methods of semi-automated analysis we suggest that CD3 and FoxP3 play a role in predicting response to chemoimmunotherapy in diffuse large B-cell lymphoma. Copyright© Ferrata Storti Foundation.

  8. Clinical outcome and molecular characterization of brain metastases from esophageal and gastric cancer: a systematic review.

    PubMed

    Ghidini, Michele; Petrelli, Fausto; Hahne, Jens Claus; De Giorgi, Annamaria; Toppo, Laura; Pizzo, Claudio; Ratti, Margherita; Barni, Sandro; Passalacqua, Rodolfo; Tomasello, Gianluca

    2017-04-01

    The aim of the study was to collect the available data on central nervous system (CNS) metastases from esophageal and gastric cancer. A PubMed, EMBASE, SCOPUS, Web of Science, LILACS, Ovid and Cochrane Library search was performed. Thirty-seven studies including 779 patients were considered. Among the data extracted, treatment of tumor and brain metastases (BMs), time to BMs development, number and subsite, extracerebral metastases rate, median overall survival (OS) and prognostic factors were included. For esophageal cancer, the median OS from diagnosis of BMs was 4.2 months. Prognostic factors for OS included: performance status, multimodal therapy, adjuvant chemotherapy, single BM, brain only disease and surgery. For gastric cancer, median OS was 2.4 months. Prognostic factors for OS included: recursive partitioning analysis class 2, stereotactic radiosurgery (SRT) and use of intrathecal therapy. HER2-positive gastric cancer was shown to be associated with a higher risk and shorter time to CNS relapse. Patients harboring BMs from gastric and esophageal tumors, except cases with single lesions that are treated aggressively, have a poor prognosis. SRT (plus or minus surgery and whole brain radiotherapy) seems to give better results in terms of longer OS after brain relapse.

  9. Randomized Phase II Trial of High-Dose Melatonin and Radiation Therapy for RPA Class 2 Patients With Brain Metastases (RTOG 0119)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Lawrence; Berkey, Brian; Rich, Tyvin

    Purpose: To determine if high-dose melatonin for Radiation Therapy Oncology Group (RTOG) recursive partitioning analysis (RPA) Class 2 patients with brain metastases improved survival over historical controls, and to determine if the time of day melatonin was given affected its toxicity or efficacy. RTOG 0119 was a phase II randomized trial for this group of patients. Methods and Materials: RTOG RPA Class 2 patients with brain metastases were randomized to 20 mg of melatonin, given either in the morning (8-9 AM) or in the evening (8-9 PM). All patients received radiation therapy (30 Gy in 10 fractions) in the afternoon. Melatonin was continued until neurologic deterioration or death. The primary endpoint was overall survival time. Neurologic deterioration, as reflected by the Mini-Mental Status Examination, was also measured. Results: Neither of the randomized groups had survival distributions that differed significantly from the historic controls of patients treated with whole-brain radiotherapy. The median survivals of the morning and evening melatonin treatments were 3.4 and 2.8 months, while the RTOG historical control survival was 4.1 months. Conclusions: High-dose melatonin did not show any beneficial effect in this group of patients.

  10. Communication-Avoiding Parallel Recursive Algorithms for Matrix Multiplication

    DTIC Science & Technology

    2013-05-17

    cost recurrence is $F_{UM}(n, P) = 15\,\frac{n^2}{4P} + F_{UM}\!\left(\frac{n}{2}, \frac{P}{7}\right)$ with base case $F_{UM}(n, 1) = c_s n^{\omega_0} - 5n^2$, where $c_s$ is the constant of Strassen-Winograd ... message varies according to the recursion depth, and is the number of words a processor owns of any $S_i$, $T_i$, or $Q_i$, namely $\frac{n^2}{4P}$ words. If one does not ... recurrence for the entire UM scheme: $W_{UM}(n, P) = 36\,\frac{n^2}{4P} + W_{UM}\!\left(\frac{n}{2}, \frac{P}{7}\right)$, $S_{UM}(n, P) = 36 + S_{UM}\!\left(\frac{n}{2}, \frac{P}{7}\right)$ with base case $S_{UM}(n, 1) = W_{UM}(n, 1$
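
    As a reading aid (our own algebra, not text from the report), the bandwidth and latency recurrences above unroll over the log_7 P levels of the recursion, assuming P is a power of 7:

```latex
\begin{align*}
S_{UM}(n, P) &= 36 \log_7 P + S_{UM}\!\left(\frac{n}{2^{\log_7 P}},\, 1\right),\\
W_{UM}(n, P) &= \frac{36\, n^2}{4P}\sum_{k=0}^{\log_7 P - 1}\left(\frac{7}{4}\right)^{k}
              + W_{UM}\!\left(\frac{n}{2^{\log_7 P}},\, 1\right).
\end{align*}
```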

  11. Survival outcomes with concurrent chemoradiation for elderly patients with locally advanced head and neck cancer according to the National Cancer Data Base.

    PubMed

    Amini, Arya; Jones, Bernard L; McDermott, Jessica D; Serracino, Hilary S; Jimeno, Antonio; Raben, David; Ghosh, Debashis; Bowles, Daniel W; Karam, Sana D

    2016-05-15

    The overall survival (OS) benefit of concurrent chemoradiotherapy (CRT) for head and neck squamous cell carcinoma patients older than 70 years is debated. This study examines the outcomes of elderly patients receiving CRT versus radiotherapy (RT) alone. The National Cancer Data Base was queried for patients older than 70 years with nonmetastatic oropharyngeal, laryngeal, or hypopharyngeal cancer (T3-4 or N(+)). CRT was defined as chemotherapy started within 14 days of the initiation of RT. Univariate analysis, multivariate analysis (MVA), propensity score matching (PSM), and recursive partitioning analysis (RPA) were performed. The study included 4042 patients: 2538 (63%) received CRT. The median follow-up was 19 months. The unadjusted median OS was longer with the addition of CRT (P < .001). OS was superior with CRT in the MVA (hazard ratio [HR], 0.63; 95% confidence interval [CI], 0.58-0.68; P < .001) and PSM analyses (HR, 0.73; 95% CI, 0.66-0.80; P < .001) in comparison with RT alone. According to RPA, CRT was associated with longer OS for patients 81 years or younger with low comorbidity scores and either T1-2/N2-3 disease or T3-4/N0-3 disease. The survival benefit with CRT disappeared for 2 subgroups in the 71- to 81-year age range: those with T1-2, N1, and Charlson-Deyo 0-1 (CD0-1) disease and those with T3-4, N1+, and CD1+ disease. Patients who were older than 81 years did not have increased survival with CRT. The receipt of CRT was associated with a longer duration of RT (odds ratio, 1.74; 95% CI, 1.50-2.01; P < .001). Patients older than 70 years should not be denied concurrent chemotherapy solely on the basis of age; additional factors, including the performance status and the tumor stage, should be taken into account. Cancer 2016;122:1533-43. © 2016 American Cancer Society.

  12. Strain improvement of Sporolactobacillus inulinus ATCC 15538 for acid tolerance and production of D-lactic acid by genome shuffling.

    PubMed

    Zheng, Huijie; Gong, Jixian; Chen, Tao; Chen, Xun; Zhao, Xueming

    2010-02-01

    Improvement of the acid tolerance and D-lactic acid production of Sporolactobacillus inulinus ATCC 15538 was performed by using recursive protoplast fusion in a genome shuffling format. The starting population was generated by ultraviolet irradiation, diethyl sulfate mutagenesis, and pH-gradient filtering, and was then subjected to recursive protoplast fusion. The lysozyme concentration, treatment time, and temperature for enzyme treatment were optimized by response surface methodology based on a central composite design. Based on contour plots and variance analysis, the model predicted a maximum Y (the product of the protoplast formation ratio and the protoplast regeneration ratio) of 60.4%, obtained at 7.75 mg/ml lysozyme, 1.59 h, and 38 degrees C. A pH-5-resistant recombinant, F3-4, was obtained after three rounds of genome shuffling; its D-lactic acid production reached 93.4 g/l in a 5 L bioreactor, an increase of 39.8% and 119% over the UV-generated strain and the original strain S. inulinus ATCC 15538, respectively. Subculture experiments indicated that F3-4 was genetically stable.

  13. A combinatorial model for the Macdonald polynomials.

    PubMed

    Haglund, J

    2004-11-16

    We introduce a polynomial C(mu)[Z; q, t], depending on a set of variables Z = z(1), z(2),..., a partition mu, and two extra parameters q, t. The definition of C(mu) involves a pair of statistics (maj(sigma, mu), inv(sigma, mu)) on words sigma of positive integers, and the coefficients of the z(i) are manifestly in N[q,t]. We conjecture that C(mu)[Z; q, t] is none other than the modified Macdonald polynomial H(mu)[Z; q, t]. We further introduce a general family of polynomials F(T)[Z; q, S], where T is an arbitrary set of squares in the first quadrant of the xy plane, and S is an arbitrary subset of T. The coefficients of the F(T)[Z; q, S] are in N[q], and C(mu)[Z; q, t] is a sum of certain F(T)[Z; q, S] times nonnegative powers of t. We prove F(T)[Z; q, S] is symmetric in the z(i) and satisfies other properties consistent with the conjecture. We also show how the coefficient of a monomial in F(T)[Z; q, S] can be expressed recursively. maple calculations indicate the F(T)[Z; q, S] are Schur-positive, and we present a combinatorial conjecture for their Schur coefficients when the set T is a partition with at most three columns.

  14. Sparse Regression as a Sparse Eigenvalue Problem

    NASA Technical Reports Server (NTRS)

    Moghaddam, Baback; Gruber, Amit; Weiss, Yair; Avidan, Shai

    2008-01-01

    We extend the l0-norm "subspectral" algorithms for sparse-LDA [5] and sparse-PCA [6] to general quadratic costs such as MSE in linear (kernel) regression. The resulting "Sparse Least Squares" (SLS) problem is also NP-hard, by way of its equivalence to a rank-1 sparse eigenvalue problem (e.g., binary sparse-LDA [7]). Specifically, for a general quadratic cost we use a highly-efficient technique for direct eigenvalue computation using partitioned matrix inverses which leads to dramatic 10^3-fold speed-ups over standard eigenvalue decomposition. This increased efficiency mitigates the O(n^4) scaling behaviour that up to now has limited the previous algorithms' utility for high-dimensional learning problems. Moreover, the new computation prioritizes the role of the less-myopic backward elimination stage which becomes more efficient than forward selection. Similarly, branch-and-bound search for Exact Sparse Least Squares (ESLS) also benefits from partitioned matrix inverse techniques. Our Greedy Sparse Least Squares (GSLS) generalizes Natarajan's algorithm [9], also known as Order-Recursive Matching Pursuit (ORMP). Specifically, the forward half of GSLS is exactly equivalent to ORMP but more efficient. By including the backward pass, which only doubles the computation, we can achieve lower MSE than ORMP. Experimental comparisons to the state-of-the-art LARS algorithm [3] show forward-GSLS is faster, more accurate and more flexible in terms of choice of regularization.
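
    The forward half of such greedy selection can be written in a few lines of numpy; the sketch below is a plain forward pass with refitting on the selected support (no backward elimination and no partitioned-inverse updates, which are the paper's contributions), and the data are synthetic.

```python
import numpy as np

def greedy_sparse_ls(X, y, k):
    """Select k columns of X greedily, refitting least squares at each step
    (forward pass only; no backward elimination)."""
    selected, residual = [], y.copy()
    for _ in range(k):
        # score every unselected column by correlation with the residual
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf
        selected.append(int(np.argmax(scores)))
        # refit on the selected support and update the residual
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
    return selected, beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
true_beta = np.zeros(50)
true_beta[[3, 17, 41]] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.standard_normal(200)
support, coeffs = greedy_sparse_ls(X, y, 3)
print(sorted(support), coeffs)
```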

  15. Recursive Subsystems in Aphasia and Alzheimer's Disease: Case Studies in Syntax and Theory of Mind.

    PubMed

    Bánréti, Zoltán; Hoffmann, Ildikó; Vincze, Veronika

    2016-01-01

    The relationship between recursive sentence embedding and theory-of-mind (ToM) inference is investigated in three persons with Broca's aphasia, two persons with Wernicke's aphasia, and six persons with mild and moderate Alzheimer's disease (AD). We asked questions of four types about photographs of various real-life situations. Type 4 questions asked participants about intentions, thoughts, or utterances of the characters in the pictures ("What may X be thinking/asking Y to do?"). The expected answers typically involved subordinate clauses introduced by conjunctions or direct quotations of the characters' utterances. Broca's aphasics did not produce answers with recursive sentence embedding. Rather, they projected themselves into the characters' mental states and gave direct answers in the first person singular, with relevant ToM content. We call such replies "situative statements." Where the question concerned the mental state of the character but did not require an answer with sentence embedding ("What does X hate?"), aphasics gave descriptive answers rather than situative statements. Most replies given by persons with AD to Type 4 questions were grammatical instances of recursive sentence embedding. They also gave a few situative statements but the ToM content of these was irrelevant. In more than one third of their well-formed sentence embeddings, too, they conveyed irrelevant ToM contents. Persons with moderate AD were unable to pass secondary false belief tests. The results reveal double dissociation: Broca's aphasics are unable to access recursive sentence embedding but they can make appropriate ToM inferences; moderate AD persons make the wrong ToM inferences but they are able to access recursive sentence embedding. The double dissociation may be relevant for the nature of the relationship between the two recursive capacities. Broca's aphasics compensated for the lack of recursive sentence embedding by recursive ToM reasoning represented in very simple syntactic forms: they used one recursive subsystem to stand in for another recursive subsystem.

  16. Recursive Subsystems in Aphasia and Alzheimer's Disease: Case Studies in Syntax and Theory of Mind

    PubMed Central

    Bánréti, Zoltán; Hoffmann, Ildikó; Vincze, Veronika

    2016-01-01

    The relationship between recursive sentence embedding and theory-of-mind (ToM) inference is investigated in three persons with Broca's aphasia, two persons with Wernicke's aphasia, and six persons with mild and moderate Alzheimer's disease (AD). We asked questions of four types about photographs of various real-life situations. Type 4 questions asked participants about intentions, thoughts, or utterances of the characters in the pictures (“What may X be thinking/asking Y to do?”). The expected answers typically involved subordinate clauses introduced by conjunctions or direct quotations of the characters' utterances. Broca's aphasics did not produce answers with recursive sentence embedding. Rather, they projected themselves into the characters' mental states and gave direct answers in the first person singular, with relevant ToM content. We call such replies “situative statements.” Where the question concerned the mental state of the character but did not require an answer with sentence embedding (“What does X hate?”), aphasics gave descriptive answers rather than situative statements. Most replies given by persons with AD to Type 4 questions were grammatical instances of recursive sentence embedding. They also gave a few situative statements but the ToM content of these was irrelevant. In more than one third of their well-formed sentence embeddings, too, they conveyed irrelevant ToM contents. Persons with moderate AD were unable to pass secondary false belief tests. The results reveal double dissociation: Broca's aphasics are unable to access recursive sentence embedding but they can make appropriate ToM inferences; moderate AD persons make the wrong ToM inferences but they are able to access recursive sentence embedding. The double dissociation may be relevant for the nature of the relationship between the two recursive capacities. Broca's aphasics compensated for the lack of recursive sentence embedding by recursive ToM reasoning represented in very simple syntactic forms: they used one recursive subsystem to stand in for another recursive subsystem. PMID:27064887

  17. Ultra wide-band localization and SLAM: a comparative study for mobile robot navigation.

    PubMed

    Segura, Marcelo J; Auat Cheein, Fernando A; Toibero, Juan M; Mut, Vicente; Carelli, Ricardo

    2011-01-01

    In this work, a comparative study between an Ultra Wide-Band (UWB) localization system and a Simultaneous Localization and Mapping (SLAM) algorithm is presented. Due to its high bandwidth and short pulse lengths, UWB potentially allows great accuracy in range measurements based on Time of Arrival (TOA) estimation. SLAM algorithms recursively estimate the map of an environment and the pose (position and orientation) of a mobile robot within that environment. The comparative study presented here involves the performance analysis of implementing in parallel a UWB localization based system and a SLAM algorithm on a mobile robot navigating within an environment. Real-time results as well as error analysis are also shown in this work.
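
    A minimal linearized least-squares position fix from TOA ranges to known UWB anchors is sketched below in numpy; the anchor layout and noise level are illustrative assumptions, and the SLAM side of the comparison is not shown.

```python
import numpy as np

def toa_position_fix(anchors, ranges):
    """Linearized least-squares 2-D position estimate from TOA ranges.

    anchors : (N, 2) known anchor coordinates
    ranges  : (N,)   measured distances to each anchor
    """
    x0, y0 = anchors[0]
    r0 = ranges[0]
    A, b = [], []
    # subtract the first range equation to remove the quadratic unknowns
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append([2 * (x0 - xi), 2 * (y0 - yi)])
        b.append(ri**2 - r0**2 + x0**2 + y0**2 - xi**2 - yi**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
truth = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - truth, axis=1) + 0.05 * np.random.randn(4)
print(toa_position_fix(anchors, ranges))   # should be close to (3, 4)
```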

  18. Binary partition tree analysis based on region evolution and its application to tree simplification.

    PubMed

    Lu, Huihai; Woods, John C; Ghanbari, Mohammed

    2007-04-01

    Pyramid image representations via tree structures are recognized methods for region-based image analysis. Binary partition trees can be applied which document the merging process with small details found at the bottom levels and larger ones close to the root. Hindsight of the merging process is stored within the tree structure and provides the change histories of an image property from the leaf to the root node. In this work, the change histories are modelled by evolvement functions and their second order statistics are analyzed by using a knee function. Knee values show the reluctancy of each merge. We have systematically formulated these findings to provide a novel framework for binary partition tree analysis, where tree simplification is demonstrated. Based on an evolvement function, for each upward path in a tree, the tree node associated with the first reluctant merge is considered as a pruning candidate. The result is a simplified version providing a reduced solution space and still complying with the definition of a binary tree. The experiments show that image details are preserved whilst the number of nodes is dramatically reduced. An image filtering tool also results which preserves object boundaries and has applications for segmentation.

  19. Utilizing Microsoft[R] Office to Produce and Present Recursive Frame Analysis Findings

    ERIC Educational Resources Information Center

    Chenail, Ronald J.; Duffy, Maureen

    2011-01-01

    Although researchers conducting qualitative descriptive studies, ethnographies, phenomenologies, grounded theory, and narrative inquiries commonly use computer-assisted qualitative data analysis software (CAQDAS) to manage their projects and analyses, investigators conducting discursive methodologies such as discourse or conversation analysis seem…

  20. Inference and Analysis of Population Structure Using Genetic Data and Network Theory.

    PubMed

    Greenbaum, Gili; Templeton, Alan R; Bar-David, Shirli

    2016-04-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition's modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). Copyright © 2016 by the Genetics Society of America.
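
    The general workflow (similarity matrix, weighted graph, modularity-based community detection) can be sketched with networkx as below; the genetic similarity measure, edge thresholding, permutation tests and SA/SAD statistics implemented in NetStruct are not reproduced, and the data are synthetic.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(1)

# placeholder pairwise genetic-similarity matrix for 30 individuals,
# built with two artificial subpopulations for illustration
n = 30
labels = np.repeat([0, 1], n // 2)
sim = 0.2 * rng.random((n, n)) + 0.5 * (labels[:, None] == labels[None, :])
sim = (sim + sim.T) / 2
np.fill_diagonal(sim, 0.0)

# build a weighted graph and detect communities by modularity maximization
G = nx.from_numpy_array(sim)
communities = community.greedy_modularity_communities(G, weight="weight")
modularity = community.modularity(G, communities, weight="weight")
print("communities:", [sorted(c) for c in communities])
print("modularity: %.3f" % modularity)
```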

  1. Nodal domains of a non-separable problem—the right-angled isosceles triangle

    NASA Astrophysics Data System (ADS)

    Aronovitch, Amit; Band, Ram; Fajman, David; Gnutzmann, Sven

    2012-03-01

    We study the nodal set of eigenfunctions of the Laplace operator on the right-angled isosceles triangle. A local analysis of the nodal pattern provides an algorithm for computing the number νn of nodal domains for any eigenfunction. In addition, an exact recursive formula for the number of nodal domains is found to reproduce all existing data. Eventually, we use the recursion formula to analyse a large sequence of nodal counts statistically. Our analysis shows that the distribution of nodal counts for this triangular shape has a much richer structure than the known cases of regular separable shapes or completely irregular shapes. Furthermore, we demonstrate that the nodal count sequence contains information about the periodic orbits of the corresponding classical ray dynamics.
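
    As an independent numerical illustration (not the paper's recursive formula), nodal domains of one antisymmetric eigenfunction of the right-angled isosceles triangle with Dirichlet boundary can be counted by sign-based connected-component labelling on a grid:

```python
import numpy as np
from scipy.ndimage import label

def nodal_domain_count(m, n, grid=600):
    """Count nodal domains of psi_{m,n}(x,y) = sin(m*pi*x)*sin(n*pi*y)
    - sin(n*pi*x)*sin(m*pi*y) on the triangle 0 < y < x < 1 by labelling
    connected regions of constant sign on a fine grid."""
    x = np.linspace(0.0, 1.0, grid)
    X, Y = np.meshgrid(x, x, indexing="ij")
    psi = (np.sin(m * np.pi * X) * np.sin(n * np.pi * Y)
           - np.sin(n * np.pi * X) * np.sin(m * np.pi * Y))
    inside = Y < X                      # restrict to the triangular domain
    count = 0
    for sign in (psi > 0, psi < 0):
        _, k = label(sign & inside)     # connected components of each sign
        count += k
    return count

print(nodal_domain_count(4, 1))   # small example; resolution-limited estimate
```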

  2. MODFLOW-CDSS, a version of MODFLOW-2005 with modifications for Colorado Decision Support Systems

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    MODFLOW-CDSS is a three-dimensional, finite-difference groundwater-flow model based on MODFLOW-2005, with two modifications. The first modification is the introduction of a Partition Stress Boundaries capability, which enables the user to partition a selected subset of MODFLOW's stress-boundary packages, with each partition defined by a separate input file. Volumetric water-budget components of each partition are tracked and listed separately in the volumetric water-budget tables. The second modification enables the user to specify that execution of a simulation should continue despite failure of the solver to satisfy convergence criteria. This modification is particularly intended to be used in conjunction with automated model-analysis software; its use is not recommended for other purposes.

  3. Human motion planning based on recursive dynamics and optimal control techniques

    NASA Technical Reports Server (NTRS)

    Lo, Janzen; Huang, Gang; Metaxas, Dimitris

    2002-01-01

    This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.

  4. Drug Distribution. Part 1. Models to Predict Membrane Partitioning.

    PubMed

    Nagar, Swati; Korzekwa, Ken

    2017-03-01

    Tissue partitioning is an important component of drug distribution and half-life. Protein binding and lipid partitioning together determine drug distribution. Two structure-based models to predict partitioning into microsomal membranes are presented. An orientation-based model was developed using a membrane template and atom-based relative free energy functions to select drug conformations and orientations for neutral and basic drugs. The resulting model predicts the correct membrane positions for nine compounds tested, and predicts the membrane partitioning for n = 67 drugs with an average fold-error of 2.4. Next, a more facile descriptor-based model was developed for acids, neutrals and bases. This model considers the partitioning of neutral and ionized species at equilibrium, and can predict membrane partitioning with an average fold-error of 2.0 (n = 92 drugs). Together these models suggest that drug orientation is important for membrane partitioning and that membrane partitioning can be well predicted from physicochemical properties.
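
    The split between neutral and ionized species that the descriptor-based model relies on follows from the Henderson-Hasselbalch relation; the small Python calculation below is a generic illustration and does not reproduce the model's descriptors or coefficients.

```python
def fraction_ionized(pka, ph, kind):
    """Henderson-Hasselbalch ionized fraction for a monoprotic acid or base."""
    if kind == "acid":
        return 1.0 / (1.0 + 10 ** (pka - ph))
    if kind == "base":
        return 1.0 / (1.0 + 10 ** (ph - pka))
    raise ValueError("kind must be 'acid' or 'base'")

# e.g. a basic drug with pKa 9.4 at physiological pH 7.4 is ~99% ionized
print("ionized fraction: %.3f" % fraction_ionized(9.4, 7.4, "base"))
```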

  5. A tree-based statistical classification algorithm (CHAID) for identifying variables responsible for the occurrence of faecal indicator bacteria during waterworks operations

    NASA Astrophysics Data System (ADS)

    Bichler, Andrea; Neumaier, Arnold; Hofmann, Thilo

    2014-11-01

    Microbial contamination of groundwater used for drinking water can affect public health and is of major concern to local water authorities and water suppliers. Potential hazards need to be identified in order to protect raw water resources. We propose a non-parametric data mining technique for exploring the presence of total coliforms (TC) in a groundwater abstraction well and its relationship to readily available, continuous time series of hydrometric monitoring parameters (seven year records of precipitation, river water levels, and groundwater heads). The original monitoring parameters were used to create an extensive generic dataset of explanatory variables by considering different accumulation or averaging periods, as well as temporal offsets of the explanatory variables. A classification tree based on the Chi-Squared Automatic Interaction Detection (CHAID) recursive partitioning algorithm revealed statistically significant relationships between precipitation and the presence of TC in both a production well and a nearby monitoring well. Different secondary explanatory variables were identified for the two wells. Elevated water levels and short-term water table fluctuations in the nearby river were found to be associated with TC in the observation well. The presence of TC in the production well was found to relate to elevated groundwater heads and fluctuations in groundwater levels. The generic variables created proved useful for increasing significance levels. The tree-based model was used to predict the occurrence of TC on the basis of hydrometric variables.
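
    A minimal illustration of the CHAID splitting criterion is sketched below: each categorized predictor is cross-tabulated against the presence/absence outcome and the variable with the smallest chi-squared p-value is chosen for the first split. Category merging, Bonferroni adjustment and recursive growth of the full tree are omitted, and the data are synthetic placeholders.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# hypothetical weekly monitoring data: categorized hydrometric predictors
# and total-coliform presence (illustrative data, not from the study)
n = 300
df = pd.DataFrame({
    "precip_3d": rng.choice(["low", "medium", "high"], n),
    "river_level": rng.choice(["low", "high"], n),
    "gw_head": rng.choice(["low", "high"], n),
})
p_tc = 0.1 + 0.4 * (df["precip_3d"] == "high")
df["tc_present"] = rng.random(n) < p_tc

# CHAID-style first split: pick the predictor most associated with the outcome
best = None
for col in ["precip_3d", "river_level", "gw_head"]:
    table = pd.crosstab(df[col], df["tc_present"])
    chi2, p, dof, _ = chi2_contingency(table)
    print("%-12s chi2=%.1f  p=%.3g" % (col, chi2, p))
    if best is None or p < best[1]:
        best = (col, p)
print("split on:", best[0])
```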

  6. AlzhCPI: A knowledge base for predicting chemical-protein interactions towards Alzheimer's disease.

    PubMed

    Fang, Jiansong; Wang, Ling; Li, Yecheng; Lian, Wenwen; Pang, Xiaocong; Wang, Hong; Yuan, Dongsheng; Wang, Qi; Liu, Ai-Lin; Du, Guan-Hua

    2017-01-01

    Alzheimer's disease (AD) is a complicated progressive neurodegeneration disorder. To confront AD, scientists are searching for multi-target-directed ligands (MTDLs) to delay disease progression. The in silico prediction of chemical-protein interactions (CPI) can accelerate target identification and drug discovery. Previously, we developed 100 binary classifiers to predict the CPI for 25 key targets against AD using the multi-target quantitative structure-activity relationship (mt-QSAR) method. In this investigation, we aimed to apply the mt-QSAR method to enlarge the model library to predict CPI towards AD. Another 104 binary classifiers were further constructed to predict the CPI for 26 preclinical AD targets based on the naive Bayesian (NB) and recursive partitioning (RP) algorithms. The internal 5-fold cross-validation and external test set validation were applied to evaluate the performance of the training sets and test set, respectively. The area under the receiver operating characteristic curve (ROC) for the test sets ranged from 0.629 to 1.0, with an average of 0.903. In addition, we developed a web server named AlzhCPI to integrate the comprehensive information of the 204 binary classifiers, which has potential applications in network pharmacology and drug repositioning. AlzhCPI is available online at http://rcidm.org/AlzhCPI/index.html. To illustrate the applicability of AlzhCPI, the developed system was employed for the systems pharmacology-based investigation of shichangpu against AD to enhance the understanding of the mechanisms of action of shichangpu from a holistic perspective.
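
    A generic scikit-learn sketch of the two model families (naive Bayes and a recursive-partitioning decision tree) evaluated by 5-fold cross-validated ROC AUC is given below; the fingerprint-like features are synthetic placeholders, not the descriptors or bioactivity data behind AlzhCPI.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier

# placeholder binary "fingerprint" features for actives/inactives of one target
X, y = make_classification(n_samples=400, n_features=256, n_informative=20,
                           random_state=0)
X = (X > 0).astype(int)          # binarize to mimic structural fingerprints

for name, clf in [("naive Bayes", BernoulliNB()),
                  ("recursive partitioning", DecisionTreeClassifier(max_depth=5,
                                                                    random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print("%s: mean 5-fold ROC AUC = %.2f" % (name, auc.mean()))
```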

  7. A novel tree-based procedure for deciphering the genomic spectrum of clinical disease entities.

    PubMed

    Mbogning, Cyprien; Perdry, Hervé; Toussile, Wilson; Broët, Philippe

    2014-01-01

    Dissecting the genomic spectrum of clinical disease entities is a challenging task. Recursive partitioning (or classification tree) methods provide powerful tools for exploring complex interplay among genomic factors, with respect to a main factor, that can reveal hidden genomic patterns. To take confounding variables into account, the partially linear tree-based regression (PLTR) model has recently been published. It combines regression models and tree-based methodology. It is, however, computationally burdensome and not well suited to situations in which a large number of exploratory variables is expected. We developed a novel procedure that represents an alternative to the original PLTR procedure, and considered different selection criteria. A simulation study with different scenarios was performed to compare the performance of the proposed procedure with that of the original PLTR strategy. The proposed procedure with a Bayesian Information Criterion (BIC) achieved good performance in detecting the hidden structure compared with the original procedure. The novel procedure was used to analyze patterns of copy-number alterations in lung adenocarcinomas, with respect to Kirsten Rat Sarcoma Viral Oncogene Homolog gene (KRAS) mutation status, while controlling for a cohort effect. Results highlight two subgroups of pure or nearly pure wild-type KRAS tumors with particular copy-number alteration patterns. The proposed procedure with a BIC criterion represents a powerful and practical alternative to the original procedure. Our procedure performs well in a general framework and is simple to implement.

  8. Predicting recovery criteria for threatened and endangered plant species on the basis of past abundances and biological traits.

    PubMed

    Neel, Maile C; Che-Castaldo, Judy P

    2013-04-01

    Recovery plans for species listed under the U.S. Endangered Species Act are required to specify measurable criteria that can be used to determine when the species can be delisted. For the 642 listed endangered and threatened plant species that have recovery plans, we applied recursive partitioning methods to test whether the number of individuals or populations required for delisting can be predicted on the basis of distributional and biological traits, previous abundance at multiple time steps, or a combination of traits and previous abundances. We also tested listing status (threatened or endangered) and the year the recovery plan was written as predictors of recovery criteria. We analyzed separately recovery criteria that were stated as number of populations and as number of individuals (population-based and individual-based criteria, respectively). Previous abundances alone were relatively good predictors of population-based recovery criteria. Fewer populations, but a greater proportion of historically known populations, were required to delist species that had few populations at listing compared with species that had more populations at listing. Previous abundances were also good predictors of individual-based delisting criteria when models included both abundances and traits. The physiographic division in which the species occur was also a good predictor of individual-based criteria. Our results suggest managers are relying on previous abundances and patterns of decline as guidelines for setting recovery criteria. This may be justifiable in that previous abundances inform managers of the effects of both intrinsic traits and extrinsic threats that interact and determine extinction risk. © 2013 Society for Conservation Biology.

  9. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.

  10. RECURSIVE PROTEIN MODELING: A DIVIDE AND CONQUER STRATEGY FOR PROTEIN STRUCTURE PREDICTION AND ITS CASE STUDY IN CASP9

    PubMed Central

    CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN

    2013-01-01

    After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques — template-based modeling and template-free modeling — have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e. certain) and template-free (i.e. uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379

  11. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  12. Analysis of hybrid subcarrier multiplexing of OCDMA based on single photodiode detection

    NASA Astrophysics Data System (ADS)

    Ahmad, N. A. A.; Junita, M. N.; Aljunid, S. A.; Rashidi, C. B. M.; Endut, R.

    2017-11-01

    This paper analyzes the performance of subcarrier multiplexing (SCM) of spectral amplitude coding optical code division multiple access (SAC-OCDMA) by applying a Recursive Combinatorial (RC) code based on single photodiode detection (SPD). SPD is used in the receiver part to reduce the effect of multiple access interference (MAI), which is the dominant noise in incoherent SAC-OCDMA systems. Results indicate that the SCM OCDMA network performance could be improved by using lower data rates and higher code weights. The total number of users can also be increased by using lower data rates and a higher number of subcarriers.

  13. Prognostic factors affecting survival after whole brain radiotherapy in patients with brain metastasized lung cancer.

    PubMed

    Tsakonas, Georgios; Hellman, Fatou; Gubanski, Michael; Friesland, Signe; Tendler, Salomon; Lewensohn, Rolf; Ekman, Simon; de Petris, Luigi

    2018-02-01

    Whole-brain radiotherapy (WBRT) has been the standard of care for multiple NSCLC brain metastases, but due to its toxicity and lack of survival benefit, its use in the palliative setting is being questioned. This was a single-institution cohort study including brain-metastasized lung cancer patients who received WBRT at Karolinska University Hospital. Information about Recursive Partitioning Analysis (RPA) and Graded Prognostic Assessment (GPA) scores, demographics, histopathological results and received oncological therapy was collected. Predictors of overall survival (OS) from the time of received WBRT were identified by Cox regression analyses. OS between GPA and RPA classes was compared by pairwise log-rank test. A subgroup OS analysis was performed stratified by RPA class. The cohort consisted of 280 patients. RPA classes 1 and 2 had better OS compared to class 3, patients with GPA ≥1.5 points had better OS compared to GPA <1.5 points, and age >70 years was associated with worse OS (p < .0001 for all comparisons). In the RPA class 2 subgroup analysis, GPA ≥1.5 points, age ≤70 years and CNS surgery before salvage WBRT were independent positive prognostic factors. RPA class 3 patients should not receive WBRT, whereas RPA class 1 patients should receive WBRT if clinically indicated. RPA class 2 patients with age ≤70 years and GPA ≥1.5 points should be treated as RPA 1. WBRT should be omitted in RPA 2 patients with age >70. In RPA 2 patients with age ≤70 years and GPA <1.5 points, WBRT could be a reasonable option.
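
    To make that survival workflow concrete, a minimal sketch is given below using synthetic data and illustrative column names (lifelines is one common Python choice for Cox regression and log-rank tests; it is not necessarily what the authors used):

      # Sketch: Cox regression for predictors of OS and a pairwise log-rank comparison
      # between two RPA classes. All data below are synthetic.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(0)
      n = 280
      df = pd.DataFrame({
          "rpa_class": rng.integers(1, 4, n),        # 1 (best) to 3 (worst)
          "gpa_ge_1_5": rng.integers(0, 2, n),       # GPA >= 1.5 points
          "age_gt_70": rng.integers(0, 2, n),        # age > 70 years
      })
      # Toy survival times: worse RPA class and older age shorten survival.
      hazard = 0.05 * df.rpa_class * (1 + 0.5 * df.age_gt_70) / (1 + 0.5 * df.gpa_ge_1_5)
      df["os_months"] = rng.exponential(1.0 / hazard)
      df["event"] = (rng.random(n) < 0.7).astype(int)  # roughly 30% censored

      cph = CoxPHFitter()
      cph.fit(df, duration_col="os_months", event_col="event")
      cph.print_summary()

      rpa1, rpa3 = df[df.rpa_class == 1], df[df.rpa_class == 3]
      print(logrank_test(rpa1.os_months, rpa3.os_months,
                         event_observed_A=rpa1.event, event_observed_B=rpa3.event).p_value)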

  14. [Clinical analysis of 23 gynecologic carcinoma patients with brain metastasis].

    PubMed

    Zhang, J X; Wang, S Z; Li, B; Zhang, Z Y

    2016-06-21

    To explore the clinicopathological characteristics and treatments of brain metastasis (BM) in patients with gynecologic carcinoma. Twenty-three pathologically confirmed patients with gynecologic carcinoma who had brain metastasis between February 2008 and October 2012 were analyzed retrospectively. The primary carcinoma was cervical cancer in 5 patients, endometrial carcinoma in 8 patients and ovarian cancer in 10 patients, which accounted for 1.81% (5/276), 2.10% (8/380) and 2.67% (10/374) of patients with the same diagnosis in the same period, respectively. Among them, 91.3% (21/23) of patients had heterochronous BM. Single BM was documented in 52.2% (12/23) of patients. In addition, 78.2% (18/23) of BM were located in the cerebrum. At the time of BM, 91.3% (21/23) of patients had symptoms of the central nervous system, among which headache was the most common (90.4%). Altogether, thirteen patients had extracranial metastasis, of whom 9 had metastasis to the lung. The median post-brain-metastasis survival (mPBMS) for recursive partitioning analysis (RPA) classes I-III was 54 months, 9 months and 1 month, respectively (P<0.01). None of surgery, radiotherapy or chemotherapy was proven to improve prognosis in either univariate or multivariate analysis. However, in patients with extracranial metastasis, chemotherapy could significantly improve mPBMS (P<0.05). The incidence of brain metastasis in patients with cervical cancer, endometrial carcinoma, and ovarian cancer increased gradually. RPA was valuable for prognostic assessment in gynecologic carcinoma patients with BM. Chemotherapy could significantly improve the prognosis of gynecologic carcinoma patients with BM if extracranial metastasis was present.

  15. Recursive mass matrix factorization and inversion: An operator approach to open- and closed-chain multibody dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Kreutz, K.

    1988-01-01

    This report advances a linear operator approach for analyzing the dynamics of systems of joint-connected rigid bodies. It is established that the mass matrix M for such a system can be factored as M = (I + H phi L) D (I + H phi L)^T. This yields an immediate inversion M^-1 = (I - H psi L)^T D^-1 (I - H psi L), where H and phi are given by known link geometric parameters, and L, psi and D are obtained recursively by a spatial discrete-step Kalman filter and by the corresponding Riccati equation associated with this filter. The factors (I + H phi L) and (I - H psi L) are lower triangular matrices which are inverses of each other, and D is a diagonal matrix. This factorization and inversion of the mass matrix leads to recursive algorithms for forward dynamics based on spatially recursive filtering and smoothing. The primary motivation for advancing the operator approach is to provide a better means to formulate, analyze and understand spatial recursions in multibody dynamics. This is achieved because the linear operator notation allows manipulation of the equations of motion using a very high-level analytical framework (a spatial operator algebra) that is easy to understand and use. Detailed lower-level recursive algorithms can readily be obtained for inspection from the expressions involving spatial operators. The report consists of two main sections. In Part 1, the problem of serial chain manipulators is analyzed and solved. Extensions to a closed-chain system formed by multiple manipulators moving a common task object are contained in Part 2. To retain ease of exposition in the report, only these two types of multibody systems are considered. However, the same methods can be easily applied to arbitrary multibody systems formed by a collection of joint-connected rigid bodies.

  16. The Association Between Neighborhood Characteristics and Body Size and Physical Activity in the California Teachers Study Cohort

    PubMed Central

    Hurley, Susan; Goldberg, Debbie; Nelson, David O.; Reynolds, Peggy; Bernstein, Leslie; Horn-Ross, Pam L.; Gomez, Scarlett L.

    2012-01-01

    Objectives. We considered interactions between physical activity and body mass index (BMI) and neighborhood factors. Methods. We used recursive partitioning to identify predictors of low recreational physical activity (< 2.5 hours/week) and overweight and obesity (BMI ≥ 25.0 kg/m2) among 118 315 women in the California Teachers Study. Neighborhood characteristics were based on 2000 US Census data and Reference US business listings. Results. Low physical activity and being overweight or obese were associated with individual sociodemographic characteristics, including race/ethnicity and age. Among White women aged 36 to 75 years, living in neighborhoods with more household crowding was associated with a higher probability of low physical activity (54% vs 45% to 51%). In less crowded neighborhoods where more people worked outside the home, the existence of fewer neighborhood amenities was associated with a higher probability of low physical activity (51% vs 46%). Among non–African American middle-aged women, living in neighborhoods with a lower socioeconomic status was associated with a higher probability of being overweight or obese (46% to 59% vs 38% in high–socioeconomic status neighborhoods). Conclusions. Associations between physical activity, overweight and obesity, and the built environment varied by sociodemographic characteristics in this educated population. PMID:21852626

  17. Predicting human liver microsomal stability with machine learning techniques.

    PubMed

    Sakiyama, Yojiro; Yuki, Hitomi; Moriya, Takashi; Hattori, Kazunari; Suzuki, Misaki; Shimada, Kaoru; Honma, Teruki

    2008-02-01

    To ensure a continuing pipeline in pharmaceutical research, lead candidates must possess appropriate metabolic stability in the drug discovery process. In vitro ADMET (absorption, distribution, metabolism, elimination, and toxicity) screening provides us with useful information regarding the metabolic stability of compounds. However, before the synthesis stage, an efficient process is required in order to deal with the vast quantity of data from large compound libraries and high-throughput screening. Here we have derived a relationship between the chemical structure and its metabolic stability for a data set of in-house compounds by means of various in silico machine learning methods such as random forest, support vector machine (SVM), logistic regression, and recursive partitioning. For model building, 1952 proprietary compounds comprising two classes (stable/unstable) were used with 193 descriptors calculated by Molecular Operating Environment. The results using test compounds have demonstrated that all classifiers yielded satisfactory results (accuracy > 0.8, sensitivity > 0.9, specificity > 0.6, and precision > 0.8). Above all, classification by random forest as well as SVM yielded kappa values of approximately 0.7 in an independent validation set, slightly higher than other classification tools. These results suggest that nonlinear/ensemble-based classification methods might prove useful in the area of in silico ADME modeling.
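
    As a rough illustration of this stable/unstable classification workflow (the in-house compounds and MOE descriptors are proprietary, so randomly generated features stand in for them here), a scikit-learn sketch might look like:

      # Sketch: binary stable/unstable classifiers evaluated with the metrics quoted above.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

      # Synthetic stand-in for 1952 compounds with 193 descriptors.
      X, y = make_classification(n_samples=1952, n_features=193, n_informative=40, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

      for name, clf in [("random forest", RandomForestClassifier(n_estimators=500, random_state=0)),
                        ("SVM", SVC(kernel="rbf", C=1.0, gamma="scale"))]:
          clf.fit(X_tr, y_tr)
          pred = clf.predict(X_te)
          tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
          print(name,
                "accuracy=%.2f" % accuracy_score(y_te, pred),
                "sensitivity=%.2f" % (tp / (tp + fn)),
                "specificity=%.2f" % (tn / (tn + fp)),
                "kappa=%.2f" % cohen_kappa_score(y_te, pred))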

  18. Ultra-precise tracking control of piezoelectric actuators via a fuzzy hysteresis model.

    PubMed

    Li, Pengzhi; Yan, Feng; Ge, Chuan; Zhang, Mingchao

    2012-08-01

    In this paper, a novel Takagi-Sugeno (T-S) fuzzy system based model is proposed for hysteresis in piezoelectric actuators. The antecedent and consequent structures of the fuzzy hysteresis model (FHM) can be identified on-line through a uniform partition approach and a recursive least squares (RLS) algorithm, respectively. With respect to controller design, the inverse of the FHM is used to develop a feedforward controller that cancels out the hysteresis effect. A hybrid controller is then designed for high-performance tracking; it combines the feedforward controller with a proportional-integral-derivative (PID) controller favourable for stabilization and disturbance compensation. To achieve nanometer-scale tracking precision, an enhanced adaptive hybrid controller is further developed. It uses real-time input and output data to update the FHM, thus changing the feedforward controller to suit the on-site hysteresis character of the piezoelectric actuator. Finally, for three tracking cases (50 Hz sinusoidal, multi-frequency sinusoidal and 50 Hz triangular trajectories), experimental results demonstrate the efficiency of the proposed controllers. In particular, the maximum error of 50 Hz sinusoidal tracking is reduced to 5.8 nm, only 0.35% of the maximum desired displacement, which clearly shows the ultra-precise nanometer-scale tracking performance of the developed adaptive hybrid controller.
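
    The hybrid control structure itself is generic. A minimal sketch is shown below; inverse_model() is a hypothetical placeholder for the identified inverse fuzzy hysteresis model, and the gains are illustrative rather than values from the paper:

      # Sketch of a feedforward-plus-PID hybrid tracking controller.
      def make_pid(kp, ki, kd, dt):
          state = {"integral": 0.0, "prev_err": 0.0}
          def pid(err):
              state["integral"] += err * dt
              deriv = (err - state["prev_err"]) / dt
              state["prev_err"] = err
              return kp * err + ki * state["integral"] + kd * deriv
          return pid

      def inverse_model(desired_displacement):
          # Placeholder: a real implementation would invert the identified FHM here.
          return 0.1 * desired_displacement   # assumed linear gain for illustration only

      def hybrid_control(desired, measured, pid):
          u_ff = inverse_model(desired)        # feedforward cancels (most of) the hysteresis
          u_fb = pid(desired - measured)       # PID handles residual error and disturbances
          return u_ff + u_fb

      pid = make_pid(kp=2.0, ki=50.0, kd=0.0, dt=1e-4)
      print(hybrid_control(desired=1.0, measured=0.92, pid=pid))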

  19. On-Line, Gyro-Based, Mass-Property Identification for Thruster-Controlled Spacecraft Using Recursive Least Squares

    NASA Technical Reports Server (NTRS)

    Wilson, Edward; Lages, Chris; Mah, Robert; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Spacecraft control, state estimation, and fault-detection-and-isolation systems are affected by unknown variations in the vehicle mass properties. It is often difficult to accurately measure inertia terms on the ground, and mass properties can change on-orbit as fuel is expended, the configuration changes, or payloads are added or removed. Recursive least squares-based algorithms that use gyro signals to identify the center of mass and inverse inertia matrix are presented. They are applied in simulation to three thruster-controlled vehicles: the X-38 and Mini-AERCam under development at NASA-JSC, and the SAM, an air-bearing spacecraft simulator at the NASA-Ames Smart Systems Research Lab (SSRL).
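
    The underlying estimator is the textbook recursive least squares recursion; the sketch below is generic (not the flight code) and uses a toy two-parameter problem rather than the gyro-based inertia model:

      # Generic recursive least squares update.
      # theta: parameter estimate, P: covariance, phi: regressor, y: measurement.
      import numpy as np

      def rls_update(theta, P, phi, y, lam=1.0):
          phi = phi.reshape(-1, 1)
          K = P @ phi / (lam + phi.T @ P @ phi)          # gain vector
          theta = theta + (K * (y - phi.T @ theta)).ravel()
          P = (P - K @ phi.T @ P) / lam                  # covariance update
          return theta, P

      # Toy usage: recover parameters of y = a*x1 + b*x2 from noisy samples.
      rng = np.random.default_rng(0)
      true_theta = np.array([2.0, -1.0])
      theta, P = np.zeros(2), np.eye(2) * 1e3
      for _ in range(200):
          phi = rng.normal(size=2)
          y = phi @ true_theta + 0.01 * rng.normal()
          theta, P = rls_update(theta, P, phi, y)
      print(theta)   # approaches [2.0, -1.0]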

  20. Recursion to food plants by free-ranging Bornean elephant

    PubMed Central

    Gillespie, Graeme; Goossens, Benoit; Ismail, Sulaiman; Ancrenaz, Marc; Linklater, Wayne

    2015-01-01

    Plant recovery rates after herbivory are thought to be a key factor driving recursion by herbivores to sites and plants to optimise resource-use but have not been investigated as an explanation for recursion in large herbivores. We investigated the relationship between plant recovery and recursion by elephants (Elephas maximus borneensis) in the Lower Kinabatangan Wildlife Sanctuary, Sabah. We identified 182 recently eaten food plants, from 30 species, along 14 × 50 m transects and measured their recovery growth each month over nine months or until they were re-browsed by elephants. The monthly growth in leaf and branch or shoot length for each plant was used to calculate the time required (months) for each species to recover to its pre-eaten length. Elephant returned to all but two transects with 10 eaten plants, a further 26 plants died leaving 146 plants that could be re-eaten. Recursion occurred to 58% of all plants and 12 of the 30 species. Seventy-seven percent of the re-eaten plants were grasses. Recovery times to all plants varied from two to twenty months depending on the species. Recursion to all grasses coincided with plant recovery whereas recursion to most browsed plants occurred four to twelve months before they had recovered to their previous length. The small sample size of many browsed plants that received recursion and uneven plant species distribution across transects limits our ability to generalise for most browsed species but a prominent pattern in plant-scale recursion did emerge. Plant recovery time was a good predictor of time to recursion but varied as a function of growth form (grass, ginger, palm, liana and woody) and differences between sites. Time to plant recursion coincided with plant recovery time for the elephant’s preferred food, grasses, and perhaps also gingers, but not the other browsed species. Elephants are bulk feeders so it is likely that they time their returns to bulk feed on these grass species when quantities have recovered sufficiently to meet their intake requirements. The implications for habitat and elephant management are discussed. PMID:26290779

  1. Recursion to food plants by free-ranging Bornean elephant.

    PubMed

    English, Megan; Gillespie, Graeme; Goossens, Benoit; Ismail, Sulaiman; Ancrenaz, Marc; Linklater, Wayne

    2015-01-01

    Plant recovery rates after herbivory are thought to be a key factor driving recursion by herbivores to sites and plants to optimise resource-use but have not been investigated as an explanation for recursion in large herbivores. We investigated the relationship between plant recovery and recursion by elephants (Elephas maximus borneensis) in the Lower Kinabatangan Wildlife Sanctuary, Sabah. We identified 182 recently eaten food plants, from 30 species, along 14 × 50 m transects and measured their recovery growth each month over nine months or until they were re-browsed by elephants. The monthly growth in leaf and branch or shoot length for each plant was used to calculate the time required (months) for each species to recover to its pre-eaten length. Elephant returned to all but two transects with 10 eaten plants, a further 26 plants died leaving 146 plants that could be re-eaten. Recursion occurred to 58% of all plants and 12 of the 30 species. Seventy-seven percent of the re-eaten plants were grasses. Recovery times to all plants varied from two to twenty months depending on the species. Recursion to all grasses coincided with plant recovery whereas recursion to most browsed plants occurred four to twelve months before they had recovered to their previous length. The small sample size of many browsed plants that received recursion and uneven plant species distribution across transects limits our ability to generalise for most browsed species but a prominent pattern in plant-scale recursion did emerge. Plant recovery time was a good predictor of time to recursion but varied as a function of growth form (grass, ginger, palm, liana and woody) and differences between sites. Time to plant recursion coincided with plant recovery time for the elephant's preferred food, grasses, and perhaps also gingers, but not the other browsed species. Elephants are bulk feeders so it is likely that they time their returns to bulk feed on these grass species when quantities have recovered sufficiently to meet their intake requirements. The implications for habitat and elephant management are discussed.

  2. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review.

    PubMed

    Viglizzo, E F; Jobbágy, E G; Ricard, M F; Paruelo, J M

    2016-08-15

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at the broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case with afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition formats. The probable implications of future land-use and climate change on some key ES production and partition are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Distinctive signatures of recursion.

    PubMed

    Martins, Maurício Dias

    2012-07-19

    Although recursion has been hypothesized to be a necessary capacity for the evolution of language, the multiplicity of definitions being used has undermined the broader interpretation of empirical results. I propose that only a definition focused on representational abilities allows the prediction of specific behavioural traits that enable us to distinguish recursion from non-recursive iteration and from hierarchical embedding: only subjects able to represent recursion, i.e. to represent different hierarchical dependencies (related by parenthood) with the same set of rules, are able to generalize and produce new levels of embedding beyond those specified a priori (in the algorithm or in the input). The ability to use such representations may be advantageous in several domains: action sequencing, problem-solving, spatial navigation, social navigation and for the emergence of conventionalized communication systems. The ability to represent contiguous hierarchical levels with the same rules may lead subjects to expect unknown levels and constituents to behave similarly, and this prior knowledge may bias learning positively. Finally, a new paradigm to test for recursion is presented. Preliminary results suggest that the ability to represent recursion in the spatial domain recruits both visual and verbal resources. Implications regarding language evolution are discussed.

  4. The potential of cloud point system as a novel two-phase partitioning system for biotransformation.

    PubMed

    Wang, Zhilong

    2007-05-01

    Although extractive biotransformation in two-phase partitioning systems (such as the water-organic solvent two-phase system, the aqueous two-phase system, the reverse micelle system, and the room-temperature ionic liquid) has been studied extensively, this has not yet resulted in widespread industrial application. Based on a discussion of the main obstacles, the exploitation of the cloud point system, which has already been applied in separations as cloud point extraction, as a novel two-phase partitioning system for biotransformation is reviewed through the analysis of some topical examples. At the end of the review, process control and downstream processing in the application of this novel two-phase partitioning system for biotransformation are also briefly discussed.

  5. The Recursive Paradigm: Suppose We Already Knew.

    ERIC Educational Resources Information Center

    Maurer, Stephen B.

    1995-01-01

    Explains the recursive model in discrete mathematics through five examples and problems. Discusses the relationship between the recursive model, mathematical induction, and inductive reasoning and the relevance of these concepts in the school curriculum. Provides ideas for approaching this material with students. (Author/DDD)

  6. Recursive approach to the moment-based phase unwrapping method.

    PubMed

    Langley, Jason A; Brice, Robert G; Zhao, Qun

    2010-06-01

    The moment-based phase unwrapping algorithm approximates the phase map as a product of Gegenbauer polynomials, but the weight function for the Gegenbauer polynomials generates artificial singularities along the edge of the phase map. A method is presented to remove the singularities inherent to the moment-based phase unwrapping algorithm by approximating the phase map as a product of two one-dimensional Legendre polynomials and applying a recursive property of derivatives of Legendre polynomials. The proposed phase unwrapping algorithm is tested on simulated and experimental data sets. The results are then compared to those of PRELUDE 2D, a widely used phase unwrapping algorithm, and a Chebyshev-polynomial-based phase unwrapping algorithm. It was found that the proposed phase unwrapping algorithm provides results that are comparable to those obtained by using PRELUDE 2D and the Chebyshev phase unwrapping algorithm.
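
    For reference, the recursive property of Legendre-polynomial derivatives exploited in such schemes is the standard identity P'_{n+1}(x) = P'_{n-1}(x) + (2n+1) P_n(x); a short sketch (not the authors' code) is:

      # Evaluate Legendre polynomials P_n and their derivatives P'_n by recursion,
      # illustrating the derivative identity used in the abstract above.
      import numpy as np

      def legendre_with_derivatives(x, nmax):
          x = np.asarray(x, dtype=float)
          P = [np.ones_like(x), x]                  # P_0, P_1 (Bonnet recursion below)
          dP = [np.zeros_like(x), np.ones_like(x)]  # P'_0, P'_1
          for n in range(1, nmax):
              P.append(((2 * n + 1) * x * P[n] - n * P[n - 1]) / (n + 1))
              dP.append(dP[n - 1] + (2 * n + 1) * P[n])   # P'_{n+1} = P'_{n-1} + (2n+1) P_n
          return P, dP

      x = np.linspace(-1, 1, 5)
      P, dP = legendre_with_derivatives(x, nmax=4)
      # Cross-check P'_4 against numpy's Legendre implementation.
      print(np.allclose(dP[4], np.polynomial.legendre.Legendre.basis(4).deriv()(x)))  # True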

  7. A Machine Learning Approach to Identifying Placebo Responders in Late-Life Depression Trials.

    PubMed

    Zilcha-Mano, Sigal; Roose, Steven P; Brown, Patrick J; Rutherford, Bret R

    2018-01-11

    Despite efforts to identify characteristics associated with medication-placebo differences in antidepressant trials, few consistent findings have emerged to guide participant selection in drug development settings and differential therapeutics in clinical practice. Limitations in the methodologies used, particularly searching for a single moderator while treating all other variables as noise, may partially explain the failure to generate consistent results. The present study tested whether interactions between pretreatment patient characteristics, rather than a single-variable solution, may better predict who is most likely to benefit from placebo versus medication. Data were analyzed from 174 patients aged 75 years and older with unipolar depression who were randomly assigned to citalopram or placebo. Model-based recursive partitioning analysis was conducted to identify the most robust significant moderators of placebo versus citalopram response. The greatest signal detection between medication and placebo in favor of medication was among patients with fewer years of education (≤12) who suffered from a longer duration of depression since their first episode (>3.47 years) (B = 2.53, t(32) = 3.01, p = 0.004). Compared with medication, placebo had the greatest response for those who were more educated (>12 years), to the point where placebo almost outperformed medication (B = -0.57, t(96) = -1.90, p = 0.06). Machine learning approaches capable of evaluating the contributions of multiple predictor variables may be a promising methodology for identifying placebo versus medication responders. Duration of depression and education should be considered in the efforts to modulate placebo magnitude in drug development settings and in clinical practice. Copyright © 2018 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
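
    Model-based recursive partitioning is usually run with specialized tools (for example the R partykit package). As a rough CART-style stand-in, not the MOB algorithm used in the study, and with synthetic data and illustrative variable names, the subgrouping idea can be sketched as:

      # Partition on baseline covariates, then compare the arm difference within each leaf.
      import numpy as np
      import pandas as pd
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      n = 174
      df = pd.DataFrame({
          "education_years": rng.integers(6, 20, n),
          "depression_duration": rng.exponential(5.0, n),
          "arm": rng.integers(0, 2, n),            # 0 = placebo, 1 = citalopram
      })
      # Toy outcome: drug benefit concentrated in less-educated, longer-duration patients.
      benefit = (df.education_years <= 12) & (df.depression_duration > 3.47)
      df["improvement"] = 5 + 3 * df.arm * benefit + rng.normal(0, 2, n)

      tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20, random_state=0)
      tree.fit(df[["education_years", "depression_duration"]], df["improvement"])
      df["leaf"] = tree.apply(df[["education_years", "depression_duration"]])
      print(df.groupby(["leaf", "arm"])["improvement"].mean().unstack())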

  8. External validity of two nomograms for predicting distant brain failure after radiosurgery for brain metastases in a bi-institutional independent patient cohort.

    PubMed

    Prabhu, Roshan S; Press, Robert H; Boselli, Danielle M; Miller, Katherine R; Lankford, Scott P; McCammon, Robert J; Moeller, Benjamin J; Heinzerling, John H; Fasola, Carolina E; Patel, Kirtesh R; Asher, Anthony L; Sumrall, Ashley L; Curran, Walter J; Shu, Hui-Kuo G; Burri, Stuart H

    2018-03-01

    Patients treated with stereotactic radiosurgery (SRS) for brain metastases (BM) are at increased risk of distant brain failure (DBF). Two nomograms have been recently published to predict individualized risk of DBF after SRS. The goal of this study was to assess the external validity of these nomograms in an independent patient cohort. The records of consecutive patients with BM treated with SRS at Levine Cancer Institute and Emory University between 2005 and 2013 were reviewed. Three validation cohorts were generated based on the specific nomogram or recursive partitioning analysis (RPA) entry criteria: Wake Forest nomogram (n = 281), Canadian nomogram (n = 282), and Canadian RPA (n = 303) validation cohorts. Freedom from DBF at 1-year in the Wake Forest study was 30% compared with 50% in the validation cohort. The validation c-index for both the 6-month and 9-month freedom from DBF Wake Forest nomograms was 0.55, indicating poor discrimination ability, and the goodness-of-fit test for both nomograms was highly significant (p < 0.001), indicating poor calibration. The 1-year actuarial DBF in the Canadian nomogram study was 43.9% compared with 50.9% in the validation cohort. The validation c-index for the Canadian 1-year DBF nomogram was 0.56, and the goodness-of-fit test was also highly significant (p < 0.001). The validation accuracy and c-index of the Canadian RPA classification was 53% and 0.61, respectively. The Wake Forest and Canadian nomograms for predicting risk of DBF after SRS were found to have limited predictive ability in an independent bi-institutional validation cohort. These results reinforce the importance of validating predictive models in independent patient cohorts.

  9. Red cell distribution width in anemic patients undergoing transcatheter aortic valve implantation

    PubMed Central

    Hellhammer, Katharina; Zeus, Tobias; Verde, Pablo E; Veulemanns, Verena; Kahlstadt, Lisa; Wolff, Georg; Erkens, Ralf; Westenfeld, Ralf; Navarese, Eliano P; Merx, Marc W; Rassaf, Tienush; Kelm, Malte

    2016-01-01

    AIM: To determine the impact of red blood cell distribution width on outcome in anemic patients undergoing transcatheter aortic valve implantation (TAVI). METHODS: In a retrospective single-center cohort study we determined the impact of baseline red cell distribution width (RDW) and anemia on outcome in 376 patients with aortic stenosis undergoing TAVI. All patients were discussed in the institutional heart team and declined for surgical aortic valve replacement due to high operative risk. Collected data included patient characteristics, imaging findings, periprocedural in-hospital data, laboratory results and follow-up data. Blood samples for hematology and biochemistry analysis were taken from every patient before and at fixed intervals up to 72 h after TAVI, including blood count and creatinine. Descriptive statistics were used for patient characteristics. Kaplan-Meier survival curves were used for time-to-event outcomes. A recursive partitioning regression and classification was used to investigate the association between potential risk factors and outcome variables. RESULTS: Mean age in our study population was 81 ± 6.1 years. Anemia was prevalent in 63.6% (n = 239) of our patients. Age and creatinine were identified as risk factors for anemia. In our study population, anemia per se did influence 30-d mortality but did not predict long-term mortality. In contrast, an RDW > 14% proved to be highly predictive of reduced short- and long-term survival in patients with aortic valve disease after the TAVI procedure. CONCLUSION: Age and kidney function determine the degree of anemia. The anisocytosis of red blood cells in anemic patients supplements prognostic information in addition to that derived from the WHO-based definition of anemia. PMID:26981217

  10. HFSRT of the resection cavity in patients with brain metastases.

    PubMed

    Specht, Hanno M; Kessel, Kerstin A; Oechsner, Markus; Meyer, Bernhard; Zimmer, Claus; Combs, Stephanie E

    2016-06-01

    Aim of this single center, retrospective study was to assess the efficacy and safety of linear accelerator-based hypofractionated stereotactic radiotherapy (HFSRT) to the resection cavity of brain metastases after surgical resection. Local control (LC), locoregional control (LRC = new brain metastases outside of the treatment volume), overall survival (OS) as well as acute and late toxicity were evaluated. 46 patients with large (> 3 cm) or symptomatic brain metastases were treated with HFSRT. Median resection cavity volume was 14.16 cm(3) (range 1.44-38.68 cm(3)) and median planning target volume (PTV) was 26.19 cm(3) (range 3.45-63.97 cm(3)). Patients were treated with 35 Gy in 7 fractions prescribed to the 95-100 % isodose line in a stereotactic treatment setup. LC and LRC were assessed by follow-up magnetic resonance imaging. The 1-year LC rate was 88 % and LRC was 48 %; 57% of all patients showed cranial progression after HFSRT (4% local, 44% locoregional, 9% local and locoregional). The median follow-up was 19 months; median OS for the whole cohort was 25 months. Tumor histology and recursive partitioning analysis score were significant predictors for OS. HFSRT was tolerated well without any severe acute side effects > grade 2 according to CTCAE criteria. HFSRT after surgical resection of brain metastases was tolerated well without any severe acute side effects and led to excellent LC and a favorable OS. Since more than half of the patients showed cranial progression after local irradiation of the resection cavity, close patient follow-up is warranted. A prospective evaluation in clinical trials is currently being performed.

  11. Recursive linearization of multibody dynamics equations of motion

    NASA Technical Reports Server (NTRS)

    Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    The equations of motion of a multibody system are nonlinear in nature, and thus pose a difficult problem in linear control design. One approach is to have a first-order approximation through the numerical perturbations at a given configuration, and to design a control law based on the linearized model. Here, a linearized model is generated analytically by following the footsteps of the recursive derivation of the equations of motion. The equations of motion are first written in a Newton-Euler form, which is systematic and easy to construct; then, they are transformed into a relative coordinate representation, which is more efficient in computation. A new computational method for linearization is obtained by applying a series of first-order analytical approximations to the recursive kinematic relationships. The method has proved to be computationally more efficient because of its recursive nature. It has also turned out to be more accurate because of the fact that analytical perturbation circumvents numerical differentiation and other associated numerical operations that may accumulate computational error, thus requiring only analytical operations of matrices and vectors. The power of the proposed linearization algorithm is demonstrated, in comparison to a numerical perturbation method, with a two-link manipulator and a seven degrees of freedom robotic manipulator. Its application to control design is also demonstrated.

  12. Algebraic function operator expectation value based quantum eigenstate determination: A case of twisted or bent Hamiltonian, or, a spatially univariate quantum system on a curved space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baykara, N. A.

    Recent studies on quantum evolutionary problems in Demiralp's group have arrived at a stage where the construction of an expectation value formula for a given algebraic function operator depending on only the position operator becomes possible. It has also been shown that this formula turns into an algebraic recursion amongst some finite number of consecutive elements in a set of expectation values of an appropriately chosen basis set over the natural number powers of the position operator, as long as the function under consideration and the system Hamiltonian are both autonomous. This recursion corresponds to a denumerable infinite number of algebraic equations whose solutions may or may not be obtained analytically. This idea is not completely original. There are many recursive relations amongst the expectation values of the natural number powers of the position operator. However, those recursions may not always be efficient for obtaining the system energy values and especially the eigenstate wavefunctions. The present approach is a somewhat improved and generalized form of those expansions. We focus on this issue for a specific system where the Hamiltonian is defined on the coordinate of a curved space instead of the Cartesian one.

  13. Recursive thoughts on the simulation of the flexible multibody dynamics of slender offshore structures

    NASA Astrophysics Data System (ADS)

    Schilder, J.; Ellenbroek, M.; de Boer, A.

    2017-12-01

    In this work, the floating frame of reference formulation is used to create a flexible multibody model of slender offshore structures such as pipelines and risers. It is shown that, due to the chain-like topology of the considered structures, the equations of motion can be expressed in terms of absolute interface coordinates. In the presented form, the kinematic constraint equations are satisfied explicitly and the Lagrange multipliers are eliminated from the equations. Hence, the structures can be conveniently coupled to finite element or multibody models of, for example, the seabed and the vessel. The chain-like topology enables the efficient use of recursive solution procedures for both transient dynamic analysis and equilibrium analysis; for this, the transfer matrix method is used. In order to improve the convergence of the equilibrium analysis, the analytical solution of an ideal catenary is used as the initial configuration, reducing the number of required iterations.
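
    The ideal-catenary initial configuration mentioned above can be computed in closed form up to a single root find; a sketch for the simple case of supports at equal height is:

      # For a cable of length L hung between supports a horizontal distance S apart,
      # solve 2*a*sinh(S/(2a)) = L for the catenary parameter a, then evaluate the shape.
      import numpy as np
      from scipy.optimize import brentq

      def catenary_initial_shape(S, L, npoints=50):
          assert L > S, "cable must be longer than the span"
          f = lambda a: 2.0 * a * np.sinh(S / (2.0 * a)) - L
          a = brentq(f, S / 1000.0, 1000.0 * S)       # catenary parameter
          x = np.linspace(0.0, S, npoints)
          y = a * (np.cosh((x - S / 2.0) / a) - np.cosh(S / (2.0 * a)))  # y = 0 at supports
          return x, y

      x, y = catenary_initial_shape(S=100.0, L=120.0)
      print(y.min())   # mid-span sag (negative, below the supports)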

  14. Optimal design of minimum mean-square error noise reduction algorithms using the simulated annealing technique.

    PubMed

    Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan

    2009-02-01

    The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithm. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited to problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to assess statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.
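
    The parameter search itself is a small two-dimensional global optimization; a sketch using SciPy's simulated-annealing variant, with a placeholder objective standing in for the paper's regression model, is:

      # Search two recursion parameters by simulated annealing over a multimodal surface.
      import numpy as np
      from scipy.optimize import dual_annealing

      def objective(params):
          a, b = params                       # the two recursion parameters (0..1)
          # Placeholder surface: global minimum near (0.98, 0.8), plus local ripples.
          return (a - 0.98) ** 2 + (b - 0.8) ** 2 + 0.05 * np.sin(40 * a) * np.cos(40 * b)

      result = dual_annealing(objective, bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)
      print(result.x, result.fun)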

  15. Implications of Middle School Behavior Problems for High School Graduation and Employment Outcomes of Young Adults: Estimation of a Recursive Model

    PubMed Central

    Karakus, Mustafa C.; Salkever, David S.; Slade, Eric P.; Ialongo, Nicholas; Stuart, Elizabeth

    2013-01-01

    The potentially serious adverse impacts of behavior problems during adolescence on employment outcomes in adulthood provide a key economic rationale for early intervention programs. However, the extent to which lower educational attainment accounts for the total impact of adolescent behavior problems on later employment remains unclear. As an initial step in exploring this issue, we specify and estimate a recursive bivariate probit model that 1) relates middle school behavior problems to high school graduation and 2) models later employment in young adulthood as a function of these behavior problems and of high school graduation. Our model thus allows for both a direct effect of behavior problems on later employment as well as an indirect effect that operates via graduation from high school. Our empirical results, based on analysis of data from the NELS, suggest that the direct effects of externalizing behavior problems on later employment are not significant but that these problems have important indirect effects operating through high school graduation. PMID:23576834

  16. Ultra Wide-Band Localization and SLAM: A Comparative Study for Mobile Robot Navigation

    PubMed Central

    Segura, Marcelo J.; Auat Cheein, Fernando A.; Toibero, Juan M.; Mut, Vicente; Carelli, Ricardo

    2011-01-01

    In this work, a comparative study between an Ultra Wide-Band (UWB) localization system and a Simultaneous Localization and Mapping (SLAM) algorithm is presented. Due to its high bandwidth and short pulse length, UWB potentially allows great accuracy in range measurements based on Time of Arrival (TOA) estimation. SLAM algorithms recursively estimate the map of an environment and the pose (position and orientation) of a mobile robot within that environment. The comparative study presented here involves the performance analysis of implementing in parallel a UWB localization-based system and a SLAM algorithm on a mobile robot navigating within an environment. Real-time results as well as error analysis are also shown in this work. PMID:22319397
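
    The TOA ranging step can be illustrated with a simple nonlinear least-squares position fix from ranges to known anchors (synthetic values; not the authors' implementation):

      # Estimate a 2-D position from noisy ranges to four known anchors.
      import numpy as np
      from scipy.optimize import least_squares

      anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
      true_pos = np.array([3.0, 7.0])
      rng = np.random.default_rng(0)
      ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.05 * rng.normal(size=len(anchors))

      def residuals(p):
          return np.linalg.norm(anchors - p, axis=1) - ranges

      estimate = least_squares(residuals, x0=np.array([5.0, 5.0])).x
      print(estimate)   # close to (3, 7)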

  17. Direct optimization, affine gap costs, and node stability.

    PubMed

    Aagesen, Lone

    2005-09-01

    The outcome of a phylogenetic analysis based on DNA sequence data is highly dependent on the homology-assignment step and may vary with alignment parameter costs. Robustness to changes in parameter costs is therefore a desired quality of a data set because the final conclusions will be less dependent on selecting a precise optimal cost set. Here, node stability is explored in relation to separate versus combined analysis in three different data sets, all including several data partitions. Robustness to changes in cost sets is measured as the number of successive changes that can be made in a given cost set before a specific clade is lost. The changes are, in all cases, base change cost, gap penalties, and adding/removing/changing affine gap costs. When combining data partitions, the number of clades that appear in the entire parameter space is not remarkably increased; in some cases this number even decreased. However, when combining data partitions, the trees from cost sets including affine gap costs were always more similar than the trees from cost sets without affine gap costs. This was not the case when the data partitions were analyzed independently. When data sets were combined, approximately 80% of the clades found under cost sets including affine gap costs resisted at least one change to the cost set.

  18. Applying Recursive Sensitivity Analysis to Multi-Criteria Decision Models to Reduce Bias in Defense Cyber Engineering Analysis

    DTIC Science & Technology

    2015-10-28

    techniques such as regression analysis, correlation, and multicollinearity assessment to identify the change and error on the input to the model...between many of the independent or predictor variables, the issue of multicollinearity may arise [18]. VII. SUMMARY Accurate decisions concerning

  19. Meshfree truncated hierarchical refinement for isogeometric analysis

    NASA Astrophysics Data System (ADS)

    Atri, H. R.; Shojaee, S.

    2018-05-01

    In this paper, truncated hierarchical B-splines (THB-splines) are coupled with the reproducing kernel particle method (RKPM) to blend the advantages of isogeometric analysis and meshfree methods. Since, under certain conditions, the isogeometric B-spline and NURBS basis functions are exactly represented by reproducing kernel meshfree shape functions, the recursive process of producing isogeometric bases can be omitted. More importantly, a seamless link between meshfree methods and isogeometric analysis can easily be defined, which provides an authentic meshfree approach to refining the model locally in isogeometric analysis. This procedure can be accomplished using truncated hierarchical B-splines to construct new bases and adaptively refine them. It is also shown that the THB-RKPM method can provide efficient approximation schemes for numerical simulations and shows promising performance in the adaptive refinement of partial differential equations via isogeometric analysis. The proposed approach for adaptive local refinement is presented in detail and its effectiveness is investigated through well-known benchmark examples.

  20. Theory of Mind Development in Adolescence and Early Adulthood: The Growing Complexity of Recursive Thinking Ability

    PubMed Central

    Valle, Annalisa; Massaro, Davide; Castelli, Ilaria; Marchetti, Antonella

    2015-01-01

    This study explores the development of theory of mind, operationalized as recursive thinking ability, from adolescence to early adulthood (N = 110; young adolescents = 47; adolescents = 43; young adults = 20). The construct of theory of mind has been operationalized in two different ways: as the ability to recognize the correct mental state of a character, and as the ability to attribute the correct mental state in order to predict the character’s behaviour. The Imposing Memory Task, with five recursive thinking levels, and a third-order false-belief task with three recursive thinking levels (devised for this study) have been used. The relationship among working memory, executive functions, and linguistic skills are also analysed. Results show that subjects exhibit less understanding of elevated recursive thinking levels (third, fourth, and fifth) compared to the first and second levels. Working memory is correlated with total recursive thinking, whereas performance on the linguistic comprehension task is related to third level recursive thinking in both theory of mind tasks. An effect of age on third-order false-belief task performance was also found. A key finding of the present study is that the third-order false-belief task shows significant age differences in the application of recursive thinking that involves the prediction of others’ behaviour. In contrast, such an age effect is not observed in the Imposing Memory Task. These results may support the extension of the investigation of the third order false belief after childhood. PMID:27247645

  1. Predictive factors for pericardial effusion identified by heart dose-volume histogram analysis in oesophageal cancer patients treated with chemoradiotherapy.

    PubMed

    Hayashi, K; Fujiwara, Y; Nomura, M; Kamata, M; Kojima, H; Kohzai, M; Sumita, K; Tanigawa, N

    2015-02-01

    To identify predictive factors for the development of pericardial effusion (PCE) in patients with oesophageal cancer treated with chemotherapy and radiotherapy (RT). From March 2006 to November 2012, patients with oesophageal cancer treated with chemoradiotherapy (CRT) using the following criteria were evaluated: radiation dose >50 Gy; heart included in the radiation field; dose-volume histogram (DVH) data available for analysis; no previous thoracic surgery; and no PCE before treatment. The diagnosis of PCE was independently determined by two radiologists. Clinical factors, the percentage of heart volume receiving >5-60 Gy in increments of 5 Gy (V5-60, respectively), maximum heart dose and mean heart dose were analysed. A total of 143 patients with oesophageal cancer were reviewed retrospectively. The median follow-up by CT was 15 months (range, 2.1-72.6 months) after RT. PCE developed in 55 patients (38.5%) after RT, and the median time to develop PCE was 3.5 months (range, 0.2-9.9 months). On univariate analysis, DVH parameters except for V60 were significantly associated with the development of PCE (p < 0.001). No clinical factor was significantly related to the development of PCE. Recursive partitioning analysis including all DVH parameters as variables showed a V10 cut-off value of 72.8% to be the most influential factor. The present results showed that DVH parameters are strong independent predictive factors for the development of PCE in patients with oesophageal cancer treated with CRT. A heart dosage was associated with the development of PCE with radiation and without prophylactic nodal irradiation.

  2. A matched-pair study comparing whole-brain irradiation alone to radiosurgery or fractionated stereotactic radiotherapy alone in patients irradiated for up to three brain metastases.

    PubMed

    Rades, Dirk; Janssen, Stefan; Dziggel, Liesa; Blanck, Oliver; Bajrovic, Amira; Veninga, Theo; Schild, Steven E

    2017-01-06

    This matched-pair study was initiated to validate the results of a retrospective study of 186 patients published in 2007 that compared whole-brain irradiation (WBI) alone and radiosurgery (RS) alone for up to three brain metastases. One hundred fifty-two patients receiving WBI alone for up to three brain metastases were matched 1:1 with 152 patients treated with RS or fractionated stereotactic radiotherapy (FSRT) alone for each of eight factors (age, gender, Eastern Cooperative Oncology Group (ECOG) performance score, nature of tumor, number of brain metastases, extra-cerebral spread, period from cancer detection to irradiation of brain metastases, and recursive partitioning analysis (RPA) class). Groups were analyzed regarding intracerebral control (IC) and overall survival (OS). On univariate analysis of IC, type of irradiation did not significantly affect outcomes (p = 0.84). On Cox regression, number of brain metastases (p < 0.001), nature of tumor (p < 0.001) and period from cancer detection to irradiation of brain metastases (p = 0.013) were significantly associated with IC. On univariate analysis of OS, type of irradiation showed no significant association with outcomes (p = 0.63). On multivariate analyses, OS was significantly associated with ECOG performance score (p = 0.011), nature of tumor (p = 0.035), number of brain metastases (p = 0.048), extra-cerebral spread (p = 0.002) and RPA class (p < 0.001). In this matched-pair study, RS/FSRT alone was not superior to WBI alone regarding IC and OS. These results can be considered a revision of the findings from our previous retrospective study without a matched-pair design, where RS alone resulted in significantly better IC than WBI alone on multivariate analysis.

  3. The Analysis of Image Segmentation Hierarchies with a Graph-based Knowledge Discovery System

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Cooke, Diane J.; Ketkar, Nikhil; Aksoy, Selim

    2008-01-01

    Currently available pixel-based analysis techniques do not effectively extract the information content from the increasingly available high spatial resolution remotely sensed imagery data. A general consensus is that object-based image analysis (OBIA) is required to effectively analyze this type of data. OBIA is usually a two-stage process: image segmentation followed by an analysis of the segmented objects. We are exploring an approach to OBIA in which hierarchical image segmentations provided by the Recursive Hierarchical Segmentation (RHSEG) software developed at NASA GSFC are analyzed by the Subdue graph-based knowledge discovery system developed by a team at Washington State University. In this paper we discuss our initial approach to representing the RHSEG-produced hierarchical image segmentations in a graphical form understandable by Subdue, and provide results on real and simulated data. We also discuss planned improvements designed to more effectively and completely convey the hierarchical segmentation information to Subdue and to improve processing efficiency.

  4. Practical application of cure mixture model for long-term censored survivor data from a withdrawal clinical trial of patients with major depressive disorder.

    PubMed

    Arano, Ichiro; Sugimoto, Tomoyuki; Hamasaki, Toshimitsu; Ohno, Yuko

    2010-04-23

    Survival analysis methods such as the Kaplan-Meier method, log-rank test, and Cox proportional hazards regression (Cox regression) are commonly used to analyze data from randomized withdrawal studies in patients with major depressive disorder. Unfortunately, such common methods may be inappropriate when long-term censored relapse-free times appear in the data, as these methods assume that, if complete follow-up were possible for all individuals, each would eventually experience the event of interest. In this paper, to analyse data including such long-term censored relapse-free times, we discuss a semi-parametric cure regression (Cox cure regression), which combines a logistic formulation for the probability of occurrence of an event with a Cox proportional hazards specification for the time of occurrence of the event. In specifying the treatment's effect on disease-free survival, we consider the fraction of long-term survivors and the risks associated with a relapse of the disease. In addition, we develop a tree-based method for time-to-event data to identify groups of patients with differing prognoses (cure survival CART). Although analysis methods typically adapt the log-rank statistic for recursive partitioning procedures, the method applied here used a likelihood ratio (LR) test statistic from a fit of the cure survival regression assuming exponential and Weibull distributions for the latency time of relapse. The method is illustrated using data from a sertraline randomized withdrawal study in patients with major depressive disorder. We concluded that Cox cure regression reveals who may be cured and how the treatment and other factors affect the cure incidence and the relapse time of uncured patients, and that the cure survival CART output provides easily understandable and interpretable information, useful both in identifying groups of patients with differing prognoses and in utilizing Cox cure regression models in a way that leads to meaningful interpretations.

  5. Accelerated Time-Domain Modeling of Electromagnetic Pulse Excitation of Finite-Length Dissipative Conductors over a Ground Plane via Function Fitting and Recursive Convolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campione, Salvatore; Warne, Larry K.; Sainath, Kamalesh

    In this report we overview the fundamental concepts for a pair of techniques which together greatly hasten computational predictions of electromagnetic pulse (EMP) excitation of finite-length dissipative conductors over a ground plane. In a time-domain, transmission line (TL) model implementation, predictions are computationally bottlenecked time-wise, either for late-time predictions (about the 100 ns-10000 ns range) or predictions concerning EMP excitation of long TLs (order of kilometers or more). This is because the method requires a temporal convolution to account for the losses in the ground. Addressing this to facilitate practical simulation of EMP excitation of TLs, we first apply a technique to extract an (approximate) complex exponential function basis-fit to the ground/Earth's impedance function, followed by incorporating this into a recursion-based convolution acceleration technique. Because the recursion-based method only requires the evaluation of the most recent voltage history data (versus the entire history in a "brute-force" convolution evaluation), we achieve necessary time speed-ups across a variety of TL/Earth geometry/material scenarios.
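
    The acceleration rests on a standard identity: once the kernel is fitted as a sum of exponentials k(t) = sum_i c_i*exp(-a_i*t), each partial convolution obeys a one-step recursion, so only the previous state is needed rather than the full history. A sketch (piecewise-constant input assumed within a step; the coefficients are arbitrary placeholders, not a fitted Earth impedance) is:

      # Recursive convolution of an input v(t) with an exponential-sum kernel
      # k(t) = sum_i c_i * exp(-a_i * t), assuming v is piecewise constant over each step:
      #   s_i[n] = exp(-a_i*dt) * s_i[n-1] + (c_i/a_i) * (1 - exp(-a_i*dt)) * v[n]
      import numpy as np

      def recursive_convolution(v, dt, c, a):
          c, a = np.asarray(c, float), np.asarray(a, float)
          decay = np.exp(-a * dt)
          weight = (c / a) * (1.0 - decay)
          s = np.zeros_like(c)
          y = np.empty(len(v))
          for n, vn in enumerate(v):
              s = decay * s + weight * vn      # update each exponential "memory" state
              y[n] = s.sum()
          return y

      dt = 1e-3
      t = np.arange(0.0, 1.0, dt)
      v = np.sin(2 * np.pi * 5 * t)            # toy excitation waveform
      y = recursive_convolution(v, dt, c=[1.0, 0.5], a=[20.0, 200.0])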

  6. On Fusing Recursive Traversals of K-d Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    Loop fusion is a key program transformation for data locality optimization that is implemented in production compilers. But optimizing compilers currently cannot exploit fusion opportunities across a set of recursive tree traversal computations with producer-consumer relationships. In this paper, we develop a compile-time approach to dependence characterization and program transformation to enable fusion across recursively specified traversals over k-ary trees. We present the FuseT source-to-source code transformation framework to automatically generate fused composite recursive operators from an input program containing a sequence of primitive recursive operators. We use our framework to implement fused operators for MADNESS, the Multiresolution Adaptive Numerical Environment for Scientific Simulation. We show that locality optimization through fusion can offer more than an order of magnitude performance improvement.
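
    The transformation can be illustrated on a toy pair of traversals; the sketch below hand-fuses a producer (scale) and a consumer (sum) into a single recursive pass, which is the kind of fusion FuseT automates for real operators:

      # Toy illustration of fusing two recursive tree traversals into one pass.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Node:
          value: float
          left: Optional["Node"] = None
          right: Optional["Node"] = None

      def scale(node, k):                 # traversal 1: producer
          if node is None:
              return
          node.value *= k
          scale(node.left, k)
          scale(node.right, k)

      def total(node):                    # traversal 2: consumer
          if node is None:
              return 0.0
          return node.value + total(node.left) + total(node.right)

      def scale_and_total(node, k):       # fused: one pass over the tree, better locality
          if node is None:
              return 0.0
          node.value *= k
          return node.value + scale_and_total(node.left, k) + scale_and_total(node.right, k)

      tree = Node(1.0, Node(2.0), Node(3.0, Node(4.0)))
      print(scale_and_total(tree, 2.0))   # 20.0, same as scale(tree, 2.0) followed by total(tree)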

  7. XAPiir: A recursive digital filtering package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, D.

    1990-09-21

    XAPiir is a basic recursive digital filtering package, containing both design and implementation subroutines. XAPiir was developed for the experimental array processor (XAP) software package, and is written in FORTRAN. However, it is intended to be incorporated into any general- or special-purpose signal analysis program. It replaces the older package RECFIL, offering several enhancements. RECFIL is used in several large analysis programs developed at LLNL, including the seismic analysis package SAC, several expert systems (NORSEA and NETSEA), and two general purpose signal analysis packages (SIG and VIEW). This report is divided into two sections: the first describes the use of the subroutine package, and the second, its internal organization. In the first section, the filter design problem is briefly reviewed, along with the definitions of the filter design parameters and their relationship to the subroutine input parameters. In the second section, the internal organization is documented to simplify maintenance and extensions to the package. 5 refs., 9 figs.
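
    XAPiir itself is a FORTRAN subroutine package, but the design-then-apply workflow of a recursive (IIR) filtering package can be illustrated with SciPy:

      # Design a recursive (IIR) band-pass filter, then apply it to a toy trace.
      import numpy as np
      from scipy import signal

      fs = 100.0                                   # sampling rate in Hz (assumed for this example)
      sos = signal.butter(4, [1.0, 10.0], btype="bandpass", fs=fs, output="sos")

      t = np.arange(0, 10, 1 / fs)
      x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
      y = signal.sosfilt(sos, x)                   # recursive (IIR) filtering of the trace
      print(y[:5])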

  8. The Deformation Behavior Analysis and Mechanical Modeling of Step/Intercritical Quenching and Partitioning-Treated Multiphase Steels

    NASA Astrophysics Data System (ADS)

    Zhao, Hongshan; Li, Wei; Wang, Li; Zhou, Shu; Jin, Xuejun

    2016-08-01

    Two types of multiphase steels containing blocky or fine martensite have been used to study the phase interaction and the TRIP effect. These steels were obtained by step-quenching and partitioning (S-QP820) or intercritical-quenching and partitioning (I-QP800 & I-QP820). The retained austenite (RA) in S-QP820 specimen containing blocky martensite transformed too early to prevent the local failure at high strain due to the local strain concentration. In contrast, plentiful RA in I-QP800 specimen containing finely dispersed martensite transformed uniformly at high strain, which led to optimized strength and elongation. By applying a coordinate conversion method to the microhardness test, the load partitioning between ferrite and partitioned martensite was proved to follow the linear mixture law. The mechanical behavior of multiphase S-QP820 steel can be modeled based on the Mecking-Kocks theory, Bouquerel's spherical assumption, and Gladman-type mixture law. Finally, the transformation-induced martensite hardening effect has been studied on a bake-hardened specimen.

  9. Inference and Analysis of Population Structure Using Genetic Data and Network Theory

    PubMed Central

    Greenbaum, Gili; Templeton, Alan R.; Bar-David, Shirli

    2016-01-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition’s modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). PMID:26888080
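
    A rough sketch of that workflow using networkx (illustrative only; the authors' implementation is NetStruct): build a weighted graph from the pairwise similarity matrix, detect communities by modularity maximization, and judge significance with a simple permutation test on the similarity values.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

def communities_and_modularity(sim):
    """Partition a graph built from a symmetric similarity matrix."""
    G = nx.Graph()
    G.add_nodes_from(range(sim.shape[0]))
    for i in range(sim.shape[0]):
        for j in range(i + 1, sim.shape[0]):
            if sim[i, j] > 0:
                G.add_edge(i, j, weight=sim[i, j])
    parts = community.greedy_modularity_communities(G, weight="weight")
    return parts, community.modularity(G, parts, weight="weight")

def permutation_pvalue(sim, observed_q, n_perm=200, seed=0):
    """How often does shuffling the off-diagonal similarities reach the observed modularity?"""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(sim, k=1)
    hits = 0
    for _ in range(n_perm):
        perm = np.zeros_like(sim)
        perm[iu] = rng.permutation(sim[iu])
        perm += perm.T
        hits += communities_and_modularity(perm)[1] >= observed_q
    return (hits + 1) / (n_perm + 1)

# toy example: five individuals forming two loose groups (hypothetical similarities)
sim = np.array([[0, .9, .8, .1, .1],
                [.9, 0, .7, .1, .2],
                [.8, .7, 0, .1, .1],
                [.1, .1, .1, 0, .9],
                [.1, .2, .1, .9, 0]])
parts, q = communities_and_modularity(sim)
p_value = permutation_pvalue(sim, q)
```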

  10. Improvement of selective screening strategy for gestational diabetes through a more accurate definition of high-risk groups.

    PubMed

    Pintaudi, Basilio; Di Vieste, Giacoma; Corrado, Francesco; Lucisano, Giuseppe; Pellegrini, Fabio; Giunta, Loretta; Nicolucci, Antonio; D'Anna, Rosario; Di Benedetto, Antonino

    2014-01-01

    This study aimed to assess the predictive value of risk factors (RFs) for gestational diabetes mellitus (GDM) established by selective screening (SS) and to identify subgroups of women at a higher risk of developing GDM. A retrospective, single-center study design was employed. Data of 1015 women screened for GDM at 24-28 weeks of gestation and diagnosed according to the International Association of Diabetes and Pregnancy Study Groups criteria were evaluated. Information on RFs established by SS was also collected and their association with GDM was determined. To identify distinct and homogeneous subgroups of patients at a higher risk, the RECursive Partitioning and AMalgamation (RECPAM) method was used. Overall, 113 (11.1%) women were diagnosed as having GDM. The application of the SS criteria would result in the execution of an oral glucose tolerance test (OGTT) in 58.3% of women and 26 (23.0%) cases of GDM would not be detected due to the absence of any RF. The RECPAM analysis identified high-risk subgroups characterized by fasting plasma glucose values >5.1 mmol/l (odds ratio (OR)=26.5; 95% CI 14.3-49.0) and pre-pregnancy BMI (OR=7.0; 95% CI 3.9-12.8 for overweight women). In a final logistic model including RECPAM classes, previous macrosomia (OR=3.6; 95% CI 1.1-11.6), and family history of diabetes (OR=1.8; 95% CI 1.1-2.8), but not maternal age, were also found to be associated with an increased risk of developing GDM. A screening approach based on the RECPAM model would reduce by over 50% (23.0 vs 10.6%) the number of undiagnosed GDM cases when compared with the current SS approach, at the expense of 50 additional OGTTs required. A screening approach based on our RECPAM model results in a significant reduction in the number of undetected GDM cases compared with the current SS procedure.
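
    RECPAM itself is a specific tree-and-amalgamation procedure; as a loose stand-in, a plain classification tree already shows how recursive partitioning carves screening variables into homogeneous risk subgroups (all variable names and values below are hypothetical):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 1000
fpg = rng.normal(4.6, 0.5, n)            # fasting plasma glucose (mmol/l), simulated
bmi = rng.normal(24.0, 4.0, n)           # pre-pregnancy BMI, simulated
risk = 1 / (1 + np.exp(-(3.0 * (fpg - 5.1) + 0.15 * (bmi - 25))))
gdm = rng.random(n) < risk               # simulated GDM outcome

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50, random_state=0)
tree.fit(np.column_stack([fpg, bmi]), gdm)

# each printed leaf is a candidate risk subgroup with a homogeneous GDM probability
print(export_text(tree, feature_names=["fpg", "bmi"]))
```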

  11. A Simple Algorithm for Predicting Bacteremia Using Food Consumption and Shaking Chills: A Prospective Observational Study.

    PubMed

    Komatsu, Takayuki; Takahashi, Erika; Mishima, Kentaro; Toyoda, Takeo; Saitoh, Fumihiro; Yasuda, Akari; Matsuoka, Joe; Sugita, Manabu; Branch, Joel; Aoki, Makoto; Tierney, Lawrence; Inoue, Kenji

    2017-07-01

    Predicting the presence of true bacteremia based on clinical examination is unreliable. We aimed to construct a simple algorithm for predicting true bacteremia by using food consumption and shaking chills. A prospective multicenter observational study. Three hospital centers in a large Japanese city. In total, 1,943 hospitalized patients aged 14 to 96 years who underwent blood culture acquisitions between April 2013 and August 2014 were enrolled. Patients with anorexia-inducing conditions were excluded. We assessed the patients' oral food intake based on the meal immediately prior to the blood culture with definition as "normal food consumption" when >80% of a meal was consumed and "poor food consumption" when <80% was consumed. We also concurrently evaluated for a history of shaking chills. We calculated the statistical characteristics of food consumption and shaking chills for the presence of true bacteremia, and subsequently built the algorithm by using recursive partitioning analysis. Among 1,943 patients, 223 cases were true bacteremia. Among patients with normal food consumption, without shaking chills, the incidence of true bacteremia was 2.4% (13/552). Among patients with poor food consumption and shaking chills, the incidence of true bacteremia was 47.7% (51/107). The presence of poor food consumption had a sensitivity of 93.7% (95% confidence interval [CI], 89.4%-97.9%) for true bacteremia, and the absence of poor food consumption (ie, normal food consumption) had a negative likelihood ratio (LR) of 0.18 (95% CI, 0.17-0.19) for excluding true bacteremia, respectively. Conversely, the presence of the shaking chills had a specificity of 95.1% (95% CI, 90.7%-99.4%) and a positive LR of 4.78 (95% CI, 4.56-5.00) for true bacteremia. A 2-item screening checklist for food consumption and shaking chills had excellent statistical properties as a brief screening instrument for predicting true bacteremia. © 2017 Society of Hospital Medicine

  12. Predicting Overall Survival After Stereotactic Ablative Radiation Therapy in Early-Stage Lung Cancer: Development and External Validation of the Amsterdam Prognostic Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, Alexander V., E-mail: Dr.alexlouie@gmail.com; Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario; Department of Epidemiology, Harvard School of Public Health, Harvard University, Boston, Massachusetts

    Purpose: A prognostic model for 5-year overall survival (OS), consisting of recursive partitioning analysis (RPA) and a nomogram, was developed for patients with early-stage non-small cell lung cancer (ES-NSCLC) treated with stereotactic ablative radiation therapy (SABR). Methods and Materials: A primary dataset of 703 ES-NSCLC SABR patients was randomly divided into a training (67%) and an internal validation (33%) dataset. In the former group, 21 unique parameters consisting of patient, treatment, and tumor factors were entered into an RPA model to predict OS. Univariate and multivariate models were constructed for RPA-selected factors to evaluate their relationship with OS. A nomogram for OS was constructed based on factors significant in multivariate modeling and validated with calibration plots. Both the RPA and the nomogram were externally validated in independent surgical (n=193) and SABR (n=543) datasets. Results: RPA identified 2 distinct risk classes based on tumor diameter, age, World Health Organization performance status (PS) and Charlson comorbidity index. This RPA had moderate discrimination in SABR datasets (c-index range: 0.52-0.60) but was of limited value in the surgical validation cohort. The nomogram predicting OS included smoking history in addition to RPA-identified factors. In contrast to RPA, validation of the nomogram performed well in internal validation (r²=0.97) and external SABR (r²=0.79) and surgical cohorts (r²=0.91). Conclusions: The Amsterdam prognostic model is the first externally validated prognostication tool for OS in ES-NSCLC treated with SABR available to individualize patient decision making. The nomogram retained strong performance across surgical and SABR external validation datasets. RPA performance was poor in surgical patients, suggesting that 2 distinct patient populations are being treated with these 2 effective modalities.

  13. Underestimation of boreal soil carbon stocks by mathematical soil carbon models linked to soil nutrient status

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina A.; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-08-01

    An inaccurate estimate of the largest terrestrial carbon pool, the soil organic carbon (SOC) stock, is the major source of uncertainty in simulating the feedback of climate warming on ecosystem-atmosphere carbon dioxide exchange by process-based ecosystem and soil carbon models. Although the models need to simplify complex environmental processes of soil carbon sequestration, in a large mosaic of environments a missing key driver could lead to a modeling bias in predictions of SOC stock change. We aimed to evaluate SOC stock estimates of process-based models (Yasso07, Q, and CENTURY soil sub-model v4) against a massive Swedish forest soil inventory data set (3230 samples) organized by a recursive partitioning method into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. For two-thirds of measurements all models predicted accurate SOC stock levels regardless of the detail of input data, e.g., whether they ignored or included soil properties. However, in fertile sites with high N deposition, high cation exchange capacity, or moderately increased soil water content, Yasso07 and Q models underestimated SOC stocks. In comparison to Yasso07 and Q, accounting for the site-specific soil characteristics (e.g., clay content and topsoil mineral N) by CENTURY improved SOC stock estimates for sites with high clay content, but not for sites with high N deposition. Our analysis suggested that the soils with poorly predicted SOC stocks, as characterized by the high nutrient status and well-sorted parent material, indeed have had other predominant drivers of SOC stabilization lacking in the models, presumably the mycorrhizal organic uptake and organo-mineral stabilization processes. Our results imply that the role of soil nutrient status as regulator of organic matter mineralization has to be re-evaluated, since correct SOC stocks are decisive for predicting future SOC change and soil CO2 efflux.

  14. A fast iterative recursive least squares algorithm for Wiener model identification of highly nonlinear systems.

    PubMed

    Kazemi, Mahdi; Arefi, Mohammad Mehdi

    2017-03-01

    In this paper, an online identification algorithm is presented for nonlinear systems in the presence of output colored noise. The proposed method is based on the extended recursive least squares (ERLS) algorithm, where the identified system is in polynomial Wiener form. To this end, an unknown intermediate signal is estimated by using an inner iterative algorithm. The iterative recursive algorithm adaptively modifies the vector of parameters of the presented Wiener model when the system parameters vary. In addition, to increase the robustness of the proposed method against variations, a robust RLS algorithm is applied to the model. Simulation results are provided to show the effectiveness of the proposed approach. Results confirm that the proposed method has a fast convergence rate with robust characteristics, which increases the efficiency of the proposed model and identification approach. For instance, the FIT criterion reaches 92% for a CSTR process in which about 400 data points are used. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
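
    For context, the recursive least squares core that such schemes build on fits in a few lines (a generic exponentially weighted RLS update, not the paper's full Wiener/ERLS algorithm; lam is the forgetting factor):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step: refresh parameter estimate theta and covariance P
    using only the newest regressor phi and measurement y."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + float(phi.T @ P @ phi))   # gain vector
    err = y - float(phi.T @ theta)                 # prediction error
    theta = theta + k * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# toy usage: track y[n] = 2*u[n] - 0.5*u[n-1] online
rng = np.random.default_rng(0)
u = rng.normal(size=200)
theta, P = np.zeros((2, 1)), 1e3 * np.eye(2)
for n in range(1, len(u)):
    phi = np.array([u[n], u[n - 1]])
    y = 2.0 * u[n] - 0.5 * u[n - 1] + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, phi, y)
# theta now approximates [[2.0], [-0.5]]
```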

  15. A probabilistic, distributed, recursive mechanism for decision-making in the brain

    PubMed Central

    Gurney, Kevin N.

    2018-01-01

    Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077
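
    The recursive Bayesian core of such a model can be shown in miniature: the posterior over competing hypotheses is updated sample by sample, each update reusing the previous posterior as the prior (a generic sequential-Bayes sketch with Poisson spike counts, not the paper's cortico-basal-ganglia architecture):

```python
import numpy as np

def recursive_bayes(counts, rates):
    """Sequentially update P(hypothesis | spikes so far) for Poisson spike counts.
    counts: spike count per time bin; rates: expected count per bin under each hypothesis."""
    log_post = np.zeros(len(rates))              # flat prior, kept in log space
    for c in counts:
        log_lik = c * np.log(rates) - rates      # Poisson log-likelihood (up to a constant)
        log_post = log_post + log_lik            # old posterior becomes the new prior
        log_post -= log_post.max()               # rescale for numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# toy usage: two motion hypotheses with expected rates of 4 vs 2 spikes per bin
rng = np.random.default_rng(0)
counts = rng.poisson(lam=4.0, size=30)           # data generated under the first hypothesis
print(recursive_bayes(counts, rates=np.array([4.0, 2.0])))
```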

  16. Face recognition using tridiagonal matrix enhanced multivariance products representation

    NASA Astrophysics Data System (ADS)

    Özay, Evrim Korkmaz

    2017-01-01

    This study aims to retrieve face images from a database according to a target face image. For this purpose, Tridiagonal Matrix Enhanced Multivariance Products Representation (TMEMPR) is taken into consideration. TMEMPR is a recursive algorithm based on Enhanced Multivariance Products Representation (EMPR). TMEMPR decomposes a matrix into three components: a matrix of left support terms, a tridiagonal matrix of weight parameters for each recursion, and a matrix of right support terms. In this sense, there is an analogy between Singular Value Decomposition (SVD) and TMEMPR. However, TMEMPR is a more flexible algorithm since its initial support terms (or vectors) can be chosen as desired. Low computational complexity is another advantage of TMEMPR because the algorithm has been constructed with recursions of certain arithmetic operations without requiring any iteration. The algorithm has been trained and tested on the ORL face image database with 400 different grayscale images of 40 different people. Finally, TMEMPR's performance is compared with that of SVD.

  17. Three applications of a bonus relation for gravity amplitudes

    NASA Astrophysics Data System (ADS)

    Spradlin, Marcus; Volovich, Anastasia; Wen, Congkao

    2009-04-01

    Arkani-Hamed et al. have recently shown that all tree-level scattering amplitudes in maximal supergravity exhibit exceptionally soft behavior when two supermomenta are taken to infinity in a particular complex direction, and that this behavior implies new non-trivial relations amongst amplitudes in addition to the well-known on-shell recursion relations. We consider the application of these new 'bonus relations' to MHV amplitudes, showing that they can be used quite generally to relate (n-2)!-term formulas typically obtained from recursion relations to (n-3)!-term formulas related to the original BGK conjecture. Specifically we provide (1) a direct proof of a formula presented by Elvang and Freedman, (2) a new formula based on one due to Bedford et al., and (3) an alternate proof of a formula recently obtained by Mason and Skinner. Our results also provide the first direct proof that the conjectured BGK formula, only very recently proven via completely different methods, satisfies the on-shell recursion.

  18. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    NASA Astrophysics Data System (ADS)

    Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.

    2017-04-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least square (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristic on modelling accuracy and retain the advantages of recursive PLS algorithm. To solve the high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately.

  19. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

    PubMed Central

    Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L 2 and L 1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L 1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996

  20. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization.

    PubMed

    Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L 2 and L 1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L 1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.

  1. In silico models for the prediction of dose-dependent human hepatotoxicity

    NASA Astrophysics Data System (ADS)

    Cheng, Ailan; Dixon, Steven L.

    2003-12-01

    The liver is extremely vulnerable to the effects of xenobiotics due to its critical role in metabolism. Drug-induced hepatotoxicity may involve any number of different liver injuries, some of which lead to organ failure and, ultimately, patient death. Understandably, liver toxicity is one of the most important dose-limiting considerations in the drug development cycle, yet there remains a serious shortage of methods to predict hepatotoxicity from chemical structure. We discuss our latest findings in this area and present a new, fully general in silico model which is able to predict the occurrence of dose-dependent human hepatotoxicity with greater than 80% accuracy. Utilizing an ensemble recursive partitioning approach, the model classifies compounds as toxic or non-toxic and provides a confidence level to indicate which predictions are most likely to be correct. Only 2D structural information is required and predictions can be made quite rapidly, so this approach is entirely appropriate for data mining applications and for profiling large synthetic and/or virtual libraries.
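
    A minimal illustration of the ensemble-of-recursive-partitions idea, with a generic random forest standing in for the authors' ensemble and purely synthetic descriptors: the fraction of trees voting "toxic" doubles as the confidence level mentioned above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                       # hypothetical 2D structural descriptors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)) > 0   # toxic / non-toxic

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

X_new = rng.normal(size=(5, 20))
proba = forest.predict_proba(X_new)[:, 1]            # fraction of trees voting "toxic"
prediction = proba > 0.5
confidence = np.abs(proba - 0.5) * 2                 # 0 = split ensemble, 1 = unanimous
```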

  2. Anger, preoccupied attachment, and domain disorganization in borderline personality disorder

    PubMed Central

    Morse, Jennifer Q.; Hill, Jonathan; Pilkonis, Paul A.; Yaggi, Kirsten; Broyden, Nichaela; Stepp, Stephanie; Reed, Lawrence Ian; Feske, Ulrike

    2010-01-01

    Emotional dysregulation and attachment insecurity have been reported in borderline personality disorder (BPD). Domain disorganization, evidenced in poor regulation of emotions and behaviors in relation to the demands of different social domains, may be a distinguishing feature of BPD. Understanding the interplay between these factors may be critical for identifying interacting processes in BPD and potential subtypes of BPD. Therefore, we examined the joint and interactive effects of anger, preoccupied attachment, and domain disorganization on BPD traits in clinical sample of 128 psychiatric patients. The results suggest that these factors contribute to BPD both independently and in interaction, even when controlling for other personality disorder traits and Axis I symptoms. In regression analyses, the interaction between anger and domain disorganization predicted BPD traits. In recursive partitioning analyses, two possible paths to BPD were identified: high anger combined with high domain disorganization and low anger combined with preoccupied attachment. These results may suggest possible subtypes of BPD or possible mechanisms by which BPD traits are established and maintained. PMID:19538080

  3. Pretreatment 18F-FDG PET Textural Features in Locally Advanced Non-Small Cell Lung Cancer: Secondary Analysis of ACRIN 6668/RTOG 0235.

    PubMed

    Ohri, Nitin; Duan, Fenghai; Snyder, Bradley S; Wei, Bo; Machtay, Mitchell; Alavi, Abass; Siegel, Barry A; Johnson, Douglas W; Bradley, Jeffrey D; DeNittis, Albert; Werner-Wasik, Maria; El Naqa, Issam

    2016-06-01

    In a secondary analysis of American College of Radiology Imaging Network (ACRIN) 6668/RTOG 0235, high pretreatment metabolic tumor volume (MTV) on (18)F-FDG PET was found to be a poor prognostic factor for patients treated with chemoradiotherapy for locally advanced non-small cell lung cancer (NSCLC). Here we utilize the same dataset to explore whether heterogeneity metrics based on PET textural features can provide additional prognostic information. Patients with locally advanced NSCLC underwent (18)F-FDG PET prior to treatment. A gradient-based segmentation tool was used to contour each patient's primary tumor. MTV, maximum SUV, and 43 textural features were extracted for each tumor. To address overfitting and high collinearity among PET features, the least absolute shrinkage and selection operator (LASSO) method was applied to identify features that were independent predictors of overall survival (OS) after adjusting for MTV. Recursive binary partitioning in a conditional inference framework was utilized to identify optimal thresholds. Kaplan-Meier curves and log-rank testing were used to compare outcomes among patient groups. Two hundred one patients met inclusion criteria. The LASSO procedure identified 1 textural feature (SumMean) as an independent predictor of OS. The optimal cutpoint for MTV was 93.3 cm(3), and the optimal SumMean cutpoint for tumors above 93.3 cm(3) was 0.018. This grouped patients into three categories: low tumor MTV (n = 155; median OS, 22.6 mo), high tumor MTV and high SumMean (n = 23; median OS, 20.0 mo), and high tumor MTV and low SumMean (n = 23; median OS, 6.2 mo; log-rank P < 0.001). We have described an appropriate methodology to evaluate the prognostic value of textural PET features in the context of established prognostic factors. We have also identified a promising feature that may have prognostic value in locally advanced NSCLC patients with large tumors who are treated with chemoradiotherapy. Validation studies are warranted. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
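
    A loose sketch of that two-step analysis under stated simplifications (ordinary LASSO instead of a Cox-model LASSO, a depth-1 regression tree instead of conditional inference trees, and synthetic data), just to show how penalized feature selection can feed a recursive-partitioning cutpoint search:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n, p = 200, 40
X = rng.normal(size=(n, p))                                # hypothetical textural features
outcome = 24 - 3 * X[:, 0] + rng.normal(scale=2, size=n)   # synthetic survival-like outcome

# step 1: penalized regression keeps only features with independent signal
lasso = LassoCV(cv=5).fit(X, outcome)
selected = np.flatnonzero(lasso.coef_)

# step 2: a single recursive binary split finds an optimal threshold
# on the first selected feature
stump = DecisionTreeRegressor(max_depth=1).fit(X[:, selected[:1]], outcome)
cutpoint = stump.tree_.threshold[0]                        # candidate low/high risk cutoff
```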

  4. Pretreatment 18F-FDG PET Textural Features in Locally Advanced Non–Small Cell Lung Cancer: Secondary Analysis of ACRIN 6668/RTOG 0235

    PubMed Central

    Ohri, Nitin; Duan, Fenghai; Snyder, Bradley S.; Wei, Bo; Machtay, Mitchell; Alavi, Abass; Siegel, Barry A.; Johnson, Douglas W.; Bradley, Jeffrey D.; DeNittis, Albert; Werner-Wasik, Maria; El Naqa, Issam

    2016-01-01

    In a secondary analysis of American College of Radiology Imaging Network (ACRIN) 6668/RTOG 0235, high pretreatment metabolic tumor volume (MTV) on 18F-FDG PET was found to be a poor prognostic factor for patients treated with chemoradiotherapy for locally advanced non–small cell lung cancer (NSCLC). Here we utilize the same dataset to explore whether heterogeneity metrics based on PET textural features can provide additional prognostic information. Methods Patients with locally advanced NSCLC underwent 18F-FDG PET prior to treatment. A gradient-based segmentation tool was used to contour each patient’s primary tumor. MTV, maximum SUV, and 43 textural features were extracted for each tumor. To address over-fitting and high collinearity among PET features, the least absolute shrinkage and selection operator (LASSO) method was applied to identify features that were independent predictors of overall survival (OS) after adjusting for MTV. Recursive binary partitioning in a conditional inference framework was utilized to identify optimal thresholds. Kaplan–Meier curves and log-rank testing were used to compare outcomes among patient groups. Results Two hundred one patients met inclusion criteria. The LASSO procedure identified 1 textural feature (SumMean) as an independent predictor of OS. The optimal cutpoint for MTV was 93.3 cm3, and the optimal Sum-Mean cutpoint for tumors above 93.3 cm3 was 0.018. This grouped patients into three categories: low tumor MTV (n = 155; median OS, 22.6 mo), high tumor MTV and high SumMean (n = 23; median OS, 20.0 mo), and high tumor MTV and low SumMean (n = 23; median OS, 6.2 mo; log-rank P < 0.001). Conclusion We have described an appropriate methodology to evaluate the prognostic value of textural PET features in the context of established prognostic factors. We have also identified a promising feature that may have prognostic value in locally advanced NSCLC patients with large tumors who are treated with chemoradiotherapy. Validation studies are warranted. PMID:26912429

  5. Programming Pascal's Triangle

    ERIC Educational Resources Information Center

    Curley, Walter

    1974-01-01

    After a brief discussion of Pascal's triangle and description of four methods of hand construction, the author provides FORTRAN and BASIC programs for computer construction based on recursive definition. (SD)
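
    In the same spirit (a modern Python sketch rather than the article's FORTRAN/BASIC listings), the recursive definition C(n, k) = C(n-1, k-1) + C(n-1, k) translates directly into code:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def pascal(n, k):
    """Entry k of row n of Pascal's triangle, from the recursive definition."""
    if k == 0 or k == n:
        return 1                                  # edges of the triangle
    return pascal(n - 1, k - 1) + pascal(n - 1, k)

for n in range(6):                                # print the first six rows
    print([pascal(n, k) for k in range(n + 1)])
```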

  6. Evaluating uses of data mining techniques in propensity score estimation: a simulation study.

    PubMed

    Setoguchi, Soko; Schneeweiss, Sebastian; Brookhart, M Alan; Glynn, Robert J; Cook, E Francis

    2008-06-01

    In propensity score modeling, it is a standard practice to optimize the prediction of exposure status based on the covariate information. In a simulation study, we examined in what situations analyses based on various types of exposure propensity score (EPS) models using data mining techniques such as recursive partitioning (RP) and neural networks (NN) produce unbiased and/or efficient results. We simulated data for a hypothetical cohort study (n = 2000) with a binary exposure/outcome and 10 binary/continuous covariates with seven scenarios differing by non-linear and/or non-additive associations between exposure and covariates. EPS models used logistic regression (LR) (all possible main effects), RP1 (without pruning), RP2 (with pruning), and NN. We calculated c-statistics (C), standard errors (SE), and bias of exposure-effect estimates from outcome models for the PS-matched dataset. Data mining techniques yielded higher C than LR (mean: NN, 0.86; RP1, 0.79; RP2, 0.72; and LR, 0.76). SE tended to be greater in models with higher C. Overall bias was small for each strategy, although NN estimates tended to be the least biased. C was not correlated with the magnitude of bias (correlation coefficient [COR] = -0.3, p = 0.1) but increased SE (COR = 0.7, p < 0.001). Effect estimates from EPS models by simple LR were generally robust. NN models generally provided the least numerically biased estimates. C was not associated with the magnitude of bias but was with the increased SE.
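
    A compact sketch of the comparison being made, using simulated data and generic scikit-learn estimators as stand-ins for the paper's LR/RP/NN implementations: fit each model to predict exposure from covariates, then compare their c-statistics.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 10))                          # ten covariates
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2]       # non-additive true exposure model
exposure = rng.random(n) < 1 / (1 + np.exp(-logit))

models = {
    "LR": LogisticRegression(max_iter=1000),
    "RP": DecisionTreeClassifier(min_samples_leaf=50, random_state=0),  # recursive partitioning
    "NN": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    ps = model.fit(X, exposure).predict_proba(X)[:, 1]   # estimated propensity score
    print(name, "c-statistic:", round(roc_auc_score(exposure, ps), 3))
```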

  7. Development of a Novel, Objective Measure of Health Care–Related Financial Burden for U.S. Families with Children

    PubMed Central

    Wisk, Lauren E; Gangnon, Ronald; Vanness, David J; Galbraith, Alison A; Mullahy, John; Witt, Whitney P

    2014-01-01

    Objective To develop and validate a theoretically based and empirically driven objective measure of financial burden for U.S. families with children. Data Sources The measure was developed using 149,021 families with children from the National Health Interview Survey, and it was validated using 18,488 families with children from the Medical Expenditure Panel Survey. Study Design We estimated the marginal probability of unmet health care need due to cost using a bivariate tensor product spline for family income and out-of-pocket health care costs (OOPC; e.g., deductibles, copayments), while adjusting for confounders. Recursive partitioning was performed on these probabilities, as a function of income and OOPC, to establish thresholds demarcating levels of predicted risk. Principal Findings We successfully generated a novel measure of financial burden with four categories that were associated with unmet need (vs. low burden: midlow OR: 1.93, 95 percent CI: 1.78–2.09; midhigh OR: 2.78, 95 percent CI: 2.49–3.10; high OR: 4.38, 95 percent CI: 3.99–4.80). The novel burden measure demonstrated significantly better model fit and less underestimation of financial burden compared to an existing measure (OOPC/income ≥10 percent). Conclusion The newly developed measure of financial burden establishes thresholds based on different combinations of family income and OOPC that can be applied in future studies of health care utilization and expenditures and in policy development and evaluation. PMID:25328073

  8. Recursion, Language, and Starlings

    ERIC Educational Resources Information Center

    Corballis, Michael C.

    2007-01-01

    It has been claimed that recursion is one of the properties that distinguishes human language from any other form of animal communication. Contrary to this claim, a recent study purports to demonstrate center-embedded recursion in starlings. I show that the performance of the birds in this study can be explained by a counting strategy, without any…

  9. Parametric output-only identification of time-varying structures using a kernel recursive extended least squares TARMA approach

    NASA Astrophysics Data System (ADS)

    Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim

    2018-01-01

    The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.

  10. Inner and Outer Recursive Neural Networks for Chemoinformatics Applications.

    PubMed

    Urban, Gregor; Subrahmanya, Niranjan; Baldi, Pierre

    2018-02-26

    Deep learning methods applied to problems in chemoinformatics often require the use of recursive neural networks to handle data with graphical structure and variable size. We present a useful classification of recursive neural network approaches into two classes, the inner and outer approach. The inner approach uses recursion inside the underlying graph, to essentially "crawl" the edges of the graph, while the outer approach uses recursion outside the underlying graph, to aggregate information over progressively longer distances in an orthogonal direction. We illustrate the inner and outer approaches on several examples. More importantly, we provide open-source implementations [available at www.github.com/Chemoinformatics/InnerOuterRNN and cdb.ics.uci.edu ] for both approaches in Tensorflow which can be used in combination with training data to produce efficient models for predicting the physical, chemical, and biological properties of small molecules.

  11. Stereotactic radiosurgery boost to the resection cavity for cerebral metastases: Report of overall survival, complications, and corticosteroid protocol

    PubMed Central

    Kellogg, Robert G.; Straus, David C.; Choi, Mehee; Chaudhry, Thymur A.; Diaz, Aidnag Z.; Muñoz, Lorenzo F.

    2013-01-01

    Background: This report focuses on the overall survival and complications associated with treatment of cerebral metastases with surgical resection followed by stereotactic radiosurgery (SRS). Management and complications of corticosteroid therapy are underreported in the literature but represent an important source of morbidity for patients. Methods: Fifty-nine consecutive patients underwent surgical resection of a cerebral metastasis followed by SRS to the cavity. Patient charts were reviewed retrospectively to ascertain overall survival, local control, surgical complications, SRS complications, and corticosteroid complications. Results: Our mean follow-up was 14.4 months (median 12.0 months, range 0.9-62.9 months). Median overall survival in this series was 15.25 months and local control was 98.3%. There was a statistically significant survival benefit conferred by Radiation Therapy Oncology Group recursive partitioning analysis Classes 1 and 2. The surgical complication rate was 6.8% while the SRS complication rate was 2.4%. Corticosteroid complications are reported and dependence at 1 month was 20.3%, at 3 months 6.8%, at 6 months 1.7%, and at 12 months no patients remained on corticosteroid therapy. Conclusions: Overall survival and local control with this treatment paradigm compare well to the other published literature. Complications associated with this patient population are low. A corticosteroid tapering protocol is proposed and demonstrated lower rates of steroid-related complications and dependence than previously reported. PMID:24349867

  12. Prenatal hazardous substance use and adverse birth outcomes

    PubMed Central

    Quesada, Odayme; Gotman, Nathan; Howell, Heather B.; Funai, Edmund F.; Rounsaville, Bruce J.; Yonkers, Kimberly A.

    2012-01-01

    Objective Assess the relative effects of a variety of illicit and licit drugs on risk for adverse birth outcomes. Methods We used data from two large prospective investigations, and a novel analytic method, recursive partitioning class analysis, to identify risk factors associated with preterm birth and delivering a small for gestational age infant. Results Compared to cocaine and opiate non-users, cocaine users were 3.53 times as likely (95% CI: 1.65–7.56; p=0.001) and opiate users 2.86 times as likely (95% CI: 1.11–7.36; p=0.03) to deliver preterm. The odds of delivering a small for gestational age infant for women who smoked more than two cigarettes daily was 3.74 (95% CI: 2.47–5.65; p<0.0001) compared to women who smoked two or less cigarettes daily and had one previous child. Similarly, less educated, nulliparous women who smoked two or fewer cigarettes daily were 4.12 times as likely (95% CI: 2.04–8.34; p<0.0001) to have a small for gestational age infant. Conclusions Among our covariates, prenatal cocaine and opiate use are the predominant risk factors for preterm birth; while tobacco use was the primary risk factor predicting small for gestational age at delivery. Multi-substance use did not substantially increase risk of adverse birth outcomes over these risk factors. PMID:22489543

  13. DAFi: A directed recursive data filtering and clustering approach for improving and interpreting data clustering identification of cell populations from polychromatic flow cytometry data.

    PubMed

    Lee, Alexandra J; Chang, Ivan; Burel, Julie G; Lindestam Arlehamn, Cecilia S; Mandava, Aishwarya; Weiskopf, Daniela; Peters, Bjoern; Sette, Alessandro; Scheuermann, Richard H; Qian, Yu

    2018-04-17

    Computational methods for identification of cell populations from polychromatic flow cytometry data are changing the paradigm of cytometry bioinformatics. Data clustering is the most common computational approach to unsupervised identification of cell populations from multidimensional cytometry data. However, interpretation of the identified data clusters is labor-intensive. Certain types of user-defined cell populations are also difficult to identify by fully automated data clustering analysis. Both are roadblocks before a cytometry lab can adopt the data clustering approach for cell population identification in routine use. We found that combining recursive data filtering and clustering with constraints converted from the user manual gating strategy can effectively address these two issues. We named this new approach DAFi: Directed Automated Filtering and Identification of cell populations. Design of DAFi preserves the data-driven characteristics of unsupervised clustering for identifying novel cell subsets, but also makes the results interpretable to experimental scientists through mapping and merging the multidimensional data clusters into the user-defined two-dimensional gating hierarchy. The recursive data filtering process in DAFi helped identify small data clusters which are otherwise difficult to resolve by a single run of the data clustering method due to the statistical interference of the irrelevant major clusters. Our experiment results showed that the proportions of the cell populations identified by DAFi, while being consistent with those by expert centralized manual gating, have smaller technical variances across samples than those from individual manual gating analysis and the nonrecursive data clustering analysis. Compared with manual gating segregation, DAFi-identified cell populations avoided the abrupt cut-offs on the boundaries. DAFi has been implemented to be used with multiple data clustering methods including K-means, FLOCK, FlowSOM, and the ClusterR package. For cell population identification, DAFi supports multiple options including clustering, bisecting, slope-based gating, and reversed filtering to meet various autogating needs from different scientific use cases. © 2018 International Society for Advancement of Cytometry. © 2018 International Society for Advancement of Cytometry.

  14. Recursive Deadbeat Controller Design

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Phan, Minh Q.

    1997-01-01

    This paper presents a recursive algorithm for a deadbeat predictive controller design. The method combines together the concepts of system identification and deadbeat controller designs. It starts with the multi-step output prediction equation and derives the control force in terms of past input and output time histories. The formulation thus derived satisfies simultaneously system identification and deadbeat controller design requirements. As soon as the coefficient matrices are identified satisfying the output prediction equation, no further work is required to compute the deadbeat control gain matrices. The method can be implemented recursively just as any typical recursive system identification techniques.

  15. Real-time correction of tsunami site effect by frequency-dependent tsunami-amplification factor

    NASA Astrophysics Data System (ADS)

    Tsushima, H.

    2017-12-01

    For tsunami early warning, I developed a frequency-dependent tsunami-amplification factor and used it to design a recursive digital filter that can be applied for real-time correction of the tsunami site response. In this study, I assumed that a tsunami waveform at an observing point could be modeled by the convolution of source, path and site effects in the time domain. Under this assumption, the spectral ratio between offshore and the nearby coast can be regarded as the site response (i.e. the frequency-dependent amplification factor). If the amplification factor can be prepared before tsunamigenic earthquakes, its temporal convolution with an offshore tsunami waveform provides a tsunami prediction at the coast in real time. In this study, tsunami waveforms calculated by tsunami numerical simulations were used to develop the frequency-dependent tsunami-amplification factor. Firstly, I performed numerical tsunami simulations based on nonlinear shallow-water theory for many tsunamigenic earthquake scenarios by varying the seismic magnitudes and locations. The resultant tsunami waveforms at offshore and the nearby coastal observing points were then used in a spectral-ratio analysis. An average of the resulting spectral ratios from the tsunamigenic-earthquake scenarios is regarded as the frequency-dependent amplification factor. Finally, the estimated amplification factor is used in the design of a recursive digital filter that can be applied in the time domain. The above procedure is applied to Miyako Bay on the Pacific coast of northeastern Japan. The averaged tsunami-height spectral ratio (i.e. amplification factor) between the location at the center of the bay and the outside shows a peak at a wave period of 20 min. A recursive digital filter based on the estimated amplification factor shows good performance in real-time correction of tsunami-height amplification due to the site effect. This study is supported by Japan Society for the Promotion of Science (JSPS) KAKENHI grant 15K16309.

  16. [An object-oriented remote sensing image segmentation approach based on edge detection].

    PubMed

    Tan, Yu-Min; Huai, Jian-Zhu; Tang, Zhong-Shi

    2010-06-01

    Advances in satellite sensor technology enable better discrimination of various landscape objects. Image segmentation approaches to extracting conceptual objects and patterns have therefore been explored, and a wide variety of such algorithms abound. To effectively utilize edge and topological information in high-resolution remote sensing imagery, an object-oriented algorithm combining edge detection and region merging is proposed. A SUSAN edge filter is first applied to the panchromatic band of QuickBird imagery with a spatial resolution of 0.61 m to obtain the edge map. Guided by the resulting edge map, a two-phase region-based segmentation method operates on the fused panchromatic and multispectral QuickBird image to obtain the final partition. In the first phase, a quadtree grid consisting of squares with sides parallel to the image left and top borders agglomerates square subsets recursively wherever a uniformity measure is satisfied, deriving image object primitives. Before the merging of the second phase, the contextual and spatial information (e.g., neighbor relationships, boundary coding) of the resulting squares is retrieved efficiently by means of the quadtree structure. A region merging operation is then performed on those primitives, during which the merging criterion integrates edge-map and region-based features. This approach has been tested on QuickBird images of a site in the Sanxia area and the results are compared with those of ENVI Zoom and Definiens. In addition, a quantitative evaluation of segmentation quality is presented. Experimental results demonstrate stable convergence and efficiency.
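
    A bare-bones sketch of the first phase, recursive quadtree splitting driven by a uniformity measure (here simply the variance of pixel values; the threshold and the toy image are illustrative):

```python
import numpy as np

def quadtree_leaves(img, y0, x0, h, w, var_thresh=25.0, min_size=4):
    """Recursively split a block until it is uniform enough (or too small);
    return leaf blocks (row, col, height, width) as image-object primitives."""
    block = img[y0:y0 + h, x0:x0 + w]
    if h <= min_size or w <= min_size or block.var() <= var_thresh:
        return [(y0, x0, h, w)]
    h2, w2 = h // 2, w // 2
    leaves = []
    for dy, dh in ((0, h2), (h2, h - h2)):
        for dx, dw in ((0, w2), (w2, w - w2)):
            leaves += quadtree_leaves(img, y0 + dy, x0 + dx, dh, dw, var_thresh, min_size)
    return leaves

# toy image: a bright square on a dark background
img = np.zeros((64, 64))
img[16:48, 16:48] = 100.0
primitives = quadtree_leaves(img, 0, 0, *img.shape)
# these primitives would then feed the edge-constrained region-merging phase
```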

  17. [What is impaired consciousness? Revisiting impaired consciousness as psychiatric concept].

    PubMed

    Kanemoto, Kousuke

    2004-01-01

    For decades, psychiatrists have considered that concepts of impaired consciousness in the study of psychiatry were inconsistent with those applied in the field of neurology, in which the usefulness of the concept of consciousness has long been seriously doubted. Gloor concluded that the concept of consciousness does not further the understanding of seizure mechanisms or brain function, which is the current representative opinion of most epileptologists. Loss of consciousness tends to be reduced to aggregates of individual impairments of higher cognitive functions, and the concept of consciousness is preferably avoided by neurologists by assigning various behavioral disturbances during disturbed consciousness to particular neuropsychological centers. In contrast, psychiatrists, especially those in Europe, are more likely to include phenomena involving problems related to phenomenological intentionality in impaired consciousness. For the present study, we first divided consciousness into vigilance and recursive consciousness, and then attempted to determine what kind of impaired consciousness would be an ideal candidate to represent pure disturbance of recursive consciousness. Then, 4 patients, 1 each with pure amnestic states followed immediately by complex partial seizures, an akinetic mutistic state caused by absence status, and mental diplopia as a manifestation of postictal psychosis, as well as a patient with Alzheimer's disease who gracefully performed Japanese tea ceremony, were studied. Based on our findings, we concluded that impaired consciousness as a generic term in general medicine does not indicate any unitary entity corresponding to some well-demarcated physiological function or constitute a base from which recursive consciousness emerges as a superstructure. From that, we stressed that a pure form of impairment of recursive consciousness could occur without the impaired consciousness named generically in general medicine. Second, following observation of an additional 3 cases, descriptions of naissance of the first word (taken from the autobiography of Helen Keller), visual object agnosia, and chronic schizophrenia with schizophasia were discussed to examine the relationship between impairments of recursive consciousness and semantic generation dysfunction. Attempts to bridge semantic generation and recursive consciousness, performed by psychopathologists such as Bin Kimura and Hiroyuki Koide, were also briefly discussed. In light of these case presentations and related discussions, we re-examined traditional theories of impaired consciousness, including Mayer-Gross's Gestalt theory, later replaced by Conrad and Henri Ey's theory related to intentionality. Furthermore, we attempted to link Denett's theory of consciousness to those traditional theories as well as to our own postulations, and neuropsychological data such as those of implicit memory and blindsight. Finally, the significance of Freud's unconsciousness in the framework of neuroscience was discussed.

  18. Hierarchical Image Segmentation of Remotely Sensed Data using Massively Parallel GNU-LINUX Software

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2003-01-01

    A hierarchical set of image segmentations is a set of several image segmentations of the same image at different levels of detail in which the segmentations at coarser levels of detail can be produced from simple merges of regions at finer levels of detail. In [1], Tilton et al. describe an approach for producing hierarchical segmentations (called HSEG) and give a progress report on exploiting these hierarchical segmentations for image information mining. The HSEG algorithm is a hybrid of region growing and constrained spectral clustering that produces a hierarchical set of image segmentations based on detected convergence points. In the main, HSEG employs the hierarchical stepwise optimization (HSWO) approach to region growing, which was described as early as 1989 by Beaulieu and Goldberg. The HSWO approach seeks to produce segmentations that are more optimized than those produced by more classic approaches to region growing (e.g., Horowitz and Pavlidis [3]). In addition, HSEG optionally interjects between HSWO region growing iterations, merges between spatially non-adjacent regions (i.e., spectrally based merging or clustering) constrained by a threshold derived from the previous HSWO region growing iteration. While the addition of constrained spectral clustering improves the utility of the segmentation results, especially for larger images, it also significantly increases HSEG's computational requirements. To counteract this, a computationally efficient recursive, divide-and-conquer, implementation of HSEG (RHSEG) was devised, which includes special code to avoid processing artifacts caused by RHSEG's recursive subdivision of the image data. The recursive nature of RHSEG makes for a straightforward parallel implementation. This paper describes the HSEG algorithm, its recursive formulation (referred to as RHSEG), and the implementation of RHSEG using massively parallel GNU-LINUX software. Results with Landsat TM data are included comparing RHSEG with classic region growing.

  19. The Paradigm Recursion: Is It More Accessible When Introduced in Middle School?

    ERIC Educational Resources Information Center

    Gunion, Katherine; Milford, Todd; Stege, Ulrike

    2009-01-01

    Recursion is a programming paradigm as well as a problem solving strategy thought to be very challenging to grasp for university students. This article outlines a pilot study, which expands the age range of students exposed to the concept of recursion in computer science through instruction in a series of interesting and engaging activities. In…

  20. A Preliminary Instrument for Measuring Students' Subjective Perceptions of Difficulties in Learning Recursion

    ERIC Educational Resources Information Center

    Lacave, Carmen; Molina, Ana I.; Redondo, Miguel A.

    2018-01-01

    Contribution: Findings are provided from an initial survey to evaluate the magnitude of the recursion problem from the student point of view. Background: A major difficulty that programming students must overcome--the learning of recursion--has been addressed by many authors, using various approaches, but none have considered how students perceive…

  1. Using Spreadsheets to Help Students Think Recursively

    ERIC Educational Resources Information Center

    Webber, Robert P.

    2012-01-01

    Spreadsheets lend themselves naturally to recursive computations, since a formula can be defined as a function of one or more preceding cells. A hypothesized closed form for the "n"th term of a recursive sequence can be tested easily by using a spreadsheet to compute a large number of the terms. Similarly, a conjecture about the limit of a series…
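
    The same check is easy to mimic outside a spreadsheet; for example (a hypothetical recurrence, not one from the article), testing the conjectured closed form a_n = 2^n - 1 for the recursion a_n = 2*a_(n-1) + 1 with a_0 = 0:

```python
def a_recursive(n):
    """a_0 = 0, a_n = 2*a_(n-1) + 1  -- the 'preceding cell' style of definition."""
    return 0 if n == 0 else 2 * a_recursive(n - 1) + 1

def a_closed(n):
    return 2 ** n - 1          # conjectured closed form

# compare many terms, as one would by filling two spreadsheet columns side by side
assert all(a_recursive(n) == a_closed(n) for n in range(50))
```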

  2. A numerical identifiability test for state-space models--application to optimal experimental design.

    PubMed

    Hidalgo, M E; Ayesa, E

    2001-01-01

    This paper describes a mathematical tool for identifiability analysis, easily applicable to high order non-linear systems modelled in state-space and implementable in simulators with a time-discrete approach. This procedure also permits a rigorous analysis of the expected estimation errors (average and maximum) in calibration experiments. The methodology is based on the recursive numerical evaluation of the information matrix during the simulation of a calibration experiment and in the setting-up of a group of information parameters based on geometric interpretations of this matrix. As an example of the utility of the proposed test, the paper presents its application to an optimal experimental design of ASM Model No. 1 calibration, in order to estimate the maximum specific growth rate μH and the concentration of heterotrophic biomass XBH.
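
    The core of such a test is the running accumulation of the Fisher information matrix from output sensitivities over the simulated experiment; a generic sketch with illustrative sensitivities and noise covariance (not the ASM1 model itself):

```python
import numpy as np

def accumulate_information(sensitivities, meas_cov):
    """Recursively accumulate the information matrix M <- M + S_k^T R^-1 S_k,
    where S_k = d(outputs)/d(parameters) at sampling instant k."""
    R_inv = np.linalg.inv(meas_cov)
    M = np.zeros((sensitivities[0].shape[1],) * 2)
    for S_k in sensitivities:
        M += S_k.T @ R_inv @ S_k
    return M

# toy usage: 2 measured outputs, 2 parameters, 100 sampling instants (hypothetical values)
rng = np.random.default_rng(0)
sens = [rng.normal(size=(2, 2)) for _ in range(100)]
M = accumulate_information(sens, meas_cov=0.1 * np.eye(2))

# identifiability indicators derived from M
eigvals = np.linalg.eigvalsh(M)
condition_number = eigvals.max() / eigvals.min()   # large => practically unidentifiable
expected_cov = np.linalg.inv(M)                    # Cramer-Rao bound on estimation errors
```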

  3. Serial turbo trellis coded modulation using a serially concatenated coder

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Dolinar, Samuel J. (Inventor); Pollara, Fabrizio (Inventor)

    2010-01-01

    Serial concatenated trellis coded modulation (SCTCM) includes an outer coder, an interleaver, a recursive inner coder and a mapping element. The outer coder receives data to be coded and produces outer coded data. The interleaver permutes the outer coded data to produce interleaved data. The recursive inner coder codes the interleaved data to produce inner coded data. The mapping element maps the inner coded data to a symbol. The recursive inner coder has a structure which facilitates iterative decoding of the symbols at a decoder system. The recursive inner coder and the mapping element are selected to maximize the effective free Euclidean distance of a trellis coded modulator formed from the recursive inner coder and the mapping element. The decoder system includes a demodulation unit, an inner SISO (soft-input soft-output) decoder, a deinterleaver, an outer SISO decoder, and an interleaver.

  4. Comparison of Source Partitioning Methods for CO2 and H2O Fluxes Based on High Frequency Eddy Covariance Data

    NASA Astrophysics Data System (ADS)

    Klosterhalfen, Anne; Moene, Arnold; Schmidt, Marius; Ney, Patrizia; Graf, Alexander

    2017-04-01

    Source partitioning of eddy covariance (EC) measurements of CO2 into respiration and photosynthesis is routinely used for a better understanding of the exchange of greenhouse gases, especially between terrestrial ecosystems and the atmosphere. The most frequently used methods are usually based either on relations of fluxes to environmental drivers or on chamber measurements. However, they often depend strongly on assumptions or invasive measurements and do not usually offer partitioning estimates for latent heat fluxes into evaporation and transpiration. SCANLON and SAHU (2008) and SCANLON and KUSTAS (2010) proposed a promising method to estimate the contributions of transpiration and evaporation using measured high frequency time series of CO2 and H2O fluxes - no extra instrumentation necessary. This method (SK10 in the following) is based on the spatial separation and relative strength of sources and sinks of CO2 and water vapor between the sub-canopy and canopy. Assuming that air from those sources and sinks is not yet perfectly mixed before reaching the EC sensors, partitioning is estimated based on the separate application of flux-variance similarity theory to the stomatal and non-stomatal components of the regarded fluxes, as well as on additional assumptions on stomatal water use efficiency (WUE). The CO2 partitioning method after THOMAS et al. (2008) (TH08 in the following) also follows the argument that the dissimilarities of sources and sinks in and below a canopy affect the relation between H2O and CO2 fluctuations. Instead of involving assumptions on WUE, TH08 directly screens the scattergram for signals of joint respiration and evaporation events and applies a conditional sampling methodology. In spite of their different main targets (H2O vs. CO2), both methods can yield partitioning estimates for both fluxes. We therefore compare various sub-methods of SK10 and TH08, including our own modifications (e.g., cluster analysis), to each other, to established source partitioning methods, and to chamber measurements at various agroecosystems. Further, profile measurements and a canopy-resolving Large Eddy Simulation model are used to test the assumptions involved in SK10. Scanlon, T.M., Kustas, W.P., 2010. Partitioning carbon dioxide and water vapor fluxes using correlation analysis. Agricultural and Forest Meteorology 150 (1), 89-99. Scanlon, T.M., Sahu, P., 2008. On the correlation structure of water vapor and carbon dioxide in the atmospheric surface layer: A basis for flux partitioning. Water Resources Research 44 (10), W10418, 15 pp. Thomas, C., Martin, J.G., Goeckede, M., Siqueira, M.B., Foken, T., Law, B.E., Loescher H.W., Katul, G., 2008. Estimating daytime subcanopy respiration from conditional sampling methods applied to multi-scalar high frequency turbulence time series. Agricultural and Forest Meteorology 148 (8-9), 1210-1229.

  5. Copula-based analysis of rhythm

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.; Viola, M. L. Lanfredi

    2016-06-01

    In this paper we establish stochastic profiles of rhythm for three languages: English, Japanese and Spanish. We model the increase or decrease of the acoustical energy, collected into three bands coming from the acoustic signal. The number of parameters needed to specify a discrete multivariate Markov chain grows exponentially with the order and dimension of the chain. In this case the size of the database is not large enough for a consistent estimation of the model. We apply a strategy to estimate a multivariate process with an order greater than the order achievable using standard procedures. The new strategy consists of obtaining a partition of the state space, constructed from a combination of the partitions corresponding to the three marginal processes, one for each energy band, and the partition coming from the multivariate Markov chain. Then, all the partitions are linked using a copula in order to estimate the transition probabilities.

  6. Significant Scales in Community Structure

    NASA Astrophysics Data System (ADS)

    Traag, V. A.; Krings, G.; van Dooren, P.

    2013-10-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect at what scale some partition is significant. This problem shows foremost in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of "significance" of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so could also be applied in other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine "good" resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role.

  7. Implementation of hybrid clustering based on partitioning around medoids algorithm and divisive analysis on human Papillomavirus DNA

    NASA Astrophysics Data System (ADS)

    Arimbi, Mentari Dian; Bustamam, Alhadi; Lestari, Dian

    2017-03-01

    Data clustering can be executed through partition or hierarchical methods for many types of data, including DNA sequences. Both clustering methods can be combined by applying a partition algorithm in the first level and a hierarchical algorithm in the second level, called hybrid clustering. In the partition phase some popular methods such as PAM, K-means, or Fuzzy c-means could be applied. In this study we selected partitioning around medoids (PAM) for our partition stage. Following the partition algorithm, in the hierarchical stage we applied the divisive analysis algorithm (DIANA) in order to obtain more specific clusters and sub-cluster structures. The number of main clusters is determined using the Davies Bouldin Index (DBI); we choose the optimal number of clusters as the one that minimizes the DBI value. In this work, we conduct the clustering on 1252 HPV DNA sequences from GenBank. Characteristic extraction is performed first, followed by normalization and genetic distance calculation using Euclidean distance. In our implementation, we used the hybrid PAM and DIANA approach in the R open source programming tool. We obtained 3 main clusters with an average DBI value of 0.979 using PAM in the first stage. After executing DIANA in the second stage, we obtained 4 sub-clusters for Cluster-1, 9 sub-clusters for Cluster-2 and 2 sub-clusters for Cluster-3, with DBI values of 0.972, 0.771, and 0.768 for each main cluster, respectively. Since the second stage produces lower DBI values compared to the first stage, we conclude that this hybrid approach can improve the accuracy of our clustering results.
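
    The two-stage scheme described above can be sketched in Python as follows. PAM and DIANA are not part of scikit-learn, so this illustration substitutes KMeans for the partition stage and agglomerative clustering for the divisive stage, and scores candidate partitions with sklearn's Davies-Bouldin index; the data are random stand-ins for the extracted sequence features, and the block is an interpretation of the workflow rather than the authors' R implementation.

      import numpy as np
      from sklearn.cluster import KMeans, AgglomerativeClustering
      from sklearn.metrics import davies_bouldin_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 8))          # stand-in for extracted sequence features

      # Stage 1: partition clustering (KMeans as a stand-in for PAM); pick k by DBI.
      best_k, best_dbi = None, np.inf
      for k in range(2, 7):
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
          dbi = davies_bouldin_score(X, labels)
          if dbi < best_dbi:
              best_k, best_dbi, main_labels = k, dbi, labels
      print(f"stage 1: k={best_k}, DBI={best_dbi:.3f}")

      # Stage 2: hierarchical sub-clustering within each main cluster
      # (agglomerative clustering as a stand-in for the divisive DIANA step).
      for c in range(best_k):
          Xc = X[main_labels == c]
          if len(Xc) < 4:
              continue
          sub = AgglomerativeClustering(n_clusters=2).fit_predict(Xc)
          print(f"cluster {c}: sub-cluster DBI = {davies_bouldin_score(Xc, sub):.3f}")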

  8. A physically based catchment partitioning method for hydrological analysis

    NASA Astrophysics Data System (ADS)

    Menduni, Giovanni; Riboni, Vittoria

    2000-07-01

    We propose a partitioning method for the topographic surface, which is particularly suitable for hydrological distributed modelling and shallow-landslide distributed modelling. The model provides variable mesh size and appears to be a natural evolution of contour-based digital terrain models. The proposed method allows the drainage network to be derived from the contour lines. The single channels are calculated via a search for the steepest downslope lines. Then, for each network node, the contributing area is determined by means of a search for both steepest upslope and downslope lines. This leads to the basin being partitioned into physically based finite elements delimited by irregular polygons. In particular, the distributed computation of local geomorphological parameters (i.e. aspect, average slope and elevation, main stream length, concentration time, etc.) can be performed easily for each single element. The contributing area system, together with the information on the distribution of geomorphological parameters provide a useful tool for distributed hydrological modelling and simulation of environmental processes such as erosion, sediment transport and shallow landslides.

  9. Chemical amplification based on fluid partitioning

    DOEpatents

    Anderson, Brian L [Lodi, CA; Colston, Jr., Billy W.; Elkin, Chris [San Ramon, CA

    2006-05-09

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  10. Research on the Automatic Fusion Strategy of Fixed Value Boundary Based on the Weak Coupling Condition of Grid Partition

    NASA Astrophysics Data System (ADS)

    Wang, X. Y.; Dou, J. M.; Shen, H.; Li, J.; Yang, G. S.; Fan, R. Q.; Shen, Q.

    2018-03-01

    With the continuous strengthening of power grids, the network structure is becoming more and more complicated. Open, regional data modeling is used to complete the calculation of protection fixed values based on the local region. At the same time, a high-precision, quasi-real-time boundary fusion technique is needed to seamlessly integrate the various regions into an integrated fault computing platform that can conduct transient stability analysis covering the whole network with high accuracy and in multiple modes, handle the impact of non-single and interlocking faults, and build “the first line of defense” of the power grid. The boundary fusion algorithm in this paper is an automatic fusion algorithm based on accurate boundary coupling of the partitioned interconnected grid. It takes the actual operation mode as its qualification and completes the boundary coupling of the various weakly coupled partitions in open-loop mode, improving fusion efficiency, truly reflecting the transient stability level, and effectively solving the problems of excessive data volume, difficult partition fusion, and failed fusion due to mutually exclusive conditions. The paper first introduces the basic principle of the fusion process, then describes the customization of boundary fusion through a scene description. Finally, an example illustrates how the algorithm effectively implements boundary fusion after grid partitioning and verifies its accuracy and efficiency.

  11. Temperature dependence of the kinetic energy in the Zr40Be60 amorphous alloy

    NASA Astrophysics Data System (ADS)

    Syrykh, G. F.; Stolyarov, A. A.; Krzystyniak, M.; Romanelli, G.; Sadykov, R. A.

    2017-05-01

    The average kinetic energy ⟨E(T)⟩ of the atomic nucleus for each element of the amorphous alloy Zr40Be60 in the temperature range 10-300 K has been measured for the first time using the VESUVIO spectrometer (ISIS). The experimental values of ⟨E(T)⟩ have been compared to the partial ZrBe spectra refined by a recursion method based on data obtained with thermal neutron scattering, and satisfactory agreement has been reached with the calculations using these partial spectra. In addition, the experimental data have been compared to the Debye model. The measurements at different temperatures (10, 200, and 300 K) will provide an opportunity to evaluate the significance of anharmonicity in the dynamics of metallic glasses.

  12. Dose Escalation of Whole-Brain Radiotherapy for Brain Metastases From Melanoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rades, Dirk, E-mail: Rades.Dirk@gmx.ne; Heisterkamp, Christine; Huttenlocher, Stefan

    2010-06-01

    Purpose: The majority of patients with brain metastases from melanoma receive whole-brain radiotherapy (WBRT). However, the results are poor. Hypofractionation regimens failed to improve the outcome of these patients. This study investigates a potential benefit from escalation of the WBRT dose beyond the 'standard' regimen 30 Gy in 10 fractions (10x3 Gy). Methods and Materials: Data from 51 melanoma patients receiving WBRT alone were retrospectively analyzed. A dosage of 10x3 Gy (n = 33) was compared with higher doses including 40 Gy/20 fractions (n = 11) and 45 Gy/15 fractions (n = 7) for survival (OS) and local (intracerebral) control (LC). Additional potential prognostic factors were evaluated: age, gender, performance status, number of metastases, extracerebral metastases, and recursive partitioning analysis (RPA) class. Results: At 6 months, OS rates were 27% after 10x3 Gy and 50% after higher doses (p = 0.009). The OS rates at 12 months were 4% and 20%. On multivariate analysis, higher WBRT doses (p = 0.010), fewer than four brain metastases (p = 0.012), no extracerebral metastases (p = 0.006), and RPA class 1 (p = 0.005) were associated with improved OS. The LC rates at 6 months were 23% after 10x3 Gy and 50% after higher doses (p = 0.021). The LC rates at 12 months were 0% and 13%. On multivariate analysis, higher WBRT doses (p = 0.020) and fewer than four brain metastases (p = 0.002) were associated with better LC. Conclusions: Given the limitations of a retrospective study, the findings suggest that patients with brain metastases from melanoma receiving WBRT alone may benefit from dose escalation beyond 10x3 Gy. The hypothesis generated by this study must be confirmed in a randomized trial stratifying for significant prognostic factors.
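
    The survival comparisons reported above (Kaplan-Meier estimates plus a multivariate Cox model including dose group, number of metastases and RPA class) follow a standard pattern that can be sketched with the lifelines library in Python. The data frame below is synthetic and the column names are invented for illustration; it only shows the shape of such an analysis, not the study's data or code.

      import numpy as np
      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter

      # Synthetic toy cohort (names and effect sizes are illustrative only).
      rng = np.random.default_rng(1)
      n = 200
      high_dose = rng.integers(0, 2, n)          # WBRT beyond 10x3 Gy
      rpa1 = rng.integers(0, 2, n)               # RPA class 1
      few_mets = rng.integers(0, 2, n)           # fewer than four brain metastases
      hazard = np.exp(-0.5 * high_dose - 0.6 * rpa1 - 0.4 * few_mets)
      time = rng.exponential(180.0 / hazard)     # survival time in days
      event = rng.random(n) < 0.8                # roughly 20% censored
      df = pd.DataFrame({"os_days": time, "death": event.astype(int),
                         "high_dose": high_dose, "rpa1": rpa1, "few_mets": few_mets})

      # Kaplan-Meier estimate of OS by dose group
      km = KaplanMeierFitter()
      for dose, grp in df.groupby("high_dose"):
          km.fit(grp["os_days"], event_observed=grp["death"], label=f"high_dose={dose}")
          print(f"dose group {dose}: median OS = {km.median_survival_time_:.0f} days")

      # Multivariate Cox proportional hazards model, as in the study's analysis
      cph = CoxPHFitter().fit(df, duration_col="os_days", event_col="death")
      cph.print_summary()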

  13. A Phase III Study of Conventional Radiation Therapy Plus Thalidomide Versus Conventional Radiation Therapy for Multiple Brain Metastases (RTOG 0118)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knisely, Jonathan P.S.; Berkey, Brian; Chakravarti, Arnab

    2008-05-01

    Purpose: To compare whole-brain radiation therapy (WBRT) with WBRT combined with thalidomide for patients with brain metastases not amenable to resection or radiosurgery. Patients and Methods: Patients with Zubrod performance status 0-1, MRI-documented multiple (>3), large (>4 cm), or midbrain brain metastases arising from a histopathologically confirmed extracranial primary tumor, and an anticipated survival of >8 weeks were randomized to receive WBRT to a dose of 37.5 Gy in 15 fractions with or without thalidomide during and after WBRT. Prerandomization stratification used Radiation Therapy Oncology Group (RTOG) Recursive Partitioning Analysis (RPA) Class and whether post-WBRT chemotherapy was planned. Endpoints included overall survival, progression-free survival, time to neurocognitive progression, the cause of death, toxicities, and quality of life. A protocol-planned interim analysis documented that the trial had an extremely low probability of ever showing a significant difference favoring the thalidomide arm given the results at the time of the analysis, and it was therefore closed on the basis of predefined statistical guidelines. Results: Enrolled in the study were 332 patients. Of 183 accrued patients, 93 were randomized to receive WBRT alone and 90 to WBRT and thalidomide. Median survival was 3.9 months for both arms. No novel toxicities were seen, but thalidomide was not well tolerated in this population. Forty-eight percent of patients discontinued thalidomide because of side effects. Conclusion: Thalidomide provided no survival benefit for patients with multiple, large, or midbrain metastases when combined with WBRT; nearly half the patients discontinued thalidomide due to side effects.

  14. KPS/LDH index: a simple tool for identifying patients with metastatic melanoma who are unlikely to benefit from palliative whole brain radiotherapy.

    PubMed

    Partl, Richard; Fastner, Gerd; Kaiser, Julia; Kronhuber, Elisabeth; Cetin-Strohmer, Klaudia; Steffal, Claudia; Böhmer-Breitfelder, Barbara; Mayer, Johannes; Avian, Alexander; Berghold, Andrea

    2016-02-01

    Low Karnofsky performance status (KPS) and elevated lactate dehydrogenase (LDH), a surrogate marker for tumor load and cell turnover, may identify patients with a very short life expectancy. To validate this finding and compare it to other indices, namely the recursive partitioning analysis (RPA) and diagnosis-specific graded prognostic assessment (DS-GPA), a multicenter analysis was undertaken. A retrospective analysis of 234 metastatic melanoma patients uniformly treated with palliative whole brain radiotherapy (WBRT) was done. Univariate and multivariate analyses were used to determine the impact of patient-, tumor-, and treatment-related parameters on overall survival (OS). KPS and LDH emerged as independent factors predicting OS. By combining KPS and LDH values (KPS/LDH index), groups of patients with statistically significant differences in median OS (days; 95 % CI) after onset of WBRT were identified: group 1 (KPS ≥ 70/normal LDH) 234 (96-372), group 2 (KPS ≥ 70/elevated LDH) 112 (69-155), group 3 (KPS <70/normal LDH) 43 (12-74), and group 4 (KPS <70/elevated LDH) 29 (17-41). Between all four groups, statistically significant differences were observed. The RPA and DS-GPA indices failed to distinguish significantly between good and moderate prognosis and were inferior in predicting a very unfavorable prognosis. The parameters KPS and LDH independently impacted on OS. The combination of both (KPS/LDH index) identified patients with a very short life expectancy, who might be better served by recommending best supportive care instead of WBRT. The KPS/LDH index is simple and effective in terms of time and cost as compared to other prognostic indices.

  15. Identification of extremely premature infants at high risk of rehospitalization.

    PubMed

    Ambalavanan, Namasivayam; Carlo, Waldemar A; McDonald, Scott A; Yao, Qing; Das, Abhik; Higgins, Rosemary D

    2011-11-01

    Extremely low birth weight infants often require rehospitalization during infancy. Our objective was to identify at the time of discharge which extremely low birth weight infants are at higher risk for rehospitalization. Data from extremely low birth weight infants in Eunice Kennedy Shriver National Institute of Child Health and Human Development Neonatal Research Network centers from 2002-2005 were analyzed. The primary outcome was rehospitalization by the 18- to 22-month follow-up, and secondary outcome was rehospitalization for respiratory causes in the first year. Using variables and odds ratios identified by stepwise logistic regression, scoring systems were developed with scores proportional to odds ratios. Classification and regression-tree analysis was performed by recursive partitioning and automatic selection of optimal cutoff points of variables. A total of 3787 infants were evaluated (mean ± SD birth weight: 787 ± 136 g; gestational age: 26 ± 2 weeks; 48% male, 42% black). Forty-five percent of the infants were rehospitalized by 18 to 22 months; 14.7% were rehospitalized for respiratory causes in the first year. Both regression models (area under the curve: 0.63) and classification and regression-tree models (mean misclassification rate: 40%-42%) were moderately accurate. Predictors for the primary outcome by regression were shunt surgery for hydrocephalus, hospital stay of >120 days for pulmonary reasons, necrotizing enterocolitis stage II or higher or spontaneous gastrointestinal perforation, higher fraction of inspired oxygen at 36 weeks, and male gender. By classification and regression-tree analysis, infants with hospital stays of >120 days for pulmonary reasons had a 66% rehospitalization rate compared with 42% without such a stay. The scoring systems and classification and regression-tree analysis models identified infants at higher risk of rehospitalization and might assist planning for care after discharge.
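
    The abstract combines two modelling strategies: a points-based score built from logistic-regression odds ratios, and classification-and-regression-tree analysis, i.e. recursive partitioning with automatic cutoff selection. A minimal Python sketch of both, on synthetic data with invented predictor names loosely echoing those in the abstract, might look like this (scikit-learn; not the authors' code):

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Synthetic binary predictors; names are illustrative, not study variables.
      rng = np.random.default_rng(2)
      n = 1000
      X = pd.DataFrame({
          "shunt_surgery": rng.integers(0, 2, n),
          "stay_gt_120d":  rng.integers(0, 2, n),
          "nec_or_perf":   rng.integers(0, 2, n),
          "male":          rng.integers(0, 2, n),
      })
      logit = (-1.0 + 0.9 * X["shunt_surgery"] + 1.0 * X["stay_gt_120d"]
               + 0.7 * X["nec_or_perf"] + 0.3 * X["male"])
      y = rng.random(n) < 1 / (1 + np.exp(-logit))

      # Logistic regression -> odds ratios -> integer points proportional to the ORs
      lr = LogisticRegression().fit(X, y)
      odds_ratios = np.exp(lr.coef_[0])
      points = np.round(odds_ratios / odds_ratios.min()).astype(int)
      print(dict(zip(X.columns, points)))
      risk_score = X.values @ points              # per-infant score
      print(risk_score[:10])

      # Classification tree: recursive partitioning with automatic cutoff selection
      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
      print(export_text(tree, feature_names=list(X.columns)))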

  16. Identification of Extremely Premature Infants at High Risk of Rehospitalization

    PubMed Central

    Carlo, Waldemar A.; McDonald, Scott A.; Yao, Qing; Das, Abhik; Higgins, Rosemary D.

    2011-01-01

    OBJECTIVE: Extremely low birth weight infants often require rehospitalization during infancy. Our objective was to identify at the time of discharge which extremely low birth weight infants are at higher risk for rehospitalization. METHODS: Data from extremely low birth weight infants in Eunice Kennedy Shriver National Institute of Child Health and Human Development Neonatal Research Network centers from 2002–2005 were analyzed. The primary outcome was rehospitalization by the 18- to 22-month follow-up, and secondary outcome was rehospitalization for respiratory causes in the first year. Using variables and odds ratios identified by stepwise logistic regression, scoring systems were developed with scores proportional to odds ratios. Classification and regression-tree analysis was performed by recursive partitioning and automatic selection of optimal cutoff points of variables. RESULTS: A total of 3787 infants were evaluated (mean ± SD birth weight: 787 ± 136 g; gestational age: 26 ± 2 weeks; 48% male, 42% black). Forty-five percent of the infants were rehospitalized by 18 to 22 months; 14.7% were rehospitalized for respiratory causes in the first year. Both regression models (area under the curve: 0.63) and classification and regression-tree models (mean misclassification rate: 40%–42%) were moderately accurate. Predictors for the primary outcome by regression were shunt surgery for hydrocephalus, hospital stay of >120 days for pulmonary reasons, necrotizing enterocolitis stage II or higher or spontaneous gastrointestinal perforation, higher fraction of inspired oxygen at 36 weeks, and male gender. By classification and regression-tree analysis, infants with hospital stays of >120 days for pulmonary reasons had a 66% rehospitalization rate compared with 42% without such a stay. CONCLUSIONS: The scoring systems and classification and regression-tree analysis models identified infants at higher risk of rehospitalization and might assist planning for care after discharge. PMID:22007016

  17. Brain volume reduction after whole-brain radiotherapy: quantification and prognostic relevance.

    PubMed

    Hoffmann, Christian; Distel, Luitpold; Knippen, Stefan; Gryc, Thomas; Schmidt, Manuel Alexander; Fietkau, Rainer; Putz, Florian

    2018-01-22

    Recent studies have questioned the value of adding whole-brain radiotherapy (WBRT) to stereotactic radiosurgery (SRS) for brain metastasis treatment. Neurotoxicity, including radiation-induced brain volume reduction, could be one reason why not all patients benefit from the addition of WBRT. In this study, we quantified brain volume reduction after WBRT and assessed its prognostic significance. Brain volumes of 91 patients with cerebral metastases were measured during a 150-day period after commencing WBRT and were compared with their pretreatment volumes. The average daily relative change in brain volume of each patient, referred to as the "brain volume reduction rate," was calculated. Univariate and multivariate Cox regression analyses were performed to assess the prognostic significance of the brain volume reduction rate, as well as of 3 treatment-related and 9 pretreatment factors. A one-way analysis of variance was used to compare the brain volume reduction rate across recursive partitioning analysis (RPA) classes. On multivariate Cox regression analysis, the brain volume reduction rate was a significant predictor of overall survival after WBRT (P < 0.001), as well as the number of brain metastases (P = 0.002) and age (P = 0.008). Patients with a relatively favorable prognosis (RPA classes 1 and 2) experienced significantly less brain volume decrease after WBRT than patients with a poor prognosis (RPA class 3) (P = 0.001). There was no significant correlation between delivered radiation dose and brain volume reduction rate (P = 0.147). In this retrospective study, a smaller decrease in brain volume after WBRT was an independent predictor of longer overall survival. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Neuro-Oncology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
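
    The "brain volume reduction rate" is described as the average daily relative change in brain volume over the 150-day follow-up. One plausible way to compute such a rate from a handful of follow-up scans is a least-squares slope of the relative volume change versus time; the sketch below (Python, hypothetical numbers) is an illustrative reading of that definition, not the authors' exact formula.

      import numpy as np

      def brain_volume_reduction_rate(days_since_wbrt, volumes_ml, baseline_ml):
          """Average daily relative change in brain volume versus the pretreatment
          baseline, estimated as a least-squares slope over the follow-up period.
          (Illustrative reading of the abstract, not the authors' exact definition.)"""
          rel_change = (np.asarray(volumes_ml) - baseline_ml) / baseline_ml
          days = np.asarray(days_since_wbrt, dtype=float)
          slope, _ = np.polyfit(days, rel_change, 1)   # relative change per day
          return slope

      # Hypothetical follow-up scans within 150 days of starting WBRT
      rate = brain_volume_reduction_rate([30, 75, 120, 150],
                                         [1180.0, 1165.0, 1150.0, 1142.0],
                                         baseline_ml=1200.0)
      print(f"brain volume reduction rate: {rate:.5f} per day")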

  18. Predictive Factor Analysis of Response-Adapted Radiation Therapy for Chemotherapy-Sensitive Pediatric Hodgkin Lymphoma: Analysis of the Children's Oncology Group AHOD 0031 Trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charpentier, Anne-Marie; Friedman, Debra L.; Wolden, Suzanne

    Purpose: To evaluate whether clinical risk factors could further distinguish children with intermediate-risk Hodgkin lymphoma (HL) with rapid early and complete anatomic response (RER/CR) who benefit significantly from involved-field RT (IFRT) from those who do not, and thereby aid refinement of treatment selection. Methods and Materials: Children with intermediate-risk HL treated on the Children's Oncology Group AHOD 0031 trial who achieved RER/CR with 4 cycles of chemotherapy, and who were randomized to 21-Gy IFRT or no additional therapy (n=716) were the subject of this study. Recursive partitioning analysis was used to identify factors associated with clinically and statistically significant improvement inmore » event-free survival (EFS) after randomization to IFRT. Bootstrap sampling was used to evaluate the robustness of the findings. Result: Although most RER/CR patients did not benefit significantly from IFRT, those with a combination of anemia and bulky limited-stage disease (n=190) had significantly better 4-year EFS with the addition of IFRT (89.3% vs 77.9% without IFRT; P=.019); this benefit was consistently reproduced in bootstrap analyses and after adjusting for other prognostic factors. Conclusion: Although most patients achieving RER/CR had favorable outcomes with 4 cycles of chemotherapy alone, those children with initial bulky stage I/II disease and anemia had significantly better EFS with the addition of IFRT as part of combined-modality therapy. Further work evaluating the interaction of clinical and biologic factors and imaging response is needed to further optimize and refine treatment selection.« less

  19. Quantified degree of eccentricity of aortic valve calcification predicts risk of paravalvular regurgitation and response to balloon post-dilation after self-expandable transcatheter aortic valve replacement.

    PubMed

    Park, Jun-Bean; Hwang, In-Chang; Lee, Whal; Han, Jung-Kyu; Kim, Chi-Hoon; Lee, Seung-Pyo; Yang, Han-Mo; Park, Eun-Ah; Kim, Hyung-Kwan; Chiam, Paul T L; Kim, Yong-Jin; Koo, Bon-Kwon; Sohn, Dae-Won; Ahn, Hyuk; Kang, Joon-Won; Park, Seung-Jung; Kim, Hyo-Soo

    2018-05-15

    Limited data exist regarding the impact of aortic valve calcification (AVC) eccentricity on the risk of paravalvular regurgitation (PVR) and response to balloon post-dilation (BPD) after transcatheter aortic valve replacement (TAVR). We investigated the prognostic value of AVC eccentricity in predicting the risk of PVR and response to BPD in patients undergoing TAVR. We analyzed 85 patients with severe aortic stenosis who underwent self-expandable TAVR (43 women; 77.2±7.1 years). AVC was quantified as the total amount of calcification (total AVC load) and as the eccentricity of calcium (EoC) using calcium volume scoring with contrast computed tomography angiography (CTA). The EoC was defined as the maximum absolute difference in calcium volume scores between 2 adjacent sectors (bi-partition method) or between sectors based on leaflets (leaflet-based method). Total AVC load and bi-partition EoC, but not leaflet-based EoC, were significant predictors for the occurrence of ≥moderate PVR, and bi-partition EoC had a better predictive value than total AVC load (area under the curve [AUC]=0.863 versus 0.760, p for difference=0.006). In multivariate analysis, bi-partition EoC was an independent predictor for the risk of ≥moderate PVR regardless of perimeter oversizing index. The greater bi-partition EoC was the only significant parameter to predict poor response to BPD (AUC=0.775, p=0.004). Pre-procedural assessment of AVC eccentricity using CTA as "bi-partition EoC" provides useful predictive information on the risk of significant PVR and response to BPD in patients undergoing TAVR with self-expandable valves. Copyright © 2017 Elsevier B.V. All rights reserved.
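
    The abstract defines the bi-partition EoC as the maximum absolute difference in calcium volume scores between 2 adjacent sectors. One plausible reading of that "bi-partition" is to split the annulus into two contiguous halves, rotate the split, and take the largest imbalance; the Python sketch below implements that reading with hypothetical per-sector scores and may differ from the paper's exact sector geometry.

      import numpy as np

      def bipartition_eoc(sector_scores):
          """Eccentricity of calcium by a bi-partition reading of the abstract:
          largest absolute difference in calcium volume score between the two
          contiguous halves, over all rotations of the split (illustrative only)."""
          s = np.asarray(sector_scores, dtype=float)
          n = len(s)
          total = s.sum()
          best = 0.0
          for start in range(n):                 # rotate the split around the annulus
              half = sum(s[(start + k) % n] for k in range(n // 2))
              best = max(best, abs(half - (total - half)))
          return best

      # Hypothetical per-sector calcium volume scores around the aortic annulus
      print(bipartition_eoc([120, 340, 90, 60, 410, 80]))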

  20. Language, Mind, Practice: Families of Recursive Thinking in Human Reasoning

    ERIC Educational Resources Information Center

    Josephson, Marika

    2011-01-01

    In 2002, Chomsky, Hauser, and Fitch asserted that recursion may be the one aspect of the human language faculty that makes human language unique in the narrow sense--unique to language and unique to human beings. They also argue somewhat more quietly (as do Pinker and Jackendoff 2005) that recursion may be possible outside of language: navigation,…

  1. Differential Impact of Whole-Brain Radiotherapy Added to Radiosurgery for Brain Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kong, Doo-Sik; Lee, Jung-Il, E-mail: jilee@skku.ed; Im, Yong-Seok

    2010-10-01

    Purpose: The authors investigated whether the addition of whole-brain radiotherapy (WBRT) to stereotactic radiosurgery (SRS) provided any therapeutic benefit according to recursive partitioning analysis (RPA) class. Methods and Materials: Two hundred forty-five patients with 1 to 10 metastases who underwent SRS between January 2002 and December 2007 were included in the study. Of those, 168 patients were treated with SRS alone and 77 patients received SRS followed by WBRT. Actuarial curves were estimated using the Kaplan-Meier method regarding overall survival (OS), distant brain control (DC), and local brain control (LC) stratified by RPA class. Analyses for known prognostic variables were performed using the Cox proportional hazards model. Results: Univariate and multivariate analysis revealed that control of the primary tumor, small number of brain metastases, Karnofsky performance scale (KPS) > 70, and initial treatment modalities were significant predictors for survival. For RPA class 1, SRS plus WBRT was associated with a longer survival time compared with SRS alone (854 days vs. 426 days, p = 0.042). The SRS plus WBRT group also showed better LC rate than did the SRS-alone group (p = 0.021), although they did not show a better DC rate (p = 0.079). By contrast, for RPA class 2 or 3, no significant difference in OS, LC, or DC was found between the two groups. Conclusions: These results suggest that RPA classification should determine whether or not WBRT is added to SRS. WBRT may be recommended to be added to SRS for patients in whom long-term survival is expected on the basis of RPA classification.

  2. The extension of the parametrization of the radio source coordinates in geodetic VLBI and its impact on the time series analysis

    NASA Astrophysics Data System (ADS)

    Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald

    2017-07-01

    The radio sources within the most recent celestial reference frame (CRF) catalog ICRF2 are represented by a single, time-invariant coordinate pair. The datum sources were chosen mainly according to certain statistical properties of their position time series. Yet, such statistics are not applicable unconditionally and are also ambiguous. However, ignoring systematics in the positions of the datum sources inevitably leads to a degradation of the quality of the frame and, therefore, also of derived quantities such as the Earth orientation parameters. One possible approach to overcome these deficiencies is to extend the parametrization of the source positions, similarly to what is done for the station positions. We decided to use the multivariate adaptive regression splines algorithm to parametrize the source coordinates. It allows a great deal of automation by combining recursive partitioning and spline fitting in an optimal way. The algorithm finds the ideal knot positions for the splines and, thus, the best number of polynomial pieces to fit the data autonomously. With that we can correct the ICRF2 a priori coordinates for our analysis and eliminate the systematics in the position estimates. This also allows us to introduce special handling sources into the datum definition, leading to on average 30 % more sources in the datum. We find that not only can the CPO be improved by more than 10 % due to the improved geometry, but the station positions, especially in the early years of VLBI, can also benefit greatly.

  3. Anastomotic leakage after colorectal surgery: diagnostic accuracy of CT.

    PubMed

    Kauv, Paul; Benadjaoud, Samir; Curis, Emmanuel; Boulay-Coletta, Isabelle; Loriau, Jerome; Zins, Marc

    2015-12-01

    To evaluate the diagnostic accuracy of CT in postoperative colorectal anastomotic leakage (AL). Two independent blinded radiologists reviewed 153 CTs performed for suspected AL within 60 days after surgery in 131 consecutive patients, with (n = 58) or without (n = 95) retrograde contrast enema (RCE). Results were compared to original interpretations. The reference standard was reoperation or consensus (a radiologist and a surgeon) regarding clinical, laboratory, radiological, and follow-up data after medical treatment. AL was confirmed in 34/131 patients. For the two reviewers and original interpretation, sensitivity of CT was 82 %, 87 %, and 71 %, respectively; specificity was 84 %, 84 %, and 92 %. RCE significantly increased the positive predictive value (from 40 % to 88 %, P = 0.0009; 41 % to 92 %, P = 0.0016; and 40 % to 100 %, P = 0.0006). Contrast extravasation was the most sensitive (reviewers, 83 % and 83 %) and specific (97 % and 97 %) sign and was significantly associated with AL by univariate analysis (P < 0.0001 and P < 0.0001). By multivariate analysis with recursive partitioning, CT with RCE was accurate to confirm or rule out AL with contrast extravasation. CT with RCE is accurate for diagnosing postoperative colorectal AL. Contrast extravasation is the most reliable sign. RCE should be performed during CT for suspected AL. • CT accurately diagnosed clinically suspected colorectal AL and showed good interobserver agreement • Contrast extravasation was the most sensitive and specific CT sign • Retrograde contrast enema during CT improved positive predictive value • Retrograde contrast enema decreased false-negative or indeterminate original CT interpretations.
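
    The accuracy figures quoted above (sensitivity, specificity, positive predictive value) all come from 2x2 tables of test result versus the reference standard. As a reminder of how they are computed, here is a small Python helper with hypothetical counts; the numbers are not taken from the study.

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity and predictive values from a 2x2 table."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv":         tp / (tp + fp),
              "npv":         tn / (tn + fn),
          }

      # Hypothetical counts for CT with retrograde contrast enema (illustrative
      # only; the study reports rates, not this exact table).
      print(diagnostic_metrics(tp=22, fp=3, fn=4, tn=29))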

  4. DNA methylation analysis reveals distinct methylation signatures in pediatric germ cell tumors.

    PubMed

    Amatruda, James F; Ross, Julie A; Christensen, Brock; Fustino, Nicholas J; Chen, Kenneth S; Hooten, Anthony J; Nelson, Heather; Kuriger, Jacquelyn K; Rakheja, Dinesh; Frazier, A Lindsay; Poynter, Jenny N

    2013-06-27

    Aberrant DNA methylation is a prominent feature of many cancers, and may be especially relevant in germ cell tumors (GCTs) due to the extensive epigenetic reprogramming that occurs in the germ line during normal development. We used the Illumina GoldenGate Cancer Methylation Panel to compare DNA methylation in the three main histologic subtypes of pediatric GCTs (germinoma, teratoma and yolk sac tumor (YST); N = 51) and used recursively partitioned mixture models (RPMM) to test associations between methylation pattern and tumor and demographic characteristics. We identified genes and pathways that were differentially methylated using generalized linear models and Ingenuity Pathway Analysis. We also measured global DNA methylation at LINE1 elements and evaluated methylation at selected imprinted loci using pyrosequencing. Methylation patterns differed by tumor histology, with 18/19 YSTs forming a distinct methylation class. Four pathways showed significant enrichment for YSTs, including a human embryonic stem cell pluripotency pathway. We identified 190 CpG loci with significant methylation differences in mature and immature teratomas (q < 0.05), including a number of CpGs in stem cell and pluripotency-related pathways. Both YST and germinoma showed significantly lower methylation at LINE1 elements compared with normal adjacent tissue while there was no difference between teratoma (mature and immature) and normal tissue. DNA methylation at imprinted loci differed significantly by tumor histology and location. Understanding methylation patterns may identify the developmental stage at which the GCT arose and the at-risk period when environmental exposures could be most harmful. Further, identification of relevant genetic pathways could lead to the development of new targets for therapy.

  5. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
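
    The joint partition-function approach can be illustrated with a direct numerical estimate: sum the product of box-aggregated absolute increments of the two series raised to the two moment orders, and read the joint mass exponent from the log-log scaling against box size. The Python sketch below is a minimal illustration on synthetic correlated random walks, not the authors' implementation; function and variable names are invented.

      import numpy as np

      def joint_partition_function(x, y, q1, q2, scales):
          """Joint partition-function estimate chi(q1, q2, s) for two series of
          equal length, using box sums of absolute increments (a minimal sketch
          of the partition-function approach described in the abstract)."""
          dx, dy = np.abs(np.diff(x)), np.abs(np.diff(y))
          chi = []
          for s in scales:
              n_boxes = len(dx) // s
              bx = dx[:n_boxes * s].reshape(n_boxes, s).sum(axis=1)
              by = dy[:n_boxes * s].reshape(n_boxes, s).sum(axis=1)
              chi.append(np.sum(bx ** q1 * by ** q2))
          return np.array(chi)

      rng = np.random.default_rng(3)
      x = np.cumsum(rng.normal(size=4096))
      y = 0.5 * x + np.cumsum(rng.normal(size=4096))   # cross-correlated walk
      scales = np.array([8, 16, 32, 64, 128])
      chi = joint_partition_function(x, y, q1=2, q2=2, scales=scales)
      # Joint mass exponent tau(q1, q2) from the log-log scaling chi(s) ~ s**tau
      tau, _ = np.polyfit(np.log(scales), np.log(chi), 1)
      print(f"estimated joint mass exponent tau(2,2) ~= {tau:.2f}")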

  6. Correlation of soil and sediment organic matter polarity to aqueous sorption of nonionic compounds

    USGS Publications Warehouse

    Kile, D.E.; Wershaw, R. L.; Chiou, C.T.

    1999-01-01

    Polarities of the soil/sediment organic matter (SOM) in 19 soil and 9 freshwater sediment samples were determined from solid-state 13C-CP/MAS NMR spectra and compared with published partition coefficients (Koc) of carbon tetrachloride (CT) from aqueous solution. Nondestructive analysis of whole samples by solid-state NMR permits a direct assessment of the polarity of SOM that is not possible by elemental analysis. The percent of organic carbon associated with polar functional groups was estimated from the combined fraction of carbohydrate and carboxyl-amide-ester carbons. A plot of the measured partition coefficients (Koc) of CT vs. percent polar organic carbon (POC) shows distinctly different populations of soils and sediments as well as a roughly inverse trend among the soil/sediment populations. Plots of Koc values for CT against other structural group carbon fractions did not yield distinct populations. The results indicate that the polarity of SOM is a significant factor in accounting for differences in Koc between the organic matter in soils and sediments. The alternate direct correlation of the sum of aliphatic and aromatic structural carbons with Koc illustrates the influence of nonpolar hydrocarbon on solute partition interaction. Additional elemental analysis data of selected samples further substantiate the effect of the organic matter polarity on the partition efficiency of nonpolar solutes. The separation between soil and sediment samples based on percent POC reflects definite differences in the properties of soil and sediment organic matter that are attributable to diagenesis.

  7. Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods

    DTIC Science & Technology

    2012-06-01

    Master's thesis by Patrick R. Mugg, June 2012 (thesis advisor: Francis Giraldo; second reader: Hong...), titled "Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods". Only report-documentation front matter survives in this record; the recoverable abstract fragment states that the most widely known and used procedure for analyzing stability is the Von Neumann method.

  8. Image defog algorithm based on open close filter and gradient domain recursive bilateral filter

    NASA Astrophysics Data System (ADS)

    Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen

    2017-11-01

    To address the fuzzy details, color distortion, and low brightness of images produced by the dark channel prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient-domain recursive bilateral filter, referred to as OCRBF, is proposed. The algorithm first uses a weighted quadtree to obtain a more accurate global atmospheric value, then applies a multiple-structure-element morphological open-close filter to the minimum channel map to obtain a rough scattering map via the dark channel prior, uses a variogram to correct the transmittance map, applies the gradient-domain recursive bilateral filter for smoothing, recovers the image through the image degradation model, and finally adjusts the contrast to obtain a bright, clear, fog-free image. Extensive experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range content, strong perspective, or bright areas. Compared with other image defog algorithms, it obtains clearer and more natural fog-free images with more visible details, and the relationship between the time complexity of the SIDA algorithm and the number of image pixels is linear.
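
    Parts of the pipeline can be sketched compactly: a dark-channel estimate followed by a morphological open-close smoothing of the rough transmission map, and recovery through the image degradation model. The Python/scipy sketch below uses a simple open-close filter as a stand-in for the multi-structure-element filter, fixes the atmospheric light instead of estimating it with a weighted quadtree, and omits the gradient-domain recursive bilateral filter entirely; it illustrates the general dark-channel workflow, not the OCRBF algorithm itself.

      import numpy as np
      from scipy import ndimage

      def dark_channel(img, size=15):
          """Dark channel prior: per-pixel minimum over color channels, then a
          local minimum filter over a size x size window."""
          return ndimage.minimum_filter(img.min(axis=2), size=size)

      def rough_transmission(img, atmospheric, size=15, omega=0.95):
          """Rough transmission map from the dark channel of the normalized image,
          smoothed with a morphological opening followed by a closing (a simple
          stand-in for the multi-structure-element open-close filter)."""
          norm_dark = dark_channel(img / atmospheric, size=size)
          t = 1.0 - omega * norm_dark
          t = ndimage.grey_closing(ndimage.grey_opening(t, size=5), size=5)
          return np.clip(t, 0.1, 1.0)

      # Hypothetical hazy image in [0, 1]; the atmospheric light is simply fixed here.
      rng = np.random.default_rng(4)
      hazy = np.clip(rng.random((120, 160, 3)) * 0.5 + 0.4, 0.0, 1.0)
      A = np.array([0.95, 0.95, 0.95])
      t = rough_transmission(hazy, A)
      recovered = np.clip((hazy - A) / t[..., None] + A, 0.0, 1.0)   # degradation model
      print(recovered.shape, t.min(), t.max())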

  9. Video noise reduction

    NASA Astrophysics Data System (ADS)

    Drewery, J. O.; Storey, R.; Tanton, N. E.

    1984-07-01

    A video noise and film grain reducer is described which is based on a first-order recursive temporal filter. Filtering of moving detail is avoided by inhibiting recursion in response to the amount of motion in a picture. Motion detection is based on the point-by-point power of the picture difference signal coupled with a knowledge of the noise statistics. A control system measures the noise power and adjusts the working point of the motion detector accordingly. A field trial of a manual version of the equipment at Television Center indicated that a worthwhile improvement in the quality of noisy or grainy pictures received by the viewer could be obtained. Subsequent trials of the automated version confirmed that the improvement could be maintained. Commercial equipment based on the design is being manufactured and marketed by Pye T.V.T. under license. It is in regular use on both the BBC1 and BBC2 networks.
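
    The core of such a noise reducer is a first-order recursive temporal filter whose recursion coefficient is driven towards zero where the frame difference is large relative to the measured noise, so that moving detail is not smeared. A minimal Python sketch of that idea (invented parameter names and thresholds, not the broadcast equipment's actual control law) is shown below.

      import numpy as np

      def temporal_noise_reduce(frames, noise_std, k_max=0.9, thresh=3.0):
          """First-order recursive temporal filter with motion-adaptive recursion:
          out[n] = in[n] + k * (out[n-1] - in[n]), where k is forced towards zero
          wherever the frame difference is large relative to the noise level
          (a minimal sketch of the scheme described in the abstract)."""
          out = frames[0].astype(float).copy()
          results = [out.copy()]
          for frame in frames[1:]:
              diff = frame - out
              # crude motion detector: big differences are treated as motion, not noise
              motion = np.abs(diff) > thresh * noise_std
              k = np.where(motion, 0.0, k_max)
              out = frame + k * (out - frame)
              results.append(out.copy())
          return np.stack(results)

      # Static noisy synthetic sequence: the filter should converge towards the clean frame
      rng = np.random.default_rng(5)
      clean = np.tile(np.linspace(0, 1, 64), (64, 1))
      noisy = clean + rng.normal(scale=0.05, size=(20, 64, 64))
      filtered = temporal_noise_reduce(noisy, noise_std=0.05)
      print(np.std(noisy[-1] - clean), np.std(filtered[-1] - clean))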

  10. Adaptively loaded IM/DD optical OFDM based on set-partitioned QAM formats.

    PubMed

    Zhao, Jian; Chen, Lian-Kuan

    2017-04-17

    We investigate the constellation design and symbol error rate (SER) of set-partitioned (SP) quadrature amplitude modulation (QAM) formats. Based on the SER analysis, we derive the adaptive bit and power loading algorithm for SP QAM based intensity-modulation direct-detection (IM/DD) orthogonal frequency division multiplexing (OFDM). We experimentally show that the proposed system significantly outperforms the conventional adaptively-loaded IM/DD OFDM and can increase the data rate from 36 Gbit/s to 42 Gbit/s in the presence of severe dispersion-induced spectral nulls after 40-km single-mode fiber. It is also shown that the adaptive algorithm greatly enhances the tolerance to fiber nonlinearity and allows for more power budget.

  11. Chemical amplification based on fluid partitioning in an immiscible liquid

    DOEpatents

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    2010-09-28

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  12. Improving Cluster Analysis with Automatic Variable Selection Based on Trees

    DTIC Science & Technology

    2014-12-01

    Only report front matter survives in this record: an acronym list fragment (...regression trees; Daisy: DISsimilAritY; PAM: partitioning around medoids; PMA: penalized multivariate analysis; SPC: sparse principal components; UPGMA: unweighted pair-group average method) and a note that the UPGMA method measures dissimilarities between all objects in two clusters and takes the average value.

  13. Unintended consequences of information technologies in health care--an interactive sociotechnical analysis.

    PubMed

    Harrison, Michael I; Koppel, Ross; Bar-Lev, Shirly

    2007-01-01

    Many unintended and undesired consequences of Healthcare Information Technologies (HIT) flow from interactions between the HIT and the healthcare organization's sociotechnical system-its workflows, culture, social interactions, and technologies. This paper develops and illustrates a conceptual model of these processes that we call Interactive Sociotechnical Analysis (ISTA). ISTA captures common types of interaction with special emphasis on recursive processes, i.e., feedback loops that alter the newly introduced HIT and promote second-level changes in the social system. ISTA draws on prior studies of unintended consequences, along with research in sociotechnical systems, ergonomics, social informatics, technology-in-practice, and social construction of technology. We present five types of sociotechnical interaction and illustrate each with cases from published research. The ISTA model should further research on emergent and recursive processes in HIT implementation and their unintended consequences. Familiarity with the model can also foster practitioners' awareness of unanticipated consequences that only become evident during HIT implementation.

  14. Optical and Gravimetric Partitioning of Coastal Ocean Suspended Particulate Inorganic Matter (PIM)

    NASA Astrophysics Data System (ADS)

    Stavn, R. H.; Zhang, X.; Falster, A. U.; Gray, D. J.; Rick, J. J.; Gould, R. W., Jr.

    2016-02-01

    Recent work on the composition of suspended particulates of estuarine and coastal waters increases our capability to investigate the biogeochemical processes occurring in these waters. The biogeochemical properties associated with the particulates involve primarily sorption/desorption of dissolved matter onto the particle surfaces, which vary with the types of particulates. Therefore, the breakdown of suspended matter into chemical components will greatly expand biogeochemical studies of the coastal ocean region. The gravimetric techniques for these studies are here expanded and refined. In addition, new optical inversions greatly expand our capabilities to study the spatial extent of the components of suspended particulate matter. The partitioning of a gravimetric PIM determination into clay minerals and amorphous silica is aided by electron microprobe analysis. The amorphous silica is further partitioned into contributions by detrital material and by the tests of living diatoms, based on an empirical formula relating the chlorophyll content of cultured living diatoms in log-phase growth to their frustules determined after gravimetric analysis of the ashed diatom residue. The optical inversion of the composition of suspended particulates is based on the entire volume scattering function (VSF) measured in the field with a Multispectral Volume Scattering Meter and a LISST 100 meter. The VSF is partitioned into an optimal combination of contributions by particle subpopulations, each of which is uniquely represented by a refractive index and a log-normal size distribution. These subpopulations are aggregated to represent the two components of PIM using the corresponding refractive indices and sizes, which also yield a particle size distribution for the two components. The gravimetric results of partitioning PIM into clay minerals and amorphous silica confirm the optical inversions from the VSF.

  15. pH recycling aqueous two-phase systems applied in extraction of Maitake β-Glucan and mechanism analysis using low-field nuclear magnetic resonance.

    PubMed

    Hou, Huiyun; Cao, Xuejun

    2015-07-31

    In this paper, a recycling aqueous two-phase system (ATPS) based on the two pH-responsive copolymers PADB and PMDM was used to purify β-Glucan from Grifola frondosa. The main parameters, such as polymer concentration, type and concentration of salt, extraction temperature and pH, were investigated to optimize partition conditions. The results demonstrated that β-Glucan was extracted into the PADB-rich phase, while impurities were extracted into the PMDM-rich phase. In this 2.5% PADB/2.5% PMDM ATPS, a partition coefficient of 7.489 and an extraction recovery of 96.92% for β-Glucan were obtained in the presence of 30 mmol/L KBr, at pH 8.20 and 30°C. The phase-forming copolymers could be recycled by adjusting pH, with recoveries of over 96.0%. Furthermore, the partition mechanism of Maitake β-Glucan in the PADB/PMDM aqueous two-phase system was studied. Fourier transform infrared spectra, the ForteBio Octet system and low-field nuclear magnetic resonance (LF-NMR) were introduced to elucidate the partition mechanism of β-Glucan. In particular, LF-NMR was used for the first time in the mechanism analysis of partitioning in aqueous two-phase systems. The change in transverse relaxation time (T2) in the ATPS could reflect the interaction between the polymers and β-Glucan. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Apparatus for chemical amplification based on fluid partitioning in an immiscible liquid

    DOEpatents

    Anderson, Brian L [Lodi, CA; Colston, Bill W [San Ramon, CA; Elkin, Christopher J [San Ramon, CA

    2012-05-08

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  17. Method for chemical amplification based on fluid partitioning in an immiscible liquid

    DOEpatents

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    2015-06-02

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  18. Method for chemical amplification based on fluid partitioning in an immiscible liquid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  19. A new algorithm for grid-based hydrologic analysis by incorporating stormwater infrastructure

    NASA Astrophysics Data System (ADS)

    Choi, Yosoon; Yi, Huiuk; Park, Hyeong-Dong

    2011-08-01

    We developed a new algorithm, the Adaptive Stormwater Infrastructure (ASI) algorithm, to incorporate ancillary data sets related to stormwater infrastructure into the grid-based hydrologic analysis. The algorithm simultaneously considers the effects of the surface stormwater collector network (e.g., diversions, roadside ditches, and canals) and underground stormwater conveyance systems (e.g., waterway tunnels, collector pipes, and culverts). The surface drainage flows controlled by the surface runoff collector network are superimposed onto the flow directions derived from a DEM. After examining the connections between inlets and outfalls in the underground stormwater conveyance system, the flow accumulation and delineation of watersheds are calculated based on recursive computations. Application of the algorithm to the Sangdong tailings dam in Korea revealed superior performance to that of a conventional D8 single-flow algorithm in terms of providing reasonable hydrologic information on watersheds with stormwater infrastructure.
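
    The flow-accumulation step described above can be sketched as a recursive upstream traversal once every cell has a single downstream receiver (which, in the ASI algorithm, may already reflect surface collectors and underground conveyance links). The Python sketch below works on a flattened receiver array; the data structure and the tiny example network are illustrative, not the authors' implementation.

      import sys
      import numpy as np

      def flow_accumulation(receivers, n_cells):
          """Recursive flow accumulation on a grid whose flow directions have been
          flattened into one receiver index per cell (possibly already overridden
          by stormwater infrastructure). receivers[i] is the index of the
          downstream cell, or -1 for an outlet."""
          acc = np.ones(n_cells, dtype=np.int64)         # each cell contributes itself
          children = [[] for _ in range(n_cells)]
          for i, tgt in enumerate(receivers):
              if tgt >= 0:
                  children[tgt].append(i)

          def accumulate(cell):                          # recursive upstream traversal
              for up in children[cell]:
                  accumulate(up)
                  acc[cell] += acc[up]

          sys.setrecursionlimit(10000)
          for outlet in (i for i, tgt in enumerate(receivers) if tgt < 0):
              accumulate(outlet)
          return acc

      # Tiny illustrative network: 0 -> 2, 1 -> 2, 2 -> 4, 3 -> 4, 4 is the outlet
      print(flow_accumulation([2, 2, 4, 4, -1], n_cells=5))   # -> [1 1 3 1 5]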

  20. Atmospheric turbulence simulation for Shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1979-01-01

    An improved non-recursive model for atmospheric turbulence along the flight path of the Shuttle Orbiter is developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients are generated and stored on a series of magnetic tapes. Section 2 provides a description of the various technical considerations associated with the turbulence simulation model, including descriptions of the digital filter simulation model, the von Karman spectra with finite upper limits, and the final non-recursive turbulence simulation model which was used to generate the time series. Section 3 provides a description of the time series as currently recorded on magnetic tape. Conclusions and recommendations are presented in Section 4.

  1. Recursive regularization for inferring gene networks from time-course gene expression profiles

    PubMed Central

    Shimamura, Teppei; Imoto, Seiya; Yamaguchi, Rui; Fujita, André; Nagasaki, Masao; Miyano, Satoru

    2009-01-01

    Background Inferring gene networks from time-course microarray experiments with vector autoregressive (VAR) model is the process of identifying functional associations between genes through multivariate time series. This problem can be cast as a variable selection problem in Statistics. One of the promising methods for variable selection is the elastic net proposed by Zou and Hastie (2005). However, VAR modeling with the elastic net succeeds in increasing the number of true positives while it also results in increasing the number of false positives. Results By incorporating relative importance of the VAR coefficients into the elastic net, we propose a new class of regularization, called recursive elastic net, to increase the capability of the elastic net and estimate gene networks based on the VAR model. The recursive elastic net can reduce the number of false positives gradually by updating the importance. Numerical simulations and comparisons demonstrate that the proposed method succeeds in reducing the number of false positives drastically while keeping the high number of true positives in the network inference and achieves two or more times higher true discovery rate (the proportion of true positives among the selected edges) than the competing methods even when the number of time points is small. We also compared our method with various reverse-engineering algorithms on experimental data of MCF-7 breast cancer cells stimulated with two ErbB ligands, EGF and HRG. Conclusion The recursive elastic net is a powerful tool for inferring gene networks from time-course gene expression profiles. PMID:19386091
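
    A minimal sketch of the reweighting idea, using scikit-learn's ElasticNet: coefficients from one pass define importance weights that rescale the predictors for the next pass, so weak coefficients are penalised more heavily and gradually drop out. Building the lagged (VAR) design matrix from the time-course data is assumed to have been done already, the helper name is hypothetical, and the exact weighting rule in the paper may differ from this adaptive-elastic-net style update.

    ```python
    import numpy as np
    from sklearn.linear_model import ElasticNet

    def recursive_elastic_net(X, y, n_iter=3, alpha=0.1, l1_ratio=0.5, eps=1e-6):
        """Iteratively reweighted ('recursive') elastic net sketch: each pass rescales
        the predictors by the inverse importance of the previous coefficients."""
        n_features = X.shape[1]
        weights = np.ones(n_features)
        coef = np.zeros(n_features)
        for _ in range(n_iter):
            Xw = X / weights                              # heavier weight => stronger shrinkage
            model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=10_000)
            model.fit(Xw, y)
            coef = model.coef_ / weights                  # coefficients on the original scale
            weights = 1.0 / (np.abs(coef) + eps)          # update relative importance
        return coef
    ```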

  2. Recursive formulas for determining perturbing accelerations in intermediate satellite motion

    NASA Astrophysics Data System (ADS)

    Stoianov, L.

    Recursive formulas for Legendre polynomials and associated Legendre functions are used to obtain recursive relationships for determining acceleration components which perturb intermediate satellite motion. The formulas are applicable in all cases when the perturbation force function is presented as a series in spherical functions (gravitational, tidal, thermal, geomagnetic, and other perturbations of intermediate motion). These formulas can be used to determine the order of perturbing accelerations.
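
    For reference, the underlying three-term recurrence for the Legendre polynomials can be sketched as follows; the paper's recursions for the perturbing-acceleration components build on relations of this kind, and the associated Legendre functions satisfy analogous recurrences.

    ```python
    import numpy as np

    def legendre_upto(nmax, x):
        """Legendre polynomials P_0..P_nmax at x via the three-term recurrence
        (n + 1) P_{n+1}(x) = (2n + 1) x P_n(x) - n P_{n-1}(x)."""
        P = np.empty(nmax + 1)
        P[0] = 1.0
        if nmax >= 1:
            P[1] = x
        for n in range(1, nmax):
            P[n + 1] = ((2 * n + 1) * x * P[n] - n * P[n - 1]) / (n + 1)
        return P
    ```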

  3. Adaptive Fading Memory H∞ Filter Design for Compensation of Delayed Components in Self Powered Flux Detectors

    NASA Astrophysics Data System (ADS)

    Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol

    2015-08-01

    The paper deals with dynamic compensation of delayed Self Powered Flux Detectors (SPFDs) using a discrete-time H∞ filtering method for improving the response of SPFDs with significant delayed components, such as Platinum and Vanadium SPFDs. We also present a comparative study between the Linear Matrix Inequality (LMI) based H∞ filtering and Algebraic Riccati Equation (ARE) based Kalman filtering methods with respect to their delay compensation capabilities. Finally, an improved recursive H∞ filter based on the adaptive fading memory technique is proposed, which provides improved performance over existing methods. The existing delay compensation algorithms do not account for the rate of change in the signal when determining the filter gain and therefore add significant noise during the delay compensation process. The proposed adaptive fading memory H∞ filter minimizes the overall noise very effectively while keeping the response time at a minimum. The recursive algorithm is easy to implement in real time compared with LMI (or ARE) based solutions.

  4. Phytochemistry of cimicifugic acids and associated bases in Cimicifuga racemosa root extracts.

    PubMed

    Gödecke, Tanja; Nikolic, Dejan; Lankin, David C; Chen, Shao-Nong; Powell, Sharla L; Dietz, Birgit; Bolton, Judy L; van Breemen, Richard B; Farnsworth, Norman R; Pauli, Guido F

    2009-01-01

    Earlier studies reported serotonergic activity for cimicifugic acids (CA) isolated from Cimicifuga racemosa. The discovery of strongly basic alkaloids, cimipronidines, from the active extract partition and evaluation of previously employed work-up procedures has led to the hypothesis of strong acid/base association in the extract. Re-isolation of the CAs was desired to permit further detailed studies. Based on the acid/base association hypothesis, a new separation scheme of the active partition was required, which separates acids from associated bases. A new 5-HT(7) bioassay guided work-up procedure was developed that concentrates activity into one partition. The latter was subjected to a new two-step centrifugal partitioning chromatography (CPC) method, which applies pH zone refinement gradient (pHZR CPC) to dissociate the acid/base complexes. The resulting CA fraction was subjected to a second CPC step. Fractions and compounds were monitored by (1)H NMR using a structure-based spin-pattern analysis facilitating dereplication of the known acids. Bioassay results were obtained for the pHZR CPC fractions and for purified CAs. A new CA was characterised. While none of the pure CAs was active, the serotonergic activity was concentrated in a single pHZR CPC fraction, which was subsequently shown to contain low levels of the potent 5-HT(7) ligand, N(omega)-methylserotonin. This study shows that CAs are not responsible for serotonergic activity in black cohosh. New phytochemical methodology (pHZR CPC) and a sensitive dereplication method (LC-MS) led to the identification of N(omega)-methylserotonin as serotonergic active principle. Copyright (c) 2009 John Wiley & Sons, Ltd.

  5. Phytochemistry of Cimicifugic Acids and Associated Bases in Cimicifuga racemosa Root Extracts

    PubMed Central

    Gödecke, Tanja; Nikolic, Dejan; Lankin, David C.; Chen, Shao-Nong; Powell, Sharla L.; Dietz, Birgit; Bolton, Judy L.; Van Breemen, Richard B.; Farnsworth, Norman R.; Pauli, Guido F.

    2009-01-01

    Introduction Earlier studies reported serotonergic activity for cimicifugic acids (CA) isolated from Cimicifuga racemosa. The discovery of strongly basic alkaloids, cimipronidines, from the active extract partition and evaluation of previously employed work-up procedures has led to the hypothesis of strong acid/base association in the extract. Objective Re-isolation of the CAs was desired to permit further detailed studies. Based on the acid/base association hypothesis, a new separation scheme of the active partition was required, which separates acids from associated bases. Methodology A new 5-HT7 bioassay guided work-up procedure was developed that concentrates activity into one partition. The latter was subjected to a new 2-step centrifugal partitioning chromatography (CPC) method, which applies pH zone refinement gradient (pHZR CPC) to dissociate the acid/base complexes. The resulting CA fraction was subjected to a second CPC step. Fractions and compounds were monitored by 1H NMR using a structure based spin-pattern analysis facilitating dereplication of the known acids. Bioassay results were obtained for the pHZR CPC fractions and for purified CAs. Results A new CA was characterized. While none of the pure CAs was active, the serotonergic activity was concentrated in a single pHZR CPC fraction, which was subsequently shown to contain low levels of the potent 5-HT7 ligand, Nω–methylserotonin. Conclusion This study shows that CAs are not responsible for serotonergic activity in black cohosh. New phytochemical methodology (pHZR CPC) and a sensitive dereplication method (LC-MS) led to the identification of Nω–methylserotonin as serotonergic active principle. PMID:19140115

  6. Model‐based analysis of the influence of catchment properties on hydrologic partitioning across five mountain headwater subcatchments

    PubMed Central

    Wagener, Thorsten; McGlynn, Brian

    2015-01-01

    Abstract Ungauged headwater basins are an abundant part of the river network, but dominant influences on headwater hydrologic response remain difficult to predict. To address this gap, we investigated the ability of a physically based watershed model (the Distributed Hydrology‐Soil‐Vegetation Model) to represent controls on metrics of hydrologic partitioning across five adjacent headwater subcatchments. The five study subcatchments, located in Tenderfoot Creek Experimental Forest in central Montana, have similar climate but variable topography and vegetation distribution. This facilitated a comparative hydrology approach to interpret how parameters that influence partitioning, detected via global sensitivity analysis, differ across catchments. Model parameters were constrained a priori using existing regional information and expert knowledge. Influential parameters were compared to perceptions of catchment functioning and its variability across subcatchments. Despite between‐catchment differences in topography and vegetation, hydrologic partitioning across all metrics and all subcatchments was sensitive to a similar subset of snow, vegetation, and soil parameters. Results also highlighted one subcatchment with low certainty in parameter sensitivity, indicating that the model poorly represented some complexities in this subcatchment likely because an important process is missing or poorly characterized in the mechanistic model. For use in other basins, this method can assess parameter sensitivities as a function of the specific ungauged system to which it is applied. Overall, this approach can be employed to identify dominant modeled controls on catchment response and their agreement with system understanding. PMID:27642197
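
    A generic variance-based (Sobol) sensitivity sketch with SALib conveys the workflow: sample the parameter space, run the model for each sample, and rank parameters by their sensitivity indices. The toy model and parameter names below are placeholders rather than DHSVM or the study's parameter set, and the abstract does not state which global sensitivity method was used.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Hypothetical snow/vegetation/soil parameters standing in for watershed-model inputs.
    problem = {
        "num_vars": 3,
        "names": ["snow_albedo", "lai_multiplier", "lateral_conductivity"],
        "bounds": [[0.5, 0.9], [0.5, 1.5], [1e-5, 1e-3]],
    }

    def runoff_ratio(params):
        """Toy stand-in for a watershed model returning a hydrologic-partitioning metric."""
        albedo, lai, ksat = params
        return 0.6 * (1.0 - albedo) + 0.3 / lai + 50.0 * ksat

    X = saltelli.sample(problem, 1024)                 # Saltelli sampling design
    Y = np.array([runoff_ratio(p) for p in X])         # one model run per sample
    Si = sobol.analyze(problem, Y)                     # first-order and total-order indices
    print(dict(zip(problem["names"], Si["ST"])))       # rank parameters by total-order index
    ```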

  7. A Refinement of the McMillen (1988) Recursive Digital Filter for the Analysis of Atmospheric Turbulence

    NASA Astrophysics Data System (ADS)

    Falocchi, Marco; Giovannini, Lorenzo; Franceschi, Massimiliano de; Zardi, Dino

    2018-05-01

    We present a refinement of the recursive digital filter proposed by McMillen (Boundary-Layer Meteorol 43:231-245, 1988), for separating surface-layer turbulence from low-frequency fluctuations affecting the mean flow, especially over complex terrain. In fact, a straightforward application of the filter causes both an amplitude attenuation and a forward phase shift in the filtered signal. As a consequence turbulence fluctuations, evaluated as the difference between the original series and the filtered one, as well as higher-order moments calculated from them, may be affected by serious inaccuracies. The new algorithm (i) produces a rigorous zero-phase filter, (ii) restores the amplitude of the low-frequency signal, and (iii) corrects all filter-induced signal distortions.
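
    The phase problem and one standard remedy can be sketched as follows: running the first-order recursive filter once forward and once backward over the series cancels the phase shifts of the two passes. This only illustrates the zero-phase idea; the published refinement also restores the low-frequency amplitude, and the time constant and sampling interval below are arbitrary.

    ```python
    import numpy as np

    def mcmillen_lowpass(x, tau=200.0, dt=0.05):
        """First-order recursive low-pass of the McMillen (1988) form,
        y[i] = a*y[i-1] + (1 - a)*x[i], with time constant tau [s]."""
        a = np.exp(-dt / tau)
        y = np.empty(len(x), dtype=float)
        y[0] = x[0]
        for i in range(1, len(x)):
            y[i] = a * y[i - 1] + (1.0 - a) * x[i]
        return y

    def zero_phase_lowpass(x, tau=200.0, dt=0.05):
        """Forward-backward application of the same filter: the phase shifts of the
        two passes cancel, giving a zero-phase estimate of the low-frequency signal."""
        forward = mcmillen_lowpass(x, tau, dt)
        backward = mcmillen_lowpass(forward[::-1], tau, dt)[::-1]
        return backward

    # Turbulent fluctuations are then the residual about the low-frequency component:
    # w_prime = w - zero_phase_lowpass(w)
    ```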

  8. On N = 1 partition functions without R-symmetry

    DOE PAGES

    Knodel, Gino; Liu, James T.; Zayas, Leopoldo A. Pando

    2015-03-25

    Here, we examine the dependence of four-dimensional Euclidean N = 1 partition functions on coupling constants. In particular, we focus on backgrounds without R-symmetry, which arise in the rigid limit of old minimal supergravity. Backgrounds preserving a single supercharge may be classified as having either trivial or SU(2) structure, with the former including S^4. We show that, in the absence of additional symmetries, the partition function depends non-trivially on all couplings in the trivial structure case, and (anti)-holomorphically on couplings in the SU(2) structure case. In both cases, this allows for ambiguities in the form of finite counterterms, which in principle render the partition function unphysical. However, we argue that on dimensional grounds, ambiguities are restricted to finite powers in relevant couplings, and can therefore be kept under control. On the other hand, for backgrounds preserving supercharges of opposite chiralities, the partition function is completely independent of all couplings. In this case, the background admits an R-symmetry, and the partition function is physical, in agreement with the results obtained in the rigid limit of new minimal supergravity. Based on a systematic analysis of supersymmetric invariants, we also demonstrate that N = 1 localization is not possible for backgrounds without R-symmetry.

  9. A stable partitioned FSI algorithm for rigid bodies and incompressible flow. Part I: Model problem analysis

    NASA Astrophysics Data System (ADS)

    Banks, J. W.; Henshaw, W. D.; Schwendeman, D. W.; Tang, Qi

    2017-08-01

    A stable partitioned algorithm is developed for fluid-structure interaction (FSI) problems involving viscous incompressible flow and rigid bodies. This added-mass partitioned (AMP) algorithm remains stable, without sub-iterations, for light and even zero mass rigid bodies when added-mass and viscous added-damping effects are large. The scheme is based on a generalized Robin interface condition for the fluid pressure that includes terms involving the linear acceleration and angular acceleration of the rigid body. Added-mass effects are handled in the Robin condition by inclusion of a boundary integral term that depends on the pressure. Added-damping effects due to the viscous shear forces on the body are treated by inclusion of added-damping tensors that are derived through a linearization of the integrals defining the force and torque. Added-damping effects may be important at low Reynolds number, or, for example, in the case of a rotating cylinder or rotating sphere when the rotational moments of inertia are small. In this first part of a two-part series, the properties of the AMP scheme are motivated and evaluated through the development and analysis of some model problems. The analysis shows when and why the traditional partitioned scheme becomes unstable due to either added-mass or added-damping effects. The analysis also identifies the proper form of the added-damping which depends on the discrete time-step and the grid-spacing normal to the rigid body. The results of the analysis are confirmed with numerical simulations that also demonstrate a second-order accurate implementation of the AMP scheme.

  10. "Carbon Credits" for Resource-Bounded Computations Using Amortised Analysis

    NASA Astrophysics Data System (ADS)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.

  11. Recursive algorithms for bias and gain nonuniformity correction in infrared videos.

    PubMed

    Pipa, Daniel R; da Silva, Eduardo A B; Pagliari, Carla L; Diniz, Paulo S R

    2012-12-01

    Infrared focal-plane array (IRFPA) detectors suffer from fixed-pattern noise (FPN), also known as spatial nonuniformity, which degrades image quality. FPN is still a serious problem, despite recent advances in IRFPA technology. This paper proposes new scene-based correction algorithms for continuous compensation of bias and gain nonuniformity in FPA sensors. The proposed schemes use recursive least-square and affine projection techniques that jointly compensate for both the bias and gain of each image pixel, presenting rapid convergence and robustness to noise. The synthetic and real IRFPA videos experimentally show that the proposed solutions are competitive with the state-of-the-art in FPN reduction, by presenting recovered images with higher fidelity.
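
    A per-pixel recursive least-squares update for an affine (gain/bias) correction can be sketched as below. How the per-pixel reference signal `desired` is obtained (for example, a local spatial mean under camera motion, as is common in scene-based nonuniformity correction) is an assumption of the sketch, as are the array layout and function name; this is not the paper's exact scheme.

    ```python
    import numpy as np

    def rls_nuc_update(frame, desired, w, P, lam=0.999):
        """One recursive-least-squares step of a per-pixel affine correction y = g*x + b.
        frame:   raw IRFPA frame, shape (H, W)
        desired: per-pixel estimate of the true irradiance, shape (H, W)
        w:       per-pixel (gain, bias), shape (2, H, W)
        P:       per-pixel inverse correlation matrices, shape (2, 2, H, W)
        lam:     forgetting factor in (0, 1]."""
        u = np.stack([frame, np.ones_like(frame)])               # regressor [x, 1] per pixel
        y_hat = (w * u).sum(axis=0)                              # current corrected output
        e = desired - y_hat                                      # a priori error
        Pu = np.einsum("ijhw,jhw->ihw", P, u)                    # P @ u per pixel
        denom = lam + (u * Pu).sum(axis=0)                       # scalar per pixel
        k = Pu / denom                                           # RLS gain
        w_new = w + k * e                                        # update gain and bias
        P_new = (P - np.einsum("ihw,jhw->ijhw", k, Pu)) / lam    # update inverse correlation
        return w_new, P_new
    ```

    A typical initialisation, also an assumption here, would be unit gain and zero bias (w = np.stack([np.ones_like(frame), np.zeros_like(frame)])) and P set to a large multiple of the 2x2 identity replicated per pixel.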

  12. Multitarget mixture reduction algorithm with incorporated target existence recursions

    NASA Astrophysics Data System (ADS)

    Ristic, Branko; Arulampalam, Sanjeev

    2000-07-01

    The paper derives a deferred logic data association algorithm based on the mixture reduction approach originally due to Salmond [SPIE vol. 1305, 1990]. The novelty of the proposed algorithm is that it provides recursive formulae for both data association and target existence (confidence) estimation, thus allowing automatic track initiation and termination. The track initiation performance of the proposed filter is investigated by computer simulations. It is observed that at moderately high levels of clutter density the proposed filter initiates tracks more reliably than the corresponding PDA filter. An extension of the proposed filter to the multi-target case is also presented. In addition, the paper compares the track maintenance performance of the MR algorithm with an MHT implementation.

  13. A recursive solution for a fading memory filter derived from Kalman filter theory

    NASA Technical Reports Server (NTRS)

    Statman, J. I.

    1986-01-01

    A simple recursive solution for a class of fading memory tracking filters is presented. A fading memory filter provides estimates of filter states based on past measurements, similar to a traditional Kalman filter. Unlike a Kalman filter, an exponentially decaying weight is applied to older measurements, discounting their effect on present state estimates. It is shown that Kalman filters and fading memory filters are closely related solutions to a general least squares estimator problem. Closed form filter transfer functions are derived for a time invariant, steady state, fading memory filter. These can be applied in loop filter implementation of the Deep Space Network (DSN) Advanced Receiver carrier phase locked loop (PLL).
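
    A degree-1 fading-memory (alpha-beta) tracker conveys the idea of exponentially discounted measurements. The gain relations alpha = 1 - theta^2 and beta = (1 - theta)^2 are the standard fading-memory polynomial-filter choices for a position/rate state, used here as a generic illustration rather than the specific DSN PLL loop-filter design.

    ```python
    import numpy as np

    def fading_memory_track(measurements, dt, theta=0.9):
        """Degree-1 fading-memory (alpha-beta) filter: older measurements are discounted
        by the factor theta in (0, 1); smaller theta means shorter memory."""
        alpha = 1.0 - theta ** 2
        beta = (1.0 - theta) ** 2
        x, v = measurements[0], 0.0                      # initial position and rate
        estimates = []
        for z in measurements[1:]:
            x_pred = x + v * dt                          # predict
            r = z - x_pred                               # innovation
            x = x_pred + alpha * r                       # correct position
            v = v + (beta / dt) * r                      # correct rate
            estimates.append((x, v))
        return np.array(estimates)
    ```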

  14. Recursive Fact-finding: A Streaming Approach to Truth Estimation in Crowdsourcing Applications

    DTIC Science & Technology

    2013-07-01

    Reports arrive over the course of the campaign, lending themselves to the abstraction of a data stream from the community of sources. Figure 4 shows the convergence of the recursive EM algorithm. Related work covers social sensing, also referred to as human-centric sensing, and systems in which different sources offer reviews on products, brands, or companies they have experienced and customers are affected by those reviews.
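
    For orientation, a minimal batch fact-finder in the same spirit alternates between estimating the probability that each claim is true and the reliability of each source; the report's contribution is a recursive, streaming version of this kind of estimator. The model, prior, and variable names below are a simplified EM-style sketch, not the report's algorithm.

    ```python
    import numpy as np

    def truth_estimate(claims, n_iter=20):
        """claims: binary matrix (sources x assertions), claims[s, c] = 1 if source s
        reported claim c.  Returns per-claim truth probabilities and per-source reliabilities."""
        S, C = claims.shape
        reliability = np.full(S, 0.8)                    # initial source reliabilities
        prior = 0.5                                      # prior probability a claim is true
        for _ in range(n_iter):
            # E-step: probability each claim is true, given which sources asserted it
            # (only positive reports carry evidence in this simplified model).
            log_odds = np.log(prior / (1 - prior)) + claims.T @ np.log(
                reliability / (1 - reliability)
            )
            p_true = 1.0 / (1.0 + np.exp(-log_odds))
            # M-step: reliability = expected fraction of a source's claims that are true.
            denom = claims.sum(axis=1) + 1e-9
            reliability = np.clip((claims @ p_true) / denom, 1e-3, 1 - 1e-3)
        return p_true, reliability
    ```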

  15. Hypoglycemia early alarm systems based on recursive autoregressive partial least squares models.

    PubMed

    Bayrak, Elif Seyma; Turksoy, Kamuran; Cinar, Ali; Quinn, Lauretta; Littlejohn, Elizabeth; Rollins, Derrick

    2013-01-01

    Hypoglycemia caused by intensive insulin therapy is a major challenge for artificial pancreas systems. Early detection and prevention of potential hypoglycemia are essential for the acceptance of fully automated artificial pancreas systems. Many of the proposed alarm systems are based on interpretation of recent values or trends in glucose values. In the present study, subject-specific linear models are introduced to capture glucose variations and predict future blood glucose concentrations. These models can be used in early alarm systems of potential hypoglycemia. A recursive autoregressive partial least squares (RARPLS) algorithm is used to model the continuous glucose monitoring sensor data and predict future glucose concentrations for use in hypoglycemia alarm systems. The partial least squares models constructed are updated recursively at each sampling step with a moving window. An early hypoglycemia alarm algorithm using these models is proposed and evaluated. Glucose prediction models based on real-time filtered data have a root mean squared error of 7.79 and a sum of squares of glucose prediction error of 7.35% for six-step-ahead (30 min) glucose predictions. The early alarm systems based on RARPLS show good performance. A sensitivity of 86% and a false alarm rate of 0.42 false positives/day are obtained for the early alarm system based on six-step-ahead predicted glucose values, with an average early detection time of 25.25 min. The RARPLS models developed provide satisfactory glucose prediction with relatively smaller error than other proposed algorithms and are good candidates to forecast and warn about potential hypoglycemia unless preventive action is taken far in advance. © 2012 Diabetes Technology Society.
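
    A simplified moving-window PLS forecaster, using scikit-learn's PLSRegression on lagged CGM values, illustrates the prediction setup. The published RARPLS method updates the model recursively rather than refitting from scratch, and the lag count, window length, horizon, and function name below are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def sliding_window_pls_forecast(cgm, n_lags=6, horizon=6, window=200, n_components=3):
        """Refit a PLS model on the most recent `window` samples of lagged CGM values and
        predict the glucose value `horizon` steps ahead (30 min at 5-min sampling)."""
        # Lagged regression matrix: X[t] holds cgm[t-n_lags+1..t], y[t] = cgm[t+horizon].
        X = np.column_stack(
            [cgm[i : len(cgm) - horizon - n_lags + 1 + i] for i in range(n_lags)]
        )
        y = cgm[n_lags - 1 + horizon :]
        preds = []
        for t in range(window, len(y)):
            pls = PLSRegression(n_components=n_components)
            pls.fit(X[t - window : t], y[t - window : t])
            preds.append(pls.predict(X[t : t + 1]).item())
        return np.array(preds)
    ```

    An alarm rule could then compare each predicted value against a hypoglycemia threshold and trigger when the prediction falls below it.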

  16. Hypoglycemia Early Alarm Systems Based on Recursive Autoregressive Partial Least Squares Models

    PubMed Central

    Bayrak, Elif Seyma; Turksoy, Kamuran; Cinar, Ali; Quinn, Lauretta; Littlejohn, Elizabeth; Rollins, Derrick

    2013-01-01

    Background Hypoglycemia caused by intensive insulin therapy is a major challenge for artificial pancreas systems. Early detection and prevention of potential hypoglycemia are essential for the acceptance of fully automated artificial pancreas systems. Many of the proposed alarm systems are based on interpretation of recent values or trends in glucose values. In the present study, subject-specific linear models are introduced to capture glucose variations and predict future blood glucose concentrations. These models can be used in early alarm systems of potential hypoglycemia. Methods A recursive autoregressive partial least squares (RARPLS) algorithm is used to model the continuous glucose monitoring sensor data and predict future glucose concentrations for use in hypoglycemia alarm systems. The partial least squares models constructed are updated recursively at each sampling step with a moving window. An early hypoglycemia alarm algorithm using these models is proposed and evaluated. Results Glucose prediction models based on real-time filtered data have a root mean squared error of 7.79 and a sum of squares of glucose prediction error of 7.35% for six-step-ahead (30 min) glucose predictions. The early alarm systems based on RARPLS show good performance. A sensitivity of 86% and a false alarm rate of 0.42 false positives/day are obtained for the early alarm system based on six-step-ahead predicted glucose values, with an average early detection time of 25.25 min. Conclusions The RARPLS models developed provide satisfactory glucose prediction with relatively smaller error than other proposed algorithms and are good candidates to forecast and warn about potential hypoglycemia unless preventive action is taken far in advance. PMID:23439179

  17. Microstructural evolution during quenching and partitioning of 0.2C-1.5Mn-1.3Si steels with Cr or Ni additions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, Dean T.; Coughlin, D. R.; Clarke, Kester D.

    Here, the influence of Cr and Ni additions and quench and partition (Q&P) processing parameters on the microstructural development, including carbide formation and austenite retention during Q&P, was studied in two steels with a base composition of 0.2C-1.5Mn-1.3Si wt.% and additions of 1.5 wt.% Cr (1.5Cr) or Ni (1.5Ni). Additions of 1.5 wt.% Cr significantly slowed the kinetics of austenite decomposition relative to the 1.5Ni alloy at all partitioning temperatures, promoting greater austenite retention, lower retained austenite carbon (C) contents, and reduced sensitivity of the retained austenite amounts to processing variables. In the 1.5Cr alloy after partitioning at 400 °C for 300 s, η-carbides were identified by transmission electron microscopy (TEM) and atom probe tomography (APT) revealed no significant enrichment of substitutional elements in the carbides. In the 1.5Ni alloy after partitioning at 450 °C for 300 s, both plate-like and globular carbides were observed by TEM. APT analysis of the globular carbides clearly revealed significant Si rejection and Mn enrichment. Mössbauer effect spectroscopy was used to quantify the amount of carbides after Q&P. In general, carbide amounts below ~0.3% of Fe were measured in both alloys after partitioning for short times (10 s), irrespective of quench or partitioning temperature, which corresponds to a relatively small portion of the bulk C. With increasing partitioning time, carbide amounts remained approximately constant or increased, depending on the alloy, quench temperature, and/or partitioning temperature.

  18. Microstructural evolution during quenching and partitioning of 0.2C-1.5Mn-1.3Si steels with Cr or Ni additions

    DOE PAGES

    Pierce, Dean T.; Coughlin, D. R.; Clarke, Kester D.; ...

    2018-03-08

    Here, the influence of Cr and Ni additions and quench and partition (Q&P) processing parameters on the microstructural development, including carbide formation and austenite retention during Q&P, was studied in two steels with a base composition of 0.2C-1.5Mn-1.3Si wt.% and additions of 1.5 wt.% Cr (1.5Cr) or Ni (1.5Ni). Additions of 1.5 wt.% Cr significantly slowed the kinetics of austenite decomposition relative to the 1.5Ni alloy at all partitioning temperatures, promoting greater austenite retention, lower retained austenite carbon (C) contents, and reduced sensitivity of the retained austenite amounts to processing variables. In the 1.5Cr alloy after partitioning at 400 °C for 300 s, η-carbides were identified by transmission electron microscopy (TEM) and atom probe tomography (APT) revealed no significant enrichment of substitutional elements in the carbides. In the 1.5Ni alloy after partitioning at 450 °C for 300 s, both plate-like and globular carbides were observed by TEM. APT analysis of the globular carbides clearly revealed significant Si rejection and Mn enrichment. Mössbauer effect spectroscopy was used to quantify the amount of carbides after Q&P. In general, carbide amounts below ~0.3% of Fe were measured in both alloys after partitioning for short times (10 s), irrespective of quench or partitioning temperature, which corresponds to a relatively small portion of the bulk C. With increasing partitioning time, carbide amounts remained approximately constant or increased, depending on the alloy, quench temperature, and/or partitioning temperature.

  19. An adaptable binary entropy coder

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Klimesh, M.

    2001-01-01

    We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.

  20. Travel Time Estimation Using Freeway Point Detector Data Based on Evolving Fuzzy Neural Inference System.

    PubMed

    Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai

    2016-01-01

    Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
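
    The clustering-plus-weighted-least-squares structure can be sketched as follows: K-means supplies the rule centres, Gaussian membership functions weight each sample, and each rule's first-order (affine) consequent is fit by weighted least squares. The paper uses a weighted recursive least-squares estimator and updates the clusters as new samples arrive; a batch fit with a fixed membership width is shown here for brevity, and the function names are illustrative.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def fit_ts_fuzzy(X, y, n_rules=4, sigma=1.0):
        """First-order Takagi-Sugeno model: K-means rule centres, Gaussian memberships,
        and per-rule affine consequents fit by weighted least squares."""
        km = KMeans(n_clusters=n_rules, n_init=10, random_state=0).fit(X)
        centers = km.cluster_centers_
        Xa = np.hstack([X, np.ones((len(X), 1))])                # affine consequents
        coefs = []
        for c in centers:
            mu = np.exp(-np.sum((X - c) ** 2, axis=1) / (2 * sigma**2))   # memberships
            Wsqrt = np.sqrt(mu)[:, None]
            theta, *_ = np.linalg.lstsq(Wsqrt * Xa, Wsqrt[:, 0] * y, rcond=None)
            coefs.append(theta)
        return centers, np.array(coefs)

    def predict_ts_fuzzy(X, centers, coefs, sigma=1.0):
        """Standard TS inference: membership-weighted average of the rule outputs."""
        Xa = np.hstack([X, np.ones((len(X), 1))])
        mu = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1) / (2 * sigma**2))  # (N, R)
        rule_out = Xa @ coefs.T                                                        # (N, R)
        return (mu * rule_out).sum(1) / mu.sum(1)
    ```

    In the travel-time setting, X would hold the upstream/downstream volume, occupancy, and speed measurements and y the observed link travel times; errors such as MAE, RMSE, and MARE can then be computed on held-out data.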
